Let the Sun Shine In

National Sunshine Week, March 11 - 17, 2012.

This is Sunshine Week, March 11 – 17, celebrating the public’s right to know, a concept which these days goes by the less clarifying expression “transparency.” The culminating event of the week is Freedom of Information Day on March 16, the birthday of James Madison, the fourth U.S. president, who is considered the “Father of the Constitution” and champion of the Bill of Rights.

Sunshine Week is a nonpartisan event supported by politicians and public officials, journalists, libraries, schools, historians, museums, archives, and anyone who values open government.

You can find out more about activities in your community on the Sunshine Week website, which notes that events have been organized across the country to “enlighten and empower people to play an active role in their government at all levels, and to give them access to information that makes their lives better and their communities stronger.”

If you can’t get to one of the celebrations in your community, celebrate by viewing and re-reading the U.S. Constitution and learning more about Madison’s long road to ensuring its ratification on the National Archives website.

Cartoon contributed by Steve Greenberg, Los Angeles.

Writing that inspired me this week:

“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”
~ First Amendment, U.S. Constitution

Statistical Significance: Making Sense of Numbers

Like a lot of writers and artists, I had an aversion to math when I was younger. Then, I took a statistics class in high school, and it changed the way I viewed the study of mathematics, its real-world application, and, perhaps most profoundly, my own ability to understand analytical concepts.

It sounds incredibly geeky, but once I learned the difference between mean, median and mode, and how frequently they’re confused, I never looked at statistics in a news report, election result, textbook chapter, or research paper in the same light.
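The distinction is easy to see with a few numbers. Here’s a minimal sketch, using Python’s standard statistics module and made-up salary figures, of how a single outlier pulls the mean away from the median and the mode:

```python
from statistics import mean, median, mode

# Hypothetical salaries at a small firm; the one executive outlier
# drags the mean well above what a typical employee earns.
salaries = [32_000, 35_000, 35_000, 38_000, 41_000, 44_000, 250_000]

print(mean(salaries))    # arithmetic average, inflated by the outlier
print(median(salaries))  # middle value: 38000
print(mode(salaries))    # most common value: 35000
```

A press release touting an “average salary of nearly $68,000” at this imaginary firm would be arithmetically true and thoroughly misleading.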

Lately, I’ve read more than a handful of articles and press releases that misuse or leave out statistics gathered via surveys or studies.

As professional writers, we occasionally need to delve into the world of math and statistics. Whether you’re preparing a press release, writing an article for the company website, or reporting the news, your facts may rely on figures.

What you don’t want is to be associated with that old saying about lying with statistics. Your credibility rests on your transparency with numbers, especially when the goal is something like press coverage, promotion or a fact-driven news story.

Here are some basics for writing with numbers:

Include the number of subjects who participated in a survey or study
This figure is an absolute requirement (and I’ve seen it missing from more than three recent press releases or website posts in as many months). It is, after all, the starting point for any survey or study and it provides the reader with an ability to judge how relevant the data might be.

Note that in any study or survey there’s a percentage of answers or results that must be discarded: a participant chose not to answer one or more of the questions, or results from a lab test were unclear. From a pool of 400 participants, discarding 1 – 2 percent of the entries might be acceptable, but if only 10 people took the time to fill out a survey, even one incomplete or botched entry makes a huge difference to the quality of the results.
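The arithmetic behind that point is simple enough to sketch (the pool sizes below are the ones from the paragraph; the discard counts are hypothetical):

```python
def discarded_share(pool_size: int, discarded: int) -> float:
    """Fraction of a survey pool lost when entries are thrown out."""
    return discarded / pool_size

# Tossing 6 botched entries from 400 participants barely dents the data...
print(f"{discarded_share(400, 6):.1%}")  # 1.5%
# ...but losing a single entry from a 10-person survey wipes out a tenth of it.
print(f"{discarded_share(10, 1):.1%}")   # 10.0%
```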

Frankly, if the participating group is as small as 10, the survey or study probably doesn’t hold a lot of weight scientifically. This hasn’t stopped reporting of such results, but frequently what the reader isn’t told, because it would cast doubt on the validity of both the results and the assertions made about them, is how many people were in the study group.

In scientific and medical research, groups that small are often part of preliminary studies, which is why there is so much regulation around reporting findings that may not have any bearing on larger populations of patients.

Provide a breakdown of participant groups
The more details you provide, the more credible your story becomes. It doesn’t necessarily make the survey or study more credible, but you give your reader the ability to assess the information based on their own understanding of the subject and related facts.

Participant details (as long as you are sharing non-identifying, unconfidential information) can include things like gender, age group, political affiliation, economic strata, professional experience, blood type, etc.

Share types of questions asked, specific information or samples gathered from subjects
Certainly, there’ll be articles where this counts as too much detail, but in a scientific study it might be essential to understand that blood samples were taken within a certain time period following the administration of medication.

For surveys, sharing a greater level of detail comes in handy when highlighting a particularly notable response. Knowing the question helps the reader analyze the answers.

You get bonus points for including the scale used to score a survey (and for using more complex scales – five response options rather than two or three – when conducting a survey in the first place).

Be clear about the scope of findings
Avoid at all costs “universalizing” results. This is a mistake I see frequently. Unintentionally or not, extrapolating findings from the original study group to a large population (for example, using one exit poll to predict an election) misrepresents the scope of the results. It’s striving for a significance that isn’t there.

A survey or study has a specific number of participants; the findings refer to the original population, especially with only one study and no further research to verify the original results.

Results must be qualified with statements like “among people who took the survey,” “according to survey respondents,” “in the study,” and, for large studies, the data may need to be further broken down to percentages within the various subgroups (such as, “x% of the women/seniors, etc., in the study population experienced reactions”).
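A subgroup breakdown is just counting within categories. This is a minimal sketch with invented study records (the subgroup names and reaction flags are hypothetical):

```python
from collections import defaultdict

# Hypothetical study records: (subgroup, experienced_reaction)
records = [
    ("women", True), ("women", False), ("women", True), ("women", False),
    ("men", False), ("men", False), ("men", True), ("men", False),
]

totals = defaultdict(int)
reactions = defaultdict(int)
for group, reacted in records:
    totals[group] += 1
    reactions[group] += reacted  # True counts as 1, False as 0

for group in totals:
    pct = 100 * reactions[group] / totals[group]
    print(f"{pct:.0f}% of the {group} in the study population experienced reactions")
```

Reporting the subgroup percentage alongside the subgroup size keeps readers from assuming the figure applies to anyone outside the study.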

Avoid calling survey or study methodology “scientific”
Like extrapolating data, appending the word “scientific” to any old survey or study is striving for a kind of credibility the research may not have earned.

The scientific method has since the 17th century provided objective, measurable, repeatable standards and techniques for investigating subjects and gleaning new information. Wikipedia offers a more comprehensive explanation, but suffice it to say, if a study wasn’t conducted by an agency with no investment in the outcome (objective), didn’t start with a hypothesis against which results could be compared and contrasted, proved or disproved (measurable), and wasn’t repeated to ensure reliability of reporting, calling it “scientific” is a misuse of language. Speaking of which…

Understand key terms like “statistical significance”
The word “significant,” when attached to study data, does not translate as “key,” “momentous” or “important.”

“Statistically significant” is a phrase statisticians use when a result is “unlikely to have occurred by chance.” That doesn’t mean a result is important simply because it’s unlikely. The unlikely result would need to be repeatable and measured through additional objective means (at the very least) in order to determine its real-world significance.
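You can see what “unlikely to have occurred by chance” means with a quick simulation. Suppose 60 of 100 survey respondents (hypothetical numbers) preferred one option; the sketch below estimates how often a fair 50/50 split would produce a result at least that lopsided:

```python
import random

random.seed(42)  # fixed seed so the estimate is reproducible

observed = 60   # respondents (out of 100) who preferred option A
trials = 20_000

# Count simulated 100-person surveys in which pure chance (a 50/50 coin)
# yields 60 or more "A" answers.
extreme = sum(
    1
    for _ in range(trials)
    if sum(random.random() < 0.5 for _ in range(100)) >= observed
)
p_value = extreme / trials
print(p_value)  # a small value (around 0.03): chance rarely produces this split
```

A p-value near 0.03 earns the “statistically significant” label under the common 0.05 threshold, but, as the paragraph above says, that alone doesn’t make the finding important.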

Do you have a mathematical pet peeve? What examples of exaggeration have you noticed when it comes to writing with numbers? Feel free to enumerate in the Comments.

Writing that inspired me this week:

“Figures often beguile me, particularly when I have the arranging of them myself; in which case the remark attributed to Disraeli would often apply with justice and force: ‘There are three kinds of lies: lies, damned lies, and statistics.’”
~ Mark Twain, “Chapters from My Autobiography”

Transparency Equals Credibility: Communicating What You Know When You Know It

Communicating is tough when the news is difficult for your audience. It’s even harder when details aren’t fully baked.

I’ve fallen victim to holding back – for example, waiting till all of the content was developed before launching an intranet site with only a few months left to go on a large-scale change project.

I look back on that experience and cringe. Sure, intranets were brand new then, and most of us were trying to make them look and function like external websites, but the main reason I delayed was fear that stakeholders needed more content. So, they waited while we created.

Doing Better than “I Don’t Know”

Recently, I wrote a post about communicating a new Pay & Rewards system for warehouse employees (“They All Laughed: Can Humor Be a Communications Asset?”). That post was about getting my foot in the door and learning an entirely new functional area – heck, understanding a whole different culture – at the company. I promised to tell you the rest of the story as a kind of case study for communicating difficult information, even when all the facts aren’t available.

For many years, change communicators were advised to get comfortable (and get leadership comfortable) with saying, “I don’t know.”

The fact is, “I don’t know” doesn’t work for most audiences. They suspect you do know something; you’re just not telling them.

Let’s face it, they’re usually right.

Instead of trying to make “I don’t know” work, it’s a lot better and creates far more credibility for everyone involved (leaders, the change team, Corp Comm, and your communications vehicles) to establish regular milestones for communication.

That’s the mandate our warehouse Pay & Rewards team started with. But, instead of providing regular updates, they sequestered themselves while rumors filled the information vacuum.

Rebuilding Credibility

Before I was ushered in to meet the Pay & Rewards team, the facility leader advised me he’d appointed warehouse employees to this team for a reason. He was adamant that, unless there was absolutely no other way, they needed to deliver the communications. This wasn’t going to be some slick communication from HQ. The credibility of the team and the pay system depended on it.

Sitting in on my first session with the team, it was easy to recognize their expertise. They’d been through weeks of training to understand pay systems. They’d spent months formulating and reformulating a design they thought was appropriate when employees moved from the old warehouse to a new distribution center where they’d be expected to work with a massive robotics system and computers.

The team was determined to do right by their colleagues. And that’s where they foundered. Concerned about creating a fair system, they were afraid to communicate a half-developed design – and air their debates as the design took shape. The risk of misunderstanding seemed too great.

Early in their deliberations, they began taping paper over the little window in the conference room door to keep people from peeking at anything that might be on the white board. That didn’t sit well with the warehouse.

Good News or Bad, Be Honest, Clear and Consistent

As we worked together, I realized the new design was good news for the majority of employees: most would see a pay increase when they moved to the new distribution center, and the process for earning bonus pay would be standardized, making it clear and fair for everyone.

(Now, I can hear you thinking, “But, you had good news to communicate. Just go out and tell employees, and the team’ll be fine. It’s not like employees were losing money. That’d be a lot harder to explain.” True, we were lucky there, but six months of silence is a hard act to follow. More to the point: I would have taken exactly the same steps if the new pay system had meant smaller paychecks and fewer bonuses.)

The first thing I did was have the team explain the new pay and rewards design to me, the timetable for its approval, and how it would work as employees transitioned to the new facility. This took a while, but in the confines of a private conference room, they presented the information like the experts they were.

Facing their co-workers was quite a different story. Up until their appointment, the Pay & Rewards team members worked in the warehouse; they’d never had to speak in public, and they were nervous and wary. We spent hours on presentation training, while I worked with a graphic designer back at HQ to bring their design to life.

The team had three critical goals for the presentation:

  • re-establishing the Pay & Rewards team’s credibility
  • explaining the new pay and bonus system clearly, so that everyone walked away with a solid understanding of where they fit in the new pay scale
  • establishing the equity of the new system

When It’s Okay to Bury the Lede

Together, we created a presentation that addressed each of the critical needs and used models (i.e., without actual hourly pay) to explain the pay scales conceptually before sharing the actual new pay and bonus rates.

The presentation went something like this, with each team member taking a turn:

  • the team’s training in pay design
  • the core value of developing a pay system in-house, rather than having one imposed from the outside
  • an objective overview of pay scales in the regional economy
  • benefits of the new pay and bonus system
  • models of the new pay and bonus system
  • expectations of employees: how they could be successful within the new structure
  • the new pay and rewards system
  • questions and answers

This wasn’t a crisis – and we were delivering complicated information that every single employee cared deeply about (“How much am I going to get paid?”) – so we had good reason to delay sharing the actual pay rates. If the team had shared the new pay rates at the beginning of the presentation, no one would have heard a word they said after that.

Despite jitters, the team delivered heroically, communicating to every employee in the facility within a 24-hour period and staying late into the night to present to third shift to ensure the information was delivered in a timely and consistent manner.

There were lots of questions, but very little confusion – the audience heard and understood the messages – and the team answered every single query, further re-establishing their credibility.

By the next morning, we’d placed thick notebooks with the full presentation, a long list of Q&As, and a schedule of regular Pay & Rewards team communication updates in the cafeteria and common areas. The team made themselves available for questions and we updated the Q&As whenever there was new information.

There were certainly concerns and questions for a couple of days following the presentation, but before the week was out, everything was back to normal and the old rumors had been put to rest.

Ultimately, the new design gave employees a positive incentive to move to the new facility, so they could work within the new pay system. And our change team moved on to other pressing issues of the transition, such as employee technophobia and business-stakeholder management.

Is there anything we would have done differently if the news had been bad and pay had gone down? The key indicator for that message was the pay scales in the regional economy and how our facility ranked in relation to other companies. Given bad news, you’d still follow the same order above, ending with a series of small-group or brown-bag lunch sessions to continue the discussion, reinforcing key messages and answering questions. Then, make all of the information available, as we did, in an open and transparent way. Ongoing communications from the team and their willingness to answer questions frankly and on the fly would help maintain their credibility over the long run.