1. "The way we ask the questions can determine the answers."
As voters around the country cast their ballots Tuesday, they may feel their decision is pretty clear cut. But when pollsters ask questions, the answer isn't always as simple as checking a box. Pollsters "can influence the results with artful word choices," says Michael Cornfield, acting director of the Political Management Program at George Washington University in Washington, D.C. Pollsters who work for or are paid by particular causes or candidates can easily skew results in their clients' favor based on the way the questions are worded, adds L.J. Shrum, professor and chair of marketing at the University of Texas at San Antonio College of Business. "But this wording is often not made clear when the results are reported."
Two questions speaking to the same issue can elicit very different responses. Take, for example, "Do you favor cutting government spending to balance our budget?" versus "Do you favor cutting Medicare for the elderly and Social Security for the poor?" What's more, politicians often take their lead on issues from polls, experts say. If, in the above example, a good percentage of respondents answered yes to the first question, that could in theory lead elected officials to believe that voters were aggressively seeking cuts to Medicare and Social Security, Shrum says.
The placement of a question can have an even greater impact on poll results than the wording, particularly when it comes to politics and social issues, says Scott Keeter, director of survey research at Pew Research in Washington, D.C. "If I ask you about the financial troubles the country is facing and then ask you about the job that Obama is doing as president you may be less likely to give him a favorable rating," says Keeter. "But if I ask you about targeting senior figures in Al-Qaeda and the killing of Osama Bin Laden first, you may rate the president better based on his security record."
Pollsters say they avoid this problem by asking the same question in different ways to reduce bias and error. "It simply isn't as easy as asking, 'If the election for president were tomorrow, for which of the following candidates would you vote?'" says Brandon M. Macsata, managing partner at The Macsata-Kornegay Group, a consulting firm in Washington, D.C.
This story has been updated; it originally ran on Jan. 10, 2012.
2. "Our results are manipulated or just plain wrong."
Pollsters love to get consumer opinions on everything from the cheapest plane tickets to the most popular brands of the moment. But some consumer questionnaires are backed by companies that have their own agenda. Pollster Peter Graves, for example, conducted a survey for a business magazine about the best employers in a particular state. "The editor of the magazine massaged the results so that his best advertisers were the best employers to work for," Graves says. "He wanted to make sure his buddies came out on top of the list." Graves, who was shocked by the incident, explains that his poll was crucial in highlighting the salary, working conditions and financial benefits received by employees. The results could potentially encourage readers to seek work at a company on the list without realizing that the rankings are not what they seem, he adds.
Poll results can also be subject to misinterpretation and errors, experts say. Both Continental and American Airlines complained after they received bad ratings on a customer service survey released after Hurricane Irene hit last year. Turns out the survey, which was conducted over the phone and on Twitter, included results from defunct Twitter accounts. An American Airlines spokesman says the company handled over 100,000 calls on Aug. 26, 2011, the Friday before the hurricane, and customers waited an average of 21 minutes -- not the 1 hour and 32 minutes quoted in the survey. Consumer advocate and author Christopher Elliott says the airlines had good reason to complain. Consumers often choose their airlines based on service as well as price, he says.
3. "People lie to say what they think is acceptable."
Survey respondents tend to stretch the truth when they are asked questions they deem to be taboo, analysts say. This can lead to polls under-representing how often people might, say, cheat on their taxes or pad their work expenses. The phenomenon even has a name. "It's called social respectability bias," Cornfield says. Here's a classic example. When asked something like, "Would you vote for a Mormon (or an African American, or a gay person) for president?" people know what the respectable answer is and they're inclined to give it, says Cornfield. "But they don't necessarily do what they say," he adds.
Good pollsters can manage this problem by attempting to normalize undesirable behavior, Keeter says. Before asking someone if they've registered to vote, for example, they may say something like, "A lot of people say they have yet to register. Are you one of them?" This makes it easier for people to admit to something that they may not be proud of.
And sometimes stretching the truth leads to a good end. "People tend to over-represent how often they give to charity," says Keeter. He says this could -- in theory -- encourage people to give more if they genuinely believed their friends and neighbors were doing the same. According to an annual poll conducted by the Charities Aid Foundation, the U.S. ranked as the most generous country in the world in terms of time and money in 2011, up from No. 5 in 2010.
4. "The frontrunner should be afraid, very afraid."
President Obama's shifting fortunes in the polls over the months leading up to Tuesday's election are a textbook example of the pitfalls of an early lead. Frontrunners often fall under the unwelcome spotlight of the media, and often have more ground to lose than an upstart challenger, analysts say. Obama and his Republican rival Mitt Romney are now deadlocked, polls show. As voting began, Obama led 48% to 47% -- a difference of just seven voters among a pool of 1,475 surveyed, according to the latest Wall Street Journal/NBC poll, which has a margin of error of plus or minus 2.55 percentage points. But back in July, Obama had a clear lead of 49% to 43%. Keeter says polls can't change public opinion in a direct way, but they can create a "bandwagon" or "underdog" effect.
Romney reaped the rewards of trailing in the polls after his strong performance during the first debate. That Obama held his ground in the second and third debates did little to reverse the damage done in the first, experts say. "The performance on the first debate made the anti-Obama intensity factor more relevant," Macsata says. "The more conservative elements of the Republican party started to find Romney a little easier to stomach."
Romney has benefited from being the underdog throughout this campaign, experts say. During the Republican primary season, Herman Cain was in the lead before accusations of sexual harassment, which he publicly denied, brought down his campaign. And back in July, Rep. Michele Bachmann overtook then-frontrunner Mitt Romney, 25% to 21%, among likely Iowa caucus-goers, according to a poll commissioned by TheIowaRepublican.com. Oops. As we all know, Romney squeaked by with eight votes to win Iowa and Bachmann dropped out of the race.
5. "I conducted this poll in my mother's basement."
Every Tom, Dick and Harry seems to be carrying out cheap and often cheerful online polls paid for by marketers that don't adhere to industry standards. When they are quoted in magazines or newspapers, few consumers know the difference, says social psychologist Matt Wallaert. The Council of American Survey Research Organizations and the American Association for Public Opinion Research require members to adhere to strict standards, such as being transparent about who paid for the poll and not pressuring people into answering certain questions.
When it comes to surveys, it's a broad church: "Polls vary widely in quality," says Frank Newport, editor-in-chief at polling giant Gallup, a management consulting firm that uses poll data to advise its global portfolio of clients. Checking whether a pollster is a member of one of these organizations is "one way to distinguish between the fly-by-nights and the real companies that are doing polling," says Barbara Bickart, associate professor of marketing at Boston University. Seth Rabinowitz, a partner at the management consulting firm Silicon Associates, cautions against making any financial decisions -- choosing a credit card, for instance, or a mortgage lender -- based on information polled online.
To be sure, large, well-known pollsters are a world apart from the outfits that don't abide by industry standards, says Keeter of Pew Research, a nonprofit organization that receives most of its funding from the Pew Charitable Trusts. "We do not take on contract work," Keeter says, "and while we occasionally have partnerships in which survey costs are shared with other organizations, that is rare." For its part, Gallup has long had a strict policy to not work for any political party or candidate, or receive money from them. It does, however, have a financial relationship with USA Today for domestic polling.
6. "Our samples are too small and not very random."
Few casual readers pay attention to the quality and size of poll samples, but how many and what types of people are asked the questions can have a dramatic effect on the results. Keeter of Pew Research says a small group of respondents leads to a bigger margin of error, meaning the poll may be misleading. Plus, it's more difficult to break down responses into subgroups, such as women or men under 30. That's something worth remembering next time you feel compelled to buy the best this or that because of a survey, he says.
And sample sizes are shrinking because of pure economics. Large, diverse polls are increasingly difficult and expensive for pollsters to carry out, says Wallaert, who is also a technology entrepreneur. One call can take 10 minutes, meaning only six surveys can be done per hour and that's not including time spent dialing or talking to people who don't want to participate, he says. If a pollster is paying someone $12 an hour to make the calls, each survey costs $2 in wages. Plus there's the cost of hiring, training, compiling the data, phone bills and rent. Compounding this problem, more people are also using Caller ID to screen calls, Macsata says. "Each poll comes with its own set of challenges, and can lead to not only fewer participants for the sample but also potentially skew the results," he says.
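The cost arithmetic above can be sketched in a few lines. This is an illustration only: the 10-minutes-per-call and $12-an-hour figures come from the passage, while the `completion_rate` parameter and the function name are hypothetical additions to show how time lost to dialing and refusals pushes the per-survey cost up.

```python
def wage_cost_per_survey(minutes_per_call=10, hourly_wage=12.00,
                         completion_rate=1.0):
    """Wage cost of one completed phone survey.

    completion_rate below 1.0 models time lost to dialing and refusals
    (a hypothetical extension; the article's $2 figure assumes every
    call produces a completed interview).
    """
    # 10 minutes per call means six completed surveys per caller-hour.
    surveys_per_hour = (60 / minutes_per_call) * completion_rate
    return hourly_wage / surveys_per_hour

print(wage_cost_per_survey())                     # article's figures: 2.0
print(wage_cost_per_survey(completion_rate=0.5))  # half of calls complete: 4.0
```

On the article's assumptions this yields the quoted $2 in wages per completed survey; if only half of calls produce a completed interview, the wage cost alone doubles, before phone bills, training and rent are counted.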
Typically, Pew Research samples 1,500 people but will boost that to 2,000 or 2,500 for political polls conducted close to the presidential elections. That is well above the 800 respondents some experts say is enough to produce a reasonable margin of error. Newport from Gallup says that, when it comes to sample sizes, "There is no magic number."
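The sample sizes quoted in this section map directly onto margin of error. Here is a minimal sketch, assuming the textbook 95%-confidence formula for a simple random sample with the conservative p = 0.5; real pollsters apply design weights and other corrections, so their published figures can differ slightly.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error, in percentage points, for a simple random
    sample of size n, using the conservative p = 0.5 assumption."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (800, 1475, 1500, 2500):
    print(f"n = {n}: +/- {margin_of_error(n):.2f} points")
```

At n = 1,475 the formula reproduces the plus-or-minus 2.55 points quoted for the Wall Street Journal/NBC poll earlier in the piece. Note the diminishing returns: because error shrinks with the square root of n, halving the margin of error requires roughly quadrupling the sample, which is one reason even well-funded pollsters rarely go far beyond 2,500 respondents.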
7. "Don't make any decisions based on economic surveys."
Unemployment stood at 7.9% in October, up a smidge from the 7.8% the U.S. Labor Department reported for September. Unemployment data can be crucial for people trying to figure out if their jobs are secure or if they should curtail their spending. But even surveys as important as that can be based on confusing data. For instance, analysts point out that the headline unemployment number doesn't account for "discouraged" workers -- those who did not look for work in the prior four weeks. Nor does it include people working part-time only because the economy isn't strong enough to give them a full-time job; the broader "underutilized" figure, which covers both groups, was 14.6% in October versus 14.7% in September, according to government data. Gary Steinberg, a spokesman for the Bureau of Labor Statistics, says the bureau makes each definition clear and does not claim that one number is more accurate than the other. He says every job-seeker and employee is accounted for. "Each group is included somewhere," he says. "People are free to use our data any way they want."
The personal savings rate is another example. Compiled by the Bureau of Economic Analysis, this number measures savings as a percentage of personal disposable income. "The statistical measure itself is of the highest quality," says Sheldon Garon, a professor of history at Princeton University and author of "Beyond Our Means: Why America Spends While the World Saves." But the personal savings rate is not the rate at which the average American saves, Garon explains, even though it is often interpreted that way.
For example, the savings rate spiked after 2008 from near-zero levels to 6%. That rise, however, was mainly due to wealthy Americans, who account for much of the nation's aggregate income. In response, Ralph Stewart, a spokesman for the Bureau of Economic Analysis, says it's a measure of savings, "not a measure of wealth. It does not include things like investment portfolios, 401(k)s or home equity." Rabinowitz says such data can encourage middle-class Americans to think the economy is healthier than it is: They may then buy that car or other big purchase they've been holding out for since the financial crisis of 2008, even if they can't really afford it yet.
8. "Our surveys are designed to help you spend."
Surveys about consumer behavior might make interesting headlines, but ultimately they help companies sell their wares: clients can purchase more detailed versions of the reports whose highlights are released to newspapers and magazines, according to Wallaert. The 2011 "American Pantry Study," conducted by the Harrison Group for the financial consultancy Deloitte, found that 87% of shoppers believe several store brands are just as good as national brands, and that 53% are bothered that they can't always afford to buy the brands they like. The survey also helped Deloitte clients determine just what brands consumers may want to buy. "We embarked on the American Pantry Study to gain a more sophisticated understanding of shoppers' attitudes, needs and behaviors to help our clients anticipate consumers' next move during and after the recession," says Pat Conroy, Deloitte's vice chairman and head of consumer products.
Plenty of polls are more focused on brand awareness for their sponsors than the results themselves, experts say. In January, Chase Card Services, a division of J.P. Morgan Chase, released a consumer survey saying: "Consumers take DIY approach to resolutions and rewards." The survey concludes that 59% of people want to pamper themselves at home rather than visiting a spa or salon, and 46% will use coupons from retailers. It also publicized Chase Card Services and its "Blueprint" online money management plan, which it says is designed "to help customers manage spending and borrowing." Tom O'Donnell, senior vice president at Chase, says the company aims to get more insight into consumers. "It gives us a view of what their mind-set is when it comes to spending, but also how they manage their spending and how they borrow money," he says. "Think of it as connecting the consumer's thinking with the kinds of financial products and services the bank can deliver."
The organization that funds a poll is often the most important detail for consumers to understand, says Cornfield, especially if that backer has a financial interest in the results. "Every time you look at a poll you have to look at the source," he says. But does reading the results of a poll like this increase the odds of someone spending beyond his/her budget limitations or common sense? "It certainly promotes a variety of 'keeping up with the Joneses,'" which is beneficial to the retail sector but not so helpful to those looking to cut their credit card debt, says Doug Short, vice president of research at the financial advisory service Advisor Perspectives, in Lexington, Mass.
9. "We're being outclassed by social networking sites."
Many consumers use social networking sites to talk about their likes and dislikes. In the process, they give companies more valuable -- and free -- information than any poll could, experts say. Companies are gleaning this consumer behavior from social media sites like Facebook and Twitter, using complex algorithms. "People are putting data about themselves out there in unprecedented ways," says Mike Maples Jr., managing partner and co-founder at Floodgate, a seed stage investing firm based in Palo Alto, Calif. But he says this can have positive results for the consumer, too.
A case in point: Boston-based Crimson Hexagon, a social media analysis firm, scours millions of tweets for opinions and attitudes. Last November, it examined more than 1 million tweets about Apple's iPhone 4S. The consensus wasn't as positive as Apple might have hoped: 37% were favorable, 29% negative and 34% neutral. What's more, some 11% of the tweets mentioned that the battery drained too fast. (Apple did not respond to a request for comment.)
Competition from companies like Crimson Hexagon could ultimately make pollsters less relevant as social media becomes more pervasive, says Bickart of Boston University. This way of gathering information isn't as intrusive as traditional polling methods, she says, "but the question will be how accurate these methods are."
10. "We haven't learned from past mistakes."
As Nov. 6 is Election Day, let's consider the mistakes of polls past. Here's just one infamous example: In 1936, the much-respected Literary Digest magazine conducted a poll of 2.4 million voters and concluded that Republican Gov. Alf Landon would be the overwhelming winner of the U.S. presidential election. Landon carried only Vermont and Maine, while Franklin Delano Roosevelt carried the other 46 states. Why the discrepancy between the poll and the results? The magazine had surveyed mainly its own readers, along with telephone subscribers and automobile owners, groups with higher-than-average disposable incomes.
Nearly eight decades later, the same kinds of mistakes risk being made by pollsters who believe President Obama's low approval ratings will dictate the election result, experts say. But Millennials -- Americans born between 1981 and 1993 -- may help the Democrats retain the White House in 2012, according to Schaefer. Younger people tend to use cell phones and be more liberal: A 2010 Pew Research study found that the percentage of people who said they would vote Republican in the 2010 Congressional election was 2.5 percentage points higher among a sample that was asked the question by landline only, versus a sample collected by both landline and cell phone.
Pew's Keeter says polling companies that adhere to industry standards make allowances for potential bias, and says the best polls give everyone a voice in helping to determine how their country is run. But, he adds, polls are not foolproof: "We like to think that people walk around with fully formulated opinions about certain issues, but they don't."