Q&A

Q. Many millions of people do not have internet access. How can you possibly reflect the views of the public as a whole?

A. When the internet was new, reliable surveys were indeed impossible to conduct online, simply because too few people had access. But the internet has now spread to every significant demographic group, with 62% of GB adults logging on. Online researchers are able to reach sufficient numbers of women as well as men, over-60s as well as under-30s, and people on below-average as well as above-average incomes. National surveys can therefore be conducted that represent the public as a whole.

Q. Yet surely it is easier for telephone polling companies to achieve a representative sample, because almost everyone has a telephone these days?

A. Not necessarily. There is a variety of reasons why telephone polling companies have difficulty obtaining a truly representative sample. They tend to reach people who spend the most time at home and who are most willing to talk to strangers. They are less likely to reach people who are out a lot, or often use voicemail to screen incoming calls, or don't like talking to strangers, or mainly or entirely use a mobile phone. (Telephone research companies use only landline numbers to reach people.) Telephone polling companies generally achieve only 15 interviews for every 100 residential numbers they dial.

Q. But perhaps the people in your samples, all of whom are online, may differ from those who are not online in significant respects, even if they resemble the general public demographically?

A. All companies face the challenge of taking the views of the people they can reach and estimating the views of the people they can’t. Even if they get the demographics right, how can they be sure that the people who do complete surveys are like the rest of the public? It is precisely to respond to this challenge that YouGov seeks to produce results that are not just demographically representative but also attitudinally representative. As well as weighting our raw data to ensure that our figures conform to the profile of the nation by age, gender, social class and region, we also weight our data by newspaper readership, and often by past vote. For example, we ensure that our published figures contain 22% who are Sun or Star readers and 4% who are Guardian or Independent readers.

Q. You mention "weighting". Newspaper reports of surveys often say that the data have been “weighted”. What does this mean?

A. Almost all surveys involve weighting, whether they are conducted online, face-to-face or by telephone. This is to ensure that the published results properly reflect the population they seek to measure. For example, men comprise 48% of the electorate and women 52%. The raw figures in a well-conducted survey will be close to this, but not necessarily match these numbers exactly. Suppose the raw figures contain 50% men and 50% women. YouGov's computer would slightly "downweight" the replies given by the men (so that the replies of 50 men count as if they were 48) and slightly “upweight” the replies given by women (so that the replies of 50 women count as if they were 52).
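The gender example above amounts to a simple arithmetic adjustment: each group's replies are scaled by the ratio of its known population share to its share of the raw sample. A minimal sketch (the 48/52 split is from the text; the raw sample of 500 men and 500 women is a hypothetical illustration):

```python
# Population profile from the text: 48% men, 52% women.
target = {"men": 0.48, "women": 0.52}

# Hypothetical raw sample of 1,000 respondents, split 50/50.
raw = {"men": 500, "women": 500}
n = sum(raw.values())

# Weight = target share / achieved share, so each man's reply
# counts 0.96 times and each woman's counts 1.04 times.
weights = {g: target[g] / (raw[g] / n) for g in raw}

# The weighted counts now match the 48/52 target profile.
weighted = {g: raw[g] * weights[g] for g in raw}
print(weights)   # men: 0.96, women: 1.04
print(weighted)  # men: 480.0, women: 520.0
```

In other words, the replies of 500 men count as if they were 480, and the replies of 500 women count as if they were 520.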

In practice, the task is more complex than this, as matters such as age, social class, region and newspaper readership, as well as gender, have to be considered simultaneously. This is a task for YouGov's computer, which adjusts the raw data to take account of all these factors.
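Adjusting several factors simultaneously is usually done by iterative proportional fitting, often called "rim weighting" or raking: the weights are adjusted to match one variable's targets, then the next, and the cycle is repeated until all the margins fit at once. A minimal two-variable sketch, in which all the figures apart from the 48/52 gender split are hypothetical illustrations rather than actual YouGov targets:

```python
# Hypothetical gender-by-age cross-tab of a raw sample of 1,000.
raw = {
    ("men", "under50"): 300, ("men", "50plus"): 200,
    ("women", "under50"): 280, ("women", "50plus"): 220,
}
n = sum(raw.values())

# Target marginal shares: gender from the text; age is illustrative.
gender_target = {"men": 0.48, "women": 0.52}
age_target = {"under50": 0.55, "50plus": 0.45}

weights = {cell: 1.0 for cell in raw}

for _ in range(50):  # alternate adjustments until both margins fit
    for g, share in gender_target.items():
        current = sum(raw[c] * weights[c] for c in raw if c[0] == g)
        factor = (share * n) / current
        for c in raw:
            if c[0] == g:
                weights[c] *= factor
    for a, share in age_target.items():
        current = sum(raw[c] * weights[c] for c in raw if c[1] == a)
        factor = (share * n) / current
        for c in raw:
            if c[1] == a:
                weights[c] *= factor

men = sum(raw[c] * weights[c] for c in raw if c[0] == "men")
under50 = sum(raw[c] * weights[c] for c in raw if c[1] == "under50")
print(round(men / n, 3), round(under50 / n, 3))  # 0.48 0.55
```

Real weighting software handles more variables (region, social class, newspaper readership, past vote) in the same way, but the principle is identical.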

Q. Aren’t your surveys self-selecting – and therefore likely to be skewed to people who have strong views about the subject of any given poll, rather than representative of the public as a whole?

A. No. When we seek the views of the general public, we select which respondents we want to survey. Only those selected are able to complete the questionnaire. When we email them we do not give the subject of the survey. Furthermore, our incentive system is designed to attract people who are not interested in the subject in question, as well as those who are passionate about it. This means that we experience little or no "drop-off" when a multi-topic survey shifts from one subject to another.

Q. How big are your samples – and how do these compare with those of polls using traditional methods?

A. Most of YouGov's polls of the general public achieve samples of at least 2,000. Among other companies conducting regular polls for the media, MORI’s monthly political tracking polls also have samples of around 2,000, while ICM and Populus normally poll 1,000 people. The headline voting intention figures – which eliminate respondents who say "don’t know" or "won't vote" or, in the case of MORI, ICM and Populus, are judged unlikely to vote – are based on samples, typically, of 1,500 for YouGov, 1,000 for MORI and around 550 for ICM and Populus.

Q. Why does sample size matter?

A. Because the risk of random sampling error is related to sample size: the smaller the sample, the greater the risk of such error. On a sample of 550, we can be sure that, 19 times out of 20, the true figure – that is, the figure that would have been obtained had the whole population been polled using the same methods – is within 4% of the published figure. Random error on a sample of 1,000 is up to 3%, on 1,500 up to 2.5% and on 2,000 up to 2%. Larger samples also allow the views of subgroups, such as women voters or Conservative supporters, to be measured more accurately.
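The error bands quoted above follow from the standard formula for the 95% margin of error on a proportion: 1.96 times the standard error, taken at the worst case of a 50/50 split. A minimal sketch (the 1.96 multiplier is the standard normal critical value, an assumption not stated in the text):

```python
import math

def margin_of_error(n: int, p: float = 0.5) -> float:
    """95% margin of error for a proportion p on a sample of n.

    p = 0.5 is the worst case, giving the widest interval.
    """
    return 1.96 * math.sqrt(p * (1 - p) / n)

# Reproduces, to within rounding, the figures quoted above.
for n in (550, 1000, 1500, 2000):
    print(n, round(100 * margin_of_error(n), 1))
# 550 → 4.2, 1000 → 3.1, 1500 → 2.5, 2000 → 2.2
```

Note that the margin shrinks with the square root of the sample size, which is why quadrupling a sample only halves the error, and why subgroup figures (based on a fraction of the full sample) carry wider margins than the headline numbers.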

Q. Isn’t there a danger of a "panel effect": that is, by being polled regularly, YouGov's respondents become conditioned by the process and cease to be representative of the electorate as a whole?

A. It is rare for any YouGov respondent to be asked the same or similar question (e.g. voting intention) very often. Any panel effect is therefore likely to be negligible. However, we monitor this from time to time by comparing the results from "fresh" with "repeat" respondents. So far we have found no variation.

Simple logic also suggests that if repeatedly asking the same question of the same individual could, for example, change the way someone was going to vote, political parties would no doubt already be using the technique widely. They are not.

Q. How can you tell that your respondents don’t simply give quick, random, slapdash answers, rather than their genuine views?

A. As with any polling company, we cannot completely guarantee that not a single respondent will lie or play silly games. However, there is no evidence that this is a real problem. From time to time we ask some respondents classification questions they have answered before to check for consistency – and find little or no evidence of anyone trying to take YouGov for a ride. There is evidence that many people are more honest when answering questions anonymously via a computer than talking to a stranger. They are also under no time pressure when completing surveys online. They can take as long as they want – which is one reason why online surveys are better than telephone or face-to-face surveys for asking complex questions that need time for thought.

Q. What is to prevent people – perhaps extremists of one kind or another – from signing on to YouGov's pool in order to influence YouGov's results?

A. When we conduct political and public policy surveys, the great majority of the people we survey are those we have proactively recruited via other carefully-selected websites. We monitor closely the minority who register with YouGov by visiting our site. If there is any sudden deviation in the pattern of such recruits, we retain the option of excluding them from our surveys.

It should be borne in mind that any organisation attempting to “move” our figures by, say, ten percentage points would have to infiltrate more than 5,000 people. Any attempt to do this would quickly be detected.

Q. Your surveys sometimes contradict those produced by companies using traditional methods. Doesn't this prove that your methods are unreliable?

A. No. All it proves is that different methods may produce different responses. In itself, this does not prove which (if either) is valid. In fact, such differences occur relatively infrequently. Normally, when YouGov and other companies have asked similar questions at around the same time – for example, on attitudes to the Iraq war and public responses to the Hutton inquiry – our published figures are much the same.

However, there are two areas where significant differences do persist. The first concerns voting intention. YouGov's figures tend to report slightly more support for the Conservatives, and slightly less for Labour, than other companies. On this, the recent record at elections tends to support YouGov's methods. In 2001, for example, we predicted Labour’s lead over the Conservatives to within one percentage point. Every other company overstated Labour’s lead, some of them by significant amounts. In the elections to the Scottish Parliament in 2003, YouGov was again alone in predicting the Labour and Conservative shares to within one percentage point. Three other companies, all polling by telephone, significantly overstated Labour's share and understated what turned out to be the Conservatives’ share.

The second subject on which YouGov differs from its rivals concerns taxation. We consistently find greater hostility to tax rises than our competitors. We believe that this is a result of online surveys producing more honest answers. Our findings are consistent with the results of three local referendums on the council tax. None of them has produced a majority for paying enough extra tax to improve local services. Indeed, in two of the three places (Bristol and Croydon), majorities have chosen no tax increase at all, despite warnings that services would be reduced. For some years politicians, both Labour and Conservative, have argued that (conventional) polls have been wrong to detect an appetite for higher taxes. YouGov's surveys suggest that the politicians may have been right all along.

Q. Why does YouGov consistently seem to produce results more favourable to the Conservatives than other polls?

A. Given YouGov's record for accurate predictions, and the tendency of almost all other polls in recent elections to understate Conservative support, perhaps the question ought to be: why do other polls consistently produce results LESS favourable to the Conservatives?

We believe that one major reason is what pollsters and political scientists call the “spiral of silence”. This theory suggests that, for more than a decade, some people have been reluctant to admit to a stranger that they might vote Conservative. The numbers involved are small – probably no more than 2 or 3 per cent of any national sample – but this is enough to give a misleading picture of the state of the main parties, especially in a fairly close race (such as the 1992 general election).

If this theory is right, then it is likely that YouGov polls are more accurate because we have no “interviewer effect”, as people responding to our surveys are filling in their questionnaires on a computer screen, rather than talking to another human being.

Q. How can you show that YouGov's polls are consistently accurate?

A. The only way any polling company can demonstrate its accuracy is when its results can be compared with real events. There have been six occasions when YouGov has predicted the outcomes of such events in Britain. In each case we have come close to the outcome. As well as the 2001 general election and 2003 Scottish Parliament elections already discussed, YouGov also correctly "called" the 2005 Conservative leadership election, the 2002 Pop Idol contest, the 2002 London borough elections and the 2007 Scottish Parliament elections regional vote.

Q. Who ensures that YouGov polls are 'honest and accurate'?

A. YouGov is a member of the British Polling Council and abides by its rules. YouGov is also registered with the Information Commissioner, and the majority of YouGov’s employees are members of the Market Research Society.

The British Polling Council (www.britishpollingcouncil.org) is an association of polling organisations that publish polls. The Council's objectives are to ensure standards of disclosure, so that consumers of survey results that enter the public domain have an adequate basis for judging the reliability and validity of those results.

YouGov is bound to respond in full to any bona fide enquiries about specific published polls.