SPUTNIK Russian daily omnibus
WHAT IS ‘VCIOM-SPUTNIK’? General information
VCIOM-SPUTNIK is a Russian daily telephone-based survey conducted by VCIOM. Every day we interview 600 respondents from at least 80 Russian regions and obtain reliable data.
This tool makes it possible to measure public opinion on any issue in a timely manner; within three days the sample size can reach 1,800 respondents. Every week we survey 4,200 respondents across all the Russian regions, which amounts to 16,800 respondents a month and more than 200,000 Russians a year.
VCIOM-SPUTNIK is an omnibus-type survey, which means that each questionnaire covers various topics (ranging from politics and the economy to marketing). Certain blocks of questions are included in the survey regularly (weekly, monthly, quarterly or yearly) and help track changes in indicators over time.
VCIOM-SPUTNIK uses the classical random (probability) sampling technique. This method makes it possible to generalize the results to the entire population and to assess the statistical error.
VCIOM-SPUTNIK provides multilevel data quality control comprising supervisor observation, call-recording checks and logical consistency checks. This ensures full compliance with the survey requirements.
WHAT VCIOM-SPUTNIK IS USEFUL FOR
This tool can be helpful if:
BONUS! When you buy an omnibus survey, you get the following data free of charge:
Methodological report on response rate
How we select the phone numbers
We use all the mobile phone number ranges published on the Ministry of Digital Development website. The aggregate frame comprises absolutely all telephone numbers that are already in use by mobile service providers or that can be introduced under applicable regulations.
The sampling frame is 40-45 thousand phone numbers chosen at random (using a random-number generator) from this database; the frame size depends on the season (larger in summer and during the winter holidays, smaller in autumn and spring). The telephone numbers to dial are also selected with the random-number generator.
What accounts for this frame size? The calculation is based on our long-term experience and ensures that even respondents who were not available on the first call are included in the sample. If the frame is large enough, there is a high probability of reaching the required number of respondents even if each phone number is called only once. If the frame is too small, there is a risk that the required number of respondents will not be reached even if every phone number is called multiple times. In VCIOM-SPUTNIK we make at least 8 attempts to get through to each respondent (if no one picks up the phone, if the call failed, or if it is not convenient for the respondent to talk).
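The selection described above can be sketched as follows. This is a minimal illustration of uniform random selection from provider number ranges; the ranges, their sizes, and the frame size here are hypothetical examples, not VCIOM's actual figures:

```python
import random

# Hypothetical mobile number ranges as (first_number, count);
# the real registry contains far more ranges.
ranges = [
    (79000000000, 10_000_000),
    (79160000000, 10_000_000),
    (79990000000, 10_000_000),
]

def draw_numbers(ranges, k, seed=None):
    """Draw k distinct phone numbers uniformly at random from the full frame."""
    rng = random.Random(seed)
    total = sum(count for _, count in ranges)
    numbers = set()
    while len(numbers) < k:
        idx = rng.randrange(total)          # uniform position across all ranges
        for start, count in ranges:
            if idx < count:
                numbers.add(start + idx)
                break
            idx -= count
    return sorted(numbers)

frame = draw_numbers(ranges, k=45_000, seed=1)
print(len(frame))  # 45000
```

Because every number in every range has the same probability of selection, the resulting frame inherits the geographic spread of the underlying number ranges without any fixed list of settlements.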
One key rule is taken into consideration when designing the sample frame.
We keep the proportions of phone numbers across federal districts.
This is important because the response rate differs considerably across federal districts. Significant differences in time zones must also be taken into account; otherwise, the sample may be biased in favor of regions where it is evening or daytime when the survey begins.
What regions and settlements are included in the sample?
As described above, the phone numbers are randomly selected from the entire array of the Russian phone numbers.
This is why we do not have any fixed list of sampling units (selected settlements) prepared in advance; such a list can only be compiled after the survey is completed.
However, certain sampling units (for example, large cities) are present in almost every survey. This is due to their large populations, which increase the probability of being included in the sample.
All the federal districts and at least 80 Russian regions are represented in each survey. The share of rural area inhabitants is approximately 18-20%.
How do we select respondents?
To achieve high representativeness, all respondents must have an equal chance of being included in the sample. For this purpose, the following requirements are observed:
We make at least 8 attempts to get through to every respondent. If the line is busy, the phone number is automatically redialed in 30 minutes. If no one answers, we redial every 2 hours. If it is inconvenient for the respondent to talk, we call back at the agreed time. If the respondent refuses during the first contact, we call again the next day.
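The redial rules above can be sketched as a simple dispatcher. The timings come from the text; the function and its outcome labels are illustrative, not VCIOM's actual system:

```python
from datetime import datetime, timedelta

MAX_ATTEMPTS = 8  # at least 8 attempts per respondent

def next_attempt(outcome, now, agreed_time=None):
    """Return when to redial, given the outcome of the last call attempt,
    following the rules described above (illustrative labels)."""
    if outcome == "busy":
        return now + timedelta(minutes=30)   # busy line: redial in 30 minutes
    if outcome == "no_answer":
        return now + timedelta(hours=2)      # no answer: redial every 2 hours
    if outcome == "inconvenient":
        return agreed_time                   # call back at the agreed time
    if outcome == "refusal":
        return now + timedelta(days=1)       # refusal conversion: next day
    return None                              # completed interview: stop

now = datetime(2024, 5, 1, 17, 0)
print(next_attempt("busy", now))  # 2024-05-01 17:30:00
```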
Calls are made within a fixed window: in each time zone, calls are placed between 16:00 and 21:00.
We dial mobile phone numbers and interview whoever picks up the phone.
If the moment is inconvenient, we call back at any time convenient for the respondent. The interviewer is not allowed to interview any other person instead.
We conduct refusal conversion: if the respondent refuses to take part in the survey, we call the next day and ask again. Approximately 10% of initial refusers then agree to be interviewed.
Data representativeness and statistical error
Equal chances for all Russians aged 18 and over to be included in the sample ensure data representativeness regardless of their place of residence.
In theory, the sample excludes only those Russians who have no mobile phone number; their share does not exceed 2%.
In probability theory, the law of large numbers states that the observed (arithmetic) mean of a sufficiently large sample from a fixed distribution tends to be close to the theoretical mean (expected value) of that distribution.
The sample size we use is large enough for this law to apply, so the distributions of key characteristics (for example, the share of men) in the sample and among Russian adults coincide. Thanks to the random selection technique, our data are close to Rosstat data.
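The law of large numbers is easy to illustrate with a simulation. The population share below (46% men) is a hypothetical figure chosen for the example, not a Rosstat statistic; the sample sizes are those mentioned in this document:

```python
import random

random.seed(0)
P_MALE = 0.46  # hypothetical population share of men

def sample_share(n):
    """Share of men in a simulated random sample of size n."""
    return sum(random.random() < P_MALE for _ in range(n)) / n

for n in (100, 600, 1800, 16800):
    print(n, round(sample_share(n), 3))
# As n grows, the sample share tends to settle near the population value 0.46.
```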
We reduce systematic error by redialing and refusal conversion (see above).
For the given random sample, the margin of error with a 95% confidence level does not exceed:
In addition to sampling error, minor changes in question wording and different circumstances arising during the fieldwork can introduce bias into the survey.
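For a simple random sample (ignoring design effects), the margin of error for a proportion follows the standard formula MOE = z * sqrt(p(1-p)/n). A sketch for the daily and three-day sample sizes mentioned above, taking the worst case p = 0.5 at 95% confidence:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p in a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (600, 1800):
    print(n, round(100 * margin_of_error(n), 1), "%")
# 600 respondents -> 4.0 %, 1800 respondents -> 2.3 %
```

Published margins of error may differ slightly, since the actual survey design can depart from simple random sampling.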
Data quality control
Data is our main deliverable, and we pay special attention to data quality. We carry out multilevel quality control that starts at the preparation stage, continues throughout the survey and ends with various data quality assessment procedures.
How we ensure data quality:
In theory, the random selection technique yields distributions of basic characteristics close to those in the general population.
However, to measure certain economic, social and political indicators, it is important to ensure not only that the distributions of the target indicators are accurate but also that the basic social and demographic variables are well balanced. This is why, when calculating social and political indicators, we weight the data by social and demographic variables. As a result, social groups that are underrepresented in the response array get an additional “bonus” to their viewpoint.
The variables we weight by:
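Post-stratification weighting of the kind described above can be sketched as a simple cell-weighting example. The sample composition and target shares below are hypothetical, not Rosstat figures, and real weighting typically balances several variables at once:

```python
from collections import Counter

# Hypothetical respondents (40% men in the sample) and hypothetical
# population targets for the same variable.
respondents = ["m"] * 240 + ["f"] * 360
targets = {"m": 0.46, "f": 0.54}

counts = Counter(respondents)
n = len(respondents)

# Cell weight = population share / sample share.
weights = {g: targets[g] / (counts[g] / n) for g in counts}

print({g: round(w, 3) for g, w in weights.items()})
# The underrepresented group (men here) gets a weight above 1,
# boosting its contribution to weighted indicators.
```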
Telephone-based survey response rate
We use AAPOR techniques to calculate the response rate in VCIOM-SPUTNIK (Standard Definitions, revised 2016).
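One of the AAPOR Standard Definitions, Response Rate 1 (RR1, the most conservative of the standard response rates), divides complete interviews by all interviews plus all eligible and possibly-eligible non-interviews. A sketch with hypothetical disposition counts:

```python
def aapor_rr1(I, P, R, NC, O, UH, UO):
    """AAPOR Response Rate 1: complete interviews (I) divided by
    interviews (I + P) plus eligible non-interviews (R + NC + O)
    plus cases of unknown eligibility (UH + UO)."""
    return I / (I + P + R + NC + O + UH + UO)

# Hypothetical dispositions: I complete, P partial, R refusals,
# NC non-contacts, O other, UH/UO unknown-eligibility cases.
rate = aapor_rr1(I=600, P=50, R=900, NC=1200, O=100, UH=800, UO=350)
print(round(rate, 3))  # 0.15
```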
Key response rate indicators are as follows:
It is wrong to assume that a higher response rate directly improves data quality. For instance, we could increase the response rate by using a fixed list of telephone numbers instead of a random sample of phone numbers. But in pursuit of a high response rate we would lose sample quality, sacrificing random selection and geographic coverage (no one knows how many phone numbers would be missed).
So far, researchers have not shown that surveys with different response rates produce different results. Sample bias occurs when respondents differ from nonrespondents in some characteristics, but the random sampling technique minimizes this probability.
However, a high nonresponse rate (compared to the average for a given survey) may result in poor overall survey quality and considerable bias in the results. That is why recording and analyzing response rates is our standard practice, aimed at minimizing the risk of nonresponse bias.
If you need further information, please contact us at: firstname.lastname@example.org