One of these panels is Ipsos MyView, and I thought I should share a recent experience with them that I found especially absurd and that might provide some insight into the lives of online panel poll recipients. I may not have gone public had it been entirely a one-off, but my general experience is that this panel is very buggy compared to at least some others. Surveys sometimes crash, or screen the respondent out without providing any points, or lurk in the respondent's dashboard as awaiting completion for weeks - only for you to be told, if you click on them, that the survey isn't available because you may have already taken it. There has been some improvement in recent months, with the panel more often referring the respondent to a different survey rather than screening them out immediately, but there are still far too many glitches.
Much of the time MyView works like this: you are invited to take a survey and asked a few screening questions. If you are accepted and complete the survey you then get points, which, once you have enough of them, can be converted to shopping vouchers, gift cards, EFTPOS cards or charity donations. Most completed surveys are worth between 100 points (effectively $1) and 300 points (effectively $3). If you are screened out you should in theory get 5 points (5 cents!). A common annoyance is filling out what may as well be a full survey and then being screened out anyway, though I have found that complaining about the more ridiculous examples of this will often result in 100-150 points in compensation.
Some of the surveys are reasonable value if you don't mind earning $15/hour or so in moments of spare time that are not otherwise useful. Some, however, are ridiculously long for the points given if one answers seriously, to the point that the client cannot possibly still be getting useful responses by the end (I've used the term "sweatshop" in the feedback section at the end of surveys many times). As a guide to the sort of overall value one might gain from being a regular member of one of these things: in the last 12 months my dashboard shows me as completing 94 surveys, being screened out 163 times, and receiving points for other things (a mix of loyalty points and complaining) six times, for a total of $165.60 in voucher value. Respondents won't be spending that all at once, and that doesn't account for the dozens of times I clicked on a survey link and ended up with nothing. Nonetheless, the surveys do provide some small measure of actual reward for people who enjoy them or find them interesting.
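For the curious, the back-of-envelope on those figures works out roughly as follows (a sketch only; it assumes every screenout actually credited its 5 points and ignores the six miscellaneous credits):

```python
# Back-of-envelope from my 12-month dashboard figures. Assumptions (mine,
# not Ipsos's): every screenout credited its 5 points, and the six
# miscellaneous credits are small enough to ignore.
POINTS_PER_DOLLAR = 100                    # 100 points = $1 of voucher value

total_points = 165.60 * POINTS_PER_DOLLAR  # $165.60 of voucher value in a year
screenout_points = 163 * 5                 # 163 screenouts at 5 points each
completed_surveys = 94

points_per_survey = (total_points - screenout_points) / completed_surveys
print(f"~{points_per_survey:.1f} points per completed survey")
# ~167.5 points, i.e. roughly $1.70 per completed survey - consistent
# with the usual 100-300 point range quoted above.
```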
(Some other panels, such as YouGov, use a different approach in which the respondent is pretty much never screened out and is generally given some kind of survey, but with a lower average value per completed survey - though also with more, shorter surveys.)
The Drinks Survey
Nothing in my experience of the MyView panel over the past few years prepared me for the circus that was the drinks diary survey. It started (more or less; there may have been some earlier invitation that I agreed to) with an email invitation on 14 May that looked like this:

[screenshot of the 14 May invitation email]
The survey turned out to be about drinks of all varieties. I have no idea who the client commissioning it was or what they would use the information for. Each time I took one of these surveys I would be asked to give the numbers of drinks of all kinds that I had consumed in the previous three days, even if they were only water. For one drink within this sample space I would be asked to provide many more details, for example: Where did I drink it? Was I with anyone? How large was it? Did I drink it from a bottle, a cup or a glass? What brand was it? Did I have food with it? What characteristics of the drink were important to me? Did I drink it in a box? Did I drink it with a fox?
The last two were of course not the real questions, but they would have made more sense than two of the actual questions, which asked how (from a list of options) I wanted the drink to make me feel, and how I wanted it to make me look (relaxed, confident, calm, energised etc). What was especially senseless about those two questions was that they had no "none of the above" option, and worse still no option for me to write an essay enquiring as to what portion of the human race really considers their status as a fashion model and visual influencer every time they consume a glass of water or a cup of tea.
In theory I should have been sent invitations every three days throughout the period. What I actually found was that, between the email system and the dashboard (which I sometimes check in case I have missed any surveys sent out by email), I was getting invitations to take part in these surveys almost every day. (The dashboard was the main culprit here, as the email invitations were fairly well spaced.) That was weird, but I thought that maybe I had entered the survey through multiple portals and was being polled twice (something that has actually happened to me with some other surveys).
When I filled out the surveys to the end, one of the following things happened, with each of them happening many times:
1. I completed the full survey, was told that was all for today, and to check back in three days for the next round.
2. I completed the full survey then was told my response had been rejected for data quality reasons. This tended to happen especially if I filled out the survey via the dashboard within a day or so after having a response accepted.
3. The survey asked me only a few of the usual drinks questions, said that was all the questions it had for me today, and then the final screen told me I had just earned 200 points.
As my experience with this survey was different to what had been foreshadowed, I was very curious to see what would happen with my points tally 30 days after the survey finished. On 20 June (a week late) I received this:

[screenshot of the 20 June email]
I was expecting at least 3000 points for this survey, having filled it out to the end and had it accepted several times, and having also been told at least half a dozen times that I had just earned 200 points. So it was rather odd to look into my profile and find that I had been awarded, in total, 450 points (= $4.50): one lot of 100 points and one lot of 350.
Subsequent Exchanges
I sent a brief please-explain and received a response on 23 June, the important parts of which were as follows:
"As shown in the screenshot above, please be advised that the 1st transaction of 100 points is for participating the diary surveys, and the 2nd transaction is total of additional amount based on your response quality. Please note that the number of transactions does not refers to the number of entries that you have submitted.
As you can see from the original email we sent you (as shown in the below screenshot), we did clearly state that the number of rewards would be dependent on the quality of the responses received. "
This was, of course, complete nonsense. The only references to response quality in the original email had referred to lots of 1000 or 5000 points - there was no way to earn 350 points for response quality. There was also no way to earn 100 points for responses, since even a single response was worth 150 points. Indeed, even if the points had all been for responses and not response quality, there was still no way to get 450 points: depending on interpretation, 1 response was worth 150 points, 2 were worth either 200 or 300, 3 either 550 or 750, 4 either 700 or 1000, and so on. I wrote back again at some length, and also pointed out that they were again ignoring what I had said about often being told I had just earned 200 points.
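To spell that arithmetic out (a sketch only - the cumulative totals here are the ones listed above; the exact schedule was in the email's screenshot):

```python
# Cumulative point totals for completed responses under the two possible
# readings of the invitation email (figures as listed above).
reading_a = {1: 150, 2: 200, 3: 550, 4: 700}   # one interpretation
reading_b = {1: 150, 2: 300, 3: 750, 4: 1000}  # the other interpretation

achievable = set(reading_a.values()) | set(reading_b.values())

# None of the amounts actually credited fits either reading:
for amount in (100, 350, 450):
    print(amount, "achievable" if amount in achievable else "not achievable")
# 100 not achievable
# 350 not achievable
# 450 not achievable
```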
On 27 June I received a response, the important parts of which were:
" After a thorough investigation, our record shows that you have completed 3 diary entries. Please refer to the breakdown of your points as below :
Please refer below for the breakdown of your points :
3 diary entries x 100 points per response = 300 points
3 diary entries x 150 extra points per response = 450
Extra points for providing 3 responses = 150"
Now this still doesn't make sense, because they've already counted the promised 150 extra points per response for 3 responses, and therefore there are no additional "extra points for providing 3 responses" to add. But the bigger problem is that they're claiming I only completed 3 entries, which I know isn't true, and which I had already told them wasn't true. I filled out the surveys - to the point of either being told to check back in a few days, or being told that was all for today and I had just earned 200 points - at least a dozen times, not counting the times my entry was rejected.
I wrote back again, and on 31 July received a supposedly itemised list of my sessions answering the drinks survey. Their system had a record of a plausible 20 occasions on which I had started a session of the survey. However, their system recorded:
* three of these as successful completions
* five as screenouts because I had completed surveys more rapidly than once every three days (consistent with my memory of the timing of cases where I submitted an entry that was rejected after filling in one survey from my dashboard soon after another)
* twelve supposed dropouts
The alleged dropouts included all nine cases where I started a session in June, up to 8 June (the day I was last emailed an invitation for this survey series). In a large number of cases I completed either the long or the short version of the survey and was told either to check back or that I had just earned 200 points, yet MyView clearly must have either recorded these events as dropouts or not recorded them at all.
In their first three emails to me, at no point did Ipsos MyView:
* Acknowledge or directly address my comment that in several cases I was told I had just earned 200 points (beyond finally saying in their third email that "Production team also confirmed that the reward per response is only 100 points as stated in the invitation email.")
* Accept my comments that the number of entries I successfully completed greatly exceeded three.
* Give me any points as an apology for the false nature of their original response (which meant I had to write a much longer email explaining why they were wrong), or a thank-you for the advice I gave them on how to stop this debacle happening again. (My advice was to either provide points at the time, or at least record a survey as completed in a member's record so the member can see immediately whether their responses were accepted or not - see the sketch after this list.) They have said the feedback will be passed on to the client, who will hopefully improve the survey as a result.
* Show any awareness that their system is full of bugs and that a customer who complains about bugs in their system is probably right.
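To illustrate the second half of that advice - purely hypothetically, as I have no idea how Ipsos's system is actually built - the fix amounts to writing each session's outcome to a record the member can see the moment the survey ends, rather than reconciling everything a month later:

```python
# Hypothetical sketch of the suggested fix: log each session's outcome to
# the member's visible record as soon as the survey ends, so disputes like
# this one can be checked against something both sides can see.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class Outcome(Enum):
    COMPLETED = "completed"          # full or short version finished
    SCREENED_OUT = "screened out"    # e.g. a repeat attempt within 3 days
    DROPPED_OUT = "dropped out"

@dataclass
class SessionRecord:
    survey_id: str
    finished_at: datetime
    outcome: Outcome
    points_credited: int             # credited immediately, not 30 days later

def close_session(member_history: list[SessionRecord], survey_id: str,
                  outcome: Outcome, points: int) -> None:
    """Append the session's outcome to the member's own visible history."""
    member_history.append(
        SessionRecord(survey_id, datetime.now(), outcome, points))
```

Had something like that existed, the question of whether I completed 3 entries or a dozen could have been settled in seconds by either side.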
They, in effect, chose to believe the records of their bug-ridden system rather than the person taking the survey.
Finally (this update added 3 August), after a fourth email from me, they compensated me 1500 points and apologised for the issues with these surveys.
Tales Of The Punch-Drunk Panellists
Based on my overall experience of the MyView panel, it's no surprise to me that pollsters have to continually deal with respondents who speed, straightline, satisfice and in other ways attempt to game the system to get points as fast as possible. Doubtless some do it from the start, but I suspect others would start out with good intentions and become more cynical over time.
More often than not MyView works correctly, but it is buggy so often that it becomes a ripoff for a respondent who answers every question honestly and thoughtfully to the best of their ability. I'd expect that some respondents come to see it as a game in which they become more interested in getting rewards as easily as possible than in the quality of the information they are providing. Things have changed since the days when most polling was done by live phone and most people were happy to have a yarn to the pollster for nothing!
MyView is not used only by Ipsos. Through it I have taken polls run by a number of other pollsters, including Dynata, Painted Dog, Q+A Research and others that I probably do not remember. Ipsos is a member of the Australian Polling Council. I hope that Ipsos MyView will improve its service to its members in the interest of quality participation in the polls that its panel provides.
Ipsos MyView are welcome to respond in comments.