October 26th, 2008
It’s no secret that if you put ‘sex’ on the cover of a magazine or newspaper you shift more copies. And if you include a sex supplement or two in your paper you can be sure sales will go up.
Which makes my take on today’s Observer ‘Sex’ supplement slightly jaded. I’d love to believe it is a genuine attempt to investigate and accurately report the nation’s sexual habits. But I’m afraid I’m not convinced.
I just find these supplements run to the same kind of format. Throw in a few glossy pics that hint at a bit of rudery. Add a few interviews with sex salespeople, therapists, healthcare staff and religious ministers. Talk to some teens and oldies and see how they differ. Mix in some columns that discuss chastity, changing attitudes to sex, gender differences, the internet and intimacy. Finish off with an obligatory survey (which will be used to generate publicity for the supplement).
Only a few weeks ago The Independent ran a week-long ‘investigation’ into our love lives – with daily supplements covering a range of topics that conform to the format outlined above. Today The Observer has done it. No doubt by the close of the year or early into 2009 some other publication will do the same.
There are always interesting aspects to these supplements. But the fact that they never really venture far out of safe territory while claiming to be highly provocative always irritates me. The same kind of faces, the same folk with products and services to promote, the same messages about sex all feed what seems to be rather lazy journalism.
This is a shame because there’s so much innovative and exciting research going on, activism that’s not being reported, and debates needing to be had. You have to ask why it is these stories are neglected in favour of others. It’s no accident that sex continues to be constructed in a particular way.
I don’t want to bore you by deconstructing the whole Observer Sex supplement, and as mentioned certain parts of it are interesting. Their sex survey does require investigation though. So if you’re into sex research then feel free to read on – but this is a bit of a lengthy blog, so if you’re not so bothered about sex surveys you’ve my blessing to leave the class now.
In fairness this sex survey is only a ‘D plus’ rather than the usual ‘F’ grade offerings we see in the press. But the fact The Observer are passing this off as some amazingly accurate picture of British sex lives is a worry since the survey doesn’t truly offer this. It’s so full of methodological holes we can’t really accept any of the findings.
Unusually for a PR survey (because let’s face it this is really what this questionnaire represents) a ‘methodology’ section is included. It states ‘A sample of 1,044 UK adults aged 16+ were interviewed by ICM Research in September 2008. Participants completed a confidential questionnaire, which was then placed in a sealed envelope. Interviews were conducted across the country and the results have been weighted to the profile of all adults’.
‘What’s wrong with this?’ I hear you ask. ‘You’re always moaning that people don’t reveal their methodologies, so why’s this a problem?’
Let’s take a closer look at this statement. It doesn’t give a proper age breakdown. It mentions participants were aged 16+ but not the whole age range. We’ve no idea whether most respondents were under 30, in their 40s, or over 70. There are some mentions in the report of a 65+ age group, but we don’t know how many participants fell within different age bands. This could very easily skew the survey, and although we’re told the results were weighted it’s unclear whether that means within the sample or to the wider UK population.
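To see why the age breakdown matters, here’s a minimal sketch of how demographic weighting works. Every figure below is made up for illustration – ICM published no such breakdown for this survey. If one age band is under-represented in the sample relative to the population, each of its answers gets scaled up accordingly.

```python
# Hypothetical sample and population shares by age band (NOT from the survey).
sample_share = {"16-30": 0.20, "31-50": 0.45, "51+": 0.35}
population_share = {"16-30": 0.25, "31-50": 0.40, "51+": 0.35}

# Post-stratification weight = population share / sample share.
# An under-represented band (16-30 here) is up-weighted; an
# over-represented one (31-50) is down-weighted.
weights = {band: population_share[band] / sample_share[band]
           for band in sample_share}

for band, w in weights.items():
    print(f"{band}: weight {w:.2f}")
```

The catch is that weighting can only stretch the answers you actually collected: if only a handful of over-65s took part, their weighted-up responses are resting on a tiny, noisy base.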
It’s great that the questionnaire was confidential, and we can assume since the questionnaire was put into a sealed envelope that it wasn’t an online survey. But we can’t tell much more than that. Were the questionnaires completed by a participant on their own, or by an interviewer? Did participants complete the questionnaires in any specific location? Did they answer the questionnaire alone or in a group? Who noted the answers – an interviewer or participant?
This isn’t me being pedantic. A participant who gives their answers to an interviewer may well answer differently than if they were completing the questionnaire themselves. A participant who answers alone may answer differently than if they were in a group setting. Someone doing a survey in their own home may answer in a different manner than if they were in a research centre.
We’ve no idea from this ‘methodology’ how the people in the research were even recruited. Did researchers just turn up at their homes and ask them to complete a survey? Was the survey advertised somewhere and participants invited to participate? Was the survey run at a number of research settings across the UK? Were participants incentivised to take part? Were they people who were signed up to ICM or another survey agency as someone willing to take part in research? Were they ‘expert participants’ (someone who is paid to do lots of surveys) or sampled to be more representative of the UK population?
I’m not suggesting anything sinister has gone on by asking these questions, but simply drawing attention to the fact that without knowing the answers we don’t really know who took part, how they were recruited, what their motivation was for doing the survey, how this would affect their answers and how representative of the UK population they are.
A kosher sex survey would tell you all this – and let you know how many people in total were approached and of them how many refused to take part (and why). That way you can work out how reliable and representative the overall survey data is.
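The response-rate arithmetic itself is trivial – the point is that the survey never gives us the numbers to do it. Both figures below are assumed for illustration; only the 1,044 sample size was reported.

```python
approached = 2000        # people invited to take part -- an assumption
completed = 1044         # the reported sample size
refused = approached - completed

response_rate = completed / approached
print(f"Response rate: {response_rate:.1%} ({refused} refusals)")
```

A low response rate matters because the people who refuse a sex survey may differ systematically from those who agree, which is exactly the representativeness problem described above.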
The kinds of questions asked are also rather random. Some ask about what sexual activity participants have had, some ask for people to predict what they might do, some ask about general attitudes. The survey ranges from safer sex, to disability, to race, cosmetic surgery, sexuality, attitudes to paedophiles, views on prostitution, sexual performance, infidelity, number of sexual partners, virginity, and sex education. But the ordering and phrasing of questions could easily influence answers and don’t all seem to fit together (or rather they construct a very particular view of sex and relationships).
The survey seems to break one of the cardinal rules of questionnaire research which is to define your key terms. So participants are asked a lot of questions about ‘sex’, but it’s never made clear what this means. Presumably nobody was given any working definitions on the questionnaire, which is concerning since we know if you ask people what they think ‘sex’ means not everyone responds as though they were talking about penetration. Unless you define sex you don’t know what you’re really measuring.
Within the questionnaire people are asked to rate their ‘performance’ or talk about how often they have ‘sex’. This is misleading. Here’s why. On a scale of 1 to 5 (5 being very good) I might rate myself a 5 for hand jobs, a 3 for oral and a 1 for intercourse. Either because I think I am better at some activities than others, or because I do some activities more than others. I might have intercourse once a week but I may masturbate twice a day – and I may consider both of those things, or only one of them, to be ‘sex’.
So when this survey was asking people to rate ‘performance’ you’ve no idea exactly what they were rating, or what ‘sex’ they were having (and how often).
We’ve no idea how much data was lost within this survey because key terms weren’t defined and the activities we might count as sex – oral sex, masturbation, anal sex, fantasy and role play, kissing and so on – were never itemised.
Many of the answers on the survey (particularly around losing virginity, sexual partners and seeing a prostitute) fit with current (kosher) sex research. This is reassuring. But the strange thing is nobody writing the Sex supplement appeared to know or notice – or at least report – that a. any existing research had been done, and b. how the results of this survey fitted with existing data.
This is a key problem because any reputable sex survey does not stand in isolation: it builds upon existing surveys and compares findings with them. Yet The Observer have presented their data as if they’ve discovered something groundbreaking and ‘new’, when the same findings have been consistently reported throughout this decade by more reliable UK sex surveys. It’s just poor practice to imply your data are somehow novel without noticing (or declaring) that they aren’t.
I was also worried to see really crude response options being used in the survey. Yes, yes, I know it makes me sound geeky, but look at why it’s a problem. Participants were asked things like ‘have you ever used sex toys?’ and given the answer yes or no. Meaningless really, as it doesn’t differentiate between the person who used one once and then left it in the back of the sock drawer and the person who is permanently attached to their Fleshlight. It doesn’t indicate what sex toys were used or how. Meaning the ‘yes’ answers could include the woman who does her boyfriend up the bum with a strap on or the lady who likes a vibrator. And it doesn’t tell us whether the person liked the experience or not.
Or take the question ‘which of the following best describes how frequently you have been unfaithful?’ (Answer – regularly, occasionally, rarely, only once). Kosher sex research never uses terms like ‘unfaithful’ because we know such value laden words prevent people feeling safe enough to answer honestly. Terms like ‘frequently’, ‘regularly’ and ‘rarely’ mean nothing unless you define them clearly. My ‘frequently’ is having several affairs per year; yours may be having two or three during the course of a relationship – or even during your lifetime.
The survey asks about one night stands but doesn’t define them. A one night stand can be something you deliberately go out and search for, or it could be a sexual encounter you don’t want to repeat, or a relationship you hoped would get off the ground but never did. We don’t know of the 49% of folk who said they had a ONS exactly what kind of experience they had – or how they felt about it. There’s a built-in assumption within the survey that they’re bad, and the fact this question seems to come after the ones on infidelity could well have affected how participants responded.
There’s some weird reporting of data too. For example 9% of respondents say they’ve seen a prostitute, while later we’re told 18% of all British men have visited one. At first glance these figures look contradictory, but they roughly reconcile if almost no women answered yes – something the report never explains, leaving readers to do the arithmetic themselves. Even more strangely, participants were asked whether they’d consider having sex for cash, which of course a fair number said they would. Saying you’d do something isn’t the same as actually doing it.
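A back-of-envelope check shows how the two prostitution figures can coexist. The male/female split and the women’s rate below are assumptions (a roughly even split, virtually no women saying yes); only the 18% and 9% come from the report.

```python
male_share = 0.49     # assumed share of men in the sample
men_rate = 0.18       # reported: 18% of men have visited a prostitute
women_rate = 0.0      # assumption: virtually no women said yes

# Overall rate is the weighted average across the two groups.
overall = male_share * men_rate + (1 - male_share) * women_rate
print(f"Implied overall figure: {overall:.1%}")  # ~8.8%, close to the reported 9%
```

So the numbers aren’t necessarily wrong – the fault is editorial: presenting the two figures pages apart with no explanation of how they relate.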
A fair amount of attention is given to sexually transmitted infections (incorrectly referred to as ‘diseases’), but this section opens with contraception questions, which may have primed participants’ answers to what followed. Asking people if they’ve ever had an STI is very invasive, and if the survey was interviewer-administered then participants may well not have replied openly for fear of being judged.
It’s also fairly pointless asking if someone has had an STI given many are symptomless. Again, we know from reputable sex surveys that people often report they always use condoms or have never had an STI but if you test them for infection you find they do have one. Not because participants lie, but because they don’t know the answer. A more accurate way of asking would be to see if they’d ever had a test for an infection (and what the result was).
The survey peculiarly asks whether respondents have had sex with a disabled person, but doesn’t seem to measure whether respondents themselves are disabled. Predictably the answer is that 70% wouldn’t have sex with someone with a disability, but again disability isn’t defined. We know from reliable research that people are not so concerned about a disability that may not be immediately obvious (e.g. deafness or Asperger’s) but may well be prejudiced against someone whose disability they notice straight away (e.g. someone with cerebral palsy or who uses a wheelchair).
While I don’t dispute there’s prejudice against disabled people when it comes to dating and relationships, lumping all ‘disability’ into one category is unhelpful. More than that, it’s somewhat distasteful and fairly pointless to include within the questionnaire – particularly as it comes just before the question on whether you’d sleep with someone who’s a different colour from you. You can almost hear that the survey’s been set up for the white and able-bodied to answer, with a couple of questions to draw out the fact they find sex with someone disabled or a different colour a. a bit icky and b. very similar.
In any reputable sex survey, particularly ones that cover issues relating to STIs, child abuse, infidelity or sexuality it’s always important to offer support to those taking part in the research. Usually this takes the form of providing sources of help for those affected by the study (for example a list of self help groups a participant could contact if they needed to). This may have happened in this survey, but no mention has been made of any support offered – or any acknowledgement that distress could have been caused.
It’s all well and good to say you put questionnaires into a sealed envelope (by the way, how do you place a questionnaire into a sealed envelope? Surely you mean into an envelope that was then sealed). But if your survey questions aren’t reliable or clear, and if you allow sex-negative terms or statements to shape the questionnaire, then your survey isn’t adding to our knowledge of sex.
If The Observer, or any other paper, wants to know how to set up a reliable and accurate sex survey – one that would truly give a picture of our sex lives – then my door is always open.
It really isn’t all that difficult – and since you’re going to keep on doing sex surveys and spending loads of cash on them, you might as well ensure the survey is at least something that’s useful while you’re at it.