Last week I shared my notes from the first two sessions of the DC-AAPOR/WSS Summer conference preview/review. Here are the rest of my notes, covering the remaining sessions:
Session 3: Accessing and Using Records
- Side note: Some of us may benefit from a support group format re: matching administrative records
- AIR experiment with incentives & consent to record linkage: a $2 incentive sometimes did worse than $0. A $20 incentive yielded the highest response rate and the highest consent rate earliest in the process, and was cheaper than phone follow-up
- If relevant data is available, the $20 incentive can be targeted to likely nonrespondents (a rough sketch of what that targeting could look like follows this list)
- Evaluating race & Hispanic origin questions- this was a big theme over the course of this conference. The social constructedness of racial/ethnic identity doesn’t map well to survey questions. This Census study found changes in survey answers based on context, location, social position, education, ambiguity of phenotype, self-perception, question format, census tract, and proxy reports. It also found a high number of missing answers.
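Since targeted incentives come up again in the adaptive design session, here is a minimal sketch of what that targeting could look like in practice. This is my own illustration, not anything presented at the conference: the frame file, its column names, and the 0.35 cutoff are all hypothetical.

```python
# A rough, hypothetical sketch of incentive targeting: the frame file,
# its columns, and the 0.35 cutoff are all made up for illustration.
import csv

CUTOFF = 0.35  # assumed response propensity below which a case gets $20

with open("frame.csv") as f:
    cases = list(csv.DictReader(f))

for case in cases:
    # Assumes the frame carries a modeled response propensity per case.
    if float(case["predicted_propensity"]) < CUTOFF:
        case["incentive"] = 20  # likely nonrespondent: send the $20 incentive
    else:
        case["incentive"] = 0   # likely respondent: no incentive needed

targeted = sum(1 for c in cases if c["incentive"] == 20)
print(f"{targeted} of {len(cases)} cases assigned the $20 incentive")
```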
Session 4: Adaptive Design in Government Surveys
- A potpourri of quotes from this session that caught my eye:
- Re: Frauke Kreuter “the mother of all paradata”
Peter Miller: “Response rates is not the goal”
Robert Groves: “The way we do things is unsustainable”
- Response rates are declining, costs are rising
- Create a dashboard that works for your study. Include the relevant data you need in order to have a decision-making tool that is tailored, dynamic, and data-based (a sketch of some dashboard-style summaries follows this list)
- Include paradata, response data
- Include info re: mode switching, interventions
- IMPORTANT: prioritize cases, prioritize modes, shift priorities with experience
- Subsample open cases (those that have not yet responded)
- STOP data collection at a sensible point, before your response bias starts to grow and before you waste money on expensive interventions that can actually make your data less representative
- Interviewer paradata
- Choose facts over inference
- Presence or absence of key features (e.g. ease of access, condition of property)
- (for a phone survey, these would probably include presence or absence of answer or answering mechanism, etc.)
- For a household survey, household factors more helpful than neighborhood factors
- Three kinds of adaptive design
- Fixed design (ok, this is NOT adaptive)- treat all respondents the same
- Preplanned adaptive- tailor mailing efforts in advance based on response propensity models
- Real-time adaptive- adjust mailing efforts in response to real-time response data and evolving response propensities
- Important aspect of adaptive design: document decisions and evaluate success, re-evaluate future strategy
- What groups are under-responding and over-responding?
- Develop propensity models (see the second sketch after this list)
- Design modes accordingly
- Save $ by focusing resources
- The NSCG (National Survey of College Graduates) used adaptive design
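To make the dashboard bullet concrete, here is a minimal sketch of the kind of summaries such a dashboard could be built on. The paradata file and all column names are assumptions for illustration, not anything shown in the session.

```python
# Hypothetical sketch of dashboard-style summaries; "paradata.csv" and
# all column names (case_id, mode, attempt_date, responded) are assumptions.
import pandas as pd

paradata = pd.read_csv("paradata.csv", parse_dates=["attempt_date"])
paradata["week"] = paradata["attempt_date"].dt.isocalendar().week

# Attempts, completes, and response rate by mode and week -- the kind of
# numbers you would watch to decide on mode switches or interventions.
summary = paradata.groupby(["mode", "week"]).agg(
    attempts=("case_id", "size"),
    completes=("responded", "sum"),
)
summary["response_rate"] = summary["completes"] / summary["attempts"]
print(summary)
```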
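And a companion sketch for the propensity-model bullets, reusing the same hypothetical paradata file: fit a simple response propensity model on closed cases, score the open ones, and prioritize low-propensity cases for the more intensive mode. The predictor columns and the prioritization rule are my own assumptions.

```python
# Companion sketch for the propensity-model bullets; same hypothetical
# paradata file, with assumed numeric predictor columns and a made-up
# prioritization rule (work the 500 lowest-propensity open cases).
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("paradata.csv")
predictors = ["prior_contacts", "urban", "mode_web"]  # assumed numeric/0-1 columns

# Fit on closed cases (responded is 0/1), then score the still-open ones.
closed = df[df["case_open"] == 0]
model = LogisticRegression().fit(closed[predictors], closed["responded"])

open_cases = df[df["case_open"] == 1].copy()
open_cases["propensity"] = model.predict_proba(open_cases[predictors])[:, 1]

# Prioritize low-propensity cases for the more intensive mode, and re-run
# as new response data arrives (the real-time adaptive flavor).
priority = open_cases.sort_values("propensity").head(500)
print(priority[["case_id", "propensity"]])
```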
Session 5: Public Opinion, Policy & Communication
- Marital status checklist: the categories are not mutually exclusive, so use checkboxes rather than a single choice
- Cain conducted a meta-analysis of federal survey practices
- Same sex marriage
- Because of DOMA, federal agencies were not able to use same-sex marriage data. Now that it has been struck down, the question is more important, with funding and policy issues resting on it
- Exploring measurement:
- Review of research
- Focus groups
- Cognitive interviews
- Quantitative testing ← current phase
- Estimates of same sex marriage are dramatically inflated by straight people who select their gender incorrectly: because the straight population is so much larger, even a tiny error rate swamps the true count (a size/scope/scale problem)
- ACS has revised marriage question
- Uses “parent 1,” “parent 2,” … instead of “mother,” “father”
- Yields more same sex couples
- Less nonresponse overall
- Allows step, adopted, biological, foster, … categories
- Plain language
- Plain Writing Act of 2010
- See handout on plain language for more info
- Pretty much just good writing practice in general
- Data visualization makeovers using Tufte guidance
- Maybe not ideal makeovers, but the data makeover idea is a fun one. I’d like to see a data makeover event of some kind…
Session 7: Questionnaire Design and Evaluation
- Getting Your Money’s Worth! Targeting Resources to Make Cognitive Interviews Most Effective
- When choosing a sample for cognitive interviews, focus on the people who tend to have the problems you’re investigating. Otherwise, the likelihood of choosing someone with the right problems is quite low (see the quick calculation at the end of this session’s notes)
- AIR experiment: cognitive interviews by phone
- Need to use more skilled interviewers by phone, because more probing is necessary
- Awkward silences are more awkward without clues to what the respondent is doing
- Hard to evaluate graphics and layout by phone
- When sharing a screen, interviewer should control mouse (they learned this the hard way)
- On the plus side: more convenient for both interviewee and interviewer, interviewers have access to more interviewees, and data quality is similar, or good enough
- Try Skype or something?
- Translation issues (much of the cognitive testing centered on translation issues; I’m not going into detail on them here, because they don’t transfer well from one survey to the next)
- Education/international/translation: they tried to assign equivalent education groups across countries and reflect those equivalences in the question wording, but when respondents didn’t agree with the equivalences suggested to them, they didn’t follow the questions as written
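A quick back-of-envelope calculation for the targeting point above. The 5% prevalence and the 10-interview sample are assumed numbers, just to show the shape of the problem:

```python
# Back-of-envelope: chance an untargeted cognitive-interview sample
# includes anyone with the problem. Prevalence and n are assumed.
p, n = 0.05, 10  # assumed: 5% of people have the problem; 10 interviews
p_at_least_one = 1 - (1 - p) ** n
print(f"P(at least one affected respondent in {n} interviews) = {p_at_least_one:.2f}")
# ~0.40, so targeted recruitment of affected respondents pays off
```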
Poster session
- One poster was laid out like Candy Land. Very cool, but people stopped by more to make jokes than substantive comments
- One poster had signals from interviews that the respondent would not cooperate, or 101 signs that your interview will not go smoothly. I could see posting that in an interviewer break room…
Session 8: Identifying and Repairing Measurement and Coverage Errors
- Health care reform survey: people believe what they believe in spite of the terms and definitions you supply
- Paraphrasing Groves (1989:449): “Although survey language can be standardized, there is no guarantee that interpretation will be the same”
- Politeness can be a big barrier in interviewer/respondent communication
- Reduce interviewer rewording
- Be sure to bring interviewers on board with project goals (this was heavily emphasized on AAPORnet while we were at this conference: the importance of interviewer training, valuing the interviewers’ work and making sure they feel valued, collecting interviewer feedback and restrategizing during the fielding period, and debriefing the interviewers after fielding is done)
- Response format effects when measuring employment: slides requested