$63,000 worth of abusive research . . . or just a really stupid waste of time?

As someone who relies heavily on survey research, it’s good for me to be reminded that some surveys are useful, some are useless, but one thing they almost all have in common is . . . they waste the respondents’ time.

I thought of this after receiving the following email, which I shall reproduce here. My own comments appear after.

Recently, you received an email from a student asking for 10 minutes of your time to discuss your Ph.D. program (the body of the email appears below). We are emailing you today to debrief you on the actual purpose of that email, as it was part of a research study. We sincerely hope our study did not cause you any disruption and we apologize if you were at all inconvenienced. Our hope is that this letter will provide a sufficient explanation of the purpose and design of our study to alleviate any concerns you may have about your involvement. We want to thank you for your time and for reading further if you are interested in understanding why you received this message. We hope you will see the value of the knowledge we anticipate producing with this large academic study.

We are decision-making researchers interested in how choices differ when they are made for “now” versus for “later”. Previous research has shown that people tend to favor doing things they viscerally want to do over what they believe they should do when making decisions for now, while they are more likely to do what they believe they should when making decisions for later (for a review, see Milkman, Rogers and Bazerman, 2008). The email you received from a student asked for a meeting with you either today (if you were randomly assigned to the “now” condition) or in a week (if you were randomly assigned to the “later” condition). This email was actually from a fictional student. It was designed for a study of the responsiveness of university faculty to meeting requests from prospective students of various backgrounds made on short notice versus well in advance. Faculty members at the top 260 U.S. universities (as ranked by U.S. News and World Report) and affiliated with Ph.D. programs were identified as potential participants in this study, and a random sample (6,300 faculty in total – one per Ph.D. program) were selected to receive emails. In addition to examining the responsiveness of faculty to meeting requests for “now” versus “later”, we are also interested in how the identity of the applicant affects, or does not affect, response rates, and as such, the name of the student sending a meeting request was varied (by race and by gender). We expected that students from underrepresented groups would receive fewer meeting acceptances than other students, though we have competing hypotheses about whether this effect would be stronger in the “now” or the “later” condition.

The email you received as a part of this study contained the following message:

“I am writing you because I am a prospective Ph.D. student with considerable interest in your research. My plan is to apply to Ph.D. programs this coming fall, and I am eager to learn as much as I can about research opportunities in the meantime.

I will be on campus today/(next Monday), and although I know it is short notice, I was wondering if you might have 10 minutes when you would be willing to meet with me to briefly talk about your work and any possible opportunities for me to get involved in your research. Any time that would be convenient for you would be fine with me, as meeting with you is my first priority during this campus visit.

Thank you in advance for your consideration.”

As soon as the results of our research are available, we will post them on our websites. Please rest assured that no identifiable data will ever be reported from this study, and our between-subjects design ensures that we will only be able to identify email responsiveness patterns in aggregate – not at the individual level. No individual or university will be identifiable in any of the research or data we publish. Of course, any one individual email response is not meaningful as there are multiple reasons why an individual faculty member might accept or decline a meeting request. All data has already been de-identified and the identifiable email responses have already been deleted from our databases and related server. In addition, during the time when the data was identifiable, it was protected with strong and secure passwords. And as is always the case when academics conduct research involving human subjects, our research protocols were approved by our universities’ Institutional Review Boards (the Columbia University Morningside IRB and the University of Pennsylvania IRB).

If you have any questions about your rights as a research subject, you may contact the Columbia University Morningside Institutional Review Board at 212-851-7040 or by email at [email protected] and/or the University of Pennsylvania Institutional Review Board at 215-898-2614.

Thank you again for your time and understanding of the work we are doing.

Sincerely,

Katherine L. Milkman
Assistant Professor of Operations and Information Management
The Wharton School, University of Pennsylvania

Modupe Akinola
Assistant Professor of Management
Columbia Business School

My reply:

Dear Drs. Milkman and Akinola:

I believe that it would be appropriate and ethical for you to compensate all of the participants in your study. My understanding is that standard ethical guidelines require that research subjects (which, in this case, include me) be compensated for their time. I am surprised that your study passed the IRB without this–especially in a situation such as this in which we were not asked ahead of time whether we wanted to participate.

According to your email, 6,300 of us have participated in your study. Perhaps you could pay $10 to each of us for our involuntary participation. This would come to $63,000 (plus the costs of checks, letters, and stamps)–surely a small price to pay for this valuable increase in scientific knowledge?

P.S. Some might say that it is mean of me to send such a sarcastic email to two evidently serious researchers. If I had been asked to participate in the study, I would have responded more discreetly, but the unsolicited nature of the project seemed to demand an equivalent response. I am indeed sensitive to the ethical difficulties of survey research, but this does not stop me from feeling that my helpful impulses toward inquiring students are being abused by this sort of study, which I think belongs in the trash heap of ill-advised research projects along with Frank Flynn’s notorious survey from a few years ago when he tried to get free meals out of NYC restaurants by falsely claiming food poisoning. What is it with Columbia Business School researchers?

40 thoughts on “$63,000 worth of abusive research . . . or just a really stupid waste of time?”

  1. I got the same email, but it began

    "Dear Dr. Hoffman, I am writing you because I am a prospective Ph.D. student with considerable interest in your research"

    Maybe getting people's names wrong was a randomized treatment assignment.

  2. I completely disagree with you. (I'm not familiar with human subjects protocols at all, this is just my personal opinion.) How long does it take to read an email? How many other emails do you get that actually have no redeeming value? Do you charge the senders of those? I don't see how a study like this could have been conducted under the conditions you state. And it provides valuable information–you are not considering the time wasted by many of us who carefully craft emails to professors and get no response. After many years (both as a student and after receiving my PhD, when several of these carefully crafted emails were about significant research questions and required me to spend time figuring out who would be best equipped to answer) I have concluded that the best rule of thumb is that research professors do not respond to email from people they don't know (which may include their former students in classes they taught). It would have been worthwhile to me to have learned this earlier (and of course, information about how to make a response more likely would have been even more useful).

    Not to sound overly negative–a few years ago I was pleasantly surprised by the prompt, substantial and helpful response to my unsolicited inquiry from David Kreps, who did not know me at all.

  3. Peter: Indeed, maybe so.

    Ruchira: I'm not objecting to the study, I'm objecting to them doing it without compensating the participants such as myself who had no choice of whether to be in the study. I respond all the time to email from people I don't know, and I don't like being hassled with something fake.

  4. I'd contact the IRB at the respective departments to complain/ask for their rationale in allowing the study. I'm pretty surprised that a study that purposefully deceived, costing the involuntary study participants valuable time, would pass without some sort of compensation component. In econo-speak, the study generates a clear negative externality unlikely to be fully offset by any positive externalities coming from the study's results. It'd be one thing if you had voluntarily assented to be a study participant and then were subjected to deception of some sort…

    Anyway, if complaints aren't made we'll likely only see more such studies coming out of these places (it sure is a cheap and easy way to collect data…).

  5. In addition to criticisms well made:

    I suspect this study will undermine support for genuine enquiries from potential students. Even though I wasn't a participant in this study, I will now be more sceptical about any such enquiries in the future. Honest people will lose out.

    So, what else is new?

  6. I'm also interested in whether they bothered to cancel the meeting in advance. I tend to be extremely open to meeting students and wouldn't be at all pleased if I shifted plans around only to have a "no show" occur.

  7. I participated in this study as well. I promptly contacted my university's IRB to ask for their opinion on the matter. I was informed that the researchers should have contacted our IRB for permission to use our faculty as research subjects.

    I was in the "now" condition. As a junior researcher inside a new PhD program, I felt quite flattered at the inquiry, and was looking forward to meeting this enthusiastic student who sought me out.

    Maybe older, senior researchers at better-known universities get these kinds of requests all the time. But for me, it felt like being asked on a date and later being told it was all a joke. The fact that this project came out of Ivy League schools adds insult to injury.

    I'll take the $10 for my troubles. But what I really want is a formal apology from both institutions, and an acknowledgment that this project should never, ever have gotten IRB approval.

  8. Everyone is so precious about this. The study required reading an email, then possibly checking your diary and responding with 1 or 2 sentences. It would have taken a similar amount of time to read and respond to an email asking if you wanted to participate. The fact everyone here wrote, read or commented on this blog post suggests that a few minutes here and there don't cost anyone much. As for compensation, when was the last time anyone got paid to fill out a short survey? Besides, it was your work time, not your personal time, so if you do get the $10 check, I hope you hand it on to your employer. I think it is more the feeling of being tricked that you don't like.

  9. You should have asked them whether they would compensate you next week. They would presumably have been more receptive if they were being asked to compensate participants later rather than now.

  10. David:

    The issue isn't the deception, it's that the participation was involuntary. The researchers can feel free to offer me $10 to participate in a study (or, if you'd prefer, to offer $10 to my employer), and I can decide myself whether it's worth my time. You might feel that "a few minutes here and there don't cost anyone much," but many of us are getting overwhelmed by email, and still try to respond to every student's query. It's not so easy to do that when we're also getting messages from fake students!

    I admit that my reaction is over the top. I just feel that this experiment is a bit of a betrayal of trust. And I'm already on record as opposing robocalls. In my opinion, wasting a lot of people's time, even only a few minutes for each person, is unethical unless you compensate them. And, more than that, it betrays decades of trust built up by pollsters and academic researchers.

  11. Because if they'd said up front that they were trying to find out whether professors are less willing to give their time to women and minorities, everybody would have just given an honest and valid self-report, right?
    Look, I was basically with you until I registered the part about them studying race and gender. One of the criteria for justifying deception in research is that it's not feasible to do the research another way — and in this case, it's virtually impossible to study topics like implicit prejudice if you tell subjects what you're doing. And as in all social science research, the cost to participants and use of deception has to be balanced against the potential value of the work. If this study revealed that professors are (perhaps unconsciously) less likely to help members of underrepresented groups who want to enter academia, and if it helped us understand why (and perhaps under what conditions that's more likely to occur), I think there's a reasonable case to be made.
    Also, I don't know that I'd describe this as a "survey" in the conventional sense. It's a randomized experiment where the dependent variable is a behavior (whether & how subjects respond to an email request).

  12. Sanjay: It is, I believe, standard to compensate experimental subjects. I think this is even more necessary given that nobody asked us to participate. If the researchers were to pay us $10 each, I think it would be much more reasonable. I am not bothered by deception in psychology experiments, from Milgram on down. But it would be easy enough for them to first ask for our participation without telling us that they're studying race and gender, or for them to do the study and then pay us.

  13. Ah, I started composing my reply before I saw your followup comments. I think I took your apparently sarcastic (?) remark about a "valuable increase in scientific knowledge" and the bit about "the trash heap of ill-advised research projects" to be a broader comment on the merits of the research. But I definitely see your point about compensation.

  14. What if the researchers had asked for your uncompensated cooperation in advance, for a study that promised to take very little of your time and effort, details to be provided only after you consented (and then unwittingly responded to the fake email)?

  15. There is a good discussion of the ethics of similar research here: Riach, Peter A. and Rich, Judith. "Deceptive Field Experiments of Discrimination: Are They Ethical?" Kyklos, 2004, 57(3), pp. 457-70.

  16. I think part of what's objectionable, though, is the thing that's being asked for – an appointment. (Frankly, no one who agreed to this would expect the meeting to take only the 10 minutes asked for.) Whenever someone asks for a meeting it stands to break up a block of time that could have been used for something substantial, or run over into some other engagement, or place constraints on another meeting that could otherwise have been allowed to run over if things got interesting – so working out whether it could fit in is tricky, and agreeing to it, if one does, is extremely generous.

    Unfortunately, academics have to deal with a fair number of timewasters in the ordinary way of things – the sort of student who makes an appointment and cancels at the last minute is already an occupational hazard, and one has to make an effort anyway not to allow this to prejudice one unduly against students who come along late in the game. So it's extremely thoughtless to expose 6300 academics to a gratuitous plea for generosity followed by cancellation. I just hope no one decides to try something similar with writers.

  17. Andrew, I don't believe it is standard to compensate research participants for trivial interruptions. (Once I managed to get $100 for being a subject but it cost me over an hour and a piece of my leg.) They apologized for the inconvenience and deception, gave an explanation and a description of their potentially interesting project. To compare that to a robocall is disingenuous, especially as this type of study is likely to remain very rare. Maybe it's a double dummy study design where they wanted to see how easy it is to annoy professors with email.

  18. 1. Andrew is right, generally participants in an experiment are given a) a briefing, and b) some small remuneration.

    2. But the briefing is full of misdirection, and participants are interviewed afterwards to discard those results from participants who saw through the misdirection.

    3. I am going to bet that the researchers didn't have a $6,300 let alone $63,000 budget.

  19. As a student interested in making appointments with professors, I am afraid Professor Gelman and others will be more reluctant to agree to appointments.
    Also, I was wondering whether they aren't monitoring the reactions of professors to this new e-mail as a post-treatment variable!

    If you think about it, you should see how abusive it is to engage anyone in an experiment without prior consent. Could this conversation be part of the study? Wouldn't you feel abused if it were?

    regards,
    Manoel

  20. I'm curious. Those who have complained about the way the study was performed: what would be your reaction if the researchers discovered something truly noteworthy: that only 10% of the supposed students with "black" names got appointments but 50% of those with "white" names (Matthew, perhaps?) did? Or what if the acceptance rate were the same for all groups, and academics really are paragons of equality?

    If I'd been duped into this study, my personal involvement would mean I'd rather find out the answer to those questions than get a cheque for $10.

  21. Alex:

    Good point. Here's my solution. Milkman and Akinola can send me and the other participants $10 each. (If they don't have the cash on hand, they could take out a loan–as business school professors, I'm sure they're good for it.) Then, after they send us the study results, if we think the findings are interesting, we can each individually decide whether to send the $10 back to them.

    They could make it even easier than that by setting up a tip jar on their project's webpage. That way anybody anywhere could donate to support their research.

    They could even get NSF funding, perhaps using this sort of phrasing:

    We propose to make use of the vast untapped resource which is college professors' free time, time spent online which would otherwise be used in socially unproductive activities such as blogging or responding to email from actual students.

  22. You wrote:

    "My understanding is that standard ethical guidelines require that research subjects (which, in this case, include me) be compensated for their time."

    Which standard ethical guidelines?

    I found this in the APA guidelines on deception:

    "(a) Psychologists do not conduct a study involving deception unless they have determined that the use of deceptive techniques is justified by the study’s significant prospective scientific, educational, or applied value and that effective nondeceptive alternative procedures are not feasible.

    "(b) Psychologists do not deceive prospective participants about research that is reasonably expected to cause physical pain or severe emotional distress.

    "(c) Psychologists explain any deception that is an integral feature of the design and conduct of an experiment to participants as early as is feasible, preferably at the conclusion of their participation, but no later than at the conclusion of the data collection, and permit participants to withdraw their data. (See also Standard 8.08, Debriefing.)"

    There was nothing in there on compulsory compensation.

  23. Andy: As I noted in an earlier comment, I am not bothered by deception in psychology experiments, from Milgram on down. What bothers me is that we were involuntary participants in the study. The researchers took advantage of our time and our good nature (that we were willing to meet with an unfamiliar student). Not cool.

  24. This study is a copy of Bertrand and Mullainathan (AER '04) in which they sent out resumes with black-sounding names and white-sounding names to prospective employers. They set up answering machines to count callbacks.

    I don't remember any ruckus at all about the ethics of wasting employers' time.

  25. Blame the REBs – Columbia University Morningside Institutional Review Board and University of Pennsylvania Institutional Review Board

    It was their responsibility to balance the interests of "science", the researchers (always with an understandably biased view of the value of their research), "society" and the participants

    I believe with a prejudice to protect the interests of the participants over the researchers?

    I have been on REBs and deliberated similar concerns

    K?

  26. I work in psychology and I think your response confuses the issues of payment, consent and deception. To work, this study had to start with an element of deception, as Alex Cook pointed out. And it seems you don't have a problem with that. But once the researchers debriefed you, I think they should have offered a retrospective consent option, e.g. – do you now agree that your response can contribute to our dataset? Or do you think we are so unethical that you want to withdraw your data?

    That way, you would not be an involuntary participant because you would have the chance to withdraw from the study retrospectively. In our work, we always have to give the option of retrospective withdrawal. And obviously in this case, withdrawal would be a way to punish the researchers for wasting your time by reducing the size of their dataset. For a study that only takes 10 mins, money doesn't really come into it.

  27. While I agree with your sentiment, I'd put a far lower–but non-trivial–price tag on this research. I think these researchers have stolen, conservatively, about $2,500 from the employers of their subjects and perhaps a similar sum from the subjects themselves.

    Let's guesstimate that it takes a minute, on average, for subjects to respond to the initial e-mail (many people will read and delete the e-mail within seconds while others will conscientiously respond at length). At sixty minutes in an hour, eight working hours in a day, 22 working days in a month, and nine months of compensated time in a year, that's 1/95,040 of the working year. With average annual pay of $75k, that minute is worth about 79 cents. 79 cents times 6,300 is about $4,970, or ballpark $5,000. (A quick check of this arithmetic appears at the end of this comment.)

    Now let's assume that the minute is equally split between time that would have been spent on other productive activities for the subjects' employers and time that the subjects would have spent goofing off or sleeping. That suggests that the employers are collectively out $2,500 and so are the subjects.

    You could play around with my estimates, but I've tried to be conservative. The upshot, however, is that very quickly you start talking about amounts that could cover an RA for a semester. Is the knowledge gained worth it? And what distributive issues are raised?

    Since the researchers took the subjects' time without asking, I think they're guilty of stealing something of this magnitude. People often get put in jail for less.

    I do recognize that I'm being a hypocrite and just stole about $8 (from myself, not my employer) by commenting, but I got consent from myself first. I'd write a clearer response, but that would take more time than the tail end of my lunch break, and then I'd have to ask Andrew to compensate my employer.
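
    A quick check of the arithmetic above, as a minimal sketch in Python. The one-minute-per-subject estimate and the $75k average salary are the commenter's own assumptions; the 6,300 subjects comes from the debriefing email.

        # Reproduce the commenter's back-of-envelope estimate of the time cost.
        minutes_per_year = 60 * 8 * 22 * 9             # 95,040 compensated working minutes per year
        value_per_minute = 75_000 / minutes_per_year   # roughly $0.79 per minute at $75k per year
        total_cost = value_per_minute * 6_300          # roughly $4,970 across all 6,300 subjects
        print(f"per minute: ${value_per_minute:.2f}; total: ${total_cost:,.0f}")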

  28. I am posting the abstract from a paper published recently in the American Sociological Review (2009, Vol. 74, October: 777–799) by researchers from Harvard and Princeton. The study was considerably more onerous than this one, sending testers with fictitious resumes to real employers for real interviews. This was funded by the NSF and approved by review boards at Harvard and Princeton. A similar earlier study was done by Marianne Bertrand (U of Chicago) and Sendhil Mullainathan (MIT) and published in the American Economic Review ("Are Emily and Greg More Employable Than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination," Vol. 94, No. 4 (Sep., 2004), pp. 991-1013).

    These studies were published in the very best journals in sociology and economics by well-respected researchers at the top universities. I am stunned by the venom of some of these postings. It is impossible to ask for advance permission and still conduct the study. But if we want a race-blind society, some such studies seem clearly needed. I look forward to seeing the results of this study by Professors Akinola and Milkman. They are studying an important problem and I only hope we as professors do better than potential employers (and landlords in other studies).

    Discrimination in a Low-Wage Labor Market: A Field Experiment
    Devah Pager Princeton University
    Bruce Western Harvard University
    Bart Bonikowski Princeton University

    Decades of racial progress have led some researchers and policymakers to doubt that discrimination remains an important cause of economic inequality. To study contemporary discrimination, we conducted a field experiment in the low-wage labor market of New York City, recruiting white, black, and Latino job applicants who were matched on demographic characteristics and interpersonal skills. These applicants were given equivalent résumés and sent to apply in tandem for hundreds of entry-level jobs. Our results show that black applicants were half as likely as equally qualified whites to receive a callback or job offer. In fact, black and Latino applicants with clean backgrounds fared no better than white applicants just released from prison. Additional qualitative evidence from our applicants’ experiences further illustrates the multiple points at which employment trajectories can be deflected by various forms of racial bias. These results point to the subtle yet systematic forms of discrimination that continue to shape employment opportunities for low-wage workers.

  29. Anonymous:

    In this case, I don't think it would work to give participants a serious option to withdraw our data. This could bias results if, for example, people who didn't respond to fake emails from ethnic minorities are more likely to withdraw their data, compared to people who did respond.

    It would be much better, I think, to simply ask for participation ahead of time, for example by sending an email several weeks in advance saying that they are b-school professors doing a research project and that they will contact me in a few weeks by email. In the initial contact, they don't give info on the topic of the study. Then, if I agree to participate, the rest of the study goes as planned, and I'd have no idea that the purported student's email is fake.

    CC:

    I think it would've been appropriate for Pager, Western, and Bonikowski to have compensated the employers in their study. (Maybe they actually did so.) I am not claiming it is a legal requirement for them to have done so, but I think it is only right to compensate the employers for wasting their time with fake job applicants.

    On to the Milkman and Akinola study: If it is as important as you say it is, then, as I wrote above, $63,000 is surely a small price to pay for this valuable increase in scientific knowledge?

  30. I also was deceived into participating. Let me note that it wasn't necessarily as simple as taking 2 minutes to check a calendar and compose a reply.

    I knew it would take longer than 10 minutes to really talk to such a student and worked to make time which involved moving around my schedule and coordinating with others. Then I spent time consulting with an administrator to see if we had information about this prospective student.

    After I replied to the message with a possible meeting time and received the cancellation message I replied to that message with a kind and informational follow-up message. Why? Because that's what we do in my department. We take pride in how we treat our students from the point of interest and initial contact onward. I also set my schedule back to the initial (and preferable) plan for that day. It was not an insignificant investment of my time. It involved other people as well.

    When I received the message that I was a participant in this deception study I felt like a total fool for having gone to that much trouble to try to accommodate a prospective student who would be on campus for one day only. Yes, I'm the stooge who bends over backwards to help a graduate student when apparently the expected response was to barely consider the request and dash off a hasty yes or no, followed by brushing off the cancellation and forgetting that prospective student ever existed. My only consolation here is in knowing that any of my colleagues would have done the same to accommodate the student.

    Further, I was annoyed that I was not given the opportunity to consent but was merely informed of my participation and what had happened to my data.

    I still can't believe this got through an IRB. I'm confident it would not get through the IRB at my institution.

  31. Frank Grupt: I think you seriously underestimate the cost to those who decided to respond positively to the email.

    Even a single planned 10-minute interruption converts a day from a potentially continuous block of time where you can concentrate carefully into two half-sized blocks of time in which you potentially cannot do anything like the same amount of work. For more on this concept see:

    http://www.paulgraham.com/makersschedule.html

    Scheduling even one appointment in a day can convert that day into a different type of work, potentially delaying that project significantly.

    To be conservative, I would say anyone who made a positive response and adjusted their schedule might have lost the equivalent of a whole day of work. If they are a maker + manager, like most professors are, then to be conservative again, let's say that day was a day of "making" and that such a "making" day only comes once per week, delaying their project by up to a week. If they're working on a $300,000 grant over 3 years, a week of delay is about $1,923. (A quick check of this figure follows.)
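
    A quick check of the figure above, as a minimal sketch in Python. The $300,000 grant, the three-year term, and the one-week delay are the commenter's assumed numbers, prorating the grant evenly over its term.

        # Cost of a one-week delay, prorating the grant evenly over its term.
        grant_total = 300_000              # commenter's assumed grant size
        weeks_in_term = 3 * 52             # 156 weeks in a three-year grant
        cost_of_one_week = grant_total / weeks_in_term
        print(f"one week of delay: ${cost_of_one_week:,.0f}")   # prints roughly $1,923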

  32. Milkman and Akinola did not state an interest in testing for racial discrimination as their primary goal. They were interested in a much more vague and nonsensical distinction between "want to" and "have to" decision making. Among the many serious flaws in their protocol is their conflation of this question with a discrimination question. Those defending this research here are being disingenuous. This is not serious social research. It's marketing research. And it's both careless and so poorly designed as to render any of its findings utterly useless.

    They sent these emails in early May. No busy professor is going to be able to meet a prospective student on short notice in early May. Very few prospects even contact departments or faculty at this time of year. My inbox is overwhelmed with pressing inquiries all the time, but early May is the absolute worst.

    They also did not control for some key variables. In my department, they sent deceptive emails to two faculty members I am aware of. NEITHER ONE was the appropriate contact person for PhD admissions, and we have a policy of routing PhD admission inquiries through specific faculty members first. One was ON LEAVE and had his email set to auto-reply. How did they count that? One was a minority member himself. How did they code their findings for this? Did they control for the racial identity of the person contacted, or her/his gender? If not, their findings are also spurious, and it is telling that so many commenters on this blog seem to assume all professors are white. What planet do y'all live on?

    I'm going after these people with both Penn's and Columbia's IRBs, and with the provosts of both schools and the deans of both business schools. This is unethical research, and it's also badly designed research. It should never have passed IRB review. But beyond that, it's evidence that "research" in Marketing is a parodic version of serious social or psychological science.

    And lest you wonder what a musicologist would know about any of this, I'm a PhD in anthropology, a PI on dozens of approved IRB protocols, and my own work is funded by the NSF.

    Aaron Fox
    Assoc. Prof. and Chair
    Department of Music
    Columbia University

  33. In economics, some professors were/are quite upset that the Bertrand and Mullainathan study was approved by the IRB…

  34. That's an interesting component I hadn't thought about. I was simply trying to emphasize that blithe assumptions about small amounts of time can add up to serious consequences.

    Back in grad school, I often ate in the cafeteria of the university's med school. The cafeteria served made-to-order sandwiches, which, while yummy, took a little time to prepare. Often, you'd see a line 50 people deep, many of whom were highly paid MDs. I can't remember my calculations now, but even with very conservative assumptions about the value of time, how much people might be paid, and the extent to which the med school would benefit from reducing waiting time, it was completely clear that the benefits to the med school of adding more sandwich preparers would far outweigh the cost.

    While I hope that simple e-mails don't derail major research projects very often, I do think that both researchers and IRBs should pay attention to the extent to which many small disruptions can add up to non-trivial impacts.

  35. I too received the email, and wrote the following to the PIs and the IRB email address provided (Pennsylvania's email was not provided). I don't give a fig about compensation; it wouldn't compensate for having my trust violated:

    Professors Milkman and Akinola,

    This study violates the basic principles of informed consent. I can't imagine how it got through your IRB. I was pleased to visit with the prospective (presumably male) doctoral student "later" (the following Monday) (the only ones I deflect are those interested in projects outside my expertise, when I direct them to the appropriate faculty member, or if I will not be available at the time they are on campus), but now I feel duped and used. I'm surprised how visceral my reaction is.

    I don't know how you would design a valid experiment that doesn't involve duping subjects, and I know your study doesn't put me at risk in any way, but nonetheless I feel my trust and good will have been abused.

    Perhaps I'm especially sensitive because I, and my students and colleagues, work with human subjects all the time in our [ethnographic] research, and one of our fundamental canons is that the people we work with must know that we are engaged in research, and must have some understanding of the nature of that research. A good number of years ago, I was on the Board of a Migrant Headstart program in which a linguistics student obtained a job without stating that he was involved in research that involved observing fellow teachers' and students' language usage. This enraged the teachers and put our program in legal jeopardy since he had no consent from the children's parents.

    I'm copying your IRB because I think this is an issue they should carefully consider. I'm glad to see that their guidelines are not excessively rigid, applying medical standards to social research as so many IRBs do. But they may be too loose, or not provide adequate guidance.

  36. I have spent over a decade serving on my institution's IRB. In my opinion, where this study goes awry is that it did not provide an option to allow participants to withdraw their data once they had been debriefed. That is a critical component of any study that involves deception, and it is normally mandated whenever it is feasible; in this case, because the data were identifiable, it would in fact have been possible to arrange the subsequent withdrawal of data.

    I urge those participants who are angered by the study to send a complaint to the Office for Human Research Protections (in addition to the individual schools involved).

  37. Easy for you to say. But a colleague of mine was fooled by the e-mail. She told me that the next time she got an inquiry from someone, she would pause and wonder whether it was a prank.

  38. We conducted a study with a very similar research design on statistical discrimination in the apartment rental market. I understand why the authors of the study mentioned thought they had to do what they did, but their participants are a very unusual set. For example, if we had asked for participation in our landlord study, we could not have answered the questions we posed. We spent a lot of energy convincing the IRB that the study did little harm to the landlords. The key parts of the research design were:

    – landlords post public ads that request contact
    – email asked a simple Y/N question in a few sentences
    – went to great lengths to email a landlord only once (out of 20,000 emails, we may have sent duplicates to about 15 landlords)
    – the system automatically responded back to responses (within an hour) something like: "Thanks for your response, but I just found an apartment." This ensured that landlords didn't keep a place for us.

    For the study mentioned in the post, it is important for the authors and IRB to remember that professors are not soliciting such emails. Thus, they likely receive few of them and assume all are legit when they do get them.

    Our research may have some important things to say about statistical discrimination, which could not have been feasibly answered with the standard audit studies.
