Automated Inference on Criminality Using High-tech GIGO Analysis

Yee Whye Teh writes:

You might be interested in this article.

My reply was that this is just a big joke, one more bit of hype over a bunch of correlations. Lots of obvious problems with this paper, and it’s too bad that journalists fell for it. And, as an MIT grad, I’m particularly sad to see Technology Review be among the suckers.

I don’t have the energy to point out all the problems. Conveniently, though, I came across this pretty thorough discussion by Carl Bergstrom and Jevin West.

It’s an interesting example because, for once, we’re not in Type M / Type S error territory. Instead, we have the other big problem in quantitative social science: measurements that are not closely enough tied to the underlying constructs of interest.

tl;dr: No, despite what you might read in Technology Review, it’s not true that a neural network learned to identify criminals by their faces.


  1. Daniel Simpson says:

    Can’t wait until they try this on gay people.

  2. Jonathan (another one) says:

    From the Bergstrom-West piece: “Unfortunately, it appears that unattractive individuals are more likely to be found guilty in jury trials than their more attractive peers.”

    Yes, but they have more sons as well.

Leave a Reply