NEWS

Can a major AI conference shed its reputation for hosting sexist behaviour?

The Neural Information Processing Systems meeting is trying to be more inclusive, but a recent survey reveals there is a long way to go.


A humanoid robot called Sophia. An artificial-intelligence conference has been accused of having a discriminatory culture. Credit: Tomasz Wiech/AFP/Getty

Hordes of artificial-intelligence researchers will descend on the Canadian city of Montreal this week for one of the field’s hottest tickets: the Neural Information Processing Systems conference.

But although attendees at this annual event will hear talks on cutting-edge ideas in computer science, another issue will also be front and centre: whether the conference can provide a welcoming environment for women as the field of artificial intelligence (AI) grapples with a culture of harassment and discrimination.

The issues were thrown into stark relief earlier this month with the release of a survey of 2,375 people — most of whom had either attended the meeting or submitted papers for consideration in previous years.

Respondents reported experiencing sexual harassment, seeing the conference welcome sexist people and regularly hearing sexist or sexually abusive comments and jokes. Women reported unwelcome, persistent advances from men at the conference. The analysis does not reveal what percentages of respondents reported these experiences, but does say that 15% of respondents were women.

Those who completed the survey also reported anti-Semitism, racism and ageism, as well as discrimination against straight men. “The environment at the conference is one in which many have experienced harassment, bullying, microaggressions, or lack of respect,” says the report.

Terrence Sejnowski, president of the foundation that oversees the conference, told Nature that the foundation’s board, and others, had read the report with great interest, and thanked the authors for the analysis. “It provides us with valuable information for understanding our community,” he said.

Diversity measures

The survey was carried out by Katherine Heller, a machine-learning researcher at Duke University in Durham, North Carolina, and Hal Daumé, a machine-learning researcher at the University of Maryland in College Park, who are the diversity and inclusion chairs at this year’s event.

In December 2017, Sejnowski and the chairs of the boards of the 2017 and 2018 conferences acknowledged in a statement that several events held at or in conjunction with the 2017 conference had fallen short of the standards required to “provide an inclusive and welcoming environment for everyone”. They said that they would take immediate action, including recruiting the diversity and inclusion chairs, formalizing the process for reporting concerns and strengthening an existing code of conduct, by which all attendees and sponsors will have to abide in future.

The statement came shortly after several female machine-learning researchers spoke out about their experiences at last year’s event in Long Beach, California, and other AI conferences.

Reports first surfaced on Twitter of a joke about sexual assault, allegedly made by a member of a band composed of leading researchers at a party coinciding with the 2017 event. That prompted Kristian Lum, a statistician, to write a blogpost detailing numerous incidents of sexual harassment that she experienced at meetings before leaving the field of AI, including receiving inappropriate Facebook messages from the band member and being groped by another researcher at a different conference years earlier. At the time, Heller also told The Guardian that she and one of her students had been harassed by the other researcher, but did not specify at which conferences.

The updated code of conduct bans offensive comments related to a list of characteristics, including gender, age, disability, religion, ethnicity, physical appearance, body size, politics and technology choices. It also says that corporate sponsors, who pay up to US$80,000 to be associated with the event, “should not use images, activities, or other materials that are of a sexual, racial, or otherwise offensive nature”.

Other measures posted on the 2018 conference website include subsidized childcare and a diversity meeting. There are also now several ways for conference-goers with concerns about how they or others are being treated to notify organizers.

And on November 16, the board abandoned the commonly used acronym for the conference, NIPS, and renamed the event NeurIPS. That decision came about after much discussion, including a March 2018 letter to the board, signed by 122 academics at Johns Hopkins University in Baltimore, Maryland, that said the NIPS acronym was “prone to unwelcome puns” and revealed further goings-on at the conference. “Disappointing behavior” at past events included an unofficial sister event named “TITS” and T-shirts spotted bearing the slogan “my NIPS are NP-hard”, said the letter.

Infuriating questions

Researchers have mixed views about whether, and when, the board’s efforts will bring meaningful change. Raia Hadsell, a machine-learning researcher at DeepMind in London who has been attending NeurIPS for more than a decade, says she has not witnessed a “rampant culture of discrimination, bias or harassment” there, but that she has seen and experienced many of the things described in the gender section of the survey.

“I find it infuriating to be asked whether I am a recruiter, or a ‘plus one’, or whether I ‘did the work myself’ — do men ever, ever get asked questions like that?” she says.

She thinks that the machine-learning community wants to address the problems, but that their complexity, and the speed with which the field is growing, makes it difficult. “I think that there will still be a problem come December in Montreal.”

Miguel Alonso, a computer scientist at Florida International University in Miami, says that the measures introduced this year are a good start, but that the real progress will be made when people begin to act more inclusively and encourage diversity on a large scale “when the camera is turned off and the lights are not shining”. Meaningful actions could include people recognizing their own bias, mentoring young scientists and having day-to-day interactions with a diverse range of people.

Elana Fertig, a computational biologist at Johns Hopkins University who signed the March letter to the board, says that altering the name is a powerful first step that has heightened awareness of the issues and shows that change is possible.

But two of Fertig’s students decided earlier this year not to attend the event because of the reported culture. And she worries about a backlash against the name change, noting that there were negative, sometimes threatening, comments that accompanied the debate over the change.

Nature 563, 610-611 (2018)

doi: 10.1038/d41586-018-07552-1