Accused of Cheating by an Algorithm, and a Professor She Had Never Met


Dr. Orridge did not respond to requests for comment for this article. A spokeswoman from Broward College said she could not discuss the case because of student privacy laws. In an email, she said faculty “exercise their best judgment” about what they see in Honorlock reports. She said a first warning for dishonesty would appear on a student’s record but not have more serious consequences, such as preventing the student from graduating or transferring credits to another institution.

Honorlock hasn’t previously disclosed exactly how its artificial intelligence works, but a company spokeswoman revealed that the company performs face detection using Rekognition, an image analysis tool that Amazon started selling in 2016. The Rekognition software looks for facial landmarks — nose, eyes, eyebrows, mouth — and returns a confidence score that what is onscreen is a face. It can also infer the emotional state, gender and angle of the face.
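The capabilities described here line up with Rekognition's public DetectFaces API. Below is a minimal sketch of what such a call looks like in Python with boto3, Amazon's official SDK; the frame file name and region are placeholders, and this is an illustration of the Amazon API, not Honorlock's actual code.

```python
import boto3

# Illustration of Amazon Rekognition's public DetectFaces API.
# The region and file name are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("webcam_frame.jpg", "rb") as f:
    frame = f.read()

response = client.detect_faces(
    Image={"Bytes": frame},
    Attributes=["ALL"],  # also return emotions, gender and pose, not just the defaults
)

for face in response["FaceDetails"]:
    print(face["Confidence"])                        # confidence (0-100) that this is a face
    print([lm["Type"] for lm in face["Landmarks"]])  # facial landmarks: eyeLeft, nose, mouthLeft, ...
    print(face["Emotions"])                          # inferred emotional states, each with a confidence
    print(face["Gender"])                            # inferred gender, with a confidence
    print(face["Pose"])                              # angle of the face: roll, yaw, pitch
```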

Honorlock will flag a test taker as suspicious if it detects multiple faces in the room, or if the test taker’s face disappears, which could happen when people cover their face with their hands in frustration, said Brandon Smith, Honorlock’s president and chief operating officer.
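As a rough sketch of the flagging rule Mr. Smith describes, a per-frame check over Rekognition's output might look like the function below. This is an assumption about the shape of the logic, not Honorlock's actual implementation.

```python
# Hypothetical flagging rule based on the description above:
# flag extra faces in the room, or a face that has disappeared.
def flag_frame(face_details: list) -> str | None:
    """Return a flag reason for one webcam frame, or None if it looks normal."""
    if len(face_details) > 1:
        return "multiple faces detected in the room"
    if len(face_details) == 0:
        return "test taker's face not detected (e.g. covered by hands)"
    return None

# Usage with the DetectFaces response sketched earlier:
# reason = flag_frame(response["FaceDetails"])
```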

Honorlock does sometimes use human employees to monitor test takers; “live proctors” will pop in by chat if there is a high number of flags on an exam to find out what is going on. Recently, these proctors discovered that Rekognition was mistakenly registering faces in photos or posters as additional people in the room.

When something like that happens, Honorlock tells Amazon’s engineers. “They take our real data and use it to improve their A.I.,” Mr. Smith said.

Rekognition was supposed to be a step up from what Honorlock had been using. A previous face detection tool from Google was worse at detecting the faces of people with a range of skin tones, Mr. Smith said.

But Rekognition has also been accused of bias. In a series of studies, Joy Buolamwini, a computer researcher and executive director of the Algorithmic Justice League, found that gender classification software, including Rekognition, worked least well on darker-skinned females.
