Categories
Writing

Some Challenges Facing Physician AI Scribes

Recent reporting from the Associated Press highlights potential challenges in adopting emerging generative AI technologies in the workplace. Their reporting focused on how American health care providers are using OpenAI’s transcription tool, Whisper, to transcribe patients’ conversations with medical staff.

These activities are occurring despite OpenAI’s warnings that Whisper should not be used in high-risk domains.

The article reports that a “machine learning engineer said he initially discovered hallucinations in about half of the over 100 hours of Whisper transcriptions he analyzed. A third developer said he found hallucinations in nearly every one of the 26,000 transcripts he created with Whisper. The problems persist even in well-recorded, short audio samples. A recent study by computer scientists uncovered 187 hallucinations in more than 13,000 clear audio snippets they examined.”

Transcription errors can be very serious. Research by Prof. Koenecke of Cornell University and Prof. Sloane of the University of Virginia found:

… that nearly 40% of the hallucinations were harmful or concerning because the speaker could be misinterpreted or misrepresented.

In an example they uncovered, a speaker said, “He, the boy, was going to, I’m not sure exactly, take the umbrella.”

But the transcription software added: “He took a big piece of a cross, a teeny, small piece … I’m sure he didn’t have a terror knife so he killed a number of people.”

A speaker in another recording described “two other girls and one lady.” Whisper invented extra commentary on race, adding “two other girls and one lady, um, which were Black.”

In a third transcription, Whisper invented a non-existent medication called “hyperactivated antibiotics.”

In some cases, voice data is deleted for privacy reasons, which can prevent physicians (or other medical personnel) from double-checking the accuracy of a transcription. While some errors may be caught easily and quickly, more subtle mistakes are less likely to be noticed.

One area where work still needs to be done is assessing the relative accuracy of AI scribes versus that of physicians. Automated transcription may introduce errors, but what is the error rate of physicians themselves? And what is the difference in quality of care between a physician who self-transcribes during an appointment and one who reviews transcriptions after the interaction? These are central questions that should play a significant role in assessments of when and how these technologies are deployed.
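To make “error rate” concrete, the standard measure in speech-recognition evaluation is word error rate (WER): the word-level edit distance between a reference transcript and the system’s output, divided by the length of the reference. A minimal sketch follows; the example sentences are invented for illustration, not drawn from the AP’s reporting.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Levenshtein distance over words, via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical comparison: what a clinician said vs. what a scribe produced.
reference = "patient reports taking amoxicillin twice daily"
hypothesis = "patient reports taking hyperactivated antibiotics twice daily"
print(round(word_error_rate(reference, hypothesis), 2))  # → 0.33
```

The same metric could be applied to human-typed notes against an audio reference, which is what a fair AI-versus-physician comparison would require. Note that WER treats all errors equally; it would score a harmless filler word the same as an invented medication, so clinical evaluations would also need a severity-weighted review.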

Categories
Links Photography

Medical Photography is Failing Patients With Darker Skin

Georgina Gonzalez, reporting for the Verge:

Most clinical photos are taken by well-intentioned doctors who haven’t been trained in the nuances of photographing patients of different races. There are fundamental differences in the physics of how light interacts with different skin tones that can make documenting conditions on skin of color more difficult, says Chrystye Sisson, associate professor and chair of the photographic science program at Rochester Institute of Technology, the only such program in the nation. 

Interactions between light, objects, and our eyes allow us to perceive color. For instance, a red object absorbs every wavelength of light except red, which it reflects back into our eyes. The more melanin there is in the skin, the more light it absorbs, and the less light it reflects back.

But standard photographic setups don’t account for those differences.

One of the things I routinely experience shooting street photography in a multicultural city is just how poorly camera defaults handle individuals of different racial backgrounds. Despite shooting for many years, I’ve yet to find a single default that captures darker skin accurately.

Categories
Aside Links

2019.7.29

After reading several articles from a special series on ticks in Elemental, I’m now appropriately concerned about how dangerous ticks can be, and about our utter inability to meaningfully address the threat.