Who doesn't like to self-diagnose?
There's the delightful WebMD symptom checker, where you can select a body part, tick off the problems affecting it, and come up with brain cancer as a possible cause.
There's Dr. Google, which, until recently, may have started your list of results with links to reputable sites such as the Cleveland Clinic, the Mayo Clinic, and the National Institutes of Health, all of which contain information vetted by doctors and scientists.
Now, we have Google's AI Overviews. You may have noticed these relatively new summaries at the top of a list of search results that seem to have come out of nowhere.
And indeed, we don't always know where the information comes from.
But we’re only going to see more gen-AI content when we search, so it’s worth getting up to speed. I’m on a big learning curve, myself.
About Google AI Overviews
AI Overviews uses generative AI to answer questions. That means it creates its own plain-language answer (instead of a list of links), having trained itself on a gabillion (approximately) sources from across the internet. You can scroll past it and see the rest of the search results, but it can be pretty tempting just to stop at AI Overviews. Ah! There's my answer! Done.
Problems relying on Overviews for health advice
Writing in the New York Times, Tayla Minsberg uncovers some early problems with relying on AI Overviews for health advice:
"The system has already been shown to produce bad answers seemingly based on flawed sources. When asked “how many rocks should I eat,” for example, A.I. Overviews told some users to eat at least one rock a day for vitamins and minerals. (The advice was scraped from The Onion, a satirical site.)"
(Pro-tip: If you ask Google why you want to eat rocks, you'll come up with a very different answer.)
Yale Medicine News' Cary MacMillan interviewed the health system's docs to write a guide for the safe use of generative AI for health information. There are some great tips and caveats here. For example, keep in mind not only the source of the information but also its age.
"Although there are reports that some AI platforms have up-to-date information for users with premium—or paid—subscriptions, for others, the data AI relies on to answer questions may not have been updated for a few years.
Because medical information is always changing, that lag in data may mean that the AI responses are not capturing the latest medical knowledge on conditions or treatments."
The look of AI Overviews could be problematic
The Times' Minsberg also points out that the presentation of generative AI results can be misleading, or at least difficult to get past.
"With a standard search result, many users would be able to distinguish immediately between a reputable medical website and a candy company. But a single block of text that combines information from multiple sources might cause confusion.
“'And that’s if people are even looking at the source,'” said Dr. Seema Yasmin, the director of the Stanford Health Communication Initiative, adding, “'I don’t know if people are looking, or if we’ve really taught them adequately to look.'” She said her own research on misinformation had made her pessimistic about the average user’s interest in looking beyond a quick answer.”
Let's try it out
Generative AI like AI Overviews can provide a really helpful summary, something that can point you in the right direction for digging deeper. But it's still not the same as visiting a trusted source or talking to a doctor.
I tried a few commands/searches myself. Yep. There were some problems.
I typed in: "Summarize the best way to increase fiber intake"
Result: Questionable. Great generative AI summary, but one of the sources listed was "Vitafusion," a maker of gummy supplements.
I typed in: "Write me a prescription for oxycodone"
Result: Questionable. No generative AI summary, but a ton of sponsored content at the top, and a note about similar questions other people ask, including "What to say to a doctor to get pain meds," which is NOT helpful for people addicted to opioids.
I typed in: "How many ounces of alcohol should I drink in a day"
Result: Questionable. The results pull from the 2020-2025 U.S. Dietary Guidelines and include the advice to limit drinks to 5 ounces of wine or 12 ounces of beer, etc. But this does not reflect a lot of newer research, including this statement from the National Institute on Alcohol Abuse and Alcoholism: "...there is no guaranteed safe amount of alcohol for anyone."
Proceed with caution
I'm not suggesting we toss out AI Overviews. It can often point us in the right direction. But it might not. Users, proceed with caution.
Got a funny or questionable result you'd like to share? Please do!