As he entered his 40s, a survivor of a childhood brain tumor shared a story: He had recently gone in for a checkup with the doctor he had been seeing since before that long-ago diagnosis.
“You know, it’s always great to see you,” the doctor had said with a smile, “but at your age, you should probably get a primary care doctor who isn’t in pediatrics.”
That’s how strong the bond had grown between patient and physician. Sure, other doctors and specialists had been involved in his surgery, recovery and therapy over years that turned into decades, but for the boy in crisis who had become an adult survivor, nothing could match the reassurance of seeing someone who had been there from the start.
A human connection like that looms large over the current and rapidly evolving discussion about the role artificial intelligence will play in the future of health care. If AI is to become another instrument in the doctor's bag, should patients be encouraged or concerned? Benign comparisons to the arrival of calculators in 1970s high school math classes, word processors on 1980s college campuses and smartphones in the hands of everyone in the early 21st century have already been raised — along with dark humor about Arthur C. Clarke books and James Cameron movies.
One thing that can be said with certainty is that the stories in the Spring 2023 issue of Helix were not generated by a chatbot — another AI application that is seen as a boon to customer-service websites and cast as a villain by English Comp instructors far and wide. When you open Helix today and read about the paths being paved to connect RFU to its neighboring communities, you can be sure the process did not start with someone asking an Intelligent Virtual Assistant to whip up a story.
In fact, looking to stay ahead of a developing curve, publications have begun to craft disclosure statements about the use of “generative AI tools” in their stories and images. Earlier this year, in stating that they would not publish stories created by a text generator, the editors of WIRED noted that “current AI tools are prone to both errors and bias, and often produce dull, unoriginal writing,” among other concerns.
In other words, AI currently lacks the bedside manner that connects patients to their health care professionals. It also lacks the institutional knowledge, human insight and heart that you will find — today and tomorrow — in the pages of Helix.
Dan Moran is the communications director with RFU’s Division of Marketing and Brand Management.