Should we put the eye into AI?
Credit: Rolf van Root, Unsplash


September 20, 2023 Renee Lunder

OK, I’ll admit it: as a journalist and copywriter, I use the artificial intelligence (AI) language model ChatGPT. Before your opinion of me sours, let me try to redeem myself by explaining how and why I use it.

 

Primarily, I use AI when I need a summary of a topic I know nothing about – a scenario I frequently encounter as a journalist. Rather than doing a Google search and trawling through pages and pages of information, ChatGPT spits out an overview for me in less than 10 seconds. When I subsequently do my due diligence on what it says, I usually find the AI is pretty accurate. I also use it when I get stuck and need an analogy, or another way of writing a sentence to avoid repetition. It’s handy in my personal life too. Can’t think of what to make for dinner? Input your fridge contents and ask it for some options!

 

Since I’m fairly familiar with how ChatGPT outputs information, you can imagine my surprise when I recently interviewed an ‘expert’ via email and found the returned answers looked suspiciously AI-like. To be sure, I opened a new ChatGPT session, threw in my interview questions and up came all the interviewee’s answers, almost word for word! I ‘interviewed’ this person because they were meant to be a specialist in their field, so I wanted their personal views and insights on the topic, not ChatGPT’s. Granted, they could have had a busy week, but their use of AI rendered much of the interview unusable.

 

This got me thinking about AI in this context, so I sought some more knowledgeable opinions on using it to create social media posts and content, as well as on its potential clinical applications. These are their opinions… and yes, I checked: none of them used AI for their answers!

 

Generating social media and website content

 

Irina Yakubin, a US-based primary-care and low-vision optometrist with a background in writing and social media management for the eyecare industry, points out AI isn’t new – many of us have used spellcheckers, grammar-checkers and even those handy suggested text lines when writing an email. But she sees the new generation of AI-driven tools, such as ChatGPT, as a step up. “AI can save time and give you ideas for what to create. You can use it to create a framework for a blog, spruce up content, or even generate text ideas for a post.”

 

Irina Yakubin

 

 

Sam Petrusma, co-founder of Australian marketing and creative company Advantage Agency, says consistency is key in marketing. “For those with limited time, it can be a real challenge to put out regular quality content.” This is where AI can be of most benefit, he says. “It automates the collation of essentially the entire internet’s knowledge up to 2021 at least. This provides a tangible time- and cost-saving benefit. The tool can be guided to be more conversational, funny or professional, depending on the tone of your brand.”

 

AI tools’ greatest advantage is their wordsmith capabilities, says Robert Springer, director of OptomEDGE, an Australian marketing agency for the optical industry. “Even suppliers have a hard time generating quality optical marketing materials. If you want to introduce a new specialty on your website and social media, it can be a challenge to find the words for a technical concept. AI tools can help you generate examples which you can adapt into your own words.”

 

Words to the wise

 

All agree these advantages come with a few caveats, the first being in relation to the data itself. Campbell Wiltshire, business development and marketing manager at Auckland’s Independent Optometry Group, says, “You need to keep in mind that the responses generated by the system are based on patterns and examples from the training data which is fed in. This means they may not always be accurate or appropriate for your context.” Most of that written data is generated by humans, often with their own inherent biases, he says.

 

Campbell Wiltshire

 

 

“With the current generation of AI tools, there is no way to ensure that the information generated is correct,” says Petrusma. “ChatGPT’s data only goes as far as 2021, so more recent information is not available. Sometimes it may simply not be able to get something correct if it’s particularly technical, or maybe it’s information that hasn’t reached scientific consensus and therefore the tool cannot determine which side is accurate. This could be a significant disadvantage (when) the latest information is crucial for your patients.”

 

AI-driven software can also conflate information and names with potentially damaging results. According to a story in The Washington Post, when a US researcher asked ChatGPT to generate a list of academic professionals involved in sexual misconduct, it wrongly named a real law professor as the culprit in a sexual harassment scandal. It is also susceptible to bias. The same Post article said a team of researchers at the Technical University of Munich and the University of Hamburg posted a preprint of an academic paper which concluded ChatGPT has a “pro-environmental, left-libertarian orientation.”

 

Sam Petrusma

 

 

So, as Wiltshire says, it’s extremely important to thoroughly review the generated content and make any necessary edits to ensure it aligns with your brand’s messaging and tone. Petrusma agrees, saying these tools, for now at least, need guidance. “(While) it can help with the heavy lifting, it is not a replacement for a person and their intimate industry knowledge.” Wiltshire recommends using AI only to supplement your content-creation efforts and ensure your social media presence remains authentic and engaging for your audience.

 

AI-led systems are, however, potential gamechangers when it comes to generating patient emails, sending reminders and even integrating with online optical and digital store interfaces, says Yakubin. “It would also be really cool to see AI integrated into apps to help increase patient compliance with medications and treatment regimens.”

 

Overall, AI is not a path to success in itself, says Springer. “It’s an enabler which will allow practice owners to improve their digital marketing footprint and online patient experience, if used correctly.”

 

Retaining the human touch

 

For independent optometrists, the ability to showcase your point of difference is a key tool in your marketing toolkit, so if AI is the only one you’re relying on, you’ll miss the opportunity to be authentic, says Springer. “There is no way for AI tools to capture spontaneous moments of your practice in action or convey the specific personal approach that you provide.” AI can produce content that doesn’t emotionally engage with your audience, agrees Petrusma, primarily because it’s based on pre-existing data and patterns, rather than human thought and feeling.

 

Robert Springer

 

 

“You can imagine there will be a wave of early adopters running more digital advertising and AI-generated promotions, therefore the level of marketplace activity will likely drive up the cost of advertising,” notes Springer, adding he worries every practice will suddenly move to rely on an AI-based marketing approach. “There will be declining points-of-differentiation between independent practices because the marketing materials may end up looking largely the same.”

 

Yakubin raises the point that as regulations around AI evolve, using AI-generated content might also impact your business’s online searchability. “It’s no surprise that Google and other tech giants are unhappy at possibly being displaced and they may choose to limit AI-generated content on their platforms. It’s possible search engines and social media sites will identify AI-generated content and penalise its users in the future.”

 

An ethical dilemma

 

When it comes to the ethics of using AI, Springer says it’s “important to be mindful that a key aspect of a healthcare professional’s credibility is being a trusted source of information and industry expertise. If you outsource your website content and social media posts to an AI app, it bypasses the authenticity of the healthcare practice.”

 

Given he works solely in the marketing and advertising industry, Petrusma’s view is somewhat surprising. “Initially, I was unsure as to whether I was on board, but after trialling these tools and realising that they aren’t quite a replacement for real jobs – yet – I just see them as another tool in our design and marketing suite.” He says using AI is an opportunity for small businesses to churn out quality content in a way which levels the playing field against large businesses, provided you or your staff are trained in how to use it. “While the total impact is yet to be seen, I think AI tools are here to stay and, as we have seen countless times in human history, we will adapt. Protecting people’s jobs is important to me and I believe it will take government support or intervention at some level to ensure these tools are adopted in a way that is collectively beneficial.”

 

Using AI in a clinical setting

 

Having done a fair amount of work around AI’s potential clinical applications, Yakubin says it could help create patient-education modules and AI virtual scribes, or improve training opportunities to ensure optometrists receive plenty of practice before diving into in-person patient care. “AI can even be used to help analyse data and render diagnosis and treatment suggestions. Research has shown, for instance, that AI can be ‘trained’ to recognise diabetic retinopathy,” she says. ChatGPT is also well suited to reviewing and extracting key content from cancer patients’ records, interpreting next-generation sequencing reports and offering a list of potential clinical trial options, reported Assistant Professor Dipresh Upreti and his team of oncologists and computer scientists in a 2023 paper published in Cancer.

 

The Australian Alliance for AI in Healthcare comprises more than 100 national and international partners and stakeholders in academia, government, consumer, clinical and industry organisations. The group’s stated aims are to accelerate the adoption of AI-enabled healthcare in Australia, while exploring “the unintended consequences of these technologies that we are truly unprepared for.” The technology has already been banned in several Australian hospitals after a doctor used ChatGPT to generate a patient discharge summary. Perth's South Metropolitan Health Service (SMHS) forbade its use – but out of concern for security, not the software’s accuracy, said SMHS chief executive Paul Forden. “At this stage, there is no assurance of patient confidentiality when using AI bot technology, such as ChatGPT, nor do we fully understand the security risks.”

 

Yakubin agrees, reminding us the World Health Organization has warned healthcare providers to be cautious in adopting AI-driven models. “As a data-driven tool, AI can be manipulated or even corrupted by ‘bad’ data. For instance, AI may generate an inaccurate diagnosis if it is relying on a dataset that is incomplete or biased toward a certain demographic. There are also concerns about the potential for AI systems to be hacked.”

 

At the end of the day, she feels it’s up to the optometry community to work together to ensure the AI tools we collectively choose to adopt into clinical practice are safe and reliable. “Even then, we shouldn’t expect AI to be infallible. Ultimately, we are the humans of the eyecare world and will still need to be the final decision-makers and fact-checkers when utilising AI in an optometry setting.”

 

Includes additional reporting by Drew Jones

 

 

Renee Lunder is an Australian freelance journalist, proud specs wearer and regular contributor to NZ Optics.