As a busy dentist, you appreciate tools that save you time in your practice. Although ChatGPT and other AI-enhanced tools may seem tempting and helpful, you may want to reconsider.
The use of artificial intelligence and tools like ChatGPT has become the norm in many aspects of our lives, including the medical community. Some doctors have discovered that while talking with patients, they can conceal their phones in their pockets, record the conversation, and then have ChatGPT write up the notes.
Sure, it sounds great. But before you make ChatGPT the newest member of your staff, know that serious risks accompany the technology—ones that could prove to be detrimental in the long run. (Or sooner.)
Caring, Not Sharing
Although the risks are high and the safety concerns are numerous, the appeal of using ChatGPT for notes is easy to see—it takes just a few steps to record the conversation, upload it, and have an automated summary generated.
Unfortunately, that summary isn’t kept private to you alone. Instead, ChatGPT can now access your patients’ protected health information (PHI), not to mention personal information from your browser. This is information you granted access to when you accepted the terms of service.
What you enter into ChatGPT is recorded and stored, which is a major privacy risk and does not adhere to HIPAA privacy regulations. By feeding ChatGPT what it needs to generate a desirable response, you open the books on your patients’ health information.
Faster Is Not Better
Over the years, your practice has evolved to remain HIPAA compliant, and you have most likely made painstaking efforts to ensure patient confidentiality. By using ChatGPT, you swiftly undercut those efforts and open yourself up to violations and lawsuits. Not only are you knowingly placing a patient’s PHI into the cybersphere, but if the patient discovers you have released their information, it could destroy their trust in you as a doctor.
Is Your Information Secure?
ChatGPT has already been subject to breaches and hacks, which can leave your patients’ confidential information exposed for anyone to view. If a patient discovers their information has been leaked, will they hold you or the AI accountable?
In addition to breaches and hacks of the more “popular” AI tools like ChatGPT, a simple online search will turn up hundreds of links to other AI sites. Which of these can be trusted? Which are risky? There’s no clear-cut way to tell until it’s too late.
Accuracy Is Everything
When you read through summaries you generated yourself, you can feel confident that what you’ve recorded is correct. After all, you have first-hand knowledge of your patient. Can you trust ChatGPT to give you a correct summary? ChatGPT is highly subject to error. It is, in effect, eavesdropping on a confidential conversation and then filtering it through countless other conversations of the same nature, with no clinical judgment of its own.
The response you receive from ChatGPT might contain a complete misdiagnosis and cause real harm to the patient. And if you’re not reviewing the notes yourself in a timely manner, your diagnosis of a patient may not be as accurate as you initially thought.
If a doctor or staff member simply cuts and pastes AI output into the EHR, the result can be cookie-cutter records that are meaningless in relation to patient treatment. And as we’ve seen before, copy-and-paste records often don’t hold up in court.
Trust your notes; trust yourself.
What About Accountability?
You hold yourself accountable for everything in your practice, but with AI, there is no transparency or accountability. If there’s a breach or discrepancy in your office, you know whom to ask; with ChatGPT, there is usually nothing more than a “Contact Us” link. You are at the whim of the tool’s creator as to whether they consider your very important issue worth their attention.
Steer Clear for Now
A health care-specific ChatGPT, properly regulated and secured, could prove to be an incredibly beneficial tool, giving doctors quick access to documents and summaries. But until one exists, entering a patient’s PHI into anything outside a trusted, secure record-keeping platform is a risky undertaking.