
admin, to psychology
@admin@mastodon.clinicians-exchange.org

Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
Warning on AI and Data in mental health: ‘Patients are drowning’*
*https://www.digitalhealth.net/2023/10/warning-on-ai-and-data-in-mental-health-patients-are-drowning/

I'm always a bit skeptical of presentations from tech company CEOs on
how their product areas are necessary in the mental health field.

That said, this article has a few good points:

/"Umar Nizamani, CEO, International, at NiceDay, emphasised that AI will
inevitably become an essential tool in mental health care: 'I am very
confident AI will not replace therapists – but therapists using AI will
replace therapists not using AI.'"//
/
I am beginning to think this too -- for better or worse. I took a VERY
fast 60-second look at NiceDay and it appears to be another
all-encompassing EHR, but with a strong emphasis on data. Lots of tools
and questionnaires and attractive graphs for therapists to monitor
symptoms. (I need to take a longer look later.) So the data-driven approach
could be very good, if it does not crowd out the human touch.

/"Nizamani said there had been suicides caused by AI, citing the case of
a person in Belgium who died by suicide after downloading an anxiety
app. The individual was anxious about climate change. The app suggested
'if you did not exist' it would help the planet, said Nizamani."//
/
YIKES... So, yes, his point that care in implementation is needed is
critical. I worry about the speed of the gold rush.

/"He [//Nizamni] //called on the industry to come together to ensure
that mental health systems using AI and data are 'explainable’,
'transparent', and 'accountable'." //
/
This has been my biggest focus so far, coming from an Internet security
background when I was younger.

See: https://nicedaytherapy.com/

/"Arden Tomison, CEO and founder of Thalamos"/ spoke on how his company
automates and streamlines complex bureaucracy and paperwork to both
speed patients getting help and extract the useful data from the forms
for clinicians to use. More at: https://www.thalamos.co.uk/

/"Dr Stefano Goria, co-founder and CTO at Thymia, gave an example of
'frontier AI': 'mental health biomarkers' which are 'driving towards
precision medicine' in mental health. Goria said thymia’s biomarkers
(e.g. how someone sounds, or how they appear in a video) could help
clinicians be aware of symptoms and diagnose conditions that are often
missed."//
/
Now THIS is how I'd like to receive my AI augmentation. Give me
improved diagnostic tools rather than replacing me with chatbots or
overcrowding the therapy process with too much automated data
collection (some is good). I just want this to remain in the hands of
the solo practitioner rather than becoming a performance monitor that
insurance companies impose on us. I want to see empowered clinicians.

Take a look at: https://thymia.ai/#our-products
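
Out of curiosity about what a voice-derived "biomarker" even looks like at
the raw-signal level, here is a toy sketch -- emphatically not Thymia's
actual method, just an illustration using the open-source librosa library
and a hypothetical local recording -- of the kind of simple acoustic
features (pitch, loudness, pauses) such a tool might start from:

```python
# Toy illustration only -- NOT Thymia's pipeline.
# Pulls a few basic acoustic features from a recording: the sort of raw signal
# a "how someone sounds" biomarker could be built on top of.
import numpy as np
import librosa

def basic_voice_features(path: str) -> dict:
    y, sr = librosa.load(path, sr=16000)              # mono audio at 16 kHz
    f0, voiced_flag, _ = librosa.pyin(                # frame-by-frame pitch track
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )
    rms = librosa.feature.rms(y=y)[0]                 # frame-level energy (loudness)
    return {
        "mean_pitch_hz": float(np.nanmean(f0)),       # average pitch of voiced frames
        "pitch_variability": float(np.nanstd(f0)),    # flat/monotone speech -> low value
        "mean_energy": float(rms.mean()),
        "voiced_fraction": float(np.mean(voiced_flag)),  # rough proxy for pausing
    }

# Hypothetical usage:
# print(basic_voice_features("session_recording.wav"))
```

The real products presumably go far beyond this, but it makes the quote
concrete: the inputs are ordinary audio or video, and the outputs are
numbers a clinician could track over time.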


--
Michael Reeder, LCPC
Hygeia Counseling Services : Baltimore / Mt. Washington Village location




@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry

@infosec
#Thalamos
#Thymia
.
.
NYU Information for Practice puts out 400-500 good quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
.
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
.
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:
http://subscribe-article-digests.clinicians-exchange.org
.
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...

admin, to psychology
@admin@mastodon.clinicians-exchange.org

Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
TITLE: Iowa health system warns against using ChatGPT to draft patient
letters

Apparently some people have to be told that using AI services in the
cloud to compose medical letters is a violation of HIPAA.

Now what I would like to see with all the AI-assisted EHR systems
currently being developed (EPIC, Oracle, Amazon, etc.) is not only BAA
contracts in place with the tech companies, but also:

a) Separate AI systems that don't share data with the main AI system.
(So the Hospital AI database would be separate from the general AI
database), or

b) Much better: Separate AI software and databases held internally on
the hospital's own servers, with restricted Internet access to the
outside.

This is wholly feasible, yet somehow I have little confidence it will
actually happen.

For any private practice people out there playing with AI on a small
office scale: I'm not a lawyer, but what I would recommend is a) AI
systems that can be run on a desktop (not in the cloud), and b) cutting
them off from the Internet, or severely restricting where those desktops
can call out to, since you likely don't know what's in the code of the AI
you downloaded!
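
As a concrete sketch of recommendation (a) -- hypothetical and minimal,
not legal or security advice -- the idea is to copy a model's files to a
local folder ahead of time and then force the software into offline mode
so nothing is fetched from (or sent to) the network at generation time.
This assumes the open-source Hugging Face transformers library and a
made-up ./local-model folder:

```python
# Minimal sketch: run a small language model entirely offline on one desktop.
# Assumes the model weights were already downloaded into ./local-model (hypothetical path);
# the offline flags make the transformers library refuse any network access.
import os
os.environ["HF_HUB_OFFLINE"] = "1"        # block Hugging Face Hub lookups
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # library-level offline mode, belt and suspenders

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_DIR = "./local-model"  # hypothetical folder holding the pre-downloaded weights

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, local_files_only=True)

prompt = "Draft a brief, non-identifying appointment reminder letter."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Even then, a firewall rule blocking outbound traffic from that machine is
the stronger control, since you can't be sure what the downloaded model
runtime will try to do on its own.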

*Iowa health system warns against using ChatGPT to draft patient letters*  
<https://www.beckershospitalreview.com/cybersecurity/iowa-health-system-warns-against-using-chatgpt-to-draft-patient-letters.html>

"Iowa City-based University of Iowa Health Care is warning employees
against the use of ChatGPT for its potential to violate HIPAA..."

--

#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt   
#artificialintelligence #psychology #counseling #socialwork   
#psychotherapy #EHR #medicalnotes #progressnotes   
@[email protected] @[email protected]   
@[email protected] @[email protected] @[email protected]   
@[email protected] #mentalhealth #technology #psychiatry #healthcare   
#patientportal  
#HIPAA #dataprotection #infosec @[email protected] #doctors #hospitals   
#BAA #businessassociateagreement

.  
.  
NYU Information for Practice puts out 400-500 good quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
.  
 Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: <https://www.nationalpsychologist.com>  
.  
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:  
<http://subscribe-article-digests.clinicians-exchange.org>  
.  
READ ONLINE: <http://read-the-rss-mega-archive.clinicians-exchange.org>  
It's primitive... but it works... mostly...

admin, to psychology
@admin@mastodon.clinicians-exchange.org

Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open LEMMY instance for all mental health workers: https://lem.clinicians-exchange.org
.

TITLE: Coming to a doc near you

Oracle announces new generative AI services for healthcare organisations*
*https://www.digitalhealth.net/2023/09/oracle-announces-new-generative-ai-services-for-healthcare-organisations/

This AI will follow along and take the session notes for the doctor by
listening to the office visit. It will also bring up charts and records
through voice command and prompt the doctor to do routine things during
the office visit. It's due out early next year.

This could be very helpful.

However, I can imagine a few kinks in the office visit process initially:

Patient: "Doctor, my knee hurts"

AI: "REMEMBER TO MAKE A FOLLOW-UP APPOINTMENT"

Patient: "What was that?!"

Doctor: "Oh pay no attention -- that is just the new AI system everyone
has to consent to for treatment. It will help us during the session."

AI: "HAVE YOU EXAMINED THE KNEE X-RAY YET?"

Doctor: "AI, pull up the knee x-ray"

Patient: "This is my first visit, there is no knee x-ray yet."

AI: "REMEMBER TO SCHEDULE A KNEE X-RAY"

Doctor & Patient Together: "We don't know if we need a knee x-ray yet!"

Patient: "It started hurting yesterday"

Doctor: "Jump up on the table and I'll take a look at it"

AI: "SHALL I SUMMON A NURSE TO WATCH TO GUARD AGAINST ALLEGATIONS OF
IMPROPRIETY?"

Doctor: "NO!"

Doctor: "It does look a bit red. Does this hurt?"

Patient: "A bit when you touch there and I bend it."

AI: "SHALL I SCHEDULE THE KNEE X-RAY NOW?"

Doctor: "SHUT UP! AI -- Silent mode now!"

Office visits are going to be fun the next few years while this gets sorted.

-- Michael

~~



@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry

@infosec

.
.
NYU Information for Practice puts out 400-500 good quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can subscribe at @PsychResearchBot

Email2TootBot, to psychology
@Email2TootBot@mastodon.clinicians-exchange.org

Email2Toot ROBOT -- CHECK ACTUAL AUTHOR BELOW:
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.

TITLE: AWS rolls out generative AI service for healthcare documentation
software

Yeah... If it's going to be worth using, it would have to listen to the
whole visit... But this needs more thought. In my past quick
experiments, it took 90% of the effort just to type the directions that
got an AI to generate a halfway-okay note. So the AI (I think) would
have to listen in and then write the note itself for this to be worth it.
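
To make that concrete: the win, if there is one, comes from the full
transcript replacing all of that typed direction. A rough, entirely
hypothetical sketch of the workflow these services imply -- transcript
in, draft note out, clinician still reviews and signs -- where
generate() stands in for whatever model is used (cloud or local):

```python
# Hypothetical workflow sketch -- not AWS HealthScribe's actual API.
# The visit transcript carries the clinical context, so the clinician isn't
# typing directions; they only review and edit the draft at the end.
from typing import Callable

NOTE_TEMPLATE = (
    "You are drafting a clinical progress note for a clinician to review.\n"
    "Using only information stated in the transcript below, write sections for\n"
    "Subjective, Objective, Assessment, and Plan. Mark anything uncertain as [VERIFY].\n\n"
    "Transcript:\n{transcript}\n"
)

def draft_note(transcript: str, generate: Callable[[str], str]) -> str:
    """Turn a whole-visit transcript into a draft note for human review."""
    return generate(NOTE_TEMPLATE.format(transcript=transcript))

# Toy usage with a stand-in "model" so the sketch runs on its own:
if __name__ == "__main__":
    fake_model = lambda prompt: "Subjective: ...\nObjective: ...\nAssessment: ...\nPlan: ..."
    print(draft_note("Patient reports knee pain since yesterday...", fake_model))
```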

Would we want this? Can we trust this?

--Michael

+++++++++

"Amazon Web Services announced Wednesday a new AI-powered service for
healthcare software providers that will help clinicians with paperwork."

"AWS HealthScribe uses generative AI and speech recognition to help
doctors transcribe and analyze their conversations with patients and
drafts clinical notes, the company announced Wednesday at its AWS Summit
New York."

https://www.fiercehealthcare.com/ai-and-machine-learning/aws-rolls-out-generative-ai-service-healthcare-documentation-software


Posted by:
Michael Reeder LCPC
Baltimore, MD




@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry

