Joaquin Oliver’s parents are trying to cope with their son’s murder, but journalist Jim Acosta exploited their grief and crossed ethical boundaries by ‘interviewing’ the deceased teen’s AI avatar.
Loss is an intrinsic part of being human. We all have to endure it at some point.
One of the hardest things about losing a loved one is that as time goes by, it becomes more difficult to hear their voice or see them fully beyond how they appear in photographs.
As artificial intelligence grows increasingly advanced, it has started to promise the allure of “reconnecting” with those who have died.
It used to be the fodder of “Black Mirror” episodes. It’s now our reality.
The ethical implications of bringing someone who has died back to “life” came to the forefront recently when former CNN White House correspondent Jim Acosta shocked and appalled many when he “interviewed” the AI avatar of a student killed in the mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida, in 2018.
The segment aired Aug. 4 – on what would have been Joaquin Oliver’s 25th birthday had he not been murdered in the shooting, along with 16 other victims.
Oliver’s avatar was created by his parents, who uploaded video, photos and things Oliver had written to fashion this AI presence. Oliver’s father, Manuel Oliver, had approached Acosta about doing the “interview.” And Acosta agreed.
Journalists are supposed to be truth tellers. Acosta’s interview crossed multiple lines.
I don’t for a minute fault Oliver’s parents, who are simply finding ways to cope with their grief. They want to keep their son’s memory alive in any way they can.
They also want to bring some meaning to horror by having their son “speak out” about gun violence.
Yet the intense negative reaction to Acosta’s decision to speak with the Olivers’ dead son is warranted.
The second I saw a clip of the segment on X, I felt sick. It seemed deeply exploitative and wrong, and as a veteran journalist, Acosta should have known better.
The avatar of Oliver, while it “spoke” and asked questions, appeared stilted and robot-like.
That didn’t stop Acosta from treating the exchange as a real interview, blurring the line between fiction and reality. That’s what bothered me most about the exchange.
Manuel Oliver talked with Acosta following the taped exchange and expressed his hope that his son’s avatar will soon appear on debate stages. Acosta seemed to agree that would be positive.
“We’ve heard from the parents, we’ve heard from the politicians,” Acosta said to the father of the school shooting victim. “Now we’re hearing from one of the kids. That’s important. That hasn’t happened.”
The reality is that it still hasn’t happened. It never will.
AI isn’t real news. And the media can’t portray it that way.
Acosta, who quit his CNN job earlier this year when his show was relegated to late night and now has a Substack, tells his 2 million followers in his X bio that he “believes in #realnews.”
That’s hard to square with his disturbing decision.
In promoting the show, Acosta teased the segment as “a one-of-a-kind interview” with a shooting victim. Maybe he’s trying to stay relevant in his new, independent role, now that he no longer spars so publicly with President Donald Trump.
I don’t know.
But Acosta’s “interview” crossed major ethical lines. It’s one thing for grieving parents to create a memory of their son. It’s another altogether for a journalist to treat the avatar as real.
As I’ve written before, AI is infiltrating most areas of our lives, including the media industry.
And AI will only continue to improve and become more difficult to distinguish from reality. Professional journalists shouldn’t blur that line.
Ingrid Jacques is a columnist at USA TODAY. Contact her at ijacques@usatoday.com or on X: @Ingrid_Jacques