Anonymous ID: e9abab July 28, 2022, 3:36 a.m. No.16896551

PB

>>16894109

 

https://twitter.com/lyzl/status/1540495806217732103?cxt=HHwWjoCyucOx-OAqAAAA

ASB412 🇺🇦🏳️‍🌈🏴‍☠️🇺🇸 @AmyBamy55037 · 58m
Replying to @lyzl
I hope they pulled him out of that truck.

Nick Wessum @snarkandmoobz · 33m
Having fled the People’s Republic of Seattle, trust me… we know the left’s tactics.
Take your foot off the gas, get pulled out and beaten to death (or close to it).
No thanks.
You made the rules.

 

YES ^^^^^^^^^

Summer of Love

taught us well

not stopping

to see what happens

with the “mostly peaceful protestors”

 

 

all pb

>>16888314

>>16887286

>>16886892

>>16889727

>>16895155

>>16893725

>>16888895

>>16888563

>>16890680

>>16894239

>>16886387

>>16887424

Anonymous ID: e9abab July 28, 2022, 3:38 a.m. No.16896803

Amazon, just say no: The looming horror of AI voice replication

 

If an AI like Alexa really can convert less than a minute of recorded voice into real-time speech, it opens the door to dystopian gaslighting at a whole new level. It's frightening, creepy, and disturbing.
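
To be clear about how low the bar already is: few-shot voice cloning isn’t locked up in Amazon’s labs, it’s in public open-source tools today. A minimal sketch using the Coqui TTS library’s XTTS v2 model as a stand-in (nothing here is Amazon’s actual Alexa pipeline or API, and the file names are made up):

    # A rough sketch, NOT Amazon's method: few-shot voice cloning with the
    # open-source Coqui TTS library (XTTS v2). Roughly a minute of reference
    # audio is enough for a usable clone. File paths are hypothetical.
    from TTS.api import TTS

    # Load a multilingual voice-cloning model (downloads on first use)
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Clone the voice in "grandma_sample.wav" (a short recorded clip)
    # and have it speak arbitrary new text
    tts.tts_to_file(
        text="Dorothy lived in the midst of the great Kansas prairies...",
        speaker_wav="grandma_sample.wav",
        language="en",
        file_path="cloned_reading.wav",
    )

The cloning itself is already commodity; what Amazon demoed is doing it in real time on an Echo device.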

 

Last week, we ran a news article entitled, “Amazon’s Alexa reads a story in the voice of a child’s deceased grandma.” In it, ZDNet’s Stephanie Condon discussed an Amazon presentation at its re:MARS conference (Amazon’s annual confab on topics like machine learning, automation, robotics, and space).

 

In the presentation, Amazon’s Alexa AI Senior VP Rohit Prasad showed a clip of a young boy asking an Echo device, “Alexa, can grandma finish reading me ‘The Wizard of Oz’?” The video then showed the Echo reading the book using what Prasad said was the voice of the child’s dead grandmother.

 

Hard stop. Did the hairs on the back of your neck just stand up? ’Cause that’s not creepy at all. Not at all.

 

Prasad, though, characterized it as beneficial, saying, “Human attributes of empathy and affect are key for building trust. They have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

 

Hmm. Okay. So let’s deconstruct this, shall we?

I hear dead people

 

There is a psychological sensory experience clinically described as SED, for “sensory and quasi-sensory experiences of the deceased.” This is a more modern clinical term for what used to be described as hallucinations.

 

According to a November 2020 clinical study, Sensory and Quasi-Sensory Experiences of the Deceased in Bereavement: An Interdisciplinary and Integrative Review, SED experiences aren’t necessarily a psychological disorder. Instead, somewhere between 47% and 82% of people who have been through life events like the death of a loved one report some sort of SED.

 

The study I’m citing is particularly interesting because it is both integrative and interdisciplinary, meaning it aggregates the results of other studies across a variety of research areas. As such, it’s a good summing up of the general clinical perception of SED.

 

According to the study, SED experiences cross boundaries: they are reported across all age groups, in many religions, across all types of relationship loss, and regardless of the circumstances of death.

 

But whether an SED experience is considered comforting or disturbing depends both on the individual and that individual’s belief system. SED also manifests in all sorts of ways, from hearing footsteps, to experiences of presence, to sightings. It doesn’t always have to be voice reproduction.

 

Overall, the report stops short of making a clinical value judgement about whether SED experiences are psychologically beneficial or detrimental, stating that further study is needed.

 

https://www.zdnet.com/article/just-say-no-the-potential-horrors-of-ai-voice-replication/