Anonymous ID: bab0e9 Aug. 7, 2022, 9:11 p.m. No.17191076

>>17190376

as always third time is the charm and then I'll shut up about it

a reminder that there was fraudulent activity which affected vote counts in very strange ways, nationwide.

i make no claim to understand why this swing occurred or that any one party was behind it, but VA CD-01 swung in a way which benefitted Trump in that district (he still lost the state, so nbd… right?)

days after polls closed, the official VA election site documented the disappearance of over 90k votes, resulting in a swing from -19k to +21k for DJT.
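to see how ~90k vanishing votes can move a margin by 40k, here's the arithmetic as a quick sketch — the per-candidate split below is hypothetical, chosen only so the totals line up with the figures above:

```python
# Hypothetical per-candidate removals consistent with the claimed totals:
# ~90k votes disappearing while the margin moves from Trump -19k to Trump +21k.
removed_biden = 65_000   # hypothetical split
removed_trump = 25_000   # hypothetical split

total_removed = removed_biden + removed_trump        # 90,000 votes gone
net_swing_to_trump = removed_biden - removed_trump   # 40,000 net swing

margin_before = -19_000                              # Trump trailing by 19k
margin_after = margin_before + net_swing_to_trump    # Trump leading by 21k

print(total_removed)       # 90000
print(margin_after)        # 21000
```

any split of the 90k removals works as long as the difference between the two candidates' losses is 40k; the split shown is just one such pair.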

data archived here:

https://archive.ph/https://results.elections.virginia.gov/vaelections/2020%20November%20General/Site/Presidential.html

Anonymous ID: bab0e9 Aug. 7, 2022, 9:12 p.m. No.17191255   >>2546

>>17190376

https://www.theverge.com/2022/6/23/23179748/amazon-alexa-feature-mimic-voice-dead-relative-ai

Amazon shows off Alexa feature that mimics the voices of your dead relatives

Amazon has revealed an experimental Alexa feature that allows the AI assistant to mimic the voices of users’ dead relatives.

The company demoed the feature at its annual MARS conference, showing a video in which a child asks Alexa to read a bedtime story in the voice of his dead grandmother.

“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” said Rohit Prasad, Amazon’s head scientist for Alexa AI. Prasad introduced the clip by saying that adding “human attributes” to AI systems was increasingly important “in these times of the ongoing pandemic, when so many of us have lost someone we love.”

“While AI can’t eliminate that pain of loss, it can definitely make their memories last,” said Prasad.

Amazon has given no indication whether this feature will ever be made public, but says its systems can learn to imitate someone’s voice from just a single minute of recorded audio. In an age of abundant videos and voice notes, this means it’s well within the average consumer’s reach to clone the voices of loved ones — or anyone else they like.

Although this specific application is already controversial, with users on social media calling the feature “creepy” and a “monstrosity,” such AI voice mimicry has become increasingly common in recent years. These imitations are often known as “audio deepfakes” and are already regularly used in industries like podcasting, film and TV, and video games.

Audio deepfakes are already common in podcasting and film

Many audio recording suites, for example, offer users the option to clone individual voices from their recordings. That way, if a podcast host flubs a line, a sound engineer can edit what they’ve said simply by typing in a new script. Replicating lines of seamless speech requires a lot of work, but very small edits can be made with a few clicks.

The same technology has been used in film, too. Last year, it was revealed that a documentary about the life of chef Anthony Bourdain, who died in 2018, used AI to clone his voice in order to read quotes from emails he sent. Many fans were disgusted by the application of the technology, calling it “ghoulish” and “deceptive.” Others defended the use of the technology as similar to other reconstructions used in documentaries.

Amazon’s Prasad said the feature could enable customers to have “lasting personal relationships” with the deceased, and it’s certainly true that many people around the world are already using AI for this purpose. People have already created chatbots that imitate dead loved ones, for example, training AI based on stored conversations. Adding accurate voices to these systems — or even video avatars — is entirely possible using today’s AI technology, and is likely to become more widespread.

However, whether or not customers will want their dead loved ones to become digital AI puppets is another matter entirely.