Anonymous ID: b26740 June 24, 2023, 6:14 a.m. No.19064015   >>4234 >>4370 >>4574 >>4712

https://www.cognitivefinance.ai/single-post/aladdin-and-the-genius-that-is-larry-fink

Aladdin and the Genius that Is Larry Fink

“To really understand BlackRock, you need to understand Aladdin,” says chief operating officer Rob Goldstein, who oversees Aladdin as head of BlackRock Solutions. “In the earliest days of BlackRock—almost from day one—the firm was very focused on building this risk capability to understand each and every asset, each and every benchmark, and each and every portfolio.”

Goldstein joined BlackRock as an analyst in 1994, the year of the Great Bond Massacre, when the Fed began raising interest rates more than the markets expected. Fixed-income portfolios blew up amid rising interest rates and shrinking bond prices. In this volatile environment, the risk analytics built into Aladdin revealed their value. BlackRock’s funds held up well, with minimal losses compared to the overall market, because Aladdin enabled BlackRock’s investment teams to understand what they had bought. The market took note, and people started to call BlackRock, asking it to “take a look at their portfolio.” This was the first sign that BlackRock had built something its competitors didn’t have. The level of technical sophistication was not especially high; according to Rob Goldstein, “you were a highly technical person if you knew how to use Lotus 123.” By the mid-1990s, BlackRock already “had the capability to capture trades electronically, to have dashboards with different colours to manage the work flow digitally, to have all positions in real time. That was shockingly rare at that time.” In the midst of the 1995 crisis, BlackRock quickly understood that the technology it had developed, and assumed others might also have, was actually “quite unique in the industry at that time.”

Goldstein candidly recalls the epiphany that made them realise they could sell this technology to third parties. On Halloween 1994, they received a call from Kidder Peabody, the brokerage subsidiary of General Electric, asking BlackRock to help value its assets by looking at the data on a disk known as the “Michelle Spreadsheet,” which contained more than 1,000 rows of trades. As Kidder’s traders executed trades, they would shout them over to Michelle (an employee at Kidder Peabody), who would enter them into her spreadsheet.

Aladdin was originally designed as a piece of technology to analyse risk. Over time, it has evolved into an embedded enterprise system that supports a wide range of business processes, functioning as a central nervous system for the firm.

Anonymous ID: b26740 June 24, 2023, 6:19 a.m. No.19064032

https://en.wikipedia.org/wiki/Leopold_II_of_Belgium

NSALA OF WALA IN THE NSONGO DISTRICT 1904

with the hand and foot of his little girl of five years old—all that remained of a cannibal feast by armed rubber sentries. The sentries killed his wife, his daughter, and a son, cutting up the bodies, cooking and eating them.

Anonymous ID: b26740 June 24, 2023, 6:35 a.m. No.19064106

http://www.cnn.com/chat/transcripts/2001/02/20/lunev/

Former Russian spy Col. Stanislav Lunev’s reaction to FBI agent’s arrest for spying

Anonymous ID: b26740 June 24, 2023, 7:04 a.m. No.19064213   >>4218 >>4223 >>4234 >>4251 >>4370 >>4574 >>4712

https://www.washingtonpost.com/technology/2023/06/19/artificial-intelligence-child-sex-abuse-images/

AI-generated child sex images spawn new nightmare for the web

Investigators say the disturbing images could undermine efforts to find real-world victims

The revolution in artificial intelligence has sparked an explosion of disturbingly lifelike images showing child sexual exploitation, fueling concerns among child-safety investigators that they will undermine efforts to find victims and combat real-world abuse.

Generative-AI tools have set off what one analyst called a “predatory arms race” on pedophile forums because they can create within seconds realistic images of children performing sex acts, commonly known as child pornography.

Thousands of AI-generated child-sex images have been found on forums across the dark web, a layer of the internet visible only with special browsers, with some participants sharing detailed guides for how other pedophiles can make their own creations.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” said Rebecca Portnoff, the director of data science at Thorn, a nonprofit child-safety group that has seen month-over-month growth of the images’ prevalence since last fall.

“Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”

The flood of images could confound the central tracking system built to block such material from the web because it is designed only to catch known images of abuse, not detect newly generated ones. It also threatens to overwhelm law enforcement officials who work to identify victimized children and will be forced to spend time determining whether the images are real or fake.
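To make that design gap concrete, here is a minimal sketch, in Python, of how a known-image tracking system matches content. Everything in it is a hypothetical placeholder: the hash table, the catalogue label, and the file path. Real systems such as PhotoDNA use perceptual fingerprints that survive resizing and re-encoding, not the exact byte-level hash used here for brevity, but the structural point is the same either way: a newly generated image matches nothing in any database of known material, so lookup-based blocking cannot flag it.

import hashlib
from pathlib import Path

# Hypothetical blocklist: hex digests of previously catalogued images.
# In a real deployment this would be a database of perceptual hashes
# maintained by a clearinghouse, not a hard-coded dict.
KNOWN_HASHES = {
    "placeholder-digest-0001": "catalogue-entry-0001",
}

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known(path: Path) -> bool:
    # A freshly generated image yields a digest that appears in no
    # blocklist, which is exactly why systems built around matching
    # known images cannot detect novel AI-generated material.
    return fingerprint(path) in KNOWN_HASHES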

The images have also ignited debate on whether they even violate federal child-protection laws because they often depict children who don’t exist. Justice Department officials who combat child exploitation say such images still are illegal even if the child shown is AI-generated, but they could cite no case in which someone had been charged for creating one.

The new AI tools, known as diffusion models, allow anyone to create a convincing image solely by typing in a short description of what they want to see. The models, such as DALL-E, Midjourney and Stable Diffusion, were fed billions of images taken from the internet, many of which showed real children and came from photo sites and personal blogs. They then mimic those visual patterns to create their own images.

The tools have been celebrated for their visual inventiveness and have been used to win fine-arts competitions, illustrate children’s books and spin up fake news-style photographs, as well as to create synthetic pornography of nonexistent characters who look like adults.

But they also have increased the speed and scale with which pedophiles can create new explicit images because the tools require less technical sophistication than past methods, such as superimposing children’s faces onto adult bodies using “deepfakes,” and can rapidly generate many images from a single command.

It’s not always clear from the pedophile forums how the AI-generated images were made. But child-safety experts said many appeared to have relied on open-source tools, such as Stable Diffusion, which can be run in an unrestricted and unpoliced way.

Stability AI, which runs Stable Diffusion, said in a statement that it bans the creation of child sex-abuse images, assists law enforcement investigations into “illegal or malicious” uses and has removed explicit material from its training data, reducing the “ability for bad actors to generate obscene content.”

But anyone can download the tool to their computer and run it however they want, largely evading company rules and oversight. The tool’s open-source license asks users not to use it “to exploit or harm minors in any way,” but its underlying safety features, including a filter for explicit images, are easily bypassed with a few lines of code that a user can add to the program.

Anonymous ID: b26740 June 24, 2023, 7:05 a.m. No.19064218   >>4220

>>19064213

Testers of Stable Diffusion have discussed for months the risk that AI could be used to mimic the faces and bodies of children, according to a Washington Post review of conversations on the chat service Discord. One commenter reported seeing someone use the tool to try to generate fake swimsuit photos of a child actress, calling it “something ugly waiting to happen.”

But the company has defended its open-source approach as important for users’ creative freedom. Stability AI’s chief executive, Emad Mostaque, told the Verge last year that “ultimately, it’s people’s responsibility as to whether they are ethical, moral and legal in how they operate this technology,” adding that “the bad stuff that people create … will be a very, very small percentage of the total use.”

Stable Diffusion’s main competitors, DALL-E and Midjourney, ban sexual content and are not open source, meaning that their use is limited to company-run channels and all images are recorded and tracked.

OpenAI, the San Francisco research lab behind DALL-E and ChatGPT, employs human monitors to enforce its rules, including a ban on child sexual abuse material, and has removed explicit content from its image generator’s training data so as to minimize its “exposure to these concepts,” a spokesperson said.

“Private companies don’t want to be a party to creating the worst type of content on the internet,” said Kate Klonick, an associate law professor at St. John’s University. “But what scares me the most is the open release of these tools, where you can have individuals or fly-by-night organizations who use them and can just disappear. There’s no simple, coordinated way to take down decentralized bad actors like that.”

On dark-web pedophile forums, users have openly discussed strategies for how to create explicit photos and dodge anti-porn filters, including by using non-English languages they believe are less vulnerable to suppression or detection, child-safety analysts said.

On one forum with 3,000 members, roughly 80 percent of respondents to a recent internal poll said they had used or intended to use AI tools to create child sexual abuse images, said Avi Jager, the head of child safety and human exploitation at ActiveFence, which works with social media and streaming sites to catch malicious content.

Forum members have discussed ways to create AI-generated selfies and build a fake school-age persona in hopes of winning children’s trust, Jager said. Portnoff, of Thorn, said her group also has seen cases in which real photos of abused children were used to train the AI tool to create new images showing those children in sexual positions.

Yiota Souras, the chief legal officer of the National Center for Missing and Exploited Children, a nonprofit that runs a database that companies use to flag and block child-sex material, said her group has fielded a sharp uptick of reports of AI-generated images within the past few months, as well as reports of people uploading images of child sexual abuse into the AI tools in hopes of generating more.

Though such images are only a small fraction of the more than 32 million reports the group received last year, their increasing prevalence and realism threaten to burn up the time and energy of investigators who work to identify victimized children and don’t have the capacity to pursue every report, she said. The FBI said in an alert this month that it had seen an increase in reports regarding children whose photos were altered into “sexually-themed images that appear true-to-life.”

“For law enforcement, what do they prioritize?” Souras said. “What do they investigate? Where exactly do these go in the legal system?”

Some legal analysts have argued that the material falls in a legal gray zone because fully AI-generated images do not depict a real child being harmed. In 2002, the Supreme Court struck down two provisions of a 1996 congressional ban on “virtual child pornography,” ruling that its wording was broad enough to potentially criminalize some literary depictions of teenage sexuality.

Anonymous ID: b26740 June 24, 2023, 7:05 a.m. No.19064220

>>19064218

The ban’s defenders argued at the time that the ruling would make it harder for prosecutors arguing cases involving child sexual abuse because defendants could claim the images didn’t show real children.

In his dissent, Chief Justice William H. Rehnquist wrote, “Congress has a compelling interest in ensuring the ability to enforce prohibitions of actual child pornography, and we should defer to its findings that rapidly advancing technology soon will make it all but impossible to do so.”

Daniel Lyons, a law professor at Boston College, said the ruling probably merits revisiting, given how the technology has advanced in the past two decades.

“At the time, virtual [child sexual abuse material] was technically hard to produce in ways that would be a substitute for the real thing,” he said. “That gap between reality and AI-generated materials has narrowed, and this has gone from a thought experiment to a potentially major real-life problem.”

Two officials with the Justice Department’s Child Exploitation and Obscenity Section said the images are illegal under a law that bans any computer-generated image that is sexually explicit and depicts someone who is “virtually indistinguishable” from a real child.

They also cite another federal law, passed in 2003, that bans any computer-generated image showing a child engaging in sexually explicit conduct if it is obscene and lacks serious artistic value. The law notes that “it is not a required element of any offense … that the minor depicted actually exist.”

“A depiction that is engineered to show a composite shot of a million minors, that looks like a real kid engaged in sex with an adult or another kid — we wouldn’t hesitate to use the tools at our disposal to prosecute those images,” said Steve Grocki, the section’s chief.

The officials said hundreds of federal, state and local law-enforcement agents involved in child-exploitation enforcement will probably discuss the growing problem at a national training session this month.

Separately, some groups are working on technical ways to confront the issue, said Margaret Mitchell, an AI researcher who previously led Google’s Ethical AI team.

One solution, which would require government approval, would be to train an AI model to create examples of fake child-exploitation images so online detection systems would know what to remove, she said. But the proposal would pose its own harms, she added, because this material can come with a “massive psychological cost: This is stuff you can’t unsee.”

Other AI researchers now are working on identification systems that could imprint code into images linking back to their creators in hopes of dissuading abuse. Researchers at the University of Maryland last month published a new technique for “invisible” watermarks that could help identify an image’s creator and be challenging to remove.
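As a toy illustration of the general idea (and emphatically not the University of Maryland technique the article describes, which is designed to resist removal), here is a sketch of the simplest possible invisible watermark: hiding a creator identifier in the least-significant bit of each pixel channel. It assumes numpy and Pillow are installed; LSB embedding like this is destroyed by ordinary JPEG re-compression, which is precisely the removal problem the research aims to solve.

import numpy as np
from PIL import Image

def embed(img: Image.Image, payload: bytes) -> Image.Image:
    """Hide payload bits in the low bit of the flattened RGB channels."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    px = np.array(img.convert("RGB"), dtype=np.uint8)
    flat = px.reshape(-1)  # view into px, so writes modify px in place
    if bits.size > flat.size:
        raise ValueError("payload too large for this image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return Image.fromarray(px)

def extract(img: Image.Image, n_bytes: int) -> bytes:
    """Read n_bytes back out of the low bits."""
    flat = np.array(img.convert("RGB"), dtype=np.uint8).reshape(-1)
    return np.packbits(flat[:n_bytes * 8] & 1).tobytes()

# A round trip survives lossless formats like PNG; a JPEG save erases it.
# marked = embed(Image.open("photo.png"), b"creator-0042")
# assert extract(marked, 12) == b"creator-0042"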

Such ideas would probably require industry-wide participation to work, and even then they would not catch every violation, Mitchell said. “We’re building the plane as we’re flying it,” she said.

Even when these images don’t depict real children, Souras, of the National Center for Missing and Exploited Children, said they pose a “horrible societal harm.” Created quickly and in massive amounts, they could be used to normalize the sexualization of children or frame abhorrent behaviors as commonplace, in the same way predators have used real images to induce children into abuse.

“You’re not taking an ear from one child. The system has looked at 10 million children’s ears and now knows how to create one,” Souras said. “The fact that someone could make 100 images in an afternoon and use those to lure a child into that behavior is incredibly damaging.”

Anonymous ID: b26740 June 24, 2023, 7:09 a.m. No.19064236   >>4246

https://en.wikipedia.org/wiki/Homunculus

That the sperm of a man be putrefied by itself in a sealed cucurbit for forty days with the highest degree of putrefaction in a horse's womb, or at least so long that it comes to life and moves itself, and stirs, which is easily observed. After this time, it will look somewhat like a man, but transparent, without a body. If, after this, it be fed wisely with the Arcanum of human blood, and be nourished for up to forty weeks, and be kept in the even heat of the horse's womb, a living human child grows therefrom, with all its members like another child, which is born of a woman, but much smaller.

Anonymous ID: b26740 June 24, 2023, 7:18 a.m. No.19064270   >>4283

https://twitter.com/vicktop55/status/1672525703760297984

A moment from today’s BBC broadcast. HSE lecturer Olga Krasnyak began saying that the Russian people would definitely not accept the rebellion, because “We are one people, we are one nation, we are united in achieving our goals.” But, you understand, that is not what the hosts of the British state channel wanted to hear, so the presenter interrupted the speaker.

Anonymous ID: b26740 June 24, 2023, 7:25 a.m. No.19064308   >>4311 >>4323 >>4370 >>4574 >>4712

https://www.nytimes.com/2023/06/24/opinion/father-hunter-biden-addiction.html

The Real Lesson From the Hunter Biden Saga

One of our most urgent national problems is addiction to drugs and alcohol. It now kills about a quarter-million Americans a year, leaves many others homeless and causes unimaginable heartache in families across the country — including the family living in the White House.

Hunter Biden, who has written about his tangles with crack cocaine and alcohol, reached a plea agreement on tax charges a few days ago that left some Republicans sputtering, but to me, the main takeaway is a lesson the country and the president could absorb to save lives.

While the federal investigation appears to be ongoing, for now I see no clear evidence of wrongdoing by President Biden himself — but the president does offer the country a fine model of the love and support that people with addictions need.

When Biden was vice president and trailed by Secret Service agents, he once tracked down Hunter when he was on a bender and refused to leave until his son committed to entering treatment. Biden then gave his son a tight hug and promised to return to make sure he followed through.

“Dad saved me,” Hunter wrote in his memoir, “Beautiful Things,” adding: “Left on my own, I’m certain I would not have survived.”

On another occasion, the Biden family staged an intervention, and Hunter stormed out of the house. Biden ran down the driveway after his son. “He grabbed me, swung me around and hugged me,” Hunter wrote. “He held me tight in the dark and cried for the longest time.”

Last year Sean Hannity broadcast an audio recording of a voice mail message that President Biden left for Hunter. Hannity thought it reflected badly on the president; my reaction was that if more parents showed this kind of support for children in crisis, our national addiction nightmare might be easier to overcome.

“It’s Dad,” the president says in the message, and he sounds near tears. “I’m calling to tell you I love you. I love you more than the whole world, pal. You gotta get some help. I don’t know what to do. I know you don’t, either. But I’m here, no matter what you need. No matter what you need. I love you.”

I don’t have family members with addictions, but I’ve lost far too many friends to drugs and alcohol. At this moment, I have two friends who have disappeared, abandoning their children, and when last seen were homeless, abusing drugs and supporting themselves by selling fentanyl. I fear every day that they’ll die from an overdose, or that they’ll sell drugs to someone else who overdoses.

Anonymous ID: b26740 June 24, 2023, 7:26 a.m. No.19064311   >>4370 >>4574 >>4712

>>19064308

I’m terrified for them and furious at them — but even more, I’m outraged that so many Americans are suffering pain and inflicting pain, yet our policies toward addiction are lackadaisical and ineffective: Only about 6 percent of people with substance use disorder get treatment, according to the federal government. We should be expanding access, boosting research for medication-assisted treatment, pressing China harder to curb exports of fentanyl precursors and addressing the economic despair that drives some people to substance abuse.

The Bidens benefited from the connections and resources often necessary to access detox and rehab programs; these should be readily available to all.

Some Republicans allege that the president himself was engaged in influence peddling, and that Hunter received favorable treatment from the Justice Department; an I.R.S. whistle-blower who assisted in the investigation says that a prosecutor in the Justice Department did indeed interfere on the side of Hunter. Maybe there will be future revelations, but for now, as best I can sort things out: 1) Hunter acted inappropriately to monetize his proximity to the White House, just as Donald Trump and members of his family did; and 2) Joe Biden acted honorably (although I do think it was a mistake to take Hunter to China on Air Force Two in 2013 when he was pursuing business there, and Biden was flatly wrong to say in May that “my son has done nothing wrong”).

The Biden administration kept on a Trump appointee as U.S. attorney in Delaware, precisely to continue an independent and credible investigation of Hunter. The prosecutors appear to have pored over 15 years of Hunter’s business dealings and have not so far identified any misconduct by the president. And the plea agreement the prosecutor reached with Hunter does not seem lenient. (Most people in similar circumstances, including Roger Stone, have not been prosecuted criminally.)

Congressional Republicans will continue to make allegations. Some Democrats have seemed reluctant to engage, perhaps finding the Hunter saga sordid and likely to taint those who touch it. I think that’s a mistake. What I see is an opportunity for the president to take on the nation’s drug and alcohol problem as forcefully as he took on his son’s. Hunter Biden appears to have come back from the brink, and that can reassure families now in despair; millions of desperate Americans could use that hope.

One precedent: The former first lady Betty Ford’s heroic acknowledgment in the 1970s of her struggles with drugs and alcohol pulled back the curtain on addiction and got many more people into treatment.

Joe Biden undertook a major federal push to combat cancer after it claimed his son Beau; I wish he would transform his administration’s present ineffective effort against addiction into a similar all-out initiative against the forces that almost killed Hunter.

Anonymous ID: b26740 June 24, 2023, 7:33 a.m. No.19064349

Western media do not hide their joy.

"The man who wants to overthrow Putin"

"Muscovites are packing their bags"

In the German editions: fireworks and champagne

Anonymous ID: b26740 June 24, 2023, 8 a.m. No.19064483   >>4486

https://www.rollingstone.com/politics/politics-features/arizona-republicans-embrace-qanon-quack-covid-hearing-1234742074/

Arizona Republicans Embrace QAnon With Quack Covid Hearing

A new committee has adopted a hashtag from the conspiracy theory’s followers, and is holding a two-day anti-vax circus at the state Capitol

Anonymous ID: b26740 June 24, 2023, 8:05 a.m. No.19064504

Then: “Sure, Azov may have a FEW bad apples, but Wagner is FULL of Nazis who do Nazi things!”

Now: “The Wagner Freedom Fighters are LIBERATING Russia from Putin!”

Anonymous ID: b26740 June 24, 2023, 8:38 a.m. No.19064639

https://www.dailymail.co.uk/news/article-12227091/Hunter-Biden-tax-probe-began-investigation-amateur-porn-ring.html

Hunter Biden tax probe began as investigation into an amateur porn platform: Bombshell allegations from IRS whistleblower reveal President's son deducted thousands in payments to a prostitute and sex club

 

House Ways and Means released transcript of IRS whistleblowers

Gary Shapley charged that DOJ gave Hunter special treatment on taxes owed

He said probe began as look at amateur porn platform; expenses questioned