Anonymous ID: 171bee March 7, 2019, 10:34 a.m. No.5559935   🗄️.is 🔗kun

THE PENTAGON CANCELED its so-called LifeLog project, an ambitious effort to build a database tracking a person's entire existence.

 

Run by Darpa, the Defense Department's research arm, LifeLog aimed to gather in a single place just about everything an individual says, sees or does: the phone calls made, the TV shows watched, the magazines read, the plane tickets bought, the e-mail sent and received. Out of this seemingly endless ocean of information, computer scientists would plot distinctive routes in the data, mapping relationships, memories, events and experiences.

 

LifeLog's backers said the all-encompassing diary could have turned into a near-perfect digital memory, giving its users computerized assistants with an almost flawless recall of what they had done in the past. But civil libertarians immediately pounced on the project when it debuted last spring, arguing that LifeLog could become the ultimate tool for profiling potential enemies of the state.

 

Researchers close to the project say they're not sure why it was dropped late last month. Darpa hasn't provided an explanation for LifeLog's quiet cancellation. "A change in priorities" is the only rationale agency spokeswoman Jan Walker gave to Wired News.

 

However, related Darpa efforts concerning software secretaries and mechanical brains are still moving ahead as planned.

LifeLog is the latest in a series of controversial programs that have been canceled by Darpa in recent months. The Terrorism Information Awareness, or TIA, data-mining initiative was eliminated by Congress – although many analysts believe its research continues on the classified side of the Pentagon's ledger. The Policy Analysis Market (or FutureMap), which provided a stock market of sorts for people to bet on terror strikes, was almost immediately withdrawn after its details came to light in July.

 

"I've always thought (LifeLog) would be the third program (after TIA and FutureMap) that could raise eyebrows if they didn't make it clear how privacy concerns would be met," said Peter Harsha, director of government affairs for the Computing Research Association.

 

"Darpa's pretty gun-shy now," added Lee Tien, with the Electronic Frontier Foundation, which has been critical of many agency efforts. "After TIA, they discovered they weren't ready to deal with the firestorm of criticism."

 

That's too bad, artificial-intelligence researchers say. LifeLog would have addressed one of the key issues in developing computers that can think: how to take the unstructured mess of life, and recall it as discrete episodes – a trip to Washington, a sushi dinner, construction of a house.

 

"Obviously we're quite disappointed," said Howard Shrobe, who led a team from the Massachusetts Institute of Technology Artificial Intelligence Laboratory that spent weeks preparing a bid for a LifeLog contract. "We were very interested in the research focus of the program … how to help a person capture and organize his or her experience. This is a theme with great importance to both AI and cognitive science."

 

To Tien, the project's cancellation means "it's just not tenable for Darpa to say anymore, 'We're just doing the technology, we have no responsibility for how it's used.'"

 

Private-sector research in this area is proceeding. At Microsoft, for example, minicomputer pioneer Gordon Bell's program, MyLifeBits, continues to develop ways to sort and store memories.

 

David Karger, Shrobe's colleague at MIT, thinks such efforts will still go on at Darpa, too.

 

"I am sure that such research will continue to be funded under some other title," wrote Karger in an e-mail. "I can't imagine Darpa 'dropping out' of such a key research area."

 

http://archive.is/OvB9T#selection-1347.0-1607.33

Anonymous ID: 171bee March 7, 2019, 10:38 a.m. No.5559987   🗄️.is 🔗kun

The Pentagon is shopping for ways to capture everything a person sees, says and hears, as part of a project it says is meant to help create smarter robots.

 

The projected system, called LifeLog, would take in all of a subject's experience, from phone numbers dialed and e-mail messages viewed to every breath taken, step made and place gone. The idea is to index the material and make patterns easily retrievable, in an effort to make machines think more like people, learning from experience.

 

The Defense Advanced Research Projects Agency, or Darpa, the Pentagon's cradle for new technologies, is sponsoring a competition for proposals to set up such a system.

 

The project could result in more effective computers capable of building on a user's past and interpreting his or her commands, said Jan Walker, a Darpa spokeswoman.

 

Ms. Walker said the new project had nothing to do with the agency's Terrorist Information Awareness program, formerly called Total Information Awareness – a research initiative, criticized by civil liberties groups, to create a vast computer-based surveillance system intended to thwart terrorism.

 

The goal of LifeLog is to create a searchable database of human lives, initially those of the developers, to promote artificial intelligence, the agency said. The technology would advance a new class of systems able to reason in a number of ways, learn from experience and respond in a robust manner to surprises, the agency's Information Processing Technology Office said.


 

To do so, the office said, the system must index the details of daily life and make it possible to infer the user's routines, habits and relationships with other people, organizations, places and objects, and to exploit these patterns to ease its task.

 

Darpa said any proposals from developers would have to address human subject approval, data privacy and security, copyright and legal considerations that would affect the LifeLog development process.

 

Steven Aftergood, who tracks government secrecy for the Federation of American Scientists, said he was not prepared to call the LifeLog initiative illegitimate. "But, you know, it's one more program that demands vigilant oversight," he said. "The more personal experience that can be captured by digital means, the more vulnerable that experience is to unwanted surveillance."

 

https://www.nytimes.com/2003/05/30/us/pentagon-explores-a-new-frontier-in-the-world-of-virtual-intelligence.html

Anonymous ID: 171bee March 7, 2019, 10:44 a.m. No.5560064   🗄️.is 🔗kun

When Randi Zuckerberg's family pic got out in 2012, the family was upset Facebook didn't guard their privacy more than that of the littles

Anonymous ID: 171bee March 7, 2019, 10:51 a.m. No.5560177   🗄️.is 🔗kun   >>0187 >>0236 >>0308 >>0309

Facebook sent a doctor on a secret mission to ask hospitals to share patient data

1:39 PM ET Thu, 14 June 2018

 

Facebook has asked several major U.S. hospitals to share anonymized data about their patients, such as illnesses and prescription info, for a proposed research project. Facebook was intending to match it up with user data it had collected, and help the hospitals figure out which patients might need special care or treatment.

 

The proposal never went past the planning phases and has been put on pause after the Cambridge Analytica data leak scandal raised public concerns over how Facebook and others collect and use detailed information about Facebook users.

 

"This work has not progressed past the planning phase, and we have not received, shared, or analyzed anyone's data," a Facebook spokesperson told CNBC.

 

But as recently as last month, the company was talking to several health organizations, including Stanford Medical School and the American College of Cardiology, about signing the data-sharing agreement.

 

While the data shared would obscure personally identifiable information, such as the patient's name, Facebook proposed using a common computer science technique called "hashing" to match individuals who existed in both sets. Facebook says the data would have been used only for research conducted by the medical community.

 

The project could have raised new concerns about the massive amount of data Facebook collects about its users, and how this data can be used in ways users never expected.

 

That issue has been in the spotlight after reports that Cambridge Analytica, a political research organization that did work for Donald Trump, improperly got ahold of detailed information about Facebook users without their permission. It then tried to use this data to target political ads to them.

 

Facebook said on Wednesday that as many as 87 million people's data might have been shared this way. The company has recently announced new privacy policies and controls meant to restrict the type of data it collects and shares, and how that data can be used.

 

The exploratory effort to share medical-related data was led by an interventional cardiologist called Freddy Abnousi, who describes his role on LinkedIn as "leading top-secret projects." It was under the purview of Regina Dugan, the head of Facebook's "Building 8" experiment projects group, before she left in October 2017.

 

Facebook's pitch, according to two people who heard it and one who is familiar with the project, was to combine what a health system knows about its patients (such as: person has heart disease, is age 50, takes 2 medications and made 3 trips to the hospital this year) with what Facebook knows (such as: user is age 50, married with 3 kids, English isn't a primary language, actively engages with the community by sending a lot of messages).

 

The project would then figure out if this combined information could improve patient care, initially with a focus on cardiovascular health. For instance, if Facebook could determine that an elderly patient doesn't have many nearby close friends or much community support, the health system might decide to send over a nurse to check in after a major surgery.

 

The people declined to be named as they were asked to sign confidentiality agreements.

 

Facebook provided a quote from Cathleen Gates, the interim CEO of the American College of Cardiology, explaining the possible benefits of the plan:

 

"For the first time in history, people are sharing information about themselves online in ways that may help determine how to improve their health. As part of its mission to transform cardiovascular care and improve heart health, the American College of Cardiology has been engaged in discussions with Facebook around the use of anonymized Facebook data, coupled with anonymized ACC data, to further scientific research on the ways social media can aid in the prevention and treatment of heart disease—the #1 cause of death in the world. This partnership is in the very early phases as we work on both sides to ensure privacy, transparency and scientific rigor. No data has been shared between any parties."

 

https://www.cnbc.com/2018/04/05/facebook-building-8-explored-data-sharing-agreement-with-hospitals.html

Anonymous ID: 171bee March 7, 2019, 10:51 a.m. No.5560187   🗄️.is 🔗kun

>>5560177

Health systems are notoriously careful about sharing patient health information, in part because of state and federal patient privacy laws that are designed to ensure that people's sensitive medical information doesn't end up in the wrong hands.

 

To address these privacy laws and concerns, Facebook proposed to obscure personally identifiable information, such as names, in the data being shared by both sides.

 

However, the company proposed using a common cryptographic technique called hashing to match individuals who were in both data sets. That way, both parties would be able to tell when a specific set of Facebook data matched up with a specific set of patient data.
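The article doesn't describe Facebook's actual scheme, but the general technique works roughly like this sketch: both parties hash the same quasi-identifying fields the same way, producing a shared join key so records can be matched without either side handing over raw names. (The field names, records, and salt below are invented for illustration.)

```python
import hashlib

def match_key(full_name: str, dob: str, salt: str = "shared-secret") -> str:
    """Derive a deterministic join key from quasi-identifiers.
    Both parties hash the same normalized fields the same way, so
    matching records yield the same key without exposing raw names."""
    normalized = f"{full_name.strip().lower()}|{dob}|{salt}"
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical records held separately by a hospital and a platform.
hospital = [{"name": "Jane Doe", "dob": "1969-04-01", "dx": "heart disease"}]
platform = [{"name": "Jane Doe", "dob": "1969-04-01", "friends": 12}]

# Each side replaces the name with its hash before sharing.
hospital_keys = {match_key(r["name"], r["dob"]): r for r in hospital}
matches = []
for r in platform:
    k = match_key(r["name"], r["dob"])
    if k in hospital_keys:
        # Records join on the hash, not the raw identity.
        matches.append((k, hospital_keys[k]["dx"], r["friends"]))
```

Note that hashing like this is pseudonymization, not anonymization: anyone who knows the salt and can guess a name can recompute the key, which is exactly why critics flagged the re-identification risk.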

 

The issue of patient consent did not come up in the early discussions, one of the people said. Critics have attacked Facebook in the past for doing research on users without their permission. Notably, in 2014, Facebook manipulated hundreds of thousands of people's news feeds to study whether certain types of content made people happier or sadder. Facebook later apologized for the study.

 

Health policy experts say that this health initiative would be problematic if Facebook did not think through the privacy implications.

 

"Consumers wouldn't have assumed their data would be used in this way," said Aneesh Chopra, president of a health software company specializing in patient data called CareJourney and the former White House chief technology officer.

 

"If Facebook moves ahead (with its plans), I would be wary of efforts that repurpose user data without explicit consent."

 

When asked about the plans, Facebook provided the following statement:

 

"The medical industry has long understood that there are general health benefits to having a close-knit circle of family and friends. But deeper research into this link is needed to help medical professionals develop specific treatment and intervention plans that take social connection into account."

 

"With this in mind, last year Facebook began discussions with leading medical institutions, including the American College of Cardiology and the Stanford University School of Medicine, to explore whether scientific research using anonymized Facebook data could help the medical community advance our understanding in this area. This work has not progressed past the planning phase, and we have not received, shared, or analyzed anyone's data."

 

"Last month we decided that we should pause these discussions so we can focus on other important work, including doing a better job of protecting people's data and being clearer with them about how that data is used in our products and services."

 

Facebook has taken only tentative steps into the health sector thus far, such as its campaign to promote organ donation through the social network. It also has a growing "Facebook health" team based in New York that is pitching pharmaceutical companies to invest their ample ad budgets into Facebook by targeting users who "liked" a health advocacy page or fit a certain demographic profile.

 

https://www.cnbc.com/2018/04/05/facebook-building-8-explored-data-sharing-agreement-with-hospitals.html