ID: c94da6 June 8, 2020, 2:31 a.m. No.9531098

I think this may be worth noting. The OpenAI group just put out a paper that is causing some excitement:

 

"GPT-3 is terrifying because it's a tiny model compared to what's possible, trained in the dumbest way possible on a single impoverished modality on tiny data, yet the first version already manifests crazy runtime meta-learning—and the scaling curves still are not bending!"
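The "runtime meta-learning" here is what the paper calls in-context or few-shot learning: the model picks up a task from a handful of solved examples placed directly in its prompt, with no gradient updates or fine-tuning. A minimal sketch in Python of what such a prompt looks like, using the English-to-French translation illustration from the paper's own figures (the helper function is mine, not from the paper):

# Sketch of a few-shot prompt in the style of the GPT-3 paper.
# All "learning" happens inside the prompt, at inference time;
# no weights are updated. Example pairs echo the paper's Figure 2.1.

def build_few_shot_prompt(task, examples, query):
    """Assemble: task description, K solved examples, then the
    unsolved query the model is expected to complete."""
    lines = [task, ""]
    for english, french in examples:
        lines.append(f"English: {english}")
        lines.append(f"French: {french}")
        lines.append("")
    lines.append(f"English: {query}")
    lines.append("French:")  # the model continues from here
    return "\n".join(lines)

print(build_few_shot_prompt(
    "Translate English to French.",
    [("sea otter", "loutre de mer"),
     ("plush giraffe", "girafe peluche")],
    "cheese",
))

Everything up to the trailing "French:" acts as the training set; whatever the model generates next is its answer. That a single model does this at runtime, across many unrelated tasks, is the meta-learning being marveled at.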

 

This is the paper referenced, "Language Models are Few-Shot Learners":

 

https://arxiv.org/pdf/2005.14165.pdf

Language Models are Few-Shot Learners
Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, Sandhini Agarwal, Ariel Herbert-Voss, Gretchen Krueger, Tom Henighan, et al.
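The remark about scaling curves not bending refers to the power-law trends OpenAI reported in the companion paper "Scaling Laws for Neural Language Models" (Kaplan et al., 2020): test loss keeps falling as a power law in parameter count, with no flattening in sight. A rough sketch of that relation (the constants are the approximate values reported there; treat the outputs as ballpark illustrations, not predictions):

# Approximate parameter-count scaling law from Kaplan et al. (2020):
#   L(N) ~ (N_c / N) ** alpha_N   (test loss, nats per token)

N_C = 8.8e13      # approximate critical parameter count (non-embedding)
ALPHA_N = 0.076   # approximate power-law exponent for model size

def predicted_loss(n_params):
    """Predicted test loss for a model with n_params non-embedding
    parameters, assuming data and compute are not the bottleneck."""
    return (N_C / n_params) ** ALPHA_N

# GPT-2-sized, GPT-3-sized, and a hypothetical 1-trillion-parameter model:
for n in (1.5e9, 175e9, 1e12):
    print(f"{n:.1e} params -> predicted loss ~ {predicted_loss(n):.2f}")

Each tenfold increase in parameters buys roughly the same fractional drop in loss (about 16% with this exponent), which is what "the curves are not bending" means: so far, more scale just keeps paying off.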

 

This is some related commentary, Eliezer Yudkowsky's "There's No Fire Alarm for Artificial General Intelligence" (MIRI, October 2017):

 

https://intelligence.org/2017/10/13/fire-alarm/

 

Perhaps those theories about an intelligence explosion (a singularity, the apocalypse of the nerds) are not so far off.