God's Trousers! It's a potential Singularity
Sound the alarm klaxons! Intelligence explosion imminent!
"The engines can't take much more! They're going to blow Jim!" - Scotty
>GPT-3 is terrifying because it's a tiny model compared to what's possible, trained in the dumbest way possible on a single impoverished modality on tiny data, yet the first version already manifests crazy runtime meta-learning—and the scaling curves still are not bending!
This is the paper referenced, "Language Models are Few-Shot Learners":
https://arxiv.org/pdf/2005.14165.pdf
This is some commentary on the above:
"In the classic experiment by Latane and Darley in 1968, eight groups of three students each were asked to fill out a questionnaire in a room that shortly after began filling up with smoke. Five out of the eight groups didn’t react or report the smoke, even as it became dense enough to make them start coughing. Subsequent manipulations showed that a lone student will respond 75% of the time; while a student accompanied by two actors told to feign apathy will respond only 10% of the time. This and other experiments seemed to pin down that what’s happening is pluralistic ignorance. We don’t want to look panicky by being afraid of what isn’t an emergency, so we try to look calm while glancing out of the corners of our eyes to see how others are reacting, but of course they are also trying to look calm."
https://intelligence.org/2017/10/13/fire-alarm/