The Singularity Isn't Here… Yet

So GPT-4 is out, and it's all over for us meatbags. The hype has reached fever pitch: here, in the latest and greatest of AI chatbots, we finally have something that can top us. The singularity has happened, and personally, I welcome our new AI overlords.

Wait a minute though, I smell a rat, and it comes down to how we define intelligence. In my time I've known a lot of very bright people, as well as a lot of not-so-brilliant people who nevertheless think they're very smart just because they have a bunch of qualifications and degrees. Unfortunately, the experience did not bestow God-like intelligence upon me, but it did give me some idea of the difference between intelligence and knowledge.

My premise is that we humans are conditioned by our educational system to equate learning with intelligence, mainly because we have faulty processors and poorer memory, which makes learning a bit difficult. So when we see an AI, a machine that can learn anything because it has a decent processor and memory, we are conditioned to consider it intelligent, because that is what our schools train us to do. In fact, it seems smart to us not because it thinks of new things, but simply because it knows things that we don't, things we haven't had the time or the capacity to learn.

Growing up around, and then having my first career at, a major university, I saw this in action many times: people mastering a skill, memorizing the textbook or the favorite opinions and theories of their university tutor, and spewing it all over the exam sheet to earn their impressive qualifications. On paper they're the crème de la crème, and while it's true that they aren't thick, they're rarely the smart people they think they are. There are people with truly above-average intelligence among them, but in smaller numbers, and their occurrence is not a 1:1 match with those holding advanced university degrees.

Even the vaunted examples of GPT's brilliance tend to reinforce this. It can pass the bar exam or the SAT, so we're told it's as smart as a school-age kid or a lawyer. But both of these qualifications rest on the same faulty premise of our education system, that education equals intelligence, so a machine that has learned all the facts only confirms my point above about rote learning. The machine has simply swallowed what it has learned and regurgitated the answers onto the exam paper. Is that intelligence? Is a search engine smart?

That's not to say that tools like GPT-4 aren't amazing creations with a lot of potential to do good things beyond filling the Internet with superficially readable spam. Everyone should play with them and investigate their potential, and from that will undoubtedly come some very interesting things. Just don't confuse them with real people, because sometimes we meatbags can surprise you.
