I have a counter argument. From an evolutionary standpoint, if you keep doubling computer capacity exponentially isn’t it extraordinarily arrogant of humans to assume that their evolutionarily stagnant brains will remain relevant for much longer?
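To put rough numbers on that doubling claim, here's a back-of-envelope sketch. The synapse count, the transistor count, and the two-year doubling cadence are all loose assumptions, not measurements; the point is only what exponential growth does to a fixed target:

```python
# Rough back-of-envelope: how many doublings until transistors per chip
# pass the synapse count of a human brain? Both starting numbers are
# order-of-magnitude guesses, and the doubling cadence is assumed to hold.
import math

synapses = 1e14          # commonly cited ballpark for the human brain
transistors = 1e11       # order of magnitude for a large modern chip
doubling_years = 2       # classic Moore's-law cadence (an assumption)

doublings = math.ceil(math.log2(synapses / transistors))
print(doublings, "doublings ->", doublings * doubling_years, "years")
```

Ten doublings is two decades. Whether raw transistor count has anything to do with intelligence is, of course, the whole argument of this thread.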
You can make the same argument about humans that you do about AI, but from a biological and societal standpoint. Barring any jokes about certain political or geographical stereotypes, humans have gotten “smarter” than we used to be. We are very adaptable, and with improvements to diet and education, we have managed to stay ahead of the curve. We didn’t peak at hunter-gatherer. We didn’t stop at the Renaissance. And we blew right past the Industrial Revolution. I’m not going to channel my “Humanity, Fuck Yeah” inner wolf howl, but I have to give our biology props. The body is an amazing machine, and even though we can look at things like the current crop of AI and think, “Welp, that’s it, humans are done for,” I’m sure a lot of people thought the same at other pivotal moments in technological and societal advancement. Here I am, though, farting Taco Bell into my office chair and typing about it.
You can compare human intelligence to centuries ago, but only on a simple linear scale. Neural density has not increased, by any stretch of the imagination, in the way that transistor density has. But I’m not just talking about density; I’m talking about scalability that is infinite. Infinite scale of knowledge and data.
Let’s face it, people are already not that intelligent; we are just smart enough to use the technology of other, smarter people. And then there are computers: they are growing in intelligence, with an artificial evolutionary pressure being exerted on their development, and you’re telling me they’re not going to continue until they surpass us in every way? There is very little to stop computers from becoming intelligent on a galactic scale.
Computer power doesn’t scale infinitely, unless you mean building a world-mind and powering it off the spinning singularity at the center of the galaxy like a Type III civilization, and that’s sci-fi stuff. We still have to worry about bandwidth, power, cooling, coding, and everything else that goes into running a computer. It doesn’t just “scale”. There is a lot that goes into it, and it does have a ceiling. Quantum computing may alleviate some of that, but I’ll hold my applause until we see some useful real-world applications for it.
Furthermore, we still don’t understand how the mind works. There are still secrets to unlock and ways to potentially augment and improve it. AI is great, and I fully support the advancement of technology, but don’t count out humans so quickly. We haven’t even gotten close to human-level intelligence with GOFAI, and maybe we never will.
You can believe whatever you want, but I don’t think it’s arrogant to say what I did. You are basing your view of humanity on what you think humanity has done, and your view of AI on what you think it will do. Those are fundamentally different and not comparable. If you want to talk about the science-fiction future of AI, we should talk about the science-fiction future of humanity as well. Let’s talk about augmenting ourselves, extending lifespans, and all of the good things that people think we’ll do in the coming centuries. If you want to look at humans and say that we haven’t evolved at all in the last 3000 years, then we should look at computers the same way. Computers haven’t “evolved” at all. They still do the same thing they always have. They do a lot more of it, but they don’t do anything “new”. We have found ways to increase the processing power and the storage capacity, but a computer today has the same limits that the one that sent us to the moon had. It’s a computer, and incapable of original thought. You seem to believe that just because we throw more RAM and processors at it, somehow that will change things, but it doesn’t. It just means we can do the same things, but faster. Eventually we’ll run out of things to process and data to store, but that won’t bring AI any closer to reality. We are climbing the mountain, but you speak like we have already crested. We’ve barely left base camp in the grand scheme of artificial intelligence.
Holy wall of unparagraphed word salad. Again, you are not understanding what is and isn’t an evolutionary process: a disease can wipe out half a species, and that is considered part of evolution. You don’t have to be intelligent about it; all you have to do is keep increasing complexity in response to an external force, and that is it. That’s all that is needed for an evolutionary force.
With computers we don’t have to know what we are doing (to recreate consciousness); we just have to select for better, more complex systems (the same way evolution did for humans), which is the inevitable result of progress. Do you think computers are going to stop improving? The road maps for chip architecture for the next ten years don’t seem to suggest it’s slowing down yet.
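“Select for better, more complex systems without knowing what you’re doing” is exactly what evolutionary algorithms do. Here’s a minimal sketch, using the toy count-the-ones fitness (the classic OneMax problem) as a stand-in for “complexity”; every name and parameter is made up for illustration:

```python
import random

random.seed(0)

def fitness(genome):
    # Toy stand-in for "complexity": count of 1-bits. The selector never
    # needs to know *why* a genome is fit, only that it scores higher.
    return sum(genome)

def evolve(pop_size=20, genome_len=32, generations=100, mut_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # blind selection pressure
        children = [[bit ^ (random.random() < mut_rate) for bit in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(fitness(g) for g in pop)

print(evolve())  # climbs toward the maximum of 32 with no model of the problem
```

Nothing in the loop understands bits or goals; selection plus mutation does the work, which is the point being made about consciousness above.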
And like the fractalization of coastlines, facts, knowledge, and data are completely unlimited: the deeper you look, the more there is.
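The coastline comparison is the classic Richardson effect: measure a fractal curve with a finer ruler and the total length keeps growing. For the Koch curve you don’t even need geometry code, since each refinement replaces every segment with four segments a third as long, multiplying the length by 4/3:

```python
def koch_length(n):
    # Length of the Koch curve built from a unit segment after n refinements.
    # Each refinement multiplies the length by 4/3, so the measured length
    # grows without bound even though the curve stays in the same bounding box.
    return (4 / 3) ** n

for n in (0, 1, 5, 20):
    print(n, round(koch_length(n), 2))
```

Whether facts and data really behave like a fractal is debatable, but this is the sense in which “the deeper you look, the more there is” can be literally true of a bounded object.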
On top of all of this, you have the fact that progress has been constantly accelerating, in a way that human intelligence is incapable of perceiving accurately.
Therefore computer intelligence is going to vastly outpace our own. And very soon, too.
The laws of physics still apply. We already have to do all kinds of crazy tricks to make transistors as small as they are and not leak electrons all over the place due to quantum tunneling. The best thing we figured out how to do is just pile on more CPU/GPU cores.
It’s also arrogant to assume we will continue on this exponential industrial-revolution growth of the last 300 years and not plateau as a species again for the next thousand. We could be looking at an eon of just burnin’ away our oil while we try to cling more and more to whatever other energy impinges on this pitiful little planet, trapped in our local space unable to use our pathetic spacecraft to push us any further.
The laws of physics are no less and no more applicable to our own biology in terms of complexity, density, scale, and information capacity, and in most of those ways biology is far less efficient and accurate than its silicon counterpart.
There is nothing to suggest that the growth in computer intelligence is going to stop, or that it’s doing anything but just getting started.
Apart from your use of “infinite”, I agree: there is no reason we shouldn’t be able to surpass nature with synthetic intelligence. The time computers have existed is a mere blip on a historic scale, and computers surpassed us at logic games like chess, and at arithmetic, long ago.
Modern LLMs are just the current stage; before that, you could say the stage was pattern recognition. We had OCR in the ’80s as probably the most practical example. It may seem like a long time between breakthroughs, but 40 years is nothing compared to evolution.
I have no doubt strong AI will be achieved eventually, and when it is, I have no doubt AI will surpass our intelligence in every way very quickly.
If you keep doubling the number of fruit flies exponentially, isn’t it likely that humanity will find itself outsmarted?
The answer is no, it isn’t. Quantity does not quality make, and all our current AI tech amounts to ways of breeding fruit flies that fly left or right depending on what they see.
As a counterargument to that: companies have been trying to make self-driving cars work for 20 years. Processing power has increased a millionfold, and the things still get stuck. Pure processing power isn’t everything.
Magic as in street magician, not magic as in wizard. Lots of the things people claim AI can do are like a magic show: it’s amazing if you look at it from the right angle, and with the right skill you can hide the strings holding it up, but if you try to use it in the real world, it falls apart.
The masses have been treating it like actual magic since the early stages and are only slowly warming up to the idea that it’s calculations. Calculations of things that are often more than the sum of their parts, as people are starting to realize. Well, some people anyway.
There’s magic?
Only if you believe in it. Many CEOs do. They’re very good at magical thinking.
I wish there was actual magic
It would make science very difficult.
What if it magically made it easier?
Mmm irrational shit makes rationality harder
Look at quantum mechanics
Everything is magic if you don’t understand how the thing works.
I wish. I don’t understand why my stomach can’t handle corn, but it doesn’t lead to magic. It leads to pain.
Have you eaten hominy corn? The nixtamalisation process makes it digestible.
I don’t have access to that, sadly. I’m pretty sure my body would reject it however. At least from my reading on what it is.
If you’re a techbro, this is the new magic shit, man! To the moooooon!
oh the bubble’s gonna burst sooner than some may think
Next week, some say
Sam Altman will make a big pile of investor money disappear before your very eyes.
If only.