The Technological Singularity - Will it Happen?


"Artificial intelligence will reach human levels by around 2029. Follow that out further to, say, 2045, we will have multiplied the intelligence, the human biological machine intelligence of our civilization a billion-fold." -Ray Kurzweil

The technological singularity will indeed take place at some point in the future, but it will not happen as quickly as some people believe. Ray Kurzweil, one of the "experts" on the singularity who wrote a book titled The Singularity Is Near, has predicted that it will occur sometime in the year 2045. For me this is far too optimistic a date. Personally, I think the singularity will happen between the years 3000 and 5000, but I am not basing this on any scientific formula; it's merely a guess. (Imagine the mathematical knowledge and computational abilities that will be available during those years. Doesn't it boggle the mind to think what specific types of math and programming people will be engaged in at that time?)

My primary reason for pushing the singularity back nearly 1,000 years or more is that I do not believe the field of computer intelligence (which in this post I'll call artificial intelligence) is anywhere near capable of producing anything remotely as sophisticated as a human brain, a view that Paul Allen of Microsoft fame also holds.

Yet this raises an interesting question: for the singularity to take place, do artificially intelligent robots have to operate at the exact level of a human brain, or within, say, 90% of one? I'm not convinced they will need to reach 100%. I see the singularity happening only once A.I. machines possess enough ability to surpass human intelligence in the areas that would let them overtake mankind and rule over us, and that basic assumption is what I shall base my criteria on.

So just how advanced does A.I. cognition need to be for the singularity to occur? It seems A.I. machines will need these characteristics and capabilities:

  1. Have enough knowledge of and skill with electronics, robotics, programming, and mechanics to build and maintain other A.I. systems and robots, plus advanced weapons, and to maintain whatever housing and shelter they require.
  2. Possess enough awareness to realize some entity on earth is ultimately in power.
  3. Have the desire to become this entity holding all the power.
  4. Possess the motivation and skills necessary to engage in whatever tasks are needed to defeat their enemy.

Then, when an abundance of machines with the above capabilities and knowledge interconnect, a synergistic effect of ultra-intelligence will ensue, bringing about the TECHNOLOGICAL SINGULARITY.


Notice that the above four capabilities mean A.I. robots will not need the knowledge to produce, say, a world-class painting, a Nobel Prize-winning piece of literature, or an opera equal in quality to one by Mozart. Those areas of human intelligence (artistic pursuits) will not need to be developed in the A.I. robots, which is a good thing for those in favor of the singularity, because those particular mental attributes are so subtle that A.I. will probably never be able to duplicate them; most psychologists and philosophers can't even pinpoint what makes a person "talented" in these areas. Therefore, with A.I. achieving the characteristics mentioned above (and perhaps others of lesser importance that are easier to replicate), I think the singularity will have a better chance of taking place. But definitely not in the year 2045 as Kurzweil has predicted.

Why does Kurzweil think the singularity will take place so soon? I suspect he has personal reasons. Years ago I watched a documentary that placed much emphasis on Kurzweil's fear of death. I believe he wants the singularity to happen before he passes away so that he can alter his body with robotic equipment, or even download the data from his brain into a robot, to prevent his own death. Kurzweil has even mentioned reanimating his late father by downloading his "mind" and "personality" (through DNA extraction, I suppose) into a robot. Yikes!


Many people fear the technological singularity because they see it as the demise of all life on earth. Kurzweil does not hold that opinion. He views the singularity as a way for people to become immortal, choosing to focus only on its positive, life-enhancing effects. Personally, I do not see the singularity as a way of beating death (although that could be partially possible, if you consider life as a cyborg a normal, decent life, and only if we can keep the A.I. robots under proper control). More likely, I think death will come sooner when the singularity happens. And it is going to happen. Just not anytime soon. Hopefully people will put various safeguards into place to protect themselves as they see the singularity approaching.

Just think, people like us who love technology so much could be working diligently to bring about our own destruction! But that's the main reason I am currently studying programming intensely. Anyone with strong enough programming skills will have a much better chance of subduing the A.I. robots and reprogramming them to be non-violent, thus saving their own life and the lives of their loved ones. A skilled enough programmer could completely reprogram an A.I. robot so that it protects the very person it was originally sent to harm. And if successful at that, the programmer could send the captured robot on a mission to defeat other A.I. robots so they too could be reprogrammed to fight on the human side of the battle, building up a personal army of A.I. robots to stand against any hostile forces brought about by the technological singularity.


I want to be prepared if the singularity occurs during my lifetime, since I'm not going to allow the A.I. robots to rule over me or destroy me. I'm definitely going to have my coding skills up to par so that I can build my own personal A.I. army to protect myself. I only hope the A.I. robots are running the Python programming language, since that's probably the only computer language I'll ever be able to learn! (Well, I do know a small amount of the Pari/GP language, but that is only for number theory work and would not pertain to reprogramming robots.) I hope all of my fellow Steemians can get their programming skills up to a level that will give them proper protection during the coming battle against the technological singularity. Good luck!
