Can someone explain AI (the Singularity) to me?

Cherokeekid88

AI is something that fascinates me, but I also have a bit of trouble fully understanding the "sentient" part of it. I understand what the word means, but what is actually happening when AI finally hits the Singularity? If it is indeed humans that create the AI, how does AI gain its own agenda and begin to "think" on its own? And as I understand it, the majority, if not all, of current AI is based on language models at the moment? What is that "spark" that needs to happen for AI to be able to reprogram itself or create its own AI?
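On the language-model point: at their core, these systems predict the next token from statistics over their training data. A toy sketch (a simple bigram word counter, nothing remotely like the scale or architecture of a real LLM) shows the idea, and also why there is no built-in "agenda" anywhere in the mechanism:

```python
# Toy sketch, NOT a real LLM: a language model at its core predicts
# the next word from statistics of the text it was trained on.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the food".split()

# Count which word follows which (a "bigram" model, the simplest case)
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    # Pick the most frequent next word; no goals, no desires, just counts
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

Real LLMs replace the word counts with a neural network over billions of parameters, but the training objective is the same shape: predict what comes next. The "spark" people debate is whether anything goal-like can emerge from scaling that up.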
 
I watch and hear about some of the steps AI is taking and it's scary. Then I think about how if you unplug a computer it shuts off.
 
I don’t see where that accounts for a loss of power. It speaks of redundancies in processing.

Kill the power source. Just like with any living being. Take away food and it’ll slowly die.
 
But then they create human farms and use our bioelectricity to power themselves and .... oh wait that's that movie.

Idiocracy was 'just a movie' at one point too. Keanu was warning us, not entertaining us.
 
Sentience and Singularity are two entirely different things. One could say that AIs are already sentient (self-aware), but you could also say that true self-awareness is impossible for a program. The Singularity is a long way off...
 
I think the big difference will always be that AI won’t ever feel. It won’t feel the desire, the appreciation, the pride of cooking a perfect steak.

It can read all the directions and know how to perform the task, but it won’t appreciate it.

Reminiscent of that scene in Good Will Hunting when Robin Williams calls him out on the fact that he can recite what it’s like to visit a place, but he doesn’t grasp what it feels like to stand there in awe.
 

Kill the power source. Just like with any living being. Take away food and it’ll slowly die.
You mean like - block out the sun so they can't get solar power anymore?
 
There are a couple of good recent Joe Rogan podcast episodes where they get into this quite a bit.

Episode 2044 with Sam Altman, the CEO of an AI company called OpenAI.

And then the Brian Muraresku episode, which happens to have a segment where they talk around it.

Very interesting.
 
related:

The notion of a sovereign territory that exists solely to support AI without government regulation is interesting and frightening.
It also means that, technically, if a country feels threatened by it, they could attack or destroy it as a means of protection without legal repercussions...

Following that logic it may actually be preferred to be outside national boundary because we can kill it easier when (not if) the time comes.
 
Sentience and Singularity are two entirely different things, One could say that AI's are already Sentient (Self Aware), but you could also say that true self awareness is impossible for a program. Singularity is a long way off...
Ah, I gotcha. I guess when I read stuff I see those two words come up and assume they can be used interchangeably.
 