Technological Singularity

vimothy

yurp
Doubtless most of these people will be wrong. Prediction is hard, especially about the future and all that. Still, it promises to be pretty weird.
 

Mr. Tea

Let's Talk About Ceps
Computers that can build better computers than humans.

Again, aren't we doing this already? Surely ICs have been designed using computer algorithms for decades? I can't imagine there are huge rooms full of engineers painstakingly deciding where each and every transistor in a Pentium Dual Core has to go.

And I don't see that this whole concept is really, at base, all that groundbreaking. It's obvious that you can make a better axe if you have some elementary metalworking and woodworking tools lying around than if you're using your bare hands to manipulate a lump of flint, a stick and some twine. An axe is a tool for chopping stuff up and a computer is a tool for performing calculations very quickly.

Sorry if this is sounding at all facetious, it's really not meant that way!
 

massrock

Well-known member
I don't think the concept is supposed to be groundbreaking as such, it's attempting to be a prediction, to say this is where it looks like things are going. And part of that is based on extrapolating from observations of where we are now, like you saying above that computers are already being used to design computers.
 

massrock

Well-known member
the only serious arguments I've ever heard against the eventual development of genuinely intelligent machines all boil down to a thinly veiled belief that there just has to be something more to human intelligence than mere neurons and biochemistry
Even this question doesn't necessarily matter so much. No absolute reason why AIs should be based on human intelligence.

Maybe the issue for AI coming up is one of Initiative (or Will)?
 

Client Eastwood

Well-known member
Thanks for all the info here. I wish I could comment more but I'm still finding out stuff on all of this. Will read the thread in detail and the links more fully later.
 

vimothy

yurp
Sorry if this is sounding at all facetious, it's really not meant that way!

Of course it's true that humans use technology to design and build more technology. But I think that the transhumanists would make a distinction between that and...
 

turtles

in the sea
The singularity for AI is not about making an AI that can learn, it's about making an AI that can write an AI smarter than itself (which then makes an AI smarter than itself, etc etc). Neural nets learn, but they haven't, as yet, learned how to make something better at learning than a neural net.
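The distinction turtles is drawing (and the doubt about it) can be made concrete with a toy loop. This is a sketch, not real AI: the `successor` rule, the `gain` parameter and the ceiling of 100 are all invented for illustration. Whether the sequence of "smarter AIs" explodes or plateaus depends entirely on which improvement rule you assume, which is exactly the unproven step:

```python
# Toy sketch of "an AI that writes an AI smarter than itself".
# The improvement rule below is a hypothetical: each generation closes
# half the remaining gap to an assumed ceiling of 100 "units".

def successor(iq, gain=0.5):
    # With diminishing returns like this, the sequence converges
    # instead of taking off.
    return iq + gain * (100 - iq)

iq = 10.0
for generation in range(20):
    iq = successor(iq)

print(iq)  # approaches 100 but never passes it
```

Swap in `return iq * 1.5` instead and the same loop diverges; nothing in the singularity argument itself tells you which regime real intelligence is in.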

The main problem with this idea as applied to the singularity is that there's no proof that intelligence increases in any sort of continuous fashion (there could be large, discontinuous jumps between levels), or that there isn't an upper maximum to intelligence. It's some weak induction plus wishful thinking.

Would be cool though. I'd upload my brain no problem.
 

Client Eastwood

Well-known member
AI and NNs have been around a while. I think genetic algorithms may hold the key, in that they generate loads of solutions and then assess whether they are fit for purpose, keeping the nearest fits and basing new, better solutions on the previous sets of logic. Kinda like they self-analyse. I don't think they will replicate human intelligence and intuition, and I'm not even sure that would be a requirement in machine intel. I think some of what these TS guys talk about could happen, but perhaps not in the time frames they talk about, i.e. by 2035... Still feeling my way around this...
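The loop described above (generate loads of candidate solutions, assess their fitness, keep the nearest fits, breed new solutions from them) is easy to sketch. A minimal, hypothetical example in Python, maximising the number of 1-bits in a string; the objective, population sizes and mutation rate are arbitrary choices for illustration:

```python
import random

random.seed(0)
TARGET_LEN = 20

def fitness(bits):
    # "Assess if fit for purpose": here, just count the 1-bits.
    return sum(bits)

def mutate(bits, rate=0.05):
    # Flip each bit with a small probability.
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    # Splice two parent solutions at a random cut point.
    cut = random.randrange(1, TARGET_LEN)
    return a[:cut] + b[cut:]

# Generate an initial population of random candidate solutions.
population = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
              for _ in range(50)]

for generation in range(40):
    # Keep the nearest fits...
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    # ...and base new solutions on the previous set.
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(40)
    ]

best = max(population, key=fitness)
print(fitness(best))
```

Because the top ten survivors are carried over unchanged each generation, the best fitness can never go down, which is the "self-analysing" ratchet the post describes.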
 

nomadthethird

more issues than Time mag
AI already exists, and it will likely get more streamlined and advanced. So? There are already programs that write programs, so it's not a huge leap to think there will be AI machines that create AI machines.

We are AI machines. This is what people don't seem to get. There's not any difference between our brains and the brain of an AI machine, except arguably a few layers of emergence/complexity. We're similarly "programmed" with all sorts of biological information, we're limited by our biochemical hardware, and our brains function exactly the same way a computer does, by turning a bunch of inputs into output.

What bothers me about this Singularity business is the notion that lurks in the background, which is that "intelligence" progresses in lifeforms along some ever-steepening curve up the x-y axis, which is pretty silly. The idea that you can abstract the "intelligence" of a person (as if it's some kind of essence) from the body itself and then simply inject it into other bodies is pretty ridiculous in its own right. The mind IS part of the body. If we can use computer chips or something to make our minds work more efficiently or better, we should do it. But for goodness sake, it won't be such a huge deal. We've already got microchips in there. Amino acids, proteins, nucleic acids, lipids.
 

massrock

Well-known member
But aren't most of the Singularity proponents talking about what if we, or something subsequent to us, can overcome the limitations of our biological hardware and the limitations of having to make our environments with solid matter? What if human-like or human-level or better intellects can operate on super-efficient hardware and inhabit practically infinitely designable software environments? Then taking the possibilities afforded by that scenario and projecting to the next iteration, and the next, etc.
 

grizzleb

Well-known member
Well I like the idea of protein machines, you could get all sorts of crazee designs on the go that were still 'biological' in nature. Sensory upgrades et al. I don't really think that machines can be conscious, at least not until these neural nets' complexity reaches some vast degree. Even then? Maybe if we make machines that can talk to us without ever being taught how to do so, or what talking even is, like babies have to. I can't see that happening though.
 

sufi

lala
i find this all a bit previous actually,
far as i'm concerned technological singularity is upon us and has been for a while, we have a typically human brain-fuzz preventing us from recognising how far it's gone;
given the woolly definitions of what intelligence is anyway, haven't we already massively augmented human thought (both on an individual and a species level) using simple technologies like writing, and the technological capacity to store and share information? so it's not really about cyborgs or any wicked scifi shit like that
& furthermore
this is not just restricted to primitive mind/body distinctions. by technologically augmenting our communications in this way (not to mention other essentials such as accommodation, food, transport, medicine etc etc) the species and the individual achieve huge benefits, almost to the point that within a few generations we have come to rely utterly on machines, and now it's a struggle to imagine survival without them. Increasingly they actually have the upper hand, and humans without access to tech, or to the 'machine' are more and more disadvantaged and excluded

roll on astronomical singularity :cool:
 

massrock

Well-known member
far as i'm concerned technological singularity is upon us and has been for a while, we have a typically human brain-fuzz preventing us from recognising how far it's gone;
I don't know if it is though, upon us that is, at least not by any definition that fans of the Singularity concept would recognise. The idea is that technological advancements proceed at an exponential rate eventually hitting an effectively infinite rate of innovation / change / improvement. That's the Omega Point or whatever. I'm not necessarily saying I believe that's how it will go down but that's what they say, and I can see reasons for thinking something like this might happen.

But yeah, of course the process of technological development has been under way for some time, that's the basis of the idea anyway isn't it?
 

Mr. Tea

Let's Talk About Ceps
I don't know if it is though, upon us that is, at least not by any definition that fans of the Singularity concept would recognise. The idea is that technological advancements proceed at an exponential rate eventually hitting an effectively infinite rate of innovation / change / improvement. That's the Omega Point or whatever. I'm not necessarily saying I believe that's how it will go down but that's what they say, and I can see reasons for thinking something like this might happen.

But yeah, of course the process of technological development has been under way for some time, that's the basis of the idea anyway isn't it?

Thing is, exponential increase just keeps getting gradually steeper and steeper; there's no sudden cut-off moment where everything goes "WHOOSH!" all in one go. I think the idea of the Singularity is that some breakthrough is made which enables not merely a quantitative change in the pace of technological innovation, which after all is happening all the time (Moore's law), but a qualitative shift, so that a graph of processor power or whatever vs. time effectively looks like a vertical wall.
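The "no sudden WHOOSH" point is just arithmetic: under an assumed two-year doubling period (the classic Moore's-law figure, used here purely for illustration), each step is the same modest factor of two as the last, even though the totals become enormous:

```python
# Toy Moore's-law projection. Starting count is the Intel 4004's
# transistor count (~2,300 in 1971); the two-year doubling period
# is an assumption for illustration.
transistors = 2_300
for year in range(1971, 2021, 2):
    transistors *= 2   # every step is the same smooth factor of 2

print(f"{transistors:,}")  # ~77 billion after 25 doublings
```

On a log scale this is a straight line with no kink anywhere, which is why the singularity argument needs a *qualitative* break rather than just more of the same exponential.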

One possible catalyst people have mentioned already could be computers that are better at designing things than we are - genetic algorithms and the like. Another could be quantum computers, which (once some fairly substantial practical difficulties are solved) offer effectively limitless computing power. I think some theorists think they may even be able to solve problems that are in principle insoluble to classical Turing machines (e.g. common-or-garden computers as they exist today). Then there's work people have been doing with pieces of DNA, using base pairs as digits to perform immensely complex calculations... some people think DNA/RNA can unzip and re-zip much more quickly than it 'should' be able to according to semi-classical molecular dynamics, which means the nucleotides may be existing in quantum superposition before actually binding to the phosphate backbone to complete the reaction.

I dunno if it counts as a fully-fledged subdiscipline yet, but people are already writing papers on 'quantum biology'. :D And some of them have a bit more of a basis in experimental reality than Penrose's magic tubules, too.
 

massrock

Well-known member
Thing is, exponential increase just keeps getting gradually steeper and steeper; there's no sudden cut-off moment where everything goes "WHOOSH!" all in one go. I think the idea of the Singularity is that some breakthrough is made which enables not merely a quantitative change in the pace of technological innovation, which after all is happening all the time (Moore's law), but a qualitative shift, so that a graph of processor power or whatever vs. time effectively looks like a vertical wall.
"Effectively infinite". I don't think it's precise, I don't think that you can necessarily measure "technological developments" as proceeding at an "exponential rate", this is obviously an approximation to aid visualisation. Moore's law isn't a law either as such, but it's proved to be pretty close to the truth.

I don't know if it's necessary that a specific technological breakthrough be made, though several of significance may be made along the way - Teilhard de Chardin's original Omega Point thing had to do with a certain threshold level of complexity of organisation being reached in the Universe.
 

nomadthethird

more issues than Time mag
grizzleb said:
Well I like the idea of protein machines

I do, too! I am one. A protein machine where each protein is ultimately made out of only 20 amino acids. Which are in turn just amine and carboxyl groups with variable side chains. Which are ultimately just hydrogen, carbon, nitrogen and oxygen bonded in various combinations.

Then there's work people have been doing with pieces of DNA, using base pairs as digits to perform immensely complex calculations...some people think DNA/RNA can unzip and re-zip much more quickly than it 'should' be able to according to semi-classical molecular dynamics, which means the nucleotides may be existing in quantum superposition before actually binding to the phosphate backbone to complete the reaction.

I dunno if it counts as a fully-fledged subdiscipline yet, but people are already writing papers on 'quantum biology'. :D And some of them have a bit more of a basis in experimental reality than Penrose's magic tubules, too.

This shit is the bomb. There are people who can fucking swap codons around to see what happens. I just heard a seminar by a lady who does it, something about "codon wobble" in tRNA and arg pairs (I was sleeping with my eyes open). Those are small! O the patience her lackeys must have...
 

3 Body No Problem

Well-known member
Another could be quantum computers, which (once some fairly substantial practical difficulties are solved) offer effectively limitless computing power. I think some theorists think they may even be able to solve problems that are in principle insoluble to classical Turing machines (e.g. common-or-garden computers as they exist today).

There is no reason to believe quantum computers can do that. QC can speed up some computations, like the factoring of integers (which would be problematic for the kind of cryptography used today to secure the internet), but that doesn't let them solve problems a classical Turing machine can't solve at all.
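The size of that speed-up for factoring specifically can be sketched with textbook asymptotics. This is purely indicative: both cost models below drop all constant factors, and the classical formula is only the leading-order term of the general number field sieve's heuristic running time:

```python
import math

def classical_gnfs_ops(bits):
    # Rough leading-order cost of the best known classical factoring
    # algorithm (general number field sieve); constants ignored.
    n = bits * math.log(2)  # ln N for an N of the given bit length
    return math.exp((64 / 9) ** (1 / 3)
                    * n ** (1 / 3)
                    * math.log(n) ** (2 / 3))

def shor_ops(bits):
    # Shor's algorithm factors in roughly O((log N)^3) quantum gates.
    return bits ** 3

for bits in (512, 1024, 2048):
    print(bits, f"{classical_gnfs_ops(bits):.1e}", f"{shor_ops(bits):.1e}")
```

The gap between the two columns grows explosively with key size, which is why factoring is the poster child for quantum speed-ups; but it's a faster algorithm for a computable problem, not a machine that computes the uncomputable.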

I dunno if it counts as a fully-fledged subdiscipline yet, but people are already writing papers on 'quantum biology'.

Others are still debating whether there's any real substance behind quantum biology.
 