
Singularity



Could be true. They reckon CPU speeds double every 18 months, so by 2030 CPU speeds will be approximately 2^17 times faster than today. That's about 131,000 times faster!
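For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python (assuming 2005, the date of this thread, as the starting point; note that 25 years is really about 16.7 doublings, so 2^17 = 131,072 rounds up slightly):

CODE
# Back-of-the-envelope Moore's Law arithmetic: one doubling every 18 months.
years = 2030 - 2005            # 25 years from the date of this post
doublings = years * 12 / 18    # 18 months per doubling -> ~16.7 doublings
speedup = 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly {speedup:,.0f}x faster")
# prints: 16.7 doublings -> roughly 104,032x faster (2**17 = 131,072)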

QUOTE (rickyrob @ Apr 27 2005, 01:56 PM)
Could be true. They reckon CPU speeds double every 18 months, so by 2030 CPU speeds will be approximately 2^17 times faster than today. That's about 131,000 times faster!

I remember the 8 MHz 8086 processor coming out (and I believe you could hit the turbo button on the computer to take it to 12 MHz)... That was only 20 years ago... Now we're talking 3,200+ MHz available retail...

 

However, I believe that going from, say, 1.6 GHz to 3.2 GHz is not doubling the speed... I can't remember exactly how this works, but I'm pretty sure there's a formula, and it isn't as simple as 2x the MHz = 2x the speed...
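For what it's worth, the textbook formula here is the "iron law" of processor performance: execution time = instruction count x cycles per instruction (CPI) / clock rate. Doubling the clock only doubles the speed if CPI and everything else stay constant, which they rarely do (memory can't keep up, pipelines stall, and so on). A minimal sketch in Python, with made-up illustrative numbers rather than measurements of any real chip:

CODE
# "Iron law" of CPU performance: time = instructions * CPI / clock_rate.
# The CPI values below are invented for illustration only.

def exec_time(instructions, cpi, clock_hz):
    """Seconds to run a program: instruction count * cycles-per-instruction / clock."""
    return instructions * cpi / clock_hz

program = 1e9                                       # a billion instructions
slow = exec_time(program, cpi=1.0, clock_hz=1.6e9)  # 1.6 GHz chip
fast = exec_time(program, cpi=1.3, clock_hz=3.2e9)  # 3.2 GHz chip, but memory
                                                    # stalls raise its CPI
print(f"speedup: {slow / fast:.2f}x")               # ~1.54x, not 2x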


QUOTE (RushRevisited @ Apr 27 2005, 06:59 PM)
QUOTE (rickyrob @ Apr 27 2005, 01:56 PM)
Could be true. They reckon CPU speeds double every 18 months, so by 2030 CPU speeds will be approximately 2^17 times faster than today. That's about 131,000 times faster!

I remember the 8 MHz 8086 processor coming out (and I believe you could hit the turbo button on the computer to take it to 12 MHz)... That was only 20 years ago... Now we're talking 3,200+ MHz available retail...

 

However, I believe that going from, say, 1.6 GHz to 3.2 GHz is not doubling the speed... I can't remember exactly how this works, but I'm pretty sure there's a formula, and it isn't as simple as 2x the MHz = 2x the speed...

You're probably right, RR, but I'm not arguing with Stephen Hawking; I read this in one of his books and just double-checked it.

 

It was based on computations per second rather than raw GHz.


QUOTE (rickyrob @ Apr 27 2005, 01:06 PM)


It was based on computations per second rather than raw GHz.

Yes; strictly speaking, Moore's Law is about the number of transistors on a chip doubling, but Kurzweil's generalization of it tracks computations per second. Kurzweil believes the trend is itself accelerating: the time it takes for computation speed to double, currently about 18 months, is shrinking. In other words, several years down the road computation speed will double in less than 18 months.
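A toy model of that claim, purely illustrative (the 18-month starting point is from this thread; the 5% shrink per doubling is an invented number, not Kurzweil's):

CODE
# Toy model of "accelerating returns": the doubling time itself shrinks.
speed = 1.0             # computations per second, arbitrary units
doubling_months = 18.0  # current doubling time
month = 0.0
for _ in range(10):
    month += doubling_months
    speed *= 2
    doubling_months *= 0.95  # assume each doubling arrives 5% sooner
    print(f"month {month:5.1f}: {speed:6.0f}x, next doubling in {doubling_months:4.1f} months")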


Very often in nature, things tend to happen in a way well represented by the logistic equation.

 

Nature seems to impose limits on things, like the speed of light, or how many coyotes can live in Connecticut, or the height of a growing sunflower. If, for instance, some new species is introduced into an area well suited for it, its population will grow exponentially for a time. But the limitations of the habitat (food sources, etc.) will eventually slow this population growth.

 

 

If we had never seen a sunflower before and plotted its height as a function of time, we might think: OMG! It's growing FASTER and FASTER! Soon it will reach the moon! Of course, we know better.

 

 

My point is that we are probably in the early-to-mid stages of computer development, so the growth in the speed of these computers looks exponential. But nature will eventually impose limits, and the growth in the speed of these machines will flatten.
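For the curious, here is a minimal numerical sketch of the logistic growth mentioned above, dP/dt = r * P * (1 - P/K), iterated with Euler steps; r and K are arbitrary illustrative values:

CODE
# Logistic growth: dP/dt = r * P * (1 - P/K).
# While P << K it looks exponential; it flattens as P nears the ceiling K.
r, K = 0.5, 1000.0  # growth rate and carrying capacity (arbitrary)
P, dt = 1.0, 0.1    # initial population and Euler time step
for step in range(301):
    if step % 50 == 0:
        print(f"t = {step * dt:5.1f}   P = {P:7.1f}")
    P += r * P * (1 - P / K) * dt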

 

 

 


Sure!

 

In a dripping faucet at low pressure, drops come off the faucet with equal timing between them. As the pressure is increased the drops begin to fall with two drops falling close together, then a longer wait, then two drops falling close together again. In this case, a simple periodic process has given way to a periodic process with twice the period, a process described as "period doubling". If the flow rate of water through the faucet is increased further, often an irregular dripping is found and the behavior can become chaotic.

 

http://www.exploratorium.edu/complexity/CompLexicon/LogisticBifn.gif
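That bifurcation diagram comes from the logistic map, x_{n+1} = r * x * (1 - x), the textbook example of period doubling. A small sketch to watch it happen (this assumes only the standard map, nothing from the linked page):

CODE
# Logistic map x_{n+1} = r * x * (1 - x): steady drip -> period doubling -> chaos.

def attractor(r, x=0.5, warmup=1000, keep=8):
    """Iterate past the transient, then list the values the orbit settles on."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {attractor(r)}")
# r=2.8 -> one value (steady drip); r=3.2 -> two (period doubled);
# r=3.5 -> four; r=3.9 -> a jumble (chaos).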

 

 



QUOTE (RushRevisited @ Apr 27 2005, 02:27 PM)
Sure!

In a dripping faucet at low pressure, drops come off the faucet with equal timing between them. As the pressure is increased the drops begin to fall with two drops falling close together, then a longer wait, then two drops falling close together again. In this case, a simple periodic process has given way to a periodic process with twice the period, a process described as "period doubling". If the flow rate of water through the faucet is increased further, often an irregular dripping is found and the behavior can become chaotic.

http://www.exploratorium.edu/complexity/CompLexicon/LogisticBifn.gif



Yes! But it's not just dripping faucets...


QUOTE (My_Shrimp_Cot @ Apr 27 2005, 02:07 PM)
My point is that we are probably in the early-to-mid stages of computer development, so the growth in the speed of these computers looks exponential. But nature will eventually impose limits, and the growth in the speed of these machines will flatten.

Kurzweil agrees with you on individual technologies like integrated circuits. He argues, however, that as growth in one technology flattens out, a new technology is generally found that steps over the natural limit imposed by the previous one; Kurzweil calls this change in technology a paradigm shift. For example, computing with vacuum tube electronics was nearing its technological limit in the late 50s when transistor technology was co-opted from the portable radio industry, doing away with the limits vacuum tubes were placing on computer performance. In turn, transistors gave way to integrated circuits. Today we still have a decade of exponential growth left in IC technology (Kurzweil's estimate), but people are already working on the next paradigm shift in computing, e.g., quantum computing, molecular computing, etc.
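One way to picture that argument is a stack of overlapping S-curves: each paradigm follows its own logistic curve, but the next one takes off as the last saturates, so the envelope keeps climbing. A toy sketch, with every parameter invented purely for illustration:

CODE
# Stacked S-curves: each paradigm saturates, but the next starts higher,
# so total capability keeps growing. All numbers here are invented.
import math

def s_curve(t, midpoint, ceiling, steepness=1.0):
    """Logistic S-curve: near 0 before `midpoint`, near `ceiling` after."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

# (midpoint in years, ceiling) for three hypothetical successive paradigms
paradigms = [(10, 1.0), (25, 100.0), (40, 10_000.0)]
for t in range(0, 55, 5):
    total = sum(s_curve(t, mid, cap) for mid, cap in paradigms)
    print(f"t = {t:2d}: capability ~ {total:9.2f}")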


QUOTE (Hypatia @ Apr 27 2005, 02:30 PM)
QUOTE (My_Shrimp_Cot @ Apr 27 2005, 02:07 PM)
My point is that we are probably in the early-to-mid stages of computer development, so the growth in the speed of these computers looks exponential. But nature will eventually impose limits, and the growth in the speed of these machines will flatten.

Kurzweil agrees with you on individual technologies like integrated circuits. He argues, however, that as growth in one technology flattens out, a new technology is generally found that steps over the natural limit imposed by the previous one; Kurzweil calls this change in technology a paradigm shift. For example, computing with vacuum tube electronics was nearing its technological limit in the late 50s when transistor technology was co-opted from the portable radio industry, doing away with the limits vacuum tubes were placing on computer performance. In turn, transistors gave way to integrated circuits. Today we still have a decade of exponential growth left in IC technology (Kurzweil's estimate), but people are already working on the next paradigm shift in computing, e.g., quantum computing, molecular computing, etc.

Sweet.

 

I hope he's right. I need some nanomachines to do my laundry.

 

 

 

 


QUOTE (Hypatia @ Apr 27 2005, 02:30 PM)
QUOTE (My_Shrimp_Cot @ Apr 27 2005, 02:07 PM)
My point is that we are probably in the early-to-mid stages of computer development, so the growth in the speed of these computers looks exponential. But nature will eventually impose limits, and the growth in the speed of these machines will flatten.

Kurzweil agrees with you on individual technologies like integrated circuits. He argues, however, that as growth in one technology flattens out, a new technology is generally found that steps over the natural limit imposed by the previous one; Kurzweil calls this change in technology a paradigm shift. For example, computing with vacuum tube electronics was nearing its technological limit in the late 50s when transistor technology was co-opted from the portable radio industry, doing away with the limits vacuum tubes were placing on computer performance. In turn, transistors gave way to integrated circuits. Today we still have a decade of exponential growth left in IC technology (Kurzweil's estimate), but people are already working on the next paradigm shift in computing, e.g., quantum computing, molecular computing, etc.

Quantum computing looks to be the most promising next great leap.

 

Man, I wish I could remember who said that by 2100 humans and artificial intelligence would be indistinguishable from one another. Was it William Gibson?

 

No matter; that may be the prophecy after all. It does open up an entirely new realm of concerns, e.g., granting citizenship, and all that it entails, to AI machines.


If a computer/machine/android can fool us into believing it is a real person, of course it still isn't sentient. Even if it did have electronic nerves, a memory, and everything else that comprises "the soul," we'd maintain that it isn't sentient, because it is, after all, a machine, a program. Yes, plenty of thinkers have tackled this question, e.g., P.K. Dick, but I propose that they're looking at it the wrong way. It isn't the robots we need to philosophize about, but ourselves.

Once it is no longer possible to distinguish between artificial intelligence and our own, it doesn't mean the robot is sentient; it means we are not. It is necessarily a choice between two possibilities: that we're both free, intelligent, sentient beings, or that we're both preprogrammed, delusional something-elses.

