Kovnik Obama wrote:
Orlanth wrote:
1. Human equivalent AI will be more difficult to create than most like to hope.
The human brain is the most complex device in the known universe and it works on a level multiple orders of magnitude more advanced than any computer.
Wrong. The processing speed of any computing device, brains included, is scalable on the Sentience Quotient scale, from -70 (a brain the size of the universe with only one processor, having processed only one bit of data since the beginning of time) to +50 (the quantum limit). A human brain clocks in at +13. Most animals clock in between +8 and +12. IBM Watson, the natural-language supercomputer (it won Jeopardy! in 2011), clocks in at +12. There is no significant difference in processing power between our current level of technology and what our brain achieves. What is left for us to figure out is the incredible diversity of neural modules.
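For reference, the Sentience Quotient mentioned above is usually attributed to Robert Freitas and defined as SQ = log10(I/M), where I is an information-processing rate in bits per second and M is the processor's mass in kilograms. A minimal Python sketch, using rough illustrative figures (the actual brain rate is a contested assumption, not a measurement):

```python
import math

def sentience_quotient(bits_per_second, mass_kg):
    """Freitas's Sentience Quotient: SQ = log10(I / M), with I the
    information-processing rate in bits/s and M the processor mass in kg."""
    return math.log10(bits_per_second / mass_kg)

# Illustrative figures only: a ~1.4 kg human brain handling roughly
# 1e13 bits/s lands near the oft-quoted SQ of +13.
print(round(sentience_quotient(1e13, 1.4)))  # -> 13
```

Note that because the scale is logarithmic, one point of SQ is a tenfold difference in bits-per-second-per-kilogram, which is part of why +12 and +13 sound closer than they are.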
[Yoda] Judge me by my size, do you, hmm? [/Yoda]
Sorry, but I have to disagree with you here. The Sentience Quotient is a cop-out when applied to computers: it measures mechanical brute-force calculation, not thought. The test only measures raw processing power in relation to mass; it makes no distinction as to quality of data, number of connections, type of data, or the 'software' used.
To illustrate: if we took only the two factors of performance and mass into account, then a crude mechanical digger arm on a building site, which can apply more strength in relation to its mass than a human arm can, would be 'superior'. In those limited terms it would be more advanced than a human arm, but only in those terms; it would be a very skewed statistic to base a claim of actual advancement on.
Let us return to the animal brain vs the supercomputer. The scale can estimate (and even then only very roughly) the rate at which animal neurons process pieces of information. Fair enough, but while for a current-technology computer a bit means a bit, as in binary digit, for a neuron a 'bit' means a piece of information, and it is most certainly not binary data. We know this because neurons carry a two-way flow of data, so at an absolute minimum each piece of information is ternary: on, off, and 'reverse on'. And that is just the absolute minimum. There is strong reason to suspect the signal is fully analogue, and that not only the presence of a signal but also its intensity carries information. Thus a signal is not 0 for off and 1 for on; it is 0 for off and 1 to x, depending on how finely neurons can send and interpret signal intensity and with what precision it can be received. As the electric charge in neurons is so small, it is likely to be exceptionally delicate and thus might be discernible to a very fine degree. We have no idea just how finely this scales, such is our ignorance on the subject, but let's try to work out some plausible limits.
Let's look at how many colours a human eye can perceive. Since the brain processes this data, it is a fairly good yardstick for our potential limits of discernment; individual neurons may or may not exceed it. Here are some possible figures:
http://hypertextbook.com/facts/2006/JenniferLeong.shtml
Let us assume two scenarios. First, that a human neuron can process 2.3 million colours (a median of the figures involved) and that neurons therefore receive data on a base of 4,600,000: 2.3 million colours, doubled to 4.6 million to account for data travelling the other way. Second, that the data is a pure ternary sweep and humans process data in base 3. Our neurons probably work somewhere between the two, but likely much closer to 4.6 million than to 3.
If we process in ternary rather than binary, the discrepancy compounds with every 'bit' of information sent: 2 vs 3 states for one symbol, 4 vs 9 for two, 8 vs 27 for three, and so on. When you get a larger stream of data, like thousands of symbols per second from your likely source, this discrepancy gets very big very quickly. If the human brain processes the lower end of the range, 1,000 symbols of information per second, the ternary stream distinguishes 3^1000 states against binary's 2^1000, an astronomically large gap.
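The gap described above is easy to make concrete. In one second, 1,000 two-level symbols distinguish 2^1000 states while 1,000 three-level symbols distinguish 3^1000; equivalently, each ternary symbol carries log2(3) bits instead of 1. A quick illustrative sketch:

```python
import math

n = 1000  # symbols per second, the lower-end figure used above

# Number of distinguishable states in one second of signalling:
binary_states = 2 ** n
ternary_states = 3 ** n

# Printing the full numbers is impractical, so compare their lengths:
print(len(str(binary_states)))   # 302 digits
print(len(str(ternary_states)))  # 478 digits

# Per-symbol information content: a ternary symbol carries log2(3) bits.
print(math.log2(3))  # ~1.585 bits, vs exactly 1.0 for a binary symbol
```

So the ratio of state counts explodes with stream length, even though per symbol the advantage is a modest factor of about 1.585 bits.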
If we take the reasonably high-end base of 4.6 million and replace the 3 with 4,600,000 in all the above calculations, we get BIG NUMBERS.
Now, to make matters worse, this only accounts for one neural connection. Our computers process data in a linear fashion: electricity passes through flip-flops one way to create one bit of data, and in spite of all my comments here I am as impressed as you by how small and compact we can make this process with today's technology. But not only are our biological processors working with data streams on a base of between 3 and, say, 4,600,000 or higher, the number of connecting streams is not limited to one input and one output flow: neurons can have thousands of connections each.
So we have to multiply our processing power by, potentially, thousands; let's just add three more zeroes to the end of the extraordinary BIG NUMBER we are already generating. This of course assumes that a neuron processes data from one connection at a time. If several of those connections combine to provide a single quantifiable piece of data, we could be looking at data on a base of 3 (or 4.6 million, you choose) to the power of the number of connecting neurons firing at once, and millions to the power of thousands is a very big number. It all depends on how many connections are working simultaneously to provide the dataflow.
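To put a (very speculative) number on that combination: if a neuron integrated k independent connections, each resolvable to b levels, a single combined event could at most distinguish b^k states, which is k x log2(b) bits. A sketch using the post's own illustrative assumptions, not measured values:

```python
import math

def bits_per_event(levels, connections):
    """Information upper bound (in bits) for one combined firing event,
    assuming `connections` independent inputs, each resolving `levels`
    distinguishable states. Purely a back-of-envelope bound."""
    return connections * math.log2(levels)

# Both figures are speculative assumptions from the argument above:
# 4.6 million distinguishable signal levels, 1,000 connections per neuron.
print(round(bits_per_event(4_600_000, 1000)))  # -> 22133 bits per event
```

Even under these generous assumptions, note that exponents of states translate into *multiples* of bits: millions of levels across thousands of connections gives tens of thousands of bits per event, which is large but not "millions to the power of thousands" in information terms.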
Then on top of all that we have to look at the relative complexity of the software being used, which so far has not been touched and is currently completely beyond our means to assess.
However we look at it, it is nonsensical to treat a 'bit' of human or animal neural information as in any way comparable in power to a binary digit. To do so would be as dismissive of the realities of scale as claiming that, since stepping stones are known to work, one could throw a single pebble into the Atlantic and then leap across from Brazil to Africa.
If the BIG NUMBERS ever get too mind-boggling, and they certainly do to me, then consider this. Think about what you can get from current computer hard disks with gigabytes of storage in binary format. Think of a game you play and the number of gigabytes it takes to store it, and compare the depth of that information with your memories of your mother or someone else you know well. How many gigabytes of data would it take to adequately capture all your memories of her, compared to the storage for a game like GTA or Skyrim? I am not asking for an answer, just an awareness that our human memories would need hard-disk space on the order of BIG NUMBERS of gigabytes to cover all we know and experience in a human lifetime. With this in mind, human neurons working on a base of millions rather than binary makes a lot more sense.
Bottom line: kudos is rightly earned by those who advance our computer technology, and as a flat, crude measuring device the Sentience Quotient has some value. But it was made primarily to compare relative intelligence on a non-linear scale between biological entities; to use it in reference to computing power is little more than spin, a bit of statistical data that can be taken to mean a lot more than it honestly says. Anyone who thinks our computer technology is anywhere near any brain, even those of lower-order animals that process fewer pieces of data per second in relation to mass, is sadly way off the mark. I am safe in claiming that any computer we have now, or any feasible design we could build, will come up woefully short compared to our most primitive reptilian ancestors, unless we flatly equate a single binary flip-flop processing linear data with a single animal neuron, with its analogue electrical signal and multiple connections.
If this still seems hard to believe, take a look at one of the applications we use supercomputers for: weather prediction. As weather patterns are heavily tied up with chaos mathematics, predicting weather beyond the near-immediate is well-nigh impossible. With satellite feeds and modern supercomputers we can make a fairly good 72-hour forecast, though we still feth it up a lot, and we are usually accurate for 24-hour forecasts. Still, any weather prediction beyond three to four days is pure conjecture.
Meanwhile geese, without the benefit of any data beyond what they can perceive in their immediate environment, let alone satellite input, can judge the turn of the seasons and optimise, based on weather patterns, when to fly north or south. We humans, who lack these instincts, often use the passage of the geese to measure seasonal changes. Geese and other migratory birds are good at what they do, accounting for seasonal changes too subtle to be picked up by our weather satellites or local meteorologists. Their small brains pack some quality 'software' and processing 'technology', even though they are unimpressive in terms of size and raw power.
Automatically Appended Next Post:
dogma wrote:
orlanth wrote:
Carl Lewis kept up; Ben Johnson ran a poor race in '92, and this was heavily commented on as 'climbdown'-related. As stated, this may or may not have been the case, but it was certainly how the media liked to see it.
The ethics of drug use is a societal factor primarily, not a scientific one, and laws and customs are based on those ethics.
Carl Lewis also doped, and probably did so for his entire career.
I didn't know that was the case. I'm not surprised, though; sport is big business and carries a lot of nationalistic prestige. Johnson got caught, Lewis didn't; both would have had access to professionals who knew how long the drugs take to drop below detectable, actionable quantities. Ben Johnson's relevance to this discussion was the media's attitude to his being stripped of a medal in '88 and the consequent press reaction to his poor performance in '92.
I still think you highlighted a very interesting point: steroid use in sport is a good case study for examining society's attitudes to medical or chemical enhancement of human ability.
I have since been refining this to work out which criteria are most relevant, and it comes back to the same factors of our social conditioning. I don't think transhumanism will be any different, if there is power to be had or profit to be made.....