Owner: Ati
Members: 6




 
Odds of a simulated universe - 26 January, 2007
Ati says
Here is an argument I read regarding the odds of living in a simulated universe.

It goes as follows:
If a civilization progresses beyond a certain level and can spare the resources, it would begin to create simulated universes stocked with simulated people (for social experiments, sheer megalomania, etc.). There would likely be a large number of these universes, and it seems to follow that the odds of you living in a real universe, as opposed to a simulated one, are pretty low.

Has anyone heard this argument before? What is your reaction to it?
Total Topic Karma: 15
Nadeem says
+3 Karma
Check this out.
- 26 January, 2007
Ati says
+0 Karma
Thanks for the link. I saw the title and simply crossed my fingers in the hope that I hadn't botched my re-telling too greatly. It is an interesting argument, but when I look over it the first thing that comes to mind is 'so what?'.

It seems to me that if a simulated reality such as the one postulated were good enough to fool (almost) every person in the world, it would also be good enough for it not to really matter whether or not it was real.
- 26 January, 2007
(Guest) Guest says
+0 Karma
I guess anything is possible. I often think I'm the only one alive; that everything else is simulated, even people, to teach me how to become another being... then I was told I'm mentally deficient.
- 26 January, 2007
Ati says
+0 Karma
Well, the point of the argument is not that you are the only one alive, but that you and everyone else in the world are in fact part of the simulation. The version you mention is commonly referred to as the 'brain in a vat' theory. All of these are at least partially solipsist arguments.
- 26 January, 2007
eviljawdy says
+0 Karma
WoW? Ragnarok? Or any other MMO for that matter! Sure, they're not quite simulated universes in the grandest sense, but they are alternate civilisations, and they have claimed the lives of many people.
- 05 February, 2007
Ati says
+0 Karma
Well, the difference between this theory and current MMOs is that in the theory we are discussing, all of the people are simulated. To put it another way, in the 'super-MMO' we are discussing, there are no players: just sentient NPCs.
- 05 February, 2007
Nadeem says
+1 Karma
How do you know I'm an NPC? I might be an actual player from the reality that's simulating this one.
- 05 February, 2007
Ati says
+0 Karma
I don't. But if there is only one 'real' person playing, then the odds are about 1 in 6,000,000,000 that it's you. And that's assuming that whoever it is is definitely playing as a human - if they're not, the odds go down by several orders of magnitude.
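Spelled out, that back-of-the-envelope calculation is just uniform probability over the population; a minimal sketch (the species count is a purely hypothetical illustration of the 'orders of magnitude' remark, not a figure from the thread):

```python
# Chance that any particular person is the single 'real' player,
# assuming the player is equally likely to be any of the six billion
# humans mentioned in the thread.
population = 6_000_000_000
p_if_human = 1 / population
print(f"{p_if_human:.2e}")   # ~1.67e-10

# If the player needn't be human -- say any of `n_species` equally
# populous species (a purely hypothetical number) -- the odds shrink
# by further orders of magnitude:
n_species = 1_000
p_if_any_species = 1 / (population * n_species)
```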
- 05 February, 2007
Nadeem says
+0 Karma
Yeah - the players are more likely to be little white mice anyway.
- 05 February, 2007
eviljawdy says
+0 Karma
Eek, not trans-dimensional pan-galactic hyper-intelligent mice?

And for this kind of idea to exist, someone's AI would need to win out 100% on the Turing test... which would take some doing, as we're all intricate beings. Damn our individuality... or not!
- 05 February, 2007
Nadeem says
+0 Karma
I sometimes worry about the validity of the Turing test. I know people who probably wouldn't be able to pass it.
- 05 February, 2007
Ati says
+0 Karma
As do I. I have taken to referring to such people as 'homunculi'. There seems to be an alarming number of them in politics.

Although the Turing test has its good points - it's really the only practical test for sentience we have today.
- 05 February, 2007
p0ss says
+0 Karma
Ati, I don't believe that is true.
Have you read:
http://en.wikipedia.org/wiki/Sentience_Quotient
(disclaimer: I am silentfire on Wikipedia and contributed to that page)

Sentience is a sliding scale, defined by the processing power of neurons.

A sufficiently extensive autopsy should be able to ascertain the processing power of a lifeform's brain (or equivalent).

The amount of data the organism is capable of processing could be used to define its relative sentience without resorting to the Turing test, which is incredibly human-centric, and indeed extremely language/culture-centric.

The odds are great that we are living in a simulation, but what difference does it make? How real is real?
- 06 February, 2007
Nadeem says
+1 Karma
You know, if the subject being tested really is sentient, it might not appreciate being dissected.
- 06 February, 2007
p0ss says
+0 Karma
Lol, I expect that if it could express its lack of appreciation, we might refrain from dissecting it.
- 06 February, 2007
eviljawdy says
+0 Karma
The last Turing test I saw (many eons ago) was on Tomorrow's World (before it was cancelled, and before its recent resurrection) on BBC1, where they had 3 Turing "challengers" and Craig Charles (Lister from Red Dwarf). They had 200+ participants "talk" to the "people", and after the conversations the participants would choose who they thought was the real person.
I think it was about 90% on Charles, with the remaining 10% split over the 3 AIs.

And p0ss, bit of a dumb question, but how does one calculate the bit/s of a neuron, "I"?
- 06 February, 2007
p0ss says
+1 Karma
Data in the brain is transferred as a combination of the speed and pattern of action potentials. We could take each action potential as a bit, and since the conduction velocity of the action potential is 0.6-120 m/s (we'll say 60 m/s), and the average size of a neuron is 50 microns (4 microns for a granule cell to 100 microns for a motor neuron in the cord), we end up with (or I did) the capacity to transfer somewhere in the vicinity of 12,000,000,000 bits per second, or 12 Gbit/s per neuron. And that is "I".

If we multiply that by the number of neurons in the brain (approx 1 billion) we get 12,000,000,000,000,000,000 bits per second, or 12 exabits per second. And that's without counting all the glia.

Mind you, that is an upper theoretical limit, not anything practical - but if every neuron in the brain fired off at once, and then fired off again the instant the initial action potential reached the far side of the neuron, that is the speed you would be looking at.
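As a sanity check, the division behind that estimate (conduction velocity over neuron length, one action potential per end-to-end traversal) can be redone directly. Note that with the stated averages of 60 m/s and 50 microns it works out to roughly 1.2 million traversals per second per neuron rather than 12 billion, so the exact magnitudes here are best treated loosely:

```python
# Upper-bound firing-event rate per neuron, taken as
# (conduction velocity) / (neuron length): one action potential per
# end-to-end traversal. Figures are the post's stated averages, not
# established values.
velocity_m_per_s = 60      # average conduction velocity
length_m = 50e-6           # average neuron size, 50 microns
rate_per_neuron = velocity_m_per_s / length_m   # ~1.2 million per second

# Scaling by the post's neuron count gives a whole-brain figure:
neurons = 1_000_000_000
brain_rate = rate_per_neuron * neurons          # events per second
```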



- 08 February, 2007
eviljawdy says
+0 Karma
o_0 I would like to say "I see" and "hmm, it all makes sense now", but... that would be a lie!
Complicated stuff, this neurology (is that the term?)! Still, it's interesting to know, and I like how this theory puts the communication methods of the human brain into standard "geek" units, like bits.
- 08 February, 2007
Ati says
+1 Karma
Well, there are certain things that must be remembered. The brain is a non-linear, non-binary computation unit, so attempting to express it in terms of a digital computer is always kind of iffy.
- 08 February, 2007
Nadeem says
+1 Karma
Um, actually, the brain really is essentially binary. Neurons only have an on and off state, that's it.

There is one difference, though - brain signals aren't discrete with respect to time, so neurons can fire at varying intervals. That aspect of brain function isn't digital - it's analog.
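That binary-amplitude, analog-timing distinction is exactly what the standard leaky integrate-and-fire textbook model exhibits (a toy illustration, not anything from the thread): every spike is identical, but when the spikes happen varies continuously with the input:

```python
# Leaky integrate-and-fire neuron: spikes are all-or-nothing events,
# but their timing is continuous -- binary amplitude, analog timing.
def lif_spike_times(currents, dt=0.001, tau=0.02, threshold=1.0):
    v, t, spikes = 0.0, 0.0, []
    for i in currents:
        v += dt * (-v / tau + i)   # leaky integration of input current
        if v >= threshold:         # all-or-nothing: fire, then reset
            spikes.append(t)
            v = 0.0
        t += dt
    return spikes

weak = lif_spike_times([60.0] * 1000)
strong = lif_spike_times([120.0] * 1000)
# Stronger input doesn't change the spike 'amplitude' -- only how
# often and when the spikes occur.
```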
- 08 February, 2007
eviljawdy says
+0 Karma
So in order to gather the information from a biological neural network, you'd have to be constantly gathering data?

I've done Artificial Neural Networks - my final-year thesis was on a Hopfield Network to find hand-geometry similarities - so simulating neural nets is easy enough (well, it's NOT easy, but it's understandable at least).
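For the curious, the Hopfield idea is small enough to sketch in a few lines: store a pattern as a Hebbian weight matrix, then recover it from a corrupted cue. This toy sketch has nothing to do with the actual hand-geometry thesis:

```python
# Minimal Hopfield network (pure Python): store one +/-1 pattern with
# a Hebbian weight matrix, then recall it from a corrupted cue.
def train(pattern):
    n = len(pattern)
    # Hebbian outer product with zero diagonal (no self-connections)
    return [[0 if i == j else pattern[i] * pattern[j] for j in range(n)]
            for i in range(n)]

def recall(weights, state, steps=5):
    s = list(state)
    for _ in range(steps):
        # synchronous sign update of every unit
        s = [1 if sum(w * x for w, x in zip(row, s)) >= 0 else -1
             for row in weights]
    return s

stored = [1, -1, 1, -1, 1, -1, 1, -1]
W = train(stored)
noisy = list(stored)
noisy[0] = -noisy[0]       # flip one unit to corrupt the cue
```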
- 08 February, 2007
Nadeem says
+0 Karma
Well, there's this quote I once read which goes: "The brain isn't a computer, the brain is the brain."

That said, I don't think I know enough about brain function to know how significantly it differs from neural networks.

Shamefully enough, I haven't really looked into neural networks much - but that's something I should be remedying in the coming weeks, for this project I'm doing for my Machine Learning course. The idea we've got so far is to use a convolutional neural network to learn opening strategies for Go, using the temporal difference algorithm.
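The temporal-difference part of that plan reduces to one update rule, V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)); a tabular toy version of it looks like this (the convolutional-network Go project itself would be far beyond this sketch):

```python
# Tabular TD(0) value update -- the core of the temporal-difference
# algorithm mentioned above, on a toy 3-state chain: s0 -> s1 -> s2,
# where s2 is terminal and entering it yields reward 1.
def td0_update(V, state, reward, next_state, alpha=0.1, gamma=0.9):
    td_error = reward + gamma * V[next_state] - V[state]
    V[state] += alpha * td_error
    return V

V = {0: 0.0, 1: 0.0, 2: 0.0}
for _ in range(100):
    V = td0_update(V, 1, 1.0, 2)   # s1 -> terminal, reward 1
    V = td0_update(V, 0, 0.0, 1)   # s0 -> s1, no reward
print(round(V[1], 3))   # -> 1.0 (value of s1 converges toward 1)
```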
- 08 February, 2007
p0ss says
+0 Karma
The biggest problem in replicating the brain is that the brain is not a purely electrical system - it is an electrochemical system, and the chemical aspect of the brain is very hard to replicate in hardware. In fact, chemicals are largely responsible for our emotions and instincts, and may well be the key to developing a truly human AI.
- 08 February, 2007
Ati says
+0 Karma
"the brain really is essentially binary. Neurons only have an on and off state, that's it."

Really? I was under the impression that each neuron was capable of performing several computations simultaneously.


- 08 February, 2007
p0ss says
+0 Karma
Ati, well, yes and no.

The action potentials can vary in frequency and intensity, allowing for a range of responses, not just 1 and 0.

For the purposes of calculating the processing speed of a neuron, I (admittedly somewhat arbitrarily) equated one action potential to one bit, as they are both, in essence, the act of turning on and then off. That's not exactly accurate, as an action potential can carry a greater variety of information than a bit.

The correct expression would be APs/s, or action potentials per second, not bit/s. But I maintain that it was a fair judgement of equivalency, as the action potential is still the base unit of information transferred by a neuron. If a neuron were processing bits and not action potentials, 12 Gbit/s is how fast it could process them.
- 08 February, 2007
p0ss says
+0 Karma
Ati, as for your original question.

If there are civilisations capable of creating virtual worlds, they are most likely to make them similar to their own universe, and if they are significantly advanced they seem likely to establish AIs within that world. It seems likely that those AIs, made in their makers' image, would create their own virtual realities, complete with AIs of their own.

This creates the possibility of an infinite hall of mirrors - worlds within worlds. In such a situation the chances of existing within the original universe drop to nearly zero. Indeed, it is worth questioning whether there is an original universe at all - or did the last virtual reality create the original reality?
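Under that hall-of-mirrors picture, with the added assumption that every level hosts roughly the same number of observers (an assumption the thread doesn't state), the chance of being in the original world is one over the total number of levels:

```python
# Probability of inhabiting the original universe, given `depth`
# nested simulations below it and equal observer counts at each level
# (the equal-population assumption is ours, for illustration only).
def p_original(depth):
    return 1 / (depth + 1)

for depth in (1, 10, 1_000_000):
    print(depth, p_original(depth))   # drops toward zero as depth grows
```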

- 08 February, 2007
Ati says
+0 Karma
I see, thank you for clarifying.
- 08 February, 2007
Ati says
+0 Karma
Ah, just saw our second post.

Well, I kind of disagree with that. The 'real' universe (assuming it functions on laws similar to ours) has a limited amount of computing power to work with (even with the exponential increase associated with quantum computers there are still limits), and in such an infinite mirror-hall of nested realities, you would run out of memory and processor space quickly. As such, it seems that the creators of the original universe would probably put intentional laws or blocks in place to prevent simulated civilizations from becoming posthuman.
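The resource limit can be made concrete with a toy model: if each universe can spend only a fraction f < 1 of its own compute on the simulation it runs, the levels shrink geometrically, and deep levels quickly become too impoverished to host posthuman-scale civilizations (all figures below are made up):

```python
# Compute available at each nesting level, if every universe devotes a
# fraction f of its own compute to the child world it simulates.
# All numbers are hypothetical.
base = 1e40                 # compute of the 'real' universe, ops/s
f = 0.1                     # fraction passed down to each child world

def compute_at(level):
    return base * f ** level

levels = [compute_at(k) for k in range(10)]
total = sum(levels)
# Geometric series: the total over all levels is bounded by
# base / (1 - f), but level 10 already has ten billion times less
# compute than the original -- nested worlds degrade fast.
```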

But I could be wrong.

- 09 February, 2007
Nadeem says
+1 Karma
As such, it seems that the creators of the original universe would probably put intentional laws or blocks in place to prevent simulated civilizations from becoming posthuman.

This short story actually addresses this, Ati. Check it out. Might not have that much to do with it, but it's fun to read.
- 09 February, 2007
Ati says
+0 Karma
That was a great story, although I saw the ending coming around page 2.

It kind of reminds me of Asimov's 'The Last Question'.
- 09 February, 2007
Nadeem says
+0 Karma
Yeah, it's a real classic. But I especially like their idea of merging with AI. Fits in neatly with what I want, anyway.
- 09 February, 2007
Ati says
+0 Karma
Yes. The one bone I have to pick with most post-singularity stories is that they keep a certain level of conservatism: all of the characters are usually only one person at one time.

It seems likely to me that posthumans will 'serialize' by having hundreds (perhaps thousands) of different copies of themselves running simultaneously, with a communal memory (who wouldn't want to be in several places at once?).

Perhaps this is an unpopular idea, or perhaps it's simply easier to follow the storyline if it's told from the perspective of a single person, but it has always been something that bugged me.
- 09 February, 2007
Nadeem says
+2 Karma
Have you read the Golden Age trilogy? They actually have collectives that run that way, but people tend not to like it much.

Interestingly, all the AIs on the planet support a higher-level consciousness called the Earthmind, whom they cannot comprehend even though they generate it. And once every 1000 years, all the intelligence in the solar system (Earthmind, Lunamind, and all the others, including the posthumans themselves) joins together to create one mind so fucking powerful it defies description. This event is called the Golden Transcendence, and usually sets the tone for what humanity will do for the next millennium.

The description in book 3 is breathtaking - enough to give you an intellectual orgasm or something.

What I find really amusing, though, is that this mega-mind only thinks about the course of humanity's next millennium for a tiny interval of its brief existence. The rest of it is all devoted to stuff that lower minds can't comprehend.
- 09 February, 2007
Ati says
+0 Karma
Interesting; I'll have to check that out.

*Chalks up another reserve at the library, beside 'The Diamond Age'*
- 09 February, 2007
Ati says
+0 Karma
The best posthuman novel I've read so far is 'Accelerando'.

You may have already read it, but if you haven't, I highly recommend it.
- 10 February, 2007
Nadeem says
+1 Karma
Yeah, I have. I recommended it to someone else on some thread around here.
- 10 February, 2007
Ati says
+0 Karma
The one thing I find interesting is that most posthuman novels and stories that I've read aren't really about posthumans. The characters may be moderately augmented, but they are still essentially ortho-human. What are your thoughts on this? Would you read a book in which the protagonist was not recognizably human in his thought processes?
- 10 February, 2007
Nadeem says
+1 Karma
I'd certainly try, but I'm a bit of an ornery critter. I'm not sure whether it would be good literature, though. How does a writer do justice to a non-human mode of thought?
- 10 February, 2007
Ati says
+0 Karma
That's a valid point. It definitely wouldn't be an easy book to write. I suppose it's a philosophical question, to an extent: can a human imagine thinking in ways fundamentally different from the way he does...
- 10 February, 2007
Ati says
+0 Karma
I got The Golden Age.

I'm currently around page 126 of the first book, and it's proving an interesting read.

I finally got to the point where I understood what was going on around page 45.

It usually doesn't take me that long, which I take to mean that this is going to be a worthwhile read.
- 16 February, 2007
Nadeem says
+1 Karma
Have fun! One of my favourite scenes turns up somewhere in the first book (I think) - when Phaethon is on trial. Involves an amazingly scathing retort.
- 16 February, 2007
Ati says
+0 Karma
I'm mid-way through the trial now.

Oh, now I'm all impatient...
- 16 February, 2007
Nadeem says
+1 Karma
I don't remember exactly where it turns up, though. And I won't say what the retort is...spoilers aren't fun when you're actually reading the book.

It's this retort which is almost like something out of Ayn Rand. If that doesn't give it away when it turns up, I'll tell you when you're done.
- 16 February, 2007
Ati says
+0 Karma
I'll look for it.
- 17 February, 2007
Viczy says
+0 Karma
@ P0ss

Unless I'm reading that wrong (and I'm good at reading things wrong), I think 1 EHz sounds a bit high. Is it possible 60 m/s is an overestimate?
- 20 February, 2007
p0ss says
+0 Karma
@ Viczy

60 m/s is an average between vastly varying neuron types - some will be much faster and some will be much slower.
If we were to take the line that a system is only as strong as its weakest link, then yes, the minimum speed is 0.6 m/s, 100 times slower than the average. The reason I didn't take that line is that the brain is a distributed system. To use the metaphor of the internet: major sites have strong connections and can carry the traffic of many minor connections, so the overall speed of the internet is not limited by the single guy in Uganda with a 9600 baud modem.
I accept that my numbers are not definitive - as surprising as it may be, I am not a neurosurgeon.
- 20 February, 2007
Viczy says
+0 Karma
Fair 'nuff. Cool.
- 20 February, 2007
Ati says
+0 Karma
Nadeem,
I have reached the end of The Golden Age, and I haven't noticed any really spectacular comebacks. Could you point out which one you were referring to?


Thanks,
Ati.
- 20 February, 2007
Nadeem says
+0 Karma
Er, I hope it was in that book. I remember him doing this thing at a trial, where some guy wants him to commit suicide for the 'public good', or some such thing. And he goes "Your good be damned, sir, if it requires the destruction of men like me!"

The sheer audacity and arrogance of that statement really appeals to me on some level. It's like he's taking the guy and rubbing his nose in the dirt, without even laying a finger on him.
- 21 February, 2007
Ati says
+0 Karma
Ah yes. I remember that one.


It was Orpheus, wasn't it?
- 21 February, 2007
Nadeem says
+0 Karma
Yeah, that was the guy.

Man, I need to get hold of that book again. Been hung up reading Aristoi for the second time.
- 21 February, 2007
Ati says
+0 Karma
Hmmmmm... that looks interesting.


Darn you! I'm never going to get to the rest of my reading list if this keeps up!




Oh, incidentally, I think you misunderstood me a few posts up.

When I was talking about serializing, I think you took me to mean Borganisms (many minds unified through a shared identity). I was actually talking about the inverse: one identity shared through many minds.

Basically, the idea is that you create a number of partials, exact duplicates, and modified versions of your original consciousness template, and run them simultaneously with the original you, with all of them sharing a communal memory so they all act as a single unit. The whole shebang is governed by your Primary, which maintains the original personality template, but with hundreds of times more processing power. It also maintains a central identity, and reabsorbs versions that start to differentiate too far from the original.
- 23 February, 2007