Artificial Life: God and the Game
The Sims 2 is on its way. The creators have added features that they think will make the game more fun, such as enabling the Sims to remember events and to share those memories with one another: if a daughter sees her father kissing someone who isn’t her mother, she may choose to share that memory with Mom. Why was memory added to The Sims? For moral reasons? To make the simulation more accurate? Nah, just because memory-sharing makes the Sims more fun.
So, here we have an artificial community designed purely as entertainment. How close does that get us to God’s view? And what would be the fewest rules required to create an artificial world that begins with just one man and one woman and evolves into something like the world we have today? (This is Wolfram’s quest transposed into the social world of history.)
There was only one rule, as I’ve heard it:
Don’t eat the apple!
If you believe in a loving God (I do, if that matters), then the point of making Sims with freedom and volition is to enjoy those moments when they happen to love you back. And yet, for that freedom to be operative, it has to have a morally significant stage on which to pose its dilemmas. You have to let them encounter tragedy as well as joy; otherwise they’d have no choice but to love you, and that wouldn’t be freedom. Nor would it confirm you as a loving God.
Just giving Sims memory is a step toward this, but it’s not the critical step, I think. When we let the AI ignore us, we’ll be at a significant point. Not hate us, necessarily; anger happens because someone cares about something. No, when the Sims are allowed to become indifferent to our existence, then we will be in a real God game.
As for the fewest rules to allow evolution into something like what we have today? (NOOOOO, Sims! Turn back! Don’t do it! lol) Right now they have interaction and memory. But they need culture: language and signs. Culture is where the evolution that really matters to us has been happening.
“…what would be the fewest rules required to create an artificial world that begins with just one man and one woman and evolves into something like the world we have today?”
Let’s see: first you’ve gotta get AI; estimates vary, but let’s say 50 years. Then your hardware needs to run enough people. Assuming a doubling every 18 months, another 45 years or so (about 30 doublings) will get you up to a billion people. A few more doublings to handle simulating all the animals, the planet, etc., so let’s say 120 years.
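The doubling math above can be checked in a few lines. This is just a back-of-envelope sketch; the 18-month doubling period and the one-person starting point are the assumptions from the comment, not anything measured.

```python
# Back-of-envelope check of the hardware-doubling estimate above.
# Assumption (from the comment): capacity doubles every 18 months,
# starting from enough hardware to simulate one person.
DOUBLING_PERIOD_YEARS = 1.5

def years_to_simulate(population, start=1):
    """Years of doublings needed to grow simulation capacity
    from `start` people to at least `population` people."""
    doublings = 0
    capacity = start
    while capacity < population:
        capacity *= 2
        doublings += 1
    return doublings * DOUBLING_PERIOD_YEARS

# A billion people takes 30 doublings (2**30 ~ 1.07 billion):
print(years_to_simulate(10**9))  # prints 45.0
```

So the “another 45 years or so” figure in the comment is internally consistent with its own doubling assumption.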
Then, there’s still no guarantee that you’ll get a world like this. Weather simulations are just getting hurricanes…
Well, you have to have desires, consequences, and memory. That’s basically how we function: if we want to do something, we do it, unless we know there will be consequences. Simple.
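The commenter’s three rules can be sketched as a toy agent loop. This is purely my illustration of the idea (nothing from The Sims itself): a Sim acts on a desire unless its memory holds a bad consequence for that action.

```python
# Toy sketch of the three rules above: desires, consequences, memory.
# All names here are illustrative, not from any real game.
class Sim:
    def __init__(self, name):
        self.name = name
        self.memory = []  # remembered (action, consequence) pairs

    def act(self, desire, consequence):
        # Memory check: skip a desire remembered to have ended badly.
        if (desire, "bad") in self.memory:
            return f"{self.name} resists the urge to {desire}"
        # Otherwise act on the desire and remember what followed.
        self.memory.append((desire, consequence))
        return f"{self.name} decides to {desire}"

eve = Sim("Eve")
print(eve.act("eat the apple", "bad"))  # no memory yet, so she acts
print(eve.act("eat the apple", "bad"))  # now memory intervenes
```

The first time there is no memory, so desire wins; the second time the remembered consequence blocks it, which is the whole of the commenter’s “unless we know there will be consequences.”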
God is the initial energy of the universe (look up ‘determinism’ on Wikipedia), so this works in theory; it’s just not ‘complex’ enough yet. By complex I mean that it should be able to ‘spawn’ itself recursively.