Elementary building blocks ('atoms') are much larger than in the 'real world'. However, minds are not built of atoms; instead they are themselves building blocks. It would be difficult to build a computer from atoms, and it certainly would not be comparable to a mind. Think of atoms like Lego building blocks and minds like the MindStorm computer block.
I doubt everything is going to be simulated from basic building blocks in version 1.0, but it sets a goal for some future day.
For example, a farmer could have these tasks (/events):
There are going to be characters with a lot of detail, including relationships with other NPCs (especially anybody who lives near them or is otherwise related). They are good/neutral/evil characters.
You can choose one of those as your character and then:
This idea about choice of occupation for the starting character is by Olli.
God=Wimpy
King=Easy
:
Slave=Hard
The goal can be anything you want. A classic one would be to rid the world of evil.
There are going to be two options to patch AI deficiencies: partly pre-scripted scenarios and an active GM.
This idea by Lavoie Philippe
(Original mail)
The slant in development is likely zero/single/multiplayer/online world, in that order. This means that the 1.0 release should have a game world suitable for a single player and large enough for a small online world.
Zero player = screensaver ;-)
Or, more seriously, one where you follow what happens in the world and then tweak it. Repeat. This is the only mode currently.
What I mean is that the world should just run itself, like a 'real' virtual world does. Then some of those creatures might be controlled by minds that are actually human.
Example 1: A house receives a fire event. It catches fire, modifies its environment and begins to send fire events.
Example 2: A body receives a "remove sword" event. It sends an "I no longer contain a sword" event to the world. It might send a "touch: thief is stealing sword" event to the mind if it has good perception.
Minds receive input events from their bodies. They then decide what to do. They generate appropriate events (= output), and the body then processes those events. It then forwards the events (possibly modified) to the world. The world then decides what happens and generates the next input events.
Example 4: A body receives a "Farmer says 'I want to buy an axe'" event. It forwards it to the mind, and the mind then thinks about it. It generates "Say to smith: 'It costs this much'". The body then utters those words.
Example 5: A body receives a move-forward event from its mind. It decides that it is too exhausted to do that. It sends a "fall down" event to the world and a "dream" event to the mind.
Example 6: A mind receives a "sight: orc attacking" event. It decides to draw a sword. The body then either draws it or sends back a "not here" event.
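The mind/body/world loop above can be sketched in Python. This is only an illustrative toy, not the actual Cyphesis API: the Event, Mind, Body and World classes and the event names are assumptions made for the sketch, loosely following Example 6.

```python
# Hypothetical sketch of the mind/body/world event flow; class and
# event names are illustrative, not the real Cyphesis interfaces.

class Event:
    def __init__(self, kind, **attrs):
        self.kind = kind
        self.attrs = attrs

class Mind:
    """Receives input events from its body and decides what to do."""
    def process(self, event):
        if event.kind == "sight" and event.attrs.get("what") == "orc attacking":
            return [Event("draw", item="sword")]   # decide to draw sword
        return []                                  # no reaction

class Body:
    """Forwards events between mind and world, possibly modifying them."""
    def __init__(self, mind, items):
        self.mind = mind
        self.items = set(items)

    def from_world(self, event):
        # Input from the world goes to the mind; the mind's output
        # events are processed by the body before reaching the world.
        return [self.to_world(out) for out in self.mind.process(event)]

    def to_world(self, event):
        # The body may veto the mind's decision (Example 6: "not here").
        if event.kind == "draw" and event.attrs["item"] not in self.items:
            self.mind.process(Event("not here", item=event.attrs["item"]))
            return Event("noop")
        return event

class World:
    """Decides what actually happens and generates the next input events."""
    def step(self, body, event):
        return body.from_world(event)

world = World()
body = Body(Mind(), items={"sword"})
results = world.step(body, Event("sight", what="orc attacking"))
print([e.kind for e in results])  # the sword is present, so the draw succeeds
```

The point of the sketch is the layering: the world never talks to a mind directly, and the body sits in between in both directions, which is what lets a human player be dropped in as a replacement for the Mind object.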
Currently only events between the world and the mind are implemented.
The suggestion for events between the world and things is by BrainBoy: Original mail
In the future there will be 2D (Gnome canvas?) and 3D interfaces (Crystal Space?).
Simulation and story are IMHO important, and therefore in the beginning there is only a text interface. The goal is to have your mind immersed in the game, like when reading a book.
It is important to have your visual sense immersed too, so a good 3D interface is also a goal.
1.0? Sometime next millennium ;-)
Suggestions: Aloril