creating life through the act of writing
© 2006, Laurent MIGNONNEAU & Christa SOMMERER
University of Art and Design, Linz, Austria
We are artists who have been working since 1991 on the creation of interactive computer installations, for which we design metaphoric, emotional, natural, intuitive and multi-modal interfaces. The interactive experiences we create are situated between art, design, entertainment and edutainment. One of our key concepts is to create interactive artworks that constantly change, evolve and adapt to the users’ interaction input. For several of our interactive systems we have therefore applied Artificial Life and Complex Systems principles and used genetic programming to create open-ended systems that can evolve and develop over time through user interaction.
Text as Genetic Code
In 1997 we produced Life Spacies for the NTT-ICC InterCommunication Museum in Tokyo as part of its permanent collection. It is an interaction and communication environment where remotely located visitors on the Internet and on-site visitors to the installation at the NTT-ICC Museum in Tokyo can interact with each other through artificial creatures. These creatures are created by online participants who write email messages to the Life Spacies web page. Each text message is encoded into the genetic code for a creature; our in-house text-to-form editor translates the text into a 3D shape. When a text is written into the Life Spacies web site GUI, an email message is generated and an artificial creature starts to live in the interaction environment at the NTT-ICC Museum.
Our text-to-form editor links the characters and syntax of the written text to specific parameters in the creature’s design. The default form of a creature is a body made up of a sphere consisting of 100 vertices: 10 rings with 10 vertices each. All vertices can be modified along the x, y and z axes to stretch the sphere and create new body forms. Several bodies can also be attached to each other, and several limbs can be generated through the text as well.
According to the sequence of characters in the text, the x, y and z parameters of each of the 100 vertices can be stretched and scaled, the color and texture values for each body and limb can be modified, the number of bodies and limbs can be changed, and new attachment points for bodies and limbs can be created. To translate the characters of the text message into these design function values, we assign each character its value from the standard ASCII table. When messages are sent, the incoming text modifies and “sculpts” the default module by changing its form, size, color, texture and number of bodies/limbs, copying parts and so forth. Depending on the complexity of the text, the body and limbs of the creature become increasingly shaped, modulated and varied. As there is usually great variation among the texts sent by different people, the creatures themselves also vary greatly in appearance, providing a personal creature for each author of a text. As soon as a message is sent to the server in Tokyo, the creature starts to live in its virtual environment and the author of the text receives a picture of his or her creature in return. More complex messages, with more characters, words and varied syntax, produce more elaborate creatures with more bodies, limbs and variation in body form, texture, size and color.
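The mapping from text to form can be sketched as follows. This is a simplified illustration of the principle, not the actual Life Spacies code: the function name, the cycling rule and the scale range are assumptions made for the example; only the idea of mapping each character’s ASCII value onto a design parameter comes from the description above.

```python
def text_to_parameters(message: str, num_vertices: int = 100):
    """Map the characters of a text message onto per-vertex scale factors
    for the default sphere body (100 vertices). Illustrative sketch only."""
    codes = [ord(c) for c in message]            # standard ASCII values
    scales = []
    for i in range(num_vertices):
        code = codes[i % len(codes)]             # cycle through the text
        scales.append(0.5 + (code % 64) / 64.0)  # scale factor in [0.5, 1.5)
    return scales

params = text_to_parameters("Hello creature!")
print(len(params))  # one scale factor per vertex -> 100
```

A longer or more varied message cycles through more distinct character codes, so the resulting vertex scales vary more, which mirrors how richer texts yield more elaborate creatures.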
The interaction setup in Tokyo consists of two independent interaction sites that are linked together via a data line, allowing visitors at remote locations to be displayed and to interact in the same virtual three-dimensional space. On-site visitors in Tokyo can directly interact with the creatures by touching and catching them in the immersive environment. Users see themselves integrated into the 3D environment on the screens and can play with the creatures through gesture-based interaction. If, for example, a visitor catches a creature, it makes a perfect copy of itself; but if two remotely located visitors each catch a creature, these two creatures mate and create an offspring creature. In this case, the offspring inherits the genetic code of the parent creatures; this is done through cross-over of the parents’ codes with some minimal mutation. A creature’s default life span is 24 hours, but as the life span is also a function of the design function table, it is updated and changed through the values of the specific characters in the text. When the creature has died, a report is sent to its author, telling him or her how long the creature lived and how many children and clones it produced.
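The mating step can be illustrated with a standard one-point crossover plus minimal mutation, since the text itself is the genetic code. The function name, the mutation rate and the choice of random printable ASCII replacements are illustrative assumptions; the source only states that crossover with minimal mutation is used.

```python
import random

def crossover(parent_a: str, parent_b: str, mutation_rate: float = 0.01,
              rng: random.Random = None) -> str:
    """One-point crossover of two parents' text genomes, with a small
    chance of mutating each character. Illustrative sketch only."""
    rng = rng or random.Random()
    cut = rng.randint(1, min(len(parent_a), len(parent_b)) - 1)
    child = list(parent_a[:cut] + parent_b[cut:])   # splice the two codes
    for i in range(len(child)):                     # minimal mutation
        if rng.random() < mutation_rate:
            child[i] = chr(rng.randint(32, 126))    # random printable ASCII
    return "".join(child)

offspring = crossover("I love this creature", "A small shy being",
                      rng=random.Random(42))
```

Because the offspring’s genome is again a text string, it is decoded into a body by the same text-to-form rules as any typed message.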
Life Spacies is thus a system where interaction and exchange happen between real life and artificial life on human-human, human-creature and creature-creature levels. The system is multi-modal as it combines a gesture-based interface with keyboard-based interaction. The use of the system is very intuitive, as users need only move around in the 3D environment and play with the creatures by catching them through hand gestures. A detailed description of the system and its follow-up system Life Spacies II is provided in the literature.
Typing as the Act of Creation
Life Writer consists of an old-style typewriter that evokes the era of analogue text processing. A normal piece of paper is used as the projection screen, and the position of the projection is always matched with the position of the typewriter roll. When users type on the keys of the typewriter, the resulting letters appear as projected characters on the paper. When users then push the carriage return, the letters on screen transform into small black and white artificial life creatures that appear to float on the paper of the typewriter itself. The creatures are based on genetic algorithms in which text is used as the genetic code that determines their behaviour and movements. The algorithms were developed for one of our previous works, Life Spacies, where text likewise functions as the genetic code for the creation of artificial life creatures.
As in the Life Spacies system, the artificial creatures created by the act of typing can be faster or slower depending on their genetic code and body shape. All of the artificial life creatures also need to eat text in order to stay alive: when users type a new text, the creatures quickly try to snap up these characters from the paper in order to gain energy. Once creatures have eaten enough text they can also reproduce and have offspring, so eventually the screen can become very full when the creatures are fed well.
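The feeding and reproduction rule described above can be sketched as a simple energy model. The class name, the energy gained per character, the reproduction threshold and its cost are all illustrative assumptions; the source states only that creatures eat typed characters for energy and reproduce once they have eaten enough.

```python
class Creature:
    """Minimal sketch of a text-eating creature. Thresholds are assumptions."""

    def __init__(self, genome: str, energy: int = 10):
        self.genome = genome
        self.energy = energy

    def eat(self, character: str):
        self.energy += 1                      # each snapped-up character feeds it

    def try_reproduce(self, threshold: int = 20):
        if self.energy >= threshold:          # well fed -> offspring
            self.energy -= threshold // 2     # reproduction costs energy
            return Creature(self.genome)      # simplified: offspring is a clone
        return None

c = Creature("hello world")
for ch in "some freshly typed text":
    c.eat(ch)
child = c.try_reproduce()
print(child is not None)  # enough text eaten -> True
```

Under this rule, a steady stream of typed text keeps creatures alive and multiplying, which is why the screen fills up when they are fed well.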
The user can also push the creatures around by turning the scroll of the typewriter’s cylinder. She can, for example, push the creatures back into the machine, which will crush them, or scroll the creatures off the screen altogether, making room for new creatures.
By connecting the act of typing to the act of creating life, Life Writer pursues the idea of an open-ended artwork where user-creature and creature-creature interaction become essential to the creation of digital life, and where an emergent system of life-like art arises on the boundary between the analog and digital worlds.
 Sommerer, C. and Mignonneau, L. 1997. "Interacting with Artificial Life: A-Volve," In: Complexity Journal. New York: Wiley, Vol. 2, No. 6, pp. 13-21.
 Sommerer, C., Mignonneau, L. and Lopez-Gulliver, R. 1999. "LIFE SPACIES II: from text to form on the Internet using language as genetic code," In: Proceedings of the 9th International Conference on Artificial Reality and Tele-Existence (ICAT'99), Tokyo: Virtual Reality Society, Dec. 1999, pp. 215-220.