Blurring The Lines – The Virtuality of Human Reality

Gerald M. Santoro

Presented at

“Playing to Win ’08: The Business and Social Frontiers of Video Games”

The Pennsylvania State University

March, 2008

Introduction

This is not a scientific paper. I will not be revealing any cutting-edge research, nor will I be offering a unique interpretation of existing data. Instead, I hope to offer a number of lines of evidence to support the proposition that all human reality is virtual, and then to propose some possibly interesting implications for the emerging technology of Virtual Worlds.

Philosophy and Levels of Reality

From the time the earliest ancestral human pondered the surrounding world, the question of who we are has been central to human culture. Throughout the years, philosophers, prophets, artists and others have tried to explain the nature of human reality. This was important as early humans tried to cope with a world filled with many threats to their existence. One result has been the development of many complex mythologies that define humanity’s place in the universe and give us purpose and hope.

They are, of course, imaginary, but that is not to say they are not real. The reality of a citizen of ancient Greece circa 400 BC was filled with gods, demons, monsters and vague personalities that filled the gaps in his real understanding of the physical world. Far from being useless, these symbolic entities provided a foundation for the construction of the earliest human communities and cultures. Many of the stories of the gods were designed to teach complex ethical and moral lessons. Others were designed to impart a common belief set that would help to bond a community and carry it through times of strife such as war or drought.

An example is the Divina Commedia, written by Dante Alighieri in early 14th-century Italy (1). This piece of pop culture was controversial at the time, but has since become widely recognized as one of the most important works of medieval literature. Its importance comes not from the artistry but from how Dante expresses the ‘reality’ shared by the people of his time. The afterlife realms of Inferno, Purgatorio and Paradiso were very real to them. Their ‘existence’ motivated real people toward certain positive behaviors and constrained antisocial ones. Angels, demons, monsters and the spirits of the dead filled the gaps between what Dante’s contemporaries experienced and what they thought they understood. The lasting impact of the Divine Comedy was in its ability to discuss ethics and morals through the placement of various celebrities (political, religious, legendary, etc.) on the different levels of the afterlife.

It appears that the situation is one of multiple ‘levels’ of reality. On one hand there is the reality of human experience (HRL) – what we see, hear, feel, understand, etc. Then there is the reality of the physical world around us (PRL). Both of these are important because both are ‘real’ as far as we are concerned. But there is an added problem – we are only able to experience physical reality subjectively, through the biochemical mechanisms of sensory cognition. The philosopher Plato recognized this, and wrote of it in his ‘Allegory of the Cave.’ (2) What we perceive is not the underlying physical reality, but rather the effect of that reality. This was an amazing insight, for at the time almost nothing was known about human physiology or the mechanisms of cognition. We will return to this later.

In 1984, Richard Gregg (Penn State) wrote an extremely interesting book that examined the origins of human rhetoric. (3) He recognized that the basis for effective communication lies in symbolic reality. Imaginary creatures, forces and places provided the means by which ancient people could discuss their world and their place within it. These symbols represented aspects of the whole of human reality that could not be described in any other way. Gregg referred to this as ‘symbolic inducement’:

"All that we experience, all that we “know,” all of the meaning we create and respond to is made possible by our innate capacity to symbolize. It is all symbolic behavior. Our neurophysiological processing is always and inevitably geared to structure our experiencing symbolically, and basic but complex principles of mind-brain activity guide and shape all of the symbolizing we engage in. (p. 131)”

In summarizing Gregg’s work after his passing in 1990, friend and colleague Tom Benson wrote:

“This book goes far beyond the psychological observations that had emerged in the first years of his work. It is the result of a systematic study of anthropological, neurophysiological, psycholinguistic, psychological, literary, and rhetorical theory.” (4)

It would have been interesting to see how Prof. Gregg would have reacted to the development of virtual worlds (VW) – places where technology can bring symbolic realities to sensory life. In a sense they come full circle, providing supportive cognitive input to an engineered symbolic (virtual) existence.

Cognition and Consciousness

Cognitive Science emerged in the 1970’s as a recognized discipline with its own academic journal and professional organization. At this writing (2008), although much has been learned, the origins of ‘mind’ and ‘consciousness’ (and therefore HRL) are still very much a mystery. Two things we are fairly certain of are (a) that the brain is directly involved in the development and maintenance of HRL, and (b) that the specific mechanisms of our senses affect the nature and extent of what we can perceive.

Regarding the role of the brain: in “The Man Who Mistook His Wife for a Hat,” neurologist Oliver Sacks (5) wrote of patients whose ‘reality’ was changed by various physical head injuries. The title story describes the case of a man with visual agnosia (6) – essentially the ability to see objects without being able to recognize them. Specific symptomatic patterns vary, but there appears to be a disconnect between the patient’s sensory mechanisms and those that give meaning to the sensations. In essence this is a failure of the system that takes in perceptual data and matches it against stored models of reality.

The mechanisms and causes underlying the function (and malfunction) of the brain in its construction of HRL are not understood. The subjectivity of HRL does nothing to shield us from its effects. Dreams, originating in HRL, can spur the dreamer to action in PRL. A rather obvious example is the ghosts of Dickens’s “A Christmas Carol” – although it is unclear whether they visited in a dream or in reality, the effect was to make a marked change in Ebenezer Scrooge and, through him, those around him. In some cases these effects are actually desired. Mind-altering drugs and experiences have been both the backbone of communities (as in some rituals) and their destruction (as in crack or alcohol addiction).

As for sensory limitations – this feature of HRL can be a bit unsettling, but we live with it. Our senses work very well for most human endeavors. Optical illusions (7) provide entertainment and a sense of magic. For those cases where extrasensory data is required, technology can fill the gap. As a simple example, many homes have CO (carbon monoxide) detectors. Humans cannot sense CO, although it causes an estimated 500 deaths in the United States each year (8). Artificial lights have enabled humans to overcome the night, with the result that the darkness of night is little more than a minor hassle.

Before the latter part of the 1600’s, people had no clue as to the cause of disease, food poisoning, and the rotting and disintegration of dead animals and plants. Such unexplained phenomena supported a whole range of demons, curses, and madnesses which were imagined to be the underlying cause. It was not until the work of Antony van Leeuwenhoek (9) that the world of bacteria became recognized. Today we understand that these microscopic organisms inhabited the Earth for at least a full billion years (10) before multi-cellular organisms evolved. In a sense, we are simply higher-level organisms built upon (and dependent upon) the world of bacteria – a world we cannot perceive.

Today this is generally taken for granted. Little children are taught to wash their hands and to discard food dropped on the floor. Although we still cannot directly detect bacteria, the microscope has helped us accept an understanding of the PRL of the microscopic world of bacteria, viruses and much more. We have accepted the notion that there are things beyond our ability to directly sense – and so our technologies have provided assistance. (As I write this I am wearing glasses.)

In fact, many of our favorite technologies have been tailored specifically for the sensory limits of human cognition. Edison recognized this when he described the Kinetoscope (11) in 1888 (it was then built by one of his employees). Today, few people truly appreciate that when we watch motion pictures we are watching a series of stills go by at a rate tuned precisely to fool the brain into perceiving seamless motion. Similarly, when you watch television on a cathode-ray display you are really watching the afterglow of a single dot as it sweeps across and down the screen at a rate precisely tuned, again, to fool your brain into perceiving motion and objects.
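To make that timing concrete, here is a minimal sketch in Python of the fixed-cadence loop that any film projector or video player effectively implements. It is purely illustrative; the display routine is a hypothetical stand-in for real output hardware:

    import time

    FRAME_RATE = 24                    # frames per second used by sound film
    FRAME_INTERVAL = 1.0 / FRAME_RATE  # roughly 41.7 ms per still

    def display(frame):
        # Hypothetical stand-in for output hardware; here a "frame" is text.
        print(frame, end="\r", flush=True)

    def play(frames):
        # Present stills at a fixed cadence; at ~24 fps the viewer's brain
        # fuses the discrete images into apparently continuous motion.
        deadline = time.monotonic()
        for frame in frames:
            display(frame)
            deadline += FRAME_INTERVAL
            pause = deadline - time.monotonic()
            if pause > 0:
                time.sleep(pause)      # hold each still for its time slot

    play(f"frame {i:04d}" for i in range(96))  # four seconds of "film"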

As a result of the evolution of information technology, new emphasis has been placed on the development of devices and approaches for improving the human-computer interface (HCI). Once limited to text, HCI has made remarkable advances in quality during the past 35 years (and especially the past 10). Display technology has resulted in amazingly realistic 2-D and 3-D images on screens ranging from one inch to eight feet wide (or larger). Since digital signal processors such as the Motorola 56000 family appeared in the late 1980s, almost every computer has been capable of digital stereo audio at a quality level beyond the limits of most human hearing.
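That last claim follows from the sampling theorem: a signal sampled at rate R can represent frequencies up to R/2. A back-of-the-envelope check in Python, using the CD-standard figures rather than anything specific to a particular DSP chip:

    # CD-quality audio: the kind of playback consumer DSPs made routine.
    SAMPLE_RATE_HZ = 44_100           # samples per second, per stereo channel
    NYQUIST_HZ = SAMPLE_RATE_HZ / 2   # highest representable frequency

    HUMAN_HEARING_LIMIT_HZ = 20_000   # rough upper bound for young, healthy ears

    # 22,050 Hz > 20,000 Hz: the sampled signal can carry every frequency a
    # typical listener can perceive, which is the sense in which digital
    # audio quality exceeds most human hearing.
    print(NYQUIST_HZ, NYQUIST_HZ > HUMAN_HEARING_LIMIT_HZ)  # 22050.0 True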

And the trend continues. Research has already begun on the ‘brain-computer’ direct interface (BCI) (12). This ‘new modality’ offers tremendous potential – especially for handicapped and elderly persons. But even beyond those obvious applications there lies the potential for a better form of HCI for the average person. Currently, carpal-tunnel syndrome (CTS) results in about 260,000 operations annually in the United States (13). Although it is still debated whether keyboard and mouse use contribute to CTS, it is still a major pain for those of us who depend on computers for everything from work to entertainment. Even if a causal link is disproven, most computer users would welcome a better interface anyway – one that would allow us to control computers via intentional thought rather than physical action.

To that end, Emotiv Systems (14) has just announced the availability of its EPOC headset. This is a set of sensors that detects patterns in the user’s brain activity and that can be trained to associate those patterns with actions in a computer application. The idea is that video games and other computer applications can be controlled by training the brain to use the EPOC as the primary interface. This is still a far cry from the computer understanding the user’s raw ‘intent’ (as a symbolic construct), but it comes close, as it removes the middleman (the translation of intent into hand movement) from the interaction.
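The train-then-recognize idea can be sketched in a few lines of Python. Everything here is hypothetical – the feature vectors, the nearest-centroid matching, the action names – and bears no relation to Emotiv’s actual algorithms or API; it only illustrates the general pattern of learning to map recorded brain-signal features onto application actions:

    import math

    training = {}  # action name -> list of recorded feature vectors

    def train(action, features):
        # The user rehearses an intended action while the headset records
        # a (hypothetical) feature vector derived from brain activity.
        training.setdefault(action, []).append(features)

    def centroid(vectors):
        n = len(vectors)
        return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

    def classify(features):
        # Match a new reading to the trained action with the nearest centroid.
        return min(training,
                   key=lambda act: math.dist(centroid(training[act]), features))

    train("push", [0.9, 0.1, 0.3])   # user "thinks push" while recording...
    train("push", [0.8, 0.2, 0.2])
    train("lift", [0.1, 0.9, 0.7])

    print(classify([0.85, 0.15, 0.25]))  # -> push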

The opposite direction is also intriguing. A number of companies have been experimenting with systems that mount displays in the user’s eyeglasses. (15) Some have even proposed focusing the display directly on the user’s retina. A recent WTEC Workshop on Brain-Computer Interface Research explored the use of sensors implanted directly in the brain (16) – the ultimate BCI (the wire). This research aims to make it easier to control mechanical systems (input) and to acquire sensory data from those systems (output).

Virtual Realities, Virtual Worlds and Social Spaces

In the early 1990’s, former Harvard psychologist Timothy Leary (17) visited Penn State to talk with information technology faculty and staff about virtual reality. I attended the meetings, and became intrigued by Dr. Leary’s argument. He was essentially trying to convince people that virtual reality (VR), using feedback loops driven by the user’s brain waves, could induce a state of psychedelic nirvana. Although I enjoyed chatting with Dr. Leary, I felt that his idea was fanciful and betrayed a lack of understanding of VR and HCI technologies.

Today I am not so sure he was entirely wrong. If HRL is truly subjective – based on the mechanisms of the brain and at the mercy of the sensory input that comprises the raw material for our mental model – then perhaps controlled sensory input can result in controlled HRL, with resultant physiological effects. Unfortunately, Dr. Leary crossed a very dangerous line by experimenting on himself, and so his credibility within the scientific community was lost and his ideas were relegated to 60’s pop culture.

We now find ourselves in a socio-technical landscape where hundreds of millions of people are similarly experimenting on themselves through their participation in Internet-based massively multiplayer online role-playing games (MMORPGs) and virtual worlds (VW). Although the nature of the experimentation is far different from ingesting a drug, given the arguments outlined above it is possible that some psychological or physiological effects can result. A new category of behavioral addiction, Internet Addiction Disorder (18), has been proposed by some behavioral-health professionals – although it is controversial – to draw attention to the potential negative effects of obsessive computer use.

Many cases have been reported where the lure of virtual experience (a component of HRL) results in problems in PRL. In 2002, a Milwaukee man committed suicide as a result of (according to his mother) his addiction to the online game EverQuest. (19)

In 2005, a 28-year-old South Korean man collapsed and died after 50 straight hours of playing the online game StarCraft at an Internet café. (20) In 2007, MSNBC reported on a young couple in Reno, convicted of child neglect, who had allowed their infant children to go without food and other care because they were obsessed with Dungeons and Dragons. (21) For that matter, many parents of children during the 1980’s, 1990’s or 2000’s have had to literally tear a screaming child away from the alluring stimulation of the video game.

Why the obsession? What is it about online environments that results in obsessive user behaviors? Can it be that they are actually addressing human needs for socialization, adventure, and personal development that, through adequate HCI and user experience, become a real part of HRL? Is it possible that the mind (and underlying HRL) finds these experiences as engaging as ‘real’ experiences? If so, can there also be positive effects from the directed application of these technologies to HRL?

Virtual Worlds (VW) (22) are a subset of VRs that follow certain paradigm constraints. Where a VR can take any form of visualization imaginable, VWs typically are modeled on 3-dimensional simulations of the real world. There are ‘objects’ with properties such as size, mass, inertia, volume, location, orientation and levels of solidity (whether one can pass through them or not). Objects can also have behaviors attached in the form of scripts, which are limited by the underlying software design of the world. Users interact with the VW through ‘avatars,’ which are in-world characters that may be customized according to the user’s desires.
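As a concrete illustration of that object model, here is a minimal sketch in Python. The field names and values are my own assumptions for illustration, not the schema of any particular VW platform:

    from dataclasses import dataclass, field

    @dataclass
    class WorldObject:
        name: str
        position: tuple              # (x, y, z) location in the simulated space
        size: tuple                  # bounding dimensions
        mass: float
        solid: bool = True           # can avatars pass through it?
        scripts: list = field(default_factory=list)  # attached behaviors

    @dataclass
    class Avatar(WorldObject):
        owner: str = ""              # the human user this character represents

    # A solid chair, a doorway avatars can walk through, and a customized avatar.
    chair = WorldObject("chair", (10, 0, 5), (1, 1, 1), mass=4.0)
    doorway = WorldObject("doorway", (12, 0, 5), (1, 3, 0.1), mass=0.0, solid=False)
    me = Avatar("traveler", (9, 0, 5), (0.5, 2.0, 0.5), mass=70.0, owner="gms")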

Early VWs left much to be desired in terms of HCI. During the late 1970’s, my colleagues and I played the D&D-like game ‘adventure’ (23) on a DECsystem-10 mainframe computer using timesharing and LA-36 teletype terminals. This game was a single-user trek through a simulated cave. The user would issue English commands such as ‘north’ or ‘pick up lantern,’ and the program would give a response, such as ‘the lantern has been picked up,’ and would then describe what could be seen. Many of my adventures seemed to end in the ‘maze of many twisty passages – all alike.’
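The interaction model was simple enough to sketch in a few lines of Python. The rooms, wording and commands below are invented for illustration rather than taken from the actual game:

    rooms = {
        "entrance": {"north": "passage", "items": ["lantern"],
                     "text": "You are at the mouth of a cave."},
        "passage":  {"south": "entrance", "items": [],
                     "text": "You are in a maze of twisty passages, all alike."},
    }

    location, inventory = "entrance", []

    def handle(command):
        # Parse one typed English command, update the world, describe the result.
        global location
        room = rooms[location]
        if command in ("north", "south", "east", "west") and command in room:
            location = room[command]
            return rooms[location]["text"]
        if command.startswith("pick up "):
            item = command[len("pick up "):]
            if item in room["items"]:
                room["items"].remove(item)
                inventory.append(item)
                return "The " + item + " has been picked up."
            return "There is no " + item + " here."
        return "I do not understand that."

    print(handle("pick up lantern"))   # The lantern has been picked up.
    print(handle("north"))             # You are in a maze of twisty passages...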

The next generation of VW added a social component. Since these programs ran on multi-user computers (mainframes, mini-computers and eventually workstations) it was fairly simple to allow groups of users to share the same program, and therefore the same VW. Initially the users had to be logged into the same computer, but with early networking (late 1970’s and 80’s) telnet access to a server was all that was required. The era of social online gaming had begun – with names such as MOO, MUD and MUSH. One very popular example is PernMUSH (24) – developed to simulate the Pern books by author Anne McCaffrey. Friends of mine who spent hundreds of hours in PernMUSH explained that it was not the adventure but the socialization that would bring them back. They were able to share, exchange, cooperate and experience with others – all from the relative safety of their avatars. These were their ‘friends’ – a community of interest and shared values, not just of geographical accident.
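Architecturally, these systems needed little more than one server process holding the shared world state while many telnet clients connected to it. A minimal sketch in Python follows; the port, the wording, and the single shared ‘room’ are arbitrary choices for illustration, not any real MUD’s design:

    import socket
    import threading

    clients, lock = [], threading.Lock()

    def broadcast(line, sender=None):
        # Anything one user does is echoed to everyone else in the world.
        with lock:
            for conn in clients:
                if conn is not sender:
                    conn.sendall(line)

    def serve_user(conn):
        with lock:
            clients.append(conn)
        try:
            for line in conn.makefile("rb"):   # one typed command per line
                broadcast(b"A traveler says: " + line, sender=conn)
        finally:
            with lock:
                clients.remove(conn)

    # Users connect with an ordinary telnet client, e.g. 'telnet host 4000'.
    server = socket.create_server(("", 4000))
    while True:
        conn, _ = server.accept()
        threading.Thread(target=serve_user, args=(conn,), daemon=True).start()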

VWs today have benefited from a number of major technological developments that make one think of the telnet-connected MUSH as akin to the early days of black-and-white television. VW client programs allow for immersive graphics on standard PCs. Broadband networks (many wireless) allow users to access the VW from many locations. Faster processors, graphics accelerators, and digital signal processors allow much of the work of the VW to be handled on the client system. The result is a set of VWs that provide a much better level of cognitive input, and therefore a more realistic experience for the user.