Perceptive Computing

Presented by

Sameer Babu K.

Under the guidance of

Ms. Shyma Muhammed

Dept. of EEE

What is a Perceptive Computer?

A computer with perceptual capabilities.

It has the ability to perceive, integrate, and interpret visual, auditory, and touch information.

Just like communicating with a human.

This makes human–computer interaction more:

Natural

Easier

Richer

Computers would be much more powerful if they had even a small fraction of the perceptual ability of humans.

Project BlueEyes

Features

Uses non-obtrusive technology

Extracts key information from the user’s cues and gestures.

These cues are analysed to determine the user’s physical, emotional or informational state.

Attentive environments – environments that are user- and context-aware.

Non-Obtrusive Technology

  • Sensors
  • Video cameras
  • Microphones
  • Trackers

Attributes

  • Eye/pupil movement
  • Voice commands
  • Head gestures
  • User’s physiological state

BlueEyes Projects

Emotion mouse

Suitor

MAGIC

Facial Expression Identification

Pupil Finder

Emotion Mouse

Simply by touching or using the mouse normally, the computer can determine the user's emotional state.

The emotional state can then be related to the task the user is currently doing on the computer.

Over time, a user model can be built up and the computer can adapt to the user to create a better working environment.

The IR, GSA, and GSR readings are inputs to a series of discriminant function analyses.

The result is correlated to an emotional state.

Hence, an association is established between the user's physiological state and the corresponding emotional state.

IR – infrared sensor (measures heart rate)

GSR – Galvanic Skin Response

GSA – General Somatic Activity
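The mapping from physiological readings to an emotional state might be sketched as below. This is an illustrative assumption, not the Emotion Mouse implementation: a nearest-centroid rule stands in for the discriminant function analyses, and every feature value and state label is made up.

```python
# Hypothetical sketch: classify (IR heart rate, GSR, GSA) readings into an
# emotional state. Centroid values below are invented for illustration.
STATE_CENTROIDS = {
    "calm":    (70.0, 2.0, 0.3),
    "anxious": (95.0, 8.0, 0.9),
    "happy":   (80.0, 4.0, 0.6),
}

def classify_state(ir, gsr, gsa):
    """Return the state whose centroid is closest to the reading."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip((ir, gsr, gsa), centroid))
    return min(STATE_CENTROIDS, key=lambda s: sq_dist(STATE_CENTROIDS[s]))

print(classify_state(92.0, 7.5, 0.8))  # closest centroid is "anxious"
```

Over time the centroids could be re-estimated per user, which is one way the "user model built up over time" described above could adapt to an individual.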

Person using an early prototype

Final prototypes

Facial Expression Identification

  • Assesses emotions by using the image of the person.
  • Simple states like anxious and happy can be assessed easily.
  • Complex states are still being researched.

MAGIC (Manual And Gaze Input Cascaded pointing)

Uses gaze tracking to perceive the user's area of interest.

The mouse cursor moves to the area of the screen that the user gazes at.

The mouse button is still used for selection, since gaze-tracking accuracy is too limited for fine pointing.
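The gaze-plus-manual idea above can be sketched as follows. This is a minimal illustration, not the actual MAGIC implementation: the pixel threshold is an assumed parameter, and the click is deliberately left to the hand.

```python
# Hypothetical sketch: warp the cursor near the gaze point only when gaze has
# moved well away from the cursor; fine positioning and clicking stay manual.
import math

WARP_THRESHOLD = 120  # pixels; assumed value, tuned per tracker in practice

def update_cursor(cursor, gaze):
    """Return the new cursor position given the current gaze point."""
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if math.hypot(dx, dy) > WARP_THRESHOLD:
        return gaze      # coarse jump to the user's area of interest
    return cursor        # small offsets: leave fine positioning to the mouse

print(update_cursor((0, 0), (400, 300)))      # far gaze -> warps to (400, 300)
print(update_cursor((400, 300), (420, 310)))  # near gaze -> cursor stays put
```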

SUITOR (Simple User Interest Tracker)

Puts computational devices in touch with their users' changing information needs.

SUITOR can infer what kind of information will be interesting to the user at a particular time and deliver it.

Intelligent data gathering and data mining.
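One way such interest inference could work is sketched below. This is an assumption for illustration, not SUITOR's actual method: gaze samples are tagged with the topic being read, and the most-watched topic drives what gets delivered next.

```python
# Hypothetical sketch: infer the user's current interest from gaze dwell.
# Topic labels and samples are invented data.
from collections import Counter

def top_interest(gaze_samples):
    """gaze_samples: one topic label per gaze sample; return the most-watched."""
    return Counter(gaze_samples).most_common(1)[0][0]

samples = ["sports", "finance", "finance", "weather", "finance"]
print(top_interest(samples))  # "finance" drew the most gaze samples
```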

Pupil Finder

A fast, robust, and low-cost pupil detection technique.

Uses two time-multiplexed infrared (IR) light sources.

Can track multiple pupils simultaneously.

Detects the cues and gestures the user gives with the eyes.
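The time-multiplexed IR idea can be sketched like this, under the common bright-pupil/dark-pupil assumption: an on-axis IR frame makes pupils glow, an off-axis frame leaves them dark, so subtracting the two frames and thresholding isolates the pupils. The tiny "frames" and threshold below are made-up data, not the actual system.

```python
# Hypothetical sketch: locate pupils by differencing two IR frames.
THRESHOLD = 100  # assumed intensity-difference threshold

def find_pupils(bright_frame, dark_frame):
    """Return (row, col) pixels where the on/off-axis difference is large."""
    return [
        (r, c)
        for r, row in enumerate(bright_frame)
        for c, _ in enumerate(row)
        if bright_frame[r][c] - dark_frame[r][c] > THRESHOLD
    ]

bright = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]  # on-axis IR frame
dark   = [[10, 10, 10], [10,  20, 10], [10, 10, 10]]  # off-axis IR frame
print(find_pupils(bright, dark))  # [(1, 1)] -- the single bright-pupil pixel
```

Because the rule is a per-pixel difference, it naturally finds every pupil in view, which is how multiple pupils can be tracked at once.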

Multimodal Interfaces

Development of software libraries for incorporating multimodal input into human computer interfaces.

These libraries allow human–computer interaction with an intuitive mix of voice, speech, gesture, gaze, and body motion.
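A classic way such inputs are fused is the "put that there" pattern: a spoken deictic word is resolved against the most recent pointing gestures. The sketch below is an illustrative assumption; the event format and target names are invented.

```python
# Hypothetical sketch of multimodal fusion: pair each deictic word in the
# utterance ("that"/"there") with successive pointed-at targets.
def fuse(voice_words, point_targets):
    """Replace each deictic word with the next gesture target, in order."""
    targets = iter(point_targets)
    return [next(targets) if w in ("that", "there") else w
            for w in voice_words]

utterance = ["put", "that", "there"]
points = ["red_block", "table_corner"]  # gesture targets, in arrival order
print(fuse(utterance, points))  # ['put', 'red_block', 'table_corner']
```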

Projects

Intelligent Conversational Avatar

HMRS (Hand Motion Recognition System)

Applications

Computers

Embedded devices

PDAs

Mobile phones

Human interface devices

Present

  • Voice recognition
      • voice dialing on mobile phones
      • voice commands for PCs
  • Data mining used by websites

Future

Computer training or educational programs

Interactive entertainment

Advertising

Mood-perceiving software

Household appliances that we can talk to

User-based search engines

Possibilities are endless …

More responsive user interfaces

REFERENCES

  1. Ekman, P. and Rosenberg, E. L. "What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS)." Oxford University Press: New York, 1997.
  2. Dryer, D. C. "Getting personal with computers: How to design personalities for agents." Applied Artificial Intelligence, vol. 13, pp. 273-295, 1999.