Objective
- Director Principal Researcher Takashi Matsuyama (Graduate School of Informatics, Kyoto University)
- Principal Researcher Atsushi Iriki (Section of Cognitive Neurobiology, Department of Maxillofacial Biology, Tokyo Medical and Dental University)
- Principal Researcher Masato Sasaki (Graduate School of Interdisciplinary Information Studies / Graduate School of Education, The University of Tokyo)
- Principal Researcher Yasuo Kuniyoshi (Graduate School of Information Science and Technology, The University of Tokyo)
The goal of the project is to analyze and understand processes and mechanisms of human information processing and to realize man-machine symbiotic systems.
Man-machine symbiotic systems propose the next-generation way of getting along with 'machines (information systems)' in our everyday lives. To make our concept clearer, let us review the history of our civilization (Figure 1).
Figure 1: Tools, machines, and symbiotic systems
Tools
First of all, we, human beings, can be well characterized as animals that can flexibly use a variety of tools: hammers, cutters, bows, and so on. It is not too much to say that our civilization owes a great deal to tools.
Our flexibility in using tools may come from our capability of assimilating tools into our physical bodies. That is, human beings have sufficiently sophisticated perceptual and motor systems to realize such assimilation; consequently, tools can be regarded as extended bodies augmenting our physical capabilities.
Machine Interfaces
After the industrial revolution, various powered machines were developed and the concept of automatic control was introduced. From an informatics point of view, automatic machines can be distinguished from tools in that the former have internal states which are modified according to their own dynamics. To get along with automatic machines, therefore, we need ways of monitoring their states and controlling their state transitions. We may call such monitoring and controlling methods Machine Interfaces.
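The following sketch illustrates this distinction in Python; the class and method names are our own illustrative assumptions, not part of any particular system. The point is simply that an automatic machine carries internal state that evolves according to its own dynamics, so its interface must expose a way to monitor that state and a way to steer its transitions.

```python
class AutomaticMachine:
    """A machine whose internal state changes according to its own dynamics."""

    def __init__(self):
        self.temperature = 20.0   # internal state, not directly visible to the user
        self.setpoint = 20.0

    def step(self):
        """One tick of the machine's own dynamics (simple first-order behavior)."""
        self.temperature += 0.1 * (self.setpoint - self.temperature)

    # --- the "machine interface": monitoring and controlling state transitions ---
    def monitor(self) -> float:
        """Expose the internal state so a human can observe it."""
        return self.temperature

    def control(self, new_setpoint: float) -> None:
        """Let a human influence the machine's state transitions."""
        self.setpoint = new_setpoint


machine = AutomaticMachine()
machine.control(25.0)          # the user steers the dynamics...
for _ in range(10):
    machine.step()             # ...but the state keeps evolving on its own...
print(machine.monitor())       # ...and must be observed through the interface
```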
Among others, since computers can be programmed to have an enormous number of different states and to exhibit unpredictably complex state transitions, it became crucial to develop easy-to-understand computer interfaces. Thus, starting from the Character User Interface, we now enjoy the Graphical User Interface. To further improve computer interfaces, moreover, the Multi-Modal User Interface and the Perceptual User Interface are being studied: the former employs sound and tactile interfaces as well as a visual interface, while the latter understands human gestures and speech to free us from keyboards and mice. It is our short-term research goal to develop these advanced human-computer interfaces.
Man-Machine Symbiotic Systems
The man-machine symbiotic systems we are proposing go a step further beyond machine interfaces. That is, the interaction model employed in any machine interface is designed based on the Order-and-Response Model or the Master-and-Slave Model, in which a human user is regarded as an omnipotent master and a computer as an obedient servant. In man-machine symbiotic systems, on the other hand, a human and a machine are considered to be on equal terms and interact with each other as partners. Of course, we do not claim that we should give machines human rights; what we want to realize is machines that work for humans even when they are not explicitly ordered to.
Thus, the fundamental question to be answered in our project is how we can design machines that can be regarded as partners rather than slaves. An intuitive answer would be that, while they have autonomy, they always care about humans: observing human behaviors to understand their intentions and moods, and supporting them voluntarily.
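As a rough, hypothetical sketch of this contrast (the classes, observations, and behaviors below are illustrative assumptions, not the project's design), an order-and-response machine acts only on explicit commands, whereas a partner machine also observes the human and may offer help voluntarily:

```python
from typing import Optional


class OrderAndResponseMachine:
    """Master-and-Slave model: the machine acts only when explicitly commanded."""

    def handle(self, command: Optional[str]) -> Optional[str]:
        return f"executing: {command}" if command else None


class PartnerMachine:
    """Partner model: autonomous, yet continually attentive to the human."""

    def handle(self, command: Optional[str], observation: dict) -> Optional[str]:
        if command:  # explicit orders still take priority
            return f"executing: {command}"
        # Without an order, infer intention from observed behavior and offer help.
        if observation.get("gaze") == "bookshelf" and observation.get("posture") == "reaching":
            return "offering: shall I fetch that book for you?"
        return None  # otherwise, remain quietly autonomous


partner = PartnerMachine()
print(partner.handle(None, {"gaze": "bookshelf", "posture": "reaching"}))
# -> offering: shall I fetch that book for you?
```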
Since realizing partner machines is a very long-range research goal, we set the following mid-range research goals in the project (Figure 2):
- (I) Knowing Human Beings:
- analyzing characteristics of natural human beings from the viewpoints of neurophysiology and psychology.
- (II-1) Observing Human Activities:
- sensing and understanding human activities using visual, auditory, and tactile sensors.
- (II-2) Attracting Humans:
- realizing attractive multimedia presentation methods, which elicit human reactions.
- (III) Interacting with Humans:
- developing real-time interaction protocols.
- (IV) Living with Humans:
- developing robots that can live together with humans in everyday life environments.
Among the various technical aspects of realizing partner machines, we believe dynamic multimodal interaction protocols between human and machine are crucial: to make interactions smooth and casual, we must determine when pauses should be introduced, how quickly responses should be given, and in which media information should be presented.
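As a minimal, hypothetical illustration of these protocol parameters (the thresholds, field names, and media choices below are assumptions made for the sketch, not results from the project), such a policy might adapt pause timing, response speed, and presentation medium to the observed state of the human partner:

```python
from dataclasses import dataclass
import time


@dataclass
class InteractionPolicy:
    pause_before_reply: float   # seconds of silence before the machine responds
    reply_speed: float          # e.g. a speech-rate multiplier
    medium: str                 # "speech", "display", or "gesture"


def choose_policy(user_state: dict) -> InteractionPolicy:
    """Adapt timing and media to the observed state of the human partner."""
    if user_state.get("busy"):
        # Do not interrupt: defer the reply and use an unobtrusive medium.
        return InteractionPolicy(pause_before_reply=2.0, reply_speed=1.0, medium="display")
    if user_state.get("mood") == "hurried":
        # Respond promptly and briskly.
        return InteractionPolicy(pause_before_reply=0.2, reply_speed=1.3, medium="speech")
    return InteractionPolicy(pause_before_reply=0.8, reply_speed=1.0, medium="speech")


policy = choose_policy({"mood": "hurried"})
time.sleep(policy.pause_before_reply)   # introduce the pause before responding
print(f"reply via {policy.medium} at {policy.reply_speed}x speed")
```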
Figure 2: Realizing man-machine symbiotic systems