
Creating a World where Man and Machine are One

The World of Augmented Humans



Contributed by Jun Rekimoto
Professor
Interfaculty Initiative in Information Studies
http://lab.rekimoto.org/
AR (Augmented Reality) is a term that is often heard these days. What Professor Jun Rekimoto is working on goes one step beyond AR: Augmented Humans, or AH. The phrase Augmented Humans refers to research on enhancing human capabilities through the use of technology. Imagine expanding your consciousness and experiencing systems that let you immerse your senses in robots and other devices with capabilities beyond those of humans. Let's take a look at the future world in which man and machine are combined.


What will become of humanity as continued progress in technology leads to the strengthening and expansion of human capabilities? While this question has been asked ever since humanity first came to possess technology, it is also a theme at the forefront of modern scientific and technological research. In my laboratory, we call this strengthening and expansion of human capabilities "human augmentation." Human augmentation is not limited to increasing our ability to gather information; it can take many forms, including the strengthening of physical capabilities and skills, as well as the fortification of bodily systems (health).
 
Figure 1: Connecting remotely with a drone
A particular example of human augmentation that I am working on is a series of remote immersion systems collectively called "Jack In" systems. "Jack In" is a term coined in the sci-fi novel Neuromancer*, where it means immersing all of one's senses in cyberspace, a virtual space created by computers. I have expanded upon that meaning and use "Jack In" to refer to projecting one's senses into a robot with superhuman abilities, or into another human being altogether.

For instance, Figure 1 shows an example of a "Jack In" system used with a drone. The images seen in the head-mounted display come from a live camera feed on the drone. When the wearer looks around, the drone changes direction to match the wearer's head movements, allowing the wearer to feel as if they have become the drone. This system enables professionals, for example, to virtually approach disaster-stricken areas from a distance and assess the situation.
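To give a rough sense of how such a system can be put together, the sketch below shows the core control loop: read the wearer's head orientation from the headset, command the drone to face the same way, and stream the drone's camera view back to the display. This is only an illustration under assumptions of my own; the HeadMountedDisplay and Drone classes and their methods are hypothetical stand-ins for a real HMD SDK and drone API, not the laboratory's actual implementation.

    # Illustrative sketch only: couples a wearer's head yaw to a drone's yaw
    # and streams the drone's camera feed back to the headset.
    import time

    class HeadMountedDisplay:
        """Hypothetical stand-in for an HMD SDK."""
        def get_head_yaw_degrees(self) -> float:
            return 0.0   # placeholder: the wearer's current head yaw

        def show_frame(self, frame) -> None:
            pass         # placeholder: display one video frame to the wearer

    class Drone:
        """Hypothetical stand-in for a drone control API."""
        def set_yaw_degrees(self, yaw: float) -> None:
            pass         # placeholder: rotate the drone to the requested heading

        def get_camera_frame(self):
            return None  # placeholder: latest frame from the onboard camera

    def jack_in_loop(hmd: HeadMountedDisplay, drone: Drone, hz: float = 30.0) -> None:
        # Each cycle: turn the drone to face wherever the wearer is looking,
        # then send the drone's camera view back to the headset.
        period = 1.0 / hz
        while True:
            drone.set_yaw_degrees(hmd.get_head_yaw_degrees())
            hmd.show_frame(drone.get_camera_frame())
            time.sleep(period)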
Figure 2: Sports training with out-of-body vision generation systems (top: a drone following a runner; bottom: a submersible robot moving alongside a swimmer)



This concept can also be applied to viewing yourself from outside your own body. In sports training and rehabilitation, it is important to have one's posture and form observed objectively from an external viewpoint. This need is usually met by personal trainers, and partly by mirrors or video recordings of oneself, though the latter methods are not thorough enough.


Figure 2 shows people experiencing an out-of-body vision generation system. In the top photo, a drone follows a runner, flying close by. The runner can review the footage the drone captures to analyze their running form. The bottom photo of Figure 2 presents an out-of-body vision generation system designed for swimming. This robot swims alongside the swimmer, and the footage it captures can be reviewed to assess the swimmer's form from an external viewpoint. These kinds of systems are particularly useful for remote coaching, as coaches can "Jack In" to the flying drone or the swimming robot and use the machine to give the athlete instructions from a distant location.
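As a rough illustration of the kind of following behavior the top photo describes, the sketch below keeps a camera drone at a fixed offset behind and above a runner using a simple proportional controller. The coordinates, offset, and gain are assumptions for illustration only; a real out-of-body vision system would involve far more (tracking, obstacle avoidance, camera gimbal control).

    # Illustrative sketch only: a proportional controller that keeps a camera
    # drone a few metres behind and above a runner. Positions are (x, y, z)
    # coordinates in metres from some hypothetical tracking source.
    import numpy as np

    FOLLOW_OFFSET = np.array([-3.0, 0.0, 2.0])  # 3 m behind, 2 m above the runner
    GAIN = 0.8  # fraction of the position error corrected per control step

    def follow_step(drone_pos: np.ndarray, runner_pos: np.ndarray) -> np.ndarray:
        """Return a velocity command (m/s) that moves the drone toward its
        target position relative to the runner."""
        target = runner_pos + FOLLOW_OFFSET
        return GAIN * (target - drone_pos)

    # Example: the runner is at the origin and the drone has drifted ahead;
    # the command pushes it back toward the point 3 m behind and 2 m up.
    command = follow_step(np.array([1.0, 0.0, 2.0]), np.zeros(3))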
What would happen if we could "Jack In" not only to robots, but also to other people? For instance, we are experimenting with a device that allows professionals to "Jack In" from a distance to networks transmitting the sensory data, such as sight and hearing, of people working or doing other activities in a particular location (Figure 3). The user puts on a wearable computer that records 360-degree footage of their surroundings. The computer is equipped with a stabilization function, so even when the wearer moves their head, the footage does not shake. The information gained from this technology can be used to bring together and match several people with differing abilities. This technology also creates new possibilities for video content, such as letting users see from a top athlete's point of view while the athlete is engaged in sports, thereby immersing themselves in the athlete's environment.
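To illustrate one way such stabilization can work (an assumption on my part, shown for a simplified yaw-only case rather than the actual device): because the wearable camera captures a full 360-degree panorama, the image can simply be shifted in the opposite direction of the wearer's head rotation, so the remote viewer's horizon stays steady even as the wearer looks around.

    # Illustrative sketch only: yaw-only stabilization of an equirectangular
    # 360-degree frame, cancelling the wearer's head rotation by counter-rotating
    # the panorama. A real system would also compensate pitch and roll.
    import numpy as np

    def stabilize_yaw(frame: np.ndarray, head_yaw_degrees: float) -> np.ndarray:
        """Shift an equirectangular frame (height x width x channels) so the
        wearer's rotation about the vertical axis is cancelled out."""
        width = frame.shape[1]
        # One degree of yaw corresponds to width / 360 pixels of horizontal shift.
        shift_pixels = int(round(-head_yaw_degrees / 360.0 * width))
        # np.roll wraps pixels around the edge, which suits a 360-degree panorama.
        return np.roll(frame, shift_pixels, axis=1)

    # Example: a 10-degree head turn is cancelled by rolling the panorama back,
    # so the view delivered to the remote "Jack In" user does not move.
    frame = np.zeros((1024, 2048, 3), dtype=np.uint8)
    steady = stabilize_yaw(frame, head_yaw_degrees=10.0)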
 
Figure 3-1: "Jack In" headgear worn by the person on the left of the image below
Phrases such as the Japanese term "jinba ittai" ("the horse and rider as one") describe tools that have been refined to the point where they feel almost like part of one's body. I continue my research in the belief that an age is coming in which the relationship between humans and technology will reach the same point, an age that might be called "jinki ittai" ("the machine and human as one").
Figure 3-2: A "Jack In" system that connects people's abilities and senses (the person on the left is wearing headgear that takes footage all around him, sending it back to the device worn by the person on the right)
* A 1984 novel written by William Gibson. As a pioneering work in the cyberpunk genre, it influenced many later works, including The Matrix and the Ghost in the Shell series.

** This article is a translation of an article that was originally printed in Tansei 31 (Japanese language only).
 
   - Tansei 16 -

