Eyecam: An Anthropomorphic Webcam

What if seeing devices looked like us? Eyecam is a prototype exploring the potential future design of sensing devices. Eyecam is a webcam shaped like a human eye that can see, blink, look around and observe us.

Sensing devices are everywhere, up to the point where we become unaware of their presence.
Eyecam is a critical design prototype exploring the potential futures of sensing devices. Eyecam is a webcam shaped like a human eye. It can see, blink, look around and observe you.
More information about the project, or a request for media? Contact me at contact@marcteyssier.com.
You can download the HD pictures (mirror) and the HD video (without captions: link).
Interested in building one? Eyecam is open-source!

Eyecam is a research project developed and built by Marc Teyssier. The paper received contributions from Marion Koelle, Paul Strohmeier, Bruno Fruchard and Jürgen Steimle.
This research was conducted during my stay at the Saarland University Human-Computer Interaction Lab.

What is Eyecam?
Eye contact. Human eyes are crucial for communication. Through gaze, we can perceive happiness, anger, boredom or fatigue. The eyes move around when someone is curious and look straight ahead to maintain focus. We are familiar with these interaction cues influencing our social behavior. While webcams share the same purpose as the human eye (seeing), they are not expressive; they do not convey and transmit affect as human eyes do. Eyecam brings the affective aspects of the eye back into the camera.

Eyecam is a webcam shaped like a human eye. It can see, blink, look around and observe you.

Rethinking our relationship with the digital world
The purpose of this project is to speculate on the past, present and future of technology. We are surrounded by sensing devices: surveillance cameras observing us in the street, Google and Alexa speakers listening to us, webcams in our laptops constantly looking at us. They are becoming invisible, blending into our daily lives, up to a point where we are unaware of their presence and stop questioning how they look, sense, and act.

What are the implications of their presence on our behavior? This anthropomorphic webcam highlights the potential risks of hiding a device's functions and challenges conventional device design.

When the form matches the function
Modelled on human physiology, Eyecam is composed of three main parts: the skin layer, the (robotic) musculoskeletal system and the eyeball.

Movement and sight
Eyecam comprises an actuated eyeball with a seeing pupil, actuated eyelids and an actuated eyebrow. Their coordinated movement replicates realistic human-like motion. The device contains six servo motors, positioned to reproduce the action of the different eye muscles: the lateral and vertical motion of the eyeball, the closing of the eyelids and the movement of the eyebrow. These motors are controlled by an Arduino Nano. A small camera positioned inside the pupil senses a high-resolution image (720p at 60 fps). This camera is connected to a Raspberry Pi Zero and is detected by the computer as a conventional plug-and-play webcam.
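To give a feel for this split between the Raspberry Pi and the Arduino Nano, the sketch below shows how a host-side controller could stream target servo angles over a serial link. The port name, baud rate and ASCII command format are assumptions made for this illustration, not the protocol of the actual Eyecam firmware (see the GitHub repository for that).

```python
# Illustrative host-side controller: the Raspberry Pi streams target angles
# to the Arduino Nano driving the six servos. The serial port, baud rate and
# ASCII command format are assumptions for this sketch, not the real protocol.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed serial port of the Arduino Nano
BAUD = 115200           # assumed baud rate

def send_pose(link, pan, tilt, upper_lid, lower_lid, brow):
    """Send one eye pose; angles in degrees, clamped to a 0-180 servo range."""
    clamp = lambda a: max(0, min(180, int(a)))
    cmd = "P {} {} {} {} {}\n".format(
        *(clamp(a) for a in (pan, tilt, upper_lid, lower_lid, brow)))
    link.write(cmd.encode("ascii"))

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        # Look straight ahead with lids half open (values chosen arbitrarily).
        send_pose(link, pan=90, tilt=90, upper_lid=40, lower_lid=140, brow=90)
```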

Human-Like behavior
The device is not only designed to look like an eye but also to act like one. To make the movements feel believable and natural, the device reproduces both unconscious physiological behaviors and conscious behaviors. Like a human eye, Eyecam blinks continuously, and the eyelids dynamically adapt to movements of the eyeball: when Eyecam looks up, the top eyelid opens widely while the lower one closes completely. Eyecam can be autonomous and react on its own to external stimuli, such as the presence of users in front of it.
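As a rough illustration of this eyelid-to-gaze coupling and the randomized blinking, here is a minimal sketch; the angle range, openness values and blink intervals are assumptions chosen for readability, not values taken from the project code.

```python
# Illustrative coupling of eyelid openness to vertical gaze, plus a randomized
# blink interval. All constants are assumptions for this sketch.
import random

def eyelids_for_gaze(tilt_deg):
    """Map vertical gaze (-30 = down, +30 = up) to upper/lower lid openness in 0..1."""
    t = max(-30.0, min(30.0, tilt_deg)) / 30.0
    upper = 0.6 + 0.4 * max(0.0, t)   # looking up opens the upper lid wide...
    lower = 0.6 - 0.6 * max(0.0, t)   # ...while the lower lid closes
    return upper, lower

def next_blink_delay():
    """Humans blink roughly every few seconds; pick a randomized interval."""
    return random.uniform(2.0, 6.0)

if __name__ == "__main__":
    for tilt in (-30, 0, 15, 30):
        upper, lower = eyelids_for_gaze(tilt)
        print(f"tilt={tilt:+3d}  upper lid={upper:.2f}  lower lid={lower:.2f}")
    print(f"next blink in {next_blink_delay():.1f} s")
```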

Interpretation and decision
Like our brain, the device can interpret what is happening in its environment. We rely on computer-vision algorithms to process the video stream, detect relevant features and interpret what is happening. Does it know this face? Should the eye follow it?
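As one concrete example of such a perception loop (the actual Eyecam pipeline may differ), a stock OpenCV Haar cascade can detect a face in the webcam stream and turn its position into a gaze target:

```python
# One way to close the perception loop: detect a face in the camera stream and
# convert its position into a normalized gaze target. This is a generic OpenCV
# sketch, not the project's own detection pipeline.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def gaze_target(frame):
    """Return the largest detected face centre as (pan, tilt) in [-1, 1], or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest face
    fh, fw = frame.shape[:2]
    return ((x + w / 2) / fw * 2 - 1, (y + h / 2) / fh * 2 - 1)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # Eyecam shows up as a standard webcam
    ok, frame = cap.read()
    if ok:
        print("gaze target:", gaze_target(frame))
    cap.release()
```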

Bones and Flesh
As in our skull, the skin layer sits on a hard shell. The realistic skin is manually sculpted over the shell and then cast in silicone. Finally, human hairs are implanted into the silicone to form the eyebrow and eyelashes.

A device to Reflect and Think
Eyecam is uncanny, unusual, weird. Its goal is to spark speculation on device aesthetics and functions. We challenge conventional relationships with sensing devices and call to re-think how sensing devices might appear and behave. Inspired by critical design, Eyecam allows for critical reflection on devices' functionalities and their impact on human-human and human-device relations. This opens up a debate on plausible and implausible ways future sensing devices might be designed.

Should the device be transparent and invisible to the user? What are the next social and ethical challenges of IoT? What is the balance between mediation and intrusion? How can we give smart sensing devices the right amount of agency? How can we reinforce privacy and show users when they are being watched? How can we design smart devices to be present where needed, but respectfully absent when not?

It is time to rethink the relationship between humans and sensing devices through novel design.

Open-Source and Open-Hardware
The design of Eyecam is the result of a long iterative process. The main technical challenge was to pack the motors and electronics as tightly as possible to maintain the proportions of a real eye. I believe this mechanism can benefit others: it can be reproduced and appropriated by researchers, designers, or makers who wish to experience it, explore it, and extend it to create provoking, novel or uncanny sensing devices.
Hence, the project is fully open-source and open-hardware. Access the files on GitHub. A video tutorial on how to make Eyecam from scratch is currently in the making!

More information
Eyecam was published at the 2021 ACM CHI Conference on Human Factors in Computing Systems.
For any requests or questions, please reach me via email at contact@marcteyssier.com or on Twitter @marcteyssier.
This research was conducted at the Saarland University Human-Computer Interaction Lab. This project received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement 714797 ERC Stg InteractiveSkin).

SOURCE: Marc Teyssier
