
TELEPRESENT ROBOTICS


Suprastudio Technology-Seminar

Research Lead: Guvenc Ozel

Instructor: Benjamin Ennemoser

Technologies: gaming engines, XR, digital fabrication, projection mapping

Students: Yibing Wu, Jeremy Nguyen, Roxana Perez-Antonio, Peitong Zhang, Lingjie Wu, Hengzhi Ye, Ning Zhu, Qianqian Song, Shiyi Xin, Dhwani Pareshlal Gogri, Yevheniia Terzi, Ruodi Yufang, Yi Qian, Lu Zhan, Yu Zhang

Building on the research in the studio, we investigate robotic systems that are controlled and manipulated through Virtual and Augmented Reality applications. With the rapid development of consumer technology and the increasing connectivity of devices under the concepts of the Internet of Things, Big Data, Artificial Intelligence, Machine Learning, and Automation, the interaction with and manipulation of physical environments through virtual tools becomes more and more relevant. Within this scope, the Tech-Seminar explores how spatial robotic systems are orchestrated and defined by Virtual Reality environments.

The robotic systems are a series of modules, panels, or end-effectors in the form of an enclosure, attached to the KUKA robots in the IDEAS robotics lab. The goal is to design a transformable array of spatial modules and systems that are assembled and combined with the robots in real time and in VR, while the user is teleported as an avatar next to the industrial robots. In other words, we develop an enclosure that is attached to two different KUKA robots as end-effectors. The end-effectors are controlled by the VR controller from the point of view of a 360 camera, and an AR application or projection mapping augments the enclosure in specific scenarios or phases of transformation.

The modules have the ability to form a greater composition, while the composition itself should be able to contain multiple scenarios, elements, and states. We also focus on the details and moments of connection between the modules and panels, and on how these change the silhouette and spatial performance of the overall composition as it transforms. The seminar is divided into three distinct phases, and students work in teams. Each team has to develop a series of dynamic and responsive systems as well as Virtual and Augmented Reality environments that can interact with the KUKA robots in real time.
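As a rough sketch of how such real-time VR-to-robot control could be wired up: the snippet below converts a VR-controller position delta into a clamped Cartesian correction and packages it as an XML datagram for UDP streaming. The tag names (`Sen`, `RKorr`, `IPOC`) follow the general shape of KUKA's Robot Sensor Interface telegrams, but the exact schema depends on the RSI configuration of the specific robot cell, so treat the names, the clamp limit, and the network details as illustrative assumptions rather than a working setup.

```python
import socket
import xml.etree.ElementTree as ET

# Per-cycle clamp on the Cartesian correction (mm). An assumed safety limit:
# a VR controller can jump, but the robot should only move a small step per cycle.
MAX_STEP_MM = 2.0


def clamp(value, limit):
    """Limit a correction so the robot never jumps more than `limit` per cycle."""
    return max(-limit, min(limit, value))


def pose_to_correction_xml(dx, dy, dz, ipoc=0):
    """Turn a controller position delta (mm) into an RSI-style XML packet.

    Tag/attribute names are placeholders modeled on RSI telegrams; a real
    deployment must match the schema defined in the robot's RSI configuration.
    """
    root = ET.Element("Sen", Type="ImFree")
    ET.SubElement(root, "RKorr",
                  X=f"{clamp(dx, MAX_STEP_MM):.4f}",
                  Y=f"{clamp(dy, MAX_STEP_MM):.4f}",
                  Z=f"{clamp(dz, MAX_STEP_MM):.4f}")
    # IPOC echoes the controller's cycle counter so packets stay in sync.
    ET.SubElement(root, "IPOC").text = str(ipoc)
    return ET.tostring(root, encoding="unicode")


def send_correction(sock, robot_addr, dx, dy, dz, ipoc):
    """Fire one correction datagram over UDP (RSI exchanges packets each cycle)."""
    sock.sendto(pose_to_correction_xml(dx, dy, dz, ipoc).encode(), robot_addr)
```

In a Unity- or Unreal-based setup, the per-frame controller delta would feed `pose_to_correction_xml` on every RSI cycle; the clamp is what keeps an abrupt hand gesture from translating into an abrupt robot motion.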
During the quarter we focus on the concepts of telepresence, real-time interaction, compositing, cyber-physical systems, and gesture control. The students will learn and improve their skills in computational design, digital fabrication of large-scale objects, robotics, projection mapping, Virtual and Augmented Reality, compositing, and motion design. Each team has to design a spatial system in the form of an enclosure that can transform according to the possible motion of the small KUKA Agilus robots. We develop the concept design through digital animation, simulation, and small kinetic models. The design is attached to the robots as an extended end-effector; each robot receives an end-effector that holds a part of the enclosure or panel. Therefore, we also investigate how to design and build end-effectors for robotic systems with the ability to transform.