Keynotes and Invited Talks

Prof Oussama Khatib

Stanford University

Deep-Sea Robotics Exploration: OceanOneK

Date, Time, Room

Abstract:

OceanOneK is a robotic diver with a high degree of autonomy for physical interaction with the marine environment. The robot’s advanced autonomous capabilities for physical interaction in the deep sea are combined with the cognitive abilities of a human expert through an intuitive haptic/stereo-vision interface. OceanOneK has been deployed in several archaeological expeditions in the Mediterranean, reaching depths of 1,000 meters, and was more recently tested in challenging tasks at Deep Dive Dubai. Distancing humans physically from dangerous and unreachable spaces while connecting their skills, intuition, and experience to the task promises to fundamentally alter remote work. These developments show how the synergy created by human-robot collaboration can expand our ability to reach new resources, build and maintain infrastructure, and perform disaster prevention and recovery operations, be it deep in oceans and mines, at mountain tops, or in space.

Bio:

Oussama Khatib received his PhD from Sup’Aero, Toulouse, France, in 1980. He is Professor of Computer Science and Director of the Robotics Laboratory at Stanford University. His research focuses on methodologies and technologies in human-centered robotics, haptic interactions, artificial intelligence, and human motion synthesis and animation. He is President of the International Foundation of Robotics Research (IFRR) and an IEEE Fellow. He is Editor of the Springer STAR and SPAR series and of the Springer Handbook of Robotics. He is the recipient of the IEEE Robotics and Automation Society Pioneer Award, the George Saridis Leadership Award, the Distinguished Service Award, the Japan Robot Association (JARA) Award, the Rudolf Kalman Award, and the IEEE Technical Field Award. Professor Khatib is a Knight of the National Order of Merit and a member of the National Academy of Engineering.

Magnus Egerstedt

UC Irvine

Assured Autonomy, Self-Driving Cars, and the Robotarium

Date, Time, Room

Abstract:

Long-duration autonomy, where robots are deployed over longer time scales outside of carefully curated labs, is fundamentally different from its “short-duration” counterpart in that whatever might go wrong will, sooner or later, go wrong. This means that stronger performance guarantees are needed. For instance, in the US, a road fatality occurs roughly every 100 million miles, which means that for an autonomous vehicle to live up to its promise of being safer than human-driven vehicles, that is the benchmark against which it must be compared. But a lot of strange and unpredictable things happen on the road during a 100-million-mile journey, i.e., rare events are all of a sudden not so rare, and the tails of the distributions must be accounted for. The resulting notion of “assured autonomy” has implications for how goals and objectives should be combined, how information should be managed, and how learning processes should be endowed with safety guarantees. In this talk, we will discuss these issues, instantiated on the Robotarium, a remotely accessible swarm-robotics lab that has been in (almost) continuous operation for over five years and has supported over 7,500 remotely managed autonomy missions.
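
To put the 100-million-mile benchmark in perspective, here is a minimal back-of-the-envelope sketch (ours, not from the talk) that models rare events as a Poisson process; the rate constant is purely illustrative:

```python
import math

# Illustrative only: roughly one road fatality per 100 million miles
# in the US, treated here as the rate of a Poisson process.
RATE_PER_MILE = 1 / 100_000_000  # assumed constant for this sketch

def prob_at_least_one(miles: float, rate: float = RATE_PER_MILE) -> float:
    """P(N >= 1) over `miles`, where N ~ Poisson(rate * miles)."""
    lam = rate * miles            # expected number of rare events
    return 1.0 - math.exp(-lam)   # P(N >= 1) = 1 - e^(-lambda)

for miles in (1e6, 1e7, 1e8, 1e9):
    print(f"{miles:>13,.0f} miles -> P(at least one event) = "
          f"{prob_at_least_one(miles):.4f}")
```

Over a single 100-million-mile journey, the probability of encountering at least one such event is already about 63%, which is the sense in which the tails of the distributions can no longer be ignored.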

Bio:

Dr. Magnus Egerstedt is the Stacey Nicholas Dean of Engineering in the Samueli School of Engineering and a Professor in the Department of Electrical Engineering and Computer Science at the University of California, Irvine. Prior to joining UCI, Egerstedt was on the faculty at the Georgia Institute of Technology, serving as Chair of the School of Electrical and Computer Engineering and Director of Georgia Tech’s Institute for Robotics and Intelligent Machines. He received the M.S. degree in Engineering Physics and the Ph.D. degree in Applied Mathematics from the Royal Institute of Technology, Stockholm, Sweden, and the B.A. degree in Philosophy from Stockholm University, and was a Postdoctoral Scholar at Harvard University. Dr. Egerstedt conducts research in the areas of control theory and robotics, with a particular focus on the control and coordination of multi-robot systems. He is a Fellow of the IEEE and IFAC, and a Foreign Member of the Royal Swedish Academy of Engineering Sciences. He has received a number of teaching and research awards, including the Ragazzini Award, the O. Hugo Schuck Best Paper Award, the Outstanding Doctoral Advisor Award and the Outstanding Teacher Award from Georgia Tech, and the Alumni of the Year Award from the Royal Institute of Technology.

Dr. John G. Blitch

Blitz Solutions LLC

Pinning Donkey Tails: Ethical Accountability in High Dynamic Tele-Tactical Operations

Date, Time, Room

Abstract:

Parallel to the sliding scale from direct tele-operation to full autonomy lies an equivalent gradient of responsibility for how an unmanned system behaves or fails to behave. Given the escalation of drone warfare in Ukraine and the Middle East, we can no longer kick the robo-ethics can down the road. Killer robots have arrived, whether we like it or not. Whom do we hold accountable for robotic war crimes? How can we update the Geneva Conventions to accommodate unmanned systems technology? How does our pursuit of telepresence affect these issues across multiple sensing modalities and decision levels? We introduce the notion of decision density as an initial scaffold with which to discuss these complex issues in the context of dynamic telepresence for urgent, high-risk activities.

Bio:

John G. Blitch is a cognitive psychologist whose work focuses on naturalistic neurophysiological monitoring of human-robot teams performing dynamic, high-risk tasks in extreme conditions. He holds a BSc in Engineering from the US Military Academy at West Point, an MSc in Math and Computer Science from the Colorado School of Mines, and a PhD and second MSc in Cognitive Psychology from Colorado State University.

Lieutenant Colonel Blitch’s military experience derives from multiple command tours in nuclear weapons delivery, special forces, hostage rescue, and robot-assisted emergency response. He holds additional military skill identifiers in Artificial Intelligence, Robotics, Space Operations, and a multitude of special operations activities. In addition to his military awards, Dr. Blitch is the recipient of the Lawler Award from the Association for Computing Machinery for his pioneering work as the founding director of CRASAR (the Center for Robot-Assisted Search and Rescue) and was inducted into the Space Technology Hall of Fame in 2006.

Allison Okamura

Professor, Department of Mechanical Engineering, Stanford University

Haptics Anywhere: Enabling Mobile Telepresence

Date, Time, Room

Abstract:

Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to be mobile, their physical and computational aspects must be intuitive and unobtrusive. The amount of information that can be transmitted through touch is limited in large part by the location, distribution, and sensitivity of human mechanoreceptors. Not surprisingly, many haptic devices are designed to be held or worn at the highly sensitive fingertips, yet stimulation using a device attached to the fingertips precludes natural use of the hands. Thus, we explore the design of a wide array of haptic feedback mechanisms, ranging from devices that can be actively touched by the fingertips to multi-modal haptic actuation mounted on the arm. We demonstrate how these devices are effective in telepresence scenarios for task training, robot-assisted surgery, and social touch.

Bio:

Allison M. Okamura received the BS degree from the University of California at Berkeley in 1994, and the MS and PhD degrees from Stanford University in 1996 and 2000, respectively, all in mechanical engineering. She is the Richard W. Weiland Professor of Engineering at Stanford University in the Department of Mechanical Engineering, with a courtesy appointment in Computer Science. She was previously Professor and Vice Chair of Mechanical Engineering at Johns Hopkins University. She is currently a Deputy Director of the Wu Tsai Neurosciences Institute at Stanford and a Science Fellow/Senior Fellow (courtesy) at the Hoover Institution. She is currently co-program chair of the ICRA@40 conference and has been Editor-in-Chief of IEEE Robotics and Automation Letters, Associate Editor of the IEEE Transactions on Haptics, Editor-in-Chief of the IEEE International Conference on Robotics and Automation Conference Editorial Board, an Editor of the International Journal of Robotics Research, and Co-Chair of the IEEE Haptics Symposium. Her awards include the 2020 IEEE Engineering in Medicine and Biology Society Technical Achievement Award, the 2019 IEEE Robotics and Automation Society Distinguished Service Award, the 2016 Duca Family University Fellow in Undergraduate Education, the 2009 IEEE Technical Committee on Haptics Early Career Award, the 2005 IEEE Robotics and Automation Society Early Academic Career Award, and the 2004 NSF CAREER Award. She is an IEEE Fellow. Her academic interests include haptics, teleoperation, virtual environments and simulators, medical robotics, soft robotics, neuromechanics and rehabilitation, prosthetics, and education. Outside academia, she enjoys spending time with her husband and two children, running, and playing ice hockey.

Prof Holly Yanco

University of Massachusetts Lowell; Director, NERVE Center

Designing Effective Telepresence for Human-Robot Interaction

Date, Time, Room

Abstract:

We provide a retrospective of over two decades of research on the design and evaluation of human-robot interaction with remote robot systems. We discuss a wide variety of interaction modalities: keyboard and mouse, game controllers, joysticks, multi-touch devices, and virtual reality. We show examples from applications such as telepresence robots in office environments, search and rescue robots, and human-in-the-loop planning for manipulation and mobility in remote environments. The talk also includes results from a study of the human-robot interaction methods used at the DARPA Robotics Challenge Trials and Finals. Finally, we provide design guidelines and best practices grounded in findings from real-world applications.

Bio:

Dr. Holly Yanco is a Distinguished University Professor and Chair of the Richard A. Miner School of Computer & Information Sciences, as well as Director of the New England Robotics Validation and Experimentation (NERVE) Center, all at the University of Massachusetts Lowell. For over 25 years, she has assembled and led a wide range of interdisciplinary collaborations to solve open problems in robotics and AI. Her research interests include human-robot interaction, evaluation metrics and methods for robot systems, and the use of robots in K-12 education to broaden participation in computer science. Yanco’s research has been funded by NSF, including a CAREER Award, the Advanced Robotics for Manufacturing (ARM) Institute, ARO, DARPA, DOE-EM, ONR, NASA, NIST, Amazon Robotics, Google, Microsoft, and Verizon. Yanco is a member of the Computing Research Association’s Computing Community Consortium (CCC) Council and the DARPA ISAT Study Group. She served as Co-Chair of the Steering Committee for the ACM/IEEE International Conference on Human-Robot Interaction from 2013 to 2016, as General Chair of the 2012 ACM/IEEE International Conference on Human-Robot Interaction, and as a member of the Executive Council of the Association for the Advancement of Artificial Intelligence (AAAI) from 2006 to 2009. She has a PhD and MS in Computer Science from the Massachusetts Institute of Technology and a BA in Computer Science and Philosophy from Wellesley College. Yanco is a Fellow of the Association for the Advancement of Artificial Intelligence (AAAI) and of the American Association for the Advancement of Science (AAAS).

Dr Ming Hou

Defence Research and Development Canada

AI, Autonomy, and Digital Reality: Frontier of Symbiotic Telepresence Technologies for Successful Teleoperations

Date, Time, Room

Abstract:

What is left behind when humans perform tasks in a remote workplace? Teleoperations must take into account visual, auditory, tactile, and somatogravic sensory issues with associated physiological threats (modality underload and fatigue), communication delay or datalink loss, indirect control, and disorientation and perception concerns (sound, speed, pressure, temperature). To address these challenges, artificial intelligence (AI), Autonomy, and Digital Reality (Virtual Reality and Mixed Reality) can provide cutting-edge solutions as frontier telepresence technologies for system performance augmentation. This talk discusses how these emerging disruptive technologies can be leveraged and integrated to grapple with these issues and support the decision-making process based on an interaction-centred design paradigm. A real-world example of a large-scale international exercise will be introduced to explain the operational and legal challenges of managing multiple uncrewed systems remotely and simultaneously in different mission environments. Related AI, Autonomy, and Digital Reality technologies serving as symbiotic decision aids will be highlighted to demonstrate their utility in tackling these issues. Feedback from participating subject matter experts on the effectiveness of these telepresence technologies will be briefed to illustrate the implications for human-technology collaboration strategy, interaction methodology, and team trustworthiness.

Bio:

Dr. Hou is a Principal Scientist and Authority in Human-Technology Interactions within the Department of National Defence (DND), Canada. He is responsible for delivering cutting-edge technological solutions, science-based advice, and evidence-based policy recommendations on AI, Autonomy, and Telepresence science, technology, and innovation strategies to senior decision makers within DND and their national and international partner organizations, including the United Nations. As the Canadian national leader in human-AI/autonomy teaming (HAT), he directs the Canadian innovation ecosystems to deliver science and technology capability development programs and to support major acquisition projects and large-scale live, virtual, and constructive international joint exercises. As Co-Chair of an international Human-Autonomy Teaming Specialist Committee, he leads the development of international standards for the integration of AI-enabled autonomous systems into civilian airspace. His book, “Intelligent Adaptive Systems: An Interaction-Centered Design Perspective,” is considered authoritative and outlines a systemic approach to human-AI/autonomy symbiotic collaboration. It has been instrumental in the development of the HAT R&D roadmap, novel industrial autonomy technologies, new international standards, and AI and autonomy policy and regulation frameworks. Dr. Hou is the recipient of DND’s most prestigious Science and Technology Excellence Award in 2020 and the President’s Achievement Award of the Professional Institute of the Public Service of Canada in 2021. He is an IEEE Fellow and Distinguished Lecturer, and the General Chair of the 2024 IEEE International Conference on Human-Machine Systems and the International Defence Excellence and Security Symposium. Dr. Hou is also an Adjunct Professor at the University of Toronto and the University of Calgary.

Dr Neal Y Lii

DLR (German Aerospace Center), Institute of Robotics and Mechatronics; Domain Head, Autonomy and Teleoperation

Scalable Autonomy: The many different ways of using robots to realize telepresence in space and on Earth

Date, Time, Room

Abstract:

Telepresence brings us closer to the people and places we want to be. Today, if we cannot be there in person, at least we can be there virtually. Robots have become our best partners and avatars on site. Their manipulators give them the dexterity to interact with objects and environments, wheels and legs give them the mobility to survey large areas, and a growing plethora of cameras and sensors allows them to see and touch their surroundings. These capabilities let us experience a different world through the robot. Paired with ever-advancing robot intelligence, we can use robots as our physical extension on location and delegate tasks to them to execute under our command or supervision. The idea of an intuitive and easy transition from deep immersion and involvement to physical and mental workload sharing is the basis of Scalable Autonomy. User interface designs with different modalities, including visual, textual, auditory, force-reflection, and tactile interaction, let us command, and receive meaningful feedback from, a robot or a team of robots carrying out large, complex tasks in different places. With them, we can explore space, rescue and repair in dangerous locations, and perform minimally invasive surgery on a patient from afar, all while being intimately involved (when we want to be). Together, we will look at the advances we have made so far in space and on Earth, and ponder the new technologies and use cases that will help us be virtually present at the places we want to be, the way we want to be.

Bio:

Neal Y. Lii is the Domain Head of Space Robotic Assistance and co-founding head of the Modular Dexterous (Modex) Robotics Laboratory at the Institute of Robotics and Mechatronics, German Aerospace Center (DLR), in Oberpfaffenhofen, Germany. Prior to joining DLR, he worked in the automotive sector in Silicon Valley and at BMW in Germany, where he studied drive-by-wire systems. Neal received his BS from Purdue University, MS from Stanford University, and PhD from the University of Cambridge.

Neal works on telerobotics and the different modalities of user interface (UI) for space and terrestrial applications. His hope is to enable people to effectively and effortlessly command complex robots and robotic teams. Aside from diving into hand-arm and other multi-modal UI systems that can give users an immersive experience down to the fingertips, he has also served as the principal investigator of two space telerobotic experiments, METERON SUPVIS Justin, and Surface Avatar. These experiments look at how astronauts on board the International Space Station (ISS) can command different robots on Earth with different UI designs and command modalities, and by extension, how we can command teams of robots in future space missions.

Prof Saeid Nahavandi

Swinburne University of Technology, Australia

Haptically Enabled Robotics Systems

Date, Time, Room

Abstract:

This presentation will showcase advancements in tele-robotics, tele-presence, and haptics technologies, highlighting the integration of haptics for delivering tactile feedback in mission-critical tele-operations. This integration is poised to transform the way robots interact with humans by enabling them to sense and respond to “touch-and-feel” characteristics. Such advancements will enhance efficiency and collaboration between robots and humans, especially in tasks that are demanding, hazardous, or unhygienic. Furthermore, the discussion will delve into the foundational research on haptics in robotics and tele-operations, underlining its significance in diverse applications. Granting robots the ability to perceive and react to physical sensations elevates their capabilities across various industries. Ultimately, the aim is to illustrate how haptics can enable robots to coexist seamlessly with humans and overcome challenges that were previously considered formidable.

Bio:

Distinguished Professor Saeid Nahavandi is Swinburne University of Technology’s inaugural Associate Deputy Vice-Chancellor Research and Chief of Defence Innovation. He previously served as Pro Vice-Chancellor (Defence Technologies) and Founding Director of the Institute for Intelligent Systems Research and Innovation at Deakin University. He received a Ph.D. from Durham University, U.K., in 1991. His research interests include autonomous systems, modeling of complex systems, robotics, and haptics. He has published over 1,200 scientific papers in various international journals and conferences. Saeid was the recipient of the Clunies Ross Entrepreneur of the Year Award 2022 from the Australian Academy of Technological Sciences and Engineering, Researcher of the Year at the 2021 Australian Space Awards, Innovator of the Year at the Australian Defence Industry Awards, the Essington Lewis Award, and Professional Engineer of the Year at the Australian Engineering Excellence Awards. He has carried out industry-based research with several major international companies, such as Airbus, Boeing, Bosch, Ford Motor Company, General Motors, General Dynamics, Holden, Lockheed Martin, Nissan, Thales, and Vestas, to name a few. Professor Nahavandi holds six patents, two of which have resulted in two very successful start-ups. He is Vice President for Human-Machine Systems of the IEEE Systems, Man, and Cybernetics Society, a Senior Associate Editor of the IEEE Systems Journal, an Associate Editor of IEEE Transactions on Cybernetics, and an IEEE Press Editorial Board member. He is a Fellow of the IEEE (FIEEE), Engineers Australia (FIEAust), and the Institution of Engineering and Technology (FIET), and a Fellow of the Australian Academy of Technology and Engineering (ATSE). Saeid was the General Chair of IEEE SMC 2021.

Dr. Jeng Yen

NASA-JPL

Telerobotic Operations on the Martian Surface in the Last Two Decades

Date, Time, Room

Abstract:

The exploration of Mars has been revolutionized by telerobotics, a field that combines telecommunications and robotics to enable remote operation of rovers and landers on the Martian surface. Since 2003, NASA’s Jet Propulsion Laboratory (JPL) has been at the forefront of this technology, deploying missions like Curiosity and Perseverance to traverse the alien terrain and conduct groundbreaking research. These rovers have brought significant advancements in on-board autonomy and ground data systems, allowing for more efficient and productive missions. For example, the Perseverance rover has achieved remarkable speeds and sample collection rates, surpassing its predecessor, Curiosity. This increase in productivity stems from both evolutionary software developments and process improvements.

The Robot Sequencing and Visualization Program (RSVP) is human-in-the-loop user interface software in the missions’ Ground Data System, designed with a visual-based human interface and a command-text-based robot interface, the two connected through a deep-linking message-passing system. RSVP uses 3D visualization of the reconstructed Martian environment and NVIDIA stereo-vision technology to let a human operator create and verify robot activities; the corresponding commands for the robot are then auto-generated and validated using flight-software-in-the-loop simulation. RSVP’s integration of human-in-the-loop tele-robotic operations exemplifies the practical application of tele-presence in space exploration technology. The advancements in robotic operation software over the past twenty years have contributed significantly to the success of Mars exploration missions. By integrating telepresence technology, we have enhanced collaboration between humans and robots, allowing more complex and nuanced tasks to be completed. However, adapting the RSVP framework for lunar-surface and near-Earth-orbit missions presents unique challenges, including latency issues and the need for autonomous decision-making capabilities in robots.
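
To make the workflow concrete, here is a purely illustrative Python sketch (ours, not JPL code; every name below is hypothetical) of the create, auto-generate, validate, uplink loop that the paragraph above describes:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    """An operator-authored robot activity (e.g., a drive or arm motion)."""
    name: str
    params: dict

def generate_commands(activities: list[Activity]) -> list[str]:
    # Stand-in for RSVP's auto-generation of command text from activities
    # authored in the 3D visualization of the Martian environment.
    return [f"{a.name.upper()} {a.params}" for a in activities]

def flight_software_in_the_loop_ok(commands: list[str]) -> bool:
    # Stand-in for flight-software-in-the-loop simulation; a real system
    # would replay the commands against the flight software and check
    # kinematic, resource, and safety constraints.
    return all(len(c) < 200 for c in commands)  # trivial placeholder check

def plan_sol(activities: list[Activity]) -> list[str]:
    """Create, auto-generate, validate; returns a sequence fit to uplink."""
    commands = generate_commands(activities)
    if not flight_software_in_the_loop_ok(commands):
        raise ValueError("sequence failed flight-software-in-the-loop validation")
    return commands

print(plan_sol([Activity("drive", {"distance_m": 25.0}),
                Activity("collect_sample", {"tube_id": 7})]))
```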

Bio:

Dr. Jeng Yen’s extensive experience with JPL’s robotic operations has been pivotal in advancing the tele-robotic operation and telepresence capabilities for Mars surface missions. Over the past 25 years, he has led the design and development of the robotic operation software, the Robot Sequencing and Visualization Program (RSVP). RSVP has been used to construct and validate all the robotic sequences for the Mars Exploration Rovers, the Phoenix lander, the Mars Science Laboratory Curiosity rover, the InSight mission’s robotic lander, and the Mars 2020 mission’s Perseverance rover and Ingenuity helicopter. With over one hundred thousand hours of teleoperation, RSVP has operated seven robotic spacecraft on the surface of Mars over a total of eighteen Martian years.

Dr. Yen’s pioneering work in developing RSVP capabilities has significantly enhanced the way humans interact and collaborate with robots, especially in the context of Mars surface exploration. By integrating telepresence technology, Dr. Yen has enabled operators to control robots from Earth with greater precision and reliability, overcoming the challenges posed by vast interplanetary distances. The synergy of human cognitive skills and robotic accuracy that Dr. Yen’s research emphasizes is not only advancing our scientific endeavors on Mars but also setting a new standard for remote operations in space. His dedication to pushing the boundaries of technology exemplifies the spirit of innovation that drives interplanetary science and discovery forward. Dr. Yen’s academic background, with a Ph.D. in Applied Mathematics and a minor in Mechanical Engineering from the University of Iowa, has provided a strong foundation for his contributions to JPL’s robotic missions. His work ensures that the teleoperation and telepresence tools continue to support the Mars rover and lander missions effectively.