Keynotes and Invited Talks
Prof Oussama Khatib
Deep-Sea Robotics Exploration: OceanOneK
Date, Time, Room
Abstract:
OceanOneK is a robotic diver with a high degree of autonomy for physical interaction with the marine environment. The robot’s advanced autonomous capabilities for physical interaction in the deep sea are combined with the cognitive abilities of a human expert through an intuitive haptic/stereo-vision interface. OceanOneK, which can reach depths of 1000 meters, was deployed in several archaeological expeditions in the Mediterranean; more recently, the robot was tested in challenging tasks at Deep Dive Dubai. Distancing humans physically from dangerous and unreachable spaces while connecting their skills, intuition, and experience to the task promises to fundamentally alter remote work. These developments show how the synergy induced by human-robot collaboration can expand our abilities to reach new resources, build and maintain infrastructure, and perform disaster prevention and recovery operations – be it deep in oceans and mines, at mountain tops, or in space.
Bio:
Oussama Khatib received his PhD from Sup’Aero, Toulouse, France, in 1980. He is Professor of Computer Science and Director of the Robotics Laboratory at Stanford University. His research focuses on methodologies and technologies in human-centered robotics, haptic interactions, artificial intelligence, human motion synthesis, and animation. He is President of the International Foundation of Robotics Research (IFRR) and an IEEE Fellow. He is Editor of the Springer STAR and SPAR series and of the Springer Handbook of Robotics. He is the recipient of the IEEE Robotics and Automation Pioneering Award, the George Saridis Leadership Award, the Distinguished Service Award, the Japan Robot Association (JARA) Award, the Rudolf Kalman Award, and the IEEE Technical Field Award. Professor Khatib is a Knight of the National Order of Merit and a member of the National Academy of Engineering.
Magnus Egerstedt
Assured Autonomy, Self-Driving Cars, and the Robotarium
Date, Time, Room
Abstract:
Long-duration autonomy, where robots are to be deployed over longer time-scales outside of carefully curated labs, is fundamentally different from its “short-duration” counterpart in that whatever might go wrong sooner or later will go wrong. What this means is that stronger performance guarantees are needed. For instance, in the US, a road fatality occurs roughly every 100 million miles, which means that for an autonomous vehicle to live up to its promise of being safer than human-driven vehicles, that is the benchmark against which it must be compared. But a lot of strange and unpredictable things happen on the road during a 100 million mile journey, i.e., rare events are all of a sudden not so rare, and the tails of the distributions must be accounted for. The resulting notion of “assured autonomy” has implications for how goals and objectives should be combined, how information should be managed, and how learning processes should be endowed with safety guarantees. In this talk, we will discuss these issues, instantiated on the Robotarium, which is a remotely accessible swarm robotics lab that has been in (almost) continuous operation for over five years, participating in over 7,500 remotely managed autonomy missions.
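As a back-of-the-envelope illustration of the benchmark above (an editorial sketch, not material from the talk itself): under a simple Poisson model of fatalities, one can estimate how many fatality-free test miles would be needed to claim, with 95% confidence, that a fleet meets the one-per-100-million-mile benchmark.

```python
import math

# Benchmark from the abstract: roughly one US road fatality
# per 100 million miles driven.
rate = 1 / 100_000_000  # fatalities per mile

# Under a Poisson model, the probability of observing zero fatalities
# in n miles at this rate is exp(-rate * n). To claim with 95% confidence
# that a fleet's true rate is no worse than the benchmark, that
# probability must fall below 5%:
confidence = 0.95
miles_needed = math.log(1 / (1 - confidence)) / rate

print(f"{miles_needed:,.0f} fatality-free miles needed")  # ~300 million
```

At roughly 300 million miles of testing, encountering rare tail events is unavoidable, which is precisely the abstract’s point about long-duration deployments.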
Bio:
Dr. Magnus Egerstedt is the Stacey Nicholas Dean of Engineering in the Samueli School of Engineering and a Professor in the Department of Electrical Engineering and Computer Science at the University of California, Irvine. Prior to joining UCI, Egerstedt was on the faculty at the Georgia Institute of Technology, serving as Chair of the School of Electrical and Computer Engineering and Director of Georgia Tech’s Institute for Robotics and Intelligent Machines. He received the M.S. degree in Engineering Physics and the Ph.D. degree in Applied Mathematics from the Royal Institute of Technology, Stockholm, Sweden, and the B.A. degree in Philosophy from Stockholm University, and was a Postdoctoral Scholar at Harvard University. Dr. Egerstedt conducts research in the areas of control theory and robotics, with particular focus on control and coordination of multi-robot systems. Magnus Egerstedt is a Fellow of IEEE and IFAC, and is a Foreign Member of the Royal Swedish Academy of Engineering Science. He has received a number of teaching and research awards, including the Ragazzini Award, the O. Hugo Schuck Best Paper Award, the Outstanding Doctoral Advisor Award and the Outstanding Teacher Award from Georgia Tech, and the Alumni of the Year Award from the Royal Institute of Technology.
John Blitch
Title
Date, Time, Room
Abstract:
Bio:
Allison Okamura
Haptics Anywhere: Enabling Mobile Telepresence
Date, Time, Room
Abstract:
Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to be mobile, their physical and computational aspects must be intuitive and unobtrusive. The amount of information that can be transmitted through touch is limited in large part by the location, distribution, and sensitivity of human mechanoreceptors. Not surprisingly, many haptic devices are designed to be held or worn at the highly sensitive fingertips, yet stimulation using a device attached to the fingertips precludes natural use of the hands. Thus, we explore the design of a wide array of haptic feedback mechanisms, ranging from devices that can be actively touched by the fingertips to multi-modal haptic actuation mounted on the arm. We demonstrate how these devices are effective in telepresence scenarios for task training, robot-assisted surgery, and social touch.
Bio:
Allison M. Okamura received the BS degree from the University of California at Berkeley in 1994, and the MS and PhD degrees from Stanford University in 1996 and 2000, respectively, all in mechanical engineering. She is the Richard W. Weiland Professor of Engineering at Stanford University in the mechanical engineering department, with a courtesy appointment in computer science. She was previously Professor and Vice Chair of mechanical engineering at Johns Hopkins University. She is currently a deputy director of the Wu Tsai Stanford Neurosciences Institute and a Science Fellow/Senior Fellow (courtesy) at the Hoover Institution. She is currently co-program chair of the ICRA@40 Conference and has been editor-in-chief of the journal IEEE Robotics and Automation Letters, associate editor of the IEEE Transactions on Haptics, editor-in-chief of the IEEE International Conference on Robotics and Automation Conference Editorial Board, an editor of the International Journal of Robotics Research, and co-chair of the IEEE Haptics Symposium. Her awards include the 2020 IEEE Engineering in Medicine and Biology Society Technical Achievement Award, 2019 IEEE Robotics and Automation Society Distinguished Service Award, 2016 Duca Family University Fellow in Undergraduate Education, 2009 IEEE Technical Committee on Haptics Early Career Award, 2005 IEEE Robotics and Automation Society Early Academic Career Award, and 2004 NSF CAREER Award. She is an IEEE Fellow. Her academic interests include haptics, teleoperation, virtual environments and simulators, medical robotics, soft robotics, neuromechanics and rehabilitation, prosthetics, and education. Outside academia, she enjoys spending time with her husband and two children, running, and playing ice hockey.
Prof Imre Rudas
Bridging Human and System Interaction: Spatial Solutions for System of Systems
Date, Time, Room
Abstract:
In the complex realm of system of systems, engineers and designers face significant challenges in managing and supervising intricate, multi-layered structures. The need for rapid comprehension and intuitive interaction is often hampered by the dense abstraction layers inherent in these systems. Traditional user interfaces, burdened by their abstract nature, can significantly increase cognitive load and induce fatigue among users. However, the evolution of 3D and web technologies over the past decades has paved the way for high-performance and realistic visualizations, enabling a transformative shift in interface design.
This presentation introduces a natively spatial approach to Human – System of Systems Interaction, leveraging these technological advancements to create less abstract, more intuitive user interfaces. A key feature of this approach is the integration of an Industrial Internet of Things (IIoT) ecosystem, which enhances the system’s capabilities by enabling data-driven decision support. This integration not only simplifies user interactions but also improves the efficiency and accuracy of decisions in complex operational environments.
By presenting real industrial use cases, the talk will demonstrate how this approach can significantly reduce cognitive burdens and enhance user efficiency and satisfaction. Attendees will gain insights into how the fusion of advanced visualization technologies, IIoT integrations and user-centered design principles can revolutionize the interaction with complex systems, paving the way for more effective and user-friendly supervision and management of these intricate environments with optimized operational outcomes.
Bio:
Prof Holly Yanco
Title
Date, Time, Room
Abstract:
Bio:
Prof Witold Kinsner
Towards a Federated LEO Satellites and Terrestrial System (LEO SatS) for Real-Time Telepresence and Teleoperations
Date, Time, Room
Abstract:
A federated LEO SatS is a cognitive, symbiotic, and memetic system intended to enhance communications with low latency, high-precision and high-accuracy timing, data processing at the edges, and optimal decision-making in extreme environments. It requires high autonomy, preservation and maintenance of causality, security, and awareness of the need for symbiotic coexistence. Such a system could be used for telepresence and teleoperations in diverse applications.
The first part of this six-part talk will describe various attempts to develop improved terrestrial networks, with a focus on high-precision and accurate time handling with inexpensive atomic clocks, and corresponding improvements in network protocols. The second part will describe the design, development, and deployment of nano- and femto-satellites by university/college students, with examples from our seven generations of such satellites. The third part will focus on the need for improved ground stations, and their networking with a high level of autonomy and edge computing, to receive low-power noisy signals from fast-moving constellations of objects, as well as to send control signals to those objects. The fourth part will describe ways of accelerating the development of the federated LEO SatS through three-level student competitions: (i) undergraduate final-year Capstone projects, (ii) graduate research projects, and (iii) IEEE HKN student projects. The fifth part will discuss how to make such a terrestrial/space system secure, because any hacking could become extraordinarily dangerous. The sixth part will describe experiential learning and teaching of educators and STEAM teachers through workshops on satellites and courses on the certification of their operators. We hope that this talk will encourage the audience to engage their students in these projects.
Bio:
Dr Witold Kinsner is Professor Emeritus and Senior Scholar in the Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, Canada, where he helped establish the Computer Engineering program beginning in the late 1970s.
He obtained his PhD degree in Electrical and Computer Engineering from McMaster University in the area of computer bubble memories in 1974 and became an Assistant Professor at McMaster University and later at McGill University. He was a co-founder and Director of Research of the Industrial Applications of Microelectronics Centre from 1979 to 1987, and spent his sabbatical at National Semiconductor, Santa Clara, CA, developing resilient computer memories for extreme environments. He is a registered Professional Engineer in Canada.
His main research focus is on symbiotic cognitive systems and computing engines for extreme environments. He has authored and co-authored over 880 publications in his research areas, including books and patents, as well as supervised over 80 Post-Doctoral Fellows and Doctoral and Master’s graduate students, over 200 undergraduate final-year thesis/capstone project students, and mentored over 35 summer research students. He has taught over 200 undergraduate and graduate university courses in computer engineering. He established teams of students to design, build, test, and deploy nano-satellites in 2010. With many colleagues, he has been leading the IEEE Low-Earth Orbit Satellites and Systems (LEO SatS) Initiative of the IEEE Future Directions since 2021. His ORCID number is 0000-0002-6759-1410, and his Erdős number is 2.
He is a Fellow of several organizations and has been active in IEEE since 1971, including IEEE R7 (Canada) President / Director 2016-17 (PE 2014-15; PP 2018-19) and IEEE Vice President of Educational Activities 2018, 2019 (IP-VP 2020-22). He is an IEEE HKN Member (Eta Chapter), a member of 10 IEEE Societies, and is serving on many IEEE committees.
Dr Ming Hou
AI, Autonomy, and Digital Reality: Frontier of Symbiotic Telepresence Technologies for Successful Teleoperations
Date, Time, Room
Abstract:
What is left behind when humans perform tasks in a remote workplace? Teleoperations need to take into account visual, auditory, tactile, and somatogravic sensory issues with associated physiological threats (modality underload and fatigue), communication delay or datalink loss, indirect control, and disorientation and perception concerns (sound, speed, pressure, temperature). To address these challenges, artificial intelligence (AI), Autonomy, and Digital Reality (Virtual Reality and Mixed Reality) can provide cutting-edge solutions as the frontier telepresence technologies for system performance augmentation. This talk discusses how these emerging disruptive technologies can be leveraged and integrated to grapple with these issues and support the decision-making process based on an interaction-centred design paradigm. A real-world example of a large-scale international exercise will be introduced to explain the operational and legal challenges of managing multiple uncrewed systems remotely and simultaneously in different mission environments. Related AI, Autonomy, and Digital Reality technologies as symbiotic decision aids will be highlighted to demonstrate their utility in tackling these issues. Participating subject matter experts’ feedback on the effectiveness of these telepresence technologies will be briefed to illustrate their implications for human-technology collaboration strategy, interaction methodology, and team trustworthiness.
Bio:
Dr. Hou is a Principal Scientist and Authority in Human-Technology Interactions within the Department of National Defence (DND), Canada. He is responsible for delivering cutting-edge technological solutions, science-based advice, and evidence-based policy recommendations on AI, Autonomy, and Telepresence science, technology, and innovation strategies to senior decision makers within DND and their national and international partner organizations, including the United Nations. As the Canadian National Leader in human-AI/autonomy teaming (HAT), he directs the Canadian innovation ecosystems to deliver science and technology capability development programs and support major acquisition projects and large-scale live, virtual, and constructive international joint exercises. As the Co-Chair of an international Human-Autonomy Teaming Specialist Committee, he leads the development of international standards for the integration of AI-enabled autonomous systems into civilian airspace. His book entitled “Intelligent Adaptive Systems – An Interaction-Centered Design Perspective” is considered authoritative and outlines a systemic approach to Human-AI/Autonomy Symbiotic Collaborations. It has been instrumental in the development of the HAT R&D roadmap, novel industrial autonomy technologies, new international standards, and AI and autonomy policy and regulation frameworks. Dr. Hou is the recipient of the most prestigious DND Science and Technology Excellence Award in 2020 and the President’s Achievement Award of the Professional Institute of the Public Service of Canada in 2021. He is an IEEE Fellow, Distinguished Lecturer, and the General Chair of the 2024 IEEE International Conference on Human-Machine Systems and International Defence Excellence and Security Symposium. Dr. Hou is also an Adjunct Professor at the University of Toronto and University of Calgary.
Dr Neal Y Lii
Scalable Autonomy: The many different ways of using robots to realize telepresence in space and on Earth
Date, Time, Room
Abstract:
Telepresence brings us closer to the people and places we want to be. Today, if we cannot be there in person, at least we can virtually. Robots have become our best partners and avatars on site. Their manipulators give them dexterity to interact with objects and environments. Wheels and legs give them the mobility to survey large areas, and a growing plethora of cameras and sensors allows them to see and touch their surroundings. These capabilities allow us to experience a different world through the robot. Paired with ever-advancing robot intelligence, we can use robots as our physical extension on-location, and delegate tasks to them to execute under our command or supervision. The idea of an intuitive and easy transition from deep immersion and involvement to physical and mental workload sharing is the basis of Scalable Autonomy. User interface designs with different modalities, including visual, textual, auditory, force reflection, and tactile interaction, let us command and receive meaningful feedback from a robot, or a team of robots, to carry out large complex tasks in different places. With them, we can explore space, rescue and repair in dangerous locations, and perform minimally invasive surgery on a patient from afar, all while being intimately involved (when we want to be). Together, we will look at the advances we have made so far in space and on Earth, and ponder the new technologies and use cases to help us be virtually present at the places we want to be, the way we want to be.
Bio:
Neal Y. Lii is the Domain Head of Space Robotic Assistance, and co-founding head of the Modular Dexterous (Modex) Robotics Laboratory, at the Institute of Robotics and Mechatronics, German Aerospace Center (DLR) in Oberpfaffenhofen, Germany. Prior to joining DLR, he worked in the automotive sector in Silicon Valley, and at BMW in Germany, where he studied drive-by-wire systems. Neal received his BS from Purdue University, MS from Stanford University, and PhD from the University of Cambridge.
Neal works on telerobotics and the different modalities of user interface (UI) for space and terrestrial applications. His hope is to enable people to effectively and effortlessly command complex robots and robotic teams. Aside from diving into hand-arm and other multi-modal UI systems that can give users an immersive experience down to the fingertips, he has also served as the principal investigator of two space telerobotic experiments, METERON SUPVIS Justin, and Surface Avatar. These experiments look at how astronauts on board the International Space Station (ISS) can command different robots on Earth with different UI designs and command modalities, and by extension, how we can command teams of robots in future space missions.
Prof Saeid Nahavandi
Haptically Enabled Robotics Systems
Date, Time, Room
Abstract:
This presentation will showcase the advancements in tele-robotics, tele-presence, and haptics technologies, highlighting the integration of haptics for delivering tactile feedback in mission-critical tele-operations. This integration is poised to transform the way robots interact with humans by enabling them to sense and respond to “touch-and-feel” characteristics. Such advancements will enhance the efficiency and collaboration between robots and humans, especially in tasks that are demanding, hazardous, or unhygienic. Furthermore, the discussion will delve into the foundational research on haptics in robotics and tele-operations, underlining its significance in diverse applications. By granting robots the ability to perceive and react to physical sensations, we can elevate their capabilities across various industries. Ultimately, the aim is to illustrate how haptics can enable robots to coexist seamlessly with humans and overcome challenges that were previously considered formidable.
Bio:
Distinguished Professor Saeid Nahavandi is Swinburne University of Technology’s inaugural Associate Deputy Vice-Chancellor Research and Chief of Defence Innovation. He previously served as Pro Vice-Chancellor (Defence Technologies) and Founding Director of the Institute for Intelligent Systems Research and Innovation, Deakin University. Saeid Nahavandi received a Ph.D. from Durham University, U.K. in 1991. His research interests include autonomous systems, modeling of complex systems, robotics, and haptics. He has published over 1200 scientific papers in various international journals and conferences. Saeid was the recipient of the Clunies Ross Entrepreneur of the Year Award 2022 from the Australian Academy of Technological Sciences & Engineering, Researcher of the Year at the Australian Space Awards 2021, the Australian Defence Industry Awards’ Innovator of the Year, The Essington Lewis Awards, and the Australian Engineering Excellence Awards’ Professional Engineer of the Year. Saeid has carried out industry-based research with several major international companies such as Airbus, Boeing, Bosch, Ford Motor Company, General Motors, General Dynamics, Holden, Lockheed Martin, Nissan, Thales, and Vestas, to name a few. Professor Nahavandi holds six patents, two of which have resulted in very successful start-ups. Professor Nahavandi is the Vice President: Human-Machine Systems, IEEE SMCS, Senior Associate Editor of the IEEE Systems Journal, Associate Editor of IEEE Transactions on Cybernetics, and an IEEE Press Editorial Board member. Professor Nahavandi is a Fellow of IEEE (FIEEE), Engineers Australia (FIEAust), and the Institution of Engineering and Technology (FIET). Saeid is a Fellow of the Australian Academy of Technology and Engineering (ATSE). Saeid was the General Chair for IEEE SMC 2021.
Dr Kamak Ebadi
Title
Date, Time, Room
Abstract:
Bio: