SIRI Projects

Penn State Projects

Developing Human-Robot Collaboration for Semi-Automated Assembly of Small Parts

PI: Ilya Kovalenko
Small-part assemblies have sub-components that can take on various orientations and require careful handling to avoid damage. We have previously developed a robotic system that uses machine vision to identify part orientations and deposits these parts for assembly. However, this system is not very flexible to changes in part specifications and does not respond well to anomalies (e.g., tangled springs or obscured parts). Human operator feedback and assistance are often required to ensure an effective assembly process. The overall objective of the proposed research is to develop a semi-automated robotic assembly system that adapts to changes in subassembly configurations and operator capabilities. As part of the research, we want to explore the integration of voice-commanded operations during the assembly process through Large Language Models (LLMs). For example, when machine vision detects an error, the system notifies the human operator, who can then correct it through voice commands, reducing error rates and assembly time. This approach not only increases productivity but also improves access to customized products.
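The voice-command loop described above can be sketched as routing a transcribed operator utterance to one of a fixed set of corrective actions. In the sketch below, a simple keyword matcher stands in for the LLM call, and the action names (`retry_feed`, `reorient_part`, etc.) are hypothetical illustrations, not part of the actual system:

```python
# Minimal sketch: map a transcribed operator voice command to a
# corrective action on the assembly cell. A keyword matcher stands in
# for an LLM; in practice the LLM would do this intent mapping.
# All action names here are illustrative assumptions.

def parse_command(transcript: str) -> str:
    """Return one of a fixed set of corrective actions for an utterance."""
    text = transcript.lower()
    if "untangle" in text or "spring" in text:
        return "retry_feed"       # e.g., tangled springs in the feeder
    if "rotate" in text or "flip" in text:
        return "reorient_part"    # machine vision reported a wrong orientation
    if "skip" in text:
        return "skip_part"
    return "ask_operator"         # fall back to human assistance

print(parse_command("Please rotate that bracket ninety degrees"))
# prints: reorient_part
```

Constraining the LLM's output to a small action vocabulary like this is one common way to keep voice control safe: the model interprets free-form speech, but the robot only ever executes pre-approved operations.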

PURL Project Ideas Repository

PI: Eric Johnson
This project will occur at State College Airport; students on this project will require their own transportation. Students will perform advanced unmanned aircraft systems research, including flight testing with a variety of research systems. PURL is within Penn State College of Engineering’s Aerospace Engineering Department. The laboratory includes dedicated research vehicle systems (airplane, helicopter, multirotor, and more), a comprehensive set of simulation tools, dedicated space for indoor flight with motion capture systems, areas for aircraft maintenance/storage, and an avionics workshop. The laboratory’s recognized strengths are in adaptive/reliable flight control, vision-based control, and conducting flight validation. Most projects involve teaming with government/industrial partners and/or other academic units.

Resilient Autonomous Systems

PI: Rômulo Meira-Góes
In this project, students will develop techniques for designing safe and resilient autonomous systems. Potential research tasks include (1) integrating computer vision and decision-making techniques, (2) designing and testing control techniques, both classical and learning-based, and (3) developing demonstrations. Students will also get hands-on experience applying these techniques to real-world case studies such as autonomous vehicles and mobile robots. Students are expected to have some programming background and will learn basic control theory and programming in ROS.

Effect of reward function when optimizing exoskeleton controllers

PI: Anne Martin
When humans walk, they choose to walk in a manner that optimizes some (unknown) reward function. The reward function likely includes terms such as minimizing metabolic cost (effort) and walking in the manner they are used to. When designing exoskeleton controllers, it may be helpful to have the exoskeleton match what the human is trying to optimize. As a first step in studying this, we would like to know how participant walking changes when the exoskeleton reward function changes. In this pilot hardware study, students will implement a reinforcement learning algorithm to choose the best exoskeleton control parameters. Students will perform a pilot study to compare two different reward functions. For this project, students should be persistent and willing to problem-solve. Some prior experience with block diagrams, machine learning, and/or working with hardware is a plus.
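The core idea, choosing control parameters by reinforcement learning with a swappable reward function, can be sketched with a simple epsilon-greedy bandit over a discrete parameter grid. The parameter names, the grid, and both reward models below are illustrative placeholders, not the lab's actual controllers or rewards:

```python
import random

# Sketch: pick exoskeleton control parameters with an epsilon-greedy
# bandit, with the reward function passed in so two candidate rewards
# (effort vs. familiarity) can be compared. All numbers are illustrative
# assumptions; real rewards would come from participant measurements.

def effort_reward(params):
    """Hypothetical reward: penalize distance from a low-effort torque."""
    return -abs(params["torque"] - 0.6)

def familiarity_reward(params):
    """Hypothetical reward: penalize deviation from the usual gait timing."""
    return -abs(params["timing"] - 0.4)

def select_parameters(reward_fn, grid, trials=200, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    estimates = {i: 0.0 for i in range(len(grid))}
    counts = {i: 0 for i in range(len(grid))}
    for _ in range(trials):
        if rng.random() < epsilon:                 # explore
            i = rng.randrange(len(grid))
        else:                                      # exploit the best estimate
            i = max(estimates, key=estimates.get)
        r = reward_fn(grid[i])
        counts[i] += 1
        estimates[i] += (r - estimates[i]) / counts[i]   # running mean
    best = max(estimates, key=estimates.get)
    return grid[best]

grid = [{"torque": t, "timing": s} for t in (0.2, 0.6, 1.0) for s in (0.2, 0.4)]
print(select_parameters(effort_reward, grid))
print(select_parameters(familiarity_reward, grid))
```

Swapping `effort_reward` for `familiarity_reward` changes which parameter setting the learner converges to, which is exactly the comparison the pilot study is after, only with real participant data in place of the toy rewards.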

Purdue Projects

Testbeds for Human Interactive Control Applications

PI: Inseok Hwang
In this project, the students will engage in the development of a simulation platform for various human interactive control applications. The simulation platform will utilize ROS (Robot Operating System), PX4, and Gazebo, which are widely used simulation and experimental environments in robotics. The project’s unique focus lies in the design, verification, and validation of the impact of human-robot interactions on the performance of control applications. As part of this project, the students are expected to acquire practical programming skills and develop an advanced understanding of control applications and human-robot interaction. Additionally, once the simulation platform is complete, the control applications can be ported to actual hardware systems.

Autonomous UAV

PI: Dengfeng Sun
As Unmanned Aerial Vehicles (UAVs), particularly delivery drones, gain popularity in academic and civil sectors, the imperative for collision prevention becomes apparent yet remains inadequately addressed. This research project seeks to harness a machine learning and simulator-based approach to optimize safety distances for swarm cargo drones, addressing both airspace safety and efficiency challenges within the delivery UAV fleet. In this project, we expect to design an onboard (UAV) system to compute and predict safety distances between UAVs and obstacles under diverse weather conditions by integrating machine learning with simulated UAV models featuring accurate aerodynamic parameters. Employing a simulator-based reinforcement learning framework, the project will establish a comprehensive simulation environment and evaluation process. The integrated system will utilize historical weather data sourced from the National Oceanic and Atmospheric Administration (NOAA) database for both training and validation purposes. Key features will include a modified Iris drone model tailored for various missions and an enhanced reinforcement learning model, prioritizing factors such as wind/gust speed, wind/gust direction, and cargo weight. The simulation will be powered by the Robot Operating System (ROS) network and the Gazebo simulator, providing a realistic environment that accurately models real wind data.
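To make the objective concrete, the kind of per-step reward such a reinforcement learner might optimize can be sketched as follows: heavily penalize separations below a wind- and payload-adjusted minimum, and mildly penalize wasted airspace above it. The adjustment formula and all coefficients here are illustrative assumptions, not the project's actual model:

```python
# Sketch of an RL step reward for learning safety distances. The linear
# inflation of the nominal separation by wind speed and cargo weight,
# and every coefficient below, are illustrative assumptions only.

def required_separation(base_m, wind_mps, cargo_kg):
    """Inflate a nominal separation (m) with wind speed and cargo weight."""
    return base_m * (1.0 + 0.05 * wind_mps + 0.02 * cargo_kg)

def step_reward(actual_m, wind_mps, cargo_kg, base_m=5.0):
    req = required_separation(base_m, wind_mps, cargo_kg)
    if actual_m < req:
        return -10.0 * (req - actual_m)   # unsafe: heavy penalty
    return -0.1 * (actual_m - req)        # safe but inefficient: small penalty

# A 6 m separation in an 8 m/s wind with 2 kg of cargo is penalized:
print(step_reward(actual_m=6.0, wind_mps=8.0, cargo_kg=2.0))
```

The asymmetry between the two penalty slopes encodes the safety-versus-efficiency trade-off the project targets: violating the separation is an order of magnitude worse than flying a little too far apart.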

Intelligent Swarm Formation Control with Safety Guarantees

PI: Phil Pare
This project focuses on the safe formation control of robot swarms, emphasizing robust safety guarantees within collective behaviors. By harnessing principles from control theory, our aim is to develop algorithms that optimize swarm coordination while ensuring safety conditions are met at all times. The project will center on addressing challenges like collision avoidance, specialized formation behavior, and adaptability in dynamic environments, and will integrate theoretical models into practical applications through simulation and real-world testing. Students will have the opportunity to work with real quadrotor drones, develop and write Python code, and have their ideas tested in Purdue's indoor flight test facility.
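The flavor of combining a formation-keeping objective with a collision-avoidance override can be sketched in a few lines. This is a heuristic repulsion filter for illustration only; it does not by itself constitute a formal safety guarantee (methods such as control barrier functions are what typically provide those), and all gains and radii are made up:

```python
import math

# Sketch: a nominal goal-seeking velocity for one drone is modified by a
# repulsive term whenever a neighbor comes inside a safety radius.
# Illustrative heuristic only -- gains and radii are assumptions, and a
# formal guarantee would require a proper safety-filter formulation.

def safe_velocity(pos, goal, neighbors, r_safe=1.0, k_goal=1.0, k_rep=2.0):
    # Nominal formation-keeping command: head toward the assigned slot.
    vx = k_goal * (goal[0] - pos[0])
    vy = k_goal * (goal[1] - pos[1])
    for nx, ny in neighbors:
        dx, dy = pos[0] - nx, pos[1] - ny
        d = math.hypot(dx, dy)
        if 0.0 < d < r_safe:                     # neighbor too close:
            push = k_rep * (r_safe - d) / d
            vx += push * dx                      # push away from it
            vy += push * dy
    return vx, vy

# A neighbor sitting between the drone and its goal reverses the command:
print(safe_velocity((0.0, 0.0), (1.0, 0.0), [(0.2, 0.0)]))
```

In simulation or on the quadrotors, a filter like this would sit between the formation planner and the low-level velocity controller, leaving the nominal command untouched whenever no neighbor is inside the safety radius.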

Software Engineering in Cyber-Physical Systems (IoT)

PI: James Davies
Software influences the physical world one way or another. Unlike traditional business software, in which physical-world effects are mediated by humans, Internet of Things (IoT) systems allow software to directly interact with the physical world through interconnected devices. Embedded systems are some of the oldest computing systems (e.g., avionics), and there are well-established engineering methods to reduce catastrophic failure. However, these methods are not being applied in many safety-sensitive contexts such as medical devices.

AIM: Artificial Intelligence in Music

PI: Yung-Hsiang Lu
This project will develop and integrate techniques to create two AI-enabled tools to support string music performers. The first tool, the Evaluator, aims to improve individual practice and performance. It analyzes a musician’s sound, compares it to digitized music scores to detect deviations in intonation, rhythm, and dynamics, and suggests better posture based on recordings of sample performers with correct posture. The second tool, the Companion, plays the part of one or several absent musicians, matching the tempo and style of the human musicians through audio analysis of their performance while also responding in real time to verbal instructions.
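The intonation check at the core of the Evaluator amounts to comparing a detected fundamental frequency against the score's target pitch and reporting the deviation in cents. The sketch below assumes pitch detection is done upstream and uses standard equal-temperament math; the example values are illustrative:

```python
import math

# Sketch of an intonation check: deviation of a detected frequency from
# an equal-temperament target, in cents (1/100 of a semitone). Pitch
# detection itself is assumed to happen upstream of this step.

A4 = 440.0  # reference tuning

def note_to_freq(midi_note: int) -> float:
    """Equal-temperament frequency for a MIDI note number (A4 = 69)."""
    return A4 * 2 ** ((midi_note - 69) / 12)

def deviation_cents(detected_hz: float, target_midi: int) -> float:
    """Positive = sharp of the target, negative = flat."""
    return 1200 * math.log2(detected_hz / note_to_freq(target_midi))

# A violinist sounding 444 Hz against a written A4:
print(round(deviation_cents(444.0, 69), 1))
# prints: 15.7  (slightly sharp)
```

A tool like the Evaluator would run this per detected note against the digitized score, flagging notes whose deviation exceeds some perceptual threshold (a few cents to a few tens of cents, depending on context).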