SIRI Projects

Ambassadors for Inclusion in Human-Machine Interactions (AI HMI)

Dr. Tahira Reid Smith
tahira@purdue.edu

Background: The "human" in HMI implies inclusion regardless of ability, ethnicity, race, or social class, but the current research paradigm shows many examples of exclusion. The processes and procedures that govern HMI do not account for the heterogeneity of the human beings represented in the data. Overcoming the implicit biases propagated through HMI systems requires team members who can challenge cultural hegemony, along with strong governance of design and operation. In our quest to advance automation, entire communities of people are being left behind. Researchers in industry and academia struggle to diversify their datasets and need methods for doing so in ways that are sensible, respectful, and time-sensitive.

Objectives: The goal of this project is to engage and learn from diverse communities as they relate to key cyber-physical systems. We invite students who interact with and/or live in diverse communities whose thoughts and ideas do not otherwise have a chance to influence future technologies. Students will be trained in good practices for conducting studies involving people and will learn qualitative data collection and analysis methods. Students will take an active role in bringing appropriate, culturally relevant context to these methods. Students will be empowered as thought leaders for a small-scale project and will serve as a bridge between their communities and technology design considerations. The outcomes of this work will help identify best practices for others to model and will provide preliminary results to assist researchers.

Applicants are encouraged to read:

Reid, T. and Gibert, J. (2022) "Inclusion in Human-Machine Interactions," Science, 375(6577), pp. 149-150.

Reid, T. and Gibert, J. (Jan. 17, 2022) "Building machines that work for everyone – how diversity of test subjects is a technology blind spot and what to do about it," The Conversation.

Algorithms for Resilient Coordination and Situational Awareness in Swarms

Prof. Shreyas Sundaram
sundara2@purdue.edu

Applications ranging from disaster response and environmental monitoring to the control of cyber-physical systems require a team of mobile agents (such as drones) to work together to learn what is happening in the environment. This is a challenging problem: each agent can only observe a small portion of the overall environment, and the communication links between the agents change over time as they move around. Furthermore, some of the agents may send incorrect information to the others due to faults or attacks. Thus, new approaches are required to enable large swarms to work together to gain situational awareness. We will tackle this problem in this project; depending on the interests of the students, we will develop new algorithms, test existing algorithms in simulation environments, or implement algorithms on physical drones. Experience with Python and MATLAB would be beneficial for this project.
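To give a flavor of the algorithmic side of this project: one well-studied answer to the problem of agents sending incorrect information is resilient consensus. Below is a minimal Python sketch of the W-MSR update rule, in which each agent discards the most extreme neighbor values before averaging; the complete-graph network, initial values, and single stuck-at fault are illustrative assumptions, not a specific lab implementation.

    def wmsr_update(x_i, neighbor_vals, F):
        """One step of the W-MSR resilient consensus rule: discard up to F
        neighbor values strictly above our own and up to F strictly below,
        then average what remains together with our own value."""
        smaller = sorted(v for v in neighbor_vals if v < x_i)
        greater = sorted(v for v in neighbor_vals if v > x_i)
        equal = [v for v in neighbor_vals if v == x_i]
        kept = smaller[F:] + equal + greater[:max(0, len(greater) - F)]
        pool = kept + [x_i]
        return sum(pool) / len(pool)

    # Toy run: five well-behaved agents on a complete graph, plus one
    # faulty agent that always broadcasts a wildly wrong value (100.0).
    good = [0.0, 2.0, 5.0, 7.0, 10.0]
    for _ in range(30):
        broadcasts = good + [100.0]
        good = [wmsr_update(x, [v for j, v in enumerate(broadcasts) if j != i], F=1)
                for i, x in enumerate(good)]
    print(good)  # the well-behaved agents agree despite the faulty broadcast

With sufficient network connectivity, rules of this kind let every well-behaved agent reach agreement even when up to F of its neighbors misbehave arbitrarily.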

Explorations in Safe, Printable, Biological Sensors and Sensor Systems

Prof. Richard Voyles
rvoyles@purdue.edu

The Collaborative Robotics Lab is pursuing what we refer to as "Form + Function 4-D Printing" of polymer-based robotic materials. Robotic materials incorporate sensing, computation, and actuation (function) into the structure (form) of the materials from which new CPS systems can be designed and easily fabricated. Two applications of polymer robotic materials are in-vivo sensing for precision animal agriculture and printable soft robots. We are looking for undergraduate students to work closely with graduate students on two fronts: testing and implementing parametrizable, printable transistors, actuators, and placeable components made from polymer semiconductors to support neuromorphic computing within these robotic materials; and coding computer-aided design tools that help non-expert users select functionality. Students with expertise in analog and digital electronics, 3-D printing and CAD, materials science, entry-level micro/nanofabrication, and/or software coding can contribute to this exciting and impactful project.
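As a rough, hypothetical illustration of what a "parametrizable, placeable component" might look like to the CAD tooling described above, consider the Python sketch below; the class name, fields, and geometry are invented for this example and do not reflect the lab's actual component library.

    from dataclasses import dataclass

    @dataclass
    class PrintableTransistor:
        """Hypothetical parametric description of a printed polymer
        transistor (fields and defaults are illustrative only)."""
        channel_length_um: float
        channel_width_um: float
        gate_overlap_um: float = 5.0

        def wl_ratio(self) -> float:
            # W/L is the first-order knob a design tool would expose for
            # trading drive strength against footprint
            return self.channel_width_um / self.channel_length_um

        def footprint_polygon(self):
            """Rectangular outline (micrometers) that a placement tool
            could drop into a printed layout."""
            length = self.channel_length_um + 2 * self.gate_overlap_um
            width = self.channel_width_um
            return [(0, 0), (length, 0), (length, width), (0, width)]

    # A non-expert user asks for "more drive"; the tool widens the channel.
    t = PrintableTransistor(channel_length_um=20, channel_width_um=200)
    print(t.wl_ratio(), t.footprint_polygon())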

Studying Human Cognitive Behavior During Interactions with Automation

Prof. Neera Jain
neerajain@purdue.edu

In this project, two students will work with one graduate student to design and implement a logic-based control algorithm, deployed via Amazon Mechanical Turk, aimed at assessing how feedback of a human's trust or self-confidence can help an autonomous agent better train that person to land a drone in a simulated environment. In particular, they will modify and adapt an existing experiment to consider closed-loop interactions between the participant and the agent. The students will primarily code in JavaScript to implement the experiment for online deployment. They will also be involved in data processing and analysis, first using previously collected data and, time permitting, data collected from the experiment they help implement during SIRI.
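For intuition, a logic-based policy of the kind described above can be as simple as threshold rules mapping the estimated human state to the agent's next training action. The Python sketch below is purely illustrative: the thresholds and action names are made up, and the deployed experiment itself would be written in JavaScript.

    def choose_training_action(trust_est, confidence_est):
        """Map estimated trust and self-confidence (both in [0, 1]) to the
        agent's next training move; all values here are hypothetical."""
        if trust_est < 0.3:
            return "explain_decision"   # rebuild trust with transparency
        if confidence_est < 0.4:
            return "guided_practice"    # heavily assisted landing attempt
        if trust_est > 0.8 and confidence_est > 0.8:
            return "hands_off_trial"    # let the participant land unassisted
        return "standard_trial"

    print(choose_training_action(trust_est=0.9, confidence_est=0.35))
    # -> guided_practice: trusting but not yet confident

Closing the loop means re-estimating trust and self-confidence after each trial and feeding the updated estimates back into a rule like this one.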

Autonomous Collapsed Building Search and Rescue

Prof. James Goppert
jgoppert@purdue.edu

As unmanned aerial systems (UAS), colloquially known as drones, become more ubiquitous, it will become necessary to develop autonomous systems to police the skies. Various counter-UAS (C-UAS) systems are now on the market, but automated planning and control at the scale of a city has not been addressed. Through observation of regular air-traffic patterns in the city, the system will apply deep learning to rapidly identify vehicles that may pose a threat to public safety. In anticipation of threats, C-UAS agents will be deployed and guided using a game-theoretic approach. Students will have a unique opportunity to develop a C-UAS system and see it in action within a scale city. The Purdue UAS Research and Test Facility houses one of the largest indoor motion-capture systems in the world, and the motion-capture system will provide simulated sensor data for both the cooperative and non-cooperative vehicles in the environment. Students who must participate virtually can deploy their algorithms remotely on the simulation and observe live testing administered by students working at the facility.
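While the full game-theoretic planner is beyond the scope of a summary, one ingredient, assigning C-UAS interceptors to detected threats, can be illustrated as a minimum-cost matching. The Python sketch below uses the Hungarian algorithm from SciPy; the positions and the straight-line-distance cost are illustrative stand-ins for the actual pursuit objective.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def assign_interceptors(defender_pos, threat_pos):
        """Pair each detected threat with a defender so that total
        straight-line intercept distance is minimized."""
        # cost[i, j] = distance from defender i to threat j
        cost = np.linalg.norm(defender_pos[:, None, :] - threat_pos[None, :, :],
                              axis=-1)
        rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
        return list(zip(rows.tolist(), cols.tolist()))

    # Toy example: 3 defenders, 2 threats (positions in meters, made up)
    defenders = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0]])
    threats = np.array([[40.0, 10.0], [5.0, 45.0]])
    print(assign_interceptors(defenders, threats))  # [(1, 0), (2, 1)]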

Advanced Vehicle Automation and Human-Subject Experimentation

Prof. Brandon Pitts
bjpitts@purdue.edu
Prof. Tahira Reid Smith
tahira@purdue.edu

Background: Vehicle automation is developing at a rapid rate worldwide. While fully autonomous vehicles will not dominate the roadway for the next several years, many research initiatives are currently underway to understand and design approaches that will make this technology a future reality. This work ranges from the development of sensors and control algorithms, to schemes for networking and connectivity, to the creation of in-vehicle driver interfaces. The field of cyber-physical systems (CPS) helps to integrate all of these activities. One component that is key to the effective design of next-generation autonomous driving systems is the human driver; thus, studying human-vehicle interactions and defining drivers' roles and tasks will be important.

Objectives: The goal of this project is to describe and measure the ways in which a person interacts with advanced vehicle automation. Students will assist with multiple activities and will learn a combination of the following: how to a) develop/code advanced driving simulation scenarios (using the National Advanced Driving Simulator), b) collect driving performance data, c) analyze driver and performance data (using statistical methods and software packages), and d) write technical reports and/or publications. Students will also gain experience collecting and analyzing complementary physiological measures, such as eye-movement data, brain activity, skin conductance, and heart rate. The students will work closely with graduate student mentors to enhance learning.
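As a small taste of step (c), the pandas sketch below aligns a physiological stream (heart rate) with driving performance data by timestamp and summarizes both per scenario; every column name and value is a hypothetical placeholder, not the simulator's actual log format.

    import pandas as pd

    # Hypothetical physiological and driving-performance logs
    hr = pd.DataFrame({"t": [0.0, 1.0, 2.0, 3.0],
                       "bpm": [72, 75, 88, 90]})
    drive = pd.DataFrame({"t": [0.5, 1.5, 2.5, 3.5],
                          "lane_dev_m": [0.1, 0.2, 0.6, 0.5],
                          "scenario": ["baseline", "baseline",
                                       "takeover", "takeover"]})

    # Nearest-preceding-timestamp join, then per-scenario summaries
    merged = pd.merge_asof(drive.sort_values("t"), hr.sort_values("t"), on="t")
    print(merged.groupby("scenario")[["bpm", "lane_dev_m"]].mean())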

Hazard Perception and Cognition in Construction

Prof. Hubo Cai
hubocai@purdue.edu
Prof. Phillip Dunston
dunston@purdue.edu

The construction industry is the most dangerous of all private industries. More than a thousand construction workers die every year in the United States from unforeseen jobsite hazards such as falls and struck-by incidents. The goal of this research is to identify jobsite hazards in real time using electroencephalography (EEG), exploiting the connection between electrical activity in the brain and the perception and cognition of hazards. Specifically, experiments on a virtual reality (VR) platform will be set up to collect EEG signals in the presence of various hazards, in order to identify the brain-activity patterns associated with specific types of hazards.
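A standard first step in analyses of this kind is extracting band-power features (e.g., theta, alpha, beta) from each EEG channel and epoch, which can then feed a classifier that looks for hazard-related patterns. Below is a minimal Python sketch using Welch's method; the sampling rate and synthetic signal are made up for illustration.

    import numpy as np
    from scipy.signal import welch

    def band_power(eeg, fs, band):
        """Average power of one EEG channel within a frequency band (Hz),
        estimated with Welch's method."""
        freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
        lo, hi = band
        return psd[(freqs >= lo) & (freqs < hi)].mean()

    # Synthetic stand-in: 10 s of 256 Hz "EEG" dominated by a 10 Hz
    # (alpha-band) oscillation plus noise
    fs = 256
    t = np.arange(10 * fs) / fs
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    print({name: band_power(eeg, fs, b) for name, b in bands.items()})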

Participating undergraduate interns will receive the necessary training in setting up VR and in using EEG and eye trackers; will assist a PhD student in setting up experiments, conducting them, and analyzing the data; and will prepare technical reports and presentation slides. Students are expected to learn the fundamentals of sensing technologies and data science and to develop practical skills in technology implementation.