Research Projects


 

NSF Award for Weather Research UAVs

Dr. Christopher Crick, in partnership with colleagues at OSU and other universities, has been awarded a $6 million NSF award for an OSU-led collaboration to develop weather research UAVs.

Faculty: Dr. Christopher Crick


NSF Robotics Grant

Dr. Christopher Crick, in partnership with colleagues in the College of Engineering and the Department of Psychology, has been awarded a $900,000 grant under the auspices of the National Science Foundation's National Robotics Initiative. This project will investigate learning from demonstration and skills transfer between humans and robots, with a primary focus on applications in the construction and farming equipment industry.

Faculty: Dr. Christopher Crick

 

Deep Learning-based Analysis of Metagenome Sequences for Automatic Diagnosis

Funding Agency: USDA

Amount: $158,136

Duration: Jan 1, 2020 – Dec 31, 2021

This is a joint effort between Dr. Aakur, Dr. Bagavathi and OSU VetMed.

Description: The goal of this project is to develop a machine learning-based framework capable of screening metagenomic data and identifying unique pathogen signatures that aid in disease diagnostics. The primary focus is to construct robust representations of genome sequences that can distinguish among the possible sources, such as host, pathogen, and non-pathogen/non-host sequences. Learning such robust feature representations will advance downstream tasks such as disease detection and dormant pathogen discovery. The proposed framework will advance the state of the art in automated routine disease diagnostics while reducing both the time required for diagnosis and the number of samples needed for sequencing.
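A minimal sketch of one classical way to represent genome sequences as fixed-length feature vectors is a normalized k-mer count profile. This is only an illustrative baseline (the project itself learns representations with deep models); the function name and parameters below are hypothetical.

```python
from collections import Counter
from itertools import product

def kmer_vector(sequence, k=3):
    """Represent a DNA sequence as a normalized k-mer count vector.

    Illustrative baseline only: the project learns representations with
    deep models rather than using fixed k-mer counts.
    """
    # Enumerate all 4^k possible k-mers over the DNA alphabet.
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    # Count each overlapping k-mer window in the sequence.
    counts = Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))
    total = max(sum(counts[m] for m in kmers), 1)
    return [counts[m] / total for m in kmers]

vec = kmer_vector("ACGTACGTGACG", k=3)
print(len(vec))  # 64 possible 3-mers over A, C, G, T
```

Vectors of this form can be fed to any downstream classifier to separate host from pathogen reads, which is the kind of source-discrimination task the description refers to.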



Deep Event Understanding in Streaming Videos with Continuous Predictive Learning

Funding Agency: NSF

Amount: $1,005,543 

Duration: Oct 1, 2020 - Sep 30, 2024

Dr. Sathyanarayanan Aakur is leading this collaborative effort between Oklahoma State University, Florida State University, and the University of South Florida.

Description: The goal of this study is to formulate a computer vision-based event understanding algorithm that operates in a self-supervised, streaming fashion. The algorithm will predict and detect both old and new events and learn to build hierarchical event representations, all in the context of a prior knowledge base that is updated over time. The intent is to generate interpretations of an event that go beyond what is seen, rather than mere recognition. This research pushes the frontier of computer vision by coupling the self-supervised learning process with prior knowledge, moving the field toward open-world algorithms that require little or no supervision.
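The core predictive-learning idea can be sketched simply: when a model's prediction of the next frame fails badly, an event boundary has likely occurred. The snippet below is a hypothetical toy version, using the previous frame as a stand-in for a learned predictor and a simple statistical threshold; the actual project uses learned predictors and knowledge bases.

```python
import numpy as np

def event_boundaries(features, threshold=2.0):
    """Flag event boundaries where the frame-prediction error spikes.

    `features` is a (T, D) array of per-frame features. As a toy
    stand-in for a learned predictor, each frame "predicts" the next;
    frames whose prediction error exceeds `threshold` standard
    deviations above the mean error are treated as event boundaries.
    """
    # Prediction error of the identity predictor: distance between
    # consecutive frame features.
    errors = np.linalg.norm(np.diff(features, axis=0), axis=1)
    mean, std = errors.mean(), errors.std() + 1e-8
    return [t + 1 for t, e in enumerate(errors) if e > mean + threshold * std]

# Two "events": constant features with an abrupt change at frame 50.
feats = np.concatenate([np.zeros((50, 8)), np.ones((50, 8))])
print(event_boundaries(feats))  # [50]
```

Because the error statistics can be maintained incrementally, this style of detector runs in a single pass, matching the streaming setting described above.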



Explaining Feature Representations Learned by Neural Networks and Convolutional Networks

The success of neural and deep learning algorithms on image and text data has increased the interest of healthcare organizations in using AI to support better patient care. Although the performance accuracies of these representation-learning models are quite high, their reputation as computational ‘black boxes’ raises questions of trustworthiness when they are used to develop decision-support systems and perform predictive analysis in healthcare and related fields. Thus, there is a critical need to identify how these networks internally represent or encode the features of a given data type; otherwise, they remain black boxes, limiting their widespread use. This project involves mathematical modeling, designing and running experiments, and running simulations.

Faculty: Dr. Rittika Shamsuddin


Healthcare Time Series Data Synthesis

Time series data consists of successive observations taken at equally spaced time intervals. Examples in healthcare include Electroencephalography (EEG) recordings and activity recognition from accelerometers. Collecting time series data is time-consuming and costly, as when using gold-coated markers to track tumor movement in lung cancer patients or the many electrodes needed to capture EEG data. Availability of and access to large time series datasets is therefore limited, yet such data is essential for researchers seeking to build and test the computational models used to deliver effective medical systems. The challenge is to learn a data distribution that is not distorted while maintaining the time dependence and other important structural patterns. The ultimate aim is to explore unseen data and data patterns.
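The fit-then-sample pattern behind data synthesis can be illustrated with the simplest generative model that preserves time dependence, a first-order autoregression. This is a hypothetical sketch (the project targets far richer models and real clinical signals); the function names and the AR(1) choice are assumptions for illustration.

```python
import numpy as np

def fit_ar1(series):
    """Fit a first-order autoregressive model x[t] = a * x[t-1] + noise."""
    x, y = series[:-1], series[1:]
    a = np.dot(x, y) / np.dot(x, x)          # least-squares AR coefficient
    noise_std = (y - a * x).std()            # residual noise scale
    return a, noise_std

def synthesize(a, noise_std, length, seed=0):
    """Sample a synthetic series from the fitted AR(1) model."""
    rng = np.random.default_rng(seed)
    out = np.zeros(length)
    for t in range(1, length):
        out[t] = a * out[t - 1] + rng.normal(0.0, noise_std)
    return out

# "Real" data: an AR(1) process with coefficient 0.9.
rng = np.random.default_rng(42)
real = np.zeros(500)
for t in range(1, 500):
    real[t] = 0.9 * real[t - 1] + rng.normal()

a, s = fit_ar1(real)
fake = synthesize(a, s, 500)
# The fitted coefficient is close to the true 0.9, so the synthetic
# series reproduces the real series' temporal dependence structure.
```

The same two-stage recipe, estimate a distribution over trajectories and then sample from it, underlies the deep generative approaches the project explores.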

Faculty: Dr. Rittika Shamsuddin


Sepsis 

Sepsis is a life-threatening condition that occurs when the body's response to infection causes tissue damage, organ failure, or death (Singer et al., 2016). In the U.S., nearly 1.7 million people develop sepsis and 270,000 people die from sepsis each year; over one-third of people who die in U.S. hospitals have sepsis (CDC). Internationally, an estimated 30 million people develop sepsis and 6 million people die from sepsis each year; an estimated 4.2 million newborns and children are affected (WHO). Sepsis costs U.S. hospitals more than any other health condition at $24 billion (13% of U.S. healthcare expenses) a year, and a majority of these costs are for sepsis patients who were not diagnosed at admission (Paoli et al., 2018). Sepsis costs are even greater globally, with the developing world at most risk. Altogether, sepsis is a major public health issue responsible for significant morbidity, mortality, and healthcare expenses. Early detection and antibiotic treatment of sepsis are critical for improving sepsis outcomes, where each hour of delayed treatment has been associated with roughly a 4-8% increase in mortality (Kumar et al., 2006; Seymour et al., 2017). To help address this problem, clinicians have proposed new definitions for sepsis (Singer et al., 2016), but the fundamental need to detect and treat sepsis early still remains, and basic questions about the limits of early detection remain unanswered. More details are available from the PhysioNet Challenge.
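The cited 4-8% per-hour mortality increase compounds quickly, which is why early detection matters so much. A small worked example, under the simplifying (hypothetical) assumption that the per-hour relative increase compounds multiplicatively:

```python
def relative_mortality(hours_delayed, hourly_increase):
    """Relative mortality risk after a treatment delay, assuming the
    cited per-hour increase compounds multiplicatively (a simplifying
    assumption for illustration, not a clinical model)."""
    return (1.0 + hourly_increase) ** hours_delayed

# Using the 4-8% per-hour range from Kumar et al. (2006):
low = relative_mortality(6, 0.04)
high = relative_mortality(6, 0.08)
print(f"{low:.2f}x to {high:.2f}x baseline risk after a 6-hour delay")
# prints "1.27x to 1.59x baseline risk after a 6-hour delay"
```

Even a six-hour delay roughly lands between a quarter and a half again the baseline risk under this assumption, motivating detectors that flag sepsis hours before clinical recognition.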

Faculty: Dr. Rittika Shamsuddin


Host-pathogen protein-protein interactions (HPI)

This project uses NLP methodologies to extract interactions between macromolecules that regulate the infection of hosts by pathogenic agents, constructs interaction pathways from the database of extracted interactions, and conducts statistical inference to discover novel interactions. HPI information on SARS-CoV-2 is combined with other clinical and demographic information to support the COVID-19 fight.

Faculty: Dr. Thanh Thieu


Whole-person mobility functioning information

Functioning is an under-explored health indicator alongside morbidity and mortality. This project standardizes and extracts mobility functioning information from electronic health records. Efforts are underway to combine mobility functioning information with other clinical and demographic information to support the COVID-19 fight.

Faculty: Dr. Thanh Thieu


Lexical complexity in monolingual rewriting

Paraphrasing is an important NLP task that contributes to many others, including data augmentation, language modeling, entailment, and question answering. This work aims to improve the lexical variation of paraphrases while preserving reference meaning. Applications include science education and content generation.

Faculty: Dr. Thanh Thieu


RAPID: Creation of a Virtual Reality simulator to train COVID-19 first responders, funded by NSF

This project involves the creation of Virtual Reality (VR) based simulation environments to train first responders, including nurses and physician assistants in hospitals across US communities and cities, to respond more effectively to the COVID-19 outbreak. The COVID-19 pandemic has put an overwhelming strain on the Nation's ability to treat patients, and the number of patients who need to be tested continues to rise and is expected to increase substantially in the coming weeks and months. There is an urgent need to train such responders to perform screening and testing activities in a methodical, safe, and efficient manner. Dr. Cecil and his students are creating Virtual Reality simulators for such training, which will accomplish two objectives: (i) increase the pool of trained first responders able to test and treat potentially infected patients, and (ii) enable these first responders to be better prepared for such a high-risk pandemic situation. The simulator's training modules will also be used to introduce university students to the process of designing and building such Virtual Reality simulators for medical and healthcare contexts.

Faculty: Dr. J. Cecil


 

US IGNITE: A gigabit network and Cyber-Physical framework for Advanced Manufacturing, funded by NSF

In this NSF project, Dr. Cecil and his students are exploring the design and implementation of an Internet-of-Things (IoT) based cyber-physical framework, in the context of Industry 4.0, supporting the agile assembly of micro devices. An advanced test bed has been created that includes a complex array of cyber and physical components which collaborate using cloud-based principles and emerging Next Generation SDN Internet technologies. An information-centric engineering approach has been adopted to design the cyber-physical interactions, which provide a foundation for implementing the framework. The cyber modules are capable of assembly planning, path planning, Virtual Reality based assembly simulation, and physical command generation. The physical assembly activities are accomplished using automated robotic micro-assembly work cells. Latency experiments have been conducted to study the feasibility of SDN-based techniques supporting such cyber-physical interactions.

Faculty: Dr. J. Cecil


US IGNITE: Exploring ultrafast networks for training surgeons using Virtual Reality based environments, funded by NSF

As part of an NSF / US Ignite initiative, Dr. Cecil and his research team are investigating the creation of a Next Generation Network based Virtual Reality (VR) simulation training framework that medical residents can use for training in orthopedic surgery. Dr. Miguel Pirela-Cruz (orthopedic surgeon at Dignity Regional Medical Center in Chandler, AZ) is collaborating with the OSU team in this initiative. Specifically, the team has shown that Software Defined Networking (SDN) based networks are effective in linking expert surgeons in different hospitals with medical residents at many geographically distributed locations, supporting distributed collaborative VR/haptic-based training activities. Two surgical environments are being created: one to support training in LISS plating surgery and the other for condylar plate surgery (both used to treat fractures of the femur). The use of such advanced cyberinfrastructure also lays the foundation for reducing health care costs in hospitals and medical education programs. The design of cyber-human centric environments is continuing, with an emphasis on the design and assessment of such VR-based simulators. Measures to facilitate that design and assessment are being investigated, including visual affordance, cognitive load, and situation awareness, among others.

Faculty: Dr. J. Cecil


REU Site: Research Experiences in Information Centric Engineering, funded by NSF REU

The project goal is to introduce engineering and computer science students to research activities in Virtual Reality based simulation in the engineering, manufacturing, and biomedical fields. Dr. Cecil and Dr. Miguel Pirela-Cruz are leading this REU initiative. The project also encourages students with learning and physical challenges to be exposed to ICE research areas and kindles their interest in IT-oriented engineering and graduate research careers. As part of the project, REU participants explore a variety of cutting-edge research topics in Information Centric Engineering, including the design of cyber-physical environments and the creation of AR/VR-based approaches and environments to support collaborative engineering and smart health activities.

Faculty: Dr. J. Cecil


Investigation of impact of Virtual Reality based cyber learning approaches, funded by OCAST

The objective of this project is to study the potential of teaching science and engineering concepts to children with autism using Virtual Reality based Learning Environments (VLEs). In this project, Dr. Cecil is collaborating with Dr. Mary Sweet-Darter (ABA Oklahoma and Anselm Center) in exploring the design of different types of Virtual Reality (VR) based environments to teach science and engineering to autistic children. In these VLEs, students will learn science/engineering concepts in an exciting VR environment using game controllers and haptic devices (the latter will allow them to touch and feel objects inside a learning environment).

Faculty: Dr. J. Cecil


Investigation of a Next Generation Application Tool for Cyber manufacturing, funded by NSF

This NSF project being investigated by Dr. Cecil and his students deals with the creation of an innovative 'app' (which can be run on smart phones) to allow users from a cross section of backgrounds to design and build parts interactively in a 3D environment (on a smart phone). A core aspect of this 'app' would be accessibility and simplicity in modifying a given design. Unlike cumbersome CAD tools, the app (whose preliminary version has been completed) allows those in the maker community and other users (such as high school students, hobbyists, budding inventors, among others) to interact intuitively in a 3D environment where they can modify existing designs using a touch screen interface on a smart phone or tablet; a base set of designs is provided which can be modified interactively. The completed designs can then be 3D printed through network connections to remotely located 3D printers.

Faculty: Dr. J. Cecil


Design of Next Generation Augmented Reality based User Interfaces for NASA's Gateway, funded by NASA

NASA’s Gateway seeks to establish an autonomous platform to refine and mature short- and long-duration deep space exploration capabilities through the 2020s. The overall objective of this NASA-funded X-Hab project was to design innovative Augmented Reality (AR) User Interfaces (UIs) to support autonomous operations of NASA’s proposed Gateway and associated modules. Specifically, the project involves the creation of AR-based interfaces to assist in training astronauts to perform two activities within the Gateway: (a) laptop service tasks, such as replacing a memory board, and (b) transporting payloads from one location to another.

Faculty: Dr. J. Cecil (PI) and Drs. R. Krishnamurthy, S. Kennison, S. Aakur, C. Crick and J. Thomas


Entrepreneurship in Cyber Manufacturing, funded by NSF

Drs. J. Cecil and Blayne Mayfield are involved in this NSF-funded initiative aimed at encouraging OSU students to become entrepreneurs in the cyber manufacturing and IoT areas. As part of this effort, students are exposed to entrepreneurial concepts and practices in senior and graduate electives; one class project involves identifying and developing new ideas and formulating a business plan to launch both cyber and physical products, including ‘apps’.

Faculty: Dr. J. Cecil and Dr. B. Mayfield

 


Empirical Analysis of Online Social Media Polarization

Dr. Bagavathi is collaborating with researchers from Sociology and Computer Science at OSU and UNC Charlotte on a project to characterize hate speech, echo chambers and affective polarization in online social media forums (Reddit, Twitter, and Gab) with network science and machine learning models.

Faculty: Dr. Bagavathi