Research

Current Projects

Synthetic Team Training and Evaluation Environment

This project will develop and evaluate an immersive virtual reality team training platform that incorporates team dynamics monitoring algorithms, GIFT for intelligent after-action tutoring, and game-based microlearning training bursts.

DoD-Army DEVCOM SC (Contract number). Synthetic Team Training and Evaluation Environment. PI; $1,449,243. 2023-2025.


Completed Projects

PERvasive Learning System (PERLS) – Verification, Validation, and Experimental Testing

The purpose of this project is to evaluate PERLS, an intelligent recommendation and delivery system for instruction that also dynamically solicits content from users. The current system is a mobile-based Personal Assistant for Learning (PAL) that runs on iOS. This effort aims to expand and improve the current system by evaluating its UX/UI, creating example content with a learning partner, and providing a final evaluation of the system.

DoD-ADL (HQ0034-19-C-0018) An evaluation of the PERvasive Learning System from an End-User Perspective; $1,139,181. 2019-2022.

Exploring Social Learning in Collaborative Augmented Reality with Virtual Agents as Learning Companions

In the context of medical student education, this Cyberlearning project will investigate how to design rich social learning experiences that integrate real and virtual features (objects or people) to enhance the learning process. To simulate a role-playing experience for patient communication, the project will incorporate augmented reality, a 3D technology that enhances perception of the real world through a contextual overlay of virtual objects/information onto physical objects in real time. This project mirrors how medical students may work in a future telemedicine environment where intelligent virtual entities and human teams seamlessly interact for patient care. Results will inform the design of virtual agents/humans to support learning in a variety of educational domains. The research will provide powerful new tools for medical education, including communication with patients, which will foster lifelong and just-in-time training and will contribute to advancing national health, prosperity, and welfare.

The research employs a human-like, artificial intelligence-driven, high-fidelity, Emotive Virtual Patient that has life-like emotions and nonverbal expression with conversational and assessment capability using natural language processing. This is integrated within the Microsoft HoloLens to allow a remote participant to review a student’s performance and provide feedback through text, audio, and video. Through iterative design-based research, the research will investigate: 1) how students learn by proxy through observing co-located and remote virtual and real collaborators; 2) whether students prefer to receive educational feedback from a co-located or remote peer, virtual peer, or a virtual professor; and, 3) the effects of collective agency when students can choose to learn socially from combinations of real or virtual learning companions.

NSF-IIS (#1917994); $781,382.00. 2019-2022.

Outcomes:

Villa, S. C., Craig, S. D., Zakhangir, D. & Zielke, M. (Accepted: 2021). Utilizing a Learning Strategy Analysis to Determine a System’s Potential Impact on Student Learning: The Augmented-Reality Emotive Virtual Patient System Platform. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. XXXX-XXXX). Sage Publications.

A Biometric Measurement Suite to Understand the Process Behind Human Performance in Complex Settings

This was a hardware grant to purchase a Biometric Measurement Suite (BMS) from iMotions. The BMS software and hardware suite integrates several biosensors and synchronizes and visualizes eye tracking, facial expression analysis, electroencephalogram (EEG), galvanic skin response (GSR), electromyography (EMG), electrocardiogram (ECG), and surveys in one platform. This platform enables researchers to collect psychophysiological data to measure individuals’ or teams’ biometric responses while engaged in complex tasks such as driving a car or operating an unmanned aerial system. The BMS will synchronize multiple sensor data streams and facilitate data analysis to answer questions related to the behavioral, cognitive, and emotional processes underlying human learning and performance in complex settings.

U.S. Department of Defense: DURIP – ONR; $286,155. 2017-2018.

Outcomes: Biometric Measurement Suite purchased and operational.

Tools for Implementing Speech Agents in Crew Resource Management Training Systems

An educational technology evaluation project for the U.S. Navy.

U.S. Department of Defense; $89,666.10. 2020.

Outcomes: Project completed and reports provided to funder.

Craig, S. D., Curley, R.J., Lapujade, L., Cooke, N., Gutzwiller, R., Kwan, J., Fong, A., Lu, H., & Killilea, J. (2020).  A Human Systems Evaluation of a Multi-Agent Team Training System for Communication. Presented at iFEST: Innovation Instruction and Implementation in Federal E-Learning Science & Technology Conference. Alexandria, VA.

Science of Learning and Readiness (SoLaR)

The SoLaR project will identify how the science of learning can improve education and readiness. Evidence-based learning techniques take time to establish; however, the need for training and education for military personnel and learning organizations is immediate. To provide reliable guidance, the SoLaR project will synthesize best practices from the existing literature and applied learning areas for adult distributed learning broadly (e.g., from learning theory to organizational governance) and apply these principles to an exemplar course within a DoD training/education context. The recommendations and examples from this project will help in the production of effective content, efficient methods of integration, and the support required for establishing resilient distributed learning environments.

U.S. Department of Defense: ADL (HQ0034-19-C-0015) Science of Learning for Education and Readiness; $894,942. 2019-2021.

Outcomes: Project completed and reports provided to funder.

Li, S. & Craig, S. D. (Accepted: 2021). Case report of the use of the Learning Science Evaluation checklist. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. XXXX-XXXX). Sage Publications.

Siegle, R. F., Cooke, N. J., Schroeder, N. L., Li, S., & Craig, S. D. (Accepted: 2021). Scaling team training: Using virtual worlds to support learning in massive open online courses. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. XXXX-XXXX). Sage Publications.

Li, S., & Craig, S. (2020). Why do we adopt e-internships in eLearning curriculum development? A Model of Career-oriented Learning Experiences, Motivation, and Self-Regulated Learning. Arizona State University.

Siegle, R., Roscoe, R., Schroeder, N., & Craig, S. (2020). Immersive Learning Environments at Scale: Constraints and Opportunities. Arizona State University.

Paredes, Y., Siegle, R., Hsiao, I., & Craig, S. (2020). Educational Data Mining and Learning Analytics for Improving Online Learning Environments. Arizona State University.

Craig, S., & Schroeder, N. (2020). Science of Learning and Readiness (SoLaR) Recommendation Report: Science of Learning Practices for Distributed Online Environments. Arizona State University.

Craig, S., Li, S., Prewitt, D., Morgan, L., & Schroeder, N. (2020). Science of Learning and Readiness (SoLaR) Exemplar Report: A Path Toward Learning at Scale. Arizona State University.

Craig, S., Schroeder, N., & Roscoe, R. (2020). Science of Learning and Readiness (SoLaR) State-of-the-Art Report. Arizona State University.

Contact

Scotty Craig leads the ADL partnership as a faculty member in the Human Systems Engineering (HSE) program at the Polytechnic campus.