Charles River Analytics  

Cambridge, MA
United States
  • Booth: 1239

Stop by Booth 1239 to demo our adaptive intelligent training and AR/VR solutions!

Charles River Analytics conducts cutting-edge AI, robotics, and human-machine interface R&D to create custom solutions for your organization. Our customer-centric focus directs us towards problems that matter, and our passion for science and engineering drives us to create actionable, impactful solutions.

We were founded in 1983 and became an employee-owned company in 2012, setting the stage for the next generation of innovation, service, and growth. Our employees make a difference for a “who’s who” in government and industry by delivering results on government programs and working with commercial partners.

We come to work every day because we want to advance technology to solve today’s hardest problems.

At Charles River Analytics, we turn research into results.

Press Releases

  • Charles River Analytics Inc., developer of intelligent systems solutions, will showcase our applied AI research and development at the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC 2019), the world’s largest modeling, simulation, and training event.

    Stop by Booth 1239 to demo our adaptive intelligent training and AR/VR solutions!

    Discover Our Adaptive Intelligent Training Tools

    Charles River Analytics worked with the Air Force Research Laboratory to develop effective games to revolutionize aircraft maintenance training. AFRL is using MAGPIE, our immersive, augmented-reality learning environment, to turn novice F-15E aircraft avionics technicians into experts. The US Air Force praised our training system as a “virtual solution that could revolutionize aircraft maintenance training.”

    MAGPIE is a powerful software base that can be adapted for diverse applications. For example, on the EAGLE project, we created a just-in-time training tool for US Air Force satellite communications students and deployed personnel. This training is available whenever and wherever a student needs it, providing immersive rehearsal of unfamiliar and complex procedures at a low cost.

    Learn more about our Intelligent Tutoring and Game-Based Training efforts at I/ITSEC 2019.


    We worked with the Army Research Laboratory to support natural human interactions in virtual, augmented, and mixed reality environments. Natural interactions are especially important when students need learned muscle memory for physical tasks, such as in combat casualty training.

    VIRTUOSO automatically assesses skill proficiency so students can work independently, and it delivers feedback from expert trainers who observe sessions remotely.

    Because VIRTUOSO gracefully incorporates so many leading commercial control and display peripherals, it spotlights which equipment is best suited to a training task—and can support the equipment available when an individual is ready to train. Simulations that incorporate VIRTUOSO are resilient to future technology advances.

    Our free and open-source VIRTUOSO Software Development Kit (VSDK) seamlessly integrates natural human interactions into the virtual training experience.

    With VSDK’s robust and intuitive tools, developers can consistently design more immersive, resilient, and naturally interactable AR and VR experiences, yielding higher user engagement and more effective training outcomes. With VSDK, you can deliver a more realistic training product.

    Visit Booth 1239 to demo VIRTUOSO and see more examples of our Virtual Reality/Augmented Reality efforts at work.

    Learn About Our Human Performance Tools

    Recently, Charles River Analytics worked with NASA to assess astronaut workload and performance during the testing and evaluation of new NASA systems. We used the measurements from our CAPT PICARD system to determine how best to display health and status information on the Orion space vehicle.

    CAPT PICARD is built on Sherlock™, our open and extensible software and hardware platform that provides a unified, end-to-end solution. With Sherlock, you can rapidly prototype applications to collect, analyze, visualize, and reason about human physiological, neurological, and behavioral data.
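    As a rough illustration of the collect/analyze/reason loop such a platform supports, the sketch below implements a minimal rolling-window workload estimator in Python. None of these names are Sherlock's real API; the signal choice, window size, and threshold are placeholder assumptions.

```python
# Generic sketch of a collect -> analyze -> reason pipeline; all names and
# parameters are hypothetical, not Sherlock's actual API.
from collections import deque
from statistics import mean

class WorkloadEstimator:
    """Rolling-window estimator: flags high cognitive workload when a
    physiological signal (here, heart rate) stays above a resting baseline."""
    def __init__(self, window=5, threshold=1.15):
        self.baseline = None
        self.samples = deque(maxlen=window)   # most recent samples only
        self.threshold = threshold            # ratio over baseline = "high"

    def calibrate(self, resting_samples):
        """Collect a resting baseline before the training session starts."""
        self.baseline = mean(resting_samples)

    def update(self, sample):
        """Ingest one new sample and return the current workload estimate."""
        self.samples.append(sample)
        ratio = mean(self.samples) / self.baseline
        return "high" if ratio >= self.threshold else "nominal"

est = WorkloadEstimator()
est.calibrate([62, 60, 61, 63])        # resting heart rate, bpm
for bpm in (64, 66, 75, 82, 88):       # workload climbs during the scenario
    state = est.update(bpm)
print(state)
```

    A real platform would fuse several physiological and behavioral channels rather than threshold a single signal, but the shape of the loop (calibrate, stream, estimate) is the same.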

    Meet with Us at I/ITSEC

    Contact us to schedule a meeting and learn more about the tools and capabilities we will feature at I/ITSEC 2019!

    Charles River Analytics conducts leading-edge AI, robotics, and human-machine interface R&D, and leverages that research to create custom solutions for your organization. We have a stellar track record developing successful solutions for Government and commercial clients across a diverse collection of markets—defense, intelligence, medical technology, training, transportation, space, and cyber security. Our customer-centric focus guides us towards problems that matter, while our passion for science and engineering drives us to find impactful, actionable solutions.

  • Charles River Analytics Inc., developer of intelligent systems solutions, has expanded our research and product development for adaptive training and maintenance—we create intelligent, adaptive training systems, simulation-based training systems, and skill modeling technologies that improve the quality and efficiency of training.

    Our skill modeling techniques help students gain the most out of training sessions—these affordable techniques enhance procedural training, decision-making, and teamwork by including clearly defined steps, objective metrics, and dependencies.

    “The most expensive element of any training investment is time for instructors, students, and equipment,” said Dr. Krysta Chauncey, Scientist at Charles River Analytics. “Students can use our adaptive and cost-effective intelligent training solutions on their own time, so they can focus on what they need most from instructors and equipment when they are available. Since independent time has many pressures, gamification elements help students stay engaged and motivated so they acquire the necessary skills and knowledge.”

    We leverage rich AI technology to build intelligent tutors in a variety of domains, including avionics maintenance training, communication system troubleshooting, marksmanship, and battlefield medicine. Our adaptive intelligent response components incorporate dynamic behaviors into the training, tailoring it to the student’s needs on the fly.

    Our recent adaptive training efforts include:

    • EAGLE, our just-in-time intelligent virtual trainer. EAGLE helps US Air Force satellite communications students learn and remember complex procedures for the Hawkeye III Lite satellite communications system. EAGLE incorporates our virtual maintenance trainer, MAGPIE.
    • MAGPIE, our game-based training environment. MAGPIE combines an integrated suite of efficient content authoring tools, models of trainee skill and motivation, and a game adaptation engine to dynamically deliver game-based maintenance training that responds to individual learning needs, performance, and instructor guidance. 
    • WEAVER, our tablet-based intelligent virtual trainer. WEAVER helps trainees in high-risk career fields amass more training time and hands-on practice with tasks that risk damage or injury.
    • SNAPPR, our system that creates probabilistic models of Naval system components, the environment in which they operate, and the missions that they serve. SNAPPR helps operators understand the operational availability of hardware components onboard a vessel.
    • DATEM, our real-time alerting system. DATEM provides tripwires to detect, classify, and localize system faults, and recommends appropriate responses to improve mission success.
    • BADGERS, our decision-support system. BADGERS enables shipboard maintainers to rapidly analyze system status and predicted malfunctions, evaluate high-level mission impacts, and efficiently make maintenance decisions based on intuitive and innovative data visualizations.
    • STRUDEL, our tool that models how simulation fidelity, trainee experience, and task interact to produce different learning outcomes. STRUDEL helps operators understand how the model applies to Navy training simulations and fidelity choices for future training solutions.
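    The availability reasoning behind tools like SNAPPR and BADGERS can be grounded in the standard operational-availability formula, Ao = MTBF / (MTBF + MDT). The sketch below is a generic illustration of that arithmetic only, not SNAPPR's actual probabilistic models; every component name and figure is hypothetical.

```python
# Illustrative availability arithmetic; component names and figures are
# hypothetical, and SNAPPR's real models are richer probabilistic models.

def operational_availability(mtbf_hours, mdt_hours):
    """Ao = MTBF / (MTBF + MDT): fraction of time a component is mission-capable."""
    return mtbf_hours / (mtbf_hours + mdt_hours)

# Hypothetical shipboard components: (name, mean time between failures,
# mean downtime), both in hours.
components = [
    ("radar_transmitter", 2000.0, 48.0),
    ("cooling_pump",      5000.0, 12.0),
    ("signal_processor",  8000.0, 24.0),
]

# For components in series, the system is up only when all of them are up.
system_ao = 1.0
for name, mtbf, mdt in components:
    ao = operational_availability(mtbf, mdt)
    system_ao *= ao
    print(f"{name}: Ao = {ao:.4f}")

print(f"series-system Ao = {system_ao:.4f}")
```

    Even this toy version shows why such tools matter: a vessel of individually reliable components can still have noticeably lower system-level availability once dependencies multiply.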

    Stop by Booth 1239 to demo our adaptive intelligent training and AR/VR solutions or contact us to set up a meeting. 



  • An Open-Source XR Software Development Kit 

    Immerse yourself in virtual worlds like never before with VSDK, our free and open-source software development kit.

    Our VIRTUOSO Software Development Kit (VSDK) gracefully brings natural human interactions into virtual training simulations. Developers now have a powerful toolkit for creating complex mixed-reality worlds that immerse users in realistic virtual environments.

    Robust and adaptable, immersive and engaging, VSDK delivers a state-of-the-art virtual experience.

    Charles River Analytics has a rich history of work in the augmented and virtual reality sector, along with empirical research on the challenges of doing it right. To our knowledge, we are currently the only provider of a free, open-source augmented/mixed-reality development framework.

    What will you create?

    VSDK allows seamless integration of various hardware peripherals from different manufacturers so that developers can produce the best possible experience and interactions for their end users.  

    VSDK includes support for major off-the-shelf XR systems—including the HTC Vive, Oculus Rift, and Oculus Quest—as well as more innovative peripherals for haptics and hand tracking, including the bHaptics TactSuit, Leap Motion, and ManusVR gloves. VSDK is equally suited for development of training applications, research studies, and games.


    The following features make VSDK unique:

    Hand Tracking for Interactions

    Supports Leap Motion, ManusVR, Sense Glove, and other systems so users can interact with virtual environments using their own hands.

    Haptic Feedback

    Physics-based and pattern-driven haptic feedback with support for devices as simple as controllers or as extensive as full-body haptic suits—including the bHaptics TactSuit

    Rapid prototyping through the Reaction System

    Extensible, event-driven behavior based on naturalistic interactions and usable without writing any code

    Free and Open-Source

    VSDK is free to use and modify

    Naturalistic Interactions

    Enable immersive experiences with hand tracking and haptic feedback

    Device Interoperability

    Plug-and-play a variety of XR systems—including HTC Vive, Oculus Rift, and Oculus Quest—and peripherals to increase hardware flexibility and reduce repeated work

    Scenario Design, Editing, and Management

    Efficiently generate dynamic training scenarios that combine the immersive capabilities of VR with time- and event-based scripting

    User Virtual Environment Interaction Library

    Draw from a library of common user interactions and objects native to virtual environments to increase immersion and training transfer

    One of VSDK’s focus areas is support for fine-motor control interactions, to allow serious simulation and training application developers to more accurately recreate real-world scenarios. 
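    The device-interoperability feature above follows a familiar software pattern: application code is written once against a common interface, and a thin adapter wraps each vendor SDK. VSDK itself targets game-engine development, so the Python sketch below is only a language-neutral illustration of that pattern; every class name is hypothetical and none of it is VSDK's real API.

```python
# Illustrative interoperability pattern only, not VSDK's actual API.
# Application logic depends on one interface; per-device adapters plug in.
from abc import ABC, abstractmethod

class HandTracker(ABC):
    """Common interface every hand-tracking adapter implements."""
    @abstractmethod
    def fingertip_positions(self):
        """Return a dict of fingertip name -> (x, y, z) in meters."""

class LeapMotionAdapter(HandTracker):
    def fingertip_positions(self):
        # A real adapter would call the vendor SDK; stubbed here.
        return {"index": (0.01, 0.12, 0.30)}

class DataGloveAdapter(HandTracker):
    def fingertip_positions(self):
        # A real adapter would read glove flex sensors; stubbed here.
        return {"index": (0.02, 0.11, 0.29)}

def grab_check(tracker: HandTracker, target, radius=0.05):
    """Application logic written once against the interface: is the index
    fingertip close enough to the target to trigger a grab reaction?"""
    x, y, z = tracker.fingertip_positions()["index"]
    tx, ty, tz = target
    dist2 = (x - tx) ** 2 + (y - ty) ** 2 + (z - tz) ** 2
    return dist2 <= radius ** 2

# Swapping hardware is a one-line change; grab_check is untouched.
for tracker in (LeapMotionAdapter(), DataGloveAdapter()):
    print(type(tracker).__name__, grab_check(tracker, (0.0, 0.1, 0.3)))
```

    The payoff named in the feature list falls out of this structure: the same training scenario runs on whichever peripherals are available, and adding a new device means writing one adapter, not reworking the application.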

    Learn More at Booth 1239!

    VSDK files and documentation are available on GitHub.

  • fNIRS Explorer™ Sensor
    Brain Sensors Designed for Comfort in the Field: Charles River Analytics and PLUX Launch the fNIRS Explorer™ Sensor...

  • Charles River Analytics Inc., developer of intelligent systems solutions, and PLUX, a biomedical engineering company in Portugal, proudly announce the joint commercial launch of the fNIRS Explorer™ sensor. The fNIRS Explorer is a wireless and ruggedized wearable that lets users acquire high-quality brain activation data, even out of the lab.

    Functional near-infrared spectroscopy (fNIRS) sensors measure near-infrared light reflectance in cortical tissue. Typically applied to the forehead, fNIRS sensors estimate the blood oxygen saturation level in brain tissue.
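    Those oxygenation estimates typically rest on the modified Beer-Lambert law: the change in optical density at each wavelength is a weighted sum of the changes in oxy- and deoxyhemoglobin concentration, so measurements at two wavelengths let both changes be recovered from a 2x2 linear system. The sketch below is a textbook illustration of that inversion, not the fNIRS Explorer's processing pipeline; the extinction coefficients and path-length values are representative placeholders, not device calibration data.

```python
# Textbook modified Beer-Lambert illustration; all coefficient values are
# representative placeholders, not the fNIRS Explorer's calibration data.

# Extinction coefficients (1/(mM*cm)), order-of-magnitude values typical of
# the two wavelengths an fNIRS sensor pairs:
e760_hbo2, e760_hbr = 1.5, 3.8   # ~760 nm: deoxyhemoglobin absorbs more
e850_hbo2, e850_hbr = 2.5, 1.8   # ~850 nm: oxyhemoglobin absorbs more

d = 3.0    # source-detector separation (cm)
dpf = 6.0  # differential path-length factor (dimensionless)

def hb_changes(delta_od_760, delta_od_850):
    """Recover (dHbO2, dHbR) in mM from optical-density changes at the two
    wavelengths by solving delta_OD = eps * delta_c * d * dpf (Cramer's rule)."""
    a11, a12 = e760_hbo2 * d * dpf, e760_hbr * d * dpf
    a21, a22 = e850_hbo2 * d * dpf, e850_hbr * d * dpf
    det = a11 * a22 - a12 * a21
    d_hbo2 = (delta_od_760 * a22 - delta_od_850 * a12) / det
    d_hbr  = (delta_od_850 * a11 - delta_od_760 * a21) / det
    return d_hbo2, d_hbr

# A larger optical-density change at 850 nm than at 760 nm is consistent
# with rising oxygenation (more HbO2, less HbR):
d_hbo2, d_hbr = hb_changes(0.010, 0.018)
print(f"dHbO2 = {d_hbo2:+.5f} mM, dHbR = {d_hbr:+.5f} mM")
```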

    “Historically, fNIRS sensors are clunky and cumbersome,” explained Dr. Bethany Bracken, Principal Scientist at Charles River Analytics. “We developed small and portable fNIRS sensors to enhance our human state assessment projects. Our fNIRS Explorer sensor comes with an adjustable, flexible headband and is made of rubber, making it comfortable to wear for long periods of time. It is hermetically sealed, allowing it to be useful even in environments with sun, sand, and dust, letting wearers perform training activities both in and out of the lab.”

    fNIRS Explorer builds on the success of our fNIRS Pioneer™ sensor, also developed in collaboration with PLUX. Both the fNIRS Pioneer and fNIRS Explorer capture high-quality signals at a fraction of the cost of current systems. These wearables are available for purchase through PLUX or its resellers’ network.

    Our Solutions in Action

    Our MEDIC system uses our fNIRS sensors and Sherlock™ platform to collect and process information about brain activity and deliver estimates of cognitive workload during high-fidelity training simulations.

    Contact us to learn more about our fNIRS sensors and our other Human Performance Modeling/Monitoring capabilities.

    Visit Us at Booth 1239 at I/ITSEC!

    At Charles River Analytics, our deep understanding of intelligent adaptive training and AR/VR solutions results from cutting-edge AI, robotics, and human-machine interface R&D. We create custom solutions for your organization’s toughest challenges and we turn research into results.

    Stop by Booth 1239 to demo our adaptive intelligent training and AR/VR solutions or contact us to set up a meeting. 

    The development of this sensor is based upon work supported by the United States Army Medical Research and Materiel Command under Contract No. W81XWH-17-C-0205. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Army Medical Research and Materiel Command.
