The Neurotech application team works at the intersection of basic neuroscience research and its application in industry. We
provide services that combine neuroscience and Araya's advanced AI technology to support our customers. We are also developing products that use AI technology to access mental states from non-invasive brain data. We care about human welfare and aim to break through the limitations of the human body and brain.
#Consulting and analysis of brain science experiments.
#Building automated analysis pipelines for brain data.
Brain activity estimation from facial video
NeuroTech (also called BrainTech) is a growing field that applies neuroscience in industry. In this field, non-invasive recording of brain activity plays a pivotal role, and Electroencephalography (EEG) is widely used for this purpose because of its high temporal resolution, wearability, and cost-effectiveness. Nonetheless, even wearable EEG devices
are hard to deploy in some real-life (working) situations. Our team therefore developed a system that estimates frequency-specific EEG power from facial features captured by a conventional digital camera, by harnessing the power of deep learning (patent applied for). The video on the left shows an example of estimation by our system. Here, the target
is delta-wave power, which is strongly related to wakefulness and sleepiness. The blue line shows an actual recording from an EEG system, and the orange line shows the
estimate from facial video. We will develop various products and solutions based on this technology, including drowsiness detection and alert systems for drivers and workers.
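To give a sense of the target signal, here is a minimal sketch of how delta-band (roughly 0.5–4 Hz) power can be computed from an EEG trace using Welch's method. This is a generic illustration, not our camera-based estimation system; the function name and parameters are our own for this example.

```python
import numpy as np
from scipy.signal import welch

def delta_band_power(eeg, fs, band=(0.5, 4.0)):
    """Estimate delta-band power of a 1-D EEG trace via Welch's method."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 4 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the power spectral density over the delta band
    return np.trapz(psd[mask], freqs[mask])

# Example: a 2 Hz sinusoid (inside the delta band) plus noise
fs = 256                                # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)            # 10 seconds of samples
signal = np.sin(2 * np.pi * 2 * t) + 0.1 * np.random.randn(len(t))
print(delta_band_power(signal, fs))
```

A sliding-window version of this quantity is the kind of time-varying target that a facial-video model could be trained to track.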
Brain Machine Interface (BMI)
While brain-sensing technologies enable us to objectively access our own mental state through brain activity, another NeuroTech technology enables us to control robotic arms or computer devices by our will. This field is called Brain-Machine Interface (BMI) or Brain-Computer Interface (BCI), and we are working on it at Araya as well.

The video below shows an example of our development. Here, one of our staff is controlling a racing car on the screen with his EEG, without touching a controller. This is achieved by detecting EEG activity evoked by visual stimuli that flicker at specific frequencies (Steady-State Visual Evoked Potential: SSVEP). For demonstration purposes only, the staff member indicates with his finger which direction he intends to turn. By developing versatile BMI/BCI algorithms and compact wearable EEG devices (in collaboration with VIE STYLE, Inc.), we aim to quickly realize a future in which anyone can operate any device directly from the brain, hands-free.
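The core idea behind SSVEP decoding can be sketched in a few lines: each on-screen target flickers at a distinct frequency, and the decoder picks the candidate frequency with the strongest power in the EEG spectrum. This is a deliberately simplified, single-channel illustration with hypothetical names and frequencies, not our production algorithm (practical systems typically use multi-channel methods such as canonical correlation analysis).

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs):
    """Return the candidate stimulus frequency with the strongest
    FFT power in a single-channel EEG segment."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1 / fs)
    # Power at the FFT bin nearest each candidate flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return stim_freqs[int(np.argmax(powers))]

# Example: the "left" target flickers at 10 Hz, the "right" at 15 Hz
rng = np.random.default_rng(0)
fs = 250                                # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)             # a 4-second EEG segment
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
print(classify_ssvep(eeg, fs, [10.0, 15.0]))  # → 10.0 (the 10 Hz target)
```

In a real interface, the decoded frequency is then mapped to a command, e.g. steering the car left or right.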