COSENTINO, Sarah


Affiliation

Faculty of Science and Engineering, Global Center for Science and Engineering

Job title

Associate Professor (without tenure)

Profile

Sarah is a hands-on engineer and academic researcher. She began working as a freelance collaborator for an electronics company while still in high school, and continued until earning her M.Sc. in Electronic Engineering. Immediately after graduation she moved to Japan for a prestigious one-year industrial internship, which the company extended into a full employment contract for another two years. After three years in Japan working on challenging electronics R&D projects, she decided it was time for a change, applied for and won a scholarship, and enrolled in a Ph.D. program at one of Japan's top universities. During her studies she collaborated with researchers across the globe, spending months at leading universities in the U.S. and Europe. Her main interests are human physiology, human sensing, human communication, affective computing, and human-machine interaction. She has hands-on experience in electronic design and assembly, broad research experience in developing sensor systems for human sensing and human-robot interaction, and has authored several publications on this work.

Research Institute

  • 2020
    -
    2022

Waseda Research Institute for Science and Engineering   Adjunct Researcher

Degree

  • Waseda University   Ph.D. in Robotics

 

Research Areas

  • Mechanics and mechatronics

  • Robotics and intelligent system

  • Intelligent robotics

  • Human interface and interaction

Papers

  • The Italy-Japan Workshop: A History of Bilateral Cooperation, Pushing the Boundaries of Robotics

    Gabriele Trovato, Leonardo Ricotti, Cecilia Laschi, Massimiliano Zecca, Sarah Cosentino, Luca Bartolomeo, Shuji Hashimoto, Atsuo Takanishi, Paolo Dario

    IEEE Robotics and Automation Magazine    2021

    DOI

  • Development of Performance System With Musical Dynamics Expression on Humanoid Saxophonist Robot.

    Jia-Yeu Lin, Mao Kawai, Yuya Nishio, Sarah Cosentino, Atsuo Takanishi

    IEEE Robotics Autom. Lett.   4 ( 2 ) 1684 - 1690  2019

    DOI

  • Expressive humanoid robot for automatic accompaniment

    Guangyu Xia, Mao Kawai, Kei Matsuki, Mutian Fu, Sarah Cosentino, Gabriele Trovato, Roger Dannenberg, Salvatore Sessa, Atsuo Takanishi

    SMC 2016 - 13th Sound and Music Computing Conference, Proceedings     506 - 511  2019

Summary

    We present a music-robotic system capable of performing an accompaniment for a musician and reacting to human performance with gestural and facial expression in real time. This work can be seen as a marriage between social robotics and computer accompaniment systems in order to create more musical, interactive, and engaging performances between humans and machines. We also conduct subjective evaluations on audiences to validate the joint effects of robot expression and automatic accompaniment. Our results show that robot embodiment and expression improve the subjective ratings on automatic accompaniment significantly. Counterintuitively, such improvement does not exist when the machine is performing a fixed sequence and the human musician simply follows the machine. As far as we know, this is the first interactive music performance between a human musician and a humanoid music robot with systematic subjective evaluation.

  • Group Emotion Recognition Strategies for Entertainment Robots

    Sarah Cosentino, Estelle I. S. Randria, Jia-Yeu Lin, Thomas Pellegrini, Salvatore Sessa, Atsuo Takanishi

    Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems     813 - 818  2018.10  [Refereed]

  • A Synchronization Feedback System to Improve Interaction Correlation in Subjects With Autism Spectrum Disorder

Ma, Y. H., Han, Y., Lin, J. Y., Cosentino, S., Nishio, Y., Oshiyama, C., Takanishi, A.

International Conference on Awareness Science and Technology (iCAST), IEEE     285 - 290  2018.09  [Refereed]


Misc

  • Musical Articulation System on Humanoid Saxophonist Robot

Jia Yeu Lin, Mao Kawai, Kei Matsuki, Sarah Cosentino, Salvatore Sessa, Atsuo Takanishi

    CISM International Centre for Mechanical Sciences, Courses and Lectures   584   392 - 399  2019.01

Summary

In this paper, a musical articulation system for the Waseda Anthropomorphic Saxophonist robot is proposed. First, the specifications to produce musical articulation with a saxophone are determined. Then, a tonguing mechanism and a fast air flow control valve are implemented on the robot along with an integrated feedback control system to perform articulation by synchronously stopping reed vibration and reducing the air flow. The effectiveness of the system is verified by conducting a performance comparison experiment between the robot and a professional musician. Results are briefly discussed and future works in the same direction are considered.

    DOI

  • Development of a low-cost smart home system using wireless environmental monitoring sensors for functionally independent elderly people

    D. Zhang, W. Kong, R. Kasai, Z. Gu, Y. Minami Shiguematsu, S. Cosentino, S. Sessa, A. Takanishi

    2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017   2018-January   153 - 158  2018.03

Summary

The number of older adults is increasing much faster than that of other age groups worldwide. In Japan especially, more and more older adults remain functionally independent. Smart homes are emerging that provide comfort and monitor the lives of their residents. For healthy older adults, however, two points are important in a smart home: providing comfort, and monitoring the residents' position inside the house during daily life. The latter can also give their families peace of mind. In this study, we designed a smart home system that includes low-cost sensor modules and a finger robot. The system can control home appliances, such as air conditioners, to keep residents in a comfortable environment. Using only CO2 sensors, with no additional sensors, the system can determine which room a resident is in, or whether he or she is at home at all. As a result, we demonstrate a working system in which the CO2 sensors can judge the resident's position and even recover the resident's movement path.

    DOI

  • Evaluation of a sensor system for detecting humans trapped under rubble: A pilot study

    Di Zhang, Salvatore Sessa, Ritaro Kasai, Sarah Cosentino, Cimarelli Giacomo, Yasuaki Mochida, Hiroya Yamada, Michele Guarnieri, Atsuo Takanishi

    Sensors (Switzerland)   18 ( 3 )  2018.03

Summary

    Rapid localization of injured survivors by rescue teams to prevent death is a major issue. In this paper, a sensor system for human rescue including three different types of sensors, a CO2 sensor, a thermal camera, and a microphone, is proposed. The performance of this system in detecting living victims under the rubble has been tested in a high-fidelity simulated disaster area. Results show that the CO2 sensor is useful to effectively reduce the possible concerned area, while the thermal camera can confirm the correct position of the victim. Moreover, it is believed that the use of microphones in connection with other sensors would be of great benefit for the detection of casualties. In this work, an algorithm to recognize voices or suspected human noise under rubble has also been developed and tested.

    DOI PubMed

  • Audience mood estimation for the Waseda Anthropomorphic Saxophonist 5 (WAS-5) using cloud cognitive services

    RANDRIA Estelle I. S, COSENTINO Sarah, LIN Jia‐Yeu, PELLEGRINI Thomas, SESSA Salvatore, TAKANISHI Atsuo

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM)   35th   Paper No. 1C1-04  2017.09

    J-GLOBAL

  • Reliability of stride length estimation in self-pace and brisk walking with an inertial measurement unit on shank

    R. Kasai, T. Kodama, Z. Gu, D. Zhang, W. Kong, S. Cosentino, S. Sessa, Y. Kawakami, A. Takanishi

    2017 IEEE International Conference on Mechatronics and Automation, ICMA 2017     671 - 676  2017.08

Summary

The use of an Inertial Measurement Unit (IMU) for gait analysis is gaining popularity because of its low cost and unrestricted workspace. In this context, researchers are focusing on methods for automated data analysis; for example, many algorithms for stride length estimation have been developed. These algorithms rely on event detection to compute gait parameters during walking, and on orientation estimation for a more precise double integration of acceleration. At present, however, there is no comparison of existing algorithms, and the applicability of each algorithm to different walking patterns is not clear. In this paper, we studied the effect on stride length estimation of three different event detection techniques and two orientation estimation techniques, using an IMU on the lateral side of the shank, above the ankle. In total, six combinations of stride estimation algorithms were compared on normal and brisk walking patterns. We evaluated the techniques in terms of precision, accuracy, and the shape of the histogram of the stride estimation error.

    DOI

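The stride-length approach described in the last Misc entry above (double integration of acceleration between detected gait events) can be illustrated with a minimal sketch. This is not the authors' algorithm: the single-axis acceleration input, the precomputed event indices, and the segment-mean subtraction (a crude zero-velocity-update drift correction that forces the integrated velocity back to zero at each event) are all simplifying assumptions.

```python
import numpy as np

def stride_lengths(accel, dt, events):
    """accel: 1-D forward acceleration samples (m/s^2); dt: sample period (s);
    events: sample indices of detected gait events (e.g. foot-flat).
    Returns one displacement estimate per inter-event segment."""
    a = np.asarray(accel, dtype=float)
    lengths = []
    for start, end in zip(events[:-1], events[1:]):
        seg = a[start:end] - a[start:end].mean()  # dedrift: v = 0 at both events
        v = np.cumsum(seg) * dt                   # first integration: velocity
        d = np.cumsum(v) * dt                     # second integration: position
        lengths.append(abs(d[-1]))
    return lengths
```

A real implementation would additionally estimate sensor orientation to rotate acceleration into a world frame and subtract gravity, which is precisely where the event detection and orientation estimation techniques compared in the paper come in.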
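The CO2-based resident localization idea from the smart-home entry above can likewise be sketched as a toy example. This is not the published system: the baseline and excess thresholds below are hypothetical values for illustration, and a real deployment would need per-room calibration and ventilation modeling.

```python
# Toy sketch (hypothetical thresholds): infer the resident's room from
# per-room CO2 readings, and reconstruct a coarse movement path from a
# chronological series of such readings.

BASELINE_PPM = 420        # assumed ambient/outdoor CO2 concentration
OCCUPANCY_EXCESS = 150    # assumed minimum excess (ppm) indicating presence

def locate_resident(readings):
    """readings: {room_name: ppm}. Return the room with the highest CO2
    level, or None if no room exceeds the occupancy threshold (away)."""
    room, ppm = max(readings.items(), key=lambda kv: kv[1])
    return room if ppm - BASELINE_PPM >= OCCUPANCY_EXCESS else None

def movement_path(snapshots):
    """snapshots: chronological list of readings dicts. Collapse consecutive
    detections of the same room into a coarse movement path."""
    path = []
    for snap in snapshots:
        room = locate_resident(snap)
        if room is not None and (not path or path[-1] != room):
            path.append(room)
    return path
```

The "at home or not" decision in the paper maps onto the None case here: when every room's CO2 level sits near baseline, the resident is presumed away.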

Research Projects

  • Robot music therapy for effective social training of ASD children

    Grant-in-Aid for Young Scientists (B)

    Project Year :

    2017.04
    -
    2020.03
     

    COSENTINO SARAH

    Authorship: Principal investigator

Specific Research

SMART – System for Multitask Analysis, Rating and Trending

    2020   Jia-Yeu LIN, Zixi GU, Mauricio MENDEZ, Shengxu LIU, Shi FENG

Summary

This project follows the idea presented in the rejected kakenhi project Kiban B, and serves as a base for starting research and acquiring preliminary data on unobtrusive and ecological methods to continuously analyze Multi-Task (MT) mobility and assess cognitive activity in activities of daily living (ADL). The core idea is to create a sensorized environment that can monitor users and automatically assess their behavior and vital parameters, detecting possible anomalies and triggering safety responses when needed, while respecting user privacy at all times. Such a system would prove very useful in ensuring safety and independence for senior citizens during ADL, drastically improving users' Quality of Life (QoL). Due to the pandemic, system development was severely delayed and experiments in real environments with external users were not possible. For this reason, we acquired all the necessary material and are planning to continue development next year. In particular, we will add a module for anonymized data upload to the cloud, and start monitoring experiments in the laboratory to refine the sensor system and the AI behavior data analysis module.

  • Development of gesture recognition system for therapy applications

    2019  

Summary

This project stemmed from the selected kakenhi project Wakate B 17K18178, and served as a base for continuing research on robotic platforms for cognitive and social therapy applications. We developed three systems: a gesture recognition system to analyze human movement via real-time video data analysis, a system for movement input recognition, and a system for real-time video gaze direction recognition. These systems were integrated into two therapy session protocols for ASD children, for social training interaction robotic therapy using our WAS (Waseda Anthropomorphic Saxophonist) robot. From the analysis of body movement and gaze direction, as well as facial expression when present, the robot can recognize basic interactive behaviors and emotional patterns of the child and adapt its performance accordingly, following the appropriate therapeutic protocol. In this way, the effects of active and receptive music therapy can also be combined, as the robot can choose appropriate melodies and tunes according to the current needs of the child under therapy. This supporting grant was useful for running more experiments and refining the prototype systems developed during Wakate B 17K18178.

  • Iconic facial expressions able robotic face for direct signaling in human-robot dynamic musical interaction

    2016   Salvatore Sessa

Summary

After an extensive evaluation and recalibration of the hardware needed to implement the proposed solution, we decided to shift the focus of the facial interaction from action to recognition. We implemented a system that allows the robot, using cloud computing and the Microsoft APIs, to recognize facial expressions and emotional cues, and adapt its performance accordingly. This system allows the robot not only to recognize specific facial expression cues from partner human players during a joint performance, but also, and especially, to recognize emotional expressions from the audience and respond in a tailored way. To do so, we had to implement the facial recognition software, and also improve the musical expressivity of the robot by refining its hardware. We then redesigned the robot control system to produce better and more diverse sounds, allowing the robot to add specific emotional connotations to the performed piece. In the future, we plan to redesign the entire robot hands, to extend the register range of the performance and broaden the possible musical emotional connotations.

 

Syllabus


Teaching Experience
