Updated: 2026/04/30


LIN Kau
林 家宇
Affiliation
International Center for Science and Engineering Programs, Faculty of Science and Engineering
Title
Lecturer (fixed-term)
Degree
Doctor of Engineering (March 2021, Waseda University)

Professional Experience

  • April 2023 - Present

    Lecturer, International Center for Science and Engineering Programs, Faculty of Science and Engineering, Waseda University

  • April 2019 - February 2024

    Part-time Lecturer, Robotics Major, Tokyo Design Technology Center College

  • April 2021 - March 2023

    Assistant Professor, Faculty of Science and Engineering, Waseda University

Education

  • April 2018 - March 2021

    Waseda University, Graduate School of Advanced Science and Engineering, Department of Integrative Bioscience and Biomedical Engineering

  • April 2016 - March 2018

    Waseda University, Graduate School of Advanced Science and Engineering, Department of Integrative Bioscience and Biomedical Engineering

  • September 2011 - January 2016

    National Taiwan University, Department of Physics

Research Keywords

  • Human interface

  • Intelligent mechanical systems

  • Robotics

 

Papers


Misc

  • Development of a conductor robot that mimics a human: development of a spherical link mechanism with the degrees of freedom of the human wrist

國谷大樹, 佐竹慶洸, 谷子曦, 林家宇, 高西淳夫

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 42nd, 2024

    J-GLOBAL

  • Development of an anthropomorphic saxophonist robot using a neck holder: improvement and re-verification of the current finger mechanism

    水上和彦, 内山純, 山田晃久, 馬翊翔, 韓衍, 林家宇, コセンティノ・サラ, 高西淳夫

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 38th, 2020

    J-GLOBAL

  • Development of an anthropomorphic saxophonist robot: design and fabrication of a new neck mechanism aimed at expanding the sound-pressure range

    山田晃久, 馬翊翔, 韓衍, 西尾祐哉, 内山純, 林家宇, 高西淳夫, コセンティノ・サラ

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 37th, September 2019

    J-GLOBAL

  • Development of an anthropomorphic saxophonist robot: evaluation of fingering using the arms and wrists

    西尾祐哉, 松木慧, 河合雅央, 林家宇, セッサ サルバトーレ, コセンティノ サラ, 上山景子, 高西 淳夫

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 35th, September 2017

  • Audience mood estimation for the Waseda Anthropomorphic Saxophonist No. 5 (WAS-5) using cloud cognitive services

Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 35th, September 2017

 

Courses Currently Taught


 

Grants for Special Research Projects (internal funding)

  • Development of an Omnidirectional Treadmill-Based Simulation Platform for Bipedal Walking Robot Training and Evaluation

2025


    1. Introduction
    This report presents the design, implementation, and evaluation of an omnidirectional treadmill-based simulation platform for training and assessing bipedal walking robots. The platform enables continuous locomotion in a constrained physical space while preserving natural gait dynamics, facilitating robust training, testing, and benchmarking of locomotion control algorithms under diverse conditions. In addition, this study incorporates the recently published work "MicCheck: Repurposing Off-the-Shelf Pin Microphones for Easy and Low-Cost Contact Sensing", which introduces a lightweight, cost-effective approach to contact sensing. By leveraging acoustic sensing to detect contact events, the platform extends its sensing capabilities beyond conventional force and motion measurements.

    2. System Design
    The system integrates a robotic locomotion interface with an omnidirectional treadmill, supporting unrestricted planar movement and real-time feedback. The design consists of the following components:
    • Mechanical structure: a low-inertia omnidirectional treadmill with a concave walking surface that passively recenters the robot. The surface is coated with low-friction materials to minimize resistance while ensuring stable foot-ground contact.
    • Locomotion support and stabilization: a harness-based support mechanism combined with a passive gimbal structure compensates vertical load and prevents falls without constraining horizontal motion, ensuring safety during unstable gait phases and early-stage controller training.
    • Actuation and drive system: multiple independently controlled omnidirectional rollers beneath the walking surface dynamically adjust their velocities to counteract the robot's motion, keeping it centered while allowing arbitrary walking directions.
    • Sensing and state estimation: motion-capture cameras, inertial measurement units (IMUs), and force sensors embedded in the treadmill surface. In addition, MicCheck-style low-cost pin microphones are embedded near the foot-contact regions to capture the high-frequency acoustic signals generated by foot-ground interactions. These signals are processed to detect contact timing, slippage, and subtle interaction dynamics that are difficult to capture with force sensors alone.
    • Control and software integration: a real-time control framework synchronizes treadmill actuation with the robot's locomotion. The system interfaces with robot controllers and simulation environments (e.g., reinforcement learning frameworks), enabling closed-loop training and evaluation. Adaptive control algorithms adjust the treadmill response based on predicted robot motion to minimize tracking error, and acoustic data from the microphone array is fused with conventional sensor data to enhance contact-state estimation.

    3. Implementation and Testing
    The system was implemented and evaluated with a standard bipedal humanoid robot platform. Key performance results:
    • Motion compensation accuracy: the treadmill kept the robot within a 5 cm radius of the center during continuous walking at speeds up to 1 m/s, demonstrating effective motion compensation across multiple directions.
    • Gait stability and repeatability: robots trained on the platform exhibited stable walking patterns with reduced lateral drift compared to fixed-ground experiments; step variability decreased significantly, indicating improved gait consistency.
    • Contact detection enhancement: the MicCheck-inspired sensing detected foot-ground contact events with higher temporal resolution and enabled detection of micro-slip and impact characteristics, improving gait-phase estimation and control responsiveness.
    • Training efficiency: when integrated with reinforcement learning algorithms, the platform reduced training time because locomotion continues without boundary interruptions.

    4. Results and Discussion
    The omnidirectional treadmill-based platform demonstrated substantial advantages for bipedal robot training and evaluation. Continuous walking without spatial constraints enabled longer and more diverse locomotion sequences, improving both controller robustness and generalization. The MicCheck-based acoustic sensing provided a novel and effective complement to traditional sensing modalities: unlike force sensors, the microphone-based approach captures high-frequency contact dynamics, enabling more precise detection of initial contact, slip onset, and surface-interaction characteristics, which significantly enhanced the fidelity of gait-phase estimation. The platform proved particularly effective for data-driven approaches such as reinforcement learning, where uninterrupted motion and richer sensory input improved sample efficiency and policy robustness, and the safety mechanisms enabled aggressive experimentation without risking hardware damage. Some limitations remain: acoustic sensing is sensitive to environmental noise and requires signal filtering and calibration, and differences between the treadmill surface and natural terrain may still influence locomotion dynamics, particularly at higher speeds, where minor tracking errors were observed.

    5. Conclusion
    This study developed an omnidirectional treadmill-based simulation platform for bipedal walking robot training and evaluation, enhanced by the integration of MicCheck's low-cost acoustic contact sensing. The system provides a controlled, safe, and efficient environment for continuous locomotion experiments, significantly improving training efficiency, gait stability, and contact awareness. Future work will focus on noise-robust acoustic signal processing, higher-speed response, and variable-terrain simulation; further integration of multimodal sensing, including vision and tactile feedback, will extend the platform's applicability to more complex real-world locomotion scenarios.
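    The re-centering behavior described in the report (belt velocities that counteract the robot's motion while pulling it back toward the center) can be sketched as a simple proportional controller with velocity feed-forward. This is an illustrative reconstruction under stated assumptions, not the platform's actual control code; the function names, gain value, and discrete-time toy model are all assumptions.

    ```python
    # Illustrative treadmill re-centering sketch (assumed names and gains).

    def belt_command(robot_pos, robot_vel, kp=4.0):
        """2-D belt surface velocity: feed-forward of the robot's own walking
        velocity plus a proportional pull back toward the center (origin)."""
        vx = robot_vel[0] + kp * robot_pos[0]
        vy = robot_vel[1] + kp * robot_pos[1]
        return (vx, vy)

    def simulate(start=(0.10, 0.0), walk_vel=(1.0, 0.0), kp=4.0, dt=0.01, steps=200):
        """Toy discrete-time model: the robot walks at walk_vel while the belt
        carries the surface backward; its world position decays to the center."""
        x, y = start
        for _ in range(steps):
            bx, by = belt_command((x, y), walk_vel, kp)
            x += (walk_vel[0] - bx) * dt   # net drift = walking minus belt motion
            y += (walk_vel[1] - by) * dt
        return x, y
    ```

    In this toy model the offset decays as x(t+dt) = (1 - kp*dt) * x(t), so a robot starting 10 cm off-center while walking at 1 m/s is pulled back well inside the 5 cm radius the report cites.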

  • Design and Implementation of a Tactile Sensation Simulator for Enhanced Interaction in Virtual Reality Environments

2024


    1. Introduction
    This report presents the design and implementation of an independent finger-joint force feedback system for a tactile sensation simulator in virtual reality (VR) environments. The system provides precise force feedback at individual finger joints, improving the realism of virtual object manipulation.

    2. System Design
    The system is a wearable exoskeleton with actuators that independently apply force feedback to each finger joint:
    • Mechanical structure: a lightweight, ergonomic framework that allows natural hand movement while supporting force feedback.
    • Actuation mechanism: miniature servo motors generate controlled resistance at the metacarpophalangeal, proximal interphalangeal, and distal interphalangeal joints.
    • Sensors and control: embedded force and position sensors dynamically adjust feedback intensity based on user interactions in VR.
    • Software integration: a real-time control algorithm synchronizes force feedback with VR interactions, ensuring responsive, realistic touch sensations.

    3. Implementation and Testing
    The system was built and integrated with a VR simulation to evaluate its performance:
    • Force precision and response time: the actuators provided forces up to 5 N per joint with an average response time of 20 ms, ensuring smooth interaction with virtual objects.
    • User comfort and wearability: the lightweight exoskeleton (approximately 250 g) was tested for extended wear without discomfort.
    • Interaction accuracy: users differentiated virtual objects of varying stiffness and texture with over 85% accuracy, demonstrating enhanced tactile perception.
    • System latency: end-to-end latency, including sensor feedback and actuation, stayed below 30 ms, meeting VR real-time interaction requirements.

    4. Results and Discussion
    The independent finger-joint force feedback system significantly improved the realism of VR interactions. Localized force feedback at each joint allowed more nuanced object manipulation, such as grasping, pressing, and sliding motions. Users reported increased immersion and tactile awareness, preferring the system over traditional haptic gloves.

    5. Conclusion
    This research developed and implemented an independent finger-joint force feedback system for VR applications. Future improvements include optimizing actuator efficiency, refining the control algorithms, and integrating additional sensory modalities (e.g., temperature and texture simulation) to further enhance tactile realism in VR environments.
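    The per-joint force rendering described above (forces up to 5 N per joint, varying stiffness across virtual objects) can be sketched as a virtual-spring contact model with actuator saturation. The stiffness value and function name below are illustrative assumptions; only the 5 N limit comes from the report.

    ```python
    # Illustrative per-joint force rendering (assumed stiffness; 5 N actuator limit).

    def joint_force(penetration_m, stiffness=800.0, max_force=5.0):
        """Resistive force (N) for one finger joint: a virtual spring engages
        when the fingertip penetrates the virtual surface, clipped to the
        actuator's force limit."""
        if penetration_m <= 0.0:        # no contact with the virtual surface
            return 0.0
        return min(stiffness * penetration_m, max_force)
    ```

    Varying the stiffness per virtual object is what would let users distinguish soft from hard surfaces, consistent with the stiffness-discrimination result reported above.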

  • Development of a multi-sensory feedback system for interactive exploration of virtual reality

2023


    This year's research accomplished the following tasks:
    1) Proposed a multi-sensory feedback system architecture, including haptic feedback devices, audio systems, and scent dispensers.
    2) Software development, including integration with VR platforms and programming of the sensory stimuli.
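    The architecture in (1), which coordinates haptic, audio, and scent outputs from VR interaction events, can be sketched as a small event-to-cue dispatcher. The event fields, cue names, and `dispatch` interface below are hypothetical illustrations, not the project's actual software.

    ```python
    # Hypothetical event-to-cue dispatcher for a multi-sensory VR feedback system.
    from dataclasses import dataclass

    @dataclass
    class Cue:
        modality: str   # "haptic" | "audio" | "scent" (assumed modality names)
        payload: dict   # device-specific parameters

    def dispatch(event):
        """Map one VR interaction event to the sensory cues each output
        device should render; contact events drive haptics and audio,
        zone entry drives the scent dispenser."""
        cues = []
        if event.get("contact"):
            cues.append(Cue("haptic", {"intensity": event.get("force", 0.0)}))
            cues.append(Cue("audio", {"sample": "impact"}))
        if event.get("scent_zone"):
            cues.append(Cue("scent", {"channel": event["scent_zone"]}))
        return cues
    ```

    Keeping the modalities behind one dispatcher is one way to keep the haptic, audio, and scent devices synchronized to the same VR event stream.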

  • Development of an instrumental ensemble system on musical robots for human-robot musical interaction

2022


    Our research models the behaviors of human musicians in a quantified approach to understand the mechanisms of playing instruments and reproduces them on musical robots to realize bidirectional musical human-robot interaction. The system is developed in two parts: (1) a musical expression system with psychoacoustic elements for interactive performance on an anthropomorphic saxophonist robot; (2) a conductor robot that performs conducting gestures, with an algorithm that generates commands based on auditory information.

  • Development of a conductor robot for effective emotional human-robot musical interaction

2022


    Our research models the behaviors of human musicians and a conductor in a quantified approach to understand the mechanisms of exchanging musical information and reproduces them on musical robots to realize bidirectional musical human-robot interaction. The system is developed in two parts: (1) a musical expression system with psychoacoustic elements for interactive performance on an anthropomorphic saxophonist robot; (2) a conductor robot that performs conducting gestures, with an algorithm that generates commands based on auditory information.
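    The command-generation step in (2), turning auditory information into conducting commands, can be sketched as tempo estimation from detected beat onsets. The function below is an illustrative assumption, not the robot's actual algorithm; it takes beat-onset times (which a real system would extract from the audio) and returns the tempo a conducting gesture would convey.

    ```python
    # Illustrative tempo estimation from beat onsets (assumed interface).

    def tempo_bpm(beat_times):
        """Estimate tempo in BPM from a list of beat-onset times in seconds,
        using the mean inter-beat interval; returns None with fewer than
        two detected beats."""
        if len(beat_times) < 2:
            return None
        intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
        return 60.0 / (sum(intervals) / len(intervals))
    ```

    For example, beats detected every 0.5 s correspond to a 120 BPM conducting command.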

  • Development of an anthropomorphic saxophonist robot performance system for human-robot musical interaction

2021


    Music is a social activity that can powerfully influence large groups of people. In the robotics field, several musical robots have been developed to play different kinds of instruments in order to provide an entertainment experience comparable to human musicians. In this research, we aimed to model and reproduce human instrument-playing behavior on the Waseda Anthropomorphic Saxophonist (WAS) robot and build an interactive system for bidirectional musical human-robot interaction. In a high-quality musical performance, techniques such as dynamics give extra expression to the music for emotional interaction. Since the WAS robot has been developed to play a typical alto saxophone with mechanisms similar to a human player's, we reproduce human playing behavior by modeling the musician's motions. The system is based on two parts: delivering expressive performance and analyzing the emotional instructions in the music pieces.
