Xuanyou Liu

Hello, I'm

Xuanyou Liu (Zed)

Ph.D. Student in Human-Computer Interaction

Department of Computer Science
Northwestern University

About

I design and build interactive systems that augment human capabilities through novel sensing and haptic technologies.

I am a Ph.D. student in Computer Science at Northwestern University (SPICE Lab), advised by Prof. Karan Ahuja. My research lies at the intersection of human-computer interaction, wearable computing, and haptic interfaces.

Before joining Northwestern, I worked at the University of Chicago (HCI Lab) with Prof. Pedro Lopes. I received my M.S.E. in Robotics from the University of Pennsylvania (GRASP Lab) and my B.E. in Industrial Design from Xi'an Jiaotong University.

News

  • Sep 2025 Started my Ph.D. at Northwestern University!
  • May 2025 Graduated from UPenn with M.S.E. in Robotics (GPA: 4.0, Rank: 1/64).
  • Jan 2025 Our paper "Seeing with the Hands" was accepted to CHI 2025!
  • May 2024 Presented TacTex at CHI 2024 in Honolulu, Hawaii.

Publications

* denotes equal contribution

Seeing with the Hands: A Sensory Substitution That Supports Manual Interactions

Shan-Yuan Teng*, Gene Kim*, Xuanyou Liu*, Pedro Lopes

CHI 2025 | ACM Conference on Human Factors in Computing Systems

Sensory-substitution devices enable perceiving objects by translating one modality (e.g., vision) into another (e.g., tactile). While many explored the placement of the haptic-output (e.g., torso, forehead), the camera's location remains largely unexplored—typically seeing from the eyes' perspective. Instead, we propose that seeing & feeling information from the hands' perspective could enhance flexibility & expressivity of sensory-substitution devices to support manual interactions with physical objects. To this end, we engineered a back-of-the-hand electrotactile-display that renders tactile images from a wrist-mounted camera, allowing the user's hand to feel objects while reaching & hovering. We conducted a study with sighted/Blind-or-Low-Vision participants who used our eyes vs. hand tactile-perspectives to manipulate bottles and soldering-irons, etc. We found that while both tactile perspectives provided comparable performance, when offered the opportunity to choose, all participants found value in also using the hands' perspective. Moreover, we observed behaviors when "seeing with the hands" that suggest a more ergonomic object-manipulation. We believe these insights extend the landscape of sensory-substitution devices.
TacTex: A Textile Interface with Seamlessly-Integrated Electrodes for High-Resolution Electrotactile Stimulation

Hongnan Lin, Xuanyou Liu, Shengsheng Jiang, Qi Wang, Ye Tao, Guanyun Wang, Wei Sun, Teng Han, Feng Tian

CHI 2024 | ACM Conference on Human Factors in Computing Systems

This paper presents TacTex, a textile-based interface that provides high-resolution haptic feedback and touch-tracking capabilities. TacTex utilizes electrotactile stimulation, which has traditionally posed challenges due to limitations in textile electrode density and quantity. TacTex overcomes these challenges by employing a multi-layer woven structure that separates conductive weft and warp electrodes with non-conductive yarns. The driving system for TacTex includes a power supply, sensing board, and switch boards to enable spatial and temporal control of electrical stimuli on the textile, while simultaneously monitoring voltage changes. TacTex can stimulate a wide range of haptic effects, including static and dynamic patterns and different sensation qualities, with a resolution of 512 × 512 and based on linear electrodes spaced as closely as 2mm. We evaluate the performance of the interface with user studies and demonstrate the potential applications of TacTex interfaces in everyday textiles for adding haptic feedback.

Projects

Compact Electrotactile Module for Wearable Haptic Interfaces

Independent study at the University of Pennsylvania developing compact electrotactile stimulation modules for wearable haptic feedback systems.

Variable Topology Trusses Robot

Research at ModLab, University of Pennsylvania, on reconfigurable truss robots capable of changing their topology for adaptive locomotion and manipulation.

Ins-Bucks Food 3D Printer

Award-winning design (Most Innovative Award) at the RCA-Imperial Design for Global Challenges Competition, focusing on sustainable food printing technology.

Teaching & Leadership

Teaching Assistant: ESE5190 Smart Devices

Graduate course at the University of Pennsylvania covering embedded systems, IoT, and smart device development.

University of Pennsylvania

Code Instructor

Teaching Python & Arduino programming fundamentals to middle school students at Fife-Penn CS Academy.

Fife-Penn CS Academy

Student Representative

Represented students at the Global Youth Forum 2021, organized by the International Labour Organization.

ILO Global Youth Forum 2021

Contact

I'm always happy to chat about research, potential collaborations, or just to connect!

Room 3546, Mudd Building
Department of Computer Science
Northwestern University
Evanston, IL

Let's Connect

Feel free to reach out if you're interested in novel sensing & haptic technologies in HCI. I'm also happy to discuss potential research collaborations or just have a casual chat about the field!