NYU GLASS

Designing accessible sensing systems that empower independent living.

The GLASS Lab at NYU explores how assistive technologies, wearable devices, and spatial computing intersect to improve quality of life. This page captures my ongoing work, collaborations, and research projects within the program.

Involvements

Investing in the GLASS community through mentorship and research.

I work alongside faculty and student teams to translate inclusive design principles into lab programs, workshops, and prototyping sessions. These initiatives help GLASS share knowledge across the NYU ecosystem while supporting a thoughtful maker culture.

Faculty Collaboration

Partnering with advisors to shape semester roadmaps and cross-lab research goals.

  • Coordinate weekly syncs to align experiment priorities.
  • Document findings and surface blockers early for design reviews.

Student Mentorship

Providing technical guidance for multi-disciplinary student cohorts exploring assistive tech.

  • Host office hours focused on ideation and rapid prototyping.
  • Share project planning frameworks adapted for GLASS timelines.

Community Programming

Supporting workshops that bridge accessibility advocacy with emerging tools.

  • Develop interactive demos showcasing wearable sensing concepts.
  • Facilitate feedback sessions with community partners and users.

Experiences

Hands-on work translating research into tangible prototypes.

From embedded software experiments to data collection pilots, each project strengthens my ability to ship inclusive technology. These highlights represent the core experiences that inform my contributions to GLASS.

Sensing Pipeline Pilot

Prototyped a wearable sensor pipeline on lightweight edge hardware that streams real-time mobility metrics for lab studies.

  • Implemented data capture scripts with error detection and recovery.
  • Mapped calibration workflow to minimize participant setup time.
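The error-detection-and-recovery idea behind those capture scripts can be illustrated with a minimal, hypothetical sketch: retry transient sensor faults a few times, and record an explicit gap rather than silently dropping a sample. The `capture` function, retry counts, and exception type here are illustrative assumptions, not the lab's actual implementation.

```python
import time

def capture(sensor, n_samples, max_retries=3):
    """Collect n_samples readings from a sensor callable, retrying
    transient failures. (Hypothetical sketch; the real pipeline's
    interfaces are not shown here.)

    Readings that still fail after max_retries are recorded as None so
    downstream analysis can see the gap instead of silently skipping it.
    """
    samples = []
    for _ in range(n_samples):
        for attempt in range(max_retries):
            try:
                samples.append(sensor())  # one reading; may raise IOError
                break
            except IOError:
                time.sleep(0.05 * attempt)  # brief, growing backoff
        else:
            samples.append(None)  # mark an unrecoverable gap explicitly
    return samples
```

Keeping failed slots as `None` preserves the sample count, which matters when readings are later aligned against a fixed-rate timeline.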

Accessibility UX Audit

Led heuristic reviews of existing GLASS prototypes, prioritizing refinements for auditory and tactile feedback support.

  • Defined scoring rubric aligned to WCAG and lab research objectives.
  • Delivered iterative design notes to accelerate developer handoff.

Spatial Computing Sprint

Explored how AR affordances can assist indoor navigation for visually impaired users, experimenting with GLASS lab datasets.

  • Created proof-of-concept overlays for haptic direction cues.
  • Documented technical trade-offs for continued investigation.
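One way to picture the haptic direction cues above is as a quantization step: compute the bearing from the user to the next waypoint, compare it to the user's heading, and collapse the difference into a coarse cue the hardware can render. This is a hypothetical sketch under assumed conventions (standard math axes, counterclockwise-positive angles); the function names and tolerance value are illustrative, not the sprint's actual code.

```python
import math

def bearing_deg(x, y, wx, wy):
    """Absolute bearing in degrees from the user at (x, y) to a
    waypoint at (wx, wy), using math-convention axes (assumption)."""
    return math.degrees(math.atan2(wy - y, wx - x))

def haptic_cue(heading_deg, target_bearing_deg, tolerance=15.0):
    """Collapse the relative bearing into a coarse haptic cue.

    The relative angle is normalized to (-180, 180] so the cue always
    points along the shorter turn direction.
    """
    rel = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(rel) <= tolerance:
        return "ahead"
    return "left" if rel > 0 else "right"
```

A coarse three-way cue like this trades precision for clarity, which is often the right call when feedback is delivered through vibration rather than a screen.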

GLASS Project

Project: Adaptive Indoor Navigation Assistant.

This ongoing project pairs depth sensing with audio feedback to help students and visitors orient themselves inside dense campus spaces. The concept is built collaboratively with GLASS stakeholders and community testers.

Current Focus

We are validating the user journey from onboarding to daily use. The prototype blends a lightweight wearable node with a mobile companion experience to deliver contextual cues.
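The node-to-companion flow described above can be sketched as a small message protocol: the wearable node encodes a depth reading as a compact event, and the companion app decides whether it warrants a spoken cue. Everything here is a hypothetical illustration — the message schema, field names, and 1.5-meter announcement threshold are assumptions, not the project's actual protocol.

```python
import json

def obstacle_event(distance_m, direction):
    """Encode a depth-sensing reading as a compact JSON message the
    wearable node might send to the companion app (assumed schema)."""
    return json.dumps({
        "type": "obstacle",
        "distance_m": distance_m,
        "direction": direction,
    })

def to_audio_cue(message):
    """Translate a node message into a short spoken prompt, announcing
    only obstacles closer than an assumed 1.5 m threshold."""
    event = json.loads(message)
    if event["type"] == "obstacle" and event["distance_m"] < 1.5:
        return f"Obstacle {event['direction']}, {event['distance_m']:.1f} meters"
    return None  # nothing urgent to announce
```

Filtering on the companion side keeps the wearable node simple: it streams everything it senses, and the policy for what becomes an audible cue lives in one place that is easy to tune with testers.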

2025 Assistive Tech Research & Development