
At SignAR, our goal is to break down barriers for non-verbal communicators.

We believe everyone has an opinion, and opportunities should be open to all.

Built with 
Augmented Reality / Unity / Oculus / Leap Motion / C# / Adobe Illustrator / Interviews


Video

Inspiration

We were inspired to use augmented reality to create an experience that allows people who are hearing and speech impaired to communicate via sound and text.

SignAR is a next-generation sign language application, providing non-verbal communicators with new tools to communicate.

*This project was built at the 2020 MIT XR hackathon, “MIT Reality Hack”.


Functions

How it works


1. Translate user’s sign language

Gesture → Sound

2. Translate conversation partner’s voice

Voice → Text

3. Hot keys

Customize vocabulary and phrases
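
To make the three functions above concrete, here is a minimal Unity/C# sketch of how gesture-to-sound output and customizable hot keys could be wired together. The class name, the OnGestureRecognized callback, and the key bindings are illustrative assumptions, not the actual hackathon code.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: turns recognized gesture labels into spoken phrases
// (Gesture → Sound) and exposes customizable hot keys for frequent phrases.
public class SignToSpeech : MonoBehaviour
{
    public AudioSource voiceOutput;   // plays a pre-recorded clip per phrase
    public Text captionText;          // mirrors the spoken phrase as on-screen text

    // Hot-key slots the user can customize with their own vocabulary.
    private readonly Dictionary<KeyCode, string> hotKeys = new Dictionary<KeyCode, string>
    {
        { KeyCode.Alpha1, "Hello, nice to meet you." },
        { KeyCode.Alpha2, "Could you please speak a little slower?" }
    };

    // Audio clips keyed by phrase text, populated at startup.
    public Dictionary<string, AudioClip> phraseClips = new Dictionary<string, AudioClip>();

    // Called by the gesture recognizer (e.g. a classifier on Leap Motion hand data);
    // here the recognizer's label doubles as the phrase to speak.
    public void OnGestureRecognized(string phrase)
    {
        Say(phrase);
    }

    void Update()
    {
        // Hot keys: fire a customized phrase without signing it.
        foreach (var entry in hotKeys)
        {
            if (Input.GetKeyDown(entry.Key))
            {
                Say(entry.Value);
            }
        }
    }

    private void Say(string phrase)
    {
        captionText.text = phrase;
        if (phraseClips.TryGetValue(phrase, out AudioClip clip))
        {
            voiceOutput.PlayOneShot(clip);
        }
    }
}
```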

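For the Voice → Text direction, one way to prototype live captions on a Windows-tethered headset is Unity's built-in DictationRecognizer; this is a hedged sketch of that approach, not necessarily the speech-to-text pipeline used in the prototype.

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Windows.Speech;   // Windows 10 dictation API bundled with Unity

// Hypothetical sketch: streams the conversation partner's speech into an AR caption.
public class VoiceToCaption : MonoBehaviour
{
    public Text captionText;                 // world-space text shown in the headset
    private DictationRecognizer recognizer;

    void Start()
    {
        recognizer = new DictationRecognizer();

        // Show intermediate guesses while the partner is still speaking.
        recognizer.DictationHypothesis += (text) => captionText.text = text;

        // Replace the caption with the final transcription of each utterance.
        recognizer.DictationResult += (text, confidence) => captionText.text = text;

        recognizer.Start();
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            recognizer.Dispose();
        }
    }
}
```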

This tool could be a significant product for non-verbal communicators both locally and globally.

Exploring Watson integration would allow machine learning to expand the vocabulary and scale the potential for meaningful communication around the world.

Future Features

  • Integrate with open-source Watson and leverage machine learning to rapidly build out a robust vocabulary database (see the sketch after this list).

  • Integrate an existing open-source voice-to-gesture project from a past MIT hackathon, effectively closing the loop on meaningful communication for non-verbal communicators.

  • Develop directional audio functionality, allowing for a presentation mode or more private conversations.
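
As a rough illustration of what an expandable vocabulary database could look like ahead of a Watson integration, the sketch below stores gesture-phrase pairs in a JSON file that an external service could keep growing; the type names and file format are assumptions, not the project's actual data model.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Hypothetical sketch: a vocabulary database of gesture-phrase pairs that an
// external machine-learning service (e.g. Watson) could expand over time.
[Serializable]
public class VocabularyEntry
{
    public string gestureLabel;   // label emitted by the gesture recognizer
    public string phrase;         // text/speech output for that gesture
}

[Serializable]
public class VocabularyDatabase
{
    public List<VocabularyEntry> entries = new List<VocabularyEntry>();

    // Reload a JSON file that a cloud service could keep appending to.
    public static VocabularyDatabase LoadFrom(string path)
    {
        string json = File.ReadAllText(path);
        return JsonUtility.FromJson<VocabularyDatabase>(json);
    }
}
```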


Team Members

Ajinkya Hukerikar

Video, Interview

Alistair Leyland

I produced the project, helped with the video, led aspects of the pitch presentation,
and tried to keep the team super positive during a very condensed timeline.

Runze Zhang

Experienced in AR/VR multi-user development for collective spatial experiences.

I created a mini-base for recognizing sign language and sentences, and built the interaction functions in Unity.

Soyoung Lim

I worked on UX/UI and logo design. I tried to maximize the usability of
SignAR's functions in face-to-face conversation.

Zhuoneng Wang

I am an XR developer who has a deep interest in interactive technology.

HMU if you want to chat about XR, AEC and anything else!

I originated the idea for this project and worked on hand gesture recognition features

and general Unity issues to make sure the prototype worked properly.
