
Navi Logo

navi - an ambient smart home experience

Inspiration

We believe the next step for user interfaces is almost no interface at all. In a smart home, accessing information should happen instantly, and we believe our project, Navi, is a step toward that future.

What it does

Navi is an ambient user experience platform for the smart home, built on top of three core technologies: the Nest Camera, the Amazon Echo, and the Navi Web App.

Navi uses the Nest Camera and extends its motion detection capabilities with Microsoft's Facial Recognition API, allowing it to recognize users from the camera's continuous video feed.

Once a user is recognized entering the house, their next interactions with the Navi Alexa Skill are personalized to them, letting them obtain information and perform actions tied to their own account.

Interacting with the Navi Alexa Skill also updates the user's session in real time: it connects directly to their associated Navi account and populates the Navi web app with whatever requests they are currently making, providing extended information on their computer as well as their phone.
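
A minimal sketch of that real-time session update, assuming a `firebase-admin` setup on the server; the `sessions/{userId}` path and function names here are illustrative assumptions for this example, not our exact schema:

```javascript
// Sketch: record the user's latest Alexa request under their session node so
// the web app can pick it up in real time. Paths and names are illustrative.
const admin = require('firebase-admin');

// Assumes credentials are provided via the environment (e.g. a service account).
admin.initializeApp({
  databaseURL: 'https://<your-project>.firebaseio.com' // placeholder
});

// Called whenever the Alexa Skill handles a request for a recognized user.
function updateSession(userId, intentName, payload) {
  return admin.database()
    .ref(`sessions/${userId}/lastRequest`)
    .set({
      intent: intentName,
      payload: payload || null,
      timestamp: admin.database.ServerValue.TIMESTAMP
    });
}

module.exports = { updateSession };
```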

With this quick recognition system, users set up once and never have to fumble through clunky user interfaces to access their personal information again. Best of all, thanks to the facial comparison we implemented with Project Oxford, multiple users can each have their own dynamic experience on a single set of devices.

How we built it

Our team used the Nest Cam to detect motion and capture images, which we then sent to Microsoft's Cognitive Services to be processed by their Facial Recognition API. This returns the most likely match from our database of recorded user faces, and a recognition microservice, built on NodeJS and deployed on AWS Lambda, uses that match to populate our Firebase-connected Navi server with the last recognized user.
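
A simplified sketch of that pipeline is below, assuming Node 18+ (for the global `fetch`), the Face API v1.0 REST endpoints, and a snapshot URL taken from a Nest Cam motion event; the region, person-group id, and Firebase paths are illustrative assumptions:

```javascript
// Sketch of the recognition step: detect a face in a Nest Cam snapshot, identify
// it against a trained person group, and record the best match in Firebase.
// Region, group id, and database paths are illustrative.
const admin = require('firebase-admin');

admin.initializeApp({
  databaseURL: 'https://<your-project>.firebaseio.com' // placeholder
});

const FACE_API = 'https://westus.api.cognitive.microsoft.com/face/v1.0';
const FACE_KEY = process.env.FACE_API_KEY;

async function recognizeAndRecord(snapshotUrl) {
  // 1. Detect faces in the snapshot image captured on a Nest Cam motion event.
  const detectRes = await fetch(`${FACE_API}/detect`, {
    method: 'POST',
    headers: {
      'Ocp-Apim-Subscription-Key': FACE_KEY,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ url: snapshotUrl })
  });
  const faces = await detectRes.json();
  if (!Array.isArray(faces) || faces.length === 0) return null;

  // 2. Identify the detected face against our pre-trained person group.
  const identifyRes = await fetch(`${FACE_API}/identify`, {
    method: 'POST',
    headers: {
      'Ocp-Apim-Subscription-Key': FACE_KEY,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      personGroupId: 'navi-users', // illustrative group name
      faceIds: [faces[0].faceId],
      maxNumOfCandidatesReturned: 1
    })
  });
  const [result] = await identifyRes.json();
  const best = result && result.candidates && result.candidates[0];
  if (!best) return null;

  // 3. Record the most likely match so the Alexa Skill and web app can use it.
  await admin.database().ref('house/lastRecognizedUser').set({
    personId: best.personId,
    confidence: best.confidence,
    recognizedAt: admin.database.ServerValue.TIMESTAMP
  });
  return best.personId;
}
```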

Once face recognition was working, we created the Navi Alexa Skill for the Amazon Echo, also built on NodeJS and deployed on AWS Lambda.
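
In its simplest form, a Lambda-hosted skill handler looks roughly like the sketch below; the intent name and speech text are illustrative, and the Firebase lookup of the recognized user is only marked as a comment:

```javascript
// Minimal sketch of an Alexa Skill handler on AWS Lambda, using the raw
// Alexa Skills Kit request/response JSON. Intent names and text are illustrative.
exports.handler = async (event) => {
  let speech = 'Welcome to Navi.';

  if (event.request.type === 'IntentRequest') {
    const intent = event.request.intent.name;
    if (intent === 'GetMyInfoIntent') {
      // This is where the real skill would look up the last recognized user
      // in Firebase and tailor the answer (and session) to that account.
      speech = 'Here is the information for the last person Navi recognized.';
    }
  }

  return {
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text: speech },
      shouldEndSession: true
    }
  };
};
```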

These two microservices then passed data back and forth with the Navi EC2 web app, which used Firebase and an Angular front end to manipulate and display users' interactions and information dynamically across both the Navi Alexa Skill and the Navi Web App.
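
On the front end, keeping everything in sync comes down to subscribing to the same Firebase paths the microservices write to. A browser-side sketch using the Firebase web SDK (namespaced API) is below; the config values, paths, and `renderRequest` helper are illustrative, and in the real app this logic would live inside the Angular front end:

```javascript
// Browser-side sketch: follow the most recently recognized user and mirror
// their latest request into the UI. Assumes the Firebase web SDK scripts are
// loaded; config values and database paths are illustrative.
firebase.initializeApp({
  apiKey: '<api-key>',
  databaseURL: 'https://<your-project>.firebaseio.com'
});

const db = firebase.database();
let requestRef = null;

db.ref('house/lastRecognizedUser').on('value', (snapshot) => {
  const user = snapshot.val();
  if (!user) return;

  // Detach the previous listener before following the newly recognized user.
  if (requestRef) requestRef.off();
  requestRef = db.ref(`sessions/${user.personId}/lastRequest`);
  requestRef.on('value', (reqSnap) => renderRequest(user.personId, reqSnap.val()));
});

// Illustrative placeholder: in the real app this would update Angular state.
function renderRequest(userId, request) {
  console.log('User', userId, 'asked for', request && request.intent);
}
```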

Challenges we ran into

No WiFi. No power. And the challenge of getting different pieces of hardware and APIs to work together and stay in sync across all devices using Firebase.

Accomplishments that we're proud of

What we learned

What's next for Navi

We want our homes to be connected with technology that lets us become our very own Tony Starks, and we believe the next step toward that is designing an easy-to-integrate interface for Navi so that any developer can add their own APIs and services to our platform.
