This app was developed by me and two other students for a group project at Boise State University. The goal of the app is to help users who are autistic or non-verbal communicate what they want more easily. Users point the phone camera at an item, the app identifies the object through machine learning, and the user can then save the object to a favorites list or use text-to-speech to convey that they 'want' or 'don't want' it.
Development
I worked primarily as a developer on this project. One of my group members created a wireframe of what the app should look like, and I was in charge of building the favorites list page. The app needed to persist user data, so I decided to implement Google Firebase with Cloud Firestore. When a user took a picture of an object and hit the favorite button, the image and the object's identification were saved to the Firebase database for later use. This also allowed the user's favorites list to be synced across all of their devices.
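To make that flow concrete, here is a minimal sketch of how such a favorite-save might look on Android with the Firebase KTX libraries. The `FavoriteItem` class, field names, and storage paths are assumptions for illustration, not the project's actual code.

```kotlin
import com.google.firebase.auth.ktx.auth
import com.google.firebase.firestore.ktx.firestore
import com.google.firebase.ktx.Firebase
import com.google.firebase.storage.ktx.storage

// Hypothetical schema: the label from the ML model plus a link to the photo.
data class FavoriteItem(
    val label: String = "",     // object name produced by the image classifier
    val imageUrl: String = "",  // download URL of the photo in Cloud Storage
    val createdAt: Long = 0L
)

// Sketch of the "favorite" button flow: upload the photo to Cloud Storage,
// then record it in Firestore under the current user.
fun saveFavorite(imageBytes: ByteArray, label: String) {
    val uid = Firebase.auth.currentUser?.uid ?: return
    val photoRef = Firebase.storage.reference
        .child("users/$uid/favorites/${System.currentTimeMillis()}.jpg")

    photoRef.putBytes(imageBytes)
        .continueWithTask { task ->
            if (!task.isSuccessful) throw task.exception!!
            photoRef.downloadUrl
        }
        .addOnSuccessListener { url ->
            // Keying the document under the user's ID is what lets the
            // favorites list sync across all of that user's devices.
            Firebase.firestore
                .collection("users").document(uid)
                .collection("favorites")
                .add(FavoriteItem(label, url.toString(), System.currentTimeMillis()))
        }
}
```

Storing only the image's download URL in the Firestore document, rather than the image bytes themselves, keeps documents small and lets the favorites list load quickly on any device.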