January 19, 2019

Augmented Reality Manual

An instructional assembly manual for putting together a piece of furniture. The manual guides the user through an AR experience that shows exactly how the parts fit together and which parts to assemble next.

The technologies used for this project are the following:

  • ARKit 2.0 – for plane detection and for tracking the phone's position and the surrounding 3D space.
  • Machine Learning – a YOLOv2 network, trained via transfer learning, that detects objects in the camera images.
  • Xcode – for developing the iOS app.
  • TensorFlow – for the Python script used to train the network.
  • CoreML – for importing the trained model into Xcode.
  • Apple Vision – for tracking objects in real time and for barcode scanning.
  • Scrum – for collaboration within the team.
  • LaTeX – for writing the report.
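A YOLO-style detector typically proposes several overlapping candidate boxes for the same furniture part in a frame. A standard post-processing step for this (not detailed above, so the function names and thresholds here are illustrative assumptions, not the project's actual code) is non-maximum suppression based on intersection-over-union:

```python
# Illustrative sketch: filtering YOLO-style detections with
# non-maximum suppression (NMS) using intersection-over-union (IoU).
# Boxes are (x_min, y_min, x_max, y_max); detections are (box, score).

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(detections, iou_threshold=0.5):
    """Keep the highest-scoring box in each cluster of overlapping boxes."""
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(box, k) < iou_threshold for k, _ in kept):
            kept.append((box, score))
    return kept

# Two near-duplicate detections of the same part plus one distinct part:
dets = [((0, 0, 10, 10), 0.9), ((1, 1, 11, 11), 0.8), ((50, 50, 60, 60), 0.7)]
print(nms(dets))  # the 0.8 box overlaps the 0.9 box and is suppressed
```

The surviving boxes are what the app would hand to ARKit/Vision for real-time tracking of each detected part.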

The full project report can be read here, and the repository is available on GitHub.
