Project name


  • GUITAR

Date


  • Mar. 2021 – Jul. 2021

Target device


  • AR glasses

Software


  • AE ∙ PR ∙ Unity ∙ C#

I chose a guitar tutorial as an example to simulate what a 3D interface might look like in Mixed Reality (MR) on AR glasses. The goal is to make learning the guitar easier for the user.


Idea for AR

Exploring the possibilities of this technology

Design question for Mixed Reality

How can I design an interface in MR that makes good use of spatiality and interaction?


In my project I want to explore what roles 3D spatiality and motion design could play in MR, and to what extent they can help users absorb information better and more intuitively.

Implementation and approach

Using a guitar course as an example, I explore the use of 3D mixed-reality user interfaces, combining spatial design, motion design, and UI/UX.


A key difference from conventional tutorials is that in MR the explanations appear spatially anchored to the viewer's own body.


Through this intuitive, simplified interface, users train their physical coordination and learn directly through their own actions.

Target group

  • Self-taught guitar learners
  • Beginners

Target features

- 3D menu

- Left hand: chord coaching

- Right hand: note reading + an interface for the strings

3D Menu for the left hand

I designed this 3D menu for the left hand without any buttons, so that the user can control it purely through gestures.


Chord coaching for the left hand

This is the instruction view for chord coaching on the left hand. The user receives the information directly at their fingers and can see whether each finger is in the right position.


3D Menu for the right hand

This 3D menu is for the right hand, letting the user choose which exercise to start with.


Chord exercise for the right hand

With the chord-exercise feature I created a more intuitive way of reading notes and built an interface for the strings, so that the user can see whether they are playing the right string.

Courses for AR

Learning C# and Unity on LinkedIn Learning

To make my idea possible, I also took LinkedIn Learning courses to gain insight into Unity, C#, and user interface and experience design.


AR Plane Detection

Coding task: create an AR plane, enable plane detection, and place a virtual object on the detected plane wherever it is tapped.
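A tap-to-place script for this task could look roughly like the following sketch, using Unity's AR Foundation raycasting (the class name and the `objectPrefab` field are placeholders, not the project's actual code):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach to the AR Session Origin, which also carries the ARRaycastManager.
public class TapToPlace : MonoBehaviour
{
    public GameObject objectPrefab;          // the virtual object to place
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the screen touch point against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;     // closest hit comes first
            Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```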

AR Cube control

Coding task: hide the AR plane and let the user drag the virtual cube with a finger.
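One way to sketch this in AR Foundation: keep the planes tracked but invisible, and move the cube to wherever the finger's raycast hits them (class and field names are my placeholders):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hides detected planes and lets the user drag the placed cube with one finger.
public class DragCube : MonoBehaviour
{
    public ARPlaneManager planeManager;
    public Transform cube;                   // the cube already placed in the scene
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    public void HidePlanes()
    {
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(false);   // keep tracking, hide the visuals
        planeManager.enabled = false;            // stop detecting new planes
    }

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);

        // While the finger moves, raycast against the (invisible) planes
        // and move the cube to the hit point.
        if (touch.phase == TouchPhase.Moved &&
            raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            cube.position = hits[0].pose.position;
        }
    }
}
```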

AR Planets

Coding task: make the planets rotate.
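The rotation itself is a one-liner in Unity; a minimal, frame-rate-independent version (the class name and speed value are placeholders) could be:

```csharp
using UnityEngine;

// Attach to each planet; spins it around its own axis at a constant speed.
public class PlanetRotation : MonoBehaviour
{
    public float degreesPerSecond = 20f;

    void Update()
    {
        // Scaling by Time.deltaTime keeps the speed independent of frame rate.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}
```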

3D Menu

Coding task: create a user interface for refreshing the scene and hiding the UI, and allow the user to place the selected food item on the target.
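These three actions could be wired to UI buttons along these lines (a sketch, assuming a single scene and inspector-assigned references; the names are mine, not the course's):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hooked up to the menu buttons: refresh the scene, toggle the UI,
// and place the currently selected food on the target.
public class MenuActions : MonoBehaviour
{
    public GameObject uiRoot;        // parent object of all menu elements
    public Transform target;         // where the food should be placed
    public GameObject selectedFood;  // prefab chosen in the menu

    public void RefreshScene() =>
        SceneManager.LoadScene(SceneManager.GetActiveScene().name);

    public void ToggleUI() =>
        uiRoot.SetActive(!uiRoot.activeSelf);

    public void PlaceFood() =>
        Instantiate(selectedFood, target.position, target.rotation);
}
```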

AR Car – 01

Coding task: a light projector, creating a light that is projected only onto certain objects.
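In Unity this kind of selective lighting is typically done with the light's culling mask; a minimal sketch, assuming the target objects sit on a layer named "Car":

```csharp
using UnityEngine;

// Restricts a Light so it only illuminates objects on a chosen layer.
public class SelectiveLight : MonoBehaviour
{
    void Start()
    {
        Light projector = GetComponent<Light>();
        // Only objects on the "Car" layer receive this light.
        projector.cullingMask = 1 << LayerMask.NameToLayer("Car");
    }
}
```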

AR Car – 02

Coding task: implement ambient light estimation and a warning text that is shown to the user in low-light conditions.
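A sketch of how this could work with AR Foundation's light estimation (the threshold value and field names are my assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARFoundation;

// Subscribes to AR camera frames and shows a warning text
// when the estimated ambient brightness drops below a threshold.
public class LowLightWarning : MonoBehaviour
{
    public ARCameraManager cameraManager;
    public Text warningText;                 // e.g. "Too dark – find more light"
    public float brightnessThreshold = 0.3f; // tune per device

    void OnEnable()  => cameraManager.frameReceived += OnFrame;
    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        // averageBrightness is nullable: not every frame carries an estimate.
        if (args.lightEstimation.averageBrightness.HasValue)
        {
            float brightness = args.lightEstimation.averageBrightness.Value;
            warningText.enabled = brightness < brightnessThreshold;
        }
    }
}
```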

AR Car – 03

Coding task: create a 3D UI that lets the user change the car's color and rotate and drag the car with their fingers.
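The color change and the one-finger rotation could be sketched like this (a minimal version with placeholder names; the UI buttons would call `SetColor` with their color):

```csharp
using UnityEngine;

// Changes the car's color from the 3D UI and lets the user
// rotate the car with a one-finger horizontal swipe.
public class CarControls : MonoBehaviour
{
    public Renderer carRenderer;
    public float rotationSpeed = 0.2f;  // degrees per pixel of swipe

    // Called by the 3D UI color buttons.
    public void SetColor(Color color) =>
        carRenderer.material.color = color;

    void Update()
    {
        if (Input.touchCount != 1) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase == TouchPhase.Moved)
        {
            // Swiping right spins the car the opposite way, like pushing it.
            transform.Rotate(Vector3.up, -touch.deltaPosition.x * rotationSpeed);
        }
    }
}
```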