
Broken Place

XR Music Playscapes


Using OscCore to build an AR Music Controller

antpb · Feb 25, 2020

I woke up last Saturday and saw the tweet below from Stella Cannefax announcing the 1.0 release of her OSC library (compatible with Windows, macOS, Android, and iOS). I immediately needed to try and build something! To my delight, it took very little of my Saturday morning to integrate OSC components into the AR music interfaces of Broken Place AR.

open sound control core, 1.0

a new, polished version of my OSC library for Unity (2018.4+), with a stability guarantee.

* basic send/receive handling with no code
* simple API
* supports all OSC types
* kind to the garbage collector
* tiny https://t.co/Xq6fPkuqxF

— 𝕤𝕥𝕖𝕝𝕝𝕫 (@0xNerevar) February 22, 2020

OSC and why it matters

Open Sound Control is a very popular messaging protocol for broadcasting MIDI-like control information at higher resolution than MIDI allows. It is typically used in music production and performance, visual experiences, and much more. For this morning project, I focused on making an AR controller for the Renoise music software. Renoise natively supports Open Sound Control and exposes many of a project's control parameters through various routes. Below is a screenshot of the message options available in Renoise. You may notice that the available messages are very simple routes; for instance, /renoise/song/bpm sets the song's BPM to whatever number is sent along with the message.

In the /renoise/song/bpm example, the route is listening for a particular value type, often a boolean or a number. Lucky for us, those kinds of values are very easy to find on Unity components and translate using the handy OscCore components. 😈
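To make that concrete, here is a minimal sketch of sending a number to that route from a script, based on my reading of the OscClient API in the OscCore README; the IP address and port are placeholders for wherever Renoise is listening, so treat them as assumptions about your setup.

    using OscCore;
    using UnityEngine;

    public class BpmSender : MonoBehaviour
    {
        OscClient m_Client;

        void Start()
        {
            // Placeholder address/port for the machine running Renoise.
            m_Client = new OscClient("192.168.1.10", 8000);
        }

        public void SetBpm(float bpm)
        {
            // /renoise/song/bpm expects a single number argument.
            m_Client.Send("/renoise/song/bpm", bpm);
        }
    }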

Getting started with OscCore in Unity

OscCore highlights that it is tiny, and I think the greatest testament to that is how fast it was to configure. The system requires two things: a network object that handles sending messages, and objects that define the messages, or routes, to send values to. The latter is what maps your game object values to a desired OSC message. Below is a screenshot of the primary “network” component assigned to its own root-level game object in a standard AR Foundation scene. (Note: I’m using the iOS build target.)

The OSC Sender component allows you to set the IP address and port that messages will be broadcast to. I plan to expose this option via player prefs in a later version of this project so that users can define the IP address of their target computer. It’s important to note that if you use this in a public setting, you should create your own private network to connect your app and the host computer. Once both devices are on the same safe network, you can find your target computer’s IP address in your system network settings and configure the component accordingly. This is where having a player pref option to set the IP will come in handy. ;D
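As a rough sketch of that player-prefs idea, something like the following could store the target address and hand it to OscCore's script API on startup. The key names, default values, and the use of OscClient rather than the inspector component are my assumptions, not how Broken Place AR actually does it.

    using OscCore;
    using UnityEngine;

    public class OscTargetSettings : MonoBehaviour
    {
        // Hypothetical player pref keys for the target computer.
        const string IpKey = "osc_target_ip";
        const string PortKey = "osc_target_port";

        public OscClient Client { get; private set; }

        void Awake()
        {
            var ip = PlayerPrefs.GetString(IpKey, "192.168.1.10");
            var port = PlayerPrefs.GetInt(PortKey, 8000);
            Client = new OscClient(ip, port);
        }

        // Call this from a settings screen, then rebuild the client.
        public void SaveTarget(string ip, int port)
        {
            PlayerPrefs.SetString(IpKey, ip);
            PlayerPrefs.SetInt(PortKey, port);
            PlayerPrefs.Save();
        }
    }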

Sending messages with OscCore

Messages can be sent to a defined route using the Property Output component. This component maps a defined value on a component, in my case a VJUI knob, to a given route. I recommend placing these components on the same object they reference. In my experimentation, I had a top-level game object referencing multiple knobs, and because of how my AR Foundation scene instantiated the prefab, it lost the references. So yeah, attach the component to the thing it is watching if that makes sense for your setup.

Below is a close-up screenshot of the Property Output component. You’ll notice that it references the OSC Sender component, which will be sending the defined messages. In the example of my knob, the Property Output component attaches the /renoise/song/instrument/-1/macro3 route to the knob’s value parameter. As soon as the value parameter changes, messages are sent to the target app. Below are also some video tweets that show off my working proof of concept. Yay!

The latency is super not bad?! Uh, I think I have an AR performance controller for Renoise 😳 pic.twitter.com/bczlRDtokU

— antpb (@antpb) February 23, 2020
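For anyone who prefers wiring that mapping up in a script rather than with the Property Output component, a rough equivalent might look like the sketch below. A standard UI Slider's onValueChanged event stands in for the VJUI knob here, and the client setup and address are the same assumptions as in the earlier sketches.

    using OscCore;
    using UnityEngine;
    using UnityEngine.UI;

    public class KnobToOsc : MonoBehaviour
    {
        [SerializeField] Slider m_Knob;   // stand-in for the VJUI knob
        [SerializeField] string m_Address = "/renoise/song/instrument/-1/macro3";

        OscClient m_Client;

        void Start()
        {
            m_Client = new OscClient("192.168.1.10", 8000);
            m_Knob.onValueChanged.AddListener(SendValue);
        }

        void SendValue(float value)
        {
            // Every change to the knob broadcasts the new value to the mapped route.
            m_Client.Send(m_Address, value);
        }
    }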

Where to take this project?

Well shoot, I have a new fun project to chip away at! Something I want to try next is targeting two separate devices from a single AR app. Imagine that you have a computer hooked to a projector glitching a feed of the AR app’s perspective and another computer on a stage performing music. With a single controller, the performer can be in both places at the same time and maybe even unify the sound controls with visuals by attaching multiple Property Output components to a single UI component. I have so many ideas of things to build with OscCore. You should check it out and contribute back if you find improvements! I highly recommend giving the project’s README a glance before getting started. It gets deep into the code and makes my mind race with ideas. 😀 Thanks Stella!

Anthony Burchell

Founder, Broken Place


Interacting with Augmented Reality Music Interfaces

antpb · Jun 3, 2019

Broken Place AR is making leaps of progress! I have officially uploaded the first 0.1 beta to TestFlight and am very close to opening the beta to the public. Of all the progress worth highlighting, the interaction system redesign was particularly interesting. I never thought I’d have to reinvent the knob in the metaverse, but here we are!

Challenges in interacting with UI in Augmented Reality

Imagine walking through your everyday life via AR. How would you reach for items without being able to pass through the screen? I initially thought some type of hand recognition using computer vision would be the way to go, but I learned it is not remotely close to being accessible on iOS. The computer vision approach on mobile AR also assumes way too much about players and leaves out folks who do not have use of more than one arm. Using touch input all over the screen likewise assumes two hands, so I scrapped that approach quickly.

I have strived to build systems in Broken Place with the intention of allowing the game to be playable using only one hand. Broken Place VR has mostly done a good job of that. Broken Place AR, however, was severely lacking in that pursuit. In Broken Place AR’s previous interaction system, a user was expected to keep the phone aligned to the object they wanted to interact with while touching the desired UI element on the screen with their other hand. It didn’t really feel like an AR experience; it felt more like trying to keep UI pages aligned to fill my viewport. Hands are also insanely obstructive to the experience, making it hard to see where you were touching.

I looked for inspiration. The average camera app is a great example of a way to trigger an action without obstructing the scene, and it can generally be used with only one hand. I decided to take some cues from that approach in the new system.

Click and Rotate?

Image of knob UI element being clicked and rotated

Now, a player can use the familiar action of holding a toggle at the right (or left) center of the screen, much like taking a photo. As long as you are holding the toggle, you stay attached to the UI element you are interacting with. I determined that the center of the gaze was the best place to send the raycast to the UI from, but it proved a bit confusing that the toggle was associated with the center reticle. I added a thin line that renders from the toggle to the center to give a clearer indication that the two are related. I’m giving myself bonus points for making the line animate to the music. 😉

The other unintended benefit of this approach is that a user can now attach to a UI element from much farther away. Having a clear center reticle makes selecting your interaction point much more precise.

Almost every music interaction in Broken Place AR can be accomplished with a knob-like z rotation. Sliders were a bit difficult to translate the z rotation into a value for because of how thin they were. I added pseudo knobs at the bottom of the sliders to give a wider hit point for attaching to the element. Visually, it just kinda makes more sense.

Explaining the Unity side in a tweet-length description:

Each UI item has a box collider. When the main toggle is pressed, a raycast is sent from the center reticle and hits the collider. The name of the collider is stored while the toggle is held, and the object’s child knob’s (or slider’s) value increases or decreases based on the z rotation of the camera.
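And a rough sketch of that logic in code, with class and field names that are purely illustrative rather than the actual Broken Place implementation:

    using UnityEngine;

    public class ReticleKnobController : MonoBehaviour
    {
        [SerializeField] Camera m_ArCamera;
        [SerializeField] float m_Sensitivity = 1f;

        Transform m_HeldKnob;      // collider grabbed while the toggle is held
        float m_LastCameraRoll;

        // Hook these up to the on-screen toggle's press/release events.
        public void OnToggleDown()
        {
            var ray = m_ArCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
            if (Physics.Raycast(ray, out var hit))
            {
                m_HeldKnob = hit.transform;
                m_LastCameraRoll = m_ArCamera.transform.eulerAngles.z;
            }
        }

        public void OnToggleUp() => m_HeldKnob = null;

        void Update()
        {
            if (m_HeldKnob == null) return;

            // The change in the camera's z rotation (roll) drives the knob.
            float roll = m_ArCamera.transform.eulerAngles.z;
            float delta = Mathf.DeltaAngle(m_LastCameraRoll, roll);
            m_LastCameraRoll = roll;

            // Here the knob visual is rotated directly; in practice you would
            // also feed delta * m_Sensitivity into the knob's value.
            m_HeldKnob.Rotate(0f, 0f, delta * m_Sensitivity);
        }
    }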

Lefty FTW!

I’m left-handed. No way I’m going to ignore my lefties! I added a setting to flip the landscape orientation and align the toggle to the opposite side.

I’m left handed, so you best believe I put a setting for that. 🤗 pic.twitter.com/TsdFdGdp1t

— antpb (@antpb) May 30, 2019
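A minimal version of that setting could look something like the sketch below; the field and method names are hypothetical, and which landscape orientation counts as “flipped” depends on how the toggle is anchored in your layout.

    using UnityEngine;

    public class HandednessSetting : MonoBehaviour
    {
        [SerializeField] RectTransform m_Toggle;   // the press-and-hold toggle button

        public void SetLeftHanded(bool leftHanded)
        {
            // Flip the landscape orientation of the device...
            Screen.orientation = leftHanded
                ? ScreenOrientation.LandscapeRight
                : ScreenOrientation.LandscapeLeft;

            // ...and mirror the toggle to the opposite edge of the screen.
            var anchor = leftHanded ? new Vector2(0f, 0.5f) : new Vector2(1f, 0.5f);
            m_Toggle.anchorMin = anchor;
            m_Toggle.anchorMax = anchor;
            m_Toggle.pivot = anchor;
        }
    }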

Beta Signup

I am very close to a public-ready beta. If you would like to be notified when the beta is ready, please sign up for the newsletter at http://broken.place

