Headmaster Prototype at The Boston VR Bender


(Image courtesy of Owlchemy Labs)

It might seem a little odd to be posting about this so long after the event, but I want to catch you up on what I’ve been doing, and this is where Headmaster got its start. Consider it bridging the gap…

So, one of the first things I did after I quit my studio job was to truck on over to Boston for the Boston VR Bender, a VR game jam put on by Alex Schwartz and Devin Reimer of Owlchemy Labs in June 2014. I was really interested in getting to know indies in my area, as well as diving headfirst into VR. They invited some folks from Unity and Valve to come, and the Valve guys brought their prototype desktop VR hardware for folks to try prototyping projects on. Not only was this hardware running at a really high frame rate of 95fps, but it was also much higher resolution than the then-available Oculus DK1. Most importantly, though, it had positional tracking via a camera. It was the closest thing to an Oculus DK2 that existed at the time, and so was a great jump start for anyone who wanted to try out that technology. The challenge of the jam was to make a game that took advantage of the positional tracking.

I had an idea immediately. I play soccer. I also like building simple games with easy to learn/hard to master mechanics. Heading soccer balls was a perfect thing to try. I didn’t know if it would be fun but that’s what game jams are for.

So I worked for a couple of hours and figured I had something worth testing. I popped a build out of Unity, put it on a USB key, and brought it to the room where the Valve setup was.

I ran the build and put the headset on. At that point I hadn’t tried their demos, so all I knew was what I saw of my demo on a DK1 versus what I was seeing on their hardware, and WHHHOOOOA. Even with my fairly ugly two-hour prototype of a flat green field and a soccer goal, it was obvious that this hardware was in another league. In hindsight, I’d say it was like a DK2 + 50%. Definitely better than Oculus’s soon-to-be-released devkit.

And you know what? The header prototype, the first time I ran it, was fun. It is not hard to make a ball launch at your head in VR, and just the act of seeing it coming and then moving my head to hit it was really, really novel. I think seeing me flailing my head around got some of the other guys interested in trying it, so Chet Faliszek and Aaron Leiby of Valve, Pete Moss of Unity, and Alex and Devin gave it a shot. I spent the rest of the jam adding a simple scoring system to give us something to compete over, and we kept trying to beat each other’s scores.

I had to cut things a bit short since I was living on borrowed time away from my wife and five-week-old daughter, but I had a mechanic with a lot of promise. Pete even mentioned it in his blog post about the event for Unity. I went home and put it on the back burner since I didn’t have a DK2 and had no real way of playing it without positional tracking. I figured I’d pick it up again once I got my hands on some hardware.

The Story of Independent Ben

Hoooo boy. Here we go. Gotta catch this baby up. This is my first post about what I’m doing in a LONG time. I haven’t blogged since I wrote Detonator in 2009. Let me start from about there.

After writing Detonator and then spending a year or so of game contracting with Infrared 5 and others using Unity, I went back to work for Vicarious Visions (VV) in 2010. This ended up being a fantastic decision. I’ve written about my time at VV in my About page, but the long and short of it is that I worked on some awesome projects in the Skylanders franchise. It was great – great people, challenging work in a big time environment – an awesome experience. After a while though, it became evident that VV wasn’t gonna be my final stop.

I have spent the last 10 years working to learn everything I’d need to make my own games. It’s a lot of stuff! While my work at studios was satisfying in many ways, being behind the safe curtain of a large company insulates you from so much as a gamemaker. I wanted to experience all the fears and freedoms of building my own product. I’ve never organically followed my own prototype through to a polished experience. I’ve never talked directly to my customers (in games at least) and cultivated a fan base. I’ve rarely been able to talk to other developers about what I’m currently working on. All of these things grew increasingly important to me as my career went on. Grass is greener, maybe? Who knows! But… I wanted to make my own games.

So that’s nice that I want to do that, right? Well sometimes you need a reason to break out of routine and start getting serious about your life goals. For me, this started when I came down with a nasty case of Lyme Disease in 2012. This led to some dark days and a lot of reflection about what is important. Getting better took over a year. I still deal with complications. VV was immensely supportive. I came out of it feeling that life is short and you can’t put off your dreams forever. Get busy livin…

Then my wife Kristin got pregnant in late 2013. (I hate it when people say “we” got pregnant. She got pregnant. Nothing happened to me.) After spending much of my son’s infant years with my nose down in work at the office, I started thinking about how I didn’t want to repeat that with the new baby. So family was also a major factor in my decision.

We had our baby girl at home in April. It was fantastic and went very smoothly. If you need birth tub tips, I’m your guy. Well, except for the time I drained it onto the bathroom floor and through the kitchen ceiling. Learn from my mistakes!

Shortly after the birth, once I came up for air, I put in my notice and started down this new path.

Ironically, I started with a sabbatical focused on family and warmer weather. I knew that I needed some time to decompress from years of deadlines and figure out what I wanted to make. I stole away a bit of time, did a good 9 hours on Ludum Dare 29, and made a terrible VR game called VR Mowing Hero 2014. I went to half of the Boston VR Bender and started what would become my first project, Headmaster. I helped friends with their projects and worked as much as I could, maybe quarter-time. For the majority of the spring and summer, though, I was family Ben. We had a real summer vacation with real relaxation – as much as possible with a baby. I learned to sail my Sunfish (poorly). We hadn’t had a real summer… ever. It was awesome.


Sometimes we even used the sail.

With the family in balance and the brain clear, I got into a project in the fall. Now that I’m getting a bit of traction on that project I’m looking forward to connecting with people by writing on here more often. Follow me on Twitter to get updates of new posts or just drop a comment!

Oculus Rift DK2 OSX + Unity3D Setup


(Edited 10/28/14 after Oculus 0.4.3 update)

Here’s my setup that works in the Unity editor on OSX. The big caveat is that it will only do 60Hz when mirrored, so there will be judder on the Rift. I have only been able to get 75Hz in a standalone build on the Mac, and even that has been sporadic.


  • DK2 running the 0.4.2 (or 0.4.3) runtime
  • MacBook Pro 2013 Retina with OSX Mavericks
  • Unity 4.5.5
  • External Thunderbolt display


  1. Make all 3 screens active. Don’t close the lid on the MacBook.
  2. Open the Displays preferences
  3. Go to Arrangement
  4. Set the DK2 to 90° rotation so that it’s actually right side up and in landscape 1080p.
  5. OPTION-DRAG the MacBook display onto the DK2 display. This selectively mirrors those two displays while leaving the other display alone. Set the displays to optimize for the Rift DK2. After this, the MacBook and DK2 should both be rendering at 1080p landscape.
  6. Move the mirrored DK2/MacBook display stack to the LEFT of the external monitor. It appears that the DK2 only likes to render when it is the left-most display.
  7. Make the DK2/MacBook display stack the primary monitor in the Displays/Arrangement panel by dragging the white bar from the external display to the mirrored ones. This may unmirror the displays; if it does, just remirror them with OPTION-DRAG.
  8. Open the Unity editor and place the game window on the mirrored displays. Make it fill the screen.
  9. Set the game window to FREE ASPECT. For some reason, 16:9 caused it to go black for me.
  10. Hit Play.

Now, under 0.4.3, the Rift render in the Unity game view always shows up. This makes iterating on my game MUCH more efficient than before, when the view would go completely black at random times. However, there are lots of other issues with 0.4.3, so your mileage may vary.

Oculus 0.4.3 Unity Camera Orientation Changes

Well, I was brave and figured I’d grab the latest Oculus 0.4.3 runtime + Unity integration and see if it broke anything too badly. For me, the biggest issue was that they deprecated the OVRCameraController and two methods it had: OVRCameraController.SetOrientationOffset(Quaternion q) and OVRCameraController.GetOrientationOffset(Quaternion q). I used GetOrientationOffset to interpret user gestures like nods and shakes (more on that later). This one was pretty straightforward to figure out. I started digging around in the new codebase, which now uses a class called OVRCameraRig instead of the old OVRCameraController. Interestingly, though, this didn’t have what I was looking for.

Instead, the best replacement I found for getting a single Quaternion of the user’s head orientation was by doing this:

OVRPose ovp = OVRManager.display.GetHeadPose();
Quaternion quat = ovp.orientation;

OVRManager has some static members, one of them being a handle to OVRDisplay. OVRDisplay has a handy GetHeadPose that returns a wrapper OVRPose. Inside there, you can get position and orientation information.
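As an aside, since I mentioned interpreting nods and shakes: here’s roughly how a gesture reader can sit on top of GetHeadPose. This is just a sketch – NodDetector and all of its thresholds are my own invention, not part of the Oculus integration:

```csharp
using UnityEngine;

// Hypothetical nod detector: watch the signed pitch of the head and count
// quick direction reversals inside a short time window.
public class NodDetector : MonoBehaviour
{
    private float _prevPitch;
    private float _prevDelta;
    private int _reversals;
    private float _windowTimer;

    void Update()
    {
        OVRPose pose = OVRManager.display.GetHeadPose();

        // Map eulerAngles.x (0..360) to a signed pitch in degrees.
        float pitch = Mathf.DeltaAngle(0f, pose.orientation.eulerAngles.x);
        float delta = pitch - _prevPitch;

        // A fast pitch movement that changes direction counts as a reversal.
        if (Mathf.Abs(delta) > 0.5f && Mathf.Sign(delta) != Mathf.Sign(_prevDelta))
        {
            _reversals++;
            _windowTimer = 0.6f; // made-up window length
            _prevDelta = delta;
        }

        _windowTimer -= Time.deltaTime;
        if (_windowTimer <= 0f)
            _reversals = 0;

        if (_reversals >= 3) // made-up threshold: a few up/downs = a nod
        {
            Debug.Log("Nod detected");
            _reversals = 0;
        }

        _prevPitch = pitch;
    }
}
```

A shake detector is the same idea applied to yaw instead of pitch.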

Ok so that’s how to get the orientation now. But what about setting it? You used to be able to set the offset, which would let you take some control over the camera during gameplay. Obviously you don’t want to go crazy with this and make players sick, but I find that a little bit of camera shake in certain situations can help sell impact.

Well, the Unity integration folks at Oculus have made things much simpler. Now you can manipulate the OVRCameraRig GameObject and it just works. Previously I went through some gyrations to set that offset rotation, but now a good camera shake is as easy as

iTween.PunchRotation(_ovrCameraRig.gameObject, headPunchVector, .5f);

And if you want to get fancy you can find a proper vector before doing so.
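For instance (a sketch – the cross-product choice and the 10-degree magnitude are just my placeholders), you can derive the punch vector from the direction the ball came in:

```csharp
using UnityEngine;

// Sketch: punch the camera around an axis perpendicular to the impact,
// so the shake reads as "knocked sideways by the hit."
public class HeadImpactShake : MonoBehaviour
{
    public GameObject ovrCameraRigObject; // assign the OVRCameraRig here

    public void ShakeFromImpact(Vector3 impactDirection)
    {
        Vector3 headPunchVector =
            Vector3.Cross(impactDirection.normalized, Vector3.up) * 10f;
        iTween.PunchRotation(ovrCameraRigObject, headPunchVector, .5f);
    }
}
```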

Do you do camera shake differently? Am I doing stupid stuff? Drop me a comment!


Detonator Update for Unity 4.x

Detonator has been dormant for a couple of years, but I’m starting to resurrect it. The first step was to resolve some build and import problems that have cropped up as Unity itself has been updated underneath Detonator’s really old codebase. Version 1.2 is now live in the Asset Store. Unfortunately, it’s set to require a minimum Unity version of 4.5.3, which is my bad. I’m resubmitting today, but you can just grab the proper version from here: Detonator 1.22 on my Dropbox.

Asset Store Link   |    Unity Forum Link

  • All of the supporting files have been converted to C#.
  • Fixed an issue with a few textures getting interpreted incorrectly as normal maps on import.
  • Tuned up prefabs to look better.
  • New skybox in the test scene.

Demo Scene (Unity Web Player):

Detonator for Unity 3.x and 2.x

(Update 10/26/2010 – A version of Detonator for Unity 3.0 is now available.)

Detonator is an extension for Unity3D that lets you make good looking explosions quickly and easily for your Unity projects. How you use it depends on who you are and what your goals are. Solo coders can quickly get prototype explosions going while artists can stack effects to quickly make complex explosions.


Detonator was created originally for the Unity Summer of Code 2009.

Download Detonator 1.3b for Unity 3.0

Download Detonator 1.02 for Unity 2.6

Bug Reports

Sorry, the Unity 3.x and earlier versions of Detonator are no longer supported. Please grab the latest version from here. That said, feel free to drop notes in the comments, as I or other users may be able to help.

Detonator release today?


(Update: Apparently the Unity guys had some issues updating their site. Stuff should be up 9/17)

Well, as of this writing it’s not out just yet, but it will be very shortly. I’m going to bed in a few minutes, the boys in Copenhagen will be getting up, and Detonator, if not all of the SOC projects, will be released tomorrow. I wanted to make a quick post for anyone who ends up here looking for more info, wondering where development is headed, or finding bugs.

I plan on continuing development of the project and will be posting the code and the current dev build to some public source-sharing service like Assembla.com or Google Code in the next couple of days. I’m SURE that people will find bugs, and I’d like to stay on top of them as best I can. Really, there was minimal testing done on this project, so if something acts weird, it probably is a bug.

In the meantime, if you find a bug and don’t know what to do with it, drop it in the comments here. Thanks!

Detonator – Progress Point 5


(This post is mirrored on the Unity Technologies Blog as well.)

We’re just 9 days away from the August 31 deadline, so it’s time for an update. The good news is that it’s almost done, and that means a ton of changes since the first concept Unity players that I posted on my blog (see Point3, Point2, and Point1). The main effort has gone toward making Detonator entirely code-driven. This involved creating a new particle component (DetonatorBurstEmitter) that calls some of the scriptable functions on the standard Unity particle system, but makes one-shot emissions and other sorts of scaling effects easier to create.

So, what about actually using it? The simplest use case is to take a GameObject, attach a Detonator component to it (in code or in the Inspector), and either call Explode() in code or check the “Explode on Start” checkbox in the Inspector. That will do a whole bunch of stuff… create all kinds of emitters, a light, a force, and all corresponding default materials, and then BOOM, you’re exploding. That use case was a primary design goal, and it’s been met.
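In code, that simplest case looks something like this (a sketch – the ExplodeOnCollision wrapper is mine; only the Detonator component and Explode() come from the description above):

```csharp
using UnityEngine;

// Minimal usage sketch: attach a Detonator at runtime and fire it.
public class ExplodeOnCollision : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        Detonator detonator = gameObject.AddComponent<Detonator>();
        detonator.Explode(); // defaults handle emitters, light, force, materials
    }
}
```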


If you want to take it one step further, you can tweak parameters on the Detonator component. The default explosion has a 10m radius, and that can be changed to whatever you’d like – all effects scale accordingly. As anyone who works with particle systems knows, this is not trivial, because it needs to change particle size, emitter radius, velocity, emitter position, and forces all in unison.
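My guess is that this unified scaling boils down to something like the following (hypothetical member names, not Detonator’s actual internals):

```csharp
// Hypothetical sketch: every sub-effect stores values tuned for the default
// 10m radius, and a single factor scales them all in unison.
void ApplyExplosionSize(float size)
{
    float scale = size / 10f; // 10m is the default radius

    fireball.particleSize  = fireball.baseParticleSize * scale;
    fireball.emitterRadius = fireball.baseEmitterRadius * scale;
    fireball.velocity      = fireball.baseVelocity * scale;
    fireball.localPosition = fireball.baseLocalPosition * scale;
    physicsForce.power     = physicsForce.basePower * scale;
}
```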

Performance scalability is also a concern with effects because, well, they can scale. For that there’s the detail parameter, which affects the number of particles spawned, and even whether certain sub-components get created at all. Each piece has a detailThreshold parameter that lets you customize how your Detonator explosion scales to different performance specs. I’ll be looking into how this will hook into the global Unity quality settings as well – no promises for release, but I’ll get it in shortly after if it doesn’t make it then.

After detail there’s color. Changing the color of the main Detonator component has a differing effect depending on its alpha value. Since using alpha purely for transparency didn’t make a lot of sense in this context, it instead serves as the color influence. So if you make your color blue with 50% alpha, the colors of all sub-components will be blended 50% toward that blue. Since the normal fireball is orange and other parts are white, this gives a nice non-uniform coloration. By the same token, if you’d like a stylized look, crank the alpha to 100%.
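That alpha-as-influence behavior presumably works out to math like this (my guess at the equivalent, not Detonator’s actual code):

```csharp
using UnityEngine;

// Sketch: lerp each sub-component's own color toward the main Detonator
// color by the main color's alpha, keeping the sub-component's alpha.
Color BlendDetonatorColor(Color subComponentColor, Color mainColor)
{
    Color blended = Color.Lerp(subComponentColor, mainColor, mainColor.a);
    blended.a = subComponentColor.a;
    return blended;
}
```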

Duration can also be adjusted. This was tricky because just altering this naively made the explosions really dim, so the alpha values of all the emitters’ color animations try to stay more opaque when the duration is shorter. Everything is tuned to make changing parameters make sense. Of course, we’ll learn a ton more when people are using this en masse, but I’ve given it my best shot to start with.

So that is the main Detonator component. Many people will just use that, but underneath is a full-fledged explosion construction kit. For instance, DetonatorFireball is one of the sub-components that a Detonator normally auto-creates. Instead, you can make your own by dragging a DetonatorFireball script onto that same GameObject. You can add one, two, or ten of these and then get busy changing their relative positions, sizes, and colors. You can even time when they go off (with randomness) to create startling layered effects. Then add some sparks, smoke, a glow, a light, or whatever you want. In just a few minutes I was able to make a pretty nice mushroom cloud. I can’t wait to see what people do with this.


And for the artists out there like myself, you can switch out the materials and textures that your Detonator components use. Either replace them at the top level and let them cascade down to subcomponents, or replace them piece by piece, it’s up to you. I’d really like to see what is possible with stylized or toon explosions with this system.

So what still needs to be done? I need to reimplement a few components that were in the concept effect… namely the chunk emitter (which sprays any GameObject you’d like, with trailing smoke) and the physics force (which acts on rigidbodies and even sets them on fire if you want). The UI of the main Detonator could use some spicing up, but that would mean reimplementing material slots through the drag-and-drop API, which might not be worth it at this stage. The main thing the UI would add is buttons to create each subcomponent, so one wouldn’t need to manually drag scripts onto it. It feels like that would add a nice level of polish, so I’ll have a look.