Hi,
I'm Will.

I have been developing enterprise applications in the XR space for six years, in industries including automotive, retail, education, and industrial manufacturing. I've worked on user interaction systems, graphics, multiuser networking, anchoring, video, spatial mapping, and other relevant systems, with a special focus on performance in all contexts. I've done groundbreaking work creating the first ray/path tracing application with AR and VR support, collaborated on VR particle physics education software with the University of Washington Physics department, and written for industry publications like Blur Busters.

I'm a Senior Unity Developer at Project Archer, where I'm developing user interaction systems and application logic and optimizing performance for a retail-focused, headset-based AR contextual-assistant utility. The project is being built with massive scale in mind.
 
Before that, I was a Senior Software Developer at Scope AR. There I worked primarily on the remote assistance and training client application (Worklink) for ARCore, ARKit, and HoloLens 1 & 2. I worked with image and model recognition systems, 3D model data pipelines, and video transmission with WebRTC. I was responsible for VisionLib model and image tracking integration, as well as AR anchoring, on all AR platforms. I was also actively involved in CI pipeline optimizations.

Lockheed Martin demonstrating Worklink instructions

I previously worked for Volkswagen developing AR and VR applications for product design, testing, and visualization. Projects included processing and displaying very-high-detail car models, multiple 3D tracking systems, configuring massive µLED displays, VR path tracing, hand tracking and alternate input methods, as well as multiuser applications and networking. I've also written a post about the path tracing project I added VR support and other features to as part of my '20%' time.


Personally, I'm a bit of a framerate enthusiast and have worked on projects involving recording and optimizing framerates in excess of 120 Hz, or even 240 Hz+. I take great pride in and enjoyment from optimizing complex applications and making them performant on devices as low-powered as the HoloLens. I strongly believe that performance is generally undervalued in software development, and it is nowhere more consequential than in VR.

I worked for Ong Innovations for a year and a half, developing primarily business applications in Unity3D with C# for the HoloLens and virtual reality devices.

Sean Ong (Ong Innovations), myself, and Jesse McCulloch (Microsoft HoloLens team)
My most recent application there was a mapping project to plan and demonstrate the routes of trucks carrying large objects like wind turbine blades. It was developed primarily for Google Daydream, ARCore/ARKit, and SteamVR.

Route Outline Project



I also worked on a remote assistance application (not shown here) that uses WebRTC for voice, video, and data transmission. It runs on Android and the HoloLens.

More generally, I have done many projects using Photon to network multiple users, both locally and remotely. I also contributed to the MRTK repository by creating a standard universal networking tool that could operate with Photon, WebRTC, or other solutions interchangeably, with no extra work from the developer.

I also worked on a smart city visualization application with points of interest and panels of information. We modified it as a demonstration app and added a data sphere with imported solar energy data from the US DOE. This is a cross-platform Windows Mixed Reality app for the HoloLens and immersive headsets.

Multiuser 3D Model Viewer 
 
I worked on a multiuser 3D model viewer, demonstrated below, developed for the HoloLens. Multiple users could interact with the same environment simultaneously, and sessions were stored in the Azure cloud.

More information is available on this Ong Innovations page.


I also built the street sweeper training app shown below for the HoloLens. I dynamically loaded assets at runtime to optimize memory usage.

More information is available on this Ong Innovations page.



I also worked at Studio 216 on a short contract to help them out with a couple of projects. The following is a simple project I spearheaded there to demonstrate how different cabinet and table materials and designs would look in a particular laboratory setting. I was responsible for writing every bit of new code for this project, including revamping the "dollhouse" miniature model you see in the video nearly from scratch. This particular app was built primarily for WMR but works well, without modification, on the HoloLens.

Virtual Reality Lab Finishings Showcase Project for Studio 216



While at Studio 216, I also developed a traditional photo-realistic 3D desktop experience for placing sculptures in an environment to see how they would look. I was entirely responsible for the controls, which included an Xbox controller, a flight sim joystick, and mouse/keyboard input. I also created all the menus and placement/edit behaviors.

Photo-realistic Flat Xbox Controller Sculpture Placement Project for Studio 216



Both at Studio 216 and at Ong Innovations, I have worked, and continue to work, closely with Microsoft, specifically the Mixed Reality team, to improve SDKs and developer tools and to provide feedback.


I started off with an internship at Anybots Inc. The company made telepresence robots: imagine Skype on wheels, if you will. They looked a bit like a Segway without the person.

Between the Old and the New.
I learned Python on the job there and also worked on a mobile Android application to control the robot, but issues with the back end prevented it from being usable.


Representing Ong Innovations at Build 2018

If you're interested in reaching out about an employment opportunity or anything else, feel free to contact me by any of the methods listed below. Email is the quickest and most reliable way to get in touch with me.

jobs@willse.me

LinkedIn
GitHub