
Programming

I have learned gameplay and tools programming over the course of 12 years, mainly because I wanted to try out my ideas and prototype them. Selling an idea is infinitely easier if I can build a rough prototype.

Vehicle HMI

Automotive HMI

Epic has put an emphasis on vehicle Human Machine Interfaces (HMIs) as an application of Unreal Engine. Here is an example of an HMI I made for an electric vehicle.
Features

- Widget Blueprint scripting

- Custom material shader for the fill bars

- Color blind mode

- Scrollable album list

- Power readout that accounts for regenerative braking (see the sketch below)

- Units that switch between Metric and Imperial

- Red warning vignette around the border of the screen to draw attention when there is a new alert

*The art is not mine, but I built all of the functionality.
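
To illustrate a couple of the bullets above, here is a rough sketch (not the production code) of the kind of helpers behind the unit switch and the regenerative-braking readout; the function names and constants are placeholders.

#include <cmath>

enum class EUnitSystem { Metric, Imperial };

// Convert a speed stored internally in km/h into the unit the driver selected.
float SpeedForDisplay(float SpeedKph, EUnitSystem Units)
{
    return Units == EUnitSystem::Metric ? SpeedKph : SpeedKph * 0.621371f; // km/h -> mph
}

// Normalized fill (0..1) for the power bar. Negative power means the motor is
// regenerating while braking, so the bar fills toward a separate regen limit.
float PowerBarFill(float PowerKw, float MaxDrawKw, float MaxRegenKw)
{
    const float Limit = (PowerKw >= 0.0f) ? MaxDrawKw : MaxRegenKw;
    return std::fmin(std::fabs(PowerKw) / Limit, 1.0f);
}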

Dr. Follybottom and the Stone of Xotl

This is a recreation of one of the first games I made as a student at The Guildhall at SMU. The original was built in Torque, and I wanted to see what I could do in Unity3D. It's just a small slice of the original, but it includes these features:

- 2D Player Controller with "Pushing" mechanic

- Old timey film grain effects

- Wall Destruction by pushing the stone


Oncor Super Safe Kids

Third-person adventure game that teaches kids about electrical safety

This is a contract job due to be completed this summer. It's a game for kids ages 9-12 that teaches them electrical safety. I was the primary programmer on the project and got to build some tools to maximize the efforts of our small team.

Cutscene System

While at BioWare I got to see how they effectively handled a massive number of cutscenes. They used a virtual stage: for each line of dialogue, the appropriate actors would spawn in on the correct marks playing the right animations, all controlled by a data table. Instead of placing actors by hand and animating them for each line, you simply fill out a spreadsheet.


I implemented the same type of system, so each line of dialogue can spawn characters at the appropriate marks within view of a camera at a good angle. I can also override the stage and play a cutscene if I want to incorporate camera movements or animations that require the actors to move beyond the stage.
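
As a rough illustration (not the project's actual code), here is what one row of such a data table could look like in Unreal-style C++; the struct, its fields, and the commented spawn call are placeholders.

#include "Engine/DataTable.h"
#include "GameFramework/Actor.h"
#include "Animation/AnimSequence.h"
#include "StageCutsceneRow.generated.h"   // illustrative header name

USTRUCT(BlueprintType)
struct FStageCutsceneRow : public FTableRowBase
{
    GENERATED_BODY()

    // Which character to spawn for this line of dialogue.
    UPROPERTY(EditAnywhere)
    TSubclassOf<AActor> CharacterClass;

    // Named mark on the virtual stage where the character should stand.
    UPROPERTY(EditAnywhere)
    FName StageMark;

    // Animation to play while the line is delivered.
    UPROPERTY(EditAnywhere)
    UAnimSequence* LineAnimation = nullptr;
};

// Somewhere in a stage director actor, each line of dialogue becomes one lookup:
//   FStageCutsceneRow* Row = CutsceneTable->FindRow<FStageCutsceneRow>(LineId, TEXT("Cutscene"));
//   AActor* Speaker = GetWorld()->SpawnActor<AActor>(Row->CharacterClass, GetMarkTransform(Row->StageMark));
//   (GetMarkTransform is a placeholder that returns the transform of the named mark.)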

Procedural Face Animations

Lacking an animator with time to help on the project, I had our modeler create a rig with a working jaw and build blend shapes for the characters' faces. Using those blend shapes, I procedurally generated talking animations by interpolating between blend shape values to form different mouth-shape phonemes like "O," "La," and "Fe." By randomly choosing phonemes and interpolating between them, you get fairly realistic-looking facial animation.
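
A rough sketch of that interpolation step, assuming Unreal-style morph targets driving the blend shapes; the phoneme names, speed, and function signature are placeholders, not the project's code.

#include "Components/SkeletalMeshComponent.h"
#include "Math/UnrealMathUtility.h"

// Blend shape curves that approximate a few mouth shapes.
static const FName Phonemes[] = { TEXT("Phoneme_O"), TEXT("Phoneme_La"), TEXT("Phoneme_Fe") };

void TickTalking(USkeletalMeshComponent* Face, float& Alpha, int32& Current, int32& Next, float DeltaTime)
{
    Alpha += DeltaTime * 8.0f;            // speed of the mouth movement
    if (Alpha >= 1.0f)                    // reached the target phoneme
    {
        Alpha = 0.0f;
        Current = Next;
        Next = FMath::RandRange(0, 2);    // pick the next phoneme at random
    }

    // Ease the outgoing phoneme down while the incoming one ramps up.
    Face->SetMorphTarget(Phonemes[Current], 1.0f - Alpha);
    Face->SetMorphTarget(Phonemes[Next], Alpha);
}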

"Capture" Gameplay Mechanic

I built a game mechanic where you "capture" items with a camera viewfinder. The dialogue gives you quests to find certain objects. When you find one and enter viewfinder mode, a red glow appears around it, and you can capture it with a mouse click, which updates the quest. The glow was accomplished with a combination of a shader and a post-processing effect.
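
A rough sketch of the capture check on click, assuming an Unreal line trace from the viewfinder camera; QuestTargetActor, the trace range, and the commented quest callback are placeholders.

#include "Engine/World.h"

void TryCapture(UWorld* World, const FVector& CameraLocation, const FVector& CameraForward,
                AActor* QuestTargetActor)
{
    FHitResult Hit;
    const FVector End = CameraLocation + CameraForward * 5000.0f;   // viewfinder range

    if (World->LineTraceSingleByChannel(Hit, CameraLocation, End, ECC_Visibility) &&
        Hit.GetActor() == QuestTargetActor)
    {
        // The glowing quest object is in the viewfinder: advance the quest.
        // OnObjectCaptured(QuestTargetActor);
    }
}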


Light of Mine

VR Horror Game where the statues come alive and chase you.

"No Nausea" Control Scheme

We wanted the game to feel tense, but being able to teleport around like in most VR games broke the gameplay. We wanted the player to walk through the world, yet using the trigger or treating the trackpad like a gamepad stick made people ill, so we had to figure out how to minimize motion sickness. We took away the ability to strafe and used the thumb trackpad as an analog input that lets players control their own acceleration. We also based steering on head orientation, so there is never a disconnect between the desired movement on the controller and the motion of the player.
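
A rough sketch of that control scheme (not the shipped code): the trackpad's vertical axis acts as a throttle and the walk direction always follows the flattened head yaw, so input and motion never disagree. Names and the forward-only clamp are placeholders.

#include "GameFramework/Character.h"
#include "Camera/CameraComponent.h"

void TickLocomotion(ACharacter* Player, UCameraComponent* HmdCamera, float TrackpadY)
{
    // Flatten the head direction onto the ground plane so looking down doesn't push you into the floor.
    FVector Forward = HmdCamera->GetForwardVector();
    Forward.Z = 0.0f;
    Forward.Normalize();

    // Analog throttle: the player ramps their own acceleration, which helped with motion sickness.
    Player->AddMovementInput(Forward, FMath::Clamp(TrackpadY, 0.0f, 1.0f));
}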

Red Light, Green Light

I happened upon a perfect set of nodes that track when an object is out of the player's view frustum. This led me to try to recreate the red light, green light mechanic of those classic Doctor Who villains, the Weeping Angels: they only come to life and move when not observed. In this case the AI is trying to get to the player, but whenever a statue is in the player's view I simply turn off its movement abilities. I also add effects like glowing red eyes that get more intense the longer you let a statue move unobserved. If you are caught, the world goes dark and the horrible visage of the statue looms over you.
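
The original logic lives in Blueprint nodes, but a C++ sketch of the same idea might look like this; the material parameter and the state variables are placeholders.

#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void TickStatue(ACharacter* Statue, UMaterialInstanceDynamic* EyeMaterial,
                float& UnobservedTime, float DeltaTime)
{
    // True if any camera rendered the statue within the last tenth of a second.
    const bool bObserved = Statue->WasRecentlyRendered(0.1f);

    // Red light: freeze while watched. Green light: walk while unobserved.
    Statue->GetCharacterMovement()->SetMovementMode(bObserved ? MOVE_None : MOVE_Walking);

    // Eyes glow brighter the longer the statue has been moving unseen.
    UnobservedTime = bObserved ? 0.0f : UnobservedTime + DeltaTime;
    EyeMaterial->SetScalarParameterValue(TEXT("GlowIntensity"), UnobservedTime);
}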

Realistic Candle

The candle was one of the primary mechanics of the game. By getting different colors of flame you could unlock different parts of the temple. Since you can bring it right up to your face, it needed to look realistic. I found a really nice flame sprite, then added rotational lerping based on the candle's current velocity and orientation so the flame trails its motion. I even put a subsurface scattering material on the candle itself so the light of the flame shines through where the wax is thin.
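
A rough sketch of that rotational lerp (not the shipped code): lean the flame away from the candle's velocity and ease toward that lean each frame. The clamp, scale, and interp speed are placeholders.

#include "Components/SceneComponent.h"
#include "Math/RotationMatrix.h"
#include "Math/UnrealMathUtility.h"

void TickFlame(USceneComponent* FlameSprite, const FVector& CandleVelocity, float DeltaTime)
{
    // Lean the flame opposite the direction of travel, more at higher speeds.
    const FVector Lean = -CandleVelocity.GetClampedToMaxSize(100.0f) * 0.2f;
    const FVector FlameDir = (FVector::UpVector * 100.0f + Lean).GetSafeNormal();

    // Rotational lerp so the flame trails the motion instead of snapping to it.
    const FRotator Target = FRotationMatrix::MakeFromZ(FlameDir).Rotator();
    const FRotator Smoothed = FMath::RInterpTo(FlameSprite->GetComponentRotation(), Target, DeltaTime, 5.0f);
    FlameSprite->SetWorldRotation(Smoothed);
}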


On Tour

Rhythm game made for VR and PC that takes the Guitar Hero formula and converts it so you do not need a guitar controller or have to use your keyboard as the controller. It can be played with a mouse and a single keyboard button, or in VR with one hand strumming and the other moving a cursor around to catch the notes.

 Guitar Hero without the controller

I have been a rhythm game fan for years, but I have always wondered if there was a way to make the genre more accessible using standard peripherals. I came up with two methods: one for VR and one for mouse and keyboard.


For VR, the "neck" hand controller moves a cursor left and right, as if you were moving your hand up and down the frets of a guitar. The strumming hand tracks the controller's movement, and if it covers enough distance in a short enough amount of time, it registers as a strum.
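
A rough sketch of that strum check (not the shipped code): accumulate how far the strumming-hand controller moves over a short window and call it a strum when the distance is large enough. The window length and threshold are placeholders.

#include "CoreMinimal.h"

bool UpdateStrum(const FVector& HandLocation, FVector& PrevLocation,
                 float& DistanceInWindow, float& WindowTime, float DeltaTime)
{
    DistanceInWindow += FVector::Dist(HandLocation, PrevLocation);
    PrevLocation = HandLocation;
    WindowTime += DeltaTime;

    if (WindowTime >= 0.15f)                               // evaluate every 150 ms
    {
        const bool bStrummed = DistanceInWindow > 12.0f;   // ~12 cm of travel in the window
        DistanceInWindow = 0.0f;
        WindowTime = 0.0f;
        return bStrummed;
    }
    return false;
}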


For PC, I have been experimenting with using the horizontal and vertical axes of the mouse to replace the five buttons of the Guitar Hero system. The five basic positions are still there, but now you simply move your mouse to a position instead of pressing a button to hold down blue, green, red, etc. Chords are handled by the vertical axis: the lowest note in the chord sets where the note sits on the horizontal axis, and the highest note determines where it sits vertically. It is a work in progress, but there is some potential.
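
A rough sketch of the horizontal mapping (not the shipped code): divide the play area into five lanes and pick a fret from the mouse's X position. The screen-space range is a placeholder.

#include "CoreMinimal.h"

int32 FretFromMouseX(float MouseX, float ViewportWidth)
{
    // Divide the horizontal play area into five equal lanes (green through orange).
    const float Normalized = FMath::Clamp(MouseX / ViewportWidth, 0.0f, 1.0f);
    return FMath::Clamp(static_cast<int32>(Normalized * 5.0f), 0, 4);
}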

User Made Music Import System

Since Rock Band and Guitar Hero have fallen out of fashion, several PC games have come out that let a dedicated group of fans keep making new playable music using a format called "charts." A chart describes where each note needs to be on the note runway as the song progresses and the correct spacing between notes. There are several charting programs, though my favorite is one called Moonscraper, and there is an online database of thousands of charted songs. Being able to open up a rhythm game to all of those songs is crucial.


I used C++ to read .chart files and parse them into an array of notes. I then procedurally place those notes in the world along a spline, and as the player travels the runway they encounter the notes and move the cursor to meet them. What this means is that all you need to do is go to the chart archive, grab the song you want, and put it in the right folder, and it will appear as a playable option in the song select menu.
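
A rough sketch of the parsing step (not the shipped code), assuming the common .chart layout where a difficulty section such as [ExpertSingle] lists note lines as "<tick> = N <fret> <sustain>"; the struct and section handling are simplified.

#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

struct ChartNote
{
    int Tick = 0;      // position in the song, in chart ticks
    int Fret = 0;      // 0-4 for the five note lanes
    int Sustain = 0;   // hold length in ticks (0 = tap note)
};

std::vector<ChartNote> ParseChart(const std::string& Path)
{
    std::vector<ChartNote> Notes;
    std::ifstream File(Path);
    std::string Line;
    bool bInTrack = false;

    while (std::getline(File, Line))
    {
        if (Line.find("[ExpertSingle]") != std::string::npos) { bInTrack = true; continue; }
        if (bInTrack && Line.find('}') != std::string::npos)  { break; }   // end of the section

        ChartNote Note;
        char Type = 0;
        // Note lines look like "  768 = N 0 96"; other event types are skipped.
        if (bInTrack && std::sscanf(Line.c_str(), " %d = %c %d %d",
                                    &Note.Tick, &Type, &Note.Fret, &Note.Sustain) == 4 && Type == 'N')
        {
            Notes.push_back(Note);
        }
    }
    return Notes;
}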

The Journey

I wanted to get away from the traditional rhythm game formula, where the note runway comes at you at a constant pace and you only really focus on the notes. That seemed like a waste, especially in VR, so instead of a linear runway I made it a spline that flies through the game world. I even included events that happen around the player, like a space battle between capital ships as the player flies through an asteroid belt.
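
A rough sketch of placing parsed notes along that spline, assuming an Unreal USplineComponent and the ChartNote struct from the parsing sketch above; the ticks-to-distance scale and the spawn callback are placeholders.

#include "Components/SplineComponent.h"

struct ChartNote { int Tick = 0; int Fret = 0; int Sustain = 0; };   // mirrors the parser sketch

void PlaceNotesAlongSpline(USplineComponent* Runway, const TArray<ChartNote>& Notes,
                           float UnitsPerTick, TFunctionRef<void(const FTransform&, int)> SpawnNote)
{
    for (const ChartNote& Note : Notes)
    {
        // Convert the note's tick position into a distance along the runway spline.
        const float Distance = Note.Tick * UnitsPerTick;

        const FTransform Where(
            Runway->GetRotationAtDistanceAlongSpline(Distance, ESplineCoordinateSpace::World),
            Runway->GetLocationAtDistanceAlongSpline(Distance, ESplineCoordinateSpace::World));

        // The note actor is spawned here and offset sideways by its fret lane elsewhere.
        SpawnNote(Where, Note.Fret);
    }
}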


Void Royale

Multiplayer squad-based fighter vs. capital ship space combat sim.

Environment Reactive UI

There was a game called Descent: FreeSpace that had an interesting menu UI where the environment would react to your mouse movements. Your camera was located in a hangar bay on a capital ship, and as you hovered your mouse over doors and objects they would animate and react, and labels would appear indicating what would happen if you clicked. I wanted to recreate that "cool" factor, but I wanted the menus to flow like a modern game such as Fortnite, Apex Legends, or Valorant, which favor ease of use with functionality spread out over various sub-menus. So I built the menus, then hooked up various animation sequences to create a visual feast for the eyes as you navigate between them.
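
A rough sketch of the hover reaction (not the shipped code), assuming an Unreal cursor trace; the prop interface calls are shown only as placeholder comments.

#include "GameFramework/PlayerController.h"

void TickMenuHover(APlayerController* PC, AActor*& HoveredProp)
{
    FHitResult Hit;
    PC->GetHitResultUnderCursor(ECC_Visibility, /*bTraceComplex=*/false, Hit);

    AActor* NewProp = Hit.GetActor();
    if (NewProp != HoveredProp)
    {
        // Tell the previous prop to settle down and the new one to animate and show its label.
        // if (HoveredProp) { /* stop its hover animation */ }
        // if (NewProp)     { /* play its hover animation and label */ }
        HoveredProp = NewProp;
    }
}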

6DOF AI Pathfinding/Combat Logic

While Unreal's native AI is great, it does not handle flight navigation very well. I created steering behaviors that allow AI fighter craft to navigate toward an objective while steering around obstacles. There is still some jerkiness, but at least they are not getting stuck.
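
A rough sketch of the steering idea (not the shipped code): blend a seek direction toward the objective with a push away from whatever a forward trace is about to hit. The trace distance and weights are placeholders.

#include "Engine/World.h"

FVector ComputeSteering(UWorld* World, const FVector& ShipLocation, const FVector& ShipForward,
                        const FVector& Objective)
{
    FVector Desired = (Objective - ShipLocation).GetSafeNormal();

    // Look ahead for obstacles along the current heading.
    FHitResult Hit;
    const FVector Ahead = ShipLocation + ShipForward * 3000.0f;
    if (World->LineTraceSingleByChannel(Hit, ShipLocation, Ahead, ECC_Visibility))
    {
        // Steer away from the surface we are about to hit, weighted by how close it is.
        const float Urgency = 1.0f - Hit.Time;   // Hit.Time is the 0..1 fraction along the trace
        Desired = (Desired + Hit.ImpactNormal * Urgency * 2.0f).GetSafeNormal();
    }
    return Desired;   // fed into the ship's rotation and thrust interpolation
}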

I have also implemented high-level combat decisions using a behavior tree. Enemies operate in wings and follow the orders of their wing commander; when a wing commander dies, another member of the wing takes over and issues target instructions.

Capital Ship Battle Damage

One of the hallmarks of a Monster Hunter-style game is the ability to knock pieces off the monster. To accomplish the same thing, I made a capital ship where you can blast off pieces and open holes in the armor to reveal vital ship components.


The problem I faced was making the model look seamless so the holes are not obvious, while still having a way to detect which piece the player hit and remove it, without spending a lot of time in the editor hand-placing pieces into each hole. I also wanted a lot of variety in each hole or broken-off piece; it can break immersion if every hole looks exactly the same.

I ended up going into Blender, taking a full capital ship model, and using the intersect tool to cut out chunks of the ship while leaving the chunks in place, so the ship looked seamless but removing a chunk would leave a hole. I then attached each cut-out piece to a skeleton rig and imported it into Unreal. Finally, I set it up to detect damage per piece: when a piece takes enough damage it disappears and spawns a particle effect of debris, leaving a perfect hole in the ship.
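
A rough sketch of that removal step (not the shipped code), assuming each cut-out chunk is skinned to its own bone as described above; the per-piece health bookkeeping and effect names are placeholders.

#include "Components/SkeletalMeshComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystem.h"

void ApplyHullDamage(USkeletalMeshComponent* Hull, FName PieceBone, float Damage,
                     TMap<FName, float>& PieceHealth, UParticleSystem* DebrisFX)
{
    if (!PieceHealth.Contains(PieceBone))
    {
        PieceHealth.Add(PieceBone, 100.0f);   // illustrative starting health per piece
    }

    float& Health = PieceHealth[PieceBone];
    Health -= Damage;

    if (Health <= 0.0f)
    {
        // Blow the chunk off: hide its bone and leave the hole that was cut in Blender.
        const FVector PieceLocation = Hull->GetBoneLocation(PieceBone);
        Hull->HideBoneByName(PieceBone, EPhysBodyOp::PBO_None);
        UGameplayStatics::SpawnEmitterAtLocation(Hull, DebrisFX, PieceLocation);
    }
}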

O'Neill Cylinder

I've always wanted to try making a Dyson sphere or an O'Neill cylinder in a game, but at the time I didn't know how I would be able to texture it properly. I devised a way to make a heightmap in Photoshop, bring it into Blender to make a terrain, bend the terrain into a cylinder, and then in Unreal use a HeightLerp material function so different colors appear at different heights on the heightmap. You can see more on how I accomplished this in my blog.


Star Wars: The Old Republic

My job as a world designer involved implementing quest content and combat encounters. We used an internal scripting system in concert with the HeroEngine. With its narrative-driven campaign, there were lots of opportunities to make the game feel dynamic and exciting.

The video is of a dungeon encounter called "The Hate Machine" that I helped implement; it has a lot of visual effects and combat.


Some other notable tasks I worked on:

- An escort quest leading a minigun-wielding soldier through a dungeon.

- Companion quests that trigger when certain quest-line and relationship conditions are met.


Videos: SWTOR -- Sith Warrior story, part 8 (Nar Shaddaa); SWTOR: The Hate Machine (Korriban)