Josh Terry
is a
UX designer
who empowers people with great digital experiences.
He's currently pursuing a Master's in
Human-Computer Interaction
at Georgia Tech
with a 4.0 GPA, working with the sea otter team at the
Georgia Aquarium
, and doing UX and QA work at
Akupara Games.
Recent Work
Rick & Morty: Virtual Rick-ality
I helped redesign the first time user experience and height-calibration UX flows of the PSVR port of this Emmy-nominated title as QA Lead.
2018 - Adult Swim Games
UX Design

Game Development

Product Management
Problem Space
Helping first-time users of PlayStation VR fully enjoy Rick and Morty: Virtual Rick-ality (RMVR) on their home systems.
Users had high expectations of the title due to the success of its release on Steam for PC.
PSVR hardware imposed significant limitations on the game's tracking and overall performance.

Adult Swim Games was without a Product Manager while I was Quality Assurance Lead on this title. I filled this gap by gathering user-centered feedback from the QA team to present to developers in the form of issues and design implications.
Through a combination of think-aloud cognitive walkthroughs, contextual inquiry, semi-structured interviews, and research of player comments online, I identified several problems that users were likely to encounter in the title. From these problems, I developed design recommendations that I then presented to the developers at Owlchemy Labs.
Issues & Design Implications
Design Implications
I. Players were confused about controls when first entering the application, regardless of their familiarity with RMVR for PC.
The title should orient players with a first time user experience. This FTUE took the shape of a tutorial that situated players in the universe, calibrated players' devices, and helped familiarize players with the game's controls, mechanics, and progression system.
II. The title requires players to turn around while playing, but the PSVR hardware is only capable of tracking players facing the system's camera.
The title should allow players to turn around in game space without needing to turn themselves around in real life. This led to snap rotation being implemented in the PSVR version of the title: one button rotates a player's view 45 degrees to the left, while another rotates it to the right.
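As an illustrative sketch (not Owlchemy's actual code), the core of snap rotation is just stepping a yaw offset in fixed increments and wrapping it:

```python
SNAP_ANGLE = 45.0  # degrees per press, matching the PSVR build

def snap_rotate(yaw_degrees: float, direction: int) -> float:
    """Step the player's view yaw by one snap increment.

    direction is -1 for the left-rotate button, +1 for the right.
    The result is wrapped into [0, 360).
    """
    return (yaw_degrees + direction * SNAP_ANGLE) % 360.0
```

Because the rotation is instantaneous rather than smooth, it also sidesteps the motion discomfort that continuous artificial rotation can cause in VR.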
III. Players were often unable to pick up items low on the ground or high above them, as these items fell outside of the system's tracking range.
The title should allow players to adjust their height quickly and natively, without needing to adjust their height in the PSVR system software. This was implemented as a height calibration menu flow that players may access from any place, at any time.
IV. The height calibration menu flow was frustratingly slow for experienced players to use.
The title should allow players to perform height calibration without a dedicated menu flow. This was implemented in the form of two buttons that may be pressed to immediately increase or decrease a player's height in the game.
I performed multiple contextual inquiry sessions with users. These consisted of task-based cognitive walkthroughs during which participants were asked to think aloud about the actions they wanted to take in the game. I followed these sessions with semi-structured interviews concerning the design elements and bugs encountered during the tasks of the earlier cognitive walkthroughs. The interviews were also helpful for gaining feedback on general user sentiment around the title, both positive and negative.
The original height adjust menu prompted players to assume a T-pose in front of the PlayStation Camera. This proved bothersome and time-consuming for most participants. To alleviate this frustration, the triangle and circle buttons were configured to increase and decrease the in-game height of players. Users reacted positively to this change both during testing and in online reviews.
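A minimal sketch of the quick height adjust described above, assuming a hypothetical step size and clamp range (the shipped tuning values aren't public):

```python
HEIGHT_STEP = 0.05                   # meters per press (hypothetical value)
MIN_OFFSET, MAX_OFFSET = -0.5, 0.5   # hypothetical clamp range

def adjust_height(offset: float, direction: int) -> float:
    """Nudge the player's height offset up (+1, triangle) or
    down (-1, circle), clamped so repeated presses can't push
    the player out of a sensible range."""
    new_offset = offset + direction * HEIGHT_STEP
    return max(MIN_OFFSET, min(MAX_OFFSET, new_offset))
```

Mapping each press to a small, clamped step is what made the adjustment immediate for experienced players while keeping it safe to spam.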
While snap rotation and quick height adjust controls were not available at launch, they were added to the game in version 1.1, resulting in improved gameplay and user reviews. These changes empowered users to play the game with fewer technical errors, likely resulting in fewer low points along their journeys to completing the game.
Accessible Outdoors App
I did iterative, user-centered UX research and design on this project as the visual design specialist on a team of four MS HCI students.
2020 - Lucy Chen, Christian Gutierrez, Matt Rossman, Josh Terry
UX Design

UX Research

Project Management
Problem Space
Empowering persons with mobility impairments to research and engage with local parks.
This project was completed during the Fall of 2020, during the COVID-19 pandemic. All user studies were conducted remotely to ensure safe social distancing, with the exception of an observation study in which team members went to local parks to conduct socially-distant research. This introduced interesting opportunities and complications to the project.
Prior to this project, all team members were unfamiliar with the social and physical caveats of working with members of the motor-impaired community. This project proved to be an excellent learning and empathy-building exercise, and all team members are now comfortable with, and passionate about, working with this vulnerable audience.

While existing widespread navigation tools feature some accessibility settings, the accessibility information on these platforms lacks depth. Also, accessibility-oriented trail databases lack the breadth of data featured by widespread navigation tools. How do we satisfy both of these requirements?
Early Research Activities
Prior to designing and iterating on possible solutions for this problem space, we performed six research activities. These included:
I. A survey distributed to the 75 members of a support group for individuals who survived traumatic spinal cord injuries.

II. Observation of individuals at public parks during which we counted the number of people with motor impairments at those parks, as well as difficulties these individuals may have encountered navigating through these parks.

III. Unstructured interviews with members of, and experts on, the motor-impaired community and their experience engaging with the great outdoors.

IV. Affinity mapping exercises through which we hierarchically organized stakeholder remarks on the problem space and determined user needs.

V. Personas and empathy maps distilled from identified user needs and remarks to better understand and build for the end users of our product.

VI. Journey maps that provided the team with explicit user stories that our product might be used as an intervention to improve.
The green "stickies" above show the four high-level takeaways from the affinity mapping exercise: users enjoyed connecting with others, being independent, appreciating nature, and overcoming the limited accessibility features present in the space.
The above journey map shows the actions one of our personas might take to go on a walk somewhere new, as well as his emotional state as he travels along that journey.
Early Research Findings
From the above research activities, our team collaboratively identified eleven high-level findings, several of which we determined were particularly important to understanding our stakeholders, and ultimately the success of the project. Below are the four critically-important research findings:

I. Users want more engagement with natural parks. Therefore, our solution should enable users to engage with natural parks more frequently than they currently do without our solution.

II. Users want nature to be as unaltered as possible, but many accessibility solutions involve the physical alteration of that nature. Therefore, our solution should avoid physically altering nature wherever possible.

III. Users value collaboration. Therefore, a crowdsourced data model would likely be effective for our solution.

IV. Users value independence. Therefore, our solution should assume an ethos that focuses on empowering users rather than helping them accomplish tasks.
The above sketches are for the team's Spatial prototype, mocked up by Christian Gutierrez and Matt Rossman. This platform would allow users to research parks and trails online from their desktops, then view live alerts while engaging with those trails and parks. The team evaluated this interface by performing user- and expert-involved cognitive walkthroughs and heuristic evaluations.
The above sketches are of the team's smartwatch app prototype, illustrated by team mate Lucy Chen. The tool would notify users of upcoming hazards in a trail and prompt them to confirm whether or not the specified hazard was present in the trail. Additionally, users would be able to report issues encountered on the trail that would then be shown to other users. Like the last interface, we evaluated this smartwatch prototype by performing user- and expert-involved cognitive walkthroughs and heuristic evaluations.
The above two flow charts depict conversation flows in the team's third prototype, a voice assistant app made for identifying accessible parks and trails. I created these simulated conversation flows. In keeping with the other sketches, we evaluated this interaction mockup by performing user- and expert-involved cognitive walkthroughs and heuristic evaluations.
Spatial Sketch Feedback
Spatial Sketch Feedback & Recommendations
I. Unclear what the accessibility rating represents.
Show what factors influence the accessibility rating each time it appears.
II. The 2D interface seems to only support large monitor displays, while users may want to look up parks using their mobile device.
The 2D interface should be designed with mobile devices in mind, if not specifically for mobile devices.
III. Overlays are presented as a "one size fits all."
Users should be able to customize what warnings they see.
IV. User can't tell what terrain trails are made of.
Convey the surface material of trails to users, such as whether they are paved, gravel, or grassy.
Smartwatch Sketch Feedback
Smartwatch Sketch Feedback & Recommendations
I. Too many UI elements are presented at once.
Reduce the amount of information present on the watch face so that users are able to more easily interpret it.
II. Visual design of arrow appears to indicate that navigation begins behind the user instead of in front of them.
Refine visual design for clarity, and confirm this clarity with further user feedback sessions.
III. Platform relies on colors to convey information, which could negatively impact those with colorblindness.
The platform should rely less on color to convey information, and the interface should be designed with colorblind users in mind.
IV. The platform does not appear to allow for many options of input.
The platform should allow for multiple forms of input, such as voice and gesture interactions. This will allow more of our stakeholders to seamlessly interact with the platform.
Voice Assistant Sketch Feedback
Voice Assistant Feedback & Recommendations
I. Users want to ask what parks are nearby without mentioning their impairment every time.
The system should dynamically suggest parks to users based on pre-existing knowledge of the user's abilities. This could be done by syncing with an account that the user has in an app within this platform's ecosystem.
II. Normal use case conversation flow prompts the user to create an account too quickly after they said "no" during the first time user experience.
The platform should not ask users whether they want a tailored experience too frequently or immediately after the FTUE, but it should allow users to opt into a personalized experience at any point.
After taking the sketch feedback into consideration, we rebuilt the Spatial prototype's wireframes using Figma and a second online prototyping tool. Similar to the last round of prototypes, all wireframes at this phase were evaluated with user- and expert-involved cognitive walkthroughs and heuristic evaluations.
The smartwatch prototype interface was also rebuilt in Figma, featuring two menu flows in which the user would be prompted to respond to an upcoming trail hazard, and report a trail hazard.
Compiled Findings
From the wireframe feedback sessions and previous research findings, we distilled feedback from users into four design recommendations for the final design of our platform. These design recommendations indicated that the current iteration of the Spatial platform was a more effective solution in the problem space than the smartwatch app. Due to the structure of the course this project was for, the team decided to move forward with only the Spatial prototype. The four design recommendations are below:
I. Allow users to customize which UI elements they are shown in any platform moving forward.

II. Offer hands-free input wherever possible, and do not require tactile input as a prerequisite to verbal input.

III. Prompt the user with output across multiple sensory channels to better alert them of upcoming hazards or prompt them for feedback.

IV. Make iconography as simple as possible, free of potentially-ableist imagery, and without text wherever possible.
Final Prototype
The final prototype of the Spatial platform's mobile touchpoint was redesigned from the ground up in Figma, and the AR prototype was iterated on with imagery that was deemed more informative, more accessible, and generally more effective by expert and user study participants alike.

The Mobile platform prototype and AR prototype are both viewable online.
Future Work
I. Involve more users. There is so much diversity within our stakeholder audience that our team feels we weren't able to fully capture the needs of our users, or their sentiment toward our product. This is due largely to the difficulties imposed by COVID-19, the size of our user population, and the tight 4-month schedule this project operated on.
II. Park alteration research. While we learned that users were generally opposed to park features that they felt worsened their experience in nature, we did not learn much nuance about where this sentiment was coming from, or how we might circumvent this user need while making physical changes to parks.

III. Iterate. The iterative process of this project was greatly helpful to its rapid growth and likely success at fulfilling its purpose, but given more time, I would like to keep iterating on this project until users indicate that it is ready to be released into the wild.
IV. Business model. If the team were to continue this project into the future, we would need to establish an effective business model to keep the project afloat for the foreseeable future.
V. Establish audience. Users are just as important as functionality when it comes to the success of a platform. If this project were to be implemented and published, we would need to establish a group of users for the platform to maximize its chances of success in the real world.
VI. Implement and publish. With all of the previous steps taken care of, we would need to finalize the platform's functionality and polish, then publish it to the appropriate platforms. Depending on the business model of the platform, this would likely be followed by a live ops plan.
Ermine: Indie 3D Platformer
I designed and implemented all aspects of a functioning 3D platformer with a focus on game feel and procedural animation.
2020 - Independent Project
Game Development

Technical Art

Project Management
Project Goal
Making a third-person platformer that feels great to play.
With very little prior experience in Blender and Unity, I decided to independently create this third-person platformer as a personal project during the summer between my undergraduate and graduate studies.
During development, I actively implemented mechanics and graphical features that built toward the "feel" of the game, such as procedural animation, jump fine-tuning, inverse kinematics, physics animation, and tight controls.
Early development involved plenty of experimentation with the mechanics that go into making a 3D platformer feel good. To accomplish this, I modeled and animated an ermine character that I could use to navigate through an environment. With this character implemented, I rapidly prototyped and fine-tuned numerous features that are often present in platformers but that users rarely think about, including:

I. Character model procedurally tilts toward direction of acceleration.

II. Character acceleration and deceleration fine-tuned to feel as though the character has weight, but is easily controlled and stopped on a dime.

III. The character jumps higher when the jump button is held, and falls faster when it is not.

IV. The player may make the character sprint, crouch, and slide, each of which affects the animations, speed, and movement behavior of the character.

V. Jump/fall animations are procedurally blended based on the character's vertical velocity while airborne.

VI. The character "sticks" to the ground when walking or running downhill, but falls as expected according to gravity when running off of an edge.

VII. If the player tries to jump immediately after falling off of an edge, they are still able to jump, allowing users to better control the character and recover from mistakes.

VIII. The character's feet "stick" to the ground with a hand-implemented inverse kinematics algorithm.

IX. The character's scarf hangs behind them and is physics-animated, behaving as though it were a real scarf.

X. The camera is fine-tuned to smoothly follow the character, pivot around the character, and not intersect with terrain via a ray-casting script.

XI. The character's object lives inside of a squash & stretch controller, which makes the character's movement appear dynamic and fluid.
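Two of the features above (III and VII) are well-known platformer techniques: variable jump height and "coyote time." A minimal sketch of both, with hypothetical tuning values rather than the game's actual numbers:

```python
GRAVITY = -30.0        # units/s^2 (hypothetical tuning values)
FALL_MULTIPLIER = 2.0  # gravity scale once the jump button is released
COYOTE_TIME = 0.1      # seconds of jump grace after leaving a ledge

def can_jump(grounded: bool, time_since_grounded: float) -> bool:
    """Feature VII: the jump still registers for a short window
    after the character walks off an edge."""
    return grounded or time_since_grounded <= COYOTE_TIME

def step_vertical_velocity(velocity: float, jump_held: bool, dt: float) -> float:
    """Feature III: normal gravity only while rising with the button held;
    otherwise extra gravity, so held jumps arc higher and falls feel snappy."""
    gravity = GRAVITY if (jump_held and velocity > 0) else GRAVITY * FALL_MULTIPLIER
    return velocity + gravity * dt
```

Both tricks quietly forgive small timing errors and make the character feel more responsive than a physically "honest" jump would.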
Like many of my projects, this one started with an idea: what if I tried to make the most of the time between semesters by having some fun in Blender and Unity? Little did I know that these humble beginnings would lead to a huge undertaking in the form of the most complicated game I've independently developed to date.
I knew that I wanted the game to have a laid-back, Pacific Northwest-inspired environment due to my interest in the area. Furthermore, I was inspired by a GDC talk on the procedural animation in Overgrowth, and I was determined to try it myself. To start, I mocked up the above animation state machine with pencil and paper.
Next, I wanted to establish a development schedule. I once again looked to the GDC talk on the procedural animation in Overgrowth to provide myself with a challenging, but achievable, order of operations.
I even created a color palette that I would use to drive the visual design of the project moving forward. Taking inspiration from The Pathless, I wanted the game to be full of subtle blues, greens, grays, and browns, with the occasional flash of red to draw the player's eye.
Environment design proved an interesting challenge that I'd never encountered in an industry setting before, so I took my time when mocking up and planning assets to use throughout my Unity game.
From the sketches above, I created several 2D assets for the game's foliage: pine needles, leaves, flowers, and grass were the first of many environment assets to come.
Next came sketches of rocks, and eventually 3D models that I would dot the landscape with. The texture of these rocks was hand-painted with acrylic, scanned into my computer, and further painted in Blender for what I thought was an interesting visual effect.
Having some environment assets complete and a character model ready for a game engine, I designed and implemented many of the intended aesthetics, dynamics, and mechanics of the project in Unity. Alternating between feedback-gathering and feature-refining proved an effective exercise in iterative game design.
Later in development, I decided that it was time for a visual overhaul of the main character. I also decided that it was time to give him some tools through which he could interact with the world, affording players a greater degree of agency. Weapons are a popular way to accomplish this in games, as well as an interesting area to explore with game feel in mind, so I decided to give him a sword and bow.

I plan to implement an animation in which the character shakes one weapon in the air and it transforms into the other weapon. The metallic ball at the center would function as a unique magical tool that the player can use to pick up elemental charges from enemies or the environment, then later solve puzzles with. This is where narrative development comes in.
The character is fully rigged with inverse kinematics and shape keys for clothing-wearing, facial expressions, and speech.
Future Work
First, I would like to fully implement combat and puzzles to bring a more engaging experience to the project. This would include weapon and enemy animations, behavior, AI, and additional mechanics.
Second, I would like to further build out a compelling environment for the player to not only navigate through, but meaningfully engage with. I would like to accomplish this with small-scale procedural interactions and animations that would afford the player a greater degree of agency in the world.
Third, I would like to further develop the project's narrative to provide the player with a reason to exercise their agency in the game.
ZenVR: Meditation Instructor
I optimized 3D assets, UX flows, and production pipelines on this virtual reality meditation instruction tool for the Oculus Quest.
2020 - Satori Studios
Game Development

Technical Art

Optimizing and polishing a VR meditation tool in a fast-paced startup environment.
Before the inception of the startup Satori Studios, ZenVR was the Master's project of Rachel Feinberg and Matt Golino, two now-graduates from Georgia Tech's MS HCI program.
While Matt and Rachel had designed and implemented the title, it had numerous graphical, functional, and performance issues. Matt knew that I had experience in Unity and Blender, so he invited me aboard to optimize and polish the title.

The team was well aware of several of these issues, but they had not tracked them in an organized database. I took on the task of logging and tracking known issues with Trello during my time with the team.

Through a combination of functionality and exploratory testing, I was able to identify several issues, each of which had a clear corresponding solution.
Issues & Remediations
I. Asset meshes were unnecessarily complex, often featuring thousands more polygons than necessary, resulting in poor performance.
I retopologized and UV unwrapped the meshes of all assets that were unnecessarily complex, and I ensured that the remaining meshes were low-poly enough to run smoothly on the Oculus Quest. This vastly improved the performance of the title with no discernible difference in graphical quality.
II. Every asset had its own material, often consisting of numerous high-resolution maps, resulting in poor performance and load times.
I combined the materials of assets into large complex materials that contained the textures of many meshes. Through this method, I was able to reduce the number of materials in the project's environment from 40 to 2. Similar to the previous solution, this improved performance with no significant hit to the title's graphical quality.
III. The title was set up with Unity's default rendering pipeline, resulting in poor performance and graphical quality.
I configured the Unity project to use Unity's Universal Render Pipeline (URP), this time improving both performance and graphical quality.
ZenVR takes place inside of a dojo, in which Wei, your Shifu, guides you through eight lessons on how to effectively meditate on your own. Before optimizing, many of the assets inside of the dojo were unnecessarily detailed. For instance, the pillows on the floor of the room each contained about 2000 vertices. Through some clever retopology, these were brought down to about 40 vertices each.
The assets inside of the dojo, and some of the architecture of the dojo itself, were also due for material optimization. To accomplish this, I combined the texture and normal maps of each asset in Photoshop CC, then remapped the UV unwraps of each asset onto the corresponding texture and normal maps of this new material. Pictured above is the result of combining 16 materials into just one.
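The UV remapping step can be sketched as scaling and offsetting each asset's coordinates into its tile of the combined atlas. This assumes a simplified uniform grid layout for illustration; the actual packing varied per asset:

```python
def remap_uv(u: float, v: float, tile_index: int, tiles_per_row: int):
    """Map a mesh's [0, 1] UV coordinates into one tile of a square
    texture atlas. Tiles are indexed left to right, row by row
    (a hypothetical layout for illustration)."""
    scale = 1.0 / tiles_per_row
    col = tile_index % tiles_per_row
    row = tile_index // tiles_per_row
    return (u * scale + col * scale, v * scale + row * scale)
```

For a 16-into-1 combine, a 4x4 grid gives each asset a 0.25 x 0.25 region of the shared texture and normal maps, letting all 16 meshes render with a single material.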
The outside of the dojo features mountainous terrain, trees, a Shinto shrine, and the building itself. Similar to the assets inside of the building, the meshes of these assets featured unnecessarily high vertex counts and many materials.
Above are the results of an optimization process similar to that applied to the assets inside of the dojo. During this round of optimizations, I combined 24 materials into one.
Samurai Jack: Battle Through Time
I planned and tracked production of the physical release of this critically-acclaimed title as Associate Producer.
2020 - Adult Swim Games

Game Development

Project Management
Problem Statement
Ensuring that players are able to fully enjoy their physical and collector's edition versions of Samurai Jack: Battle Through Time.

Hi! My name is Josh. I take a user-centered approach to designing, implementing, and iterating on great experiences in games and technology.

Through the years I've spent with Adult Swim Games, Mailchimp, and Georgia Tech, I've picked up a formidable background in game design, development, and production, as well as UX design, research, and engineering. Fast-paced Agile environments are where I thrive.

Feel free to check out my resume.

Here's what some folks I've worked with have to say!
"Josh came in every day with a smile, no matter what we threw at him, and provided a consistent and much-needed work ethic. I would recommend Josh to work anywhere with the talent, dedication, work ethic, and downright cheerfulness he possesses."
Abigail Tyson
Community Manager,
Bethesda Softworks
"Josh covered tons of ground with Adult Swim Games, partly due to his ability to dig deep into any situation, understand the context, and turn out quality work across production, QA, and product/design departments.
My only regret is that I couldn't bring him on full time!"
David Verble
Production Manager,
Adult Swim Games
"Not only is Josh very talented, he’s just a really great human to work with. He cares about doing the right thing and doing it well, and it shines through every aspect of his work."
Kieran Helbling, MBA
Director of Support Operations,
UX Design
I believe that it's the smallest things in life that make the biggest differences. That's why I like using my experience in visual design, computer science, and psychology to make breathtaking digital experiences.
Game Development
I've got 4 years of games industry experience as a producer, QA engineer, UX designer, and full-stack independent developer. I'm confident in my ability to make the next great game.
I'm a people person, I'm organized, and I've got a knack for getting things done. Among my 17+ shipped titles are Samurai Jack: Battle Through Time, Rick and Morty: Virtual Rick-ality, Duck Game, and Pocket Mortys.
Technical Art
I love tackling complex problems to bridge the gap between art and technology. With my interdisciplinary background, I know how to talk the talk and walk the walk with both artists and engineers.
Want to reach out? I'd love to hear from you!