Game Art Methods - Project Postmortem, Part I

General / 25 March 2018


First and foremost, I definitely aim to continue working on my skills in all of the tools and processes used in this project to further optimize the workflow.  I might take a short break from this particular project to refresh my mind and approach it again soon with a new perspective.  Some thoughts for future work include expanding the scene's hero area and working my way into visual effects and characters to further enhance the scene.  I think this scene could be the start of a very interesting story or game.  It's very gratifying to envision a scene and then see it come to life.  A lot of my previous and current work is in virtual and augmented reality, so I'm already thinking of how to turn this scene into an environment that I can virtually walk through and interact with.  Each part of the process was captivating in its own way, particularly because it allowed me to simply create.  I have a newfound affinity for modeling and sculpting assets of all kinds, with a strong emphasis on optimizing them.  The foliage assets were probably the most challenging, yet I really want to get better at them because they're such a strong part of many scenes.


Throughout this experience, I found a few aspects to be the most critical to address first, because they form the foundation for the project's files and processes.  I thought I had started the course project with a good organization system and thought process, but as I learned new tools and processes, both needed to be refined.  By the end it was a bit messy, but it became a good template and learning experience for setting up correctly in the future.


The various tools used in this project were significantly affected by computer hardware.  Some are GPU-intensive, others are CPU-intensive, and still others are both.  Keep Windows Task Manager open somewhere on the desktop to see which software is consuming resources, and manage multitasking so resources are dedicated to specific tasks rather than running several resource-intensive applications at the same time.

Additionally, if the computer's fans start sounding like a jet engine, it's a good sign that the machine is running a pretty intensive process somewhere and should be monitored.  I was fortunate that my computer did not fatally crash at any point, but there were instances where it sounded and looked like it was stalling in the middle of an intensive process.  Save any progress before initiating such processes, and test with small files first, gradually increasing size if necessary.  With this in mind, it became really important to keep model and texture sizes managed for an optimal workflow.


Moving content from one piece of software to another requires a clear file and folder system; without one, things can easily become complicated, even for one's own work.  There were a few times at the beginning where I had to open several files just to remember which one held the information I needed.  As such, follow a consistent and clear file-naming convention, keep file locations stable, avoid duplicate file locations, and archive with a version control system.  In its simplest form, version control meant archiving files with a date appended to the file name.
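As an illustration of that simplest form, a small Python helper could generate the dated archive names. The function and naming scheme here are my own sketch, not part of the original pipeline:

```python
from datetime import date

def archive_name(filename, version_date=None):
    """Append an ISO date suffix before the extension for archiving.

    e.g. "hero_rock.ma" archived on 2018-03-25 becomes
    "hero_rock_2018-03-25.ma".
    """
    version_date = version_date or date.today()
    stem, dot, ext = filename.rpartition(".")
    if not dot:  # filename has no extension at all
        return f"{filename}_{version_date.isoformat()}"
    return f"{stem}_{version_date.isoformat()}.{ext}"
```

An ISO date suffix (YYYY-MM-DD) keeps archived versions sorting chronologically alongside the working file.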

Redundant backups were at the top of my priority list from the beginning, since a previous experience had taught me the value of such resources.  I used Microsoft OneDrive and its automated cloud backup system along with an external hard drive running its own automated backups.  The key is for them to be automated so it's something one doesn't have to think about; it just happens as work progresses.  At the same time, periodically check that they are processing correctly.  I encountered an instance where my file and folder system extended beyond the character limits of one of the backup systems, which then couldn't back up the affected files.
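One way to catch that failure mode before the backup tool does is to scan the project tree for over-long paths. This is a rough sketch under an assumed limit: 260 characters is the classic Windows MAX_PATH value, but the actual ceiling varies by backup service:

```python
import os

# Assumed limit: the classic Windows MAX_PATH. Check your backup
# service's documentation for its real path-length ceiling.
MAX_PATH = 260

def find_long_paths(root, limit=MAX_PATH):
    """Return every file or folder path under `root` longer than `limit`."""
    too_long = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if len(full) > limit:
                too_long.append(full)
    return too_long
```

Running this periodically over the project root flags paths to shorten before the backup silently skips them.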

Maya's project folder structure is a good starting template.  It's very capable of organizing working models, exported models, textures, and other supporting documentation all in one place.  I had one Maya project folder structure for each type of asset, such as hero assets, foliage assets, and architectural assets.  Subfolders within each compartmentalized the individual asset along with its textures and supporting files.  From Maya's folder structure, the content is linked into the Unreal Engine 4 (UE4) project folder structure.  UE4 allows content to be refreshed from its source location, which is a quick way to keep UE4 content updated after making edits outside of UE4.  UE4's folder structure also needs to be set up to isolate assets, materials, textures, particle effects, cameras, lighting, etc.
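To make that per-asset compartmentalization repeatable, the subfolder scaffolding can be scripted. The subfolder names below are illustrative placeholders, not Maya's exact project defaults:

```python
import os

# Hypothetical per-asset layout, loosely modeled on a Maya project;
# adjust the names to match your own pipeline.
ASSET_SUBFOLDERS = ["scenes", "exports", "sourceimages", "textures", "reference"]

def make_asset_folders(project_root, asset_names):
    """Create one compartmentalized subfolder tree per asset."""
    for asset in asset_names:
        for sub in ASSET_SUBFOLDERS:
            os.makedirs(os.path.join(project_root, asset, sub), exist_ok=True)
```

Scripting the scaffold means every new asset starts with the same structure, which keeps the Maya-to-UE4 source links predictable.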

Streamline Workflow

Consistent and clear organization streamlines the workflow, particularly one that involves several tools and files.  At a glance, my workflow entailed:

  1. Maya (low-res models, with some subdivision toward high-poly models)
  2. Mudbox & ZBrush (high-poly model refinement)
  3. Substance Painter (high-poly to low-poly baking and texture development)
  4. UE4 (full integration and rendering)

Additional software used in the process included:

  • World Creator (landscape height map)
  • Affinity Photo & Substance B2M (texture development)
  • Marmoset Toolbag (test rendering and model sheet presentation)

Continuing Education

Interacting with others in the course and seeing their processes made me realize that there are different approaches to production pipelines and tools.  One could effectively accomplish the same tasks with half of the tools that I explored, but my incentive to learn new industry tools was to evaluate their effectiveness in optimizing my production.  I tried a few early on like xNormal and KeyShot, which were effective, but eventually I pursued other tools that integrated better with my processes. 

Learning the tools themselves wasn't too bad, because my focus was on learning the process, which typically translates among comparable tools.  For example, ZBrush and Mudbox are broadly comparable, yet I found Mudbox's user interface better suited for quicker modeling with stencils and stamps, while my experience with ZBrush has shown it to be more effective as a custom sculpting tool because of its various brushes and the way it responds to sculpting with a tablet.  Ultimately, I see learning as a constant part of my workflow, even more so because this industry is so diverse in its tools and production pipelines.  The lesson for me is that focusing on the process for a task, rather than the tool itself, allows one to quickly pick up any tool.


Complex tools and processes require constant documentation, particularly for a novice.  Throughout the project, some tasks became repetitive and easily retained, whereas other tasks were performed less frequently, and it's these that truly need to be documented.  Take notes, record short videos, and include links to sources for future reference.  At least for me, relearning tools and processes is an accepted part of the workflow, but my documentation makes it easier and quicker to recollect them without having to research from scratch.  The act of documenting also reinforces the material, so it's somewhat like repeating the task as part of learning.

Postmortem continued in Part 2...