Creating the scene in Epic’s Unreal Development Kit was fun and interesting. The next step was to create the same scene, shown at the beginning of the article, in Unity.
Welcome to the final part of the Experiences with Environments series. This article will go through importing assets into Unity, creation of Shaders in Acegikmo’s Shader Forge, lighting using Lightmapping Extended, and post-processing using Image Effects.
As with UDK, I wanted to begin by recreating the same composition as the previous two scenes. Before I started importing assets, however, I first needed to set up a folder structure to hold all of my new assets. You can organize a project any way you would like, but this is how I did it:
I created Cubemaps, Materials, Meshes, Scenes, Scripts, Shaders, and Textures folders to keep my project as organized as possible. Inside the Cubemaps folder, I right-clicked, went to the “Create” option, and clicked Cubemap. I created three cubemaps: one for the floor and one for each wall.
Note: My reflections in this scene are not correct. I have not yet found a proper way to create real-time reflections that are not too expensive, so the reflections explained in this article are simply my best attempt.
Next, I imported each asset into its assigned folder. After importing each mesh, I checked the “Generate Lightmap UVs” checkbox inside the “Import Settings” and set the size to 1. Generating the lightmap UVs allows lightmap textures to be applied to the meshes. Lightmap textures are the result of baking a scene’s lights into a texture that is then mapped onto the models’ UVs.
Note: The meshes must be built at the right size in Maya for a size of 1 to work; each Maya unit equals one Unity unit. You can scale accordingly.
I dragged the objects onto the scene and created the visual below:
After placing the assets, I wanted to create materials to emulate the same effects that I created in UDK. The first shader I created was a blinking material for the generator and circular floor, shown below:
Starting from the right, I used a Time node to control how fast I want it to blink. The Time node has four outputs: “t/2”, “t”, “t*2”, and “t*3,” where “t” is time. The first output runs at half speed, the second at normal speed, the third at double speed, and the last at triple speed.
I used the last option, and multiplied it again by 2. I wanted the blinking in this scene to be much faster than the other scene to show some differentiation between the two.
Next, I fed the time product into a Sin node, just like I did in UDK. Instead of putting the output of Sin into a Clamp, I entered it into a Lerp, which acts similarly to the Clamp in UDK. I then entered two numbers that set the dullest and brightest values the emissive texture reaches while blinking. Finally, I multiplied the result by another number to increase the overall brightness of the emission.
For some reason, when the emissive texture is multiplied by the previous operation, the result is a white blinking texture instead of the color shown on the texture. Therefore, I multiplied the emissive texture by the color I wanted (in this case a light blue). After multiplying the two operations together, I fed the product into the Emission input.
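The whole blink chain boils down to simple per-pixel math. Here is a Python sketch of it; the constants (`dull`, `bright`, `boost`, and the light-blue tint) are illustrative stand-ins, not the exact values from my graph:

```python
import math

def blink_emission(t, dull=0.2, bright=1.0, boost=2.0,
                   color=(0.6, 0.8, 1.0)):
    """Sketch of the node chain: Time ("t*3") -> *2 -> Sin ->
    Lerp(dull, bright) -> *boost -> tinted by the emissive color."""
    s = math.sin(t * 3.0 * 2.0)            # "t*3" output, multiplied by 2 again
    strength = dull + (bright - dull) * s  # Lerp between dullest and brightest
    strength *= boost                      # raise the overall emission brightness
    return tuple(c * strength for c in color)  # tint so the blink is not pure white
```

Note that Sin outputs values from -1 to 1, so the Lerp here actually extrapolates below `dull` on the negative half of the wave; remapping the Sin output with `s * 0.5 + 0.5` first would keep the blend strictly between the two endpoints.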
Next, I wanted to create a reflection shader that acted similarly to the material in UDK; however, it ended up being a much tougher task than I expected it to be. It does not work correctly, but the shader below is what I ended up creating:
I will start with the top operation. I created a cubemap that captures the general reflections of the room from a given location (it changes depending on which wall or floor you are targeting). Next, I needed two values to enter into a Lerp; this operation simulates irregular reflections emanating from the floor. Each value is created by multiplying the cubemap by a different number, and the two results are entered into the ‘A’ and ‘B’ inputs of the Lerp. The Diffuse texture is then entered into the ‘T’ input. This places the reflection on top of the Diffuse map.
Now for the bottom operation. First, I used a View Reflection node, which calculates the reflection direction the player sees on the floor. It then enters a Comp. Mask (short for Component Mask) that removes the ‘X’ and ‘Z’ components of the direction and outputs only the ‘Y.’ A Component Mask takes in a value and outputs only the components specified on the node. That result is multiplied by the output of the operation explained in the previous paragraph. Finally, I multiplied everything by the Spec map so the reflection shows more on the exposed parts of the surface.
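Put together, the two operations amount to the following blend. This Python sketch uses single scalar channels and made-up multipliers (`weak`, `strong`) just to show the data flow through the graph:

```python
def lerp(a, b, t):
    """Shader Forge's Lerp node: blend from a to b by t."""
    return a + (b - a) * t

def floor_reflection(cube, diffuse, spec, refl_y,
                     weak=0.3, strong=0.9):
    """Top operation: Lerp(cube*weak, cube*strong, diffuse).
    Bottom operation: the view reflection's Y component (from the
    Component Mask), then everything is multiplied by the spec map."""
    blended = lerp(cube * weak, cube * strong, diffuse)  # diffuse drives the blend
    return blended * refl_y * spec  # fade by reflection Y, gate by spec
```

With a cubemap sample of 1.0, a mid-grey diffuse of 0.5, full spec, and the reflection pointing straight up, this returns 0.6, i.e. the reflection strength sits between the `weak` and `strong` extremes.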
Note: Shaders are not materials. While creating the shaders, I never put actual textures into the Texture2D slots or cubemaps into the Cubemap slots. Shaders are applied to materials. Materials are where the textures and cubemaps are placed. The material is then placed onto the model.
After creating these materials, the result is shown below:
My next step was creating the first light pass. I started by trying to put the lights in the same places as in UDK, but I soon discovered that lights in Unity do not behave the same way they do in UDK. So I played with the lights until I found a proper first pass. I used the Lightmapping Extended menu to give me more control over the bake settings. Here is a picture of the Lightmapping Extended window:
Note: I highlighted the two areas that I played around with while creating the effect that I wanted.
Instead of baking my Ambient Occlusion in post-processing, like I did in UDK, there is an option to compute it in Lightmapping Extended. I used the same Ambient Occlusion settings that I did in UDK. Then I turned up the Global Illumination, which simulates light bouncing between surfaces based on the lights within the scene. After baking the lights for my first pass, I got the picture below:
The scene was too dark, so I followed the same process as with the UDK scene: I took a screenshot of the scene and attempted to improve it in Photoshop, shown below:
It is similar to the UDK version, but I believed that the Unity version was different enough to create a separate Unity Photoshop version. I then implemented that color scheme into the scene, shown below:
Finally, I wanted to add some post-production effects. Instead of building the post-processing as a chain, I needed to add the effects to the Main Camera. First, I imported the “Image Effects” Unity package; Unity makes it easy to experiment with effects by including this package. For my scene, I picked Vignetting, Chromatic Aberration, Bloom, and Noise and Grain.
Vignetting adds a subtle shading around the edges of the screen to create a stronger focal point within the scene. Chromatic Aberration and Bloom act the same way in Unity as they did in UDK; see Part 2 (UDK) for more information. And finally, I wanted to add some grittiness to the scene, so I added a little Noise and Grain.
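Vignetting in particular is easy to picture as math: darken each pixel in proportion to its distance from the screen centre. This is a hypothetical sketch of the idea, not Unity's actual Image Effects implementation:

```python
def vignette(x, y, width, height, strength=0.3):
    """Return a brightness multiplier that falls off toward the screen edges."""
    nx = (x / width - 0.5) * 2.0   # -1 at the left edge, +1 at the right edge
    ny = (y / height - 0.5) * 2.0  # -1 at the top edge, +1 at the bottom edge
    d = min((nx * nx + ny * ny) ** 0.5, 1.0)  # distance from centre, capped at 1
    return 1.0 - strength * d * d  # full brightness at centre, dimmer at edges
```

A pixel at the exact centre keeps full brightness (multiplier 1.0), while a corner pixel is darkened by the full `strength`, which is what pulls the viewer's eye toward the middle of the frame.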
Note: Image Effects only work in Unity Pro.
The final product is shown in the visual below:
It looks much softer than the UDK version, but I was still very happy with the result.
Looking back, the scene changed as it moved from concept to implementation in each engine. The three pictures below show the full progression, from concept, to UDK, to Unity:
This whole process was an eye-opening experience; as my football coach would have said, I had a “Come to Jesus” moment. What I learned more than anything was that failing is only a stepping stone to learning the proper way to create assets. That statement may sound cliché, but I failed multiple times throughout this process, and I know I will never make those mistakes again. At first, I was very discouraged when I failed. After a few failures, I learned how important it is to fail.
I could not think of a better process to go through as my team enters the Final Project in the Game Design program. I am more confident knowing what problems I had in this process and how I will plan to avoid them moving forward.
I want to thank all the professors and fellow students that helped me through this process. I especially want to thank Victor Kam, Calder Archinuk, Jordan Lang, and Rupert Morris for teaching me the proper tools to create this scene. Thank you for reading the Experiences with Environments series and I hope you enjoyed it. If you have any questions, email me at GD35JamesW@vfs.com or JamesMithcellWatsonJr@gmail.com.
James Watson is a Game Design student at VFS