Environment Simulation Sandbox
(2023–25) This project is an extension of another project: Art Informed Game Design Framework. After being in the Department of Art for two years, I switched into the Department of Design because my project required a design-focused approach to problem solving. I became more focused on the end user and less on creating games purely as artistic expression. To give a general overview of the project, I’m reimagining how environments might be built, represented, and centered in games through systemic design.
My life and academic roots have always been tied to the environment. Whilst studying architecture, I managed to fit a lightwell with a garden in it into each of my designs. They were always at the center of my projects. I used architecture to contain plants. Now, I use games as the frame for plants. Beyond this, I found myself gardening in small spaces (apartment living) and finding joy where plants squeezed through the urban fabric to live their lives.
With games and environments as my new medium, I wondered what translation or mirroring there may be between screens and the real world. In the back of my head, I hope that bringing greater focus to plants and environments in games will act as an educational experience for people interfacing with real environments.
Taking inspiration from Pokémon Gold/Silver/Crystal’s ground tiles, which were 16 pixels by 16 pixels, I created an algorithm using p5.js that generated an array of ground tiles. I thought of each tile as a set of data, containing information about the ground’s nutrients, moisture, and temperature.
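The core idea, tiles as data records rather than just images, can be sketched in plain JavaScript. The field names and value ranges below are illustrative assumptions, not the original p5.js code:

```javascript
// Generate a grid of ground tiles where each tile carries
// environmental data instead of only a sprite. Field names and
// ranges are assumptions for illustration.
function generateTileGrid(cols, rows, rng = Math.random) {
  const grid = [];
  for (let y = 0; y < rows; y++) {
    const row = [];
    for (let x = 0; x < cols; x++) {
      row.push({
        x, y,
        nutrients: rng(),              // 0..1, soil fertility
        moisture: rng(),               // 0..1, water content
        temperature: -10 + rng() * 45  // degrees C, arbitrary band
      });
    }
    grid.push(row);
  }
  return grid;
}
```

In a p5.js sketch, a draw loop would then read these values to pick or tint a 16×16 tile graphic for each cell.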
I’d curate the tiles to create maps that symbolized architectural spaces where plants grew or things moved. I made this into a daily practice for a short period. I created tile arrays in p5.js, curated the ones I wanted, and designed maps in Aseprite. I could make a map or two in a few hours, but I wanted to optimize the process.
In the summer of 2023, I experimented with ChatGPT for writing code. With its help, I was able to cut out the intermediary software (p5.js and Aseprite) by building the program directly in Unity. I took a break from making maps and familiarized myself with using ChatGPT as a collaborator, in addition to experimenting with new tile types.
Following this experience of making new tile types, I wanted to use software that gave me more control over the creative process than ChatGPT did. For a school project, I was asked to visualize a research space. After assembling a bounty of academic papers in Zotero, I exported the library and imported it into Houdini. I then created my own node network to turn each paper into a 2D pixelated rock. Information about each paper’s date, type, and (personal) research interest influenced the rock’s shape, size, outline, and amount of lichen.
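The data-to-form mapping at the heart of that node network can be sketched outside Houdini. This is a hypothetical parameter mapping in the same spirit, not the actual network; all field names and formulas are assumptions:

```javascript
// Map a paper's metadata to rock parameters. Illustrative only:
// the real version was a Houdini node network with its own rules.
function paperToRock(paper, currentYear = 2024) {
  const age = Math.max(0, currentYear - paper.year);
  return {
    size: 8 + Math.min(age, 20),               // older papers -> bigger rocks
    outline: paper.type === 'journal' ? 2 : 1, // outline weight by paper type
    lichen: Math.round(paper.interest * 10)    // stronger interest -> more lichen
  };
}
```

Each rock then becomes a visual index into the research space: you can read a paper’s age and your relationship to it at a glance.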
After making my 2D rocks using Zotero data, I made my first interactive game with ChatGPT as a collaborator. I specifically used ChatGPT to help me set up the game manager and write the scripts that used Shapes, a Unity plugin that uses GPU-based rendering. This process helped me learn to use ChatGPT as a programmer while I thought through systems of data influence.
At the same time as working on these prototypes, I began making arguments for methods of representing environments in media. My graduate education became focused on this as a thesis topic. I had previously made art about it, but I hadn’t yet unpacked systems of representing environments through writing.
Running with Houdini as a tool for generating 2D pixelated graphics, I created a new node network capable of generating a flower. It animated the flower through four different stages: sapling -> stem -> bud growth -> bloom. Most importantly, this prototype gave the plant a sense of life through the inclusion of 200+ frames.
Immersing myself in these studies, papers, and discourse, I created a handbook featuring case studies, popular media, and academic papers related to the subject of environment-centered gameplay. In it, I created conjectures using my own generative tools to imagine what this play could look like through renders and audio sketches.
At the start of the 2024 school semester, I solidified my idea of working on a simulation sandbox game as part of my thesis project. With all the neat prototypes I was making, I had always imagined tying them to earth science data. After taking an earth sciences course in the summer for fun, I began applying this knowledge with assistance from the AI tool Claude. We collaborated to make the system diagrams above based on real-life earth systems. In addition, I started studying worlding algorithms, history, and principles through Dwarf Fortress and Minecraft.
Returning to my roots of map making, I collaged what a generated world might look like using content I’ve generated. This helped me answer questions about intimacy as well as scale. I realized at this point that I wanted to work with higher-definition graphics, because pixelated graphics had begun to feel factory-made. I also chose the scale based on what felt ‘garden-like’ rather than ‘god-mode.’
Lastly, I opted to separate the world into rooms. I use the room as an architectural frame that the player can consider spatially and gardenly. Only what is visible is rendered on screen, while the rest of the world’s data is simulated behind the curtain.
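The simulate-everything, render-only-the-visible-room split can be sketched as a single update tick. This is a minimal sketch under assumed names (`rooms`, `activeRoomId`, a stand-in simulation step), not the game's actual code:

```javascript
// Every room's data keeps evolving each tick, but only the room
// the player occupies is queued for rendering. Names are assumptions.
function tick(world, activeRoomId) {
  const rendered = [];
  for (const room of world.rooms) {
    simulateRoom(room);        // off-screen rooms still simulate
    if (room.id === activeRoomId) {
      rendered.push(room.id);  // only the visible room is drawn
    }
  }
  return rendered;
}

function simulateRoom(room) {
  room.age = (room.age || 0) + 1; // stand-in for the real simulation step
}
```

Decoupling simulation from rendering this way keeps the garden alive while the player is elsewhere, without paying the draw cost for the whole world.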
Moving Forward
Find more information and progress reports on my Substack.