
ROLE

Project Lead and Lead Gameplay Programmer

DESCRIPTION

Echoes of a Silent Garden is a VR psychological horror experience with a haunting hand-drawn black-and-white art style. In the game, you play as Chrys, the daughter of a botanist whose obsession with reviving a dying world unleashes a malevolent force that consumes your sister. As the remnants of your once-thriving life crumble around you, you will need to navigate this eerie world to save your family before it’s too late.

YEAR

2024 - 2025

GENRE

Psychological Horror

ENGINE AND ADD-ONS

Unity Engine
FMOD

Hurricane VR (V1)

AutoHand (V2)

PLATFORM

Meta Quest 3

Trailer

Contributions

Team Lead

  • Led a cross-discipline team of 11 peers

  • Created the game design document

  • Led bi-daily meetings

  • Facilitated communication between the different teams

  • Communicated progress with team leads and advisors

  • Assisted in approving assets for implementation

  • Created the press kit and Steam assets


Main Gameplay Programmer

  • Created an advanced Goal-Oriented Action Planning (GOAP)-based AI system

    • Modular system

    • Hearing detection

    • Proximity detection

    • Sound detection

      • Flashlight awareness integration

  • Optimized the project for standalone VR

    • Multi-scene workflow

    • Object culling

  • Implemented sound using FMOD

  • Implemented the AutoHand VR physics system

Dev Logs

Click the button below to view the full Bluesky thread

GOAP AI System

  • Goal-Oriented Action Planning (GOAP)–based AI architecture

  • Multi-stage detection system progressing from unaware to fully engaged

  • Reactive behavior driven by perception rather than scripted states

  • Hearing-based awareness integrated into AI decision-making

Extended NavMesh destination selection with obstacle checks and path validation to prevent unreachable targets in tight spaces.

This system represents the primary gameplay AI framework for the project and was designed to support a reactive creature that responds dynamically to player behavior. I implemented a GOAP-based architecture to allow the AI to evaluate goals and select actions based on changing world state rather than relying on fixed state machines.
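As a rough illustration of that architecture, the core loop reduces to goals scoring themselves against a shared world state each tick. The sketch below is a minimal, hypothetical version; all type and member names are illustrative, not the project's actual code:

```csharp
using System.Collections.Generic;

// Hypothetical, simplified GOAP core: each goal scores itself against the
// current world state, and the planner executes the best-scoring goal each
// tick instead of following a fixed state machine.
public class WorldState
{
    public float DetectionLevel;      // 0..1, raised by the sensory systems
    public bool CanHearPlayer;
    public bool HasLastKnownPosition;
}

public abstract class CreatureGoal
{
    public abstract float Score(WorldState state); // higher = more urgent
    public abstract void Execute();
}

public class InvestigateGoal : CreatureGoal
{
    public override float Score(WorldState s) =>
        (s.CanHearPlayer || s.HasLastKnownPosition) ? 0.5f : 0f;
    public override void Execute() { /* move toward the last heard sound */ }
}

public class PursueGoal : CreatureGoal
{
    public override float Score(WorldState s) =>
        s.DetectionLevel > 0.8f ? s.DetectionLevel : 0f;
    public override void Execute() { /* chase the player directly */ }
}

public class GoapPlanner
{
    private readonly List<CreatureGoal> goals = new List<CreatureGoal>
    {
        new InvestigateGoal(),
        new PursueGoal()
    };

    public void Tick(WorldState state)
    {
        CreatureGoal best = null;
        float bestScore = 0f;
        foreach (CreatureGoal goal in goals)
        {
            float score = goal.Score(state);
            if (score > bestScore) { best = goal; bestScore = score; }
        }
        best?.Execute(); // no goal above zero means the creature stays idle
    }
}
```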
 

The creature’s awareness is governed by a five-stage detection model, ranging from fully unaware to actively engaged. Detection values increase based on player actions and proximity, with hearing serving as a primary sensory input. As detection escalates, the AI re-evaluates goals and behaviors in real time, allowing it to transition naturally between investigation, searching, and pursuit.
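A minimal sketch of how such a five-stage model could be expressed, assuming a single accumulating detection value with decay; the stage names and thresholds here are placeholders, not the shipped values:

```csharp
using UnityEngine;

// Illustrative five-stage detection model; thresholds are placeholders.
public enum DetectionStage { Unaware, Suspicious, Investigating, Searching, Engaged }

public class DetectionMeter : MonoBehaviour
{
    [SerializeField] private float decayPerSecond = 0.1f;
    private float detection; // 0..1, raised by sound and proximity events

    public DetectionStage Stage
    {
        get
        {
            if (detection < 0.2f) return DetectionStage.Unaware;
            if (detection < 0.4f) return DetectionStage.Suspicious;
            if (detection < 0.6f) return DetectionStage.Investigating;
            if (detection < 0.8f) return DetectionStage.Searching;
            return DetectionStage.Engaged;
        }
    }

    // Called by hearing/proximity sensors when the player makes noise.
    public void AddStimulus(float amount) =>
        detection = Mathf.Clamp01(detection + amount);

    private void Update() =>
        detection = Mathf.Max(0f, detection - decayPerSecond * Time.deltaTime);
}
```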

Player-Driven Detection & Risk–Reward Mechanics
Gameplay-driven AI detection via flashlight noise, overcharge events, and hearing-based perception.

This system ties player interaction directly into AI perception through sound. Cranking the flashlight increases visibility but also raises the player’s detection value, emitting sound events that the AI can hear. Overcharging the flashlight triggers a loud audio spike that immediately reveals the player’s location, creating a deliberate risk–reward decision during exploration.
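Sketched in the same terms as the detection example above, the flashlight's contribution might look like this; the component and method names are hypothetical:

```csharp
using UnityEngine;

// Hypothetical flashlight noise emitter: cranking feeds a small, continuous
// detection stimulus; overcharging fires a one-shot spike that reveals the player.
public class FlashlightNoise : MonoBehaviour
{
    [SerializeField] private DetectionMeter creatureDetection; // from the detection sketch above
    [SerializeField] private float crankStimulusPerSecond = 0.05f;
    [SerializeField] private float overchargeSpike = 1f;

    // Called every frame while the player is cranking the flashlight.
    public void OnCranking() =>
        creatureDetection.AddStimulus(crankStimulusPerSecond * Time.deltaTime);

    // Called once when the flashlight is overcharged.
    public void OnOvercharge() =>
        creatureDetection.AddStimulus(overchargeSpike); // loud spike: detection maxes out
}
```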

AI Navigation System

AI navigation refinement demonstrating wall avoidance and reliable target selection for a large character in constrained environments.

(Gameplay footage captured during development)

  • Facing correction using vector dot-product alignment checks

  • Wall avoidance via forward raycasts and directional steering

  • Validation of navigation targets using spatial overlap tests

  • Reliable AI movement for oversized characters in constrained spaces

Technical Overview
This navigation system builds on a NavMesh Agent foundation and addresses issues caused by character scale in tight level geometry. Because the AI character frequently clipped into walls or selected unreachable destinations, I implemented additional spatial checks to improve the reliability and readability of movement.

To ensure reliable movement, I corrected facing direction and wall collisions using forward raycasts combined with vector dot-product alignment checks between the character’s facing direction and target direction. Movement was only committed when directional alignment was valid, reducing erratic turning and wall-hugging behavior.
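In Unity terms, that check reduces to a dot product between the agent's forward vector and the direction to the target, combined with a forward raycast. The sketch below assumes illustrative threshold, distance, and layer-mask values:

```csharp
using UnityEngine;

// Sketch of the alignment + wall check described above; values are illustrative.
public class FacingCheck : MonoBehaviour
{
    [SerializeField] private float alignmentThreshold = 0.9f; // cosine of the allowed angle
    [SerializeField] private float wallCheckDistance = 2f;
    [SerializeField] private LayerMask obstacleMask;

    public bool CanCommitToMove(Vector3 targetPosition)
    {
        Vector3 toTarget = (targetPosition - transform.position).normalized;

        // A dot product near 1 means we are facing the target closely enough.
        bool aligned = Vector3.Dot(transform.forward, toTarget) >= alignmentThreshold;

        // Forward raycast: steer away instead of committing if a wall is close.
        bool blocked = Physics.Raycast(transform.position, transform.forward,
                                       wallCheckDistance, obstacleMask);

        return aligned && !blocked;
    }
}
```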

I also implemented validation checks for randomly generated navigation points using a sphere-overlap test and obstacle-layer filtering. This prevented targets from spawning beneath or inside level geometry, eliminating cases where the AI selected unreachable destinations.
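A hedged sketch of that validation step, assuming a standard NavMesh agent setup; the radius and mask values are placeholders:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of destination validation: reject candidate points that overlap
// geometry or fall off the NavMesh. Radii and masks are placeholders.
public static class NavTargetValidator
{
    public static bool TryValidate(Vector3 candidate, float agentRadius,
                                   LayerMask obstacleMask, out Vector3 result)
    {
        result = candidate;

        // Sphere-overlap test: a point inside or beneath geometry is rejected.
        if (Physics.CheckSphere(candidate, agentRadius, obstacleMask))
            return false;

        // Snap to the nearest NavMesh position so the agent can actually path there.
        if (!NavMesh.SamplePosition(candidate, out NavMeshHit hit, agentRadius * 2f,
                                    NavMesh.AllAreas))
            return false;

        result = hit.position;
        return true;
    }
}
```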

Optimization 

Video demonstration of occlusion culling reducing off-screen and obstructed geometry.
Primary texture atlas containing approximately 80% of in-game assets
  • Performance optimization for standalone VR hardware

  • Use of occlusion culling to reduce overdraw and unnecessary rendering

  • Texture atlasing and transparency reduction to improve GPU efficiency

  • Successful deployment to Quest 3 at a stable 90 FPS using Android API

Technical Overview
I was responsible for optimizing the project to meet the performance requirements of standalone VR on Quest 3. Because the target platform has strict GPU and CPU constraints, I focused on reducing rendering cost while preserving visual clarity and gameplay readability.

To minimize overdraw and unnecessary draw calls, I implemented occlusion culling to prevent off-screen and obstructed geometry from being rendered. As a technical lead, I advocated using texture atlases to reduce material count. I then integrated the atlased assets provided by the environment team into the material pipeline, verified rendering configurations, and resolved transparency and double-sided rendering issues to improve batching efficiency and overall performance.
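For reference, the double-sided issue mentioned here (and in the GDC case study below) comes down to the material cull mode. Assuming URP's Lit shader, where the `_Cull` property controls backface culling, an audit pass over the atlased materials might look like this sketch:

```csharp
using UnityEngine;

// Assumes URP's Lit shader, where _Cull = 2 (Back) enables backface culling.
// Leaving it at 0 (Off) renders both faces and doubles fill-rate cost on
// transparent foliage, which is expensive on standalone VR hardware.
public static class MaterialCullAudit
{
    public static void EnforceBackfaceCulling(Material[] atlasMaterials)
    {
        foreach (Material mat in atlasMaterials)
        {
            if (mat.HasProperty("_Cull") && mat.GetFloat("_Cull") != 2f)
            {
                Debug.LogWarning($"{mat.name} was double-sided; forcing backface culling.");
                mat.SetFloat("_Cull", 2f); // UnityEngine.Rendering.CullMode.Back
            }
        }
    }
}
```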

These changes resulted in a stable 90 FPS experience on Quest 3 and enabled successful deployment to a standalone headset using the Android API, meeting the VR comfort and performance requirements established by Meta.

Production Debugging Case Study – GDC Showcase Preparation

Context
Four days prior to a scheduled GDC showcase, the project experienced severe and unexpected performance degradation that made it unsuitable for public demonstration.

 

Problem
Frame rate dropped significantly on target hardware, despite previously stable performance. Initial profiling did not reveal an obvious single bottleneck, and the issue appeared inconsistently across scenes.

 

Approach
To isolate the problem, I duplicated the project into a clean Unity environment and rebuilt the content incrementally, repeatedly testing performance as systems and assets were reintroduced. This process was repeated multiple times to rule out project corruption and identify configuration-level issues rather than gameplay logic errors.

 

Root Cause & Resolution
The performance issue was ultimately traced to a texture atlas containing foliage materials with transparency, where double-sided rendering had been unintentionally enabled. Because this atlas represented a large portion of visible geometry, the configuration dramatically increased overdraw and fill-rate cost on standalone VR hardware. Disabling double-sided rendering immediately restored stable performance.

 

Outcome
The project was stabilized and successfully showcased at GDC as part of the SCAD booth. Performance met target frame-rate requirements, and the experience was demonstrated reliably throughout the event.

FMOD Integration

FMOD event timeline showing layered chase music controlled by gameplay-driven detection parameters.
  • Gameplay-driven audio integration using FMOD

  • Dynamic music layering based on AI detection state

  • Real-time parameter control driven by gameplay systems

  • Tension scaling inspired by asymmetric horror design patterns

Gameplay-driven FMOD parameter updates, smoothing detection and distance values to control dynamic music layers.

Technical Overview
I implemented a dynamic music system using FMOD that responds to enemy awareness and player detection levels. As a creature’s detection value increases, additional musical layers are introduced in real time, increasing tension and signaling threat escalation to the player.

 

The system was designed to remain tightly coupled to gameplay state rather than scripted triggers. Detection values are passed as parameters to FMOD, allowing the music to evolve smoothly as AI awareness changes, rather than switching abruptly between tracks. This approach helped reinforce player feedback and atmosphere without requiring additional UI elements.
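A minimal sketch of that parameter flow using the FMOD for Unity integration; the event reference and parameter name are placeholders:

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Smooths gameplay detection values before passing them to FMOD, so music
// layers fade in rather than snapping. Event/parameter names are placeholders.
public class ChaseMusicDriver : MonoBehaviour
{
    [SerializeField] private EventReference chaseMusicEvent;
    [SerializeField] private float smoothingSpeed = 1.5f;

    private EventInstance music;
    private float smoothedDetection;

    private void Start()
    {
        music = RuntimeManager.CreateInstance(chaseMusicEvent);
        music.start();
    }

    // Called by the AI with the raw detection value each frame.
    public void UpdateDetection(float rawDetection)
    {
        // Move gradually toward the gameplay value instead of jumping to it.
        smoothedDetection = Mathf.MoveTowards(smoothedDetection, rawDetection,
                                              smoothingSpeed * Time.deltaTime);
        music.setParameterByName("Detection", smoothedDetection);
    }

    private void OnDestroy()
    {
        music.stop(STOP_MODE.ALLOWFADEOUT);
        music.release();
    }
}
```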

Showcase Recognition and Industry Feedback

GDC 2025


Showcase Experience
The demo was presented continuously to attendees, including industry professionals, developers, and students. The team focused on delivering a stable, polished vertical slice that clearly communicated core mechanics, atmosphere, and pacing within a short play session.

 

Industry Exposure
Through existing connections with Unity, the team was invited to attend the Unity Education Mixer, providing opportunities to discuss the project with developers and educators from across the industry. Player reactions during the demo helped validate the effectiveness of the game’s tension and moment-to-moment gameplay.

The Rookies 


Recognition
The project was selected as a finalist in the Immersive category for The Rookies, recognizing the quality of its interactive design, technical execution, and overall presentation among international student submissions.

Video Renders 

Screenshots 
