Behind the Code: Building This Workspace

At first, I just wanted a website to host my résumé — it seemed cooler and handier than a PDF. But as I added features like the chatbot, the home page, and this tech page, the site truly became a showcase of who I am. Most importantly, I learned a lot along the way!

1. State Management in an Astro SPA

The interactive creatures on the bottom right were a passion project, requiring days to perfect the eye-tracking algorithms and the fluid, liquid-body effects. However, whenever a user clicked a link, the page would hard-refresh, resetting the creatures and breaking the illusion. To fix this, I needed to convert the site into a Single Page Application (SPA).
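The exact eye-tracking code isn't shown here, but the core of a "pupil follows the cursor" effect can be sketched as a direction vector clamped to the eyeball's radius. This is my own illustrative reconstruction — the function name and signature are assumptions, not the site's actual code:

```javascript
// Illustrative sketch: given the eye's centre and the cursor position,
// compute a pupil offset that points toward the cursor but is clamped
// so the pupil never leaves the eyeball (maxRadius).
function eyeOffset(cursorX, cursorY, eyeX, eyeY, maxRadius) {
  const dx = cursorX - eyeX;
  const dy = cursorY - eyeY;
  const dist = Math.hypot(dx, dy);
  if (dist === 0) return { dx: 0, dy: 0 };
  // Scale the direction vector down to the eye's radius when the
  // cursor is far away; track it exactly when it is close.
  const r = Math.min(dist, maxRadius);
  return { dx: (dx / dist) * r, dy: (dy / dist) * r };
}
```

On each mousemove, an offset like this would be applied to the pupil element, e.g. via a CSS transform.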

Because Astro is fundamentally a Multi-Page Application (MPA) framework, implementing Astro’s <ViewTransitions /> introduced a wave of lifecycle bugs: my client-side JavaScript wasn’t firing properly on new pages, yet I also needed the creatures to never reload. I solved this by handling the two kinds of component logic differently. For the creatures, I used the transition:persist directive to tell Astro to carry the exact HTML node across page navigations seamlessly.

Orange_Creature.astro
<div id="orange-creature" transition:persist>
</div>

For standard page interactions (like modals or hover effects), the scripts needed to re-evaluate every time the user entered a new route. Since DOMContentLoaded only fires on the very first load in an Astro SPA — not on client-side navigations — I wrapped my logic in Astro’s custom astro:page-load lifecycle event to guarantee execution on every navigation.

InteractiveCards.astro
// This ensures the logic re-runs every time a new page is injected into the DOM
document.addEventListener('astro:page-load', () => {
  const interactiveCards = document.querySelectorAll('.interactive-card');
  interactiveCards.forEach(card => initHoverLogic(card));
});

2. The AI Chatbot: RAG & System Prompting

Going into this, I assumed building the AI backend would be the hardest part. Surprisingly, modern APIs make integration incredibly smooth. The real challenge was giving the bot a personality and grounding it in my actual data.

Instead of a generic chatbot, I utilized Retrieval-Augmented Generation (RAG) principles and rigorous System Prompting. I engineered a highly specific system prompt that fed the AI my resume data, project history, and strict behavioral guidelines, ensuring it spoke in my tone and only answered questions related to my professional background. On the frontend, I shifted my focus to the UI and UX. I built a custom typewriter effect to handle the streaming text chunks and integrated precise audio cues from ZAPSPLAT to give the chat a tactile, mechanical keyboard feel.
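The typewriter effect boils down to buffering incoming stream chunks and releasing them one character at a time, so rendering speed (and the key-click SFX) stays steady even when chunks arrive in bursts. A minimal sketch of that idea — my own reconstruction, with illustrative names, not the site's actual code:

```javascript
// Illustrative sketch: buffer streamed text chunks and release them
// one character at a time, so the UI can render at a steady pace
// (and trigger a key-click sound per character).
class TypewriterBuffer {
  constructor() {
    this.queue = "";
  }
  // Called whenever a new chunk arrives from the streaming API.
  push(chunk) {
    this.queue += chunk;
  }
  // Called on a timer by the renderer; returns one character, or null
  // when the buffer is (currently) drained.
  next() {
    if (this.queue.length === 0) return null;
    const ch = this.queue[0];
    this.queue = this.queue.slice(1);
    return ch;
  }
}
```

In the real page, something like a setInterval would call next() and append each character to the chat bubble, playing the keystroke sound as it goes.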

3. CI/CD, i18n, & AI Pair Programming

Managing English, Japanese, and Chinese translations manually was an architectural nightmare. I needed to modularize the codebase so content and code were completely decoupled, but refactoring the entire file tree manually would take days.
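Decoupling content from code for i18n usually means per-locale dictionaries plus a small lookup helper. A hedged sketch of that pattern — the locale keys and the t() helper are illustrative assumptions, not the site's actual module:

```javascript
// Illustrative sketch: one dictionary per locale, with a lookup helper
// that falls back to English so a missing translation never breaks a page.
const translations = {
  en: { greeting: "Welcome!" },
  ja: { greeting: "ようこそ！" },
  zh: { greeting: "欢迎！" },
};

function t(locale, key) {
  const dict = translations[locale] || translations.en;
  // Fall back to the English string when a key hasn't been translated yet.
  return dict[key] ?? translations.en[key];
}
```

With the strings modularized like this, components only ever reference keys, and adding a language is purely a content change.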

To solve this, I turned to a local AI CLI Agent. I wrote strict architectural blueprints and instructed the agent to execute sweeping, multi-file refactors directly in my terminal. Because giving an AI autonomous read/write access to my file system is incredibly risky, I engineered a secure Docker container to cage the AI. This sandbox ensured the agent could only touch my project files, which ultimately became my gateway into learning Docker-based Dev Environments.
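The core of the sandbox idea can be sketched as a single docker run invocation: bind-mount only the project directory, so the agent's read/write access ends at the project's edge. This is an illustrative sketch of the pattern, not the exact container setup — the base image and flags here are assumptions:

```shell
# Illustrative sketch of the sandbox, not the exact setup:
# the project directory is the ONLY host path visible inside the
# container, and unneeded Linux capabilities are dropped.
docker run --rm -it \
  -v "$(pwd)":/workspace \
  -w /workspace \
  --cap-drop ALL \
  node:20-slim bash
```

Inside the container, the agent can refactor anything under /workspace, but the rest of the host file system simply doesn't exist for it.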

4. Fluid Motion & UX

A modern web app needs to feel tactile. I wanted the scrolling to feel like gliding, and elements to reveal themselves organically. I implemented Lenis for mathematically smoothed scrolling and GSAP (GreenSock) for complex scroll-triggered animations. Since I had never used these libraries before, it was a crash course in reading documentation, managing timeline states, and preventing memory leaks on unmounted components during Astro view transitions.
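Preventing those leaks mostly came down to pairing every init with a teardown that runs before Astro swaps pages. A hedged sketch of the pattern — the registry below is my own illustration; the real callbacks would call things like lenis.destroy() or a ScrollTrigger's kill():

```javascript
// Illustrative sketch: collect teardown callbacks as animations are
// created, and flush them before Astro swaps the page out, so Lenis
// instances and GSAP ScrollTriggers don't pile up across navigations.
const cleanups = [];

function onCleanup(fn) {
  cleanups.push(fn);
}

function runCleanups() {
  while (cleanups.length) cleanups.pop()();
}

// In the browser, hook into Astro's pre-swap lifecycle event.
// (Guarded so the sketch also runs outside a DOM environment.)
if (typeof document !== "undefined") {
  document.addEventListener("astro:before-swap", runCleanups);
}
```

So right after creating an instance, you'd register its teardown, e.g. onCleanup(() => lenis.destroy()), and the swap handler takes care of the rest.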

5. Bridging Code and Canvas

To avoid copyright issues, I decided to create my own visual assets. I briefly considered generative AI, but opted to learn pixel art instead. I severely underestimated it! As a developer, shifting from pure logic to visual UI design was my steepest learning curve.

I used Aseprite to painstakingly design and animate the pixel art assets frame-by-frame, which taught me a lot about sprite sheets and CSS keyframe integration. Overall, translating my technical background into a visually cohesive UI—balancing flexbox constraints, typography, and negative space—was the most challenging, yet rewarding, part of this entire build.



Where did I get the sound effects (SFX)?

I got most of the sound effects from ZAPSPLAT!

This website is made possible because:

Astro.js
Google Gemini & CLI
Docker
GitHub
Aseprite
ZAPSPLAT
GSAP
Lenis