How I Built a Next.js Portfolio & Markdown Engine with AI in 60 Minutes
It’s 2026, and the role of the modern engineer is shifting from hand-crafting boilerplates to becoming an architectural orchestrator.
As a Platform Engineer used to driving large-scale migrations—like moving 7+ EKS clusters to Datadog or automating CI/CD pipelines across 60+ repositories—I wanted my personal site to accurately reflect my day-to-day context: speed, extreme polish, and deep infrastructural competence.
But building a multi-page React application with a bespoke markdown parser and complex state machines traditionally takes days. I decided to see what happens when you treat an LLM as an autonomous orchestration layer.
Here is how I built this entire site from scratch in under 60 minutes.
The Vision
I wanted a digital environment that didn't just list technologies, but actively showcased engineering velocity. The goal was to build a site that acted as a literal "terminal" for hiring managers assessing top-tier talent.
- State-Machine Terminal: A fully interactive, POSIX-style shell built natively in React.
- Component-Driven Static Generation: Next.js 15 App Router serving statically generated, edge-delivered pages.
- Autonomous Ingestion: A Markdown (`gray-matter`) engine so I can write technical blogs via my local `/content` vault without ever touching TSX.
Architectural Orchestration
Rather than asking the AI to "build me a website" (which results in generic templates), we paired on distinct system layers with extreme precision:
- The Core Routing Engine: First, we initialized Next.js 15 with the App Router. We modularized the layout, stripped out the default global styles, and introduced a grid-pattern background animation driven by CSS keyframes and Tailwind primitives (the Tailwind sketch after this list shows the idea).
- The Interactive Terminal Engine: I tasked the agent with building `<TerminalPrompt />`. It spun up a stateful React component with a fully mocked OS boot sequence. We then implemented an `ls` and `cat` parser that synchronizes directly with the actual portfolio data structures, mapped Arrow-Up/Arrow-Down key events to cycle through the command history index exactly like a real `zsh` environment, and even intercepted the `contact` command to fire native Next.js client-side routing (`useRouter`) straight to the contact portal (a rough dispatch sketch also follows the list).
- The Static Markdown Compiler: The most involved phase was instructing the agent to build `src/lib/blog.ts`. It uses Node's `fs` and `gray-matter` to slice markdown files at build time, computes reading durations, extracts frontmatter (title, summary, date), and auto-generates the static page parameters for `/blog/[slug]` during the Next.js compilation step (sketched after the list as well).
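For the background treatment in the first item, the Tailwind side might look something like this. It is a minimal sketch; the token names (`grid-pattern`, `grid-pan`) and the timing values are my own assumptions, not the site's actual config.

```ts
// tailwind.config.ts — hypothetical tokens for an animated grid background
import type { Config } from "tailwindcss";

const config: Config = {
  content: ["./src/**/*.{ts,tsx}"],
  theme: {
    extend: {
      // Faint grid drawn with two crossing 1px linear gradients
      backgroundImage: {
        "grid-pattern":
          "linear-gradient(to right, rgba(255,255,255,0.06) 1px, transparent 1px), linear-gradient(to bottom, rgba(255,255,255,0.06) 1px, transparent 1px)",
      },
      // Slow diagonal pan so the grid drifts behind the content
      keyframes: {
        "grid-pan": {
          "0%": { backgroundPosition: "0 0" },
          "100%": { backgroundPosition: "40px 40px" },
        },
      },
      animation: {
        "grid-pan": "grid-pan 20s linear infinite",
      },
    },
  },
  plugins: [],
};

export default config;
```

The layout then applies something like `className="bg-grid-pattern [background-size:40px_40px] animate-grid-pan"` to a full-screen wrapper behind the page content.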
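For the terminal item, the command dispatch can be sketched roughly as follows. The `useCommandRunner` hook, the command table, and the file names are illustrative assumptions rather than the real `<TerminalPrompt />` internals.

```ts
// A hypothetical slice of the terminal: parse a line, dispatch ls/cat/contact.
"use client";

import { useRouter } from "next/navigation";

// Virtual filesystem backed by the same data the GUI pages render (assumed shape).
const vfs: Record<string, string> = {
  "projects.md": "# Projects\n- GitOps platform ...",
  "experience.log": "Platform Engineer — EKS, Datadog, CI/CD ...",
};

export function useCommandRunner() {
  const router = useRouter();

  // Returns the lines to append to the terminal's output buffer.
  return function run(input: string): string[] {
    const [cmd, ...args] = input.trim().split(/\s+/);

    switch (cmd) {
      case "ls":
        return [Object.keys(vfs).join("  ")];
      case "cat": {
        const file = args[0];
        return file && vfs[file]
          ? vfs[file].split("\n")
          : [`cat: ${file ?? ""}: No such file`];
      }
      case "contact":
        // Hand off to Next.js client-side routing instead of printing output.
        router.push("/contact");
        return [];
      case "":
        return [];
      default:
        return [`zsh: command not found: ${cmd}`];
    }
  };
}
```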
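And for the markdown compiler, a build-time loader along these lines would do the job. `getAllPosts`, the `content/` directory, and the ~200-words-per-minute reading-time heuristic are assumptions for illustration, not the actual `src/lib/blog.ts`.

```ts
// src/lib/blog.ts — a sketch of a build-time markdown loader (file layout assumed).
import fs from "node:fs";
import path from "node:path";
import matter from "gray-matter";

const CONTENT_DIR = path.join(process.cwd(), "content");

export interface PostMeta {
  slug: string;
  title: string;
  summary: string;
  date: string;
  readingMinutes: number;
}

export function getAllPosts(): PostMeta[] {
  return fs
    .readdirSync(CONTENT_DIR)
    .filter((file) => file.endsWith(".md"))
    .map((file) => {
      const raw = fs.readFileSync(path.join(CONTENT_DIR, file), "utf8");
      const { data, content } = matter(raw); // frontmatter + body
      return {
        slug: file.replace(/\.md$/, ""),
        title: data.title ?? file,
        summary: data.summary ?? "",
        date: data.date ?? "",
        // Rough reading time at ~200 words per minute.
        readingMinutes: Math.max(1, Math.round(content.split(/\s+/).length / 200)),
      };
    })
    .sort((a, b) => (a.date < b.date ? 1 : -1));
}

// Feeds generateStaticParams() for /blog/[slug] at build time.
export function getAllSlugs(): { slug: string }[] {
  return getAllPosts().map(({ slug }) => ({ slug }));
}
```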
The Execution and Refactor Phase
While AI agents are excellent at generating working code, they require strict architectural constraints to achieve true production polish. I routinely audited and refactored the generated code to ensure enterprise-grade depth:
- State-Syncing the Virtual Filesystem: Rather than hardcoding the terminal output, we mapped the terminal's virtual filesystem onto the exact data the React components render. Running `cat experience.log` in the CLI operates on the same underlying data structure as the GUI `/experience` page.
- Micro-Animations & Physics: I split the static blog feed into a dedicated Client Component (`BlogList.tsx`). By separating the SSR payload fetchers from the client layer, we could wrap the DOM nodes in a staggered `framer-motion` sequence for cinematic, native-app entrance physics (see the `BlogList` sketch after this list).
- Event-Driven POSIX Mechanics: Standard form inputs aren't shell environments. We suppressed the default browser behavior (stripping focus rings and spellcheck), wired event listeners to hijack the `ArrowUp` and `ArrowDown` keystrokes, and managed a custom history index to rewrite the input string on the fly, mirroring standard `zsh` behavior (a minimal history hook is also sketched below).
- Deep Component Disaggregation: The architecture began as a single-page monolith. We broke it into a modular Next.js App Router topology (`/projects`, `/experience`, `/contact`), extracted a persistent `<Navbar />` layout wrapper, and bound the terminal's execution context to native `useRouter()` navigation.
- Hardening Authentic Artifacts: AI defaults to generic, hallucinated templates. We preempted this by injecting real HashiCorp Vault GitOps YAML configurations straight from my production clusters, replacing generic placeholders with hard quantitative metrics.
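The staggered entrance from the Micro-Animations item can be sketched like so, assuming a simple post shape and timing values chosen purely for illustration.

```tsx
// BlogList.tsx — a sketch of the staggered client-side entrance (post shape assumed).
"use client";

import { motion, type Variants } from "framer-motion";
import Link from "next/link";

interface Post {
  slug: string;
  title: string;
  summary: string;
}

// Parent variant staggers its children; each child fades and slides in.
const list: Variants = {
  hidden: {},
  show: { transition: { staggerChildren: 0.08 } },
};

const item: Variants = {
  hidden: { opacity: 0, y: 12 },
  show: { opacity: 1, y: 0, transition: { duration: 0.3, ease: "easeOut" } },
};

// A server component (e.g. the blog page) fetches the posts at build time and
// passes them down, keeping the SSR payload separate from the animation layer.
export function BlogList({ posts }: { posts: Post[] }) {
  return (
    <motion.ul variants={list} initial="hidden" animate="show">
      {posts.map((post) => (
        <motion.li key={post.slug} variants={item}>
          <Link href={`/blog/${post.slug}`}>{post.title}</Link>
          <p>{post.summary}</p>
        </motion.li>
      ))}
    </motion.ul>
  );
}
```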
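As a sketch of the history mechanics, a minimal hook might look like the following; the state names (`cursor`, `commit`) are assumptions, not the production component.

```ts
// A hypothetical history hook for the terminal input (not the real internals).
"use client";

import { useRef, useState, type KeyboardEvent } from "react";

export function useCommandHistory() {
  const [value, setValue] = useState("");
  const history = useRef<string[]>([]);
  const cursor = useRef(-1); // -1 means "not currently browsing history"

  function onKeyDown(e: KeyboardEvent<HTMLInputElement>) {
    if (e.key === "ArrowUp") {
      e.preventDefault(); // keep the caret from jumping to the start of the line
      if (history.current.length === 0) return;
      cursor.current =
        cursor.current === -1
          ? history.current.length - 1
          : Math.max(0, cursor.current - 1);
      setValue(history.current[cursor.current]);
    } else if (e.key === "ArrowDown") {
      e.preventDefault();
      if (cursor.current === -1) return;
      cursor.current += 1;
      if (cursor.current >= history.current.length) {
        cursor.current = -1;
        setValue(""); // walked past the newest entry: back to an empty prompt
      } else {
        setValue(history.current[cursor.current]);
      }
    }
  }

  // Call when the user presses Enter with the current line.
  function commit(line: string) {
    if (line.trim()) history.current.push(line);
    cursor.current = -1;
    setValue("");
  }

  return { value, setValue, onKeyDown, commit };
}
```

The component would wire this to an `<input>` with `spellCheck={false}` and a focus-ring-free className, in line with the POSIX item above.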
The Bottom Line
Writing code is no longer the bottleneck; the bottleneck is conceptualizing the final architecture and directing the system design. By operating as the architectural orchestrator while leveraging AI for the raw execution layer, I built an incredibly polished, edge-delivered GitOps portfolio in the time it traditionally takes to configure Webpack.
When you can automate the execution, you get to focus entirely on the engineering impact. Feel free to `cat projects.md` or cycle through your command history on the Home terminal to look around!