Blog

  • Hosting a Private Killing Floor 2 Server for My Kids (and My Sanity)

    I have three kids. Two of them are boys, and like a lot of kids their age, they love video games.

    What’s been especially fun for me is that I actually enjoy playing video games with them. It’s one of those rare overlaps where everyone genuinely wants to be there.

    One game that keeps coming back into our rotation is Killing Floor 2. It’s fast, cooperative, and easy to pick up for a quick session after dinner or before bedtime.

    Unfortunately, playing online came with a big downside.

    The Ping Problem

    Some nights, the game was borderline unplayable:

    • Enemies lagging or teleporting
    • Shots not registering
    • Everything feeling delayed and sluggish

    After checking where we were connecting, it became clear that we were often landing on servers halfway across the world—sometimes in Europe or Russia. There’s nothing wrong with global gaming communities, but high ping absolutely kills the experience.

    So I asked myself a simple question:

    Why not host the server myself?

    Why a Private Server Works So Well

    Running a private server locally solved a lot of problems at once:

    • Extremely low ping
    • No random players joining
    • Full control over difficulty and maps
    • A predictable, kid-friendly environment

    Even better, it turned out to be far easier than I expected.

    A Quick Note on Killing Floor 3

    Yes, Killing Floor 3 exists.

    I do own a copy, but not multiple licenses for the whole family.

    More importantly, Killing Floor 2 is still perfect for what we want:

    • Mindless co-op fun
    • Easy drop-in gameplay
    • No heavy progression or long sessions

    If you’re a parent looking for something quick and fun to play with your kids, KF2 still delivers.


    What You Need

    You don’t need anything fancy:

    • An Ubuntu server (VM, bare metal, or cloud)
    • Or a Docker-capable system
    • About 40 GB of disk space
    • Roughly 20–30 minutes

    I ran this on an Ubuntu VM, but the same idea works in Docker or on a home server.


    Installing the Server (The Important Commands)

    1. Install Dependencies

    On Ubuntu, start by installing the required 32-bit libraries and tools:

    sudo apt update
    sudo apt install -y lib32gcc-s1 lib32stdc++6 curl wget tmux

    2. Download SteamCMD

    SteamCMD is Valve’s command-line tool for installing dedicated servers.

    mkdir ~/steamcmd
    cd ~/steamcmd
    wget https://steamcdn-a.akamaihd.net/client/installer/steamcmd_linux.tar.gz
    tar -xvzf steamcmd_linux.tar.gz

    You should now see steamcmd.sh in the directory.

    3. Install the Killing Floor 2 Server

    This is the key command. It:

    • Forces Linux binaries
    • Logs in anonymously
    • Downloads the KF2 server files

    ./steamcmd.sh \
    +@sSteamCmdForcePlatformType linux \
    +force_install_dir ~/kf2server \
    +login anonymous \
    +app_update 232130 validate \
    +quit

    This downloads about 30–40 GB, so expect it to take a while depending on your connection. (Re-running the same command later is also how you update the server.)

    When it finishes, you should see folders like:


    Binaries  Engine  KFGame

    Running a Local-Only Server

    1. Find Your Local IP

    ip addr show | grep inet

    Look for your LAN address in the output; mine was 192.168.1.214.

    2. Start the Server

    From the server binaries directory (yes, the Linux server binary lives under Binaries/Win64):

    cd ~/kf2server/Binaries/Win64

    ./KFGameSteamServer.bin.x86_64 \
    KF-BurningParis?Game=KFGameContent.KFGameInfo_Survival \
    -Port=7777 \
    -QueryPort=27015 \
    -Multihome=192.168.1.214 \
    -NoSteamServer

    What this does:

    • -Multihome binds the server to your LAN IP only
    • -NoSteamServer prevents public internet advertising
    • The server stays entirely local

    The first launch takes a minute or two while everything initializes.
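
    Since tmux was in the dependency list, here's a minimal way to keep the server alive after you disconnect from SSH. The paths and flags match the command above; the session name "kf2" is arbitrary:

    # Launch the server inside a detached tmux session
    tmux new-session -d -s kf2 'cd ~/kf2server/Binaries/Win64 && ./KFGameSteamServer.bin.x86_64 "KF-BurningParis?Game=KFGameContent.KFGameInfo_Survival" -Port=7777 -QueryPort=27015 -Multihome=192.168.1.214 -NoSteamServer'

    # Reattach later to check logs or stop the server
    tmux attach -t kf2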


    Connecting to the Server

    On the gaming PC:

    1. Launch Killing Floor 2
    2. Open the console (~)
    3. Run:


    open 192.168.1.214:7777

    That’s it. Instant connection. No lag.


    Why This Was Worth It

    The difference was night and day:

    • Near-zero latency
    • Smooth combat
    • No frustration
    • Just fun

    Most importantly, it turned gaming nights back into something we all looked forward to instead of something we had to tolerate.

    Final Thoughts

    This wasn’t about building the “perfect” server or learning new tech for its own sake.

    It was about removing friction and creating better shared experiences with my kids.

    If you already have Ubuntu or Docker running somewhere, you can absolutely get this running in under half an hour—and the payoff is immediate.

  • I’ve Probably Wasted Hundreds of Dollars on My Home Lab — Learn From My Mistakes

    If you’re running a home lab, especially something like an Unraid server, let me save you some money — because I’ve probably burned hundreds (maybe thousands) of dollars learning lessons the hard way.

    This post isn’t about enterprise best practices or perfect setups. This is about years of mistakes, bad assumptions, and overreacting when something went wrong. If you’re building or running a home lab, hopefully this helps you avoid doing the same.


    My Setup (and My Bad Habits)

    I’ve been running an Unraid server at home for years. It hosts:

    • Multiple Docker containers
    • Several VMs
    • Home automation
    • Random development projects
    • Side projects that come and go

    Like many homelab users, I built it up over time — adding drives when I needed space, tweaking things when they broke, and generally treating it like a mini data center… even though it’s absolutely not one.

    Over the years, I’ve dealt with:

    • SSDs suddenly going read-only
    • Hard drives randomly “disappearing”
    • Drives getting disabled by Unraid
    • Occasional data loss (the painful kind)

    And my default reaction for a long time?

    “Welp, drive failed. Time to buy a new one.”

    That mindset cost me a lot of money.


    The Drive Replacement Trap

    Most of my array is made up of 6–10 TB drives, with 10 TB being the sweet spot for years. Back when you could find 10 TB drives for $60–$70 (especially secondhand), replacing one didn’t feel like a big deal.

    But here’s the mistake:

    👉 Not every “failed” drive is actually failed.

    Unraid is conservative by design. If it detects write errors, timeouts, or weird behavior, it may:

    • Disable the drive
    • Mark it as read-only
    • Drop it from the array

    That doesn’t automatically mean the drive is dead.

    What I should have been doing (and now do), with commands sketched after this list:

    1. Put Unraid into Maintenance Mode
    2. Run a SMART check
    3. Run an XFS repair (if applicable)
    4. If no real errors appear:
      • Remove the drive
      • Re-seat or reconnect it
      • Add it back and let Unraid rebuild
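
    A minimal version of steps 2–3 from the shell, assuming the suspect drive shows up as /dev/sdX and the array slot as /dev/md1 (check your actual device names in the Unraid UI first):

    # SMART health overview and logged errors
    smartctl -a /dev/sdX

    # Kick off a short self-test, then re-run the line above to read the results
    smartctl -t short /dev/sdX

    # Dry-run XFS check against the array slot while in Maintenance Mode (-n = report only)
    xfs_repair -n /dev/md1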

    Many times, the drive is perfectly fine.

    Instead, I was panic-buying replacements.


    Drives Fail — That Doesn’t Mean You Failed

    Here’s something I’ve finally accepted:

    • Multiple containers running
    • VMs writing constantly
    • Cache activity
    • Background parity checks

    Stuff breaks sometimes.
    A transient write failure doesn’t mean your entire system is doomed.

    If the drive passes SMART and filesystem checks, put it back. Worst case, it fails again — and then you replace it.

    Save the $100–$200 when you can.


    Parity: One vs Two (Learned the Hard Way)

    I also learned this lesson the painful way:

    👉 One parity drive is fine… until it isn’t.

    I’ve lost data because:

    • One drive failed
    • I tried rebuilding
    • Another drive hiccupped during parity
    • Game over

    If you can afford it:

    • Two parity drives are worth it
    • Especially if you’re using secondhand disks

    Yes, it costs more up front.
    But it’s cheaper than replacing drives and losing data.

    Rule of thumb:

    • Your parity drive(s) must be at least as large as your largest data drive
    • If most of your array is 10 TB → parity should be 10 TB

    You Probably Don’t Need As Much Storage As You Think

    This one hurt my pride a little.

    I have 40–50 TB of storage.

    Do I actually need that much?

    No. Not even close.

    Most people:

    • Aren’t storing massive video libraries
    • Aren’t running long-term archival projects
    • Aren’t hosting production services

    A lot of my space is filled with:

    • Old laptop backups
    • Forgotten projects
    • “Just in case” data

    For most people:

    • A handful of 10 TB drives is plenty
    • Even with home projects and media

    Storage creep is real — and expensive.


    What Actually Matters: Family Data

    Here’s the real truth:

    The most important data on my server is family photos and videos.

    Not:

    • Side projects
    • VMs
    • Docker containers
    • Experimental apps

    Those can be rebuilt.

    Photos from Christmas, birthdays, kids growing up?
    Those can’t.

    This changed how I think about backups.

    What I now prioritize backing up:

    • Family photos & videos
    • Phone media moved to Unraid
    • App data (because reconfiguring sucks)

    I even keep multiple Unraid servers and manually copy the important stuff between them. It’s not fancy — but it works.


    App Data, Plex, and the “Too Many Files” Problem

    Backing up app data is great — but some apps (like Plex) generate tons of small files:

    • Thumbnails
    • Metadata
    • Optimized images
    • Indexes

    For large photo libraries (tens of thousands of files), this can explode in size and I/O.

    Takeaway:

    • Back up app data
    • But be mindful of how much junk some apps generate (quick check after this list)
    • Sometimes restoring from scratch is cleaner
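
    A quick way to see which apps are the hoarders, straight from the Unraid console (standard appdata share path; adjust if yours differs):

    # Disk usage per container's appdata, biggest first
    du -sh /mnt/user/appdata/* | sort -rh | head -20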

    Cache Drives, VMs, and a Gotcha I Learned Late

    Unraid does something smart — but it can surprise you:

    👉 If a VM is larger than your cache, it will live on the array.

    That matters because:

    • Array = parity overhead + slower writes
    • Cache = faster, designed for frequent writes

    I used to give VMs 500 GB each. Totally unnecessary.

    Now my VMs are typically:

    • 100–200 GB
    • Smaller
    • Faster
    • Easier to manage

    No noticeable downside.


    Final Takeaways (The TL;DR)

    If I had to summarize years of trial-and-error:

    • Don’t panic when a drive disappears
    • Run SMART and filesystem checks first
    • Re-add drives before replacing them
    • Two parity drives > one (if you can afford it)
    • Most people don’t need massive storage
    • Back up what actually matters (family data)
    • Smaller VMs are usually better
    • Home labs are not production environments — and that’s okay

    Hard drives are more expensive now than they used to be. That makes learning these lessons before replacing hardware even more important.

    If this post saves even one person from impulse-buying a drive they didn’t need — then my wasted money wasn’t totally wasted after all.

  • Anime Convention Reflections (Anime Frontier 2025)

    Context

    I have three kids: a 2-year-old, a 7-year-old, and a 12-year-old. I’ve been to anime conventions before—specifically Momocon, but that was 10–15 years ago, and the experience today feels very different. This was my first time attending an anime convention as a parent, which changes how I evaluate the experience.


    Overall Experience

    Going to an anime convention with kids was interesting and mostly positive. There was a lot to do and see, and overall it felt well-organized and family-friendly, even if not everything fully aligned with my personal interests.


    What I Liked

    • Early show premieres (including shows not yet released on Crunchyroll)
    • Dub versions available to watch
    • Panels and discussions with creators and industry folks
    • Art booths and creative activities
    • Game areas, including access to Japanese games
    • Seeing people dress up and fully embrace the atmosphere
    • Kid-friendly spaces, including:
      • A play area
      • Drawing with crayons on the floor
    • Helpful and kind staff
    • Good accessibility (elevators, manageable crowd flow)
    • Parking was surprisingly easy, thanks to app-based parking systems (even if the walk was a bit long for kids)

    How Things Have Changed Over Time

    Compared to 10–15 years ago:

    • Access to Japanese games is no longer rare—you can find them locally or online now
    • Anime premieres are often released online quickly, reducing the “you must be here” feeling
    • Nostalgia plays a smaller role now that content is more accessible
    • Events that once felt exclusive now feel more like early access rather than unique access

    Mixed Feelings / Observations

    • There was a heavy focus on voice acting:
      • Voice actors attending
      • Voice acting panels
      • A voice acting school with a booth
        This isn’t bad, but it stood out. When I was younger, conventions felt more centered on watching anime and playing games, with less emphasis on voice acting as a career path.
    • Game tournaments were less interesting to me than they used to be:
      • They feel more like watching highly practiced competitors
      • My interest has faded, possibly because I have kids now
    • Watching full shows is harder with a 2-year-old, which creates practical constraints

    Demographics & Social Observations

    I found myself noticing who was there:

    • Many attendees seemed to be single, dating, or without kids
    • Fewer families with young children
    • As someone in my 30s with three kids, it highlighted how life stage changes how you experience conventions
    • Not a negative—just an interesting contrast

    Personal Preferences

    • I care less about arts and crafts
    • I’d rather watch shows or experience premieres
    • I still think I prefer game conventions, especially for:
      • Experimental demos
      • Hands-on gameplay experiences

    Final Thoughts

    Overall, the experience was good.
    Not everything hit the same way it would have 10–15 years ago, but that’s expected.

    There’s a high probability we’ll go again next year, mainly to:

    • See what’s changed
    • Re-evaluate as the kids get older
    • Experience it again with adjusted expectations

    It worked well enough, and I’m glad we went.

  • GenAI – Force Multiplier

    Whew — got physical today. Jogging, working out, pushing myself. Anime convention coming up, so I'm trying to keep the momentum going.

    From a learning and professional perspective, I’m noticing something interesting: the challenges I run into on the “professional” engineering side are the exact same kinds of challenges I run into on the home-lab or development side. Different environments, similar problem-solving patterns. Systems are systems.

    But the thing that’s really fascinating — and something I was mentioning to my colleagues — is how much of a force multiplier AI has become for me.

    AI as Acceleration

    When I’m building applications, AI lets me generate boilerplate code in seconds. Stuff that used to take hours — the scaffolding, the repetitive patterns, the configs — now takes a couple of minutes. The core ideas still come from me, but I get to stand on a higher platform and build faster.

    Testing used to be the same story. Crafting a JSON test payload with multiple layers of parameters? That used to take 30 minutes or an hour if the spec was complicated. Now I can generate it instantly, try variations, and validate behavior way faster. It completely changes the tempo of development.
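
    To make that concrete: the kind of payload I mean looks something like this (the shape is entirely made up), and a one-liner can stamp out variations in seconds:

    # Generate a nested JSON test payload; tweak fields per variation
    jq -n '{user: {id: 1, prefs: {theme: "dark", beta: true}}, items: [{sku: "A1", qty: 2}, {sku: "B7", qty: 1}]}'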

    The Real Impact

    It’s not just speed — it’s iteration. AI reduces the friction between idea → implementation → test → improve. Faster loops = better systems.

    In fact, I’m going to expand this in a separate blog focused specifically on the problems I’ve been running into and how I solved them. The patterns are becoming clearer:

    • Proxy configurations matter more than you expect
    • CI jobs can silently hit rate limits
    • AI can help diagnose and test faster
    • System understanding compounds over time

    Feeling excited about where all of this is going.

  • Marvel’s Spider-Man 2 (PC) – Review

    Perspective: A Parent, Casual Performance Gamer, and Multi-System PC Owner

    Reviewer Context

    I want to start this review by giving some context, because how you judge performance really matters.

    I’m a father with kids who actively play games, and I personally own multiple gaming systems:

    • An Intel Arc system (B580)
    • An AMD-based PC
    • An NVIDIA-based PC
    • Several living-room and handheld setups

    I’m also not someone who chases perfect performance. I don’t need 100+ FPS to enjoy a game. If a game runs at medium or low settings, looks fine, and doesn’t constantly stutter, I’m happy. Even 30 FPS—or slightly below—doesn’t bother me if it’s consistent.

    Because of that mindset, I’m generally very forgiving when reviewing games.

    That’s what makes this experience stand out.


    Why I Picked Up Spider-Man 2

    My kids played through Marvel’s Spider-Man and Miles Morales, and loved both. Naturally, they wanted to continue the story with Spider-Man 2. I bought the game mostly so I could:

    • Keep up with what they’re playing
    • Experience the story alongside them

    I wasn’t expecting cutting-edge performance. I just expected it to work.


    Performance Experience (The Core Issue)

    Unfortunately, this is one of the worst optimization experiences I’ve had in years.

    System 1: Older NVIDIA PC

    • 16 GB RAM
    • Decent CPU
    • GTX 1080 (still very capable for most games)

    Result:
    ➡️ Severe stuttering, even after lowering settings significantly.

    System 2: Intel Arc B580 PC

    • 32 GB RAM
    • SSD / M.2 storage
    • Intel Arc B580 GPU

    Result:
    ➡️ Same stuttering issues, despite better memory, faster storage, and a newer GPU.

    Additional Tests

    • Tried different settings
    • Tried handheld devices with SSDs
    • Eliminated storage bottlenecks

    Result:
    ➡️ Stuttering persisted across platforms

    This wasn’t a “your PC is too weak” situation. This was a system-agnostic problem.


    What Makes This So Frustrating

    This game doesn’t just run slow — it hangs, as if it’s getting caught on something internally. The stutters feel unrelated to raw GPU power or disk speed.

    Possible causes (speculation):

    • Ray tracing behavior
    • Frame generation conflicts
    • CPU scheduling or shader compilation issues
    • Poor engine optimization compared to the previous two games

    Whatever the reason, the result is the same:
    ➡️ Inconsistent, immersion-breaking stutter

    And for many players, this would be nearly unplayable.


    Story & Gameplay

    From what I’ve been able to experience:

    • The story itself is fine
    • Gameplay is familiar and enjoyable
    • My kids want to keep playing despite the issues

    That says a lot about the strength of the core game design.

    But performance overshadows everything.


    Platform Frustration

    I honestly wish this game were available on Xbox. I don’t own a PlayStation, and I’m not buying one for a single game — especially when I already maintain multiple PCs.

    The PC version should be the flexible option. Right now, it isn’t.


    Final Verdict

    Spider-Man 2 has the foundation of a great game, but the PC version is severely held back by optimization issues.

    • ❌ Persistent stuttering across multiple hardware setups
    • ❌ Poor experience even on capable systems
    • ✅ Story and gameplay are strong enough that kids still want to play

    Score (From My Perspective):

    6 / 10

    This score reflects potential, not execution.
    With proper optimization patches, this could easily be an 8 or higher.


    Who Should (and Shouldn’t) Play This Right Now

    You might be okay if:

    • You’re very patient
    • You tolerate stutter
    • You’re invested in the story

    You should wait if:

    • You care about smooth performance
    • You expect PC flexibility
    • You don’t want to troubleshoot endlessly
  • A Jog, a Homelab, and the Cost of ‘Doing It Yourself’

    Another day, another jog, another chance to talk out loud to myself. It’s Monday, December 8th, 2025. I haven’t listened to my audiobooks or the SANS podcast yet, but life moves forward, so I’m moving forward with this goal of just talking through what I’m working on.

    Homelab Reality Check

    Lately I’ve been fooling around with my home server and homelab setup. One thing I’ve realized: a “money-saving” homelab is an oxymoron. It’s exactly like camping—you can do it for cheap, but once you scale up, it’s absolutely not cheap anymore.

    My Unraid server has multiple hard drives, and they fail randomly. Redundancy becomes a whole topic by itself. Before COVID, you could grab used 10TB drives for ~$80–100, which felt like a steal. Five drives, one parity? Boom—40 TB usable storage for roughly $400–500.

    But in reality, once you start actually hosting things locally, you realize you need:

    • multiple drives
    • parity
    • cache drives
    • backup strategies
    • and constant troubleshooting

    Today’s issue: my cache drive stopped reading. I had just replaced the PSU with a Corsair unit and ran a parity check. Overnight one SSD just decided to quit. VMs broke, Docker broke, the whole thing went sideways.

    The SSD is officially dead. Now I’m debating:

    • Do I buy a new Samsung 1TB for ~$109?
    • A cheaper brand for ~$50–80?
    • Reuse a failing 240GB SSD just to limp by?

    It’s the endless Homelab question: fix it properly or patch it for now?

    Audiobooks, Habits, and Time

    It’s been weeks—maybe months—since I’ve opened Audible. When I lived in NYC, I listened constantly because I had an hour-to-90-minute commute on the R train.
    At 2x or 3x speed, I could get through 1–3 audiobooks a week.

    Now? With working from home and only driving the kids to school, finding listening time is tough.

    My plan:

    • Cut down music during workouts
    • Turn on audiobooks when driving
    • Mix in Udemy and training materials
    • Follow Atomic Habits: make good habits easier by putting them “in reach”

    It’s not impossible; I just need to build a new routine around my current life.

    Upcoming Events & Goals

    A few other things on my radar:

    Anime Convention

    It’s been almost 10 years since I last went to one. My kids want to go, so that’s going to be fun — a full throwback moment.

    Fitness

    I need to get back on track. I’ve been jogging again but want to rebuild consistency.

    GitHub Actions on Raspberry Pi

    This is my “because I can” project.
    I run a lot of security scanning tools for work, and having many self-hosted runners speeds everything up.

    Raspberry Pis:

    • aren’t power efficient
    • aren’t super powerful
    • but they’re tiny, quiet compute units I can scatter everywhere

    They’re perfect for automating repetitive workloads. Combine that with faster building through AI tools and I can really speed up my workflows.
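
    For the curious, standing up a single runner on a Pi is roughly this. The version, OWNER/REPO, and token are placeholders; GitHub’s “New self-hosted runner” page generates the exact commands and a fresh registration token for you:

    # On the Pi: download and unpack the ARM64 runner agent
    mkdir actions-runner && cd actions-runner
    curl -o actions-runner-linux-arm64.tar.gz -L \
      https://github.com/actions/runner/releases/download/v<VERSION>/actions-runner-linux-arm64-<VERSION>.tar.gz
    tar xzf actions-runner-linux-arm64.tar.gz

    # Register it against a repo (or org) and start picking up jobs
    ./config.sh --url https://github.com/OWNER/REPO --token <REGISTRATION_TOKEN>
    ./run.sh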

    Short-Term Roadmap

    • Decide whether to buy the new SSD
    • Tighten up my Unraid setup again
    • Restart my audiobook habit
    • Prep for the anime convention
    • Deploy GitHub Action runners across a few Raspberry Pis
    • Keep working out consistently

    A lot going on, but doable. The real goal: stop locking up mentally and figure out how to keep moving while I’m doing more.

  • The Value of Sharing What You Know

    You know, something I think about a lot—especially now that I’m doing more development and security work—is the value of sharing information.

    Back in the day, I used to post a lot online. Not funny videos, not memes—actual educational content. And honestly, I helped way more people than I ever realized at the time. People run into problems constantly, and when you share your knowledge publicly, you suddenly become part of thousands of invisible conversations where you’re helping someone fix something they couldn’t figure out on their own.

    It’s wild how far that goes.

    And the truth is, people remember you for that. You start to build a name for yourself. Ask yourself this:

    Would you rather be known as the guy who causes problems, or the guy who solves them?
    Well, okay—maybe it is kind of fun to cause problems sometimes. But still, you want to be known as someone who can solve interesting challenges, someone people can count on.


    Most Problems Aren’t Really New

    What I’ve learned over time is that almost every difficult problem—whether it’s programming, security, or math—comes from extensions of simple, foundational concepts.

    Take math, for example.
    Right now, you’re learning about systems of equations, balancing expressions, doing substitution. But the truth is, all these concepts are built on very basic operations:

    • Addition
    • Subtraction
    • The idea of equality
    • Commutative properties
    • Transitive properties
    • Substitution

    Everything big is built on something small.
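
    A tiny example: solve y = x + 2 and x + y = 10 by substitution, and it’s nothing but those basics chained together:

    x + (x + 2) = 10    (substitution)
    2x + 2 = 10         (addition)
    2x = 8              (subtract 2 from both sides)
    x = 4, so y = 6     (equality does the rest)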

    And it’s the same in engineering: most “complex” bugs are just simple rules applied in the wrong place or forgotten entirely.


    Sharing Helps Everyone (Including You)

    When you share knowledge—whether it’s a solution, a walkthrough, or even a mistake you learned from—you give people access to these foundational ideas. You help remove confusion for someone who might’ve spent hours trying to fix it.

    You also help yourself.
    Sharing makes you:

    • Understand the topic more deeply
    • Become part of a community
    • Build credibility
    • Create opportunities for yourself
    • Gain confidence in your abilities

    Honestly, you help 10× more people than you realize.

    That’s why I think about creating tools, sharing solutions, and explaining how things work—not just because I enjoy it, but because it’s meaningful. Helping others understand problems makes you a better problem-solver yourself.


    The Bigger Picture

    Sharing isn’t just about putting something on the internet. It’s about:

    • Reducing barriers
    • Spreading foundational ideas
    • Helping people grow
    • Building a good reputation
    • Making the world (and the internet) a little less confusing

    If you can make something complicated feel simple for someone else, you’ve done something genuinely valuable.

    That’s the kind of person I want to be—and the kind of example I want to set.

  • Sunday Jog Reflections – December 7th, 2025

    Another day, another jog. It’s Sunday, December 7th, 2025, and I’ve been thinking a lot about where I’m at—especially with my fitness and my projects.

    I ate way too much recently, and it’s becoming clear (again) that my diet needs a serious reset. I know the formula: if I really want to get in shape, the diet has to change first. Jogging helps, but diet is the multiplier. So that’s on my list.

    On the creative front, one of my friends pinged me about the card game we started building together. I’ve been on and off with it, but he’s motivated to get it off the ground, and honestly, I have everything I need—guides, materials, videos—to make real progress. I just need to sit down and put in the time. I’m planning to spend part of today watching the videos and pulling things together so I can get momentum back.

    On the technical side, I’m working on upgrading my GitHub environment, but I’ve been hitting challenges while learning. I’ve been experimenting with Coolify and trying to set up a clean environment for my sites. One issue I’m seeing: domains and services not running properly. My suspicion is that when I used the WordPress template, Coolify expected certain preset environment variables, and since I removed the local database and switched to a remote one, those variables are missing. That might be why the service keeps failing.
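
    If my hunch is right, the fix is just re-supplying the variables the official WordPress image expects. In Coolify they’d go in the service’s environment settings, but the names are the same ones a bare docker run would use (the host and credentials below are placeholders for my remote database):

    docker run -d --name wordpress \
      -e WORDPRESS_DB_HOST=remote-db.example.com:3306 \
      -e WORDPRESS_DB_USER=wp_user \
      -e WORDPRESS_DB_PASSWORD=changeme \
      -e WORDPRESS_DB_NAME=wordpress \
      -p 8080:80 wordpress:latest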

    My plan is to reset the configuration. The good news is that since the data is now on a remote database, everything should still be there. The real test will be bringing down the site that runs, spinning it back up, and seeing if everything reconnects correctly. If it works, great. If not… well, that means data loss, but I’m hoping for the best.

    Overall, life is good. I just need to get outside more, work out more, jog more, and keep moving forward—one step at a time.


  • Deployments, Gen-AI, and How They Shape My Development Process

    Hey, my name is Quentin Mayo, and in today’s blog I want to talk about deployments, Gen-AI, and how both shape the way I think about development. I’ll start with a story—because that’s how my brain usually connects everything.


    Why I “Never Look Back” in Development

    There’s a moment in The Incredibles where Edna Mode says she never looks back. That mindset stuck with me. I treat my devices and development setups the same way. Every so often, I’ll do a full hard reset on my phone or laptop. I back up what I know I’ll need… and if I didn’t grab it, it’s gone forever. No regrets.

    I apply that same philosophy to my development environments. Throughout my home lab and even in professional work, there have been countless times where I built something—a pipeline, an application, or a deployment setup—then realized I could build it better with the constraints I have today. Reset. Rebuild. Improve.

    That’s really what this vlog is about.


    My Deployment Evolution

    When I first started building websites, apps, and just tinkering with entrepreneurial ideas, I followed the same path a lot of organizations take:

    1. EC2 for basic servers
    2. Then Dockerized containers
    3. Then ECS as services needed to talk to one another
    4. And exploring equivalents in other clouds

    ECS was… complicated. Still fun, but definitely not simple.

    Recently, I discovered tools like Coolify and Dockify—platforms that let you deploy multiple services on a single instance, with a clean UI. This changed everything.

    Migrating WordPress

    I moved all my WordPress sites:

    • From EC2
    • To Lightsail
    • And finally to Coolify

    Lightsail was convenient… until it wasn’t. It wasn’t as cheap as advertised, had limitations, and I hit some scaling pain.

    So I figured: why not run Coolify on-prem? I even got AT&T to give me a static IP (which they were not excited about), and it worked… until it didn’t.

    Because I have a baby.
    And that baby loves unplugging my servers.

    One unplug → and boom → all my sites go down.

    So now I’m moving toward a hybrid approach.


    Designing a Hybrid Deployment Architecture

    Here’s the concept (rough CLI sketch below):

    • Route 53 manages my DNS.
    • The primary A record points to my home network running Coolify.
    • If the home IP becomes unreachable, Route 53 fails over to:
      → an EC2 instance also running Coolify in the cloud.

    This setup gives:

    • the low cost and flexibility of on-prem
    • the reliability of AWS
    • automatic DNS-level failover
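
    Here’s roughly what that looks like with the AWS CLI. The zone ID, IPs, and health-check ID are placeholders, and this assumes Coolify serves HTTPS on the home IP so the health check has something to probe:

    # 1) Health check that watches the home IP
    aws route53 create-health-check \
      --caller-reference home-coolify-1 \
      --health-check-config '{"IPAddress":"203.0.113.10","Port":443,"Type":"HTTPS","ResourcePath":"/","RequestInterval":30,"FailureThreshold":3}'

    # 2) PRIMARY record (home) and SECONDARY record (EC2) with failover routing
    aws route53 change-resource-record-sets --hosted-zone-id Z123EXAMPLE --change-batch '{
      "Changes": [
        {"Action": "UPSERT", "ResourceRecordSet": {
          "Name": "example.com", "Type": "A", "SetIdentifier": "home",
          "Failover": "PRIMARY", "TTL": 60,
          "HealthCheckId": "<health-check-id>",
          "ResourceRecords": [{"Value": "203.0.113.10"}]}},
        {"Action": "UPSERT", "ResourceRecordSet": {
          "Name": "example.com", "Type": "A", "SetIdentifier": "cloud",
          "Failover": "SECONDARY", "TTL": 60,
          "ResourceRecords": [{"Value": "198.51.100.20"}]}}
      ]}'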

    Database Strategy

    If I’m splitting workloads between home and cloud:

    • For small projects that don’t really matter → I’ll keep them on-prem.
    • For anything that needs reliability → the database cannot be on-prem.

    That leaves two good options:

    1. A database running on an EC2 instance (cheap, small projects)
    2. RDS for anything larger, or anything requiring scaling or stability

    A Common Mistake (AI Included)

    One thing I’ve noticed—and AI models repeat this mistake constantly—is the advice to always put your database in a private subnet. Sounds good… until you realize your application might need direct access from outside, depending on your architecture.

    A database in a private subnet with no plan for safe access leads to:

    • misconfigurations
    • exposed IPs
    • unnecessary risks

    You need a solid access pattern—not just “put it in private and pray.”


    Gen-AI and Development Speed

    Now for the Gen-AI side.

    There are things I’m doing right now that would take some organizations weeks or months, maybe even a year or more. I built them in a weekend with Gen-AI assisting me:

    Examples:

    • Setting up GitHub Actions with PATs and a GitHub App
    • Creating an autoscaling group of self-hosted GitHub runners
    • Reducing GitHub Actions billing by using my own metal before cloud minutes

    It’s wild how much faster development is when you combine:

    • solid engineering fundamentals
    • strong problem-solving skills
    • and modern AI tooling

    Gen-AI doesn’t replace thinking—it accelerates people who already know how to think.

    Those who rely on AI without understanding fundamentals will struggle. But those who use AI to extend their reach will outperform by a mile.


    Wrapping Up

    I’m almost at my destination, so I’ll close here. Today’s thoughts were really about:

    • Why I rebuild systems instead of clinging to them
    • Why hybrid deployments make sense for my home lab
    • The realities of cloud cost vs on-prem cost
    • How Gen-AI lets engineers ship insanely fast—if they know what they’re doing

    More to come soon.

  • Home Automation Journey

    A while back, one of my colleagues mentioned Home Assistant. I didn’t pay much attention at first because I was using a tool called DRAC Board. Fun name—and I liked it because it gave me a clean dashboard view for things like my calendar. But as I started working on more personal projects, I realized DRAC Board was too limited: no JIRA integration, and inconsistent support across Microsoft To Do, Google Keep, and the other tools I used. And honestly, it hasn’t really improved much as a product. For five bucks a month, it did its job, but it wasn’t enough for the long term.

    What really pushed me toward the home automation world was security cameras. With three little monsters—AKA my kids—running around, I needed visibility in the house while I’m working in the office. I had access to the cameras, but my wife didn’t, so I wanted a simple way for her to see what was going on in the shared spaces.

    I played with Home Assistant again and discovered that a lot of camera systems support RTSP streams. Even my little rinky-dink NVR could spit out RTSP URLs with the standard format (rtsp://username:password@IP/channel). Once I realized that, I built a nice dashboard where my wife and I can pull up the camera feeds anytime. And that alone made Home Assistant worthwhile.
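
    If you ever want to sanity-check a feed before wiring it into a dashboard, ffplay (it ships with FFmpeg) makes it a one-liner; the URL and credentials here are made-up placeholders:

    # If this window shows video, the same URL will work in a dashboard
    ffplay "rtsp://admin:password@192.168.1.50:554/ch01"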

    Integrations That Surprised Me

    As I dove deeper, I found that Home Assistant has integrations for other stuff I already own. For example, my Withings scale—the fancy one that tracks weight, BMI, etc.—has an API integration. So now my weight trend shows up on the dashboard too. It only pulls in one user profile, but that’s fine for now. I’m not buying another scale just to track everyone’s data.

    I also noticed you can pull in Xbox information. Since my kids often play using my profile, I can see what games they’ve been on. I haven’t tried Steam yet, but maybe that’s next.

    Next Steps for Home Automation

    I want to experiment with outdoor cameras next. The challenge is that I want all the features but I’m also really cheap. A lot of commercial brands either require subscriptions or offer limited functionality. And from a security perspective, I’m not overly concerned about digital locks being “weak.” Most break-ins aren’t Hollywood-style lockpicking. People smash windows, kick doors, or replay garage signals. So for me, it’s about features, not fear.

    Ideally, I want:

    • Fingerprint or phone-based entry for my kids
    • Cameras at entry points
    • Local + cloud backup of footage
    • A system that doesn’t cost a fortune

    I’ll keep refining it over the next few weeks.


    Fitness and Getting Back in Shape

    This year, getting back into shape has been harder than I expected. Not because my body can’t do it, but because my willpower keeps fighting for time against everything else. The funny thing is, when I was the busiest in my life, I was actually in the best shape. So it’s not a time problem—it’s a desire and consistency problem.

    I want to get back into jogging and lifting, especially after dealing with past shoulder injuries. Dance Dance Revolution is still my favorite workout, but it’s tough when the garage isn’t cleaned up (where I keep the dance pad). Sometimes the barrier isn’t motivation—it’s clutter.

    I used to do:

    • Push-ups
    • Crunches
    • Light lifting
    • Jogging
    • DDR or P90X

    And I could bounce back quickly. Now, with more responsibilities, I can’t just “go hard for 30 days” and magically reset. I need a slower, more sustainable ramp-up. Better eating. Smaller habits. Less “all or nothing.”

    Today I got outside and jogged a bit. It wasn’t perfect, but it was something. And right now, something is exactly what I need.


    Looking Ahead

    Over the next few weeks:

    • Keep improving the home automation setup
    • Start working toward getting outdoor cameras
    • Work on sustainable fitness
    • Drop a few pounds through consistency, not intensity

    Small steps, steady progress.