Available · Server engineering

Ziggy

ziggy@server:~$

Full-stack server engineer with 14+ years of software engineering. I design backend architecture that scales — and the frontend integrations that make it feel weightless.

engineer.profile.py Python
# ziggy.wiki — engineer.profile
class Engineer:
    handle = "Ziggy"
    role = "Full-Stack Server"
    years = 14  # and counting
    stack = ["C++", "Go", "Rust", "…"]

    def ship(self, spec):
        return production(spec, tests=100)

Servers are my home turf.

I'm Ziggy. I've been writing code for 14+ years — starting deep on the systems side and growing into a full-stack server engineer who treats the frontend as part of the platform, not an afterthought.

My focus is the unglamorous stuff that decides whether a product survives at scale: infrastructure that doesn't flinch under load, networking that respects the packet, storage layers that don't lie, and APIs that mean what they say.

I move comfortably between low-level performance work in C++ and Rust, backend platforms in Go, C#, Python, and Node, embedded scripting in Lua, and everything in between — because the right tool wins.

14+
years engineering
11
production languages
servers shipped
0
tickets ignored
whoami.sh Bash
$ uname -a
Linux ziggy 6.8.0-prod #1 SMP x86_64 GNU/Linux
$ whoami
ziggy  # full-stack server engineer
$ uptime
14y 3mo, load: 0.42 0.31 0.27
$ cat ~/.profile
export FOCUS="scale + speed"
export MOTTO="the right tool wins"
alias ship="git push --tags && deploy"

Eleven languages. One engineer.

Pick a language. See how I write it. Every snippet types itself out, line by line — same as how I would.

server.cpp C++
C++ proficiency 96%

Where I do my best work.

infra.yaml YAML
service: api-gateway
replicas: 8          # scale on p95 latency
runtime: rust-axum   # hot path lives here
limits:
  cpu: 2
  memory: 512Mi
probes:
  health: "/healthz"
  ready: "/readyz"
slo:
  availability: 99.95
  p95_ms: 35
deploy.sh Bash
#!/usr/bin/env bash
set -euo pipefail
REV="$(git rev-parse --short HEAD)"
echo "→ building $REV"
cargo build --release
echo "→ rolling out"
kubectl set image deploy/api api="api:$REV"
kubectl rollout status deploy/api --timeout=120s
curl -fsS https://api/healthz || exit 1
echo "✓ shipped $REV"
/01
Backend architecture
Service boundaries that hold up under traffic, queues that don't lose messages, and APIs that future-you won't curse at.
/02
Server infrastructure
Provisioning, deploys, observability, networking. Linux-first, comfortable with bare metal, containers, or whatever the team is on.
/03
Performance work
Profiling, hot paths, memory, allocators. C++ and Rust where every microsecond counts; Go and C# where readable concurrency wins.
/04
Data layer
SQL that uses the planner instead of fighting it. Schema migrations that don't need a maintenance window. Caching that means it.
/05
Frontend integration
Making the backend feel weightless from the browser. TypeScript, real-time, auth, and DX that doesn't punish your team.
/06
Systems scripting
Bash, batch, Lua, Python — the glue that holds the platform together. Quiet automation that pays interest forever.
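The "queues that don't lose messages" card above comes down to acknowledgement-based delivery: a message is handed to a consumer but stays owned by the queue until the consumer explicitly acks it, so a crash mid-processing means redelivery instead of loss. A minimal in-memory sketch of the idea (the `AckQueue` name and API are illustrative, not a real broker):

```python
import itertools
from collections import deque


class AckQueue:
    """At-least-once delivery: a message is only gone once it is acked."""

    def __init__(self):
        self._ready = deque()     # messages waiting for a consumer
        self._in_flight = {}      # delivery_id -> message, awaiting ack
        self._ids = itertools.count(1)

    def put(self, message):
        self._ready.append(message)

    def get(self):
        """Hand a message to a consumer, but keep it until ack() confirms."""
        message = self._ready.popleft()
        delivery_id = next(self._ids)
        self._in_flight[delivery_id] = message
        return delivery_id, message

    def ack(self, delivery_id):
        """Consumer finished: the message can finally be dropped."""
        del self._in_flight[delivery_id]

    def nack(self, delivery_id):
        """Consumer failed: requeue so the next attempt sees the message."""
        self._ready.append(self._in_flight.pop(delivery_id))


q = AckQueue()
q.put("charge-order-42")
tag, msg = q.get()
q.nack(tag)         # consumer crashed mid-processing: message survives
tag, msg = q.get()  # same message, redelivered
q.ack(tag)          # now it is really done
```

Real brokers add persistence, visibility timeouts, and dead-letter handling on top, but the invariant is the same: nothing is deleted until someone proves it was processed.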

A 14-year arc.

2012 — present
Server engineering, end to end
Designing and maintaining the server infrastructure behind production products: APIs, queues, schedulers, networking, observability — the whole stack from socket to UI.
2018 — present
Polyglot by default
Comfortable picking the right language for the job: C++/Rust for hot paths, Go/C# for services, Python/TS/Node for orchestration, Lua for embedded scripting, SQL everywhere.
Always
Shipping things that have to keep running
Servers, schedulers, daemons. Built to run while nobody's looking.
git log --author=ziggy Git
commit a1f7c92 (HEAD -> main) feat(api): zero-downtime schema migration
commit e3b0c8e perf(rust): -38% p99 on hot path
commit 7d4a611 refactor(go): swap channels for buffered queue
commit 2b9f04c chore(infra): cut deploy time 4m → 38s
commit 0001000 init: "hello, server" — 2012

Got a server no one wants to touch?
Let's talk.

Long-term engineering, architecture reviews, performance triage, or a one-off rescue — drop a line. I'll read it. I'll reply.

handshake.ts TypeScript
type Project = {
  scope: "audit" | "build" | "rescue";
  stack: string[];
  deadline?: Date;
};

type Reply = { ok: boolean; body: string };

// provided by the site's mail layer
declare function send(to: string, p: Project): Promise<Reply>;

async function contact(p: Project): Promise<Reply> {
  const reply = await send("hello@ziggy.wiki", p);
  return reply; // usually within a day
}