Nathan Baune
Backend & Platform Engineer
I build distributed systems, real-time engines, and database architectures—the infrastructure that makes platforms work. Currently doing research software engineering at Emory University while building an interactive storytelling platform (MUSE) in off-hours.
Seeking remote backend or platform engineering roles where I can contribute to production systems architecture, distributed systems, and backend infrastructure.
Currently
Software Engineer at the Precision Neural Engineering Lab (Emory University) building research platforms, ML pipelines, and real-time systems for NIH-funded clinical research.
Also working on MUSE Living Worlds (Gothic Grandma LLC) in spare time—a long-term creative and technical project exploring interactive narrative through biological character simulation.
Seeking: Remote backend or platform engineering roles. Available for a 2–3+ year commitment; transparent about long-term goals.
Startup Experience
Co-Founder & Technical Lead
PlatformSTL LLC · 2017–2021
Built Proprio, a real-time wearable data platform, from watch app to cloud infrastructure. Secured a $100K SBIR grant, published peer-reviewed research, and achieved 20% better accuracy than the prior state of the art.
Full-stack ownership: Swift iOS apps, AWS Lambda ingestion, MongoDB storage, ML pipeline, multi-tenant dashboards.
Core Strengths
Backend · Platform · Interface · Technologies
Design Philosophy
The principles that guide every system I build:
- Systems should explain themselves — opacity is a design failure, not an acceptable trade-off
- State must be observable by default — if you can't see what's happening, you can't understand or fix it
- Interfaces define power, not just permissions — the control plane shapes what users can think, not just what they can do
- Visual reasoning scales cognitively — if a system can't be understood visually, it won't scale in human minds
- Design for failure as the default — build for inspection, correction, and evolution from day one
How I Build
The concrete practices behind my work:
I design platforms by working backwards from human cognition and forwards from execution constraints. This means treating the human-facing interface and the execution runtime as equally important design problems that must solve each other's constraints.
In practice:
- Visual interfaces compile to executable systems — In MUSE, designers model systems through visual graphs that generate optimized execution kernels. No hand-coding required.
- Schemas are contracts — between people, code, and runtime. The database defines the truth; everything else generates from it.
- Control planes over displays — interfaces aren't passive views; they're active control surfaces that shape what's possible
- Naming, boundaries, and affordances as first-class engineering — these determine cognitive scalability long before technical scalability matters
- Complete toolchains — I build from isolated components through to production systems, ensuring every layer coheres
- AI as augmentation instrument — humans retain decision authority; AI extends capability without replacing judgment
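The "schemas are contracts" idea above can be sketched in miniature. This is a hypothetical illustration, not MUSE code: one schema definition drives both the storage DDL and runtime validation, so the database, the code, and the checks cannot drift apart. All names here (`SCHEMA`, `ddl`, `validate`) are invented for the example.

```python
# Hypothetical sketch: a single schema definition is the contract.
# Both the CREATE TABLE statement and the row validator are generated
# from it, so storage and validation stay in sync by construction.

SCHEMA = {
    "table": "characters",
    "fields": {
        "id": {"type": int, "sql": "INTEGER PRIMARY KEY"},
        "name": {"type": str, "sql": "TEXT NOT NULL"},
        "energy": {"type": float, "sql": "REAL NOT NULL"},
    },
}

def ddl(schema):
    """Generate the table definition from the schema."""
    cols = ", ".join(f"{name} {f['sql']}" for name, f in schema["fields"].items())
    return f"CREATE TABLE {schema['table']} ({cols})"

def validate(schema, row):
    """Check a row against the same schema the table was built from."""
    for name, field in schema["fields"].items():
        if not isinstance(row.get(name), field["type"]):
            raise TypeError(f"{name}: expected {field['type'].__name__}")
    return row
```

The point of the sketch is the single source of truth: change a field in `SCHEMA` and both the DDL and the validator change with it, with no second definition to forget.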
Projects
Professional Work (Current)
MR.Flow
2024–Present
Domain-specific orchestration for multi-stage async workloads. Execution state, failure modes, and artifacts are inspectable at every step. Research-grade MRI pipelines without the pain.
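The property described above, execution state and failure modes inspectable at every step, can be sketched as follows. This is a hedged illustration of the general pattern, not MR.Flow internals; the `Pipeline` class and its field names are invented for the example.

```python
# Hypothetical sketch of inspectable multi-stage orchestration: every
# step records its status and artifact as it runs, so a failure is
# visible in place rather than swallowed, and later steps stay unrun.

class Pipeline:
    def __init__(self, steps):
        self.steps = steps   # list of (name, callable) pairs
        self.state = {}      # name -> {"status": ..., "artifact" or "error": ...}

    def run(self, data):
        for name, fn in self.steps:
            try:
                data = fn(data)
                self.state[name] = {"status": "ok", "artifact": data}
            except Exception as exc:
                self.state[name] = {"status": "failed", "error": str(exc)}
                break        # stop here; absent entries mean "never ran"
        return self.state
```

After a run, `pipeline.state` answers "what happened at each step?" directly, which is the inspectability the description refers to.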
ePOCHE
2023–Present
Visual EEG-to-model pipeline for cognitive and BCI research. Feature engineering, model management, and automated evaluation.
Research Engineering Portfolio
Selected projects from over a decade of research engineering: VR assessments, robotics controllers, neuroimaging tools, and experimental systems built across clinical research settings.
MINT Balance Platform
Modernized a legacy balance perturbation system for NIH-funded Parkinson's research.
- MINT platform with a low-level hardware controller
- MATLAB integration with Vicon motion capture
- Refactored >15,000 lines to <6,000
- Migrated from Windows XP to Windows 11
VR Upper Extremity Assessment
VR-based motor assessments for stroke rehabilitation research using Unity and motion capture.
- Designed for accessibility with motor-impaired users
- Integrated motion capture for kinematic analysis
- Published findings on proprioceptive deficits
- Tested with 50+ stroke survivors
KINARM + Stateflow Experimental Control
Experimental paradigms for the KINARM robotic exoskeleton with multi-modal integration and state machine control.
- Integrated robot with EEG, forceplates, and real-time ML pipeline
- Stateflow-based trial sequencing with adaptive protocols
- Hardware synchronization (displays, sensors, actuators)
- Real-time kinematic feedback and neural decoding
- Used across multiple patient populations
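The Stateflow-based trial sequencing above amounts to a small state machine per trial. As a hedged sketch in Python (the real system uses Stateflow in MATLAB, and these state names are invented for illustration):

```python
# Hypothetical sketch of state-machine trial sequencing: each trial
# walks a fixed cycle of states, and adaptive protocols only change
# the parameters attached to transitions, not the structure itself.

TRANSITIONS = {
    "idle": "cue",        # trial start: present the cue
    "cue": "move",        # subject initiates movement
    "move": "feedback",   # show kinematic feedback
    "feedback": "idle",   # trial complete, return to idle
}

def run_trial(on_enter):
    """Step through one trial, calling on_enter(state) at each state."""
    state = "idle"
    visited = []
    while True:
        state = TRANSITIONS[state]
        on_enter(state)
        visited.append(state)
        if state == "idle":
            return visited
```

Keeping the sequencing declarative in a transition table is what makes adaptive protocols tractable: the experiment logic stays auditable even as parameters change mid-session.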
VR Integration
Hardware integration between Unity VR and research equipment.
- VR treadmill integration for locomotion research
- PCI card sends TTL pulses over a serial port for event synchronization
- Voltage conversion circuit for BioPAC DAQ compatibility
- Synchronized VR events with EMG/ECG/GSR recording
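The TTL synchronization above reduces to emitting a fixed-width pulse on a digital line when a VR event fires. A hedged sketch, with the hardware abstracted behind a callable (the real setup drives a PCI card through a voltage conversion circuit into the BioPAC DAQ; `set_line` and the widths here are illustrative):

```python
import time

def ttl_pulse(set_line, width_ms=5.0):
    """Raise a digital line for width_ms, then drop it.

    set_line(bool) is whatever toggles the hardware line in the real
    system; here it is a plain callable so the timing logic is testable.
    """
    set_line(True)                 # rising edge marks the VR event
    time.sleep(width_ms / 1000.0)  # hold long enough for the DAQ to sample
    set_line(False)                # falling edge ends the pulse

# Recording edges with timestamps lets EMG/ECG/GSR traces be aligned later.
edges = []
ttl_pulse(lambda high: edges.append((time.perf_counter(), high)), width_ms=1.0)
```

The design point is that the physiology recorder and the VR engine never share a clock; the pulse itself is the shared timestamp.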
fMRI Experimental Software
Stimulus presentation and response collection for fMRI studies of sensorimotor control.
- Python and E-Prime stimulus presentation
- Arduino-based hardware controller for MR-compatible response devices
- Precise timing synchronization with scanner triggers
- Motor task paradigms for functional imaging
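The scanner-trigger synchronization above usually comes down to zeroing the experiment clock on the first trigger pulse, then timestamping every stimulus onset against it. A hedged sketch (the actual studies used Python and E-Prime with hardware triggers; `TriggerClock` is an invented name):

```python
import time

class TriggerClock:
    """Timestamps stimulus events relative to the first scanner trigger."""

    def __init__(self):
        self.t0 = None

    def on_trigger(self):
        # The first TR pulse from the scanner defines time zero.
        if self.t0 is None:
            self.t0 = time.perf_counter()

    def mark(self, label):
        # Onset in seconds since the trigger, as needed for the design matrix.
        return (label, time.perf_counter() - self.t0)
```

Anchoring all onsets to the scanner's own trigger, rather than to wall-clock time, is what keeps stimulus timing aligned with the acquired volumes.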