The Problem
RWE's energy services platform was running on a legacy Java monolith that was slow and poorly implemented. Energy market data needed to be queried, filtered, and exported daily — but the system couldn't keep up. Large dataset exports took 45 minutes. API responses were sluggish. The monolith was hard to maintain and even harder to extend.
The business needed a system that could handle thousands of daily energy market queries with fast, reliable responses — and a codebase that a small team could actually evolve without fear.
How I Approached It
We migrated to Node.js with NestJS and MongoDB — a stack that gave us the modularity and performance the monolith lacked. Energy projects have dynamic properties, so the filtering system needed to generate columns on the fly based on each project's characteristics. I built this with MongoDB aggregations optimized through profiling, not guesswork.
The biggest technical challenge was that heavy data processing was killing the main thread. I implemented Worker Pools to offload large dataset operations to parallel threads — this is what brought export times from 45 minutes down to 12. I used the Node.js profiler to find the actual bottlenecks rather than optimizing blind.
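The pattern looked roughly like this — a minimal Worker Pool sketch, where the pool size, chunking, and CSV serialization are illustrative stand-ins, not the production code:

```typescript
// Minimal worker-pool sketch (illustrative names, not the production code).
// Heavy row serialization runs in worker threads so the event loop stays
// free to serve API requests while an export is in flight.
import { Worker } from "worker_threads";
import * as os from "os";

// Worker body: receives a chunk of rows, returns CSV lines.
// Inlined via `eval` mode so the sketch stays in one file.
const workerSource = `
const { parentPort } = require("worker_threads");
parentPort.on("message", (rows) => {
  const csv = rows.map((r) => Object.values(r).join(",")).join("\\n");
  parentPort.postMessage(csv);
});
`;

class ExportPool {
  private workers: Worker[] = [];
  private next = 0;

  constructor(size = os.cpus().length) {
    for (let i = 0; i < size; i++) {
      this.workers.push(new Worker(workerSource, { eval: true }));
    }
  }

  // Round-robin a chunk of rows onto the next worker.
  serializeChunk(rows: object[]): Promise<string> {
    const worker = this.workers[this.next];
    this.next = (this.next + 1) % this.workers.length;
    return new Promise<string>((resolve, reject) => {
      worker.once("message", resolve);
      worker.once("error", reject);
      worker.postMessage(rows);
    });
  }

  async close() {
    await Promise.all(this.workers.map((w) => w.terminate()));
  }
}
```

The key property is that `serializeChunk` never does CPU-heavy work on the main thread — it only posts messages — so API latency stays flat while an export chews through chunks in parallel.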
One thing I did without being asked: I introduced TDD from the start for every feature I built. On a migration project — where you're replacing a running system people depend on — you need the confidence that tests give you. I made that call early, and it paid off immediately. This discipline later became second nature when I moved to Scout24, where full test coverage was already the norm.
Key Technical Decisions
Worker Pools for Exports
Large dataset operations were blocking the event loop. Worker Pools moved heavy processing to parallel threads, keeping the API responsive while exports ran in the background. Export time dropped from 45 minutes to 12.
Profiler-Driven Optimization
Instead of guessing at bottlenecks, I used the Node.js profiler to identify exactly where time was spent in the column filtering pipeline — then optimized the specific MongoDB aggregations that mattered.
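The actual profiling used `node --prof` and `node --prof-process` against the running service; this small perf_hooks harness shows the same measure-first habit at a smaller scale. `timeIt` is an illustrative helper I'm sketching here, not a library API:

```typescript
// Measure-first sketch: before rewriting a pipeline step, wrap the
// candidates and compare the numbers. `timeIt` is illustrative only.
import { performance } from "perf_hooks";

function timeIt<T>(label: string, fn: () => T): T {
  const start = performance.now();
  const result = fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(1)}ms`);
  return result;
}

// Example: compare two candidate implementations of the same filter step.
const rows = Array.from({ length: 100_000 }, (_, i) => ({ price: i % 500 }));

timeIt("filter+map", () =>
  rows.filter((r) => r.price > 250).map((r) => r.price)
);

timeIt("single-pass", () => {
  const out: number[] = [];
  for (const r of rows) if (r.price > 250) out.push(r.price);
  return out;
});
```

The point isn't which variant wins — it's that the decision comes from a measurement, and the same logic applied to the MongoDB side via each aggregation's execution stats.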
Dynamic Column Generation
Energy projects have variable properties. The filtering system generates columns dynamically based on project characteristics — flexible enough for any project shape, fast enough for daily use.
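A hedged sketch of the idea: build the aggregation's `$project` stage from each project's declared properties, so the export only materializes the columns that exist for that project shape. The field names and schema here are illustrative, not RWE's actual data model:

```typescript
// Sketch: derive an aggregation pipeline from a project's property list.
// Field names ("attributes", "capacityMw", etc.) are illustrative.

interface ProjectSchema {
  name: string;
  properties: string[]; // e.g. ["capacityMw", "region", "tariff"]
}

// Always-present base columns plus one column per dynamic property.
function buildExportPipeline(schema: ProjectSchema): object[] {
  const projection: Record<string, unknown> = { _id: 0, projectName: "$name" };
  for (const prop of schema.properties) {
    // Missing properties surface as explicit nulls instead of absent keys,
    // so every exported row has the same column set.
    projection[prop] = { $ifNull: [`$attributes.${prop}`, null] };
  }
  return [
    { $match: { project: schema.name } },
    { $project: projection },
  ];
}
```

Because the pipeline is plain data, it can be unit-tested without a database — which matters when the column logic is the part most likely to change per project type.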
TDD from Day One
On a migration project, you're replacing a system people depend on. I introduced test-driven development for every feature — not because it was required, but because the risk of shipping broken replacements was too high.
Business Impact
The 65% improvement in API response times meant energy traders could query market data and get answers fast enough to act on them. The export reduction from 45 minutes to 12 meant analysts stopped scheduling exports overnight — they could run them during the workday and iterate. The new architecture gave a 4-person team the ability to ship features the old monolith made impossible.
What I Took Away
This project taught me the value of consistency in delivery. On a small team replacing a production system, there's no room for erratic output. Every week needed to move the migration forward predictably. I learned to scope work tightly, ship reliably, and communicate progress clearly.
Technically, it deepened my conviction that performance optimization starts with measurement. The profiler-first approach — finding the real bottleneck before writing a single optimization — saved weeks of wasted effort and became a permanent part of how I work.