Reviving Graduate School MATLAB with Claude Code
Every researcher has a graveyard of old code. Mine includes MATLAB scripts from graduate school that once optimized rocket motor geometries, tuned CubeSat attitude controllers, and ran parallel genetic algorithms across networked lab computers. For years, these files sat in a folder called “grad_school_matlab” on backup drives, technically functional but practically useless.
Then some new models came out and I needed something interesting to try them out on.
The Original Code
In 2012, I wrote a parallel genetic algorithm framework in MATLAB for my thesis work at Auburn. The core idea was simple but effective: maintain multiple populations (I called them “nations”) that evolve independently, then periodically merge the best individuals. This is the island model, a well-established technique for avoiding premature convergence in evolutionary optimization. It also happens to distribute across an HPC cluster (like the one we had then) really well.
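To make that concrete, here's a minimal sketch of the island model in Python, with hypothetical names and a toy objective; it stands in for the original MATLAB rather than porting it:

import random

def fitness(individual):
    # Toy objective: the sphere function, minimized at the origin.
    return sum(x * x for x in individual)

def evolve(population):
    # Placeholder for one generation of selection, crossover, and mutation.
    return population

def migrate(islands, n_migrants=2):
    # Ring topology: the best individuals from island i replace the worst on island i+1.
    for i, src in enumerate(islands):
        dst = islands[(i + 1) % len(islands)]
        best = sorted(src, key=fitness)[:n_migrants]
        dst.sort(key=fitness)
        dst[-n_migrants:] = [list(b) for b in best]

# Four islands of twenty 10-gene individuals each.
islands = [[[random.uniform(-5, 5) for _ in range(10)] for _ in range(20)]
           for _ in range(4)]
for generation in range(100):
    islands = [evolve(pop) for pop in islands]
    if generation % 10 == 0:  # periodic migration between islands
        migrate(islands)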
The original code was structured around three classes:
- GAGlobe.m - The island model manager, coordinating multiple populations
- Population.m - Individual island populations with breeding and selection
- Member.m - Genome representation for individuals
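In rough Python terms, the layering looked like this (a sketch from memory, not the original MATLAB):

class Member:
    """Genome representation for a single individual."""
    def __init__(self, genes):
        self.genes = genes
        self.fitness = None

class Population:
    """One island: a group of Members plus breeding and selection."""
    def __init__(self, members):
        self.members = members

    def evolve(self):
        """Run one generation of selection, crossover, and mutation."""

class GAGlobe:
    """The island-model manager coordinating multiple Populations."""
    def __init__(self, populations):
        self.populations = populations

    def step(self, generation, migration_interval=10):
        for pop in self.populations:
            pop.evolve()
        if generation % migration_interval == 0:
            self.migrate()  # exchange the best Members between islands

    def migrate(self):
        """Move the best Members from each island to its neighbor."""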
It worked. I used it to optimize star grain geometries for solid rocket motors, tune control systems for nanosatellites, and solve various parameter estimation problems. The code ran on MATLAB’s Parallel Computing Toolbox, distributing fitness evaluations across cores and machines in a cluster.
But MATLAB is expensive, slow for this kind of work, and increasingly awkward for modern deployment. I haven’t had a MATLAB license since I graduated. The code sat unused.
Enter Claude Code
As models and harnesses have gotten better and better, I’ve gradually run out of toy problems that are the right mix of difficult, understandable, maybe useful, and unique to test them out on. So this time, I decided to try an experiment: could an AI assistant help me rewrite 14-year-old MATLAB into modern Rust with Python bindings?
The answer was yes, though “help” understates it. The process looked like this:
- Context loading: I fed Claude/Codex the entire legacy MATLAB codebase and explained what each piece did
- Architecture planning: We discussed modern equivalents, different frameworks to use or not use (e.g. is this just DEAP?)
- Incremental porting: Starting with the Member class (now genome.rs), then Population (population.rs), and finally GAGlobe (island.rs)
- Python bindings: Adding PyO3 annotations to expose the Rust types to Python
- Testing: Writing property-based tests to verify the new implementation matched the old behavior
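For that last step, property-based tests pin down invariants the old and new code should share. Here's a minimal sketch using the hypothesis library, reusing the hypothetical migrate() from the island-model sketch earlier in this post:

from hypothesis import given, strategies as st

genome = st.lists(st.floats(-5, 5), min_size=3, max_size=3)
island = st.lists(genome, min_size=4, max_size=8)

@given(st.lists(island, min_size=2, max_size=5))
def test_migration_preserves_island_sizes(islands):
    # Migration moves individuals between islands but must not change island sizes.
    sizes = [len(pop) for pop in islands]
    migrate(islands, n_migrants=2)  # migrate() from the earlier sketch
    assert [len(pop) for pop in islands] == sizes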
The bulk of the work ended up being in testing. If performance was what mattered, then we needed solid benchmarks across different types of problems. I asked the model to research academic optimization problems we could use for benchmarking, and we’ll have some good posts in the coming weeks diving deep into those.
What Changed
The core algorithm stayed the same. Islands evolve independently, periodically exchange their best individuals through migration, and eventually converge on good solutions. What changed was everything around it:
Performance: On standard optimization benchmarks (Rastrigin, Rosenbrock, Ackley, Griewank), ParGA’s Rust core is 2-2.7x faster than DEAP. For computationally expensive fitness functions (>0.5ms per evaluation), auto-parallelization kicks in and distributes evaluations across CPU cores, achieving further speedup over DEAP. For very cheap fitness functions like a simple sphere evaluation, the speedup is marginal (~1.1x) since both libraries are bottlenecked by Python call overhead rather than compute. There is room to improve this further: the parallel strategy currently uses Python multiprocessing, and moving more of the parallelism into Rust would reduce overhead and improve scaling.
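For a flavor of those benchmarks, here is the Rastrigin function, one of the four listed above, fed through the same minimize API shown later in this post (the actual benchmark harness is more involved):

import math
from parga import minimize

def rastrigin(x):
    # Global minimum of 0 at the origin, surrounded by a lattice of local minima.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

result = minimize(rastrigin, genome_length=10, bounds=(-5.12, 5.12))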
Parallelism: Instead of MATLAB’s implicit parallelism, ParGA now has explicit strategies. For expensive fitness functions (>0.5ms per evaluation), it automatically uses Python’s ProcessPoolExecutor to bypass the GIL. For cheap functions, that overhead isn’t worth it, so ParGA dynamically takes the quicker option: it keeps everything in one interpreter and lets the GIL do its thing.
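The dispatch logic is roughly this shape (a simplified sketch of the strategy described above, not ParGA’s actual code):

import time
from concurrent.futures import ProcessPoolExecutor

PARALLEL_THRESHOLD_S = 0.0005  # the ~0.5 ms per-evaluation cutoff mentioned above

def evaluate_population(fitness, genomes):
    # Time a single evaluation to choose a strategy. For the process pool to
    # work, fitness must be picklable (i.e. a top-level function).
    start = time.perf_counter()
    first = fitness(genomes[0])
    elapsed = time.perf_counter() - start

    if elapsed < PARALLEL_THRESHOLD_S:
        # Cheap function: process-pool overhead would dominate, so stay serial.
        return [first] + [fitness(g) for g in genomes[1:]]

    # Expensive function: farm evaluations out to worker processes, bypassing the GIL.
    with ProcessPoolExecutor() as pool:
        return [first] + list(pool.map(fitness, genomes[1:]))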
API Design: The original MATLAB code required understanding its internal structure just to set up a run. The new Python API is simple:
from parga import minimize

def sphere(x):
    return sum(x ** 2)

result = minimize(sphere, genome_length=10, bounds=(-5, 5))
Lessons Learned
Working with an AI on this project taught me something about code archaeology. The assistant asked clarifying questions I hadn’t thought to ask myself in years. (I don’t think GAs are likely the future, but they’re intuitive, interesting, and fun.)
Answering these questions forced me to re-examine decisions I’d made as a sleep-deprived graduate student. Some held up. Others didn’t. The AI didn’t judge either way. It just helped me build something better.
The result is ParGA, which I look forward to neglecting again for the next couple of decades.
What’s Next
This is the first post in a series about ParGA. Coming up:
- Building Python libraries in Rust with PyO3 and Maturin
- Genetic algorithms: the fundamentals
- Island models and migration topologies
- Case studies: Lennard-Jones clusters, protein folding, rocket motors, and alloy design
The code that sat dormant for more than a decade is now solving problems again. Sometimes the best way to move forward is to look back at what you already built.