How Datadog Cut Go Binary Sizes 77%—Without Removing Features
Datadog engineers reduced their Agent binaries from 1.22 GiB to manageable sizes by auditing dependencies, eliminating reflection pitfalls, and re-enabling linker optimizations. Here's what they learned.
When the Datadog Agent ballooned from 428 MiB to 1.22 GiB over five years, it wasn't due to bloat or careless engineering. New features, integrations, and cloud SDKs had simply accumulated. But the growth created real problems: higher network transfer costs, a worse first impression on users, and deployment headaches on resource-constrained platforms like serverless and IoT devices.
So Datadog's engineers spent six months systematically dismantling the bloat. By the time they shipped version 7.68.0, they'd achieved up to a 77% reduction in binary sizes—without removing a single feature. Their approach reveals subtle behaviors in Go's compiler and linker that most developers don't understand, and provides immediately actionable techniques for anyone shipping Go binaries.
The Hidden Cost of Transitive Dependencies
The first culprit was Go's dependency model. When you import a package, you get everything it imports, transitively. According to Datadog engineer Pierre Gimalac's account, even small changes can pull in hundreds of packages unexpectedly.
Datadog's solution was methodical auditing using three tools: go list, to enumerate every package an import pulls into the build; goda, to query and visualize the dependency graph; and go-size-analyzer, to attribute binary size to individual packages.
The payoff was immediate. Moving a single function into its own package removed approximately 570 packages and 36 MiB of generated code from binaries that didn't use it.
The technique relies on Go's build tags system. By marking optional code with //go:build feature_x constraints, teams can exclude entire dependency trees from builds that don't need them. The approach is systematic: audit imports, identify what is optional, and isolate it behind build tags or in separate packages.
Reflection: The Silent Optimization Killer
The second major win came from understanding how reflection disables linker optimizations. When you use a non-constant method name—something like reflect.ValueOf(obj).MethodByName(variableName)—the linker can't determine at build time which methods will be called at runtime.
The result: it keeps every exported method of every reachable type, plus all their dependencies. This cascading effect can drastically inflate binary size.
Datadog eliminated dynamic reflection wherever possible in their own codebase. But they didn't stop there. They submitted pull requests to upstream projects—including Kubernetes, uber-go/dig, and google/go-cmp—to remove unnecessary reflection use. This yielded an additional 20% size reduction across affected builds.
The lesson: reflection isn't free. Every reflect.ValueOf call is a decision that affects your deployment artifacts.
The Plugin Package Trap
The most surprising finding involved Go's plugin package. Simply importing it—even without using it—forces the linker to treat the binary as dynamically linked. This disables method dead code elimination and forces the linker to retain all unexported methods.
Removing the plugin package import yielded another 20% reduction in some builds.
This is the kind of subtle compiler behavior that doesn't make it into tutorials. The plugin package announces to the linker that symbols might be referenced dynamically at runtime, so aggressive optimization becomes unsafe. The fix is straightforward once you know: if you're not actually using plugins, don't import the package.
What the Numbers Actually Mean
Let's be concrete about what Datadog achieved. The improvements landed between version 7.60.0 (December 2024) and 7.68.0 (July 2025), taking the largest Agent binaries from 1.22 GiB down by as much as 77%.
These aren't theoretical improvements. They translate directly to faster deployments, lower storage costs, reduced network transfer times, and better cold start performance in serverless environments.
The Upstream Contribution Effect
What makes this work particularly valuable is that Datadog didn't keep it internal. Their pull requests to projects like Kubernetes mean the entire Go ecosystem benefits. When widely-used dependencies eliminate reflection or reduce their own bloat, every project downstream gets smaller binaries.
This is how ecosystems improve: one team hits a pain point, solves it rigorously, and contributes the fixes back. According to the InfoQ coverage, Datadog's work is already helping other large Go projects unlock similar gains.
What You Can Do Tomorrow
If you're shipping Go binaries—particularly for cloud-native tools, agents, or microservices—here's the playbook:
1. Audit your dependencies. Run go list and goda to see what's actually being pulled in. You'll be surprised.
2. Isolate optional code. Use build tags to exclude features from builds that don't need them. Create separate packages for non-core functionality.
3. Search for reflection. Grep your codebase for reflect.ValueOf and MethodByName. Ask whether each use is necessary. Consider alternatives like code generation.
4. Check for plugin imports. If you're not using Go's plugin system, make sure you're not importing the package anywhere.
5. Measure everything. Use go-size-analyzer before and after changes to quantify impact.
The techniques aren't exotic. They're systematic application of understanding how Go's compiler and linker work. The difference between a 1.2 GiB binary and a 280 MiB binary often comes down to knowing what the linker can and can't optimize away.
The Real Lesson
Datadog's engineers didn't achieve a 77% reduction through a single clever trick. They did it through patient, methodical work: auditing dependencies, understanding compiler behavior, fixing issues in their code and upstream dependencies, and validating improvements over six months.
This is what production engineering looks like. Not blog-driven development or framework-hopping, but understanding your tools deeply enough to make informed tradeoffs. The Go compiler is sophisticated, but its optimizations depend on how you structure your code. Write code that gives the linker permission to be aggressive about dead code elimination, and your binaries shrink.
Gimalac's full account on the Datadog engineering blog includes significantly more technical detail, including specific examples of dependency chains and before/after comparisons. If you're serious about binary size, it's worth reading in full.
For teams deploying Go applications at scale, this work proves that binary bloat isn't inevitable. It's a problem with concrete, actionable solutions—if you're willing to look under the hood.