At TikTok, our application operates on a massive scale, serving billions of users with a complex and constantly evolving feature set. To manage this complexity, our engineering teams rely on a robust infrastructure and carefully chosen tools.
One key technique we use is dependency injection (DI), which helps keep components loosely coupled and modular. As the app continued to grow in size and complexity, however, our existing DI system started to show limitations. In this post, we’ll explain the challenges we faced and how we improved DI at TikTok with a new framework called Knit.
The problem: The hidden costs of traditional DI at scale
Over the years, our teams explored various DI frameworks and ultimately chose annotation-based solutions such as Hilt and Dagger for their compile-time safety. Despite this benefit, as the TikTok app continued to grow, the underlying mechanics of these annotation-processor-based frameworks presented several obstacles at our scale:
- Build Time Impact: Our analysis revealed that annotation processing accounted for approximately 10% of our total compilation time. Its multiple processing rounds introduced significant overhead to every build.
- Runtime Overhead & ANRs: The fixed component hierarchy in Hilt, while suitable for many standard applications, led to the creation of large, unified components during application startup. This eager initialization contributed to significant runtime overhead, resulting in nearly 388,000 Application Not Responding (ANR) errors per month, making it one of the top three ANR sources in our app.
- Inflexible Component Hierarchy: TikTok utilizes a Single-Activity architecture. This meant the distinction between Hilt's `ApplicationComponent` and `ActivityComponent` was less meaningful for us. Furthermore, our deep fragment hierarchies required a more granular and flexible way to scope dependencies than the standard hierarchy allowed, without resorting to complex workarounds.
- Package Size: The code generation approach resulted in numerous additional classes for each entry point and module, including injectors, factories, and module implementations. This added up to a noticeable increase in our final package size.
Faced with these challenges, we defined four key goals for a new solution. It needed to be:
- Fast: Both at compile time and runtime, with no ANRs.
- Flexible: Allowing a customizable component hierarchy that fits our app's architecture.
- Safe: Providing compile-time verification for all dependencies.
- Minimized: Generating as little code as possible. Prioritizing runtime performance and compile-time safety requires some code generation, but we aim to keep it to a minimum.
Rethinking DI: The ideal pattern
At its core, dependency injection is about two fundamental concepts: producers, which create object instances, and consumers, which use them. The role of a DI framework is simply to connect these two parties.
We envisioned an ideal framework where developers only need to declare what they provide and what they consume. The framework should handle the "wiring" behind the scenes, directly connecting the consumer to the producer with minimal impact on build times, runtime performance, and package size.
What if we could establish this link without generating any intermediate classes or factories? This question became the starting point for our new solution.
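To make the producer/consumer framing concrete, here is a minimal, hand-rolled sketch of the wiring a DI framework performs. This is illustrative only, not Knit's API: the `Registry` class and its `provide`/`get` methods are our own invention for this post.

```kotlin
// Illustrative sketch only -- not Knit's API. A DI framework's job is to
// connect a producer (how to build an instance) with a consumer (who needs one).
class Registry {
    private val producers = mutableMapOf<Class<*>, () -> Any>()

    // Producer side: declare how to create an instance of T.
    fun <T : Any> provide(type: Class<T>, producer: () -> T) {
        producers[type] = producer
    }

    // Consumer side: ask for an instance of T; the registry does the wiring.
    fun <T : Any> get(type: Class<T>): T {
        val producer = producers[type] ?: error("No producer for ${type.name}")
        @Suppress("UNCHECKED_CAST")
        return producer() as T
    }
}

class Repo

fun main() {
    val registry = Registry()
    registry.provide(Repo::class.java) { Repo() } // producer declares
    val repo = registry.get(Repo::class.java)     // consumer receives
    println(repo is Repo)                          // prints true
}
```

The interesting question is how much machinery sits between `provide` and `get`; a runtime registry like this one pays a lookup on every access, which is exactly the kind of indirection we wanted to eliminate.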
Introducing Knit: A bytecode-level solution
We call our solution Knit. The name reflects its core function: it "knits" together dependencies, weaving producers and consumers into a direct, efficient graph. To achieve our goals of speed and minimalism, Knit operates not at the source code level, but at the bytecode level.
Instead of generating source code that must be re-processed and re-compiled, Knit works after the compilation is complete. It scans the compiled bytecode for its simple annotations, understands the dependency graph, and then modifies the bytecode directly to inject the necessary logic.
Here's what this looks like in practice. A developer simply writes:
```kotlin
// Producer
@Provides
class UserRepository

// Consumer
class UserService {
    val repository: UserRepository by di
}
```
This is all the developer needs to write: no modules, no components, no extra setup. After compilation, Knit transforms the `UserService` class's bytecode. The resulting code is functionally equivalent to the following highly optimized, thread-safe implementation using a double-checked locking pattern:
```kotlin
// Pseudo-code of the bytecode generated by Knit
class UserService {
    private var _repository: UserRepository? = null

    val repository: UserRepository
        get() {
            var localRepo = this._repository
            if (localRepo != null) return localRepo
            synchronized(this) {
                localRepo = this._repository
                if (localRepo != null) return localRepo
                localRepo = UserRepository() // Direct instantiation
                this._repository = localRepo
                return localRepo
            }
        }
}
```
As you can see, the connection is direct. There are no intermediate factories or lookups. Knit inserts the creation and caching logic exactly where it's needed, resulting in code that is both efficient and easy to trace.
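To see why this pattern is safe under concurrency, here is a hand-written analogue built on Kotlin's `lazy`, whose default mode uses the same synchronized double-checked initialization. This is an illustrative sketch of the semantics, not Knit's actual output; the `instantiations` counter is added for demonstration.

```kotlin
import java.util.concurrent.CountDownLatch

// Hand-written analogue of the injected pattern, for illustration only.
// LazyThreadSafetyMode.SYNCHRONIZED (the default) gives the same guarantees
// as the double-checked locking Knit emits: thread-safe, at-most-once creation.
class UserRepository {
    companion object { var instantiations = 0 }
    init { instantiations++ } // stands in for real construction work
}

class UserService {
    val repository: UserRepository by lazy { UserRepository() }
}

fun main() {
    val service = UserService()
    val start = CountDownLatch(1)
    val threads = List(8) {
        Thread {
            start.await()
            service.repository // all threads race to trigger initialization
        }
    }
    threads.forEach { it.start() }
    start.countDown()
    threads.forEach { it.join() }
    println(UserRepository.instantiations) // prints 1: at-most-once creation
}
```

The difference is that `lazy` allocates a delegate object per property at runtime, whereas Knit inlines the equivalent logic into the class itself at the bytecode level.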
The impact: Knit at TikTok
By replacing annotation-processor-based DI with Knit, we have seen significant, measurable improvements across the board:
- Build time reduced by 10%: By eliminating multiple rounds of annotation processing and source code generation, Knit streamlines the build. Its processing flow is much simpler: `source code → compiler → bytecode → Knit → modified bytecode`.
- ANRs reduced by 388,000 per month: Knit's "lazy by default" strategy is the key to its runtime performance. Dependencies are instantiated only when first accessed, not eagerly during application or component creation. This drastically reduces the work done on the main thread during critical startup paths.
- Package size reduced by 1MB: Because Knit modifies existing classes instead of generating numerous new ones (injectors, factories, and modules), it adds a minimal footprint to the final application.
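The effect of lazy-by-default on startup can be illustrated with a small sketch of our own (not Knit code) contrasting eager and lazy construction. `ExpensiveDependency` and its counter are hypothetical stand-ins for a heavyweight dependency:

```kotlin
// Illustration of why lazy-by-default helps startup: eager construction pays
// every dependency's cost up front, while lazy pays only on first use.
class ExpensiveDependency {
    companion object { var created = 0 }
    init { created++ } // stands in for heavy work (I/O, parsing, etc.)
}

// Eager: the dependency is built when the owner is built (on the startup path).
class EagerOwner { val dep = ExpensiveDependency() }

// Lazy: the dependency is built only on first access (off the startup path).
class LazyOwner { val dep by lazy { ExpensiveDependency() } }

fun main() {
    ExpensiveDependency.created = 0
    EagerOwner()                         // startup cost paid immediately
    println(ExpensiveDependency.created) // prints 1

    ExpensiveDependency.created = 0
    val owner = LazyOwner()              // startup stays cheap
    println(ExpensiveDependency.created) // prints 0
    owner.dep                            // cost paid only when actually needed
    println(ExpensiveDependency.created) // prints 1
}
```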
These results stem directly from Knit's foundational design: a bytecode-level approach that prioritizes direct connections and lazy initialization.
We are open-sourcing Knit
Today, we are excited to announce that Knit is now open source and available for the Android community. We built Knit to solve the challenges of dependency injection at massive scale, and we believe its principles of speed, flexibility, and simplicity can benefit many other projects.
We invite you to check out the repository on GitHub at tiktok/knit, try it out, and share your feedback. We look forward to seeing what the community builds with it and welcome your contributions.
