Configuration Cache Requirements for your Build Logic
- Certain Types must not be Referenced by Tasks
- Using the Project Object at Execution Time
- Accessing a Task Instance from Another Instance
- Sharing Mutable Objects
- Accessing Task Extensions or Conventions
- Using Build Listeners
- Running External Processes
- Reading System Properties and Environment Variables
- Undeclared Reading of Files
- Bytecode Modifications and Java Agent
- Handling of Credentials and Secrets
To capture and reload the task graph state using the Configuration Cache, Gradle enforces specific requirements on tasks and build logic. Any violation of these requirements is reported as a Configuration Cache "problem," which causes the build to fail.
In most cases, these requirements expose undeclared inputs, making builds more strict, correct, and reliable. Using the Configuration Cache is effectively an opt-in to these improvements.
The following sections describe each requirement and provide guidance on resolving issues in your build.
Certain Types must not be Referenced by Tasks
Some types must not be referenced by task fields or in task actions (such as doFirst {} or doLast {}).
These types fall into the following categories:
- Live JVM state types
- Gradle model types
- Dependency management types
These restrictions exist because these types cannot easily be stored or reconstructed by the Configuration Cache.
Live JVM State Types
Live JVM state types (e.g., ClassLoader, Thread, OutputStream, Socket) are disallowed, as they do not represent task inputs or outputs.
Gradle Model Types
Gradle model types (e.g., Gradle, Settings, Project, SourceSet, Configuration) are often used to pass task inputs that should instead be explicitly declared.
For example, instead of referencing a Project to retrieve project.version at execution time, declare the project version as a Property<String> input.
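A minimal sketch of this pattern in the Kotlin DSL (the task and property names are illustrative, not part of any Gradle API):
abstract class PrintVersionTask : DefaultTask() {
    @get:Input
    abstract val projectVersion: Property<String>

    @TaskAction
    fun action() {
        println("Version: ${projectVersion.get()}")
    }
}

tasks.register<PrintVersionTask>("printVersion") {
    // The value is captured at configuration time and becomes a declared task input
    projectVersion.set(project.version.toString())
}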
Similarly, instead of referencing a SourceSet for source files or classpath resolution, declare these as a FileCollection input.
Dependency Management Types
The same requirement applies to dependency management types with some nuances.
Some dependency management types, such as Configuration and SourceDirectorySet, should not be used as task inputs because they contain unnecessary state and are not precise.
Use a less specific type that gives necessary features instead:
- If referencing a Configuration to get resolved files, declare a FileCollection input.
- If referencing a SourceDirectorySet, declare a FileTree input.
Additionally, referencing resolved dependency results is disallowed (e.g., ArtifactResolutionQuery, ResolvedArtifact, ArtifactResult).
Instead:
- Use a Provider<ResolvedComponentResult> from ResolutionResult.getRootComponent().
- Use ArtifactCollection.getResolvedArtifacts(), which returns a Provider<Set<ResolvedArtifactResult>>.
Tasks should avoid referencing resolved results and instead rely on lazy specifications to defer dependency resolution until execution time.
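For illustration, here is a sketch of wiring lazily resolved artifacts into a task input. It assumes the java plugin's runtimeClasspath configuration is present; the task and property names are illustrative:
abstract class PrintArtifactsTask : DefaultTask() {
    @get:Input
    abstract val artifactNames: SetProperty<String>

    @TaskAction
    fun action() {
        artifactNames.get().forEach { println(it) }
    }
}

tasks.register<PrintArtifactsTask>("printArtifacts") {
    val resolvedArtifacts = configurations.named("runtimeClasspath")
        .flatMap { it.incoming.artifacts.resolvedArtifacts }
    // Only the Provider is captured here, so resolution is deferred to execution time
    artifactNames.set(resolvedArtifacts.map { artifacts ->
        artifacts.map { it.id.displayName }.toSet()
    })
}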
Some types, such as Publication or Dependency, are not serializable but could be made so in the future.
Gradle may allow them as task inputs if necessary.
The following task references a SourceSet, which is not allowed:
abstract class SomeTask : DefaultTask() {
@get:Input lateinit var sourceSet: SourceSet (1)
@TaskAction
fun action() {
val classpathFiles = sourceSet.compileClasspath.files
// ...
}
}
abstract class SomeTask extends DefaultTask {
@Input SourceSet sourceSet (1)
@TaskAction
void action() {
def classpathFiles = sourceSet.compileClasspath.files
// ...
}
}
1 | This will be reported as a problem because referencing SourceSet is not allowed |
The following is the fixed version:
abstract class SomeTask : DefaultTask() {
@get:InputFiles @get:Classpath
abstract val classpath: ConfigurableFileCollection (1)
@TaskAction
fun action() {
val classpathFiles = classpath.files
// ...
}
}
abstract class SomeTask extends DefaultTask {
@InputFiles @Classpath
abstract ConfigurableFileCollection getClasspath() (1)
@TaskAction
void action() {
def classpathFiles = classpath.files
// ...
}
}
1 | No more problems reported, we now reference the supported type FileCollection |
If an ad-hoc task in a script captures a disallowed reference in a doLast {} closure:
tasks.register("someTask") {
doLast {
val classpathFiles = sourceSets.main.get().compileClasspath.files (1)
}
}
tasks.register('someTask') {
doLast {
def classpathFiles = sourceSets.main.compileClasspath.files (1)
}
}
1 | This will be reported as a problem because the doLast {} closure is capturing a reference to the SourceSet |
You still need to fulfil the same requirement: the task must not reference a disallowed type.
This is how the task declaration above can be fixed:
tasks.register("someTask") {
val classpath = sourceSets.main.get().compileClasspath (1)
doLast {
val classpathFiles = classpath.files
}
}
tasks.register('someTask') {
def classpath = sourceSets.main.compileClasspath (1)
doLast {
def classpathFiles = classpath.files
}
}
1 | No more problems reported, the doLast {} closure now only captures classpath which is of the supported FileCollection type |
Sometimes, a disallowed type is indirectly referenced through another type. For example, a task may reference an allowed type that, in turn, references a disallowed type. The hierarchical view in the HTML problem report can help you trace such issues and identify the offending reference.
Using the Project Object at Execution Time
Tasks must not use any Project objects during execution. This includes calling Task.getProject() while a task is running.
Some cases can be resolved similarly to those described in disallowed types.
Often, equivalent functionality is available on both Project and Task.
For example:
- If you need a Logger, use Task.logger instead of Project.logger.
- For file operations, use injected services rather than Project methods.
The following task incorrectly references the Project object at execution time:
abstract class SomeTask : DefaultTask() {
@TaskAction
fun action() {
project.copy { (1)
from("source")
into("destination")
}
}
}
abstract class SomeTask extends DefaultTask {
@TaskAction
void action() {
project.copy { (1)
from 'source'
into 'destination'
}
}
}
1 | This will be reported as a problem because the task action is using the Project object at execution time |
Fixed version:
abstract class SomeTask : DefaultTask() {
@get:Inject abstract val fs: FileSystemOperations (1)
@TaskAction
fun action() {
fs.copy {
from("source")
into("destination")
}
}
}
abstract class SomeTask extends DefaultTask {
@Inject abstract FileSystemOperations getFs() (1)
@TaskAction
void action() {
fs.copy {
from 'source'
into 'destination'
}
}
}
1 | No more problem reported, the injected FileSystemOperations service is supported as a replacement for project.copy {} |
If the same problem occurs in an ad-hoc task in a script:
tasks.register("someTask") {
doLast {
project.copy { (1)
from("source")
into("destination")
}
}
}
tasks.register('someTask') {
doLast {
project.copy { (1)
from 'source'
into 'destination'
}
}
}
1 | This will be reported as a problem because the task action is using the Project object at execution time |
Fixed version:
interface Injected {
@get:Inject val fs: FileSystemOperations (1)
}
tasks.register("someTask") {
val injected = project.objects.newInstance<Injected>() (2)
doLast {
injected.fs.copy { (3)
from("source")
into("destination")
}
}
}
interface Injected {
@Inject FileSystemOperations getFs() (1)
}
tasks.register('someTask') {
def injected = project.objects.newInstance(Injected) (2)
doLast {
injected.fs.copy { (3)
from 'source'
into 'destination'
}
}
}
1 | Services can’t be injected directly into scripts, so an extra type is needed to convey the injection point |
2 | Create an instance of the extra type using project.objects outside the task action |
3 | No more problem reported, the task action references injected that provides the FileSystemOperations service, supported as a replacement for project.copy {} |
Fixing ad-hoc tasks in scripts requires additional effort, making it a good opportunity to refactor them into proper task classes.
For commonly used Project methods, the recommended replacement is usually a task input or output property, or a script variable, that captures the result of using the Project method at configuration time. For file system and process operations, injected services such as FileSystemOperations (shown above) and ExecOperations provide equivalent functionality; in other cases, the Kotlin, Groovy or Java API available to your build logic is the appropriate replacement.
Accessing a Task Instance from Another Instance
Tasks must not directly access the state of another task instance. Instead, they should be connected using inputs and outputs relationships.
This requirement ensures that tasks remain isolated and correctly cacheable. As a result, it is unsupported to write tasks that configure other tasks at execution time.
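As a sketch of the recommended approach, a consumer task can wire its input to the producer's output property, which also creates the necessary task dependency. The task and property names here are illustrative:
abstract class ProducerTask : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        outputFile.get().asFile.writeText("produced data")
    }
}

abstract class ConsumerTask : DefaultTask() {
    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() {
        println(inputFile.get().asFile.readText())
    }
}

val producer = tasks.register<ProducerTask>("produce") {
    outputFile.set(layout.buildDirectory.file("producer/data.txt"))
}

tasks.register<ConsumerTask>("consume") {
    // Wiring the output Provider creates an implicit dependency on the producer task
    inputFile.set(producer.flatMap { it.outputFile })
}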
Sharing Mutable Objects
When storing a task in the Configuration Cache, all objects referenced through the task’s fields are serialized.
In most cases, deserialization preserves reference equality: if two fields a and b reference the same instance at configuration time, they will still reference the same instance after deserialization (a == b, or a === b in Groovy/Kotlin syntax).
However, for performance reasons, certain classes, such as java.lang.String, java.io.File, and many java.util.Collection implementations, are serialized without preserving reference equality.
After deserialization, fields that referred to these objects may reference different but equal instances.
Consider a task that stores a user-defined object and an ArrayList as task fields:
class StateObject {
// ...
}
abstract class StatefulTask : DefaultTask() {
@get:Internal
var stateObject: StateObject? = null
@get:Internal
var strings: List<String>? = null
}
tasks.register<StatefulTask>("checkEquality") {
val objectValue = StateObject()
val stringsValue = arrayListOf("a", "b")
stateObject = objectValue
strings = stringsValue
doLast { (1)
println("POJO reference equality: ${stateObject === objectValue}") (2)
println("Collection reference equality: ${strings === stringsValue}") (3)
println("Collection equality: ${strings == stringsValue}") (4)
}
}
class StateObject {
// ...
}
abstract class StatefulTask extends DefaultTask {
@Internal
StateObject stateObject
@Internal
List<String> strings
}
tasks.register("checkEquality", StatefulTask) {
def objectValue = new StateObject()
def stringsValue = ["a", "b"] as ArrayList<String>
stateObject = objectValue
strings = stringsValue
doLast { (1)
println("POJO reference equality: ${stateObject === objectValue}") (2)
println("Collection reference equality: ${strings === stringsValue}") (3)
println("Collection equality: ${strings == stringsValue}") (4)
}
}
1 | doLast action captures the references from the enclosing scope. These captured references are also serialized to the Configuration Cache. |
2 | Compare the reference to an object of user-defined class stored in the task field and the reference captured in the doLast action. |
3 | Compare the reference to ArrayList instance stored in the task field and the reference captured in the doLast action. |
4 | Check the equality of stored and captured lists. |
Without Configuration Cache, reference equality is preserved in both cases:
❯ ./gradlew --no-configuration-cache checkEquality

> Task :checkEquality
POJO reference equality: true
Collection reference equality: true
Collection equality: true
With Configuration Cache enabled, only user-defined object references remain identical. List references are different, although the lists themselves remain equal:
❯ ./gradlew --configuration-cache checkEquality

> Task :checkEquality
POJO reference equality: true
Collection reference equality: false
Collection equality: true
Best Practices:
- Avoid sharing mutable objects between configuration and execution phases.
- If sharing state is necessary, wrap it in a user-defined class.
- Do not rely on reference equality for standard Java, Groovy, Kotlin, or Gradle-defined types.
Reference equality is never preserved between tasks—each task is an isolated "realm." To share objects across tasks, use a Build Service to wrap the shared state.
Accessing Task Extensions or Conventions
Tasks must not access conventions, extensions, or extra properties at execution time.
Instead, any value relevant to task execution should be explicitly modeled as a task property to ensure proper caching and reproducibility.
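For example, instead of reading an extension at execution time, capture its value in a task property at configuration time. A minimal sketch, assuming a hypothetical extension with a single Property<String> (all names here are illustrative):
interface GreetingExtension {
    val message: Property<String>
}

abstract class GreetTask : DefaultTask() {
    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun greet() {
        println(message.get())
    }
}

val greeting = extensions.create<GreetingExtension>("greeting")

tasks.register<GreetTask>("greet") {
    // Connect the extension property at configuration time instead of reading the extension at execution time
    message.set(greeting.message)
}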
Using Build Listeners
Plugins and build scripts must not register build listeners that are created at configuration time and triggered at execution time.
This includes listeners such as BuildListener or TaskExecutionListener.
Recommended Alternatives:
- Use Build Services to handle execution-time logic.
- Register a Build Service to receive task execution notifications, as sketched below.
- Replace buildFinished listeners with dataflow actions to manage build results.
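A hedged sketch of the second option, written as plugin code (for example in buildSrc), since the listener registry cannot be injected directly into scripts; the service and plugin names are illustrative:
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.services.BuildService
import org.gradle.api.services.BuildServiceParameters
import org.gradle.build.event.BuildEventsListenerRegistry
import org.gradle.tooling.events.FinishEvent
import org.gradle.tooling.events.OperationCompletionListener
import org.gradle.tooling.events.task.TaskFinishEvent
import javax.inject.Inject

abstract class TaskEventsService : BuildService<BuildServiceParameters.None>, OperationCompletionListener {
    override fun onFinish(event: FinishEvent) {
        if (event is TaskFinishEvent) {
            println("Finished ${event.descriptor.taskPath}")
        }
    }
}

abstract class TaskEventsPlugin @Inject constructor(
    private val registry: BuildEventsListenerRegistry
) : Plugin<Project> {
    override fun apply(project: Project) {
        val service = project.gradle.sharedServices
            .registerIfAbsent("taskEvents", TaskEventsService::class.java) {}
        // The service receives task completion notifications instead of a TaskExecutionListener
        registry.onTaskCompletion(service)
    }
}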
Running External Processes
Plugins and build scripts should avoid running external processes at configuration time.
In general, it is preferred to run external processes in tasks with properly declared inputs and outputs, to avoid unnecessary work when the task is UP-TO-DATE.
However, if you need to run a process at configuration time, use only the configuration-cache-compatible APIs described below instead of the standard Java and Groovy APIs or the Gradle-provided methods Project.exec, Project.javaexec, ExecOperations.exec, and ExecOperations.javaexec.
The flexibility of these methods prevents Gradle from determining how the calls impact the build configuration, making it difficult to ensure that the Configuration Cache entry can be safely reused.
For simpler cases, when grabbing the output of the process is enough, providers.exec() and providers.javaexec() can be used:
val gitVersion = providers.exec {
commandLine("git", "--version")
}.standardOutput.asText.get()
def gitVersion = providers.exec {
commandLine("git", "--version")
}.standardOutput.asText.get()
For more complex cases, a custom ValueSource implementation with injected ExecOperations can be used. This ExecOperations instance can be used at configuration time without restrictions:
abstract class GitVersionValueSource : ValueSource<String, ValueSourceParameters.None> {
@get:Inject
abstract val execOperations: ExecOperations
override fun obtain(): String {
val output = ByteArrayOutputStream()
execOperations.exec {
commandLine("git", "--version")
standardOutput = output
}
return String(output.toByteArray(), Charset.defaultCharset())
}
}
abstract class GitVersionValueSource implements ValueSource<String, ValueSourceParameters.None> {
@Inject
abstract ExecOperations getExecOperations()
String obtain() {
ByteArrayOutputStream output = new ByteArrayOutputStream()
execOperations.exec {
it.commandLine "git", "--version"
it.standardOutput = output
}
return new String(output.toByteArray(), Charset.defaultCharset())
}
}
You can also use standard Java/Kotlin/Groovy process APIs like java.lang.ProcessBuilder in the ValueSource.
The ValueSource implementation can then be used to create a provider with providers.of:
val gitVersionProvider = providers.of(GitVersionValueSource::class) {}
val gitVersion = gitVersionProvider.get()
def gitVersionProvider = providers.of(GitVersionValueSource.class) {}
def gitVersion = gitVersionProvider.get()
In both approaches, if the value of the provider is used at configuration time then it will become a build configuration input.
The external process will be executed for every build to determine if the Configuration Cache is UP-TO-DATE, so it is recommended to only call fast-running processes at configuration time.
If the value changes then the cache is invalidated and the process will be run again during this build as part of the configuration phase.
Reading System Properties and Environment Variables
Plugins and build scripts may read system properties and environment variables directly at configuration time with standard Java, Groovy, or Kotlin APIs or with the value supplier APIs. Doing so makes such variables or properties a build configuration input. Therefore, changing their value invalidates the Configuration Cache.
The Configuration Cache report includes a list of these build configuration inputs to help track them.
In general, you should avoid reading the value of system properties and environment variables at configuration time to avoid cache misses when these values change.
Instead, you can connect the Provider returned by providers.systemProperty() or providers.environmentVariable() to task properties.
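A minimal sketch of this connection (the environment variable, task, and property names are illustrative):
abstract class PrintEndpointTask : DefaultTask() {
    @get:Input
    abstract val endpoint: Property<String>

    @TaskAction
    fun action() {
        println("Endpoint: ${endpoint.get()}")
    }
}

tasks.register<PrintEndpointTask>("printEndpoint") {
    // The environment variable is read at execution time, so it does not become a build configuration input
    endpoint.set(providers.environmentVariable("SERVICE_ENDPOINT").orElse("https://example.invalid"))
}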
Some access patterns that potentially enumerate all environment variables or system properties (for example, calling System.getenv().forEach() or using the iterator of its keySet()) are discouraged.
In this case, Gradle cannot find out what properties are actual build configuration inputs, so every available property becomes one.
Even adding a new property will invalidate the cache if this pattern is used.
Using a custom predicate to filter environment variables is an example of this discouraged pattern:
val jdkLocations = System.getenv().filterKeys {
it.startsWith("JDK_")
}
def jdkLocations = System.getenv().findAll {
key, _ -> key.startsWith("JDK_")
}
The logic in the predicate is opaque to the Configuration Cache, so all environment variables are considered inputs.
One way to reduce the number of inputs is to always use methods that query a concrete variable name, such as getenv(String) or getenv().get():
val jdkVariables = listOf("JDK_8", "JDK_11", "JDK_17")
val jdkLocations = jdkVariables.filter { v ->
System.getenv(v) != null
}.associate { v ->
v to System.getenv(v)
}
def jdkVariables = ["JDK_8", "JDK_11", "JDK_17"]
def jdkLocations = jdkVariables.findAll { v ->
System.getenv(v) != null
}.collectEntries { v ->
[v, System.getenv(v)]
}
The fixed code above, however, is not exactly equivalent to the original as only an explicit list of variables is supported. Prefix-based filtering is a common scenario, so there are provider-based APIs to access system properties and environment variables:
val jdkLocationsProvider = providers.environmentVariablesPrefixedBy("JDK_")
def jdkLocationsProvider = providers.environmentVariablesPrefixedBy("JDK_")
Note that the Configuration Cache would be invalidated not only when the value of the variable changes or the variable is removed but also when another variable with the matching prefix is added to the environment.
For more complex use cases, a custom ValueSource implementation can be used.
System properties and environment variables referenced in the code of the ValueSource do not become build configuration inputs, so any processing can be applied.
Instead, the value of the ValueSource is recomputed each time the build runs, and the Configuration Cache is invalidated only if that value changes.
For example, a ValueSource can be used to get all environment variables with names containing the substring JDK:
abstract class EnvVarsWithSubstringValueSource : ValueSource<Map<String, String>, EnvVarsWithSubstringValueSource.Parameters> {
interface Parameters : ValueSourceParameters {
val substring: Property<String>
}
override fun obtain(): Map<String, String> {
return System.getenv().filterKeys { key ->
key.contains(parameters.substring.get())
}
}
}
val jdkLocationsProvider = providers.of(EnvVarsWithSubstringValueSource::class) {
parameters {
substring = "JDK"
}
}
abstract class EnvVarsWithSubstringValueSource implements ValueSource<Map<String, String>, Parameters> {
interface Parameters extends ValueSourceParameters {
Property<String> getSubstring()
}
Map<String, String> obtain() {
return System.getenv().findAll { key, _ ->
key.contains(parameters.substring.get())
}
}
}
def jdkLocationsProvider = providers.of(EnvVarsWithSubstringValueSource.class) {
parameters {
substring = "JDK"
}
}
Undeclared Reading of Files
Plugins and build scripts should not read files directly using the Java, Groovy or Kotlin APIs at configuration time. Instead, declare files as potential build configuration inputs using the value supplier APIs.
This problem is caused by build logic similar to this:
val config = file("some.conf").readText()
def config = file('some.conf').text
To fix this problem, read files using providers.fileContents() instead:
val config = providers.fileContents(layout.projectDirectory.file("some.conf"))
.asText
def config = providers.fileContents(layout.projectDirectory.file('some.conf'))
.asText
In general, you should avoid reading files at configuration time, to avoid invalidating Configuration Cache entries when the file content changes.
Instead, you can connect the Provider returned by providers.fileContents() to task properties.
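For example, a sketch of connecting the file contents provider to a task input (the task and property names are illustrative):
abstract class PrintConfigTask : DefaultTask() {
    @get:Input
    abstract val config: Property<String>

    @TaskAction
    fun action() {
        println(config.get())
    }
}

tasks.register<PrintConfigTask>("printConfig") {
    // The file is read at execution time instead of at configuration time
    config.set(providers.fileContents(layout.projectDirectory.file("some.conf")).asText)
}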
Bytecode Modifications and Java Agent
To detect the configuration inputs, Gradle modifies the bytecode of classes on the build script classpath, like plugins and their dependencies. Gradle uses a Java agent to modify the bytecode. Integrity self-checks of some libraries may fail because of the changed bytecode or the agent’s presence.
To work around this, you can use the Worker API with classloader or process isolation to encapsulate the library code. The bytecode of the worker’s classpath is not modified, so the self-checks should pass. When process isolation is used, the worker action is executed in a separate worker process that doesn’t have the Gradle Java agent installed.
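A hedged sketch of the process-isolation approach (the work action and task are illustrative, and the actual library call is elided):
abstract class LibraryWorkAction : WorkAction<WorkParameters.None> {
    override fun execute() {
        // Invoke the library that performs integrity self-checks here;
        // its bytecode is not modified in the isolated worker process.
    }
}

abstract class LibraryTask @Inject constructor(
    private val workerExecutor: WorkerExecutor
) : DefaultTask() {
    @get:Classpath
    abstract val libraryClasspath: ConfigurableFileCollection

    @TaskAction
    fun run() {
        workerExecutor.processIsolation {
            classpath.from(libraryClasspath)
        }.submit(LibraryWorkAction::class.java) {}
    }
}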
In simple cases, when the library also provides a command-line entry point (a public static void main() method), you can also use the JavaExec task to isolate the library.
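For instance, a minimal sketch assuming the library provides a main class; the configuration name and main class are hypothetical:
tasks.register<JavaExec>("runLibraryTool") {
    // A dedicated configuration holding the library and its dependencies
    classpath = configurations["libraryTool"]
    mainClass.set("com.example.LibraryMain")
    args("--input", "data.txt")
}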
Handling of Credentials and Secrets
Currently, the Configuration Cache does not have a built-in mechanism to prevent storing secrets used as inputs.
As a result, secrets may end up in the serialized Configuration Cache entry, which, by default, is stored under .gradle/configuration-cache in your project directory.
To mitigate the risk of accidental exposure, Gradle encrypts the Configuration Cache.
When required, Gradle transparently generates a machine-specific secret key, caches it under the GRADLE_USER_HOME directory, and uses it to encrypt data in the project-specific caches.
To further enhance security, follow these recommendations:
- Secure access to Configuration Cache entries.
- Use GRADLE_USER_HOME/gradle.properties to store secrets. The content of this file is not included in the Configuration Cache; only its fingerprint is. If storing secrets in this file, ensure access is properly restricted.
See gradle/gradle#22618.
Providing an Encryption Key with the GRADLE_ENCRYPTION_KEY Environment Variable
By default, Gradle automatically generates and manages the encryption key as a Java keystore, stored under the GRADLE_USER_HOME directory.
For environments where this behavior is undesirable, such as when the GRADLE_USER_HOME directory is shared across multiple machines, you can explicitly provide an encryption key using the GRADLE_ENCRYPTION_KEY environment variable.
The same encryption key must be consistently provided across multiple Gradle runs; otherwise, Gradle will be unable to reuse existing cached configurations.
Generating an Encryption Key compatible with GRADLE_ENCRYPTION_KEY
To encrypt the Configuration Cache using a user-specified encryption key, Gradle requires the GRADLE_ENCRYPTION_KEY environment variable to be set with a valid AES key, encoded as a Base64 string.
You can generate a Base64-encoded AES-compatible key using the following command:
❯ openssl rand -base64 16
This command works on Linux and macOS, and on Windows if using a tool like Cygwin.
Once generated, set the Base64-encoded key as the value of the GRADLE_ENCRYPTION_KEY environment variable:
❯ export GRADLE_ENCRYPTION_KEY="your-generated-key-here"