The Java plugin adds Java compilation along with testing and bundling capabilities to a project. It serves as the basis for many of the other Gradle plugins.
To use the Java plugin, include the following in your build script:
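For the Gradle versions this chapter covers, this is the standard one-line application:

```groovy
apply plugin: 'java'
```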
The Java plugin introduces the concept of a source set. A source set is simply a group of source files which are compiled and executed together. These source files may include Java source files and resource files. Other plugins add the ability to include Groovy and Scala source files in a source set. A source set has an associated compile classpath, and runtime classpath.
One use for source sets is to group source files into logical groups which describe their purpose. For example, you might use a source set to define an integration test suite, or you might use separate source sets to define the API and implementation classes of your project.
The Java plugin defines two standard source sets, called `main` and `test`. The `main` source set contains your production source code, which is compiled and assembled into a JAR file. The `test` source set contains your test source code, which is compiled and executed using JUnit or TestNG. These can be unit tests, integration tests, acceptance tests, or any combination that is useful to you.
The Java plugin adds a number of tasks to your project, as shown below.
`compileJava` (type: `JavaCompile`)
Compiles production Java source files using javac. Depends on all tasks which produce the compile classpath. This includes the `jar` task for project dependencies included in the `compile` configuration.

`processResources` (type: `Copy`)
Copies production resources into the production resources directory.

`classes` (type: `Task`)
Assembles the production classes and resources directories.

`compileTestJava` (type: `JavaCompile`)
Compiles test Java source files using javac. Depends on `compile`, plus all tasks which produce the test compile classpath.

`processTestResources` (type: `Copy`)
Copies test resources into the test resources directory.

`testClasses` (type: `Task`)
Assembles the test classes and resources directories. Depends on the `compileTestJava` task and the `processTestResources` task. Some plugins add additional test compilation tasks.

`jar` (type: `Jar`)
Assembles the JAR file. Depends on `compile`.

`javadoc` (type: `Javadoc`)
Generates API documentation for the production Java source, using Javadoc. Depends on `compile`.

`test` (type: `Test`)
Runs the unit tests using JUnit or TestNG. Depends on `compile`, `compileTest`, plus all tasks which produce the test runtime classpath.

`uploadArchives` (type: `Upload`)
Uploads artifacts in the `archives` configuration, including the JAR file. Depends on the tasks which produce the artifacts in the `archives` configuration, including `jar`.

`clean` (type: `Delete`)
Deletes the project build directory.

`cleanTaskName` (type: `Delete`)
Deletes files created by the specified task. For example, `cleanJar` will delete the JAR file created by the `jar` task, and `cleanTest` will delete the test results created by the `test` task.
For each source set you add to the project, the Java plugin adds the following compilation tasks:
`compileSourceSetJava` (type: `JavaCompile`)
Compiles the given source set’s Java source files using javac. Depends on all tasks which produce the source set’s compile classpath.

`processSourceSetResources` (type: `Copy`)
Copies the given source set’s resources into the resources directory.

`sourceSetClasses` (type: `Task`)
Assembles the given source set’s classes and resources directories. Depends on the `compileSourceSetJava` task and the `processSourceSetResources` task. Some plugins add additional compilation tasks for the source set.
The Java plugin also adds a number of tasks which form a lifecycle for the project:
`assemble` (type: `Task`)
Assembles all the archives in the project. Depends on all archive tasks in the project, including `jar`. Some plugins add additional archive tasks to the project.

`check` (type: `Task`)
Performs all verification tasks in the project. Depends on all verification tasks in the project, including `test`. Some plugins add additional verification tasks to the project.

`build` (type: `Task`)
Performs a full build of the project. Depends on `check` and `assemble`.

`buildNeeded` (type: `Task`)
Performs a full build of the project and all projects it depends on. Depends on the `build` and `buildNeeded` tasks in all project lib dependencies of the `testRuntime` configuration.

`buildDependents` (type: `Task`)
Performs a full build of the project and all projects which depend on it. Depends on the `build` and `buildDependents` tasks in all projects with a project lib dependency on this project in a `testRuntime` configuration.

`buildConfigName` (type: `Task`)
Assembles the artifacts in the specified configuration. The task is added by the Base plugin which is implicitly applied by the Java plugin. Depends on the tasks which produce the artifacts in configuration ConfigName.

`uploadConfigName` (type: `Upload`)
Assembles and uploads the artifacts in the specified configuration. The task is added by the Base plugin which is implicitly applied by the Java plugin. Depends on the tasks which upload the artifacts in configuration ConfigName.
The following diagram shows the relationships between these tasks.
The Java plugin assumes the project layout shown below. None of these directories need to exist or have anything in them. The Java plugin compiles whatever it finds and handles anything which is missing.
Java plugin - default project layout
| Directory | Meaning |
| --- | --- |
| `src/main/java` | Production Java source |
| `src/main/resources` | Production resources |
| `src/test/java` | Test Java source |
| `src/test/resources` | Test resources |
| `src/sourceSet/java` | Java source for the given source set |
| `src/sourceSet/resources` | Resources for the given source set |
You configure the project layout by configuring the appropriate source set. This is discussed in more detail in the following sections. Here is a brief example which changes the main Java and resource source directories.
Example: Custom Java source layout
build.gradle
```groovy
sourceSets {
    main {
        java {
            srcDirs = ['src/java']
        }
        resources {
            srcDirs = ['src/resources']
        }
    }
}
```
The Java plugin adds a number of dependency configurations to your project, as shown below. It assigns those configurations to tasks such as `compileJava` and `test`.

`compile`
Compile time dependencies.

`compileOnly`
Compile time only dependencies, not used at runtime.

`compileClasspath` (extends `compile`, `compileOnly`)
Compile classpath, used when compiling source. Used by the `compileJava` task.

`annotationProcessor`
Annotation processors used during compilation.

`runtime` (extends `compile`)
Runtime dependencies.

`testCompile` (extends `compile`)
Additional dependencies for compiling tests.

`testCompileOnly`
Additional dependencies only for compiling tests, not used at runtime.

`testCompileClasspath` (extends `testCompile`, `testCompileOnly`)
Test compile classpath, used when compiling test sources. Used by the `compileTestJava` task.

`testRuntime` (extends `runtime`, `testCompile`)
Additional dependencies for running tests only. Used by the `test` task.

`archives`
Artifacts (e.g. jars) produced by this project. Used by the `uploadArchives` task.

`default` (extends `runtime`)
The default configuration used by a project dependency on this project. Contains the artifacts and dependencies required by this project at runtime.
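As a sketch, declaring dependencies against these configurations might look like the following (the coordinates are illustrative):

```groovy
dependencies {
    // production compile-time dependency
    compile 'org.slf4j:slf4j-api:1.7.25'
    // needed to compile, but not packaged or used at runtime
    compileOnly 'javax.servlet:javax.servlet-api:3.1.0'
    // additional dependency for compiling and running tests
    testCompile 'junit:junit:4.12'
}
```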
For each source set you add to the project, the Java plugin adds the following dependency configurations:

`sourceSetCompile`
Compile time dependencies for the given source set.

`sourceSetCompileOnly`
Compile time only dependencies for the given source set, not used at runtime.

`sourceSetCompileClasspath` (extends `sourceSetCompile`, `sourceSetCompileOnly`)
Compile classpath, used when compiling the source set’s sources. Used by the `compileSourceSetJava` task.

`sourceSetAnnotationProcessor`
Annotation processors used during compilation of this source set.

`sourceSetRuntime` (extends `sourceSetCompile`)
Runtime dependencies for the given source set.
The Java plugin adds a number of convention properties to the project, shown below. You can use these properties in your build script as though they were properties of the project object.

String `reportsDirName`
The name of the directory to generate reports into, relative to the build directory. Default value: `reports`

(read-only) File `reportsDir`
The directory to generate reports into. Default value: `buildDir/reportsDirName`

String `testResultsDirName`
The name of the directory to generate test result .xml files into, relative to the build directory. Default value: `test-results`

(read-only) File `testResultsDir`
The directory to generate test result .xml files into. Default value: `buildDir/testResultsDirName`

String `testReportDirName`
The name of the directory to generate the test report into, relative to the reports directory. Default value: `tests`

(read-only) File `testReportDir`
The directory to generate the test report into. Default value: `reportsDir/testReportDirName`

String `libsDirName`
The name of the directory to generate libraries into, relative to the build directory. Default value: `libs`

(read-only) File `libsDir`
The directory to generate libraries into. Default value: `buildDir/libsDirName`

String `distsDirName`
The name of the directory to generate distributions into, relative to the build directory. Default value: `distributions`

(read-only) File `distsDir`
The directory to generate distributions into. Default value: `buildDir/distsDirName`

String `docsDirName`
The name of the directory to generate documentation into, relative to the build directory. Default value: `docs`

(read-only) File `docsDir`
The directory to generate documentation into. Default value: `buildDir/docsDirName`

String `dependencyCacheDirName`
The name of the directory to use to cache source dependency information, relative to the build directory. Default value: `dependency-cache`

(read-only) `SourceSetContainer` `sourceSets`
Contains the project’s source sets. Default value: Not null

`JavaVersion` `sourceCompatibility`
Java version compatibility to use when compiling Java source. Default value: version of the current JVM in use. Can also be set using a String or a Number, e.g. `'1.5'` or `1.5`.

`JavaVersion` `targetCompatibility`
Java version to generate classes for. Default value: `sourceCompatibility`. Can also be set using a String or Number, e.g. `'1.5'` or `1.5`.

String `archivesBaseName`
The basename to use for archives, such as JAR or ZIP files. Default value: `projectName`

`Manifest` `manifest`
The manifest to include in all JAR files. Default value: an empty manifest.

These properties are provided by convention objects of type `JavaPluginConvention` and `BasePluginConvention`.
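As a sketch, some of these convention properties can be set directly in the build script (the values are illustrative):

```groovy
sourceCompatibility = '1.8'   // compile for Java 8
targetCompatibility = '1.8'
archivesBaseName = 'my-app'   // hypothetical archive base name
libsDirName = 'artifacts'     // libraries go to build/artifacts
```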
You can access the source sets of a project using the `sourceSets` property. This is a container for the project’s source sets, of type `SourceSetContainer`. There is also a `sourceSets { }` script block, to which you can pass a closure to configure the source set container. The source set container works much the same way as other containers, such as `tasks`.
Example: Accessing a source set
build.gradle
```groovy
// Various ways to access the main source set
println sourceSets.main.output.classesDirs
println sourceSets['main'].output.classesDirs
sourceSets {
    println main.output.classesDirs
}
sourceSets {
    main {
        println output.classesDirs
    }
}

// Iterate over the source sets
sourceSets.all {
    println name
}
```
To configure an existing source set, you simply use one of the above access methods to set the properties of the source set. The properties are described below. Here is an example which configures the main Java and resources directories:
Example: Configuring the source directories of a source set
build.gradle
```groovy
sourceSets {
    main {
        java {
            srcDirs = ['src/java']
        }
        resources {
            srcDirs = ['src/resources']
        }
    }
}
```
The following table lists some of the important properties of a source set. You can find more details in the API documentation for `SourceSet`.
(read-only) String `name`
The name of the source set, used to identify it. Default value: Not null

(read-only) `SourceSetOutput` `output`
The output files of the source set, containing its compiled classes and resources. Default value: Not null

`FileCollection` `output.classesDirs`
The directories to generate the classes of this source set into. Default value: Not null

File `output.resourcesDir`
The directory to generate the resources of this source set into. Default value: `buildDir/resources/name`

`FileCollection` `compileClasspath`
The classpath to use when compiling the source files of this source set. Default value: the `sourceSetCompileClasspath` configuration.

`FileCollection` `annotationProcessorPath`
The processor path to use when compiling the source files of this source set. Default value: the `sourceSetAnnotationProcessor` configuration.

`FileCollection` `runtimeClasspath`
The classpath to use when executing the classes of this source set. Default value: `output` + the `sourceSetRuntimeClasspath` configuration.

(read-only) `SourceDirectorySet` `java`
The Java source files of this source set. Contains only `.java` files found in the Java source directories, and excludes all other files. Default value: Not null

Set<File> `java.srcDirs`
The source directories containing the Java source files of this source set. Default value: `[projectDir/src/name/java]`. Can be set using anything described in the section called “Specifying a set of input files”.

File `java.outputDir`
The directory to generate compiled Java sources into. Default value: `buildDir/classes/java/sourceSetName`. Can be set using anything described in the section called “Locating files”.

(read-only) `SourceDirectorySet` `resources`
The resources of this source set. Contains only resources, and excludes any `.java` files found in the resource source directories. Other plugins, such as the Groovy plugin, exclude additional types of files from this collection. Default value: Not null

Set<File> `resources.srcDirs`
The source directories containing the resources of this source set. Default value: `[projectDir/src/name/resources]`. Can be set using anything described in the section called “Specifying a set of input files”.

(read-only) `SourceDirectorySet` `allJava`
All `.java` files of this source set. Some plugins, such as the Groovy plugin, add additional Java source files to this collection. Default value: `java`

(read-only) `SourceDirectorySet` `allSource`
All source files of this source set. This includes all resource files and all Java source files. Some plugins, such as the Groovy plugin, add additional source files to this collection. Default value: `resources + java`
To define a new source set, you simply reference it in the `sourceSets { }` block. Here’s an example:
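A minimal sketch, consistent with the `intTest` source set used in the following examples:

```groovy
sourceSets {
    intTest
}
```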
When you define a new source set, the Java plugin adds some dependency configurations for the source set, as shown in the section called “SourceSet dependency configurations”. You can use these configurations to define the compile and runtime dependencies of the source set.
Example: Defining source set dependencies
build.gradle
```groovy
sourceSets {
    intTest
}

dependencies {
    intTestCompile 'junit:junit:4.12'
    intTestRuntime 'org.ow2.asm:asm-all:4.0'
}
```
The Java plugin also adds a number of tasks which assemble the classes for the source set, as shown in the section called “SourceSet Tasks”. For example, for a source set called `intTest`, compiling the classes for this source set is done by running `gradle intTestClasses`.
Example: Compiling a source set
Output of gradle intTestClasses
```
> gradle intTestClasses
:compileIntTestJava
:processIntTestResources
:intTestClasses

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
```
Adding a JAR containing the classes of a source set:
Example: Assembling a JAR for a source set
build.gradle
```groovy
task intTestJar(type: Jar) {
    from sourceSets.intTest.output
}
```
Generating Javadoc for a source set:
Example: Generating the Javadoc for a source set
build.gradle
```groovy
task intTestJavadoc(type: Javadoc) {
    source sourceSets.intTest.allJava
}
```
Adding a test suite to run the tests in a source set:
Example: Running tests in a source set
build.gradle
```groovy
task intTest(type: Test) {
    testClassesDirs = sourceSets.intTest.output.classesDirs
    classpath = sourceSets.intTest.runtimeClasspath
}
```
The `javadoc` task is an instance of `Javadoc`. It supports the core Javadoc options and the options of the standard doclet described in the reference documentation of the Javadoc executable. For a complete list of supported Javadoc options consult the API documentation of the following classes: `CoreJavadocOptions` and `StandardJavadocDocletOptions`.

`FileCollection` `classpath`
Default value: `sourceSets.main.output` + `sourceSets.main.compileClasspath`

`FileTree` `source`
Default value: `sourceSets.main.allJava`. Can be set using anything described in the section called “Specifying a set of input files”.

File `destinationDir`
Default value: `docsDir/javadoc`

String `title`
Default value: The name and version of the project
The `clean` task is an instance of `Delete`. It simply removes the directory denoted by its `dir` property.
The Java plugin uses the `Copy` task for resource handling. It adds an instance for each source set in the project. You can find out more about the copy task in the section called “Copying files”.

Object `srcDirs`
Default value: `sourceSet.resources`. Can be set using anything described in the section called “Specifying a set of input files”.

File `destinationDir`
Default value: `sourceSet.output.resourcesDir`. Can be set using anything described in the section called “Locating files”.
The Java plugin adds a `JavaCompile` instance for each source set in the project. Some of the most common configuration options are shown below.

`FileCollection` `classpath`
Default value: `sourceSet.compileClasspath`

`FileTree` `source`
Default value: `sourceSet.java`. Can be set using anything described in the section called “Specifying a set of input files”.

File `destinationDir`
Default value: `sourceSet.java.outputDir`
By default, the Java compiler runs in the Gradle process. Setting `options.fork` to `true` causes compilation to occur in a separate process. In the case of the Ant javac task, this means that a new process will be forked for each compile task, which can slow down compilation. Conversely, Gradle’s direct compiler integration (see above) will reuse the same compiler process as much as possible. In both cases, all fork options specified with `options.forkOptions` will be honored.
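As a sketch, forking the compiler with explicit fork options might look like this (the memory setting is illustrative):

```groovy
compileJava {
    // compile in a separate process instead of the Gradle process
    options.fork = true
    // fork options are honored by both the Ant javac task and Gradle's compiler integration
    options.forkOptions.memoryMaximumSize = '512m'
}
```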
Starting with Gradle 2.1, it is possible to compile Java incrementally. See the `JavaCompile` task for information on how to enable it.
Main goals for incremental compilations are:
Avoid wasting time compiling source classes that don’t have to be compiled. This means faster builds, especially when a change to a source class or a jar does not incur recompilation of many source classes that depend on the changed input.
Change as few output classes as possible. Classes that don’t need to be recompiled remain unchanged in the output directory. An example scenario when this is really useful is using JRebel - the fewer output classes are changed the quicker the JVM can use refreshed classes.
The incremental compilation at a high level:
The detection of the correct set of stale classes is reliable at some expense of speed. The algorithm uses bytecode analysis and deals gracefully with compiler optimizations (inlining of non-private constants), transitive class dependencies, etc. Example: When a class with a public constant changes, we eagerly compile classes that use the same constants to avoid problems with constants inlined by the compiler.
To make incremental compilation fast, we cache class analysis results and jar snapshots. The initial incremental compilation can be slower due to the cold caches.
If a compile task fails due to a compile error, it will do a full compilation again the next time it is invoked.
Because of type erasure, the incremental compiler is not able to recognize when a type is only used in a type parameter, and never actually used in the code. For example, imagine that you have the following code:

```java
List<? extends A> list = Lists.newArrayList();
```

If no member of `A` is in practice used in the code, then changes to `A` will not trigger recompilation of the class. In practice, this should very rarely be an issue.
If a dependent project has changed in an ABI-compatible way (only its private API has changed), then Java compilation tasks will be up-to-date. This means that if project `A` depends on project `B` and a class in `B` is changed in an ABI-compatible way (typically, changing only the body of a method), then Gradle won’t recompile `A`.
Some types of changes do not affect the public API and are ignored:

- Changing a method body
- Changing a comment
- Adding, removing or changing private methods, fields, or inner classes
- Adding, removing or changing a resource
- Changing the name of jars or directories in the classpath
- Renaming a parameter
Compile-avoidance is deactivated if annotation processors are found on the compile classpath, because for annotation processors the implementation details matter. Annotation processors should be declared on the annotation processor path instead. Gradle 5.0 will ignore processors on the compile classpath.
Example: Declaring annotation processors
build.gradle
```groovy
dependencies {
    // The dagger compiler and its transitive dependencies will only be found on annotation processing classpath
    annotationProcessor 'com.google.dagger:dagger-compiler:2.8'

    // And we still need the Dagger library on the compile classpath itself
    implementation 'com.google.dagger:dagger:2.8'
}
```
The `test` task is an instance of `Test`. It automatically detects and executes all unit tests in the `test` source set. It also generates a report once test execution is complete. JUnit and TestNG are both supported. Have a look at `Test` for the complete API.

Tests are executed in a separate JVM, isolated from the main build process. The `Test` task’s API allows you some control over how this happens.
There are a number of properties which control how the test process is launched. This includes things such as system properties, JVM arguments, and the Java executable to use.
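For example, the launch of the test process might be configured like this (a sketch; the values are illustrative):

```groovy
test {
    // pass a system property to the test JVM
    systemProperty 'some.prop', 'value'
    // set JVM arguments for the test process
    jvmArgs '-Xmx1g'
    // set an environment variable for the test process
    environment 'ENV_NAME', 'value'
}
```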
The test process can exit unexpectedly if configured incorrectly. For instance, if the Java executable does not exist or an invalid JVM argument is provided, the test process will fail to start. Similarly, if a test makes programmatic changes to the test process, this can also cause unexpected failures. For example, issues may occur if a `SecurityManager` is modified in a test, because Gradle’s internal messaging depends on reflection and socket communication, which may be disrupted if the permissions on the security manager change. In this case, you should restore the original `SecurityManager` after the test so that the Gradle test worker process can continue to function.
You can specify whether or not to execute your tests in parallel. Gradle provides parallel test execution by running multiple test processes concurrently. Each test process executes only a single test at a time, so you generally don’t need to do anything special to your tests to take advantage of this. The `maxParallelForks` property specifies the maximum number of test processes to run at any given time. The default is 1, that is, do not execute the tests in parallel.
The test process sets the `org.gradle.test.worker` system property to a unique identifier for that test process, which you can use, for example, in file names or other resource identifiers.
You can specify that test processes should be restarted after they have executed a certain number of test classes. This can be a useful alternative to giving your test process a very large heap. The `forkEvery` property specifies the maximum number of test classes to execute in a test process. The default is to execute an unlimited number of tests in each test process.
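A sketch combining the two process-control properties described above (the values are illustrative):

```groovy
test {
    // run up to 4 test processes concurrently
    maxParallelForks = 4
    // restart a test process after it has executed 100 test classes
    forkEvery = 100
}
```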
The task has an `ignoreFailures` property to control the behavior when tests fail. The `Test` task always executes every test that it detects. It stops the build afterwards if `ignoreFailures` is false and there are failing tests. The default value of `ignoreFailures` is false.
The `testLogging` property allows you to configure which test events are going to be logged and at which detail level. By default, a concise message will be logged for every failed test. See `TestLoggingContainer` for how to tune test logging to your preferences.
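For instance, a build might log more events than the default (a sketch; the event names are standard test-logging events):

```groovy
test {
    testLogging {
        // log these events at the lifecycle level
        events 'passed', 'skipped', 'failed'
    }
}
```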
Projects with large test suites can take a long time to execute even though a failure occurred early on, leading to unnecessary wait times (especially on CI). To short circuit this behavior, the `Test.getFailFast()` property allows you to cause the `test` task to fail after the first test failure instead of running all tests. When this property is true, the resulting output will only show the results of tests that have completed up to and including the failure. To enable this fail fast behavior in your build file, set the `failFast` property to `true`:
```groovy
test {
    failFast = true
}
```
The `--fail-fast` command line option enables the behavior from the command line. An invocation looks like:
gradle integTest --fail-fast
The default value for the `failFast` property is `false`.
The test task provides a `Test.getDebug()` property that can be set to make the JVM wait for a debugger to attach to port 5005 before proceeding with test execution. This can also be enabled at invocation time via the `--debug-jvm` task option (since Gradle 1.12).
Starting with Gradle 1.10, it is possible to include only specific tests, based on the test name pattern. Filtering is a different mechanism than the test class inclusion / exclusion that will be described in the next few paragraphs (`-Dtest.single`, `test.include` and friends). The latter is based on files, e.g. the physical location of the test implementation class. File-level test selection does not support many interesting scenarios that are possible with test-level filtering. Some of them Gradle handles now and some will be satisfied in future releases:

- Filtering at the level of specific test methods; executing a single test method
- Filtering based on custom annotations (future)
- Filtering based on test hierarchy; executing all tests that extend a certain base class (future)
- Filtering based on some custom runtime rule, e.g. a particular value of a system property or some static state (future)
The test filtering feature has the following characteristics:
- A fully qualified class name or fully qualified method name is supported, e.g. `org.gradle.SomeTest`, `org.gradle.SomeTest.someMethod`
- The wildcard '*' is supported for matching any characters
- The command line option `--tests` is provided to conveniently extend the test filter for an individual Gradle execution. This is especially useful for the classic 'single test method execution' use case. When the command line option is used, the inclusions declared in the build script are still honored. That is, the command line filters are always applied on top of the filter definition in the build script. It is possible to supply multiple `--tests` options, and tests matching any of those patterns will be included.
- Gradle tries to filter the tests given the limitations of the test framework API. Some advanced, synthetic tests may not be fully compatible with filtering. However, the vast majority of tests and use cases should be handled neatly.
Test filtering supersedes the file-based test selection. The latter may be completely replaced in future. We will grow the test filtering API and add more kinds of filters.
Example: Filtering tests in the build script
build.gradle
```groovy
test {
    filter {
        // include specific method in any of the tests
        includeTestsMatching "*UiCheck"

        // include all tests from package
        includeTestsMatching "org.gradle.internal.*"

        // include all integration tests
        includeTestsMatching "*IntegTest"
    }
}
```
For more details and examples please see the `TestFilter` reference.
Some examples of using the command line option:
gradle test --tests org.gradle.SomeTest.someSpecificFeature
gradle test --tests \*SomeTest.someSpecificFeature
gradle test --tests \*SomeSpecificTest
gradle test --tests \*SomeSpecificTestSuite
gradle test --tests all.in.specific.package\*
gradle test --tests \*IntegTest
gradle test --tests \*IntegTest\*ui\*
gradle test --tests "com.example.MyTestSuite"
gradle test --tests "com.example.ParameterizedTest"
gradle test --tests "*ParameterizedTest.foo*"
gradle test --tests "*ParameterizedTest.*[2]"
gradle someTestTask --tests \*UiTest someOtherTestTask --tests \*WebTest\*ui
This is something you can combine with continuous build using `--continuous` (or `-t`, for short) to re-execute a subset of tests immediately after every change.
gradle test --continuous --tests "com.mypackage.foo.*"
This mechanism has been superseded by 'Test Filtering', described above.
Setting a system property of `taskName.single = testNamePattern` will only execute tests that match the specified `testNamePattern`. The `taskName` can be a full multi-project path like `:sub1:sub2:test` or just the task name. The `testNamePattern` will be used to form an include pattern of `**/testNamePattern*.class`. If no tests with this pattern can be found, an exception is thrown. This is to shield you from false security. If tests of more than one subproject are executed, the pattern is applied to each subproject. An exception is thrown if no tests can be found for a particular subproject. In such a case you can use the path notation of the pattern, so that the pattern is applied only to the test task of a specific subproject. Alternatively you can specify the fully qualified task name to be executed. You can also specify multiple patterns. Examples:
gradle -Dtest.single=ThisUniquelyNamedTest test
gradle -Dtest.single=a/b/ test
gradle -DintegTest.single=\*IntegrationTest integTest
gradle -D:proj1:test.single=Customer build
gradle -D:proj1:integTest.single=c/d/
The `Test` task detects which classes are test classes by inspecting the compiled test classes. By default it scans all `.class` files. You can set custom includes / excludes; only those classes will be scanned. Depending on the test framework used (JUnit / TestNG), the test class detection uses different criteria.

When using JUnit, we scan for both JUnit 3 and 4 test classes. If any of the following criteria match, the class is considered to be a JUnit test class:

- The class or a super class extends `TestCase` or `GroovyTestCase`
- The class or a super class is annotated with `@RunWith`
- The class or a super class contains a method annotated with `@Test`
When using TestNG, we scan for methods annotated with `@Test`.
Note that abstract classes are not executed. Gradle also scans up the inheritance tree into jar files on the test classpath.
If you don’t want to use test class detection, you can disable it by setting `scanForTestClasses` to `false`. This will make the test task only use includes / excludes to find test classes. If `scanForTestClasses` is false and no include / exclude patterns are specified, the defaults are `**/*Tests.class`, `**/*Test.class` and `**/Abstract*.class` for include and exclude, respectively.
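A sketch of disabling detection and relying on include / exclude patterns only:

```groovy
test {
    // turn off inspection of compiled classes
    scanForTestClasses = false
    // only classes matching this pattern are treated as tests
    include '**/*Test.class'
    exclude '**/Abstract*.class'
}
```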
With JUnit Platform, only `includes` / `excludes` is used to filter test classes; `scanForTestClasses` has no effect.
JUnit/JUnit Platform and TestNG allow sophisticated groupings of test methods.
For grouping JUnit 4 test classes and methods, JUnit 4.8 introduces the concept of categories.[14] The `test` task allows the specification of the JUnit categories you want to include and exclude.
Example: JUnit Categories
build.gradle
```groovy
test {
    useJUnit {
        includeCategories 'org.gradle.junit.CategoryA'
        excludeCategories 'org.gradle.junit.CategoryB'
    }
}
```
In JUnit Platform, tagging is introduced instead of categories. You can specify the included/excluded tags as follows:
Example: JUnit Platform Tags
build.gradle
```groovy
test {
    useJUnitPlatform {
        includeTags 'fast'
        excludeTags 'slow'
    }
}
```
The TestNG framework has a quite similar concept. In TestNG you can specify different test groups.[15] The test groups that should be included or excluded from the test execution can be configured in the test task.
Example: Grouping TestNG tests
build.gradle
```groovy
test {
    useTestNG {
        excludeGroups 'integrationTests'
        includeGroups 'unitTests'
    }
}
```
JUnit 5 is the latest version of the well-known JUnit test framework. Unlike its predecessor, JUnit 5 is modularized and composed of several modules:

JUnit 5 = JUnit Platform + JUnit Jupiter + JUnit Vintage

The JUnit Platform serves as a foundation for launching testing frameworks on the JVM. JUnit Jupiter is the combination of the new programming model and extension model for writing tests and extensions in JUnit 5. JUnit Vintage provides a `TestEngine` for running JUnit 3 and JUnit 4 based tests on the platform.
The following code enables JUnit Platform support in build.gradle
:
test {
    useJUnitPlatform()
}
See Test.useJUnitPlatform()
for more details.
There are some known limitations of JUnit 5 support, e.g. tests in static nested classes won’t be discovered, and test classes are still displayed by their class name instead of @DisplayName. These will be fixed in a future version of Gradle. If you find more, please don’t hesitate to tell us: https://github.com/gradle/gradle/issues/new
To enable JUnit Jupiter support, add the following dependencies:
Example: JUnit Jupiter dependencies
build.gradle
dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.1.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.1.0'
}
Put the following code into src/test/java
:
Example: Jupiter test example
src/test/java/org/gradle/junitplatform/JupiterTest.java
package org.gradle.junitplatform;

import org.junit.jupiter.api.*;

public class JupiterTest {
    @Test
    public void ok() {
        System.out.println("Hello from JUnit Jupiter!");
    }
}
Now you can run gradle test
to see the results.
A Jupiter sample can be found at samples/testing/junitplatform/jupiter
in the '-all' distribution of Gradle.
If you want to run JUnit 3/4 tests on JUnit Platform, or even mix them with Jupiter tests, you should add extra JUnit Vintage Engine dependencies:
Example: JUnit Vintage dependencies
build.gradle
dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.1.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.1.0'
    testCompileOnly 'junit:junit:4.12'
    testRuntimeOnly 'org.junit.vintage:junit-vintage-engine:5.1.0'
}
In this way, you can use gradle test to run JUnit 3/4 tests on JUnit Platform without the need to rewrite them.
A sample of mixed tests can be found at samples/testing/junitplatform/engine
in the '-all' distribution of Gradle.
TestNG allows explicit control of the execution order of tests.
The preserveOrder
property controls whether tests are executed in deterministic order. Preserving the order guarantees that the complete test (including @BeforeXXX
and @AfterXXX
) is run in a test thread before the next test is run. While preserving the order of tests is the default behavior when working directly with testng.xml files, the TestNG API that is used to run tests programmatically, as well as Gradle’s TestNG integration, executes tests in an unpredictable order by default.[16] Preserving the order of tests was introduced with TestNG version 5.14.5. Setting the preserveOrder property to true for an older TestNG version will cause the build to fail.
The groupByInstances property controls whether tests should be grouped by instances. Grouping by instances resolves test method dependencies for each instance separately, instead of running the dependees of all instances before running the dependants. The default behavior is not to group tests by instances.[17] Grouping tests by instances was introduced with TestNG version 6.1. Setting the groupByInstances
property to true
for an older TestNG version will cause the build to fail.
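As a sketch, both options can be set inside the useTestNG block (assuming a TestNG version that supports them, as noted above):

```groovy
test {
    useTestNG {
        preserveOrder true      // requires TestNG 5.14.5 or later
        groupByInstances true   // requires TestNG 6.1 or later
    }
}
```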
The Test
task generates the following results by default.
An HTML test report.
The results in an XML format that is compatible with the Ant JUnit report task. This format is supported by many other tools, such as CI servers.
Results in an efficient binary format. The task generates the other results from these binary results.
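The output locations can be changed through the test task's reports container; a minimal sketch with illustrative destination paths:

```groovy
test {
    reports {
        html.destination = file("$buildDir/reports/unit-tests")   // HTML report
        junitXml.destination = file("$buildDir/test-xml")         // Ant-compatible XML results
    }
}
```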
There is also a stand-alone TestReport
task type which can generate the HTML test report from the binary results generated by one or more Test
task instances. To use this task type, you need to define a destinationDir
and the test results to include in the report. Here is a sample which generates a combined report for the unit tests from subprojects:
Example: Creating a unit test report for subprojects
build.gradle
subprojects {
    apply plugin: 'java'

    // Disable the test report for the individual test task
    test {
        reports.html.enabled = false
    }
}

task testReport(type: TestReport) {
    destinationDir = file("$buildDir/reports/allTests")
    // Include the results from the `test` task in all subprojects
    reportOn subprojects*.test
}
You should note that the TestReport
type combines the results from multiple test tasks and needs to aggregate the results of individual test classes. This means that if a given test class is executed by multiple test tasks, then the test report will include executions of that class, but it can be hard to distinguish individual executions of that class and their output.
TestNG supports parameterizing test methods, allowing a particular test method to be executed multiple times with different inputs. Gradle includes the parameter values in its reporting of the test method execution.
Given a parameterized test method named aTestMethod
that takes two parameters, it will be reported with the name: aTestMethod(toStringValueOfParam1, toStringValueOfParam2)
. This makes identifying the parameter values for a particular iteration easy.
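As an illustration, a hypothetical parameterized TestNG method might look like this; with the data provider below, the report would show the executions as aTestMethod(1, a) and aTestMethod(2, b):

```java
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class ParameterizedTest {
    @DataProvider(name = "inputs")
    public Object[][] inputs() {
        return new Object[][] { { 1, "a" }, { 2, "b" } };
    }

    // Reported once per row of the data provider, with the parameter
    // values appended to the method name
    @Test(dataProvider = "inputs")
    public void aTestMethod(int number, String letter) {
        // assertions on number and letter would go here
    }
}
```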
File testClassesDirs
Default value: sourceSets.test.output.classesDirs
FileCollection classpath
Default value: sourceSets.test.runtimeClasspath
File testResultsDir
Default value: testResultsDir
File testReportDir
Default value: testReportDir
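These properties can also be set on additional Test tasks, for example to run the tests of a custom source set. The integTest source set in this sketch is hypothetical and would need to be defined elsewhere in the build:

```groovy
task integTest(type: Test) {
    // 'integTest' is a hypothetical source set defined elsewhere in the build
    testClassesDirs = sourceSets.integTest.output.classesDirs
    classpath = sourceSets.integTest.runtimeClasspath
}
```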
The jar
task creates a JAR file containing the class files and resources of the project. The JAR file is declared as an artifact in the archives
dependency configuration. This means that the JAR is available in the classpath of a dependent project. If you upload your project into a repository, this JAR is declared as part of the dependency descriptor. You can learn more about how to work with archives in the section called “Creating archives” and artifact configurations in Publishing artifacts.
Each jar or war object has a manifest
property with a separate instance of Manifest
. When the archive is generated, a corresponding MANIFEST.MF
file is written into the archive.
Example: Customization of MANIFEST.MF
build.gradle
jar {
    manifest {
        attributes("Implementation-Title": "Gradle",
                   "Implementation-Version": version)
    }
}
You can create stand-alone instances of a Manifest. You can use that, for example, to share manifest information between jars.
Example: Creating a manifest object.
build.gradle
ext.sharedManifest = manifest {
    attributes("Implementation-Title": "Gradle",
               "Implementation-Version": version)
}

task fooJar(type: Jar) {
    manifest = project.manifest {
        from sharedManifest
    }
}
You can merge other manifests into any Manifest object. The other manifests may be described either by a file path or, as in the example above, by a reference to another Manifest object.
Example: Separate MANIFEST.MF for a particular archive
build.gradle
task barJar(type: Jar) {
    manifest {
        attributes key1: 'value1'
        from sharedManifest, 'src/config/basemanifest.txt'
        from('src/config/javabasemanifest.txt',
             'src/config/libbasemanifest.txt') {
            eachEntry { details ->
                if (details.baseValue != details.mergeValue) {
                    details.value = baseValue
                }
                if (details.key == 'foo') {
                    details.exclude()
                }
            }
        }
    }
}
Manifests are merged in the order they are declared in the from statement. If the base manifest and the merged manifest both define values for the same key, the merged manifest wins by default. You can fully customize the merge behavior by adding eachEntry actions in which you have access to a ManifestMergeDetails instance for each entry of the resulting manifest. The merge is not immediately triggered by the from statement; it is done lazily, either when generating the jar, or by calling writeTo or effectiveManifest.
You can easily write a manifest to disk.
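For example, the effective manifest of the jar task can be written to a file with writeTo (the target path is illustrative):

```groovy
// Triggers the lazy merge and writes the resulting manifest to disk
jar.manifest.writeTo("$buildDir/mymanifest.mf")
```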
How to upload your archives is described in Publishing artifacts.
Gradle can only run on Java version 7 or higher. However, support for running Gradle on Java 7 has been deprecated and is scheduled to be removed in Gradle 5.0. There are two reasons for deprecating support for Java 7:
Java 7 reached end of life. Therefore, Oracle ceased public availability of security fixes and upgrades for Java 7 as of April 2015.
Once support for Java 7 has ceased (likely with Gradle 5.0), Gradle’s implementation can start to use Java 8 APIs optimized for performance and usability.
Gradle still supports compiling, testing, generating Javadoc and executing applications for Java 6 and Java 7. Java 5 is not supported.
To use Java 6 or Java 7, the following tasks need to be configured:
The JavaCompile task to fork and use the correct Java home
The Javadoc task to use the correct javadoc executable
The Test and JavaExec tasks to use the correct java executable
The following sample shows how the build.gradle needs to be adjusted. To make the build machine-independent, the location of the old Java home and the target version should be configured in GRADLE_USER_HOME/gradle.properties[18] in the user’s home directory on each developer machine, as shown in the example.
Example: Configure Java 6 build
gradle.properties
# in $HOME/.gradle/gradle.properties
javaHome=/Library/Java/JavaVirtualMachines/1.7.0.jdk/Contents/Home
targetJavaVersion=1.7
build.gradle
assert hasProperty('javaHome'): "Set the property 'javaHome' in your gradle.properties pointing to a Java 6 or 7 installation"
assert hasProperty('targetJavaVersion'): "Set the property 'targetJavaVersion' in your gradle.properties to '1.6' or '1.7'"

sourceCompatibility = targetJavaVersion

def javaExecutablesPath = new File(javaHome, 'bin')
def javaExecutables = [:].withDefault { execName ->
    def executable = new File(javaExecutablesPath, execName)
    assert executable.exists(): "There is no ${execName} executable in ${javaExecutablesPath}"
    executable
}

tasks.withType(AbstractCompile) {
    options.with {
        fork = true
        forkOptions.javaHome = file(javaHome)
    }
}
tasks.withType(Javadoc) {
    executable = javaExecutables.javadoc
}
tasks.withType(Test) {
    executable = javaExecutables.java
}
tasks.withType(JavaExec) {
    executable = javaExecutables.java
}
[14] The JUnit wiki contains a detailed description on how to work with JUnit categories: https://github.com/junit-team/junit/wiki/Categories.
[15] The TestNG documentation contains more details about test groups: http://testng.org/doc/documentation-main.html#test-groups.
[16] The TestNG documentation contains more details about test ordering when working with testng.xml files: http://testng.org/doc/documentation-main.html#testng-xml.
[17] The TestNG documentation contains more details about grouping tests by instances: http://testng.org/doc/documentation-main.html#dependencies-with-annotations.
[18] For more details on gradle.properties, see the section called “Gradle properties”.