danilo.pianini@unibo.it
Compiled on: 2024-11-21 — printable version
The process of creating tested deployable software artifacts
from source code
May include, depending on the system specifics:
Custom: select some phases that the product needs and perform them.
Standard: run a sequence of pre-defined actions/phases.
Automation of the build lifecycle
Different lifecycle types generate different build automation styles
Imperative: write a script that tells the system what to do to get from code to artifacts
Declarative: adhere to some convention, customizing some settings
Create a declarative infrastructure upon an imperative basis, and allow easy access to the underlying machinery
DSLs are helpful in this context: they can “hide” imperativity without ruling it out
Still, many challenges remain open:
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code to be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR
verify - run any checks on results of integration tests to ensure quality criteria are met
install - install the package into the local repository, for use as a dependency in other projects locally
deploy - done in the build environment, copies the final package to the remote repository for sharing with other developers and projects

What if there is no plugin for something peculiar of the project?
Rather than declaratively fit the build into a predefined lifecycle, declaratively define a build lifecycle
A paradigmatic example of a hybrid automator:
Let’s start as empty as possible, just point your terminal to an empty folder and:
touch build.gradle.kts
gradle tasks
Stuff happens: if nothing is specified,
Gradle considers the folder where it is invoked as a project
The project name matches the folder name
Let's understand what happened:
Welcome to Gradle <version>!
Here are the highlights of this release:
- Blah blah blah
Starting a Gradle Daemon (subsequent builds will be faster)
Up to there, it’s just performance stuff: Gradle uses a background service to speed up cacheable operations
> Task :tasks
------------------------------------------------------------
Tasks runnable from root project
------------------------------------------------------------
Build Setup tasks
-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.
Some tasks exist already! They are built-in. Let’s ignore them for now.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project '00-empty'.
components - Displays the components produced by root project '00-empty'. [incubating]
dependencies - Displays all dependencies declared in root project '00-empty'.
dependencyInsight - Displays the insight into a specific dependency in root project '00-empty'.
dependentComponents - Displays the dependent components of components in root project '00-empty'. [incubating]
help - Displays a help message.
model - Displays the configuration model of root project '00-empty'. [incubating]
outgoingVariants - Displays the outgoing variants of root project '00-empty'.
projects - Displays the sub-projects of root project '00-empty'.
properties - Displays the properties of root project '00-empty'.
tasks - Displays the tasks runnable from root project '00-empty'.
Informational tasks. Among them, the tasks task we just invoked.
It is time to create our first task
Create a build.gradle.kts file as follows:
tasks.register("brokenTask") { // creates a new task
println("this is executed at CONFIGURATION time!")
}
Now launch Gradle with gradle brokenTask:
gradle broken
this is executed at CONFIGURATION time!
BUILD SUCCESSFUL in 378ms
Looks ok, but it’s utterly broken
Try launching gradle tasks
❯ gradle tasks
> Task :tasks
------------------------------------------------------------
Tasks runnable from root project
------------------------------------------------------------
this is executed at CONFIGURATION time!
Build Setup tasks
Ouch!
Reason: the build script executes when Gradle is invoked, and configures tasks and dependencies.
Only later, when a task is invoked, the block gets actually executed
Let’s write a correct task
tasks.register("helloWorld") {
doLast { // This method takes as argument a Task.() -> Unit
println("Hello, World!")
}
}
Execution with gradle helloWorld
gradle helloWorld
> Task :helloWorld
Hello, World!
Delaying the execution allows for more flexible configuration
This will be especially useful when modifying existing behavior
tasks.register("helloWorld") {
doLast { println("Hello, World!") }
}
tasks.getByName("helloWorld") { // let's find an existing task
doFirst { // Similar to doLast, but adds operations in head
println("Configured later, executed first.")
}
}
gradle helloWorld
> Task :helloWorld
Configured later, executed first.
Hello, World!
Gradle offers some facilities to make writing new tasks easier
An example is the org.gradle.api.tasks.Exec task type, representing a command to be executed on the underlying command line.
At task registration time, it is possible to specify the task type: any open class implementing org.gradle.api.Task can be instantiated.
import org.gradle.internal.jvm.Jvm // Jvm is part of the Gradle API
tasks.register<Exec>("printJavaVersion") { // Do you recognize this? Inline function with reified type!
// The configuration action is of type T.() -> Unit, in this case Exec.() -> Unit
val javaExecutable = Jvm.current().javaExecutable.absolutePath
commandLine( // this is a method of class org.gradle.api.tasks.Exec
javaExecutable, "-version"
)
// There is no need of doLast / doFirst, actions are already configured
// Still, we may want to do something before or after the task has been executed
doLast { println("$javaExecutable invocation complete") }
doFirst { println("Ready to invoke $javaExecutable") }
}
> Task :printJavaVersion
Ready to invoke /usr/lib/jvm/java-11-openjdk/bin/java
openjdk version "11.0.8" 2020-07-14
OpenJDK Runtime Environment (build 11.0.8+10)
OpenJDK 64-Bit Server VM (build 11.0.8+10, mixed mode)
/usr/lib/jvm/java-11-openjdk/bin/java invocation complete
Let's try something more involved: compiling some Java source located in src.
If you know how to do it, then you can instruct a machine to do it
Compiling a Java source is just a matter of invoking the javac compiler:
Once you learn how some product is built, and you know how to build it by hand
you have all the knowledge required to automate its construction
Let's compile a simple src/HelloWorld.java:
class HelloWorld {
public static void main(String... args) {
System.out.println("Hello, World!");
}
}
Build logic: invoke javac, specifying a destination folder:
javac -d <destination> <files>
import org.gradle.internal.jvm.Jvm
tasks.register<Exec>("compileJava") {
val sources = findSources() // Collect the Java sources, if any
if (sources.isNotEmpty()) { // If the folder exists and there are files
val javacExecutable = Jvm.current().javacExecutable.absolutePath // Use the current JVM's javac
commandLine(
"$javacExecutable",
"-d", "$buildDir/bin", // destination folder: the output directory of Gradle, inside "bin"
*sources
)
}
// the task's doLast is inherited from Exec
}
Here is the findSources() function:
fun findSources(): Array<String> = projectDir // From the project
.listFiles { it: File -> it.isDirectory && it.name == "src" } // Find a folder named 'src'
?.firstOrNull() // If it's not there we're done
?.walk() // If it's there, iterate all its content (returns a Sequence<File>)
?.filter { it.extension == "java" } // Pick all Java files
?.map { it.absolutePath } // Map them to their absolute path
?.toList() // Sequences can't get converted to arrays, we must go through lists
?.toTypedArray() // Convert to Array<String>
?: emptyArray() // Yeah if anything's missing there are no sources
Execution:
gradle compileJava
BUILD SUCCESSFUL in 693ms
Compiled files are in build/bin!
Dependency management in Gradle relies on two fundamental concepts: configurations and dependencies.
Let's see a use case: compiling a Java source with a dependency.
In javac terms, we need to feed some jars to the -cp flag of the compiler.
Conceptually, we want something like:
// Gradle way to create a configuration
val compileClasspath by configurations.creating // Delegation!
dependencies {
forEachLibrary { // unfortunately, this function does not exist (yet)...
compileClasspath(files(it))
}
}
To be consumed by our improved compile task:
tasks.register<Exec>("compileJava") {
// Resolve the classpath configuration (in general, files could be remote and need fetching)
val classpathFiles = compileClasspath.resolve()
val sources = findSources() // Find sources
if (sources.isNotEmpty()) {
val javacExecutable = Jvm.current().javacExecutable.absolutePath
val separator = if (Os.isFamily(Os.FAMILY_WINDOWS)) ";" else ":" // Deal with Windows conventions
commandLine(
"$javacExecutable", "-cp", classpathFiles.joinToString(separator = separator),
"-d", "$buildDir/bin", *sources
)
}
}
We just need to write forEachLibrary, but that is just a Kotlin exercise…
…not particularly difficult to solve:
fun forEachLibrary(todo: (String) -> Unit) {
findLibraries().forEach {
todo(it)
}
}
findLibraries() is similar to findSources(), let's refactor:

fun findSources() = findFilesIn("src").withExtension("java") // OK now we need findFilesIn()
fun findLibraries() = findFilesIn("lib").withExtension("jar") // And we also need a way to invoke withExtension
fun findFilesIn(directory: String) = FinderInFolder(directory)
data class FinderInFolder(val directory: String) {
fun withExtension(extension: String): Array<String> = TODO()
}
// Now it compiles! We just need to write the actual method, but that's easy
Complete solution:
data class FinderInFolder(val directory: String) {
fun withExtension(extension: String): Array<String> = projectDir
.listFiles { it: File -> it.isDirectory && it.name == directory }
?.firstOrNull()
?.walk()
?.filter { it.extension == extension }
?.map { it.absolutePath }
?.toList()
?.toTypedArray()
?: emptyArray()
}
fun findFilesIn(directory: String) = FinderInFolder(directory)
fun findSources() = findFilesIn("src").withExtension("java")
fun findLibraries() = findFilesIn("lib").withExtension("jar")
fun DependencyHandlerScope.forEachLibrary(todo: DependencyHandlerScope.(String) -> Unit) {
findLibraries().forEach { todo(it) }
}
Next step: we can compile, so why not execute the program as well?
We need a runtimeClasspath configuration that extends compileClasspath:
val runtimeClasspath by configurations.creating {
extendsFrom(compileClasspath) // Built-in machinery to say that one configuration is another "plus stuff"
}
dependencies {
...
runtimeClasspath(files("$buildDir/bin"))
}
tasks.register<Exec>("runJava") {
val classpathFiles = runtimeClasspath.resolve()
val mainClass = "PrintException" // Horribly hardcoded, we must do something
val javaExecutable = Jvm.current().javaExecutable.absolutePath
val separator = if (Os.isFamily(Os.FAMILY_WINDOWS)) ";" else ":" // Same Windows-aware separator as in compileJava
commandLine(javaExecutable, "-cp", classpathFiles.joinToString(separator = separator), mainClass)
}
Let’s run it!
❯ gradle runJava
> Task :runJava FAILED
Error: Could not find or load main class PrintException
Caused by: java.lang.ClassNotFoundException: PrintException
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':runJava'.
> Process 'command '/usr/lib/jvm/java-11-openjdk/bin/java'' finished with non-zero exit value 1
We need runJava to run after compileJava:
// Let's get a reference to the task
val compileJava = tasks.register<Exec>("compileJava") {
...
}
tasks.register<Exec>("runJava") {
...
dependsOn(compileJava) // runJava can run only if compileJava has been run
}
Run now:
TERM=dumb gradle runJava
> Task :compileJava
> Task :runJava
java.lang.IllegalStateException
at PrintException.main(PrintException.java:5)
Just printed a stacktrace, I'm fine actually
BUILD SUCCESSFUL in 775ms
2 actionable tasks: 2 executed
Dependencies permeate the world of build automation.
There is no guarantee that automation written with some tool at version X will work at version Y!
Gradle proposes a (partial) solution with the so-called Gradle wrapper:
- the wrapper task, invoked as gradle wrapper --gradle-version=<VERSION>, generates the wrapper scripts
- the generated gradlew (Unix) and gradlew.bat (Windows) scripts download (if needed) and run the selected Gradle version
The Gradle wrapper is the correct way to use Gradle, and we'll be using it from now on.
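The wrapper can also be pinned directly from the build script; a minimal sketch (the version number below is illustrative, not prescribed by the slides):

tasks.wrapper {
    gradleVersion = "8.5" // the version the generated gradlew will download and run
    distributionType = Wrapper.DistributionType.BIN // binary-only distribution
}

Re-running the wrapper task then regenerates gradlew, gradlew.bat, and the files under gradle/wrapper accordingly.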
A source of failures when building is dirty status.
For instance, in the previous example, before we introduced a dependency between tasks.
We need a way to start clean.
This usually involves cleaning up the build directory - not so hard in our example
tasks.register("clean") { // A generic task is fine
doLast {
if (!buildDir.deleteRecursively()) {
error("Cannot delete $buildDir")
}
}
}
Sometimes projects are modular
Where a module is a sub-project with a clear identity, possibly reusable elsewhere
Examples:
Modular software simplifies maintenance and improves understandability
Modules may depend on other modules
Some build tasks of some module may require build tasks of other modules to be complete before execution
Let us split our project into two components: a library and an app that uses it.
We need to reorganize the build logic to something similar to
hierarchical-project
|__:library
\__:app
Desiderata:
Gradle (as many other build automators) offers built-in support for hierarchical projects.
Gradle is limited to two levels, while other products such as Maven have no limitation.
Subprojects are listed in a settings.gradle.kts file.
Incidentally, it's the same place where the project name can be specified.
Subprojects must have their own build.gradle.kts.
They can also have their own settings.gradle.kts, e.g. for selecting a name different from their folder's.
rootProject.name = "project-with-hierarchy"
include(":library") // There must be a folder named "library"
include(":app") // There must be a folder named "app"
Shared configuration can be placed in the root build.gradle.kts inside an allprojects block.
For instance, a clean task should be available for each project:

allprojects {
tasks.register("clean") { // A generic task is fine
doLast {
if (!buildDir.deleteRecursively()) {
error("Cannot delete $buildDir")
}
}
}
}
Configuration meant only for subprojects goes in a subprojects block.
For instance, the compileJava task and the related utilities:

subprojects {
// This must be there, as projectDir must refer to the *current* project
data class FinderInFolder(val directory: String) ...
fun findFilesIn(directory: String) = FinderInFolder(directory)
fun findSources() = findFilesIn("src").withExtension("java")
fun findLibraries() = findFilesIn("lib").withExtension("jar")
fun DependencyHandlerScope.forEachLibrary(todo: DependencyHandlerScope.(String) -> Unit) ...
val compileClasspath by configurations.creating
val runtimeClasspath by configurations.creating { extendsFrom(compileClasspath) }
dependencies { ... }
tasks.register<Exec>("compileJava") { ... }
}
In each subproject's build.gradle.kts, add further customization as necessary.
The runJava task can live in the :app subproject.
In app's build.gradle.kts, for instance:

dependencies {
compileClasspath(project(":library")) { // My compileClasspath configuration depends on project library
targetConfiguration = "runtimeClasspath" // Specifically, from its runtime
}
}
app requires library to be compiled first!
In app's build.gradle.kts:

tasks.compileJava { dependsOn(project(":library").tasks.compileJava) }
Note: library's build.gradle.kts is actually empty at the end of the process.
At the moment, we have part of the project that’s declarative, and part that’s imperative:
The declarative part is the one for which we had a built-in API for!
The base mechanism at work here is hiding imperativity under a clean, declarative API.
Also “purely declarative” build systems, such as Maven, which are driven with markup files, hide their imperativity behind a curtain (in the case of Maven, plugins that are configured in the pom.xml
, but implemented elsewhere).
Usability, understandability, and, ultimately, maintainability increase when:
Let's begin isolating imperativity by refactoring our hierarchy of operations.
Gradle supports the definition of new task types:
- they must implement the Task interface, typically by extending DefaultTask
- the class must be extensible (in Kotlin, marked open)
- the constructor should be annotated with @Inject
- the method annotated with @TaskAction will get invoked to execute the task

open class Clean @Inject constructor() : DefaultTask() {
@TaskAction
fun clean() {
if (!project.buildDir.deleteRecursively()) {
error("Cannot delete ${project.buildDir}")
}
}
}
In general, it is a good practice (that will become mandatory in future Gradle releases) to annotate every public property of a task with a marker annotation that determines whether it is an input or an output:
- inputs: @Input, @InputFile, @InputFiles, @InputDirectory, @InputDirectories
- outputs: @OutputFile, @OutputFiles, @OutputDirectory, @OutputDirectories
- @Internal marks internal properties (not reified on the file system)
Properly annotated inputs and outputs enable up-to-date checks and continuous builds (the -t option).
In our main build.gradle.kts:
// Imperative part
abstract class JavaTask(javaExecutable: File = Jvm.current().javaExecutable) : Exec() { ... }
open class CompileJava @javax.inject.Inject constructor() : JavaTask(Jvm.current().javacExecutable) { ... }
open class RunJava @javax.inject.Inject constructor() : JavaTask() { ... }
// Declarative part
allprojects { tasks.register<Clean>("clean") }
subprojects {
val compileClasspath by configurations.creating
val runtimeClasspath by configurations.creating { extendsFrom(compileClasspath) }
dependencies {
findLibraries().forEach { compileClasspath(files(it)) }
runtimeClasspath(files("$buildDir/bin"))
}
tasks.register<CompileJava>("compileJava")
}
In subprojects, we keep only the declarative part.
Unfortunately, subprojects have no access to the root’s defined types
Gradle provides the functionality we need (project-global type definitions) using a special buildSrc folder:
├── build.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src
│ └── main
│ └── kotlin
│ └── OurImperativeCode.kt
└── settings.gradle.kts
Inside buildSrc/build.gradle.kts (this will become clearer later):
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
Excerpt of buildSrc/src/main/kotlin/JavaOperations.kt (full code in the repo):
open class Clean @Inject constructor() : DefaultTask() { ... }
abstract class JavaTask(javaExecutable: File = Jvm.current().javaExecutable) : Exec() { ... }
open class CompileJava @javax.inject.Inject constructor() : JavaTask(Jvm.current().javacExecutable) {
@OutputDirectory // Marks this property as an output
var outputFolder: String = "${project.buildDir}/bin/"
...
}
open class RunJava @javax.inject.Inject constructor() : JavaTask() {
@Input // Marks this property as an Input
var mainClass: String = "Main"
...
}
Our project's build.gradle.kts (full):
allprojects {
tasks.register("clean") { // A generic task is fine
doLast {
if (!buildDir.deleteRecursively()) {
error("Cannot delete $buildDir")
}
}
}
}
subprojects {
val compileClasspath by configurations.creating
val runtimeClasspath by configurations.creating { extendsFrom(compileClasspath) }
dependencies {
findLibraries().forEach { compileClasspath(files(it)) }
runtimeClasspath(files("$buildDir/bin"))
}
tasks.register<CompileJava>("compileJava")
}
Purely declarative, yay!
Our app subproject's build.gradle.kts (full):
dependencies {
compileClasspath(project(":library")) { targetConfiguration = "runtimeClasspath" }
}
tasks.compileJava {
dependsOn(project(":library").tasks.compileJava)
fromConfiguration(configurations.compileClasspath.get())
}
tasks.register<RunJava>("runJava") {
fromConfiguration(configurations.runtimeClasspath.get())
mainClass = "PrintException"
}
Purely declarative, yay!
General approach to a new build automation problem:
Divide: Identify the base steps, they could become your tasks
Conquer: Clearly express the dependencies among them
Encapsulate: confine imperative logic, make it an implementation detail
Adorn: provide a DSL that makes the library easy and intuitive
Not very different from what's usually done in (good) software development.
We now have a rudimentary infrastructure for building and running Java projects.
What if we want to reuse it?
Of course, copy/pasting the same file across projects is to be avoided whenever possible
Gradle (as many other build systems) allows extensibility via plugins.
A plugin is a software component that extends the API of the base system
It usually includes:
- one or more Tasks
- an Extension: an object encapsulating the global configuration options
- a Plugin object, implementing an apply(Project) function
The Plugin is the entry point of the declared plugin: its implementation class is listed in a META-INF/gradle-plugins/<plugin-name>.properties file.
When applied, the plugin is loaded from the buildEnvironment classpath and its apply(Project) function gets invoked.

Example code:
plugins {
pluginName // Loads a plugin from the "buildEnvironment" classpath
`plugin-name` // Syntax for non Kotlin-compliant plugin names
id("plugin2-name") // Alternative to the former
id("some-custom-plugin") version "1.2.3" // if not found locally, gets fetched from the Gradle plugin portal
}
// In case of non-hierarchical projects, plugins are also "applied"
// Otherwise, they need to get applied manually, e.g.:
allprojects {
apply(plugin = "pluginName")
}
The default Gradle distribution includes a large number of plugins, e.g.:
- the java plugin, for applications written in Java
- the java-library plugin, for Java libraries (with no main class)
- the scala plugin
- the cpp plugin, for C++
- the kotlin plugin, supporting Kotlin with multiple targets (JVM, JavaScript, native)
We are going to use the Kotlin JVM plugin to build our first standalone plugin!
(yes, we already wrote our first one: code in buildSrc is project-local plugin code)
A very simple plugin that greets the user.
Desiderata: a greet task that prints a greeting, configurable as follows:

plugins {
id("it.unibo.spe.greetings")
}
greetings {
greetWith { "Ciao da" }
}
First step: we need to set up a Kotlin build, since we'll write our plugin in Kotlin.
plugins {
// No magic: kotlin("jvm") is a function that behind the scenes calls id("org.jetbrains.kotlin.jvm")
kotlin("jvm") version "1.5.31" // version is necessary
}
The Kotlin plugin introduces:
// Configuration of software sources
repositories {
mavenCentral() // points to Maven Central
}
dependencies {
// "implementation" is a configuration created by the Kotlin plugin
implementation(kotlin("stdlib-jdk8")) // "kotlin" is an extension method of DependencyHandler
// The call to "kotlin" passing `module`, returns a String "org.jetbrains.kotlin:kotlin-$module:<KotlinVersion>"
}
In order to develop a plugin, we need the Gradle API
dependencies {
implementation(kotlin("stdlib-jdk8"))
implementation(gradleApi()) // Built-in method, returns a `Dependency` to the current Gradle version
}
Gradle expects the plugin entry point (the class implementing the Plugin interface) to be specified in a manifest file located in META-INF/gradle-plugins and named <plugin-name>.properties.
The name is usually a "reverse URL", similarly to Java packages, e.g., it.unibo.spe.greetings.
The file content is just a pointer to the class implementing Plugin, in our case:
implementation-class=it.unibo.spe.firstplugin.GreetingPlugin
Usually, composed of:
Some properties need to be lazy:
- Provider: a value that can only be queried and cannot be changed, but can be transformed via its map method!
- Property: a value that can be queried and also changed, via set or by passing a Provider instance

The GreetingTask task type:

open class GreetingTask : DefaultTask() {
@Input
val greeting: Property<String> = project.objects.property<String>(String::class.java) // Lazy property creation
@Internal // Read-only property calculated from `greeting`
val message: Provider<String> = greeting.map { "$it Gradle" }
@TaskAction
fun printMessage() {
// "logger" is a property of DefaultTask
logger.quiet(message.get())
}
}
Properties are created via project (a property of DefaultTask of type Project).
The GreetingExtension extension type:

open class GreetingExtension(val project: Project) {
val defaultGreeting: Property<String> = project.objects.property(String::class.java)
.apply { convention("Hello from") } // Set a conventional value
// A DSL would go there
fun greetWith(greeting: () -> String) = defaultGreeting.set(greeting())
}
Extensions can be seen as global configuration containers
If the plugin can be driven with a DSL, the extension is a good place for the entry point
The GreetingPlugin:
class GreetingPlugin : Plugin<Project> {
override fun apply(target: Project) {
// Create the extension
val extension = target.extensions.create("greetings", GreetingExtension::class.java, target)
// Create the task
target.tasks.register("greet", GreetingTask::class.java).get().run {
// Set the default greeting to be the one configured in the extension
greeting.set(extension.defaultGreeting)
// Configuration per-task can still be changed manually by users
}
}
}
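As the last comment suggests, a build applying the plugin can still tweak the task-level property; a quick sketch (the greeting value is illustrative):

import it.unibo.spe.firstplugin.GreetingTask // assuming the task lives in the same package as the plugin

tasks.named<GreetingTask>("greet") {
    greeting.set("Howdy from") // overrides the default coming from the greetings extension
}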
The apply function receives the target Project object.
The Plugin configures the project as needed for the tasks and the extension to work.
Plugins can also interact with other plugins: for instance, the Kotlin JVM plugin applies the java-library plugin behind the scenes, and reactions to other plugins can be registered via the plugins property of Project, e.g.:

project.plugins.withType(JavaPlugin::class.java) {
// Stuff you want to do only if someone enables the Java plugin for the current project
}
We got a plugin, we don’t know yet how to use it though.
First step is: testing it to see if it works
Tools to be used: the Gradle TestKit and a testing framework (we will use Kotest).
It's just a matter of pulling in the right dependencies:
dependencies {
implementation(gradleApi())
testImplementation(gradleTestKit()) // Test implementation: available for testing compile and runtime
testImplementation("io.kotest:kotest-runner-junit5:4.2.5") // for kotest framework
testImplementation("io.kotest:kotest-assertions-core:4.2.5") // for kotest core assertions
testImplementation("io.kotest:kotest-assertions-core-jvm:4.2.5") // for kotest core jvm assertions
}
Kotest leverages JUnit 5 / Jupiter for execution, so we need to enable it:
tasks.withType<Test> { // The task type is defined in the Java plugin
useJUnitPlatform() // Use JUnit 5 engine
}
In general, our automation process may and should be informative
We can exploit the API of any Gradle plugin at our advantage
(Of course, it depends on whether configuration options are available)
Let’s add information to our testing system:
tasks.withType<Test> {
useJUnitPlatform() // Use JUnit 5 engine
testLogging.showStandardStreams = true
testLogging {
showCauses = true
showStackTraces = true
showStandardStreams = true
events(*org.gradle.api.tasks.testing.logging.TestLogEvent.values())
exceptionFormat = org.gradle.api.tasks.testing.logging.TestExceptionFormat.FULL
}
}
In general, explore the API and use it to your advantage.
By default, the Gradle test kit just runs Gradle. We want to inject our plugin into the distribution.
Strategy
Kotest is a testing framework for Kotlin, inspired by Scalatest and Cucumber.
We will use FreeSpec (Scalatest inspired), similar to StringSpec (a Kotest original):

class PluginTest : FreeSpec({
// Arbitrarily nested test levels
"whenever a Formula 1 championship" - {
"begins testing" - {
"Ferrari and Mercedes are favorites" {
// Test code for
// "whenever a Formula 1 championship begins testing Ferrari and Mercedes are favorites"
}
}
"reaches mid-season" - {
"Vettel spins repeatedly" { /* Test code */ }
"Ferrari" - { // "-" creates a container, required to nest tests in FreeSpec
"lags behind with development" { /* Test code */ }
"wins next year" { /* Test code */ }
}
}
}
})
// Configure a Gradle runner and execute the build
val result = GradleRunner.create()
.withProjectDir(testProjectDir) // a File pointing to the test project (placeholder name)
.withPluginClasspath(classpath) // we need Gradle **and** our plugin
.withArguments(":tasks", ":you", ":need", ":to", ":run:", "--and", "--cli", "--options")
.build() // This actually runs Gradle and returns a BuildResult
// Inspect results
result.task(":someExistingTask")?.outcome shouldBe TaskOutcome.SUCCESS
result.output shouldContain "Hello from Gradle"
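An alternative sketch: when the plugin project applies the java-gradle-plugin, the runner can pick up the plugin under test from the generated metadata, with no manually computed classpath:

import org.gradle.testkit.runner.GradleRunner
import java.io.File

val result = GradleRunner.create()
    .withProjectDir(File("build/test-project")) // hypothetical folder containing a build that applies the plugin
    .withPluginClasspath() // no arguments: uses the plugin-under-test metadata produced by java-gradle-plugin
    .withArguments("greet")
    .build()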
Final result in the attached code!
Look at the following code:
dependencies {
testImplementation("io.kotest:kotest-runner-junit5:4.2.5")
testImplementation("io.kotest:kotest-assertions-core:4.2.5")
testImplementation("io.kotest:kotest-assertions-core-jvm:4.2.5")
}
It is repetitive and fragile (what if you change the version of a single kotest module?)
Let’s patch all this fragility:
dependencies {
val kotestVersion = "4.2.5"
testImplementation("io.kotest:kotest-runner-junit5:$kotestVersion")
testImplementation("io.kotest:kotest-assertions-core:$kotestVersion")
testImplementation("io.kotest:kotest-assertions-core-jvm:$kotestVersion")
}
Still, quite repetitive…
dependencies {
val kotestVersion = "4.2.5"
fun kotest(module: String) = "io.kotest:kotest-$module:$kotestVersion"
testImplementation(kotest("runner-junit5"))
testImplementation(kotest("assertions-core"))
testImplementation(kotest("assertions-core-jvm"))
}
Uhmm… we could move such helpers into buildSrc, but there is a cleaner solution.
Gradle 7 introduced version catalogs, a standardized way to collect and bundle dependencies.
Catalogs can be declared in:
- the settings.gradle.kts file (they are API, of course)
- a dedicated TOML file (gradle/libs.versions.toml):

[versions]
dokka = "1.9.20"
konf = "1.1.2"
kotest = "5.9.1"
kotlin = "2.0.21"
testkit = "0.9.0"
[libraries]
classgraph = "io.github.classgraph:classgraph:4.8.179"
konf-yaml = { module = "com.uchuhimo:konf-yaml", version.ref = "konf" }
kotest-junit5-jvm = { module = "io.kotest:kotest-runner-junit5-jvm", version.ref = "kotest" }
kotest-assertions-core-jvm = { module = "io.kotest:kotest-assertions-core-jvm", version.ref = "kotest" }
kotlin-gradle-plugin-api = { module = "org.jetbrains.kotlin:kotlin-gradle-plugin", version.ref = "kotlin" }
testkit = { module = "io.github.mirko-felice.testkit:core", version.ref = "testkit" }
[bundles]
kotlin-testing = [ "kotest-junit5-jvm", "kotest-assertions-core-jvm" ]
[plugins]
dokka = { id = "org.jetbrains.dokka", version.ref = "dokka" }
gitSemVer = "org.danilopianini.git-sensitive-semantic-versioning:3.1.7"
gradlePluginPublish = "com.gradle.plugin-publish:1.3.0"
jacoco-testkit = "pl.droidsonroids.jacoco.testkit:1.0.12"
kotlin-jvm = { id = "org.jetbrains.kotlin.jvm", version.ref = "kotlin" }
kotlin-qa = "org.danilopianini.gradle-kotlin-qa:0.70.2"
multiJvmTesting = "org.danilopianini.multi-jvm-test-plugin:1.3.2"
publishOnCentral = "org.danilopianini.publish-on-central:5.1.11"
taskTree = "com.dorongold.task-tree:4.0.0"
plugins {
`java-gradle-plugin`
alias(libs.plugins.dokka)
alias(libs.plugins.gitSemVer)
alias(libs.plugins.gradlePluginPublish)
alias(libs.plugins.jacoco.testkit)
alias(libs.plugins.kotlin.jvm)
alias(libs.plugins.kotlin.qa)
alias(libs.plugins.publishOnCentral)
alias(libs.plugins.multiJvmTesting)
alias(libs.plugins.taskTree)
}
dependencies {
api(gradleApi())
api(gradleKotlinDsl())
api(kotlin("stdlib-jdk8"))
testImplementation(gradleTestKit())
testImplementation(libs.konf.yaml)
testImplementation(libs.classgraph)
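The [bundles] entry declared in the catalog can then be consumed as a single dependency; a sketch:

dependencies {
    // brings in kotest-junit5-jvm and kotest-assertions-core-jvm together
    testImplementation(libs.bundles.kotlin.testing)
}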
We now have three different runtimes at play: the JVM running Gradle, the compilation target, and the JVM running the tests.
These toolchains should be controlled independently!
You may want to use Java 16 to run Gradle, but compile in a Java 8-compatible bytecode, and then test on Java 11.
Default behaviour: Gradle uses the same JVM it is running in as:
Supporting multiple toolchains may not be easy!
Targeting a portable runtime (such as the JVM) helps a lot.
Relatively new tool, from Gradle 6.7 (October 2020)
Define the reference toolchain version (compilation target):
java {
toolchain {
languageVersion.set(JavaLanguageVersion.of(11))
vendor.set(JvmVendorSpec.ADOPTOPENJDK) // Optionally, specify a vendor
implementation.set(JvmImplementation.J9) // Optionally, select an implementation
}
}
Create tasks for running tests on specific environments:
tasks.withType<Test>().toList().takeIf { it.size == 1 }?.first()?.run {
// If there is exactly one test task, run it with a specific JVM version
javaLauncher.set(javaToolchains.launcherFor { languageVersion.set(JavaLanguageVersion.of(8)) })
}
// Register another test task, with a different JVM
val testWithJVM17 by tasks.registering(Test::class) { // Also works with JavaExec
javaLauncher.set(javaToolchains.launcherFor { languageVersion.set(JavaLanguageVersion.of(17)) })
} // You can pick JVMs not yet supported by Gradle!
tasks.findByName("check")?.dependsOn(testWithJVM17) // make it part of the QA suite
We now know how to run the plugin,
yet manual classpath modification is not the way we want to run our plugin
We want something like:
plugins {
id("it.unibo.spe.greetings") version "0.1.0"
}
To do so, we need to ship our plugin to the Gradle plugin portal
Gradle provides a plugin publishing plugin to simplify delivery
…but before, we need to learn how to version, license, and package our artifacts.
The project version can be specified in Gradle by simply setting the version property of the project:
version = "0.1.0"
It would be better to rely on the underlying DVCS to compute a Semantic Versioning compatible version!
There are a number of plugins that do so, including one I've developed.
Minimal configuration:
plugins {
id ("org.danilopianini.git-sensitive-semantic-versioning") version "0.3.0"
}
./gradlew printGitSemVer
> Task :printGitSemVer
Version computed by GitSemVer: 0.1.0-archeo+cf5b4c0
Another possibility is writing a plugin yourself
But at the moment we are stuck: we don’t know yet how to expose plugins to other builds
There’s not really much I want to protect in this example, so I’m going to pick one of the most open licenses: MIT (BSD would have been a good alternative)
Copyright 2020 Danilo Pianini
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
JVM artifacts are normally shipped in the form of jar archives; the de-facto convention is inherited from Maven:
- artifacts are identified by groupId:artifactId:version, e.g. com.google.guava:guava:29.0-jre (groupId com.google.guava, artifactId guava, version 29.0-jre)
- each artifact is described by a pom.xml file
- the shipped files follow the artifactId-version[-classifier].jar scheme: guava-29.0-jre.jar, guava-29.0-jre-javadoc.jar, guava-29.0-jre-sources.jar
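These are the same coordinates used when consuming the artifact from a build script; a quick sketch:

dependencies {
    implementation("com.google.guava:guava:29.0-jre") // groupId:artifactId:version
}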
In order to create Maven-compatible artifacts, we first need to set the groupId:
group = "it.unibo.firstplugin"
Many repositories require registering the group and associating developer identities to it.
The project name set in settings.gradle.kts is usually used as artifactId.
Gradle provides two plugins to simplify the assembly and upload of plugins
plugins {
`java-gradle-plugin`
id("com.gradle.plugin-publish") version "0.12.0"
}
pluginBundle { // These settings are set for the whole plugin bundle
website = "https://unibo-spe.github.io/"
vcsUrl = "https://github.com/unibo-spe/"
tags = listOf("example", "greetings", "spe", "unibo")
}
gradlePlugin {
plugins {
create("") { // One entry per plugin
id = "${project.group}.${project.name}"
displayName = "SPE Greeting plugin"
description = "Example plugin for the SPE course"
implementationClass = "it.unibo.spe.firstplugin.GreetingPlugin"
}
}
}
They add the publishPlugins task.
In order to publish on the Gradle Plugin Portal (but the same holds for any repository) users need to be authenticated.
This is most frequently done via authentication tokens, and more rarely by username and password.
It is first required to register; once done, an API key will be available from the web interface, along with a secret.
These data are required to publish, and can be fed to Gradle in two ways:
- in the ~/.gradle/gradle.properties file, adding:
gradle.publish.key=YOUR_KEY
gradle.publish.secret=YOUR_SECRET
- on the command line, via -P flags:
./gradlew -Pgradle.publish.key=<key> -Pgradle.publish.secret=<secret> publishPlugins
❯ ./gradlew publishPlugins
> Task :publishPlugins
Publishing plugin it.unibo.spe.greetings-plugin version 0.1.0-archeo+ea6b9d7
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7.jar
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7-sources.jar
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7-javadoc.jar
Publishing artifact build/publish-generated-resources/pom.xml
Activating plugin it.unibo.spe.greetings-plugin version 0.1.0-archeo+ea6b9d7
It is a good practice to set up some tools to validate the quality of the source code and testing.
In the case of Kotlin, there are three useful tools (Jacoco for coverage, ktlint for style, detekt for static analysis), whose verification tasks are usually attached to the check task.
Moreover, we need a way to inspect the results of executing these controls, besides of course failing if too many things go wrong.
(note: under Kotlin and Scala, I do not recommend using Spotbugs: even though it works, it generates way too many false positives)
Tasks with a report module usually publish their results under $buildDir/reports/$reportName, e.g., $buildDir/reports/tests for tests (see, for instance, AbstractReportTask).
Jacoco works with Kotest out of the box
plugins {
// Some plugins
jacoco
// Some plugins
}
The plugin introduces two tasks: jacocoTestCoverageVerification and jacocoTestReport.
The latter must be configured to produce readable reports:
tasks.jacocoTestReport {
reports {
// xml.isEnabled = true // Useful for processing results automatically
html.isEnabled = true // Useful for human inspection
}
}
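jacocoTestCoverageVerification, instead, can make the build fail below a minimum coverage; a sketch (the 80% threshold is an arbitrary example, not from the slides):

tasks.jacocoTestCoverageVerification {
    violationRules {
        rule {
            limit {
                minimum = "0.80".toBigDecimal() // fail the check if overall coverage drops below 80%
            }
        }
    }
}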
Note: Jacoco does not work with the Gradle test kit, but there are plugins to work around this.
The Kotlin compiler options can be configured for every KotlinCompile task:
tasks.withType<org.jetbrains.kotlin.gradle.tasks.KotlinCompile> {
kotlinOptions {
allWarningsAsErrors = true
}
}
ktlint checks the Kotlin code style and can be configured via a .editorconfig file:

plugins {
id("org.jlleitschuh.gradle.ktlint") version "9.4.1"
}
Adds the following tasks:
- ktlintApplyToIdea, ktlintApplyToIdeaGlobally: change the IntelliJ Idea configuration to adhere to the rules
- ktlintCheck, ktlintKotlinScriptCheck, ktlint<SourceSetName>SourceSetCheck: apply rules and report errors
- ktlintFormat, ktlintKotlinScriptFormat, ktlint<SourceSetName>SourceSetFormat: reformat the code automatically

detekt performs static analysis of Kotlin sources:

plugins {
id("io.gitlab.arturbosch.detekt") version "1.14.1"
}
repositories {
mavenCentral()
}
dependencies {
// Adds a configuration "detektPlugins"
detektPlugins("io.gitlab.arturbosch.detekt:detekt-formatting:1.14.1")
}
detekt {
failFast = true // fail build on any finding
buildUponDefaultConfig = true // preconfigure defaults
config = files("$projectDir/config/detekt.yml") // Custom additional rules
}
Adds the detekt task, failing in case of violation.
You know how to build and publish Gradle plugins: factor out the common part!
plugins {
// Just applies and pre-configures jacoco, detekt, and ktlint
id("org.danilopianini.gradle-kotlin-qa") version "0.2.1"
// Just applies and pre-configures jacoco, Spotbugs, PMD, and checkstyle
id("org.danilopianini.gradle-java-qa") version "0.2.1"
}
It is a good practice to automate the generation of the API documentation.
- the java[-library] plugin adds a javadoc task for the Javadoc
- the scala plugin includes a task of type ScalaDoc
- for Kotlin, the Dokka plugin generates the documentation:
plugins { id("org.jetbrains.dokka") version "1.4.10" }
Adds four tasks: dokkaGfm, dokkaHtml, dokkaJavadoc, dokkaJekyll.
The java-library and java plugins (applied behind the scenes by the kotlin-jvm plugin as well) automatically create an assemble task and a task of type Jar, creating a non-executable jar with the project contents.
val javadocJar by tasks.registering(Jar::class) {
archiveClassifier.set("javadoc")
from(tasks.dokkaJavadoc.get().outputDirectory) // Automatically makes it depend on dokkaJavadoc
}
val sourceJar by tasks.registering(Jar::class) {
archiveClassifier.set("source")
from(tasks.compileKotlin.get().outputDirectory)
from(tasks.processResources.get().outputDirectory)
}
This generates a jar file with classifier javadoc inside the build/libs folder.
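If desired, these jars can be wired into the assemble lifecycle task, so that they get produced by default; a sketch:

tasks.assemble {
    dependsOn(javadocJar, sourceJar)
}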
Many repositories require artifacts to be signed in order for them to be delivered/deployed
If you do not have a signature yet, time to create one
gpg --gen-key
gpg --list-keys
gpg --keyserver keyserver.ubuntu.com --send-keys <KEY-ID>
Once you have a key, you can use the signing plugin to have Gradle generate signatures.
To set a default signatory, add to your ~/.gradle/gradle.properties:
signing.keyId = <your key id>
signing.password = <redacted>
signing.secretKeyRingFile = <your user home>/.gnupg/secring.gpg
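With the default signatory in place, a minimal sketch of signing all Maven publications (this assumes the maven-publish plugin is applied as well):

plugins {
    `maven-publish`
    signing
}

signing {
    sign(publishing.publications) // uses the signatory configured in ~/.gradle/gradle.properties
}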
Maven Central is one of the de-facto standard repositories for JVM artifacts.
Other notable repositories:
Requirements include:
- a registered group id (GitHub users can use io.github.yourghusername as group id)
- complete artifacts: a well-formed pom.xml file, plus sources, javadoc, and signatures
Procedure:
- registration and publication are managed via oss.sonatype.org
Gradle provides a maven-publish plugin for automated delivery to Maven repositories.
It requires some manual configuration: the target repositories, the publications, and the pom.xml metadata.
If a publication pubName is created for a repository RepoName, then these tasks get created:
- publish<PubName>PublicationTo<RepoName>Repository
- publish<PubName>PublicationToMavenLocal
plugins { `maven-publish` }
val javadocJar by ...
val sourceJar by ...
publishing {
repositories {
maven {
url = uri("https://s01.oss.sonatype.org/service/local/staging/deploy/maven2/")
val mavenCentralPwd: String? by project // Pass the pwd via -PmavenCentralPwd='yourPassword'
credentials {
username = "danysk"
password = mavenCentralPwd
}
}
}
publications {
val publicationName by creating(MavenPublication::class) {
from(components["java"]) // add the jar produced by the java plugin
// Warning: the gradle plugin-publish plugin already adds them to the java SoftwareComponent
artifact(javadocJar) // add the javadoc jar to this publication
artifact(sourceJar) // add the source jar to this publication
pom {
name.set("My Library")
description.set("A concise description of my library")
url.set("http://www.example.com/library")
licenses { license { name.set("...") } }
developers { developer { name.set("...") } }
scm {
url.set("...")
connection.set("...")
}
}
}
signing { sign(publicationName) }
}
}
I produced a plugin that pre-configures maven-publish to point to Maven Central:
- it applies the java, maven-publish, and signing plugins
- it provides SourcesJar and JavadocJar tasks
- it picks up the credentials from MAVEN_CENTRAL_USERNAME and MAVEN_CENTRAL_PASSWORD
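A minimal application sketch (the version is the one pinned in the catalog shown earlier):

plugins {
    id("org.danilopianini.publish-on-central") version "5.1.11"
}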
group = "org.danilopianini"
inner class ProjectInfo {
val longName = "Gradle Publish On Maven Central Plugin"
val projectDetails = "A Plugin for easily publishing artifacts on Maven Central"
val website = "https://github.com/DanySK/$name"
showStackTraces = true
events(*org.gradle.api.tasks.testing.logging.TestLogEvent.values())
exceptionFormat = org.gradle.api.tasks.testing.logging.TestExceptionFormat.FULL
}
}
gradlePlugin {
plugins {
website.set(info.website)
vcsUrl.set(info.vcsUrl)
create("PublishOnCentralPlugin") {
id = "$group.${project.name}"
displayName = info.longName
description = project.description
implementationClass = info.pluginImplementationClass
tags.set(info.tags)
description = info.projectDetails
}
}
}
publishOnCentral {
projectDescription.set(info.projectDetails)
projectLongName.set(info.longName)
projectUrl.set(info.website)
In rich projects, most of the build-related issues are due to pesky stuff going on with dependencies
Gradle allows for inspection of the dependencies:
- ./gradlew dependencies prints the dependency trees for each configuration
- inspecting multiple large trees can be difficult: ./gradlew dependencyInsight --dependency <DepName> --configuration <ConfName> focuses on how a single dependency gets resolved in a given configuration
When developing plugins or rich builds, the issue of dependencies also affects tasks.
Gradle does not provide tools to inspect the task graph graphically, but a plugin exists.
plugins {
id("com.dorongold.task-tree") version "4.0.0"
}
Adds a taskTree task, which prints the task tree of the tasks listed along with it (e.g., ./gradlew build taskTree).
Gradle supports a reporting system called Gradle build scans: add --scan to the build invocation to generate and publish one.
Example scans:
To pre-accept the terms of use required by --scan, configure the following in settings.gradle.kts:
develocity {
buildScan {
termsOfUseUrl = "https://gradle.com/terms-of-service"
termsOfUseAgree = "yes"
uploadInBackground = !System.getenv("CI").toBoolean()
}
}