danilo.pianini@unibo.it — Compiled on: 2025-11-21 — printable version
The process of creating tested deployable software artifacts
from source code
May include, depending on the system specifics:
Automation of the build lifecycle
Imperative/Custom: write a script that tells the system what to do to get from code to artifacts
Declarative/Standard: adhere to some convention, customizing some settings
Create a declarative infrastructure upon an imperative basis, and allow easy access to the underlying machinery
DSLs are helpful in this context: they can “hide” imperativity without ruling it out
Still, many challenges remain open:
nos esse quasi nanos gigantium humeris insidentes (“we are like dwarfs standing on the shoulders of giants”, Bernard of Chartres)
All modern software depends on other software!
All the software we build and use depends on other software
Indirect dependencies (dependencies of dependencies) are called transitive
In non-toy projects, transitive dependencies are the majority
We need a tool that can:
To do this, however, we need to know some repositories and how to refer and locate artifacts
When two libraries depend on different versions of the same library, we have a conflict
A requires B at version 1, dependency C requires B at version 2.

To reduce the risk of conflicts, some systems allow specifying ranges of acceptable versions:

- `[1.2,2.0)` means “from 1.2 (inclusive) to 2.0 (exclusive)”
- `[1.2,1.3)` means “from 1.2 (inclusive) to 1.3 (exclusive)”
- `[1.2,1.3]` means “from 1.2 (inclusive) to 1.3 (inclusive)”
- `(,1.3]` means “up to 1.3 (inclusive)”
- `[1.2,)` means “from 1.2 (inclusive) onwards”
- `1.2+` means “any 1.2 version”, i.e., `>=1.2.0 <1.3.0`
- `^1.2.3` means “compatible with 1.2.3”, i.e., `>=1.2.3 <2.0.0`
- `~1.2.3` means “approximately 1.2.3”, i.e., `>=1.2.3 <1.3.0`
- `1.2.x` means “any 1.2 version”, i.e., `>=1.2.0 <1.3.0`
- `>=1.2, <2.0` means “from 1.2 (inclusive) to 2.0 (exclusive)”
- `>=1.2, <1.3` means “from 1.2 (inclusive) to 1.3 (exclusive)”
- `>=1.2, <=1.3` means “from 1.2 (inclusive) to 1.3 (inclusive)”
- `<1.3` means “up to 1.3 (exclusive)”
- `>=1.2` means “from 1.2 (inclusive) onwards”

When multiple ranges are specified for the same dependency, the system must find a version that satisfies all constraints.
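The notations are ecosystem conventions (Maven-style brackets, npm-style `^` and `~`), but checking whether a version satisfies a conjunction of constraints is mechanical. A minimal sketch in Python, assuming versions are plain dot-separated integers with no pre-release tags (the helpers `parse`, `satisfies`, and `in_range` are illustrative names, not a real resolver API):

```python
# Minimal sketch of version-range checking. Assumes versions are
# plain dot-separated integers, with no pre-release or build tags.

def parse(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, constraint: str) -> bool:
    """Check a single constraint such as '>=1.2' or '<2.0'."""
    operators = (
        (">=", lambda v, b: v >= b),
        ("<=", lambda v, b: v <= b),
        (">", lambda v, b: v > b),
        ("<", lambda v, b: v < b),
        ("==", lambda v, b: v == b),
    )
    for op, compare in operators:
        if constraint.startswith(op):
            v, bound = parse(version), parse(constraint[len(op):].strip())
            width = max(len(v), len(bound))
            pad = lambda t: t + (0,) * (width - len(t))  # so 1.2 == 1.2.0
            return compare(pad(v), pad(bound))
    raise ValueError("unsupported constraint: " + constraint)

def in_range(version: str, spec: str) -> bool:
    """A spec is a conjunction of constraints, e.g. '>=1.2, <2.0'."""
    return all(satisfies(version, c.strip()) for c in spec.split(","))
```

Under this model, `^1.2.3` desugars to the spec `>=1.2.3, <2.0.0`, and resolving multiple requesters means finding a version for which every spec holds.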
The resolver typically relies on a dependency graph:
Introducing ranges leads to the explosion of the version combinations space.
Dependency locking is a technique to fix the versions of all dependencies (including transitive ones) to a known good set.
Exact pins trade a low risk of accidental breakage for a high ongoing maintenance cost and poor ecosystem fit:
| | Ranges, no lock | Ranges + lock | Fixed versions (exact pins) |
|---|---|---|---|
| Updates | Instant | Manual (regenerate lock file) | Manual (change version) |
| Reproducibility | Low | Maximum (transitive dependencies are locked) | High (transitive drift remains) |
| Reliability | Low (uncontrolled updates) | Medium (testing all ranges is impossible) | High (controlled updates, transitive drift) |
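Dependency locking (the middle column) boils down to “resolve once, pin the result”. A conceptual sketch, where the repository index `AVAILABLE` and the `resolve` helper are purely illustrative; real lockers also pin transitive dependencies and record artifact checksums:

```python
# Conceptual sketch of dependency locking: resolve version ranges once
# against a repository index, then pin the exact result for reuse.

AVAILABLE = {  # hypothetical repository index
    "lib-a": ["1.0.0", "1.1.0", "1.2.0"],
    "lib-b": ["2.0.0", "2.1.0"],
}

def as_tuple(version):
    return tuple(int(part) for part in version.split("."))

def resolve(requirements):
    """requirements: name -> predicate accepting a version string."""
    lock = {}
    for name, accepts in requirements.items():
        matching = [v for v in AVAILABLE[name] if accepts(v)]
        if not matching:
            raise RuntimeError("no version of " + name + " satisfies the range")
        lock[name] = max(matching, key=as_tuple)  # pick the newest match
    return lock

# Resolve once; installs then read the pinned set instead of re-resolving.
lock_file = resolve({
    "lib-a": lambda v: v.startswith("1.1."),      # roughly ~1.1
    "lib-b": lambda v: as_tuple(v) >= (2, 0, 0),  # roughly >=2.0.0
})
```

Regenerating the lock file (the “Manual (regenerate lock file)” cell) amounts to running `resolve` again when the index has newer versions.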
Dependencies may be needed in different contexts:
Scopes are typically pre-defined in declarative build systems and manually defined in imperative ones (there are exceptions to this rule).
Hybrid automators often provide pre-defined scopes and ways to define additional custom scopes.
CMake is a widely used imperative build automation tool, especially in C and C++ projects.
The build is described in a CMakeLists.txt file, using CMake’s own scripting language. CMake version and project name, declarative:
cmake_minimum_required(VERSION 3.0) # setting this is required
project(example_project) # this sets the project name
File globbing: the search for what to build is manual/imperative:
# These instructions search the directory tree when cmake is
# invoked and put all files that match the pattern in the variables
# `sources`, `sources_test`, and `data`.
file(GLOB_RECURSE sources src/main/*.cpp src/main/*.h)
file(GLOB_RECURSE sources_test src/test/*.cpp)
file(GLOB_RECURSE data resources/*)
# You can use set(sources src/main.cpp) etc if you don't want to
# use globbing to find files automatically.
Target definitions, imperative:
# The data is just added to the executable, because in some IDEs (QtCreator)
# files are invisible when they are not explicitly part of the project.
add_executable(example ${sources} ${data})
# Just for example add some compiler flags.
target_compile_options(example PUBLIC -std=c++1y -Wall -Wfloat-conversion)
# This allows including files relative to the root of the src directory with a <> pair
target_include_directories(example PUBLIC src/main)
# This copies all resource files in the build directory.
# We need this, because we want to work with paths relative to the executable.
file(COPY ${data} DESTINATION resources)
Dependency management, imperative:
# This defines the variable Boost_LIBRARIES that contains all library names
# that we need to link into the program.
find_package(Boost 1.36.0 COMPONENTS filesystem system REQUIRED)
target_link_libraries(example PUBLIC
${Boost_LIBRARIES}
# here you can add any library dependencies
)
Testing with googletest, imperative:
find_package(GTest)
if(GTEST_FOUND)
add_executable(unit_tests ${sources_test} ${sources})
# This define is added to prevent collision with the main.
# It might be better solved by not adding the source with the main to the
# testing target.
target_compile_definitions(unit_tests PUBLIC UNIT_TESTS)
# This allows us to use the executable as a link library, and inherit all
# linker options and library dependencies from it, by simply adding it as dependency.
set_target_properties(example PROPERTIES ENABLE_EXPORTS on)
target_link_libraries(unit_tests PUBLIC ${GTEST_BOTH_LIBRARIES} example)
target_include_directories(unit_tests PUBLIC ${GTEST_INCLUDE_DIRS})
endif()
Packaging with CPack, mostly imperative with some declarative parts:
# All install commands get the same destination. This allows us to use paths
# relative to the executable.
install(TARGETS example DESTINATION example_destination)
# This is basically a repeat of the file copy instruction that copies the
# resources in the build directory, but here we tell cmake that we want it
# in the package.
install(DIRECTORY resources DESTINATION example_destination)
# Now comes everything we need to create a package.
# there are a lot more variables you can set, and some
# you need to set for some package types, but we want to
# be minimal here.
set(CPACK_PACKAGE_NAME "MyExample")
set(CPACK_PACKAGE_VERSION "1.0.0")
# We don't want to split our program up into several incomplete pieces.
set(CPACK_MONOLITHIC_INSTALL 1)
# This must be last
include(CPack)
Poetry is a modern, declarative build and dependency management tool for Python.
Project configuration is stored in pyproject.toml.
Locked dependency versions are stored in poetry.lock.
Since there were no standard management systems originally, multiple tools proliferated
By default, Python is installed system-wide
All Python installations come with pip, the package installer for Python
pip install PACKAGE_NAME

PyEnv is a tool to manage multiple Python installations on the same system
virtualenv and venv create virtual Python installations on the same system
virtualenv is a third-party tool; venv is built-in in Python 3.3 and later

Example project structure:

root-directory/
├── main_package/
│ ├── __init__.py
│ ├── sub_module.py
│ └── sub_package/
│ ├── __init__.py
│ └── sub_sub_module.py
├── test/
│ ├── test_something.py
│ └── test_something_else.py
├── pyproject.toml # File where project configuration (metadata, dependencies, etc.) is stored
├── poetry.toml # File where Poetry configuration is stored
├── poetry.lock # File where Poetry locks the dependencies
└── README.md
If you already use Python, notice:
- there is neither a requirements.txt nor a requirements-dev.txt
- pyproject.toml, poetry.toml, and poetry.lock are Poetry-specific
- poetry.lock is generated automatically by Poetry, and should not be edited manually

Example: the calculator project (courtesy of Giovanni Ciatto)
[tool.poetry]
# publication metadata
name = "unibo-dtm-se-calculator"
packages = [ # files to be included for publication
{ include = "calculator" },
]
version = "0.1.1"
description = "A simple calculator toolkit written in Python, with several UIs."
authors = ["Giovanni Ciatto <giovanni.ciatto@unibo.it>"]
license = "Apache 2.0"
readme = "README.md"
# dependencies (notice that Python is considered a dependency)
[tool.poetry.dependencies]
python = "^3.10.0"
Kivy = "^2.3.0"
# development dependencies
[tool.poetry.group.dev.dependencies]
poetry = "^1.7.0"
pytest = "^8.1.0"
coverage = "^7.4.0"
mypy = "^1.9.0"
# executable commands that will be created when installing this package
[tool.poetry.scripts]
calculator-gui = "calculator.ui.gui:start_app"
calculator = "calculator.ui.cli:start_app"
# where to download the dependencies from
[[tool.poetry.source]]
name = "PyPI"
priority = "primary"
# packaging dependencies
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
# the project-specific environment will be created in the local .venv folder
[virtualenvs]
in-project = true
Pure TOML: completely declarative
Poetry is used via the poetry command line tool:
- poetry install – resolves and installs dependencies
- poetry run <command> – runs a command within the virtual environment
- poetry install reuses poetry.lock if already available

Subsequent lifecycle phases are managed by the poetry run command, and are thus custom.
❗ Except for install, Poetry does not provide a predefined lifecycle
A build lifecycle typical of declarative automators, composed of phases.
⚠️ Selecting a phase implies executing all previous phases.
- validate – validate the project is correct and all necessary information is available
- compile – compile the source code of the project
- test – test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
- package – take the compiled code and package it in its distributable format, such as a JAR
- verify – run any checks on results of integration tests to ensure quality criteria are met
- install – install the package into the local repository, for use as a dependency in other projects locally
- deploy – done in the build environment, copies the final package to the remote repository for sharing with other developers and projects

What if there is no plugin for something peculiar to the project?
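The rule “selecting a phase implies executing all previous phases” means a phase invocation executes a prefix of an ordered list. A toy sketch (the loop body is where a real tool would invoke the plugin goals bound to each phase):

```python
# Sketch of a linear build lifecycle: selecting a phase executes it
# and every phase that precedes it, in order (Maven-style).

PHASES = ["validate", "compile", "test", "package", "verify", "install", "deploy"]

def run_lifecycle(target: str) -> list:
    if target not in PHASES:
        raise ValueError("unknown phase: " + target)
    executed = PHASES[: PHASES.index(target) + 1]
    for phase in executed:
        # a real tool would invoke the plugin goals bound to `phase` here
        pass
    return executed
```

For example, asking for `test` runs `validate`, `compile`, and `test`, in that order.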
Rather than declaratively fit the build into a predefined lifecycle, declaratively define a build lifecycle
A paradigmatic example of a hybrid automator:
Let’s start as empty as possible, just point your terminal to an empty folder and:
touch build.gradle.kts
gradle tasks
Stuff happens: if nothing is specified,
Gradle considers the folder where it is invoked as a project
The project name matches the folder name
Let’s understand what:
Welcome to Gradle <version>!
Here are the highlights of this release:
- Blah blah blah
Starting a Gradle Daemon (subsequent builds will be faster)
Up to there, it’s just performance stuff: Gradle uses a background service to speed up cacheable operations
> Task :tasks
------------------------------------------------------------
Tasks runnable from root project
------------------------------------------------------------
Build Setup tasks
-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.
Some tasks exist already! They are built-in. Let’s ignore them for now.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project '00-empty'.
components - Displays the components produced by root project '00-empty'. [incubating]
dependencies - Displays all dependencies declared in root project '00-empty'.
dependencyInsight - Displays the insight into a specific dependency in root project '00-empty'.
dependentComponents - Displays the dependent components of components in root project '00-empty'. [incubating]
help - Displays a help message.
model - Displays the configuration model of root project '00-empty'. [incubating]
outgoingVariants - Displays the outgoing variants of root project '00-empty'.
projects - Displays the sub-projects of root project '00-empty'.
properties - Displays the properties of root project '00-empty'.
tasks - Displays the tasks runnable from root project '00-empty'.
Informational tasks. Among them, the tasks task we just invoked
It is time to create our first task
Create a build.gradle.kts file as follows:
tasks.register("brokenTask") { // creates a new task
println("this is executed at CONFIGURATION time!")
}
Now launch gradle with gradle brokenTask:
gradle brokenTask
this is executed at CONFIGURATION time!
BUILD SUCCESSFUL in 378ms
Looks ok, but it’s utterly broken
Try launching gradle tasks
❯ gradle tasks
> Task :tasks
------------------------------------------------------------
Tasks runnable from root project
------------------------------------------------------------
this is executed at CONFIGURATION time!
Build Setup tasks
Ouch!
Reason: the build script executes when Gradle is invoked, and configures tasks and dependencies.
Only later, when a task is invoked, the block gets actually executed
Let’s write a correct task
tasks.register("helloWorld") {
doLast { // This method takes as argument a Task.() -> Unit
println("Hello, World!")
}
}
Execution with gradle helloWorld
gradle helloWorld
> Task :helloWorld
Hello, World!
Delaying the execution allows for more flexible configuration
This will be especially useful when modifying existing behavior
val helloWorld = tasks.register("helloWorld") {
doLast { // This method takes as argument a Task.() -> Unit
println("Hello, World!")
}
}
helloWorld.configure { // Note the `configure` method: late configuration of the task
doFirst {
println("About to say hello...")
}
}
output of gradle helloWorld:
> Task :helloWorld
About to say hello...
Hello, World!
BUILD SUCCESSFUL in 231ms
1 actionable task: 1 executed
While task execution happens only for those tasks that are invoked (or their dependencies), task configuration happens for all tasks declared in the build script.
In Gradle, tasks are registered lazily, and can be configured lazily as well, using the configure and configureEach methods.
val helloWorld = tasks.register("helloWorld") {
doLast { // This method takes as argument a Task.() -> Unit
println("Hello, World!")
}
}
helloWorld.configure { // Note the `configure` method: late configuration of the task
doFirst {
println("About to say hello...")
}
}
tasks.withType<Task>().configureEach {
println("Configuring task: ${this.name}")
doLast {
println("Finished task: ${this.name}")
}
doFirst {
println("Starting task: ${this.name}")
}
}
gradle helloWorld output:
Configuring task: helloWorld
> Task :helloWorld
Starting task: helloWorld
About to say hello...
Hello, World!
Finished task: helloWorld
BUILD SUCCESSFUL in 395ms
1 actionable task: 1 executed
gradle tasks output:
Configuring task: tasks
> Task :tasks
Starting task: tasks
Configuring task: help # Configuration happens only when a task is needed!
Configuring task: projects
... list of tasks ...
Configuring task: helloWorld
------------------------------------------------------------
Tasks runnable from root project 'configuration-avoidance'
------------------------------------------------------------
... list of tasks ...
BUILD SUCCESSFUL in 239ms
1 actionable task: 1 executed
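The register/configure behavior shown above (closures stored at registration, executed only when a task is actually realized) can be mimicked in a few lines of plain Python; the `TaskRegistry` class is purely illustrative, not a Gradle API:

```python
# Toy model of configuration avoidance: `register` only stores a
# configuration closure; closures run when the task is first realized
# (i.e., actually needed), not when the build script is evaluated.

class TaskRegistry:
    def __init__(self):
        self._configs = {}   # name -> list of configuration closures
        self._realized = {}  # name -> realized task

    def register(self, name, configure):
        self._configs[name] = [configure]

    def configure(self, name, configure):  # late configuration
        self._configs[name].append(configure)

    def realize(self, name):
        if name not in self._realized:
            task = {"name": name, "actions": []}
            for configure in self._configs[name]:
                configure(task)  # configuration happens only now
            self._realized[name] = task
        return self._realized[name]

registry = TaskRegistry()
log = []
registry.register("helloWorld", lambda t: log.append("configured " + t["name"]))
assert log == []          # registration alone configures nothing
registry.realize("helloWorld")
```

This is why `gradle tasks` prints “Configuring task: …” only when a given task is needed: realization, not registration, triggers the configuration closures.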
Gradle offers some facilities to make writing new tasks easier
An example is the org.gradle.api.tasks.Exec task type, representing a command to be executed on the underlying command line
The task type can be specified at task registration time.
Any open class implementing org.gradle.api.Task can be instantiated.
Tasks of unspecified type are plain DefaultTasks
import org.gradle.internal.jvm.Jvm // Jvm is part of the Gradle API
tasks.register<Exec>("printJavaVersion") { // Do you recognize this? Inline function with reified type!
// Configuration action is of type T.() -> Unit, in this case Exec.() -> Unit
val javaExecutable = Jvm.current().javaExecutable.absolutePath
commandLine( // this is a method of class org.gradle.api.tasks.Exec
javaExecutable, "-version"
)
// There is no need of doLast / doFirst, actions are already configured
// Still, we may want to do something before or after the task has been executed
doLast { println("$javaExecutable invocation complete") }
doFirst { println("Ready to invoke $javaExecutable") }
}
> Task :printJavaVersion
Ready to invoke /usr/lib/jvm/java-11-openjdk/bin/java
openjdk version "11.0.8" 2020-07-14
OpenJDK Runtime Environment (build 11.0.8+10)
OpenJDK 64-Bit Server VM (build 11.0.8+10, mixed mode)
/usr/lib/jvm/java-11-openjdk/bin/java invocation complete
Let’s compile a simple src/HelloWorld.java:
class HelloWorld {
public static void main(String... args) {
System.out.println("Hello, World!");
}
}
Build logic:
javac -d destination <files>

Which step should be in configuration, and which in execution?
General rule: move as much as possible to execution
tasks.register<Exec>("compileJava") {
// Computed at configuration time
val sources = TODO("assume this is expensive")
// configuration that needs "sources"
doFirst { // We need to compute the sources and classpath as late as possible!
sources.forEach { ... }
}
}
How can Gradle determine which tasks some task depends on?
Inputs and outputs can be configured via the inputs and outputs properties of a task.
Inputs and outputs are also used by Gradle to determine if a task is UP-TO-DATE,
namely, if it can be skipped because its inputs have not changed since the last execution.
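An UP-TO-DATE check is essentially a fingerprint comparison over the declared inputs. A simplified sketch; `IncrementalTask` and `fingerprint` are illustrative names, and real build tools also fingerprint outputs and persist the state across invocations:

```python
# Sketch of an UP-TO-DATE check: fingerprint the declared inputs and
# skip execution when nothing changed since the previous run.

import hashlib

def fingerprint(inputs):
    """inputs: mapping from file name to file content (bytes)."""
    digest = hashlib.sha256()
    for name in sorted(inputs):      # sorted: order must not matter
        digest.update(name.encode())
        digest.update(inputs[name])
    return digest.hexdigest()

class IncrementalTask:
    def __init__(self, action):
        self.action = action
        self.last_fingerprint = None

    def execute(self, inputs):
        current = fingerprint(inputs)
        if current == self.last_fingerprint:
            return "UP-TO-DATE"      # inputs unchanged: skip the action
        self.action(inputs)
        self.last_fingerprint = current
        return "EXECUTED"
```

Declaring `inputs.dir(...)` and `outputs.dir(...)`, as in the task below, is what feeds this kind of check.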
Gradle supports the construction of lazy properties and providers:
Provider – a value that can only be queried and cannot be changed
- can be transformed into another Provider via the map method
- created with project.provider { ... } (project can be omitted in build scripts)

Property – a value that can be queried and also changed
- set by passing a value or a Provider
- created with project.objects.property<Type>() (project can be omitted in build scripts)

import org.gradle.internal.jvm.Jvm
tasks.register<Exec>("compileJava") { // This is a Kotlin lambda with receiver!
val sourceDir = projectDir.resolve("src")
inputs.dir(sourceDir)
val outputDir = layout.buildDirectory.dir("bin").get().asFile.absolutePath
outputs.dir(outputDir)
val javacExecutable = Jvm.current().javacExecutable.absolutePath // Use the current JVM's javac
executable(javacExecutable)
doFirst { // We need to compute the sources and classpath as late as possible
val sources = sourceDir.walkTopDown().filter { it.isFile && it.extension == "java" }.toList()
println(sources)
when {
sources.isEmpty() -> {
println("No source files found, skipping compilation.")
args("-version")
}
else -> args(
// destination folder: the output directory of Gradle, inside "bin"
"-d", outputDir,
*sources.toTypedArray(),
)
}
}
}
Execution:
gradle compileJava
BUILD SUCCESSFUL in 693ms
Compiled files are in build/bin!
Dependency management in Gradle is rooted in two fundamental concepts:
Let’s see a use case: compiling a Java source with a dependency
In javac terms, we need to feed some jars to the -cp flag of the compiler.

Conceptually, we want something like:
// Gradle way to create a configuration
val compileClasspath ... // Delegation!
dependencies {
compileClasspath.add(dir("libs").files.filter { it.extension == "jar" })
}
To be consumed by our improved compile task:
tasks.register<Exec>("compileJava") {
...
else -> args(
"-d", outputDir,
// classpath from the configuration
"-cp", "${File.pathSeparator}${compileClasspath.asPath}",
*sources.toTypedArray(),
)
}
A minimal DSL to simplify file access:
// Minimal file access DSL
object AllFiles
val Project.allFiles: AllFiles get() = AllFiles // We need this to prevent "Object AllFiles captures the script class instance" error
data class Finder(val path: File) {
fun withExtension(extension: String): List<File> =
path.walkTopDown().filter { it.isFile && it.extension == extension }.toList()
}
fun AllFiles.inFolder(path: String) = Finder(projectDir.resolve(path))
Dependency declaration (configuration time):
val compileClasspath: Configuration by configurations.creating // Delegation!
dependencies { // built-in in Gradle
AllFiles.inFolder("libs").withExtension("jar").forEach { // Not Gradle: defined below
compileClasspath(files(it)) // The Configuration class overrides the invoke operator
}
}
Dependency use (execution time):
doFirst { // We need to compute the sources and classpath as late as possible
val sources = AllFiles.inFolder("src").withExtension("java")
println(sources)
when {
sources.isEmpty() -> {
println("No source files found, skipping compilation.")
args("-version")
}
else -> args(
// destination folder: the output directory of Gradle, inside "bin"
"-d", outputDir,
// classpath from the configuration
"-cp", "${File.pathSeparator}${compileClasspath.asPath}",
*sources.toTypedArray(),
)
}
}
Full example:
import org.gradle.internal.jvm.Jvm
val compileClasspath: Configuration by configurations.creating // Delegation!
dependencies { // built-in in Gradle
AllFiles.inFolder("libs").withExtension("jar").forEach { // Not Gradle: defined below
compileClasspath(files(it)) // The Configuration class overrides the invoke operator
}
}
tasks.register<Exec>("compileJava") { // This is a Kotlin lambda with receiver!
inputs.dir(projectDir.resolve("src"))
val outputDir = layout.buildDirectory.dir("bin").get().asFile.absolutePath
outputs.dir(outputDir)
val javacExecutable = Jvm.current().javacExecutable.absolutePath // Use the current JVM's javac
executable(javacExecutable)
doFirst { // We need to compute the sources and classpath as late as possible
val sources = AllFiles.inFolder("src").withExtension("java")
println(sources)
when {
sources.isEmpty() -> {
println("No source files found, skipping compilation.")
args("-version")
}
else -> args(
// destination folder: the output directory of Gradle, inside "bin"
"-d", outputDir,
// classpath from the configuration
"-cp", "${File.pathSeparator}${compileClasspath.asPath}",
*sources.toTypedArray(),
)
}
}
}
// Minimal file access DSL
object AllFiles
val Project.allFiles: AllFiles get() = AllFiles // We need this to prevent "Object AllFiles captures the script class instance" error
data class Finder(val path: File) {
fun withExtension(extension: String): List<File> =
path.walkTopDown().filter { it.isFile && it.extension == extension }.toList()
}
fun AllFiles.inFolder(path: String) = Finder(projectDir.resolve(path))
Next step: we can compile, why not execute the program as well?
We create a runtimeClasspath configuration extending compileClasspath:

val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies { // built-in in Gradle
AllFiles.inFolder("libs").withExtension("jar").forEach { // Not Gradle: defined below
compileClasspath(files(it)) // The Configuration class overrides the invoke operator
}
}
tasks.register<Exec>("runJava") {
inputs.dir(compilationDestination)
dependsOn(compileJava) // IMPORTANT! This is a task dependency, we must compile before running
executable(Jvm.current().javaExecutable.absolutePath)
doFirst {
args(
"-cp", "${compilationDestination}${File.pathSeparator}${runtimeClasspath.asPath}",
"HelloMath",
)
}
}
import org.gradle.internal.jvm.Jvm
val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies { // built-in in Gradle
AllFiles.inFolder("libs").withExtension("jar").forEach { // Not Gradle: defined below
compileClasspath(files(it)) // The Configuration class overrides the invoke operator
}
}
val compilationDestination: String = layout.buildDirectory.dir("bin").get().asFile.absolutePath
val compileJava = tasks.register<Exec>("compileJava") { // This is a Kotlin lambda with receiver!
inputs.dir(projectDir.resolve("src"))
outputs.dir(compilationDestination)
val javacExecutable = Jvm.current().javacExecutable.absolutePath // Use the current JVM's javac
executable(javacExecutable)
doFirst { // We need to compute the sources and classpath as late as possible
val sources = AllFiles.inFolder("src").withExtension("java")
println(sources)
when {
sources.isEmpty() -> {
println("No source files found, skipping compilation.")
args("-version")
}
else -> args(
// destination folder: the output directory of Gradle, inside "bin"
"-d", compilationDestination,
// classpath from the configuration
"-cp", "${File.pathSeparator}${compileClasspath.asPath}",
*sources.toTypedArray(),
)
}
}
}
tasks.register<Exec>("runJava") {
inputs.dir(compilationDestination)
dependsOn(compileJava) // IMPORTANT! This is a task dependency, we must compile before running
executable(Jvm.current().javaExecutable.absolutePath)
doFirst {
args(
"-cp", "${compilationDestination}${File.pathSeparator}${runtimeClasspath.asPath}",
"HelloMath",
)
}
}
// Minimal file access DSL
object AllFiles
val Project.allFiles: AllFiles get() = AllFiles // We need this to prevent "Object AllFiles captures the script class instance" error
data class Finder(val path: File) {
fun withExtension(extension: String): List<File> =
path.walkTopDown().filter { it.isFile && it.extension == extension }.toList()
}
fun AllFiles.inFolder(path: String) = Finder(projectDir.resolve(path))
Let us temporarily comment:
dependsOn(compileJava) // IMPORTANT! This is a task dependency, we must compile before running
and run:
> Task :runJava FAILED
[Incubating] Problems report is available at: file:///home/danysk/LocalProjects/spe-slides/examples/run-java-deps/build/reports/problems/problems-report.html
FAILURE: Build failed with an exception.
* What went wrong:
A problem was found with the configuration of task ':runJava' (type 'Exec').
- Type 'org.gradle.api.tasks.Exec' property '$1' specifies directory '/home/danysk/LocalProjects/spe-slides/examples/run-java-deps/build/bin' which doesn't exist.
Reason: An input file was expected to be present but it doesn't exist.
Possible solutions:
1. Make sure the directory exists before the task is called.
2. Make sure that the task which produces the directory is declared as an input. # <<<< OUR PROBLEM!!
For more information, please refer to https://docs.gradle.org/8.14/userguide/validation_problems.html#input_file_does_not_exist in the Gradle documentation.
* Try:
> Make sure the directory exists before the task is called
> Make sure that the task which produces the directory is declared as an input
> Run with --scan to get full insights.
BUILD FAILED in 752ms
1 actionable task: 1 executed
We want runJava to run after compileJava.

Output of gradle runJava with the dependency correctly set:
> Task :compileJava
[/home/danysk/LocalProjects/spe-slides/examples/run-java-deps/src/HelloMath.java]
> Task :runJava
avg=3.0, std.dev=1.5811388300841898
regression: y = 2.0300000000000002 * x + -0.030000000000001137
R²=0.998957626296907
BUILD SUCCESSFUL in 581ms
2 actionable tasks: 2 executed
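Behind dependsOn, the build tool derives an execution order by topologically sorting the task graph. A compact sketch (`execution_order` is an illustrative name; Gradle’s real scheduler also parallelizes independent tasks):

```python
# Sketch of how `dependsOn` declarations become an execution order:
# a depth-first topological sort of the task graph.

def execution_order(target, depends_on):
    """depends_on: task name -> list of task names it depends on."""
    order, visiting, done = [], set(), set()

    def visit(task):
        if task in done:
            return
        if task in visiting:
            raise RuntimeError("circular dependency involving " + task)
        visiting.add(task)
        for dependency in depends_on.get(task, []):
            visit(dependency)
        visiting.discard(task)
        done.add(task)
        order.append(task)

    visit(target)
    return order
```

With `runJava` depending on `compileJava`, the sort yields compilation first, which is exactly what the `> Task :compileJava` / `> Task :runJava` output above shows.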
Dependencies permeate the world of build automation.
no guarantee that automation written with some tool at version X will work at version Y!
Gradle proposes a (partial) solution with the so-called Gradle wrapper
Gradle ships a built-in wrapper task:
- gradle wrapper --gradle-version=<VERSION> generates the wrapper scripts gradlew (Unix) and gradlew.bat (Windows)

The Gradle wrapper is the correct way to use Gradle, and we’ll be using it from now on.
At the moment, we have part of the project that’s declarative, and part that’s imperative:
The declarative part is the one for which we had a built-in API!
The base mechanism at work here is hiding imperativity under a clean, declarative API.
Also “purely declarative” build systems, such as Maven, which are driven with markup files, hide their imperativity behind a curtain (in the case of Maven, plugins that are configured in the pom.xml, but implemented elsewhere).
Usability, understandability, and, ultimately, maintainability increase when:
Let’s begin isolating imperativity by refactoring our hierarchy of operations.
interface TaskWithClasspath : Task {
val classpath: Property<FileCollection>
}
interface JavaCompileTask : TaskWithClasspath {
val sources: Property<FileCollection>
val destinationDir: DirectoryProperty
}
interface JavaRunTask : TaskWithClasspath {
val mainClass: Property<String>
}
Next step: let’s inherit the behavior of our tasks from Exec:
Gradle supports the definition of new task types:
- they must implement the Task interface
- they usually extend DefaultTask (directly or indirectly), and can be abstract
- the method implementing the task behavior is annotated with @TaskAction, and will get invoked to execute the task

In recent Gradle versions, it is mandatory to annotate every public property’s getter of a task with a marker annotation for Gradle to treat it as an input or an output:
- inputs: @Input, @InputFile, @InputFiles, @InputDirectory, @InputDirectories, @Classpath
- outputs: @OutputFile, @OutputFiles, @OutputDirectory, @OutputDirectories
- @Internal marks internal properties (not reified on the file system)
- in Kotlin, the annotation must target the getter: @get:Input, etc.

Declared inputs and outputs also enable continuous builds via the -t option.

abstract class AbstractJvmExec : TaskWithClasspath, Exec() {
@get:Classpath
override val classpath: Property<FileCollection> = project.objects.property()
init {
executable(Jvm.current().jvmExecutableForTask().absolutePath)
}
// Extension function with virtual dispatch receiver!
protected abstract fun Jvm.jvmExecutableForTask(): File
}
abstract class JavaRun : JavaRunTask, AbstractJvmExec() {
@get:Input
override val mainClass: Property<String> = project.objects.property()
override fun Jvm.jvmExecutableForTask(): File = javaExecutable
@TaskAction
override fun exec() {
args(
"-cp", classpath.get().asPath,
mainClass.get(),
)
super.exec()
}
}
abstract class JavaCompile: JavaCompileTask, AbstractJvmExec() {
@get:InputFiles
override val sources: Property<FileCollection> = project.objects.property()
@get:OutputDirectory
override val destinationDir: DirectoryProperty = project.objects.directoryProperty()
override fun Jvm.jvmExecutableForTask(): File = javacExecutable
@TaskAction
override fun exec() {
when {
sources.get().isEmpty -> {
println("No source files found, skipping compilation.")
args("-version")
}
else -> args(
// destination folder: the output directory of Gradle, inside "bin"
"-d", destinationDir.get().asFile.absolutePath,
// classpath from the configuration
"-cp", "${File.pathSeparator}${classpath.get().asPath}",
*sources.get().files.toTypedArray(),
)
}
super.exec()
}
}
In our main build.gradle.kts, we have:
a declarative part
// DECLARATIVE (what)
val compilationDestination = layout.buildDirectory.dir("bin").get().asFile
val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies { // built-in in Gradle
allFiles.inFolder("libs").withExtension("jar").forEach { // Not Gradle: defined below
compileClasspath(files(it)) // The Configuration class overrides the invoke operator
}
runtimeClasspath(files(compilationDestination))
}
tasks.register<JavaCompile>("compileJava") {
classpath = compileClasspath
destinationDir = compilationDestination
sources = files(project.allFiles.inFolder("src").withExtension("java"))
}
tasks.register<JavaRun>("runJava") {
classpath = runtimeClasspath
mainClass = "HelloMath"
dependsOn(tasks.named("compileJava"))
}
an imperative part
// IMPERATIVE (how)
// Minimal file access DSL
object AllFiles
val Project.allFiles: AllFiles get() = AllFiles // We need this to prevent "Object AllFiles captures the script class instance" error
data class Finder(val path: File) {
fun withExtension(extension: String): List<File> =
path.walkTopDown().filter { it.isFile && it.extension == extension }.toList()
}
fun AllFiles.inFolder(path: String) = Finder(projectDir.resolve(path))
interface TaskWithClasspath : Task {
val classpath: Property<FileCollection>
}
interface JavaCompileTask : TaskWithClasspath {
val sources: Property<FileCollection>
val destinationDir: DirectoryProperty
}
interface JavaRunTask : TaskWithClasspath {
val mainClass: Property<String>
}
abstract class AbstractJvmExec : TaskWithClasspath, Exec() {
(continues a lot further)
Hide the imperative part under the hood, and expose a purely declarative API to the user.
Gradle provides a way to define project-wise build APIs using a special buildSrc folder
Directory structure:
project-folder
├── build.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src
│ └── main
│ └── kotlin
│ ├── ImperativeAPI.kt
│ └── MoreImperativeAPIs.kt
└── settings.gradle.kts
buildSrc/build.gradle.kts’ contents (this will become clearer later):
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
Directory structure for our Java infrastructure:
examples/buildsrc
├── build.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src
│ └── main
│ └── kotlin
│ ├── AllFiles.kt
│ ├── JavaCompile.kt
│ ├── JavaRun.kt
│ └── JavaTasksAPI.kt
├── gradlew
├── gradlew.bat
├── libs
│ └── commons-math3-3.6.1.jar
└── src
└── HelloMath.java
our new build.gradle.kts:
val compilationDestination = project.layout.buildDirectory.dir("bin").get().asFile
val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies {
allFilesIn("libs").withExtension("jar").forEach {
compileClasspath(files(it))
}
runtimeClasspath(files(compilationDestination))
}
tasks.register<JavaCompile>("compileJava") {
classpath = compileClasspath
destinationDir = compilationDestination
sources = files(allFilesIn("src").withExtension("java"))
}
tasks.register<JavaRun>("runJava") {
classpath = runtimeClasspath
dependsOn(tasks.named("compileJava"))
}
we can use all types defined in buildSrc/src/main/kotlin/ in the main project’s build.gradle.kts!
What our build.gradle.kts defines is now a convention for Java projects:
val compilationDestination = project.layout.buildDirectory.dir("bin").get().asFile
val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies {
allFilesIn("libs").withExtension("jar").forEach {
compileClasspath(files(it))
}
runtimeClasspath(files(compilationDestination))
}
tasks.register<JavaCompile>("compileJava") {
classpath = compileClasspath
destinationDir = compilationDestination
sources = files(allFilesIn("src").withExtension("java"))
}
tasks.register<JavaRun>("runJava") {
classpath = runtimeClasspath
dependsOn(tasks.named("compileJava"))
}
a compileClasspath configuration
a runtimeClasspath configuration, which extends from compileClasspath
jars in libs are added to both configurations
a compileJava task that compiles all Java sources in src
a runJava task that runs a specified main class
These could be valid for any Java project!
Conventions can be factored into precompiled script plugins: write them in buildSrc/src/main/kotlin/convention-name.gradle.kts, then apply them in build.gradle.kts via:
plugins {
id("convention-name")
}
buildSrc/src/main/kotlin/java-convention.gradle.kts:
val compilationDestination = project.layout.buildDirectory.dir("bin").get().asFile
val compileClasspath: Configuration by configurations.creating
val runtimeClasspath: Configuration by configurations.creating {
extendsFrom(compileClasspath)
}
dependencies {
allFilesIn("libs").withExtension("jar").forEach {
compileClasspath(files(it))
}
runtimeClasspath(files(compilationDestination))
}
tasks.register<JavaCompile>("compileJava") {
classpath = compileClasspath
destinationDir = compilationDestination
sources = files(allFilesIn("src").withExtension("java"))
}
tasks.register<JavaRun>("runJava") {
classpath = runtimeClasspath
dependsOn(tasks.named("compileJava"))
}
build.gradle.kts:
plugins {
id("java-convention")
}
tasks.runJava.configure { // We must set the main class here
mainClass = "HelloMath"
}
Sometimes projects are modular
Where a module is a sub-project with a clear identity, possibly reusable elsewhere
Examples:
Modular software simplifies maintenance and improves understandability
Modules may depend on other modules
Some build tasks of some module may require build tasks of other modules to be complete before execution
Let us split our project into two components:
We need to reorganize the build logic to something similar to
hierarchical-project
|__:library
\__:app
Desiderata:
Gradle (like many other build automation tools)
offers built-in support for hierarchical projects.
Gradle is limited to two levels; other products, such as Maven, have no such limitation
Subprojects are listed in a settings.gradle.kts file
Incidentally, it’s the same place where the project name can be specified
Subprojects must have their own build.gradle.kts
They can also have their own settings.gradle.kts, e.g. for selecting a name different from their folder’s
rootProject.name = "project-with-hierarchy"
include(":library") // There must be a folder named "library"
include(":app") // There must be a folder named "app"
Common configuration can be shared via the allprojects block:
allprojects {
// Executed for every project, including the root one
// here, `project` refers to the current project
}
or via the subprojects block:
subprojects {
// Executed for all subprojects
// here, `project` refers to the current project
}
In each subproject’s build.gradle.kts, add further customization as necessary:
dependencies {
compileClasspath(project(":library")) { // My compileClasspath configuration depends on project library
targetConfiguration = "runtimeClasspath" // Specifically, from its runtime
}
}
Building app requires library to be compiled first:
tasks.compileJava { dependsOn(project(":library").tasks.compileJava) }
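Putting the pieces together, a sketch of what the app subproject’s build file could look like; the convention plugin id and the task wiring are assumptions based on the previous sections, not a verbatim reproduction of the example project:

```kotlin
// Hypothetical app/build.gradle.kts in the hierarchical project
plugins {
    id("java-convention") // the convention plugin defined in buildSrc
}
dependencies {
    // app's compile classpath includes library's runtime artifacts
    compileClasspath(project(":library")) {
        targetConfiguration = "runtimeClasspath"
    }
}
// ensure library is compiled before app's sources
tasks.named("compileJava") {
    dependsOn(project(":library").tasks.named("compileJava"))
}
```

Using named(...) instead of direct task references keeps the configuration lazy, so the library’s tasks need not exist yet when this script is evaluated.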
We now have a rudimentary infrastructure for building and running Java projects
What if we want to reuse it?
Of course, copy/pasting the same file across projects is to be avoided whenever possible
Gradle (like many other build systems) allows extensibility via plugins
A plugin is a software component that extends the API of the base system
It usually includes:
Tasks
an Extension – an object encapsulating the global configuration options
a Plugin object, implementing an apply(Project) function
Plugin is the entry point of the declared plugin
a META-INF/gradle-plugins/<plugin-name>.properties descriptor file
General approach to a new build automation problem:
Divide: Identify the base steps, they could become your tasks
Conquer: Clearly express the dependencies among them
Encapsulate: confine imperative logic, make it an implementation detail
Adorn: provide a DSL that makes the library easy and intuitive to use
Not very different from what’s usually done in (good) software development
Plugins are loaded from the buildEnvironment classpath, and their apply(Project) function is invoked upon application. Example code:
plugins {
pluginName // Loads a plugin from the "buildEnvironment" classpath
`plugin-name` // Syntax for plugin names that are not valid Kotlin identifiers
id("plugin2-name") // Alternative to the former
id("some-custom-plugin") version "1.2.3" // if not found locally, gets fetched from the Gradle plugin portal
}
// In case of non-hierarchical projects, plugins are also "applied"
// Otherwise, they need to get applied manually, e.g.:
allprojects {
apply(plugin = "pluginName")
}
The default Gradle distribution includes a large number of plugins, e.g.:
the java plugin, for applications written in Java
the java-library plugin, for Java libraries (with no main class)
the scala plugin
the cpp plugin, for C++
the kotlin plugin, supporting Kotlin with multiple targets (JVM, JavaScript, native)
We are going to use the Kotlin JVM plugin to build our first standalone plugin!
(yes we already did write our first one: code in buildSrc is project-local plugin code)
A very simple plugin that greets the user
Desiderata:
a greet task that prints a greeting
plugins {
id("org.danilopianini.template-for-gradle-plugins")
}
hello {
author.set("Danilo Pianini")
}
First step: we need to set up a Kotlin build, we’ll write our plugin in Kotlin
plugins {
// No magic: calls a method running behind the scenes, equivalent to id("org.jetbrains.kotlin.jvm")
kotlin("jvm") version "2.2.20" // version is necessary
}
The Kotlin plugin introduces tasks and configurations to compile and package Kotlin code
Second step: we need to declare where to find dependencies
// Configuration of software sources
repositories {
mavenCentral() // points to Maven Central
}
dependencies {
// "implementation" is a configuration created by the Kotlin JVM plugin
implementation(...) // we can load libraries here
}
Third step, we need the Gradle API
dependencies {
implementation(gradleApi()) // Built-in method, returns a `Dependency` to the current Gradle version
api(gradleKotlinDsl()) // Built-in method, returns a `Dependency` to the Gradle Kotlin DSL library
}
Gradle expects the plugin entry point (the class implementing the Plugin interface) to be specified in a manifest file
META-INF/gradle-plugins/<plugin-name>.properties
The plugin name is usually a “reverse URL”, similar to Java packages.
e.g., it.unibo.spe.greetings
The file content is just a pointer to the class implementing Plugin, for instance:
implementation-class=it.unibo.spe.firstplugin.GreetingPlugin
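Writing this descriptor by hand can be avoided: the java-gradle-plugin plugin (used later for publishing as well) generates it from a declarative block. A minimal sketch, where the internal name "greetings" is an arbitrary placeholder:

```kotlin
// build.gradle.kts of the plugin project
plugins {
    `java-gradle-plugin`
    kotlin("jvm") version "2.2.20"
}
gradlePlugin {
    plugins {
        create("greetings") { // arbitrary internal name
            // generates META-INF/gradle-plugins/it.unibo.spe.greetings.properties
            id = "it.unibo.spe.greetings"
            implementationClass = "it.unibo.spe.firstplugin.GreetingPlugin"
        }
    }
}
```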
Usually, composed of:
inside src/main/kotlin/<package-path>/:
HelloTask implementation:
open class HelloTask : DefaultTask() {
/**
* The author of the greeting, lazily set.
*/
@get:Input
val author: Property<String> = project.objects.property()
/**
* Read-only property calculated from the greeting.
*/
@get:Internal
val message: Provider<String> = author.map { "Hello from $it" }
/**
* This is the code that is executed when the task is run.
*/
@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}
HelloExtension, the DSL entrypoint:
open class HelloExtension(objects: ObjectFactory) {
/**
* This is where you write your DSL to control the plugin.
*/
val author: Property<String> = objects.property()
}
HelloGradle, the plugin entrypoint:
its apply method is called upon application:
open class HelloGradle : Plugin<Project> {
override fun apply(target: Project) {
val extension = target.extensions.create<HelloExtension>("hello")
// Enables `hello { ... }` in build.gradle.kts
target.tasks.register<HelloTask>("hello") {
author.set(extension.author)
}
}
}
The Plugin configures the project as needed for the tasks and the extension to work
Plugins can apply other plugins (e.g., the kotlin-jvm plugin applies the java-library plugin behind the scenes), and can react to other plugins via the plugins property of Project, e.g.:
project.plugins.withType(JavaPlugin::class.java) {
// Stuff you want to do only if someone enables the Java plugin for the current project
}
Gradle provides a test kit, to launch Gradle programmatically and inspect the execution results
It’s just a matter of pulling in the right dependencies
dependencies {
implementation(gradleApi())
implementation(gradleKotlinDsl())
testImplementation(gradleTestKit()) // available at test compile time and test runtime
}
By default, the Gradle test kit just runs Gradle. We want to inject our plugin into the distribution:
This operation is now built into the test kit:
// Configure a Gradle runner and execute the build
val result = GradleRunner.create()
    .withProjectDir(testProjectDir) // a (hypothetical) File pointing at the project under test
    .withPluginClasspath() // we need Gradle **and** our plugin
    .withArguments(":tasks", ":you", ":need", ":to", ":run", "--and", "--cli", "--options")
    .build() // This actually runs Gradle
// Inspect results
result.task(":someExistingTask")?.outcome shouldBe TaskOutcome.SUCCESS
result.output shouldContain "Hello from Gradle"
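These pieces can be combined into an end-to-end test. A minimal sketch, assuming JUnit 5 and Kotest assertions on the test classpath, and a hypothetical plugin id and DSL matching the greeting plugin above:

```kotlin
import io.kotest.matchers.shouldBe
import io.kotest.matchers.string.shouldContain
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.io.TempDir
import java.io.File

class GreetingPluginTest {
    @Test
    fun `the hello task greets the configured author`(@TempDir projectDir: File) {
        // Generate a throwaway test project that applies the plugin under test
        projectDir.resolve("settings.gradle.kts").writeText("""rootProject.name = "testbed"""")
        projectDir.resolve("build.gradle.kts").writeText(
            """
            plugins { id("it.unibo.spe.greetings") } // hypothetical plugin id
            hello { author.set("Gradle") }
            """.trimIndent()
        )
        // Run Gradle with the plugin injected onto the classpath
        val result = GradleRunner.create()
            .withProjectDir(projectDir)
            .withPluginClasspath() // injects the plugin under test
            .withArguments("hello")
            .build()
        // Verify the task succeeded and printed the expected greeting
        result.task(":hello")?.outcome shouldBe TaskOutcome.SUCCESS
        result.output shouldContain "Hello from Gradle"
    }
}
```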
Look at the following example code:
dependencies {
testImplementation("io.kotest:kotest-runner-junit5:4.2.5")
testImplementation("io.kotest:kotest-assertions-core:4.2.5")
testImplementation("io.kotest:kotest-assertions-core-jvm:4.2.5")
}
It is repetitive and fragile (what if you change the version of a single kotest module?)
Let’s patch all this fragility:
dependencies {
val kotestVersion = "4.2.5"
testImplementation("io.kotest:kotest-runner-junit5:$kotestVersion")
testImplementation("io.kotest:kotest-assertions-core:$kotestVersion")
testImplementation("io.kotest:kotest-assertions-core-jvm:$kotestVersion")
}
Still, quite repetitive…
dependencies {
val kotestVersion = "4.2.5"
fun kotest(module: String) = "io.kotest:kotest-$module:$kotestVersion"
testImplementation(kotest("runner-junit5"))
testImplementation(kotest("assertions-core"))
testImplementation(kotest("assertions-core-jvm"))
}
Uhmm…
The kotest helper could be moved into buildSrc… but Gradle 7 introduced catalogs, a standardized way to collect and bundle dependencies.
Catalogs can be declared in:
the settings.gradle.kts file (they are exposed as an API, of course)
a TOML file (by convention, gradle/libs.versions.toml)
[versions]
dokka = "2.0.0"
konf = "1.1.2"
kotest = "6.0.4"
kotlin = "2.2.21"
[libraries]
apache-commons-lang3 = "org.apache.commons:commons-lang3:3.20.0"
classgraph = "io.github.classgraph:classgraph:4.8.184"
konf-yaml = { module = "com.uchuhimo:konf-yaml", version.ref = "konf" }
kotest-junit5-jvm = { module = "io.kotest:kotest-runner-junit5-jvm", version.ref = "kotest" }
kotest-assertions-core-jvm = { module = "io.kotest:kotest-assertions-core-jvm", version.ref = "kotest" }
[bundles]
kotlin-testing = [ "kotest-junit5-jvm", "kotest-assertions-core-jvm" ]
[plugins]
dokka = { id = "org.jetbrains.dokka", version.ref = "dokka" }
gitSemVer = "org.danilopianini.git-sensitive-semantic-versioning:7.0.6"
gradlePluginPublish = "com.gradle.plugin-publish:2.0.0"
jacoco-testkit = "pl.droidsonroids.jacoco.testkit:1.0.12"
kotlin-jvm = { id = "org.jetbrains.kotlin.jvm", version.ref = "kotlin" }
kotlin-qa = "org.danilopianini.gradle-kotlin-qa:0.98.0"
multiJvmTesting = "org.danilopianini.multi-jvm-test-plugin:4.3.2"
publishOnCentral = "org.danilopianini.publish-on-central:9.1.7"
taskTree = "com.dorongold.task-tree:4.0.1"
Gradle generates type-safe accessors for the definitions:
dependencies {
api(gradleApi())
api(gradleKotlinDsl())
api(kotlin("stdlib-jdk8"))
testImplementation(gradleTestKit())
testImplementation(libs.apache.commons.lang3)
testImplementation(libs.konf.yaml)
testImplementation(libs.classgraph)
testImplementation(libs.bundles.kotlin.testing)
}
Also for the plugins:
plugins {
`java-gradle-plugin`
alias(libs.plugins.dokka)
alias(libs.plugins.gitSemVer)
alias(libs.plugins.gradlePluginPublish)
alias(libs.plugins.jacoco.testkit)
alias(libs.plugins.kotlin.jvm)
alias(libs.plugins.kotlin.qa)
alias(libs.plugins.publishOnCentral)
alias(libs.plugins.multiJvmTesting)
alias(libs.plugins.taskTree)
}
We now have three different runtimes at play:
These toolchains should be controlled independently!
You may want to use Java 17 to run Gradle, but compile to Java 8-compatible bytecode, and then test on Java 11.
Default behaviour: Gradle uses the same JVM it is running in as:
Supporting multiple toolchains may not be easy!
Targeting a portable runtime (such as the JVM) helps a lot.
Define the reference toolchain version (compilation target):
java {
toolchain {
languageVersion.set(JavaLanguageVersion.of(11))
vendor.set(JvmVendorSpec.ADOPTOPENJDK) // Optionally, specify a vendor
implementation.set(JvmImplementation.J9) // Optionally, select an implementation
}
}
Create tasks for running tests on specific environments:
tasks.withType<Test>().singleOrNull()?.run {
// If there is exactly one test task, run it with a specific JVM version
javaLauncher.set(javaToolchains.launcherFor { languageVersion.set(JavaLanguageVersion.of(8)) })
}
// Register another test task, with a different JVM
val testWithJVM17 by tasks.registering(Test::class) { // Also works with JavaExec
javaLauncher.set(javaToolchains.launcherFor { languageVersion.set(JavaLanguageVersion.of(17)) })
} // You can pick JVMs not yet supported by Gradle!
tasks.findByName("check")?.dependsOn(testWithJVM17) // make it part of the QA suite
We now know how to build a plugin, we know how to test it,
we don’t know how to make it available to other projects!
We want something like:
plugins {
id("our.plugin.id") version "our.plugin.version"
}
To do so, we need to ship our plugin to the Gradle plugin portal
Gradle provides a plugin publishing plugin to simplify delivery
…but before, we need to learn how to
click
click
The project version can be specified in Gradle by simply setting the version property of the project:
version = "0.1.0"
It would be better to rely on the underlying DVCS
to compute a Semantic Versioning compatible version!
There are a number of plugins that do so
including one I’ve developed
Minimal configuration:
plugins {
id ("org.danilopianini.git-sensitive-semantic-versioning") version "<latest version>"
}
./gradlew printGitSemVer
> Task :printGitSemVer
Version computed by GitSemVer: 0.1.0-archeo+cf5b4c0
Another possibility is writing a plugin yourself
But at the moment we are stuck: we don’t know yet how to expose plugins to other builds
There’s not really much I want to protect in this example, so I’m going to pick one of the most open licenses: MIT (BSD would have been a good alternative)
Copyright 2020 Danilo Pianini
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
JVM artifacts are normally shipped in the form of jar archives
the de-facto convention is inherited from Maven:
com.google.guava:guava:29.0-jre
groupId: com.google.guava
artifactId: guava
version: 29.0-jre
The published artifacts comprise:
a pom.xml file
guava-29.0-jre.jar
guava-29.0-jre-javadoc.jar
guava-29.0-jre-sources.jar
In order to create Maven-compatible artifacts, we first need to set the groupId:
group = "it.unibo.firstplugin"
Many repositories require registering the group and associating developer identities with it
The project name set in settings.gradle.kts is usually used as artifactId
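For a plain library (rather than a Gradle plugin), the built-in maven-publish plugin can assemble artifacts with these coordinates. A minimal sketch; the group and version values are illustrative:

```kotlin
plugins {
    `java-library`
    `maven-publish`
}
group = "it.unibo.firstplugin" // the groupId
version = "0.1.0"
java {
    withSourcesJar() // produces the -sources.jar artifact
    withJavadocJar() // produces the -javadoc.jar artifact
}
publishing {
    publications {
        create<MavenPublication>("maven") {
            // the artifactId defaults to the project name from settings.gradle.kts
            from(components["java"])
        }
    }
}
```

Running `./gradlew publishToMavenLocal` then installs the jar, sources, javadoc, and generated pom into the local Maven repository.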
Gradle provides two plugins to simplify the assembly and upload of plugins
plugins {
`java-gradle-plugin`
id("com.gradle.plugin-publish") version "2.0.0"
}
gradlePlugin {
website.set(info.website)
vcsUrl.set(info.vcsUrl)
plugins {
create("") {
id = "$group.${project.name}"
displayName = info.longName
description = project.description
implementationClass = info.pluginImplementationClass
tags.set(info.tags)
}
}
}
They add the publishPlugins task
In order to publish on the Gradle Plugin Portal (but it is true for any repository) users need to be authenticated
This is most frequently done via authentication tokens, and more rarely by username and password.
It is first required to register; once done, an API key will be available from the web interface, along with a secret.
This data is required in order to publish, and can be fed to Gradle in two ways:
by editing the ~/.gradle/gradle.properties file, adding:
gradle.publish.key=YOUR_KEY
gradle.publish.secret=YOUR_SECRET
by passing them on the command line via -P flags:
./gradlew -Pgradle.publish.key=<key> -Pgradle.publish.secret=<secret> publishPlugins
The result is a published plugin:
❯ ./gradlew publishPlugins
> Task :publishPlugins
Publishing plugin it.unibo.spe.greetings-plugin version 0.1.0-archeo+ea6b9d7
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7.jar
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7-sources.jar
Publishing artifact build/libs/greetings-plugin-0.1.0-archeo+ea6b9d7-javadoc.jar
Publishing artifact build/publish-generated-resources/pom.xml
Activating plugin it.unibo.spe.greetings-plugin version 0.1.0-archeo+ea6b9d7
Static analysis is the automatic inspection of source code to detect potential problems, without executing it.
Test coverage tools measure how much of the code is executed while running tests.
If the lifecycle plugin is applied (it is auto-applied by most language-specific plugins), then a check task is available
check is meant to run all quality control tasks
newly added QA tasks should be attached to check
check depends on test
QA tasks normally produce reports that can be inspected to understand what went wrong (if anything)
Reports are typically generated under build/reports/
e.g., test reports end up in $buildDir/reports/tests (reporting tasks usually extend AbstractReportTask)
Useful Kotlin tools:
You know how to build and publish Gradle plugins: factor out the common part!
plugins {
// Just applies and pre-configures jacoco, detekt, and ktlint
id("org.danilopianini.gradle-kotlin-qa") version "0.2.1"
// Just applies and pre-configures jacoco, Spotbugs, PMD, and checkstyle
id("org.danilopianini.gradle-java-qa") version "0.2.1"
}
It is a good practice to automate the generation of the API documentation.
the java[-library] plugin adds a javadoc task to generate the Javadoc
the scala plugin includes a task of type ScalaDoc
In general:
Software products are usually shipped as (possibly executable) archives of some sort.
In the JVM world, the de-facto standard format is jar (Java ARchive)
Gradle provides tasks of type Jar to create such archives
The java-library and java plugins (applied behind the scenes by the kotlin-jvm plugin as well)
automatically create an assemble task, backed by a task of type Jar,
creating a non-executable jar with the project contents.
Many repositories require artifacts to be signed in order for them to be delivered/deployed. If you do not have a signature yet, it is time to create one:
gpg --gen-key
gpg --list-keys
gpg --keyserver keyserver.ubuntu.com --send-keys <KEY-ID>
Once you have a key, you can use the signing plugin to have Gradle generate signatures
To set a default signatory, add to your ~/.gradle/gradle.properties:
signing.keyId = <your key id>
signing.password = <redacted>
signing.secretKeyRingFile = <your user home>/.gnupg/secring.gpg
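With those properties in place, the signing plugin can generate signatures. A minimal sketch, assuming a maven-publish publication is already configured in the same build:

```kotlin
plugins {
    `maven-publish`
    signing
}
signing {
    // signs every registered publication with the default signatory
    // configured in ~/.gradle/gradle.properties
    sign(publishing.publications)
}
```

The plugin adds a Sign task per publication, wired so that publishing tasks depend on the corresponding signatures.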
Software repositories are services hosting software artifacts for distribution
e.g., gem yank retracts a package
Requirements:
registration of GitHub-based groups (io.github.yourghusername) is semi-automatic
a pom.xml file
The submission procedure has been greatly simplified recently with the Gradle plugin portal:
In rich projects, most of the build-related issues are due to pesky stuff going on with dependencies
Gradle allows for inspection of the dependencies:
./gradlew dependencies prints the dependency trees for each configuration
Inspecting multiple large trees can be difficult
./gradlew dependencyInsight --dependency <DepName> --configuration <ConfName>
When developing plugins or rich builds, the issue of dependencies also affects tasks
Gradle does not provide tools to inspect the task graph graphically, but a plugin exists.
plugins {
id("com.dorongold.task-tree") version "4.0.1"
}
It generates a taskTree task, printing the task tree of the tasks listed alongside taskTree.
Gradle supports a reporting system called Gradle build scans
A scan is produced by appending --scan to the build invocation. Example scans:
To avoid typing --scan at every invocation, configure scans in settings.gradle.kts:
develocity {
buildScan {
termsOfUseUrl = "https://gradle.com/terms-of-service"
termsOfUseAgree = "yes"
uploadInBackground = !System.getenv("CI").toBoolean()
}
}