From 3a82b34bcf12c0f30d03c7e84de3bf5d8cdb388f Mon Sep 17 00:00:00 2001 From: memsharded Date: Wed, 10 Jul 2024 01:30:53 +0200 Subject: [PATCH 01/22] initial CI tutorial --- devops.rst | 1 + .../packages_pipeline.rst | 10 ++ .../packages_pipeline/multi_configuration.rst | 2 + .../multi_configuration_lockfile.rst | 2 + .../single_configuration.rst | 2 + .../products_pipeline.rst | 31 +++++ .../single_configuration.rst | 2 + .../continuous_integration/project_setup.rst | 90 +++++++++++++++ devops/continuous_integration/tutorial.rst | 109 ++++++++++++++++++ 9 files changed, 249 insertions(+) create mode 100644 devops/continuous_integration/packages_pipeline.rst create mode 100644 devops/continuous_integration/packages_pipeline/multi_configuration.rst create mode 100644 devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst create mode 100644 devops/continuous_integration/packages_pipeline/single_configuration.rst create mode 100644 devops/continuous_integration/products_pipeline.rst create mode 100644 devops/continuous_integration/products_pipeline/single_configuration.rst create mode 100644 devops/continuous_integration/project_setup.rst create mode 100644 devops/continuous_integration/tutorial.rst diff --git a/devops.rst b/devops.rst index 08c2f353530..06d787fb11e 100644 --- a/devops.rst +++ b/devops.rst @@ -14,6 +14,7 @@ If you plan to use Conan in production in your project, team, or organization, t devops/using_conancenter devops/devops_local_recipes_index + devops/continuous_integration/tutorial devops/backup_sources/sources_backup devops/metadata devops/versioning diff --git a/devops/continuous_integration/packages_pipeline.rst b/devops/continuous_integration/packages_pipeline.rst new file mode 100644 index 00000000000..91d5db2325c --- /dev/null +++ b/devops/continuous_integration/packages_pipeline.rst @@ -0,0 +1,10 @@ +Packages pipeline +================== + + +.. 
toctree::
   :maxdepth: 1

   packages_pipeline/single_configuration
   packages_pipeline/multi_configuration
   packages_pipeline/multi_configuration_lockfile
diff --git a/devops/continuous_integration/packages_pipeline/multi_configuration.rst b/devops/continuous_integration/packages_pipeline/multi_configuration.rst
new file mode 100644 index 00000000000..a411cffe4c5 --- /dev/null +++ b/devops/continuous_integration/packages_pipeline/multi_configuration.rst
@@ -0,0 +1,2 @@
+Package pipeline: multi configuration
+=====================================
diff --git a/devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst b/devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst
new file mode 100644 index 00000000000..54dec9c2541 --- /dev/null +++ b/devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst
@@ -0,0 +1,2 @@
+Package pipeline: multi configuration using lockfiles
+=====================================================
diff --git a/devops/continuous_integration/packages_pipeline/single_configuration.rst b/devops/continuous_integration/packages_pipeline/single_configuration.rst
new file mode 100644 index 00000000000..171a4e58003 --- /dev/null +++ b/devops/continuous_integration/packages_pipeline/single_configuration.rst
@@ -0,0 +1,2 @@
+Package pipeline: single configuration
+======================================
diff --git a/devops/continuous_integration/products_pipeline.rst b/devops/continuous_integration/products_pipeline.rst
new file mode 100644 index 00000000000..6aa3cbfdb21 --- /dev/null +++ b/devops/continuous_integration/products_pipeline.rst
@@ -0,0 +1,31 @@
+Products pipeline
+==================

There are some important points to understand about the products pipeline:

- What are the **products**? The "products" are the main software artifacts that my organization delivers as its final result, providing some
  value for the users of those artifacts. In this example we will consider ``game/1.0`` and ``mapviewer/1.0`` the "products". Note that it is
  possible to define different versions of the same package as products, for example, if we had to maintain different versions of the ``game`` for
  different customers, we could have ``game/1.0`` and ``game/2.3`` as well as different versions of ``mapviewer`` as products.
- Why not define in CI the "users" or "consumers" of every package? It might be tempting to model the relationships between packages, in this
  case, that the package ``ai`` is used by the ``engine`` package, and then try to configure the CI so a build of ``engine`` is triggered after
  a build of ``ai``. But this approach does not scale at all and has very important limitations:

  - The example above is relatively simple, but in practice dependency graphs can have many more packages, even hundreds, making it very tedious and error-prone to define all the dependencies among packages in the CI.
  - Dependencies evolve over time: new versions are used, some dependencies are removed and newer dependencies are added. The simple relationships between repositories modeled at the CI level can result in a very inefficient, slow and time-consuming CI, if not a fragile one that continuously breaks because some dependencies change.
  - The combinatorial nature of what happens downstream in a dependency graph, where a relatively stable top dependency, let's say ``mathlib/1.0``, might be used by multiple consumers such as ``ai/1.0``, ``ai/1.1``, ``ai/1.2``, each of which might in turn be used by multiple different ``engine`` versions, and so on. Building only the latest version of the consumers would be insufficient in many cases, and building all of them would be extremely costly.
- In C and C++ projects the "products" pipeline becomes more necessary and critical than in other languages, due to the compilation model in which
  headers' textual inclusion becomes part of the consumers' binary artifacts, and due to the native artifacts
  linkage model. This means that in many scenarios it will be necessary to build new binaries that depend on some modified packages, even
  if the source of the package itself didn't change at all. Conan's ``package_id`` computation, together with some versioning conventions,
  can greatly help to efficiently define which packages need to be rebuilt and which ones don't.

.. toctree::
   :maxdepth: 1

   products_pipeline/single_configuration

diff --git a/devops/continuous_integration/products_pipeline/single_configuration.rst b/devops/continuous_integration/products_pipeline/single_configuration.rst
new file mode 100644 index 00000000000..8a3ef1953f6 --- /dev/null +++ b/devops/continuous_integration/products_pipeline/single_configuration.rst
@@ -0,0 +1,2 @@
+Products pipeline: single configuration
+=======================================
diff --git a/devops/continuous_integration/project_setup.rst b/devops/continuous_integration/project_setup.rst
new file mode 100644 index 00000000000..51fd9a35837 --- /dev/null +++ b/devops/continuous_integration/project_setup.rst
@@ -0,0 +1,90 @@
+Project setup
+=============

The code necessary for this tutorial is found in the ``examples2`` repo. Clone it and
move to the folder:

.. code-block:: bash

    $ git clone https://github.com/conan-io/examples2.git
    $ cd examples2/devops/ci/game

Server repositories setup
-------------------------

We need 3 different repositories in the same server. Make sure to have an Artifactory running and available. You can download the free `JFrog Artifactory Community Edition (CE) `_ and run it on your own computer. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``.

Edit the ``project_setup.py`` file:

.. code-block:: python

    # TODO: This must be configured by users
    SERVER_URL = "http:///artifactory/api/conan"
    USER = "admin"
    PASSWORD = "your password"

Initial dependency graph
------------------------

.. warning::

    - The initialization of the project will remove the contents of the 3 ``develop``, ``products`` and ``packages`` repositories.
    - The ``examples2/devops/ci/game`` folder contains a ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache.

.. code-block:: bash

    $ python project_setup.py

This will do several tasks: clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph, upload them to the ``develop`` repo, and then clean the local cache.

This dependency graph of packages in the ``develop`` repo is the starting point for our tutorial, assumed as a functional and stable "develop" state of the project that developers can ``conan install`` to work on any of the different packages.

.. graphviz::
   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }
\ No newline at end of file
diff --git a/devops/continuous_integration/tutorial.rst b/devops/continuous_integration/tutorial.rst
new file mode 100644 index 00000000000..687bc355899 --- /dev/null +++ b/devops/continuous_integration/tutorial.rst
@@ -0,0 +1,109 @@
+Continuous Integration (CI) tutorial
+====================================

Continuous Integration has different meanings for different users and organizations. In this tutorial we will cover the scenarios in which users
make changes to the source code of their packages and want to automatically build new binaries for those packages, and also check whether those new package changes integrate cleanly or break the organization's main products.

In this tutorial we will use this small project, which uses several packages (static libraries by default) to build a couple of applications: a video game and a map viewer utility:

.. graphviz::
   :align: center

   digraph game {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="BT"
       "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
       "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
       "mapviewer/1.0" -> "graphics/1.0";
       "game/1.0" [fillcolor="lightgreen"];
       "mapviewer/1.0" [fillcolor="lightgreen"];
       {
           rank = same;
           edge[ style=invis];
           "game/1.0" -> "mapviewer/1.0" ;
           rankdir = LR;
       }
   }

All of the packages in the dependency graph have a ``requires`` to their direct dependencies using version ranges. For example, ``game`` contains a ``requires("engine/[>=1.0 <2]")``, so new patch and minor versions of the dependencies will automatically be used without needing to modify the recipes (the sketch after the following note shows a quick way to inspect how these ranges resolve).

.. note::

    **Important notes**

    - This section is written as a hands-on tutorial. It is intended to be reproduced by copying the commands and running them on your machine.
    - The tutorial presents some of the tools, good practices and common approaches to the CI problem. But there are no silver bullets.
      This tutorial is not the only way that things can be done.
      Different organizations might have different needs and priorities, different build service capacity and budget, different sizes, etc.
      The principles and practices presented in the tutorial might need to be adapted.
    - However, some of the principles and best practices are general to all approaches. Things like package immutability, using promotions
      between repositories, and not using the ``channel`` for that purpose, are good practices that should be followed.
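A quick way to see how these version ranges resolve is to ask Conan to compute the dependency graph without building anything. This is a minimal sketch, not part of the tutorial steps, assuming the example project from the next section is already set up and the ``game`` folder contains its ``conanfile.py``:

.. code-block:: bash

    # Compute and print the resolved dependency graph for the game, showing how
    # ranges like "engine/[>=1.0 <2]" resolve to concrete versions such as engine/1.0
    $ conan graph info game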
Packages and products pipelines
-------------------------------

When a developer is doing some changes to a package's source code, we will consider 2 different parts or pipelines of the overall CI system:
the **packages pipeline** and the **products pipeline**.

The **packages pipeline** will build, create and upload the package binaries for the different configurations and platforms, when a
developer is submitting changes to the source code of one of the organization repositories. For example, a developer could be doing some changes
to the ``ai`` package, improving some of the library functionality, and bumping the version to ``ai/1.1.0``. If the organization needs to
support both Windows and Linux platforms, then the packages pipeline will build the new ``ai/1.1.0`` both for Windows and Linux, before
considering the changes valid. If some of the configurations fail to build under a specific platform, it is common to consider the
changes invalid and stop the processing of those changes, until the code is fixed.

The **products pipeline** answers a more challenging question: do my "products" build correctly with the latest changes that have been done
to the packages? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization's
important products to check if things integrate cleanly or break. Let's continue with the example above: if we now have a new ``ai/1.1.0`` package,
is it going to break the existing ``game/1.0`` and/or ``mapviewer/1.0`` applications? Is it necessary to re-build from source some of the existing
packages that depend directly or indirectly on the ``ai`` package? In this tutorial we will use ``game/1.0`` and ``mapviewer/1.0`` as our "products",
but this concept will be further explained later, and especially why it is important to think in terms of "products" instead of trying to explicitly
model the dependencies top-down in the CI.

Repositories and promotions
---------------------------

The concept of multiple server-side repositories is very important for CI. In this tutorial we will use 3 repositories:

- ``develop``: This repository is the main one that developers have configured in their machines to be able to ``conan install`` dependencies
  and work. As such, it is expected to be quite stable, similar to a shared "develop" branch in git, and the repository should contain pre-compiled
  binaries for the organization's pre-defined platforms, so developers and CI don't need to do ``--build=missing`` and build again and again from
  source.
- ``packages``: This repository will be used to upload individual package binaries for different configurations. To consider a certain change
  in a package source code to be correct, it might require that such a change builds correctly under a variety of platforms, let's say Windows and Linux.
  If the package builds correctly under Linux more quickly, we can upload it to the ``packages`` repository, and wait until the Windows build
  finishes, and only when both are correct we can proceed. The ``packages`` repository serves as temporary storage when building different
  binaries for the same package on different platforms concurrently.
- ``products``: It is possible that some changes create

.. graphviz::
   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           style=filled;
           color=lightgrey;
           rankdir="LR";
           label = "Packages server";
           "packages\n repository" -> "products\n repository" -> "develop\n repository" [ label="promotion" ];
       }

   }

Let's start with the tutorial. Move to the next section to do the project setup:

.. toctree::
   :maxdepth: 2

   project_setup
   packages_pipeline
   products_pipeline

From dfcc0b6aadb341000029a66439bee24d69f3c621 Mon Sep 17 00:00:00 2001 From: memsharded Date: Mon, 15 Jul 2024 12:52:12 +0200 Subject: [PATCH 02/22] wip
---
 .../packages_pipeline.rst | 18 ++
 .../packages_pipeline/multi_configuration.rst | 176 ++++++++++++++++++
 .../single_configuration.rst | 96 ++++++++++
 .../continuous_integration/project_setup.rst | 5 +-
 devops/continuous_integration/tutorial.rst | 36 +++-
 5 files changed, 327 insertions(+), 4 deletions(-)
diff --git a/devops/continuous_integration/packages_pipeline.rst b/devops/continuous_integration/packages_pipeline.rst
index 91d5db2325c..ebd4b00be02 100644 --- a/devops/continuous_integration/packages_pipeline.rst +++ b/devops/continuous_integration/packages_pipeline.rst
@@ -1,6 +1,24 @@
 Packages pipeline
 ==================

+For the ``packages pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
+in the ``ai`` package, providing some better algorithms for our game.
+
+Let's do the following changes:
+
+- Let's change the implementation of the ``ai/src/ai.cpp`` function, modifying the message from ``Some Artificial`` to ``SUPER BETTER Artificial``.
+- Let's change the default ``intelligence=0`` value in ``ai/include/ai.h`` to a new ``intelligence=50`` default.
+- Finally, let's bump the version. As we did some changes to the package public headers, it would be advisable to bump the ``minor`` version,
+  so let's edit the ``ai/conanfile.py`` file and define ``version = "1.1.0"`` there (instead of the previous ``1.0``). Note that if we
+  had done some breaking changes to the ``ai`` public API, the recommendation would be to change the major version instead and create a new ``2.0`` version.
+
+The ``packages pipeline`` will take care of building the different package binaries for the new ``ai/1.1.0`` and upload them to the ``packages``
+binary repository. If the pipeline succeeds it will copy them to the ``products`` binary repository, and stop otherwise.
+
+There are different aspects that need to be taken into account when building these packages. The following tutorial sections do the same
+job, but under different hypotheses. They are explained in increasing complexity.

 .. toctree::
    :maxdepth: 1
diff --git a/devops/continuous_integration/packages_pipeline/multi_configuration.rst b/devops/continuous_integration/packages_pipeline/multi_configuration.rst
index a411cffe4c5..bb73b178532 100644 --- a/devops/continuous_integration/packages_pipeline/multi_configuration.rst +++ b/devops/continuous_integration/packages_pipeline/multi_configuration.rst
@@ -1,2 +1,178 @@
 Package pipeline: multi configuration
 =====================================
+
+In the previous section we were building just 1 configuration. This section will cover the case in which we need to build more
+than 1 configuration. We will use the ``Release`` and ``Debug`` configurations here for convenience, as it is easier to
+follow, but in real cases these configurations would be more like Windows, Linux, OSX, building for different architectures,
+cross-building, etc.
+
+Let's begin by cleaning our cache and initializing only the ``develop`` repo:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    $ conan remote add develop   # Add only the develop repo
+
+We will create the packages for the 2 configurations sequentially on our computer, but note that these would typically run
+on different computers, so it is typical for CI systems to launch the builds of the different configurations in parallel.
+
+.. code-block:: bash
+    :caption: Release build
+
+    $ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > upload_release.json
+    $ conan remote add packages ""
+    $ conan upload -l=upload_release.json -r=packages -c --format=json > upload_release.json
+
+We have done a few changes and extra steps:
+
+- The first step is similar to the one in the previous section: a ``conan create``, just making our configuration
+  ``-s build_type=Release`` explicit for clarity, and capturing the output of the ``conan create`` in a ``graph.json`` file.
+- The second step creates an ``upload_release.json`` **package list** file with the packages that need to be uploaded;
+  in this case, only the packages that have been built from source (``--graph-binaries=build``) will be uploaded. This is
+  done for efficiency and faster uploads.
+- The third step defines the ``packages`` repository.
+- Finally, we will upload the ``upload_release.json`` package list to the ``packages`` repository, updating the ``upload_release.json``
+  package list with the new location of the packages (the server repository).
+
+Likewise, the Debug build will do the same steps:
+
+.. code-block:: bash
+    :caption: Debug build
+
+    $ conan create . --build="missing:ai/*" -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > upload_debug.json
+    $ conan remote add packages "" -f  # Can be omitted, it was defined above
+    $ conan upload -l=upload_debug.json -r=packages -c --format=json > upload_debug.json
+
+When both the Release and Debug configurations finish successfully, we would have these packages in the repositories:
+
+.. graphviz::
+   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
               "ai/1.1.0\n (Release)";
               "ai/1.1.0\n (Debug)";
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }

+If the build of all configurations for ``ai/1.1.0`` was successful, then the ``packages pipeline`` can proceed and promote
+them to the ``products`` repository:
+
+.. code-block:: bash
+    :caption: Promoting from packages->products
+
+    # aggregate the package list
+    $ conan pkglist merge -l upload_release.json -l upload_debug.json --format=json > promote.json
+
+    $ conan remote add packages "" -f  # Can be omitted, it was defined above
+    $ conan remote add products "" -f  # Can be omitted, it was defined above
+
+    # Promotion with Artifactory CE (slow, can be improved with art:promote)
+    $ conan download --list=promote.json -r=packages --format=json > promote.json
+    $ conan upload --list=promote.json -r=products -c
+
+The first step uses the ``conan pkglist merge`` command to merge the package lists from the "Release" and "Debug" configurations
+into a single ``promote.json`` package list.
+This list is the one that will be used to run the promotion.
+
+In this example we are using a slow ``conan download`` + ``conan upload`` promotion. This can be way more efficient with
+the ``conan art:promote`` extension command.
+
+After running the promotion we will have the following packages in the server:
+
+.. graphviz::
+   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
               "ai/1.1.0\n (Release)";
               "ai/1.1.0\n (Debug)";
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
               "ai/promoted release" [label="ai/1.1.0\n (Release)"];
               "ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }
diff --git a/devops/continuous_integration/packages_pipeline/single_configuration.rst b/devops/continuous_integration/packages_pipeline/single_configuration.rst
index 171a4e58003..2cf698c33b3 100644 --- a/devops/continuous_integration/packages_pipeline/single_configuration.rst +++ b/devops/continuous_integration/packages_pipeline/single_configuration.rst
@@ -1,2 +1,98 @@
 Package pipeline: single configuration
 ======================================
+
+We will start with the simplest case, in which we only have to build 1 configuration, and that configuration
+can be built on the current CI machine.
+
+As we described before while presenting the different server binary repositories, the idea is that package builds
+will use by default the ``develop`` repo only, which is considered the stable one for developers and CI jobs.
+
+Let's make sure we start from a clean state:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    $ conan remote add develop   # Add only the develop repo
+
+The removal and addition of repos across this tutorial can be a bit tedious, but it is important for the correct
+behavior. Also, there might be other configurations that can be even more efficient for some cases, like re-triggering
+a broken job because of a CI malfunction, but we will keep it simple at the moment and try to focus on the main concepts.
+
+With this configuration the CI job could just do:
+
+.. code-block:: bash
+
+    $ conan create ai --build="missing:ai/*"
+    ...
+    ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)!
+    ai/1.1.0: Intelligence level=50
+
+Note that the ``--build="missing:ai/*"`` might not be fully necessary in some cases, but it can save time in other situations.
+For example, if the developer did some changes just to the repo README, and didn't bump the version at all, Conan will not
+generate a new ``recipe revision``, and will detect this as a no-op, avoiding having to unnecessarily rebuild binaries from source.
+
+If we are in a single-configuration scenario and it built correctly, for this simple case we won't need a promotion at all,
+and just directly uploading the built packages to the ``products`` repository will be enough:

+.. code-block:: bash
+
+    # We don't want to disrupt developers or CI, upload to products
+    $ conan remote add products 
+    $ conan upload "ai*" -r=products -c
+
+As the cache was initially clean, all the ``ai`` packages would be the ones that were built in this pipeline.
+
+.. graphviz::
+   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
               "ai/1.1.0\n (single config)";
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }

+This was a very simple scenario. Let's move to a more realistic one: having to build more than one configuration.
diff --git a/devops/continuous_integration/project_setup.rst b/devops/continuous_integration/project_setup.rst
index 51fd9a35837..b50069c9845 100644 --- a/devops/continuous_integration/project_setup.rst +++ b/devops/continuous_integration/project_setup.rst
@@ -14,7 +14,7 @@ move to the folder:
 Server repositories setup
 -------------------------

-We need 3 different repositories in the same server. Make sure to have an Artifactory running and available. You can download the free `JFrog Artifactory Community Edition (CE) `_ and run it on your own computer. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``.
+We need 3 different repositories in the same server. Make sure to have an Artifactory running and available. You can download the free `JFrog Artifactory Community Edition (CE) `_ and run it on your own computer. If you have another Artifactory available, it can be used too, as long as you can create new repositories there. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``.

 Edit the ``project_setup.py`` file:

@@ -39,7 +39,8 @@ Initial dependency graph

     $ python project_setup.py

-This will do several tasks: clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph, upload them to the ``develop`` repo, and then clean the local cache.
+This will do several tasks: clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph, upload them to the ``develop`` repo, and then clean the local cache. Note that in this example we are using the ``Debug`` and ``Release`` configurations for convenience, but in real cases these would be different configurations such as Windows/X86_64, Linux/x86_64, Linux/armv8, etc., running
+on different computers.

 This dependency graph of packages in the ``develop`` repo is the starting point for our tutorial, assumed as a functional and stable "develop" state of the project that developers can ``conan install`` to work on any of the different packages.
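After the setup, it can be useful to double check that the initial packages are actually in the server. This is a minimal sketch, assuming the ``develop`` remote has been configured as in the sections above:

.. code-block:: bash

    # List all the recipes that project_setup.py uploaded to the develop repo
    $ conan list "*" -r=develop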
diff --git a/devops/continuous_integration/tutorial.rst b/devops/continuous_integration/tutorial.rst
index 687bc355899..2a9a44cf350 100644 --- a/devops/continuous_integration/tutorial.rst +++ b/devops/continuous_integration/tutorial.rst
@@ -78,8 +78,15 @@ The concept of multiple server-side repositories is very important for CI. In th
   in a package source code to be correct, it might require that such a change builds correctly under a variety of platforms, let's say Windows and Linux.
   If the package builds correctly under Linux more quickly, we can upload it to the ``packages`` repository, and wait until the Windows build
   finishes, and only when both are correct we can proceed. The ``packages`` repository serves as temporary storage when building different
-  binaries for the same package on different platforms concurrently.
-- ``products``: It is possible that some changes create
+  binaries for the same package on different platforms concurrently, until all of those configurations have been built correctly. In the example
+  above, when some developer did source changes in a new ``ai/1.1.0`` recipe, the different binaries for Windows and Linux will be built on different
+  servers. These jobs can upload their respective binaries for Windows and Linux to the ``packages`` binary repository. Note that these individual
+  binaries will not disrupt other developers or CI jobs, as those don't use the ``packages`` repository.
+- ``products``: It is possible that some changes create new package versions or revisions correctly, but these new versions might break the consumers
+  of those packages; for example, some changes in the new ``ai/1.1.0`` package might unexpectedly break ``engine/1.0``. Or even if they don't
+  necessarily break anything, they might still require building a new binary from source for ``engine/1.0`` and/or ``game/1.0``. The ``products`` binary
+  repository will be the place where binaries for different packages are uploaded so as not to disrupt or break the ``develop`` repository, until
+  the "products pipeline" can build the necessary binaries from source and verify that these packages integrate cleanly.

 .. graphviz::
    :align: center

    digraph repositories {
        node [fillcolor="lightskyblue", style=filled, shape=box]
        rankdir="LR";
        subgraph cluster_0 {
            style=filled;
            color=lightgrey;
            rankdir="LR";
            label = "Packages server";
            "packages\n repository" -> "products\n repository" -> "develop\n repository" [ label="promotion" ];
        }

    }

+Promotions are the mechanism used to make packages available from one pipeline to the other. Connecting the above packages and products pipelines
+with the repositories, there will be 2 promotions:
+
+- First, when the developer submits the changes that create the new ``ai/1.1.0`` version, the ``packages pipeline`` is triggered. It will build
+  new binaries for ``ai/1.1.0`` for Windows and Linux. These jobs will upload their respective package binaries to the ``packages`` binary
+  repository. If some of these jobs succeed but others fail, it won't be a problem, because the ``packages`` repo is not used by other jobs,
+  so the new ``ai/1.1.0`` package will still not be used by other packages and won't break anyone.
+- When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``packages pipeline`` can consider its job successful and decide
+  to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
+  the ``packages pipeline`` can promote them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
+  no one will be broken at this stage either.
+- The promotion is a copy of the packages. This can be done with several mechanisms; for example, the ``conan art:promote`` extension command
+  can efficiently promote Conan "package lists" between Artifactory (Pro) repositories. As Artifactory has deduplicated storage, this promotion
+  will be very fast and does not require any extra storage.
+- One very important aspect of the promotion mechanisms is that packages are **immutable**. They do not change at all: neither their contents
+  nor their reference. Using user/channel to denote stages or maturity is discouraged.
+- Then, when packages are in the ``products`` repository, the ``products pipeline`` can be triggered. This job will make sure that both of the
+  organization products ``game/1.0`` and ``mapviewer/1.0`` build cleanly with the new ``ai/1.1.0`` package, and will build the necessary new package
+  binaries; for example, if ``engine/1.0`` needs a build from source to integrate the changes in ``ai/1.1.0``, the ``products pipeline``
+  will make sure that this happens.
+- When the ``products pipeline`` builds all the necessary new binaries for all intermediate and product packages and checks that everything is correct, then
+  these new packages can be made available for all other developers and CI jobs. This can be done with a promotion of these packages, copying
+  them from the ``products`` repository to the ``develop`` repository. As the changes have been integrated and tested consistently for the main
+  organization products, developers doing ``conan install`` will start seeing and using the new packages and binaries.

 Let's start with the tutorial. Move to the next section to do the project setup:

 .. toctree::
    :maxdepth: 2

    project_setup
    packages_pipeline
    products_pipeline

From caaa5f099a06a3628122c9b2fa756a9638c81877 Mon Sep 17 00:00:00 2001 From: memsharded Date: Wed, 17 Jul 2024 10:29:47 +0200 Subject: [PATCH 03/22] moved
---
 .../continuous_integration => ci_tutorial}/packages_pipeline.rst | 0
 .../packages_pipeline/multi_configuration.rst | 0
 .../packages_pipeline/multi_configuration_lockfile.rst | 0
 .../packages_pipeline/single_configuration.rst | 0
 .../continuous_integration => ci_tutorial}/products_pipeline.rst | 0
 .../products_pipeline/single_configuration.rst | 0
 {devops/continuous_integration => ci_tutorial}/project_setup.rst | 0
 {devops/continuous_integration => ci_tutorial}/tutorial.rst | 0
 devops.rst | 1 -
 index.rst | 1 +
 10 files changed, 1 insertion(+), 1 deletion(-)
 rename {devops/continuous_integration => ci_tutorial}/packages_pipeline.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/packages_pipeline/multi_configuration.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/packages_pipeline/multi_configuration_lockfile.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/packages_pipeline/single_configuration.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/products_pipeline.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/products_pipeline/single_configuration.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/project_setup.rst (100%)
 rename {devops/continuous_integration => ci_tutorial}/tutorial.rst (100%)
diff --git a/devops/continuous_integration/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
similarity index 100% rename from devops/continuous_integration/packages_pipeline.rst rename to ci_tutorial/packages_pipeline.rst
diff --git a/devops/continuous_integration/packages_pipeline/multi_configuration.rst b/ci_tutorial/packages_pipeline/multi_configuration.rst
similarity index 100% rename from devops/continuous_integration/packages_pipeline/multi_configuration.rst rename to ci_tutorial/packages_pipeline/multi_configuration.rst
diff --git a/devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
similarity index 100% rename from devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst rename to ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
diff --git a/devops/continuous_integration/packages_pipeline/single_configuration.rst b/ci_tutorial/packages_pipeline/single_configuration.rst
similarity index 100% rename from devops/continuous_integration/packages_pipeline/single_configuration.rst rename to ci_tutorial/packages_pipeline/single_configuration.rst
diff --git a/devops/continuous_integration/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
similarity index 100% rename from devops/continuous_integration/products_pipeline.rst rename to ci_tutorial/products_pipeline.rst
diff --git a/devops/continuous_integration/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst
similarity index 100% rename from devops/continuous_integration/products_pipeline/single_configuration.rst rename to ci_tutorial/products_pipeline/single_configuration.rst
diff --git a/devops/continuous_integration/project_setup.rst b/ci_tutorial/project_setup.rst
similarity index 100% rename from devops/continuous_integration/project_setup.rst rename to ci_tutorial/project_setup.rst
diff --git a/devops/continuous_integration/tutorial.rst b/ci_tutorial/tutorial.rst
similarity index 100% rename from devops/continuous_integration/tutorial.rst rename to ci_tutorial/tutorial.rst
diff --git a/devops.rst b/devops.rst
index 06d787fb11e..08c2f353530 100644 --- a/devops.rst +++ b/devops.rst
@@ -14,7 +14,6 @@ If you plan to use Conan in production in your project, team, or organization, t

    devops/using_conancenter
    devops/devops_local_recipes_index
-   devops/continuous_integration/tutorial
    devops/backup_sources/sources_backup
    devops/metadata
    devops/versioning
diff --git a/index.rst b/index.rst
index 86dcb4f0da1..f1c414312cf 100644 --- a/index.rst +++ b/index.rst
@@ -16,6 +16,7 @@ Table of contents:

    whatsnew
    installation
    tutorial
+   CI Tutorial 
    devops
    integrations
    examples

From 559ec590b27039beee6a95673849986f78372c8d Mon Sep 17 00:00:00 2001 From: memsharded Date: Thu, 18 Jul 2024 11:52:17 +0200 Subject: [PATCH 04/22] wip
---
 ci_tutorial/packages_pipeline.rst | 15 +-
 .../packages_pipeline/multi_configuration.rst | 51 ++++--
 .../single_configuration.rst | 16 +-
 ci_tutorial/products_pipeline.rst | 20 +++
 .../single_configuration.rst | 8 +
 ci_tutorial/project_setup.rst | 17 +-
 ci_tutorial/tutorial.rst | 75 +++-----
 devops.rst => devops/devops.rst | 15 +-
 devops/package_promotions.rst | 162 ++++++++++++++++++
 index.rst | 2 +-
 10 files changed, 296 insertions(+), 85 deletions(-)
 rename devops.rst => devops/devops.rst (71%)
 create mode 100644 devops/package_promotions.rst
diff --git a/ci_tutorial/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
index ebd4b00be02..52f0c2c31d1 100644 --- a/ci_tutorial/packages_pipeline.rst +++ b/ci_tutorial/packages_pipeline.rst
@@ -1,6 +1,15 @@
 Packages pipeline
 ==================

+
+The **packages pipeline** will build, create and upload the package binaries for the different configurations and platforms, when a
+developer is submitting changes to the source code of one of the organization repositories. For example, a developer could be doing some changes
+to the ``ai`` package, improving some of the library functionality, and bumping the version to ``ai/1.1.0``. If the organization needs to
+support both Windows and Linux platforms, then the packages pipeline will build the new ``ai/1.1.0`` both for Windows and Linux, before
+considering the changes valid. If some of the configurations fail to build under a specific platform, it is common to consider the
+changes invalid and stop the processing of those changes, until the code is fixed.
+
+
 For the ``packages pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
 in the ``ai`` package, providing some better algorithms for our game.

@@ -14,11 +23,15 @@ Let's do the following changes:

 The ``packages pipeline`` will take care of building the different package binaries for the new ``ai/1.1.0`` and upload them to the ``packages``
-binary repository. If the pipeline succeeds it will copy them to the ``products`` binary repository, and stop otherwise.
+binary repository, to avoid disrupting or causing potential issues to other developers and CI jobs.
+If the pipeline succeeds it will promote (copy) them to the ``products`` binary repository, and stop otherwise.

 There are different aspects that need to be taken into account when building these packages. The following tutorial sections do the same
 job, but under different hypotheses. They are explained in increasing complexity.

+Note that all of the commands can be found in the repository ``run_example.py`` file. This file is mostly intended for maintainers and testing,
+but it might be useful in case of issues.
+
 .. toctree::
    :maxdepth: 1
diff --git a/ci_tutorial/packages_pipeline/multi_configuration.rst b/ci_tutorial/packages_pipeline/multi_configuration.rst
index bb73b178532..0fb09a51e0e 100644 --- a/ci_tutorial/packages_pipeline/multi_configuration.rst +++ b/ci_tutorial/packages_pipeline/multi_configuration.rst
@@ -13,7 +13,8 @@ Let's begin by cleaning our cache and initializing only the ``develop`` repo:

     $ conan remove "*" -c  # Make sure no packages from last run
     $ conan remote remove "*"  # Make sure no other remotes defined
-    $ conan remote add develop   # Add only the develop repo
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop

 We will create the packages for the 2 configurations sequentially on our computer, but note that these would typically run
 on different computers, so it is typical for CI systems to launch the builds of the different configurations in parallel.

 .. code-block:: bash
     :caption: Release build

+    $ cd ai
     $ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json
-    $ conan list --graph=graph.json --graph-binaries=build --format=json > upload_release.json
-    $ conan remote add packages ""
-    $ conan upload -l=upload_release.json -r=packages -c --format=json > upload_release.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Add packages repo, you might need to adjust this URL
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json

 We have done a few changes and extra steps:

 - The first step is similar to the one in the previous section: a ``conan create``, just making our configuration
   ``-s build_type=Release`` explicit for clarity, and capturing the output of the ``conan create`` in a ``graph.json`` file.
-- The second step creates an ``upload_release.json`` **package list** file with the packages that need to be uploaded;
+- The second step creates, from the ``graph.json``, a ``built.json`` **package list** file with the packages that need to be uploaded;
   in this case, only the packages that have been built from source (``--graph-binaries=build``) will be uploaded. This is
   done for efficiency and faster uploads.
 - The third step defines the ``packages`` repository.
-- Finally, we will upload the ``upload_release.json`` package list to the ``packages`` repository, updating the ``upload_release.json``
+- Finally, we will upload the ``built.json`` package list to the ``packages`` repository, creating the ``uploaded_release.json``
   package list with the new location of the packages (the server repository).

 Likewise, the Debug build will do the same steps:

@@ -45,9 +48,10 @@ Likewise, the Debug build will do the same steps:
     :caption: Debug build

     $ conan create . --build="missing:ai/*" -s build_type=Debug --format=json > graph.json
-    $ conan list --graph=graph.json --graph-binaries=build --format=json > upload_debug.json
-    $ conan remote add packages "" -f  # Can be omitted, it was defined above
-    $ conan upload -l=upload_debug.json -r=packages -c --format=json > upload_debug.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Remote definition can be omitted in the tutorial, it was defined above (-f == force)
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages -f
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json

@@ -100,6 +104,14 @@ When both the Release and Debug configurations finish successfully, we would have the
     }
     }

+TODO
+
+
+- When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``packages pipeline`` can consider its job successful and decide
+  to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
+  the ``packages pipeline`` can promote them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
+  no one will be broken at this stage either.
 If the build of all configurations for ``ai/1.1.0`` was successful, then the ``packages pipeline`` can proceed and promote
 them to the ``products`` repository:

 .. code-block:: bash
     :caption: Promoting from packages->products

     # aggregate the package list
-    $ conan pkglist merge -l upload_release.json -l upload_debug.json --format=json > promote.json
-
-    $ conan remote add packages "" -f  # Can be omitted, it was defined above
-    $ conan remote add products "" -f  # Can be omitted, it was defined above
+    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json

-    # Promotion with Artifactory CE (slow, can be improved with art:promote)
-    $ conan download --list=promote.json -r=packages --format=json > promote.json
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with the art:promote custom command)
+    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
     $ conan upload --list=promote.json -r=products -c

 The first step uses the ``conan pkglist merge`` command to merge the package lists from the "Release" and "Debug" configurations
-into a single ``promote.json`` package list.
+into a single ``uploaded.json`` package list.
 This list is the one that will be used to run the promotion.

 In this example we are using a slow ``conan download`` + ``conan upload`` promotion. This can be way more efficient with

@@ -176,3 +186,12 @@ After running the promotion we will have the following packages in the server:
     }
     }

+
+To summarize:
+
+- We built 2 different configurations, ``Release`` and ``Debug`` (they could have been Windows/Linux or others), and uploaded them
+  to the ``packages`` repository.
+- When all the package binaries for all configurations were successfully built, we promoted them from the ``packages`` to the
+  ``products`` repository, to make them available for the ``products pipeline``.
+- **Package lists** were captured in the package creation process and merged into a single one to run the promotion.
diff --git a/ci_tutorial/packages_pipeline/single_configuration.rst b/ci_tutorial/packages_pipeline/single_configuration.rst
index 2cf698c33b3..250c9ca265e 100644 --- a/ci_tutorial/packages_pipeline/single_configuration.rst +++ b/ci_tutorial/packages_pipeline/single_configuration.rst
@@ -13,7 +13,8 @@ Let's make sure we start from a clean state:

     $ conan remove "*" -c  # Make sure no packages from last run
     $ conan remote remove "*"  # Make sure no other remotes defined
-    $ conan remote add develop   # Add only the develop repo
+    # Add only the develop repo, you might need to adjust this for your URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop

@@ -24,7 +25,8 @@ With this configuration the CI job could just do:

 .. code-block:: bash

-    $ conan create ai --build="missing:ai/*"
+    $ cd ai
+    $ conan create . --build="missing:ai/*"
     ...
     ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)!
     ai/1.1.0: Intelligence level=50

@@ -34,14 +36,16 @@ Note that the ``--build="missing:ai/*"`` might not be fully necessary in some cases,
 generate a new ``recipe revision``, and will detect this as a no-op, avoiding having to unnecessarily rebuild binaries from source.
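One quick way to observe this no-op behavior is to inspect the recipe revisions known to the server. This is a minimal sketch, assuming the ``develop`` remote configured above:

.. code-block:: bash

    # List the recipe revisions of ai/1.1.0 in the develop repo; if our change
    # did not produce a new revision, there is nothing new that needs building
    $ conan list "ai/1.1.0#*" -r=develop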
-If we are in a single-configuration scenario and it built correctly, for this simple case we won't need a promotion at all, -and just uploading directly the built packages to the ``products`` repository will be enough: +If we are in a single-configuration scenario and it built correctly, for this simple case we don't need a promotion, +and just uploading directly the built packages to the ``products`` repository will be enough, where the ``products pipeline`` +will pick it later. .. code-block:: bash - # We don't want to disrupt developers or CI, upload to products - $ conan remote add products + # We don't want to disrupt developers or CI, upload to products + # Add products repo, you might need to adjust this URL + $ conan remote add products http://localhost:8081/artifactory/api/conan/products $ conan upload "ai*" -r=products -c As the cache was initially clean, all ``ai`` packages would be the ones that were built in this pipeline. diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst index 6aa3cbfdb21..66b721a9c0d 100644 --- a/ci_tutorial/products_pipeline.rst +++ b/ci_tutorial/products_pipeline.rst @@ -1,8 +1,28 @@ Products pipeline ================== +The **products pipeline** responds a more challenging question: does my "products" build correctly with the latest changes that have been done +to the packages? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization +important product to check if things integrate cleanly or break. Let's continue with the example above, if we now have a new ``ai/1.1.0`` package, +is it going to break the existing ``game/1.0`` and/or ``mapviewer/1.0`` applications? Is it necessary to re-build from source some of the existing +packages that depend directly or indirectly on ``ai`` package? In this tutorial we will use ``game/1.0`` and ``mapviewer/1.0`` as our "products", +but this concept will be further explained later, and specially why it is important to think in terms of "products" instead of trying to explicitly +model the dependencies top-bottom in the CI. + +- Then, when packages are in the ``products`` repository, the ``products pipeline`` can be triggered. This job will make sure that both the + organization products ``game/1.0`` and ``mapviewer/1.0`` build cleanly with the new ``ai/1.1.0`` package, and build necessary new package + binaries, for example if ``engine/1.0`` needs to do a build from source to integrate the changes in ``ai/1.1.0`` the ``products pipeline`` + will make sure that this happens. + + +- ``products``: It is possible that some changes create new package versions or revisions correctly. But these new versions might break consumers + of those packages, for example some changes in the new ``ai/1.1.0`` package might unexpectedly break ``engine/1.0``. Or even if they don't + necessarily break, they might still need to build a new binary from source for ``engine/1.0`` and/or ``game/1.0``. The ``products`` binary + repository will be the place where binaries for different packages are uploaded to not disrupt or break the ``develop`` repository, until + the "products pipeline" can build necessary binaries from source and verify that these packages integrate cleanly. + There are some important points to understand about the products pipeline: - What are the **products**? 
The "products" are the main software artifact that my organization is delivering as final result and provide some diff --git a/ci_tutorial/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst index 8a3ef1953f6..9fcbf689c63 100644 --- a/ci_tutorial/products_pipeline/single_configuration.rst +++ b/ci_tutorial/products_pipeline/single_configuration.rst @@ -1,2 +1,10 @@ Products pipeline: single configuration ======================================= + + + + +- When the ``products pipeline`` build all necessary new binaries for all intermediate and product packages and check that every is correct, then + these new packages can be made available for all other developers and CI jobs. This can be done with a promotion of these packages, copying + them from the ``products`` repository to the ``develop`` repository. As the changes have been integrated and tested consistently for the main + organization products, developers doing ``conan install`` will start seeing and using the new packages and binaries. \ No newline at end of file diff --git a/ci_tutorial/project_setup.rst b/ci_tutorial/project_setup.rst index b50069c9845..8bcc9b3feb8 100644 --- a/ci_tutorial/project_setup.rst +++ b/ci_tutorial/project_setup.rst @@ -8,13 +8,24 @@ move to the folder: .. code-block:: bash $ git clone https://github.com/conan-io/examples2.git - $ cd examples2/devops/ci/game + $ cd examples2/ci/game Server repositories setup ------------------------- -We need 3 different repositories in the same server. Make sure to have an Artifactory running and available. You can download the free `JFrog Artifactory Community Edition (CE) `_ and run it in your own computer. If you have another available Artifactory, it can be used too if you can create new repositories there. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``. +We need 3 different repositories in the same server. Make sure to have an Artifactory running and available. You can download the free :ref:`Artifactory CE` from the `downloads page `_ and run it in your own computer, or you can use docker: + + +.. code-block:: bash + + $ docker run --name artifactory -d -p 8081:8081 -p 8082:8082 releases-docker.jfrog.io/jfrog/artifactory-cpp-ce:7.63.12 + # Can be stopped with "docker stop artifactory" + +When you launch it, you can go to http://localhost:8081/ to check it (user: "admin", password: "password"). + + +If you have another available Artifactory, it can be used too if you can create new repositories there. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``. Edit the ``project_setup.py`` file: @@ -32,7 +43,7 @@ Initial dependency graph .. warning:: - The initialization of the project will remove the contents of the 3 ``develop``, ``products`` and ``packages`` repositories. - - The ``examples2/devops/ci/game`` folder contains an ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache. + - The ``examples2/ci/game`` folder contains an ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache. .. code-block:: bash diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst index 2a9a44cf350..9a69312c32a 100644 --- a/ci_tutorial/tutorial.rst +++ b/ci_tutorial/tutorial.rst @@ -1,10 +1,12 @@ +.. 
+.. _ci_tutorial:
+
 Continuous Integration (CI) tutorial
 ====================================

 Continuous Integration has different meanings for different users and organizations. In this tutorial we will cover the scenarios when users
 are doing changes to the source code of their packages and want to automatically build new binaries for those packages and also compute if those
 new package changes integrate cleanly or break the organization's main products.

-We will use in this tutorial this small project that uses several packages (static libraries by default) to build a couple of applications, a video game and a map viewer utility:
+In this tutorial we will use this small project, which uses several packages (static libraries by default) to build a couple of applications: a video game and a map viewer utility. The game and mapviewer are our final "products", i.e., what we distribute to our users:

 .. graphviz::
    :align: center
@@ -47,22 +49,14 @@ Packages and products pipelines

 When a developer is doing some changes to a package source code, we will consider 2 different parts or pipelines of the overall system CI:
 the **packages pipeline** and the **products pipeline**

+- The **packages pipeline** takes care of building one single package when its code is changed. If necessary, it will build it for different configurations.
+- The **products pipeline** takes care of building the organization's main "products" (the packages that implement the final applications or deliverables),
+  making sure that changes and new versions in dependencies integrate correctly, and rebuilding any intermediate packages in the graph if necessary.

-The **packages pipeline** will build, create and upload the package binaries for the different configurations and platforms, when some
-developer is submitting some changes to one of the organization repositories source code. For example if a developer is doing some changes
-to the ``ai`` package, improving some of the library functionality, and bumping the version to ``ai/1.1.0``. If the organization needs to
-support both Windows and Linux platforms, then the package pipeline will build the new ``ai/1.1.0`` both for Windows and Linux, before
-considering the changes are valid. If some of the configurations fail to build under a specific platform, it is common to consider the
-changes invalid and stop the processing of those changes, until the code is fixed.
-
-
-The **products pipeline** responds a more challenging question: does my "products" build correctly with the latest changes that have been done
-to the packages? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization
-important product to check if things integrate cleanly or break. Let's continue with the example above, if we now have a new ``ai/1.1.0`` package,
-is it going to break the existing ``game/1.0`` and/or ``mapviewer/1.0`` applications? Is it necessary to re-build from source some of the existing
-packages that depend directly or indirectly on ``ai`` package? In this tutorial we will use ``game/1.0`` and ``mapviewer/1.0`` as our "products",
-but this concept will be further explained later, and specially why it is important to think in terms of "products" instead of trying to explicitly
-model the dependencies top-bottom in the CI.
+The idea is that if some developer makes changes to the ``ai`` package, producing a new ``ai/1.1.0`` version, the packages pipeline will first build this
+new version.
+But this new version might accidentally break or require rebuilding some consumer packages. If our organization's main **products** are
+``game/1.0`` and ``mapviewer/1.0``, then the products pipeline can be triggered; in this case it would rebuild ``engine/1.0`` and ``game/1.0``,
+as they are affected by the change.


 Repositories and promotions
@@ -74,20 +68,10 @@ The concept of multiple server side repositories is very important for CI. In th
 and work. As such it is expected to be quite stable, similar to a shared "develop" branch in git, and the repository should contain pre-compiled
 binaries for the organization pre-defined platforms, so developers and CI don't need to do ``--build=missing`` and build again and again from
 source.

-- ``packages``: This repository will be used to upload individual package binaries for different configurations. To consider a certain change
-  in a package source code to be correct, it might require that such change build correctly under a variaty of platforms, lets say Windows and Linux.
-  If the package builds correctly under Linux more quickly, we can upload it to the ``packages`` repository, and wait until the Windows build
-  finishes, and only when both are correct we can proceed. The ``packages`` repository serves as a temporary storage when building different
-  binaries for the same package in different platforms concurrently, until all of those configurations have been built correctly. In the example
-  above, when some developer did source changes in a new ``ai/1.1.0`` recipe, the different binaries for Windows and Linux will be built in different
-  servers. These jobs can upload their respective binaries for Windows and Linux to the ``packages`` binary repository. Note that these individual
-  binaries will not disrupt other developers or CI jobs, as they don't use the ``packages`` repository.
-- ``products``: It is possible that some changes create new package versions or revisions correctly. But these new versions might break consumers
-  of those packages, for example some changes in the new ``ai/1.1.0`` package might unexpectedly break ``engine/1.0``. Or even if they don't
-  necessarily break, they might still need to build a new binary from source for ``engine/1.0`` and/or ``game/1.0``. The ``products`` binary
-  repository will be the place where binaries for different packages are uploaded to not disrupt or break the ``develop`` repository, until
-  the "products pipeline" can build necessary binaries from source and verify that these packages integrate cleanly.
-
+- ``packages``: This repository will be used to temporarily upload the packages built by the "packages pipeline", so they are not uploaded directly to
+  the ``develop`` repo, avoiding disruption until these packages are fully validated.
+- ``products``: This repository will be used to temporarily upload the packages built by the "products pipeline", while building and testing that
+  new dependency changes do not break the main "products".

 .. graphviz::
    :align: center
@@ -108,27 +92,16 @@ The concept of multiple server side repositories is very important for CI. In th
 Promotions are the mechanism used to make available packages from one pipeline to the other. Connecting the above packages and product pipelines
 with the repositories, there will be 2 promotions:

-- First, when the developer submit the changes that create the new ``ai/1.1.0`` version, the ``package pipeline`` is triggered. It will build
-  new binaries for ``ai/1.1.0`` for Windows and Linux.
These jobs will upload their respective package binaries to the ``packages`` binary
-  repository. If some of this jobs succeed, but other fail, it won't be a problem, because the ``packages`` repo is not used by other jobs,
-  so the new ``ai/1.1.0`` new package will still not be used by other packages and won't break anyone.
-- When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``package pipeline`` can consider its job succesfull and decide
-  to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
-  the ``package pipeline`` can promote them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
-  no one will be broken at this stage either.
-- The promotion is a copy of the packages. This can be done with several mechanisms, for example the ``conan art:promote`` extension commands
-  can efficiently promote Conan "package lists" between Artifactory (Pro) repositories. As Artifactory is deduplicating storage, this promotion
-  will be very fast and do not require any extra storage.
-- One very important aspect of the promotion mechanisms is that packages are **immutable**. They do not change at all, not its contents,
-  not its reference. Using user/channel to denote stages or maturity is discouraged.
-- Then, when packages are in the ``products`` repository, the ``products pipeline`` can be triggered. This job will make sure that both the
-  organization products ``game/1.0`` and ``mapviewer/1.0`` build cleanly with the new ``ai/1.1.0`` package, and build necessary new package
-  binaries, for example if ``engine/1.0`` needs to do a build from source to integrate the changes in ``ai/1.1.0`` the ``products pipeline``
-  will make sure that this happens.
-- When the ``products pipeline`` build all necessary new binaries for all intermediate and product packages and check that every is correct, then
-  these new packages can be made available for all other developers and CI jobs. This can be done with a promotion of these packages, copying
-  them from the ``products`` repository to the ``develop`` repository. As the changes have been integrated and tested consistently for the main
-  organization products, developers doing ``conan install`` will start seeing and using the new packages and binaries.
+- When all the different binaries for the different configurations have been built for a single package with the ``packages pipeline``, and uploaded
+  to the ``packages`` repository, the package changes and the new package version can be considered "correct" and promoted (copied) to the ``products``
+  repository.
+- When the ``products pipeline`` has built from source all the necessary packages that need a re-build because of the new package versions in
+  the ``products`` repository, and has checked that the organization's "products" (such as ``game/1.0`` and ``mapviewer/1.0``) are not broken, then
+  the packages can be promoted (copied) from the ``products`` repo to the ``develop`` repo, to make them available for all other developers and CI.
+
+This tutorial is just modeling the **development** flow. In production systems, there will be other repositories
+and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for end users; packages can
+be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in :ref:`Package promotions`.
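+
+As an illustration of what a promotion can look like in practice, this is a sketch using the generic
+``conan download`` + ``conan upload`` flow (shown in more detail later in this tutorial) for the second
+promotion, copying packages from ``products`` to ``develop``:
+
+.. code-block:: bash
+
+    # Sketch: promote (copy) the validated packages from "products" to "develop"
+    $ conan download --list=promote.json -r=products --format=json > downloaded.json
+    $ conan upload --list=downloaded.json -r=develop -c
+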
Let's start with the tutorial, move to the next section to do the project setup:
diff --git a/devops.rst b/devops/devops.rst
similarity index 71%
rename from devops.rst
rename to devops/devops.rst
index 08c2f353530..e9fc43dadb7 100644
--- a/devops.rst
+++ b/devops/devops.rst
@@ -12,10 +12,11 @@ If you plan to use Conan in production in your project, team, or organization, t
 .. toctree::
    :maxdepth: 1

-   devops/using_conancenter
-   devops/devops_local_recipes_index
-   devops/backup_sources/sources_backup
-   devops/metadata
-   devops/versioning
-   devops/save_restore
-   devops/vendoring
+   using_conancenter
+   devops_local_recipes_index
+   backup_sources/sources_backup
+   metadata
+   versioning
+   save_restore
+   vendoring
+   package_promotions
diff --git a/devops/package_promotions.rst b/devops/package_promotions.rst
new file mode 100644
index 00000000000..3062656f95a
--- /dev/null
+++ b/devops/package_promotions.rst
@@ -0,0 +1,162 @@
+.. _devops_package_promotions:
+
+Package promotions
+==================
+
+Package promotions are the recommended devops practice to handle quality, maturity or stages of packages
+in different technologies, and of course, also for Conan packages.
+
+The principle of package promotions is that there are multiple server package repositories defined, and
+packages are uploaded and copied among repositories depending on the stage. For
+example we could have two different server package repositories called "testing" and "release":
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           style=filled;
+           color=lightgrey;
+           rankdir="LR";
+           label = "Packages server";
+           "testing\n repository" -> "release\n repository" [ label="promotion" ];
+       }
+   }
+
+
+.. note::
+
+   **Best practices**
+
+   - Using different ``user/channel`` to try to denote maturity is strongly discouraged. It was described in the early
+     Conan 1 days years ago, before the possibility of having multiple repositories, but it shouldn't be used anymore.
+   - Packages should be completely immutable across pipelines and stages; a package cannot be renamed or change its ``user/channel``,
+     and re-building it from source to have a new ``user/channel`` is also a strongly discouraged devops practice.
+
+
+Between those repositories there will be some quality gates. In our case, some packages will be
+put in the "testing" repository, for the QA team to test them, for example ``zlib/1.3.1`` and ``openssl/3.2.2``:
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "testing\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "zlib/1.3.1";
+               "openssl/3.2.2";
+           }
+
+           subgraph cluster_2 {
+               label = "release\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "release" [style=invis];
+           }
+           {
+               edge[style=invis];
+               "zlib/1.3.1" -> "release" ;
+               rankdir="BT";
+           }
+       }
+   }
+
+
+When the QA team tests and approves these packages, they can be promoted to the "release" repository.
+Basically, a promotion is a copy of the packages, including all their artifacts and metadata, from the
+"testing" to the "release" repository.
+
+
+There are different ways to implement and execute a package promotion. Artifactory has some APIs that can be
+used to move individual files or folders.
The `Conan extensions repository `_
+contains the ``conan art:promote`` command that can be used to promote Conan "package lists" from one
+server repository to another.
+
+If we have a package list ``pkglist.json`` that contains the above ``zlib/1.3.1`` and ``openssl/3.2.2`` packages, then
+the command would look like:
+
+.. code-block:: bash
+    :caption: Promoting from testing->release
+
+    $ conan art:promote pkglist.json --from=testing --to=release --url=https:///artifactory --user= --password=
+
+
+Note that ``conan art:promote`` doesn't work with Artifactory CE; it needs the Pro editions of Artifactory.
+In these cases the promote functionality can be implemented with a simple download+upload flow:
+
+.. code-block:: bash
+    :caption: Promoting from testing->release
+
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with art:promote custom command)
+    $ conan download --list=promote.json -r=testing --format=json > downloaded.json
+    $ conan upload --list=downloaded.json -r=release -c
+
+
+After the promotion from the "testing" to the "release" repository, the packages would look like this:
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "testing\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "zlib/1.3.1";
+               "openssl/3.2.2";
+           }
+
+           subgraph cluster_2 {
+               label = "release\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "zlibpromoted" [label="zlib/1.3.1"];
+               "opensslpromoted" [label="openssl/3.2.2"];
+           }
+           {
+               "zlib/1.3.1" -> "zlibpromoted";
+               "openssl/3.2.2" -> "opensslpromoted" [label="Promotion"];
+           }
+       }
+   }
+
+
+.. note::
+
+   **Best practices**
+
+   - In modern package servers such as Artifactory, package artifacts are **deduplicated**, that is, they do not
+     take any extra storage when they are copied in different locations, including different repositories.
+     The **deduplication** is checksum-based, so the system is also smart enough to avoid re-uploading existing artifacts.
+     This is very important for the "promotions" mechanism: this mechanism is only copying some metadata, so
+     it can be very fast and it is storage efficient. Pipelines can define as many repositories and promotions
+     as necessary without concerns about storage costs.
+   - Promotions can also be done in the JFrog platform with ``Release Bundles``. The `Conan extensions repository `_
+     also contains one command to generate a release bundle (that can be promoted using the Artifactory API).
+
+
+.. seealso::
+
+    - :ref:`Using package lists examples `
+    - :ref:`Promotions usage in CI `
diff --git a/index.rst b/index.rst
index f1c414312cf..48ce8703eeb 100644
--- a/index.rst
+++ b/index.rst
@@ -17,7 +17,7 @@ Table of contents:
    installation
    tutorial
    CI Tutorial
-   devops
+   devops/devops
   integrations
   examples
   reference

From bd2a6e4c8e9a2beaa499e97e85977714639c27d4 Mon Sep 17 00:00:00 2001
From: memsharded
Date: Thu, 18 Jul 2024 12:03:12 +0200
Subject: [PATCH 05/22] wip

---
 ci_tutorial/packages_pipeline.rst | 2 +-
 ci_tutorial/project_setup.rst     | 5 +++--
 2 files changed, 4 insertions(+), 3 deletions(-)

diff --git a/ci_tutorial/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
index 52f0c2c31d1..b8f62b75d84 100644
--- a/ci_tutorial/packages_pipeline.rst
+++ b/ci_tutorial/packages_pipeline.rst
@@ -13,7 +13,7 @@ changes invalid and stop the processing of those changes, until the code is fixe
 For the ``package pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
 in the ``ai`` package, providing some better algorithms for our game.

-Let's do the following changes:
+✍️ **Let's do the following changes in the ai package**:

 - Let's change the implementation of the ``ai/src/ai.cpp`` function and change the message from ``Some Artificial`` to ``SUPER BETTER Artificial``
 - Let's change the default ``intelligence=0`` value in ``ai/include/ai.h`` to a new ``intelligence=50`` default.
diff --git a/ci_tutorial/project_setup.rst b/ci_tutorial/project_setup.rst
index 8bcc9b3feb8..1beb402782e 100644
--- a/ci_tutorial/project_setup.rst
+++ b/ci_tutorial/project_setup.rst
@@ -23,11 +23,12 @@ We need 3 different repositories in the same server. Make sure to have an Artifa

     # Can be stopped with "docker stop artifactory"

 When you launch it, you can go to http://localhost:8081/ to check it (user: "admin", password: "password").
+If you have another Artifactory available, it can also be used, provided you can create new repositories there.

-If you have another Artifactory available, it can also be used, provided you can create new repositories there. Log into the web UI and create 3 different local repositories called ``develop``, ``packages`` and ``products``.
+✍️ As a first step, log into the web UI and **create 3 different local repositories** called ``develop``, ``packages`` and ``products``.

-Edit the ``project_setup.py`` file:
+✍️ Then edit the ``project_setup.py`` file:

 .. code-block:: python


From 435f2d701bc5e050c13acc9b61632eb80db1fe4a Mon Sep 17 00:00:00 2001
From: memsharded
Date: Thu, 22 Aug 2024 13:26:59 +0200
Subject: [PATCH 06/22] wip

---
 ci_tutorial/packages_pipeline.rst             |   4 +-
 .../packages_pipeline/multi_configuration.rst |  18 +--
 .../multi_configuration_lockfile.rst          | 145 ++++++++++++++++++
 ci_tutorial/products_pipeline.rst             |  11 +-
 ci_tutorial/project_setup.rst                 |  12 +-
 ci_tutorial/tutorial.rst                      |   8 +-
 tutorial/versioning/lockfiles.rst             |   2 +-
 7 files changed, 179 insertions(+), 21 deletions(-)

diff --git a/ci_tutorial/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
index b8f62b75d84..f3dbb0a475b 100644
--- a/ci_tutorial/packages_pipeline.rst
+++ b/ci_tutorial/packages_pipeline.rst
@@ -26,11 +26,11 @@ The ``packages pipeline`` will take care of building the different packages bina
 binary repository to avoid disrupting or causing potential issues to other developers and CI jobs.
 If the pipeline succeeds it will promote (copy) them to the ``products`` binary repository, and stop otherwise.
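+
+As a quick preview, the overall flow that the following sections develop step by step looks roughly like this
+(an illustrative outline only; the exact commands and arguments are explained in detail in each section):
+
+.. code-block:: bash
+
+    # build the changed package (repeated for every configuration)
+    $ cd ai && conan create . --build="missing:ai/*"
+    # upload the new binaries to the "packages" repository
+    $ conan upload "ai*" -r=packages -c
+    # if all configurations built correctly, promote (copy) them to "products"
+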
-There are different aspects that need to be taken into account when building these packages. The following tutorial sections do the same
+There are different aspects that need to be taken into account when building these binary packages for ``ai/1.1.0``. The following tutorial sections do the same
 job, but under different hypotheses. They are explained in increasing complexity.

 Note all of the commands can be found in the repository ``run_example.py`` file. This file is mostly intended for maintainers and testing,
-but it might be useful in case of issues.
+but it might be useful as a reference in case of issues.


 .. toctree::
diff --git a/ci_tutorial/packages_pipeline/multi_configuration.rst b/ci_tutorial/packages_pipeline/multi_configuration.rst
index 0fb09a51e0e..c6f203ce3c6 100644
--- a/ci_tutorial/packages_pipeline/multi_configuration.rst
+++ b/ci_tutorial/packages_pipeline/multi_configuration.rst
@@ -104,17 +104,11 @@ When both Release and Debug configuration finish successfully, we would have the
     }
   }

-TODO
-
-- When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``package pipeline`` can consider its job succesfull and decide
-  to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
-  the ``package pipeline`` can promote them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
-  no one will be broken at this stage either.
-
-
-If the build of all configurations for ``ai/1.1.0`` were succesfull, then the ``packages pipeline`` can proceed and promote
-them to the ``products`` repository:
+When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``package pipeline`` can consider its job successful and decide
+to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
+the ``package pipeline`` can promote them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
+no one will be broken at this stage either:

 .. code-block:: bash
    :caption: Promoting from packages->product
@@ -195,3 +189,7 @@ To summarize:
 - When all package binaries for all configurations were successfully built, we promoted them from the ``packages`` to the ``products`` repository,
   to make them available for the ``products pipeline``.
 - **Package lists** were captured in the package creation process and merged into a single one to run the promotion.
+
+
+There is still an aspect that we haven't considered yet: the possibility that the dependencies of ``ai/1.1.0`` change
+during the build. Move to the next section to see how to use lockfiles to achieve more consistent multi-configuration builds.
diff --git a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
index 54dec9c2541..4390c38eee2 100644
--- a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
+++ b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
@@ -1,2 +1,147 @@
 Package pipeline: multi configuration using lockfiles
 =====================================================
+
+In the previous example, we built both ``Debug`` and ``Release`` package binaries for ``ai/1.1.0``.
In real-world scenarios, the binaries to build would target different platforms (Windows, Linux, embedded) and different architectures, and very often it will not be possible to build them on the same machine, requiring different computers.
+
+The previous example had an important assumption: the dependencies of ``ai/1.1.0`` do not change at all during the building process. In many scenarios, this assumption will not hold, for example if there are other concurrent CI jobs, and one successful job publishes a new ``mathlib/1.1`` version in the ``develop`` repo.
+
+Then it is possible that one build of ``ai/1.1.0``, for example the one running on the Linux servers, starts earlier and uses the previous ``mathlib/1.0`` version as a dependency, while the Windows servers start a bit later, and their build will then use the recent ``mathlib/1.1`` version as a dependency. This is a very undesirable situation, having binaries for the same ``ai/1.1.0`` version using different dependency versions. It can lead to later graph resolution problems, or even worse, reach the release with different behavior on different platforms.
+
+The way to avoid this discrepancy in dependencies is to force the usage of the same dependency versions and revisions, something that can be done with :ref:`lockfiles`.
+
+Creating and applying lockfiles is relatively straightforward. The process of creating and promoting the configurations will be identical to the previous section, just applying the lockfiles.
+
+Creating the lockfile
+---------------------
+
+Let's make sure as usual that we start from a clean state:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+Then we can create the ``conan.lock`` lockfile:
+
+.. code-block:: bash
+
+    # Capture a lockfile for the Release configuration
+    $ conan lock create . -s build_type=Release --lockfile-out=conan.lock
+    # extend the lockfile so it also covers the Debug configuration
+    # in case there are Debug-specific dependencies
+    $ conan lock create . -s build_type=Debug --lockfile=conan.lock --lockfile-out=conan.lock
+
+Note that different configurations, using different profiles or settings, could result in different dependency graphs. A single lockfile can be used to lock the different configurations, but it is important to iterate over the different configurations/profiles and capture their information in the lockfile.
+
+.. note::
+
+   ``conan.lock`` is the default argument, and if a ``conan.lock`` file exists, it might be automatically used by ``conan install/create`` and other graph commands. This can simplify many of the commands, but this tutorial shows the fully explicit commands for clarity and didactic reasons.
+
+The ``conan.lock`` file can be inspected; it will look something like:
+
+.. code-block:: json
+
+  {
+      "version": "0.5",
+      "requires": [
+          "mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea%1724319985.398"
+      ],
+      "build_requires": [],
+      "python_requires": [],
+      "config_requires": []
+  }
+
+As we can see, it is locking the ``mathlib/1.0`` dependency version and revision.
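+
+If we want to double-check what the lockfile pins before building anything, we can evaluate the dependency graph
+without building (a hypothetical verification step, not required by the pipeline):
+
+.. code-block:: bash
+
+    # Even if a newer mathlib version was published in "develop", this still
+    # resolves mathlib to the locked 1.0 version and revision
+    $ conan graph info . -s build_type=Release --lockfile=conan.lock
+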
+
+
+With the lockfile, the creation of the different configurations is exactly the same, but providing the ``--lockfile=conan.lock`` argument to the ``conan create`` step will guarantee that ``mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea`` is always the exact dependency used, irrespective of whether new ``mathlib/1.1`` versions or new revisions become available. The following builds could be launched in parallel but executed at different times, and they will still always use the same ``mathlib/1.0`` dependency:
+
+
+.. code-block:: bash
+    :caption: Release build
+
+    $ cd ai
+    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Release --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Add packages repo, you might need to adjust this URL
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json
+
+.. code-block:: bash
+    :caption: Debug build
+
+    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Remote definition can be omitted in the tutorial, it was defined above (-f == force)
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages -f
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json
+
+Note that the only modification to the previous example is the addition of ``--lockfile=conan.lock``. The promotion will also be identical to the previous one:
+
+.. code-block:: bash
+    :caption: Promoting from packages->product
+
+    # aggregate the package list
+    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json
+
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with art:promote custom command)
+    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
+    $ conan upload --list=promote.json -r=products -c
+
+And the final result will be the same as in the previous section, but this time with the guarantee that both ``Debug`` and ``Release`` binaries were built using exactly the same ``mathlib`` version:
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "packages\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "packages" [style=invis];
+               "ai/1.1.0\n (Release)";
+               "ai/1.1.0\n (Debug)";
+           }
+           subgraph cluster_2 {
+               label = "products\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "products" [style=invis];
+               "ai/promoted release" [label="ai/1.1.0\n (Release)"];
+               "ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
+           }
+           subgraph cluster_3 {
+               rankdir="BT";
+               shape = "box";
+               label = "develop repository";
+               color=lightblue;
+               rankdir="BT";
+
+               node [fillcolor="lightskyblue", style=filled, shape=box]
+               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
+               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
+               "mapviewer/1.0" -> "graphics/1.0";
+               "game/1.0" [fillcolor="lightgreen"];
+               "mapviewer/1.0" [fillcolor="lightgreen"];
+           }
+           {
+               edge[style=invis];
+               "packages" -> "products" -> "game/1.0" ;
+               rankdir="BT";
+           }
+       }
+   }
+
+Now that we have the new ``ai/1.1.0`` binaries in the ``products`` repo, we can consider the ``packages pipeline`` finished. We can now move to the next section to build and check our products and see if this new ``ai/1.1.0`` version integrates correctly.
diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
index 66b721a9c0d..0bf2e249720 100644
--- a/ci_tutorial/products_pipeline.rst
+++ b/ci_tutorial/products_pipeline.rst
@@ -10,7 +10,6 @@ but this concept will be further explained later, and especially why it is import
 model the dependencies top-down in the CI.

-
 - Then, when packages are in the ``products`` repository, the ``products pipeline`` can be triggered. This job will make sure that both of the
   organization's products ``game/1.0`` and ``mapviewer/1.0`` build cleanly with the new ``ai/1.1.0`` package, and build the necessary new package
@@ -23,6 +22,9 @@ model the dependencies top-down in the CI.
 repository will be the place where binaries for different packages are uploaded to not disrupt or break the ``develop`` repository, until
 the "products pipeline" can build the necessary binaries from source and verify that these packages integrate cleanly.

+What are the **products**
+-------------------------
+
 There are some important points to understand about the products pipeline:

 - What are the **products**? The "products" are the main software artifact that my organization is delivering as final result and provide some
@@ -43,6 +45,13 @@ There are some important points to understand about the products pipeline:
 can greatly help to efficiently define which packages needs to rebuild and which ones don't.

+Building new binaries for intermediate packages
+-----------------------------------------------
+
+- Is it a new version
+- No, it is a new package-id
+- The package-id model and links

 .. toctree::
    :maxdepth: 1

diff --git a/ci_tutorial/project_setup.rst b/ci_tutorial/project_setup.rst
index 1beb402782e..f64a37174f3 100644
--- a/ci_tutorial/project_setup.rst
+++ b/ci_tutorial/project_setup.rst
@@ -28,14 +28,14 @@ If you have another Artifactory available, it can also be used, provided you ca

 ✍️ As a first step, log into the web UI and **create 3 different local repositories** called ``develop``, ``packages`` and ``products``.

-✍️ Then edit the ``project_setup.py`` file:
+✍️ Then, as shown in the ``project_setup.py`` file, these are the environment variables used to configure the server. Please define ``ARTIFACTORY_URL``, ``ARTIFACTORY_USER`` and/or ``ARTIFACTORY_PASSWORD`` if needed to adapt them to your setup:

 .. code-block:: python

    # TODO: This must be configured by users
-   SERVER_URL = "http:///artifactory/api/conan"
-   USER = "admin"
-   PASSWORD = "your password"
+   SERVER_URL = os.environ.get("ARTIFACTORY_URL", "http://localhost:8081/artifactory/api/conan")
+   USER = os.environ.get("ARTIFACTORY_USER", "admin")
+   PASSWORD = os.environ.get("ARTIFACTORY_PASSWORD", "password")


 Initial dependency graph
@@ -43,8 +43,8 @@ Initial dependency graph

 .. warning::

-   - The initialization of the project will remove the contents of the 3 ``develop``, ``products`` and ``packages`` repositories.
-   - The ``examples2/ci/game`` folder contains an ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache.
+   - The initialization of the project will remove the contents of the 3 ``develop``, ``products`` and ``packages`` repositories in the server.
+   - The ``examples2/ci/game`` folder contains a ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache.

 .. code-block:: bash

diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst
index 9a69312c32a..04443c73062 100644
--- a/ci_tutorial/tutorial.rst
+++ b/ci_tutorial/tutorial.rst
@@ -6,7 +6,7 @@ Continuous Integration (CI) tutorial
 Continuous Integration has different meanings for different users and organizations. In this tutorial we will cover the scenarios when users
 are doing changes to the source code of their packages and want to automatically build new binaries for those packages and also compute if those
 new package changes integrate cleanly or break the organization's main products.

-In this tutorial we will use this small project, which uses several packages (static libraries by default) to build a couple of applications: a video game and a map viewer utility. The game and mapviewer are our final "products", i.e., what we distribute to our users:
+In this tutorial we will use this small project, which uses several packages (static libraries by default) to build a couple of applications: a video game and a map viewer utility. The ``game`` and ``mapviewer`` are our final "**products**", i.e., what we distribute to our users:

 .. graphviz::
    :align: center
@@ -99,6 +99,12 @@ with the repositories, there will be 2 promotions:
   the ``products`` repository, and has checked that the organization's "products" (such as ``game/1.0`` and ``mapviewer/1.0``) are not broken, then
   the packages can be promoted (copied) from the ``products`` repo to the ``develop`` repo, to make them available for all other developers and CI.

+
+.. note::
+
+   The concept of **immutability** is important in package management and devops. Modifying ``channel`` is strongly discouraged, see
+   :ref:`Package promotions`.
+
 This tutorial is just modeling the **development** flow. In production systems, there will be other repositories
 and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for end users; packages can
 be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in :ref:`Package promotions`.
diff --git a/tutorial/versioning/lockfiles.rst b/tutorial/versioning/lockfiles.rst
index 1148fc1d292..d05e8b2bb13 100644
--- a/tutorial/versioning/lockfiles.rst
+++ b/tutorial/versioning/lockfiles.rst
@@ -318,4 +318,4 @@ scripts, and for some advanced CI flows that will be explained later.

 .. seealso::

-    - Continuous Integrations links.
+    - :ref:`CI tutorial`.

From 845fb1bc5088e568d25fb4c26f95d307fc870ce3 Mon Sep 17 00:00:00 2001
From: memsharded
Date: Thu, 22 Aug 2024 18:15:03 +0200
Subject: [PATCH 07/22] wip

---
 ci_tutorial/products_pipeline.rst             | 109 ++++++++++++------
 ci_tutorial/tutorial.rst                      |   4 +-
 .../binary_model/custom_compatibility.rst     |   1 +
 reference/binary_model/dependencies.rst       |  39 +++++++
 4 files changed, 117 insertions(+), 36 deletions(-)

diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
index 0bf2e249720..c9641b3638f 100644
--- a/ci_tutorial/products_pipeline.rst
+++ b/ci_tutorial/products_pipeline.rst
@@ -1,56 +1,97 @@
 Products pipeline
 ==================

-The **products pipeline** answers a more challenging question: do my "products" build correctly with the latest changes that have been done
-to the packages? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization's
-important products to check if things integrate cleanly or break. Let's continue with the example above: if we now have a new ``ai/1.1.0`` package,
+The **products pipeline** answers a more challenging question: do my "products" build correctly with the latest changes and new versions that have been done
+to the packages and their dependencies? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization's
+important products to check if things integrate cleanly or break.
+
+Let's continue with the example above: if we now have a new ``ai/1.1.0`` package,
 is it going to break the existing ``game/1.0`` and/or ``mapviewer/1.0`` applications? Is it necessary to re-build from source some of the existing
-packages that depend directly or indirectly on the ``ai`` package? In this tutorial we will use ``game/1.0`` and ``mapviewer/1.0`` as our "products",
+packages that depend directly or indirectly on the ``ai`` package? In this tutorial we use ``game/1.0`` and ``mapviewer/1.0`` as our "products",
 but this concept will be further explained later, and especially why it is important to think in terms of "products" instead of trying to explicitly
 model the dependencies top-down in the CI.

+The essence of this **products pipeline** in our example is that the new ``ai/1.1.0`` version that was uploaded to the ``products`` repository
+automatically falls into the valid version ranges, and our versioning approach means that such a minor version increase will require building from
+source its consumers, in this case ``engine/1.0`` and ``game/1.0``, in that specific sequential order, while all the other packages will remain the same.
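+
+For example, assuming ``engine/1.0`` declares its requirement with a version range such as ``ai/[>=1.0 <2]``
+(an illustrative range; check the actual recipes in the examples repository), the new version would be picked up
+automatically when resolving the graph against both repositories:
+
+.. code-block:: bash
+
+    # Hypothetical check: ai/1.1.0 in "products" satisfies the range declared
+    # by engine, so it is resolved instead of the previous ai/1.0
+    $ conan graph info --requires=game/1.0 -r=products -r=develop
+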
+
+Knowing which packages need to be built from source and in which order, and executing that build to check that the main organization's products keep
+working correctly with the new dependency versions, is the responsibility of the products pipeline.


 What are the **products**
 -------------------------

-There are some important points to understand about the products pipeline:
+The **products** are the main software artifacts that an organization (a company, a team, a project) delivers as its final result and that provide some
+value for the users of those artifacts. In this example we will consider ``game/1.0`` and ``mapviewer/1.0`` the "products". Note that it is
+possible to define different versions of the same package as products, for example, if we had to maintain different versions of the ``game`` for
+different customers, we could have ``game/1.0`` and ``game/2.3`` as well as different versions of ``mapviewer`` as products.

+The "products" approach, besides the advantage of focusing on the business value, has another very important advantage: it avoids having to model
+the dependency graph at the CI layer. A frequent temptation is to model the inverse dependency model, that is, to represent at the CI level
+the dependants or consumers of a given package.
+In our example, if we had configured a job for building the ``ai`` package, we could have another
+job for the ``engine`` package that is triggered after the ``ai`` one, configuring such a topology somehow in the CI system.
+
+But this approach does not scale at all and has very important limitations:

+- The example above is relatively simple, but in practice dependency graphs can have many more packages, even several hundreds, making it very tedious and error-prone to define all dependencies among packages in the CI
+- Dependencies evolve over time: new versions are used, some dependencies are removed, and newer ones are added. The simple relationship between repositories modeled at the CI level can result in a very inefficient, slow and time-consuming CI, if not a fragile one that continuously breaks because some dependencies change.
+- The combinatorial nature of what happens downstream in a dependency graph, where a relatively stable top dependency, let's say ``mathlib/1.0``, might be used by multiple consumers such as ``ai/1.0``, ``ai/1.1``, ``ai/1.2``, each of which might in turn be used by multiple different ``engine`` versions, and so on. Building only the latest version of the consumers would be insufficient in many cases, and building all of them would be extremely costly.
+- The "inverse" dependency model, that is, asking what the "dependants" of a given package are, is extremely challenging in practice, especially in a decentralized
+  approach like Conan's, in which packages can be stored in different repositories, including different servers, and there isn't a central database of all packages and their relations.
+  Also, the "inverse" dependency model is, similar to the direct one, conditional.
+  As a dependency can be conditional on any configuration (settings, options), the inverse is
+  also conditioned by the same logic, and such logic evolves and changes with every new revision and version.
+
+In C and C++ projects the "products" pipeline becomes more necessary and critical than in other languages due to the compilation model, with headers' textual inclusions becoming part of the consumers' binary artifacts, and due to the native artifact linkage models.


 Building new binaries for intermediate packages
 -----------------------------------------------

-- Is it a new version
-- No, it is a new package-id
-- The package-id model and links
+A frequently asked question is what would be the version of a consumer package when it builds against a new dependency version.
+To put it explicitly for our example, where we have defined that we need to build the ``engine/1.0`` package again because it now
+depends on the new ``ai/1.1.0`` version:
+
+- Should we create a new ``engine/1.1`` version to build against the new ``ai/1.1.0``?
+- Or should we keep the ``engine/1.0`` version?
+
+The answer lies in the :ref:`binary model and how dependencies affect the package_id`.
+Conan has a binary model that takes into account the versions, revisions and ``package_id`` of the dependencies, as well
+as the different package types (``package_type`` attribute).
+
+The recommendation is to keep the package versions aligned with the source code. If ``engine/1.0`` is building from a specific
+commit/tag of its source repository, and the source of that repository doesn't change at all, then it becomes very confusing to
+have a changing package version that deviates from the source version. With the Conan binary model, what we will have is 2
+different binaries for ``engine/1.0``, with 2 different ``package_id`` values. One binary will be built against the ``ai/1.0`` version
+and the other will be built against ``ai/1.1.0``, something like:
+
+.. code-block::
+   :emphasize-lines: 6, 12, 14, 20
+
+   $ conan list engine:* -r=develop
+   engine/1.0
+     revisions
+       fba6659c9dd04a4bbdc7a375f22143cb (2024-08-22 09:46:24 UTC)
+         packages
+           2c5842e5aa3ed21b74ed7d8a0a637eb89068916e
+             info
+               settings
+               ...
+               requires
+                 ai/1.0.Z
+                 graphics/1.0.Z
+                 mathlib/1.0.Z
+           de738ff5d09f0359b81da17c58256c619814a765
+             info
+               settings
+               ...
+               requires
+                 ai/1.1.Z
+                 graphics/1.0.Z
+                 mathlib/1.0.Z
+
+
+Let's see how a products pipeline can build such new ``engine/1.0`` and ``game/1.0`` binaries using the new dependency versions.
+In the following sections we will present the products pipeline in an incremental way, the same as the packages pipeline.

 .. toctree::
    :maxdepth: 1

diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst
index 04443c73062..1c478eb10c3 100644
--- a/ci_tutorial/tutorial.rst
+++ b/ci_tutorial/tutorial.rst
@@ -102,8 +102,8 @@ with the repositories, there will be 2 promotions:

 .. note::

-    The concept of **immutability** is important in package management and devops. Modifying ``channel`` is strongly discouraged, see
-    :ref:`Package promotions`.
+    - The concept of **immutability** is important in package management and devops. Modifying ``channel`` is strongly discouraged, see :ref:`Package promotions`.
+    - The versioning approach is important. This tutorial will be following :ref:`the default Conan versioning approach, see details here`

 This tutorial is just modeling the **development** flow.
In production systems, there will be other repositories
 and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for end users; packages can
 be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in :ref:`Package promotions`.
diff --git a/reference/binary_model/custom_compatibility.rst b/reference/binary_model/custom_compatibility.rst
index 61917385745..56e03b4d550 100644
--- a/reference/binary_model/custom_compatibility.rst
+++ b/reference/binary_model/custom_compatibility.rst
@@ -85,6 +85,7 @@ Compatibility can be defined globally via the ``compatibility.py`` plugin, in th

 Check the binary compatibility :ref:`compatibility.py extension `.

+.. _reference_binary_model_custom_compatibility_dependencies:

 Customizing binary compatibility of dependencies versions
 ---------------------------------------------------------
diff --git a/reference/binary_model/dependencies.rst b/reference/binary_model/dependencies.rst
index b8e74003457..031153298b4 100644
--- a/reference/binary_model/dependencies.rst
+++ b/reference/binary_model/dependencies.rst
@@ -152,3 +152,42 @@ Now we will have two ``app/0.1`` different binaries:

 We will have these two different binaries, one of them linking with the first revision of the ``dep/0.1`` dependency (with the
 "Hello World" message), and the other binary with the other ``package_id`` linked with the second revision of the ``dep/0.1``
 dependency (with the "Hello Moon" message).

 The above described mode is called ``full_mode``, and it is the default for the ``embed_mode``.
+
+
+.. _reference_binary_model_dependencies_versioning:
+
+
+Default versioning approach
+----------------------------
+
+The above behavior is the default for ``embed`` and ``non_embed`` mode. These defaults are intended to work with the
+following versioning approach:
+
+- Not modifying the version typically means that we want Conan's automatic
+  **recipe revisions** to handle that. A common use case is when the C/C++ source code is not modified at all, and only changes
+  to the ``conanfile.py`` recipe are done. As the source code is the same, we might want to keep the same version number, and
+  just have a new revision of that version.
+- **Patch**: Increasing the **patch** version of a package means that only internal changes were done; in practice it means changes to files
+  that are not public headers of the package. This "patch" version can avoid having to re-build consumers of this package; for
+  example, if the current package getting a new "patch" version is a static library, all other packages that implement static
+  libraries that depend on this one do not need to be re-built from source, as depending on the same public interface headers
+  guarantees the same binary.
+- **Minor**: If changes are done to package public headers, in an API source-compatible way, then the recommendation would be to increase
+  the **minor** version of a package. That means that other packages that depend on it will be able to compile without issues,
+  but as there were modifications in public headers (that could contain C++ templates or other things that could be inlined in
+  the consumer packages), then those consumer packages need to be rebuilt from source to incorporate these changes.
+- **Major**: If API breaking changes are done to the package public headers, then increasing the **major** version is recommended. As the
+  most common recommended version-range is something like ``dependency/[>1.0 <2]``, where the next major is excluded, that means
+  that publishing these new versions will not break existing consumers, because they will not be used at all by those consumers,
+  because their version ranges will exclude them. It will be necessary to modify the consumers' recipes and source code (to fix
+  the API breaking changes) to be able to use the new major version.
+
+
+Note that while this is close to the standard "semver" definition of version and version ranges, the C/C++ compilation model
+needs to introduce a new side effect, that of "needing to rebuild the consumers", following the logic explained above in the
+``embed`` and ``non_embed`` cases.
+
+
+This is just the default recommended versioning approach, but Conan allows changing these defaults, as it implements an extension of the "semver" standard that allows any number of digits,
+letters, etc., and it also allows changing the ``package_id`` modes to define how different versions of the dependencies affect
+the consumers' binaries. See :ref:`how to customize the dependencies package_id modes`.

From acaebc508af0ea6eea650ebc6ff8bfcff6b0bd5f Mon Sep 17 00:00:00 2001
From: memsharded
Date: Fri, 23 Aug 2024 13:31:23 +0200
Subject: [PATCH 08/22] moved default versioning

---
 ci_tutorial/tutorial.rst                |  9 +++-
 devops/devops.rst                       |  6 ++-
 devops/versioning/default.rst           | 70 +++++++++++++++++++++++++
 devops/{ => versioning}/versioning.rst  |  3 +-
 reference/binary_model/dependencies.rst | 39 --------------
 5 files changed, 84 insertions(+), 43 deletions(-)
 create mode 100644 devops/versioning/default.rst
 rename devops/{ => versioning}/versioning.rst (79%)

diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst
index 1c478eb10c3..4682d0c8454 100644
--- a/ci_tutorial/tutorial.rst
+++ b/ci_tutorial/tutorial.rst
@@ -3,6 +3,13 @@
 Continuous Integration (CI) tutorial
 ====================================

+.. note::
+
+   - This is an advanced topic; previous knowledge of the Conan tool is necessary. Please :ref:`read and practice the user tutorial` first.
+   - This section is intended for devops and build engineers designing and implementing a CI pipeline involving Conan packages; if that is not the
+     case, you can skip this section.
+
+
 Continuous Integration has different meanings for different users and organizations. In this tutorial we will cover the scenarios when users
 are doing changes to the source code of their packages and want to automatically build new binaries for those packages and also compute if those
 new package changes integrate cleanly or break the organization's main products.
@@ -103,7 +110,7 @@ with the repositories, there will be 2 promotions:

    - The concept of **immutability** is important in package management and devops. Modifying ``channel`` is strongly discouraged, see :ref:`Package promotions`.
-   - The versioning approach is important. This tutorial will be following :ref:`the default Conan versioning approach, see details here`
+   - The versioning approach is important. This tutorial will be following :ref:`the default Conan versioning approach, see details here`

 This tutorial is just modeling the **development** flow.
In production systems, there will be other repositories
 and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for end users; packages can
 be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in :ref:`Package promotions`.
diff --git a/devops/devops.rst b/devops/devops.rst
index e9fc43dadb7..0de7e1246c0 100644
--- a/devops/devops.rst
+++ b/devops/devops.rst
@@ -4,7 +4,9 @@
 Devops guide
 ============

-The previous tutorial section was aimed at users in general and developers.
+The previous :ref:`tutorial` section was aimed at users in general and developers.
+
+The :ref:`Continuous Integration tutorial` explained the basics of how to implement Continuous Integration involving Conan packages.

 This section is intended for DevOps users, build and CI engineers, administrators, and architects adopting, designing and implementing Conan in production in their teams and organizations.

 If you plan to use Conan in production in your project, team, or organization, this section contains the necessary information.

@@ -16,7 +18,7 @@ If you plan to use Conan in production in your project, team, or organization, t

    devops_local_recipes_index
    backup_sources/sources_backup
    metadata
-   versioning
+   versioning/versioning
   save_restore
   vendoring
   package_promotions
diff --git a/devops/package_promotions.rst b/devops/versioning/default.rst
new file mode 100644
index 00000000000..6193a917d1c
--- /dev/null
+++ b/devops/versioning/default.rst
@@ -0,0 +1,70 @@
+
+.. _devops_versioning_default:
+
+
+Default versioning approach
+----------------------------
+
+When doing changes to the source code of a package, and creating such a package, one good practice is to increase the version
+of the package to represent the scope and impact of those changes. The "semver" standard specification defines a ``MAJOR.MINOR.PATCH``
+versioning approach with a specific meaning for changing each digit.
+
+Conan implements versioning based on the "semver" specification, but with some extended capabilities that were demanded by the C and C++
+ecosystems:
+
+- Conan versions can have any number of digits, like ``MAJOR.MINOR.PATCH.MICRO.SUBMICRO...``
+- Conan versions can also contain letters, not only digits, and they are ordered alphabetically too, so ``1.a.2`` is older than ``1.b.1`` for example.
+- The version ranges can be equally defined for any number of digits, like ``dependency/[>=1.0.0.0 <1.0.0.10]``
+
+Read the :ref:`introduction to versioning` in the tutorial.
+
+But one very different aspect of the C and C++ build model compared to other languages is how the dependencies affect the
+binaries of the consumers requiring them. This is described in the :ref:`Conan binary model` reference.
+
+Basically, when some package changes its version, this can have different effects on the "consumers" of this package, requiring such
+"consumers" to rebuild from source (or not) to integrate the new dependency changes. This also depends on the package types,
+as the logic changes when linking a shared library or a static library. The Conan binary model, with ``dependency traits``, ``package_type``,
+and the ``package_id`` modes, is able to represent this logic and efficiently compute what needs to be rebuilt from source.
+
+The default Conan behavior can give some hints of what version changes would be recommended when doing different changes to the package's
+source code:
+
+- Not modifying the version typically means that we want Conan's automatic
+  **recipe revisions** to handle that.
A common use case is when the C/C++ source code is not modified at all, and only changes
+  to the ``conanfile.py`` recipe are done. As the source code is the same, we might want to keep the same version number, and
+  just have a new revision of that version.
+- **Patch**: Increasing the **patch** version of a package means that only internal changes were done; in practice it means changes to files
+  that are not public headers of the package. This "patch" version can avoid having to re-build consumers of this package. For
+  example, if the current package getting a new "patch" version is a static library, all other packages that implement static
+  libraries and depend on this one do not need to be re-built from source, as depending on the same public interface headers
+  guarantees the same binary.
+- **Minor**: If changes are done to the package public headers, in an API source-compatible way, then the recommendation would be to increase
+  the **minor** version of a package. That means that other packages that depend on it will be able to compile without issues,
+  but as there were modifications in public headers (that could contain C++ templates or other things that could be inlined in
+  the consumer packages), then those consumer packages need to be rebuilt from source to incorporate these changes.
+- **Major**: If API breaking changes are done to the package public headers, then increasing the **major** version is recommended. As the
+  most common recommended version-range is something like ``dependency/[>1.0 <2]``, where the next major is excluded, that means
+  that publishing these new versions will not break existing consumers, because they will not be used at all by those consumers,
+  because their version ranges will exclude them. It will be necessary to modify the consumers' recipes and source code (to fix
+  the API breaking changes) to be able to use the new major version, as illustrated in the sketch after the notes below.
+
+
+Note that while this is close to the standard "semver" definition of version and version ranges, the C/C++ compilation model
+needs to introduce a new side effect, that of "needing to rebuild the consumers", following the logic explained above in the
+``embed`` and ``non_embed`` cases.
+
+
+This is just the default recommended versioning approach, but Conan allows changing these defaults, as it implements an extension of the "semver" standard that allows any number of digits,
+letters, etc, and it also allows changing the ``package_id`` modes to define how different versions of the dependencies affect
+the consumers' binaries. See :ref:`how to customize the dependencies package_id modes`.
+
+
+.. note::
+
+   **Best practices**
+
+   - It is not recommended to use other package reference fields, such as the ``user`` and ``channel``, to represent changes in the source code,
+     or other information like the git branch, as this becomes "viral", requiring changes in the ``requires`` of the consumers. Furthermore,
+     they don't implement any logic in the build model with respect to which consumers need to be rebuilt.
+   - The recommended approach is to use versioning and multiple server repositories to host the different packages, so they don't interfere
+     with other builds; read :ref:`the Continuous Integration tutorial` for more details.
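As a quick illustration of the **major** bullet above, here is a minimal, hypothetical sketch of how a consumer's version range reacts to new versions (the ``dependency`` recipe folder is an assumption, with a ``conanfile.py`` that does not pin its own version):

.. code-block:: bash

    # publish two versions of a hypothetical "dependency" package
    $ conan create dependency/ --version=1.0.0
    $ conan create dependency/ --version=1.1.0

    # a consumer of dependency/[>=1.0 <2] resolves to the newest match: 1.1.0
    $ conan graph info --requires="dependency/[>=1.0 <2]"

    # publishing a new major version does not break that consumer: 2.0.0 is
    # excluded by the range, so dependency/1.1.0 is still the resolved version
    $ conan create dependency/ --version=2.0.0
    $ conan graph info --requires="dependency/[>=1.0 <2]"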
diff --git a/devops/versioning.rst b/devops/versioning/versioning.rst similarity index 79% rename from devops/versioning.rst rename to devops/versioning/versioning.rst index f0fd5febe51..25980c6c107 100644 --- a/devops/versioning.rst +++ b/devops/versioning/versioning.rst @@ -8,4 +8,5 @@ This section deals with different versioning topics: .. toctree:: :maxdepth: 1 - versioning/resolve_prereleases + default + resolve_prereleases diff --git a/reference/binary_model/dependencies.rst b/reference/binary_model/dependencies.rst index 031153298b4..b8e74003457 100644 --- a/reference/binary_model/dependencies.rst +++ b/reference/binary_model/dependencies.rst @@ -152,42 +152,3 @@ Now we will have two ``app/0.1`` different binaries: We will have these two different binaries, one of them linking with the first revision of the ``dep/0.1`` dependency (with the "Hello World" message), and the other binary with the other ``package_id`` linked with the second revision of the ``dep/0.1`` dependency (with the "Hello Moon" message). The above described mode is called ``full_mode``, and it is the default for the ``embed_mode``. - - -.. _reference_binary_model_dependencies_versioning: - - -Default versioning approach ----------------------------- - -The above behavior is the default for ``embed`` and ``non_embed`` mode. These defaults are intended to work with the -following versioning approach: - -- Not modifying the version typically means that we want Conan automatic - **recipe revisions** to handle that. A common use case is when the C/C++ source code is not modified at all, and only changes - to the ``conanfile.py`` recipe are done. As the source code is the same, we might want to keep the same version number, and - just have a new revision of that version. -- **Patch**: Increasing the **patch** version of a package means that only internal changes were done, in practice it means change to files - that are not public headers of the package. This "patch" version can avoid having to re-build consumers of this package, for - example if the current package getting a new "patch" version is a static library, all other packages that implement static - libraries that depend on this one do not need to be re-built from source, as depending on the same public interface headers - guarantee the same binary. -- **Minor**: If changes are done to package public headers, in an API source compatible way, then the recommendation would be to increase - the **minor** verson of a package. That means that other packages that depend on it will be able to compile without issues, - but as there were modification in public headers (that could contain C++ templates or other things that could be inlined in - the consumer packages), then those consumer packages need to be rebuilt from source to incorporate these changes. -- **Major**: If API breaking changes are done to the package public headers, then increasing the **major** version is recommended. As the - most common recommended version-range is something like ``dependency/[>1.0 <2]``, where the next major is excluded, that means - that publishing these new versions will not break existing consumers, because they will not be used at all by those consumers, - because their version ranges will exclude them. It will be necessary to modify the consumers recipes and source code (to fix - the API breaking changes) to be able to use the new major version. 
-
-
-Note that while this is close to the standard "semver" definition of version and version ranges, the C/C++ compilation model
-needs to introduce a new side effect, that of "needing to rebuild the consumers", following the logic explained above in the
-``embed`` and ``non_embed`` cases.
-
-
-This is just the default recommended versioning approach, but Conan allows to change these defaults, as it implements an extension of the "semver" standard that allows any number of digits,
-letters, etc, and it also allows to change the ``package_id`` modes to define how different versions of the dependencies affect
-the consumers binaries. See :ref:`how to customize the dependencies package_id modes`.

From c22e05ad2210d18ef8987c32b34408e5dcd6a932 Mon Sep 17 00:00:00 2001
From: memsharded
Date: Mon, 30 Sep 2024 20:16:45 +0200
Subject: [PATCH 09/22] products pipeline

---
 ci_tutorial/products_pipeline.rst             |   2 +
 .../products_pipeline/build_order_simple.png  | Bin 0 -> 30449 bytes
 .../products_pipeline/decentralized_build.rst |  90 +++++++++++
 .../single_configuration.rst                  | 143 +++++++++++++++++-
 4 files changed, 231 insertions(+), 4 deletions(-)
 create mode 100644 ci_tutorial/products_pipeline/build_order_simple.png
 create mode 100644 ci_tutorial/products_pipeline/decentralized_build.rst

diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
index c9641b3638f..422165993c9 100644
--- a/ci_tutorial/products_pipeline.rst
+++ b/ci_tutorial/products_pipeline.rst
@@ -98,4 +98,6 @@ In the following sections we will present a products pipeline in an incremental
    :maxdepth: 1
 
    products_pipeline/single_configuration
+   products_pipeline/decentralized_build
+
+
diff --git a/ci_tutorial/products_pipeline/build_order_simple.png b/ci_tutorial/products_pipeline/build_order_simple.png
new file mode 100644
index 0000000000000000000000000000000000000000..22db687959986b61bc9e6a7e9fd776e026e00e18
GIT binary patch
literal 30449
(base85-encoded binary data of the PNG image omitted)

diff --git a/ci_tutorial/products_pipeline/decentralized_build.rst b/ci_tutorial/products_pipeline/decentralized_build.rst
new file mode 100644
index 00000000000..bfc13f58f59
--- /dev/null
+++ b/ci_tutorial/products_pipeline/decentralized_build.rst
@@ -0,0 +1,90 @@
+Products pipeline: decentralized build
+======================================
+
+The previous section used ``--build=missing`` to build all the necessary packages in the same CI machine.
+
+Let's start as usual making sure we have a clean environment with the right repositories defined:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*" # Make sure no other remotes defined
+    # Add products repo, you might need to adjust this URL
+    # NOTE: The products repo is added first, it will have higher priority.
+    $ conan remote add products http://localhost:8081/artifactory/api/conan/products
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+The first step is to compute the "build-order", the ordered list of packages and binaries that need to be built.
+This is done with the following ``conan graph build-order`` command:
+
+.. code-block:: bash
+
+    $ conan graph build-order --requires=game/1.0 --build=missing --order-by=recipe --reduce --format=json > game_build_order.json
+
+Note a few important points:
+
+- It is necessary to use ``--build=missing``, in exactly the same way as in the previous section. Failing to provide the intended ``--build`` policy and argument will result in incomplete or erroneous build-orders.
+- The ``--reduce`` argument eliminates all elements in the result that don't have the ``binary: Build`` policy. This means that the resulting "build-order" cannot be merged with other build-order files to aggregate them into a single one, which is important when there are multiple configurations and products.
+- The ``--order-by`` argument allows defining different orders, by "recipe" or by "configuration". In this case we are using ``--order-by=recipe``, which is intended to parallelize builds per recipe; that means that all possible different binaries for a given package like ``engine/1.0`` should be built first, before any consumer of ``engine/1.0`` can be built.
+
+The resulting ``game_build_order.json`` looks like:
+
+.. 
code-block:: json
   :caption: game_build_order.json

    {
        "order_by": "recipe",
        "reduced": true,
        "order": [
            [
                {
                    "ref": "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb",
                    "packages": [
                        [
                            {
                                "package_id": "de738ff5d09f0359b81da17c58256c619814a765",
                                "binary": "Build",
                                "build_args": "--requires=engine/1.0 --build=engine/1.0",

                            }
                        ]
                    ]
                }
            ],
            [
                {
                    "ref": "game/1.0#1715574045610faa2705017c71d0000e",
                    "depends": [
                        "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb"
                    ],
                    "packages": [
                        [
                            {
                                "package_id": "bac7cd2fe1592075ddc715563984bbe000059d4c",
                                "binary": "Build",
                                "build_args": "--requires=game/1.0 --build=game/1.0",
                            }
                        ]
                    ]
                }
            ]
        ]
    }


For convenience, in the same way that ``conan graph info ... --format=html > graph.html`` can generate a file with an HTML interactive dependency graph, the ``conan graph build-order ... --format=html > build_order.html`` can generate an HTML visual representation of the above json file:


.. image:: ./build_order_simple.png
   :width: 500 px
   :align: center


Note that this
- not upload yet
- html
diff --git a/ci_tutorial/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst
index 9fcbf689c63..2bdd6c78cff 100644
--- a/ci_tutorial/products_pipeline/single_configuration.rst
+++ b/ci_tutorial/products_pipeline/single_configuration.rst
@@ -1,10 +1,145 @@
 Products pipeline: single configuration
 =======================================
 
+In this section we will implement a very basic products pipeline, without distributing the build, and without using lockfiles or building multiple configurations.
+
+The main idea is to illustrate the need to rebuild some packages because there is a new ``ai/1.1.0`` version that can be integrated by our main products. This new ``ai`` version is in the ``products`` repository, as it was already successfully built by the "packages pipeline".
+
+Let's start by making sure we have a clean environment with the right repositories defined:
+
+.. code-block:: bash
+
-- When the ``products pipeline`` build all necessary new binaries for all intermediate and product packages and check that every is correct, then
-    these new packages can be made available for all other developers and CI jobs. This can be done with a promotion of these packages, copying
-    them from the ``products`` repository to the ``develop`` repository. As the changes have been integrated and tested consistently for the main
-    organization products, developers doing ``conan install`` will start seeing and using the new packages and binaries.
\ No newline at end of file
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*" # Make sure no other remotes defined
+    # Add products repo, you might need to adjust this URL
+    # NOTE: The products repo is added first, it will have higher priority.
+    $ conan remote add products http://localhost:8081/artifactory/api/conan/products
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+Note that the ``products`` repo is added first, so it will have higher priority than the ``develop`` repo. This means Conan will try to resolve first in the ``products`` repo; if it finds a valid version for the defined version ranges, it will stop there and return that version, without
+checking the ``develop`` repo (checking all repositories can be done with ``--update``, but that would be slower and, with the right repository ordering, it is not necessary).
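Before resolving anything, it can be worth double-checking that ordering. A minimal sanity check (the exact output of the command is omitted here):

.. code-block:: bash

    # the order in which remotes are listed is the resolution order,
    # so "products" should be listed before "develop"
    $ conan remote list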
+
+As we have already defined, our main products are ``game/1.0`` and ``mapviewer/1.0``. Let's start by trying to install and use ``mapviewer/1.0``:
+
+
+.. code-block:: bash
+
+    $ conan install --requires=mapviewer/1.0
+    ...
+    Requirements
+        graphics/1.0#24b395ba17da96288766cc83accc98f5 - Downloaded (develop)
+        mapviewer/1.0#c4660fde083a1d581ac554e8a026d4ea - Downloaded (develop)
+        mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea - Downloaded (develop)
+    ...
+    Install finished successfully
+
+    # Activate the environment and run the executable
+    # Use "conanbuild.bat && mapviewer" in Windows
+    $ source conanrun.sh && mapviewer
+    ...
+    graphics/1.0: Checking if things collide (Release)!
+    mapviewer/1.0:serving the game (Release)!
+
+
+As we can see, ``mapviewer/1.0`` doesn't really depend on the ``ai`` package at all, in any version.
+So when we install it, we already have a pre-compiled binary for it and everything works.
+
+But if we now try the same with ``game/1.0``:
+
+.. code-block:: bash
+
+    $ conan install --requires=game/1.0
+    ...
+    Requirements
+        ai/1.1.0#01a885b003190704f7617f8c13baa630 - Downloaded (products)
+        engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb - Downloaded (develop)
+        game/1.0#1715574045610faa2705017c71d0000e - Downloaded (develop)
+        graphics/1.0#24b395ba17da96288766cc83accc98f5 - Cache
+        mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea - Cache
+    ERROR: Missing binary: game/1.0:bac7cd2fe1592075ddc715563984bbe000059d4c
+
+    game/1.0: WARN: Cant find a game/1.0 package binary bac7cd2fe1592075ddc715563984bbe000059d4c for the configuration:
+    ...
+    [requires]
+    ai/1.1.0#01a885b003190704f7617f8c13baa630
+
+It will fail, because it will get ``ai/1.1.0`` from the ``products`` repo, and there will be no pre-compiled binary for ``game/1.0`` against this new version of ``ai``. This is correct: ``ai`` is a static library, so we need to re-build ``game/1.0`` against it. Let's do it using the ``--build=missing`` argument:
+
+.. code-block:: bash
+
+    $ conan install --requires=game/1.0 --build=missing
+    ...
+    ======== Computing necessary packages ========
+    Requirements
+        ai/1.1.0:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 - Download (products)
+        engine/1.0:de738ff5d09f0359b81da17c58256c619814a765 - Build
+        game/1.0:bac7cd2fe1592075ddc715563984bbe000059d4c - Build
+        graphics/1.0:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 - Download (develop)
+        mathlib/1.0:4d8ab52ebb49f51e63d5193ed580b5a7672e23d5 - Download (develop)
+
+    -------- Installing package engine/1.0 (4 of 5) --------
+    engine/1.0: Building from source
+    ...
+    engine/1.0: Package de738ff5d09f0359b81da17c58256c619814a765 created
+    -------- Installing package game/1.0 (5 of 5) --------
+    game/1.0: Building from source
+    ...
+    game/1.0: Package bac7cd2fe1592075ddc715563984bbe000059d4c created
+    Install finished successfully
+
+Note that ``--build=missing`` knows that ``engine/1.0`` also needs a new binary as a result of its dependency on the new ``ai/1.1.0`` version. Conan then proceeds to build the packages in the right order: first ``engine/1.0`` has to be built, because ``game/1.0`` depends on it. After the build, we can list the newly built binaries and see how they depend on the new versions:
+
+.. code-block:: bash
+
+    $ conan list engine:*
+    Local Cache
+      engine
+        engine/1.0
+          revisions
+            fba6659c9dd04a4bbdc7a375f22143cb (2024-09-30 12:19:54 UTC)
+              packages
+                de738ff5d09f0359b81da17c58256c619814a765
+                  info
+                    ...
requires
+                      ai/1.1.Z
+                      graphics/1.0.Z
+                      mathlib/1.0.Z
+
+    $ conan list game:*
+    Local Cache
+      game
+        game/1.0
+          revisions
+            1715574045610faa2705017c71d0000e (2024-09-30 12:19:55 UTC)
+              packages
+                bac7cd2fe1592075ddc715563984bbe000059d4c
+                  info
+                    ...
+                    requires
+                      ai/1.1.0#01a885b003190704f7617f8c13baa630:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3
+                      engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb:de738ff5d09f0359b81da17c58256c619814a765
+                      graphics/1.0#24b395ba17da96288766cc83accc98f5:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3
+                      mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea:4d8ab52ebb49f51e63d5193ed580b5a7672e23d5
+
+The new ``engine/1.0:de738ff5d09f0359b81da17c58256c619814a765`` binary depends on ``ai/1.1.Z``, because, as it is a static library, it will only require re-builds for changes in the minor version, but not for patches. Meanwhile, the new ``game/1.0`` binary depends on the full exact ``ai/1.1.0#revision:package_id``, and also on the new ``engine/1.0:de738ff5d09f0359b81da17c58256c619814a765`` binary that depends on ``ai/1.1.Z``.
+
+Now the game can be executed:
+
+.. code-block:: bash
+
+    # Activate the environment and run the executable
+    # Use "conanbuild.bat && game" in Windows
+    $ source conanrun.sh && game
+    mathlib/1.0: mathlib maths (Release)!
+    ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)!
+    ai/1.1.0: Intelligence level=50
+    graphics/1.0: Checking if things collide (Release)!
+    engine/1.0: Computing some game things (Release)!
+    game/1.0:fun game (Release)!
+
+We can see that the new ``game/1.0`` binary incorporates the improvements in ``ai/1.1.0``, and links correctly with the new binary for ``engine/1.0``.
+
+And this is a basic "products pipeline": we managed to build and test our main products when necessary (recall that ``mapviewer`` wasn't really affected, so no rebuilds were necessary at all).
+In general, a production "products pipeline" will finish by uploading the built packages to the repository and running a new promotion to the ``develop`` repo. But as this was a very basic and simplified pipeline, let's leave that for later and continue with more advanced scenarios.
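Even though we are deferring that final step, a minimal sketch of what the upload part could look like is shown below, capturing a "package list" of exactly what was built from source (the json file names are arbitrary; the complete flow, including the promotion itself, is developed later in this tutorial):

.. code-block:: bash

    # capture the full dependency graph of the product as json
    $ conan install --requires=game/1.0 --build=missing --format=json > graph.json
    # derive a package list containing only the binaries built from source
    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
    # upload only those newly built packages to the products repository
    $ conan upload --list=built.json -r=products -c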
From ea4453ea856c0e45bdab344bdbc2a95aa05986b0 Mon Sep 17 00:00:00 2001
From: memsharded
Date: Wed, 2 Oct 2024 12:15:48 +0200
Subject: [PATCH 10/22] wip

---
 ci_tutorial/products_pipeline.rst             |   6 +-
 ...alized_build.rst => distributed_build.rst} |  37 +++--
 .../products_pipeline/full_pipeline.rst       | 102 ++++++++++++
 .../products_pipeline/multi_product.rst       | 129 ++++++++++++++++++
 .../single_configuration.rst                  |   6 +
 .../create_your_first_package.rst             |   2 +-
 6 files changed, 270 insertions(+), 12 deletions(-)
 rename ci_tutorial/products_pipeline/{decentralized_build.rst => distributed_build.rst} (65%)
 create mode 100644 ci_tutorial/products_pipeline/full_pipeline.rst
 create mode 100644 ci_tutorial/products_pipeline/multi_product.rst

diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
index 422165993c9..cfa2843e809 100644
--- a/ci_tutorial/products_pipeline.rst
+++ b/ci_tutorial/products_pipeline.rst
@@ -98,6 +98,6 @@ In the following sections we will present a products pipeline in an incremental
    :maxdepth: 1
 
    products_pipeline/single_configuration
-   products_pipeline/decentralized_build
-
-
+   products_pipeline/distributed_build
+   products_pipeline/multi_product
+   products_pipeline/full_pipeline
diff --git a/ci_tutorial/products_pipeline/decentralized_build.rst b/ci_tutorial/products_pipeline/distributed_build.rst
similarity index 65%
rename from ci_tutorial/products_pipeline/decentralized_build.rst
rename to ci_tutorial/products_pipeline/distributed_build.rst
index bfc13f58f59..82d57a024c9 100644
--- a/ci_tutorial/products_pipeline/decentralized_build.rst
+++ b/ci_tutorial/products_pipeline/distributed_build.rst
@@ -1,5 +1,5 @@
-Products pipeline: decentralized build
-======================================
+Products pipeline: distributed build
+====================================
 
 The previous section used ``--build=missing`` to build all the necessary packages in the same CI machine.
 
@@ -9,6 +9,12 @@ Let's start as usual making sure we have a clean environment with the right repo
 
 .. code-block:: bash
 
+    # First clean the local "build" folder
+    $ pwd # should be /examples2/ci/game
+    $ rm -rf build # clean the temporary build folder
+    $ mkdir build && cd build # To put temporary files
+
+    # Now clean packages and define remotes
     $ conan remove "*" -c  # Make sure no packages from last run
     $ conan remote remove "*" # Make sure no other remotes defined
     # Add products repo, you might need to adjust this URL
@@ -24,7 +30,8 @@ This is done with the following ``conan graph build-order`` command:
 
 .. code-block:: bash
 
-    $ conan graph build-order --requires=game/1.0 --build=missing --order-by=recipe --reduce --format=json > game_build_order.json
+    $ conan graph build-order --requires=game/1.0 --build=missing \
+    --order-by=recipe --reduce --format=json > game_build_order.json
 
 Note a few important points:
 
@@ -49,8 +56,7 @@ The resulting ``game_build_order.json`` looks like:
             {
               "package_id": "de738ff5d09f0359b81da17c58256c619814a765",
               "binary": "Build",
-              "build_args": "--requires=engine/1.0 --build=engine/1.0",
-
+              "build_args": "--requires=engine/1.0 --build=engine/1.0",
             }
           ]
         ]
@@ -85,6 +91,21 @@ For convenience, in the same way that ``conan graph info ... --format=html > gra
    :align: center
 
 
-Note that this
-- not upload yet
-- html
+The resulting json contains an ``order`` element which is a list of lists. This arrangement is important: every element in the top list is a set of packages that can be built in parallel because they do not have any relationship among them.
You can view this list as a list of "levels": in level 0 there are packages that have no dependencies on any other package being built; in level 1 there are packages that contain dependencies only on elements in level 0; and so on.
+
+Then, the order of the elements in the top list is important and must be respected. Until the build of all the packages in one list item has finished, it is not possible to start the build of the next "level".
+
+Using the information in the ``game_build_order.json`` file, it is possible to execute the build of the necessary packages, in the same way that ``--build=missing`` did in the previous section, but this time directly managed by us.
+
+Taking the arguments from the json, the commands to execute would be:
+
+.. code-block:: bash
+
+    $ conan install --requires=engine/1.0 --build=engine/1.0
+    $ conan install --requires=game/1.0 --build=game/1.0
+
+We are executing these commands manually, but in practice it would be a ``for`` loop in CI iterating over the json output. We will see some Python code later for this. At this point we wanted to focus on the ``conan graph build-order`` command; we haven't really explained yet how the build is distributed.
+
+Also note that inside every element there is an inner list of lists, the ``"packages"`` section, with all the binaries that must be built for a specific recipe for different configurations.
+
+Let's now move on to see how a multi-product, multi-configuration build-order can be computed.
diff --git a/ci_tutorial/products_pipeline/full_pipeline.rst b/ci_tutorial/products_pipeline/full_pipeline.rst
new file mode 100644
index 00000000000..7b7f060723d
--- /dev/null
+++ b/ci_tutorial/products_pipeline/full_pipeline.rst
@@ -0,0 +1,102 @@
+Products pipeline: distributed full pipeline with lockfiles
+===========================================================
+
+This section will present the full and complete implementation of a multi-product, multi-configuration
+distributed CI pipeline. We will complete the
+
+Let's start as usual cleaning the local cache and defining the correct repos:
+
+.. code-block:: bash
+
+    # First clean the local "build" folder
+    $ pwd # should be /examples2/ci/game
+    $ rm -rf build # clean the temporary build folder
+    $ mkdir build && cd build # To put temporary files
+
+    # Now clean packages and define remotes
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*" # Make sure no other remotes defined
+    # Add products repo, you might need to adjust this URL
+    # NOTE: The products repo is added first, it will have higher priority.
+    $ conan remote add products http://localhost:8081/artifactory/api/conan/products
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+
+Now, we will start computing the build-order for ``game/1.0`` for the 2 different configurations that we are building in this tutorial, debug and release:
+
+.. code-block:: bash
+
+    $ conan lock create --requires=game/1.0 --lockfile-out=conan.lock
+    $ conan lock create --requires=game/1.0 -s build_type=Debug \
+    --lockfile=conan.lock --lockfile-out=conan.lock
+    $ conan lock create --requires=mapviewer/1.0 --lockfile=conan.lock \
+    --lockfile-out=conan.lock
+    $ conan lock create --requires=mapviewer/1.0 -s build_type=Debug \
+    --lockfile=conan.lock --lockfile-out=conan.lock
+
+
+.. note::
+
+   Recall that the ``conan.lock`` arguments are mostly optional, as that is the default lockfile name.
The first command can be typed as ``conan lock create --requires=game/1.0``. Also, all commands, including
+   ``conan install``, will automatically use an existing ``conan.lock`` file if they find one, without an
+   explicit ``--lockfile=conan.lock``. The commands in this tutorial are shown in full for clarity and
+   didactic reasons.
+
+
+Then, we can compute the build order for each product and configuration. These commands are identical to the ones in the
+previous section, with the only difference of adding a ``--lockfile=conan.lock`` argument:
+
+
+.. code-block:: bash
+
+    $ conan graph build-order --requires=game/1.0 --lockfile=conan.lock \
+    --build=missing --order-by=recipe --format=json > game_release.json
+    $ conan graph build-order --requires=game/1.0 --lockfile=conan.lock \
+    --build=missing -s build_type=Debug --order-by=recipe --format=json > game_debug.json
+    $ conan graph build-order --requires=mapviewer/1.0 --lockfile=conan.lock \
+    --build=missing --order-by=recipe --format=json > mapviewer_release.json
+    $ conan graph build-order --requires=mapviewer/1.0 --lockfile=conan.lock \
+    --build=missing -s build_type=Debug --order-by=recipe --format=json > mapviewer_debug.json
+
+Likewise, the ``build-order-merge`` command will be identical to the previous one.
+In this case, as this command doesn't really compute a dependency graph, a ``conan.lock`` argument is not necessary,
+as dependencies are not being resolved:
+
+
+.. code-block:: bash
+
+    $ conan graph build-order-merge \
+    --file=game_release.json --file=game_debug.json \
+    --file=mapviewer_release.json --file=mapviewer_debug.json \
+    --reduce --format=json > build_order.json
+
+
+
+
+
+
+
+
+
+This build order summarizes the necessary builds. First, it is necessary to build all different binaries for ``engine/1.0``. This recipe contains 2 different binaries, one for Release and the other for Debug. These binaries belong to the same element in the ``packages`` list, which means they do not depend on each other and can be built in parallel. Each binary tracks its own original build-order file with ``"filenames": ["game_release"],`` so it is possible to deduce the necessary profiles to apply to it.
+
+Then, after all binaries of ``engine/1.0`` have been built, it is possible to proceed to build the different binaries for ``game/1.0``. It also contains 2 different binaries for its debug and release configurations, which can be built in parallel.
+
+In practice, this would mean something like:
+
+.. code-block:: bash
+
+    # These 2 could be executed in parallel
+    # (in different machines, or different Conan caches)
+    $ conan install --requires=engine/1.0 --build=engine/1.0
+    $ conan install --requires=engine/1.0 --build=engine/1.0 -s build_type=Debug
+
+    # Once engine/1.0 builds finish, it is possible
+    # to build these 2 binaries in parallel (in different machines or caches)
+    $ conan install --requires=game/1.0 --build=game/1.0
+    $ conan install --requires=game/1.0 --build=game/1.0 -s build_type=Debug
+
+In this section we have still omitted some important implementation details that will follow in the next sections. The goal was to focus on the ``conan graph build-order-merge`` command and how different products and configurations can be merged in a single "build-order".
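As a sketch of the ``for`` loop over the json output mentioned earlier (the tutorial promises Python code for this later), here is an equivalent shell version that iterates the merged ``build_order.json`` sequentially. It is an illustration only: it assumes the ``jq`` tool is available, and a real CI would dispatch the builds inside each level in parallel to different agents, deducing the settings or profile for every binary from its ``filenames`` entry:

.. code-block:: bash

    # levels must be processed in order; builds inside one level are
    # independent of each other and could be parallelized
    levels=$(jq '.order | length' build_order.json)
    for ((i = 0; i < levels; i++)); do
        # every "build_args" is like "--requires=engine/1.0 --build=engine/1.0"
        jq -r ".order[$i][].packages[][].build_args" build_order.json |
        while read -r build_args; do
            conan install $build_args --lockfile=conan.lock
        done
    done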
diff --git a/ci_tutorial/products_pipeline/multi_product.rst b/ci_tutorial/products_pipeline/multi_product.rst
new file mode 100644
index 00000000000..1a4a29bc973
--- /dev/null
+++ b/ci_tutorial/products_pipeline/multi_product.rst
@@ -0,0 +1,129 @@
+Products pipeline: multi-product multi-configuration builds
+===========================================================
+
+In the previous section we computed a ``conan graph build-order`` with several simplifications: we didn't take the ``mapviewer`` product into account, and we processed only 1 configuration.
+
+In real scenarios, it will be necessary to manage more than one product, and the most common case is that there is more than one configuration for every product. If we build these different cases sequentially, it will be much slower and inefficient; and if we try to build them in parallel, there will easily be many duplicated and unnecessary builds of the same packages, wasting resources and even producing issues such as race conditions or traceability problems.
+
+To avoid this issue, it is possible to compute a single unified "build-order" that aggregates all the different build-orders that are computed for the different products and configurations.
+
+Let's start as usual cleaning the local cache and defining the correct repos:
+
+.. code-block:: bash
+
+    # First clean the local "build" folder
+    $ pwd # should be /examples2/ci/game
+    $ rm -rf build # clean the temporary build folder
+    $ mkdir build && cd build # To put temporary files
+
+    # Now clean packages and define remotes
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*" # Make sure no other remotes defined
+    # Add products repo, you might need to adjust this URL
+    # NOTE: The products repo is added first, it will have higher priority.
+    $ conan remote add products http://localhost:8081/artifactory/api/conan/products
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+Now, we will start computing the build-order for ``game/1.0`` for the 2 different configurations that we are building in this tutorial, debug and release:
+
+.. code-block:: bash
+
+    $ conan graph build-order --requires=game/1.0 --build=missing \
+    --order-by=recipe --format=json > game_release.json
+    $ conan graph build-order --requires=game/1.0 --build=missing \
+    --order-by=recipe -s build_type=Debug --format=json > game_debug.json
+
+These commands are basically the same as in the previous section, each one with a different configuration, creating the different output files ``game_release.json`` and ``game_debug.json``. These files will be similar to the previous ones, but as we haven't used the ``--reduce`` argument (this is important!) they will actually contain a "build-order" of all elements in the graph, even if only some contain the ``binary: Build`` definition, and others will contain other ``binary: Download|Cache|etc``.
+
+.. 
code-block:: bash
+
+    $ conan graph build-order --requires=mapviewer/1.0 --build=missing \
+    --order-by=recipe --format=json > mapviewer_release.json
+    $ conan graph build-order --requires=mapviewer/1.0 --build=missing \
+    --order-by=recipe -s build_type=Debug --format=json > mapviewer_debug.json
+
+
+Note that in the generated ``mapviewer_xxx.json`` build-order files, there will be only 1 element for ``mapviewer/1.0`` that contains a ``binary: Download``, because there is really no other package to be built, and as ``mapviewer`` is an application linked statically, Conan knows that it can "skip" its dependencies' binaries. If we had used the ``--reduce`` argument we would have obtained an empty ``order``. But this is not an issue, as the next final step will really compute what needs to be built.
+
+Let's take all the 4 different "build-order" files (2 products x 2 configurations), and merge them together:
+
+.. code-block:: bash
+
+    $ conan graph build-order-merge \
+    --file=game_release.json --file=game_debug.json \
+    --file=mapviewer_release.json --file=mapviewer_debug.json \
+    --reduce --format=json > build_order.json
+
+
+Now we have applied the ``--reduce`` argument to produce a final ``build_order.json`` that is ready for distribution to the build agents; it only contains those specific packages that need to be built:
+
+.. code-block:: json
+
+    {
+        "order_by": "recipe",
+        "reduced": true,
+        "order": [
+            [
+                {
+                    "ref": "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb",
+                    "packages": [
+                        [
+                            {
+                                "package_id": "de738ff5d09f0359b81da17c58256c619814a765",
+                                "filenames": ["game_release"],
+                                "build_args": "--requires=engine/1.0 --build=engine/1.0",
+                            },
+                            {
+                                "package_id": "cbeb3ac76e3d890c630dae5c068bc178e538b090",
+                                "filenames": ["game_debug"],
+                                "build_args": "--requires=engine/1.0 --build=engine/1.0",
+
+                            }
+                        ]
+                    ]
+                }
+            ],
+            [
+                {
+                    "ref": "game/1.0#1715574045610faa2705017c71d0000e",
+                    "packages": [
+                        [
+                            {
+                                "package_id": "bac7cd2fe1592075ddc715563984bbe000059d4c",
+                                "filenames": ["game_release"],
+                                "build_args": "--requires=game/1.0 --build=game/1.0",
+                            },
+                            {
+                                "package_id": "01fbc27d2c156886244dafd0804eef1fff13440b",
+                                "filenames": ["game_debug"],
+                                "build_args": "--requires=game/1.0 --build=game/1.0",
+                            }
+                        ]
+                    ]
+                }
+            ]
+        ]
+    }
+
+
+This build order summarizes the necessary builds. First, it is necessary to build all different binaries for ``engine/1.0``. This recipe contains 2 different binaries, one for Release and the other for Debug. These binaries belong to the same element in the ``packages`` list, which means they do not depend on each other and can be built in parallel. Each binary tracks its own original build-order file with ``"filenames": ["game_release"],`` so it is possible to deduce the necessary profiles to apply to it.
+
+Then, after all binaries of ``engine/1.0`` have been built, it is possible to proceed to build the different binaries for ``game/1.0``. It also contains 2 different binaries for its debug and release configurations, which can be built in parallel.
+
+In practice, this would mean something like:
+
+.. 
+
+    # These 2 could be executed in parallel
+    # (in different machines, or different Conan caches)
+    $ conan install --requires=engine/1.0 --build=engine/1.0
+    $ conan install --requires=engine/1.0 --build=engine/1.0 -s build_type=Debug
+
+    # Once engine/1.0 builds finish, it is possible
+    # to build these 2 binaries in parallel (in different machines or caches)
+    $ conan install --requires=game/1.0 --build=game/1.0
+    $ conan install --requires=game/1.0 --build=game/1.0 -s build_type=Debug
+
+In this section we have still omitted some important implementation details that will follow in the next sections. The goal was to focus on the ``conan graph build-order-merge`` command and how different products and configurations can be merged into a single "build-order".
diff --git a/ci_tutorial/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst
index 2bdd6c78cff..24bb17dc9c9 100644
--- a/ci_tutorial/products_pipeline/single_configuration.rst
+++ b/ci_tutorial/products_pipeline/single_configuration.rst
@@ -8,6 +8,12 @@ Let's start by making sure we have a clean environment with the right repositori
 
 .. code-block:: bash
 
+    # First clean the local "build" folder
+    $ pwd # should be /examples2/ci/game
+    $ rm -rf build # clean the temporary build folder
+    $ mkdir build && cd build # To put temporary files
+
+    # Now clean packages and define remotes
     $ conan remove "*" -c  # Make sure no packages from last run
     $ conan remote remove "*" # Make sure no other remotes defined
     # Add products repo, you might need to adjust this URL
diff --git a/tutorial/creating_packages/create_your_first_package.rst b/tutorial/creating_packages/create_your_first_package.rst
index 8ae21c6cab3..28fe1b125df 100644
--- a/tutorial/creating_packages/create_your_first_package.rst
+++ b/tutorial/creating_packages/create_your_first_package.rst
@@ -152,7 +152,7 @@ Then, several methods are declared:
 * The ``generate()`` method prepares the build of the package from source. In this case, it could be simplified
   to an attribute ``generators = "CMakeToolchain"``, but it is left to show this important method. In this case, the
   execution of ``CMakeToolchain`` ``generate()`` method will create a *conan_toolchain.cmake* file that translates
-   the Conan ``settings`` and ``options`` to CMake syntax. The ``CMakeDeps`` generator is added for completitude,
+   the Conan ``settings`` and ``options`` to CMake syntax. The ``CMakeDeps`` generator is added for completeness,
   but it is not strictly necessary until ``requires`` are added to the recipe.
* The ``build()`` method uses the ``CMake`` wrapper to call CMake commands, it is a thin layer that will manage

From 62edc7e5d840c6b7c5a663fe6321dcc8826ff4be Mon Sep 17 00:00:00 2001
From: memsharded
Date: Wed, 2 Oct 2024 14:21:14 +0200
Subject: [PATCH 11/22] final draft

---
 .../products_pipeline/distributed_build.rst |   2 +
 .../products_pipeline/full_pipeline.rst     | 231 ++++++++++++++++--
 .../products_pipeline/multi_product.rst     |   2 +
 3 files changed, 220 insertions(+), 15 deletions(-)

diff --git a/ci_tutorial/products_pipeline/distributed_build.rst b/ci_tutorial/products_pipeline/distributed_build.rst
index 82d57a024c9..75aa65d8d3b 100644
--- a/ci_tutorial/products_pipeline/distributed_build.rst
+++ b/ci_tutorial/products_pipeline/distributed_build.rst
@@ -16,6 +16,8 @@ Let's start as usual making sure we have a clean environment with the right repo
 
     # Now clean packages and define remotes
     $ conan remove "*" -c  # Make sure no packages from last run
+
+    # If you did this in previous sections, there is NO need to repeat it
     $ conan remote remove "*" # Make sure no other remotes defined
     # Add products repo, you might need to adjust this URL
     # NOTE: The products repo is added first, it will have higher priority.
diff --git a/ci_tutorial/products_pipeline/full_pipeline.rst b/ci_tutorial/products_pipeline/full_pipeline.rst
index 7b7f060723d..ff580ec2fd2 100644
--- a/ci_tutorial/products_pipeline/full_pipeline.rst
+++ b/ci_tutorial/products_pipeline/full_pipeline.rst
@@ -2,7 +2,13 @@ Products pipeline: distributed full pipeline with lockfiles
 ===========================================================
 
 This section will present the full and complete implementation of a multi-product, multi-configuration
-distributed CI pipeline. We will complete the
+distributed CI pipeline. It will cover important implementation details:
+
+- Using lockfiles to guarantee a consistent and fixed set of dependencies for all configurations.
+- Uploading built packages to the ``products`` repository.
+- Capturing "package lists" and using them to run the final promotion.
+- How to iterate the "build-order" programmatically.
+
 Let's start as usual cleaning the local cache and defining the correct repos:
 
@@ -15,6 +21,8 @@ Let's start as usual cleaning the local cache and defining the correct repos:
 
     # Now clean packages and define remotes
     $ conan remove "*" -c  # Make sure no packages from last run
+
+    # If you did this in previous sections, there is NO need to repeat it
     $ conan remote remove "*" # Make sure no other remotes defined
     # Add products repo, you might need to adjust this URL
     # NOTE: The products repo is added first, it will have higher priority.
@@ -23,8 +31,7 @@ Let's start as usual cleaning the local cache and defining the correct repos:
     $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
 
 
-
-Now, we will start computing the build-order for ``game/1.0`` for the 2 different configurations that we are building in this tutorial, debug and release:
+Similarly to what we did in the ``packages pipeline`` when we wanted to ensure that the dependencies are exactly the same when building the different configurations and products, the first necessary step is to compute a ``conan.lock`` lockfile that we can pass to the different CI build agents to enforce the same set of dependencies everywhere. This can be done incrementally for the different ``products`` and configurations, aggregating it in the final single ``conan.lock`` lockfile.
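A minimal sketch of that incremental aggregation, assuming the ``game/1.0`` and ``mapviewer/1.0`` products and the Release/Debug configurations used in this tutorial, could look like this:

.. code-block:: bash

    # Sketch: each call adds one more configuration to the same conan.lock
    $ conan lock create --requires=game/1.0 --lockfile-out=conan.lock
    $ conan lock create --requires=game/1.0 -s build_type=Debug \
        --lockfile=conan.lock --lockfile-out=conan.lock
    $ conan lock create --requires=mapviewer/1.0 \
        --lockfile=conan.lock --lockfile-out=conan.lock
    $ conan lock create --requires=mapviewer/1.0 -s build_type=Debug \
        --lockfile=conan.lock --lockfile-out=conan.lock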
This approach assumes that both ``game/1.0`` and ``mapviewer/1.0`` will be using the same versions and revisions of the common dependencies.

 .. code-block:: bash

@@ -75,28 +82,222 @@ dependencies are not being resolved:
 
 
+So far, this process has been almost identical to the previous section one, just with the difference of capturing and using a lockfile.
+Now, we will explain the "core" of the ``products`` pipeline: iterating the build-order and distributing the build, gathering the
+resulting built packages.
+
+This would be some Python code that performs the iteration sequentially (a real CI system would distribute the builds to different agents in parallel):
+
+
+.. code-block:: python
+
+    import json  # needed for json.loads() below
+
+    build_order = open("build_order.json", "r").read()
+    build_order = json.loads(build_order)
+    to_build = build_order["order"]
+
+    pkg_lists = []  # to aggregate the uploaded package-lists
+    for level in to_build:
+        for recipe in level:  # This could be executed in parallel
+            ref = recipe["ref"]
+            # For every ref, multiple binary packages are being built.
+            # This can be done in parallel too. Often, as they target different
+            # platforms, they will need to be distributed to different build agents
+            for packages_level in recipe["packages"]:
+                # This could be executed in parallel too
+                for package in packages_level:
+                    build_args = package["build_args"]
+                    filenames = package["filenames"]
+                    build_type = "-s build_type=Debug" if any("debug" in f for f in filenames) else ""
+                    # run() is a small command helper, see the sketch after the list below
+                    run(f"conan install {build_args} {build_type} --lockfile=conan.lock --format=json", file_stdout="graph.json")
+                    run("conan list --graph=graph.json --format=json", file_stdout="built.json")
+                    filename = f"uploaded{len(pkg_lists)}.json"
+                    run("conan upload -l=built.json -r=products -c --format=json", file_stdout=filename)
+                    pkg_lists.append(filename)

+.. note::
+
+   - This code is specific to the ``--order-by=recipe`` build-order; if choosing ``--order-by=configuration``, the json
+     is different and it would require a different iteration.

+These are the tasks that the above Python code is doing:
 
-This build order summarizes the necessary builds. First it is necessary to build all different binaries for ``engine/1.0``. This recipe contains 2 different binaries, one for Release and the other for Debug. These binaries belong to the same element in the ``packages`` list, which means they do not depend on each other and can be built in parallel. Each binary tracks its own original build-order file with ``"filenames": ["game_release"],`` so it is possible to deduce the necessary profiles to apply to it.
+- For every ``package`` in the build-order, a ``conan install --requires= --build=`` is issued, and the result of this command is stored in a ``graph.json`` file
+- The ``conan list`` command transforms this ``graph.json`` into a package list called ``built.json``. Note that this package list actually stores both the built packages and the necessary transitive dependencies. This is done for simplicity, as later these package lists will be used for running a promotion, and we also want to promote the dependencies such as ``ai/1.1.0`` that were built in the ``packages pipeline`` and not by this job.
+- The ``conan upload`` command uploads the package list to the ``products`` repo. Note that the ``upload`` first checks what packages already exist in the repo, avoiding costly transfers if they already exist.
+- The result of the ``conan upload`` command is captured in a new package list called ``uploaded.json``, which we will accumulate later and which will serve for the final promotion.
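The ``run()`` helper used in the Python snippet above is not part of Conan itself; it is assumed to be a small utility that executes a command, fails on error, and optionally captures its stdout to a file. A minimal sketch could be:

.. code-block:: python

    # Hypothetical helper, not part of Conan: runs a shell command,
    # raises on failure, and optionally redirects its stdout to a file
    import subprocess

    def run(cmd, file_stdout=None):
        print(f"Running: {cmd}")
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"Command failed: {cmd}\n{result.stderr}")
        if file_stdout:
            with open(file_stdout, "w") as f:
                f.write(result.stdout)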
-Then, after all binaries of ``engine/1.0`` have been built, it is possible to proceed to build the different binaries for ``game/1.0``. It also contains 2 different binaries for its debug and release configurations, which can be built in parallel.
 
-In practice, this would mean something like:
+In practice this translates to the following commands (that you can execute to continue the tutorial):
 
 .. code-block:: bash
 
-    # This 2 could be executed in parallel
-    # (in different machines, or different Conan caches)
-    $ conan install --requires=engine/1.0 --build=engine/1.0
-    $ conan install --requires=engine/1.0 --build=engine/1.0 -s build_type=Debug
+    # engine/1.0 release
+    $ conan install --requires=engine/1.0 --build=engine/1.0 --lockfile=conan.lock \
+        --format=json > graph.json
+    $ conan list --graph=graph.json --format=json > built.json
+    $ conan upload -l=built.json -r=products -c --format=json > uploaded1.json
+
+    # engine/1.0 debug
+    $ conan install --requires=engine/1.0 --build=engine/1.0 --lockfile=conan.lock \
+        -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --format=json > built.json
+    $ conan upload -l=built.json -r=products -c --format=json > uploaded2.json
+
+    # game/1.0 release
+    $ conan install --requires=game/1.0 --build=game/1.0 --lockfile=conan.lock \
+        --format=json > graph.json
+    $ conan list --graph=graph.json --format=json > built.json
+    $ conan upload -l=built.json -r=products -c --format=json > uploaded3.json
+
+    # game/1.0 debug
+    $ conan install --requires=game/1.0 --build=game/1.0 --lockfile=conan.lock \
+        -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --format=json > built.json
+    $ conan upload -l=built.json -r=products -c --format=json > uploaded4.json
+
+
+After this step the new built packages will be in the ``products`` repo and we will have 4 ``uploaded1.json`` - ``uploaded4.json`` files.
+
+Simplifying the different release and debug configurations, the state of our repositories would be something like:
+
+
+.. graphviz::
   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
               "ai/1.1.0\n (Release)";
               "ai/1.1.0\n (Debug)";
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
               "ai/promoted" [label="ai/1.1.0\n(new version)"];
               "engine/promoted" [label="engine/1.0\n(new binary)"];
               "game/promoted" [label="game/1.0\n(new binary)", fillcolor="lightgreen"];

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/promoted" -> "engine/promoted" -> "ai/promoted";
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }


+We can now accumulate the different ``uploadedX.json`` files into a single package list ``uploaded.json`` that contains everything:

+.. code-block:: bash
+
+    $ conan pkglist merge -l uploaded1.json -l uploaded2.json \
+        -l uploaded3.json -l uploaded4.json \
+        --format=json > uploaded.json
+
+
+And finally, if everything worked well and we consider this new set of versions and new package binaries is ready to be used by developers and other CI jobs, then, we can run the final promotion from the ``products`` to the ``develop`` repository:
+
+.. code-block:: bash
+   :caption: Promoting from products->develop
+
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with art:promote custom command)
+    $ conan download --list=uploaded.json -r=products --format=json > promote.json
+    $ conan upload --list=promote.json -r=develop -c
+
+
+And our final ``develop`` repository state will be:
+
+
+.. graphviz::
   :align: center

   digraph repositories {
       node [fillcolor="lightskyblue", style=filled, shape=box]
       rankdir="LR";
       subgraph cluster_0 {
           label="Packages server";
           style=filled;
           color=lightgrey;
           subgraph cluster_1 {
               label = "packages\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "packages" [style=invis];
               "ai/1.1.0\n (Release)";
               "ai/1.1.0\n (Debug)";
           }
           subgraph cluster_2 {
               label = "products\n repository"
               shape = "box";
               style=filled;
               color=lightblue;
               "products" [style=invis];
           }
           subgraph cluster_3 {
               rankdir="BT";
               shape = "box";
               label = "develop repository";
               color=lightblue;
               rankdir="BT";

               node [fillcolor="lightskyblue", style=filled, shape=box]
               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
               "mapviewer/1.0" -> "graphics/1.0";
               "game/1.0" [fillcolor="lightgreen"];
               "mapviewer/1.0" [fillcolor="lightgreen"];
               "ai/promoted" [label="ai/1.1.0\n(new version)"];
               "engine/promoted" [label="engine/1.0\n(new binary)"];
               "game/promoted" [label="game/1.0\n(new binary)", fillcolor="lightgreen"];
               "game/promoted" -> "engine/promoted" -> "ai/promoted" -> "mathlib/1.0";
               "engine/promoted" -> "graphics/1.0";
           }
           {
               edge[style=invis];
               "packages" -> "products" -> "game/1.0" ;
               rankdir="BT";
           }
       }
   }


+This state of the ``develop`` repository will have the following behavior:
+
+- Developers installing ``game/1.0`` or ``engine/1.0`` will by default resolve to the latest ``ai/1.1.0`` and use it. They will find pre-compiled binaries for the dependencies too, and they can continue developing using the latest set of dependencies.
+- Developers and CI that were using a lockfile that was locking the ``ai/1.0`` version will still be able to keep working with that dependency without anything breaking, as the new versions and package binaries do not break or invalidate the previously existing binaries (see the sketch below).

-In this section we have still omitted some important implementation details that will follow in next sections. The goal was to focus on the ``conan graph build-order-merge`` command and how different products and configurations can be merged in a single "build-order".
diff --git a/ci_tutorial/products_pipeline/multi_product.rst b/ci_tutorial/products_pipeline/multi_product.rst
index 1a4a29bc973..d36fa56c3a2 100644
--- a/ci_tutorial/products_pipeline/multi_product.rst
+++ b/ci_tutorial/products_pipeline/multi_product.rst
@@ -18,6 +18,8 @@ Let's start as usual cleaning the local cache and defining the correct repos:
 
     # Now clean packages and define remotes
     $ conan remove "*" -c  # Make sure no packages from last run
+
+    # If you did this in previous sections, there is NO need to repeat it
     $ conan remote remove "*" # Make sure no other remotes defined
     # Add products repo, you might need to adjust this URL
     # NOTE: The products repo is added first, it will have higher priority.
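A hypothetical developer-side check of that behavior, assuming the ``develop`` remote configured as in this tutorial (the ``old_game.lock`` file name is illustrative), could be:

.. code-block:: bash

    # Resolves to the latest ai/1.1.0 now published in develop
    $ conan install --requires=game/1.0 -r=develop

    # A lockfile captured before the promotion keeps resolving ai/1.0
    $ conan install --requires=game/1.0 -r=develop --lockfile=old_game.lock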
From fa8b872d145ec26979e99c455b5e566cd70b6c0b Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:13:00 +0100 Subject: [PATCH 12/22] Update devops/versioning/default.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- devops/versioning/default.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/devops/versioning/default.rst b/devops/versioning/default.rst index 6193a917d1c..36155a3d9a0 100644 --- a/devops/versioning/default.rst +++ b/devops/versioning/default.rst @@ -40,7 +40,7 @@ source code: guarantee the same binary. - **Minor**: If changes are done to package public headers, in an API source compatible way, then the recommendation would be to increase the **minor** verson of a package. That means that other packages that depend on it will be able to compile without issues, - but as there were modification in public headers (that could contain C++ templates or other things that could be inlined in + but as there were modifications in public headers (that could contain C++ templates or other things that could be inlined in the consumer packages), then those consumer packages need to be rebuilt from source to incorporate these changes. - **Major**: If API breaking changes are done to the package public headers, then increasing the **major** version is recommended. As the most common recommended version-range is something like ``dependency/[>1.0 <2]``, where the next major is excluded, that means From 422f9d3ead5b89b0472cd32751f28c2e8268b8a6 Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:13:17 +0100 Subject: [PATCH 13/22] Update devops/package_promotions.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- devops/package_promotions.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/devops/package_promotions.rst b/devops/package_promotions.rst index 3062656f95a..136309937b9 100644 --- a/devops/package_promotions.rst +++ b/devops/package_promotions.rst @@ -93,7 +93,7 @@ the command would look like: $ conan art:promote pkglist.json --from=testing --to=release --url=https:///artifactory --user= --password= -Note that the ``conan art:promote`` doesn't work with ArtifactoryCE, but need pro editions of Artifactory. +Note that the ``conan art:promote`` command doesn't work with ArtifactoryCE, Pro editions of Artifactory are needed. The promote functionality can be implemented in these cases with a simple download+upload flow: .. code-block:: bash From 62185d50d07fe8bd247604c79be59307313dd601 Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:13:35 +0100 Subject: [PATCH 14/22] Update devops/package_promotions.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- devops/package_promotions.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/devops/package_promotions.rst b/devops/package_promotions.rst index 136309937b9..400a59b5c06 100644 --- a/devops/package_promotions.rst +++ b/devops/package_promotions.rst @@ -74,7 +74,7 @@ put in the "testing" repository, for the QA team to test them, for example ``zli } -When QA team tests and approves these packages, they can be promoted to the "release" repository. +When the QA team tests and approves these packages, they can be promoted to the "release" repository. 
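A sketch of that testing-to-release promotion, reusing the same Conan download+upload flow shown elsewhere in this tutorial (the repository names are the ones of this example), could be:

.. code-block:: bash

    # Promotion copying packages from the testing to the release repository
    $ conan download --list=pkglist.json -r=testing --format=json > promote.json
    $ conan upload --list=promote.json -r=release -c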
Basically, a promotion is a copy of the packages, including all the artifacts and metadata from the "testing" to the "release" repository. From e8e77a23cb10d67c2f5bc4b09c528a3c5155d76b Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:13:55 +0100 Subject: [PATCH 15/22] Update devops/devops.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- devops/devops.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/devops/devops.rst b/devops/devops.rst index 0de7e1246c0..1f1ffb79407 100644 --- a/devops/devops.rst +++ b/devops/devops.rst @@ -6,7 +6,7 @@ Devops guide The previous :ref:`tutorial` section was aimed at users in general and developers. -The :ref:`Continuous Integration tutorial` explained the basics how to implement Continuous Integration involving Conan packages. +The :ref:`Continuous Integration tutorial` explained the basics on how to implement Continuous Integration involving Conan packages. This section is intended for DevOps users, build and CI engineers, administrators, and architects adopting, designing and implementing Conan in production in their teams and organizations. If you plan to use Conan in production in your project, team, or organization, this section contains the necessary information. From 18962cbdf4564eb5d912659a48f9cdd5c2d1b128 Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:14:17 +0100 Subject: [PATCH 16/22] Update ci_tutorial/tutorial.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- ci_tutorial/tutorial.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst index 4682d0c8454..248adc68ca6 100644 --- a/ci_tutorial/tutorial.rst +++ b/ci_tutorial/tutorial.rst @@ -5,7 +5,7 @@ Continuous Integration (CI) tutorial .. note:: - - This is an advanced topic, previous knowledge of Conan tool is necessary. Please :ref:`read and practice the user tutorial` first. + - This is an advanced topic, previous knowledge of Conan is necessary. Please :ref:`read and practice the user tutorial` first. - This section is intended for devops and build engineers designing and implementing a CI pipeline involving Conan packages, if it is not the case, you can skip this section. From 98fc4a2b3e08190430b738c49f384503a5d83909 Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:14:45 +0100 Subject: [PATCH 17/22] Update ci_tutorial/tutorial.rst MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Abril Rincón Blanco --- ci_tutorial/tutorial.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst index 248adc68ca6..6356fa898cb 100644 --- a/ci_tutorial/tutorial.rst +++ b/ci_tutorial/tutorial.rst @@ -113,7 +113,7 @@ with the repositories, there will be 2 promotions: - The versioning approach is important. This tutorial will be following :ref:`the default Conan versioning approach, see details here` This tutorial is just modeling the **development** flow. 
In production systems, there will be other repositories
-and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for final users and packages can
+and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for final users, such that packages can
 be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in
 :ref:`Package promotions`.

From 19cdfd6e97d9a348a22093f929c83b8345cea13d Mon Sep 17 00:00:00 2001
From: James
Date: Mon, 28 Oct 2024 12:15:15 +0100
Subject: [PATCH 18/22] Update ci_tutorial/packages_pipeline.rst

Co-authored-by: Carlos Zoido
---
 ci_tutorial/packages_pipeline.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ci_tutorial/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
index f3dbb0a475b..5c002afc0dc 100644
--- a/ci_tutorial/packages_pipeline.rst
+++ b/ci_tutorial/packages_pipeline.rst
@@ -22,7 +22,7 @@ in the ``ai`` package, providing some better algorithms for our game.
    did some breaking changes to the ``ai`` public API, the recommendation would be to change the major instead and create a new ``2.0`` version.
 
-The ``packages pipeline`` will take care of building the different packages binaries for the new ``ai/1.1.0`` and upload them to the ``packages``
+The **packages pipeline** will take care of building the different package binaries for the new ``ai/1.1.0`` and upload them to the ``packages``
 binary repository to avoid disrupting or causing potential issues to other developers and CI jobs.
 If the pipeline succeed it will promote (copy) them to the ``products`` binary repository, and stop otherwise.

From f598c81ebb8cb855c0eb74032c884aaf4f01fa6e Mon Sep 17 00:00:00 2001
From: James
Date: Mon, 28 Oct 2024 12:15:37 +0100
Subject: [PATCH 19/22] Update ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst

Co-authored-by: Carlos Zoido
---
 ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
index 4390c38eee2..ef5704db082 100644
--- a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
+++ b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
@@ -3,7 +3,7 @@ Package pipeline: multi configuration using lockfiles
 
 In the previous example, we built both ``Debug`` and ``Release`` package binaries for ``ai/1.1.0``. In real world scenarios the binaries to build would be different platforms (Windows, Linux, embedded), different architectures, and very often it will not be possible to build them in the same machine, requiring different computers.
 
-The previous example had an important assumption: the dependencies of ``ai/1.1.0`` do not change at all during the building process. In many scenarios, this assumption will not hold, for example if there are any other concurrent CI jobs, and one succesfull job publish a new ``mathlib/1.1`` version in the ``develop`` repo.
+The previous example had an important assumption: the dependencies of ``ai/1.1.0`` do not change at all during the building process. In many scenarios, this assumption will not hold, for example if there are any other concurrent CI jobs, and one successful job publishes a new ``mathlib/1.1`` version in the ``develop`` repo.
 Then it is possible that one build of ``ai/1.1.0``, for example, the one running in the Linux servers starts earlier and uses the previous ``mathlib/1.0`` version as dependency, while the Windows servers start a bit later, and then their build will use the recent ``mathlib/1.1`` version as dependency. This is a very undesirable situation, having binaries for the same ``ai/1.1.0`` version using different dependencies versions. This can lead in later graph resolution problems, or even worse, get to the release with different behavior for different platforms.
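A minimal sketch of how the set of dependencies can be pinned before the different configuration builds start, assuming the ``ai`` package folder of this tutorial, could be:

.. code-block:: bash

    # Capture the dependencies once, before any configuration starts building
    $ conan lock create . --lockfile-out=conan.lock

    # Every build, whenever it starts, resolves mathlib from the lockfile
    $ conan create . --build=missing --lockfile=conan.lock
    $ conan create . --build=missing --lockfile=conan.lock -s build_type=Debug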
From 380ac46b3291204baef3ad6e90284a79a74fa7d4 Mon Sep 17 00:00:00 2001
From: James
Date: Mon, 28 Oct 2024 12:17:23 +0100
Subject: [PATCH 20/22] Update ci_tutorial/tutorial.rst
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Co-authored-by: Abril Rincón Blanco
---
 ci_tutorial/tutorial.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst
index 6356fa898cb..5e76ccdfbc2 100644
--- a/ci_tutorial/tutorial.rst
+++ b/ci_tutorial/tutorial.rst
@@ -100,7 +100,7 @@ Promotions are the mechanism used to make available packages from one pipeline t
 with the repositories, there will be 2 promotions:
 
 - When all the different binaries for the different configurations have been built for a single package with the ``packages pipeline``, and uploaded
-  to the ``packages`` repository, the package changes and package new version can be considered "correct" and promoted (copied) to the ``products``
+  to the ``packages`` repository, the new version and changes to the package can be considered "correct" and promoted (copied) to the ``products``
   repository.
 - When the ``products pipeline`` has built from source all the necessary packages that need a re-build because of the new package versions in
   the ``products`` repository and has checked that the organization "products" (such ``game/1.0`` and ``mapviewer/1.0``) are not broken, then

From 6d6d6b068aa9122e35fb4c821e83795aa755d070 Mon Sep 17 00:00:00 2001
From: James
Date: Mon, 28 Oct 2024 12:17:42 +0100
Subject: [PATCH 21/22] Update ci_tutorial/products_pipeline/distributed_build.rst
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Co-authored-by: Abril Rincón Blanco
---
 ci_tutorial/products_pipeline/distributed_build.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ci_tutorial/products_pipeline/distributed_build.rst b/ci_tutorial/products_pipeline/distributed_build.rst
index 75aa65d8d3b..67953c9a312 100644
--- a/ci_tutorial/products_pipeline/distributed_build.rst
+++ b/ci_tutorial/products_pipeline/distributed_build.rst
@@ -3,7 +3,7 @@ Products pipeline: distributed build
 
 The previous section used ``--build=missing`` to build all the necessary packages in the same CI machine.
 
-This is not always desired, or even possible, and in many situations it is preferable to do a distributed build, to achieve faster builds and better use the CI resources. The most natural distribution of the build load is to build different packages in different machines. Let's see how this is possible with the ``conan graph build-order`` command.
+This is not always desired, or even possible, and in many situations it is preferable to do a distributed build, to achieve faster builds and better usage of the CI resources. The most natural distribution of the build load is to build different packages in different machines.
Let's see how this is possible with the ``conan graph build-order`` command. Let's start as usual making sure we have a clean environment with the right repositories defined: From 11d58864d85bfcfad85ae57e277fcf9d135571dc Mon Sep 17 00:00:00 2001 From: James Date: Mon, 28 Oct 2024 12:24:59 +0100 Subject: [PATCH 22/22] Apply suggestions from code review MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: Carlos Zoido Co-authored-by: Michael Farrell Co-authored-by: Abril Rincón Blanco Co-authored-by: Artalus --- .../multi_configuration_lockfile.rst | 2 +- ci_tutorial/products_pipeline.rst | 4 ++-- ci_tutorial/products_pipeline/distributed_build.rst | 12 ++++++------ ci_tutorial/products_pipeline/full_pipeline.rst | 10 +++++----- ci_tutorial/products_pipeline/multi_product.rst | 8 +++++--- .../products_pipeline/single_configuration.rst | 2 +- ci_tutorial/project_setup.rst | 2 +- ci_tutorial/tutorial.rst | 10 +++++----- 8 files changed, 26 insertions(+), 24 deletions(-) diff --git a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst index ef5704db082..ff6032e8897 100644 --- a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst +++ b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst @@ -57,7 +57,7 @@ The ``conan.lock`` file can be inspected, it will be something like: As we can see, it is locking the ``mathlib/1.0`` dependency version and revision. -With the lockfile, the creating of the different configurations is exactly the same, but providing the ``--lockfile=conan.lock`` argument to the ``conan create`` step, it will guarantee that ``mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea`` will always be the exact dependency used, irrespective if there exist new ``mathlib/1.1`` versions or new revisions available. The following builds could be launched in parallel but executed at different times, and still they will always use the same ``mathlib/1.0`` dependency: +With the lockfile, creating the different configurations is exactly the same, but providing the ``--lockfile=conan.lock`` argument to the ``conan create`` step, it will guarantee that ``mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea`` will always be the exact dependency used, irrespective if there exist new ``mathlib/1.1`` versions or new revisions available. The following builds could be launched in parallel but executed at different times, and still they will always use the same ``mathlib/1.0`` dependency: .. code-block:: bash diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst index cfa2843e809..1c5a7287cbf 100644 --- a/ci_tutorial/products_pipeline.rst +++ b/ci_tutorial/products_pipeline.rst @@ -1,7 +1,7 @@ Products pipeline ================== -The **products pipeline** responds a more challenging question: does my "products" build correctly with the latest changes and new versions that have been done +The **products pipeline** responds to a more challenging question: do my "products" build correctly with the new versions of the packages? to the packages and their dependencies? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization important products to check if things integrate cleanly or break. 
@@ -33,7 +33,7 @@ job for the ``engine`` package, that is triggered after the ``ai`` one, configur But this approach does not scale at all and have very important limitations: -- The example above is relatively simple, but in practice dependency graphs can have many more packages, even several hundrends, making it very tedious and error prone to define all dependencies among packages in the CI +- The example above is relatively simple, but in practice dependency graphs can have many more packages, even several hundreds, making it very tedious and error prone to define all dependencies among packages in the CI - Dependencies evolve over time, and new versions are used, some dependencies are removed and newer dependencies are added. The simple relationship between repositories modeled at the CI level can result in a very inefficient, slow and time consuming CI, if not a fragile one that continuously breaks because some dependencies change. - The combinatorial nature that happens downstream a dependency graph, where a relatively stable top dependency, lets say ``mathlib/1.0`` might be used by multiple consumers such as ``ai/1.0``, ``ai/1.1``, ``ai/1.2`` which in turn each one might be used by multiple ``engine`` different versions and so on. Building only the latest version of the consumers would be insufficient in many cases and building all of them would be extremely costly. - The "inverse" dependency model, that is, asking what are the "dependants" of a given package is extremely challeging in practice, specially in a decentralized diff --git a/ci_tutorial/products_pipeline/distributed_build.rst b/ci_tutorial/products_pipeline/distributed_build.rst index 67953c9a312..f7de64f9959 100644 --- a/ci_tutorial/products_pipeline/distributed_build.rst +++ b/ci_tutorial/products_pipeline/distributed_build.rst @@ -26,8 +26,8 @@ Let's start as usual making sure we have a clean environment with the right repo $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop -We will obviate by now the ``mapviewer/1.0`` and focus this section in the ``game/1.0`` product. -The first step is to compute the "build-order", that is, the list of packages that need to be build, and in what order. +We will obviate by now the ``mapviewer/1.0`` product and focus this section in the ``game/1.0`` product. +The first step is to compute the "build-order", that is, the list of packages that need to be built, and in what order. This is done with the following ``conan graph build-order`` command: .. code-block:: bash @@ -38,8 +38,8 @@ This is done with the following ``conan graph build-order`` command: Note a few important points: - It is necessary to use the ``--build=missing``, in exactly the same way than in the previous section. Failing to provide the intended ``--build`` policy and argument will result in incomplete or erroneous build-orders. -- The ``--reduce`` eliminates all elements in the result that doesn't have the ``binary: Build`` policy. This means that the resulting "build-order" cannot be merged with other build order files for aggregating them into a single one, which is important when there are multiple configurations and products. -- The ``--order-by`` argument allows to define different orders, by "recipe" or by "configuration". 
In this case, we are using the ``--order-by=recipe`` which is intended to parallelize builds per recipe, that means, that all possible different binaries for a given package like ``engine/1.0`` should be built first before any consumer of ``engine/1.0`` can be built.
+- The ``--reduce`` argument eliminates all elements in the resulting order that don't have the ``binary: Build`` policy. This means that the resulting "build-order" cannot be merged with other build order files for aggregating them into a single one, which is important when there are multiple configurations and products.
+- The ``--order-by`` argument allows defining different orders, by "recipe" or by "configuration". In this case, we are using ``--order-by=recipe`` which is intended to parallelize builds per recipe, that means that all possible different binaries for a given package like ``engine/1.0`` should be built first before any consumer of ``engine/1.0`` can be built.
 
 The resulting ``game_build_order.json`` looks like:
@@ -95,9 +95,9 @@ For convenience, in the same way that ``conan graph info ... --format=html > gra
 
 The resulting json contains an ``order`` element which is a list of lists. This arrangement is important, every element in the top list is a set of packages that can be built in parallel because they do not have any relationship among them. You can view this list as a list of "levels", in level 0, there are packages that have no dependencies to any other package being built, in level 1 there are packages that contain dependencies only to elements in level 0 and so on.
 
-Then, the order of the elements in the top list is important and must be respected. Until the build of all the packages in one list item has finished, it is not possible to start the build of the next "level".
+Then, the order of the elements in the outermost list is important and must be respected. Until the build of all the packages in one list item has finished, it is not possible to start the build of the next "level".
 
-Using the information in the ``graph_build_order.json`` file, it is possible to execute the build of the necessary packages, in the same way that the previous section ``--build=missing`` did, but not directly managed by us.
+Using the information in the ``graph_build_order.json`` file, it is possible to execute the build of the necessary packages, in the same way that the previous section's ``--build=missing`` did, but not directly managed by us.
 
 Taking the arguments from the json, the commands to execute would be:
diff --git a/ci_tutorial/products_pipeline/full_pipeline.rst b/ci_tutorial/products_pipeline/full_pipeline.rst
index ff580ec2fd2..6dcb5a7bde9 100644
--- a/ci_tutorial/products_pipeline/full_pipeline.rst
+++ b/ci_tutorial/products_pipeline/full_pipeline.rst
@@ -83,10 +83,10 @@ dependencies are not being resolved:
 
 So far, this process has been almost identical to the previous section one, just with the difference of capturing and using a lockfile.
-Now, we will explain the "core" of the ``products`` pipeline: iterating the build-order and distributing the build, gathering the
+Now, we will explain the "core" of the ``products`` pipeline: iterating the build-order and distributing the build, and gathering the
 resulting built packages.
-This would be some Python code that performs the iteration sequentially (a real CI system would distribute the builds to different agents in parallel):
+This would be an example of some Python code that performs the iteration sequentially (a real CI system would distribute the builds to different agents in parallel):
 
 .. code-block:: python
@@ -158,9 +158,9 @@ In practice this translates to the following commands (that you can execute to c
     $ conan upload -l=built.json -r=products -c --format=json > uploaded4.json
 
 
-After this step the new built packages will be in the ``products`` repo and we will have 4 ``uploaded1.json`` - ``uploaded4.json`` files.
+After this step the newly built packages will be in the ``products`` repo and we will have 4 ``uploaded1.json`` - ``uploaded4.json`` files.
 
-Simplifying the different release and debug configurations, the state of our repositories would be something like: 
+Simplifying the different release and debug configurations, the state of our repositories would be something like:
 
 
 .. graphviz::
@@ -228,7 +228,7 @@ We can now accumulate the different ``uploadedX.json`` files into a single packa
     --format=json > uploaded.json
 
 
-And finally, if everything worked well and we consider this new set of versions and new package binaries is ready to be used by developers and other CI jobs, then, we can run the final promotion from the ``products`` to the ``develop`` repository:
+And finally, if everything worked well, and we consider this new set of versions and new package binaries is ready to be used by developers and other CI jobs, then we can run the final promotion from the ``products`` to the ``develop`` repository:
 
 .. code-block:: bash
    :caption: Promoting from products->develop
diff --git a/ci_tutorial/products_pipeline/multi_product.rst b/ci_tutorial/products_pipeline/multi_product.rst
index d36fa56c3a2..54f82aa85d5 100644
--- a/ci_tutorial/products_pipeline/multi_product.rst
+++ b/ci_tutorial/products_pipeline/multi_product.rst
@@ -3,9 +3,9 @@ Products pipeline: multi-product multi-configuration builds
 
 In the previous section we computed a ``conan graph build-order`` with several simplifications, we didn't take the ``mapviewer`` product into account, and we processed only 1 configuration.
 
-In real scenarios, it will be necessary to manage more than one product and the most common case is that there is more than on configuration for every product. If we build these differents cases sequentially it will be much slower and inefficient, and if we try to build them in parallel there will easily be many duplicated and unnecessary builds of the same packages, wasting resources and even producing issues as race conditions or traceability problems.
+In real scenarios, it will be necessary to manage more than one product and the most common case is that there is more than one configuration for every product. If we build these different cases sequentially it will be much slower and inefficient, and if we try to build them in parallel there will easily be many duplicated and unnecessary builds of the same packages, wasting resources and even producing issues such as race conditions or traceability problems.
 
-To avoid this issue, it is possible to compute a single unified "build-order" that aggregates all the different build-orders that are compute for the different products and configurations.
+To avoid this issue, it is possible to compute a single unified "build-order" that aggregates all the different build-orders that are computed for the different products and configurations. Let's start as usual cleaning the local cache and defining the correct repos: @@ -37,7 +37,9 @@ Now, we will start computing the build-order for ``game/1.0`` for the 2 differen $ conan graph build-order --requires=game/1.0 --build=missing --order-by=recipe -s build_type=Debug --format=json > game_debug.json -These commands are basically the same than in the previous section, each one with a different configuration and creating a different output file ``game_release.json`` and ``game_debug.json``. These files will be similar to the previous ones, but as we haven't used the ``--reduce`` argument (this is important!) they will actually contain a "build-order" of all elements in the graph, even if only some contain the ``binary: Build`` definition, and others will contain other ``binary: Download|Cache|etc``. +These commands are basically the same as in the previous section, each one with a different configuration and creating a different output file ``game_release.json`` and ``game_debug.json``. These files will be similar to the previous ones, but as we haven't used the ``--reduce`` argument (this is important!) they will actually contain a "build-order" of all elements in the graph, even if only some contain the ``binary: Build`` definition, and others will contain other ``binary: Download|Cache|etc``. + +Now, let's compute the build-order for ``mapviewer/1.0``: .. code-block:: bash diff --git a/ci_tutorial/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst index 24bb17dc9c9..cfd427f46f2 100644 --- a/ci_tutorial/products_pipeline/single_configuration.rst +++ b/ci_tutorial/products_pipeline/single_configuration.rst @@ -148,4 +148,4 @@ Now the game can be executed: We can see that the new ``game/1.0`` binary incorporates the improvements in ``ai/1.1.0``, and links correctly with the new binary for ``engine/1.0``. And this is a basic "products pipeline", we manage to build and test our main products when necessary (recall that ``mapviewer`` wasn't really affected, so no rebuilds were necessary at all). -In general, a production "products pipeline" will finish uploading the built packages to the repository and running a new promotion to the ``develop`` repo. But as this was a very basic and simplify pipeline, let's wait a bit for that, and let's continue with more advanced scenarios. +In general, a production "products pipeline" will finish uploading the built packages to the repository and running a new promotion to the ``develop`` repo. But as this was a very basic and simple pipeline, let's wait a bit for that, and let's continue with more advanced scenarios. diff --git a/ci_tutorial/project_setup.rst b/ci_tutorial/project_setup.rst index f64a37174f3..256f89b0175 100644 --- a/ci_tutorial/project_setup.rst +++ b/ci_tutorial/project_setup.rst @@ -51,7 +51,7 @@ Initial dependency graph $ python project_setup.py -This will do several tasks, clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph and upload them to the ``develop`` repo, then clean the local cache. 
Note in this example we are using ``Debug`` and ``Release`` different configurations for convenience, but in real cases these would be different configurations such as Windows/X86_64, Linux/x86_64, Linux/armv8, etc., running
+This will do several tasks: clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph and upload them to the ``develop`` repo, then clean the local cache. Note in this example we are using ``Debug`` and ``Release`` as our different configurations for convenience, but in real cases these would be different configurations such as Windows/X86_64, Linux/x86_64, Linux/armv8, etc., running
 in different computers.
 
 This dependency graph of packages in the ``develop`` repo is the starting point for our tutorial, assumed as a functional and stable "develop" state of the project that developers can ``conan install`` to work in any of the different packages.
diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst
index 5e76ccdfbc2..5bc3a7d9d6d 100644
--- a/ci_tutorial/tutorial.rst
+++ b/ci_tutorial/tutorial.rst
@@ -13,7 +13,7 @@ Continuous Integration (CI) tutorial
 Continuous Integration has different meanings for different users and organizations. In this tutorial we will cover the scenarios when users
 are doing changes to the source code of their packages and want to automatically build new binaries for those packages and also compute if those new package changes integrate cleanly or break the organization main products.
 
-We will use in this tutorial this small project that uses several packages (static libraries by default) to build a couple of applications, a video game and a map viewer utility. The ``game`` and ``mapviewer`` are our final "**products**", what we distribute to our users:
+In this tutorial we will use this small project that uses several packages (static libraries by default) to build a couple of applications, a video game and a map viewer utility. The ``game`` and ``mapviewer`` are our final "**products**", what we distribute to our users:
 
 .. graphviz::
    :align: center
@@ -60,8 +60,8 @@ the **packages pipeline** and the **products pipeline**
 - The **products pipeline** takes care of building the main organization "products" (the packages that implement the final applications or deliverables),
   and making sure that changes and new versions in dependencies integrate correctly, rebuilding any intermediate packages in the graph if necessary.
 
-The idea is that if some developer does changes to ``ai`` package, producing a new ``ai/1.1.0`` version, the packages pipeline will first build this
-new version. But this new version might accidentally break or require rebuilding some consumers packages. If our organization main **products** are
+The idea is that if some developer makes changes to the ``ai`` package, producing a new ``ai/1.1.0`` version, the packages pipeline will first build this
+new version. But this new version might accidentally break or require rebuilding some consumer packages. If our organization's main **products** are
 ``game/1.0`` and ``mapviewer/1.0``, then the products pipeline can be triggered, in this case it would rebuild ``engine/1.0`` and ``game/1.0`` as they are affected by the change.
 
@@ -73,7 +73,7 @@ The concept of multiple server side repositories is very important for CI. In th
 
 - ``develop``: This repository is the main one that developers have configured in their machines to be able to ``conan install`` dependencies and work.
As such it is expected to be quite stable, similar to a shared "develop" branch in git, and the repository should contain pre-compiled - binaries for the organization pre-defined platforms, so developers and CI don't need to do ``--build=missing`` and build again and again from + binaries for the organization's pre-defined platforms, so developers and CI don't need to do ``--build=missing`` and build again and again from source. - ``packages``: This repository will be used to temporarily upload the packages built by the "packages pipeline", to not upload them directly to the ``develop`` repo and avoid disruption until these packages are fully validated. @@ -96,7 +96,7 @@ The concept of multiple server side repositories is very important for CI. In th } -Promotions are the mechanism used to make available packages from one pipeline to the other. Connecting the above packages and product pipelines +Promotions are the mechanism used to make packages available from one pipeline to the other. Connecting the above packages and product pipelines with the repositories, there will be 2 promotions: - When all the different binaries for the different configurations have been built for a single package with the ``packages pipeline``, and uploaded