Feature/devops ci #3799

Open · wants to merge 25 commits into base: develop2
1 change: 1 addition & 0 deletions devops.rst
@@ -14,6 +14,7 @@ If you plan to use Conan in production in your project, team, or organization, t

devops/using_conancenter
devops/devops_local_recipes_index
devops/continuous_integration/tutorial
devops/backup_sources/sources_backup
devops/metadata
devops/versioning
28 changes: 28 additions & 0 deletions devops/continuous_integration/packages_pipeline.rst
@@ -0,0 +1,28 @@
Packages pipeline
==================

For the ``packages pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
in the ``ai`` package that provide better algorithms for our game.

Let's make the following changes (a quick way to double-check them is sketched right after this list):

- Change the implementation of the ``ai/src/ai.cpp`` function, updating the message from ``Some Artificial`` to ``SUPER BETTER Artificial``.
- Change the default ``intelligence=0`` value in ``ai/include/ai.h`` to a new ``intelligence=50`` default.
- Finally, bump the version. As we made changes to the package public headers, it is advised to bump the ``minor`` version,
so let's edit the ``ai/conanfile.py`` file and define ``version = "1.1.0"`` there (instead of the previous ``1.0``). Note that if we
had made breaking changes to the ``ai`` public API, the recommendation would be to bump the ``major`` version instead and create a new ``2.0`` version.
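
As a rough, optional sanity check that the three edits are in place (illustrative only, assuming the commands run from the project root folder that contains the ``ai`` recipe):

.. code-block:: bash

# Illustrative: confirm the new message, the new default and the version bump
$ grep "SUPER BETTER" ai/src/ai.cpp
$ grep "intelligence" ai/include/ai.h
$ grep "1.1.0" ai/conanfile.py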


The ``packages pipeline`` will take care of building the different package binaries for the new ``ai/1.1.0`` and uploading them to the ``packages``
binary repository. If the pipeline succeeds, it will copy them to the ``products`` binary repository; otherwise, it will stop there.

There are different aspects that need to be taken into account when building these packages. The following tutorial sections do the same
job, but under different hypotheses, and they are explained in increasing order of complexity.


.. toctree::
:maxdepth: 1

packages_pipeline/single_configuration
packages_pipeline/multi_configuration
packages_pipeline/multi_configuration_lockfile
178 changes: 178 additions & 0 deletions devops/continuous_integration/packages_pipeline/multi_configuration.rst
@@ -0,0 +1,178 @@
Package pipeline: multi configuration
=====================================

In the previous section we were building just one configuration. This section covers the case in which we need to build more
than one configuration. We will use the ``Release`` and ``Debug`` configurations here for convenience, as they are easier to
follow, but in a real case these configurations would be more like Windows, Linux and OSX builds, different architectures,
cross-builds, etc.

Let's begin by cleaning our cache and initializing only the ``develop`` repo:


.. code-block:: bash

$ conan remove "*" -c # Make sure no packages from last run
$ conan remote remove "*" # Make sure no other remotes defined
$ conan remote add develop <url-develop-repo> # Add only the develop repo


We will create the packages for the two configurations sequentially on our computer, but note that these builds would typically run
on different machines, and it is common for CI systems to launch the builds of the different configurations in parallel.
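
Purely as an illustration of that idea (not one of the tutorial steps), the two builds could be written as a simple loop; a real CI system would instead fan out each configuration as a separate parallel job. This sketch assumes the same working directory as the commands below:

.. code-block:: bash

# Illustrative only: a sequential stand-in for two parallel CI jobs
$ for build_type in Release Debug; do conan create . --build="missing:ai/*" -s build_type=$build_type; done

The actual step-by-step commands used in this tutorial, including the creation and upload of the package lists, follow below.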

.. code-block:: bash
:caption: Release build

$ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json
$ conan list --graph=graph.json --graph-binaries=build --format=json > upload_release.json
$ conan remote add packages "<url-packages-repo>"
$ conan upload -l=upload_release.json -r=packages -c --format=json > upload_release.json

We have made a few changes and added extra steps:

- The first step is similar to the one in the previous section: a ``conan create``, just making our configuration explicit with
``-s build_type=Release`` for clarity, and capturing the output of the ``conan create`` in a ``graph.json`` file.
- The second step creates an ``upload_release.json`` **package list** file with the packages that need to be uploaded. In
this case, only the packages that have been built from source (``--graph-binaries=build``) will be uploaded. This is
done for efficiency and faster uploads.
- The third step defines the ``packages`` repository.
- Finally, we upload the ``upload_release.json`` package list to the ``packages`` repository, updating the ``upload_release.json``
package list with the new location of the packages (the server repository). An optional way to verify the result is sketched right after this list.
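
If you want to double-check what ended up in the server, an optional verification could be (assuming the ``packages`` remote name defined above):

.. code-block:: bash

# Optional: list the ai/1.1.0 binaries now available in the "packages" repository
$ conan list "ai/1.1.0:*" -r=packages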

Likewise, the Debug build will do the same steps:


.. code-block:: bash
:caption: Debug build

$ conan create . --build="missing:ai/*" -s build_type=Debug --format=json > graph.json
$ conan list --graph=graph.json --graph-binaries=build --format=json > upload_debug.json
$ conan remote add packages "<url-packages-repo>" -f # Can be omitted, it was defined above
$ conan upload -l=upload_debug.json -r=packages -c --format=json > upload_debug.json


When both the Release and Debug configurations finish successfully, we would have these packages in the repositories:

.. graphviz::
:align: center

digraph repositories {
node [fillcolor="lightskyblue", style=filled, shape=box]
rankdir="LR";
subgraph cluster_0 {
label="Packages server";
style=filled;
color=lightgrey;
subgraph cluster_1 {
label = "packages\n repository"
shape = "box";
style=filled;
color=lightblue;
"packages" [style=invis];
"ai/1.1.0\n (Release)";
"ai/1.1.0\n (Debug)";
}
subgraph cluster_2 {
label = "products\n repository"
shape = "box";
style=filled;
color=lightblue;
"products" [style=invis];
}
subgraph cluster_3 {
rankdir="BT";
shape = "box";
label = "develop repository";
color=lightblue;
rankdir="BT";

node [fillcolor="lightskyblue", style=filled, shape=box]
"game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
"engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
"mapviewer/1.0" -> "graphics/1.0";
"game/1.0" [fillcolor="lightgreen"];
"mapviewer/1.0" [fillcolor="lightgreen"];
}
{
edge[style=invis];
"packages" -> "products" -> "game/1.0" ;
rankdir="BT";
}
}
}


If the builds of all configurations for ``ai/1.1.0`` were successful, then the ``packages pipeline`` can proceed and promote
them to the ``products`` repository:

.. code-block:: bash
:caption: Promoting from packages->products

# aggregate the package list
$ conan pkglist merge -l upload_release.json -l upload_debug.json --format=json > promote.json

$ conan remote add packages "<url-packages-repo>" -f # Can be omitted, it was defined above
$ conan remote add products "<url-products-repo>" -f # Can be omitted if it was already defined

# Promotion with Artifactory CE (slow, can be improved with art:promote)
$ conan download --list=promote.json -r=packages --format=json > promote.json
$ conan upload --list=promote.json -r=products -c


The first step uses the ``conan pkglist merge`` command to merge the package lists from the "Release" and "Debug" configurations
into a single ``promote.json`` package list.
This is the list that will be used to run the promotion.

In this example we are using a slow ``conan download`` + ``conan upload`` promotion. This can be made much more efficient with
the ``conan art:promote`` extension command.
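
As a hedged sketch of that alternative (assuming the ``conan-extensions`` commands are installed and that the Artifactory edition in use supports the promotion API, which Artifactory CE does not; the exact flags may differ):

.. code-block:: bash

# Sketch only: server-side promotion, avoiding the local download + upload round trip
$ conan art:promote promote.json --from=packages --to=products --url=<artifactory-url> --user=<user> --password=<password>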

After running the promotion we will have the following packages in the server:

.. graphviz::
:align: center

digraph repositories {
node [fillcolor="lightskyblue", style=filled, shape=box]
rankdir="LR";
subgraph cluster_0 {
label="Packages server";
style=filled;
color=lightgrey;
subgraph cluster_1 {
label = "packages\n repository"
shape = "box";
style=filled;
color=lightblue;
"packages" [style=invis];
"ai/1.1.0\n (Release)";
"ai/1.1.0\n (Debug)";
}
subgraph cluster_2 {
label = "products\n repository"
shape = "box";
style=filled;
color=lightblue;
"products" [style=invis];
"ai/promoted release" [label="ai/1.1.0\n (Release)"];
"ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
}
subgraph cluster_3 {
rankdir="BT";
shape = "box";
label = "develop repository";
color=lightblue;
rankdir="BT";

node [fillcolor="lightskyblue", style=filled, shape=box]
"game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
"engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
"mapviewer/1.0" -> "graphics/1.0";
"game/1.0" [fillcolor="lightgreen"];
"mapviewer/1.0" [fillcolor="lightgreen"];
}
{
edge[style=invis];
"packages" -> "products" -> "game/1.0" ;
rankdir="BT";
}
}
}
2 changes: 2 additions & 0 deletions devops/continuous_integration/packages_pipeline/multi_configuration_lockfile.rst
@@ -0,0 +1,2 @@
Package pipeline: multi configuration using lockfiles
=====================================================
98 changes: 98 additions & 0 deletions devops/continuous_integration/packages_pipeline/single_configuration.rst
@@ -0,0 +1,98 @@
Package pipeline: single configuration
======================================

We will start with the simplest case, in which we only have to build one configuration, and that configuration
can be built in the current CI machine.

As we described before when presenting the different server binary repositories, the idea is that package builds
will use by default only the ``develop`` repo, which is considered the stable one for developers and CI jobs.

Let's make sure we start from a clean state:

.. code-block:: bash

$ conan remove "*" -c # Make sure no packages from last run
$ conan remote remove "*" # Make sure no other remotes defined
$ conan remote add develop <url-develop-repo> # Add only the develop repo


The removal and addition of repos across this tutorial can be a bit tedious, but it is important for the correct
behavior. There might also be other setups that are even more efficient for some cases, like re-triggering
a job that broke because of a CI malfunction, but we will keep it simple for the moment and focus on the main concepts.

With this configuration the CI job could just do:

.. code-block:: bash

$ conan create ai --build="missing:ai/*"
...
ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)!
ai/1.1.0: Intelligence level=50


Note that the ``--build="missing:ai/*"`` might not be strictly necessary in some cases, but it can save time in other situations.
For example, if the developer only changed the repo README and didn't bump the version at all, Conan will not
generate a new ``recipe revision`` and will detect this as a no-op, avoiding an unnecessary rebuild of the binaries from source.
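
To check whether the current ``ai`` recipe revision already exists in the server (and therefore whether the build would be such a no-op), an optional, purely illustrative query could be (assuming the ``develop`` remote defined above):

.. code-block:: bash

# Optional: list the ai recipe revisions already available in the develop repo
$ conan list "ai/*#*" -r=develop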

If we are in a single-configuration scenario and the package built correctly, then for this simple case we won't need a promotion at all,
and directly uploading the built packages to the ``products`` repository will be enough:


.. code-block:: bash

# We don't want to disrupt developers or CI, upload to products
$ conan remote add products <url-products-repo>
$ conan upload "ai*" -r=products -c

As the cache was initially clean, all the ``ai`` packages in it are exactly the ones that were built in this pipeline. A quick way to confirm this is sketched below.
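
As an optional sanity check (illustrative only, not required by the pipeline), we can list what is currently in the local cache:

.. code-block:: bash

# Optional: show the ai recipes and package binaries currently in the local cache
$ conan list "ai*:*"

After the upload, the server repositories would look like this: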


.. graphviz::
:align: center

digraph repositories {
node [fillcolor="lightskyblue", style=filled, shape=box]
rankdir="LR";
subgraph cluster_0 {
label="Packages server";
style=filled;
color=lightgrey;
subgraph cluster_1 {
label = "packages\n repository"
shape = "box";
style=filled;
color=lightblue;
"packages" [style=invis];
}
subgraph cluster_2 {
label = "products\n repository"
shape = "box";
style=filled;
color=lightblue;
"products" [style=invis];
"ai/1.1.0\n (single config)";
}
subgraph cluster_3 {
rankdir="BT";
shape = "box";
label = "develop repository";
color=lightblue;
rankdir="BT";

node [fillcolor="lightskyblue", style=filled, shape=box]
"game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
"engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
"mapviewer/1.0" -> "graphics/1.0";
"game/1.0" [fillcolor="lightgreen"];
"mapviewer/1.0" [fillcolor="lightgreen"];
}
{
edge[style=invis];
"packages" -> "products" -> "game/1.0" ;
rankdir="BT";
}
}
}


This was a very simple scenario. Let's now move to a more realistic one: having to build more than one configuration.
31 changes: 31 additions & 0 deletions devops/continuous_integration/products_pipeline.rst
@@ -0,0 +1,31 @@
Products pipeline
==================



There are some important points to understand about the products pipeline:

- What are the **products**? The "products" are the main software artifacts that our organization delivers as its final result, providing
value to the users of those artifacts. In this example we will consider ``game/1.0`` and ``mapviewer/1.0`` the "products". Note that it is
possible to define different versions of the same package as products; for example, if we had to maintain different versions of the ``game`` for
different customers, we could have ``game/1.0`` and ``game/2.3``, as well as different versions of ``mapviewer``, as products.
- Why not define in the CI the "users" or "consumers" of every package? It might be tempting to model the relationships between packages, in this
case that the ``ai`` package is used by the ``engine`` package, and then try to configure the CI so that a build of ``engine`` is triggered after
a build of ``ai``. But this approach does not scale at all and has very important limitations:

- The example above is relatively simple, but in practice dependency graphs can have many more packages, even hundreds, making it very tedious and error-prone to define all the dependencies among packages in the CI.
- Dependencies evolve over time: new versions are used, some dependencies are removed and newer ones are added. The simple relationships between repositories modeled at the CI level can result in a very inefficient, slow and time-consuming CI, if not a fragile one that continuously breaks because some dependencies change.
- There is a combinatorial explosion downstream of a dependency graph: a relatively stable top dependency, let's say ``mathlib/1.0``, might be used by multiple consumers such as ``ai/1.0``, ``ai/1.1`` and ``ai/1.2``, which in turn might each be used by multiple different ``engine`` versions, and so on. Building only the latest version of the consumers would be insufficient in many cases, and building all of them would be extremely costly.
- In C and C++ projects the "products" pipeline becomes more necessary and critical than in other languages, due to the compilation model, with
header textual inclusions becoming part of the consumers' binary artifacts, and due to the native artifact
linkage models. This means that in many scenarios it will be necessary to build new binaries that depend on some modified packages, even
if the source code of the package itself didn't change at all. Conan's ``package_id`` computation, together with some versioning conventions,
can greatly help to efficiently define which packages need to be rebuilt and which ones don't (a small preview is sketched after this list).
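
As a rough preview of how this is computed in practice (covered in detail in the next sections), Conan can calculate, for a given product, which packages are missing binaries and in which order they should be built. This is a minimal sketch, assuming the ``game/1.0`` product and the repositories defined earlier in this tutorial; the exact usage is explained later:

.. code-block:: bash

# Sketch only: compute which packages need new binaries for the game/1.0
# product and the order in which they should be built
$ conan graph build-order --requires=game/1.0 --build=missing --order-by=recipe --format=json > build_order.json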



.. toctree::
:maxdepth: 1

products_pipeline/single_configuration

2 changes: 2 additions & 0 deletions devops/continuous_integration/products_pipeline/single_configuration.rst
@@ -0,0 +1,2 @@
Products pipeline: single configuration
=======================================