
Allow to specify the development shell to use #327

Open
otavio opened this issue Aug 13, 2024 · 9 comments

Comments

@otavio

otavio commented Aug 13, 2024

I'm facing a situation where the development shell for the CI can be minimal while the developer's development shell must have additional tools installed.

I need to select the appropriate shell for the CI job, but the current action doesn't seem to allow me to do so.

Can you add support for that?

@workflow
Owner

Hi @otavio, does the following combination of options work for your use case?

@otavio
Author

otavio commented Aug 20, 2024

It does not. I want to use a specific devShell to reduce the size of the dependencies fetched for the CI job to execute. I would create a CI development shell specifically for this use, and the developers' shell would inherit from it, extending it with the command-line tools used during local development.

@workflow
Owner

workflow commented Sep 2, 2024

@otavio You can already do this with existing nix/flake functionality.

For example, you could have a CI "base flake" under /ci/flake.nix where you define the inputs wanted for CI only:

{
  description = "A basic flake with a special devShell package";

  inputs = {
    nixpkgs.url = "nixpkgs/24.05";
    flake-utils.url = "github:numtide/flake-utils";
  };

  outputs = {...} @ inputs:
    inputs.flake-utils.lib.eachDefaultSystem (system: let
      pkgs = import inputs.nixpkgs {inherit system;};
    in {
      devShell = pkgs.mkShell {
        buildInputs = with pkgs; [
          cmatrix
        ];
      };
    });
}

(setting working-directory and flakes-from-devshell to use that slimmed down development shell in CI)
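In the workflow file, that configuration could look roughly like this (a sketch only: the `uses:` reference is a placeholder, and only the `working-directory` and `flakes-from-devshell` input names come from the options mentioned above):

```yaml
# Hypothetical workflow step; the action reference is a placeholder.
- uses: owner/this-action@v3
  with:
    working-directory: ci          # point the action at /ci/flake.nix
    flakes-from-devshell: true     # pull packages from that flake's devShell
```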

And then for your local development environment, you can import the CI flake and extend its buildInputs:

{
  description = "Main project flake that extends the CI flake's devShell";

  inputs = {
    nixpkgs.url = "nixpkgs/24.05";
    flake-utils.url = "github:numtide/flake-utils";
    ci.url = "path:./ci";
  };

  outputs = {
    nixpkgs,
    flake-utils,
    ci,
    ...
  }:
    flake-utils.lib.eachDefaultSystem (system: let
      pkgs = import nixpkgs {inherit system;};
      ciBuildInputs = ci.devShell.${system}.buildInputs; # use the current system rather than hard-coding x86_64-linux
    in {
      devShell = pkgs.mkShell {
        buildInputs = ciBuildInputs ++ [pkgs.hello];
      };
    });
}

This local devShell gives you all the packages from the CI shell (cmatrix) and extends it with whatever you want to have only locally (hello).

Addition: If you plan on maintaining a more complex development shell, I can highly recommend using the excellent https://devenv.sh/

@otavio
Author

otavio commented Sep 2, 2024

I hadn't considered this split setup; it is indeed very clever. However, in my specific use case I'm using a single flake as the source for multiple projects, so I can standardize the flake setup as well as the dependencies.

In this case, I would need a way to specify which development shell is used when entering nix develop.

If I was going to do that by command-line, I would do:

nix develop .#ci

And for normal development, I would just call:

nix develop

Is it possible for you to add support to specify the shell to use?
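For context, `nix develop .#ci` resolves the `ci` attribute from the flake's devShells output, while plain `nix develop` picks the default shell. A minimal sketch of a single flake exposing both (names and packages are illustrative):

```nix
{
  outputs = { self, nixpkgs, ... }: let
    pkgs = nixpkgs.legacyPackages.x86_64-linux;
  in {
    devShells.x86_64-linux = {
      # entered by plain `nix develop`
      default = pkgs.mkShell { packages = [ pkgs.hello ]; };
      # entered by `nix develop .#ci`
      ci = pkgs.mkShell { packages = [ pkgs.cmatrix ]; };
    };
  };
}
```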

@workflow
Owner

workflow commented Sep 20, 2024

@otavio v3.4.0 (latest) now supports a custom-devshell attribute; see the README.

Here's an example in action.
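A usage sketch (the `uses:` reference is a placeholder; only the `custom-devshell` input name comes from the release mentioned above):

```yaml
- uses: owner/this-action@v3.4.0   # placeholder action reference
  with:
    custom-devshell: ci            # equivalent to `nix develop .#ci`
```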

@otavio
Author

otavio commented Sep 24, 2024

This seems to be exactly what I was looking for. Thanks for your support.

@otavio otavio closed this as completed Sep 24, 2024
@otavio otavio reopened this Oct 28, 2024
@otavio
Author

otavio commented Oct 28, 2024

@workflow I'm returning to this because I found an interesting side effect of this change.

If I use version 3.3, it all works. However, bumping this to version 3.4 breaks the build. I'm having a very hard time figuring out what has changed because I didn't even change the development shell to be used; only the action has been updated in this PR (https://github.com/OSSystems/ZephyrBT/actions/runs/11558586629/job/32171463466).

Could you take a look at that and see if you have any idea what's going on?

@workflow
Owner

workflow commented Nov 3, 2024

Hmm, sorry @otavio, at first look I have no idea what could be going on here. It seems to pull the exact same dependency versions (like west) in both runs, and all tests pass as before, but then it fails with this strange error from west:

FATAL ERROR: already initialized in /home/runner/work/ZephyrBT, aborting.
  Hint: if you do not want a workspace there,
  remove this directory and re-run this command:

  /home/runner/work/ZephyrBT/.west

Since you likely understand better than I do what west is doing under the hood, I would dig in there and enable some debug flags to try to figure out what's going on.

Also, as a side note: since your project is already fully flake-ified, you could try the excellent https://devenv.sh/ instead of this action for your CI needs, or look at This Action, which seems more built for the modern flake age :)

@otavio
Author

otavio commented Nov 4, 2024

This is indeed very weird because, if you look at the code, it works well with the current 3.3 version; however, updating to 3.4 makes it fail.
