

Wma/shard levels #2330

Triggered via pull request February 6, 2025 01:19
Status Cancelled
Total duration 8m 40s

CI.yml

on: pull_request
Matrix: test

Annotations

8 errors and 1 notice
Documentation: src/tensors/levels/shard_levels.jl#L9
doctest failure in ~/work/Finch.jl/Finch.jl/src/tensors/levels/shard_levels.jl:9-19

```jldoctest
julia> tensor_tree(Tensor(Dense(Shard(Element(0.0))), [1, 2, 3]))
3-Tensor
└─ Dense [1:3]
   ├─ [1]: Shard ->
   │  └─ 1.0
   ├─ [2]: Shard ->
   │  └─ 2.0
   └─ [3]: Shard ->
      └─ 3.0
```

Subexpression:

tensor_tree(Tensor(Dense(Shard(Element(0.0))), [1, 2, 3]))

Evaluated output:

ERROR: MethodError: no method matching ShardLevel(::ElementLevel{0.0, Float64, Int64, Vector{Float64}})
The type `ShardLevel` exists, but no method is defined for this combination of argument types when trying to construct it.

Closest candidates are:
  ShardLevel(::Device, !Matched::Lvl, !Matched::Ptr, !Matched::Task, !Matched::Val) where {Device, Lvl, Ptr, Task, Val}
   @ Finch ~/work/Finch.jl/Finch.jl/src/tensors/levels/shard_levels.jl:22
  ShardLevel(::Device, !Matched::Lvl) where {Device, Lvl}
   @ Finch ~/work/Finch.jl/Finch.jl/src/tensors/levels/shard_levels.jl:30

Stacktrace:
 [1] top-level scope
   @ none:1

Expected output:

3-Tensor
└─ Dense [1:3]
   ├─ [1]: Shard ->
   │  └─ 1.0
   ├─ [2]: Shard ->
   │  └─ 2.0
   └─ [3]: Shard ->
      └─ 3.0

diff =
Warning: Diff output requires color.
3-Tensor
└─ Dense [1:3]
   ├─ [1]: Shard ->
   │  └─ 1.0
   ├─ [2]: Shard ->
   │  └─ 2.0
   └─ [3]: Shard ->
      └─ 3.0
ERROR: MethodError: no method matching ShardLevel(::ElementLevel{0.0, Float64, Int64, Vector{Float64}})
The type `ShardLevel` exists, but no method is defined for this combination of argument types when trying to construct it.
Closest candidates are:
  ShardLevel(::Device, !Matched::Lvl, !Matched::Ptr, !Matched::Task, !Matched::Val) where {Device, Lvl, Ptr, Task, Val}
   @ Finch ~/work/Finch.jl/Finch.jl/src/tensors/levels/shard_levels.jl:22
  ShardLevel(::Device, !Matched::Lvl) where {Device, Lvl}
   @ Finch ~/work/Finch.jl/Finch.jl/src/tensors/levels/shard_levels.jl:30
Stacktrace:
 [1] top-level scope
   @ none:1
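The MethodError above indicates that `ShardLevel` cannot be constructed from an `ElementLevel` alone: both candidate methods listed in the log take a device as their first argument. A minimal sketch of how the failing doctest call might be adjusted is shown below; the use of `CPU(Threads.nthreads())` as the device is an assumption based on the candidate signatures, not something confirmed by this log, and the exact constructor may differ in the PR's final API.

```julia
using Finch

# Sketch only: the candidates in the error require a Device as the first
# argument to ShardLevel, so the Shard level is given an explicit device here.
# CPU(Threads.nthreads()) is assumed to be a valid Finch device (assumption).
dev = CPU(Threads.nthreads())
t = Tensor(Dense(Shard(dev, Element(0.0))), [1, 2, 3])
tensor_tree(t)
```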
Documentation
Process completed with exit code 1.
Julia 1.10.6 - windows-latest - x86 - pull_request
Canceling since a higher priority waiting request for 'CI-refs/pull/697/merge' exists
Julia 1.10.6 - ubuntu-latest - x64 - pull_request
Canceling since a higher priority waiting request for 'CI-refs/pull/697/merge' exists
Julia 1 - windows-latest - x86 - pull_request
Canceling since a higher priority waiting request for 'CI-refs/pull/697/merge' exists
Julia 1 - ubuntu-latest - x64 - pull_request
Canceling since a higher priority waiting request for 'CI-refs/pull/697/merge' exists
Python tests
Canceling since a higher priority waiting request for 'CI-refs/pull/697/merge' exists
Python tests
The operation was canceled.
[julia-buildpkg] Caching of the julia depot was not detected
Consider using `julia-actions/cache` to speed up runs: https://github.com/julia-actions/cache. To ignore this notice, set input `ignore-no-cache: true`.