[NDTensors] Start SparseArrayDOKs module #1270

Merged: 17 commits, Nov 29, 2023
47 changes: 20 additions & 27 deletions NDTensors/src/NDTensors.jl
@@ -19,33 +19,26 @@ using Strided
using TimerOutputs
using TupleTools

# TODO: List types, macros, and functions being used.
-include("lib/AlgorithmSelection/src/AlgorithmSelection.jl")
-using .AlgorithmSelection: AlgorithmSelection
-include("lib/BaseExtensions/src/BaseExtensions.jl")
-using .BaseExtensions: BaseExtensions
-include("lib/SetParameters/src/SetParameters.jl")
-using .SetParameters
-include("lib/BroadcastMapConversion/src/BroadcastMapConversion.jl")
-using .BroadcastMapConversion: BroadcastMapConversion
-include("lib/Unwrap/src/Unwrap.jl")
-using .Unwrap
-include("lib/RankFactorization/src/RankFactorization.jl")
-using .RankFactorization: RankFactorization
-include("lib/TensorAlgebra/src/TensorAlgebra.jl")
-using .TensorAlgebra: TensorAlgebra
-include("lib/DiagonalArrays/src/DiagonalArrays.jl")
-using .DiagonalArrays
-include("lib/BlockSparseArrays/src/BlockSparseArrays.jl")
-using .BlockSparseArrays
-include("lib/NamedDimsArrays/src/NamedDimsArrays.jl")
-using .NamedDimsArrays: NamedDimsArrays
-include("lib/SmallVectors/src/SmallVectors.jl")
-using .SmallVectors
-include("lib/SortedSets/src/SortedSets.jl")
-using .SortedSets
-include("lib/TagSets/src/TagSets.jl")
-using .TagSets
+for lib in [
+  :AlgorithmSelection,
+  :BaseExtensions,
+  :SetParameters,
+  :BroadcastMapConversion,
+  :Unwrap,
+  :RankFactorization,
+  :TensorAlgebra,
+  :SparseArrayInterface,
+  :SparseArrayDOKs,
+  :DiagonalArrays,
+  :BlockSparseArrays,
+  :NamedDimsArrays,
+  :SmallVectors,
+  :SortedSets,
+  :TagSets,
+]
+  include("lib/$(lib)/src/$(lib).jl")
+  @eval using .$lib: $lib
+end
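`include` takes a runtime string, but a `using` statement cannot interpolate a module name, which is why the loop wraps the import in `@eval`. For `lib = :TensorAlgebra`, each iteration is equivalent to:

```julia
include("lib/TensorAlgebra/src/TensorAlgebra.jl")
using .TensorAlgebra: TensorAlgebra
```

Note that the loop imports only each submodule's name, whereas some of the replaced statements (e.g. `using .SetParameters`, `using .Unwrap`) pulled in all of the module's exports; that narrowing is why several files below gain explicit `using .Unwrap: ...` and `using .SetParameters: ...` lines.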

using Base: @propagate_inbounds, ReshapedArray, DimOrInd, OneTo

3 changes: 3 additions & 0 deletions NDTensors/src/abstractarray/fill.jl
@@ -1,3 +1,6 @@
using .SetParameters: DefaultParameters, set_unspecified_parameters
using .Unwrap: unwrap_type

function generic_randn(
arraytype::Type{<:AbstractArray}, dim::Integer=0; rng=Random.default_rng()
)
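Based on the signature above, a call supplies the desired array type and length; something like the following (illustrative only, assuming NDTensors is loaded and that `generic_randn` fills the array from `rng`):

```julia
using NDTensors: NDTensors

# Allocate a Vector{Float64} of length 10 with normally distributed entries.
v = NDTensors.generic_randn(Vector{Float64}, 10)
```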
2 changes: 2 additions & 0 deletions NDTensors/src/abstractarray/similar.jl
@@ -1,3 +1,5 @@
using .Unwrap: IsWrappedArray

## Custom `NDTensors.similar` implementation.
## More extensive than `Base.similar`.

3 changes: 3 additions & 0 deletions NDTensors/src/abstractarray/tensoralgebra/contract.jl
@@ -1,4 +1,7 @@
using LinearAlgebra: BlasFloat
using .Unwrap: expose

# TODO: Delete these exports
export backend_auto, backend_blas, backend_generic

@eval struct GemmBackend{T}
5 changes: 4 additions & 1 deletion NDTensors/src/array/permutedims.jl
@@ -1,4 +1,7 @@
-## Create the Exposed version of Base.permutedims
+using .Unwrap: Exposed, unexpose
+
+# TODO: Move to `Unwrap` module.
+# Create the Exposed version of Base.permutedims
function permutedims(E::Exposed{<:Array}, perm)
## Creating Mperm here to evaluate the permutation and
## avoid returning a Stridedview
2 changes: 2 additions & 0 deletions NDTensors/src/array/set_types.jl
@@ -1,3 +1,5 @@
using .SetParameters: Position, get_parameter, set_parameters

"""
TODO: Use `Accessors.jl` notation:
```julia
@@ -1,3 +1,6 @@
using .BlockSparseArrays: BlockSparseArray
using .DiagonalArrays: DiagonalArray

# Used for dispatch to distinguish from Tensors wrapping TensorStorage.
# Remove once TensorStorage is removed.
const ArrayStorage{T,N} = Union{
4 changes: 3 additions & 1 deletion NDTensors/src/arraystorage/arraystorage/tensor/svd.jl
@@ -1,3 +1,5 @@
using .DiagonalArrays: DiagIndices, DiagonalMatrix

backup_svd_alg(::Algorithm"divide_and_conquer") = Algorithm"qr_iteration"()
backup_svd_alg(::Algorithm"qr_iteration") = Algorithm"recursive"()
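The two methods above form a fallback chain: each algorithm names a more robust backup. A hypothetical, self-contained sketch of how such a chain is typically consumed — not the package's actual retry code; `try_svd` and the `Symbol`-keyed table are stand-ins:

```julia
using LinearAlgebra: svd

# Stand-in for `backup_svd_alg`: map each algorithm to its backup.
backup_alg(alg::Symbol) =
  Dict(:divide_and_conquer => :qr_iteration, :qr_iteration => :recursive)[alg]

# Stand-in SVD attempt; pretend only the most robust algorithm succeeds,
# and that failure is signaled by returning `nothing`.
function try_svd(A, alg::Symbol)
  alg === :recursive && return svd(A)
  return nothing
end

# Retry with progressively more robust algorithms until one succeeds.
function svd_with_fallback(A, alg::Symbol)
  result = try_svd(A, alg)
  isnothing(result) || return result
  return svd_with_fallback(A, backup_alg(alg))
end

F = svd_with_fallback(randn(3, 3), :divide_and_conquer)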

@@ -111,7 +113,7 @@ function svd(
# Make the new indices to go onto U and V
# TODO: Put in a separate function, such as
# `rewrap_inds` or something like that.
-dS = length(S[DiagIndices()])
+dS = length(S[DiagIndices(:)])
indstype = typeof(inds(T))
u = eltype(indstype)(dS)
v = eltype(indstype)(dS)
@@ -1,3 +1,7 @@
# TODO: Change to:
# using .SparseArrayDOKs: SparseArrayDOK
using .BlockSparseArrays: SparseArray

# TODO: This is inefficient, need to optimize.
# Look at `contract_labels`, `contract_blocks` and `maybe_contract_blocks!` in:
# src/blocksparse/contract_utilities.jl
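The TODO above points at `SparseArrayDOK` from the `SparseArrayDOKs` module this PR introduces. As a rough illustration of the dictionary-of-keys (DOK) storage idea — the type and constructor below are a hypothetical sketch, not the module's actual API:

```julia
# Hypothetical DOK sparse array: stored entries live in a Dict keyed by
# CartesianIndex; absent entries read as zero.
struct SparseArrayDOKSketch{T,N} <: AbstractArray{T,N}
  data::Dict{CartesianIndex{N},T}
  dims::NTuple{N,Int}
end

SparseArrayDOKSketch{T}(dims::Int...) where {T} =
  SparseArrayDOKSketch{T,length(dims)}(Dict{CartesianIndex{length(dims)},T}(), dims)

Base.size(a::SparseArrayDOKSketch) = a.dims

# Reads fall back to zero for unstored indices.
Base.getindex(a::SparseArrayDOKSketch{T,N}, I::Vararg{Int,N}) where {T,N} =
  get(a.data, CartesianIndex(I), zero(T))

# Writes insert (or overwrite) an entry in the dictionary.
function Base.setindex!(a::SparseArrayDOKSketch{T,N}, v, I::Vararg{Int,N}) where {T,N}
  a.data[CartesianIndex(I)] = v
  return a
end

a = SparseArrayDOKSketch{Float64}(3, 4)
a[1, 2] = 1.5
```

DOK storage makes random insertion O(1), which suits incrementally building block or sparse contractions; the trade-off is unordered iteration over stored entries.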
7 changes: 7 additions & 0 deletions NDTensors/src/imports.jl
@@ -1,3 +1,10 @@
# Makes `cpu` available as `NDTensors.cpu`.
# TODO: Define `cpu`, `cu`, etc. in a module `DeviceAbstractions`,
# similar to:
# https://github.com/JuliaGPU/KernelAbstractions.jl
# https://github.com/oschulz/HeterogeneousComputing.jl
using .Unwrap: cpu

import Base:
# Types
AbstractFloat,
41 changes: 32 additions & 9 deletions NDTensors/src/lib/DiagonalArrays/README.md
@@ -3,19 +3,33 @@
A Julia `DiagonalArray` type.

````julia
-using NDTensors.DiagonalArrays: DiagonalArray, DiagIndex, DiagIndices
+using NDTensors.DiagonalArrays: DiagonalArray, DiagonalMatrix, DiagIndex, DiagIndices, isdiagindex
using Test

function main()
-d = DiagonalArray([1.0, 2, 3], 3, 4, 5)
+d = DiagonalMatrix([1.0, 2.0, 3.0])
@test eltype(d) == Float64
@test size(d) == (3, 3)
@test d[1, 1] == 1
@test d[2, 2] == 2
@test d[3, 3] == 3
@test d[1, 2] == 0

d = DiagonalArray([1.0, 2.0, 3.0], 3, 4, 5)
@test eltype(d) == Float64
@test d[1, 1, 1] == 1
@test d[2, 2, 2] == 2
@test d[3, 3, 3] == 3
@test d[1, 2, 1] == 0

d[2, 2, 2] = 22
@test d[2, 2, 2] == 22

-@test length(d[DiagIndices()]) == 3
+d_r = reshape(d, 3, 20)
+@test size(d_r) == (3, 20)
+@test all(I -> d_r[I] == d[I], LinearIndices(d))
+
+@test length(d[DiagIndices(:)]) == 3
@test Array(d) == d
@test d[DiagIndex(2)] == d[2, 2, 2]

@@ -24,15 +38,24 @@ function main()

a = randn(3, 4, 5)
new_diag = randn(3)
-a[DiagIndices()] = new_diag
-d[DiagIndices()] = a[DiagIndices()]
+a[DiagIndices(:)] = new_diag
+d[DiagIndices(:)] = a[DiagIndices(:)]

-@test a[DiagIndices()] == new_diag
-@test d[DiagIndices()] == new_diag
+@test a[DiagIndices(:)] == new_diag
+@test d[DiagIndices(:)] == new_diag

permuted_d = permutedims(d, (3, 2, 1))
@test permuted_d isa DiagonalArray
@test permuted_d == d
@test permuted_d[DiagIndices(:)] == d[DiagIndices(:)]
@test size(d) == (3, 4, 5)
@test size(permuted_d) == (5, 4, 3)
for I in eachindex(d)
if !isdiagindex(d, I)
@test iszero(d[I])
else
@test !iszero(d[I])
end
end

mapped_d = map(x -> 2x, d)
@test mapped_d isa DiagonalArray
@@ -48,7 +71,7 @@ You can generate this README with:
```julia
using Literate
using NDTensors.DiagonalArrays
-dir = joinpath(pkgdir(DiagonalArrays), "src", "DiagonalArrays")
+dir = joinpath(pkgdir(DiagonalArrays), "src", "lib", "DiagonalArrays")
Literate.markdown(joinpath(dir, "examples", "README.jl"), dir; flavor=Literate.CommonMarkFlavor())
```

4 changes: 4 additions & 0 deletions NDTensors/src/lib/DiagonalArrays/examples/Project.toml
@@ -0,0 +1,4 @@
[deps]
Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
NDTensors = "23ae76d9-e61a-49c4-8f12-3f1a16adf9cf"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
42 changes: 33 additions & 9 deletions NDTensors/src/lib/DiagonalArrays/examples/README.jl
@@ -2,19 +2,34 @@
#
# A Julia `DiagonalArray` type.

-using NDTensors.DiagonalArrays: DiagonalArray, DiagIndex, DiagIndices
+using NDTensors.DiagonalArrays:
+  DiagonalArray, DiagonalMatrix, DiagIndex, DiagIndices, isdiagindex
using Test

function main()
-d = DiagonalArray([1.0, 2, 3], 3, 4, 5)
+d = DiagonalMatrix([1.0, 2.0, 3.0])
@test eltype(d) == Float64
@test size(d) == (3, 3)
@test d[1, 1] == 1
@test d[2, 2] == 2
@test d[3, 3] == 3
@test d[1, 2] == 0

d = DiagonalArray([1.0, 2.0, 3.0], 3, 4, 5)
@test eltype(d) == Float64
@test d[1, 1, 1] == 1
@test d[2, 2, 2] == 2
@test d[3, 3, 3] == 3
@test d[1, 2, 1] == 0

d[2, 2, 2] = 22
@test d[2, 2, 2] == 22

-@test length(d[DiagIndices()]) == 3
+d_r = reshape(d, 3, 20)
+@test size(d_r) == (3, 20)
+@test all(I -> d_r[I] == d[I], LinearIndices(d))
+
+@test length(d[DiagIndices(:)]) == 3
@test Array(d) == d
@test d[DiagIndex(2)] == d[2, 2, 2]

@@ -23,15 +38,24 @@ function main()

a = randn(3, 4, 5)
new_diag = randn(3)
-a[DiagIndices()] = new_diag
-d[DiagIndices()] = a[DiagIndices()]
+a[DiagIndices(:)] = new_diag
+d[DiagIndices(:)] = a[DiagIndices(:)]

-@test a[DiagIndices()] == new_diag
-@test d[DiagIndices()] == new_diag
+@test a[DiagIndices(:)] == new_diag
+@test d[DiagIndices(:)] == new_diag

permuted_d = permutedims(d, (3, 2, 1))
@test permuted_d isa DiagonalArray
@test permuted_d == d
@test permuted_d[DiagIndices(:)] == d[DiagIndices(:)]
@test size(d) == (3, 4, 5)
@test size(permuted_d) == (5, 4, 3)
for I in eachindex(d)
if !isdiagindex(d, I)
@test iszero(d[I])
else
@test !iszero(d[I])
end
end

mapped_d = map(x -> 2x, d)
@test mapped_d isa DiagonalArray
@@ -47,7 +71,7 @@ You can generate this README with:
```julia
using Literate
using NDTensors.DiagonalArrays
-dir = joinpath(pkgdir(DiagonalArrays), "src", "DiagonalArrays")
+dir = joinpath(pkgdir(DiagonalArrays), "src", "lib", "DiagonalArrays")
Literate.markdown(joinpath(dir, "examples", "README.jl"), dir; flavor=Literate.CommonMarkFlavor())
```
=#