Merge pull request #696 from finch-tensor/wma/formatter
use Blue Style
willow-ahrens authored Feb 5, 2025
2 parents e34956c + 2994939 commit 6ae4ee1
Showing 194 changed files with 14,760 additions and 7,962 deletions.
11 changes: 11 additions & 0 deletions .JuliaFormatter.toml
@@ -0,0 +1,11 @@
+ignore = ["test/reference32", "test/reference64"]
+format_markdown = true
+style = "blue"
+join_lines_based_on_source = true
+annotate_untyped_fields_with_any = false
+normalize_line_endings = "unix"
+always_use_return = false # https://github.com/domluna/JuliaFormatter.jl/issues/888
+align_assignment = true
+align_struct_field = true
+align_conditional = true
+align_pair_arrow = true
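
This configuration is discovered automatically by JuliaFormatter when it formats the repository root, so a local run needs no extra options; a minimal sketch, assuming JuliaFormatter is installed in the active environment:

```julia
# Minimal sketch: apply the repository's .JuliaFormatter.toml locally.
# `format` walks the directory and picks up the config file on its own.
using JuliaFormatter

format(".")  # reformats in place; returns true if everything was already formatted
```
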
10 changes: 10 additions & 0 deletions .github/workflows/StyleBot.yml
@@ -0,0 +1,10 @@
+name: Style Review
+on:
+  pull_request:
+jobs:
+  code-style:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: julia-actions/julia-format@v3
+        with:
+          version: '1' # Set `version` to '1.0.54' if you need to use JuliaFormatter.jl v1.0.54 (default: '1')
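
Roughly the same check can be run locally before pushing; a sketch assuming JuliaFormatter's documented `overwrite` keyword:

```julia
# Sketch of a non-destructive local version of the CI style check:
# `overwrite=false` reports whether files are formatted without rewriting them.
using JuliaFormatter

ok = format("."; overwrite=false)
ok || @error "Some files do not follow the style in .JuliaFormatter.toml"
exit(ok ? 0 : 1)  # a nonzero exit mirrors a failing check
```
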
8 changes: 7 additions & 1 deletion CONTRIBUTING.md
@@ -77,7 +77,7 @@ option to configure the number of processors.
```

You can also filter to only run a selection of test suites using the `--include` or
-`--exclude` arguments, or 
+`--exclude` arguments, or

```
./test/runtests.jl --include constructors interface_einsum interface_asmd
```
@@ -115,3 +115,9 @@ The `/docs` directory includes Finch documentation in `/src`, and a built
website in `/build`. You can build the website with `./docs/make.jl`. You can
run doctests with `./docs/test.jl`, and fix doctests with `./docs/fix.jl`,
though both are included as part of the test suite.
+
+## Code Style
+
+We use [Blue Style](https://github.com/JuliaDiff/BlueStyle) formatting, with a few tweaks
+defined in `.JuliaFormatter.toml`. Running the tests in overwrite mode will
+automatically reformat your code, but you can also add [`JuliaFormatter`](https://domluna.github.io/JuliaFormatter.jl/stable/#Editor-Plugins) to your editor to reformat as you go.
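
As a sketch of the editor-driven workflow, `format_file` and `format_text` are the JuliaFormatter entry points most editor plugins wrap; the file path below is hypothetical:

```julia
# Sketch: reformat one file, or one snippet, while developing.
using JuliaFormatter

format_file("src/example.jl")  # hypothetical path; rewrites the file in place
format_text("f( x )=x+ 1")     # returns the formatted string, roughly "f(x) = x + 1"
```
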
4 changes: 3 additions & 1 deletion Project.toml
@@ -72,6 +72,8 @@ SimpleWeightedGraphs = "47aef6b3-ad0c-573a-a1e2-d07658019622"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
TensorMarket = "8b7d4fe7-0b45-4d0d-9dd8-5cc9b23b4b77"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
+FileWatching = "7b1f6079-737a-58dc-b8bc-7a2ca5c1b5ee"
+JuliaFormatter = "98e50ef6-434e-11e9-1051-2b60c6c9e899"

[targets]
-test = ["ReTestItems", "Test", "ArgParse", "LinearAlgebra", "Random", "SparseArrays", "Graphs", "SimpleWeightedGraphs", "HDF5", "NPZ", "Pkg", "TensorMarket", "Documenter"]
+test = ["ReTestItems", "Test", "ArgParse", "LinearAlgebra", "Random", "SparseArrays", "Graphs", "SimpleWeightedGraphs", "HDF5", "NPZ", "Pkg", "TensorMarket", "Documenter", "FileWatching", "JuliaFormatter"]
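
With `FileWatching` and `JuliaFormatter` added as test-only dependencies, the style check ships with the rest of the suite; a minimal sketch using the standard Pkg entry point:

```julia
# Sketch: run Finch's test suite, which now loads JuliaFormatter as a test dependency.
using Pkg

Pkg.test("Finch")
```
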
51 changes: 29 additions & 22 deletions README.md
@@ -1,20 +1,11 @@
# Finch.jl

-[docs]:https://finch-tensor.github.io/Finch.jl/stable
-[ddocs]:https://finch-tensor.github.io/Finch.jl/dev
-[ci]:https://github.com/finch-tensor/Finch.jl/actions/workflows/CI.yml?query=branch%3Amain
-[cov]:https://codecov.io/gh/finch-tensor/Finch.jl
-[example]:https://github.com/finch-tensor/Finch.jl/tree/main/docs/examples
-
-[docs_ico]:https://img.shields.io/badge/docs-stable-blue.svg
-[ddocs_ico]:https://img.shields.io/badge/docs-dev-blue.svg
-[ci_ico]:https://github.com/finch-tensor/Finch.jl/actions/workflows/CI.yml/badge.svg?branch=main
-[cov_ico]:https://codecov.io/gh/finch-tensor/Finch.jl/branch/main/graph/badge.svg
-[example_ico]:https://img.shields.io/badge/examples-docs%2Fexamples-blue.svg
-
-| **Documentation** | **Build Status** | **Examples** |
-|:---------------------------------------------:|:-------------------------------------:|:---------------------:|
-| [![][docs_ico]][docs] [![][ddocs_ico]][ddocs] | [![][ci_ico]][ci] [![][cov_ico]][cov] | [![][example_ico]][example] |
+[![Stable Documentation](https://img.shields.io/badge/docs-stable-blue.svg)](https://finch-tensor.github.io/Finch.jl/stable)
+[![Development Documentation](https://img.shields.io/badge/docs-dev-blue.svg)](https://finch-tensor.github.io/Finch.jl/dev)
+[![Examples](https://img.shields.io/badge/docs-examples-blue.svg)](https://github.com/finch-tensor/Finch.jl/tree/main/docs/examples)
+[![CI Status](https://github.com/finch-tensor/Finch.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/finch-tensor/Finch.jl/actions/workflows/CI.yml?query=branch%3Amain)
+[![Coverage Report](https://codecov.io/gh/finch-tensor/Finch.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/finch-tensor/Finch.jl)
+[![Code Style: Blue](https://img.shields.io/badge/code%20style-blue-4495d1.svg)](https://github.com/JuliaDiff/BlueStyle)

Finch is a Julia-to-Julia compiler for sparse or structured multidimensional arrays. Finch empowers users to write high-level array programs which are transformed behind the scenes into fast sparse code.

@@ -28,7 +19,9 @@ use case. This allows users to write readable, high-level sparse array programs
At the [Julia](https://julialang.org/downloads/) REPL, install the latest stable version by running:

```julia
-julia> using Pkg; Pkg.add("Finch")
+julia> using Pkg;
+Pkg.add("Finch");
+
```

## Quickstart
@@ -37,13 +30,15 @@ julia> using Pkg; Pkg.add("Finch")
julia> using Finch

# Create a sparse tensor
+
julia> A = Tensor(CSCFormat(), [1 0 0; 0 2 0; 0 0 3])
3×3 Tensor{DenseLevel{Int64, SparseListLevel{Int64, Vector{Int64}, Vector{Int64}, ElementLevel{0.0, Float64, Int64, Vector{Float64}}}}}:
1.0 0.0 0.0
0.0 2.0 0.0
0.0 0.0 3.0

# Perform a simple operation
+
julia> B = A + A
3×3 Tensor{DenseLevel{Int64, SparseDictLevel{Int64, Vector{Int64}, Vector{Int64}, Vector{Int64}, Dict{Tuple{Int64, Int64}, Int64}, Vector{Int64}, ElementLevel{0.0, Float64, Int64, Vector{Float64}}}}}:
2.0 0.0 0.0
@@ -71,26 +66,33 @@ Finch supports many high-level array operations out of the box, such as `+`, `*`
julia> using Finch

# Define sparse tensor A
+
julia> A = Tensor(Dense(SparseList(Element(0.0))), [0 1.1 0; 2.2 0 3.3; 4.4 0 0; 0 0 5.5])

# Define sparse tensor B
+
julia> B = Tensor(Dense(SparseList(Element(0.0))), [0 1 1; 1 0 0; 0 0 1; 0 0 1])

# Element-wise multiplication
+
julia> C = A .* B

# Element-wise max
+
julia> C = max.(A, B)

# Sum over rows
-julia> D = sum(C, dims=2)
+
+julia> D = sum(C; dims=2)

```

For situations where more complex operations are needed, Finch supports an `@einsum` syntax on sparse and structured tensors.

```julia
julia> @einsum E[i] += A[i, j] * B[i, j]
+
-julia> @einsum F[i, k] <<max>>= A[i, j] + B[j, k]
+julia> @einsum F[i, k] << max >>= A[i, j] + B[j, k]
+
```
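
Here `<< op >>=` selects a custom reduction operator in place of the default `+`. A self-contained sketch with dimensionally consistent inputs (the 3×3 `B` below is illustrative and differs from the 4×3 `B` above):

```julia
# Sketch: @einsum with the default `+=` reduction and a custom `max` reduction.
using Finch

A = Tensor(Dense(SparseList(Element(0.0))), [0 1.1 0; 2.2 0 3.3; 4.4 0 0; 0 0 5.5])  # 4×3
B = Tensor(Dense(SparseList(Element(0.0))), [0 1 1; 1 0 0; 0 0 1])                   # 3×3

@einsum C[i, k] += A[i, j] * B[j, k]          # sum over j (a matrix multiply)
@einsum F[i, k] << max >>= A[i, j] + B[j, k]  # reduce over j with max instead of +
```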

@@ -101,9 +103,13 @@ sparsity patterns of the inputs.
```julia
julia> using Finch, BenchmarkTools

-julia> A = fsprand(1000, 1000, 0.1); B = fsprand(1000, 1000, 0.1); C = fsprand(1000, 1000, 0.0001);
+julia> A = fsprand(1000, 1000, 0.1);
+B = fsprand(1000, 1000, 0.1);
+C = fsprand(1000, 1000, 0.0001);

-julia> A = lazy(A); B = lazy(B); C = lazy(C);
+julia> A = lazy(A);
+B = lazy(B);
+C = lazy(C);

julia> sum(A * B * C)

@@ -115,7 +121,8 @@ julia> @btime compute(sum(A * B * C), ctx=galley_scheduler());
```

### How it Works
-Finch first translates high-level array code into **FinchLogic**, a custom intermediate representation that captures operator fusion and enables loop ordering optimizations. Using advanced schedulers, Finch optimizes FinchLogic and lowers it to **FinchNotation**, a more refined representation that precisely defines control flow. This optimized FinchNotation is then compiled into highly efficient, sparsity-aware code. Finch can specialize to each combination of sparse formats and algebraic properties, such as `x * 0 => 0`, eliminating unnecessary computations in sparse code automatically.
+
+Finch first translates high-level array code into **FinchLogic**, a custom intermediate representation that captures operator fusion and enables loop ordering optimizations. Using advanced schedulers, Finch optimizes FinchLogic and lowers it to **FinchNotation**, a more refined representation that precisely defines control flow. This optimized FinchNotation is then compiled into highly efficient, sparsity-aware code. Finch can specialize to each combination of sparse formats and algebraic properties, such as `x * 0 => 0`, eliminating unnecessary computations in sparse code automatically.
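
A sketch of the layer Finch lowers to: the `@finch` macro exposes this explicit control flow directly, so a sparse matrix-vector multiply can be written as a loop nest that skips stored zeros (the formats and sizes below are illustrative):

```julia
# Sketch: an explicit loop-nest kernel in Finch's notation, computing y = A * x.
using Finch

A = Tensor(CSCFormat(), fsprand(100, 100, 0.1))  # random sparse 100×100 matrix
x = Tensor(Dense(Element(0.0)), rand(100))
y = Tensor(Dense(Element(0.0)), zeros(100))

@finch begin
    y .= 0
    for j = _, i = _
        y[i] += A[i, j] * x[j]
    end
end
```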

## Learn More

@@ -124,7 +131,7 @@ The following manuscripts provide a good description of the research behind Finch:
[Finch: Sparse and Structured Array Programming with Control Flow](https://arxiv.org/abs/2404.16730).
Willow Ahrens, Teodoro Fields Collin, Radha Patel, Kyle Deeds, Changwan Hong, Saman Amarasinghe.

-[Looplets: A Language for Structured Coiteration](https://doi.org/10.1145/3579990.3580020). CGO 2023. 
+[Looplets: A Language for Structured Coiteration](https://doi.org/10.1145/3579990.3580020). CGO 2023.
Willow Ahrens, Daniel Donenfeld, Fredrik Kjolstad, Saman Amarasinghe.

## Beyond Finch