[![Build Status](https://github.com/mcabbott/TensorCast.jl/workflows/CI/badge.svg)](https://github.com/mcabbott/TensorCast.jl/actions?query=workflow%3ACI)

This package lets you work with multi-dimensional arrays in index notation,
by defining a few macros which translate this to broadcasting, permuting, and reducing operations.

The first is `@cast`, which deals both with "casting" into new shapes (including going to and from an array-of-arrays) and with broadcasting:

```julia
@cast A[row][col] := B[row, col]        # slice a matrix B into rows, also @cast A[r] := B[r,:]

@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix

@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E

@cast _[i] := isodd(i) ? log(i) : V[i]  # broadcast a function of the index values

@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
```
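
To give a feel for what these macros generate, the first two lines above correspond roughly to plain-Julia code like this (a sketch only, using hypothetical arrays `B` and `D`; the macro's actual output may differ in how it uses views):

```julia
B = rand(2, 3)                                    # a hypothetical matrix
D = (x = rand(2, 3, 4, 5),)                       # something with a 4-tensor field x

A = [B[r, :] for r in axes(B, 1)]                 # ≈ @cast A[row][col] := B[row, col]
C = reshape(D.x, size(D.x, 1) * size(D.x, 2), :)  # ≈ @cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]
```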

Second, `@reduce` takes sums (or other reductions) over the indicated directions. Among such sums is
matrix multiplication, which can be done more efficiently using `@matmul` instead:

```julia
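# An illustrative @reduce example (a sketch, not from the package's docs; A is assumed to be a matrix):
@reduce S[i] := sum(j) exp(A[i,j])                    # S = vec(sum(exp.(A), dims=2))
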
@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]   # matrix multiplication, plus reshape
```
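
As a rough sketch, and assuming `U` is a 3-tensor whose last two axes match the combined first axis of `V`, the `@matmul` line above is equivalent to:

```julia
M = reshape(U, size(U, 1), :) * V   # collapse (k,k′) of U into one axis, then ordinary matrix multiplication
```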

The same notation with `@cast` applies a function accepting the `dims` keyword, without reducing:

```julia
@cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2   # permute, broadcast, cumsum(; dims=3)
```

All of these are converted into array commands like `reshape` and `permutedims`,
and `eachslice`, plus a [broadcasting expression](https://julialang.org/blog/2017/01/moredots) if needed,
and `sum` / `sum!`, or `*` / `mul!`. This package just provides a convenient notation.
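
For instance, the `cumsum` line above should expand to something close to this plain version (a sketch; the generated code may use lazy permutations rather than `permutedims`):

```julia
W = cumsum(permutedims(X, (2, 3, 1, 4)) .^ 2; dims = 3)   # put c third, square elementwise, accumulate along c
```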

From version 0.4, it relies on [TransmuteDims.jl](https://github.com/mcabbott/TransmuteDims.jl)
to handle re-ordering of dimensions, and [LazyStack.jl](https://github.com/mcabbott/LazyStack.jl)
to handle slices. It should also now work with [OffsetArrays.jl](https://github.com/JuliaArrays/OffsetArrays.jl):

```julia
using OffsetArrays
@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)   # arbitrary indexing
```
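
The result should then carry the offset axis, roughly:

```julia
axes(R, 1)   # expected to be an axis equivalent to -5:5
size(R)      # (11, 3)
```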

And it can be used with some packages which modify broadcasting:

```julia
using Strided, LoopVectorization, LazyArrays
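
# Illustrative combinations (a sketch of assumed syntax; a macro written after @cast is applied to the broadcast):
@cast @strided E[φ,γ] = F[φ]^2 * exp(G[γ])   # broadcast handled by Strided.jl
@cast @lazy M[i,j] := U[i] * V[j]            # lazy broadcasting via LazyArrays.jl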
```

## Installation

```julia
using Pkg; Pkg.add("TensorCast")
```

The current version requires [Julia 1.4](https://julialang.org/downloads/) or later.
There are a few pages of [documentation](https://mcabbott.github.io/TensorCast.jl/dev).

## Elsewhere

Similar notation is also used by some other packages, although all of them use an implicit sum over
repeated indices. [TensorOperations.jl](https://github.com/Jutho/TensorOperations.jl) performs
Einstein-convention contractions and traces:
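
For instance, a contraction over the repeated index `j` might be written like this (an illustrative sketch, with `B` and `C` assumed to be matrices):

```julia
using TensorOperations
@tensor A[i,k] := B[i,j] * C[j,k]   # implicit sum over j, i.e. A = B * C
```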

This was a holiday project to learn a bit of metaprogramming, originally `TensorSlice.jl`.
But it suffered a little scope creep.