Sparse state matrices given to the various solvers lead to poor performance - we should either provide a warning or automatically call `dense` (#379)
Comments
Hi there! For debugging, it helps to make the example smaller and independent from plotting and other distractions. It also helps significantly if the example is shortened so that it can evaluate fast. Lastly, in julia it is valuable to use BenchmarkTools for timing. I changed the evolution call to `@benchmark timeevolution.master(T, ρ0, H, J)`, which gives:

```julia
julia> @benchmark timeevolution.master(T, ρ0, H, J)
BenchmarkTools.Trial: 2 samples with 1 evaluation.
 Range (min … max):  2.514 s …  2.516 s   ┊ GC (min … max): 4.99% … 4.99%
 Time  (median):     2.515 s              ┊ GC (median):    4.99%
 Time  (mean ± σ):   2.515 s ± 1.667 ms   ┊ GC (mean ± σ):  4.99% ± 0.00%

 Memory estimate: 3.20 GiB, allocs estimate: 105701576.
```

That immense amount of allocations is a very big red flag. The function seems to be spending more time reserving and unreserving memory than actually running computation. In comparison, in python you get:
Way faster in python. But what caused all of this? Check the output of each of these functions. In julia you will notice that your ρ0 is stored as a sparse matrix, so every intermediate density matrix is kept in a sparse data structure even though it is not sparse at all. Anyway, fixing this in julia by changing your input to a dense matrix:

```julia
julia> @benchmark timeevolution.master(T, dense(ρ0), H, J)
BenchmarkTools.Trial: 1095 samples with 1 evaluation.
 Range (min … max):  4.218 ms …   5.520 ms  ┊ GC (min … max): 0.00% … 16.30%
 Time  (median):     4.553 ms               ┊ GC (median):    0.00%
 Time  (mean ± σ):   4.566 ms ± 206.563 μs  ┊ GC (mean ± σ):  0.98% ±  3.33%

 Memory estimate: 4.02 MiB, allocs estimate: 245.
```

A bit faster than python's qutip... Not too bad. And probably it can be made another order of magnitude faster if you play with the diffeq solver settings. Thanks for bringing this up! We should probably print a warning or something in situations like this (being given a sparse matrix for the state when the evolution should use a dense matrix). Edit: the `dense` …
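For illustration, a minimal, self-contained sketch of this comparison (a toy damped Fock mode rather than the original three-level script, with made-up parameters, so timings will differ) could look like:

```julia
using QuantumOptics, BenchmarkTools, SparseArrays

# Toy system: a single damped Fock mode (illustrative only; not the original script).
b = FockBasis(10)
a = destroy(b)
H = dagger(a) * a                        # number operator as a stand-in Hamiltonian
J = [a]                                  # one Lindblad collapse operator
T = collect(range(0.0, 10.0; length=101))

ρ0_sparse = sparse(dm(fockstate(b, 0))) # state stored in a sparse data structure
ρ0_dense  = dense(ρ0_sparse)            # same state, dense storage

@btime timeevolution.master($T, $ρ0_sparse, $H, $J);  # many allocations, slow
@btime timeevolution.master($T, $ρ0_dense,  $H, $J);  # few allocations, fast
```

The gap between the two grows quickly with the Hilbert-space dimension, since every intermediate density matrix inherits the sparse storage of the initial state while becoming less and less sparse during the evolution.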
Hi Krastanov, thank you for your help in pinpointing the issue and directing me to the benchmarking tools! Should I come across any other problems, I'll make sure to provide minimal working examples for clarity. Thank you once again for your support!
Just one comment here: if we opt for the autoconversion to dense, there might be memory issues. A valid use case for sparse states is when a user knows the evolution will not populate many entries in a density matrix and the system is very large. Granted, this is quite a special case, but we somehow have to offer the possibility of keeping things sparse, i.e. the current behaviour should still be available to users. Yet, I agree that a sensible default is to make everything dense.
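A rough sketch of what such a default could look like (the helper name `prepare_state` and the `keep_sparse` keyword are purely hypothetical, not existing package API):

```julia
using QuantumOptics, SparseArrays

# Hypothetical helper, not part of QuantumOptics.jl: warn when a state with
# sparse storage reaches a solver and convert it to dense by default, while
# leaving an opt-out for genuinely sparse, very large states.
function prepare_state(ρ0; keep_sparse::Bool=false)
    if ρ0.data isa SparseArrays.AbstractSparseMatrix && !keep_sparse
        @warn "Initial state uses sparse storage; converting with dense() for the evolution. Pass keep_sparse=true to skip this."
        return dense(ρ0)
    end
    return ρ0
end
```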
I (Stefan @Krastanov) edited this issue to give some context. @Mulliken reported below a performance issue due to using a sparse initial state matrix in `timeevolution.master`. A large array of sparse (data structure) matrices was created for matrices that are not at all sparse, causing enormous performance penalties. We should either warn or automatically call `dense` in these functions.

Original report:
Hello QuantumOptics.jl community! I have a question about the performance of QuantumOptics.jl, stated as follows. This is the first time I have written code in Julia. Please forgive me if I made silly mistakes.
Issue Description
I am experiencing a performance issue with a simulation script in QuantumOptics.jl, specifically for a three-level atom coupled to two modes (5 states each). The two modes are subject to Lindblad dissipation. The script, named three-level.jl, simulates a system with a Hilbert space dimension of (3*25, 3*25) and is set to run for 500 steps.
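(The actual three-level.jl is attached below and not reproduced here; purely for orientation, a rough, hypothetical sketch of this kind of setup in QuantumOptics.jl, with made-up couplings, rates, and initial state, and with the density matrix kept dense as discussed in the comments above, might look like this:)

```julia
using QuantumOptics

# Rough, hypothetical sketch of a three-level atom coupled to two 5-state modes
# (not the attached three-level.jl; all couplings, rates, and states are made up).
b_atom = NLevelBasis(3)
b_mode = FockBasis(4)                     # Fock states 0..4, i.e. 5 states per mode
b = b_atom ⊗ b_mode ⊗ b_mode              # total dimension 3*25 = 75

σ12 = embed(b, 1, transition(b_atom, 1, 2))   # |1⟩⟨2| on the atom
σ13 = embed(b, 1, transition(b_atom, 1, 3))   # |1⟩⟨3| on the atom
a1  = embed(b, 2, destroy(b_mode))
a2  = embed(b, 3, destroy(b_mode))

g1, g2, κ = 1.0, 1.0, 0.1                 # made-up coupling and decay rates
H = g1 * (dagger(a1) * σ12 + dagger(σ12) * a1) +
    g2 * (dagger(a2) * σ13 + dagger(σ13) * a2)
J = [sqrt(κ) * a1, sqrt(κ) * a2]          # Lindblad dissipation on both modes

ψ0 = nlevelstate(b_atom, 3) ⊗ fockstate(b_mode, 0) ⊗ fockstate(b_mode, 0)
ρ0 = dm(ψ0)                               # dense density matrix

T = collect(range(0.0, 50.0; length=500)) # 500 time steps
tout, ρt = timeevolution.master(T, ρ0, H, J)
```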
Expected Behavior
Based on the simplicity of the simulation, I anticipated that the execution would take less than one minute.
Actual Behavior
The script ran for over 10 minutes on my Windows PC and still had not finished, which is considerably longer than expected. A similar simulation written in Python using QuTiP (three-level.py) completes in just a few seconds.
Environment
Comparison with QuTiP/Python
I've also implemented a comparable simulation in Python using the QuTiP library, which runs significantly faster. I'm attaching the Python script (three-level.py) and the Julia script for reference.
Request
Are there any known issues, optimizations, or alternative approaches that could improve the execution time of my script?
Thank you for your assistance; I look forward to any suggestions or insights you can provide.
Julia script: three-level.jl
Python (qutip) script: three-level.py