
Reduce latency #210

Merged
merged 1 commit into from
Aug 6, 2021

Conversation

timholy (Member) commented Aug 6, 2021

```julia
julia> using Cthulhu

julia> tstart = time(); descend(gcd, (Int, Int)); time() - tstart
```

and hit 'q' while you are waiting.

Here's the timing:

- master: 6.57s
- this branch with just precompilation: 5.86s
- this branch with `compile=min optimize=1`: 4.68s
I haven't run this extensively, so I can't promise there isn't a hit to runtime performance, but from my analysis in #209 it looks like this module is not performance-sensitive. It's a bit at odds with a couple of annotations we have, e.g.

```julia
Base.@aggressive_constprop function lookup(interp::CthulhuInterpreter, mi::MethodInstance, optimize::Bool; allow_no_codeinf::Bool=false)
```

so some discussion may be merited.
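For context on the "compile=min optimize=1" timing above: these settings are presumably applied with Julia's per-module compiler options. A minimal sketch of that mechanism, with a hypothetical module name (this is not the PR's actual diff):

```julia
# Hypothetical sketch: Base.Experimental.@compiler_options (Julia >= 1.6)
# lowers compilation effort for code defined in this module, trading the
# module's own runtime speed for reduced compile-time latency.
module LatencyDemo

if isdefined(Base, :Experimental) &&
   isdefined(Base.Experimental, Symbol("@compiler_options"))
    @eval Base.Experimental.@compiler_options compile=min optimize=1
end

# Code defined below is only lightly optimized, which is often
# acceptable for UI-style code that is not performance-sensitive.
greet(name) = "Hello, $name"

end
```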

@timholy timholy closed this Aug 6, 2021
@timholy timholy reopened this Aug 6, 2021
codecov-commenter commented Aug 6, 2021

Codecov Report

Merging #210 (5b4f238) into master (60d8b54) will decrease coverage by 35.70%.
The diff coverage is n/a.


```
@@             Coverage Diff             @@
##           master     #210       +/-   ##
===========================================
- Coverage   83.47%   47.77%   -35.71%
===========================================
  Files           7        7
  Lines         938      875       -63
===========================================
- Hits          783      418      -365
- Misses        155      457      +302
```

| Impacted Files | Coverage Δ |
|---|---|
| src/Cthulhu.jl | 41.99% <ø> (-31.35%) ⬇️ |
| src/backedges.jl | 26.82% <0.00%> (-68.36%) ⬇️ |
| src/callsite.jl | 10.97% <0.00%> (-67.34%) ⬇️ |
| src/codeview.jl | 55.37% <0.00%> (-38.38%) ⬇️ |
| src/interpreter.jl | 69.44% <0.00%> (-25.16%) ⬇️ |
| src/ui.jl | 78.31% <0.00%> (-9.79%) ⬇️ |
| src/reflection.jl | 78.48% <0.00%> (-6.83%) ⬇️ |

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 60d8b54...5b4f238.

aviatesk (Member) left a comment
Looks great.

> It's a bit at odds with a couple of annotations we have

The only purpose of `Base.@aggressive_constprop` is to narrow down the return type of `lookup` from `Tuple{Union{Nothing,CodeInfo,IRCode},...}` to `Tuple{Union{CodeInfo,IRCode},...}` outside of tests, by making sure inference propagates the constant value of the `allow_no_codeinf` argument.

It may come with an inference cost (but we can't say anything solid without benchmarking, because inference performance isn't really monotonic w.r.t. inference accuracy); still, since we're precompiling the inference cache in this PR, it shouldn't hurt the latency?
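The return-type narrowing described above can be sketched in isolation with a toy function (not Cthulhu's actual `lookup`; this assumes a Julia version that provides `Base.@aggressive_constprop`, which was later superseded by `Base.@constprop :aggressive`):

```julia
# Toy stand-in for `lookup`: the `Nothing` branch is only reachable
# when `allow_nothing` is true.
Base.@aggressive_constprop function lookup_like(x::Int; allow_nothing::Bool=false)
    allow_nothing && return nothing
    return x + 1
end

# Without constant propagation of the keyword argument, inference sees
# `Union{Nothing, Int}`; with aggressive constant propagation, a call
# site passing a literal `false` can be inferred as plain `Int`.
```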

timholy (Member, Author) commented Aug 6, 2021

That's correct, because this package "owns" `lookup`, so all instances will be cached.

I don't know what's up with the coverage report; it appeared shortly after submission, before all workers had finished, and hasn't updated. I don't think it's a permanent state of affairs, but feel free to close and re-open if you want to run it again.
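The ownership point above ("this package owns `lookup`, so all instances will be cached") can be sketched with a hypothetical package module (not Cthulhu's actual code): inferred code produced by `precompile` is stored with the package that owns the method, so a package can pre-infer its own functions at build time.

```julia
# Hypothetical sketch: a package pre-inferring one of its own methods.
module OwnsItsMethods

lookup_like(x::Int, flag::Bool) = flag ? nothing : x + 1

# Runs when the package image is built; because `lookup_like` is owned by
# this module, the inference result for this signature is cached with it.
precompile(lookup_like, (Int, Bool))

end
```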

@aviatesk aviatesk closed this Aug 6, 2021
@aviatesk aviatesk reopened this Aug 6, 2021
aviatesk (Member) commented Aug 6, 2021

Hmmm, very weird, the coverage is still very low.
Well, I generally don't take codecov reports too seriously, so I'd like to merge this PR as is. The latency gain is really valuable imho.

timholy (Member, Author) commented Aug 6, 2021

Sure, we can fix the report later (if the breakage is real, which it may not be).

@timholy timholy merged commit 463aed0 into master Aug 6, 2021
@timholy timholy deleted the teh/latency branch August 6, 2021 17:11
timholy (Member, Author) commented Aug 6, 2021

Julia bug. It turns out that the combination of precompilation and limiting the compiler options seems to break coverage analysis. AFAICT, coverage is essentially limited to the calls that were made at the time the package was built, i.e., the "workload" executed during precompilation.

timholy (Member, Author) commented Aug 6, 2021

Ah, it's JuliaLang/julia#37059
