8000 feat: use dispatch doctor on `apply` by avik-pal · Pull Request #28 · LuxDL/LuxCore.jl · GitHub
This repository was archived by the owner on Nov 4, 2024. It is now read-only.

feat: use dispatch doctor on apply #28

Merged

@avik-pal merged 2 commits into main on Jul 10, 2024

Conversation

@avik-pal (Member) commented on Jun 7, 2024

Unlikely I will merge this... But let's see

julia> using Lux, Random

julia> model = Chain(Dense(2 => 3, tanh), Dense(3 => 2))
Chain(
    layer_1 = Dense(2 => 3, tanh_fast),  # 9 parameters
    layer_2 = Dense(3 => 2),            # 8 parameters
)         # Total: 17 parameters,
          #        plus 0 states.

julia> ps, st = Lux.setup(Xoshiro(), model);

julia> x = Array{Any}(rand(Float32, 2, 10))
2×10 Matrix{Any}:
 0.250114  0.422033  0.331566  0.287223  0.174177  0.0369731  0.725141   0.228544  0.517224  0.167443
 0.10156   0.432903  0.545861  0.446346  0.443447  0.340539   0.0544641  0.066212  0.203659  0.275935

julia> model(x, ps, st)
┌ Warning: DispatchDoctor._Errors.TypeInstabilityWarning: Instability detected in `apply` defined at /mnt/research/lux/LuxCore.jl/src/LuxCore.jl:180 with arguments `(Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}, Matrix{Any}, @NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, @NamedTuple{})`. Inferred to be `Tuple{AbstractMatrix, @NamedTuple{}}`, which is not a concrete type.
└ @ LuxCore ~/.julia/packages/DispatchDoctor/sYnip/src/stabilization.jl:309
(Float32[0.09118694 0.33019328 … 0.16104816 0.24224024; -0.28271726 -0.4778355 … -0.55057544 -0.2418724], (layer_1 = NamedTuple(), layer_2 = NamedTuple()))

julia> x = rand(Float32, 2, 10)
2×10 Matrix{Float32}:
 0.62034   0.986571  0.337403  0.211978  0.0683416  0.739813  0.641151  0.733598  0.621704  0.281616
 0.651012  0.96922   0.606521  0.313661  0.515087   0.152605  0.585971  0.767035  0.584248  0.356263

julia> model(x, ps, st)
(Float32[0.40631223 0.40294093 … 0.375147 0.2957083; -0.5933738 -0.6907558 … -0.60710454 -0.3575806], (layer_1 = NamedTuple(), layer_2 = NamedTuple()))
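For context, DispatchDoctor.jl works by wrapping a function definition in its `@stable` macro: at each call, it checks whether the compiler infers a concrete return type for that argument signature and reports an instability otherwise (as in the warning above for the `Matrix{Any}` input). A minimal sketch, independent of LuxCore — the function `halve` here is a made-up example, not part of the PR:

```julia
using DispatchDoctor: @stable

# `@stable` instruments the function so each call verifies that the inferred
# return type is concrete; `default_mode="warn"` reports instabilities as
# warnings instead of throwing a TypeInstabilityError.
@stable default_mode="warn" function halve(x)
    # Deliberately type-unstable: returns Int for even inputs, Float64 otherwise,
    # so the inferred return type is Union{Float64, Int64} (not concrete).
    return iseven(x) ? x ÷ 2 : x / 2
end

halve(4)   # warns about the instability, then returns the value as usual
halve(3)
```

The PR applies the same idea to `LuxCore.apply`, so type instabilities in a model's forward pass surface as warnings rather than silent performance losses.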

@avik-pal force-pushed the ap/dispatch_doctor branch from 39d0cdf to efc6d3e on June 7, 2024 22:34
codecov bot commented on Jun 7, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 83.15%. Comparing base (caa7a32) to head (519485d).

Additional details and impacted files
@@            Coverage Diff             @@
##             main      #28      +/-   ##
==========================================
+ Coverage   82.97%   83.15%   +0.17%     
==========================================
  Files           1        1              
  Lines          94       95       +1     
==========================================
+ Hits           78       79       +1     
  Misses         16       16              


@avik-pal force-pushed the ap/dispatch_doctor branch from 4339d8c to 6797490 on June 7, 2024 22:51
@avik-pal force-pushed the ap/dispatch_doctor branch from 6797490 to fe7e81d on June 9, 2024 19:58
@avik-pal force-pushed the ap/dispatch_doctor branch from fe7e81d to 691fb4c on July 9, 2024 02:25
@avik-pal changed the title from "Check what happens if we enable Dispatch Checks at apply" to "feat: use dispatch doctor on apply" on Jul 9, 2024
@avik-pal force-pushed the ap/dispatch_doctor branch 2 times, most recently from 64151c6 to 919b8cc on July 9, 2024 14:15
@avik-pal force-pushed the ap/dispatch_doctor branch from 919b8cc to 519485d on July 10, 2024 00:30
@avik-pal merged commit 003c8da into main on Jul 10, 2024 (11 of 14 checks passed)
@avik-pal deleted the ap/dispatch_doctor branch on July 10, 2024 01:31