Hi, you were right: I misread the passage in the article. The replication was successful. Thank you for your help 🙏

Here is a table with some experiments I ran during replication; it might be useful to somebody 😉

| LibriSpeech clean-100 | (Freeze)_(Last) | (Freeze)_(All) | (FineTuned)_(Last) | (FineTuned)_(All) |
| --- | --- | --- | --- | --- |
| Linear | 87.28 / 94.49 | 24.33 / 39.61 | 7.56 / 18.48 | 7.84 / 19.02 |
| BLSTM | 18.52 / 41.34 | 9.92 / 24.09 | 6.78 / 17.54 | 7.38 / 18.60 |
| Transformer | 31.50 / 52.87 | 11.00 / 24.87 | 7.17 / 17.98 | 7.46 / 18.60 |
- Legend:
  - BEST-RQ encoder: 12xConformer, GlobNorm, fp32, cb1x320x8192 ... roughly the setup from the recipe (see the quantizer sketch after this list).
  - Header: (Encoder)_(Decoder)
    - (Encoder) == Freeze or FineTuned
    - (Decoder) == Last or All (output of the last layer, or of each layer …)
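
In case the `cb1x320x8192` shorthand is unclear: here is a minimal, hypothetical sketch of the BEST-RQ random-projection quantizer it describes (1 codebook, 320-dim codes, 8192 entries). This follows the BEST-RQ paper rather than the exact recipe code; `feat_dim` and the l2-normalisation are my assumptions:

```python
import torch
import torch.nn.functional as F

# Hypothetical sketch of the BEST-RQ random-projection quantizer behind
# "cb1x320x8192" (1 codebook, 320-dim codes, 8192 entries). The input
# feature size `feat_dim` and the l2-normalisation follow the BEST-RQ
# paper; they are assumptions, not copied from the recipe.
class RandomProjectionQuantizer(torch.nn.Module):
    def __init__(self, feat_dim: int, code_dim: int = 320, num_codes: int = 8192):
        super().__init__()
        # Projection and codebook are random and frozen (buffers, no gradients).
        proj = torch.empty(feat_dim, code_dim)
        torch.nn.init.xavier_uniform_(proj)
        self.register_buffer("projection", proj)
        self.register_buffer(
            "codebook", F.normalize(torch.randn(num_codes, code_dim), dim=-1)
        )

    @torch.no_grad()
    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Project, normalise, and pick the nearest code by cosine similarity;
        # the returned indices are the masked-prediction targets.
        proj = F.normalize(feats @ self.projection, dim=-1)
        return (proj @ self.codebook.t()).argmax(dim=-1)
```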
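
And a rough sketch of how the (Encoder)_(Decoder) cells of the table map to code, assuming a PyTorch encoder that returns frame-level hidden states; `encoder(feats)` and the linear head are illustrative interfaces, not the recipe's API (the BLSTM/Transformer rows just swap the head):

```python
import torch

# Rough sketch of one (Encoder)_(Decoder) cell from the table. The encoder
# stands in for the pretrained BEST-RQ Conformer; `encoder(feats)` returning
# hidden states of size `enc_dim` is an assumed interface.
class Probe(torch.nn.Module):
    def __init__(self, encoder: torch.nn.Module, enc_dim: int,
                 num_tokens: int, freeze: bool = True):
        super().__init__()
        self.encoder, self.freeze = encoder, freeze
        if freeze:
            # (Freeze) columns: pretrained weights stay fixed; only the
            # downstream decoder head is trained.
            for p in self.encoder.parameters():
                p.requires_grad = False
        # Linear row: a single projection to the output tokens; the BLSTM
        # and Transformer rows replace this head with a deeper decoder.
        self.decoder = torch.nn.Linear(enc_dim, num_tokens)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        if self.freeze:
            with torch.no_grad():       # no gradients into the encoder
                h = self.encoder(feats)
        else:
            h = self.encoder(feats)     # (FineTuned) columns: full backprop
        return self.decoder(h)          # logits, e.g. for a CTC loss
```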
