From cslt Wiki
Revision as of 05:32, 28 March 2020 by Cslt
- Back in 2017, we set our goal of deep speech factorization. The first paper was published at ICASSP 2018:
- Lantian Li, Dong Wang, Yixiang Chen, Ying Shi, Zhiyuan Tang, "DEEP FACTORIZATION FOR SPEECH SIGNAL", ICASSP 2018. 
- We noticed the problem of softmax-based training, namely the discarding of the output layer:
- Lantian Li, Zhiyuan Tang, Dong Wang, "FULL-INFO TRAINING FOR DEEP SPEAKER FEATURE LEARNING", ICASSP 2018. 
- 2018/12/26, proposed the idea of deep statistical speaker representation, based on a VAE.
- We noticed the impact of the irregular distribution of deep speaker vectors, and explored normalization approaches:
- Yang Zhang, Lantian Li, Dong Wang, "VAE-based regularization for deep speaker embedding", Interspeech 2019.
- 2019/04/20, "Normalization in speaker embedding", speaker recognition workshop, Kunshan, Shanghai.
- 2019/07/17, "Deep Feature Learning and Normalization for Speaker Recognition", report at a summer school in India.
- 2019/08/14, presented the first proposal that uses flows to model deep speaker features (report in a Huawei group discussion).
- 2019/10/27, presented the initial idea of using flows to perform factorization, CSLT weekly meeting.
- 2019/11/12, CYQ started to work on DNF, using a subspace of the dimensions to discriminate speakers [cvss 714].
- 2019/12/20, I started to work on NF with constrained training. Achieved a better understanding of LDA. [cvss 741]
- 2020/1/23, I noticed a bug in the DNF code: the residual space was in fact trained, so it was not the true dimension-split DNF we had hoped for. [cvss 741]
- 2020/1/27, I conjectured the normalization role of DNF, and asked YQ to perform a full-space experiment. The results were good.
- 2020/1/28, I confirmed the normalization role of LDA for x-vectors. This forms the basic argument of the deep norm paper. [cvss 741]
- 2020/2/10, Dong Wang, "Deep Generative Models for Discriminative Tasks", CSLT weekly meeting. Presented DNF.
- 2020/2/18, Deep norm paper submitted to IEEE Transactions.
- 2020/2/24, I started working on optimal scoring for SRE, and established the NL theory. The paper was submitted to APSIPA Transactions on March 17.
- 2020/3/21, I proposed the NDA model, and completed its verification in 3 hours. This model can be used for scoring.
- 2020/3/25, I designed the VAE-NF model, which uses an NF as the generation network of the VAE. It can produce more informative latent codes, but the theory is not yet complete.
- 2020/3/27, I extended NDA to a neural linear Gaussian model.
- 2020/3/28, I extended NDA to a neural Bayesian model.
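The normalization thread running through the entries above rests on mapping irregularly distributed speaker vectors to a Gaussian latent space with a flow. As a minimal sketch only (not the DNF model itself): a purely linear flow with Gaussian likelihood reduces to classical whitening, which has a closed-form solution. The data, dimensions, and variable names here are synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "x-vectors": 500 samples, 8-dim, with a non-trivial
# covariance and shifted mean (stand-ins for real speaker embeddings).
A = rng.normal(size=(8, 8))
x = rng.normal(size=(500, 8)) @ A + 3.0

# A linear flow z = W (x - mu) has constant log-det |W|; maximizing the
# standard-normal likelihood of z gives the classic whitening solution.
mu = x.mean(axis=0)
cov = np.cov(x - mu, rowvar=False)
W = np.linalg.cholesky(np.linalg.inv(cov)).T   # so that W^T W = cov^{-1}
z = (x - mu) @ W.T

# The transformed vectors have identity covariance, i.e. they are
# "Gaussianized" up to second-order statistics.
print(np.allclose(np.cov(z, rowvar=False), np.eye(8), atol=1e-6))  # True
```

A real DNF replaces the single linear map with an invertible nonlinear network trained by maximum likelihood, but the whitening special case shows why such a flow acts as a normalizer of the embedding space.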