2019-01-16


Revision as of 03:42, 16 January 2019

People | Last Week | This Week | Task Tracking (DeadLine)
Yibo Liu
Xiuqi Jiang
  Last week:
  • Focused back on quatrain generation, thinking about the drawbacks of the current model.
  • Tried to weaken the attention mechanism between sentences.
  This week:
  • Add a VAE to the model and try to generate instead of predicting (see the sketch after this table).
Jiayao Wu
Zhaodi Qi
Jiawei Yu
Yunqi Cai
  • 1. Ran through the BERT model. 2. Studied the details of the BERT model (see the sketch after this table).
Dan He
Yang Zhang
  • Exam week (I will finish all my exams on Friday night).
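
Below is a minimal PyTorch sketch of the "generate instead of predict" idea in Xiuqi Jiang's plan: encode the previous quatrain line into a latent Gaussian, sample z with the reparameterization trick, and decode the next line from z, training with a reconstruction loss plus a KL term. The vocabulary size, dimensions, GRU encoder/decoder, and fixed line length are illustrative assumptions, not the group's actual model.

# A sketch of VAE-style next-line generation for quatrains (assumed architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, LATENT = 5000, 128, 256, 64  # assumed sizes

class LineVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.to_mu = nn.Linear(HID, LATENT)
        self.to_logvar = nn.Linear(HID, LATENT)
        self.z_to_h = nn.Linear(LATENT, HID)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, prev_line, next_line):
        # Encode the previous line into a latent Gaussian q(z | prev_line).
        _, h = self.encoder(self.emb(prev_line))
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick: z = mu + sigma * eps, so sampling stays differentiable.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Decode the next line conditioned on z (teacher forcing on next_line).
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        dec, _ = self.decoder(self.emb(next_line), h0)
        logits = self.out(dec)
        # Reconstruction loss (predict token t+1 from prefix) plus KL regularizer.
        rec = F.cross_entropy(logits[:, :-1].reshape(-1, VOCAB),
                              next_line[:, 1:].reshape(-1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return rec + kl

if __name__ == "__main__":
    model = LineVAE()
    prev_line = torch.randint(0, VOCAB, (8, 7))  # batch of 7-character lines (assumed)
    next_line = torch.randint(0, VOCAB, (8, 7))
    print(model(prev_line, next_line).item())

At generation time the decoder would be driven from a z sampled from the prior rather than from the encoder, which is what distinguishes generating from predicting here.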
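A minimal sketch of running a sentence through a pretrained BERT model, as in Yunqi Cai's item. It assumes the Hugging Face transformers package and the bert-base-chinese checkpoint; the report does not say which implementation or checkpoint was actually used.

# Run one sentence through a pretrained BERT encoder (assumed library/checkpoint).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")
model.eval()

inputs = tokenizer("白日依山尽", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one contextual vector per token: (batch, seq_len, hidden).
print(outputs.last_hidden_state.shape)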