NLP Status Report 2016-11-21
From cslt Wiki
Latest revision as of 01:41, 21 November 2016
{|
! Date !! People !! Last Week !! This Week
|-
| 2016/11/21 || Yang Feng ||
* ran experiments of rnng+mn [report]
* used top-k for memory, under training (a sketch of the top-k read is given below the table)
* wrote the proposal
* discussed the details with Andy
|
* get the result of top-k
* try a bigger memory
* coding work
* try syntax-based TM
|-
| || Jiyuan Zhang || ||
|-
| || Andi Zhang || ||
|-
| || Shiyue Zhang ||
* ran rnng on MKL successfully, which can double or triple the speed; revised the RNNG User Guide
* reran the original model and got the final result of 92.32
* reran the wrong-memory models; still running
|
|-
| || Guli || ||
|}
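For reference, a minimal sketch of the "top-k for memory" idea mentioned in Yang Feng's row: score every memory slot against a query, keep only the k highest-scoring slots, and read back a softmax-weighted sum of just those slots. All names here (topk_memory_read, query, memory, k) are hypothetical; this illustrates the general technique under assumed dot-product scoring and is not the project's actual rnng+mn code.

<pre>
# Illustrative sketch of a top-k memory read (hypothetical names, not the rnng+mn implementation).
import numpy as np

def topk_memory_read(query, memory, k=5):
    """query: (d,) vector; memory: (n, d) matrix of stored vectors."""
    scores = memory @ query                  # dot-product relevance of each slot, shape (n,)
    top_idx = np.argsort(scores)[-k:]        # indices of the k highest-scoring slots
    top_scores = scores[top_idx]
    weights = np.exp(top_scores - top_scores.max())
    weights /= weights.sum()                 # softmax over the k kept slots only
    return weights @ memory[top_idx]         # (d,) weighted read vector

# usage: read from a random 100-slot memory with a random query
rng = np.random.default_rng(0)
mem = rng.standard_normal((100, 64))
q = rng.standard_normal(64)
print(topk_memory_read(q, mem, k=5).shape)   # (64,)
</pre>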