NLP Status Report 2016-12-19

Changes in this revision (Yang Feng, this week):
  • s2smn: finish the manual of nmt tensorflow
  • s2smn: finish the code of adding mn
  • Huilan: system submission
  • Huilan: handover

Revision as of 05:11, 19 December 2016

Date: 2016/12/12

Yang Feng
  Last week:
  • s2smn: wrote the manual of s2s with tensorflow [nmt-manual]
  • wrote part of the code of mn
  • wrote the manual of Moses [moses-manual]
  • Huilan: fixed the problem of syntax-based translation
  This week:
  • sort out the system and corresponding documents
Jiyuan Zhang
  Last week:
  • attempted to use the memory model to improve the attention model, with poor results
  • with the vernacular as input, generated poems with the local attention model [1]
  • modified the working mechanism of the memory model (top-1 to average)
  This week:
  • help Andi
  • improve the poem model
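The top-1 → average change to memory retrieval can be sketched as follows; the function names and shapes are illustrative assumptions, not the actual model code:

```python
import numpy as np

def retrieve_top1(memory, scores):
    # Top-1 retrieval: return only the highest-scoring memory slot;
    # all other slots are ignored entirely.
    return memory[int(np.argmax(scores))]

def retrieve_average(memory):
    # Average retrieval: every memory slot contributes equally,
    # smoothing out a possibly wrong hard top-1 choice.
    return memory.mean(axis=0)

memory = np.array([[1.0, 0.0],   # 3 memory slots, dim 2
                   [0.0, 1.0],
                   [1.0, 1.0]])
scores = np.array([2.0, 0.5, 0.1])

top1 = retrieve_top1(memory, scores)  # only slot 0
avg = retrieve_average(memory)        # mean over all slots
```

A softmax-weighted mean over the scores would be a middle ground between these two extremes.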
Andi Zhang
  Last week:
  • prepared a paraphrase data set enumerated from a previous one (ignoring interjections like "啊呀哈")
  • worked on coding the bidirectional model under TensorFlow; ran into a NaN problem
  This week:
  • ignore the NaN problem for now and run the model on the same data set used in Theano
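One common source of NaN losses in this kind of training is log(0) inside the cross-entropy when a softmax output saturates; a minimal sketch of the failure and the usual clipping workaround (illustrative only, not the actual code):

```python
import numpy as np

def cross_entropy_unsafe(probs, targets):
    # log(0) = -inf, and 0 * -inf = nan, which then propagates
    # through the gradients and poisons the whole model.
    with np.errstate(divide="ignore", invalid="ignore"):
        return -np.sum(targets * np.log(probs))

def cross_entropy_safe(probs, targets, eps=1e-8):
    # Clipping probabilities away from 0 keeps the loss finite.
    return -np.sum(targets * np.log(np.clip(probs, eps, 1.0)))

probs = np.array([1.0, 0.0])   # a fully saturated softmax output
targets = np.array([1.0, 0.0])

unsafe = cross_entropy_unsafe(probs, targets)  # nan
safe = cross_entropy_safe(probs, targets)      # finite
```

Computing the loss from logits in one fused op (rather than from softmax outputs) is the other standard remedy.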
Shiyue Zhang
  Last week:
  • finished the t-SNE pictures and discussed them with the teachers
  • tried experiments with 28-dim mem, but found almost all of them converged to the baseline
  • returned to 384-dim mem, which is still slightly better than the baseline
  • found the problem with the action mem: a one-hot vector is not appropriate
  • [report]
  This week:
  • change the one-hot vector to (0, -10000.0, -10000.0, ...)
  • try a 1-dim gate
  • try max cos
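Assuming the action-mem vector enters the model as additive logits before a softmax (a hypothetical setup, not confirmed by the report), the planned change from a hard one-hot to (0, -10000.0, ...) can be sketched as:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - x.max())
    return e / e.sum()

# A one-hot vector used directly as logits only mildly prefers
# the chosen entry: softmax([1, 0, 0]) is far from one-hot.
one_hot = np.array([1.0, 0.0, 0.0])
soft = softmax(one_hot)

# With 0 for the chosen entry and a large negative value elsewhere,
# the softmax output becomes effectively one-hot while keeping the
# same additive-logit interface.
masked = np.array([0.0, -10000.0, -10000.0])
sharp = softmax(masked)
```

This is the same large-negative-mask trick commonly used to block attention over padding positions.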
Guli
  • install and run Moses
  • prepare the thesis report
  • read papers about transfer learning and handling OOV
Peilun Xiao
  • read a paper about document classification with GMM distributions of word vectors and tried to code it in Python
  • used LDA to reduce the dimensionality of the text in R52 and R8 and compared classification performance
  • used LDA to reduce the dimensionality of the text in 20news and WebKB
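The LDA-then-classify pipeline can be sketched with scikit-learn; the toy documents, label names, and parameters below are stand-ins, not the actual R52/R8/20news/WebKB setups:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for the real datasets.
docs = [
    "stock market trading news",
    "football match score goal",
    "market prices rise on trading news",
    "team wins the football match",
]
labels = [0, 1, 0, 1]  # 0 = finance, 1 = sports (toy labels)

pipe = make_pipeline(
    CountVectorizer(),                         # bag-of-words counts
    LatentDirichletAllocation(n_components=2,  # doc-topic proportions
                              random_state=0), # as low-dim features
    LogisticRegression(),
)
pipe.fit(docs, labels)
pred = pipe.predict(["goal in the match"])
```

The LDA step replaces the high-dimensional term counts with a short topic-proportion vector, which is the dimensionality reduction being compared here.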