Difference between revisions of "NLP Status Report 2016-12-26"


Revision as of 03:19, 26 December 2016

Date People Last Week This Week
2016/12/26 Yang Feng
  • s2smn: wrote the manual for s2s with TensorFlow [nmt-manual]
  • wrote part of the code for mn.
  • wrote the manual for Moses [moses-manual]
  • Huilan: fixed the problem in syntax-based translation.
  • sort out the system and the corresponding documents.
Jiyuan Zhang
  • coded tone_model, but ran into some trouble
  • ran global_attention_model, which decodes four or five sentences generated by the local_attention model
  • improve poem model
Andi Zhang
  • wrote code to output the encoder outputs and the corresponding source & target sentences (ids in the dictionaries)
  • coded a script for BLEU scoring, which tests the five checkpoints auto-created by the training process and saves the one with the best performance
  • extract encoder outputs
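The checkpoint-selection step described above can be sketched as follows. This is a minimal illustration, not the actual script: the checkpoint names, the `decode_fn` hook, and the simplified smoothed BLEU (uniform n-gram weights, add-one smoothing, brevity penalty) are all assumptions.

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Counter of all n-grams (as tuples) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Simplified BLEU for one sentence pair: geometric mean of smoothed
    n-gram precisions times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = ngram_counts(candidate, n)
        ref = ngram_counts(reference, n)
        overlap = sum((cand & ref).values())        # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        # add-one smoothing so one empty n-gram order does not zero the score
        precisions.append((overlap + 1) / (total + 1))
    log_prec = sum(math.log(p) for p in precisions) / max_n
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(log_prec)

def best_checkpoint(checkpoints, decode_fn, source, reference):
    """Decode `source` with each checkpoint, score against `reference`,
    and return the checkpoint with the highest BLEU."""
    scored = {ckpt: bleu(decode_fn(ckpt, source), reference)
              for ckpt in checkpoints}
    return max(scored, key=scored.get)
```

In the real setup `decode_fn` would restore a TensorFlow checkpoint and run the decoder; here it is just a callable so the selection logic is testable in isolation.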
Shiyue Zhang
  • tried adding the true action info when training the gate, which gave better results than without true actions, but still not very good.
  • tried different scale vectors and found that setting values >= -5000 works well
  • tried replacing cos with a plain inner product; the inner product works better than cos.
  • [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/9f/RNNG%2Bmm_experiment_report.pdf report]
  • read 3 papers: [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/92/DEEP_BIAFFINE_ATTENTION_FOR_NEURAL_DEPENDENCY_PARSING.pdf] [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/a/aa/Simple_and_Accurate_Dependency_Parsing_Using_Bidirectional_LSTM_Feature_Representations.pdf] [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/f/fb/Bi-directional_Attention_with_Agreement_for_Dependency_Parsing.pdf]
  • trying joint training, which has run into an optimization problem.
  • try the joint training
  • read more papers and write a summary
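The cos-to-inner-product change reported above amounts to dropping the norm factors from the similarity score, so magnitude information is kept. A minimal pure-Python sketch with toy vectors (not the actual gate features):

```python
import math

def dot(u, v):
    """Plain inner product: keeps magnitude information."""
    return sum(a * b for a, b in zip(u, v))

def cos(u, v):
    """Cosine similarity: the inner product with both norms divided out."""
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

u = [1.0, 2.0, 3.0]
v = [2.0, 4.0, 6.0]  # same direction as u, twice the magnitude
w = [3.0, 2.0, 1.0]  # a different direction

# cos is blind to magnitude: cos(u, v) == 1.0 even though v is "stronger";
# dot distinguishes them: dot(u, v) is twice dot(u, u).
```

Whether the extra magnitude signal helps depends on what the vectors encode; the report linked above has the actual comparison.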
Guli
  • read papers about transfer learning and handling OOV words
  • conducted comparative test
  • writing survey
  • complete the first draft of the survey
Peilun Xiao
  • learned the TF-IDF algorithm
  • coded the TF-IDF algorithm in Python, but found it did not work well
  • tried using a small dataset to test the program
  • use sklearn's TF-IDF to test the dataset
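For reference, the TF-IDF weighting mentioned above can be sketched in plain Python. This is an approximation of what sklearn computes, not the actual script: it uses the smoothed IDF that `TfidfVectorizer` applies by default, idf(t) = ln((1 + n) / (1 + df(t))) + 1, but a length-normalized TF, whereas sklearn uses raw counts followed by L2 normalization.

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights for a list of tokenized documents.
    IDF: sklearn's smoothed form, ln((1 + n) / (1 + df(t))) + 1.
    TF: term count divided by document length."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))            # document frequency: one count per doc
    weights = []
    for doc in docs:
        tf = Counter(doc)
        length = len(doc)
        weights.append({
            term: (count / length) * (math.log((1 + n) / (1 + df[term])) + 1)
            for term, count in tf.items()
        })
    return weights
```

A term appearing in every document gets IDF exactly 1 under this smoothing, while rarer terms are weighted up, which is usually the behavior one checks first when a hand-rolled TF-IDF "does not work well".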