FreeNeb model training

From cslt Wiki

Revision as of 05:03, 25 June 2018

== AM ==

{| class="wikitable"
! Task !! Priority !! Subtasks !! Deadline !! Status track
|-
| Japanese 16k AM training with data augmentation
| 1
|
# Resource file preparation and release.
# Data preparation.
# Online model training.
# Local model training.
| May.18 (<font color="red">Delay to May.31</font>)
|
# Finish.
# Finish.
# Finish.
# Finish.
|-
| Chinese 16k children AM training
| 1
|
# Data preparation.
# Online model training.
| May.31
| Finish
|-
| Uyghur 16k AM training with data augmentation
| 3
|
# Resource file preparation and release.
# Data preparation.
# Online model training.
| May.25 (<font color="red">Delay ....</font>)
| Holding
|-
| Chinese 8k AM training
| 2
|
# Experiments on small dataset.
# Data preparation.
# Model training.
| June.30 (<font color="red">Delay ....</font>)
| Holding
|-
| Chinese 16k AM training with data augmentation
| 1
|
# Resource file preparation and release.
# Data preparation.
# Experiments on small dataset.
# Online model training.
# Local model training.
| July.31
|
# In progress
# In progress
# Holding
# Holding
# Holding
|-
| Chinese 16k d-vector based AM training
| 3
|
# Recipe preparation and experiments on small dataset.
# Data preparation.
# Model training.
| June.30
| Holding
|}


== LM ==

{| class="wikitable"
! Task !! Priority !! Subtasks !! Deadline !! Status track
|-
| Chinese children education domain LM training
| 1
|
# Children education domain vocabulary list construction.
# Corpus preparation.
# LM training.
| May.31
|
# Finish.
# Finish.
# In progress.
|-
| Chinese general domain LM training
| 2
|
# Current vocabulary list collation.
# New vocabulary list construction.
# Corpus preparation.
# LM training.
| June.30
|
# Done.
# In progress.
# Done.
# In progress.
|-
| Chinese domain-specific vocabulary list construction
| 3
|
# Financial domain keyword construction.
| July.31
| Holding.
|}