Inference

Release models

After training a model, you may want to release it for inference only by using the tools/release_model.lua script. A released model takes less space on disk and is compatible with both CPU and GPU translation.

th tools/release_model.lua -model model.t7 -gpuid 1

By default, it will produce a model_release.t7 file. For advanced options, see th tools/release_model.lua -h.
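
For example, a released model can then be used for CPU translation with translate.lua (the file names below are placeholders):

th translate.lua -model model_release.t7 -src src-test.txt -output pred.txt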

Warning

A GPU is required to load non-released models, and released models can no longer be used for training.

Inference engine

CTranslate is a C++ implementation of translate.lua for integration into existing products. Take a look at the GitHub project for more information.
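
As a rough illustration, a minimal C++ program using CTranslate could look like the sketch below. The header name, the onmt::TranslatorFactory class, and the translate method are taken from the project's README and may differ between versions, so refer to the GitHub project for the exact API.

#include <iostream>
#include <onmt/onmt.h>

int main()
{
  // Load a released model (the path is a placeholder).
  auto translator = onmt::TranslatorFactory::build("model_release.t7");

  // Translate a single pre-tokenized sentence.
  std::cout << translator->translate("Hello world !") << std::endl;

  return 0;
}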