
12/15/2019 A Gentle Introduction to Neural Machine Translation

A Gentle Introduction to Neural Machine Translation


by Jason Brownlee on December 29, 2017 in Deep Learning for Natural Language Processing


Last Updated on August 7, 2019

One of the earliest goals for computers was the automatic translation of text from one language to another.

Automatic or machine translation is perhaps one of the most challenging artificial intelligence tasks given
the fluidity of human language. Classically, rule-based systems were used for this task, which were
replaced in the 1990s with statistical methods. More recently, deep neural network models have achieved
state-of-the-art results in a field that is aptly named neural machine translation.

In this post, you will discover the challenge of machine translation and the effectiveness of neural machine
translation models.

After reading this post, you will know:

Machine translation is challenging given the inherent ambiguity and flexibility of human language.
Statistical machine translation replaces classical rule-based systems with models that learn to
translate from examples.
Neural machine translation models fit a single model rather than a pipeline of fine-tuned models and
currently achieve state-of-the-art results.

Discover how to develop deep learning models for text classification, translation, photo captioning and
more in my new book, with 30 step-by-step tutorials and full source code.

Let’s get started.


https://machinelearningmastery.com/introduction-neural-machine-translation/ 1/14
Photo by Fabio Achilli, some rights reserved.

What is Machine Translation?


Machine translation is the task of automatically converting source text in one language to text in another
language.

In a machine translation task, the input already consists of a sequence of symbols in some
 language, and the computer program must convert this into a sequence of symbols in another
language.

— Page 98, Deep Learning, 2016.

Given a sequence of text in a source language, there is no one single best translation of that text to
another language. This is because of the natural ambiguity and flexibility of human language. This makes
the challenge of automatic machine translation difficult, perhaps one of the most difficult in artificial
intelligence:

The fact is that accurate translation requires background knowledge in order to resolve
 ambiguity and establish the content of the sentence.

— Page 21, Artificial Intelligence, A Modern Approach, 3rd Edition, 2009.

Classical machine translation methods often involve rules for converting text in the source language to the
target language. The rules are often developed by linguists and may operate at the lexical, syntactic, or
semantic level. This focus on rules gives the name to this area of study: Rule-based Machine Translation,
or RBMT.

RBMT is characterized with the explicit use and manual creation of linguistically informed rules
 and representations.

— Page 133, Handbook of Natural Language Processing and Machine Translation, 2011.

The key limitations of the classical machine translation approaches are both the expertise required to
develop the rules, and the vast number of rules and exceptions required.


What is Statistical Machine Translation?


Statistical machine translation, or SMT for short, is the use of statistical models that learn to translate text
from a source language to a target language given a large corpus of examples.

This task of using a statistical model can be stated formally as follows:

Given a sentence T in the target language, we seek the sentence S from which the translator
 produced T. We know that our chance of error is minimized by choosing that sentence S that is
most probable given T. Thus, we wish to choose S so as to maximize Pr(S|T).

— A Statistical Approach to Machine Translation, 1990.


This formal specification makes the maximizing of the probability of the output sequence given the input
sequence of text explicit. It also makes explicit the notion of there being a suite of candidate translations,
and the need for a search process or decoder to select the one most likely translation from the model's
output probability distribution.

Given a text in the source language, what is the most probable translation in the target
 language? […] how should one construct a statistical model that assigns high probabilities to
“good” translations and low probabilities to “bad” translations?

— Page xiii, Syntax-based Statistical Machine Translation, 2017.
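To make the Pr(S|T) decision rule concrete, here is a toy noisy-channel decoder in Python. The sentences and probability tables are invented purely for illustration; a real SMT system estimates Pr(S) with a language model and Pr(T|S) with a translation model, both learned from a large parallel corpus.

```python
import math

# Toy noisy-channel decoding: choose the source sentence S that maximizes
# Pr(S|T), which by Bayes' rule is proportional to Pr(T|S) * Pr(S).
# All sentences and probabilities here are made up for illustration.
language_model = {                    # Pr(S): how fluent is the candidate?
    "the house is small": 0.4,
    "house the small is": 0.01,
}
translation_model = {                 # Pr(T|S): how well does S explain T?
    ("das haus ist klein", "the house is small"): 0.5,
    ("das haus ist klein", "house the small is"): 0.5,
}

def decode(target):
    """Search the candidate sources and return the most probable one."""
    best, best_score = None, -math.inf
    for source, p_s in language_model.items():
        score = translation_model.get((target, source), 0.0) * p_s
        if score > best_score:
            best, best_score = source, score
    return best

print(decode("das haus ist klein"))  # -> the house is small
```

Note how the language model breaks the tie: both word orders explain the target equally well, but only one is fluent English, which is exactly the division of labor in the SMT formulation above.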

The approach is data-driven, requiring only a corpus of examples with both source and target language
text. This means linguists are no longer required to specify the rules of translation.

This approach does not need a complex ontology of interlingua concepts, nor does it need
 handcrafted grammars of the source and target languages, nor a hand-labeled treebank. All it
needs is data—sample translations from which a translation model can be learned.

— Page 909, Artificial Intelligence, A Modern Approach, 3rd Edition, 2009.

Quickly, the statistical approach to machine translation outperformed the classical rule-based methods to
become the de-facto standard set of techniques.

Since the inception of the field at the end of the 1980s, the most popular models for statistical
 machine translation […] have been sequence-based. In these models, the basic units of
translation are words or sequences of words […] These kinds of models are simple and
effective, and they work well for many language pairs

— Syntax-based Statistical Machine Translation, 2017.

The most widely used techniques were phrase-based and focused on translating sub-sequences of the
source text piecewise.

Statistical Machine Translation (SMT) has been the dominant translation paradigm for decades.
 Practical implementations of SMT are generally phrase-based systems (PBMT) which translate
sequences of words or phrases where the lengths may differ

— Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine
Translation, 2016.

Although effective, statistical machine translation methods suffered from a narrow focus on the phrases
being translated, losing the broader nature of the target text. The hard focus on data-driven approaches
also meant that methods may have ignored important syntax distinctions known by linguists. Finally, the
statistical approaches required careful tuning of each module in the translation pipeline.


What is Neural Machine Translation?


Neural machine translation, or NMT for short, is the use of neural network models to learn a statistical
model for machine translation.

The key benefit of the approach is that a single system can be trained directly on source and target text, no
longer requiring the pipeline of specialized systems used in statistical machine translation.

Unlike the traditional phrase-based translation system which consists of many small sub-
 components that are tuned separately, neural machine translation attempts to build and train a
single, large neural network that reads a sentence and outputs a correct translation.

— Neural Machine Translation by Jointly Learning to Align and Translate, 2014.

As such, neural machine translation systems are said to be end-to-end systems as only one model is
required for the translation.

The strength of NMT lies in its ability to learn directly, in an end-to-end fashion, the mapping from
 input text to associated output text.

— Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine
Translation, 2016.

Encoder-Decoder Model
Multilayer Perceptron neural network models can be used for machine translation, although the models are
limited by a fixed-length input sequence where the output must be the same length.

These early models have been greatly improved upon recently through the use of recurrent neural
networks organized into an encoder-decoder architecture that allows for variable-length input and output
sequences.

An encoder neural network reads and encodes a source sentence into a fixed-length vector. A
 decoder then outputs a translation from the encoded vector. The whole encoder–decoder
system, which consists of the encoder and the decoder for a language pair, is jointly trained to
maximize the probability of a correct translation given a source sentence.

— Neural Machine Translation by Jointly Learning to Align and Translate, 2014.

Key to the encoder-decoder architecture is the ability of the model to encode the source text into an
internal fixed-length representation called the context vector. Interestingly, once encoded, different
decoding systems could be used, in principle, to translate the context into different languages.


… one model first reads the input sequence and emits a data structure that summarizes the
 input sequence. We call this summary the “context” C. […] A second model, usually an RNN,
then reads the context C and generates a sentence in the target language.

— Page 461, Deep Learning, 2016.
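The encode-then-decode control flow can be sketched in plain Python. This is not any particular system's implementation: the word "embeddings" are random, the encoder is a simple mean rather than an RNN, and the hand-written step function stands in for a trained decoder network. What the sketch does show is the key property of the architecture: sentences of any length are reduced to one fixed-length context vector, and the decoder emits the translation one word at a time conditioned on that vector.

```python
import random

random.seed(0)
EMBED_DIM = 4  # size of the fixed-length context vector (arbitrary choice)

# Made-up source vocabulary with random "embeddings"; a real system
# learns these, along with the decoder weights, from parallel text.
src_vocab = ["das", "haus", "ist", "klein"]
embeddings = {w: [random.uniform(-1, 1) for _ in range(EMBED_DIM)]
              for w in src_vocab}

def encode(sentence):
    """Encoder: reduce a variable-length sentence to ONE fixed-length
    context vector (here a mean of embeddings; an RNN in practice)."""
    vecs = [embeddings[w] for w in sentence]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def greedy_decode(context, step_fn, eos="</s>", max_len=10):
    """Decoder loop: ask step_fn for the next word given the context and
    the words generated so far, until end-of-sentence."""
    out = []
    while len(out) < max_len:
        word = step_fn(context, out)
        if word == eos:
            break
        out.append(word)
    return out

def toy_step(context, generated):
    """Hand-written stand-in for a trained decoder network."""
    target = ["the", "house", "is", "small", "</s>"]
    return target[len(generated)]

context = encode(["das", "haus", "ist", "klein"])
print(greedy_decode(context, toy_step))  # -> ['the', 'house', 'is', 'small']
```

Whatever the length of the input sentence, `encode` returns a vector of exactly `EMBED_DIM` numbers, which is both the architecture's elegance and, for long sentences, its bottleneck.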

For more on the Encoder-Decoder recurrent neural network architecture, see the post:

Encoder-Decoder Long Short-Term Memory Networks

Encoder-Decoders with Attention


Although effective, the Encoder-Decoder architecture has problems with long sequences of text to be
translated.

The problem stems from the fixed-length internal representation that must be used to decode each word in
the output sequence.

The solution is the use of an attention mechanism that allows the model to learn where to place attention
on the input sequence as each word of the output sequence is decoded.

Using a fixed-sized representation to capture all the semantic details of a very long sentence […]
 is very difficult. […] A more efficient approach, however, is to read the whole sentence or
paragraph […], then to produce the translated words one at a time, each time focusing on a
different part of the input sentence to gather the semantic details required to produce the next
output word.

— Page 462, Deep Learning, 2016.
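The core of an attention step can be sketched in a few lines: score each encoder state against the decoder's current query, normalize the scores with a softmax, and take the weighted sum as the context for the next output word. The vectors below are invented for illustration, and dot-product scoring is only one of several scoring functions used in practice.

```python
import math

def softmax(xs):
    m = max(xs)                      # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, encoder_states):
    """Dot-product attention: score every encoder state against the
    decoder query, normalize, and return the weighted-sum context."""
    scores = [sum(q * k for q, k in zip(query, state))
              for state in encoder_states]
    weights = softmax(scores)
    dim = len(encoder_states[0])
    context = [sum(w * state[i] for w, state in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights

# Three encoder states; the query points in the direction of the second.
states = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]
query = [0.0, 2.0]
context, weights = attend(query, states)
print(max(range(len(weights)), key=lambda i: weights[i]))  # -> 1
```

Because a fresh context is computed for every output word, the decoder is no longer forced to squeeze the whole sentence through one fixed-length vector; it can re-focus on different input positions at each step.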

The encoder-decoder recurrent neural network architecture with attention is currently the state-of-the-art on
some benchmark problems for machine translation. This architecture is used at the heart of the Google
Neural Machine Translation system, or GNMT, used in their Google Translate service
(https://translate.google.com).

… current state-of-the-art machine translation systems are powered by models that employ
 attention.

— Page 209, Neural Network Methods in Natural Language Processing, 2017.

For more on attention, see the post:

Attention in Long Short-Term Memory Recurrent Neural Networks

Although effective, neural machine translation systems still suffer from some issues, such as scaling to
larger vocabularies of words and the slow speed of training the models. These are the current areas of
focus for large production neural translation systems, such as the Google system.

Three inherent weaknesses of Neural Machine Translation […]: its slower training and inference
 speed, ineffectiveness in dealing with rare words, and sometimes failure to translate all words in
the source sentence.

— Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine
Translation, 2016.

Further Reading
This section provides more resources on the topic if you are looking to go deeper.

Books
Neural Network Methods in Natural Language Processing, 2017.
Syntax-based Statistical Machine Translation, 2017.
Deep Learning, 2016.
Statistical Machine Translation, 2010.
Handbook of Natural Language Processing and Machine Translation, 2011.
Artificial Intelligence, A Modern Approach, 3rd Edition, 2009.

Papers
A Statistical Approach to Machine Translation, 1990.
Review Article: Example-based Machine Translation, 1999.
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation,
2014.
Neural Machine Translation by Jointly Learning to Align and Translate, 2014.
Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine
Translation, 2016.
Sequence to sequence learning with neural networks, 2014.
Recurrent Continuous Translation Models, 2013.
Continuous space translation models for phrase-based statistical machine translation, 2013.

Additional
Machine Translation Archive
Neural machine translation on Wikipedia
Chapter 13, Neural Machine Translation, Statistical Machine Translation, 2017.
Summary

In this post, you discovered the challenge of machine translation and the effectiveness of neural machine
translation models.

Specifically, you learned:

Machine translation is challenging given the inherent ambiguity and flexibility of human language.
Statistical machine translation replaces classical rule-based systems with models that learn to
translate from examples.
Neural machine translation models fit a single model rather than a pipeline of fine-tuned models and
currently achieve state-of-the-art results.

Do you have any questions?


Ask your questions in the comments below and I will do my best to answer.

Develop Deep Learning models for Text Data Today!


Develop Your Own Text models in Minutes
...with just a few lines of python code

Discover how in my new Ebook:


Deep Learning for Natural Language Processing

It provides self-study tutorials on topics like:


Bag-of-Words, Word Embedding, Language Models, Caption Generation, Text
Translation and much more...

Finally Bring Deep Learning to your Natural Language


Processing Projects
Skip the Academics. Just Results.

SEE WHAT'S INSIDE


About Jason Brownlee


Jason Brownlee, PhD is a machine learning specialist who teaches developers how to get results with
modern machine learning methods via hands-on tutorials.
View all posts by Jason Brownlee →



https://machinelearningmastery.com/introduction-neural-machine-translation/ 8/14
12/15/2019 A Gentle Introduction to Neural Machine Translation


18 Responses to A Gentle Introduction to Neural Machine Translation

Roberto Mariani January 1, 2018 at 3:17 am #

Given a database of hundreds of million lines of short sentences with a limited number of 20000
words, do you think it is better to investigate a character-level RNN or a word-based RNN? What does your
intuition tell you?

Jason Brownlee January 1, 2018 at 5:29 am #

Start with words and go to char to see if it can lift skill or simplify the model.

Rodolfo Maslias January 2, 2018 at 6:03 pm #

I shared your interesting article on my Fb page European Terminology

Jason Brownlee January 3, 2018 at 5:30 am #

Thanks!

Dan Baez January 10, 2018 at 4:08 pm #

Great post Jason, machinelearningmastery.com has become my new home for practical learning as
I am starting to get a hold of some ML techniques. A suggestion from me that may help others..
Could you look at putting together a simple tutorial on developing ‘production’ ready models. For example, once a
model has been developed how does one go about updating with new data and using the model for ongoing
classification and prediction with new data. Some methods I have stumbled across are manually
updating new inputs into the code, manually updating new inputs into a .CSV file and for bigger datasets
updating new data into a .H5 file that the model recognises. This would help take the enormous learnings you
offer to a level where the models become an ongoing tool for work or research… definitely something I have
not yet mastered!


Jason Brownlee January 11, 2018 at 5:46 am #

Great suggestion, thanks.

See this post on final models:


https://machinelearningmastery.com/train-final-machine-learning-model/

And this post on models in production:


http://machinelearningmastery.com/deploy-machine-learning-model-to-production/

I hope that helps as a start.

Ali July 25, 2018 at 6:03 pm #

Sir, your post is very informative, and it gives me novel intuitions into this area.
Thank you very much for sharing your knowledge.
I’m completely new in this field.
Actually, I used to translate research papers and articles as my freelance job. So, I know nothing academic
in the computer science field.
However, I have been interested in machine learning since 2 years ago. I worked with python and attended
in some online courses. The whole field is full of joy, and challenges, of course.
I’m not an English native speaker, as it can be inferred from my english writing skills; sorry for that.
My first language is Persian (Farsi) and Persian has no ASCII representation. We use unicode charset, just
like Arabic.
I was wondering if the aforementioned issue (lack of ASCII support), and the special properties of Persian
language (e.g., its syntax which is way different from that of English, Spanish, French, or even Arabic) would
affect NLP techniques and algorithms used in translation services like Google Translate?
I think google service translates English-Arabic pair so much better than English-Persian pair, and I feel like
it has nothing to do with the volume of data (Persian texts, particularly) provided for the engine.
Also, I really like to develope a minimal machine translation project (for my research purposes), but I have
no idea in terms of best algorithms, platforms, or techniques.
It would be useful if you share your opinion with us on this particular matter, and I would really appreciate
that.
Again, thank you for the intuitive information you post here.
Best Wishes,
Ali from Persia

Your Start in Machine Learning

Jason Brownlee July 26, 2018 at 7:39 am #

It is an interesting question and not something I know much about. Off the cuff, I would try to
model the problem using unicode instead of chars, but I'd encourage you to read up in the literature on how
it is addressed generally.


Bob Hodgson September 6, 2018 at 5:23 am #

Dear Jason

Do you have any thoughts on the usefulness of NMT for the task of Bible translation? I consult for a Bible
translation agency and am eager to show the application of NMT to the production of first draft translations in
small and threatened languages of the world. FYI: the Hebrew Bible has only about 6,000+ discrete words,
the Christian New Testament about the same amount. Many of the small and endangered languages have
about the same number of discrete words.

Jason Brownlee September 6, 2018 at 5:42 am #

I don’t know. Perhaps prototype some models and see how well it performs.

Dario September 7, 2018 at 11:37 pm #

Hi Jason, would NMT be a good method to do code translation from one language to another: let’s say
from R to Python? Thanks

Jason Brownlee September 8, 2018 at 6:08 am #

Maybe.

Buli Diriba January 19, 2019 at 7:25 am #

Hello Jason,

Thanks for the post, it’s very constructive and interesting, and it gives me a good understanding, but I have some
questions on Neural Machine Translation

1. As I understand it, in NMT we don’t need a separate language model, so how does a decoder learn the
grammar of the target language when predicting the next word? Or does a seq2seq model not need to
learn the grammar of a language?

Jason Brownlee January 19, 2019 at 8:18 am #

It learns a conditional probabilistic model, e.g. output the next word conditioned on the input
and on the words generated so far.


Ben Johnson August 11, 2019 at 12:23 pm #

Hello Jason:

As luck would have it, I’m glad I came across your informative post. It is a good introduction–thanks to your
good analysis and gentle approach (your headline got me here).

I have been translating from Japanese to English for about 40 years now, and since the beginning of MT, I
do see surprising progress, but it still seems the “attention” or equivalent level of improvement in the
Western languages is greater than for the Asian languages, as nuanced in some of the earlier posts to you
in this blog. Goofy Google translations (Google Maps) made headlines recently in Japan, in addition to the
continued cry for help with Chinese to English translations.

I perceive this is still simply a “cultural issue” and in time this too will improve; sorry for being in the wrong
forum. It seems to me NMT providers should at least use qualified human checks before publishing
(sometimes perverse) translations. “Well, this too will get better sooner or later.”

Jason Brownlee August 12, 2019 at 6:33 am #

Great comment Ben, thanks for sharing.

Regarding Chinese translation, I would expect that systems by Baidu may be more effective than those
by Google.

Maysoon December 11, 2019 at 11:27 pm #

Thank you so much for the comprehensive explanation of how neural machine translation works, I
have a question regarding probabilities learning; for commonly used words, pronouns, helping verbs, etc.
Are they treated differently than domain-specific terms?

Jason Brownlee December 12, 2019 at 6:25 am #

Thanks!
You can handle them differently if you want, or remove them completely if needed.
