Commit 63f2d65

Add link to TF monotonic_attention to README

1 parent a68c29c

File tree: 1 file changed (+1, -1)


README.md (+1, -1)
@@ -1,6 +1,6 @@
 # Online and Linear-Time Attention by Enforcing Monotonic Alignments

-#### *NOTE* - A reference implementation of the monotonic attention mechanism is now built into TensorFlow in the `tf.contrib.seq2seq` module.
+#### *NOTE* - A reference implementation of the monotonic attention mechanism is now built into TensorFlow in the `tf.contrib.seq2seq` module. See [here](https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/monotonic_attention) for documentation.

 This repository contains an example implementation of the monotonic alignment decoder proposed in ["Online and Linear-Time Attention by Enforcing Monotonic Alignments"](https://arxiv.org/abs/1704.00784), by Colin Raffel, Minh-Thang Luong, Peter J. Liu, Ron J. Weiss, and Douglas Eck (arXiv:1704.00784).
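For context on what the linked `monotonic_attention` function computes: during training, the mechanism from the paper replaces hard monotonic alignments with their expectation, computed by a left-to-right recurrence over per-position selection probabilities p. Below is a minimal NumPy sketch of one decoder step of that recurrence; the function name and shapes are illustrative and not the TensorFlow API.

```python
import numpy as np

def monotonic_attention_step(p_choose, alpha_prev):
    """Expected monotonic attention for one decoder step.

    Implements the recurrence from Raffel et al. (2017):
        q_j     = (1 - p_{j-1}) * q_{j-1} + alpha_prev_j
        alpha_j = p_j * q_j

    p_choose:   (T,) selection probabilities p_{i,j} in (0, 1) for this output step.
    alpha_prev: (T,) attention distribution from the previous output step.
    Returns:    (T,) expected attention distribution for this step.
    """
    T = len(p_choose)
    alpha = np.zeros(T)
    q = 0.0
    for j in range(T):
        # Probability mass that the attention head reaches position j:
        # it either carried over from j-1 without stopping there, or it
        # was already at j on the previous output step.
        carry = (1.0 - p_choose[j - 1]) * q if j > 0 else 0.0
        q = carry + alpha_prev[j]
        # The head stops at j with probability p_choose[j].
        alpha[j] = p_choose[j] * q
    return alpha

# Example: uniform p = 0.5, previous attention concentrated at position 0.
alpha = monotonic_attention_step(np.full(4, 0.5),
                                 np.array([1.0, 0.0, 0.0, 0.0]))
# → [0.5, 0.25, 0.125, 0.0625]: mass decays geometrically past the
#   previous position, as expected for a memoryless stopping rule.
```

With `alpha_prev` initialized to a one-hot vector at position 0, repeated application yields the expected alignments for each output step. Note the distribution may sum to less than 1; the missing mass is the probability that the head never stops, which the paper discusses.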
