Commit 73a78c24 authored by Mario Sänger

Fix errors in latex compilation

parent 55f923c6
@@ -32,7 +32,7 @@ linear combination of both states.
\subsection{Word Embeddings}
Distributional semantic models (DSMs) have been researched for decades in NLP \cite{turney_frequency_2010}.
Based on huge amounts of unlabeled text, DSMs aim to represent each word as a real-valued vector (also called an embedding) that captures syntactic and semantic similarities between words.
- Starting with the publication of the work from Collobert et al. \cite{collobert_natural_2011} in 2011, learning embeddings for linguistic units, such as words, sentences or paragraphs, is one of the hot topics in NLP and a plethora of approaches have been proposed \cite{bojanowski_enriching_2017,mikolov_distributed_2013,peters_deep_2018,pennington_glove_2014}.
+ Starting with the publication of the work from Collobert et al. \cite{collobert_natural_2011} in 2011, learning embeddings for linguistic units, such as words, sentences or paragraphs, is one of the hot topics in NLP and a plethora of approaches have been proposed \cite{bojanowski_enriching_2017,mikolov_distributed_2013,pennington_glove_2014,peters_deep_2018}.
The majority of today's embedding models are based on deep neural networks trained to perform some kind of language modeling task \cite{peters_semi-supervised_2017,peters_deep_2018,pinter_mimicking_2017}.
The most popular embedding model is the Word2Vec model introduced by Mikolov et al. \cite{mikolov_efficient_2013,mikolov_distributed_2013}.
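As an aside (not part of the commit), the Word2Vec training referenced above can be reproduced with off-the-shelf tooling. The following minimal Python sketch assumes the gensim library and a purely illustrative toy corpus; it shows how skip-gram embeddings are learned and queried, not the paper's actual pipeline.

# Minimal sketch, not the paper's code: learning skip-gram word embeddings
# with gensim on a toy corpus and querying the resulting vectors.
from gensim.models import Word2Vec

corpus = [  # illustrative placeholder corpus of tokenized sentences
    ["acute", "myocardial", "infarction"],
    ["chronic", "heart", "failure"],
    ["acute", "renal", "failure"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # dimensionality of the word embeddings
    window=5,          # context window size
    min_count=1,       # keep every token of the tiny toy corpus
    sg=1,              # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["failure"]                 # 100-dimensional embedding
print(model.wv.most_similar("failure"))      # semantically closest words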
@@ -33,15 +33,15 @@
% an abbreviated paper title here
\titlerunning{ICD-10 coding using multi-lingual embeddings and RNNs}
- \author{Jurica \v{S}eva\inst{1} \and
- Mario Sänger\inst{1} \and
- Ulf Leser\inst{1}}
+ \author{Jurica \v{S}eva \and
+ Mario Sänger \and
+ Ulf Leser}
% First names are abbreviated in the running head.
% If there are more than two authors, 'et al.' is used.
\authorrunning{\v{S}eva et al.}
- \institute{\inst{1}Humboldt-Universität zu Berlin, Knowledge Management in
+ \institute{Humboldt-Universität zu Berlin, Knowledge Management in
Bioinformatics, \\ Berlin, Germany\\
\email{\{seva,saengema,leser\}@informatik.hu-berlin.de}}
%