W15-3820 Such word vector representations, also known as word embeddings, have been shown to improve the performance of machine learning models in several NLP tasks.
W15-3820 In this work, our aim is to compare the performance of two state-of-the-art word embedding methods, namely word2vec and GloVe, on a basic task of reflecting semantic similarity and relatedness of biomedical concepts.
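As a rough illustration of how such a comparison is usually carried out (not the exact protocol of W15-3820), the sketch below scores concept pairs with cosine similarity under each embedding table and correlates those scores with human ratings; the embedding dictionaries and the rated pair list are assumed inputs loaded elsewhere.

    # Illustrative comparison of two embedding tables on a human-rated similarity set.
    # `embeddings` is a dict mapping term -> numpy vector; `rated_pairs` is a list of
    # (term1, term2, gold_score) triples. Both are assumed to be loaded elsewhere.
    import numpy as np
    from scipy.stats import spearmanr

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    def evaluate(embeddings, rated_pairs):
        predicted, gold = [], []
        for t1, t2, score in rated_pairs:
            if t1 in embeddings and t2 in embeddings:  # skip out-of-vocabulary pairs
                predicted.append(cosine(embeddings[t1], embeddings[t2]))
                gold.append(score)
        rho, _ = spearmanr(predicted, gold)
        return rho

    # rho_word2vec = evaluate(word2vec_vectors, rated_concept_pairs)
    # rho_glove    = evaluate(glove_vectors, rated_concept_pairs)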
W15-3822 We investigated three different methods for deriving word embeddings from a large unlabeled clinical corpus: one existing method called Surrounding based embedding feature (SBE), and two newly developed methods: Left-Right surrounding based embedding feature (LR_SBE) and MAX surrounding based embedding feature (MAX_SBE).
W15-3822 Evaluation using the clinical abbreviation datasets from both Vanderbilt University and the University of Minnesota showed that neural word embedding features improved the performance of the SVM-based clinical abbreviation disambiguation system.
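A minimal sketch, under assumptions, of how surrounding-word embeddings can be pooled into a fixed-length feature vector for an SVM disambiguator. The exact SBE, LR_SBE, and MAX_SBE definitions are those of W15-3822; the embedding table, window size, and training instances below are illustrative.

    # Pool the embeddings of words around an ambiguous abbreviation into one feature
    # vector, then train an SVM over those vectors. Mean pooling loosely resembles
    # SBE-style averaging; max pooling loosely resembles MAX_SBE-style pooling.
    import numpy as np
    from sklearn.svm import SVC

    def context_features(tokens, position, embeddings, dim, window=5, pool="mean"):
        left = tokens[max(0, position - window):position]
        right = tokens[position + 1:position + 1 + window]
        vectors = [embeddings[w] for w in left + right if w in embeddings]
        if not vectors:
            return np.zeros(dim)
        stacked = np.vstack(vectors)
        return stacked.mean(axis=0) if pool == "mean" else stacked.max(axis=0)

    # X = np.vstack([context_features(toks, i, emb, dim) for toks, i in instances])
    # clf = SVC(kernel="linear").fit(X, sense_labels)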
W15-4005 We propose the use of partial least squares regression to learn bilingual word embeddings using compositional distributional semantics.
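A hedged sketch of the general idea, assuming a seed translation lexicon and monolingual embedding tables as inputs: partial least squares regression (here via scikit-learn's PLSRegression) maps source-language vectors into the target-language space. The compositional step of W15-4005 is not reproduced here.

    # Fit a PLS mapping from source to target embedding space over a seed lexicon.
    # `src_emb` / `tgt_emb` are dicts of word -> numpy vector; `seed_pairs` is a list
    # of (source_word, target_word) translation pairs. All are assumed inputs.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def fit_bilingual_mapping(seed_pairs, src_emb, tgt_emb, n_components=100):
        pairs = [(s, t) for s, t in seed_pairs if s in src_emb and t in tgt_emb]
        X = np.vstack([src_emb[s] for s, t in pairs])
        Y = np.vstack([tgt_emb[t] for s, t in pairs])
        return PLSRegression(n_components=n_components).fit(X, Y)

    # pls = fit_bilingual_mapping(lexicon, source_vectors, target_vectors)
    # projected = pls.predict(source_vectors["dog"].reshape(1, -1))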
W15-4310 Besides word embeddings, we use part-of-speech (POS) tags, chunks, and Brown clusters induced from Wikipedia as features.
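A small sketch, under assumptions, of how a dense embedding can sit alongside discrete POS, chunk, and Brown cluster features for a single token; the lookup tables and the 4-character cluster prefix are illustrative choices, not those of W15-4310.

    # Combine a dense embedding with discrete one-hot style features for one token.
    # `pos_tags`, `chunk_tags`, and `brown_clusters` are word -> tag/bit-string lookups.
    import numpy as np

    def token_features(word, embeddings, dim, pos_tags, chunk_tags, brown_clusters):
        discrete = {
            "pos=" + pos_tags.get(word, "UNK"): 1.0,
            "chunk=" + chunk_tags.get(word, "O"): 1.0,
            "brown4=" + brown_clusters.get(word, "")[:4]: 1.0,  # cluster bit-string prefix
        }
        dense = embeddings.get(word, np.zeros(dim))
        return discrete, dense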
W96-0413 As a variable's restriction is processed, the resulting focus set is passed back up to act as the candidate set for the same variable in the embedding structure.
W97-0901 This paper reports on SRA's experience in embedding name recognition in these three specific applications, and the mutual impacts that occur, both on the algorithmic level and in the role that name recognition plays in user interaction with a system.
W98-1311 Another aspect to consider is the embedding of the individual elements of the finite-state tools into a portable object-oriented framework, whose architecture ensures the reuse, flexibility, and customization of the different parts.