D14-1167 At the same time , concerning the quality of the word embeddings , experiments on the analogical reasoning task show that joint embedding is comparable to or slightly better than word2vec ( Skip-Gram ) .
T75-2004 A small grammar rich in embedding capabilities is coded in Woods ' form of Augmented Transition Net ( Woods 1970 ) for a set of ATN functions to interpret .
C92-2095 ( = S ) on the right and so eliminate unnecessary center-embedding ; and ( 3 ) elimination of scrambling and NP drop to isolate the separate effects of head-final ( e.g. , Verb-final ) phrase structure in Japanese .
H90-1008 As well , all of them will be before their embedding episode , i.e. , the appropriate sentence utterance . )
C86-1088 The embedding rule for = > - conditions requires , roughly speaking , that every proper embedding for the antecedent sub-DRS can be properly extended to the consequent sub-DRS .
D15-1054 We identify several potential weaknesses of the plain application of conventional word embedding methodologies for ad click prediction .
C82-2022 Another problem is posed by the structural ambiguity of sequences of PP 's , for which both the " embedding " and the " same-level " hypotheses are presented .
C86-1088 A proper embedding is a function from the DRS universe to the model universe , assigning real-world objects to DRs in a way that all conditions of the DRS are satisfied .
D15-1200 The results highlight the importance of testing embedding models in real applications .
W02-0108 This unique duality allows students to contribute to research projects and gain skills in embedding HLT in practical applications .
J00-3002 The unacceptability of centre embedding is illustrated by the fact that while the nested subject relativizations of ( 4 ) exhibit little variation in acceptability , the nested object relativizations ( 5 ) exhibit a severe deterioration in acceptability ( Chomsky 1965 , Chap .
W10-4104 The numbers in the brackets to the right of the PClauses indicate the depth of the embedding and overlapping .
W00-1421 Generally , the realized word order of an utterance is the result of its embedding into the situative context , which finds expression in the use of linear precedence ( LP ) rules for word order determination during surface realization .
W15-3820 Such word vector representations , also known as word embedding , have been shown to improve the performance of machine learning models in several NLP tasks .
W15-3822 We investigated three different methods for deriving word embeddings from a large unlabeled clinical corpus : one existing method called Surrounding based embedding feature ( SBE ) , and two newly developed methods : Left-Right surrounding based embedding feature ( LR_SBE ) and MAX surrounding based embedding feature ( MAX_SBE ) .
D12-1086 Our best model based on Euclidean co-occurrence embedding combines the paradigmatic context representation with morphological and orthographic features and achieves 80 % many-to-one accuracy on a 45-tag 1M word corpus .
C90-3012 We will attempt to show how human performance limitations on various types of syntactic embedding constructions in Germanic languages can be modelled in a relational network linguistic framework .
P15-1011 In this paper we present embedding models that achieve an F-score of 92 % on the widely-used , publicly available dataset , the GRE " most contrasting word " questions ( Mohammad et al. , 2008 ) .
D15-1183 In addition , it is desirable to incorporate global latent factors , such as topics , sentiments or writing styles , into the word embedding model .
N12-1088 This embedding is learned in such a way that prediction becomes a low-dimensional nearest-neighbor search , which can be done computationally efficiently .