other,24-1-H92-1026,bq |
We describe a
<term>
generative probabilistic model of natural language
</term>
, which we call
<term>
HBG
</term>
, that takes advantage of detailed
<term>
linguistic information
</term>
to resolve
<term>
ambiguity
</term>
.
|
#18917
We describe a generative probabilistic model of natural language, which we call HBG, that takes advantage of detailed linguistic information to resolve ambiguity. |
lr,3-3-H92-1026,bq |
We use a
<term>
corpus of bracketed sentences
</term>
, called a
<term>
Treebank
</term>
, in combination with
<term>
decision tree building
</term>
to tease out the relevant aspects of a
<term>
parse tree
</term>
that will determine the correct
<term>
parse
</term>
of a
<term>
sentence
</term>
.
|
#18946
We use a corpus of bracketed sentences, called a Treebank, in combination with decision tree building to tease out the relevant aspects of a parse tree that will determine the correct parse of a sentence. |
tech,15-3-H92-1026,bq |
We use a
<term>
corpus of bracketed sentences
</term>
, called a
<term>
Treebank
</term>
, in combination with
<term>
decision tree building
</term>
to tease out the relevant aspects of a
<term>
parse tree
</term>
that will determine the correct
<term>
parse
</term>
of a
<term>
sentence
</term>
.
|
#18958
We use a corpus of bracketed sentences, called a Treebank, in combination with decision tree building to tease out the relevant aspects of a parse tree that will determine the correct parse of a sentence. |
tech,17-2-H92-1026,bq |
<term>
HBG
</term>
incorporates
<term>
lexical , syntactic , semantic , and structural information
</term>
from the
<term>
parse tree
</term>
into the
<term>
disambiguation process
</term>
in a novel way .
|
#18936
HBG incorporates lexical, syntactic, semantic, and structural information from the parse tree into the disambiguation process in a novel way. |
model,3-1-H92-1026,bq |
We describe a
<term>
generative probabilistic model of natural language
</term>
, which we call
<term>
HBG
</term>
, that takes advantage of detailed
<term>
linguistic information
</term>
to resolve
<term>
ambiguity
</term>
.
|
#18896
We describe a generative probabilistic model of natural language, which we call HBG, that takes advantage of detailed linguistic information to resolve ambiguity. |
other,10-4-H92-1026,bq |
This stands in contrast to the usual approach of further
<term>
grammar
</term>
tailoring via the usual
<term>
linguistic introspection
</term>
in the hope of generating the correct
<term>
parse
</term>
.
|
#18991
This stands in contrast to the usual approach of further grammar tailoring via the usual linguistic introspection in the hope of generating the correct parse. |
tech,13-1-H92-1026,bq |
We describe a
<term>
generative probabilistic model of natural language
</term>
, which we call
<term>
HBG
</term>
, that takes advantage of detailed
<term>
linguistic information
</term>
to resolve
<term>
ambiguity
</term>
.
|
#18906
We describe a generative probabilistic model of natural language, which we call HBG, that takes advantage of detailed linguistic information to resolve ambiguity. |
tech,0-2-H92-1026,bq |
We describe a
<term>
generative probabilistic model of natural language
</term>
, which we call
<term>
HBG
</term>
, that takes advantage of detailed
<term>
linguistic information
</term>
to resolve
<term>
ambiguity
</term>
.
<term>
HBG
</term>
incorporates
<term>
lexical , syntactic , semantic , and structural information
</term>
from the
<term>
parse tree
</term>
into the
<term>
disambiguation process
</term>
in a novel way .
|
#18919
We describe a generative probabilistic model of natural language, which we call HBG, that takes advantage of detailed linguistic information to resolve ambiguity. HBG incorporates lexical, syntactic, semantic, and structural information from the parse tree into the disambiguation process in a novel way. |
tech,20-5-H92-1026,bq |
In
<term>
head-to-head tests
</term>
against one of the best existing robust
<term>
probabilistic parsing models
</term>
, which we call
<term>
P-CFG
</term>
, the
<term>
HBG model
</term>
significantly outperforms
<term>
P-CFG
</term>
, increasing the
<term>
parsing accuracy
</term>
rate from 60 % to 75 % , a 37 % reduction in error .
|
#19027
In head-to-head tests against one of the best existing robust probabilistic parsing models, which we call P-CFG, the HBG model significantly outperforms P-CFG, increasing the parsing accuracy rate from 60% to 75%, a 37% reduction in error. |
measure(ment),1-5-H92-1026,bq |
In
<term>
head-to-head tests
</term>
against one of the best existing robust
<term>
probabilistic parsing models
</term>
, which we call
<term>
P-CFG
</term>
, the
<term>
HBG model
</term>
significantly outperforms
<term>
P-CFG
</term>
, increasing the
<term>
parsing accuracy
</term>
rate from 60 % to 75 % , a 37 % reduction in error .
|
#19008
In head-to-head tests against one of the best existing robust probabilistic parsing models, which we call P-CFG, the HBG model significantly outperforms P-CFG, increasing the parsing accuracy rate from 60% to 75%, a 37% reduction in error. |
other,2-2-H92-1026,bq |
<term>
HBG
</term>
incorporates
<term>
lexical , syntactic , semantic , and structural information
</term>
from the
<term>
parse tree
</term>
into the
<term>
disambiguation process
</term>
in a novel way .
|
#18921
HBG incorporates lexical, syntactic, semantic, and structural information from the parse tree into the disambiguation process in a novel way. |
other,20-1-H92-1026,bq |
We describe a
<term>
generative probabilistic model of natural language
</term>
, which we call
<term>
HBG
</term>
, that takes advantage of detailed
<term>
linguistic information
</term>
to resolve
<term>
ambiguity
</term>
.
|
#18913
We describe a generative probabilistic model of natural language, which we call HBG, that takes advantage of detailed linguistic information to resolve ambiguity. |
other,15-4-H92-1026,bq |
This stands in contrast to the usual approach of further
<term>
grammar
</term>
tailoring via the usual
<term>
linguistic introspection
</term>
in the hope of generating the correct
<term>
parse
</term>
.
|
#18996
This stands in contrast to the usual approach of further grammar tailoring via the usual linguistic introspection in the hope of generating the correct parse. |
other,33-3-H92-1026,bq |
We use a
<term>
corpus of bracketed sentences
</term>
, called a
<term>
Treebank
</term>
, in combination with
<term>
decision tree building
</term>
to tease out the relevant aspects of a
<term>
parse tree
</term>
that will determine the correct
<term>
parse
</term>
of a
<term>
sentence
</term>
.
|
#18976
We use a corpus of bracketed sentences, called a Treebank, in combination with decision tree building to tease out the relevant aspects of a parse tree that will determine the correct parse of a sentence. |
other,24-4-H92-1026,bq |
This stands in contrast to the usual approach of further
<term>
grammar
</term>
tailoring via the usual
<term>
linguistic introspection
</term>
in the hope of generating the correct
<term>
parse
</term>
.
|
#19005
This stands in contrast to the usual approach of further grammar tailoring via the usual linguistic introspection in the hope of generating the correct parse. |
other,13-2-H92-1026,bq |
<term>
HBG
</term>
incorporates
<term>
lexical , syntactic , semantic , and structural information
</term>
from the
<term>
parse tree
</term>
into the
<term>
disambiguation process
</term>
in a novel way .
|
#18932
HBG incorporates lexical, syntactic, semantic, and structural information from the parse tree into the disambiguation process in a novel way. |
other,26-3-H92-1026,bq |
We use a
<term>
corpus of bracketed sentences
</term>
, called a
<term>
Treebank
</term>
, in combination with
<term>
decision tree building
</term>
to tease out the relevant aspects of a
<term>
parse tree
</term>
that will determine the correct
<term>
parse
</term>
of a
<term>
sentence
</term>
.
|
#18969
We use a corpus of bracketed sentences, called a Treebank, in combination with decision tree building to tease out the relevant aspects of a parse tree that will determine the correct parse of a sentence. |
measure(ment),28-5-H92-1026,bq |
In
<term>
head-to-head tests
</term>
against one of the best existing robust
<term>
probabilistic parsing models
</term>
, which we call
<term>
P-CFG
</term>
, the
<term>
HBG model
</term>
significantly outperforms
<term>
P-CFG
</term>
, increasing the
<term>
parsing accuracy
</term>
rate from 60 % to 75 % , a 37 % reduction in error .
|
#19035
In head-to-head tests against one of the best existing robust probabilistic parsing models, which we call P-CFG, the HBG model significantly outperforms P-CFG, increasing the parsing accuracy rate from 60% to 75%, a 37% reduction in error. |
tech,17-5-H92-1026,bq |
In
<term>
head-to-head tests
</term>
against one of the best existing robust
<term>
probabilistic parsing models
</term>
, which we call
<term>
P-CFG
</term>
, the
<term>
HBG model
</term>
significantly outperforms
<term>
P-CFG
</term>
, increasing the
<term>
parsing accuracy
</term>
rate from 60 % to 75 % , a 37 % reduction in error .
|
#19024
In head-to-head tests against one of the best existing robust probabilistic parsing models, which we call P-CFG, the HBG model significantly outperforms P-CFG, increasing the parsing accuracy rate from 60% to 75%, a 37% reduction in error. |
tech,24-5-H92-1026,bq |
In
<term>
head-to-head tests
</term>
against one of the best existing robust
<term>
probabilistic parsing models
</term>
, which we call
<term>
P-CFG
</term>
, the
<term>
HBG model
</term>
significantly outperforms
<term>
P-CFG
</term>
, increasing the
<term>
parsing accuracy
</term>
rate from 60 % to 75 % , a 37 % reduction in error .
|
#19031
In head-to-head tests against one of the best existing robust probabilistic parsing models, which we call P-CFG, the HBG model significantly outperforms P-CFG, increasing the parsing accuracy rate from 60% to 75%, a 37% reduction in error. |