#1632 In particular, range concatenation languages [RCL] can be parsed in polynomial time and many classical grammatical formalisms can be translated into equivalent RCGs without increasing their worst-case parsing time complexity.
<term>
tree adjoining grammar
</term>
can be
parsed
in O ( n^6 )
<term>
time
</term>
. In this paper
#1671 For example, after translation into an equivalent RCG, any tree adjoining grammar can be parsed in O(n^6) time.
tech,10-2-N03-1026,ak
LFG
</term>
, a
<term>
transfer component for
parse
reduction
</term>
operating on
<term>
packed
#2822 Our system incorporates a linguistic parser/generator for LFG, a transfer component for parse reduction operating on packed parse forests, and a maximum-entropy model for stochastic output selection.
other,17-2-N03-1026,ak
reduction
</term>
operating on
<term>
packed
parse
forests
</term>
, and a
<term>
maximum-entropy
#2827 Our system incorporates a linguistic parser/generator for LFG, a transfer component for parse reduction operating on packed parse forests, and a maximum-entropy model for stochastic output selection.
tech,10-5-H05-1064,ak
, we apply the
<term>
model
</term>
to
<term>
parse
reranking
</term>
. The
<term>
model
</term>
#5524 As a case study, we apply the model to parse reranking.
other,18-7-H05-1064,ak
naturally to NLP structures other than
<term>
parse
trees
</term>
. This paper presents a
<term>
#5578 Although our experiments are focused on parsing, the techniques described generalize naturally to NLP structures other than parse trees.
other,7-2-J05-1003,ak
parser
</term>
produces a set of
<term>
candidate
parses
</term>
for each
<term>
input sentence
</term>
#8036 The base parser produces a set of candidate parses for each input sentence, with associated probabilities that define an initial ranking of these parses.
other,24-2-J05-1003,ak
initial
<term>
ranking
</term>
of these
<term>
parses
</term>
. A second
<term>
model
</term>
then
#8052 The base parser produces a set of candidate parses for each input sentence, with associated probabilities that define an initial ranking of these parses.
apply the
<term>
boosting method
</term>
to
parsing
the
<term>
Wall Street Journal treebank
</term>
#8157 We apply the boosting method to parsing the Wall Street Journal treebank.
other,25-7-J05-1003,ak
additional 500,000
<term>
features
</term>
over
<term>
parse
trees
</term>
that were not included in the
#8189 The method combined the log-likelihood under a baseline model (that of Collins [1999]) with evidence from an additional 500,000 features over parse trees that were not included in the original model.
other,8-1-P05-1010,ak
generative probabilistic model
</term>
of
<term>
parse
trees
</term>
, which we call
<term>
PCFG-LA
#8487 This paper defines a generative probabilistic model of parse trees, which we call PCFG-LA.
lr,8-3-P05-1010,ak
</term>
are automatically induced from a
<term>
parsed
corpus
</term>
by training a
<term>
PCFG-LA
#8520 Fine-grained CFG rules are automatically induced from a parsed corpus by training a PCFG-LA model using an EM-algorithm.
tech,1-4-P05-1010,ak
<term>
EM-algorithm
</term>
. Because
<term>
exact
parsing
</term>
with a
<term>
PCFG-LA
</term>
is NP-hard
#8533 Because exact parsing with a PCFG-LA is NP-hard, several approximations are described and empirically compared.
other,8-3-P05-1034,ak
</term>
, project the
<term>
source dependency
parse
</term>
onto the
<term>
target sentence
</term>
#8892 We align a parallel corpus, project the source dependency parse onto the target sentence, extract dependency treelet translation pairs, and train a tree-based ordering model.
model,8-4-P05-1053,ak
most of the useful information in
<term>
full
parse
trees
</term>
for
<term>
relation extraction
#9339 This suggests that most of the useful information in full parse trees for relation extraction is shallow and can be captured by chunking.
other,25-4-P05-1073,ak
<term>
classifier
</term>
for
<term>
gold-standard
parse
trees
</term>
on
<term>
PropBank
</term>
. Previous
#10138 This system achieves an error reduction of 22% on all arguments and 32% on core arguments over a state-of-the-art independent classifier for gold-standard parse trees on PropBank.
tech,9-4-P80-1026,ak
In this paper , we outline a set of
<term>
parsing
flexibilities
</term>
that such a
<term>
system
#13684 In this paper, we outline a set of parsing flexibilities that such a system should provide.
other,33-6-P84-1047,ak
structure
</term>
, and worked examples of
<term>
parses
</term>
. A
<term>
parser
</term>
incorporating
#15153 Representative samples from an entity-oriented language definition are presented, along with a control structure for an entity-oriented parser, some parsing strategies that use the control structure, and worked examples of parses.
tech,15-1-C88-1066,ak
Restrictions ( CCRs )
</term>
and describes two
<term>
parsing
algorithms
</term>
that interpret it .
<term>
#18102 This paper summarizes the formalism of Category Cooccurrence Restrictions (CCRs) and describes two parsing algorithms that interpret it.
second stage the
<term>
sentence
</term>
is
parsed
with respect to this set . An
<term>
Earley-type
#19713 In the first stage, the parser selects a set of elementary structures associated with the lexical items in the input sentence, and in the second stage the sentence is parsed with respect to this set.