Our
<term>
algorithm
</term>
reported more than 99 %
<term>
accuracy
</term>
in both
<term>
language identification
</term>
and
<term>
key prediction
</term>
.
<term>
Sentence planning
</term>
is a set of inter-related but distinct tasks , one of which is
<term>
sentence scoping
</term>
, i.e. the choice of
<term>
syntactic structure
</term>
for
<term>
elementary speech acts
</term>
and the decision of how to combine them into one or more
<term>
sentences
</term>
.
#1293 Our algorithm reported more than 99% accuracy in both language identification and key prediction. Sentence planning is a set of inter-related but distinct tasks, one of which is sentence scoping, i.e. the choice of syntactic structure for elementary speech acts and the decision of how to combine them into one or more sentences.
tech,15-1-N01-1003,ak
<term>
Sentence planning
</term>
is a set of inter-related but distinct tasks , one of which is
<term>
sentence scoping
</term>
, i.e. the choice of
<term>
syntactic structure
</term>
for
<term>
elementary speech acts
</term>
and the decision of how to combine them into one or more
<term>
sentences
</term>
.
#1308 Sentence planning is a set of inter-related but distinct tasks, one of which is sentence scoping, i.e. the choice of syntactic structure for elementary speech acts and the decision of how to combine them into one or more sentences.
other,22-1-N01-1003,ak
<term>
Sentence planning
</term>
is a set of inter-related but distinct tasks , one of which is
<term>
sentence scoping
</term>
, i.e. the choice of
<term>
syntactic structure
</term>
for
<term>
elementary speech acts
</term>
and the decision of how to combine them into one or more
<term>
sentences
</term>
.
#1315 Sentence planning is a set of inter-related but distinct tasks, one of which is sentence scoping, i.e. the choice of syntactic structure for elementary speech acts and the decision of how to combine them into one or more sentences.
other,25-1-N01-1003,ak
<term>
Sentence planning
</term>
is a set of inter-related but distinct tasks , one of which is
<term>
sentence scoping
</term>
, i.e. the choice of
<term>
syntactic structure
</term>
for
<term>
elementary speech acts
</term>
and the decision of how to combine them into one or more
<term>
sentences
</term>
.
#1318 Sentence planning is a set of inter-related but distinct tasks, one of which is sentence scoping, i.e. the choice of syntactic structure for elementary speech acts and the decision of how to combine them into one or more sentences.
other,40-1-N01-1003,ak
<term>
Sentence planning
</term>
is a set of inter-related but distinct tasks , one of which is
<term>
sentence scoping
</term>
, i.e. the choice of
<term>
syntactic structure
</term>
for
<term>
elementary speech acts
</term>
and the decision of how to combine them into one or more
<term>
sentences
</term>
.
#1333 Sentence planning is a set of inter-related but distinct tasks, one of which is sentence scoping, i.e. the choice of syntactic structure for elementary speech acts and the decision of how to combine them into one or more sentences.
tool,6-2-N01-1003,ak
In this paper , we present
<term>
SPoT
</term>
, a
<term>
sentence planner
</term>
, and a new methodology for automatically training
<term>
SPoT
</term>
on the basis of
<term>
feedback
</term>
provided by
<term>
human judges
</term>
.
#1341 In this paper, we present SPoT, a sentence planner, and a new methodology for automatically training SPoT on the basis of feedback provided by human judges.
tech,9-2-N01-1003,ak
In this paper , we present
<term>
SPoT
</term>
, a
<term>
sentence planner
</term>
, and a new methodology for automatically training
<term>
SPoT
</term>
on the basis of
<term>
feedback
</term>
provided by
<term>
human judges
</term>
.
#1344 In this paper, we present SPoT, a sentence planner, and a new methodology for automatically training SPoT on the basis of feedback provided by human judges.
tool,19-2-N01-1003,ak
In this paper , we present
<term>
SPoT
</term>
, a
<term>
sentence planner
</term>
, and a new methodology for automatically training
<term>
SPoT
</term>
on the basis of
<term>
feedback
</term>
provided by
<term>
human judges
</term>
.
#1354 In this paper, we present SPoT, a sentence planner, and a new methodology for automatically training SPoT on the basis of feedback provided by human judges.
model,24-2-N01-1003,ak
In this paper , we present
<term>
SPoT
</term>
, a
<term>
sentence planner
</term>
, and a new methodology for automatically training
<term>
SPoT
</term>
on the basis of
<term>
feedback
</term>
provided by
<term>
human judges
</term>
.
#1359 In this paper, we present SPoT, a sentence planner, and a new methodology for automatically training SPoT on the basis of feedback provided by human judges.
other,27-2-N01-1003,ak
In this paper , we present
<term>
SPoT
</term>
, a
<term>
sentence planner
</term>
, and a new methodology for automatically training
<term>
SPoT
</term>
on the basis of
<term>
feedback
</term>
provided by
<term>
human judges
</term>
.
#1362 In this paper, we present SPoT, a sentence planner, and a new methodology for automatically training SPoT on the basis of feedback provided by human judges.
tech,6-4-N01-1003,ak
First , a very simple ,
<term>
randomized sentence-plan-generator ( SPG )
</term>
generates a potentially large list of possible
<term>
sentence plans
</term>
for a given
<term>
text-plan input
</term>
.
#1380 First, a very simple, randomized sentence-plan-generator ( SPG ) generates a potentially large list of possible sentence plans for a given text-plan input.
other,18-4-N01-1003,ak
First , a very simple ,
<term>
randomized sentence-plan-generator ( SPG )
</term>
generates a potentially large list of possible
<term>
sentence plans
</term>
for a given
<term>
text-plan input
</term>
.
#1392 First, a very simple, randomized sentence-plan-generator (SPG) generates a potentially large list of possible sentence plans for a given text-plan input.
other,23-4-N01-1003,ak
First , a very simple ,
<term>
randomized sentence-plan-generator ( SPG )
</term>
generates a potentially large list of possible
<term>
sentence plans
</term>
for a given
<term>
text-plan input
</term>
.
#1397 First, a very simple, randomized sentence-plan-generator (SPG) generates a potentially large list of possible sentence plans for a given text-plan input.
tech,3-5-N01-1003,ak
Second , the
<term>
sentence-plan-ranker ( SPR )
</term>
ranks the list of
<term>
output sentence plans
</term>
, and then selects the
<term>
top-ranked plan
</term>
.
#1403 Second, the sentence-plan-ranker ( SPR ) ranks the list of output sentence plans, and then selects the top-ranked plan.
other,11-5-N01-1003,ak
Second , the
<term>
sentence-plan-ranker ( SPR )
</term>
ranks the list of
<term>
output sentence plans
</term>
, and then selects the
<term>
top-ranked plan
</term>
.
#1411 Second, the sentence-plan-ranker (SPR) ranks the list of output sentence plans, and then selects the top-ranked plan.
other,19-5-N01-1003,ak
Second , the
<term>
sentence-plan-ranker ( SPR )
</term>
ranks the list of
<term>
output sentence plans
</term>
, and then selects the
<term>
top-ranked plan
</term>
.
#1419 Second, the sentence-plan-ranker (SPR) ranks the list of output sentence plans, and then selects the top-ranked plan.
tech,1-6-N01-1003,ak
The
<term>
SPR
</term>
uses
<term>
ranking rules
</term>
automatically learned from
<term>
training data
</term>
.
#1423 The SPR uses ranking rules automatically learned from training data.
model,3-6-N01-1003,ak
The
<term>
SPR
</term>
uses
<term>
ranking rules
</term>
automatically learned from
<term>
training data
</term>
.
#1425 The SPR uses ranking rules automatically learned from training data.
lr,8-6-N01-1003,ak
The
<term>
SPR
</term>
uses
<term>
ranking rules
</term>
automatically learned from
<term>
training data
</term>
.
We show that the trained
<term>
SPR
</term>
learns to select a
<term>
sentence plan
</term>
whose
<term>
rating
</term>
on average is only 5 % worse than the
<term>
top human-ranked sentence plan
</term>
.
#1438 We show that the trained SPR learns to select a sentence plan whose rating on average is only 5% worse than the top human-ranked sentence plan.