#507 (iii) Rapid system development and porting to new domains via knowledge-based automated acquisition of grammars.
developing applications of this technology in new domains . Recent advances in Automatic Speech
#909 We have demonstrated this capability in several field exercises with the Marines and are currently developing applications of this technology in new domains.
model,3-2-N03-1001,ak
training data . The method combines <term>domain independent acoustic models</term> with
#2227 The method combines domain independent acoustic models with off-the-shelf classifiers to give utterance classification performance that is surprisingly close to what can be achieved using conventional word-trigram recognition requiring manual transcription.
other,18-3-N03-1001,ak
n-gram model for a particular <term>domain</term> ; the <term>output</term> of recognition
#2274 In our method, unsupervised training is first used to train a phone n-gram model for a particular domain; the output of recognition with this model is then passed to a phone-string classifier.
other,11-4-N03-1001,ak
three different <term>spoken language system domains</term> . Motivated by the success of
#2305 The classification accuracy of the method is evaluated on three different spoken language system domains.
<term>Lexical-Functional Grammars ( LFG )</term> to the domain of <term>sentence condensation</term> . Our
#2804 We present an application of ambiguity packing and stochastic disambiguation techniques for Lexical-Functional Grammars (LFG) to the domain of sentence condensation.
other,21-3-P03-1033,ak
knowledge level on the <term>target domain</term> and the degree of <term>hastiness</term>
#4346 Specifically, we set up three dimensions of user models: skill level to the system, knowledge level on the target domain and the degree of hastiness.
other,20-4-P03-1050,ak
allowing it to adapt to a desired <term>domain</term> or <term>genre</term> . Examples and
#4508 Monolingual, unannotated text can be used to further improve the stemmer by allowing it to adapt to a desired domain or genre.
other,10-5-P03-1058,ak
highlights the importance of the issue of <term>domain dependence</term> in evaluating WSD
#4927 Our analysis also highlights the importance of the issue of domain dependence in evaluating WSD programs.
other,14-3-I05-2048,ak
new <term>language pairs</term> or new <term>domains</term> . This workshop is intended to give
#6805 This is particularly important when building translation systems for new language pairs or new domains.
other,26-3-I05-4010,ak
collection covering the specific and <term>special domain</term> of HK laws . It is particularly valuable
#7329 The resultant bilingual corpus, 10.4M English words and 18.3M Chinese characters, is an authoritative and comprehensive text collection covering the specific and special domain of HK laws.
other,30-3-P05-1046,ak
useful structure in either of our <term>domains</term> . However , one can dramatically
#9101 Although hidden Markov models (HMMs) provide a suitable generative model for field structured text, general unsupervised HMM learning fails to learn useful structure in either of our domains.
knowledge of the desired solutions . In both domains , we found that unsupervised methods
#9127 In both domains, we found that unsupervised methods can attain accuracies with 400 unlabeled examples comparable to those attained by supervised methods on 50 labeled examples, and that semi-supervised methods can make good use of small amounts of labeled data.
other,8-3-P05-2008,ak
demonstrates that match with respect to <term>domain</term> and time is also important , and
#10466 This paper demonstrates that match with respect to domain and time is also important, and presents preliminary experiments with training data labeled with emoticons, which has the potential of being independent of domain, topic and time.
other,34-3-P05-2008,ak
potential of being independent of <term>domain</term> , <term>topic</term> and time . This
#10492 This paper demonstrates that match with respect to domain and time is also important, and presents preliminary experiments with training data labeled with emoticons, which has the potential of being independent of domain, topic and time.
other,28-3-N06-4001,ak
behavioral patterns in two distinct <term>domains</term> : tutorial dialogue ( Kumar et al.
#11859 As evidence of its usefulness and usability, it has been used successfully in a research context to uncover relationships between language and behavioral patterns in two distinct domains: tutorial dialogue (Kumar et al., submitted) and on-line communities (Arguello et al., 2006).
#13733 Robust natural language interpretation requires strong semantic domain models, fail-soft recovery heuristics, and very flexible control structures.
other,31-2-P81-1032,ak
and ability to bring <term>task-specific domain knowledge</term> ( in addition to general
#13778 Although single-strategy parsers have met with a measure of success, a multi-strategy approach is shown to provide a much higher degree of flexibility, redundancy, and ability to bring task-specific domain knowledge (in addition to general linguistic knowledge) to bear on both grammatical and ungrammatical input.
other,21-6-P81-1033,ak
that is natural in terms of the <term>task domain</term> to be interpreted directly without
#14053 A construction-specific approach also aids in task-specific language development by allowing a language definition that is natural in terms of the task domain to be interpreted directly without compilation into a uniform grammar formalism, thus greatly speeding the testing of changes to the language definition.
other,29-5-P82-1035,ak
understands scruffy <term>texts</term> in the <term>domain</term> of Navy messages . This article deals
#14417 This method of using expectations to aid the understanding of scruffy texts has been incorporated into a working computer program called NOMAD, which understands scruffy texts in the domain of Navy messages.