P01-1005 training data. We present the active learning algorithm we used below. Initialize:
W00-1304. The following describes the active learning algorithm used in the experiments: 1.
D10-1095 of 50 annotated sentences. The active learning algorithm is then using the CW algorithm
E06-3009 well-known disadvantage of many simple active learning algorithms. A second type of bias is due
W00-1304 the performance achieved by the active learning algorithm using TBLDT with the performance
P02-1064 examples in a pool and ran the active learning algorithm in Section 3.2. We use the same
P02-1016 as most of the time. And all active learning algorithms perform better than random selection
D14-1143 Propagation Since the graph-based active learning algorithms are based on label propagation
D14-1143 NLP community, non-interactive active learning algorithms exclude expert annotators' human
D08-1105 unlikely event that we have access to a large (Figure 2: The active learning algorithm.)
P14-1055 and then describes a baseline active learning algorithm. 3.1 Supervised Learning We
P09-1117 semi-supervised AL. 3.1 Fully Supervised Active Learning Algorithm 1 describes the general AL framework
D14-1143 ), graph-based noninteractive active learning algorithms have been introduced in recent
D14-1143 edge becomes. 2.4 Graph-based active learning algorithms An important categorization of
P13-3011 which is needed in the proposed active learning algorithm. 3.4 Materials There is a corpus
E06-3009 country data, for instance, the active learning algorithm above needs 44% fewer training
P05-3018 the effectiveness of different active learning algorithms. Finally, our interface supports
D10-1095 ready for the next round. Many active learning algorithms are first computing a prediction
N06-1030 random curve. Encouragingly, the active learning algorithm straddles the range in between
P00-1016 time the server took to run the active learning algorithm and select the next batch of
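The snippets above repeatedly reference the same pool-based loop: initialize with a small labeled seed set, retrain a model, score the unlabeled pool, query the most informative examples for annotation, and repeat. A minimal sketch of that loop, assuming least-confidence uncertainty sampling and hypothetical `model_factory` and `oracle` callables (neither comes from any of the cited papers):

```python
def uncertainty(probs):
    # Least-confidence score: 1 minus the highest class probability.
    return 1.0 - max(probs)

def select_batch(model, pool, batch_size):
    # Score every unlabeled example and query the most uncertain ones.
    ranked = sorted(pool, key=lambda x: uncertainty(model(x)), reverse=True)
    return ranked[:batch_size]

def active_learning_loop(model_factory, labeled, pool, oracle, rounds, batch_size):
    # labeled: list of (example, label) pairs; pool: unlabeled examples.
    model = model_factory(labeled)
    for _ in range(rounds):
        batch = select_batch(model, pool, batch_size)
        for x in batch:
            labeled.append((x, oracle(x)))  # annotator supplies the label
            pool.remove(x)
        model = model_factory(labeled)      # retrain on the enlarged set
    return model
```

Random selection, the baseline several snippets compare against, is the same loop with `select_batch` replaced by a random draw from the pool.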