In real applications of inductive learning for classification, labeled instances are often deficient, and labeling them by an oracle is often expensive and time-consuming. Active learning on a single task aims to select only informative unlabeled instances for querying, improving classification accuracy while decreasing the querying cost. However, an inevitable problem in active learning is that the informativeness measures for selecting queries are commonly based on initial hypotheses learned from only a few labeled instances. In such a circumstance, the initial hypotheses are unreliable and may deviate from the true distribution underlying the target task. Consequently, the informativeness measures will likely select irrelevant instances. A promising way to compensate for this problem is to borrow useful knowledge from other sources with abundant labeled information, which is called transfer learning. However, a significant challenge in transfer learning is how to measure the similarity between the source and the target tasks. One needs to be aware of differing distributions or label assignments in unrelated source tasks; otherwise, they will lead to degraded performance during transfer. Also, how to design an effective strategy that avoids selecting irrelevant samples to query remains an open question. To tackle these issues, we propose a hybrid algorithm for active learning aided by transfer learning, adopting a divergence measure to alleviate the negative transfer caused by distribution differences. To avoid querying irrelevant instances, we also present an adaptive strategy that eliminates unnecessary instances in the input space and unnecessary models in the model space. Extensive experiments on both synthetic and real data sets show that the proposed algorithm queries fewer instances, achieves higher accuracy, and converges faster than state-of-the-art methods.
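The abstract does not specify the algorithm's details, but the core idea of combining an informativeness measure with a divergence-based relevance weight can be illustrated with a minimal sketch. The function names (`select_query`, `kl_divergence`), the choice of predictive entropy as the informativeness measure, and the use of KL divergence over discretized feature distributions are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def entropy(probs):
    # Predictive entropy of each row of class probabilities;
    # higher entropy means the current hypothesis is more uncertain.
    p = np.clip(probs, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=1)

def kl_divergence(p, q):
    # KL(p || q) between two discrete distributions, e.g. histograms
    # of a feature in the target vs. source task (illustrative choice).
    p = np.clip(p, 1e-12, 1.0)
    q = np.clip(q, 1e-12, 1.0)
    return float(np.sum(p * np.log(p / q)))

def select_query(unlabeled_probs, target_hist, source_hist, top_k=1):
    """Hypothetical query scoring: rank unlabeled instances by model
    uncertainty, discounted when the source distribution diverges from
    the target (a large divergence suggests transfer may be unreliable)."""
    relevance = 1.0 / (1.0 + kl_divergence(target_hist, source_hist))
    scores = entropy(unlabeled_probs) * relevance
    # indices of the top_k highest-scoring instances to query
    return np.argsort(scores)[::-1][:top_k]

# Usage: a confident prediction vs. an uncertain one; identical
# source/target histograms, so no divergence penalty applies.
probs = np.array([[0.99, 0.01], [0.50, 0.50]])
hist = np.array([0.5, 0.5])
picked = select_query(probs, hist, hist, top_k=1)
```

Under these assumptions, the uncertain instance (index 1) is selected; a larger source-target divergence would shrink all scores uniformly, which a fuller scheme would refine per-instance.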