Semantic Subspace Learning with Conditional Significance Vectors
Proceedings of the International Joint Conference on Neural Networks (IJCNN 2010),
pages 3670--3677, July 2010.
doi: 10.1109/IJCNN.2010.5596640
Subspace detection and processing are receiving increasing attention as a way to speed up search and reduce processing overhead. Subspace learning algorithms try to detect low-dimensional subspaces in the data that minimize intra-class separation while maximizing inter-class separation. In this paper we present a novel technique that uses the maximum significance value to detect a semantic subspace. We further modify the document vector using conditional significance to represent the subspace, which enhances the distinction between classes within the subspace. We compare our method against TFIDF with PCA and show that it consistently outperforms the baseline by a large margin when tested with a wide variety of learning algorithms. Our results show that the combination of subspace detection and conditional significance vectors improves subspace learning.
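
To make the idea concrete, below is a minimal Python sketch of significance-based subspace detection followed by conditional significance vectors. The significance formula used here (a term's relative frequency within a class, normalized across classes), the per-document averaging scheme, and the toy category names are illustrative assumptions, not the paper's exact weighting.

```python
# Sketch of significance vectors: detect a top-level semantic subspace by the
# maximum significance value, then recompute (conditional) significance using
# only the subclasses of that subspace. Formulas are assumptions for
# illustration; the paper's exact weighting may differ.
from collections import Counter, defaultdict

def term_class_frequencies(docs, labels):
    """freq[t][c] = occurrences of term t in documents labelled with class c."""
    freq = defaultdict(Counter)
    for tokens, c in zip(docs, labels):
        for t in tokens:
            freq[t][c] += 1
    return freq

def significance(freq, classes):
    """sig[t][c] = freq(t, c) / total frequency of t over the given classes."""
    sig = {}
    for t, counts in freq.items():
        total = sum(counts[c] for c in classes)
        if total:
            sig[t] = {c: counts[c] / total for c in classes}
    return sig

def significance_vector(tokens, sig, classes):
    """Per-class average significance of the document's terms."""
    vec = dict.fromkeys(classes, 0.0)
    n = 0
    for t in tokens:
        if t in sig:
            n += 1
            for c in classes:
                vec[c] += sig[t][c]
    return {c: v / n for c, v in vec.items()} if n else vec

# Toy usage: two top-level classes, each with two subclasses (hypothetical data).
docs = [["stock", "market", "shares"], ["match", "goal", "league"],
        ["bond", "market", "yield"], ["race", "lap", "podium"]]
top = ["finance", "sport", "finance", "sport"]
sub = ["equities", "football", "bonds", "motorsport"]

top_sig = significance(term_class_frequencies(docs, top), set(top))
query = ["market", "shares", "yield"]
vec = significance_vector(query, top_sig, set(top))
subspace = max(vec, key=vec.get)  # subspace with the maximum significance value

# Conditional significance: recompute within the detected subspace only.
sub_docs = [d for d, t in zip(docs, top) if t == subspace]
sub_labels = [s for s, t in zip(sub, top) if t == subspace]
cond_sig = significance(term_class_frequencies(sub_docs, sub_labels), set(sub_labels))
cond_vec = significance_vector(query, cond_sig, set(sub_labels))
print(subspace, cond_vec)
```

In this sketch the conditional significance vector is what would replace the plain document vector inside the detected subspace, sharpening the contrast between its subclasses; the baseline in the paper instead represents documents with TFIDF followed by PCA.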
@InProceedings{WHOT10,
  author    = {Wermter, Stefan and Hung, Chihli and Oakes, Michael Philip and Tripathi, Nandita},
  title     = {Semantic Subspace Learning with Conditional Significance Vectors},
  booktitle = {Proceedings of the International Joint Conference on Neural Networks (IJCNN 2010)},
  pages     = {3670--3677},
  year      = {2010},
  month     = jul,
  publisher = {IEEE},
  doi       = {10.1109/IJCNN.2010.5596640},
}