Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/41425
Title: A Maximal Figure-of-Merit Learning Approach to Text Categorization
Authors: Gao, S.
Wu, W.
Lee, C.-H. 
Chua, T.-S. 
Keywords: Decision tree
Generalized probabilistic descent method
Latent semantic indexing
Maximal figure-of-merit
Support vector machines
Text categorization
Issue Date: 2003
Source: Gao, S., Wu, W., Lee, C.-H., Chua, T.-S. (2003). A Maximal Figure-of-Merit Learning Approach to Text Categorization. SIGIR Forum (ACM Special Interest Group on Information Retrieval) (SPEC. ISS.): 174-181. ScholarBank@NUS Repository.
Abstract: A novel maximal figure-of-merit (MFoM) learning approach to text categorization is proposed. Unlike conventional techniques, the proposed MFoM method attempts to integrate any performance metric of interest (e.g. accuracy, recall, precision, or the F1 measure) into the design of any classifier. The corresponding classifier parameters are learned by optimizing an overall objective function of interest. To solve this highly nonlinear optimization problem, we use a generalized probabilistic descent algorithm. The MFoM learning framework is evaluated on the Reuters-21578 task with LSI-based feature extraction and a binary tree classifier. Experimental results indicate that the MFoM classifier gives improved F1 and enhanced robustness over the conventional one. It also outperforms the popular SVM method in micro-averaged F1. Other extensions to design discriminative multiple-category MFoM classifiers for application scenarios with new performance metrics could also be envisioned.
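The core idea the abstract describes — replacing hard 0/1 classification decisions with a smooth surrogate so that a metric such as F1 becomes differentiable, then optimizing classifier parameters against that metric directly — can be illustrated with a minimal sketch. This is not the paper's implementation: it uses a plain linear classifier on synthetic data, batch gradient ascent instead of the generalized probabilistic descent algorithm, and a numerical gradient for brevity; all names (`soft_f1`, `alpha`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable two-class data (stand-in for document feature vectors,
# e.g. LSI features as in the paper).
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = (X @ w_true > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_f1(w, X, y, alpha=5.0):
    # Smooth the hard decision with a sigmoid so TP/FP/FN, and hence F1,
    # become differentiable functions of the classifier parameters w.
    p = sigmoid(alpha * (X @ w))
    tp = np.sum(p * y)
    fp = np.sum(p * (1.0 - y))
    fn = np.sum((1.0 - p) * y)
    return 2.0 * tp / (2.0 * tp + fp + fn + 1e-12)

def grad_soft_f1(w, X, y, alpha=5.0, eps=1e-6):
    # Central-difference gradient keeps the sketch short; an analytic
    # gradient is what a real GPD-style learner would use.
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (soft_f1(w + e, X, y, alpha)
                - soft_f1(w - e, X, y, alpha)) / (2.0 * eps)
    return g

w = np.zeros(5)
f1_before = soft_f1(w, X, y)
for _ in range(300):              # gradient ascent directly on smoothed F1
    w += 0.5 * grad_soft_f1(w, X, y)
f1_after = soft_f1(w, X, y)
```

Because the objective is the (smoothed) evaluation metric itself rather than a proxy such as likelihood or hinge loss, training improves the quantity that will actually be reported, which is the motivation behind the MFoM framework.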
Source Title: SIGIR Forum (ACM Special Interest Group on Information Retrieval)
URI: http://scholarbank.nus.edu.sg/handle/10635/41425
ISSN: 0163-5840
Appears in Collections: Staff Publications
