[1] 王瑞燕, 赵庚星, 李涛. 山东省青州市耕地地力等级评价研究[J]. 土壤, 2004, 36(1): 76-80.
[2] 鲁明星, 贺立源, 吴礼树, 等. 基于GIS的华中丘陵区耕地地力评价研究[J]. 农业工程学报, 2006, 22(8): 96-101.
[3] 袁秀杰, 赵庚星, 朱雪欣. 平原和丘陵区耕地地力评价及其指标体系衔接研究[J]. 农业工程学报, 2008, 24(7): 65-71.
[4] 方灿华, 马友华, 钱国平, 等. 基于GIS的明光市耕地地力评价[J]. 中国农学通报, 2008, 24(12): 308-312.
[5] 李贤胜, 叶军华, 杨平, 等. 基于GIS的广德县耕地地力定量评价[J]. 土壤, 2009, 41(3): 490-494.
[6] 刘永文, 樊燕, 刘洪斌. 丘陵山地耕地地力评价研究[J]. 中国农学通报, 2009, 25(18): 420-425.
[7] Dung E J, Sugumaran R. Development of an agricultural land evaluation and site assessment (LESA) decision support tool using remote sensing and geographic information system [J]. Journal of Soil and Water Conservation, 2005, 60(5): 228-235.
[8] 王瑞燕, 赵庚星, 陈丽丽. 基于ANN-产量的耕地地力定量评价模型及其应用[J]. 农业工程学报, 2008, 24(1): 113-118.
[9] 阎平凡, 张长水. 人工神经网络与模拟进化计算[M]. 北京: 清华大学出版社, 2000.
[10] 王凌. 智能优化算法及其应用[M]. 北京: 清华大学出版社, 2001.
[11] 邓乃扬, 田英杰. 数据挖掘中的新方法——支持向量机[M]. 北京: 科学出版社, 2004.
[12] Cristianini N, Shawe-Taylor J. 支持向量机导论[M]. 北京: 机械工业出版社, 2005.
[13] 曹丽娟, 王小明. 金融工程的支持向量机方法[M]. 上海: 上海财经大学出版社, 2007.
[14] 梁吉业, 曲开社, 徐宗本. 信息系统的属性约简[J]. 系统工程理论与实践, 2001(12): 76-80.
[15] 安利平. 基于粗糙集理论的多属性决策分析[M]. 北京: 科学出版社, 2008.
[16] Goldberg D E. Genetic Algorithms in Search, Optimization and Machine Learning [M]. Reading, MA: Addison-Wesley, 1989.
[17] Kirkpatrick S, Gelatt C D, Vecchi M P. Optimization by simulated annealing [J]. Science, 1983, 220(4598): 671-680.
[18] Skowron A, Rauszer C. The discernibility matrices and functions in information systems //Slowinski R. Intelligent Decision Support, Handbook of Applications and Advances of the Rough Sets Theory. Dordrecht: Kluwer Academic Publishers, 1992: 331-362.
[19] Nguyen S H, Nguyen H S. Some efficient algorithms for rough set methods //Proceedings of the Fifth International Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems (IPMU'96). Granada, Spain, 1996: 1451-1456.
[20] Hu Xiaohua. Knowledge Discovery in Databases: An Attribute-oriented Rough Set Approach [D]. University of Regina, Canada, 1995.
[21] 苗夺谦, 胡桂荣. 知识约简的一种启发式算法[J]. 计算机研究与发展, 1999, 36(6): 681-684.
[22] Wroblewski J. Finding minimal reducts using genetic algorithms //Wang P P. Proceedings of the Second Annual Joint Conference on Information Sciences. Wrightsville Beach, North Carolina, 1995: 186-189.
[23] Vinterbo S, Øhrn A. Minimal approximate hitting sets and rule templates [J]. International Journal of Approximate Reasoning, 2000, 25(2): 123-143.
[24] Kryszkiewicz M, Rybinski H. Finding reducts in composed information systems //Ziarko W P. Proceedings of the Second International Workshop on Rough Sets and Knowledge Discovery (RSKD'93). Banff, Alberta, Canada, 1993: 261-273.
[25] Starzyk J, Nelson D E, Sturtz K. Reduct generation in information system [J]. Bulletin of International Rough Set Society, 1999, 3(1): 19-22.
[26] Bazan J G, Skowron A, Synak P. Dynamic reducts as a tool for extracting laws from decision tables //Polkowski L, Skowron A. International Symposium on Methodologies for Intelligent Systems. Charlotte, NC: Springer-Verlag, 1994: 346-355.
[27] Hashemi R R, Jelovsek F R, Razzaghi M. Developmental toxicity risk assessment: A rough sets approach [J]. Methods of Information in Medicine, 1993, 32(1): 47-54.
[28] 张文修, 魏玲, 祁建军. 概念格的属性约简理论与方法[J]. 中国科学E辑: 信息科学, 2005, 35(6): 628-639.
[29] 范昕炜, 杜树新, 吴铁军. 可补偿类别差异的加权支持向量机算法[J]. 中国图象图形学报, 2003, 8(9): 1037-1042.
[30] Kreßel U H G. Pairwise classification and support vector machines //Schölkopf B, Burges C J C, Smola A J. Advances in Kernel Methods: Support Vector Learning. Cambridge, Massachusetts: MIT Press, 1999: 255-268.
[31] Bottou L, Cortes C, Denker J, et al. A comparison of classifier methods: A case study in handwritten digit recognition //Proceedings of the 12th International Conference on Pattern Recognition. Jerusalem, 1994: 77-87.
[32] Takahashi F, Abe S. Decision-tree-based multi-class support vector machines //Proceedings of the Ninth International Conference on Neural Information Processing. Singapore, 2002: 1418-1422.
[33] Platt J C, Cristianini N, Shawe-Taylor J. Large margin DAGs for multi-class classification //Advances in Neural Information Processing Systems. Cambridge, Massachusetts: MIT Press, 2000: 547-553.
[34] Dietterich T G, Bakiri G. Solving multi-class learning problems via error-correcting output codes [J]. Journal of Artificial Intelligence Research, 1995, 2: 263-286.
[35] Sebald D J, Bucklew J A. Support vector machines and the multiple hypothesis test problem [J]. IEEE Transactions on Signal Processing, 2001, 49(11): 2865-2872.
[36] 王睿. 关于支持向量机参数选择方法分析[J]. 重庆师范大学学报: 自然科学版, 2007, 24(2): 36-38, 42.
[37] 邓乃扬, 田英杰. 支持向量机——理论、算法与拓展[M]. 北京: 科学出版社, 2009.
[38] 苏高利, 邓芳萍. 关于支持向量回归机的模型选择[J]. 科技通报, 2006, 22(2): 154-158.
[39] Chapelle O, Vapnik V, Bousquet O, et al. Choosing multiple parameters for support vector machines [J]. Machine Learning, 2002, 46(1/3): 131-159.
[40] Hsu C W, Chang C C, Lin C J. A practical guide to support vector classification. Department of Computer Science and Information Engineering, National Taiwan University, 2003.
[41] Zheng Chunhong, Jiao Licheng. Automatic parameters selection for SVM based on GA //Proceedings of the 5th World Congress on Intelligent Control and Automation. Hangzhou, China, 2004: 1869-1872.
[42] 刘胜, 李妍妍. 自适应GA-SVM参数选择算法研究[J]. 哈尔滨工程大学学报, 2007, 28(4): 398-402.
[43] Lee T F, Cho M Y, Shieh C S, et al. Particle swarm optimization-based SVM for incipient fault classification of power transformers //Esposito F, Ras Z W, Malerba D, et al. Proceedings of the 16th International Symposium on Methodologies for Intelligent Systems, Foundations of Intelligent Systems, Lecture Notes in Computer Science. Berlin Heidelberg: Springer-Verlag, 2006: 84-90.
[44] Huang C M, Lee Y J, Lin D K J, et al. Model selection for support vector machines via uniform design [J]. Computational Statistics & Data Analysis, 2007, 52(1): 335-346.
[45] 林丹, 李敏强, 寇纪淞. 基于实数编码的遗传算法的收敛性研究[J]. 计算机研究与发展, 2000, 37(11): 1311-1327.