WANG Peizhuang, LU Chenguang. From random set falling shadows to a random point falling shadow: membership functions for machine learning[J]. CAAI Transactions on Intelligent Systems, 2025, 20(2): 305-315. [doi:10.11992/tis.202309028]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume: 20
Issue: 2025, No. 2
Pages: 305-315
Section: Review
Publication date: 2025-03-05
- Title: From random set falling shadows to a random point falling shadow: membership functions for machine learning
- Author(s): WANG Peizhuang, LU Chenguang
- Affiliation: Intelligence Engineering and Mathematics Institute, Liaoning Technical University, Fuxin 123000, China
- Keywords: fuzzy set; membership function; sampling distribution; semantic information measure; machine learning; multilabel classification; maximum mutual information classification; mixed model; Bayesian confirmation
- CLC number: TP3; O21; O23
- DOI: 10.11992/tis.202309028
- Abstract: Obtaining membership functions from sample distributions is essential and challenging. Wang Peizhuang's random set falling shadow theory uses set-valued statistics to derive membership functions, bridging the gap between statistics and fuzzy logic. However, ordinary samples do not include set values, which limits the practical applicability of this theory. Lu Chenguang addressed this issue by using a semantic information method to derive two formulas for optimizing membership functions from sample distributions; because their results agree with set-valued statistics, the approach can be called the random point falling shadow method. The resulting membership functions have applications in multilabel classification, maximum mutual information classification, mixed models, and Bayesian confirmation. Furthermore, the similarity functions and estimated mutual information used in recent deep learning are special cases of the membership function and semantic mutual information. Because the maximum semantic information criterion is compatible with the maximum likelihood criterion and the regularized least squares error criterion, and because the membership function transfers better than the likelihood function and is easier to construct than the inverse probability function, the membership function holds considerable potential for widespread use in machine learning.
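The abstract mentions two formulas for optimizing membership functions from sample distributions but does not state them. As a hedged illustration only: in Lu's published semantic information work, one optimized truth (membership) function is commonly written as T*(θ_j|x) = P(y_j|x) / max_x P(y_j|x), i.e., the label's conditional probability normalized by its maximum over x. The sketch below assumes that form; the function name and the toy counts are invented for illustration, not taken from the paper.

```python
import numpy as np

def membership_from_samples(x_counts, joint_counts):
    """Estimate T*(theta_j|x) from co-occurrence counts (assumed formula:
    T*(theta_j|x) = P(y_j|x) / max_x P(y_j|x), per Lu's semantic
    information papers; not given explicitly in this abstract).

    x_counts:     shape (n_x,)      counts of each instance value x
    joint_counts: shape (n_x, n_y)  counts of (x, y_j) co-occurrences
    """
    p_y_given_x = joint_counts / x_counts[:, None]   # P(y_j | x), row-wise
    return p_y_given_x / p_y_given_x.max(axis=0)     # normalize per label

# Toy example: 4 instance values, 2 labels (invented data).
x_counts = np.array([10, 20, 30, 40], dtype=float)
joint = np.array([[9, 1], [14, 6], [9, 21], [4, 36]], dtype=float)
T = membership_from_samples(x_counts, joint)
print(T)  # each column peaks at 1 where P(y_j|x) is largest
```

Unlike a likelihood function, the resulting T* peaks at 1 by construction, which is what makes it interpretable as a fuzzy membership grade rather than a probability.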
Memo
Received: 2023-09-15.
Foundation item: Major Program of the National Natural Science Foundation of China (9688007-1).
Biographies: WANG Peizhuang, professor and doctoral supervisor, former vice president of the International Fuzzy Systems Association. His main research interest is fuzzy mathematics and its applications in artificial intelligence. He has received several national and ministerial awards and one international award, published more than 200 academic papers, and authored 4 academic books. E-mail: peizhuangw@126.com. LU Chenguang, visiting professor at Liaoning Technical University. His main research interests are semantic information theory, machine learning, color vision mechanisms, portfolio investment, aesthetics, and evolution. He has published more than 40 academic papers and 4 academic monographs. E-mail: lcguang@foxmail.com.
Corresponding author: LU Chenguang. E-mail: lcguang@foxmail.com.
Last Update: 2025-03-05