[1] WANG Peizhuang, LU Chenguang. From random set falling shadows to a random point falling shadow: membership functions for machine learning[J]. CAAI Transactions on Intelligent Systems, 2025, 20(2): 305-315. [doi:10.11992/tis.202309028]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume: 20
Issue: 2025(2)
Pages: 305-315
Column: Review
Publication date: 2025-03-05
- Title: From random set falling shadows to a random point falling shadow: membership functions for machine learning
- Author(s): WANG Peizhuang; LU Chenguang
- Affiliation: Intelligence Engineering and Mathematics Institute, Liaoning Technical University, Fuxin 123000, China
- Keywords: fuzzy set; membership function; sampling distribution; semantic information measure; machine learning; multilabel classification; maximum mutual information classification; mixed model; Bayesian confirmation
- CLC: TP3; O21; O23
- DOI: 10.11992/tis.202309028
- Abstract: Obtaining membership functions from sample distributions is essential and challenging. Wang Peizhuang’s random set falling shadow theory uses set-valued statistics to derive membership functions, bridging the gap between statistics and fuzzy logic. However, traditional samples do not include set values, which limits the practical applicability of this theory. Lu Chenguang addressed this issue by using a semantic information method to derive two formulas for optimizing membership functions from sample distributions. This method, known as the random point falling shadow method, is compatible with set-valued statistics. The resulting membership functions have applications in multilabel classification, maximum mutual information classification, mixed models, and Bayesian confirmation. Furthermore, the similarity function and estimated mutual information in modern deep learning techniques are special cases of the membership function and semantic mutual information. The maximum semantic information criterion is compatible with the maximum likelihood criterion and the regularized least-squares error criterion, and the membership function is more transferable and easier to construct than likelihood functions or inverse probability functions. Thus, the membership function and the semantic information method hold considerable potential for widespread use in machine learning.
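The abstract describes deriving membership functions from ordinary (point-valued) sample distributions rather than set-valued statistics. A minimal sketch of that idea, assuming one common normalization (dividing the empirical conditional probability P(y|x) by its maximum over x so the peak membership equals 1); the function name, the toy data, and the labels are illustrative, not taken from the paper:

```python
import numpy as np

def membership_from_samples(x_bins, labels, target_label):
    """Estimate a membership function m(x) = P(y|x) / max_x P(y|x)
    from a plain point-valued sample, over the distinct x values."""
    x_bins = np.asarray(x_bins)
    labels = np.asarray(labels)
    values = np.unique(x_bins)  # sorted distinct x values
    # Empirical conditional probability of the target label at each x value
    p_y_given_x = np.array(
        [np.mean(labels[x_bins == v] == target_label) for v in values]
    )
    peak = p_y_given_x.max()
    # Normalize so the most typical x has membership 1
    membership = p_y_given_x / peak if peak > 0 else p_y_given_x
    return values, membership

# Toy sample: the label "tall" becomes more frequent as height grows.
rng = np.random.default_rng(0)
heights = rng.integers(150, 200, size=5000) // 10 * 10  # bin to decades
labels = np.where(rng.random(5000) < (heights - 150) / 50, "tall", "not")
xs, m = membership_from_samples(heights, labels, "tall")
```

Because the estimate needs only label frequencies at each x, it can be read directly off a sample distribution, which is what makes such membership functions easier to construct and transfer than full likelihood functions.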