Transforming big data into smart data: An insight on the use of the k‐nearest neighbors algorithm to obtain quality data
The k‐nearest neighbors algorithm is characterized as a simple yet effective data mining
technique. The main drawback of this technique appears when massive amounts of data …
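As a point of reference for the technique this entry surveys, a minimal k-nearest neighbors classifier can be sketched in plain Python (illustrative only; the function and variable names are not from the paper):

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Brute-force Euclidean distance to every training example; this linear
    # scan is exactly the scalability drawback the entry notes for massive data.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Tiny example: two well-separated classes.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.05, 0.1)))  # query near class "a"
```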
A tutorial on distance metric learning: Mathematical foundations, algorithms, experimental analysis, prospects and challenges
Distance metric learning is a branch of machine learning that aims to learn distances from
the data, which enhances the performance of similarity-based algorithms. This tutorial …
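Distance metric learning methods of the kind this tutorial covers typically parameterize a Mahalanobis-type distance d(x, y) = sqrt((x - y)^T M (x - y)) with M positive semidefinite, often factored as M = L^T L. A minimal sketch of evaluating such a distance (illustrative, not the tutorial's code):

```python
import numpy as np

def mahalanobis(x, y, L):
    """Distance under the metric M = L^T L (PSD by construction)."""
    diff = L @ (np.asarray(x) - np.asarray(y))
    return float(np.sqrt(diff @ diff))

# With L = identity this reduces to the ordinary Euclidean distance.
print(mahalanobis([0.0, 0.0], [3.0, 4.0], np.eye(2)))  # 5.0

# A learned L can stretch or shrink directions; here the first axis counts double.
L2 = np.diag([2.0, 1.0])
print(mahalanobis([0.0, 0.0], [3.0, 4.0], L2))  # sqrt(36 + 16)
```

Learning consists of fitting L (or M directly) so that same-class pairs come out close and different-class pairs far apart under this distance.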
Challenges in KNN classification
S Zhang - IEEE Transactions on Knowledge and Data …, 2021 - ieeexplore.ieee.org
The KNN algorithm is one of the most popular data mining algorithms. It has been widely
and successfully applied to data analysis applications across a variety of research topics in …
Building an effective intrusion detection system using the modified density peak clustering algorithm and deep belief networks
Y Yang, K Zheng, C Wu, X Niu, Y Yang - Applied Sciences, 2019 - mdpi.com
Featured Application: The model proposed in this paper can be deployed to the enterprise
gateway, dynamically monitor network activities, and connect with the firewall to protect the …
A novel ensemble method for k-nearest neighbor
In this paper, to address the issue that ensembling k-nearest neighbor (kNN) classifiers with
resampling approaches cannot generate component classifiers with a large diversity, we …
High-dimensional Bayesian optimisation with variational autoencoders and deep metric learning
We introduce a method combining variational autoencoders (VAEs) and deep metric
learning to perform Bayesian optimisation (BO) over high-dimensional and structured input …
Active contour model based on local Kullback–Leibler divergence for fast image segmentation
C Yang, G Weng, Y Chen - Engineering Applications of Artificial …, 2023 - Elsevier
The inhomogeneity of image intensity and noise are the main factors that affect the
segmentation results. To overcome these challenges, a new active contour model is …
Kullback–Leibler divergence metric learning
The Kullback–Leibler divergence (KLD), which is widely used to measure the similarity
between two distributions, plays an important role in many applications. In this article, we …
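For reference, the divergence this article builds on is, for discrete distributions, D_KL(P || Q) = Σ p_i log(p_i / q_i). A small self-contained sketch (not the article's algorithm):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability lists."""
    # Terms with p_i == 0 contribute nothing (0 * log 0 := 0 by convention).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0: zero divergence from itself
print(kl_divergence(p, q))  # positive; note KLD is asymmetric: D(P||Q) != D(Q||P)
```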
Worst-case discriminative feature learning via max-min ratio analysis
We propose a novel discriminative feature learning method via Max-Min Ratio Analysis
(MMRA) for exclusively dealing with the long-standing “worst-case class separation” …
Kernel-Based Distance Metric Learning for Supervised k-Means Clustering
B Nguyen, B De Baets - IEEE transactions on neural networks …, 2019 - ieeexplore.ieee.org
Finding an appropriate distance metric that accurately reflects the (dis)similarity between
examples is a key to the success of k-means clustering. While it is not always an easy task to …
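The role the distance metric plays in k-means, which this entry highlights, is visible in a bare-bones Lloyd's-iteration sketch. The distance function is the single pluggable part (plain Euclidean below; the paper's contribution is learning a better, kernel-based metric, which this toy code does not attempt):

```python
import math

def kmeans(points, centers, iters=10):
    """Plain Lloyd's algorithm for k-means clustering."""
    dist = math.dist  # swap in a learned metric here
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda i: dist(p, centers[i]))
            clusters[i].append(p)
        # Update step: each center moves to the mean of its cluster.
        centers = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else ctr
            for cl, ctr in zip(clusters, centers)
        ]
    return centers

pts = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0), (5.2, 5.0)]
print(kmeans(pts, [(0.0, 1.0), (4.0, 4.0)]))  # settles near (0.1, 0) and (5.1, 5)
```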