Transforming big data into smart data: An insight on the use of the k‐nearest neighbors algorithm to obtain quality data

I Triguero, D García‐Gil, J Maillo… - … : Data Mining and …, 2019 - Wiley Online Library
The k‐nearest neighbors algorithm is characterized as a simple yet effective data mining
technique. The main drawback of this technique appears when massive amounts of data …
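
A minimal brute-force k-NN classification sketch in NumPy (not code from the paper), illustrating the drawback the snippet alludes to: every query is compared against the entire training set, which is what breaks down at massive scale. All names and data below are illustrative.

import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    """Brute-force k-nearest-neighbor classification by majority vote."""
    preds = []
    for x in X_query:
        # Euclidean distance from the query to every stored training example
        dists = np.linalg.norm(X_train - x, axis=1)
        # Indices of the k closest training examples
        nearest = np.argsort(dists)[:k]
        # Majority vote among their labels
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([[0.05, 0.1], [1.0, 0.9]]), k=3))  # -> [0 1]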

A tutorial on distance metric learning: Mathematical foundations, algorithms, experimental analysis, prospects and challenges

JL Suárez, S García, F Herrera - Neurocomputing, 2021 - Elsevier
Distance metric learning is a branch of machine learning that aims to learn distances from
the data, which enhances the performance of similarity-based algorithms. This tutorial …
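
A minimal sketch of what "learning a distance" typically produces (not the tutorial's own code): many methods fit a Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)) with M = L^T L, which equals the Euclidean distance after the linear map x -> Lx. The matrix L below is a hard-coded placeholder for whatever a concrete algorithm (LMNN, NCA, ...) would learn from data.

import numpy as np

def mahalanobis(x, y, L):
    """Distance under M = L.T @ L, i.e. Euclidean distance after mapping by L."""
    diff = L @ (x - y)
    return float(np.sqrt(diff @ diff))

# Placeholder transform: stretches the first feature, shrinks the second.
L = np.array([[2.0, 0.0],
              [0.0, 0.5]])

x, y = np.array([1.0, 1.0]), np.array([0.0, 0.0])
print(mahalanobis(x, y, L))   # distance under the learned metric (~2.06)
print(np.linalg.norm(x - y))  # plain Euclidean distance (~1.41), for comparison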

Challenges in KNN classification

S Zhang - IEEE Transactions on Knowledge and Data …, 2021 - ieeexplore.ieee.org
The KNN algorithm is one of the most popular data mining algorithms. It has been widely
and successfully applied to data analysis applications across a variety of research topics in …

Building an effective intrusion detection system using the modified density peak clustering algorithm and deep belief networks

Y Yang, K Zheng, C Wu, X Niu, Y Yang - Applied Sciences, 2019 - mdpi.com
Featured Application: The model proposed in this paper can be deployed at the enterprise
gateway to dynamically monitor network activities and connect with the firewall to protect the …

A novel ensemble method for k-nearest neighbor

Y Zhang, G Cao, B Wang, X Li - Pattern Recognition, 2019 - Elsevier
In this paper, to address the issue that ensembling k-nearest neighbor (kNN) classifiers with
resampling approaches cannot generate component classifiers with large diversity, we …
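
For context only, a sketch of the baseline this paper argues against: a resampling (bagging) ensemble of kNN classifiers, written here with scikit-learn. The synthetic dataset and parameters are arbitrary illustrations, not the paper's setup.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Bootstrap-resampled kNN ensemble: the standard resampling approach that,
# per the abstract, yields component classifiers with little diversity.
ensemble = BaggingClassifier(KNeighborsClassifier(n_neighbors=5),
                             n_estimators=20, random_state=0)
ensemble.fit(X, y)
print(ensemble.score(X, y))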

High-dimensional Bayesian optimisation with variational autoencoders and deep metric learning

A Grosnit, R Tutunov, AM Maraval, RR Griffiths… - arXiv preprint arXiv …, 2021 - arxiv.org
We introduce a method combining variational autoencoders (VAEs) and deep metric
learning to perform Bayesian optimisation (BO) over high-dimensional and structured input …

Active contour model based on local Kullback–Leibler divergence for fast image segmentation

C Yang, G Weng, Y Chen - Engineering Applications of Artificial …, 2023 - Elsevier
Intensity inhomogeneity and noise are the main factors that affect segmentation results. To
overcome these challenges, a new active contour model is …

Kullback–Leibler divergence metric learning

S Ji, Z Zhang, S Ying, L Wang, X Zhao… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
The Kullback–Leibler divergence (KLD), which is widely used to measure the similarity
between two distributions, plays an important role in many applications. In this article, we …
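
For quick reference (the underlying quantity, not the metric-learning method itself): for discrete distributions p and q, D_KL(p || q) = sum_i p_i * log(p_i / q_i). It is asymmetric, which is part of what makes turning it into a metric non-trivial. A small NumPy sketch:

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for discrete probability vectors; eps avoids log(0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

p, q = [0.7, 0.2, 0.1], [0.5, 0.3, 0.2]
print(kl_divergence(p, q))  # != kl_divergence(q, p): KLD is not symmetric
print(kl_divergence(q, p))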

Worst-case discriminative feature learning via max-min ratio analysis

Z Wang, F Nie, C Zhang, R Wang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
We propose a novel discriminative feature learning method via Max-Min Ratio Analysis
(MMRA) for exclusively dealing with the long-standing “worst-case class separation” …

Kernel-Based Distance Metric Learning for Supervised k-Means Clustering

B Nguyen, B De Baets - IEEE Transactions on Neural Networks …, 2019 - ieeexplore.ieee.org
Finding an appropriate distance metric that accurately reflects the (dis)similarity between
examples is a key to the success of k-means clustering. While it is not always an easy task to …
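
A rough sketch of the general idea (not the kernel-based method of the paper): k-means under a Mahalanobis metric M = L^T L is equivalent to mapping the data through L and running ordinary Euclidean k-means. The transform L below is a hard-coded stand-in for what a supervised metric-learning step would produce.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))  # toy, unlabeled data for illustration

# Stand-in for a learned linear transform (a real method would fit it from labels).
L = np.array([[3.0, 0.0],
              [0.0, 1.0]])

# k-means with the metric M = L.T @ L == Euclidean k-means on the mapped data.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X @ L.T)
print(np.bincount(labels))  # cluster sizes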