Keyword search (4,163 papers available)

"Bouguila N" Authored Publications:

No. Title | Authors | PubMed ID | Dept.

1. Neural topic modeling on hyperspheres: Spherical representation learning with von Mises-Fisher mixtures | Guo D; Luo Z; Bouguila N; Fan W | PMID: 41791177 | ENCS
2. Disentangled representation learning for multi-view clustering via von Mises-Fisher hyperspherical embedding | Li Z; Luo Z; Bouguila N; Su W; Fan W | PMID: 40664160 | ENCS
3. Clustering and Interpretability of Residential Electricity Demand Profiles | Kallel S; Amayri M; Bouguila N | PMID: 40218540 | ENCS
4. SAVE: Self-Attention on Visual Embedding for Zero-Shot Generic Object Counting | Zgaren A; Bouachir W; Bouguila N | PMID: 39997554 | ENCS
5. Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings | Guo J; Fan W; Amayri M; Bouguila N | PMID: 39662201 | ENCS
6. FishSegSSL: A Semi-Supervised Semantic Segmentation Framework for Fish-Eye Images | Paul S; Patterson Z; Bouguila N | PMID: 38535151 | ENCS
7. Perceptions of self-monitoring dietary intake according to a plate-based approach: A qualitative study | Kheirmandparizi M; Gouin JP; Bouchaud CC; Kebbe M; Bergeron C; Madani Civi R; Rhodes RE; Farnesi BC; Bouguila N; Conklin AI; Lear SA; Cohen TR | PMID: 38015899 | PERFORM
8. Unsupervised Mixture Models on the Edge for Smart Energy Consumption Segmentation with Feature Saliency | Al-Bazzaz H; Azam M; Amayri M; Bouguila N | PMID: 37837127 | ENCS
9. Data-Weighted Multivariate Generalized Gaussian Mixture Model: Application to Point Cloud Robust Registration | Ge B; Najar F; Bouguila N | PMID: 37754943 | ENCS
10. Human Activity Recognition with an HMM-Based Generative Model | Manouchehri N; Bouguila N | PMID: 36772428 | ENCS
11. Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications | Luo Z; Amayri M; Fan W; Bouguila N | PMID: 36685642 | ENCS
12. Weakly Supervised Occupancy Prediction Using Training Data Collected via Interactive Learning | Bouhamed O; Amayri M; Bouguila N | PMID: 35590880 | ENCS
13. Entropy-Based Variational Scheme with Component Splitting for the Efficient Learning of Gamma Mixtures | Bourouis S; Pawar Y; Bouguila N | PMID: 35009726 | ENCS
14. Bayesian Learning of Shifted-Scaled Dirichlet Mixture Models and Its Application to Early COVID-19 Detection in Chest X-ray Images | Bourouis S; Alharbi A; Bouguila N | PMID: 34460578 | ENCS


Title: Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings
Authors: Guo J; Fan W; Amayri M; Bouguila N
Link: https://pubmed.ncbi.nlm.nih.gov/39662201/
DOI: 10.1016/j.neunet.2024.106979
Publication: Neural Networks: the official journal of the International Neural Network Society
Keywords: Clustering; Data augmentation; Gamma mixture models; VAE; Variational inference
PMID: 39662201
Category:
Date Added: 2024-12-12
Dept Affiliation: ENCS

Author affiliations:
1. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: g_jiax@encs.concordia.ca.
2. Guangdong Provincial Key Laboratory IRADS and Department of Computer Science, Beijing Normal University-Hong Kong Baptist University United International College, Zhuhai, Guangdong, China. Electronic address: wentaofan@uic.edu.cn.
3. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: manar.amayri@concordia.ca.
4. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: nizar.bouguila@concordia.ca.

Description:

This article proposes a novel deep clustering model based on the variational autoencoder (VAE), named GamMM-VAE, which learns latent representations of training data for clustering in an unsupervised manner. First, whereas most existing VAE-based deep clustering methods place a Gaussian mixture model (GMM) prior on the latent space, we employ a more flexible asymmetric Gamma mixture model to obtain higher-quality embeddings of the data latent space. Second, since the Gamma distribution is defined only for strictly positive variables, we propose a transformation from the Gaussian distribution to the Gamma distribution so that the reparameterization trick of the VAE can still be exploited. This method can also be regarded as a Gamma distribution reparameterization trick: it allows gradients to be backpropagated through the sampling process in the VAE. Finally, we derive the evidence lower bound (ELBO) based on the Gamma mixture model in a form suited to the stochastic gradient variational Bayes (SGVB) estimator used to optimize the proposed model. The ELBO is the variational-inference objective whose maximization tightens the approximation to the posterior distribution, while SGVB is a method for efficient inference and learning in VAEs. We validate the effectiveness of our model through quantitative comparisons with other state-of-the-art deep clustering models on six benchmark datasets. Moreover, owing to the generative nature of VAEs, the proposed model can generate highly realistic samples of specific classes without supervised information.
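The abstract's key technical ingredient is a differentiable map from a Gaussian sample to a Gamma sample, so that the usual VAE reparameterization trick still applies. The paper's exact transformation is not given here; a minimal sketch of one classical option, the Wilson-Hilferty cube-root approximation, is shown below (the function name `gamma_reparameterize` and the clamping constant are illustrative assumptions, not from the paper):

```python
import numpy as np

def gamma_reparameterize(alpha, beta, eps):
    """Map a standard-normal draw eps to an approximate Gamma(alpha, beta)
    sample via the Wilson-Hilferty cube-root transform. The output is a
    differentiable function of (alpha, beta), which is what the VAE's
    reparameterization trick requires. Accurate for moderately large alpha."""
    z = (1.0 - 1.0 / (9.0 * alpha) + eps / np.sqrt(9.0 * alpha)) ** 3
    # Clamp to keep samples strictly positive, as the Gamma support demands.
    return np.maximum(alpha * z / beta, 1e-8)

rng = np.random.default_rng(0)
alpha, beta = 5.0, 2.0
samples = gamma_reparameterize(alpha, beta, rng.standard_normal(100_000))
# Gamma(alpha, beta) has mean alpha / beta = 2.5; the Monte Carlo mean
# of the transformed samples should land close to that.
print(samples.mean())
```

In a deep-learning framework the same transform would be written with framework tensors so autograd can push gradients through `alpha` and `beta` into the encoder network; NumPy is used here only to keep the sketch self-contained.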





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026