"Amayri M" Authored Publications:
| # | Title | Authors | PubMed ID | Dept. |
|---|---|---|---|---|
| 1 | Clustering and Interpretability of Residential Electricity Demand Profiles | Kallel S; Amayri M; Bouguila N | 40218540 | ENCS |
| 2 | Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings | Guo J; Fan W; Amayri M; Bouguila N | 39662201 | ENCS |
| 3 | Unsupervised Mixture Models on the Edge for Smart Energy Consumption Segmentation with Feature Saliency | Al-Bazzaz H; Azam M; Amayri M; Bouguila N | 37837127 | ENCS |
| 4 | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications | Luo Z; Amayri M; Fan W; Bouguila N | 36685642 | ENCS |
| 5 | Weakly Supervised Occupancy Prediction Using Training Data Collected via Interactive Learning | Bouhamed O; Amayri M; Bouguila N | 35590880 | ENCS |
| Title: | Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings |
|---|---|
| Authors: | Guo J, Fan W, Amayri M, Bouguila N |
| Link: | https://pubmed.ncbi.nlm.nih.gov/39662201/ |
| DOI: | 10.1016/j.neunet.2024.106979 |
| Publication: | Neural Networks: the official journal of the International Neural Network Society |
| Keywords: | Clustering; Data augmentation; Gamma mixture models; VAE; Variational inference |
| PMID: | 39662201 |
| Category: | |
| Date Added: | 2024-12-12 |
| Dept Affiliation: | ENCS |

Author affiliations:

1. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: g_jiax@encs.concordia.ca.
2. Guangdong Provincial Key Laboratory IRADS and Department of Computer Science, Beijing Normal University-Hong Kong Baptist University United International College, Zhuhai, Guangdong, China. Electronic address: wentaofan@uic.edu.cn.
3. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: manar.amayri@concordia.ca.
4. CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: nizar.bouguila@concordia.ca.
Description:

This article proposes a novel deep clustering model based on the variational autoencoder (VAE), named GamMM-VAE, which can learn latent representations of training data for clustering in an unsupervised manner. Most existing VAE-based deep clustering methods use the Gaussian mixture model (GMM) as a prior on the latent space. First, we instead employ a more flexible asymmetric Gamma mixture model to achieve higher-quality embeddings of the data latent space. Second, since the Gamma distribution is defined only for strictly positive variables, we propose a transformation from the Gaussian distribution to the Gamma distribution in order to exploit the reparameterization trick of the VAE. This method can be viewed as a Gamma-distribution reparameterization trick, which allows gradients to be backpropagated through the sampling process in the VAE. Finally, we derive the evidence lower bound (ELBO) based on the Gamma mixture model in an effective way for the stochastic gradient variational Bayes (SGVB) estimator to optimize the proposed model. The ELBO, a variational inference objective, ensures maximization of the approximation to the posterior distribution, while SGVB is a method for efficient inference and learning in VAEs. We validate the effectiveness of our model through quantitative comparisons with other state-of-the-art deep clustering models on six benchmark datasets. Moreover, due to the generative nature of VAEs, the proposed model can generate highly realistic samples of specific classes without supervised information.
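The abstract mentions a differentiable Gaussian-to-Gamma transformation so that the VAE's reparameterization trick still applies. The paper's exact transformation is not reproduced in this record; as a hedged illustration of the general idea, the sketch below uses the well-known Wilson-Hilferty cube-root approximation, which maps standard-normal noise to an approximate Gamma sample through a smooth function of the shape and rate parameters (the function name and the clamping constant are this sketch's own choices, not the paper's):

```python
import numpy as np

def gamma_reparameterize(alpha, beta, eps):
    """Map standard-normal noise `eps` to approximate Gamma(alpha, beta)
    samples via the Wilson-Hilferty approximation.

    Because the output is a smooth function of (alpha, beta), gradients can
    flow through the sampling step, which is the point of reparameterization.
    """
    # Cubing a shifted/scaled Gaussian approximates Gamma(alpha, 1).
    x = alpha * (1.0 - 1.0 / (9.0 * alpha) + eps / np.sqrt(9.0 * alpha)) ** 3
    # Clamp to keep the sample on the Gamma support, then scale by the rate.
    return np.maximum(x, 1e-8) / beta

rng = np.random.default_rng(0)
eps = rng.standard_normal(100_000)
samples = gamma_reparameterize(alpha=5.0, beta=2.0, eps=eps)
print(samples.mean())  # Gamma(5, 2) has mean alpha / beta = 2.5
```

In a VAE this function would replace direct sampling from the latent prior: the encoder outputs (alpha, beta), fresh Gaussian noise is drawn, and the decoder consumes the transformed sample, so the whole pipeline stays differentiable in the encoder parameters.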



