
Neural topic modeling on hyperspheres: Spherical representation learning with von Mises-Fisher mixtures

Author(s): Guo D; Luo Z; Bouguila N; Fan W;

Neural topic models (NTMs) based on variational autoencoders (VAEs) have emerged as a scalable and flexible alternative to classical probabilistic models for uncovering latent thematic structures in text corpora. However, most existing NTMs either overlook the geometric structure of word embeddings or rely on Euclidean priors that are poorly aligned with ...

Article GUID: 41791177
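The abstract above contrasts Euclidean priors with distributions that respect the directional geometry of word embeddings. As a quick illustration only (not the paper's implementation), a minimal von Mises-Fisher log-density on the unit hypersphere can be written with NumPy/SciPy; the function and variable names here are my own:

```python
import numpy as np
from scipy.special import ive  # exponentially scaled Bessel I_v, for numerical stability

def vmf_log_pdf(x, mu, kappa):
    """Log-density of the von Mises-Fisher distribution on the unit
    (p-1)-sphere: f(x) = C_p(kappa) * exp(kappa * mu . x)."""
    p = mu.shape[0]
    # log C_p(kappa); ive(v, k) = iv(v, k) * exp(-k), so log iv = log ive + kappa
    log_c = ((p / 2 - 1) * np.log(kappa)
             - (p / 2) * np.log(2 * np.pi)
             - (np.log(ive(p / 2 - 1, kappa)) + kappa))
    return log_c + kappa * mu @ x

# Unit-normalize a vector (e.g., a word embedding) so it lies on the hypersphere
rng = np.random.default_rng(0)
v = rng.normal(size=5)
x = v / np.linalg.norm(v)
mu = np.eye(5)[0]  # mean direction

assert np.isclose(np.linalg.norm(x), 1.0)
# For kappa > 0, density concentrates around the mean direction mu
assert vmf_log_pdf(mu, mu, 10.0) > vmf_log_pdf(-mu, mu, 10.0)
```

The concentration parameter `kappa` plays the role an inverse variance plays for a Gaussian: larger values pull mass toward the mean direction `mu`.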


Disentangled representation learning for multi-view clustering via von Mises-Fisher hyperspherical embedding

Author(s): Li Z; Luo Z; Bouguila N; Su W; Fan W;

Multi-view clustering has gained significant attention due to its ability to integrate data from diverse perspectives, frequently outperforming single-view approaches. However, existing methods often assume a Gaussian distribution within the latent embedding space, which can degrade performance when handling high-dimensional data or data with complex, non ...

Article GUID: 40664160


Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications

Author(s): Luo Z; Amayri M; Fan W; Bouguila N;

Cross-collection topic models extend single-collection topic models, such as Latent Dirichlet Allocation (LDA), to multiple document collections. Their purpose is to model document-topic representations and to reveal both the similarities each topic shares across collections and the differences among them. However, the restriction of the Dirichlet prior and ...

Article GUID: 36685642
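For context on the LDA baseline this abstract builds on, the standard generative story can be sketched in a few lines; this is a generic illustration of LDA, not the cross-collection Beta-Liouville model described above, and all names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)
K, V, n_words = 3, 8, 20  # topics, vocabulary size, document length

# Topic-word distributions phi_k ~ Dirichlet(beta), one row per topic
phi = rng.dirichlet(np.full(V, 0.1), size=K)

def generate_doc(alpha=np.full(K, 0.5)):
    """One document under the LDA generative story: draw topic proportions
    theta ~ Dir(alpha); for each word, draw a topic z ~ Cat(theta) and a
    word w ~ Cat(phi_z)."""
    theta = rng.dirichlet(alpha)
    z = rng.choice(K, size=n_words, p=theta)
    return np.array([rng.choice(V, p=phi[zi]) for zi in z])

doc = generate_doc()
assert doc.shape == (n_words,)
assert doc.min() >= 0 and doc.max() < V
```

Cross-collection variants share the topic-level structure across collections while allowing collection-specific word distributions; replacing the Dirichlet prior (e.g., with a Beta-Liouville prior, as in the work above) relaxes its fixed covariance structure.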

