Keyword search (4,163 papers available)
"protein language model" Keyword-tagged Publications:
| # | Title | Authors | PubMed ID | Dept |
|---|---|---|---|---|
| 1 | Ion channel classification through machine learning and protein language model embeddings | Ghazikhani H; Butler G | 39572876 | ENCS |
| 2 | Exploiting protein language models for the precise classification of ion channels and ion transporters | Ghazikhani H; Butler G | 38656743 | CSFG |
| 3 | Enhanced identification of membrane transport proteins: a hybrid approach combining ProtBERT-BFD and convolutional neural networks | Ghazikhani H; Butler G | 37497772 | ENCS |
| Title: | Enhanced identification of membrane transport proteins: a hybrid approach combining ProtBERT-BFD and convolutional neural networks |
|---|---|
| Authors: | Ghazikhani H, Butler G |
| Link: | https://pubmed.ncbi.nlm.nih.gov/37497772/ |
| DOI: | 10.1515/jib-2022-0055 |
| Publication: | Journal of Integrative Bioinformatics |
| Keywords: | ProtBERT-BFD; neural network; protein language model; transformers; transmembrane transport proteins |
| PMID: | 37497772 |
| Category: |  |
| Date Added: | 2023-07-27 |
| Dept Affiliation: | ENCS |
Description:

Transmembrane transport proteins (transporters) play a crucial role in the fundamental cellular processes of all organisms by facilitating the transport of hydrophilic substrates across hydrophobic membranes. Despite the availability of numerous membrane protein sequences, their structures and functions remain largely elusive. Recently, natural language processing (NLP) techniques have shown promise in the analysis of protein sequences. Bidirectional Encoder Representations from Transformers (BERT) is an NLP technique adapted for proteins to learn contextual embeddings of individual amino acids within a protein sequence. Our previous strategy, TooT-BERT-T, differentiated transporters from non-transporters by employing a logistic regression classifier with fine-tuned representations from ProtBERT-BFD. In this study, we expand upon this approach by utilizing representations from ProtBERT, ProtBERT-BFD, and MembraneBERT in combination with classical classifiers. Additionally, we introduce TooT-BERT-CNN-T, a novel method that fine-tunes ProtBERT-BFD and discriminates transporters using a Convolutional Neural Network (CNN). Our experimental results reveal that the CNN surpasses traditional classifiers in discriminating transporters from non-transporters, achieving an MCC of 0.89 and an accuracy of 95.1% on the independent test set. These figures represent improvements of 0.03 in MCC and 1.11 percentage points in accuracy over TooT-BERT-T.
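The abstract reports model quality as a Matthews correlation coefficient (MCC) alongside accuracy. As a minimal sketch of how these two metrics are computed for a binary transporter/non-transporter classifier (the function and variable names here are illustrative, not from the paper's code):

```python
import math

def confusion_counts(y_true, y_pred):
    """Count TP, TN, FP, FN for binary labels (1 = transporter, 0 = non-transporter)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def mcc(y_true, y_pred):
    """Matthews correlation coefficient; returns 0.0 when any margin is zero."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example: 8 test proteins, one false negative and one false positive.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
print(mcc(y_true, y_pred))       # 0.5
print(accuracy(y_true, y_pred))  # 0.75
```

Unlike accuracy, MCC stays informative on imbalanced test sets, which is why it is commonly reported together with accuracy for protein classification benchmarks.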