Introduction
The Transformer is a landmark neural network architecture, first introduced in the 2017 paper "Attention Is All You Need" by a research team at Google. Unlike traditional RNN and LSTM models, the Transformer dispenses with recurrence entirely and relies on the self-attention mechanism to capture dependencies between positions in a sequence.
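To make the mechanism concrete, below is a minimal sketch of scaled dot-product self-attention in NumPy. The function name, array shapes, and weight matrices are illustrative assumptions for a single attention head, not code from the paper:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q = x @ w_q  # queries, (seq_len, d_k)
    k = x @ w_k  # keys,    (seq_len, d_k)
    v = x @ w_v  # values,  (seq_len, d_v)
    scores = q @ k.T / np.sqrt(k.shape[-1])                    # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ v  # each position is a weighted mix of all positions

# Toy usage: 4 tokens, model width 8, single head of width 8 (hypothetical sizes)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (4, 8)
```

Each output row is a weighted average of all value vectors, so every position can attend to every other position in a single step, without recurrence.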
Yang, F., Wang, W., Wang, F. et al. scBERT as a large-scale pretrained deep language model for cell type annotation of single-cell RNA-seq data. Nat Mach Intell 4, 852–866 (2022). https://doi.org/10.1038/s42256-022-00534-z
Paper link: scBERT as a large-scale pretrained deep language model for cell type annotation of single-cell RNA-seq data (https://doi.org/10.1038/s42256-022-00534-z)