Original paper link:
[1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (arxiv.org)
Paper notes:
What:
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
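As a concrete illustration of the paper's core idea, deep bidirectional pre-training via masked language modeling, here is a minimal sketch using the Hugging Face `transformers` library. The library choice and the `bert-base-uncased` checkpoint name are assumptions for illustration, not part of the original notes.

```python
# Minimal sketch of BERT's masked-language-modeling behavior,
# assuming the Hugging Face `transformers` library is installed.
from transformers import pipeline

# Load a pre-trained BERT checkpoint for the fill-mask task.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the [MASK] token using context from BOTH directions,
# unlike left-to-right language models.
for pred in fill_mask("The capital of France is [MASK]."):
    print(f"{pred['token_str']}: {pred['score']:.3f}")
```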