
BERT-TINY

16.91M · 578 views · 0 likes · 0 downloads · 0 discussions
Tags: Earth and Nature, Music, NLP Classification


    README.md

    Context

    This is a pretrained BERT model, dubbed BERT-TINY. It is not my work; it is merely mirrored for use on Kaggle from https://github.com/google-research/bert. It is much smaller than BERT-base, so it may be useful for quick training iterations and for research into the general performance characteristics of BERT-type models. A description of the model and its utility is presumably contained in:

        @article{turc2019,
          title={Well-Read Students Learn Better: On the Importance of Pre-training Compact Models},
          author={Turc, Iulia and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
          journal={arXiv preprint arXiv:1908.08962v2},
          year={2019}
        }

    though I haven't yet read that paper.

    Source

    This pretrained model is publicly available at https://github.com/google-research/bert
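    As a rough sanity check, the parameter count of a BERT-TINY-sized model can be tallied by hand, which also explains the ~17 MB size of the float32 checkpoint. This is a sketch assuming the standard google-research/bert configuration for the smallest compact model (2 layers, hidden size 128, vocabulary 30522, 512 positions, intermediate size 512); verify these values against the `bert_config.json` shipped with the checkpoint.

    ```python
    # Back-of-the-envelope parameter count for a BERT-TINY-sized model.
    # Assumed config (from the google-research/bert compact-model release):
    V, P, T = 30522, 512, 2   # vocab size, max positions, token-type vocab
    H, L, I = 128, 2, 512     # hidden size, num layers, intermediate size

    # Embedding tables (word, position, token-type) plus their LayerNorm.
    embeddings = (V + P + T) * H + 2 * H

    # One transformer layer: attention, feed-forward, and two LayerNorms.
    per_layer = (
        3 * (H * H + H)   # Q, K, V projections (weights + biases)
        + (H * H + H)     # attention output dense
        + 2 * H           # attention LayerNorm (gamma + beta)
        + (H * I + I)     # intermediate (feed-forward up-projection)
        + (I * H + H)     # output (feed-forward down-projection)
        + 2 * H           # output LayerNorm
    )

    pooler = H * H + H    # dense layer over the [CLS] token
    total = embeddings + L * per_layer + pooler
    print(f"{total:,} parameters, ~{total * 4 / 2**20:.1f} MiB as float32")
    # → 4,385,920 parameters, ~16.7 MiB as float32
    ```

    The embedding table dominates: at this scale roughly 90% of the weights sit in the 30522 × 128 word embeddings, which is why compact BERT variants shrink so little further once the hidden size is small.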

