
8 Million German Sentences from Wikipedia


Size: 1099.53M
Tags: Internet, NLP, Text Data Classification


README.md

Context

This dataset contains a little more than 8 million sentences from the German Wikipedia. The data was obtained via the [LinguaTools Monolingual Dumps](https://linguatools.org/tools/corpora/wikipedia-monolingual-corpora/), filtered to keep only articles with more than 35 links pointing to them, and stripped of XML (and other junk). The articles were then split into sentences using [NNSplit](https://github.com/bminixhofer/nnsplit).

Acknowledgements

Obviously, I did almost no work here; all credit goes to [Wikipedia](https://www.wikipedia.org/) and [Linguatools](https://linguatools.org/).

Inspiration

Some time ago I couldn't find an easy way to access the (cleaned) text of the German Wikipedia for a project with self-supervised training. I recently revisited it and thought I'd put the data up here. I'm not sure whether there is still a need for it, but it can't hurt, I guess :)
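The sentence-splitting step is straightforward to reproduce with NNSplit's Python bindings. Below is a minimal sketch; the sample article text is made up for illustration, and loading the pretrained German model may trigger a download on first use.

```python
# Minimal sketch of the sentence-splitting step with NNSplit
# (pip install nnsplit). The sample text is illustrative only.
from nnsplit import NNSplit

# Load the pretrained German model (downloaded on first use).
splitter = NNSplit.load("de")

# One cleaned article, already stripped of XML. Note the missing
# period after the first sentence: NNSplit handles such cases,
# which plain punctuation-based splitters would miss.
article = "Das ist ein Satz Das ist noch ein Satz."

# `split` takes a list of texts and returns one `Split` per text;
# iterating a `Split` yields the detected sentences.
for sentence in splitter.split([article])[0]:
    print(str(sentence).strip())
```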
