README.md
Context
The Conference and Workshop on [Neural Information Processing Systems](https://nips.cc/) (abbreviated as **NeurIPS** and formerly **NIPS**) is a machine learning and computational neuroscience conference held every December.
This dataset is inspired by the [NIPS Papers](https://www.kaggle.com/benhamner/nips-papers) dataset by [Ben Hamner](https://www.kaggle.com/benhamner). While the original dataset covers the period 1987-2017 with over **7,241** papers, **2,439** more papers were published in 2018 and 2019. Hence, I decided to bring everything together for the Kaggle community.
Content
This dataset contains the **year of publication**, **title**, **author details**, **abstracts**, and **full text** of all NeurIPS papers from 1987 to 2019.
Since the NeurIPS Conference and Workshop take place in December each year, the dataset will be updated annually.
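As a quick orientation, here is a minimal sketch of loading and inspecting the data with pandas. The file name `papers.csv` and the column names (`year`, `abstract`) are assumptions about how the Kaggle export is laid out, so check them against the actual files.

```python
import pandas as pd

# Load the papers table (file name and column names are assumptions;
# adjust them to match the actual files in the dataset).
papers = pd.read_csv("papers.csv")

# Inspect the shape and the available columns.
print(papers.shape)
print(papers.columns.tolist())

# Papers published per year, 1987-2019.
print(papers["year"].value_counts().sort_index())

# Drop rows with missing abstracts before any text analysis.
abstracts = papers.dropna(subset=["abstract"])
print(f"{len(abstracts)} papers have a non-empty abstract")
```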
Acknowledgements
I scraped all the papers from [https://nips.cc](https://nips.cc) using a beautiful Python library called BeautifulSoup. You can find the code to scrape all the papers in my [GitHub Repo](https://github.com/rowhitswami/All-NeurIPS-Papers-Scraper). A huge thanks to **NeurIPS** for making the data public.
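For readers curious about the scraping approach, here is a minimal requests + BeautifulSoup sketch in the same spirit. The proceedings URL and the HTML structure it assumes are illustrative guesses, not the author's actual scraper; see the linked GitHub repo for the real code.

```python
import requests
from bs4 import BeautifulSoup

# Illustrative proceedings index URL -- an assumption, not necessarily the
# exact page the author scraped.
INDEX_URL = "https://papers.nips.cc/paper/2019"

response = requests.get(INDEX_URL, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Collect link texts and hrefs; the assumption is that each paper appears
# as an anchor on the proceedings index page.
papers = [
    (a.get_text(strip=True), a["href"])
    for a in soup.find_all("a", href=True)
    if "/paper/" in a["href"]
]

for title, href in papers[:10]:
    print(title, "->", href)
```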
Inspiration
Feel free to torture the data and show your creativity in Kaggle Kernels.
Some ideas, including but not limited to:
- Topic modelling (see the sketch after this list).
- Keyword extraction.
- Exploratory Data Analysis on all NeurIPS papers.
- Building a semantic search engine to answer queries about Data Science, Machine Learning, Deep Learning, and Reinforcement Learning.
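As a starting point for the topic-modelling idea above, here is a minimal scikit-learn sketch over the abstracts. The file name `papers.csv`, the column name `abstract`, and the number of topics are assumptions; bag-of-words LDA is just one of many ways to approach it.

```python
import pandas as pd
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Column name "abstract" is an assumption about the dataset layout.
abstracts = pd.read_csv("papers.csv")["abstract"].dropna()

# Bag-of-words representation, dropping very rare and very common terms.
vectorizer = CountVectorizer(max_df=0.95, min_df=5, stop_words="english")
counts = vectorizer.fit_transform(abstracts)

# Fit a small LDA model; 10 topics is an arbitrary illustrative choice.
lda = LatentDirichletAllocation(n_components=10, random_state=0)
lda.fit(counts)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:8]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```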