README.md
This project contains head movement data recorded from groups of participants who were asked to stand as still as possible while being presented with a series of auditory stimuli. Position data were recorded in mm with a Qualisys motion capture system at 100 Hz, at the University of Oslo, on March 12th, 2015, from a total of 108 participants. Code to read and process these files has been made publicly available.
Background
This study was conducted as a continuation of a previous project which aimed to investigate the effects of music stimuli on movement when participants try to remain at rest [1]. As part of this larger MICRO project, we collected data through optical motion capture from groups of people instructed to stand as still as possible with and without music stimuli.
Unlike in the 2012 data collection [1], participants were asked to stand as still as possible for 6 minutes, starting with 60 seconds in silence followed by alternating 60-second segments of music and silence. All trials started and ended with 60 seconds of silence, and the automatic randomization of stimuli allowed for consecutive segments of silence and music. A detailed list of the stimuli presented to each group of participants is included in this dataset. By alternating the presentation of stimuli and introducing silence segments between music segments, we aimed to validate the differences between music and silence conditions found in the 2012 dataset [1].
Methods
We recruited 108 participants during the Open Day at the University of Oslo, after approval by the Norwegian Center for Research Data (NSD), project identification number NSD2457. Participants were asked to stand as still as possible for 6 minutes, starting with 60 seconds in silence followed by alternating 60-second segments of music and silence. All trials ended with 60 seconds of silence. Participants were aware that music could start after one minute, and were free to choose their standing posture. The distribution of participants in the recording space was standardized across trials with marks on the floor indicating the approximate feet position.
The instantaneous position of a reflective marker placed on the head of each participant was recorded using a Qualisys infrared motion capture system (13 Oqus 300/500 cameras) running at 100 Hz. The data were recorded in 12 groups of 3 to 12 participants at a time. The motion capture system was triggered and stopped automatically with the stimuli playback system, thus all recordings are exactly 6 minutes long.
Experimental Stimuli
The 60-second stimuli used in the experiment were: A = Silence; B = Meditation; C = Salsa; D = Electronic Dance Music (EDM). The presentation order for each group is listed below.
Presentation Order
The audio stimuli were presented to groups in the orders described below.
- 01: ACABDA
- 02: AABDCA
- 03: AABCDA
- 04: ABCDAA
- 05: AABCDA
- 06: ACABDA
- 07: ACBDAA
- 08: ACBDAA
- 09: ACABDA
- 10: ACDBAA
- 11: AABDCA
- 12: ACABDA
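As a minimal sketch of how these order strings map onto the recordings (assuming segment boundaries fall exactly on 60-second multiples, so each 6-minute recording contains 6 × 60 × 100 = 36,000 frames at 100 Hz; the function and variable names below are illustrative, not from the dataset's own code):

```python
# Expand a group's stimulus order string into labeled sample windows.
# Assumes exact 60 s segments at 100 Hz (36,000 frames per recording).

FS = 100        # sampling rate in Hz
SEGMENT_S = 60  # segment length in seconds
STIMULI = {"A": "Silence", "B": "Meditation", "C": "Salsa", "D": "EDM"}

def segment_windows(order: str):
    """Yield (label, start_frame, end_frame) for each 60 s segment."""
    frames = SEGMENT_S * FS
    for i, code in enumerate(order):
        yield STIMULI[code], i * frames, (i + 1) * frames

# Example: group 01 heard Silence, Salsa, Silence, Meditation, EDM, Silence.
for label, start, end in segment_windows("ACABDA"):
    print(f"{label:>10}: frames {start}-{end}")
```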
Questionnaire
In a post-experiment questionnaire, participants were asked to self-report demographics and details such as whether they were standing with their eyes open or closed, and whether they had their knees locked.
Data Description
The following data types are provided:
- Motion (head marker position): recorded with Qualisys Track Manager and saved as .tsv files. Data from each group of participants is saved in a separate file (see the loading sketch after this list).
- Stimuli: audio .wav files for each of the stimuli described above.
- Demographics: descriptive data collected from participants in a post-experiment questionnaire, including age, sex, music listening habits, knee posture, and whether eyes were closed or open during the experiment. A value of 0 indicates that the participant answered "no" to the question; a value of 1 is "yes"; and a value of 0.5 indicates that the participant reported changing between states. For example, 0.5 in the "Locked knees?" column means that the participant reported switching between open and locked knees.
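A minimal loading sketch, assuming the standard Qualisys TSV export in which tab-separated frame data is preceded by upper-case metadata lines (e.g. NO_OF_FRAMES, MARKER_NAMES); the file name and the header-detection heuristic below are assumptions, so verify them against the actual files:

```python
# Sketch: load a Qualisys .tsv motion file with pandas.
# Assumes upper-case metadata keys precede the numeric frame data;
# export settings vary, so inspect one file by hand before relying on this.
import pandas as pd

def load_qualisys_tsv(path):
    # Count metadata lines until the first row whose leading field
    # is not an upper-case key (i.e. the first data row).
    header_rows = 0
    with open(path) as f:
        for line in f:
            if line.split("\t", 1)[0].isupper():
                header_rows += 1
            else:
                break
    return pd.read_csv(path, sep="\t", skiprows=header_rows, header=None)

positions = load_qualisys_tsv("group01.tsv")  # hypothetical file name
print(positions.shape)  # expect 36,000 rows: 6 min at 100 Hz

# Demographics answers are coded numerically, per the list above:
ANSWERS = {0.0: "no", 1.0: "yes", 0.5: "changed between states"}
```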
Usage Notes
Python and MATLAB code to process the data, as well as the Max/MSP patch used to play and synchronize the stimuli with the motion capture system, are available on GitHub [2].
We encourage others to validate our work and build on it by applying novel analytical methods, in particular analyses of between-group differences and of interpersonal synchronization within groups. Alternative approaches include clustering, music information retrieval, and frequency analysis of both the movement and the sound data.
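As one illustration of the frequency-analysis direction, here is a minimal sketch assuming a single marker's 1-D position trace in mm at 100 Hz has already been extracted (the random placeholder below stands in for real data, and Welch's method is just one reasonable estimator, not the method used in the original study):

```python
# Sketch: power spectral density of postural sway via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 100  # Hz, the dataset's sampling rate
rng = np.random.default_rng(0)
x = rng.standard_normal(36_000)  # placeholder for a real 6-minute trace

# Remove the mean so slow offsets do not dominate the spectrum.
f, pxx = welch(x - x.mean(), fs=fs, nperseg=4096)
print(f"dominant sway frequency: {f[np.argmax(pxx)]:.2f} Hz")
```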
Acknowledgements
This work was partially supported by the Research Council of Norway through its Centres of Excellence scheme (project numbers 262762 and 250698).
Conflicts of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
When using this resource, please cite:
Gonzalez, V., Zelechowska, A., & Jensenius, A. R. (2020). MICRO Motion capture data from groups of participants standing still to auditory stimuli (2015) (version 1.0). PhysioNet. https://doi.org/10.13026/dfv0-sb95.
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101(23), pp. e215–e220.