Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Overview The Dream EEG and Mentation (DREAM) database collects and stores metadata about DREAM datasets and is accessible to the public. DREAM datasets provide polysomnography and associated subjective mentation reports. Some datasets may also contain personally identifiable information about participants, but such information is not stored by the DREAM database. Datasets are contributed to DREAM from many different labs and studies and, where possible, made openly accessible in the hope of pushing the fields of sleep, dream, brain-computer interface, and consciousness research forward. If you have data that others in the community might find useful, please consider contributing it to DREAM. Contents The DREAM database consists of the following data tables:
Datasets Data records People
The records in Datasets list all officially accepted DREAM datasets and their summary metadata. Data records lists metadata of each individual datum from these datasets. People provides information on the data contributors, referred to by Key ID in Datasets.
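As a rough illustration of how these three tables relate, here is a minimal sketch in Python. All field names other than the Key ID link described above are hypothetical and not taken from the actual DREAM schema.

    from dataclasses import dataclass, field

    @dataclass
    class Person:
        key_id: str        # Key ID referenced from Datasets
        name: str = ""
        affiliation: str = ""

    @dataclass
    class Dataset:
        dataset_id: str
        contributor_key_id: str   # points into People via Key ID
        summary_metadata: dict = field(default_factory=dict)

    @dataclass
    class DataRecord:
        dataset_id: str    # which Dataset this datum belongs to
        record_id: str
        metadata: dict = field(default_factory=dict)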
Emotion classification using electroencephalography (EEG) data and machine learning techniques has been on the rise recently. However, past studies used data from medical-grade EEG setups with long set-up times and environmental constraints. Images from the OASIS image dataset were used to elicit valence and arousal emotions, and the EEG data were recorded using the Emotiv Epoc X mobile EEG headset. We propose a novel feature-ranking technique and an incremental learning approach to analyze how performance depends on the number of participants. The analysis is carried out on the publicly available DEAP and DREAMER datasets for benchmarking. Leave-one-subject-out cross-validation was carried out to identify subject bias in emotion elicitation patterns. The collected dataset and pipeline are made open source.
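For readers unfamiliar with the leave-one-subject-out scheme mentioned above, here is a minimal sketch using scikit-learn. The feature matrix, labels, and classifier are placeholders, not the authors' actual pipeline.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneGroupOut

    # Placeholder data: one row of EEG-derived features per trial.
    X = np.random.randn(200, 32)             # 200 trials x 32 features (hypothetical)
    y = np.random.randint(0, 2, 200)         # binary valence/arousal labels (hypothetical)
    subjects = np.repeat(np.arange(20), 10)  # 20 subjects, 10 trials each

    # Each fold holds out all trials of one subject for testing.
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
        clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        scores.append(clf.score(X[test_idx], y[test_idx]))

    # High variance across held-out subjects suggests subject-specific
    # emotion elicitation patterns, i.e., subject bias.
    print(f"mean accuracy: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")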
This dataset provides information about the number of properties, residents, and average property values for Dreamer Lane cross streets in Green Bay, WI.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
========================================
Previous publications:
Konkoly, K. R., Appel, K., Chabani, E., Mangiaruga, A., Gott, J., Mallett, R., ... & Paller, K. A. (2021). Real-time dialogue between experimenters and dreamers during REM sleep. Current Biology, 31(7), 1417-1427.
Correspondence:
cagatay.demirel@donders.ru.nl
Codes correspond to the study numbers described below.
The database consists of data from three projects:
Real-time dialogue during lucid dreaming (study no: 0): Subjects communicated with the researchers while lucid dreaming and answered very basic arithmetic questions with eye signals (e.g., "What is 8 minus 6?" was answered as "2" by moving the ocular muscles left-right twice, using the number of eye signals to count out the answer).
Motor-decoding during lucid dreaming (study no: 1): The goal was to induce hand-clenching during lucid dreams. In this experiment, participants were instructed to provide LRLR eye signals between hand-clenching events to differentiate the occurrences.
Other dream data: No experimental description, just some high-density EEG data with the dream content (study no: 2).
Note: The majority of "dream recall" moments, clear phasic REM stages, and lucid dreaming (in some individuals) occurred around midday during our study. All participants arrived at the EEG lab with high REM pressure around 7:00 a.m., and the events were observed between 11:00 and 12:00. As a result, our dream segments are more likely to be perceived as "day awakenings".
Note: The data was collected under blanket ethical approval, and informed consent for participation was obtained from subjects.
N/A
The equipment used in this study included:
- Easycap 128-channel EEG device (10/05 EEG layout) with passive electrodes, including EMG, EOG, and ECG.
- actiCAP 64-channel EEG device (10/10 EEG layout).
* During studies no. 1 and no. 2, certain EEG channels were converted into EOG and skin-EMG signals through the use of adhesive holders. To avoid confusion, all modified channels were given updated names reflecting their current function.
* An ExG box with additional passive and bipolar EOG and EMG channels was utilized.
The EEG data in .edf format has not been preprocessed and remains in its raw form. However, since the data was originally collected as BrainVision files, the 3D electrode layout information (10/10 system) is already embedded in the .edf files. As a result, when loading the data into either FieldTrip or MNE, the layout information is included automatically, and there is no need to search for an external EEG layout to integrate the data structure into MATLAB or Python.
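A minimal loading sketch in Python with MNE, assuming a hypothetical file name; per the description above, the embedded montage should be picked up without further configuration.

    import mne

    # Hypothetical file name; substitute an actual recording from this dataset.
    raw = mne.io.read_raw_edf("sub-01_night-01.edf", preload=True)

    # The embedded electrode positions should be available via the montage,
    # so no external layout file is needed.
    print(raw.info["ch_names"][:10])
    print(raw.get_montage())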
Various preprocessing pipelines could be utilized for the intended analysis; the following is the most general preprocessing pipeline we identified for our specific dataset.
1) Channel type assignment
2) Notch filter at 50 Hz
3) 0.1-49 Hz band-pass filter (if frequencies above 50 Hz are not of interest)
4) Noisy-channel tagging & interpolation
5) Optionally, EEG channel interpolation can be performed to generate artificial EEG signals on channels that were converted to EOG and EMG
6) Signal-space projection (SSP): https://mne.tools/0.16/manual/preprocessing/ssp.html
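A hedged sketch of steps 1-4 and 6 in MNE-Python; the channel names, bad-channel list, and projector count are illustrative choices, not values prescribed by the dataset.

    import mne

    raw = mne.io.read_raw_edf("sub-01_night-01.edf", preload=True)  # hypothetical file

    # 1) Channel type assignment (these channel names are examples only).
    raw.set_channel_types({"EOGl": "eog", "EOGr": "eog", "EMGchin": "emg"})

    # 2) Notch filter at 50 Hz (mains frequency).
    raw.notch_filter(freqs=50)

    # 3) 0.1-49 Hz band-pass, if frequencies above 50 Hz are not of interest.
    raw.filter(l_freq=0.1, h_freq=49.0)

    # 4) Tag noisy channels, then interpolate them from their neighbors.
    raw.info["bads"] = ["E7"]  # example bad channel
    raw.interpolate_bads(reset_bads=True)

    # 6) Signal-space projection (SSP) to suppress stereotyped artifacts.
    projs = mne.compute_proj_raw(raw, n_eeg=2, n_grad=0, n_mag=0)
    raw.add_proj(projs)
    raw.apply_proj()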
Subscribers can look up export and import data for 23 countries by HS code or product name. This demo is helpful for market analysis.
This dataset is released to promote reproducibility and cross-model comparisons in EEG emotion recognition research.
https://okredo.com/en-lt/general-rules
MB Seductive Dreamer financial data: profit, annual turnover, paid taxes, sales revenue, equity, assets (long-term and short-term), profitability indicators.
This dataset provides information about the number of properties, residents, and average property values for Dreamer Lane cross streets in White Springs, FL.
https://okredo.com/en-lt/general-rules
MB "Glori Dreamer" financial data: profit, annual turnover, paid taxes, sales revenue, equity, assets (long-term and short-term), profitability indicators.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This is a data repository for a case study on lucid dreaming precognitive targets.
Creative Data ShareGPT
This is a shuffled dataset with data from the following:
NewEden/Claude-RP-1.5K-SFW
practical-dreamer/RPGPT_PublicDomain-ShareGPT
Nitral-Archive/Active_RP-ShareGPT
Gryphe/Sonnet3.5-Charcard-Roleplay
seank0602/bluemoon_fandom_rp
Gryphe/ChatGPT-4o-Writing-Prompts
Dampfinchen/Creative_Writing_Multiturn
Nitral-AI/Creative_Writing-ShareGPT
xDAN2099/RolePlay-Mixed-Bluemoon-Limarp
Subscribers can look up export and import data for 23 countries by HS code or product name. This demo is helpful for market analysis.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The research project presented here aims to put Nathalie's personal perception of her dreams through an objective, quantitative analysis using electroencephalography (EEG), in an attempt to establish a link between the two dimensions.
During sleep, brain activity is similar to that of the waking state, yet the thalamus, a phylogenetically ancient structure of the nervous system, isolates us from the environment. This isolation is not total, however, and external stimuli are sometimes incorporated into the plot of our dreams. To establish a bridge between the EEG record and Nathalie's dream narrative, we experiment with auditory stimuli as a possible mechanism of interference.
The 101 nights is a longitudinal dataset. At the core of the study is the concordance of two divergent fields of knowledge in recording and representing the dream experience. For 101 nights, physiological and behavioral data were continuously paired with the dreamer's inner life, geared towards a dialogue.
This is a unique dataset for scientific analyses and methodological developments as well as artistic projects, spanning cognitive science and multiple modalities of art. The unprecedented project allows both an internal and an external perspective on Nathalie's dreams, containing extensive data for 101 consecutive nights and days.
The project produced four immediate results:
1. 952 GB of brain data recorded by 256 sensors continuously over 101 nights, including her body movement (actimetry and infrared camera).
2. Logs of the words triggered by the computer system each night, with their exact times.
3. Nathalie's daily dream diary entries.
4. Day-by-day activity.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Previous publications:
Konkoly, K. R., Appel, K., Chabani, E., Mangiaruga, A., Gott, J., Mallett, R., ... & Paller, K. A. (2021). Real-time dialogue between experimenters and dreamers during REM sleep. Current Biology, 31(7), 1417-1427.
Correspondence:
karenkonkoly2023@u.northwestern.edu
The time-of-awakening column contains only approximate times, based on experimenters' notes and the durations of the files.
Some port codes in the data (in the "status" channel) have slightly different meanings for different participants. Here is a guide to their meanings:
Cases 01-08
Cases 09-33
N/A
Methods:
Twenty-two participants (15 female, age range 18-33 years, M = 21.1 ± 4.3 years) who claimed to remember at least one dream per week were recruited by word of mouth, online forum, and the Northwestern University Psychology Department participant pool. They each participated in one or more nap sessions, which amounted to 27 nap sessions in total.
Procedure:
Participants visited the laboratory at Northwestern University at approximately their normal wake time and received guidance on identifying lucid dreams and instructions for the experiment for about 40 min during preparations for polysomnographic recordings, including EEG, EMG, and EOG, using a Neuroscan SynAmps system. Participants were instructed to signal with a prearranged number of LR eye movements if they became lucid in a dream.
Next, participants practiced making ocular signals and responding to questions using combinations of LR signals. Subsequently, participants completed the Targeted Lucidity Reactivation (TLR) procedure while lying in bed. This procedure was derived from the procedure developed by Carr and colleagues. A method of reality checking to induce lucid dreaming was paired with sensory stimulation and accelerated in a single session immediately before sleep, and then cues were presented again during REM sleep. In this procedure, participants were trained to associate a novel cue sound with a lucid state of mind during wake. The sound consisted of three pure-tone beeps increasing in pitch (400, 600, and 800 Hz) at approximately 40-45 dB SPL and lasting approximately 650 ms. For one participant (ppt. 121) the pure-tone beeps had previously been associated with a different task in an unrelated study. Thus, for this participant, a 1000-ms violin sound and low-intensity flashing-red LED lights were used as cues. All participants were informed that this cue would be given during sleep to help promote a lucid dream. Over the next 15 min, the TLR sound was played up to 15 times. The first 4 times, it was followed by verbal guidance to enter a lucid state as follows. ‘‘As you notice the signal, you become lucid. Bring your attention to your thoughts and notice where your mind has wandered. [pause] Now observe your body, sensations, and feelings. [pause] Observe your breathing. [pause] Remain lucid, critically aware, and notice how aspects of this experience are in any way different from your normal waking experience.’’
Participants often fell asleep before all 15 TLR cue presentations were completed. Standard polysomnographic methods were used to determine sleep state. Once participants entered REM sleep, TLR cues were presented again, at about 30-s intervals, as long as REM sleep remained stable. After participants responded to a cue with a lucid eye signal, or after approximately 10 cues were presented without response, we began the math problem portion of the experiment.
We devised the following task to engage auditory perception of math problems, working memory, and the ability to express the correct answer. We used simple addition and subtraction problems that could each be answered by a number between 1 and 4 (LR = 1, LRLR = 2, LRLRLR = 3, LRLRLRLR = 4), or between 1 and 6 for the first 5 participants.
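To make the answer encoding concrete, here is a toy sketch of decoding an answer from a detected eye-movement sequence. The detection step itself (extracting "L"/"R" events from the EOG) is assumed and not shown, and the function name is hypothetical.

    from typing import Optional

    def decode_answer(eye_events: str) -> Optional[int]:
        """Decode an answer from left-right eye signals, where each
        "LR" pair counts as one unit (LR=1, LRLR=2, ...)."""
        if len(eye_events) == 0 or len(eye_events) % 2 != 0:
            return None  # empty or incomplete signal
        pairs = [eye_events[i:i + 2] for i in range(0, len(eye_events), 2)]
        if any(p != "LR" for p in pairs):
            return None  # not a valid answer signal
        return len(pairs)

    assert decode_answer("LRLR") == 2  # e.g., "what is 8 minus 6?" -> 2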
From the above dataset, data were included in DREAM if there was a period of sleep on the EEG followed by a report of a dream (or of a lack of dream). The EEG data include the last period of continuous sleep before the dream report was collected, starting with the first epoch scored as wake and ending at the last second before clear movement/alpha activity indicating wake. There are also a few instances, noted in the "Remarks" column of the "Records" file, where I included epochs scored as wake because the wake appeared to be due to alpha from participants attempting to answer questions with eye movements (only one of these included wake in the last 20 seconds of the recording, case21_sub111).
EEG sleep data was NOT included if it was not followed by a verbal/written dream report or a clear note in the experimenter's log that there was no recall. Also excluded is data where participants completed eye signals or answered questions outside the continuous period of sleep preceding a dream report. Finally, one case was excluded in which a dream report was collected at the end of the nap, but the participant had been in and out of sleep beforehand, so it was unclear which sleep period the report referred to.
Karen Konkoly rated reports according to the DREAM categorization. If the participant reported remembering any sort of mental content from sleep, it was rated “2”. If the participant reported remembering a dream but none of its content, it was rated “1”. If the participant reported not remembering anything, or not falling asleep, it was rated “0”.
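As a trivial restatement of this rubric for anyone scripting over the records, here is a sketch; the boolean inputs are hypothetical fields, not columns in the actual files.

    def dream_rating(remembered_content: bool, remembered_dream: bool) -> int:
        """DREAM recall categorization as described above:
        2 = some mental content recalled from sleep,
        1 = a dream remembered but none of its content,
        0 = nothing remembered, or no sleep."""
        if remembered_content:
            return 2
        if remembered_dream:
            return 1
        return 0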
This dataset provides information about the number of properties, residents, and average property values for Dreamers Lane cross streets in Pinewood, SC.
This dataset provides information about the number of properties, residents, and average property values for Dreamers Lane cross streets in Glade Spring, VA.
Subscribers can look up export and import data for 23 countries by HS code or product name. This demo is helpful for market analysis.
This dataset provides information about the number of properties, residents, and average property values for Dreamers Lane cross streets in Alpena, MI.
This dataset provides information about the number of properties, residents, and average property values for Dreamers Lane cross streets in Conway, AR.
Dreamers By Debut Export Import Data. Follow the Eximpedia platform for HS code, importer-exporter records, and customs shipment details.