Datasets & OSS
The following datasets and software were created during the course of the project and are publicly available.
Open source tools (U2M)
Cam2lidar: Part of the mobile mapping platform developed for massive data collection in urban environments, aimed at photorealistic 3D reconstruction of spaces.
TextGenerationDataset (UPF)
The dataset consists of texts in the different MindSpaces languages with superimposed semantic (predicate-argument) annotations aligned with syntactic annotations. It can be used for training in-domain and open-domain statistical generators.
VisualSentimentAnalysis (CERTH)
The Indoor/Outdoor Sentiment Analysis Dataset contains images captured at human eye level, with the aim of estimating how human emotion is triggered by urban architectural places. Emotion is annotated on the basis of the Self-Assessment Manikin (SAM), resulting in a two-fold dataset intended for classification purposes.
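To illustrate how SAM-based ratings could feed a classifier, the sketch below maps 9-point valence/arousal scores to coarse emotion classes. The thresholds, class names and function are assumptions for illustration, not part of the dataset's documented annotation scheme.

```python
# Illustrative sketch only: the binning thresholds and class names are
# assumptions, not the dataset's documented labelling scheme.
def sam_to_label(valence: int, arousal: int) -> str:
    """Map 1-9 SAM valence/arousal ratings to a coarse emotion class."""
    pleasant = valence >= 6
    activated = arousal >= 6
    if pleasant and activated:
        return "excited"
    if pleasant:
        return "calm-positive"
    if activated:
        return "stressed"
    return "negative-low"

print(sam_to_label(valence=7, arousal=8))  # -> "excited"
```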
ConceptExtraction (UPF)
The dataset consists of open-domain concept-annotated texts for training and evaluating extractive models.
VisualLidarandXSensorDataCollection (U2M)
The datasets contain the raw videos from the mobile mapping survey in L'Hospitalet, together with the final 3D reconstruction results. The folder contains 5 sequences, each of which includes 4 videos from the four cameras (facing front, back, left and right), a map of the captured area, and the final coloured 3D point cloud.
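A minimal sketch of how such a folder could be traversed is given below; the directory and file names are assumptions made for illustration and do not reflect the dataset's actual naming scheme.

```python
from pathlib import Path

# Hypothetical root folder and file names; adjust to the released layout.
DATASET_ROOT = Path("visual_lidar_xsensor")

for sequence_dir in sorted(DATASET_ROOT.glob("sequence_*")):
    # Four camera videos per sequence: front, back, left, right.
    videos = {cam: sequence_dir / f"{cam}.mp4"
              for cam in ("front", "back", "left", "right")}
    map_image = sequence_dir / "map.png"           # map of the captured area
    point_cloud = sequence_dir / "pointcloud.ply"  # final coloured 3D point cloud

    print(sequence_dir.name,
          [p.name for p in videos.values() if p.exists()],
          map_image.exists(), point_cloud.exists())
```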
PhysiologicalSignals (CERTH, MU)
The dataset consists of Galvanic Skin Response (GSR) signals recorded from 21 subjects during a stress-inducing experiment. Timestamps for each condition, indicating the task the participant was performing at that point in the study, are also included, together with self-reports.
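The sketch below illustrates how a single subject's GSR signal might be segmented by condition using the provided timestamps; the file names and column labels are assumptions, not the dataset's documented schema.

```python
import pandas as pd

# Hypothetical file names and columns: "timestamp"/"gsr" for the signal,
# "condition"/"start"/"end" for the per-condition timestamps.
gsr = pd.read_csv("subject_01_gsr.csv")
conditions = pd.read_csv("subject_01_conditions.csv")

# Slice the GSR signal into per-condition segments using the timestamps.
segments = {
    row.condition: gsr[(gsr.timestamp >= row.start) & (gsr.timestamp < row.end)]
    for row in conditions.itertuples()
}

for condition, segment in segments.items():
    print(condition, len(segment), segment.gsr.mean())
```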