journalArticle
Gune
Aditya
De Amicis
Raffaele
Simões
Bruno
Sanchez
Christopher A.
Demirel
H. Onan
1 Purpose_Both
10 SONI_Level_nominal
12 Interaction_yes
13 Goal_Research
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
20 Venue Field Vis
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_browse
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_nominal
included_Stage5
https://ieeexplore.ieee.org/document/8402185/
18-26
2018-07
Number: 4
Citation Key: gune_2018_GraphicallyHearingEnhancing
2023-10-23 21:20:00
IEEE Xplore
Effective presentation of data is critical to a user's understanding of it. In this manuscript, we explore research challenges associated with presenting large geospatial datasets through a multimodal experience. We also suggest an interaction schema that enhances users' cognition of geographic information through a user-driven display that visualizes and sonifies geospatial data.
Graphically Hearing: Enhancing Understanding of Geospatial Data through an Integrated Auditory and Visual Experience
Graphically Hearing
38
4
IEEE Computer Graphics and Applications
ISSN 1558-1756
DOI 10.1109/MCG.2018.042731655
attachment
Gune_et_al_2018_Graphically_Hearing.pdf
application/pdf
conferencePaper
DOI 10.1109/iV.2018.00093
2018 22nd International Conference Information Visualisation (IV)
Maçãs
Catarina
Martins
Pedro
Machado
Penousal
1 Purpose_Presentation
10 SONI_Level_nominal
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Arts&Humanities
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue Field Vis
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_nominal
9 VIS_Level_ratio
included_Stage5
https://ieeexplore.ieee.org/document/8564211/
504-509
2018-07
ISSN: 2375-0138
Citation Key: macas_2018_ConsumptionRhythmMultimodal
2023-10-23 21:19:21
IEEE Xplore
2018 22nd International Conference Information Visualisation (IV)
Through data visualisation and sonification models, we present a study of multimodal representations to characterise Portuguese consumption patterns, gathered from Portuguese hypermarkets and supermarkets over the course of two years. We focus on the rhythmic nature of the data to create and discuss audio and visual representations that highlight disruptions and sudden changes in normal consumption patterns. For this study, we present two distinct visual and audio representations and discuss their strengths and limitations.
Consumption as a Rhythm: A Multimodal Experiment on the Representation of Time-Series
Consumption as a Rhythm
attachment
Macas_et_al_2018_Consumption_as_a_Rhythm.pdf
application/pdf
conferencePaper
DOI 10.1109/VRW50115.2020.00082
2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
Bouchara
Tifanie
Montès
Matthieu
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Researchers
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_none
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_ratio
included_Stage5
https://ieeexplore.ieee.org/document/9090531/
380-383
2020-03
Citation Key: bouchara_2020_ImmersiveSonificationProtein
2023-10-23 21:19:20
IEEE Xplore
2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
This paper presents our ongoing efforts to design an immersive sonification model to represent protein surfaces through 3D sound, in order to extend pre-existing protein visualisation methods without overloading visual perception. The protein surface is first discretized so that each point of the surface is attached to a sound source, spatialized in such a way that the user is immersed in the center of the protein. We add a spherical filtering system, which the user can control, to select the surface points to be rendered, in order to reinforce the auditory interpretation of the 3D shape. Several questions, which can benefit the VR and HCI communities, are discussed, both on audio and audiographical filtering consistency and on the multimodal integration of data coming from different points of view and points of listening in a 3D interactive space.
Immersive sonification of protein surface
attachment
Bouchara_Montes_2020_Immersive_sonification_of_protein_surface.pdf
application/pdf
conferencePaper
DOI 10.1109/AIVR52153.2021.00057
2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)
Lyu
Zhuoyue
Li
Jiannan
Wang
Bryan
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_yes
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_XR
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_single
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_none
6 SONIsearch_locate
7 VISDatasetType_network
8 SONIDatasetType_network
9 VIS_Level_nominal
included_Stage5
https://ieeexplore.ieee.org/document/9644325/
251-255
2021-11
Citation Key: lyu_2021_AIiveInteractiveVisualization
2023-10-23 21:17:37
IEEE Xplore
2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)
Artificial Intelligence (AI), especially Neural Networks (NNs), has become increasingly popular. However, people usually treat AI as a tool, focusing on improving outcomes, accuracy, and performance while paying less attention to the representation of AI itself. We present AIive, an interactive visualization of AI in Virtual Reality (VR) that brings AI “alive”. AIive enables users to manipulate the parameters of NNs with virtual hands and provides auditory feedback for the real-time values of loss, accuracy, and hyperparameters. Thus, AIive contributes an artistic and intuitive way to represent AI by integrating visualization, sonification, and direct manipulation in VR, potentially targeting a wide range of audiences.
AIive: Interactive Visualization and Sonification of Neural Networks in Virtual Reality
AIive
attachment
https://ieeexplore.ieee.org/document/9644325/
2023-10-23 21:21:34
IEEE Xplore Abstract Record
3
text/html
attachment
https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&arnumber=9644325
2023-10-23 21:21:16
Lyu_et_al_2021_AIive.pdf
1
application/pdf
conferencePaper
DOI 10.1109/VARMS.2015.7151725
2015 IEEE 1st International Workshop on Virtual and Augmented Reality for Molecular Science (VARMS@IEEEVR)
Rau
Benjamin
Frieß
Florian
Krone
Michael
Müller
Christoph
Ertl
Thomas
1 Purpose_Exploration
10 SONI_Level_nominal
10 SONI_Level_ratio
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_browse
5 VISsearch_explore
5 VISsearch_locate
6 SONIsearch_locate
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_ratio
included_Stage5
https://ieeexplore.ieee.org/document/7151725/
25-30
2015-03
Citation Key: rau_2015_EnhancingVisualizationMolecular
2023-10-23 21:17:41
IEEE Xplore
2015 IEEE 1st International Workshop on Virtual and Augmented Reality for Molecular Science (VARMS@IEEEVR)
Scientific visualization is an application area for virtual reality environments like stereoscopic displays or CAVEs. Especially interactive molecular visualizations that show the complex three-dimensional structures found in structural biology are often investigated using such environments. In contrast to VR applications like simulators, molecular visualization typically lacks auditory output. Nevertheless, sonification can be used to convey information about the data. In our work, we use sound to highlight events extracted from a molecular dynamics simulation. This not only offloads information from the visual channel, but can also guide the attention of the analyst towards important phenomena even if they are occluded in the visualization. Sound also creates a higher level of immersion, which can be beneficial for educational purposes. In this paper, we detail our application that adds sonification to the visualization of molecular simulations.
Enhancing visualization of molecular simulations using sonification
attachment
Rau_et_al_2015_Enhancing_visualization_of_molecular_simulations_using_sonification.pdf
application/pdf
conferencePaper
DOI 10.1109/SIVE.2015.7361284
2015 IEEE 2nd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)
Papachristodoulou
Panagiota
Betella
Alberto
Manzolli
Jonatas
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_UP
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
https://ieeexplore.ieee.org/document/7361284/
1-6
2015-03
Citation Key: papachristodoulou_2015_AugmentingNavigationComplex
2023-10-23 21:18:10
IEEE Xplore
2015 IEEE 2nd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)
The meaningful representation and exploration of big data constitutes a challenge for many scientific fields. In recent years, auditory displays have been effectively employed to address this problem. The coupling of sonification with visualization techniques in multimodal displays can lead to the implementation of powerful tools for the understanding of complex datasets. In this study, we applied sonification techniques to a complex dataset from neuroscience. To do so, we used BrainX3, a novel immersive technology for the exploration of large brain networks. We conducted an experiment to assess whether the addition of an auditory layer would result in better user performance of brain region identification at different spatial resolutions.
Augmenting the navigation of complex data sets using sonification: a case study with BrainX3
Augmenting the navigation of complex data sets using sonification
attachment
Papachristodoulou_et_al_2015_Augmenting_the_navigation_of_complex_data_sets_using_sonification.pdf
application/pdf
conferencePaper
02
DOI 10.1109/CBI.2016.47
2016 IEEE 18th Conference on Business Informatics (CBI)
Hildebrandt
Tobias
Amerbauer
Felix
Rinderle-Ma
Stefanie
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_nominal
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes but offline
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_explore
5 VISsearch_locate
6 SONIsearch_explore
6 SONIsearch_locate
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
https://ieeexplore.ieee.org/document/7781493/
32-37
2016-08
ISSN: 2378-1971
Citation Key: hildebrandt_2016_CombiningSonificationVisualization
2023-10-23 21:18:00
IEEE Xplore
2016 IEEE 18th Conference on Business Informatics (CBI)
Business process execution data is analyzed for different reasons such as process discovery, performance analysis, or anomaly detection. However, visualizations might suffer from a number of limitations. Sonification (the presentation of data using sound) has been proven to successfully enhance visualization in many domains. Although there exist approaches that apply sonification for real-time monitoring of process executions, so far this technique has not been applied to analyze process execution data ex post. We therefore propose a multi-modal system, combining visualization and sonification, for this purpose. The concepts are evaluated through a prototypical ProM plugin as well as a use case.
Combining Sonification and Visualization for the Analysis of Process Execution Data
attachment
Hildebrandt_et_al_2016_Combining_Sonification_and_Visualization_for_the_Analysis_of_Process_Execution.pdf
application/pdf
journalArticle
Ferguson
Sam
Beilharz
Kirsty
Calò
Claudia A.
1 Purpose_Both
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Researchers
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_single
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
Data analysis
included_Stage5
Interaction
Tabletop
https://doi.org/10.1007/s12193-011-0075-3
97-109
2012-05-01
Number: 3
Citation Key: ferguson_2012_NavigationInteractiveSonifications
2023-10-26 13:25:13
Springer Link
en
This paper discusses interaction design for interactive sonification and visualisation of data in multi-touch contexts. Interaction design for data analysis is becoming increasingly important as data becomes more openly available. We discuss how navigation issues such as zooming, selection, arrangement and playback of data relate to both the auditory and visual modality in different ways, and how they may be linked through the modality of touch and gestural interaction. For this purpose we introduce a user interface for exploring and interacting with representations of time-series data simultaneously in both the visual and auditory modalities.
Navigation of interactive sonifications and visualisations of time-series data using multi-touch computing
5
3
Journal on Multimodal User Interfaces
ISSN 1783-8738
J Multimodal User Interfaces
DOI 10.1007/s12193-011-0075-3
attachment
https://link.springer.com/content/pdf/10.1007%2Fs12193-011-0075-3.pdf
2023-10-26 20:46:47
Ferguson_et_al_2012_Navigation_of_interactive_sonifications_and_visualisations_of_time-series_data.pdf
1
application/pdf
conferencePaper
ISBN 978-3-03868-184-7
DOI 10.2312/evs.20221095
Proc. 24th Eurographics Conference on Visualization (EuroVis 2022) - Short Papers
Rome
The Eurographics Association
Enge
Kajetan
Rind
Alexander
Iber
Michael
Höldrich
Robert
Aigner
Wolfgang
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Researchers
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue Field Vis
4 LevelofReduncancy_complementary
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
included_Stage5
https://diglib.eg.org/xmlui/handle/10.2312/evs20221095
Attribution 4.0 International License
67-71
2022
Citation Key: enge_2022_MultimodalExploratoryData
tex.ids= enge_2022_multimodal
2023-10-25 21:18:19
diglib.eg.org
en
The metaphor of auscultating with a stethoscope can be an inspiration to combine visualization and sonification for exploratory data analysis. This paper presents SoniScope, a multimodal approach and its prototypical implementation based on this metaphor. It combines a scatterplot with an interactive parameter mapping sonification, thereby conveying additional information about items that were selected with a visual lens. SoniScope explores several design options for the shape of its lens and the sorting of the selected items for subsequent sonification. Furthermore, the open-source prototype serves as a blueprint framework for how to combine D3.js visualization and SuperCollider sonification in the Jupyter notebook environment.
Towards Multimodal Exploratory Data Analysis: SoniScope as a Prototypical Implementation
Towards Multimodal Exploratory Data Analysis
attachment
https://diglib.eg.org/xmlui/bitstream/10.2312/evs20221095/1/067-071.pdf
2023-10-25 21:18:20
Enge_et_al_2022_Towards_Multimodal_Exploratory_Data_Analysis.pdf
1
application/pdf
conferencePaper
Lecture Notes in Computer Science
ISBN 978-3-319-12976-1
DOI 10.1007/978-3-319-12976-1_10
Sound, Music, and Motion
Cham
Springer International Publishing
Holtzman
Benjamin
Candler
Jason
Turk
Matthew
Peter
Daniel
Aramaki
Mitsuko
Derrien
Olivier
Kronland-Martinet
Richard
Ystad
Sølvi
1 Purpose_Presentation
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_no
13 Goal_Education
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_no
2 VISReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
9 VIS_Level_nominal
Audification
included_Stage5
Seismology
161-174
2014
Citation Key: holtzman_2014_SeismicSoundLab
Springer Link
en
We construct a representation of earthquakes and global seismic waves through sound and animated images. The seismic wave field is the ensemble of elastic waves that propagate through the planet after an earthquake, emanating from the rupture on the fault. The sounds are made by time compression (i.e. speeding up) of seismic data with minimal additional processing. The animated images are renderings of numerical simulations of seismic wave propagation in the globe. Synchronized sounds and images reveal complex patterns and illustrate numerous aspects of the seismic wave field. These movies represent phenomena occurring far from the time and length scales normally accessible to us, creating a profound experience for the observer. The multi-sensory perception of these complex phenomena may also bring new insights to researchers.
Seismic Sound Lab: Sights, Sounds and Perception of the Earth as an Acoustic Space
Seismic Sound Lab
attachment
https://link.springer.com/content/pdf/10.1007%2F978-3-319-12976-1_10.pdf
2023-11-24 13:27:39
Holtzman_et_al_2014_Seismic_Sound_Lab.pdf
1
application/pdf
conferencePaper
Lecture Notes in Computer Science
ISBN 978-3-030-72914-1
DOI 10.1007/978-3-030-72914-1_11
Artificial Intelligence in Music, Sound, Art and Design
Cham
Springer International Publishing
Kariyado
Yuta
Arevalo
Camilo
Villegas
Julián
Romero
Juan
Martins
Tiago
Rodríguez-Fernández
Nereida
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_AP
11 Evaluation_UE
12 Interaction_yes
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
16 Display_XR
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
Auralization
Cellular automata
Game of life
included_Stage5
161-170
2021
Citation Key: kariyado_2021_AuralizationThreeDimensionalCellular
Springer Link
en
An auralization tool for exploring three-dimensional cellular automata is presented. This proof-of-concept allows the creation of a sound field comprising individual sound events associated with each cell in a three-dimensional grid. Each sound event is spatialized depending on the orientation of the listener relative to the three-dimensional model. Users can listen to all cells simultaneously or in sequential slices at will. Conceived to be used as an immersive Virtual Reality (VR) scene, this software application also works as a desktop application for environments where the VR infrastructure is missing. Subjective evaluations indicate that the proposed sonification increases the perceived quality and immersiveness of the system with respect to a visualization-only system. No subjective differences between the sequential and simultaneous presentations were found.
Auralization of Three-Dimensional Cellular Automata
attachment
https://link.springer.com/content/pdf/10.1007%2F978-3-030-72914-1_11.pdf
2023-11-22 12:20:13
Kariyado_et_al_2021_Auralization_of_Three-Dimensional_Cellular_Automata.pdf
1
application/pdf
conferencePaper
Communications in Computer and Information Science
ISBN 978-3-031-19679-9
DOI 10.1007/978-3-031-19679-9_79
HCI International 2022 – Late Breaking Posters
Cham
Springer Nature Switzerland
De La Vega
Gonzalo
Dominguez
Leonardo Martin Exequiel
Casado
Johanna
García
Beatriz
Stephanidis
Constantine
Antona
Margherita
Ntoa
Stavroula
Salvendy
Gavriel
1 Purpose_Both
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_group
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
6 SONIsearch_none
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
Graphic user interface
Human centred design
included_Stage5
628-633
2022
Citation Key: delavega_2022_SonoUnoWebInnovative
Springer Link
en
Sonification as a complement to visualization has been under research for decades as a new way of presenting data. ICAD conferences gather specialists from different disciplines to discuss sonification. Tools such as sonoUno, starSound, and Web Sandbox attempt to open astronomical data sets and sonify them in conjunction with visualization. In this contribution, the sonoUno web version is presented; it allows users to explore data sets without any installation. The data can be uploaded or a pre-loaded file can be opened, and the sonification and the visual characteristics of the plot can be customized in the same window. The plot, sound, and marks can be saved. The web interface was tested with the most commonly used screen readers in order to confirm its good performance.
SonoUno Web: An Innovative User Centred Web Interface
SonoUno Web
attachment
https://arxiv.org/pdf/2302.00081
2023-10-26 20:48:21
De_La_Vega_et_al_2022_SonoUno_Web.pdf
1
application/pdf
journalArticle
5
3
Journal on Multimodal User Interfaces
ISSN 1783-8738
J Multimodal User Interfaces
DOI 10.1007/s12193-011-0066-4
Ness
Steven
Reimer
Paul
Love
Justin
Schloss
W. Andrew
Tzanetakis
George
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
13 Goal_Education
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Physical Environment
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_nominal
Geo-spatial data
included_Stage5
Phenology
Tangible interfaces
https://doi.org/10.1007/s12193-011-0066-4
123-129
2012-05-01
Number: 3
Citation Key: ness_2012_Sonophenology
2023-10-26 20:47:21
Springer Link
en
The study of periodic biological processes, such as when plants flower and birds arrive in the spring, is known as phenology. In recent years this field has gained interest from the scientific community because of the applicability of this data to the study of climate change and other ecological processes. In this paper we propose the use of tangible interfaces for interactive sonification, with a specific example of a multimodal tangible interface consisting of a physical paper map and tracking of fiducial markers combined with a novel drawing interface. The designed interface enables one or more users to specify point queries with the map interface and time queries with the drawing interface. This allows the user to explore both time and space while receiving immediate sonic feedback on their actions. This system can be used to study and explore the effects of climate change, both as a tool to be used by scientists and as a way to educate and involve members of the general public in this research in a dynamic way.
Sonophenology
attachment
https://link.springer.com/content/pdf/10.1007%2Fs12193-011-0066-4.pdf
2023-10-26 20:48:02
Ness_et_al_2012_Sonophenology.pdf
1
application/pdf
journalArticle
Alonso-Arevalo
Miguel A.
Shelley
Simon
Hermes
Dik
Hollowood
Jacqueline
Pettitt
Michael
Sharples
Sarah
Kohlrausch
Armin
1 Purpose_Exploration
10 SONI_Level_ratio
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_XR
2 VISReadingLevel_whole
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_ratio
included_Stage5
https://doi.org/10.1145/2355598.2355600
2012-10
Number: 4
Citation Key: alonso-arevalo_2012_CurveShapeCurvature
In this article we present an approach that uses sound to communicate geometrical data related to a virtual object. This has been developed in the framework of a multimodal interface for product design. The interface allows a designer to evaluate the quality of a 3-D shape using touch, vision, and sound. Two important considerations addressed in this article are the nature of the data that is sonified and the haptic interaction between the user and the interface, which in fact triggers the sound and influences its characteristics. Based on these considerations, we present a number of sonification strategies that are designed to map the geometrical data of interest into sound. The fundamental frequency of various sounds was used to convey the curve shape or the curvature to the listeners. Two evaluation experiments are described: one involved participants with varied backgrounds, the other involved the intended users, i.e., participants with a background in industrial design. The results show that independent of the sonification method used and independent of whether the curve shape or the curvature was sonified, the sonification was quite successful. In the first experiment participants had a success rate of about 80% in a multiple-choice task; in the second experiment it took the participants on average less than 20 seconds to find the maximum, minimum, or inflection points of the curvature of a test curve.
Curve shape and curvature perception through interactive sonification
9
4
ACM Trans. Appl. Percept.
ISSN 1544-3558
DOI 10.1145/2355598.2355600
attachment
Alonso-Arevalo_et_al_2012_Curve_shape_and_curvature_perception_through_interactive_sonification.pdf
application/pdf
conferencePaper
AM '21
ISBN 978-1-4503-8569-5
DOI 10.1145/3478384.3478387
Proceedings of the 16th international audio mostly conference
New York, NY, USA
Association for Computing Machinery
Rönnberg
Niklas
1 Purpose_Presentation
10 SONI_Level_interval
10 SONI_Level_ratio
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_none
6 SONIsearch_none
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
included_Stage5
https://doi.org/10.1145/3478384.3478387
56–63
2021
Citation Key: ronnberg_2021_SonificationConveyingData
In the present study, a sonification of running data was evaluated. The aim of the sonification was both to convey information about the data and to convey a specific emotion. The sonification was evaluated in three parts: first as an auditory graph, second together with additional text information, and third together with an animated visualization, with a total of 150 responses. The results suggest that the sonification could convey an emotion similar to that intended, but at the cost of a less faithful representation of the data. The addition of visual information supported understanding of the sonification and of the auditory representation of the data. The results thus suggest that it is possible to design a sonification that is perceived as both interesting and fun and that conveys an emotional impression, but that there may be a trade-off between musical experience and clarity in sonification.
Sonification for conveying data and emotion
attachment
Ronnberg_2021_Sonification_for_conveying_data_and_emotion.pdf
application/pdf
conferencePaper
CHI EA '22
ISBN 978-1-4503-9156-6
DOI 10.1145/3491101.3519903
Extended abstracts of the 2022 CHI conference on human factors in computing systems
New York, NY, USA
Association for Computing Machinery
Han
Yoon Chung
Khanduja
Amanbeer
1 Purpose_Presentation
10 SONI_Level_ratio
11 Evaluation_UE
12 Interaction_yes
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
7 VISDatasetType_table
8 SONIDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_ratio
included_Stage5
https://doi.org/10.1145/3491101.3519903
2022
Citation Key: han_2022_FutureRedVisualizing
This paper presents our approach for visualizing, sonifying, and predicting wildfire in the near future using contactless interaction. We provide an interaction tool to depict the causes and results of wildfire and promote awareness of environmental issues by giving forecasting results of the wildfire to the audience. Multimodal interaction allows the audience to dynamically experience the changes of the wildfire over time in two representative locations (California, United States, and South Korea). The interactive multimodal data visualization and sonification depict the past, present, and future of the wildfire. This data-driven design was installed in an art gallery and presented to audience members. Contactless user interaction with Leap Motion cameras was used during the pandemic period for hygienic interaction. In this paper, we describe the design process and how this interface was developed based on environmental issues, and we discuss informal user responses from the art gallery setup.
The future is red: Visualizing wildfire predictions using contactless interaction
attachment
Han_Khanduja_2022_The_future_is_red.pdf
application/pdf
conferencePaper
SSE 2016
ISBN 978-1-4503-4397-8
DOI 10.1145/2993283.2993285
Proceedings of the 8th international workshop on social software engineering
New York, NY, USA
Association for Computing Machinery
North
Kevin J.
Sarma
Anita
Cohen
Myra B.
1 Purpose_Exploration
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_UP
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
2 VISReadingLevel_single
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_single
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
8 SONIDatasetType_table
9 VIS_Level_nominal
9 VIS_Level_ordinal
included_Stage5
https://doi.org/10.1145/2993283.2993285
1–7
2016
Citation Key: north_2016_UnderstandingGitHistory
Version control systems archive data about the development history of a project, which can be used to analyze and understand different facets of a software project. The project history can be used to evaluate the development process of a team, as an aid in bug fixing, or to help new members get on track with development. However, state-of-the-art techniques for analyzing version control data provide only partial views into this information and lack an easy way to present all the dimensions of the data. In this paper we present GitVS, a hybrid view that incorporates visualization and sonification to represent the multiple dimensions of version control data - development timeline, conflicts, etc. In a formative user study comparing the GitHub Network Graph, GitVS, and a version of GitVS without sound, we show that GitVS improves over the GitHub Network Graph and that while sound makes it easier to correctly understand version history for some tasks, it is more difficult for others.
Understanding git history: A multi-sense view
attachment
https://dl.acm.org/doi/pdf/10.1145/2993283.2993285
2023-11-24 13:40:12
North_et_al_2016_Understanding_git_history.pdf
1
application/pdf
journalArticle
Groppe
Sven
Klinckenberg
Rico
Warnke
Benjamin
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_yes
13 Goal_Education
13 Goal_PublicEngagement
14 Users_Domain Experts
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_none
6 SONIsearch_explore
7 VISDatasetType_network
8 SONIDatasetType_network
9 VIS_Level_interval
9 VIS_Level_nominal
9 VIS_Level_ordinal
included_Stage5
https://doi.org/10.14778/3476311.3476322
2695–2698
2021-07
Number: 12
Citation Key: groppe_2021_SoundDatabasesSonification
Sonifications map data to auditory dimensions and offer a new audible experience to their listeners. We propose a sonification of query processing paired with a corresponding visualization, both integrated in a web application. In this demonstration we show that the sonification of different types of relational operators generates different sound patterns, which listeners can recognize and identify, increasing their understanding of the operators' functionality and supporting easy remembering of requirements, such as merge joins working on sorted input. Furthermore, new ways of analyzing query processing are possible with the sonification approach.
Sound of databases: Sonification of a semantic web database engine
14
12
Proc. VLDB Endow.
ISSN 2150-8097
DOI 10.14778/3476311.3476322
attachment
Groppe_et_al_2021_Sound_of_databases.pdf
application/pdf
conferencePaper
DOI 10.5281/zenodo.1422501
Proc. 15th Sound and Music Computing Conference (SMC2018)
Limassol, Cyprus
Zenodo
Fitzpatrick
Joe
Neff
Flaithri
1 Purpose_Exploration
10 SONI_Level_nominal
10 SONI_Level_ordinal
12 Interaction_no
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_single
4 LevelofReduncancy_mixed
5 VISsearch_locate
6 SONIsearch_locate
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
52-59
2018
Citation Key: fitzpatrick_2018_StreamSegregationUtilizing
Zotero
en
The sonification of line charts, from which auditory line charts are produced, is a common sonification strategy used today. This paper examines timbre as a potentially useful sonic dimension for relaying information in sonified line charts. A user study is presented in which 43 participants were tasked with identifying particular trends among multiple distractor trends using sonified data. These sonified data comprised frequency-mapped trends isolated with the gradual enrichment of harmonic content, using a sawtooth wave as a guideline for the overall harmonic structure. Correlations between harmonic content and identification success rates were examined. Results from the study indicate that the majority of participants consistently chose the sample with the most harmonics when deciding which sonified trend best represented the visual equivalent. However, this confidence decreased with each harmonic addition, to the point of complete uncertainty when choosing between a sample with 3 harmonics and a sample with 4 harmonics.
Stream Segregation: Utilizing Harmonic Variance in Auditory Graphs
attachment
Fitzpatrick_Neff_2018_Stream_Segregation.pdf
application/pdf
conferencePaper
Proc. 27th International Conference on Auditory Display (ICAD 2022)
Virtual Conference
Georgia Institute of Technology
Russo
Matt
Santaguida
Andrew
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_UE
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_none
6 SONIsearch_none
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/67384
64-68
2022-06
Publisher: Georgia Institute of Technology
Citation Key: russo_2022_5000ExoplanetsListen
2023-10-30 17:33:45
repository.gatech.edu
en
In March of 2022, NASA announced the discovery of the 5000th planet orbiting a star other than our sun (an exoplanet). We have created a sonification and visualization to celebrate this milestone and to communicate the exciting history of discovery to the general public. Our work provides a visceral experience of how humanity’s knowledge of alien worlds has progressed. A relatively simple and straightforward sonification mapping is used to make the informational content as accessible to the general public as possible. Listeners can see and hear the timing, number, and relative orbital periods of the exoplanets that have been discovered to date. The sonification was experienced millions of times through NASA’s social media channels and there are plans to update the sonification as future milestones are reached.
5000 Exoplanets: Listen to the Sounds of Discovery
5000 Exoplanets
attachment
https://repository.gatech.edu/bitstreams/dac9bed7-3c20-4596-80e1-cc12cd61e487/download
2023-10-30 17:33:46
Russo_Santaguida_2022_5000_Exoplanets.pdf
1
application/pdf
journalArticle
Berger
Markus
Bill
Ralf
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_AP
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_XR
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_interval
environmental noise
immersive analytics
included_Stage5
multisensory
urban planning
virtual reality
https://www.mdpi.com/2414-4088/3/2/34/htm
34:1-34:15
2019-05-13
Number: 2
Publisher: Multidisciplinary Digital Publishing Institute
Citation Key: berger_2019_CombiningVRVisualization
2022-08-04
Urban traffic noise situations are usually visualized as conventional 2D maps or 3D scenes. These representations are indispensable tools to inform decision makers and citizens about issues of health, safety, and quality of life but require expert knowledge in order to be properly understood and put into context. The subjectivity of how we perceive noise as well as the inaccuracies in common noise calculation standards are rarely represented. We present a virtual reality application that seeks to offer an audiovisual glimpse into the background workings of one of these standards, by employing a multisensory, immersive analytics approach that allows users to interactively explore and listen to an approximate rendering of the data in the same environment that the noise simulation occurs in. In order for this approach to be useful, it should manage complicated noise level calculations in a real time environment and run on commodity low-cost VR hardware. In a prototypical implementation, we utilized simple VR interactions common to current mobile VR headsets and combined them with techniques from data visualization and sonification to allow users to explore road traffic noise in an immersive real-time urban environment. The noise levels were calculated over CityGML LoD2 building geometries, in accordance with Common Noise Assessment Methods in Europe (CNOSSOS-EU) sound propagation methods.
Combining VR Visualization and Sonification for Immersive Exploration of Urban Noise Standards
3
2
Multimodal Technologies and Interaction
ISSN 2414-4088
DOI 10.3390/MTI3020034
attachment
https://www.mdpi.com/2414-4088/3/2/34
2024-01-08 10:38:32
34
3
attachment
Berger_Bill_2019_Combining_VR_Visualization_and_Sonification_for_Immersive_Exploration_of_Urban.pdf
application/pdf
conferencePaper
Proc. 21st International Conference on Auditory Display (ICAD 2015)
Graz, Austria
Georgia Institute of Technology
Winters
R. Michael
Weinberg
Gil
1 Purpose_Presentation
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/54149
273-280
2015-07
Citation Key: winters_2015_SonificationTohokuEarthquake
2023-10-30 17:31:04
repository.gatech.edu
en_US
The past century has witnessed the emergence of expressive musical forms that originate in appropriated technologies and practices. In most cases, this appropriation is performed without protest— but not always. Carefully negotiating a space for sound as an objective, scientific medium, the field of sonification has cautiously guarded the term from subjective and affective endeavors. This paper explores the tensions arising in sonification popularization through a formal analysis of Sonification of the Tohoku Earthquake, a two-minute YouTube video that combined audification with a time-aligned seismograph, text and heatmap. Although the many views the video has received speak to a high public impact, the features contributing to this popularity have not been formalized, nor the extent to which these characteristics further sonifications’ scientific mission. For this purpose, a theory of popularization based upon “sublime listening experiences” is applied. The paper concludes by drawing attention to broader themes in the history of music and technology and presents guidelines for designing effective public-facing examples.
Sonification of the Tohoku earthquake: Music, popularization & the auditory sublime
Sonification of the Tohoku earthquake
attachment
https://repository.gatech.edu/bitstreams/2f9625cb-e47b-4af4-bb4f-fb73349d778a/download
2023-10-30 17:31:10
Winters_Weinberg_2015_Sonification_of_the_Tohoku_earthquake.pdf
1
application/pdf
conferencePaper
Proc. 21st International Conference on Auditory Display (ICAD 2015)
Graz, Austria
Georgia Institute of Technology
Ballora
Mark
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_NONE
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes but offline
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_browse
5 VISsearch_explore
5 VISsearch_locate
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/54172
300-301
2015-07
Citation Key: ballora_2015_TwoExamplesSonification
2023-10-30 17:33:32
repository.gatech.edu
en_US
This extended abstract describes two sets of sonifications that were commissioned by researchers from the fields of meteorology and animal ecology. The sonifications were created with the software synthesis program SuperCollider [1]. The motivation for creating them was to pursue additional levels of engagement and immersion, supplementing the effects of visual plots. The goal is for audiences, in particular students and laypeople, to readily understand (and hopefully find compelling) the phenomena being described. The approach is parameter-based, creating “sonic scatter plots” [2] in the same manner as work described in earlier publications [3-4].
Two examples of sonification for viewer engagement: Hurricanes and squirrel hibernation cycles
Two examples of sonification for viewer engagement
attachment
https://repository.gatech.edu/bitstreams/56cd04cd-5694-4b8d-a679-d6c89c823c61/download
2023-10-30 17:33:34
Ballora_2015_Two_examples_of_sonification_for_viewer_engagement.pdf
1
application/pdf
conferencePaper
Proc. 26th International Conference on Auditory Display (ICAD 2021)
Virtual Conference
Georgia Institute of Technology
Temor
Lucas
MacDonald
Daniel E.
Natarajan
Thangam
Coppin
Peter W.
Steinman
David A.
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_no
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
8 SONIDatasetType_table
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/66343
202-209
2021-06
Publisher: Georgia Institute of Technology
Citation Key: temor_2021_PerceptuallymotivatedSonificationSpatiotemporallydynamic
2023-10-30 16:37:04
repository.gatech.edu
en
Everyday perception and action are fundamentally multisensory. Despite this, the sole reliance on visualization for the representation of complex 3D spatiotemporal data is still widespread. In the past we have proposed various prototypes for the sonification of dense data from computational fluid dynamics (CFD) simulations of turbulent-like blood flow, but did not robustly consider the perception and associated meaning-making of the resultant sounds. To reduce some of the complexities of these data for sonification, in this work we present a feature-based approach, applying ideas from auditory scene analysis to sonify different data features along perceptually-separable auditory streams. As there are many possible features in these dense data, we followed the analogy of "caricature" to guide our definition and subsequent amplification of unique spectral and fluctuating features, while effectively minimizing the features common between simulations. This approach may allow for better insight into the behavior of flow instabilities when compared to our previous sonifications and/or visualizations, and additionally we observed benefits when some redundancy was maintained between modalities.
Perceptually-motivated sonification of spatiotemporally-dynamic CFD data
attachment
https://repository.gatech.edu/bitstreams/31920306-5cf5-43dc-910b-80443fcf0d76/download
2023-10-30 16:37:07
Temor_et_al_2021_Perceptually-motivated_sonification_of_spatiotemporally-dynamic_CFD_data.pdf
1
application/pdf
conferencePaper
Proc. 25th International Conference on Auditory Display (ICAD 2019)
Northumbria University
Georgia Institute of Technology
Malikova
E.
Adzhiev
V.
Fryazinov
O.
Pasko
A.
1 Purpose_Exploration
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
9 VIS_Level_ordinal
included_Stage5
http://hdl.handle.net/1853/61517
147-154
2019-06
Publisher: Georgia Institute of Technology
Citation Key: malikova_2019_VisualauditoryVolumeRendering
2023-10-30 16:36:29
repository.gatech.edu
en
This paper describes a novel approach to visual-auditory volume rendering of continuous scalar fields. The proposed method uses well-established similarities in light transfer and sound propagation modelling to extend the visual scalar field data analysis with auditory attributes. We address the visual perception limitations of existing volume rendering techniques and show that they can be handled by auditory analysis. In particular, we describe a practical application to demonstrate how the proposed approach may keep the researcher aware of the visual perception issues in colour mapping and help track and detect geometrical features and symmetry break, issues that are important in the context of interpretation of the physical phenomena.
Visual-auditory volume rendering of scalar fields
attachment
https://repository.gatech.edu/bitstreams/f1e6da04-82b3-4789-b080-5d4927630683/download
2023-10-30 16:36:34
Malikova_et_al_2019_Visual-auditory_volume_rendering_of_scalar_fields.pdf
1
application/pdf
conferencePaper
Proc. 26th International Conference on Auditory Display (ICAD 2021)
Virtual Conference
Georgia Institute of Technology
Lindetorp
Hans
Falkenberg
Kjetil
1 Purpose_Both
10 SONI_Level_ordinal
11 Evaluation_UE
12 Interaction_yes
13 Goal_DataAnalysis
13 Goal_Education
13 Goal_PublicEngagement
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_single
20 Venue ICAD
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
5 VISsearch_browse
6 SONIsearch_none
8 SONIDatasetType_table
9 VIS_Level_ordinal
included_Stage5
http://hdl.handle.net/1853/66351
15-21
2021-06
Publisher: Georgia Institute of Technology
Citation Key: lindetorp_2021_SonificationEveryoneEverywhere
2023-10-30 17:34:59
repository.gatech.edu
en
Creating an effective sonification is a challenging task that requires expertise-level skills and knowledge in several disciplines. This study contributes the WebAudioXML Sonification Toolkit (WAST), which aims at reaching new groups who have not yet considered themselves part of the ICAD community. We designed, built, and evaluated the toolkit by analysing ten student projects that used it, and conclude that WAST met our expectations: it led to students taking a deep approach to learning and successfully contributed to reaching the learning outcomes. The results indicate that WAST is easy to use, highly accessible, and flexible, and that sonifications can be shared in any device's web browser simply through a web link, without installations. We also suggest that a sonification toolkit would become an even more creative environment with virtual instruments and mixing features typically found in Digital Audio Workstations.
Sonification for everyone everywhere: Evaluating the WebAudioXML sonification toolkit for browsers
Sonification for everyone everywhere
attachment
https://repository.gatech.edu/bitstreams/1669e2a2-d274-40a3-bd93-c16e7cf76125/download
2023-10-30 17:35:01
Lindetorp_Falkenberg_2021_Sonification_for_everyone_everywhere.pdf
1
application/pdf
conferencePaper
DOI 10.21785/icad2021.005
Proc. 26th International Conference on Auditory Display (ICAD 2021)
Virtual Conference
Georgia Institute of Technology
Cantrell
Stanley J.
Walker
Bruce N.
Moseng
Øystein
1 Purpose_Both
10 SONI_Level_interval
10 SONI_Level_ratio
12 Interaction_yes
13 Goal_DataAnalysis
13 Goal_Education
13 Goal_PublicEngagement
14 Users_General Public
14 Users_Researchers
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_browse
5 VISsearch_explore
5 VISsearch_locate
6 SONIsearch_explore
6 SONIsearch_locate
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/66348
210-216
2021-06
Publisher: Georgia Institute of Technology
Citation Key: cantrell_2021_HighchartsSonificationStudio
2023-10-30 17:31:49
repository.gatech.edu
en
The Highcharts Sonification Studio is the culmination of a multi-year collaboration between Highsoft — the creators of Highcharts — and the Georgia Tech Sonification Lab to develop an extensible, accessible, online spreadsheet and multimodal graphing platform for the auditory display, assistive technology, and STEM education communities. The Highcharts Sonification Studio leverages the advances in auditory display and sonification research, as well as over 20 years of experience gained through research and development of the original Sonification Sandbox. We discuss the iterative design and evaluation process of the Highcharts Sonification Studio to ensure usability and accessibility, highlight opportunities for growth of the tool, and discuss its use for research, art, and education within the ICAD community and beyond.
Highcharts Sonification Studio: an online, open-source, extensible, and accessible data sonification tool
Highcharts Sonification Studio
attachment
https://repository.gatech.edu/bitstreams/eec8f48f-828a-4ced-b86d-af500c812ede/download
2023-10-30 17:31:52
Cantrell_et_al_2021_Highcharts_Sonification_Studio.pdf
1
application/pdf
conferencePaper
Proc. 28th International Conference on Auditory Display (ICAD 2023)
Norrköping, Sweden
Georgia Institute of Technology
Peng
Tristan
Choi
Hongchan
Berger
Jonathan
1 Purpose_Presentation
10 SONI_Level_interval
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_no
13 Goal_Education
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_single
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_none
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
9 VIS_Level_ordinal
included_Stage5
https://hdl.handle.net/1853/72879
78-84
2023-06
Publisher: Georgia Institute of Technology
Citation Key: peng_2023_SirenCreativeExtensible
2023-10-30 17:10:05
repository.gatech.edu
en
Parameter mapping sonification is a pervasive aspect of many everyday situations, from knocking on a watermelon to determine “goodness,” to diagnostic listening to the pings and rattles of an engine. However, not all mappings are so intuitive. Sonification Interface for REmapping Nature (SIREN) is a multipurpose, powerful web application that aims at exploring sonification parameter mappings allowing for a wide range of flexibility in terms of sonic choices and mapping strategies. SIREN aspires to enable users to arrive at sophisticated sonification without getting bogged down in the details of programming.
Siren: Creative and Extensible Sonification on the Web
Siren
attachment
https://repository.gatech.edu/bitstreams/ceadadcb-51d6-4c32-a12f-01ec27610288/download
2023-10-30 17:10:07
Peng_et_al_2023_Siren.pdf
1
application/pdf
conferencePaper
DOI 10.21785/icad2017.072
Proc. 23rd International Conference on Auditory Display (ICAD 2017)
Pennsylvania State University
Georgia Institute of Technology
Chabot
Samuel
Braasch
Jonas
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Social Sciences
16 Display_Physical Environment
17 demo_no
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
included_Stage5
http://hdl.handle.net/1853/58381
203-210
2017-06
Publisher: Georgia Institute of Technology
Citation Key: chabot_2017_ImmersiveVirtualEnvironment
2023-10-30 17:09:49
repository.gatech.edu
en
The use of spatialization techniques in data sonification provides system designers with an additional tool for conveying information to users. Oftentimes, spatialized data sets are meant to be experienced by a single user or a few users at a time. Projects at Rensselaer's Collaborative-Research Augmented Immersive Virtual Environment Laboratory allow even large groups of collaborators to work within a shared virtual environment system. The lab places an equal emphasis on the visual and audio systems, with a nearly 360° panoramic display and a 128-loudspeaker array housed behind the acoustically transparent screen. The space allows for dynamic switching between immersion in recreations of physical scenes and presentations of abstract or symbolic data. Content creation for the space is not a complex process: the entire display is essentially a single desktop, and straightforward tools such as the Virtual Microphone Control allow for dynamic real-time spatialization. With the ability to target individual channels in the array, audio-visual congruency is achieved. The loudspeaker array creates a high-spatial-density soundfield within which users are able to freely explore due to the virtual elimination of a so-called “sweet spot.”
An Immersive Virtual Environment for Congruent Audio-Visual Spatialized Data Sonifications
attachment
https://repository.gatech.edu/bitstreams/6f897d3f-6d5a-451b-b444-32f53eecc7bc/download
2023-10-30 17:09:55
Chabot_Braasch_2017_An_Immersive_Virtual_Environment_for_Congruent_Audio-Visual_Spatialized_Data.pdf
1
application/pdf
conferencePaper
Proc. 17th International Conference on Auditory Display (ICAD 2011)
Budapest, Hungary
Adhitya
Sara
Kuuskankare
Mika
1 Purpose_Presentation
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Desktop
17 demo_no
2 VISReadingLevel_group
20 Venue ICAD
3 SONIReadingLevel_group
4 LevelofReduncancy_mixed
5 VISsearch_explore
5 VISsearch_locate
5 VISsearch_lookup
6 SONIsearch_browse
6 SONIsearch_explore
6 SONIsearch_lookup
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_nominal
9 VIS_Level_ordinal
included_Stage5
http://hdl.handle.net/1853/51918
2011-06
Citation Key: adhitya_2011_SonifiedUrbanMasterplan
2023-10-30 17:08:44
repository.gatech.edu
en_US
This paper describes the progress of an interdisciplinary project that explores the potential for sonification in urban planning and design. The project involves the translation of visual urban mapping techniques used in urban planning and design into sound, through the development of the Sonified Urban Masterplan (SUM) tool. We describe our sonification approach and outline the implementation of the SUM tool within the computer-aided composition (CAC) environment PWGL. The tool is applied to a selected urban data set to demonstrate its potential. The paper concludes with the advantages of such an approach for urban analysis and introduces the possibility, within CAC environments such as PWGL and OpenMusic, of ‘composing’ urban plans and designs using sound.
The Sonified Urban Masterplan (SUM) Tool: Sonification for Urban Planning and Design
The Sonified Urban Masterplan (SUM) Tool
attachment
https://repository.gatech.edu/bitstreams/28764f41-7e6a-4c1f-95d1-ea7b5913cc27/download
2023-10-30 17:08:47
Adhitya_Kuuskankare_2011_The_Sonified_Urban_Masterplan_(Sum)_Tool.pdf
1
application/pdf
conferencePaper
Proc. 19th International Conference on Auditory Display (ICAD2013)
Lodz, Poland
Georgia Institute of Technology
Rogińska
Agnieszka
Friedman
Kent
Mohanraj
Hariharan
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
20 Venue ICAD
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/51653
95-101
2013-07
Publisher: Georgia Institute of Technology
Citation Key: roginska_2013_ExploringSonificationAugmenting
2023-10-30 17:08:42
repository.gatech.edu
en
Medical image data has traditionally been analyzed using visual displays and statistical analysis methods. Visual representations are limited due to the nature of the display and the number of dimensions that can be represented visually. This paper describes the use of sonification to represent medical image data of brain scans of patients with Alzheimer's dementia. Sonification is presented as an approach to augment the traditional methods used to diagnose Alzheimer's dementia.
Exploring sonification for augmenting brain scan data
attachment
https://repository.gatech.edu/bitstreams/d1050568-814f-4798-a15e-5d6ef804ad06/download
2023-10-30 17:08:44
Roginska_et_al_2013_Exploring_sonification_for_augmenting_Brain_scan_data.pdf
1
application/pdf
conferencePaper
Proc. 17th International Conference on Auditory Display (ICAD-2011)
Budapest, Hungary
Gomez
Imanol
Ramirez
Rafael
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes but offline
2 VISReadingLevel_group
20 Venue ICAD
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/51569
2011-06
Citation Key: gomez_2011_DATASONIFICATIONAPPROACH
2023-10-30 17:08:30
repository.gatech.edu
en_US
The study of human brain functions has increased dramatically due to the advent of Functional Magnetic Resonance Imaging (fMRI), arguably the best technique currently available for observing human brain activity. However, fMRI techniques produce extremely high-dimensional, sparse, and noisy data which is difficult to visualize, monitor, and analyze. In this paper, we propose two different sonification approaches to monitor fMRI data. The goal of the resulting fMRI data sonification system is to allow the auditory identification of cognitive states produced by different stimuli. The system consists of a feature selection component and a sonification engine. We explore different feature selection methods and sonification strategies. As a case study, we apply our system to the identification of cognitive states produced by volume-accented and duration-accented rhythmic stimuli.
A Data Sonification Approach to Cognitive State Identification
attachment
https://repository.gatech.edu/bitstreams/8b2a0197-7baa-4fc2-921c-61d84b3cdc27/download
2023-10-30 17:08:31
Gomez_Ramirez_2011_A_Data_Sonification_Approach_to_Cognitive_State_Identification.pdf
1
application/pdf
conferencePaper
Proc. 17th International Conference on Auditory Display (ICAD-2011)
Budapest, Hungary
Bearman
Nick
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_UE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
5 VISsearch_lookup
6 SONIsearch_lookup
7 VISDatasetType_geometry
8 SONIDatasetType_geometry
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/51922
2011-06
Citation Key: bearman_2011_UsingSoundRepresent
2023-10-30 17:08:21
repository.gatech.edu
en_US
This paper compares different visual and sonic methods of representing uncertainty in spatial data. When handling large volumes of spatial data, users can be limited in the amount that can be displayed at once due to visual saturation (when no more data can be shown visually without obscuring existing data). Using sound in combination with visual methods may help to represent uncertainty in spatial data; this example uses the UK Climate Predictions 2009 (UKCP09) dataset, in which uncertainty has been included for the first time. Participants took part in the evaluation via a web-based interface which used the Google Maps API to show the spatial data and capture user inputs. Using sound and vision together to show the same variable may be useful to colour-blind users. Previous awareness of the data set appears to have a significant impact (p < 0.001) on participants' ability to utilise the sonification. Using sound to reinforce data shown visually resulted in increased scores (p = 0.005), and using sound to show some data instead of vision showed a significant increase in speed without reducing effectiveness (p = 0.033) with repeated use of the sonification.
Using Sound to Represent Uncertainty in Future Climate Projections for the United Kingdom
attachment
https://repository.gatech.edu/bitstreams/8079193e-d630-4ea7-8e47-16e79a8d8dc3/download
2023-10-30 17:08:23
Bearman_2011_Using_Sound_to_Represent_Uncertainty_in_Future_Climate_Projections_for_the.pdf
1
application/pdf
conferencePaper
ISBN 978-0-9670904-7-4
DOI 10.21785/icad2021.018
Proc. 26th International Conference on Auditory Display (ICAD 2021)
Virtual Conference
International Community for Auditory Display
Elmquist
Elias
Ejdbo
Malin
Bock
Alexander
Rönnberg
Niklas
1 Purpose_Both
10 SONI_Level_interval
11 Evaluation_UE
12 Interaction_yes
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
3 SONIReadingLevel_single
3 SONIReadingLevel_whole
5 VISsearch_browse
6 SONIsearch_browse
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/66324
135-142
2021-06
Citation Key: elmquist_2021_OpenspaceSonificationComplementing
2023-10-30 16:50:55
DOI.org (Crossref)
ICAD 2021: The 26th International Conference on Auditory Display
en
Data visualization software is commonly used to explore outer space in a planetarium environment, where the software's visuals are typically accompanied by a narrator and supplementary background music. By letting sound take a bigger role in these kinds of presentations, a more informative and immersive experience can be achieved. The aim of the present study was to explore how sonification can be used as a complement to the visualization software OpenSpace to convey information about the Solar System, as well as to increase the perceived immersiveness for the audience in a planetarium environment. This was investigated by implementing a sonification that conveyed planetary properties, such as the size and orbital period of a planet, by mapping these data to sonification parameters. With a user-centered approach, the sonification was designed iteratively and evaluated in both an online and a planetarium environment. The results of the evaluations show that the participants found the sonification informative and interesting, which suggests that sonification can be beneficially used as a complement to visualization in a planetarium environment.
Openspace Sonification: Complementing Visualization of the Solar System with Sound
Openspace Sonification
attachment
https://repository.gatech.edu/server/api/core/bitstreams/b7c92059-16ac-48f4-8993-89540baa9b2f/content
2023-10-30 16:50:52
Elmquist_et_al_2021_Openspace_Sonification.pdf
1
application/pdf
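The Elmquist et al. abstract above names a concrete mapping: planetary properties such as size and orbital period drive sonification parameters. A hedged sketch of one such mapping follows; the functions, scales, and constants are assumptions for illustration, not the OpenSpace implementation.

import numpy as np

PLANETS = {          # radius (Earth radii), orbital period (Earth years)
    "Mercury": (0.38, 0.24),
    "Earth":   (1.00, 1.00),
    "Jupiter": (11.2, 11.9),
}

def pitch_hz(radius, lo=110.0, hi=880.0, r_max=11.2):
    # Larger radius -> lower pitch, on a logarithmic scale.
    return hi * (lo / hi) ** (np.log1p(radius) / np.log1p(r_max))

def pulse_period_s(orbital_years, seconds_per_year=2.0):
    # Compress the orbit so one year of orbital motion lasts two seconds of audio.
    return orbital_years * seconds_per_year

for name, (r, p) in PLANETS.items():
    print(f"{name}: tone {pitch_hz(r):.0f} Hz, repeats every {pulse_period_s(p):.1f} s")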
conferencePaper
DOI 10.21785/icad2016.023
Proc. 22nd International Conference on Auditory Display (ICAD–2016)
Canberra, Australia
Gionfrida
Letizia
Roginska
Agnieszka
Keary
James
Mohanraj
Hariharan
Friedman
Kent P.
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_UE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_browse
6 SONIsearch_none
7 VISDatasetType_field
8 SONIDatasetType_table
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/56570
2016-07
Citation Key: gionfrida_2016_TripleToneSonification
2023-10-30 16:50:06
repository.gatech.edu
en
For the current diagnosis of Alzheimer's dementia (AD), physicians and neuroscientists primarily call upon visual and statistical analysis methods for large, multi-dimensional positron emission tomography (PET) brain scan data sets. As these data sets are complex in nature, the assessment of disease severity proves challenging and is susceptible to cognitive and perceptual errors, causing intra- and inter-reader variability among doctors. The Triple-Tone Sonification method, first presented and evaluated by Roginska et al., adds an audible element to the diagnosis process, offering doctors another tool to gain certainty and clarity about disease stages. Based on audible beating patterns produced by three interacting frequencies extracted from PET brain scan data, the Triple-Tone method underwent a second round of subjective listening tests and evaluation, this time with radiologists from NYU Langone Medical Center. Results show the method is effective for evaluating PET brain scan data.
The Triple Tone Sonification Method to Enhance the Diagnosis of Alzheimer’s Dementia
attachment
https://repository.gatech.edu/bitstreams/45013f32-db84-45f1-a00e-00bb736c352e/download
2023-10-30 16:50:08
Gionfrida_et_al_2016_The_Triple_Tone_Sonification_Method_to_Enhance_the_Diagnosis_of_Alzheimer’s.pdf
1
application/pdf
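The beating mechanism named in the Gionfrida et al. abstract above, three interacting frequencies whose differences are heard as beats, can be reproduced in a few lines. The frequencies below are invented for illustration; the paper extracts them from PET brain scan data.

import numpy as np

sr = 44100
t = np.linspace(0, 3.0, int(sr * 3.0), endpoint=False)
f1, f2, f3 = 440.0, 443.0, 447.5   # stand-ins for data-derived frequencies
signal = sum(np.sin(2 * np.pi * f * t) for f in (f1, f2, f3)) / 3.0
# Beat rates heard: |f2-f1| = 3 Hz, |f3-f2| = 4.5 Hz, |f3-f1| = 7.5 Hz.
# The closer the frequencies, the slower the beating, so the beat pattern
# itself carries information about the underlying data values.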
conferencePaper
DOI 10.21785/icad2018.032
Proc. 24th International Conference on Auditory Display (ICAD 2018)
Michigan Technological University
Georgia Institute of Technology
Arbon
Robert E.
Jones
Alex J.
Bratholm
Lars A.
Mitchell
Tom
Glowacki
David R.
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_network
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/60093
232-239
2018-06
Publisher: Georgia Institute of Technology
Citation Key: arbon_2018_SonifyingStochasticWalks
2023-10-30 16:46:07
repository.gatech.edu
en
Translating the complex, multi-dimensional data produced by simulations of biomolecules into an intelligible form is a major challenge in computational chemistry and biology. The so-called “free energy landscape” is amongst the most fundamental concepts used by scientists to understand both static and dynamic properties of biomolecular systems. In this paper we use Markov models to design a strategy for mapping features of this landscape to sonic parameters, for use in conjunction with visual display techniques such as structural animations and free energy diagrams. This allows for concurrent visual display of the physical configuration of a biomolecule and auditory display of characteristics of the corresponding free energy landscape. The resulting sonification provides information about the relative free energy features of a given configuration including its stability.
Sonifying stochastic walks on biomolecular energy landscapes
attachment
https://repository.gatech.edu/bitstreams/55843b09-cfa7-492e-80fe-e28cd63f11a5/download
2023-10-30 16:46:09
Arbon_et_al_2018_Sonifying_stochastic_walks_on_biomolecular_energy_landscapes.pdf
1
application/pdf
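The strategy in the Arbon et al. abstract above pairs a Markov model of the free energy landscape with a sonic mapping. As a hedged sketch: states stand in for free-energy basins, a random walk follows the transition matrix, and each state's stability (its self-transition probability) is mapped to pitch. The matrix and the mapping below are invented for illustration, not taken from the paper.

import numpy as np

T = np.array([[0.90, 0.08, 0.02],   # hypothetical 3-state transition matrix
              [0.10, 0.85, 0.05],
              [0.05, 0.15, 0.80]])

def walk(T, steps=16, start=0, seed=0):
    # Stochastic walk over the Markov states.
    rng = np.random.default_rng(seed)
    s, states = start, []
    for _ in range(steps):
        states.append(s)
        s = rng.choice(len(T), p=T[s])
    return states

def state_pitch(s, lo_hz=220.0, hi_hz=660.0):
    stability = T[s, s]              # more stable basin -> lower pitch
    return hi_hz - stability * (hi_hz - lo_hz)

print([round(state_pitch(s)) for s in walk(T)])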
conferencePaper
Proc. 24th International Conference on Auditory Display (ICAD 2018)
Michigan Technological University
Georgia Institute of Technology
Riber
Adrián García
1 Purpose_Both
10 SONI_Level_nominal
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
8 SONIDatasetType_table
9 VIS_Level_nominal
9 VIS_Level_ordinal
included_Stage5
http://hdl.handle.net/1853/60073
219-226
2018-06
Publisher: Georgia Institute of Technology
Citation Key: riber_2018_PlanethesizerApproachingExoplanet
2023-10-30 16:45:55
repository.gatech.edu
en
The creation of simulations, sounds, and images based on information about an object of investigation is now a practical tool used in multiple areas to bring the non-specialized public closer to scientific achievements and discoveries. In this context of multimodal representations and simulations developed for educational and informational purposes, this work builds a bridge between virtual musical instrument development and physical models, using the gravitational laws of the seven planets orbiting the Trappist-1 star. What follows is a case study of an interdisciplinary conversion-algorithm design that relates musical software synthesis to exoplanets' astronomical data, measured from the observed flux variations in the light curves of their star, and that suggests a systematic and reproducible method useful for any other planetary system or model-based virtual instrument design. As a result, the Virtual Interactive Synthesizer prototype Planethesizer is presented, whose default configurations present multimodal simulations of the Trappist-1, Kepler-444, and K2-72 planetary systems.
Planethesizer: Approaching exoplanet sonification
Planethesizer
attachment
https://repository.gatech.edu/bitstreams/a3305748-d2dd-4bfe-924b-f95c4834e904/download
2023-10-30 16:45:57
Riber_2018_Planethesizer.pdf
1
application/pdf
conferencePaper
ISBN 978-0-9670904-5-0
DOI 10.21785/icad2018.010
Proc. 24th International Conference on Auditory Display (ICAD 2018)
Houghton, Michigan
The International Community for Auditory Display
MacDonald
Daniel E.
Natarajan
Thangam
Windeyer
Richard C.
Coppin
Peter
Steinman
David A.
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_group
20 Venue ICAD
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
included_Stage5
http://hdl.handle.net/1853/60063
28-33
2018-06
Citation Key: macdonald_2018_DataDrivenSonificationCFD
2023-10-30 16:43:25
DOI.org (Crossref)
The 24th International Conference on Auditory Display
en
A novel method is presented for inspecting and characterizing turbulent-like hemodynamic structures in intracranial cerebral aneurysms by sonification of data generated using Computational Fluid Dynamics (CFD). The intention of the current research is to intuitively communicate flow complexity by augmenting conventional flow visualizations with data-driven sound, thereby increasing the ease of interpretation of dense spatiotemporal data through multimodal presentation. The described implementation allows the user to listen to flow fluctuations thought to indicate turbulent-like blood flow patterns that are often visually difficult to discriminate in conventional flow visualizations.
Data-Driven Sonification of CFD Aneurysm Models
attachment
https://repository.gatech.edu/server/api/core/bitstreams/4c5e9076-c2c8-4f20-a8d5-2a7960324245/content
2023-10-30 16:43:23
MacDonald_et_al_2018_Data-Driven_Sonification_of_CFD_Aneurysm_Models.pdf
1
application/pdf
conferencePaper
Proc. 25th International Conference on Auditory Display (ICAD 2019)
Northumbria University
Georgia Institute of Technology
Phillips
Sean
Cabrera
Andres
1 Purpose_Presentation
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_no
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_none
6 SONIsearch_none
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/61529
184-190
2019-06
Publisher: Georgia Institute of Technology
Citation Key: phillips_2019_SonificationWorkstation
2023-10-30 17:49:25
repository.gatech.edu
en
Sonification Workstation is an open-source application for general sonification tasks, designed with ease of use and wide applicability in mind. Intended to foster adoption of sonification across disciplines and to increase experimentation with sonification by non-specialists, Sonification Workstation distills tasks useful in sonification and encapsulates them in a single software environment. The novel interface combines familiar modes of navigation from digital audio workstations with a highly simplified patcher interface for creating the sonification scheme. Further, the software associates methods of sonification with the data they sonify in session files, which makes sharing and reproducing sonifications easier. It is posited that facilitating experimentation by non-specialists will increase the potential growth of sonification into fresh territory, encourage discussion of sonification techniques and uses, and create a larger pool of ideas to draw from in advancing the field of sonification. Source code is available at https://github.com/Cherdyakov/sonificationworkstation. Binaries for macOS and Windows, as well as sample content, are available at http://sonificationworkstation.org.
Sonification workstation
attachment
http://repository.gatech.edu/bitstreams/1ccc11a7-6849-4c65-9927-a46236c18bab/download
2023-12-01 09:55:27
Phillips_Cabrera_2019_Sonification_workstation.pdf
1
application/pdf
conferencePaper
Proc. 25th International Conference on Auditory Display (ICAD 2019)
Northumbria University
Georgia Institute of Technology
García Riber
Adrian
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
13 Goal_Education
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
http://hdl.handle.net/1853/61497
62-66
2019-06
Publisher: Georgia Institute of Technology
Citation Key: garciariber_2019_SonifigrapherSonifiedLight
2023-10-30 17:48:21
repository.gatech.edu
en
In an attempt to contribute to the constant feedback existing between science and music, this work describes the design strategies used in the development of the virtual synthesizer prototype called Sonifigrapher. Trying to achieve new ways of creating experimental music through the exploration of exoplanet data sonifications, this software provides an easy-to-use graph-to-sound quadraphonic converter, designed for the sonification of the light curves from NASA's publicly available exoplanet archive. Based on some features of the first analog tape recorder samplers, the prototype allows end-users to load a light curve from the archive and create controlled audio spectra making use of additive synthesis sonification. It is expected to be useful in creative, educational and informational contexts as part of an experimental and interdisciplinary development project for sonification tools, oriented to both non-specialized and specialized audiences.
Sonifigrapher: Sonified light curve synthesizer
Sonifigrapher
attachment
https://repository.gatech.edu/bitstreams/ddcfbdf5-ff01-49fa-9334-0d4911109e2b/download
2023-10-30 17:48:24
Garcia_Riber_2019_Sonifigrapher.pdf
1
application/pdf
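The graph-to-sound conversion described in the Sonifigrapher abstract above is an additive-synthesis mapping: the shape of the light curve controls an audio spectrum. A minimal sketch, assuming each flux sample sets the amplitude of one harmonic partial (an illustrative reading, not Sonifigrapher's code):

import numpy as np

def light_curve_to_audio(flux, base_hz=110.0, sr=44100, dur=2.0):
    # Each flux sample becomes the amplitude of one harmonic partial, so
    # dips in the curve (e.g., transits) reshape the resulting spectrum.
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    flux = np.asarray(flux, dtype=float)
    amps = flux / flux.max()
    partials = [a * np.sin(2 * np.pi * base_hz * (k + 1) * t)
                for k, a in enumerate(amps)]
    return np.sum(partials, axis=0) / len(partials)

audio = light_curve_to_audio([1.0, 0.99, 0.97, 0.85, 0.97, 0.99, 1.0])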
conferencePaper
Proc. 19th International Conference on Auditory Display (ICAD2013)
Lodz, Poland
Georgia Institute of Technology
Joliat
Nicholas
Mayton
Brian
Paradiso
Joseph A.
1 Purpose_Both
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Researchers
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_single
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_single
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
8 SONIDatasetType_field
9 VIS_Level_ratio
included_Stage5
http://hdl.handle.net/1853/51643
67-75
2013-07
Publisher: Georgia Institute of Technology
Citation Key: joliat_2013_SpatializedAnonymousAudio
2023-10-30 17:38:51
repository.gatech.edu
en
We explore new ways to communicate sensor data by combining spatialized sonification with animated data visualization in a 3D virtual environment. A system is designed and implemented that implies a sense of anonymized presence in an instrumented building by manipulating navigable live and recorded spatial audio streams. Exploration of both real-time and archived data is enabled. In particular, algorithms for obfuscating audio to protect privacy and for time-compressing audio to allow exploration on diverse time scales are implemented. Synthesized sonification of diverse, distributed sensor data in this context is also supported within our framework.
Spatialized anonymous audio for browsing sensor networks via virtual worlds
attachment
https://repository.gatech.edu/bitstreams/d294dcc8-5983-4da6-937b-331fe760c415/download
2023-10-30 17:38:52
Joliat_et_al_2013_Spatialized_anonymous_audio_for_browsing_sensor_networks_via_virtual_worlds.pdf
1
application/pdf
conferencePaper
DOI 10.21785/icad2023.6263
Proc. 28th International Conference on Auditory Display (ICAD 2023)
Norrköping, Sweden
Huppenkothen
Daniela
Pampin
Juan
Davenport
James R. A.
Wenlock
James
1 Purpose_Both
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
13 Goal_Education
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_group
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_single
4 LevelofReduncancy_complementary
5 VISsearch_browse
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
included_Stage5
https://hdl.handle.net/1853/72881
272-279
2023-06
Publisher: Georgia Institute of Technology
Citation Key: huppenkothen_2023_SonifiedHertzsprungRussellDiagram
2023-10-30 17:37:46
repository.gatech.edu
en
Understanding the physical properties of stars, and putting these properties into the context of stellar evolution, is a core challenge in astronomical research. A key visualization in studying stellar evolution is the Hertzsprung-Russell diagram (HRD), organizing data about stellar luminosity and colour into a form that is informative about stellar structure and evolution. However, connecting the HRD with other sources of information, including stellar time series, is an outstanding challenge. Here we present a new method to turn stellar time series into sound. This method encodes physically meaningful features such that auditory comparisons between sonifications of different stars preserve astrophysical differences between them. We present an interactive multimedia version of the HRD that combines both visual and auditory components and that allows exploration of different types of stars both on and off the main sequence through both visual and auditory media.
The Sonified Hertzsprung-Russell Diagram
attachment
https://repository.gatech.edu/bitstreams/fa1b3731-19fb-47f8-bd1b-bbe723e860ab/download
2023-10-30 17:37:48
Huppenkothen_et_al_2023_The_Sonified_Hertzsprung-Russell_Diagram.pdf
1
application/pdf
conferencePaper
DOI 10.21785/icad2023.204
Proc. 28th International Conference on Auditory Display (ICAD 2023)
Norrköping, Sweden
Georgia Institute of Technology
Traver
Peter
Bergh
Emil
1 Purpose_Presentation
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_no
2 VISReadingLevel_whole
20 Venue ICAD
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
6 SONIsearch_none
7 VISDatasetType_geometry
9 VIS_Level_nominal
included_Stage5
https://hdl.handle.net/1853/72864
242-248
2023-06
Publisher: Georgia Institute of Technology
Citation Key: traver_2023_HarmonicesSolarisSonification
2023-10-30 17:37:36
repository.gatech.edu
en
This project is a sonification of the planets in our solar system designed as a spatial audio installation. Audio for each planet was generated within Python using data from NASA about the planets (including Pluto). A Markov model across arbitrary pitch set space was used to generate pitch set collections for each planet with regard to the various parameters in the dataset. XP4Live is used for spatialization and visuals due to its real-time capability. The final piece, as presented, allows the audience to experience orbiting planet sound objects around them in space; the installation includes a mapping of the planets to faders on a MIDI controller allowing the audience to isolate individual planets to hear their sonic makeup and orbiting pattern. This model is presented in a spatial audio array of at least four speakers, but can also be presented as a binaural mix for virtual submission. Future work includes making this piece accessible to the public via a binaural rendering on a webpage, as well as VR implementation for enhanced interactivity.
Harmonices Solaris - Sonification of the Planets
attachment
https://repository.gatech.edu/bitstreams/0e19b3f9-3510-4a65-a842-9696707db2e5/download
2023-10-30 17:37:38
Traver_Bergh_2023_Harmonices_Solaris_-_Sonification_of_the_Planets.pdf
1
application/pdf
conferencePaper
Proc. 5th Interactive Sonification Workshop, ISon
CITEC, Bielefeld University
Rönnberg
Niklas
Johansson
Jimmy
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_UP
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Researchers
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ISon
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_browse
5 VISsearch_explore
5 VISsearch_locate
6 SONIsearch_browse
6 SONIsearch_explore
6 SONIsearch_locate
7 VISDatasetType_table
8 SONIDatasetType_table
included_Stage5
63-67
2016
Citation Key: ronnberg_2016_InteractiveSonificationVisual
Zotero
en
This paper presents an experiment designed to evaluate the possible benefits of sonification in information visualization and to give rise to further research challenges. It is hypothesized that, by using musical sounds for sonification when visualizing complex data, interpretation and comprehension of the visual representation could be increased through interactive sonification.
Interactive Sonification for Visual Dense Data Displays
attachment
Ronnberg_Johansson_2016_Interactive_Sonification_for_Visual_Dense_Data_Displays.pdf
application/pdf
journalArticle
Yang
Jiajun
Hermann
Thomas
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
14 Users_Researchers
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_group
3 SONIReadingLevel_group
3 SONIReadingLevel_single
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_none
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
included_Stage5
http://www.aes.org/e-lib/browse.cfm?elib=19712
703-711
2018-09-16
Number: 9
Citation Key: yang_2018_InteractiveModeExplorer
2023-11-03 08:37:03
DOI.org (Crossref)
Exploratory Data Analysis (EDA) refers to the process of detecting patterns in data when explicit knowledge of such patterns is missing. Because EDA predominantly employs data visualization, it remains challenging to visualize high-dimensional data. To minimize the challenge, some information can be shifted into the auditory channel using humans' highly developed listening skills. This paper introduces Mode Explorer, a new sonification model that enables continuous interactive exploration of datasets with regard to their clustering. The method was shown to be effective in supporting users in the more accurate assessment of cluster mass and number of clusters. While the Mode Explorer sonification aimed to support cluster analysis, the ongoing research has the goal of establishing a more general toolbox of sonification models, tailored to uncover different structural aspects of high-dimensional data. The principle of extending the data display to the auditory domain is applied by augmenting interactions with 2D scatter plots of high-dimensional data with information about the probability density function.
Interactive Mode Explorer Sonification Enhances Exploratory Cluster Analysis
66
9
Journal of the Audio Engineering Society
ISSN 1549-4950
J. Audio Eng. Soc.
DOI 10.17743/jaes.2018.0042
attachment
Yang_Hermann_2018_Interactive_Mode_Explorer_Sonification_Enhances_Exploratory_Cluster_Analysis.pdf
application/pdf
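The interaction at the heart of the Yang and Hermann abstract above is continuous: as the user scrubs across a 2D scatter plot, the local probability density under the pointer drives the sound. A hedged reconstruction of that density-to-parameter step follows (illustrative only, not the authors' Mode Explorer implementation):

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (200, 2)),        # two synthetic clusters
                  rng.normal(4, 0.5, (200, 2))]).T   # shape (2, N) for the KDE
kde = gaussian_kde(data)

def pitch_at(x, y, lo_hz=200.0, hi_hz=1200.0):
    # Map the local density under the pointer to a pitch.
    d = float(kde([[x], [y]]))
    norm = d / float(kde(data).max())   # rough normalization against the peak
    return lo_hz + min(norm, 1.0) * (hi_hz - lo_hz)

print(pitch_at(0.0, 0.0))    # inside a cluster: high density, higher pitch
print(pitch_at(10.0, 10.0))  # empty region: near the low end of the range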
conferencePaper
Proc. ACHI 2014: The Seventh International Conference on Advances in Computer-Human Interactions
IARIA
Papachristodoulou
Panagiota
Betella
Alberto
Verschure
Paul
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_UP
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
35-40
2014-01-01
Pages: 40
Citation Key: papachristodoulou_2014_SonificationLargeDatasets
ResearchGate
Auditory display techniques can play a key role in the understanding of hidden patterns in large datasets. In this study, we investigated the role of sonification applied to an immersive 3D visualization of a complex network dataset. As a test case, we used a 3D interactive visualization of the so-called connectome of the human brain in the immersive space called the "eXperience Induction Machine" (XIM). We conducted an empirical validation in which subjects were asked to perform a navigation task through the network and were subsequently tested on their understanding of the dataset. Our results showed that sonification provides a further layer of understanding of the dynamics of the network by enhancing the subjects' structural understanding of the data space.
Sonification of Large Datasets in a 3D Immersive Environment: A Neuroscience Case Study
Sonification of Large Datasets in a 3D Immersive Environment
attachment
https://www.researchgate.net/profile/Alberto-Betella/publication/262116508_Sonification_of_Large_Datasets_in_a_3D_Immersive_Environment_A_Neuroscience_Case_Study/links/0f317538dde324acbc000000/Sonification-of-Large-Datasets-in-a-3D-Immersive-Environment-A-Neuroscience-Case-Study.pdf
2023-12-01 08:40:44
Papachristodoulou_et_al_2014_Sonification_of_Large_Datasets_in_a_3D_Immersive_Environment.pdf
1
application/pdf
attachment
https://www.researchgate.net/publication/262116508_Sonification_of_Large_Datasets_in_a_3D_Immersive_Environment_A_Neuroscience_Case_Study
2023-12-01 08:40:44
ResearchGate Link
3
conferencePaper
DOI 10.5281/zenodo.6555839
Proc. AVI 2022 Workshop on Audio-Visual Analytics (WAVA22)
Frascati, Italy
Zenodo
Svoronos-Kanavas
Iason
Agiomyrgianakis
Vasilis
Rönnberg
Niklas
1 Purpose_Presentation
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_no
13 Goal_PublicEngagement
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
8 SONIDatasetType_table
9 VIS_Level_ratio
included_Stage5
2022
Citation Key: svoronos-kanavas_2022_ExploratoryUseAudiovisual
Zotero
en
The present study is an interdisciplinary endeavour that transmutes science, technology, and aesthetics into an audiovisual experience. The objective is to highlight the potential of combining sonification with visualisation in order to enhance the comprehension of extensive and complex sets of data. Moreover, this paper describes contemporary tools and methods for the implementation of the practice and suggests effective ways to monitor environmental changes. It can be regarded as an exploratory study for familiarisation with the potential of sonification and visualisation in the exploration of environmental data.
An exploratory use of audiovisual displays on oceanographic data
attachment
Svoronos-Kanavas_et_al_2022_An_exploratory_use_of_audiovisual_displays_on_oceanographic_data.pdf
application/pdf
conferencePaper
Proceedings of ISon 2016, 5th Interactive Sonification Workshop, CITEC, Bielefeld University
Bielefeld, Germany
Ballweg
Holger
Bronowska
Agnieszka K.
Vickers
Paul
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_UE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes but offline
2 VISReadingLevel_whole
20 Venue ISon
3 SONIReadingLevel_group
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
2016
Citation Key: ballweg_2016_InteractiveSonificationStructural
Zotero
ISon 2016
en
The visualisation of structural biology data can be quite challenging, as the datasets are complex, in particular in their intrinsic dynamics/flexibility. Therefore, some researchers have looked into the use of sonification for the display of proteins. Combining sonification and visualisation appears to be well suited to this problem, but at the time of writing there are no plugins available for any of the major molecular visualisation applications.
Interactive Sonification for Structural Biology and Structure-Based Drug Design
attachment
Ballweg_et_al_2016_Interactive_Sonification_for_Structural_Biology_and_Structure-Based_Drug_Design.pdf
application/pdf
journalArticle
Harrison
Chris
Trayford
James
Harrison
Leigh
Bonne
Nicolas
1 Purpose_Presentation
10 SONI_Level_ordinal
11 Evaluation_NONE
12 Interaction_no
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
6 SONIsearch_none
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_ordinal
included_Stage5
https://doi.org/10.1093/astrogeo/atac027
2.38-2.40
2022-04-01
Number: 2
Citation Key: harrison_2022_AudioUniverseTour
2023-12-07 14:16:08
Silverchair
Chris Harrison, James Trayford, Leigh Harrison and Nicolas Bonne have developed a sensory odyssey to demonstrate how the Universe can be made more accessible.
Audio universe: tour of the solar system
Audio universe
63
2
Astronomy & Geophysics
ISSN 1366-8781
Astronomy & Geophysics
DOI 10.1093/astrogeo/atac027
attachment
https://academic.oup.com/astrogeo/article-pdf/63/2/2.38/42829611/atac027.pdf
2023-12-07 14:16:11
Harrison_et_al_2022_Audio_universe.pdf
1
application/pdf
attachment
https://academic.oup.com/astrogeo/article/63/2/2.38/6546997
2023-12-07 14:16:13
Snapshot
3
text/html
conferencePaper
Proc. Web Audio Conference WAC-2017
London, UK
Kondak
Zachary
Liang
Tianchu (Alex)
Tomlinson
Brianna
Walker
Bruce N.
1 Purpose_Presentation
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_NONE
14 Users_General Public
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_yes but offline
2 VISReadingLevel_whole
3 SONIReadingLevel_single
3 SONIReadingLevel_whole
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_ratio
included_Stage5
https://qmro.qmul.ac.uk/xmlui/handle/123456789/26083
http://creativecommons.org/licenses/by-nc-nd/3.0/us/
2017-08-21
Accepted: 2017-10-02T08:57:59Z
Citation Key: kondak_2017_WebSonificationSandbox
2023-12-07 16:01:44
qmro.qmul.ac.uk
en
Auditory and multimodal presentation of data (“auditory graphs”) can allow for discoveries in a data set that are sometimes impossible with visual-only inspection. At the same time, multimodal graphs can make data, and the STEM fields that rely on them, more accessible to a much broader range of people, including many with disabilities. A variety of software tools have been developed to turn data into sound, including the widely used Sonification Sandbox, but there remains a need for a simple, powerful, and more accessible tool for the construction and manipulation of multimodal graphs. Web-based audio functionality is now at the point where it can be leveraged to provide just such a tool. Thus, we developed a web application, the Web Sonification Sandbox (or simply the Web Sandbox), that allows users to create and manipulate multimodal graphs that convey information through both sonification and visualization. The Web Sandbox is designed to be usable by individuals with no technical or musical expertise, which separates it from existing software. Its ease of use, combined with its multimodal nature, makes it maximally accessible to a diverse audience of users. Nevertheless, the application is also powerful and flexible enough to support advanced users.
Web Sonification Sandbox - an Easy-to-Use Web Application for Sonifying Data and Equations
attachment
https://qmro.qmul.ac.uk/xmlui/bitstream/123456789/26083/1/24.pdf
2023-12-07 16:01:47
Kondak_et_al_2017_Web_Sonification_Sandbox_-_an_Easy-to-Use_Web_Application_for_Sonifying_Data.pdf
1
application/pdf
conferencePaper
Proceedings of Machine Learning Research
123
Proceedings of the NeurIPS 2019 Competition and Demonstration Track
PMLR
Herrmann
Vincent
Escalante
Hugo Jair
Hadsell
Raia
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_ratio
11 Evaluation_NONE
12 Interaction_no
13 Goal_Research
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_network
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
included_Stage5
https://proceedings.mlr.press/v123/herrmann20a.html
192-202
2020-12-08
Citation Key: herrmann_2020_VisualizingSonifyingHow
A system is presented that visualizes and sonifies the inner workings of a sound-processing neural network in real time. The models employed have been trained on music datasets in a self-supervised way using contrastive predictive coding. An optimization procedure generates sounds that activate certain regions in the network; in this way, how music sounds to this artificial ear can be rendered audible. In addition, the activations of the neurons at each point in time are visualized. For this, a force-graph layout technique is used to create a vivid and dynamic representation of the neural network in action.
Visualizing and sonifying how an artificial ear hears music
attachment
http://proceedings.mlr.press/v123/herrmann20a/herrmann20a.pdf
2023-12-11 02:25:16
Full Text
1
application/pdf
conferencePaper
ISBN 978-1-5386-1424-2
DOI 10.1109/PacificVis.2018.00036
2018 IEEE Pacific Visualization Symposium (PacificVis)
Kobe
IEEE
Du
Meng
Chou
Jia-Kai
Ma
Chen
Chandrasegaran
Senthil
Ma
Kwan-Liu
1 Purpose_Both
10 SONI_Level_nominal
10 SONI_Level_ratio
12 Interaction_yes
13 Goal_DataAnalysis
13 Goal_PublicEngagement
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
3 SONIReadingLevel_single
4 LevelofReduncancy_redundant
5 VISsearch_explore
5 VISsearch_none
6 SONIsearch_explore
7 VISDatasetType_field
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_nominal
9 VIS_Level_ratio
included_Stage5
https://ieeexplore.ieee.org/document/8365996/
225-229
2018-04
Citation Key: du_2018_ExploringRoleSound
2023-12-18 21:49:24
DOI.org (Crossref)
2018 IEEE Pacific Visualization Symposium (PacificVis)
en
Studies on augmenting visualization with sound are typically based on the assumption that sound can be complementary and assist in data analysis tasks. While sound promotes a different sense of engagement than vision, we conjecture that augmenting a visualization with nonspeech audio can not only enhance the users' perception of the data but also increase their engagement with the data exploration process. We designed a preliminary user study to test users' performance and engagement while exploring a data visualization system under two different settings: visual-only and audiovisual. For our study, we used basketball player movement data from a game and created an interactive visualization system with three linked views. We added sound to the visualization to enhance the users' understanding of a team's offensive/defensive behavior. The results of our study suggest that we need to better understand the effect of sound choice and encoding before considering engagement. We also find that sound can be useful for drawing novice users' attention to patterns or anomalies in the data. Finally, we propose follow-up studies with designs informed by the findings from this study.
Exploring the Role of Sound in Augmenting Visualization to Enhance User Engagement
attachment
https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&arnumber=8365996
2023-12-18 21:49:21
Du_et_al_2018_Exploring_the_Role_of_Sound_in_Augmenting_Visualization_to_Enhance_User.pdf
1
application/pdf
journalArticle
Bearman
Nick
Fisher
Peter F.
1 Purpose_Exploration
10 SONI_Level_interval
10 SONI_Level_nominal
11 Evaluation_QRI
11 Evaluation_UE
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
3 SONIReadingLevel_group
3 SONIReadingLevel_whole
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
8 SONIDatasetType_table
9 VIS_Level_nominal
included_Stage5
https://linkinghub.elsevier.com/retrieve/pii/S0098300411004250
157-163
2012-09
Citation Key: bearman_2012_UsingSoundRepresent
2023-12-18 22:13:40
DOI.org (Crossref)
en
An extension to ESRI's ArcGIS was created to allow spatial data to be represented using sound. A number of previous studies have used sound in combination with visual stimuli, but only a limited selection have looked at this with explicit reference to spatial data, and none have created an extension for industry-standard GIS software. The extension can sonify any raster data layer and represent it using piano notes. The user can choose from a number of different scales of piano notes and decide how the program plays the sound; this flexibility allows the extension to effectively represent a number of different types of data. The extension was evaluated in one-to-one semi-structured interviews with geographical information professionals, who explored aspects of a number of different data sets. Further research is needed to discover the best use of sound in a spatial data context, both in terms of which sounds to use and which data are most effectively represented using those sounds.
Using sound to represent spatial data in ArcGIS
46
Computers & Geosciences
ISSN 0098-3004
Computers & Geosciences
DOI 10.1016/j.cageo.2011.12.001
attachment
https://pdf.sciencedirectassets.com/271720/1-s2.0-S0098300412X00086/1-s2.0-S0098300411004250/main.pdf
2023-12-18 22:13:38
Bearman_Fisher_2012_Using_sound_to_represent_spatial_data_in_ArcGIS.pdf
1
application/pdf
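The mapping described in the Bearman and Fisher abstract above quantizes raster cell values onto piano notes. A small sketch using MIDI note numbers follows; the specific scale and binning are assumptions, since the extension lets the user choose among several scales.

import numpy as np

C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]   # C4..C5

def raster_to_notes(cells):
    # Rescale cell values to [0, 1] and bin them onto the scale degrees.
    v = np.asarray(cells, dtype=float)
    norm = (v - v.min()) / (np.ptp(v) or 1.0)
    idx = np.minimum((norm * len(C_MAJOR_MIDI)).astype(int),
                     len(C_MAJOR_MIDI) - 1)
    return [C_MAJOR_MIDI[i] for i in idx]

print(raster_to_notes([12.1, 30.7, 18.4, 44.0]))   # -> [60, 67, 62, 72]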
journalArticle
16
1
Journal on Multimodal User Interfaces
ISSN 1783-8738
J Multimodal User Interfaces
DOI 10.1007/s12193-021-00378-8
Paté
Arthur
Farge
Gaspard
Holtzman
Benjamin K.
Barth
Anna C.
Poli
Piero
Boschi
Lapo
Karlstrom
Leif
1 Purpose_Both
10 SONI_Level_nominal
10 SONI_Level_ratio
11 Evaluation_QRI
11 Evaluation_UE
12 Interaction_no
13 Goal_Research
14 Users_Domain Experts
15 Topic_Natural Sciences
16 Display_Physical Environment
17 demo_yes
2 VISReadingLevel_group
3 SONIReadingLevel_group
4 LevelofReduncancy_mixed
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_field
7 VISDatasetType_geometry
8 SONIDatasetType_field
9 VIS_Level_ratio
Auditory display
Data representation
included_Stage5
Spatial audio
Visual display
https://doi.org/10.1007/s12193-021-00378-8
125-142
2022-03-01
Number: 1
Citation Key: pate_2022_CombiningAudioVisual
ISSN: 1783-7677
Publisher: Springer
2023-12-19 09:17:56
Springer Link
en
Data visualization, and to a lesser extent data sonification, are classic tools for the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user's perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is key to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.
Combining audio and visual displays to highlight temporal and spatial seismic patterns
attachment
https://link.springer.com/content/pdf/10.1007%2Fs12193-021-00378-8.pdf
2023-12-19 09:17:57
Pate_et_al_2022_Combining_audio_and_visual_displays_to_highlight_temporal_and_spatial_seismic.pdf
1
application/pdf
conferencePaper
DOI 10.5281/ZENODO.7552257
Proceedings of ISon 2022, 7th Interactive Sonification Workshop, BSCC, University of Bremen
Lemmon
Eric
Schedel
Margaret
Bilkhu
Inderjeet
Zhu
Haotong
Escobar
Litzy
Aumoithe
George
1 Purpose_Exploration
10 SONI_Level_ratio
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Life Sciences
16 Display_Touch Display
17 demo_yes
2 VISReadingLevel_whole
20 Venue ISon
3 SONIReadingLevel_single
4 LevelofReduncancy_mixed
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
included_Stage5
https://zenodo.org/record/7552257
Creative Commons Attribution 4.0 International, Open Access
2023-01-19
Publisher: Zenodo
Citation Key: lemmon_2023_MappingEmergencyDesigning
2023-12-19 09:50:04
DOI.org (Datacite)
ISon 2022
en
In this paper, we describe a hyperlocal, ArcGIS- and sonification-based COVID-19 web-mapping tool that seeks to ameliorate some of the socio-technical problems associated with epidemiological mapping and the field's frequent reliance on visual and haptic data display. These socio-technical problems can be seen in current, well-known, and frequently cited epidemiological mapping tools, such as the Johns Hopkins University COVID-19 Dashboard, which face functional and formal design challenges when compared to the hyper-phenomenal scope of the ongoing pandemic. As a review of our current project scope, we describe the stakes of the pandemic and pose questions related to the aforementioned design challenges that tools deploying data display may face. Taken as a whole, our project aims to respond to some of these design challenges by offering user choice and control, n-dimensional data display via sonification, and the integration of socio-political data into epidemiological layers to better represent Suffolk County's lived experience with COVID-19.
Mapping in the Emergency: Designing a Hyperlocal and Socially Conscious Sonified Map of Covid-19 in Suffolk County, New York
Mapping in the Emergency
attachment
Lemmon_et_al_2023_Mapping_in_the_Emergency.pdf
application/pdf
conferencePaper
Proceedings of ISon 2016, 5th Interactive Sonification Workshop, CITEC, Bielefeld University
Bielefeld, Germany
Matsubara
Masaki
Morimoto
Yota
Uchide
Takahiko
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_General Public
15 Topic_Natural Sciences
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_whole
20 Venue ISon
3 SONIReadingLevel_whole
4 LevelofReduncancy_complementary
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_geometry
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
9 VIS_Level_nominal
included_Stage5
2016
Citation Key: matsubara_2016_CollaborativeStudyInteractive
Zotero
ISon 2016
en
Earthquakes are studied on the basis of seismograms. When seismologists review seismograms, they plot them on a screen or paper after preprocessing. Proper visualisations help them determine the nature of earthquake source processes and/or the effects of underground structures through which the seismic wave propagates. Audification is another method to obtain an overview of seismic records. Since the frequency of seismic records is generally too low to be audible, the audification playback rate needs to be increased to shift frequencies into the audible range. This method often renders the playback of sound too fast to perceive the nature of earthquake rupture and seismic propagation. Furthermore, audified sounds are often perceived as fearful and hence unsuitable for distribution to the public. Hence, we aim to understand spatio-temporal wave propagation by sonifying data from a seismic array and to design a pleasant sound for public outreach. In this research, a sonification researcher, a composer, and a seismologist collaborated to propose an interactive sonification system for seismologists. An interactive sonification method for multiple seismic waves was developed for data exploration. To investigate the method, it was applied to a seismic array recording of the wave propagation from the 2011 Tohoku-oki earthquake over the Japanese islands. As the playback rate in this investigation is only 10 times real time, it is easy to follow the propagation of seismic waves. The sonified sound shapes show characteristics and distributions such that seismologists can easily determine the time span and frequency band on which to focus. The case study showed how a seismologist explored the data with visualisation and sonification, and how he discovered a triggered earthquake by using the sonified sound.
Collaborative Study of Interactive Seismic Array Sonification for Data Exploration and Public Outreach Activities
attachment
Matsubara_et_al_2016_Collaborative_Study_of_Interactive_Seismic_Array_Sonification_for_Data.pdf
application/pdf
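Audification, as explained in the abstract above, amounts to reinterpreting the seismogram itself as an audio waveform played back at a multiple of the recording rate. A minimal sketch with invented rates (the paper uses a factor of 10):

import numpy as np

def audify(seismogram, data_rate_hz=100.0, playback_factor=10.0):
    # Normalize the trace and reinterpret its samples at a faster rate;
    # all frequencies in the data are shifted up by the playback factor.
    x = np.asarray(seismogram, dtype=float)
    x = x / (np.abs(x).max() or 1.0)
    playback_rate = data_rate_hz * playback_factor
    return x, playback_rate   # play the samples back at playback_rate Hz

# At a factor of 10, a 0.5 Hz seismic wave is heard at 5 Hz; larger factors
# push more of the wavefield's energy into the audible band, at the cost of
# playback that may be too fast to follow.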
conferencePaper
DOI 10.1109/VIS54172.2023.00046
Proceedings of IEEE Visualization and Visual Analytics (VIS) 2023 - Short Papers
Bru
Egil
Trautner
Thomas
Bruckner
Stefan
1 Purpose_Exploration
10 SONI_Level_interval
11 Evaluation_QRI
12 Interaction_yes
13 Goal_DataAnalysis
14 Users_Domain Experts
15 Topic_Applied Sciences and Engineering
16 Display_Touch Display
17 demo_no
2 VISReadingLevel_group
2 VISReadingLevel_whole
20 Venue Field Vis
3 SONIReadingLevel_group
4 LevelofReduncancy_redundant
5 VISsearch_explore
6 SONIsearch_explore
7 VISDatasetType_table
8 SONIDatasetType_table
9 VIS_Level_interval
Human-centered computing-Accessibility-Accessibility systems and tools
Human-centered computing-Human computer interaction (HCI)-Interaction techniques-Auditory feedback
included_Stage5
Human-centered computing-Visualization-Visualization application domains-Information visualization
https://arxiv.org/abs/2307.16589v1
186-190
2023-10
Citation Key: bru_2023_LineHarpImportanceDriven
arXiv: 2307.16589
2023-12-31 07:50:22
IEEE Xplore
2023 IEEE Visualization and Visual Analytics (VIS)
Accessibility in visualization is an important yet challenging topic. Sonification, in particular, is a valuable yet underutilized technique that can enhance accessibility for people with low vision. However, the lower bandwidth of the auditory channel makes it difficult to fully convey dense visualizations. For this reason, interactivity is key in making full use of its potential. In this paper, we present a novel approach for the sonification of dense line charts. We utilize the metaphor of a string instrument, where individual line segments can be “plucked”. We propose an importance-driven approach which encodes the directionality of line segments using frequency and dynamically scales amplitude for improved density perception. We discuss the potential of our approach based on a set of examples.
Line Harp: importance-driven sonification for dense line charts
Line harp
attachment
https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&arnumber=10360906
2023-12-31 07:50:26
Bru_et_al_2023_Line_Harp.pdf
1
application/pdf
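The encoding named in the Line Harp abstract above maps a line segment's directionality to frequency and scales amplitude against local density. A hedged sketch of that per-segment mapping follows; the frequency range and the density term are illustrative assumptions, not the authors' exact formulation.

import numpy as np

def segment_tone(p0, p1, local_density, lo_hz=220.0, hi_hz=880.0):
    # Direction of the segment -> frequency; denser regions -> lower amplitude,
    # so crowded areas do not drown out sparse ones when "plucked".
    dx, dy = np.subtract(p1, p0)
    angle = np.arctan2(dy, dx)               # direction in [-pi, pi]
    norm = (angle + np.pi) / (2 * np.pi)     # -> [0, 1]
    freq = lo_hz + norm * (hi_hz - lo_hz)
    amp = 1.0 / (1.0 + local_density)        # importance-driven scaling
    return freq, amp

print(segment_tone((0, 0), (1, 1), local_density=4))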