Zotero Key,Publication Year,Author,Title,Abstract,Publisher Info,Url,DOI,Purpose,VIS Reading Level,SONI Reading Level,Level of Redundancy,search level VIS,search level SONI,Dataset Type VIS,Dataset Type SONI,Level of Measurement VIS,Level of Measurement SONI,Evaluation System,Visualization Idiom,Sonification Technique,Visual Identity Channels,Visual Magnitude Channels,Auditory Identity Channels,Auditory Magnitude Channels,Interaction,Goal,User,Topic (macro),Topic (micro),Brief overview topics,Target Display Platform,has a demo?,what demo? (just for us),demolink (just for us)
GWXG7IJI,2011,"Adhitya, Sara; Kuuskankare, Mika",The Sonified Urban Masterplan (SUM) Tool: Sonification for Urban Planning and Design,"This paper describes the progress of an interdisciplinary project that explores the potential for sonification in urban planning and design. The project involves the translation of visual urban mapping techniques used in urban planning and design, into sound, through the development of the Sonified Urban Masterplan (SUM) tool. We will describe our sonification approach and outline the implementation of the SUM tool within the computer-aided composition environment PWGL. The tool will be applied to a selected urban data set to demonstrate its potential. The paper concludes with the advantages of such an approach in urban analysis, as well as introducing the possibility, within such CAC environments as PWGL and OpenMusic, to ‘compose’ urban plans and design using sound.",ICAD,http://hdl.handle.net/1853/51918,,presentation,group,group,mixed,"lookup, locate, browse, explore","lookup, locate, browse, explore",geometry,geometry,"nominal, ordinal","nominal, ordinal, ratio",NONE,map,parameter mapping,color hue,saturation,timbre,"pitch, duration, loudness",yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Urban Planning,Others,Desktop Computer Display,no,no,
A35CWKBR,2012,"Alonso-Arevalo, Miguel A.; Shelley, Simon; Hermes, Dik; Hollowood, Jacqueline; Pettitt, Michael; Sharples, Sarah; Kohlrausch, Armin",Curve shape and curvature perception through interactive sonification,"In this article we present an approach that uses sound to communicate geometrical data related to a virtual object. This has been developed in the framework of a multimodal interface for product design. The interface allows a designer to evaluate the quality of a 3-D shape using touch, vision, and sound. Two important considerations addressed in this article are the nature of the data that is sonified and the haptic interaction between the user and the interface, which in fact triggers the sound and influences its characteristics. Based on these considerations, we present a number of sonification strategies that are designed to map the geometrical data of interest into sound. The fundamental frequency of various sounds was used to convey the curve shape or the curvature to the listeners. Two evaluation experiments are described: one involved participants with a varied background, the other involved the intended users, i.e. participants with a background in industrial design. The results show that independent of the sonification method used and independent of whether the curve shape or the curvature was sonified, the sonification was quite successful.
In the first experiment participants had a success rate of about 80% in a multiple-choice task; in the second experiment it took the participants on average less than 20 seconds to find the maximum, minimum or inflection points of the curvature of a test curve.",ACM Trans. Appl. Percept.,https://doi.org/10.1145/2355598.2355600,10.1145/2355598.2355600,exploration,whole,single,redundant,explore,explore,geometry,geometry,ratio,ratio,UP,volume rendering,parameter mapping,none,curvature,none,pitch,yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Industrial Design,Others,XR,yes,"yes, video",https://www.youtube.com/@SATINproject/videos
EYDT7T8C,2018,"Arbon, Robert E.; Jones, Alex J.; Bratholm, Lars A.; Mitchell, Tom; Glowacki, David R.",Sonifying stochastic walks on biomolecular energy landscapes,"Translating the complex, multi-dimensional data produced by simulations of biomolecules into an intelligible form is a major challenge in computational chemistry and biology. The so-called “free energy landscape” is amongst the most fundamental concepts used by scientists to understand both static and dynamic properties of biomolecular systems. In this paper we use Markov models to design a strategy for mapping features of this landscape to sonic parameters, for use in conjunction with visual display techniques such as structural animations and free energy diagrams. This allows for concurrent visual display of the physical configuration of a biomolecule and auditory display of characteristics of the corresponding free energy landscape. The resulting sonification provides information about the relative free energy features of a given configuration including its stability.",ICAD,http://hdl.handle.net/1853/60093,,exploration,whole,whole,complementary,explore,explore,network,network,interval,interval,NONE,3D molecule rendering,parameter mapping,"color hue, shape",direction,timbre,"pitch, loudness, number of notes, duration,...",no,Data Analysis,Domain Experts,Life Sciences,Molecular Science,Molecular Science,Desktop Computer Display,yes,"yes, video",https://vimeo.com/255391814
YSD6L7R4,2015,"Ballora, Mark",Two examples of sonification for viewer engagement: Hurricanes and squirrel hibernation cycles,"This extended abstract describes two sets of sonifications that were commissioned by researchers from the fields of meteorology and animal ecology. The sonifications were created with the software synthesis program SuperCollider [1]. The motivation for creating them was to pursue additional levels of engagement and immersion, supplementing the effects of visual plots. The goal is for audiences, in particular students and laypeople, to readily understand (and hopefully find compelling) the phenomena being described.
The approach is parameter-based, creating “sonic scatter plots” [2] in the same manner as work described in earlier publications [3-4].",ICAD,http://hdl.handle.net/1853/54172,,presentation,whole,whole,mixed,"browse, locate, explore",explore,geometry,table,interval,interval,NONE,heatmap,parameter mapping,shape,color hue,timbre,"rhythm, volume, timbre, panning, pitch",no,Public Engagement,General Public,Natural Sciences,Hurricanes,Nature,Desktop Computer Display,"yes, but not online anymore","yes, but not online anymore",
PRYMID4T,2016,"Ballweg, Holger; Bronowska, Agnieszka K.; Vickers, Paul",Interactive Sonification for Structural Biology and Structure-Based Drug Design,"The visualisation of structural biology data can be quite challenging as the datasets are complex, in particular the intrinsic dynamics/flexibility. Therefore, some researchers have looked into the use of sonification for the display of proteins. Combining sonification and visualisation appears to be well fitted to this problem, but at the time of writing there are no plugins available for any of the major molecular visualisation applications.",,,,exploration,whole,group,complementary,explore,explore,geometry,table,"nominal, interval","interval, ratio","UP, UE",3D molecule rendering,"auditory icons, parameter mapping",color hue,position,spatial position,"pitch, modulation frequency",yes,Data Analysis,Domain Experts,Natural Sciences,Molecular Science,Molecular Science,Desktop Computer Display,"yes, but not online anymore","yes, but not online anymore",
CACLJGZV,2011,"Bearman, Nick",Using Sound to Represent Uncertainty in Future Climate Projections for the United Kingdom,"This paper compares different visual and sonic methods of representing uncertainty in spatial data. When handling large volumes of spatial data, users can be limited in the amount that can be displayed at once due to visual saturation (when no more data can be shown visually without obscuring existing data). Using sound in combination with visual methods may help to represent uncertainty in spatial data and this example uses the UK Climate Predictions 2009 (UKCP09) dataset, where uncertainty has been included for the first time. Participants took part in the evaluation via a web-based interface which used the Google Maps API to show the spatial data and capture user inputs. Using sound and vision together to show the same variable may be useful to colour blind users. Previous awareness of the data set appears to have a significant impact (p < 0.001) on participants’ ability to utilise the sonification. Using sound to reinforce data shown visually results in increased scores (p = 0.005) and using sound to show some data instead of vision showed a significant increase in speed without reducing effectiveness (p = 0.033) with repeated use of the sonification.",ICAD,http://hdl.handle.net/1853/51922,,exploration,whole,single,redundant,lookup,lookup,geometry,geometry,interval,interval,"UE, QRI",map,parameter mapping,color hue,color hue,none,pitch,yes,Data Analysis,Domain Experts,Natural Sciences,Climate,Nature,Desktop Computer Display,yes,"yes, video",https://vimeo.com/17029358
WRVD29DK,2012,"Bearman, Nick; Fisher, Peter F.",Using sound to represent spatial data in ArcGIS,"An extension to ESRI’s ArcGIS was created to allow spatial data to be represented using sound.
A number of previous studies have used sound in combination with visual stimuli, but only a limited selection have looked at this with explicit reference to spatial data and none have created an extension for industry standard GIS software. The extension can sonify any raster data layer and represent this using piano notes. The user can choose from a number of different scales of piano notes and decide how the program plays the sound; this flexibility allows the extension to effectively represent a number of different types of data. The extension was evaluated in one-to-one semi-structured interviews with geographical information professionals, who explored aspects of a number of different data sets. Further research is needed to discover the best use of sound in a spatial data context, both in terms of which sounds to use and what data are most effectively represented using those sounds.",Computers & Geosciences,https://linkinghub.elsevier.com/retrieve/pii/S0098300411004250,10.1016/j.cageo.2011.12.001,exploration,whole,"whole, single",redundant,explore,explore,geometry,table,"nominal, interval","nominal, interval",UE,map,parameter mapping,color hue,position,none,pitch,yes,Data Analysis,Domain Experts,Natural Sciences,Geoscience,Nature,Desktop Computer Display,yes,"yes, video",https://www.nickbearman.me.uk/academic/bearman_fisher_2011/index.htm
PQYL9LD5,2019,"Berger, Markus; Bill, Ralf",Combining VR Visualization and Sonification for Immersive Exploration of Urban Noise Standards,"Urban traffic noise situations are usually visualized as conventional 2D maps or 3D scenes. These representations are indispensable tools to inform decision makers and citizens about issues of health, safety, and quality of life but require expert knowledge in order to be properly understood and put into context. The subjectivity of how we perceive noise as well as the inaccuracies in common noise calculation standards are rarely represented. We present a virtual reality application that seeks to offer an audiovisual glimpse into the background workings of one of these standards, by employing a multisensory, immersive analytics approach that allows users to interactively explore and listen to an approximate rendering of the data in the same environment that the noise simulation occurs in. In order for this approach to be useful, it should manage complicated noise level calculations in a real time environment and run on commodity low-cost VR hardware. In a prototypical implementation, we utilized simple VR interactions common to current mobile VR headsets and combined them with techniques from data visualization and sonification to allow users to explore road traffic noise in an immersive real-time urban environment. The noise levels were calculated over CityGML LoD2 building geometries, in accordance with Common Noise Assessment Methods in Europe (CNOSSOS-EU) sound propagation methods.","Multimodal Technologies and Interaction 2019, Vol. 3, Page 34",https://www.mdpi.com/2414-4088/3/2/34/htm,10.3390/MTI3020034,exploration,whole,group,redundant,explore,explore,"geometry, table",table,interval,interval,AP,point grid,parameter mapping,position,color hue,spatial position,loudness,yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Urban Planning,Others,XR,no,no,
ZGM46RQ5,2020,"Bouchara, Tifanie; Montès, Matthieu",Immersive sonification of protein surface,"This paper presents our ongoing efforts to design an immersive sonification model to represent protein surfaces through 3D sound, in order to extend pre-existing protein visualisation methods without overloading visual perception. The protein surface is first discretized so each point of the surface is attached to a sound source spatialized in such a way the user is immersed in the center of the protein. We add a spherical filtering system, that the user can control, to select the surface points that would be rendered in order to reinforce the auditory interpretation of the 3D shape. Several questions, which can benefit the VR and HCI communities, are discussed both on audio and audiographical filtering consistency, and on multimodal integration of data coming from different points of view and points of listening in a 3D interactive space.",2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW),https://ieeexplore.ieee.org/document/9090531/,10.1109/VRW50115.2020.00082,exploration,whole,"whole, group",redundant,none,explore,geometry,geometry,ratio,ratio,NONE,volume rendering,parameter mapping,none,length,none,"pitch, duration, onset time",yes,Data Analysis,Researchers,Life Sciences,Protein Display,Medicine and Health,Desktop Computer Display,no,no,
TXV2ZLCV,2023,"Bru, Egil; Trautner, Thomas; Bruckner, Stefan",Line Harp: Importance-Driven Sonification for Dense Line Charts,"Accessibility in visualization is an important yet challenging topic. Sonification, in particular, is a valuable yet underutilized technique that can enhance accessibility for people with low vision. However, the lower bandwidth of the auditory channel makes it difficult to fully convey dense visualizations. For this reason, interactivity is key in making full use of its potential. In this paper, we present a novel approach for the sonification of dense line charts. We utilize the metaphor of a string instrument, where individual line segments can be ""plucked"". We propose an importance-driven approach which encodes the directionality of line segments using frequency and dynamically scales amplitude for improved density perception.
We discuss the potential of our approach based on a set of examples.",Proceedings of IEEE Visualization and Visual Analytics (VIS) 2023 -- Short Papers,https://arxiv.org/abs/2307.16589v1,10.1109/VIS54172.2023.00046,exploration,"whole, group",group,redundant,explore,explore,table,table,interval,interval,QRI,"line plot, parallel coordinates",parameter mapping,"position, color hue","position, angle",none,"pitch, loudness",yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Data Display,Data Display,"Desktop Computer Display, Touch Display",no,no,
8B2XKIJR,2021,"Cantrell, Stanley J.; Walker, Bruce N.; Moseng, Øystein","Highcharts Sonification Studio: an online, open-source, extensible, and accessible data sonification tool","The Highcharts Sonification Studio is the culmination of a multi-year collaboration between Highsoft — the creators of Highcharts — and the Georgia Tech Sonification Lab to develop an extensible, accessible, online spreadsheet and multimodal graphing platform for the auditory display, assistive technology, and STEM education communities. The Highcharts Sonification Studio leverages the advances in auditory display and sonification research, as well as over 20 years of experience gained through research and development of the original Sonification Sandbox. We discuss the iterative design and evaluation process of the Highcharts Sonification Studio to ensure usability and accessibility, and highlight opportunities for growth of the tool and its use for research, art, and education within the ICAD community and beyond.",ICAD,http://hdl.handle.net/1853/66348,,both,whole,whole,redundant,"browse, locate, explore","browse, locate, explore",table,table,"interval, ratio","interval, ratio",UP,line chart,parameter mapping,color hue,position,timbre,"pitch, spatial position, loudness, duration, harmonic range",yes,"Education, Public Engagement, Data Analysis","General Public, Domain Experts, Researchers",Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, webapp",https://sonification.highcharts.com/#/
PSUC8DQK,2017,"Chabot, Samuel; Braasch, Jonas",An Immersive Virtual Environment for Congruent Audio-Visual Spatialized Data Sonifications,"The use of spatialization techniques in data sonification provides system designers with an additional tool for conveying information to users. Oftentimes, spatialized data sets are meant to be experienced by a single or few users at a time. Projects at Rensselaer's Collaborative-Research Augmented Immersive Virtual Environment Laboratory allow even large groups of collaborators to work within a shared virtual environment system. The lab provides an equal emphasis on the visual and audio system, with a nearly 360° panoramic display and 128-loudspeaker array housed behind the acoustically-transparent screen. The space allows for dynamic switching between immersions in recreations of physical scenes and presentations of abstract or symbolic data. Content creation for the space is not a complex process: the entire display is essentially a single desktop, and straightforward tools such as the Virtual Microphone Control allow for dynamic real-time spatialization. With the ability to target individual channels in the array, audio-visual congruency is achieved.
The loudspeaker array creates a high-spatial-density soundfield within which users are able to freely explore due to the virtual elimination of a so-called “sweet spot.”",ICAD,http://hdl.handle.net/1853/58381,,presentation,whole,whole,mixed,explore,explore,table,table,"nominal, interval",interval,NONE,bar chart,parameter mapping,color hue,length,spatial position,"tempo, pitch, timbre",no,Data Analysis,General Public,Social Sciences,Finance,Economy,Physical Environment/ Multi User,no,no,
MDRMKYAZ,2022,"De La Vega, Gonzalo; Dominguez, Leonardo Martin Exequiel; Casado, Johanna; García, Beatriz",SonoUno Web: An Innovative User Centred Web Interface,"Sonification as a complement to visualization has been under research for decades as a new way of data deployment. ICAD conferences gather specialists from different disciplines to discuss sonification. Different tools such as sonoUno, starSound and Web Sandbox attempt to provide a tool to open astronomical data sets and sonify them in conjunction with visualization. In this contribution, the sonoUno web version is presented; this version allows users to explore data sets without any installation. The data can be uploaded or a pre-loaded file can be opened; the sonification and the visual characteristics of the plot can be customized in the same window. The plot, sound and marks can be saved. The web interface was tested with the most commonly used screen readers in order to confirm good performance.",HCI International 2022 – Late Breaking Posters,,10.1007/978-3-031-19679-9_79,both,"whole, group","whole, group",redundant,none,none,table,table,"interval, ratio","interval, ratio",QRI,line chart,parameter mapping,none,position,none,pitch,yes,Data Analysis,General Public,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,yes,https://www.sonouno.org.ar/
EHEY88EM,2018,"Du, Meng; Chou, Jia-Kai; Ma, Chen; Chandrasegaran, Senthil; Ma, Kwan-Liu",Exploring the Role of Sound in Augmenting Visualization to Enhance User Engagement,"Studies on augmenting visualization with sound are typically based on the assumption that sound can be complementary and assist in data analysis tasks. While sound promotes a different sense of engagement than vision, we conjecture that augmenting a visualization with nonspeech audio can not only help enhance the users’ perception of the data but also increase their engagement with the data exploration process. We have designed a preliminary user study to test users’ performance and engagement while exploring in a data visualization system under two different settings: visual-only and audiovisual. For our study, we used basketball player movement data in a game and created an interactive visualization system with three linked views. We supplemented sound to the visualization to enhance the users’ understanding of a team’s offensive/defensive behavior. The results of our study suggest that we need to better understand the effect of sound choice and encoding before considering engagement. We also find that sound can be useful to draw novice users’ attention to patterns or anomalies in the data.
Finally, we propose follow-up studies with designs informed by the findings from this study.",2018 IEEE Pacific Visualization Symposium (PacificVis),https://ieeexplore.ieee.org/document/8365996/,10.1109/PacificVis.2018.00036,both,whole,"group, single",redundant,explore,explore,"table, field",table,"nominal, ratio","nominal, ratio",UP,"convex hull, violin plot, bar chart",parameter mapping,"color hue, shape","opacity, position",pitch,"loudness, spatialization",yes,"Data Analysis, Public Engagement",Domain Experts,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,no,no,
EYCCUJX7,2021,"Elmquist, Elias; Ejdbo, Malin; Bock, Alexander; Rönnberg, Niklas",Openspace Sonification: Complementing Visualization of the Solar System with Sound,"Data visualization software is commonly used to explore outer space in a planetarium environment, where the visuals of the software are typically accompanied by a narrator and supplementary background music. By letting sound take a bigger role in these kinds of presentations, a more informative and immersive experience can be achieved. The aim of the present study was to explore how sonification can be used as a complement to the visualization software OpenSpace to convey information about the Solar System, as well as increasing the perceived immersiveness for the audience in a planetarium environment. This was investigated by implementing a sonification that conveyed planetary properties, such as the size and orbital period of a planet, by mapping this data to sonification parameters. With a user-centered approach, the sonification was designed iteratively and evaluated in both an online and planetarium environment. The results of the evaluations show that the participants found the sonification informative and interesting, which suggests that sonification can be beneficially used as a complement to visualization in a planetarium environment.",ICAD,http://hdl.handle.net/1853/66324,10.21785/icad2021.018,both,whole,"whole, group, single",mixed,browse,browse,table,table,interval,interval,UE,volume rendering,parameter mapping,none,"shape, position",none,"pitch, tempo, panning",yes,Public Engagement,General Public,Natural Sciences,Astronomy,Astronomy,Physical Environment/ Multi User,yes,"yes, video",https://vimeo.com/528822742
UNQJ4I6L,2022,"Enge, Kajetan; Rind, Alexander; Iber, Michael; Höldrich, Robert; Aigner, Wolfgang",Towards Multimodal Exploratory Data Analysis: SoniScope as a Prototypical Implementation,"The metaphor of auscultating with a stethoscope can be an inspiration to combine visualization and sonification for exploratory data analysis. This paper presents SoniScope, a multimodal approach and its prototypical implementation based on this metaphor. It combines a scatterplot with an interactive parameter mapping sonification, thereby conveying additional information about items that were selected with a visual lens. SoniScope explores several design options for the shape of its lens and the sorting of the selected items for subsequent sonification.
Furthermore, the open-source prototype serves as a blueprint framework for how to combine D3.js visualization and SuperCollider sonification in the Jupyter notebook environment.",,https://diglib.eg.org:443/xmlui/handle/10.2312/evs20221095,,exploration,whole,group,"redundant, mixed, complementary",explore,explore,table,table,"interval, ratio","interval, ratio",QRI,scatter plot,parameter mapping,position,position,none,pitch,yes,Data Analysis,Researchers,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, video and source code",https://phaidra.fhstp.ac.at/detail/o:4831
23UFY9MR,2012,"Ferguson, Sam; Beilharz, Kirsty; Calò, Claudia A.",Navigation of interactive sonifications and visualisations of time-series data using multi-touch computing,"This paper discusses interaction design for interactive sonification and visualisation of data in multi-touch contexts. Interaction design for data analysis is becoming increasingly important as data becomes more openly available. We discuss how navigation issues such as zooming, selection, arrangement and playback of data relate to both the auditory and visual modality in different ways, and how they may be linked through the modality of touch and gestural interaction. For this purpose we introduce a user interface for exploring and interacting with representations of time-series data simultaneously in both the visual and auditory modalities.",Journal on Multimodal User Interfaces,https://doi.org/10.1007/s12193-011-0075-3,10.1007/s12193-011-0075-3,both,"whole, group","whole, group, single",redundant,explore,explore,table,table,"interval, ratio","interval, ratio",QRI,line chart,parameter mapping,color hue,position,"timbre, onset time",pitch,yes,Data Analysis,Researchers,Applied Sciences & Engineering,Data Display,Data Display,Touch Display,no,no,
HH9IXLP7,2018,"Fitzpatrick, Joe; Neff, Flaithri",Stream Segregation: Utilizing Harmonic Variance in Auditory Graphs,"The sonification of line charts, from which auditory line charts are produced, is a common sonification strategy used today. This paper examines timbre as a potentially useful sonic dimension for relaying information in sonified line charts. A user study is presented in which 43 participants were tasked with identifying particular trends among multiple distractor trends using sonified data. These sonified data comprised frequency-mapped trends isolated with the gradual enrichment of harmonic content, using a sawtooth wave as a guideline for the overall harmonic structure. Correlations between harmonic content and identification success rates were examined. Results from the study indicate that the majority of participants consistently chose the sample with the most harmonics when deciding which sonified trend best represented the visual equivalent.
However, this confidence decreased with each harmonic addition, to the point of complete uncertainty when choosing between a sample with 3 harmonics and a sample with 4 harmonics.",,,,exploration,whole,single,mixed,locate,locate,table,table,"nominal, ordinal","nominal, ordinal",UP,line chart,parameter mapping,none,position,timbre,pitch,no,Data Analysis,General Public,Applied Sciences & Engineering,Computer Science,Others,Desktop Computer Display,no,no,
N238DY3U,2019,"García Riber, Adrian",Sonifigrapher: Sonified light curve synthesizer,"In an attempt to contribute to the constant feedback existing between science and music, this work describes the design strategies used in the development of the virtual synthesizer prototype called Sonifigrapher. Trying to achieve new ways of creating experimental music through the exploration of exoplanet data sonifications, this software provides an easy-to-use graph-to-sound quadraphonic converter, designed for the sonification of the light curves from NASA’s publicly available exoplanet archive. Based on some features of the first analog tape recorder samplers, the prototype allows end-users to load a light curve from the archive and create controlled audio spectra making use of additive synthesis sonification. It is expected to be useful in creative, educational and informational contexts as part of an experimental and interdisciplinary development project for sonification tools, oriented to both non-specialized and specialized audiences.",ICAD,http://hdl.handle.net/1853/61497,,presentation,whole,"whole, group",redundant,explore,explore,table,table,interval,interval,QRI,"line chart, scatter plot",parameter mapping,none,position,none,"pitch, spatial position",yes,Education,General Public,Natural Sciences,Astronomy,Astronomy,Desktop Computer Display,yes,yes,https://archive.org/details/SonifigrapherMacOSX
9BZTLSCL,2016,"Gionfrida, Letizia; Rogińska, Agnieszka; Keary, James; Mohanraj, Hariharan; Friedman, Kent",The Triple Tone Sonification Method to Enhance the Diagnosis of Alzheimer’s Dementia,"For the current diagnosis of Alzheimer's dementia (AD), physicians and neuroscientists primarily call upon visual and statistical analysis methods of large, multi-dimensional positron emission tomography (PET) brain scan data sets. As these data sets are complex in nature, the assessment of disease severity proves challenging, and is susceptible to cognitive and perceptual errors causing intra- and inter-reader variability among doctors. The Triple-Tone Sonification method, first presented and evaluated by Roginska et al., invites an audible element to the diagnosis process, offering doctors another tool to gain certainty and clarification of disease stages. Using audible beating patterns resulting from three interacting frequencies extracted from PET brain scan data, the Triple-Tone method underwent a second round of subjective listening tests and evaluation, this time with radiologists from NYU Langone Medical Center.
Results show the method is effective at evaluating PET brain scan data.",ICAD,http://hdl.handle.net/1853/56570,,exploration,"whole, group",group,complementary,browse,none,field,table,ratio,ratio,"UP, UE","slicing, volume rendering",parameter mapping,none,color hue,none,frequency beating,yes,Data Analysis,Domain Experts,Life Sciences,Medicine,Medicine,Desktop Computer Display,no,no,
UYPG5JPX,2011,"Gomez, Imanol; Ramirez, Rafael",A Data Sonification Approach to Cognitive State Identification,"The study of human brain functions has increased greatly due to the advent of Functional Magnetic Resonance Imaging (fMRI), arguably the best technique for observing human brain activity that is currently available. However, fMRI techniques produce extremely high dimensional, sparse and noisy data which is difficult to visualize, monitor and analyze. In this paper, we propose two different sonification approaches to monitor fMRI data. The goal of the resulting fMRI data sonification system is to allow the auditory identification of cognitive states produced by different stimuli. The system consists of a feature selection component and a sonification engine. We explore different feature selection methods and sonification strategies. As a case study, we apply our system to the identification of cognitive states produced by volume accented and duration accented rhythmic stimuli.",ICAD,http://hdl.handle.net/1853/51569,,exploration,group,group,redundant,explore,explore,field,field,ratio,ratio,NONE,slicing,parameter mapping,none,color hue,timbre,"pitch, loudness",no,Data Analysis,Domain Experts,Life Sciences,Brain Scans,Brain Scans,Desktop Computer Display,"yes, but not online anymore","yes, but not online anymore",
PHAX23HM,2021,"Groppe, Sven; Klinckenberg, Rico; Warnke, Benjamin",Sound of databases: Sonification of a semantic web database engine,"Sonifications map data to auditory dimensions and offer a new audible experience to their listeners. We propose a sonification of query processing paired with a corresponding visualization, both integrated in a web application. In this demonstration we show that the sonification of different types of relational operators generates different sound patterns, which can be recognized and identified by listeners, increasing their understanding of the operators' functionality and supporting easy remembering of requirements such as that merge joins work on sorted input. Furthermore, new ways of analyzing query processing are possible with the sonification approach.",Proc. VLDB Endow.,https://doi.org/10.14778/3476311.3476322,10.14778/3476311.3476322,exploration,whole,group,complementary,none,explore,network,network,"nominal, ordinal, interval","nominal, ordinal, interval",NONE,network,parameter mapping,"position, color hue",position,timbre,"pitch, loudness, spatial position, duration",yes,"Education, Public Engagement","General Public, Domain Experts",Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, video",https://www.ifis.uni-luebeck.de/~groppe/soundofdatabases
UA8MI9Y6,2018,"Gune, Aditya; De Amicis, Raffaele; Simões, Bruno; Sanchez, Christopher A.; Demirel, H. Onan",Graphically Hearing: Enhancing Understanding of Geospatial Data through an Integrated Auditory and Visual Experience,"Effective presentation of data is critical to a user’s understanding of it. In this manuscript, we explore research challenges associated with presenting large geospatial datasets through a multimodal experience.
We also suggest an interaction schema that enhances users’ cognition of geographic information through a user-driven display that visualizes and sonifies geospatial data.",IEEE Computer Graphics and Applications,https://ieeexplore.ieee.org/document/8402185/,10.1109/MCG.2018.042731655,both,whole,group,redundant,explore,browse,geometry,geometry,nominal,nominal,UP,map,parameter mapping,color hue,position,timbre,duration,yes,Research,General Public,Natural Sciences,Geography,Geography,Desktop Computer Display,no,no,
BR6M6CTU,2022,"Han, Yoon Chung; Khanduja, Amanbeer",The future is red: Visualizing wildfire predictions using contactless interaction,"This paper presents our approach for visualizing, sonifying, and predicting wildfire in the near future through contactless interaction. We provide an interaction tool to depict the causes and results of the wildfire and promote awareness of the environmental issues by giving forecasting results of the wildfire to the audience. Multimodal interaction allows the audience to dynamically experience the changes of the wildfire over time in two representative locations (California, United States, and South Korea). The interactive multimodal data visualization and sonification depict the past, present, and future of the wildfire. This data-driven design was installed in an art gallery and presented to audience members. Contactless user interaction with Leap Motion cameras was used during the pandemic period for hygienic interaction. In this paper, we describe the design process and how this interface was developed based on environmental issues, and informal user responses from the art gallery setup are discussed.",Extended abstracts of the 2022 CHI conference on human factors in computing systems,https://doi.org/10.1145/3491101.3519903,10.1145/3491101.3519903,presentation,whole,single,redundant,explore,explore,"geometry, table","geometry, table",ratio,ratio,UE,geographic scatter plot,parameter mapping,position,"size, duration, transparency",none,"pitch, loudness, duration, and ""several other FM synthesis parameters""",yes,Public Engagement,General Public,Natural Sciences,Wildfire,Earth Science,Physical Environment/ Multi User,no,no,
23DFP3VX,2022,"Harrison, Chris; Trayford, James; Harrison, Leigh; Bonne, Nicolas",Audio universe: tour of the solar system,"Chris Harrison, James Trayford, Leigh Harrison and Nicolas Bonne have developed a sensory odyssey to demonstrate how the Universe can be made more accessible.",Astronomy & Geophysics,https://doi.org/10.1093/astrogeo/atac027,10.1093/astrogeo/atac027,presentation,whole,single,redundant,none,none,geometry,table,ordinal,ordinal,NONE,point cloud,parameter mapping,none,"color hue, size, position",none,"pitch, onset time",no,Public Engagement,General Public,Natural Sciences,Astronomy,Astronomy,Physical Environment/ Multi User,yes,"yes, video",https://www.youtube.com/watch?v=5HS3tRl2Ens
QF3HWEDY,2020,"Herrmann, Vincent",Visualizing and sonifying how an artificial ear hears music,"A system is presented that visualizes and sonifies the inner workings of a sound processing neural network in real-time. The models that are employed have been trained on music datasets in a self-supervised way using contrastive predictive coding. An optimization procedure generates sounds that activate certain regions in the network. That way it can be rendered audible how music sounds to this artificial ear. In addition, the activations of the neurons at each point in time are visualized.
For this, a force graph layout technique is used to create a vivid and dynamic representation of the neural network in action.",Proceedings of the NeurIPS 2019 Competition and Demonstration Track,https://proceedings.mlr.press/v123/herrmann20a.html,,exploration,whole,group,redundant,explore,explore,network,table,"nominal, interval","interval, ratio",NONE,network,parameter mapping,none,"color hue, position",none,pitch,no,Research,General Public,Natural Sciences,Neuroscience,Others,Desktop Computer Display,yes,yes,https://vincentherrmann.github.io/blog/immersions/
SHXBTG3G,2016,"Hildebrandt, Tobias; Amerbauer, Felix; Rinderle-Ma, Stefanie",Combining Sonification and Visualization for the Analysis of Process Execution Data,"Business process execution data is analyzed for different reasons such as process discovery, performance analysis, or anomaly detection. However, visualizations might suffer from a number of limitations. Sonification (the presentation of data using sound) has been proven to successfully enhance visualization in many domains. Although there exist approaches that apply sonification for real-time monitoring of process executions, so far this technique has not been applied to analyze process execution data ex post. We therefore propose a multi-modal system, combining visualization and sonification, for this purpose. The concepts are evaluated by a prototypical ProM plugin as well as based on a use case.",2016 IEEE 18th Conference on Business Informatics (CBI),https://ieeexplore.ieee.org/document/7781493/,10.1109/CBI.2016.47,exploration,whole,"whole, group",mixed,"explore, locate","explore, locate",table,table,"nominal, ratio","nominal, interval, ratio",QRI,dotted chart visualization,"earcons, parameter mapping","shape, color hue",position,"timbre, melody","loudness, spatial position",yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Process Execution Data,Others,Desktop Computer Display,"yes, but not online anymore","yes, but not online anymore",
AA73ZTH5,2014,"Holtzman, Benjamin; Candler, Jason; Turk, Matthew; Peter, Daniel","Seismic Sound Lab: Sights, Sounds and Perception of the Earth as an Acoustic Space","We construct a representation of earthquakes and global seismic waves through sound and animated images. The seismic wave field is the ensemble of elastic waves that propagate through the planet after an earthquake, emanating from the rupture on the fault. The sounds are made by time compression (i.e. speeding up) of seismic data with minimal additional processing. The animated images are renderings of numerical simulations of seismic wave propagation in the globe. Synchronized sounds and images reveal complex patterns and illustrate numerous aspects of the seismic wave field. These movies represent phenomena occurring far from the time and length scales normally accessible to us, creating a profound experience for the observer. The multi-sensory perception of these complex phenomena may also bring new insights to researchers.","Sound, Music, and Motion",,10.1007/978-3-319-12976-1_10,presentation,whole,whole,redundant,explore,explore,field,field,"nominal, ordinal","nominal, ordinal",NONE,heatmap,audification,shape,"visual density, transparency",none,pitch,no,Education,General Public,Natural Sciences,Seismology,Nature,Physical Environment/ Multi User,no,no,
L5CIZ5FS,2023,"Huppenkothen, Daniela; Pampin, Juan; Davenport, James R. A.; Wenlock, James",The Sonified Hertzsprung-Russell Diagram,"Understanding the physical properties of stars, and putting these properties into the context of stellar evolution, is a core challenge in astronomical research. A key visualization in studying stellar evolution is the Hertzsprung-Russell diagram (HRD), organizing data about stellar luminosity and colour into a form that is informative about stellar structure and evolution. However, connecting the HRD with other sources of information, including stellar time series, is an outstanding challenge. Here we present a new method to turn stellar time series into sound. This method encodes physically meaningful features such that auditory comparisons between sonifications of different stars preserve astrophysical differences between them. We present an interactive multimedia version of the HRD that combines both visual and auditory components and that allows exploration of different types of stars both on and off the main sequence through both visual and auditory media.",ICAD,https://hdl.handle.net/1853/72881,,both,"whole, group",single,complementary,browse,explore,table,table,interval,interval,QRI,scatter plot,"parameter mapping, audification",none,"color hue, position",none,loudness,yes,Education,General Public,Natural Sciences,Astronomy,Astronomy,Desktop Computer Display,yes,"yes, interactive website",https://starsounder.space/
EDN8WGWD,2013,"Joliat, Nicholas; Mayton, Brian; Paradiso, Joseph A.",Spatialized anonymous audio for browsing sensor networks via virtual worlds,"We explore new ways to communicate sensor data by combining spatialized sonification with animated data visualization in a 3D virtual environment. A system is designed and implemented that implies a sense of anonymized presence in an instrumented building by manipulating navigable live and recorded spatial audio streams. Exploration of both real-time and archived data is enabled. In particular, algorithms for obfuscating audio to protect privacy and for time-compressing audio to allow exploration on diverse time scales are implemented. Synthesized sonification of diverse, distributed sensor data in this context is also supported within our framework.",ICAD,http://hdl.handle.net/1853/51643,,both,"whole, single",single,complementary,explore,explore,field,field,ratio,ratio,NONE,3D scatter plot,parameter mapping,color hue,position,none,"pitch, spatial position",yes,Data Analysis,Researchers,Natural Sciences,Sensor Data,Others,Desktop Computer Display,no,no,
L42KV4US,2021,"Kariyado, Yuta; Arevalo, Camilo; Villegas, Julián",Auralization of Three-Dimensional Cellular Automata,"An auralization tool for exploring three-dimensional cellular automata is presented. This proof-of-concept allows the creation of a sound field comprising individual sound events associated with each cell in a three-dimensional grid. Each sound event is spatialized depending on the orientation of the listener relative to the three-dimensional model. Users can listen to all cells simultaneously or in sequential slices at will. Conceived to be used as an immersive Virtual Reality (VR) scene, this software application also works as a desktop application for environments where the VR infrastructure is missing. Subjective evaluations indicate that the proposed sonification increases the perceived quality and immersability of the system with respect to a visualization-only system.
No subjective differences between the sequential and simultaneous presentations were found.","Artificial Intelligence in Music, Sound, Art and Design",,10.1007/978-3-030-72914-1_11,presentation,whole,"whole, group",redundant,explore,explore,geometry,geometry,interval,interval,"AP, UE",3D point cloud,parameter mapping,none,"position, color hue",spatial position,"pitch, harmonics range, onset time",yes,Public Engagement,General Public,Applied Sciences & Engineering,Simulation Presentation,Others,"Desktop Computer Display, XR",no,no,https://www.youtube.com/watch?v=eFQi3qFxAp8
MG22AV87,2017,"Kondak, Zachary; Liang, Tianchu (Alex); Tomlinson, Brianna; Walker, Bruce N.",Web Sonification Sandbox - an Easy-to-Use Web Application for Sonifying Data and Equations,"Auditory and multimodal presentation of data (“auditory graphs”) can allow for discoveries in a data set that are sometimes impossible with visual-only inspection. At the same time, multimodal graphs can make data, and the STEM fields that rely on them, more accessible to a much broader range of people, including many with disabilities. There have been a variety of software tools developed to turn data into sound, including the widely-used Sonification Sandbox, but there remains a need for a simple, powerful, and more accessible tool for the construction and manipulation of multimodal graphs. Web-based audio functionality is now at the point where it can be leveraged to provide just such a tool. Thus, we developed a web application, the Web Sonification Sandbox (or simply the Web Sandbox), that allows users to create and manipulate multimodal graphs that convey information through both sonification and visualization. The Web Sandbox is designed to be usable by individuals with no technical or musical expertise, which separates it from existing software. The easy-to-use nature of the Web Sandbox, combined with its multimodal nature, allows it to be a maximally accessible application for a diverse audience of users. Nevertheless, the application is also powerful and flexible enough to support advanced users.",,https://qmro.qmul.ac.uk/xmlui/handle/123456789/26083,,presentation,whole,"whole, single",mixed,explore,explore,table,table,"interval, ratio","interval, ratio",NONE,line chart,parameter mapping,color hue,position,none,pitch,no,Public Engagement,General Public,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,"yes, but not online anymore","yes, but not online anymore",video of talk: https://youtu.be/BhL3J5hcwNE?t=7650
U5UNQJRX,2023,"Lemmon, Eric; Schedel, Margaret; Bilkhu, Inderjeet; Zhu, Haotong; Escobar, Litzy; Aumoithe, George","Mapping in the Emergency: Designing a Hyperlocal and Socially Conscious Sonified Map of Covid-19 in Suffolk County, New York","In this paper, we describe a hyperlocal ArcGIS- and sonification-based COVID-19 web-mapping tool that seeks to ameliorate some of the socio-technical problems associated with epidemiological mapping and the field’s frequent usage of visual and haptic data display. These socio-technical problems can be seen in current, well-known and frequently cited epidemiological mapping tools, such as the Johns Hopkins University COVID-19 Dashboard, which face functional and formal design challenges when compared to the hyper-phenomenal scope of the ongoing pandemic. As a review of our current project scope, we describe the stakes of the pandemic and pose questions related to the aforementioned design challenges that tools deploying data display may face.
Taken as a whole, our project aims to offer a response to some of these design challenges by offering user choice and control, n-dimensional data display via sonification, and the integration of socio-political data into epidemiological layers to better represent Suffolk County’s lived experience with COVID-19.","Proceedings of ISon 2022, 7th Interactive Sonification Workshop, BSCC, University of Bremen",https://zenodo.org/record/7552257,10.5281/ZENODO.7552257,exploration,whole,single,mixed,explore,explore,"geometry, table",table,"nominal, interval, ratio",ratio,QRI,map,parameter mapping,position,"size, color saturation","timbre, spatial position",pitch,yes,Data Analysis,General Public,Life Sciences,Epidemiology,Medicine and Health,Desktop Computer Display,yes,"yes, video",https://ericlemmon.net/ison2022-demo-video/
NDQ7CZ8J,2021,"Lindetorp, Hans; Falkenberg, Kjetil",Sonification for everyone everywhere: Evaluating the WebAudioXML sonification toolkit for browsers,"Creating an effective sonification is a challenging task that requires skills and knowledge on an expertise level in several disciplines. This study contributes the WebAudioXML Sonification Toolkit (WAST), which aims at reaching new groups who have not yet considered themselves to be part of the ICAD community. We have designed, built, and evaluated the toolkit by analysing ten student projects using it, and conclude that WAST met our expectations, led to students taking a deep approach to learning, and successfully contributed to reaching the learning outcomes. The result indicates that WAST is easy-to-use, highly accessible and extensively flexible, and offers possibilities to share the sonification in any device's web browser simply through a web link, and without installations. We also suggest that a sonification toolkit would become an even more creative environment with virtual instruments and mixing features typically found in Digital Audio Workstations.",ICAD,http://hdl.handle.net/1853/66351,,both,single,single,redundant,browse,none,table,table,ratio,ratio,UE,line chart,parameter mapping,color hue,position,none,"loudness, playback rate, pitch, trigger frequency, spatial position",yes,"Data Analysis, Public Engagement, Education",Domain Experts,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, webapp",https://hanslindetorp.github.io/SonificationToolkit/
ITKQH2US,2021,"Lyu, Zhuoyue; Li, Jiannan; Wang, Bryan",AIive: Interactive Visualization and Sonification of Neural Networks in Virtual Reality,"Artificial Intelligence (AI), especially Neural Networks (NNs), has become increasingly popular. However, people usually treat AI as a tool, focusing on improving outcome, accuracy, and performance while paying less attention to the representation of AI itself. We present AIive, an interactive visualization of AI in Virtual Reality (VR) that brings AI “alive”. AIive enables users to manipulate the parameters of NNs with virtual hands and provides auditory feedback for the real-time values of loss, accuracy, and hyperparameters.
Thus, AIive contributes an artistic and intuitive way to represent AI by integrating visualization, sonification, and direct manipulation in VR, potentially targeting a wide range of audiences.",2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR),https://ieeexplore.ieee.org/document/9644325/,10.1109/AIVR52153.2021.00057,exploration,whole,"whole, single",complementary,none,locate,network,network,nominal,ratio,NONE,3D network,parameter mapping,"color hue, shape",transparency,timbre,pitch,yes,Public Engagement,General Public,Applied Sciences & Engineering,Explainable AI,Others,XR,yes,"yes, video",https://www.zhuoyuelyu.com/aiive
964JAT93,2018,"Maçãs, Catarina; Martins, Pedro; Machado, Penousal",Consumption as a Rhythm: A Multimodal Experiment on the Representation of Time-Series,"Through Data Visualisation and Sonification models, we present a study of multimodal representations to characterise the Portuguese consumption patterns, which were gathered from Portuguese hypermarkets and supermarkets over the course of two years. We focus on the rhythmic nature of the data to create and discuss audio and visual representations that highlight disruptions and sudden changes in the normal consumption patterns. For this study, we present two distinct visual and audio representations and discuss their strengths and limitations.",2018 22nd International Conference Information Visualisation (IV),https://ieeexplore.ieee.org/document/8564211/,10.1109/iV.2018.00093,presentation,whole,whole,mixed,explore,explore,table,table,"nominal, ratio","nominal, ratio",QRI,individual circular design,parameter mapping,"shape, color hue",size,timbre,loudness,no,Public Engagement,General Public,Arts&Humanities,Economy,Economy,Desktop Computer Display,yes,"yes, video",https://vimeo.com/270078256
964JAT93,2018,"Maçãs, Catarina; Martins, Pedro; Machado, Penousal",Consumption as a Rhythm: A Multimodal Experiment on the Representation of Time-Series,"Through Data Visualisation and Sonification models, we present a study of multimodal representations to characterise the Portuguese consumption patterns, which were gathered from Portuguese hypermarkets and supermarkets over the course of two years. We focus on the rhythmic nature of the data to create and discuss audio and visual representations that highlight disruptions and sudden changes in the normal consumption patterns. For this study, we present two distinct visual and audio representations and discuss their strengths and limitations.",2018 22nd International Conference Information Visualisation (IV),https://ieeexplore.ieee.org/document/8564211/,10.1109/iV.2018.00093,presentation,whole,whole,mixed,explore,explore,table,table,"nominal, ratio","nominal, ratio",QRI,individual flower-style design,parameter mapping,"shape, color hue, position","size, length",timbre,"loudness, pitch",no,Public Engagement,General Public,Arts&Humanities,Economy,Economy,Desktop Computer Display,yes,"yes, video",https://vimeo.com/270077726
9UNRR9SQ,2018,"MacDonald, Daniel E.; Natarajan, Thangam; Windeyer, Richard C.; Coppin, Peter; Steinman, David A.",Data-Driven Sonification of CFD Aneurysm Models,"A novel method is presented for inspecting and characterizing turbulent-like hemodynamic structures in intracranial cerebral aneurysms by sonification of data generated using Computational Fluid Dynamics (CFD).
The intention of the current research is to intuitively communicate flow complexity by augmenting conventional flow visualizations with data-driven sound, thereby increasing the ease of interpretation of dense spatiotemporal data through multimodal presentation. The described implementation allows the user to listen to flow fluctuations thought to indicate turbulent-like blood flow patterns that are often visually difficult to discriminate in conventional flow visualizations.",ICAD,http://hdl.handle.net/1853/60063,10.21785/icad2018.010,exploration,group,group,complementary,explore,explore,field,table,interval,ratio,NONE,volume rendering,parameter mapping,"shape, position","position, color hue",none,"pitch, loudness",yes,Data Analysis,Domain Experts,Life Sciences,Blood Flow,Medicine and Health,Desktop Computer Display,yes,"yes, video",https://www.youtube.com/watch?v=UmDvnPjnpV4
E4NX6U5B,2019,"Malikova, Evgeniya; Adzhiev, Valery; Fryazinov, Oleg; Pasko, Alexander",Visual-auditory volume rendering of scalar fields,"This paper describes a novel approach to visual-auditory volume rendering of continuous scalar fields. The proposed method uses well-established similarities in light transfer and sound propagation modelling to extend the visual scalar field data analysis with auditory attributes. We address the visual perception limitations of existing volume rendering techniques and show that they can be handled by auditory analysis. In particular, we describe a practical application to demonstrate how the proposed approach may keep the researcher aware of the visual perception issues in colour mapping and help track and detect geometrical features and symmetry break, issues that are important in the context of interpretation of the physical phenomena.",ICAD,http://hdl.handle.net/1853/61517,,exploration,whole,whole,redundant,explore,explore,field,field,ratio,ratio,NONE,heatmap,parameter mapping,"shape, color hue",none,none,"pitch, loudness, duration",no,Data Analysis,Domain Experts,Life Sciences,Molecular Science,Molecular Science,Desktop Computer Display,yes,"yes, videos","https://vimeo.com/323545930, https://vimeo.com/323547646, https://vimeo.com/323547659, PW: icad2019"
QZ73K79Q,2016,"Matsubara, Masaki; Morimoto, Yota; Uchide, Takahiko",Collaborative Study of Interactive Seismic Array Sonification for Data Exploration and Public Outreach Activities,"Earthquakes are studied on the basis of seismograms. When seismologists review seismograms, they plot them on a screen or paper after preprocessing. Proper visualisations help them determine the nature of earthquake source processes and/or the effects of underground structures through which the seismic wave propagates. Audification is another method to obtain an overview of seismic records. Since the frequency of seismic records is generally too low to be audible, the audification playback rate needs to be increased to shift frequencies into the audible range. This method often renders the playback of sound too fast to perceive the nature of earthquake rupture and seismic propagation. Furthermore, audified sounds are often perceived as fearful and hence unsuitable for distribution to the public. Hence, we aim to understand spatio-temporal wave propagation by sonifying data from a seismic array and to design a pleasant sound for public outreach. In this research, a sonification researcher, a composer and a seismologist collaborated to propose an interactive sonification system for seismologists.
An interactive sonification method for multiple seismic waves was developed for data exploration. To investigate the method, it was applied to seismic-array recordings of the wave propagation from the 2011 Tohoku-oki earthquake over the Japanese islands. As the playback rate in this investigation is only 10 times real time, the propagation of seismic waves remains easy to follow. The sonified sounds exhibit characteristic shapes and distributions that let seismologists easily determine the time span and frequency band to focus on. The case study showed how a seismologist explored the data with visualisation and sonification and how he discovered a triggered earthquake using the sonified sound.","Proceedings of ISon 2016, 5th Interactive Sonification Workshop, CITEC, Bielefeld University",,,exploration,whole,whole,complementary,explore,explore,"geometry, table",table,"nominal, interval",interval,QRI,map,"audification, parameter mapping",none,color hue,none,pitch,yes,Data Analysis,General Public,Natural Sciences,Seismology,Earth Science,Desktop Computer Display,no,no, KIKXXGG2,2012,"Ness, Steven; Reimer, Paul; Love, Justin; Schloss, W. Andrew; Tzanetakis, George",Sonophenology,"The study of periodic biological processes, such as when plants flower and birds arrive in the spring, is known as phenology. In recent years this field has gained interest from the scientific community because of the applicability of this data to the study of climate change and other ecological processes. In this paper we propose the use of tangible interfaces for interactive sonification, with a specific example of a multimodal tangible interface consisting of a physical paper map and tracking of fiducial markers combined with a novel drawing interface. The designed interface enables one or more users to specify point queries with the map interface and to specify time queries with the drawing interface. This allows the user to explore both time and space while receiving immediate sonic feedback of their actions. This system can be used to study and explore the effects of climate change, both as a tool to be used by scientists, and as a way to educate and involve members of the general public in a dynamic way in this research.",Journal on Multimodal User Interfaces,https://doi.org/10.1007/s12193-011-0066-4,10.1007/s12193-011-0066-4,exploration,whole,"whole, group",complementary,explore,explore,geometry,"geometry, table",nominal,interval,QRI,map,parameter mapping,color hue,position,timbre,"pitch, onset time",yes,Education,"Domain Experts, General Public",Natural Sciences,Phenology,Earth Science,"Physical Environment/ Multi User, Touch Display",no,no,https://www.youtube.com/watch?v=829r3y01XLk G5IRKF67,2016,"North, Kevin J.; Sarma, Anita; Cohen, Myra B.",Understanding git history: A multi-sense view,"Version control systems archive data about the development history of a project, which can be used to analyze and understand different facets of a software project. The project history can be used to evaluate the development process of a team, as an aid in bug fixing, or to help new members get on track with development. However, state-of-the-art techniques for analyzing version control data provide only partial views into this information, and lack an easy way to present all the dimensions of the data. In this paper we present GitVS, a hybrid view that incorporates visualization and sonification to represent the multiple dimensions of version control data - development timeline, conflicts, etc. 
In a formative user study comparing the GitHub Network Graph, GitVS, and a version of GitVS without sound, we show that GitVS improves over the GitHub Network Graph and that, while sound makes it easier to correctly understand version history for some tasks, it makes others more difficult.",Proceedings of the 8th international workshop on social software engineering,https://doi.org/10.1145/2993283.2993285,10.1145/2993283.2993285,exploration,"whole, group, single","group, single",mixed,explore,explore,table,table,"nominal, ordinal","nominal, ordinal",UP,gantt chart,"earcons, parameter mapping","color hue, shape","position, size",timbre,none,yes,Data Analysis,Domain Experts,Applied Sciences & Engineering,Computer Science,Computer Science,Desktop Computer Display,yes,yes,https://cse.unl.edu/~myra/artifacts/GitVS/vm/ M3ML3J35,2015,"Papachristodoulou, Panagiota; Betella, Alberto; Manzolli, Jonatas",Augmenting the navigation of complex data sets using sonification: a case study with BrainX3,"The meaningful representation and exploration of big data constitutes a challenge for many scientific fields. In recent years, auditory displays have been effectively employed to address this problem. The coupling of sonification with visualization techniques in multimodal displays can lead to the implementation of powerful tools for the understanding of complex datasets. In this study, we applied sonification techniques to a complex dataset from neuroscience. To do so, we used BrainX3, a novel immersive technology for the exploration of large brain networks. We conducted an experiment to assess whether the addition of an auditory layer would result in better user performance in brain region identification at different spatial resolutions.",2015 IEEE 2nd VR Workshop on Sonic Interactions for Virtual Environments (SIVE),https://ieeexplore.ieee.org/document/7361284/,10.1109/SIVE.2015.7361284,exploration,"whole, group",group,complementary,explore,browse,geometry,table,nominal,"interval, ratio",UP,3D network,"parameter mapping, auditory icons",none,position,spatial position,"harmonic range, reverberation, loudness",yes,Data Analysis,General Public,Natural Sciences,Neuroscience,Neuroscience,"Physical Environment/ Multi User, Touch Display",no,no, 9W376E2S,2014,"Papachristodoulou, Panagiota; Betella, Alberto; Verschure, Paul",Sonification of Large Datasets in a 3D Immersive Environment: A Neuroscience Case Study,"Auditory display techniques can play a key role in the understanding of hidden patterns in large datasets. In this study, we investigated the role of sonification applied to an immersive 3D visualization of a complex network dataset. As a test case, we used a 3D interactive visualization of the so-called connectome of the human brain, in the immersive space called ""eXperience Induction Machine (XIM)"". We conducted an empirical validation where subjects were asked to perform a navigation task through the network and were subsequently tested for their understanding of the dataset. 
Our results showed that sonification provides a further layer of understanding of the dynamics of the network by enhancing the subjects' structural understanding of the data space.",,,,exploration,whole,group,complementary,explore,explore,network,table,nominal,"interval, ratio",UP,network,parameter mapping,none,position,spatial position,"tempo, pitch, loudness",yes,Data Analysis,General Public,Natural Sciences,Neuroscience,Neuroscience,Desktop Computer Display,no,no, SPHMHR22,2022,"Paté, Arthur; Farge, Gaspard; Holtzman, Benjamin; Barth, Anna C.; Poli, Piero; Boschi, Lapo; Karlstrom, Leif",Combining audio and visual displays to highlight temporal and spatial seismic patterns,"Data visualization, and to a lesser extent data sonification, are classic tools to the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is a key parameter to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.",Journal on Multimodal User Interfaces,https://parthurp.github.io/homepage/SpatialSeismicSoundscapes_article2021.html,10.1007/s12193-021-00378-8,both,group,group,mixed,explore,explore,geometry,field,ratio,ratio,"QRI, UE",dot map,audification,position,"color hue, size",spatial position,loudness,no,Research,"Domain Experts, General Public",Natural Sciences,Seismology,Earth Science,Physical Environment/ Multi User,yes,"yes, video",https://parthurp.github.io/homepage/SpatialSeismicSoundscapes_article2021.html SPHMHR22,2022,"Paté, Arthur; Farge, Gaspard; Holtzman, Benjamin; Barth, Anna C.; Poli, Piero; Boschi, Lapo; Karlstrom, Leif",Combining audio and visual displays to highlight temporal and spatial seismic patterns,"Data visualization, and to a lesser extent data sonification, are classic tools to the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. 
Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is a key parameter to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.",Journal on Multimodal User Interfaces,,10.1007/s12193-021-00378-8,both,group,group,redundant,explore,explore,field,field,ratio,"nominal, ratio","QRI, UE",heatmap,"earcons, auditory icons",position,color hue,spatial position,pitch,no,Research,"Domain Experts, General Public",Natural Sciences,Seismology,Earth Science,Physical Environment/ Multi User,yes,"yes, video",https://parthurp.github.io/homepage/SpatialSeismicSoundscapes_article2021.html SPHMHR22,2022,"Paté, Arthur; Farge, Gaspard; Holtzman, Benjamin; Barth, Anna C.; Poli, Piero; Boschi, Lapo; Karlstrom, Leif",Combining audio and visual displays to highlight temporal and spatial seismic patterns,"Data visualization, and to a lesser extent data sonification, are classic tools to the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned for temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set, and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. 
First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is a key parameter to the understanding of seismically active systems, and a step towards apprehending the processes that drive this activity.",Journal on Multimodal User Interfaces,,10.1007/s12193-021-00378-8,both,group,group,redundant,explore,explore,field,field,ratio,ratio,"QRI, UE",line chart,audification,position,position,spatial position,loudness,no,Research,"Domain Experts, General Public",Natural Sciences,Seismology,Earth Science,Physical Environment/ Multi User,yes,"yes, video",https://parthurp.github.io/homepage/SpatialSeismicSoundscapes_article2021.html RAA8WGBY,2023,"Peng, Tristan; Choi, Hongchan; Berger, Jonathan",SIREN: Creative and Extensible Sonification on the Web,"Parameter mapping sonification is a pervasive aspect of many everyday situations, from knocking on a watermelon to determine “goodness,” to diagnostic listening to the pings and rattles of an engine. However, not all mappings are so intuitive. Sonification Interface for REmapping Nature (SIREN) is a multipurpose, powerful web application that aims at exploring sonification parameter mappings, allowing for a wide range of flexibility in terms of sonic choices and mapping strategies. SIREN aspires to enable users to arrive at sophisticated sonification without getting bogged down in the details of programming.",ICAD,https://hdl.handle.net/1853/72879,,presentation,"whole, single",whole,redundant,none,explore,table,table,"nominal, ordinal, interval","nominal, ordinal, interval",NONE,line chart,parameter mapping,none,position,none,"pitch, loudness, spatial position, duration",no,Education,General Public,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, webapp",https://kizjkre.github.io/siren/#/ WYGSJKCI,2019,"Phillips, Sean; Cabrera, Andres",Sonification workstation,"Sonification Workstation is an open-source application for general sonification tasks, designed with ease-of-use and wide applicability in mind. Intended to foster adoption of sonification across disciplines, and increase experimentation with sonification by non-specialists, Sonification Workstation distills tasks useful in sonification and encapsulates them in a single software environment. The novel interface combines familiar modes of navigation from Digital Audio Workstations, with a highly simplified patcher interface for creating the sonification scheme. Further, the software associates methods of sonification with the data they sonify, in session files, which will make sharing and reproducing sonifications easier. It is posited that facilitating experimentation by non-specialists will increase the potential growth of sonification into fresh territory, encourage discussion of sonification techniques and uses, and create a larger pool of ideas to draw from in advancing the field of sonification. Source code is available at https://github.com/Cherdyakov/sonificationworkstation. 
Binaries for macOS and Windows, as well as sample content, are available at http://sonificationworkstation.org.",ICAD,http://hdl.handle.net/1853/61529,,presentation,whole,whole,redundant,none,none,table,table,"interval, ratio","interval, ratio",NONE,line chart,parameter mapping,none,position,none,"pitch, spatial position",no,Data Analysis,General Public,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,no,no, R7GSVX9Z,2015,"Rau, Benjamin; Frieß, Florian; Krone, Michael; Muller, Christoph; Ertl, Thomas",Enhancing visualization of molecular simulations using sonification,"Scientific visualization is an application area for virtual reality environments like stereoscopic displays or CAVEs. Interactive molecular visualizations in particular, which show the complex three-dimensional structures found in structural biology, are often investigated using such environments. In contrast to VR applications like simulators, molecular visualization typically lacks auditory output. Nevertheless, sonification can be used to convey information about the data. In our work, we use sound to highlight events extracted from a molecular dynamics simulation. This not only offloads information from the visual channel, but can also guide the attention of the analyst towards important phenomena even if they are occluded in the visualization. Sound also creates a higher level of immersion, which can be beneficial for educational purposes. In this paper, we detail our application that adds sonification to the visualization of molecular simulations.",2015 IEEE 1st International Workshop on Virtual and Augmented Reality for Molecular Science (VARMS@IEEEVR),https://ieeexplore.ieee.org/document/7151725/,10.1109/VARMS.2015.7151725,exploration,whole,group,complementary,"browse, locate, explore",locate,geometry,geometry,ratio,"nominal, ratio",QRI,volume rendering,"auditory icons, parameter mapping",shape,color hue,timbre,pitch,yes,Data Analysis,Domain Experts,Life Sciences,Molecules,Molecular Science,Desktop Computer Display,no,no, Z5EZHA3K,2018,"Riber, Adrián García",Planethesizer: Approaching exoplanet sonification,"The creation of simulations, sounds and images based on information related to an object of investigation is currently an established tool used in multiple areas to bring the non-specialized public closer to scientific achievements and discoveries. Under this context of multimodal representations and simulations developed for educational and informational purposes, this work intends to build a bridge between virtual musical instruments’ development and physical models, using the gravitation laws of the seven planets orbiting around the Trappist-1 star. The following is a case study of an interdisciplinary conversion algorithm design that relates musical software synthesis to exoplanets’ astronomical data - measured from the observed flux variations in the light curves of their star - and that tries to suggest a systematic and reproducible method, useful for any other planetary system or model-based virtual instrument design. 
As a result, the Virtual Interactive Synthesizer prototype Planethesizer is presented, whose default configurations display multimodal simulations of the Trappist-1, Kepler-444 and K2-72 planetary systems.",ICAD,http://hdl.handle.net/1853/60073,,both,whole,whole,redundant,explore,explore,table,table,"nominal, ordinal","nominal, ordinal",NONE,volume rendering,parameter mapping,color hue,"tilt, size, position",timbre,"tempo, onset time, pitch, loudness",yes,Data Analysis,Domain Experts,Natural Sciences,Astronomy,Astronomy,Desktop Computer Display,yes,yes,https://archive.org/details/PlanethesizerWindows 9L4IFX86,2013,"Rogińska, Agnieszka; Friedman, Kent; Mohanraj, Hariharan",Exploring sonification for augmenting brain scan data,Medical image data has traditionally been analyzed using visual displays and statistical analysis methods. Visual representations are limited due to the nature of the display and the number of dimensions that can be represented visually. This paper describes the use of sonification to represent medical image data of brain scans of patients with Alzheimer's dementia. The use of sonification is described as an approach to augment traditional diagnosis methods used to diagnose Alzheimer's dementia.,ICAD,http://hdl.handle.net/1853/51653,,exploration,group,group,redundant,explore,explore,field,field,ratio,ratio,NONE,slicing,parameter mapping,none,color hue,none,"pitch, onset time, position",no,Data Analysis,Domain Experts,Life Sciences,Brain Scans,Medicine and Health,Desktop Computer Display,no,no, EINDBLK7,2021,"Rönnberg, Niklas",Sonification for conveying data and emotion,"In the present study, a sonification of running data was evaluated. The aim of the sonification was to both convey information about the data and convey a specific emotion. The sonification was evaluated in three parts, firstly as an auditory graph, secondly together with additional text information, and thirdly together with an animated visualization, with a total of 150 responses. The results suggest that the sonification could convey an emotion similar to that intended, but at the cost of a less accurate representation of the data. The addition of visual information supported understanding of the sonification and the auditory representation of data. The results thus suggest that it is possible to design sonification that is perceived as both interesting and fun and conveys an emotional impression, but that there may be a trade-off between musical experience and clarity in sonification.",Proceedings of the 16th international audio mostly conference,https://doi.org/10.1145/3478384.3478387,10.1145/3478384.3478387,presentation,whole,whole,complementary,none,none,table,table,ratio,"interval, ratio",UE,bar chart,parameter mapping,"position, color hue",length,timbre,"loudness, brightness, the number of chord strokes",no,Public Engagement,General Public,Life Sciences,Sports,Others,Desktop Computer Display,yes,"yes, video",https://vimeo.com/397235072 HPCV8ASK,2016,"Rönnberg, Niklas; Johansson, Jimmy",Interactive Sonification for Visual Dense Data Displays,"This paper presents an experiment designed to evaluate the possible benefits of sonification in information visualization to give rise to further research challenges. It is hypothesized that, by using musical sounds for sonification when visualizing complex data, interpretation and comprehension of the visual representation could be increased by interactive sonification.","Proc. 
5th Interactive Sonification Workshop, ISon",,,exploration,whole,group,redundant,"browse, locate, explore","browse, locate, explore",table,table,"interval, ratio","interval, ratio",UP,"scatter plot, parallel coordinates",parameter mapping,color hue,position,"pitch, timbre",loudness,yes,Data Analysis,Researchers,Applied Sciences & Engineering,Data Display,Data Display,Desktop Computer Display,yes,"yes, video",https://vimeo.com/247770770 VTI8KPMJ,2022,"Russo, Matt; Santaguida, Andrew",5000 Exoplanets: Listen to the Sounds of Discovery,"In March of 2022, NASA announced the discovery of the 5000th planet orbiting a star other than our sun (an exoplanet). We have created a sonification and visualization to celebrate this milestone and to communicate the exciting history of discovery to the general public. Our work provides a visceral experience of how humanity’s knowledge of alien worlds has progressed. A relatively simple and straightforward sonification mapping is used to make the informational content as accessible to the general public as possible. Listeners can see and hear the timing, number, and relative orbital periods of the exoplanets that have been discovered to date. The sonification was experienced millions of times through NASA’s social media channels and there are plans to update the sonification as future milestones are reached.",ICAD,http://hdl.handle.net/1853/67384,,presentation,whole,whole,mixed,none,none,"geometry, table",table,interval,interval,UE,3D map,parameter mapping,color hue,"position, size, opacity",none,"timing, pitch, loudness, spatial position",no,Public Engagement,General Public,Natural Sciences,Astronomy,Astronomy,"Desktop Computer Display, Physical Environment/ Multi User",yes,"yes, video",https://exoplanets.nasa.gov/resources/2321/5000-exoplanets-listen-to-the-sounds-of-discovery-360-video/ LVZUE7V8,2022,"Svoronos-Kanavas, Iason; Agiomyrgianakis, Vasilis; Rönnberg, Niklas",An exploratory use of audiovisual displays on oceanographic data,"The present study is an interdisciplinary endeavour that transmutes science, technology, and aesthetics into an audiovisual experience. The objective is to highlight the potential of combining sonification with visualisation in order to enhance the comprehension of extensive and complex sets of data. Moreover, this paper describes contemporary tools and methods for the implementation of the practice and suggests effective ways to monitor environmental changes. It can be regarded as an exploratory study for familiarisation with the potential of sonification and visualisation in the exploration of environmental data.",Proc. AVI 2022 Workshop on Audio-Visual Analytics. 
Rome,https://www.researchgate.net/profile/Iason-Svoronos-Kanavas/publication/363864398_An_exploratory_use_of_audiovisual_displays_on_oceanographic_data/links/6332faf1694dbe4bf4c64cef/An-exploratory-use-of-audiovisual-displays-on-oceanographic-data.pdf,,presentation,whole,whole,redundant,explore,explore,table,table,ratio,ratio,NONE,fluid-like simulation,parameter mapping,none,"movement, color hue",timbre,"onset time, spatial position, triggering frequency, loudness, pitch, cut-off frequency, modulation frequency",no,Public Engagement,General Public,Natural Sciences,Oceanography,Earth Science,Desktop Computer Display,yes,"yes, video",https://vimeo.com/698105264 YNFIUZFP,2021,"Temor, Lucas; MacDonald, Daniel E.; Natarajan, Thangam; Coppin, Peter; Steinman, David A.",Perceptually-motivated sonification of spatiotemporally-dynamic CFD data,"Everyday perception and action are fundamentally multisensory. Despite this, the sole reliance on visualization for the representation of complex 3D spatiotemporal data is still widespread. In the past we have proposed various prototypes for the sonification of dense data from computational fluid dynamics (CFD) simulations of turbulent-like blood flow, but did not robustly consider the perception and associated meaning-making of the resultant sounds. To reduce some of the complexities of these data for sonification, in this work we present a feature-based approach, applying ideas from auditory scene analysis to sonify different data features along perceptually-separable auditory streams. As there are many possible features in these dense data, we followed the analogy of ""caricature"" to guide our definition and subsequent amplification of unique spectral and fluctuating features, while effectively minimizing the features common between simulations. This approach may allow for better insight into the behavior of flow instabilities when compared to our previous sonifications and/or visualizations, and additionally we observed benefits when some redundancy was maintained between modalities.",ICAD,http://hdl.handle.net/1853/66343,,exploration,whole,whole,mixed,explore,explore,field,field,ratio,ratio,QRI,volume rendering,parameter mapping,color hue,animation,timbre,"pitch, harmonic range",no,Data Analysis,Domain Experts,Life Sciences,Blood Flow,Medicine and Health,Desktop Computer Display,yes,"yes, video",https://www.youtube.com/playlist?list=PLRbQXqE-XKzdDOqxnXI4yfo3r_cIcrjNl XP9E8SFX,2023,"Traver, Peter; Bergh, Emil",Harmonices Solaris - Sonification of the Planets,"This project is a sonification of the planets in our solar system designed as a spatial audio installation. Audio for each planet was generated within Python using data from NASA about the planets (including Pluto). A Markov model across arbitrary pitch set space was used to generate pitch set collections for each planet with regard to the various parameters in the dataset. XP4Live is used for spatialization and visuals due to its real-time capability. The final piece, as presented, allows the audience to experience orbiting planet sound objects around them in space; the installation includes a mapping of the planets to faders on a MIDI controller allowing the audience to isolate individual planets to hear their sonic makeup and orbiting pattern. This model is presented in a spatial audio array of at least four speakers, but can also be presented as a binaural mix for virtual submission. 
Future work includes making this piece accessible to the public via a binaural rendering on a webpage, as well as VR implementation for enhanced interactivity.",ICAD,https://hdl.handle.net/1853/72864,,presentation,whole,"whole, group",complementary,none,none,geometry,table,nominal,interval,QRI,3D scatter plot,parameter mapping,color hue,"size, position",none,"pitch, harmonic range, size of chord, rate of chord change, length of composition, spatial position",yes,Public Engagement,General Public,Natural Sciences,Astronomy,Astronomy,Physical Environment/ Multi User,no,no, 7CBTT67I,2015,"Winters, R. Michael; Weinberg, Gil","Sonification of the Tohoku earthquake: Music, popularization & the auditory sublime","The past century has witnessed the emergence of expressive musical forms that originate in appropriated technologies and practices. In most cases, this appropriation is performed without protest, but not always. Carefully negotiating a space for sound as an objective, scientific medium, the field of sonification has cautiously guarded the term from subjective and affective endeavors. This paper explores the tensions arising in sonification popularization through a formal analysis of Sonification of the Tohoku Earthquake, a two-minute YouTube video that combined audification with a time-aligned seismograph, text and heatmap. Although the many views the video has received speak to a high public impact, the features contributing to this popularity have not been formalized, nor the extent to which these characteristics further sonifications’ scientific mission. For this purpose, a theory of popularization based upon “sublime listening experiences” is applied. The paper concludes by drawing attention to broader themes in the history of music and technology and presents guidelines for designing effective public-facing examples.",ICAD,http://hdl.handle.net/1853/54149,,presentation,whole,whole,redundant,explore,explore,table,table,ratio,ratio,QRI,line chart,audification,position,position,spatial position,"pitch, loudness, timbre",no,Public Engagement,General Public,Natural Sciences,Seismology,Earth Science,Desktop Computer Display,yes,"yes, video",https://www.youtube.com/watch?v=3PJxUPvz9Oo M6KHP77V,2018,"Yang, Jiajun; Hermann, Thomas",Interactive Mode Explorer Sonification Enhances Exploratory Cluster Analysis,"Exploratory Data Analysis (EDA) refers to the process of detecting patterns in data when explicit knowledge of such patterns within the data is missing. Because EDA predominantly employs data visualization, it remains challenging to visualize high-dimensional data. To minimize the challenge, some information can be shifted into the auditory channel using humans’ highly developed listening skills. This paper introduces Mode Explorer, a new sonification model that enables continuous interactive exploration of datasets with regard to their clustering. The method was shown to be effective in supporting users in the more accurate assessment of cluster mass and number of clusters. While the Mode Explorer sonification aimed to support cluster analysis, the ongoing research has the goal of establishing a more general toolbox of sonification models, tailored to uncover different structural aspects of high-dimensional data. 
The principle of extending the data display to the auditory domain is applied by augmenting interactions with 2D scatter plots of high-dimensional data with information about the probability density function.",Journal of the Audio Engineering Society,http://www.aes.org/e-lib/browse.cfm?elib=19712,10.17743/jaes.2018.0042,exploration,group,"whole, group, single",complementary,none,explore,table,table,"interval, ratio","interval, ratio",UP,scatter plot,model-based sonification,position,position,none,"pitch, timbre",yes,Data Analysis,"Researchers, Domain Experts",Applied Sciences & Engineering,Data Display,Data Display,Touch Display,yes,"yes, video",https://pub.uni-bielefeld.de/record/2920473