Publications by Roberto Bresin
Peer-reviewed
Articles
[1]
Favero, F., Bresin, R., Mancini, M., Lowden, A. & Avola, D. (2024). Light and Motion: Effects of Light Conditions and mEDI on Activity and Motion Area under a Sky-Lighting Machine. LEUKOS: The Journal of the Illuminating Engineering Society of North America, 1-23.
[2]
Latupeirissa, A. B. & Bresin, R. (2023). PepperOSC: enabling interactive sonification of a robot's expressive movement. Journal on Multimodal User Interfaces, 17(4), 231-239.
[3]
Latupeirissa, A. B., Panariello, C. & Bresin, R. (2023). Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification. ACM Transactions on Human-Robot Interaction.
[4]
Orthmann, B., Leite, I., Bresin, R. & Torre, I. (2023). Sounding Robots : Design and Evaluation of Auditory Displays for Unintentional Human-robot Interaction. ACM Transactions on Human-Robot Interaction, 12(4).
[5]
Favero, F., Lowden, A., Bresin, R. & Ejhed, J. (2023). Study of the Effects of Daylighting and Artificial Lighting at 59° Latitude on Mental States, Behaviour and Perception. Sustainability, 15(2).
[6]
Sköld, M. & Bresin, R. (2022). Sonification of Complex Spectral Structures. Frontiers in Neuroscience, 16.
[7]
Panariello, C. & Bresin, R. (2022). Sonification of Computer Processes : The Cases of Computer Shutdown and Idle Mode. Frontiers in Neuroscience, 16.
[8]
Misdariis, N., Özcan, E., Grassi, M., Pauletto, S., Barrass, S., Bresin, R. & Susini, P. (2022). Sound experts’ perspectives on astronomy sonification projects. Nature Astronomy, 6(11), 1249-1255.
[9]
Frid, E. & Bresin, R. (2021). Perceptual Evaluation of Blended Sonification of Mechanical Robot Sounds Produced by Emotionally Expressive Gestures : Augmenting Consequential Sounds to Improve Non-verbal Robot Communication. International Journal of Social Robotics.
[10]
Bresin, R., Mancini, M., Elblaus, L. & Frid, E. (2020). Sonification of the self vs. sonification of the other : Differences in the sonification of performed vs. observed simple hand movements. International Journal of Human-Computer Studies, 144.
[11]
Frid, E., Elblaus, L. & Bresin, R. (2019). Interactive sonification of a fluid dance movement : an exploratory study. Journal on Multimodal User Interfaces, 13(3), 181-189.
[12]
Frid, E., Moll, J., Bresin, R. & Sallnäs Pysander, E.-L. (2018). Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task. Journal on Multimodal User Interfaces, 13(4), 279-290.
[13]
Frid, E., Bresin, R., Alborno, P. & Elblaus, L. (2016). Interactive Sonification of Spontaneous Movement of Children : Cross-Modal Mapping and the Perception of Body Movement Qualities through Sound. Frontiers in Neuroscience, 10.
[14]
Elblaus, L., Tsaknaki, V., Lewandowski, V., Bresin, R., Hwang, S., Song, J. ... Taylor, A. (2015). Demo Hour. interactions, 22(5), 6-9.
[15]
Turchet, L. & Bresin, R. (2015). Effects of interactive sonification on emotionally expressive walking styles. IEEE Transactions on Affective Computing, 6(2), 152-164.
[16]
Dubus, G. & Bresin, R. (2015). Exploration and evaluation of a system for interactive sonification of elite rowing. Sports Engineering, 18(1), 29-41.
[17]
Goebl, W., Bresin, R. & Fujinaga, I. (2014). Perception of touch quality in piano tones. Journal of the Acoustical Society of America, 136(5), 2839-2850.
[18]
Giordano, B., Egermann, H. & Bresin, R. (2014). The production and perception of emotionally expressive walking sounds : Similarities between musical performance and everyday motor activity. PLOS ONE, 9(12), e115587.
[19]
Dubus, G. & Bresin, R. (2013). A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities. PLOS ONE, 8(12), e82491.
[20]
Eerola, T., Friberg, A. & Bresin, R. (2013). Emotional expression in music : Contribution, linearity, and additivity of primary musical cues. Frontiers in Psychology, 4, 487.
[21]
Hansen, K. F., Dravins, C. & Bresin, R. (2012). Active Listening and Expressive Communication for Children with Hearing Loss Using Getatable Environments for Creativity. Journal of New Music Research, 41(4), 365-375.
[22]
Bresin, R., Hermann, T. & Hunt, A. (2012). Interactive sonification. Journal on Multimodal User Interfaces, 5(3-4), 85-86.
[23]
Fabiani, M., Bresin, R. & Dubus, G. (2012). Interactive sonification of expressive hand gestures on a handheld device. Journal on Multimodal User Interfaces, 6(1-2), 49-57.
[24]
Varni, G., Dubus, G., Oksanen, S., Volpe, G., Fabiani, M., Bresin, R. ... Camurri, A. (2012). Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices. Journal on Multimodal User Interfaces, 5(3-4), 157-173.
[25]
Hansen, K. F. & Bresin, R. (2012). Sonification of distance between stations in train journeys. TMH-QPSR special issue: Proceedings of SMC Sweden 2012 Sound and Music Computing, Understanding and Practicing in Sweden, 52(1), 13-14.
[26]
Bolíbar, J. & Bresin, R. (2012). Sound feedback for the optimization of performance in running. TMH-QPSR special issue: Proceedings of SMC Sweden 2012 Sound and Music Computing, Understanding and Practicing in Sweden, 52(1), 39-40.
[27]
Hansen, K. F., Dubus, G. & Bresin, R. (2012). Using modern smartphones to create interactive listening experiences for hearing impaired. TMH-QPSR special issue: Proceedings of SMC Sweden 2012 Sound and Music Computing, Understanding and Practicing in Sweden, 52(1), 42.
[28]
Hansen, K. F., Fabiani, M. & Bresin, R. (2011). Analysis of the acoustics and playing strategies of turntable scratching. Acta Acustica united with Acustica, 97(2), 303-314.
[29]
Bresin, R. & Friberg, A. (2011). Emotion rendering in music : Range and characteristic values of seven musical variables. Cortex, 47(9), 1068-1081.
[30]
Burger, B. & Bresin, R. (2010). Communication of Musical Expression by Means of Mobile Robot Gestures. Journal on Multimodal User Interfaces, 3(1), 109-118.
[31]
Hansen, K. F. & Bresin, R. (2010). The Skipproof virtual turntable for high-level control of scratching. Computer Music Journal, 34(2), 39-50.
[32]
Visell, Y., Fontana, F., Giordano, B. L., Nordahl, R., Serafin, S. & Bresin, R. (2009). Sound design and perception in walking interactions. International Journal of Human-Computer Studies, 67(11), 947-959.
[33]
Mancini, M., Bresin, R. & Pelachaud, C. (2007). A virtual head driven by music expressivity. IEEE Transactions on Audio, Speech, and Language Processing, 15(6), 1833-1841.
[34]
Serra, X., Bresin, R. & Camurri, A. (2007). Sound and music computing : Challenges and strategies. Journal of New Music Research, 36(3), 185-190.
[35]
Friberg, A., Bresin, R. & Sundberg, J. (2006). Overview of the KTH rule system for musical performance. Advances in Cognitive Psychology, 2(2-3), 145-161.
[36]
Laukka, P., Juslin, P. N. & Bresin, R. (2005). A dimensional approach to vocal expression of emotion. Cognition & Emotion, 19(5), 633-653.
[37]
Schoonderwaldt, E. & Bresin, R. (2005). Book Review : Freedom and Constraints in Timing and Ornamentation: Investigations of Music Performance. Psychology of Music, 33(1), 122-128.
[38]
Goebl, W., Bresin, R. & Galembo, A. (2005). Touch and temporal behavior of grand piano actions. Journal of the Acoustical Society of America, 118(2), 1154-1165.
[39]
Hansen, K. F. & Bresin, R. (2004). Analysis of a genuine scratch performance. Lecture Notes in Computer Science, 2915, 477-478.
[40]
Sundberg, J., Friberg, A. & Bresin, R. (2003). Attempts to reproduce a pianist's expressive timing with Director Musices performance rules. Journal of New Music Research, 32(3), 317-325.
[41]
Lindström, E., Juslin, P. N., Bresin, R. & Williamon, A. (2003). Expressivity comes from within your soul: A questionnaire study of students' perspectives on musical expressivity. Research Studies in Music Education, 20, 23-47.
[42]
Goebl, W. & Bresin, R. (2003). Measurement and reproduction accuracy of computer-controlled grand pianos. Journal of the Acoustical Society of America, 114(4), 2273-2283.
[44]
Schoonderwaldt, E., Friberg, A., Bresin, R. & Juslin, P. N. (2002). A system for improving the communication of emotion in music performance by feedback learning. Journal of the Acoustical Society of America, 111(5), 2471.
[45]
Juslin, P. N., Friberg, A. & Bresin, R. (2002). Toward a computational model of expression in music performance: The GERM model. Musicae Scientiae, Special Issue 2001-2002, 63-122.
[46]
Bresin, R. & Battel, G. U. (2000). Articulation strategies in expressive piano performance - Analysis of legato, staccato, and repeated notes in performances of the Andante movement of Mozart's Sonata in G major (K 545). Journal of New Music Research, 29(3), 211-224.
[47]
Bresin, R. & Friberg, A. (2000). Emotional coloring of computer controlled music performance. Computer Music Journal, 24(4), 44-63.
[49]
Bresin, R. & Widmer, G. (2000). Production of staccato articulation in Mozart sonatas played on a grand piano : Preliminary results. Speech Music and Hearing Quarterly Progress and Status Report, 41(4), 1-6.
[50]
Bresin, R. (1998). Artificial neural networks based models for automatic performance of musical scores. Journal of New Music Research, 27(3), 239-270.
[51]
Friberg, A., Bresin, R., Frydén, L. & Sundberg, J. (1998). Musical punctuation on the microlevel : Automatic identification and performance of small melodic units. Journal of New Music Research, 27(3), 271-292.
Conference papers
[53]
Hultman, A., Goina, M., Bresin, R. (2024). Interactive sonification helps make sense of the negative environmental impact of vessel traffic in the Baltic Sea. In Proceedings of the 19th International Audio Mostly Conference: Explorations in Sonic Cultures. (pp. 209-217). New York, NY, USA: Association for Computing Machinery (ACM).
[54]
Telang, S., Marques, M., Latupeirissa, A. B., Bresin, R. (2023). Emotional Feedback of Robots : Comparing the perceived emotional feedback by an audience between masculine and feminine voices in robots in popular media. In HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction. (pp. 434-436). Association for Computing Machinery (ACM).
[55]
Zhang, B. J., Orthmann, B., Torre, I., Bresin, R., Fick, J., Leite, I., Fitter, N. T. (2023). Hearing it Out : Guiding Robot Sound Design through Design Thinking. In 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). (pp. 2064-2071). Institute of Electrical and Electronics Engineers (IEEE).
[56]
Rafi, A. K., Murdeshwar, A., Latupeirissa, A. B., Bresin, R. (2023). Investigating the Role of Robot Voices and Sounds in Shaping Perceived Intentions. In HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction. (pp. 425-427). Association for Computing Machinery (ACM).
[57]
Goina, M., Bresin, R., Rodela, R. (2023). Our Sound Space (OSS): An installation for participatory and interactive exploration of soundscapes. In SMC 2023: Proceedings of the Sound and Music Computing Conference 2023. (pp. 255-260). Sound and Music Computing Network.
[58]
Zojaji, S., Latupeirissa, A. B., Leite, I., Bresin, R., Peters, C. (2023). Persuasive polite robots in free-standing conversational groups. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023). (pp. 1-8). Institute of Electrical and Electronics Engineers (IEEE).
[59]
Maranhao, T., Berrez, P., Kihl, M., Bresin, R. (2023). What is the color of choro? : Color preferences for an instrumental Brazilian popular music genre. In SMC 2023: Proceedings of the Sound and Music Computing Conference 2023. (pp. 370-376). Sound and Music Computing Network.
[60]
van den Broek, G., Bresin, R. (2022). Concurrent sonification of different percentage values : the case of database values about statistics of employee engagement. In Proceedings of ISon 2022, 7th Interactive Sonification Workshop, BSCC, University of Bremen, Germany, September 22–23, 2022.
[61]
Larson Holmgren, D., Särnell, A., Bresin, R. (2022). Facilitating reflection on climate change using interactive sonification. In Proceedings of ISon 2022, 7th Interactive Sonification Workshop, BSCC, University of Bremen, Germany, September 22–23, 2022.
[62]
Kantan, P. R., Dahl, S., Spaich, E. G., Bresin, R. (2022). Sonifying Walking : A Perceptual Comparison of Swing Phase Mapping Schemes. In Proceedings of ISon 2022, 7th Interactive Sonification Workshop, BSCC, University of Bremen, Germany, September 22–23, 2022.
[63]
Bresin, R., Frid, E., Latupeirissa, A. B., Panariello, C. (2021). Robust Non-Verbal Expression in Humanoid Robots: New Methods for Augmenting Expressive Movements with Sound. Presented at the Workshop on Sound in Human-Robot Interaction at HRI 2021.
[64]
Myresten, E., Larson Holmgren, D., Bresin, R. (2021). Sonification of Twitter Hashtags Using Earcons Based on the Sound of Vowels. In Proceedings of the 2nd Nordic Sound and Music Computing Conference. Zenodo.
[65]
Latupeirissa, A. B., Panariello, C., Bresin, R. (2020). Exploring emotion perception in sonic HRI. In Proceedings of the 17th Sound and Music Computing Conference. (pp. 434-441). Torino: Zenodo.
[66]
Bresin, R., Pauletto, S., Laaksolahti, J., Gandini, E. (2020). Looking for the soundscape of the future : preliminary results applying the design fiction method. In Proceedings of the Sound and Music Computing Conference 2020.
[67]
Latupeirissa, A. B., Bresin, R. (2020). Understanding non-verbal sound of humanoid robots in films. Presented at the Workshop on Mental Models of Robots at HRI 2020, Cambridge, UK, March 23, 2020.
[68]
Panariello, C., Sköld, M., Frid, E., Bresin, R. (2019). From vocal sketching to sound models by means of a sound-based musical transcription system. In Proceedings of the Sound and Music Computing Conferences. (pp. 167-173). CERN.
[69]
Han, X., Bresin, R. (2019). Performance of piano trills: effects of hands, fingers, notes and emotions. In Combined Proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019. (pp. 9-15). Stockholm.
[70]
Latupeirissa, A. B., Frid, E., Bresin, R. (2019). Sonic characteristics of robots in films. In Proceedings of the 16th Sound and Music Computing Conference. (pp. 1-6). Malaga, Spain.
[71]
Frid, E., Lindetorp, H., Hansen, K. F., Elblaus, L., Bresin, R. (2019). Sound Forest - Evaluation of an Accessible Multisensory Music Installation. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (pp. 1-12). ACM.
[72]
Hansen, K. F., Bresin, R., Holzapfel, A., Pauletto, S., Gulz, T., Lindetorp, H., Misgeld, O., Sköld, M. (2019). Student involvement in sound and music computing research : Current practices at KTH and KMH. In Combined Proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019. (pp. 36-42). Stockholm.
[73]
Serafin, S., Dahl, S., Bresin, R., Jensenius, A. R., Unnthorsson, R., Välimäki, V. (2018). NordicSMC : A Nordic university hub on sound and music computing. In Proceedings of the 15th Sound and Music Computing Conference: Sonic Crossings, SMC 2018. (pp. 124-128). Sound and Music Computing Network.
[74]
Frid, E., Bresin, R., Alexanderson, S. (2018). Perception of Mechanical Sounds Inherent to Expressive Gestures of a NAO Robot - Implications for Movement Sonification of Humanoids. In Proceedings of the 15th Sound and Music Computing Conference. Limassol, Cyprus.
[75]
Frid, E., Bresin, R., Sallnäs Pysander, E.-L., Moll, J. (2017). An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback. In Proceedings of the 14th Sound and Music Computing Conference. (pp. 242-249). Espoo, Finland.
[76]
Paloranta, J., Lundström, A., Elblaus, L., Bresin, R., Frid, E. (2016). Interaction with a large sized augmented string instrument intended for a public setting. In Sound and Music Computing 2016. (pp. 388-395). Hamburg: Zentrum für Mikrotonale Musik und Multimediale Komposition (ZM4).
[77]
Singh, A., Tajadura-Jiménez, A., Bianchi-Berthouze, N., Marquardt, N., Tentori, M., Bresin, R., Kulic, D. (2016). Mind the Gap: A SIG on Bridging the Gap in Research on Body Sensing, Body Perception and Multisensory Feedback. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. (pp. 1092-1095). New York, NY, USA.
[78]
Bresin, R., Elblaus, L., Frid, E., Favero, F., Annersten, L., Berner, D., Morreale, F. (2016). Sound Forest/Ljudskogen : A large-scale string-based interactive musical instrument. In Sound and Music Computing 2016. (pp. 79-84). Sound and Music Computing Network.
[79]
Frid, E., Elblaus, L., Bresin, R. (2016). Sonification of fluidity - An exploration of perceptual connotations of a particular movement feature. In Proceedings of ISon 2016, 5th Interactive Sonification Workshop. (pp. 11-17). Bielefeld, Germany.
[80]
Elblaus, L., Tsaknaki, V., Lewandowski, V., Bresin, R. (2015). Nebula: An Interactive Garment Designed for Functional Aesthetics. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. (pp. 275-278). New York, NY, USA: ACM.
[81]
Goina, M., Robitaille, M.-A., Bresin, R. (2014). Interactive sonification in circus performance at Uniarts and KTH : ongoing research. In Proceedings of the Sound and Music Computing Sweden Conference 2014. (pp. 23-24). KTH Royal Institute of Technology.
[82]
Elblaus, L., Goina, M., Robitaille, M.-A., Bresin, R. (2014). Modes of sonic interaction in circus : Three proofs of concept. In Proceedings of the Sound and Music Computing Conference 2014. (pp. 1698-1706). Athens.
[83]
Bresin, R., Elblaus, L., Falkenberg Hansen, K., Månsson, L., Tardat, B. (2014). Musikcyklarna/Music bikes : An installation for enabling children to investigate the relationship between expressive music performance and body motion. In Proceedings of the Sound and Music Computing Sweden Conference 2014. (pp. 1-2). KTH Royal Institute of Technology.
[84]
Elblaus, L., Hansen, K. F., Bresin, R. (2014). NIME Design and Contemporary Music Practice : Benefits and Challenges. Presented at the Workshop on Practice-Based Research in New Interfaces for Musical Expression, NIME 2014.
[85]
Frid, E., Bresin, R., Moll, J., Sallnäs Pysander, E.-L. (2014). Sonification of haptic interaction in a virtual scene. In Sound and Music Computing Sweden 2014, Stockholm, December 4-5, 2014. (pp. 14-16).
[86]
Dubus, G., Hansen, K. F., Bresin, R. (2012). An overview of sound and music applications for Android available on the market. In Proceedings of the 9th Sound and Music Computing Conference, SMC 2012. (pp. 541-546). Sound and Music Computing Network.
[87]
Hansen, K. F., Bresin, R. (2012). Use of soundscapes for providing information about distance left in train journeys. In Proceedings of the 9th Sound and Music Computing Conference, SMC 2012. (pp. 79-84). Sound and Music Computing Network.
[88]
Hansen, K. F., Dravins, C., Bresin, R. (2011). Ljudskrapan/The Soundscraper : Sound exploration for children with complex needs, accommodating hearing aids and cochlear implants. In Proceedings of the 8th Sound and Music Computing Conference, SMC 2011. (pp. 70-76). Sound and Music Computing Network.
[89]
Fabiani, M., Dubus, G., Bresin, R. (2011). MoodifierLive: Interactive and collaborative music performance on mobile devices. In Proceedings of the 11th International Conference on New Interfaces for Musical Expression (NIME'11).
[90]
Dubus, G., Bresin, R. (2011). Sonification of physical quantities throughout history: a meta-study of previous mapping strategies. In Proceedings of the 17th International Conference on Auditory Display (ICAD 2011). Budapest, Hungary: OPAKFI Egyesület.
[91]
Bresin, R., de Witt, A., Papetti, S., Civolani, M., Fontana, F. (2010). Expressive sonification of footstep sounds. In Proceedings of ISon 2010, 3rd Interactive Sonification Workshop. (pp. 51-54). Stockholm, Sweden: KTH Royal Institute of Technology.
[92]
Eriksson, M., Bresin, R. (2010). Improving running mechanics by use of interactive sonification. In Proceedings of ISon 2010, 3rd Interactive Sonification Workshop. (pp. 95-98). Stockholm, Sweden: KTH Royal Institute of Technology.
[93]
Fabiani, M., Dubus, G., Bresin, R. (2010). Interactive sonification of emotionally expressive gestures by means of music performance. In Proceedings of ISon 2010, 3rd Interactive Sonification Workshop. (pp. 113-116). Stockholm, Sweden: KTH Royal Institute of Technology.
[94]
Dubus, G., Bresin, R. (2010). Sonification of sculler movements, development of preliminary methods. In Proceedings of ISon 2010, 3rd Interactive Sonification Workshop. (pp. 39-43). Stockholm, Sweden: KTH Royal Institute of Technology.
[95]
Camurri, A., Bevilacqua, F., Bresin, R., Maestre, E., Penttinen, H., Seppänen, J., Välimäki, V., Volpe, G., Warusfel, O. (2009). Embodied music listening and making in context-aware mobile applications : the EU-ICT SAME project. Presented at the 8th International Gesture Workshop, Bielefeld, Germany, February 25-27, 2009.
[96]
Friberg, A., Bresin, R., Hansen, K. F., Fabiani, M. (2009). Enabling emotional expression and interaction with new expressive interfaces. In Front. Hum. Neurosci. Conference Abstract: Tuning the Brain for Music.
[97]
Bresin, R., Delle Monache, S., Fontana, F., Papetti, S., Polotti, P., Visell, Y. (2008). Auditory feedback through continuous control of crumpling sound synthesis. In Proceedings of Sonic Interaction Design: Sound, Information and Experience, a CHI 2008 workshop organized by COST Action IC0601. (pp. 23-28). IUAV University of Venice.
[98]
Hansen, K. F., Bresin, R., Friberg, A. (2008). Describing the emotional content of hip-hop DJ recordings. In The Neurosciences and Music III. (p. 565). Montreal: New York Academy of Sciences.
[99]
Vitale, R., Bresin, R. (2008). Emotional cues in knocking sounds. In Proceedings of the 10th International Conference on Music Perception and Cognition. (p. 276).
[100]
Bresin, R., Friberg, A. (2008). Influence of Acoustic Cues on the Expressive Performance of Music. In Proceedings of the 10th International Conference on Music Perception and Cognition. Sapporo, Japan.
[101]
Rocchesso, D., Serafin, S., Behrendt, F., Bernardini, N., Bresin, R., Eckel, G., Franinovic, K., Hermann, T., Pauletto, S., Susini, P., Visell, Y. (2008). Sonic Interaction Design : Sound, Information and Experience. In Conference on Human Factors in Computing Systems - Proceedings. (pp. 3969-3972). New York, NY, USA: ACM.
[102]
Bjurling, J., Bresin, R. (2008). Timing in piano music : Testing a model of melody lead. In Proceedings of the 10th International Conference on Music Perception and Cognition. Sapporo, Japan.
[103]
Hansen, K. F., Bresin, R. (2008). Verbal Description of DJ Recordings. In Proceedings of the 10th International Conference on Music Perception and Cognition. (p. 20). Sapporo.
[104]
Burger, B., Bresin, R. (2007). Displaying expression in musical performance by means of a mobile robot. In Affective Computing and Intelligent Interaction, Proceedings. (pp. 753-754).
[105]
Castellano, G., Bresin, R., Camurri, A., Volpe, G. (2007). Expressive Control of Music and Visual Media by Full-Body Movement. In Proceedings of the 7th International Conference on New Interfaces for Musical Expression, NIME '07. (pp. 390-391). New York, NY, USA: ACM Press.
[106]
De Witt, A., Bresin, R. (2007). Sound design for affective interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). (pp. 523-533).
[107]
Lindström, M., Ståhl, A., Höök, K., Sundström, P., Laaksolahti, J., Combetto, M., Taylor, A., Bresin, R. (2006). Affective diary : designing for bodily expressiveness and self-reflection. In CHI 2006 Work-in-Progress. (pp. 1037-1042). New York, NY, USA: ACM Press.
[108]
Puiggròs, M., Gómez, E., Ramírez, R., Serra, X., Bresin, R. (2006). Automatic characterization of ornamentation from bassoon recordings for expressive synthesis. In Proceedings of the 9th International Conference on Music Perception & Cognition. (pp. 1533-1538). Bologna: Bononia University Press.
[109]
Mancini, M., Bresin, R., Pelachaud, C. (2006). From acoustic cues to an expressive agent. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). (pp. 280-291).
[110]
Luis, I. F., Bresin, R. (2006). Influence of expressive music on the perception of short text messages. In Proceedings of the 9th International Conference on Music Perception & Cognition (ICMPC9). (p. 739). Bologna: Bononia University Press (abstract).
[111]
Hansen, K. F., Bresin, R. (2006). Mapping strategies in DJ scratching. In Proceedings of the Conference on New Interfaces for Musical Expression. (pp. 188-191). IRCAM.
[112]
Hansen, K. F., Bresin, R., Friberg, A. (2006). Principles for expressing emotional content in turntable scratching. In Proceedings of the 9th International Conference on Music Perception & Cognition. (pp. 532-533). Bologna: Bononia University Press.
[113]
Hiraga, R., Bresin, R., Katayose, H. (2006). Rencon 2005. In Proceedings of the 20th Annual Conference of the Japanese Society for Artificial Intelligence. (p. 1D2-1).
[114]
Giordano, B., Bresin, R. (2006). Walking and playing: What's the origin of emotional expressiveness in music?. In Proceedings of the 9th International Conference on Music Perception & Cognition (ICMPC9), Bologna, Italy, August 22-26, 2006. (p. 436). Bologna: Bononia University Press.
[115]
Mancini, M., Pelachaud, C., Bresin, R. (2005). Greta Listening to Expressive Music. Presented at the Gathering of Animated Lifelike Agents - GALA 2005, IVA.
[116]
Bresin, R. (2005). What is the color of that music performance?. In Proceedings of the International Computer Music Conference - ICMC 2005. (pp. 367-370). Barcelona.
[117]
Rocchesso, D., Avanzini, F., Rath, M., Bresin, R., Serafin, S. (2004). Contact sounds for continuous feedback. In Proceedings of the International Workshop on Interactive Sonification (Human Interaction with Auditory Displays).
[118]
Goebl, W., Bresin, R., Galembo, A. (2004). Once again: The perception of piano touch and tone : Can touch audibly change piano sound independently of intensity?. In Proceedings of the International Symposium on Musical Acoustics (ISMA 2004), Nara, Japan, March 31 - April 3, 2004. (pp. 332-335). Nara, Japan: The Acoustical Society of Japan, CD-ROM.
[119]
Bresin, R. (2004). Real-time visualization of musical expression. In Proceedings of the Network of Excellence HUMAINE Workshop "From Signals to Signs of Emotion and Vice Versa". (pp. 19-23).
[120]
Hiraga, R., Bresin, R., Hirata, K., Katayose, H. (2004). Rencon 2004: Turing Test for Musical Expression. In Proceedings of the 4th International Conference on New Interfaces for Musical Expression. (pp. 120-123). Hamamatsu, Shizuoka, Japan: National University of Singapore.
[121]
Hiraga, R., Bresin, R., Hirata, K., Katayose, H. (2003). After the first year of Rencon. In Proceedings of the 2003 International Computer Music Conference, ICMC 2003. International Computer Music Association.
[122]
Hansen, K. F., Bresin, R. (2003). DJ scratching performance techniques : Analysis and synthesis. In Proceedings of the Stockholm Music Acoustics Conference. (pp. 693-696).
[123]
Bresin, R., Hansen, K. F., Dahl, S. (2003). The Radio Baton as configurable musical instrument and controller. In Proceedings of the Stockholm Music Acoustics Conference. (pp. 689-691).
[124]
Friberg, A., Schoonderwaldt, E., Juslin, P. N., Bresin, R. (2002). Automatic real-time extraction of musical expression. In Proceedings of the International Computer Music Conference, ICMC 2002. (pp. 365-367).
[125]
Bresin, R., Friberg, A., Sundberg, J. (2002). Director musices : The KTH performance rules system. In Proceedings of SIGMUS-46. (pp. 43-48). Information Processing Society of Japan.
[126]
Bresin, R., Friberg, A. (2001). Expressive musical icons. In Proceedings of the International Conference on Auditory Display - ICAD 2001. (pp. 141-143).
[127]
Bresin, R., Friberg, A., Dahl, S. (2001). Toward a new model for sound control. In Proceedings of the COST G-6 Conference on Digital Audio Effects (DAFX-01), Limerick, Ireland, December 6-8, 2001. (pp. 45-49).
[128]
Bresin, R., Friberg, A. (2000). Rule-based emotional colouring of music performance. In Proceedings of the International Computer Music Conference - ICMC 2000. (pp. 364-367). San Francisco: ICMA.
[129]
Bresin, R., Friberg, A. (2000). Software tools for musical expression. In Proceedings of the 2000 International Computer Music Conference, ICMC 2000. (pp. 499-502). San Francisco: International Computer Music Association.
[131]
Bresin, R., Friberg, A. (1999). Synthesis and decoding of emotionally expressive music performance. In Proceedings of the IEEE 1999 Systems, Man and Cybernetics Conference - SMC'99. (pp. 317-322).
[132]
Bresin, R., Friberg, A. (1997). A multimedia environment for interactive music performance. In Proceedings of KANSEI - The Technology of Emotion, AIMI International Workshop. (pp. 64-67).
[133]
Friberg, A., Bresin, R. (1997). Automatic musical punctuation : A rule system and a neural network approach. In Proceedings of KANSEI - The Technology of Emotion, AIMI International Workshop. (pp. 159-163).
[134]
Battel, G. U., Bresin, R. (1993). Analysis by synthesis in piano performance - A study on the theme of Brahms' "Variations on a Theme of Paganini", op. 35. In Proceedings of SMAC 93 (Stockholm Music Acoustics Conference). (pp. 69-73). Stockholm: KTH Royal Institute of Technology.
[135]
Battel, G. U., Bresin, R., De Poli, G., Vidolin, A. (1993). Automatic performance of musical scores by means of neural networks : evaluation with listening tests. In X CIM Colloquium on Musical Informatics. (pp. 97-101).
[136]
Bresin, R., De Poli, G., Torelli, G. (1991). Applicazione delle reti neurali alla classificazione dei registri dell'organo a canne [Application of neural networks to the classification of pipe organ stops]. In Colloquio di Informatica Musicale - IX CIM. (pp. 112-114).
[137]
Bresin, R., Manduchi, R. (1989). Una sorgente di melodie con controllo di entropia [A melody source with entropy control]. In Colloquio di Informatica Musicale - VIII CIM. (pp. 213-215). Cagliari, Italy.
Book chapters
[139]
Falkenberg, K., Bresin, R., Holzapfel, A. & Pauletto, S. (2021). Musikkommunikation och ljudinteraktion [Music communication and sound interaction]. In Pernilla Falkenberg Josefsson & Mikael Wiberg (Eds.), Introduktion till medieteknik [Introduction to media technology] (pp. 155-166). Lund: Studentlitteratur AB.
[140]
Pauletto, S. & Bresin, R. (2021). Sonification Research and Emerging Topics. In Michael Filimowicz (Ed.), Doing Research in Sound Design (pp. 238-254). Routledge.
[141]
Friberg, A., Bresin, R. & Sundberg, J. (2014). Analysis by synthesis. In Thompson, W. F. (Ed.), Music in the Social and Behavioral Sciences. Los Angeles: Sage Publications.
[142]
Friberg, A., Bresin, R. & Sundberg, J. (2014). Expressive timing. In Thompson, W. F. (Ed.), Music in the Social and Behavioral Sciences (pp. 440-442). Los Angeles: Sage Publications.
[143]
Bresin, R. & Friberg, A. (2013). Evaluation of computer systems for expressive music performance. In Kirke, A. & Miranda, E. R. (Eds.), Guide to Computing for Expressive Music Performance (pp. 181-203). Springer.
[144]
Giordano, B. L., Susini, P. & Bresin, R. (2013). Perceptual evaluation of sound-producing objects. In Franinovic, K. & Serafin, S. (Eds.), Sonic Interaction Design (pp. 151-197). Boston, MA: MIT Press.
[145]
Fabiani, M., Friberg, A. & Bresin, R. (2013). Systems for Interactive Control of Computer Generated Music Performance. In Kirke, A. & Miranda, E. (Eds.), Guide to Computing for Expressive Music Performance (pp. 49-73). Springer Berlin/Heidelberg.
[146]
Giordano, B. L., Susini, P. & Bresin, R. (2012). Experimental methods for the perceptual evaluation of sound-producing objects and interfaces. In Franinovic, K. & Serafin, S. (Eds.), Sonic Interaction Design. Boston, MA: MIT Press.
[147]
Dahl, S., Bevilacqua, F., Bresin, R., Clayton, M., Leante, L., Poggi, I. & Rasamimanana, N. (2009). Gestures in performance. In Godøy, R. I. & Leman, M. (Eds.), Musical Gestures: Sound, Movement, and Meaning (pp. 36-68). New York: Routledge.
[148]
Camurri, A., Volpe, G., Vinet, H., Bresin, R., Fabiani, M., Dubus, G. ... Seppanen, J. (2009). User-centric context-aware mobile applications for embodied music listening. In Akan, O., Bellavista, P., Cao, J., Dressler, F., Ferrari, D., Gerla, M., Kobayashi, H., Palazzo, S., Sahni, S., Shen, X., Stan, M., Xiaohua, J., Zomaya, A., Coulson, G., Daras, P. & Ibarra, O. M. (Eds.), User Centric Media (pp. 21-30). Heidelberg: Springer Berlin.
[149]
Bresin, R., Hansen, K. F., Karjalainen, M., Mäki-Patola, T., Kanerva, A., Huovilainen, A. ... Rocchesso, D. (2008). Controlling sound production. In Polotti, P. & Rocchesso, D. (Eds.), Sound to Sense, Sense to Sound: A state of the art in Sound and Music Computing (pp. 447-486). Berlin: Logos Verlag.
[150]
Friberg, A. & Bresin, R. (2008). Real-time control of music performance. In Polotti, P. & Rocchesso, D. (Eds.), Sound to Sense - Sense to Sound: A state of the art in Sound and Music Computing (pp. 279-302). Berlin: Logos Verlag.
[151]
Goebl, W., Dixon, S., De Poli, G., Friberg, A., Bresin, R. & Widmer, G. (2008). Sense in expressive music performance: Data acquisition, computational studies, and models. In Polotti, P. & Rocchesso, D. (Eds.), Sound to Sense - Sense to Sound: A state of the art in Sound and Music Computing (pp. 195-242). Berlin: Logos Verlag.
[152]
Rocchesso, D. & Bresin, R. (2007). Emerging sounds for disappearing computers. In Streitz, N., Kameas, A. & Mavrommati, I. (Eds.), The Disappearing Computer (pp. 233-254). Berlin/Heidelberg: Springer.
[153]
Castellano, G., Bresin, R., Camurri, A. & Volpe, G. (2007). User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements. In Paiva, A., Prada, R. & Picard, R. W. (Eds.), Affective Computing and Intelligent Interaction (pp. 501-510). Berlin/Heidelberg: Springer.
[154]
Hansen, K. F. & Bresin, R. (2003). Complex gestural audio control : The case of scratching. In Rocchesso, D. & Fontana, F. (Eds.), The Sounding Object (pp. 221-269). Mondo Estremo.
[156]
Bresin, R., Falkenberg Hansen, K., Dahl, S., Rath, M., Marshall, M. & Moynihan, B. (2003). Devices for manipulation and control of sounding objects: The Vodhran and the Invisiball. In Rocchesso, D. & Fontana, F. (Eds.), The Sounding Object (pp. 271-295). Mondo Estremo.
Non-peer-reviewed
Articles
[157]
Ziemer, T., Lenzi, S., Rönnberg, N., Hermann, T. & Bresin, R. (2023). Introduction to the special issue on design and perception of interactive sonification. Journal on Multimodal User Interfaces, 17(4), 213-214.
[158]
Yang, J., Hermann, T. & Bresin, R. (2019). Introduction to the special issue on interactive sonification. Journal on Multimodal User Interfaces, 13(3), 151-153.
[159]
Bresin, R., Askenfelt, A., Friberg, A., Hansen, K. & Ternström, S. (2012). Sound and Music Computing at KTH. Trita-TMH, 52(1), 33-35.
[160]
Bresin, R. & Friberg, A. (1998). Emotional expression in music performance : synthesis and decoding. TMH-QPSR, 39(4), 85-94.
[161]
Bresin, R. & Friberg, A. (1997). A multimedia environment for interactive music performance. TMH-QPSR, 38(2-3), 29-32.
Conference papers
[162]
Bresin, R., Falkenberg, K., Holzapfel, A., Pauletto, S. (2021). KTH Royal Institute of Technology - Sound and Music Computing (SMC) Group. In Proceedings of the Sound and Music Computing Conferences 2021. (pp. xxv-xxvi). Sound and Music Computing Network.
Book chapters
[163]
Dahl, S., Bevilacqua, F., Bresin, R., Clayton, M., Leante, L., Poggi, I. & Rasamimanana, N. (2010). Gestures in Performance. In Musical Gestures: Sound, Movement, and Meaning (pp. 36-68). Taylor and Francis.
Theses
[164]
Bresin, R. (2000). Virtual virtuosity (Doctoral dissertation, KTH, Stockholm, Trita-TMH 2000:9). Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3049.
Proceedings (editorship)
[165]
Bresin, R. (Ed.). (2014). SMC Sweden 2014 : Sound and Music Computing: Bridging science, art, and industry. Stockholm: KTH Royal Institute of Technology.
[166]
Bresin, R., Hermann, T., Hunt, A. (Eds.). (2010). Proceedings of ISon 2010 - Interactive Sonification Workshop : Human Interaction with Auditory Displays. Stockholm: KTH School of Computer Science and Communication (CSC).
Other
[167]
Dubus, G., Bresin, R. (n.d.). Evaluation of a system for the sonification of elite rowing in an interactive context. (Manuscript).
[168]
Latupeirissa, A. B., Murdeshwar, A., Bresin, R. (n.d.). Semiotic analysis of robot sounds in films: implications for sound design in social robotics. (Manuscript).
Last synchronized with DiVA:
2024-11-18 00:09:14