{"id":1564,"date":"2018-07-23T16:24:54","date_gmt":"2018-07-23T16:24:54","guid":{"rendered":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/?p=1564"},"modified":"2025-02-26T13:22:03","modified_gmt":"2025-02-26T13:22:03","slug":"ma-museum-studies-placement-in-special-collections-weeks-1-2","status":"publish","type":"post","link":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/2018\/07\/23\/ma-museum-studies-placement-in-special-collections-weeks-1-2\/","title":{"rendered":"MA Museum Studies Placement in Special Collections, Weeks 1-2"},"content":{"rendered":"<h3><\/h3>\n<h3><span style=\"color: #333399\">Guest post from Yineng Zhu, Andrew Permain and Joe Searle, MA Museum Studies students working with the Archives &amp; Special Collections team.<\/span><\/h3>\n<p><span style=\"color: #000000\"><strong>Yineng<\/strong><\/span><\/p>\n<p>Hello, I\u2019m Yineng Zhu and I have\u00a0been doing a placement with Special Collections in the Library as part of my MA in Museum Studies. My project is about the University Library\u2019s history. There are some tasks in this project. Firstly, I need to do some research on the historical background to collect and identify the significant events and key individuals in the Library\u2019s history. And then creating a timeline online to record the information I gathered before. Moreover, I need to prepare and install a physical exhibition which focuses on a specific theme, individuals, or key dates in the Library\u2019s history. Meanwhile, I will learn how to carry out an oral history interview and summarize recording. Finally, I will also know more about digitization and OMEKA to digitize archives, as well as creating metadata and exhibition page.<\/p>\n<p>&nbsp;<\/p>\n<p>In the last two weeks, I did some literature review and made a detailed note about the University Library\u2019s history. Recently, I began creating a timeline by using the software called Tiki Toki. 
To illustrate these themes convincingly, I plan to choose appropriate documents, such as digital photographs, archival documents, and web links. I will also group the historical themes by decade, each marked in a different colour; examples of themes include the appointment of staff, the opening of new library buildings, and various donations. The image below shows the timeline I have recently started creating.<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34.png\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-1569 aligncenter\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34-300x173.png\" alt=\"\" width=\"512\" height=\"295\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34-300x173.png 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34-768x442.png 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34-1024x589.png 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/\u5c4f\u5e55\u5feb\u7167-2018-07-20-11.19.34.png 1277w\" sizes=\"auto, (max-width: 512px) 100vw, 512px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\"><strong>Andrew<\/strong><\/span><\/p>\n<p>The project that I have been tasked with researching is the history of the Fielding Johnson Building, a history of over 180 years stretching back to 1837. This grand building has seen much change since it began life as the Leicestershire and Rutland County Lunatic Asylum. Today it is still standing, having survived two world wars and various alterations to make it suitable for the teaching of students. 
Over the past two weeks, I have spent many an hour sitting in the reading room examining various materials relating to this grand and significant building&#8217;s history, including one book dedicated solely to the history of the 5<sup>th<\/sup> Northern General Hospital (see image). Through a thorough reading and analysis of this material I have learnt much about the building&#8217;s history. I discovered that in the more recent editions of the University\u2019s reports, the University\u2019s famous geneticist Sir Alec Jeffreys is, bar one photo, always pictured with his DNA fingerprinting invention. Another thing I discovered is that the Fielding Johnson Building at one time contained an aquarium.<\/p>\n<p>&nbsp;<\/p>\n<p>Finally, I wish you all to remember the University\u2019s motto, particularly as it has now disappeared from the University\u2019s logo.<\/p>\n<p>&nbsp;<\/p>\n<p><em>Ut vitam habeant<\/em>: &#8216;so that they may have life&#8217;.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-1443 aligncenter\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/01\/001_Cover2-653x1024.jpg\" alt=\"\" width=\"324\" height=\"508\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/01\/001_Cover2-653x1024.jpg 653w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/01\/001_Cover2-191x300.jpg 191w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/01\/001_Cover2-768x1204.jpg 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/01\/001_Cover2.jpg 1250w\" sizes=\"auto, (max-width: 324px) 100vw, 324px\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Joe<\/strong><\/p>\n<p>Hello, my name is Joe and I am an MA Museum Studies student currently on a work experience placement with the Special Collections team at the Library of the University of Leicester. 
My official role for the next 8 weeks is \u2018Digital Curator in Residence\u2019 for the new Digital Reading Room \u2013 which simply means that I will be exploring different ways of presenting digitised archival content from Special Collections, using the cutting-edge technology available in the room. This blog will be a record of my progress over those weeks, as well as a place for me to explore my thoughts and ideas in what will most assuredly be an interesting placement.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Many Meetings<\/strong><\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>The beginning of the week was mostly spent with all the necessary introductions and formalities that accompany a new endeavour. In addition to meeting some of the Library staff (and the excellent Special Collections team), I got to meet Alex Moseley \u2013 the University\u2019s resident expert on play and game-based learning \u2013 for a discussion about the parameters of my task and some possible approaches. This was followed by my somewhat unexpected participation in a workshop taking place in the Digital Reading Room, which was led by Ashley James Brown \u2013 a digital artist and \u2018creative technologist\u2019. The workshop had our group exploring novel input methods for digital technology by exploiting the principles of conductivity using simple circuits, and (rather surprisingly) some fruit. Though the workshop was certainly entertaining, I had hoped to gain some inspiration on how the multi-touch technology in the room could be used creatively, and this was sadly lacking.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>A Literature Review<\/strong><\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>My main task for this first week was to review the literature on the use of multi-touch displays in educational and heritage contexts, which should help to inform my own projects going forward. 
To help get me started, I had a meeting with librarian Jackie Hanes, who had some useful advice for conducting literature searches. Then, it was left to me to explore the literature and present my findings, which can be found below.<\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>Having had no previous experience in this field (my undergraduate degree was in History), I had no prior expectations, but as someone who has always had an interest in technology, I found it a fascinating exploration into a subject far beyond my usual domain. It was interesting to find that there is an entire field of study called Human-Computer Interaction (HCI), which seeks to explore and optimise the various ways in which people work with computers. In particular, there has been a large body of work exploring different methods of interaction beyond the now-standard windows, icons, menus and pointers (or WIMP, for short) on a standard computer monitor, and this is where we can find much of the research related to the use of multi-touch displays. As this technology has improved and matured, many have been interested in its possible applications, and many companies today extol the benefits of multi-touch technology to a wide variety of clients.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>If we first turn our attention to the marketing material produced by the manufacturers of multi-touch displays, we can see what kinds of experiences are being offered, and find that the message is remarkably consistent. Consider ProMultis (the same company that supplied the technology in the Digital Reading Room), which claims that multi-touch \u2018allows interaction and collaboration between users\u2019<a href=\"#_edn1\" name=\"_ednref1\">[1]<\/a>. 
Similarly, we are told that modern multi-touch technology can transform mundane displays into \u2018an engaging multi-user interactive computing interface\u2019<a href=\"#_edn2\" name=\"_ednref2\">[2]<\/a> and turn visitors \u2018from passive observers to actively engaged participants\u2019<a href=\"#_edn3\" name=\"_ednref3\">[3]<\/a>. The common experience being sold seems to be an engaging and collaborative one \u2013 and we can now turn to the academic literature to assess whether these claims hold any truth.<\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>Many researchers in the field of HCI have naturally been curious about the possible benefits of multi-touch technology, especially when compared with the more traditional computer experience. One concept that has particularly gripped academics within the field of HCI is the idea of the \u2018digital desk\u2019 \u2013 a horizontal interactive display that replaces or augments the traditional office desk<a href=\"#_edn4\" name=\"_ednref4\">[4]<\/a>. The term \u2018desk\u2019 is important here, as it implies use in a work environment for the purposes of improving productivity. 
Horizontal tabletop displays have been singled out specifically as being good candidates for encouraging group collaboration<a href=\"#_edn5\" name=\"_ednref5\">[5]<\/a>, which is supported by research showing that horizontal displays are better at facilitating group collaboration than vertical displays<a href=\"#_edn6\" name=\"_ednref6\">[6]<\/a>, and that the nature of multi-touch technology itself seems to invite interaction, leading to more equitable participation within groups.<a href=\"#_edn7\" name=\"_ednref7\">[7]<\/a> However, it should be noted that these studies have mainly been focused on group performance in work environments, and that when these multi-touch devices have been transferred to public settings \u2013 such as museums and galleries \u2013 the results have been less positive.<a href=\"#_edn8\" name=\"_ednref8\">[8]<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>Why, then, do multi-touch devices struggle to create the same levels of group collaboration in public environments as can be seen in work environments? The answer lies with the different ways that these devices are used in each environment, as well as with the different expectations of potential users. In work environments, for example, groups are working together with a shared purpose to solve specific problems over a long period, and these users are more likely to learn how to use the available technology effectively. In public environments, however, multi-touch devices have to serve a dual purpose \u2013 first to attract potential users to interact with the device, then to maintain their interest long enough for the device to fulfil its purpose.<a href=\"#_edn9\" name=\"_ednref9\">[9]<\/a> In addition, users in public environments will have vastly different levels of digital literacy and computing experience, making it supremely difficult to create an experience that will appeal to everyone. 
An interesting question to ponder at this stage is whether the Digital Reading Room is a \u2018public\u2019 space or a \u2018work\u2019 space \u2013 since the answer has important implications for the type of content that will ultimately be successful on the technology in the room.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>One crucial finding that emerges continually from the research literature is that multi-touch devices installed in public spaces strongly encourage sociality. This concept has even been given a name \u2013 the \u2018honey-pot effect\u2019 \u2013 which describes the phenomenon of an in-use multi-touch interactive inexorably drawing in other people.<a href=\"#_edn10\" name=\"_ednref10\">[10]<\/a> This effect has been very well documented.<a href=\"#_edn11\" name=\"_ednref11\">[11]<\/a> Although in one sense this is certainly positive \u2013 it ensures that these devices are in near-constant use in busy periods and they can often become the \u2018star\u2019 feature of an exhibition<a href=\"#_edn12\" name=\"_ednref12\">[12]<\/a> \u2013 this also raises some key issues. For example, the high amount of attention directed at multi-touch devices can be off-putting for some, especially when the interactions are visible to a large audience, as many people are afraid of getting it \u2018wrong\u2019 and looking foolish.<a href=\"#_edn13\" name=\"_ednref13\">[13]<\/a> This seems to depend on age, as children are often more eager to dive in and begin interacting, whereas adults are generally more cautious and prefer to watch others first before interacting themselves.<a href=\"#_edn14\" name=\"_ednref14\">[14]<\/a> Another key issue is the potential for conflict. As more people begin interacting, there is always the chance that the actions of one will interfere with the actions of another \u2013 sometimes intentionally, but often by accident. 
Although HCI researchers have proposed various methods of dealing with this problem<a href=\"#_edn15\" name=\"_ednref15\">[15]<\/a>, these solutions have rarely carried over from academia into the real world \u2013 and conflicts, if they do arise, often have to be resolved by the users themselves.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>Another related finding is that many people are still attracted to the novelty of large multi-touch displays \u2013 seemingly despite our increasing familiarity with touch devices. This causes problems for designers, for although it is easy to attract attention with playful methods of interaction, it is tremendously difficult to get users to progress further towards meaningful interaction with the actual content.<a href=\"#_edn16\" name=\"_ednref16\">[16]<\/a> In museums, it has been found that \u2018stay-time\u2019 \u2013 the measure of how long visitors remain at any one feature \u2013 is often higher for multi-touch tables than elsewhere.<a href=\"#_edn17\" name=\"_ednref17\">[17]<\/a> But these findings are dampened somewhat by the now-common knowledge that stay-times at traditional exhibits are often shockingly low, usually less than a minute on average<a href=\"#_edn18\" name=\"_ednref18\">[18]<\/a>, while multi-touch displays seem to average about 2-3 minutes. Based on such numbers, it is difficult to justify the often substantial costs of procuring multi-touch displays and developing content for them. Using Whitton and Moseley\u2019s synthesised model of engagement from the fields of education and game studies, we find that very few multi-touch interactives move beyond \u2018superficial engagement\u2019 \u2013 which is characterised by simple attention and participation in a task. 
What is needed is \u2018deep engagement\u2019, which is characterised by an emotional investment that fosters an intrinsic desire to continue the task.<a href=\"#_edn19\" name=\"_ednref19\">[19]<\/a> This tallies well with the museum studies literature, which has found that interactive stations providing \u2018real interactivity\u2019 \u2013 such as the creation of personal content \u2013 were preferred over stations that simply had \u2018flat predefined interaction\u2019.<a href=\"#_edn20\" name=\"_ednref20\">[20]<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>Some of the more successful attempts at promoting this deeper engagement have come from multi-touch interactive games. One example required three players to work together in a game about sustainable development, where each player was assigned to a resource (food, shelter or energy) and victory could only be achieved through careful balancing and management of these resources, which required negotiation and collaboration.<a href=\"#_edn21\" name=\"_ednref21\">[21]<\/a> Here, we can see some of the strengths of multi-touch displays manifesting themselves, but it must be said that simply designing an engaging multi-user game is not a blueprint for success. As noted earlier, the social nature of multi-touch displays can sometimes work against the designer\u2019s intentions, and a clear example of this can be seen in museums, where it has been found that adults may be more reluctant to use multi-touch tables after seeing children interacting with them \u2013 which suggests that the tables were perceived as toys<a href=\"#_edn22\" name=\"_ednref22\">[22]<\/a>, and similarly, that some parents refused to let their children use a multi-touch display, believing it to be a distraction from the \u2018real\u2019 knowledge on offer at the museum.<a href=\"#_edn23\" name=\"_ednref23\">[23]<\/a> Clearly, game-based interaction can be a powerful tool, but it is not a panacea for the engagement problem. 
Indeed, the authors of one study found that they had created an experience that was <em>too engaging<\/em> \u2013 meaning that the visitors were too focused on solving the challenges set before them and had little opportunity to reflect on the deeper issues of the content as intended.<a href=\"#_edn24\" name=\"_ednref24\">[24]<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>Finally, a brief note on interaction. It is sometimes assumed that multi-touch interaction, or touch interaction more generally \u2013 aided by gestures such as finger scrolling and pinch-to-zoom \u2013 is somehow more \u2018natural\u2019 for users, as opposed to using windows, icons, menus and pointers. However, as Donald Norman \u2013 a respected authority on usability design \u2013 has noted, \u2018natural user interfaces are not natural\u2019.<a href=\"#_edn25\" name=\"_ednref25\">[25]<\/a> Clear evidence of this can be seen in some studies of multi-touch displays where users, firmly entrenched in the traditional paradigm of WIMP interactions, struggle to grasp seemingly \u2018natural\u2019 interactions<a href=\"#_edn26\" name=\"_ednref26\">[26]<\/a>. We therefore cannot assume that simply by mimicking real-world interactions on multi-touch displays our users will somehow automatically know how to interact with them. All of this is not to deny the utility of touch interaction or gestures \u2013 they are certainly useful tools and, as noted earlier, often provide a captivating and \u2018delightful\u2019 experience for users.<a href=\"#_edn27\" name=\"_ednref27\">[27]<\/a> Yet equally we cannot ignore decades of work on how to design good user experiences for computers. 
Multi-touch interaction is still relatively new, so we must constantly be ready to experiment and find what content best fits this method of interaction, and of course, always conduct thorough testing with real users in real environments.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>In summary, some care should be taken to separate the reality of multi-touch use from the hype that often surrounds it. But to end on a note of optimism, it is unlikely that we have yet seen the full potential of multi-touch technology. Now is certainly an exciting time, given the widespread proliferation of multi-touch devices and the development of software designed to make creating interactive experiences easy. I will be using one such package \u2013 IntuiFace \u2013 to design and develop my own interactive experiences for the technology in the Digital Reading Room, and this will form the bulk of my placement for the next seven weeks. Hopefully, the insight gained from this literature review will give me a good foundation on which to build, and I am very much looking forward to exploring how multi-touch technology may best be used.<\/p>\n<p>&nbsp;<\/p>\n<p>Having finished my review of the literature on the use of multi-touch displays in various contexts, I could now begin learning how to create these experiences for myself. To do this, I have been using \u2018IntuiFace\u2019 \u2013 a software package designed to create interactive experiences for touchscreen displays without having to write code.<\/p>\n<p>&nbsp;<\/p>\n<p>My first exposure to IntuiFace began in Week 1, when I was introduced to Library staff who had previously developed content using the program. 
Examining their work provided a helpful starting point, but unfortunately no-one had accrued a significant amount of experience with IntuiFace (the Library staff are, after all, very busy), so it was left to me to begin teaching myself how to use it.<\/p>\n<p>&nbsp;<\/p>\n<p>Thankfully, there exists a wealth of information online to help new users. I began the week by working through this material and completing the IntuiFace tutorial, which guides users through creating their first experience. Though certainly useful, this tutorial only covers a tiny fraction of what it is possible to build using IntuiFace, so the best way to learn from this point was to begin building my own prototypes.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Prototype 1 \u2013 Timeline of the First World War<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: left\">When considering what to build, I reminded myself that I needed to somehow leverage the unique advantages offered by large multi-touch displays. Bearing this in mind, I decided to develop an interactive timeline, which seemed to offer a good balance between information and interactivity. The timeline showed major events in the First World War, a subject I chose simply because the information was readily available and familiar to me, which allowed me to concentrate purely on its design. 
Developing this prototype helped to sharpen my skills with IntuiFace, and also served as a valuable proof-of-concept \u2013 if my final output features a timeline (which seems likely), then I can draw from my experience of building one here.<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-1588\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline-1024x576.png\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline-1024x576.png 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline-300x169.png 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline-768x432.png 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/IntuiFaceTimeline.png 1920w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><\/p>\n<h6><strong>Figure 1:<\/strong> The Timeline Prototype viewed inside IntuiFace<\/h6>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Prototype 2 \u2013 Interactive Map of Leicester<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>My placement requires me to showcase digitised content from the University\u2019s Archives and Special Collections, which at the time of writing, mostly seems to feature the local history of Leicester. The main problem I am facing is that it is difficult to find ways of making this material compelling to interact with. One solution I have developed is my second prototype, an interactive map of Leicester, which uses material from the \u2018Vanished Leicester\u2019 collection \u2013 a series of photographs of streets and buildings in Leicester which have since been demolished. 
Thanks to a previous project, some of these photographs now have GPS data, which allows me to accurately plot their location on a map in IntuiFace. This data has <a href=\"https:\/\/fusiontables.googleusercontent.com\/embedviz?q=select+col10+from+1s5kc3yrw6ZdvNBOU8KW9myfYTFETw5_2f8IylQkd&amp;viz=MAP&amp;h=false&amp;lat=52.63770776639928&amp;lng=-1.1252495788714896&amp;t=1&amp;z=13&amp;l=col10&amp;y=2&amp;tmplt=2&amp;hml=GEOCODABLE\">already been plotted<\/a> onto Google Maps using Google Fusion Tables, but the advantage of importing this into IntuiFace is that I can build much more interactivity than is possible with Google Maps. It remains to be seen if this idea will be developed further, but building it certainly required me to dig deeper into the more advanced features of IntuiFace, which will help me in my future projects.<\/p>\n<p>&nbsp;<\/p>\n<h5><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-1589\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay-1024x576.jpg\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay-1024x576.jpg 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay-300x169.jpg 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay-768x432.jpg 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/MapPlay.jpg 1920w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><\/h5>\n<h6><strong>Figure 2: <\/strong>The Interactive Map experience, running in IntuiFace Player<\/h6>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Prototype 3 \u2013 Exploring Interactivity<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>My third prototype was not a single project \u2013 rather, I thought it was valuable to experiment with various different methods of interaction using 
touch. This was mainly to satisfy my own curiosity: since I have never before developed content for touch devices, I wanted to get a sense of what interactions are possible. First, I decided to rebuild some previous work \u2013 the \u2018Trustometer\u2019 \u2013 and try to add more interactivity to it. This was a simple experience that had users sort various sources by their reliability. My contribution was to let users sort the sources into two boxes \u2013 more reliable and less reliable \u2013 with each source disappearing when placed in the correct box. Interestingly, I remade this experience a second time when I discovered a way to build the same functionality in a more efficient manner.<\/p>\n<p>&nbsp;<\/p>\n<h5><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-1590\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay-1024x576.png\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay-1024x576.png 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay-300x169.png 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay-768x432.png 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/SourcesPlay.png 1920w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><\/h5>\n<h6><strong>Figure 3:<\/strong> My rebuilt version of the &#8216;Trustometer&#8217; experience, with added interactivity<\/h6>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>My other experiments were short attempts to achieve a specific effect. 
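<\/p>\n<p>The logic behind the Trustometer sorting \u2013 deciding which box a dragged source card was released over, and whether that box is the correct one \u2013 can be sketched in a few lines of plain JavaScript. This is an illustrative sketch only: IntuiFace builds such interactions without code, and the function and property names below are my own, not part of IntuiFace.<\/p>

```javascript
// Axis-aligned hit test: is the centre of a dragged card inside a drop zone?
// Rectangles are plain objects of the form {x, y, width, height}.
function centreInZone(card, zone) {
  const cx = card.x + card.width / 2;
  const cy = card.y + card.height / 2;
  return cx >= zone.x && cx <= zone.x + zone.width &&
         cy >= zone.y && cy <= zone.y + zone.height;
}

// When a card is released, decide the outcome: sorted correctly (so the
// card can disappear), sorted wrongly, or dropped outside both boxes.
// `card.reliable` is a boolean flag on the source being sorted.
function onRelease(card, moreReliableZone, lessReliableZone) {
  if (centreInZone(card, moreReliableZone)) {
    return card.reliable ? "correct" : "wrong";
  }
  if (centreInZone(card, lessReliableZone)) {
    return card.reliable ? "wrong" : "correct";
  }
  return "none"; // released outside both boxes: leave the card where it is
}
```

<p>Testing only the centre of the card, rather than its full outline, keeps the check forgiving for imprecise touch input on a large table.<\/p>\n<p>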
For example, I discovered ways to build parallax scrolling, an interactive before\/after image comparison, and an \u2018x-ray\u2019 effect where users can drag a box that reveals hidden details in an image. These experiments have helped me to see different methods of interaction beyond simple taps and gestures, and they will no doubt feature in my future projects.<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-1591\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison-1024x576.jpg\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison-1024x576.jpg 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison-300x169.jpg 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison-768x432.jpg 768w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/ImageComparison.jpg 1920w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><\/p>\n<h6 style=\"text-align: left\"><strong>Figure 4:<\/strong> Before\/after image comparison with draggable slider<\/h6>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-1592\" src=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage-1024x576.jpg\" alt=\"\" width=\"620\" height=\"349\" srcset=\"https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage-1024x576.jpg 1024w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage-300x169.jpg 300w, https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage-768x432.jpg 768w, 
https:\/\/staffblogs.le.ac.uk\/specialcollections\/files\/2018\/07\/XRayImage.jpg 1920w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/a><\/p>\n<h6><strong>Figure 5:<\/strong> &#8216;X-Ray&#8217; Image viewer<\/h6>\n<p>&nbsp;<\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p style=\"text-align: left\">Within the past week, I have gained a significant amount of experience using IntuiFace, and now feel confident using its basic functions \u2013 and even some of its more advanced features. There is certainly more to learn, but for now, my knowledge is sufficient to build working prototypes and explore possible ideas. Next week, I will be testing how content performs on the technology in the Digital Reading Room, and see what special considerations may have to be taken, for example, when publishing content on the interactive wall versus the table. This will ultimately result in a complete prototype from which we can gain valuable information about the potential of multi-touch technology.<\/p>\n<p>&nbsp;<\/p>\n<h4><strong>References<\/strong><\/h4>\n<h6><a href=\"#_ednref1\" name=\"_edn1\">[1]<\/a> Promultis, \u2018Why Multi Touch?\u2019, &lt;<a href=\"http:\/\/www.promultis.info\/why-multi-touch\">http:\/\/www.promultis.info\/why-multi-touch<\/a>\/&gt;.<\/h6>\n<h6><a href=\"#_ednref2\" name=\"_edn2\">[2]<\/a> MODE Systems, \u2018Touch Table Installation\u2019, &lt;<a href=\"http:\/\/modesystems.com\/services\/touch-table-installation.htm\">http:\/\/modesystems.com\/services\/touch-table-installation.htm<\/a>l&gt;.<\/h6>\n<h6><a href=\"#_ednref3\" name=\"_edn3\">[3]<\/a> Planar, \u2018Trends in the use of digital displays in museum environments\u2019, &lt;<a href=\"http:\/\/www.planar.com\/blog\/2015\/1\/13\/trends-in-the-use-of-digital-displays-in-museum-environments\/\">http:\/\/www.planar.com\/blog\/2015\/1\/13\/trends-in-the-use-of-digital-displays-in-museum-environments\/<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref4\" name=\"_edn4\">[4]<\/a> Pierre David Wellner, 
\u2018Interacting with Paper on the Digital Desk\u2019, 1994, &lt;<a href=\"https:\/\/www.cl.cam.ac.uk\/techreports\/UCAM-CL-TR-330.pdf\">https:\/\/www.cl.cam.ac.uk\/techreports\/UCAM-CL-TR-330.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref5\" name=\"_edn5\">[5]<\/a> Yvonne Rogers, Youn-Kyung Lim and William R. Hazlewood, \u2018Extending Tabletops to Support Flexible Collaborative Interactions\u2019, <em>Proceedings of the First IEEE International Workshop on Horizontal Interactive Human-Computer Systems<\/em>, 2006, &lt;<a href=\"http:\/\/oro.open.ac.uk\/19542\/1\/rogersTabletop06.pdf\">http:\/\/oro.open.ac.uk\/19542\/1\/rogersTabletop06.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref6\" name=\"_edn6\">[6]<\/a> Yvonne Rogers and Si\u00e2n Lindley, \u2018Collaborating around vertical and horizontal large interactive displays: which way is best?\u2019, <em>Interacting with Computers<\/em>, 16, no.6 (2004), pp. 1133-1152.<\/h6>\n<h6><a href=\"#_ednref7\" name=\"_edn7\">[7]<\/a> Paul Marshall, Eva Hornecker, Richard Morris, Nick Sheep Dalton and Yvonne Rogers, \u2018When the fingers do the talking: A study of group participation with varying constraints to a tabletop interface\u2019, <em>3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems<\/em>, (2008), &lt;<a href=\"http:\/\/oro.open.ac.uk\/19521\/1\/marshallTabletop08.pdf\">http:\/\/oro.open.ac.uk\/19521\/1\/marshallTabletop08.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref8\" name=\"_edn8\">[8]<\/a> Chris Creed, Joseph Sivell and John Sear, \u2018Multi-Touch Tables for Exploring Heritage Content in Public Spaces\u2019 in Eugene Ch&#8217;ng, Henry Chapman and Vincent Gaffney (eds.), <em>Visual Heritage in a Digital Age<\/em>, (London: Springer, 2013), p.
71.<\/h6>\n<h6><a href=\"#_ednref9\" name=\"_edn9\">[9]<\/a> Petra Isenberg, Uta Hinrichs, Mark Hancock, and Sheelagh Carpendale, \u2018Digital Tables for Collaborative Information Exploration\u2019 in Christian M\u00fcller-Tomfelde (ed.), <em>Tabletops \u2013 Horizontal Interactive Displays<\/em>, (London: Springer, 2010).<\/h6>\n<h6><a href=\"#_ednref10\" name=\"_edn10\">[10]<\/a> \u2018Jeff Heywood on the &#8220;Novelty&#8221; and &#8220;Honey Pot&#8221; effects of multitouch exhibits\u2019, <em>Open Exhibits Blog<\/em>, (2011), &lt;<a href=\"http:\/\/openexhibits.org\/accessibility\/Jeff-Heywood-on-the-quotNoveltyquot-and-quotHoney-Potquot-effects-of-multitouch-exhibits\/3079\/\">http:\/\/openexhibits.org\/accessibility\/Jeff-Heywood-on-the-quotNoveltyquot-and-quotHoney-Potquot-effects-of-multitouch-exhibits\/3079\/<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref11\" name=\"_edn11\">[11]<\/a> See for example:<\/h6>\n<h6>Uta Hinrichs, Holly Schmidt and Sheelagh Carpendale, \u2018EMDialog: Bringing Information Visualization into the Museum\u2019, <em>IEEE Transactions on Visualization and Computer Graphics<\/em>, 14, no.6, (2008), pp.
1181-1188.<\/h6>\n<h6>Uta Hinrichs and Sheelagh Carpendale, \u2018Interactive Tables in the Wild: Visitor Experiences with Multi-Touch Tables in the Arctic Exhibit at the Vancouver Aquarium\u2019, (2011), &lt;<a href=\"http:\/\/innovis.cpsc.ucalgary.ca\/innovis\/uploads\/Publications\/Publications\/interactiveTablesInTheWild.pdf\">http:\/\/innovis.cpsc.ucalgary.ca\/innovis\/uploads\/Publications\/Publications\/interactiveTablesInTheWild.pdf<\/a>&gt;.<\/h6>\n<h6>Harry Brignull and Yvonne Rogers, \u2018Enticing People to Interact with Large Public Displays in Public Spaces\u2019, <em>Human Computer Interaction \u2013 INTERACT03<\/em>, (2003), &lt;<a href=\"http:\/\/www.idemployee.id.tue.nl\/g.w.m.rauterberg\/conferences\/interact2003\/INTERACT2003-p17.pdf\">http:\/\/www.idemployee.id.tue.nl\/g.w.m.rauterberg\/conferences\/interact2003\/INTERACT2003-p17.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref12\" name=\"_edn12\">[12]<\/a> Jenny Kidd, Irida Ntalla and William Lyons, \u2018Multi-touch Interfaces in Museum Spaces: Reporting Preliminary Findings on the Nature of Interaction\u2019, (2011), &lt;<a href=\"https:\/\/www.researchgate.net\/publication\/264885853_Multi-touch_interfaces_in_museum_spaces_reporting_preliminary_findings_on_the_nature_of_interaction\">https:\/\/www.researchgate.net\/publication\/264885853_Multi-touch_interfaces_in_museum_spaces_reporting_preliminary_findings_on_the_nature_of_interaction<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref13\" name=\"_edn13\">[13]<\/a> Uta Hinrichs, Holly Schmidt and Sheelagh Carpendale, (2008).<\/h6>\n<h6><a href=\"#_ednref14\" name=\"_edn14\">[14]<\/a> Uta Hinrichs and Sheelagh Carpendale, (2011).<\/h6>\n<h6><a href=\"#_ednref15\" name=\"_edn15\">[15]<\/a> Such as Meredith Ringel Morris, Kathy Ryall, Chia Shen, Clifton Forlines and Frederic Vernier, \u2018Beyond \u201cSocial Protocols\u201d: Multi-User Coordination Policies for Co-located Groupware\u2019, <em>CSCW &#8217;04 Proceedings of the 2004 ACM conference on Computer
supported cooperative work<\/em>, (2004), &lt;<a href=\"https:\/\/www-cs.stanford.edu\/~merrie\/papers\/social_protocols.pdf\">https:\/\/www-cs.stanford.edu\/~merrie\/papers\/social_protocols.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref16\" name=\"_edn16\">[16]<\/a> See for example:<\/h6>\n<h6>Peter Peltonen, Esko Kurvinen, Antti Salovaara, Giulio Jacucci, Tommi Ilmonen, John Evans, Antti Oulasvirta and Petri Saarikko, \u2018&#8221;It&#8217;s Mine, Don&#8217;t Touch!&#8221;: Interactions at a Large Multi-Touch Display in a City Centre\u2019, <em>CHI &#8217;08 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems<\/em>, (2008), &lt;<a href=\"https:\/\/www.cs.helsinki.fi\/u\/jacucci\/p1285-peltonen.pdf\">https:\/\/www.cs.helsinki.fi\/u\/jacucci\/p1285-peltonen.pdf<\/a>&gt;.<\/h6>\n<h6>Melanie Touw and Brad Miller, \u2018One Road: An engaging multi-touch interface within a museum context\u2019, <em>HCI2012 &#8211; People &amp; Computers XXVI<\/em>, (2012), &lt;<a href=\"https:\/\/ewic.bcs.org\/upload\/pdf\/ewic_hci12_uett_paper4.pdf\">https:\/\/ewic.bcs.org\/upload\/pdf\/ewic_hci12_uett_paper4.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref17\" name=\"_edn17\">[17]<\/a> Kate Haley Goldman and Jessica Gonzalez, \u2018Research Report: General Table Use\u2019, (2014), &lt;<a href=\"http:\/\/openexhibits.org\/wp-content\/uploads\/papers\/Open%20Exhibits%20General%20Table%20Use%20Findings.pdf\">http:\/\/openexhibits.org\/wp-content\/uploads\/papers\/Open%20Exhibits%20General%20Table%20Use%20Findings.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref18\" name=\"_edn18\">[18]<\/a> Eva Hornecker, \u2018\u201cI don\u2019t understand it either, but it is cool\u201d \u2013 Visitor Interactions with a Multi-Touch Table in a Museum\u2019, <em>2008 IEEE International Workshop on Horizontal Interactive Human Computer System (TABLETOP)<\/em>, (2008), p. 
119.<\/h6>\n<h6><a href=\"#_ednref19\" name=\"_edn19\">[19]<\/a> Nicola Whitton and Alex Moseley, \u2018Deconstructing Engagement: Rethinking Involvement in Learning\u2019, <em>Simulation and Gaming<\/em>, 45, no.4-5, (2014), pp. 433-449.<\/h6>\n<h6><a href=\"#_ednref20\" name=\"_edn20\">[20]<\/a> Eva Hornecker and Matthias Stifter, \u2018Learning from Interactive Museum Installations About Interaction Design for Public Settings\u2019, <em>OZCHI &#8217;06 Proceedings of the 18th Australia conference on Computer-Human Interaction: Design: Activities, Artefacts and Environments<\/em>, (2006), pp. 135-142.<\/h6>\n<h6><a href=\"#_ednref21\" name=\"_edn21\">[21]<\/a> Alissa N. Antle, Allen Bevans, Josh Tanenbaum, Katie Seaborn and Sijie Wang, \u2018Futura: Design for Collaborative Learning and Game Play on a Multi-touch Digital Tabletop\u2019, <em>TEI &#8217;11 Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction<\/em>, (2010), pp. 93-100.<\/h6>\n<h6><a href=\"#_ednref22\" name=\"_edn22\">[22]<\/a> Paul Marshall, Richard Morris, Yvonne Rogers, Stefan Kreitmayer and Matt Davies, \u2018Rethinking \u2018Multi-user\u2019: An In-the-Wild Study of How Groups Approach a Walk-Up-and-Use Tabletop Interface\u2019, <em>CHI &#8217;11 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems<\/em>, (2011), &lt;<a href=\"http:\/\/mcs.open.ac.uk\/pervasive\/pdfs\/MarshallCHI2011.pdf\">http:\/\/mcs.open.ac.uk\/pervasive\/pdfs\/MarshallCHI2011.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref23\" name=\"_edn23\">[23]<\/a> Michael S. Horn, Zeina Atrash Leong, Florian Block, Judy Diamond, E.
Margaret Evans, Brenda Phillips and Chia Shen, \u2018Of BATs and APEs: An Interactive Tabletop Game for Natural History Museums\u2019, <em>Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems<\/em>, (2012), &lt;<a href=\"https:\/\/scholar.harvard.edu\/files\/chiashen\/files\/bat-ape_final-1.pdf\">https:\/\/scholar.harvard.edu\/files\/chiashen\/files\/bat-ape_final-1.pdf<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref24\" name=\"_edn24\">[24]<\/a> <em>Ibid.<\/em><\/h6>\n<h6><a href=\"#_ednref25\" name=\"_edn25\">[25]<\/a> Donald A. Norman, \u2018Natural User Interfaces Are Not Natural\u2019, <em>Interactions<\/em>, 17, no.3, (2010), pp. 6-10.<\/h6>\n<h6>Similar sentiments have also been expressed by Bill Buxton at Microsoft Research:<\/h6>\n<h6>Bill Buxton, \u2018Multi-Touch Systems that I have Known and Loved\u2019, (2007), &lt;<a href=\"http:\/\/www.billbuxton.com\/multitouchOverview.html\">http:\/\/www.billbuxton.com\/multitouchOverview.html<\/a>&gt;.<\/h6>\n<h6><a href=\"#_ednref26\" name=\"_edn26\">[26]<\/a> See for example:<\/h6>\n<h6>David S. Kirk, Shahram Izadi, Otmar Hilliges, Stuart Taylor, Abigail Sellen and Richard Banks, \u2018At Home with Surface Computing?\u2019, <em>CHI &#8217;12 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems<\/em>, (2012), &lt;<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/02\/athome.pdf\">https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/02\/athome.pdf<\/a>&gt;.<\/h6>\n<h6>Paul Marshall et al., (2011).<\/h6>\n<h6><a href=\"#_ednref27\" name=\"_edn27\">[27]<\/a> Melanie Touw and Brad Miller, p. 3.<\/h6>\n","protected":false},"excerpt":{"rendered":"<p>Guest post from Yineng Zhu, Andrew Permain and Joe Searle, MA Museum Studies students working with the Archives &amp; Special Collections team.
Yineng Hello, I\u2019m Yineng Zhu and I have\u00a0been doing a placement with Special Collections in the Library as part of my MA in Museum Studies. My project is about the University Library\u2019s history. [&hellip;]<\/p>\n","protected":false},"author":233,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-1564","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/posts\/1564","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/users\/233"}],"replies":[{"embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/comments?post=1564"}],"version-history":[{"count":20,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/posts\/1564\/revisions"}],"predecessor-version":[{"id":1732,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/posts\/1564\/revisions\/1732"}],"wp:attachment":[{"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/media?parent=1564"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/categories?post=1564"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/specialcollections\/wp-json\/wp\/v2\/tags?post=1564"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}