
Arts Alive: Learning about the UX Lifecycle through Application
The Goal
To work through the UX lifecycle and learn how to design technology that is usable, useful, and enjoyable for people.
The Project
Investigate a domain, find a barrier and solve it with some kind of technology.
The Scope
We decided to investigate art facilities on campus, because they were accessible to us and impacted our community.

Contextual Inquiry
Our main goal was to better understand the workflow, values, and weaknesses of two art galleries on campus – the Perspective Gallery in Squires and the Moss Arts Center (MAC). We approached this task by interviewing people in several different work roles (curator, artist, visitor, box office employee, etc.), capturing each interview through video recordings, interview notes, or both. In total we had 11 interviews from our first iteration of contextual inquiry. We also spent time visiting the galleries, collecting artifacts, and sketching the spaces. In the end, however, we decided to focus only on the MAC in order to have a more concentrated, information-rich project.
Contextual Analysis
To better understand our data, we assembled a Work Activity Affinity Diagram (WAAD) of over 200 work activity notes and created Social and Work Flow Models. These identified a few issues within the MAC. First of all, the Moss Arts Center saw itself as a multifaceted facility meant to provide engaging experiences and interdisciplinary learning about the arts for students. However, our WAAD showed that many work roles believed it was unable to fully connect with students because it was new and thus unknown to them. Visitor demographics also showed that undergraduate students were less involved than Blacksburg community members or graduate students, and that most visitors learned about the events they were attending through word of mouth, not because of any effort on the Moss Arts Center’s part. In addition, we had enough data to create an entire branch on our WAAD called ‘Outreach’, which had a sub-branch called ‘Issues’.
To test the claims we heard from interviewees, we polled students on campus outside Owens Dining Hall, asking them whether they could name any art galleries on campus and how they had heard about them. Many students could not name any, or at least not by their proper names. We got a lot of responses like ‘the building behind Squires’ or ‘the big glass building’ when students tried to refer to the Moss Arts Center. And of the students who did know what the Moss Arts Center was, most had again heard of it through word of mouth, not because of visits or intentional programming.

Above: WAAD, Below: Social and Workflow Models


Critique
During this time, we also made sure to check back with some key work roles and ask them whether our interpretations of the Moss Arts Center and its problems with outreach were accurate. In particular we interviewed Margo Crutchfield, the Curator at Large, and Raymond Ginn, the Partnerships Director. They both agreed that our understanding of the Moss Arts Center’s values was correct, which we described to them as ‘a multifaceted facility with an emphasis on engagement and interdisciplinary learning, and transparency’. They also agreed that the weak student involvement we had identified was a barrier to what the Moss Arts Center wanted to provide for students. Lastly, they liked our proposed solution and looked forward to seeing its completion.
Then, after opening up our project for peer critique, we came up against two difficult questions:
What about art exhibits that do not lend themselves well to interaction?
At the time our design did not cater well to these kinds of art outside of ICAT (the Institute for Creativity, Arts, and Technology). So we consulted some of our alternative design ideas to see if we could fix this problem. We remembered our virtual tour guide, which was meant to use location services to give information about the galleries you were currently visiting. We decided that for more static art, the guide could tell you things about the piece itself – any kind of fun or interesting supplemental information that could not fit on a plaque. To make its presentation interesting, we decided to wrap this information in little icons that float around on your smart device – another alternative we had discussed earlier. To us, this unique informational display upheld our value of interactivity for the application.
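As a rough sketch of how the tour guide might decide which gallery’s supplemental information to surface, the hypothetical Python below compares the phone’s reported location against each gallery’s position. The coordinates, trigger radius, and function names are placeholders we made up for illustration, not part of any implemented system.

```python
# Hypothetical sketch of the tour guide's location check: show supplemental
# content for whichever gallery the visitor is currently standing in.
# Coordinates and the trigger radius are placeholders for illustration only.
from math import radians, sin, cos, asin, sqrt

GALLERIES = {
    "Moss Arts Center": (37.2296, -80.4167),
    "Perspective Gallery": (37.2294, -80.4180),
}
TRIGGER_RADIUS_M = 75  # assumed "you are at this gallery" radius, in meters


def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))


def nearby_gallery(lat, lon):
    """Return the gallery the visitor is currently near, if any."""
    closest = min(GALLERIES, key=lambda g: distance_m(lat, lon, *GALLERIES[g]))
    if distance_m(lat, lon, *GALLERIES[closest]) <= TRIGGER_RADIUS_M:
        return closest
    return None


print(nearby_gallery(37.2295, -80.4168))  # -> "Moss Arts Center" with these placeholder values
```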
How does this incentivize someone to visit the Moss Arts Center if you can do everything from your smart device?
At this point we recognized that we needed to reserve some of our installations for local interaction within the Moss Arts Center, so that users would be incentivized to visit in order to unlock these extra features.




Ideation
Thus, we saw an obvious disconnect between what the Moss Arts Center wanted to provide for students and how aware of and interested in it students actually were. This was the problem we decided to solve! In discussing possible solutions, we asked ourselves, ‘How do we get people invested in the arts?’ To answer this, we studied some anecdotal success stories from the Moss Arts Center – like how a light projection display on the building brought a ton of social media traffic, even though the event was minimally advertised. We then realized the power of engaging students as contributors to the arts as a way to create investment, and decided to run with that.
Through sketching, discussion, walking our personas through different scenarios and storyboards, and looking at the problem through the three design perspectives – ecological, interaction, and emotional – we finally came to a solution.
We decided to create an app that allowed users to interact with digital installations in real time. Meaning, if there were an audio and visual art display in the CUBE at the Moss Arts Center, you could change its sounds and visual effects from any remote location using the app – essentially crowdsourcing art. In addition, we would provide a live stream of the installation site so you could see your effects and how people visiting the site responded to them. Our hope was that this fun, remotely usable interactive feature would generate interest and lead to more awareness of the Moss Arts Center.
However, we also wanted to showcase scheduled performances over live stream to widen attendance opportunities beyond space and time constraints, again touching on some ICAT hopes. In addition, we wanted to connect these two features (interaction and live streams) to social media, because it was another area the Moss Arts Center had invested in. Thus we decided to incorporate an archive of videos of prior performances that could be shared to social media with a suggested tag and hashtag. Our aim was to help users start their own conversations about art and to spread awareness of the Moss Arts Center quickly and broadly. We ended up naming our application Arts Alive, because it is meant to help make the arts at the Moss Arts Center come alive for students.
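To make the real-time interaction idea more concrete, here is a minimal sketch of the kind of control message the app might send to an installation. The message format, field names, and IDs are assumptions of ours for illustration, not an implemented Moss Arts Center API.

```python
# Minimal sketch of the crowdsourced-control idea: a remote user adjusts an
# installation's audio/visual parameters from the app. The message schema and
# field names here are hypothetical, not part of any real system.
import json
from dataclasses import dataclass, asdict


def clamp(value, low=0.0, high=1.0):
    """Keep crowd-submitted parameters inside a safe range for the installation."""
    return max(low, min(high, value))


@dataclass
class ControlMessage:
    installation_id: str   # hypothetical ID of a display in the CUBE
    user_id: str
    sound_level: float     # 0.0-1.0, scaled by the installation itself
    color_hue: float       # 0.0-1.0, mapped to the visual effect's palette

    def to_json(self):
        payload = asdict(self)
        payload["sound_level"] = clamp(self.sound_level)
        payload["color_hue"] = clamp(self.color_hue)
        return json.dumps(payload)


# In the app, this JSON would be sent to the installation's server (for example
# over a WebSocket) and the result reflected back to viewers on the live stream.
msg = ControlMessage("cube-demo", "student-42", sound_level=0.8, color_hue=0.3)
print(msg.to_json())
```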
Prototyping
With these new changes, we recognized that the time had come to start prototyping. We chose a T prototype, so that we could cover all features of our app shallowly for use in evaluation, while covering the virtual tour guide in depth because it was new. Next, we decided it would be appropriate to have a prototype somewhere between low and mid fidelity, because we had already made a couple of wireframes for older screens and had only one new feature to add. Lastly, we wanted to pick a type of prototype that fit our goals for evaluation – analyzing whether our app could effectively generate interest and knowledge through fun interaction. We decided to create a click-through prototype because we felt it was dynamic enough to showcase the new feature’s functionality as well as demonstrate some of the engaging aspects of our application.
User Testing
Our study started with a pre-questionnaire of basic questions about the participant’s knowledge of and interest in the Moss Arts Center and the arts as a whole. Afterwards we conducted a quasi-empirical study using the think-aloud technique. It began with an evaluator asking the participant to pick a feature of our app to explore. Then, while investigating it with them, the evaluator would use their discretion to bring in planned follow-up tasks. Afterwards, we gave a post-questionnaire, which focused more heavily on the Arts Alive application and its features, effectiveness, flexibility, and design. Each question used a 6-point Likert scale; we quantified the results by scoring “Strongly disagree” as 0 and “Strongly agree” as 5, with all other responses falling on an integer between the two. We averaged the responses for each question and converted the average to a percentage, with anything at or above 80% counting as a success.
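As an illustration of this scoring scheme, the short sketch below maps Likert responses onto the 0–5 scale, averages them, and converts the result to a percentage against the 80% criterion. The intermediate response labels and the sample data are assumptions for illustration only.

```python
# Hypothetical sketch of the Likert scoring described above.
# Only the two endpoint labels come from our questionnaire; the intermediate
# labels and the sample responses below are made up for illustration.

LIKERT_SCORES = {
    "Strongly disagree": 0,
    "Disagree": 1,
    "Somewhat disagree": 2,
    "Somewhat agree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

SUCCESS_THRESHOLD = 80.0  # percent


def question_score(responses):
    """Average the 0-5 scores for one question and convert to a percentage."""
    values = [LIKERT_SCORES[r] for r in responses]
    return 100.0 * (sum(values) / len(values)) / 5.0


# Example: six hypothetical participants answering one question
responses = ["Strongly agree", "Agree", "Strongly agree",
             "Agree", "Somewhat agree", "Strongly agree"]
percent = question_score(responses)
print(f"{percent:.1f}% -> {'success' if percent >= SUCCESS_THRESHOLD else 'below target'}")
```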
Evaluation Results
Before taking our evaluation to our participants, we conducted a heuristic evaluation of our design and ran some pilot tests with our peers to check our work and make appropriate tweaks. Next, we conducted our studies with VT students from a variety of majors. After compiling the data, we learned that our feature labels and directions were not always clear. For example, several participants thought the message “For more features visit the Moss Arts Center” would take them to the Moss Arts Center website. To fix this, we decided to reword the message to “To unlock more features, come visit the Moss Arts Center”. Another participant was concerned that there was no search field on the WATCH feature (our archive of videos of prior performances). So we decided to add one for user convenience, especially considering that the number of videos will increase over time.
Lastly, our risky yet innovative floating-icons approach drew mixed impressions. Some users did not mind them, while others were concerned that they would make interaction with the app confusing and difficult. We want to give priority to this problem because it was an important part of our design, and to make sure the icons truly are disruptive before throwing the idea away. In particular, we believe that an actual animation showing how quickly the icons move around the screen might lessen participants’ worry. Thus, as future work we need to create a local prototype with the animation and interaction capabilities for further investigation, and run another study to test different terminology and how users interpret it.
In order to better gauge our app’s success with respect to its goals and its intended audience, we decided to analyze our feedback from the evaluation phase through the eyes of our personas: Josh, Sarah, and June. We reviewed the commentary from our prototype evaluation and post-questionnaire and grouped the noteworthy points with the persona most likely to offer them:
Our target persona, mainly concerned with the app being interactive and engaging, not so much with the publicity or traffic generated for the Moss Arts Center:
- Q: “This app is fun to use” – 90%
- Q: “I felt this app was interactive” – 86.7%
- Participant feedback: the features are too static – the videos could be more interesting with multiple viewpoints and cameras, like a 360° video

The persona more interested in helping her peers be aware of opportunities and information pertaining to the arts at Virginia Tech:
- Q: “I would recommend this app to a friend” – 83.3%
- Participant feedback:
  - The live stream’s use of the word “viewers” instead of “views” provided a sense of connection among the many people watching the video at the same time
  - There is no way to project videos onto a larger screen, such as a computer, for viewing with friends
  - There is no search functionality for videos

The persona with a large focus on outreach, especially with respect to non-regular visitors:
- Q: “I think this app will interest regular and non-regular visitors of the Moss Arts Center” – 76.5%
- Q: “This app incentivizes me to visit the Moss Arts Center” – 80%
- Participant feedback: the terminology in the app often leads to different interpretations, a barrier to usage and outreach