There’s an intriguing article in the 3/7/2014 edition of the Twinsburg Bulletin by Parker Perry in which the author recounts a recent experience attending a Cleveland Orchestra performance at Severance Hall. The author, a college student, was attending a Severance concert for the first time, and he provides a newbie’s perspective on a number of elements that contributed to his overall user experience, or UX.
By and large, Perry found the overall experience to be underwhelming, and even if you set aside some of his issues with the musical end of the evening, there’s plenty of non-artistic material for arts managers to wade through in order to perceive the experience from the perspective of one of the most sought-after ticket buyers: newbies.
In web design, a bad UX is rarely defined by one, two or even three items; instead, it’s a host of items that conspire to cross a user’s tolerance threshold.
The tricky part is that this threshold is never a clear, single line that remains in the same place from one day to the next. Instead, it behaves more like a weather pattern, which requires monitoring several variables on a fairly regular basis in order to build a comprehensive picture.
Good web design doesn’t stop at usability testing prior to launch, and just like weather forecasting, it requires ongoing attention to be useful.
A performing arts organization’s primary resource for measuring concert event UX is a Quality Assurance (QA) team.
In short, QA pros are responsible for making sure that your design for a rewarding and uplifting concert experience works as intended. Granted, for most groups the word “team” may be overly formal, but it is quite common for a casual QA squad to consist of several key administrative staffers who regularly attend concerts.
One of the better examples from Perry’s article is his interaction with ushers. Based on my professional experience, and assuming the orchestral organization is also responsible for ushers, most orchestras do a very good job at training ushers.
In Perry’s account, the ushers were never rude, but they weren’t terribly proactive either; instead, their efforts were reportedly focused more on moving through the necessary motions of crowd management and minimizing wait times for regular patrons. They were keeping the trains running on time, as it were.
Again, based on Perry’s account, it doesn’t appear that there was much effort to convert observation into action: identifying (or acknowledging) patrons who may be out of their element and taking steps to help mitigate potential UX downers.
Assuming an orchestra’s QA team is well versed in all of the variables that contribute to a positive concert event UX, it should be able to monitor and identify the ever-changing UX environment, make adjustments accordingly, and recognize any need for reinforcing existing training.
But how often does this happen on a regularly scheduled level at your orchestra? I’m curious to hear your thoughts and what transpires at your organization.