In its eighth year, the Orchestra Website Review has become the benchmark for how well orchestras present their concert schedules, sell tickets, and facilitate donations. In order to keep serving in that capacity, the reviews adjust each year to incorporate shifts in technology and in how patrons use the web. To that end, big changes were put into place for the 2011 reviews, and they produced an equally big impact on scores.
Astute observers have noticed that the reviews tend to add a few elements each year; for example, they include a few additional quantifiable factors or reach a little deeper into existing evaluation items. That has been a deliberate effort so as not to overwhelm organizations by expecting too much too soon but still keeping up with changes in technology.
Fortunately, this has proven a worthwhile strategy, as average scores have steadily increased over the years. But we don’t have the luxury of taking our time for this installment: 2011 has seen enormous changes in the habits of typical web users, due in large part to the advent of Smartphone and Tablet usage.
In short, the touchscreen user experience (UX) has had a profound impact on how users expect to interact with websites, even within the traditional desktop/notebook browser.
NEW EVALUATION CRITERIA
Orchestra websites are still graded on six categories, but each one has undergone anywhere from minor to substantial changes in order to incorporate the new UX requirements alongside the regular regimen of enhancements. A perfect example of this is the homepage design category:
Previous Evaluation Criteria
- Patrons need to be able to visit an orchestra’s website and gather information about the latest performances as well as upcoming events directly from the home page; those events should be prominently displayed on the homepage with dates and times.
Enhanced Evaluation Criteria
- Patrons still need to find the same event information on the homepage, but that info must also contain action buttons (links) to immediately enter the ticket purchase process as well as a separate action button to visit the corresponding event page.
Requiring potential ticket buyers to load one or more interior pages before they can enter the shopping cart process is tantamount to UX blasphemy in today’s user environment. As a result, landing page layout is now more about finely tuned, readily available persuasion triggers capable of accommodating a user culture that probably knows what it is after to begin with and is therefore focused on what’s important “right now.”
Here’s a breakdown of all the 2011 review enhancements:
Category 1: Landing Page – 20 points maximum
- Prominent layout and persuasion triggers need to provide users with the ability to convert directly to the purchase process and/or expanded event information.
- An interactive concert calendar must provide similar action buttons by way of a user-selected search interface.
Category 2: Purchasing Tickets – 20 points maximum
- In addition to the existing criteria, this category now examines the content within each individual event page along with the actual shopping cart process.
- There are five sub-categories covering subscription and single ticket sales, event listing, customer service elements, and the shopping cart process.
[quote style=”boxed” float=”right”]New evaluation areas include the shopping cart process, mobile platform optimization, and social media sharing from within individual pages.[/quote]
Category 3: Making Donations – 20 points maximum
- This category has been heavily redesigned to include examination of the donation process, or what I refer to as “donor friendliness” and ease of contribution along with the overall variety of giving opportunities.
Category 4: Organization Information – 20 points maximum
- In addition to the existing criteria, this section now incorporates sub-categories for institutional transparency, media pages, and education content.
- A new set of criteria was designed to evaluate the use of social media; meaning not only posting outbound links to third-party outlets like Facebook and Twitter but also how those elements are incorporated within interior pages (you might be surprised how few organizations have social sharing links on individual event pages).
Category 5: Usability – 15 points maximum
- New content here includes an expanded sub-section on navigation elements along with patron support efforts in the form of customer service help content alongside conversion markers (such as a “need help?” link next to box office information or ticket purchase links).
- There are four sub-categories that incorporate everything in the previous item along with traditional items such as searchability, URL clarity, and legal notices. It also brings together previously separate sub-category items related to overall website security.
Category 6: Mobile Optimization – 5 points maximum
- This brand new category examines whether or not an orchestra has Smartphone- and Tablet-optimized versions of its site along with a similarly optimized ability to purchase tickets, find venue information, etc.
- There are two sub-categories; one to evaluate Smartphone platforms and the other to evaluate Tablet platforms.
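The six category maxima listed above add up to the review’s 100-point scale. As a minimal sketch of how a total score is assembled (the category names and maxima come from the criteria above; the example scores are entirely hypothetical, not drawn from any actual review):

```python
# Category maxima from the 2011 review criteria (they sum to 100 points).
CATEGORY_MAXIMA = {
    "Landing Page": 20,
    "Purchasing Tickets": 20,
    "Making Donations": 20,
    "Organization Information": 20,
    "Usability": 15,
    "Mobile Optimization": 5,
}

def total_score(scores: dict) -> int:
    """Sum category scores, validating each against its category maximum."""
    total = 0
    for category, maximum in CATEGORY_MAXIMA.items():
        score = scores.get(category, 0)
        if not 0 <= score <= maximum:
            raise ValueError(f"{category}: score {score} outside 0-{maximum}")
        total += score
    return total

# Hypothetical example scores (illustration only):
example = {
    "Landing Page": 14,
    "Purchasing Tickets": 16,
    "Making Donations": 12,
    "Organization Information": 15,
    "Usability": 11,
    "Mobile Optimization": 3,
}
print(total_score(example))  # 71
```

Note how the weighting itself encodes the review’s priorities: the four conversion-oriented categories carry 20 points each, while the brand-new Mobile Optimization category starts at a modest 5.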
A LITTLE HELP FROM MY FRIENDS
In case you aren’t getting the big picture for just how much the reviews have expanded (we’re even taking up the entire width of the blog page for some of the tables and graphs in later articles!), consider this:
- Two additional days were added to test and refine the new evaluation criteria as well as to have them vetted by a few colleagues.
- The number of evaluation hours increased by nearly 150%.
- The amount of article preparation time doubled.
That’s a great deal of time on top of something that already took more than a full work week to compile.
Fortunately, generous offers from colleagues to help out this year eased the load; in addition to the outside vetting, I handed over the entire mobile platform evaluations to two individuals: one evaluated the Smartphone platform while the other evaluated Tablets.
Consequently, I’d like to take a moment to extend my infinite gratitude to Christopher Barton, Marketing/Box Office Manager for the College of Arts + Architecture at University of North Carolina at Charlotte, and another colleague who prefers to remain anonymous (but rest assured s/he does not actively work inside the orchestra field so there is no conflict of interest).
Without their assistance, the website reviews wouldn’t be such a comprehensive resource!
[quote style=”boxed” float=”right”]Expect to see some changes in the status quo and remember: don’t panic.[/quote]Let me warn everyone now that due to the enhanced evaluation criteria, you’re going to see a comparatively big shakeup in the traditional standings; but more to the point, you’re going to see much lower average scores than in recent years.
In its own way, this is a positive thing when you consider how low scores were in the very first Orchestra Website Reviews (really, they were pretty bad). But following that review, the business started to rise to the challenge and sites steadily improved each year. As such, there are plenty of reasons to expect this year’s review will have a similar positive impact on the entire field.
However, in order to take a little bit of the sting out, it seemed like a good idea to convert from a letter-grade-based ranking to a five-star system. If nothing else, it has a slightly softer visual impact; but keep in mind, the overall 100-point system remains unchanged, so you can still make an even comparison between the 2010 and 2011 reviews.
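Since the underlying 100-point system is unchanged, the stars are just a display layer over the numeric score. The article doesn’t specify the exact conversion, so the linear mapping below is an illustrative assumption, not the review’s actual formula:

```python
def stars(score: int) -> float:
    """Map a 0-100 point score onto a 0-5 star display, rounded to half
    stars. ASSUMPTION: a simple linear mapping (score / 20); the review's
    real conversion is not published. Note that Python's round() resolves
    exact .25-point ties to the even half-star."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    return round(score / 20 * 2) / 2

print(stars(71))   # 3.5
print(stars(100))  # 5.0
```

Because the numeric score survives underneath, a 2010 letter grade and a 2011 star rating derived from the same point total remain directly comparable.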
MAINTAINING A LEVEL PLAYING FIELD
Although the review criteria underwent major enhancements, it is important to reassert that the changes were guided by the same fundamental principles: websites are not examined on the subjective basis of color schemes, graphics, or other aesthetic qualities except in cases where those elements hinder usability. Consequently, the reviews not only remain fair but are based on a set of quantifiable criteria, all of which allows orchestras of varying budget sizes to be evaluated on a level playing field.
An Important Disclosure
One of the byproducts of conducting the Orchestra Website Reviews for so many years, listening to so many marketing and IT professionals pinpoint their frustrations with developing an online presence, and working directly with numerous groups on these efforts is a precise knowledge of what arts organizations need in order to improve. Over the years, I’ve searched for a way to bring all of this together by creating a system designed especially for performing arts organizations, and over the past season that goal was finally achieved with the release of The Venture Platform.
When I announced Venture, some readers wondered whether the Orchestra Website Reviews would continue to be impartial, and the answer is an undeniable “YES!” Simply put, the evaluation criteria are almost exclusively quantitative, so there’s simply no way to implement favoritism. Likewise, assigning an entire evaluation category to independent reviewers helps affirm that objectivity.
Consequently, it will never matter whether any orchestra in the review is a Venture user or not because Venture is a platform and not the actual content.
What this means is Venture users are ultimately responsible for content and design.
All of this boils down to what a publishing platform is and how rarely it intersects with what the Orchestra Website Reviews are designed to measure: how well organizations present their concert schedule, sell tickets, facilitate donations, provide organizational information, utilize dynamic content, and deliver overall content and functionality.
As a publishing platform, Venture makes these tasks as easy as possible while also encouraging creativity but it doesn’t generate content for users. Consequently, a Venture user could garner a low score just like any other user if they don’t use the platform to build an amazing online presence.
In the end, all of this allows the Adaptistration Annual Orchestra Website Review to continue as the gold standard of unbiased, objective, and constructive benchmarking for performing arts organization website effectiveness.
And to put my money where my mouth is on all of this, I’m offering to personally examine each sub-category evaluation with any orchestra in the review if we’ve ever discussed your organization becoming a Venture user. Typically, requests for additional review details are redirected toward an Adaptistration Premium subscription (where that information is available), but in this case, simply get in touch and I’ll be happy to find a time to walk through everything with you.