2011 Canadian Orchestra Website Review: Overall Rankings

Between 10/12/11 and 11/04/11, 14 professional orchestra websites were examined and ranked by how well they presented their concert schedule, sold tickets, facilitated donations, provided organizational information, utilized dynamic content, and delivered overall content and functionality on both desktop and mobile platforms.

Keep in mind that the websites were not examined on the subjective basis of color schemes, graphics, or other aesthetic qualities except in cases where those elements hindered functionality. Consequently, the reviews are based on a set of quantifiable criteria, which allows orchestras of varying budget sizes to be evaluated on an even playing field.

It is also worth keeping in mind that the 2011 reviews employed an expanded set of evaluation criteria (details) along with measuring how well sites performed on a variety of mobile platforms. And given the weeklong gap between the US and Canadian reviews this year, it would be worth your while to visit the 2011 evaluation criteria article so as to better understand the changes and how they impacted overall scores.

Canadian orchestra websites were evaluated using the very same criteria used for the US evaluations with one notable exception: the Canadian review does not include the Institutional Transparency sub-category. This is due to differences between federal laws governing tax forms and public transparency requirements.

The 2011 Orchestra Website Review data is only available at Adaptistration Premium; get your subscription today.

Kudos to the Edmonton Symphony for taking top honors this year. Although the #1 spot remained unchanged, the remaining positions experienced a fair amount of shifts. And much like the US reviews, overall scores were noticeably lower; only one orchestra managed to score over 70/100 and only three groups earned at least two out of five stars.

The diminished performance of most orchestras was due in large part to the following issues:

  • A lack of direct buy tix links for events featured on the landing page.
  • A convoluted donation shopping cart (some systems actually required users to remove ticket purchases before they could add a donation).
  • Inefficient optimization for tablet platforms.
  • A lack of search features and/or sitemaps.
  • Lackluster and sparse education content.
  • Concert calendars that displayed nothing more than an event’s name (no what/where/when details, no “buy tix” link, etc.).

Tomorrow’s article will examine detailed category scores and survey results for each orchestra in addition to analyzing scoring trends over the past five years.

Canadian Website Homepage Gallery

Following in the footsteps of the US reviews, I wanted to be certain to include an image gallery of the Canadian website homepages. Of all the review years, 2011 saw the highest number of homepage redesigns. Can you see some of the items listed above that contributed to lower scores? Remember, look past the aesthetics and focus on the content (or in some cases, lack thereof).

About Drew McManus

"I hear that every time you show up to work with an orchestra, people get fired." Those were the first words out of an executive's mouth after her board chair introduced us. That executive is now a dear colleague and friend but the day that consulting contract began with her orchestra, she was convinced I was a hatchet-man brought in by the board to clean house.

I understand where the trepidation comes from as a great deal of my consulting and technology provider work for arts organizations involves due diligence, separating fact from fiction, interpreting spin, as well as performance review and oversight. So yes, sometimes that work results in one or two individuals "aggressively embracing career change" but far more often than not, it reinforces and clarifies exactly what works and why.

In short, it doesn't matter if you know where all the bodies are buried if you can't keep your own clients out of the ground, and I'm fortunate enough to say that for more than 15 years, I've done exactly that for groups of all budget size from Qatar to Kathmandu.

For fun, I write a daily blog about the orchestra business, provide a platform for arts insiders to speak their mind, keep track of what people in this business get paid, help write a satirical cartoon about orchestra life, hack the arts, and love a good coffee drink.

Comments on “2011 Canadian Orchestra Website Review: Overall Rankings”

  1. Interestingly, by changing the criteria upon which Canadian orchestra websites are judged, it is not possible to fairly compare American orchestra websites with their northern cousins. It’s the old “apples and oranges” problem again.

    Also, despite lacking a legal requirement for transparency, wouldn’t transparency remain a useful and desirable criterion? For any nonprofit organization (not just symphony orchestras) transparency remains a vital link between administration and public support. I don’t think that anyone should be exempt from scrutiny simply because local laws or custom fail to mandate otherwise.

    • Those are good points and, on the surface, I can see where one might get that impression, but it doesn’t apply to the reviews for two primary reasons:

      1) The numbers don’t amount to a substantial difference. Granted, this sort of info is typically restricted to those with Adaptistration Premium subscriptions but I’ll pull out this one item here for the sake of an example. The institutional transparency sub-category amounts to a maximum cumulative point value of 2.22/100; so the overall impact in the apples to apples comparison is, at best, marginal assuming a US orchestra earns all 2.22/100 points. But the reality is the average US group only managed to earn 0.38/100 points for that sub-category (and as an aside, that was the lowest sub-category average). This means that the real difference between US and Canadian scores is 0.38, which is just about as negligible as one can get.

      2) The ugly-American syndrome. Set aside for a moment the reasons stated in the reviews for why institutional transparency is less applicable to Canadian orchestras, not to mention the inherent differences in how federal tax documents are designed and how information is reported; applying grading criteria designed around US-based public disclosure documentation to non-US groups projects a somewhat supercilious image. Expecting, as Americans, that all other countries adopt similar business models and practices without any consideration for their inherent operating environments only invites more problems than any well-intended application solves.

      Is it beneficial for Canadian groups to include whatever institutional transparency documentation they can at their websites? Sure, but the end results on overall giving will be different than for their US counterparts. For example, many Canadian groups have more work to do convincing annual donors of the value of giving in light of increased levels of government support; and that doesn’t even approach the psychological question of how individual donors perceive the relationship between government and nonprofit regulation. As such, it becomes clear why something that seems like a good idea can have less impact between the point of concept and the point of application.

      Now, having said all of that, I do think that Canadian nonprofits are beginning to move toward an environment closer to the US model, but it has a ways to go and it could begin to move apart once again if the global economy shifts back toward traditional levels. So in the end, apples to apples is preferable, but it sometimes requires levelers to achieve.
