Unique to the Canadian portion of the annual website reviews, this installment examines how Canadian and U.S. orchestra websites compared in overall grades as well as in average scores, category by category.
You’ll also get to see how the Canadian orchestras would fare compared to the Top 10 orchestra websites from the U.S. reviews and gain some insight into which components could stand the greatest cumulative improvement…
Grades
For 2007, the overall grades for Canadian orchestra websites showed signs of significant improvement. Although no orchestras earned a grade of A and the same percentage earned a grade of B, the percentage of orchestras earning a grade of C more than quadrupled, and the percentage earning a grade of D or F decreased. The chart to your left illustrates the division between grades for the 2007 evaluation (click to enlarge).
Canadian/U.S. Comparisons
In 2005, the differences in overall and category scores between Canadian and U.S. orchestra websites were marginal, and 2007 produced similarly close results, as illustrated in the chart to your left (click to enlarge). For example, in the three of five categories where the Canadian websites scored higher than U.S. websites, the average margin was 13.28 percent. However, in the two of five categories where U.S. websites scored higher, the average margin was 16.34 percent.
One category where the Canadian groups pulled significantly ahead was Category 1: how the organizations presented concert information. Although the Canadian websites earned an equal score in Category 5: Content & Functionality, that was the one category where the Canadian websites had one fewer subcategory, and that missing subcategory happened to be the one with the lowest average subcategory score among U.S. websites.
It is telling that the one category where Canadian websites trailed the most was Category 4: Making Donations. This was equally true in the 2005 review, although the gap between average scores has decreased. If nothing else, it goes to show that, on average, the Canadian system of developing funds from individual donors has not yet progressed to the same level as its U.S. counterpart.
For example, there are several third-party resources U.S. orchestras can take advantage of to process online donations if they lack the ability or resources to do so in-house. Compare that to the Canadian system, which has only one similar resource, and it seems the Canadian system may have to begin making faster progress just to catch up. According to Katherine Carleton, Executive Director/Directrice générale of Orchestras Canada/Orchestres Canada, the third-party resource they recommend to members when asked is CanadaHelps.org.
Overall, the average score gap between Canadian and U.S. websites decreased from 2.92 in 2005 to 2.50 in 2007, which indicates both groups are making unhurried headway in the quality and effectiveness of their websites.
A final point that remained unchanged from 2005: only one Canadian orchestra would have placed in the Top 10 if included in the U.S. scores. The Toronto Symphony would have come in at the #4 slot, edging out its U.S. counterparts that also earned a score of B. The chart to your left illustrates the combined U.S. & Canadian Top 10 (click to enlarge).
In the end, Canadian websites showed marked improvement. Provided they can make some cumulative progress in offering expanded PR contact information as well as musician information beyond a simple roster, they stand a very good chance of raising their average score by a full letter grade.