2007 Orchestra Website Review: Trends & Detailed Scores

What a difference one year can make. From 2006 to 2007 there were quite a few changes among orchestra websites, some good and some not so good. Perhaps even more interesting, however, was how many orchestras visibly increased the resources directed toward website development for the 2007-2008 season…

Here, you’ll discover which orchestra website components are improving and which are falling behind. For orchestra managers, this installment is undoubtedly one of the most useful articles in this series, as you’ll have access to the detailed survey results. According to those responses, the vast majority of orchestras have great expectations for the role their websites will play in single ticket sales. For instance, one ensemble plans to generate nearly $10 million in single ticket revenue solely from its website.

Grades
For 2007, the overall grades showed signs of improvement. For example, this was the first year in which three orchestras exceeded 90/100 points, thereby earning a letter grade of A. At the same time, the number of orchestras that received a grade of B or C remained exactly the same, while the number receiving a D increased. Fortunately, the number of orchestras receiving an F dropped by one. The accompanying chart illustrates the shifts from 2004 through 2007.

Along with other notable improvements, 2007 saw a slight increase in the percentage of orchestras that earned a letter grade of B or better (the acceptable minimum grade for an orchestra website), and for the first time a ROPA ensemble, the Florida West Coast Symphony, earned a score high enough to cross that threshold. The accompanying pie chart illustrates the breakdown of grades for the entire review as well as how each conference (ICSOM and ROPA) performed.

ROPA orchestras cumulatively improved at a faster rate than ICSOM ensembles, but both conferences improved enough to raise the overall average from 60.23 in 2006 to 61.53, as the accompanying chart illustrates. Unfortunately, the downside to these statistics is that only 14 percent of orchestra websites are acceptable, 68 percent are severely underperforming, and the remaining 18 percent fall somewhere in between.
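For readers who want to see how a breakdown like this comes together, the arithmetic is straightforward. The Python sketch below is purely illustrative: the review only confirms that 90+ earns an A and that B is the minimum acceptable grade, so the cut-offs below 90 are assumptions, and the sample scores are invented rather than actual review data.

```python
# Illustrative sketch: map 100-point review scores to letter grades and
# compute the share of websites at or above the acceptable minimum (B).
# Cut-offs below 90 are assumed; the review only confirms that 90+ = A.

def letter_grade(score: float) -> str:
    """Map a 0-100 review score to a letter grade (assumed cut-offs)."""
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    if score >= 60:
        return "D"
    return "F"

def summarize(scores: list[float]) -> dict:
    """Return the overall average and the percentage graded B or better."""
    grades = [letter_grade(s) for s in scores]
    acceptable = sum(g in ("A", "B") for g in grades)
    return {
        "average": round(sum(scores) / len(scores), 2),
        "percent_acceptable": round(100 * acceptable / len(scores)),
    }

# Invented sample scores, not actual review data.
print(summarize([92.5, 61.0, 48.0, 83.0, 55.5, 71.0, 60.0]))
```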

Highs and Lows
Another interesting trend since 2004 is which orchestras consistently rank among the Top 5 or Bottom 5 organizations. At the top end, the only ensemble to appear in the Top 5 in every year of the review is the Chicago Symphony. Moreover, it has placed in either the #1 or #2 spot in each of those years, an accomplishment the organization should be enormously proud of. At the same time, the San Francisco Symphony has appeared in the Top 5 in three of the last four reviews.

Conversely, no orchestra included in the review has landed in the Bottom 5 in every year of the review. However, the South Bend Symphony has been in the Bottom 5 for three of the past four years, its score falling each year while it occupied dead last in the 2006 and 2007 reviews. In the “battle for the basement”, South Bend is not alone: the Mississippi Symphony has finished just one position higher in each of the last three years. The accompanying chart illustrates the Top 5 and Bottom 5 results from 2004-2007.

Category Scores – Some Lessons Are Learned Better Than Others
With regard to the five evaluation categories, orchestras are making progress at different rates. As the accompanying chart illustrates, the two fastest-improving categories are Making Donations and Purchasing Tickets, while the Performance Schedule category continues to make progress as well, albeit at a slower rate.

Unfortunately, the Content & Functionality category suffered tremendously this year, posting one of the single largest declines for any category in the history of the review. At the same time, in an age where orchestras are struggling to make stronger connections with their communities, the Orchestra Information category witnessed a slight decline for the second straight year.

Detailed Scores

Click to learn about how you can access this information at Adaptistration Premium

About Drew McManus

"I hear that every time you show up to work with an orchestra, people get fired." Those were the first words out of an executive's mouth after her board chair introduced us. That executive is now a dear colleague and friend but the day that consulting contract began with her orchestra, she was convinced I was a hatchet-man brought in by the board to clean house.

I understand where the trepidation comes from as a great deal of my consulting and technology provider work for arts organizations involves due diligence, separating fact from fiction, interpreting spin, as well as performance review and oversight. So yes, sometimes that work results in one or two individuals "aggressively embracing career change" but far more often than not, it reinforces and clarifies exactly what works and why.

In short, it doesn't matter if you know where all the bodies are buried if you can't keep your own clients out of the ground, and I'm fortunate enough to say that for more than 15 years, I've done exactly that for groups of all budget sizes from Qatar to Kathmandu.

For fun, I write a daily blog about the orchestra business, provide a platform for arts insiders to speak their mind, keep track of what people in this business get paid, help write a satirical cartoon about orchestra life, hack the arts, and love a good coffee drink.
