Part 1 in this series examining the Knight Foundation’s Magic of Music final report focused on a number of the most positive aspects of the report. This final installment will continue by examining some of the more puzzling components…
After having the opportunity to read through the report, I found a few areas where the foundation appeared to make bewildering connections between data from its very useful classical music consumer segmentation study and programs initiated by participating orchestras. At the same time, I was glad to discover that, more often than not, this was the exception rather than the rule for the report as a whole.
Too Much Static Analysis
One of the most valuable components of the entire Magic of Music program was the classical music consumer segmentation study, which serves to replace anecdotal and observational knowledge with hard data. Although I thought much of the report did a good job of presenting the data and resulting conclusions, there were a few areas where the foundation failed to follow through with a reasonable level of dynamic analysis.
One glaring example of this shortcoming concerned the audience’s perception of artistic quality. On page 31, the report states that the classical music consumer segmentation study determined that “only 6 percent of those interested in classical music considered themselves very knowledgeable about it, while more than half described themselves as ‘not very knowledgeable.’”
Nevertheless, shortly thereafter on page 33 the report uses this same data to arrive at a bizarre conclusion:
“Orchestras continued to point to…mission statements focused on ‘quality’ even though this issue was not at the top of most consumers’ concerns, and a definition of classical music that the consumer, meanwhile, seemed to be appropriating and refashioning into something different.”
How can you put much merit into where the consumer places the importance of quality when more than half of those interested in classical music describe themselves as being “not very knowledgeable” about it in the first place? How, then, would consumers be able to define quality with any reasonable level of confidence, let alone in a way capable of helping the art form thrive and evolve?
If anything, the data gathered by the classical music consumer segmentation study points to a strong need for helping consumers become more confident in their interpretation and enjoyment of quality performances. In short, they need to become discerning and know when they are getting what they pay for. This is an issue discussed on a regular basis at Adaptistration and among most of the administrators and musicians I know.
As such, with regard to this particular point in the report, I think the foundation came to some overly simplistic conclusions based on very static analysis. Had they sifted through that same data with a more dynamic approach, it is likely they would have arrived at some more practical conclusions.
Pound For Pound
Another conclusion I found particularly puzzling appeared in the “Lessons for Funders” section, where the foundation expressed surprise at how much money it ended up spending on the entire program: more than $13 million. On page 50, the report states:
“To produce transformational change in a field, the dollars and time invested need to be commensurate with the scale of the industry.”
This conclusion didn’t sit well with me. First, why is there such a burning desire to change an entire field? Second, why would you compare what may or may not be transformational in an organization like the Brooklyn Philharmonic with what could be useful somewhere like Charleston?
In general, I think universal application of specific programs has far less merit than examining the process an organization uses to develop the program. In this case, the report spends too much time focusing on specific results endemic to unique environments.
I mention Brooklyn above because I thought the report put far too much emphasis on what transpired there; much of the program that orchestra implemented would likely be counterproductive for organizations with similarly sized budgets that serve as the primary point of contact for live classical music in their respective communities.
The result left me thinking it would have been much more satisfying to read about the process Brooklyn used to arrive at its particular program than about the program itself (although in a perfect world, I’ll gladly take both). Furthermore, the fact that the foundation identified this as a lesson made it seem as though much of the project was approached from a top-down perspective.
A Red Herring
Last, but certainly not least, a point that caught my attention several times throughout the report was sincerely troubling: it was disappointing to see the foundation fall prey to the old, tired behavior of blaming some of the program’s failures and shortcomings on collective bargaining agreements.
For example, in the “Lessons for Orchestras” section on page 50 the report lists collective bargaining agreements among other issues as preventing organizations from finding solutions to their problems:
“Magic of Music started with the simple premise that changes in the concert hall experience would transform orchestras. That turned out to be simplistic. More varied and interesting programming, a revitalized concert hall experience, more involved music directors, better marketing, enhanced participation of musicians in governance and decision-making, less restrictive collective bargaining agreements, more innovative use of technology, alternative leadership models, larger endowments, more education and outreach – all these things and others can contribute to solutions.”
The report touched on each of the other issues in one fashion or another, with the exception of collective bargaining agreements. Had the report devoted some time to exploring collective bargaining agreements the way it did the other issues, its disgruntled attitude would have more merit. However, the report makes no reference to any specific issue it encountered within any participating orchestra’s collective bargaining agreement throughout the program’s decade of operation.
Additionally, the foundation apparently considers collective bargaining agreement negotiations a negative force within the orchestral environment. In fact, the report once again mentions collective bargaining agreements as a barrier to “transformational change” as well as “genuine and substantive dialogue.” For example, on page 49 the report states:
“Transformational change in orchestras is dependent on the joint efforts of all members of the orchestra family – music director, musicians, administration, and volunteer leadership and trustees. An early major discovery of the foundation was that rarely were the important components of the orchestra family coming together to plan for the future. As a result, transformational change was being blocked. Music directors were largely absent, musicians and management engaged in discussion through collective bargaining, and trustees and musicians seemed completely removed from one another. Ten years later, due to the efforts of Knight and others within and outside the orchestra field, barriers are being removed and genuine and substantive dialogue is occurring.”
This is a puzzling conclusion. For one, it is too vague: is the foundation saying collective bargaining agreement negotiations are a barrier to transformational change, or that management and musicians simply spend too much time involved in the negotiation process at the expense of other communication?
At the end of the above passage from page 49, the foundation claims that it, along with others, is removing these barriers, which include the collective bargaining agreement negotiation process. What exactly is the foundation advocating in this statement? Does it advocate an end to the negotiation process? Is it suggesting that CBAs should be done away with?
Unfortunately, it is difficult to come to any reliable answer to the above questions because the report doesn’t include any details regarding how the program dealt with collective bargaining agreements or the negotiation process. In fact, the only other reference in the report to collective bargaining agreements is a positive reference on page 21. In that instance, the foundation claims that the Saint Louis Symphony Orchestra was able to capitalize on some of their Magic Of Music initiatives because of an added provision to the collective bargaining agreement.
In the end, the report sends a very mixed message on this issue. On one hand, it lambastes collective bargaining agreements and the negotiation process without providing any examples to support that position; on the other, the only additional reference to these issues (on page 21) is decidedly positive.
Ultimately, the report loses credibility by failing to support its conclusions here with the same level of evidence offered for other issues.
Much like the first installment in this series, there were other issues that fell within today’s examination but failed to rise to the same level of analysis. For example, the report makes it quite clear that the foundation believes orchestras in the midst of financial crisis are not capable of instituting “programmatic innovation.”
As such, it was puzzling to see that the foundation selected a number of orchestras that were far from financially stable at the time they joined the program, such as Louisiana and San Antonio, to participate in certain initiatives. This made me curious to know more about the system the foundation used to evaluate a participating orchestra’s financial health.
In the end, the report is a very useful tool for summarizing the seemingly endless stream of reports and papers produced by the decade-long program. At the same time, it doesn’t replace those documents, and anyone sincerely interested in learning Magic of Music’s full details will need to add everything the foundation has published on the program to their reading list.
Nevertheless, if you have had the opportunity to read the final report, what did you think? Did any of the above issues strike you as being puzzling or did you notice something else not mentioned in this article? I invite you to take a moment and submit a comment with your thoughts and observations.