Forcing data to reach a conclusion causes nonprofit arts organizations to issue pronouncements that just aren’t so.

Conclusion
I had a friend with a Ph.D. in psychobiology. She spent her post-doc years in a lab where, among other experiments, they tested a hypothesis about access to the brain via the olfactory nerve. That particular nerve, one of the key elements of the sense of smell, offers one of the shortest pathways to the brain, which is why smell is usually such a keen sense. The hypothesis being tested had to do with sending impulses to the brain from the nerve (rather than the usual flow from the brain to the nerve). When she told me about the work, my immediate reaction was one of hope.
“If you can prove that an impulse can be sent backwards,” I asked excitedly, “into the brain instead of out of it, does that mean that we’ll be able to treat brain diseases, tumors, and aneurysms more directly? With more speed and accuracy? And with a higher potential for success?”
“Slow down,” she said. “Scientists don’t do that. Data scientists collect data. Period. All we proved was that under the right circumstances, an impulse was sent to the brain that way. You can’t jump to conclusions like that. It’s not scientific.”

Her study intended to discover not only whether access could happen, but how it could happen, and how reliably the methodology could be repeated, if it was repeatable at all. That study had nothing to do with anything else.
I understand. It’s hard not to make that jump into hope. Hope is exciting. Hope can push people into great things. Hope is, well, hopeful.
But hope is not a scientific attribute. It is strictly emotional and relatively unhelpful. Hope can be harmful to your health.
And to a nonprofit arts organization.

So when a study comes out, it’s best to look at its data as interesting or curious. No more than that. Above all, one should not jump to any conclusions. Instead, a reader should try to poke holes in the study’s methodology; it’s what any 11th-grade chemistry student would be asked to do, after all. They’d look at the questions being asked, analyze the information gleaned from the answers to those questions, and ask why the questions were asked in the first place.
They would never assume there was some hidden subtext. There might not be any subtext at all.
And before making any decisions about acting on the data, they’d ask themselves this important question:

So what?

This is not a question to be asked dismissively or with snark. It’s a real question. Does this national data affect what they do? Or is it simply interesting? And if it is interesting, why?
The Top 40 Most
The most recent example of this kind of heavy data gathering came from the remarkable data scientists at SMU DataArts. Entitled “The Top 40 Most Arts-Vibrant Communities of 2024” (a title that, oddly, doubles up the inflated words “Top” and “Most”), it is the iron pyrite standard for research about, specifically, the forty most arts-vibrant communities in the United States. It does not compare arts vibrancy to that of any other “thing to do” in the 947 metropolitan statistical areas (MSAs) in the United States, nor does it claim that the arts are either actively good or actively bad.

On the other hand — and dangerously so — SMU DataArts does engage in some beginner-level conclusion-jumping in its press release and on its methodology page. A good scientist does not engage in hyperbolic puffery, and this may qualify as such:
The Arts Vibrancy Index (AVI) [author’s note: they changed the name here…why?] can help arts leaders, businesses, government agencies, funders, and engaged citizens understand the overall intensity and capacity of the community’s arts and culture sector. Past AVI reports have helped communities get the recognition they deserve from their mayors, city council members, and state legislators. Arts leaders have informed us that they use the AVI reports and interactive map on our website to consider where to relocate their operations and what markets are ripe for touring performances or exhibitions. Communities can benchmark themselves against an aspirational set of communities and understand what sets them apart by examining the underlying dimensions of demand, supply, and public support for arts and culture. Numerous funders have engaged with the AVI data to better understand how investments to increase arts vibrancy might be best directed in the communities they serve, given existing strengths and opportunities for improvement. The AVI’s multidimensional framework provides insights as to why two cities that seem very different on the surface might be close to one another in the ranking.
Is this an admission from SMU DataArts that their data are meaningful only in their ability to act as fodder for arts organizations to gain additional recognition and funding, regardless of the true nature of the study?
As my psychobiological expert friend might have said, “Slow down. Scientists don’t do that. Data scientists collect data. Period.”
National data does not prove local worth.
So, if the SMU DataArts index (whatever it’s called) merely measures some form of “vibrancy” of the arts in various parts of the country — and nothing else — how is it useful for planning plays, concerts, ballets, operas, exhibitions, and whatever forms of entertainment don’t neatly fit into those categories?
And how does it help raise funds for those activities?
Simply put: it’s not and it doesn’t.
In 2022, Americans consumed approximately 50 billion hamburgers: 333 million people eating an average of 4.2 billion hamburgers every month. (And you wonder why we’re fat.)
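A quick check of those figures, taking them at face value: 50 billion burgers a year ÷ 12 months ≈ 4.2 billion burgers a month, and 4.2 billion ÷ 333 million people ≈ 12 to 13 burgers per person, per month. The arithmetic holds.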

Regardless, opening a hamburger stand in one of the Top 40 Most Burger-Vibrant Communities in America offers no guarantee that it will turn a profit. Costs might make it too expensive for the area. Workers might screw up the recipe. Suppliers might deliver inferior meat and bread. The burger, like a $20,000 vacuum cleaner, might cost a lot and still suck.
What’s worse: the burger might be delicious, the price might be right, the workers might be a joy, and the stand might sit in a high-density area. And it still might fail. Businesses fail all the time.
Now suppose the owner used a national Burger Vibrancy Index to tell their community story to the leaders of the region. Or suppose they used it to decide where in the United States to place their burger stand. Would that change anything?
Maybe. Maybe not.
Data are data. Data can help an entrepreneur make a best guess, but they’re not foolproof. Nor are data inconsequential (a lot of business success is owed to inconsequential data becoming consequential at just the right time). And if all those conditions were true, the stand might become successful. All things considered.
There are two points to be made here, and they are interwoven with other disheartening features of the nonprofit arts community in the United States. One is that too often, a report will come out and cause arts leaders to embellish (lie?) about the positive impact of the arts on a community. The other is that, especially in the largest, flagshippy nonprofit arts organizations with the most capital, the majority of the people being helped by their charity don’t need the help.
On the first point: as we’ve seen with the SMU DataArts study, among the 947 MSAs it examined, 40 have the most arts activity in comparison with the rest of the country. That’s all we know. Not that they’re impactful; just that they’re more active than the other 907.
The second point reveals even more. For nonprofit arts organizations, the community is everything. Not the art. Not the audiences. Not the finances. Impact is the coin of the realm, not coins.

The list of activities on which to center one’s work (not just have a program on the side) is supposed to come from the exempt purposes listed by the IRS:
- Charitable
  - a) Relief of the poor, the distressed, or the underprivileged
  - b) Advancement of religion
  - c) Advancement of education or science
  - d) Erecting or maintaining public buildings, monuments, or works
  - e) Lessening the burdens of government
  - f) Lessening neighborhood tensions
  - g) Eliminating prejudice and discrimination
  - h) Defending human and civil rights secured by law
  - i) Combating community deterioration and juvenile delinquency
- Religious
- Educational
- Scientific
- Literary
- Testing for public safety
- Fostering national or international amateur sports competition
- Preventing cruelty to children or animals
In the ethical corner of the industry lies the Dance Data Project (DDP). DDP puts out regular studies and reports about equity in the dance industry. They gather data from the organizations themselves (the data have to be made available by law) or, in the case of intransigent and ignorant leadership, they glean what they can off the IRS website, as I do.
The data are provocative all by themselves. There is no need to push people into jumping to conclusions to fulfill some projection or “good news” impulse. Most importantly, DDP does not charge companies to read its data. The data are free for anyone to see, free for anyone to question, and they leave nothing to the imagination (or to a false sense of hope). DDP reports salaries, revenues, expenses — all those things that a lot of questionable nonprofit arts organizations wish to keep to themselves.
There are myriad nonprofit service companies that charge tibiae and femurs for that information because they’ll do the packaging for you, even though the data are free of charge. Some snake-oil consultants, too. Even a few august (read: olde, with an Olde English e) nonprofit publications charge for salary information under an arrangement whereby the nonprofit that supplies the info gets first crack at spending a large chunk of money to read it back. They beg for information, then charge a small fortune for access to it.
If you understand Yiddish, you’ll know the meanings of schnorrer and goniff. Usually, a person is one or the other. Sadly, the nonprofit service agencies and consultants that put together data reports filled with free information (but still charge you for them) might be rare examples where the two combine.
Beginning
It’s not a question of whether a nonprofit arts organization can exist merely by producing art. It can. A court said so. But that doesn’t make it worthy of a plug nickel of support. In fact, refusing to put the community’s needs above its own is figuratively criminal in the nonprofit sector. It turns arts organizations into elitist sandboxes in which only the rich can play. It’s why the sector will unquestionably fade into obscurity unless more organizations choose to “get it,” and happily, some are.
As nonprofit arts organizations begin the process of tearing down and rebuilding with a goal to serve their communities above all else, the best will remember to use the scientific method that my psychobiologist friend uses every day. When they do, they’ll increase their chances to succeed. When they don’t, they’ll be pretty well screwed, as described in this passage from Scene Change 2: The Five Real Responsibilities of Nonprofit Arts Boards:
When you know that your nonprofit arts organization is a scientific experiment instead of a production company, I think you’ll come to learn that the higher the human stakes are, the more vital the community will see you. As former US Army Chief of Staff Eric Shinseki once said, “If you don’t like change, you’re going to like irrelevance even less.”

