Work in Progress: The Problem of Patronage

Friday 27 January 2023

Nick Nielsen
W. B. Gallie

In the preface to W. B. Gallie's Philosophy and the Historical Understanding I happened upon an interesting observation. After citing the work of Windelband, Rickert, Croce, and Collingwood, Gallie writes:

“…in history we understand particular thoughts and actions in respect to their aptness or un-aptness for their particular contexts, so that in history to understand what happened is also to understand why it happened; and they succeeded in drawing out some interesting implications from this central insight. But none of them found for his thesis a sufficiently clear-cut and arresting starting-point from which to shake the entrenched presuppositions of the dominant empiricist epistemology of our age. And none of them succeeded in conveying the characteristically philosophical importance of his thesis, viz., its relevance to the whole range of human knowledge. The result is that their writings have been taken to be of primarily methodological concern, and so peripheral to the main issues of philosophy.”

I only learned a few days ago that Gallie had written works on the philosophy of history; previously I had known him for his philosophy of war — I have a copy of his Understanding War — but, admittedly, philosophy of war and philosophy of history are close cousins. In any case, the point he makes in the above passage I believe to be generally true for a number of philosophical principles that have been formulated with particular acuity in respect to some single branch of thought, but which would have profound consequences if applied generally; yet these potentially sweeping formulations seem rarely to come about.

I wrote a blog post some years ago, The arc of cognitive astrobiology is long, but it tends toward rationality, in which I discussed the extremely slow uptake of philosophical ideas; part of this slow uptake is due to ideas being familiar only in some restricted context and not applied, or not appreciated, outside that context. And most of this concerns the circulation of ideas prior to the internet, which latter has given a megaphone to noise in the ceaseless competition that is the attention economy, vying for the eyeballs of consumers who might, just might, spend a dollar on a product being advertised. And it's going to get worse before it gets better. The forces fighting to capture eyeballs have been set loose at an industrial scale. This is such an appallingly reductionist way to view human activity — the fact that the work of writers and artists is called "content" tells you all you need to know — that it can make a person rather pessimistic for the future.

For anyone working today on what they believe to be fundamental philosophical or scientific work, or writing a great novel or great poetry, the dilemma is existential. No one can come close to even surveying all that is available. At some point you have to draw the line and determine what your influences will be, and you will have to accept the fact that you are going to miss some important work that is going on elsewhere — possibly because the others engaged in such work are also having to quasi-isolate themselves simply in order to get something done. If you spend all your time surveying the new literature, you'll know what's going on in a given field, but you won't have the time or the energy remaining to do original work yourself.

Add to this that the field of creative work will now be diluted by the productions of artificial intelligence, and it is easy to see that we will all have to use artificial intelligence helpmeets that we carefully train to our scholarly interests in order to filter for us what will be of value to our work. On one level this sounds great. In the reality of contemporary scholarly work, certain departments of certain institutions become attached to the tradition associated with a given thinker or thinkers, and those who are interested in this tradition are then attracted to join these departments as faculty or students. Joining such a department is a de facto admission that this particular influence is going to dominate one's outlook, but one accepts this as the price of specialization. In some ways, the use of artificial intelligence to assist in research will be less specialized than the model I have just described, but by now everyone knows that AI agents are deeply compromised by the agendas of the large technology companies that are funding their development. If your AI assistant is an off-the-rack bot because you can't afford anything better, then the real filter is going to be the developers of the technology and not the sources you would want brought to your attention if only you knew of them.

In the above I have singled out intellectual work that intends to be fundamental and artistic work that is great, or aspires to greatness. I make this qualification because the way in which book publishing and art has been turned into an industry has given encouragement to the very worst kind of grifters — people who have nothing of their own to say, but who like the idea of being famous and influential because of their work, and so they game the system. As such individuals are more persistent in playing the game, since they aren’t interested in the work for its own sake, they often end up being more successful than the people who are actually trying to do something of value.

Jan van Eyck included the donor (who paid for the painting), but not himself.

Few societies have satisfactorily solved the problem of how to support intellectuals and artists (and I hesitate to use those terms because they have become so tainted by association with the grifters mentioned in the previous paragraph). For the greater part of human history, the elite classes with resources to spare would support a few artists who could flatter them and produce works to amuse them. This system is open to grifters as well, and no doubt it produces a lot of hopeful supplicants who are never quite successful. I have often noted the absurdly flattering dedicatory epistles that philosophers placed at the front of their books in an attempt to secure royal patronage. These dedications don't sit well with the modern reader (at least, they don't sit well with me), but they certainly do get the point across that philosophers needed to curry favor if they were going to promote their work.

I am not saying that aristocratic patronage was a uniformly bad thing, though the dedicatory epistles are a little thick on the flattery, and in retrospect they do not reflect well on their authors. Many years ago when I was in high school I had an art teacher who made fantastically imaginative pottery, and I can still remember a conversation we had in which she talked about how artists in the past had had patrons who supported their work. The way she described this clearly reflected her approval of the arrangement, and I’m sure she would have preferred making her pottery for patrons who prided themselves on their connoisseurship rather than teaching uncomprehending high school students or talking to the equally uncomprehending parents of students.

Diego Velázquez put himself in this painting along with his aristocratic patrons.

There is much to be said for the system of patronage. In aristocratic governments, the accidents of history place essentially ordinary people in positions of great power (due to how the mechanisms of inheritance play out over the long term), and this meant that the artists who produced works to tickle the fancy of their patrons had to keep these works within certain bounds of propriety. Now that faceless committees give grants, and most of the people on the committee come from the same academic milieu as the artists and writers seeking support, the art that is supported in this way has become unbearably self-referential and trivial. When there was a man of ordinary (if educated) tastes in the loop as the person who ultimately controlled the money, the worst excesses that we see today were avoided. However, I think that this tends to work better with art than with philosophy.

So the problem is not new, and all the solutions that we find in history are compromises with inconsistent outcomes. One can see the attractiveness of the Marxist dream in which each would contribute according to his ability, and be supported in accordance with his needs, but this never works in practice. It would be tiresome to recite the reasons it doesn’t work, but specifically in relation to creative work — in the above I singled out artists and philosophers — we would get a lot of mediocre individuals producing mediocre work and contributing nothing of real value.

Jean François Millet, Gleaning in Belgium

Now, in the future of maximized abundance imagined by some technology utopians, this would be a good thing. People aren’t going to have to work in order to live, so if they can invest themselves in producing a lot of mediocre paintings, that is much better than them getting into trouble. So you collect your UBI and you go back to your studio imagining that you’re the next Monet or Millet, and all is well with the world as robots produce enough food, clothing, and housing for everyone. That’s part of the problem solved, but the problem is multi-faceted and each facet needs its own solution. In a conversation recently I made the point that most people need order and structure in their lives, they need to do something practical to keep themselves busy, and they need to feel needed (or to feel useful) for practical reasons. Again, for most people, employment fills these needs. Without this structure, many would descend into alcoholism and drug addiction. This isn’t everyone. Just as with mediocre artists (or, for that matter, mediocre philosophers), this represents part of the problem but not the whole problem.

Perhaps you're getting an eerie feeling that what I am describing sounds like Nietzsche's "last man" or Fukuyama's description of the "end of history." A lot of people doing a lot of pointless things until they die. It's not an inspiring vision, and as long as resources are finite, there will be competition for these resources, the competition will drive conflict, and the conflict will keep humanity from terminal stagnation. But the two most expansive contemporary visions of the future — space exploration and life in virtual environments — imply declining scarcity and resources in such abundance that they are beyond the ability of human beings to exhaust. Each has its own problems and its unique opportunities. Space has more resources than human beings can exhaust, but accessing those resources requires considerable effort and involves great difficulty. (But this, as we have seen, is a good thing for those who need to feel needed and useful.) Life in a virtual utopia seems to uncouple growth from the familiar industrial model, but it is predicated upon escalating energy requirements that have to be fulfilled in the real world in order to power these virtual worlds.

These two visions for the future implicitly lay out an economic and technological program that would enable their realization, and both have ramifications for individuals who would want to pursue scholarship and creative work in these futures (which are not mutually exclusive). If everyone wanted to leave the actual world for the adventures and excitement of virtual worlds, there would be a lot of work to be done by artists and architects and storytellers to create these exciting worlds, though I don't see much of a role for philosophers and the kind of intellectual activity done by philosophers and scientists.

From this point of view, a spacefaring civilization would be closer to the social structure of traditional civilizations than is the social structure implied by the virtualization imperative. In an expanding spacefaring civilization, there would be all of the people doing the hard work of creating off-world infrastructure, but there would always be niches for artists and philosophers, as well as new and exotic forms of human experience that would inspire new artistic and philosophical attempts to assimilate them.
