The world has recently learned of the death of Steve Jobs. He is being hailed as a hero who virtually single-handedly created the world of technology we live in. He started the first personal computer company in his parents’ garage, took it public, and drove the computer industry with his relentless vision, built not on the designs of his engineering partner, Steve Wozniak, but on his own sense of how consumers would interact with the computer. Wozniak, who had technical skills but no idea how to translate them into a business, acknowledges Jobs’ genius for business in the following clip:
Jobs became famous for wanting to change the world, and he did so many times, marketing to the masses a scientific innovation made by others that he had seen at Xerox PARC: the graphics-based computer, with its mouse and its more intuitive design. But that’s what happens with business innovators. They take ideas from others and market them better than anyone else. Then, after their innovation is followed by others, they claim credit for the innovation itself. This is why, after Bill Gates copied Steve Jobs’ copying of Xerox PARC’s ideas, Jobs claimed that his invention had been stolen by his lifelong rival: “Bill is basically unimaginative,” he said, “and has never invented anything, which is why I think he’s more comfortable now in philanthropy than technology. He just shamelessly ripped off other people’s ideas.”
But Gates took Jobs’ idea for the graphical user interface and marketed it to the masses, leaving Jobs’ company as a niche player that charged higher prices for the premium of owning an Apple computer. As Apple settled into controlling only 5% of the market, Steve Jobs was fired as CEO. He went on to found NeXT Computers and gave a small company called Pixar some startup capital, with which it experimented and eventually produced the first all-digital film, Toy Story.
These things would have been enough to secure his legacy as one of the great technology leaders at the turn of the century, and yet his greatest contributions to technology were still to come. After Apple failed to win its share of the mass market, Jobs was invited back as CEO. He quickly got the company into the black, and then, drawing on the experience designing high-end graphics workstations he had gleaned at NeXT, he invented the iMac, followed by the iPod, still the most popular MP3 player in the world. He changed the distribution of music from record stores to an online system with iTunes. He changed retailing by opening a line of Apple Stores, which boast the highest sales per square foot of floor space of any company in the world. Then he finally got the recipe right for the shift away from the personal computer to a cheaper device, something people had been trying to invent for years with only limited success, when he introduced the iPad.
As he did so, his company, which he had rescued from the red and so from potential oblivion, briefly became the most valuable company in the world by market capitalization.
Me and Jobs as Products of the 60s
Steve Jobs was only seven years older than I am, and he and I are both products of the 60s. We both went to college; we both dropped out. When he was in college, he took a class in calligraphy, which he later credited for his obsession with fonts in the Mac OS. I, too, took a class in calligraphy (it was called paleography when I took it) when I was in graduate school, but I have nothing to show for it except a story that no one wants to hear about how I learned to instantly determine which font (of the 30–40 we studied) I was looking at, using a decision tree I made in class. As a result, Steve Jobs died infinitely wealthier than I ever hope to be.
I can’t decide whether this matters. Nobody pays any attention to my blog, but I write it without any expectation that anyone will be interested in what I am writing, being more interested in expressing my thoughts on what matters to me. Steve Jobs, on the other hand, was a cultural leader. When he spoke, people listened, because he was an oracle of the future. Listening to what Jobs had to say gave people insights into their own future that they lacked on their own. This is how a community is built, whereas my approach to my blog involves me in quite a bit of selfish blowharding that is not reflected in the larger culture. This thought was on my mind when I took the summer off from blogging. Unlike Steve Jobs, I have no desire to make money with my blog or my writing. I write my books because I want to read them and no one else in the culture is writing what I want to read.
My success as an author cannot be measured in monetary terms; and although I often have to explain to others the selfish philosophy that governs my life as a writer, I don’t regret the course my non-remunerative life has taken. I live happily with my wife, and she and my kids are all that has ever mattered to me. I spent my youth chasing idealist dreams, and I pursued them into graduate school. My life after graduate school meant learning how business works, and it turns out that business is not at all how they told me the world works when I was in graduate school. As a result, I went to work as a minor cog in a world dominated by people like Bill Gates and Steve Jobs, and companies like Wal-Mart, who had mastered the way the world actually works.
I originally decided to go into academia because I wanted a quiet life in which I could be in control of my destiny. When I realized that academia and I have different sets of values, it and I parted ways, and I turned to entrepreneurship, which I learned about by reading my hundred books on business, marketing, finance, and the like. I was happier doing this than I ever was in academia, which put limits on my ability to think outside the box despite academics’ belief that they were the only people who could think outside boxes. I learned a great deal from reading about business, including my insight (derived from Pareto, but like Jobs I’ll be happy to claim invention) into the basic inequality of nature, which contradicted everything they had taught me in graduate school. I took a job as a temp, got hired full-time as a secretary, got promoted to market analyst based on my skills as a programmer, quit that job for one in the field I had been promoted for (which I rightly perceived to be computer programming, not my remedial skill as a marketer), and eventually went to work for myself as an independent contractor. There, I was happier than I had ever been in grad school, where people had attempted to keep me in line by constantly testing my political allegiances. I would still be an entrepreneur if not for my having had a stroke in 2004. So I changed my profession again. I founded a company, and I now view myself as an entrepreneur of ideas gleaned from my own attempt to integrate my academic experience with my business experience, which I package in my books.
Given the difference in outcomes between myself and Steve Jobs, it might seem unfair for me to poke holes in the thoughts of a man who is among the last great American entrepreneurs (so far), but that is exactly my intention in this post.
Going Back to School
The chief difference between myself and Steve Jobs (in my humble opinion) is that he went to work after dropping out of college, whereas I felt that work was less fulfilling. I read incessantly when I was out of college, and when I went back I found that I was better read than almost all of my classmates. But what I was missing, and the reason I went back to college, was a sense that I had not been able to give myself a well-rounded education on my own. Out of school, I had encountered Joseph Campbell, the first of many comprehensive thinkers about the world; but Campbell left me with some deep questions about how I could reconcile the world I lived in with the world of “the word behind the words,” which he pointed to as holding answers that words themselves could not reach. This seemed to take me out of my independent and individual self and impose upon me a requirement that I alienate myself from myself and instead grasp a new set of principles based in our common inheritance, with no guarantee that I would ever be able to get back from that divisionless space to my individual self. The whole thing required that I believe the “higher” construction was real and not a delightful but impossible fiction. Unfortunately, I could not believe this, and I went back to college looking for answers to what appeared to me to be unresolvable questions. I was sure that someone knew.
As soon as I got into graduate school, it turned out that others had discovered a similar gap between words and what they refer to. Derrida’s work fascinated me, and I attempted to work it into the knowledge I had built up over two years working in a local bank. I came to a different conclusion than my academic colleagues, many of whom had never had any business experience. For them, going into business meant simply a capitulation to greed; and more than once I had a conversation with academics who believed that they could have gone into business and made money, but whose point of pride was that they had made a conscious decision to turn away from making money altogether in pursuit of a “higher” calling.
I’ve always been very wary of such professions. In my own life, I’ve attempted to learn about business, because my parents told me that I should learn enough to follow all of the things reported in the news. This involves a smattering of national and local politics, sports, weather, and of course, business. When I was young I never cared much for sports, and weather was something that, as Mark Twain once said, there’s not much you can do about changing. Politics and business, on the other hand, require a good memory for past behavior and the ability to predict future behavior based on your deeper knowledge of the past. This makes both politics and business appropriate subjects for intellectual inquiry.
In the 1990s, when I was in graduate school, New Criticism, with its sense that aesthetic objects were to be accounted for as “autotelic” objects without reference to culture or any other external factors, was waning. In its place came a New Historicism, which made culture the static metaphysical object in the universe and placed the individual in negotiation with something larger than itself. But with the switch from the autotelic to the negotiated, I still perceived a problem in the resulting configuration; for it seemed to me that the same problem existed with culture as had existed with the metaphysical individual at the center of the aesthetic universe. No one could say what the boundaries of culture were any more than they could say what the boundaries of the individual were.
With a new idea triumphing over an older one, people in academia were sure that they had finally reached the Promised Land. Having gotten there, there was no more reason to search the world for cracks in their own configuration of it; all that remained to do was to cleanse the academic world of those who did not believe as everyone in the academic world believed. This demotion of the individual played out as it does in the world of politics, where two opposing points of view are posited, and through election one wins out. Academics secured the election which had taken place within their ivory tower by declaring within that ivory tower a state of permanent revolution, in which only one side (the left) could perceive the “truth.” This made it very difficult for me to ask questions about things that had already been decided; and it made me into a creature of the right within academia, because only someone on the right could ask questions involving the resurgence of an idea as old as individual liberty without negotiation with larger collective forces. And within the medieval period, which sported more conservative scholars, I was thought to be too liberal in my desire to throw open all things medieval to the forces of Derrida’s corrupting vision of society. I was firmly on the left, as far as most of my medieval professors were concerned.
I find it endlessly fascinating what happened in academia in the 90s. Rather than looking within their ranks for cracks in their system, academics began to displace the frustrations they had with their own “perfect” system onto business people, whom they excluded, and who in turn had dismissed academic thought as a useless pursuit and who (according to my academic colleagues) were more concerned with their own greedy point of view than with collective action. This placed “them,” by definition, far from the “truth.” But at the same time as my academic colleagues were making the case for the absolute exclusion of business people from the universe of wisdom, even a cursory reading of the logic of definition would convince anyone that definitions are relative, not absolute. In my opinion, my academic colleagues had made an unacknowledged switch, which I am in the habit of calling the Absolute-Relative Switch. In such a switch, you reserve relative constructions for your own party (this can be done on the left and the right), while maintaining that the other party means what they say absolutely. This is the stuff that radio partisans thrive on, but academics and those on the left are not immune to the fallacy.
Now, in my world, business has always been excluded from the liberal arts on the grounds that the study of business does not fit the standard configuration of the universe given to us by academia. In academia, some people have knowledge, while other people don’t. This is the way classrooms work: teachers have knowledge, while students are (or should be) in class to learn what their teachers have spent a lifetime learning. This makes it possible for academics to congratulate themselves on pursuing a higher calling, while demonizing their students, who are not thought to be as serious about the “higher” calling of the life of the mind as their professors are. But this is only true if a professor occupies a secure position in the world and not one of many relative positions that one could take. Admitting otherwise would destroy the classroom setting by making the distinction between teacher and student a completely arbitrary thing, so my academic colleagues maintain absolute positions on some things (like the importance of knowledge and of teachers who pass accumulated knowledge on to their students) in the otherwise arbitrary universe in which business people operate.
I just didn’t think that my academic colleagues, who had walled themselves off from society by relying on a firm (read: absolute) boundary between themselves and the world that they judged without wanting to be judged, were correct in their assessment of how easy it would be to make money in the world had they chosen the path they dismissed as merely the path of greed. I found them to be as greedy as (neither more nor less than) their fellow men who stood outside their arbitrarily constructed walls.
My experience with business has been that business is organized on different principles altogether. Whereas academics can elevate themselves to a higher world while dismissing the lower world as one of “greed,” in a bit of what we academics used to call synchronic analysis, business people take a more diachronic view of their position in the universe.
And here’s the rub. If you follow my link under diachronic analysis, you will find that it leads to the notion of “historical analysis.” Looked at from this academic point of view, it appears that business people are shallow thinkers who think in “lower” terms, while academics pursue a set of “higher” values. This accords with Aristotle, who said in Part IX of his Poetics that “Poetry, therefore, is a more philosophical and a higher thing than history: for poetry tends to express the universal, history the particular.” This, too, reflects the academic position that the first thing one must do to seek the “truth” is to abandon one’s attachment to one’s individual life and instead tend to a “higher” truth. According to this model, the business person has no notion that there is a “higher” world that could be pursued, if only they would give up their base and debasing focus on themselves at the expense of their betters, who have turned away from selfish greed.
In many senses, my academic colleagues are correct. [See the first point in my article on Rush Limbaugh, who dismisses Darwin as one of the two worst thinkers in history in favor of a (presumably) static world in which things don’t change beyond a certain point.] But that is beside the point. The academic viewpoint is limited to thinking about the past, as diachronic linguistics’ reference to “historical linguistics” ought to tell us. In such a universe, there is no room for thinking about the future. As a result, academics tend to believe that the future will come out of present experience, as I note here, and they disallow all other changes and ideas that do not pass through their hands.
But there is another problem with the academic construction of the problem: such a “higher” truth based in “historical linguistics” might itself be a fictional construction. And here I perceive the difference between academic thinkers and business people. Academics spend a great deal of time thinking about the historical past but cannot tell with certainty what the future will hold, except that it must of necessity come out of the historical experience that only academics have fully grasped. Business people, despite not being very good academic thinkers, spend a lot more time thinking about the future than academics do, because success in business involves having a new vision that has never been thought of before. The past is a deep and detailed record of things that have happened; but I learned in my year of 100 books that it is useless to concentrate on the rise of the railroads except as a model of the past. New ideas come from thinking outside the box, which academics are all for, but only to the extent that those ideas are included in the final box that thinkers end up with. If not, academics, like all dictators before them, will stand in the way of progress.
In my Poker Tales, I note some serious limitations of this model, which seems more appropriate to a European sensibility than to an American one. In particular, I noted the difference between European and American models of culture in my chapter on “Reykjavík” and later in my chapter on the “Four Parisians,” who come to America with high-minded ideals but get taken to the cleaners by the absolute fool “Belcher” Owens because they are not looking at the world as it is, but as they would like it to be. America works because we have a model of how the world works that is more efficient than the older European models: it does not hold on to any residual metaphysical constructs but allows prices to run free on the basis of two cooperating people involved in a transaction, without any metaphysical guide that would prescribe the (rather than a) just price. It was my aim in writing Poker Tales to rehabilitate aesthetic culture on the American model, which (like my point in graduate school) is built on no solid foundation whatsoever but only on someone’s being at the forefront of something so obvious and yet at the same time so brand new that no one has ever seen it before.
On the Cutting Edge
In that respect, Steve Jobs was on the cutting edge of societal evolution. He stepped outside the box and saw a world that other people could only follow once he had shown the way forward. In that respect, he was the upper 1% of the 1%. He was a leader who was able to adapt because he dropped out of school and went his own way. On his death, he has been hailed as a hero, the latest (and, everyone hopes, not the last) innovator in a world of followers. This was the basis of his belief that collective behavior is not responsible for new ideas; only a brilliant mind can see farther than others: “People don’t know what they want until you show it to them,” he once said.
Occupy Wall Street
At the moment Jobs died, we had reached a pivotal moment in American history. As wealth has grown, the difference between the wealthy and the poor has grown. This has given us Barack Obama, who wants to redistribute wealth along more equitable lines. I am all for this, as huge relative differences in wealth lead to divergent interests and a lack of social cohesion around common goals (this is why I voted for him). But Obama has attempted to redress the problem by using the academic viewpoint of Saul Alinsky, a man who is for the “real” people as opposed to the abstractions of big corporations. This continues the historical, and so academically respectable, position of excluding big business from the nobler goal of fighting for the little people. So deeply ingrained is this way of thinking that all of my friends on Facebook, with few exceptions, are clamoring to support Occupy Wall Street this week.
That’s fine, but it is their surety that the past will dictate the future that I find troubling. Three of my friends have ignored my warnings about their misunderstanding of the business mind that they wish to exclude from the conversation on account of its being made up of greedy SOBs who are not thinking about the collective good. Rather than take heed of my warnings, they stopped communicating with me (I am sure they think there’s something wrong with me and are too embarrassed by my evil (not just different) views on the subject to say so; but that is perhaps my own paranoia talking, and I can’t really know). Although they won’t say it to me personally, having grown up in a gentler age, I am convinced that each of them is thinking “he’s one of them,” the “other” whom the Occupy Wall Street folks continue to (very selectively) target. This indicates to me that their targets are more political than philosophical; but when I attempt to engage them on their (to my mind) errant philosophy, they either shut down, or they recast their public professions of loyalty as no more than private expressions of their own preferences and ask me to shut my pie hole, because they were just expressing their thoughts. This makes me the bad guy who is stepping on the untrammeled right of free speech; it is only when I stop objecting to their misconstrual of the philosophy that underlies their protests that free speech can again take center stage. Objections themselves have become reasons to support what “us” have always supported and to label as “them” whatever “them” object to, securing the “us”’s position from ever being subject to a philosophical challenge. “Us” knows what “us” knows, and we like it that way.
As I say, I don’t have a problem with anyone’s public expression of their views, but I do have a problem if you express your views and do not allow others to disagree with or question you on them. This is what happened to me in graduate school; and while I could have maintained my position as an outsider on the inside, I thought it would have been more work than it was worth to me personally. I, like the Old-Timer in my Poker Tales, went away and did my own thing without regard to the consequences for the collective needs of a society that had made it perfectly clear that my services (being so definitively “other” in the world of ‘us-or-them’) were unwelcome. I, like Steve Jobs, dropped out of college once more and went to work in the private sector, where I had no obligation other than to meet the needs of my customers through my superior knowledge of obscure things.
Steve Jobs as Master in the World of Niche Marketing
My desire to pursue my own goals at the expense of collective goals parallels the individual goals that made Steve Jobs a leader among economic producers in this economy. Jobs was the most successful exploiter of niche markets, in which the consumer stands still while producers must be nimble marketers in order to meet the consumer’s changing needs. I trace the development of the divide between producers and consumers in my essay on the Wal-Mart economy in my Writing for People Who Hate Writing, where I point out to the young consumer who wants a job in the productive society that writing is important in the world of production, but that it also requires a very different skill set than is required of you as a consumer of products marketed to you.
And to be clear, I’m not saying there’s anything wrong with Steve Jobs. But there is something wrong with Steve Jobs as a producer of metaphysical value. Metaphysics has been the principle on which we have hung our collective notion of art and aesthetic value. We see people all the time posting their thoughts on poetry and art, and in almost every instance they are trying to peer through and beyond reason to a whole and complete thought on which they can hang their whole and complete person. At the same time, people tend to find flaws in their whole and complete personae. This, I thought, was the lesson I learned from Derrida and his followers. There is no center at the center of ourselves. We will always be looking to maintain our sense of ourselves, while knowing that if we ever stop and find the center, we’ve made some sort of mistake. This is the point I made a long time ago talking about Nina Hagen.
It is in the middle space, between extremes, that I find the approximation to the “truth.” Such is the nature of “truth” that it must pass through imaginative re-creation in our minds before we can get to it. And the universal nature of imaginative interference means that we can never (never, never, never, never, never, never, never) get back to the ontology of truth. No one, not Joni Mitchell or Rush Limbaugh, has found it in its ontological perfection. It is, in my opinion, the weakness of both sides that they think they have come to the end of the road of “truth.” This is a too-easy solution in which “us” are in possession of “truth” and it is only “them” that stands in the way of forming a more perfect society. This seems to me the product of a specialist society in which no one knows the truth but in which, at the same time, everyone thinks that someone else knows it.
Limbaugh’s hero, William Buckley, once wrote: “Someone somewhere remarked that Erasmus was probably the last man on earth about whom it could more or less safely be generalized that he knew everything there was to know.” He then goes on to qualify his remark: “By ‘everything’ was meant everything in the Western canon.” This leaves out all the “other” cultures that didn’t participate in Western culture. And it was on precisely those “other” cultures that Steve Jobs placed his emphasis. But he, too, thought that there was an “end” to human problems when he contracted cancer. Like Steve McQueen before him, he chose experimental treatments more aligned with his own mind’s orientation to the world than traditional Western treatments. Jobs apparently believed his doctor when he told him that “he was either going to be one of the first ‘to outrun a cancer like this’ or be among the last ‘to die from it.’” He, like Limbaugh’s hero, was an idealist who thought it possible ever to have known everything. Buckley displaced “all-knowing” into the past. Jobs, being a business person, placed it in the immediate future, perhaps just out of reach but still graspable.
The reality of both positions is far grimmer. Sometimes perfectly good people (like me) are fine, and then they fall over, having had a stroke at 7:00 AM, right in the middle of a semester in which I was doing what I thought was the good work of teaching an introductory writing class at a local community college, rather than a class on allegory in the Middle Ages and Renaissance (my academic specialty) at an Ivy League school. Such things are random, and would be completely unnecessary in a rational world. But the world is not rational. How we deal with that fact tells us a lot about ourselves and our culture. In America, we tend to displace the faults of the world onto “others” in order to maintain our sense of ourselves as whole and complete persons. It is for this reason that Steve Jobs, despite all his brilliance, could not surrender his body to be opened up by others, and so (perhaps) died sooner than he would have had he followed a more scientific route to health.
Some folks surrender themselves to God, who is thought to be all-powerful. Others scapegoat “others”: rich people, or poor people, or blacks, or whites, or people who believe in ‘liberal’ or ‘conservative’ causes. But nobody, apparently, has decided that there will always be room for improvement in our relation to an evanescent “truth” that flits away each time you attempt to grasp it. My “middle way” is my attempt to keep open the avenues of truth in a universe where everybody has their version of the “truth,” and where having their own private version of it is good enough for them. But such a system rapidly becomes one of autonomous and private monads who grasp not themselves but only others as in any way limited. When I or anyone else attempts to challenge their most intimate and personal ideas, they can do no more than object to my bad faith.
I don’t resent Steve Jobs’ vision of the universe; his is one of many. But he made his money appealing to consumers who took him at his word and believed that they could have things delivered to them without having to look at the universe themselves for new ways to make money on their own. This consumer orientation is responsible for the utterly irresponsible demands of those members of Occupy Wall Street who are demanding a free college education that would continue the academic policies that make it possible in the first place for students not to understand how the producer end of the supply-and-demand chain actually works differently from the consumer end. Such a position will inevitably lead to a decline in productive workers (as it already has in the Jobs generation, as model producers are being freshly minted in China and the other BRIC countries but not in America itself) in favor of consumers who take no care for the very different skill sets required to make them into productive workers.
As with my experience in academia, I conceive of the problem differently, and I get frustrated sometimes by my lifelong friends’ inability to see things as I do. I put it down to their having been raised in a “culture” that Steve Jobs is largely responsible for. But as I have said before, “culture” is a choice as much as it is a metaphysical boundary of experience. And I, like Steve Jobs, have no obligation to participate in it except to transform it from a different (not necessarily better) position. But, unlike Steve Jobs, I recognize that there can be enormous consequences to taking personal choice too far. At some point, our rhetoric runs up against reality, and when that happens, something’s got to give. In every case in recorded history, unknowable reality trumps the knowledge of the wisest among us, no matter how shallow or deep the knowledge each of us carries around on a daily basis.
That makes it doubly or trebly or infinitely more important that we not lose ourselves in our own conceptions of how we want the world to be and try to transform it in our own image, but instead concentrate on how the world is and then react after the fact. This is the lesson that education should teach us. It is at that point, when we get full of ourselves, that we should remember our forefather Socrates, who said that his wisdom consisted in his knowing nothing. Sadly, however, the lesson of Socrates was attacked by Nietzsche, who hated Socrates for his claim to know nothing; and by Nietzsche’s predecessor, Rousseau, who thought, as the Occupy Wall Streeters still think, that our ignorance is only the consequence of mankind’s having fallen out of alignment with our original natures, which were once at one with nature’s equitable distribution of resources.
Such a position only makes sense if it is true. And the “truth” is not for the Occupy Wall Street crowds to know without a conversation with those who think differently than they do (as I do). As I note in my Writing for People Who Hate Writing, conversations take place in the “middle space” between two people who have firm opinions on how the world works. When they disagree, each should go back to their corner and rethink their position in relation to the different position the other has taken. After reconsidering, each should return to the “middle space” and try to make their case again, taking in all the points their opponent has made that seem good to them and dismissing, with carefully wrought arguments, those that do not measure up.
This is precisely what is not happening in American “culture” today. Both sides come to the table with their positions set in stone and expect the “other” side to budge. When it does not, each side becomes assured that its own position is more secure, while the position of the “other” is not just different but “evil.” Holding such atomic (monadic) positions, moreover, requires no education. Instead, it is the sort of “instant intellectualism” available to everybody of every class (as Descartes observed, everyone believes there is nothing wrong with their own thinking).
In my universe, nature is not equal in the first place. It seeks to preserve the strong and eliminate the weak, as Darwin (whom Limbaugh dismisses as one of the two worst thinkers in history) was the first to discern, and as Pareto first noted as a systematic feature of the natural universe. If I am right that Pareto had a better vision of the universe than Rousseau, then people like Obama and his college-educated followers in Occupy Wall Street are wrong to attempt to build a human society along the lines of a nature that was never equitable in the first place.
Conservatives abandoned education after they could not get heard during the PC decade of the 1990s. Steve Jobs, too, abandoned education after he found it too constraining. The reaction I would have expected was for academics to rethink their positions in light of their shrinking mandate. I stuck it out, because I have always believed that the better-educated mind is the superior mind, with the reservation that no one knows what the future holds.
I managed to make it through graduate school, but only by ignoring people who demanded my submission to their political construction of the universe and who were not really interested in much more than that submission to their power. Having passed relatively unscathed through an environment that others find so toxic, I have not lost my enthusiasm for education. But my experience has changed my opinion of the world. I no longer believe that politics is important at all. I do believe that philosophy is more important than ever. And I believe that neither of these positions has a place in American culture as it is currently configured.
I hope to change that through my art. But then, as I have often asked on these pages: Who am I to be saying any of this when so many famous people, past and present, have achieved fame saying different things, while I rest content in my suburban home, poor and far from New York, Washington, and LA, where the real work of building “culture” takes place?