American Higher Education: An Obligation to the Future


By Vartan Gregorian, President, Carnegie Corporation of New York


 


In recent years, a debate has been raging among policymakers, students, educators, concerned parents, and many others about the purpose of higher education: is it meant to help develop an inquiring mind and a deep appreciation for how knowledge enriches one’s lifelong personal and professional achievements, or should it be focused simply on gaining the skills to pursue a well-paying career? In other words, we seem to have divided higher education into a black-and-white scenario in which either an individual becomes a sort of pie-in-the-sky dreamer, well-read and able to quote great thinkers but probably starving in a garret while unable to get a decent job, or else he or she graduates from college, immediately plunges into the world of technologically complex, high-stakes, high-financial-reward work, and becomes a “great success.”

Perhaps the time has come to reconsider that either-or proposition about higher education. The issue is too complex to be addressed in such a simplified manner. For example, as a new study1 from the Association of American Colleges and Universities (AACU) reports, “Students, parents, and policymakers interested in the ‘return on investment’ of college education [often assume] that a major in a liberal arts field has a negative effect on employment prospects and earnings potential.” But the AACU study makes clear there is compelling evidence that a liberal arts degree continues to be a sound investment, especially in these difficult economic times. The facts show that compared to students who major in professional, preprofessional, or STEM fields, liberal arts majors fare very well in terms of both earnings and long-term career success.

The specifics are indeed eye-opening. They reveal that over the long term, humanities graduates actually fare better than their peers who focused on particular professional fields. Upon graduating from college, those who majored in the humanities and social sciences made, on average, $26,271 in 2010 and 2011, slightly more than those in science and mathematics but less than those in engineering and in professional and preprofessional fields. However, by their peak earning age of 56 to 60, these individuals earned $66,185, putting them about $2,000 ahead of professional and preprofessional majors in the same age bracket.2 Further, employers want to hire men and women who have the ability to think and act based on deep, wide-ranging knowledge. For example, the report finds that 93 percent of employers agree that candidates’ demonstrated capacity to think critically, communicate clearly, and solve complex problems is more important than their undergraduate major, and 55 percent said that what they wanted from potential employees was both field-specific knowledge and skills and a broad range of knowledge and skills. Even more evidence of hiring managers’ interest in richly educated individuals is the finding that four out of five employers agree that all students should acquire broad knowledge in the liberal arts and sciences.3

Students do not have to make artificial choices between what they want to know about the world and the skills they need to succeed in it.

All this is heartening news in that it reminds us that the current generation of students—and those who follow them—do not have to make artificial choices between what they want to know about the world and the skills they need to succeed in it. But some are still not persuaded of this. In fact, it is interesting to note that various pundits have recently suggested yet another choice for students to consider—not going to college at all. The rationale behind that notion is that while the knowledge gained in college and university classrooms may be both wonderful and enlightening, it is not necessarily useful in “real life.” That seems an empty argument to me, and one that is refuted, for instance, by a quick glance at a recent list of the Forbes 400 richest people in America, which shows that 84 percent hold postsecondary degrees. Similarly, of the Fortune 500 CEOs, 93 percent have a college degree—many in the humanities and social sciences.

The success of these individuals and others underscores a point I have often made to students: one of the immeasurable values of a liberal arts education is how it can open up a world of possibilities, including life and career paths that might otherwise have seemed unimaginable to a young man or woman just starting out. Navigating those possibilities is a wonderful challenge for someone who is motivated to explore his or her own potential: after all, if the only purpose of education were to train an individual for a specific job or skill, life would be much simpler—and, I might add, perhaps much less interesting.

With all that said, it remains clear that increasing our expertise in technology and related fields is critical to the progress of our society. Nevertheless, it is still useful to remind ourselves that the greatest service technology can provide us is as an adjunct to knowledge, not as a replacement for it. Technology by itself is not a creator of content. Though the Internet and all the technological devices that now connect us to it have made it possible for much of humanity to have access to a virtual Library of Alexandria, access alone does not equal knowledge. The ability to carry around the entire corpus of Greek literature on an iPhone or some similar device may be astonishing, but that does not mean that the individual who possesses such a device actually knows anything about Greek literature. One still has to read. One still has to listen and see with one’s own eyes. One still has to ponder ideas, explore the realms of both material and spiritual knowledge, and discuss these matters with other people.

It is still useful to remind ourselves that the greatest service technology can provide us is as an adjunct to knowledge, not as a replacement for it.

In that connection, I would argue that the deep-seated yearning for knowledge and understanding endemic to human beings is an ideal that a liberal arts education is singularly suited to fulfill. Albert Einstein, in his inimitable fashion, went right to the heart of the matter, asserting that the practical men and women among us try to explain all phenomena by cause and effect. But, Einstein said, “This way of looking at things always answers only the question ‘Why?’ but never the question, ‘To what end?’”4 To search for even a glimpse of the answers to such great philosophical conundrums one needs to know not only what is taught in a classroom, but also how to think for oneself.

Of course, one also has to know history, particularly the history of one’s own nation. In that regard, as Americans, we have an obligation, as citizens to whom the future of our country has been entrusted, to understand the obstacles we have faced in the past and both the problems and opportunities that lie ahead. As Benjamin Franklin said, issuing a still-timely challenge in response to a query at the close of the Constitutional Convention of 1787, what the Founding Fathers had created was “A Republic, if you can keep it.”

Keep it we must, and we will, but to do so we need an informed and educated citizenry who can take full advantage of the almost 4,200 colleges and universities in our country, including some 1,700 public and private two-year institutions. And let me point out that computers and Web sites have yet to put those colleges and universities out of business. Why is this? For one simple reason: we are not a virtual society. Not yet. Human beings, by their very nature, are rational, spiritual, and social beings. They are not abstractions. They are not socioeconomic, consumer or entertainment units destined to be confined inside the small world of their cubicles and subject to what I call “cubicle alienation.” Even though people can watch almost any movie they want on demand from their cable service or on DVDs, men and women still go to movie houses to share the experience of being immersed in a story told through sound and images in the company of other human beings. People have Bibles, Talmuds, and Korans in their homes, but they still go to churches, synagogues, and mosques to share their common bonds and traditions. People need to be part of a community—and for many, the college classroom provides an invaluable experience of community and collaboration.

The diversity of talents, interests and aims of the men and women who look to higher education to help them reach their goals is mirrored by the diversity of our colleges and universities, from which our system of higher education draws great strength. Individual institutions have traditionally addressed different local, regional, national and international needs by providing educational opportunities to diverse populations, expanding scientific and technical knowledge, offering continuing education, and more.

But that certainly wasn’t always the case. Higher education was available to only a small proportion of America’s population until Congress enacted the Land Grant College Act in 1862. This legislation—the first Morrill Act—was, astonishingly, passed in the middle of the Civil War (making it clear how strongly both President Lincoln and Congress felt about the importance of education, as well as about the future of the nation) and, in effect, put universities where the people were. The Act not only provided much greater access to higher education, it also promoted specialized training and spurred the development of both theoretical knowledge and its practical application. The Industrial Revolution was in full swing, and the Morrill Act helped to provide the research and the educated workforce that were desperately needed in agriculture, mining and manufacturing.

Today, there are new challenges, and one of the greatest facing higher education is how to protect the diversity of our colleges and universities at a time when, instead of an emphasis on variety and competition—which affect all aspects of higher education, from recruiting students to developing curricula—there seems to be a worrisome trend towards uniformity. Joseph Aoun, President of Northeastern University, expressed similar ideas in a recent op-ed5 in which he discusses how higher education must begin to respond to an increasingly diverse student body, with different needs, different goals, and different expectations. His particular emphasis is on the growing number of students who are not following the path directly from high school graduation to the college campus. As he points out, “The ‘traditional’ college student aged 18 to 22 is no longer the norm. Many people still think that the typical college student is an 18- to 22-year-old who’s attending a four-year residential institution. But according to some estimates, nontraditional students—returning adults, part-time students, midcareer professionals, and every other permutation of learner—now make up 85 percent of all undergraduates.”

The diversity of talents, interests and aims of the men and women who look to higher education to help them reach their goals is mirrored by the diversity of our colleges and universities.

I believe that startling statistic helps to provide an answer to the question with which I began this essay: is there a value to the kind of education that promotes the ability to become a lifelong learner? Clearly, the answer is a resounding yes, if education is going to be a resource available to all Americans that can parallel their path through life, if that is what they need. Noted author and Columbia University professor Andrew Delbanco addresses similar concerns in his recent book, College: What It Was, Is, and Should Be6, suggesting that higher education should offer more to students than a rigid curriculum and a lock-step parade towards a degree. As he suggests, though more and more students are going to college with “the narrow aim of obtaining a preprofessional credential” (a phenomenon he attributes to the accelerating commercialization of American higher education), guiding young men and women down this path is a mistake. In fact, he argues, it means that they are losing the chance to experience the traditional—and wonderful—attributes of the undergraduate years, “an exploratory time for students to discover their passions and test ideas and values with the help of teachers and peers…” He also worries that this kind of multi-faceted, aspirational education is in danger of becoming available only to the wealthy and privileged, which would pose a great danger to the progress of American society. While science, technology, engineering, and math play an increasingly prominent role in our globalized economy, innovation still requires original and imaginative thinking. The new discoveries that will improve the living conditions, health, and welfare of men, women, and children around the world will not be found without those who have the education to work toward those discoveries. And if we do not nurture the talent among us, who will provide literature and art and music for ages yet to come?

These are some of the purposes for which we, as a society, created, supported, and continue to value a liberal arts-oriented college education. As W. E. B. Du Bois said, “The true college will ever have one goal—not to earn meat, but to know the end and aim of that life which meat nourishes.”7

For myself, I believe that the immeasurable value of American higher education, and the potential it has to open doors to a future of one’s own making, is the proverbial pearl beyond price that we must all cherish. That is one of the reasons I am so gratified that some of our nation’s most eminent university leaders, along with prominent scientists, engineers, and others, are sharing their thoughts and ideas about higher education in this special edition of the Carnegie Reporter. I am pleased to be able to contribute to their work by including an address I gave to the President’s Council of the University of Tokyo (below), of which I am a member.

In many ways—and I can attest to this from personal experience—education is the bridge that allows us to travel from where we are to that further place where we can become who we want to be and do all the wonderful things we might otherwise only dream of. Whatever we can do as educators and citizens to strengthen that bridge is an obligation to the future that we all share.

 


 

Presentation by Vartan Gregorian to
The Seventh President’s Council
of the University of Tokyo

June 8, 2010

 


Let me begin by noting that the American university is incomparably the most democratic in the world. It’s popular in the best sense of the term, admitting and educating unprecedented numbers of men and women of every race and socioeconomic background. Students from every corner of the world—and here I speak for myself as well—have found a place in the nation’s incredible variety of colleges and universities, public or private, large or small, secular or sectarian, urban or rural, residential or commuter. Today there are more than 3,600 colleges and universities in the United States, including some 1,400 public and private two-year institutions.

U.S. colleges and universities enroll more than 19 million students and annually grant nearly 3 million degrees. Higher education employs more than 3.6 million people, including 2.6 million faculty, in what amounts to a more than $380 billion business.

The diversity of our education system gives it strength, great strength. Individual institutions have traditionally emphasized different functions that have complemented each other by addressing different local, regional, national, and international needs. They also provide educational opportunities to diverse populations, expand scientific and technical knowledge, offer opportunities for continuing education, and open their doors to the world. Until several years ago, two-thirds of all students from foreign countries studying abroad were in the United States.

In the last century, enrollment in American higher education grew from 4 percent of the college-age population in 1900 to almost 70 percent by the year 2000. Our student body, moreover, is incredibly diverse. Following a long period of little or no growth in total enrollment, the nation’s institutions of higher education are now seeing the biggest growth spurt since the baby boom generation arrived on campus in the 1960s.

Between 1995 and 2015, enrollments are expected to increase 16 percent, and one-third of that increase will be members of minority groups. By 2015, minority enrollment is anticipated to rise by almost 30 percent, or some 2 million students in absolute numbers, to represent almost 38 percent of undergraduate enrollment.

Clearly, there is a strong case to be made that American higher education is a vital and successful endeavor. But let me take a few moments here to review its history and highlight several aspects of higher education in the United States in order to understand the underpinnings of its success.

The first major opportunity for the expansion of American higher education came in 1862. Even in the middle of the Civil War, and despite the fact that some 500,000 people died in the greatest tragedy of American history, President Abraham Lincoln signed the Morrill Act, which established land-grant universities throughout the United States. The Morrill Act coincided with the Industrial Revolution, and it helped to establish universities just about everywhere the people of the United States were, and where they needed institutions of higher education that addressed their particular needs. Some of our current universities grew from these roots: the University of California, Davis, for example, deals with agriculture; in Wisconsin, the state university has a focus on the dairy industry; in Minnesota, on mining; and on and on. Because of the needs of each state, state resources were tapped at the time and folded into the educational curriculum.

The second most important revolution, in addition to the land-grant universities—which, by the way, have produced some 20 million degrees since their inception—was the establishment of the National Academy of Sciences. Again, it is remarkable to note that Lincoln had such faith in the strength and continuity of the U.S. that in 1863, while the Civil War raged on, he signed another piece of landmark legislation—a law that created the National Academy of Sciences. The Academy, which was established to advise Congress on “any subject of science or art,” has done that job well and expanded to include the National Research Council, the National Academy of Engineering, and the Institute of Medicine.

It was not until World War II, though, that the federal government began supporting university research in a significant way. Prior to that, research was done in Europe and in corporate laboratories. To strengthen U.S. growth in science, President Franklin Roosevelt established a commission headed by Vannevar Bush, a former professor at the Massachusetts Institute of Technology. Bush’s landmark report, a beautiful piece entitled Science: The Endless Frontier, was published in 1945 and adopted by President Truman. In it, Bush noted that industry naturally took the lead in applied research but was deterred by marketplace considerations from conducting pure research. Bush argued that it was the federal government’s responsibility to provide adequate funds for basic research, which pioneers the frontiers of human knowledge for the benefit of society. He also wrote that the nation’s universities were, by their very nature, best suited to take the lead in conducting basic research. Public funding, he said, would promote competition among researchers, and projects could be selected on the merits through a peer review process. Bush suggested that a federal agency should oversee the program, and Congress created the National Science Foundation to do the job in 1950.

The agency got off to a slow start, but after October 1957, when Sputnik was launched, support for science, science education, and basic research rose rapidly. From 1960 to 1966, federal spending on research not associated with defense leapt from $6 billion a year to between $35 billion and $40 billion. Until recent years, federal investment in research rarely fell below $20 billion a year, and much of this money went to universities. Giving the universities the lead in basic research—that’s the difference—turned out to be a brilliant policy. Instead of being centralized in government laboratories, as science tended to be in other parts of the world, scientific research became decentralized in American universities. This policy spurred a tremendous diversity of investment. It also gave graduate students significant research opportunities and helped spread scientific discoveries far and wide for the benefit of industry, medicine, and society as a whole.

Another revolutionary phase in American higher education came about in 1944 with what was known as the GI Bill of Rights. This legislation ranks up there in importance with the Morrill Act because the law, enacted at the height of World War II, opened the doors of America’s best colleges and universities to tens of thousands of veterans returning from the battlefields, ordinary Americans who had never dreamt of going to college and who were now actually being encouraged to do so by their government. The GI Bill made an already democratic system of higher education even more democratic in ways that were simply inconceivable in Europe and other parts of the world. In the following decades, the GI Bill—and its legislative offspring enacted during the wars in Korea and Vietnam, and now Iraq and Afghanistan—have resulted in the public investment of more than $60 billion in education and training for about 18 million veterans, including 8.5 million in higher education. Currently, the United States offers an education benefit as an incentive for people to join its all-volunteer military forces.

Shortly after World War II, in 1946, Congress also created the prestigious Fulbright scholarships, which all of you are familiar with and which have been enormously successful. All in all, there have been some 235,000 American and foreign Fulbright scholars—146,000 of them from countries other than the U.S. The program was created, by the way, as one of the best ways of investing in international education.

The noted sociologist David Riesman said that the greatest contribution to the American economy in the post-war period was the liberation of women.

In 1947, the democratization of higher education was advanced when the President’s Commission on Higher Education recommended that public education be made available up to the 14th grade, thus opening the door to the development of community colleges, or two-year colleges, which now play a major role in American higher education but also point to some of the problems I will discuss later.

In a more recent effort to promote international cooperation and security, Congress enacted the National Security Education Act of 1991, which provides scholarships for undergraduates and graduate students to study many of the less well-known languages and cultures in key regions of the world, including East Asia, Central Asia, and the Middle East, not to mention Eastern Europe, the former Soviet Union, and Africa.

Another major landmark was the creation of federal loan guarantees and subsidy programs, as well as outright grants, for college students. In the decades since its founding in 1965, the Federal Family Education Loan Program has funded more than 74 million student loans worth more than $180 billion. And in the years since the Pell Grant program—named after Senator Claiborne Pell—was created in 1973, more than $100 billion in grants have been awarded to an estimated 30 million postsecondary students.

Last but not least, let me add something important about Pell grants: when they were proposed, there was a big debate about whether to give the money to university presidents or to give it directly to students so the funds would be portable. It was decided—in fact, Clark Kerr of the University of California, who led the Carnegie Commission on Higher Education, recommended—that the money be designated as portable by students because this would create competition among universities. Many of Clark Kerr’s friends stopped talking to him after that recommendation, including his president. Thus, we can see that land-grant universities, the National Academy of Sciences, the GI Bill, Pell grants, and a host of other innovative strategies for advancing American higher education and increasing access to colleges and universities played a major role in enriching and expanding American education at the college and university level.

Naturally, the civil rights movement in the United States and the end of formal, legal discrimination also contributed to advancing higher education and educational access. In this connection, I should mention that my late friend, the noted sociologist David Riesman, said that the greatest contribution to the American economy in the post-war period was the liberation of women. He was right, because today some 54 to 58 percent of students enrolled in American higher education are women, and that, along with the advancement of minorities—especially Asians and African Americans—is truly revolutionary.

Now, let me turn to the problems facing American higher education. There are many things I could talk about. Problem number one is that when there was no competition, America could afford duplication in its higher education. The nation could afford to have thousands of colleges and universities because they provided educated leaders and skilled labor, while at the same time unskilled workers—those who could not afford higher education or who dropped out of school—could still find jobs in manufacturing and elsewhere. Today, that is no longer the case. Duplication in education is no longer affordable, and quality has become very important, a key to competition among educational institutions.

Perhaps the second most important problem is the state of public universities, which, as I indicated earlier, were created to be funded by public sources. Private institutions had to rely on private sources, on philanthropy. And parenthetically, ladies and gentlemen, as you know, philanthropy is a big deal in the United States. Annually, some $350 billion in philanthropic giving is disbursed by Americans, and not only the rich; 70 percent of those sums come from families with incomes of less than $100,000 a year. Giving has become an American phenomenon. Even during presidential campaigns and debates, candidates now have to reveal the amounts of their philanthropic giving because otherwise they will be seen as stingy, as cheapskates.

But now, the barriers between public and private funding of universities have all but disappeared. Both private and public universities seek support from private sources as well as from the public, with one major difference: when I came as a freshman to Stanford University in 1956, tuition and fees were $750 at Stanford and $50 at the University of California, Berkeley—yes, 50, five-oh. Now, costs have risen astronomically. Colleges and universities have to keep up with inflation and support the costs of laboratories and technology; of stocking their libraries; of building and maintaining dormitories and other facilities; of paying for athletics and for health and other types of insurance; of providing health, food, counseling and other services; of legal and government affairs departments, public affairs departments, and so on. In short, universities nowadays are like city-states. But what has changed over the years is that individual states can no longer afford by themselves to pay for public higher education. For example, I’m told that today only 8 or 9 percent of the funding needed for the University of Michigan comes from the state of Michigan; in Missouri, it’s 9 to 10 percent; in Maryland, 9 to 10 percent; and so on. The rest has to come from tuition, fees, federal research grants, federal loans and grants, as well as philanthropy, which was not how the system of supporting public higher education was supposed to work.

In addition, when Pell grants were inaugurated, there were two components: loans and outright grants. As time has passed, the proportion of loans to grants has changed, so that today more loans are given than grants. Hence, students often have to borrow money and then pay back those loans after they graduate; if they are unable to pay their debts or go into bankruptcy as a result of their debt burden, this will adversely affect their future, including their ability to find jobs and advance in their careers. If, on the other hand, they take jobs with low pay and, because of their low salaries, remain unable to pay off their loans, that prospect discourages some people from embarking on careers where the financial rewards are not great but the mission is important to society and the nation. As a former teacher myself, I have first-hand experience of that type of situation. If you become a teacher with a $30,000-a-year salary and you have to pay six to ten thousand dollars a year toward your college debt, especially if you get a higher degree, that’s a very serious challenge.



Yet another problem we face is universities of uneven quality, because we don’t have a national accrediting system; we have a regional accrediting system. In the absence of a steady flow of public and private funds, many higher education institutions rely on increased levels of enrollment as a way of meeting their budgets. This, naturally, affects quality. In addition, universities, by necessity, incur financial aid obligations that they sometimes cannot fully meet, because the more students they enroll, the more financial aid they have to provide. This situation is worsened by the fact that there is now a new, major enterprise competing for students: proprietary, primarily for-profit organizations along with online institutions such as the University of Phoenix and others, which have access to federal loans. These entities are expanding their reach exponentially. Currently, the U.S. Congress is investigating why a disproportionate share of Pell grants is going to proprietary and online schools. Some argue that Pell grants should not go to these institutions at all, but those who want specific kinds of job training (aspiring beauticians, technicians of various kinds, and so forth) argue that they should have access to the same kind of funding sources as other students.

So these are some of the problems. But there is still another that is among the most important of all, and it is the following: we all agree that what makes universities great is the quality of their faculties. I have always believed that the faculty is the bone marrow of the university. Students come and go, administrators come and go—even visionary leaders, though they be few and far between, come and go—but a university’s faculty provides continuity. In that connection, the challenge is that many universities cannot afford to maintain or recruit high-quality faculty, nor can they have the same number of top-level faculty that they did in the past. As a result, they resort to replenishing their ranks with adjunct and part-time faculty. The share of part-time faculty has increased from 22 percent to almost 40 percent at many universities, making the overall quality of their faculty questionable. I’m not referring to the Harvards, Princetons, Yales and others of that rank; I’m talking about those small colleges and public universities that cannot afford to maintain an excellent faculty roster and so must rely on part-timers in order to preserve themselves during difficult financial times. Remember, when you have part-time faculty, you save money because you don’t have to give them offices or provide benefits or sabbaticals or other types of resources. It’s almost as if piecework is being introduced into higher education.

One of the greatest challenges facing our society is how to distinguish between information, which may be true, false or some tangled combination of both, and real knowledge.

In addition, during times of financial crisis such as the one we find ourselves in now, another challenge arises: a growing impulse to do what is expedient, such as reducing the number of academic units required to graduate. Hence, I am not surprised that once again voices are being raised asking why the time required for a BA and other degrees can’t be reduced to three years. After all, some say, Oxford started with four years and then reduced it to three. Harvard copied the four-year system, and it has been with us since the beginning of the higher education system in the U.S., but why does it have to remain that way? Let’s reduce it. Quality, depth and richness of education don’t seem to factor into these suggestions.

This brings me to what may be the core crisis facing higher education today, and that is the onslaught of information that now accosts almost every human being in our borderless, always tuned in, always connected and interconnected globalized world. Perhaps nowhere is this flood of information more apparent than in the university—particularly in the United States. Never mind that much of the information is irrelevant to us and unusable. No matter, it still just keeps arriving in the form of books, monographs, periodicals, web sites, instant messages, social networking sites, films, DVDs, blogs, podcasts, e-mails, satellite and cable television shows and news programs, and the constant chirping of our BlackBerrys and smartphones—which, by the way, I hope you have turned off, if just for now!

While it is true that attention to detail is the hallmark of professional excellence, it is equally true that an overload of undigested facts is a sure recipe for mental gridlock. Not only do undigested facts not constitute structured knowledge but, unfortunately, the current explosion of information is also accompanied by its corollary pitfalls, such as obsolescence and counterfeit knowledge.

And, if you will indulge me for sacrificing the English language for a moment, another phenomenon we are confronting is the “Wikipediazation” of knowledge and education. At least in part, this is a result of the fact that we are all both givers and takers when it comes to running the machinery of the Information Age, particularly the virtual machinery. I am talking, of course, about the Internet. Let me tell you about a notorious event involving Wikipedia that has come to represent how easily false information can virally infect factual knowledge. What has come to be known as the Seigenthaler Incident began in 2005, when a false biography of the noted journalist John Seigenthaler, Sr., who was also an assistant to Robert Kennedy when he was Attorney General in the 1960s, was posted on Wikipedia. Among the scurrilous “facts” in the biography were that “For a short time, [Seigenthaler] was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.”

This horrendous misinformation—represented as truth—existed on Wikipedia for 132 days before Seigenthaler’s son, also a journalist, happened upon it and called his father. Seigenthaler, Sr., then had Wikipedia remove the hoax biography, but not before the same false facts had migrated to many other sites. Probably, somewhere in the estimated 30 billion online pages, it still exists. Wikipedia has taken steps to address this problem, but estimates are that there may be somewhere around two million distinct sites on the Internet, with more being created all the time, and there is no central authority, no group, individual or organization to oversee the accuracy of the information they purvey.

Clearly, therefore, one of the greatest challenges facing our society and contemporary civilization is how to distinguish between information—which may be true, false, or some tangled combination of both—and real knowledge. And further, how to transform knowledge into the indispensable nourishment of the human mind: genuine wisdom. As T. S. Eliot said, “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?”

Today’s universities—along with our colleges, libraries, learned societies and our scholars—have a great responsibility to help provide an answer to Eliot’s questions. More than ever, these institutions and individuals have a fundamental historical and social role to play in ensuring that as a society, we provide not just training but education, and not just education but culture as well. And that we teach students how to distill the bottomless cornucopia of information that is ceaselessly spilled out before them twenty-four hours a day, seven days a week, into knowledge that is relevant, useful, and reliable and that will enrich both their personal and professional lives.

This is not an easy task, especially in a nation where, as Susan Jacoby writes in her recent book, The Age of American Unreason, “the scales of American history have shifted heavily against the vibrant and varied intellectual life so essential to functional democracy. During the past four decades, America’s endemic anti-intellectual tendencies have been grievously exacerbated by a new species of semiconscious anti-rationalism, feeding on and fed by an ignorant popular culture of video images and unremitting noise that leaves no room for contemplation or logic. This new form of anti-rationalism, at odds not only with the nation’s heritage of eighteenth-century Enlightenment reason but with modern scientific knowledge, has propelled a surge of anti-intellectualism capable of inflicting vastly greater damage than its historical predecessors inflicted on American culture and politics.”

What Jacoby so forcefully points out is that ignorance is absolutely not bliss when both the strength of our democracy and the future of our society are at stake. And they may well be, for not only are we distracted and overwhelmed by the explosion of images, news, rumor, gossip, data, information and knowledge that bombard us every day, we also face dangerous levels of fragmentation of knowledge, dictated by the advances of science, learning, and the accumulation of several millennia of scholarship. It was not so long ago that Max Weber, writing about the fragmentation of knowledge and the advent of specialization, criticized the desiccated narrowness and the absence of spirit of the modern specialist. It was also this phenomenon that prompted Dostoevsky to lament, in The Brothers Karamazov, the scholars who “…have only analyzed the parts and overlooked the whole and, indeed, their blindness is marvelous!” In the same vein, José Ortega y Gasset, in his Revolt of the Masses, as early as the 1930s, decried the “barbarism of specialization.” Today, he wrote, we have more scientists, scholars and professional men and women than ever before, but fewer cultivated ones. To put the dilemma in 21st-century terms, I might describe this as everybody doing their own thing, but nobody really understanding what anybody else’s thing really is.

Unfortunately, the university, which was conceived of as embodying the unity of knowledge, has become an intellectual multiversity. The process of both growth and fragmentation of knowledge underway since the seventeenth century has accelerated in our time and only continues to intensify. The modern university consists of a tangle of specialties and sub-specialties, disciplines and sub-disciplines, within which specialization continues apace. The unity of knowledge has collapsed. The scope and the intensity of specialization are such that scholars and scientists have great difficulty in keeping up with the important yet overwhelming amount of scholarly literature of their own sub-specialties, not to mention their general disciplines. Even the traditional historical humanistic disciplines have become less and less viable as communities of discourse. As the late professor Wayne C. Booth put it wistfully in a Ryerson lecture he gave more than twenty years ago that still, sadly, sounds like breaking news from the education front: “Centuries have passed since the fateful moment…when the last of the Leonardo da Vincis could hope to cover the cognitive map. [Now], everyone has been reduced to knowing only one or two countries on the intellectual globe…[In our universities] we continue to discover just what a pitifully small corner of the cognitive world we live in.”

In that regard, I would add that this fragmentation of knowledge into more and more rigid, isolated areas is contributing to a kind of lopsidedness in the way education is organized and a growing disconnect between value-centered education and the kind of training that is aimed specifically at career preparation. What is hopeful is that there is a growing realization among the leaders of the nation’s higher education sector that this lopsided system of education is both deficient and dangerous; that we need a proper balance between preparation for careers and the cultivation of values; that general and liberal education is the thread that ought to weave a pattern of meaning into the total learning experience; and that unless such a balance is restored, career training will be ephemeral in applicability and delusive in worth, and value education will be casual, shifting and relativistic. I strongly believe that one of the great strengths of American higher education is that it is home for liberal arts education, which is a sound foundation for all the professions and professional schools.

Ignorance is absolutely not bliss when both the strength of our democracy and the future of our society are at stake.

In the words of Albert Einstein, “It is essential that the student acquire an understanding of and a lively feeling for values. He or she must acquire a vivid sense of the beautiful and the morally good. Otherwise he or she—with his or her specialized knowledge—more closely resembles a well-trained dog than a harmoniously developed person.” That is why, every year, whether I was a dean, president, or provost of a university, I always reminded incoming freshmen of the famous line in Sheridan’s The Critic (1779): the number of those who undergo the fatigue of judging for themselves is precious few. It is the task of higher education to increase the number of those who do undergo that fatigue.

To sum up, it seems to me that by trying to reduce the requirements for a degree and, at the same time, expecting to be able to break down education into specialized parts—each part swollen to overflowing with endlessly and exponentially increasing amounts of data and information—we are going in absolutely the wrong direction. Why? Because all this pushing and pulling and compartmentalizing presupposes that somehow, one’s education will eventually be finished, that it will come to an end where an individual can say, now I’ve graduated and I don’t have to learn anymore. But of course, you never graduate from your life and hence, you never really graduate from learning. One’s “formal” education is really just an introduction to learning, where the skills to go on educating oneself are acquired and inculcated into everyday life—because learning is a lifelong endeavor. In that connection, when I was president of Brown, one day I decided, as a joke or an ironic act, to propose awarding two kinds of degrees: one certifying the subjects that you know, the other certifying the subjects that you do not know. Most thought it was a crazy idea, because parents would say, we paid you to educate our sons and daughters and instead, you’re giving us an uneducated person. So I decided that we’d just say the BA degree was, as I’ve described above, an introduction to learning, an undertaking that must be carried on throughout all the years of one’s life.

One of the greatest strengths of American higher education is that it is home for liberal arts education, which is a sound foundation for all the professions and professional schools.

In order to further make my point about lifelong learning, let me share this one last story with you. Some years ago, when asked to give a major speech to an illustrious gathering at Southern Methodist University, instead of a speech, I gave an exam. I said, imagine that you are the last person on earth. Nothing is left, no monuments, no other human beings, no libraries, no archives and hence, you are the best-educated person on the planet. Suddenly, the Martians land and they want to debrief you, the last human being standing, so they can preserve the history of humanity and the civilizations of the planet Earth. They begin by asking you questions such as: We heard that you had some objects that could fly, but that’s such an antiquated mode of transportation, so can you explain to us the principles by which these objects were made to fly? After all, your society awarded PhDs and MDs and all kinds of other degrees to people like yourself, so can you just prepare a schematic for us about these flying things? And we also heard that you had some kind of ships that could travel under water, but how was that possible? We also heard that you were able to phone each other, and despite mountains and oceans and so forth, you could talk to each other across thousands of miles; how did that work? And, oh yes, we’d also like to have the maps of all the continents, so can you draw them for us? Please include all the nations along with rivers, counties, capitals, and so forth. After all, we understand that you are an educated person, so these things should be easy for you.

Then I said to the gathering—still speaking on behalf of the head Martian—there’s another subject we Martians want to know about. We have a long list of the names of the religions that people on Earth followed, and they were well-represented in the United States. We don’t quite understand the differences between these religions and why you argued about them century after century. Here is just part of the list we have: Hinduism, Islam, Judaism, Jainism, Sikhism, Shintoism, Confucianism, the Baha’i faith, and then the different forms of Christianity: Catholics, Protestants, Baptists, Southern Baptists, Lutherans, Pentecostals, Evangelicals, Amish, Mormons, Jehovah’s Witnesses, Seventh Day Adventists, Greek Orthodox, Eastern Orthodox, and Russian Orthodox. Could you please pick five of these and tell us where they agree and where they disagree? Of course, there was dead silence in the audience. So I concluded my “exam” by saying, I thank you for not being the last man or woman on Earth, because education is a life-long experience and endeavor, and I believe you might have some catching up to do…!

In a way, perhaps we all have constant “catching up” to do when it comes to finding ways to address the many challenges facing our colleges and universities. But we will find them, I am sure, because in the words of Henry Rosovsky8, the economist and educator, in higher education, “‘made in America’ is still the finest label.” We all should have a hand in ensuring that continues to be true.

1 How Liberal Arts and Sciences Majors Fare in Employment: A Report on Earnings and Long-Term Career Paths, by Debra Humphreys and Patrick Kelly, published by the Association of American Colleges and Universities.
2 “Humanities Majors Don’t Fare As Badly As Portrayed, New Earnings Report Says,” HuffPost College, January 23, 2014.
3 Op. cit., How Liberal Arts and Sciences Majors Fare in Employment.
4 The Born-Einstein Letters 1916-1955 (Macmillan Press Ltd., 1971; 2005).
5 “To Meet President Obama’s Job Goals, Involve All Colleges,” Bloomberg Businessweek, January 29, 2014.
6 College: What It Was, Is, and Should Be (Princeton University Press, 2012).
7 Ibid.
8 See page 59 of this magazine for Henry Rosovsky’s article, “Research Universities: American Exceptionalism?”
