Why does the United States keep using “old” date representations and imperial system, while being in the minority?


Other than the US (and a few other countries), the vast majority of countries use

  • International System of Units (SI).
  • Celsius temperature scale.
  • DMY or YMD date format¹.
  • 24-hour clock when written².
  • Monday as the first day of week³.

These differences can cause technical difficulties.

What are the historical reasons for the United States, one of the most advanced, powerful, and influential countries, to keep using units and date representations that were abandoned by most of the world's countries, especially developed ones?


¹ except the US, Philippines and a few other countries.
² except the US, Canada, Australia and a few other countries.
³ except the US, Canada, Mexico and a few other countries (also usually Saturday in the Middle East).


The Status of the Metric in the United States
Strictly speaking, the US has been "metric" since the Mendenhall Order, issued in 1893. The inch is defined as exactly 2.54 centimeters, the pound (mass) is exactly 0.45359237 kilograms, the pound force is exactly 4.4482216152605 newtons, and so on. The conversion factors have changed a bit since 1893, but that there are defined conversion factors has not.
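
For concreteness, here is a minimal sketch (in Python, using the exact factors quoted above) of what "defined conversion factors hidden underneath the hood" look like in practice; the constant and function names are just illustrative.

    # Exact legal definitions of some US customary units in terms of SI
    # (the values quoted in the paragraph above).
    INCH_IN_METERS = 0.0254                    # 1 inch = 2.54 cm exactly
    POUND_IN_KILOGRAMS = 0.45359237            # 1 lb (mass), exact
    POUND_FORCE_IN_NEWTONS = 4.4482216152605   # 1 lbf, exact

    def pounds_to_kilograms(pounds):
        return pounds * POUND_IN_KILOGRAMS

    # The "1 pound" can of peas from the example below:
    print(f"1 lb = {pounds_to_kilograms(1) * 1000:.1f} g")   # 453.6 g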

That said, there's a lot more to "going metric" than having some conversion factors hidden underneath the hood. There's a whole lot more to "going metric" than changing our speed limits and highway signs. Printing 453.6 grams in small print after a bold 1 pound on a can of peas is not "going metric", nor is exchanging the order of those units on that can of peas. Printing the size as 453.6 grams in bold and 1 pound in small, parenthesized print also is not "going metric."

"Going metric" means changing the size of that can of peas to 500 grams or 400 grams and printing the customary units (which will now be oddball numbers) in parentheses. It means changing the size of wires from American Wire Gauge to the metric wire standard, changing the sizes and pitches of screws and bolts from nice even fractions of an inch to nice even fractions of a centimeter. "Going metric" means changing the manufacturing base, from bottom to top.

European Measurement Systems in the 19th Century
No answer has yet mentioned the chaos of measurement systems in Europe prior to the French Revolution. Each country had its own system of units, or worse: oftentimes, towns separated by a day's ride had their own systems of units. There were no common standards, and it was that chaos the French Revolution tried to address. Continental European countries addressed it by switching to metric units. Metrication in western continental Europe was largely complete by 1876.

Other countries addressed that chaos in less draconian ways. Industrialization in the United Kingdom mandated having a consistent set of units. The UK Parliament did consider converting to metric units, but eventually instead standardized the informal units used in slightly different ways across the British Isles in the Weights and Measures Act of 1824. This act cemented the use of imperial units in the UK until 1965. It was this standardization that formed the basis for the goofy units still used in the US (and, informally, still used in the UK).

World Wars
No answer has yet mentioned the importance of centuries of war in Europe, culminating in the two World Wars. The two world wars wiped out the manufacturing base throughout most of continental Europe (and also Russia, Japan and China). They had to rebuild. The only system of measurements that made a lick of sense as the basis for that rebuilding was the metric system. Continental Europe was already metric. They weren't going to switch to the goofy British units.

It took those countries devastated by World War II twenty years to recover from the horrors of that war. The countries whose manufacturing bases were not devastated? Those would be the Commonwealth nations and the US. Manufacturing capabilities in continental Europe were bombed to oblivion during those wars, particularly during WWII. At the same time, the Commonwealth nations and the US underwent a huge build-up of their manufacturing base. This build-up was done using imperial units. There was a lot to lose in the Commonwealth and in the US by converting to metric. The Commonwealth countries were amongst the last to officially "go metric". The US? Not yet, but that too will come to pass.

The UK was the first of the Commonwealth nations to "go metric," and that only started to happen in 1965. By that time, 20 years after WWII, continental Europe had rebuilt their manufacturing base. Continental consumers liked having their cans of peas and all kinds of other consumer products expressed in metric units, and continental manufacturers liked having their screws, bolts, and all kinds of other industrial products expressed in metric units. UK manufacturers found themselves in the untenable position of maintaining two production lines, one based on imperial units for a small domestic market and another based on metric units for a potentially much larger export market across the Channel. The impetus for the British conversion to metric was largely industry-driven. The British people were steadfast against going metric; some holdouts still are.

Metrication in the United States
The US is a special case. No bombs were dropped on US cities, railway depots, or manufacturing plants during WWII. A large number of American soldiers did die in that war, but the US manufacturing base escaped the war unscathed. To the contrary! The US instead built up a massive manufacturing base during WWII. It was this build-up that resulted in the US being the dominant world power after WWII. This build-up is also why the US has not yet "gone metric."

Unlike Great Britain, the US has a huge domestic market. Being attractive to that huge domestic market was key to survival for a US-based company for much of the post-WWII era. Exports? They were a nice add-on to the bottom line. Besides, for the first twenty years after WWII, what else were those outsiders going to buy other than American products? Europe and Asia had no manufacturing base. They bought American made products.


That calculus is changing. Just as it made no sense for UK-based manufacturers to have two production lines 50 years ago, it makes no sense for many US-based manufacturers to have two production lines now. If you own a recently built automobile, it will be metric through and through. It doesn't matter whether that car was built in Mexico, Canada, Europe, Asia, or Detroit. The US automotive industry has "gone metric."

That the US automotive industry has indeed "gone metric" will ripple throughout the US manufacturing base. This plus other aspects of globalization will eventually end the use of customary units in the US. The US will convert to metric units for the same reason the UK did: those archaic customary units make no sense from an industrial perspective.


Edit: As pointed out in the comments, I realize this answer doesn't deal with the history of metrication in America. I intended it only as an answer to "why does the US keep using their systems?" However, other answers here do a very good job outlining the history, and I encourage everyone to check those out too.


As a non-American, I've always found it amusing that the 3 countries that officially cling to Imperial units are Liberia, Myanmar, and the United States. Quite a motley crew!

Anyway, for the US there are a number of reasons why it'll be hard to switch to metric/24-hour clocks/logical(!) date formats. Many of these reasons have already been mentioned.

But the simplest thing is probably a thought experiment for non-Americans: Imagine your own country wanted to switch to American units and formats. How receptive would you be to that idea?

Now setting aside the scientific arguments for using SI units, I imagine you would be very hesitant. It's not a lack of will, but an abundance of opposition.

These units and formats touch everybody's lives, meaning everyone is a stakeholder. Everything from grocery shopping to weather forecasts to your calendar would suddenly be a pain to figure out - just like it's a pain to figure out for non-Americans visiting the country. If you've grown up with one system, you've internalized it to such a degree that switching to anything else will seem completely ridiculous.

From the simple (replacing all your cookbooks) to the complex (retooling entire industries and changing every single road sign, to name a few), it's just a hornet's nest.

Sure, you can use rational arguments for why SI units etc. are just plain smarter, but if you're dealing with hundreds of millions of people, rational arguments tend not to work.

Add to that a certain anti-authoritarian streak that has defined much of American history and politics. If the US government declared that the country should switch to metric, I'd bet many would say that that's government interference in their lives. And they'd be right, because - as mentioned - it literally does affect everyone.

There could also be a wee bit of isolationism and perhaps even exceptionalism: Why should America even care what others are doing? And why should America follow anyone else?

And to many it will just seem like the stupidest thing to spend time and taxpayer money on. To American eyes, there's no problem to solve. The US is self-reliant on almost everything, so as an American you never, ever have to deal with SI units for anything in your daily life. Goods are produced, sold, bought, and consumed by the pound, by the ounce, and by the gallon. No tricky conversions necessary.

Of course, the differences in units do cause problems. But it's not something the general public has to concern itself with. For instance, in 1999 an (unmanned) spacecraft was lost because one part of the system used Imperial units, while the rest used SI units. While that's the butt of many a joke (and a loss to science, but mostly a joke), it's again not something that affects anyone's daily life - especially since the craft crashed on Mars, not Earth.

So, in the end: Yes, America should absolutely switch to metric! It's crazy that they still use those weird systems :)


With regard to measurement, there is actually an interesting reason (at least in my opinion) why the US was not an early adopter of the metric system. Thomas Jefferson had actually developed his own base-10 system of measurement (I believe he even attempted a base-10 system of time), and, had US relations been better with post-Revolution France, we might well have become one of the earliest adopters. Unfortunately, such was not the case:

The evolving political situation didn't help matters. Although France supported the American colonies during the Revolutionary War, it became hostile to the U.S. after Jay's Treaty was ratified in 1795. The French viewed the treaty, which eliminated British control of posts in the Northwest Territories and provided America a limited right to trade in the West Indies, as a blossoming alliance between the U.S. and England. France retaliated by sending privateers to target American merchant ships. By the time John Adams became president in 1797, the hostilities between the U.S. and France had grown quite intense. It's no surprise, then, that in 1798, France snubbed the U.S. when it invited dignitaries from foreign countries to travel to Paris to learn about the metric system.

Why isn't the U.S. on the metric system?

Now, granted, that does not explain why the US did not adopt the metric system, for instance, 40 years later, or 140 years later for that matter. Actually, technically speaking the US has adopted it since 1866 (see the same article above for more details), but as everyone living in the States knows, technical adoption is not the same as the population accepting it.

At this point, the largest reason we still cling to the imperial system is inertia. So much is in place that we think of via imperial measurements - your weight is in pounds, your height in inches, your milk in gallons, and so on - that it would be a lot of work to change. Nevertheless, attempts were made as recently as the 1970s to switch over to the French system, and we're beginning to see metric units encroaching on all aspects of our lives as we accept the reality of global trade.


My answer is more about the metric system than about dates.

About dates, also consider that there are the Chinese, Hebrew, and Islamic calendars, which are much more different from the Christian one.

According to Wikipedia:

In 1866, Congress authorized the use of the metric system and supplied each state with a set of standard metric weights and measures. In 1875, the United States solidified its commitment to the development of the internationally recognized metric system by becoming one of the original seventeen signatory nations to the Metre Convention or the Treaty of the Metre.

But several decades later:

Congress passed the Metric Conversion Act of 1975 "to coordinate and plan the increasing use of the metric system in the United States". Voluntary conversion was initiated, and the United States Metric Board (USMB) was established for planning, coordination, and public education. The public education component led to public awareness of the metric system, but the public response included resistance, apathy, and sometimes ridicule. In 1981, the USMB reported to Congress that it lacked the clear Congressional mandate necessary to bring about national conversion. Because of this ineffectiveness and an effort of the Reagan administration - particularly from Lyn Nofziger's efforts as a White House advisor to the Reagan administration, to reduce federal spending - the USMB was disbanded in the autumn of 1982.

And even more recently:

On December 31, 2012, a petition was created on the White House's petitioning system, petitioning the White House to "Make the Metric system the standard in the United States, instead of the Imperial system." On January 10, 2013, this petition garnered over 25,000 signatures - exceeding the threshold needed to require the Obama Administration to officially respond to the petition. Patrick D. Gallagher, director of the National Institute of Standards and Technology, provided the official response stating that customary units were defined in the metric system, thus making the nation "bilingual" in terms of measurement systems.

See also Metrication opposition (Wikipedia).


I'll start with about the only place actual history comes into this: why it started.

In English there are two ways to say dates:

America's official birthday is on the fourth of July, seventeen seventy-six.

and

America's official birthday is July fourth, seventeen seventy-six.

You may notice that the second way is far shorter. It requires no prepositions, which makes it a much less awkward phrasing, easier on both the mouth and the ear. So it shouldn't surprise even a non-English speaker that this is the preferred and traditional way to say it for much of the English-speaking world.

When writing a date out numerically, what you are essentially doing is abbreviating. So if in English one typically speaks a date as "month day, year", then the proper way to abbreviate it (assuming slashes as separators) would naturally be MM/DD/YYYY. Any other way is going to confuse people (even if there is some standard somewhere saying it should be that way).

So where did the other order come from? Well, it turns out that in French, the natural way to speak a date is in fact "day month, year". So for a Frenchman, abbreviating dates as "DD/MM/YYYY" is the natural abbreviation. The Francophone world insists on that ordering, and will accept no other (as it would be confusing to them).

I won't get into the politics of who "won" when the EU standardized things. However, it should at least be noted that the capital of the EU is in a Francophone country.

The USA is a much larger country (in just about every sense) than the UK, and does not have to worry nearly so much about French sensibilities. So it does dates the way its people want to do dates. If people in other countries have a problem with that, then they have a problem.

Now this being said, IMHO both systems are old systems. The "modern" way to do dates is in fact YYYY-MM-DD (aka: ISO 8601). This format is much easier for computers (and by extension, us Computer Scientists) to deal with.
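
As a quick illustration of why ISO 8601 is friendlier to computers (a sketch, not part of the original answer): YYYY-MM-DD strings sort chronologically as plain text, which neither MM/DD/YYYY nor DD/MM/YYYY does.

    dates_iso = ["2001-07-04", "1776-07-04", "2001-06-30"]
    dates_us  = ["07/04/2001", "07/04/1776", "06/30/2001"]

    # Plain lexicographic sorting is already chronological for ISO 8601...
    print(sorted(dates_iso))  # ['1776-07-04', '2001-06-30', '2001-07-04']

    # ...but not for MM/DD/YYYY, which groups by month regardless of year.
    print(sorted(dates_us))   # ['06/30/2001', '07/04/1776', '07/04/2001']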

The conversion to SI is a fairly different story, although the enemy is still established mindshare. Presidents Ford and Carter actually tried to move the US to metric back in the '70s. The general public balked, both were defeated in their next elections, and the next President (Reagan) abolished the ineffective agency in charge of the effort. Today the USA uses metric units in many of the sciences, but for the most part happily sticks to English units.

Generally, it is probably the case that the USA is so large and self-sufficient a society that any radical change in units from what everyone is used to is nearly impossible. The vast majority of the population never has to deal with a non-USA person, so changing something everyone already understands solely for the benefit of this rarely encountered non-USA person is just not going to fly.


There's one thing people usually forget about "customary" measurement systems (aka "imperial" in this case): they evolved over a considerable period of time within the society, and thus are much more convenient for use in everyday life (where complex calculations are usually not required).

Lame examples were edited out due to popular opposition.

Natural fractions are also more intuitive than decimal ones (one third, one quarter, etc.). Thus, base 12 (as often used in "customary" measures) is better for many purposes than base 10 (more prime factors to work with). The Revolutionary (as in "French Revolution") French system (the direct precursor to SI) tried to mandate decimal measures for time and angular quantities; those were not accepted by anybody and faded into complete obscurity, while base 12 is alive and kicking.

Another interesting feature of customary systems, especially those related to volume, was the use of a base-2 progression (each next measure is exactly twice the volume/weight of the preceding one).

In fact, while introducing many important innovations, the French system was notoriously bad at choosing its étalons (reference standards) for the most common measures. This was not dictated by a lack of knowledge, but by a misplaced desire to erase the entirety of the Ancien Régime legacy. Truly, the customary foot makes a much better base length than the meter: apart from being more convenient in everyday life, the speed of light could be trivially defined as 1e9 feet per second, avoiding the need to work with the very cumbersome metric 'c' constant (to achieve this, the modern definition of the foot needs to be adjusted by only about 2%, which is well within the original "customary" precision of the foot's definition).
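
For what it's worth, the arithmetic behind that "about 2%" claim can be checked directly; here is a minimal sketch using the standard exact values of the relevant constants (not part of the original answer).

    C_IN_METERS_PER_SECOND = 299_792_458   # speed of light, exact by definition
    FOOT_IN_METERS = 0.3048                # international foot, exact

    c_in_feet = C_IN_METERS_PER_SECOND / FOOT_IN_METERS   # ~9.836e8 ft/s
    adjusted_foot = C_IN_METERS_PER_SECOND / 1e9           # foot that would make c exactly 1e9 ft/s
    shrinkage = 1 - adjusted_foot / FOOT_IN_METERS         # ~1.6%, i.e. "about 2%"

    print(f"c = {c_in_feet:.4e} ft/s")
    print(f"adjusted foot = {adjusted_foot:.9f} m ({shrinkage:.1%} shorter)")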

Considering the above, it is not surprising that the USA has never switched to the metric system:

  1. A relatively weak federal government and strong local ones, along with a strongly opinionated population (with legal means to stand by their opinions), meant that people had a chance to stick to the system they found convenient. For comparison, in continental Europe the metric system was introduced by government decree and with a considerable level of violent oppression.
  2. Lack of real incentive, as most professional activities employ customized measurement systems (a good dozen of these are employed in physics/chemistry) or work with fixed sets of measurements. Standardization of those sets is of much more real importance than the underlying system employed, and it can be said that the USA has much better standardization institutions than any other nation (NIST, ANSI, etc.).
  3. In the modern era all non-trivial computations are done by computers, which can do arbitrary unit conversions at negligible computational cost.

We can conclude from the above that a large-scale measurement system conversion would be a completely pointless exercise, which explains why the USA has never bothered (and probably will not bother in the future).


The sheer size of the install base will ensure that American Customary measures will stick around in fact, if not in name. For example, if we go metric and a kid tosses a baseball through my window, I'll replace it with one that measures 122 cm wide by 91.5 cm high -- but that's just a 4-foot by 3-foot window dressed up in metric numbers. This extends to all sorts of things: for example, if you don't want to go around re-threading every single screw hole in the country, you'll wind up producing things like an M6.35x1.27 bolt -- the same thing as the UTS 1/4"-20 bolt, but with numbers that are far more awkward.
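
A small sketch of the same arithmetic (the numbers come from the paragraph above; the helper name is purely illustrative):

    MM_PER_INCH = 25.4   # exact

    def uts_to_metric(diameter_inches, threads_per_inch):
        """Express a Unified Thread Standard bolt as the equivalent metric spec."""
        diameter_mm = diameter_inches * MM_PER_INCH
        pitch_mm = MM_PER_INCH / threads_per_inch
        return f"M{diameter_mm:g}x{pitch_mm:g}"

    print(uts_to_metric(0.25, 20))   # M6.35x1.27 -- the familiar 1/4"-20 bolt

    # The 4-foot by 3-foot window, in centimeters (rounded to 122 x 91.5 above):
    print(f"{4 * 12 * MM_PER_INCH / 10:g} cm x {3 * 12 * MM_PER_INCH / 10:g} cm")   # 121.92 cm x 91.44 cm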


I can suggest a reason for continuing to report the weather in Imperial units. The benefit of Fahrenheit is that its scale is more granular: each degree Celsius spans 1.8 degrees Fahrenheit, so Fahrenheit gives finer resolution, while weather reported in Celsius is almost always given in whole numbers. Most of the time it's not a big deal, until it's very hot or cold in human terms. That's the second benefit of Fahrenheit: at 0 and 100 it's uncomfortable but not fatal to people (see the conversion sketch after the list).

  • Fahrenheit
    • 0: Very Cold
    • 100: Very Hot
  • Celsius
    • 0: Cold
    • 100: Dead
  • Kelvin
    • 0: Dead
    • 100: Dead
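
A minimal sketch of the two points above, using the standard Fahrenheit/Celsius conversion formulas (nothing here is specific to the original answer):

    def c_to_f(celsius):
        return celsius * 9 / 5 + 32

    def f_to_c(fahrenheit):
        return (fahrenheit - 32) * 5 / 9

    # Granularity: each whole degree Celsius spans 1.8 degrees Fahrenheit.
    print(round(c_to_f(21) - c_to_f(20), 2))            # 1.8

    # The "human range": 0 F and 100 F are unpleasant but survivable.
    print(round(f_to_c(0), 1), round(f_to_c(100), 1))   # -17.8 37.8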

It would be too easy to just wake up and actually LOOK at the metric system and see the logic of how easily you can specify how many centimeters are in 5.3 kilometers. The first clue is that the prefixes in the metric system actually MEAN SOMETHING: "centi" indicates 100 centimeters to a meter, and "kilo" indicates 1,000 meters to a kilometer. So you have 5,300 meters times 100, and thus even a moron can see that 530,000 centimeters constitute 5.3 kilometers.
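
A tiny sketch of that prefix logic (plain Python; the table covers only the prefixes mentioned above):

    # Powers of ten for the metric prefixes used in the example above.
    METRIC_PREFIX_EXPONENTS = {"kilo": 3, "": 0, "centi": -2}

    def convert(value, from_prefix, to_prefix):
        """Convert a length between metric prefixes of the meter."""
        exponent = METRIC_PREFIX_EXPONENTS[from_prefix] - METRIC_PREFIX_EXPONENTS[to_prefix]
        return value * 10 ** exponent

    print(convert(5.3, "kilo", ""))        # 5300.0 meters in 5.3 kilometers
    print(convert(5.3, "kilo", "centi"))   # 530000.0 centimeters in 5.3 kilometers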

Instead, people in the USA rely on what people everywhere else rely on for all manner of issues: stubbornness and ingrained habitual responses. These are actually the way the entire human race deals with change. It just so happens that the entire course of history since the Industrial Revolution has contributed towards ingraining this particular set of practices in Americans, much the way the Irish or Polish ingrained the potato into their cuisine: it just sort of stuck.

Trying to promote change, while often perceived as helpful or necessary, is an uphill battle when dealing with humans, due to this "conservatism in spurts" whereby certain systems may be very flexible but suddenly harden as usage piles on. This is much as a production web server will often never get its updates and minor fixes, as there simply is no longer any concept of downtime for maintenance. Oops.


The answer is quite simple. Freedom. In the United States individuals are free to choose whichever system they want to use, and the government does not yet have enough power to force them to use a different system. So the milk bottler that has been bottling his milk in gallons, half gallons, quarts, pints and half pints will continue to do so, because that is what he is tooled for, and that is also what his customers demand and understand. He has a considerable economic advantage to continue using the same units. We have a constitution that is supposed to limit the power of the government, and so far that has prevented a switch of units by fiat.

In order to understand this answer, it is instructive to look at the metrication process in Britain. All it takes is a little research into Britain's history with the metric system to see that this question is kind of based upon a false premise. I have found news articles lamenting the fact that Britain uses a mixture of imperial and metric units (Will British people ever think in metric? http://www.bbc.co.uk/news/magazine-16245391). I recall a time I was visiting one of my English friends from college. I sat down by him on the bed and my weight created a depression, pulling him toward me. He jumped up and exclaimed: "Good Lord, you must weigh 16 stone!" He was spot on, by the way; that is almost exactly my weight.

Ok, so now back to the history. Below is an excerpt from Wikipedia on the metrication of the U.K. (http://en.wikipedia.org/wiki/Metrication_in_the_United_Kingdom)

Adopting the metric system had been discussed in the Parliament as early as 1818 and some industries and even some government agencies had metricated, or were in the process of metricating by the mid 1960s. However, a formal government policy to support metrication was not agreed until 1965. This policy, initiated in response to requests from industry, was to support voluntary metrication, with costs picked up where they fell. In 1969 the government created the Metrication Board as a Quango to promote and coordinate metrication. In 1978, after some carpet retailers reverted to pricing by the square yard rather than the square metre, government policy shifted, and they started issuing orders making metrication mandatory in certain sectors. In 1980 government policy shifted again to prefer voluntary metrication, and the Metrication Board was abolished. By the time the Metrication Board was wound up, all the economic sectors that fell within its remit except road signage and parts of the retail trade sector had metricated. The treaty of accession to the European Economic Community (EEC), which the United Kingdom joined in 1973, obliged the United Kingdom to incorporate into domestic law all EEC directives, including the use of a prescribed SI-based set of units for many purposes within five years. By 1980 most pre-packaged goods were sold using the prescribed units. Mandatory use of prescribed units for retail sales took effect in 1995 for packaged goods and in 2000 for goods sold loose by weight. The use of "supplementary indications" or alternative units (generally the traditional imperial units formerly used) was originally to have been permitted for only a limited period. However, that period had to be extended a number of times due to public resistance, until in 2009 the requirement to ultimately cease use of traditional units alongside metric units was finally removed.

The Quango (quasi-autonomous non-governmental organisation) was active from 1969 to 1980. This period represented the largest jump forward in metrication since the process began.

By the time the Metrication Board was wound up, all the economic sectors that fell within its remit except road signage and parts of the retail trade sector had metricated.

These Quangos are non-governmental organizations to which the government gives power and funding. Because these Quangos are non-governmental and somewhat autonomous, they can exert power against the will of the people and the people have very little recourse. These Quangos appear to be a failed experiment in autocracy, as the UK is in the process of defunding and eliminating many of them. There have also been accusations that the appointments to them are based more upon political patronage than qualifications, and they are very expensive for the services they provide.

In his recollection of his time serving as head of the Metrication Quango, Jim Humble appears to validate the 'metrication by force' hypothesis:

High Street retailers found enormous commercial advantage in reverting to sales by the square yard. Consumers could not be persuaded to believe that goods costing, for example, £10 per square yard or £12 per square metre were virtually priced the same. Consumers bought, in very significant volume, the apparently cheaper priced imperial version. Metrication of carpet sales entered into full scale reverse and the Chambers of Trade and retail associations pressed for firm Government leadership, i.e. compulsory cut-off.

What would have been the result if the members of the Quango were elected officials, answerable to their constituents? Compare to the Metrication Process for the same time frame in the US.

Voluntary conversion was initiated (1975), and the United States Metric Board (USMB) was established for planning, coordination, and public education. The public education component led to public awareness of the metric system, but the public response included resistance, apathy, and sometimes ridicule. In 1981, the USMB reported to Congress that it lacked the clear Congressional mandate necessary to bring about national conversion. Because of this ineffectiveness and an effort of the Reagan administration - particularly from Lyn Nofziger's efforts as a White House advisor to the Reagan administration, to reduce federal spending - the USMB was disbanded in the autumn of 1982.

The USMB in the US reported that it was unable to bring about change because it lacked the Congressional mandate necessary for national conversion. Translation: it did not have enough power to force the people to incur the costs of switching. It is clear in both of these cases that the cost of switching to the metric system was quite high, and individuals are unwilling to foot that cost unless compelled by force. Indeed, it seems that they had a considerable economic advantage in staying with their existing systems.


Cultural Survival vs. Forced Assimilation: the renewed war on diversity

Ethnologue, published by SIL International, estimates that of the more than two million people who identify themselves as American Indians in the United States, only 361, 978 still speak one of the remaining 154 indigenous languages, and many of those are only spoken by the very old. This is about half the number of languages spoken in 1492 in what would become the United States. At one extreme, seven of the remaining 154 languages are spoken by only one person (Coos, Eyak, Kalapuya, Coast Miwok, Plains Miwok, Northeastern Pomo, and Serrano), and at the other extreme, 148,530 of an estimated 250,000 Navajos still speak their Diné language. American Indian languages, which cannot be helped by immigration like other minority languages in the United States, are becoming extinct, one after another.

One of the key factors in the survival of American Indian languages has been the isolation of many Indian reservations, which tend to be located on lands that none of the white conquerors wanted when reservations were established in the nineteenth century. Today, however, roads, satellite dishes, and progress in general are rapidly reaching the most isolated Indian communities. As one elder interviewed by Northern Arizona University Professor Evangeline Parsons Yazzie stated in Navajo: "Television is robbing our children of language." As Navajo children learn English and the mainstream culture through the media and through school, they increasingly become separated from their grandparents, some of whom speak no English. As one of Yazzie's informants said, "Older people who speak only Navajo are alone." Yazzie concluded that, "The use of the native tongue is like therapy specific native words express love and caring. Knowing the language presents one with a strong selfidentity, a culture with which to identify, and a sense of wellness."

Many American Indians see language as the key to their identity, and they question whether one can be Navajo, Apache, or Crow without speaking the tribal language. Navajo language survives most strongly among older Navajos, in Navajo chapter houses (the tribe's unit of local government), and in some Christian churches that use a Navajo-language bible and hymnal. Younger Indians are less likely to speak their tribal language because the schools they attend, the music they listen to, and the television they watch are in English. Tribal languages are considered "old fashioned," "out of date, "and "not cool" to children raised on television. When these children grow up and have children, they raise them to speak only English because it is the only language they have learned to speak fluently. If this situation is not changed, most of the remaining Indian languages will be extinct in another generation or two.

The loss of isolation is not the only current threat to American Indian languages. The old idea that all Americans should just speak English is being promoted by groups like U.S. English (once led by Linda Chavez) and English First. These groups advocate an amendment to the U.S. Constitution to make English the official language of the United States and to limit legally the use of other languages. Already, half the states have some kind of Official English law. Louisiana's 1811 law is the earliest of these, and Utah's 2000 law is the most recent. This concern over the importance of English is comparatively recent: 21 of the 26 states with Official English laws passed them since 1981.

A second approach to attacking minority languages is the movement to oppose bilingual education. Sixty-three percent of Arizona voters, for example, elected to end bilingual education when they voted for Proposition 203 on their November 2000 ballots. In its place, voters substituted one year of untested English immersion marketed under the slogan, "English for the Children." This, despite opposition to Proposition 203 by the state's major newspapers, university presidents, and experts in language education, and despite the fact that test scores reported by the Arizona Department of Education showed students in bilingual programs doing better academically man those who were not enrolled in such programs.

Proposition 203 was spearheaded and financed by Ron Unz, a computer millionaire with political ambitions who in 1998 backed a similar successful initiative, Proposition 227, in California. Unz portrays himself as "a strong believer in American assimilationism." Contributing an article entitled, "California and the End of White America" to the November, 1999 issue of Commentary, he wrote of the "social decay and violence" in the new multi-ethnic California, and of how the passage of Proposition 227 would save America from ethnic divisiveness. Although immigrants, especially from Mexico, were Unz's targets, American Indians were not exempted from Proposition 227's provisions.

Arizona's Indian tribes saw Proposition 203 as a direct attack on their attempts to keep their languages alive and strongly opposed it. In a September, 2000 press release, Navajo Nation President Kelsey Begaye declared that the "preservation of Navajo culture, tradition, and language" is the most important guiding principle of the Navajo Nation. He went on to state:

The Navajo Way of Life is based on the Navajo language. By tradition, the history of our people and the stories of our people are handed down from one generation to the next through oral communication. Naturally, the true essence and meanings for many Navajo stories, traditions and customs cannot be fully transmitted, understood or communicated as told through non-Navajo languages.

Only four of Arizona's 15 counties voted down Proposition 203 three of those four were the ones comprising portions of the Navajo Nation.

After the passage of Proposition 203, Jack Jackson, a Navajo Arizona State Senator, requested an Attorney General's opinion as to whether Proposition 203 applied to Navajos. On February 15, 2001, Janet Napolitano gave her opinion that it did not apply to any of Arizona's Indians living on or off reservations. She based her opinion on "principles of tribal sovereignty," wording taken from the Native American Languages Act of 1990, which provides that "the right of Native Americans to express themselves through the use of Native American languages shall not be restricted in any public proceeding, including publicly-supported education programs." The opinion also noted the use of the term "immigrant" in the proposition's wording.

Minority Cultural Suppression

The ethnocentrism that breeds assimilationism is a worldwide phenomenon, and legal efforts to suppress minority languages and cultures are not new, especially as regards American Indian languages.

Repeatedly in the 1880s, the U.S. government required all instruction for Indians to be in English. Traditional Indian ceremonies, such as the Sun Dance of the Plains Indians, were banned. Students entering government boarding and day schools were reclothed, regroomed, and renamed. Locked rooms were used as "jails," and corporal punishment was employed to enforce school rules that usually included a ban on tribal languages. In his autobiography, Indian Agent, long-time teacher, school administrator, and Indian agent Albert Kneale reported that Indian students in Indian schools "were taught to despise every custom of their forefathers, including religion, language, songs, dress, ideas, methods of living." The alternatives for Indians were annihilation or assimilation (then called "civilization").

Schooling was enforced using tribal police, who were under the control of Indian agents, and even the U.S. Calvary. Adults who resisted sending their children to schools that devalued their tribal cultures were punished in 1894, 19 Hopi Indian men were sent to the military prison on Alcatraz Island for such an infraction. While the harsh assimilationist methods worked with some Indians, they also bred resistance in others. Hopi artist Fred Kabotie recalled in his autobiography, "I've found the more outside education I receive, the more I appreciate the true Hopi way. When the missionaries would come into the village and try to convert us, I used to wonder why anyone would want to be a Christian if it meant becoming like those people."

Ironically, after years of suppression in schools, Navajo and other tribal languages were pressed into service by the U.S. military during WWII to rapidly encode and decode military transmissions. Specially trained Navajo "Code Talkers" were particularly useful in the South Pacific, where they used a Navajo-language-based code that the Japanese were never able to decipher. Initially kept "a military secret," the original 29 Navajo Code Talkers received Congressional Gold Medals of Honor for their service last year a "GI Joe" Navajo-speaking Code Talker doll is currently being marketed.

The Civil Rights Movement

The Civil Rights Movement created a climate for more culturally appropriate schooling. In 1968, the U.S. Congress passed the Bilingual Education Act (Title VII of the Elementary and Secondary Education Act) under unanimous consent provisions. Though it was targeted at Hispanics, American Indian tribes quickly saw that they could profit from the provisions of the Act. In 1975, Congress passed the Indian Self-Determination and Educational Assistance Act, which provided for more Indian control of Indian education.

The results of past repressive government policies specifically aimed at American Indian languages were recognized by Congress in 1990 with the passage of the Native American Languages Act (P.L. 101-407). Congress found that "the status of the cultures and languages of Native Americans is unique and the United States has the responsibility to act together with Native Americans to ensure the survival of these unique cultures and languages." Congress made it the policy of the United States to "preserve, protect, and promote the rights and freedom of Native Americans to use, practice, and develop Native American languages."

Although the Bilingual Education Act of 1968 led to some teaching of non-English languages in schools, Blackfeet language activist Darrell Kipp rightly points out that:

Bilingual programs are designed to teach English, not your tribal language. We aren't against English, but we want to add our language and give it equal status. Bilingual education typically teaches the language fifteen minutes a day.

Fifteen minutes -- or even 50 minutes -- a day is just not enough time to develop language fluency. Increasingly, Kipp and other indigenous language activists are advocating immersion teaching methodologies that give more classroom time to tribal languages. U.S. Secretary of Education Richard W. Riley, in a speech on March 15, 2000, strongly supported dual-language immersion schools, which allocate about half the school day, rather than 15 minutes, to language learning. Of course, with that much time spent in language learning, academic content is integrated into the lessons so students do not fall behind in mathematics, science, social studies, and other school subjects. While working at Rock Point Community school in Arizona, I found that Navajo students who were immersed in Navajo for half a day in the primary grades not only learned to read and write their Navajo language they also learned English better than in surrounding schools where only English was taught. It is hard enough to learn to read, write, and understand subjects like math in a language you can speak. It can become an overwhelmingly negative experience to learn these first in a language you are only beginning to understand.

Increased efforts to teach indigenous languages are being made outside of school as well. For example, during the summer of 2000, The Hopi Village of Mishongnovi ran a program that involved local artists from the village working with children 5 to 19 years old. Along with traditional crafts, the program worked to immerse the children in the Hopi language.

Of special importance in the revitalization of American Indian languages and cultures has been the tribal college movement the number of tribal colleges has grown from one in 1969 to over 30 today. Lionel Bordeaux, long time president of Sinte Gleska College, called cultural preservation "the foundation of the tribal colleges."

Proponents of English as the official language see its dominance threatened and consider it the "glue" that holds our country together and a panacea to the problems of poverty faced by many ethnic minorities in the United States. A letter to the editor in the December 27, 1999 issue of USA Today claimed, "The one thing that binds the USA as a nation and makes possible the blending of so many varied cultural and ethnic mixes is that we have a common language." A similar letter appeared in the November 21, 2000 issue of the Arizona Republic. Its author insisted, "We must all be able to communicate in one language, the only glue uniting this great country."

I maintain that the "glue" holding this country together is not the English language, but rather the ideas embodied in the Declaration of Independence, the U.S. Constitution, and other key documents of the democratic experience. The definitions of "freedom," "liberty," and "free speech" in those documents need to be broadened to include group as well as individual rights to heritage, languages, and cultures. Government suppression of minority languages and cultures violates the liberty of American Indian, Latino, and other language minority citizens. Forced conformity is still being imposed on ethnic minorities in the United States through assimilationist, English-only schooling to the detriment of full and equal citizenship.

Research indicates that immigrants are learning English faster now than they ever have before the dominance of English in the United States is in no way threatened. On the contrary, it is immigrant languages that are threatened. In the words of attorney Lani Guinier (1994) and others, minorities through the initiative process are being subjected to democracy's "tyranny of the majority." American Indians, comprising less than one percent of the nation's population, are defenseless in the face of the majority unless they present a united front, link arms with other minorities, and actively recruit the support of mainstream Americans. Journalist David Broder, in his new book, Democracy Derailed: Initiative Campaigns and the Power of Money, details how the initiative process in California and other states can submerge minority viewpoints and offer slogan-driven panaceas to deep-rooted societal problems.

As American Indian languages die, the accumulated wisdom of their cultures dies. At a bilingual education conference in Anchorage, Alaska, in 1996, I picked up a card describing traditional Iñupiaq Eskimo values. One side of the card read:

Every Iñupiaq is responsible to all other Iñupiat for the survival of our cultural spirit, and the values and traditions through which it survives. Through our extended family, we retain, teach, and live our Iñupiaq way.

The other side read, "With guidance and support from Elders, we must teach our children Iñupiaq values." Listed were the values of "knowledge of language, sharing, respect for others, cooperation, respect for elders, love for children, hard work, knowledge of family tree, avoidance of conflict, respect for nature, spirituality, humor, family roles, hunter success, domestic skills, humility, [and] responsibility to tribe." With the loss of these traditional values and the languages through which they were taught, functioning American Indian communities and families are being destroyed, leaving in their wake dysfunctional families and myriad other social problems.

American Indian elders want their grandchildren to respect their elders, work hard, study in school, not drink, and, of course, remember that they are Indian. Today, even on rural Indian reservations, there is youth gang activity. Dr. Richard Littlebear, president of Dull Knife Community College and Northem Cheyenne language activist, writes,

Our youth are apparently looking to urban gangs for those things that will give them a sense of identity, importance, and belongingness. It would be so nice if they would but look to our own tribal characteristics because we already have all the things that our youth are apparently looking for and finding in socially destructive gangs. [One] characteristic that really makes a gang distinctive is the language they speak. If we could transfer the young people's loyalty back to our own tribes and families, we could restore the frayed social fabric of our reservations. We need to make our children see our languages and cultures as viable and just as valuable as anything they see on television, movies, or videos.

My quarter century of involvement with American Indian education and bilingual education as a junior high school teacher, school administrator, and university professor supports Dr. Littlebear's contention that language and cultural revival movements are generally healthy for America. Riots and ethnic violence are a product of the loss of traditional values and of poverty, not of multilingualism and multiculturalism. Linguistic and cultural assimilation will cure none of these ills.

The legally enforced aspects of assimilation epitomized in Propositions 203 and 227 are divisive and destructive. Not only do they divide "white" America from minority America they also create divisions within minorities between those who think that being a "good American" is associated with surface features such as speaking English. Being an American means adhering to the principles of the Declaration of Independence, the Constitution, the United Nations Charter, and other representations of democracy, freedom, and tolerance. These can be lived in any language.

References & further reading

Begaye, K. (2000). Guest commentary: President Begaye addresses English only proposition. The Navajo Hopi Observer 19:37, pp 4.

Broder, D.S. (2000). Democracy derailed: Initiative campaigns and the power of money. New York: Harcourt.

Grimes, B., Ed. (1996). Ethnologue: Languages of the world (13th edition). Dallas: SIL, International. www.ethnologue.com/.

Guinier, L. (1994). The tyranny of the majority: Fundamental fairness in representative democracy. New York: Free Press.

James, J.S. (2000). Keeping cultures, and communities, strong on Hopi. The Navajo Hopi Observer 19:47, p. 1.

Kabotie, F. (with Bill Belknap) (1977). Fred Kabotie: Hopi Indian artist. Flagstaff: Museum of Northern Arizona.

Kipp, D. (2000). Encouragement, guidance, insights, and lessons learned for Native language activists developing their own tribal language programs. Browning, MT: Piegan Institute.

Kneale, A.H. (1950). Indian agent. Caldwell, ID: Caxton.

Littlebear, R. (1999). Some rare and radical ideas for keeping indigenous languages alive. In Revitalizing indigenous languages. Reyhner, J., Cantoni, G., St. Clair, R.N. & Parsons Yazzie, E., Eds. Flagstaff, AZ: Northern Arizona University. pp. 1-5.

Oberly, J.H. (1885). In Annual report of the commissioner of Indian affairs to the secretary of the interior for the year 1885, lxxv-ccxxv. Washington, DC: U.S. Government Printing Office.

Parsons Yazzie, E. (1995). A study of reasons for Navajo language attrition as perceived by Navajo-speaking parents. Unpublished doctoral dissertation. Flagstaff: Northern Arizona University.

Article copyright Cultural Survival, Inc.


Research and Publication

Scholarly journals in women's studies were begun in the United States early on (1972 for Feminist Studies, 1975 for Signs: A Journal of Women in Culture and Society, but not until 1988 for the National Women's Studies Association Journal), and soon there were journals published around the world. In 1999 an informal International Network of Women's Studies Journals (now the Feminist Journals Network) was formed, meeting first in Tromso, Norway, then in Halifax, Canada, in 2001 and in Kampala, Uganda, in 2002. Thirty editors from twenty-seven journals in twenty-one countries were represented in the membership in the early twenty-first century. Joint publishing projects, including a book series by Zed Press, reprinting of articles from journals in the "economic south" (developing nations) by journals in the "economic north" (industrialized nations, mostly in the north but including Australia), a Web site, and a listserv to make members aware of current issues are all part of their work.

Ellen Messer-Davidow surveyed the number of books and scholarly monographs available in English between 1980 and 1998 and estimated that 10,200 feminist books were published during that period. As she says, the print knowledge is so voluminous that scholars cannot keep track, much less read it all. And the topics are superabundant: "everything and anything is gendered, … gendering is narrated, quantified, or modeled, … and 'gender' as an analytical category is interrogated" (Messer-Davidow, p. 167).


David Bindman

These are all difficult questions but we should begin with what we have inherited. The problem we have in the history of art is that it is strongly rooted in national narratives. When asked, we all identify ourselves as working predominantly in, say, British, French or American art. This means that the transnational tends to be marginalized, but it can also be an excuse for ignoring the colonial.

It may well be very difficult to avoid national specialization but it can be turned to art history's advantage by regarding the nation as encompassing its empire, so that, say, the study of Spanish art includes the slave colonies of Latin America, and British art the slave colonies in the Caribbean. In a sense we need to re-colonize the history of art.

This would require a much stronger grounding in history, not as background but as integral to the subject. All medievalists should, for instance, be aware of connections with African kingdoms and the influence of Islamic art and architecture, just as those working on the Italian Renaissance need to be aware of connections with the Ottoman empire.

In terms of the discipline there needs to be more emphasis on the power of images to construct ideas of nationality and race. We need to investigate the visual construction of other peoples, for it plays a decisive role in naturalizing ideas of difference that can result in social action. This will involve a broadening of experience to include the study of all forms of visual culture along with the art of the museums.

David Bindman is Emeritus Durning-Lawrence Professor of History of Art at University College London.


Historical consciousness in Africa is – of course – quite literally as old as time, but in Europe and the Americas awareness of Africa's past has dawned only more recently.1 In the United States, African Americans during the nineteenth century first attended to Africans' pasts in the face of the racialized skepticism of the era. Writing more than a hundred years later as an Africanist historian here in the journal of the American Historical Association for colleagues in all fields, I want to suggest some of the intellectual pathways along which they and their successors have brought Africa within the practice of professional history at the end of the twentieth century and thus what learning to do history in a place as remote – affectively, culturally, geographically, and intellectually – as Africa was for the founders of the historical discipline may reveal about history itself as process and as epistemology. It will become clear that I write of history in a humanistic vein that has become meaningful to me as I have matured – or perhaps merely aged – in our profession, speaking personally with what seems to be an executive privilege that the American Historical Association accords to presidents on this occasion.2 I do so without intent, thereby, to excommunicate colleagues who may balance in other ways the complex combinations of personal insight, techniques of inquiry, research data, engagement with popular memory, and practical application through which historians discern and disseminate meanings in evidence from the past.

The story that follows begins against the familiar background of the birth of the modern discipline of history at the end of the nineteenth century, torn as it was then between theological-philosophical speculation and faith in empirical data as evidence that would satisfy lingering cravings for certainties about the past, confirmed scientifically. Both tendencies specifically excluded most of Africa from the human progress that they celebrated. Those whose own lives confirmed that Africans belonged within universal history had to circumvent the exclusionary particularity of the discipline by adapting aspects of other, more comprehensive – though also abstract, static, less humanistic – generalizing epistemologies to bring Africa within the realm of academic respectability. From such academically alien beginnings, they only slowly and haltingly restored the humanism, the sense for change, and the sensitivity to contexts of time and place that distinguish history's way of knowing. But in relying, faute de mieux, on mythological oral traditions, reified languages, mute archaeological artifacts, and presentist ethnographic descriptions, they tested multiple limits of how they thought as historians. Looking back, their struggles highlight complex balances among several epistemological aspects of historians' craft: between particularity and generality, theory and data, sequence and chronology, internal subjectivities and unavoidable (whether or not "real") externalities, and empathetic similarity and curiosity-stimulating (or fear-provoking) differentiation in the relationship between historians and their subjects. I hope to suggest here how bringing Africans within the orbit of historical discipline may remind historians in any field of what is most historical about how we all have come to think.

Africans and African Americans adapted the progressive historiographies current at the end of the nineteenth century to write about Africa, while historians in Europe and the United States were laying out standards of the modern discipline.3 The problem they faced was that, following Hegel, the meta-narrative of the emerging discipline excluded Africa's past as morally unedifying and methodologically unverifiable, leaving Africans outside its exultation in European superiority as "people without history."4 The search for an African past sculpted in these progressive terms meant highly selective emphasis on monumental achievements comparable in antiquity, size, and military power to what Europeans then celebrated about their own past.5 They drew, first, on their contemporaries' appreciation for ancient Egypt and the mysterious lands to the south, some of them biblical – Punt, Nubia, Kush, and Ethiopia or Abyssinia – and sought monumental ruins comparable to what they knew of the "glory that was Rome" and the Egyptian antiquities publicized in the wake of Napoleon's 1798 invasion of the lower Nile.6 They limited research to written texts, which in Europe's experience conveyed direct impressions from remote times in relatively unchanged, or reconstructible, forms that met the demanding standards of verifiability emergent in scientific history. But writing also testified to the intelligence of its authors, otherwise suspect as illiterate "natives" living out mindless lives of changeless, endless barbarity. They accepted durable archaeological evidence as also providing similarly irrefutable credibility against the currents of racist skepticism then flowing. In retrospect, the prestige that progressive historians accorded continuities from ancient origins seems a singularly contradictory way to validate the recent advances on which they prided themselves, while in Africa the same perpetuation of ancient custom explained only contemporary primitiveness. The implicit accent on continuity undermined the progressives' insistence that devotion to change as a centrally revelatory element in human experience distinguished their discipline from theology and other competing epistemologies of their era. The roots of the paradox lay, of course, in the premises of biological racism on which its logic rested: priority in achievement demonstrated inherent racial superiority, and subsequent continuity in culture reassuringly paralleled transmission of the knack for civilization by genetic means.

The only possible source of evidence from the other side, turn-of-the-century anthropology, redoubled the challenge to those who would discover a meaningful past in Africa by validating its moral distance from the modern West. The first phases of anthropological investigation in Africa grew out of German idealism and received no small romantic impetus from European self-exiles disillusioned by the failing promise of industrialized capitalist society at the fin de siècle.7 Broadly inspired by Hegel's "universal history" of the development of the human spirit, one historically oriented group of German ethnologists derived "advanced" cultural traits from primal centers of civilization in the ancient Middle East and explained apparently "civilized" achievements reported from other parts of the world as products of a quasi-historical dynamic of "diffusion" of their unique inspiration. Diffusionist theories linked what people had done indissolubly to who they were, and so they accounted for historical change only in terms of "migrating" groups, mysterious conquerors who had spread "civilized" culture into remote corners of the world, or by imitative natives "borrowing" from them.8 To construct a "history" for Africa meaningful by these high and ancient standards, but independent of presumed origins in southwestern Asia, meant positing an independent font of inspiration south of the Sahara, primary either because it was older than Egypt or because it possessed virtues more estimable than modern Europe's mechanized military power.

The German ethnologist Leo Frobenius became an erratic champion of Africa in these errantly historical terms.9 Frobenius shared his contemporaries' disdain for the "degenerate" colonial Africans of his own time, but he nonetheless found them fascinating "because he thought them to be living documents of an otherwise unrecoverable universal human past."10 In the course of repeated research trips to Europe's new African colonies around the turn of the century, he sensed traces of a creative, simple, and unspoiled local form of civilization higher than his embittered assessment of modern Europe. To account for the anomaly, he hypothesized an ancient, since-vanished civilization in West Africa known to its Mediterranean contemporaries, the Etruscans, hence anterior to Rome, and remembered later in the European myth of a lost Atlantis.11

Frobenius's "African Atlantis" reversed the diffusionist "Hamitic hypothesis" dominant in progressive history's vision of Africa.12 This pseudo-historical Hamitic theory reconciled older faith in the Christian Bible with newer, scientifically styled studies of language, physical type, and political economy to account for what Europeans could recognize in Africa as vestiges of "civilization" understood in modern terms. From the moment that self-styled European "explorers" and colonial armies had set foot in Africa, they encountered formidable opponents, leaving the would-be "civilizers" with considerable and perplexed respect for African military power, political leadership, and even monumental architecture, the litmus tests of progress. All of these contradicted the low rankings that the racial classification schemes of the time accorded dark-skinned people. Only a "white" residue in Africans' cultures could explain so unanticipated a suggestion of competence among "Negroes." By the convenient logic of diffusionist inference, such "Caucasian" influence could have reached sub-Saharan Africa through historical contact with emigrant "whites" of Mediterranean origin, long enough ago to match the presumed antiquity of authentic originators and to leave time for their salutary influence to have degenerated to the faint traces still evident in the otherwise universal genetic and cultural gloom. In the United States, where God-fearing Southerners justified the violent racism of the "Jim Crow" era on their faith "that God had shaped the Negro's physical and emotional makeup at the beginning of existence and rendered him forever inferior to whites,"13 these biblical, evolutionary, environmental, and racial determinisms hung heavily in the immediate background to nineteenth-century thinking about the past in Africa.

The scholarly W. E. B. Du Bois led several African-American colleagues at the beginning of this century in creating a professional history for Africa against the backdrop of American racism. As an undergraduate at Fisk University, where the "natural inferiority [of people of African descent was] strenuously denied," it had been Bismarck who struck Du Bois as a model of the "strength and determination under trained leadership" that would "foreshadow … the kind of thing that American Negroes must do" for themselves. But as Du Bois entered Harvard's graduate history program in 1888, he found "Africa … left without culture and without history."14 With no alternative, Du Bois concentrated his studies on American history and politics but oriented his thesis research toward Africa by taking up the "suppression of the African slave-trade to the United States of America, 1638–1870." He read his first academic paper on the subject to the annual meeting of this Association in 1891, in Washington, D.C.15 Realizing "what in my education had been suppressed concerning Asiatic and African culture," Du Bois followed the German pilgrimage of the time among historians in America for two years' study at the University of Berlin (1892–94).16 There, he must have heard metropolitan echoes of Germany's wars of colonial conquest, seen the published reports of nineteenth-century German scientific expeditions in Africa, and drawn on contact with German ethnology to frame the first continental-scale history of Africa in his sweeping, racially unified history, The Negro.17

In The Negro, Du Bois described ancient African kingdoms comparable to Europe in civilization. But the glories of such earlier accomplishment cast an unavoidable dark shadow over a contemporary Africa recently subjugated to European colonial rule. Du Bois found an explanation for this painful realization in a description of historical reasoning: he attributed contemporary Africans' apparent degradation to damage done by subsequent European and Muslim slaving, a theme prominent in the writings of eighteenth-century opponents of the slave trade that he must have encountered in researching his doctoral dissertation. Du Bois' horror at the loss of "100,000,000 souls,18 the rape of a continent to an extent never paralleled in ancient or modern times," led to his tragic concession, by the standards of progressive historiography, "of the stagnation of culture in that land since 1600!"19 Without personal experience on the continent, not even Du Bois could escape the European and American judgment of contemporary Africans' backwardness by contemporary standards. But asserting a retrogressive narrative of damage and decline by historical agents, albeit external ones, at least allowed him to avoid the eternal burden of inferiority by reason of race.

"Progressive" history early in this century thus confined even this brilliant defender of the "Negro" to a salvage operation, a search for racial respect by interpreting specifics from ancient Africa to support modern, European valuation of the national state, military power, and monument building. The result corresponded to historical thinking at the end of the twentieth century only in its contemplation of times past. It lacked African contexts of time and place independent of presentist projections, or inversions, of European racial presumption. Du Bois could attribute recent initiative only to outsiders, European (and Muslim) slavers, and thus left Africans in roles perilously close to passive victims, without agency of their own. Du Bois', and Frobenius's, concessions of the recent stagnation of Africa's cultures – or of a singularized "African culture," as the rubric of racism usually homogenized them – all but excluded current ethnographic description of Africans since their fall into backwardness as a source of insight into the earlier but vanished glories. Without human, African context to stimulate motive and action, even Du Bois' prodigious reading in published writings left his story of triumphal political leadership in Ghana, Mali, and Songhai – the empires of Africa's medieval Sudan, cut down at the threshold of modernity – a fable not of tragedy but rather of failure.

African teachers and scholars, and the Europeans and Americans who worked in Africa with them after World War II, gradually distinguished modern African history from the liberalizing intellectual currents that swept Europe and the United States during the waning years of colonial rule. They did so by adding empirical evidence focused on issues arising from circumstances particular to Africa.20 This postwar generation of academics, intent on preparing colonies in Africa for political independence and African youth for future civic responsibility, lived amid intense preoccupation with politics. African politicians, several of them trained in the United States in Du Bois' vision of African history under Leo Hansberry, who had introduced the first academic courses in African history at Howard University in the early 1920s, capitalized on its nationalist spirit to justify Africans' political accountability.21 These pioneering historians of Africa overwhelmed the regional historical traditions of the colonial era and adapted basic progressive assumptions to African purposes, demonstrating political centralization and expansion in political scale in Africa of European proportions.22

The academic institutions in the colonial metropoles in Europe, which held authority to validate these teachers' efforts as professional "history," expressed fewer reservations than previously about Africans' inherent eligibility for history, but they showed strong hesitation about the lack of evidence from Africa that seemed to meet the historical discipline's positivist standards.23 As in all history, only disciplined recourse to voices independent of the present, to primary evidence understood in terms of its originators in the past, could convey Africans' agency and the contexts to which they reacted. Research strategies that might historicize Africa's past had to start in Africa, draw on African sources, and array the new information around historical hypotheses focused on concerns of Africans. Objections on such technical grounds presented challenges that the first Africa-based generation of professional historians welcomed with enthusiastic inventiveness.

They regarded the prized documentary sources of the progressives as highly suspect for these purposes in Africa. Europeans had written about Africans since they had arrived in the fifteenth century, but documents became sufficiently comprehensive to bear the weight of historical interpretation alone only much later, since about the 1880s, with the advent of government records accompanying the establishment of colonial authority. However, these writings of modern Europeans were alien and self-interested, as well as tainted by the use made of them in colonial and imperial history to lionize Europe's civilizing political mission around the globe. Nationalist historiography rejected them as very nearly polar opposites of the Africans' history that they sought.

What little research drew on colonial government files, even though it reached monumental proportions in isolated instances, was administrative and sociological, not historical.24 Narratives of the colonial governments' economic "development" programs, or, alternatively, the success of nationalist politicians at mobilizing popular opposition to them, since the nineteenth century, were "case studies" in a social-science mode, with primarily comparative and theoretical implications. They tended to extract "variables" relevant to the "models" and theories they tested from their full historical contexts. The enabling generation of post–World War II historians had little choice but to appropriate these other disciplines for their own historical purposes, even – as was repeatedly the case – when their sociological accents tempted them to phrase arguments in terms of aggregated behavior and abstractions.

Social-science "models" tempted historians also because they offered the alluring logical coherence of theory to paper over the initial lack of enough empirical evidence from the African past to make sense on its own terms, and to distract from the dubious standing, by conventional historical standards, of what there was. Still more seductive of historical epistemology were the equilibrium assumptions of much mid-twentieth-century sociology, with its stable institutions and equilibrium models. In terms of change, these amounted to social-scientific analogs of the timeless "primitive" African cultures that they sought to replace. Structural logic thus diverted historians' attention from their own discipline's reliance on change as a primary mode of explaining, observing transience as a fundamental aspect of human existence.

Yet, out of this initial reliance on methods, conceptualization, and narratives distinctly ahistorical in logic and alien to Africa, historians gradually added context, change, and African agency, the three epistemological elements that together distinguish history from other disciplines, to create a more historicized African past.25 Driven back in time by the unacceptability of colonial-era documents and by progressive history's respect for ancient origins, aspirant historians of Africa had to confront the technical challenges of making responsible use of unwritten sources. As historians, they sought to identify in these novel forms of evidence properties familiar from documentary records. Their need to justify themselves by disciplinary standards alien to Africa distracted them from these sources' distinctively African characteristics, and thus from their historicity.

Narrative oral traditions – recountings of events attributed to a past beyond the experience of living witnesses and presumed to have passed down to the present through multiple tellers and hearers26 – seemed particularly authentic voices from Africans' pasts. Their narrative form made them seem subject to critical methodologies developed for reconstructing primary versions of the similarly discursive written sources familiar to historians elsewhere.27 However, application of this documentary analogy to "traditional" narratives revealed that Africans told their tales so creatively, at least in the politically charged circumstances of talking to the powerful European outsiders who recorded them, that the scenes they portrayed amounted to outright fabrications.28 They structured their accounts by aesthetic, rhetorical, and interpretive strategies more than by chronological sequence, and they tended to account for change by radical, magical-appearing transformations rather than by detailing the incremental sequences plausible as change to historians.

Once historians recognized that they could not read oral narratives as histories or reconstitute them as wholes, they reexamined their elements to see how they might offer valid pointers to circumstances – if not the actors or events narrated – in the past. But historians prepared to extract evidence from traditions by dissecting them faced a cross-disciplinary challenge from anthropologists eager to claim the same oral representations for theoretical purposes of their own as social, conceptual, and performative entities.29 British structural-functional anthropologists, authors of much of the ethnography on which the first generation of historians drew in their search for African historical context, emphasized the presentist aspects of narratives constructed to legitimate privilege and power, often by deploying metaphors of antiquity to assert the inalterability of current inequalities.30 Structuralist anthropologists influenced by French symbolic anthropology joined the cause against traditions' historicity by interpreting the logic and language of the same materials as cosmological speculation, even as expressing fundamental structures of mind untouched by any specific experience or conscious reflection, present or past.31 Historians responded that narratives need not directly describe times gone by to contain elements bearing marks of origins in times past, even without the performers' awareness of the antiquity on which they drew. Anthropologists exaggerated the presentist aspects of oral performances only by selectively emphasizing the tales' narrative meanings and aesthetic strategies, or the political and intellectual reasons why performers might displace into former times narratives fabricated in – or even deliberately constructed as metaphors for – the present.

In the oblique and mutually stimulating way in which divergent disciplines interact, historians historicized their use of oral traditions by converting the anthropologists' emphasis on compositional strategies to understand how Africans selected, preserved, and shared collectively important knowledge through time in mnemonic environments.32 Mnemonic techniques of preserving knowledge, for example, distributed vital information among several individuals, all responsible together for mutual verification of essential points, however made.33 Individual performers engaged existentially with their auditors around the immediate occasion, against backgrounds of current power, rank, and privilege, but arguments for the exclusively presentist idiosyncrasy of oral performances could be sustained only by isolating them from their distinctive communal context, by restricting analysis to a single performer along lines that presumed individual artistry comparable to performance in literate cultures.34 By analyzing the compositional strategies of oral performers as group processes, historians replaced abstracted oral traditions with intellectual history contextualized in African environments.35

The centrality of precise chronology to progressive methods of inferring (possible) cause and consequence from contemporaneity and sequence led the first generation of professional historians at work in Africa down obscure paths in search of proxies for calendrical dates that would bring African evidence up to accepted standards. Lists of kings common in the royal traditions of African political systems seemed convertible to calendrical years on the supposition that succession in royal lineages exhibited demographic regularities. Historians might then count the rulers named and multiply assumed average lengths of reigns back from recent monarchs of known date to estimate dates of earlier rulers.36 That African dynasties might have exhibited greater order, and hence more regular sequences, than unpredictable struggles over power elsewhere in the world proved a vain hope, nurtured in part by the illusion conveyed by colonial-era social anthropology of mechanistic, functionally integrated political institutions in Africa.37 But in following Africans' ways of speaking about the past, historians gradually abandoned such artificial and abstracted chronologies in favor of contextualizing pastness as people in mnemonic cultures experienced it: as absence, as broad contrasts between what is proximate and what is remote, mixing space with time accordingly.38
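
To make the arithmetic of such reign-count chronologies concrete, here is a purely illustrative calculation; the figures are hypothetical, not drawn from any actual king list or from the works cited above. If a tradition names ten rulers before a monarch known from documents to have acceded in 1750, and a historian assumes an average reign of 15 years, the founding of the dynasty would be placed at roughly

\[ 1750 - (10 \times 15) = 1600. \]

Shifting the assumed average reign from 15 to 25 years moves the same estimate a full century earlier, which is one measure of why the text calls such chronologies artificial and abstracted.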

Historians looked also to Africans' 1,500 different languages for the aspects of linguistic change that might yield calendrical dates.39 The resulting chronologies were, of course, similarly mechanistic artifices and proved imprecise as historians' needs grew more refined. They also riveted historians' attention on classifications of abstracted Languages of Africa rather than on the people who created them.40 But other historical aspects of Africans' linguistic behavior – "language communities always in contact and constantly evolving" – spoke more directly about their experiences in the past.41 The marked contrasts among Africa's five major language families gave sharp definition and multiple dimensions to the discrete, specific linguistic innovations that produced Africa's diverse linguistic heritage. Phonetic shifts in the way that people pronounced old words, or mispronounced words they appropriated from neighbors, are key markers of historical experience, and changes for many areas of collective life can be reliably sequenced by reconstructing them. Sets of novel words that clustered in conceptual fields within this phonetic framework pointed to specific technology, political institutions, fashions in apparel, or moments of enduring human inventiveness in the past, including the kinds of people who might have changed the ways they talked and the reasons why their descendants preserved their linguistic habits down to the languages of the present.42 This historicizing transition from statistical analysis of abstracted vocabularies to historical inferences from reconstructed past linguistic behavior paralleled historians' abandonment of the formal properties of oral narratives in favor of sensing how narrators drew on inherited memories to compose them.
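
The "statistical analysis of abstracted vocabularies" alluded to here was glottochronology. As a rough sketch of the sort of calculation involved (the retention rate r is an assumed constant, commonly quoted as roughly 0.86 per millennium for the standard 100-word list, not a figure taken from the sources cited above), the classic formula converts the share c of basic vocabulary that two related languages still hold in common into an estimated time t since their separation:

\[ t \approx \frac{\ln c}{2 \ln r} \quad \text{(in millennia)}. \]

The dependence on a single, universal rate of lexical replacement is precisely the kind of mechanistic artifice the paragraph above describes.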

The preoccupations and enthusiasms, circumstantial worries and collective accomplishments of ancient parents literally echo in the present through the speech habits they taught their children. Moreover, their accents express historical experience without conscious intent and hence, unlike the ideological distortions characteristic of oral narratives, are unfalsifiable. Historical inference from linguistic reconstruction is attaining degrees of detail, depths in time, and regional comprehensiveness that outline a coherent narrative – though with increasing selectivity as the focus lengthens to more remote eras – of who in Africa experienced what in times past as long as 20,000 years ago.43 Historical inferences from linguistic evidence thus approach the threshold of intentionality as a significant determinant of human experience, the dawn of dependence on communication for collective welfare, and reliance on self-conscious creativity through cultural consensus, all marking the beginnings of history understood as deliberate, effective agency. In another ironic interplay of disciplines, historians' failure to extract chronologies from languages useful for history in the progressive style left them with powerful linguistic techniques for hearing about the past as Africans experienced it.

Chronology-dependent historians also embraced archaeology in significant part because it produced datable stratigraphy and artifacts. In Africa's predominantly rural areas, that hope rested on the physical dating of radioactive isotopes of carbonized organic materials, such as wood charcoal, and then inferring likely relationships of these material remains to human issues of interest to historians.44 Beyond the imprecision of the dates calculable from these radiocarbon techniques, the uncertain associations of materials thus dated to specific human activities left their conclusions far from historical in style.45 The search for hard evidence to civilize Africans by European standards also turned historians of Africa to archaeology for traces of early metallurgy, a technology of undeniable accomplishment by modern industrial standards. This line of investigation gained momentum when iron smelting turned up in Africa earlier than anticipated, five centuries or more before the Common Era in several regions. Africans had thus smelted iron – as was nearly always emphasized in the lingering competitive spirit of the quest – before much of western Europe replaced bronze with ferrous metals. African smelting techniques also arguably derived from local inspiration, and iron workers there primarily fabricated agricultural implements. This last purposive nuance rescued Africa – it was hinted – from the retardation implied by the still more ancient dating of iron in Anatolia, but there for less reputable use as weapons. Subsequently, study of the African contexts of iron production, with emphasis on culture and environment, has replaced "early enthusiasms" about iron artifacts in Africa with historicized comprehension of African metal workers and their metal-working strategies.46
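
For readers outside archaeology, a brief sketch of the physics behind the "radiocarbon techniques" mentioned above may be useful; the figures are standard textbook values, not taken from the studies cited here. The age estimate rests on the exponential decay of carbon-14: with N/N_0 the measured fraction of the original carbon-14 remaining in the dated material and a half-life of roughly 5,730 years,

\[ t = \frac{5730}{\ln 2}\,\ln\!\left(\frac{N_0}{N}\right) \approx 8267\,\ln\!\left(\frac{N_0}{N}\right) \text{ years}. \]

The imprecision the text notes arises both from measurement uncertainty in that fraction and from the need to calibrate results against past fluctuations in atmospheric carbon-14, quite apart from the separate problem of associating the dated charcoal with particular human activities.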

The progressive impulse to unearth African evidence of antique monuments respectable in European terms showed little promise south of the Nile corridor and Ethiopia, with the exception of massive thirteenth- and fourteenth-century stone walling in southern Africa centered at "Great Zimbabwe,"47 in towns that dotted Africa's Indian Ocean coastline since at least the eighth century, and such famed thirteenth- to sixteenth-century West African cities as Timbuktu, along the southern fringes of the Sahara Desert. These town centers had attracted attention as sub-Saharan prototypes of modern, Western-style urbanity since Du Bois' initial attempt at African historiography. However, the classic archaeological research at these sites focused on the imported wares found in their ruins, on Muslim building in Arab and Persian styles, and on other evidence of datable foreign contacts. Because archaeologists then contemplated their findings in terms of abstract typological contrasts rather than as historical products of human creativity, few remarked on the faint aroma of the discredited "Hamitic hypothesis" that emanated from attempting to give Africans credit only for taking up the good ideas of immigrants from southwest Asia.48

Archaeologists, like linguists, have learned to interpret their findings according to the mental maps of the Africans who built these towns.49 The West African cities, once treated as isolated outposts of North African Muslim traders in search of sub-Saharan gold valuable in Mediterranean markets, have been revealed as late elaborations on African patterns of urbanization that arose from desiccation and local exchanges across the region's increasingly sharp environmental gradients two millennia before they attracted foreign merchants.50 All these centers expressed distinctively African communal strategies of production, distribution, and provisioning necessary to support dense settlement.51

In the beginning, historians had turned to ethnography for data distinguishable as "African" among the prevailing written Europeans' impressions of Africa. They accepted the theorized social structures, mental worlds, and cultures in which anthropologists phrased these descriptions as enduring determinants of African behavior rather than as modern, Western constructs about them. Moreover, the urgency of their search for evidence from the past predisposed them to overlook the contemporaneity of the mid-twentieth-century circumstances that ethnography in fact described. Ethnographers' assertions that they abstracted aspects of Africans' lives as they had existed before European modernity intruded gave an illusion of pastness – however static – that dulled the sense of change critical to history. In particular, the hoary colonial fallacy that Africans could usefully be understood as belonging to enduring, homogeneous ethnic aggregates – the "tribes" still current in popular discourse – further distracted historians from positioning ethnographic evidence firmly in its historical present. Although historians rejected the connotations of backwardness conveyed by the colonial idea of "tribes," the functional integrity of African "societies" rendered every element of the contexts in which people "must have" lived so essential to all others that reference in a conventional dated source to one of them seemed to allow historians to assume the connected presence of most, or surely some, of the rest in the otherwise undocumented past.52 Functional "tribal" integration of this sort allowed historians, further, simply to bundle the conclusions of all the other disciplines they had engaged, assuming that conclusions from one could verify inferences from others without considering the specific contexts that might have generated each.

This rationalization, however well-intended and cautiously applied, placed even the scattered direct evidence then available for earlier times squarely within the timeless vision of Africa's past that historians meant to refute. The few options for accommodating change that such "tribes" offered were familiar from progressive history: like "civilizations" and "races," they had "origins" locatable in time and space, subsequently acted primarily as groups by "migrating" to wherever their members currently lived, "conquered" anyone they encountered along the way, and reliably passed "traditional" behavior through the generations. African sources offered few ways out of this time trap of "tribal" logic, since traditions everywhere expressed the inviolable integrity of current groups as enduring ethnic antiquity. To historians working in the pressure cooker of trying to confirm scattered information by the rules of a doubting discipline, the documented presence of a few elements of a current ethnographic "society" or "culture" appealed seductively as the visible tip of a likely ethnic iceberg of associated (even if unremarked) behavior and institutions in the past.

Even now, in an era that emphasizes the contingent and constructed character of groups of any sort, anywhere in the world, a lingering reliance on "tribes," though long rejected among Africanists,53 still sometimes substitutes for historicized context among nonspecialists drawn to consider Africa's past. As appreciation of Africa's relevance to history beyond its own shores has grown, historians of other world regions have necessarily approached so unfamiliar a subject through simplifying assumptions that they reject in areas they know better. "Tribes" now usually lie concealed behind polite euphemisms – "cultures," "ethnic groups," and neologistical "ethnicities," even "communities" – but politesse does not eliminate the time-defying, history-denying static logic of the notion: stereotyped Africans confined within abiding structures, individuals submerged in depersonalized, abstracted aggregates, who act mostly by realizing social (or cultural) norms, that is, by preserving unchanged what colonial-era language reified as "tradition."54

Definitive historicization of ethnography came not only from situating ethnographic descriptions in time and context55 but also from seeing the African strategies colonial ethnography had reified as institutions as Africans' ways of achieving specific historical objectives.56 Africans compose "traditions," for example, by adapting popular memories about the past to apply the ideological force of claimed antiquity and stability for discernible purposes of the moment.57 Historicization has transmogrified such ethnographic staples as African "kinship," and its common expression as "lineages," from functional frameworks within which Africans thought into collective entities that they created and adapted to secure valued resources in land, in political standing, or in people themselves. Anthropologists and historians together have sensed that "witchcraft" in Africa was a historical reaction against the danger that individuals grown wealthy, powerful, and independent posed not only to their relatives and neighbors but also to the ethos of collective responsibility itself; commercialized exchanges with the Atlantic economy since 1600 or so and the colonial-era introduction of a monetary economy raised public alarm about abuses of private accumulation to haunting intensity.58

African politicians and intellectuals created ethnicity itself by manipulating supple collective identities to meet historical circumstances.59 A capsule history of ethnicity in Africa would trace the oldest of the collective identities that colonial ethnographers froze in time as "tribes" to ancient adaptations of basic agricultural and other productive technologies to local environments, wherever these were so successful that whoever later lived in those areas carried on in terms of the community arrangements that the first settlers worked out. Others derive from a wave of political consolidation that swept through Africa from the thirteenth to the fifteenth centuries, wherever people continued to rely on political solutions derived from the early states that had attracted Du Bois' admiration. Still others date from seventeenth- and eighteenth-century conflicts and population movements, as people fled slave-raiding and reorganized their collective lives around the straitened circumstances it created. Others again formed as communities gathered around commercial, agricultural, and extractive enterprises of the nineteenth century. Colonial conquest once more challenged men and women in Africa to transform the group identities dominant at the opening of the twentieth century, to resurrect some that had drifted into latency, and to invent others out of momentary conjunctures to exploit cash economies and European political power. Where nominal continuity is evident,60 new personnel frequently (one suspects always!) adapted "tradition" to dramatically shifting circumstances, if only to preserve viable aspects of shared heritages and wrap themselves in the legitimacy of the ages. Even the stereotypically unchanged hunters in the Kalahari (so-called "Bushmen") have survived by adapting,61 and Africa's nomadic forest people turn out to have maintained their strategic flexibility only by innovating against heavy odds.62

In no small irony, the methodological distractions of using the blueprints of other disciplines,63 not yet historicized, to construct a past for Africa left historians vulnerable to haste in handling evidence in familiar written forms. The founding generation's intense commitment to an autonomous African history – led by inexperienced research students, sometimes by faculty of necessity trained in other fields,64 nearly always institutionally isolated from their historical colleagues in area-studies programs – insulated them from the discipline in the rest of the world, and from the methodological caution that prevailed in departments of history.65 This liberal generation of aspirant historians acquired too easy a sense of having met their professional responsibility for source criticism by exposing the racist biases of European writings about Africa. Although fighting racism was an unavoidable component of constructing a history of Africa, even the passing racist ambiance of the time still distracted historians of Africa from the critical methods of their discipline.

The limits of well-intended innocence as historical method appeared as soon as the initially high yields of plowing virgin documents for superficially accessible content about Africans' interactions with their European authors began to decline. The second generation of Africanist historians – or, often, in fact, the first generation, wiser with experience – took up positions in departments of history where they encountered the questions of historical methodology that underlay their search for answers in Africa. With tenure and with the outline of an African past becoming clearer in their minds, more of them found time to follow through on doubts raised, but not resolved, by their uses of documentary sources in their early research.66 By the 1970s, their students had to reinterpret the same limited corpus of written sources more closely for their implications for new, more subtle questions that an increasingly complex history of Africa was raising. The increased awareness of African contexts at the same time enabled them to read the written sources – and not only "European" documents – against the grain of their authors' ignorance for the shadows that Africans' activities cast over what they reported.67 Unsurprisingly to historians of the ancient Mediterranean and medieval Europe,68 even the authorship and chronology of seemingly familiar publications of known dates have proved very uncertain without thorough explication du texte.69

As historians of Africa re-engaged their discipline's text-based methodologies, they also incorporated the content of early modern and modern European (and American) history as context for Africa's past.70 At the birth of modern history in Africa, when ignorance of what had happened there left historians little alternative, they cited the relative isolation of one continent from the other, intercommunicating regions of an Old World "ecumene" to explain Africa's apparent failure to share in the advances under way elsewhere.71 Stimulating contact with ideas different from one's own, as this liberal meta-history of diversity ran, accounted for progress throughout Eurasia. Africa's presumed historical isolation saved its inhabitants from the racist condemnation of "Hamitic" contact but only at the cost of once again conceding backwardness and exclusion from the world of progress. Further, by limiting stimulating interactions to Europe and its Asian partners, this history imposed a pan-African homogeneity, at least congruent with the racial stereotype it tried to avoid, that ignored intense, animating communication across many cultural borders within Africa.72 The assumption of isolation also underestimated Africans' intercontinental contacts and missed the creativity with which they had appropriated from outsiders what made sense in the contexts in which they lived, not least their adaptations of Islam since the eighth century and of Judaism and Christianity in the millennium before then.73

History nonetheless emerged, even from research that had strained so hard against the distractions of alien disciplines during the 1960s to meet the standards of historical method that inspired it. Historians gradually discerned sufficiently probable patterns of past African actions that their successors could place imported artifacts, world religions, and international capital in historical contexts independent of modern values. A breakthrough of sorts came in the 1970s, when French neo-Marxist anthropology highlighted dysfunctional tensions within structural functionalists' harmonious ethnographic families, distinguished the diverse actors formerly homogenized within "tribes" – communities with and without lands and performers of their own,74 elders and youth,75 slaves,76 richer and poorer,77 even ambitious and successful individuals78 – and positioned them in dynamic, historicized tensions.79 Systematic neo-Marxist emphasis on material differentiation within Africa broke through the racialist homogenization lingering from earlier formulation of the subject as "the Negro" and moved beyond conflicts stereotyped as "tribal." Differentiation by gender, after first missing its full potential by celebrating African women who excelled in normatively male power roles, focused on sex-specific inequalities of colonial rule and gradually explored the distinctive experiences of the larger half of the African population to add a pervasive, vivifying dialectical tension to the context of Africa's recent past.80 With these frustrations and other motivations in view, Africans emerged as active historical agents, in ways recognizable to historians practiced in the politics and processes of European and American history, where struggles over sharply differentiated ambitions are axiomatic.

But Africans also acted on intellectual premises and constructed historical contexts with salient aspects very different from those of progressive Europeans and maximizing materialists: prominently among many such contrasts, behind and against all of the practical tensions, was an ethos of collective responsibility rather than modern individualistic autonomy.81 The community values of Africans' histories constitute a kind of moral historiography82 that exhibits precisely the "ideological" qualities social anthropologists cited as evidence of ahistoricity in oral narratives. The sense in which historical "agency" may be attributed to Africans is prominently – though never exclusively – a collective one, especially during centuries before the nineteenth. Archaeological data are nearly always anonymous and interpreted generically, and words are by definition standardized products of recurrent collective practice.83 Africans recollect their experiences communally, and performers of oral traditions publicly address shared concerns. Although oral performers characteristically build their narratives around figures of dramatically distinctive character – culture heroes, monarchs, and others – these apparent personages are in fact stock figures who reflect subsequent consensus about them more than particular persons in the past, even the individuals who may in fact have inspired such commemoration. Beginning in the sixteenth century, documents mention individually some of the Africans who met Europeans, or at least characterize the specific roles they assumed in approaching literate outsiders, and from the seventeenth century onward they allow increasingly nuanced interpretations of personality in African contexts.84 But the collective aspect of the people otherwise detectable in the more remote epochs of Africa's past means that individual agency must often be understood in terms of its effects rather than its motivations, and that the effects remembered are public rather than private.

The anonymity of individuals in much of the evidence available thus becomes less a deficiency of the sources than a window opening onto Africans' collective ways of thinking. Even though individuals pursued personal ambition in Africa no less than elsewhere, they did so by subtly evoking responses from those around them rather than by asserting their autonomy too obviously. Autonomous success invited suspicion of "witchcraft" rather than admiration. This African emphasis on collective responsibility also had its own history, with individualism becoming more effective and more acknowledged since about the eighth century, when a few Africans took advantage of outsiders – mostly Muslim and then later Christian merchants from commercial, literate backgrounds – who were prepared to deal with them on a personal basis. More than coincidentally, these foreign visitors also left the documentary records from which historians may now derive evidence of African agency as individual.

From recognizing Africans' distinctive mental worlds, historians could also appreciate their experiences of change itself as more abrupt and discontinuous than their own notions of processual incrementalism. The smaller the alterations modern historians can note, the more individuated and specific the changes, and – as a logical complement – the greater the multiplicity of their aspects, the more plausible and historical they find the process thus defined. Perception of change in such nuanced form relies on dense and continuous runs of documentary records and, beyond them, on habits of writing that preserve momentary impressions of every step along the way.85 Mnemonic notions of change in Africa more resembled what nominally literate historians in Europe before the seventeenth century accepted as "miracles"; both elided progressive history's processual stages of modification into sudden transitions between preceding and succeeding (but other than that timeless) states, sequenced but not otherwise connected.86 Both depersonalized the human-scale agency of processual change by displacing causation into extra-human realms, usually taken seriously in Europe as "religious" but in Africa for many years dismissed, with connotations of superstitious irrationality, as "magical." Subsequent liberal revision of this pejorative characterization rationalized causation of this African sort as "cosmological" or as respectably "spiritual" but did not interpret its implications for historical thought.

Within these frameworks of causation and historical agency, Africans' strategies of action focused on ends rather than means, which they left mysterious though not beyond human access. Africans acted on the premise that humans did not themselves possess transformative power, but they might nonetheless convert existing states into desired ones by gaining personal access to a limitless pool of potentiality inherent in the world around them, a force personalized to varying degrees as "spirits."87 Europeans, who restricted the idea of historical efficacy to human initiative, misconstrued individual action conducted in these terms as "sorcery." But African action was in fact efficacious socially as intended, that is, to the often-considerable extent that people feared the ability of individuals to tap the imagined pool of natural potency and acted on their apprehensions. Once historians accepted Africans' strategies of acting, they recognized that the ways in which they applied reasoned inquiry, calculated experimentation, and close observation of effect to transform their situations paralleled – though within the limits of detection imposed by their reliance on only the human senses – the microscopic, chemical, and eventually nuclear and electronic techniques of observation that seventeenth-century Europeans elaborated as "science."88

It would be misleading to overdraw these subtle distinctions in emphasis between European and African historical ways of thinking. Modern general theories of human behavior and abstracted processual models of causation are hardly less naturalizing and impersonal than Africans’ metaphors of change. They internalize the power Africans see inherent in nature as inalienable human “rights” and as “sociological” or “psychological” constants; they understand agency to include manipulating “human nature” by influencing consciousness and belief. Nor do Africans’ attributions of agency to collective culture heroes and founding kings in oral narratives differ in their implicit dynamics of causation from charismatic “great man” theories in Western philosophies of history. The collective solidarities that Africans represent as “ancestors,” or the kings they view as embodying entire polities, produced historical effects, just as people everywhere change their worlds by acting together in groups of similar proportions. Strict rationalist observers dismissed African behavior as timeless “ritual” or “religion,” hence unreflective, inexplicable, and pointless “traditional” failed attempts at agency. But contextualizing efficacy, change, and causation in these non-modern—and also postmodern, and only incidentally African—terms makes it plausible by historicizing it. The postmodern embrace of cultural history, social constructivism, memory, and collective consciousness throughout the historical profession has now brought these universal aspects of human existence clearly into view in other parts of the world.

The implications of Africa’s past for history as a discipline do not, of course, arise only from the earlier eras on which nearly all of the present discussion has concentrated. The bulk of historical research in Africa has in fact shifted during the last twenty years to modern times, roughly since the mid-nineteenth century, but early Africa exemplifies the process of historicizing its study more dramatically than does the colonial era.89 Its historiography reaches back more than a century, long enough to reveal the dynamics of the process, while modern Africa has been subject to historical study for barely more than two decades. Further, the formidable technical challenges of eliciting evidence from Africa’s more remote eras reveal more sharply the challenges of maintaining disciplinary integrity while drawing on other academic epistemologies than do the interviews, colonial documents, and other relatively familiar sources employed for the twentieth century. In addition, distinctively African historical processes visible only over spans of time reaching back to ancient eras frame all interpretations of recent periods.90 These processes decidedly do not constitute a static, “pre-colonial” past, defined only negatively by contrast to European political authority, but rather the centuries when Africans developed solutions to problems of their own times, some of which their descendants have struggled to adapt to contemporary challenges.91 Without early history to give African context to recent experience, Africans’ appropriation of current opportunities falls by default into projections of Europe’s dreams of “modernization,” or lapses into pessimistic resurrections of meta-histories of terminal decline—as predictions for the future!92—to explain their failure. Whatever the ethical and political overtones of distorting modern Africa to fit into these alien terms, they fail as history because they perpetuate the teleological and ahistorical premises of the racist progressivism and liberal structuralism from which they grew.

Frictions in Africa, however effective they may have been in generating historical dynamics in Africa’s past, exacted a considerable price by separating African from African-American history, making two fields from the one that Du Bois had presented generically and genetically as the history of the “Negro.” African history now appeals to other professionals in terms of the discipline and methodology that characterize the academy, more than it reflects the memories of its popular audience in the African-American community. Ambiguities arising from tensions in Africa, formerly concealed behind the American racial mask of “the Negro,” seem to expose disharmonies inappropriate for public discussion in Western societies still redolent of the intolerance that Du Bois wrote to refute, to compromise commitment, to reduce the vigilant solidarity necessary for community survival in an unwelcoming world. But since Africa looms integrally in the background of African-American history as a unified ancestry reflecting the racial sense of community forced by American prejudice on African Americans,93 for many, professionalization of the subject leaves a distinct sense of loss.

History reinvented from African circumstances resonates throughout the profession, perhaps even revealingly because of the distinctive intensity with which Africa challenged the exclusionary premises of the classic, progressive form of the discipline at the end of World War II. The ahistoricity, even anti-historicity, of the social-science disciplines with which aspirant Africanist historians had to begin forced them to look deep into their own professional souls as well. Their experience of inventing a history for Africa, not by rejecting established standards but by embracing and extending them to integrate the unconventional forms in which the world’s “people without history” had remembered their pasts, exposed inner logics of historical reasoning.94 Inclusive liberalism brought “others” formerly segregated in the separate spheres of “ethnohistory” within a single, comprehensive, and seamless history of humanity. It also replaced the artificial barrier between “history” and “pre-history” set by limiting evidence to its documentary form with a processual threshold for history defined in terms germane to how historical inquiry proceeds, that is, in terms of human agency: historical method gradually becomes productive of understanding ancient women and men as calculation supplemented biological evolution, animal instinct, and random accident as a coherent, significant source of intended—and unintended—change in the affairs of those thereby rendered human.

Historical inquiry—and historians are, above all, questioners—requires the challenge of the unknown to spark the curious imagination. History derives its essential energy from explaining difference, from the tension of the distance that separates historian and subject. All knowledge gains clarity and coherence from the elementary binary mental function of discriminating like from unlike, of course, and history is distinctive primarily in focusing on distance across time, between then and now, between the historian’s and subjects’ eras. The centrality of difference to the wellspring of history’s epistemology underlies the reflexive appreciation, recently prominent in all historical fields, of the complexity of relations between historian-observer and observed subject, between selves and others. Progressive Europe’s and America’s praise of their own ways of doing things prevented historical inquiry from drawing fully on this core potential, by coding cultural behavior as biological absolutes, by limiting its subjects to the relatively familiar, by celebrating selves rather than exploring others.

Such exclusion distracts attention from the equally important, countervailing premise of historical inquiry: the shared humanity that links otherwise distanced, but not alienated, historians and their subjects. History is fundamentally humanistic in the sense that its way of knowing depends on an intuitive sense of commonality, of sheer comprehensibility, beyond the differentiation by which it defines in order to explain. This connective aspect of history’s method binds historians to their subjects in multiple ways. Affectively, it appears in the emotionally engaged fascination that attracts scholars—or the horror and dismay that repel them no less engagingly—to the parts of the past that they choose to investigate. Cognitively, it sustains their inquiring interest to the point of inspiring them to impose order on chaotic evidence. Historians convey this sense of understanding by presenting their subjects as people like themselves and their audience, by touching readers and hearers intuitively, evoking contradictions, paradoxes, and ironies of life that they understand because they share them. All history is thus ethnic, part of the creation of group identities by authors who claim affinity with the subjects about whom, and for whom, they write.95 The prominence of the past in the rhetorics of nationalism, racism, “culture wars,” and chauvinisms of every sort amply confirms the extent to which history is inherently about “us.”

But never exclusively so. Historical curiosity and understanding start together, from the tension of holding the opposing sensations of difference and similarity, distance and intimacy in the precarious, productive balance that makes inquiry conducted in this spirit productive. The delicate equilibrium of historical thinking makes its practice dependent on the training, poise, and control of professionals. But clarifying contrasts is so basic to human thought that carrying the same mental process to extremes allows nonprofessionals to imitate history by emphasizing either of the two tendencies of its dialectic, claiming its appeal while violating its dynamic epistemological equilibrium. Since both parts are always present and hence available to employ one-sidedly, imitators’ claims may be difficult to distinguish from those of historians balanced on the tightrope of professionalism. History’s humanism is so intuitive, and its legitimating intimacy so powerful, that it gives enormous popular appeal to versions of the past that draw only on commonalities, distorting the evidence in response to what the community, or the historian, wants to believe, ideologically proclaiming their obviousness as truth or more openly acknowledging them as entertaining fiction. When historians differentiate excessively by projecting onto “others” their own envy, fears, or hopes, they betray the integrity of the discipline no less than when they proclaim similarities beyond those that in fact exist. They then merely generate obscurantism, stereotyping “others” to stress difference, as Africans—who have lived with alienating histories—and African Americans—who have been excluded by racist association—know all too well.

Recent critical examination of this reflexivity—enriched by literary, hermeneutic, or psychoanalytic theories (but not by theory from the empirical social sciences)—offers productive ways to examine these inevitable, and essential, subjectivities. Taken alone, however, even informed self-awareness loses touch with the discipline’s equally essential focus on others, on people in the past understood as unlike themselves. The epistemological function of empirical data for historians is to draw them outside their own imaginations. No objective reality may lurk out there, awaiting “discovery,” but externalities inevitably intrude on consciousness sufficiently to differentiate and provoke curiosity. The subtle distinction that places history among the humanities rather than the sciences turns on its use of induction to test insight rather than its deploying intuition to interpret data.96 The empirical aspect of this subtle interplay may be similarly intensified to extremes, and in the positivist phase of the discipline documents attained a sanctity that had nightmarish consequences for Africa. Novelty and sheer abundance on the empirical side of historical scrutiny have repeatedly tempted the leaders of advances into new ranges of evidence, from the written documents of the founding “scientific” generation to the unwritten sources that beguiled historians of Africa. The social historians of the 1960s struggled to comprehend data in quantities incomprehensible to unaided human cognitive faculties, and some limited themselves to conclusions in the forms in which electronic technologies and statistical techniques—necessary to detect patterns hidden within these sources—delivered them. Quantitative methods framed critical aspects of historical context by describing aggregate tendencies in human behavior, but by themselves they seldom generated historical insight into the human experience.

History turns data into evidence not by pursuing the technical attributes of data but by substituting a distinctively intuitive, humanistic, holistic strategy for the experimental method of science. It assesses meaning by qualitatively contextualizing evidence in the complex, multivariant circumstances of the past in which people created it. History ultimately fails as “science,” since historians can assemble only random evidence from the debris of the past that reaches them through processes far beyond their control. They cannot replicate the closely regulated conditions of laboratories, in which their scientific counterparts precisely measure varying outcomes of exact, determining circumstances. Rather, they can compare only approximately, among the few aspects of past conditions known to them, of only general similarity, and seldom in instances numerous enough to establish levels of statistical probability beyond the plausibility inferable from intuition alone. Auras of ambiguity hover over all bits of evidence considered out of context, removed from the human creativity from which they come. Even the apparent precision of timing establishes only correlations based on chronologies, not cause or effect. History’s holistic methodology thus makes sense of the data at hand by setting all information available, considered simultaneously together, in the context of the human moment from which it originated. The more information at hand, the richer the context it creates and the greater its potential thus to explain. The more intuitive the historian, the more contradictions and paradoxes in the assemblage she or he is able to reconcile.

Contemplating history without calendrical chronologies in Africa generalizes the discipline’s sense of time beyond sequence according to numbered dates. History’s fundamental sense of change emphasizes not dates but rather the ephemerality of the human experience and the processual aspects of historical contexts, the becomingness, the remembered absences. No human being can escape the imposing imponderability of change itself: everyone orients himself or herself to the inaccessibly fleeting present in which they live by trying to apply perceptions of past experience, to prepare for an impending future foreseeable only as projections from the here and now, which events will render irrelevant before the limited “lessons of the past” can be applied. Sequential narrative modes of exposition render these settings coherent in the flow of time, but only by invoking irony: actions with consequences unintended, tragic fates for those who cling to efforts after they have failed, uncertainty and fear, and causes arising from circumstances far beyond human control replace ineluctable progress and all the comforting regularities of theory—all very unlike the uncomplicated sense of progress that gave birth to the modern discipline.

History is thus neither empirical nor imaginative but rather a continual dialectical confrontation of insight with evidence, of intuition and empirical induction, of past and present, of mutually challenging awarenesses of the self and of the world. The meanings that historians seek are similarly multiple and distinct, simultaneously those of their subjects, those of their audience, and private ones, unconscious as well as conscious. Professional history must constantly, exhaustively, test its intuitive aspect against evidence and awareness external to both historians and their publics, in order to keep the actions of the others that it is held to reveal at safe, respected distances from their interpreters. Once historians acknowledged Africans’ human accessibility and plunged into the uncertain contexts of alien disciplines and of data in half-understood forms from unfamiliar cultures, they encountered contrasts, less absolute than race, that energized their inquiries. The cultural originality that challenged anthropologists to understand Africans as exotic “others” whom time—understood only as progress toward modernity—seemed to have forgotten became a source of revealing contrasts when historians recognized it as products of the creativity of people like themselves. Scholars with personal backgrounds in the cultures of Africa gained a parallel sense of understanding from their mastery of Western historical method and often led colleagues from Western backgrounds in understanding Africans’ histories that were their own.97

Historians achieved a similarly subtle, biased balance between history and its sister disciplines as they generated historical ways of contemplating Africa’s past out of anthropology, political science, archaeology, linguistics, and other ways of knowing characterized by the primacy of theory. In sharp contrast to history, theory achieves coherence largely by abstracting selected elements from their historical contexts, to expose the logical relationships among them. Because history’s core subject is the human experience, that of the historian as well as those of the historian’s subjects, because all human situations exhibit multiple facets, and because historical actors shift their attention selectively among them and act momentarily in relation to many different ones, historians therefore must appropriate theories and models in eclectic multiples, not to test any one of them on its own terms but rather to apply the relevant insights of all, pragmatically, to the compound ambiguity of past experience—in combinations that are historical because each is unique to its moment. Because the external rationality of any single theory may explain behavioral tendencies of large aggregates of people over long periods of time, the theorized disciplines—philosophical as well as social and psychological sciences—reveal the longue durée tendencies, cultural assumptions, and general human inclinations that are vital aspects of historical context.

The disciplinary distractions of historians’ early efforts in Africa thus derived not from inherent limits of the social-science theory and structure they employed but rather from their having to substitute conclusions from them for evidence from the past. Historians simply lacked sufficient data independent of their own imaginations to hold generalizing disciplines in heuristically secondary positions, supportive of their primary project of particularizing moments. In pursuit of the illusion of “tribes,” they also attempted to blur distinctions among the distant epistemologies by treating them as if they focused on the same elements of past historical contexts. For example, historians hoped that momentous subjects consciously remembered in oral tradition might directly confirm evidence of the thoroughly unremarkable aspects of life retrieved through archaeological methods; or historians and anthropologists—in efforts recurrent as both disciplinary camps confronted their need for the other—thought it possible to synthesize a “historical anthropology” or an “anthropological history.” But they learned that the dialectic of thinking interdisciplinarily does not resolve on the plane of method. Rather, Africanists of all academic persuasions maintained their academic composures separate from others and applied the insights of all simultaneously or, as they became relevant, to complex historical contexts. There, each contextualized and thereby rendered plausible conclusions reached independently through others. Engaged historically through intuitive application to human contexts, scrupulous respect for inherent differences among academic disciplines preserved the integrity of each and enriched the productivity of all. The balance among them paralleled history’s differentiated affect of scholar and subject and its mutual testing of intuition and induction.

Once historians learned enough about African local and regional dynamics to juxtapose them against Europe’s experience, they engaged perhaps the most dynamic differential contrast of all: Africans, seen as living in coherent “worlds of their own,” fully integral but not isolated, stood in fertile tension with broader currents of world history. Africans had in fact lived in broader historical contexts long before colonial rule in the twentieth century, and before their contact with the Atlantic economy over the preceding three hundred years. They had interacted with the Islamic world in transformative degrees for nearly a millennium preceding the consolidation of classical Egyptian civilization there. The interpretation of these historical interactions as African borrowing “traits” abstracted from their historical contexts had generated no fruitful dynamic, nor had the effort to endow Africans with agency by isolating them within separated spheres of autonomy. Equally, single-sided “domination” by European colonialism, “modernization” by industrial civilization, or feminized “penetration” by world capitalism had left Africans passive, reactive subordinates. But balanced tension between regional and global rhythms of change in African contexts summoned up the proximate differences, distanced intimacies, of active historical inquiry.98

The differences exploited by contemplating Europeans’ experiences in Africa together with Africans’ experience of global historical processes99 suggest that change in the large, persisting “civilizations” favored by progressive history in fact originates on their fringes, not in their relatively stable centers. Just as the Kuhnian process in science operates at the margins of awareness and intelligibility and as the unknown stimulates historical curiosity, it is at the edges of what is familiar that people in history encounter others different enough from themselves to appear baffling, where strangers pose challenges they are not prepared to meet, and to which they may respond with innovation.100 The alternative reaction—hatred, denial, incomprehension—leads only to the loss of perspective from which unduly ethnic history—African as well as European—suffers. But history’s humanistic premise of commonality, of intelligibility, turns dread of the unknown into a quest for explanation. In the case of Africa, differences had exceptional power to challenge the historical discipline, since they assumed extreme forms, wrapped in the emotional garb of race that lurked at the core of progressive history, appeared to transgress its apotheosis of evidence in documentary form, confronted modernists with present practices like witchcraft presumed left behind by the advance of civilization in Europe, and—far beyond what Africans in fact were doing—represented fanciful projections of private subjectivities that progressive historians’ insistence on rational objectivity most obscured from themselves. Explanation of anomalies as multiple and sensitive as these that Africa seemed to present could not but deepen professional historical sensibilities and broaden historians’ skills.101

Finally, historians turn to the past to implement their tragic sensibility to transitoriness only in part because ephemerality and contingency appear there in demonstrable ways. The past matters equally to the epistemology of history because evidence rooted firmly and inalterably in times gone by remains inaccessibly impervious to the inquirer’s imagination in the present. Strict respect for the pastness of evidence renders it inaccessible and thus immunizes historians against the constant temptation to manipulate it in the service of concerns of their own, created by history’s contravening metaphors of continuity and contiguity. It thus preserves the distance that makes historians of those who engage the lives of others, back then. Time, or—as Africans see it—absence within communities of empathy, makes the difference from which the vitality of historical inquiry flows.

Africans thus historicized as people with pasts of their own, with autonomous contributions distinguishable from the passivity assigned them by slavery, and with identities no longer rendered invisible by its racial sequels, are poised to enter the world’s longer-established historical regions. More Africans than Europeans reached the Americas until sometime early in the nineteenth century, as we have known for long enough to think more carefully than most have done about the implications of the fact, and recent evidence confirms that 80 percent of the women and 90 percent of the children coming before 1800 to the New World from the Old traveled in the holds of slaving vessels.102 Historical insights are now passing in both directions between Africa and Europe and the Americas, no less than Europeans and Africans have long interacted across the Mediterranean and all around the Atlantic.103 The regional fields that once confined action within contexts distortingly narrow are becoming “globalized.” Once historians recognize Africans as people with stories of their own, they expand their vision of large parts of mainland North American colonies to take account of the Africans who helped, however involuntarily, to make those places what they became. The significant presence of their African-American descendants, whatever their nominal exclusion by reason of race, then follows ineluctably. With Africans brought in from the cold beyond the periphery, Atlantic history stands solidly on three legs,104 and Africans join others around the world as intelligible participants in themes central to European history,105 beyond their former bit parts as foils for European follies overseas. By the maxim of history’s enrichment by diverse and comprehensive context—“research locally, but think globally”—all need all the others, and to equal degrees.

I am not the first president of this Association to acknowledge—at least implicitly through my confidence that lessons learned from doing history in Africa matter to historians specialized in fields once considered remote—the opportunity that the American Historical Association presents, distinctively among our many other, more specialized professional societies, to deepen understanding by providing a forum for cultivating awareness of the full historical context in which all whom we study in fact lived.106 The AHA has taken fruitful steps in recent years to “globalize regional histories,” in the phrase of one recent initiative, in the pages of this Review, and in supporting development of sophisticated historical thinking on a world scale.107 The “Atlantic context” of North American history and the global aspects of modern European history, not to mention the position of Christian Europe for a millennium before on the periphery of the Islamic world, and the Indo-centric and Afro-Eur-Asian dynamics around the Mediterranean long before the age of Philip II of Spain, all thrive on the stimulus of balancing, without abandoning, perspectives inherent in each against pulses of change in the others. The subjectivity essential to history comes alive in this interplay; we realize ourselves most fully when we engage with others unlike ourselves. Historians have achieved productive diversity as the discipline has matured, but—as progressive history showed—stark differentiation without compensating engagement is sterile. As the inclusive arena in which historians can avoid disintegrating into isolated, inert fragments, the American Historical Association keeps newer styles of history from taking older ones for granted and exposes older ones to resonances of the new that animate what they have already accomplished. Africa offers historians a rich challenge as part of this process, a place not fundamentally opposed to “ourselves,” as progressive history once constructed it, but one stimulatingly distinct in modulated ways from which all historians gain by including, just as Africanists thrive on being included.

Joseph C. Miller is T. Cary Johnson, Jr. Professor of History at the University of Virginia, where he has taught since 1972. His research has concentrated on early Africa, particularly Angola. He has written two monographs, Kings and Kinsmen: Early Mbundu States in Angola (1976) and Way of Death: Merchant Capitalism and the Angolan Slave Trade, 1730–1830 (1988), and numerous shorter studies. Way of Death received the Herskovits Prize of the African Studies Association and a Special Citation from the American Historical Association’s Bolton Prize Committee in 1989. Miller has compiled a definitive bibliography of Slavery and Slaving in World History, 2 vols. (1999), with some 15,000 entries, and plans to write a historical survey of this ubiquitous form of human domination.


Family and Community Dynamics

Communalism did not develop in overseas Japanese communities as it did among the overseas Chinese. In the fifteenth and sixteenth centuries, Japan's land-based lineage community gave way to down-sized extended families. Only the eldest son and his family remained in the parental household. Other sons established separate "branch" households when they married. In Japan a national consciousness arose, while in China the primary allegiance remained with the clan-based village or community. Thus, Japanese immigrants were prepared to form families and rear children in a manner similar to that of white Americans. The "picture bride" system brought several thousand Japanese women to the United States to establish nuclear branch families.

The "picture bride" system was fraught with misrepresentation. Often old photographs were used to hide the age of a prospective bride and the men sometimes were photographed in borrowed suits. The system led to a degree of disillusionment and incompatibility in marriages. The women were trapped, unable to return to Japan. Nevertheless, these women persevered for themselves and their families and transmitted Japanese culture through child rearing. The Issei women were also workers. They worked for wages or shared labor on family farms. Two-income families found it easier to rent or purchase land.

By 1930, second-generation Japanese Americans constituted 52 percent of the Japanese American population in the continental United States. In the years preceding World War II, most Nisei were children and young people, attempting to adapt to their adopted country in spite of the troubled lives of their parents. For many young people the adaptation problem was made even more ambiguous because their parents, concerned that their children would not have a future in the U.S., registered their offspring as citizens of Japan. By 1940, over half of the Nisei held Japanese as well as American citizenship. Most of the Nisei did not want to remain on family farms or in the roadside vegetable business and, with the strong encouragement of their parents, obtained high school and, in many cases, university educations. Discrimination against Japanese Americans, coupled with the shortage of jobs during the Great Depression, thwarted many Nisei dreams.

The dual-career family seems to be the norm for Sansei households. Recently, spousal abuse has surfaced as an issue. If it was a problem in previous generations, it was not public knowledge. In San Francisco an Asian women's shelter has been established, largely by third-generation Asian women.

In Japanese tradition, a crane represents 1,000 years. On special birthdays, 1,000 hand-folded red origami cranes are displayed to convey wishes for a long life. Certain birthdays are considered of greater importance than others.

At a wedding dinner, a whole red snapper is displayed at the head table. The fish represents happiness and must be served whole because cutting it would mean eliminating some happiness. Silver and golden wedding anniversaries are also occasions for festive celebrations.


What Ancient Rome Tells Us About Today’s Senate

The U.S. Senate’s abdication of duty at the start of this Memorial Day weekend, when 11 senators (nine of them Republican) did not even show up to vote on authorizing an investigation of the January 6 insurrection, makes the item below particularly timely.

Fifty-four senators (including six Republicans) voted to approve the investigative commission. Only 35 opposed it.

But in the institutionalized rule-of-the-minority that is the contemporary Senate, the measure “failed.” The 54 who supported the measure represented states totaling more than 190 million people. The 35 who opposed represented fewer than 105 million. (How do I know this? You take the list of states by population; you match them to senators; you split the apportioned population when a state’s two senators voted in opposite ways; and you don’t count population for the 11 senators who didn’t show up.)
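The arithmetic behind that parenthetical is simple enough to script. Here is a minimal sketch in Python of the tallying rule just described; the states, populations, and votes below are placeholders rather than the actual 2021 roll call, so the printed totals are illustrative only.

```python
# Minimal sketch of the population tally described above.
# The states, populations, and votes are placeholders, not the real 2021 data.

state_population = {
    "State A": 10_000_000,
    "State B": 4_000_000,
    "State C": 1_000_000,
}

# Each state's two senators, with votes recorded as "yes", "no", or "absent".
senate_votes = {
    "State A": ["yes", "yes"],
    "State B": ["yes", "no"],     # split delegation: the population is divided
    "State C": ["no", "absent"],  # an absent senator carries no population
}

totals = {"yes": 0.0, "no": 0.0}
for state, votes in senate_votes.items():
    share = state_population[state] / 2  # half the state's population per senator
    for vote in votes:
        if vote in totals:
            totals[vote] += share

print(f"Population represented by 'yes' votes: {totals['yes']:,.0f}")
print(f"Population represented by 'no' votes:  {totals['no']:,.0f}")
```

Run against the real apportionment figures and the May 28 roll call, the same few lines reproduce the roughly 190-million-versus-105-million split above.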

The Senate was, of course, not designed to operate on a pure head-count basis. But this is a contemporary, permanent imbalance beyond what the practical-minded drafters of the Constitution would have countenanced.

Why “contemporary”? Because the filibuster was not part of the constitutional balance-of-power scheme. As Adam Jentleson explains in his authoritative book Kill Switch, “real” filibusters, with senators orating for hours on end, rose to prominence as tools of 20th-century segregationists. Their 21st-century rebirth in the form of phony filibusters (where senators don’t even have to make a pretense of holding the floor) has been at the hands of Mitch McConnell, who made them routine as soon as the Republicans lost control of the Senate in 2006.

The essay below, by a long-time analyst and practitioner of governance named Eric Schnurer, was written before the Senate’s failure on May 28, 2021. But it could have been presented as a breaking-news analysis of the event.

Several days ago I wrote a setup for Schnurer’s essay, which I include in abbreviated form below. Then we come to his argument.

Back in 2019, I did an article for the print magazine on Americans’ long-standing obsession with the decline-and-fall narrative of Rome. Like many good headlines, the one for this story intentionally overstated its argument. The headline was, “The End of the Roman Empire Wasn’t That Bad.” Of course it was bad! But the piece reviewed scholarship about what happened in the former Roman provinces “after the fall,” and how it prepared the way for European progress long after the last rulers of the Western Empire had disappeared.

Many people wrote in to agree and, naturally, to disagree. The online discussion begins here. One long response I quoted was from my friend Eric Schnurer. I had met him in the late 1970s when he was a college intern in the Carter-era White House speechwriting office, where I worked. Since then he has written extensively (including for The Atlantic) and consulted on governmental and political affairs.

In his first installment, in the fall of 2019, Schnurer emphasized the parts of the America-and-Rome comparison he thought were most significant—and worrisome. Then last summer, during the election campaign and the pandemic lockdown, he extended the comparison in an even-less-cheering way.

Now he is back, with a third and more cautionary extension of his argument. I think it’s very much worth reading, for its discourses on speechwriting in Latin, among other aspects. I’ve slightly condensed his message and used bold highlighting as a guide to his argument. But I turn the floor over to him. He starts with a precis of his case of two years ago:

I contrasted Donald Trump’s America then—mid-2019—with the Rome of the Gracchus brothers, a pair of liberal social reformers who were both assassinated. Of course, the successive murders of two progressive brothers at the top rung of national power would seem to suggest the Kennedys more than, say, Bernie Sanders and Elizabeth Warren, to whom I compared them. But that is just to say that no historic parallels are perfect: One could just as fruitfully (or not) compare the present moment to America in the late 1960s and early 1970s, a period we managed to make it through without ultimately descending into civil war.

Yet historical events can be instructive, predictive—even prescriptive—when not fully descriptive of current times and customs.

What concerned me about the Roman comparison was, I noted at the time, “the increasing economic inequality, the increasing political polarization, the total eclipse of ‘the greater good’ by what we’d call ‘special interests,’ the turn toward political violence, all of which led eventually to the spiral of destructive civil war, the collapse of democracy (such as it was), and the wholesale replacement of the system with the imperial dictatorship: Looks a lot like the present moment to me.”

In the 1960s, such developments were in the future, although perhaps apparent then to the prescient …

The question that this raised was the extent to which the tick-tock of republican decline in Rome could provide a chronometer something like the Bulletin of the Atomic Scientists’ famous “doomsday clock”:

If we could peg late summer 2019 to the Gracchi era—roughly up to 120 B.C.—with the fall of the Republic equated to Julius Caesar’s crossing the Rubicon and subsequent assumption of the dictatorship (roughly speaking, 50 B.C.), we could set our republican sundial at, more-or-less, “seventy years to midnight.” But time under our atomic-era clocks moves more quickly than in ancient Roman sundials, so how could we equate a seventy-year margin on a sundial to our own distance from a possible republican midnight? We’d need another contemporary comparison to understand not just where we stood, but also how fast we were moving.

A year later I wrote about the developments of 2020 that seemed to move us closer to midnight. I compared last year’s Trump to Lucius Cornelius Sulla Felix: Despite common descriptions of Trump as a would-be Caesar, Sulla is, in terms of temperament and background, a closer match to The Donald: “Sulla, a patrician who indulged a fairly libertine, sometimes vulgar, lifestyle even throughout his several marriages, was nonetheless the champion of the economic, social and political conservatives.”

Of perhaps greater similarity—and great concern, in my view—was the increasing hollowing out of the Roman state from a “common good” into simply another form of private corporation benefiting the already-wealthy and powerful who could grab hold of its levers and hive off its components … After a tumultuous reign, Sulla retreated to his villa at Mar-a-Lago, er, Puteoli, and Rome fell into a period of relative quiescence.

That took us from the 120s B.C. in July 2019 to roughly 80 B.C. by August 2020: By that measure, our republican doomsday clock had lurched forward about 40 Roman years—a little more than halfway to midnight—in roughly a year …

But as U.S. politics fell into a period of relative quiescence lately, with Trump ensconced quietly at Puteoli—er, Mar-a-Lago—and a relatively calming, moderate and institutionalist Everyman (if no Cicero …) installed in the White House, I didn’t think much further about the Roman comparison.

That is, until last week, when I made an off-hand comment about a young family member’s misbehavior, jokingly complaining, “O tempora, O mores!”: “O the times, O the customs”—the most famous line from the most famous speech by Rome’s greatest lawyer, politician and orator, Marcus Tullius Cicero. I was suddenly struck by the similarity between the circumstances of Cicero’s famed oration and those we face now in the wake—and denial—of the assault on the Capitol of January 6.

I have a personal fondness for what have become known as Cicero’s “Catilinarian Orations”—a series of speeches he delivered at the height of a failed conspiracy to assault the citadels of republican governance and seize power. I read them in the original in my high school Latin class, at a time when my major focus was on school politics and, as the immediate past student body president, I was leading a similar (in my mind) effort to beat back a coup attempt by the would-be conspirator who had been defeated electorally by my chosen successor.

As a result, I found reading Cicero’s words uncovering, indicting, and overcoming Lucius Sergius Catilina (known to us as Catiline) and his co-conspirators, and thereby preserving democracy, rather thrilling. These orations—especially the first—have become famous as among the greatest speeches in history, not least because of the self-promoting Cicero’s promoting them as such. But to read them in the original is to recognize them as deservedly so.

Latin is an extremely complicated but flexible language. Its elaborate system of agreements between nouns, adjectives and verbs allows for words to be ordered in sometimes almost-random-seeming patterns requiring extensive detective skills to puzzle out the actual meaning of a sentence. At the peak of my Latin studies, for example, I could probably translate an average sentence in the great Latin epic, The Aeneid, at the rate of about one per hour. Reading Cicero in Latin, however, is like spreading warm butter over a piping-hot piece of bread: It simply flows.

Cicero could reach unequaled heights of high dudgeon with the simplest of sentences. He reached for his greatest in the opening lines of his first Catilinarian Oration to the Roman Senate. The immediacy of the language fairly leaps off the dead pages as if alive itself, overpowering the reader with the desperation of Cicero’s fight for democracy, his courage in the face of danger, his importuning his at-first-impassive audience seated in their clean white togas amidst the marble walls and red-cushioned banquettes, slowly distancing themselves from the censured Catiline as Cicero’s oratory builds in mighty waves.

Catiline was yet another aristocratic yet amoral politician who had aimed at absolute power by appealing cynically to the reactionary foot soldiers of Sulla’s former army and their “blue-collar” supporters. But he nonetheless was headed to a loss in the consular election of 63 B.C., which would have ended his political ambitions, so he conspired to overthrow the Roman state, intending literally to decapitate the official vote count on election day by killing the consul overseeing it—Cicero—and seizing violent control of the government.

Cicero could be considered something of a moderate, an institutionalist who revered the Republic as he rose to power in its capital despite being what the Romans called a “new man,” one who had made his own way from an undistinguished upbringing in the hinterlands (“Cicero” means “chickpea,” a literal and uncomplimentary nod to the family’s roots).

Upon uncovering the conspiracy, Cicero called an emergency meeting of the Senate to denounce this attempt to short-circuit the election and end republican government through violence. Cicero was surprised that Catiline dared pompously to show himself at the day’s proceedings, as if his efforts to undermine the state were perfectly proper, and to deny he was doing what everyone knew he was doing: “When, O Catiline, do you mean to cease abusing our patience? How long is that madness of yours still to mock us? When is there to be an end of that unbridled audacity of yours, swaggering about as it does now?”

But what is most notable about the famed opening of this first and greatest oration is Cicero’s clear astonishment at the blasé reaction of much of the Senate to this open assault on republican values. “O the times, O the customs,” he responds, and then continues:

“The Senate understands these things, the Consul sees them; yet this man still lives. He lives? Indeed, he even comes into the Senate, he takes part in public debate, he notes and marks out with his eyes each one of us for slaughter!”

Despite the fact that, at this point, Catiline’s intent to murder Cicero and various other members of the Senate, to stop the vote count and overturn the foregone election results, and unlawfully to seize the levers of government through violence is well known to all of them, a good number of these very same legislators and leaders shrug the whole thing off. Some sympathized with his political program; others were implicated in the plot; still others were basically in the same boat as Catiline, having committed similar crimes and sexual debaucheries that limited their political futures; and still others were perfectly fine with ending the trappings of republicanism if it meant they retained their power and Senate seats. And some simply couldn’t be roused to care.

The conspiracy ultimately collapsed and was defeated, but not without further militant uprisings aided by Rome’s enemies abroad. Catiline, a demagogue but in the end not the best of politicians or insurrectionists, was killed. Democracy, and the old order of things, seemed to have survived, and matters returned to a more-or-less normal state under Cicero’s stable hand.

But it turned out to be a brief reprieve. The rot had already set in. What mattered most in the long-term was not the immediate threat of the insurrectionists, but rather the complacency, if not sympathy, of the other ostensibly-republican leaders. It revealed the hollowness of not just their own souls but also the nation’s.

Another 10 months in America, another 15 years forward on the Roman sundial. At this rate, we’re about a year before midnight.
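To make the clock arithmetic explicit, here is a minimal sketch of the extrapolation Schnurer describes. The calendar-to-Roman-year readings are taken from his essay; the code framing is mine, not his, and the precision is entirely spurious.

```python
# Sketch of the "republican doomsday clock" extrapolation described above.
# "Midnight" is the fall of the Republic; the dates are the essay's rough analogies.

MIDNIGHT_BC = 50  # Caesar crosses the Rubicon, roughly speaking

# Pairs of (our calendar year, approximate Roman reading in years B.C.)
readings = [
    (2019.5, 120),  # mid-2019 ~ the era of the Gracchi
    (2020.6, 80),   # August 2020 ~ Sulla's retirement to Puteoli
    (2021.4, 65),   # mid-2021 ~ "another 15 years forward" (Catiline's plot was 63 B.C.)
]

# Clock rate over the latest interval: Roman years per one of our calendar years.
(t0, bc0), (t1, bc1) = readings[-2], readings[-1]
rate = (bc0 - bc1) / (t1 - t0)

remaining_roman_years = bc1 - MIDNIGHT_BC
remaining_our_years = remaining_roman_years / rate

print(f"Current clock rate: about {rate:.0f} Roman years per calendar year")
print(f"Estimated time to 'midnight': about {remaining_our_years:.1f} of our years")
```

At the rate implied by the most recent interval, the sketch lands on roughly eight-tenths of a year, which is where "about a year before midnight" comes from.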

I don’t know how many people in the reading public would recognize the name Dan Frank. Millions of them should. He was a gifted editor, mentor, leader, and friend, who within the publishing world was renowned. His untimely death of cancer yesterday, at age 67, is a terrible loss, especially for his family and colleagues, but also for a vast community of writers and for the reading public.

Minute by minute, and page by page, writers gripe about editors. Year by year, and book by book, we become aware of how profoundly we rely on them. Over the decades I have had the good fortune of working with a series of this era’s most talented and supportive book editors. Some day I’ll write about the whole sequence, which led me 20 years ago to Dan Frank. For now, I want to say how much Dan Frank meant to public discourse in our times, and how much he will be missed.

Dan Frank during a 2015 interview with Thomas Mallon at the Center for Fiction in New York

Dan started working in publishing in his 20s, after college and graduate school. While in his 30s he became editorial director at Viking Books. Among the celebrated books he edited and published there was Chaos: Making a New Science, by James Gleick, which was a runaway bestseller and a critical success. It also represented the sort of literary nonfiction (and fiction) that Dan would aspire to: well-informed, elegantly written, presenting complex subjects accessibly, helping readers enter and understand realms they had not known about before. As it happened, Gleick worked with Dan on all of his subsequent books, including his biographies of Richard Feynman and Isaac Newton, as well as Faster and The Information.

In 1991, after a shakeup at Pantheon, Dan Frank went there as an editor, and from 1996 onward he was Pantheon’s editorial director and leading force. As Reagan Arthur, the current head of the Knopf, Pantheon, and Schocken imprints at Penguin Random House, wrote yesterday in a note announcing Dan’s death:

During his tenure, Dan established Pantheon as an industry-leading publisher of narrative science, world literature, contemporary fiction, and graphic novels. Authors published under Dan were awarded two Pulitzer Prizes, several National Book Awards, numerous NBCC awards, and multiple Eisners [for graphic novels] ….

For decades, Dan has been the public face of Pantheon, setting the tone for the house and overseeing the list. He had an insatiable curiosity about life and, indeed, that curiosity informed many of his acquisitions. As important as the books he published and the authors he edited, Dan served as a mentor to younger colleagues, endlessly generous with his time and expertise. Famously soft-spoken, a “writer’s editor,” and in possession of a heartfelt laugh that would echo around the thirteenth floor, he was so identified with the imprint that some of his writers took to calling the place Dantheon.

There are surprisingly few photos of Dan available online. I take that as an indication of his modesty; of the contrast between his high profile within the publishing world and his intentionally low profile outside it; and of his focus on the quiet, interior work of sitting down with manuscripts or talking with authors. The only YouTube segment I’ve found featuring him is this one from 2015, when Dan interviewed the author Thomas Mallon at the Center for Fiction in New York. (I am using this with the Center’s permission.)

Dan is seated at the right, with his trademark round glasses. The clip will give an idea of his demeanor, his gentle but probing curiosity, his intelligence and encouragement, his readiness to smile and give a supportive laugh. Watching him talk with Mallon reminds me of his bearing when we would talk in his office at Pantheon or at a nearby restaurant.

Everything that is frenzied and distracted in modern culture, Dan Frank was the opposite of. The surest way to get him to raise a skeptical eyebrow, when hearing a proposal for a new book, was to suggest some subject that was momentarily white-hot on the talk shows and breaking-news alerts. I know this firsthand. The book ideas he steered me away from, and kept me from wasting time on, represented guidance as crucial as what he offered on the four books I wrote for him, and the most recent one where he worked with me and my wife, Deb.

Dan knew that books have a long gestation time—research and reporting, thinking, writing, editing, unveiling them to the world. They require hard work from a lot of people, starting with the author and editor but extending to a much larger team. Therefore it seemed only fair to him that anything demanding this much effort should be written as if it had a chance to last. Very few books endure; hardly any get proper notice; but Dan wanted books that deserved to be read a year after they came out, or a decade, or longer, if people were to come across them.

The publisher’s long list of authors he worked with, which I’ll include at the bottom of this post, only begins to suggest his range. When I reached the final page of the new, gripping, epic-scale novel of modern China by Orville Schell, called My Old Home, it seemed inevitable that the author’s culminating word of thanks would be to his “wonderful, understated” editor, Dan Frank.

What, exactly, does an editor like this do to win such gratitude? Some part of it is “line editing”—cutting or moving a sentence, changing a word, flagging an awkward transition. Dan excelled at that, but it wasn’t his main editing gift. Like all good editors, he understood that the first response back to a writer, on seeing new material, must always and invariably be: “This will be great!” or “I think we’ve really got something here.” Then, like all good editors, Dan continued with the combination of questions, expansions, reductions, and encouragements that get writers to produce the best-feasible version of the idea they had in mind. Their role is like that of a football coach, with the pre-game plan and the halftime speech: They’re not playing the game themselves, but they’re helping the athletes do their best. Or like that of a parent or teacher, helping a young person avoid foreseeable mistakes.

You can read more about Dan Frank’s own views of the roles of author, editor, publisher, and agent, in this interview in 2009, from Riverrun Books. It even has a photo of him! And you can think about the books he fostered, edited, and helped create, if you consider this part of Reagan Arthur’s note:

Dan worked with writers who were published by both Pantheon and Knopf. His authors include Charles Baxter, Madison Smartt Bell, Alain de Botton, David Eagleman, Gretel Ehrlich, Joseph J. Ellis, James Fallows, James Gleick, Jonathan Haidt, Richard Holmes, Susan Jacoby, Ben Katchor, Daniel Kehlmann, Jill Lepore, Alan Lightman, Tom Mallon, Joseph Mitchell, Maria Popova, Oliver Sacks, Art Spiegelman, and many, many others.

Deb and I will always be grateful to have known Dan Frank, and to have worked with him. We send our condolences to his wife, Patty, and their sons and family. The whole reading public has benefited, much more than most people know, from his life and work.

Harris & Ewing / Library of Congress

The renowned filmmaker Ken Burns has a new project called UNUM, about the sources of connection rather than separation in American life.

His latest segment involves “Communication” in all its aspects, and it combines historical footage with current commentary. Some of the modern commenters are Yamiche Alcindor, Jane Mayer, Megan Twohey, Kara Swisher, and Will Sommer. You can see their clips here.

One more of these segments covers the revolution in political communication wrought by Franklin D. Roosevelt’s radio addresses known as “fireside chats.” It was drawn from Burns’s earlier documentary Empire of the Air, which was narrated by Jason Robards. You can see a clip from that documentary here.

As part of the UNUM series of contemporary responses to historical footage, Burns’s team asked me to respond to the FDR segment. (Why me? In 1977—which was 44 years after FDR’s first fireside chat, and 44 years ago, as of now—the newly inaugurated President Jimmy Carter gave his first fireside chat, which I helped write. It’s fascinating to watch, as a historical artifact; you can see the C-SPAN footage here.)

This is what I thought about FDR’s language, and how it connects to the spirit of our moment in political time:

For reference, here is the text version of what I said in the Burns video, about those FDR talks, as previously noted here:

The most important words in Franklin Roosevelt’s initial fireside chat, during the depths of Depression and banking crisis in 1933, were the two very first words after he was introduced.

They were: My friends.

Of course political leaders had used those words for centuries. But American presidents had been accustomed to formal rhetoric, from a rostrum, to a crowd, stentorian or shouted in the days before amplification. They were addressing the public as a group—not families, or individuals, in their kitchens or living rooms: My friends. A few previous presidents had dared broadcast over the radio—Harding, Coolidge, Hoover. But none of them had dared imagine the intimacy of this tone—of trying to create a national family or neighborhood gathering, on a Sunday evening, to grapple with a shared problem.

Roosevelt’s next most important words came in the next sentence, when he said “I want to talk for a few minutes” with his friends across the country about the mechanics of modern banking. Discussing, explaining, describing, talking—those were his goals, not blaming or declaiming or pronouncing. What I find most remarkable in the tone that followed was a president talking up to a whole national audience, confident that even obscure details of finance could be grasped if clearly explained, rather than talking down, to polarize and oversimplify.

Consciously or unconsciously, nearly every presidential communication since that time has had FDR’s model in mind. In 1977 the newly inaugurated 39th president Jimmy Carter gave a fireside chat about the nation’s energy crisis, a speech that, as it happens, I helped write. Nearly every president has followed Roosevelt’s example of the basic three-part structure of a leader’s speech at a time of tragedy or crisis: First, expressing empathy for the pain and fear of the moment; second, expressing confidence about success and recovery in the long run; and third, offering a specific plan for the necessary next steps.

Some of these presentations have been more effective, some less. But all are operating against the background, and toward the standard of connection, set by the 32nd president, Franklin Roosevelt, starting in 1933. “Confidence and courage are the essentials of success in carrying out our plan,” he said in that first fireside chat. “Let us unite in banishing fear.”

The opening words of that talk had been “My friends.” His closing words were, “Together we cannot fail.”


The pandemic ravaged America’s big cities first, and now its countryside. The public-health and economic repercussions have been felt everywhere. But they have been hardest on the smallest businesses, and the most vulnerable families and communities.

This is an update, following a report last month, on plans to repair the damage now being done.

1) What the federal government can do: The Institute for Local Self-Reliance is a group concentrating on the business-structure, technological, political, and other obstacles that have held small cities and rural areas back—and how they might be reversed.

This month the ILSR released a report on steps the federal government could take to foster business and civic renewal at the local level. The report is available in PDF here, and a summary is here. The larger argument is designed to:

… help the federal government avoid the mistakes made in the wake of the 2007-08 financial crisis …

Rather than the housing sector [as in the previous crisis], the current economic fallout is decimating America’s small businesses. Nearly 100,000 small, independent businesses have already closed their doors permanently, with Black-owned businesses taking the biggest hit. As of early November, small business revenue was down a stunning 31 percent from January. As small businesses close or hang on by their fingernails, meanwhile, a handful of big corporations are recording massive profits, increasing their already-dominant market share, and dramatically accelerating concentration of the economy….

People are losing their dreams and livelihoods. Neighborhoods are losing beloved local stores and gathering spots. The country is losing much of its local productive capacity. To answer this generational challenge, we must have a federal economic recovery strategy focused on rebuilding, creating, and growing America’s small, independent businesses.

The report covers large policy areas—a different approach to antitrust—and very tangible specifics, like the way credit-card processing fees are handled. It is certainly worth consideration by the Biden team. (And, in the same vein, here is another worthwhile piece, by Maddie Oatman in Mother Jones, on the importance of economic prospects for rural America.)

2) What some state governments can do (a California model): Responding to a crisis that is both global and intensely local naturally involves a combination of measures—international efforts to detect and contain disease, nationwide economic strategies, and city-by-city and state-by-state responses to the problems and opportunities of each locale.

California, which has roughly one-eighth of the whole population of the United States and produces roughly one-seventh of U.S. economic output, also has been responsible for an outsize proportion of innovations. Some of them have gone awry or run amok, as Mark Paul and Joe Mathews described a decade ago in their book The California Crackup (and as I mentioned in this 2013 profile of Jerry Brown). Others are a positive model for other states and the nation as a whole—notably, a non-partisan, anti-gerrymandering approach to drawing political-district lines. Arnold Schwarzenegger, who was governor when this reform came in, has been taking the anti-gerrymandering cause nationwide, as Edward-Isaac Dovere reported here.

One of California’s innovations that deserves broader attention is its “Little Hoover Commission.” After World War II, then-President Harry Truman appointed former President Herbert Hoover to head a commission looking into broad questions of government organization and efficiency. That was the “big” Hoover Commission.

California’s “Little Hoover Commission” counterpart was created in 1962 and was meant to be a permanent, independent, non-partisan source of oversight and expertise about the state’s long-term challenges, and the state government’s response to them. In my new print-magazine article, I argue that, on the national level, formal commissions have played a surprisingly important role in investigating calamities (the space shuttle Challenger explosion, the 9/11 attacks) or assessing crises and trends (educational failures, resegregation and racial justice). California has, in effect, institutionalized this kind of non-partisan inquiry.

This month, the Little Hoover Commission has released its report on how badly the pandemic-era economic implosion is hurting businesses and families in California, and what might be done about it. The executive summary is here, and the full report is here.

I won’t attempt to summarize the whole thing here, but in essence their recommendation is an emergency effort to link public and private resources of all sorts—individual donors, NGOs, corporations, financial institutions—in a “rebuilding fund.” The fund, in turn, would concentrate on small businesses, and especially those in disadvantaged communities. One of its recommendations:

The state needs to use its megaphone to make financial institutions, private investors, and philanthropic donors aware of the Rebuilding Fund and to encourage high-net-worth individuals, impact investors, and major corporations to lend and/or donate to the Rebuilding Fund.

This may include working with regional business councils to disseminate information about the Rebuilding Fund and explain why it is vital to support small businesses, especially those in underserved communities. It may also include fully leveraging existing state investment networks.

In order to encourage investment, GO-Biz and IBank should also develop a strategy for publicly recognizing institutional investors and explore additional means for incentivizing participation.

In parallel with this effort, two California-based business-and-economic authorities, Laura Tyson and Lenny Mendonca, have put out a paper on the urgency of a new federal stimulus program. (For the record, both of them are friends of mine.) They say:

It is incumbent on the federal government to provide more generous and flexible funding for state and local governments. Governors and mayors across the country are pleading for help ahead of a challenging winter. Most states and cities have exhausted rainy-day funds and are facing a collective shortfall of $400 billion or more, according to the most recent estimates.

Because most state and local governments cannot legally spend more than they receive in revenues, they have no choice but to raise taxes or cut essential services and employment in health, public safety, and education, as many are already doing. Either option will make matters worse. Yet by the fiat of Mitch McConnell, the U.S. Senate seems likely to end this year without addressing the states’ and cities’ needs. Many states and cities are improvising in useful ways, but national crises require a national response. Help!

(And while I am at it, here is another locally based initiative to create more supportive ecosystems for entrepreneurs.)

3) Ways around the college-degree bottleneck: Research universities and four-year colleges are simultaneously the glory and the heartbreak of America’s educational system. They’re the glory for obvious reasons. They’re the heartbreak because of the financial challenges for many liberal-arts schools, and the student-debt burdens for millions of young people, and the factors that can make higher education reinforce existing privileges, rather than offset them.

The negative power of judging people purely by sheepskin credentials is very familiar. (I actually did an Atlantic cover story about it 35 years ago, here.) But a positive counterpart in the past few years has been the rapid opening of pathways to careers that don’t require a four-year degree. That’s what we’ve emphasized in our reports on community colleges, “career technical” programs in high schools, apprenticeship systems, and other ways of matching people with the opportunities of this moment.

Last week The New York Times had a story by Steve Lohr with the headline, “Up to 30 Million in U.S. Have the Skills to Earn 70% More, Researchers Say.”

This is a great headline that conveys the essential point: There are opportunities (post-pandemic) for people who for various reasons have not completed the four-year bachelor’s gantlet. More information is available at Opportunity@Work and through the Rework America Alliance. (For the record, I know many of the people involved in the Opportunity and Rework initiatives.)

As with previous dispatches, none of these approaches is “the” answer to this era’s many crises. But they’re all potential parts of an answer. They deserve attention.

When I was a kid, the sin of returning books late to the public library occupied a category of dread for me, right next to weekly confessions to the Catholic priest (what can an 8-year-old really have to confess?) and getting caught by the dentist with a Tootsie Roll wrapper sticking out of my pocket. So decades later, when I heard about libraries going “fine-free,” it sounded like an overdue change and a nice idea.

Collecting fines for overdue books has been going on for over a century, originally seen as a source of revenue and as an incentive for people to behave responsibly and actually return borrowed books. Then, as early as the 1970s, research and experiments with going fine-free began to pick up steam. But as recently as four years ago, over 90 percent of libraries in the U.S. were still charging small change for late returns.

A Seinfeld episode from 1991, “The Library,” featuring a library cop, seems at once timely and untimely. This is Seinfeld; it will make you laugh.

Missions, Policies, Changes:

The last five years have been very busy in the world of overdue fines. In what has become known as the “Fine-Free Movement,” many librarians have begun to question the traditional policy of overdue fines, and attitudes have begun to change. Are fines consistent with a fundamental mission of libraries: to serve the public with information and knowledge? And to address that mission equitably across the diverse population of rich and poor library users?

A 2016 Colorado State Library system report showed that eliminating overdue fines removed barriers to access for children. While some people only notice fines as an irritation, others feel the weight heavily enough to be driven away from the library.

In 2017, a Library Journal poll of 450 libraries found that over 34 percent considered eliminating at least some fines.

In 2018, a poll of Urban Libraries Council (ULC) member libraries found that the most common reason (54 percent, dwarfing all others) responding libraries had gone fine-free was that eliminating fines increased access for low-income users and children.

By late 2018, several big-city public-library systems, including San Diego, Nashville, Salt Lake City, Baltimore, St. Paul, and Columbus, Ohio, had eliminated overdue fines.

The powerful American Library Association, representing some 55,000 members, adopted “a resolution on monetary fines as a form of social inequity” at its midwinter meeting in 2019.

In January 2019, the city of San Francisco issued an extensively researched and influential report called Long Overdue, on the impact of fines on the mission of libraries, and the costs of eliminating fines for libraries, users, and the city and county of San Francisco. The report ultimately recommended eliminating overdue fines throughout the public library system.

When the pandemic closed libraries and made it hard or impossible for people to return books, many libraries revisited their policies on overdue fines. In Washington, D.C., an early short-term amnesty experiment at the beginning of COVID-19 grew into a subsequent vote by the Public Library Board of Trustees to expand the elimination of fines from youth only to everyone.

Experiments in fines, amnesties, alternatives:

Libraries have been experimenting with lots of different ways to address fines for overdue books. Some stopped fining all patrons; others, only children or youth; still others exempted active military and veterans from fines. Some forgive fines up to a certain dollar amount. Santa Barbara, California, follows one common practice—forgiving fines for a certain number of days (30 in this case), then charging for the cost of the book, which can be forgiven upon its return.

Lost or damaged books are in a different category. The loss of a book is much more costly and cumbersome to a library than a late return, and libraries work out various ways to address that.

When libraries offer popular amnesty periods for returning overdue books, the books often pour in like gushers. An amnesty program in Chicago brought in 20,000 overdue items; Los Angeles, nearly 65,000; San Francisco, just shy of 700,000. And a bonus: After the Chicago library went fine-free, thousands of users whose fees were forgiven returned to the library for new cards, and readers checked out more books overall than before.

Other libraries found substitutes for monetary fines. In 2018, the public libraries in Fairfax County, Virginia, began a food-for-fines program, which collected 12,000 pounds of food to donate to a nonprofit food pantry. Each donated item accrued one dollar toward a maximum $15 fine forgiveness. In Queens, New York, the public library has a program for young people to “read down” their 10-cent per day fines. One half hour of reading earns one dollar in library bucks to pay off fines.

Calculating costs of fines and the benefits of going fine-free:

The 2017 Library Journal poll of about 450 libraries across the country estimated that nearly $12 million in monthly library fines would be collected nationwide that year.

In fact, the loss of revenue takes bites of different sizes from libraries’ budgets. Some seem like nibbles. When the New Haven, Connecticut, public library went fine-free in July 2020, the sum of overdue fines was less than one-quarter of one percent of the library’s annual budget. In San Francisco, fines in FY 2017-18 represented 0.2 percent of the operating budget. In Schaumburg Township, Illinois, 0.25 percent of the annual budget. In Santa Barbara, 1 percent. The St. Paul, Minnesota, libraries found that they spent $250,000 to collect $215,000 in fines.

But a late 2018 ULC poll of its roughly 160 members reported that one in five libraries considering eliminating fines named financial concerns as the biggest deterrent. (The only larger category was political reasons, at 34 percent.) The Long Overdue report found that fines disproportionately harmed library customers in low-income areas and those with larger proportions of Black residents. While libraries in all areas “accrued fines at similar rates,” those located in areas of lower income and education and with higher numbers of Black residents have “higher average debt amounts and more blocked users.”

As Curtis Rogers, the Communications Director of the Urban Libraries Council, described the findings to me: “Overdue fines do not distinguish between people who are responsible and those who are not—they distinguish between people who have or do not have money.”

Funding sources for libraries vary considerably. Some libraries enjoy a secure line item in a city or county budget. Others patch together a more fragile existence of fundraising, philanthropy, public bonds and levies, and other sources.

Other factors have changed the landscape as well. The growth of e-book lending, which can automatically time out and incur no fines, has cut into overall fine revenue numbers somewhat.

To make up for losses in revenues, libraries have come up with creative answers. For example: processing passport renewals; a “conscience jar” for overdue books; charging fees for replacing lost cards and for copying, scanning, and faxing; charging rent for community rooms or theaters; and general tightening of spending.

The impact of fines should be measured in ways beyond cash revenues. Collecting fines and blocking accounts can be time-consuming, stressful, and unpleasant for librarians, and can cause general discomfort and even ill will in a community.

I witnessed a small episode of the toll that fines can take on the strong currency of people’s trust and goodwill in libraries. During a summer visit a few years ago to the public library in an unnamed town in the middle of the country, I was hanging around the check-out desk when I saw a man reach the front of the line to borrow a few books. The librarian told him that his card was blocked, and he needed to pay his fines before he could borrow the books. The man was part of the town’s sizable Spanish-speaking population, and he didn’t understand the librarian. She repeated her message, louder each time. A line was building at the check-out desk. Finally, the man went to fetch his elementary-school-age daughter to translate for him. It all ended badly: He was embarrassed, the daughter was embarrassed. Others like me who witnessed the exchange were embarrassed. The man left without borrowing the books. The librarian was stuck behind non-transparent rules, although I have seen more gracious handling of such situations.

In 2016, the Orange Beach, Alabama, public libraries replaced overdue fines with voluntary donations, which they soon dropped as well. Steven Gillis, the director of the public library, wrote that the overall goodwill the library earned in the community with its new fine-free policy had translated into increased municipal funding from a sympathetic and appreciative city council.

The Long Overdue report also found that eliminating fines increased general goodwill between users and staff, and also increased the numbers of users and the circulation of books. They saw no increases in late book returns.

* * *

In 2018, a young research fellow at the Urban Libraries Council (ULC), Nikolas Michael, set out to tell the story of libraries going fine-free by creating an interactive map, which has since become one of ULC’s most used resources.

Here is the map and how it works:

Map provided courtesy of the Urban Libraries Council

Each arrow on the map represents a library that ULC has logged to tell its story of going fine-free. The gold arrows are ULC member libraries; silver are non-member libraries.

The map is interactive: click on an arrow and you’ll see some of the whys, wherefores, and impact of the change on a particular library. The map updates with each additional entry.

Curtis Rogers, from ULC, and Betsey Suchanic, a program manager there, described on a Zoom call the background of the map and the impact it has had on telling the story and building a movement.

The map helps libraries make well-informed decisions, as they use it for research and evidence to weigh the pros and cons of going fine-free.

In Philadelphia, Councilwoman Cherelle Parker called for a hearing to explore eliminating fines at the Free Library of Philadelphia. She directly referenced the ULC map of fine-free libraries as evidence. ULC also submitted written testimony for the hearing.

The map and ULC’s other reporting on the fine-free movement contribute to larger-context conversations—for example, on the topic of the pros and cons of other kinds of municipal fines, like parking tickets.

The Public Library of Youngstown and Mahoning County just went fine-free, and it used the map specifically to make its case to its board. You can see the map on page 8 of the library’s PowerPoint presentation.

* * *

America’s current national focus on issues of racial, economic, educational, health, and environmental equity, and on policing and justice, has a way of reaching a sound-bite ending in media segments or conference panel wrap-ups. It goes something like this: “We need to have a national conversation about …”

Public libraries, which are in business to be responsive to public needs and wants, are a model for moving beyond conversations to action. For example, public libraries open their doors to homeless people, they feed hungry children in after-school programs, they offer free Wi-Fi access for people and places (especially rural) where it is hard to come by, and in increasing numbers, they find ways to forego monetary fines. These actions shore up in a tangible way a major mission of public libraries: to provide equal access to information and knowledge for all citizens.

As it was in 2016, so it is again in 2020: A central axis of national-election results is the rural-urban gulf. Larger cities—really, conurbations of any sort—mainly went for Joe Biden. Donald Trump’s major strength was in the smallest cities and in rural areas.

Obviously there has been more to Donald Trump’s power than purely regional dynamics. (In particular, there are racial dynamics, as laid out here and here and here.) And as Deb Fallows and I have argued for years, the United States looks more hopelessly divided when it comes to national elections than it does from any other perspective. For instance, see these dispatches from western Kansas, back in 2016.

But also obviously, national elections matter, and regional and locational polarization makes every other challenge for America more difficult. In a new paper for Brookings, John Austin argues that Midwestern voting patterns for Trump and Biden show how the sense of being “left behind” fuels resentment-driven politics—and how a sense of possibility can have the opposite effect. August Benzow of The Economic Innovation Group has a related paper on the stark differences within rural America on racial diversity, economic positioning, and political outlook.

Does anyone have an idea of how to blunt these differences and open more opportunities? Especially as the new Biden administration faces all the economic, public-health, law-enforcement, and other crises it is about to take on? Here are some recent items worth noticing:

1) A Marshall Plan for Middle America: During election years, reporters troop into cities (and especially diners) in Ohio, Pennsylvania, and other parts of “interior America” to get political quotes. Then, typically, the press spotlight moves someplace else.

This past weekend in The Washington Post, the mayors of eight of these middle-American cities wrote about what could be done to move their areas ahead. These are places we know and have written about, many of whose mayors we also know personally. The cities are Pittsburgh, Pennsylvania; Cincinnati, Columbus, Dayton, and Youngstown in Ohio; Louisville, Kentucky; and Huntington and Morgantown, West Virginia. All are in the Appalachian or Ohio River Valley regions, often stereotyped in national discourse as the land of coal mines and decrepit factories.

The mayors argue that it is time to draw on the region’s manufacturing heritage, and recreate its economy in a fundamental way. For instance:

According to our research, taking advantage of our community assets, geographic positioning and the strengths of our regional markets can help create over 400,000 jobs across the region by investing in renewable energy and energy efficiency upgrades to buildings, energy infrastructure and transportation assets.

Renewable sources of power are proving less expensive, and fossil fuel companies are increasingly dependent on federal subsidies to survive. Couldn’t these subsidies be strategically shifted to invest in a green economy that keeps these largely suburban and rural jobs but transitions them, with federal support, into new industries that will grow in the 21st century?

Like our friends at Reimagine Appalachia—a grass-roots community and environmental organization—we believe a Marshall Plan-scale reinvestment is necessary. Rather than a “Green New Deal,” our plan would seed long-term regional investments in Appalachia’s rural and suburban communities, while leveraging the technological successes of our tentpole cities to assist them. The same goes for our neighbors in the Ohio River Valley throughout the Rust Belt and up to the Great Lakes region.

I agree with their pitch, and hope their prospectus gets attention. Here is a complementary argument from Bill Peduto, the mayor of Pittsburgh, and another from Annie Regan, in the Pittsburgh Post-Gazette.

2) Reducing Polarization by Modernizing Rural Policy: The political and cultural ramifications of a rural-urban divide are hot topics journalistically. “Rural policy,” not so much. But in a new report for Brookings (available here), Anthony Pipa and Nathalie Geismar argue that straightening out the rat’s-nest of programs intended to help rural America could make a big difference.

Rat’s nest? Take a look at this organization chart included in the Brookings report:

Courtesy of the Brookings Institution

“The economic fallout from the COVID-19 pandemic threatens to further disrupt local economies that in 2019 were still recovering from the Great Recession” and other long-term disruptions, Pipa and Geismar write. They add:

Just recently, COVID-19 prevalence in nonmetro U.S. areas surpassed that in metro areas for the first time. Rural residents are now almost 2.5 times more likely than urban residents to die from the virus. This is compounded by the decreasing access to health care that many rural communities face …

Now, rural communities must navigate a virtual world of work with intermittent broadband access and adapt to additional shocks to manufacturing and agriculture supply chains ….

Despite these challenges, rural communities are diverse—both demographically and economically—and entrepreneurial. They help power, feed, and protect America at rates disproportionate to other geographies. They house 99 percent of wind power capacity and will play a key role in national climate strategies that require investments in clean energy infrastructure.

The report has many recommendations, but here are the three main ones:

  1. Launch a new development corporation, to invest in local vision and leadership through long-term block grants at the community level and innovative financing tools that give communities a fighting chance to strengthen and renew their local institutions, economies, and vision.
  2. Create a national rural strategy, elevate White House and interagency leadership, and undertake a set of specific and targeted reforms to enhance federal coherence and effectiveness.
  3. Appoint a bipartisan congressional commission to undertake a top-to-bottom review regarding the effectiveness of federal assistance and build political momentum to transform federal rural policy.

3) Local journalism and local recovery: This is a big ongoing theme, which will only gain in importance if recovery efforts like those mentioned above are given a serious try in communities across the country. Margaret Sullivan of The Washington Post, a former editor herself and an indispensable media observer, published a book this year about the accelerating forces working against local news. Just after this year’s election, Dan Kennedy, another important longtime media writer, argued on the GBH news site that shoring up local journalism would have direct benefits community-by-community, plus the broader potential of calming down now-fevered national discussions. On the Poynter site, Rick Edmonds—yet another important longtime media writer—gives a comprehensive overview of how “shoring up” might actually work. For instance:

As the pandemic advertising recession and longstanding negative trends have made the financial precariousness of these enterprises obvious, Congress has pretty much decided it should come to the aid of local news. The question of how remains, together with making the help timely.

My take comes from conversations with a variety of advocacy groups pushing one form or another of legislative assistance. A surprising favorite approach has emerged, too—direct subsidies for news subscribers, local journalists and small business advertisers.

That’s the structure of HR 7640, the Local Journalism Sustainability Act, sponsored by Rep. Ann Kirkpatrick (D-Ariz.), Rep. Dan Newhouse (R-Wash.) and more than 70 co-sponsors from both parties.

There is a lot more detail in Edmonds’s piece, and the others. (See also this pre-election analysis at the Ground Truth Project, by Steven Waldman, whose work I have described here.) And while I’m at it, please check out the latest dispatch from John Miller, creator of the film Moundsville, about regional culture gaps. Also this, by Katherine Bindley in The Wall Street Journal, about big-city tech-industry people who have considered entirely different careers, in entirely different parts of the country, because of the pandemic.

Important transformation work is underway at the national level, as I’ll discuss in an upcoming print-magazine article. But that would be doomed, or at least limited, without comparably intense efforts to improve local-level prospects. These ideas are a start.