17. Jenkins, Short History of London (3 September 2025)

The City dominated medieval and early modern London, at Westminster’s expense. Thereafter the British capital has passed through phases of growth and decline without settling on effective local governance or planning. She has been profligate with land, unable to strike a balance between the market and the needs of poorer residents or transportation. Consequently London’s physical development (i.e., its geography, in contradistinction to architecture or socioeconomic character) has sacrificed neighborhood continuity and civic historicity.

The medieval decision to situate London’s power outside the district of St. Paul’s shaped the capital’s early history. Following the Black Death of 1348, the City regularized government in 25 aldermen from 12 guilds, bolstered by 100 representatives from 25 wards. Most of the former were lifetime positions, though because of the guilds’ basis in trade, especially maritime trade, the oligarchy was regularly and naturally renewed by changing fortunes. Well-to-do medievals could escape the city’s walls for the countryside: commercial power was not co-resident with political heft.

In seizing the monasteries Henry VIII added to his hunting estates and consequently such parks as St. James, Hyde, Kensington Gardens, and Regent’s. Richard Gresham and his son Thomas, Henry’s bankers, saw to the City’s admitting Flemish refugees from the Spanish Netherlands, contributing to London’s surpassing Antwerp as the commercial hub of northern Europe. By 1600, half of the wealthiest were ‘suburban’, contributing to a rising gentry, and so Elizabeth’s government restricted building beyond three miles – creating a proto-greenbelt.

During the Civil War, the City favored the Puritans before taking a compromise position at decade’s end. It had financed Cromwell and overawed Westminster, reaching the apex of its power. London’s classes began intermixing during the Restoration, as the building of aristocratic squares (e.g., St James, Grosvenor) required servant quarters nearby. Later these became slums, then gentrified townhouses. Outside the City, late 17th-century London was inadequately governed by parish vestries (Middlesex, Essex), lacking the guilds’ organization and self-policing. Thus Sacheverell’s 1709 attacks on Dissenters provoked discord, leading to the passing of the Riot Act in 1714.

By the Hanoverian era, Westminster was no longer a suburb but truly a second city, built around St James and Mayfair mansions. Johnson, of Fleet St, personified London, observing that he who tires of it ‘is tired of life, for there is in London all that life can afford. You will find no man that is at all intellectual who is willing to leave’. Campaigns by Henry Fielding and William Hogarth against cheap gin led to heavy taxes, for alcoholism and infant mortality soared over 1720-50, the only period hitherto in which the population had stagnated; thereafter the populace switched to beer.

(In 1709, Cheyne Row was built in then-distant Chelsea, which shortly became one of the few estates to change hands, sold to Hans Sloane, founder of the British Museum. In 1753 it was divided between his daughters, one of whom married the Welsh Lord Cadogan, who developed in Knightsbridge and in the 1880s was responsible for heavily rebuilding the north section in the redbrick, neo-Dutch style. His estate remains active; fortunately, the white-stuccoed terraces toward the Army hospital survived.)

The regulation of construction, which began after the Great Fire of 1666, accelerated with 1774’s Building Act, banning exposed woodwork and prescribing uniform streets. The century’s latter half also saw renewed estate building. The Napoleonic wars stimulated development, but the broad trend over 1720-1800 had been imperial trade moving to Bristol, Liverpool, and Glasgow as London’s docks grew outmoded. Hence new dock construction commenced in the Isle of Dogs and Wapping: the East End became England’s greatest working-class city – almost unknown to the rest of the town.

Victorian reform (sanitation) and improvement – rapid, ‘brutal’ railway construction – came hand in hand, both being piecemeal. The Metropolitan Board of Works originated in 1855 after poor water was identified as the cause of ill health; it would be replaced by the more expansive London County Council (LCC) in 1888. Other British town councils were already in place in the 1830s. Yet Salisbury’s Tories opposed a unitary capital government as too big and powerful, favoring the unreformed City, Poor Law administrators, and vestries as bulwarks. Fabian capture of the bureaucracy would soon prove his point. Notwithstanding serious problems in 1832, ‘48, ‘67, and ’88, threats to public order were minimal because tradesmen were separated into ‘little islands’, making citywide agitation too difficult. There remained land for expansion to accommodate the growing population, Jenkins notes, citing historian Roy Porter.

Shipbuilding virtually disappeared in the 1860s. Services now predominated, accounting for 60 percent of jobs, notably in finance, law, and public administration. One-third of these were filled by women. By 1900 individual buildings had surpassed the street or square as the cityscape’s best-known features. Urban sprawl accelerated with the coming of cars, ownership rising to 1 million after World War I. Separately, London’s deep clay would prove ideal for the tube (whose iconic map was modeled on an electrical circuit board).

Planning came into vogue in the interwar era. The Blitz damaged London far less than Allied bombing did Germany, the dead numbering 30,000 versus 500,000, but postwar Labour efforts were more political than practical, density remaining fairly low. Clearance of traditional communities led to unwonted gentrification, and its evident failure to alleviate the housing shortage led to laissez-faire development in the 1950s and early ‘60s. (South Kensington remains one of the town’s densest districts.) Patrick Abercrombie, author of 1944’s Greater London Plan, builder of the Barbican, and all-round promoter of brutalism, is the villain. In the 1970s, arts and education bodies too destroyed heritage sites in the spirit of progress. Jenkins chides politicians for ‘taking orders’ from architects, planners, and quangos lacking in civic spirit. From 1950-70 the population fell by 9 percent, 17 percent in the inner districts. However, the smog of 1952 (the successor of the Great Stink of 1858) was addressed in a private bill of ’56, resulting in the banning of coal heating by 1968.

London’s rebirth commenced in 1980 with Thatcherite policy and the City’s Big Bang, Nigel Lawson’s disbanding of monopoly practices, later extended by Blair. The working class moved rightward as enrichment spread. London would become an international city – by 2001 nearly 40 percent were foreign born, and 55 percent considered themselves ‘non-white British’. The populace, which had fallen to 6.6 million in 1985, rebounded to 9 million in 2019; however, the West End became the province of empty second homes (especially along Chelsea’s King’s Road or in Kensington’s Phillimore estate), the combined borough’s population falling at the 2011 census. Ken Livingstone and Boris Johnson are responsible for permitting London’s unruly towers.

Surprisingly, Jenkins makes no comparisons with other great, imperial cities. He writes not in the interests of policy; his efforts in that direction, such as recalling his native Camden, are incomplete or unconvincing; yet he is a kindly and fond narrator. His view that cities belong to residents as much as developers is resonant.

12. Petraeus and Roberts, Conflict (29 June 2025)

Studies the development of counterinsurgency doctrine in the postwar era, characterized by asymmetric military-political operations. Selected case studies illustrate the thesis that the people are the prize, until in later phases conventional operations or exit become possible.

Between 1943 and 1975, almost all western Europe’s colonies were handed back, in aggregate history’s largest recorded transfer of territory. Malaya, 1952-54, was among the most successful counterinsurgency campaigns. The British protected villages, eschewed torture for productive interrogation, and trained locals in security measures and low-key fighting and maneuvers. Contemporary Borneo evidenced the importance of British offensive action: containment fails if solely passive. In this era, the British generally come off favorably compared to the Americans.
In Vietnam, the US ignored winning the populace, instead seeking to stop a North Korea-style invasion, though the engagement’s different nature was already evident by 1959. Generalship under the Kennedy and Johnson administrations reprised Eisenhower-era tradecraft: Johnson even authorized Westmoreland to send troops against insurgents operating independently of South Vietnamese forces. 95% of combat forces engaged in search-and-destroy missions instead of clear-and-hold; Westmoreland did not retain what had been cleared. (Later the Americans made the mistake of rotating individuals, not units, at the expense of continuity.) The authors conclude, however, that failure in Vietnam bought time for Thailand, Indonesia, Malaysia, and Singapore to stave off Communist expansion. Relatedly, if government is charged with protecting individual rights, then rights must inhere in individuals.
Indochina raised the question of how to separate anti-communism from nationalism. In Algeria, the European population was approximately 10 percent of 10.5 million, a substantially higher proportion, pointing up new questions of self-determination. Torture in that African country was considered antithetical to French values, making the war unpopular in France.
1973’s Yom Kippur war showed deterrence works only when consequences are seen to be overwhelming. (Conventionally, attacking forces ought to outnumber defenders by at least 3:1, concentrating on the most vulnerable points.) Money spent on deterrence is seldom wasted, in comparison with the cost of war.
The Falklands War illustrated the evolution of naval campaigning, that is, the integration of tactical land action. Smallish military operations in Grenada and Panama offered further lessons; absent Grenada, it’s doubtful the US would have improved its joint operations. As it turned out, the USSR saw evidence for conceding.
Vietnam had introduced casualty rates as a focus of domestic criticism; 1993’s debacle in Somalia persuaded the US to stay out of Rwanda. Then the Falklands, Iraq, Serbia, and Gaza brought forth a just war-flavored idea of acceptable enemy casualty rates – at least in Western countries. The UN has usually failed in its original mission of preventing interstate conflict. (It’s better at managing children, refugees, and world health(!).) To what extent are powerful nations responsible for the affairs of failed states even though they have no national interest there?
At the start of the 1990s in Iraq, precision munitions constituted 2% of armament; at the end in the Balkans, the proportion had grown to 90%. Hence the future of the West depends in part on the best fighter planes and pilots. In the 21st century, Western armies have become very dependent on civilian technologies such as robotics, data analytics, and AI, a reversal of the prior century. (AI is tactically brilliant but strategically banal, George Friedman writes.) After a century which favored mobility and agility, drones may reintroduce the advantages of mass and quantity. Not only supply chains but the generalized interdependence of Western countries has created more vulnerability. But the authors denigrate ‘isolationism’ (presumably in contradistinction to freeloading).
Turning to Petraeus’ direct participation, the primary issue confronting Afghanistan’s postwar government in 2001 was incorporating the defeated Taliban. Managing the country was a more difficult matter than Iraq, despite a lower level of social violence, because of its geography: no roads, harsh winters, and the proximity of sanctuary in Pakistan. Karzai would ultimately fail to assemble a catholic loya jirga, despite military and civil assistance outstripping the Marshall Plan. By 2006, the Taliban was again on the offensive. Counterinsurgency tactics must persist in the postwar. Relatedly, there’s nothing wrong with planning to withdraw, but announcing the plan removes every reason to cooperate.
In post-Saddam Iraq, Petraeus’ predecessor Jerry Bremer went too far in replacing the prior regime, notably security personnel, thereby destroying the successor state’s foundation. Maliki ruled in narrow interests, as in Vietnam, leading to civil war. Condoleezza Rice’s clear-and-hold would have performed better than Don Rumsfeld’s soon-as-possible transition. Further, Bremer ruled as a viceroy. Every liberating army has a half-life before it becomes an occupier: stripping back too far reduces both goodwill and time. Petraeus elaborates his technical approach; one wonders what was censored.
In contemporary Ukraine, Putin didn’t see that the advantages of the World War II-style offensive had shifted, in the Internet era, to defensive postures, facilitated by distributed communications and low-tech harassment. Russia’s numerical advantage was too low, especially for urban warfare, and Moscow ineffectively steered tactical adjustments, deployment of reserves, logistics, etc. (The Ukrainians could rely on fixed-base logistics; the Russians could not.) The evidence as of the time of writing is that warfare had returned to the strategy of the hedgehog: when there is little sanctuary, the defense must disperse.
Modern military leadership entails grasping the strategic situation (getting the big picture right), communicating it effectively to troops and civilian leaders alike, and driving execution. Officers recursively refine, adapt, and supplement.

4. Burns, Milton Friedman (21 March 2025)

A sympathetic biography portraying the much-celebrated economist’s journey from Keynesian ‘institutionalism’ to empirical monetarism, with portraits of the leading 20th-century schools of economics; economics is portrayed as the natural foil to law – practicality versus theoretical equity.
In the 1920s, the predominant classical school synthesized price theory and marginal analysis with laissez-faire, characterized by Alfred Marshall and exemplified by Andrew Mellon. Irving Fisher’s quantity theory enlarged the neoclassical view: M (the supply of money) times V (its velocity) equaled P (the level of prices) times T (the volume of transactions). As the depression set in, neoclassical theory lost credibility. The rising Keynes accepted the importance of prices but emphasized the role of timing and the dynamics of supply as seen in the current period (when demand sets price) and the short run (when the management of supply enables the supplier to lift prices). (Then in the long run, efficiencies gear supply to demand, reducing price; secular time accounts for when social changes reconfigure supply and demand altogether.) Meanwhile Chicago’s Frank Knight theorized entrepreneurship, offering the precision of math while seeking pattern and causation, and econometricians went fully into comprehensive theory and planning. Such institutionalism (which also entailed the study of large, established players such as corporations and unions), most popular at Columbia, Wisconsin, Brookings, Amherst, and the National Bureau of Economic Research (NBER), shunned the neoclassical, seeing markets as evidencing degrees of monopoly or, alternatively, inefficiencies.
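In symbols, the equation of exchange reads (standard notation, supplied here rather than quoted from Burns):
\[ MV = PT \]
where M is the money stock, V its velocity of circulation, P the price level, and T the volume of transactions. If V and T are stable over the long run, prices move with the money supply – the kernel of the monetarism Friedman would later revive.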
Friedman, unusually, crossed over to work among institutionalists at Columbia and NBER, before returning to the neoclassical and subsequent price theory at Chicago. By 1937 neoclassicism was committed to Marshall’s ‘freedom of industry and enterprise’. From 1940, Friedman often testified to Congress on behalf of Treasury Secretary Henry Morgenthau’s view of income tax as a fairer means of funding rising government expense; taxpayers would increase from 4 million in 1939 to 43 million in 1945. Notwithstanding Morgenthau’s contribution to the Bretton Woods system, historians often label the 1940s Friedman’s Keynesian phase.
His postwar return to Chicago brought a turn to quantitative testing of theory, seeking not conformity with formal logic but the deduction of yet-unobserved facts, predictions which must be falsifiable and not contradicted by later evidence. Keynesianism was enshrined in the 1946 Employment Act, an econometric effort to maximize employment, production, and purchasing power as represented by IS–LM (investment–saving, liquidity preference–money supply); falling interest rates lift output, goosed by the Keynesian multiplier, whereas demand for money lifts interest. New Deal planning, seen by Knight as social control, had become aggregate (i.e., fiscal) demand management, represented by Paul Samuelson. But Friedman argued statistical criteria were insufficient to judge the model: the test was the real world, not inputs already in the econometric model. In other words, one shouldn’t derive theory after sifting facts but start with theory to be tested by facts.
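A minimal textbook statement of the IS–LM apparatus (standard notation, not drawn from Burns):
\[ \text{IS:}\quad Y = C(Y - T) + I(r) + G \qquad \text{LM:}\quad \frac{M}{P} = L(r, Y) \]
where Y is output, r the interest rate, C consumption, T taxes, I investment, G government spending, and L the demand for money. A fall in r raises I and hence Y, amplified by the multiplier; a rise in Y raises money demand and hence r, matching the summary above.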
In modeling the economy, competition was preferable to Keynesian planning. At Chicago, Friedman sidelined the econometric Cowles Commission, opposing its ideal narratives comprising multi-equation models, rational agents, and perfect information as untestable, instead favoring predictive accuracy. He had discovered Hegel’s notion that a plan must have a unitary conception, a controlling view to be enforced, which led to Friedman’s observation that when frustrated, government turns to coercion. A republic which starts to plan relinquishes its view of people as paramount.
Friedman thought price theory deficient in the instances of natural monopolies (which are beneficial), market monopolies (firms which capture a controlling role in pricing), and certain externalities. Market monopolies divided the neoclassic from the New Deal liberal, hence it was vital to establish their features. He did not see the US on the road to serfdom, for it would have the good sense to shun overbearing government on either side, but did embrace economics’ theoretical role in advancing liberty. Then in the 1950s Hayek reoriented the field toward the interplay of politics and economics, which Friedman later addressed in Capitalism and Freedom. Capitalism was not a self-sufficient ethic; freedom differentiated it from planners and socialists further left; liberty ought to surpass equality, as in Stigler. Hence Friedman shifted his research from industries to markets, replacing firms, workers, and interests with efficiencies (cost-benefit analysis) and prices (not price regulation).
Whereas Washington in the 1930s worried about the instability of capitalism, the capital in the 1950s fretted over the role of the Federal Reserve, prompted by the qualities of fractional-reserve banking and the Fed’s liquidity mistakes during the Depression. If, as in Keynes, V drove the economy, then the Fed alone could remedy slowdowns. But in Marshall’s view, market and/or short-term unpredictability naturally stabilized over longer periods, so money supply was more important. Such was Friedman’s leaning.
Friedman articulated his permanent income hypothesis in 1957’s Theory of the Consumption Function: people make consumption decisions not on their current income but on its expected long-term average, smoothing consumption. This challenged Keynes’ idea of contemporaneous, marginal decision making. Consequently, fiscal stimulus would have less impact because it’s transitory. The idea explained stable postwar consumption regardless of business cycles. Along with the previous year’s Quantity Theory of Money, Consumption completed the first phase of Friedman’s challenge to institutional orthodoxy: seeing off Cowles, rehabilitating Marshall, and now confronting Keynes.
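In its simplest textbook form (notation supplied here, not Burns’s):
\[ c_t = k\, y^{P}_t \]
where consumption c is a stable fraction k of permanent income y^P, the expected long-run average. A transitory rebate barely moves y^P and so barely moves c, which is why the hypothesis predicts weak fiscal stimulus and smooth postwar consumption across business cycles.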
Friedman’s support for large parts of the Keynesian New Deal, namely the multiplier effect, amounted to rejecting Austrian (Hayekian) austerity; government must prop up the economy in crisis. But the economics profession had discarded the specifics and saw only general applicability, and so cast aside monetary policy. Monetarism, however, substituted managing velocity for managing inflation. The goal of full employment, e.g., 4% unemployment, doubled as the inflation target. Capitalism and Freedom (1962) asserted government’s role was to protect liberties including economic choice making. Anna Schwartz deserved more credit for its authorship, especially for its resonance beyond pure economics.
By approximately 1970, Friedman was more a policy wonk than practicing economist. He was thus estranged from Arthur Burns, who did not distinguish between changes in relative price and systemic inflation, which allowed for thinking inflation could be quarantined into sectors and treated with price controls. Burns was an institutionalist, explaining the surprise of his endorsing Nixon.
The rational expectations school sees public anticipation as incorporated in the Keynesian general equilibrium model, as exemplified by the Cowles Commission. Its purpose was to rework Samuelson’s neoclassical synthesis of price theory and Keynesian demand management, applying pricing tools to macro behaviors (e.g., inflation) and incorporating unpredictable movements by including monetary rules, in order to make the model work (while sidestepping political philosophy). In the 1980s, it rose to surpass Keynesianism as the rival to Friedman’s monetarism. Volcker kept Friedman at arm’s length, seeing monetarism as sometimes too rigid for central banking, and not liking the latter’s calling for an end to the Fed’s independence. But he acknowledged inflation was always a monetary phenomenon: for him, monetarism was more art than science, since the velocity of money is erratic in crises.
The economic market is freer, more democratic than the political arena, Friedman thought. But a trip to Chile catalyzed the realization that both were expected of Western democracies. Political freedom was a necessary, long-term condition for economic freedom, in something of a reversal, resonant of Hayek. Meanwhile he saw that socialism had turned to calling for the transfer of wealth instead of ownership of the means of production. At his zenith, Friedman turned to excoriating tax, a kind of Hayekian culmination.
NB: ‘Social control’ was originally a Progressive term meaning planning for efficient use of resources including labor.

20. Clarey, Warrior (16 November 2025)

Portrays Rafael Nadal as the competitor’s paragon, focused on one’s own potential while gracious to rivals and sundry. Nadal blended ambition of overachievement – not so much better than others as better than Rafa – with a common decency rare among champions.

Nadal, semi-ambidextrous, consciously adopted a lefty forehand, which became his signature ‘sobraquero’. The move may have hindered his service game, but was unsolvable until Novak Djokovic began to attack it, forestalling Nadal’s running around his backhand to hit winners and instead forcing him to the sidelines. Carlos Moyà’s training with Nadal in Mallorca, aged 24 and 14, respectively, saved the latter from having to move to Barcelona.

Roland Garros and more generally clay courts were ever his forte. 2010 may have been his finest season. Returning from injury, he claimed three Grand Slam titles; he had withdrawn from the Australian with a knee injury. Nadal missed 15 Slams through injury, Federer 6, Djokovic 1 (through publication in 2024). He finished 14-0 in Paris finals, 8-8 in other Slam finals.
Nadal relished his rivalry with Roger Federer, thinking it made both better; but Federer would have been happy to be alone at the summit. Novak Djokovic would later become his paramount rival because of shared relentlessness; Nadal never beat him on a hard court after 2013.

Nadal’s work ethic wasn’t delayed gratification, it was gratification. He simplified the complex and the tense via routine and habit, as a counterbalance to nervous energy. Some paraphrases: to become better, the past must have a place in the present; I am not injured, but pain-free is a long time ago. On the possibility of retirement: not if I hadn’t pushed myself to the summit of possibility, where I am not capable of being better; but if I have, then it’s justified.

His well-trained body follows the demands of the mind, according to the sculptor of his Stade Roland-Garros statue. The figure was controversial for its unveiling prior to Nadal’s retirement, a view Clarey shares. Yet Nadal was notable for cleaning courts after practice and recognizing tournament personnel. Not to acknowledge people he knew (i.e., Clarey after a press conference) would have been rude in Nadal’s view, making him an outlying superstar.

The development of clay courts in France, Spain, and Italy greatly expanded the scope of tennis, since lawn courts were difficult to maintain in the Mediterranean (and prior to modern hard courts). Clarey frequently sketches France’s early 20th-century mousquetaires, Suzanne Lenglen, and the unique flavor of the Roland Garros tournament (the French Open) as well as French administration.

NB: Goethe – ‘Talent develops in quiet places, character in the full current of life’.

19. Blainey, Tyranny of Distance (15 November 2025)

Australian remoteness from mother England shaped its colonies’ economic and hence national development through World War I, if not the 1960s. All their primary activities depended on transoceanic connections and technologies.

Aside from transportation, Botany Bay afforded England opportunity to participate in Asian piracy and smuggling, while Norfolk Island’s supply of flax was promising for the sail and cord sought by the British navy. For settlers, navigable waterways surpassed arable land in value: Flinders’ locating a way through the Torres Strait unlocked a better route to India and the Cape of Good Hope. Bengali goods often went to Sydney; supply ships later went to China for tea. In 1814 Britain exported £100,000 of cotton to Indian Ocean sites plus Canton; the trade then grew eleven-fold in 7 years. Asian exchange hastened New South Wales’ transition to free settlement. Trade routes also led to military outposts, notably Hobart and Launceston in soon-to-be Van Diemen’s Land (i.e., Tasmania) and Victoria, forming the basis of Britain’s claiming the continent. Interior settlement was secondary.

By the 1830s Australia was more shipping terminus than self-sufficient settlement, links to Europe surpassing those to Asia, the exports comprising wool and whale oil (during its ascendancy in the 1840s). The latter was a free man’s calling, but the trade was cut short by the gold rushes, which, as in San Francisco, lured away crews on arrival in Hobart and Melbourne. (NB: Boston and Salem whalers had no tradition of grog.)

The lack of inland waterways explains the hub-and-spoke tradition later seen also in railway expansion. Wool and wheat were expensive to ship amongst the colonies, for which reason mining was initially ignored. Wool was 10 times more valuable than wheat, immediately worth sending to the UK while fostering centralization in the coastal cities. Sheep runs, stretching from Brisbane to Melbourne, a span equal to Boston to New Orleans, were often a half- or full day’s ride apart, such that few regional towns grew up. Immigrants therefore tended to look for work in the seaports because the inland cost of living was high, due to transport and lack of services for women and children. The towns also manufactured cheap goods.

Gold was the main export of the 1850s-60s, supplanting wool and replaced by it in turn. The ‘circle route’, west around Antarctica, was more important in the 1850s than the Suez Canal in the 1870s (map p. 180). Its debut coincided with fast American clippers, suited for the ‘high latitudes’ notwithstanding the risks of icebergs and uncharted islands. Ideal trips extended from Australia to China to collect tea for return to England. The trade fueled emigration, as Chinese ships were incapable of ocean journeys. [NB: Most Chinese immigrants to America came from the southern tea ports.] Nonetheless, Suez came to replace the circle route as gold and wool warranted the shortcut’s high charges and emerging steamships. By the 1880s steamers dominated coastal Australian shipping but not yet the European routes; mails and gold predominated as the London-Melbourne run halved to 45 days, helped by European trains.

In the 1830s Australian colonies began selling land to immigrants, and by extension consolidating squatters, creating funds to subsidize immigration. Queensland was most assiduous, wishing to shun tropical (Kanaka) labor. By the late 1840s freemen surpassed convicts. The undertaking was necessary to compete with cheaper migration to North America.

The first Geelong railway ran to Melbourne, not the northwest goldfields, unprofitably, and was ultimately sold to the Victorian government. By 1860 most railways (save Melbourne’s city lines) were publicly owned. Unlike the US and Europe, Australia relied on public subsidy to build her rails, the colonies looking to unlock their interiors (but not those of rival colonies). There were three separate gauges, the narrowest being nimblest in the hills. Cargo featured imports sent inland and primary goods for export. Farming came to rely on rails: South Australia’s pair of peninsulas shortened transport of wheat, making it the Aussie granary over the 1850s-70s; the other states of the ‘southeast boomerang’ followed suit; as cost fell, Australia came to supply Europe.

The fastest steamers to Australia in the 1890s were German, a sign of her rising power or of British decline. Ships remained the primary means of alleviating Australia’s isolation as late as the 1960s, when air travel – already domestically important – and telecoms came to the fore. Shipping costs continued declining because of the Suez shortcut (hence Menzies’ interest) and the shift to oil. Cars were domestically important – holidays no longer depended on the rail terminus – and Australia surrendered self-sufficiency in transport fuel in the 1950s. Governments sought to prop up their interests in rail shipping through the 1960s via subsidized rail charges, neglect of road building and repair, fuel rationing, and taxes; such policies were found unconstitutional in 1954.

World War I and the interwar years are treated lightly; World War II, though threatening to isolate Australia, stimulated her economy, which gained in the postwar even as British trade halved. Thus Blainey’s thesis seems to lose steam as the 20th century wears on. The final chapter detours into sociocultural discussion of immigrant waves, seemingly to point out eastern European and then Asian migration was bound to surpass British arrivals. Notwithstanding, a well-sketched view of pre-Federation Australia.

Aussie mateship, a collectivist ideal, held that men should be loyal to the others with whom they lived and worked, rooted in the idea of the country as a man’s land. The tradition ran strongest in the interior, and was taken up by the Labor Party after 1900. Education was seen as spurning mateship, as snobbery not self-improvement.

18. Black, English Nationalism (11 September 2025)

Nationalism, though often seen to have originated in 19th-century transitions from monarchical to republican government and self-determination, dates in England to the 10th century. This older archetype of shared culture, solidarity, and cohesive leadership is far from the racist phenomenon portrayed by contemporary academics.

History, not ethnicity, is the decisive feature of English nationalism. Alfred’s England, comprising Angles, Saxons, Danes, Vikings, and Britons, was based on Christianity and the common law, marked by equality before the law and trial by jury. The latter’s emphasis on content, not prescription, promoted a civic character and continuity, enshrining rights and resolving disputes where no law was established. It was a vital element of government and society in pre-Norman England. During Normanization, the common law gained strength by enhancing public power, that is, the government’s ability to resolve disputes, distinguishing between the crown as benign and rulers as (potentially) bad. The Church, by contrast, though not taken over by the Danes, was Latinized by the invaders; upon the French conquest of Normandy in 1203-04, Anglo-Norman elites were forced to choose sides.

The English vernacular became identified with the country from the 13th century, culminating in literature such as Chaucer’s Canterbury Tales. The end of the Hundred Years War (1337-1453, when all save Calais was recaptured by the French) and Henry VIII’s reformation (1529-36, including sales of monasteries) marked decisive breaks with continental culture. The ‘new nationalism’ of 1453-1603 then added the rise of Parliament and popular acceptance of Protestantism (fostered, for example, by Foxe’s Book of Martyrs). Elizabeth rejected looking into men’s consciences, seeking only outward conformity, such that sects needn’t flee to Europe.

In British England, 1603-1783, the political history of the Stuarts and Hanovers can be seen partly as an attempt to reconcile ethnic diversity via Parliamentary sovereignty. Commencing in earnest with Cromwell’s postwar unification, nationalism was a byproduct, not the aim – still less a racist project. For England’s religious loyalties remained divided, and 1707 created a multi-confessional state, Scotland’s Episcopacy having been replaced by Presbyterianism in 1689. Further, the progressive Whiggish society (frequently in debt to the Scottish Enlightenment) saw itself as premised on political and cultural traits such as Blackstone’s understanding of the Magna Carta, Petition of Right, and common law and Burke’s Glorious Revolution and Declaration of Right – but not ethnicity. Meanwhile, patriotic Anglo-Saxonism counted for little with the residents of Ireland, Scotland, and Wales, who viewed medieval history differently; Scotland retained the Roman law.

In the 18th century, stagecoaches were linking the country as never before, and then came Victorian railroads, both serving to strengthen communal ties. Speaking of Britain often meant England, albeit such works as Churchill’s History of the English-speaking Peoples were nuanced. By World War I, tensions between England and Britain fully emerged. But nationalism was also evident in the preference for the rural over the urban (especially London): manufacturing was only 150 years old, while villages dated to the Anglo-Saxon era. The decline of the British empire too rekindled English ways.

The next point of departure was 1973’s accession to the European Economic Community, though it had been presaged by the turn from the dominions to the multiethnic Commonwealth. The emphasis of England’s later 20th century was not empire but social progress and ‘diversity’. The Church of England’s decline played a substantial role in nationalism falling out of fashion. The common law too suffered, at the hands of governments wanting to rule by statute as well as the invasion of EU Roman law. Regionalism rose in the 2010s, not just because of devolution but also through Cornish and Midlands resentment of London (‘leveling up’). That is, English nationalism confronts not only the Celts but also anti-southern sentiment, which again dispels simplified ethnic explanations. (Here is a reasonable plaint: why should some 120 MPs vote for England when the English cannot vote in Edinburgh, Cardiff, or Belfast?)

English nationalism is not merely a variety of anti-elitism. Black seeks to disentangle it from populism, which owes much to 20th-century Communist ideology (i.e., the supposed good of the people driven in a collective march forward – from the top downward). The contemporary left, focused on race and sex, makes no effort to unify. Englishness entails fair play, friendliness (but not intimacy) and helpfulness, individualism but concern for the less fortunate (though not communitarian), and traditionalism (settled Burkean prejudice), all toward an ideal of social cohesion and patriotism or love of one’s country. These elements, not merely disgust with unmanaged immigration – that is, immigration with no attempt at assimilation – were all evident in England’s leading the vote to exit the European Union.