Civil and Human Rights · History · Science

Significant changes in NZ birth statistics

New Zealand’s birth rate has fallen well below replacement level. The average number of births per woman dropped to 1.6 in 2021, the lowest ever recorded, compared with the 1961 peak of 4.31 (a year in the final quarter of the post-war Baby Boom, and the year the Pill became available in NZ).

A total of 57,105 live births were registered in the year to 31 March 2021, a significant fall from the 21st-century peak of 64,341 births in 2008 and the all-time peak of 65,391 in 1962 (when the population was much lower: 2.42 million, compared with 5.12 million now).
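To put those raw totals in per-capita terms, here is a purely illustrative back-of-the-envelope calculation of the crude birth rate (births per 1,000 people of all ages), using only the figures quoted above:

```python
# Crude birth rate: births per 1,000 people, from the figures quoted above.
def births_per_1000(births, population):
    return 1000 * births / population

# 1962: 65,391 births in a population of 2.42 million
rate_1962 = births_per_1000(65_391, 2_420_000)
# Year to 31 March 2021: 57,105 births in a population of 5.12 million
rate_2021 = births_per_1000(57_105, 5_120_000)

print(f"1962: {rate_1962:.1f} births per 1,000 people")  # ≈ 27.0
print(f"2021: {rate_2021:.1f} births per 1,000 people")  # ≈ 11.2
```

On this crude measure the per-capita birth rate has more than halved since 1962, an even starker fall than the raw totals suggest.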

Despite continued growth in average life expectancy (around 82 years now, compared with around 63 years a century ago), the gap between births and deaths (the “natural increase” in population) is steadily narrowing, meaning the population can keep growing only with continued immigration.

Another major change is a sharp fall in the number of teenagers giving birth and a corresponding rise in births to women over 40. In 1980, the teenage birth rate was 38.2 per 1,000 teen females; it is now 9.8 per 1,000. By comparison, the birth rate for women aged 40 to 44 is now 13.42 per 1,000.

The number of new mothers in their 40s now exceeds those in their teens for the first time. Many of the teen pregnancies that do occur now appear to be planned, as do pregnancies in women over 40. The growth in the latter appears driven by women choosing to start families later in life, and coincides with women becoming a majority of those in tertiary education and with many more women pursuing careers.

The median age of women giving birth has risen from 25.7 years in 1980 to 30.8 in 2020. For Māori women it is 27.3 years, for Pasifika women 27.7, for Pākehā women 31, and for “Asian” women 32.1.

These NZ trends are reflected around much of the world and may be linked to steady growth in the past 50 years in the proportion of women in higher education and the growth of the middle classes. Countries including Japan and many in Europe have sharply falling birth rates. So do formerly high-birth-rate countries such as India and especially China, which abandoned its one-child policy in 2015 out of concern that it would have too few young people to support a growing elderly population; the change has had no upward effect on birth rates.

World population is expected to peak at 9.7 billion in 2064 and decline to 8.8 billion by the end of the century.

All this is good news – for the status and position of women in societies, and for the reduction in the pressures on resources caused by the population growth of the past two centuries. That growth was caused by the very welcome advance of modern science, with its resulting massive increases in health and life expectancy. Demand for resources will fall from mid-century, for the rest of this century and well beyond. Thus will modern science and education also solve the climate question that so alarms many people and that the media unthinkingly portray as the next Armageddon.

Statistics NZ data retrieved 9 August 2021
Stuff article 9 August 2021: Changing age of motherhood
The Lancet, 17 October 2020: Population scenarios for 195 countries

Civil and Human Rights · News media · Public affairs

New ABC survey an insight into Twitter’s Thought Police and cancel activists

In the Roman Colosseum of old, when a gladiator fell, the watching mob often decided whether he died or lived. If the mob raised its thumbs up, he lived. Thumbs down meant he died. Twitter is the new Colosseum and its inhabitants are the new mob, deciding what opinions, facts and beliefs can be expressed, and what cannot.

As the range of opinions people are “allowed” to express in public (and increasingly in private) has narrowed rapidly in the past few years, Twitter has become the de facto arbiter of “acceptable” opinions. Those whose thumbs work the phones connected to it daily and world-wide unleash barrages of tweets demanding “wrong” views be taken down and that those who transgress issue grovelling apologies, be sacked from their (often unconnected) jobs, or both.

This mob of the righteous great and good on Twitter is demonstrably a small minority. Some 350 million people have Twitter accounts. If that sounds a lot, consider that 2.85 billion people have Facebook accounts. And even though Facebookers far outnumber the denizens of Twitter, the global population is 7.7 billion, so most people in the world are on neither of these platforms that dominate so much of public discourse (at least in the countries where they are not banned, principally China).
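The share sums are easy to check; a purely illustrative sketch using the figures above:

```python
# Back-of-the-envelope shares of world population, using the figures above.
world = 7.7e9       # global population
twitter = 350e6     # Twitter accounts
facebook = 2.85e9   # Facebook accounts

print(f"Twitter:  {twitter / world:.1%} of the world")   # ≈ 4.5%
print(f"Facebook: {facebook / world:.1%} of the world")  # ≈ 37.0%
```

Even on its own generous numbers, Twitter accounts for under five per cent of humanity.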

So who are these 350 million people who think they alone can decide whose views will be heard and whose will be cancelled in the “public square” which today is dominated by “social media” at the expense of the fast-shrinking old media of newspapers, radio and television, much less the “town hall” and “speaker’s corner” public meetings of not that long ago?

Clearly many tweeters are journalists. One hardly encounters a journalist now without a Twitter account. Check a few out. They are all there, tweeting at each other in an echo chamber that rebounds with what they put into the public square in their day jobs. Many tweeters are clearly celebrities. Many are academics. Many politicians are there; according to Wikipedia (which appears to be dominated by the same people who dominate Twitter), Barack Obama has the most Twitter followers of all (more than a third of everyone on Twitter!), despite relinquishing office in January 2017. But you have to be a politician with the “right” credentials. You won’t find Donald Trump on Twitter. He is banned. He’s also banned from Facebook, at the baying of the Twitter mob, which is slightly odd, because Twitter mobs hold their noses at Facebook, whose huddled masses they openly despise. You’d think they’d allow Trump there.

But put aside hearsay and observed opinion. This week the ABC in Australia gave us a valuable statistical view of the kind of people who dominate Twitter. The ABC runs an annual online survey of public opinion it calls Australia Talks. Starting in 2019, the 60,000 people claimed to be in its panel have been asked questions about the issues that motivate the ABC, such as climate change, gender, discrimination, inequality, national identity, politics and social media. The answers they give also tend to reflect the ABC’s world-view (which in turn reflects the views of its journalists on Twitter), but that’s just my opinion. You can judge for yourself.

In a survey of social media published this week as part of the 2021 Australia Talks results, the ABC revealed that Twitter is, as one would expect, used by a small minority of Australians (14 pc; incidentally more than I’d have guessed, given Twitter’s international total, but then the Australia Talks panel will be skewed towards ABC followers, many of whom would be tweeters).

Many of them are high-income folk, earning more than $A2500 weekly or $A130,000 a year. Of the Twitter users surveyed, more of them are Green voters than supporters of any other party (47pc of Greens use Twitter, compared with 38pc of Labor voters and 26pc of Scott Morrison’s supporters).

The survey also says 64pc of Australian tweeters have reported “offensive comment” and 74pc have boycotted a corporation because of “misbehaviour or offensive messaging.” That certainly gives weight to the view that Twitter sits like a Colosseum mob, thundering who will be heard, who will be cancelled, and who will lose their job. It seems simply extraordinary that so many of this small group of affluent inner-urban dwellers not only spend massive amounts of their time on Twitter, but also find so much that offends them, and then report and boycott the transgressors.

While Australia Talks only covers Australia, I suspect from my Kiwi observations that a similar survey in Aotearoa New Zealand would produce similar results, as would surveys in the UK and probably many other countries where the public square is dominated by the Twitter mob.

Those who dwell on Twitter truly are the Thought Police. Offend them at your peril.

Civil and Human Rights · History · Obituaries · Public affairs

Long live the memory of Linda Brown, almost-forgotten kick-starter of the American civil rights movement

Linda Brown at Sumner Elementary

Schoolgirl Linda Brown walks on by Sumner Elementary in Topeka Kansas

Linda Brown died on Sunday in Topeka, Kansas, aged 76, all but forgotten in her own land, let alone in far-away places like New Zealand. But before Rosa Parks, before Martin Luther King, before even Nelson Mandela became household names, Linda Brown was at the very start of the fight to bring down the apartheid-like system of “Segregation” enacted in America’s South to stop the black population enjoying their freedom from slavery.

In 1951, Brown, then aged nine, was a third grade school pupil in Topeka, her home town. Like all her black schoolfriends, she was required by law to attend Monroe Elementary, a “segregated” black school three kilometres from her home, rather than the nearby Sumner Elementary, which was only for white children.

Her father, Oliver Brown, a welder and pastor, tried to enrol her in Sumner Elementary and when she was refused, he challenged the refusal in court. Her case got all the way to the United States Supreme Court, which ruled, in 1954, that the refusal was unconstitutional. The case, which opened my eyes when I learnt about it at university in the 1980s, was called, in legalese, Oliver L Brown et al v Board of Education of Topeka, Shawnee County, Kansas, et al, or, for short, Brown v Board of Education.

It led to the desegregation of America’s school system, allowing non-white children to attend their local formerly whites-only schools, which always offered better facilities and better schooling than the poorly resourced black schools. It was the first big victory against the Jim Crow laws that had kept black Americans marginalised since the Civil War ended slavery in 1865. Further court cases followed; followed, eventually, by Congress passing the Civil Rights Act in 1964 to legislate equal rights for all, then the Voting Rights Act of 1965 to ensure black citizens could vote in those states that even then still tried to stop them from voting (as an aside, it wasn’t until another Supreme Court case in 1967 that black Americans were allowed to marry white Americans).

Oliver Brown’s court case on behalf of his daughter was taken up by the National Association for the Advancement of Colored People, a famous civil rights group founded in 1909 to advance the rights of America’s oppressed blacks, whose status—especially in the southern former slave states—had barely improved since Abolition.

“My father was like a lot of other black parents here in Topeka at that time,” Linda Brown said in a 1985 interview. “They were concerned not about the quality of education that their children were receiving, they were concerned about the… distance that the child had to go to receive an education.”

Brown and her sisters had to walk across railway tracks and a busy road to catch the bus to Monroe Elementary, and so it was this distance, more than a disparity in quality between Monroe and Sumner, that initially motivated her father.

“He felt that it was wrong for black people to have to accept second-class citizenship, and that meant being segregated in their schools, when in fact, there were schools right in their neighbourhoods that they could attend, and they had to go clear across town to attend an all-black school. And this is one of the reasons that he became involved in this suit, because he felt that it was wrong for his child to have to go so far a distance.”

The Fourteenth Amendment to the US Constitution—enacted in 1868 to grant the freed slaves the same rights as white Americans—declared that “no State shall… deny to any person… the equal protection of the laws.” But the Jim Crow segregation laws passed in many southern states after Abolition had been upheld as legal until Brown v Board of Education, particularly by an 1896 Supreme Court decision, Plessy v Ferguson, which ruled that separate facilities for different races were allowed under the Constitution as long as those separate facilities were equal. Which of course they were not.

The NAACP lawyers combined 13 cases of refusals to enrol black children in white Topeka schools into a single court case for legal emphasis and convenience. They put Oliver Brown’s name at its head so as to have a man as the main applicant, something seen as being important to the white male judges of the day. The twelve other applicants were mothers of refused children.

The case was initially heard by the United States District Court for the District of Kansas. The three judges rejected it, upheld Plessy v Ferguson and declared that, though segregation in public education had a detrimental effect on “negro children”, “negro and white schools” in Topeka had “substantially equal” buildings, transportation, curriculums and teacher qualifications.

This ruling was not a defeat but the outcome the NAACP hoped for, as the organisation wanted to get the case before the US Supreme Court, which has the final say about whether a law is legal under the Constitution. By 1954, when Brown v Board of Education was decided after three years in the judicial system, the nine-member Supreme Court had a liberal majority under Chief Justice Earl Warren, who had been appointed the previous year. Brown was to be the first of many civil rights cases heard by the Warren Court to produce landmark judgements that changed American society forever.

The NAACP’s chief counsel was Thurgood Marshall, who in 1967 was to be appointed to the Supreme Court as its first black justice. Marshall combined Brown with similar school cases from other states, making it the lead one because of the earlier Kansas ruling that the separate black and white schools in Topeka were “substantially equal,” something he wanted to decisively challenge. That the winds of change were beginning to blow was exemplified by supporting briefs from US Attorney General James P McGranery and Secretary of State Dean Acheson arguing that America’s racial discrimination (hitherto upheld by the courts on behalf of the states) was hurting America’s image and interests abroad as the Cold War with Soviet Russia intensified.

The Supreme Court heard the case in 1953. Documents revealed since show that then-Chief Justice Fred M Vinson was opposed to overturning Plessy v Ferguson, but some senior justices stalled for time. They could not reach a decision and the court did not issue a judgement. Then Vinson died, Earl Warren was appointed, and the case was heard again with additional arguments. Much internal debate took place between the justices. Warren wanted to issue a nine-nil unanimous declaration that segregation was unconstitutional and thus illegal, but some of the justices held out. Eventually all agreed with Warren, who read his earthquake of a judgement on 17 May 1954. Its conclusion stated:

“Segregation of white and colored children in public schools has a detrimental effect upon the colored children. The effect is greater when it has the sanction of the law, for the policy of separating the races is usually interpreted as denoting the inferiority of the negro group. A sense of inferiority affects the motivation of a child to learn. Segregation with the sanction of law, therefore, has a tendency to [retard] the educational and mental development of negro children and to deprive them of some of the benefits they would receive in a racial[ly] integrated school system.

“We conclude that, in the field of public education, the doctrine of ‘separate but equal’ has no place. Separate educational facilities are inherently unequal. Therefore, we hold that the plaintiffs and others similarly situated for whom the actions have been brought are, by reason of the segregation complained of, deprived of the equal protection of the laws guaranteed by the Fourteenth Amendment.”

This historic judgement was, of course, only the beginning of the end of segregation. Many states, including the supposedly “liberal” New York, had de facto school segregation. Some school boards with statutory segregation prevaricated, leading to the so-called Brown II of 1955, in which the Supreme Court ordered desegregation to proceed “with all deliberate speed.” The bright yellow school buses of today’s America came about as a result of court-ordered “busing” of children between neighbourhoods to attend the same schools as children from richer or poorer neighbourhoods.

And beyond Brown, many hard battles lay ahead for black Americans. Far more than schools were segregated; so were trains and buses, parks and libraries, even restaurants, cafes and public toilets. They still faced the fight for the right to vote, let alone being treated as equal citizens by the majority; a battle still to be fully won, something demonstrated by the Black Lives Matter movement.

By 1954, Linda Brown had passed the age of being able to attend Monroe Elementary. Ironically, Topeka’s junior high schools had been desegregated without any court action since 1941 and Topeka High had not been segregated since its opening in 1871. The Kansas Jim Crow laws only permitted segregation “below the high school level”.

Linda Brown and friends a decade on

Linda Brown is on the left in this photo of her and three other plaintiffs in her case, taken in 1964 on the 10th anniversary of Brown v Board of Education.

Other black Americans from the Civil Rights era became far better known than Linda Brown. The year after Brown v Board of Education, Rosa Parks refused to stand up for a white passenger on a bus in Montgomery Alabama and was arrested. Martin Luther King took up her cause and the Civil Rights Movement was on the road, shattering white America’s post-war calm and featuring such monumental events as the 1963 March on Washington and King’s “I have a dream” speech, one of the most powerful orations in the English language.

But I hope it is never forgotten that Linda Brown, a little girl from Topeka Kansas, began it all. I find it sad that, unlike civil rights peers such as Rosa Parks, she does not even have her own page in Wikipedia and is mentioned only four times, almost in passing, in the exhaustive Wikipedia article on Brown v Board of Education. Her adult life was rarely in the public eye. She featured in news items on the 10th anniversary of her case, and in a 1985 television documentary. She was a talented pianist, I learned from the little published information about her life. She taught children to play piano at the Topeka church she attended. There has been no movie about her, no best-selling book, no television series. Perhaps with her death, her importance to the shaping of modern America — and because of that, the world — will be given the recognition it deserves.

News media · Science

¡Hola! Buenos días del Ciclón Hola! Or maybe not.

In recent years, the media have gleefully embraced tropical cyclone categories, telling us, for example, that Cyclone Gita, which reached Tonga on February 12, was “Category 4,” the same intensity claimed for the meandering Cyclone Hola, which the media tell us will wreck New Zealand’s Northland, Coromandel and East Cape today.

Further, we’re often told, many of these storms are almost as big as the Category 4 Hurricane Harvey, which hit Texas and Louisiana last August (cyclone, hurricane and typhoon are simply different names for the same kind of tropical storm, depending on where it occurs).

Until relatively recently, they were simply cyclones, when they were cyclones. Or just storms, when they were just storms.

It’s time to hang on a minute. For a start, Hola is so weak it’s no longer a cyclone by any definition, whatever the media are frantically telling us. It is not now even a particularly big storm, as storms go. MetService, the voice of reason, says it will be a “one-day wonder,” but if you find that at all in a news story, it will be buried at the end.

Most importantly, New Zealand forecasters use a completely different category system from the US. What we call Category 1 and 2 cyclones are not even Category 1 in the US; their Category 1 is our Category 3, their Category 2 is our Category 4, and so on. The only similarity is that 5 is the top category on both scales, and we reach 5 well before the American scale does.

We use Australia’s Bureau of Meteorology scale, which (to the media at least) makes our cyclones appear bigger than a storm given the same-digit category on the American Saffir-Simpson scale. Look at the chart below this article, or read more in the Bureau of Meteorology Tropical Cyclone FAQ.
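The rough correspondence described above can be sketched as a simple lookup. Note this is only indicative: the two scales define and average wind speeds differently, so the mapping is approximate (see the BoM FAQ for the actual thresholds):

```python
# Rough correspondence between the Australian (BoM) cyclone scale and the
# US Saffir-Simpson scale, as described above. Indicative only: the scales
# use different wind-speed definitions and averaging periods.
AU_TO_US = {
    1: 0,  # below Saffir-Simpson Category 1 (a tropical storm in US terms)
    2: 0,  # likewise below US Category 1
    3: 1,  # an Australian Category 3 is roughly a US Category 1
    4: 2,  # an Australian Category 4 is roughly a US Category 2
    5: 3,  # and upwards: both scales top out at 5
}

def us_equivalent(au_category):
    """Approximate Saffir-Simpson category (0 = below Category 1)."""
    return AU_TO_US[au_category]

print(us_equivalent(4))  # prints 2: our Category 4 is their Category 2
```

So when the media report a “Category 4” cyclone near New Zealand, the equivalent American label would be around Category 2.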

And stop worrying. Such storms are not becoming more frequent, anywhere. It is just the media coverage of them becoming noisier and more doom-laden. Harvey was the first major hurricane to make landfall in the US since Katrina and Wilma in 2005, a gap of almost 12 years, but you would never have learned that from the panic-driven media.

New Zealand has had just two real cyclones in the past half century: Bola in 1988 and Giselle in 1968. Bola mainly affected East Cape, north of Gisborne. Giselle was by far the biggest storm experienced in New Zealand’s continuously recorded history, which goes back to a little before 1800. Giselle did enormous damage along most of the length of the country and sank the inter-island ferry TEV Wahine with the loss of 52 lives. But the way every tiny storm gets reported now, you’d think they’ve never been worse and that this week’s quite normal storm is always worse than last week’s. And that is nonsense.

cyclone category scale

Chart: Bureau of Meteorology, Australia.

Language · News media

The Language Always Changes Department: #1754


Radio New Zealand reporters have been talking about cracks discovered in “semi-trailer” towing bars this week; cracks that could cause the trailer to come free while the truck is moving. A “semi-trailer” is what Australians call the truck-and-trailer unit New Zealanders have traditionally called an “articulated truck” or “artic”.

Just as some Kiwis decades back called a truck a “lorry” — the British word for a truck; truck came from America — it seems our artics are becoming semis. Just as railway stations are now “train stations” — the American term — not only here but also in Australia and England, where they also used to be called railway stations.

Is this a language rort? A “rort” is a vulgar Australian term for a scam, which crept into New Zealand English in the 1990s when it was used colourfully by former Aussie PM Paul Keating when he unilaterally stopped Air NZ flying domestic routes in Australia. I don’t much like “rort” but it is part of the language now and the language always adopts, adapts and changes.

The one change I won’t accept is the vowel shift in the way many Kiwis — including some RNZ reporters — say “women”. They pronounce it the same way they say “woman” which forces people who still know the difference to guess whether they mean one woman or two or more women. I’ve not heard this vowel shift anywhere else English is widely used, whether in Australia, England, Ireland, the US, Canada, or even Fiji or India for that matter.

Meanwhile, my top photo shows that some truck driver has parked his semi on the grass verge in Paraparaumu. Or is that the berm? In Australia it would be called the nature strip. Whatever, it’s a fair dinkum parking rort. And below, the engraved name above the entrance to what is now called the Wellington train station.

History · Science

A history of time, space and the wonders of the universe, in 800 words

Expansion and history of the universe

We live at a time when we know many of the secrets and history of the universe, and even where time and the universe are heading. Yet until only a few centuries ago, people looked into the night sky and — if they thought about it at all — had no real idea of what they were seeing, only myths and guesses.

The people of 1618 could look up and see the stars and planets (the latter looked like stars) but (the very few with the just-invented telescope aside) had no idea what they were, beyond being tiny dots of light in the blackness. They could see the moon, often even in daylight, but did not know what it was or what it did. During the day, they could see the sun and feel its heat, and while assuming it was a ball of fire, they did not actually know what it was, despite using it for agriculture, light and warmth.

Even a century ago — when, thanks to great developments in telescopes and science since 1618, we knew the Earth was a big globe that circled a star we called the Sun; that the Earth orbited the Sun and was one of a number of such orbiting planets, many of which had moons that circled them like our own moon; and that our Sun (a huge ball of plasma converting hydrogen into helium by nuclear fusion) was one of millions of stars in a galaxy we knew as the Milky Way — we believed our galaxy was the entire universe.

Today, we know the Milky Way contains billions of stars, many of them with their own orbiting planets, and that the universe contains billions of galaxies. Scientists have estimated the age of the universe at some 13.7 billion years and the age of the Earth as some 4.5 billion years, around the time our solar system formed from matter solidifying into lumps around our newly formed Sun.

We humans have walked upon this Earth for only 200,000 years. Civilisation — the grouping of people into villages, towns and cities, using written languages and rules governing how people interact — has existed for only some 5000 years of that speck of time, and in the beginning not everywhere on Earth: chiefly the Middle East, China, the Mediterranean, and likely Zimbabwe and a few parts of what we now call Latin America.

Scientists today believe the universe started with a gigantic cataclysm we now call the Big Bang, a single moment in which time and space came into being. We can posit this—and the age of the universe—from the theories of geniuses like Albert Einstein and Stephen Hawking; by measuring the expansion of the visible galaxies into the void; and by such things as the background radiation believed to be from the Big Bang.

Current physics suggests that all the matter and energy in the entire universe was created in the instant of the Big Bang and was flung into space as plasma that settled, cooled and formed over billions of years into the galaxies, stars and planets, all of it shaped and held together and kept apart by gravity.

Gravity—still regarded as a semi-mysterious force—keeps us standing on the ground rather than floating into space; it keeps the moon orbiting the Earth (while creating the tides as the moon moves above us); the planets circling the sun; and all the stars in the Milky Way and all the other galaxies rotating in huge spirals in space. Spirals that probably have huge black holes in their centres, holes so dense and with gravity so strong that not even light can escape, so we cannot see them.

For some decades, up to the late 1990s, scientists believed the universe would continue expanding, though at a slowly decreasing rate, until gravity forced all the galaxies to fall back towards the centre where it all began, and then everything would collapse in a Big Crunch, leading to a new Big Bang that would start it all again.

But observation of the universe with our latest optical and radio telescopes suggests the expansion of the universe is actually speeding up, not slowing. This acceleration has been hard to explain with relativity theory, so physicists have developed theories about “dark matter” adding to mass and “dark energy” driving the expansion — but so far, we can neither see nor detect this strange matter and energy.

Stars have finite lives and though new ones are still being born, all will eventually die. Our sun will consume all its hydrogen in about five billion more years. If the universe keeps expanding to infinity, all the stars will eventually burn out, and nothing will remain — no light, no life, nothing. A dismal prospect.

And yet against all this wonder of the universe, we humans have become so conceited that many of us think we can wave a piece of paper and make the temperature of the Earth rise by exactly 1.5 °C, as if we were King Canute. Wise Canute, though, knew that he could not stop the tides, and proved it. Ironically, his home was Copenhagen.

  •  Image of the expanding universe courtesy NASA Jet Propulsion Laboratory

News media · Obituaries

Vale Pat Booth, the fearless journalists’ journalist


One of the great New Zealand journalists of my time, Pat Booth, died today. He was 88.

Pat is best known for his work at the Auckland Star investigating the police case against farmer Arthur Thomas, who was framed for murder, but freed, thanks to Pat’s tenacity, after almost a decade in jail.

He was a journalists’ journalist, of the tough, self-taught kind too often looked down on by today’s mass-produced journalism graduates. I had the privilege of working with him at North & South magazine, where he was deputy editor when I started there in 1989.

Of course, I was in awe of working with such a legend. I remember him for his ready smile, his gruff voice and his total lack of political correctness. He would try to make me bite with some terrible lines. Writing about the new airline, Ansett NZ, he mentioned the aircraft’s tail markings and added: “Speaking of tail, the hostesses’ uniforms….” Of our first Maori governor-general, Sir Paul Reeves, he said: “Paul? I knew him before he was a Maori.” It was awful stuff, but said with a big smile and without malice. And whatever he wrote, Robyn printed it unchanged. You can’t edit a legend.

The Auckland Star for years had a motto that was typified by Pat’s work there, but seems quaint if all you know is today’s news media:

For the cause that needs assistance
For the wrong that needs resistance
For the future in the distance
For the good that we can do.

He republished it right at the front of his 1997 memoir, Deadline.

Pat died today in a rest home in West Auckland. What a sad end for someone so full of life, a journalist never afraid to chase a story, no matter where it led.

Picture: Pat in the Auckland Star printery in 1976 with printer Leo Smith. Almost all males wore ties back then!


News media · Public affairs · Reviews

Golden age of newspapers recalled in The Post, a film that could not be set today

Every journalist brought up admiring Woodward and Bernstein will be seeing The Post, though of course, this film is set (just) before Watergate, and features Washington Post publisher Katherine Graham (Meryl Streep) and editor Ben Bradlee (Tom Hanks) rather than America’s most famous reporting duo.

The film is about an important footnote to America’s war in Vietnam, the 1971 newspaper scoop-publication of the so-called Pentagon Papers, a secret Defense Department true history of successive American government machinations in that hopeless war. The documents were leaked first to the New York Times, and then the Washington Post, by military analyst Daniel Ellsberg, whose psychiatrist’s office was later infamously burgled by Richard Nixon’s “plumbers” (the leak-fixers behind the 1972 burglary of the Democratic National Committee offices in Washington DC’s Watergate complex).

Hanks and Streep play Bradlee and Graham much as I remember them from books and their reputations at the time: the stubborn newspaper editor wanting to bring his “small-town” paper to national prominence, and the proud establishment proprietor of a family firm. Their close working relationship, and the challenge that taking on the power of the state poses for both the company’s finances and its freedom to publish, is the heart of this film. The film also references Bradlee’s close relationship with John F Kennedy (they met every week till his death) and Lyndon Johnson, which had given the Post the whiff of being a Democratic Party mouthpiece, accusations still thrown by conservatives against America’s traditional liberal newspapers.

But what leapt from the screen for me was the film’s stunningly accurate recreation of the newspaper world I began working in in the late-1970s and which is now long gone; the clattering typewriters, the rows of chain-smoking reporters yelling into telephones at their paper-piled desks under fluorescent lights, the Lamson (pneumatic) tubes that whooshed canisters holding the paper pages of typewritten stories from the news desk to the printers down below; and above all, the clanking ancient Linotype machines that cast newspaper stories (yes, type-cast!) line by line in hot lead for fitting in the big metal page frames from which printing plates were made. And, oh what an experience, the rumble and shaking of the whole building as the huge presses built up speed to thunder out tens of thousands of thick, inky newspapers an hour.

Picture: Tom Hanks as Ben Bradlee, posed beside a Linotype machine.

I don’t pine for those days; today’s technology is superior, cheaper and produces a better physical product, not to mention online access. But The Post reinforced for me that the film is set in what really were the golden years of newspapers: years when newspapers routinely broke very big news stories, when most households in most Western countries had the paper delivered not just daily, but every morning and afternoon. An era when politicians respected, and sometimes feared, the power of the press rather than manipulated it with photo-ops and sound bites. I became a journalist near that era’s end, but at least I was part of it before the decline.

Nixon’s White House took the New York Times and the Washington Post to the Supreme Court to try to stop publication of the Pentagon Papers. The court ruled six to three that the First Amendment (“Congress shall make no law… abridging the freedom of speech, or of the press”) trumped the desire of a government to keep its deepest secrets secret.

Supreme Court Justice Hugo Black famously wrote in his judgement on the case: “In the First Amendment the Founding Fathers gave the free press the protection it must have to fulfill its essential role in our democracy. The press was to serve the governed, not the governors.”

It all seems so quaint in this new age of a news media world-wide mostly obsessed with clickbait, celebrities and bread and circuses.


Enjoy your holiday : 2017 flying to be second-safest year for air travel

Barring a big plane crash in the next week or so — highly unlikely — 2017 will edge last year out as the second-safest year in civil aviation history. This is a piece of very good news you won’t read anywhere else, because it’s not bad, disastrous or puerile, and features no celebs.

With just 11 days left in the year, there has not been one major crash of a large scheduled passenger aircraft this year, anywhere on the planet. As of today, just 301 people have died in air crashes in 2017 (less than New Zealand’s road toll!), 155 of them in eight military crashes and 39 in a cargo plane crash.

The safest year for aviation so far was 2013, with 273 fatalities in 29 crashes, followed by last year, with 316 deaths in 19 crashes, measured by the Aviation Safety Network as involving planes with more than 12 seats. The worst year was 1972, with 2370 deaths, but far fewer people flew then.

None of the 37 fatal crashes so far this year involved a large jet, except for the cargo plane (a 747 freighter, in Kyrgyzstan; 35 of the 39 deaths were people on the ground when the plane hit). All the rest bar three small Lear jets were turbo-props. The cargo plane apart, not one Boeing or Airbus jet has been in a fatal crash this year. Not counting the cargo and military deaths, just 107 people have died in civilian passenger aircraft.

This is despite more people flying in more planes every year: Commercial aircraft will carry about 4 billion passengers this year — more than half the world’s population. Only 10 years ago, the number was 2.5 billion. About 14,000 commercial planes are in the air at any one moment. They don’t crash into each other, or the ground, or mountains, or anything else because of the progress in navigation aids and onboard collision warning systems in the past three decades.

When you board your plane this week for your holiday, fly assured you will get safely to your destination, and celebrate — joyously — the stunning technology we humans have created in only the past century of the 200,000 years that our branch of the great ape family has walked upon this Earth.

Picture: A Singapore Airlines Boeing 777-200 arriving from Canberra at Wellington Airport.

Public affairs · Television

Borgen: The greatest political drama you’ll never see on TVNZ

What a find! The Danish political drama Borgen is quality television of the kind New Zealand no longer enjoys. It follows the rise and ultimate fall of (fictional) minor-party politician Birgitte Nyborg, who becomes prime minister (statsminister) of Denmark through the machinations of that country’s proportional representation electoral system.
The scripting, acting and photography are simply superb, as is the opening sequence (attached). Nyborg (the wonderful Sidse Babett Knudsen) has to juggle the intense politicking of being prime minister with raising her two children and the collapse of her marriage because of the pressures of office. Other major characters include television journalist Katrine Fønsmark (Birgitte Hjort Sørensen); Nyborg’s spin doctor (yes, “spin doctor” is Danish for “spin doctor”!), Kasper Juul (Pilou Asbæk; he is also in Game of Thrones); and her husband, Phillip Christensen (Mikael Birkkjær), whose scene asking his wife for a divorce is so traumatic that Birkkjær in real life cried after filming it.
Denmark (population 5.7 million) is, like New Zealand (population 4.8 million), a small, vigorous democracy, though Denmark is much more affluent (“We are the 12th-richest country on Earth,” Nyborg says when trying to justify improved public hospital care).
Like us, Denmark is a constitutional monarchy; its parliament was established in 1849, ours in 1852. Both are unicameral (single-chamber) parliaments; we abolished our upper house in 1951, Denmark its in 1953. Its electoral system is a proportional party list with a 2pc threshold, unlike our system with geographical electorates as well as a party list. Ours is the same as Germany’s Mixed Member Proportional system — we got ours from Germany and we both have a 5pc threshold. But Danish political parties, like those here and in Germany, have to thrash out coalition agreements after an election, a drama that features prominently in Borgen. Unlike New Zealand, Denmark in real life (and in Borgen) has no Winston Peters figure who has been around forever, deciding who gets to be prime minister.
“Borgen” is Danish for “castle” or “fort” and refers not only to the Christiansborg Palace (the København building housing Denmark’s parliament, government offices, supreme court and the queen’s residence), but is also the colloquial Danish word for the government, known as “Borgen” in much the way we refer to the Beehive. The word stems from the proto-Germanic “burgz” (a walled town), which is the origin of the English borough, the Scots Edinburgh and the German Burg, as in Bürgermeister (mayor) and Freiburg (the city in Germany).
The series is in fast-paced Danish with English subtitles, which oddly are in American English. Danish is, like English, a Germanic language, but closer to Swedish and Norwegian than to German, Dutch and English. However, if you understand German, then after a few episodes you start to pick out some of the more curious translations. And you also quickly pick out such nice Danish colloquialisms as “hej hej” (it sounds like “hi hi”) for “goodbye”!
I found the DVD box set on Trade Me.