Madsen Pirie

Labour’s landslide postwar victory

One of the worst disasters in Britain’s peacetime history happened on July 26th, 1945, when a Labour government headed by Clement Attlee swept into office with a landslide majority of 145, ousting wartime leader Winston Churchill. The UK had been on the winning side of the war, but was soon on the losing side of the peace.

It was reckoned that Labour’s promise of nationalization, a National Health Service and a welfare state appealed to voters, especially those who had fought for Britain’s future against Nazism and who wanted a new and better world. Alas, they were in for a cruel disappointment. The monies that could have rebuilt Britain’s war-shattered industry and economy went on transfer payments instead of investment.

A system of compulsory insurance for healthcare, with government paying the premiums of those unable to afford them, would have given universal coverage and created a huge fund that insurance companies would have invested in the economy. Instead, a zero-sum game was created, with a giant bureaucracy that was simply too big to manage. The founders actually thought that health expenditure would diminish as people were cured of ailments; instead, the system created a potentially infinite demand.

The nationalization of Britain’s industries turned them into loss-making, subsidy-dependent monsters that served the interests of the unions, political paymasters and bureaucrats instead of the public who were their consumers. They became a laughing stock, turning Britain into “the sick man of Europe,” with the worst strike record and the lowest growth rate in Europe. So they remained until we privatized them under Margaret Thatcher.

The blunt fact discovered the hard way then was that socialism doesn’t work. Markets allocate resources, weed out the inefficient, and bring success to those who satisfy consumer demand at prices they are prepared to pay. Socialist planners do not. They leave industry serving political, rather than economic, ends, and under Attlee they went a long way toward running it into the ground. It took decades for the UK to recover from the damage that socialism inflicted upon it in that disastrous government of 1945-1951. When the voters heaved it onto the scrapheap to return Churchill to power, it was already too late.

The point is that socialism was, and is, a construct of the mind, a vision of a mythical world that might be, one that bears little relation to the real world. It was a disaster then, and would be a disaster now. No-one who experienced that immediately postwar world would want to return to it. Yet the policies that socialists propose today would undoubtedly reproduce its squalor and poverty, just as they have done in Venezuela. One is left wondering how many times we have to repeat that disaster before its lesson is finally beaten into their thick skulls. It doesn’t work.

Tim Worstall

Keynes on economic nationalism - the facts have changed

Marginal Revolution points us to Keynes discussing economic nationalism and the idea that actually, you know, it might not be that bad an idea:

But I am not persuaded that the economic advantages of the international division of labor to-day are at all comparable with what they were. I must not be understood to carry my argument beyond a certain point. A considerable degree of international specialization is necessary in a rational world in all cases where it is dictated by wide differences of climate, natural resources, native aptitudes, level of culture and density of population. But over an increasingly wide range of industrial products, and perhaps of agricultural products also, I have become doubtful whether the economic loss of national self-sufficiency is great enough to outweigh the other advantages of gradually bringing the product and the consumer within the ambit of the same national, economic, and financial organization. Experience accumulates to prove that most modern processes of mass production can be performed in most countries and climates with almost equal efficiency. Moreover, with greater wealth, both primary and manufactured products play a smaller relative part in the national economy compared with houses, personal services, and local amenities, which are not equally available for international exchange; with the result that a moderate increase in the real cost of primary and manufactured products consequent on greater national self-sufficiency may cease to be of serious consequence when weighed in the balance against advantages of a different kind. National self-sufficiency, in short, though it costs something, may be becoming a luxury which we can afford, if we happen to want it.

The problem with this being that the facts have changed. We are now at a level of technology where the globe itself is the efficient market size. One of us here was, for a time, the global monopolist in a very minor market indeed. That monopoly was the result of a market so small that it wouldn’t support two people attempting to operate in it.

But it is possible to be more serious about this. The problems various countries are having with Huawei are well known. Who really wants a foreign government - allegedly - with access to the domestic telecoms system? And yet The Pentagon itself points out the problem here. Even the US market isn’t of sufficient size to be an efficient market. The costs of developing 5G are such that if standards there diverge from the rest of the world - likely, given spectrum issues - then the US is always going to be some years behind in that technology.

No single country supports an entire computer ecosystem. There are many more such examples. The efficient production size is now supranational, meaning that the costs of economic nationalism are far greater than they were when Keynes was writing in the 1930s. And, as Keynes the economist would be the first to point out, we should do less of things that are more expensive.

Tim Ambler

Sorry we cannot pay carers properly; we have to train trolley pushers instead

Matt Hancock celebrated what should, according to The Guardian, have been his last day in office with two press releases. The first, “Carers Action Plan 2018 – 2020”, reviews the progress made in supporting carers since the plan was put in place a year ago. The second chucks £20M at helping the NHS train entry-level staff – something it is quite capable of doing for itself.

Everyone in the country, except in Whitehall, knows that social care is underfunded, both as a whole and relative to the NHS. Within that, carers are under-rewarded, with the result that there are too few of them and that acquiring the necessary skills cannot be funded. Carers do not need platitudes; they need money.

One has to wait until p.20 of the review document to reach “Financial support” and all one gets there is:

“2.17 The Department for Work and Pensions will ensure that benefits for carers (including Carer’s Allowance and Universal Credit) meet the needs of carers and support employment for those carers who are able to work.  

2.18 The Department for Work and Pensions will review and improve the information and signposting available to carers who visit Jobcentres to seek support in finding employment.”

There is also a bit about encouraging greater flexibility and generosity by those businesses that employ part-time carers, i.e. volunteers looking after friends or relatives alongside paid employment.  Some people would call that “passing the buck”. 

The government does not even fund holidays for volunteer year-round carers as a charity in Norfolk does.

One of the most heart-rending areas of social care is that provided by young carers, i.e. children who have to prepare themselves for school in the morning and then do the shopping and housework as well as taking care of disabled, or otherwise handicapped, relatives at home.  Imagine the stress involved all year round and the difficulty of creating a normal life thereafter. What progress in supporting them? Sections 3.1 – 3.4 boil down to saying that the DHSC will try to find out who these young carers are. Beyond that, not a penny will be spent.

On the other hand, a glimmer of light appears in the next two paragraphs, but again it is not cash for carers, just more employment for civil servants:

“3.5 The Department for Education is undertaking a review of Children in Need, which includes young carers, to understand the challenges these pupils face and the support that best improves their educational outcomes, both in and out of school. The findings from the review will inform how best to support Children in Need in order that they achieve their full potential.

3.6 The Learning and Work Institute (LWI) and the Department for Work and Pensions launched customer information materials setting out the rules for students claiming Carer’s Allowance in September 2017. The impact of this activity will be evaluated and consideration given to further activity in due course.”

I will not weary you with more of this one-year-on review. Suffice it to say, it is claptrap. Compare it, though, with the other press release of the day: “£20 million funding to help 10,000 young people into NHS careers.” The Prince’s Trust will add £7M, so we are looking at £2,700 per school leaver to acquire the skills they should have gained at school. Obviously the NHS, like any other large employer, should train the young people it takes on. That is a regular part of employment: why should the government suddenly fork out another £20M to do it for them, at the same time as denying any further cash to professional social carers, or even to the kids providing care on top of their school work?

Furthermore, why should entry-level NHS catering staff and trolley pushers need further numeracy and literacy skills? One does not need A level English to read a menu. And job application skills? By the time they are on this programme, they are already in the NHS. And how will the benefits of this expenditure be evaluated? The press statement makes no reference to that, but we get a clue from this: “Health Education England has already worked in partnership with The Prince’s Trust to run 250 pre-employment programmes, helping over 1,000 young people find work in healthcare across the country.” That is a whole government department and the Prince’s Trust delivering programmes, each of which has as many as four people on it. Now that’s what I call productivity.

Nick Stace, UK Chief Executive, The Prince’s Trust, has a neat line in irony and concluded: “We believe that when young people succeed, our country succeeds and this is a great example of what that can mean in reality.”

Madsen Pirie

Test tube babies

Louise Joy Brown was born on July 25th, 1978. Her name is not widely known in popular culture, but her claim to fame is that she was the first baby born by in-vitro fertilization. This is a procedure that involves monitoring (and sometimes stimulating) a woman's ovulatory process, and removing an egg or eggs to be fertilized externally in a liquid in a laboratory. After fertilization, the fertilized egg undergoes embryo culture for 2-6 days before being implanted in the uterus of the same, or another, woman to set a pregnancy under way. The glass vessel used in the fertilization gives us the "vitro" of the procedure's name, as well as the nickname "test tube baby."

The first successful case took place in Oldham, where Robert Edwards, Patrick Steptoe and Jean Purdy developed the procedure. Edwards was awarded the Nobel Prize in Physiology or Medicine in 2010. Steptoe and Purdy would have shared the prize, but unfortunately both had died, and the prize is not awarded posthumously.

Like many scientific breakthroughs, IVF has brought choices and chances to people who had previously been denied them. In the 40 years after that first success, it is estimated that eight million children have been born worldwide using IVF and other assisted reproduction techniques. While the optimal age for a woman at the time of treatment is 23–39 years, the technique has enabled older women, including those past the menopause, to bear children successfully. A woman unable to carry her own children can have a surrogate mother carry a fertilized egg from her and her partner, and bring it to term. IVF has enabled couples in gay or lesbian relationships to become parents and bring up children.

Developments in cryogenics have enabled first embryos, and later unfertilized eggs, to be frozen for subsequent use. This has been important for those about to undergo medical treatment that might render them infertile. The outcome from using cryopreserved embryos has been uniformly positive, with no increase in birth defects or developmental abnormalities.

Recent advances have enabled couples who carry genes that can cause life-damaging conditions (such as muscular dystrophy) in a child to select for implantation an embryo that does not carry them. This eliminates the defect not only for the child, but for its offspring. More recently still has come the opportunity to remove or even repair defective genes before implantation.

Ethical issues have been raised by the process and its subsequent developments, and there were those who voiced alarm and opposition when Louise Brown was born. The Catholic Church opposes all kinds of assisted reproductive technology, but there is no consensus in religion, science or philosophy on when a human embryo should be recognized as a person and have the rights pertaining to that status. For those who believe this happens at the moment of fertilization of the egg, there are ethical issues when multiple eggs are fertilised, begin development, and only some of them are implanted, with the remaining ones discarded.

Some people are concerned that IVF makes it possible to select for the sex of the child, as Monique and Scott Collins did in 1997, when they chose to have a daughter to balance their family. Others are concerned that people will choose "designer babies" for their intelligence, looks, athletic, musical or mathematical skills. The suggestion is that rich parents will choose to bear children that have vastly improved chances of worldly success and achievement. Still others suggest that the more people that can do this, the more people will enter the world capable of improving it for everyone.

In the absence of a single world authority to impose universal rules, some countries will be more ready than others to allow experiments, innovation and advances, allowing would-be parents denied opportunities in one country to seek them in another. Few people who suffer from crippling hay fever or a life-threatening nut allergy would want their children to endure the same if they could prevent this. And the same is more obviously true for more serious life-limiting conditions. On a more general level, many people, if not most, would want a world that gave humans the chance to enjoy better lives. The genie came out of the bottle when Louise Brown came out of the womb, and it is not going back.

Tim Worstall

Craig Newmark's quite right: he doesn't need a billion dollars, nor does he deserve it, nor is it just that he has it

Craig Newmark is, for those not entirely au fait with this internet business, the guy behind Craigslist. He’s very rich as a result, and he wonders whether he should be:

Craigslist founder Craig Newmark: 'As a nerd, I don't know why I need a billion dollars'

The answer being that you don’t need a billion dollars, Mr. Newmark. No one does. But we’re absolutely delighted that you have one and will defend to the last ditch your ability to keep it if you so wish. On purely pragmatic grounds.

As William Nordhaus has pointed out, entrepreneurs gain perhaps 3% of the value created by their innovations. Nearly all the rest flows to consumers. The only way that these people get rich is by delivering far greater value - in aggregate - to us out here. We think that us getting richer is a good thing, and we therefore stand by the arrangements which make us so.
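To put rough magnitudes on that claim, here is a back-of-the-envelope sketch in Python, assuming (purely for illustration) that Newmark’s billion is the entrepreneur’s 3% slice that Nordhaus describes:

```python
# Back-of-the-envelope version of the Nordhaus point: if a founder's
# fortune is roughly 3% of the value his innovation created, consumers
# received nearly all of the rest. The $1bn is the headline figure from
# the Newmark story, used here purely for illustration.
founder_share = 0.03     # Nordhaus's estimate of the entrepreneur's slice
founder_fortune = 1e9    # the billion dollars in question

total_value = founder_fortune / founder_share      # ~$33bn of value created
consumer_surplus = total_value - founder_fortune   # ~$32bn flowing to consumers

print(f"Total value created: ${total_value / 1e9:.1f}bn")
print(f"Of which consumers kept: ${consumer_surplus / 1e9:.1f}bn")
```

On those numbers, the billion in the founder’s pocket is the small change on some $32bn delivered to the rest of us.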

And that’s it. There isn’t anything about fairness or justice in this. The reason the last set of entrepreneurs are left with Scrooge McDuck-like mountains of cash to surf down is so that the next generation gets to work making our children even richer. No other justification is necessary, nor is any other rightly invoked.

Economics is, famously, amoral. This being one excellent example of that point. We’ll even put up with an awful lot of rentiers and inheritance to keep that entrepreneurial show on the road for that is the very engine of increased prosperity over the generations.

Madsen Pirie

The Window Tax

It was on July 24th, 1851, that the hated Window Tax was finally abolished. Introduced under King William III, it was not intended to hit poor people, and exempted cottages, but was designed to be in proportion to the wealth of the taxpayer. An income tax was thought too intrusive, because the government had no business knowing how much people earned. The Window Tax was initially levied in two parts: a flat charge per house, plus a variable charge that rose with the number of windows. People had to pay 2 shillings annually (a tenth of a pound) per house if they had fewer than 10 windows, 6 shillings if they had between 10 and 20, and 10 shillings for those with more than 20 windows. In current values, 2 shillings then would be worth about £13.50 now.
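The banding is easier to see laid out explicitly. A minimal sketch of that schedule in Python follows; the treatment of a house with exactly 10 or 20 windows is an assumption here, since the boundary cases are not spelled out above:

```python
def window_tax_shillings(windows: int) -> int:
    """Annual Window Tax in shillings, per the banded schedule described above.

    Band boundaries at exactly 10 and 20 windows are assumed, as the
    historical edge cases are not given in the text.
    """
    if windows < 10:
        return 2    # flat 2 shillings for houses with fewer than 10 windows
    elif windows <= 20:
        return 6    # 6 shillings for houses with between 10 and 20 windows
    else:
        return 10   # 10 shillings for houses with more than 20 windows

# The cliff at each band edge is what made avoidance so attractive:
assert window_tax_shillings(10) == 6
assert window_tax_shillings(9) == 2   # one fewer window, a third of the tax
```

The jump at each boundary, rather than a smooth per-window charge, is exactly what rewarded bricking up a single window.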

Of course, taxes change behaviour, and dynamic models must take this into account. The tax did not raise the hoped-for sums because many people responded to it by bricking up some of their windows in order to avoid it. Visitors to Britain stare in fascination at some of our old houses, noting that where there was clearly once a window, there are now bricks or plaster. Sometimes this can be seen in whole rows of terraced houses. New houses were built with fewer windows to avoid the tax.

In Scotland a Window Tax was introduced after 1748. A house had to have at least seven windows, or a rent of at least £5 to come under the tax. When it was increased in the 1780s, some Scots opted, instead of bricking up windows, for the less costly recourse of painting them black, with a surrounding white frame. These were known as Pitt’s Pictures, after the prime minister of the day, and can still be seen in some places.

The Window Tax was unpopular, because it was seen by some as a tax on "light and air." The tax was increased many times, especially during the wars with France, but it was halved by the reforming administration of 1823, and ended altogether in 1851 after popular agitation.

It does provide a salutary lesson for those who would levy taxes. Increases in tobacco duty might be intended to raise money or to make people smoke fewer cigarettes, but they also encourage smuggling. Increases in alcohol duties might be for revenue or to cut alcoholism, but they also lead people to opt for cheaper booze, and in Scotland, perhaps even for opioids.

Stamp duties on house purchases result in fewer transactions because people stay put in order to avoid them. This leaves the elderly staying in larger homes than they need once their children have left, leading to a market shortage of homes suitable for young and growing families. There is a point at which income tax increases produce a fall in revenue as people put in less work and use tax-shelter schemes to avoid paying them. Higher corporation taxes lead corporations to locate elsewhere, and higher capital taxes lead people to move their capital beyond the reach of the tax man.

A ruling of the United States Supreme Court stated that "The legal right of an individual to decrease the amount of what would otherwise be his taxes or altogether avoid them, by means which the law permits, cannot be doubted," and a judicial ruling in the UK declared almost a century ago that the law did not require a person to pay the maximum tax if they could avoid doing so.

Two things in particular irritate those who would spend our money as they wish rather than as we wish. One of these is tax competition, which gives people the option of moving assets and earnings to lower tax environments. And the other is the ability of people to modify their behaviour in order to escape the incidence of taxes levied upon it. The history of the Window Tax is a good lesson.

Tim Worstall

To entirely miss the point of NHS privatisation

Well done to The Guardian here for really missing the point of NHS privatisation. Their numerical complaint seems odd to begin with:

In some sectors the proportion of private spending is many times the overall average of 7.3%, with 44% of all spending on child and adolescent mental health going to private providers, and 30% of mental health budgets overall.

And as far as we know about these things, 100% of GP services are contracted out and always have been. Shrug. As ever, the question is what is best done inside a command and control organisation, and what is best contracted out to the market on cost, efficiency or specialisation grounds? You know, the Coasean question of why we even have production organisations at all?

But it’s this that irks:

Evidence that private providers are failing in their duty of care to vulnerable young people is mounting. In April, Priory Healthcare was fined £300,000 after pleading guilty to criminal charges related to the death of Amy El-Keria. Another of the company’s hospitals is set to close after being rated inadequate. While poor practice is not limited to private providers, on accountability and transparency measures they fall far short.

No, that’s the wrong way around. On accountability and transparency the market wins, hands down.

How many NHS wards or hospitals have been closed down because they turn out to be terrible? Or inadequate, or a bit not very good? The answer, as we all know, is none. How many private sector things get killed off for being a bit not very good?

Well, that’s the point, isn’t it? One thing the market is inordinately good at doing is pushing out, bankrupting, closing those things which, by any absolute standard, are pretty much fine but which also happen to be just worse than others. Nothing particularly wrong in that absolute sense with House of Fraser, varied spinning mills, that NHS sandwich maker. They were just not quite as good as others at doing those things, so bye bye and off they went.

Yes, we talk about how markets increase productivity and quality, lower prices, and so on. But they do these things through competition. That is, by killing the businesses which aren’t as good as the others. Which is a great deal more accountability and transparency than we get from a tax-funded monolith like the NHS, isn’t it? Stafford Hospital is still in business even if it’s called County Hospital now. The Priory’s place, on the other hand, is out of business entirely for being a bit not very good. That’s pretty accountable there.

Madsen Pirie

Banning foreign words

On July 23rd, 1929, the fascist Italian government of Benito Mussolini officially banned the use of foreign words in Italy. The aim was to “Italianize” the culture and purge it of foreign influences. Actually, most people in Italy then spoke regional languages; when Mussolini came to power in 1922, only 12% spoke Italian. The ban was meant to drive the words of foreign languages out of use, but many regional languages were lumped in as well.

New Italian words were invented to replace the foreign ones, and a huge industry sprang up to dub foreign films into Italian, an industry that still thrives. Part of the problem was that the Italian alphabet is five letters shorter than ours, and some of the foreign words brought those letters in with them.

France has tried the same for many years with its Académie française acting as a language police to seek out and replace Anglicisms that creep in over the years. I remember them telling me that le jumbo jet had to be replaced by le grossporteur, and that l’hovercraft must become l’aeroglisseur. It didn’t work, of course. They remain attached as ever to le weekend, when they might indulge in le jogging, or perhaps le camping. It is particularly hard for them to resist English and American words surrounding new technology.

The English language has had no such qualms, eagerly lapping up a plethora of foreign words every year. Some are from America, from whence came gadget, gimmick and maverick, among hundreds of others. Some come from France, a veritable dossier of etiquette, and coupons for restaurants. The UK's colonial past enriched it with words from India and Africa, which we happily use as though they came over with William the Conqueror, as hundreds of words did.

We regard English as a living language, changed day by day as usage changes. Some words we resist for a time, but they settle in if people find them useful. It's obviously helpful to have a word for each number with an extra three zeros, so the English billion (meaning a million million) has gone, and the word now means a thousand million, following American usage.

The English language is rather like English common law, made more by usage than prescription. We have rules of grammar, but we don't learn them from books; we pick them up instinctively as children, by listening to how people use the language. We have rules of meaning and pronunciation, but we acknowledge that they change over time. It is part of the joy of English that it can adapt itself and meet new needs by new uses.

Yes, indeed. We do cultural appropriation very well, only we call it assimilation. It is a tribute to other countries that we take in their words, just as we take in their foods, their fashions, and even their people. Mussolini and the Académie française do it differently, but to us they can seem like latter-day Cnuts trying to order back the tide of history. We prefer to absorb good words from others, just as we absorb good ideas. It makes us richer.

Tim Worstall

John Harris' little misunderstanding - trade is a technology

John Harris wants to tell us all that the right wing just don’t get how difficult this automation, AI, the robots taking all the jobs thing is going to be. He then contrasts this with the attention paid to trade and the terms upon which it happens. This thought having a certain problem to it:

So far, technology has not been one of the favoured themes of the western world’s populists, who are still much keener on talking about work and prosperity in the context of globalisation, trade and such supra-national institutions as the EU. But Frey’s book holds out the prospect of these politicians sooner or later floating the idea of somehow slowing the pace of automation so as to protect their supporters. History offers lessons here: given the convulsions of the industrial revolution led eventually to such liberating, job-creating innovations as mass access to electricity and the internal combustion engine, to do so would threaten things that, in the long run, will surely be to everyone’s benefit. Clearly, any convincing answer to technological disruption lies not in trying to deny the future, but coming up with the kind of ameliorative social programmes – housebuilding, huge changes to education, either a universal basic income or a system of basic social rights – that might both protect people and allow them to make the most of huge change. But when do you hear Trump, Johnson or Nigel Farage talk about any of that?

We don’t speak for those politicians of course. But as David Friedman has pointed out trade is just a technology. His example was, if only we had a machine that could turn corn into cars. Which, of course, we do (or did, when he composed the thought experiment). Japan. The US sent corn to Japan, it got back cars. That’s a technology for turning the veggies into the vehicles.

The point of the story being that there’s no real useful distinction between the two. Automation and trade are just two different flavours of the same thing, technologies.

As to what we do about them, the answer is of course the same in both cases. Let them play out and, yes, as Harris suggests, then deal with the bits of the results we don’t like in the aftermath. We do already have a welfare state, of course, and while we might spit feathers about its specific details we’re not suggesting that there won’t be one.

We would though insist that we think hard on “surely be to everyone’s benefit”. Because yes, the increased productivity that comes from advancing technology - whether of the robot, AI, automation or trade flavours - is exactly what makes us all richer and yet richer again. Just as the last 250 years of it has got us to this peak, the richest human beings ever.

So far.

Madsen Pirie

When Roosevelt failed to pack the Supreme Court

Although Franklin D Roosevelt is rightly hailed as a popular hero who led America through World War II, there were darker sides to his presidency, including a blatant bid to pack the Supreme Court with his supporters. He wanted to expand the court to outvote the justices who were blocking some of his New Deal legislation because in their eyes it violated the Constitution they were sworn to uphold. On July 22nd, 1937, the US Senate rejected his Judicial Procedures Reform Bill, voting 70-20 to send it back to committee, where the controversial innovations were deleted from it.

Although the Judiciary Act of 1869 had stipulated a Chief Justice and 8 others to make up the Supreme Court, Roosevelt suggested that because this wasn’t in the US Constitution, Congress had the power to change it. He wanted power to appoint extra justices, up to a maximum of 6, to supplement the existing 9 members whenever any failed to retire on reaching the age of 70 years 6 months. The aim was to add justices who would outvote those striking down some of his New Deal measures.

Although he unveiled it in one of his fireside chats and sought popular support, the public remained hostile on balance after brief initial backing. The President claimed that the Court needed more members because it “was having to decline, without any explanation, to hear 87% of the cases presented by private litigants.” Chief Justice Charles Evans Hughes publicly denied this, claiming that for several years they had been hearing cases within 4 weeks.

Democratic committee chair Henry F. Ashurst delayed hearings in the Judiciary Committee, holding the bill in committee for 165 days, contributing to its ultimate defeat. The Republican National Chairman, Henry Plather Fletcher, suggested that “an administration as fully in control as this one can pack it [the Supreme Court] as easily as an English government can pack the House of Lords." He was right, in that the threat to do this has been used several times in the UK to secure the compliance of the Upper House.

Although FDR lost out to his Chief Justice, who was backed in Congress by the President’s opponents, he did, by staying in office for 12 years, eventually get to appoint 8 of the 9 justices. However, it was the vote in the Senate on this day in 1937 that is reckoned to have saved the independence of the judiciary.

What we observe in history is that if the executive acquires this kind of power, it will eventually use it to get its own way on trivial, everyday matters, in addition to the vital ones used to justify the power. The Parliament Act that reduced the delaying power of the UK’s House of Lords “in cases of vital national emergency” was used by Tony Blair to ban fox-hunting. The universal lesson is that in a democracy, you don’t acquire extra powers that you are not happy to see the other side use at a later date. President Trump has made use of the Executive Orders, and of the reduced majority needed to confirm judges, that were the hallmarks of Barack Obama’s administration.
