We’re all Doomed!

It seems like everyone, right now, is convinced of their own doom. The pages of the Guardian are full of angry leftie types who feel betrayed by 52% of the electorate, while the red-top papers are little better, full of angry accusations of treachery and imagined attempts to subvert the result of the Brexit vote. Bizarrely, every side feels under threat simultaneously, but in crazy times like these maybe they’re all right. When the rules of the game fall apart, no-one can be sure any more what it takes to win, or what winning even means, especially when the team captains themselves shot the referee and are now warring over possession of the rule book. The village has to be destroyed in order to save it.

According to Google and Wikipedia, the original sense of the word doom was judgment, someone receiving what they deserved:

Old English dōm ‘statute, judgement’, of Germanic origin, from a base meaning ‘to put in place’; related to do.

And there’s certainly a lot that needs judging. The Brexits and Trumps of the world weren’t random events; they were responses to the faults that run through the way we organise the world, and to the increasingly desperate papering over of the cracks. Every system contains the seeds of its own destruction (as Marx supposedly said of capitalism), and when those seeds germinated there were two choices:

  1. Be honest about the problems and the drastic changes required to fix them
  2. Lie, pervert and distort all law and reason to delay burying the bloated but familiar corpse as long as possible

It doesn’t take much to see which way our illustrious leaders went.

Large societies always find it hard to fundamentally reform, because the illusion of permanence is the only way they can work. Imagine what would happen if people with pounds in their pockets didn’t believe that the pound was a stable store of value, with a stable and strong government behind it. Or if they didn’t believe that their pensions would deliver what was promised. Or if they didn’t act as if institutions, which are composed entirely of rules and therefore in theory more mutable than anything physical, would endure forever.

In such a world, the only thing you can trust are friends and family, but no system of 60+ million people (such as the UK itself) can be run on that basis. It’s been tried in Africa and other parts of the world where small tribes were randomly allocated to states, and the result has generally been massive corruption and a state that barely functions at best. The ability to work together with strangers, the historically bizarre cultural belief that the system itself can substitute for personal links, is what makes the advanced economies work, but that very illusion means that nothing fundamental can be fixed without bringing into question the very basis of the system, even when the system is tearing itself apart. There’s always a temptation to tweak and paper over the cracks and hope.

Let’s take our current repeat of the Great Depression as an example:

  1. SUCCESS: In the post-war period, Keynesian policies created a much more equal society with a much higher standard of living for the working class than before.
  2. FAILURE: The pursuit of full employment gave organised labour a lot of bargaining power. In the 70s, when the economy hit the rocks, workers and the investment class were basically unable to reach a compromise about how to divide the limited spoils. The result was the three-day week and a country at war with itself.
  3. SUCCESS: Margaret Thatcher broke the UK unions and redistributed power towards the investment class. After a sharp recession, the economy started functioning again and, combined with a boost from North Sea gas and oil, things seemed to improve.
  4. FAILURE: Instead of rebalancing power between labour and investors, the pendulum swung massively too far the other way. Not only was the power of organised labour destroyed, but the government deliberately pursued policies to increase competition in low-skilled jobs by bringing cheaper labour into the EU and pushing free-trade deals that facilitated off-shoring of jobs. This caused a shortage of demand for the goods being produced – the propensity to spend at the bottom is higher than the top, and there has to be enough money spent to buy the goods being produced or the economy enters a death spiral. The lesson of Keynes was ignored by Thatcher and other followers of Friedman.
  5. SUCCESS: A liberalised financial sector (also due to Thatcher) propped up consumer demand by creating a debt bubble. Secured lending on housing, and unsecured lending via credit cards, created money to plug the spending gap (yes, banks create money, and the money multiplier model is empirically wrong). Everything seemed to be OK as long as the bubble was inflating. Since the liberalisation of finance also occurred across borders, positive feedback loops crossed borders too to reinforce the bubble.
  6. FAILURE: In parts of the system, the bubble popped. Strong connections between banks meant that instead of one national bubble popping, the entire international debt bubble went pop.
  7. SUCCESS: Banks were bailed out, and continue to be bailed out (the latest being Italy’s third biggest bank, Monte dei Paschi). At first, economies were stimulated by governments to maintain consumer demand and avoid an even bigger recession. Things seemed to stabilise.
  8. FAILURE: Government debts soared faster than for a long time, causing panic among politicians and influential people. Austerity was imposed to try to slow the growth of debt or decrease debt, causing a return of the lack of demand problem which was ‘solved’ by the debt bubble in the first place. Some parts of the world entered a death spiral (Greece).
  9. SUCCESS: Everything can be solved by better public relations! Politicians rely on animal spirits to magically solve the problem. Unemployment and inflation statistics are calculated in misleading ways to make people more optimistic. Crapification is progress: zero-hours contracts and the replacement of well-paid full-time roles with minimum-wage and part-time jobs are presented as a good thing. In the UK, politicians succeed in restarting debt-fuelled house price inflation, for a time.
  10. FAILURE: It turns out that the dogs won’t eat the dog food. Long-standing anger at economic policies at the bottom combines with the anger of the previously insulated to deliver Brexit and Trump. Spain is ungovernable. Le Pen and other right-wing populists rise in France, Austria and the Netherlands. The 5 Star Movement has a good chance of becoming the biggest party in the Italian parliament, and wants to withdraw Italy from the Euro.
  11. SUCCESS: ???

Another example might be our inability to solve the environmental problems of economic growth, or to accept that growth itself, of both populations and economies, is limited in a finite world.

Note that the first failure has been returning to haunt us for decades now. We started down this road because of the difficulty of maintaining a balance between demand and supply, labour and capital. Now we have exactly the same problem, but we also have:

  1. The aftermath of the biggest debt bubble in history
  2. A zombie financial system
  3. A government that regards lying (everything is fine!) and perverting justice as a problem solving mechanism (no bankers were jailed!)
  4. The disintegration of the political centre (Brexit, Trump, …)

How much easier might it have been if the fundamental flaw in the system, the need to maintain some kind of equilibrium between supply and demand, workers and capitalists, had been comprehensively solved in the seventies? But to do that would have required an admission that this man-made system can be whatever we want it to be, that the only constraint is what we’re willing to collectively live with, and admitting that would have destroyed the system all by itself. So instead we get the sticking plasters and the gradual march towards total disintegration.

Python vs C#

I thought I’d post about something different for a change, just to prove I have interests outside of gardening. And since I was debating this topic with a colleague on Friday, it seemed like a good place to start.

Static vs Dynamic Typing

I should explain that I work in the digital department of a medium sized engineering company. My department is divided between people who are more focused on developing algorithms to solve interesting problems (‘modellers’) and people whose focus is building production IT systems (‘software engineers’). Obviously these two sides tend to favour different languages, thus the Python vs C# debate.

The debate at the time focused on whether static typing is a major plus when choosing a language. I personally don’t think that static types should be the deciding factor – not because I find it hard to use a statically typed language, but because in my experience static types typically only catch the easy errors early.

Of course, it’s useful to be told immediately if you’re trying to add a string to a number. But such errors are generally identified even without static checking, as long as your testing is thorough enough to cover your code base properly. Where static types don’t help is with the hard-to-find errors, where you’re doing the algorithmically wrong thing with the right types – and those are the ones where you lose a lot of time. The only way to find those is thorough testing, which tends to catch your type errors in any case.
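To make that concrete, here’s a minimal sketch (the function names are my own invention): a static type checker passes this code happily, because the bug is arithmetic, not type-related.

```python
from typing import List

def average(values: List[float]) -> float:
    # BUG: divides by len(values) - 1 instead of len(values).
    # Every type here is correct, so a static checker is
    # perfectly happy -- only a test on real values catches it.
    return sum(values) / (len(values) - 1)

def average_fixed(values: List[float]) -> float:
    # Same types, correct arithmetic.
    return sum(values) / len(values)
```

A test like `assert average([2.0, 4.0]) == 3.0` fails immediately, which is the point: the test that finds this bug would also have caught any type error along the way.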

Of course, a better type system can make static typing slightly more useful by making the types of functions more descriptive of what they should do, and by ruling out at compile time more incorrect behaviour.

For example, in the past I’ve written code outside of work in Haskell, which has a very strict but expressive type system. In Haskell, the type of a function to get the length of a list is:

length :: [a] -> Int

Here, ‘a’ represents any type at all. The function can take a list of values of any type and return a number. The type signature also encodes that length does no IO or anything else non-deterministic.

Given that we know that ‘length’ is a deterministic function that does no IO, that it knows nothing about the type of the values contained in the list it takes as input, and that it returns a number, there’s a very limited number of things it could do, and a lot of incorrect behaviour the compiler can object to up front.

Broadening the Argument

Of course, my debating partner didn’t agree with my argument. But thinking about it afterwards, that was only the tip of the iceberg. Here are some other criteria where I’d disagree with common practice:

  1. Benefits of OO (Object Orientation)
  2. Reference vs value programming

I think from our discussion that my colleague would disagree with me on OO, but agree with me on the importance of restricting shared references.

Object Orientation

Object orientation is probably the most popular programming paradigm around right now. And for good reason, since it directly encodes two human tendencies / common ways of thinking:

  1. Organising things into hierarchies
  2. Ascribing processes to entities

Now, I don’t deny that OO can be a helpful way to structure your thinking for some problems, although helpful isn’t the same as necessary. What I have a bigger problem with is what you might call the OO fundamentalism that’s taken over much of the field.

Depending on the language, you might find things like:

  • The inability for a function to just be a function without an owner

Do Cos and Sin really need to belong to a class rather than just a library of functions? Do you really need to use a Visitor pattern instead of just passing a first class function to another function? Does a commutative function like add really ‘belong’ to either of the things being added?
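As an illustration of the first-class function point (a sketch, with made-up names): rather than building a Visitor hierarchy, the varying operation can simply be passed in as an argument.

```python
def apply_to_tree(tree, fn):
    # Recursively apply fn to every leaf of a nested-list 'tree'.
    # In Visitor-pattern terms, fn plays the role of the visitor,
    # without any interface or class boilerplate.
    if isinstance(tree, list):
        return [apply_to_tree(branch, fn) for branch in tree]
    return fn(tree)

doubled = apply_to_tree([1, [2, 3]], lambda leaf: leaf * 2)
# doubled == [2, [4, 6]]
```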

  • The inability to separate shared interfaces from inheritance

In some languages, interfaces as a concept don’t exist. In others, they do exist but are under-utilised in the standard library, meaning that in practice you’re often forced to build a subclass when all you really want is to implement a specific interface required by the function you want to call.

In Python, this issue is solved by duck typing. In at least one non-OO statically typed language (Haskell), it’s solved by type classes. In C# it’s inconsistently solved by interfaces.
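A minimal sketch of the duck typing approach (the classes are hypothetical): neither class declares a common base class or interface, yet both satisfy the caller, because all Python checks for is the presence of the method at call time.

```python
class WavFile:
    def play(self):
        return "playing wav"

class Synth:
    def play(self):
        return "playing synth"

def play_all(sources):
    # No isinstance checks and no shared base class: anything
    # with a .play() method will do. A missing method only
    # surfaces at runtime, which is why thorough testing is
    # the real compliance check.
    return [source.play() for source in sources]

play_all([WavFile(), Synth()])  # → ['playing wav', 'playing synth']
```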

  • WORSE: interfaces only by sub-classing, and single inheritance so classes can effectively only implement one ‘interface’ at a time

Who decided to make it so hard to specify the actual interfacing standards in a generic way?

  • The tendency for OO languages to encourage in-place mutation of values and reliance on identity rather than value in computation as the default, rather than as a limited performance enhancing measure

It’s now widely agreed that too much global state is a bad thing in programming. But the badness of global state is really just an extreme of the badness of directly mutating the same memory from many different locations. The more you do this without clear controls, the harder it is to debug the resulting program. And yet the most common programming paradigm around encourages, in almost all cases, in-place mutation and reference sharing as the default.

This annoys me so much I’m going to write a small section about it.

References vs Values

Let’s illustrate the problem with a simple Python example, shall we? In Python you can multiply a list by a number to get multiple copies of the same values, for example:

>>> [1] * 3
[1, 1, 1]

Now, let’s imagine we want a list of 3 lists:

>>> [[1]] * 3
[[1], [1], [1]]

Let’s say we take our list of lists and add something to the first list.

>>> x = [[1]] * 3
>>> x[0].append(2)

What do you think the value of x is now? Do you think it’s [[1,2], [1], [1]]? If you do then you’re wrong. In fact it’s [[1,2], [1,2], [1,2]], because all elements of the list refer to exactly the same object.

How dumb is this? And to make it even worse, like many OO languages, Python has a largely hidden value vs reference type distinction, so the following does work as expected:

>>> x = [1] * 3
>>> x[0] += 1

You get the expected x = [2,1,1] as a result.

So you have a pervasive tendency for the language to promote object sharing and mutability, which together mean you have to be incredibly careful to explicitly copy things otherwise you end up corrupting the data other parts of your program are using. And unlike in C, where for the most part it’s clear what’s a pointer and what’s not, you also have an unmarked lack of consistency between types which do this and types where operations are by value.
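For what it’s worth, the usual defensive workaround is to construct each inner object separately, so no references are shared – a sketch:

```python
# A list comprehension evaluates [1] afresh on each iteration,
# so the three inner lists are distinct objects...
safe = [[1] for _ in range(3)]
safe[0].append(2)
# safe == [[1, 2], [1], [1]]

# ...whereas multiplication copies the same reference three times.
shared = [[1]] * 3
assert shared[0] is shared[1] is shared[2]
```

The fact that you have to know which construction to reach for, rather than the safe one being the default, is exactly the complaint.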

Similar issues occur in most object orientated languages, creating brittle programs with vast amounts of hidden shared state for no obvious benefit in most cases. It would be better to have special syntax for the limited cases where shared state is important for performance, but that’s not the way most of the world went. And now we’re paying the price, since the reference model breaks completely in a massively-parallel world.

So – Python vs C#?

How do Python and C# stack up on all three criteria?

  • Typing

Both are strongly typed, but C# has static typing while Python doesn’t.

As I said, for me C#’s type system isn’t clever enough to catch most of the hard bugs, so I think it only earns a few correctness points while losing flexibility points.

No overall winner.

  • Object orientation

Both languages are mostly object orientated. Python is less insistent on your own code being OO than C#, and will happily let you write procedural or semi-functional code as long as you don’t mind the standard library being mostly composed of objects.

C# has interfaces as a separate concept, but for added inconsistency also uses sub-classes for shared interfaces. Python mostly does shared interfaces by duck typing, which is of course ultra-flexible but relies on thorough testing as the only way to check compliance to the required interface.

Since I don’t think OO is always the best way to structure a problem, I’d give the points to Python on this.

  • References vs values, aka hidden pointers galore

Both Python and C# have the same disturbing tendency to make you work hard to limit shared state, and promote bugs by choosing the dangerous option as the default.

Both languages lose here.

So in terms of a good experience writing code, I’m inclined to give Python the advantage, but to be honest there isn’t much in it. For me, other factors are much more important, such as availability of the functionality required for a project, availability of others in the team with the right skillset for ongoing maintenance, and very occasionally performance (Python is not fast, but this doesn’t matter for most projects).

Now if a good functional language like Haskell or OCaml would just become common enough to solve the library and personnel availability issues, we’d at least have the opposite extreme as an option (value over reference, little or no OO). Then maybe in another decade we could find a compromise somewhere in the middle…

Bottomless Pots

Terracotta pot

When I was growing up, my parents used to use chimney pots as planters. I’m not sure whether they just had a couple spare or whether it was a deliberate plan, but I’ve been thinking recently what a good idea it was.

I’ve never been a big fan of pots as a permanent place to grow things. They do have advantages: like raised beds, they warm up quicker in spring; it’s often handy to have segregated areas for growing smaller or weaker plants that need extra attention and coddling; and of course pots can look good as well. But on the other hand they’re always too dry or too wet, nutrients get used up or leached from the soil, and before you know it you’re spending half your time watering and feeding the potted plants while the ones in the ground just look after themselves.

So I’ve decided to take inspiration from the chimney pots. We’ve already taken an angle grinder to the bottom of one large pot, and I’ve let it be known in the family that 30–40cm square terracotta pots would be a welcome Christmas gift. I’m going to line some of the paths with them, and the open bottoms should let worms and plant roots in and out, avoiding the nutrient depletion and moisture issues.

I just wonder why everyone isn’t already selling them pre-cut…

The missing ingredient, source: Wikipedia, by Widar23

SD Card Failure

About a week ago, the SD card in the Raspberry Pi died. I’ve spent a few hours over the weekend rebuilding the system and trying to mitigate future SD card failures by migrating all but the boot partition to an external hard drive, but unfortunately some things still aren’t up and running, and for some reason WordPress is now incredibly slow compared to its past performance.

I’m hoping to resolve the issues soon and get everything back more or less to how it was. Until then, if you’re patient, the old posts should now be back up, with some pictures missing.