In Western Australia it is an offence to possess more than 50kg of potatoes, unless you are a member of the Potato Corporation. Challenging someone to a duel in Tasmania carries a $6,000 fine. There's no law against wearing hot pink pants after midday on a Sunday afternoon in Victoria, but flying a kite in public "to the annoyance of any person" can set you back $800. Under the Rain-Making Control Act (1967) you are advised that it is an offence to "carry out unauthorised rain-making operations".
Getting a measure of how many laws there are in Australia is not easy, but one practitioner's resource, The Laws of Australia, proudly lists over 40,000 legal propositions across 320 specific topics and 36 broad topic areas. Despite the legal maxim that ignorance of the law is no excuse, it is clear that it is impossible for even an interested reader to have a full and complete knowledge of the law - let alone an uninterested one.
In The Design of Everyday Things, Donald Norman describes two crucial principles for a useful object: Discoverability and Understanding. It should be easy to work out what an object can do, and clear how to use it. For the vast majority of laws, this is not the case: a typical Act may run for many hundreds of pages, define case-specific jargon, refer to rules set out in external pieces of legislation, and suffer from ambiguities, internal inconsistencies, or simply poor writing (the 528-page Duties Act in Queensland is a prime example).
In part, this is the nature of the beast: the administrative structure for changing a law is a complicated and multi-staged affair, involving a parliament which will not seek to amend a law unless doing so earns political capital. Repairing cracks in the sidewalk is necessary, but it's nothing you want to turn into a press statement. As a result law is updated only sporadically, and often in response to the most vocal and focal failures rather than being optimised.
There are moves to improve this process. One of the more interesting is being led by the CSIRO Regulation as a Platform project, which recommends converting regulatory rules into machine-readable logic that can be quickly checked for internal consistency, communicated more clearly to the public, or automated for simple administrative tasks. Advances in natural language processing may additionally allow users to ask for clarification of a legal rule and receive general advice from a machine.
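To make the idea concrete, here is a minimal sketch (in Python, and in no way the actual CSIRO system) of what the Western Australian potato rule from the opening paragraph might look like as machine-readable logic - a form that can be checked automatically and turned into a plain-language explanation.

```python
# A minimal sketch of a regulatory rule as machine-readable logic, using the
# WA potato rule described above. Illustrative only; not the CSIRO platform.

from dataclasses import dataclass

POTATO_LIMIT_KG = 50  # threshold taken from the rule as described above


@dataclass
class Person:
    potatoes_kg: float
    member_of_potato_corporation: bool


def commits_offence(p: Person) -> bool:
    """True if the person possesses more than the limit and no exemption applies."""
    exceeds_limit = p.potatoes_kg > POTATO_LIMIT_KG
    exempt = p.member_of_potato_corporation
    return exceeds_limit and not exempt


def explain(p: Person) -> str:
    """Generate a simple plain-language explanation of the outcome."""
    if not commits_offence(p):
        return "No offence: within the limit or exempt as a Corporation member."
    return (f"Offence: possession of {p.potatoes_kg}kg exceeds the "
            f"{POTATO_LIMIT_KG}kg limit and no exemption applies.")


print(explain(Person(potatoes_kg=60, member_of_potato_corporation=False)))
# -> Offence: possession of 60kg exceeds the 50kg limit and no exemption applies.
```

Once a rule is in this form, checking it for internal consistency, or wiring it up to a chatbot that answers "am I allowed to do X?", becomes a routine engineering task rather than a research problem.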
The idea has received pushback due to concern that an algorithmic approach to law removes the human element and the role of discretion. Moreover, the stink of the Robodebt scandal makes some leery of outsourcing the application of the law too far to machines. However the idea has some merit - not only in terms of directly using new tools for searching or interpreting the law, but also conceptually, using programming concepts to better understand how law might be designed to run more efficiently and be more user-friendly.
In many ways a law is a program which runs on society rather than on circuits: wet, squishy humanware rather than hardware. It involves a system of logic and operations: functions which are defined and acted upon in a systematic order, with constraints that restrict the operation of the system. The analogy is particularly clear for administrative law applied by rote. However, it also applies to laws which require the application of discretion or decision-making (the decision maker is simply a sub-operation requiring a person).
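A rough sketch of the analogy: a hypothetical administrative penalty expressed as a program, where the mechanical steps run by rote and the discretionary step is simply a sub-operation that happens to require a person. The rule and figures are invented for illustration.

```python
# Illustrative only: a made-up penalty rule, with discretion as a human sub-operation.

from typing import Callable


def assess_penalty(base_fine: float,
                   prior_offences: int,
                   decide_leniency: Callable[[str], bool]) -> float:
    """Apply the fixed rules by rote; delegate the discretionary step to a human."""
    fine = base_fine * (1 + 0.5 * prior_offences)  # mechanical, rule-bound step
    if decide_leniency("Are there mitigating circumstances?"):  # human sub-operation
        fine *= 0.5
    return fine


# In practice the callable would be a magistrate or delegate; here we stub it out.
human_decision = lambda question: False
print(assess_penalty(base_fine=800, prior_offences=1, decide_leniency=human_decision))
# -> 1200.0
```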
In code, there is the concept of algorithmic efficiency - how resources (time and storage) vary with the size of an operation. This is measured by reference to the cost of a base operation (e.g. adding two numbers, or plucking an item out of a list). The common measure is the worst-case running time: a constant operation O(1) takes the same time no matter the size of the input; a linear operation O(N) takes time proportional to the size of the input; and polynomial operations [e.g. O(N^2)] and exponential operations [e.g. O(2^N)] become increasingly less efficient as the input grows. Designing an efficient algorithm is understandably a goal for a good program that will run fast and easily with little overhead.
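By way of illustration, the sketch below shows what these complexity classes look like in practice, counting base operations rather than wall-clock time.

```python
# Simple examples of the complexity classes mentioned above.

def constant_lookup(items: list, index: int):
    """O(1): one base operation, regardless of how long the list is."""
    return items[index]


def linear_search(items: list, target) -> bool:
    """O(N): in the worst case, one comparison per item."""
    for item in items:
        if item == target:
            return True
    return False


def has_duplicates(items: list) -> bool:
    """O(N^2): compares every pair, so work grows with the square of N."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input doubles the worst-case work of the linear search, but roughly quadruples it for the pairwise duplicate check - the kind of growth you notice quickly as the input gets large.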
In principle the idea of algorithmic complexity also applies to law. Here we have additional forms of resources: costs of implementation (the cost of the administering public servants), costs of interpretation (the cost of understanding the law, e.g. employing external legal advice, reading legislation, or checking informal explainers), and costs of error correction (the judicial system, including its many barriers to entry). While less tangible, these are similar to the resources counted in algorithmic complexity: they represent the cost of running a particular process, which may itself comprise many smaller processes or required actions.
As a result, programming principles can be applied to make law more efficient and user-friendly. The aim is to minimise the total cost of operation while still achieving the desired legal aim. Looking at the individual operations involved in a procedure - especially complex, expensive, or frequently repeated ones - is the first step in improving the efficiency of a law. Beyond thinking about law in terms of algorithmic complexity, a number of core programming techniques lend themselves to direct analogies in legal design. Many of the rules outlined in The Pragmatic Programmer (outline here) are directly applicable, particularly orthogonality (reducing cross-dependencies), maintaining single unambiguous definitions, and building in exception handling.
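As a rough illustration, here is how two of those ideas might translate into "drafting": a single authoritative definition that every dependent rule references, and explicit handling of the case the main rule does not cover. The terms and thresholds are invented for the example.

```python
# Illustrative only: a made-up "heavy vehicle" scheme showing a single
# unambiguous definition plus explicit exception handling.

HEAVY_VEHICLE_THRESHOLD_TONNES = 4.5  # the one authoritative definition


class UnregisteredVehicleError(Exception):
    """Explicit handling for the case the main rule does not cover."""


def is_heavy_vehicle(mass_tonnes: float) -> bool:
    # Every rule that turns on "heavy vehicle" calls this one definition,
    # so changing the threshold changes it everywhere at once.
    return mass_tonnes > HEAVY_VEHICLE_THRESHOLD_TONNES


def registration_fee(mass_tonnes: float, registered: bool) -> float:
    if not registered:
        raise UnregisteredVehicleError("Vehicle must be registered before a fee applies.")
    return 1200.0 if is_heavy_vehicle(mass_tonnes) else 300.0
```

The legislative equivalents are familiar: a dictionary section that every other provision defers to, and a clause that says what happens when the usual process cannot apply, rather than leaving the edge case silent.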
It's no coincidence that the authors of The Pragmatic Programmer turned to legal analogies when outlining concepts for programmers. As is often the case when two fields share conceptual links, learnings can flow in both directions, and legal professionals and policy designers would be well advised to borrow design techniques from computer science where they can.