Unpacking these sentiments is enlightening. Effectively the clamour was for a detailed model and computation of what leaving the EU versus staying in might mean, particularly in practical financial ways like affordability of housing.

The fact is, no-one knows, even approximately. In practice you can't predict it, not with today's methodologies. The ecosystem is too complex, with huge numbers of feedback loops and linked components, many of which even individually are almost unknowable.

Sometimes the error bars swamp the value. In the end there's too much variability to say anything much quantitative. You can surmise things like "there'll be a shock if we exit", but not its detailed consequences, or even whether saying so perturbs those consequences, eg. is self-fulfilling.

What's amazing is that I'm having to say all this. I wouldn't have had to 100 years ago: there wouldn't have been any concept that such predictions could be computable. But in recent times, real-world maths and computation have in many ways been so successful for so many predictions that societally there's an assumption we can "compute anything" or always quantitatively predict with some accuracy how decisions will play out in whatever field.

"A key part of using any powerful tool—computation included—is knowing when it works and when it doesn't."

Don't get me wrong. I'm a keen advocate for computing answers, driving decision-making and optimising actions with computation; I spend much of my working life driving more use of computation; I think we're only at the start of where it can take us in increasingly broad swaths of life. Indeed, since the mid-twentieth century, I'd argue that the rise of mechanised computation continues to be the biggest underlying driver of human progress—particularly through engineering and hard science.

But a key part of using any powerful tool—computation included—is knowing when it works and when it doesn't. With no effective, general education in computational thinking, most people can easily be misled, and they are.

This educational failure is a major part of what's caused "post-truth": years of apparently precise, prominent predictions that at best overstate their accuracy and at worst are just wrong. "Experts" push numbers to assume an importance beyond their ability to inform, to the point where a sizeable fraction of our population, given no computational education to fall back on, no longer believes any logic, any number, any expert.

I remember a blind "belief in computation" starting to take hold in the 1980s, crystallised in particular for me through a conversation with a friend at school. Some early global climate predictions were starting and I was sceptical that they were right, whether over- or under-estimating the effects. He argued that if the predictions "made a good point" and garnered attention, it was fine for scientists en masse to present them simplistically, whether or not they could really justify their veracity. I argued that in the end this would backfire: if any of the widely accepted expert predictions failed, science, computation and logic would suffer huge rejection. Perhaps to needle my (Catholic) friend, I pointed to his church's insistence that it knew the sun orbited the earth in a perfect circle—and the damage this had done both in actions taken (eg. against scientists) and to its own credibility.

" 'Experts' push numbers to assume an importance...

...beyond their ability to inform."

The promulgators of predictions—politicians, campaigners and experts—certainly bear responsibility for post-truth. They get more publicity for being definitive on a "big issue" with "evidence" even if they're massively overstating the claim or its precision and damaging long-term credibility of their professions. Instead they need to be prepared to give real answers like "we don't know" or "the only prediction we know how to make assumes xxx, which is probably wrong".

But a major responsibility also lies with the public and in particular their mainstream education. They need experience and developed instinct in questioning models, science and computation: the measured scepticism that comes of working with real, messy situations and today's computational tools, manifested in ready-to-use questions for picking apart expert statements. Things like "What are your assumptions?", "Why do you believe this is the basis?", "Have you forgotten any effects?", "What happens if there's a small error in your opening assumption?" and so forth. They need to be versed in dangerous scenarios to look out for and take special care over. For example, I am often more likely to believe localised, short-term predictions than global, long-term ones: the likelihood of massive errors in the model tends to grow very sharply with time and complexity; there's often no control scenario; and it takes too long to see the effects of the prediction. That's a small example of experience I've developed.

Why isn't STEM—and specifically maths education, as the only mainstream computation subject—teaching these vital topics? They need to be central but aren't. They aren't because they can't be, given the subject's overwhelming focus on hand-calculating.

Look at today's maths curricula around the world and you'll find scant coverage of topics like these, and none with the needed modern, computer-powered analysis of messy scenarios that gives real experience.

Therefore I think a crucial step in the long journey to fixing the post-truth problem is laying out what we want from mainstream maths—which is why we've developed our new "outcomes" list as part of our computerbasedmath.org project. Here's one snippet...

[Full CBM draft outcomes list summary]

Only when our populations can't so easily be misled by maths will they re-engage with its power to persuade. This is vital individually and societally; otherwise we may start down a path of mysticism, into a new era of Unenlightenment.

Firstly, I've got to say, I really like the term.

To my mind, the overriding purpose of education is "to enrich life" (yours and your society's, not just in "riches" but in meaning), and the different ways in which you can think about ideas, challenges and opportunities seem crucial to achieving that.

Therefore using a term of the form “xxx Thinking" that cuts across boundaries but can support traditional school subjects (eg. History, English, Maths) and emphasises an approach to thinking is important to improving education.

Now we've had widespread use of the term "Critical Thinking" for some time, but to me it has much less power of actuality than "Computational Thinking".

“Computation” is a highly definitive set of methodologies—a system for getting answers from questions, and one rapidly gaining in power and applicability each year. There is no parallel, definitive, “Critic” system, and even the related “Critiquing” is a rather vague skill bucket, not a systemic—and highly successful—roadmap. As a result, Critical Thinking often becomes more of an aspiration of student capability not a definable, definite, life-enriching set of problem-solving abilities.

To be specific, I'd argue that Computational Thinking is a mode of thinking about life in which you creatively and cleverly apply a 4-step problem-solving process to ideas, challenges and opportunities you encounter to make progress with them.

Here's how it works.

You start by **defining the question** that you really want to address—a step shared with most definitions of "Critical Thinking".

But computational thinking follows this with a crucial transitional step 2 in which you take these questions and **translate them into abstract computational language**—be that code, diagrams or algorithms. This has several purposes. It means that hundreds of years' worth of figured-out concepts and tools can be brought to bear on the question (usually by computer), because you've turned the question into a form ready for this high-fidelity machinery to do its work. Another purpose of step 2 is to force a more precise definition of the question. In many cases this abstraction step is the one most demanding of conceptual understanding, creativity, experience and insight.

After abstraction comes the **computation** itself—step 3—where the question is transformed into an abstract answer—usually by a computer.

In step 4 we take this abstract answer, **interpret the results**, re-contextualising them in the scope of our original questions and sceptically verifying them.

The process rarely stops at that point because it can be applied over and over again with output informing the next input until you deem the answers sufficiently good. This might take just a minute for a simple estimation or a whole lifetime for a scientific discovery.
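To make the four steps concrete, here's a deliberately tiny worked example of my own (the question, model and numbers are illustrative, not from the post), in Python:

```python
import math

# Step 1: Define the question.
# "Roughly how many years until my savings double at 3% annual interest?"

# Step 2: Abstract into computational language.
# Model: balance(t) = balance(0) * (1 + r)**t; solve (1 + r)**t = 2 for t.
def years_to_double(rate):
    return math.log(2) / math.log(1 + rate)

# Step 3: Compute (trivial here, but the computer does the work).
t = years_to_double(0.03)

# Step 4: Interpret and verify, eg. sanity-check against the "rule of 72".
print(round(t, 1))        # ~23.4 years
print(round(72 / 3, 1))   # rule-of-72 estimate: 24.0
```

Even in so small a case, the pattern holds: the hard thinking is in steps 1 and 2, step 3 is delegated to the machine, and step 4 checks the answer against independent experience.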

"Modern technology has dramatically shifted the effective process because you don't get stuck on [the Computational Thinking] helix roadway at step 3, so you may as well zoom up more turns of the track faster."

I think it's helpful to represent this iteration as ascending a helix made up of a roadway of the 4 steps, repeating in sequence until you can declare success.

While I've emphasised the process end of computational thinking, its power of application comes from what are (today!) very human qualities of creativity and conceptual understanding. The magic is in optimising how process, computer and human can be put together to solve increasingly tough problems.

Is this process of Computational Thinking that I describe connected with maths—or even one and the same subject; and what about coding? Talking education, there is very heavy overlap with our Computer-Based Maths approach, much less with today's traditional maths education; coding is an important element, in particular as the way in which you manifest abstraction.

Real-world maths—defining it and its applications broadly, as I do—absolutely relies on Computational Thinking, but there are also specific areas of knowledge that maths is considered to contain (eg. particular concepts and algorithms), which are often important to applying computational thinking to different areas of life. Maths is a domain of factual knowledge as well as of the skills knowledge of how to process it.

"Computational Thinking is a mode of thinking about life

in which you apply a 4-step problem-solving process to ideas, challenges and opportunities you encounter"

Even in the real-world, this broad definition of the term “maths” may be alien to engineers or scientists who would consider what I’m describing simply as part of engineering or science respectively.

There's another key difference between a traditional maths way of thinking about a problem and a modern computational thinking approach, and it has to do with the cost-benefit balance across the 4 steps of the helix.

Before modern computers, step 3—computation—was very expensive because it had to be done by hand. Therefore in real life you’d try very hard to minimise the amount of computation at the expense of much more upfront deliberation in steps 1 (defining the question) and 2 (abstracting). It was a very deliberate process. Now, more often than not, you might have a much more scientific or experimental approach with a looser initial question for step 1 (like “can I find something interesting in this data”), an abstraction in step 2 to a multiplicity of computations (like “let me try plotting correlation of all the pairs of data”) because computation of step 3 is so cheap and effective you can try it lots and not worry if there’s wastage at that step. Modern technology has dramatically shifted the effective process because you don’t get stuck on your helix roadway at step 3, so you may as well zoom up more turns of the track faster.

"[Applying the Computational Thinking process] might take

just a minute for a simple estimation

or a whole lifetime for a scientific discovery."

A useful analogy is the change that digital photography has brought. Taking photos on film was relatively costly (though cheap compared with the chemical-coated glass plates it replaced). You didn't want to waste film, so you'd be more meticulous in setting up the shot before you took it. Now you may as well take the photo; it's cheap. That doesn't mean you shouldn't be careful in setting up (abstracting) the shot to get good results, but it does mean the cost of misfires, wrong exposures and so forth is lower. It also opens up new fields of ad-hoc photography to a far wider range of people. Both meticulous and ad-hoc modes can be useful; the latter has added a whole new toolset, though not always replaced the original approach.

"Real-world maths absolutely relies on Computational Thinking"

Back to maths. However we term the real-world need, whether computer-based maths or computational thinking, what's sadly all too clear is how today's mainstream educational subject in this space of "maths" isn't meeting the need. Its focus on teaching how to do step 3 by hand might have made sense when that was the sticking point in applying maths in life: if you couldn't do the calculating, you couldn't use maths or, in general, computational thinking. But primarily gaining experience in a very deliberate, meticulous, uncontextualised, pre-computer application of the computational process—rather than a faster-paced, computer-based, experimental, scientific-style use on real problems—cannot continue to be maths' primary purpose if the subject is to remain mainstream. Instead its primary purpose ought to be Computational Thinking—as it is in our CBM manifestation.

"Our aim is to build the anchor Computational Thinking

school subject as we explicitly broaden CBM

beyond being based in maths"

Like real-world maths, coding likewise relies on Computational Thinking, but again it isn't the same subject or (by most definitions) anything like a complete route to it. You need Computational Thinking to figure out how to abstract problems into code and get the computer to do what you want; coding is the art of instructing a computer what to do, the expertise needed to be the sophisticated manager of your computing technology, which includes speaking a sensible coding language, or several, to your computer.

What of other school subjects? Computational Thinking should be applicable to a very wide range. After all, it's a way of thinking—not the only way of thinking—but an important perspective across life. Whether in design (*“How can I design a streamlined cycle helmet?”*) or history (*“What was the key message each President's inaugural address delivered?”*), or music (*”How did Bach’s use of motifs change over his career?”*), every subject should envelop a Computational Thinking approach.

"The Computational Thinking approach needs

knowledge of what’s possible, experience of how you can apply it, and know-how of today’s machinery for performing it."

An important practical question is whether that can happen without a core educational subject for learning Computational Thinking itself. I don't think so, not at school level anyway. That's because the Computational Thinking approach needs knowledge of what's possible, experience of how you can apply it, and know-how of today's machinery for performing it. You need to know which concepts and tools there are to translate and abstract to in step 2. I don't think you can learn this only within other subjects; there needs to be an anchor where these modern-day basics (learnt in a contextualised way) can be fostered.

Politically, there are two primary ways to achieve this: introduce a new core subject or transform an existing one. Either is a major undertaking, with coding and maths as the only possible existing school subject contenders for the transformational route. Maths of course is ubiquitous, well resourced and occupies a big part of the curriculum—but today's subject largely misses the mark. Coding is the new kid on the block, too narrow, not fully established and with far less time or money but with a zeal to go new places.

How does CBM relate? For the very short-term, simply as the start of today's best structured program for engendering computational thinking—one that's principally around maths but applied to problems and projects from all subjects.

Ultimately our aim is to build the anchor Computational Thinking school subject as we explicitly broaden CBM beyond being based in maths and, just as importantly, beyond being seen to be based only in maths. Look out for modules of CBM geography and CBM history!

Make no mistake. Whatever the politics or naming, whoever wins or loses—some day, a core, ubiquitous school subject in the space I'm describing will emerge. The first countries, regions, schools that manage this new core and its cross-curricular application will win big time.

I'll leave my main blogpost to do the talking, but suffice it to say that I'm pleased there's a clean, powerful, modern way to put computation at the heart of the enterprise--what I call Enterprise Computation.

It's important for organisations to start to think now about how they manifest this new opportunity which will rapidly become a necessity--one driven particularly by data science.

I am also sad that I never met Seymour. I can't even say when I consciously became aware of him or of the different strands of his work. But his name has seemingly forever been familiar, cropping up with increasing regularity and force in so many of the interests I've pursued, particularly fundamentally reforming maths education (our computerbasedmath.org or CBM project). So many routes in so many areas lead back to Seymour. I can't help but notice with some wry amusement this morning how constructionist an approach I have taken to learning about Seymour's life and work!

While I never met Seymour, I have very much enjoyed getting to know so many of those he knew and worked with, from his daughter Artemis to Brian Silverman, Gary Stager to Mitch Resnick (and no doubt many others whom I know but didn't realise he directly influenced too). They have been extremely supportive of CBM and of our recent Wolfram Language and Mathematica, helping us take Seymour's and their own thinking, successes, mistakes and experience and try to shift to a much better educational world.

That this post is brief is an indication that others can much better enunciate the broad array of Seymour's achievements than can I. Indeed I am already learning more as they do.

However about one aspect I am quite clear. Seymour's death does not end any aspect I've come across of his vision. Quite to the contrary, we are instead starting to see its manifestation more than ever: a mixed computer-human approach, recasting of subjects, increasing pressure for change in education (frustratingly slow though it can seem). I expect Seymour's fame outside education circles will posthumously grow.

We will work hard to carry key strands of his work forward.

This week's issue is significance arithmetic, similar to what you might know from school as significant figures. The idea is that when you do a calculation, you compute not just a single value but also bounds that represent the uncertainty of the calculation. You can then get an idea of how accurate your answer is, or indeed whether it has any digits of accuracy at all.

How important is the concept of significant figures to applying maths? And if useful, what of the mechanics of computing the answer? Is significance truly significant in concept and calculation for today? And therefore should it be prominent in today's maths education?

Here's our surprising conclusion. Significance arithmetic should be far more significant than it is in real-life maths today, rather like its role (if not all the detail) in maths education, where it is covered fairly extensively. It would really help to get good answers, far fewer misinterpretations and a view on whether any of the numbers are justifiable. But people just aren't using it much in their maths lives.

Yes, paradoxically, I'm saying that this is a case in which traditional maths education got it right(er) and real-world maths didn't know what was good for it! Education is ahead!

So, what's gone wrong with significance out there?

Let's start with traditional notation. If you write down 3.2, it's in practice unclear whether you're precisely and justifiably saying "2 significant figures", or whether those were simply the digits that came to hand, or all you could be bothered to calculate by hand. The notation (like so much of traditional maths notation) is ambiguous, causing huge confusion to practitioners and particularly to students.

Then there's how calculators and computers usually operate. They churn out whatever their hardware "machine precision" is--often 16 or 19 digits--even when it has no justification from the input or the calculation. People either ignore the digits or, if they're transcribed, just think those quoting them are ill-educated (rather like misuse of apostrophes suggests poor education in English).

When you use significance arithmetic there are several stages that can trip you up. You have to be clear what your input precision really is. What is the error I'm inputting (eg. does 2 digits represent this)? But then the calculations you do dramatically change what significance can be justified coming out. For example, a steep slope of a function reduces relative significance coming out (eg. to 1 digit, because a little variation in the x value results in a big y change), whereas a small gradient can increase justifiable significant digits (eg. to 4 digits). That means it really matters not only which functions you're computing with, but where you are on the function.
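As a rough numerical sketch of that point (my own first-order error-propagation illustration, not Mathematica's actual significance algorithm), the justifiable output precision depends on the gradient at the point of evaluation:

```python
import math

def propagated_relative_error(f, dfdx, x, rel_err_in):
    """First-order estimate of the relative output error from a relative input error."""
    abs_err_in = abs(x) * rel_err_in
    abs_err_out = abs(dfdx(x)) * abs_err_in
    return abs_err_out / abs(f(x))

# Same function (tan), same input precision (~2 significant digits),
# evaluated at a flat point and at a steep point near pi/2:
two_digits = 0.5e-2
flat  = propagated_relative_error(math.tan, lambda x: 1 / math.cos(x) ** 2, 0.1, two_digits)
steep = propagated_relative_error(math.tan, lambda x: 1 / math.cos(x) ** 2, 1.55, two_digits)

print(f"near x=0.1 : relative error out ~ {flat:.1e}")   # roughly the input's ~0.5%
print(f"near x=1.55: relative error out ~ {steep:.1e}")  # tens of percent: ~0 digits left
```

The same 2-digit input keeps its precision on the flat part of the curve but loses essentially all of it on the steep part, which is exactly why where you are on the function matters.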

[Screen capture from an old video: comparing how the output precision of a computation can vary widely, even for a simple function, when the input precision is kept constant.]

This is messy, time-consuming and tedious to handle by hand. And yet most numerical computation software (or hardware) doesn't do significance arithmetic.

The result is predictable: significance arithmetic is usually too hard to use, so people don't bother. But rather than cutting significance out of education, this cuts it out of the real world, with the same effect that a lack of accessible computing usually has in education: in and out of education, computing significance has traditionally been much too complex to bother with.

Mathematica is an exception. Right from the beginning we've had significance (and big-number) arithmetic. We invented a notation using ` to specify input precision; all relevant functions compute significance and output only what's justified. Some (eg. for numerical differential equation solving) even step up precision sufficiently during the calculation to meet a goal you have set for output precision, assuming this isn't trumped by too low a precision of input.

We have fun demoing a collapse of significance problem in Excel v. Mathematica. At each iteration the result should be 0.2 but the precision is constantly reducing. Excel goes way out after just 20 steps with no warning.

**Excel** does not track significant digits and very quickly produces nonsensical answers. The failure of significance is easy to spot in this simple example; it would be potentially disastrous inside a more complex model.

Starting with 30 significant digits input to **Mathematica**, it tracks and displays reducing justifiable digits, until an error box alerts to a complete loss of significance.

We've picked on Excel here, but almost any numerical software will behave somewhat similarly on an example like this.
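The post doesn't state the exact recurrence used in the demo, but any iteration with an unstable fixed point at 0.2 behaves this way in ordinary double-precision floats. Here's a sketch using the hypothetical map x → 11x − 2 (fixed point 0.2; the tiny representation error of 0.2 is multiplied by 11 at every step):

```python
x = 0.2  # the stored double is ~1e-17 away from exactly 1/5
for step in range(1, 31):
    x = 11 * x - 2  # exact arithmetic would give 11 * 0.2 - 2 == 0.2 forever
    if abs(x - 0.2) > 1e-6:
        print(f"visible divergence by step {step}: x = {x}")
        break
```

Plain floats give no warning that this is happening; significance arithmetic would show the justifiable digits draining away at each step and then flag the complete loss.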

Yet if you use the right numerical tools for the job, significance arithmetic takes on its true significance.

Back to CBM, and how should significance figure in what's covered? Our view is that the failure of widespread adoption of significance in real life isn't because its worth has been superseded or mechanised away but, to the contrary, because computing (with some notable exceptions!) has yet to make it easy enough for the apparent benefit. That will change, and hopefully before our students graduate. So we're voting for more significant significance!

One final point. This is but one example of thousands across all areas of the maths curriculum that require deep rethinking for fundamental reform of the subject of maths. I hope it gives a flavour for why CBM is so hard a project to do thoroughly and why it requires so broad a view of the use of mathematics and state of computing technology.

Why? Because the world's most transformative machines have been used largely for the wrong purpose in most classrooms: automating pedagogy, not changing the subject taught.

Countries with the most attentive teaching are also likely countries where there is least pressure to computerise pedagogy for teaching today's school subjects. They do best in PISA because they are best at helping students through those subjects.

But this misses two fundamental points. Firstly, particularly around STEM and maths, computers changed the real-world subject, but they have yet to change the school subject or PISA's assessments. We're assessing largely the wrong type of maths (hand-calculating, not computer-based problem-solving) and noting that computers in the classroom haven't helped improve the results. Computers need to be used for doing the calculating, not for teaching students how to do hand-calculating. Secondly, today's computers are machines you need to get familiar with—both in operating them today and in being able to adapt as they improve. Those skills aren't being assessed.

I think the OECD should be praised for bringing up the correlation between technology usage and results, but—as we teach in computerbasedmath.org—they need to be careful to assign causality correctly. They need to move their PISA assessment quickly to a new domain of computer-based subjects to counter the problem, not suggest that the technology is redundant.

Neither characterisation is true in my view.

What really seems to have spooked people is the psychological turnaround from apparently omnipotent Chinese government, able to command and fix at will, to a government that's apparently largely as financially impotent as any other.

Haven't we seen this same "country on a pedestal" culture before? The one that saw Japan fall from grace in the 1990s, the US in the 2000s (along, in a small way, with the UK) and now China?

It's astounding that China has maintained its 10% growth rate for 30 years. But like all civilisations, it hasn't got all the answers. Nor had the US or Japan. But each of these eras did have substance, just not to the extent that everyone wanted to believe.

Is there a counterbalance to the human condition of generating stardom beyond rational views of reality?

Factual information that's accessible certainly helps. Really asking diverse questions of the data, rather than taking a few people's perceptions as gospel, can at least produce a greater variety of viewpoints and allow everyone to ask the questions they have.

But achieving this requires us to step up to the mark on computable data and on human ability to know how to question it. We need truly accessible interfaces to data and our people to have instinctive, innate ability to ask the right questions to uncover its significance.

Just my two cents (or was that yuan, or even jiao or fen?) on how we can move from today's quasi-quantitative world to one where the richness of a real quantitative approach shines through.

It's taken me a few days to realise that there were actually two very different "importance of evidence" conversations--one with which I completely concur, and one with which I vehemently disagree. In the end, what I believe this exposes is a failure of many in charge of education to understand how major innovation usually happens--whether innovation in science, technology, business or education--and how "evidence" can drive effective innovation rather than stifle it. In an age of massive real-world change, the correct and rapid reflection of this in education is crucial to future curricula, their effective deployment, and achieving optimisation for the right educational outcomes.

I'm going to call the 2 evidence utilisations "innovation-led evidence" and "evidence-led innovation".

The difference is whether you build your "product" (eg. phone, drug, curriculum) first, then test it (using those tests for iterative refinement or rejection) or whether formal evidence that exists from previous products becomes the arbiter of any new products you build.

The former--"innovation-led evidence"--is highly productive in achieving outcomes, though of course care must be taken that those outcomes represent your objectives effectively. The latter--"evidence-led innovation"--almost by definition excludes fundamental innovation, because it means only building things that past evidence said would work.

When you build something significantly new it isn't just a matter of formally assembling evidence from the past in a predictable way. A leap is needed, or several. Different insights. A new viewpoint. Often in practice these will occur from a mixture of observation, experience and what still appears to be very human-style intelligence. But wherever it comes from, it isn't straightforwardly "evidence-led".

I strongly agree with the late physicist (and friend of my brother's) Richard Feynman, who explained nicely in one of his famous 1960s Caltech lectures how the scientific process works. I could summarise it: guess, compute the consequences of your guess, then test and compare with experiment. (Film of this lecture exists--see the first minute!)

In the case of technology, "theory" is the product, in pharmaceuticals it's the drug and in education (for the most part) it's the curriculum.

"Evidence-led innovation" stifles major innovation--it locks out the guess--yet I firmly believe that that's what most of "evidence-led education" is talking about with painfully little "innovation-led evidence" applied.

I've faced this repeatedly with Computer-Based Maths. I'm asked, "Do you have evidence it works?" I sometimes answer, "Where's your evidence that today's traditional maths education works? Have you done randomised controlled trials?"

As quickly as we can build curricula, fund their development and set up projects in different countries, we are starting to gather evidence. Something that slows this down is the need for student assessments that accurately reflect the required outcomes: it's not just a matter of comparing exam results before and after; open-ended computer-based maths assessments are needed too.

One problem with the "evidence-led innovation" crowd is that they often have no idea how hard it is to build something completely new. They think you can do micro-innovations, then test, then micro-innovate then test.

Actually so far CBM is the hardest innovation I've been involved in. It's been amazing to me just how different every aspect of the maths curriculum becomes when you do not need to assume hand-calculating. Equally amazing is how deep everyone needs to dig into their own understanding to uncover those differences, particularly since those involved have learnt maths traditionally.

You might ask whether now is the time for a new maths curriculum. Can we really take the risk? As guesses go, the idea that maths education should be the same subject as maths in the real world (ie. using mechanised computation), and not the current hand-calculating proxy, is an extremely sure-footed one. Not leaping with the real world poses a very significant danger. Let's have the courage to develop and test CBM around the world, indeed more thoroughly than any maths curriculum has been tested before.


Clearly data science is a major, growing and vital field—one that's relatively new in its current incarnation. It's been born of, and is driven forward by, new technology: our abilities to collect, store, transmit and "process" ever larger quantities of data.

But "processing" has often failed to elucidate what's important in the data. We need answers, not just analytics, we need decisions not just big data.

Computation in all its forms is key to getting decisions from data. And funnily enough, it's not only for analytics that computation's used, but also for enabling human-language data interrogation, interactive deployment and much else--crucial usability, not only raw computational power.

It's to bring all these aspects together that we're hosting a one-day summit in London this Thursday entitled "Mastering your data with the [latest, most powerful!] computation", with my [ ] editorial.

Most people don't recognise Wolfram as a key data science company. And yet over the last few years, we've built up a unique and integrated technology stack, not only to offer the most powerful computation on data but to optimise usability across the whole workflow and, crucially, to be a data science platform.

We have hosted the "Wolfram Data Summit" in the US for the last 5 years. Its emphasis is a little different from our upcoming London summit's, though related: "A high-level gathering of innovators in data science, creators of connected devices, and leaders of major data repositories". In London, we'll be focussed on how new ideas can be deployed today in your organisation.

In the end, London will be a great follow-on from my kick-off talk at TEDx UK Parliament earlier this year--filling out and extending to many fields beyond democracy the question I addressed there: "has more data led to better decisions and better democracy?".

Really hope we'll see you there.

]]>Traditional areas of maths like algebra, calculus or trig don't seem a good way to think about subdividing the subject in the modern world.

You might ask, why subdivide at all?

In a sense, you shouldn't. The expert mathematician utilises whichever areas of maths help them solve the problem at hand. Breadth and ingenuity of application are often key.

But maths represents a massive body of knowledge and expertise; subdividing it helps us to think about different areas, and helps curricula focus their energies enough that students gain sufficient depth of experience at a given time to get a foothold.

However, I believe the subdivisions should be grouped by modern uses of maths, not ancient divisions of tools.

So here goes with our 5 major areas:

- Data Science (everything data, incorporating but expanding statistics and probability)
- Geometry (an ancient subject, but highly relevant today)
- Information Theory (everything signals, whether images or sound--right name for the area?)
- Modelling (techniques for good application of maths to real-world problems)
- Architecture of Maths (understanding the coherence of maths that builds its power; closely related to coding)

Comments welcome!

]]>I believe PISA is meticulous in conducting its tests and reflects a good evaluation of standards of today's maths education. And yet I think if countries like the UK simply try to climb up today's PISA assessment, they'd be doing the wrong thing.

The playing field of today's maths education is restricted to manual calculating procedures allied to the limited problem-solving that they can support. Today's mainstream real-world maths is much broader: applying the process of maths--using the best computational mechanisation--to much harder problems. The skills it requires are rather different, but if anything more conceptual, more intellectual and definitely more creative.

That's a playing field on which Brits and the like could do relatively much better than on the playing field of procedural hand-calculating. It's a playing field on which drilling kids for hours a day on their algebra isn't going to win.

Now let's be clear. I'm not saying that that's universally what's happening in Asia. In fact there's great innovation in the process of schooling and particularly the learning of maths in the region (famously Singapore). Nor am I in any way writing off Asian problem-solving ability, which I think, correctly and creatively trained, could be top-notch too. What I am saying is that if Brits really put their minds to modern computer-based maths, they are just as able to compete with their Asian counterparts--whereas I don't think culturally we will do so well at drilling the needlessly pre-abstracted and often irrelevant current subject. Non-conformity, creativeness and looking around the rules are key to British (and many other Western) cultures and a great competitive strength if tethered appropriately--the opposite of the cultural imperatives present in many of the countries performing well in today's maths PISA test, countries that may struggle to imbue such characteristics.

And crucially, it's many of these abilities and the computer-based maths subject that we desperately need in the workplace and in life--not, for the most part, the hand-calculating procedures we're largely failing to succeed at anyway. (My recent talk opening the CBM summit at UNICEF details the argument.)

A central question in all this: precisely what outcomes do we wish for our students after their years of maths study? It's a question we have been addressing from first principles in formulating CBM, unencumbered by the constraints PISA necessarily has of not going too far ahead of today's curriculum and needing accurate quantitative assessment of it. For the brave, here is an early (hard-to-digest) draft which spans 10 dimensions. I won't detail all the ideas here, but point out the importance of confidence, knowing how to operationally manage the application of maths, and understanding the separation between maths concepts (like significance) and the use of a wide variety of specific tools (like a hypothesis test).

Intelligently ranking countries as PISA does is very helpful in pushing progress in education, because succeeding at today's maths education--or tomorrow's computer-based variety--needs well-directed effort, focus and competition. But in the end, however well education is delivered, it must deliver the right subject.

Notice that our first CBM country, Estonia, is already high on PISA. They recognise that despite their achievements, they need to lead the change in maths. Actually, many of the countries near the top of today's rankings have been most active in pursuing the CBM approach.

Now the UK is doing well with Estonia in leading the coding education agenda. But why oh why does the UK government choose to separate coding in primary education from maths, with which it should be so intertwined (as the US has also done)? They need to be closely associated, as I pointed out last year. And it's particularly galling that they're not in the country where a mathematician invented the computer...

Playing badly on the wrong field is hardly smart. As the playing field shifts, let's lead the change, not be laggards at a game we can succeed in.

]]>This really has at least 4 dimensions of consequence:

Firstly, it's a unique way to excite students about maths by marrying it up with coding. Coders will be able to use the power of Mathematica's maths out of the box, not only enriching what they can do but also showing off the power and importance of maths. Attaching maths to something already enjoyable, to make it better and more enjoyable still, will I think be very encouraging for learning more maths. And you never know, politicians and policy-makers might even start to see the connection between coding, maths and fun--rather as I outlined in an earlier blogpost.

Secondly, it's cheap. For $25 + some bits and pieces, you can be up and running. One reason I was excited to announce this today is that we're being hosted in UNICEF's building for our summit, and I think we'll have a great solution for maths, coding and CBM in developing countries.

Thirdly, this is the first pass of the Wolfram Language. For years it's been lurking under the umbrella of Mathematica--a key aspect not only of our technology stack but of its framework, even of our symbolic way of thinking about structuring ideas. And because the Wolfram Language is multi-paradigm, it's a great early language to learn: it avoids students getting into thinking of everything as best expressed in one structure or another. This all complements Raspberry Pi and its goals very well, so it's nice that our first manifestation of the Wolfram Language is there. Others will follow.

Fourthly, it's simply amazing that Mathematica and the Wolfram Language can run on something as small and cheap as Raspberry Pi. Yes, by modern desktop PC standards it can be a little clunky, but functionally it's all there--all the thousands of functions (even including my show-off special function HypergeometricPFQRegularized[ ]!). One further consequence: because Raspberry Pi is small and cheap enough to act as an embedded computer, we have for the first time a quick-to-deploy yet full-power embedded solution.

Really looking forward to seeing what the world's students (and their tinkering parents!) come up with using this new super-combo, and how it can help to drive CBM forward.

P.S. This rather completes our fruity announcements for the moment--from Apple to Blackberry to Raspberry Pi (though not, as my daughter keeps calling it, the Apple Pi).

]]>So I am very pleased that we're able to collaborate with UNICEF on our 3rd CBM summit, holding it at their headquarters in New York City on November 21-22.

That collaboration means a few things. Firstly, it demonstrates UNICEF's recognition of maths as crucial to improving the lives of all children, particularly in the sort of developing countries in which UNICEF's role is central. Great credit to UNICEF and Chris Fabian (their Innovation unit chief) for being so proactive in setting this up. Secondly, it will broaden horizons on CBM by bringing new groups into the action-plan, shaping the outcomes we're trying to achieve and the reality of deployment in many different environments.

I am really looking forward to this summit and also how it will push us to get some "in gestation" projects ready. Look out for a new **visualisation of the maths process**, what's currently a **10-dimensional outcome tree** and demos of draft **Estonian CBM modules** amongst many outside contributions.

This promises to be a unique gathering for fixing the world's maths education--not to mention your country's, state's or industry's. Policy-makers and key maths education voices: please come! Or suggest who should :-).

]]>As I understood it, his central point was that practising hand calculations is akin to practising music pieces--it's simply the way to learn to play. There was also some attempt to draw an analogy between listening to music and CBM, with playing the analogue of traditional hand-calculating maths.

I think music education can teach us quite a bit but believe his analysis and conclusions were wrong.

We need to start from outcomes. What do we hope to achieve from people learning maths and music?

For most people, music is enriching. And for some, generating that music adds enrichment. For a few, it may even be financially enriching too, if they become professional. But I don't think that latter case is why most people study music.

The objective of learning an instrument is to play music. And practising playing music is a direct requirement for achieving that. It usually starts very early--as soon as you can string notes together, you're off trying to practise simple pieces. You are also supposed to practise scales and arpeggios. In my case I wasn't very punctilious at scales, primarily because I didn't see the point. If it had been explained that getting really good at the Eb major scale would aid my playing of an Eb major Haydn piano sonata, I would have been much more interested. In fact no association was made between the scale being practised and the key of the piece I was trying to play.

Back to maths. My argument for CBM is that practising hand-calculating doesn't relate to the real-world outcomes in any direct way. It's not akin to practising a piece of music, because the real-world outcome is disconnected. In fact my adversary in the debate agreed completely with my analysis of real-world maths: that it's computer-based. He just believed practising hand-calculating was the way to get there. I don't. In fact, for all the reasons I've gone into before, I think it's detrimental--for a start because it de-prioritises much more important, much more real-world, outcome-connected material.

Far from just learning that practice is important, we should learn from music education that repeated practice or experience of the actual outcomes (in maths--real-world problem-solving) is vital. CBM aims to do just that.

We shouldn't forget that one big difference between music and maths is compulsion. For the most part you only learn an instrument if you (and/or your parents) want to. Everyone is made to learn maths. In music, if you want to play pieces, you need to practise them; that motivates you. In maths, if you have no idea why you'd learn it and can't see an outcome you're interested in, why would you practise? And in fact the practice prescribed is largely dissociated from the outcomes you'll face; so you'd have a point!

There's something else music can teach us--about how assessment works. (Lord) Jim Knight pointed this out to me. At least in the UK, you take a "Grade" exam when *you're* ready, not along with everyone else whatever your level. The exams are closely tied to the outcomes, mainly playing pieces live to examiners. There's some sight-reading (you'll need that if you want to learn new things), some scales and some questions on listening to music. Most people do well in the exams because they're ready, yet the exams are still highly regarded, not dumbed down.

Why not adopt this sort of model in maths?

]]>I'm very excited to announce that computerbasedmath.org has found the first country ready for our completely new kind of maths education: it's Estonia. (...and here's the press release).

I thought Estonia could be first. They are very active in using technology (the first to publish cabinet decisions immediately online, the first to include programming in their mainstream curriculum), have the ambition to improve their (already well-respected) STEM aptitude, and lack the dogma and resistance to change of many larger countries. There aren't so many countries with all those characteristics.

In our first Estonia project we will work with them to rewrite key years of school probability and statistics from scratch. This is an area that's just crazy to do without a computer--even harmful. It's an area that's only come to the fore since computers, because it only makes sense with lots of data. No-one in real life does these hand analyses or works with only 5 data points, so why do we make our students? Why get students emulating what computers do so much better (computing) rather than concentrating on the imaginative thinking, analysis and problem-solving that students ought to be able to do so much better even than today's computers?

Worse, in a subject like probability and statistics, current maths education often forces you to learn the wrong tools for the job.

Take the Normal Distribution--one of very few options taught to students for data analysis. Approximating your data with it rarely gets you the most accurate solution; it can be wildly wrong. Instead why not learn to select the best of 100+ other distributions and test their predictions against each other? Or why use a distribution at all, when you can work out results directly from each and every data point?

The reason is historical. Normal distributions (and Poissons) are the easiest to calculate and appear appropriate in over-simplified problems; by hand, you can't practically compare lots of distributions or work directly with the data. But with a computer you can--and you should.
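As an illustrative sketch of what "let the data pick the distribution" can look like on a computer (Python's scipy used here purely for familiarity--this is not part of any CBM material, and the data is synthetic), fit several candidates and rank them by likelihood:

```python
# Sketch: fit several candidate distributions to a data sample and rank
# them by log-likelihood, rather than assuming a Normal from the outset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=0.8, size=500)  # a skewed sample

candidates = [
    ("normal", stats.norm, {}),
    ("lognormal", stats.lognorm, {"floc": 0}),  # fix location for stability
    ("gamma", stats.gamma, {"floc": 0}),
]

scores = {}
for name, dist, kwargs in candidates:
    params = dist.fit(data, **kwargs)          # maximum-likelihood fit
    scores[name] = np.sum(dist.logpdf(data, *params))

best = max(scores, key=scores.get)
print(best)  # the skewed sample is described far better by a non-Normal fit
```

With dozens more candidate families the loop is unchanged--exactly the comparison that is impractical by hand.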

Out in the real world, there are real consequences to drilling students in de- or mis-contextualised techniques--and in the idea that each school problem has one right technique, and each technique particular patterns of problem. Take the miscalculation of large swathes of financial risk analysis: people applied Normal distributions because they knew of them and had been trained to expect them, but that didn't make them effective representations of the data.

This reminds me of an old adage: "If all you have is a hammer, everything looks like a nail". In the maths context, the fact that every school problem can be solved with a small subset of maths tools leads to a false expectation that in the real world this same subset will suffice. CBM broadens the toolkit dramatically by not insisting students make all the tools that they use, freeing time for using more of them in a wider variety of situations.

Estonia is the first place where we're starting to change all of this, though many other countries have voiced interest in pursuing CBM and being within the first group.

But it's slow work on several fronts for a little while. Even though several of us have been thinking along CBM lines for ages, we're constantly questioning whether a particular way of thinking or doing is in fact now the best way, or simply a legacy of the pre-computer era. Indeed, what outcomes are we trying to achieve, and how does learning the tools of maths fit with learning how to solve problems?

And education changes slowly, though now is the most vibrant and exciting time of change in my lifetime. Even with this, I expect it to be a couple of decades until the world's mainstream maths subject is universally computer-based maths rather than today's "history of hand-calculating". But today is an important step.

Countries that start the change early will reap many benefits from being first--a bit like the changes that universal education brought to countries that were first, but in microcosm for maths.

In fact it's more of a macrocosm. It affects lucrative problem-solving STEM jobs where pushing the boundaries of modeling is crucial to success. But it can make happier citizens too--able to assess risk, understand complex finances better, have an in-built mathematical 6th sense by which to understand life.

]]>But where does programming fit with ICT, computer science and maths? How central a subject is it?

What's termed ICT seems to be "how to operate your computer...or generic applications on it...or even past computing forms like calculators". Frankly, children are often good at operating the latest tech--usually better than their teachers. Primary schools need to help, verifying that children can do basic operations and offering remedial, individual help if not, but this "operating your computer" should not be a subject per se and is far from programming in subject-matter and required skillset.

What about computer science? It's the specialist subject of how you optimise programs and programming, build large-scale software or even design new programming languages. Important though this is, attaching programming only to CS is too narrow a viewpoint.

Instead, programming is much more fundamental to STEM: it's the way you communicate technical ideas and processes in the modern world. It's as central as that.

You can view it as a superset evolution of mathematical notation, far more general and with the immediate consequence of machine-computable results. Programs are the way you write down maths.

And so I believe programming is an integral, core part of maths education. It's the hand-writing of technical ideas, and just as hand-writing is in the early years attached to learning English (if you're in England!), so core, basic programming should be attached to maths.

To be clear, I'm not talking calligraphy, but basic hand-writing. Calligraphy is the CS end--the subject in which you study programming in its own right, its nuances, its detailed optimisation. Hand-writing is the basic tool, to let everyone communicate. Just as hand-writing is applied more generally than in English alone, programming is applicable more generally than today's perception of maths' applicability in schools (though not than maths' actual utility). Whether in geography, economics or science, technical problem-solving needs maths, and the way you write down and do anything but trivial arithmetic is with programming.

I'm not knocking the new efforts with programming. Far from it. I'm all for getting programming into education under whatever guise is easiest. If making ICT "rigorous" is the politically expedient way, starting there is fine so long as we recognise it as just the start.

It would be folly indeed if in the very country where a mathematician invented the computer and effectively the concept of programming, we should fail to see the crucial integration of programming with maths education.

(Perhaps if Alan Turing had lived longer, computer science would have been generally considered a part of maths, not a separate discipline--just like mechanics or statistics usually are today).

]]>It's amazing how little cross-pollination there is between computational areas. Each area has largely had its own systems, with their own lingo and customs, and only the types of computation with which it has become familiar.

We can show a group of financial engineers a simple demo of graph layout of stock correlations and they are impressed. Well, we do have a very nice implementation, but the algorithms are well established and standard fare in areas like social network analysis.
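To make the idea behind such a demo concrete, here's a minimal sketch (plain Python with numpy and synthetic returns--not our Mathematica implementation): correlate return series, then link stocks whose correlation crosses a threshold, producing the edge list a graph-layout algorithm would then draw.

```python
# Sketch: build a stock-correlation graph from (synthetic) daily returns.
# Stocks sharing a market factor end up correlated, hence connected.
import numpy as np

rng = np.random.default_rng(1)
n_stocks, n_days = 6, 250
market = rng.normal(size=n_days)                        # shared market factor
returns = market + rng.normal(size=(n_stocks, n_days))  # factor + noise

corr = np.corrcoef(returns)                             # pairwise correlations

threshold = 0.3
edges = [(i, j)
         for i in range(n_stocks)
         for j in range(i + 1, n_stocks)
         if corr[i, j] > threshold]

print(f"{len(edges)} correlated pairs above {threshold}")
```

The social-network-analysis point is exactly this: the edge list is the same object whether the nodes are stocks or people.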

Finance is clearly an area where the analytics needs rebuilding, particularly for risk. In truth, it's a mixture of questionable analysis and antiquated reporting. So it's not just straight computation we're talking about here either. It's high-level language, instant interactive reporting and linguistic interfaces, to name a few. But what it really needs is the coherence of an all-in-one system with intelligent automation that builds trust.

This is just the start of taking *Mathematica* technology and doing much deeper deployment in finance and other, different verticals.

When governments talk maths, they seem intent on conflating hand-calculating with rigour, rigour with understanding, calculating with numeracy, maths with calculating, and rote-procedure learning with the vital conceptual and intellectual requirements of today's real-world maths.

I read the response letter first. It fitted this mould rather too well.

Then I scanned the curriculum itself. It seemed much better. I agree with many problem-solving aspirations and indeed many of the skills cited. I like its not-too-prescriptive approach, as I understand it, giving leeway for lots of different ways to achieve the teaching outcomes including (though this is not specifically cited) basing it on technology. Yes, I'd like this to be much more radical and programming to be included as a core skill, but I understand the difficulty of hard-coding it at this stage. I also understand why there's little reference to technology on the basis that its use isn't an outcome but a highly appropriate (I'd argue essential) tool to reach the outcome--outcomes which I think could have been bolder if computers were the default assumption for calculating.

Where my support starts to diverge is at procedures for multiplying fractions (when did you last use these formally, eg. 3/16 x 7/8?), and there's a gaping chasm by the time we get to long-division (ever need to use that?).

Not only are these examples mechanics-led outcomes, not problem-centric (in the end it's problems that maths is there to solve, not its own mechanics), but the mechanics in question is in practice obsolete: it's not in use in the real world, nor do I believe it empowers understanding that is.
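Indeed the fraction example above is, to a machine, a one-liner--Python's standard library shown here purely as an illustration of how completely the mechanics is automated:

```python
# The "multiplying fractions" drill, done exactly by the machine.
from fractions import Fraction

print(Fraction(3, 16) * Fraction(7, 8))  # 21/128
```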

This saps students' time, energy and motivation. But I'm concerned about a far more serious problem: the lowly government portrayal of maths.

Should long-division or learning your times tables really be portrayed as the pinnacle of achievement in maths at primary school? Worse still, why imply that those tedious procedures are what maths is primarily about?

This is about the worst maths marketing you can do to prospective students--and in the long term to parents. Perhaps it's a good short-term vote-winner for some, like brands that consistently run special offers to improve sales short-term, but it's not a good long-term strategy for building a quality, aspirational image of maths in our society. It's using long-division as a badge of honour for what the government calls rigour when in fact it's a prime example of mindless manual processing.

And more than ever, it presents a broadening chasm between government's view of maths and the real-world subject.

The nub of real maths isn't rote-learning procedures nor does it depend upon them. It's not calculating, but the highly challenging mathematising of ever more complex situations for a computer to calculate, de-mathematising the results and validating their worth. It's creative, applied, powers some of the most successful ideas and developments of recent centuries and can even be fun and engaging!

A useful analogy is with survival skills. In the past your life would depend on rubbing sticks together to make a fire. Now those aren't likely to be life or death. Instead basic survival is how to cross the road or handle money. What are today's maths survival skills? What's at the pinnacle of today's maths?

Instead of rote-learning long-division procedures, let's get students applying the power of calculus, picking holes in government statistics, designing a traffic system or cracking secret codes (so topical this month with Alan Turing's anniversary and his computer-based code breaking). All are possible; all train creativity and conceptual understanding, and all have practical results. But they need computers to do most of the calculating--just like we do in the real world.

Examples from the Wolfram Demonstrations site.

I hope these sorts of examples will all be encouraged under this new curriculum, and crucially that the assessments will highly value the skills they require, utilising computers so problems can be harder, more realistic and far more engaging.

One country will take this computer-based approach first and leapfrog others' technical education. This change will happen; the question is when, not if. I worry that the UK won't be at the leading edge of it--but in so many ways it still could be.

Meanwhile, I won't be much help with my daughter's long-division homework. I've actually never learnt how to do long-division; I don't think it's disadvantaged me one iota.

P.S. Paradoxically, I would include times tables in a curriculum: they're still useful. Surrounded as I am by computers, I still find mental hand-estimating helps me make quick judgements on information, and I often do approximate multiplications to achieve that. Of course I could get my computer to do this--but for the moment, it's that bit slower. I do not think times tables give one valuable inherent understanding, but they are useful. Long-division doesn't, and isn't.

]]>It was great to welcome David Cameron, the British Prime Minister, to officially open our new Wolfram Centre in Oxfordshire, UK, today.

Rather than a traditional plaque unveiling, we went virtual: an iPad button wirelessly firing off a sequence on a nearby TV, the ending "plaque" presenting live data captured at the moment of unveiling--the current weather, FTSE level, star chart and even the PM's age of 16562 days.
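For the curious, the age-in-days figure is itself a tiny live computation. A sketch using Python's datetime (the PM's birthdate, 9 October 1966, is public record; the unveiling date below is simply inferred from the 16562-day figure, not quoted from anywhere):

```python
# Recomputing the plaque's "age in days" figure from a birthdate.
from datetime import date, timedelta

birthdate = date(1966, 10, 9)   # David Cameron's birthdate (public record)
age_days = 16562                # the figure shown on the plaque
inferred_unveiling = birthdate + timedelta(days=age_days)
print(inferred_unveiling)       # 2012-02-12, the date the figure implies
```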

More seriously, we talked about two topics I believe are key to Britain's hi-tech role: making government data truly accessible (to citizens and government(!) alike) and resetting maths education to be computer-based--both more conceptual and more practical.

It's interesting how much the first chimed with the PM's 2010 TED talk about people empowerment in a "post bureaucratic age". It was fun showing how Wolfram|Alpha queries and interactive CDF could serve this agenda (including through Siri), and how the problem-centred approach of computerbasedmath.org might give the UK an opportunity to leapfrog other countries in STEM.

It's clear that the PM is keen to see Britain as a bold new tech and information hub, able to punch above its weight in reshaping the value-chain of knowledge, or what I've described before as the "computational knowledge economy".

In our unusual kind of way, I believe we can contribute unique facets to driving this agenda.

]]>

I really like the badge our team came up with for computerbasedmath.org. If you stand on the power and automation of computers, you really can reach to infinity! Maths has been truly aspirational for world development, and so it can be for each individual too.

Our next challenge: make a 3D printout.

P.S. If you like our plans why not show your support by adding this to your website? Available here.

]]>