From the NYT, 28 Aug 2015:

# Monthly Archives: August 2015

# Latest speech by Nick Gibb

Nick Gibb speaks at the Researchers in Schools celebration event, 25 August 2015.

What follows are paragraphs from the text containing the words **maths** or **mathematics**.

The Researchers in Schools programme prioritises recruiting teachers in STEM subjects, in particular mathematics and physics. Nobody needs reminding that British employers face ongoing skills shortages in these areas.

One in 10 state schools have no pupils progressing to either further maths or physics at A level, and 1 in 3 physics teachers have themselves not studied the subject beyond A level.

# The Quadratic Formula in Malta’s Learning Outcomes Framework

What I see as a deficiency of the Learning Outcomes Framework is that it does not specify learning outcomes in a usable way.

There are several references to quadratic equations in Levels 8–10, for example

Level 8

Number – Numerical calculations

18. I can solve quadratic equations by factorisation and by using the formula.

If a student from Malta comes to my university (and I believe I have had students from Malta in the past), I want to know his/her level of understanding of the Quadratic Formula.

There are at least 7 levels of students’ competencies here, expressed by some sample quadratic equations:

(a) x² – 3x + 2 = 0

(b) x² – 1 = 0

(c) x² – 2x + 1 = 0

(d) x² + √2·x – 1 = 0

(e) x² + x – √2 = 0

(f) x² + 1 = 0

(g) x² + √2·x + 1 = 0

These quadratic equations are chosen and listed according to their increasing degree of conceptual difficulty: (a) is straightforward, (b) has a missing coefficient (a serious obstacle for many students), (c) has multiple roots, (d) involves a surd, but no nested surds in the solution, (e) has nested surds in the answer, (f) has complex roots, although very innocuous ones, and (g) has trickier complex roots. Of course, another list can be made, with approximately the same gradation of conceptual difficulty.
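For concreteness, the whole list can be checked with a few lines of Python (a sketch for illustration, not part of any curriculum document; the function name is mine). Using `cmath.sqrt` keeps the complex cases (f) and (g) within reach of the same formula:

```python
import cmath

def quadratic_roots(a, b, c):
    """Roots of a*x^2 + b*x + c = 0 via the quadratic formula (complex-safe)."""
    d = cmath.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

# The sample equations (a)-(g), as (a, b, c) coefficient triples.
examples = {
    "(a)": (1, -3, 2),
    "(b)": (1, 0, -1),
    "(c)": (1, -2, 1),
    "(d)": (1, 2 ** 0.5, -1),
    "(e)": (1, 1, -(2 ** 0.5)),
    "(f)": (1, 0, 1),
    "(g)": (1, 2 ** 0.5, 1),
}
for label, (a, b, c) in examples.items():
    print(label, quadratic_roots(a, b, c))
```

Running it makes the gradation visible at a glance: real rational roots for (a)–(c), surds for (d) and (e), and purely or properly complex roots for (f) and (g).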

I would expect my potential students to be at least at level (d); but the LOF tells me nothing about what I should expect from a student from Malta.

And one more comment: a comparison of the statements in the LOF Level 10:

I can solve quadratic equations by completing a square

and in the LOF Level 8:

I can solve quadratic equations by factorisation and by using the formula.

appears to suggest that at Level 8 the Quadratic Formula is introduced to students without proof or proper propaedeutics, which appear only at Level 10. In my opinion, this should raise concerns: at Level 8, this approach has the potential to degenerate into one of those “rote teaching” practices that make children hate mathematics for the rest of their lives.

# Square Root of Kids’ Math Anxiety: Their Parents’ Help

By Jan Hoffman, NYT Blogs:

A common impairment with lifelong consequences turns out to be highly contagious between parent and child, a new study shows.

The impairment? Math anxiety.

Means of transmission? Homework help.

# The use of the term ‘Expected Frequency’

The June 2015 GCSE Subject Level Conditions and Requirements for Mathematics includes (P3)

“relate relative expected frequencies to theoretical probability, using appropriate language and the 0 – 1 probability scale”

and this leads to questions like

“If you rolled a die 600 times, how many sixes would you expect to get?”

which is taken from the CIMT MEP Pupil’s textbook on probability, and is given the answer

‘You would expect to get a 6 in 1/6 of the cases, so 100 sixes’.

This seems a confusing and misleading term. What exactly is an ‘expected frequency’? The obvious meaning is the frequency that you expect. But we are trying to support the concept of a random variable, with the ideas that a random variable is unpredictable in terms of value, that its values do not form patterns or sequences, and that they can only be forecast in some general ways.

If you roll a die 600 times, I do not expect any value for the number of sixes. That is the most significant aspect of a random variable.

The implied sub-text is that

Expected frequency = probability × number of trials

So that, for example, if we toss a fair coin 100 times, what is the expected frequency of heads? Well, 50. So does that mean we expect to get 50 heads? This is a Bernoulli trial, and the probability of getting precisely 50 heads in 100 tosses is about 0.08. So we would need to say to a pupil

‘The expected frequency is 50; but it is unlikely that you would get 50 heads’

which hardly makes sense.

The probability of 51 is about 0.078, and of 52 about 0.074. So, of course, 50 is the *most likely* frequency.
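The figures above are just binomial probabilities, P(X = k) = C(100, k)(1/2)¹⁰⁰; a few lines of Python (a sketch for checking, using only the standard library) reproduce them:

```python
from math import comb

def binom_pmf(k, n=100, p=0.5):
    """Probability of exactly k successes in n independent trials with success probability p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probabilities of exactly 50, 51, 52 heads in 100 tosses of a fair coin.
for k in (50, 51, 52):
    print(k, round(binom_pmf(k), 3))
```

This confirms the point: 50 is the single most likely count, yet its probability is only about 0.08, so the one value you would not be surprised to miss is the ‘expected’ one.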

The phrase ‘most likely frequency’ is straightforward, makes sense, and says what it means, unlike ‘expected frequency’.

Please can we stop using the phrase ‘expected frequency’?

# Malta’s Learning Outcomes Framework: a Discussion

Malta’s new Learning Outcomes Framework for school mathematics is an important case study of the European Union’s approaches to the implementation of its education policies in member countries. For that reason the Framework deserves close attention.

The original post of 13 August generated more responses than anticipated, and it is useful to collect them all on a single page.

- A. Borovik, The Quadratic Formula in Malta’s Learning Outcomes Framework, 26 August 2015
- A. Borovik, The Great Mystery of Malta’s Learning Outcomes Framework, 23 August 2015, updated 24 August 2015
- V. Gutev, Outcome Based Education, 13 August 2015 (+ 2 comments)
- J. Lauri, Response to “Malta: new Learning Outcomes Framework”, 12 August 2015 (+ 3 comments)
- A. Borovik, Malta: new “Learning Outcomes Framework”, 7 August 2015 (+ 8 comments)

# The Great Mystery of Malta’s Learning Outcomes Framework

**Important update below: it is no longer a mystery.**

Malta’s new Learning Outcomes Framework is an important case study of the European Union’s approaches to the implementation of its education policies in member countries. For that reason the Framework deserves close attention.

An attempt to study the official website

http://www.schoolslearningoutcomes.edu.mt/en/pages/about-the-framework

immediately leads to a question:

Who had actually developed the Framework?

According to Wikipedia, the population of Malta is about 445,000. A comparison with the City of Manchester (about 514,000) makes clear that development of the Framework is a job beyond the capabilities of a small nation.

So external consultants were hired: institutions or companies from an English-speaking part of Europe. Taking traditional cultural connections into consideration, that part of Europe is likely to be the UK.

**Added 24 August 2015:** Indeed, I could not locate the contractors’ names using an advanced Google search on gov.mt, but serendipitously discovered their logos in the document Joint Venture Presentation dated 28 Jan 2015:

Outlook Coop is a company in Malta specialising in project management, with expertise in EU-funded projects.

East Cost Education Ltd is a small private company based in Northumbria whose specialism, judging by their website, is concentrated mostly in vocational education and training. In recent years they have worked in Malta on several vocational training projects.

Institute of Education, London, is

the world’s leading centre for education and applied social science.

# The Inspection Paradox is Everywhere

From a brilliant blog by Allen Downey:

The inspection paradox is a common source of confusion, an occasional source of error, and an opportunity for clever experimental design. Most people are unaware of it, but like the cue marks that appear in movies to signal reel changes, once you notice it, you can’t stop seeing it.

A common example is the apparent paradox of class sizes. Suppose you ask college students how big their classes are and average the responses. The result might be 56. But if you ask the school for the average class size, they might say 31. It sounds like someone is lying, but they could both be right.

The problem is that when you survey students, you oversample large classes. If there are 10 students in a class, you have 10 chances to sample that class. If there are 100 students, you have 100 chances. In general, if the class size is x, it will be overrepresented in the sample by a factor of x.

That’s not necessarily a mistake. If you want to quantify student experience, the average across students might be a more meaningful statistic than the average across classes. But you have to be clear about what you are measuring and how you report it.
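Downey’s class-size example is easy to reproduce. Here is a minimal sketch (the class list is invented for illustration) computing the two averages that appear to contradict each other:

```python
# Hypothetical list of class sizes at a school.
class_sizes = [10, 10, 10, 100]

# Average across classes: what the school would report.
per_class = sum(class_sizes) / len(class_sizes)

# Average across students: a class of size s is reported by s students,
# so each size s is weighted by s itself.
per_student = sum(s * s for s in class_sizes) / sum(class_sizes)

print(per_class, per_student)
```

With these numbers the school reports an average class of 32.5, while the average student sits in a class of about 79: both figures are correct, but they answer different questions.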

# Outcome Based Education

In 1990, South Africa adopted Outcome Based Education (OBE) as its preferred educational paradigm and designed Curriculum 2005. The South African Department of Education was strongly influenced by William Spady, an American proponent of OBE, who visited South Africa as a consultant on the issue. The National Qualification Framework went into effect in 1997 with great expectations, but these expectations were not met. It became evident even to the most vocal OBE proponents that the approach inculcated skills not conducive to pursuing any university education in mathematics and science. Since then the curriculum has undergone several corrections, and is now at the stage of Schooling 2025. Meanwhile, William Spady distanced himself from the South African version of OBE, describing it as a professional embarrassment:

“So now, with a decade of confusion about OBE behind us, I would encourage my South African colleagues to stop referring to OBE in any form. It never existed in 1997, and has only faded farther from the scene since. The real issue facing the country is to mobilize behind educational practice that is sound and makes a significant difference in the lives of ALL South African learners. Empty labels and flowery rhetoric are no longer needed; but principled thinking and constructive action are.”

Educational experts may argue whether it was Outcome Based Education or some kind of Education Based on Outcomes. They may further argue about the terminology, but the fact remains that it was supposed to be transformational OBE. A close look at the South African mathematics curriculum reveals that it is not so different from the proposed new Learning Outcomes Framework (LOF) for school mathematics in Malta, and in some aspects is even better. What is completely identical in both, however, is the educational utopia of outcomes coming from nowhere.

Essential mathematical skills are not just about a computational answer, for it is not the answer that is of the greatest importance to school children’s mathematical development; rather, it is children’s ability to apprehend mathematics as a conceptual system. Many education systems emphasise this; here is an excerpt from the Secondary Mathematics Syllabuses in Singapore:

“Although students should become competent in the various mathematical skills, over-emphasising procedural skills without understanding the underlying mathematical principles should be avoided… Students should develop and explore the mathematics ideas in depth, and see that mathematics is an integrated whole, not merely isolated pieces of knowledge.”

Unfortunately, in Malta’s case the design falls far short of such goals. Here is an example from level 5:

(COGNITIVE LEARNING) 16. I understand that multiplication is repeated addition.

Accordingly, a factor can only be added to itself a counting number of times. In Singapore’s Primary Mathematics Syllabus, multiplication and division are conceptualised gradually, and at that level area and various square units are already introduced. In contrast, square units are not present in Malta’s LOF for school mathematics. In fact, the proposed LOF is teeming with conceptual deficiencies. For instance, there is some confusion between “equation” and “function”: equations are never related to unknown variables, while functions are assumed to be somehow equations between the variables “*x*” and “*y*”. Radian measure is absent, yet learners are supposed to “plot graphs of trigonometric functions”.
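The limitation of the “repeated addition” picture can be made concrete in a short sketch (Python used purely for illustration; the function name is hypothetical). Multiplication defined this way accepts only a counting-number multiplier, so a product such as 2.5 × 1.5 is simply outside its scope:

```python
def mul_by_repeated_addition(x, n):
    """Multiplication as repeated addition: defined only for a counting-number multiplier n."""
    if not (isinstance(n, int) and n >= 0):
        raise ValueError("repeated addition needs a counting-number multiplier")
    total = 0
    for _ in range(n):
        total += x
    return total

print(mul_by_repeated_addition(2.5, 3))   # fine: 2.5 + 2.5 + 2.5
# mul_by_repeated_addition(2.5, 1.5) would raise: the model cannot express it
```

This is exactly the conceptual ceiling the Level 5 outcome builds in: a learner who “understands that multiplication is repeated addition” has no route from it to products of two fractions, let alone to areas.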

Perhaps, Malta can learn from Singapore’s remarkable success since independence and the policies underlying its achievements in mathematical education.

# Parents’ Math Anxiety Can Undermine Children’s Math Achievement

From: news@psychologicalscience.org

**For Immediate Release**

If the thought of a math test makes you break out in a cold sweat, Mom or Dad may be partly to blame, according to new research published in *Psychological Science*, a journal of the Association for Psychological Science.

A team of researchers led by University of Chicago psychological scientists Sian Beilock and Susan Levine found that children of math-anxious parents learned less math over the school year and were more likely to be math-anxious themselves—but only when these parents provided frequent help on the child’s math homework.