The debate over the future of work, long running in research circles, was kicked into public consciousness by, among other things, an Oxford University study titled ‘The Future of Employment: How Susceptible Are Jobs to Computerisation?’, which suggested that over 40 per cent of jobs are at threat from technology in the next 11 years. In truth there is little agreement among economists and labour market specialists. Some claim technology is leading to more jobs, some that it is destroying jobs, and still others that it is neutral. Some claim technology is leading to jobs being deskilled; others claim the reverse.
I like a recent blog post entitled ‘More on digitalisation and skills: What happens within occupations?’, by Guillermo Montt on the OECD Skills and Work website. The article says that “as technology enters the workplace, the tasks related to a job and an occupation change”, citing Alexandra Spitz-Oener (2006), who found that in Germany occupations in the 2000s required more complex skills than in 1979, and that this change was more pronounced in occupations that adopted computers. Although something of a simplification, that finding is largely borne out in analysis of the US O*NET data. The article also draws attention to research by James Bessen published in his recent book ‘Learning by Doing: The Real Connection between Innovation, Wages and Wealth’. “He follows the evolution of occupations over time and claims that accelerated technological change has implications for inequality within occupations with more and more occupations becoming winner-take-all markets.” Essentially, as new technology is introduced, pay and opportunities in occupations bifurcate, with a few taking home high pay and more taking home lower pay. “In occupations requiring above-median computer use, the 90th to 50th percentile wage ratio has risen by 0.2% per year but has remained stagnant in occupations with below-median computer use. Workers who stay ahead of the curve, those who learn by doing, reap the wage benefits of technological change.”
This has major implications for training and continuing professional development. CPD has traditionally been organised through courses. But as we have already found in the EmployID project, working with employees in European Public Employment Services, traditional course delivery is too slow to respond to change and, even more problematically, is unable to deliver the volume of training required. The approach adopted in EmployID is both to look at using new technologies for learning and for promoting informal learning in the workplace, and to centre on changing occupational identities. For instance, there is a very different occupational identity associated with a print graphic designer than with today’s web designer. But the ability to change occupational identities may be shaped by previous learning experiences and by motivation, as well as by the ability to reflect on both individual and group learning. Within EmployID we are exploring how Learning Analytics can best be deployed to assist people in reflection (Reflection Analytics) and to assist in transforming identities to deal with such change. I am presenting this work next week at a LAK pre-conference workshop in Glasgow and will publish my slides on this blog.
I like this blog post by Robert Peal entitled ‘A Myth for Teachers: Jobs That Don’t Exist Yet’. The article looks at the origins of the idea that the top 10 in-demand jobs in 2010 didn’t exist in 2004, and its later variant that 60 per cent of the jobs for children in school today have not yet been invented. In both cases he found it impossible to trace these statements to any reliable research. Of course these are myths. But often such myths can be traced back to quite prosaic political objectives.
For a long time, the European Union has pushed the idea of the knowledge society. And whilst there are many learned papers describing in different ways what such a society might look like, or why such a society will emerge, there is little evidence of its supposed impact on labour markets. Most commonly cited is the disappearance of low-skilled and unskilled jobs, linked to growing skill shortages in high-skilled employment. Yet in the UK most recent growth in employment has been in low-skill, low-paid jobs in the retail sector. I remember, too, in the late 1990s when the European industry lobby group for computing was preaching dire emergency over the shortage of programmers, with almost apocalyptic predictions of what would happen with the year 2000 bug if there were not major efforts to train newcomers to the industry. Of course that never happened either, and predictions of skills shortages in software engineering persist despite the fact that UK government statistics show programmers’ pay falling in the last few years.
I’ve been invited to give several talks in the last year on the future of work. It is not easy. There are two lengthy reports on future skills for the UK – ‘Working Futures 2012-2022’ and ‘The future of work: jobs and skills in 2030’, published by the UK Commission for Employment and Skills. Both are based on statistical modelling and scenario planning. As one of the reports says (I cannot remember which), “all models are wrong – it is just that some are more useful than others”. Some things are relatively clear. There will be a big upturn in (mainly semi-skilled) work in healthcare to deal with demographic changes in the age of the population. There will also be plenty of demand for new skilled and semi-skilled workers in construction and engineering. Both are major employment sectors, and replacement demand alone will result in new job openings even if they do not expand in overall numbers (many commentators seem to forget about replacement demand when looking at future employment).
But then it all starts getting difficult. Chief amongst the difficulties are possible disruptions, which can waylay any amount of economic modelling. A diagram in ‘The future of work: jobs and skills in 2030’ shows possible future disruptions to the UK economy and to future jobs. One of these is the introduction of robots. With various dire reports that up to 40 per cent of jobs may disappear to robots in the next few years, I suspect we are creating another myth. Yes, robots will change patterns of employment in some industries, and web technologies enable disruptions in other areas of the economy. Yet many of the problems with such predictions lie with technological determinism – the idea that technology somehow has a life of its own and that we cannot have any say over it. At the end of the day, despite all the new technologies and the effects of globalisation, there are massive policy decisions which will influence what kind of jobs there will be in the future. These include policies for education and training, inter-governmental treaties, labour market and tax policies, employment rights and so on. And such considerations should include what jobs we want to have, how they are organised, where they are, and the quality of work. At the moment we seem to be involved in a race to the bottom – using the excuse of austerity, which is a conscious policy, to degrade both pay and working conditions. But it doesn’t need to be like this. Indeed, the excuses for austerity may be the biggest myth of all.
As PISA in Focus shows, in many countries the teaching profession is having a hard time making itself an attractive career choice – particularly among boys and among the highest-performing students.
PISA 2006 asked students from the 60 participating countries and economies what occupation they expected to be working in when they are 30 years old. Some 44% of 15-year-olds in OECD countries reported that they expect to work in high-status occupations that generally require a university degree; but only 5% of those students reported that they expect to work as teachers, one of those professional careers.
The numbers are even more revealing when considering the profile of the students who reported that they expect to work as teachers. If you read our report on gender equality in education published earlier this year, you may remember that girls tend to favour “nurturance-oriented” careers more than boys do – and teaching is one of those careers. In almost every OECD country, more girls (6%) than boys (3%) reported that they expect to work as teachers. This statistic is particularly worrying when you recall that the majority of overall low achievers in school are boys, who could benefit from the presence of more male role models at school.
As promised some further thoughts on the DISCUSS conference, held earlier this week in Munich.
One of the themes for discussion was the recognition of (prior) learning. The theme had emerged after looking at the main work of European projects, particularly in the field of lifelong learning. The idea and attraction of recognising learning from different contexts, and particularly from informal learning, is hardly new. In the 1990s, in the UK, the National Council for Vocational Qualifications (as it was then called) devoted resources to developing systems for the Accreditation of Prior Learning. One of the ideas behind National Vocational Qualifications was the decoupling of teaching and learning from learning outcomes, expressed in terms of competences and performance criteria. Therefore, it was thought, anyone should be able to have their competences recognised (through certification) regardless of whether or not they had followed a particular formal training programme. Despite the considerable investment, it was at best a limited success. Developing robust processes for accrediting such learning proved problematic, as did the time and cost of implementing those processes.
It is interesting to consider why there is once more an upsurge of interest in the recognition of prior learning. My feeling is that in the UK the initiative was driven by the weak links between vocational education and training and the labour market. In countries like Germany, with a strong apprenticeship training system, there was seen to be no need for such a procedure. Furthermore, learning was linked to the work process, and competence was seen as the internalised ability to perform in an occupation, rather than as an externalised series of criteria for qualification. However, the recent waves of migration, initially from Eastern Europe and now of refugees, have resulted in large numbers of people who may be well qualified (in all senses of the word) but have no easily recognisable qualification for employment.
I am unconvinced that attempts to formally assess prior competence as a basis for fast-tracking the awarding of qualifications will work. I think we probably need to look much deeper both at ideas around effective practice and at what exactly we mean by recognition, and I will write more about this in future posts. But digging around in my computer today I came across a paper I wrote together with Jenny Hughes around some of these issues. I am not sure the title helped attract a wide readership: The role and importance of informal competences in the process of acquisition and transfer of work skills. Validation of competencies – a review of reference models in the light of youth research: United Kingdom. Below is an extract.
“NVQs and the accreditation of informal learning
As Bjørnåvold (2000) says, the system of NVQs is, in principle, open to any learning path and learning form, and places a particular emphasis on experience-based learning at work. At least in theory, it does not matter how or where you have learned; what matters is what you have learned. The system is open to learning taking place outside formal education and training institutions, or to what Bjørnåvold terms non-formal learning. This learning has to be identified and judged, so it is no coincidence that questions of assessment and recognition have become crucial in the debate on the current status of the NVQ system and its future prospects.
While the NVQ system as such dates back to 1989, the actual introduction of “new” assessment methodologies can be dated to 1991. This was the year the National Council for Vocational Qualifications (NCVQ) and its Scottish equivalent, Scotvec, required that “accreditation of prior learning” should be available for all qualifications accredited by these bodies (NVQs and general national qualifications, GNVQs). The introduction of a specialised assessment approach to supplement the ordinary assessment and testing procedures used when following traditional and formal pathways was motivated by the following factors:
1. to give formal recognition to the knowledge and skills which people already possess, as a route to new employment;
2. to increase the number of people with formal qualifications;
3. to reduce training time by avoiding repetition of what candidates already know.
The actual procedure applied can be divided into the following steps. The first step consists of providing general information about the APL process, normally by advisers who are not subject specialists, often supported by printed material or videos. The second and most crucial step includes the gathering and preparation of a portfolio. No fixed format for the portfolio has been established, but all evidence must be related to the requirements of the target qualification. The portfolio should include statements of job tasks and responsibilities from past or present employers as well as examples (proofs) of relevant “products”. Results of tests or specifically-undertaken projects should also be included. Thirdly, the actual assessment of the candidate takes place. As it is stated: “The assessment process is substantially the same as that which is used for any candidate for an NVQ. The APL differs from the normal assessment process in that the candidate is providing evidence largely of past activity rather than of skills acquired during the current training course.” The result of the assessment can lead to full recognition, although only a minority of candidates have sufficient prior experience to achieve this. In most cases, the portfolio assessment leads to exemption from parts of a programme or course. The attention towards specialised APL methodologies has diminished somewhat in the UK during recent years. It is argued that there is a danger of isolating APL, and rather, it should be integrated into normal assessments as one of several sources of evidence: “The view that APL is different and separate has resulted in evidence of prior learning and achievement being used less widely than anticipated.
Assessors have taken steps to avoid this source of evidence or at least become over-anxious about its inclusion in the overall evidence a candidate may have to offer.” We can thus observe a situation where responsible bodies have tried to strike a balance between evidence of prior and current learning, as well as between informal and formal learning. This has not been a straightforward task, as several findings suggest that APL is perceived as a “short cut”, less rigorously applied than traditional assessment approaches. The actual use of this kind of evidence, either through explicit APL procedures or in other, more integrated ways, is difficult to survey. Awarding bodies are not required to list alternative learning routes, including APL, on the certificate of a candidate. This makes it almost impossible to identify where prior or informal learning has been used as evidence.
As mentioned in the discussions of the Mediterranean and Nordic experiences, the question of assessment methodologies cannot be separated from the question of qualification standards. Whatever evidence is gathered, some sort of reference point must be established. This has become the most challenging part of the NVQ exercise in general and the assessment exercise in particular. We will approach this question indirectly by addressing some of the underlying assumptions of the NVQ system and its translation into practical measures. Currently the system relies heavily on the following basic assumptions. Legitimacy is to be assured through the assumed match between the national vocational standards and competences gained at work; the involvement of industry in defining and setting up standards has been a crucial part of this struggle for acceptance. Validity is supposed to be assured through the linking and location of both training and assessment to the workplace; the intention is to strengthen the authenticity of both processes, avoiding simulated training and assessment situations where validity is threatened. Reliability is assured through detailed specifications of each single qualification (and module); together with extensive training of the assessors, this is supposed to secure the consistency of assessments and eventually lead to an acceptable level of reliability.
A number of observers have argued that these assumptions are difficult to defend. When it comes to legitimacy, it is true that employers are represented in the above-mentioned leading bodies and standards councils, but several weaknesses of both a practical and fundamental character have appeared. Firstly, there are limits to what a relatively small group of employer representatives can contribute, often on the basis of scarce resources and limited time. Secondly, the more powerful and more technically knowledgeable organisations usually represent large companies with good training records and wield the greatest influence. Smaller, less influential organisations obtain less relevant results. Thirdly, disagreements in committees, irrespective of who is represented, are more easily resolved by inclusion than exclusion, inflating the scope of the qualifications. Generally speaking, there is a conflict of interest built into the national standards between the commitment to describe competences valid on a universal level and the commitment to create as specific and precise standards as possible. As to the questions of validity and reliability, our discussion touches upon drawing up the boundaries of the domain to be assessed and tested. High quality assessments depend on the existence of clear competence domains; validity and reliability depend on clear-cut definitions, domain-boundaries, domain-content and ways whereby this content can be expressed.
As in the Finnish case, the UK approach immediately faced a problem in this area. While early efforts concentrated on narrow task-analysis, a gradual shift towards broader function-analysis has taken place. This shift reflects the need to create national standards describing transferable competences. Observers have noted that the introduction of functions was paralleled by detailed descriptions of every element in each function, prescribing performance criteria and the range of conditions for successful performance. The length and complexity of NVQs, currently a much criticised factor, stems from this “dynamic”. As Wolf says, we seem to have entered a “never ending spiral of specifications”. Researchers at the University of Sussex have concluded on the challenges facing NVQ-based assessments: pursuing perfect reliability leads to meaningless assessment; pursuing perfect validity leads towards assessments which cover everything relevant, but take too much time, and leave too little time for learning. This statement reflects the challenges faced by all countries introducing output- or performance-based systems relying heavily on assessments.
“Measurement of competences” is first and foremost a question of establishing reference points and less a question of instruments and tools. This is clearly illustrated by the NVQ system, where questions of standards clearly stand out as more important than the specific tools developed during the past decade. And as stated, specific approaches like “accreditation of prior learning” (APL) and “accreditation of prior experiential learning” (APEL) have become less visible as the NVQ system has settled. This is an understandable and fully reasonable development, since all assessment approaches in the NVQ system in principle have to face the challenge of experientially-based learning, i.e., learning outside the formal school context. The experiences from APL and APEL are thus being integrated into the NVQ system, albeit to an extent that is difficult to judge. In a way, this is an example of the maturing of the system. The UK system, being one of the first to try to construct a performance-based system linking various formal and non-formal learning paths, illustrates the dilemmas of assessing and recognising non-formal learning better than most other systems, because there has been time to observe and study systematically the problems and possibilities. The future challenge facing the UK system can be summarised as follows: who should take part in the definition of standards, how should competence domains be described, and how should boundaries be set? When these questions are answered, high quality assessments can materialise.”