Article - Issue 46, March 2011

Response to: Value of apprenticeships; Response to: Autonomous systems

Doug Oughton FREng; Anthony Finn and Richard Grover


Response to: Value of apprenticeships

Many Ingenia readers will already know of Education for Engineering (E4E), based at The Royal Academy of Engineering. Created in 2009, it is the mechanism by which the engineering profession offers coordinated and clear advice on education to UK government and the devolved assemblies. E4E deals with all aspects of learning that underpin engineering, and has a wide membership drawn from the professional engineering community.

I would like to clarify and add to the comments made by Damon de Laszlo, Chairman of Harwin, in his letter published in Ingenia 45. He indicated that qualifications associated with apprenticeships in engineering need to be made clearer for industry. He added that at Harwin, apprentices in their first two years follow a BTEC National. This is a vocational level 3 qualification (A-level equivalent), not a bachelor’s degree at level 6, as some readers may have inferred from his letter.

E4E shares Mr de Laszlo’s concerns about sufficient resources for the training of engineering apprentices. National Apprenticeship Week took place in February 2011, and while we applaud the government’s commitment to increasing the number of apprenticeships, we believe there should be a greater focus on engineering and technology apprenticeships for the productive industries. We would also like to see greater emphasis placed on people taking advanced apprenticeships with level 3+ qualifications, rather than intermediate apprenticeships with level 2 (GCSE equivalent) qualifications. This is where choosing the right vocational qualification becomes important to meeting the needs of industry.

Vocational qualifications make up part of the education on offer to young people in schools and colleges. They make an important contribution to the knowledge and skills of the workforce in many areas of engineering. In our correspondence with government, E4E is supporting vocational qualifications because these are so important to our sector, in particular the technician workforce on which so many of our industries rely.

E4E has recently submitted responses to the BIS consultation on the future strategy of the Further Education sector and to the independent review of 14-19 vocational education currently being conducted by Professor Alison Wolf for the Department for Education. In addition, the Academy recently led a project to review the number of science, technology, engineering and mathematics (STEM) vocational qualifications, learners and teachers in the further education sector. This research has highlighted just how complex the vocational qualifications system currently is, and why it is difficult for engineering businesses to discern the value of the various qualifications on offer.

We believe that the engineering community should fully support engineering apprenticeships and vocational education as an alternative pathway for progression in industry and higher education. As Chair of the Operational Group of E4E, I would encourage your readers to visit our website and would welcome suggestions from the wider engineering community of ways in which we can raise awareness of this matter, perhaps through E4E or the engineering institutions.

Doug Oughton FREng
Chair, E4E Operational Group


Response to: Autonomous systems

We read with interest the article Are we ready for Autonomous Systems? (Ingenia 45). It acknowledged that the decision-making capabilities of these systems have increased to the point where they may now make truly independent decisions, and that this replacement of the human in the decision-making loop represents a sea change in the role of technology in society. The article also noted that their use is largely unregulated. Perhaps what the article did not convey was the degree of urgency that confronts us or the magnitude of the challenges ahead.

Within a decade, in the military domain, we will probably see a fully autonomous unmanned combat aerial vehicle in service. It is likely that such vehicles will then become vital components of force consideration, as occurred with cruise missiles and GPS. Furthermore, as evidenced by the proliferation of ‘driver assistance technologies’ such as automatic parking and speed management systems, the decision-making will by then have become sufficiently sophisticated to blur the boundary between the system being an extension of its human operators and its being an independent agent.

When the inevitable accident occurs, how shall we decide who might be responsible for any infringement perpetrated by such systems and their decision-making technology? Is it the user, the retailer, the manufacturer, or the owner? Or do we simply put it down to an ‘unfortunate mistake’ and, as is compulsory for commercial and domestic ground vehicles, insist that insurance be purchased so that liability is assumed by an underwriter? The latter may be superficially attractive, but it really just shifts and quarantines the liability rather than addressing what is genuinely novel about the problem.

In fact, it is not clear that we even understand how we might establish causality, or what criteria we should use to assign this responsibility or degree of responsibility. At one level the law of tort clearly has a highly developed set of principles that apply to product liability and can no doubt be applied to autonomous systems, just as they are to other intelligent, safety-critical systems whose malfunction can lead to death or injury. Moreover, the majority of concerns will no doubt fall under this mundane interpretation. However, most laws pertaining to the movement of vehicles apply to pilots, drivers or riders, as they are assumed to be in control; this is manifestly not the case with autonomous systems. As a result, it may be desirable for such systems to hold some sort of legal personality. We could choose to afford autonomous systems quasi-legal status and allow them responsibility for a limited set of decisions and actions, just as some legal entities (such as children and corporations) frequently act through agents.

Answers to other basic questions also elude us. For instance, when is an autonomous system good enough to take on the discretionary roles currently exercised by humans? Do we know how to measure this? Are we in fact applying higher standards to the technology than those currently expected of humans? And, even if we work all this out, do we know who will be responsible for the certification, maintenance, safety cases and regulatory regimes? These and many other related conundrums are simply beyond the scope of this letter, but the following is instructive:

“In 1936, a Duke University law student published an article summarizing the path of automobile liability law. He observed that in 1905 all of American automobile case law could be contained within a four-page law review article, but three decades later, a ‘comprehensive, detailed treatment [of automobile law] would call for an encyclopaedia.’ That law student was Richard M. Nixon, who would later become President of the United States. His conclusion was that courts were mechanically extending ‘horse and buggy law’ to this new mode of transportation in most doctrinal areas. However, some judges were creatively crafting new doctrine in certain subfields of automobile accident law by stretching the legal formulas at their command in order to reach desired results.” (from Cybertorts and Legal Lag, 2004, Rustad and Koenig).

Let us hope that somewhere buried deep in a university a future world leader is working hard on the law of autonomous systems.

Anthony Finn
Professor of Autonomous Systems, University of South Australia

Richard Grover
Formerly Senior Scientist, BAE Systems (Advanced Technology Centre, Bristol)
