I spent quite a lot of time with people responsible for tertiary IT education in 2014, and I feel for them in terms of the pressure they are under from the IT industry, the primary consumer of the principal “product” of universities – graduates.
It’s become part of the collective consciousness that the role of tertiary institutions is to “prepare students for their careers”. But the definition of “prepared” in this context is driving misguided behaviour from universities in an effort to reach a goal which is frankly unachievable given the constraints under which they operate.
The working definition of “prepared” for graduates leaving university and entering the IT industry seems to be:
Prepared: Capable of contributing to their employers’ business from Day 1. Requiring little/no ongoing training or mentorship. Knowledgeable in all current technologies and techniques used by the employer.
In pursuit of this goal, universities build degrees and populate them with a vast range of subjects across the fundamental principles of development, operating systems, networking, databases and softer skills around communication. But very little of this knowledge will “prepare” graduates as defined above. So what do universities do? Into the precious little free space left in most degrees, they add subjects around “modern” computing topics like web development, cloud computing, big data, mobile applications and the like.
While presumably appealing to uninformed students and university marketing teams alike, none of these subjects will cover their respective topics in sufficient depth to appease employers when these graduates arrive in the workforce. And while these topics are currently trending within the IT industry in general, not every employer has a need for mobile development or big data, for example.
So I propose a new working definition for “prepared”, one which could change the expectations industry has of IT graduates and allow universities to spend more time on what I see to be the really core skills a graduate needs to have.
Prepared: Enters industry with a passion for the application of technology. Accepts and embraces the lifetime of learning ahead. Understands the benefit of strong mentors and is capable of growing these relationships. Accepts that there is no “one right way” in most cases.
Under this definition, universities can focus their efforts on sparking and nurturing that passion. From there, impress on students how dynamic this field is and how much it is likely to change by the time they graduate. And from there, place emphasis on how many solutions there are to so many problems, some of which may be completely non-technical.
So how would this approach be reflected in a typical IT degree?
There are a variety of ways you could skin this cat, but some of the ways I would consider are:
- Make attendance at local community groups in relevant areas compulsory for several subjects. Arrange with the community organisers to put aside one community event for student presentations.
- Wherever there are leading technical candidates with different emphases (e.g., OO versus functional approaches to development, RDBMS versus NoSQL databases), teach both simultaneously to help students appreciate the similarities and differences.
- Continue the good work many universities currently do regarding industry-based learning, internships, student projects, guest lecturers, etc.
- Stop trying to build subjects around trending topics because (a) they’ll age quicker than soft cheese on a summer’s day and (b) they’ll inevitably leave students with a poor idea of how that topic is used in industry.
- And finally, resist further attempts by industry to neglect its duty to continue the education of its graduates and other employees.