At a conference in London last month, a question came up about engineering, and a panel member said that "good engineering people are like gold dust". I am an engineer by trade: I have worked at a financial services company in the heart of London and later became the driving force behind many consumer and B2B applications. Yet at one point my first university was about to kick me out, because I had no interest in grasping courses that did not interest me. I reluctantly turned back and used the same systems other students in my class used to pass those courses (rote learning, cheating), just so I could study the courses that did interest me. This led me to question the "gold dust" equation and the role of the education system today.
The problem starts with undergraduate intake. Take Computer Engineering: at most universities, nobody asks a student why he wants admission to an engineering discipline. Maybe Computer Science is right for you; why don't you go there? Or your entrance exam shows your skills are more theoretical than analytical; have you thought about an arts degree? Many high-end schools and colleges around the world ask these questions and make interviews mandatory, but at more mainstream universities in countries like Pakistan, the idea is to fill seats, or the emphasis is on the marks you got in secondary and higher secondary school (which are again based on rote study rather than equipping the student with the tools he will desperately need at university and in the real world; that's another debate, and I will leave it here).
Let Us C
Why do we need a full three-and-a-half-month course to teach "Let Us C"? Why can't we do the whole thing in one week? Not everyone is a programmer by default, but if you counselled prospective students at the intake stage, you would not need to worry much about the background and grasp of the students in your class.

Basic programming revolves around six or seven basic tools (variables, loops, arrays and so on). Teach them, set a couple of exercises, run a couple of hands-on labs, and show students how to get from a blank screen to a working calculator, all within one week. The next week you can cover polymorphism, abstract classes, inheritance and so on. These are more conceptual topics; if the base is right, they follow as logical steps, and a teacher need not worry too much about them. Yes, you will worry if your intake is not right, but if your class is on par with the basic tools and has built a calculator in the first week, they will be comfortable with object-oriented programming by the second (roughly 18 hours of teaching later). That is all it takes to teach programming to someone who is genuinely interested in the subject. If someone is not, you can teach him for the next four years and he will leave university thinking "aliens do programming, I can't understand a word of it".
Now, I have heard of this approach at some universities: they concentrate on one course for a couple of weeks or a month, finish it, and then start the next. I am not arguing about duration. If a course genuinely requires four months of mandatory classes, give it four months; but what are you teaching in those four months? Whatever it is, it should not be a waste of time.
The Professional Studies Architecture
The study architecture should build a base for the student, which most curriculums do attempt: in our first four semesters (two years) we studied Calculus I to III, Discrete Mathematics, Ordinary Differential Equations, Signals & Systems and several electronics courses. I am sure those courses would have mattered had I not taken the road to becoming a web developer, then an app developer, then running a software house, and then ditching everything for the online travel industry. I am quite used to taking a spin on everything, be it image processing, machine learning or human language processing, and yes, at a very deep level I have had to work through some equations, but I am still waiting for the day those advanced Fourier transforms or triple integrations come to my aid. Those courses certainly opened my mind and gave me critical thinking skills my schooling had lacked, but I count that as a side advantage rather than the intended advantage of studying them.
Now, some will argue that a course is designed for everyone and for all fields, and it is up to the student to decide where to go. That is exactly my point: why do we design and teach everything to everyone? Why can't we design targeted courses and speed up the process? I have interviewed engineering students who come equipped with A grades in everything from Calculus and Communication Systems all the way up to System on Chip, yet struggle to explain what an object is. I in fact hired a couple of such students, thinking that with a 3+ GPA, grasping OOP should be a night's work for them; but after at least three or four lectures and six months of repeated work of the same nature, they were still struggling with the basic concepts. They still did not know what polymorphism is or why we use OOP in the first place.
All of this has led me to question the very basis of university education.
Absent Skills vs Present Skills
Why don’t we teach:
- Analytical Reasoning
- Critical Thinking
- Ability to question everything
rather than teaching everything beyond Calculus I, Electronics I and a basic introduction to programming and OOP? Even on the Computer Science side, why do we need Compiler Construction, for example, or why do we study assembly language in Advanced DLD? Who needs them? Maybe those who built companies like Oculus and DeepMind. Or the people responsible for the world's fastest search algorithms at Google. Or the team behind Facebook's face recognition technology. Do you really think that each of the 200 students you take into an undergraduate course at COMSATS will work on, or build, one of those technologies? My class went into all kinds of fields: industrial automation, telecommunications, game programming, apps, even software QA. A couple of people from the class junior to mine work for Intel, perhaps writing algorithms for their next-generation 4G technologies, but do we really think 2 out of 1,000 is a success story for our education system? Do we not think we have failed at least 600 out of 1,000? Do we not think we lack the soft skills our jobs demand, while struggling and wasting four years acquiring skills most of which are irrelevant to the current job market?
The Solution as it should be
Do we not think that, with proper guidance and a short, solid base, we could grow a system where a student spends a maximum of one year in class, and perhaps a year as an apprentice, before going full-fledged after what he really wants to become? If he sees that most local jobs are for iOS and Android developers (the current market), he could go there, or towards human language processing or machine learning. Maybe six more months of theory would be needed to churn out students with the experience, will and skills to serve today's industry. Maybe universities should have CTOs from local companies on board, curriculums should be devised in their presence, and once every three months one of these CTOs should sit in on classes to see what is going on. As I understand it, that is the only way to turn gold dust into a plentiful skilled workforce, to gain momentum in innovation and to bring in those techno-dollars.