Rambles around computer science

Diverting trains of thought, wasting precious time

Tue, 26 May 2020

Mission and marketing in computer science degrees

At lunch with colleagues some months ago (remember those days?), I provocatively suggested that our course offering would be improved if we eliminated marketing terms from the names and specifications of our taught programmes and modules. Depending on exactly what counts as a marketing term, this might mean doing away with “cybersecurity”, “Internet of things”, “cloud computing”, “big data” (thankfully not currently used here at Kent) and ideally also “artificial intelligence” (once respectable, but no longer). To replace them, there are perfectly good time-honoured and serious phrases for the same areas of study: “computer security”, “distributed computing”, “embedded software development”, “statistical inference” and even “machine learning” (usually what “AI” means nowadays, and while faintly marketing-y, still more descriptive of that).

Although my suggestion was rather tongue-in-cheek, there is an important issue lurking not far behind it. Serious study on any technical topic, even in primarily “applied” areas like CS, is very different from product development. It's not about “learning the latest technologies”. An academic degree should not be a “workplace simulation” or an apprenticeship; it's about developing skills that are more underlying, more general and more timeless. We need to give students the message that university study is about building those skills. We need to give ourselves, as academics, the same message. If we don't, and instead fool ourselves that our responsibility is something about “teaching what will help the students get a job”, then as an institution we're not fulfilling the very function of a research university, and as individual academics we're undermining the long-term justification for our own jobs.

The paradox is that here at the School of Computing, even our own industrial panel is reported to emphasise, consistently, that they want us to teach principles rather than specific technologies. Certainly our degrees do contain a decent amount of principled content, even if there is room for debate on what fraction of our students come away really understanding those principles. Despite that, currently I'd say we don't give the “timeless skills” message to our students very much. But we do give out the “get a job” message all the time. We do this by endorsing marketing-speak in what we offer, but also by other messages we send far more subtly, pervasively and latently, in what we teach and how we teach it. We award good marks to relatively insubstantial final-year projects whose core is to implement CRUD functionality mimicking commercial web sites or apps, provided they show due allegiance to the hallowed engineering methods we teach. We teach “software engineering” by attempting to rote-instil good practice, but backed by relatively little critical or analytical content, and taught at a stage where we are inevitably asking the students to take this wisdom on trust. (That is undergoing overhaul for next year, but it's unclear whether the new version will change this. If not, I will be partly to blame, as a co-author of the module specification.) We abolished as superfluous a module on “people and computing”, when at least its title captures what ought to be the single biggest topic in the whole discipline. We teach very little that conveys the historical, cultural or political aspects of technology. We continue to downgrade our “unpopular” theoretical content below even a basic minimum level (and I'm no theory zealot). And we let our pedagogy be compromised by the unwisdom of student “feedback”, such as, in response to claims that a module was not “up-to-date”, de-emphasising principles in favour of technologies—a move which ultimately leaves students uninformed and confused, since it necessarily blurs the distinction between what is essential and what is accident.

Of course, in all this we are pushed along by external pressures. One message that circulates loud and clear among colleagues, at staff meetings and elsewhere, is that the market matters: bums on seats are our financial bottom line. This follows through: we find ourselves specifying new modules according to our perceptions of marketability. Let me say that again: we're guided not by actual marketability, but by our perceptions of marketability. Maybe my colleagues are more informed than I am, but I don't think many would disagree that we have limited basis for these perceptions. We allow our fuzzy mental caricatures of “the job market”, “industry”, “applicants” and hence “what will sell” to determine what and how we teach.

These fuzzy beliefs are why, for example, I have to respond periodically to the question of “why teach systems, when hardly any of them will get a job writing operating systems?”. Meanwhile, teaching “AI” left, right and centre is not questioned at all. If we really believed in “principles” and degree skills as “long-term”, we would be more interested in breadth and balance. Instead we have collectively imbibed the message “AI is big; AI is the future” and regurgitated it into our curriculum development. I've nothing against the longstanding field of artificial intelligence, and I fully expect us to teach from it. But this eagerness to exploit its current trendiness is a symptom of something far more toxic than pragmatic compromise. Putting trend-chasing at the heart of our endeavour, not just on the surface of our marketing materials, undermines the social purpose of a research-intensive university—supposedly an institution for the long-term.

The prevailing line is that this is a financial necessity. That might even be the case. But to me, as a relative newcomer, that seems to be an assumption rather than an established fact. Nominally, the selection of our curriculum still rests with us, the academics, as it should. But the current pressure to be “market-driven” reduces us to headless chickens: base reflexes fill in for a lack of informed thought. Having absorbed a belief that an unapologetically “long-term” degree course is “unmarketable”, we knee-jerkingly propose to do what we perceive to be marketable. Where's the basis for believing any of this?

In my more optimistic moments I believe that for the good of everyone—our students, our academics, and our fortunes as a credible research university—there are things we can and should do to change this. Although I'm not a marketing expert, it seems that what I propose is not about ignoring “the market”, but rather about more seriously “doing marketing”, as distinct from mere publicity. One message we could plausibly try to get across is something like the following. Yes, we will teach you all about hot topics like AI, cybersecurity or whatever else the current trends favour. But that's not all we'll do, and we're offering an education that will last. What we give you is substantial, durable and transferrable.

Trying harder for an approach that emphasises the transferrable over the short-term vocational would be a huge challenge. It is the exact opposite of following the line of least resistance. It would take creativity, skill and effort. Still, it's what we academics mostly believe to be the right thing, and I've no reason to think the necessary effort would not be forthcoming. What is really stopping us?

The real problem seems to be risk. Proposing any change in this direction brings worried-sounding replies. Trying such a revamp would be a risk—how else would we discover what is marketable than by trying something different? If our intake takes a hit as a result of such an experiment (which would necessarily take several years to run), our political and financial position within the university would suffer. This is why having a university sector that is market-driven and financially straitened is so toxic. Doing the very job your institution was set up for has been turned into a radical feat of “innovation” whose risk is too great to take on. The very purpose of the institution has been redefined as impossible.

Nevertheless, I believe we can do more than we are doing. More cleanly separating our taught programmes' specifications (what we set for ourselves) from their marketing pitches (for applicants) might be one small, simple but possibly significant measure. It would help us make a mental distinction between our mission and our marketing strategy. One way to re-spin that mission might be explicitly to contrast the flawed presumption of “preparing students for industry” with a more forward-thinking “preparing students to prepare a better industry”. This matters especially in our field because much of the software industry is, regrettably, built on froth and fads. If we want the software industry to be better, we need to turn out students who know how to look for the substance underneath the froth; who have a questioning mindset; who aspire to be more than just hired guns (to borrow the words of Christopher Alexander). That's a tall order, and perhaps many of our students aren't especially inclined to ask these questions... but if universities don't make it clear that substance exists beneath the froth, what hope do we have?

We also need to remember that boom easily turns to bust. “Computer science” is currently attractive to students because software is perceived as a boom industry. In this sense, courting the market seems to work in our favour. But again, this is happening because our applicants think “job” when they should think “career”. We do nothing to correct this. “Cybersecurity? Artificial intelligence? Internet of things? Walk this way.” Our university is collapsing financially and compulsory redundancies are on the horizon in some subjects. But at the School of Computing, complacency rules because “we're doing well” and “this doesn't affect us [much, directly]”. Maybe not—this time. But one only has to go back to around 2008–09, when there was last a threat of compulsory redundancies among computer science academics here at the University of Kent. Something called a “credit crunch” was happening around then. Only a few years previously there had been something called the “dot-com bubble”, which burst in 2000–01, seeing many institutions' CS applications take a sudden dip, one which did not recover for about ten years.

These market lurches did nothing to diminish the importance of what we teach. Yet somehow they had an acute effect on the capacity of our institutions to support that teaching. That is a sign of the unquestionably appalling stewardship of our institutions in recent times, and sadly, in the ten years since, the grip of marketisation has only got stronger. But we cannot entirely blame external factors for our choice to emphasise froth, nor can we rely on what for us are “favourable market conditions” remaining so—and this ties back to the same problem. When an industry is based on froth, spooked investors are all it takes to bring the finances crashing down. The same goes for a computer science department and spooked applicants. What goes around comes around. If we believe that what we teach does not suddenly become irrelevant the moment “the market” takes its next lurch, we owe it not just to ourselves, or our colleagues, but to society at large, to doubt the wisdom of “the market” in deciding what passes for an education. Can't we at least try to make our case, rather than chasing what we think the consumers think they want?
