Re-configuring computer education


IT has been the largest generator of jobs in the organised sector in India over the past five years with about 3,50,000 jobs getting added every year.

  • The maturity of the Indian IT services sector,
  • The emergence of IT product companies from India,
  • The dramatic rise of smartphones / tablets and the huge Apps ecosystem that is driving these “post PC devices”,
  • Cloud computing entering mainstream, and,
  • The dramatic rise of “big data” and “Analytics”

are changing the talent needs of this growing industry.

Naturally, colleges, particularly the premier institutes, are looking at ways and means of responding to such changing needs.

A word of caution is in order. Any curriculum has foundational elements that address the long-term needs of a professional who goes through 4-6 years of college education (Bachelors or Masters degree), which is expected to provide the basis for the professional's 40-plus-year career. The curriculum must also be relevant, so that graduates can find meaningful employment in industry (barring a small number who take up academic / government / military jobs). Foundation and relevance are like the cake and the icing on the cake; both are important, but the proportion should not be forgotten. Countries like India have a history where low-quality, fly-by-night operators mushroom everywhere, promising relevance (often at the cost of quality), while universities steeped in bureaucracy turn a blind eye to the relevance of the curriculum. Luckily, the premier institutes have always struck the right balance, though sometimes the industry fails to take notice of such response.

Every aspect of computing has been affected over the past five years. For example,

  • Microprocessors are increasingly multi-core, with application-specific cores (graphics, computing, networking) optimised for their specific functions; tomorrow's developers must be able to leverage this “architectural” development.
  • From the days of “memory crunch” we have moved to “memory abundant” computing; with flash memory entering the mainstream, the separation between primary and secondary memory is blurring. In turn, specialised applications like “data look-up” can benefit from “main memory” computing, as opposed to retrieving data from any number of high-speed disk spindles.
  • Databases are no longer limited to “structured” data (where OLTP systems excelled), but are moving to “NoSQL” data, that too of enormous size, often called the “Big Data” problem.
  • With Internet penetration increasing even in countries like India, and with wireless technologies like 3G and LTE, along with SDN, becoming a reality, one finds an abundance of bandwidth too (at least in countries like Korea).
  • With “Big Data” available instantaneously through the high-speed Net, “real-time” Analytics is becoming a reality, at least in a handful of industry segments (Telecom, Retail and Banking).

Naturally, the education of tomorrow's IT professional has to factor in such developments.

How do Institutes of Higher Education address such changing demands?

Definitely, not by encouraging fly-by-night “training” outfits, but,

  • Carefully re-visiting the foundation courses (architecture, programming, compilers, operating system, databases and networking),
  • Integrating “cloud”, Big Data, Analytics, Apps development and post-PC devices into these courses, and,
  • Starting new elective courses that go into the details of Analytics, Cloud and Big Data.

In a sense, the foundation is tweaked so that students see a continuum of ideas that have evolved; for example, cloud computing as a natural extension of distributed computing, with location-independent hardware / software resources, compute elasticity, and “eventual consistency” that is sufficient for many “Big Data” problems, as opposed to the “every time consistency” needed by OLTP (banking, online trading, insurance). In the process, the curriculum gets richer, more relevant and more interesting; with so many online resources and free environments to experiment with, along with the advice of “open source” volunteers, learning can be far more fun and rewarding. Companies like HP and Infosys have domain experts who have developed courses in some of these emergent areas; institutions do benefit from them. The maturity of cloud infrastructure permits the creation of “laboratories on the cloud” to test out many of the emerging ideas.
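The contrast between “eventual consistency” and “every time consistency” mentioned above can be illustrated with a toy sketch: two replicas of a key-value store where a write is first applied locally and only later propagated to the peer. (This is an illustrative simplification, not any particular product's design; the `Replica` class and its methods are hypothetical.)

```python
class Replica:
    """Toy key-value replica: writes land locally first and are synced lazily."""

    def __init__(self):
        self.data = {}
        self.pending = []  # updates not yet propagated to peers

    def write(self, key, value):
        # Local write succeeds immediately (high availability) ...
        self.data[key] = value
        # ... but propagation to other replicas is deferred.
        self.pending.append((key, value))

    def read(self, key):
        # May return stale data until a sync happens.
        return self.data.get(key)

    def sync_to(self, other):
        # Propagate buffered updates; afterwards, the replicas converge.
        for key, value in self.pending:
            other.data[key] = value
        self.pending.clear()


a, b = Replica(), Replica()
a.write("balance", 100)
print(b.read("balance"))   # None -- replica b has not yet seen the write
a.sync_to(b)
print(b.read("balance"))   # 100 -- replicas have converged
```

A reader between the write and the sync sees stale data, which is acceptable for, say, a product-view counter, but not for the account balances of an OLTP banking system, which must see every write immediately.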

In short, the strategies used to address the changing talent needs include,

  • Tweaking the curriculum
  • Benefiting from industry expertise
  • Utilizing “open source” materials
  • Leveraging the “cloud” to provide “on demand” laboratory resources

Professor Sowmyanarayanan Sadagopan is the Director of IIIT-Bangalore. These are his personal views. He can be reached at ss@iiitb.ac.in

(Appeared in “The Hindu Business Line” of October 23, 2013)
