Robert Dewar right on CS curriculum, wrong on Java
Jesse Fish
The author James Maguire wrote two articles in 2008 about Robert Dewar and his views on Java as the core language in a Computer Science curriculum: "Who Killed the Software Engineer? (Hint: It Happened in College)" and "The 'Anti-Java' Professor and the Jobless Programmers". Before that, Professor Dewar wrote an article, "Computer Science Education: Where Are the Software Engineers of Tomorrow?". In these articles Professor Dewar cites Java as the downfall of software engineers in the U.S.
While I do think that the CS curricula at many schools are failing their students, I do not believe that teaching CS in Java is the root of these issues. Professor Dewar poses some great solutions for fixing CS departments, but I don't believe that teaching a different first language needs to be part of that solution.
Professor Dewar understands the root of the issues with CS:
“...Part of the trouble with universities is that there are relatively few faculty members who know much about software. They know about the theory of computer science and the theory of programming languages. But there are relatively few faculty who really are programmers and software engineers and understand what’s involved in writing big applications.”
“It’s just not the kind of thing that universities are into, really. Because they tend to regard computer science as a scientific field rather than an engineering field. So I’ve always felt that was a weakness.”
The crux of the problem is that the purpose of a CS degree is not to turn out software engineers. I was shocked when I learned this, but after speaking/complaining to numerous professors, I learned that the skills people actually need to get programming jobs in the real world are not considered when teaching students CS: things like unit testing, Make, proper documentation, code style, UML, proper debugger use, etc. Oh, the good CS students learn these things, mind you. The good CS students learn very quickly that their professors have no interest in teaching them the things the industry wants them to know, and that they need to learn those things themselves.
Professor Dewar poses two questions he would expect CS students to be able to answer were he to hire them for a job:
"1.) You begin to suspect that a problem you are having is due to the compiler generating incorrect code. How would you track this down? How would you prepare a bug report for the compiler vendor? How would you work around the problem?
2.) You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification. How would you track this down? How would you prepare a bug report for the chip manufacturer, and how would you work around the problem?"
I would argue, according to the current CS ideals, that it is not a professor's responsibility to explicitly teach these things. Mind you, I don't agree with the current CS ideals, but knowing how to solve either of those problems does not deepen your knowledge of Computer Science; they are technical skills. As a CS student you can either teach them to yourself on your own time or learn them on the job.
I have argued with professors on this issue several times. I was frustrated with a compilers class in which the professor fully expected everyone to fluently program in C or C++ and be able to use a Linux terminal after taking only an intro to programming course in Java and data structures in Java. I am not against forcing students to learn multiple languages. On the contrary, I believe all CS students should learn at least three languages before they earn their degree (C/C++, Java, and one of their choosing... usually Python), but I take issue with the complete lack of preparation. When I asked one of my professors, Michael Branicky (Sc.D., EECS, MIT), why we were not taught how to use the tools fundamental to our success as programmers, he curtly responded that CS is not about teaching people how to use tools but about teaching theory, and that if we wanted to be taught the tools we should go to a technical school.
I have a very large problem with this attitude towards teaching CS. It is what leads to CS graduates who cannot do simple tasks because no one has taught them how to use the tools; they learn them only well enough to complete their assignments. The sheer number of bad habits and incompetent programmers this creates frightens me. It is the equivalent of never teaching mechanical engineers how to use anything in a machine shop, never teaching them CAD, and then handing them a part to create and expecting them to be able to do it. Sure, a few of the more driven students would sit down together and teach each other how to work the machines, but you would have others afraid to use any of the machines, creating parts with hand tools alone. To give a more concrete example of what this leads to: I had classmates who, when asked to submit code in C++, rather than creating a simple Makefile to compile the project, would keep a text file with the commands necessary to compile the code and paste those commands into the terminal when needed. The smarter of the bunch would put those commands in a bash script, but that is still the wrong tool for the problem.
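The Makefile those classmates needed would have been only a few lines. A minimal sketch (the file names here are hypothetical):

```make
# Build the project with 'make'; remove build artifacts with 'make clean'.
CXX = g++
CXXFLAGS = -Wall -g

project: main.o helpers.o
	$(CXX) $(CXXFLAGS) -o project main.o helpers.o

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $<

clean:
	rm -f project *.o
```

Unlike a pasted list of commands, this rebuilds only the files that changed, and anyone grading the project can build it with a single command.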
Professor Dewar made comments about CS curricula teaching graphics libraries and not teaching algorithms. I simply do not believe this is happening. I have extensively searched through the CS curricula of many universities around the country, both large and small, and could not find any evidence of CS programs not teaching algorithms.
Now, as for why I think Java is not a bad choice for a first language. Reading Professor Dewar's own article, his main issue with Java seems to be that his university did not teach Java correctly. I can't fault them. I myself have been a teaching assistant for a Java course that, well, doesn't teach Java to my satisfaction. The truth is Java is difficult to teach. Often references are underemphasized, and the difference between primitives and references as arguments is completely ignored. Professors rush to get their students to graphics libraries to make their programs fun and interesting. Simple sorting, algorithmic thinking, overall program structure, inheritance, and polymorphism are all pushed aside in favor of creating an applet that looks shiny. The simple truth of the matter is that an intro to programming course in Java can be taught exactly like one in C++. The overall theory and concepts can be duplicated. Professors simply are not teaching Java in this way. This is not the language's fault but the professors'.
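To illustrate the distinction that so often goes untaught, here is a minimal sketch (the class and method names are my own) of how primitives and references behave differently as method arguments:

```java
public class ArgDemo {
    // The method receives a copy of the primitive value; the caller's
    // variable is untouched.
    static void bump(int n) {
        n++;
    }

    // The method receives a copy of the reference, but both copies point
    // at the same array, so the mutation is visible to the caller.
    static void bump(int[] a) {
        a[0]++;
    }

    public static void main(String[] args) {
        int x = 1;
        bump(x);
        System.out.println(x);      // still 1

        int[] xs = {1};
        bump(xs);
        System.out.println(xs[0]);  // now 2
    }
}
```

Students who never see this contrast spelled out tend to conclude, wrongly, that Java sometimes passes "by reference"; in fact it always passes arguments by value, and for objects that value is a reference.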
Professor Dewar cited Java as having too many built-in libraries for teaching as a primary language. The solution is simple: don't let the students use those built-in libraries in their programs. There are plenty of C++ libraries out there too; mind you, they aren't standard like Java's, but simply because a language has a large library base does not mean you should not teach it as a primary language. Force the students to build from the ground up.
By far the most annoying statement Professor Dewar made was, "And [forget] all this business about ‘command line’ – we’ll have people use nice visual interfaces where they can point and click and do fancy graphic stuff and have fun." I'm sorry, but computer programming should not be a contest in who likes chewing on glass more. It should not be painful, and if it is, then you need to develop a tool that eases that pain. The classic mantra "a good programmer is a lazy programmer" applies here. I'm sorry if the development of the IDE annoys you and makes accessible skills that used to be available only to those who wished to toil away for hours mastering command-line text editors, debuggers, and version control tools. Perhaps you would prefer we go back further and program entirely with punch cards again, or receive error messages in hex. In the real world people use IDEs, and at the introductory level students should be taught using IDEs. I believe that CS students absolutely need to know how to use command-line tools and compile from the terminal. When I was in undergrad I pushed for having the first two weeks of the intro to Java course at my institution spent teaching simple terminal commands and compiling and running programs from the terminal (instead it is spent teaching HTML). But students also need to be familiar with IDEs: Eclipse (or NetBeans), Visual Studio, and possibly even Xcode.
I understand Professor Dewar's feelings about Java as an introductory language. He believes that the CS jobs available to CS students are in lower-level languages like C++, C, Ada, and Fortran, compiling for machines that are not personal computers. For crying out loud, the guy owns a company called AdaCore Inc. He is a compilers man, and I understand how he might feel that the lower levels of Computer Science are underrepresented at most universities. And frankly, they are. But this is because of the high-level, theoretical Computer Science professors who are doing the teaching. Teaching an AI algorithm or a tree traversal algorithm is far more interesting to them than forcing a CS student to program C at the register level. Teaching low-level languages in introductory courses often causes the language to get in the way of the solution. Let the lower-level aspects be taught later. I can't imagine how Professor Dewar feels about MIT changing their introductory language from Scheme to Python: if he feels Java has next to no application in real-world industry CS, he must feel Python has absolutely no place there at all.