Monday, August 17, 2009

The Shades of Piracy

I have been at my family's ranch in Wyoming for the past three days. I made a stop here to see the family on my way driving home to Minnesota from California (after that I'm going to Cleveland; wish me luck). While here, my younger brother and two friends he brought with him spent quite a bit of time playing Warcraft 3. They all piggybacked on my younger brother's CD key and played LAN games. I showed them the ways of DOTA and they quickly wanted to get on battle.net. My brother wanted me to give his friend my CD key so that they could play together on battle.net. I refused. He was quite upset and did not understand why I would not give up my key to him.

In truth, my reasoning was twofold. First of all, I understand how key sharing spreads; I give it to him and he gives it to his friend to use this time, then he wants another friend playing a month down the road, and before long my key is in the hands of several of his friends and perhaps his friends' friends. I want my copy to remain my own. The second reason was a matter of piracy. Since getting a "real job" and having money I can spend on things, I have not pirated games. It got me thinking about shades of piracy, and the ethics of it.

My main concern with piracy is that it leads to worse and worse DRM being put into software and media. It makes the software/media less usable and can often cause the removal of features.

As I see it there are seven types of piracy: Operating System, Application, Game, Movie, Music, Television, and Book.

There are really only two operating systems pirated in the world, Mac and Windows.

Piracy against the Mac operating system is pretty sad because Apple is trusting enough not to require any keys or validation of their OS, and in the scheme of things their OS is pretty inexpensive. That being said, most of their profits do not come from OS sales since it is bundled with their computers. If you are pirating their OS you are likely building a hackintosh, which means you are probably not dying to give Apple your money. However, if Mac OS piracy gets out of control they might start putting DRM in it, and no one wants that.

Piracy against the Windows operating system is also a grey area. The product is usually priced much higher than it is actually worth. A handicapped version of the OS is unfortunately packaged with every PC on the market along with a bunch of useless software (looking at you, AOL), meaning that anyone who actually knows about computers should wipe their new machine and reinstall a clean copy of Windows. Another issue is license binding to hardware. So you have bought a PC with Windows bundled, but now you want to change the parts around, maybe upgrade the processor. Now your bundled copy no longer works because your hardware has been upgraded (this was how it was with XP; they may have changed it in Vista or 7). You shouldn't need to repurchase the same OS because you wanted to upgrade your computer. Also, Microsoft would rather you pirated their software than learned how to use Linux. So if you are making the choice between Ubuntu and a cracked XP copy, Microsoft would rather you use the cracked XP copy even if they don't get a sale.

Windows is pirated all over the world, and there are people who pirate the new Mac versions rather than paying for them. My decision on the subject is that if you are a company or business, large or small, you MUST buy every copy of either OS that you use for that business. If you are a private user and can easily afford the OS, you should at least buy one copy and use it on all of your and your immediate family's machines. If you are stinking rich you could go the whole nine yards and buy a copy for every machine you put it on (though honestly the Mac family licenses aren't that bad). If you have a job where you use a computer, that computer MUST have a legal copy of the OS.

Applications I view by use case. If it is an application that is meant to be used by hobbyists or anyone and is priced accordingly, you should buy it. If it is a professional piece of software meant for someone who makes money off of its use, then as long as you do not make money off of its use you can pirate it. As soon as your use for it moves from simple hobby to moneymaking business, you MUST buy the application. If you are a business, small or large, you MUST buy every application you use. As an example, if you are a college student teaching yourself to make a webpage and you want to use Dreamweaver and Photoshop, then as long as you are not making money off of it I think it is fine to pirate them. The moment people start paying you to make websites, you need to have legal copies of those programs.

Games are a specific subset of applications. The main use case of a game is never making money off of it. So ethically, if you want to play a game you should buy or rent it. That said, if someone lends you a copy of a game that is fine, but with the understanding that there is still only one working license of that game in existence. That is, they do not keep the game working on their machine and continue playing it while you have it on yours. I think pirating a game to test run it is fine too, provided you don't play it for more than an hour or so. In gaming, piracy leads to some of the worst DRM in existence (looking at you, EA). It also leads to companies removing things that should be part of games. My example is that Starcraft 2 will not have LAN play specifically because of the activities my brother and his friends were doing. They each got at least 7-10 hours of Warcraft 3 play in over the last three days. I can understand why they should have had to buy the game. I don't like Blizzard as a company. I really hate WoW and the way they run battle.net. But the success of WoW is partly the consumers' fault. If every person on the planet who played a Blizzard game for more than an hour had purchased the game, Blizzard would have seen more incentive to make more RTS and Diablo games. Instead they saw a market for the MMO and subscription services and went where the money was. That, philosophically, is why I refused to give my brother my key: I knew the effect that kind of piracy was having on the gaming world.

Music, movies, and television are each extremely complicated in terms of the ethics of piracy.

The music industry is broken in many ways. As soon as fans start believing that they are hurting the artists by pirating their songs, most will stop. Also, piracy clearly hasn't killed the music industry. I would say if you want to pirate music, make sure you go to concerts of the bands you like, and buy a CD of a band you like every once in a while, even if you already pirated it. iTunes now has DRM-free songs, so if you have one stuck in your head that you will listen to for a week straight, spend a dollar on it. Basically, give the industry feedback on good things, because if everyone pirates everything then the good artists won't rise to the top (ha, as if they do now...).

Movies are better in the theater. They just are. Most of the time. When you don't have a bunch of people shouting at the screen... Because of that, I will not watch a camcorder recording of a movie. I will watch a prerelease DVD rip, though. I honestly think there are some movies that must be watched in theaters. Then there are some that are crap and aren't worth your money. The term for those movies used to be "it's a rental". I think that perhaps movies should be priced on more than the time of day they show; the demand for the movie, how many people are seeing it, should also affect its price. But right now prices aren't determined that way. So, I say if the movie looks sweet, see it in theaters. If you miss it in theaters, get it any way you can. If you like it, buy the DVD. If the movie looks like crap but you still want to watch it, watch it somehow. If it was actually pretty good, buy it or suggest a friend buy or rent it. I have a huge DVD collection, bigger than anyone I know. I support movies I like and I don't care about movies I don't.

TV is extra tricky because you don't really pay for it outside of your cable bill. You pay for it by watching advertisements. So it feels pretty stupid to be told you are pirating something if you have it on your computer and you would have paid for it by watching a commercial. My thought is, if you liked the show a lot and have the cash, get the DVD box set. If you have the option, which you increasingly do now, watch the show online rather than watching a downloaded copy of it.

Book piracy is an issue that has gotten worse with the advent of the Kindle and ebooks. There are libraries where you can get books freely. So what is really the difference between getting the book from the library and pirating the book to your computer? The only book I have ever pirated was a college textbook. I pirated it because the college bookstore never stocked the damned thing, and I bought it at the end of the semester because I wanted a real book to study for the final with. I think if you like a book or are going to use it a lot, buy a copy, digital or hard. I understand college textbooks are a tricky issue because you are poor when you are in college, the textbooks are expensive, and pirating is a cheap alternative. I honestly don't know where I stand on that issue. I bought every textbook I have ever used, usually at a price higher than I should have. But I sympathize with people who cannot afford the textbooks. There are several alternatives to piracy for getting cheap textbooks at most colleges, though, such as book swaps or borrowing.

Lastly, my golden rule for piracy is that if you are giving pirated material to someone else and getting money for it, that is WRONG. You should never accept money in exchange for pirated material.

I believe that piracy does not have to be detrimental to any of the systems listed here. It can get people interested or using things that they otherwise would not have spent money on. If you share my views or disagree with them please leave a comment. I would love to hear your opinion.

Standing Up For Best Practices Right Out Of College

This past summer I was employed as an intern for a high-profile software company. I must admit I was afraid that I would be unprepared for the real world of Software Engineering. These fears came from many articles I have read in the past, some of which were linked in a previous post here. Basically, I felt like I knew nothing about unit testing, scrum teams, agile software development, and many other software development tools and practices that should be taught in schools and aren't.

The one tool/practice I did know about was version control. In my junior year of college I took an advanced game design class where I led a team of programmers and artists to make a game. Version control was of paramount importance to the success of our team. We used a Subversion repository with branches and very regular check-ins. You checked your code in the moment you were done working on it and checked out the latest version the moment you started working. If there were conflicts, you merged them as well as you could.

Now, I felt like I was quite successful in my internship this past summer. I was mostly working on a simple tool for internal use, and I was the only developer on that tool. However, there was another project I was also working on, a separate project that I was a small part of. This project was mainly run by a team in China. The source of the project needed serious retooling before I could implement my part. This retooling was not needed because the current team had coded the software poorly, but because the software was the result of several intern projects over several years, and there was a lot of copy/pasted code and an overall lack of design in the source.

I wanted to check up on the source periodically so I could see where it was going and how it was developing, and basically get a feel for the new architecture before I had to dive in and code in it. The problem was that the China team refused to check in their code base until it was stable. There was no trunk/branch hierarchy; there was one codebase, and changes between versions were not a gradual, stepwise process but sudden and massive. They would go upwards of weeks between check-ins.

I was obviously shocked by this. Good version control practices do not involve weeks between check-ins and everyone working from one branch. I approached my manager with the problem, and he understood the situation and wanted to fix it. I was quite fortunate that my manager had a degree in Computer Science and was not interested in deadlines so much as getting a quality piece of software.

We had a meeting specifically about the process. At this point I would like to take an aside and say that the Chinese-American accent is my least favorite accent in the world. It is extremely difficult for me to understand and I personally dislike listening to it. A lot. In college I had several professors with thick Chinese-American accents and I could not understand them in lecture 90% of the time. It annoyed me more because during the short bits where I was both alert enough to understand them and they were speaking semi-coherently, I recognized that they understood the material and were actually making a very decent attempt, using examples and language that put the ideas across well. Understand that I do not have any dislike of the Chinese because of this accent; it just makes the exchange of ideas harder for me personally.

That being said, during the meeting there were several times when I had no idea what the team from China was saying. Thankfully my manager did. I tried explaining to them that once we had a solid trunk in place with stable code for our project, any changes should be made in separate branches. Those branches should be checked in daily, and a modified branch should be fully reviewed by the team before it is merged with the trunk. The branches should be structured in a way that minimizes merging issues and makes logical sense with the development going on. This was a process that seemed to me a no-brainer, and I was shocked that they were not using it.
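
For anyone who hasn't worked with Subversion, the workflow I was pushing for looks roughly like this (a minimal sketch; the repository URL and branch name are made up for illustration):

    # create a feature branch as a cheap server-side copy of the stable trunk
    svn copy https://svn.example.com/repo/trunk \
             https://svn.example.com/repo/branches/retooling \
             -m "Create branch for the architecture retooling"

    # work in a checkout of the branch, committing at least daily
    svn checkout https://svn.example.com/repo/branches/retooling
    svn commit -m "Split report generation out of the main class"

    # once the branch has been reviewed, merge it back from a trunk working copy
    svn merge https://svn.example.com/repo/branches/retooling
    svn commit -m "Merge reviewed retooling branch into trunk"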

I then went on leave for a few days. When I got back I had emails in my inbox explaining why these methods did not make sense for the project and why they felt they should not need to implement them. They also did not seem to understand that it was not just me who needed to branch when the code was being modified, but that no one should be touching the trunk directly. I was again extremely thankful for my manager, who insisted that it was a practice that needed to be adopted by our team. He had responded to their emails before I got the chance.

I must put across that I do not believe that this was a regular practice throughout the company. I think most of the major products that we produced used best practices and that this was the exception. It just happened to be the exception that I had to deal with.

I do not honestly believe this to be that uncommon an occurrence, especially as more and more software jobs are outsourced to other countries. You may find yourself in a job where you are working with people who, through poor teaching or management, or simply through laziness, do not follow standard software practices. And your job may get progressively harder because of this. I must admit I do not think that the team would have changed its ways if my manager had not backed me up on the practice. People are just not likely to take advice from the new employee, much less an intern.

This all said, I have a few messages for people in the field of Software Engineering.

College students and new hires: learn the best practices for whatever you are working on. Use them, and instruct others to use them. Stand up for them at work if you see they are not being followed. It will make your job easier in the long run, and make a better product because of it.

Older software developers (some of this goes for new guys too): don't get lazy. Many of you know the best practices but choose to ignore them. Take pride in your work and do things right. Test your code, do personal reviews on your code, do peer reviews on your code, write your code so when someone else looks at it they will be impressed by how readable and elegant it is. Use version control. Follow the practices your company has decided you should use, and if you feel like there is a better practice or a better way to do something bring it up and suggest it.

Managers: read up on the best software practices. Buy books. Listen to your employees. Push them and encourage them to do better, but don't have them working so hard they need to cut corners to make deadlines. If you can learn anything from Windows Vista and people's unwillingness to upgrade from XP, learn that no matter how many new features you add and how shiny you make software, if it is poor quality software people won't buy it or upgrade to it.

Wednesday, June 17, 2009

Going From High Level to Low Level and Getting Lost on the Way Back, the Beginner Programmer's Nightmare

Often people are driven to learn the art of computer programming because they have a great new revolutionary idea that they want to create. It may be the next addictive game, killer iPhone app, or amazing web service. They will ask their tech-savvy friends what they should learn to achieve their goals. The answers they receive will vary drastically in programming language and programming paradigm. But in the end most people usually get started on some form of C-style language (C/C++/C#, Java, Python, etc.). The only issue is that the beginning fundamentals of programming never seem to have anything to do with their idea, or how to achieve it. But they are told to persevere, that there is a pot of gold at the end of the rainbow. After time, many give up frustrated, while others master the fundamentals of the programming language they have been tasked to learn but still have no idea how to approach making their idea a reality. I find this happens all too often to budding programmers. There is a huge disconnect between high-level ideas for an application and code syntax.

For those of you unfamiliar with the terms, let me define high-level and low-level in a programming sense, using a list going from high-level concepts down to low-level ones.

High level

1. One-sentence description of what the product is and does
2. Feature list of the product
3. User requirements for the product
4. Technical requirements for the product
5. Architectural overview of the product
6. UML Diagram of the pieces of the architecture
7. Java/C++/Python code (the actual code): the classes, functions/methods, and algorithms used in the program
8. Assembly/bytecode: the compiled or semi-compiled form of the code that actually executes
9. Registers/CPU: the hardware the program is running on
10. Transistors/capacitors/resistors/inductors: the components that make up the hardware
11. Electrons/protons/neutrons/laws of physics: the things that make up everything (you have gone too low level at this point)

Low level

As a new programmer, hopefully you will never have to go below level 7 on this list. It is very rare that you ever would. There are some low-level programmers I know who can't think above level 7, or even really at it, who are great at what they do. But for the kinds of people I am targeting with this article, I intend to ignore everything below level 7.

So let's talk a bit about programming. When you first start learning it, you are generally taught the fundamentals of procedural programming: simple variable creation and manipulation, primitive data types, standard operators, flow control (if statements and loops), and arrays of simple data types. You would hopefully get as far as functions before coming across too much trouble.

Now, after learning functions, most people hit their first real hurdle: classes. Classes and object-oriented programming are pretty fundamental to the way most people program now, but the ideas of creating a blueprint for a data type, and of having a variable which owns its own functions and inner variables, are often lost on new programmers.
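
To make that idea concrete, here is a tiny Java sketch (the class and its names are invented purely for illustration) of a class as a blueprint, where each object owns its own data and methods:

    // A blueprint for a simple "counter" data type.
    public class Counter {
        private int count = 0;          // an inner variable owned by each object

        public void increment() {       // a function that belongs to the object
            count++;
        }

        public int getCount() {
            return count;
        }

        public static void main(String[] args) {
            Counter a = new Counter();  // two separate objects built from one blueprint
            Counter b = new Counter();
            a.increment();
            a.increment();
            b.increment();
            System.out.println(a.getCount()); // prints 2
            System.out.println(b.getCount()); // prints 1
        }
    }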

But let's say you overcome those issues and you learn classes and inheritance, polymorphism, interfaces, generics, namespaces, recursion, even lambda functions. You are a master of the fundamentals of programming. But you still can't make that application.

It's understandable. Very rarely in college do you actually put all of the pieces together to make a fully functioning application. More often you are tested on algorithmic problems and sorting.

You take a class in Computer Graphics hoping to learn how to display a 3D world, get mouse and keyboard input, and import, animate, and make 3D models. Instead they have you rewrite the OpenGL engine. Mind you, you do in the process learn how to display a 3D world and handle input using OpenGL, but properly creating, importing, and animating 3D models in a program is still beyond you.

You take a networking course with the hopes of making the waiting room/chat/host-joining section of a game that you hoped to make multiplayer. Instead you learn how to send email via SMTP messages, how networks are structured, and how the TCP/IP stack works. You do glean how to set up socket communication between computers, which allows for basic network communication, but it's not what you hoped.
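
For reference, that basic socket communication in Java looks something like the sketch below (a minimal, assumed example; the port number and messages are arbitrary):

    import java.io.*;
    import java.net.*;

    // Minimal client/server sketch: run once with "server" as an argument,
    // then run it again with no arguments to send it a single line of text.
    public class SocketDemo {
        public static void main(String[] args) throws IOException {
            int port = 5000; // arbitrary port chosen for illustration
            if (args.length > 0 && args[0].equals("server")) {
                try (ServerSocket server = new ServerSocket(port);
                     Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()))) {
                    System.out.println("Client says: " + in.readLine());
                }
            } else {
                try (Socket socket = new Socket("localhost", port);
                     PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
                    out.println("Hello from the client");
                }
            }
        }
    }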

Fundamentally the problem is the disconnect between level 7 and levels 2 and 3 in these people's understanding of how to approach their problem. They can write code and understand it, and they know what they want, but they don't know how to design it, or what the right design is for it.

To create a parallel, it is the equivalent of teaching a mechanical engineer how to machine any part and then asking them to build an internal combustion engine. Or like teaching an electrical engineer how a resistor, transistor, capacitor, inductor, and battery all work, then asking them to build a radio. Needless to say, this lack of knowledge can lead to terrible and/or innovative designs.

Unfortunately there is no one book or quick remedy that will solve this problem (though looking up books on design patterns will be a big help). My suggestion is: if you have an idea, just start doing it, and don't be afraid to start wrong. Find better programmers than yourself and ask them to review your code. You will make some terrible mistakes. Believe me, you should have seen the first time I tried writing a GUI-based tic-tac-toe program in Java before I knew how event-driven programming worked. You will have times when you have to completely throw out a week's worth of work and start from scratch because your design was so wrong. But it's okay. Being a bad programmer when you start programming but asking for help and pushing yourself to do better and get it right the next time is infinitely better than being a bad programmer who is afraid to start doing anything because they don't know how.

Tuesday, June 16, 2009

What happened to you, iPod? It used to be about the music, man...

I have never in my life considered myself an audiophile. I grew up during the mp3/Napster days of the nineties, when poorly or improperly tagged mp3 files were swapped like STDs. I was young and didn't know to look for songs with good bit rates. I once made the mistake of burning an audio CD from low-bit-rate mp3s and then ripping the WAV-formatted CD to files of a higher bit rate, in effect bloating the size of the files without actually increasing the sound quality... I had much to learn about technology in those years.

Even now there is a good portion of music in my library that is 128 kbps or below. Any audiophile worth their salt would cry at this abysmal quality. However, I have not been particularly bothered by it. I have never been someone with an abundance of storage space. I have always tried to keep my music on my primary laptop, and until a few years ago I was limited to 30 gigs of music max. Many times when I put a recently purchased CD in my computer to rip, I pondered what format to rip to and at what bit rate. In the end I would usually pick something lower than I would like, simply so I would not waste space.

Recently, however, I was made aware of the FLAC music format. I had just beaten the phenomenal game Braid and was interested in purchasing the soundtrack online. The site gave me many format options, and many people seemed very pleased to see FLAC listed. For those of you unfamiliar with FLAC, it is a lossless format that, unlike WAV and AIFF, is compressed, which means you get lossless audio quality that takes up less space on your hard drive.

After reading up on the FLAC format I decided it was just the thing for me. I only wanted to check if iTunes and my iPod could use it... They couldn't. There exists an elaborate hack that allows iTunes to play the format, but natively it just balks at it.

Naturally the iPod also does not know what to do with the format. I find this stupid because Rockbox, a firmware replacement for many mp3 players including iPods, can play FLAC files.

Many other mp3 players actually support the format. I am not going to get into the argument of which headphones you have to use to even appreciate the format, or whether x or y sounds better. I think people who get into those arguments sound like jackasses (see the comments on that useful article).

I do think it is embarrassing that Apple does not support this format. It may be that they are trying to encourage people to use their own lossless ALAC format. However, I don't see what the motivation would be to do that. The iPod is no longer only a music player (it can also do hundreds of other useless things). But I have an iTouch, and the thing I do the most with it is listen to my music. Even people with iPhones, I would argue, most often use their device either as a phone or as a music player.

There are really two solutions: use the hack workaround to listen to FLAC files on my laptop and forget about putting them on my iTouch, or pick a new music player for my laptop and a new portable music player.

So far I have just held back on using or getting FLAC files, which, I suppose, means Apple is winning. If an update for iTunes and the iPod is not released soon, I may change my primary computer to one of my Unix or Windows machines so I can enjoy all of my music. That would be a sad day indeed.

WTF Apple. Step your game up.

Edit:
Also, Apple, make ID3 tags more transparently editable. For example, when you add that field to make things podcasts, allow us to disable it. And propagate the information across both types of ID3 tags...

Sunday, June 7, 2009

Robert Dewar right on CS curriculum, wrong on Java

The author James Maguire wrote two articles in 2008 about Robert Dewar and his views on Java as the core language in a Computer Science curriculum: "Who Killed the Software Engineer? (Hint: It Happened in College)" and "The 'Anti-Java' Professor and the Jobless Programmers". Before that, Professor Dewar wrote an article, "Computer Science Education: Where Are the Software Engineers of Tomorrow?". In these articles Professor Dewar cites Java as the downfall of Software Engineers in the U.S.

While I do think that the CS curricula at many schools are failing their students, I do not believe that teaching CS in Java is the root of these issues. Professor Dewar poses some great solutions for fixing CS departments, but I don't believe that teaching a different first language needs to be part of that solution.

Professor Dewar understands the root of the issues with CS:

“...Part of the trouble with universities is that there are relatively few faculty members who know much about software. They know about the theory of computer science and the theory of programming languages. But there are relatively few faculty who really are programmers and software engineers and understand what’s involved in writing big applications.”
“It’s just not the kind of thing that universities are into, really. Because they tend to regard computer science as a scientific field rather than an engineering field. So I’ve always felt that was a weakness.”

The crux of the problem is that the purpose of a CS degree is not to turn out Software Engineers. I was shocked when I learned this, but after speaking (and complaining) to numerous professors, I learned that the skills people actually need to get programming jobs in the real world are not considered when teaching students CS: things like unit testing, Make, proper documentation, code style, UML, proper debugger use, etc. Oh, the good CS students learn these things, mind you. The good CS students learn very quickly that their professors have no interest in teaching them the things the industry wants them to know, and that they need to learn those things themselves.

Professor Dewar poses two questions that he would expect CS students to be able to answer were he to hire them for a job.

"1.) You begin to suspect that a problem you are having is due to the compiler generating incorrect code. How would you track this down? How would you prepare a bug report for the compiler vendor? How would you work around the problem?
2.) You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification. How would you track this down? How would you prepare a bug report for the chip manufacturer, and how would you work around the problem? "

I would argue, according to the current CS ideals, that it is not a professor's responsibility to explicitly teach these things. Mind you, I don't agree with the current CS ideals, but knowing how to solve either of those problems does not deepen your knowledge of Computer Science; they are technical skills. As a CS student you can either teach them to yourself on your own time or learn them on the job.

I have argued with professors on this issue several times. I was frustrated with a compilers class in which the professor fully expected everyone to fluently program in C or C++ and be able to use a Linux terminal after taking only an intro to programming course in Java and data structures in Java. I am not against forcing students to learn multiple languages. On the contrary, I believe all CS students should learn at least three languages before they earn their degree (C/C++, Java, and one of their choosing... usually Python), but I take issue with the complete lack of preparation. When I asked one of my professors, Michael Branicky (Sc.D. EECS MIT), why we were not taught how to use the tools fundamental to our success as programmers, he curtly responded that CS is not about teaching people how to use tools but about teaching theory, and if we wanted to be taught the tools we should go to a technical school.

I have a very large problem with this attitude towards teaching CS. This attitude is what leads to CS graduates who cannot do simple tasks because no one has taught them how to use the tools; they only learn them enough to complete their assignments. The sheer number of bad habits and incompetent programmers created by this frightens me. It is the equivalent of not teaching mechanical engineers how to use anything in a machine shop, not teaching them CAD, then giving them a part to create and expecting them to be able to do it. Sure, a few of the more driven students would sit down together and teach each other how to work the machines, but you would have others afraid to use any of the machines and creating parts with hand tools alone. To give a more concrete example of what this leads to: I had classmates who, when asked to submit code in C++, rather than creating a simple Makefile to compile the project, would keep a text file with the commands necessary to compile the code and paste those commands into the terminal when needed. The smarter of the bunch would put those commands in a bash script, but that is still the wrong tool for the problem.
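
For contrast, the "right tool" version is only a few lines. A minimal Makefile for a hypothetical two-file C++ assignment might look like this (file names invented for illustration; recipe lines must be indented with tabs):

    # Minimal Makefile for a small C++ assignment
    CXX = g++
    CXXFLAGS = -Wall -g

    assignment: main.o parser.o
            $(CXX) $(CXXFLAGS) -o assignment main.o parser.o

    %.o: %.cpp
            $(CXX) $(CXXFLAGS) -c $<

    clean:
            rm -f assignment *.o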

Professor Dewar made comments about CS curricula teaching graphics libraries and not teaching algorithms. I simply do not believe this is happening. I have extensively searched through the CS curricula of many universities around the country, both large and small, and could not find any evidence of CS programs not teaching algorithms.

Now, as for why I think Java is not a bad choice for a first language: reading Professor Dewar's own article, his main issue with Java seems to be that his university did not teach Java correctly. I can't fault them. I myself have been a teaching assistant for a Java course that, well, doesn't teach Java to my satisfaction. The truth is Java is difficult to teach. Often references are underemphasized, and the difference between primitives and references as arguments is completely ignored. Professors rush to try to get their students to graphics libraries to make their programs fun and interesting. Simple sorting, algorithmic thinking, overall program structure, inheritance, and polymorphism are all pushed aside in favor of creating an applet that looks shiny. The simple truth of the matter is that an intro to programming course in Java can be taught exactly like one in C++. The overall theory and concepts can be duplicated. Professors simply are not teaching Java in this way. This is not the language's fault but the professors'.

Professor Dewar cited Java as having too many built-in libraries to be taught as a primary language. The solution is simple: don't let the students use those built-in libraries in their programs. There are plenty of C++ libraries out there too; mind you, they aren't standard like Java's, but simply because a language has a large library base does not mean you should not teach it as a primary language. Force the students to build from the ground up.

One of the most annoying statements that Professor Dewar made was, "And [forget] all this business about ‘command line’ – we’ll have people use nice visual interfaces where they can point and click and do fancy graphic stuff and have fun." I'm sorry, but computer programming should not be a contest of who likes chewing on glass more. It should not be painful, and if it is, then you need to develop a tool that eases that pain. The classic mantra, "a good programmer is a lazy programmer," applies here. I'm sorry if the development of the IDE annoys you and makes accessible skills that used to be available only to those who wished to toil away for hours mastering command-line text editors, debuggers, and Subversion tools. Perhaps you would prefer we go back further and program entirely with punch cards again, or receive error messages in hex code. In the real world people use IDEs, and at the introductory level students should be taught using IDEs. I believe that CS students absolutely need to know how to use command-line tools and compile using the terminal. When I was in undergrad I pushed for having the first two weeks of the intro to Java course at my institution be spent teaching simple terminal commands and terminal compilation and running of programs (instead that time is spent teaching HTML). But students also need to be familiar with IDEs: Eclipse (or NetBeans), Visual Studio, and possibly even Xcode.
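
And to be clear about how little time that takes to teach, compiling and running a simple Java program from the terminal is just two commands (the file name here is hypothetical):

    # compile HelloWorld.java to bytecode, then run the resulting class on the JVM
    javac HelloWorld.java
    java HelloWorld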

I understand Professor Dewar's feelings about Java as an introductory language. He believes that the CS jobs available to CS students are in lower-level languages like C++, C, Ada, and Fortran, compiling for machines that are not personal computers. For crying out loud, the guy owns a company called AdaCore Inc. He is a compilers man, and I understand how he might feel that the lower levels of Computer Science are being underrepresented at most universities. And frankly, they are. But this is because of the high-level, theoretical Computer Science professors who are doing the teaching. Teaching an AI algorithm or a tree traversal algorithm is far more interesting to them than forcing a CS student to program C at the register level. Teaching low-level languages in introductory CS courses often causes the language to get in the way of the solution. Let the lower-level aspects be taught later. I can't imagine how Professor Dewar feels about MIT changing their introductory language from Scheme to Python, a language which, if he feels Java has next to no application in real-world industry CS, he must feel has absolutely no place in real-world industry.