Often people are driven to learn the art of computer programming because they have a great new revolutionary idea that they want to create. It may be the next addictive game, killer iPhone app, or amazing web service. They ask their tech-savvy friends what they should learn to achieve their goals, and the answers they receive vary drastically, from programming languages to programming paradigms. But in the end most people get started on some form of C-style language (C/C++/C#, Java, Python, etc.). The only issue is that the beginning fundamentals of programming never seem to have anything to do with their idea, or how to achieve it. But they are told to persevere, that there is a pot of gold at the end of the rainbow. After a while, many give up frustrated, while others master the fundamentals of the programming language they have been tasked to learn but still have no idea how to approach making their idea a reality. I see this all too often with budding programmers. There is a huge disconnect between high-level ideas for an application and code syntax.
For those of you unfamiliar, let me define high-level and low-level in a programming sense using a list that runs from high-level concepts down to low-level ones.
High level
1. One-sentence description of what the product is and does
2. Feature list of the product
3. User requirements for the product
4. Technical requirements for the product
5. Architectural overview of the product
6. UML Diagram of the pieces of the architecture
7. Java/C++/Python code - the actual code: the classes, functions/methods, and algorithms used in the program
8. Assembly/bytecode - the compiled or semi-compiled product of the code; this is the program executing
9. Registers/CPU - the hardware the program is running on
10. Transistors/capacitors/resistors/inductors - the components that make up the hardware
11. Electrons/protons/neutrons/laws of physics - the things that make up everything (you have gone too low-level at this point)
Low level
As a new programmer, you will hopefully never have to go below level 7 on this list; it is very rare that you ever would. I know some low-level programmers who can't think above level 7 (or even really at it) and who are great at what they do. But for the kinds of people I am targeting with this article, I intend to ignore everything below level 7.
So let's talk a bit about programming. When you first start learning, you are generally taught the fundamentals of procedural programming: simple variable creation and manipulation, primitive data types, standard operators, flow control (if statements and loops), and arrays of simple data types. You would hopefully get as far as functions before running into too much trouble.
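To put those terms in one place, here is a minimal sketch in Java (the class and method names are my own, purely for illustration) of roughly the ground those first lessons cover: variables, primitive types, operators, flow control, arrays, and a simple function.

```java
// Fundamentals.java - an illustrative sketch of the procedural basics listed above.
public class Fundamentals {

    // A simple function (static method): sums the elements of an int array.
    static int sum(int[] values) {
        int total = 0;                               // variable creation, primitive type
        for (int i = 0; i < values.length; i++) {    // flow control: a loop
            total = total + values[i];               // standard operators
        }
        return total;
    }

    public static void main(String[] args) {
        int[] scores = {3, 7, 2};                    // an array of a simple data type

        if (sum(scores) > 10) {                      // flow control: an if statement
            System.out.println("High total: " + sum(scores));
        } else {
            System.out.println("Low total: " + sum(scores));
        }
    }
}
```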
After functions comes the first hurdle for most people: classes. Classes and object-oriented programming are pretty fundamental to the way most people program now, but the ideas of creating a blueprint for a data type, and of having a variable that owns its own functions and inner variables, are often lost on new programmers.
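To make the blueprint idea concrete, here is a small hedged sketch in Java (the Player class is invented for illustration): the class describes the fields and methods once, and every variable of that type is an object carrying its own copy of that state.

```java
// Player.java - a sketch of a class as a "blueprint" for a data type.
public class Player {
    // Inner variables (fields) that every Player object owns.
    private String name;
    private int score;

    public Player(String name) {
        this.name = name;
        this.score = 0;
    }

    // A method that belongs to each Player object.
    public void addPoints(int points) {
        score += points;
    }

    public String describe() {
        return name + " has " + score + " points";
    }

    public static void main(String[] args) {
        Player p1 = new Player("Alice");    // two variables built from one blueprint
        Player p2 = new Player("Bob");
        p1.addPoints(5);                    // each object tracks its own state
        System.out.println(p1.describe());  // prints: Alice has 5 points
        System.out.println(p2.describe());  // prints: Bob has 0 points
    }
}
```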
But let's say you overcome those issues and you learn classes, inheritance, polymorphism, interfaces, generics, namespaces, recursion, even lambda functions. You are a master of the fundamentals of programming. But you still can't make that application.
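For readers who want to see a few of those terms side by side, here is a compact, purely illustrative Java sketch (the Shape, Circle, and Square names are mine) touching interfaces, polymorphism, generics, and a lambda.

```java
import java.util.ArrayList;
import java.util.List;

// A compact, illustrative tour of several of the concepts named above.
public class FundamentalsTour {

    interface Shape {                       // an interface
        double area();
    }

    static class Circle implements Shape {  // implementing the interface
        final double radius;
        Circle(double radius) { this.radius = radius; }
        public double area() { return Math.PI * radius * radius; }
    }

    static class Square implements Shape {
        final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    // Generics: works for a list of any Shape implementation.
    static double totalArea(List<? extends Shape> shapes) {
        double total = 0;
        for (Shape s : shapes) {            // polymorphism: each shape computes area() its own way
            total += s.area();
        }
        return total;
    }

    public static void main(String[] args) {
        List<Shape> shapes = new ArrayList<>();
        shapes.add(new Circle(1.0));
        shapes.add(new Square(2.0));
        System.out.println("Total area: " + totalArea(shapes));

        Shape unit = () -> 1.0;             // a lambda implementing the interface on the fly
        System.out.println("Unit shape area: " + unit.area());
    }
}
```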
It's understandable. Very rarely in college do you actually put all of the pieces together to make a fully functioning application; more often you are tested on algorithmic problems and sorting.
You take a class in computer graphics hoping to learn how to display a 3D world, get mouse and keyboard input, and import, animate, and create 3D models. Instead they have you rewrite the OpenGL engine. Mind you, in the process you do learn how to display a 3D world and handle input using OpenGL, but properly importing and animating 3D models in a program is still beyond you.
You take a networking course hoping to build the waiting-room/chat/host-joining portion of a game you want to make multiplayer. Instead you learn how to send email via SMTP, how networks are structured, and how the TCP/IP stack works. You do glean how to set up socket communication between computers, which allows for basic network communication, but it's not what you hoped for.
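For what it's worth, that socket piece really is only a handful of lines. Here is a hedged, minimal Java echo server and client (the port number and messages are arbitrary choices of mine), roughly the level of networking such a course leaves you with.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// A minimal echo server and client. Run with "server" as an argument in one
// terminal, then run with no arguments in another. Port 5000 is arbitrary.
public class EchoDemo {

    static void runServer() throws Exception {
        try (ServerSocket server = new ServerSocket(5000);
             Socket client = server.accept();                     // wait for one connection
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()));
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            String line = in.readLine();                          // read one message
            out.println("echo: " + line);                         // send it back
        }
    }

    static void runClient() throws Exception {
        try (Socket socket = new Socket("localhost", 5000);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println("hello");
            System.out.println(in.readLine());                    // prints "echo: hello"
        }
    }

    public static void main(String[] args) throws Exception {
        if (args.length > 0 && args[0].equals("server")) {
            runServer();
        } else {
            runClient();
        }
    }
}
```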
Fundamentally, the problem is the disconnect between level 7 and levels 2 and 3 in these people's understanding of how to approach their problem. They can write and understand code, and they know what they want, but they don't know how to design it, or what the right design for it is.
To draw a parallel, it is the equivalent of teaching a mechanical engineer how to machine any part and then asking them to build an internal combustion engine, or teaching an electrical engineer how a resistor, transistor, capacitor, inductor, and battery all work and then asking them to build a radio. Needless to say, this lack of knowledge can lead to terrible (and occasionally innovative) designs.
Unfortunately there is no one book or quick remedy that will solve this problem (though looking up books on design patterns will be a big help). My suggestion is: if you have an idea, just start doing it, and don't be afraid to start wrong. Find better programmers than yourself and ask them to review your code. You will make some terrible mistakes. Believe me, you should have seen the first time I tried writing a GUI-based tic-tac-toe program in Java before I knew how event-driven programming worked. You will have times when you have to completely throw out a week's worth of work and start from scratch because your design was so wrong. But that's OK. Being a bad programmer when you start, but asking for help and pushing yourself to do better and get it right the next time, is infinitely better than being a bad programmer who is afraid to start doing anything because they don't know how.
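If "event-driven" is a new phrase to you, the gist is that you register handlers and the GUI framework calls them when something happens. A hedged little Swing sketch (my own toy example, not the tic-tac-toe program mentioned above):

```java
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JOptionPane;

// A tiny event-driven GUI: the program does nothing until the user clicks,
// at which point Swing invokes the listener we registered.
public class ClickDemo {
    public static void main(String[] args) {
        JFrame frame = new JFrame("Event-driven demo");
        JButton button = new JButton("Click me");

        // The event handler - the framework calls this on each click.
        button.addActionListener(e ->
                JOptionPane.showMessageDialog(frame, "You clicked the button"));

        frame.add(button);
        frame.setSize(200, 100);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}
```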
Wednesday, June 17, 2009
Tuesday, June 16, 2009
What happened to you, iPod? It used to be about the music, man...
I have never in my life considered myself an audiophile. I grew up during the MP3/Napster days of the nineties, when poorly and improperly tagged MP3 files were swapped like STDs. I was young and didn't know to look for songs with good bit rates. I once made the mistake of burning an audio CD from low-bit-rate MP3s and then ripping the WAV-formatted CD to files of a higher bit rate, in effect bloating the size of the files without actually increasing the sound quality... I had much to learn about technology in those years.
Even now there is a good portion of music in my library that is 128 kbps or below. Any audiophile worth their salt would cry at this abysmal quality. However, I have not been particularly bothered by it. I have never been someone with an abundance of storage space; I have always tried to keep my music on my primary laptop, and until a few years ago I was limited to 30 gigs of music at most. Many times, when I put a recently purchased CD in my computer to rip, I pondered what format and bit rate to rip to. In the end I would usually pick something lower than I would have liked, simply so I would not waste space.
Recently, however, I became aware of the FLAC music format. I had just beaten the phenomenal game Braid and was interested in purchasing the soundtrack online. The site gave me many format options, and many people seemed very pleased to see FLAC listed. For those of you unfamiliar with FLAC, it is a lossless format that, unlike WAV and AIFF, is compressed, which means you get lossless audio quality that takes up less space on your hard drive.
After reading up on the FLAC format I decided it was just the thing for me. I only wanted to check whether iTunes and my iPod could use it... They couldn't. There exists an elaborate hack that allows iTunes to play the format, but natively it just balks at it.
Naturally, the iPod also does not know what to do with the format. I find this stupid because Rockbox, a firmware replacement for many MP3 players including iPods, can play FLAC files.
Many other MP3 players actually support the format. I am not going to get into the argument about which headphones you have to use to even appreciate the format, or about what sounds better, X or Y. I think people who get into those arguments sound like jackasses (see the comments on that useful article).
I do think it is embarrassing that Apple does not support this format. It may be that they are trying to encourage people to use their own lossless ALAC format; however, I don't see what the motivation to do that would be. The iPod is no longer only a music player (it can also do hundreds of other useless things), but I have an iPod touch and the thing I do most with it is listen to my music. Even people with iPhones, I would argue, most often use their device either as a phone or as a music player.
There are really two solutions: use the hack workaround to listen to FLAC files on my laptop and forget about putting them on my iPod touch, or pick a new music player for my laptop and a new portable music player.
So far I have just held back on using and getting FLAC files, which, I suppose, means Apple is winning. If updates for iTunes and the iPod are not released soon, I may change my primary computer to one of my Unix or Windows machines so I can enjoy all of my music. That would be a sad day indeed.
WTF Apple. Step your game up.
Edit:
Also, Apple: make ID3 tags more transparently editable. For instance, when you add that field that marks things as podcasts, allow us to disable it. And propagate the information across both types of ID3 tags...
Sunday, June 7, 2009
Robert Dewar right on CS curriculum, wrong on Java
Jesse Fish
The author James Maguire wrote two articles in 2008 about Robert Dewar and his views on Java as the core language in a Computer Science curriculum: "Who Killed the Software Engineer? (Hint: It Happened in College)" and "The 'Anti-Java' Professor and the Jobless Programmers". Before that, Professor Dewar wrote an article, "Computer Science Education: Where Are the Software Engineers of Tomorrow?". In these articles Professor Dewar cites Java as the downfall of software engineers in the U.S.
While I do think that the CS curricula at many schools are failing their students, I do not believe that teaching CS in Java is the root of these issues. Professor Dewar poses some great solutions for fixing CS departments, but I don't believe that teaching a different first language needs to be part of that solution.
Professor Dewar understands the root of the issues with CS:
“...Part of the trouble with universities is that there are relatively few faculty members who know much about software. They know about the theory of computer science and the theory of programming languages. But there are relatively few faculty who really are programmers and software engineers and understand what’s involved in writing big applications.”
“It’s just not the kind of thing that universities are into, really. Because they tend to regard computer science as a scientific field rather than an engineering field. So I’ve always felt that was a weakness.”
The crux of the problem is that the purpose of a CS degree is not to turn out software engineers. I was shocked when I learned this, but after speaking with (and complaining to) numerous professors, I learned that the skills people actually need to get programming jobs in the real world are not considered when teaching students CS: things like unit testing, Make, proper documentation, code style, UML, and proper debugger use. Oh, the good CS students learn these things, mind you. The good CS students learn very quickly that their professors have no interest in teaching them the things the industry wants them to know, and that they need to learn those things themselves.
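To give one concrete example of the kind of skill in that list, a unit test in Java with JUnit is only a few lines. This is a hedged sketch; the Calculator class under test is invented purely for illustration.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// A minimal JUnit 4 unit test. The Calculator class is a made-up example;
// the point is only to show what a basic unit test looks like.
public class CalculatorTest {

    static class Calculator {
        int add(int a, int b) { return a + b; }
    }

    @Test
    public void addReturnsSumOfArguments() {
        Calculator calc = new Calculator();
        assertEquals(7, calc.add(3, 4));   // the test fails if add() is wrong
    }
}
```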
Professor Dewar poses two questions he would have CS students answer were he to hire them for a job:
"1.) You begin to suspect that a problem you are having is due to the compiler generating incorrect code. How would you track this down? How would you prepare a bug report for the compiler vendor? How would you work around the problem?
2.) You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification. How would you track this down? How would you prepare a bug report for the chip manufacturer, and how would you work around the problem?"
I would argue, according to the current CS ideals, that it is not a professor's responsibility to explicitly teach these things. Mind you, I don't agree with the current CS ideals, but knowing how to solve either of those problems does not deepen your knowledge of Computer Science; they are technical skills. As a CS student you can either teach them to yourself on your own time or learn them on the job.
I have argued with professors on this issue several times. I was frustrated with a compilers class in which the professor fully expected everyone to program fluently in C or C++ and be able to use a Linux terminal after having taken only an intro to programming course in Java and a data structures course in Java. I am not against forcing students to learn multiple languages. On the contrary, I believe all CS students should learn at least three languages before they earn their degree (C/C++, Java, and one of their choosing... usually Python), but I take issue with the complete lack of preparation. When I asked one of my professors, Michael Branicky (Sc.D. EECS MIT), why we were not taught how to use the tools fundamental to our success as programmers, he curtly responded that CS is not about teaching people how to use tools but about teaching theory, and that if we wanted to be taught the tools we should go to a technical school.
I have a very large problem with this attitude towards teaching CS. This attitude is what leads to CS graduates who cannot do simple tasks because no one has taught them how to use the tools; they learn them only well enough to complete their assignments. The sheer number of bad habits and incompetent programmers this creates frightens me. It is the equivalent of not teaching mechanical engineers how to use anything in a machine shop, not teaching them CAD, and then giving them a part to create and expecting them to be able to make it. Sure, a few of the more driven students would sit down together and teach each other how to work the machines, but you would have others afraid to use any of the machines, creating parts with hand tools alone. To give a more concrete example of what this leads to: I had classmates who, when asked to submit code in C++, rather than creating a simple Makefile to compile the project, would keep a text file with the commands necessary to compile the code and paste those commands into the terminal when needed. The smarter of the bunch would put those commands in a bash script, but that is still the wrong tool for the problem.
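For contrast, the "simple Makefile" in that anecdote might be no more than the following sketch (the file names main.cpp, util.cpp, and util.h are placeholders I made up; note that the indented command lines must begin with a tab character).

```make
# A minimal Makefile for a small C++ assignment (file names are placeholders).
CXX = g++
CXXFLAGS = -Wall -g

assignment: main.o util.o
	$(CXX) $(CXXFLAGS) -o assignment main.o util.o

main.o: main.cpp util.h
	$(CXX) $(CXXFLAGS) -c main.cpp

util.o: util.cpp util.h
	$(CXX) $(CXXFLAGS) -c util.cpp

clean:
	rm -f assignment *.o
```

With this in place, typing make rebuilds only the files that changed, which is exactly what pasting commands from a text file cannot do.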
Professor Dewar made comments about CS curricula teaching graphics libraries and not teaching algorithms. I simply do not believe this is happening. I have extensively searched through the CS curricula of many universities around the country, both large and small, and could not find any evidence of CS programs not teaching algorithms.
Now, as for why I think Java is not a bad choice for a first language: reading Professor Dewar's own article, his main issue with Java seems to be that his university did not teach Java correctly. I can't fault them; I myself have been a teaching assistant for a Java course that, well, doesn't teach Java to my satisfaction. The truth is that Java is difficult to teach. References are often underemphasized, and the difference between primitives and references as arguments is completely ignored. Professors rush to get their students to graphics libraries to make their programs fun and interesting. Simple sorting, algorithmic thinking, overall program structure, inheritance, and polymorphism are all pushed aside in favor of creating an applet that looks shiny. The simple truth of the matter is that an intro to programming course in Java can be taught exactly like one in C++; the overall theory and concepts can be duplicated. Professors simply are not teaching Java in this way. This is not the language's fault but the professors'.
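Since the primitives-versus-references point is exactly the kind of thing that gets glossed over, here is a small hedged Java sketch of the distinction (the method and variable names are my own):

```java
import java.util.ArrayList;
import java.util.List;

// Shows the difference between passing a primitive and passing a reference.
public class ArgumentsDemo {

    static void bumpPrimitive(int n) {
        n = n + 1;               // changes only the local copy of the value
    }

    static void bumpList(List<Integer> list) {
        list.add(1);             // changes the object the caller's reference points to
    }

    public static void main(String[] args) {
        int count = 0;
        bumpPrimitive(count);
        System.out.println(count);          // still 0: the primitive was copied

        List<Integer> numbers = new ArrayList<>();
        bumpList(numbers);
        System.out.println(numbers.size()); // 1: both references point to the same list
    }
}
```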
Professor Dewar cited Java as having too many built-in libraries for it to be taught as a primary language. The solution: don't let students use those built-in libraries in their programs. Simple as that. There are plenty of C++ libraries out there too (mind you, they aren't standard like Java's), but simply because a language has a large library base does not mean you should not teach it as a primary language. Force the students to build from the ground up.
By far one of the most annoying statements Professor Dewar made was, "And [forget] all this business about ‘command line’ – we’ll have people use nice visual interfaces where they can point and click and do fancy graphic stuff and have fun." I'm sorry, but computer programming should not be a contest over who likes chewing on glass more. It should not be painful, and if it is, then you need to develop a tool that eases that pain. The classic mantra "a good programmer is a lazy programmer" applies here. I'm sorry if the development of the IDE annoys you and makes accessible skills that used to be available only to those who wished to toil away for hours mastering command-line text editors, debuggers, and Subversion tools. Perhaps you would prefer we go back further and program entirely with punch cards again, or receive error messages in hex code. In the real world people use IDEs, and at the introductory level students should be taught using IDEs. I believe that CS students absolutely need to know how to use command-line tools and compile from the terminal. When I was in undergrad I pushed for having the first two weeks of the intro to Java course at my institution spent teaching simple terminal commands and the terminal compilation and running of programs (instead it is spent teaching HTML). But students also need to be familiar with IDEs: Eclipse (or NetBeans), Visual Studio, and possibly even Xcode.
I understand Professor Dewar's feelings about Java as an introductory language. He believes that the CS jobs available to CS students will be in lower-level languages like C++, C, Ada, and Fortran, compiling for machines that are not personal computers. For crying out loud, the guy owns a company called AdaCore Inc. He is a compilers man, and I understand how he might feel that the lower levels of computer science are underrepresented at most universities. And frankly, they are. But this is because of the high-level, theoretical computer science professors who are doing the teaching. Teaching an AI algorithm or a tree traversal algorithm is far more interesting to them than forcing a CS student to program C at the register level. Teaching low-level languages in introductory CS courses often causes the language to get in the way of the solution; let the lower-level aspects be taught later. I can't imagine how Professor Dewar feels about MIT changing their introductory language from Scheme to Python, a language which, if he feels Java has next to no application in real-world industry CS, he must feel has absolutely no place in real-world industry.