Musings++

Something something programming

Anachronistic Programming

While cleaning up the blog archives from when I was using Fargo, I've decided to repost this article, which I originally published on September 17, 2013.

I want to show you a piece of code. Something that's touted whenever this language is spoken of, and that everyone seems to be able to pull out of their ass. I want you to look at it, understand it, and then see how far down the execution chain you can take it. I'm not talking about whether you can debug the app, set a breakpoint and step through it. I want you to sit there, hands off the keyboard, and walk through the request cycle that needs to occur for this particular piece of code to execute and run as you expect it to. Spare no detail.

Did you get far enough? Did you get down to the HTTP protocol? The packet dance that happens before the actual request from a browser is sent? Or did you stop at "Browser makes a request to the IP address"?
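
To make the exercise a little more concrete, here's a rough sketch (my own, in Python - the host is just a placeholder, and it skips TLS, redirects and everything below TCP) of what a single request looks like when you speak HTTP by hand over a raw socket instead of letting a framework do it:

    # A minimal, hand-rolled HTTP/1.1 GET over a raw TCP socket.
    # "example.com" is just a placeholder host; no TLS, no redirects,
    # no connection pooling - only the bare request/response exchange.
    import socket

    host = "example.com"

    # DNS resolution: the name has to become an IP address before
    # any "request to the IP address" can happen at all.
    ip = socket.gethostbyname(host)

    # The TCP handshake - SYN, SYN-ACK, ACK, the "packet dance" -
    # happens inside this call, before a single byte of HTTP is sent.
    with socket.create_connection((ip, 80), timeout=5) as conn:
        request = (
            f"GET / HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n"
        )
        conn.sendall(request.encode("ascii"))

        # The response arrives as raw bytes; the status line, headers
        # and body are just conventions layered on top of a TCP stream.
        response = b""
        while chunk := conn.recv(4096):
            response += chunk

    print(response.split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.1 200 OK"

Even this stops well short of the bottom - DNS over UDP, the TCP handshake, IP routing and the wire itself are all still hiding inside those two library calls.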

See, the problem today isn't that the computer is this magic box that only a few can understand. It's no longer relegated to the guys with beards and thick glasses. It isn't just for the geeks and nerds. It used to be that being a programmer meant you also needed to understand the specific hardware stack you were working on: the exact chipset, the exact instructions available to you, the exact specs of the memory and video controllers.

Today, computers have become commonplace. And for them to get to this stage, a few things needed to happen. The first one being "It just works". That's the basis of the consumer computer: with no fiddling, no worrying about any kind of internals, you should be able to get up and running in no time.

But you're not just a typical consumer. You're a programmer. And unfortunately, this idea of "It just works" has found its way into programmers' minds everywhere. You don't need to think about the HTTP protocol - "It just works". You don't need to worry about little- vs. big-endian byte order - "It just works".

Until it doesn't.
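
And so the endianness jab doesn't stay abstract, here's a tiny sketch - again mine, in Python, purely illustrative - of the same 32-bit integer coming out as different bytes depending on which byte order you ask for, and of what happens when code assumes the wrong one:

    # The same 32-bit integer, serialized with both byte orders.
    # struct's "<" and ">" prefixes select little-endian and
    # big-endian layouts explicitly.
    import struct

    value = 0x12345678

    little = struct.pack("<I", value)   # b'\x78\x56\x34\x12'
    big = struct.pack(">I", value)      # b'\x12\x34\x56\x78'

    print(little.hex(), big.hex())      # 78563412 12345678

    # Reading little-endian bytes as if they were big-endian gives a
    # completely different number - "it just works" until it doesn't.
    (misread,) = struct.unpack(">I", little)
    print(hex(misread))                 # 0x78563412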

I think the problem with programmers today is quite simple. We've been led to believe that we can rely on certain things within the system. Which is great. I mean, if we couldn't rely on the HTTP protocol, where would we be today? But this reliance has led to an entirely new problem - "I don't care". Programmers today don't need to learn how the protocol works because "It just works". They don't need to think about it. And I think that has led to an entire generation of programmers who don't understand the fundamentals of programming. They don't understand that the code they type into their fancy IDEs is really powered by the ideas of a few people and runs on hardware. There's a severe disconnect between hardware and software, and it's hindering them without their even knowing it.

Don't get me wrong, I'm not trying to say that I'm some incredible programmer - far from it. I'm actually a terrible programmer, because programming isn't all software and algorithms. There's a hardware component to it that's overlooked way too often. The things you're doing with code, you're RELYING on the hardware to accomplish.

Don't you think you should at least have a vague understanding of how it works?


Becoming a web developer

The web is a big deal. Like, a HUGE deal. And the people who make the web have been thrown from their basements into the limelight. They've been lauded and applauded for being at the forefront of technological innovation. But everything has its cost. Jonathon Hill posted today about the cost of being a web developer. I think Jonathon missed an important cost - he's forgotten what it's like to be starting out. It's a common problem, and one that inevitably plagues even the great ones. I don't mean to belittle what Mr. Hill does for a living. While I don't have a lot of experience with his work, I'm sure he's a great web developer. And I'm sure his work is incredible because of the tools he has.

What I take issue with is that he seems to believe this is what a new developer requires to be great. On the contrary, I think having tools like these at your disposal from the beginning leads to one of two things.

  1. You start, the tools don't confer god-like web development skills. You get upset and leave.
  2. You start, the tools don't confer god-like web development skills. You get upset and work harder.

One of those two is good - and that same one doesn't require the initial investment that Mr. Hill thinks it does.

The real cost

  • Laptop (Hey, you probably have one of these right now!) ~ $700
  • Books ~ $500*
  • Linux - free (this one is optional, but if you're new to development I'd recommend it. There are a lot of great tools and utilities out there that show up on Linux first. They won't be pretty, but god dammit they'll be awesome.)

*optional

Total Startup Cost: ~$1,200

The books, of course, are not required, but eventually you'll find there are some things people will always refer to that you'd like to have around - JavaScript: The Good Parts, for example, or the Dragon Book. Books are very important.

The beauty of the web is that it isn't demanding on hardware when you're just trying to figure out what you're doing - a cheap laptop is plenty. But it can grow into whatever you want.

What about training?

Technology moves fast. Really fast. You know that awesome new phone you got two months ago? Out of date. You know that great new framework you learned last year? Technology has made it irrelevant. I will agree with Mr. Hill to a certain degree here: a technology-focused college education is quite the waste of time. However, I don't recommend skipping college right away. There are other things that a college education offers you apart from your program.

Presentations, working with others, taking charge of projects when you end up with a bunch of slackers. Making the tough decision to kick that one dude out of your group because he does nothing. Essays, reports, being on time. And the most important one for a FREELANCE WEB DEVELOPER, THE ONE MR. HILL SEEMS TO MISS: MANAGING YOUR TIME. For many people, college is the first time they're left to their own devices. They are responsible for themselves, and they have to figure out how they work best and how to manage their social lives AND their work lives. People are given 4 years to make this work. Four years when you're allowed to screw things up and start over. Because the thing is, once those 4 years are up, if you don't have some understanding of how to be you - you're pretty fucked.

However, don't waste your education on a technology degree. Instead, I'd recommend doing something unrelated: Psychology, Marketing, or even English/Theater.

See, there's a weird stereotype about a lot of tech people - they tend to be rather introverted. This isn't true of everyone, of course, but if it is true of you, you have to understand that even when you enter the workforce as a developer, you still need to interact with people. A LOT.

You have meetings and phone calls; you have to explain your choices to management and clients. If you're a freelancer, you have even MORE work: you need to be a sales guy, support staff, and a developer. If you find it hard to talk to people - good luck.

It's a great time to be a good developer.
~ Jonathon Hill

How true it is, Mr. Hill. It is a great time to be a good developer. But starting out with a $3,000 investment doesn't make you a good developer.

Having the drive to be better makes you a good developer.


Again, I feel like I have to point out that I think Mr. Hill does some great work. Having browsed through his projects, I'm not claiming he doesn't know what he's talking about - just that maybe he's forgotten what it's like to start out as a developer.