When I was a kid (and, admittedly, I’m severely stretching how young I was to preserve my dignity), PBS ran a show called “Cyberchase” before the reruns of “Wishbone.” (That perky little literature-loving Jack Russell was really why I was watching PBS so late into high school.) Despite my reasons, I vividly remember the villain, Hacker, being made out to be a “bad” hacker.
After having that drilled into my head, I never thought about the fact that a hacker wasn’t necessarily the guy who just wanted to drain my already sparsely occupied bank account. (Not like they’d find anything remotely useful, anyway. The Weasleys might have a few more coins in their vault than I do.)
I really like the idea of hacking as a tweak rather than a hostile takeover of someone else’s computer. I never would have thought of things that way. When PBS locks it into your brain that a hacker is someone who is the reason you can’t have nice things, you tend to believe it. After all, PBS had never lied to me before.
(Don’t you try to tell me that Wishbone was a lie. That little guy had me reading “Ivanhoe” and Dickens well before any of my classmates knew what a classic was.)

[Image caption: You can't tell me PBS lied. Come on. This little guy is the reason I inhale books.]
As far as the actual coding of a “hack” goes, I won’t try to lie. I’m not a coding expert and I never will be. I know enough to get me by. I do, however, know that I like to make things. I would spend hours watching HGTV as a kid and Carol Duvall was my hero. (Mom, if you’re reading this, I’m still somewhat bitter about the lack of Sculpey in my childhood.)
Because of this underlying creative genius (or nutcase), I really like the idea that we need to be able to make things. It doesn’t matter if that thing is the new hit app, the next Facebook, or a simple, functional website. We need to be able to create, not just spit back. I’m not about to delude myself — or anyone else — into thinking that I’ve got a social network that will put Facebook to shame in the works. I don’t. I never will. I can, however, create in more than one medium and make something functional. I can think freely, not just regurgitate facts and the opinions of others.
However, once again, I’m concerned with how far technology has come. Matthew Kirschenbaum asks in “Hello Worlds”:
The English department where I teach, like most which offer the doctorate, requires students to demonstrate proficiency in at least one foreign language. Should a graduate student be allowed to substitute demonstrated proficiency in a computer-programming language instead?
My immediate reaction? No. Absolutely not. Yes, computer programming is important. Yes, we’re a technological society. However, I think a foreign language is far more valuable. There is something to be said for being proficient in another language of human interaction versus being fluent in conversation with an inanimate object. Have we really become such a tech-driven society that a computer-programming language is considered a suitable substitute for a true foreign language? That concerns me. It’s really no wonder the younger generations seem less and less able to hold a meaningful conversation face-to-face.
[Image caption: What scares me is that I've seen kids even younger than these two with cell phones. Where do we draw the line and decide that face-to-face interaction is more important? I feel like this reliance on technology is relevant to the argument above.]
I agree that my connections with this article were based mostly on my love of crafting. It’s important to recognize all acts of creation as valuable. The idea of learning programming instead of a language, however off-putting to me at first, might draw on the same skills as learning a foreign language. I’d be curious to see a study on that one.