Category Archives: Development

The Rise And Fall Of Programming Languages

Everybody out there who is fluent in conversational Latin, please raise your hands. (Peering out) Not many. How about Esperanto? Perhaps a few more, although you all seem to have shaggy beards and a perpetually quizzical expression. (Not that there’s anything wrong with that.)

Human languages come and go, even though they are so closely identified with a people. There are efforts to keep them alive wherever possible, and by most counts there are well over a thousand so-called “endangered languages.”

So it should come as no surprise, given the pace of technology, that there are endangered programming languages as well. Some, like B, were stopgaps until something else came along. Others, like COBOL, were historically important but aren’t used for much new work today (beyond a lingering base of legacy systems and the small group who tends them).

When does a programming language become pervasive enough to be worth getting interested in? And when does it fade enough that it’s no longer a sensible choice for a project? Both are tough (and squishy) questions.

In terms of the upswing, my bias is to get interested pretty early in a language’s gestation: not necessarily to use it for a client project, but to get a sense of where language (and compiler) development is going. I’m not likely to use Ada or even Haskell when building something that others will need to maintain, but, as an example, looking at how Haskell handles lazy evaluation and “first-class” functions is fascinating, and it broadens the knowledge of the team.
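
To make that concrete, here’s a tiny sketch in Haskell of those two ideas; it’s illustrative only, not production code. Laziness means a definition can describe an infinite structure and only the demanded parts ever get computed, and first-class functions are ordinary values you can pass around.

    -- Lazy evaluation: this names *all* the naturals, but nothing is
    -- computed until something downstream demands it.
    naturals :: [Integer]
    naturals = [0 ..]

    -- First-class functions: 'twice' takes a function as an ordinary argument.
    twice :: (a -> a) -> a -> a
    twice f = f . f

    main :: IO ()
    main = do
      print (take 5 naturals)   -- only the first five elements are ever evaluated
      print (twice (+ 3) 10)    -- prints 16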

So perhaps the better questions are: when should we use a language for a project that will be released into the wild, and when should we stop, as that language’s star falls? The answer to both is really the same: use it while maintenance and long-term expertise are easy and relatively cheap to find.

We’d love to be in lifetime engagements with clients. And many of our clients are with us for many years. But we don’t assume that, and we don’t want to build something that will create hassles for the client later. So that means, no matter how much we love Forth, we’re probably not going to use it to build a web application. There just aren’t enough people out there to support it. (Plus, that’s not really a great use of the tool.)

But let’s take a tougher example: Perl. Fifteen years ago, it was everywhere. If you didn’t know it, you weren’t considered serious about building for the web. PHP has since usurped much of that space, but Perl remains a widely used language (although, more and more, it seems to be confined to the back end and the server side).

But man, I love Perl. Its ability to work with bits of data and patterns is perhaps matched, but rarely surpassed. Contrary to some of its reputation, it can be elegant; it just doesn’t force you to be. (Why is there so much bad Perl code? Bad Perl coders.) And CPAN, its archive of modules and third-party libraries, is peerless.

What to do, then? Objectively, Perl’s fortunes are falling. Has it fallen below the threshold for use on a major project? Well, as of this writing, I’d say no, but it’s getting close. The thumb on the scale that weighs the cost-benefit of using a language for a project is getting kinda heavy. We’re probably at the point where we will maintain Perl-based projects that are already in that language, but are unlikely to start something from scratch in it.

Which is sad, but for every one of those, there’s an Objective-C or a C# that’s climbing up the charts. Goodbye Esperanto, hello Mandarin.

Patterns Everywhere

As humans, we see patterns all over the place — how do we read? We identify patterns of letters. How do we navigate around our world? We see visual patterns and respond to them in our habitual ways. In fact, there’s a common phenomenon called apophenia, where we see patterns in randomness all the time (even when there really isn’t a pattern there).

So why not put this pattern-seeking behavior to good use? When doing design or development, there is also an array of patterns out there. In development, that may be a broad framework pattern, like Model-View-Controller, which specifies how to build the basic structure of an application, or something smaller, like a convention for how a variable is structured.
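
To make the MVC example concrete, here’s a minimal, hypothetical sketch in Haskell; the names (Counter, Input, render, handle) are made up for illustration, not taken from any framework. The model holds state, the view presents it, and the controller interprets input and updates the model.

    -- Model: the application's state.
    newtype Counter = Counter Int

    -- View: turns the model into something presentable.
    render :: Counter -> String
    render (Counter n) = "Count: " ++ show n

    -- Controller: interprets an input and produces an updated model.
    data Input = Increment | Reset

    handle :: Input -> Counter -> Counter
    handle Increment (Counter n) = Counter (n + 1)
    handle Reset     _           = Counter 0

    main :: IO ()
    main = putStrLn (render (handle Increment (Counter 0)))

The point isn’t the language or the toy example; it’s that once the three roles are separated, each one can change without dragging the others along.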

But patterns exist in design as well, and I don’t mean the textured backgrounds that dress up a layout. Rather, as in development, these can be broad structures (like a grid-based layout), a navigation feature (like double-tab navigation), or a functional element, like a CAPTCHA.

So why are these patterns important? Well, the more comfortable you are with the patterns, the easier (and faster) it is to bake them into projects. Even better, as you get more comfortable, you begin to extend the patterns and do new variations that help evolve design (and/or development) in new and interesting ways.

And therein lies the potential problem with pattern-based design or development: it can make it too easy to stay within the same set of parameters, and things stagnate. Nobody wants to see you build the same old thing for every client or need. If you hew too closely to the patterns, or use them by rote, you lose a lot of creativity.

But knowing the patterns and how they work is one of the best starting points there is. After all, if you’re going to leap off into the great design/development unknown, why not start atop a foundation of years of others’ expertise? You’ll get a lot further that way.

The Cobbler’s Kids Are Shoeless

There’s a 16th-century proverb that goes something like, “The shoemaker’s children go barefoot.” Or unshod. Or something olde-timey like that. But however you phrase it, the sentiment certainly seems true today.

Last week, Apple announced that their developer site was hacked, and that thousands of developers’ emails and other info may have been compromised. Both the hack and the hacker have since been called into question, and the real scope of the intrusion is unclear. But, to put it mildly, it ain’t good.

If Apple, with all of its resources and intricate technological knowledge, can’t keep its, ahem, stuff together with basic security, it seems like there’s not much hope for the rest of us. At least not under the current security regime. Some of this is certainly due to neglect close to home (the shoemaker/cobbler proverb again), but much of it comes down to how we handle security in general.

The username/password or email/password security approach just doesn’t work. It really doesn’t. Oh, sure, you might argue, it’s ubiquitous, so it must work fine. But there are SO many examples of breaches that something is amiss, and even where there aren’t breaches, it may just as likely be because nobody has really targeted that system yet.

So if not that, then what? Biometrics? RSA keys for everyone? A chip implanted under the skin? How far into apocalyptic sci-fi territory do we want to go?

Frankly, I don’t know. Each of those approaches has definite pros and cons, including a glimpse of dystopia. But I know this: what we have now is not working. And perhaps this is just another example of Apple leading the way.

Not that they wanted to lead in this particular area… Apple, get your kids some shoes!

Where to Start?

Every couple of months, someone excited about technology asks me how to become a developer, how to learn “how to code.” And I think that’s great. But they’re going about it all wrong.

There’s coding, and there’s programming, and there is a difference between them. One is a prerequisite to doing the other well.

If you are a coder, you can (probably with one or maybe two languages) attack problems and solve them. It may not always be elegant or efficient, but it works. You’re able to Get It Done.

But if you’re a good coder, you can (with whatever language is thrown at you, and probably choosing the one that is best suited to the task) attack problems and solve them as well. You can do it quickly, efficiently, and with as much simplicity as possible (without over-simplifying and missing the target). You can Get It Done Right.

So what’s the difference? The good coder is also a good programmer.

Learning how to program is mostly language-independent. It’s about how to think like the computer. How to spot common kinds of problems and solve them algorithmically. To use one of my favorite examples, when to use a quicksort versus a shell sort. What kinds of data structures work better in different cases. And so much more.
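
For instance, the idea behind quicksort (partition around a pivot, then recurse) can be written down in nearly any language. Here’s the classic textbook version, sketched in Haskell rather than as an in-place implementation, since the concept is the point:

    -- Classic textbook quicksort: split the rest of the list around a pivot,
    -- sort each side, and glue the pieces back together.
    quicksort :: Ord a => [a] -> [a]
    quicksort []       = []
    quicksort (p : xs) = quicksort smaller ++ [p] ++ quicksort larger
      where
        smaller = [x | x <- xs, x < p]
        larger  = [x | x <- xs, x >= p]

    main :: IO ()
    main = print (quicksort [5, 3, 8, 1, 9, 2])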

Almost none of that depends on a single language. In fact, learning those things in a language you think you’ll be using on “real” projects is probably a BAD idea. Which is why many universities use languages like Ada or Scheme. By doing that, you get (at least) two benefits: you can abstract the language away and focus on the underlying programming; and when it comes time to do “real” work, you’ll be learning a new language, which helps cement the programming concepts.

It’s no coincidence that many self-taught developers are coders, but not all of them are. The key is to search through their midst… and find a programmer.