
The Quiet Revolution in Programming


About every ten years, it seems, we're told that we're on the precipice of some revolutionary development in computing: The PC revolution. Ten years later, the networking and client/server revolution. Ten years on, the Internet revolution. Today, we're in the throes of the mobile and cloud revolutions. Authentic and important as these tides of change are, they do not greatly affect the way we approach programming. There, tides roll in more slowly. Object orientation in the 1990s, the subsequent rise of dynamic languages, and the emergence of parallel programming cover many of the important changes. (Arching over these is the magic dust of agile programming, which changed the coding experience by integrating tests and moving to a faster, leaner model.) And even these waves sweep over programming slowly — note, for example, the snail's pace adoption of parallel programming.

However, during the last 24 months, the sheer volume of change in computing has been so great that programming has felt its imprint right away. Several shifts are occurring simultaneously: the ubiquity of mobile apps, the enormous rise of HTML and JavaScript front ends, and the advent of big data.

The greatest effect these changes have had on software development is the need to work in multiple languages. Web apps, by definition, are multilanguage beasts. Many mobile apps require more than one language; and even if they can be written in one language, they often rely on a Web- or cloud-based back-end that's written in an entirely different language. And, of course, big data work is done in all manner of specialized languages. We have had to become polyglots to develop almost any current application. As a result, when the research and analysis firm Forrester recently surveyed our readers about how much time they spend writing in any given language, the results (from 500 developers) looked like this:

[Chart: fraction of programmers (y-axis) who spend a given share of their time (x-axis) coding in any one language, 2012 data]

Note the big spike on the left and the mostly sub-2% numbers for programmers coding more than 50% of the time in one language. I expect, after some reflection, that most readers will find this chart unexceptional. Most developers work in two or more languages. (And of those languages, JavaScript is the one most frequently combined with others. I expect its numbers to remain high for years, as its role in Web and Windows 8 apps assures its continued use.)

If the previous chart looks unsurprising, consider how the responses looked when we asked the question in late 2010:

[Chart: fraction of programmers (y-axis) who spend a given share of their time (x-axis) coding in any one language, 2010 data]

These charts are stunningly different when it comes to the right side. Two years ago, fully one-quarter of programmers wrote in just one language, and half wrote in only two languages. Today, such conservative use of languages looks like a luxury.

Even though these charts show only major languages, I believe a secondary development will reinforce the trend; namely, the use of embedded scripting languages. They have already become standard practice in game development, where Lua is frequently embedded for scripting UI components in C and C++ codebases. In Java, the proliferation of JSR-223 scripting engines that are callable from within Java applications foreshadows more of this activity on servers and desktops.
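
To make the JSR-223 idea concrete, here is a minimal sketch (my illustration, not anything from the Forrester survey) of a Java host handing a bit of logic to an embedded scripting engine at runtime. The engine name and the exposed variable are assumptions for the example; any JSR-223 engine on the classpath (Rhino, Groovy, JRuby, a Lua binding, and so on) is driven through the same javax.script API.

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

// Minimal JSR-223 embedding sketch: the compiled Java host stays fixed,
// while the frequently changing logic lives in a script.
public class EmbeddedScriptDemo {
    public static void main(String[] args) throws ScriptException {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");
        if (engine == null) {
            System.err.println("No JavaScript engine found on the classpath.");
            return;
        }
        // The host exposes data to the script...
        engine.put("price", 42.0);
        // ...and the script supplies the "soft" logic.
        Object total = engine.eval("price * 1.2");
        System.out.println("Total with markup: " + total);
    }
}

The appeal is the same one that drives Lua in game engines: the performance-critical host remains in a compiled language, while the pieces that change most often can be edited and reloaded without a rebuild.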

The movement from few to many languages has important ramifications. For example, it's now more difficult to find programmer talent that satisfies all the needs of a project; and it's more difficult as a programmer to be deeply fluent in all the necessary languages and idioms. These obstacles might suggest division of labor along the lines of programming languages (which likely reflect different concerns and separate components), but as the first chart shows, this is not happening. Rather, the traditional division of labor along domains appears to be the continuing norm. Because the trend is so new, it's hard to tell what the other repercussions of this shift are and will be. I'll cover them as they emerge. Meanwhile, all hail polyglots!

— Andrew Binstock
Editor in Chief
alb@drdobbs.com
Twitter: platypusguy



Comments:

ubm_techweb_disqus_sso_-9e9711dbc73220a974a9d007acb99624
2014-01-08T22:06:13

I find it interesting that Google is trying to get back to the one- or two-language model from two different directions. They've developed Dart with the primary intention of replacing JavaScript on the web front-end, but it's also capable enough to replace Java on the back-end. And I see no reason why the language couldn't compile directly to native code instead of its VM and give C/C++ a run for its money.

The second Google direction is developing Native Client (NaCl) that takes C++ code and generates Clang object/byte code that can be converted to native code at install time and run under the Chrome browser. Other languages having Clang-based compilers could theoretically do the same thing.

Two other projects based on Clang technology, Emscripten and Duetto, compile C++ code into Javascript for web applications. So I don't think it's the language that's so important, but the compiler technology/implementation underlying the language and its runtime environment.

Wouldn't it be nice to stop wasting so many brain cells on becoming a polyglot and instead learn problem-solving techniques for multiple application domains?


ubm_techweb_disqus_sso_-f1c86d36cf4899b8ba250f0506570e00
2013-06-26T06:13:30

A universal language should be created, and AI must be the foundation for it. Why not try it?


ubm_techweb_disqus_sso_-4d701a4f8114e6c5eb661f0edb928a70
2013-05-08T12:55:12

I think this may be partially due to the economy. When times are good you can afford to focus on a single language/technology. When times are challenging it pays to be proficient with a variety.

Over the decades I've seen many friends lose jobs because demand for their skills disappeared with the advent of newer languages.


ubm_techweb_disqus_sso_-0364134f1f69d4ab2c764ed780b128c1
2013-05-06T09:08:36

yuck


ubm_techweb_disqus_sso_-bf67b400e08a0cfc81cee3a104be46d0
2013-04-10T15:12:37

If you can do one you can (probably) do the other. That being said, I actually like the idea of multiple languages targeting the same platform. Whether the reason is legacy support or languages that are better for specific niches, the ability for .NET to handle multiple languages is a boon, not a curse.


AndrewBinstock
2013-04-09T18:25:14

Think about the programmers who used both languages in the 2010 survey and you'll be able to figure out the problem from there, I expect.


ubm_techweb_disqus_sso_-8e26fb82becb4c33dc8b5acb5b2dc2c9
2013-04-09T18:07:01

#3 The stated question was "how much time they spend writing in any given language".

Programmers who used C 10% and C++ 10% in the 2012 data, are shown once for C and once for C++ in the 10-20% bucket.

Sum the C 10% and C++ 10% to give C/C++ 20% and they are shown just once for C/C++ in the 20-30% bucket.

Whatever they used the other 80% of the time doesn't change.

Where's the problem?


ubm_techweb_disqus_sso_-b51f7a869a1175412e56da18be4810cb
2013-04-09T12:49:04

I heard someone suggest--in jest, I hope--that the way to get past the resume keyword scanners is to fill your resume with things like "I do not know Python. I am not familiar with Web Services. I have never used Ruby." Then the scanners will pop you out for every job ;-)

Of course, there may be a slight downside....


ubm_techweb_disqus_sso_-0cd02431100cced55cb6ba2793570183
2013-04-09T08:11:15

I agree with you.

As I belong in the second category you describe, my experience switching jobs matches your description exactly.

It is not easy to convince the HR guys how flexible I am; usually I need to focus on their keywords to get in somehow.


ubm_techweb_disqus_sso_-b51f7a869a1175412e56da18be4810cb
2013-04-08T14:50:52

An interesting insight. However, I can't help but think about an observation I've made in the past. Take a big Fortune 100 company and look at their technical "elite" (fellows or whatever title they give them). If the company is smart, there will be two kinds: one group will know everything about something. They are experts, often world-class experts, on some thing. C++, ultrasonic welding, or Auger spectroscopy, or whatever -- they know it all or, at least, a substantial part of it. These people are important because they can be a resource for an entire organization. However, they are often subject to being displaced by new technologies (it is hard to be a COBOL guru these days, or--in my case--there's not much call for experts on DOS extenders anymore).

The second population are people who understand a lot (but not everything) about a lot of things. These people are usually able to make associations between different areas and are often skilled designers. They are usually fast to pick up new things, as well. If you find the people like this who can also communicate and deal with people, they become your chief architects and chief engineers because they can deal with the hardware, software, and problem domain. It is harder to displace these people with new technologies and, as your post points out, these people may be even more valuable in the future. But you still need both populations, I think.

One of the difficulties, as someone else pointed out, is at job interview time. It is easy to say "We need someone who knows Linux and C++." It is harder to say "We need someone who understands just about everything to some degree and can focus down on a number of topics at will." That's too hard for HR departments--who typically think we are all interchangeable anyway--to measure. So people in that second population often have to start in strange jobs and create their jobs once they are in place. I remember a time in the past when I was brought in on a consulting job because of what I knew about Windows. But the project was all on a Unix-like system (Dynix) and the team was struggling. Since I knew Unix as well, I had a lot of impact doing things that shouldn't have been a big deal. I never did figure out why they hired Windows people to work on a system that didn't use Windows.

I think as an industry we have done a poor job of differentiating our various specialties and this is part of it. You would not go to a podiatrist for chest pains. But I observe daily people with some deep skill set being asked to do something totally different and, often, not doing such a great job at it.

The real cost of everything is in terms of people. Companies that learn how to get, retain, and develop the right people and then use them to their maximum potential are going to have an advantage over the ones who just randomly shoot people at problems.

You know I'm long-winded, but a thought-provoking blog post and it--well--provoked my thoughts. Maybe the shift you've pointed out will call attention to the specialization conundrum. Thanks for the post.


ubm_techweb_disqus_sso_-06770b5746e551e06414781868a49324
2013-04-06T08:50:20

Lua isn't just used for scripting UI components in game development; rather, it's used for game logic - e.g., scripting missions.


ubm_techweb_disqus_sso_-f8bbf463ed55549ae5ebbe8098289ddd
2013-04-05T19:01:09

I'm surprised that SQL (T-SQL or PL/SQL) wasn't included. I spend about 1/3 of my time writing or maintaining SQL code.


AndrewBinstock
2013-04-05T17:20:11

#1. I believe it is as yet unpublished by Forrester. #3 No, combining C and C++ numbers won't give you a number you can accurately compare with C/C++ due to the programmers who use both languages.


ubm_techweb_disqus_sso_-8e26fb82becb4c33dc8b5acb5b2dc2c9
2013-04-05T14:11:53

#1 -- So is the report you're referencing unpublished?

#2 (see #3)

#3 At least, you could add together the 2 separate 2012 C and C++ categories to give the data "you'd like to have" !


ubm_techweb_disqus_sso_-18000e32e2e7b4d8a0484050008bb18e
2013-04-05T03:09:53

Maybe Microsoft should read this article. Maybe they would stop providing multiple programming languages for their platforms, based on the silly notion that software engineers know only one. We would get rid of the weird distinction between C# and VB.NET, which fragments their developer community and limits access to jobs ("Oh you have 2 years of VB.NET experience, sorry, we need 2 years of C# experience.").


AndrewBinstock
2013-04-04T20:46:21

Thanks for your note.
#1 -- No, that's some other Forrester report.
#2 -- Yup, now it's less than 2%, which is why I say the era has come to a close.
#3 -- In a perfect world, these would have been lined up perfectly. But you have to go with the data you have, rather than what you'd like to have.

So far, most of the feedback on HN and elsewhere suggests the numbers reflect the trend. This supports my personal observations in speaking with (many, many) developers and vendors. They're all seeing the same trend and the same rapid deceleration of single-language programming.


ubm_techweb_disqus_sso_-8e26fb82becb4c33dc8b5acb5b2dc2c9
2013-04-04T19:50:08

1) "Forrester recently surveyed our readers"

Presumably Here Comes The Open Web — Embrace It

2) "During the last two years, one of the longest eras in programming has quietly drawn to a close."

3 years ago Forrester Research was already reporting in DrDobbs that -- "less than 15% of the developers we surveyed spend all their time writing in a single language."

3) "These charts are stunningly different when it comes to the right side"

The most recent chart breaks out the previous C/C++ category into 2 separate categories C and C++ (which obviously would show higher if they were summed).

The most recent chart adds HTML/CSS to the mix -- maybe there were just as many who did a bit of HTML/CSS back in 2010 (but were never asked).


AndrewBinstock
2013-04-04T03:26:56

Agreed, that's a limitation. But it's also why the key point is how much the single-language numbers declined, rather than how much the multi-language numbers increased.


ubm_techweb_disqus_sso_-26be4e708c84c817e84abd2af4fd4ba2
2013-04-04T03:18:02

so if I spend 10% of time on 10 languages, my input inflates the numbers for all plots on the left hand side, but if I spend 100% of time on a single language, only that one plot gets an uptick on the right hand side of the graph.


ubm_techweb_disqus_sso_-cf42724450cab2052dbda6c8528b6109
2013-04-03T21:55:33

No surprise. In the last few years I've done my current "primary" language (C++), but also web apps using c# with asp.net, ajax, javascript, etc.

However I don't view this as a "new" thing.

A company I worked for in the late 80's/early 90's (System Integrators, Inc.) used five different languages, each tailored to the required task. Some were quite full-featured while others were scripting languages for processing of inbound AP wire stories and text markup. (For those that might have worked there - TAL, CPL, Mapgen, Textgen, and STYL). This doesn't include the OS/2 workstation applications built using C++.

Meanwhile my wife does mobile applications across multiple platforms, and frequently has to shift between java, c++, c# and objective-c. Depending on which platform is in need, that's where she works. They have no interest in hiring ideologically-pure language people. The last thing they want to hear when trying to get product out the door is "I can't help - I don't do c#!"


AndrewBinstock
2013-04-03T21:03:43

Thanks for your correction. That's my mistake. Yes, the spike on the left. We'll get that corrected and add a little bit of additional explanation on the charts to make them clearer. Both fixes should be posted shortly. No, the drop off was not the result of less coding.


ubm_techweb_disqus_sso_-b6e86d92ed278fe2aafdb4949116aa70
2013-04-03T20:52:48

"Note the big spike on the right"

Do you mean on the left? And you imply that the axes should be labeled "Percent of respondents using a language" for the Y axis and "Percent of time using a language" on the X axis - but I'm not sure.

I remember in the late 70s and early 80s, it seemed like every C programmer also used assembler for "optimized" loops and embedded systems functions. Things have just shifted a bit.

And were the Forrester people able to determine whether the drop-off (from the "peak on the left") was due to people using more than one programming language or just doing less coding and doing more design and/or administration?


