Hope and Fear in the World of vim

For some reason, I decided to step back from any integrated development environments at work and just use my preferred power editor, vim. If you are unfamiliar, an IDE is a very powerful tool, like an assistant constantly at your elbow to remind you how to code better and how to use libraries, and to kick off any number of tasks like builds, deploys, and debugging. Early in my career, I resisted them, feeling that I should be able to be productive with a loose collection of individual tools. I finally caved when I read Martin Fowler’s “Refactoring” and realized that the one thing the IDEs at the time could do that my much beloved power editor plus command line could not was tear through a project’s code base, transforming the code smartly, based on an actual understanding of how source files fit together. The alternative was to use tools like sed, grep and awk to essentially do global search-and-replace operations across many, many files. To do so, you had to explicitly write out what you wanted changed, character by character. In an IDE, you could select a menu option (extract method, introduce parameter, rename class, and so on) and provide very lightweight input like new names and new locations, depending on how you are refactoring the code.

As much as I have used IDEs for the past handful of years, part of me still felt they were a little intellectually lazy. I’ll be the first to admit that feeling is fairly irrational, but that is how I have felt. I guess a quiet part of me feels I should be skilled enough, and deeply enough embedded in the languages, tools and projects I use, to make the loose-collection-of-tools approach work, strengthening my abilities as a coder by being much more actively in the mix of those tools. It is sort of how I feel about having a manual transmission in my car rather than an automatic. Except the clutch is n-th dimensional and I have to use all five fingers of my right hand, plus a few toes of the corresponding foot, to manipulate the stick through the gear changes.

I have to say, I am enjoying engaging deeply with my power editor of choice. Even though I took a break from coding professionally at my last job for about two years, I still used vim for any and all of my text writing and editing needs. The biggest difference since the last time I used vim for coding on a regular basis is how much more mature and robust the plugin ecosystem has become. While I still supplement vim’s incredibly powerful and efficient text manipulating capability with tools to search, check and build code, and more, a lot of very passionate folks have written some very nice glue code to allow those tools to be invoked from within vim and to interact with the way the editor thinks about text and files.

I have been sharing some updates as I noodle around with my vim setup, focused on the editor’s main configuration file, .vimrc. In response to one of those posts, a friend, Jed, asked if I’d write a post about the plugins I use. I am more than happy to comply. If you are curious, you can read along at the GitHub repo I created to house and share my hacking on my .vimrc. I have tried to comment the various files I use to explain my thought process and pay forward the many, many tricks I have found in recent days to make working with vim that much more pleasant than it is already.

First and foremost, there are at least a few ways now to automatically install and manage plugins, a capability that was sorely lacking even a few years back. Of those tools, I use vundle. I like it because it requires only a one-time manual install and then manages updates of itself as well as of plugins. It provides a really nice set of commands within vim, along with in-editor help explaining them, to search for, install and clean up unneeded plugins. My main .vimrc simply sources a supplemental file, vundle.vim, that both interacts with vundle and itemizes my current raft of plugins.
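For anyone who wants a concrete picture, a file like that tends to boil down to something along these lines. This is only a minimal sketch, assuming a standard ~/.vim layout; the plugin names are illustrative, drawn from the plugins discussed below rather than from my exact list.

    " vundle.vim, sourced from .vimrc with a line like: source ~/.vim/vundle.vim
    " drop old vi compatibility and disable filetype detection while plugins load
    set nocompatible
    filetype off

    " add vundle itself to the runtime path and start the plugin list
    set rtp+=~/.vim/bundle/Vundle.vim
    call vundle#begin()

    " let vundle manage vundle itself; the rest are examples drawn from the
    " plugins discussed below
    Plugin 'gmarik/Vundle.vim'
    Plugin 'bling/vim-airline'
    Plugin 'majutsushi/tagbar'
    Plugin 'scrooloose/nerdtree'
    Plugin 'scrooloose/syntastic'

    " close out the list and re-enable filetype detection, plugins and indenting
    call vundle#end()
    filetype plugin indent on

With that in place, commands like :PluginInstall and :PluginClean handle the rest from inside the editor.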

I have a bunch of language-specific plugins; I’ll forgo discussing those for now. The other, more general plugins are both more likely to be of interest to any random reader and contribute the most value, and joy, to my daily use of vim.

The first plugin I’ll recommend is airline. I have long had a custom status line in vim, one I configure to be always present. It showed a few useful things, but projects like airline and powerline improve on this massively and make it possible to have a very slick-looking setup. I use the patched powerline fonts, explained and linked to in the airline docs, to gild the lily that is the info-rich display of this plugin. The readme at airline’s GitHub project is its own best sales pitch; I encourage you to give it a read. I like it even more for being a vim-native rewrite of the original powerline plugin. It demonstrates just how incredibly powerful vim’s own scripting environment has become.
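If you are wondering what it takes to get that slick setup, it is surprisingly little. Here is a minimal sketch, assuming the patched fonts are already installed and selected in your terminal; the theme name is just an example:

    " always show the status line, even with only one window open
    set laststatus=2
    " use the patched powerline glyphs for the fancy separators
    let g:airline_powerline_fonts = 1
    " pick one of the bundled themes; the name here is just an example
    let g:airline_theme = 'dark'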

Another backbone of my toolset is exuberant ctags. This is a Unix program that digests source code for any number of languages and builds a database of tags. That database allows a programmer to quickly jump around a code base, from reference to declaration. vim has long supported integration with ctags, so that if you have an appropriate database built and accessible, jumping to a function declaration is only a hotkey away. If my aging memory is right, this support predates most if not all of the current popular IDEs. I use the plugin easytags, which allows vim to continuously refresh my ctags databases as I work. I pair that with the plugin Tagbar, which lets me open a split frame that uses ctags to present a code-structure-aware outline view of whatever source file I am working on. Further, Tagbar integrates with airline, enriching my status line with more contextual info about where I am within any particular segment of code.
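The glue for that pairing is also quite small. A sketch follows; the easytags option name is taken from that plugin’s documentation as I understand it, so double check it against :help easytags before relying on it:

    " look for a tags file next to the current file, then walk up the tree
    set tags=./tags;,tags
    " let easytags refresh the tags database in the background as I work
    let g:easytags_async = 1
    " toggle the Tagbar outline split with a single key
    nmap <F8> :TagbarToggle<CR>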

I use two plugins from github user scrooloose, NERDTree and Syntastic. The first brings in a file explorer that is way more powerful and easy to use than the already pretty nice explorer built into vim. The second integrates any number of standalone programming language syntax checkers, like jshint and Java Checkstyle. That integration allows me to jump around my sources, seeing where I have slightly junky code I might want to clean up or an actual break I definitely need to fix well before it hits a compiler or a deployed application.
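Neither needs much configuration to be useful. The mapping and the checker choice below are illustrative rather than required; both plugins document many more options:

    " toggle the NERDTree file explorer in a side split
    nmap <leader>n :NERDTreeToggle<CR>
    " run syntastic's checkers when a file is opened, not just when it is saved
    let g:syntastic_check_on_open = 1
    " tell syntastic which external checker to use for JavaScript
    let g:syntastic_javascript_checkers = ['jshint']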

I just added a tool called fugitive. Mostly I pulled it in to see my current working git branch in airline. It supports invoking git, the whole hog of distributed version control tools, from within vim, but I have barely scratched the surface of that. I am still creeping towards the deep end of git, at present trying to use more interactive adds to selectively commit not just whole files I’ve changed, but collections of related fragments of files, separating them from other changes in those same files I want to save for a subsequent commit, for a separate reason. Being able to use vim to select those chunks and build up commit sets more smartly is intensely appealing while still slightly daunting.
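A sketch of the kind of glue involved; the mappings are illustrative shorthand rather than anything the plugin ships with, and airline picks up the branch name from fugitive on its own:

    " open fugitive's interactive status window, much like running git status
    nnoremap <leader>gs :Gstatus<CR>
    " diff the working copy of the current file against the index; from that diff,
    " :diffput and :diffget move individual hunks into or out of the staged version,
    " which is exactly the piecemeal committing described above
    nnoremap <leader>gd :Gdiff<CR>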

Up to this point, if you are a coder, you may be thinking that all these plugins do is approach, but not quite compete with, a modern IDE. I wouldn’t necessarily disagree; as I said, these plugins just make existing tools easier to keep at the periphery of where I exert the vast majority of my focus every day. I have such deeply ingrained muscle memory for vim that I can create and change text far more effortlessly with it than with even the best editor offered by the best IDE. I have tried “vim” bindings in some IDEs and even an abomination that yokes vim directly to Eclipse, but all they have done is remind me of what I love about vim.

There is one last plugin that I feel is the most compelling response to the refactoring support available in packages like Eclipse and IntelliJ. For geeks of a certain age, tools like grep and sed are equal parts comfortable and frustrating. They are so well suited to sifting through masses of files yet so demanding in the exactitude of their use. A few years back, a colleague introduced me to a kinder, gentler alternative specifically for coders, ack. ack combines the expressive power of grep with a better understanding of the peculiarities of source code and programming language syntax.

ack.vim brings that useful and usable code searching into vim. In this case, search is only half the picture. vim supports macros, not in the sense of scripting or snippet expansion, but the ability to record and play back keystrokes. Everything vim does can be invoked without taking your hands off the keyboard. I feel this is vim’s best, most underappreciated feature. Imagine searching for some repetitive fixture in your code you feel must change, something thorny enough to require ack to confidently spot all instances of it. Now couple that, through this plugin, with the ability to repeatedly and correctly apply that transform, not just something superficial, but a complex interaction with every single occurrence you unearth. Using vim, ack and ack.vim may not be as simple as picking from a few menu options, but it also isn’t anywhere near as constrained. I’ve used macros to work over related sections of text, multiple lines together, not just a line or two around a simple declaration.
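To make that loop concrete, here is a rough sketch of how the pieces fit together; the pattern and the recorded keystrokes are placeholders for whatever thorny thing you are actually hunting:

    " populate the quickfix list with every occurrence of the pattern under a directory
    :Ack 'some_thorny_pattern' src/
    " at the first hit, record the fix once into register q:
    "   qq  ...make the multi-line edit...  q
    " then walk the rest of the quickfix list, replaying the recorded fix at each stop
    :cnext
    " in normal mode, @q replays the macro and @@ repeats the last replay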

I’ve only been applying this latest deliberateness to my use of vim for a few weeks. My hope and fear is that there is so much more to explore. Hope in that I already feel I have finally found a combination and integration of tools capable of giving any IDE a run for its money. Fear in that these waters run very deep indeed and my ability to plumb them is only so great.

If It Sounds Right

One of My Guitars

When I first started learning guitar, before I had a teacher and was just using books and videos to learn on my own, I worried whether I was learning good technique. I admit to obsessing a little bit about the shape of my grip, the arch of my fingers, whether I was fretting cleanly enough or not. I suppose I was carrying over assumptions I had from other skills into the pursuit of the new one. When writing software, there are definitely more and less correct ways of putting a program together. A lot is bound up in language and tool choice, often in terms of certain idioms that affect both correctness and readability of source code.

A couple of months along, I decided I would benefit from a teacher. I didn’t really reflect on this choice, but thinking back, I suppose it was motivated in equal parts by the desire to progress a bit faster than I was on my own and by that worrying over proper technique. One piece of advice I kept encountering was that whatever I was practicing, I should make sure I was practicing it correctly. After a little searching, I found a local teacher with stellar online reviews.

My guitar teacher is great. He seems to have an intuitive grasp of when I need to refine something in my practice or am ready to stretch out and tackle something new. At first he took a little while to understand my tastes and interests. Some of that was no doubt due to how much those tastes have been changing and growing in the last year or so. Lately he has been supplementing more structured skill work with learning songs I enjoy, both showing the applications of the techniques I am learning and rewarding my efforts with playing real songs as opposed to simpler studies.

I expected from the start that my teacher would pay close attention to my technique and correct minute imperfections. This is how I learned Tai Chi, with an eye towards correctness, one that progressively looked deeper and deeper into form and movement. Brewing has involved close attention to key variables in the process, ensuring good control as well as exercising good taste. Why would guitar be any different?

A few months into my guitar lessons, we finally discussed my expectations, briefly. I had already started to realize on my own that there was something different in approaching guitar, so really we just confirmed a certain truism that is common among musicians: if it sounds right, it is right. My teacher confirmed this notion and we got back to the business of producing right-sounding notes.

For most guitarists, certainly at my level of just gaining some basic fluency, the result is usually far more important than the way it was produced. Guitar reinforces that because it is such a flexible instrument. The same note can be sounded on several strings. Chords can be constructed and varied in an infinite number of ways. Realizing this, I understand that training of the ear and using judgement take priority over particular technique.

How can we draw a parallel with programming? Certainly when creating software for regular people, there is a sense of what seems right trumping hard and fast engineering precepts. If someone sitting down at an app cannot figure out how to make best use of it, like a muddled or disharmonious tune, then it has failed. I suppose even the compiler could be thought of as enforcing a similar, if more rigorous, critical evaluation. Witness single-line programming and the underhanded and obfuscated code contests, which play with the plasticity arising from language rules and semantics while still producing something that can by some definition be deemed “right.”

What role, then, do the volumes of text on correct theory and practice, for coding and guitar both, serve if ultimately the prime criterion admits so much variation?

If you are working with others, either to make some software or some music together, then a better understanding of your craft is invaluable. If you want to advance beyond a certain level of competence and tackle the greatest challenges, and hence feel the keenest depths of accomplishment, likewise you’ll need to cultivate an expertise beyond just a reasonably critical ear or eye.

Technical skill is a large part of the vernacular of a pursuit. To talk about challenges in programming, it helps to have clear and accurate terms and phrases. I have certainly struggled with some subtle aspect of a bit of code, for instance figuring out the right data structure, or kind of code to hold some information, that makes working with data simple, clear and effective in terms of code complexity and efficiency. When I don’t stop and really listen closely to someone trying to help me, I end up spending more time, and struggling harder, than when I recall my fundamentals of computer science and theory. In either case, I usually get there, improving or fixing my program, but when I leverage the full depth of my technical mastery, I get there sooner and with much less stress.

I am only just learning music theory but see a strong parallel. Most times when I am learning a piece of music with my teacher, I have to trust the music as written and his recommendations of where I can vary from it. As I have been reading about scales, intervals and the basics of chord construction, I am now able to think back and understand my teacher’s guidance. I can easily foresee a day when my grasp of theory lets me interpret and improvise much more easily than I can now. Knowing the theory increases my appreciation, even if for now I have to rely on trust more than on the skill, which I still mostly lack, to reverse engineer a piece or part of a piece and really, rationally understand why it works.

The sound of a song, or the user-apparent function of a program, is still important, but truly understanding the hows of producing a result both deepens the ultimate appreciation and allows for even more nuanced play in both cases. The interplay between correctness and playing to the ear yields a far more interesting result than hidebound adherence to tradecraft. Strict formalism is sterile. Grit yields joy. In a song, that’s a catchy lick or memorable chorus. In code, it is a program that delights, that allows you to do something, or to do something more easily, while revealing a fun-loving and inspired mind to the reader of the source code.

Getting Back Up to Speed

I had three hours to write a JavaScript single-page application that implemented a very simple Reddit client in the browser. I was assured I would likely not need more than one hour. At the end of the first hour, the interviewer checked back in with me to see how I was progressing. I had been struggling just to get the basic scaffolding of the application put together. I had used a lot of the hour looking up libraries and tools as well as refreshing myself on JavaScript syntax.

I managed to complete the assignment in a little over two hours. I felt not only frustrated but embarrassed at how long it took me as compared to when I was in my top programming form. The interviewer assured me it was fine but I knew it was far from my best possible effort. As it turns out, the interviewer was right. For that particular prospect in my recent job search, I progressed to the next round of interviews.

I wrote an Inner Chapter years ago about brushing up programming skills after a period of not using them. I wrote that chapter for a listener who had gone through some hardship and was looking for help adjusting. Partly it was coping with changes in physical ability specific to that listener, partly it was more general advice on sharpening skills. I’ve been thinking a lot again on the subject of getting back up to speed at programming, having just left a full-time management job to go back to a full-time programming one.

Even in the first year at the job I just left, when I was still in more of an engineering role, I didn’t feel like I got a lot of practice at programming. They had some technology projects, but the pace of and pressures on them, it being a non-profit, were very different from what I had experienced up to that point. Even when I was promoted, a lot of my skills as a technical manager didn’t apply very well. The kinds of stakeholders we had didn’t really know much about the standard software life cycle. I ended up spending far more time managing personnel, managing budgets, and fundraising rather than implementing and supporting processes for creating software in a regular and consistent way. It was a rare day when I got to work with technology directly.

As I’ve written about recently, a large part of my decision to leave that job was realizing how unhappy the lack of opportunity to code was making me. To make matters more stressful, during my job search I had to explain over and over again why I was leaving an upper management role and looking for work as a programmer.

“You are currently director of technology?” “Yes.” “You know this is not a management job, right?” “Yes.” “You’ve been a manager before. You truly understand this is just a programming job?”

I got very practiced at telling the story of my multiple runs at management. I would tell them that I never started as a manager. Working at a small organization, it would become clear, usually due to growth, that more leadership was needed. Foolishly, perhaps, I would volunteer, at least until they found someone better at managing. They never did. Eventually the strain and the desire to get back to programming would become overwhelming. Taking a demotion was never a practical option, so three times now I have found myself at this same juncture.

I have only been at my new job three weeks, but I’ve been thinking about a few experiences that could be helpful to someone else looking to get up to speed. I already shared the first one, but I had several other experiences just as part of the job search process that I think are relevant. Just going through several successive coding assignments, especially in a very reflective state of mind, revealed insights I continue to ponder.

In that very first coding assignment, I mostly felt frustration. I had to look everything up, even the basic tools and libraries I was using. I had lost the equivalent of muscle tone. I couldn’t reflexively spin up the bare minimum setup and configuration to get some arbitrary bit of new code working. I was struggling with my tools rather than feeling that they were smoothing and accelerating my efforts. The silver lining to that frustration was the immense sense of accomplishment in mastering it. As long as I remember this frustration, I am less likely to take my tools for granted. When I have felt that frustration again at the new job, even with more time and resources to get something done, I am able to see it in context, as a very normal part of the experience, trusting that if I persevere, I will be rewarded.

In a good environment, getting tools and configurations right will be a shared burden. My new gig is very much like this, though the work to keep the tooling up to snuff is a little bit of a guerilla priority. Engineers have to squeeze time in around things other people find more pressing. A lower priority on process and tools compared to more obvious business value is actually common in my experience. Regardless, as a consequence of the frustrations I felt during the coding assignments throughout my search, I was grateful. My new colleagues apologized for the roughness of the code and documentation; I was just glad not to have to do all the necessary grunt work on my own. I was motivated to contribute what I could out of that appreciation. I felt that contributing to improving the basic setup in this way paid forward the kindness done to me, lessening the frustration of the next new hire. Jumping in was also good practice to help with my overall brushing up and gave me an opportunity to understand the setup a bit better, always a good thing when the need to troubleshoot it arises.

I didn’t even have to get to the new job to realize some improvements in my disused coding skills. In the subsequent coding assignments, I was able to make better progress more quickly. I had to do a second assignment for the same prospect as that very first, super frustrating assignment. On that second attempt I was better able to focus on the challenge at hand and the quality of my code. I found that trend continued through a few other coding projects I did as part of my search. By the end, I was spending as much time on the code comments and the commit history as on the code itself, to really show off my thought process and how well I understood the problem I had been presented.

The last coding assignment I did, I completed in less than an hour. I got to show off a little, too. The problem definition included some hints that could easily be missed. It asked me to include some comments on the code’s performance. I was given the maximum expected input size. Once I had the code provably working, as demonstrated by some unit tests I turned in with the assignment, I thought about that maximum input size. I didn’t think it was included for no reason, so I added a unit test that ran a random input sample matching the stated max. Not surprisingly, my code slowed to a crawl.

I was able to recall not just my core programming skills but also some troubleshooting. I didn’t fire up a full-on interactive debugger, but I did add some logging. I worked from very broad to increasingly narrow sections of code until I had isolated the problem. I figured out and made some changes that, as my tests then showed, improved the performance to an acceptable level. I even left some code comments suggesting how the performance could be improved further.

I think that just the opportunity to practice with some clear goals can do a lot to help knock off the rust. You may not always have coding assignments like in a job search, but there are online resources that can serve a similar purpose. As part of my orientation packet at my new job, I was turned on to Code School. The topics are limited, focusing on web development with just a few technologies, but I like the approach. The courses combine short instructional videos and interactive exercises.

Whatever resources you find to use, I think the practical aspect is important. I am still relying on books a lot, even if I am loading them on my tablet rather than accumulating heavy stacks of paper as I did far earlier in my career. The best books, just like the online courses, include exercises that give the reader a chance to work with the material covered. Application is key to cementing a new skill or reviving a disused one.

I suppose there is a role for trivia, accumulating simple facts about programming or tools, but it is less helpful when learning for the first time or getting back into the swing of things. When working through an exercise, I will often have one of two very gratifying experiences. I will either laugh out loud or nod and smile as some bit of skill or knowledge comes flooding back to me. I laugh when feeling the visceral joy of getting something to work or, even more powerfully, of feeling some concept really snap into place in my mind.

The next challenge for me is getting back into good daily habits for coding. I think I will save my thoughts on that subject for another post. Hopefully, my early experiences getting back up to speed make sense and are useful to at least some of you.

TCLP 2014-12-21 A Sense of Scale and Getting Back Up to Speed

Used under CC license, by Wikipedia user Sunil060902

This is an episode of The Command Line Podcast.

In this episode, I share a couple more essays. Before the essays, I help spread the request for support from the GnuPG project. You can donate here. The first essay was already published on the web site, “A Sense of Scale.” The second essay is new, “Getting Back Up to Speed,” and I will publish it on the site in the next few days.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License

A Sense of Scale

Seahorse Tail fractal by Wikipedia user Wolfgangbeyer, CC-BY-SA

What is high quality code?

I have been asked this question now a couple of times in my current job search. I usually go back to my thoughts on functional decomposition. I like the concept but that phrase is clunky. It doesn’t help explain the concept very well.

As readers of the site know, recently I have been walking more in an effort to improve my health. For a longer time, walking has been a large part of my experience of travel. My favorite places to visit are those that are highly walkable. I suppose that is why I fell so hard for the few great European cities I have managed to visit so far. Most recently I spent a fair amount of time walking around Cleveland with a close friend, seeing that city through his eyes.

As with a lot of American cities, I noticed how distance in and around Cleveland works a bit differently than in the old world. To get from one part of a sprawling American metro area to another, you often have to take a car or some other form of transport like light rail or bus. The best American cities, as in Europe, are very walkable once you get to a specific part.

During this most recent trip I really dwelt on the different experiences of the same place when walking versus riding. Driving around, the scale of the place collapsed. Place names were much more shorthand labels than something I got any direct visceral sense of. I might see a storefront but it was there and gone too quickly for me to see into the interior, see the people inside, or any other specifics. I could form a very rough map of the overall city but could not yet identify with its districts and neighborhoods.

Once we got out of the car and started walking, my perspective shifted. The character of each district emerged in how the various businesses, buildings and open spaces were arranged in close relation to each other. Architectural themes became clearer, revealing more detail and texture. Individual people, groups and crowds made more of an impression. Everything opened up, feeling a lot bigger. The other place names on my mental map receded further away in favor of the scenes right in front of me. Conversation with shop owners, restaurateurs and citizens became possible, filling in more of the narrative sense of place than the useful but necessarily brief sketches my guide gave me on the way in.

Even more recently, I took a walk through a well known area park that happens to extend right up to within a block or two of my home. The park is named for a creek that meanders through it and the foot paths closely follow the aimless curves the running water takes. I love walking with the sound of running water nearby. I took my ear buds out and left them out for most of my walk. Like the sense of expanding scale I felt during my week in Cleveland, my sense of the neighborhood opened up, through sound, sight and sense of distance.

I walked over to the nearest shopping plaza and back. By mileage covered, I actually walked at least twice as far as the straight-line distance between there and my house. Walking through this stretch of green space, running right through the midst of an active DC suburb, it felt like ten times as far. I realized how much I had been taking that distance for granted. I usually cover it in my car, in trips measured in tens of minutes rather than the hour my walk took. That walk, covering the same area and a comparable distance, felt like an entirely different scale.

I think there is a closer parallel revealed in the best written source code. High quality source code is like a fractal. The more you zoom, the more detail is revealed. That detail doesn’t have to be self-similar; it usually isn’t. What emerges the closer you look is more like Mandelbrot’s more complex fractals, like the one pictured above.

When reading to get the general sense of a program, it should feel like driving through a cityscape, taking in more of a gestalt than any specific surface detail. A reader should be able to grasp the general orientation of the code. This module does this, that subsystem does that. More specific details should be easy to ignore, behind API boundaries or at least in different but well laid out source files. The placeholders and map should cover just the broad functions, responsibilities and relationships of the code.

When troubleshooting, especially in an interactive debugger, it should be possible to walk down into the code to focus on a specific neighborhood to the exclusion of the rest of the code. That overall map may still be in mind, but it should feel comfortable and natural to just take in the local concerns without needing to know anything more specific about more distant code. If distant code intrudes, odds are your bug is going to be much, much more difficult to isolate and to fix.

When writing high quality code, think about the experiences you have that reveal different senses of scale for the same thing. For me, walking is one activity that helps me grasp this idea on an intuitive level. No matter how fast I walk, or even if I decide to run, the zoomed in sense of scale remains. I have to express a concrete intention to trigger a shift in scale, by getting in a car or a bus.

If you can use the tools your language and technology of choice provide to cultivate that same possibility of overlapping, simultaneous but different scales, then I think you are on to something. In my experience, code that works at various scales is easier to test, to read, and to troubleshoot.

Measurement Lab and Google Summer of Code 2012

I rarely post directly about my day job but wanted to put that hat firmly on my head for a second, since I know there are a fair number of hackers amongst my readers, several of whom, I have heard, are also students.

The main project on which I work, Measurement Lab, has been accepted as a mentoring organization for this year’s Google Summer of Code. We are acting as an umbrella for one sub-project of our own and two other fantastic, network-measurement-related ones (DONAR and Paris Traceroute).

Please take a moment to look over our ideas page and either consider submitting a proposal or sharing with any students you may know who might be interested.

Rant on the Failure of Programming to be Pragmatic

A listener sent me a link to this rant, The State of the Art is Terrible, by Zack Morris. If you can wade through the technical humbuggery, I think there is a useful point. Several decades after the advent of high level programming languages and well into the age of ubiquitous computing, it genuinely is time for computing technology to be more focused on outcomes.

I’m on the threshold now of rejecting this false idol, but for at least a little longer I have to cling to it to carry me through. I have a dream of starting some kind of open source movement for evolvable hardware and languages. The core philosophy would be that if your grandparents can’t use it out of the box to do something real (like do their taxes or call 911 when they fall down) then it fails. You should literally be able to tell it what you want it to do and it would do its darnedest to do a good job for you. Computers today are the opposite of that. They own you and bend you to their will. And I don’t think people fully realize how trapped we are within this aging infrastructure.

The post is rife with examples of how the status quo is an abysmal failure to all but those of a very hackish bent. Morris touches on why this is so, the industrialization of software and the subsequent urge to profit. If you can wade through the very down tone, I think there is a kernel of optimism–a call for a sea change in how computers work and work for us.

Morris isn’t alone in this view, keeping company with the likes of Jaron Lanier. This is not likely to be the last rant in this vein. I think he is a bit more pragmatic, though, highlighting PHP as an example of a step in the right direction. His point isn’t that PHP has a natural-language-based syntax or semantics that mirror concepts and idioms with which non-programmers are familiar. Rather, he suggests it for its more productive failure modes: it makes a best effort on the easy stuff and doesn’t obscure breakages that require more investigation.

Whether you agree with the choice of PHP specifically or not, it is worth considering it as an example of the model he is proposing: languages and tools that are more focused on outcomes than on abstract design principles or idealized syntax. The emphasis on getting out of the way of doing useful things ultimately sets this rant apart from a crowd of voices raising many of the same critiques of the state of the art.

Microsoft Enters the Parallel Programming Fray

If ever there was a sign that a particular area of development is going mainstream, it is the entry of Microsoft into the space. Rik Myslewski at The Register has a somewhat breathless write up of Microsoft’s answer to OpenCL and CUDA, C++ AMP.

Microsoft principal native-languages architect Herb Sutter unveiled the technology Wednesday morning at AMD’s Fusion Developer Summit. Initially, C++ AMP will help devs take advantage of general purpose GPU computing (GPGPU), but in the future, Microsoft will extend the technology to multi-core architectures and beyond.

Those future plans include opening up the specification, if not the implementation, of C++ AMP. Sutter also explains the reasons for choosing C++ over C, although I expect those have more to do with Microsoft’s proprietary tooling investments in the form of Visual C++. Given both of those bits, I don’t really expect to see this particular approach employed anywhere outside of Windows. The siloing of approaches in this area is more tragic in my mind than in existing application development, since building usable and effective parallel programming techniques and tools is a vastly greater challenge.

Microsoft juices C++ for massively parallel computing, The Register

Next Volume of Knuth’s Master Work is in Print

Slashdot has the news that the next volume in the definitive series by renowned computer scientist Donald Knuth is now available in print. Knuth’s books have a somewhat mythic status amongst many hackers. Reading them represents a litmus test for a certain level of programming chops to which few mortals aspire, let alone attain.

(I own a boxed set of the first three volumes. You may recognize it from an avatar I use on my various social network profiles. I have started but made very poor progress on volume 1 to date.)

Volume 4A of Knuth’s TAOCP Finally In Print, Slashdot

Democratization of Coding

Clive Thompson at Wired has an interesting suggestion: that coding should be accessible enough for anyone with an idea to readily implement it. He uses the example of an app that helps deal with texting while driving, made possible by Google’s App Inventor for Android.

So what if the phone knew you were driving—and responded on its own?

Normally, Finnegan wouldn’t have been able to do anything with his insight. He was a creative-writing major at the University of San Francisco, not a programmer. But he’d enrolled in a class where students were learning to use Google’s App Inventor, a tool that makes it pretty easy to hack together simple applications for Android phones by fitting bits of code together like Lego bricks.

Finnegan set to work, and within a month he’d created an app called No Text While Driving. When you get into your car, you hit a button on the app and it autoresponds to incoming texts with “I’m driving right now, I’ll contact you shortly.” I’ve used the app, and it’s terrific: By getting you off the hook socially, it makes your driving safer. It ought to be available—mandatory, even—on every phone.

I like the further implication, that programming literacy would help everyone more effectively filter out the bullshit and elitist claims made by the relative few who are fluent at coding now. I think there is a pleasant counter-intuition for pros like myself: the more easily everyone can code up any interesting idea, the more likely the skills of an experienced programmer will be in demand to take it that much further.

Clive Thompson on Coding for the Masses, Wired