Hope and Fear in the World of vim

For some reason, I decided to step back from any integrated development environments at work and just use my power editor of choice, vim. If you are unfamiliar, an IDE is a very powerful tool, like an assistant constantly at your elbow to remind you how to code better and how to use libraries, and to kick off any number of tasks like builds, deploys, and debugging. Early in my career, I resisted them, feeling that I should be able to be productive with a loose collection of individual tools. I finally caved when I read Martin Fowler’s “Refactoring” and realized that the one thing the IDEs of the time could do that my much beloved power editor plus command line could not was tear through a project’s code base, transforming the code smartly, based on an actual understanding of how source files fit together. The alternative was to use tools like sed, grep and awk to essentially do global search and replaces across many, many files. To do so, you had to explicitly write out what you wanted changed, character by character. In an IDE, you could select a menu option–extract method, introduce parameter, rename class, etc.–and provide very lightweight input like new names and new locations, depending on how you are refactoring the code.

As much as I have been using IDEs for the past handful of years, part of me still felt they were a little intellectually lazy. I’ll be the first to admit that feeling is fairly irrational but that is how I have felt. I guess that a quiet part of me feels I should be skilled enough and deeply embedded in the languages, tools and projects I use to make the loose collection of tools approach work, strengthening my abilities as a coder as I work by being much more actively in the mix of those tools. Sort of how I feel about having a manual transmission on my car rather than an automatic. Except the clutch is n-th dimensional and I have to use all five fingers of my right hand, plus a few toes of the corresponding foot, to manipulate the stick through the gear changes.

I have to say, I am enjoying engaging deeply with my power editor of choice. Even though I took a break from coding professionally at my last job for about two years, I still used vim for any and all of my text writing and editing needs. Since the last time I used vim for coding on a regular basis, the biggest difference is how much more mature and robust the plugin ecosystem has become. While I still supplement vim’s incredibly powerful and efficient text manipulating capability with tools to search, check and build code, and more, a lot of very passionate folks have written some very nice glue code to allow those tools to be invoked from within vim and to interact with the way the editor thinks about text and files.

I have been sharing some updates as I noodle around with my vim setup, focused on the editor’s main configuration file, .vimrc. In response to one of those posts, a friend, Jed, asked if I’d write a post about the plugins I use. I am more than happy to comply. If you are curious, you can read along at the github repo I created to house and share my hacking on my .vimrc. I have tried to comment the various files I use to explain my thought process and pay forward the many, many tricks I have found in recent days to make working with vim that much more pleasant than it already is.

First and foremost, there are at least a few ways now to automatically install and manage plugins, a capability that was sorely lacking even a few years back. Of those tools, I use vundle. I like it because it only requires a one-time manual install and then self-manages updates, even of itself, from then on. It provides a really nice set of commands within vim, with in-editor help explaining them, to search for, install and clean up unneeded plugins. My main .vimrc simply sources a supplemental file, vundle.vim, that both interacts with vundle and itemizes my current raft of plugins.
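Getting started with vundle boils down to a short config file. What follows is a rough sketch of what such a supplemental vundle.vim can look like; the plugin list is illustrative and the exact bootstrap lines can differ by vundle version, so treat it as a starting point rather than my literal file:

```vim
set nocompatible              " vundle requires vi compatibility off
filetype off                  " turned off while declaring plugins

" add vundle itself to the runtime path and initialize
set rtp+=~/.vim/bundle/Vundle.vim
call vundle#begin()

" let vundle manage vundle, per its docs
Plugin 'gmarik/Vundle.vim'

" a couple of illustrative entries; :PluginInstall fetches them
Plugin 'bling/vim-airline'
Plugin 'tpope/vim-fugitive'

call vundle#end()
filetype plugin indent on     " restore filetype handling afterwards
```

After editing the list, running :PluginInstall inside vim fetches anything new and :PluginClean sweeps away whatever is no longer listed.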

I have a bunch of language specific plugins; I’ll forgo discussing those for now. The other, more general plugins are both more likely to be of interest to any random reader and contribute the most value, and joy, to my use of vim on a daily basis.

The first plugin I’ll recommend is airline. I have long had a custom status line in vim, one I configure to be always present. It showed a few useful things but projects like airline and powerline improve on this massively and make it possible to have a very slick looking setup. I use the patched powerline fonts, explained and linked to in the airline docs, to gild the lily that is the info-rich display of this plugin. The readme at airline’s github project is its own best sales pitch; I encourage you to give it a read. I like it even more for being a vim-native rewrite of the original powerline plugin. It demonstrates just how incredibly powerful vim’s own scripting environment has become.
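For anyone wanting to try the same look, the relevant knobs in a .vimrc are small. A minimal sketch, assuming you have installed one of the patched powerline fonts and pointed your terminal at it:

```vim
" always show the status line, even with only one window open
set laststatus=2

" use the patched powerline font glyphs for the fancy separators
let g:airline_powerline_fonts = 1
```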

Another backbone of my toolset is Exuberant Ctags. This is a unix program that digests source code for any number of languages and builds a database of tags. That database allows a programmer to quickly jump around a code base, from reference to declaration. vim has long supported integration with ctags so that if you have an appropriate database built and accessible, jumping to a function declaration is a hot key away. If my aging memory is right, this support predates most if not all of the current popular IDEs. I use the plugin easytags, which allows vim to continuously refresh my ctags databases as I work. I pair that with the plugin Tagbar, which allows me to open a split frame that uses ctags to present a code-structure-aware outline view of any source file on which I am working. Further, Tagbar integrates with airline, enriching my status line with more contextual info about where I am within any particular segment of code.
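The glue between these pieces is mostly a few lines of configuration. A sketch along these lines, assuming easytags and Tagbar are installed; the F8 mapping is just my arbitrary choice:

```vim
" look for a tags file next to the current file, then walk up the tree
set tags=./tags;,tags

" have easytags write to project-local tag files instead of one global one
let g:easytags_dynamic_files = 1

" toggle the ctags-backed outline split
nnoremap <F8> :TagbarToggle<CR>
```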

I use two plugins from github user scrooloose, NERDTree and Syntastic. The first brings in a file explorer that is way more powerful and easy to use than the already pretty nice explorer built into vim. The second integrates any number of standalone programming language syntax checkers. That integration allows me to jump around my sources, seeing where I have slightly junky code or an actual break I definitely need to fix.
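Both take very little configuration to become useful. A hedged sketch, with the F7 mapping again being a personal, arbitrary choice:

```vim
" toggle the NERDTree file explorer split
nnoremap <F7> :NERDTreeToggle<CR>

" have Syntastic fill the location list so :lnext and :lprev
" jump between flagged lines after a check
let g:syntastic_always_populate_loc_list = 1

" run checkers when a file is first opened, not just on save
let g:syntastic_check_on_open = 1
```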

I just added a tool called fugitive. Mostly I pulled it in to see my current working git branch in airline. It supports invoking git, the whole hog of distributed version control tools, from within vim, but I have barely scratched the surface of that. I am still creeping towards the deep end of git, at present using interactive adds more and more to selectively commit not just whole files I’ve changed, but collections of related fragments of files, separating them from other changes in those same files I want to save for a subsequent commit, for a separate reason. Being able to use vim to select those chunks to build up smarter commit sets is intensely appealing while still slightly daunting.
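Even barely scratching the surface, a few fugitive commands carry most of the weight for me. Command names here reflect the version current as I write and may change:

```vim
:Gstatus  " interactive status buffer; pressing - on a file stages or unstages it
:Gdiff    " vertical split diffing the working copy against the index
:Gcommit  " compose the commit message in a regular vim buffer
```

The status buffer in particular maps nicely onto the interactive add workflow I described, letting me build up a commit piece by piece without leaving the editor.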

Up to this point, if you are a coder, you may be thinking that all these plugins do is approach but not quite compete with a modern IDE. I wouldn’t necessarily disagree; as I said, these plugins just make existing tools easier to keep at the periphery of where I exert the vast majority of my focus every day. I have such deeply ingrained muscle memory for vim that I can create and change text far more effortlessly with it than with even the best editor offered by the best IDE. I have tried “vim” bindings in some IDEs and even an abomination that yokes vim directly to Eclipse, but all they have done is remind me of what I love about vim.

There is one last plugin that I feel is the most compelling response to the refactoring support available in packages like Eclipse and IntelliJ. For geeks of a certain age, tools like grep and sed are equal parts comfortable and frustrating. They are so quick at sifting through masses of files yet so demanding in the exactitude of their use. A few years back, a colleague introduced me to a kinder, gentler alternative specifically for coders, ack. ack combines the expressive power of grep with a better understanding of the peculiarities of source code and programming language syntax.

ack.vim brings that useful and usable code searching into vim. In this case, search is only half the picture. vim supports macros, not in the sense of scripting or snippet expansion, but the ability to record and play back keystrokes. I feel this is vim’s best, most underappreciated feature. Imagine searching for some repetitive fixture in your code you feel must change, something thorny enough to require ack to confidently spot all instances of it. Now, couple that, through this plugin, with the ability to repeatedly and correctly apply that transform, not just something superficial, but a complex interaction, to every single occurrence you unearth. Using vim, ack and ack.vim may not be as simple as picking from a few menu options but it also isn’t anywhere near as constrained. I’ve used macros to work over related sections of text, multiple lines together, not just a line or two around a simple declaration.
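To make the pairing concrete, here is a sketch of the loop I am describing; old_name, the particular edit and the choice of register a are all placeholders:

```vim
" 1. record the transform once into register a:
"    qa ... make the edit on one occurrence ... q

" 2. find every occurrence across the project; ack.vim fills the quickfix list
:Ack old_name

" 3. walk the quickfix list, replaying the recording at each hit
:cfirst
normal @a
:cnext
```

Because the recording can be any sequence of keystrokes, the "edit" can span multiple lines and multiple motions, which is exactly where this outclasses a plain search and replace.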

I’ve only been applying this latest deliberateness to my use of vim for a few weeks. My hope and fear is that there is so much more to explore. Hope in that I already feel I have finally found a combination and integration of tools capable of giving any IDE a run for its money. Fear in that these waters run very deep indeed and my ability to plumb them is only so great.

Reading Walden

Chris Miller turned me on to The Jefferson Hour. A few episodes back, they got to talking about Walden. What struck me was the discussion of one of Thoreau’s main points, apparently, about leading a deliberate life. This makes sense, he is most often quoted for his thought on the mass of men leading lives of quiet desperation. I am only a chapter or so in but have already encountered the quote. The context is his pondering of how so many of us go through life driven by perceived obligation–that we have a certain job, possess certain things, make certain choices.

I have yet to reach any of Thoreau’s thoughts on how to avoid the pitfall he elaborates so well. As I said, I’ve only just started reading. I am squeezing in pages amongst my other reading. My night time routine of decades is to read a bit of fiction before sleep. I find myself anxious if I skip this and less able to readily sleep. I am still grappling with more than one learning curve, at work. I have books I am trying to keep pushing along on JavaScript, Node and Scala.

I will say I have a bit of a worry. Thoreau was only just thirty when he wrote the book. Tonally, it shows, especially in his extended rant about how those older than him didn’t necessarily have anything to teach him. I get his point, to a degree, that time on this earth alone isn’t an adequate predictor of wisdom. Even where someone has acquired hard won insight through experience, those experiences may not be anywhere near universally applicable. I guess as someone tarred by the brush with which he paints, and a parent to boot, I could have wished for a bit more humility, nuance or both at that point in the opening of the book. My complaint isn’t strong enough for me to set the book down but I do wonder if I should have read it well before now.

I hope to find some sort of a pace that makes regular thoughts on the book at least somewhat coherent, here. I have been thinking more lately about the non-technical parts of my professional life. I mentioned this more on my other site but I have returned to Franklin’s 13 virtues, another idea introduced to me by Chris Miller. Walden’s notion of a deliberate life seems highly compatible with Franklin’s idea of both a highly contemplated daily life and a set of ideals around which regular reflection should be focused.

As a geek of a certain age, I am starting to feel that my ability to learn new technology and apply it may not be as valuable as these other, less tangible aspects of my professional life. At my current gig, which I started at the end of last year, I am well north of the average age of my peers. They hired me despite a deficit of recent hands-on coding experience, at a pretty senior level, too. Looking to my role models, living and historical, long admired and newly found, seems like a good lens through which to start getting my thoughts and values clearer in my own mind so that I can apply them, well, more deliberately.

Giving Balticon a Miss

I remember vividly when I got started in podcasting. I managed to finagle the admission cost and travel expenses out of my then employer to attend Apple’s big developer conference, WWDC, the very year they announced podcast support in iTunes. On the first day, I visited the local Apple store to pick up an audio interface for my laptop, hoping to record my very first audio at that conference. Sadly, thanks to the ignorance of an Apple “genius,” my first recording happened a week or so after I got back.

Live events have been a large part of my experience of podcasting. That has tapered off in recent years but early on I was invited to a lot of interesting conferences, conventions and events due to being a podcaster. One of the earliest was Balticon. I had befriended a few local area podcasters through a now long defunct meetup. One of them got me invited as a participant some eight or so years ago during that early, heady rush. Even as I started scaling back my speaking engagements a while back, I kept going to Balticon. It is local and so was easier on the pocketbook, not to mention including so many of my friends.

The problem is there was a valid reason I scaled back in general. Oddly, the events I got asked to were rarely tech focused. Like Balticon, many of them were science fiction conventions. I happen to like science fiction. I have fond memories of the earliest cons I attended back in college. When trying to find my voice, though, as someone talking about technology, public policy, and society, an SF/F convention is an odd place to find myself. That tension has only grown over the years.

Each of the last three or four years, around this time, I have debated with myself if it made sense to go back to Balticon. Up until this year, I ultimately decided I had the patience, energy and enthusiasm to make the best of it. This year, I don’t think I do any more. Professionally and personally, the past twelve months or so have been trying, to say the least. Many of the reasons I looked forward to Balticon on a personal level have evaporated, or even worse, become reasons not to go.

I honestly don’t know if I’ll go again in the future. It will depend on a lot of things, things I couldn’t even predict right now. If you are going and would like to get together, up to a point, I could probably manage that, since the convention is still local for me. Just message me privately.

TCLP 2015-02-01 If It Sounds Right and My Site Has a POSSE

This is an episode of The Command Line Podcast.

In this episode, I share another essay, this time also from the web site, on the interplay of critical judgement and skill.

Following up on a promise to myself towards the end of last year, I extemporize for the latter half of the show. I hope that by allowing myself to just speak with little or no preparation on a subject, I will make it easier to get more shows out in a given year. I decided the work I’ve done exploring this concept of POSSE, or Post Own Site and Syndicate Everywhere, would be interesting to discuss, in particular responding to some mild criticism and clarifying why I find what I have set up so useful.

I wrote two posts covering more technical details of what I am using, here and here. You can view my lifestream category on this site, or the same category on my new site, to see how this is all working for me. I have to once again give Dave Slusher credit for getting me thinking about this idea and for sharing how he has implemented it. Once again, you can subscribe to the links I curate throughout the week from my own aggregator.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License

If It Sounds Right

When I first started learning guitar, before I had a teacher and was just using books and videos to learn on my own, I worried whether I was learning good technique. I admit to obsessing a little bit about the shape of my grip, the arch of my fingers, whether I was fretting cleanly enough or not. I suppose I was carrying over assumptions I had from other skills into the pursuit of the new one. When writing software, there are definitely more and less correct ways of putting a program together. A lot is bound up in language and tool choice, often in terms of certain idioms that affect both the correctness and the readability of source code.

A couple of months along, I decided I would benefit from a teacher. I didn’t really reflect on this choice but thinking back, I suppose it was motivated in equal parts by the desire to want to progress a bit faster than I was on my own and by that worrying over proper technique. One piece of advice I kept encountering was to make sure that whatever I was practicing, I should make sure I was doing so correctly. After a little searching, I found a local teacher with stellar online reviews.

My guitar teacher is great. He seems to have an intuitive grasp of when I need to refine something in my practice or am ready to stretch out and tackle something new. At first he took a little while to understand my tastes and interests. Some of that was no doubt due to how much those tastes have been changing and growing in the last year or so. Lately he has been supplementing more structured skill work with learning songs I enjoy, showing both the applications of the techniques I am learning and rewarding my efforts with playing real songs as opposed to simpler studies.

I expected from the start that my teacher would pay close attention to my technique and correct minute imperfections. This is how I learned Tai Chi, with an eye towards correctness, one that progressively looked deeper and deeper into form and movement. Brewing has involved close attention to key variables in the process, ensuring good control as well as exercising good taste. Why would guitar be any different?

A few months into my guitar lessons, we finally discussed my expectations, briefly. I had already started to realize on my own that there was something different in approaching guitar, so really we just confirmed a certain truism that is common among musicians: if it sounds right, it is right. My teacher confirmed the notion and we got back to the business of producing right sounding notes.

For most guitarists, certainly at my level of just gaining some basic fluency, the result is usually far more important than the way it was produced. Guitar reinforces that because it is such a flexible instrument. The same note can be sounded on several strings. Chords can be constructed and varied in an infinite number of ways. Realizing this, I understand that training the ear and using judgement take priority over any particular technique.

How can we draw a parallel with programming? Certainly when creating software for regular people, there is a sense of what seems right trumping hard and fast engineering precepts. If someone sitting down at an app cannot figure out how to make best use of it, like a muddled or disharmonious tune, then it has failed. I suppose even the compiler could be thought of as enforcing a similar, if more rigorous, critical evaluation. Witness one-line programming and the underhanded and obfuscated code contests that play with the plasticity arising from language rules and semantics while still producing something that can by some definition be deemed “right.”

What role, then, do the volumes of text on correct theory and practice, for coding and guitar both, serve if ultimately the prime criterion admits so much variation?

If you are working with others, either to make some software or some music together, then a better understanding of your craft is invaluable. If you want to advance beyond a certain level of competence and tackle the greatest challenges, and hence feel the keenest depths of accomplishment, you’ll likewise need to cultivate an expertise beyond just a reasonably critical ear or eye.

Technical skill is a large part of the vernacular of a pursuit. To talk about challenges in programming, it helps to have clear and accurate terms and phrases. I have certainly struggled with some subtle aspect of a bit of code, for instance figuring out the right data structure, or kind of code to hold some information, that makes working with data simple, clear and effective in terms of code complexity and efficiency. When I don’t stop and really listen closely to someone trying to help me, I end up spending more time, struggling harder, than when I recall my fundamentals of computer science and theory. In either case, I usually get there, improving or fixing my program, but when I leverage the full depth of my technical mastery, I get there sooner and with much less stress.

I am only just learning music theory but see a strong parallel. Most times when I am learning a piece of music with my teacher, I have to trust the music as written and his recommendations of where I can vary from it. As I have been reading about scales, intervals and the basics of chord construction, I am now able to think back and understand my teacher’s guidance. I can easily foresee a day where my grasp of theory lets me interpret and improvise much more easily than I can now. Knowing the theory increases my appreciation even if I have to rely on trust more than a skill, or lack thereof, to reverse engineer a piece or part of a piece and really, rationally understand why it works.

The sound of a song, or the user apparent function of a program, is still important, but truly understanding the hows of producing a result both deepens the ultimate appreciation and allows for even more nuanced play in both cases. The interplay between correctness and playing by ear yields a far more interesting result than hidebound adherence to trade craft. Strict formalism is sterile. Grit yields joy. In a song, that’s a catchy lick or memorable chorus. In code, it is a program that delights, that allows you to do something new or something more easily, while revealing to the reader of the source code a fun-loving and inspired mind.

Refining My POSSE Setup

Dave Slusher responded yesterday to one of my posts tracking my experiments with implementing a POSSE strategy on my web site. He clarified a couple of things. POSSE actually stands for Post Own Site, Syndicate Everywhere. POSSE as a concept is being advanced by the Indie Web folks. I have been following the Indie Web constellation of ideas, as well. They claim that strategies like POSSE are in some ways an improvement over federation, a concept which, as much as I like it, has gained little traction. Federation simply means you post wherever you want and your info flows to federated services seamlessly. Email is probably the most popular example of a federated system.

With federation, there still may be barriers that interfere with simple sharing of information. The examples the Indie Web folks give run along the lines of: none of the existing services directly support federation, so you have to adopt something new and try to cajole all your existing online connections to do the same. As the short history of the idea has demonstrated, that’s just too much inertia to overcome for all but the early adopter set.

By contrast, implementing a POSSE approach puts your content in front of your friends and followers wherever they happen to be. It concentrates on the ownership of the origins of that content (the Post Own Site part) so that if there is some issue with a particular aggregator on the other end, it limits the impact on your ability to generate and propagate the information and messages you want. Perhaps a fine nuance but one I have been pondering since Dave’s correction.

I am now onto the second day of an ad hoc weekend project to make my existing site (and one new one) drive a POSSE strategy by simply adding to tools I had already been using, primarily WordPress. I think I have things dialed in a bit better and have found some simple solutions to the few workflows I laid out in my last post.

For the link sharing workflow, I realized I was overlooking that I could extend my RSS aggregator, Tiny Tiny RSS. Doing so makes more sense to me since my aggregator is the canonical store of my curation efforts, not my web site. Tiny Tiny RSS supports plugins, as so many tools do, and I found plugins to share directly from it to Twitter, Google+ and Facebook. You can follow my curated links directly out of my aggregator or continue to follow along on Twitter. Up until now, I had only really been sharing links there but I think I will experiment with pushing those links to Facebook and Google+. I will keep an eye on the reaction of my connections on those services and adjust as seems appropriate. Regardless, this workflow should work much better using more of the right tool for the job.

For the workflow of sharing blog posts, I returned pretty much to the typical configuration of SNAP. I restored the templates it suggests for each target service which is pretty much an excerpt and a link. I set these up to auto post anything within the categories I use for blogging, on both my sites.

For the lifestream workflow (thanks Chris Miller for reminding me of this term) I re-thought my approach. I like the idea of posts to a specific category being handled a little differently but tweaked that to try to make it a little more intuitive. I still withhold life stream posts from my main page and my feed, as I figure these introduce a bit more noise so should require a little more intent and action beyond just visiting my site. If you navigate into my archives or view the life stream category itself, you can now see these intermingled with the rest of my posts and all on their own, respectively.

That left cracking the length constrained vs. free form content problem. I didn’t want to be editing sharing templates post by post, that is a retrograde action, introducing more work rather than simplifying. I had the idea of adding a second config for each of the social services I use. Each new setup would have a template more appropriate for direct posting, that is just the contents of the post on my site without any link back or other extra information. SNAP’s free edition doesn’t allow adding the same social service more than once, however they are running a sale on the pro edition right now. For just under fifty bucks a year you get both the pro upgrade and support for Google+. I went ahead and bought the upgrade, figuring it was cheap enough over the course of a year and if it worked, would be well worth the monetary cost.

So in my SNAP settings as an example I have two entries for Twitter. The first uses the blog post template, the second a template that just shares the text of a post on my site. The first autoposts anything I publish to the usual categories for my blog. The second only autoposts when I publish to one of three life stream categories.

The last piece of the puzzle, at least for now, is that I have gone from one life stream category for everything to three. I kept the existing category so that I can write a short post and simply have it go everywhere. I added one for long form content and added it in SNAP only to those services that don’t have length constraints, like Facebook and Google+. The third category is for short form posts for, you guessed it, Twitter which has its character cap. If I have a longer message I want to share everywhere, I can write it once, adding it to the long form category. Then I can chunk it up and post it as a series of length appropriate entries to the short form category. Tying this all to autoposting to remote services and sites means that if I only have my phone, everything will still work simply by using the right category in the WordPress mobile client. This is critical since, up until now, a lot of what I post to the various social services has come from my phone.

The view of my life stream category on my own sites may at times appear a little repetitive. That original category contains the other two, but it is closer to what I think is ideal. Hopefully that makes sense. Or if it doesn’t, trust that it makes sense to me. Or just visit my web site and explore for a bit. At least I think this will all work for right now. It is good enough for a weekend of noodling around, anyway. I will continue to tinker, sharing updates as I come up with improvements or interesting possibilities.

Initial Impressions of a POSSE Setup

I’ve only been playing with SNAP, a WordPress plugin, a few hours, but have some initial thoughts.

NAND Cat Has a Posse, used under CC-BY thanks to Flickr user Paul Downey

I started down this path thanks to Dave Slusher who wrote about POSSE which stands for Post to Own Site, Share Everywhere. I like this concept a great deal. Investing the typical time more and more folks do in communications and information only to have that effort evaporate at the whim or circumstance of the platform, tool or channel of the moment seems very foolish to me. I had been experimenting with Bridgy but still manually sharing all of my posts and posting shorter thoughts directly to all these sharing outlets like Twitter and Facebook.

I stood up a second site for another set of interests of mine. Doubling my existing workflow was not appealing in the slightest. I decided to take the plunge with the WordPress plugin, SNAP, that Dave mentions. For the most part, I really like it. With a little fine tuning, content showing up on sharing networks looks native but cleanly and clearly originates from my site. I will warn that this is a power tool, which affects the effort to set up and how smoothly it supports more than one workflow.

In terms of the installation, just getting it into WordPress is easy enough. Connecting it to other services may require a good deal more effort. I have only connected Twitter and Facebook so far as Google+ requires some additional bits. For both Twitter and Facebook, I had to sign up as a developer and essentially create new applications with each of those for each of my sites. The SNAP documentation for this is superb but this may be outside a lot of people’s comfort zone.

For the primary workflow of simply sharing regular blog posts, SNAP is great. You can configure templates for the messages to each service with some pretty clear replacement parameters (although finding the list from within the plugin is pretty much impossible; I bookmarked their documentation page). Unlike other tools I had tried previously, messages can be tailored so they look native to the sites on which they appear. This is a huge plus as the current crop of popular sharing sites increasingly penalize anything coming in that doesn’t smoothly fit into their design, flow and expectations. Mismatched updates often get down ranked, defeating the point of sharing everywhere.

I have two other workflows I am still working on. The first is simply using my site, as Dave discusses, as the source for my usual social updates. I have a couple of plugins I use, as he suggests, to have a hidden category for purely social content on my site. Posting to that category then only shows on the intended target and, if someone follows a link back I provide, on my site. Unfortunately, the differences in content size limits make this a bit clunky. WordPress supports one or two ways to break up content and SNAP can take advantage of those. But if I want a long update chunked into three or more shorter updates on the services that need that, there is no good support for it. I am contemplating going back to what I was doing, but doing it via my own site–writing one long post then simply copying that out into several shorter messages, massaged to work better on services with tighter limits.

If SNAP would add a character count to WordPress, for starters, then introduce a way to add markers that are invisible on my site but that SNAP uses to break up posts into smaller pieces as needed, that would be splendid. I haven’t looked to see if their Pro version does this or if it is on their roadmap. I also haven’t looked to see if they have a robust way to suggest features. I have, after all, been using it less than a day. I will give them the benefit of the doubt, continuing to investigate and push even if I have to fall back on some manual labor in the meantime.

The other workflow I use is Twitter specific, for sharing links out of my RSS aggregator. Usually I simply compose a tweet with the title of an article, its link, a via credit if warranted and then, in any space left, a comment. Predicting how SNAP will mangle such a post if composed in WordPress is proving difficult as it isn’t leveraging Twitter’s own URL shortener or offering its own. Again, the lack of character counting on posts is frustrating; I feel like using WordPress/SNAP for this is a bit like aiming blind. I am less concerned about this workflow since my aggregator is my canonical source for link curation and has its own way to share a feed of what I have shared, with my comments attached.

I have only just started using SNAP’s own comment import. Dave recommended Bridgy over this feature but that service doesn’t appear to support more than one site per sharing service, a use case I now inhabit. Also, it has always bugged me that it was a service rather than a tool I could host and run myself. I did like that Bridgy used an emerging open standard, Webmention, so I may look for a third option that has the best of both. I’ll share more thoughts as I have more experience with the import feature.


More or More of the Same?

I ended the web site and podcast where I had been discussing some of my other interests, in particular around beer, brewing and cocktail culture. I don’t want to discuss the reasons why. While that project is definitely over, my interests and pursuits related to it continue. Once or twice I have brought those interests here, with an OK reception. I’ve been thinking about how best to continue to share and discuss them. I am of two minds.

This site and podcast, largely by happenstance, has had a pretty consistent set of themes on which it focused. If I were to expand that drastically, I worry it might be jarring or off-putting to readers and listeners who follow along for just those discussions. I do have an idea for another site, a successor to the now defunct project, where I might like to house another constellation of interests, especially as I am really ramping up my pursuit of home brewing.

On the other hand, I have always viewed this site and podcast as a reflection of myself. As focused as the writing and discussion has been, that has been an accident of my interests, not an intentional focus on just the few things I touch on over and over. Home brewing, beer and libations, as well as some more recent interests to which I’ve alluded, are just as valid for consideration. One of the advantages of a well categorized and/or tagged endeavor like this one is listeners and readers can easily skip the things in which they are less interested and stick around for the rest.

Whatever I decide, I’ll share another post so folks are clear on where new content will be appearing so they can follow along if they like. In the meantime, I am genuinely curious to know what all of you think.

Should I explore all of my interests here in a single site and have faith that folks can and will self select and filter down to what they like? Or should I spin up a fresh site where folks can subscribe additionally if they want to read and hear about my ongoing efforts in the space of home brewing, beer and other things related?

Getting Back Up to Speed

I had three hours to write a JavaScript single page application that implemented a very simple Reddit client in the browser. I was assured I would likely not need more than one hour. At the end of the first hour, the interviewer checked back in with me to see how I was progressing. I had been struggling just to get the basic scaffolding of the application put together. I had used a lot of the hour looking up libraries and tools as well as refreshing myself on JavaScript syntax.

I managed to complete the assignment in a little over two hours. I felt not only frustrated but embarrassed at how long it took me as compared to when I was in my top programming form. The interviewer assured me it was fine but I knew it was far from my best possible effort. As it turns out, the interviewer was right. For that particular prospect in my recent job search, I progressed to the next round of interviews.

I wrote an Inner Chapter years ago about brushing up programming skills after a period of not using them. I wrote that chapter for a listener who had gone through some hardship and was looking for help adjusting. Partly it was coping with changes in physical ability specific to that listener, partly it was more general advice on sharpening skills. I’ve been thinking a lot again on the subject of getting back up to speed at programming, having just left a full-time management job to go back to a full-time programming one.

Even in the first year at the job I just left, when I was still in more of an engineering role, I didn’t feel like I got a lot of practice at programming. They had some technology projects, but, the organization being a non-profit, the pace of and pressures on them were very different from what I had experienced up to that point. Even after I was promoted, a lot of my skills as a technical manager didn’t apply very well. The kinds of stakeholders we had didn’t really know much about the standard software life-cycle. I ended up spending far more time managing personnel, managing budgets, and fundraising than implementing and supporting processes for creating software in a regular and consistent way. It was a rare day when I got to work with technology directly.

As I’ve written about recently, a large part of my decision to leave that job was realizing how unhappy the lack of opportunity to code was making me. To make matters more stressful, during my job search I had to explain over and over again why I was leaving an upper management role and looking for work as a programmer.

“You are currently director of technology?” “Yes.” “You know this is not a management job, right?” “Yes.” “You’ve been a manager before. You truly understand this is just a programming job?”

I got very practiced at telling the story of my multiple runs at management. I would tell them that I never started as a manager. Working at a small organization, it would become clear, usually due to growth, that more leadership was needed. Foolishly, perhaps, I would volunteer, at least until they found someone better at managing. They never did. Eventually the strain and the desire to get back to programming would become overwhelming. Taking a demotion was never a practical option, so three times now I have found myself at this same juncture.

I have only been on my new job three weeks but I’ve been thinking about a few experiences that could be helpful to someone else looking to get up to speed. I already shared the first one but I had several other experiences just as part of the job search process that I think are relevant. Just going through several successive coding assignments, especially in a very reflective state of mind, revealed insights I continue to ponder.

In that very first coding assignment, I mostly felt frustration. I had to look everything up, even the basic tools and libraries I was using. I had lost the equivalent of muscle tone. I couldn’t reflexively spin up the bare minimum setup and configuration to get some arbitrary bit of new code working. I was struggling with my tools rather than feeling that they were smoothing and accelerating my efforts. The silver lining to that frustration was the immense sense of accomplishment in overcoming it. As long as I remember this frustration, I am less likely to take my tools for granted. When I have felt that frustration again at the new job, even with more time and resources to get something done, I am able to see it in context, as a very normal part of the experience, trusting that if I persevere, I will be rewarded.

In a good environment, getting tools and configurations right will be a shared burden. My new gig is very much like this, though the work to keep the tooling up to snuff is a little bit of a guerrilla priority. Engineers have to squeeze time in around things other people find more pressing. A lower priority on process and tools compared to more obvious business value is actually common in my experience. Regardless, as a consequence of the frustrations I felt during the coding assignments throughout my search, I was grateful. My new colleagues apologized for the roughness of the code and documentation; I was just glad not to have to do all the necessary grunt work on my own. I was motivated to contribute what I could out of that appreciation. I felt that contributing to improving the basic setup in this way paid forward the kindness done to me, lessening the frustration of the next new hire. Jumping in was also good practice to help with my overall brushing up and gave me an opportunity to understand the setup a bit better, always a good thing when the need to troubleshoot it arises.

I didn’t even have to get to the new job to realize some improvements in my disused coding skills. In the subsequent coding assignments, I was able to make better progress more quickly. I had to do a second assignment for the same prospect as that very first, super frustrating one. On that second attempt I was better able to focus on the challenge at hand and the quality of my code. I found that trend continued through a few other coding projects I did as part of my search. By the end, I was spending as much time on the code comments and the commit history as on the code itself, to really show off my thought process and how well I understood the problem I had been presented.

The last coding assignment I did, I completed in less than an hour. I got to show off a little, too. The problem definition included some hints that could easily be missed. It asked me to include some comments on the code’s performance, and I was given the maximum expected input size. Once I had the code provably working, as demonstrated by some unit tests I turned in with the assignment, I thought about that maximum input size. I didn’t think it was included for no reason, so I added a unit test that ran a random input sample matching the stated max. Not surprisingly, my code slowed to a crawl.

I was able to recall not just my core programming skills but also some troubleshooting. I didn’t fire up a full-on interactive debugger but did add some logging. I worked from very broad to increasingly narrow sections of code until I had isolated the problem. I then made some changes that, as my tests revealed, brought the performance up to an acceptable level. I even left some code comments suggesting how the performance could be improved further.
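I can’t share the assignment itself, but the general shape of that kind of test is easy to sketch. Here is a minimal Python illustration, with a hypothetical dedupe function standing in for the actual problem and a made-up maximum input size; the point is the pattern of exercising code at the stated max and timing it, not the specific function.

```python
import logging
import random
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger(__name__)

# Hypothetical stand-in for the assignment's real problem: deduplicate a
# list while preserving order. The naive list-scan version is quadratic.
def dedupe_naive(items):
    seen, out = [], []
    for item in items:
        if item not in seen:  # linear scan of `seen` -> O(n^2) overall
            seen.append(item)
            out.append(item)
    return out

def dedupe_fast(items):
    seen, out = set(), []
    for item in items:
        if item not in seen:  # set membership is O(1) on average
            seen.add(item)
            out.append(item)
    return out

MAX_INPUT_SIZE = 100_000  # hypothetical "maximum expected input size"

def time_at_max_size(func):
    """Run func against a random sample at the stated maximum input size."""
    sample = [random.randrange(10_000) for _ in range(MAX_INPUT_SIZE)]
    start = time.perf_counter()
    func(sample)
    elapsed = time.perf_counter() - start
    log.info("%s took %.3fs on %d items", func.__name__, elapsed, len(sample))
    return elapsed

# Both versions agree on correctness for a small case...
assert dedupe_naive([3, 1, 3, 2, 1]) == dedupe_fast([3, 1, 3, 2, 1]) == [3, 1, 2]
# ...but only the fast one stays comfortable at the maximum input size.
time_at_max_size(dedupe_fast)
```

The logging call is the troubleshooting part in miniature: start with one coarse timing like this around the whole run, then move the timer inward over narrower and narrower sections until the slow spot stands out.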

I think that just the opportunity to practice with some clear goals can do a lot to help knock off the rust. You may not always have coding assignments like in a job search but there are online resources that can serve a similar purpose. As part of my orientation packet at my new job, I was turned on to Code School. The topics are limited, focusing on web development with just a few technologies, but I like the approach. The courses combine short instructional videos and interactive exercises.

Whatever resources you find to use, I think the practical aspect is important. I am still relying on books a lot, even if I am loading them on my tablet rather than accumulating heavy stacks of paper as I did far earlier in my career. The best books, just like the online courses, include exercises that give the reader a chance to work with the material covered. Application is key to cementing a new skill or reviving a disused one.

I suppose there is a role for trivia, accumulating simple facts about programming or tools, but it is less helpful when learning for the first time or getting back into the swing of things. When working through an exercise, I will often have one of two very gratifying experiences: I will either laugh out loud or nod and smile as some bit of skill or knowledge comes flooding back to me. I laugh when feeling the visceral joy of either getting something to work or, even more powerfully, of feeling some concept really snap into place in my mind.

The next challenge for me is getting back into good daily habits for coding. I think I will save my thoughts on that subject for another post. Hopefully, my early experiences getting back up to speed make sense and are useful to at least some of you.

TCLP 2014-12-21 A Sense of Scale and Getting Back Up to Speed

Used under CC license, by Wikipedia user Sunil060902.

This is an episode of The Command Line Podcast.

In this episode, I share a couple more essays. Before the essays, I help spread the request for support from the GnuPG project. You can donate here. The first essay was already published on the web site, “A Sense of Scale.” The second essay is new, “Getting Back Up to Speed,” and I will publish it on the site in the next few days.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License