Hiccup with My Mixer under Linux

A problem being mysteriously fixed through no clear action of my own bugs me. A problem this weekend with my mixer is just such a case. After upgrading to the latest version of my operating system, Kubuntu, the flavor of Linux I prefer, I could not get the software driver I had been using with my mixer working. I could get close, but not to the point where my audio workstation would see the mixer. Of course I discovered this right when I sat down to record: last night, last thing, when there wasn’t time or will left in the weekend for extensive research and troubleshooting. When else would I discover a problem resulting from upgrading my OS? Certainly not when it would be more convenient to investigate and fix.

The break bugged me so much, especially losing an opportunity to record when I had the will to do so, that I spent some time this morning before work seeing if I could get things working again. I installed the latest version of my audio workstation, Ardour 4, because I had been meaning to anyway. Through some experimentation, I stumbled upon the fact that letting Ardour drive my mixer directly, rather than going through an external software controller, worked. It only worked using an option I didn’t think had a chance in hell: the basic audio stack that comes with Linux.

I was simultaneously relieved to have my mixer back and vexed as to why something that previously did not work suddenly did. I dislike mysterious fixes almost as much as random breakages. If I don’t understand why something suddenly starts working, I feel helpless to deal with any subsequent breakage. That is usually a recipe for cascading frustration.

I did a bit more searching, just now, and I think I found the explanation.

Very current versions of the Linux kernel will support most of the same devices that FFADO can support, directly via the normal ALSA audio device driver layer.

That is from Ardour’s own documentation, specifically on its requirements. FFADO is the separate driver I had been using with my mixer, since it is a Firewire mixer; Firewire is an increasingly uncommon way to connect peripherals to a computer but one still very well suited to the demands of audio. ALSA is basically the core sound handling component of Linux. As a consequence of my operating system upgrade, I do indeed have a very new kernel.

Mystery solved. Of course, I have only done a few seconds’ worth of testing. In the past, ALSA hasn’t done well with the demands of high quality audio, such as working with a prosumer mixer. It is entirely likely there are frustrations still ahead, but at least they won’t be entirely mysterious. And I will hopefully get some new recording in soon, regardless.

JSConf US 2015 and Future Conferences

In a couple of weeks, I will be heading to JSConf US 2015. I was going to talk about this, and will still talk about it more, in my next podcast. Given the quickly dwindling time before the conference, though, I wanted to share a quick heads up here. Any reader or listener who will also be there, please feel free to shoot me a quick note if you’d like to meet.

I have one other possible work related conference, AWS re:Invent, that I have tentatively agreed to attend later this year, in October. Again, if you are going to be there and want to hang out, let me know.

I am enjoying being at a gig that is well resourced. While I got to travel at times for my last job, it was feast or famine: either I was traveling a lot, too much really, or not at all. Now I have some discretion. If you know of any conferences you think I might be interested in, let me know about those too. I can always ask, and within reason I think I can expect to go to some of the more interesting and relevant tech conferences for the foreseeable future.

So you know, and I’ll unpack these more in a future podcast and/or essay, the technologies I am currently using, learning or interested in include React/Reflux, Node, Scala, microservices, reactive/concurrent programming, Docker, continuous delivery, and automated testing. I can probably also make a case for less tech specific but still work relevant conferences, like ones that bear strongly on agile, since I am currently a scrum master at my day job. Anything else, however interesting it may be to me, I’ll have to pay for on my own dime.

TCLP 2015-04-18 Choosing Your Aspect

This is an episode of The Command Line Podcast.

I rushed this episode out for listeners who do not read the web site. I wanted to let those listeners know about the coming change in feeds, in case of any disruptions. I am hoping there are no problems, but if anyone does hit a feed hiccup, I want them to have had the heads up and to know I will help where I can.

I discuss some listener feedback, namely Pete’s recommendation of the Gundo plugin for vim and Tim’s move from nano to vim.

For the rest of the episode, I extemporize on lessons I have learned about the benefit of choosing between a hard or soft aspect, as someone who used to almost invariably approach situations with a firm set to his jaw.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License

Feed Change on 5/1/2015

I have been using Feedburner since before Google bought it, both to save bandwidth and for some nice high level statistics on subscribers. The service has gone pretty much without updates since the acquisition, to the point where it now breaks if the source feed it wraps is served over an encrypted connection, an increasingly common practice for web pages and resources. I’d like to move off of Feedburner before the next breakage, or before Google ultimately shuts it down.

I have updated all the feed links for this site and for the podcast. I will be deleting my Feedburner feeds at the end of the month. A feed can be deleted with a permanent redirect left in place, which means you should not have to do anything in order to continue to receive episodes and posts. However, technology frequently doesn’t work as we’d like. If you are an active subscriber, you may want to change over your feeds now to avoid any possible problem.

To make things easier, here are all my feed URLs, so you can just copy and paste as needed:

  • https://thecommandline.net/feed/ – The feed for this site, includes blog posts and podcast episodes both. I recommend this feed if you want everything.
  • https://thecommandline.net/cmdln – The feed for the podcast in MP3 format. If all you want is the podcast and you are not sure about what kinds of audio your device or software supports, I recommend this feed.
  • https://thecommandline.net/cmdln_free – The feed for the podcast in Ogg Vorbis format. This audio format is patent and royalty free; if you are a supporter of free software and open source, I recommend this feed. It should work well on any Android device and Linux software, but your mileage may vary with Google and Apple devices and software.

For iTunes subscribers, I have taken advantage of their support for updating feed locations so you shouldn’t have to do anything.

I will make this post sticky for the next few weeks in case you miss the change over and come to the web site looking for an explanation of what has changed or broken.

TCLP 2015-04-12 Hope and Fear in the World of vim

This is an episode of The Command Line Podcast.

In this episode, I share a quick review of the game Coup and its expansion, Coup: Reformation. The feature is a reading of my recent essay about my favorite text editor, vim.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License

TCLP 2015-03-29 My Drivable Computer

This is an episode of The Command Line Podcast.

This episode is an experiment. I recorded an extemporized monologue in my car during what is a fairly regular drive for me right now. I expressed my gratitude for, and inspiration from, Patrick McLean, in particular his return to The Seanachai, and Dave Slusher.

I talked about the new car I bought at the end of January, my reasons for buying it and my experience of genuinely feeling that I am driving a computer.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License

Hope and Fear in the World of vim

For some reason, I decided to step back from any integrated development environments at work and just use my preferred power editor, vim. If you are unfamiliar, an IDE is a very powerful tool, like an assistant constantly at your elbow to remind you how to code better, how to use libraries, and to kick off any number of tasks like builds, deploys, and debugging. Early in my career, I resisted them, feeling that I should be able to be productive with a loose collection of individual tools. I finally caved when I read Martin Fowler’s “Refactoring” and realized that the one thing the IDEs at the time could do that my much beloved power editor plus command line could not was tear through a project’s code base, transforming the code smartly, based on an actual understanding of how source files fit together. The alternative was to use tools like sed, grep and awk to essentially do global search and replaces across many, many files. To do so, you had to explicitly write out what you wanted changed, character by character. In an IDE, you could select a menu option–extract method, introduce parameter, rename class, etc.–and provide very lightweight input like new names and new locations, depending on how you were refactoring the code.
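
To make the contrast concrete, here is a minimal sketch of that blunt, text-level approach done from inside vim rather than with sed; the file glob and the oldName/newName identifiers are hypothetical placeholders, not anything from a real project.

    " Load every JavaScript file under the project into the argument list.
    :args src/**/*.js

    " Run a spelled-out, literal substitution in each file and save it.
    " Unlike an IDE rename, this knows nothing about scope or references;
    " it changes any text matching the pattern, strings and comments included.
    :argdo %s/\<oldName\>/newName/ge | update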

As much as I have been using IDEs for the past handful of years, part of me still felt they were a little intellectually lazy. I’ll be the first to admit that feeling is fairly irrational, but that is how I have felt. I guess a quiet part of me feels I should be skilled enough, and deeply enough embedded in the languages, tools and projects I use, to make the loose collection of tools approach work, strengthening my abilities as a coder by being much more actively in the mix of those tools. Sort of how I feel about having a manual transmission in my car rather than an automatic. Except the clutch is n-th dimensional and I have to use all five fingers of my right hand, plus a few toes of the corresponding foot, to manipulate the stick through the gear changes.

I have to say, I am enjoying engaging deeply with my power editor of choice. Even though I took a break from coding professionally at my last job for about two years, I still used vim for any and all of my text writing and editing needs. The biggest difference since the last time I used vim for coding on a regular basis is how much more mature and robust the plugin ecosystem has become. While I still supplement vim’s incredibly powerful and efficient text manipulating capability with tools to search, check and build code, and more, a lot of very passionate folks have written some very nice glue code to allow those tools to be invoked from within vim and to interact with the way the editor thinks about text and files.

I have been sharing some updates as I noodle around with my vim set up, focused on the editor’s main configuration file, .vimrc. In response to one of those posts, a friend, Jed, asked if I’d write a post about the plugins I use. I am more than happy to comply. If you are curious, you can read along at the github repo I created to house and share my hacking on my .vimrc. I have tried to comment the various files I use to explain my thought process and pay forward the many, many tricks I have found in recent days to make working with vim that much more pleasant than it is already.

First and foremost, there are at least a few ways now to automatically install and manage plugins, a capability that was sorely lacking even a few years back. Of those tools, I use vundle. I like it because it only requires a one time manual install and then manages updates of itself as well as of my plugins. It provides a really nice set of commands within vim, with in-editor help explaining them, to search for, install and clean up unneeded plugins. My main .vimrc simply sources a supplemental file, vundle.vim, that both sets up vundle and itemizes my current raft of plugins.
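
If you are curious what that supplemental file looks like without digging through the repo, here is a minimal sketch along the same lines; the plugin list simply mirrors the ones discussed below and the github paths are from memory, so treat the actual file in my repo as the authority.

    " Vundle needs these off while it adjusts the runtime path.
    set nocompatible
    filetype off
    set rtp+=~/.vim/bundle/Vundle.vim

    call vundle#begin()
    " Let vundle manage itself, then list the plugins covered in this post.
    Plugin 'gmarik/Vundle.vim'
    Plugin 'bling/vim-airline'
    " vim-misc is a library easytags depends on.
    Plugin 'xolox/vim-misc'
    Plugin 'xolox/vim-easytags'
    Plugin 'majutsushi/tagbar'
    Plugin 'scrooloose/nerdtree'
    Plugin 'scrooloose/syntastic'
    Plugin 'tpope/vim-fugitive'
    Plugin 'mileszs/ack.vim'
    call vundle#end()

    " Turn filetype detection back on now that the plugins are on the path.
    filetype plugin indent on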

I have a bunch of language specific plugins; I’ll forego discussing those for now. The other, more general plugins are both more likely to be of interest to any random reader and contribute the most value, and joy, to my use of vim on a daily basis.

The first plugin I’ll recommend is airline. I have long had a custom status line in vim, one I configure to be always present. It showed a few useful things, but projects like airline and powerline improve on this massively and make it possible to have a very slick looking set up. I use the patched powerline fonts, explained and linked to in the airline docs, to gild the lily that is this plugin’s info rich display. The readme at airline’s github project is its own best sales pitch; I encourage you to give it a read. I like airline even more for being a vim native re-write of the original powerline plugin. It demonstrates just how incredibly powerful vim’s own scripting environment has become.
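
The relevant configuration is tiny. A sketch of the two lines that matter in my case, assuming the patched fonts are already installed on the system:

    " Keep the status line visible in every window, not just when split.
    set laststatus=2

    " Tell airline the patched powerline glyphs are available so it can
    " draw its fancy separators instead of falling back to plain ASCII.
    let g:airline_powerline_fonts = 1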

Another backbone of my toolset is exuberant ctags. This is a unix program that digests source code in any number of languages and builds a database of tags. That database allows a programmer to quickly jump around a code base, from reference to declaration. vim has long supported integration with ctags, so if you have an appropriate database built and accessible, jumping to a function declaration is only a hot key away. If my aging memory is right, this support predates most if not all of the currently popular IDEs. I use the plugin easytags, which allows vim to continuously refresh my ctags databases as I work. I pair that with the plugin Tagbar, which allows me to open a split frame that uses ctags to present a code structure aware outline view of whatever source file I am working on. Further, Tagbar integrates with airline, enriching my status line with more contextual info about where I am within any particular segment of code.
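
A sketch of the kind of glue I mean, assuming a reasonably recent easytags that can update in the background; the function key mapping is just my own arbitrary choice:

    " Look for a tags file next to the current file, then walk up the tree.
    set tags=./tags;,tags

    " Keep easytags from blocking my editing while it refreshes the database.
    let g:easytags_async = 1

    " Toggle the Tagbar outline split with a single key.
    nnoremap <F8> :TagbarToggle<CR>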

I use two plugins from github user scrooloose, NERDTree and Syntastic. The first brings in a file explorer that is far more powerful and easier to use than the already pretty nice explorer built into vim. The second integrates any number of standalone syntax checkers, like jshint and Java Checkstyle. That integration allows me to jump around my sources, seeing where I have slightly junky code I might want to clean up, or an actual break I definitely need to fix, well before it hits a compiler or a deployed application.
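
Again, the wiring for these is short. A sketch along the lines of Syntastic’s own recommended settings, with another arbitrary key mapping of my own for the explorer:

    " Toggle the NERDTree file explorer in a side split.
    nnoremap <F7> :NERDTreeToggle<CR>

    " Have Syntastic check files when they are opened as well as saved, and
    " feed its findings into the location list so I can jump between them.
    let g:syntastic_check_on_open = 1
    let g:syntastic_always_populate_loc_list = 1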

I just added a tool called fugitive. Mostly I pulled it in to see my current working git branch in airline. It supports invoking git, the whole hawg of distributed version control tools, from within vim, but I have barely scratched the surface of that. I am still creeping towards the deep end of git, at present trying to use more interactive adds to selectively commit not just whole files I’ve changed, but related collections of fragments of files, separating them from other changes in those same files that I want to save for a subsequent commit, for a separate reason. Being able to use vim to select those chunks and build up smarter commits is intensely appealing while still slightly daunting.
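
For what it is worth, here is the rough shape of that staging workflow as I currently understand fugitive’s documentation; treat this as my notes rather than a definitive recipe.

    " Open fugitive's status window; pressing - there stages or unstages
    " the whole file under the cursor.
    :Gstatus

    " Split the current file against its staged (index) version.
    :Gdiff

    " From the working tree side, with the cursor inside a hunk, push just
    " that hunk over to the index buffer, then write that buffer to record
    " the staged change. Repeat for related fragments and finish with :Gcommit.
    :diffput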

Up to this point, if you are a coder, you may be thinking that all these plugins do is approach, but not quite compete with, a modern IDE. I wouldn’t necessarily disagree; as I said, these plugins just make existing tools easier to keep at the periphery of where I exert the vast majority of my focus every day. I have such deeply ingrained muscle memory for vim that I can create and change text far more effortlessly with it than with even the best editor offered by the best IDE. I have tried “vim” bindings in some IDEs, and even an abomination that yokes vim directly to Eclipse, but all they have done is remind me of what I love about vim.

There is one last plugin that I feel is the most compelling response to the refactoring support available in packages like Eclipse and IntelliJ. For geeks of a certain age, tools like grep and sed are equal parts comfortable and frustrating. They are so responsive at sifting through masses of files yet so demanding in the exactitude of their use. A few years back, a colleague introduced me to a kinder, gentler alternative built specifically for coders, ack. ack combines the expressive power of grep with a better understanding of the peculiarities of source code and programming language syntax.

ack.vim brings that useful and usable code searching into vim. In this case, search is only half the picture. vim supports macros, not in the sense of scripting or snippet expansion, but the ability to record and play back keystrokes. Everything vim does can be invoked without taking your hands off the keyboard; I feel this is vim’s best, most under appreciated feature. Imagine searching for some repetitive fixture in your code you feel must change, something thorny enough to require ack to confidently spot all instances of it. Now couple that, through this plugin, with the ability to repeatedly and correctly apply your transform, not just something superficial, but a complex interaction with every single occurrence you unearth. Using vim, ack and ack.vim may not be as simple as picking from a few menu options, but it also isn’t anywhere near as constrained. I’ve used macros to work over related sections of text, multiple lines together, not just a line or two around a simple declaration.
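
A contrived illustration of that pairing, with the helper name being entirely hypothetical:

    " Populate the quickfix list with every occurrence ack can find,
    " without jumping to the first match yet.
    :Ack! 'legacyHelper('

    " Record a macro into register a: make the multi-line edit by hand once
    " at the current match, then hop to the next result before stopping.
    "   qa  ...the edits...  :cnext<CR>  q
    " Replay it across the remaining occurrences in one shot, for example:
    "   50@a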

I’ve only been applying this latest deliberateness to my use of vim for a few weeks. My hope and fear is that there is so much more to explore. Hope in that I already feel I have finally found a combination and integration of tools capable of giving any IDE a run for its money. Fear in that these waters run very deep indeed and my ability to plumb them is only so great.

Reading Walden

Chris Miller turned me on to The Jefferson Hour. A few episodes back, they got to talking about Walden. What struck me was the discussion of one of Thoreau’s main points, apparently, about leading a deliberate life. This makes sense; he is most often quoted for his thought on the mass of men leading lives of quiet desperation. I am only a chapter or so in but have already encountered that quote. The context is his pondering of how so many of us go through life driven by perceived obligation–that we must hold a certain job, possess certain things, make certain choices.

I have yet to reach any of Thoreau’s thoughts on how to avoid the pitfall he elaborates so well. As I said, I’ve only just started reading, squeezing in pages amongst my other reading. My night time routine of decades is to read a bit of fiction before sleep; I find myself anxious if I skip this and less able to sleep readily. I am also still grappling with more than one learning curve at work, with books I am trying to keep pushing along on JavaScript, Node and Scala.

I will say I have a bit of a worry. Thoreau was only just thirty when he wrote the book. Tonally, it shows, especially in his extended rant about how those older than him didn’t necessarily have anything to teach him. I get his point, to a degree: time on this earth alone isn’t an adequate predictor of wisdom, and even where someone has acquired hard won insight through experience, those experiences may not be anywhere near universally applicable. Still, as someone tarred by the brush with which he paints, and a parent to boot, I could have wished for a bit more humility, nuance or both at that point in the opening of the book. My complaint isn’t strong enough for me to set the book down, but I do wonder if I should have read it well before now.

I hope to find some sort of pace that makes regular thoughts on the book at least somewhat coherent here. I have been thinking more lately about the non-technical parts of my professional life. I mentioned this more on my other site, but I have returned to Franklin’s 13 virtues, another idea introduced to me by Chris Miller. Walden’s notion of a deliberate life seems highly compatible with Franklin’s idea of both a highly contemplated daily life and a set of ideals around which regular reflection should be focused.

As a geek of a certain age, I am starting to feel that my ability to learn new technology and apply it may not be as valuable as these other, less tangible aspects of my professional life. At my current gig, which I started at the end of last year, I am well north of the average age of my peers. They hired me despite a deficit of recent hands on coding experience, and at a pretty senior level, too. Looking to my role models, living and historical, long admired and newly found, seems like a good lens through which to start getting my thoughts and values clearer in my own mind so that I can apply them, well, more deliberately.

Giving Balticon a Miss

I remember vividly when I got started in podcasting. I managed to finagle the admission cost and travel expenses out of my then employer to attend Apple’s big developer conference, WWDC, the very year they announced podcast support in iTunes. On the first day, I visited the local Apple store to pick up an audio interface for my laptop, hoping to record my very first audio at that conference. Sadly, thanks to the ignorance of an Apple “genius,” my first recording happened a week or so after I got back.

Live events have been a large part of my experience of podcasting. That has tapered off in recent years, but early on I was invited to a lot of interesting conferences, conventions and events due to being a podcaster. One of the earliest was Balticon. I had befriended a few local area podcasters through a now long defunct meetup, and one of them got me invited as a participant some eight or so years ago during that early, heady rush. Even as I started scaling back my speaking engagements a while back, I kept going to Balticon. It is local and so easier on the pocketbook, not to mention that so many of my friends go.

The problem is there was a valid reason I scaled back in general. Oddly, the events I got asked to were rarely tech focused. Like Balticon, many of them were science fiction conventions. I happen to like science fiction. I have fond memories of the earliest cons I attended back in college. When trying to find my voice, though, as someone talking about technology, public policy, and society, an SF/F convention is an odd place to find myself. That tension has only grown over the years.

Each of the last three or four years, around this time, I have debated with myself if it made sense to go back to Balticon. Up until this year, I ultimately decided I had the patience, energy and enthusiasm to make the best of it. This year, I don’t think I do any more. Professionally and personally, the past twelve months or so have been trying, to say the least. Many of the reasons I looked forward to Balticon on a personal level have evaporated, or even worse, become reasons not to go.

I honestly don’t know if I’ll go again in the future. It will depend on a lot of things, things I couldn’t even predict right now. If you are going and would like to get together, up to a point, I could probably manage that, since the convention is still local for me. Just message me privately.

TCLP 2015-02-01 If It Sounds Right and My Site Has a POSSE

This is an episode of The Command Line Podcast.

In this episode, I share another essay, this time also from the web site, on the interplay of critical judgement and skill.

Following up on a promise to myself towards the end of last year, I extemporize for the latter half of the show. I hope that by allowing myself to just speak with little or no preparation on a subject, I will make it easier to get more shows out in a given year. I decided the work I’ve done exploring the concept of POSSE, or Publish on your Own Site, Syndicate Elsewhere, would be interesting to discuss, in particular responding to some mild criticism and clarifying why I find what I have set up so useful.

I wrote two posts covering more technical details of what I am using, here and here. You can view my lifestream category on this site, or the same category on my new site, to see how this is all working for me. I have to once again give Dave Slusher credit for getting me thinking about this idea and for sharing how he has implemented it. Once again, you can subscribe to the links I curate throughout the week from my own aggregator.

You can grab the flac encoded audio from the Internet Archive.

Creative Commons License