Docker, Red Hat and Containerization Wreck Virtualization

Conversation has popped up around a few tweets from Alex Williams regarding virtualization at the Red Hat Summit. Here’s one of the tweets that started the conversation.

Paraphrased, the discussion has been shaped around asking,

“Why has OS-level virtualization via containers (namely Docker) become such a massive hot topic?”

With that, it got me thinking about some of the shifts around containerization and virtualization at the OS level versus at the hypervisor level. Here are some of my thoughts, which seemed to match more than a few thoughts at Red Hat.

  1. Virtualization at the hypervisor level is starting to die off from an app usage standpoint, in favor of app deployment via OS-level virtualization. Virtualization at the OS level is dramatically more effective in almost every scenario where virtualization is used today for application development. I’m *kind of* seeing this; it’s interesting that Red Hat is, I suppose, seeing this much more.
  2. Having a known, identified container such as what Docker works with provides dramatically improved speed and a better method for deployment than traditional hypervisor-based virtualization or pure OS-based deployment.

There are other points that have been brought up, but this also got me thinking on a slight side track: what are the big cloud providers doing now? Windows Azure, AWS, Rackspace or GCE, you name it, they’re all using a slightly different implementation of virtualized environments. Not always ideally efficient, fast or effective, but they’re always working on them.

…well, random thoughts aside, I’m off to the next session and some hacking in between. Feel free to throw your thoughts into the fray on Twitter or in the comments below.

Breaking Up Again, OneNote and I Must Go Separate Ways

Ok, psychologically one is supposed to tell the good news last and the bad news first. Well, I’m doing that backwards with this article. First things first, all the awesome about Microsoft’s OneNote App.

Microsoft OneNote

The cool thing is, after more than a few years, OneNote runs on most mobile and desktop systems. When I say most, what that equates to is: Windows, iOS and OS-X. Now, I wouldn’t doubt if it works on some other things that I’ve missed, but those are the places I know it works because that is where I’ve used the application the most.

OneNote does a number of things that are pretty cool. The first is simply to look pleasant and make it easy to add notes, images, sound or other objects into any notebook in the app. This makes note taking extremely easy. There are also a lot of features around note history to move back and forward, play things back and more. This interaction with the notes across all of the devices is pretty seamless, since the features are similar across all of the devices.

I actually really like the interfaces built specifically for the device that OneNote is running on. If I’m running the iOS iPhone App it is oriented to small screen touch and interactions of that sort. The iOS version is focused on creating notes, not on managing or organizing the actual notebooks and related structures.

On the iPad iOS App it’s oriented toward a larger workspace, more navigation between each notebook, and a little toward the management of these notebooks and the respective notes. The iPad version is a happy middle ground between the note creating focus of the iPhone App and the full blown OS-X and Windows Desktop versions.

Speaking of that, it’s been about 3+ years since I’ve used OneNote on Windows and about 2 years since I actually used Windows for anything relevant. So when OneNote was released on OS-X I was all over it. I’d always been a fan of the product, but it was limited since it only ran on Windows for the longest time. So when I switched off of Windows as my core operating system years ago, it went away. I had a list of top apps I lost when leaving Windows.

  1. LiveWriter for blogging because it hooked up to all the blogs I wrote to at the time; WordPress, DotNetBlogEngine and Blogger. So it was hugely useful.
  2. ReSharper for Visual Studio. Note I did NOT say Visual Studio, but just ReSharper. I’ve got a lot of this power back via WebStorm and IntelliJ, but I still miss the robustness of refactoring options with ReSharper.
  3. OneNote with Office. Note, again the specifics of just missing OneNote and not Office. The Office Suite, especially when I moved to non-MS Operating Systems was already useless to me. It was stuck in the 90s world of files and file systems. I’d already moved on to web options where the files were always where I needed them and versioned appropriately.

When I got down to this list, I assumed I could go ahead and switch. I did, haven’t regretted it for a moment, and will still tell anybody that’s good at adapting to tools and finding the best one for the job that the grass is indeed greener when not on Windows.

But I sure was happy to get OneNote back, but as I used it I realized…

…and now OneNote dies to me again.

…that I’d moved on from the paradigm that OneNote has to offer. I use more than merely the iOS or OS-X or Windows version. I need an option to see and retrieve my information beyond that medium. I needed to be able to use these tools while sometimes disconnected, and this created a huge problem, as they’re all tightly coupled to the SkyDrive style service. In addition, if one uses any of the other iOS Office Suite Apps from Microsoft, those are also tied to SkyDrive, but one has to get a monthly account to use those.

Overall the OneNote app was elegant, nice and worked well, but the connectivity issues and the tight coupling to the SkyDrive service left it removed from the other tools that I use to get work done. I suppose, if one is a full on fanbois using all the Microsoft tooling running on Windows, it likely has some integration with those tools. However I use a wide variety of tools across more than one operating system. In the end it seemed like Microsoft was endeavoring to lock me into their online presence with their offer of free OneNote as a gateway to their Office product.

Albeit I’ve used it now for 2 weeks and made tons of notes in it, I’m just going to go back to Evernote. The access is better, even though the apps are clunkier and not as pretty (I realize that’s subjective). Overall I’d rather use OneNote as an interface to files I put in Dropbox or Google Drive or Evernote or something, but alas, it hasn’t worked out. So if you’re looking for a note taking app, OneNote might still be for you, but if you want full across-the-board support on many platforms, Evernote is still a more capable option.

Sorry OneNote, but even though it was nice to have a second fling, we have to go our separate ways again. I guess it’s time to fire Evernote back up.

I’ve Officially Sent This Email Over 100 Times to Recruiters Looking for .NET Developers

Job Description

Here’s the letter, it’s kind of LOLz! I know it’s tough to find .NET Developers (or replace .NET with Java Developers or X Enterprise Language), so CIOs, CTOs and others take note. Here’s what I experience and what I see all the time, embodied in a letter. I will put effort into hooking people up with good jobs, that fit the person, and persons that fit the job, but lately I’ve seen companies that do .NET work in the Portland, Seattle and especially San Francisco areas become exceedingly desperate for .NET Developers. This is what my general response looks like.

“Hello Recruiter Looking for .NET Developer(s), thanks for reaching out to me, however I regret to inform you that I don’t know a single .NET Developer in Portland Oregon looking for work. It seems all the .NET Developers have either A: gone to work for Microsoft on Node.js Technologies, B: switched from being a .NET Developer to a Software Developer or otherwise C: left the field and don’t want to see any software ever again (which always makes me sad when people burn out, but alas, hopefully they find something they love). It’s a funny world we live in.

Even though I’m fairly well connected in Portland, Seattle, Vancouver (BC) and even San Francisco it is rare for me to meet someone who wants to do pure .NET Development. If there is I’ll connect them with you. However if you know a company that is porting away from .NET, building greenfield applications in Node.js, Ruby on Rails or other open source stacks I have a few software developers that might be interested.

Cheers”

Even though this letter is geared toward recruiters looking for coders, there is another letter that I’d like to write to a lot of other companies, which goes something like this,

“Dear Sir or Madam At X Corp Enterprise,

Please realize that lumping a person into the position you’re requesting (.NET Developer) is a career limiting maneuver for many in the occupation of software developers. We software developers are people who solve problems, it happens that we do this with code written on computers. The computers execute that code for us thus resolving the problems that you face. This helps X Corp Enterprise do business better! It’s a great relationship in many ways, but please don’t limit our future careers by mislabeling us.

Also, we’re not resources. That’s just a scummy thing for a human to call another human. Thanks for reading this letter, have a great day at X Corp Enterprise!”

I’d be happy to refer .NETters (or Javaers or COBOLers or RPGers or whatever), but seriously, it seems to be a lost cause out there, even more so for mid-level or beginning developers. Barely a soul is looking for a job as a .NET Developer, but I know a few that look for jobs as software developers every couple of weeks.

Speaking of which, if you are looking for work and you want a filtered list of the cool companies and related information of who to work for in Seattle, Portland or elsewhere in Cascadia reach out to me and let me know who you are. I’m more than happy to help you filter through the mine field of companies and job listings. Cheers!

Write The Docs Bring Forth New Understanding From Documentarians @WriteTheDocs

The Write The Docs Conference was started in 2013 as a simple idea. Bring together those that write professionally in technical fields, creating documentation, educational papers, scientific research or other ideas of this nature. Several people got together to organize this, thinking that even a moderate turn out of writers would be considered a success.

However a moderate turnout of 50 or 60 it wasn’t; it was a huge turnout of hundreds of people. Maybe I should say documentarians? Energy was high and individuals were ecstatic to meet others working on this oft overlooked area of technical work. It was such a great turnout, with solid, useful and valuable energy among everyone, that the organizers pushed forward and decided to have two conferences this year: one in Europe in the grand city of Budapest, and one back in the conference’s birthplace of Portland, Oregon.

With the Budapest event over, I’m looking forward to getting a review of the event from Troy Howard @thoward37 and experiencing this year’s conference in person. If you’re looking to go, be sure to get tickets soon; they’ll very likely sell out.

Thoughts and Questions

This however brings me to the culminating point of this blog entry, that was ignited at the conference and inspired me to do more than just dwell alone in pondering what, when, how and where to write documentation. I started wondering and talking to people regularly about things related to documentation.

When should I start writing documentation?

Sometimes, it’s a good idea to start writing documentation before any coding actually starts. It’s a great idea to start documenting the ideas, thoughts around workflow and related matters pretty early. Writing down thoughts helps confirm understanding and ensure more concise communication. At least, that’s what I have found. I wouldn’t be surprised if others experience this too if they give it a try in the earlier stages of a project.

How should I write documentation?

This is an interesting matter. Is it in story form? Such as in the form of a user story, or that of a fictional story of someone using the application that would prospectively be built? Should just the entities, domains or other elements be written about? What should be written about these things at such an early stage? Isn’t this all part of BDUF (Big Design Up Front, a horrible anti-pattern)?

Watching out for BDUF is critical to success. Avoid it with energy and gusto. However, confirming the entities and domains of a project, and writing them down so others can more decisively understand what they are, is important and useful. Writing stories of the application can also be extraordinarily useful!

Many other questions come up and I’d love to see material on practices and ideas others have had. This is one of the big reasons why I’ll be attending Write the Docs. I hope to see you there, maybe we’ll get some answers and ideas wrapped up into some documentarian solutions!

For some great photos of the Budapest event…

Pluralsight Authors Summit – Meeting & Learning Really Talented People!

Finally, I’ve been able to wrap up my first blog entry on the Pluralsight Authors Summit 2014 (AS14)…

Classified

It all started with this. I’d received a mission.

NOTE: Click on any image to see the full gallery of images I took at the conference. My apologies for the dirty iPhone 5 camera lens.

I’ve been creating Pluralsight courses for a while now, with two to my name; Riak Fundamentals and Docker Fundamentals. I’ve got others in the works, and a lot of great suggestions that I’ll be blogging about in the very near future. However this weekend I headed to Salt Lake City for the Pluralsight Authors Summit.

I arrived at the airport, and it was a 3 minute walk out and onto the light rail to downtown. On my layover I had ranted via Twitter about the mess that SEATAC (Seattle & Tacoma’s Airport) is; Salt Lake City makes SEATAC look like an engineering catastrophe. So it was really nice to land in SLC and be able to walk right onto the train into town.

…that led into my admitted love for Seattle, I can’t harsh too bad on the emerald beauty…

Immediately upon leaving the airport it did seem a bit like I’d entered Mordor. Looking into the far distance the sky almost burned a brownish red and seemed to hold endless darkness as far as I could see, with a twisting cloud or fog structure pushing down upon the southern view from the airport.

Ok, ok. It actually looked like this. But really, check that out, it’s kind of wild looking!

Along the way it cleared up and there were some amazing views to see of the mountains in the distance. It doesn’t really matter which way you look, you’ll see amazing vistas all around.

I rolled on into town and got to see a bit of downtown as the light rail passed through. It seems that Salt Lake City has a lot of bike lanes and related things, albeit I didn’t see any bicyclists anywhere. Overall, what I could gather was that the city was extremely clean and well kept, and the people, whom I got to experience during rush hour while coming into town, were calm and chill, as I often expect of west coast cities.

I then got off at the Little America Hotel, where the conference was taking place. I couldn’t have asked for an easier ride, with the front door of the hotel being barely across the street from the light rail stop. I figured out my room, headed to check in and got some cool swag, then went off to drop my pack at the room.

Once I rolled back into the main summit conference center I introduced myself to several people and got my photo taken. Somewhere, at some point, you’ll be exposed to my crazy mug again. I’ve warned you.

I talked camera and video gear with Phil Hunter. Phil has just started working at Pluralsight and is getting some great work put together for them.

After a bit of talking and introductions to new people, we all rounded up and sat down for dinner that evening. It wasn’t just dinner though; there was gambling set up with prizes and more. That in itself was pretty cool, but being the non-gambling person that I am, I went straight to the food, which I gotta say was really good! I even got to experience two glassholes (Jim Wilson @hedgehogjim | Jim’s Author Page and Llewellyn Falco @LlewellynFalco | Llewellyn’s Author Page, who are excellent crew) try to set up some magic pixie dust unicorn trick with their Google Glasses.

Jon, Shannon, Julie and the rest of us sat helplessly while they configured the glasses to do… well, I don’t think we ever figured it out really. But it was a great table to sit at and we had a good dinner. I wrapped up, and others went to gamble while I went to get some recovery sleep.

Saturday

Saturday kicked off a set of talks:

  • Key Note: Aaron Skonnard @skonnard CEO of Pluralsight – great to get the big picture and see where the company is headed.
  • Curriculum Overview & Future Direction – Fritz Onion @fritzonion & team dove into specifics of how we’ll grow offerings to bring more courses and material to subscribers in the coming year, making it easier to find, search for and use.
  • Continuous Improvement & Creating Compelling Technical Content with Geoffrey Grosenbach @topfunky.

Another great lunch was served, conversations were had and I got to introduce myself to even more great authors. After lunch I finally met Koffi Sessi @aksessi in person and we discussed courses, ways to improve and put together even better content, and a host of other topics. We wrapped up with a promise he’d send me some of the music he listens to. Being that we both listen to some really esoteric genres, I’m looking forward to what he sends me.

After that I got to check out Video Workflow with Shawn Wildermuth @ShawnWildermuth and Authoring and Time Management with the Dane Down Under Lars Klint @larsklint. After dinner the evening wrapped up with X Things You Didn’t Know You Could Do With Your Blog by Chris Reynolds @jazzs3quence and Tips on Using Windows Azure to Host VMs for Recording Pluralsight Demos by Orin Thomas @orinthomas | Orin’s Author Page.

In-memory Orchestrate Local Development Database

I was talking with Tory Adams @BEZEI2K about working with Orchestrate’s services. We’re totally sold on what they offer and are looking forward to a lot of the technology that is in the works. The day to day building against Orchestrate is super easy, and setting up collections for dev or test or whatever is so easy nothing has stood in our way. Except one thing…

Every once in a while we have to work disconnected, for whatever the reason might be: Comcast cable goes out, we decide to jump on a train, or one of us ends up on one of those Q400 puddle jumpers that doesn’t have wifi! Regardless of being disconnected from wifi, cable or internet connectivity, we still want to be able to code and test!

In Memory Orchestrate Wrapper

Enter the idea of creating an in memory Orchestrate database wrapper. Using something like convict.js, one could easily redirect all the connections as necessary when developing locally. That way development continues right along, and when the application is pushed live, it’s redirected to the appropriate Orchestrate connections and keys!

This in memory “fake” or “mock” would need to have the key/value, events, and graph stores set up just like Orchestrate. With this in memory, one could also easily write tests against a real fake and be able to test connected or disconnected without mocking. Not to say that’s a good or bad idea, but one more tool in the tool chest doesn’t hurt!
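
A minimal sketch of just the key/value piece of such a fake might look like the following. To be clear, the interface here is entirely hypothetical; it isn’t the real Orchestrate client API, just an illustration of the idea.

```javascript
// A hypothetical in-memory key/value "fake" with an Orchestrate-style
// collection/key interface. Not the real Orchestrate client API.
function InMemoryOrchestrate() {
  // collection name -> { key -> value }
  this.collections = {};
}

// Store a value under a collection/key pair, creating the collection
// on first use.
InMemoryOrchestrate.prototype.put = function (collection, key, value) {
  if (!this.collections[collection]) {
    this.collections[collection] = {};
  }
  this.collections[collection][key] = value;
  return value;
};

// Retrieve a value, or null when the collection or key doesn't exist.
InMemoryOrchestrate.prototype.get = function (collection, key) {
  var coll = this.collections[collection] || {};
  return coll.hasOwnProperty(key) ? coll[key] : null;
};

// Disconnected on the train? Code and test against the fake anyway.
var db = new InMemoryOrchestrate();
db.put('users', 'adron', { name: 'Adron' });
console.log(db.get('users', 'adron').name); // prints "Adron"
```

The events and graph stores would follow the same pattern, and the configuration layer would pick between this fake and the real client depending on connectivity.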

If something like this doesn’t pop up in the next week or three, I might just have to kick off this project myself! If anybody is interested please reach out to me and let’s discuss! I’m open to writing it in JavaScript, C#, Java or whatever poison pill you’d prefer. (I’m polyglot, so I’m not going to limit my options!!)

Other Ideas, Development Shop Swap

Another idea that I’ve been pondering is setting up a development shop swap. I’ll leave the reader to determine what that means!  ;)  Feel free to throw down ideas that this might bring up and I’ll incorporate that into the soon to be implementation. I’ll have more information about that idea right here once the project gets rolling. In the meantime, happy coding!

Configuring Node.js Web Applications… Manually || Convict.js

There’s more than a few ways to configure node.js applications. I’ll discuss a few of them in this blog entry, so without mincing words, on to configuring apps!

Solution #1: Build Your Own Configuration

Often this is a super easy solution when an application just needs a single simple configuration. Here’s a pretty clean example that Noli posted on Stack Overflow in answer to the question “How to store Node.js deployment settings/configuration files?”.

var config = {};

config.twitter = {};
config.redis = {};
config.web = {};

config.default_stuff = ['red','green','blue','apple','yellow','orange','politics'];
config.twitter.user_name = process.env.TWITTER_USER || 'username';
config.twitter.password = process.env.TWITTER_PASSWORD || 'password';
config.redis.uri = process.env.DUOSTACK_DB_REDIS;
config.redis.host = 'hostname';
config.redis.port = 6379;
config.web.port = process.env.WEB_PORT || 9980;

module.exports = config;

…then load that in with a require…

var config = require('./config');

The disadvantage is when the application gets a little bigger the configuration can become unwieldy without very specific, strictly enforced guidelines.

Solution #2: Use a Library/Framework Like Convict.js

The use of a library provides some baseline with which to structure configuration. In the case of convict.js, it uses a baseline schema that can then be extended or overridden based on the configurations needed for alternate environments. A first step in setting up convict.js for the fueler project looks like this.

Setup a convict.js file:

var convict = require('convict');

// Schema
var conf = convict({
    env: {
        doc: "The App Environment.",
        format: ["production", "development", "test"],
        default: "development",
        env: "NODE_ENV"
    },
    port: {
        doc: "The port to bind.",
        format: "port",
        default: 3000,
        env: "PORT"
    },
    database: {
        host: {
            default: "someplace:cool",
            env: "DB_HOST"
        }
    }
});

// perform validation
conf.validate();

module.exports = conf;

The main two configuration values are the environment and port values. Others will be added as more of the application is put together, but immediately I just wanted something in the project to ensure it works.

Next get the convict.js library in the project.

npm install convict --save

The --save flag gets it put into the package.json file as a dependency. Once this was installed I opened up the app.js file of the project and added a require at the top of the file, after the path require and before the express() call.

var path = require('path');
var config = require('./config');

var app = express();

In the app.set line for the port I changed the setting of the port to be the configuration parameter.

app.set('port', process.env.PORT || config.get('port'));

Now when I run the application, the port will be derived from the config.js file setting.

Now What Did I Do?

I’ll write more about this in the near future, but for now I’ve run into something not being set up right. I’m still working through various parts of customizing my setup. What isn’t covered in the instructions for convict.js, which aren’t very thorough beyond the most basic use, is how to ensure that the other environments are set up with *.json files. What I mean by this is…

I’ve set up a directory with three json files. It looks like this.

My Config Directory

Each of these files (or at least one of them) would, I’d think based on the instructions, get loaded and merged into the configuration by the code in my app.js shown below.

var env = conf.get('env');
conf.loadFile('./config/' + env + '.json');

The order of override for the configuration values starts with the base config.js; then any *.json files override those config.js settings, and any environment variables override the *.json set configuration values. Based on that, unless of course I’ve missed something in this snippet of code, I should be getting the configuration settings from the *.json files.
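
To sanity-check my understanding of that ordering, the rule can be illustrated in a few lines of plain JavaScript. This is just an illustration of the precedence order, not convict.js internals:

```javascript
// Illustration of the override order: schema defaults first, then
// values loaded from a *.json file, then environment variables last.
// Not convict.js internals, just the rule it's supposed to follow.
function resolve(defaults, fileValues, envValues) {
  var config = {};
  Object.keys(defaults).forEach(function (key) {
    config[key] = defaults[key];
    if (fileValues.hasOwnProperty(key)) config[key] = fileValues[key];
    if (envValues.hasOwnProperty(key)) config[key] = envValues[key];
  });
  return config;
}

var defaults = { port: 3000, env: 'development' };   // from config.js
var fromFile = { port: 1337 };                        // e.g. from test.json
var fromEnv = process.env.PORT ? { port: Number(process.env.PORT) } : {};

console.log(resolve(defaults, fromFile, fromEnv));
// with no PORT environment variable set, logs { port: 1337, env: 'development' }
```

If convict.js follows this rule, then with no PORT environment variable set, my test.json port of 1337 should be what conf.get('port') returns.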

My config file data looks like this. Since it is using cjson I went ahead and stuck comments in there too.

/**
 * Created by adron on 3/14/14.
 * Description: Adding test configuration for the project.
 */

{
    "port": {
        "doc": "The port to bind.",
        "format": "port",
        "default": 1337,
        "env": "PORT"
    }
}

Until later, happy coding, I’m going to dive into this and figure out what my issue is. In my next blog entry I’ll be sure to post an update to what the problem is.

Oh, and that fueler project. Feel free to ping me and jump into it.