Coding on Orchestrate.io & Orchestrate.js & Orchestrate.NET

First context, then I’ll dive in.

Orchestrate

http://orchestrate.io/

Orchestrate is a service that provides a single, simple API for accessing a multitude of database types all in one location. Key-value, graph, and events, some of the types I've been using, are but a few of those already available, and there are many more on the way. Having these databases available via an API, instead of going through the arduous process of setting up and maintaining each database for each type of data structure, is a massive time saver! On top of a clean API and a solid database platform and infrastructure, Orchestrate has a number of client drivers that provide easy-to-use wrappers, available for a number of languages. Below I've written about two of these that I've been involved with in some way over the last couple of months.

Orchestrate.NET

https://github.com/RobertSmith/Orchestrate.NET

This is the library I'm currently using for a demonstration application built against the Deconstructed.io services (follow us on Twitter: @BeDeconstructed), a startup I'm co-founding. I'm not sure exactly what the app will be, but being .NET it'll be something enterprisey. Because: .NET is Enterprise! For more on this project check out the Deconstructed.io blog.

Some of the latest updates with this library.

But there's still a bit of work to do on the library, so consider this a call-out: anybody that has a cycle they'd like to throw in on the project, let us know. We'd happily take a few more pull requests! The main two things we'd like to have done real soon are…

Orchestrate.js

https://github.com/orchestrate-io/orchestrate.js

With the latest fixes, additions, and updates, the orchestrate.js client driver is getting more feature-rich by the day. In addition, @housejester has created an orchestrate-brain project for Hubot that uses Orchestrate.js. If you're not familiar with Hubot, be sure to check out the company robot that can dramatically improve and reduce employee efficiency! Keep an eye on that project for more great things, or create a Hubot to keep a robotic eye on the project.

Here are a few key things to note that have been added to help in day-to-day coding on the project.

  • The .travis.yml file has been added for the Travis Continuous Integration build. The build runs against Node.js v0.10 and v0.8.
  • Testing is done with mocha, expect.js, and nock. To get the tests up and running, clone the repo and build with the makefile. The tests run in mocha's TDD format.
  • Promises are provided via the kew library.

If you're opening the project in WebStorm, it's great to set up the mocha tests with the integrated mocha test runner as shown below. After you've cloned the project and run 'npm install', follow these steps to add the mocha testing to the project. We've already set up exclusions in the .gitignore for the .idea directory and files that WebStorm uses.

First add a configuration by clicking on Edit Configurations.

Edit Configurations


Next click on the + to add a new configuration to run. Select the Mocha option from the list of configurations.

Mocha & Other Configurations in WebStorm


On the next screen, set a name for the configuration and set the test directory to the path of the test directory in the project. Finally, set mocha's User interface option to TDD instead of the default BDD.

Edit Configuration Dialog


Last but not least run the tests and you’ll see the list of green lights light up the display with positive results.

Test Build


Fixing Up Passport.js ‘passport-http’ for Express v4

Even though it isn't in the primary trunk of code for the 'passport-http' Passport.js strategy, I've upgraded the package.json and app.js files for the basic username and password authentication example to Express.js v4. If you're using Express.js and looking to migrate to v4 from v3 or earlier, a great starting place is the "Migrating from 3.x to 4.x" wiki page. As for the passport-http strategy, here's the updated example I put together in a commit here with my own fork here, with the code changes below.

The first step was to bump to the latest Express.js v4 module. I did this with a simple edit to the package.json file. The original looked like this:

{
  "name": "passport-http-examples-basic",
  "version": "0.0.0",
  "dependencies": {
    "express": ">= 0.0.0",
    "passport": ">= 0.0.0",
    "passport-http": ">= 0.0.0"
  }
}

I changed the express dependency from >= 0.0.0 to >= 4.0.0 so that it would require v4 or above.

{
  "name": "passport-http-examples-basic",
  "version": "0.0.0",
  "dependencies": {
    "express": ">= 4.0.0",
    "passport": ">= 0.0.0",
    "passport-http": ">= 0.0.0"
  }
}

Technically the old file would have pulled the latest version (which as of today I believe is 4.1.1), but it would also not do anything if you'd already pulled the example down. The change just makes it explicit that the version is now v4+.

After changing that dependency I added Morgan, the replacement middleware for Express's built-in logger. The final package.json file looked like this when I was done:

{
  "name": "passport-http-examples-basic",
  "version": "0.0.0",
  "dependencies": {
    "express": ">= 4.0.0",
    "passport": ">= 0.0.0",
    "passport-http": ">= 0.0.0",
    "morgan": "~1.0.0"
  }
}

Once that was done I nuked my node_modules directory and ran npm install to pull down the latest bits. Then, starting with app.js, I made a few changes. Below is what the app.js file looked like when I started.

var express = require('express')
  , passport = require('passport')
  , util = require('util')
  , BasicStrategy = require('passport-http').BasicStrategy;

var users = [
    { id: 1, username: 'bob', password: 'secret', email: 'bob@example.com' }
  , { id: 2, username: 'joe', password: 'birthday', email: 'joe@example.com' }
];

function findByUsername(username, fn) {
  for (var i = 0, len = users.length; i < len; i++) {
    var user = users[i];
    if (user.username === username) {
      return fn(null, user);
    }
  }
  return fn(null, null);
}

// Use the BasicStrategy within Passport.
//   Strategies in Passport require a `verify` function, which accept
//   credentials (in this case, a username and password), and invoke a callback
//   with a user object.
passport.use(new BasicStrategy({
  },
  function(username, password, done) {
    // asynchronous verification, for effect...
    process.nextTick(function () {
      
      // Find the user by username.  If there is no user with the given
      // username, or the password is not correct, set the user to `false` to
      // indicate failure.  Otherwise, return the authenticated `user`.
      findByUsername(username, function(err, user) {
        if (err) { return done(err); }
        if (!user) { return done(null, false); }
        if (user.password != password) { return done(null, false); }
        return done(null, user);
      })
    });
  }
));

var app = express.createServer();

// configure Express
app.configure(function() {
  app.use(express.logger());
  // Initialize Passport!  Note: no need to use session middleware when each
  // request carries authentication credentials, as is the case with HTTP Basic.
  app.use(passport.initialize());
  app.use(app.router);
  app.use(express.static(__dirname + '/public'));
});

// curl -v -I http://127.0.0.1:3000/
// curl -v -I --user bob:secret http://127.0.0.1:3000/
app.get('/',
  // Authenticate using HTTP Basic credentials, with session support disabled.
  passport.authenticate('basic', { session: false }),
  function(req, res){
   res.json({ username: req.user.username, email: req.user.email });
  });

app.listen(3000);

First changes, add some requires, remove some requires.

var express = require('express')
  , passport = require('passport')
  , util = require('util')
  , BasicStrategy = require('passport-http').BasicStrategy;

and changed it to

var express = require('express')
  , passport = require('passport')
  , util = require('util')
  , BasicStrategy = require('passport-http').BasicStrategy
  , morgan  = require('morgan')
  , app     = express();

Then I deleted the entire section shown below.

var app = express.createServer();

// configure Express
app.configure(function() {
  app.use(express.logger());
  // Initialize Passport!  Note: no need to use session middleware when each
  // request carries authentication credentials, as is the case with HTTP Basic.
  app.use(passport.initialize());
  app.use(app.router);
  app.use(express.static(__dirname + '/public'));
});

I replaced that with a nicely cleaned up Express.js v4 section of code and the replacement for logger, the morgan() middleware. Initializing Passport, however, is still done the same ole' trusty way with initialize().

app.use(morgan());
app.use(passport.initialize());

The ordering of code has changed a bit for Express.js v4, which meant I needed the app.use calls before the following section, so I moved the section right up underneath the two app.use statements.

// curl -v -I http://127.0.0.1:3000/
// curl -v -I --user bob:secret http://127.0.0.1:3000/
app.get('/',
    // Authenticate using HTTP Basic credentials, with session support disabled.
    passport.authenticate('basic', { session: false }),
    function(req, res){
        res.json({ username: req.user.username, email: req.user.email });
    });

Finished app.js File

After those changes the app.js file should look like this.

var express = require('express')
  , passport = require('passport')
  , util = require('util')
  , BasicStrategy = require('passport-http').BasicStrategy
  , morgan  = require('morgan')
  , app     = express();


app.use(morgan());
app.use(passport.initialize());

// curl -v -I http://127.0.0.1:3000/
// curl -v -I --user bob:secret http://127.0.0.1:3000/
app.get('/',
    // Authenticate using HTTP Basic credentials, with session support disabled.
    passport.authenticate('basic', { session: false }),
    function(req, res){
        res.json({ username: req.user.username, email: req.user.email });
    });


var users = [
    { id: 1, username: 'bob', password: 'secret', email: 'bob@example.com' }
  , { id: 2, username: 'joe', password: 'birthday', email: 'joe@example.com' }
];

function findByUsername(username, fn) {
  for (var i = 0, len = users.length; i < len; i++) {
    var user = users[i];
    if (user.username === username) {
      return fn(null, user);
    }
  }
  return fn(null, null);
}

// Use the BasicStrategy within Passport.
//   Strategies in Passport require a `verify` function, which accept
//   credentials (in this case, a username and password), and invoke a callback
//   with a user object.
passport.use(new BasicStrategy({
  },
  function(username, password, done) {
    // asynchronous verification, for effect...
    process.nextTick(function () {
      
      // Find the user by username.  If there is no user with the given
      // username, or the password is not correct, set the user to `false` to
      // indicate failure.  Otherwise, return the authenticated `user`.
      findByUsername(username, function(err, user) {
        if (err) { return done(err); }
        if (!user) { return done(null, false); }
        if (user.password != password) { return done(null, false); }
        return done(null, user);
      })
    });
  }
));

app.listen(3000);

If you execute either of the curl commands shown in the comments in the app.js code you should see the respective response when running the application.

curl -v -I http://127.0.0.1:3000/
curl -v -I --user bob:secret http://127.0.0.1:3000/

…and that should get you running with Passport.js and Express.js v4.

A Recap Of My Top 4 Tech Article Reads From Pocket

Sometimes I get overwhelmed with the number of articles that are in my pocket. I’ve got articles on livability, transit, cycling, auto issues, node.js, java, javascript, coding practices, software craftsmanship, feminism, heavy metal, death metal, black metal, jazz, progressive jazz, fusion jazz, NASA news, space discoveries, space research, Star Trek news, Star Wars news, information on sci-fi books and a slight spattering of politics and some other just interesting nonsensical stuff.

Here's a shot of Pocket on OS-X with an article about Seattle's Tech Advantage over many American cities being rooted in urban density. Which, I'd also argue, gives Seattle a unique advantage (And is a serious pain point for Microsoft's misstep into the suburbs decades ago)


I’ve taken the time to sort through this list of articles, pick out the top technical articles and get this down to a manageable level again. In the process I’ve created this list of solid articles that I’ve now officially read or found useful in some way and present it here for you dear reader. Enjoy, I hope they’re useful to you too.

Article Recon, The Top

  1. Zef Hemel wrote up a piece titled "Docker: Using Linux Containers to Support Portable Application Deployment". In the article Zef delves into a number of things that are key to understanding Docker and the notion of portable application deployment. Other topics covered include isolation, security, reproducing deployments, and resource constraints. The article closes with an example of application containers and their respective deployment.
  2. 7 Javascript Basics Many Developers Aren’t Using (Properly) albeit slightly useful, I found this one more entertaining. It does give some small insight to the scope of oddities that JavaScript has and how one can easily miss the basics in JavaScript.
  3. Even though the article is from late last year, "The Premature Return to SQL" is a good read. As Alex Popescu states it, "This pisses me off. A lot." I too find myself pissed off a lot at the naive understanding and decision making around SQL and alternative options. It's almost as if some people decide to just flip a coin to make these determinations, with zero insight into what they're actually attempting to do.
  4. The article "No Deadlines for You! Software Dev Without Estimates, Specs or Other Lies" is spectacular in laying out how bullshit specs and estimates are. They're almost entirely wasted effort on the developers' part. In my own opinion it is often a failure of management (and yeah, I've been in management and leadership too, and removed these issues) to understand in the slightest what is actually being built or how it is being built. A lack of vision on behalf of the project is a surefire sign that the original estimates are already completely off, that the design and build-out of whatever it is will likely be wrong, and that a host of other issues will follow. Building software isn't building a bridge; it's more like painting, you decide as you go. There is no paint-by-numbers in software development.

Anyway, that's my list from the 50+ tech articles that were in my Pocket app. Maybe one day I can get disciplined enough to keep the list limited to really good reads, and I'll start putting together a "My Top Pocket Reads This Month" blog entry. That sounds like it could be useful. Until then, happy coding.

Docker Red Hat and Containerization Wreck Virtualization

Conversation has popped up around a few tweets from Alex Williams regarding virtualization at the Red Hat Summit.

Paraphrased, the discussion has been shaped around asking,

“Why has OS-level virtualization via containers (namely Docker) become such a massive hot topic?”

With that, it got me thinking about some of the shifts around containerization and virtualization at the OS level versus the hypervisor level. Here are some of my thoughts, which seemed to match more than a few thoughts at Red Hat.

  1. Virtualization at the hypervisor level is starting to die off at the application usage level, in favor of app deployment via OS-level virtualization. Virtualization at the OS level is dramatically more effective in almost every scenario where virtualization is used today for application development. I'm *kind of* seeing this; it's interesting that Red Hat is, I suppose, seeing it much more.
  2. Having a known and identified container, such as what Docker works with, provides dramatically improved speed and a better method for deployment than traditional hypervisor-based virtualization or pure OS-based deployment.

There are other points that have been brought up, but this also got me thinking on a slight side track: what are the big cloud providers doing now? Windows Azure, AWS, Rackspace, or GCE, you name it, they're all using slightly different implementations of virtualized environments. Not always ideally efficient, fast, or effective, but they're always working on them.

…well, random thoughts aside, I'm off to the next session and some hacking in between. Feel free to throw your thoughts into the fray on Twitter or in the comments below.

Breaking Up Again, OneNote and I Must Go Separate Ways

Ok, psychologically one is supposed to tell the good news last and the bad news first. Well, I'm doing that backwards with this article. First things first: all the awesome about Microsoft's OneNote app.

Microsoft OneNote

The cool thing is, after more than a few years, OneNote runs on most mobile and desktop systems. When I say most, what that equates to is: Windows, iOS and OS-X. Now, I wouldn’t doubt if it works on some other things that I’ve missed, but those are the places I know it works because that is where I’ve used the application the most.

OneNote does a number of things that are pretty cool. The first is simply looking pleasant and making it easy to add notes, images, sound, or other objects into any notebook in the app, which makes note taking extremely easy. There are also a lot of features around note history to move back and forward, play things back, and more. Interaction with the notes across all of the devices is pretty seamless, when the features are similar across the devices.

I actually really like the interfaces built specifically for the device that OneNote is running on. If I’m running the iOS iPhone App it is oriented to small screen touch and interactions of that sort. The iOS version is focused on creating notes, not on managing or organizing the actual notebooks and related structures.

On the iPad, the iOS app is oriented toward a larger workspace, with more navigation between notebooks and a little toward the management of those notebooks and their respective notes. The iPad version is a happy middle ground between the note-creating focus of the iPhone app and the full-blown OS-X and Windows desktop versions.

Speaking of that, it's been about 3+ years since I've used OneNote on Windows and about 2 years since I actually used Windows for anything relevant. So when OneNote was released on OS-X I was all over it. I'd always been a fan of the product, but it was limited since it only ran on Windows for the longest time. So when I switched off of Windows as my core operating system years ago, it went away. I had a list of top apps I lost when leaving Windows.

  1. LiveWriter for blogging, because it hooked up to all the blogs I wrote for at the time: WordPress, DotNetBlogEngine, and Blogger. It was hugely useful.
  2. ReSharper for Visual Studio. Note I did NOT say Visual Studio, but just ReSharper. I’ve got a lot of this power back via WebStorm and IntelliJ, but I still miss the robustness of refactoring options with ReSharper.
  3. OneNote with Office. Note, again, the specifics: I missed just OneNote, not Office. The Office suite, especially once I moved to non-Microsoft operating systems, was already useless to me. It was stuck in the 90s world of files and file systems; I'd already moved on to web options where the files were always where I needed them and versioned appropriately.

When I got down to this list, I assumed I could go ahead and switch. I did, I haven't regretted it for a moment, and I'll still tell anybody that's good at adapting to tools and finding the best one for the job: the grass is indeed greener off of Windows.

I sure was happy to get OneNote back, but as I used it I realized…

…and now OneNote dies to me again.

…that I'd moved on from the paradigm that OneNote has to offer. I use more than merely the iOS, OS-X, or Windows versions; I need an option to see and retrieve my information beyond those mediums. I needed to be able to use these tools while disconnected at times, which created a huge problem, as they're all tightly coupled to the SkyDrive-style service. In addition, if one uses any of the other iOS Office suite apps from Microsoft, those are also tied to SkyDrive, but one has to get a monthly account to use those.

Overall the OneNote app is elegant, nice, and works well, but the connectivity issues and the tight coupling to SkyDrive leave it removed from the other tools that I use to get work done. I suppose if one is a full-on fanboi using all the Microsoft tooling running on Windows, it likely has some integration with those tools. However, I use a wide variety of tools across more than one operating system. In the end it seemed like Microsoft was endeavoring to lock me into their online presence with the offer of free OneNote as a gateway to their Office product.

Although I've used it for two weeks now and made tons of notes in it, I'm going back to Evernote. The access is better, even if the apps are clunkier and not as pretty (I realize that's subjective). Overall I'd rather use OneNote as an interface to files I put in Dropbox or Google Drive or Evernote or something, but alas, it hasn't worked out. So if you're looking for a note-taking app, OneNote might still be for you, but if you want full across-the-board support on many platforms, Evernote is still the more capable option.

Sorry OneNote, but even though it was nice to have a second fling, we have to go our separate ways again. I guess it’s time to fire Evernote back up.

I’ve Officially Sent This Email Over 100 Times to Recruiters Looking for .NET Developers

Job Description

Here's the letter; it's kind of LOLz! I know it's tough to find .NET developers (or replace .NET with Java or X enterprise language), so CIOs, CTOs, and others take note. Here's what I experience and what I see all the time, embodied in a letter. I will put effort into hooking people up with good jobs that fit the person, and persons that fit the job, but lately I've seen companies that do .NET work in the Portland, Seattle, and especially San Francisco areas become exceedingly desperate for .NET developers. This is what my general response looks like.

“Hello Recruiter Looking for .NET Developer(s), thanks for reaching out to me, however I regret to inform you that I don’t know a single .NET Developer in Portland Oregon looking for work. It seems all the .NET Developers have either A: gone to work for Microsoft on Node.js Technologies, B: switched from being a .NET Developer to a Software Developer or otherwise C: left the field and don’t want to see any software ever again (which always makes me sad when people burn out, but alas, hopefully they find something they love). It’s a funny world we live in.

Even though I’m fairly well connected in Portland, Seattle, Vancouver (BC) and even San Francisco it is rare for me to meet someone who wants to do pure .NET Development. If there is I’ll connect them with you. However if you know a company that is porting away from .NET, building greenfield applications in Node.js, Ruby on Rails or other open source stacks I have a few software developers that might be interested.

Cheers”

Even though this letter is geared toward recruiters looking for coders, there is another letter that I’d like to write to a lot of other companies, that goes something like this,

“Dear Sir or Madam At X Corp Enterprise,

Please realize that lumping a person into the position you're requesting (.NET Developer) is a career-limiting maneuver for many in the occupation of software development. We software developers are people who solve problems; it just happens that we do this with code written on computers. The computers execute that code for us, resolving the problems that you face. This helps X Corp Enterprise do business better! It's a great relationship in many ways, but please don't limit our future careers by mislabeling us.

Also, we’re not resources. That’s just a scummy thing for a human to call another human. Thanks for reading this letter, have a great day at X Corp Enterprise!”

I’d be happy to refer .NETters (or Javaers or COBOLers or RPGers or whatever), but seriously, it seems to be a lost cause out there, even more so for mid-level or beginning developers. Barely a soul is looking for a job as a .NET Developer, but I know a few that look for jobs as software developers every couple of weeks.

Speaking of which, if you are looking for work and you want a filtered list of the cool companies and related information of who to work for in Seattle, Portland or elsewhere in Cascadia reach out to me and let me know who you are. I’m more than happy to help you filter through the mine field of companies and job listings. Cheers!


Write The Docs Bring Forth New Understanding From Documentarians @WriteTheDocs

The Write The Docs conference was started in 2013 around a simple idea: bring together those that write professionally in technical fields, creating documentation, educational papers, scientific research, or other work of this nature. Several people got together to organize it, thinking that even a moderate turnout of writers would be considered a success.

However, a moderate turnout of 50 or 60 it wasn't; it was a huge turnout of hundreds of people. Maybe I should say documentarians? Energy was high and individuals were ecstatic to meet others working in this oft-overlooked area of technical work. The turnout, and the solid, useful, valuable energy among everyone, was so great that the organizers pushed forward and decided to have two conferences this year: one in Europe, in the grand city of Budapest, and one back in the conference's birthplace of Portland, Oregon.

With the Budapest event over, I'm looking forward to getting a review of it from Troy Howard @thoward37 and to experiencing this year's Portland conference in person. If you're looking to go, be sure to get tickets soon; they'll very likely sell out.

Thoughts and Questions

This brings me to the culminating point of this blog entry, one that was ignited at the conference and inspired me to do more than just dwell alone in pondering what, when, how, and where to write documentation. I started wondering, and talking to people regularly, about things related to documentation.

When should I start writing documentation?

Sometimes it's a good idea to start writing documentation before any coding starts. It's a great idea to document the ideas, thoughts around workflow, and related matters pretty early. Writing down thoughts helps confirm understanding and ensures more concise communication. At least, that's what I've found, and I wouldn't be surprised if others experienced the same if they gave it a try in the earlier stages of a project.

How should I write documentation?

This is an interesting matter. Is it in story form, such as a user story or a fictional story of someone using the application that would prospectively be built? Should just the entities, domains, or other elements be written about? What should be written about these things at such an early stage? Isn't this all part of BDUF (Big Design Up Front, a horrible anti-pattern)?

Watching out for BDUF is critical to success; avoid it with energy and gusto. However, confirming the entities and domains of a project, and writing them down so others can more decisively understand what they are, is important and useful. Writing stories of the application can also be extraordinarily useful!

Many other questions come up, and I'd love to see material on practices and ideas others have had. This is one of the big reasons why I'll be attending Write the Docs. I hope to see you there; maybe we'll get some answers and ideas wrapped up into some documentarian solutions!

For some great photos of the Budapest event…