A common question I’ve seen on StackOverflow asks for the best way to open a connection to MongoDB when starting up your Express app. Folks generally don’t care for putting all of the Express setup inside the callback of the MongoDB connect call, but that seems to be the commonly accepted approach. I didn’t like it either and felt that there must be a better way. Here’s what I came up with.

The Callbacks

You can’t really escape callbacks when dealing with the native MongoDB driver; pretty much every call expects one. The way I deal with that is with promises, using the Q library. Q goes beyond just providing a promise implementation: it also includes helper functions for wrapping existing Node.js APIs that follow the standard function(err, result) callback pattern.
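To illustrate, here’s a minimal sketch of Q.nfcall wrapping an ordinary callback-style API (fs.readFile is just a stand-in here):

var Q = require("q"),
  fs = require("fs");

// Q.nfcall invokes a callback-style function and hands back a promise instead.
Q.nfcall(fs.readFile, "settings.json", "utf8")
  .then(function (contents) {
    console.log(contents);
  })
  .fail(function (err) {
    console.error("Could not read the file:", err);
  })
  .done();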

Promises are a deep topic themselves, so I won’t go into them in detail here. Just know they they can help turn the callback “Pyramid of Doom” or “Callback Christmas Tree” into a chained series of function calls which greatly improves the readability of your code. Google can hook you up if you want to know more.
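As a rough before-and-after sketch, imagine this inside an Express route handler with db, slug, res, and next already in scope (the collection names come from the app.js later in this post; everything else is an assumption for illustration):

// Nested callbacks -- each step pushes the code one level deeper:
db.collection("Posts").findOne({ slug: slug }, function (err, post) {
  if (err) { return next(err); }
  db.collection("Stats").update({ postId: post._id }, { $inc: { views: 1 } }, function (err) {
    if (err) { return next(err); }
    res.render("post", { post: post });
  });
});

// The same flow as a flat chain of promises with a single error handler:
Q.ninvoke(db.collection("Posts"), "findOne", { slug: slug })
  .then(function (post) {
    return Q.ninvoke(db.collection("Stats"), "update",
      { postId: post._id }, { $inc: { views: 1 } })
      .then(function () {
        res.render("post", { post: post });
      });
  })
  .fail(next)
  .done();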

The Database object

The first step that made the most sense when I started using MongoDB in Node.js was to create my data access object. It’s used for creating the connection, holding the references to the collections used, and defining the methods that perform the specific actions against MongoDB.

So here’s what my Database object looks like:

var Q = require("Q"),
  MongoClient = require("mongodb").MongoClient,
  ObjectId = require("mongodb").ObjectID,
  Server = require("mongodb").Server,
  ReplSet = require("mongodb").ReplSet;
_ = require("underscore");

var Database = function (server, database) {
  this.server = server;
  this.database = database;
};

Database.prototype.connect = function (collections) {
  var self = this;
  var connectionString =
    "mongodb://" +
    this.server +
    "/" +
    this.database +
    "?replicaSet=cluster&readPreference=secondaryPreferred";
  return Q.nfcall(MongoClient.connect, connectionString).then(function (db) {
    _.each(collections, function (collection) {
      self[collection] = db.collection(collection);
    });

    return db;
  });
};

// "mydocs" works here because it was one of the collection names passed to connect()
Database.prototype.findDocs = function (term) {
  return this.mydocs.find({ Title: term }).stream();
};

Database.prototype.saveDoc = function (postData) {
  return Q.npost(this.mydocs, "update", [
    { id: postData.id },
    postData,
    { w: 1, upsert: true },
  ]);
};

module.exports = Database;

So what’s going on here? For the most part, nothing very exciting. The constructor takes in the server(s) and database we want to connect to. The first interesting part is the connect method: Q.nfcall is the Q helper that wraps MongoClient.connect and gives us back a promise. We chain a then() function, which is called once the connection to MongoDB is made. That function receives the connected db object, from which we save a reference to each collection we want to use in our app. We then return the db object from the function so we can keep passing it along. The end result of chaining those two functions is still a promise, which is returned to the caller.
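The practical effect of that _.each loop is that each collection becomes a property on the Database instance once the promise resolves. A minimal sketch of that in isolation (the server, database, and collection names here are assumptions):

var db = new Database("localhost:27017", "blog");

db.connect(["Posts"])
  .then(function () {
    // db.Posts is now the native driver's collection object
    return Q.ninvoke(db.Posts, "findOne", { Title: "Promises" });
  })
  .then(function (doc) {
    console.log(doc);
  })
  .done();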

Just to show a little more detail, you can also see the Q library in use in saveDoc, where Q.npost performs an upsert when we want to save a new document. Again, a promise is returned, which means we don’t need to use a callback. The findDocs method also shows that the driver’s find function can return a stream instead of taking a callback. I hope that feature spreads to more APIs!
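Here’s a rough sketch of what consuming those two methods looks like (the document shape beyond Title and id is an assumption):

// findDocs returns a stream, so results can be handled as they arrive:
database.findDocs("mongodb")
  .on("data", function (doc) { console.log(doc.Title); })
  .on("error", function (err) { console.error(err); })
  .on("end", function () { console.log("No more matching documents"); });

// saveDoc returns a promise from Q.npost, so no callback is needed:
database.saveDoc({ id: 42, Title: "mongodb", Body: "..." })
  .then(function () { console.log("Saved"); })
  .fail(function (err) { console.error(err); })
  .done();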

The Express App Configuration

For the Express configuration, I decided to keep most of it wrapped in a function. Much of it could be pulled out and run before we initialize the database, but I personally like having it wrapped up.

So here’s what our app.js looks like:

// Module paths here are assumed; adjust them to match your project layout.
var express = require("express"),
  http = require("http"),
  Database = require("./database"),
  routes = require("./routes"),
  settings = require("./settings");

var app = express();
var database = new Database(settings.databaseServers, settings.database);

function startServer(db) {
  app.set("port", process.env.PORT || 3000);
  app.set("views", __dirname + "/views");
  // The rest of the setup is excluded for brevity...

  console.log("Connected to the database");

  app.locals.database = database;
  routes.registerRoutes(app);

  http.createServer(app).listen(app.get("port"), function onServerListen() {
    console.log("Express server listening on port " + app.get("port"));
  });
}

database.connect(["Posts", "Stats", "Log"]).then(startServer);

The code that really starts things off is at the bottom. We call connect on our Database, passing in the array of collections we want. Since connect returns a promise, we can tack on another function using then(), which also receives our connected db object. In this case, it’s our startServer function, which configures Express and starts our server listening.
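If a failed connection should stop the app from starting at all, a fail handler can be tacked onto that same chain. A minimal sketch:

database.connect(["Posts", "Stats", "Log"])
  .then(startServer)
  .fail(function (err) {
    console.error("Failed to connect to MongoDB:", err);
    process.exit(1);
  })
  .done();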

Accessing the Database in your Routes

In our app.js snippet, something I do inside startServer is attach the database to app.locals. I’m not sure if this is the best approach, but it has been working for me so far. Now in my routes, I can access the database using req.app.locals.database. It could also be passed into the registerRoutes function and handed around from there. For my blog, instead of accessing the database directly through the req.app.locals reference, I have another layer that resembles the Repository pattern. For simpler apps I’ve been fine with the direct reference approach.
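As a rough sketch of that direct approach (the route path, query parameter, and view name are assumptions, but registerRoutes matches the app.js above):

exports.registerRoutes = function (app) {
  app.get("/search", function (req, res, next) {
    var database = req.app.locals.database;
    var results = [];

    database.findDocs(req.query.q)
      .on("data", function (doc) { results.push(doc); })
      .on("error", next)
      .on("end", function () {
        res.render("search", { results: results });
      });
  });
};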

Can it be better?

Like most of the code we write, it looks pretty good today, and much better than how we did it last year. I’m not sure if there’s a better way, with “better” meaning simpler, more scalable, or something I just don’t know about. If you know of or use a better approach, I’d love to hear about it!