The Most Important Thing
An Insight Requested Decades Ago Has Graduated to the Stature of Wisdom
Many moons ago when I was a young Chief Technology Officer, I was asked by a board member, who was also a professor, to speak to a group of public administration graduate students about the role of senior information technology managers in non-profit organizations. For the discussion, she had paired me with an older CIO/CTO from a large regional hospital, thus giving the students two truly different perspectives: one from a mature, large, and well-funded enterprise, the other from a young, growing, scrappy, and perpetually funding-challenged human service agency.
At the end of the presentations, the floor was opened for questions from the students. I am certain that most were fairly easy to answer and just as easily forgotten, as they certainly have been. One, however, caught me off guard and required some quick thinking. I don’t recall the student’s precise wording, but it was along these lines: “What skill is the most important for a successful CIO/CTO to possess?” It was a great question, I hadn’t anticipated it, and for a few tense moments I feared that I wouldn’t have a good answer.
Fortunately, be it from some type of divine intervention, precisely aligned stars, or perhaps plain dumb luck, the right answer popped into my head and it has survived the test of time. I answered that, first and foremost, a top CIO/CTO needed to know what data was most critical to the long-term success of the organization. Once this was understood, the other aspects of the job would become manageable.
An app mentality has set in at many organizations: they believe it is in the best interests of the organization to view IT (Information Technology) needs through the prism of “What app do I buy to solve this problem?” I am not saying that there is anything wrong with buying “off the shelf” software, especially as I operate a SaaS (Software as a Service) company and realize that most organizations cannot afford an internal software development function. The problem that arises from poor software acquisition processes is that organizational success depends hugely on having the right information, in the right format, delivered to the right people, at the right time. And the first part, the right information, is the most important part. But software vendors, at best, misrepresent and, at worst, lie about the capabilities of their products and services, and the financial decision-makers are all too often easily attracted to a low price. And, in all but the most basic of situations, data requirements in the larger context of an organization are far more nuanced and complex than many decision-makers recognize. The huge number of failed implementation projects is strong evidence of this phenomenon.
Interestingly, the importance of getting the “data right” is true at both the macro level, as discussed above, and the micro level. With the tsunami of different Internet services being offered today, there has been strong interest in agile development practices, which, for the most part, is a very good thing. The one area that I hear about in this regard that I question, though, is making data design secondary to application functionality. I am always nervous when I encounter development frameworks and philosophies in which data elements are first declared in the application and the framework has the responsibility for making the database work. For lightweight apps, this is fine. For anything that is mission-critical and will potentially last for decades, however, you’re setting yourself up for failure by adopting this strategy. Good data architecture is pretty permanent while applications and frameworks come and go. Good data design has the added benefit of making application development easier, as well.
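As a concrete illustration of this database-first approach, the sketch below puts the rules about the data into the schema itself, where they outlast any one application or framework. The table and column names here are invented for the example, not taken from any real system.

```sql
-- Hypothetical schema sketch: the integrity rules live in the database,
-- not in whichever framework happens to be current.
CREATE TABLE client (
    client_id   bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    full_name   text NOT NULL,
    email       text UNIQUE,
    enrolled_on date NOT NULL DEFAULT CURRENT_DATE
);

CREATE TABLE service_visit (
    visit_id  bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    client_id bigint NOT NULL REFERENCES client (client_id),
    visit_on  date   NOT NULL,
    CHECK (visit_on >= DATE '2000-01-01')  -- reject obviously bad dates
);
```

Any application written against these tables, in any framework and in any decade, gets the same guarantees: no visit without a client, no client without a name, no duplicate email addresses.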
My first mission-critical development project has had five “lives,” so to speak. The first was as a set of SQL scripts. The second packaged those scripts into a client-server application. Next came a Web-based version of the application using very early and primitive tools. The fourth moved the application to a good Web framework. And, the fifth moved the application off of its Oracle database to a PostgreSQL database. All the while, the basic data model stayed intact and lives on, still serving the same mission-critical need.
Thus, whenever I start on a new project, I begin by thinking through, as carefully as I can, the data architecture. I try to anticipate how people will use the data and how best to present it to them as I am architecting, so that I can get a number of the “niceties” designed in from the get-go. But the overriding focus is getting the fundamentals correct. The thought process is critical in helping you understand the “problem” as fully as possible, so that building the solution is as easy and successful as possible. Once the database architecture is set, I actually use a Python application I developed to automatically “write” the basic application source files around the architecture implemented in an actual database.
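My Python generator itself is not shown here, but as a rough sketch, a tool like it can start from the database’s own catalog. In PostgreSQL, a query against the standard information_schema views reads the architecture back out exactly as implemented:

```sql
-- One way a code generator can discover the implemented architecture:
-- enumerate every table and column in the public schema, in order.
SELECT table_name,
       column_name,
       data_type,
       is_nullable
FROM   information_schema.columns
WHERE  table_schema = 'public'
ORDER  BY table_name, ordinal_position;
```

Because the generator reads the real schema rather than a parallel description of it, the generated source files cannot drift out of sync with the database.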
PostgreSQL has long been a critical component for me in using this approach to application development. Again, with the explosion in Internet services, a plethora of database systems with varying approaches have appeared to meet many different perceived needs. When it comes to databases, though, I tend to draw a parallel with the maxim that “I want my banker to be conservative.” The PostgreSQL project, from the very beginning, has been primarily focused on always maintaining the integrity of the data stored in its databases. For enterprise data, “Atomicity, Consistency, Isolation, Durability,” or ACID, is for me more important than speed, convenience, or leading edge features.
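The classic illustration of why ACID matters is a transfer between two rows; the account table below is invented for the example. Either both updates become permanent or neither does:

```sql
-- Atomicity in practice: the two updates form one indivisible unit.
BEGIN;
UPDATE account SET balance = balance - 100 WHERE account_id = 1;
UPDATE account SET balance = balance + 100 WHERE account_id = 2;
COMMIT;  -- any error before this point rolls the whole unit back
```

A crash, a constraint violation, or an explicit ROLLBACK between BEGIN and COMMIT leaves the data exactly as it was, which is precisely the conservatism I want from my banker.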
And, the great news is that with PostgreSQL, not only can you get ACID, but you also get speed, convenience, and leading edge features. Maybe not the absolute fastest, maybe not the absolutely most convenient, and maybe not the absolute bleeding edge, but certainly “excellence” in all of those factors. It has literally been more than two decades since a PostgreSQL shortcoming has been a limiting factor for me in any application I’ve been developing.
Forthcoming articles will explore how all of this works in more detail. I’ll write about how SQL, despite its detractors, is a marvelous and beautiful language. How PostgreSQL’s highly standards-compliant implementation can be used to develop databases that enforce data integrity all the way up your application stack. How a tool like PL/pgSQL, which integrates tightly with the database at a low level, adds to the data protections available to you in addition to making some application functionality brilliantly easier.
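As a small taste of what the PL/pgSQL article will cover, here is a sketch of an audit trigger; all table, column, and function names are hypothetical. Because the logic lives in the database itself, it runs no matter which application, script, or framework performs the write:

```sql
-- Hypothetical PL/pgSQL trigger: record every balance change centrally.
CREATE FUNCTION log_balance_change() RETURNS trigger AS $$
BEGIN
    IF NEW.balance <> OLD.balance THEN
        INSERT INTO balance_audit (account_id, old_balance, new_balance, changed_at)
        VALUES (OLD.account_id, OLD.balance, NEW.balance, now());
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER balance_audit_trg
    AFTER UPDATE ON account
    FOR EACH ROW EXECUTE FUNCTION log_balance_change();
```

No application-side audit code can be forgotten or bypassed; the protection travels with the data.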
I will be sharing our experiences with the hope of helping you in your development efforts. Please feel free to share your thoughts with us through the Contact mechanism below. Our goal is to be helpful in a direct and practical manner with this series.