In a recent blog post entitled Is Logical Data Modeling Dead?, Karen Lopez comments on the trends in the data modeling discipline and shares her own processes and preferences for logical data modeling (LDM). Her key point is that LDMs are on the decline primarily because they (and their creators) have failed to adapt to changing development processes and trends.
I love all things data modeling. I have always found data models to be a soothing and reassuring roadmap underpinning the Dev team's requirements analysis and spec writing, as well as a supremely informative artifact of the Dev process that I would constantly refer to when writing new T-SQL code and performing maintenance. As time has passed, however, I have been surprised by how far the practice has fallen out of favor.
Karen’s Remedy
One of the things that impressed me about Karen’s article is that she comes forward with solutions for data architects to deal with the problem. In contrast, many pundits are quick to throw out warnings and dire predictions (maybe as click-bait?), but don’t bother with the solution. In Karen’s case, the first element of the solution is to ensure that LDMs are used appropriately to the project, as guidance to the Dev team rather than a bludgeon to keep them in line with the ideal. The second element of the solution is to apply Agile data modeling workflows to ensure the smooth development of data models in sync with other Agile development processes.
If you deal with data architecture and you’re finding strong resistance from the rest of the Dev team, I strongly encourage you to read her post.
Where Do LDMs Succeed or Fail?
Let’s say our project is approaching its launch date and we’re doing a retrospective. (I prefer the term “post-mortem,” even though no one has died in the process – we hope.) I’ve found across many Dev projects that the Devs and DBAs are universally happy to have an LDM, and a physical data model (PDM) as well, only when it serves as a useful guide and is delivered in time to be an intimate part of the Dev process. In cases like that, they’ll happily make room for and even defer to the DM. But the DMs have to be built using the same approach as the app itself: if the Dev team is Agile, then the data modeling must be Agile.
Application development with an LDM has many benefits, but the one I like most is that it reinforces a data-driven approach to application development and educates team members about the inherent need to pay close attention to data within the application. When new members join the team, the DM reinforces the cohesion of data as a focal point of the application. When old hands on the team perform maintenance, the DMs are essential reference material along with the code comments and source control notes.
The secret ingredient is what Karen calls “the sweet spot.” Where good data models are in place, I’ve never seen a Dev or DBA who did NOT want a copy hanging on their wall. Yet I’ve frequently found that DMs are regarded as a form of documentation for what is, rather than a set of guidelines for what will be. In these IT shops, they’re also more commonly associated with apps on a heavy-duty development roadmap; if it’s a quick-and-dirty app, you can forget about DMs. In other words, they act as documentation, not design.
When people say, “data modeling is dead,” they usually mean, “logical data modeling is dead to those of us using the most modern techniques.” Typically, that means the Dev team uses Agile, while the data architect is still doing things waterfall-style. I think it’s already too late in many shops. Unless the Dev team (magically?) figures out a way to perform logical data modeling within the Agile framework, it is done for. That mission of making data modeling Agile is, by itself, worthy of a whole host of blog posts, DAMA presentations, and .NET framework add-in modules. 🙂
What’s Behind the Success or Failure of Data Modeling Efforts?
But what is driving these pressures of contradictory development methodologies and differing uses for LDMs? Three things:
- IT is bad at project management.
- IT is bad at aligning to and communicating with the business stakeholders.
- IT is bad with data.
Let me explain.
Bad Project Management
We have ample empirical evidence that IT projects trample on their deadlines and run over budget as a matter of course. For projects of any measurable size, it is actually normal to miss major development KPIs. After all, IT management is almost always composed of IT specialists, not project management specialists. And when Dev managers know they’re going to miss important deadlines, they’re most likely to cut time from the non-Dev functions (design, QA, documentation, and the like).
And once a Dev leader has a bad experience with LDM, they take it off of their list of project to-do’s forever. Add that to the tyranny of looming deadlines and, when given the chance, project management will often willingly choose 100 days of additional maintenance after an app has shipped over adding one week to the delivery schedule before development begins.
Bad Alignment and Communications with Stakeholders
IT project teams typically have to communicate across silos within the IT team and with one or more business stakeholders who always seem to have wanted the app done yesterday. That creates a lot of nervousness. Middle management censors bad news when communicating upward and, instinctively, pushes hard against those elements of a project which may provoke bad news. Hence, there is a self-fulfilling erosion in the support for data modeling efforts.
It’s the old axiom of the Iron Triangle of Project Management. You want your new application to be Fast, Cheap, and Good. But corporate pressures only let you have two of the three. Everybody, it seems, chooses Fast and Cheap. And in those cases where a data modeling effort is enforced within an application development project, it is usually done so by decree from on high. That means members of the IT team are now at cross-purposes, serving different masters, and fostering a sense of “us versus them” within a Dev team.
Bad with Data
A few years ago, when Big Data was first making the headlines, I laughed off those declarations of the Next Big Thing with, “How do these companies expect to do Big Data well, when they do not do ANY-sized data well?” I mean, think about it for a minute. These organizations don’t have full-time DBAs to manage their data. Their Devs don’t have even a modicum of understanding about databases. And they definitely don’t know the architectural trade-offs between relational platforms and NoSQL, along with many other data-related concepts.
Sure enough, we see as many headlines today with titles like Where Big Data Projects Fail and 8 Reasons Big Data Projects Fail. There are some exceptions to this broad statement, particularly organizations which are essentially built out of their data itself (think Facebook, Uber, Google’s search operations, Stack Exchange, financial trading outfits, etc.). But when I set foot inside organizations where data is not their bread-winning business, I can pretty much assure you that 80-90% of their Devs don’t understand why NULLs need special consideration and their DBAs can’t explain a CHECKPOINT.
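If you're wondering why NULLs deserve that special consideration, here's a minimal sketch of the trap, shown in Python with the stdlib sqlite3 module (the table and column names are hypothetical; the three-valued logic behaves the same way in T-SQL under default ANSI_NULLS settings):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, bonus INTEGER)")
cur.executemany("INSERT INTO employees (id, bonus) VALUES (?, ?)",
                [(1, 500), (2, 0), (3, None)])  # id 3 has a NULL bonus

# NULL is neither equal nor unequal to anything, so row 3
# silently falls out of BOTH of these result sets.
eq = cur.execute("SELECT id FROM employees WHERE bonus = 500").fetchall()
ne = cur.execute("SELECT id FROM employees WHERE bonus <> 500").fetchall()

# COUNT(column) skips NULLs, while COUNT(*) counts every row.
counts = cur.execute("SELECT COUNT(*), COUNT(bonus) FROM employees").fetchone()

# The only reliable test for a missing value is IS NULL.
missing = cur.execute("SELECT id FROM employees WHERE bonus IS NULL").fetchall()
```

A Dev who doesn't know this writes a `WHERE bonus <> 500` filter expecting it to catch every non-500 row, and quietly loses the NULL rows – exactly the kind of bug a good data model (with explicit nullability decisions) helps prevent.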
In the same way organizations fail at data in general, I think we can safely say, “How can an Agile app dev team do data models well, when they don’t even do ANY kind of data (or app modeling) well?” In fact, much of the frenzy I’ve seen in the developer community around NoSQL platforms is due to the fact that you don’t have to define your database schema before you start developing. CHARGE! But, as in so many cases, just because you can doesn’t mean you should. A lot of times you’re just trading work now for work later.
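To make that "work now for work later" trade concrete, here's a small hypothetical sketch (plain Python dicts standing in for documents in a schema-less store): without an up-front model, three app versions write three different shapes of the same entity, and every future reader inherits the reconciliation work.

```python
# Three "customer" documents written by different app versions over time.
# No schema was ever agreed on, so the field names drifted.
docs = [
    {"name": "Ada", "phone": "555-0100"},
    {"full_name": "Grace", "phone_number": "555-0101"},  # v2 renamed both fields
    {"name": "Edsger"},                                  # phone never captured
]

def get_phone(doc):
    # The deferred schema work: every consumer of this data must
    # now carry reconciliation logic for all historical shapes.
    return doc.get("phone") or doc.get("phone_number")

phones = [get_phone(d) for d in docs]
```

The database never complained, so nothing failed at write time; the cost simply moved downstream into every query, report, and migration that touches the data.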
What is Your Organization Doing?
So how do things work within the IT teams of your organization? What is the priority of planning and assessment before leaping into cranking out the code? Do logical data models matter in your organization? I’d love to hear your feedback!
Cheers,
-Kev
Connect with me online! Facebook | Twitter | LinkedIn | Blog | SlideShare | YouTube | Google Author
Kevin (@kekline) serves as Principal Program Manager at SentryOne. He is a founder and former president of PASS and the author of popular IT books like SQL in a Nutshell.
Kevin is a renowned database expert, software industry veteran, Microsoft SQL Server MVP, and long-time blogger at SentryOne. As a noted leader in the SQL Server community, Kevin blogs about Microsoft Data Platform features and best practices, SQL Server trends, and professional development for data professionals.