Companion piece to the article "L'avenir du DBA" from the SQL Server newsletter Vol.1 Num.3 - June 2002 (not yet online)
Can Generalists Handle Complex IT?
I’ve been working with SQL Server since the OS/2 days. Back then, it was easy being a SQL Server expert. Learn a few T-SQL tricks, memorize a few arcane stored procedures, and learn the relatively simple index selection and join-processing strategies, and you’re a query-tuning genius. Things certainly have changed. A few weeks ago, I advised you to start learning Extensible Markup Language (XML) unless you wanted to be obsolete in our brave new Internet economy, but XML is just the beginning. Today, SQL Server experts need to be boning up on a huge number of topics. I won’t try to name them all now, but a few obvious subjects come to mind:
- VB scripting
- Multidimensional Expressions (MDX)
- OLAP databases and star-schema design
- Internet integration
- New Windows 2000 directory service integration issues
And those are just the technology changes. Moving forward, SQL Server experts will need to understand the latest XYZ application package craze sweeping their company. What are XYZ applications? You know: CRM, ERP, EAI, SFA, etc. I could go on, but whining about how much we have to learn isn’t my real point. One nagging concern about complexity and too much new stuff worries me from time to time.
Look at the solutions we’re building today. In the not-so-distant past, advanced, multithreaded, n-tiered, distributed applications were considered complex and probably beyond the realm of the typical corporate developer. Today everyone with a Microsoft Certified Professional (MCP) test under his belt and a Microsoft Developer Network (MSDN) subscription on the shelf is building this type of application. Oh yes, and we want these applications to be available 24 x 7, have an availability factor with a few decimal places of 9s, and support a huge number of concurrent users.
Have all the developers in the world suddenly become faster than a speeding bullet and able to leap tall buildings in a single bound, or have the tools simply become that much more powerful and easy to use? I think you know the answer. But better tools don’t necessarily mean that the problems we’re trying to solve are any less complex. The problems are still hard. We’ve simply added layers of smart tools and middleware so we don’t have to reinvent the same wheel over and over again. In fact, we’ve potentially introduced new problems by relying on relatively immature middleware that few people know how to tune and troubleshoot. We’re solving bigger and bigger problems, but the solutions are becoming more and more complex and fragile.
All that said, here’s my nagging concern. Sometimes I worry that the solutions we’re trying to build require such a high degree of specialization that people doing the job don’t always have an adequate level of expertise to solve the problem properly. I do a lot of troubleshooting and tuning for SQL Server systems. I regularly see my clients and colleagues, whom I consider outstanding IT professionals, make simple mistakes. The mistakes are simple for me to find and fix because I spend a lot of time keeping up on the latest tips and tricks associated with database application tuning. The mistakes aren’t always simple for the original developers to avoid because they need to keep on top of so many things as they’re working in complex distributed and middleware-oriented architectures.
Back in the days of "Little House on the Prairie," Doc Baker did a good job of handling the medical needs of Walnut Grove. He was a great doctor for the time and knowledge available, but today I’d want to see a top specialist if I needed brain surgery. More and more, what’s considered a commonplace solution is the IT equivalent of brain surgery, but more often than not, the IT equivalent of general practitioners still do the work. They might be amazingly skilled and talented doctors, but they’re still generalists rather than specialists. The tools are getting better, but they’re not THAT good yet. I don’t know whether this is a serious problem, and even if it is, I don’t have a good answer. But still, it worries me now and then.
