Why hasn't the situation improved since then? It really looks like, seven years later, we are still dealing with exactly the same problems.
In my case, I've been working with data federation tools to address the first problem: data fragmentation. With tools like Composite or Denodo you can quickly and flexibly connect remote data sources without having to build big, complex data warehouses, but the success of such solutions has been limited so far.
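To make the federation idea concrete, here is a minimal sketch: a single SQL query that joins two independent data sources on demand, instead of first copying everything into a central warehouse. SQLite's ATTACH stands in for the remote connectors a tool like Denodo or Composite would provide; all table and column names are hypothetical, invented for illustration.

```python
import sqlite3

# First "source": a sales database (in-memory stand-in for a remote system).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# Second, separate "source": a CRM database attached under its own schema name.
con.execute("ATTACH DATABASE ':memory:' AS crm")
con.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
con.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

# One federated-style query across both sources, evaluated in real time --
# no intermediate warehouse holding a copy of the data.
rows = con.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o
    JOIN crm.customers c ON c.id = o.customer_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

print(rows)  # → [('Acme', 150.0), ('Globex', 75.0)]
```

Real federation tools do essentially this at enterprise scale: they expose many heterogeneous remote systems behind one virtual schema and push the joins and aggregations down to the sources where possible.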
To me the question is: why? If you analyse the three problems, I think you can see ways of at least improving them today. Why are we not using the possible solutions at hand?
My view is that the root cause is resistance to change. DB administrators and IT people have been doing these things the same way for ages. They know how to build a DW, but some of these tools are new to them.
In Spanish popular culture we have a saying: "Better the bad you know than the good yet to be known" ("Más vale lo malo conocido que lo bueno por conocer"). This is at the core of our cultural values: do not try new things, even if they are better. Stick to the known, even if it's clearly bad.
I've been fighting a crusade against data warehouses for ages, since the first time I heard about the concept long ago, mainly because I see them as a duplication of resources and databases just to provide some reports. Instead of creating those big monsters, we should rationalize the solutions and the data at the sources, then connect directly to those sources and provide the reports in real time. I know this may not be realistic in many companies today, but be careful when weighing the cost of a DW against its value.
So, bet on the people who are happy to change tools and ways of working. A better, unknown option may be out there (it probably already is), but it is very easy to point at possible problems to justify keeping current solutions, and consequently current problems.
Credits: I found the video referenced in this post via Enrique Dans.