The database I work with at my company exposes the same data to many external applications, so I need to transform that data into many different views. A former database developer implemented the common transformations as long chains of view/function/procedure calls: a stored procedure calls a function, the function queries a view, that view is built on another view, and so on. I suspect this architecture and these long call chains are a performance problem, at least if the query optimizer cannot resolve them (please confirm my guess). Is this a good approach? Is it a cause of performance degradation? If so, how should I reimplement the database objects? At the moment I see these steps:
- analysis of the source data structure (our own data)
- analysis of all external systems (what formats the database has to provide)
- separate views, functions, and stored procedures for every external subsystem (I have to avoid long chains and database objects shared across many subsystems, if they are the cause of the problem)
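The restructuring in the last step could be pictured like this (a minimal sketch using SQLite via Python purely for illustration; your system is presumably SQL Server, and the table and view names here are made up): replace a shared chain of generic views with one flat, dedicated view per external subsystem, defined directly over the base tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical base table shared by all external subsystems.
cur.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL, status TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, "acme", 100.0, "open"),
    (2, "acme", 250.0, "closed"),
    (3, "globex", 75.0, "open"),
])

# Old style: a chain of generic views layered on one another.
cur.execute("CREATE VIEW v_base AS SELECT id, customer, amount, status FROM orders")
cur.execute("CREATE VIEW v_open AS SELECT * FROM v_base WHERE status = 'open'")
cur.execute("CREATE VIEW v_open_totals AS "
            "SELECT customer, SUM(amount) AS total FROM v_open GROUP BY customer")

# New style: one flat view per external subsystem, built directly on the base table.
cur.execute("CREATE VIEW v_billing_export AS "
            "SELECT customer, SUM(amount) AS total "
            "FROM orders WHERE status = 'open' GROUP BY customer")

old = cur.execute("SELECT * FROM v_open_totals ORDER BY customer").fetchall()
new = cur.execute("SELECT * FROM v_billing_export ORDER BY customer").fetchall()
print(old == new)  # the flat view must return exactly what the chain returned
```

The point of the exercise is that the flattened view is equivalent but easier for both humans and the optimizer to reason about, and each subsystem's view can now change independently.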
Is it a cause of performance degradation? Nobody can really tell until you have measured it. From your description it is certainly possible, but before I'd get going on those tasks I'd make sure this is really the case. I'd also weigh in that reimplementing an existing setup doesn't come for free.
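On SQL Server you can measure with `SET STATISTICS TIME ON` and the actual execution plans; a common finding is that chains routed through scalar user-defined functions hurt most, because the optimizer treats the function as an opaque per-row call. The effect can be sketched with SQLite via Python (a stand-in for illustration only, not your actual engine): the same computation as a per-row UDF versus an inline expression.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x REAL)")
conn.executemany("INSERT INTO t VALUES (?)", [(float(i),) for i in range(200_000)])

# A scalar function the engine must invoke once per row, opaque to the
# optimizer -- standing in for a chain that pushes every value through a UDF.
conn.create_function("taxed", 1, lambda x: x * 1.2)

def timed(sql):
    """Run a query and return (rows, elapsed seconds)."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    return rows, time.perf_counter() - start

udf_rows, udf_secs = timed("SELECT SUM(taxed(x)) FROM t")
inline_rows, inline_secs = timed("SELECT SUM(x * 1.2) FROM t")

print(f"per-row UDF: {udf_secs:.4f}s, inline expression: {inline_secs:.4f}s")
```

Both queries return the same total; only the measured time tells you whether the indirection actually costs anything in your workload, which is exactly why measurement has to come before the rewrite.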
As for the steps: I'd also put test cases in place that make sure the external applications get exactly the same data after every change you make.
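Such a regression check can be as simple as comparing the full, deterministically ordered output of the old object against the new one. A minimal sketch (again using SQLite via Python for portability; the view names `v_export_old`/`v_export_new` are hypothetical):

```python
import sqlite3

def rows(conn, view_name, order_by):
    """Fetch the complete, deterministically ordered result of a view."""
    return conn.execute(f"SELECT * FROM {view_name} ORDER BY {order_by}").fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "acme"), (2, "globex")])

# Old and new implementations of the same external-facing interface.
conn.execute("CREATE VIEW v_export_old AS SELECT id, name FROM customers")
conn.execute("CREATE VIEW v_export_new AS "
             "SELECT id, name FROM customers WHERE id IS NOT NULL")

before = rows(conn, "v_export_old", "id")
after = rows(conn, "v_export_new", "id")
assert before == after, "refactored view changed the data an external application sees"
print("regression check passed")
```

Run this comparison for every external-facing object before and after each refactoring step, ideally against a realistic copy of production data, so a behavioral change is caught before any external application sees it.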
Lastly, I'd also get the commitment of management before you start. It wouldn't be the first time someone tries to pin the blame on you when something involving the database doesn't work, even if you weren't involved (and even if it is only marginally related to the database), just because someone heard that "this guy" is "changing the system".