A while back, I responded to a post on an ASUG forum asking how others handle refreshing their development client and data for development and testing. I couldn’t find the original post, but a recent CIO Insight article reminded me of my reply. The question was prompted by a claim from the poster’s consultant that it is a best practice (the misuse of this term really bothers me) to refresh the development client from production regularly. My response: it is not a best practice to refresh the development client. In fact, it is harmful, because you lose change transport links, modification data, and version history (there are ways to restore the modification data, but not the version history). My other comment was: why would you even want to do this? A production environment for a large company can easily run upwards of 900 GB. I can’t imagine copying that amount of data, yet some sites do it regularly. Does it make sense? You weigh in.

What prompted me to write this post was an article in CIO Insight titled “Real Data Rampant During Development.” The study behind it shows that by using real data in the development and testing process, customers expose themselves to any number of data breaches. I understand the reasons for copying production data for use in development and testing environments, but in my opinion this is overkill and can actually lead to quality problems. Let me explain.

Why I think it’s overkill – As I mentioned, copying 900+ GB of data from a production client takes additional disk capacity. But also consider what it takes to make this happen: a production client snapshot and a recovery process. That involves my NetWeaver administrator and also affects anyone who wants to use the system while the recovery runs. Furthermore, when I perform the recovery, any interim work in my destination client is wiped clean. All of this costs money and resource time. Lastly, is the time investment really worth it? In the long run, how many times will all this data actually be used?

Why it can lead to quality problems – In discussions with developers and analysts, I have heard the argument several times that having real production data is the only way to ensure their application works. Really? My argument is that if the application was designed properly and the requirements/specifications are documented, there is no reason to require real production data. Any nuances that could show up in that data should already have been taken into account during the specification and design process and then incorporated into a test plan. Is making this data available worth it, only to compensate for a problem in the development process? I have heard more than once that something “worked OK in development” or “was tested against production data,” but I say this is a cop-out. As an analyst, after analyzing how the system works and/or doesn’t work, you need specifically designed tests to exercise that behavior. If a specific example doesn’t happen to exist in a productive data set and the test is therefore never performed, there is no way to ensure the quality of the application. A side issue is the fact that creating the needed data is part of an integration test, which covers a whole scenario or process, as opposed to a unit test, which covers an isolated application or function.
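To make the point concrete, here is a minimal sketch (in Python, with an entirely hypothetical pricing function and made-up values, not anything from an actual SAP system) of what specification-derived test data looks like: each case comes from the spec’s stated boundaries, not from harvested production records.

```python
def net_price(list_price, discount_pct):
    """Hypothetical pricing rule from a written specification:
    discounts must be between 0 and 100 percent."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount out of range")
    return round(list_price * (1 - discount_pct / 100), 2)

# Each case is designed from the specification's boundaries,
# not pulled from a production copy: no discount, full discount,
# and a typical mid-range value.
cases = [
    (100.00, 0, 100.00),   # lower boundary: no discount
    (100.00, 100, 0.00),   # upper boundary: free item
    (59.99, 15, 50.99),    # typical mid-range value
]

for list_price, pct, expected in cases:
    assert net_price(list_price, pct) == expected
```

Note that the out-of-range case (say, a 150% discount) may never appear in any production data set, yet the spec says it must be rejected; a designed test covers it, a production copy may not.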

I would like to hear what you think. Is copying productive data justified, and is it worth the effort? Is the payback worth the potential data breach by anyone who has access to a development or QA environment?

