Author: David Hunt, Strategy Director, DWS Consultants


Executing Your E1 Upgrade Testing

In the first blog in this series I explained how the DWS Dimension Analyze service is used in the planning phase to produce extremely accurate forecasts of the upgrade effort required. It can also reduce your modified footprint by up to 75%, thereby reducing your upgrade effort and costs.

In the second blog I addressed the execution of your upgrade and explained how you can deliver your upgrades on time, in budget and with minimal defects by using the DWS Dimension Professional service.

In the third blog I addressed the challenges of planning the testing of your upgrade and showed you how you can minimize your testing effort by focusing on just what needs testing.

In this fourth blog in the series I will review some of the key challenges and considerations around managing and automating your JD Edwards EnterpriseOne functional testing without the need for expensive technical specialists. I will also introduce you to a new approach that will simplify your test management.

Research indicates that the testing component can take up to 65% of the overall effort of an E1 upgrade, so it is an area where any improvements could make a big difference. I mentioned in my previous blog that the number one issue inhibiting the adoption of a code current policy is that testing is difficult and time-consuming.

Key Challenges and Considerations

I will now drill a little deeper into some of the key challenges and considerations when it comes to the functional testing of your E1 upgrade. The key areas I will address are:

  • Identifying what needs testing
  • Creating up-to-date test scenarios
  • Resources
  • Executing and managing the tests
  • Agile vs. waterfall

Clearly, the first challenge is to identify just what needs testing and what does not. I covered that in the previous blog, where I introduced you to the new product from DWS that does exactly that, DWS Dimension Focus. That alone can reduce your total testing effort considerably, as some upgrades may only impact 5% of your standard and custom objects.

The next challenge relates to making sure your library of test scenarios or cases is up to date. This will be dependent on the analysis that identified what has changed and what has not.

You need to identify which programs add totally new functionality that offer opportunities to perform some business processes in a different and more efficient way. These new processes will require new test scenarios to be created.

For any programs that have been impacted by changes Oracle has made, you may need to modify existing test scenarios or, if you don't have an existing test script or model, create new ones. Even if you do have existing test scripts for your chosen testing solution, how easy is it to modify those tests? And how easy is it to verify the changes you have made to those scripts? After all, test scripts often require highly technical test programmers to develop, so the scripts themselves need testing too.

What resources are available to help create test scenarios? Are you using a testing tool that requires expensive and scarce specialist test engineers to develop the scripts required? Or are you relying on manual testing that requires many key end users to manually run through their business processes on the new version of E1? Either way, testing can become a resource bottleneck and requires careful planning.

When it comes to executing the tests, you need to focus your testing effort on the code that has been affected by the changes in the upgrade, as well as any code you have changed yourselves. You also need to consider how thoroughly each object needs testing. If the change is, for example, just the repositioning of a yes/no dialog box, the testing required will be very simple.

On the other hand, if the changes have a profound impact on a number of objects in a process, that process will require much more thorough testing. Again, the Dimension Focus tool we introduced you to in the previous blog provides you with all the information you need to help you plan your testing efforts.

If you are relying on manual testing and bugs are identified, these will need to go back to the development team to be fixed. Your developers will need to know exactly what the user did and what happened, or failed to happen, as expected. They will require screenshots to help them.

Once they have carried out the fix, this will need to be retested. This cycle may need to be repeated several times until the code is performing correctly. The problem with this approach is that each time it goes back to the users for retesting, it is taking up more of the users’ time, so they will probably be less diligent with the testing on each testing cycle. This could result in bugs creeping through to the live production environment with potentially serious consequences. According to Carnegie Mellon University it costs fifty times more to fix a problem once it has gone into production than to find it at the development stage.

This brings me to the classic issue of agile versus waterfall testing. In the waterfall model, no testing starts until all development activity is complete, so no usable software is produced until very late in the overall upgrade process. With an agile approach, software is tested as it becomes available in a usable form, so one bundle of objects can be tested while development is carried out on the next. You may recall we spoke about how to create workable bundles of objects when we discussed the execution of your upgrade in the second blog in this series.

What are your options?

The main affordable options that have been available until recently are the Oracle Application Testing Suite or manual testing. Larger companies with larger budgets may consider outsourcing to a testing services company or using the HP testing suite. There are a small number of other software test offerings on the market, but they tend to be prohibitively expensive for most JD Edwards users and typically still require the HP platform to store the scripts and results and manage the testing process.

Whichever solution you choose, you still are faced with the issue of identifying what you need to test and what you do not need to test. So, the Dimension Focus solution we introduced you to in the previous blog will be of enormous help to you, whatever your chosen testing solution.

There is clearly a need for a new solution

Fortunately, DWS has recently introduced a totally new approach to JD Edwards EnterpriseOne testing: a product called Dimension SwifTest. Unlike the alternatives, it has been designed specifically for use with E1. It is also easy for non-technical end users to create and manage their own testing without the need for expensive or scarce specialist test programmers, as no scripting knowledge is required. SwifTest can mimic anything a user can do in E1, but does so more quickly and without errors, so the quality and reliability of your testing will improve. It also addresses all the challenges and considerations discussed above.

Early trials indicate that, when used in conjunction with the Dimension Focus product, you could cut your total testing effort by up to 85%.

You can try it out with a 30-day, no-obligation free trial. Why not give it a try? We would love to receive your feedback.

In the final blog in this series I will address the issues involved in staying code current.

If you would like to register for any of the live webinars that support this series of blogs, or to view any of the event recordings, please use the registration links below.

Date — Title & Registration
1. January 28th — Planning your E1 Upgrade
2. February 11th — Executing your E1 Upgrade
3. February 24th — Planning your E1 Testing
4. March 10th — Automating your E1 Testing
5. March 23rd — Keeping your E1 Code Current
