Business Solutions with SAP

SAP Reimplementation Method Key Considerations

January 30th, 2012
Among the variations of SAP software re-implementation there are two key approaches.  You either make the changes to your existing production system (or a cloned copy of it), or you make the changes in a pristine, newly designed environment.

SAP Cloned Production System “Re”-Implementations

Making changes in your existing production system (more likely in a cloned instance of it) helps ensure data consistency and ease of adjustment; however, there are several difficulties involved.  If you have a significant number of custom-coded solutions, you will have to fight them every step of the way.  You will have to work around them throughout the process, adjust any out-of-date coding, and most likely end up keeping many of them.  As you can tell, I’m not a big fan of an SAP reimplementation in a production system with lots of custom coding.

You either make the changes in your existing production system or in a newly installed instance with no data.

For example, if you decided to consolidate organization structures from a multi-system environment, you might quickly discover lots of hard-coded values in custom programs.  Hard-coding values in the programs themselves, rather than using table-driven values and parameters, can cause system consolidation nightmares.  This is just one type of problem among the many custom-coded solutions so often provided.
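The hard-coded versus table-driven distinction is easy to illustrate outside of ABAP.  The sketch below is a hypothetical Python analogy, not actual SAP code; the plant and company code values are invented, and in a real system the mapping would live in a customizing or Z-table rather than a dictionary:

```python
# Hypothetical illustration (Python analogy, not ABAP) of why hard-coded
# organizational values make system consolidation painful.

# Anti-pattern: the org assignment is buried in the program logic.
# Consolidating or renumbering company codes means hunting down and
# editing every program like this one.
def company_for_plant_hardcoded(plant: str) -> str:
    if plant in ("1000", "1100"):
        return "US01"
    return "DE01"

# Table-driven: the same decision read from a maintainable mapping.
# In SAP this would be a configuration or custom table, so a
# consolidation project changes data, not code.
PLANT_TO_COMPANY = {"1000": "US01", "1100": "US01", "2000": "DE01"}

def company_for_plant(plant: str) -> str:
    try:
        return PLANT_TO_COMPANY[plant]
    except KeyError:
        raise ValueError(f"No company code mapped for plant {plant}")
```

The table-driven version also fails loudly on an unmapped plant, whereas the hard-coded version silently returns a default, which is exactly the kind of behavior that surfaces as a "nightmare" only after a consolidation go-live.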

Another problem occurs with the existing system configuration.  If you make changes to existing objects that are already in production, you have the challenge of timing and coordinating your cutover to prevent disruption to existing processes.  Depending on your circumstances, if you decide to do a transition in your own production system with the eventual goal of moving away from it into a cleaner environment, it may be best to create all-new custom configuration objects.  You have to make that determination.

Clean SAP Re-Implementation with Old Legacy SAP for History

The other approach to SAP re-implementation is to do a re-implementation in a clean, non-modified system.  That approach assumes that after conversion to the newly designed environment you will leave your old “legacy” SAP system in place for reference and historical data only.  Using a new system for a re-implementation means that you do not have to work around any of the bad setup or design decisions that were made previously.  You avoid all of the headaches with the custom programs and only bring in those custom programs that are really business critical.

If you already have a BI, BW, or other reporting system, it will require some additional work to integrate the old data structures with the new ones.  However, even that will be easier with standard functionality: the SAP BI / BW / BObj reporting options already contain a number of standard extractors that can be used more easily and with less expense.

The Optimum Solution is a Phased SAP Global Instance Harmonization

The most cost-effective way I have found over the years to do a reimplementation is to bring in an operation that is moving to SAP in a “clean” environment.  It is not particularly complicated to integrate two SAP systems using ALE (Application Link Enabling).  In this way you create a new environment, with more up-to-date and more standard functionality, into which you can eventually migrate other business units.

As upgrade projects occur, it is only incrementally more expensive to migrate the upgraded companies into the less customized environment.  With an upgrade you still have to do the custom ABAP program reviews, code validations, etc.  With a cleaner environment that does not have all of the custom-coded artifacts, it is much easier to pick and choose what is really of value and what can be replaced by new or better-understood functionality.

For additional rollout locations there is virtually no additional cost, beyond the rollout project itself, for bringing those companies or organizations onto the more standard SAP environment.  In fact, the reduced custom coding tends to be less expensive, because less time is spent regression testing custom functionality, fixing organization-specific settings, and training people on custom functionality.  Consulting time, and therefore consulting cost, is lower as well: the closer you stay to standard, the larger the pool of resources available to make or adjust system settings rather than work with custom programs.

SAP Organization Structure and Master Data Harmonization

One other possible project approach is to do the SAP Org Structure harmonization in all of the separate SAP global instances and then agree on the common master data types.  At “go-live” you extend all of the existing data in each production instance to begin executing with the new structure and master data types in each production instance.  By doing this, the “legacy” data and “legacy” org structures stay in place so that little or no business disruption occurs.  A transition period of approximately a year is needed to complete at least one full annual financial close under the new structures and data in the existing production system.

By using this approach you are actually making the transition in two steps.  First you build out the new state in your existing system; then, after flushing out and adjusting most of the issues, you do a conversion or cutover to a clean system after a financial close.  This approach allows for an orderly transition from the old to the new with relatively little business disruption.  While the old SAP org structure elements and master data types are being made “obsolete,” they are still available for all processing and reference purposes.

Some of the key considerations for this approach involve what to do with custom coding and how to transition the master data.  It is impossible to know in advance which pieces of custom coding might be replaced with standard functionality.  However, some of the new data types may help resolve the issue of moving off the custom coding in the same system.  Eventually the goal would be to upgrade away from the custom coding and into more standard functionality, unless there is some clear business justification preventing this.


Related Posts:

Planning For a Smooth SAP Go-Live Part 1

October 27th, 2008

After you’ve done all the research, and gone to all the trouble to make your project a success, there are still four key areas that consistently cause trouble during your SAP go-live:

1.  SAP Security and Authorizations.

2.  Master Data.

3.  Business process changes, process gaps (missed processes and exceptions).

4.  SAP ABAP Custom Development.

While each of these areas consistently causes trouble at go-live, resolving the first item, Security and Authorizations, and the third item, process issues, should be a standard part of every project no matter who does it.  There is just no real reason for authorization or business process issues to be a problem on any project.

Master Data and Custom Development are a bit more difficult to resolve.  Even companies that believe they have a good handle on master data often discover that it is not as “pristine” as they might believe.  Custom development can be another source of headaches.  Often it is some “gee whiz, gotta have it” improvement, automation, or rearranging a printed form 20 times to get it “right,” where development can come back to bite you if you’re not careful.  This is especially true with inexperienced developers who read an ABAP programming book, or take some back-room, fly-by-night “certification” course, and then get presented with a fake resume.

1.  SAP Security and Authorizations…

To reduce SAP go-live headaches, security and user authorizations must be carefully tested.  By testing I am *not* referring to consultant testing, core team testing, or even extended user (power user) testing; I mean actual end users logging in under their own SAP user IDs and verifying they have what they need to get their job done.

The most effective method I’ve seen over the years is to incorporate authorization testing into the end-user training.  Usually end-user training has “dummy” training user IDs created for use during classroom training.  However, the best solution I have seen is for SAP end users to use their own IDs during training.  They log in under their own IDs and then verify that they have access to all of the transactions they will need at go-live.

At one client, users had a form that matched their training classes; they had to initial the sheet next to each transaction they tested, sign the sheet, and turn it in at the end of the course.  If there were problems, they were noted on the form and sent to security to be resolved.  The users then used their training IDs for the classroom exercises themselves.  While this is a little disruptive to the classroom training process, it is the most effective method I have ever seen.  The idea is that the end user must somehow log in and test their logon ID long before your SAP go-live.  However you decide to do that is up to you, but doing so will reduce many headaches after go-live, when everyone is focused on resolving fires, gaps, and process changes, and the users are learning a new system.

Suggestions and ideas for handling SAP security

1)  Integrate the SAP security and training efforts.  Those identified for training should also have the tasks and transactions they will be trained on identified.  This is a good starting point for security access as well.

2)  Be sure to test security with every end user before you actually go live.  This will help to reduce the production headaches with security and authorizations; unfortunately it will not eliminate them.

3)  Take the time and effort to carefully structure your security and authorizations.  Done properly, authorizations should not be a maintenance nightmare.

Other than the obvious reasons, the importance of security maintenance after your SAP go-live cannot be overemphasized.  After the consultants leave, this is one of the routine, regular, and ongoing maintenance areas, and if not enough attention is paid to it from a long-term maintenance perspective, you may have to live with a “nightmare.”  Aside from the normal security concerns, an improperly designed security strategy will cause ongoing maintenance nightmares because each person will eventually end up with completely “unique,” one-off security objects.  That translates into significant maintenance overhead that is not necessary with a properly designed security strategy.
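The paper sign-off sheets described above amount to a simple reconciliation: which transactions does each user need, and which have they actually tested under their own ID?  The sketch below shows that reconciliation in Python; the CSV layout and column names are hypothetical, and on a real project the inputs would come from the training roster and the security team's role assignments rather than hand-made files:

```python
# Sketch of reconciling end-user authorization testing against the
# transactions each user needs before go-live. File format (columns
# "user" and "tcode") is a hypothetical stand-in for real training
# roster and role-assignment data.
import csv


def untested_transactions(required_csv: str, tested_csv: str) -> dict:
    """Return {user: set of transactions required but not yet tested}."""
    required, tested = {}, {}
    for path, bucket in ((required_csv, required), (tested_csv, tested)):
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                bucket.setdefault(row["user"], set()).add(row["tcode"])
    # A user with no tested transactions at all still shows every gap.
    return {
        user: tcodes - tested.get(user, set())
        for user, tcodes in required.items()
        if tcodes - tested.get(user, set())
    }
```

Run daily during the training window, a report like this gives the security team a shrinking punch list instead of a pile of forms to re-key after the last class.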

Four Part Series on SAP Project Planning for a Smooth Go-Live:

Planning For a Smooth SAP Go-Live: Part 1
(introduction, security and authorizations)

Planning For a Smooth SAP Go-Live: Part 2
(master data, data transformation methods)

Planning For a Smooth SAP Go-Live: Part 3
(process issues, blueprinting, testing, and change management)

Planning For a Smooth SAP Go-Live: Part 4
(custom development, costs and consequences of inexperienced developers)


Related Posts:

Planning For a Smooth SAP Go-Live: Part 2

October 25th, 2008

2.  SAP Master Data. 

Ideally your master data processing should begin early in the project.  Identification of legacy data sources, and even raw legacy data extracts, should begin during the Blueprint phase.  Experienced implementation consultants should be able to ensure that the key data requirements and data settings are also captured during the Blueprint process.

It won’t be perfect, but the initial scope should already have been determined, and experienced developers should be able to point you to the type of raw data records they will need to begin working with.  When you begin your SAP project you should immediately ask for SAP master data maps, layouts, and conversion information.  If the developers responsible for data conversion seem to “talk around” the issue but cannot demonstrate a very clear understanding of the details of the SAP master data requirements, you should be very suspicious.  By the end of Blueprint, legacy system master data requirements (extracts and layouts) won’t be complete, but they should be well underway.

It is optimal to plan for Integration Testing to be done with converted data, even if the data is not perfect and not completely ready for prime time

Even though you may have to add a little more time to your initial project plan, it is optimal to plan for Integration Testing to be done with converted data, even if the data is not perfect and not completely ready for prime time.  By doing integration testing with converted data you will discover data gaps, process gaps, and other problems that can be addressed before you convert to a live SAP production system.  “Live testing” of converted data in a production system is the worst time to find out if the system will work.  Not only that, data corrections in SAP can be very involved and difficult because of the integrated nature of the system.

SAP Data Transformation or Data Conversion Methods 

There are four primary methods for data transformation into SAP: 

1)  Do the data transformations outside of SAP and the legacy systems, then feed pre-formatted data files into the system using standard input programs;

2)  Extract raw legacy data and do all of the transformations in custom programs or conversion tools inside of SAP;

3)  A hybrid that does some conversion outside of SAP and legacy, and some at the time of conversion inside SAP;

4)  Do the transformations in the legacy extract programs.
I personally prefer methods 1) or 3); if there is one method I dislike the most, it is probably 4).  There are mountains of data transformation tools available.  USE THEM!  I personally prefer MS Access, but only for one-time data conversion and one-time “throwaway” transformation rules.  Over the years I have seen that the more cleanup and transformation performed outside of SAP and legacy, in an automated and repeatable process, the shorter the development time.  In my experience that approach also makes it easier, and less risky, to resolve and mitigate data conversion problems.  The other side benefit of that approach is that it begins to press legacy employees to move away from the legacy application and begin learning more about the new system.

My personal method is to take unmodified, raw legacy file extracts, use MS Access to do as much data conversion, data cleanup, and data transformation as possible, and then use the SAP Legacy System Migration Workbench (LSMW) tools to load the data.  Over the years I’ve learned that I do NOT want any legacy data changes made at extract time.  It causes too many problems, and frequently leads to managing multiple, sometimes conflicting, transformation rules.  Often a single change in an extract data set, made before an intermediate or final transformation into SAP, can completely compromise a data load.  And sadly, those “extract changes” rarely get reported, so they are discovered by trial and error at load time.

A good MS Access power user can do huge amounts of data transformation and produce file layouts that match SAP input requirements.  Done correctly, this becomes an easily and quickly repeatable process.  MS Access is a quite capable data transformation tool; on a decent workstation it can manipulate hundreds of thousands, or even a couple million, records relatively quickly for the lightweight desktop tool it is.  From there, SAP’s LSMW tool is quite powerful and has the ability to code individual field-level transformation rules directly into the program for any final “detailed” requirements.
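The "raw extract in, repeatable cleanup outside SAP, clean load file out" approach can be sketched in any scripting tool; the example below uses Python rather than MS Access just to keep it self-contained.  Every field name, padding rule, and default here is hypothetical; on a real project these rules come from the SAP master data maps and the LSMW object definition, not from this sketch:

```python
# Sketch of the "transform outside SAP" approach: a raw legacy customer
# extract goes in, a cleaned file ready for an LSMW-style load comes out,
# all in one rerunnable script. Field names, widths, and defaults are
# hypothetical illustrations, not a real SAP customer layout.
import csv


def transform_customers(raw_path: str, load_path: str) -> int:
    """Read a raw legacy extract and write a cleaned load file.
    Returns the number of records written."""
    written = 0
    with open(raw_path, newline="") as src, \
         open(load_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["CUSTOMER", "NAME", "COUNTRY"])
        for row in csv.DictReader(src):
            # All cleanup rules live HERE, in one repeatable place,
            # never in the legacy extract itself.
            cust = row["legacy_id"].strip().zfill(10)       # zero-pad key
            name = " ".join(row["name"].split()).upper()[:35]  # tidy, cap length
            country = row["country"].strip().upper() or "US"   # illustrative default
            writer.writerow([cust, name, country])
            written += 1
    return written
```

Because the raw extract is never touched, rerunning the script after a rule change regenerates the entire load file identically, which is exactly the repeatability argument made above: fix the rule once, not the data in three places.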

Suggestions for SAP data conversion:  

1)  Never skip doing a “mock conversion” of the data, and capture the necessary steps and times for doing the real data conversion into your production client.

2)  If you are doing a rollout of a new facility into an existing SAP instance, make sure you do some type of regression testing before you go into production.

3)  Plan for and be ready to support the master data maintenance efforts after go-live.

4)  Test and check every interface and batch job, both before the go-live and then within 24 hours of the first time they are run at go-live.

5)  Make sure to correct and clean up any master data issues immediately at go-live.  Each time uncorrected master data is referenced by a transaction, both the cleanup effort and the underlying problem compound.

