Real-Time Issues


Real-time Project issues 

From the Shell project's point of view:

*how many process chains do you have* - 1800+ chains

*how many loads per day* - 3 schedules per day (G1, G2, G3): G1 is
for the Europe region (starts at 6:15 am IST), G2 is for the Asia region (starts
at 7:30 pm IST), and G3 is for the US region (starts at 12:30 pm IST).

*modules of the project* -

FSS (Financial Services),
PGS (Procurement of Goods and Supply),
GAME (Global Asset Management System),
StBC (Sell to Business Customers),
StRC (Sell to Retail Customers),
HM (Hydrocarbon Management),
LSC (Lubricants Supply Chain Management)

*data sources which you have used -*

2LIS_11_VAITM - Sales item
2LIS_11_VAHDR - Sales header
2LIS_12_VCITM - Delivery item
2LIS_12_VCHDR - Delivery header
2LIS_13_VDITM - Billing item
These are some of the SAP standard data sources; search Google for "SAP
standard data sources" and you can find many more.
*title of project -* Shell BAM (GSAP)
*objective of project* -

*Shell* is a global group of energy and petrochemicals companies, operating
in over 145 countries and employing more than 119,000 people, best known to
the public for its service stations and for exploring and producing oil and gas
on land and at sea. GSAP will replace Shell's fragmented Enterprise Resource
Planning systems with a harmonized global platform, critical to the delivery
of OP-One benefits. Shell aims to reduce the number of Enterprise Resource
Planning systems in oil products from 123 to fewer than 10.
*flow of project* - The question is not well defined; you can't really explain
the flow of the whole project, but you can explain the flow of a particular
report: from which source system the data comes into BW, where you store it
in BW, and so on.

Eg: R/3 system --> 2LIS_11_VAITM --> PSA --> ABAP routine (update rules)
--> ODS --> Cube --> MultiProvider --> Query (explain the logic in the query)
--> Report name.
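For the "ABAP routine (update rules)" step, a BW 3.x update routine is a generated ABAP FORM whose body you fill with the transformation logic. The following is only a minimal sketch with assumed names (the communication structure /BIC/CS2LIS_11_VAITM, the target key figure /BIC/VZSD_C01T-NETVAL, and the fields KWMENG/NETWR are illustrative, not taken from the Shell project):

```abap
* Minimal sketch of a BW 3.x update rule routine (assumed object names).
* BW generates the FORM skeleton; only the logic inside is hand-written.
FORM compute_data_field
  TABLES   monitor        STRUCTURE rsmonitor          "monitor entries
  USING    comm_structure LIKE /bic/cs2lis_11_vaitm    "assumed InfoSource structure
           record_no      LIKE sy-tabix
           record_all     LIKE sy-tabix
           source_system  LIKE rsupdsimulh-logsys
  CHANGING result         LIKE /bic/vzsd_c01t-netval   "assumed target key figure
           returncode     LIKE sy-subrc
           abort          LIKE sy-subrc.

* Example logic: skip records with zero order quantity,
* otherwise pass the net value through to the key figure.
  IF comm_structure-kwmeng IS INITIAL.
    returncode = 4.        "non-zero return code: this record is not updated
  ELSE.
    result     = comm_structure-netwr.
    returncode = 0.
  ENDIF.

  abort = 0.               "ABORT <> 0 would cancel the whole update

ENDFORM.
```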

*roles and responsibilities -*

  - Monitored data load activities and recovered failures in the Dev and Quality
  systems as part of unit testing.


  - Prepared the process documents for each and every enhancement.
  - Handled tickets of different severities by following SOPs (Standard
  Operating Procedures) and never missed an SLA (Service Level Agreement).


  - Involved in solving high-priority tickets regarding extractions,
  performance issues and data load failures.
  - Extensively worked on process chains to automate deleting indexes
  in InfoCubes, processing of InfoPackages, creating indexes, ODS activation,
  further update, PSA request deletion, attribute change runs and so on.
  - Experience using the Open Hub feature to export data from SAP BW to
  external systems (flat files / database tables).
  - Involved in unit testing, integration testing, regression testing and
  user acceptance testing on different servers.
  - Deleted the delta queues by running the programs for different
  applications, and filled the setup tables in production during the
  downtime before upgrading to the BI 7.0 system.
  - Experience in RMI & OPSM interface technology.
  - Enhancing the existing reports as per the new requirement.
  - Monitoring the daily loads to various data targets in the system.
  - Monitoring the process chains on a daily, weekly and monthly basis.
  - Manual loading and rolling up the data into data targets.
  - Maintaining the logs for each manual load.
  - Created aggregates to improve the query response.
  - Performance tuning of queries using aggregates and indexing of InfoCubes.
  - Responsible for the RevTrac system, maintaining and creating RevTrac
  requests for the PGS area.

*tool used in transports* - Shell uses the tool RevTrac to transfer the objects
from the Dev to Production systems.

The above details differ from project to project. Understand them and prepare
something similar for your own project as well.

 

How can I compare data in R/3 with data in a BW Cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records?

Here goes the beautiful explanation:
Go to R/3 TC RSA3 and run the extractor. It will give you the number of records extracted. Then go to BW Monitor to check the number of records in the PSA and check to see if it is the same.
RSA3 is a simple extractor checker program that allows you to rule out extract problems in R/3. It is simple to use, but only really tells you if the extractor works. Since records that get updated into Cubes/ODS structures are controlled by Update Rules, you will not be able to determine what is in the Cube compared to what is in the R/3 environment. You will need to compare records on a 1:1 basis against records in R/3 transactions for the functional area in question. I would recommend enlisting the help of the end user community to assist since they presumably know the data.

To use RSA3, go to it and enter the extractor, e.g. 2LIS_02_HDR. Click execute and you will see the record count; you can also display that data. You are not modifying anything, so what you do in RSA3 has no effect on data quality afterwards. However, it will not tell you how many records should be expected in BW for a given load. You have that information in the monitor RSMO during and after data loads. From RSMO, for a given load, you can determine how many records were passed through the transfer rules from R/3, how many targets were updated, and how many records passed through the update rules. It also gives you error messages from the PSA.
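Beyond RSA3 and RSMO, if you want a rough count on the BW side to compare against, you can count the records in the data target's active table directly. This is only a sketch: the ODS name ZSD_O01 (active table /BIC/AZSD_O0100, following the usual /BIC/A<ODS>00 pattern) is an assumed example, not from the Shell project, and the comparison is only meaningful once you allow for delta logic and update rules that filter or aggregate records.

```abap
* Rough record count of an ODS active table on the BW side (assumed ODS ZSD_O01).
REPORT z_count_ods_records.

DATA lv_count TYPE i.

* /BIC/A<ODS name>00 is the usual naming pattern for an ODS active data table.
SELECT COUNT( * ) FROM /bic/azsd_o0100 INTO lv_count.

WRITE: / 'Records in ODS active table /BIC/AZSD_O0100:', lv_count.
```

Compare this figure with the record counts shown in RSA3 on the R/3 side and in RSMO for the corresponding load, keeping in mind that records skipped or merged by the update rules will legitimately make the numbers differ.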