Channel: SCN : Document List - Data Services and Data Quality

Customer IDOC : DEBMAS Simplified


Hi All,

The goal of this document is to provide field-level details for the DEBMAS IDoc, which is used to load and extract Customer Master data in SAP.

In our daily work we often come across scenarios where we need to load data into SAP, or extract data from SAP, for Customer, Employee, Vendor, Material and so on.

Data in SAP is stored in SAP tables, and every field has specific requirements such as data type and field length. It is difficult to memorize these field-level details for any given table.

In this document I have tried to include the field-level details for the particular case of Customer Master. Customer Master data can be loaded into, as well as extracted from, SAP using the DEBMAS IDoc.

Please find below (and in the attached document) the meaning of each field in the IDoc: its system requirement (mandatory/non-mandatory), description, technical field name, field length, and the segment name, which also indicates the SAP table where the field lies.

The document is organized segment by segment.
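To make the field-level spec easier to consume programmatically, here is a minimal, hypothetical Python sketch (not part of the original attachment) that captures a handful of E1KNA1M fields with their lengths and mandatory flags, taken from the tables below, and checks a candidate customer record against them. All sample values are purely illustrative.

E1KNA1M_SPEC = {
    # field: (max_length, mandatory) -- a small illustrative subset of the table below
    "KUNNR": (10, True),   # Customer Number 1
    "KTOKD": (4,  True),   # Customer Account Group
    "LAND1": (3,  True),   # Country Key
    "NAME1": (35, True),   # Name 1
    "ORT01": (35, True),   # City
    "ANRED": (15, False),  # Title
}

def check_record(record):
    """Return a list of problems found for one candidate E1KNA1M segment."""
    problems = []
    for field, (max_len, mandatory) in E1KNA1M_SPEC.items():
        value = (record.get(field) or "").strip()
        if mandatory and not value:
            problems.append(field + ": mandatory field is empty")
        if len(value) > max_len:
            problems.append(field + ": value exceeds length " + str(max_len))
    return problems

print(check_record({"KUNNR": "0000100001", "KTOKD": "KUNA",
                    "LAND1": "DE", "NAME1": "ACME GmbH", "ORT01": "Berlin"}))   # []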

 

 

 

The tables below use the following columns: System Required ("Req": * = mandatory, blank = non-mandatory), Text Description, SAP Technical Field Name, and Field Length (in characters). The segment named in each section heading applies to every field listed under it.

E1KNA1M - Basic Data (KNA1)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
  | Title | ANRED | 15
  | Central order block for customer | AUFSD | 2
  | Express train station | BAHNE | 25
  | Train station | BAHNS | 25
  | International location number (part 1) | BBBNR | 7
  | International location number (part 2) | BBSNR | 5
  | Authorization Group | BEGRU | 4
  | Industry key | BRSCH | 4
  | Check digit for the international location number | BUBKZ | 1
  | Data communication line no. | DATLT | 14
  | Central billing block for customer | FAKSD | 2
  | Account number of the master record with the fiscal address | FISKN | 10
  | Account number of an alternative payer | KNRZA | 10
  | Group key | KONZS | 10
* | Customer Account Group | KTOKD | 4
  | Customer classification | KUKLA | 2
* | Country Key | LAND1 | 3
  | Account Number of Vendor or Creditor | LIFNR | 10
  | Central delivery block for the customer | LIFSD | 2
  | City Coordinates | LOCCO | 10
  | Central Deletion Flag for Master Record | LOEVM | 1
* | Name 1 | NAME1 | 35
  | Name 2 | NAME2 | 35
  | Name 3 | NAME3 | 35
  | Name 4 | NAME4 | 35
  | Nielsen ID | NIELS | 2
* | City | ORT01 | 35
  | District | ORT02 | 35
  | PO Box | PFACH | 10
  | P.O. Box Postal Code | PSTL2 | 10
  | Postal Code | PSTLZ | 10
  | Region (State, Province, County) | REGIO | 3
  | County Code | COUNC | 3
  | City Code | CITYC | 4
  | Regional Market | RPMKR | 5
* | Sort field | SORTL | 10
  | Central posting block | SPERR | 1
* | Language Key | SPRAS | 1
  | Tax Number 1 | STCD1 | 16
  | Tax Number 2 | STCD2 | 11
  | Indicator: Business Partner Subject to Equalization Tax? | STKZA | 1
  | Liable for VAT | STKZU | 1
  | House number and street | STRAS | 35
  | Telebox number | TELBX | 15
  | First telephone number | TELF1 | 16
  | Second telephone number | TELF2 | 16
  | Fax Number | TELFX | 31
  | Teletex number | TELTX | 30
  | Telex number | TELX1 | 30
* | Transportation zone to or from which the goods are delivered | LZONE | 10
  | Indicator: Alternative payee in document allowed? | XZEMP | 1
  | Company ID of trading partner | VBUND | 6
  | VAT Registration Number | STCEG | 20
  | Legal status | GFORM | 2
  | Industry code 1 | BRAN1 | 10
  | Industry code 2 | BRAN2 | 10
  | Industry code 3 | BRAN3 | 10
  | Industry code 4 | BRAN4 | 10
  | Industry code 5 | BRAN5 | 10
  | Year for which sales are given | UMJAH | 4
  | Currency of sales figure | UWAER | 5
  | Yearly number of employees | JMZAH | 6
  | Year for which the number of employees is given | JMJAH | 4
  | Attribute 1 | KATR1 | 2
  | Attribute 2 | KATR2 | 2
  | Attribute 3 | KATR3 | 2
  | Attribute 4 | KATR4 | 2
  | Attribute 5 | KATR5 | 2
  | Attribute 6 | KATR6 | 3
  | Attribute 7 | KATR7 | 3
  | Attribute 8 | KATR8 | 3
  | Attribute 9 | KATR9 | 3
  | Attribute 10 | KATR10 | 3
  | Natural Person | STKZN | 1
  | Annual sales | UMSA1 | 16
  | Tax Jurisdiction | TXJCD | 15
  | Fiscal Year Variant | PERIV | 2
  | Reference Account Group for One-Time Account (Customer) | KTOCD | 4
  | PO Box city | PFORT | 35
  | Indicator for Data Medium Exchange | DTAMS | 1
  | Instruction key for data medium exchange | DTAWS | 2
  | Hierarchy assignment (batch input) | HZUOR | 2
  | ID for mainly non-military use | CIVVE | 1
  | ID for mainly military use | MILVE | 1
  | Tax type | FITYP | 2
  | Tax Number Type | STCDT | 2
  | Tax Number 3 | STCD3 | 18
  | Tax Number 4 | STCD4 | 18
  | Customer is ICMS-exempt | XICMS | 1
  | Customer is IPI-exempt | XXIPI | 1
  | Customer group for Substituicao Tributaria calculation (old) | XSUBT | 1
  | Customer's CFOP category | CFOPC | 2
  | Tax law: ICMS | TXLW1 | 3
  | Tax law: IPI | TXLW2 | 3
  | Indicator for biochemical warfare for legal control | CCC01 | 1
  | Indicator for nuclear nonproliferation for legal control | CCC02 | 1
  | Indicator for national security for legal control | CCC03 | 1
  | Indicator for missile technology for legal control | CCC04 | 1
  | Central sales block for customer | CASSD | 2
  | Customer condition group 1 | KDKG1 | 2
  | Customer condition group 2 | KDKG2 | 2
  | Customer condition group 3 | KDKG3 | 2
  | Customer condition group 4 | KDKG4 | 2
  | Customer condition group 5 | KDKG5 | 2
  | Central deletion block for master record | NODEL | 1
  | Customer group for Substituicao Tributaria calculation | XSUB2 | 3
  | Plant | WERKS | 4

E1KNVVM - Sales Data (KNVV)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Sales Organization | VKORG | 4
* | Distribution Channel | VTWEG | 2
* | Division | SPART | 2
  | Authorization Group | BEGRU | 4
  | Deletion flag for customer (sales level) | LOEVM | 1
  | Customer Statistics Group | VERSG | 1
  | Customer order block (sales area) | AUFSD | 2
* | Pricing procedure assigned to this customer | KALKS | 1
  | Customer group | KDGRP | 2
  | Sales district | BZIRK | 6
  | Price group (customer) | KONDA | 2
  | Price list type | PLTYP | 2
  | Order probability of the item | AWAHR | 3
  | Incoterms (part 1) | INCO1 | 3
  | Incoterms (part 2) | INCO2 | 28
  | Customer delivery block (sales area) | LIFSD | 2
  | Complete delivery defined for each sales order? | AUTLF | 1
  | Maximum number of partial deliveries allowed per item | ANTLF | 2
  | Partial delivery at item level | KZTLF | 1
  | Order combination indicator | KZAZU | 1
  | Batch split allowed | CHSPL | 1
  | Delivery Priority | LPRIO | 2
  | Shipper's (our) account number at the customer or vendor | EIKTO | 12
* | Shipping Conditions | VSBED | 2
  | Billing block for customer (sales and distribution) | FAKSD | 2
  | Manual invoice maintenance | MRNKZ | 1
  | Invoice dates (calendar identification) | PERFK | 2
  | Invoice list schedule (calendar identification) | PERRL | 2
  | Currency | WAERS | 5
  | Account assignment group for this customer | KTGRD | 2
* | Terms of Payment Key | ZTERM | 4
  | Delivering Plant | VWERK | 4
  | Sales Group | VKGRP | 3
  | Sales Office | VKBUR | 4
  | Item proposal | VSORT | 10
  | Customer group 1 | KVGR1 | 3
  | Customer group 2 | KVGR2 | 3
  | Customer group 3 | KVGR3 | 3
  | Customer group 4 | KVGR4 | 3
  | Customer group 5 | KVGR5 | 3
  | Indicator: Customer is rebate-relevant | BOKRE | 1
  | Exchange Rate Type | KURST | 4
  | Relevant for price determination ID | PRFRE | 1
  | Customer classification (ABC analysis) | KLABC | 2
  | Customer payment guarantee procedure | KABSS | 4
  | Credit control area | KKBER | 4
  | Sales block for customer (sales area) | CASSD | 2
  | Switch off rounding? | RDOFF | 1
  | Indicator: Relevant for agency business | AGREL | 1
  | Unit of Measure Group | MEGRU | 4
  | Overdelivery tolerance limit (BTCI) | UEBTO | 4
  | Underdelivery tolerance (BTCI) | UNTTO | 4
  | Unlimited overdelivery allowed | UEBTK | 1
  | Customer procedure for product proposal | PVKSM | 2
  | Relevant for POD processing | PODKZ | 1
  | Timeframe for confirmation of POD (BI) | PODTG | 11
  | Indicator: Doc. index compilation active for purchase orders | BLIND | 1
  | Carrier is to be notified | CARRIER_NOTIF | 1

E1KNVPM - Partner Roles (KNVP)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Sales Organization | VKORG | 4
* | Distribution Channel | VTWEG | 2
* | Division | SPART | 2
* | Partner Function | PARVW | 2
* | Customer number of business partner | KUNN2 | 10
  | Default Partner | DEFPA | 1
  | Customer description of partner (plant, storage location) | KNREF | 30
  | Partner counter | PARZA | 3

E1KNVIM - Tax Indicators (KNVI)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Sales Organization | VKORG | 4
* | Distribution Channel | VTWEG | 2
* | Division | SPART | 2
* | Departure country (country from which the goods are sent) | ALAND | 3
* | Tax category (sales tax, federal sales tax, ...) | TATYP | 4
* | Tax classification for customer | TAXKD | 1

E1KNVLM - Licenses (KNVL)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Sales Organization | VKORG | 4
* | Distribution Channel | VTWEG | 2
* | Division | SPART | 2
* | Departure country (country from which the goods are sent) | ALAND | 3
* | Tax category (sales tax, federal sales tax, ...) | TATYP | 4
* | License number | LICNR | 15
* | Valid-From Date | DATAB | 8
* | Valid-To Date | DATBI | 8
  | Confirmation for licenses | BELIC | 1

E1KNB1M - Company Code (KNB1)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Company Code | BUKRS | 6
  | Posting block for company code | SPERR | 1
  | Deletion flag for master record (company code level) | LOEVM | 1
  | Key for sorting according to assignment numbers | ZUAWA | 3
  | Accounting clerk | BUSAB | 2
* | Reconciliation Account in General Ledger | AKONT | 10
  | Authorization Group | BEGRU | 4
  | Head office account number (in branch accounts) | KNRZE | 10
  | Account number of an alternative payer | KNRZB | 10
  | Indicator: Payment notice to customer (with cleared items)? | ZAMIM | 1
  | Indicator: Payment notice to sales department? | ZAMIV | 1
  | Indicator: Payment notice to legal department? | ZAMIR | 1
  | Indicator: Payment notice to the accounting department? | ZAMIB | 1
  | Indicator: Payment notice to customer (w/o cleared items)? | ZAMIO | 1
  | List of the payment methods to be considered | ZWELS | 10
  | Indicator: Clearing between customer and vendor? | XVERR | 1
  | Block key for payment | ZAHLS | 1
* | Terms of Payment Key | ZTERM | 4
  | Terms of payment key for bill of exchange charges | WAKON | 4
  | Interest calculation indicator | VZSKZ | 2
  | Key date of the last interest calculation | ZINDT | 8
  | Interest calculation frequency in months | ZINRT | 2
  | Our account number at customer | EIKTO | 12
  | User at customer | ZSABE | 15
  | Memo | KVERM | 30
  | Planning group | FDGRV | 10
  | Export credit insurance institution number | VRBKZ | 2
  | Amount Insured | VLIBB | 14
  | Insurance lead months | VRSZL | 4
  | Deductible percentage rate | VRSPR | 4
  | Insurance number | VRSNR | 10
  | Insurance validity date | VERDT | 8
  | Collective invoice variant | PERKZ | 1
  | Indicator: Local processing? | XDEZV | 1
  | Indicator for periodic account statements | XAUSZ | 1
  | Bill of exchange limit (in local currency) | WEBTR | 14
  | Next payee | REMIT | 10
  | Date of the last interest calculation run | DATLZ | 8
  | Indicator: Record payment history? | XZVER | 1
  | Tolerance group for the business partner/G/L account | TOGRU | 4
  | Probable time until check is paid | KULTG | 4
  | Short key for a house bank | HBKID | 5
  | Indicator: Pay all items separately? | XPORE | 1
  | Subsidy indicator for determining the reduction rates | BLNKZ | 2
  | Previous Master Record Number | ALTKN | 10
  | Key for Payment Grouping | ZGRUP | 2
  | Short Key for Known/Negotiated Leave | URLID | 4
  | Key for dunning notice grouping | MGRUP | 2
  | Key of the lockbox to which the customer is to pay | LOCKB | 7
  | Payment Method Supplement | UZAWE | 2
  | Account Number of Buying Group | EKVBD | 10
  | Selection Rule for Payment Advices | SREGL | 3
  | Indicator: Send payment advices by EDI | XEDIP | 1
  | Release Approval Group | FRGRP | 4
  | Reason Code Conversion Version | VRSDG | 3
  | Accounting clerk's fax number at the customer/vendor | TLFXS | 31
  | Personnel Number | PERNR | 8
  | Internet address of partner company clerk | INTAD | 130
  | Payment Terms Key for Credit Memos | GUZTE | 4
  | Activity Code for Gross Income Tax | GRICD | 2
  | Distribution Type for Employment Tax | GRIDT | 2
  | Value Adjustment Key | WBRSL | 2
  | Deletion block for master record (company code level) | NODEL | 1
  | Accounting clerk's telephone number at business partner | TLFNS | 30
  | Accounts Receivable Pledging Indicator | CESSION_KZ | 2
  | Customer is in execution | GMVKZD | 1

E1KNB5M  - Reminder Data (KNB5)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Company Code | BUKRS | 6
  | Dunning Area | MABER | 2
  | Dunning Procedure | MAHNA | 4
  | Dunning block | MANSP | 1
  | Last dunned on | MADAT | 8
  | Dunning level | MAHNS | 1
  | Account number of the dunning recipient | KNRMA | 10
  | Date of the legal dunning proceedings | GMVDT | 8
  | Dunning clerk | BUSAB | 2

E1KNBKM - Bank Details and Bank Master

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Bank country key | BANKS | 3
* | Bank number | BANKL | 15
* | Bank account number | BANKN | 18
  | Bank Control Key | BKONT | 2
  | Partner Bank Type | BVTYP | 4
  | Indicator: Is there collection authorization? | XEZER | 1
  | Reference specifications for bank details | BKREF | 20
  | Account Holder Name | KOINH | 35
  | Date (batch input) | KOVON | 8
  | Date (batch input) | KOBIS | 8

E1KNVKM  - Contact Person (KNVK)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
  | Number of contact person | PARNR | 10
  | First name | NAMEV | 35
  | Name 1 | NAME1 | 35
  | Contact person's department at customer | ABTPA | 12
  | Contact person department | ABTNR | 4
  | Higher-level partner | UEPAR | 10
  | First telephone number | TELF1 | 16
  | Form of address for contact person (Mr, Mrs, etc.) | ANRED | 30
  | Contact person function | PAFKT | 2
  | Partner's Authority | PARVO | 1
  | VIP Partner | PAVIP | 1
  | Partner's gender | PARGE | 1
  | Partner language | PARLA | 1
  | Date of Birth | GBDAT | 8
  | Representative number | VRTNR | 10
  | Call frequency | BRYTH | 4
  | Buying habits | AKVER | 2
  | Advertising material indicator | NMAIL | 1
  | Notes about contact person | PARAU | 40
  | Contact person: Attribute 1 | PARH1 | 2
  | Contact person: Attribute 2 | PARH2 | 2
  | Contact person: Attribute 3 | PARH3 | 2
  | Contact person: Attribute 4 | PARH4 | 2
  | Contact person: Attribute 5 | PARH5 | 2
  | Contact person's visiting hours: Monday morning from ... | MOAB1 | 6
  | Contact person's visiting hours: Monday morning until ... | MOBI1 | 6
  | Contact person's visiting hours: Monday afternoon from ... | MOAB2 | 6
  | Contact person's visiting hours: Monday afternoon until ... | MOBI2 | 6
  | Contact person's visiting hours: Tuesday morning from ... | DIAB1 | 6
  | Contact person's visiting hours: Tuesday morning until ... | DIBI1 | 6
  | Contact person's visiting hours: Tuesday afternoon from ... | DIAB2 | 6
  | Contact person's visiting hours: Tuesday afternoon until ... | DIBI2 | 6
  | Contact person's visiting hours: Wednesday morning from ... | MIAB1 | 6
  | Contact person's visiting hours: Wednesday morning until ... | MIBI1 | 6
  | Contact person's visiting hours: Wednesday afternoon from ... | MIAB2 | 6
  | Contact person's visiting hours: Wednesday afternoon until ... | MIBI2 | 6
  | Contact person's visiting hours: Thursday morning from ... | DOAB1 | 6
  | Contact person's visiting hours: Thursday morning until ... | DOBI1 | 6
  | Contact person's visiting hours: Thursday afternoon from ... | DOAB2 | 6
  | Contact person's visiting hours: Thursday afternoon until ... | DOBI2 | 6
  | Contact person's visiting hours: Friday morning from ... | FRAB1 | 6
  | Contact person's visiting hours: Friday morning until ... | FRBI1 | 6
  | Contact person's visiting hours: Friday afternoon from ... | FRAB2 | 6
  | Contact person's visiting hours: Friday afternoon until ... | FRBI2 | 6
  | Contact person's visiting hours: Saturday morning from ... | SAAB1 | 6
  | Contact person's visiting hours: Saturday morning until ... | SABI1 | 6
  | Contact person's visiting hours: Saturday afternoon from ... | SAAB2 | 6
  | Contact person's visiting hours: Saturday afternoon until ... | SABI2 | 6
  | Contact person's visiting hours: Sunday morning from ... | SOAB1 | 6
  | Contact person's visiting hours: Sunday morning until ... | SOBI1 | 6
  | Contact person's visiting hours: Sunday afternoon from ... | SOAB2 | 6
  | Contact person's visiting hours: Sunday afternoon until ... | SOBI2 | 6
  | Contact person: Attribute 6 | PAKN1 | 3
  | Contact person: Attribute 7 | PAKN2 | 3
  | Contact person: Attribute 8 | PAKN3 | 3
  | Contact person: Attribute 9 | PAKN4 | 3
  | Contact person: Attribute 10 | PAKN5 | 3
  | Sort field | SORTL | 10
  | Marital Status Key | FAMST | 1
  | Nickname | SPNAM | 10
  | Title of contact person (description of function) | TITEL_AP | 5

E1KNKKM - Credit Management Control Area Data (KNKK)

Req | Text Description | Field Name | Length
* | Customer Number 1 | KUNNR | 10
* | Credit control area | KKBER | 4
* | Customer's credit limit | KLIMK | 16
  | Customer's account number with credit limit reference | KNKLI | 10
  | Credit management: Risk category | CTLPC | 3
  | Last internal review | DTREV | 8
  | Indicator: Blocked by credit management? | CRBLB | 1
  | Credit representative group for credit management | SBGRP | 3
  | Next internal review | NXTRV | 8
  | Credit information number | KRAUS | 11
  | Do not use - replaced by DBPAY_CM | PAYDB | 2
  | Do not use - replaced by DBRTG_CM | DBRAT | 3
  | Last review (external) | REVDB | 8
  | Customer Credit Group | GRUPP | 4
  | Reference Date | SBDAT | 8
  | Customer Group | KDGRP | 8
  | Payment Index | DBPAY | 3
  | Rating | DBRTG | 5
  | Recommended credit limit | DBEKR | 17
  | Date Monitoring | DBMON | 8

 

 

I hope this is helpful for all.

 

 

 

 

 

Thanks,

Mayank Mehta


Auditing in SAP Data Services


A proper data reconciliation process must be in place in any Extraction-Transformation-Load (ETL) process. At a minimum, a successful reconciliation process should indicate whether or not the loaded data is correct. But data reconciliation is not easy. Fortunately for us, Data Services (BODS) provides a built-in data reconciliation feature called Auditing. Auditing is a way to ensure that a dataflow loads correct data into the target warehouse. Let's see how...

Audit objects are used to collect run-time audit statistics about the objects within a data flow. Using the Audit feature we can collect statistics about data read into a job, processed by various transforms, and loaded into targets.

We define audit rules to determine whether the correct data is processed. In the event of an audit rule failure, there is also a provision to generate a notification of the failure.

Auditing stores these statistics in the repository for future analysis. If there is any mismatch in source/target row counts or in the sum of measures, a notification can be raised to highlight the audit failure for immediate action, such as reloading or identifying incorrect source data.

Now let us see how easy it is to enable the Audit option in Data Services. Below is a sample dataflow on which we will audit the source and target record counts and also verify the hash value of character-type columns between source and target.

We can enable the Audit option either by right-clicking the dataflow in the workspace or from the Local Object Library. On selecting Audit, we get two tabs, Label and Rule, which are described in detail below.

Auditing Features

Following are the various available features of auditing:

§  Define Audit Points to collect run-time statistics about the data. E.g. Number of rows extracted from Source, Processed by Transforms, Loaded into Targets

§  Define Audit Rules for the audit statistics to ensure expected data at the audit points in the dataflow.

§  Generate run-time notification for the audit rule that fails i.e. Action on failure and the values of the audit statistics for which the rule failed.

§  Display Audit Statistics after the job execution to identify the object/transform that processed incorrect data in the dataflow.

Auditing Objects in a Dataflow

§  Audit Point – The object in a dataflow where audit statistics are collected, i.e. a source, transform or target.

§  Audit Function – The function that collects the audit statistics for a table, output schema or column, e.g. Count, Sum, Average, Checksum.

§  Audit Label – A unique name generated by BODS for each audit function that is defined on an audit point to collect the audit statistics.

§  Audit Rule – A Boolean expression which uses audit labels to verify a job.

§  Action on Audit Failure – Ways to generate notification of audit rule failures.

Audit Labels

The software generates a unique name for each audit function that we define on an audit point; these labels are editable.

If the audit point is on a table or output schema and the audit function is Count, the software generates two labels: $Count_objectname and $CountError_objectname.

If the audit point is on a column, the software generates a single audit label: $auditfunction_objectname.

If the audit point is in an embedded data flow, the labels generated are $Count_objectname_embeddedDFname, $CountError_objectname_embeddedDFname and $auditfunction_objectname_embeddedDFname.
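To illustrate the naming convention, here is a tiny hypothetical Python helper (not a Data Services API) that reproduces the label names described above:

def audit_labels(audit_function, object_name, embedded_df_name=None):
    # Reproduces the naming rules above; purely illustrative.
    suffix = "_" + embedded_df_name if embedded_df_name else ""
    if audit_function.lower() == "count":
        return ["$Count_" + object_name + suffix,
                "$CountError_" + object_name + suffix]
    return ["$" + audit_function.capitalize() + "_" + object_name + suffix]

print(audit_labels("count", "ODS_CUSTOMER"))        # ['$Count_ODS_CUSTOMER', '$CountError_ODS_CUSTOMER']
print(audit_labels("sum", "CALLS_PREPAID"))         # ['$Sum_CALLS_PREPAID']
print(audit_labels("count", "ORDERS", "EDF_LOAD"))  # embedded data flow variant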

Audit Functions

There are four Audit Functions available in BODS.

§  COUNT function helps to get the audit information on a table or output schema. The default datatype is INTEGER.

§  SUM function helps to generate audit information like summation of measure type columns. E.g. Revenue. Accepted datatypes are INTEGER, DECIMAL, DOUBLE, REAL.

§  AVERAGE function helps to generate audit information like average of measure type columns. E.g. Profit Margin. Accepted datatypes are INTEGER, DECIMAL, DOUBLE, REAL.

§  CHECKSUM function helps to audit based on the hash values generated for VARCHAR datatype columns. The order of rows is important for the result of CHECKSUM function.

Error Count Statistics

BODS collects two types of statistics when we use the Count audit function: a good row count for rows processed without any error, and an error row count for rows that the job could not process but ignored in order to continue processing. Error rows can be handled by specifying the Use overflow file option in the source or target editor.
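As an illustration of what these audit functions collect, the following plain-Python sketch (not Data Services code) computes a row count, a sum over a measure column, and an order-sensitive hash over a character column for a source and a target row set, so that the two sides can be compared by an audit rule. The actual CHECKSUM algorithm used by the software is internal; MD5 here is only a stand-in.

import hashlib

source_rows = [("C001", 100.00), ("C002", 250.50), ("C003", 75.25)]
target_rows = [("C001", 100.00), ("C002", 250.50), ("C003", 75.25)]

def audit_stats(rows):
    count = len(rows)                                  # Count
    total = sum(amount for _, amount in rows)          # Sum over a measure column
    digest = hashlib.md5("|".join(key for key, _ in rows).encode()).hexdigest()  # order-sensitive
    return count, total, digest

src, tgt = audit_stats(source_rows), audit_stats(target_rows)
print("counts match:", src[0] == tgt[0])
print("sums match:  ", src[1] == tgt[1])
print("hashes match:", src[2] == tgt[2])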

Let us now see how to define the Audit Points and the Audit Functions. We can also modify the default Audit Labels generated by Data Services if necessary.


Audit Rule

An Audit Rule is a Boolean expression which consists of a left-hand side (LHS), a Boolean operator, and a right-hand side (RHS). The LHS can be a single audit label, multiple audit labels that form an expression with one or more mathematical operators, or a function with audit labels as parameters. The RHS can also be a single audit label, multiple audit labels that form an expression with one or more mathematical operators, a function with audit labels as parameters, or a constant.

The following Boolean expressions are examples of audit rules (a small sketch after the examples shows how such rules evaluate):

§  $Count_ODS_CUSTOMER = $Count_DW_CUSTOMER

§  $Sum_CALLS_PREPAID + $Sum_CALLS_POSTPAID = $Sum_DW_CALLS

§  round($Avg_CALL_TIME) >= 30
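As a sketch of how such rules behave once the labels have collected their run-time values (the numbers below are made up for illustration), the three example rules can be thought of as simple Boolean checks:

labels = {
    "$Count_ODS_CUSTOMER": 12500, "$Count_DW_CUSTOMER": 12500,
    "$Sum_CALLS_PREPAID": 8300.0, "$Sum_CALLS_POSTPAID": 4200.0, "$Sum_DW_CALLS": 12500.0,
    "$Avg_CALL_TIME": 31.6,
}

rules = [
    labels["$Count_ODS_CUSTOMER"] == labels["$Count_DW_CUSTOMER"],
    labels["$Sum_CALLS_PREPAID"] + labels["$Sum_CALLS_POSTPAID"] == labels["$Sum_DW_CALLS"],
    round(labels["$Avg_CALL_TIME"]) >= 30,
]
print(all(rules))   # True -- every rule must hold for the audit to pass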

Audit Notification

There are three types of actions available for notification in the event of an audit failure: Email to list, Script and Raise exception. We can select all three actions or any combination of them. When all the actions are selected as part of the notification for an audit failure, they are executed in the following order:

§  Email to list— BODS sends a notification of which audit rule failed to the email addresses mentioned for this option.

§  Script— BODS executes the custom script mentioned for this option in the event of Audit rule failure.

§  Raise exception— The job fails if an Audit rule fails, and the error log shows which audit rule failed. The job stops at the first audit rule that fails. This action is the default. We can continue with the job execution if we place the audit exception in a try/catch block.

We need to define the audit rules on the audit points or audit labels. Based on the success or failure of these rules, the error message is generated.

Viewing Audit Results

We can find the audit status in the Job Monitor Log when we set Audit Trace to Yes on the Trace tab in the Execution Properties window of the job. We can see messages for audit rules that passed and failed. If an audit rule fails, the places that display audit information depend on the Action on failure option that was selected.

§  When the Raise exception action is selected and an audit rule fails, the Job Error Log and the Metadata Reports show the rule that failed.

§  In the case of Email to list, we are notified of the audit rule failure via an email message. Audit rule statistics are also available in the Metadata Reports.

§  For the failure action type Script, the custom script is executed and the audit statistics can be viewed in the Metadata Reports.

Metadata Reports

We can view passed and failed audit rule results in the metadata reports. Look at the Audit Status column in the Data Flow Execution Statistics reports of the Metadata Report tool. This Audit Status column has the following values:

§  Not Audited

§  Passed— All audit rules succeeded. This value is a link to the Auditing Details report which shows the audit rules and values of the audit labels.

§  Information Collected— This status occurs when we define audit labels to collect statistics but we do not define audit rules. This value is a link to the Auditing Details report which shows the values of the audit labels.

§  Failed— Audit rule failed. This value is a link to the Auditing Details report which shows the rule that failed and values of the audit labels.

Notes

  1.  An audit label can become invalid if we delete the audit label in an embedded data flow that the parent data flow has enabled.
  2.  An audit label can become invalid if we delete or rename an object that had an audit point defined on it.
  3.  We can edit the audit label name while creating the audit function and before creating an audit rule that uses the label.
  4.  If we edit the label name after using it in an audit rule, the audit rule does not automatically use the new name. We must redefine the rule with the new label name.
  5.  If we define multiple rules in a data flow, all rules must succeed or the audit fails.
  6.  Auditing is disabled when you run a job with the debugger.
  7.  We cannot audit NRDM schemas or real-time jobs.
  8.  We cannot audit within an ABAP dataflow, but we can audit the output of an ABAP dataflow.
  9.  If we use the CHECKSUM audit function in a job that normally executes in parallel, the software disables the DOP for the whole data flow. The order of rows is important for the result of CHECKSUM, and DOP processes the rows in a different order than in the source.
  10. For bulk-load dataflows, we can audit the number of rows loaded only when the Oracle API method is used.
  11. When executing the job, the Enable auditing option in the Execution Properties window is checked by default. If we do not want to collect audit statistics for a specific job execution, clear the selection.
  12. If we clear the action Raise exception and an audit rule fails, the job completes successfully and the audit does not write messages to the job log. We can view which rule failed in the Auditing Details report in the Metadata Reporting tool.
  13. If a pushdown_sql function comes after an audit point, the software cannot execute it.
  14. If we add an audit point prior to an operation that is usually pushed down to the database server, performance might degrade, because pushdown operations cannot occur after an audit point.

When we audit the output data of an object, the optimizer cannot pushdown operations after the audit point. Therefore, if the performance of a query that is pushed to the database server is more important than gathering audit statistics from the source, we should define the first audit point on the query or later in the data flow. For example, suppose a query has a WHERE clause that is pushed to the database server that significantly reduces the amount of data that returns to the software. So we can define the first audit point on the query, rather than on the source, to obtain audit statistics on the query results.

Audit Result Logging

If we want to capture or log the audit results to database tables, we can use a custom script such as:

$Count_ODS_CUSTOMER = $Count_CUST_DIM
AND sql('Target_DS', 'insert into dbo.STG_RECON values (
    \'ODS_CUSTOMER\', [$Count_ODS_CUSTOMER],
    \'CUST_DIM\', [$Count_CUST_DIM],
    {to_char(sysdate(), \'YYYY-MM-DD\')})') is NULL

SAP Business Objects Data Services Target Table Options:


Auto Correct Load:

 

Auto-correct load ensures that the same row is not duplicated in a target table.
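The behaviour can be pictured with a small, purely illustrative Python sketch (Data Services implements this inside the target loader; the names below are made up, except EMP_ID, which is the primary key used in the walkthrough that follows):

target = {}   # primary key -> row, standing in for the target table

def auto_correct_load(rows, key="EMP_ID"):
    for row in rows:
        pk = row[key]
        if pk in target:
            target[pk].update(row)   # matching row exists: update it
        else:
            target[pk] = dict(row)   # no match: insert it

source = [
    {"EMP_ID": 1, "NAME": "Asha", "DEPT": "HR"},
    {"EMP_ID": 1, "NAME": "Asha", "DEPT": "HR"},   # duplicate source row
    {"EMP_ID": 2, "NAME": "Ravi", "DEPT": "IT"},
]
auto_correct_load(source)
print(len(target))   # 2 -- the duplicate did not create a second target row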

 

Input data screenshot: see the attached image.

Target options window – the following steps need to be done:

Uncheck the "Delete data from table before loading" option.

The Update Control settings, before and after the change, are shown in the screenshots.

Now go to the source data and add two duplicate rows (before and after screenshots).

After creating the job in Data Services Designer, apply Auto Correct Load on the target table.

Note: after applying Auto Correct Load, duplicate values are removed from the target table.

Now execute the job and check the output.

 

Ignore Columns Option:

If a matching row exists, the row is updated depending on the values of "Ignore columns with value" and "Ignore columns with null". The Ignore Columns options work together with Auto Correct Load.

When the "Ignore columns with null" option is set to Yes and the column data coming from the source is NULL, the corresponding column in the target table is not updated. Otherwise, the corresponding target column is updated to NULL, since the source column is NULL.

 

 

 

Now go to the target table properties; the before and after states of the update are shown in the screenshots.

The source data information is shown next.

Now create the job in Data Services (Job -> Work Flow -> Data Flow -> Source & Target) and make EMP_ID the primary key.

The screenshots show the source table before loading, the source table after NULL values were introduced, and the target table updated before loading the data.

Then finally execute the job and check the output.

 

Ignore Columns with Value Option:

If a matching row exists, the row is updated depending on the values of "Ignore columns with value" and "Ignore columns with null".

When the column data from the source matches the value specified in "Ignore columns with value", the corresponding column in the target table is not updated. The value may be spaces. Otherwise, the corresponding target column is updated with the source data.
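A hypothetical Python sketch of this column-level behaviour (illustrative only, not Data Services code) shows how "Ignore columns with null" and "Ignore columns with value" decide whether a matching target row keeps its existing value or takes the source value:

IGNORE_COLUMNS_WITH_NULL = True
IGNORE_VALUE = ""          # the "Ignore columns with value" setting; may be spaces

def merge_row(target_row, source_row):
    for col, new_value in source_row.items():
        if new_value is None and IGNORE_COLUMNS_WITH_NULL:
            continue                        # keep the existing target value
        if new_value == IGNORE_VALUE:
            continue                        # source carries the "ignore" value
        target_row[col] = new_value         # otherwise update with the source data
    return target_row

existing = {"EMP_ID": 1, "NAME": "Asha", "PHONE": "555-0101"}
incoming = {"EMP_ID": 1, "NAME": None,   "PHONE": ""}
print(merge_row(existing, incoming))        # NAME and PHONE keep their target values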

 

 

Now go to the target table properties; the before and after states of the update are shown in the screenshots.

The source data information is shown next.

Now create the job in Data Services (Job -> Work Flow -> Data Flow -> Source & Target) and make EMP_ID the primary key.

The screenshots show the source table before loading and the target table updated before loading the data.

Then finally execute the job and check the output.

 

Update Key Columns in Update Control:

Now go to the target table properties; the before and after states of the update are shown in the screenshots.

The source data information, before and after the update of the source table, is shown next.

Now create the job in Data Services (Job -> Work Flow -> Data Flow -> Source & Target) and make EMP_ID the primary key.

The screenshots show the source table before loading and the target table updated before loading the data.

Then finally execute the job and check the output.

Featured Content Data Services and Data Quality


What is dirty data costing you?

Intrinsically, you know that you need good data. But how far do you need to go? What are the real costs incurred if you DON'T have clean data? Check out this new SAP Data Quality infographic, which highlights the probable extent of your data quality problems and the costs associated with them. This blog by Ina Felsheim also links to the popular Information Governance infographic.

 

 

Data Profiling and Data Cleansing – Use Cases and Solutions at SAP

While some organizations have set up enterprise-wide data governance projects, including managing and tightly integrating people, processes, policies and standards, metrics and tools, other companies are still in the starting phase of mostly departmental data cleansing activities. Recent research from Gartner still indicates that poor data quality is a primary reason why about 40% of all business initiatives fail. Read more in this blog by Niels Weigel.

 

 

Data Quality Performance Guide

This document from Dan Bills provides performance throughput numbers for data quality capabilities within Data Quality Management and Data Services v4.2.

Data Services 4.1 and ODBC MYSQL Connections to MySQL Community Servers in SLES 11


Hello,

 

I have problems running a job with an ODBC MySQL datastore source table.

The job gets error message CON-120302 when trying to connect to the MySQL datastore source table:

 

My SLES 11 production with new Data Services 4.1 shows errors when I try to use a MySQL Datasource.

 

SAP BusinessObjects Data Services

   Version: 14.1.1.392

Linux 3.0.74-0.6.8.1.5506.2.PTF-default #1 SMP Wed May 15 07:26:33 UTC 2013 (5e244d7) x86_64 x86_64 x86_64 GNU/Linux

 

 

The old driver used with Data Services 3.2 on SLES 10 was libmyodbc3-3.51.27.so.

Now on SLES 11, ldd shows:

 

ldd /usd/as16148a/soft/mysql-connector-odbc/usr/lib64/libmyodbc3-3.51.27.so

        linux-vdso.so.1 =>  (0x00007ffff7ffe000)

        libdl.so.2 => /lib64/libdl.so.2 (0x00007ffff7ace000)

        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007ffff78b1000)

        libltdl.so.3 => not found

        libcrypt.so.1 => /lib64/libcrypt.so.1 (0x00007ffff7675000)

        libnsl.so.1 => /lib64/libnsl.so.1 (0x00007ffff745d000)

        libm.so.6 => /lib64/libm.so.6 (0x00007ffff71e4000)

        libodbcinst.so.1 =>

/usd/as16148a/soft/unixODBC/usr/local/lib/libodbcinst.so.1

(0x00007ffff70cd000)

        libc.so.6 => /lib64/libc.so.6 (0x00007ffff6d56000)

        /lib64/ld-linux-x86-64.so.2 (0x0000555555554000)

 

libltdl.so.3 => not found

 

So I tried the new MySQL ODBC driver libmyodbc5w.so:

 

isql 'MySQL S-Web' s-web-reader ###

+---------------------------------------+

| Connected!                            |

|                                       |

| sql-statement                         |

| help [tablename]                      |

| quit                                  |

|                                       |

+---------------------------------------+

SQL> quit

 

odbcinst -j

unixODBC 2.2.14

DRIVERS............:

/usd/as16148a/home/serberw/businessobjects/dataservices/bin/odbcinst.iniSYSTEM DATA SOURCES:

/usd/as16148a/home/serberw/businessobjects/dataservices/bin/odbc.ini

FILE DATA SOURCES..:

/usd/as16148a/home/serberw/businessobjects/dataservices/bin/ODBCDataSources

USER DATA SOURCES..:

/usd/as16148a/home/serberw/businessobjects/dataservices/bin/odbc.ini

SQLULEN Size.......: 8

SQLLEN Size........: 8

SQLSETPOSIROW Size.: 8

 

ldd /usd/as16148a/soft/mysql-connector-odbc/usr/lib64/libmyodbc5w.so | less

        linux-vdso.so.1 =>  (0x00007ffff7ffe000)

        libodbc.so.1 => /usd/as16148a/soft/unixODBC/usr/local/lib/libodbc.so.1 (0x00007ffff787b000)

        librt.so.1 => /lib64/librt.so.1 (0x00007ffff7664000)

        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007ffff7447000)

        libodbcinst.so.1 => /usd/as16148a/soft/unixODBC/usr/local/lib/libodbcinst.so.1 (0x00007ffff7330000)

        libdl.so.2 => /lib64/libdl.so.2 (0x00007ffff712c000)

        libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00007ffff6e22000)

        libm.so.6 => /lib64/libm.so.6 (0x00007ffff6ba8000)

        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007ffff6992000)

        libc.so.6 => /lib64/libc.so.6 (0x00007ffff661b000)

        /lib64/ld-linux-x86-64.so.2 (0x0000555555554000)

lines 1-11/11 (END)

 

 

DSConnectionManager.sh -c does not work.

 

The job gets error message CON-120302 when trying to connect to the MySQL datastore source table:

 

CON-120302: |Session J_3150_AWZT_SWEB| ODBC call <SQLConnect> for data source <MySQL S-Web> failed: <瑠ɱ>. Notify customer support.

 

My SLES 11 production system runs under a technical user, and no root privileges are allowed, either for installing or for running the Data Services 4.1 production system.

 

Does anyone have helpful experience with SLES 11, Data Services 4.1 and MySQL ODBC?

 

Thank you,

 

Rolf-Dieter

Create, Configure and Manage Central Repository (CR) in BODS 4.0


The attached document will help you understand how to create, configure and manage a Central Repository (CR) in BODS 4.0.

 

Note: Open each slide for a better view.

 


 

Thanks,

Sagar Girme

SAP Business Objects Data Services 4.X Cleansing Package Location & Activation


Purpose:


The purpose of this document is to show how to re-install the cleansing package in SAP Business Objects Data Services 4.x.

 

Overview:

 

If you forgot to deploy the cleansing package in SAP Business Objects Data Services 4.x, refer to the points below to activate it again. The screenshots below are from a Windows environment.

Path on my virtual machine where BODS is installed:

C:\Program Files (x86)\SAP BusinessObjects\InstallData\InstallCache\im.lac.ds.cpdata.cp-4.0-core-64\14.1.1.210\actions\data

Currently no files are present in this folder.

Download the latest "SBOP DQM CLEANSING PACK 4.X" for your operating system from the SAP Service Marketplace and paste it into that location:

"C:\Program Files (x86)\SAP BusinessObjects\InstallData\InstallCache\im.lac.ds.cpdata.cp-4.0-core-64\14.1.1.210\actions\data"

Then go to Control Panel and click Modify for SAP Business Objects Data Services 4.x. Previously the data cleansing option was disabled, because no cleansing package was available on the BODS 4.x server; after you place the "SBOP DQM CLEANSING PACK 4.X" package in that location, the "Cleansing package" option is enabled.

Start the installation and activate the Cleansing Package Data.

Now open the SAP Business Objects Data Services Designer 4.x and open any job to check the activation status of the Data Cleanse transform. Click on the Data Cleanse transform and check the Cleansing Package Name status; previously the cleansing package name was blank.

Now validate and execute the job. "SBOP DQM CLEANSING PACK 4.X" is now activated on the server.

Netezza to Teradata Migration using Business Objects Data Services


Hi All

 

Netezza to Teradata Migration

The scope of this document is to detail the design of the steps involved in repointing the existing BODS jobs, which currently point to Netezza, to Teradata.

Why is there a need to migrate from Netezza to Teradata?

Netezza is one of the key platforms serving as a reporting application for BGS. As part of the BGEDW initiative, Netezza is the first data warehouse chosen for migration to the Teradata platform.

The other main reason for migrating the Netezza application to Teradata is the long-standing performance issues in Netezza, which make the reports available to the business very late in the day.

BODS job flows (as actually used in the NZ to TD migration)

In this method, we will be creating BODS jobs for transferring historical data from the NZ layer to the TD layer.

Steps of approach for historical loading:

1)    Prepare metadata for the source and target tables

The source and target table metadata needs to be defined. The target metadata is in the Teradata layer and can be obtained by converting the data types and formats, as well as the Netezza table names, to their Teradata equivalents. This can be achieved with the Netezza-to-Teradata DDL convertor tool, which takes the source DDL (NZ) as input and produces the target DDL (TD). Then define mappings between the source and target tables at column level.
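As a rough illustration of the kind of conversion the DDL convertor performs (the mapping below is a small, assumed subset written for this document, not the actual tool), a column definition can be translated like this:

# Illustrative Netezza-to-Teradata datatype mapping; extend as required.
NZ_TO_TD_TYPES = {
    "BYTEINT": "BYTEINT",
    "NVARCHAR": "VARCHAR",
    "DOUBLE PRECISION": "FLOAT",
    "BOOLEAN": "BYTEINT",   # assumption: booleans carried as 0/1
}

def convert_column(col_name, nz_type):
    base = nz_type.split("(")[0].strip().upper()
    td_type = NZ_TO_TD_TYPES.get(base, base)      # pass through unmapped types
    length = nz_type[len(base):]                  # keep any (n) or (p,s) suffix
    return col_name + " " + td_type + length

print(convert_column("CUST_NAME", "NVARCHAR(100)"))   # CUST_NAME VARCHAR(100)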

2)    Build BODS jobs for data transfer

Teradata Parallel Transporter with the fast load option in replace mode is used to load the Teradata tables with 1:1 mapping in the BODS job flow built for this purpose. A single BODS job can be built and then, with the help of macros, replicated for any number of tables we want to migrate, with 1:1 mapping based on column position.

3)    Perform migration

Populate tables in target by running the BODS jobs. We need to run the BODS job by passing the required parameters to populate the target tables. This whole process can be automated and run for desired number of times to migrate all the required tables for historical loading.

For Example:


How to migrate from Netezza to Teradata?


 

Source:  BODS jobs extract data from various source systems and create flat files based upon them.

Arrival: The flat files created as part of the source process will be placed into a landing area referred to as the ‘arrival’ area.

Processing: Various BODS data flows load the data into the staging area with transformations.

Staging: Data is loaded in from the processing area and transformations are undertaken for history handling.

ODS/WH: Various BODS processes apply business rules against the data in the staging databases.

Data Marts: Various BODS processes apply business rules against the data in the staging databases.

To implement the above approach, we will divide the action items in 4 levels:

a)    Source to Archive

b)    Processing to Staging

c)    Staging to Warehouse

d)    Warehouse to Data Mart (Server Specific)

a)   Source To Archive:

1)    Import the job into the repository.

2)    The existing dataflow DF_CONTROL will be replaced with a new work-flow named W_INSERT_DF_CONTROL.

3)    Assign the values to the parameters defined for W_INSERT_DF_CONTROL workflow

4)    The DF_CONTROL table will be re-imported into the ADMIN datastore.

5)    Open the Main dataflow (e.g. D_SRC_ARV_UABSCON) and change the target table properties for the DF_CONTROL by setting the Auto Correct Load = YES under the Advanced Target Table properties.

6)    One ‘source to archive’ work flow (W_SRC_ARV_UABSCON) is shown in the below figure.


b)  Processing to Staging:

1)    Import the job into the repository.

2)    The existing dataflow DF_CONTROL will be replaced with a new work-flow named W_INSERT_DF_CONTROL.

3)    Assign the values to the parameters defined for W_INSERT_DF_CONTROL workflow.

4)    All the tables that will be used have to be re-imported from their respective datastores.

5)    The W_RECOVERY_STG work-flow will be updated by replacing the double dot ‘..’ identifier after the database name with a single dot ‘.’ e.g. [$$SP_DB_EDW_ADMIN_DATABASE]..DF_CONTROL in the workflow W_RECOVERY_STG is changed to [$$SP_DB_EDW_ADMIN_DATABASE].DF_CONTROL.

6)    If required, we will have to add the Post-Load and Pre-Load SQL commands obtained from the migration inventory. The migration inventory SQL commands will have to be converted to Teradata SQL first. Change the Bulk Loader Options for the target table as follows

i.        Bulk Load = Parallel Transporter.

ii.        File = Named Pipe and access module

iii.        Operator = If data already exists - Update else Load.

iv.        Number of Instances = 1.

v.        Mode = Append.

vi.        Bulk Operation = Insert.

vii.        Field Delimiter = 1/27

Named Pipe Parameters

The default value for all the parameters is C:\Program Files\Business Objects\BusinessObjects Data Services\Log\BulkLoader. Change it as below:

viii.        Logdirectory = [$$SP_LOG_DIRECTORY]

ix.         FallbackDirectory = [$$SP_LOG_DIRECTORY]

7)    Change the datatype of the parameter [$P_DATETIME_ACTIVITYDATE] from DATETIME to VARCHAR (19), and enclose it in single quotes when it is used in pre-/post-load commands (see the sketch below).
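A small illustrative Python sketch of why VARCHAR (19) fits here (the assumption being that the DATETIME is rendered as 'YYYY-MM-DD HH:MM:SS', which is exactly 19 characters, and the quoted string is then embedded in the pre-/post-load SQL):

from datetime import datetime

activity_date = datetime(2013, 11, 4, 12, 30, 0)                       # example value
p_datetime_activitydate = activity_date.strftime("%Y-%m-%d %H:%M:%S")
print(len(p_datetime_activitydate))          # 19
print("'" + p_datetime_activitydate + "'")   # single-quoted for the pre-/post-load command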

            One processing to staging work flow is shown in the below figure.


             c)   Staging to Warehouse:

1)    Import the job into the repository.

2)    The existing dataflow DF_CONTROL will be replaced with a new work-flow named W_INSERT_DF_CONTROL.

3)    Assign the values to the parameters defined for W_INSERT_DF_CONTROL workflow $P_W_VARCHAR_JOBNAME = $L_VARCHAR_JOBNAME

4)    All the tables that will be used have to be re-imported from their respective datastores.

5)    The W_RECOVERY_STG work-flow will be updated by replacing the double dot ‘..’ identifier after the database name with a single dot ‘.’. e.g. [$$SP_DB_EDW_ADMIN_DATABASE]..DF_CONTROL in the workflow W_RECOVERY_STG is changed to [$$SP_DB_EDW_ADMIN_DATABASE].DF_CONTROL.

6)    Change the datatype of the parameter [$P_DATETIME_ACTIVITYDATE] =VARCHAR (19) from DATETIME.

7)    If required, we will have to add the Post-Load and Pre-Load SQL commands obtained from the migration inventory.

8)    While the load is happening from Teradata to Teradata tables (BODS extracts data, performs transformations, then loads), change the Bulk Loader Options as below:

9)    If any custom SQL is used for the lookups, it has to be modified to make it Teradata compatible. Refer section 10.2 for SQL conversion from Netezza to Teradata.

10)  Open the Target Table Editor and go to Options tab and change the Column Comparison = Compare by position in the General settings.

              One Staging to Warehouse Work Flow is shown in the below figure.


     d)    Warehouse to Server Specific (Data Mart)

1)    Import the job into the repository.

2)    The existing dataflow DF_CONTROL will be replaced with a new work-flow named W_INSERT_DF_CONTROL.

3)    Assign the values to the parameters defined for W_INSERT_DF_CONTROL workflow

4)    All the tables that will be used have to be re-imported from their respective datastores.

5)    The W_RECOVERY_SS work-flow will be updated by replacing the double dot ‘..’ identifier after the database name with a single dot ‘.’. e.g. [$$SP_DB_EDW_ADMIN_DATABASE]..DF_CONTROL in the workflow W_RECOVERY_SS is changed to [$$SP_DB_EDW_ADMIN_DATABASE].DF_CONTROL.

6)    Change the datatype of the parameter [$P_DATETIME_ACTIVITYDATE] = VARCHAR (19) from DATETIME

7)    Add the Post-Load and Pre-Load SQL commands

        One Warehouse to Service Specific work flow (W_WH_SS_PROD_MVMT_HMOVE) is shown in the figure below.


Maestro/ Tivoli Workload Scheduler(TWS)- Batch

Introduction

Tivoli Workload Scheduler (TWS) provides the backbone for automated workload management and monitoring. Offering a single console, real-time alerts, reports and self-healing capabilities, this is software automation to help manage the workloads that span your enterprise.

Scheduler Jobs Process

In order to offer greater flexibility, improved throughput and concurrency, the existing schedules shall be updated to execute BODS processes at the atomic level. In essence, this means that each dataflow in BODS shall have a corresponding job in TWS.

Current Structure:

The current BODS job structure hierarchy with Netezza is shown below.


So, we can have ‘n’ number of workflows in one job and similarly ‘n’ number of dataflows in one workflow, thereby the relation being One-to-Many for Job:Workflow and One-to-Many for Workflow:Dataflow.

 

In this case the dependency of one workflow would be on the successful completion of all the dataflows created under it and similarly a whole job would be successful with the completion of all the workflows below that job in the hierarchy.

 

 

Proposed structure of BODS Jobs with Teradata

 


So, we can have only one workflow in one job and ‘n’ number of dataflows in one workflow, thereby the relation being One-to-One for Job:Workflow and One-to-Many for Workflow:Dataflow.

In this case, the dependency of one workflow would be on the successful completion of all the dataflows created under it and similarly a whole job would be successful with the completion of just one workflow created under that job in the hierarchy.

 

By following the above process, we can migrate from Netezza to Teradata using Business Objects Data Services.

 

 

Thanks & Regards,

Vishakha Nigam


Data Quality Management, version for SAP, Product Tutorials


DataServices : Install BODS function group on SAP system


Method 1 – Recommended one: Using the CTS (Transport)

 

Prerequisites:

  • Identify the correct R900xxx.xxx and K900xxx.xxx files to use; to do so, read the readme.txt file in the folder <install_dir>/Business Objects Data Services/admin/R3_functions/transport.
  • For Data Services versions before 4.1, you need to check whether the SAP system is Unicode or non-Unicode. To do so, log on to the SAP system and, from the menu bar, go to System -> Status.

 

 

Step-by-step:

1. On the SAP server: copy R900xxx.xxx into /usr/sap/trans/data and K900xxx.xxx into /usr/sap/trans/cofiles, and check the write privileges on both files (chmod 666 on Unix). With transaction AL11 you can check that both files are present.

2. Launch transaction STMS in SAP and open the import queue (truck icon); the Data Services transport is not yet available.

3. In the menu bar, go to Extras -> Other Requests -> Add.

4. Select the Data Services transport to add. If the files copied into the folders are x900yyy.R63, the transport request is R63K900yyy. Confirm with OK.

5. The Data Services transport is now available. Select it and click the import (truck) icon.

6. Fill in the right target client and mark the checkbox "Ignore Invalid Component Version". Confirm with OK.

7. Open transaction SE37 and check whether the function modules starting with /BODS/* are available. The function module names are listed in the Data Services documentation.

Method 2 - Use .txt file

            (See DataServices documentation)

DataServices: Use Message Server or SAPLogon Group in Datastore


In some SAP environments it is common to encounter a technology called Message Server, or SAP Logon Group, which is used to load-balance or distribute users and jobs across several SAP application servers.

It is possible to create a DS datastore that follows the same approach.

 

The Architecture should be similar to:


 

 

For a connection through the message server, the SAP datastore in Data Services can only be configured through a sapnwrfc.ini file.

The following tasks are done on the Data Services server:
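As an illustration only, a load-balancing entry in sapnwrfc.ini usually looks something like the snippet below. The host, system ID, port and logon group are placeholders, and the exact parameter names should be verified against the SAP NetWeaver RFC SDK and the Data Services Supplement for SAP for your release.

DEST=SAP_PRD_LB
MSHOST=sapmsprd.example.com
MSSERV=3601
R3NAME=PRD
GROUP=PUBLIC

The idea is that the SAP datastore then refers to this destination rather than to one specific application server.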



In the SAP datastore you should now use:

 


In addition, you can set up the RFC destinations in the Data Services Management Console. Here, only application servers can be used, not the message server. This means that you have to add one RFC connection per application server present in the SAP landscape:


DataServices: Role and Authorizations for SAP


 

 

 

 

 

Create an SAP role via PFCG transaction code and insert authorization objects and corresponding values as below:

                                                                                         

 

Authorization Objects

 

Comments

 

1

 

S_RFC– Authorization Check for RFC   Access

                                                           
  

Field

   
  

Values

   
  

ACTVT

   
  

16

   
  

RFC_NAME

   
  

BAPI, CADR, RFC1, RSAB, SCAT,     SDIF,SDIFRUNTIME, SDTX, SYST, SLST, SUNI, SUTL, /BODS/BODS or ZAW*

   
  

RFC_TYPE

   
  

FUGR

   

 

/BODS/BODS   or ZAW* depending on the function group name   where transport was done (check chapter 1)

 

2

 

S_RFC_ADM– Administration for RFC   Destination

                                                                         
  

Field

   
  

Values

   
  

ACTVT

   
  

01, 02, 03

   
  

ICF_VALUE

   
  

*

   
  

RFCDEST

   
  

Name of RFC destination dedicated to DataServices

   
  

RFCTYPE

   
  

*

   

 

Name of   RFC destination dedicated to DataServices (View   SM59 Info)

 

3

 

S_TCODE– Transaction Code Check

                               
  

Field

   
  

Values

   
  

TCD

   
  

RSA1, SE37,     SE38, SU53

   

 

 

4

 

S_PATH – File System Access

                                             
  

Field

   
  

Values

   
  

ACTVT

   
  

02, 03

   
  

FS_BRGRU

   
  

*

   

 

 

5

 

S_ADMI_FCD – System Authorization

                               
  

Field

   
  

Values

   
  

S_ADMI_FCD

   
  

MEMO, ST0R

   

 

 

6

 

S_BTCH_ADM – Background Administrator

                               
  

Field

   
  

Values

   
  

BTCADMIN

   
  

Y

   

 

 

7

 

S_BTCH_JOB – Operations on Background Jobs

                                             
  

Field

   
  

Values

   
  

JOBACTION

   
  

DELE, RELE

   
  

JOBGROUP

   
  

*

   

 

 

8

 

S_BTCH_NAM – Background User Name

                               
  

Field

   
  

Values

   
  

BTCUNAME

   
  

ALEREMOTE or Name of Background User

   

 

ALEREMOTE or name of the background user; specify a custom user if ALEREMOTE is not used

 

9

 

S_CTS_ADMI – Transport Administration

                               
  

Field

   
  

Values

   
  

CTS_ADMFCT

   
  

PROJ

   

 

 

10

 

S_DATASET – Authorization for File Access

                                                           
  

Field

   
  

Values

   
  

ACTVT

   
  

33, 34, A6,     A7

   
  

FILENAME

   
  

*

   
  

PROGRAM

   
  

Program Name

   

 

Program name, to be defined according to your requirements or generated programs; could be *

 

11

 

S_SCRP_TXT – Standard Text

                                                                         
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

LANGUAGE

   
  

*

   
  

TEXTID

   
  

*

   
  

TEXTNAME

   
  

*

   

 

 

12

 

S_TABU_DIS – Table Maintenance

                                             
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

DICBERCLS

   
  

*

   

 

 

13

 

S_USER_GRP – User Master Maintenance

                                             
  

Field

   
  

Values

   
  

ACTVT

   
  

03, 05

   
  

CLASS

   
  

CPIC, SUPER

   

 

 

14

 

S_DEVELOP – ABAP Workbench

                                                                                       
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

DEVCLASS

   
  

Development     Package Name

   
  

OBJNAME

   
  

*

   
  

OBJTYPE

   
  

FUGR, PROG

   
  

P_GROUP

   
  

*

   

                                                                                       
  

Field

   
  

Values

   
  

ACTVT

   
  

01, 02, 03

   
  

DEVCLASS

   
  

$TMP

   
  

OBJNAME

   
  

Name of specific objects

   
  

OBJTYPE

   
  

PROG

   
  

P_GROUP

   
  

*

   

 

Name of specific objects; declare here the program names called by DataServices (for example Z*, Y*)

 

 


If working on BW system:

                                   

 

Authorization Objects

 

Comments

 

1

 

S_RS_ADMWB – Data Warehousing Workbench - Objects

                                             
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

RSADMWBOBJ

   
  

APPLCOMP, INFOAREA, INFOOBJECT,     SOURCESYS, WORKBENCH

   

 

 

2

 

S_RS_DTP – Data Warehousing Workbench - DTP

                                                                                                                   
  

Field

   
  

Values

   
  

ACTVT

   
  

03, 16

   
  

RSONDTPSRC

   
  

Name     of source

   
  

RSONDTPTGT

   
  

Name of target

   
  

RSSTDTPSRC

   
  

*

   
  

RSSTDTPTGT

   
  

*

   
  

RSTLDTPSRC

   
  

Check     Type of source

   
  

RSTLDTPTGT

   
  

Check Type of target

   

 

Check type of source for DTP (for example: ODSO, CUBE …)

Check type of target for DTP (for example: DEST for OHD …)

 

3

 

S_RS_ICUBE – Data Warehousing Workbench - InfoCube

                                                                         
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

RSICUBEOBJ

   
  

DEFINITION

   
  

RSINFOAREA

   
  

*

   
  

RSINFOCUBE

   
  

*

   

 

 

4

 

S_RS_ODSO – Data Warehousing Workbench - DSO

                                                                         
  

Field

   
  

Values

   
  

ACTVT

   
  

03

   
  

RSINFOAREA

   
  

*

   
  

RSODSOBJ

   
  

*

   
  

RSODSPART

   
  

DEFINITION

   

 

 

5

 

S_RS_PC – Data Warehousing Workbench – Process Chains

                                                                         
  

Field

   
  

Values

   
  

ACTVT

   
  

03, 16

   
  

RSPCAPPLNM

   
  

*

   
  

RSPCCHAIN

   
  

Name of Process Chains

   
  

RSPCPART

   
  

DEFINITION, RUNTIME

   

 

 

 


How to import the role into the SAP system?

 

Use PFCG transaction code:

                                                           

Step N°

 

Screenshot

 

Comment

 

1

 
04-11-2013 14-54-50.png

In menu click on Role -> Upload

 

2

 

     image002.png

 

Execute ok.png       

And select the attached file ZS_DS_SAP.SAP.

 

3

 
  image004.png

Execute  ok.png       

 

4

 
  04-11-2013 14-58-24.png

Check successful status

 

5

 
  04-11-2013 14-58-40.png

Change the uploaded role

 

6

 
  04-11-2013 14-58-55.png

Go to Authorizations Tab

And Click on Change Authorization Data button.

 

7

 
  04-11-2013 14-59-14.png

Complete the open authorization values (yellow traffic light)

And Save   image009.png  

 

8

 
  04-11-2013 14-59-29.png

Check and Executeok.png       

 

9

 

     04-11-2013 14-59-46.png

 

Generate 04-11-2013 15-00-03.png and exit

 

EIM AD / SSO Tips & Tricks


This paper combines all the steps from the BI 4.x, DS 4.x, FIM 10 and Intercompany 10 Administrator's Guides with the latest best practices and all the latest SAP KBAs regarding Vintela, Kerberos and Java AD configuration. It is specifically written for Platform 4.x and will not work with earlier versions of XI.

 

 

 

Assumptions & Prerequisites

·         Review the BI4 and DS4 Product Availability Matrix (PAM) to ensure your client and server operating systems, browsers and Active Directory versions are supported.

·         Windows AD authentication only works if the CMS is run on Windows. For single sign-on to the database, the reporting servers must also run on Windows. These Windows machines must be joined to the appropriate AD domain.

·         For multiple AD Forest environments please refer to KBA 1323391

Section 1 – Configure IPS / BI Platform  for AD authentication

Refer to KBA 1631734 to configure Active Directory for IPS / BI Platform and the Tomcat server. The following are the high-level steps covered by this KBA:

i)         AD Service Account Configuration on Windows AD server

ii)        Configure the AS plugin on CMC

iii)       Configure all services to run on new service account

iv)       Configure manual AD to Java Application Servers (Tomcat)

v)        Configure BI Launchpad and CMC for manual AD

vi)       Configure Active Directory SSO on Tomcat

 

The following sections focus on the additional configurations required to enable AD authentication and SSO (where supported) on client components for BI and EIM 4 stack.

 

Additional Tips:

a)   SSO will not work from the Tomcat Server, use client machine for tests

b)   Create SPNs for all application URLs for hostnames and FQDN (Fully Qualified Domain Names)

c)   If load balancer is configured, create SPN for all URLs of load balancers (including FQDNs)

d)   If there are multiple domain controllers for AD authentication, configure all of them in the [realms] section of krb5.ini (a sketch of such a file follows this list)

e)   If you are not able to locate Tomcat properties shortcut, run the following command on command line to open the configuration (Example is for Tomcat6)

Tomcat6w //ES/<Service Name>

where <Service Name> is the name under which the Tomcat service runs (it can be found in Windows Services). By default the name is BOEXI40Tomcat.

f)    In multi-server architecture, KBA needs to be applied on all the machines running IPS / BI platform and application servers

g)   Not all products support SSO, however most of them support Windows AD. Refer to product admin guides for details

h)   SSO does not work for CMC for security reasons, however manual AD is supported
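For tip d) above, the sketch below shows what a krb5.ini with two domain controllers for one realm might look like, written out from Python for convenience. MYCOMPANY.COM and the host names are placeholders, and the layout follows the standard MIT Kerberos format; compare it with the sample in KBA 1631734 before using it.

# Hedged sketch of a krb5.ini listing two KDCs (domain controllers) for one realm.
KRB5_INI = """[libdefaults]
    default_realm = MYCOMPANY.COM

[realms]
    MYCOMPANY.COM = {
        kdc = dc01.mycompany.com
        kdc = dc02.mycompany.com
        default_domain = mycompany.com
    }
"""

# c:\windows\krb5.ini is the location referenced later in this document for the
# client tools; adjust it if your environment uses a different path.
with open(r"c:\windows\krb5.ini", "w", encoding="ascii") as krb_file:
    krb_file.write(KRB5_INI)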

 

Section 2 – Configuring BI Clients for AD Authentication

In Section 1, we have already configured the CMC and BI Launchpad to run with AD and SSO. This section focuses on the remaining client tools (where additional configuration is required).

 

Information Design Tool (IDT) AD Configuration

On each client machine, navigate to InformationDesignTool.ini at the following path

<LINK_DIR>\SAP BusinessObjects Enterprise XI 4.0\win32_x86

 

Add the following configuration (Ensure that IDT application is not running)

 

-Djava.security.auth.login.config=c:\windows\bscLogin.conf

-Djava.security.krb5.conf=c:\windows\krb5.ini

 

Note: while running the application, enter the SYSTEM name and set the Authentication method to 'Windows AD', leaving 'User Name' and 'Password' blank.

 

Universe Designer AD Configuration

No additional configuration is required. While running the application, enter the SYSTEM name and set the Authentication method to 'Windows AD', leaving 'User Name' and 'Password' blank.

 

Web Intelligence Rich Client AD Configuration

No additional configuration is required. While running the application, enter the SYSTEM name and set the Authentication method to 'Windows AD', leaving 'User Name' and 'Password' blank.

 

LCM (Life Cycle Management) SSO Configuration

Create a file LCM.properties at the following path

<LINK_DIR>\Tomcat6\webapps\BOE\WEB-INF\config\custom\

 

Update the file with the following contents (a small scripted version follows the list):

authentication.visible=true
authentication.default=secWinAD
cms.default=<cmc-name>:<cmc-port>
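If the same LCM.properties has to be rolled out to several application servers, a small script can write it consistently. The sketch below is a hypothetical Python helper; the CMS host and port and the target path are placeholders and must match the <LINK_DIR>\Tomcat6\... path given above.

import os

# Placeholder path - substitute the <LINK_DIR>\Tomcat6\webapps\BOE\WEB-INF\config\custom
# directory from the step above.
LCM_PATH = r"C:\LINK_DIR\Tomcat6\webapps\BOE\WEB-INF\config\custom\LCM.properties"

properties = {
    "authentication.visible": "true",
    "authentication.default": "secWinAD",
    "cms.default": "cms01:6400",   # hypothetical <cmc-name>:<cmc-port>
}

os.makedirs(os.path.dirname(LCM_PATH), exist_ok=True)
with open(LCM_PATH, "w", encoding="ascii") as props:
    for key, value in properties.items():
        props.write(key + "=" + value + "\n")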

 

Note: a Tomcat restart is required for the changes to take effect, and the .properties file change must be made on all application servers.

 

Section 3 – Configuring Data Services 4.x and FIM 10 for AD Authentication

 

Data Services Designer AD Configuration

No additional configuration is required. While running the application, enter the SYSTEM name and set the Authentication method to 'Windows AD', leaving 'User Name' and 'Password' blank.

 

Data Services Management Console AD Configuration

Ensure that Section 1 is performed on all the application server machines where DS Management Console application is running

 

FIM AD Configuration

Navigate to FIM (Financial Information Management) administration console web application and select the option ‘Configure Financial Information Management Server’.

Check the option 'Show Authentication Mode'. This enables the FIM application to show an additional drop-down for selecting the authentication mode.

 

 

Section 4 – Configuring Intercompany 10 for AD authentication

 

Intercompany AD Configuration

Navigate to the Intercompany Administration console web application and select 'Authentication'. Set the authentication mode to 'BusinessObjects' from the drop-down and ensure that the correct CMS server is set.

 

To configure the users on Intercompany application to use AD authentication, perform the following for each user

a)     Log on to Intercompany application for business users

b)    Navigate to Users tab to add new users

c)     Click ‘Add New’ and select the following options

Authentication Mode:- Business Objects Security

Code:- Domain ID for account for which AD authentication will happen

Associated External Login:- Domain ID for account

User Profile:- As per requirement

 

Note: -

1)     The Intercompany application is an exception and does not re-use CMC users; new users must be created and mapped in the Intercompany application

2)   There is no authentication drop-down available for the Intercompany business application. The application decides based on the account name and password supplied, i.e. if the account belongs to both Enterprise and AD, the password decides the authentication type (assuming both types of authentication are allowed on the server)

 

 

References

BI 4 Documentation: http://help.sap.com/

EIM 4 Documentation: http://help.sap.com/

FIM 10 Documentation: http://help.sap.com/

Intercompany 10 Documentation: http://help.sap.com/

SAP Knowledge Base Articles: http://service.sap.com/bosap-support

SAP SDN Business Objects User  forums (requires free registration) https://www.sdn.sap.com/irj/sdn/businessobjects-forums

 

 

 

Appendix

Key Terms

Some terms or acronyms we will be referring to throughout this document

AD Plugin– The area in the CMC where the query account is entered, SPN is set, and group mapping rules are configured

AD– Active Directory – Microsoft’s directory server

CMC– Web Admin tool used to configure the CMS service and other parameters for Business Objects Enterprise

FQDN– The Fully Qualified Domain Name.  For example, the FQDN of your Tomcat server may be Tomcat01.SAP.COM

SPN– Service Principal Name refers to an additional alias and attribute to an AD account. Various tools can be used to add an SPN to an AD account. The SPN is a primary access point for kerberos applications.

SSO - Single Sign-On – The ability to access an application without entering login credentials also known as silent sign-on, automatic logon, etc

Service account– Refers to an Active Directory user with special permissions (such as a fixed, non- changing password or SPN)

 

 

Copyright

© Copyright 2013 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Sun Microsystems, Inc.

JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

 

How to resolve connectivity error "BODI-1112172" when running SAP Business Objects Data Services Designer on Windows 7 Client

Applies to:

 

This document applies to SAP Business Objects Data Services 14.x running on Microsoft SQL Server 2008 or higher as the repository database and being accessed via the Designer client software on a Windows 7 (32-bit) client PC/laptop.

 

Summary

 

This document explains how to resolve the connectivity error (error code BODI-1112172) encountered when accessing a Business Objects Data Services repository from Windows 7 using the Designer client.

 

Introduction

 

When accessing SAP Business Objects Data Services repositories in SQL Server 2008, the connectivity error code BODI-1112172 is a common one. The error predominantly occurs when running the SAP Business Objects Data Services Designer client on a Windows 7 PC/laptop to access an MS SQL Server or Oracle repository. This document applies only to cases where the repository database is hosted in Microsoft SQL Server.

 

Fig 01.png

 

Some of the common causes for this error are listed below (a quick standalone connectivity check is sketched after the list):

 

  1. Windows firewall settings
  2. The data source is not accessible via ODBC
  3. SQL Server connectivity is set to listen on a dynamic port instead of the default port 1433 for BOE
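Before touching the DSN, a quick standalone check from the affected client often narrows the problem down to the firewall or to the dynamic-port setting. The sketch below uses the pyodbc package with the same SQL Server ODBC driver the DSN relies on; the server, database and credentials are placeholders.

import pyodbc

# Placeholder connection details - replace with your repository server and database.
conn_str = (
    "DRIVER={SQL Server};"
    "SERVER=sqlhost.example.com,1433;"   # force the default port 1433
    "DATABASE=DS_REPO;"
    "UID=ds_repo_user;PWD=secret"        # or Trusted_Connection=yes for Windows AD
)

try:
    with pyodbc.connect(conn_str, timeout=5) as conn:
        row = conn.cursor().execute("SELECT @@VERSION").fetchone()
        print("Connected:", row[0])
except pyodbc.Error as exc:
    print("Connection failed:", exc)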

 

In cases where the connection to the SQL Server repository database is not through Windows Active Directory, there will be a prompt for the repository password as shown in the figure below.

 

Fig 2 RepositoryPassword.png

 

Resolution Steps

 

The working assumption of this document is that the Repository Password entered is correct. A resolution for this issue when connecting using ODBC for SQL Server 2008 is to create a system DSN on the client PC that is used for running SAP Business Objects Data Services Designer Client software. The steps involved are provided below.

 

 

Step 1– Go to Control Panel and open Administrative Tools

 

Fig 3.png

 

 

Step 2 – Open “Data Sources (ODBC)”

 

Fig 4.png

 

 

This will display the “ODBC Data Sources Administrator”

 

 

Step 3 – Click on the Add button.

Fig 5.png

This will display the “Create New Data Source” dialog box

 

 

Step 4– In the “Create New Data Source” dialog box, select SQL Server and click on Finish.

Fig 6.png

 

 

Clicking Finish will open the “Create New Data Source to SQL Server” dialog box in Step 5

 

Step 5– In the “Create New Data Source to SQL Server” dialog box, enter the connection details.

 

Fig 7_1.png

 

Step 6– Choose the authentication method

  • In case of Windows Active Directory access model, choose the "With Windows NT authentication using Network login ID"

          Fig 8.png

  • If SQL Server Authentication is used choose the "With SQL Server authentication using a login ID and password entered by the user". Enter the user name and password accordingly as shown in figure below

               Fig 9.png

Note that in the case of SQL Server authentication, it is not necessary to use system administrator credentials. Any SQL Server user account with the relevant level of access to the repository database(s) is sufficient.

 

After choosing the relevant authentication method and entering the credentials where applicable, click on "Client Configuration" button. This will open the "Add Network Library Configuration" dialog box in Step 7.

 

 

Step 7–  Enter the network configuration as shown in the figure below. 

 

Fig 10.png

Note that the port must be set to the default BOE140 installation port 1433 and the option "Dynamically determine port" must be unchecked. The protocol used is TCP/IP for the MS SQL Server connection using ODBC.

 

Also note that the server alias or server name must not be changed and should be the same as the SQL Server mentioned in Step 5.

 

Click on 'OK' after completing the settings. This will take the control back to the previous dialog box as shown below.

 

Fig 11.png

Click on "Next"

 

 

 

Step 8– For the purposes of SAP Business Objects Data Services Designer client connectivity to the repository in MS SQL Server, do not alter the settings in the subsequent two dialog boxes and click on “Finish”.

 

Fig 12.png

 

Fig 12a.png

 

 

 

Step 9– Upon clicking Finish, the “ODBC Microsoft SQL Server Setup” dialog box will appear. Click on “Test Data Source” button.

 

Fig 14.png

 

 

If the connection parameters and logon credentials entered are correct, the test should be completed successfully as shown below in the “SQL Server ODBC Data Source Test” dialog box.

 

Fig 15.png

 

Step 10– Click on OK on the “SQL Server ODBC Data Source Test” dialog box and then on “ODBC Microsoft SQL Server Setup” dialog box.

 

If the tests were successful, the “ODBC Data Source Administrator Window” will appear with a new System DSN BOE140. Click on the ‘OK’ button to close the dialog box.

 

Fig 16.png

 

Step 11– Open the Data Services Designer Logon window, enter the credentials and click on “Log On”.

 

Fig 17.png

 

If using Enterprise authentication and the Windows logon ID is not synced with SQL Server via Active Directory, then enter the repository password in the “Repository Password” dialog box and click “OK”.

 

Fig 2 RepositoryPassword.png

If the password entered is correct and the access is allowed for the user to the repository database in MS SQL Server security, then the SAP Business Objects Data Services Designer Client window will open for managing SAP Data Services Objects in the repository.

 

Fig 18.png

 

If the access for the user is set to read-only, in the SQL Server Security set-up, then the user will only be able to view the objects saved in the repository. If full read/write access is provided for the user to the repository database, then the user can create/edit/delete objects in the repository. Similar access can be set-up using a specific security model targeting a given set of security requirements in the CMC.

BODI 11.7 to BODS 4.1 Migration


Hi All,

 

This document explains the code migration process from BODI (Business Objects Data Integrator) 11.7 to BODS (Business Objects Data Services) 4.1.

 

                                                    

 

                                                            Migration Approach

 

  • Installation
              Install BODS 4.1 in the new server.

  • Migration
              Perform the code migration.

  • Testing of the Migrated objects
              Perform the testing of the codes.

 

                                                                   Migration Steps

 

  1. Take backups of the repositories and configuration files of BODI 11.7.
  2. Install BODS 4.1 on the new Linux server.
  3. Create the job servers that already exist in the current BODI version in BODS 4.1 as well, with the same names.
  4. Repository creation: can be done in two ways, detailed below in Approach 1 and Approach 2.
  5. Migrate the central repository users, repositories and Management Console users to BODS using DSXI40UpgradeMgr.sh.
  6. Assign job servers to the BODI repositories.
  7. Test all the jobs.

Details on the different methods of repository creation in step 4:

 

     Approach 1 (Repository Upgrade):

  • Move the existing Oracle schemas corresponding to BODI to the new database for BODS.
  • Use the Repository Manager to upgrade the repositories (central and local).
  • If the Repository Manager is used to upgrade the repository, there can be compatibility issues when migrating from BODI 11.7 to BODS 4.1.
           (Using an intermediate version, BODS 3.2, and first migrating from BODI 11.7 to BODS 3.2 and then to BODS 4.1 helps to reduce the compatibility issues.)
  • Upgrade the repository version to BODS 4.1 using the Repository Manager.
  • Take snapshots of the errors that occur during the migration and make the corrections manually in the jobs.


  Issues in Approach 1:

  • Compatibility issues between BODI 11.7 and BODS 4.1.
  • Takes more time, since an intermediate version is involved.

Approach 2 (ATL back up Method):

  • Create new repositories for BODS 4.1 using the Repository Manager (central/local).
  • Import the .atl files.
  • Take snapshots of the errors that occur during the migration and make the corrections manually in the jobs.
  • Central repository:

    
            1) Take the latest version / required labeled version of all the components (datastores, projects, jobs, workflows, data flows, functions, flat files) to a local repository in BODI 11.7.

 

            2) Take the backup in .atl files from BODI 11.7

 

            3) Import the .atl files into a local repository specially created for central repository code movement in BODS 4.1.

 

            4) Move the code to the BODS 4.1 central repository.

 

Issues in Approach 2:

  • Central repository history cannot be maintained.

 

Note:

  • We used the Approach 2 method for our migration and it was successful.


Teradata Control Framework


Hi Everyone,

 

This document covers following points:

 

1. What is Teradata Control Framework?

2. Control Framework Control Patterns.

3. TERADATA Control Framework 3- Tier Architecture.

4. STREAM concept

5. TERADATA Control Framework Components.

6. Job to start stream

7. Source to Staging Job Flow Diagram.

8. Source to Staging Job steps.

9. Staging to ODS load steps.

 

Teradata Control Framework

 

The Control Framework is a set of architectural standards, development methods, processing patterns and a code library that provides the basis for implementing a true time variant data warehouse.

•The Control Framework is an automated framework to manage the ELT processes of the Teradata Global Architecture (3 Tier EDW) to ensure accuracy and efficiency.

>The Control Framework is used in conjunction with an ELT tool – in itself it is not a tool nor a replacement for a tool.

 

•The Control Framework enforces audit trail and reduces programmer effort.

•The Control Framework consists of the following components.

>Process Control Data Model and standard processing columns for all cross functional model tables.

>Standard code modules for ELT processes, surrogate keys and reference code data management .

>Housekeeping utilities for the daily/intraday control of the ELT processes

 

•The Control Framework is a Teradata Professional Services Consulting Asset that is available for purchase by Teradata customers.

 

Control Framework Control Patterns:

 

•Start of Day

>Opens the business date for a given stream of ELT

•Start of Stream

>Opens the stream instance within a business date for a specific ELT batch

•End of Stream

>Completes the stream instance

•End of Day

>Completes the business date for a given stream

•Register Source File Extract

>Registers the availability of source data in the landing queue.

 

TERADATA CONTROL FRAMEWORK 3- TIER ARCHITECTURE

 

3 Tier Architecture.JPG

 

STREAM Concept

 

•A Stream is a collection of processes that must be completed as a whole to meaningfully transform a set of input data (files or tables) into a coherent output set of tables within the data warehouse. Each collection of dependent processes is identified by a record in this table.

•A Stream is a unit of schedulable work. Once completed a Stream may not be re-run without intervention to revert output data and CTLFW metadata to the pre-run state.

•There is no limit to the number of times a Stream may be run on any given date. Only one instance of any Stream is allowed to be run at a time.

 

Control Framework Components:

 

•Data Model that holds processing metadata

>When, what program, what source, what target, update metrics

>Static metadata, eg System, Files, Paths

>Operational metadata collected by the Standard CTLFW code at run time, e.g. metrics…

 

•Standard Code for the processing, control and registration patterns

>Called and executed to read from and write to the CTLFW tables.

>Guarantees

–Reliability

–Restart

–Accuracy

>Does the audit and control part of the process isolated from the individual pattern.

 

Job to Start Stream_fig1.JPG

 

Source_STAGING1.JPG

 

Source_Staging_Flow_diagram.JPG

 

Souce_Staging_flow_diagram2.JPG

 

Souce_Staging_Job_step1.JPG

 

These lines indicate function1, function2 and function3.

 

Staging_ODS_load_step1.JPG

 

STaging_ODS_load_step2.JPG

 

Thanks & Regards,

Vishakha Nigam

Execute Only Once - Component used to improve Data Services job design


One of the standard interview questions encountered by nearly everyone is: which two objects have the 'Execute Only Once' property, and what does it do?

And of the very few people that get this question right, the next question completely stumps them - Why would you want to set an object to 'Execute Only Once'?

So, the first part of the question is easy to answer, in that data flows and work flows have the execute only once property. Right click on any work flow or data flow and take a look:

 


 
So what does this do? Well, pretty much what it says on the box, it only allows an object to execute once within a single run of a job.

So if we set this property to true for the DF_CUSTOMERS_Load data flow, it would only execute once, despite being in the job six times:

 

                                                            

 

 

Generally this kind of job is never created in real life, so when would be a good time to use this property?

Well, let’s say a job loads 3 fact tables, e.g. FACT_SALES, FACT_RETURNS and FACT_ENQUIRIES, and each one of these facts shares a common dimension called DIM_CUSTOMER.

Now generally a job can be built by running all the staging tables, then all the dimension tables and then all the fact tables:

  

 

This method of building this kind of job has 3 disadvantages:

  1. You can't start building any fact until ALL the dimensions have been loaded. It may be that your slowest fact table build only requires the fastest dimension build to complete before it can start, so you are wasting time by making that fact table wait for all the dimensions to be built first before it can run.
  2. Let’s say you want to split out the 3 fact tables into separate jobs. You'd have to go through each fact data flow and click 'View Where Used' on each source table to make sure you get all the right dimensions and their associated data flows out of the job and into the new job.
  3. When it comes to debugging, if you have a bug in your fact table, you'll need to run all the staging and dimension tables, even if they aren't needed for the fact you are debugging.

So in this scenario the ‘Execute only once’ property can help.

 

Instead of having 3 big work flows for all the staging, then dimension and then fact builds, a work flow can be created for each complete table build, and that work flow is called a component.

 

Within each component will be two further work flows:

1.     One containing all the dependencies for that table, i.e. the components for the dimension tables, and

2.     The other containing the actual data flows needed to build that table.

 

So first of all 3 work flow components are created:

C_FACT_SALES, C_FACT_RETURNS and C_FACT_ENQUIRIES.

 

Here a component is a work flow containing everything that is required to build the table it refers to.

 

                                                                                      

Within each component two work flows can be created:

 

One of these work flows will contain all the data flows needed to build the table itself, and

the other will contain all the components needed to build the tables that need to be built first before the table build can start – like all the staging and dimension tables needed for that fact table.

 
So in the above example the work flow WF_FACT_SALES_Dependencies would look like this:

 

 

 

Each of the components above will have the 'Execute only once' option set. So the C_DIM_CUSTOMER component can be in each of the fact table components, but it will only get executed once.

So in this example the data flows to build the FACT_SALES table will be able to run as soon as the 4 component work flows above have completed. We can also now run the C_FACT_SALES component on its own and get everything we need to build the FACT_SALES table.

By taking advantage of the ‘Execute only once’ option, jobs can be created that:

  1. Run faster
  2. Allow for a quick view of the dependencies required to build a table
  3. Are easier to split into separate jobs
  4. And are easier to debug as I can just run the offending component, rather than the entire job

Account Payable Receivable Idoc Simplified


Hi All,

The goal of this document is to provide field level details for ACC_DOCUMENT03 idoc, used to load and extract data of Account Receivable and Account Payable in SAP ECC.

 

 

We understand that data in SAP are stored in SAP tables, with specific field requirement like data type and Length of field.

 

Often it’s difficult to memorize the field level details for any table.

In my sincere attempt, I have tried to include the field level details for a particular case of AP/AR

 

Please find attached the document, to understand the meaning of each and every field in IDOC, its system requirement (mandatory/non Mandatory), field name, description, segment name and table name where the field lies in SAP.

The document is designed segment wise, with each segment differentiated by colors.

 

 

System Required

Text Description

SAP Technical
Field Name

Field Length

Segment Name

E1BPACHE09 - Header (BKPF)

 

 

 

 

 

Accounting Document Number

BELNR

10

E1BPACHE09

 

Reference Transaction

AWTYP

5

E1BPACHE09

 

Reference Key

AWKEY

20

E1BPACHE09

 

Logical system of source document

AWSYS

10

E1BPACHE09

 

Business Transaction

GLVOR

4

E1BPACHE09

*

User Name

USNAM

12

E1BPACHE09

 

Document Header Text

BKTXT

25

E1BPACHE09

*

Company Code

BUKRS

4

E1BPACHE09

*

Document Date in Document

BLDAT

24

E1BPACHE09

*

Posting Date in the Document

BUDAT

24

E1BPACHE09

 

Translation Date

WWERT

24

E1BPACHE09

 

Fiscal Year

GJAHR

4

E1BPACHE09

 

Fiscal Period

MONAT

2

E1BPACHE09

*

Document Type

BLART

2

E1BPACHE09

 

Reference Document Number

XBLNR

16

E1BPACHE09

 

Cancel: object key (AWREF_REV and AWORG_REV)

OBJ_KEY_R

20

E1BPACHE09

 

Reason for reversal

STGRD

2

E1BPACHE09

 

Component in ACC Interface

COMPO_ACC

4

E1BPACHE09

 

Reference Document Number (for Dependencies see Long Text)

REF_DOC_NO_LONG

35

E1BPACHE09

 

Accounting Principle

ACC_PRINCIPLE

4

E1BPACHE09

 

Indicator: Negative posting

XNEGP

1

E1BPACHE09

 

Invoice Ref.: Object Key (AWREF_REB and AWORG_REB)

OBJ_KEY_INV

20

E1BPACHE09

 

Billing category

BILL_CATEGORY

1

E1BPACHE09

 

Tax Reporting Date

VATDATE

24

E1BPACHE09

E1BPACGL09 - G/L account item (BSIS)

 

 

 

 

*

Accounting_Document_Number

BELNR

10

E1BPACGL09

*

Accounting_Document_Line_Item_Number

BUZEI

10

E1BPACGL09

*

General_Ledger_Account

HKONT

10

E1BPACGL09

 

Item_Text

SGTXT

50

E1BPACGL09

 

Indicator for statistical line items

STAT_CON

1

E1BPACGL09

 

Logical Transaction

LOG_PROC

6

E1BPACGL09

 

Business partner reference key

REF_KEY_1

12

E1BPACGL09

 

Business partner reference key

REF_KEY_2

12

E1BPACGL09

 

Reference_key_for_line_item

XREF3

20

E1BPACGL09

 

Transaction Key

ACCT_KEY

3

E1BPACGL09

 

Account Type

ACCT_TYPE

1

E1BPACGL09

 

Document_Type

BLART

2

E1BPACGL09

*

Company_Code

BUKRS

4

E1BPACGL09

 

Business_Area

GSBER

4

E1BPACGL09

 

Functional_Area

FKBER

4

E1BPACGL09

 

Plant

WERKS

4

E1BPACGL09

 

Fiscal_Period

MONAT

2

E1BPACGL09

*

Fiscal_Year

GJAHR

4

E1BPACGL09

 

Posting_Date_in_the_Document

BUDAT

24

E1BPACGL09

 

Value_date

VALUT

24

E1BPACGL09

 

Financial Management Area

FM_AREA

4

E1BPACGL09

 

Customer Number 1

CUSTOMER

10

E1BPACGL09

 

Indicator: Line item not liable to cash discount?

CSHDIS_IND

1

E1BPACGL09

 

Account Number of Vendor or Creditor

VENDOR_NO

10

E1BPACGL09

*

Assignment_Number

ZUONR

18

E1BPACGL09

 

Sales_Tax_Code

MWSKZ

2

E1BPACGL09

 

Tax Jurisdiction

TAXJURCODE

15

E1BPACGL09

 

Technical Key of External Object

EXT_OBJECT_ID

34

E1BPACGL09

 

Business Scenario in Controlling for Logistical Objects

BUS_SCENARIO

16

E1BPACGL09

 

Cost Object

COSTOBJECT

12

E1BPACGL09

 

Cost_Center

KOSTL

10

E1BPACGL09

 

Activity Type

ACTTYPE

6

E1BPACGL09

 

Profit_Center

PRCTR

10

E1BPACGL09

 

Partner_Profit_Center

PPRCT

10

E1BPACGL09

 

Network Number for Account Assignment

NETWORK

12

E1BPACGL09

 

Work_Breakdown_Structure_Element_(WBS_Element)

PROJK

24

E1BPACGL09

 

Order_Number

AUFNR

12

E1BPACGL09

 

Order Item Number

ORDER_ITNO

4

E1BPACGL09

 

Routing number of operations in the order

ROUTING_NO

10

E1BPACGL09

 

Operation/Activity Number

ACTIVITY

4

E1BPACGL09

 

Condition type

COND_TYPE

4

E1BPACGL09

 

Condition Counter

COND_COUNT

2

E1BPACGL09

 

Level Number

COND_ST_NO

3

E1BPACGL09

 

Funds_Center

FISTL

16

E1BPACGL09

 

Fund

GEBER

10

E1BPACGL09

 

Commitment_Item

FIPOS

14

E1BPACGL09

 

Business Process

CO_BUSPROC

12

E1BPACGL09

 

Main Asset Number

ASSET_NO

12

E1BPACGL09

 

Asset Subnumber

SUB_NUMBER

4

E1BPACGL09

 

Billing Type

BILL_TYPE

4

E1BPACGL09

 

Sales_Order_Number

KDAUF

10

E1BPACGL09

 

Item_Number_in_Sales_Order

KDPOS

6

E1BPACGL09

 

Distribution Channel

DISTR_CHAN

2

E1BPACGL09

 

Division

DIVISION

2

E1BPACGL09

 

Sales Organization

SALESORG

4

E1BPACGL09

 

Sales Group

SALES_GRP

3

E1BPACGL09

 

Sales Office

SALES_OFF

4

E1BPACGL09

 

Sold-to party

SOLD_TO

10

E1BPACGL09

 

Indicator: subsequent debit/credit

DE_CRE_IND

1

E1BPACGL09

 

Partner profit center for elimination of internal business

P_EL_PRCTR

10

E1BPACGL09

 

Indicator: Update quantity in RW

XMFRW

1

E1BPACGL09

 

Quantity

MBGBTR

48

E1BPACGL09

 

Base_Unit_of_Measure

MEINB

3

E1BPACGL09

 

Actual Invoiced Quantity

INV_QTY

48

E1BPACGL09

 

Billing quantity in stockkeeping unit

INV_QTY_SU

48

E1BPACGL09

 

Sales unit

SALES_UNIT

3

E1BPACGL09

 

Quantity in order price quantity unit

PO_PR_QNT

48

E1BPACGL09

 

Order price unit (purchasing)

PO_PR_UOM

3

E1BPACGL09

 

Quantity in Unit of Entry

ENTRY_QNT

48

E1BPACGL09

 

Unit of Entry

ENTRY_UOM

3

E1BPACGL09

 

Volume

VOLUME

48

E1BPACGL09

 

Volume unit

VOLUMEUNIT

3

E1BPACGL09

 

Gross Weight

GROSS_WT

48

E1BPACGL09

 

Net weight

NET_WEIGHT

48

E1BPACGL09

 

Weight unit

UNIT_OF_WT

3

E1BPACGL09

 

Item category in purchasing document

ITEM_CAT

1

E1BPACGL09

 

Material_Number

MATNR

18

E1BPACGL09

 

Material Type

MATL_TYPE

4

E1BPACGL09

 

Movement Indicator

MVT_IND

1

E1BPACGL09

 

Revaluation

REVAL_IND

1

E1BPACGL09

 

Origin Group as Subdivision of Cost Element

ORIG_GROUP

4

E1BPACGL09

 

Material-related origin

ORIG_MAT

1

E1BPACGL09

 

Sequential number of account assignment

SERIAL_NO

2

E1BPACGL09

 

Partner account number

PART_ACCT

10

E1BPACGL09

 

Trading partner's business area

TR_PART_BA

4

E1BPACGL09

 

Company ID of trading partner

TRADE_ID

6

E1BPACGL09

 

Valuation Area

VAL_AREA

4

E1BPACGL09

 

Valuation Type

VAL_TYPE

10

E1BPACGL09

 

Reference Date

ASVAL_DATE

24

E1BPACGL09

 

Purchasing Document Number

PO_NUMBER

10

E1BPACGL09

 

Item Number of Purchasing Document

PO_ITEM

5

E1BPACGL09

 

Item number of the SD document

ITM_NUMBER

6

E1BPACGL09

 

Condition Category (Examples: Tax, Freight, Price, Cost)

COND_CATEGORY

1

E1BPACGL09

 

Functional Area Long

FUNC_AREA_LONG

16

E1BPACGL09

 

Commitment Item Long

CMMT_ITEM_LONG

24

E1BPACGL09

 

Grant

GRANT_NBR

20

E1BPACGL09

 

Transaction Type

CS_TRANS_T

3

E1BPACGL09

 

Funded Program

MEASURE

24

E1BPACGL09

 

Segment for Segmental Reporting

SEGMENT

10

E1BPACGL09

 

Partner Segment for Segmental Reporting

PARTNER_SEGMENT

10

E1BPACGL09

 

Document Number for Earmarked Funds

RES_DOC

10

E1BPACGL09

 

Earmarked Funds: Document Item

RES_ITEM

3

E1BPACGL09

 

Billing Period of Performance Start Date

BILLING_PERIOD_START_DATE

24

E1BPACGL09

 

Billing Period of Performance End Date

BILLING_PERIOD_END_DATE

24

E1BPACGL09

 

PPA Exclude Indicator

PPA_EX_IND

1

E1BPACGL09

 

PPA Fast Pay Indicator

FASTPAY

1

E1BPACGL09

E1BPACAR09 - Customer Item (BSID)

 

 

 

 

 

Accounting Document Number

BELNR

10

E1BPACAR09

*

Accounting Document Line Item Number

ITEMNO_ACC

10

E1BPACAR09

 

Customer Number 1

KUNNR

10

E1BPACAR09

 

General Ledger Account

HKONT

10

E1BPACAR09

 

Business partner reference key

XREF1

12

E1BPACAR09

 

Business partner reference key

XREF2

12

E1BPACAR09

 

Reference key for line item

XREF3

20

E1BPACAR09

 

Company Code

BUKRS

4

E1BPACAR09

 

Business Area

GSBER

4

E1BPACAR09

 

Terms of Payment Key

ZTERM

4

E1BPACAR09

 

Baseline Date For Due Date Calculation

ZFBDT

24

E1BPACAR09

 

Days for first cash discount

ZBD1T

3

E1BPACAR09

 

Days for second cash discount

ZBD2T

3

E1BPACAR09

 

Deadline for net conditions

ZBD3T

3

E1BPACAR09

 

Percentage for First Cash Discount

ZBD1P

5

E1BPACAR09

 

Percentage for Second Cash Discount

ZBD2P

5

E1BPACAR09

 

Payment method

ZLSCH

1

E1BPACAR09

 

Payment Method Supplement

UZAWE

2

E1BPACAR09

 

Payment Reference

KIDNO

30

E1BPACAR09

 

Dunning keys

MSCHL

1

E1BPACAR09

 

Dunning block

MANSP

1

E1BPACAR09

 

Payment block key

ZLSPR

1

E1BPACAR09

 

VAT Registration Number

STCEG

20

E1BPACAR09

 

Assignment Number

ZUONR

18

E1BPACAR09

 

Item Text

SGTXT

50

E1BPACAR09

 

Partner Bank Type

BVTYP

4

E1BPACAR09

 

State Central Bank Indicator

LZBKZ

3

E1BPACAR09

 

Stores

BUPLA

4

E1BPACAR09

 

Section Code

SECCO

4

E1BPACAR09

 

Account number of the branch

FILKD

10

E1BPACAR09

 

Currency for automatic payment

PYCUR

5

E1BPACAR09

 

Amount in Payment Currency

DMBTR

23

E1BPACAR09

 

Credit control area

KKBER

4

E1BPACAR09

 

Short Key for a House Bank

HBKID

5

E1BPACAR09

 

Supplying Country

LANDL

3

E1BPACAR09

 

Sales Tax Code

MWSKZ

2

E1BPACAR09

 

Tax Jurisdiction

TAXJURCODE

15

E1BPACAR09

 

Date Relevant for Determining the Tax Rate

TAX_DATE

24

E1BPACAR09

 

Special G/L Indicator

UMSKZ

1

E1BPACAR09

 

  1. Com. Interface: Business Partner GUID

PARTNER_GUID

32

E1BPACAR09

 

Alternative payee

KNRZA

10

E1BPACAR09

 

Bank type of alternative payer

ALT_PAYEE_BANK

4

E1BPACAR09

 

Dunning Area

MABER

2

E1BPACAR09

 

Technical Case Key (Case GUID)

CASE_GUID

32

E1BPACAR09

 

Profit Center

PRCTR

10

E1BPACAR09

 

Fund

GEBER

10

E1BPACAR09

 

Grant

GRANT_NBR

20

E1BPACAR09

 

Funded Program

MEASURE

24

E1BPACAR09

 

ID for account details

HKTID

5

E1BPACAR09

 

Document Number for Earmarked Funds

KBLNR

10

E1BPACAR09

 

Earmarked Funds: Document Item

KBLPOS

3

E1BPACAR09

 

Long Fund (Obsolete)

FUND_LONG

20

E1BPACAR09

 

Dispute Management: Dispute Interface Category

DISPUTE_IF_TYPE

1

E1BPACAR09

E1BPACAP09 - Vendor Item (BSIK)

 

 

 

 

 

Accounting Document Number

BELNR

10

E1BPACAP09

*

Accounting Document Line Item Number

ITEMNO_ACC

10

E1BPACAP09

 

Account Number of Vendor or Creditor

LIFNR

10

E1BPACAP09

 

General Ledger Account

HKONT

10

E1BPACAP09

 

Business partner reference key

XREF1

12

E1BPACAP09

 

Business partner reference key

XREF2

12

E1BPACAP09

 

Reference key for line item

XREF3

20

E1BPACAP09

 

Company Code

BUKRS

4

E1BPACAP09

 

Business Area

GSBER

4

E1BPACAP09

 

Terms of Payment Key

ZTERM

4

E1BPACAP09

 

Baseline Date For Due Date Calculation

ZFBDT

24

E1BPACAP09

 

Days for first cash discount

ZBD1T

3

E1BPACAP09

 

Days for second cash discount

ZBD2T

3

E1BPACAP09

 

Deadline for net conditions

ZBD3T

3

E1BPACAP09

 

Percentage for First Cash Discount

ZBD1P

5

E1BPACAP09

 

Percentage for Second Cash Discount

ZBD2P

5

E1BPACAP09

 

Payment Method

ZLSCH

1

E1BPACAP09

 

Payment Method Supplement

UZAWE

2

E1BPACAP09

 

Payment Block Key

ZLSPR

1

E1BPACAP09

 

State Central Bank Indicator

LZBKZ

3

E1BPACAP09

 

Supplying Country

LANDL

3

E1BPACAP09

 

Service Indicator (Foreign Payment)

DIEKZ

1

E1BPACAP09

 

Assignment Number

ZUONR

18

E1BPACAP09

 

Item Text

SGTXT

50

E1BPACAP09

 

ISR Subscriber Number

PO_SUB_NO

11

E1BPACAP09

 

ISR Check Digit

PO_CHECKDG

2

E1BPACAP09

 

ISR Reference Number

PO_REF_NO

27

E1BPACAP09

 

Withholding Tax Code

QSSKZ

2

E1BPACAP09

 

Stores

BUPLA

4

E1BPACAP09

 

Section Code

SECCO

4

E1BPACAP09

 

Instruction key 1

DTWS1

2

E1BPACAP09

 

Instruction key 2

DTWS2

2

E1BPACAP09

 

Instruction key 3

DTWS3

2

E1BPACAP09

 

Instruction key 4

DTWS4

2

E1BPACAP09

 

Account number of the branch

FILKD

10

E1BPACAP09

 

Currency for Automatic Payment

PYCUR

5

E1BPACAP09

 

Amount in Payment Currency

DMBTR

23

E1BPACAP09

 

Special G/L Indicator

UMSKZ

1

E1BPACAP09

 

Sales Tax Code

MWSKZ

2

E1BPACAP09

 

Date Relevant for Determining the Tax Rate

TAX_DATE

24

E1BPACAP09

 

Tax Jurisdiction

TAXJURCODE

15

E1BPACAP09

 

Alternative Payee

ALT_PAYEE

10

E1BPACAP09

 

Bank type of alternative payer

ALT_PAYEE_BANK

4

E1BPACAP09

 

Partner Bank Type

BVTYP

4

E1BPACAP09

 

Short Key for a House Bank

HBKID

5

E1BPACAP09

 

Com. Interface: Business Partner GUID

PARTNER_GUID

32

E1BPACAP09

 

Profit Center

PRCTR

10

E1BPACAP09

 

Fund

GEBER

10

E1BPACAP09

 

Grant

GRANT_NBR

20

E1BPACAP09

 

Funded Program

MEASURE

24

E1BPACAP09

 

ID for Account Details

HKTID

5

E1BPACAP09

E1BPACCR09 - Currency Items (BSIK)

 

 

 

 

 

Accounting Document Number

BELNR

10

E1BPACCR09

 

Accounting Document Line Item Number

BUZEI

10

E1BPACCR09

 

Currency Type and Valuation View

CURR_TYPE

2

E1BPACCR09

 

Currency Key

WAERS

5

E1BPACCR09

 

Amount in document currency

WRBTR

23

E1BPACCR09

 

Exchange rate

EXCH_RATE

9

E1BPACCR09

 

Indirect quoted exchange rate

EXCH_RATE_V

9

E1BPACCR09

 

Tax Base Amount in Document Currency

AMT_BASE

23

E1BPACCR09

 

Amount eligible for cash discount in document currency

SKFBT

23

E1BPACCR09

 

Cash discount amount in the currency of the currency types

DISC_AMT

23

E1BPACCR09

 

Amount in document currency

TAX_AMT

23

E1BPACCR09

 

Hope this would be helpful to all of us.

 

 

Regards,

Mayank Mehta

Difference between INFORMATICA 8.X AND BODS 4.0


In the attached Excel file, a point-by-point comparison between BODS and Informatica is given.

 

If anyone wants to contribute, please do so.

 

 

Rating scale: 0 - No support (beginner), 1 - Intermediate, 2 - Advanced

 

 

Features are rated for Informatica PowerCenter 8.x and BODS 4.0, with comments:

Tool Features

Operational Dashboard – Informatica: 0, BODS: 2 – BODS provides dashboards of DI execution statistics to see at a glance the status and performance of job execution for one or more repositories over a given time period. Informatica is working on embedding this feature in an upcoming version.

Impact and Lineage Analysis – Informatica: 2, BODS: 2 – BODS can analyze end-to-end impact and lineage for DI tables and columns, and BO objects such as universes, business views and reports. Informatica has Metadata Manager, which is a key feature for data lineage; users require a separate license to avail of the feature.

Auto Documentation – Informatica: 1, BODS: 2 – BODS provides functionality to view, analyze and print graphical representations of all objects as depicted in the Data Services Designer, including their relationships, properties, tables used and more. You can also see the logic implemented at column level. Informatica uses B2B Data Transformation Studio to create documentation.

Information Steward – Informatica: 0, BODS: 2 – Information Steward is extensively used for data mining, maintaining data quality and allowing business analysts to define rules. As IS is an SAP tool, it is easily integrated with BODS but not with Informatica.

Data Quality – Informatica: 1, BODS: 2 – BODS has Data Quality transforms to generate data quality reports in Crystal Reports format automatically. Informatica requires a different client tool, IDQ (Informatica Data Quality).

Built-in Scheduler – Informatica: 1, BODS: 2 – BODS job scheduling provides more options to schedule jobs than Informatica.

Preview Source Data In-Designer – Informatica: 2, BODS: 2 – Table data can be seen directly in the BODS and Informatica Designers instead of going to the database.

Slowly Changing Dimension – Informatica: 2, BODS: 1 – Informatica generates SCD mappings (with different options and types) through wizards and workflows via the Generate Workflow option; they just need to be customized.

Productivity Features

GUI Interface – Informatica: 2, BODS: 1 – The Informatica GUI is built with good features and is divided into three client tools, so you can divide the work accordingly, while in BODS there is a single platform for mapping (dataflow), workflow and monitoring. Also, you can undo changes in Informatica but not in BODS.

Join Multiple Sources – Informatica: 2, BODS: 2 – Both are able to join multiple sources.

Split Data Streams – Informatica: 2, BODS: 2 – Data can be split and used according to requirements.

Complex Transformations – Informatica: 2, BODS: 1 – SP, Java or any other language code can be integrated with Informatica with the help of a custom transformation.

Clustering and Job Distribution – Informatica: 2, BODS: 1 – Informatica has grid and load distribution to handle load. If one node goes down, the other node becomes the parent and takes care of the environment.

Large Volume Performance – Informatica: 2, BODS: 1 – Informatica loads data efficiently, and session properties in the Workflow Manager allow data to be handled accordingly. If data has to be moved from the database to a BW system once a week or once a day, then BODS is a good option.

Recovery of Data – Informatica: 2, BODS: 1 – Recovery of data can be done in Informatica. It creates bad files and the OPB_SRVR_RECOVERY table in the repository.

Debugging – Informatica: 1, BODS: 1 – We can discard or load data while debugging in Informatica but not in BODS.

Cache Management – Informatica: 2, BODS: 1 – Cache size can be set manually and different cache options can be used in Informatica.

Data Masking – Informatica: 1, BODS: 0 – Data masking can be done in Informatica.

Data Partitioning – Informatica: 2, BODS: 1 – A partitioning option is available in Informatica, which processes the job in multiple threads.

Really Codeless – Informatica: 2, BODS: 1 – Different options are integrated in the session and mapping properties; you just need to check the box instead of coding.

Realtime Processing Capability – Informatica: 2, BODS: 1 – Informatica needs PowerExchange (separately licensed) for real-time processing, while in BODS real-time jobs can be created within the Designer interface; however, data latency for DS is larger (minutes rather than seconds).

Portability/Deployment Features

JOB/Code Movement – Informatica: 2, BODS: 1 – Jobs and code can be moved via copy/paste from one repository to another, or you can target a different database to load the data via a parameterized database through parameters; in Informatica you can also use dynamic deployment groups and labels.

Supported Operating System (Windows, Unix) – Informatica: 1, BODS: 1

Connectivity/Adapter Support – Informatica: 2, BODS: 2 – Both ETL tools provide good connectivity to all database adapters.
 

BODS benefits:

 

Integration with SAP (Information Steward): SAP Business Objects provides tools for data mining, quality and profiling due to many acquisitions of other companies.

 

Operational Dashboard: View dashboards of DI execution statistics to see at a glance the status and performance of the Job execution for one or more repositories over a given time period.

 

Impact and Lineage Analysis: Analyze end-to end impact and lineage for DI tables and columns, and BO objects such as universes, business views and reports.

 

Data Validation: Evaluate the reliability of the target data based on the validation rules created in the Batch Jobs in order to quickly review, assess and identify potential inconsistencies or errors in source data.

 

Auto Documentation: View, analyze and print graphical representations of all objects as depicted in DI Designer, including their relationships, properties and more.

 

Advantages:

 

· Integration with SAP (Information Steward)

 

· SAP Business Objects created a firm company determined to stir the market;

 

· Good data modeling and data-management support;

 

· SAP Business Objects provides tools for data mining and quality; profiling due to many acquisitions of other companies.

 

· Quick learning curve and ease of use

 

Disadvantages:

 

· Uncertain future. Controversy over deciding which method of delivering data integration to use (SAP BW or BODS).

 

· BusinessObjects Data Integrator (Data Services) may not be seen as a stand-alone capable application to some organizations.

 

Informatica PowerCenter

 

Advantages:

 

· most substantial size and resources on the market of data integration tools vendors

 

· consistent track record, solid technology, straightforward learning curve, ability to address real-time data integration schemes

 

· Informatica is highly specialized in ETL and Data Integration and focuses on those topics, not on BI as a whole

 

· focus on B2B data exchange

Sales Order Idoc Simplified


Hi All,

The goal of this document is to provide field level details for the Sales Order (SALESORDER_CREATEFROMDAT202) idoc, used to load Sales Orders in SAP ECC.

We understand that data in SAP are stored in SAP tables, with specific field requirement like data type and Length of field.

Often it’s difficult to memorize the field level details for any table.

In my sincere attempt, I have tried to include the field level details for a particular case of Sales Order

Please find attached the document, to understand the meaning of each and every field in IDOC, its system requirement (mandatory/non Mandatory), field name, description, segment name and table name where the field lies in SAP.

The document is designed segment wise, with each segment differentiated by colours.

 

 

System Required

Text Description

SAP Technical
Field Name

Field Length

Segment Name

E1SALESORDER_CREATEFROMDAT2 - Header Segment (VBAK)

 

 

*

Sales Document

VBELN

10

E1SALESORDER_CREATEFROMDAT2

 

Relationship type

BINARY_RELATIONSHIPTYPE

4

E1SALESORDER_CREATEFROMDAT2

 

Int Number Assignment

INT_NUMBER_ASSIGNMENT

1

E1SALESORDER_CREATEFROMDAT2

 

Behave When Error

BEHAVE_WHEN_ERROR

1

E1SALESORDER_CREATEFROMDAT2

 

Testrun

TESTRUN

1

E1SALESORDER_CREATEFROMDAT2

 

Convert

CONVERT

1

E1SALESORDER_CREATEFROMDAT2

E1BPSDHD1 - Sales and Distribution Document Header (VBAK)

 

 

*

Sales Document

VBELN

10

E1BPSDHD1

 

Object Type

REFOBJECTTYPE

10

E1BPSDHD1

 

Object key

REFOBJECTKEY

70

E1BPSDHD1

 

Document type of reference object

REFDOCTYPE

10

E1BPSDHD1

*

Sales Document Type

AUART

4

E1BPSDHD1

 

Collective Number

SUBMI

10

E1BPSDHD1

*

Sales Organization

VKORG

4

E1BPSDHD1

*

Distribution Channel

VTWEG

2

E1BPSDHD1

*

Division

SPART

2

E1BPSDHD1

 

Sales Group

VKGRP

3

E1BPSDHD1

 

Sales Office

VKBUR

4

E1BPSDHD1

 

Requested delivery date

VDATU

24

E1BPSDHD1

 

Proposed date type

VPRGR

1

E1BPSDHD1

 

Customer purchase order date

BSTDK

24

E1BPSDHD1

 

Customer purchase order type

BSARK

4

E1BPSDHD1

 

Purchase order number supplement

BSTZD

4

E1BPSDHD1

 

Your Reference

IHREZ

12

E1BPSDHD1

 

Name of orderer

BNAME

35

E1BPSDHD1

 

NAME

NAME

35

E1BPSDHD1

 

Telephone Number

TELF1

16

E1BPSDHD1

 

Price group (customer)

KONDA

2

E1BPSDHD1

 

Customer group

KDGRP

2

E1BPSDHD1

 

Sales district

BZIRK

6

E1BPSDHD1

 

Price list type

PLTYP

2

E1BPSDHD1

 

Incoterms (Part 1)

INCO1

3

E1BPSDHD1

 

Incoterms (Part 2)

INCO2

28

E1BPSDHD1

 

Terms of Payment Key

ZTERM

4

E1BPSDHD1

 

Delivery block (document header)

LIFSK

2

E1BPSDHD1

 

Billing block in SD document

FAKSK

2

E1BPSDHD1

 

Order reason (reason for the business transaction)

AUGRU

3

E1BPSDHD1

 

Complete delivery defined for each sales order?

AUTLF

1

E1BPSDHD1

 

Date for pricing and exchange rate

PRSDT

24

E1BPSDHD1

 

Quotation/Inquiry is valid from

ANGDT

24

E1BPSDHD1

 

Date until which bid/quotation is binding (valid-to date)

BNDDT

24

E1BPSDHD1

 

Valid-from date (outline agreements, product proposals)

GUEBG

24

E1BPSDHD1

 

Valid-to date (outline agreements, product proposals)

GUEEN

24

E1BPSDHD1

 

Customer group 1

KVGR1

3

E1BPSDHD1

 

Customer group 2

KVGR2

3

E1BPSDHD1

 

Customer group 3

KVGR3

3

E1BPSDHD1

 

Customer group 4

KVGR4

3

E1BPSDHD1

 

Customer group 5

KVGR5

3

E1BPSDHD1

*

Customer purchase order number

BSTKD

35

E1BPSDHD1

 

Ship-to Party's Purchase Order Number

BSTKD_E

35

E1BPSDHD1

 

Ship-to party's PO date

BSTDK_E

24

E1BPSDHD1

 

Ship-to party purchase order type

BSARK_E

4

E1BPSDHD1

 

Ship-to party character

IHREZ_E

12

E1BPSDHD1

 

SD document category

VBTYP

1

E1BPSDHD1

 

Document Date (Date Received/Sent)

AUDAT

24

E1BPSDHD1

 

Guarantee date

GWLDT

24

E1BPSDHD1

 

Shipping Conditions

VSBED

2

E1BPSDHD1

 

Search term for product proposal

KTEXT

40

E1BPSDHD1

 

Number of contacts from the customer

MAHZA

3

E1BPSDHD1

 

Last customer contact date

MAHDT

24

E1BPSDHD1

 

Usage Indicator

ABRVW

3

E1BPSDHD1

 

MRP for delivery schedule types

ABDIS

1

E1BPSDHD1

 

Document number of the reference document

VGBEL

10

E1BPSDHD1

 

Company code to be billed

BUKRS_VF

4

E1BPSDHD1

 

Alternative tax classification

TAXK1

1

E1BPSDHD1

 

Tax classification 2 for customer

TAXK2

1

E1BPSDHD1

 

Tax classification 3 for customer

TAXK3

1

E1BPSDHD1

 

Tax Classification 4 Customer

TAXK4

1

E1BPSDHD1

 

Tax classification 5 for customer

TAXK5

1

E1BPSDHD1

 

Tax classification 6 for customer

TAXK6

1

E1BPSDHD1

 

Tax classification 7 for customer

TAXK7

1

E1BPSDHD1

 

Tax classification 8 for customer

TAXK8

1

E1BPSDHD1

 

Tax classification 9 for customer

TAXK9

1

E1BPSDHD1

 

Reference Document Number

XBLNR

16

E1BPSDHD1

 

Assignment number

ZUONR

18

E1BPSDHD1

 

Document category of preceding SD document

VGTYP

1

E1BPSDHD1

 

Order Combination Indicator

KZAZU

1

E1BPSDHD1

 

Invoice dates (calendar identification)

PERFK

2

E1BPSDHD1

 

Invoice list schedule (calendar identification)

PERRL

2

E1BPSDHD1

 

Manual invoice maintenance

MRNKZ

1

E1BPSDHD1

 

Directly quoted exchange rate for FI postings

KURRF

9

E1BPSDHD1

 

Additional value days

VALTG

2

E1BPSDHD1

 

Fixed value date

VALDT

24

E1BPSDHD1

 

Payment Method

ZLSCH

1

E1BPSDHD1

 

Account assignment group for this customer

KTGRD

2

E1BPSDHD1

 

Directly quoted exchange rate for pricing and statistics

PRSDT

9

E1BPSDHD1

 

Billing date for billing index and printout

FKDAT

24

E1BPSDHD1

 

Date on which services rendered

FBUDA

24

E1BPSDHD1

 

Dunning key

MSCHL

1

E1BPSDHD1

 

Dunning block

MANSP

1

E1BPSDHD1

 

Payment guarantee procedure

ABSSC

6

E1BPSDHD1

 

Department number

ABTNR

4

E1BPSDHD1

 

Receiving point

EMPST

25

E1BPSDHD1

 

Financial doc. processing: Internal financial doc. number

LCNUM

10

E1BPSDHD1

 

Customer condition group 1

KDKG1

2

E1BPSDHD1

 

Customer condition group 2

KDKG2

2

E1BPSDHD1

 

Customer condition group 3

KDKG3

2

E1BPSDHD1

 

Customer condition group 4

KDKG4

2

E1BPSDHD1

 

Customer condition group 5

KDKG5

2

E1BPSDHD1

 

Agreed delivery time

DELCO

3

E1BPSDHD1

 

SD Document Currency

WAERK

5

E1BPSDHD1

 

Name of Person who Created the Object

ERNAM

12

E1BPSDHD1

 

Tax departure country

LANDTX

3

E1BPSDHD1

 

Tax destination country

STCEG_L

3

E1BPSDHD1

 

Indicator: Triangular deal within the EU ?

XEGDR

1

E1BPSDHD1

 

Master contract number

VBELN_GRP

10

E1BPSDHD1

 

Referencing requirement: Procedure

SCHEME_GRP

4

E1BPSDHD1

 

Check partner authorizations

ABRUF_PART

1

E1BPSDHD1

 

Cumulative delivery order quantity date

DAT_FZAU

24

E1BPSDHD1

 

Sales document version number

VSNMR_V

12

E1BPSDHD1

 

Notification No

QMNUM

12

E1BPSDHD1

 

Work Breakdown Structure Element (WBS Element)

PS_PSP_PNR

24

E1BPSDHD1

 

Indirectly quoted exchange rate for FI postings

EXCH_RATE_FI_V

9

E1BPSDHD1

 

Indirectly quoted exchange rate for pricing and statistics

EXCHG_RATE_V

9

E1BPSDHD1

 

Character field of length 12

FKK_CONACCT

12

E1BPSDHD1

 

Generic project planning: GUID from external R/3 system

CAMPAIGN

16

E1BPSDHD1

 

Original system with release and transaction control

DOC_CLASS

9

E1BPSDHD1

 

Currency Key

H_CURR

5

E1BPSDHD1

 

Shipping type

VSART

2

E1BPSDHD1

 

Special processing indicator

SDABW

4

E1BPSDHD1

 

Reference Document Number (See long text for dependencies)

REF_DOC_L_LONG

35

E1BPSDHD1

E1BPSDITM - Sales and Distribution Document Item (VBAP)

 

 

*

Sales Document

VBELN

10

E1BPSDITM

*

Sales Document Item

POSNR

6

E1BPSDITM

 

Higher-level item in bill of material structures

UEPOS

6

E1BPSDITM

 

Item Number of the Underlying Purchase Order

POSEX

6

E1BPSDITM

 

Material Number

MATNR

18

E1BPSDITM

 

Item for which this item is an alternative

GRPOS

6

E1BPSDITM

 

Customer's material number (obsolete)

CUST_MAT22

22

E1BPSDITM

 

Batch Number

CHARG

10

E1BPSDITM

 

Delivery group (items are delivered together)

GRKOR

3

E1BPSDITM

 

Partial delivery at item level

KZTLF

1

E1BPSDITM

 

Reason for rejection of quotations and sales orders

ABGRU

2

E1BPSDITM

 

Block

FAKSP

2

E1BPSDITM

 

Billing date for billing index and printout

FKDAT

24

E1BPSDITM

 

Plant

WERKS

4

E1BPSDITM

 

Storage Location

LGORT

4

E1BPSDITM

 

Target quantity in sales units

ZMENG

48

E1BPSDITM

 

Target quantity UoM

ZIEME

3

E1BPSDITM

 

Sales document item category

PSTYV

4

E1BPSDITM

 

Short text for sales order item

ARKTX

40

E1BPSDITM

 

Material group 1

MVGR1

3

E1BPSDITM

 

Material group 2

MVGR2

3

E1BPSDITM

 

Material group 3

MVGR3

3

E1BPSDITM

 

Material group 4

MVGR4

3

E1BPSDITM

 

Material group 5

MVGR5

3

E1BPSDITM

 

Product hierarchy

PRODH

18

E1BPSDITM

 

Material Group

MATKL

9

E1BPSDITM

 

Customer purchase order number

BSTKD

35

E1BPSDITM

 

Customer purchase order date

BSTDK

24

E1BPSDITM

 

Customer purchase order type

BSARK

4

E1BPSDITM

 

Your Reference

IHREZ

12

E1BPSDITM

 

Ship-to Party's Purchase Order Number

BSTKD_E

35

E1BPSDITM

 

Ship-to party's PO date

BSTDK_E

24

E1BPSDITM

 

Ship-to party purchase order type

BSARK_E

4

E1BPSDITM

 

Ship-to party character

IHREZ_E

12

E1BPSDITM

 

Item Number of the Underlying Purchase Order

POSEX_E

6

E1BPSDITM

 

Price group (customer)

KONDA

2

E1BPSDITM

 

Customer group

KDGRP

2

E1BPSDITM

 

Sales district

BZIRK

6

E1BPSDITM

 

Price list type

PLTYP

2

E1BPSDITM

 

Incoterms (Part 1)

INCO1

3

E1BPSDITM

 

Incoterms (Part 2)

INCO2

28

E1BPSDITM

 

Order Combination Indicator

KZAZU

1

E1BPSDITM

 

Invoice dates (calendar identification)

PERFK

2

E1BPSDITM

 

Invoice list schedule (calendar identification)

PERRL

2

E1BPSDITM

 

Manual invoice maintenance

MRNKZ

1

E1BPSDITM

 

Directly quoted exchange rate for FI postings

KURRF

9

E1BPSDITM

 

Additional value days

VALTG

2

E1BPSDITM

 

Fixed value date

VALDT

24

E1BPSDITM

 

Terms of Payment Key

ZTERM

4

E1BPSDITM

 

Payment Method

ZLSCH

1

E1BPSDITM

 

Account assignment group for this customer

KTGRD

2

E1BPSDITM

 

Directly quoted exchange rate for pricing and statistics

KURRF

9

E1BPSDITM

 

Date for pricing and exchange rate

PRSDT

24

E1BPSDITM

 

Date on which services rendered

FBUDA

24

E1BPSDITM

 

Dunning key

MSCHL

1

E1BPSDITM

 

Dunning block

MANSP

1

E1BPSDITM

 

Promotion

WAKTION

10

E1BPSDITM

 

Payment guarantee procedure

ABSSC

6

E1BPSDITM

 

Financial doc. processing: Internal financial doc. number

LCNUM

10

E1BPSDITM

 

Department number

ABTNR

4

E1BPSDITM

 

Receiving point

EMPST

25

E1BPSDITM

 

Customer condition group 1

KDKG1

2

E1BPSDITM

 

Customer condition group 2

KDKG2

2

E1BPSDITM

 

Customer condition group 3

KDKG3

2

E1BPSDITM

 

Customer condition group 4

KDKG4

2

E1BPSDITM

 

Customer condition group 5

KDKG5

2

E1BPSDITM

 

Agreed delivery time

DELCO

3

E1BPSDITM

 

Sales unit

VRKME

3

E1BPSDITM

 

Factor for converting sales units to base units (target qty)

UMZIZ

5

E1BPSDITM

 

Factor for converting sales units to base units (target qty)

UMZIN

5

E1BPSDITM

 

Rounding quantity for delivery

ABLFZ

48

E1BPSDITM

 

Allowed deviation in quantity (absolute)

ABSFZ

48

E1BPSDITM

 

Allowed deviation in quantity (in percent)

KBVER

3

E1BPSDITM

 

Days by which the quantity can be shifted

KEVER

3

E1BPSDITM

 

Unused - Reserve Length 3

USAGE_IND

3

E1BPSDITM

 

Quantity is Fixed

FMENG

1

E1BPSDITM

 

Indicator: Unlimited Overdelivery Allowed

UEBTK

1

E1BPSDITM

 

Overdelivery Tolerance Limit

UEBTO

3

E1BPSDITM

 

Underdelivery Tolerance Limit

UNTTO

3

E1BPSDITM

 

Division

SPART

2

E1BPSDITM

 

Numerator (factor) for conversion of sales quantity into SKU

UMVKZ

5

E1BPSDITM

 

Denominator (Divisor) for Conversion of Sales Qty into SKU

UMVKN

5

E1BPSDITM

 

Gross Weight of the Item

BRGEW

48

E1BPSDITM

 

Net Weight of the Item

NTGEW

48

E1BPSDITM

 

Weight Unit

GEWEI

3

E1BPSDITM

 

Volume of the item

VOLUM

48

E1BPSDITM

 

Volume unit

VOLEH

3

E1BPSDITM

 

Delivery Priority

LPRIO

2

E1BPSDITM

 

Shipping Point/Receiving Point

VSTEL

4

E1BPSDITM

 

Route

ROUTE

6

E1BPSDITM

 

Name of Person who Created the Object

ERNAM

12

E1BPSDITM

 

Tax classification material

TAXM1

1

E1BPSDITM

 

Tax classification material

TAXM2

1

E1BPSDITM

 

Tax classification material

TAXM3

1

E1BPSDITM

 

Tax classification material

TAXM4

1

E1BPSDITM

 

Tax classification material

TAXM5

1

E1BPSDITM

 

Tax classification material

TAXM6

1

E1BPSDITM

 

Tax classification material

TAXM7

1

E1BPSDITM

 

Tax classification material

TAXM8

1

E1BPSDITM

 

Tax classification material

TAXM9

1

E1BPSDITM

 

Material Pricing Group

KONDM

2

E1BPSDITM

 

Valuation Type

BWTAR

10

E1BPSDITM

 

Delivery date and quantity fixed

FIXMG

1

E1BPSDITM

 

BOM explosion number

SERNR

8

E1BPSDITM

 

Results Analysis Key

ABGRS

6

E1BPSDITM

 

Requirements type

BEDAE

4

E1BPSDITM

 

Customer has not posted goods receipt

NACHL

1

E1BPSDITM

 

Business Transaction Type for Foreign Trade

EXART

2

E1BPSDITM

 

Overhead key

ZSCHL_K

6

E1BPSDITM

 

Costing Sheet

KALSM_K

6

E1BPSDITM

 

Material freight group

MFRGR

8

E1BPSDITM

 

Planning delivery schedule instruction

PLAVO

4

E1BPSDITM

 

KANBAN/sequence number

KANNR

35

E1BPSDITM

 

Billing form

FAKTF

2

E1BPSDITM

 

Dynamic Item Processor Profile

FFPRF

8

E1BPSDITM

 

Revenue recognition category

RRREL

1

E1BPSDITM

 

Proposed start date for accrual period

ACDATV

1

E1BPSDITM

 

Pricing Reference Material

UPMAT

18

E1BPSDITM

 

Object Type

REFOBJTYPE

10

E1BPSDITM

 

Object key

REFOBJKEY

70

E1BPSDITM

 

Logical system

REFLOGSYS

10

E1BPSDITM

 

Order probability of the item

AWAHR

3

E1BPSDITM

 

Maximum Number of Partial Deliveries Allowed Per Item

ANTLF

1

E1BPSDITM

 

CFOP Code and Extension

J_1BCFOP

10

E1BPSDITM

 

Tax law: ICMS

J_1BTAXLW1

3

E1BPSDITM

 

Tax law: IPI

J_1BTAXLW2

3

E1BPSDITM

 

SD tax code

J_1BTXSDC

2

E1BPSDITM

 

Assortment module

SKOPF

18

E1BPSDITM

 

Component quantity

KMPMG

48

E1BPSDITM

 

Currency amount for BAPIS (with 9 decimal places)

TARGET_VAL

28

E1BPSDITM

 

SD Document Currency

WAERK

5

E1BPSDITM

 

Profit Center

PRCTR

10

E1BPSDITM

E1BPSDITM1 - Document Item (VBAP)

 

 

 

*

Sales Document

VBELN

10

E1BPSDITM1

*

Sales Document Item

POSNR

6

E1BPSDITM1

 

Order Number

AUFNR

12

E1BPSDITM1

 

Work Breakdown Structure Element (WBS Element)

PS_PSP_PNR

24

E1BPSDITM1

 

Depreciation percentage for financial document processing

AKPRZ

5

E1BPSDITM1

 

Document number of the reference document

VGBEL

10

E1BPSDITM1

 

Item number of the reference item

VGPOS

6

E1BPSDITM1

 

Document category of preceding SD document

VGTYP

1

E1BPSDITM1

 

Material belonging to the customer

KDMAT

35

E1BPSDITM1

 

Indirectly quoted exchange rate for FI postings

EXCH_RATE_FI_V

9

E1BPSDITM1

 

Indirectly quoted exchange rate for pricing and statistics

EXCHG_RATE_V

9

E1BPSDITM1

 

ATP: Encryption of DELNR and DELPS

ITEMGUID_ATP

22

E1BPSDITM1

 

Value contract no.

WKTNR

10

E1BPSDITM1

 

Value contract item

WKTPS

6

E1BPSDITM1

 

External Configuration ID (Temporary)

CONFIG_ID

6

E1BPSDITM1

 

Instance Number in Configuration

INST_ID

8

E1BPSDITM1

 

Long Material Number for MATERIAL Field

MAT_EXT

40

E1BPSDITM1

 

External GUID for MATERIAL Field

MAT_GUID

32

E1BPSDITM1

 

Version Number for MATERIAL Field

MAT_VERS

10

E1BPSDITM1

 

Long Material Number for PR_REF_MAT Field

P_MAT_EXT

40

E1BPSDITM1

 

External GUID for PR_REF_MAT Field

P_MAT_GUID

32

E1BPSDITM1

 

Version Number for PR_REF_MAT Field

P_MAT_VERS

10

E1BPSDITM1

 

Functional Area

FKBER

4

E1BPSDITM1

 

Alternative BOM

STLAL

2

E1BPSDITM1

 

Character Field of Length 12

VKONT

12

E1BPSDITM1

 

International Article Number (EAN/UPC)

EAN11

18

E1BPSDITM1

 

Product catalog number

WMINR

10

E1BPSDITM1

 

Shipping type

VSART

2

E1BPSDITM1

 

Special processing indicator

SDABW

4

E1BPSDITM1

 

Functional Area Long

FKBER

16

E1BPSDITM1

 

Billing Relevance (CRM)

BILL_REL

1

E1BPSDITM1

 

ID for higher-level item usage

UEPVW

1

E1BPSDITM1

 

Generic project planning: GUID from external R/3 system

CAMPAIGN

16

E1BPSDITM1

 

Usage Indicator

VKAUS

3

E1BPSDITM1

 

CFOP code and extension

J_1BCFOP

10

E1BPSDITM1

E1BPPARNR - SD Document Partner: WWW (ADRC)

 

 

 

*

Sales Document

VBELN

10

E1BPPARNR

 

Sales Document Item

POSNR

6

E1BPPARNR

*

Partner Function

PARVW

2

E1BPPARNR

*

Customer Number 1

KUNNR

10

E1BPPARNR

 

Form of address

NAME_TEXT

15

E1BPPARNR

 

TITLE

TITLE

15

E1BPPARNR

 

Name 1

NAME1

35

E1BPPARNR

 

Name 2

NAME2

35

E1BPPARNR

 

Name 3

NAME3

35

E1BPPARNR

 

Name 4

NAME4

35

E1BPPARNR

 

House number and street

STREET

35

E1BPPARNR

 

COUNTRY

COUNTRY

3

E1BPPARNR

 

Postal Code

POST_CODE1

10

E1BPPARNR

 

P.O. Box Postal Code

POST_CODE2

10

E1BPPARNR

 

PO Box city

PO_BOX_LOC

35

E1BPPARNR

 

City

CITY1

35

E1BPPARNR

 

District

CITY2

35

E1BPPARNR

 

Region (State, Province, County)

REGION

3

E1BPPARNR

 

PO Box

PO_BOX

10

E1BPPARNR

 

First telephone number

TELEPHONE

16

E1BPPARNR

 

Second telephone number

TELEPHONE2

16

E1BPPARNR

 

Telebox number

TELEBOX

15

E1BPPARNR

 

Fax Number

FAX_NUMBER

31

E1BPPARNR

 

Teletex number

TELETEX_NO

30

E1BPPARNR

 

Telex number

TELEX_NO

30

E1BPPARNR

 

Language Key

LANGU

2

E1BPPARNR

 

Unloading Point

ABLAD

25

E1BPPARNR

 

Transportation zone to or from which the goods are delivered

LZONE

10

E1BPPARNR

 

Tax Jurisdiction

TAXJURCODE

15

E1BPPARNR

 

Address 1

ADRNR

10

E1BPPARNR

 

Address 2

ADDRESS

10

E1BPPARNR

 

Address type (1=Organization, 2=Person, 3=Contact person)

ADDR_TYPE

1

E1BPPARNR

 

Origin of an address

ADDR_ORIG

1

E1BPPARNR

 

Link to address number

ADDR_LINK

10

E1BPPARNR

 

Object Type

REFOBJTYPE

10

E1BPPARNR

 

Object key

REFOBJKEY

70

E1BPPARNR

 

Logical system

REFLOGSYS

10

E1BPPARNR

E1BPSCHDL - SD Document Schedule Lines

 

 

 

*

Sales Document

VBELN

10

E1BPSCHDL

*

Sales Document Item

POSNR

6

E1BPSCHDL

*

Delivery Schedule Line Number

ETENR

4

E1BPSCHDL

*

Schedule line date

EDATU

8

E1BPSCHDL

 

Date type (day, week, month, interval)

PRGRS

1

E1BPSCHDL

 

Required Arrival time

EZEIT

6

E1BPSCHDL

*

Order quantity in sales units

WMENG

13

E1BPSCHDL

 

Schedule line blocked for delivery

LIFSP

2

E1BPSCHDL

 

Schedule line category

ETTYP

2

E1BPSCHDL

 

Transportation Planning Date

TDDAT

8

E1BPSCHDL

 

Material Staging/Availability Date

MBDAT

8

E1BPSCHDL

 

Loading Date

LDDAT

8

E1BPSCHDL

 

Goods Issue Date

WADAT

8

E1BPSCHDL

 

Transp. Planning Time (Local, Relating to a Shipping Point)

TDUHR

6

E1BPSCHDL

 

Material Staging Time (Local, Relating to a Plant)

MBUHR

6

E1BPSCHDL

 

Loading Time (Local Time Relating to a Shipping Point)

LDUHR

6

E1BPSCHDL

 

Time of Goods Issue (Local, Relating to a Plant)

WAUHR

6

E1BPSCHDL

 

Object Type

REFOBJTYPE

10

E1BPSCHDL

 

Object key

REFOBJKEY

70

E1BPSCHDL

 

Logical system

REFLOGSYS

10

E1BPSCHDL

 

Delivery date

DLV_DATE

8

E1BPSCHDL

 

Delivery Arrival time

DLV_TIME

6

E1BPSCHDL

 

Release type

ABART

1

E1BPSCHDL

 

Schedule line type EDI

 

1

E1BPSCHDL

E1BPCOND - Conditions (KONV)

 

 

 

*

Sales Document

VBELN

10

E1BPCOND

*

Sales Document Item

POSNR

6

E1BPCOND

*

Condition item number

KPOSN

6

E1BPCOND

*

Step number

STUNR

3

E1BPCOND

*

Condition counter

ZAEHK

2

E1BPCOND

 

Condition type

KSCHL

4

E1BPCOND

 

Condition rate

KBETR

28

E1BPCOND

 

Currency Key

WAERS

5

E1BPCOND

 

Condition unit

KMEIN

3

E1BPCOND

 

Condition pricing unit

KPEIN

5

E1BPCOND

 

Object Type

REFOBJTYPE

10

E1BPCOND

 

Object key

REFOBJKEY

70

E1BPCOND

 

Logical system

REFLOGSYS

10

E1BPCOND

 

Application

KAPPL

2

E1BPCOND

 

Condition pricing date

KDATU

24

E1BPCOND

 

Calculation type for condition

KRECH

1

E1BPCOND

 

Condition base value

KAWRT

28

E1BPCOND

 

Condition exchange rate for conversion to local currency

KKURS

9

E1BPCOND

 

Numerator for converting condition units to base units

KUMZA

5

E1BPCOND

 

Denominator for converting condition units to base units

KUMNE

5

E1BPCOND

 

Condition category (examples: tax, freight, price, cost)

KNTYP

1

E1BPCOND

 

Condition is used for statistics

KSTAT

1

E1BPCOND

 

Scale Type

KNPRS

1

E1BPCOND

 

Condition is Relevant for Accrual  (e.g. Freight)

KRUEK

1

E1BPCOND

 

Condition for invoice list

KRELI

1

E1BPCOND

 

Origin of the condition

KHERK

1

E1BPCOND

 

Group condition

KGRPE

1

E1BPCOND

 

Condition update

KOUPD

1

E1BPCOND

 

Access sequence - Access number

KOLNR

2

E1BPCOND

 

Sequential number of the condition

KNUMH

2

E1BPCOND

 

Currency amount for BAPIS (with 9 decimal places)

ROUNDOFFDI

28

E1BPCOND

 

Condition value

KWERT_K

28

E1BPCOND

 

SD Document Currency

CURRENCY_2

5

E1BPCOND

 

Condition control

KSTEU

1

E1BPCOND

 

Condition is inactive

KINAK

1

E1BPCOND

 

Condition class

KOAID

1

E1BPCOND

 

Factor for condition base value

KFAKTOR

8

E1BPCOND

 

Scale basis indicator

KZBZG

1

E1BPCOND

 

Scale base value of the condition

KSTBS

28

E1BPCOND

 

Condition scale unit of measure

KONMS

3

E1BPCOND

 

Scale currency

KONWS

5

E1BPCOND

 

Condition for inter-company billing

KFKIV

1

E1BPCOND

 

Condition for configuration

KVARC

1

E1BPCOND

 

Condition changed manually

KMPRS

1

E1BPCOND

 

Condition record number

KNUMH

10

E1BPCOND

 

Sales Tax Code

MWSK1

2

E1BPCOND

 

Variant condition

VARCOND

26

E1BPCOND

E1BPCUINS - Instances of Several Configurations (VBAPKOM)

 

 

*

Sales Document

VBELN

10

E1BPCUINS

 

External Configuration ID (Temporary)

CONFIG_ID

6

E1BPCUINS

 

Instance Number in Configuration

INST_ID

8

E1BPCUINS

 

Object type

OBJ_TYPE

10

E1BPCUINS

 

Class Type

CLASS_TYPE

3

E1BPCUINS

 

Object key

OBJ_KEY

50

E1BPCUINS

 

Language-Dependent Object Description

OBJ_TXT

70

E1BPCUINS

 

Instance Quantity

QUANTITY

15

E1BPCUINS

 

Statement was Inferred

AUTHOR

1

E1BPCUINS

 

Unit of Measure

QUANTITY_UNIT

3

E1BPCUINS

 

General Indicator

COMPLETE

1

E1BPCUINS

 

General Indicator

CONSISTENT

1

E1BPCUINS

 

GUID for TYPE_OF Statement of Instance

OBJECT_GUID

32

E1BPCUINS

 

Instance Number (Persistent)

PERSIST_ID

32

E1BPCUINS

 

Type of Instance Number (Persistent)

PERSIST_ID_TYPE

1

E1BPCUINS

E1BPCCARD - Means of Payment: Order/Billing Document (FPLTC)

 

 

*

Sales Document

VBELN

10

E1BPCCARD

 

Payment cards: Card type

CCINS

4

E1BPCCARD

 

Payment cards: Card number

CCNUM

25

E1BPCCARD

 

Payment Cards: Valid To

DATBI

24

E1BPCCARD

 

Payment cards: Name of cardholder

CCNAME

40

E1BPCCARD

 

Value to be billed on the date specified in billing plan

BILLAMOUNT

23

E1BPCCARD

 

Payment cards: Authorization to be transferred

FLGAU

1

E1BPCCARD

 

Payment cards: Authorized amount

AUTWR

23

E1BPCCARD

 

Currency Key

CCWAE

5

E1BPCCARD

 

Payment cards: Authorization date

AUDAT

24

E1BPCCARD

 

Payment cards: Authorization time

AUTIM

24

E1BPCCARD

 

Payment cards: Authorization number

AUNUM

10

E1BPCCARD

 

Payment cards: Authoriz. reference code of clearing house

AUTRA

15

E1BPCCARD

 

Payment cards: Response to authorization checks

REACT

1

E1BPCCARD

 

Currency amount in BAPI interfaces

CC_RE_AMOUNT

23

E1BPCCARD

 

G/L Account Number

CCACT

10

E1BPCCARD

 

Payment cards: Status when external system is called?

CCALL

1

E1BPCCARD

 

Payment cards: Result text

RTEXT

40

E1BPCCARD

 

Checkbox

VCARD

1

E1BPCCARD

 

Payment cards: Merchant ID at the clearing house

MERCH

15

E1BPCCARD

E1BPSDTEXT - SD Texts (STXH)

 

 

 

 

Sales Document

DOC_NUMBER

10

E1BPSDTEXT

 

Sales Document Item

ITM_NUMBER

6

E1BPSDTEXT

*

Text ID

TDID

4

E1BPSDTEXT

*

Language Key

TDSPRAS

2

E1BPSDTEXT

 

Language according to ISO 639

LAISO

2

E1BPSDTEXT

 

Tag column

FORMAT_COL

2

E1BPSDTEXT

 

Text Line

TEXT_LINE

132

E1BPSDTEXT

 

Function

FUNCTION

3

E1BPSDTEXT
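
As a closing illustration, the field lengths and mandatory flags in the tables above can also drive a simple pre-load check. The sketch below is again hypothetical Python: the field list is copied from the E1BPSCHDL schedule-line section above, while the check itself is my own and is not part of the IDoc interface. It flags empty mandatory fields and values that exceed the documented lengths.

# Hypothetical pre-load check for schedule lines (segment E1BPSCHDL).
# Field names, lengths and mandatory flags are taken from the table above;
# the check itself is only a sketch, not part of the IDoc interface.
from datetime import date

MANDATORY_E1BPSCHDL = {
    "VBELN": 10,  # Sales Document
    "POSNR": 6,   # Sales Document Item
    "ETENR": 4,   # Delivery Schedule Line Number
    "EDATU": 8,   # Schedule line date (YYYYMMDD)
    "WMENG": 13,  # Order quantity in sales units
}


def check_schedule_line(record):
    """Return a list of problems found in one schedule-line record."""
    problems = []
    for field, length in MANDATORY_E1BPSCHDL.items():
        value = str(record.get(field, "")).strip()
        if not value:
            problems.append("%s is mandatory but empty" % field)
        elif len(value) > length:
            problems.append("%s exceeds length %d: %s" % (field, length, value))
    return problems


if __name__ == "__main__":
    rec = {"VBELN": "", "POSNR": "000010", "ETENR": "0001",
           "EDATU": date(2024, 1, 31).strftime("%Y%m%d"), "WMENG": "100"}
    print(check_schedule_line(rec))  # reports that VBELN is empty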

 

 

Hope this helps.

 

 

Thanks & Regards,

Mayank Mehta
