Wednesday, June 30, 2010

API Parameters


Overview


1) The Application Programming Interface (API) is a PL/SQL packaged procedure that can be used as an alternative entry point into the system to the traditional online forms.

2) The advantage is that the same logic used by the seeded online forms can also be used by other interfaces into the system, ensuring that the integrity of the data remains intact.

Calling an API

1) Ensure it is appropriate to allow that particular business operation

2) Validate the data passed to the API

3) Each API has a number of parameters, most of which map to database columns. Every parameter name starts with p_. If a parameter maps onto a database column, the remainder of the name is usually the same as the column name.

4) When calling APIs, it is strongly recommended that you use named notation instead of positional notation.

Standard IN Parameters

1) p_api_version IN NUMBER
This must match the version number of the API. An unexpected error is returned if the calling program's version number is incompatible with the current API version number.

2) p_init_msg_list IN VARCHAR2
The valid values for this parameter are:
• True = FND_API.G_TRUE
• False = FND_API.G_FALSE
• Default = FND_API.G_FALSE
If set to true (pass "T"), the API calls fnd_msg_pub.initialize to initialize the message stack. If set to false (pass "F"), the calling program must initialize the message stack; this needs to be done only once, even when more than one API is called.

3) p_commit IN VARCHAR2
The valid values for this parameter are:
• True = FND_API.G_TRUE
• False = FND_API.G_FALSE
• Default = FND_API.G_FALSE
If set to true (pass "T"), the API commits before returning to the calling program. If set to false (pass "F"), it is the calling program's responsibility to commit the transaction.

Standard OUT Parameters

1) x_return_status OUT NOCOPY VARCHAR2
Indicates the return status of the API. The value returned is one of the following:
• FND_API.G_RET_STS_SUCCESS
Success: the API call was successful.
• FND_API.G_RET_STS_ERROR
Expected error: a validation or missing-data error occurred.
• FND_API.G_RET_STS_UNEXP_ERROR
Unexpected error: the calling program cannot correct the error.

2) x_msg_count OUT NOCOPY NUMBER
Holds the number of messages in the message list.

3) x_msg_data OUT NOCOPY VARCHAR2
Holds the encoded message if x_msg_count is equal to one.
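Putting the standard IN and OUT parameters together, a call typically looks like the sketch below. The package xx_sample_api, its procedure, and the p_description parameter are hypothetical placeholders; substitute the actual API and its documented parameters.

```sql
DECLARE
  l_return_status VARCHAR2(1);
  l_msg_count     NUMBER;
  l_msg_data      VARCHAR2(2000);
BEGIN
  -- Named notation, as recommended above; xx_sample_api is a placeholder
  xx_sample_api.create_record
    ( p_api_version   => 1.0,
      p_init_msg_list => FND_API.G_TRUE,   -- API initializes the message stack
      p_commit        => FND_API.G_FALSE,  -- caller controls the commit
      p_description   => 'Demo record',    -- hypothetical column-mapped parameter
      x_return_status => l_return_status,
      x_msg_count     => l_msg_count,
      x_msg_data      => l_msg_data );

  IF l_return_status <> FND_API.G_RET_STS_SUCCESS THEN
    -- x_msg_data holds the message only when x_msg_count = 1;
    -- otherwise walk the message stack
    FOR i IN 1 .. l_msg_count LOOP
      l_msg_data := fnd_msg_pub.get(p_msg_index => i, p_encoded => 'F');
      dbms_output.put_line(l_msg_data);
    END LOOP;
  END IF;
END;
/
```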

Tuesday, June 29, 2010

Adding GL Accounting Flexfield to Report



RDF Steps
1) Define the parameters below in the report

P_MIN_FLEX Char(200)
P_MAX_FLEX Char(200)
STRUCT_NUM Number(15)
P_CONC_REQUEST_ID Number(15)
P_LEDGER_ID Number(15)
P_USER_ID Number(15)

2) Add the following code in the Before Report trigger

FND_PROFILE.GET('CONC_REQUEST_ID',:P_CONC_REQUEST_ID);

FND_PROFILE.GET('GL_SET_OF_BKS_ID',:P_LEDGER_ID);
FND_PROFILE.GET('USER_ID',:P_USER_ID);

srw.reference(:STRUCT_NUM);

srw.user_exit('FND FLEXSQL CODE="GL#" NUM=":STRUCT_NUM" APPL_SHORT_NAME="SQLGL"
OUTPUT=":WHERE_FLEX" TABLEALIAS="CC"
MODE="WHERE" DISPLAY="ALL"
OPERATOR="BETWEEN"
OPERAND1=":P_MIN_FLEX"
OPERAND2=":P_MAX_FLEX"');

:WHERE_FLEX := ' AND ' || :WHERE_FLEX;

3) Add the following lexical reference as the last line of the report query

&WHERE_FLEX
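For reference, after the user exit runs, :WHERE_FLEX holds a segment-level restriction on the CC alias. The exact text is generated from the chart of accounts structure (STRUCT_NUM); the fragment below is only an illustrative assumption for a hypothetical three-segment structure.

```sql
-- Illustrative only; actual segment columns and bind names are generated
-- from the flexfield structure at runtime
(CC.SEGMENT1 BETWEEN :P_MIN_FLEX1 AND :P_MAX_FLEX1
 AND CC.SEGMENT2 BETWEEN :P_MIN_FLEX2 AND :P_MAX_FLEX2
 AND CC.SEGMENT3 BETWEEN :P_MIN_FLEX3 AND :P_MAX_FLEX3)
```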

Application Setup Steps
- Add the parameters below, along with their value sets, in the concurrent program Parameters window

1) Chart of Accounts ID
Value Set- GL_SRS_COA_UNVALIDATED
Default Type- SQL Statement
Default Value- SELECT chart_of_accounts_id FROM gl_access_sets WHERE access_set_id = :$PROFILES$.GL_ACCESS_SET_ID
Token- STRUCT_NUM

2) Account Flexfield From
Value Set- GL_SRS_LEDGER_FLEXFIELD
Range- Pair
Token- P_MIN_FLEX

3) Account Flexfield To
Value Set- GL_SRS_LEDGER_FLEXFIELD
Range- Pair
Token- P_MAX_FLEX

SAP Vs. ORACLE



What Is ROI?
Return on investment is calculated by dividing the average annual net benefit over a 3-year period by the initial cost:
(Y1 + Y2 + Y3) / 3 / initial cost
Benefits include both direct cost savings and productivity and other indirect benefits; costs include software, hardware, personnel, consulting, and training.
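As a worked example of the formula above, with hypothetical figures: annual net benefits of 120, 150, and 180 (in thousands) against an initial cost of 300 give an ROI of 50 percent.

```shell
# (120 + 150 + 180) / 3 = 150 average annual benefit; 150 / 300 = 0.50
roi=$(awk 'BEGIN { printf "%.2f", ((120 + 150 + 180) / 3) / 300 }')
echo "ROI = $roi"
```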

ROI AND PAYBACK
> 65 percent of Oracle E-Business Suite customers achieved a positive return on investment after an average deployment time of 3.1 years.
> 43 percent of SAP customers had achieved a positive return on investment after an average deployment time of 2.8 years.

COSTS
1) Oracle’s average costs were 48 percent lower than SAP’s.
2) Oracle’s median costs were 29 percent lower than SAP’s.
3) Oracle’s average 3-year TCO was 48 percent lower than SAP’s.
4) Oracle’s median 3-year TCO was 34 percent lower than SAP’s.

PERSONNEL REQUIREMENTS
1) It takes fewer internal personnel to deploy Oracle: an average of 81 man-months of internal personnel were needed for Oracle, compared to an average of 372 for SAP.
2) It takes less training to deploy Oracle: an average of 27 hours of training for Oracle, compared to an average of 49 hours for SAP.

CONCLUSION
1) SAP projects are larger in scale and scope and thus take a longer time to reach positive ROI than Oracle.
2) More Oracle customers than SAP customers achieve a positive ROI. Ninety percent of Oracle customers can expect a positive ROI in fewer than 5 years.
3) SAP customers often face customization and integration challenges that slow deployments and increase consulting and personnel costs.
4) Oracle has a lower overall TCO than SAP

Types of Purchase Orders



Standard PO

1) One time commitment to purchase goods / services
2) Specific Quantity, Rate, Delivery schedule

Blanket Purchase Agreement
1) Long Term Agreement to Purchase – e.g. Annual Rate Contract
2) Specific Quantity, Rate, but no specific Delivery schedule

Planned PO
1) Long term agreement to buy goods / services
2) Tentative delivery schedule and accounting distribution given.
3) Other terms and conditions fixed
4) A planned PO looks exactly like a standard PO

Contract Purchase Agreement
1) A master document that specifies standard terms and conditions
2) Suppliers are authorised to ship against standard PO lines that reference the contract agreement

View SQL Statement from OAF Page



Below are the detailed steps:

1) Update the following profile option values to Yes at the user level.
• FND: Diagnostics
• Personalize Self-Service Defn

2) Go to the Java OAF Page where the SQL Query executes

3) Click on the link About this Page in the bottom Left Hand Side of the page

4) Navigate to the Page Tab on the Top Left Hand Side of the page

5) In the Business Component References Details section (you may need to expand this section), go to the View Objects subsection.

6) Find the Object that describes the piece of information that you want to find the query for

7) Click on the Link

8) The full query used can be copied into a SQL editor and run.
Note: You will have to supply the bind variables passed to the query in order to do this.

Friday, June 25, 2010

AP-PO Link Queries



-- VENDOR, PO AND INVOICE DETAILS (2 WAY MATCH)


SELECT DISTINCT
a.org_id "ORG ID", e.segment1 "VENDOR NUM",
e.vendor_name "SUPPLIER NAME",
UPPER (e.vendor_type_lookup_code) "VENDOR TYPE",
f.vendor_site_code "VENDOR SITE CODE", f.address_line1 "ADDRESS",
f.city "CITY", f.country "COUNTRY",
TO_CHAR (TRUNC (d.creation_date)) "PO DATE", d.segment1 "PO NUM",
d.type_lookup_code "PO TYPE", c.quantity_ordered "QTY ORDERED",
c.quantity_cancelled "QTY CANCELLED", g1.quantity_received, g1.quantity_rejected, g1.quantity_billed,
g.item_id "ITEM ID",
g.item_description "ITEM DESCRIPTION", g.unit_price "UNIT PRICE",
(NVL (c.quantity_ordered, 0) - NVL (c.quantity_cancelled, 0)
)
* NVL (g.unit_price, 0) "PO LINE AMOUNT",
(SELECT DECODE (ph.approved_flag, 'Y', 'Approved')
FROM po.po_headers_all ph
WHERE ph.po_header_id = d.po_header_id) "PO APPROVED?",
a.invoice_type_lookup_code "INVOICE TYPE",
a.invoice_amount "INVOICE AMOUNT",
TO_CHAR (TRUNC (a.invoice_date)) "INVOICE DATE",
a.invoice_num "INVOICE NUMBER",
(SELECT DECODE (x.match_status_flag,
'A', 'Approved'
)
FROM ap.ap_invoice_distributions_all x
WHERE x.invoice_distribution_id = b.invoice_distribution_id)
"INVOICE APPROVED?",
a.amount_paid
FROM ap.ap_invoices_all a,
ap.ap_invoice_distributions_all b,
po.po_distributions_all c,
po.po_headers_all d,
ap.ap_suppliers e,
ap.ap_supplier_sites_all f,
po.po_lines_all g,
po.po_line_locations_all g1
WHERE a.invoice_id = b.invoice_id
AND b.po_distribution_id = c.po_distribution_id(+)
AND c.po_header_id = d.po_header_id(+)
AND e.vendor_id(+) = d.vendor_id
AND f.vendor_site_id(+) = d.vendor_site_id
AND d.po_header_id = g.po_header_id
AND g1.po_header_id = c.po_header_id
AND g1.po_line_id = g.po_line_id
AND c.po_line_id = g.po_line_id
AND c.line_location_id = g1.line_location_id
AND g1.inspection_required_flag = 'N'
AND g1.receipt_required_flag = 'N'

-- VENDOR, PO AND INVOICE DETAILS (3 WAY MATCH)
SELECT DISTINCT
a.org_id "ORG ID", e.segment1 "VENDOR NUM",
e.vendor_name "SUPPLIER NAME",
UPPER (e.vendor_type_lookup_code) "VENDOR TYPE",
f.vendor_site_code "VENDOR SITE CODE", f.address_line1 "ADDRESS",
f.city "CITY", f.country "COUNTRY",
TO_CHAR (TRUNC (d.creation_date)) "PO DATE", d.segment1 "PO NUM",
d.type_lookup_code "PO TYPE", c.quantity_ordered "QTY ORDERED",
c.quantity_cancelled "QTY CANCELLED", g1.quantity_received, g1.quantity_rejected, g1.quantity_billed,
g.item_id "ITEM ID",
g.item_description "ITEM DESCRIPTION", g.unit_price "UNIT PRICE",
(NVL (c.quantity_ordered, 0) - NVL (c.quantity_cancelled, 0)
)
* NVL (g.unit_price, 0) "PO LINE AMOUNT",
(SELECT DECODE (ph.approved_flag, 'Y', 'Approved')
FROM po.po_headers_all ph
WHERE ph.po_header_id = d.po_header_id) "PO APPROVED?",
a.invoice_type_lookup_code "INVOICE TYPE",
a.invoice_amount "INVOICE AMOUNT",
TO_CHAR (TRUNC (a.invoice_date)) "INVOICE DATE",
a.invoice_num "INVOICE NUMBER",
(SELECT DECODE (x.match_status_flag,
'A', 'Approved'
)
FROM ap.ap_invoice_distributions_all x
WHERE x.invoice_distribution_id = b.invoice_distribution_id)
"INVOICE APPROVED?",
a.amount_paid
FROM ap.ap_invoices_all a,
ap.ap_invoice_distributions_all b,
po.po_distributions_all c,
po.po_headers_all d,
ap.ap_suppliers e,
ap.ap_supplier_sites_all f,
po.po_lines_all g,
po.po_line_locations_all g1
WHERE a.invoice_id = b.invoice_id
AND b.po_distribution_id = c.po_distribution_id(+)
AND c.po_header_id = d.po_header_id(+)
AND e.vendor_id(+) = d.vendor_id
AND f.vendor_site_id(+) = d.vendor_site_id
AND d.po_header_id = g.po_header_id
AND g1.po_header_id = c.po_header_id
AND g1.po_line_id = g.po_line_id
AND c.po_line_id = g.po_line_id
AND c.line_location_id = g1.line_location_id
AND g1.inspection_required_flag = 'N'
AND g1.receipt_required_flag = 'Y'

-- VENDOR, PO AND INVOICE DETAILS (4 WAY MATCH)
SELECT DISTINCT
a.org_id "ORG ID", e.segment1 "VENDOR NUM",
e.vendor_name "SUPPLIER NAME",
UPPER (e.vendor_type_lookup_code) "VENDOR TYPE",
f.vendor_site_code "VENDOR SITE CODE", f.address_line1 "ADDRESS",
f.city "CITY", f.country "COUNTRY",
TO_CHAR (TRUNC (d.creation_date)) "PO DATE", d.segment1 "PO NUM",
d.type_lookup_code "PO TYPE", c.quantity_ordered "QTY ORDERED",
c.quantity_cancelled "QTY CANCELLED", g1.quantity_received, g1.quantity_rejected, g1.quantity_billed,
g.item_id "ITEM ID",
g.item_description "ITEM DESCRIPTION", g.unit_price "UNIT PRICE",
(NVL (c.quantity_ordered, 0) - NVL (c.quantity_cancelled, 0)
)
* NVL (g.unit_price, 0) "PO LINE AMOUNT",
(SELECT DECODE (ph.approved_flag, 'Y', 'Approved')
FROM po.po_headers_all ph
WHERE ph.po_header_id = d.po_header_id) "PO APPROVED?",
a.invoice_type_lookup_code "INVOICE TYPE",
a.invoice_amount "INVOICE AMOUNT",
TO_CHAR (TRUNC (a.invoice_date)) "INVOICE DATE",
a.invoice_num "INVOICE NUMBER",
(SELECT DECODE (x.match_status_flag,
'A', 'Approved'
)
FROM ap.ap_invoice_distributions_all x
WHERE x.invoice_distribution_id = b.invoice_distribution_id)
"INVOICE APPROVED?",
a.amount_paid
FROM ap.ap_invoices_all a,
ap.ap_invoice_distributions_all b,
po.po_distributions_all c,
po.po_headers_all d,
ap.ap_suppliers e,
ap.ap_supplier_sites_all f,
po.po_lines_all g,
po.po_line_locations_all g1
WHERE a.invoice_id = b.invoice_id
AND b.po_distribution_id = c.po_distribution_id(+)
AND c.po_header_id = d.po_header_id(+)
AND e.vendor_id(+) = d.vendor_id
AND f.vendor_site_id(+) = d.vendor_site_id
AND d.po_header_id = g.po_header_id
AND g1.po_header_id = c.po_header_id
AND g1.po_line_id = g.po_line_id
AND c.po_line_id = g.po_line_id
AND c.line_location_id = g1.line_location_id
AND g1.inspection_required_flag = 'Y'
AND g1.receipt_required_flag = 'Y'

-- VENDOR, PO, INVOICE AND PAYMENT DETAILS
SELECT DISTINCT a.org_id "ORG ID", e.segment1 "VENDOR NUMBER",
e.vendor_name "VENDOR NAME",
UPPER (e.vendor_type_lookup_code) "VENDOR TYPE",
f.vendor_site_code "VENDOR SITE CODE",
TO_CHAR (TRUNC (d.creation_date)) "PO DATE",
d.segment1 "PO NUM", d.type_lookup_code "PO TYPE",
c.quantity_ordered "QTY ORDERED",
c.quantity_cancelled "QTY CANCELLED",
g.item_description "ITEM DESCRIPTION",
g.unit_price "UNIT PRICE",
(NVL (c.quantity_ordered, 0) - NVL (c.quantity_cancelled, 0)
)
* NVL (g.unit_price, 0) "PO LINE AMOUNT",
(SELECT DECODE (ph.approved_flag,
'Y', 'Approved'
)
FROM po.po_headers_all ph
WHERE ph.po_header_id = d.po_header_id) "PO APPROVED?",
a.invoice_type_lookup_code "INVOICE TYPE",
a.invoice_amount "INVOICE AMOUNT",
TO_CHAR (TRUNC (a.invoice_date)) "INVOICE DATE",
a.invoice_num "INVOICE NUMBER",
(SELECT DECODE (x.match_status_flag,
'A', 'Approved'
)
FROM ap.ap_invoice_distributions_all x
WHERE x.invoice_distribution_id = b.invoice_distribution_id)
"INVOICE APPROVED?",
a.amount_paid, h.check_id, i.check_number,
h.invoice_payment_id,
TO_CHAR (TRUNC (i.check_date)) "PAYMENT DATE"
FROM ap.ap_invoices_all a,
ap.ap_invoice_distributions_all b,
po.po_distributions_all c,
po.po_headers_all d,
ap.ap_suppliers e,
ap.ap_supplier_sites_all f,
po.po_lines_all g,
ap.ap_invoice_payments_all h,
ap.ap_checks_all i
WHERE a.invoice_id = b.invoice_id
AND b.po_distribution_id = c.po_distribution_id(+)
AND c.po_header_id = d.po_header_id(+)
AND e.vendor_id(+) = d.vendor_id
AND f.vendor_site_id(+) = d.vendor_site_id
AND d.po_header_id = g.po_header_id
AND c.po_line_id = g.po_line_id
AND a.invoice_id = h.invoice_id
AND h.check_id = i.check_id
AND c.po_header_id IS NOT NULL
AND a.payment_status_flag = 'Y'
AND d.type_lookup_code != 'BLANKET'
ORDER BY E.VENDOR_NAME

Saturday, June 19, 2010

Tarball Creation



While working in an OOD (Oracle On Demand) environment, developers need to provide a tarball (compressed archive) to Oracle for migration of components. Below are the Unix commands for the same.

pwd
/home/c_vghaya
mkdir TECH001
Move all files (.ctl, .rdf, .install, .ldt, etc.) to the TECH001 directory created above
tar -cvf TECH001_D_1.tar TECH001
pwd

/home/c_vghaya
chmod 777 TECH001_D_1.tar
compress TECH001_D_1.tar

Note: After compress, the compressed file (TECH001_D_1.tar.Z) will be created in /home/c_vghaya. Upload this file to Oracle for migration.
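The packaging steps above can be sketched as a single script. Paths and file names here are stand-ins following the TECH001 convention used in this post, and the demo .sql file replaces the real migration components; where the classic compress utility is unavailable, gzip with a .Z suffix is used as a fallback.

```shell
set -e
work=$(mktemp -d)                          # stand-in for /home/c_vghaya
cd "$work"
mkdir TECH001
echo "-- sample" > TECH001/XXEQ_SAMPLE.sql # stand-in for the real .ctl/.rdf/.ldt files
tar -cf TECH001_D_1.tar TECH001            # create the tarball
chmod 777 TECH001_D_1.tar
# 'compress' produces TECH001_D_1.tar.Z; fall back to gzip where it is missing
compress TECH001_D_1.tar 2>/dev/null || gzip -S .Z TECH001_D_1.tar
ls TECH001_D_1.tar.Z
```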
----------------------------------------------------------
Alternatively, you can use the commands below to extract this tarball and perform the migration
pbrun impnpdb -u apdbtg7i
cd $XBOL_TOP/install
cp /home/c_vghaya/TECH001_D_1.tar.Z .
uncompress TECH001_D_1.tar.Z

Note: After uncompress, TECH001_D_1.tar will be created in $XBOL_TOP/install
chmod 777 TECH001_D_1.tar
tar -xvf TECH001_D_1.tar
Note: The above command will create the TECH001 directory in $XBOL_TOP/install
cd TECH001
Now finally run the .install file from the TECH001 directory to transfer all files to their respective folders
. XXEQ_AP_INVOICE.install BOLINF/Plin28tx APPS/(appspwd) $XBOL_TOP

Note: After running the .install file (a shell script), files are transferred as below.
1) .sql files --> $XBOL_TOP/admin/sql
2) .ldt files --> $XBOL_TOP/patch/115/import
3) .prog & .ctl files --> $XBOL_TOP/bin
4) .rdf files --> $XBOL_TOP/reports/US

Application Diagnostic Test



After creating an SR through Metalink, Oracle usually asks for a data collection test. Below are the detailed steps to execute these diagnostic tests from the application.

1) Login to E-Business Suite or R12 using your username/password
2) Select the "Application Diagnostics" responsibility. If this responsibility is not assigned, contact the system administrator to configure it
3) Click Diagnose in the right hand side menu
4) From the 'Diagnose' tab, under 'Tests', Click 'Select Application' button and using the LOV, select search criteria (application name or application shortname)
Note: Multiple products may be queried by using a comma (,) between products. Do not include a space before or after the comma.
5) Enter search criteria and click 'Go'
6) Select the desired application(s) from the list below, and click 'Select'
7) Expand group(s), select the test(s) you wish to execute and click 'Execute'
8) Enter parameters (as required) and click 'Select'.
9) Use the list of values to help in your parameter selection. If multiple tests are selected, click the icon in the 'Test Inputs' column followed by 'Add Another Row' to enter parameters. Click 'Add Another Row' for each occurrence of that test.
- Click 'Submit'
10) Click the 'Refresh' button for execution status. When the execution is complete, the 'Status' field in the 'Request Details' section will show as 'Completed'. The report is now available for viewing and download by clicking the icon in the respective 'View Report' or 'Download Report' column in the 'Execution Details' section.
11) When executing multiple tests, click the number in the 'Total Executions' column to select the specific report.

Host Concurrent Program



Below are the steps to register a host concurrent program in Oracle Apps

1) Define Concurrent Program with Executable Type = Host and Execution File Name = Name of .prog file without extension

2) Copy your .prog file (Shell Script) to Custom Top Bin directory (Eg: $XXCUST_TOP/bin)

3) Create a symbolic link by executing the commands below on the Unix server
cd $XXCUST_TOP/bin
ln -s $FND_TOP/bin/fndcpesr sample_file
where sample_file = name of your host program script (the .prog file name without its extension)
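A minimal sketch of what such a .prog script can look like is shown below. The body is written as a shell function here so it can be exercised inline; in a real .prog file the same lines run at top level. The argument order shown is the commonly documented fndcpesr convention, but verify it against your release; the program logic itself is purely illustrative.

```shell
# sample_file.prog - illustrative host program body.
# When invoked through the fndcpesr symbolic link, the concurrent manager
# conventionally passes: $1 = ORACLE schema/password, $2 = FND user_id,
# $3 = FND user name, $4 = request_id, then the program's own parameters.
run_host_program() {
  ORA_LOGIN="$1"    # e.g. apps/<password>; avoid echoing this in real logs
  REQ_ID="$4"
  shift 4           # what remains are the program's own parameters
  echo "Request $REQ_ID started with parameters: $*"
  # ... actual work here: sqlplus calls, file transfers, etc. ...
  return 0          # a non-zero status marks the concurrent request as Error
}

run_host_program "apps/pwd" 1001 SYSADMIN 123 P1 P2
```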

Report Compilation/Execution Command



Compilation Command


Login to Database Server and Run below Command

rwconverter userid=apps/xxx source=$XXCUST_TOP/reports/US/XXPRPOP.rdf dest=$XXCUST_TOP/reports/US/XXPRPOP.rdf stype=rdffile dtype=rdffile overwrite=yes batch=yes compile_all=yes

Execution Command

Login to Database Server and Run below Command

/devlappl/prod11iora/8.0.6/bin/rwrun60 USERID=$P_LOGIN PARAMFORM=no P_START_DATE=$P_START_DATE P_END_DATE=$P_END_DATE report=$v_RepFile BATCH=yes destype=file mode=character desname=$vOutFile desformat=delimited vRetCode1=$?

WHERE

/devlappl/prod11iora/8.0.6 = Path for $ORACLE_HOME
$P_LOGIN = Userid/Password@Connect_String
$P_START_DATE, $P_END_DATE = Parameters
$v_RepFile = Path of RDF File
$vOutFile = Path of the output delimited text file

Thursday, June 17, 2010

Concurrent Request Details



1) FND_CONCURRENT_REQUESTS stores information about individual concurrent requests.


2) ARGUMENT1 through ARGUMENT25 contain any arguments the application passes to the concurrent program. If the concurrent program needs more than 25 arguments, the first 25 are stored in this table, and ARGUMENT26 through ARGUMENT100 are stored in the table FND_CONC_REQUEST_ARGUMENTS.


3) REQ_INFORMATION is used with report sets to remember the status of the request between runs.


4) When the request is set to use automatic resubmission, RESUBMITTED is a flag to indicate whether the request has been resubmitted or not.


5) IS_SUB_REQUEST is a flag that identifies a child request and HAS_SUB_REQUEST is a flag that identifies a parent request.


STATUS_CODE Column Values:
A- Waiting
B- Resuming
C- Normal
D- Cancelled
E- Error
F- Scheduled
G- Warning
H- On Hold
I- Normal
M- No Manager
Q- Standby
R- Normal
S- Suspended
T- Terminating
U- Disabled
W- Paused
X- Terminated
Z- Waiting


PHASE_CODE Column Values:
C- Completed
I- Inactive
P- Pending
R- Running

Detailed Query for Input Parameter Request ID

SELECT DISTINCT fcr.request_id, fcr.actual_start_date,
fcr.actual_completion_date,
floor(((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60)/3600) HOURS,

floor((((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60) -
floor(((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60)/3600)*3600)/60) MINUTES,
round((((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60) -
floor(((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60)/3600)*3600 -
(floor((((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60) -
floor(((fcr.actual_completion_date-fcr.actual_start_date)*24*60*60)/3600)*3600)/60)*60) )) SECS,
DECODE (fcr.phase_code,
'C', 'Completed',
'I', 'Inactive',
'P', 'Pending',
'R', 'Running',
'N/A'
) phase_code,
DECODE (fcr.status_code,
'A', 'Waiting',
'B', 'Resuming',
'C', 'Normal',
'D', 'Cancelled',
'E', 'Errored',
'F', 'Scheduled',
'G', 'Warning',
'H', 'On Hold',
'I', 'Normal',
'M', 'No Manager',
'Q', 'Standby',
'R', 'Normal',
'S', 'Suspended',
'T', 'Terminating',
'U', 'Disabled',
'W', 'Paused',
'X', 'Terminated',
'Z', 'Waiting',
'N/A'
) status_code,
fcr.outfile_name, fcr.number_of_arguments, fcr.argument_text,
frt.responsibility_name, fav.application_name, fav.application_short_name appl_short_name, fu.user_name,
fu.description user_description, fu.start_date user_start_date,
fcp.user_concurrent_program_name,
fcp.concurrent_program_name short_name, fe.executable_name,
DECODE (fe.execution_method_code,
'B', 'Request Set Stage Function',
'Q', 'SQL*Plus',
'H', 'Host',
'L', 'SQL*Loader',
'A', 'Spawned',
'I', 'PL/SQL Stored Procedure',
'P', 'Oracle Reports',
'S', 'Immediate',
'N/A'
) execution_method,
fe.execution_file_name
FROM fnd_concurrent_requests fcr,
fnd_user fu,
fnd_application_vl fav,
fnd_responsibility_tl frt,
fnd_concurrent_programs_vl fcp,
fnd_executables fe
WHERE fcr.requested_by = fu.user_id
AND fcr.concurrent_program_id = fcp.concurrent_program_id
AND fcr.responsibility_id = frt.responsibility_id
AND fcr.responsibility_application_id = fav.application_id
AND fcp.executable_id = fe.executable_id
AND fcr.request_id = :request_id
-- AND fu.user_name= ''

DBA Queries



-- Database Details
SELECT * FROM v$database

-- Instance Details
SELECT * FROM v$instance

-- License Details

SELECT * FROM v$license

-- Version Details
SELECT * FROM v$version

--Release Details
SELECT * FROM apps.fnd_product_groups

-- Patch Details
SELECT * FROM ad_applied_patches
SELECT * FROM ad_bugs


-- Concurrent Manager

SELECT concurrent_queue_name,
       user_concurrent_queue_name,
       description,
       enabled_flag
  FROM apps.fnd_concurrent_queues_vl fcq
 WHERE user_concurrent_queue_name LIKE 'AA%'



-- Partitioning Installed
SELECT DECODE (COUNT (*), 0, 'No', 'Yes') partitioning
FROM (SELECT 1
FROM dba_part_tables
WHERE owner NOT IN ('SYSMAN', 'SH', 'SYS', 'SYSTEM') AND ROWNUM = 1);

-- Spatial Installed
SELECT DECODE (COUNT (*), 0, 'No', 'Yes') spatial
FROM (SELECT 1
FROM all_sdo_geom_metadata
WHERE ROWNUM = 1);

-- RAC Installed
SELECT DECODE (COUNT (*), 0, 'No', 'Yes') rac
FROM (SELECT 1
FROM v$active_instances
WHERE ROWNUM = 1);

-- Unix Product Top Value
SELECT variable_name, value
FROM apps.fnd_env_context
WHERE variable_name = 'AP_TOP'
AND concurrent_process_id =
(SELECT MAX (concurrent_process_id) FROM apps.fnd_env_context);

-- Command to Kill Session for Releasing Lock
ALTER SYSTEM KILL SESSION 'sid,serial#';
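Before killing a session, the sid and serial# of the blocker can be found with a query along these lines (v$session.blocking_session is available in Oracle 10g and later; earlier releases need a join against v$lock instead):

```sql
-- Sessions currently blocked, together with the session blocking them
SELECT sid, serial#, blocking_session, username, event
  FROM v$session
 WHERE blocking_session IS NOT NULL;
```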

Sunday, June 13, 2010

RTF Template- Word Features



Q. How to repeat the table header on each page?
Ans: Right-click the first row of the table header, go to Table Properties and click the Row tab. Check the second checkbox, "Repeat as header row at the top of each page". For the rest of the rows, uncheck this checkbox and check the first option, "Allow row to break across pages".

Q. How to insert a tag in the header/footer portion of an RTF template?
Ans: Usually we cannot insert a form-field tag in the header/footer. But we can type the tag text directly in the header/footer, and it works as a tag in the template. E.g.: <?CountryName?>

Q. Not able to print a tag value even after opening the loop?
Ans: Sometimes a tag value does not print even after properly opening and closing the loop. In that case, provide the complete path for the tag in the form field help text.
E.g.: <?/OutboundPaymentInstruction/OutboundPayment/Payee/SupplierNumber?>

Q. How to remove the line border between records of a table?
Ans: To manage the line border between records of a table in a loop, use the outside border option in Word's formatting menu. If it still doesn't work, create the table in Excel by formatting the cells and paste it into the RTF template.

Oracle Fusion Overview



Fusion is Oracle’s initiative to develop a set of “next generation” enterprise business applications on a service-oriented platform. It is broadly divided into two parts:

Fusion Applications

– Enterprise business applications developed using open industry based standards (vs. the proprietary languages of Oracle’s current products lines – Oracle Forms, Reports, People Tools..etc.)
– Based on the best features and functionalities of Oracle’s existing applications product lines (Oracle, PeopleSoft, JDEdwards, Siebel, etc.)


Fusion Middleware

Oracle’s currently available, standards based middleware platform and the application server platform for the future Fusion Applications.
An architecture blueprint that brings together enterprise applications, leveraging what Oracle believes are three key emerging trends in technology:
• Service Oriented Architectures (SOA)
• Grid Computing Infrastructure
• Enterprise Information Architecture

Oracle SOA Suite Components

JDeveloper (Application Development Framework)
BAM- Business Activity Monitoring
BPEL Process Manager- Business Process Execution Language
ESB- Enterprise Service Bus
WSM- Web Service Manager

Friday, June 11, 2010

AIM Documents



AIM- Application Implementation Methodology

Business Process Architecture (BP)
BP.010 - Define Business and Process Strategy
BP.020 - Catalog and Analyze Potential Changes
BP.030 - Determine Data Gathering Requirements
BP.040 - Develop Current Process Model
BP.050 - Review Leading Practices
BP.060 - Develop High-Level Process Vision
BP.070 - Develop High-Level Process Design
BP.080 - Develop Future Process Model
BP.090 - Document Business Procedure

Business Requirements Definition (RD)
RD.010 - Identify Current Financial and Operating Structure
RD.020 - Conduct Current Business Baseline
RD.030 - Establish Process and Mapping Summary
RD.040 - Gather Business Volumes and Metrics
RD.050 - Gather Business Requirements
RD.060 - Determine Audit and Control Requirements
RD.070 - Identify Business Availability Requirements
RD.080 - Identify Reporting and Information Access Requirements

Business Requirements Mapping
BR.010 - Analyze High-Level Gaps
BR.020 - Prepare mapping environment
BR.030 - Map Business requirements
BR.040 - Map Business Data
BR.050 - Conduct Integration Fit Analysis
BR.060 - Create Information Model
BR.070 - Create Reporting Fit Analysis
BR.080 - Test Business Solutions
BR.090 - Confirm Integrated Business Solutions
BR.100 - Define Applications Setup
BR.110 - Define security Profiles

Application and Technical Architecture (TA)
TA.010 - Define Architecture Requirements and Strategy
TA.020 - Identify Current Technical Architecture
TA.030 - Develop Preliminary Conceptual Architecture
TA.040 - Define Application Architecture
TA.050 - Define System Availability Strategy
TA.060 - Define Reporting and Information Access Strategy
TA.070 - Revise Conceptual Architecture
TA.080 - Define Application Security Architecture
TA.090 - Define Application and Database Server Architecture
TA.100 - Define and Propose Architecture Subsystems
TA.110 - Define System Capacity Plan
TA.120 - Define Platform and Network Architecture
TA.130 - Define Application Deployment Plan
TA.140 - Assess Performance Risks
TA.150 - Define System Management Procedures

Module Design and Build (MD)
MD.010 - Define Application Extension Strategy
MD.020 - Define and estimate application extensions
MD.030 - Define design standards
MD.040 - Define Build Standards
MD.050 - Create Application extensions functional design
MD.060 - Design Database extensions
MD.070 - Create Application extensions technical design
MD.080 - Review functional and Technical designs
MD.090 - Prepare Development environment
MD.100 - Create Database extensions
MD.110 - Create Application extension modules
MD.120 - Create Installation routines

Data Conversion (CV)
CV.010 - Define data conversion requirements and strategy
CV.020 - Define Conversion standards
CV.030 - Prepare conversion environment
CV.040 - Perform conversion data mapping
CV.050 - Define manual conversion procedures
CV.060 - Design conversion programs
CV.070 - Prepare conversion test plans
CV.080 - Develop conversion programs
CV.090 - Perform conversion unit tests
CV.100 - Perform conversion business objects
CV.110 - Perform conversion validation tests
CV.120 - Install conversion programs
CV.130 - Convert and verify data

Documentation (DO)
DO.010 - Define documentation requirements and strategy
DO.020 - Define Documentation standards and procedures
DO.030 - Prepare glossary
DO.040 - Prepare documentation environment
DO.050 - Produce documentation prototypes and templates
DO.060 - Publish user reference manual
DO.070 - Publish user guide
DO.080 - Publish technical reference manual
DO.090 - Publish system management guide

Business System Testing (TE)
TE.010 - Define testing requirements and strategy
TE.020 - Develop unit test script
TE.030 - Develop link test script
TE.040 - Develop system test script
TE.050 - Develop systems integration test script
TE.060 - Prepare testing environments
TE.070 - Perform unit test
TE.080 - Perform link test
TE.090 - perform installation test
TE.100 - Prepare key users for testing
TE.110 - Perform system test
TE.120 - Perform systems integration test
TE.130 - Perform Acceptance test

Performance Testing(PT)
PT.010 - Define Performance Testing Strategy
PT.020 - Identify Performance Test Scenarios
PT.030 - Identify Performance Test Transaction
PT.040 - Create Performance Test Scripts
PT.050 - Design Performance Test Transaction Programs
PT.060 - Design Performance Test Data
PT.070 - Design Test Database Load Programs
PT.080 - Create Performance Test Transaction Programs
PT.090 - Create Test Database Load Programs
PT.100 - Construct Performance Test Database
PT.110 - Prepare Performance Test Environment
PT.120 - Execute Performance Test

Adoption and Learning (AP)
AP.010 - Define Executive Project Strategy
AP.020 - Conduct Initial Project Team Orientation
AP.030 - Develop Project Team Learning Plan
AP.040 - Prepare Project Team Learning Environment
AP.050 - Conduct Project Team Learning Events
AP.060 - Develop Business Unit Managers’ Readiness Plan
AP.070 - Develop Project Readiness Roadmap
AP.080 - Develop and Execute Communication Campaign
AP.090 - Develop Managers’ Readiness Plan
AP.100 - Identify Business Process Impact on Organization
AP.110 - Align Human Performance Support Systems
AP.120 - Align Information Technology Groups
AP.130 - Conduct User Learning Needs Analysis
AP.140 - Develop User Learning Plan
AP.150 - Develop User
AP.160 - Prepare User Learning Environment
AP.170 - Conduct User Learning Events
AP.180 - Conduct Effectiveness Assessment

Production Migration (PM)
PM.010 - Define Transition Strategy
PM.020 - Design Production Support Infrastructure
PM.030 - Develop Transition and Contingency Plan
PM.040 - Prepare Production Environment
PM.050 - Set Up Applications
PM.060 - Implement Production Support Infrastructure
PM.070 - Verify Production Readiness
PM.080 - Begin Production
PM.090 - Measure System Performance
PM.100 - Maintain System
PM.110 - Refine Production System
PM.120 - Decommission Former Systems
PM.130 - Propose Future Business Direction
PM.140 - Propose Future Technical Direction

CRM Modules Listing




Oracle's integrated Customer Relationship Management (CRM) solution is a set of applications that give you information-driven Sales, Service, and Marketing.

Oracle Channel Management
Oracle Marketing
Oracle Order Management
Oracle Sales
Oracle Service

Oracle Service contains the following applications:
Depot Repair
Field Service
Advanced Scheduler
Mobile Field Service
Spares Management
Interaction Center
Advanced Inbound Telephony
Advanced Outbound Telephony
Email Center
Scripting
iSupport
Service Contracts
TeleService

The following three modules integrate seamlessly and are inter-related:
1) Field Service
2) TeleService
3) Service Contracts

R12- Upgrade Vs. Re-implementation




Upgrade Considerations
1) No major process changes
2) No major organizational restructuring
3) COA unchanged
4) Current system is cleaned, reconciled and up-to-date

Re-implementation Considerations
1) Major business process changes
2) Major organizational restructuring
3) Significant setup changes required (e.g., new COA)
4) Geographic expansion (New country roll-out)
5) Multiple environments to be merged
6) Heavily customized
7) New application to be implemented
8) Historical data can be viewed separately

For more detailed factors/parameters, visit the discussion below:
http://www.infosysblogs.com/oracle/2009/10/ebs_r12_roadmap_upgrade_vs_rei_1.html

Thursday, June 10, 2010

CRP for Oracle R12



Conference Room Pilots (CRPs) are interpreted by different organizations in different ways and can be used in a variety of project types. The R12 project will have 3 conference room pilots followed by user acceptance testing (UAT). The overall goals will be to allow discrete transactional areas of the solution to be validated, with later CRPs building towards demonstrating end to end business scenarios in the new solution.

CRP1 will include:
Process: An overview of current client process and best practice process recommendations from ESolutions.
System: Demonstration of a subset of the standard Oracle R12 functionality, with small amounts of representative data. (CRP1 is not a user training session. User participants will not follow along on their own PCs.)
Scope: Discussion of ESolution’s recommendations for best practice where applicable to client. Scope review based on ESolution’s recommendations together with the steering committee recommendations to determine the scope baseline for the Oracle R12 project. Gathering of high-level requirements - identify how well the application meets business needs, and identify gaps, whilst still in the design phase of the project.
User Input: An opportunity for users to ask questions and get feedback on the proposed configuration and use of the software. Business attendees will provide many observations and areas of feedback to the project team which will be recorded by the functional team. If these cannot be resolved during the session they will be processed once the CRP is completed with feedback provided to the representatives prior to the next CRP.

Between CRP1 and CRP2: 
1) Detailed requirements gathering interviews will be conducted.
2) “As-Is” business process modelling and gap analysis.
3) Draft to-be process flows and initial software configuration completed prior to the CRP2.
4) Prioritization of requirements/processes (based on business ranking and level of config/customization/development required) to be demonstrated in CRP2 or CRP3.
5) Completion of requirements for “bucket one” for custom development delivered to technical team.

CRP2 will include: 
Process: Functional leaders walk cross-functional teams and potential system users through flow charts. Key business scenarios are reviewed against predefined scripts. The scenarios are driven from the business process and requirements mapping activities carried out earlier in the project to show how the business processes map onto system functionality.
System: Functional leaders take the group through online scenario exercises. Questions are raised, unforeseen problems surface, and alternative approaches are tested.
User Input: Users use predefined scripts to carry out key processes in the system. Most issues should be resolved by the team during the exercises or delegated for further action. The business signs off on high-priority requirements.
Scope: Requests for additional functionality (an increase to the baselined requirements) must be handled through the project's scope management process; observations raised are not automatically included in scope. Changes to the scope baseline will result in change requests to the steering committee.

Between CRP2 and CRP3:
1) Finalization of “To-Be” process models and gap resolution.
2) Detailed requirements validation and sign-off.
3) Completion of requirements for “bucket two” of requirements for custom development delivered to technical team.
4) Development of training materials.

Objectives of CRP3:
Process: Functional leaders walk cross-functional teams and potential system users through final “To-Be” process models.
System: Most converted data available for review in the CRP environment.
Some RICE (Reports, Interfaces, Conversions, Enhancements / Extensions) components available for business review.
User Input: Users will be hands-on in the system completing real business scenarios from end to end.
Validate that any gaps identified during CRP2 have been addressed.
Continue to uncover any system / process issues and potential future phase business requirements.
Scope: Requests for additional functionality (an increase to the baselined requirements) must be handled through the project's scope management process; observations raised are not automatically included in scope. Only items critical for go-live will be considered.

In summary the CRP sessions will focus on the following:

CRP-1: Discrete transactional activities
CRP-2: Proposed business processes (No RICE – custom Reports, Interfaces, Conversions, Enhancements / Extensions)
CRP-3: End to end business scenarios. Processes with converted data and some RICE components
UAT: Test 100% of functionality and end-to-end integration testing including all RICE components

Wednesday, June 9, 2010

R12 Supplier Bank API's



To load supplier external banks, branches, and bank accounts, use the Oracle-supplied package IBY_EXT_BANKACCT_PUB.

The APIs below are used to create a bank and a branch.

IBY_EXT_BANKACCT_PUB.create_ext_bank:

This API creates the external bank. Note that the bank name and home country are mandatory when creating an external bank. Once the bank is created, a bank party ID is generated, which you can verify in the IBY_EXT_BANKS_V view.

IBY_EXT_BANKACCT_PUB.create_ext_bank
(p_api_version => 1.0
,p_init_msg_list => FND_API.G_TRUE
,p_ext_bank_rec => x_bank_rec
,x_bank_id => x_bank_id
,x_return_status => x_return_status
,x_msg_count => x_msg_count
,x_msg_data => x_msg_data
,x_response => x_response_rec
);
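As a sketch of how the call above is typically driven (the record fields bank_name and home_country should be verified against the ExtBank_rec_type declaration in your version of IBY_EXT_BANKACCT_PUB, and the bank name here is a made-up value), a minimal anonymous block might look like:

```sql
DECLARE
  x_bank_rec      IBY_EXT_BANKACCT_PUB.ExtBank_rec_type;
  x_bank_id       NUMBER;
  x_return_status VARCHAR2(1);
  x_msg_count     NUMBER;
  x_msg_data      VARCHAR2(2000);
  x_response_rec  IBY_FNDCPT_COMMON_PUB.Result_rec_type;
BEGIN
  -- Mandatory attributes: bank name and home country
  x_bank_rec.bank_name    := 'XX Demo Bank';   -- hypothetical value
  x_bank_rec.home_country := 'US';

  IBY_EXT_BANKACCT_PUB.create_ext_bank
    (p_api_version   => 1.0
    ,p_init_msg_list => FND_API.G_TRUE
    ,p_ext_bank_rec  => x_bank_rec
    ,x_bank_id       => x_bank_id
    ,x_return_status => x_return_status
    ,x_msg_count     => x_msg_count
    ,x_msg_data      => x_msg_data
    ,x_response      => x_response_rec);

  IF x_return_status <> FND_API.G_RET_STS_SUCCESS THEN
    -- Walk the message stack to see why the call failed
    FOR i IN 1 .. x_msg_count LOOP
      dbms_output.put_line(fnd_msg_pub.get(p_msg_index => i,
                                           p_encoded   => 'F'));
    END LOOP;
  ELSE
    dbms_output.put_line('Created bank party id: ' || x_bank_id);
    COMMIT;
  END IF;
END;
```

The same pattern (populate a record, call the API with named notation, check x_return_status, and read the message stack on failure) applies to the branch and account APIs below.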

IBY_EXT_BANKACCT_PUB.create_ext_bank_branch

This API creates a bank branch so that an account can then be created in that branch. Once a bank branch is created, the record becomes visible in the IBY_EXT_BANK_BRANCHES_V view.

IBY_EXT_BANKACCT_PUB.create_ext_bank_branch
(p_api_version => 1.0
,p_init_msg_list => FND_API.G_TRUE
,p_ext_bank_branch_rec => x_bank_branch_rec
,x_branch_id => x_branch_id
,x_return_status => x_return_status
,x_msg_count => x_msg_count
,x_msg_data => x_msg_data
,x_response => x_response_rec
);


After the bank and branches are created, the table iby.iby_temp_ext_bank_accounts can be populated to create the bank accounts and associate them with the supplier or supplier site.

Note: The table must be populated prior to running the supplier interface.


The APIs below are used internally by Oracle to create the payee and associate the bank account with the supplier or supplier site.

IBY_EXT_BANKACCT_PUB.create_ext_bank_acct
(p_api_version => 1.0
,p_init_msg_list => FND_API.G_TRUE
,p_ext_bank_acct_rec => x_bank_acct_rec
,x_acct_id => x_acct_id
,x_return_status => x_return_status
,x_msg_count => x_msg_count
,x_msg_data => x_msg_data
,x_response => x_response_rec
);


IBY_DISBURSEMENT_SETUP_PUB.Create_External_Payee
(p_api_version => 1.0,
p_init_msg_list => FND_API.G_TRUE,
p_ext_payee_tab => v_external_payee_tab_type,
x_return_status => v_return_status,
x_msg_count => v_msg_count,
x_msg_data => v_msg_data,
x_ext_payee_id_tab => x_ext_payee_id_tab,
x_ext_payee_status_tab => x_ext_payee_status_tab);

IBY_DISBURSEMENT_SETUP_PUB.Set_Payee_Instr_Assignment
(p_api_version => 1.0
,p_init_msg_list => FND_API.G_TRUE
,p_payee => x_rec
,p_assignment_attribs => x_assign
,x_assign_id => x_assign_id
,x_return_status => x_return_status
,x_msg_count => x_msg_count
,x_msg_data => x_msg_data
,x_response => x_response_rec);

CUSTOM Library in Form



1) CUSTOM library is located in the AU_TOP/resource directory.

2) After you write code in the CUSTOM procedures, compile and generate the library using following command:
f60gen module=CUSTOM.pll userid=apps/(appspwd) module_type=LIBRARY batch=NO compile_all=special output_file=$AU_TOP/resource/CUSTOM.plx

3) The specification of the CUSTOM package in the CUSTOM library cannot be changed in any way.

4) You may add your own packages to the CUSTOM library, but any packages you add to this library must be sequenced after the CUSTOM package.

5) You cannot use any SQL in the library, however, you can use a record group to issue SELECT statements.

6) You can use calls to stored procedures for any other DML operations such as updates, inserts, or deletes.

7) Oracle Forms global variables in your code are visible to all running forms.

8) You may attach other libraries to the CUSTOM library. However, you cannot attach the APPCORE library to CUSTOM because it would cause a recursion problem (because CUSTOM is attached to APPCORE).

9) You should thoroughly test all logic you add to the CUSTOM library before using it in a production environment. Subsequent invocations of Oracle Applications will then run this new code.

You can code logic in the CUSTOM library for the following main cases. Each of these cases must be coded differently.

Zoom
: A Zoom typically consists of opening another form and (optionally) passing parameter values to the opened form through the Zoom logic.

Logic for Generic events
: Augment Oracle Applications logic for certain generic form events such as WHEN–NEW–FORM–INSTANCE or WHEN–VALIDATE–RECORD. You can use generic events to change field prompts and other properties, hide fields, add validation, and more.

Logic for Product–Specific events
: Augment or replace Oracle Applications logic for certain product–specific events that enforce business rules.

Custom entries for the Special Menus
: Add entries to the special menus of Oracle Applications forms using the app_special2 routines from the APPCORE2 package, such as an entry that opens a custom form. A total of 45 special menu entries (SPECIAL1–SPECIAL45) are available.
To invoke another form, use the function security routines like EXECUTE in the FND_FUNCTION package.

Setting Visual Attributes
: Use the CUSTOM library to change the visual attributes of Oracle Applications fields at runtime.

The CUSTOM library receives two different kinds of events, generic and product-specific. Generic events are common to all forms in Oracle Applications. These events are:

• WHEN–FORM–NAVIGATE
• WHEN–NEW–FORM–INSTANCE
• WHEN–NEW–BLOCK–INSTANCE
• WHEN–NEW–RECORD–INSTANCE
• WHEN–NEW–ITEM–INSTANCE
• WHEN–VALIDATE–RECORD
• SPECIALn (where n is a number between 1 and 45)
• ZOOM
• EXPORT
• KEY–Fn (where n is a number between 1 and 8)
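As a sketch of how these events are typically handled (the form and block names here are hypothetical), the EVENT procedure of the CUSTOM package usually branches on the event name and guards each branch by form:

```sql
PROCEDURE event (event_name VARCHAR2)
IS
  form_name  VARCHAR2(30) := NAME_IN('system.current_form');
  block_name VARCHAR2(30) := NAME_IN('system.cursor_block');
BEGIN
  IF event_name = 'WHEN-NEW-FORM-INSTANCE' THEN
    -- Generic event: fires for every form, so always guard by form name
    IF form_name = 'XXDEMOFRM' THEN      -- hypothetical form
      NULL;  -- e.g. change prompts, hide fields, set visual attributes
    END IF;
  ELSIF event_name = 'SPECIAL15' THEN
    -- Responds to a special-menu entry that was previously set up with
    -- app_special2.instantiate('SPECIAL15', 'Open Custom Form');
    NULL;  -- e.g. call fnd_function.execute(...) to open a custom form
  END IF;
END event;
```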

TEMPLATE Form Module Objects



a) It has platform-independent references to object groups in the APPSTAND form (STANDARD_PC_AND_VA, STANDARD_TOOLBAR, and STANDARD_CALENDAR)

b) Attachments of several PL/SQL libraries (APPCORE, FNDSQF, APPDAYPK, etc.)

c) The program unit APP_CUSTOM, which includes the specification as well as the body

d) Many referenced objects (from the object groups) that support the Calendar, Toolbar, alternative regions, and Menu

e) The Applications color palette, pure colors, and various other colors named after their content of Red, Blue and Green

f) Sample objects that show typical items and layout cosmetics. To remove them entirely from your form, delete the following objects:
• Blocks: BLOCKNAME, DETAILBLOCK
• Window: BLOCKNAME
• Canvas view: BLOCKNAME

g) PL/SQL Libraries listed below:

APPCORE: It contains the packages and procedures that are required of all forms to support the menu, Toolbar, and other required standard behaviors.
APPDAYPK: It contains the packages that control the Oracle Applications Calendar feature.
FNDSQF: It contains packages and procedures for Message Dictionary, flexfields, profiles, and concurrent processing.
CUSTOM: This library allows extension of Oracle Applications forms without modification of Oracle Applications code.
GLOBE: This library allows Oracle Applications developers to incorporate global or regional features into Oracle Applications forms without modification of the base Oracle Applications form.
VERT: This library allows Oracle Applications developers to incorporate vertical industry features (for automotive, consumer packaged goods, energy, and other industries) into Oracle Applications forms without modification of the base Oracle Applications form.
JA: This library contains code specific to the Asia/Pacific region and is called by the GLOBE library.
JE: This library contains code specific to the EMEA (Europe/Middle East/Africa) region and is called by the GLOBE library.
JL: This library contains code specific to the Latin America region and is called by the GLOBE library.

Form Compilation Commands



Log in to the application server on the Unix box to compile forms.


R11i

f60gen module=CUSTOM.pll userid=apps/(appspwd) module_type=LIBRARY batch=NO compile_all=special output_file=$AU_TOP/resource/CUSTOM.plx

f60gen module=XXPOCF.fmb userid=apps/(appspwd) module_type=form batch=no compile_all=special output_file=$XXPO_TOP/forms/US/XXPOCF.fmx

R12

$ORACLE_HOME/bin/frmcmp_batch module_type=LIBRARY module=$AU_TOP/resource/CUSTOM.pll userid=apps/(appspwd) output_file=$AU_TOP/resource/CUSTOM.plx compile_all=special

$ORACLE_HOME/bin/frmcmp_batch module=$XXFND_TOP/forms/US/XXFND_FHLOG.fmb userid=apps/(appspwd) output_file=$XXFND_TOP/forms/US/XXFND_FHLOG.fmx module_type=form compile_all=special

TKPROF Utility



1) TKPROF is a utility (a Unix command) that formats a raw trace file, which can be huge, into a more readable format for performance analysis.

2) Oracle writes trace files to the user_dump_dest directory specified in the init.ora.
Use the query below to find the location of the trace files:
select value from v$parameter
where name = 'user_dump_dest';
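Alternatively, on 11g and later, where v$process exposes a tracefile column, the current session's trace file can be located directly; a sketch:

```sql
-- Locate the trace file for the current session
-- (the tracefile column exists in v$process from Oracle 11g onward)
SELECT p.tracefile
  FROM v$session s, v$process p
 WHERE s.paddr  = p.addr
   AND s.audsid = USERENV('SESSIONID');
```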


3) Use the following Unix command on the server:
TKPROF tracefile exportfile [explain=username/password] [table= …] [print= ] [insert= ] [sys= ] [record=..] [sort= ]

e.g. tkprof ora_12345.trc output.txt explain=scott/tiger sort=(prsela,exeela,fchela)
Here the sort is on elapsed parse time, elapsed execute time, and elapsed fetch time.

Tuesday, June 8, 2010

Enabling Trace in Apps



Form Tracing
Suppose you have a form where you enter a few values and then save. Saving executes insert and update commands in the backend. SQL tracing is used to capture those DML statements to debug an issue.

1) Open the form and Navigate to Menu- Help -> Diagnostic -> Trace

2) Several options are available here (the default is ‘No Trace’). Enable tracing by selecting one of the options. ‘Regular Trace’ gives the least information and ‘Trace with Binds and Waits’ (level 12) gives the most. Be careful when enabling SQL trace with waits and binds, as this makes the trace file huge.

3) Tracing can also be started in the middle of a forms session.

4) When you enable the trace, it displays the trace file location (this is the directory given by the database's user_dump_dest parameter)

5) After you save the form you can stop tracing by selecting ‘No Trace’ again from Help -> Diagnostic -> Trace -> No Trace

6) Finally, use the tkprof command to convert this raw trace file into a readable format. TKPROF is explained in a separate article.

Report Tracing

1) Login to System Administrator responsibility and open concurrent program definition form

2) Check the 'Enable Trace' checkbox and save the program

3) In order to generate a Level 12 trace file, navigate to Profile > System and find the profile option 'Initialization SQL Statement - Custom' for the user submitting the process to be traced. Click in the User column, choose Edit Field, and enter the statement below (note the doubled single quotes):
begin fnd_ctl.fnd_sess_ctl('','','TRUE','TRUE','LOG','ALTER SESSION SET EVENTS=''10046 TRACE NAME CONTEXT FOREVER, LEVEL 12'''); end;
Save the record.

4) Run the concurrent program, which generates the trace file on the server.

5) Run the following query to find the directory where the trace file is stored:
SELECT name, value FROM v$parameter WHERE name = 'user_dump_dest';
Go to the directory returned by user_dump_dest and use the command below to find the file:
grep -i 'CONC_REQUEST_ID=' *
e.g. grep -i 'CONC_REQUEST_ID=1499133' *

6) Finally, use the tkprof command to convert this raw trace file into a readable format.

How to Use FNDLOAD?

Creating the same setup data on each instance separately is a time-consuming task. Hence, to migrate setup data from one instance to another (e.g., from DEV to PROD), LDT and LCT files are used.

LDT (Data Loader Files) - These files are used to upload and download setup data across instances.

LCT (Data Config Files) - These files are used to create LDT files. There are different LCT files for different sets of data.

Eg:
Request Set- afcpreqg.lct, Concurrent Program- afcpprog.lct, Value Set- afffload.lct..etc..

** Now let's take a simple example where you want to migrate one concurrent program from the Development (DEV) instance to Production (PROD).

Step 1: Log in to the Unix development server and change to the directory where you will store your LDT files:
cd $CUST_TOP/patch/115/import/US/ (where CUST_TOP is your custom top on the server)

Step 2: Run the following command, supplying the parameters given in {} as per your need:
FNDLOAD {userid/password@connect_string} 0 Y DOWNLOAD $FND_TOP/patch/115/import/afcpprog.lct {file_name.ldt} PROGRAM APPLICATION_SHORT_NAME="{your_application_short_name}" CONCURRENT_PROGRAM_NAME="{your_concurrent_name}"
The above command will generate the LDT file in your custom top directory ($CUST_TOP/patch/115/import/US/).

Step 3: Copy the LDT file from Development Server to your local desktop

Step 4: Log in to the Unix production server and copy the LDT file from your local desktop to the production server at the following path:
$CUST_TOP/patch/115/import/US/{file_name.ldt}

Step 5: Finally, run the following command, supplying all parameters in {}, to upload the concurrent program definition to the production application server.
$ FNDLOAD {userid/password@connect_string} 0 Y UPLOAD $FND_TOP/patch/115/import/afcpprog.lct $CUST_TOP/patch/115/import/US/{file_name}.ldt

Note:
1) All the FNDLOAD commands above are single-line commands.
2) All the steps above remain exactly the same for every kind of setup data (e.g., value set, concurrent program, request group, DFF, etc.) except the name of the LCT file (.lct). The LCT file differs for each component because the setup data is different.
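For instance, the same download/upload pattern with the value set configuration file (afffload.lct) would look like this; the value set name XX_DEMO_SET is hypothetical:

```shell
# Download a value set definition to an LDT file (on DEV)
FNDLOAD apps/$APPS_PWD 0 Y DOWNLOAD $FND_TOP/patch/115/import/afffload.lct \
  XX_DEMO_SET.ldt VALUE_SET FLEX_VALUE_SET_NAME="XX_DEMO_SET"

# Upload the same LDT file on the target instance (on PROD)
FNDLOAD apps/$APPS_PWD 0 Y UPLOAD $FND_TOP/patch/115/import/afffload.lct \
  XX_DEMO_SET.ldt
```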

Refer to below link for list of various FNDLOAD commands
http://www.oracleappshub.com/aol/its-all-about-fndload/

Types of Value Sets

None (not validated at all)
Independent
Dependent
Table
Special (advanced)
Pair (advanced)

Attention: The Accounting Flexfield only supports Independent, Dependent, and Table validation. Also, you cannot change the validation type of an existing value set, since your changes affect all flexfields and report parameters that use the same value set.

None
You use a None type value set when you want to allow users to enter any value so long as that value meets the value set formatting rules. That is, the value must not exceed the maximum length you define for your value set, and it must meet any format requirements for that value set. Because a None value set is not validated, a parameter that uses this value set does not provide a list of values for your users.

Independent
An Independent value set provides a predefined list of values for a segment. These values can have an associated description. For example, the value 01 could have a description of "Company 01". The meaning of a value in this value set does not depend on the value of any other segment. Independent values are stored in an Oracle Application Object Library table. You define independent values using an Oracle Applications window, Segment Values.

Table
A table-validated value set provides a predefined list of values like an independent set, but its values are stored in an application table. You define which table you want to use, along with a WHERE clause to limit the values you want to use for your set. Typically, you use a table-validated set when you have values that are already maintained in an application table (for example, a table of vendor names maintained by a Define Vendors form). Table validation also provides some advanced features, such as allowing a segment to depend upon multiple prior segments in the same structure.

Dependent
A dependent value set is similar to an independent value set, except that the available values in the list and the meaning of a given value depend on which independent value was selected in a prior segment of the flexfield structure. You can think of a dependent value set as a collection of little value sets, with one little set for each independent value in the corresponding independent value set. You must define your independent value set before you define the dependent value set that depends on it. You define dependent values in the Segment Values windows, and your values are stored in an Oracle Application Object Library table.

Special and Pair
Special and pair value sets provide a mechanism to allow a "flexfield-within-a-flexfield". These value sets are primarily used for Standard Report Submission parameters. You do not generally use these value sets for normal flexfield segments.

Customer Interface Performance Tips



1) Define the profile option "HZ: Number of Workers Used by Customer Interface".

2) Use the Customer Interface Master concurrent program (RACUSTMA) instead of "Customer Interface" (RACUST).

3) Set the profile option "HZ: Gather Table Stats" to "Yes".

4) Run the Gather Schema Stats program and rebuild all AR indexes with the ONLINE clause.

Transaction Conversion Error/Resolution



**ERROR
raapem: no errors recorded
raavgi()+ Current system time is 11-JUN-2009 06:21:11
raavgi()- Current system time is 11-JUN-2009 06:21:12
/u03/dev1/apps/apps_st/appl/ar/12.0.0/bin/RAXTRX
Program was terminated by signal 11

**SOLUTION
This error comes up if CODE_COMBINATION_ID is null for any of the transactions in the interface table RA_INTERFACE_DISTRIBUTIONS_ALL. Set up the Receivables and Revenue accounts properly for all transaction types so that the correct CCID gets populated in the interface table; this gets rid of the signal 11 error.

**ERROR
ERROR: from eBusiness Tax - return code = 2
Current system time is 08-MAY-2009 13:02:07
Please review the plsql debug log for additional details.
Error calling raaebt()
Error calling raapic()
Error.No.2) ZX_API_PUB.import_document_with_tax returns failure
An unexpected error has occurred. Please contact your system administrator.
arp_etax_autoinv_util.calculate_tax()-

**SOLUTION
Run the following script in a TEST environment first and then in PROD: run ARHGEFTR.sql manually from SQL*Plus (Dir: $AR_TOP/patch/115/sql/). This will sync the data between fnd_territories_vl and hz_geographies.

**ERROR
calling raapem to record error
raapem: 2

**SOLUTION
Precision length is zero for the currency codes. Increase the length to 2.

**ERRORError calling arcdsu_do_setup()
Error calling raaini()
+---------------------------------------------------------------------------+
No completion options were requested.
AR_NO_PROFILE
+---------------------------------------------------------------------------+
Exceptions posted by this request:
Concurrent Request for "Autoinvoice Import Program" has completed with error.

**SOLUTION
Complete the setup as per Metalink Doc ID 733835.1 (Points 5A & 7A).

**ERROR
APP-AR-11526: ORA-01458: invalid length inside variable character string
select NAME ,DECODE(CUST_TRX_TYPE_RULE,'Id','I','R') ,DECODE(BILL_CUST
APP-AR-11526: 7207185/ar/lib/raamtr.opc 371

**SOLUTION
Extend the buffer size in the AR System Options screen: (N) Setup > System > Options (Tab: Trans. and Customers). Set Max Memory in Bytes = 3000000 and save the changes. This setup step will prevent the error.

Discoverer Installation Steps (Secure Env)



1) Install the BI tool (Discoverer) using the link below:
http://www.oracle.com/technology/software/products/ids/htdocs/101202winsoft.html
- Apply patch 5983622 to upgrade Discoverer to the latest version, 10.1.2.55.26:
https://updates.oracle.com/Orion/PatchDetails/process_form?patch_num=5983622

2) Identify the Oracle Home for Discoverer
For example: You could find dis51usr.exe under the directory D:\oracle\BIToolsHome_1\bin. Then the Oracle Home is D:\oracle\BIToolsHome_1

3) Create a folder named "secure" in the ORACLE_HOME directory identified in step 2 (i.e., D:\oracle\BIToolsHome_1\Secure). Copy the DBC files into this secure folder.

4) Discoverer requires Windows Environment variables to dynamically retrieve the location of the .dbc file on the PC.

5) Open Windows Control Panel and double click on the System icon.

6) Select the Environment tab and create two new System Variables
FND_SECURE = ORACLE_HOME\Secure
FND_TOP= ORACLE_HOME
NOTE: The Oracle Home is the one identified in step 2.

7) Restart the PC; you should then be able to connect to Discoverer against an Oracle Applications-mode EUL in secure mode.

Zoom Function in Forms



The following code enables the Zoom function for a particular block of a form:

FUNCTION zoom_available
RETURN BOOLEAN
IS
form_name VARCHAR2 (30) := NAME_IN ('system.current_form');
block_name VARCHAR2 (30) := NAME_IN ('system.cursor_block');
BEGIN
if (form_name = 'OEXOEORD' AND block_name = 'LINE') then
RETURN TRUE;
else
RETURN FALSE;
end if;
END zoom_available;

The following code opens the Onhand Quantity form from the Sales Order form, passes the item name to it, and navigates to the Item field when the Zoom button is clicked:

procedure event (event_name varchar2)
is
  form_name      varchar2(30)  := NAME_IN ('system.current_form');
  block_name     varchar2(30)  := NAME_IN ('system.cursor_block');
  param_to_pass1 varchar2(255);
  b              varchar2(20);
begin
  if (event_name = 'ZOOM') then
    if (form_name = 'OEXOEORD' AND block_name = 'LINE') then
      param_to_pass1 := NAME_IN ('LINE.ORDERED_ITEM_DSP');
      fnd_function.EXECUTE (function_name => 'INV_INVMATWB',
                            open_flag     => 'Y',
                            session_flag  => 'Y',
                            other_params  => 'ITEMS="' || param_to_pass1 || '"');
    end if;
  end if;

  if (event_name = 'WHEN-NEW-RECORD-INSTANCE') then
    if (form_name = 'INVMATWB' AND block_name = 'MATERIAL_QF') then
      b := fnd_profile.VALUE ('USERNAME');
      fnd_message.set_string (NAME_IN ('parameter.ITEMS')
                              || ' is entered by user '
                              || b);
      fnd_message.show;
      GO_ITEM ('MATERIAL_QF.ITEM');
      COPY (NAME_IN ('parameter.ITEMS'), 'MATERIAL_QF.ITEM');
      VALIDATE (item_scope);
    end if;
  end if;
end event;

Interface Vs Conversion


Interface

1) Connection between two systems in order to synchronize the data.
2) Manual, batch or real-time.
3) Used repeatedly and should therefore be designed and constructed in the most efficient manner possible.
4) Can be triggered by an event (such as running a concurrent program) or it can be scheduled to run at a certain time.
5) Can be very costly to construct and maintain

Conversion
1) Process where existing data from the client’s old system is extracted, cleansed, formatted, and installed into a new system.
2) Manual or automated.
3) One-time only process that requires extensive testing and preparation.
4) Executed before a system goes into production

Descriptive Flexfields (DFF)


General Ledger

Daily Rates
Historical Rates

Payables
Bank Branch
Payment Term
Site Address

Receivables
Credit History Information
Address Information

Assets
Bonus Rates
Calendar Types
Price Indexes

HRMS
Additional Address Details
Additional Assignment Details
Additional Hierarchy Details
Additional Job Details
Additional Position Details
Additional Qualification Details
Additional Personal Details

Key FlexFields (KFF)


General Ledger

Accounting Flexfield
Reporting Attribute: Accounting

Receivables
Sales Tax Location
Territory

Assets
Asset
Category
Location

Inventory
Account Aliases
Item Catalogs
Item Categories
Sales Orders
Stock Locators
System Items

HRMS & Payroll
Grade
Job
Personal Analysis
Position
Competence
Cost Allocation
People Group

Sunday, June 6, 2010

Base Tables for Data Extraction


GL Journals

GL_JE_BATCHES
GL_JE_HEADERS
GL_JE_LINES
GL_CODE_COMBINATIONS
AP Suppliers

PO_VENDORS
PO_VENDOR_SITES_ALL
PO_VENDOR_CONTACTS

----------------
AP_SUPPLIERS
AP_SUPPLIER_SITES_ALL
AP_SUPPLIER_CONTACTS

AP Invoices


AP_INVOICES_ALL

AP_INVOICE_LINES_ALL
AP_INVOICE_DISTRIBUTIONS_ALL
AP_TERMS
AP_HOLDS_ALL
PA_PROJECTS_ALL
PA_TASKS
PO_VENDORS

AR Customers
HZ_CUST_ACCOUNTS
HZ_PARTIES
HZ_CUST_ACCT_SITES_ALL
HZ_LOCATIONS
HZ_PARTY_SITES
HZ_CUST_SITE_USES_ALL

AR Customer Profiles


HZ_CUSTOMER_PROFILES
HZ_CUST_PROFILE_CLASSES
HZ_CUST_PROFILE_AMTS
AR_COLLECTORS
RA_TERMS
RA_GROUPING_RULES
AR_STATEMENT_CYCLES

AR Customer Contacts
RA_CONTACTS
RA_PHONES

AR Transactions
AR_PAYMENT_SCHEDULES_ALL
RA_CUSTOMER_TRX_ALL
RA_CUST_TRX_TYPES_ALL
RA_CUSTOMER_TRX_LINES_ALL
RA_CUST_TRX_LINE_GL_DIST_ALL
RA_TERMS
HR_OPERATING_UNITS

AR Receipts


AR_CASH_RECEIPTS_ALL
AR_CASH_RECEIPT_HISTORY_ALL
AR_RECEIVABLE_APPLICATIONS_ALL
AR_RECEIPT_METHODS
AR_RECEIPT_CLASSES
AR_PAYMENT_SCHEDULES_ALL

PO Standard


PO_HEADERS_ALL
PO_LINES_V
PO_LINE_LOCATIONS_ALL
PO_DISTRIBUTIONS_V

PO Releases

PO_RELEASES_ALL

PO Requisitions
PO_REQUISITION_HEADERS_V
PO_REQUISITION_LINES_V

Project Data


PA_PROJECTS_ALL
PA_TASKS
PA_PROJECT_PARTIES
PA_PROJECT_ROLE_TYPES_B

Project Transaction Data
PA_EXPENDITURE_ITEMS_ALL
PA_COST_DISTRIBUTION_LINES_ALL
PA_EXPENDITURES_ALL
PA_EXPENDITURE_COMMENTS
PA_PROJECTS_ALL
PA_TASKS

INV Items
MTL_SYSTEM_ITEMS_B
ORG_ORGANIZATION_DEFINITIONS

INV Item Categories
MTL_CATEGORY_SETS_TL
MTL_CATEGORIES
MTL_ITEM_CATEGORIES
MTL_SYSTEM_ITEMS
ORG_ORGANIZATION_DEFINITIONS
MTL_ORGANIZATIONS

INV Item SubInventory

 

MTL_ITEM_SUB_INVENTORIES
ORG_ORGANIZATION_DEFINITIONS
MTL_SYSTEM_ITEMS_B
MTL_ITEM_LOCATIONS
MTL_SECONDARY_LOCATORS


Attachments
FND_DOCUMENTS_LONG_TEXT
FND_DOCUMENTS_SHORT_TEXT
FND_DOCUMENT_CATEGORIES_TL
FND_DOCUMENT_DATATYPES
FND_DOCUMENTS_TL
FND_DOCUMENTS
FND_ATTACHED_DOCUMENTS

FND_DOC_CATEGORY_USAGES

Saturday, June 5, 2010

FA Purging



There are two types of purging in Oracle Assets:
1) Archive and Purge Transaction and Depreciation data
2) Delete and Purge Mass Additions Data

Archiving & Purging Depreciation Data
Prerequisites are,
- If necessary, navigate to System Administrator > Profile > System and update the profile option FA: Archive Table Sizing Factor. You can find out more on this at
http://download.oracle.com/docs/cd/A60725_05/html/comnls/us/fa/archiv01.htm#p_arc_resize
- Before you perform the purge, you must check the Allow Purge checkbox for the book in the Book Controls window. Navigate to Setup > Asset System > Book Controls in the FA Manager responsibility.
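To confirm that purging is enabled for a book, you can check FA_BOOK_CONTROLS directly (the ALLOW_PURGE_FLAG column name is assumed from the standard FA data model; verify in your release):

```sql
SELECT book_type_code,
       allow_purge_flag
FROM   fa_book_controls
WHERE  book_type_code = '&book';
```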

Detailed Steps are as below:
1) Change Responsibility to FA PURGE.
2) Open the Archive and Purge window.
3) Enter the Book and Fiscal Year you want to archive. You must archive and purge in chronological order.
4) Click the Archive button to submit a concurrent request that changes the status from New to Archived and creates temporary archive tables holding the data to be purged.
5) Oracle Assets automatically assigns an Archive Number when you save your work. Each temporary table name ends in the five-digit archive number, as below:
FA_ARCHIVE_SUMMARY_<archive number>
FA_ARCHIVE_DETAIL_<archive number>
FA_ARCHIVE_ADJUSTMENT_<archive number>
6) Export the archive tables to a storage device.
-----------------------------------------------------
7) Return to the Archive and Purge window and use the Archive Number to find the archive you want to purge.
8) Click the Purge button to submit a concurrent request that changes the status from Archived to Purged and removes the archived data from the Oracle Assets tables. The DBA can then drop the temporary archive tables.
9) You can only purge definitions with a status of Archived or Restored.

Restoring Data

1. To restore records that you have purged from Oracle Assets, you must first import the tables from your archive, and then perform the restore. Since the archive number is part of the temporary table name, Oracle Assets restores only the records that were archived during the archive you specify.
2. You can only restore the most recently purged fiscal year, so you must restore fiscal years in reverse chronological order.
Tables Deleted from Database after Purging
FA_DEPRN_SUMMARY
FA_DEPRN_DETAIL
FA_ADJUSTMENTS


Archiving & Purging Mass Additions Data

The Delete Mass Additions program removes mass additions in the DELETE and POSTED queues. It also removes SPLIT parents if the split children have been posted or deleted. The program archives mass additions in the DELETE queue only to an audit trail table, FA_DELETED_MASS_ADDITIONS.

Prerequisites are,
If required, run the Delete Mass Additions Preview report from the respective FA Manager responsibility

Detailed Steps are as below:

1) Navigate to the FA PURGE responsibility and click Delete Mass Additions
2) In the Parameters window, specify the corporate book for which you want to clean up mass additions.
3) Choose Submit to submit a concurrent process to remove the lines. When the program completes successfully, Oracle Assets automatically runs the Mass Additions Delete report, which gives you an audit trail of the processed mass addition lines.
4) Oracle Assets maintains an audit trail by moving the lines from the base table to the interim table FA_DELETED_MASS_ADDITIONS
5) Review the log file and report after the request completes.
------------------------------------------------
6) Click on Purge Mass Additions from the Navigator.
7) Enter the Batch Number of the Create Mass Additions batch associated with the deleted mass additions for which you want to purge the audit trail from Oracle Assets.
8) Choose Submit to submit a concurrent process that removes archived lines from the Audit trail table for deleted mass additions.
9) When the program completes successfully, Oracle Assets automatically runs the Mass Additions Purge Report, which lists the mass addition lines you purged.
10) Review the log file and report after the request completes.

Tables Deleted from Database after Purging
- FA_MASS_ADDITIONS

AR Purging



There are 2 Types of Purging in Oracle Receivables:
1) Autoinvoice Interface Purge
2) Transaction Data Archive & Purge

Autoinvoice Interface Purge
1) Login to corresponding AR Purge User responsibility

2) Run ‘Autoinvoice Purge Program’ from SRS window

3) The program deletes the data from the following tables for the specific operating unit attached to the responsibility:
RA_INTERFACE_DISTRIBUTIONS_ALL
RA_INTERFACE_SALESCREDITS_ALL
RA_INTERFACE_LINES_ALL
RA_INTERFACE_ERRORS_ALL
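A quick row count on the interface tables before and after the run shows what the purge removed; a sketch:

```sql
SELECT 'RA_INTERFACE_LINES_ALL' tab, COUNT(*) row_count FROM ra_interface_lines_all
UNION ALL
SELECT 'RA_INTERFACE_DISTRIBUTIONS_ALL', COUNT(*) FROM ra_interface_distributions_all
UNION ALL
SELECT 'RA_INTERFACE_SALESCREDITS_ALL', COUNT(*) FROM ra_interface_salescredits_all
UNION ALL
SELECT 'RA_INTERFACE_ERRORS_ALL', COUNT(*) FROM ra_interface_errors_all;
```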


Transaction Data Archive & Purge
1) Login to corresponding AR Archive & Purge User responsibility

2) Run the ‘Call New Archive and Purge’ program from the SRS window, which in turn calls the ‘New Archive and Purge’ program

3) This program archives the data into the AR_ARCHIVE_HEADER and AR_ARCHIVE_DETAIL tables and deletes transaction data from the following base tables:
AR_ACTION_NOTIFICATIONS
AR_ADJUSTMENTS
AR_BATCHES
AR_CALL_ACTIONS
AR_CASH_BASIS_DISTRIBUTIONS
AR_CASH_RECEIPTS
AR_CASH_RECEIPT_HISTORY
AR_CORRESPONDENCE_PAY_SCHED
AR_CUSTOMER_CALL_TOPICS
AR_MISC_CASH_DISTRIBUTIONS
AR_NOTES
AR_PAYMENT_SCHEDULES
AR_RATE_ADJUSTMENTS
AR_RECEIVABLE_APPLICATIONS
RA_BATCHES
RA_CUSTOMER_TRX
RA_CUSTOMER_TRX_LINES
RA_CUST_TRX_LINE_GL_DIST
RA_CUST_TRX_LINE_SALESREPS
AR_CORRESPONDENCES
AR_DISTRIBUTIONS

4) If the Archive & Purge program fails, you can submit the ‘Archive-Restart’ program from the SRS window. If Archive & Purge has completed the selection and validation phase but fails during the archive phase, Archive-Restart saves the system from having to revalidate all purge candidates. It clears the Archive Header and Detail tables and submits the archive report. When submitting the Archive-Restart program you must provide the following parameters: Archive Level, Summary Report Only, Number of Workers, Commit Size, and Archive Id

5) Run the ‘Archive to File’ program from the SRS window to move your archive data to a file. Also ensure that you move your archive output from the AR_TOP/out directory to an appropriate storage area; otherwise, it will be deleted when your system administrator clears the output directories.

6) Once archive data has been stored the archive tables must be cleared before the next purge run. To clear the archive tables use the TRUNCATE command in SQL with the following tables:
AR_ARCHIVE_HEADER
AR_ARCHIVE_DETAIL
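Step 6 above translates directly to the following statements (run as the table owner, typically AR, and only after the archive file has been safely stored):

```sql
TRUNCATE TABLE ar_archive_header;
TRUNCATE TABLE ar_archive_detail;
```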

7) The following tables will be cleared automatically the next time you run the Archive/Purge programs.
AR_PURGE_TRX_AUX
AR_PURGE_REC_AUX
AR_ARCHIVE_PURGE_LOG
AR_ARCHIVE_PURGE_INTERIM
AR_PURGE_OE_EXCEPTIONS

8) After you purge your database, you should contact your Database Administrator (DBA) so that the tables and indexes from which you purged data can be exported and re-imported. Recreating these objects reduces the space each object occupies in its tablespace and improves the performance of your system.