I always forget these profile options and end up googling for them, so I am putting them here for quick reference.
Profile Name: Personalize Self-Service Defn
– Yes enables the personalization link on all pages. This should be No in a production environment; for the development team it is better to set it to Yes at the user level.
This enables the Personalize Page link at the top left corner of the page, which allows us to do personalizations from the page itself.
Profile Name: FND: Personalization Region Link Enabled
– Yes displays all regional links
– Minimal displays key regional links
This is used to get the Personalize link at the region level.
Profile Name: Disable Self-Service Personal
– Turns off all personalizations
Profile Name: FND: Personalization Document Root Path
– Used for importing/exporting personalizations
Profile Name: FND: Diagnostics
– Activates the About this Page link at the bottom left of the page
Profile Name: FND:OA:Enable Defaults
– Allows defaulting through personalizations to take place
Monday, December 27, 2010
Thursday, December 9, 2010
Steps to Move OA page personalizations at responsibility/Organization level
Hi All,
When it comes to OA page personalizations, always be careful while defining them.
Don't define personalizations at the responsibility or organization level unless it is absolutely required, because deploying personalizations between instances is tedious work for the ones defined at those levels.
1. List all the personalizations defined at the responsibility level for the page.
2. Export the personalizations from the source instance.
3. Move the personalization .jar/.zip files to the target instance.
4. Unzip the files in the target instance.
5. Rename the folders with the corresponding responsibility IDs of the target instance.
Use this SQL to get the responsibility ID in the target instance:
select responsibility_key, responsibility_id
  from fnd_responsibility
 where responsibility_key = 'XX_CUST_REP'
Rename the folder with the target instance responsibility ID, e.g.:
/oracle/apps/pa/project/webui/customizations/responsibility/60781/MyProjectsPG
Here the 60781 folder needs to be renamed to the responsibility ID of the target instance, e.g. 77876.
Note: In case the responsibility key also differs between the instances, the page .xml file (the personalization file) needs to be changed by replacing the responsibility key with the corresponding target key.
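The rename-and-rekey step above can be sketched in a small script. All the paths, IDs, and keys below are hypothetical demo values (the script even fakes an exported folder first, so it runs standalone); on a real instance you would point it at the directory you unzipped in step 4.

```python
import os

# Hypothetical IDs/keys for illustration; substitute your own values.
src_resp_id, tgt_resp_id = "60781", "77876"
src_resp_key, tgt_resp_key = "XX_CUST_REP", "XX_CUST_REP_TGT"

# Demo path; in reality this sits under the unzipped export root.
base = "customizations/responsibility"
old_dir = os.path.join(base, src_resp_id)

# Demo setup only: fake a single exported personalization document.
os.makedirs(old_dir, exist_ok=True)
with open(os.path.join(old_dir, "MyProjectsPG.xml"), "w") as f:
    f.write('<customization key="XX_CUST_REP"/>')

# Step 5: rename the folder to the target instance responsibility_id.
new_dir = os.path.join(base, tgt_resp_id)
os.rename(old_dir, new_dir)

# Note from above: if the responsibility key also differs, replace it
# inside every personalization .xml under the renamed folder.
for root, _dirs, files in os.walk(new_dir):
    for name in files:
        if name.endswith(".xml"):
            path = os.path.join(root, name)
            with open(path) as f:
                text = f.read()
            with open(path, "w") as f:
                f.write(text.replace(src_resp_key, tgt_resp_key))
```

After this the folder structure matches the target instance and the files are ready for step 6.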
6. Import the personalizations.
Use the Functional Administrator responsibility > Personalization tab > import/export personalizations.
Friday, December 3, 2010
Intro to Interfaces -- Part 2
Continuing with my earlier post from a long way back...
Let's see what inbound interfaces are.
Inbound interfaces: inbound interfaces are where data comes into our system. Whenever we move from a legacy system to Oracle Applications, we need to move the data residing in that system (master and transactional data) from the legacy system to the new Oracle Apps system.
Inbound interfaces are broadly classified into two types:
1. Interfaces
2. Conversions
Conversion: a conversion is a one-time activity where we move data from the legacy system into Oracle Applications.
Interface: an interface is an integration between Oracle Applications and some third-party application, which will continue to exist even after migrating to the new Oracle Apps system.
Let's talk about conversions.
In a typical conversion process we load data provided by the client team into Oracle Applications. Usually the data is provided in .dat format or in Excel (*.csv).
The usual steps involved in a conversion are:
1. Load the .dat/.csv files into a staging table. This step is done using external tables or SQL*Loader.
2. Perform basic validations on the data in the staging table.
3. Derive any required data based on conditional logic, or default some of the columns.
4. Mark the valid records in the staging table.
5. Pull all the valid records and insert them into the open interface tables, or call the APIs.
6. If the data is loaded into interface tables, run the standard interface program programmatically or submit it as part of a request set.
7. Once the interface program runs, check whether any records are left in the open interface tables in an error state.
8. Update the corresponding staging table records with the error status and error message.
9. Pull all the error records from the staging table and generate an error report with the corresponding error messages.
Typical conversion examples are item, customer, supplier, etc.
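The load/validate/mark flow above can be sketched as a toy script. The CSV columns and the validation rule are hypothetical (a suppliers file with name and email), and a Python list stands in for the staging table; a real conversion would use SQL*Loader and PL/SQL against actual tables.

```python
import csv, io

# Hypothetical legacy extract: a suppliers CSV with name,email columns.
csv_data = "name,email\nAcme,ops@acme.com\nNoMail,\n"

# Step 1: load the file into a "staging table" with every record in status N.
staging = [dict(row, status="N", error=None)
           for row in csv.DictReader(io.StringIO(csv_data))]

# Steps 2-4: validate each record and mark it valid (V) or errored (E).
for rec in staging:
    if not rec["email"]:
        rec["status"], rec["error"] = "E", "email is required"
    else:
        rec["status"] = "V"

# Step 5: only valid records would go to the open interface / API call.
valid = [r for r in staging if r["status"] == "V"]

# Steps 8-9: errored records feed the error report.
errors = [(r["name"], r["error"]) for r in staging if r["status"] == "E"]
print(len(valid), errors)   # → 1 [('NoMail', 'email is required')]
```

The key idea is that the staging table carries a status column end to end, so the error report is just a query over that column.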
One of the conversion approaches we took recently, where the data volume was very high, was to divide the program into a two-step process:
1. Load the data.
2. A program to validate and load the data.
This process is very useful when we have many custom validations before loading data into the system, so we developed the custom program to run in two modes:
1. Validation mode
2. Validation-and-insert mode
For this process we have a processed-status column in the staging table; we load the data with status N. Once we validate the data we mark the records as V, and error records as CE (custom validation error).
In validation mode we just validate the data loaded in the staging table, generate the error report based on the custom validation logic, and mark the records as V or CE.
In validation-and-insert mode we pick all the records marked as V and N. The validation logic is performed on records in N status only; we then load all the records marked as V into the open interface tables or call the APIs.
By following this process we avoid multiple iterations of the full conversion.
The conversion process itself is iterative: we clean the data by performing multiple mock conversions.
Adding validations: the best approach is to give the concurrent program the ability to switch validations on and off whenever required. Normally we will have around 2-3 mock conversions before we load data into the production system. If we have the option to switch off validations, we can switch off the time-consuming ones, because by the time we go to production we would have already cleaned our data.