
KCS Template Quick Reference
KCS Content Style Guide
Title: FAQ-Importing from ADP into Abila Fund Accounting
Question: Any suggestions on importing from ADP?
Product: Abila Fund Accounting
Module: Payroll, Accounting
Version/SP: ALL
Solution:
This document covers the basics of importing information from ADP into Abila Fund Accounting. It
assumes the operator has some level of experience with the SFA import process.
If not, please consult the Abila Fund Accounting Support Information Guide for Basic Data
Import.
The examples in this document are pulled from the MasterT3.DEF file which is located on the SFA
server in the MIP Share\IMPORT\Master DEF Files folder. It is recommended that you copy this file
and modify it to fit your import needs.
Basic process:
The basic process of getting information from ADP to SFA is as follows:
- In ADP, an export file that contains accounting information is created. The format and
spacing of the file and its data should be consistent every time.
- A definition file is created to the specifications of the SFA database in relation to the ADP
export file. The definition file maps the information from the ADP export file to an unposted
session in SFA.
- In SFA, the user, accessing the DEF file, imports the information into the SFA database,
usually as an unposted Journal Voucher session.
Terminology:
Export File – The file created by ADP. This file needs to be either a Text (.TXT) or Comma-
Separated Values (.CSV) file. For troubleshooting purposes we recommend the CSV file
format, which is much easier to scan and comprehend when attempting to locate
problems.
Definition File (DEF file) – This is a text file that contains commands for the SFA system to
use in the import process. It acts as map to match data up in the Export file with its proper
location when creating an unposted session in SFA.
Segment – SFA uses a table driven chart of accounts. Each database has two or more
segments. Each segment has certain properties that control how it behaves in SFA. There are
usually multiple account codes in each segment. There are five different types of segments that
can exist in SFA; not all databases will necessarily have all of them. The segment types are
Fund, GL, Balancing, Non-Balancing, and Restriction.
Required Field – A specific type of value that is required for the transaction to be imported.
Default Value – A value hard coded into the DEF file that is used if no information for that field
is present in the data file.
Offset Account Assignments – Designations in SFA that are used to speed data entry. They
automatically assign the second half of a transaction when only the first half is imported. One
example would be an Expense transaction that automatically offsets to a Cash account.
Expense Type Account – A sub classification of an account found in the General ledger
segment. Expense type accounts have additional data entry requirements when compared to
balance sheet type accounts.
Balance Sheet Type Account – A sub classification of an account found in the General
Ledger segment. Balance sheet type accounts (such as cash) may not have as many required
segments as an Expense type account.
T1 Format – A format for data where a single piece of information is contained on each line.
T3 Format – A format for data where all information is contained within each line.
When preparing to import from ADP the first issue that needs to be resolved is the format of the
export file (T1/T3) and what data it needs to contain to be successfully imported.
SFA supports the import of CSV and TXT files. They can be in the T1 or T3 type format. In
practice the CSV file is easier to work with than the TXT and the T3 is easier to work with than
the T1. The majority of users import CSV type files using a T3 format.
When considering what to export from ADP you must be aware of what fields are required in
SFA. To see which fields are required, turn on the “Highlight Required Fields” option under
Options>Customize Workstation Settings. Then attempt to manually enter a document of the
same type you would be importing, and note which fields/segments are required.
Data to be imported can be classified into one of three levels.
Session level information – Each session has to be unique. One session can contain many
documents.
Document level information – Each document number has to be unique. One document can
contain many transactions.
Transaction level information – The majority of the information is contained at this level. These
are the actual debit and credit entries.
In most cases you will be importing the data as a journal voucher document. The following
information is required in order to do so for most databases. It is listed as it appears in the DEF file.
Session Level Information:
SESSION_SESSIONNUMID – The session ID number in SFA. This needs to be a unique value for
each import. It can usually be alphanumeric. It is possible to default this value in the DEF file but it is
not recommended.
SESSION_STATUS – The posting status in SFA. In most cases the value should be BP for Batch to
Post or BS for Batch to suspend. It is common to default this value in the DEF file to BP.
SESSION_DESCRIPTION – The session description. This is an optional field.
SESSION_SESSIONDATE – This is the date of the session. This can be the same as other date
values in the data.
SESSION_TRANSSOURCEID – The type of transaction being imported. Normally JV (Journal
Voucher). It is common to default this value in the DEF file to JV.
Document Level Information:
TEDOC_SESSION – The session ID the document number will be part of, usually the same as
SESSION_SESSIONNUMID.
TEDOC_TRANSOURCE – The type of transaction the document is. Usually the same as
SESSION_TRANSSOURCEID.
TEDOC_DOCNUM – The document number. If multiple documents are created for the same session,
each document number must be unique, though the same document numbers can be repeated in a
different session.
TEDOC_DESCRIPTION – The document description. This is a required field.
TEDOC_DOCDATE – The document date. This can be the same as other date values in the data.
Transaction Level information:
TETRANS_SESSIONNUMID – The session ID the transaction will be a part of, usually the same as
SESSION_SESSIONNUMID.
TETRANS_DOCNUM – The document number the transaction will be a part of. Usually the same as
TEDOC_DOCNUM.
TETRANS_DESCRIPTION – The Transaction description.
TETRANS_ENTRY_TYPE – This is the accounting entry type, normally N (normal). It is common to
default this in the DEF file.
TETRANS_EFFECTIVEDATE – The effective date of the entry. This can be the same as other date
values in the data.
TETRANS_SEGMENT_GL – This is an example of an account code segment title. In this case it is
for the General Ledger segment.
The value “GL” after TETRANS_SEGMENT_ denotes the segment you are importing into. Review
your Chart of Accounts, find the EXACT names of your segments, and reference these. You will
have one such TETRANS_SEGMENT_XXX line for each segment field you will be populating
during the import process. Please note that the values in the sample DEF files are based on the
SSA database. Each database will have different segment names and vary in the number of
segments.
TETRANS_DEBIT – The column for your debit value.
TETRANS_CREDIT – The column for your credit value.
The fields above will be required for ALL JV type imports regardless of what database is being
used since the GL segment is the minimum required segment for all databases. Additional
Transaction Level Information will be required for the actual segment coding that is imported (the
debits and credits). What information is required and how many segment fields you need will vary
from database to database.
Below are some general guidelines for figuring out what data is needed:
First, how many segments does your database have, and what type are they? The easiest way to find
out is to log into the Administration Module and navigate to: Organization>Organization
Information>Segments. This is a listing of the segment names within the database, with an indication
of the type of segment, the character type (alpha, numeric, alphanumeric), and the segment length.
The type of segment is important because it will partially govern what information is required when
coding information to Balance Sheet type accounts.
For Fund and Balancing (BAL) type segments, information is required for all transaction lines,
even for balance sheet type GL accounts.
For Restriction (RES) type segments, information is not always required but it is strongly advised that
it be included. If it is not included some financial statements may not run properly.
For Non-Balancing (NBAL) segments and balance sheet type accounts, information is NOT
required.
If you are coding to a REVENUE or EXPENSE type GL account ALL segments are required
regardless of segment type.
If required information is missing the import will fail.
To ensure the creation of a successful export file from ADP, you should have values for all required
segments for the General Ledger accounts you are importing into. In most cases you are importing
into Expense type General Ledger accounts, which require all segments, while the second half of
the transaction imports into Liability and Cash type General Ledger accounts, which are balance
sheet type accounts and do not require all segments.
It is very possible that the segment value required (i.e. Restriction Code segment) is not stored or
tracked in ADP. If that is the case then you have two options.
1) Apply a default code for this segment in the data. This can be done either as a global value
exported from ADP into the data file or added within a column of the data file after the export
from ADP, populating the appropriate value manually.
2) Default the value for the segment in the DEF file (remember this would apply to every
transaction line).
Considerations to watch out for when exporting:
When exporting data from ADP it is important to ensure that the data values match the codes in your
database EXACTLY. For example, if you have Fund Code 101 established in ADP, Fund Code 101
must also exist in SFA. It may be necessary to add codes in both applications before the data in both
is an exact duplicate.
Be aware of leading zeros. This is common for both codes and dates. SFA may have a Fund Code of
001. In the ADP export it may display as Fund Code 1. Even if it is exported from ADP as a 3-digit
value of 001, opening the data file in Excel may strip out the leading zeros. It is best to check the
data in a text editor such as PFE32 or WordPad, or to copy the original file and open the copy for
review purposes. Date fields are especially vulnerable to this effect.
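As a quick sanity check before importing, a short script can scan a code column for values that have lost their leading zeros. This is a sketch, not part of SFA; the function name, column index, and expected width are assumptions you would adjust to match your own export.

```python
import csv

def find_short_codes(path, column_index, expected_width):
    """Flag rows where a code column is narrower than expected,
    which usually means leading zeros were stripped."""
    flagged = []
    with open(path, newline="") as f:
        for row_num, row in enumerate(csv.reader(f), start=1):
            value = row[column_index].strip()
            if value and len(value) < expected_width:
                flagged.append((row_num, value))
    return flagged
```

For a 3-character Fund Code in the first column, `find_short_codes("adp_export.csv", 0, 3)` would report any row where the code was exported as `1` instead of `001`.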
If at all possible every distinct piece of information should be contained in its own column. If ADP is
exporting a single value which is the equivalent of the Fund segment and the GL segment, this single
value should be segregated into separate columns rather than run together. It is possible to modify
the DEF file to handle both values in the same column but it is more complicated and harder to
diagnose issues when they occur.
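If the combined column cannot be avoided, splitting it before import is straightforward. Here is a sketch assuming a fixed-width Fund segment; the 3-character default width is an assumption, so use your database's actual segment length.

```python
def split_combined_code(value, fund_width=3):
    """Split a run-together value such as '1014401' into its Fund
    portion and GL portion, assuming a fixed-width Fund segment."""
    return value[:fund_width], value[fund_width:]
```

For example, `split_combined_code("1014401")` returns `("101", "4401")`, which can then be written to two separate columns in the data file.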
Under some settings ADP will put quotation marks or apostrophes around every field of data. When
you open the data in Excel this is not obvious, but if you open the data in a text editor you will see data
similar to this: “101”,”4401”,JV,10/21/2007. You will need to reconfigure the ADP export so that it
does not include the quotation marks.
The order in which the data cells are arranged does not matter as long as the order is not altered
from export to export. If the order ever changes the DEF file will have to be edited to reference the
correct location in the data file.
Configuring the DEF file:
After you have reviewed your export file data format, it is time to configure the DEF file to match the
data cell arrangement.
Several choices will have to be made in order for the system to understand how the import process
will function. Continuing with the T3/CSV combination, you will need to address the following values,
located above the Session block:
The File statement – The file statement indicates the location of the data file to be imported.
The file statement is required but it is not necessary to reference the exact location of the data.
If the name of the import file will remain constant or you wish to enter the name into the
statement on each and every export, the file statement line format would be similar to this:
FILE,SESSION, C:\MIP SHARE\Import\CSV Samples\T3_trans.csv (T3_trans.csv is the data
file name)
Otherwise do not reference a specific location and the Import process will prompt the user to
browse for the data file to be imported. In this scenario, the file statement line format would
be:
FILE,SESSION,
CONTEXTIDPOSITION,1,6 – This is an internal programming reference. In most cases this
value will work as-is.
SEGMENTNOTSTRING – Used when your data is a CSV type and every data value occupies
a unique column/cell.
DISCARDFIRSTNRECORDS – Used when the top row(s) of your data file contain header
information such as column names; this indicates how many rows of non-importable data
the system should ignore.
TRANSACTION_READ,3 – This is a required block and tells the system what format the data
file is – in this case, a T3.
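Putting these statements together, a minimal context block for a T3/CSV import that prompts the user for the data file and skips one header row might look like the following. This is a sketch assembled from the statements above; in particular, the `,1` parameter form of DISCARDFIRSTNRECORDS is an assumption based on its description, so verify it against your sample DEF files.

```
FILE,SESSION,
CONTEXTIDPOSITION,1,6
SEGMENTNOTSTRING
DISCARDFIRSTNRECORDS,1
TRANSACTION_READ,3
```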
After the context information is configured properly you should go through and configure the Session,
Document and Transaction Level information.
The various fields and their meanings are covered in the documentation above. Once you have the
statements in the DEF file you will need to match the fields in your DEF file to the values in the Data
file. This is done by telling the system where in each row the specific data value is located. Below is a
general explanation.
In a CSV type import there are seven field positions for each line of the DEF file. Not all are used or
required for all lines. The positions are:
Position 1 – Context Type and Field Name: the field being imported.
Position 2 – Field Position in CSV: the column the data is in within the data file.
Position 3 – Default Value: the value to use if no data is there (optional).
Position 4 – Date Mask: the format of the date. This is only required for date
type info; otherwise it is left blank.
Position 5 – Position within String: if you have multiple segments of information
within a column, this indicates the starting position for the specific
segment of information. If each column contains a single segment of
information, this is left blank.
Position 6 – Field Length: if you have multiple segments of information within a
column, this gives the length of the specific segment of information.
Used only in conjunction with position 5. If each column contains only
one piece of information, this is left blank.
Position 7 – Assumed Decimal Places: the assumed decimal places for
debit/credit information. If decimal places are already included in the
data, this should be 0.
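To illustrate how the seven positions line up, this small sketch (not part of SFA) splits one DEF statement into named positions, padding any trailing blanks; the position names are descriptive labels taken from the list above, not SFA terminology.

```python
DEF_POSITIONS = [
    "field_name",        # position 1: context type and field name
    "csv_column",        # position 2: field position in the CSV
    "default_value",     # position 3: default value (optional)
    "date_mask",         # position 4: date format, dates only
    "string_position",   # position 5: start within a multi-segment column
    "field_length",      # position 6: length within a multi-segment column
    "assumed_decimals",  # position 7: assumed decimal places
]

def parse_def_line(line):
    """Split one DEF statement into its seven positions; missing
    trailing positions come back as empty strings."""
    parts = [p.strip() for p in line.split(",")]
    parts += [""] * (len(DEF_POSITIONS) - len(parts))
    return dict(zip(DEF_POSITIONS, parts))
```

Running it against the examples that follow shows, for instance, that `TETRANS_SEGMENT_GL,9,1011` carries a default value of `1011` and no date mask.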
Example 1
TETRANS_SEGMENT_GL,9,1011,,,,,
Informs the system we are importing data which relates to the GL segment in SFA (position 1).
Use the value found in column 9 (position 2).
If no value is found in column 9 use the code 1011 (position 3).
Date Mask, Positions within String, Field Length and Assumed Decimal Places (Positions 4-7)
do not apply to this type of import, so they are left blank. Technically, you could remove the
commas after position 3 (the 1011 value) as no other information is referenced.
Example 2
TEDOC_DOCDATE,4,05/15/2006,MM/DD/YYYY,,,
Informs the system we are importing data which relates to a document date in SFA (position
1).
Use the value found in column 4 (position 2)
If no value is found in column 4 use the value 05/15/2006 (position 3).
The date format to use is the MM/DD/YYYY format (position 4)
Positions within String, Field Length and Assumed Decimal Places (positions 5-7) do not apply
to this type of info, so they are left blank. Technically, you could remove the commas after
position 4 (the MM/DD/YYYY date format) as no additional information is referenced.
Example 3
TETRANS_CREDIT,14,,,,,2
Informs the system we are importing data which relates to a credit value in SFA (position 1)
Use the value found in column 14 (position 2)
There is no Default Value, Date Mask, Position within String, or Field Length for this entry, but
positions 3-6 must be defined as blank in order for the system to recognize position 7.
In position 7 we have indicated that the last 2 digits of the value are interpreted as cents. If the
value imported is 10305, it would come in as $103.05.
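The position-7 arithmetic from Example 3 can be sketched as follows; this is an illustration of the rule described above, not SFA's actual import code.

```python
def apply_assumed_decimals(raw, places):
    """Interpret a digit string using an assumed number of decimal
    places, as the import does for debit/credit columns."""
    if places == 0:
        return float(raw)  # decimals are already present in the data
    return int(raw) / (10 ** places)
```

With 2 assumed decimal places, a raw value of 10305 comes in as 103.05; with 0, a value such as 103.05 is taken as-is.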
Other considerations:
There are additional conditions that have to be met in order to successfully import into SFA:
-Session ID must be unique – You cannot reuse a session ID. If you have your session ID hard
coded through a default value you will have to change the DEF file every import.
-Document Number must be unique in each session. You can have multiple transaction codes
associated with a document number but they must be contiguous in the data file.
-All segments must have a value when you are importing a Fund or Balancing type segment
-All segments must have a value when you are importing to a revenue or expense GL account
segment
-There is a user setting in the Chart of Accounts which designates certain segments are
required for certain type GL codes. If that is the case you will need to supply that information to
the system in order to successfully import
Transactions must balance in order to successfully save a session in the batch to post status. To
post a batch afterwards the data must pass certain validations. These are:
-All required information is present
-All documents must balance in total. The debits and credits have to be equal for each
document number.
-All Fund type codes must balance. The debits and credits must be equal for every
fund code.
-All Balancing (BAL) segment codes must balance. The debits and credits must be equal
for every balancing type segment code.
-Transactions must balance by effective date. The debits and credits must balance for every
effective date used in the system. If they do not, ensure that leading zeros have not been
stripped off the date values.
-Transactions must balance by entry type. Setting the default entry type to “N” usually
avoids this issue.
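The balance validations above all reduce to the same grouping check: debits minus credits must sum to zero per document, per fund code, per balancing segment code, and per effective date. The sketch below illustrates that check; the row layout is an assumption, and this is not SFA's actual validation code.

```python
from collections import defaultdict

def out_of_balance(rows, key):
    """Sum debits minus credits per value of `key` (e.g. document
    number, fund code, or effective date); any nonzero total is
    reported as out of balance."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["debit"] - row["credit"]
    return {k: round(v, 2) for k, v in totals.items() if round(v, 2) != 0.0}
```

Run once per rule, e.g. `out_of_balance(rows, "docnum")`, then `out_of_balance(rows, "fund")`, then `out_of_balance(rows, "effective_date")`; an empty result means that rule passes.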
Most of the difficulty in getting an import to work is related to the DEF file and the requirement to have
all the correct values, commas in the right place and all the required information.
Testing:
When testing the import into SFA for the first time, it is best to perform a sample import of the data.
Make a copy of your data file then remove all but the information which would relate to a single
document or one set of balancing transactions. It is also a good idea to use the PREVALIDATE
function. To do this, type the command PREVALIDATE after the SEGMENTNOTSTRING or other
Context command at the top. This will instruct the system to go through the import process but not
perform an actual import. If an error is returned, you can correct any issue(s) in the data file or within
SFA without the additional task of having to delete an unposted session. After you have performed a
validation without errors, remove the PREVALIDATE command from the DEF file to allow future
imports to be done live.
If you import transactions by mistake, or transactions that need alteration in the original data file or
in ADP, you will need to delete the unposted session that was created in SFA. In the Accounting
module navigate to: Transactions>Enter Journal Vouchers. Select the correct Session ID created by
the import (you can click the Start button to review the contents of the session) and click the Red X Delete button.
Use of the Offset Account Assignments:
Some users importing from ADP wish to use their Offset Account Assignments to ensure proper
financial entry.
At the most basic level the Offset Account Assignment is the second account used for a balanced
entry. For payroll transactions it is common to post entries to the Payroll Expense account which are
offset by an entry to the Cash account. If the Payroll Expense Account (GL code 50000) is the Debit
entry then the Cash Account (GL code 1000) would be our offset or credit entry.
These offsets can be established in the SFA system for ease and accuracy of data entry. We can
take advantage of these offsets in the import process to reduce the number of transaction lines that
are required to be imported.
A standard data import file would contain an expense transaction line to debit GL 50000 and a second
line with an equal amount to credit GL 1000. If we utilize the Offset Account Assignments we only
import the expense transaction line to debit GL 50000 and the system will automatically offset or create
the credit entry to GL 1000.
To use the offset feature, place the following command in the Context block at the top (along with the
SEGMENTNOTSTRING command).
APPLY_OFFSETS
This will create offset entries to be applied to every account we import. Because of this it is important
that EVERY account we reference in the data file has an Offset Account Assignment associated with
it in SFA. It is also important to remember that you should not have any Offset Account transactions
present in the import data. If you do, SFA will attempt to offset the original transactions as well as the
included offsets.