ACL Guideline
The scripts used in AX Core analytics support most of the commands and data access
functionality available in ACL Desktop. However, you need to review your analytics carefully to
ensure that the scripts they use run without any user intervention, and do not use features or
commands that are not supported by the AX Core Analytic Engine.
These guidelines assume you will be creating analytics and their supporting scripts primarily in
ACL Desktop, before importing them to AX Core. As a convenience feature, the AX Core Client
script editor does allow you to add new scripts to an existing analytic project in AX Core. This
feature is useful if you want to fine-tune the behavior of an analytic without having to export it
to ACL Desktop and then reimport it to AX Core. However, you will probably find that analytic
development work beyond minor adjustments is easier to accomplish in ACL Desktop.
You should refer to the information that follows as you develop your scripts.
Best Practices
The following general guidelines describe best practices for creating AX Core analytics:
Test the scripts you use in your analytics locally before importing them into AX Core
with AX Core Client to ensure that they run without any user intervention.
Ensure that you do not use absolute paths (e.g. C:\results) in analytic scripts. You should
only use relative paths (e.g. \results) so that analytics can be developed and tested
locally and then imported into AX Core.
Ensure that you use the SET command to specify any preferences required by the script.
If you do not specify preferences in your script, the default ACL preferences will be used.
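For example, if your script relies on exact character-field comparisons or must overwrite files without prompting, set that behavior explicitly rather than relying on whatever the server defaults happen to be (the specific preferences shown here are only illustrative):

SET EXACT ON
SET SAFETY OFF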
If you connect to an ODBC data source in your script, and you want to test the script
locally before importing the ACL project into AX Core, an ODBC data source with
identical settings must be configured on both your local computer and the AX Core
server, or AX Core Analytic Server, where the analytic will run. To avoid having a data
source password in clear text in your script, use the PASSWORD analytic tag, which
prompts for a password and encrypts the entered value.
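A minimal declaration using the PASSWORD tag might look like the following sketch, where the analytic name, tag number, and prompt text are illustrative:

COMMENT
//ANALYTIC Query vendor database
//PASSWORD 1 Enter the ODBC data source password
END

When the analytic is scheduled or run, the user is prompted for the password, and the entered value is encrypted rather than stored in clear text in the script.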
Optimize the performance of analytic scripts by minimizing the number of times AX Core
server tables are accessed. You can do this by using the FILTER command to select the
records you need, and then the EXTRACT command to extract only the required fields.
The reduced data set will be processed locally on the server where the analytic is being
run by the AX Core Analytic Engine.
Optimizing your analytic scripts in this way is particularly important when the data files
are not located on the same server as the AX Core server or the AX Core Analytic Server
processing the analytic, and the Copy analytic data option is not selected in AX Core
Administration.
Inefficient analytic script example:
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
Efficient analytic script example:
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
EXTRACT FIELDS trans_date desc account type amount TO AnalysisTable
OPEN AnalysisTable
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
Use Background mode to access data from SAP ERP systems using AX Link.
When you develop AX Core analytics, you must ensure that they run in all circumstances
without user intervention. If a command tries to create a dialog box, an error message will be
entered in the log and the AX Core Analytic Engine will stop processing the analytic.
The following guidelines outline the areas you need to check to remove commands that require
user interaction from your scripts:
Ensure that the script does not contain interactive commands that display dialog boxes.
Interactive commands include PAUSE, ACCEPT, DIALOG, and PASSWORD. You can use
the PASSWORD analytic tag, or the SET PASSWORD command, if you need to supply a
password in your script.
Add the SET SAFETY OFF command at the beginning of the script to ensure files can be
overwritten as necessary without displaying a confirmation dialog box. You also need to
add the SET SAFETY ON command at the end of the script to restore the default
behavior.
Add the OK command after any commands that normally display a confirmation dialog
box, such as RENAME and DELETE.
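Taken together, a script prepared for unattended execution might be framed like the following sketch, where the file name is illustrative:

SET SAFETY OFF
COMMENT ... main analysis commands here ...
DELETE Temp.fil OK
SET SAFETY ON

The OK parameter suppresses the confirmation dialog box that DELETE would otherwise display, and the SET SAFETY commands bracket the script so files can be overwritten while it runs.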
Use the Schedule command if you want to run an analytic once at a future time, or if you want to set up
a recurring schedule. Analytic schedules can run a job once, or repeat it at a frequency you select when
you set the schedule.
To schedule an analytic, you must have “Full permissions” for the activity containing the analytic.
If you are using a linked analytic, you may also need “Read only” permissions for the activity containing
the master analytic, depending on how your installation of AX Core is configured. For more information,
see About permissions for linked items.
If your organization has AX Exception installed and configured, analytics must be scheduled if you want
to publish results to AX Exception. You cannot publish results to AX Exception when you run analytics ad
hoc.
Note
If an analytic is edited after being scheduled and the required input parameters are changed, any
schedules for the analytic are suspended and will not run. The analytic must be rescheduled and the
appropriate input values provided. For more information, see Editing scheduled analytic jobs.
To schedule an analytic:
4. If you are not using an existing set of input values, enter or edit table, field, or input values as
required, and click Next to proceed to the next page of the Schedule wizard.
Depending on the complexity of the analytic, you may need to enter several pages of
information. If there are optional input values and you want to exclude them, select Do not pass
a value to the analytic.
5. In the Confirm Analytic Input Values page, confirm the information you have entered is correct
and click Next.
This page does not appear if you are using an existing set of input values.
6. If AX Exception is installed and configured on the AX Core server, the Publish Results page is
displayed after you have entered all of the required input values. If you want to publish results
to AX Exception, select the Publish results to AX Exception checkbox, and enter the following
information for each table you want to publish:
o Table to publish – Select the ACL table to publish. The ACL tables listed are identified as
result tables in the analytic.
o Entity – Select or enter the AX Exception entity to associate the results with.
o Analytic Name – Select or enter the AX Exception analytic to associate the results with.
o Publish – Select the Publish checkbox for each table you want to publish to
AX Exception.
Note
An analytic author can specify, in the analytic, which tables to publish. If this is the case, the
Table to publish, Entity, Analytic Name, and Publish fields are disabled.
7. If you want to publish more than one table to AX Exception, click Add Row, and enter the
required information.
8. Click Next.
9. In Set Schedule, specify the following values and click Next:
o Date – Select the date when you want the analytic job to run for the first time.
o Time – Select the time when you want the analytic job to run.
o Repeat – Select the frequency for the analytic job to run, or leave the default setting of
Do not repeat if you want the job to run only once in the future.
10. In Confirm Schedule, review the scheduled times for the analytic job(s). Up to ten jobs are listed,
depending on the frequency. If the scheduled times are correct, click Schedule.
Use the Analytics Status view to view the status of currently running, queued, scheduled,
suspended, or previously run analytics. For more information, see Viewing the status of
analytics.
Viewing the status of analytics
You can use the Analytics Status view to view the status of currently running, queued, scheduled,
suspended, or previously run analytics. The tabs in the Analytics Status view display a variety of
information, including start, next scheduled, and end times, whether or not an analytic ran successfully
to completion, the name of the Results subfolder, and the name/IP address of the server where the
analytic is running or ran.
If your organization exceeds the maximum number of analytics that can be run simultaneously (as
configured by your AuditExchange administrator), the additional analytics, whether scheduled or run ad
hoc, are queued. Queued analytics automatically start as soon as previously running analytics complete.
If an analytic is edited after being scheduled and the required input parameters are changed, any
schedules for the analytic are suspended and will not run. The analytic must be rescheduled and the
appropriate input values provided. For more information, see Editing scheduled analytic jobs.
You can enable automatic refreshing of the Analytics Status view by selecting Auto-refresh. The refresh
interval is configured by your AuditExchange administrator. You can manually refresh the Analytics
Status view immediately by switching back and forth between any of the tabs in the view.
In order to view the status of an analytic, you must have at least “Read only” permissions for the activity
containing the analytic.
The View History By drop-down list allows you to filter the list of analytics by time
period (today, last 7 days, last 30 days, all). The analytic End Time is used when
calculating which time period an analytic falls into (using the time from the server that
ran the analytic). The today time period is the entirety of the current calendar day. The
last 7 days and last 30 days time periods are calculated from the current time to the
same time 7 days or 30 days previous — for example, 3:00 pm October 8 to 3:00 pm
October 1.
2. Select an analytic entry and click Details to view additional information about the status of the
analytic.
The information displayed in the Analytic History Details dialog box is particularly useful for
failed analytics, because it lists the reason for the failure.
You can delete a scheduled analytic job if you want to prevent it from running. If you delete an analytic
job that has a frequency specified, all future occurrences of the job are deleted when you delete the job
listed in the Analytics Status view. The one exception is if the next scheduled run of the analytic is
currently queued. In this case, you must delete the queued instance of the scheduled analytic, refresh
the view of scheduled analytics, and then delete the next scheduled occurrence of the analytic.
In order to delete a scheduled analytic job, you must have “Full permissions” for the activity containing
the analytic.
Note
Deleting an analytic job deletes the job only, not the actual analytic.
You can delete the history information for any analytic job that has either run successfully or failed. This
is useful when you are testing an analytic and want to delete information about your test jobs so that
other users only see the real history associated with the analytic job.
In order to delete the history for an analytic job, you must have “Full permissions” for the activity
containing the analytic.
Ensure that the script does not use unsupported features or contain commands or command
syntax that is not supported on the server.
The following list outlines command syntax, features, and commands that are not supported on
the server:
AX Core does not support direct database server tables linked to ACL Server Edition for
z/OS.
The NOTIFY command only supports SMTP messaging. The MAPI and VIM mail protocols
are not supported.
If the PRINT or TO PRINT command is used in a script, a default printer must be
configured on the server.
The SAVE GRAPH and PRINT GRAPH commands are not supported.
Do not use the SET LEARN command in AX Core analytics.
After you develop and test the scripts in your ACL project, you need to add the analytic
declaration to one or more of the scripts before the project can be imported as an analytic into
AX Core using AX Core Client. You need to keep the following points in mind when you add
analytic declarations:
The analytic declaration must be completely defined in a comment block that starts on
the first line of the script where the analytic is defined.
You must declare all of the inputs required by the analytic (tables, fields, parameters)
and the results that will be copied to AX Core (ACL tables, log files, and data files) in the
analytic declaration.
COMMENT
//ANALYTIC Identify missing checks
This analytic identifies missing check numbers.
//TABLE table_payments Payments Table
Select a table that lists payments and includes a check number column.
//FIELD check_num CN Check Number
Select the field that contains the check numbers.
//PARAM start_date D OPTIONAL Start Date
(Optional) Enter the start date for the analysis.
//PARAM end_date D OPTIONAL End Date
(Optional) Enter the end date for the analysis.
//PARAM region C MULTI Region(s)
Enter one or more regions to include in the analysis.
//RESULT LOG
//RESULT TABLE MissingChecks
//RESULT FILE MissingCheckDetails.xls
END
ANALYTIC
The ANALYTIC tag is used to identify a script as an analytic. An ACLScript COMMENT command
must be entered on the first line in the script, followed by the ANALYTIC tag on the second line.
If the ANALYTIC tag is used in any other location it is ignored. One or more scripts in an ACL
project can include an analytic declaration.
Syntax
//ANALYTIC name
<description>
name
Specifies the name to assign to the analytic. The name is used to identify the analytic in
AX Gateway and AX Core Client, and is separate from the script name you specify when
you initially create the analytic script. Characters that cannot be used in Windows
filenames (< > : " / \ | ? *) should not be used in analytic names, because they will cause
an error that prevents the export of analytic results.
description
Optional. Descriptive text that specifies the purpose of the analytic or other information.
The description can be multiline, but it cannot skip lines. The description must be
entered on the line below the associated ANALYTIC tag.
Example
The following example shows a basic analytic declaration with a name and a description of the
purpose of the analytic.
COMMENT
//ANALYTIC Identify missing checks
This analytic identifies missing check numbers.
END
The following table lists the error codes that you may encounter when running AX Core analytics.
Error Code Description
1000 No preference file was specified. A new default preference file was created.
1001 There is a problem with the preference file. A new default preference file was created.
1002 The project has been upgraded from an earlier version. A copy was saved with a .old
extension.
1003 The project file could not be processed. The last saved project was used.
1008 The specified .old project file cannot be used. You must specify a project file with
the .ACL extension.
Command errors
The following table lists the error codes that are returned when an analytic fails because of an invalid
ACLScript command. The error code number returned identifies the command that failed.
Error Code Command
1 SAMPLE
2 EXTRACT
3 LIST
4 TOTAL
5 DEFINE
6 COMMENT
7 QUIT
8 STOP
9 BYE
10 USE
11 OPEN
12 SAVE
13 DISPLAY
14 ACTIVATE
15 CLOSE
16 HELP
17 COUNT
18 STATISTICS
19 HISTOGRAM
20 STRATIFY
21 SUMMARIZE
22 EXPLAIN
23 GROUP
24 ELSE
25 END
26 CANCEL
27 SUBMIT
28 DELETE
29 RANDOM
30 SORT
31 FIND
32 DIRECTORY
33 TYPE
34 DUMP
35 INDEX
37 SET
40 DO
41 TOP
42 EXPORT
43 VERIFY
44 SEEK
45 JOIN
46 MERGE
47 SEQUENCE
48 CALCULATE
49 PRINT
50 LOCATE
51 RENAME
54 COPY
55 REPORT
56 EJECT
58 LET
59 ACCUMULATE
63 ACCEPT
64 ASSIGN
65 AGE
66 CLASSIFY
67 PROFILE
68 DO REPORT
69 LOOP
70 PAUSE
71 SIZE
72 EVALUATE
73 DIALOG
74 IF
75 GAPS
76 DUPS
77 SQLOPEN
78 PASSWORD
79 IMPORT
80 REFRESH
81 NOTIFY
82 CONNECT
83 RETRIEVE
84 FIELDSHIFT
85 BENFORD
86 CROSSTAB
87 CRYSTAL
88 ESCAPE
89 NOTES
Analytic job processing errors
Error Code Description
-10 The analytic results could not be saved because the destination results folder was
deleted after the analytic started running.
-23 Publish failed. One or more of the table’s column names are too long.
-24 Publish failed. An ACL table contains invalid values in one or more data cells.
-25 Publish failed. One or more table fields contain unsupported data types.
-27 Job did not run. The user was removed or does not have permission.
-28 Job did not run. Unexpected error. See the server log and ACL log for details.
-29 Could not copy data files. The analytic failed because the required data files could
not be copied to the jobs directory.
-31 Publish failed. The exception mapping file could not be located.
The information in this appendix is for organizations that have existing implementations of the non-
Unicode ACL products, and are migrating to the Unicode edition of ACL AuditExchange. Existing analytics
and scripts are automatically converted to Unicode, but you must ensure that the logic of the scripts
remains the same when applied to Unicode data.
Unicode is a standard for encoding text that uses two bytes to represent each character, and characters
for all languages are contained in a single character set. The Unicode editions of ACL products allow you
to view and work with files and databases that contain Unicode-encoded data in all modern languages.
Note
ACL Desktop and the ACL Analytic Engine support little-endian encoded Unicode data.
The following list outlines the key areas you should be aware of when you convert to the Unicode
version of ACL AuditExchange:
Note
The information in this document focuses on the changes required to convert AX Core analytics to
Unicode. You can find out more about the Unicode encoding standard, and Unicode-specific functions,
in the help system for the Unicode version of ACL Desktop.
Required Changes
When you upgrade to the Unicode version of ACL AuditExchange, you need to check your analytics and
make the necessary changes wherever the following commands and functions are used:
You need to recreate all instances of the IMPORT PRINT and IMPORT DELIMITED commands by
importing the source data file using the Data Definition Wizard in the Unicode version of ACL
and reimporting the projects into AX Core.
You need to modify all instances of the ZONED( ) and EBCDIC( ) functions as follows so that the
ASCII values returned by the functions are correctly converted to Unicode data:
o For computed fields, you need to wrap the BINTOSTR( ) function around the ZONED( ) or
EBCDIC( ) function.
o For static expressions within scripts (such as usage in counters), you must wrap
BINTOSTR( ) around the ZONED( ) function.
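The two cases might be handled as in the following sketch, where the field and variable names are illustrative, and "A" is the BINTOSTR( ) string-type parameter that, as we understand it, identifies ZONED input ("E" would indicate EBCDIC):

COMMENT Computed field: wrap BINTOSTR( ) around ZONED( ).
DEFINE FIELD check_total COMPUTED BINTOSTR(ZONED(amount, 12), "A")
COMMENT Static expression, such as a counter used elsewhere in the script.
v_counter_text = BINTOSTR(ZONED(v_counter, 5), "A")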
You need to modify all instances of the OPEN FORMAT command. You need to use the SKIP
command to skip the first two bytes of the Unicode file you are opening. This is required
because the first two bytes of UTF-16 encoded files are reserved as byte order marks and are
separate from the text in the file.
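As an illustration, the conversion might look like the following sketch, where the data file and table layout names are hypothetical:

Non-Unicode:
OPEN Report FORMAT Report_Layout

Unicode:
OPEN Report FORMAT Report_Layout SKIP 2

The SKIP 2 parameter bypasses the two-byte byte order mark at the start of the UTF-16 encoded file, so the record data is read from the correct position.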
You need to check the usage of the functions listed below in your scripts to ensure that they are not
used in ways where the script logic assumes a one-to-one correspondence between the number of
characters and the number of bytes. If you find any instances where the logic assumes a one-to-one
correspondence between characters and bytes, you must adjust the logic to work correctly with Unicode
data, which uses two bytes to represent each character. Numbers specified as string function
parameters, such as 4 in STRING(1000, 4), refer to the number of characters, so standard usage of these
functions will not cause problems.
Bit and Character Functions
ASCII( )
BIT( )
BYTE( )
CHR( )
DIGIT( )
HEX( )
MASK( )
SHIFT( )
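One common pattern to check is logic that derives a record count or a position from a byte count. In the sketch below (the file name and record length are illustrative), the divisor must reflect two bytes per character in the Unicode edition, because FILESIZE( ) returns a size in bytes:

COMMENT Non-Unicode edition: a 100-character record occupies 100 bytes.
v_rec_count = FILESIZE("Trans.fil") / 100
COMMENT Unicode edition: the same 100-character record occupies 200 bytes.
v_rec_count = FILESIZE("Trans.fil") / 200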
Conversion Functions
PACKED( )
STRING( )
UNSIGNED( )
VALUE( )
ZONED( )
String Functions
AT( )
BLANKS( )
INSERT( )
LAST( )
LENGTH( )
REPEAT( )
SUBSTRING( )
Miscellaneous Functions
FILESIZE( )
LEADING( )
OFFSET( )
RECLEN( )
ACL Unicode products support six Unicode-specific functions that support conversions between non-
Unicode and Unicode data. The following functions are available in ACL Unicode products:
BINTOSTR( ) – Converts ZONED or EBCDIC data to its corresponding Unicode string. This ensures
that values encoded as ZONED or EBCDIC data can be displayed correctly.
DHEX( ) – Returns the hexadecimal equivalent of a specified Unicode field value. This function is
the inverse of HTOU( ).
DBYTE( ) – Returns the Unicode character interpretation of a double-byte character at a
specified position in a record.
DTOU( ) – Converts a date value to the correct Unicode string display based on the specified
locale setting.
HTOU( ) – Returns the Unicode equivalent of a specified hexadecimal string. This function is the
inverse of DHEX( ).
UTOD( ) – Converts a locale specific Unicode string to an ACL date value.
You can learn more about these functions and their usage in the help system for the Unicode edition of
ACL Desktop.
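Because HTOU( ) and DHEX( ) are documented as inverses of each other, a quick round trip is one way to see their relationship (the variable names are illustrative):

v_hex = DHEX("ABC")
v_str = HTOU(v_hex)
COMMENT v_str now contains the original string "ABC"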