# Overview
Administration functions are managed directly through the provided web interface, enabling administrators to access and manage features of PMG Portal™ and PMG Workflow™. To access the administration features, the user must be logged in as the administrator defined during the initial product installation or as a user whose account has been subsequently granted sufficient administration permissions.
Application management features can be accessed via the global navigation menu across the top of the PMG Portal screen and from the Workflow Designer screen. Most administration features are accessed from the PMG Portal via the ADMINISTRATION link on the top navigation bar. A limited number of management functions require file access and management directly within the application server file system. Management features for content, such as Pages, Categories, and Services, can also be accessed by navigating to those items directly and selecting the appropriate management options provided on those pages. Certain features or functions may not be accessible until the given feature has been explicitly enabled.
# Quick Find
The "Quick Find" or "omni menu" feature provides a simple way to search for and jump to other areas of administration, or to search contextually within administrative functions related to the current activity.
To access Quick Find, press 'CTRL' + '/' from any of the supporting pages, including the Designer areas of the application, the Administration screens, and any supporting Portal pages. Begin typing into the search box and any found matches will be displayed. Select any shown match to jump to that area of the application.
Note: When accessing the Quick Find feature from a designer (Workflow, Form, App, etc.), additional contextual options will be displayed.
# Administration Landing Page
The Administration landing page, or home page, shows you important information quickly and organizes administration activities. This page provides useful information at the top of the page and provides administration links grouped by functionality below. For example, the app version and server information are shown related to the environment under the top System Information section.
The filter box allows different activities to be quickly found in the activity sections below. Additionally, the menu at the top left of the page allows for a fully expanded navigation list of all the available activities.
System management functions can be accessed from the home page via main navigation options shown under ADMINISTRATION, System Management.
# Settings
All manageable system settings can be updated from the Settings page. The Settings page is accessed via the rollover drop-down menu navigation path ADMINISTRATION, System Management, Settings.
On the Settings screen, each row in the Name column provides a brief description of the Setting when the label is clicked.
Values for each Setting are provided by selecting from a drop-down list or by filling out a text box. Upon initial installation, these will be set to their respective default values or left blank where no default value exists.
After any applicable updates have been made to individual Settings, the Submit button at the bottom of the page can be clicked to save the changes. To revert to the previously saved values without saving any subsequent changes, the Reset button can be used.
See the System Settings Addendum for more information on specific system settings.
# System Settings Export
The Export System Settings feature allows all current system settings to be exported as a JSON document. The exported document can be compared with a JSON export from another environment using any text-compare tool to determine differences in environment settings.
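As a sketch of such a comparison, the script below reports per-setting differences between two exported documents. It assumes the export is a flat JSON object mapping setting names to values; the actual document shape may differ, in which case the loading step would need adjusting.

```python
import json

def diff_settings(a_path, b_path):
    """Compare two exported system-settings JSON documents.

    Assumes each file is a flat JSON object of setting names to values.
    Returns a dict mapping each differing setting name to a
    (value_in_a, value_in_b) tuple; settings present in only one
    export are reported with None for the missing side.
    """
    with open(a_path) as fa, open(b_path) as fb:
        a, b = json.load(fa), json.load(fb)
    diffs = {}
    for key in sorted(set(a) | set(b)):
        if a.get(key) != b.get(key):
            diffs[key] = (a.get(key), b.get(key))
    return diffs
```

A dedicated text-compare tool remains the simplest option; a script like this is only useful when the comparison needs to be automated or repeated.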
# User and Group Rights
Global application permissions are applied at the individual user or group level and are managed from the User and Group Rights page. This page is accessed via ADMINISTRATION, User and Group Rights.
From this screen, the administrator can select to search by users or groups, then type the name or partial name in the search box and press Enter to locate the specific user profile(s).
Filter By Permission - Users and Groups may also be found using the Filter By Permission option, which allows one or more roles and rights to be selected and, optionally, an "And" or "Or" clause to be applied to specify how matches are found.
Admin Roles: Permission levels can be set per a user’s administrative role as follows:
System: By checking the System Administrator role, the user will be granted global administrative permissions without the need to assign additional admin roles and rights specifically. System Administrators have rights to all system features and available functionality within the platform, including rights to standard user features. In addition to system settings and functions, System Administrators can fully manage Categories, Services, Forms, SLAs, Workflow Designer, Queues, Calendars, Reports, Connectors, providers, workflow actions, Configuration Modules, and iCollaborate content.
Catalog: The Catalog Administrator permission level grants a user full management rights to all Categories, Services, SLAs, and Reports. In addition, Catalog Administrators can view all other users’ orders from the Request History interface via the REQUESTS button in the global navigation menu across the top of the screen.
Report: The Report Administrator permission level grants a user rights to fully manage available reporting capabilities, including the Report Editor feature found under the REPORTS button in the global navigation menu across the top of the screen.
iCollaborate: The iCollaborate Administrator permission level grants a user full management rights to the portal content management tools available through the iCollaborate interface accessed via the ADMINISTRATION button in the global navigation menu across the top of the screen. Portal content includes custom screens, widgets, and navigation elements not generated through the Service/Forms Catalog (i.e., outside of Categories and Services).
Licensed: By checking this box, the System Administrator can indicate whether a named user license, which grants the user access to the application, is in effect for this user.
Rights: Specific feature or functionality-based rights available for individual users are listed below:
- Workflow Designer: The Workflow Designer permission level grants a user access to the Workflow Designer and allows them to create and manage Workflows, within certain limits. This setting does not provide global rights to all Workflows, but applies to those Workflows where the user created the Workflow design or has been explicitly set as a manager of the Workflow by another user. The Workflow Designer interface can be accessed via ADMINISTRATION-->Content Management-->Workflow Designer.
- SLA Designer: The SLA Designer permission level grants a user rights to create and manage SLA definitions and settings, within certain limits. This setting does not provide global rights to all SLAs, but applies to those SLAs where the user created the SLA or has been explicitly set as a manager of the SLA by another user. The SLA Designer interface can be accessed via ADMINISTRATION-->Content Management-->Manage SLAs.
- Settings: The Settings permission level allows a user to access and manage a certain set of designated system settings. Settings are managed from the Settings page, which is accessed via ADMINISTRATION, System Management, Settings.
- Custom Form Elements: The Custom Form Elements permission level allows a user to manage the Custom Form Element Types available within the Platform. The Custom Form Elements page is accessed via ADMINISTRATION, System Management, Custom Form Elements.
- External Data Sources: The External Data Sources permission level allows a user to manage the data sources for use within the Platform. The External Data Sources page is accessed via ADMINISTRATION, System Management, External Data Sources.
- Allow Abort: The Allow Abort permission setting enables a user to abort, or fully stop, any in-process Workflow the user can access. When this setting is checked, the user will have access to an Abort link when viewing a given workflow's details.
- Table Manager: The Table manager right allows the user to access the “Table Manager” functionality to manage various database tables for which the user has been granted rights to manage.
- Records Designer: The Records Designer right allows users to access the “Records” functionality to create Records and/or edit Records they have permission to by direct user assignment or by group assignment.
- Global Work Reassign: Grants the user the right to transfer any work from one user to another, accessed from the Work Dashboard.
- Theme Admin: Grants the user access to the platform’s Theme Administration where general page layout, branding, widget management and related activities are supported.
- Theme Assets: Grants access to the Theme Assets administration page (logo, CSS, JavaScript, file manager, and global widgets).
- Category and Form Designer: Grants the right to create and manage any category or forms, including shared forms.
- Queue Manager: Grants the user access to Queue Management functions, accessed from Administration, “Queues” or from the work dashboard.
- App Designer: Grants the user access to App Designer, accessed from Administration, “App Designer”.
- System Workflows: The System Processes permission level allows a user to manage System Workflows, which are Workflows used to manage system processes only, without any front-end user input. This management capability includes being able to start System Workflows as well as review ones that have run in the past. The System Processes page is accessed via ADMINISTRATION, System Management, System Processes.
- Bulk Process Control: The Bulk Process Control Manager permission setting grants a user permission to manage the state of Workflows that are running or have run within the current instance. The user can filter the view by certain parameters or by executing a custom query, and then can perform certain operations to manage the Workflows, such as pausing, resuming, or aborting them. The Bulk Process Control screen is accessed via ADMINISTRATION, System Management, Bulk Process Control.
- Windows Services: The Windows Services permission setting allows a user to view, stop, and start the Workflow Engine for the current environment. The Windows Services page is accessed via ADMINISTRATION, System Management, Windows Services.
- Log Viewer: The Log Viewer permission setting allows the user to access the application logs. The Log Viewer page is accessed via ADMINISTRATION, System Management, Log Viewer.
- SPEWS Access: The SPEWS Access permission setting enables the user account to submit calls to the SPEWS (“Service Process Engine Web Service”) interface. A more detailed discussion of SPEWS can be found in the Service Process Engine Web Service document available on the PMG Support site (http://support.pmg.net) under Integration Documentation.
- Global Order History: The Global Order History permission level enables a user to review the order history data for all users.
- Calendar Editor: The Calendar Editor setting allows the user to fully manage the list of existing Calendar definitions available for use within the platform. Calendar definitions can be used within SLAs and OLAs when calculating time periods or durations. In the Workflow Designer, Calendar definitions can be used to configure timeout periods within the Action Properties settings for given Workflow activities. The Calendar Control screen is accessed via ADMINISTRATION, System Management, Calendars.
- Process Details Viewer: Grants the user access to view the read-only information in the process details diagram.
- Knowledge Contributor: Grants the user access to Knowledge Contributor functions.
- Knowledge Publisher: Grants the user access to Knowledge Publisher functions.
- Documents: The Documents permission setting allows the user to access the Document Management page. The Document Management page is accessed via ADMINISTRATION, System Management, Documents.
- Change Workflow Data: Grants the user the right to modify workflow data in workflow diagrams for workflow instances the user can access.
- Workflow Design Reader: Grants the user the right to browse workflow diagrams from the Workflow Designer in read-only mode, where the user has been granted permission.
- Actions Manager: Grants the user the right to manage actions, accessed from the Workflow Designer menu, Manage, Actions.
- Connector Configuration Manager: Grants the user access to Connector Configuration, accessed from the Workflow Designer menu, Manage, Connector Configuration.
- Package Manager: Grants the user access to Package Manager, accessed from Administration, Package Manager.
To find all users with system admin rights, or all licensed users, enter “help:sysadmins” or “help:licensed” in the search box.
# User Management - Properties and Preferences
The "Manage User" option shown for individual users provides visibility into additional application stored user properties and preferences as well as options to manage the data. When viewing the user data, you may choose "Edit User" to set the user display name values and the stored email address. The stored user preference data may be viewed and reset as well.
Show Effective Rights - This toggle will indicate which rights the user effectively has, considering the user's group memberships. Additionally, a "Managing" row in the Permission table will show the number of forms and categories a user has manager rights to.
Show Groups - For a displayed user, this will show any groups the user belongs to which also have permissions defined as a group within the system.
Show Users - For a displayed group, this will show any users within the group for which permissions have been defined.
Refresh Group Membership - For performance, the application synchronizes users to directory groups on a schedule. This option will initiate a real-time synchronization of users and group relations.
# Active Directory
For each domain configured for access to the Platform, applicable AD settings can be managed on the Active Directory screen, which is accessible via ADMINISTRATION, System Management, Active Directory.
Domain and Active Directory settings are typically configured during the initial installation process; however, users with full System Administrator rights can access this Active Directory page to update settings and/or to add additional domains as needed.
Domains already configured for the current environment are listed on the left-hand side of the main Active Directory page. Click Add Domain to add and configure a new domain.
The following properties are shown for a domain and may be configured on this screen:
NETBIOS Name: NetBIOS name for the domain
Enabled: Determines if the domain is used for logon
DNS Name: DNS name for the domain
LDAP Search Path: Base distinguishedName for the domain
Additional Logon Domains: List of UPN suffixes used in the domain. (The list is new line delimited.) Can also be “*”. Logon domains can be located and added by clicking the Discover link.
Domain Controllers: List of domain controllers that will answer domain queries. (The list is new line delimited.) The DNS name for the domain can usually be used. Servers in the list will be queried in the order they appear, so faster/closer servers should be at the top.
Email Domains: List of email domains that are in use in the domain. If the system setting ALLOW_EMAIL_AS_LOGIN is set to True, then this field will be shown. This is required for users to log in to the SCS using an email address and password.
SSL: Enables SSL for the connection to Active Directory
Use Machine Trust: Removes the requirement of using a service account user for the Active Directory configuration. The PMG servers must belong to the same domain for this option.
Authentication User: User account that will be used for reading data about users and groups in this domain. This should be in UPN format.
Authentication Password: Password for the Authentication User. Ideally, this will be a non-expiring password. Values may be tested before submitting by clicking on the Test Values link. Only the first password input field is required for the Test Values action. The passwords must be entered identically in both password fields to submit a password change.
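The Additional Logon Domains matching described above (a newline-delimited list of UPN suffixes, where "*" matches any suffix) can be sketched as follows. `matches_logon_domain` is a hypothetical helper written only to illustrate the documented semantics; it is not part of the product.

```python
def matches_logon_domain(upn, additional_logon_domains):
    """Check whether a UPN's suffix matches a domain's configured
    Additional Logon Domains value.

    `additional_logon_domains` is the raw newline-delimited field
    value; a "*" entry matches any suffix. Comparison is
    case-insensitive, as DNS names are.
    """
    if "@" not in upn:
        return False
    suffix = upn.rsplit("@", 1)[1].lower()
    suffixes = [s.strip().lower()
                for s in additional_logon_domains.splitlines()
                if s.strip()]
    return "*" in suffixes or suffix in suffixes
```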
Administrators can also manage the list of Active Directory properties available for use within the system by clicking the Edit Attributes link on the left-hand side of the main Active Directory page. Once opened, the Edit Attributes screen will provide an XML document specifying the list of available AD properties that can be referenced in front-end Forms and/or within Workflow designs. The individual nodes of the existing XML document can be edited as needed to add or remove AD attributes from the available list. For each node of the XML document (listed between the opening and closing <properties> tags), the applicable AD property must be declared using the following example node format:
<properties>
<property name="mail" type="string" purpose="email" displayname="Email"></property>
</properties>
After making any edits to the list of AD attributes, the Submit button can be clicked to save the modified XML document.
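For illustration, the node format shown above can also be read programmatically. This hypothetical snippet parses the same XML shape into plain dicts using Python's standard library; it is a sketch for understanding the structure, not a product component.

```python
import xml.etree.ElementTree as ET

def parse_ad_attributes(xml_text):
    """Parse the Edit Attributes XML document into a list of dicts,
    one per <property> node, mirroring the node format shown above."""
    root = ET.fromstring(xml_text)
    return [
        {
            "name": p.get("name"),
            "type": p.get("type"),
            "purpose": p.get("purpose"),
            "displayname": p.get("displayname"),
        }
        for p in root.findall("property")
    ]
```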
# LDAP
The Platform supports X.500-based LDAP for user authentication and group-based permissions. To manage LDAP domains and settings, the LDAP administration screen can be accessed via the rollover drop-down menu navigation path ADMINISTRATION, System Management, LDAP.
LDAP domains that have already been configured for the current environment are listed on the left-hand side of the LDAP screen. Click New LDAP Domain to add and configure a new LDAP domain.
The following properties are shown for an LDAP domain and may be configured on this screen:
Name: Provides a reference name for display when selecting the domain throughout the application
Description: A brief description of the domain
Enabled: Determines if the LDAP is to be used for logon
Search Path: Specifies the LDAP path as the top level for finding users and groups
Server: Network accessible server name address and port for LDAP queries
Port: Network port to use
Use SSL: Determines if SSL is to be used in the LDAP connection
Ignore SSL Errors: Determines if SSL validation errors are ignored
Connectionless: Use connectionless option
User: Authentication user for LDAP connections if Anonymous is not selected
Password: Authentication password for LDAP connections if Anonymous is not selected
Domain: Authentication domain for LDAP connections
Authentication Type: Specifies the type of LDAP authentication to use
Authenticate with DN: Use DN for authentication
Logon Attribute: User object attribute to use for login
Member Attribute: User object attribute which holds group membership
Member Values Attribute: Group object attribute which is stored in Member Attribute
Group Unique Attribute: Group object attribute which is unique
Group Display Name Attribute: Group object attribute to use for display
Group Filter: Filter (objectClass) to identify group objects
User Filter: Filter (objectClass) to identify user objects
Group Search Filter: Search filter to find group objects
User Search Filter: Search filter to find user objects
Bind timeout(sec): Amount of time to allow for binding
Search request time limit(sec): Amount of time allowed for search requests
Global max results: The maximum number of results returned
Attribute Replacements: Attribute mapping to associate Active Directory properties to LDAP properties. The Attribute Replacement property has a link “Recommend” which suggests the attribute replacements.
Changes to the LDAP domain properties can be saved by clicking the Save button at the bottom of the screen or returned to the previously saved values by clicking Reset.
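As an illustration of the kind of mapping the Attribute Replacements property expresses, the hypothetical table below pairs Active Directory property names with common inetOrgPerson-style LDAP equivalents. The actual values depend on the directory schema in use; the “Recommend” link should be used to obtain real suggestions.

```python
# Hypothetical attribute-replacement map: Active Directory property
# names on the left, common inetOrgPerson/LDAP equivalents on the right.
ATTRIBUTE_REPLACEMENTS = {
    "sAMAccountName": "uid",   # logon name
    "displayName": "cn",       # display name
    "mail": "mail",            # email address (often identical)
}

def replace_attributes(ad_attrs, replacements=ATTRIBUTE_REPLACEMENTS):
    """Translate a list of AD attribute names to their LDAP
    equivalents, leaving unmapped names unchanged."""
    return [replacements.get(a, a) for a in ad_attrs]
```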
# LDAP Debugging Tool
The LDAP Debugging Tool provides an interface to validate the LDAP configuration, analyze LDAP connectivity and discover various attributes of the LDAP connection.
System administrators can also view the XML for the LDAP domain settings by clicking the Show Configuration XML link on the left-hand side at the bottom of the page.
# SSRS
The SSRS Settings page allows an administrator to point to an SSRS server and configure the behavior of the main REPORTS page. The SSRS Settings page can be accessed via ADMINISTRATION, System Management, SSRS.
The following settings may be configured on this screen:
Expand Depth: When the REPORTS page loads, the navigation tree is fully collapsed. To make the tree expand to a certain level of depth, enter a number into this field (e.g. a value of 2 will expand all visible top-level folders and their subfolders). The default value is 0. A value of “-1” will expand all reports.
Server: Point to an SSRS server:
Protocol: http or https
Server Name: IP address or server name of the server where SSRS is installed
SSRS Instance Name: Instance name of the SSRS installation. The default value is ReportServer.
Reports Manager Root Folder: Path to a folder on the SSRS server. Only folders and reports below this folder will be displayed on the REPORTS page. This does not affect the Report Permissions page.
The format used is a slash (/) delimited list of folders: [Folder Name]/[Folder Name]/[Folder Name] (e.g. Reports/CMDB/Assets). For a single folder, use [Folder Name]. The default value is blank, indicating that every folder that has a report on it should be displayed in the navigation tree on the REPORTS page.
Authentication: Logon credentials (User Name, Domain, and Encrypted Password) for a user who has at least "browser" level permissions on the SSRS server
Reports Interface Behavior: Allows for modification of elements on the main REPORTS page:
"My Report" Parameter: By adding a hidden parameter to the report through SSRS, the data in a report can be filtered based on the specific user that is viewing it. The name of the parameter is configured here, and the default value is UserLogon.
# Email Settings
Administrators can configure the default settings for system generated emails on the Email administration page, which is accessible via ADMINISTRATION, System Management, Email.
The following settings may be configured on this screen:
From Address: Used to set the global default email address displayed in the From field of outgoing emails sent through executed Workflows. Within the actual Workflow definitions built in the Workflow Designer, individual email notifications can be configured to use their own specific From email address, but if no such From address is specified in the Action Properties, then the value in this field will be used by default. The value in this field does not necessarily have to be an active email account, but does need to reflect a valid email address; e.g., name@domain.TLD.
SMTP Server: Used to designate the applicable mail server to be tied to the Workflow Engine
SMTP Port: Network port for the SMTP connection
SMTP User: Required only if the designated host mail server has SMTP Authorization enabled on it
SMTP Password: Required only if the designated host mail server has SMTP Authorization enabled on it
POP3 Server: POP3 host to access for incoming email
POP3 User: POP3 username for incoming email
POP3 Password: POP3 password for the POP3 user
POP3 Check Interval: Interval between POP3 mail checking
POP3 Timeout: Time in seconds to wait for the POP3 connection
Changes to the settings can be saved by clicking the Save button at the bottom of the screen. Clicking the Reset button will return the settings to their previously saved values.
Note - “pop3Method” can be configured to “headers” in the PMGSC common_custom_data table. This feature is experimental; please reach out to PMG Support for more information.
# Debugging
Used for testing purposes, the Debugging feature allows an administrator to temporarily force all Workflow assignments and/or emails generated from submitted Requests to a designated user and/or specified account. When enabled, this feature allows the designated user to review all emails and/or work assignments as the actual email recipient and/or work assignee, without making any actual changes to the Workflow.
The Debugging page is accessed via ADMINISTRATION, System Management, Debugging.
Under the Settings tab on this screen, the following fields are provided:
Debug Email Address: Any saved value in the DEBUG_TO_EMAIL field will force all email notifications to be sent to the specified email address. Once testing is completed, the DEBUG_TO_EMAIL setting should be cleared out for normal Workflow processing to resume.
Debug User (UPN): Any saved entry in the DEBUG_TO_USERID field will send all work assignments to the specified user account. The value saved in the DEBUG_TO_USERID field must reflect the fully qualified userPrincipalName for the selected user in Active Directory (e.g., userid@domain.TLD). Once testing is completed, the DEBUG_TO_USERID setting should be cleared out for normal Workflow processing to resume.
Database Log Level: Most detailed level of logging to capture. Options are Debug, Information, Warning, Error, and None.
File System Log Level: To improve system performance and reduce stress on the database server, workflow engine logs can be directed to the file system, with the type of logs sent to the database and file system controlled individually. It is recommended to keep DB_LOG_ALL_MESSAGE at Warning or Error only, and if full logging is needed, set the FS_LOG_ALL_MESSAGE option to Debug. File system logging will only work for machines with the Logging Server enabled in the PMG Configuration Tool.
Do Not Log Messages: This is a new-line delimited list of strings which, if found in a log message, will cause the message to be excluded from logging. These are case sensitive.
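The exclusion semantics described above (newline-delimited patterns, case-sensitive substring matching) amount to a filter like the following hypothetical sketch, written only to make the matching rule concrete:

```python
def should_log(message, do_not_log):
    """Return False when the message contains any excluded substring.

    `do_not_log` is the raw newline-delimited exclusion list.
    Matching is case-sensitive substring containment, as described
    in the Do Not Log Messages setting above.
    """
    patterns = [p for p in do_not_log.splitlines() if p]
    return not any(p in message for p in patterns)
```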
Enable SOAP Trace: Some Workflow actions send and receive data using SOAP messages. Debugging these can be difficult because SOAP messages are created and parsed within an API and are not exposed. By setting this to True, these messages are logged to a special folder on the server(s) where the Workflow Engine is running. (On a default install the folder is “C:\Program Files\PMG Service Catalog\SPE\Logs\SOAP”.) Restarting the Workflow Engine is required for this setting to take effect. Without this setting, a network tracing tool is required to access these SOAP messages for debugging.
Enforce Parent Workflow Is Waiting Check:
Compress Synchronous Workflow Log Data: Log data is compressed for synchronous workflows, reducing disk-space pressure on the database.
Cache Dynamically Created Source Code: This debug feature tells the workflow engine to cache the source code instead of just the full compiled code. To see the cached source code logged for view in the Log Viewer, use the "Log Cached Code" option from the "Send Engine Diagnostic Message" below.
Log Failed Query Text: Log actual queries used by the database abstraction layer. This will have some performance penalty. All services should be restarted for this to fully take effect.
Max Concurrent Workflows: Changing this value requires restarting the workflow engines to take effect. It can be an integer, or a multiplier of the number of logical CPUs when prefixed with "x"; e.g., "x4" would be 4 × logical CPUs.
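The integer-or-multiplier syntax described above can be illustrated with a small parser. `resolve_max_concurrent` is a hypothetical helper shown only to clarify the "x" prefix semantics:

```python
import os

def resolve_max_concurrent(value, cpu_count=None):
    """Resolve a Max Concurrent Workflows setting to an integer.

    A plain integer is used as-is; a value prefixed with "x" is a
    multiplier of the logical CPU count (e.g. "x4" -> 4 * CPUs).
    """
    if cpu_count is None:
        cpu_count = os.cpu_count() or 1
    value = value.strip()
    if value.lower().startswith("x"):
        return int(value[1:]) * cpu_count
    return int(value)
```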
Concurrent Workflow Executions: Pick one or more workflows and actions to exclude from the Max Concurrent Workflows limit.
Global Values Cache Duration (Minutes): This can be a fraction, such as 1.5. Use a value <= 0 to disable this cache. If using the "Update Environment Value" action, you may need to disable this cache if you are running workflow functions "in process".
Send Workflow Variable Updates To Hub: This can be used to watch workflow variable updates while workflows are being debugged and stepped through.
Sort Workflow Exports: This setting indicates that workflow XML exports are to be sorted by node, attributes, and values. This may be useful when comparing versions.
Changes to the settings can be saved by clicking the Save button at the bottom of the screen. Clicking the Reset button will return the settings to their previously saved values.
The Debugging page also shows live debug details about the environment instances.
# Debug Level Override
Debug level override allows for extra logging for specific actions within workflows. Individual actions may be selected, as well as more specific internal methods.
# Custom Form Elements
Custom Form Elements are highly configurable and robust Form Element Types that can be used on Service Forms. The Platform offers a variety of standard Form Element Types; however, Custom Form Elements can be developed when richer, more dynamic controls are required, or when integration between the Service Form and an external system is needed.
The Custom Form Elements administration page is accessed via ADMINISTRATION, System Management, Custom Form Elements.
The process for registering a new Custom Form Element entails installing its corresponding ASPX file in the proper folder on the application server and then adding its display name and corresponding file path to the Custom Form Elements administration page. Once saved on this screen, valid Custom Form Elements will appear in the drop-down list of available Form Element Types on the Edit Mode screen for Services.
The Found column on the right-hand side of the page indicates that the file for the Custom Form Element has been located on the system. Delete allows the user to deregister a Custom Form Element from the system, although it will not delete the files.
Changes can be saved by clicking the Save button at the bottom of the screen. Clicking the Reset button will return the settings to their previously saved values.
Certain Custom Form Elements (formerly called “Configuration Modules” or “CMs”) are available for download from the PMG Support site (http://support.pmg.net). In addition, a more detailed discussion of Custom Form Elements and Form Element Types can be found in the Configuration Module Management document available there.
# External Data Sources
The External Data Sources feature allows administrators to create and manage connections to external data sources for use with the application. The External Data Sources management screen is accessed via ADMINISTRATION, System Management, External Data Sources.
Once a valid external data source has been defined and saved on this screen, it will become available for use throughout the application, including Form values, widgets, reports, and Workflows.
Enter the name and description of the data source, as appropriate. The ID column is not required when adding a new data source; however, once created, each row on this screen will display the applicable GUID for each external data source in the ID column for ongoing reference.
When declaring a new external data source, the following sample format is required to set the value in the applicable Connection String field:
Driver={SQL Server};Server=20.0.10.5;Database=MyDB;Uid=AdminUser;Pwd=Pword123
Once entered, the connection string can be tested, saved, or deleted by clicking the appropriate button on the right-hand side of the screen. To reveal the connection string of a previously entered external data source, the view icon to the left of the Connection String field can be clicked.
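For illustration, an ODBC-style connection string like the sample above is a semicolon-delimited list of Key=Value pairs. The hypothetical parser below shows that structure; real ODBC drivers also handle brace-quoting and escaping rules not covered here.

```python
def parse_connection_string(conn_str):
    """Split an ODBC-style connection string into a key/value dict.

    Keys are compared case-insensitively by ODBC, so they are
    lower-cased here. Brace-quoted values such as {SQL Server} are
    kept verbatim; full ODBC escaping is out of scope for this sketch.
    """
    parts = {}
    for pair in conn_str.split(";"):
        if "=" in pair:
            key, _, val = pair.partition("=")
            parts[key.strip().lower()] = val.strip()
    return parts
```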
Remote Data
Each data source can be configured as a "Remote Data" source if the system setting ENABLE_REMOTE_DATA_SOURCES is enabled. Setting "Remote Data" to "No" is the default, in which case the data source must be accessible at locations the PMG servers can reach. If "Remote Data" is set to "Yes", you can provide the settings to access the data through a connected Agent. The settings required to connect to an agent are the Relay Server the agent is connected to, the Relay Port the relay server is listening on, and the agent name.
If you know the Relay Server and Replay Port, you can use the "Query Relay" button to query the relay list the agents that are connected to it.
# Providers
NOTE: The provider framework is deprecated and is replaced by Workflow Functions which provide easier development and troubleshooting. The below material is for legacy configuration support.
The PMG Provider framework allows a developer to integrate any type of third-party system data for use throughout the PMG application in a common manner. Third-party systems are often accessible only by a specialized API, or may not offer a true integration interface at all. The PMG Provider framework allows a developer to write code that interacts directly with those application-specific APIs and then supplies that data for use throughout the PMG Platform. Providers are usable by Forms, Widgets, Reports, and Workflows. Additionally, Search Providers, a variant of the Provider framework, allow third-party data search results to be included in the PMG Portal search feature.
The PMG Provider Framework offers many benefits, including:
- Improved team scalability: more developers can contribute without specialized knowledge.
- No need for custom form elements: specialized code is abstracted away generically, and provider functionality, once developed, is available across the application stack.
- Reduced special knowledge: developers only need to understand the third-party API and write trivial code extending the provided PMG Provider classes.
- Support: many licensed connectors ship as providers.
- Upgrade safety: provider logic is upgrade safe.
- Easier maintenance: providers are managed through the web interface and may be imported and exported.
Providers are managed from the web interface available from ADMINISTRATION, System Management, Providers.
The interface allows for various provider management activities. To select an existing installed provider, select the provider from the dropdown available by Name.
To add a Provider to the PMG application, select “Add new provider”.
- Name: The name by which the provider is referenced throughout the PMG application.
- Class Name: The fully qualified name of the Provider class.
- Choose File: Opens a prompt to upload a new Provider file.
- Configuration: Optional configuration for the provider. This configuration is passed to the provider by the PMG application when the Provider is called.
- Notes: A general notes field, useful for usage notes or other considerations.
- Type:
  - Data Provider: indicates the provider will return a data table.
  - Search Provider: indicates the provider is to be used by the PMG Portal search box.
Once all the information is set, click “Save” to upload the provider to the PMG application.
Providers can be tested directly from the interface by selecting “Test”.
# Import/Export
Providers may be exported individually, or in bulk. To export a single provider, select “Export” from the selected provider screen. To export all providers at once, select “Export all providers” from the Provider management screen.
# Data Workflows
Data Workflows may also be listed and tested from the Providers screen. To test a data workflow, select the data workflow by name and click “Test”. A test window will display an example JSON input document for the workflow and an execution mode selection list. To test a workflow, provide any input data for the Query input and select a Mode. The Mode choices are "in process" (runs in memory of the handling server), EngineNotLogged (runs in memory of the workflow engine), and EngineLogged (runs the workflow fully, persisting to the database). Click “Test Query” to see the results of the test.
NOTE: If interested, contact PMG Support to obtain a search provider which allows data workflows to be used as search sources for the global application search feature.
# Theme Assets
The Theme Assets screen allows administrators to configure global CSS, JS, and more to personalize the portal's look and feel. Apart from branding specific features, Theme Assets gives administrators easy access to the File Manager screen to upload images used as a part of App Designer apps.
# Basic
The Basic tab has various branding properties that allow administrators to configure the end-user portal's look and feel.
# Global CSS
Allows for globally scoped CSS on the end user portal.
# Global JavaScript
Allows for globally scoped JavaScript on the end user portal.
# Image Brand
Allows administrators to upload an image to be used as the logo.
# Chart colors
Allows administrators to configure the colors of charts used in the app designer.
# File Manager
The File Manager tab provides administrators and app designers a location to upload and store images and documents to reference them in forms, categories, and app pages.
# Global Widgets
The Global Widgets tab allows administrators and app designers to configure various global widgets' behavior on the end-user portal.
Global widgets must be configured as a part of the header template HTML using the following syntax for any changes made on the Global Widgets tab to take effect:
<div global-widget="Search"></div>
# Currency
Currencies are used within the Platform for pricing or costing Services. Settings related to currencies can be managed on the Currency screen, which is accessible via ADMINISTRATION, System Management, Currency.
For information on Currency management, see Section 7.1 - Currency, which can be found in the section titled Price Management.
# Price Rules
Price Rules determine how users will be matched to the cost and price of a Service by defining the rules used for matching groups to configured pricing rules. Pricing rules are managed on the Price Rules screen, which is accessible via ADMINISTRATION, System Management, Price Rules.
For information on Price Rules management, see Section 7.2 - Price Rules, which can be found in the section titled Price Management.
# Languages
From the Languages page, system-supported languages may be enabled or disabled, controlling the list of languages shown in the various translation administration menus throughout the application. The Languages page is accessible via ADMINISTRATION, System Management, Languages.
For information on Languages management, see Section 8.1 - Languages, which can be found in the section titled Localization.
# Site Text
Base application text that is rendered as part of the Platform may be translated into an enabled language. The Site Text administration screen is accessible via ADMINISTRATION, System Management, Site Text.
For information on Site Text management, see Section 8.2 - Site Text, which can be found in the section titled Localization.
# Localized Text
Text created in the Platform for use within a specific application instance can be translated into an enabled language for use as localized content within individual Categories, Services and Workflows. Such configurable text primarily includes text created within Form Elements, but also certain text existing within the application that is configurable. The Localized Text administration screen is accessible via ADMINISTRATION, System Management, Localized Text.
For information on Localized Text management, see Section 8.3 - Localized Text, which can be found in the section titled Localization.
# Table Manager
Table Manager provides an intuitive set of web management utilities to manage SQL Server databases and database tables accessed by the PMG platform. The Table Manager feature provides an interface that non-technical administrators can use to manage external data that may be used by Services, Forms, Workflows, Widgets, or Reports. To enable Table Manager for use within the Catalog, the ENABLE_TABLE_MANAGER System Setting must be set to True.
Once enabled, Table Manager will be accessible via ADMINISTRATION, System Management, Table Manager.
The Table Manager page has the following tabs across the top of the screen, providing interfaces for various management functions:
Browse Tables: The Browse Tables tab presents the Managed Tables screen from which administrators can view a list of editable custom data tables by the following fields:
- Table Name: Identifying name of the table
- Description: Description of the table
- Data Source: Source database
A View Data link is provided to view and manage the data within the table. By clicking on the View Data link, the table can be updated by editing or inserting single rows of data, or the table data can be exported to Excel.
Upload Data: The Upload Data tab opens the Upload Data from a File screen, which allows for bulk insertion of data by appending data to an existing table, replacing data in an existing table, or creating a new table from an uploaded CSV file. Data can be imported from an external database or an Excel spreadsheet.
Manage Tables: The Manage Tables tab includes two table management screens:
Table Setup: The Table Setup screen allows administrators to add a table from an existing data source. Existing tables can be managed here by editing the setup or schema, unregistering, or dropping them.
Create Table: The Create Table screen allows an administrator to create a new table.
Note: While initially creating a new table, the order for columns may be changed by dragging and dropping rows as needed.
Administration: The Administration tab includes two application management screens:
Data Sources: From the Custom Data Sources screen, an administrator can add or edit the databases referenced by the tables.
App Settings: The Application Settings screen provides the connection string and URL for the Platform, as well as a list of Table Manager data sources.
Permissions: Allows for selection of individuals to be granted specific table management rights.
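The Upload Data tab's bulk append from a CSV file can be thought of as turning each uploaded row into an insert against the target table. A hypothetical sketch of that behavior (the table and column names here are illustrative, not part of the product, and PMG's actual import logic is not shown):

```python
import csv
import io

# Sketch: convert uploaded CSV text into parameterized INSERT statements,
# one per data row. Table name "MyManagedTable" is a made-up example.
def build_inserts(csv_text: str, table: str):
    reader = csv.DictReader(io.StringIO(csv_text))
    statements = []
    for row in reader:
        cols = ", ".join(row.keys())
        placeholders = ", ".join("?" for _ in row)
        statements.append(
            (f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
             tuple(row.values()))
        )
    return statements

sample_csv = "Name,Region\nAcme,East\nGlobex,West\n"
stmts = build_inserts(sample_csv, "MyManagedTable")
# two INSERT statements, each with two bound values
```

Parameterized values (rather than inlined strings) are the standard way such an import avoids quoting and injection problems.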
# Editing Table Data
Data for tables can be managed directly within Table Manager's user interface, when the tables are configured with a unique identifier that Table Manager can recognize.
Table Manager uses one of three conventions to understand a column as the identifier column. These column properties may be set using SQL management software, or by using Table Manager as below.
- The table has a column named "Id" with the datatype "uniqueidentifier". Note: The Default Value should be 'newid()'. This column does not need to be marked as "Primary Key".
- The table has a column marked as "Primary Key" with the datatype "uniqueidentifier". Note: The Default Value should be 'newid()' for the uniqueidentifier column.
- The table has a column marked as "Identity" with the datatype "int".
Note: When adding a new record through the user interface to a table whose 'uniqueidentifier' column defaults to 'newid()', that column should be left blank so the default value is applied.
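The three identifier conventions above correspond to standard SQL Server column definitions. The DDL sketches below illustrate each convention; the table name and the non-identifier column names are made-up examples:

```python
# Sketch: SQL Server DDL matching each of the three identifier conventions.
# Only the identifier column definitions are prescribed by the conventions;
# everything else here is illustrative.
ddl_named_id = """
CREATE TABLE MyTable (
    Id uniqueidentifier NOT NULL DEFAULT newid(),  -- convention 1: column named Id
    Payload nvarchar(max)
)"""

ddl_pk_guid = """
CREATE TABLE MyTable (
    RowKey uniqueidentifier NOT NULL DEFAULT newid() PRIMARY KEY,  -- convention 2
    Payload nvarchar(max)
)"""

ddl_identity_int = """
CREATE TABLE MyTable (
    RowNum int IDENTITY(1,1),  -- convention 3: int identity column
    Payload nvarchar(max)
)"""
```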
# Package Manager
Use this screen to create a package comprising assets such as forms, shared forms, workflows, email templates, calendars, SLAs, and queues, which can be downloaded and imported into another environment as a single migration package. Access is restricted to catalog and system admins, as well as users and groups with the 'Package Manager' permission. Package management functions are also available via the API web services at the base site URL under /sc/packages.asmx
# Create a New Package
To create a new package, select New Package from the Package Manager screen.
From here, you can add a name, tags, a description, and migration notes (a great place to record any setting changes that need to be made, names of SQL scripts that need to be run, or notes to other administrators about the package).
Then select the components and add them to the package. You can either browse or filter the available items. Before saving, you can search for any items the selected components depend on with the 'related items' button. Once all applicable items have been added, select Save to create the package.
# Edit an Existing Package
To edit an existing package, select the package from the list of available packages, then select "Edit".
From here, you can make any necessary updates to the package. Once all updates have been made, click Save to save the package.
# Clone an Existing Package
To create a copy of an existing package, select the package from the list of available packages and click 'Clone'. This creates a copy of the package which you can edit while leaving the original package unchanged.
# Create a Meta Package
A Meta Package is a package of packages. Creating, editing, exporting, and importing meta packages can be done just like you would a standard package. Use the 'New Meta Package' button to create a new Meta Package.
# Download a Package
To download, select from the list of available packages, then select "Download Package". This will create the package with the versions of files at the time you added them to the package. If you have made changes to the components of the package since adding them, and you want to get the latest versions in the package, select "Generate New Version", then "Download Package".
# Importing a Package
To import a package, navigate to Package Manager in the target environment. Select "Import Package". Click "Choose File", browse for the package you wish to import, then select "Open". During the import, a backup of any current artifacts in the target system is automatically made as a timestamped version of the package being imported. With the system setting PM_ALLOW_DEPLOY_LATEST_UI set to True, the import screen provides the option to import the package content and deploy over the latest version in the target system.
# Delete a Package
To delete a package, select the package from the list of available packages, then select "Delete".
# Package Tags
Tags provide a way to add keywords to packages. You can add one or many tags to a package. Tags support many use cases, for example, tagging a package that is ready for release from staging to production.
# Version History
To view a package's version history, select a package from the list of available packages. Click the 'Version History' button at the top to see and optionally download older versions of the package. If a package only has one version, this button will be disabled.
# System Backups
To download system backups of the items in the package as they were before the package was imported, select a package from the list of available packages and click the 'System Backups' button to view the available backups. If a package was created and edited on the local system, it won't have any system backups.
# Package Details
To download an Excel spreadsheet of details about the package, select the package from the list of available packages and click the 'Package Details' button.
# Other Details
To search for other important items that the package's contents reference/require, select the package from the list of available packages and click the 'Other Details' button. Once the modal appears, click 'Find Other Details.' This search will pull back items that the package might need in another environment but aren't available to export via package manager. This feature is not available for Meta Packages.
# Remote Package Manager
Remote Package Manager allows defining other PMG environments which have packages to import. This reduces the multiple steps needed to export a package from one environment, then import to another. From this screen packages can be imported in one step from the source environment. Tags for remote packages are displayed when listing the available remote packages.
# Remote Logging
Remote logging supports pushing PMG web and application logs to a remote RSYSLOG-compatible server, such as Rapid7.
# Network Shares
Network Shares allow for system-wide network shares to be available for use in the workflow engine. Once network shares are defined and mapped successfully, any workflow and its actions may refer to the full UNC file path without additional credentials. Actions which support credentials through scenarios should be switched to use no scenario or the “system” scenario as available to the action.
# Data Maintenance
Data Maintenance provides configuration options to set policies for data retention and removal from the application database, significantly reducing SQL resource usage and maintaining or improving performance over time. Policies vary for the different entities, which may be expunged or compressed accordingly.
As workflows execute over time and related service forms are submitted, the application database tables are populated with data. Various activities affect different aspects of the database; for example, users interacting with the work dashboard populate more work-item-related information, and the workflow process eventually completes after some time. Over time, as more work is done in the system, the database tables continue to grow in record count and take up more disk space on the SQL server.
Certain tables need to keep their data for historical reporting, but other tables can safely have their data cleaned out or even compressed. This frees up disk space on the SQL server, and allows for the PMG environment to process more requests going forward.
This process can be automated and maintained through the “Data Maintenance” interface.
Getting Started:
Under the Administration menu, find “Data Maintenance” (DM), which opens an interface to control what kind of data to clean out, or “expunge”, and what kind of data to compress.
PMGSPE tables currently compatible with Data Maintenance are:
- RTVariable
- RTLocalVariable
- RTXmlContent
- RTTimerAction
- WorkflowFunctionRequests
- Operations
- WebRequests
- RTSynchronousLog
- SC_RequestData
- RTRawMails
- SMTPMailsParsed
- WorkflowRepeaterLog
- RTActionDetail
General tab
This tab controls the allowed hours that data cleanup and compression can take place.
If “Global Enabled” is unchecked, nothing happens. Once checked, the processes defined in the “Expunge” and “Compress” tabs are processed based on the “Allowed Hours” that are checked, and at the “Interval” provided.
For example, the schedule can allow expunge and compress to run every 5 minutes from 8:00 PM to 5:00 AM (local server time). The interval is the delay between cleanup runs during which no cleanup process runs; it does not include the time taken by the cleanup process itself.
In some environments where there is a lot of old data to clear out and compress, you can adjust the interval here to be longer, like 30 minutes. If there’s not as much old data, you can start with a smaller interval, like 5 minutes.
Another aspect to consider is how frequently SQL transaction logs are backed up, which your DBA can tell you. That frequency could be once every 15 minutes, once an hour, or every few hours. You should match the interval and selected hours to that frequency as closely as possible so that only one cleanup process runs per SQL transaction log backup. This way, you avoid transaction log bloat.
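The alignment reasoning above is simple arithmetic; the figures below are illustrative, not recommendations:

```python
# Sketch: with transaction log backups every 15 minutes, picking a cleanup
# interval of roughly the same length means at most one cleanup run lands
# in each backup window, avoiding transaction log bloat.
backup_minutes = 15      # example: DBA backs up the transaction log every 15 min
interval_minutes = 15    # example: Data Maintenance interval chosen to match
runs_per_backup_window = backup_minutes // interval_minutes
# runs_per_backup_window == 1
```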
“Delay Start” should be enabled. This means when “PMG Config Service” (the process that performs Data Maintenance) first starts, it will wait until the “Interval” value has passed once before starting. If not, it will start the cleanup as soon as the service starts. We found that having the interval delay enabled is the better practice.
Expunge – Variables tab
When enabled, variables (RTVariable) and local variables (RTLocalVariable) used in completed workflows older than “Days to Keep” get deleted.
In this example, completed workflows older than 365 days will have their variables and local variables deleted. At most 1000 qualifying workflows will be processed per interval, and only the variables and local variables tied to 20 of the qualifying workflows will be deleted at a time in a batch. Since workflows can have a large or small number of variables, the actual number of removed rows will vary. After each batch is deleted, the process waits 1000 milliseconds before deleting the next batch of 20 workflows' variables and local variables.
Fine-tuning the “Delete Batch Size” and “Intra-Delete Delay” is important. Deleting too many records too quickly can affect SQL performance. If you have a lot of large variables (for example, variables that hold large XML or JSON bodies), then you want a small batch size and a longer delay, as in this example. If you tend to have smaller variables, you can use a larger batch size.
(Completed Workflows: WFs that are finished successfully, finished with errors, or aborted)
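The batch-and-delay behavior described above can be sketched as follows. The function and its parameter names are illustrative; the actual deletion happens inside PMG's Config Service, not in user code:

```python
import time

# Sketch: process up to per_interval qualifying workflows, deleting in
# batches of batch_size with an intra-delete delay between batches.
# delete() and sleep() are injectable stand-ins for the real work.
def expunge_in_batches(workflow_ids, batch_size=20, delay_ms=1000,
                       per_interval=1000, delete=lambda batch: None,
                       sleep=time.sleep):
    eligible = workflow_ids[:per_interval]  # cap work done per interval
    batches = 0
    for start in range(0, len(eligible), batch_size):
        delete(eligible[start:start + batch_size])
        batches += 1
        if start + batch_size < len(eligible):
            sleep(delay_ms / 1000.0)  # pause before the next batch
    return batches

# 50 qualifying workflows with batch size 20 -> batches of 20, 20, and 10
deleted = []
n = expunge_in_batches(list(range(50)), delete=deleted.extend,
                       sleep=lambda s: None)
# n == 3, len(deleted) == 50
```

The same pattern (records per interval, batch size, intra-delete delay) applies to the other Expunge tabs described below, with different default sizes.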
Expunge – Content
When enabled, workflow content (RTXmlContent) used in completed workflows older than “Days to Keep” get deleted.
In this example, completed workflows older than 365 days will have their WF content deleted. At most 1000 qualifying workflows will be processed per interval. After deletion, the process waits 1000 milliseconds before processing the next 1000 qualifying workflows.
The RTXmlContent record that gets deleted can range anywhere from a few kilobytes to a few megabytes, so fine-tuning the “Workflows Per Interval” and “Intra-Delete Delay” is important. Deleting too many records too quickly can affect SQL performance. If you have a lot of large WF contents (i.e. large service forms), then you want to decrease the “Workflows Per Interval” and increase the delay, like this example here.
(Completed Workflows: WFs that are finished successfully, finished with errors, or aborted)
Expunge – Timers
When enabled, completed timers (RTTimerTask) older than “Days to Keep” get deleted.
In this example, completed timers older than 365 days will be deleted (regardless of whether the workflow is completed). At most 1000 qualifying completed timers will be deleted per interval, in batches of 100 timers. After each batch of 100 timers is deleted, the process waits 1000 milliseconds before deleting the next batch.
RTTimerTask records tend to not have much variance in size, so fine-tuning is not as necessary here.
Expunge – Workflow Function Logs
When enabled, logs from WF functions (table WorkflowFunctionRequests) older than “Days to Keep” get deleted.
In this example, WF function logs older than 365 days will be deleted. At most 1000 log entries will be deleted per interval, in batches of 100 log entries. After each batch of 100 log entries is deleted, the process waits 1000 milliseconds before deleting the next batch.
WorkflowFunctionRequests records tend to not have much variance in size, so fine-tuning is not as necessary here.
Expunge – Operations
When enabled, the log records of operations being performed per second (table Operations) older than “Days to Keep” get deleted.
In this example, Operation log records older than 365 days will be deleted. At most 100,000 log records will be deleted per interval, in batches of 10,000 records. After each batch of 10,000 records is deleted, the process waits 1000 milliseconds before deleting the next batch.
Operations records tend to not have much variance in size, and are very small, so fine-tuning is not as necessary here.
Expunge – Web Requests
When enabled, the web requests log records (table WebRequests) older than “Days to Keep” get deleted. Web request logs are created when setting PAGE_REQUEST_LOGGING is enabled.
In this example, WebRequests records older than 365 days will be deleted. At most 5,000 records will be deleted per interval, in batches of 1,000 records. After each batch of 1,000 records is deleted, the process waits 1000 milliseconds before deleting the next batch.
WebRequests records tend to not have much variance in size, so fine-tuning is not as necessary here.
Expunge – Synchronous Logs
When enabled, Workflow Function results log records (table RTSynchronousLog) older than “Days to Keep” get deleted.
In this example, RTSynchronousLog records older than 365 days will be deleted. At most 1,000 records will be deleted per interval, in batches of 100 records. After each batch of 100 records is deleted, the process waits 1000 milliseconds before deleting the next batch.
RTSynchronousLog can vary in size, and depending on how often WF functions are used, the batch size may need to increase, so fine-tuning can be necessary here.
Expunge – Workflow Scheduler Logs
When enabled, logs from Workflow Scheduler tasks (table WorkflowRepeaterLog) older than “Days to Keep” get deleted.
Workflow Scheduler log records tend to not have much variance in size, so fine-tuning is not as necessary here.
Expunge – Mail
When enabled, mail from inbound parsed email (tables RTRawMails and SMTPMailsParsed) older than “Days to Keep” get deleted.
Emails can vary in size, and depending on how many emails are parsed and workflows run in a given time period, fine-tuning can be necessary here.
Expunge – Action Details
When enabled, data from Workflow actions detail (table RTActionDetail) older than “Days to Keep” get deleted.
Action details can vary in size, and depending on how many actions and workflows run in a given time period, fine-tuning can be necessary here.
Compress
When enabled, the service form responses that are stored in table SC_RequestData, column question_response_xml get compressed.
In this example, requests that are completed (all workflows relating to the request are either Finished Successfully, Finished with Errors, or Aborted) and older than 365 days will have their question_response_xml column compressed, with the compressed information stored in the question_response_compressed column of the same table. At most 5,000 qualifying requests will be compressed per interval, in batches of 20 requests. After each batch of 20 requests is compressed, the process waits 25 milliseconds before processing the next batch.
Since we are not deleting records, the delay can be much smaller, relative to the “Expunge” processes.
If the service form responses are needed again (for example, when a workflow needs to be re-executed), the PMG platform will uncompress the data back into question_response_xml. The data is then compressed back into question_response_compressed when the next interval arrives, assuming all workflows relating to the request are no longer running.
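The compress/uncompress round trip described above can be illustrated with a generic compressor. PMG's actual storage format is not documented here; zlib stands in, and the form XML is a made-up example:

```python
import zlib

# Sketch: form-response XML is highly repetitive, so it compresses well,
# and decompression restores the original text exactly (lossless round trip).
xml = "<responses><q id='1'>example answer</q></responses>" * 100

compressed = zlib.compress(xml.encode("utf-8"))
restored = zlib.decompress(compressed).decode("utf-8")
# restored == xml; compressed is far smaller than the original text
```

This is why compression, unlike expunging, keeps the data recoverable: nothing is lost, only the storage footprint shrinks while the request is idle.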
Status
This shows the current status of the Data Maintenance activity, and a running count of variables that were deleted, workflow contents that were deleted, and requests that were compressed.
The “PMG Config Service” is the process that performs Data Maintenance, and the machine which is currently (or last to) run the process will be displayed in this tab under “Machine”.
Log
A log reader for Data Maintenance activities.
Troubleshooting
There is a “Fix stuck maintenance” link in the Status tab.
In HA setups (separate web server(s) & app server(s)), only one server’s “PMG Config Service” is going to run the Data Maintenance processes. If that config service crashes for any reason outside of an intentional stop and start via “PMG Config Utility”, then you might need to click the “Fix stuck maintenance” link to get the DM process to restart.
Even if you have only a single server, if the “PMG Config Service” stopped running, and later was started again, you might need to click the “Fix stuck maintenance” link to resume the DM process.
# System Workflows
The System Workflows page gives administrators visibility into not only what System Workflows are currently running, but also those System Workflows that have already completed. The System Workflows screen is accessible via ADMINISTRATION, System Management, System Workflows.
NOTE: This function, System Workflows, is largely replaced by Workflow Scheduler, which provides more flexibility, features, and control for running ad-hoc workflows.
The screen displays the following information:
Workflow Name: Name of the System Workflow
Requested On: Date and time when the Request was submitted
Requested By: Name of the user who submitted the Request
Last Status: Most recent status of the Request
Running: Whether the Workflow is currently running
Administrators can choose to filter results to display only those System Workflows currently running. Selecting the Workflow name on the left-hand column will display the Process Execution View of that Workflow.
The Start Workflow button at the bottom of the screen opens a drop-down list of available System Workflows. Administrators can define custom XML and select a System Workflow from the drop-down to run a new Request.
# Two Factor Authentication
Two Factor Authentication allows for multi-factor challenges to end users authenticating to the platform. Options include SMS, email, and time-based one-time passwords (TOTP) registered via QR code.
In addition to provider specific configuration options for Twilio and Plivo, additional options for MFA are provided as follows:
# General
MFA Users: Determines which users will be required to authenticate with MFA
Enabled SMS for MFA: Determines if SMS will be used
Enabled Email for MFA: Determines if email will be used
Enabled TOTP: Determines if TOTP will be used
SMS MFA Code Length: Specifies how many characters will be in the generated SMS codes
Email MFA Code Length: Specifies how many characters will be in the Email MFA generated codes
Days Valid For: the number of days to consider a user's MFA as valid, 0 indicates MFA is required on every logon
SMS MFA Provider: Specifies which provider to use for SMS MFA
From Address: The sending email address to use for outbound emails
Registration Subject: The subject line to use for the user during registration email communications
Registration Body: The email body to use during email registration for end users
MFA Subject: The subject line to use in ongoing MFA verification emails
MFA Body: The email body to use in ongoing MFA verification emails
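The SMS and Email "Code Length" settings above control how many digits the generated verification codes contain. A minimal sketch of that idea (this is not PMG's actual implementation, just an illustration of what the length setting governs):

```python
import secrets

# Sketch: generate a cryptographically random numeric MFA code of the
# configured length, as the SMS/Email code-length settings describe.
def generate_mfa_code(length: int) -> str:
    return "".join(secrets.choice("0123456789") for _ in range(length))

code = generate_mfa_code(6)
# always exactly `length` digits, e.g. a 6-digit code like "493027"
```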
# TOTP
The time-based one-time password (TOTP) configuration is managed from the TOTP tab. Options to configure are below.
Issuer (site): This is the name given to the entry for the user's TOTP application
App install help text: Helper text to show unregistered TOTP users at logon
NOTE: Once an issuer site name has been saved, a test QR code is presented to allow for validating the TOTP setup.
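After scanning the QR code, the user's authenticator app computes codes using the standard TOTP algorithm (RFC 6238: HMAC-SHA-1 over a 30-second time counter). The sketch below shows that standard algorithm for context; it is not PMG-specific code, and the secret used is the RFC's published test key:

```python
import hashlib
import hmac
import struct

# Sketch: standard TOTP (RFC 6238, SHA-1, 30-second step) as computed by
# an authenticator app. Secret below is the RFC test key, not a real one.
def totp(secret: bytes, for_time: int, digits: int = 6, step: int = 30) -> str:
    counter = struct.pack(">Q", for_time // step)          # time-based counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: time 59 with ASCII key "12345678901234567890"
print(totp(b"12345678901234567890", 59, digits=8))  # prints "94287082"
```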
# Testing
Each of the MFA options provides a Test option to validate the configuration
# Login Link
Login Link provides a login bypass method for users who need to access the application but do not have their password or have temporarily lost access for some other reason. Using the Login Link feature, an administrator can send the specified user an email which allows the user to link into the application and bypass authentication.
Note: The Register Login Link Code action can perform this function from workflow.
# SAML Configuration
A configuration to support SAML logon is set up at the page /sc/admin.saml.aspx.
Once enabled, a SAML IdP will be used to log on a user.
Enabled: Determines if SAML will be used instead of the login page.
Domain: The domain for the logged in user.
SAML User Attribute Mapping: The attribute in the IdP response which will hold the user value.
AD Search Property: This property will be mapped to the value of the SAML User Attribute Mapping result and used to search Active Directory for the user's sAMAccountName value. For example, if the SAML attribute value is the user's email address, a config value of 'mail' can be used.
Service Provider Name: The name of the Service Provider.
Assertion Consumer Service URL: The URL which will receive the SSO response. Example: https://{hostname}/sc/acs.aspx
Partner Identity Provider Name: The IdP name
Single Sign On Service URL: The IdP SSO URL
SAML Attribute Mapping: An optional table of key-value pairs which can be used to map the IdP response attributes to variables in the user's session.
Trigger Workflow: An optional workflow value that will be run every time a user triggers the SAML SSO login process.
Workflow Variable to Pass Attributes Value: An optional workflow variable that will be passed the SSO attributes value in the form of a JSON document.
SAML_USER Session Workflow Output Variable: An optional workflow variable which will hold the user login value generated by the workflow. Use this for creating new users on the fly. This value will be placed in the user's SAML_USER session variable for the login process.
Use Workflow Output Variable When: Select the option to specify when to use the SAML_USER output variable from the trigger workflow as the value of the user's login. The options are Never, Always, and User Not Found in AD Search. If an "AD Search Property" value is not specified, use "Never" or "Always" since a search isn't performed.
Service Provider Local Certificate File: The name of the local certificate file.
Service Provider Local Certificate Password: The password for the local certificate file.
Sign Authentication Request: This flag specifies whether the SAML authentication request sent to the IdP should be signed.
SAML Response Signed: This flag specifies whether SAML responses received from the IdP should be signed.
Assertion Encrypted: This flag specifies whether SAML assertions received from the IdP should be encrypted.
Assertion Signed: This flag specifies whether SAML assertions received from the IdP should be signed.
Use Embedded Certificate: This flag specifies whether to use embedded certificates in the XML signature when verifying the signature.
IdP Local Certificate File: The name of the local IdP certificate file.
IdP Local Certificate Password: The password for the local IdP certificate file.
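The "SAML Attribute Mapping" table above copies IdP response attributes into session variables. The sketch below illustrates that key-value mapping idea; the attribute names and session variable names are hypothetical assumptions, not values from the product.

```python
# Illustrative sketch of applying a SAML attribute-mapping table.
# Attribute and variable names below are assumptions for illustration only.

def map_saml_attributes(idp_attributes, mapping):
    """Copy IdP response attributes into session variables per a key-value map.

    idp_attributes: dict of attribute name -> value from the SAML assertion
    mapping: dict of IdP attribute name -> session variable name
    """
    session = {}
    for idp_attr, session_var in mapping.items():
        if idp_attr in idp_attributes:
            session[session_var] = idp_attributes[idp_attr]
    return session

# Example: map the IdP's "mail" attribute to a hypothetical SESSION_EMAIL variable.
session = map_saml_attributes(
    {"mail": "jdoe@example.com", "givenName": "Jane"},
    {"mail": "SESSION_EMAIL", "givenName": "SESSION_FIRST_NAME"},
)
```

Attributes present in the IdP response but absent from the mapping table are simply not carried into the session.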
# OAuth Support
OAuth authentication is configured on the page /sc/admin.oauth.aspx, which enables single sign-on with authentication sources from Google and Microsoft identities.
Once enabled, Site Text may be used to configure the form logon page.
# Google OAuth Configuration
Please reference the Google API support documentation for obtaining the OAuth credentials.
Google Console Dashboard Link
Google Developers Sign-in Link
The Google OAuth configuration page has the following options.
Enable: Determines if Google OAuth is available for user logon
Enable Login Button: Controls the display of a Google Login button for the PMG application form logon page
Redirect URI: The redirect URI for the Google authentication to send the client browser upon authentication
Client ID: The OAuth Client ID
Client Secret: The OAuth Client Secret.
Domain: The configured domain to use as the internal user store upon authentication
Google OAuth user identification uses the Google ID (the Google email address) as the key to locate the user within the specified internal domain; the match is made on the users' email address property.
# Microsoft Azure OAuth Configuration
Please reference the Microsoft Azure OAuth material for configuring the OAuth protocol on the Microsoft identity platform: Microsoft Identity Platform Link
The Microsoft Azure OAuth Configuration page has the following options.
Enable: Determines if Microsoft Azure OAuth is available for user logon
Enable Login Button: Controls the display of a Microsoft Login button for the PMG application form logon page
Redirect URI: The redirect URI for the Microsoft authentication to send the client browser upon authentication
Client ID: The OAuth Client ID
App ID: The OAuth App ID
Tenant ID: The OAuth Tenant ID
Payload Prop For SAM Search: This is the OAuth payload property to be used as the SAM to identify users within the Domain
Domain: The configured domain to use as the internal user store upon authentication. This is the domain that is searched for the user and it is the domain used to identify the user in the portal after the user is found.
AD Property For SAM Search: This is the internal Active Directory user object property to use as the SAM to match the OAuth payload property
Allow All Tenant Issuers: Controls whether all external domain users authenticated with Azure SSO will be allowed
Allowed Issuer Tenants: Allows for a specific list of additional Tenant IDs which will be allowed
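The two tenant settings above work together. As a rough sketch (this is an illustration of the described behavior, not PMG's actual implementation), the decision logic for accepting a token's issuing tenant could look like:

```python
# Hypothetical sketch of the tenant-issuer check implied by
# "Allow All Tenant Issuers" and "Allowed Issuer Tenants".

def issuer_allowed(token_tenant_id, home_tenant_id, allow_all, allowed_tenants):
    """Decide whether a token issued by token_tenant_id should be accepted."""
    if token_tenant_id == home_tenant_id:
        return True            # always trust the configured Tenant ID
    if allow_all:
        return True            # "Allow All Tenant Issuers" is enabled
    return token_tenant_id in allowed_tenants  # explicit allow list
```

A user from a foreign tenant is therefore accepted only if all tenants are allowed or that tenant is explicitly listed.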
# Bulk Process Control
Bulk Process Control gives administrators the power to locate and act on multiple Workflows simultaneously. The Bulk Process Control screen is accessed via ADMINISTRATION, System Management, Bulk Process Control.
The Build Query tab can be used to filter and locate Workflows with a common set of parameters. Administrators can filter by workflow, state, request date, update date, submitter, specific execution IDs, and outcome state. A maximum number of rows to return can be set, and optionally, row numbers can be displayed. Once all parameters have been set, select Run Query to view the results.
Alternatively, a Custom Query can be written by the user on the upper right-hand side of the screen.
The query results are displayed at the bottom portion of the screen. Administrators are presented with the following information regarding the displayed Workflows:
ID: Numeric ID of the Workflow associated with a specific Request
Workflow: Name of the Workflow
Version: Version of the Workflow
Requested On: Date and time when the Request was submitted
Internal Status: Execution state of the Workflow
Status: Status of the Workflow
Last Updated Date/Time: Date and time of the last update to the Workflow
User ID String: Unique UserID of the user who submitted the Workflow
User Display Name: Name of the user who submitted the Request
Outcome State: Outcome State of the Workflow (n/a for sub-workflows)
Administrators can check the boxes for the desired Workflows and then select one of the action options available at the bottom of the page:
Pause: Brings a running Workflow to a temporary stop
Resume (Paused Workflow): Resumes a Workflow that was previously paused
Resume (Error Workflow): Resumes a Workflow that had previously ended as an error
Abort: Brings a running Workflow to a permanent stop
The source of the process is indicated in the “Source” column of the results with links as available to the source.
# Windows Services
The Windows Services page gives administrators one central location to start, stop, or restart application services. The Windows Services screen is accessible via ADMINISTRATION, System Management, Windows Services.
The Workflow Engine on a selected environment instance can be Stopped, Restarted, or Started as needed.
# Log Viewer
The Log Viewer feature allows administrators to review system-generated log data for advanced troubleshooting. The Log Viewer screen is accessible via ADMINISTRATION, System Management, Log Viewer. From the Log Viewer page, administrators can view log data for the web components of the platform as well as the base platform and workflow details, by selecting the appropriate tab.
# Web
By clicking on the Web tab, the administrator will see rows of data with Time, User, Remote IP, Host Name, Type, Path, and Message fields. The results can be filtered to display only log entries for errors. Web logs may be exported with the Export link provided.
# Application
The data displayed in the Application tab includes Time, Message, Error Details, Machine Instance Name, Process Name, Level, Process ID, Workflow ID, and Action ID. The results can be filtered to display only log entries for errors, as well as by several other options. Logs may be exported with the Export link provided.
# File System Logs
File System Logs include data which was unable to be committed to the database server. Logs are queued by application server instance and can be rotated on demand to be downloaded by selecting the respective “Rotate Now” link for each available server. Logs which are rotated are compressed and sent as a timestamped file.
# Summary Logs
Summary logs provide details for workflows which are executed “in memory” or “in process,” versus the standard model of workflows which run from a workflow server instance and are persisted to the database. Filters provide options to limit the workflow data shown, and each entry may be selected to see a diagram of the workflow execution details. Summary log entries also provide a download link to facilitate troubleshooting and sharing with product support.
# License
The License page gives administrators a summary of the status of their product license. The License screen is accessible via ADMINISTRATION, System Management, License.
Information is provided about the license parameters, a list of the licensed connectors is shown, and the License String can be viewed and/or updated on this screen. The License page also includes information on the number of License Services Counts by Category.
# Utilities
The Utilities page lets administrators perform maintenance tasks. The Utilities screen is accessible via ADMINISTRATION, System Management, Utilities.
The following tasks can be executed from the Utilities screen:
Rebuild Full Text Search: Regenerates Catalog full text search index
Benchmark: Provides various details about system performance
Refresh Queue Membership: Updates users for queues based on group membership
Eradicate Portal Content: Deletes all Portal content; catalog categories, services and pages
Clear Portal Caches: Resets all Portal content caches
Clear all User, Group and Query caches: Resets all cache data for users, groups, and queries
Reindex All Workflows: Rebuilds the search index for workflow designs
Force GC Collect: Forces garbage collection in the application
Change KMS Master Password: Changes the key management master password
Summary Viewer: Upload and view diagrams and details for a workflow summary file
App Log Viewer: Upload and view app log files
Email Opt Out: Provides management of email addresses that should not receive emails triggered by workflow action execution. Addresses can be opted out globally, per workflow, or for specific actions.
# Workflow Precache
Workflow Precache allows specified workflows to be cached, or precompiled, so that from the application's perspective they run quickly on first execution. When workflows are accessed initially, various components are processed and loaded into the application cache, such as Code actions and various workflow action properties which must be converted from 'script' to compiled code. Once a workflow is precached, its first execution is generally as fast as subsequent executions. Precaching can therefore be very useful for meeting performance goals when the application recycles. This can be helpful for data workflows used in end-user interfaces, or for Workflow APIs.
Workflow precache management is accessed from Administration, Workflow Precache. This screen offers the following settings and options.
Cache Interval
To precache workflows, specify a cache interval for the defined workflows, using the format HH:MM:SS, e.g. every five minutes:
00:05:00
This time period is a trigger to check listed workflows for cache presence.
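The HH:MM:SS interval format translates directly to a time period; as a quick sketch (the parsing shown here is illustrative, the product handles this internally):

```python
from datetime import timedelta

# Illustrative helper showing how an HH:MM:SS cache interval such as
# "00:05:00" corresponds to a time period.

def parse_interval(value):
    hours, minutes, seconds = (int(part) for part in value.split(":"))
    return timedelta(hours=hours, minutes=minutes, seconds=seconds)

parse_interval("00:05:00")  # a five-minute interval
```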
By Recently Changed
Workflows whose diagrams have changed within this number of days will be cached.
By Recently Ran (InProc)
Workflows that have run as InProc (and Engine Not Logged) within this number of days will be cached.
By Recently Ran (Logged)
Workflows that have run normally (and Engine Logged) within this number of days will be cached.
By Name
This list determines which workflows will be cached.
Effective List
This is the current list of workflow names that will be cached based on the current configuration.
Advanced (button)
Allows for direct editing of the cache settings document. Can be useful for bulk operations.
Status
The Status tab indicates the caching activity as well as the various processes which have established a cache.
# Calendar Management
Administrators can utilize multiple Calendars within a single implementation. The Calendar Control screen is accessed via ADMINISTRATION, System Management, Calendars, or from the Workflow Designer, under the main Tools menu.
This feature provides for the management of Calendar definitions for use throughout the application.
When first opened, the Calendar management window will display an alphabetized list of all existing Calendar profiles on the left-hand side. This is the global list of Calendar definitions available to workflow administrators when configuring escalation properties in Human Actions, timers, or OLA Start/End activities, as well as in SLA properties on front-end forms. Referencing a Calendar allows you to exclude specific dates and/or times of day from consideration when calculating time periods.
Using the Filter box above the Calendar list, you can type in a search string (or partial string) to filter down and locate a specific Calendar, if needed.
Clicking an existing Calendar will display the current details for that Calendar profile. From here, the individual property values can be edited as needed, or the entire Calendar record itself deleted via the “Delete” button at the top right of this window.
To create a new Calendar, click the “+ Add Calendar” button at the top left of the pop-up window. This will open an additional window where you can provide preliminary, required information about the new Calendar being added, including:
Calendar Name – enter the display name for the new Calendar
Time Zone – select the appropriate global time zone for the new Calendar to use
Observe Daylight Saving Time – checkbox to indicate whether the new Calendar should be subject to Daylight Saving Time changes each year
Time slot data – Configurable for specific start and end times, or “All day” for any number of days. At least one time slot must be set for any given calendar to exist.
Once these required fields have been filled out, click Save to save and continue defining the new Calendar. Or, click Cancel to disregard any entries made so far, and cancel the Add Calendar process entirely.
After clicking Save, the new Calendar record will be added to the database, and additional details may be added. The top center area of the Calendar management window will display the new Calendar Name, selected Time Zone, and setting for Observe Daylight Saving Time. Below are property fields for the Calendar, broken out into two separate tabs:
Working Time – designates the day(s) of the week and times to be included in this Calendar
Exceptions – assigns specific date(s) to be excluded entirely from this Calendar definition
Under the Working Time tab, use the controls provided to designate those times and days that should always be included in time calculations for this new Calendar. Specify a start and end time, or, click the checkbox under the All Day column to indicate all 24 hours on the selected day(s) should be included.
Use the checkboxes under the Days column to select the specific day(s) of the week that each time period row should apply to. Click the “+ Add” button at the bottom of the list of timeslots to add an additional timeslot. And, for each time period row saved, use the pencil icon to edit any settings as needed, or the trash can icon to delete the row entirely. Ultimately, you can add as many time period rows as needed to cover all the daily timeframes that should always be included in time calculations against this Calendar.
Under the Exceptions tab, use the controls provided to build a list of specific calendar dates across the year that should be excluded from consideration when calculating time elapsed against this Calendar. Clicking the field under the Dates column will open a calendar-based date picker window, where you can select the specific date to be excluded.
After selecting a date, use the checkbox under the Recurring (On calendar date) column to indicate whether this date should be excluded every year, or only for the year selected on the date picker. Once these selections have been made, click the “+ Add” button on the current row to save it.
For each date exception row saved, you can use the pencil icon to edit settings as needed, or the trash can icon to delete the row entry entirely. Ultimately, you can add as many date rows as needed to cover all the dates that should always be excluded from time calculations against this Calendar.
Once you’re finished adding all the applicable settings under the Working Time and Exceptions tabs, you can either use the list on the left-hand side to navigate to a different Calendar definition, or close the Calendar management window entirely. Authorized admin users can always return to this Manage Calendar feature to add, edit or delete Calendar profiles as needed. Any workflow steps or SLA profiles that reference a Calendar will always use the latest settings configured in this management window.
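The effect of Working Time and Exceptions on a time calculation can be sketched as follows. This is a simplified illustration assuming a hypothetical 09:00–17:00 weekday calendar with one holiday; it is not the product's actual algorithm.

```python
from datetime import datetime, timedelta

# Simplified sketch of the elapsed-time calculation a Calendar enables: only
# hours inside the Working Time slots count, and Exception dates are skipped
# entirely. The 09:00-17:00 weekday slot and the holiday are assumptions.

WORK_START, WORK_END = 9, 17                 # working time slot: 09:00-17:00
WORK_DAYS = {0, 1, 2, 3, 4}                  # Monday through Friday
EXCEPTIONS = {datetime(2024, 1, 1).date()}   # excluded dates (e.g. a holiday)

def business_hours(start, end):
    """Count whole hours between start and end that fall inside working time
    (assumes hour-aligned inputs to keep the sketch short)."""
    hours = 0
    cursor = start
    while cursor < end:
        if (cursor.weekday() in WORK_DAYS
                and cursor.date() not in EXCEPTIONS
                and WORK_START <= cursor.hour < WORK_END):
            hours += 1
        cursor += timedelta(hours=1)
    return hours
```

For a Monday span of 08:00 to 18:00, only the eight hours inside the working slot are counted, and an excepted date contributes nothing at all.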
# External Groups
The External Groups feature gives administrators the ability to establish groups using database-driven configuration. The External Groups screen is accessed via ADMINISTRATION, System Management, External Groups.
The External Groups feature leverages information from databases or third party applications via the PMG Provider Framework, to group users for use throughout the platform. For example, an HR database may contain useful information about users, such as business department or location, which can be used to group them into a defined group. In addition, users may be allocated to groups through business rules, such as determining whether a user is a functional administrator, by using SQL queries. Once an External Group is configured, the group information may be used just as AD or LDAP groups are used within category permissions.
External Groups are composed of the below properties:
Name: Name to display for the group when listed along with other AD or LDAP directory names
Data Source: External data source to use for the external group
Group List Query: Query which will provide the list of groups, whether the groups are literal or inferred by a rule
User Group Query: Query to determine the groups to which a user belongs
Examples of queries that can be used in External Groups include:
Group List Query: Uses the “%__GROUP_SEARCH__%” token to allow filtering of groups when selecting groups in administration once the group is configured:
```sql
SELECT DISTINCT OperatingRegion
FROM Regions
WHERE [OperatingRegion] LIKE '%__GROUP_SEARCH__%'
```
User Group Query: Works with the group query above to determine groups for the user, and uses the dynamic value “__USER_LAST_NAME__” to match groups to the user in the SQL call:
```sql
SELECT OperatingRegion
FROM Regions
WHERE LastName = '__USER_LAST_NAME__'
```
Rules Based Query: Leverages MS SQL's ability to generate dynamic tables to return a table with “Admin” and “Non Admin” as values, which can then be used for assignment within Categories:
```sql
SELECT [Name]
FROM (SELECT 'Admin' AS [Name]
      UNION ALL
      SELECT 'Non Admin' AS [Name]) drv1
WHERE Name LIKE '%__GROUP_SEARCH__%'
```
Match User Group Query: Uses an MS SQL CASE statement to return a matching group value for the user based on the SQL condition:
```sql
SELECT CASE
         WHEN (SELECT DISTINCT LanID
               FROM SpecialPermissions
               WHERE '__USER_SAMACCOUNTNAME__' IN (LanID)) =
              '__USER_SAMACCOUNTNAME__' THEN 'Admin'
         ELSE 'Non Admin'
       END
```
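The dynamic tokens in the example queries (such as __USER_LAST_NAME__) are substituted with values from the user's context before the query runs. The sketch below illustrates that substitution idea only; it is not PMG's actual implementation, and a real implementation must also guard against SQL injection when expanding such tokens.

```python
# Illustrative sketch of dynamic-token substitution for External Group queries.
# Real implementations should sanitize or parameterize the substituted values
# to avoid SQL injection.

def substitute_tokens(query, context):
    """Replace __TOKEN__ placeholders with values from the user's context."""
    for token, value in context.items():
        query = query.replace(token, value)
    return query

sql = substitute_tokens(
    "SELECT OperatingRegion FROM Regions WHERE LastName = '__USER_LAST_NAME__'",
    {"__USER_LAST_NAME__": "Smith"},
)
```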
An External Group can be saved or deleted by clicking the Save or Delete buttons at the right of each row.
# File Storage
File Storage determines the storage location for files related to the use of the application. As files are attached to Apps or Forms, or created by Workflows, the application requires a persistent storage reference for the files. Storage options available for files are "Internal Database", "File System", and "S3". Form/App storage and Workflow storage may be separately configured. As options are selected for storage, additional tabs will display for additional settings.
Storage settings may be changed at any time because the application remembers each file's storage type over time. It is therefore safe to switch from one storage type to another and back.
When File System is the selected storage type, a system path must be defined for use, and this path should be accessible to any possible application instances at any given time.
When S3 is selected as the storage type, additional fields will display for you to provide Access Key, Secret Key, Bucket Name, and Region. Files accessible from S3 may be downloaded directly by end users from S3, if the option is enabled. Note: "direct download from S3" security first verifies that the authenticated user can access the file, then redirects the download to S3 with a time-based secured URL.
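The "time based secured URL" concept works like an S3 pre-signed URL: the URL carries an expiry and a signature, so it is only valid for a limited window. The sketch below illustrates the idea with a generic HMAC signature; real deployments would use the AWS SDK's pre-signed URLs, and the key and URL shape here are assumptions for illustration only.

```python
import hashlib
import hmac
import time

# Conceptual sketch of a time-limited signed download URL (similar in spirit
# to an S3 pre-signed URL). Key and URL layout are illustrative assumptions.

def sign_url(base_url, secret, expires_in=300, now=None):
    """Append an expiry timestamp and an HMAC signature to a URL."""
    expires = int(now if now is not None else time.time()) + expires_in
    payload = f"{base_url}|{expires}".encode()
    signature = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return f"{base_url}?expires={expires}&sig={signature}"

def verify_url(base_url, expires, sig, secret, now=None):
    """Accept the link only if it is unexpired and the signature matches."""
    current = now if now is not None else time.time()
    if current > expires:
        return False  # link has expired
    payload = f"{base_url}|{expires}".encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```

Because the signature covers the expiry timestamp, tampering with either the URL or the expiry invalidates the link.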
# Data Encryption
Data Encryption controls how data will be encrypted within the platform. Encryption is available for Workflow Variables as well as Form and app widget data. Encryption may be internal or may use AWS-CMK for “Bring Your Own Key” (BYOK) support.
Encryption options are none, internal, or AWS. Auto rotate will change the encryption key on an internal schedule.
When AWS is selected, an additional tab will display to provide the AWS key access information.
# Connectors
Available connectors for the platform are managed from the Connectors screen. This screen is accessed from Administration, Integrations, Connectors. The Connectors screen lists connectors with information about their respective current state, and links to manage the connector configuration. As connectors are enabled, the actions are made accessible from the Workflow Designer.
Hovering over a connector provides quick links to manage the connector. Options are to Hide/Show, Edit the actions for the connector, and edit the configurations, based on the connector type.
The dot in the top right of each connector box indicates whether the connector is currently licensed; green indicates licensed. Connectors with blue backgrounds are available (not hidden) in the Workflow Designer. Greyed-out connectors are hidden from the designer.
# Lists
“Lists” provides for the management of lists of items which can be used as a data source in forms and workflows without requiring the creation of database tables or the use of SQL. Lists can also be migrated through export/import. Lists management is accessed from the Portal via Administration, “Lists”.
To add a list, select “Add/Import” and choose to manually create a new list with “Add”, or Import a list export file with “Import”.
When manually creating a list, the management screen allows for adding list items or bulk updating list items. To use bulk mode, select the Bulk Update tab, enter a set of line-separated values, and submit.
Once lists are defined, they may be exported from the Export List link.
Defined Lists are available for use within form values under the field properties as “Data, Source, Lists”.
List items will have a “Value” and a “Display”. Once items are added to the list, the values may be updated through the “Bulk Update” screen, where the value is the first entry and the display is the second for each item.
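The Bulk Update format described above (one item per line, value first, display second) can be sketched as follows. The comma separator here is an assumption for illustration; check the Bulk Update screen for the exact expected format.

```python
# Illustrative parser for the Bulk Update line format: one item per line,
# with the value as the first entry and the display as the second.
# The comma delimiter is an assumption, not a documented product detail.

def parse_bulk_lines(text):
    items = []
    for line in text.strip().splitlines():
        value, display = (part.strip() for part in line.split(",", 1))
        items.append({"value": value, "display": display})
    return items

parse_bulk_lines("US, United States\nCA, Canada")
```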
Lists are also available for management and use from workflow. Under the “PMG Platform” actions palette are “List Edit”, “List Item Edit” and “List Item Query”. See the actions section for more details about these actions.