Data Uploader FAQ

General

What is Data Uploader?

Data Uploader is a CAM Windows utility that reads workspace, user, and group-related data from your on-premises billing or NBI system and uploads the information to the CAM cloud for processing.


What limits do I have with Data Uploader?

  1. Currently, you can set the daily Throttle limit to a maximum of 50,000 jobs for optimal, error-free performance. Each day, you can upload up to the number of jobs permitted by the Throttle limit. After the daily limit is reached, a Throttle limit warning message appears in the log.

  2. CSV files have a maximum of 100 rows. CSV files with more than 100 rows may cause issues, such as imports taking days to process.

How do I download Data Uploader?

The Data Uploader can be downloaded from the Administration tab > Data Uploader. Download and run the installer. Read here for configuration details. The user must have system permission to download, install, and run the Data Uploader on the desktop or server.

 

Is Windows 2012 supported for DU?

There can be issues running Windows Server 2012. We recommend Windows Server 2016 or later.

SQL Server 2012 or later is supported.


Are any changes made to the on-premises source database?

No changes are made to the on-premises database. The Data Uploader needs read-only access to the source database on the SQL server to query the database and upload the information to CAM.
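As a minimal sketch of what that read-only access looks like on SQL Server (the service account and database names below are placeholders, not values from your environment), the account only needs the db_datareader role on the source database:

-- Placeholder names: DOMAIN\svc_datauploader and BillingDB are illustrative only
CREATE LOGIN [DOMAIN\svc_datauploader] FROM WINDOWS;
USE BillingDB;
CREATE USER [DOMAIN\svc_datauploader] FOR LOGIN [DOMAIN\svc_datauploader];
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\svc_datauploader];  -- read-only access, no write permissions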


Are high availability servers supported for Data Uploader?

Currently, DU does not support an active-active or active-passive high availability configuration. This means that DU is not designed to run concurrently on two servers for high availability purposes, and there is no built-in functionality for automatic failover or for synchronizing generated files (JSON/CSV) between servers. If the primary server running DU fails, the service will not automatically resume on a secondary server. Furthermore, because there is no automatic synchronization of files or processing state, if you start DU on a secondary server after a failure, it will not continue uploading generated files from where the primary server left off. If the client chooses to use high availability at the database level in SQL Server, that option can be used.


Can multiple users log into the DataUploader server to view jobs?

Yes, multiple users can log in to the server with whichever accounts you want. However, Data Uploader must be run as the account it was set up with (no exceptions); otherwise, jobs and connections may not display.


Connection Strings

What is a Connection String?

A connection string enables the Data Uploader to access your native database to authorize and execute the query request at run time.
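For reference, a typical ADO.NET-style SQL Server connection string looks like the following (the server, database, and account values are placeholders, not your actual settings):

Server=SQLSERVER01;Database=BillingDB;Integrated Security=True;

or, with SQL authentication:

Server=SQLSERVER01;Database=BillingDB;User Id=svc_datauploader;Password=********;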

The Data Uploader Log tab displays a Connection String error. How do I fix this?

This error is displayed if the connection string is not valid and the Data Uploader cannot access your database. In the Connection String tab, re-enter the credentials, click Save, and test the connection. A valid connection message should be displayed.


Authentication Errors

Why does the Data Uploader return an Authentication error when I try to run a provisioning job?

This usually occurs if the authentication token has expired. In the Data Uploader > Authentication tab, enter your password and click Re-Authenticate.


Job Upload Fails

The Connection String and Authentication are validated; why does the Data Uploader fail to upload a job?

In order for the Data Uploader to create a provisioning job, a properly configured script to update the matters must exist in the default SQL folder C:\Program Files (x86)\Prosperoware\DataUploader\SQL or in the folder path displayed in the SQL Files tab.
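As a rough illustration only (the table and column names below are hypothetical and not the actual billing schema), such a script is essentially a SELECT that returns the matter data to be uploaded:

-- Illustrative provisioning query: matters modified in the last day
SELECT c.ClientId,
       m.MatterId,
       m.MatterName,
       m.ModifiedDate
FROM dbo.Clients AS c
JOIN dbo.Matters AS m ON m.ClientId = c.ClientId
WHERE m.ModifiedDate >= DATEADD(day, -1, GETDATE());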

 

The jobs completed and the workspaces show in CAM, but the Job Center shows In Progress. What do I need to do?

Most likely the API was overwhelmed and the jobs ended up showing twice in the UI. This happens from time to time. If there is any concern that the data is not there and complete, the jobs can be bulk re-run.

 


Can I change the default SQL folder location?

Yes. The default location can be changed when installing the Data Uploader. In the installer, change the install location to a folder other than the default location displayed.


Can I set a schedule for the query to be executed?

Yes. The Timer tab allows you to set a schedule for the task to be executed. The task is then added to the Windows Task Scheduler > Task Scheduler Library. The files from your database will then be automatically uploaded to CAM at the defined frequency.

Note: Depending on your firm's local IT security policy settings, the task in the Windows Task Scheduler may not trigger if it is disabled. If the upload has failed, check the Task Scheduler and enable the task. The user must have an account with permissions to enable the scheduled task.


What needs to be set on the task scheduler?

  1. On the General tab, ensure Run whether user is logged on or not, Do not store password, and Run with highest privileges are checked.

  2. On the Settings tab, ensure Do not start a new instance is selected in the If the task is already running, then the following rule applies list.

  3. Use the service account as the user account.


The connection to the Data Uploader is validated but the Timer tab blanks out.

Change the date format on the system where the uploader is installed to yyyy/mm/dd. Close and reopen the Data Uploader application and re-validate the connection. The Timer tab should now be displayed.


What is the data format of the query result uploaded to CAM?

Each query is returned in the form of a separate CSV file and uploaded to the Jobs tab for further processing.
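For example, a query like the illustrative one sketched earlier would produce a CSV along these lines (the columns and values here are made up; use the sample CSV provided by Litera for the actual column layout):

ClientId,MatterId,MatterName,ModifiedDate
1001,12,Smith v. Jones,2024-05-01
1002,7,Asset Purchase,2024-05-02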


How can Data Uploader sync from multiple sources?

If there are multiple sources, users can set multiple connection strings. For this to work, .NET 4.8 or higher must be installed on the system where the Data Uploader is installed. How the multiple connection strings are set up depends on where the source systems reside. Use whichever of the following scenarios applies:

Both sources are cloud-based: Set up the source systems in the Source System panel. The source systems can be run independently.

Both sources are on-premises and on the same SQL server: The same connection string can be used for both systems since the databases reside on the same server (see the sketch after these scenarios).

Both sources are on-premises, but each source is on a different SQL server: Data Uploader would need to be installed as two instances, one for each source.
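For the same-SQL-server scenario, one way this can work (assuming the scripts qualify each table with its database name; the database and table names below are placeholders) is to keep a single connection string to the server and have each script reference its own source database explicitly:

-- One connection string to the shared server; each script names its own database
SELECT ClientId, MatterId FROM BillingDB.dbo.Matters;
SELECT UserId, GroupName FROM HRDB.dbo.GroupMembers;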


Why is Data Uploader blank after I log in to it?

The ports or sites used by the Data Uploader are blocked, or your .NET version is older than 4.8. Add the sites to Trusted Sites. See the article here.


Will DataUploader support Secure LDAP or enforce this as well?

Will this comply with cloudimanage.com’s requirement to have users and groups synced securely?

The Data Uploader process has two steps. The uploader first connects locally to AD via the LDAP protocol and uploads to CAM over HTTPS (443); CAM then calls the iManage REST API over HTTPS (443) to create the users and groups. Directly connecting to Secure LDAP is on the future roadmap.


BillingToMilan Timestamp Feature

Is there a way to include and reference a timestamp of when the SQL queries were last run (the feature apparently exists in BillingToMilan)? Currently, the scripts pick up anything modified in the last x days, but the scripts run every few minutes, so an almost identical set of results is uploaded and processed every few minutes. Is there a better way to do this?

This will be supported in the future.
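Until then, one possible interim pattern, maintained entirely by the firm rather than by Data Uploader (the watermark table and all column names below are hypothetical), is to filter the query on a stored last-run timestamp instead of a fixed look-back window:

-- Current pattern: everything modified in the last 2 days is re-uploaded on every run
SELECT ClientId, MatterId, MatterName
FROM dbo.Matters
WHERE ModifiedDate >= DATEADD(day, -2, GETDATE());

-- Hypothetical watermark pattern: only rows modified since the last successful run
DECLARE @LastRun datetime = (SELECT MAX(LastRunTime) FROM dbo.UploaderWatermark);
SELECT ClientId, MatterId, MatterName
FROM dbo.Matters
WHERE ModifiedDate >= @LastRun;
-- After a successful upload, a firm-maintained step would insert the new run time into dbo.UploaderWatermark.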


Sample CSV for Provisioning Shortcuts

Does anyone have a sample CSV file to provision shortcuts in iManage?

Yes, please find attached a sample CSV for this and modify it for your data: Sample Provisioning CAM CSV


Error: User/Password is not working (502)

To fix this login failure, you need to change the “App client id” in Cognito Federated Identities within the AWS portal.

  1. Log in to the AWS console.

  2. Search for the Cognito service.

  3. Click “Manage User Pools”.

  4. Click the user pool that you’re using.

  5. Click App Clients.

  6. Copy the App client id for the Remote App.

  7. Click Federated Identities.

  8. Click the PROD federated identity of your user pool (e.g., tenantProductionIoProdFederatedIdentityPool).

  9. Click Edit Identity pool.

  10. Expand Authentication Providers.

     a. Click Unlock on the App Client id.

     b. Paste the App Client id you copied from the User pool in step 6.

  11. Scroll down and click Save Changes.

 

Best Practice: Data Uploader should be configured with the same service account that has been used for the initial install, any upgrades, and any configuration change.


Handshake Error or Protocol Error

This is an issue with the firewall or network settings. Check what changes to the system’s firewall or network settings could be affecting this.


Source Mapping Issues

Source mapping isn’t taking effect, or errors appear in DU:

The configuration may be set up incorrectly. The source should be equal to the OU container in AD, or to the specific SQL file name in the Data Uploader scripts folder.

For example:

<UniqueIds>
  <uniqueId columns="clientId,matterid" source="CreateOrUpdateWorkspace.sql"/>
  <uniqueId columns="type,UserId,database" source="ou=VMWareUsers,dc=hetzner,dc=myfirm,dc=com"/>
  <uniqueId columns="type,UserId,database" source="ou=DomainUsers,dc=hetzner,dc=myfirm,dc=com"/>
</UniqueIds>

 

Note that this applies not only to an initial job, but to any JobDelta run.


AD Mapping Issues

The CSV was not generated:

  • Check for any issue with the configuration here, or with the Source Mapping configuration.

Extra data is imported that shouldn’t be:

  • Check the configuration here, or the Source Mapping configuration.


CSVs aren’t being uploaded

  • Check that the CSV is available in the DataUploader\Archive folder. Each time a job runs or is scheduled to start, a CSV is created here.

  • Check for the CSV name in the corresponding row of the JobDelta table in the Data Uploader database (see the query sketch after this list).

  • Check that the CSV is present in the S3 bucket for the user performing the Data Uploader job. Access this from within the AWS portal.
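A minimal way to inspect the JobDelta table, assuming read access to the Data Uploader database (the exact column names vary, so this simply returns recent rows to scan for the expected CSV file name):

-- Return rows from the JobDelta table; add an ORDER BY on the relevant date/ID column if the table is large
SELECT TOP (50) *
FROM dbo.JobDelta;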


Data Uploader is not loading

  • Check that the SQL connection to the database is healthy, and that the connection to the source system (iManage, ND) is healthy.

  • Check the event log table size in the database; it may be full.
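A quick way to check the size is the standard SQL Server sp_spaceused procedure; the table name below is illustrative, so substitute the actual event log table in the Data Uploader database:

-- Reports row count and reserved/used space for the named table
EXEC sp_spaceused N'dbo.EventLog';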


SQL Errors in DU

Check that the database has enough free space (both disk space and log space). If the database has maintenance plans in place, ensure they have run. (A query sketch for the space check appears at the end of this section.)

Check that the source database connection is healthy (iManage, ND).

Check whether the user running Data Uploader has permission to execute the scripts: take one of the SQL scripts and run it as-is against the database.
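As a sketch for the free-space check mentioned above, the following standard SQL Server query can be run against the database in question (FILEPROPERTY and sys.database_files are built in; no custom objects are assumed):

-- Size and used space per data/log file, in MB (files are measured in 8 KB pages, so divide by 128)
SELECT name,
       type_desc,
       size / 128 AS SizeMB,
       CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128 AS UsedMB
FROM sys.database_files;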


Check the error page for more errors

Let's Connect📌

☎ +1 630.598.1100
☎ ‪+44 20 3880 1550‬
📧 support@litera.com
💻 https://www.litera.com/support/

📝 Support is available:
4 am - 8 pm US Eastern (9 am - 1 am GMT/BST, 7 pm - 11 am AET)
on normal business days (excluding holidays)

© 2024 Litera