Summary:
This article details common integration errors and how to resolve them. These errors apply to most connectors. Refer to this article for troubleshooting steps if your integration fails to complete.
Note: The information discussed in this article is relevant to Solver Cloud and Private Host customers.
More Information:
Run Button in Integration is Greyed Out
When a user attempts to run an integration, the Run button is greyed out and cannot be clicked.
The Run button is greyed out for one of two reasons:
- No data source is selected.
- If this is a Gateway integration, verify the data source is selected.
- The integration is unpublished.
- Verify the integration is published on the Summary tab. If the tab shows a "Publish Integration" button, the integration is unpublished; click that button to publish it.
Troubleshooting File Import Integrations
- Period must be in YYYYMMDD format (see the validation sketch after this list).
- If the Period field is in any other format (e.g. YYYYMM, MM-DD-YYYY), the application will not recognize it and the integration will fail.
- Solver currently has a 100 MB limit for flat files.
- A file that exceeds the limit can still be uploaded to the integration, but depending on its size and contents it can cause issues, such as the integration failing to run.
- The data type for measures is numeric, so measure fields only accept numeric and decimal values.
- Decimals are acceptable (e.g. 1050.58).
- Thousands separators will not work (e.g. 69,000).
- Verify that General is the only data format used in the CSV/Excel file.
- It is recommended to click through each column to confirm no other data formats are in use, as Select All does not always show the data format accurately.
- Tip: Review the CSV/Excel file in Notepad or another text editor. Double quotes around a value indicate it is being treated as non-numeric, not as General format.
- If dimension loads are separate integrations from module data loads, ensure all dimension codes are loaded into the DW prior to running the module data load.
- If the error persists, keep only the headers and the first 10 rows of data, and delete the rest; repeat to narrow the search until the problem rows are found.
- Compare data (in CSV/Excel File) with integration mapping.
- Data loading into a dimension can be a string, text, or numeric as long as it does not exceed 50 characters. The target data type for dimension codes is nvarchar (50).
- Data loading into an attribute requires more attention: ensure the source data complies with the target data type. Verify the data type under Cloud > Data Warehouse > Configuration > Modules > select the target module.
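The checks in this list can be scripted before upload. Below is a minimal sketch in Python using only the standard library; the column names (Period, Amount, Account) and the file name upload.csv are hypothetical placeholders for your own headers. It flags a Period that is not YYYYMMDD, non-numeric measure values (including thousands separators), and dimension codes longer than 50 characters.

```python
import csv
import re

# Hypothetical column roles -- adjust these to match your file's headers.
PERIOD_COL = "Period"          # must be YYYYMMDD
MEASURE_COLS = {"Amount"}      # numeric only; no thousands separators
DIMENSION_COLS = {"Account"}   # nvarchar(50) target: max 50 characters

PERIOD_RE = re.compile(r"^\d{8}$")            # e.g. 20240131
NUMERIC_RE = re.compile(r"^-?\d+(\.\d+)?$")   # plain decimal, no commas

def check_file(path):
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        # Row 1 holds the headers, so data starts on line 2.
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            period = row.get(PERIOD_COL) or ""
            if not PERIOD_RE.match(period):
                problems.append((line_no, f"Period '{period}' is not YYYYMMDD"))
            for col in MEASURE_COLS:
                value = row.get(col) or ""
                if not NUMERIC_RE.match(value):
                    problems.append((line_no, f"{col} '{value}' is not numeric"))
            for col in DIMENSION_COLS:
                if len(row.get(col) or "") > 50:
                    problems.append((line_no, f"{col} exceeds 50 characters"))
    return problems

for line_no, message in check_file("upload.csv"):
    print(f"line {line_no}: {message}")
```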
Common Errors in File Import Integrations
Each entry below lists the scenario, the error message, its cause, and the solution.

Scenario: This error is returned in Task Details after uploading the CSV or Excel file in the integration.
Error Message: Error connection. Either there is no data source selected, or the Gateway isn’t running.
Cause: 1. Duplicate column headers. 2. A password on the imported file.
Solution: 1. Rename one of the column headers to get past the error. 2. Remove the password from the imported file (Excel only).

Scenario: Upon clicking Preview in Source Data.
Error Message: No Data to Preview / Timestamp error
Cause: Special characters in the file name or column headers.
Solution: Use only 0-9, A-Z (a-z), and underscores in the file name to eliminate issues caused by special characters.
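As a preventive step, a file name can be sanitized before upload. A minimal sketch, assuming replacing each disallowed character with an underscore is acceptable:

```python
import re
from pathlib import Path

def sanitize_name(path_str):
    """Replace every character outside 0-9, A-Z, a-z, _ in the file name."""
    path = Path(path_str)
    safe_stem = re.sub(r"[^0-9A-Za-z_]", "_", path.stem)
    return path.with_name(safe_stem + path.suffix)

print(sanitize_name("P&L 2024.csv"))  # -> P_L_2024.csv
```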
Scenario: The user is unable to delete the CSV or Excel file in the integration; the application returns this error.
Error Message: ERROR [object Object]
Cause: Special characters in the file name, e.g. P&L.
Solution: Copy the original integration: clicking the Duplicate button on the Integrations tab creates a new integration with the same mappings but without the file. Rename the file and upload it to the newly copied integration, then delete the previous non-working integration.

Scenario: Integration failed with this error.
Error Message: Error converting data type nvarchar to numeric/bigint. Line 4
Cause: "bigint" and "numeric" are the key indicators in this error message; "Line 4" is generic and does not point to the actual line 4 in the file. bigint points to a dimension issue (a code in the transactional data that does not exist in the respective dimension), while numeric or other data types indicate an issue with the attributes.
Solution: Compare the data in the CSV/Excel file with the integration mapping and correct the offending dimension codes or attribute values.

Scenario: The CSV file fails to upload when creating a new integration.
Error Message: Connecting to Remote Source Not Authorized
Cause: The encoding of the file is not supported.
Solution: Change the file's encoding to UTF-8.
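One way to re-encode a file to UTF-8, sketched in Python; the source encoding cp1252 is an assumption, so substitute the file's actual encoding:

```python
# Read with the file's current encoding, then rewrite as UTF-8.
with open("upload.csv", "r", encoding="cp1252") as src:
    content = src.read()
with open("upload_utf8.csv", "w", encoding="utf-8") as dst:
    dst.write(content)
```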
Scenario: Integration fails with the following error.
Error Message: The given key was not present in the dictionary.
Cause: A dimension's name has been altered and has not been updated in the integration's Task Mapping.
Solution: Remap the dimension from the CSV file to the renamed dimension. Note that either deleting the identifier in the task mapping or leaving it unchanged will also cause the integration to error out when publishing, because the integration then contains null values in a key identifier. Since the dimension was altered rather than replaced, reports should continue to function and pick up the new value.

Scenario: The event log provides the following error.
Error Message: Insert bulk failed due to a schema change of the target table
Cause: This is a general error, but it can occur when data is formatted improperly. For example, if the CSV file you are integrating has data missing under dimension headers such as Scenario or Account, it can cause this issue.
Solution: Check your data/CSV file for null values, type mismatches, or formatting issues; cleaning these up should make the error disappear.
Troubleshooting ERP Integrations
- First, check whether the issue is related to the source system:
- When on the Task Page, no source objects appear.
- When on the Task Mapping page, the “Preview” button does not pull data.
Either symptom points to an authentication issue or a connection issue. Review the web service credential to ensure it is valid and the password has not expired.
For some ERPs, you can verify the credential is valid by using the same credential to access the ERP system.
Verifying the Web Service Account
- Log into the Cloud Data Warehouse and go to Data Integrations.
- Select the failing integration and click Edit.
- You should now be on the Task Manager page. Click Connector.
- The application should direct you to the initial login page; your screen may look different depending on the connector you are using.
- If the password field is empty, fill in that field and click Save.
- If the credential is invalid, the application will return an error. Once you update the credential to use the correct information, you will be able to get past the login screen and re-establish the connection to the ERP system.
Additionally, you can check the developer tools for a detailed error. Press F12 on your keyboard to open the developer tools in your web browser.
Common Errors in ERP Integrations
Each entry below lists the error message, its cause, and the solution.

Error Message: Error occurred when saving change: The Attributes ______ exist in the database
Cause: The same object name appears twice.
Solution: Rename the duplicate names to ensure all objects are unique.

Error Message: Error occurred saving changes: An item with the same key has already been added
Cause: Duplicate object names.
Solution: Rename the duplicate names to ensure all objects are unique.

Error Message: Cannot insert duplicate key row in object ‘meta.ModuleAttribute’ with unique index ‘UIX_ModuleAttribute_ModuleId_Label’. The duplicate key value is (2, Credit). The statement has been terminated.
Cause: Duplicate object names.
Solution: Rename the duplicate names to ensure all objects are unique. If the field does not exist in GL, also check GL Detail for duplicate labels.

Error Message: Invalid column name ‘UDF002’ (or any UDF)
Cause: Duplicate attribute labels.
Solution: Use the web browser's built-in dev tools (F12) to find out which field the UDF is, then rename the duplicate.

Error Message: Cannot drop the table 'f_Trans_GL_temp', because it does not exist or you do not have permission. There is already an object named 'f_Trans_GL_temp' in the database.
Cause: Multiple integrations are hitting the same target table at the same time. (This error should go away in a future release, as our engineers are working to drop temp tables after each integration run.)
Solution: Space the integrations out by at least a 5-minute interval, or however long the previous integration needs to complete.

Error Message: Object reference not set to an instance of an object
Cause: The connection to the data source may have been lost.
Solution: Verify the source object can render in Task Manager. For Gateway integrations, confirm the Gateway is running; for cloud ERPs, verify the web service credential is still valid.

Error Message: Timeout expired after 240 Minutes
Cause: The integration run exceeded the 240-minute limit.

Error Message: Error connecting. Either there is no data source selected, or the Gateway isn’t running.
Cause: The connection was lost (Gateway error).
Solution: Verify a data source is selected and, for Gateway integrations, that the Gateway is running.

Error Message: Violation of UNIQUE KEY constraint ‘UQ_fTrans_OT12’. Cannot insert duplicate key in object ‘dbo.f_Trans_OT12’. The duplicate key value is (……..). Line 4.
Cause: Data is not unique.
Solution: Ensure each transaction is unique.
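To find the offending rows before loading, you can check the source file for duplicate key combinations. A minimal pandas sketch; the key columns and file name are hypothetical and should match the fields behind the unique constraint in your module:

```python
import pandas as pd

# Hypothetical key columns -- use the fields behind the UNIQUE constraint.
KEY_COLS = ["Period", "Account", "TransactionNo"]

df = pd.read_csv("transactions.csv", dtype=str)
dupes = df[df.duplicated(subset=KEY_COLS, keep=False)]
if not dupes.empty:
    print(dupes.sort_values(KEY_COLS))  # every row involved in a collision
```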
Error Message: Cannot insert the value NULL into column 'Account', table …..dbo.f_Trans_GL'; column does not allow nulls. INSERT fails. Line 4
Cause: The source data contains a value for the referenced dimension that is NULL, blank, or does not yet exist in the DW.
Solution: Run the dimension load prior to running the module load to ensure all dimension codes are up to date in the DW.
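Before running the module load, you can confirm that every dimension code referenced by the fact file already exists in the dimension load. A minimal sketch with hypothetical file and column names:

```python
import csv

def column_values(path, column):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f)}

# Hypothetical files/columns: dimension load vs. module (fact) load.
dim_codes = column_values("accounts_dimension.csv", "Code")
fact_codes = column_values("gl_transactions.csv", "Account")

missing = sorted(fact_codes - dim_codes)  # codes the DW won't know yet
if missing:
    print("Load these codes into the dimension first:", missing)
```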
Error Message: Converting data type nvarchar to bigint
Cause: A dimension code that exists in your transactional data does not exist in the respective dimension.
Solution: Run the dimension load so the missing codes exist in the DW before running the module load.

Error Message: Error converting data type nvarchar to numeric
Cause: One of your numeric mappings received data that is not a valid number. Solver supports 12 characters to the left of the decimal and 16 characters to the right. For example, if the source system provides 1.02E17 as a value, it is invalid because "E" is not a number and will error out.
Solution: Ensure the source provides plain numeric values within the supported precision.
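A small sketch of that validity check, assuming the 12/16-digit limits described above:

```python
import re

# Plain decimal with at most 12 digits before and 16 after the point;
# scientific notation such as 1.02E17 deliberately fails this pattern.
VALID_NUMERIC = re.compile(r"^-?\d{1,12}(\.\d{1,16})?$")

for value in ["1050.58", "1.02E17", "69,000"]:
    print(value, "->", bool(VALID_NUMERIC.match(value)))
```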
Error Message: Error converting from nvarchar to datetime
Cause: The data passed to your datetime attribute is not a valid datetime. Datetimes should match your regional setting (US: MM/DD/YYYY). For example, if the date passed is 13/01/2020, month 13 is invalid, so you will receive this error.
Solution: Ensure valid datetime values are coming in from the source system.
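A quick way to test incoming values against a regional format (US MM/DD/YYYY assumed here):

```python
from datetime import datetime

def is_valid_us_date(value):
    try:
        datetime.strptime(value, "%m/%d/%Y")  # US regional format
        return True
    except ValueError:
        return False

print(is_valid_us_date("01/13/2020"))  # True: January 13
print(is_valid_us_date("13/01/2020"))  # False: month 13 does not exist
```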
Error Message: Conversion failed when converting date and/or time from character string.
Cause: Typically refers to an incorrect datetime/date/Period field.
Solution: Ensure Period is in YYYYMMDD format. Date fields in the DW accept the datetime or Date data type.

Error Message: String or binary data would be truncated. The statement has been terminated.
Cause: The source data exceeds the character limit the target field can accept. Long Text in the DW accepts up to 512 characters; a value past this limit results in this error.
Solution: Shorten the offending source values so they fit within the target field's limit.

Error Message: The given key was not present in the dictionary.
Cause: As in the file-import table above, a dimension's name was altered without updating the integration's Task Mapping.
Solution: Remap the dimension in the Task Mapping to the renamed dimension.

Error Message: Cannot perform runtime binding on a null reference
Cause: The Solver application cannot find the source object due to a duplicate label in the JSON file.
Solution: Rename the duplicate label so all labels are unique.

Error Message: Elastic Pool has reached its storage limit. The storage usage for the elastic pool cannot exceed (Amount of Megabytes)
Cause: The Azure SQL Server that hosts your database has reached its storage limit.
Solution: Contact Solver Support.

Error Message: The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.
Cause: The dimension integration has duplicate keys in the code field.
Solution: 1. Use a different source field that has unique values; Solver does not apply a DISTINCT function, and unique fields are typically ID fields on the source side. 2. Alternatively, in your target code field, use INCREMENT()/2 in the expression builder and set your description field to the ID you want to use in a staging dimension; then run a rule to update the module. If you need assistance with this, contact Solver Support.
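For option 1, you can verify that a candidate source field is genuinely unique before mapping it to the code field. A minimal pandas sketch with hypothetical file and column names:

```python
import pandas as pd

df = pd.read_csv("dimension_source.csv", dtype=str)

# Count how many source rows collide on each candidate code field.
for candidate in ["Code", "Id"]:  # hypothetical columns
    collisions = df[candidate].duplicated().sum()
    print(f"{candidate}: {collisions} duplicate value(s)")
```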
Error Message: D365 BC OAuth request Response Error: ErrorCode: Internal_RecordNotFound. Value does not exist. Identification fields and values: Dimension Code = [BC CODE]....
Cause: The integration worked before, and a recent error appears with no changes on the Solver side. This happens when a field name or value is reclassed in Business Central.
Solution: 1. Ensure there are no dimension set IDs that are all 0s. 2. Ensure no dimension codes were renamed, as Solver would then be calling a code that does not exist. 3. For further issues, contact your BC consultant and work with Solver.