Question

Reading/writing from Salesforce


I would like to be able to use FME to bulk upload data to Salesforce. We currently upload CSV files manually, aware that there is a limit of 50,000 rows per file (Salesforce import wizard). I have seen that it should be possible to automate this using the Salesforce reader/writer. I have the API credentials for Salesforce and I can see all the table objects available in FME. The issue is that the object we are trying to write to appears to have different fields than expected when read (these look to be metadata fields) - see image. When we run the process to manually import using the wizard, the attributes are those in the uploaded CSV (see image). Both of these tables are the custom table 'property(_c)'; the Salesforce URL indicates that this is the case. Do you need to change permissions on the object on the Salesforce side, or do anything specific with the reader/writer in FME, to get this working?

 

Kind regards,

 

Antony

[Screenshots: property_c_fields, Metadata_fields]


16 replies


Hi @ant_r_upp​ ,

The maximum size of a Salesforce write batch with the Salesforce Writer in FME is 10,000 rows or 10 MB, whichever occurs first. As a result, you'll likely want to use a Grouper and a FeatureWriter. By doing this, you can easily create groups of 10,000 and utilize the Group Processing parameter in the FeatureWriter to create a separate batch job for each 10,000 records you Insert/Upsert/Update/Delete into your Salesforce org.
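Outside of Workbench, the batching idea itself is simple; here is a minimal Python sketch of the grouping the Grouper performs (illustrative only, not FME internals):

```python
def batches(records, size=10_000):
    """Yield successive chunks of at most `size` records, mirroring
    what the Grouper + FeatureWriter group processing achieves."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Example: 25,000 records become batches of 10,000 / 10,000 / 5,000.
rows = [{"Name": f"Property {i}"} for i in range(25_000)]
for i, batch in enumerate(batches(rows), start=1):
    print(f"Batch {i}: {len(batch)} records")
```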

As for the attributes, can you send a screenshot of the Fields & Relationships tab in the Object Manager from the Salesforce setup? From the reader feature type parameter dialog it looks like you are missing attributes that are visible in your Salesforce web UI. FME is able to read and write to custom objects as well as custom fields in Salesforce. However, since Salesforce is essentially a relational DB, you may have attributes (columns) visible in the web UI that belong to another object. The Object Manager should tell you the exact field name (custom fields are suffixed with __c) that you should use in the writer feature type. Otherwise, you could use the Import from Dataset option when adding the writer to the canvas and connect to your desired object from the import wizard.

 

If you could confirm from the Object Manager, that would be great. Let me know how it goes.

Thanks Chris, really appreciate the information you have shared here. It turned out we had to change the permissions on the custom attributes to make them visible to the reader. Thanks for the tips on the Grouper; we're going to give it a test run on Monday. One question that remains: in the manual import wizard there is a specific checkbox to kick off workflow rules and processes. Is there an option in FME to ensure our processes are triggered for new and updated records?

 

[Screenshot: salesforce_upload_trigger]


Hi @ant_r_upp​ ,

To trigger an FME workspace to run automatically based on an event that takes place in Salesforce, I believe you'd need to make use of FME Server.

In FME Server, create an Automation that contains a webhook trigger followed by your workspace action. In Salesforce, I believe you should be able to create a workflow rule to send a message to this webhook URL every time there is a new or updated record. We don't have a tutorial on setting this up in Salesforce, but we have an example of setting up a webhook with Survey123, for which parts 2 & 3 would still be applicable, so this is a good starting point.
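For a feel of what that wiring looks like, here's a rough Python sketch of the message the Salesforce side would send to the Automation's webhook (the URL and payload below are hypothetical placeholders; your Automation provides the real URL):

```python
import requests

# Hypothetical webhook URL copied from the FME Server Automation (yours will differ).
WEBHOOK_URL = "https://fmeserver.example.com/your-automation-webhook-url"

# Minimal example payload describing the record event (the shape is up to you).
payload = {"object": "property__c", "recordId": "a011x000003DGb2AAG", "event": "updated"}

resp = requests.post(WEBHOOK_URL, json=payload, timeout=30)
resp.raise_for_status()  # the Automation run is triggered on receipt
```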

 

If you don't have access to FME Server, I'd suggest setting your workspace up to run on a schedule using Windows Task Scheduler and having your workspace check the database for new or updated records. This won't be a real-time solution, so you wouldn't be able to make use of the workflow rules in Salesforce.
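The polling check itself could look something like this (a sketch assuming the object's API name is property__c and that you already have an OAuth access token; the instance URL and token are placeholders):

```python
import requests
from datetime import datetime, timedelta, timezone

INSTANCE = "https://example.my.salesforce.com"  # placeholder instance URL
TOKEN = "YOUR_ACCESS_TOKEN"                     # placeholder OAuth access token

# Look for records modified since the last scheduled run (here: the past hour).
since = (datetime.now(timezone.utc) - timedelta(hours=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
soql = f"SELECT Id, Name FROM property__c WHERE LastModifiedDate > {since}"

resp = requests.get(
    f"{INSTANCE}/services/data/v51.0/query",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": soql},
)
resp.raise_for_status()
print(resp.json()["records"])  # the new/updated records to process
```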

Hi Holly, the requirement wouldn't be to trigger an FME workspace based on an event that takes place in Salesforce; it would be to trigger processes in Salesforce after an FME workspace has been run. As you can see in the screengrab above, when using the manual data upload wizard in Salesforce there is a checkbox that can be toggled on and off.


Hi @ant_r_upp​, I have another answer; it may or may not be the one you are looking for:

 

You will have to use the REST API and send the assignment ID in the headers of the request: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/create_job.htm
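One way to sketch this is with the Bulk API 2.0 job-creation endpoint, which takes the rule ID as an assignmentRuleId field in the job body rather than a header; per the linked docs, assignment rules apply to Case/Lead objects. All IDs and URLs below are placeholders:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder instance URL
TOKEN = "YOUR_ACCESS_TOKEN"                     # placeholder OAuth token

# Create a Bulk API 2.0 ingest job; assignmentRuleId is the optional field
# that corresponds to running assignment rules (Case/Lead objects only).
job = requests.post(
    f"{INSTANCE}/services/data/v51.0/jobs/ingest",
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
    json={
        "object": "Lead",                      # assignment rules apply to Lead/Case
        "operation": "insert",
        "contentType": "CSV",
        "assignmentRuleId": "01QPLACEHOLDER",  # hypothetical assignment rule ID
    },
)
job.raise_for_status()
print(job.json()["id"])  # job ID used when uploading the CSV batches
```

For workflow rules on custom objects, the Apex trigger suggestion below is likely the better fit.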

Otherwise, you could likely set up an Apex trigger on the object to process any inserts/updates: https://developer.salesforce.com/forums/?id=9060G000000UUY4QAO

 

I personally do not have experience in SF and do not know the difference between checking and unchecking that checkbox in the wizard. You may need to go to the Salesforce community for your answer regarding the SF wizard.

That's really handy; that will hopefully help us kick off the processes we need.

 

We're currently running this as a POC to understand the viability of using FME as a long-term solution for our data flow into Salesforce. I think this is likely another question for our IT partners (we have an external partner who manages our Salesforce infrastructure), but our API user appears to only be able to read, not write, at the moment. When running the workspace we currently get a 500 error in the FME logs:

 

[1]: Received HTTP response header: 'HTTP/1.1 500 Server Error' from 'https://test.salesforce.com/services/Soap/u/51.0'

Salesforce: HTTP 500: INVALID_LOGIN: Invalid username, password, security token; or user locked out.

SALESFORCE writer: An error has occurred. Check the logfile above for details.

 

I'm assuming that on the Salesforce side we will have to extend the user's privileges to read and write.

 

I think we have 5 days left of the trial; can you at least confirm that this is something that has been done in the past using FME and a Salesforce API user (read, insert, delete, update)?


Hi @ant_r_upp​ ,

Yes, FME can both read and write, assuming sufficient privileges are granted to the user you are authenticating with. This can be accomplished using the basic username, password, and security token with the Salesforce Reader and Salesforce Writer, or using a different authentication type like OAuth 2.0 or a token when interacting with Salesforce via the HTTPCaller and the Salesforce REST API. Please note, you shouldn't have to change the host URL in the Salesforce reader or writer - the default login.salesforce.com should be fine in most cases.
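If you go the REST API route, the token exchange is roughly this (a sketch of the username-password OAuth flow, which requires a connected app; every credential below is a placeholder):

```python
import requests

# Username-password OAuth flow; the client_id/client_secret come from a
# connected app in Salesforce. All values here are placeholders.
resp = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "YOUR_CONSUMER_KEY",
        "client_secret": "YOUR_CONSUMER_SECRET",
        "username": "api.user@example.com",
        # Salesforce expects the security token appended to the password.
        "password": "YourPassword" + "YourSecurityToken",
    },
)
resp.raise_for_status()
token = resp.json()["access_token"]
instance = resp.json()["instance_url"]  # use this host for subsequent REST calls
```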

 

If you are still running into issues after permission is granted, please refresh your security token and ensure you are pasting the latest security token into the reader/writer parameters, as the error message you are currently seeing is most commonly caused by an invalid token/password combo.

Hi Chris,

We still have not been able to get the Salesforce writer to work. The privileges have been altered on the Salesforce API user to allow write access in our UAT Salesforce environment, but we now receive a 404 Not Found error.

 

Received HTTP response header: 'HTTP/1.1 404 Not Found' from 'https://upp-com--enxuat.my.salesforce.com/services/data/v51.0/sobjects/property_c/describe/' Salesforce: HTTP 404: The requested resource does not exist.

 

This is interesting, as we are able to read from the same table we're attempting to write to, so we know the table exists (it's called property_c - it's a custom table). I guess it is different from the initial 500 error we had experienced prior to this, so perhaps this means the access is OK now and the issue is with the call in the writer.

 

The URL it is using to write seems to reflect what is shown here: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_sobject_create.htm
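One sanity check we're considering is listing the object API names the REST API actually exposes to our user, since a 404 on /sobjects/<name>/describe can simply mean the API name is wrong (custom objects normally end in a double underscore, __c). A rough sketch with placeholder credentials:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder instance URL
TOKEN = "YOUR_ACCESS_TOKEN"                     # placeholder OAuth token

# List every object visible to the authenticated user and print the custom ones.
resp = requests.get(
    f"{INSTANCE}/services/data/v51.0/sobjects/",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for obj in resp.json()["sobjects"]:
    if obj["custom"]:
        print(obj["name"])  # the exact API name to use in the writer feature type
```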

 

As recommended above, we are using a FeatureWriter and a Grouper together to break the data into smaller chunks to write with. Do you have any further recommendations?

 

I am really keen to get FME to work for this POC. I have not used it with Salesforce before, so I am new to that form of data storage. If there is any instruction you could provide for the correct Salesforce writer settings and Salesforce user privileges, it would be incredibly helpful.

 

Thanks,

 

Antony

Also currently exploring whether this is related: https://fek.io/blog/getting-a-404-for-salesforce-rest-api-resource-that-exists

 

Do you need to register FME somewhere as a connected App?


Hi @ant_r_upp​ ,

I've only had to register a connected app when using the Salesforce REST API. For the Reader/Writer you shouldn't have to configure anything on the Salesforce side aside from user permissions.

 

Could you try reading one feature from the property_c object, removing the Salesforce-generated attributes (i.e. created/updated date, ID, etc.) and writing that feature with the Salesforce writer? Does that run successfully? If you look at the event log on the Salesforce side, does it provide any more details about what caused the failure? I have found the event log helpful in the past (unfortunately it is no longer in the Salesforce Setup from what I can tell, so you may need to use the developer console to query the eventlog object).
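In plain Python terms, the attribute-stripping step amounts to something like this (the field list below is the usual set of system fields; treat it as an assumption and adjust for your object):

```python
# System fields Salesforce populates itself; drop them before writing back.
SYSTEM_FIELDS = {"Id", "CreatedDate", "CreatedById",
                 "LastModifiedDate", "LastModifiedById", "SystemModstamp"}

def writable(record: dict) -> dict:
    """Return a copy of a record without the Salesforce-generated attributes."""
    return {k: v for k, v in record.items() if k not in SYSTEM_FIELDS}

print(writable({"Id": "a01PLACEHOLDER", "Name": "Test property",
                "CreatedDate": "2022-08-30T13:38:08Z"}))
# -> {'Name': 'Test property'}
```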

 

Additionally, are you able to write to any other objects in Salesforce? It would be helpful to narrow down whether it's just an issue with this object or all objects.

 

Alternatively, you could try writing the data as a CSV and uploading it with the Salesforce REST API (this would require you to set up a connected app). You can do so by using a FeatureWriter configured to the CSV format, and then connecting the FeatureWriter to an HTTPCaller to upload. This can be helpful in the event you are using master-detail relationship attributes, as some of those attribute names need to be escaped, which the CSV writer can do by wrapping them in "quotes". Please let me know if you are using master-detail relationship fields, as I can provide more info if needed.
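The quoting itself is just standard CSV behaviour; here is a small Python sketch of writing the upload file with everything quoted (the relationship field name is a hypothetical example):

```python
import csv
import io

# Quote all headers and values, which covers relationship field names
# such as the hypothetical Parent__r.External_Id__c below.
rows = [{"Name": "Test property", "Parent__r.External_Id__c": "EXT-001"}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys(), quoting=csv.QUOTE_ALL)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
# "Name","Parent__r.External_Id__c"
# "Test property","EXT-001"
```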


Hi,

I've had the same issue and had to use the Grouper (our table has 70k+ records). I've used the Grouper and FeatureWriter and it works until writing 30k records, but then I encounter this error:

Duplicate FeatureWriter_MPS_Billing_Account__c_0_DEF group entry in correlation table -- MPS_Billing_Account__c

Program Terminating

Translation FAILED.

 

It says it has written 10,000 features each time, but on the 4th iteration, that is the error that I get.

Any ideas, or have you encountered this before? Thank you


Hi @crstnaiw​ ,

 

Could you try using a Sampler, sampling the first 30,000 features, followed by a second Sampler (first 10,000 features), to confirm whether this is an issue with the Group By parameter or a feature/value inside that batch of features?


Hello @chrisatsafe​ 

I tried it and the 10k records from Sampler2 were written to Salesforce.

Does this mean there is a problem with the group by?



Hi @crstnaiw​ - it sounds like it. I'll convert your comment to a case so our support team can take a closer look. In the meantime, you could replace the Salesforce FeatureWriter with a CSV FeatureWriter and connect an HTTPCaller to upload directly with the Bulk API: https://developer.salesforce.com/docs/atlas.en-us.244.0.api_asynch.meta/api_asynch/create_job.htm

https://developer.salesforce.com/docs/atlas.en-us.244.0.api_asynch.meta/api_asynch/datafiles_understanding_bulk2_ingest.htm

 

We also show an example of this in the Getting Started with the Salesforce REST API demo on the Blazing New Trails with Salesforce Data webinar.
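For reference, the end-to-end Bulk API 2.0 ingest flow those links describe looks roughly like this in Python (the object name is taken from the error above; the instance URL, token, and file name are placeholders):

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder instance URL
TOKEN = "YOUR_ACCESS_TOKEN"                     # placeholder OAuth token
AUTH = {"Authorization": f"Bearer {TOKEN}"}
BASE = f"{INSTANCE}/services/data/v51.0/jobs/ingest"

# 1. Create the ingest job.
job = requests.post(
    BASE,
    headers={**AUTH, "Content-Type": "application/json"},
    json={"object": "MPS_Billing_Account__c", "operation": "insert",
          "contentType": "CSV"},
)
job.raise_for_status()
job_id = job.json()["id"]

# 2. Upload the CSV produced by the FeatureWriter to the job's batches endpoint.
with open("billing_accounts.csv", "rb") as f:  # placeholder file name
    upload = requests.put(
        f"{BASE}/{job_id}/batches",
        headers={**AUTH, "Content-Type": "text/csv"},
        data=f.read(),
    )
upload.raise_for_status()

# 3. Mark the job UploadComplete so Salesforce starts processing it.
requests.patch(
    f"{BASE}/{job_id}",
    headers={**AUTH, "Content-Type": "application/json"},
    json={"state": "UploadComplete"},
).raise_for_status()
```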


Thank you, I'll give that a try
