In a Nutshell
Consider the following situation: you want to leverage applications from the SAP ecosystem, but there is no connector available to sync data between your system and SAP Cloud.
No worries, we’ve got your back. You can use the Data API 4 to write a connector that is able to exchange data between your system and SAP Cloud. This way, you can upload data to the Cloud and have that information available for your field service engineers on their mobile devices.
This can also be used to save all data that is added by your field service engineers or dispatchers through SAP client applications back to your external system.
The rest of this page is addressed to software engineers who want to use Data API 4 to create a connector between their data management system and SAP Cloud.
In the next sections we will build a use case and walk through the synchronization flows needed for a connector, with some sample requests.
In order to start using the API for your company, you will have to set up your account, know the data model you will be using, and decide how you will map your data to our model. Since Data API 4 is a RESTful API, we will use the term resource whenever we refer to an object type.
Since the best way to understand a tool is to use it, we recommend you create a trial account (use the standalone company type and remember the company name you set; we will need it later) so that you can experiment while you read through the documentation. If you have already created a trial account, you will now have a standalone company with demonstration data.
ECC or SAP Business One Field Service Management account
To create an SAP Field Service Management account that can be connected to SAP ECC or SAP Business One, you need to log in to the Admin app with the ADMINISTRATOR role. From the Admin app, you can create an empty account.
After creating the account, please do not make any changes to it. In particular, do not create a company. The company and all other necessary data will be created automatically by the ERP Connector during the first synchronization.
This is the first step toward an SAP Field Service Management account connected to an ERP system. For ERP installation, connector installation, connector configuration, and the initial synchronization, please contact our consultants or support.
User for Connector
We recommend that you use a dedicated user for the connector and that you set that user's Delete permissions to ALL for all resources that you plan to sync. That way the user is allowed to perform all the operations needed on the resources, and it will also be easier to distinguish changes done by other users from changes done by the connector itself.
The Use Case
Alright, let’s imagine your company uses AwesomeERP and you want to integrate some of your data models with the SAP ecosystem to provide mobility and knowledge to your field service engineers.
We will need to synchronize some BusinessPartners and also some ServiceCalls so the technicians know what to work on.
First of all, let’s upload some data to SAP. We will create two Business Partners (a.k.a. BP) and one Service Call (a.k.a. SC).
Data API 4 is a RESTful API, so all operations are resource-oriented. Since completing a flow would involve many calls, we provide a Batch API that allows you to send multiple calls in one request. The call would look like this:
Creating Objects Sample Request
Creating Object Sample Response
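As an illustration only, a batch payload for two Business Partners and one Service Call could be assembled like the following sketch. The request structure, DTO names, and field names here are assumptions for illustration; consult the Data API 4 reference for the exact shapes.

```python
import json

# Hypothetical batch payload: two BusinessPartners and one ServiceCall.
# DTO names and field names are illustrative assumptions, not the exact API shape.
batch = {
    "requests": [
        {"method": "POST", "dtoName": "BusinessPartner",
         "body": {"externalId": "BPExtId1", "name": "Customer One"}},
        {"method": "POST", "dtoName": "BusinessPartner",
         "body": {"externalId": "BPExtId2", "name": "Customer Two"}},
        # The ServiceCall references a BusinessPartner by its externalId,
        # so the connector never needs the cloud identifier.
        {"method": "POST", "dtoName": "ServiceCall",
         "body": {"externalId": "SCExtId1", "businessPartner": "BPExtId1",
                  "subject": "Broken pump"}},
    ]
}
payload = json.dumps(batch)
```

The key point the sketch demonstrates is that every object carries an externalId of your choosing, and references between objects use those external ids.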
TIP: If you forgot the name of your company, you will see it in the top left corner when you log in to my.coresuite.com. This is also where you can find your base URL.
DTO Version Used
You need to use DTO versions of your resources that extend SyncObject_V10. These DTO versions contain fields (like externalId and lastChangedBy) that are needed for proper connectors.
As you can see, we specified the externalId we want to use for those objects. This is very important: you can work using only those external ids, without needing to store any mapping between the cloud identifiers and your AwesomeERP identifiers.
Another nice thing is that you can reference other resources by their externalId, as you can see from the Service Call example.
Notice that when you use batching, the requests are processed sequentially in the order in which you sent them, and by default they are treated as one big transaction. If, for example, the Service Call can’t be saved, then all the changes are rolled back.
Creating Objects with Error Sample Request
Creating objects with Error Sample Response
As you can see, there was an error for the SC, since the referenced BP does not exist. As a result, none of the data was saved. Before you can successfully upload data to the Cloud, you must ensure that your references are valid. That way we can maintain data consistency and keep the connector’s responsibility clear.
You can quickly verify that the BP was not saved by doing a GET call by externalId like this:
Simple GET by externalId
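As a minimal sketch, the URL for such a GET by externalId might be built as follows. The path segments and parameter names are assumptions; replace the cluster placeholder with your base URL and check the API reference for the exact route.

```python
from urllib.parse import urlencode

# Hypothetical URL for fetching a BusinessPartner by its externalId.
# The path and parameter names are illustrative assumptions.
base = "https://<cluster>.coresuite.com/api/data/v4"
params = {"externalId": "BPExtId2", "useExternalIds": "true"}
url = f"{base}/BusinessPartner?{urlencode(params)}"
```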
However, during an initial sync you may want to sync all the business partners and save them in the cloud in one go. For this you can use a Batch call and pass the transactional = false parameter. This allows us to save everything we can, while providing you with an individual response for each request, so you can check what was successful and what wasn’t.
Create Non Transactional Objects with Errors
Create Non Transactional Objects with Error Response
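Assuming the batch response contains one entry per request (the exact response shape is an assumption here), the connector can inspect each status individually, as in this sketch:

```python
# Non-transactional batch: with transactional=false, each request is
# committed or rejected independently. The per-request response shape
# below is an illustrative assumption.
responses = [
    {"externalId": "BPExtId1", "status": 201},  # created
    {"externalId": "BPExtId2", "status": 400},  # e.g. invalid reference
]
succeeded = [r["externalId"] for r in responses if r["status"] < 400]
failed = [r["externalId"] for r in responses if r["status"] >= 400]
```

The connector can then retry or log the failed entries without losing the successful ones.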
In this case, the first BusinessPartner was created and saved in the cloud, while the second one was not.
To be or not to be Transactional
We recommend you leave the batch request as one big transaction when sending upstream data that depends on each other. Non-transactional batch requests should be used only when you don’t care about the consistency of the data and want as many objects as possible available in SAP.
Now we have some data in the cloud. You can use our iOS or Android app, or go directly to my.coresuite.com, and browse your data. When you log in, use a different user for the applications to simulate a real situation. For example, we used user4 for the connector and user5 for my.coresuite.com.
Retrieving created or modified resources in a certain timeframe
Now let’s say that the field service engineer (user5) creates a new Service Call. You want to be able to see the changes and save them in AwesomeERP.
Our current API supports only interval sync, so in order to get modified data you will need to query for objects modified within a specified time frame. We do not support notification-style sync, where you could register for updates and receive them when something happens; you will have to poll the SAP Cloud periodically for modified data.
So, how do we do it? The following call will return all the created/updated Service Calls.
Get Changed ServiceCalls from last 10 minutes
Response with Last Modified Service Calls
In this example we use the query parameter to request objects that were modified by other users within the given time frame. Also, by specifying useExternalIds = true, the references in the response are filled with externalIds for easier identification in your system.
You also need to specify the time interval in the query, filtering on the lastChanged field. The long value that you see there is the number of milliseconds since January 1, 1970, 00:00:00 GMT (in Java, this is equivalent to Date.getTime()).
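As a small sketch, the epoch-millisecond bounds for a ten-minute window can be computed like this. The query string syntax shown is an assumption; the exact operators are documented in the Query API reference.

```python
import time

# lastChanged values are epoch milliseconds (Java's Date.getTime()).
now_ms = int(time.time() * 1000)
window_ms = 10 * 60 * 1000  # 10 minutes
since_ms = now_ms - window_ms

# Hypothetical query fragment; check the Query API docs for exact syntax.
query = f"lastChanged > {since_ms} and lastChangedBy != me()"
```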
As you can see in the response, you can also find the externalId of the BP and thus resolve the reference in the AwesomeERP system. The externalId field for this Service Call is null, and you will need to fill it in.
Handling New Data from SAP Cloud
After getting new or modified data from SAP Cloud, if the externalId of the resource is null, you will need to update it on the cloud after saving the resource in your external system. This way you will be able to maintain data consistency.
Cloud Transaction Boundaries
Suppose a technician works remotely for a week and then syncs all the data back to the cloud. This sync might take 5-10 minutes. What happens in the cloud?
The Cloud starts a transaction at 15:00, creates the objects, and at 15:05 the transaction finishes successfully and everything is committed. At this point, new objects appear in the cloud that were created 5 minutes ago. Their lastChanged will be set to the beginning of the transaction, that is, 15:00.
How does this affect you?
In a normal sync flow, you would have a sync sequence like this:
t1: [0, t1], t2: (t1, t2], t3: (t2, t3], … where each ti = ti-1 + 1 minute. This means that at t1 you get all the data from the beginning up to the t1 moment, at t2 you get all the data since t1 up to the t2 moment, and so forth.
Now, because some transactions might last about 5 minutes, with the previous sync sequence you will lose some object updates. At t3 you won’t see objects saved in a transaction that started in the (t1, t2] interval and finished in the (t2, t3] interval, because those objects appear as having been changed in the (t1, t2] interval.
There are a few options to overcome this limitation.
Option 1. Use sync intervals that overlap each other
The sequence would look like this
t1: [0, t1], t2: (t1 - C, t2], t3: (t2 - C, t3], ... where ti = ti-1 + 1 minute and C is a constant, let’s say 5 minutes.
Now if you sync at t3 and an object was saved in a transaction that started in the (t1, t2] interval and finished in the (t2, t3] interval, you will get that change.
However, another limitation appears here: you will receive the same object update in multiple sync requests, and you will have to detect whether you have already handled that update.
For this, the connector could use a history table where you store the externalId and the cloud lastChanged value, or simply compare the cloud object with the one from the ERP to see whether you need to process it.
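The overlapping-interval idea plus the duplicate check can be sketched as follows. The constant names and the in-memory "history table" are illustrative; a real connector would persist this state.

```python
OVERLAP_MS = 300_000  # C = 5 minutes, large enough to cover long transactions

def next_interval(prev_end_ms, now_ms):
    """Return the (start, end] interval for the next sync, shifted back by C."""
    return (prev_end_ms - OVERLAP_MS, now_ms)

# Minimal stand-in for a history table: externalId -> lastChanged handled.
seen = {}

def needs_processing(external_id, last_changed_ms):
    """Skip updates the overlap already delivered in an earlier sync."""
    if seen.get(external_id) == last_changed_ms:
        return False
    seen[external_id] = last_changed_ms
    return True
```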
Option 2. Use disjoint sync intervals that lag behind the present moment
The sequence would look like this
t1: [0, t1 - C], t2: (t1 - C, t2 - C], t3: (t2 - C, t3 - C], ... where ti = ti-1 + 2 minutes and C is a constant, let’s say 5 minutes.
Now you won’t get any duplicates, but you won’t have the latest data: at 15:05 you will read data that changed in the cloud between 14:58 and 15:00.
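Option 2 can be sketched in the same style: each interval ends a constant C before "now", so any transaction that started within it has already committed by the time the interval is queried.

```python
LAG_MS = 300_000  # C = 5 minutes

def lagged_interval(prev_end_ms, now_ms):
    """Return the next (start, end] interval, always ending C before now."""
    return (prev_end_ms, now_ms - LAG_MS)
```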
It’s up to you to decide which solution fits your needs, or to come up with a different one. Meanwhile, we are working to remove the limitations imposed by transaction boundaries so that you can use a normal, simple sync flow.
Let’s see how we can update this newly created object on the cloud to add an externalId to it. In this particular case, we will need to use the cloud identifier, because we want to tell the SAP Cloud: “Hey, link your object to my object.” In order to do this, we can make a call like the following:
Update Cloud Object with externalId
We recommend you use the PATCH method. This way you can pass just the data that you are interested in replacing, and the SAP Cloud will merge the resource identified by the provided id with the information you supplied.
There is another trick to this request: the forceUpdate = true flag indicates that you want the resource to be updated with that externalId no matter whether any changes were made in the meantime. Basically, you skip the mechanism that checks for concurrent modifications.
It’s up to you whether you want to detect that an object was modified since you pulled the changes. If so, you will need to also pass the lastChanged field with the same value that you received from the cloud for that object, and drop the forceUpdate flag. The request will appear as follows:
Update Cloud Object with externalId
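As an illustrative sketch, the concurrency-checked PATCH body echoes back the lastChanged value previously read from the cloud instead of using forceUpdate. The field placement shown is an assumption; the timestamp is a made-up example value.

```python
import json

# Hypothetical PATCH body for a concurrency-checked update: the lastChanged
# value (illustrative here) must match what the cloud last returned.
patch_body = json.dumps({
    "externalId": "SCExtId2",
    "lastChanged": 1700000000000,  # value previously returned by the cloud
})
```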
In the event of a concurrent modification error, you will get a response like the following:
Concurrent Modification Response
Alright, we learned how to GET new objects from SAP Cloud and update them with externalIds. We also learned how to create new objects in SAP Cloud with externalIds. But what about updating SAP Cloud objects with data from your system? Piece of cake! Just do a PATCH by externalId:
Update Cloud Object by externalId
Handling Deleted Objects
There are two aspects to handling deleted objects. First of all, you want to be able to:
Delete Existing Resources
If, for example, a back office user of AwesomeERP deletes a business partner because of bad payment behavior, he will also want to delete that business partner in SAP Cloud, so that the field service engineers no longer see it on their devices after they sync. How do we do that? Make a simple DELETE call by externalId like this:
Delete by externalId
This deletes the business partner with external id “BPExtId2”. As you can see, we used the forceDelete flag, which works like forceUpdate does for updates: it bypasses the concurrent modification mechanism. If you want to use the concurrent modification mechanism instead, pass the lastChanged value as follows:
Delete by externalId
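As a sketch only, the concurrency-checked DELETE URL might look like the following. The path layout, parameter name, and timestamp are illustrative assumptions.

```python
from urllib.parse import urlencode

# Hypothetical DELETE URL with optimistic concurrency: pass the lastChanged
# value received from the cloud (illustrative value below).
last_changed = 1700000000000
base = "https://<cluster>.coresuite.com/api/data/v4"
url = (f"{base}/BusinessPartner/externalId/BPExtId2"
       f"?{urlencode({'lastChanged': last_changed})}")
```

If the object changed in the cloud after the connector read it, the delete would be rejected with a concurrent modification error instead of silently removing newer data.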
Concurrent Modification Handling by the Connector
We recommend you design the connector so that it uses force updates and force deletes. This way, you don’t need to store the mapping between the SAP lastChanged value and the externalId.
It’s important to note that if you DELETE a business partner, objects that refer to the deleted business partner will remain in SAP Cloud. You will still see the Service Calls associated with the BP with BPExtId2 on the device; however, the business partner will no longer be displayed for those Service Calls. If you don’t want to see them anymore, it is the connector’s responsibility to delete all objects related to the business partner from SAP.
Good, now we managed to DELETE an object from SAP Cloud. Now let’s take a look at how we can GET deleted objects from SAP Cloud.
Retrieving Deleted Resources in a Certain Timeframe
We could have a situation where one of our field service engineers deletes a Service Call and the connector needs to see this in order to delete it from AwesomeERP.
You will have to GET the deleted data with externalIds in the specified time frame. To do this, make a call like the following:
Get Deleted Objects with externalIds
Response with Deleted Business Partner with externalIds
As you can see, you will receive a list of identifiers containing both the Cloud identifier and the external identifier. If there is no external id, that object was never saved in your system.
There is, however, a limitation to this functionality that will be addressed as soon as possible: currently, you will also receive in the list the objects deleted by the connector itself. The lastChangedBy != me() filter does not work for these requests.
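Until that limitation is lifted, the connector can work around it by remembering the ids it deleted itself and filtering them out of the result, as in this sketch (the response item shape is an assumption):

```python
# Workaround sketch: because lastChangedBy != me() is not applied to
# deleted-object queries, keep a record of the connector's own deletions
# and skip them when processing the deleted-objects list.
own_deletes = {"BPExtId2"}  # ids the connector itself deleted

def foreign_deletes(deleted):
    """Keep only deletions performed by other users or apps."""
    return [d for d in deleted
            if d.get("externalId") not in own_deletes]
```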
Alright, so we have pretty much covered the most common needs of a connector. Remember that you can group multiple calls using the Batch API, which with just one call can give you a snapshot of the entire company! You can also filter the data that you get from the cloud and use more powerful queries with the Query API. Feel free to explore the Reference of the SAP FSM API and the Query API to find out how you can build the connector you need.