The business requirements drive the choice of technology as well as the choice of pattern below. They also determine whether and which middleware is used to orchestrate the integration scenario, and how the data is posted to or requested from the destination and source systems.
Below is a high-level overview of possible integration patterns in D365FO:
Modification: Direct call, real time (synchronous)
Scenario: a real-time call via a payment connector for brick-and-mortar stores, for example.
OData: E.g. 2-5k messages a day, low data volume, near real time (almost synchronous)
OData itself is a standard protocol for creating and consuming data. Its purpose is to provide a protocol based on Representational State Transfer (REST) for create, read, update, and delete operations. In short, it lets developers interact with data through RESTful web services.
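As a quick reference for those create/read/update/delete operations, OData maps them onto the standard HTTP verbs:

```python
# How OData CRUD operations map onto HTTP verbs.
ODATA_VERBS = {
    "create": "POST",
    "read":   "GET",
    "update": "PATCH",   # OData also allows PUT for a full replacement
    "delete": "DELETE",
}

print(ODATA_VERBS["read"])  # the verb used when querying an entity
```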
D365FO provides an OData REST endpoint out of the box (OTB). This endpoint exposes all data entities that are marked as IsPublic in the Application Object Tree (AOT). For your understanding, an entity is essentially a flat view over fields from one or more tables. D365FO ships with various OTB entities to import to or export from, but only those marked as IsPublic will be available via OData. If the OTB entities do not suffice, you can add additional entities via customizations (MODs).
To see a list of all exposed entities, open the OData service root URL: [Your organization's root URL]/data
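As a minimal sketch of querying such an endpoint, the helper below composes an OData query URL from the root URL. The root URL and the entity set name `CustomersV3` are placeholder assumptions; use your own environment's root URL and any public entity from your AOT:

```python
from urllib.parse import quote

def odata_query_url(root_url, entity_set, odata_filter=None, top=None):
    """Build an OData query URL for a public D365FO data entity.

    root_url    -- the environment root, e.g. https://myorg.operations.dynamics.com
    entity_set  -- the public collection name of an IsPublic entity
    odata_filter-- optional OData $filter expression
    top         -- optional row limit ($top)
    """
    url = f"{root_url.rstrip('/')}/data/{entity_set}"
    params = []
    if odata_filter:
        params.append("$filter=" + quote(odata_filter))
    if top is not None:
        params.append(f"$top={top}")
    return url + ("?" + "&".join(params) if params else "")

# Hypothetical root URL; entity set names vary by version.
print(odata_query_url("https://myorg.operations.dynamics.com",
                      "CustomersV3",
                      odata_filter="dataAreaId eq 'usmf'",
                      top=10))
```

The actual GET request would additionally need an Authorization header carrying a bearer token obtained from Azure AD.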
Batch: E.g. 1-3 times a day, Large data volume, more static (asynchronous)
If the business scenario is driven by more asynchronous means, e.g. batch runs 1-3 times a day, and requires a larger amount of data to be integrated, then a DMF recurring integration is probably the better choice.
The recurring integration via DMF is built on top of data entities and the Data management framework.
It enables the exchange of documents or files between Finance and Operations and any third-party application or service.
It also supports several document formats, source mapping, Extensible Stylesheet Language Transformations (XSLT), and filters. OTB, when an entity is added to the DMF project, you can specify the export format or apply filters to the data set.
It uses secure REST application programming interfaces (APIs) and authorization mechanisms to receive data from, and send data back to, integration systems.
After the DMF project is created, you create an application ID in Microsoft Azure Active Directory (Azure AD) and assign the correct permissions, because the integrating client application must authorize via Azure AD before it can consume the endpoint.
You then reference this ID when setting up the DMF project as a recurring data job.
You can push, for example, individual files (one entity) or packages (multiple entities, usually zipped together with a manifest and a package header).
You can make POST calls, for example, via the enqueue job endpoint:
https://<base URL>/api/connector/enqueue/<activityID>?entity=<entity name>
You can retrieve the activity ID from the data project created in D365FO. The same pattern applies to request jobs, i.e. dequeue jobs.
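The enqueue/dequeue URL pattern above can be sketched as a small helper. The base URL and activity ID below are placeholders; in practice you would take them from your environment and the recurring data job, and send the request with an Azure AD bearer token:

```python
def recurring_job_url(base_url, action, activity_id, entity=None):
    """Build the recurring-integration URL for an enqueue or dequeue call.

    action      -- "enqueue" (push a file/package) or "dequeue" (pull one)
    activity_id -- the ID taken from the recurring data job in the DMF project
    entity      -- entity name, used for file-based enqueue calls
    """
    if action not in ("enqueue", "dequeue"):
        raise ValueError("action must be 'enqueue' or 'dequeue'")
    url = f"{base_url.rstrip('/')}/api/connector/{action}/{activity_id}"
    if entity:
        url += f"?entity={entity}"
    return url

# Placeholder base URL and activity ID for illustration only.
url = recurring_job_url("https://myorg.operations.dynamics.com",
                        "enqueue", "11111111-2222-3333-4444-555555555555",
                        entity="Customers")
print(url)
```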
Event driven architecture: Small message size (64KB per message)
This is often also referred to as ad hoc. The estimated size of a message should not exceed 64 KB. A use case here would be, for example, that every time a purchase order is created, an integration needs to be kicked off.
Business events occur when a business process is run.
Business events provide a mechanism that lets external systems receive notifications from Finance and Operations applications. In this way, the systems can perform business actions in response to the business events.
OTB, there are a handful of business events available. You can find them under System administration > Setup > Business events. If the OTB business events do not suffice, you can create additional events via customization (MODs).
From an endpoint standpoint, these business events can be consumed using Microsoft Power Automate and Azure messaging services (such as Azure Service Bus, Azure Event Grid, etc.). You must specify the endpoint prior to activating the business event.
In short, endpoints let you manage the destinations that business events are sent to.
If you have multiple legal entities (LEs), it is good to know that business events can be activated either for all LEs or for specific ones.
It is also important to note that the Microsoft Azure-based endpoints must be in the customer's Azure subscription. For example, if Azure Event Grid is used as an endpoint, it must be in the customer's Azure subscription.
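As an illustration of the per-legal-entity activation noted above, the sketch below filters incoming event notifications by legal entity. The payload fields shown are simplified stand-ins, not the full business events contract; real payloads carry additional fields (versioning, control numbers, etc.):

```python
import json

# Simplified, illustrative notifications as a consumer might receive them.
raw = json.dumps([
    {"BusinessEventId": "PurchaseOrderConfirmedBusinessEvent",
     "BusinessEventLegalEntity": "USMF",
     "EventTime": "2024-01-15T10:00:00Z"},
    {"BusinessEventId": "PurchaseOrderConfirmedBusinessEvent",
     "BusinessEventLegalEntity": "DEMF",
     "EventTime": "2024-01-15T11:00:00Z"},
])

def events_for_legal_entity(payload, legal_entity):
    """Return only the events that were raised in the given legal entity."""
    return [e for e in json.loads(payload)
            if e.get("BusinessEventLegalEntity") == legal_entity]

usmf_events = events_for_legal_entity(raw, "USMF")
print(len(usmf_events))
```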
As addressed, the business requirements drive the choice of technology and pattern. But the overview above should give you a good understanding of what's possible.