Custom D365 integration development — OData APIs, custom APIs, Azure Service Bus, virtual entities, and dual-write — for Business Central and Finance & Operations environments where Power Automate and AppSource connectors cannot handle your data volume, transformation logic, or legacy system connections. Built by a Zeeland, MI team with 20+ years connecting ERP systems to everything else.
Microsoft markets Dynamics 365 as a platform that connects to everything. And for basic scenarios — syncing contacts between Business Central and Outlook, posting sales orders to a Teams channel, triggering an email when an invoice posts — Power Automate works fine. The problem starts when your integration requirements move past demo-day complexity into the reality of a multi-system enterprise environment. You need to sync 50,000 inventory records nightly between Business Central and a third-party WMS. You need to push purchase orders from Finance & Operations into a supplier's legacy AS/400 system over EDI. You need real-time pricing updates from your CPQ tool reflected in D365 within seconds, not the 1–15 minute polling interval that Power Automate offers. Standard connectors were built for convenience, not for the transaction volumes, error handling, and data transformation logic that mid-market and enterprise D365 environments actually require.
The first symptom is throttling. Power Automate's D365 connector makes individual API calls per record. At 50,000 records, you are making 50,000 sequential HTTP requests through Microsoft's connector infrastructure, each one subject to Dataverse API limits (6,000 requests per 5-minute window per user for Finance & Operations, separate limits for Business Central). Flows start queueing. Processing that should complete in 30 minutes takes 6 hours. Retry storms cascade. Your IT team gets called at 2 AM because the nightly inventory sync failed at record 34,217 and there is no easy way to resume from where it stopped. Meanwhile, your warehouse opens at 6 AM with stale stock counts.
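The "failed at record 34,217 with no way to resume" problem is exactly what durable checkpointing solves. A minimal Python sketch of a resumable batch sync — all names are illustrative, and `push` stands in for whatever call actually sends a record to the target system:

```python
import json
import os
import tempfile

CHECKPOINT = os.path.join(tempfile.gettempdir(), "inventory_sync.ckpt")

def load_checkpoint():
    """Return the index of the last successfully pushed record, or -1."""
    try:
        with open(CHECKPOINT) as f:
            return json.load(f)["last_done"]
    except FileNotFoundError:
        return -1

def save_checkpoint(index):
    with open(CHECKPOINT, "w") as f:
        json.dump({"last_done": index}, f)

def sync_batch(records, push):
    """Push records in order, resuming after the last checkpoint.

    `push` is a hypothetical callable that sends one record and raises
    on failure. Checkpointing after each success gives at-least-once
    delivery: a crash mid-batch restarts at the failed record, not at
    record zero.
    """
    start = load_checkpoint() + 1
    for i in range(start, len(records)):
        push(records[i])    # may raise; the checkpoint survives
        save_checkpoint(i)  # record i is durably done
    return len(records) - start  # how many records this run pushed
```

In production this state would live in a database or blob store rather than a temp file, and pushes would be batched, but the resume semantics are the same.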
The second symptom is data transformation rigidity. Real integrations between D365 and other systems are not clean one-to-one field mappings. You need to split a single D365 sales order into multiple warehouse orders based on fulfillment location. You need to aggregate hundreds of POS transactions into daily summary journal entries. You need to translate between D365's chart of accounts structure and your consolidation system's account hierarchy with conditional mapping rules that change by entity, by currency, and by posting period. Power Automate expressions can technically do some of this, but maintaining 200-line expression blocks inside a visual flow designer is a maintenance disaster that no one on your team wants to touch six months later.
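The order-splitting case above is trivial to express — and to unit test — in real code, which is precisely what a 200-line flow expression is not. A sketch in plain Python, with hypothetical field names:

```python
from collections import defaultdict

def split_by_location(sales_order):
    """Split one sales order into one warehouse order per fulfillment
    location.

    `sales_order` uses illustrative D365-style fields:
    {"order_no": ..., "lines": [{"item": ..., "qty": ..., "location": ...}]}
    """
    by_loc = defaultdict(list)
    for line in sales_order["lines"]:
        by_loc[line["location"]].append(line)
    return [
        {"source_order": sales_order["order_no"],
         "location": loc,
         "lines": lines}
        for loc, lines in sorted(by_loc.items())
    ]
```

The same shape handles the aggregation direction (POS transactions into journal summaries) by grouping on posting date and account instead of location.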
The third symptom is error handling that does not exist. When a standard connector fails mid-batch, you get a flow run failure notification. That is it. No partial completion tracking. No dead letter queue for failed records. No automatic retry with exponential backoff. No mechanism to identify which records succeeded and which failed so you can resume without reprocessing everything. In a production D365 environment processing thousands of transactions daily across multiple integration points, the absence of enterprise-grade error handling is not a minor inconvenience — it is a data integrity risk that compounds every time a flow silently fails at 3 AM and no one notices until accounting tries to close the month.
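What enterprise-grade error handling actually means is concrete and small. A sketch of per-record retry with exponential backoff plus a dead-letter path — `handler` is a stand-in for the real API call, and the delays are shrunk for illustration:

```python
import random
import time

def process_with_retry(records, handler, max_attempts=4, base_delay=0.01):
    """Process records individually, retrying failures with exponential
    backoff and jitter. Records that exhaust their retries go to a
    dead-letter list instead of aborting the whole batch, so one bad
    record never blocks the other 49,999.
    """
    succeeded, dead_letter = [], []
    for rec in records:
        for attempt in range(max_attempts):
            try:
                handler(rec)
                succeeded.append(rec)
                break
            except Exception as exc:
                if attempt == max_attempts - 1:
                    dead_letter.append({"record": rec, "error": str(exc)})
                else:
                    # backoff: base * 2^attempt, with jitter to avoid
                    # synchronized retry storms against throttled APIs
                    time.sleep(base_delay * (2 ** attempt)
                               * (1 + random.random()))
    return succeeded, dead_letter
```

The dead-letter list is what gets persisted, alerted on, and replayed after the upstream issue is fixed — the mechanism standard connectors simply do not give you.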
Power Automate throttling: Dataverse API limits (6,000 requests per 5-minute window) stall batch syncs at scale, turning 30-minute jobs into 6-hour failures
No batch processing: standard connectors make individual API calls per record — 50,000 records means 50,000 sequential HTTP requests through Microsoft's connector layer
Data transformation ceiling: complex mapping logic (multi-entity splits, conditional aggregations, currency conversions) exceeds what Power Automate expressions can maintain long-term
Zero enterprise error handling: no partial completion tracking, no dead letter queues, no resumable batch processing — just a generic flow failure email
1–15 minute polling latency: Power Automate triggers poll on intervals, making sub-second real-time sync impossible for inventory, pricing, or order status
Vendor lock-in on AppSource connectors: third-party connectors disappear, change pricing, or break on D365 version updates with no SLA
Legacy system blind spots: Power Automate cannot reach AS/400, Progress, FoxPro, or custom databases that your other systems still depend on
Our engineers have solved these exact problems for other businesses. Let's discuss your requirements.
Dynamics 365 exposes multiple integration surfaces — more than most ERP platforms — but using them effectively requires going deeper than the standard connector layer. Business Central offers OData v4 endpoints for every published page and query, plus custom API pages that let you expose exactly the data shapes your integrations need without the overhead of full page metadata. Finance & Operations provides Data Management Framework (DMF) for high-volume batch operations, OData endpoints bound to data entities, custom services built on X++ service contracts, and dual-write for real-time synchronization with Dataverse. The integration surface is powerful but fragmented, and knowing which tool to use for which scenario is the difference between an integration that processes 100,000 records in 10 minutes and one that times out at 5,000.
FreedomDev builds custom D365 integrations that use the right API layer for each use case. For Business Central, we build integrations against custom API pages (not standard OData page endpoints) because custom APIs let us control exactly which fields are exposed, add computed fields, enforce filtering logic server-side, and version the API independently of page layout changes. This means your integration does not break when someone adds a field to a Business Central page. For Finance & Operations, we choose between OData data entities for moderate-volume real-time operations, DMF recurring integrations for high-volume batch scenarios (the DMF package API can process 500,000 records in a single operation), and custom services for complex operations that need to execute business logic on the server side rather than just move data.
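Server-side filtering is the practical payoff of that API choice: the client asks for exactly the rows and fields it needs instead of paging everything down and filtering locally. A sketch of composing an OData v4 query URL — the base URL and field names below are illustrative, not a real tenant:

```python
from urllib.parse import quote, urlencode

def build_odata_query(base_url, entity,
                      select=None, filter_expr=None, top=None):
    """Compose an OData v4 query URL so filtering and field selection
    happen server-side. All names here are illustrative."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    # keep OData punctuation readable; spaces percent-encode via quote()
    query = urlencode(params, safe="$,()'", quote_via=quote)
    return f"{base_url}/{entity}" + (f"?{query}" if query else "")
```

Requesting two fields of low-stock items instead of every field of every item is the difference between a kilobyte-scale response and a megabyte-scale one on each poll.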
For multi-system environments where D365 is one of five or ten connected systems, we architect integration middleware using Azure Service Bus as the message backbone. Service Bus gives you guaranteed message delivery with duplicate detection, dead letter queues for failed processing, topic-based pub/sub routing so multiple downstream systems can subscribe to the same D365 events independently, and sessions for ordered processing when sequence matters. This is the architecture you need when your D365 instance has to talk to your WMS, your CRM, your e-commerce platform, your EDI trading partners, your BI warehouse, and two legacy systems — simultaneously, reliably, and with full observability into what is flowing where.
Dual-write is Microsoft's own real-time synchronization framework between Finance & Operations and Dataverse, and it is genuinely useful for specific scenarios — keeping customer and product master data consistent between F&O and a Dataverse-powered customer portal, for example. But dual-write has hard limitations that Microsoft's documentation undersells. It only works between F&O and Dataverse (not Business Central, not third-party systems). It has a fixed set of supported entity maps. It struggles with high-frequency transactional data where conflict resolution gets complicated. And it requires careful initial sync planning because it was designed for ongoing synchronization, not historical data migration. We implement dual-write where it fits and build custom alternatives where it does not, rather than trying to force every integration through a single framework.
We build custom API pages in AL that expose exactly the data your integrations need — no more, no less. Custom APIs are versioned independently of Business Central page layouts, support complex filtering with OData query parameters, handle pagination for large result sets, and include computed fields that aggregate or transform data server-side so your integration layer receives clean, ready-to-use payloads. This is fundamentally different from consuming standard page-based OData endpoints, which expose every field on the page, break when pages change, and carry metadata overhead that slows bulk operations.
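Pagination over large result sets follows OData's `@odata.nextLink` convention: each response either carries a continuation URL or it doesn't. A client-side sketch, where `fetch_page` stands in for an authenticated HTTP GET returning the parsed JSON body:

```python
def fetch_all(fetch_page, first_url):
    """Drain a paginated OData collection by following @odata.nextLink
    until the server stops returning one.

    `fetch_page` is a hypothetical callable standing in for a real
    HTTP GET; it returns the parsed response body as a dict.
    """
    records, url = [], first_url
    while url:
        body = fetch_page(url)
        records.extend(body.get("value", []))       # page of records
        url = body.get("@odata.nextLink")           # None on last page
    return records
```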
For F&O environments, we select the right integration pattern per scenario. OData-bound data entities for moderate-volume, real-time operations like order creation and customer updates. Data Management Framework (DMF) for high-volume batch scenarios — recurring data jobs that can process hundreds of thousands of records per batch using the package API. Custom X++ services for operations that require server-side business logic execution, multi-table transactions, or complex validation chains that cannot be expressed through data entity mappings alone.
We build event-driven integration architectures using Azure Service Bus as the messaging backbone for D365 environments with 4+ connected systems. Service Bus queues for point-to-point command processing with guaranteed delivery. Service Bus topics with filtered subscriptions for pub/sub event distribution where multiple systems need to react to the same D365 event. Dead letter queues with automated alerting for messages that fail processing. Sessions for ordered delivery when processing sequence matters (financial postings, inventory adjustments, sequential order processing).
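The topic-with-filtered-subscriptions pattern can be illustrated without the Azure SDK. The toy in-memory topic below mimics the semantics (real code would use the `azure-servicebus` package and SQL or correlation filters; the predicate-based rules here are a simplification):

```python
class Topic:
    """Toy pub/sub topic mimicking Service Bus filtered subscriptions:
    each subscriber registers a rule over message properties and
    receives only matching messages, independently of other subscribers.
    """
    def __init__(self):
        self.subscriptions = {}

    def subscribe(self, name, rule):
        self.subscriptions[name] = {"rule": rule, "queue": []}

    def publish(self, message, **properties):
        # every matching subscription gets its own copy of the message
        for sub in self.subscriptions.values():
            if sub["rule"](properties):
                sub["queue"].append(message)

    def drain(self, name):
        queue = self.subscriptions[name]["queue"]
        drained, queue[:] = queue[:], []
        return drained
```

The point of the pattern: the publisher (D365) emits each event once, and the WMS, BI warehouse, and EDI gateway each consume it at their own pace without knowing the others exist.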
Virtual entities let you expose external system data inside Dataverse — and by extension inside D365 Customer Engagement apps, Power Apps, and Power BI — without copying the data into Dataverse tables. We build custom virtual entity data providers that connect Dataverse to your external databases, APIs, and legacy systems so users see and interact with external data as if it were native Dataverse records. This eliminates the need for batch synchronization of reference data and gives your team a single interface for data that physically lives in multiple systems.
We implement Microsoft's dual-write framework for real-time F&O-to-Dataverse synchronization where it fits: master data entities (customers, vendors, products) with moderate update frequency and straightforward conflict resolution. Where dual-write's limitations apply — unsupported entity maps, high-frequency transactional data, complex transformation requirements, or integration targets beyond Dataverse — we build custom real-time sync using Azure Service Bus and change tracking that provides the same millisecond-level sync without dual-write's constraints.
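At its core, a change-tracking sync is a poll loop over a high-water mark: each cycle asks the source only for rows modified since the last mark. A sketch with hypothetical names — `query_since` stands in for a change-tracked query like `SELECT ... WHERE modified_at > ? ORDER BY modified_at`:

```python
def poll_changes(query_since, state):
    """One poll cycle against a change-tracked source table.

    `state` carries the persisted high-water mark between cycles.
    Real implementations key on a strictly increasing rowversion rather
    than a timestamp, so ties at the mark can never drop rows.
    """
    rows = query_since(state.get("high_water", 0))
    if rows:
        state["high_water"] = max(r["modified_at"] for r in rows)
    return rows
```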
D365 implementations rarely replace every system overnight. You go live on Business Central or F&O and still need to exchange data with AS/400 inventory systems, Progress manufacturing execution systems, FoxPro databases, flat-file EDI translators, and homegrown applications that have been running since 2003. We build the middleware that bridges D365 to these systems — database-level connectors with change data capture, file-based integration pipelines, screen-scraping adapters for terminal systems, and custom wrapper APIs that give your legacy systems a modern REST interface that D365 can call natively.
We tried to build our D365 Business Central integrations with Power Automate. The nightly inventory sync to our WMS was taking 8 hours and failing 3 times a week. FreedomDev rebuilt it with custom API pages and Azure Service Bus — the same sync now completes in 12 minutes with zero failures in 6 months. Our warehouse team finally trusts the stock counts when they walk in at 6 AM.
We audit your D365 environment (Business Central, Finance & Operations, or both), catalog every system it needs to exchange data with, and map the data flows — what moves where, how often, in what volume, and with what transformation logic. For each integration point, we evaluate which D365 API surface is the right fit: custom API pages, OData data entities, DMF batch jobs, custom services, dual-write, or virtual entities. We assess your non-D365 systems for API availability, database access, and file export capabilities. Deliverable: an integration architecture document with the specific D365 API layer, messaging pattern (direct, queue-based, or pub/sub), and estimated cost for each connection.
Before writing integration code, we define the data contract for every connection. This means field-level mapping between D365 entities and target systems, data transformation rules (unit conversions, account code translations, currency handling, entity splitting and aggregation), conflict resolution logic for bidirectional syncs, error handling behavior (retry policies, dead letter routing, alerting thresholds), and authentication flows (OAuth 2.0 for Azure AD-protected D365 APIs, API keys for third-party systems, certificate-based auth for legacy connections). For Business Central custom APIs, we write the AL API page specifications. For F&O integrations, we define data entity configurations and DMF job parameters.
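Field-level mapping contracts like these can be expressed as data rather than buried in code, which keeps them reviewable by finance and operations staff. A minimal sketch — the field names and transforms are invented for illustration, and the currency rate lookup is elided:

```python
# Each rule: target field -> (source field, transform function)
MAPPING = {
    "AccountCode": ("gl_account", lambda v: v.replace("-", "")),
    "AmountUSD":   ("amount",     lambda v: round(v * 1.0, 2)),  # rate lookup elided
    "EntityId":    ("company",    str.upper),
}

def apply_mapping(source_record, mapping=MAPPING):
    """Apply a declarative field map, collecting per-field errors
    instead of failing the whole record on the first bad value."""
    target, errors = {}, {}
    for tgt, (src, transform) in mapping.items():
        try:
            target[tgt] = transform(source_record[src])
        except Exception as exc:
            errors[tgt] = str(exc)
    return target, errors
```

Because the map is data, the same structure can be versioned per entity, per currency, or per posting period — the conditional-mapping requirement described above — without touching the engine that applies it.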
We build integrations in priority order against your D365 sandbox environment. Business Central custom API pages are developed and deployed as extensions. F&O data entities, custom services, and business events are developed in the appropriate D365 development environment. Middleware components — Azure Functions, Service Bus configurations, legacy system connectors — are built and deployed to Azure or your infrastructure. Every integration gets unit tests, integration tests against sandbox data, batch processing tests at 2–5x your expected volume, error injection testing (simulated API failures, timeout scenarios, malformed data), and performance benchmarking. A straightforward Business Central to WMS integration takes 3–4 weeks. A multi-system F&O integration with legacy bridging, Service Bus event routing, and complex transformation logic takes 6–8 weeks.
Integrations run alongside your existing processes — manual or otherwise — for a validation period. Automated reconciliation compares integration output against your current data movement methods transaction by transaction. This is where we catch the edge cases that sandbox testing cannot simulate: unusual transaction types, end-of-period posting anomalies, timezone mismatches on international transactions, D365 number sequence gaps, and vendor-specific API quirks that only surface with production data shapes. Your team continues their current workflow unchanged until integration accuracy is verified. For high-volume integrations (50,000+ records per batch), validation includes throughput testing under production load to confirm that batch windows complete within your operational requirements.
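Transaction-by-transaction reconciliation reduces to a keyed diff between the two pipelines' outputs. A minimal sketch, assuming both sides share a transaction identifier:

```python
def reconcile(new_pipeline, legacy_pipeline, key="id"):
    """Compare two sets of output records transaction by transaction.

    Returns keys missing from either side plus keys whose field values
    disagree, so every discrepancy is attributable to a specific record.
    """
    new_by_key = {r[key]: r for r in new_pipeline}
    old_by_key = {r[key]: r for r in legacy_pipeline}
    return {
        "missing_from_new": sorted(old_by_key.keys() - new_by_key.keys()),
        "missing_from_old": sorted(new_by_key.keys() - old_by_key.keys()),
        "mismatched": sorted(
            k for k in new_by_key.keys() & old_by_key.keys()
            if new_by_key[k] != old_by_key[k]
        ),
    }
```

Run nightly during the parallel period, an empty report on all three lists is the cut-over criterion; anything else is an edge case to investigate before the manual process is retired.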
After validation, we cut over to automated integration, configure monitoring dashboards in Azure Monitor or your preferred observability platform, set up alerting for failures and anomalies (processing time deviations, error rate spikes, volume drops that suggest upstream issues), and provide 30 days of hypercare support. Ongoing maintenance covers D365 version update compatibility testing (Microsoft releases major updates twice yearly and cumulative updates monthly), third-party API change monitoring, Azure Service Bus health management, performance tuning as transaction volumes grow, and emergency response for integration failures. Maintenance runs $750–$3,000/month per integration cluster depending on the number of connected systems, transaction volume, and SLA requirements.
| Metric | Custom Integration (FreedomDev) | Standard Connectors (Power Automate / AppSource) |
|---|---|---|
| Batch Processing (50K+ records) | DMF package API: 500K records/batch, parallel processing | Power Automate: per-record API calls, 6K request limit per 5 min |
| Real-Time Sync Latency | Sub-second via Service Bus + webhooks | Power Automate: 1–15 minute polling intervals |
| Data Transformation Complexity | Custom C#/AL logic: entity splits, aggregations, conditional mappings | Power Automate expressions: limited, unmaintainable at scale |
| Error Handling | Dead letter queues, partial completion tracking, resumable batches, automated alerting | Flow failure email notification, manual rerun of entire flow |
| Legacy System Connectivity | AS/400, Progress, FoxPro, flat files, EDI, screen scraping, custom wrapper APIs | Only systems with existing Power Platform connectors |
| D365 Version Update Resilience | Custom APIs versioned independently, tested against preview releases | AppSource connectors break on updates, vendor fix timelines unknown |
| 3-Year TCO (10+ integration points) | $80K–$200K total (build + maintenance) | Power Automate Premium: $15/user/mo + AppSource connectors + manual intervention labor |
| Monitoring & Observability | Azure Monitor dashboards, custom health checks, transaction-level audit logs | Power Automate run history (28-day retention, no custom metrics) |
Schedule a direct technical consultation with our senior architects.
Make your software work for you. Let's build a sensible solution.