FreedomDev

Your Dedicated Dev Partner. Zero Hiring Risk. No Agency Contracts.

201 W Washington Ave, Ste. 210

Zeeland MI

616-737-6350

[email protected]



© 2026 FreedomDev Sensible Software. All rights reserved.

Solution

Dynamics 365 Custom Integration: When Standard Connectors Aren't Enough

Custom D365 integration development — OData APIs, custom APIs, Azure Service Bus, virtual entities, and dual-write — for Business Central and Finance & Operations environments where Power Automate and AppSource connectors cannot handle your data volume, transformation logic, or legacy system connections. Built by a Zeeland, MI team with 20+ years connecting ERP systems to everything else.

20+ Years ERP Integration · D365 Business Central & F&O · Azure Service Bus Architecture · Zeeland, MI

The D365 Integration Gap: Power Automate Hits a Wall at 50,000 Records

Microsoft markets Dynamics 365 as a platform that connects to everything. And for basic scenarios — syncing contacts between Business Central and Outlook, posting sales orders to a Teams channel, triggering an email when an invoice posts — Power Automate works fine. The problem starts when your integration requirements move past demo-day complexity into the reality of a multi-system enterprise environment. You need to sync 50,000 inventory records nightly between Business Central and a third-party WMS. You need to push purchase orders from Finance & Operations into a supplier's legacy AS/400 system over EDI. You need real-time pricing updates from your CPQ tool reflected in D365 within seconds, not the 5-15 minute polling interval that Power Automate offers. Standard connectors were built for convenience, not for the transaction volumes, error handling, and data transformation logic that mid-market and enterprise D365 environments actually require.

The first symptom is throttling. Power Automate's D365 connector makes individual API calls per record. At 50,000 records, you are making 50,000 sequential HTTP requests through Microsoft's connector infrastructure, each one subject to Dataverse API limits (6,000 requests per 5-minute window per user for Finance & Operations, separate limits for Business Central). Flows start queueing. Processing that should complete in 30 minutes takes 6 hours. Retry storms cascade. Your IT team gets called at 2 AM because the nightly inventory sync failed at record 34,217 and there is no easy way to resume from where it stopped. Meanwhile, your warehouse opens at 6 AM with stale stock counts.
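The arithmetic behind that wall is easy to check. A rough sketch of the lower bound on a strictly sequential, per-record sync (the 200 ms per-call latency is an assumption for illustration; the 6,000-requests-per-5-minute figure is the service protection limit cited above):

```python
def sync_time_estimate_minutes(records, per_call_latency_s=0.2,
                               limit=6000, window_min=5.0):
    """Estimate wall-clock minutes for a strictly sequential per-record sync.
    Whichever is slower dominates: the throttle floor imposed by the API
    limit, or the cumulative round-trip latency of one call per record."""
    throttle_floor = records / limit * window_min
    latency_floor = records * per_call_latency_s / 60.0
    return max(throttle_floor, latency_floor)

# 50,000 records at 200 ms/call: latency alone is ~167 minutes, well past
# the ~42-minute floor the throttle imposes — and that is before retries.
estimate = sync_time_estimate_minutes(50_000)
```

Real flows do worse than this bound, because retry storms and queueing multiply the per-record cost rather than adding to it.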

The second symptom is data transformation rigidity. Real integrations between D365 and other systems are not clean one-to-one field mappings. You need to split a single D365 sales order into multiple warehouse orders based on fulfillment location. You need to aggregate hundreds of POS transactions into daily summary journal entries. You need to translate between D365's chart of accounts structure and your consolidation system's account hierarchy with conditional mapping rules that change by entity, by currency, and by posting period. Power Automate expressions can technically do some of this, but maintaining 200-line expression blocks inside a visual flow designer is a maintenance disaster that no one on your team wants to touch six months later.
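The order-splitting case above is the kind of transformation that is a few readable lines in real code but a 200-line expression block in a flow designer. A minimal sketch — field names here are illustrative, not actual D365 schema:

```python
from collections import defaultdict

def split_order_by_location(sales_order: dict) -> list[dict]:
    """Split one sales order into one warehouse order per fulfillment
    location, carrying the source order number for traceability."""
    by_location = defaultdict(list)
    for line in sales_order["lines"]:
        by_location[line["fulfillment_location"]].append(line)
    return [
        {"source_order": sales_order["order_no"],
         "warehouse": location,
         "lines": lines}
        for location, lines in sorted(by_location.items())
    ]

order = {
    "order_no": "SO-1001",
    "lines": [
        {"item": "A", "qty": 5, "fulfillment_location": "GRAND-RAPIDS"},
        {"item": "B", "qty": 2, "fulfillment_location": "ZEELAND"},
        {"item": "C", "qty": 1, "fulfillment_location": "GRAND-RAPIDS"},
    ],
}
warehouse_orders = split_order_by_location(order)  # → two warehouse orders
```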

The third symptom is error handling that does not exist. When a standard connector fails mid-batch, you get a flow run failure notification. That is it. No partial completion tracking. No dead letter queue for failed records. No automatic retry with exponential backoff. No mechanism to identify which records succeeded and which failed so you can resume without reprocessing everything. In a production D365 environment processing thousands of transactions daily across multiple integration points, the absence of enterprise-grade error handling is not a minor inconvenience — it is a data integrity risk that compounds every time a flow silently fails at 3 AM and no one notices until accounting tries to close the month.
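What "enterprise-grade error handling" means concretely: track partial completion, retry transient failures with exponential backoff, and park permanent failures in a dead-letter store instead of aborting the run. A simplified, self-contained sketch (a real implementation persists the checkpoint and dead-letter records; delays are zeroed here for brevity):

```python
import time

def process_batch(records, handler, max_retries=3, base_delay_s=0.0):
    """Resumable batch skeleton: `handler` is any callable that raises on
    failure. Returns the records that succeeded and a dead-letter list of
    records that exhausted their retries, so nothing is silently lost."""
    succeeded, dead_letter = [], []
    for record in records:
        for attempt in range(max_retries):
            try:
                handler(record)
                succeeded.append(record)
                break
            except Exception as exc:
                if attempt == max_retries - 1:
                    dead_letter.append({"record": record, "error": str(exc)})
                else:
                    time.sleep(base_delay_s * 2 ** attempt)  # exponential backoff

    return succeeded, dead_letter

def flaky(record):
    """Stand-in handler that permanently rejects one record."""
    if record == 3:
        raise RuntimeError("validation error")

ok, dlq = process_batch([1, 2, 3, 4], flaky)  # ok == [1, 2, 4]; dlq holds record 3
```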

  • Power Automate throttling: Dataverse API limits (6,000 requests per 5-minute window) stall batch syncs at scale, turning 30-minute jobs into 6-hour failures
  • No batch processing: standard connectors make individual API calls per record — 50,000 records means 50,000 sequential HTTP requests through Microsoft's connector layer
  • Data transformation ceiling: complex mapping logic (multi-entity splits, conditional aggregations, currency conversions) exceeds what Power Automate expressions can maintain long-term
  • Zero enterprise error handling: no partial completion tracking, no dead letter queues, no resumable batch processing — just a generic flow failure email
  • 5–15 minute polling latency: Power Automate triggers poll on intervals, making sub-second real-time sync impossible for inventory, pricing, or order status
  • Vendor lock-in on AppSource connectors: third-party connectors disappear, change pricing, or break on D365 version updates with no SLA
  • Legacy system blind spots: Power Automate cannot reach AS/400, Progress, FoxPro, or custom databases that your other systems still depend on

Need Help Implementing This Solution?

Our engineers have built this exact solution for other businesses. Let's discuss your requirements.

  • Proven implementation methodology
  • Experienced team — no learning on your dime
  • Clear timeline and transparent pricing

D365 Custom Integration ROI: What Clients Measure After Go-Live

  • 10x–50x: batch processing speed vs the Power Automate connector (DMF vs per-record API calls)
  • 99.8%: data accuracy across integrated systems with automated reconciliation
  • < 5 sec: real-time sync latency, replacing 5–15 minute Power Automate polling intervals
  • 30–60 hrs/wk: manual data entry eliminated per client across D365 and connected systems
  • $80K–$150K/yr: operational cost savings from eliminated manual processes and error correction
  • 4 hours: month-end close acceleration from eliminating manual data reconciliation

Facing this exact problem?

We can map out a transition plan tailored to your workflows.

The Transformation

Custom D365 Integration: OData, Custom APIs, Azure Service Bus, and Dual-Write Done Right

Dynamics 365 exposes multiple integration surfaces — more than most ERP platforms — but using them effectively requires going deeper than the standard connector layer. Business Central offers OData v4 endpoints for every published page and query, plus custom API pages that let you expose exactly the data shapes your integrations need without the overhead of full page metadata. Finance & Operations provides Data Management Framework (DMF) for high-volume batch operations, OData endpoints bound to data entities, custom services built on X++ service contracts, and dual-write for real-time synchronization with Dataverse. The integration surface is powerful but fragmented, and knowing which tool to use for which scenario is the difference between an integration that processes 100,000 records in 10 minutes and one that times out at 5,000.

FreedomDev builds custom D365 integrations that use the right API layer for each use case. For Business Central, we build integrations against custom API pages (not standard OData page endpoints) because custom APIs let us control exactly which fields are exposed, add computed fields, enforce filtering logic server-side, and version the API independently of page layout changes. This means your integration does not break when someone adds a field to a Business Central page. For Finance & Operations, we choose between OData data entities for moderate-volume real-time operations, DMF recurring integrations for high-volume batch scenarios (the DMF package API can process 500,000 records in a single operation), and custom services for complex operations that need to execute business logic on the server side rather than just move data.

For multi-system environments where D365 is one of five or ten connected systems, we architect integration middleware using Azure Service Bus as the message backbone. Service Bus gives you guaranteed message delivery with duplicate detection, dead letter queues for failed processing, topic-based pub/sub routing so multiple downstream systems can subscribe to the same D365 events independently, and sessions for ordered processing when sequence matters. This is the architecture you need when your D365 instance has to talk to your WMS, your CRM, your e-commerce platform, your EDI trading partners, your BI warehouse, and two legacy systems — simultaneously, reliably, and with full observability into what is flowing where.

Dual-write is Microsoft's own real-time synchronization framework between Finance & Operations and Dataverse, and it is genuinely useful for specific scenarios — keeping customer and product master data consistent between F&O and a Dataverse-powered customer portal, for example. But dual-write has hard limitations that Microsoft's documentation undersells. It only works between F&O and Dataverse (not Business Central, not third-party systems). It has a fixed set of supported entity maps. It struggles with high-frequency transactional data where conflict resolution gets complicated. And it requires careful initial sync planning because it was designed for ongoing synchronization, not historical data migration. We implement dual-write where it fits and build custom alternatives where it does not, rather than trying to force every integration through a single framework.

Business Central Custom API Development

We build custom API pages in AL that expose exactly the data your integrations need — no more, no less. Custom APIs are versioned independently of Business Central page layouts, support complex filtering with OData query parameters, handle pagination for large result sets, and include computed fields that aggregate or transform data server-side so your integration layer receives clean, ready-to-use payloads. This is fundamentally different from consuming standard page-based OData endpoints, which expose every field on the page, break when pages change, and carry metadata overhead that slows bulk operations.
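From the consuming side, both standard and custom Business Central APIs follow the OData v4 pagination contract: each page of results may carry an `@odata.nextLink` pointing at the next page. A client sketch — the endpoint paths are hypothetical, and `get_json` stands in for an authenticated HTTP GET (e.g. `requests` plus OAuth), faked here so the loop shape is visible:

```python
def fetch_all(url, get_json):
    """Walk an OData v4 result set by following `@odata.nextLink`
    until the server stops returning one (i.e. the last page)."""
    records = []
    while url:
        page = get_json(url)
        records.extend(page["value"])
        url = page.get("@odata.nextLink")  # absent on the final page
    return records

# Fake two-page response illustrating the wire shape:
pages = {
    "/api/v1/shipments?$top=2": {
        "value": [{"no": "S1"}, {"no": "S2"}],
        "@odata.nextLink": "/api/v1/shipments?$skiptoken=2",
    },
    "/api/v1/shipments?$skiptoken=2": {"value": [{"no": "S3"}]},
}
rows = fetch_all("/api/v1/shipments?$top=2", pages.__getitem__)  # 3 records
```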

Finance & Operations Data Entity & DMF Integration

For F&O environments, we select the right integration pattern per scenario. OData-bound data entities for moderate-volume, real-time operations like order creation and customer updates. Data Management Framework (DMF) for high-volume batch scenarios — recurring data jobs that can process hundreds of thousands of records per batch using the package API. Custom X++ services for operations that require server-side business logic execution, multi-table transactions, or complex validation chains that cannot be expressed through data entity mappings alone.
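The DMF recurring-integration pattern is submit-then-poll: a caller hands F&O a data package, then checks execution status until the job reaches a terminal state. A sketch of the polling half, with `get_status` standing in for the DMF status call and the state names illustrative rather than the framework's exact enum:

```python
def wait_for_job(get_status, max_polls=60):
    """Poll a batch job until it reaches a terminal state — the loop an
    Azure Function runs after submitting a DMF package. In production,
    sleep between polls; here the cadence is left to the caller."""
    for _ in range(max_polls):
        state = get_status()
        if state in ("Succeeded", "PartiallySucceeded", "Failed"):
            return state
    raise TimeoutError("DMF job did not finish within the polling budget")

# Simulated status sequence for a job that finishes on the third poll:
statuses = iter(["Executing", "Executing", "Succeeded"])
outcome = wait_for_job(lambda: next(statuses))  # → "Succeeded"
```

Batch-level success still warrants a check of the execution log, since DMF can report partial success with record-level errors.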

Azure Service Bus Event Architecture

We build event-driven integration architectures using Azure Service Bus as the messaging backbone for D365 environments with 4+ connected systems. Service Bus queues for point-to-point command processing with guaranteed delivery. Service Bus topics with filtered subscriptions for pub/sub event distribution where multiple systems need to react to the same D365 event. Dead letter queues with automated alerting for messages that fail processing. Sessions for ordered delivery when processing sequence matters (financial postings, inventory adjustments, sequential order processing).
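The routing model described above — one published event, many independently filtered subscribers — can be shown with a toy in-memory stand-in. Real Service Bus layers durability, dead-lettering, duplicate detection, and sessions on top of exactly this shape:

```python
class Topic:
    """Toy pub/sub topic: each subscription has its own queue, and a
    filter predicate decides which published messages land in it."""

    def __init__(self):
        self.subscriptions = {}

    def subscribe(self, name, predicate=lambda msg: True):
        self.subscriptions[name] = {"filter": predicate, "queue": []}

    def publish(self, message):
        for sub in self.subscriptions.values():
            if sub["filter"](message):
                sub["queue"].append(message)

orders = Topic()
orders.subscribe("wms")                                          # every order event
orders.subscribe("invoicing", lambda m: m["status"] == "posted") # posted only
orders.publish({"order": "SO-1", "status": "open"})
orders.publish({"order": "SO-2", "status": "posted"})
# wms sees both events; invoicing sees only the posted one
```

The point of the pattern: publishers never know who consumes, so adding a BI warehouse subscription later touches nothing upstream.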

Virtual Entities & Dataverse Integration

Virtual entities let you expose external system data inside Dataverse — and by extension inside D365 Customer Engagement apps, Power Apps, and Power BI — without copying the data into Dataverse tables. We build custom virtual entity data providers that connect Dataverse to your external databases, APIs, and legacy systems so users see and interact with external data as if it were native Dataverse records. This eliminates the need for batch synchronization of reference data and gives your team a single interface for data that physically lives in multiple systems.

Dual-Write Implementation & Augmentation

We implement Microsoft's dual-write framework for real-time F&O-to-Dataverse synchronization where it fits: master data entities (customers, vendors, products) with moderate update frequency and straightforward conflict resolution. Where dual-write's limitations apply — unsupported entity maps, high-frequency transactional data, complex transformation requirements, or integration targets beyond Dataverse — we build custom real-time sync using Azure Service Bus and change tracking that provides the same millisecond-level sync without dual-write's constraints.
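The change-tracking half of that custom alternative reduces to a checkpointed delta query: pull only rows whose version stamp is newer than the last sync, then advance the checkpoint. A loose sketch of the semantics (field names illustrative; a real source exposes this via F&O change tracking or SQL change tracking rather than an in-memory list):

```python
def delta_sync(source_rows, last_version):
    """Return rows changed since the checkpoint, plus the new checkpoint.
    Keeps per-sync volume proportional to change rate, not table size."""
    changed = [row for row in source_rows if row["version"] > last_version]
    new_version = max((row["version"] for row in source_rows),
                      default=last_version)
    return changed, new_version

table = [
    {"id": "C001", "version": 3},
    {"id": "C002", "version": 7},
    {"id": "C003", "version": 9},
]
delta, checkpoint = delta_sync(table, last_version=5)  # 2 changed rows, checkpoint 9
```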

Legacy System Bridging for D365

D365 implementations rarely replace every system overnight. You go live on Business Central or F&O and still need to exchange data with AS/400 inventory systems, Progress manufacturing execution systems, FoxPro databases, flat-file EDI translators, and homegrown applications that have been running since 2003. We build the middleware that bridges D365 to these systems — database-level connectors with change data capture, file-based integration pipelines, screen-scraping adapters for terminal systems, and custom wrapper APIs that give your legacy systems a modern REST interface that D365 can call natively.

Want a Custom Implementation Plan?

We'll map your requirements to a concrete plan with phases, milestones, and a realistic budget.

  • Detailed scope document you can share with stakeholders
  • Phased approach — start small, scale as you see results
  • No surprises — fixed-price or transparent hourly
“We tried to build our D365 Business Central integrations with Power Automate. The nightly inventory sync to our WMS was taking 8 hours and failing 3 times a week. FreedomDev rebuilt it with custom API pages and Azure Service Bus — the same sync now completes in 12 minutes with zero failures in 6 months. Our warehouse team finally trusts the stock counts when they walk in at 6 AM.”

— VP of Operations, West Michigan Manufacturing & Distribution Company

Our Process

01

D365 Integration Assessment & Architecture Selection (1–2 Weeks)

We audit your D365 environment (Business Central, Finance & Operations, or both), catalog every system it needs to exchange data with, and map the data flows — what moves where, how often, in what volume, and with what transformation logic. For each integration point, we evaluate which D365 API surface is the right fit: custom API pages, OData data entities, DMF batch jobs, custom services, dual-write, or virtual entities. We assess your non-D365 systems for API availability, database access, and file export capabilities. Deliverable: an integration architecture document with the specific D365 API layer, messaging pattern (direct, queue-based, or pub/sub), and estimated cost for each connection.

02

API Contract & Data Mapping Specification (1–2 Weeks)

Before writing integration code, we define the data contract for every connection. This means field-level mapping between D365 entities and target systems, data transformation rules (unit conversions, account code translations, currency handling, entity splitting and aggregation), conflict resolution logic for bidirectional syncs, error handling behavior (retry policies, dead letter routing, alerting thresholds), and authentication flows (OAuth 2.0 for Azure AD-protected D365 APIs, API keys for third-party systems, certificate-based auth for legacy connections). For Business Central custom APIs, we write the AL API page specifications. For F&O integrations, we define data entity configurations and DMF job parameters.

03

Integration Development & Sandbox Testing (3–8 Weeks)

We build integrations in priority order against your D365 sandbox environment. Business Central custom API pages are developed and deployed as extensions. F&O data entities, custom services, and business events are developed in the appropriate D365 development environment. Middleware components — Azure Functions, Service Bus configurations, legacy system connectors — are built and deployed to Azure or your infrastructure. Every integration gets unit tests, integration tests against sandbox data, batch processing tests at 2–5x your expected volume, error injection testing (simulated API failures, timeout scenarios, malformed data), and performance benchmarking. A straightforward Business Central to WMS integration takes 3–4 weeks. A multi-system F&O integration with legacy bridging, Service Bus event routing, and complex transformation logic takes 6–8 weeks.

04

Parallel Running & Data Validation (1–3 Weeks)

Integrations run alongside your existing processes — manual or otherwise — for a validation period. Automated reconciliation compares integration output against your current data movement methods transaction by transaction. This is where we catch the edge cases that sandbox testing cannot simulate: unusual transaction types, end-of-period posting anomalies, timezone mismatches on international transactions, D365 number sequence gaps, and vendor-specific API quirks that only surface with production data shapes. Your team continues their current workflow unchanged until integration accuracy is verified. For high-volume integrations (50,000+ records per batch), validation includes throughput testing under production load to confirm that batch windows complete within your operational requirements.
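The transaction-by-transaction reconciliation step boils down to a three-way classification: records missing from the target, records unexpectedly present in the target, and records present in both but different. A minimal sketch with illustrative field names:

```python
def reconcile(source_rows, target_rows, key="id"):
    """Compare two record sets by key and classify every discrepancy,
    so parallel-running validation reports exactly what diverged."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    return {
        "missing": sorted(src.keys() - tgt.keys()),       # in source only
        "unexpected": sorted(tgt.keys() - src.keys()),    # in target only
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),        # present but unequal
    }

report = reconcile(
    [{"id": "INV-1", "amt": 100}, {"id": "INV-2", "amt": 250}],
    [{"id": "INV-1", "amt": 100}, {"id": "INV-3", "amt": 99}],
)
# → missing ["INV-2"], unexpected ["INV-3"], mismatched []
```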

05

Production Cutover & Ongoing Monitoring (Ongoing)

After validation, we cut over to automated integration, configure monitoring dashboards in Azure Monitor or your preferred observability platform, set up alerting for failures and anomalies (processing time deviations, error rate spikes, volume drops that suggest upstream issues), and provide 30 days of hypercare support. Ongoing maintenance covers D365 version update compatibility testing (Microsoft releases major updates twice yearly and cumulative updates monthly), third-party API change monitoring, Azure Service Bus health management, performance tuning as transaction volumes grow, and emergency response for integration failures. Maintenance runs $750–$3,000/month per integration cluster depending on the number of connected systems, transaction volume, and SLA requirements.

Before vs After

Metric | With FreedomDev | Without
Batch Processing (50K+ records) | DMF package API: 500K records/batch, parallel processing | Power Automate: per-record API calls, 6K request limit per 5 min
Real-Time Sync Latency | Sub-second via Service Bus + webhooks | Power Automate: 1–15 minute polling intervals
Data Transformation Complexity | Custom C#/AL logic: entity splits, aggregations, conditional mappings | Power Automate expressions: limited, unmaintainable at scale
Error Handling | Dead letter queues, partial completion tracking, resumable batches, automated alerting | Flow failure email notification, manual rerun of entire flow
Legacy System Connectivity | AS/400, Progress, FoxPro, flat files, EDI, screen scraping, custom wrapper APIs | Only systems with existing Power Platform connectors
D365 Version Update Resilience | Custom APIs versioned independently, tested against preview releases | AppSource connectors break on updates, vendor fix timelines unknown
3-Year TCO (10+ integration points) | $80K–$200K total (build + maintenance) | Power Automate Premium: $15/user/mo + AppSource connectors + manual intervention labor
Monitoring & Observability | Azure Monitor dashboards, custom health checks, transaction-level audit logs | Power Automate run history (28-day retention, no custom metrics)

Ready to Solve This?

Schedule a direct technical consultation with our senior architects.

Explore More

ERP Implementation · API Integration · C# · Manufacturing · Distribution · Financial Services · Professional Services

Frequently Asked Questions

When should we use custom D365 integration instead of Power Automate?
Power Automate is the right tool for low-volume, simple integrations between D365 and modern SaaS platforms — syncing a few hundred records between Business Central and your CRM, posting notifications to Teams when invoices are approved, or triggering simple workflows from D365 business events. Move to custom integration when any of these conditions apply: your batch sizes exceed 5,000 records per sync cycle and you are hitting Dataverse API throttling limits; your data transformations require conditional logic beyond simple field mapping (entity splitting, multi-currency aggregation, chart of accounts translation); you need sub-second real-time sync rather than 5-15 minute polling; you need enterprise error handling with dead letter queues, partial completion tracking, and resumable batch processing; you are connecting to legacy systems that have no Power Platform connector; or your Power Automate licensing costs (Premium at $15/user/month plus AppSource connector fees) are approaching the cost of a custom build that you own outright. The breakeven point is typically 12-18 months. Companies processing more than 10,000 transactions per day across 3+ integration points almost always save money with custom integration within the first year.
What is the difference between OData endpoints and custom API pages in Business Central?
Business Central exposes OData v4 endpoints for every published page and query object. These endpoints work, but they are designed for ad-hoc data access, not for production integrations. Page-based OData endpoints expose every field on the page (including fields your integration does not need, adding payload overhead), break when someone modifies the page layout, carry UI metadata that inflates response sizes, and do not support custom computed fields or server-side transformation logic. Custom API pages, introduced in Business Central's modern development model, are AL objects designed specifically for integrations. They expose exactly the fields you define, support custom computed fields (so you can return aggregated or transformed data without post-processing), are versioned independently of page layouts (page changes never break your integration), support complex OData filtering and expansion, and can enforce server-side validation logic that rejects bad data before it enters Business Central. If you are building a one-off data export, standard OData endpoints are fine. If you are building a production integration that needs to run reliably for years, custom API pages are the only serious option. We build every Business Central integration against custom API pages as a baseline practice.
How do you handle high-volume batch processing in Finance & Operations?
Finance & Operations provides the Data Management Framework (DMF) specifically for high-volume batch operations, and it is dramatically more efficient than per-record OData calls. The DMF package API lets you submit a file (CSV, XML, or Excel) containing hundreds of thousands of records as a single batch operation. F&O processes the file server-side using its own data import/export engine, which handles validation, transformation, and posting in parallel across multiple threads. A 500,000-record inventory import that would take days via per-record API calls completes in minutes through DMF. We build DMF-based integrations using the recurring integration pattern: an Azure Function prepares the data package, submits it to F&O via the DMF package API, monitors the job status, and handles success/failure at the batch level (with record-level error reporting from F&O's execution log). For near-real-time scenarios where DMF's batch nature is too slow, we use OData-bound data entities with change tracking — F&O's change tracking provides delta queries so we only process records that have changed since the last sync, keeping per-request volume low enough to stay within API limits while maintaining close-to-real-time freshness.
What role does Azure Service Bus play in D365 integrations?
Azure Service Bus is a fully managed enterprise message broker that serves as the central nervous system for D365 integrations in multi-system environments. Instead of building point-to-point connections between D365 and every other system (which creates a brittle spaghetti architecture), Service Bus provides a message backbone that decouples producers from consumers. When a sales order is created in D365, a message is published to a Service Bus topic. Your WMS, your CRM, your invoicing system, and your BI warehouse each have their own subscription to that topic with independent filters — they each receive and process the event at their own pace, in their own way, without knowing about each other. If one consumer is down, its messages queue until it comes back. If a message fails processing, it moves to a dead letter queue for investigation instead of being lost. Service Bus handles duplicate detection (critical when D365 business events occasionally fire twice), sessions for ordered delivery (essential for financial postings where sequence matters), and scheduled delivery for messages that should not be processed until a specific time. For D365 specifically, Business Central can publish to Service Bus via Azure Functions triggered by webhooks or change notifications, and Finance & Operations has built-in support for publishing business events directly to Service Bus endpoints. We size Service Bus tiers based on message volume and throughput requirements — Basic (queues only, no topics) for simple point-to-point scenarios, Standard for most production workloads, and Premium for high-throughput environments that need dedicated capacity.
What are virtual entities and when do they make sense for D365 integration?
Virtual entities (also called virtual tables) let you expose external system data inside Dataverse as if it were a native Dataverse table — without actually copying the data into Dataverse storage. Users, Power Apps, Power BI reports, and D365 Customer Engagement apps can read and interact with the external data using standard Dataverse interfaces. The data is fetched from the source system in real time via a custom data provider that you build and register. Virtual entities make sense for reference data that lives in an external system and is read frequently but updated infrequently: product catalogs from a PIM system, customer data from an external CRM, inventory levels from a WMS, pricing from a CPQ tool. Instead of building a batch sync that copies this data into Dataverse every hour (and dealing with staleness, storage costs, and sync failures), virtual entities serve it live. They do not make sense for high-frequency transactional queries, for data that needs to participate in Dataverse-native features like business rules and workflows that require local data, or for scenarios where the external system's API latency would make the Dataverse UI feel slow. We build custom virtual entity data providers in C# that connect to your external systems, handle authentication, implement caching for performance (so every Dataverse grid refresh does not hit the external API), and support Dataverse's standard query operators (filtering, sorting, pagination) translated into the external system's query language.
How much does custom D365 integration cost compared to Power Automate?
The cost comparison depends on scope and timeline. For a simple Business Central integration with one external system at low volume (under 1,000 records per day, straightforward field mapping, no legacy systems), Power Automate with the D365 connector is cheaper: you are looking at $15/user/month for Power Automate Premium licensing and minimal development time to build the flows. Total first-year cost might be $5,000-$10,000 including setup. For the same scenario, custom integration runs $15,000-$25,000 upfront plus $750-$1,500/month in maintenance. Power Automate wins on simple scenarios. The calculus reverses at scale. A mid-market D365 F&O environment with 5-10 integration points, 50,000+ records per day, legacy system connections, and complex transformation logic will cost $80,000-$200,000 for custom integration (build plus first year of maintenance). The Power Automate alternative is not really an alternative at that scale — you would spend $30,000-$50,000 trying to build it in Power Automate, hit the throttling and transformation limits we described, then spend another $80,000-$200,000 rebuilding it custom anyway. We have done this exact rebuild for three D365 clients in the past two years. If you know your integration requirements are complex, building custom from the start saves the cost of the failed Power Automate attempt. We scope every integration individually with a fixed-price estimate after the discovery phase, so you know exactly what you are paying before development starts.

Stop Working For Your Software

Make your software work for you. Let's build a sensible solution.