Run High-Volume Models with Coherent’s Batch APIs

When insurance models grow into the millions of records, spreadsheets and scripts fail for the same reason—single-record execution doesn’t scale.

Coherent's Batch APIs let insurers run Excel-based models across massive datasets in parallel, without rewriting logic or managing infrastructure.

Insurers don’t struggle with modeling sophistication.
They struggle with volume.

As datasets expand—from thousands of policies to tens of millions of records—calculation workflows break down:

  • Spreadsheet recalculations grind to a halt
  • Python scripts time out or require heavy orchestration
  • Batch jobs become brittle, manual, and slow to rerun

When pricing accuracy, reserving assumptions, or scenario testing depends on speed, these delays aren’t just inconvenient—they’re business-limiting.

What if you didn’t have to choose between ease and performance?

The Real Scaling Problem in Insurance Modeling

Most actuarial and pricing logic already exists. The challenge isn’t what to calculate—it’s how often and how much.

Common breaking points look like this:

  • Mortality or lapse assumptions need recalibration across millions of records
  • Scenario testing requires repeated runs with updated parameters
  • Rate dislocation analysis must process entire books, not samples

Traditional execution models—row-by-row spreadsheet processing or sequential scripts—can’t keep up.

Scaling requires batch execution.

What Batch APIs Actually Change

Batch APIs allow insurers to submit large volumes of records at once, execute calculations in parallel, and receive structured results asynchronously.

Instead of processing records individually, Coherent executes models across distributed compute resources—automatically.

One insurer used Coherent Spark’s Batch APIs to process 30 million records in under an hour. That scale enables:

  • Rapid assumption updates
  • Large-scale scenario testing
  • Full-book recalculations without operational lag

Batch processing turns what used to be overnight jobs into near-real-time workflows.
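The core pattern behind that speedup, splitting a large record set into chunks and running the chunks concurrently, can be sketched in plain Python. This is a generic illustration only, not Coherent's SDK: `run_model` is a hypothetical stand-in for the Excel logic Spark would execute server-side.

```python
from concurrent.futures import ThreadPoolExecutor

def run_model(record):
    # Hypothetical stand-in for a hosted model call; here a trivial
    # premium-style calculation so the example is self-contained.
    return {"id": record["id"], "premium": record["exposure"] * record["rate"]}

def chunked(items, size):
    """Split a record list into fixed-size chunks for batch submission."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def run_batch(records, chunk_size=1000, workers=8):
    """Process chunks concurrently instead of one record at a time."""
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for chunk_results in pool.map(
            lambda chunk: [run_model(r) for r in chunk],
            chunked(records, chunk_size),
        ):
            results.extend(chunk_results)
    return results

records = [{"id": i, "exposure": 100.0, "rate": 0.02} for i in range(5000)]
results = run_batch(records)
```

Because `pool.map` preserves input order, the results come back aligned with the submitted records, which matters when writing them back to policy-level outputs.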

How Coherent's Batch APIs Work

Coherent makes batch execution practical without forcing teams to re-architect their models.

Excel Logic, Executed at Scale

Pricing and actuarial logic stays in Excel. Spark executes that logic across large datasets using cloud-native parallel processing.

Python SDK for Submission and Control

Using the Coherent Spark Python SDK, teams submit batch jobs with minimal code. No orchestration layers. No custom infrastructure.
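The post doesn't show the SDK's call signatures, so the snippet below is a hypothetical sketch of the shape a minimal submission might take. Every name here (`BatchClient`, `submit`, the service path) is an illustrative stand-in, not the real Coherent Spark SDK surface, and the stub computes locally so the example runs on its own.

```python
class BatchClient:
    """Stub standing in for an SDK client that submits records to a
    hosted model and returns structured results."""

    def __init__(self, service_uri):
        self.service_uri = service_uri

    def submit(self, records):
        # A real SDK would POST the records and poll for completion;
        # this stub doubles a value locally to keep the sketch runnable.
        return [{"id": r["id"], "output": r["value"] * 2} for r in records]

client = BatchClient("folders/pricing/services/term-life")
results = client.submit([{"id": i, "value": i} for i in range(1000)])
```

The point of the sketch is the footprint: one client, one call, a list of records in, structured results out, with no orchestration code around it.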

Automatic Scaling

Spark dynamically allocates compute resources based on workload size—whether that’s thousands or millions of records.
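Workload-proportional allocation can be illustrated with a small sizing function. This is a simplified sketch of the idea, not Spark's actual scheduler; the thresholds (`records_per_worker`, `max_workers`) are assumed values for illustration.

```python
import math

def plan_workers(record_count, records_per_worker=10_000, max_workers=64):
    """Pick a worker count proportional to workload size, within a cap.
    Cloud-native schedulers apply the same idea to containers or nodes."""
    needed = math.ceil(record_count / records_per_worker)
    return max(1, min(needed, max_workers))

small = plan_workers(3_000)        # small job: a single worker suffices
mid = plan_workers(250_000)        # mid-size job: scales up proportionally
full_book = plan_workers(30_000_000)  # full book: capped at the maximum
```

The shape is what matters: small runs don't pay for idle capacity, and full-book runs fan out to the configured ceiling automatically.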

Parallel Execution, Faster Results

Jobs run concurrently, dramatically reducing execution time and eliminating long-running sequential scripts.

Centralized, Repeatable Runs

Batch jobs can be rerun consistently as assumptions change, without manual intervention or file manipulation.
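One way to picture repeatability is to treat a batch run as a pure function of its inputs: the same records and the same assumptions always produce the same results, so an assumption update is just a rerun with new parameters. The sketch below is illustrative only; `lapse_rate` is an assumed parameter name.

```python
def run_batch(records, assumptions):
    """A batch run as a pure function of (records, assumptions):
    identical inputs always yield identical results, so reruns are
    repeatable with no manual file manipulation."""
    rate = assumptions["lapse_rate"]
    return [{"id": r["id"], "lapses": r["policies"] * rate} for r in records]

records = [{"id": 1, "policies": 1000}, {"id": 2, "policies": 500}]

baseline = run_batch(records, {"lapse_rate": 0.05})
updated = run_batch(records, {"lapse_rate": 0.06})  # assumption change = rerun, same code
```

Versioning the assumption set alongside the run is what makes results auditable: the baseline can be reproduced exactly at any later date.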

Batch APIs in Action

In this short demo, Ralph Florent from Coherent’s field engineering team shows how to submit over 1,000 records simultaneously using the Spark Python SDK.

With just a few lines of code, Ralph demonstrates:

  • Submitting batch jobs
  • Running calculations in parallel
  • Receiving structured results in seconds

No waiting on recalculations. No brittle scripts. Just scalable execution.

What This Unlocks for Insurers

Imagine an insurer whose mortality calculations run in spreadsheets: each pass takes hours, and decisions wait on the results.

With Coherent's Batch APIs, the same runs complete in minutes. Actuarial teams focus on analysis, not execution, and scenario testing becomes routine rather than exceptional.

High-volume insurance modeling doesn’t require abandoning Excel—or over-engineering Python pipelines.

With Coherent Spark’s Batch APIs, insurers get:

  • The flexibility of spreadsheet logic
  • The performance of parallel cloud execution
  • The repeatability required for enterprise scale

If your models are sound but your execution can’t keep up, Batch APIs change the equation.

Curious how Batch APIs could accelerate your workflows?