-
Scala 3 and Spark: Where Things Stand in 2026
Apache Spark still ships exclusively for Scala 2.13, and official Scala 3 support has no target release. But a practical workaround exists today using Scala 3's forward-compatibility mode. Here's what works, what doesn't, and whether your team should try it.
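A minimal sketch of that workaround in sbt, assuming a Scala 3 build pulling the Scala 2.13 Spark artifacts via `CrossVersion.for3Use2_13` (the standard sbt mechanism for this; version numbers here are illustrative):

```scala
// build.sbt — Scala 3 project consuming Spark's Scala 2.13 artifacts
scalaVersion := "3.3.4" // any 3.x release; illustrative

libraryDependencies += ("org.apache.spark" %% "spark-sql" % "4.0.0")
  .cross(CrossVersion.for3Use2_13) // resolves spark-sql_2.13 instead of spark-sql_3
```

The usual caveat applies: once one dependency is pinned to `_2.13`, you must not also pull the `_3` artifact of the same library onto the classpath, or you get conflicting binary versions.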
-
Spark Declarative Pipelines: First Look from a Scala Dev
Spark 4.1 introduces Spark Declarative Pipelines (SDP) — a framework for building managed ETL pipelines where you declare datasets and Spark handles the rest. The catch for Scala developers: authoring is Python and SQL only, with no JVM support yet.
-
Structured Streaming Real-Time Mode: A Practical Look
Spark 4.1 ships a new real-time execution mode for Structured Streaming that breaks the microbatch barrier, targeting single-digit millisecond p99 latency for stateless queries. Here's what it does, how it works, and when you should use it.
-
Delta Lake 4.0: What Scala Engineers Need to Know
Delta Lake 4.0 shipped in September 2025, requiring Apache Spark 4.0 and dropping Scala 2.12 entirely. Here's what changed, what's new, and what it means for your sbt builds and production pipelines.
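For sbt builds, that combination looks roughly like this (coordinates use the `delta-spark` artifact name Delta has published since 3.0; exact patch versions are the reader's to pin):

```scala
// build.sbt — Delta Lake 4.0 requires Spark 4.0, and Scala 2.12 is gone
scalaVersion := "2.13.16" // illustrative 2.13 patch version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"   % "4.0.0" % Provided,
  "io.delta"         %% "delta-spark" % "4.0.0"
)
```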
-
Spark 4.2 Preview: What's Coming Next
The Apache Spark community has published three Spark 4.2.0 preview releases — the most recent on March 12, 2026 — for community testing ahead of the stable release. Here's what the preview docs and JIRA tell us about what's coming, and how to try it today.
-
The State of Spark Scala in 2026
PySpark has dominated the conversation for years, but the Spark 4.x release cycle is sending a clear signal: the JVM ecosystem isn't being abandoned. Here's an honest look at where Scala stands, where Python leads, and why both are here to stay.
-
Spark 4.1 Release Highlights for Scala Developers
Spark 4.1 lands with 1,800+ resolved tickets from 230+ contributors and two headline features: Structured Streaming Real-Time Mode for sub-second latency and Spark Declarative Pipelines for managed ETL pipelines. Here's what matters if you're writing or maintaining Spark Scala applications.
-
Upgrading from Spark 3.x to Spark 4.0: A Practical Guide
Spark 4.0 brings real breaking changes that will likely affect your existing Scala pipelines — ANSI mode on by default, Scala 2.12 dropped, JDK 17 required, and infrastructure changes to shuffle and event logging. This guide walks through each one with before/after context and the config knob to fall back on if you need time to migrate.
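As one example of those fallback knobs, ANSI mode can be switched off per session while you migrate. A sketch, assuming a Spark 4.0 classpath; `spark.sql.ansi.enabled` is the real config key, the rest is illustrative:

```scala
import org.apache.spark.sql.SparkSession

// Temporarily opt out of ANSI semantics to restore pre-4.0 behavior
// (e.g. invalid casts return NULL instead of throwing).
val spark = SparkSession
  .builder()
  .appName("migration-shim")
  .config("spark.sql.ansi.enabled", "false")
  .getOrCreate()

// With ANSI off this yields NULL; with ANSI on (the 4.0 default) it fails.
spark.sql("SELECT CAST('not a number' AS INT)").show()
```

Treat the flag as a bridge, not a destination: it buys time to fix the pipelines that rely on the old silent-null behavior.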
-
What's New in Spark 4.0 for Scala Developers
Spark 4.0 is the biggest release in years — over 5,100 resolved tickets from 390+ contributors. Here's what matters most if you're writing or maintaining Spark Scala applications.
-
Spark is Like a Sledgehammer