The Reference You Need
Spark Scala Examples
Simple Spark Scala examples to help you quickly complete your data ETL pipelines. Save time digging through the Spark Scala function API and instead get right to the code you need...
-
SBT Assembly Jar Naming in Shell Scripting
When using sbt in Spark Scala projects it can be useful to access the name of the assembly jar that sbt will create from within your bash or zsh shell scripts. Thankfully, extracting the value is rather straightforward.
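A minimal build.sbt sketch of the idea, assuming the sbt-assembly plugin is enabled; the project name and version shown are hypothetical:

```scala
// build.sbt -- a minimal sketch, assuming sbt-assembly is enabled in
// project/plugins.sbt; the name and version here are hypothetical.
ThisBuild / version      := "1.0.0"
ThisBuild / scalaVersion := "2.12.18"

lazy val root = (project in file("."))
  .settings(
    name := "my-spark-etl",
    // pin the jar name so shell scripts can predict it:
    //   my-spark-etl-assembly-1.0.0.jar
    assembly / assemblyJarName := s"${name.value}-assembly-${version.value}.jar"
  )

// a bash/zsh script can also ask sbt for the value directly:
//   sbt -batch "show assembly / assemblyJarName"
```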
-
Building MapType Columns in Spark Scala DataFrames for Enhanced Data Structuring
Using a MapType in Spark Scala DataFrames can be helpful as it provides a flexible logical structure for solving problems such as machine learning feature engineering, data exploration, serialization, enriching data and denormalization. Thankfully, building these map columns in Spark Scala is very straightforward from existing data within a DataFrame.
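As a quick sketch of the technique, using the map function from org.apache.spark.sql.functions on a DataFrame with made-up column names:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lit, map}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// sample rows; the column names are made up for illustration
val df = Seq(("alice", 34, "NYC"), ("bob", 28, "LA")).toDF("name", "age", "city")

// map() takes alternating key and value columns; all values in a map
// must share one type, so age is cast to string to sit beside city
val withMap = df.withColumn(
  "attributes",
  map(lit("age"), col("age").cast("string"), lit("city"), col("city"))
)
withMap.show(truncate = false)
```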
-
Converting a Map Type to a JSON String in Spark Scala
Using a MapType in Spark Scala DataFrames provides more flexible logical structures, supports hierarchical data and, of course, makes it possible to work with arbitrary data attributes.
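A small sketch of the conversion with to_json, on hypothetical sample data:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.to_json

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// a MapType column built directly from Scala Maps; names are illustrative
val df = Seq(
  ("alice", Map("city" -> "NYC", "tier" -> "gold")),
  ("bob",   Map("city" -> "LA"))
).toDF("name", "attributes")

// to_json serializes the MapType column into a JSON string column
df.withColumn("attributes_json", to_json($"attributes")).show(truncate = false)
```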
-
Spark Scala isin Function Examples
The isin function is defined on a Spark column and is used to filter rows in a DataFrame or Dataset.
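For example, filtering a small DataFrame with made-up values:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("alice", "NYC"), ("bob", "LA"), ("carol", "CHI")).toDF("name", "city")

// keep only rows whose city appears in the given list
df.filter($"city".isin("NYC", "LA")).show()

// prefix with ! to exclude the listed values instead
df.filter(!$"city".isin("NYC", "LA")).show()
```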
-
Hashing Functions, Spark Scala SQL API Function
Hash functions serve many purposes in data engineering. They can be used to check the integrity of data, help with deduplication, support cryptographic use cases for security, improve efficiency when balancing partition sizes in large data operations, and much more.
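A short sketch of three of the built-in hash functions, on hypothetical data:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, hash, md5, sha2}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq("alpha", "beta", "gamma").toDF("value")

df.select(
  col("value"),
  hash(col("value")).as("murmur3"),        // 32-bit Murmur3, handy for partitioning
  md5(col("value")).as("md5_hex"),         // 128-bit MD5 as a hex string
  sha2(col("value"), 256).as("sha256_hex") // SHA-2 family, 256-bit variant
).show(truncate = false)
```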
-
When and Otherwise in Spark Scala -- Examples
The when function in Spark implements conditionals within your DataFrame-based ETL pipelines. It allows you to perform fall-through logic and create new columns with values based upon conditional logic.
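A minimal sketch, bucketing a hypothetical age column:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, when}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("alice", 34), ("bob", 17), ("carol", 67)).toDF("name", "age")

// chained when clauses evaluate top to bottom; otherwise is the fall-through default
df.withColumn(
  "age_group",
  when(col("age") < 18, "minor")
    .when(col("age") < 65, "adult")
    .otherwise("senior")
).show()
```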
-
Coalesce in Spark Scala for Data Cleaning
The coalesce function returns the first non-null value from a list of columns. It's a common data-cleaning technique when you have multiple candidate values and want to prioritize the first available one.
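A quick sketch with hypothetical contact columns, falling back to a literal default:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{coalesce, col, lit}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// contact data with gaps spread across columns; the names are made up
val df = Seq(
  (Some("555-1111"), None,             Some("555-9999")),
  (None,             Some("555-2222"), None),
  (None,             None,             None)
).toDF("mobile", "home", "work")

// pick the first non-null phone number, defaulting to a literal
df.withColumn(
  "best_phone",
  coalesce(col("mobile"), col("home"), col("work"), lit("unknown"))
).show()
```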
-
Format Numbers in Spark Scala For Humans
The format_number function in Spark Scala is useful when you want to present numerical data in a human-readable and standardized format. This improves the clarity of numeric values by adding commas as thousands separators and controlling the number of decimal places.
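For instance, on a hypothetical amount column:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, format_number}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(1234567.8912, 1000.5, 42.0).toDF("amount")

// format_number(col, d) rounds to d decimal places and adds thousands separators
df.select(col("amount"), format_number(col("amount"), 2).as("amount_pretty")).show()
```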
-
Data Transformation and Data Extraction with Spark's regexp_replace Function
Regular expression matching and replacement is a commonly used tool within data ETL pipelines to transform and clean your string data and to extract more structured information from it.
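A small sketch normalizing messy phone strings, with made-up sample data:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, regexp_replace}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq("(555) 123-4567", "555.987.6543").toDF("phone_raw")

// strip every non-digit character to normalize the phone numbers
df.withColumn("phone_digits", regexp_replace(col("phone_raw"), "[^0-9]", ""))
  .show(truncate = false)
```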
-
Concatenate Columns in Spark Scala | Guide to Using concat and concat_ws Functions
The concat function in Spark Scala takes multiple columns as input and returns a concatenated version of all of them. When any column in the list of columns to concatenate is null, the result is null; concat_ws, by contrast, skips null values and joins the rest with a separator.
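A brief sketch showing the null behavior of both functions side by side, on hypothetical name columns:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, concat, concat_ws}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("alice", Some("marie"), "smith"), ("bob", None, "jones"))
  .toDF("first", "middle", "last")

df.select(
  // concat yields null for bob because his middle name is null
  concat(col("first"), col("middle"), col("last")).as("concat_all"),
  // concat_ws skips nulls and joins the rest with the separator
  concat_ws(" ", col("first"), col("middle"), col("last")).as("concat_ws_all")
).show(truncate = false)
```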