Browse All Recipes (59)

Aggregate and append large public data sources to an Excel file
To use this recipe, upload the Chicago crimes file from data.gov into the Local File connector for your Environment. For more information on the calculations here, see: https://blog.godatadriven.com/impala-haversine.html
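The linked article implements the haversine formula in SQL; a minimal Spark SQL sketch of the same distance calculation, assuming a staged table named `crimes` with `latitude` and `longitude` columns (names hypothetical), might look like this:

```sql
-- Haversine distance in km from downtown Chicago (41.8781, -87.6298).
-- The `crimes` table and its column names are illustrative assumptions.
SELECT latitude,
       longitude,
       2 * 6371 * ASIN(SQRT(
         POW(SIN(RADIANS(latitude - 41.8781) / 2), 2) +
         COS(RADIANS(41.8781)) * COS(RADIANS(latitude)) *
         POW(SIN(RADIANS(longitude + 87.6298) / 2), 2)
       )) AS distance_km
FROM crimes
```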
Automatic Union of Multiple JSON files
This recipe dynamically unions two or more like JSON files together into a single dataset for transformation and aggregation. To run this recipe, you need to configure the S3 connector (in the environment and the recipe), update the list of files in the recipe, and click "Run".
CSV File Reader
To use this recipe, upload the file into the Local File connector for your Environment.
CSV to Excel File Conversion
This recipe converts a CSV file to Excel. To use this recipe, upload the file into the Local File connector for your Environment.
CSV to JSON File Conversion
This recipe converts a CSV file to JSON. To use this recipe, upload the file into the Local File connector for your Environment.
Caliper Event Data Processor
Use a static JSON Caliper event with nested JSON-LD and flatten the data for additional processing. This recipe can be converted to a streaming recipe by changing the type of the connector to `inputevent`.
Canvas REST API Reader - Courses and Enrollments
Read courses and enrollments from the Canvas REST API.
Canvas REST API Reader - Retrieve Last Modules in Courses
Retrieve the last module in each course using Spark SQL windowing. With this data you can retrieve all course completion data for any enrollment.
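One way to express "last module per course" with Spark SQL windowing is a ROW_NUMBER() query; the table and column names below are illustrative, not the recipe's actual schema:

```sql
-- Rank modules within each course by position, keep only the last one.
SELECT course_id, module_id, position
FROM (
  SELECT course_id, module_id, position,
         ROW_NUMBER() OVER (PARTITION BY course_id
                            ORDER BY position DESC) AS rn
  FROM modules
) ranked
WHERE rn = 1
```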
Canvas REST API Writer
Create an account_notification using the Canvas REST API. This example can be extended for any Canvas JSON pattern. The JSON can be dynamically built using struct() and named_struct() functions.
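For example, a nested payload can be assembled with named_struct() and serialized with to_json(); the field names below are illustrative, not the exact account_notification schema:

```sql
-- Build a nested JSON body from flat columns, then serialize it.
SELECT to_json(named_struct(
  'account_notification', named_struct(
    'subject',  subject,
    'message',  message,
    'start_at', start_at,
    'end_at',   end_at
  )
)) AS payload
FROM notifications
```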
Canvas Users to Salesforce Contacts Writer
This recipe reads all the Canvas users and upserts them as contacts to Salesforce. This pattern can be extended to any Canvas endpoint or Salesforce object. Lingk handles all the details of the integration; you just worry about the data.
Convert Nested Array to Table
This recipe demonstrates two ways to access nested array data using LingkQL with Spark SQL functions.
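The two usual Spark SQL approaches, sketched against a hypothetical `orders` table whose `items` column is an array of structs:

```sql
-- Option 1: index into the array and use dot notation on the struct.
SELECT order_id, items[0].sku AS first_sku
FROM orders;

-- Option 2: explode the array into one row per element.
SELECT order_id, item.sku, item.qty
FROM orders
LATERAL VIEW explode(items) exploded AS item;
```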
Create New Users in Canvas
This recipe creates Canvas users based on a data source. It will not update users, only create new ones. To run this recipe, set up the Canvas connector in your environment, choose your Environment, and click "Run".
Data Diff Simplified
Performs a data diff on JSON structures in the JSON connector. It will work with ANY connector, though, because all outputs are the same for Lingk!
Data diff deltas on full datasets to event driven webhook subscriptions
This recipe does a data diff on JSON structures in the JSON connector (though it will work with ANY connector because all outputs are the same for Lingk). To use this recipe, just click Run! To see changes, delete, modify, and add records, then run again to see the output. You will need to add an Object under Events and update the "signal" table in the recipe.
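Lingk handles the diff itself; conceptually, the delta between two runs can be expressed in plain Spark SQL with EXCEPT, assuming two like-shaped snapshot tables (names hypothetical):

```sql
-- Rows added or changed since the previous run:
SELECT * FROM current_snapshot
EXCEPT
SELECT * FROM previous_snapshot;

-- Rows deleted (or their pre-change versions):
SELECT * FROM previous_snapshot
EXCEPT
SELECT * FROM current_snapshot;
```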
Date Formatting Examples
This recipe parses some static dates and shows how to convert them into other formats. To use this recipe, click Run!
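A representative pair of conversions in Spark SQL (the input string is an arbitrary example, not one of the recipe's static dates):

```sql
-- Parse a string into a date, then render it in other formats.
SELECT to_date('01/15/2020', 'MM/dd/yyyy')                              AS parsed,
       date_format(to_date('01/15/2020', 'MM/dd/yyyy'), 'yyyy-MM-dd')   AS iso,
       date_format(to_date('01/15/2020', 'MM/dd/yyyy'), 'MMMM d, yyyy') AS long_form
```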
Date formatting with Spark and Jinja
This recipe demonstrates using Spark and Jinja (Jinjava) expressions. To run this recipe, click the "Run" and "Start Recipe" buttons.
Ellucian Ethos Join
Join Ellucian Ethos APIs and create a flattened table
Ellucian Ethos Reader
A simple Ellucian Ethos data reader
Ellucian Ethos to SFTP
Reads and joins data from multiple Ellucian Ethos APIs and writes the data to SFTP
Ethos Writer
Write a payload to the Ethos APIs
Ethos to Salesforce Writer
Pulls academic programs from Ellucian Ethos and upserts them into the Salesforce HEDA Academic Program Account object.
Excel to CSV File Conversion
This recipe converts an Excel file to CSV. To use this recipe, upload the file into the Local File connector for your Environment.
Excel to JSON Conversion
This recipe converts an Excel file to JSON. To use this recipe, upload the file into the Local File connector for your Environment.
Excel to Postgres (Amazon RDS) Writer
This recipe reads data from an Excel spreadsheet and writes to a Postgres database on Amazon RDS. To run this recipe, configure an environment with a Local File (upload an Excel doc) and the Amazon RDS connector.
Fuzzy Matching with Soundex and Levenshtein
This recipe demonstrates the use of fuzzy matching in Spark with Soundex and Levenshtein distance. The Soundex algorithm is often used to compare first names that are spelled differently. You might want to use the Levenshtein distance when joining two DataFrames if you don't want to require exact string matches. Minimizing false positives in fuzzy joins is always a struggle, so test thoroughly and join on multiple columns to improve results. To run this recipe, choose your environment and click Run!
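Both functions are built into Spark SQL; a join sketch over hypothetical `people_a` and `people_b` tables:

```sql
-- Soundex matches phonetically similar first names; a small
-- Levenshtein bound on last name helps limit false positives.
SELECT a.first_name, a.last_name, b.first_name, b.last_name
FROM people_a a
JOIN people_b b
  ON soundex(a.first_name) = soundex(b.first_name)
 AND levenshtein(a.last_name, b.last_name) <= 2
```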
Google PubSub Writer
Writes a JSON array as multiple messages to a Google PubSub topic.
Inserting JSON into S3
This recipe demonstrates outputting different JSON structures to Amazon S3. To use this recipe, just click "Run".
JSON (S3 hosted) to Excel File Conversion
This recipe converts JSON (nested or flat) to an Excel file. To use this recipe, configure the S3 connector in your environment and upload the file into your S3 bucket. Then download the file from the Local File area for your environment.
JSON API to SOAP APIs and SOAP XML processing
This recipe dynamically populates values into a SOAP XML payload, submits the request, and processes the result. Jinja templates can be used to dynamically build the XML elements based on data from another connector (i.e., an API request). Credentials can be stored in a CSV in the Local File connector to dynamically populate them and keep the recipe safe across environments. To use this recipe, click Run!
Join JSON and database data to output an Excel Spreadsheet
This recipe joins JSON and Postgres database data together to create an Excel spreadsheet. To run this recipe, upload rows_100k.json1 to your Local File connection and configure the Amazon RDS connector. Update the SQL statements and click Run.
Join Multiple Excel Spreadsheets and Output Excel
This recipe joins and aggregates multiple Excel worksheets / spreadsheets and outputs a single Excel spreadsheet. To use this recipe, upload the file(s) into the Local File connector for your Environment.
Join Salesforce data for better Excel reporting or CSV data feeds
To use this recipe, connect to Salesforce in your environment, update the salesforceReaders connectors, and click Run.
Join data between two Google Spreadsheets
Read data from multiple Google spreadsheets and join them with SQL.
Lingk Adapter Reader (for Peoplesoft, on-prem DBs, etc.)
Read data from on-premise databases (like Peoplesoft, Banner, Oracle, MS-SQL, Postgres, and more)
Lingk Adapter to Moodle User Sync
This recipe reads user data from the Lingk Adapter (powered by Apache Nifi) and upserts student records into Moodle using the Lingk Moodle Connect plugin (a REST service layer for Moodle). To use this recipe: 1. Install Apache Nifi on a server, add the Lingk Adapter extensions, and configure them. 2. Install the Moodle Connect plugin on your Moodle instance (see Lingk for access to the plugin). 3. Configure the Lingk Adapter and Moodle in your Lingk environment. 4. Run or schedule the recipe.
Lingk Event APIs to Google PubSub Events
This recipe takes event data passed from external systems (Apache Nifi, iPaaS, ESBs) and processes the data using Apache Spark. It can be converted from batch to streaming processing by changing the "json" type connector to "inputEvent".
Microsoft Dynamics CRM to SFTP
To use this recipe, enter your Dynamics credentials inline in the recipe and configure an SFTP connector in your environment.
Nested Object Schemas
Process data with nested objects easily by applying a schema to connectors. Schemas are necessary for selecting *Null* columns in staged, in-memory tables.
Print the Schema of a Connector or Table
This recipe converts data from a connector to a schema. Schemas are important because they enable you to use NULL columns in queries without errors.
Public Location Data Set Browser with Distance Calculation
This recipe will filter public crime data and output distance from downtown Chicago (lat/lon). To use this recipe, upload the Chicago crimes file from data.gov into the Local File connector for your Environment.
SFTP File Management
Read a file from an SFTP site, process the data, and then overwrite the file with new data.
SFTP to Microsoft Dynamics 365 CRM
Reads a file from SFTP and writes to the MS Dynamics 365 OData APIs
SFTP to Moodle (Upsert users)
Read a file from SFTP, transform the data and upsert the values into Moodle courses using the Lingk Connect Moodle plugin REST APIs.
SFTP to Salesforce
Read a CSV file from SFTP, transform the data, and write the data to Salesforce using the bulk API.
Salesforce Data Cleanup
Easily delete Salesforce data immediately after a SOQL query
Salesforce HEDA to Canvas Course Writer
This recipe reads data from a Salesforce HEDA course object and writes to a Canvas course. To run this recipe, configure the Salesforce and Canvas connectors in your environment. Then click "Run" and "Start Recipe".
Salesforce Org to Org Migration with Record Types
This recipe interacts with two orgs with different data models and record types. To use this recipe: 1. Add one org to the environment. 2. Add the second org to the inline credentials.
Salesforce Power Reader (a better Workbench)
Use SOQL to query any object in Salesforce, transform the data and write it back to Salesforce or SFTP.
Salesforce Recipe with Multiple Orgs
This recipe demonstrates using credentials from the environment for one organization and adding inline credentials for the second organization. This example demonstrates Salesforce reader connectors, but the same approach will work with Salesforce writers as well.
Salesforce to Colleague by Ellucian (Lingk Adapter - Unidata)
This recipe reads data from Salesforce and writes it to Ellucian Colleague (via Apache Nifi with Lingk Extensions through Subroutines)
Salesforce to Ethos Writer
Pulls academic programs from Salesforce HEDA Account objects and upserts them into the Ethos Academic Levels object. To run, update the Salesforce SOQL query to point to your academic levels.
Simple Lingk Event API Subscription Tester
This recipe demonstrates a mock event payload for a recipe subscription. To use this recipe, select your environment, run the recipe, and paste the sample event payload into the Event window.
Snowflake - Read Tables and Write CSV Data
This recipe reads from and writes to the Snowflake data warehouse. To use this recipe, add a CSV file to your Local File connector for environment variables and update the related Snowflake properties. Then click "Run"!
Transformation Examples
Test various transformation examples. This recipe requires no external connectors.
Trigger Lingk Events through a Recipe
This recipe writes an event payload to a Lingk event that can trigger other recipes or publish the data to webhooks for Zapier and other apps. To use this recipe, create an object in the Lingk app under Events, subscribe it to a recipe or webhook, update the object name in the recipe, and click Run!
Union Multiple Excel Worksheets
This recipe unions the data from multiple worksheets into a single in-memory database to query. To use this recipe, upload the file into the Local File connector for your Environment.
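Once each worksheet is staged as its own table, the union itself is a plain UNION ALL (worksheet table names hypothetical):

```sql
-- Stack like-shaped worksheets into one queryable dataset.
SELECT * FROM sheet_january
UNION ALL
SELECT * FROM sheet_february
UNION ALL
SELECT * FROM sheet_march
```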
Upsert Excel Data to Salesforce Contacts
This recipe will upsert one record from an Excel document into a Salesforce Contact object. To use this recipe, upload the file into the Local File connector for your Environment and connect a Salesforce instance.
Using Jinja-like Template Expressions in Connectors and Statements
This recipe demonstrates using Jinja-style templates and the variable collections available at different stages of a recipe execution. To use this recipe, click Run!
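As a sketch, template expressions are resolved before a statement executes, so a variable can be spliced into SQL like this (the `startDate` variable is hypothetical):

```sql
-- The {{ }} expression is resolved by the template engine
-- before the query runs.
SELECT *
FROM enrollments
WHERE updated_at >= '{{ startDate }}'
```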
Using SQL to create complex JSON structures
This recipe demonstrates creating several types of JSON structures. To run this recipe, choose your Environment and click "Run". For more examples, go to: https://docs.databricks.com/spark/latest/spark-sql/complex-types.html
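A few of the building blocks from the linked examples, combined in one Spark SQL sketch (table and column names hypothetical):

```sql
-- named_struct() builds nested objects, array() builds lists,
-- and to_json() serializes the result to a JSON string.
SELECT to_json(named_struct(
  'id',   id,
  'name', name,
  'phones', array(
    named_struct('type', 'home', 'number', home_phone),
    named_struct('type', 'work', 'number', work_phone)
  )
)) AS contact_json
FROM contacts
```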