Browse All Recipes (79)
Aeries Sections and Programs SIS Reader to Salesforce
This recipe reads section and program data from the Aeries SIS (K-12). You need the Aeries-Cert for the API calls; you can use the demo cert from the Aeries documentation. While the credentials are inline, you can upload a config file to the environment and read the values from it to make the recipe portable across environments. To use this with a Salesforce connector, add a Salesforce connector to an environment, uncomment the Salesforce statements, and run this recipe under that environment.
Amazon RDS (PostgreSQL) to Google Spreadsheet Writer
This recipe enables you to read data from a PostgreSQL database and write the data to a Google spreadsheet. To get started with this recipe, configure the Amazon RDS and Google Spreadsheet connectors in environments. Then, select your environment and click "Go".
Amazon S3 Lambda File Trigger to Ethos Person Import
This recipe processes a file on S3 from a Lambda file-upload trigger and imports the data into the Ethos Persons APIs.
To use this recipe:
1. Edit the Lambda function.
2. Import it into AWS.
3. Associate it with an S3 bucket.
4. Ethos credentials: set up the Ethos connector environment AND upload a separate credential file for use by the HTTP connector.
Download and edit the accompanying Lambda function:
Automatic Union of Multiple JSON files
This recipe dynamically unions two or more like JSON files on Amazon S3 into a single dataset for transformation and aggregation. To run this recipe, configure the S3 connector (in the environment and the recipe), update the list of files in the recipe, and click "Run".
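The union itself is simple to reason about: every file holds an array of like-shaped records, and the output is their concatenation. A minimal Python sketch of the same idea, using local temp files in place of S3 (the file contents here are made up for illustration):

```python
import json
import os
import tempfile

# Write two sample JSON files with the same record shape.
records_a = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
records_b = [{"id": 3, "name": "Alan"}]

paths = []
for recs in (records_a, records_b):
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(recs, f)
    paths.append(path)

def union_json_files(paths):
    """Concatenate like-shaped JSON arrays into one dataset."""
    combined = []
    for p in paths:
        with open(p) as f:
            combined.extend(json.load(f))
    return combined

dataset = union_json_files(paths)

for p in paths:
    os.remove(p)
```

The recipe does the equivalent over an S3 file list, then hands the unioned dataset to SQL for transformation.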
Blackboard Learn REST API Course Reader
This recipe reads all the courses from a Blackboard Learn instance through the Blackboard REST API. Paging of data and security is automatically handled.
To run this recipe, configure the Blackboard connector in your environment, select your environment, and click "Run".
Blackboard Learn REST API Courses to SFTP file
This recipe reads all the courses from a Blackboard Learn instance through the Blackboard REST API and writes them to a file on an SFTP site. Paging of data and security is automatically handled. To run this recipe, configure the Blackboard and SFTP connector in your environment, select your environment, and click "Run".
Blackboard Users to Salesforce Contacts Writer
This recipe reads all the Blackboard Learn users and upserts them as contacts to Salesforce. This pattern can be extended to any Blackboard endpoint or Salesforce object. Lingk handles all the details of the integration, you just worry about the data.
To get started: (a) Add a Blackboard connector and (b) add a Salesforce connector to an environment. Select the environment and run. Note: You can comment out the INSERT and UPDATE statements for a "safe" run.
CSV File Reader
To use this recipe, upload the file into the Local File connector for your Environment.
CSV to Excel File Conversion
This recipe converts a CSV file to Excel. To use this recipe, upload the file into the Local File connector for your Environment.
CSV to JSON File Conversion
This recipe converts a CSV file to JSON. To use this recipe, upload the file into the Local File connector for your Environment.
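The conversion is a straightforward rows-to-objects mapping: each CSV row becomes a JSON object keyed by the header. A stdlib-only Python sketch of that mapping (the sample CSV text is illustrative):

```python
import csv
import io
import json

# Each CSV row becomes one JSON object keyed by the header row.
csv_text = "id,name\n1,Ada\n2,Grace\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
json_text = json.dumps(rows, indent=2)
```

Note that `csv.DictReader` yields all values as strings; a recipe would typically cast types downstream in SQL.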
Caliper Event Data Processor
Use a static JSON Caliper event with nested JSON-LD and flatten the data for additional processing. This recipe can be converted to a streaming recipe by changing the type of the connector to `inputevent`.
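Flattening nested JSON-LD means promoting nested properties to top-level columns with dotted names. A minimal Python sketch of that flattening step, with a made-up miniature Caliper-style event:

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts into dotted column names for tabular processing."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))
        else:
            out[name] = value
    return out

event = {"@type": "NavigationEvent",
         "actor": {"id": "user-1", "type": "Person"}}
flat = flatten(event)
```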
Canvas REST API Writer
Create an account_notification using the Canvas REST API. This example can be extended for any Canvas JSON pattern. The JSON can be dynamically built using struct() and named_struct() functions.
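The struct()/named_struct() approach composes nested columns into a JSON request body. A Python analogue of the same composition, building a hypothetical account_notification payload as nested dicts (the field values here are illustrative, not from the recipe):

```python
import json

# Nested dicts play the role of struct()/named_struct() columns:
# compose the nested shape, then serialize it as the request body.
payload = {
    "account_notification": {
        "subject": "Maintenance window",
        "message": "Canvas will be unavailable Saturday 1-3am.",
        "start_at": "2024-06-01T01:00:00Z",
        "end_at": "2024-06-01T03:00:00Z",
    }
}
body = json.dumps(payload)
```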
Canvas Users to Salesforce Contacts Writer
This recipe reads all the Canvas users and upserts them as contacts to Salesforce. This pattern can be extended to any Canvas endpoint or Salesforce object. Lingk handles all the details of the integration, you just worry about the data.
Create New Users in Canvas
This recipe creates Canvas users based on a data source. It will not update users, only create new ones. To run this recipe, set up the Canvas connector in your environment, choose your Environment, and click "Run".
Creating and Updating Objects with the Salesforce REST API Writer
This recipe demonstrates creating and updating data with the Salesforce REST API. The recipe creates an account, retrieves its ID, and then uses that ID to update the name of the same account. To run this recipe, configure a Salesforce connector in your environment and click "Run".
Cross Join of Two Tables
This recipe executes a Cross JOIN between two tables. Warning: cross joins can be very slow and should only be used when absolutely necessary.
To run this recipe, click the Run button.
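The warning follows from what a cross join produces: every row of one table paired with every row of the other, so the output has len(a) × len(b) rows. A small Python sketch of the operation (sample tables are made up):

```python
from itertools import product

sizes = [{"size": "S"}, {"size": "M"}]
colors = [{"color": "red"}, {"color": "blue"}]

# A cross join pairs every row of one table with every row of the other,
# producing len(sizes) * len(colors) rows -- hence the performance warning.
cross = [{**a, **b} for a, b in product(sizes, colors)]
```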
Data Diff Simplified
This recipe performs a data diff on JSON structures in the JSON connector. However, it will work with ANY connector because all outputs are the same for Lingk!
Data diff deltas on full datasets to event driven webhook subscriptions
This recipe does a data diff on JSON structures in the JSON connector (though it will work with ANY connector because all outputs are the same for Lingk). To use this recipe, just click Run! To see changes, delete, change, or add records and run again. You will need to add an Object under Events and update the "signal" table in the recipe.
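Conceptually, a data diff compares two snapshots of a dataset by key and classifies each record as added, removed, or changed. A minimal Python sketch of that classification, under the assumption that records carry a unique key field (the sample data is made up):

```python
def data_diff(old, new, key):
    """Compare two record lists by key; return (added, removed, changed)."""
    old_by_key = {r[key]: r for r in old}
    new_by_key = {r[key]: r for r in new}
    added = [r for k, r in new_by_key.items() if k not in old_by_key]
    removed = [r for k, r in old_by_key.items() if k not in new_by_key]
    changed = [r for k, r in new_by_key.items()
               if k in old_by_key and old_by_key[k] != r]
    return added, removed, changed

old = [{"id": 1, "status": "active"}, {"id": 2, "status": "active"}]
new = [{"id": 2, "status": "inactive"}, {"id": 3, "status": "active"}]
added, removed, changed = data_diff(old, new, "id")
```

The deltas (added/removed/changed) are what gets published to the event subscriptions.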
Date Formatting Examples
This recipe parses some static dates and shows how to convert them into other formats. To use this recipe, click Run!
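The pattern is parse once, then re-emit in whatever formats the targets need. A stdlib Python sketch of the same parse/format round trip (the sample date and formats are illustrative):

```python
from datetime import datetime

# Parse a source string once, then render it in other formats.
parsed = datetime.strptime("2024-03-15 09:30", "%Y-%m-%d %H:%M")
iso = parsed.strftime("%Y-%m-%dT%H:%M:%S")   # ISO 8601 style
us_style = parsed.strftime("%m/%d/%Y")        # US-style date
```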
Ellucian Ethos to Salesforce Writer
This recipe pulls programs from Ellucian Ethos and upserts them into the Salesforce Education Data Architecture (EDA) Academic Program Account object.
Email Microservice Helper Example
This recipe sends text-based notification emails. While you can send multiple notifications at once, this recipe is NOT designed for marketing purposes. Please contact Lingk support for marketing solutions. To use this recipe, click Run!
Excel to CSV File Conversion
This recipe converts an Excel file to CSV. To use this recipe, upload the file into the Local File connector for your Environment.
Excel to JSON Conversion
This recipe converts an Excel file to JSON. To use this recipe, upload the file into the Local File connector for your Environment.
Excel to Postgres (Amazon RDS) Writer
This recipe reads data from an Excel spreadsheet and writes to a Postgres database on Amazon RDS. To run this recipe, configure an environment with a Local File (upload an Excel doc) and the Amazon RDS connector.
Fixed Width Data Processing
This recipe reads data from a fixed-width file. The performance of this recipe may be very slow due to the cross join on the source data and the size of the dataset (memory use grows roughly linearly with the record count). This recipe is a good test of Apache Spark's data processing capabilities.
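Fixed-width parsing slices each line at declared column positions instead of splitting on a delimiter. A minimal Python sketch of that slicing, with a made-up three-column layout:

```python
def parse_fixed_width(line, fields):
    """fields: list of (name, start, end) column positions to slice out."""
    return {name: line[start:end].strip() for name, start, end in fields}

# Hypothetical layout: 5-char id, 10-char name, 2-char state code.
layout = [("id", 0, 5), ("name", 5, 15), ("state", 15, 17)]
record = parse_fixed_width("00042Ada Byron CA", layout)
```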
Fuzzy Matching with Soundex and Levenshtein
This recipe demonstrates the use of fuzzy matching in Spark with Soundex and Levenshtein distance. The Soundex algorithm is often used to compare first names that are spelled differently. You might want to use the Levenshtein distance when joining two DataFrames if you don't want to require exact string matches. It is always a struggle to minimize false positives when performing fuzzy joins, so run multiple tests and join on multiple columns to improve results. To run this recipe, choose your environment and click Run!
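Spark exposes `levenshtein()` as a built-in SQL function; to make the metric concrete, here is a plain-Python sketch of the distance itself (the minimum number of single-character insertions, deletions, and substitutions between two strings):

```python
def levenshtein(a, b):
    """Edit distance between strings a and b (standard dynamic programming)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]
```

A join condition like `levenshtein(a.name, b.name) <= 2` then tolerates small spelling differences; lowering the threshold and joining on additional columns is how you trade recall for fewer false positives.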
JSON (S3 hosted) to Excel File Conversion
This recipe converts JSON (nested or flat) to an Excel file. To use this recipe, configure the Amazon S3 connector in your environment and upload the file into your Amazon S3 bucket. Then download the converted file from the Local File area for your environment.
JSON API to SOAP APIs and SOAP XML processing
This recipe dynamically populates values into a SOAP XML payload, submits the request, and processes the result.
Jinja templates can be used to dynamically build the XML elements based on data from another connector (i.e. API request).
Credentials can be stored in a CSV in the Local File connector to dynamically populate credentials and keep the recipe safe across environments.
To use this recipe, click Run!
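The recipe uses Jinja templating; as a stdlib stand-in for the same idea, here is a Python sketch that populates a value into a SOAP envelope and checks the result is well-formed XML before it would be submitted (the envelope body and field names are made up):

```python
import xml.etree.ElementTree as ET
from string import Template

# string.Template stands in for Jinja here: substitute dynamic values
# into a SOAP envelope before submitting the request.
soap_template = Template(
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soapenv:Body><GetStudent><id>$student_id</id></GetStudent></soapenv:Body>"
    "</soapenv:Envelope>"
)
payload = soap_template.substitute(student_id="12345")
root = ET.fromstring(payload)  # raises if the generated XML is malformed
```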
Join JSON and database data to output an Excel Spreadsheet
This recipe joins JSON and PostgreSQL database data together to create an Excel spreadsheet.
To run this recipe, upload rows_100k.json1 to your Local File connection and configure the Amazon RDS connector. Update the SQL statements and click Run.
Join Multiple Excel Spreadsheets and Output Excel
This recipe joins and aggregates multiple Excel worksheets/spreadsheets and outputs a single Excel spreadsheet. To use this recipe, upload the file(s) into the Local File connector for your Environment.
Lingk Adapter to Moodle User Sync
This recipe reads user data from the Lingk Adapter (powered by Apache Nifi) and upserts student records into Moodle using the Lingk Moodle Connect plugin (a REST service layer for Moodle).
To use this recipe:
1. Install Apache Nifi on a server, add the Lingk Adapter extensions, and configure them.
2. Install the Moodle Connect plugin on your Moodle instance (see Lingk for access to the plugin).
3. Configure the Lingk Adapter and Moodle in your Lingk environment.
4. Run or schedule the recipe.
Lingk Event APIs to Google PubSub Events
This recipe takes event data passed from external systems (Apache Nifi, iPaaS, ESBs) and processes the data using Apache Spark. This recipe can be converted from batch to streaming processing by changing the "json" connector type to "inputEvent".
Microsoft Dynamics CRM to SFTP
To use this recipe, enter your Dynamics credentials inline in the recipe and configure an SFTP connector in your environment.
Multi-step REST Auth Flow Example (OAuth, etc)
This recipe makes a request for an authentication token, inserts the value into the execution context, and uses the token for subsequent requests. To run this recipe, add the headers for the first step of your flow and then click Run.
Nested Object Schemas
Process data with nested objects easily by applying a schema to connectors. Schemas are necessary for selecting *Null* columns in staged, in-memory tables.
Retrieve Leads from the Marketo REST API
This recipe retrieves the AuthToken from Marketo and uses it in a subsequent API request. To run this recipe, replace all tenant and API configurations with your configuration and click "Run".
SFTP File Management
Read a file from an SFTP site, process the data, and then overwrite the file with new data.
SFTP to Moodle (Upsert users)
Read a file from SFTP, transform the data and upsert the values into Moodle courses using the Lingk Connect Moodle plugin REST APIs.
SFTP to Salesforce
Read a CSV file from SFTP, transform the data, and write the data to Salesforce using the bulk API.
Salesforce EDA to Canvas Course Writer
This recipe reads data from a Salesforce Education Data Architecture (EDA) course object and writes to a Canvas course. To run this recipe, configure the Salesforce and Canvas connectors in your environment. Then click "Run" and "Start Recipe".
Salesforce Recipe with Multiple Orgs
This recipe demonstrates using credentials from the environment for one organization and inline credentials for a second organization. This example demonstrates Salesforce reader connectors, but the same approach will work with Salesforce writers as well.
Salesforce Upsert Example supporting Lookups
This recipe demonstrates upserting to multiple Salesforce objects with lookup relationships. To run this recipe, change the source and targets. Update the statements to match your SF data sets and click "Run".
The process is as follows:
1. Read data from the source and target
2. Transform the source data
3. Make the insert and update data sets
4. Refresh the source data before creating child objects to get the SF object IDs for newly created SF objects
Salesforce to Ellucian Ethos Writer
Pulls academic programs from Salesforce Education Data Architecture (EDA) Account objects and upserts them into the Ellucian Ethos Academic Levels object. To run, update the Salesforce SOQL query to point to your academic levels.
Schemas Techniques for Null Nested Data Properties
This recipe demonstrates how to ensure a recipe completes when nested data structures are used with the inline() or explode() functions. The key is to attach a schema to the connector that defines the temporarily non-existent value; otherwise Spark will drop the column (because it is all null).
Note: we are looking for an endOn date that does not exist in this payload but could. Use a schema to ensure the field is available even if null.
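A lightweight Python analogue of the schema technique: declare the expected fields up front, so a field absent from every record (like endOn here) still appears in the output as null instead of being dropped (the sample records are made up):

```python
# Declared schema: endOn may be missing from every record in the payload.
schema = ["id", "startOn", "endOn"]

records = [{"id": 1, "startOn": "2024-01-01"},
           {"id": 2, "startOn": "2024-02-01"}]

# Applying the schema guarantees every declared field exists, null if absent.
normalized = [{field: r.get(field) for field in schema} for r in records]
```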
Simple Lingk Event API Subscription Tester
This recipe demonstrates a Mock event payload for recipe subscription. To use this recipe -- select your environment, run the recipe, and paste the sample event payload into the Event window.
Snowflake - Read Tables and Write CSV Data
This recipe reads and writes data to the Snowflake data warehouse.
To use this recipe, add a CSV file to your local file connector for environment variables and update the related Snowflake properties. Then click "Run"!
Trigger Lingk Events through a Recipe
This recipe writes an event payload to a Lingk event that can trigger other recipes or publish the data to webhooks for Zapier and other apps. To use this recipe, create an object in the Lingk app under Events, subscribe it to a recipe or webhook, update the object name in the recipe, and click Run!
Union Multiple Excel Worksheets
This recipe unions the data from multiple worksheets into a single in-memory database to query. To use this recipe, upload the file into the Local File connector for your Environment.
Upsert Excel Data to Salesforce Contacts
This recipe will upsert one record from an Excel document into a Salesforce Contact object. To use this recipe, upload the file into the Local File connector for your Environment and connect a Salesforce instance.
Using Lookup Values in Select Statements
This recipe uses a file as a lookup table for values in a specific field. To use this recipe, upload the file into the Local File connector for your Environment.
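The lookup-table pattern maps coded values in one field to display values from a second dataset. A minimal Python sketch, defaulting to the raw code when no lookup entry exists (the sample codes and rows are made up):

```python
# Lookup table: code -> display value, as loaded from the uploaded file.
lookup = {"CA": "California", "NY": "New York"}

rows = [{"name": "Ada", "state": "CA"},
        {"name": "Alan", "state": "TX"}]

# Resolve each code, falling back to the raw value when there is no match.
resolved = [{**r, "state": lookup.get(r["state"], r["state"])} for r in rows]
```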
Using SQL to create complex JSON structures
This recipe demonstrates creating a number of types of JSON structures. To run this recipe, choose your Environment and click "Run". For more examples, go to: https://docs.databricks.com/spark/latest/spark-sql/complex-types.html
XML to JSON Helper Service
This recipe converts some static XML to JSON for processing. The XML-to-JSON helper service can handle XML files up to 50 MB.
https://xmltojson.lingkcloud.com/api/v1/xmltojson can only be called from within a recipe.
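To make the conversion concrete, here is a stdlib Python sketch of the core transformation for simple XML (elements become keys, leaf text becomes values); the hosted service handles larger and more complex documents:

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(el):
    """Recursively convert an XML element into a dict/str structure."""
    children = list(el)
    if not children:
        return el.text
    return {child.tag: element_to_dict(child) for child in children}

xml_text = "<person><name>Ada</name><role>Engineer</role></person>"
root = ET.fromstring(xml_text)
json_text = json.dumps({root.tag: element_to_dict(root)})
```

This sketch ignores attributes and repeated sibling tags, which a full converter would need to handle.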