Snowflake - Read Tables and Write CSV Data
Publisher: Lingk
Description
This recipe reads and writes data to the Snowflake data warehouse. To use this recipe, add environment variables for the related Snowflake properties. Then click "Run"!
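The overall flow of the recipe below (read rows from a source, select and rename a subset of CSV columns, insert them into a target table, then re-read and count) can be sketched in plain Python. This is only an illustration of the data flow, not Lingk's implementation: an in-memory SQLite database stands in for Snowflake, a small inline CSV stands in for the `rows_100k.csv` local file, and the table and column names (`test`, `first_name`, `last_name`, `c1`, `c2`) are the demo values used in the recipe.

```python
import csv
import io
import sqlite3

# In-memory SQLite stands in for the Snowflake table DEMO_DB.PUBLIC.TEST.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (c1 TEXT, c2 TEXT)")

# A tiny inline CSV stands in for rows_100k.csv from the LocalFile connector.
csv_text = "first_name,last_name\nAda,Lovelace\nAlan,Turing\n"
reader = csv.DictReader(io.StringIO(csv_text))  # header: true

# Equivalent of: (testData) => SELECT first_name c1, last_name c2 FROM bigFile LIMIT 10
rows = [(r["first_name"], r["last_name"]) for r in reader][:10]

# Equivalent of: INSERT testData INTO SnowflakeWriter
conn.executemany("INSERT INTO test (c1, c2) VALUES (?, ?)", rows)

# Equivalent of: refresh SnowflakeReader, then select count(*)
count = conn.execute("SELECT COUNT(*) FROM test").fetchone()[0]
print(count)  # 2
```

In the real recipe, the same steps are expressed declaratively: connectors define the endpoints, and the `statements` section carries out the select/insert/refresh logic against the in-memory database.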
# ============================== Recipe Info ==============================
#
# Project Name - SNOWFLAKE - READ TABLES AND WRITE CSV DATA
# Recipe ID -
# Recipe URL - https://app.lingk.io/a/10932/tf/17955
# Description -
#   This recipe reads and writes data to the Snowflake data warehouse.
#   To use this recipe, add a CSV file to your local file connector for
#   environment variables and update the related Snowflake properties.
#   Then click Run!
# Industry - Higher Ed
# Business Process - Graduate Reporting
# Systems - Snowflake
# Connectors - Snowflake, LocalFile
# Data Flows - Single Direction
# Connection Type - API to CSV
#
# Add Recipe notes / Change log information here!

# ============================== Connectors ==============================
# CONNECTORS specify what data will be pulled into the in-memory database
# during processing
connectors:
# Configure Snowflake credentials in your Environment before running this recipe
# Snowflake Setup - https://help.lingk.io/en/articles/167-release-2019-02-08

###### Start - Snowflake connectors ######
# Read data in from Snowflake
- name: SnowflakeReader
  type: SnowflakeReader
  parameterizedBy: credentials
  properties:
    username: "{{env.vars.username}}"
    password: "{{env.vars.password}}"
    account: "{{env.vars.account}}"
    sqlQuery: "SELECT * FROM DEMO_DB.PUBLIC.TEST"

# Write data to a Snowflake table
- name: SnowflakeWriter
  type: SnowflakeWriter
  properties:
    username: "{{env.vars.username}}"
    password: "{{env.vars.password}}"
    account: "{{env.vars.account}}"
    tableName: "DEMO_DB.PUBLIC.TEST"
###### End - Snowflake connectors ######

###### Start - Localfile connectors ######
# Read a large CSV (this file should be automatically available in your
# local file environment)
- name: bigFile
  type: LocalFileReader
  format: csv
  properties:
    fileName: rows_100k.csv
###### End - Localfile connectors ######

# =============================== Formats ================================
# Set the format of the CSV file
readFormats:
- name: csv
  type: delimited
  properties:
    delimiter: ','
    header: true
    inferSchema: true

# ============================== Statements ==============================
# STATEMENTS specify how the data should be processed while in memory
statements:
#*************************** D I S C L A I M E R ************************
# Note that in an effort to keep recipes optimized for DPH (Data
# Processing Hours), print statements should be commented out after
# development has concluded for a recipe.
# For more information on DPH optimization, please visit the following
# help article -
# https://help.lingk.io/en/articles/212-minimizing-data-processing-hours-on-the-lingk-platform
#*************************** D I S C L A I M E R ************************

# REFRESH executes the SnowflakeReader connector based on the location
# in the recipe
- statement: (count) => select count(*) from SnowflakeReader
#- statement: print count

# CHANGE the SQL aliases (c1, c2...) to match your Snowflake schema
- statement: |
    (testData) => SELECT first_name c1, last_name c2 FROM bigFile LIMIT 10

# UNCOMMENT to INSERT data
# - statement: INSERT testData INTO SnowflakeWriter

- statement: refresh SnowflakeReader
- statement: (count) => select count(*) from SnowflakeReader
#- statement: print count

# Add more statements to convert, join, aggregate, transform, and
# integrate your data
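The `csv` read format in the recipe (`delimiter: ','`, `header: true`, `inferSchema: true`) can be approximated in plain Python to show what each option does: the standard `csv` module handles the delimiter and the header row, while a small `infer` helper (a hypothetical stand-in, not Lingk's inference logic) illustrates the kind of per-value type inference that `inferSchema` performs.

```python
import csv
import io

def infer(value: str):
    """Best-effort type inference, roughly what inferSchema: true does."""
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    return value  # fall back to string

csv_text = "id,gpa,first_name\n1,3.75,Ada\n2,3.90,Alan\n"

# delimiter: ',' and header: true map onto DictReader's defaults
reader = csv.DictReader(io.StringIO(csv_text), delimiter=",")
rows = [{k: infer(v) for k, v in row.items()} for row in reader]

print(rows[0])  # {'id': 1, 'gpa': 3.75, 'first_name': 'Ada'}
```

With `header: false` the first row would be treated as data and columns would get positional names instead; with `inferSchema: false` every value would stay a string.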