
How to Use DataLakeHouse.io: A Data Analytics Platform

Updated: Apr 17, 2023

This article explains how to use DataLakeHouse.io, a SaaS data analytics platform.


DataLakeHouse.io has connectors for many applications and data sources:


  1. Aiven PostgreSQL,
  2. Aloha POS,
  3. Amazon S3,
  4. Bill.com,
  5. Bloom Growth,
  6. Bullhorn,
  7. Ceridian Dayforce,
  8. Cloudflare R2,
  9. ConnectWise,
  10. Facebook Ads,
  11. Flat Files,
  12. Food Delivery Service,
  13. GCP Storage,
  14. Google Analytics 4,
  15. Google BigQuery,
  16. Google Sheets,
  17. Harvest,
  18. HubSpot,
  19. Jira,
  20. Mailchimp,
  21. MariaDB,
  22. McLeod Software,
  23. MongoDB (Sharded),
  24. MongoDB,
  25. MySQL,
  26. NetSuite,
  27. Optimum HRIS,
  28. Oracle EBS,
  29. PeopleSoft,
  30. PostgreSQL,
  31. QuickBooks,
  32. Reddit Ads,
  33. Salesforce,
  34. Shopify,
  35. Snowflake,
  36. Snowflake Terraform,
  37. SQL Server,
  38. Square,
  39. Square (Marketplace),
  40. Stripe,
  41. TriNet,
  42. Verizon Business,
  43. Wasabi Storage,
  44. Xero

In this walkthrough, we will build a data lake on the Snowflake Data Cloud using data from Harvest.


Step 1 - Create a Source Connection


Log in to your instance at app.datalakehouse.io.


Add a Harvest source connection, then enter a Name and a Target Schema Prefix. Click Authorize Your Account to sign in with your Harvest account.

Click Actions and then Edit to view the entities you will replicate.


Note that all of the tables and entities available from the Harvest data source are listed.
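DataLakeHouse.io handles the Harvest authorization entirely in the UI. If you want to sanity-check your Harvest account ID and personal access token independently before authorizing, here is a minimal Python sketch against Harvest's public v2 API (the account ID and token values are placeholders; create real ones under Developers in your Harvest settings):

    # Verify Harvest API credentials outside of DataLakeHouse.io.
    # Requires: pip install requests
    import requests

    HARVEST_ACCOUNT_ID = "1234567"          # placeholder: your Harvest account ID
    HARVEST_TOKEN = "your-personal-token"   # placeholder: a Harvest personal access token

    response = requests.get(
        "https://api.harvestapp.com/v2/users/me",
        headers={
            "Authorization": f"Bearer {HARVEST_TOKEN}",
            "Harvest-Account-Id": HARVEST_ACCOUNT_ID,
            "User-Agent": "dlh-connection-check",
        },
        timeout=30,
    )
    response.raise_for_status()
    print("Authenticated as:", response.json()["email"])

If the request returns your user record, the same credentials will work when you authorize the connector.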


Step 2 - Create a Target


Open your Snowflake Data Cloud instance and create an empty database:
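If you prefer scripting to the Snowflake worksheet UI, a minimal sketch with the snowflake-connector-python package does the same thing (the account, user, and password values are placeholders; DLH is the database name used later in this walkthrough):

    # Create the empty target database from Python instead of the Snowflake UI.
    # Requires: pip install snowflake-connector-python
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account_identifier",  # placeholder, e.g. xy12345.us-east-1
        user="your_user",                   # placeholder
        password="your_password",           # placeholder
        role="SYSADMIN",
    )
    try:
        # DLH is the empty database the Sync Bridge will populate.
        conn.cursor().execute("CREATE DATABASE IF NOT EXISTS DLH")
    finally:
        conn.close()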



In DataLakeHouse.io, create a new Snowflake Target Connection and enter a Database, a Warehouse, and your user credentials.

Click Save & Test Connection.



Step 3 - Create a Sync Bridge


A Sync Bridge consumes the data from your source and feeds your data lake. Select the Harvest and Snowflake connections and click Save.

Click Actions and Enable the Sync Bridge.


Return to your Snowflake instance and note the tables created under the HARVEST schema in the DLH database:
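You can also confirm the replication from a script rather than the Snowflake UI. A minimal sketch, reusing the placeholder credentials from Step 2 and assuming the schema was created as HARVEST:

    # List the tables DataLakeHouse.io created under the HARVEST schema.
    # Requires: pip install snowflake-connector-python
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account_identifier",  # placeholder
        user="your_user",                   # placeholder
        password="your_password",           # placeholder
    )
    try:
        cur = conn.cursor()
        cur.execute("SHOW TABLES IN SCHEMA DLH.HARVEST")
        for row in cur.fetchall():
            print(row[1])  # the second column of SHOW TABLES output is the table name
    finally:
        conn.close()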


Conclusion


DataLakeHouse.io is a powerful technology with an intuitive user interface: you can connect to dozens of data sources to populate the Snowflake Data Cloud or Google BigQuery without writing any code. All of the target tables are created and populated automatically from a predefined data model. You can also request other sources and customizations directly from the DataLakeHouse support team; more details at DLH.io.


About the author: Angelo Buss is a Solutions Architect and the founder of BRF Consulting, a DataLakeHouse consulting partner.



