Snowflake Data Platform

Snowflake is a data platform built for the cloud. It consolidates data warehouses, data marts, and data lakes into a single platform, making all data available to all business users. The architecture consists of three layers:

  1. Database storage: Cloud storage lets organizations store all data in a scalable and inexpensive manner. Data is stored in its native format without time-consuming transformations.
  2. Processing: Virtual warehouses provide the compute resources that execute the data-processing tasks required for queries.
  3. Cloud services: A coordination layer that manages security, optimization, and metadata across the entire system.
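
The separation between the storage and processing layers can be sketched with a couple of DDL statements: databases live in cloud storage while virtual warehouses supply compute, and each is created and sized independently. The object names below are hypothetical.

```python
# Illustrative sketch: in Snowflake, storage (databases) and compute
# (virtual warehouses) are separate objects, created independently.
# All names are hypothetical examples.

ddl_statements = [
    # Storage layer: a database lives in cloud storage, independent of compute.
    "CREATE DATABASE IF NOT EXISTS analytics_db;",
    # Compute layer: a virtual warehouse provides compute for queries.
    # AUTO_SUSPEND pauses it after 300 seconds of inactivity, so an idle
    # warehouse stops accruing cost; AUTO_RESUME restarts it on demand.
    "CREATE WAREHOUSE IF NOT EXISTS bi_wh "
    "WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 300 AUTO_RESUME = TRUE;",
]

for stmt in ddl_statements:
    print(stmt)
```

Because the warehouse is a separate object, it can later be resized or suspended without touching the data in `analytics_db`.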

Snowflake effectively reinvents data warehousing, eliminating complexity related to integrating different data sources and types.

Why Snowflake cannot be ignored

“Why should you pay for something not currently in use? Why should you have to manage scaling manually? Why should scaling and database changes cause downtime? These are some of the problems I admire Snowflake for solving. You shouldn’t have to analyze and guess the number of CPUs you need at 8:00 in the morning or at 18:00 in the evening.

Other technologies offer something similar, but they fall short because their redeployment time for compute is often more than a minute. I don’t know how Snowflake has managed to solve this, but I think a lot of people are jealous of its sub-second redeployment time.

Zero-copy cloning is, to me, a unique way of testing corrections and alterations in a 1:1 copy of a setup such as production. I have seen a lot of heavy solutions for synchronizing development, test, and production environments, but I have not previously seen one that works well. Zero-copy cloning provides exactly that: it takes a metadata copy of, for example, a production database for a test environment in about a second. Since it copies only metadata and not the actual data, the cost is next to nothing.”

Anders Boje Hertz, Head of AI & Data Platforms at Intellishore
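
The zero-copy clone described in the quote is a single SQL statement. The sketch below shows its general shape; the database names are hypothetical.

```python
# Zero-copy clone: Snowflake copies only metadata, so the clone completes
# in seconds and initially consumes no extra storage. The source and target
# names below are hypothetical examples.

def clone_database(source: str, target: str) -> str:
    """Build the Snowflake statement that clones `source` into `target`."""
    return f"CREATE DATABASE {target} CLONE {source};"

print(clone_database("prod_db", "test_db"))
# CREATE DATABASE test_db CLONE prod_db;
```

The clone shares the source's underlying storage; only changes made to the clone afterwards consume additional space.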


Snowflake supports a multitude of workloads…

Snowflake’s patented multi-cluster shared data architecture supports many different workloads: data warehouses, data lakes, data pipelines, and data exchanges, as well as many types of business intelligence, data science, and data analytics applications.

… and different data types and sources

Snowflake’s agnostic nature supports the handling and optimization of both structured and semi-structured data, the latter including formats such as JSON, Avro, and XML. The platform offers standards-based connectors such as ODBC and JDBC, along with drivers and libraries for JavaScript, Python, Spark, R, and Node.js. As a result, developers have full access to the tools, languages, and frameworks they need.
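
Semi-structured data such as JSON is typically loaded into a VARIANT column and then traversed with Snowflake's colon/path notation directly in SQL. The sketch below builds such a query as a string; the table name `events` and column name `raw` are hypothetical.

```python
# Sketch of querying semi-structured JSON in Snowflake: a VARIANT column
# ("raw" here, hypothetical) is traversed with colon/path notation, and
# "::" casts the extracted values to typed columns.

query = (
    "SELECT raw:customer.name::STRING AS customer_name, "
    "raw:items[0].sku::STRING AS first_sku "
    "FROM events "
    "WHERE raw:status::STRING = 'shipped';"
)
print(query)
```

The JSON never needs to be flattened up front; paths are resolved at query time against the stored document.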

Furthermore, Snowflake Data Marketplace allows you to discover new datasets and services.

Global sharing across providers

In an effort to mitigate data silos in both large and small organizations, Snowflake allows global sharing of data. When requested, this happens instantly, without anyone having to copy or move data. The platform is also cloud-agnostic: in addition to distributing data across regions, Snowflake can distribute it across cloud providers, including AWS, Google Cloud, and Microsoft Azure.

This effectively allows large organizations to break down data silos and obtain unified insights from an all-encompassing data platform.

One platform that powers the data cloud

“Snowflake is the first native data platform that has been built for the cloud. Our unique architecture offers our customers a true cross-cloud and multi-cloud approach, across any geography. It also enables them to share data in a fast, secure, and seamless manner that was not previously possible.

Snowflake customers can truly democratize their data and share it both with internal business units and externally with their wider ecosystem. Customers can instantly reach organizations in the Data Cloud through Snowflake Marketplace and can enrich their own data with access to more than 1,500 live and ready-to-query data sets from over 300 third-party data providers and data service providers.

In addition to mobilizing and sharing data, organizations can build applications directly on Snowflake’s platform, and monetize these applications on Snowflake Marketplace to share more widely, and with the highest level of security.

We enable our customers to work with data in an entirely new way, and for that we need partners who are innovative, forward-thinking, and always put the customers’ needs first. That is why our partnership with Intellishore has been extremely successful: they have been able to help us unlock new ways to drive value for joint customers by offering modern, best-of-breed data solutions.”

Christian Lindtorp Andersen, Country Manager at Snowflake Denmark

True Software as a Service solution

Snowflake operates as a true software-as-a-service solution through a fully managed services layer that handles user sessions and resources, enforces security measures, compiles queries, enables data governance, and ensures atomicity, consistency, isolation, and durability (ACID).

Snowflake manages and services the underlying infrastructure automatically. This allows organizations to focus on analyzing and gaining insights from data rather than spending resources maintaining the data platform.

Speed, scalability, and pricing

Snowflake is among the fastest cloud-based data warehouses in the world, able to load 1 billion rows of data in less than 90 seconds. It achieves this through a column-based data engine that automatically clusters data. Because data storage and data processing (compute) are physically decoupled, users can load and query the same data without sacrificing performance. Since it is possible to run several warehouses in Snowflake at once, users can set up warehouses for specific workloads, such as business intelligence. Compute is thereby isolated per warehouse, so a demanding calculation in the BI warehouse will not affect the performance of the others.

Snowflake’s architecture provides complete elasticity, allowing the compute resources dedicated to specific workloads to be scaled either automatically or manually, within about a second. In other words, Snowflake can scale compute up and down to suit organizational needs in terms of data volume, workload, and the number of users or applications.

This benefits organizations in terms of both performance and cost. Storage and compute are priced separately: storage costs are comparable to Blob Storage (Azure), S3 (AWS), and Cloud Storage (GCP), while compute, normally the costly part, is billed per second. If a warehouse sits idle for five minutes, it is automatically suspended, and a suspended warehouse is free. The overall price is thus set by per-second compute pricing, per-user pricing, and the amount of data storage needed. As a result, organizations pay only for what they use, which yields cost-saving benefits compared to traditional data storage and processing solutions.
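
Per-second billing can be illustrated with a small calculation. The credit price and credits-per-hour table below are hypothetical illustrative figures; actual rates depend on Snowflake edition, cloud, and region.

```python
# Illustrative cost calculation for per-second compute billing.
# CREDIT_PRICE_USD and CREDITS_PER_HOUR are hypothetical figures,
# not Snowflake's actual rates.

CREDIT_PRICE_USD = 3.00          # hypothetical price per credit
CREDITS_PER_HOUR = {             # credits consumed per hour by warehouse size
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8,
}

def compute_cost(size: str, seconds_active: int) -> float:
    """Cost of running a warehouse for `seconds_active` seconds.

    Billing is per second with a 60-second minimum each time a warehouse
    resumes; a suspended warehouse costs nothing.
    """
    billed = max(seconds_active, 60)  # 60-second minimum per resume
    credits = CREDITS_PER_HOUR[size] * billed / 3600
    return credits * CREDIT_PRICE_USD

# A MEDIUM warehouse active for 15 minutes, then auto-suspended:
print(round(compute_cost("MEDIUM", 15 * 60), 2))
# 3.0
```

After auto-suspend kicks in, the warehouse accrues no further cost until the next query resumes it.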

Need help getting started with Snowflake in your organization?

Feel free to reach out to one of our consultants if you need help getting started.
Anders Boje Hertz
Head of AI & Data Platforms
Adam de Neergaard
Sales Director