
Photo and editing by Celine Erb

Snowflake 101 - for Data Architects

Harald Erb
Data Enthusiast & Architect, Principal Sales Engineer with Snowflake

June 7, 2021

What is your priority when it comes to data management?

Initial Snowflake projects can be set up in very different ways, e.g. as a greenfield project to eliminate data silos, for self-service BI, or to modernise and consolidate legacy data warehouse and/or Hadoop landscapes. In this context, typical project-determining factors for the implementation of analytical use cases are: Is the company already cloud-born, or are the relevant data sources predominantly located in local/on-premises data centers? What is the maturity level of data science and modern development techniques (and tools), and is the handling of data still considered a by-product, or is it mission-critical? This article is intended to give an overview of what you can actually build with Snowflake from a data architect's perspective.


The illustration below summarises current data management topics in four areas, each indicating, in loose order, the main purpose, key requirements or best practices that are typically discussed and validated as part of a new Snowflake overall architecture. We will go into more detail later in this article.

Before we dive deeper and look at some architecture options in more detail, let's first recap Snowflake's positioning of its Cloud Data Platform. Historically known as a Cloud Data Warehouse solution with unlimited performance and concurrency, Snowflake can also be used to implement modern data lake strategies, allowing data engineers to build robust data pipelines in the programming language of their choice. Streamlined data science, the creation of data-intensive applications, and collaboration on live data between data consumers and providers are further areas of use where data teams can benefit from Snowflake's cloud-native architecture, which enables rapid data access, query performance and data transformation, while capitalising on Snowflake's built-in data governance and security. Here are the six most critical data workloads that Snowflake enables with a single platform.

In the next step, let's take a look at how currently common architecture patterns, and the six workloads described above, can be related to Snowflake.


Architectural Bird's Eye View

Don't worry, no magnifier is needed for the next picture. Snowflake's multi-cluster shared data architecture (I still recommend the SIGMOD whitepaper as a must-read), the pay-as-you-go payment model, and the tight integration with cloud provider services as well as with leading tool vendors make it possible to implement your data architecture quickly. Due to Snowflake's platform capabilities, domain-driven developments in a Snowflake environment are possible alongside a migration of a legacy data warehouse, for example. The work results of individual teams/departments, e.g. in the form of a data mart or other data products, can be securely shared within the company or with external business partners without the need to copy or transfer data between Snowflake accounts - a rather important feature, as we can expect much more data sharing with many more parties in the near future.

Build your architecture, do it smart, and give it a fancy name

It is clear: developing artificial intelligence and analytics applications typically involves different processes, technology and talent than traditional software solutions. You might like to take a look at McKinsey Analytics' interactive guide, which explains processes, roles and key technical terms nicely. Finding your way through the jungle of tech buzzwords and assessing which architecture blueprints and best practices can be adapted for your own projects is both exciting and difficult. On the one hand, there are a lot of promising technologies; on the other hand, each company is working on its competitive advantages, increasingly using data in its own unique way.


So now let's take a look at what Snowflake has to offer and how these technical capabilities are already being used in production environments. First we start with a well-known topic: data warehousing...

Modern Data Warehousing is cooler than you might think

KPIs and metrics are still vital navigation instruments that help decision-makers see how well an organisation, business unit or project is performing in relation to strategic goals and objectives. Here is an example of a KPI library with a lot of relevant KPI definitions. Performance indicators condense operational information into a meaningful figure and clarify larger interrelationships within the company. They only make sense, however, if the underlying database is of high quality. Successful data warehouse systems enforce this "gold standard" for data through good data modelling and the enforcement of quality criteria when harmonising and transforming the source data to be loaded. Or to put it another way: sooner or later, time has to be invested (regularly) in data quality procedures, no matter which technology stack is used at the end of the day, or whether it is a data warehouse or not.

On the topic of modern data warehousing, I can recommend Christian Kaul's series of articles. In addition, the Picnic Engineering Team (NL) provides nice insights into the selected technology (including Snowflake) and architecture for their "Lakeless Data Warehouse". I like how sandboxing and prototyping (with Snowflake Temporary Tables), as well as Data Scientists as a user group, are part of Picnic's overall concept. And why not work in a classic Star Schema with table columns using Snowflake's VARIANT data type for semi-structured data, as described in their "Picnic-flavored" customisation (see the sketch below)? Both of the above articles already address the use and advantages of Data Vault 2.0. Anyone interested in how this works with Snowflake would do best to start with the articles Tips for Optimizing the Data Vault Architecture on Snowflake and Building a Real-Time Data Vault in Snowflake. Also very cool this February was the talk Lessons From Earth's Largest Data Vault @ Micron by Mike Magalsky at the Data Modelling Meetup Munich: in addition to scalability, performance and development features, Snowflake was also able to meet Micron's business-critical security requirements (interesting explanation in the video starting at 36 min).
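
To make the Star Schema + VARIANT idea concrete, here is a minimal, hypothetical sketch (table and column names are my own invention, not Picnic's actual schema): a dimension table keeps its flexible, semi-structured attributes in one VARIANT column, which can be queried with Snowflake's colon/dot notation.

    -- Hypothetical dimension with a VARIANT column for semi-structured
    -- attributes (all names are illustrative only).
    CREATE OR REPLACE TABLE dim_customer (
        customer_key   NUMBER   NOT NULL,
        customer_name  VARCHAR  NOT NULL,
        attributes     VARIANT  -- raw JSON profile, schema-on-read
    );

    -- Load JSON directly into the VARIANT column.
    INSERT INTO dim_customer
        SELECT 1001, 'ACME GmbH',
               PARSE_JSON('{"segment": "B2B", "loyalty": {"tier": "gold"}}');

    -- Query nested fields with colon/dot notation and cast as needed.
    SELECT customer_name,
           attributes:segment::STRING       AS segment,
           attributes:loyalty.tier::STRING  AS loyalty_tier
    FROM dim_customer;

New attributes can appear in the JSON without any ALTER TABLE, which is exactly what makes this pattern attractive for evolving dimensions.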

Snowflake's Data Platform was built to be (data warehouse) design-pattern agnostic. That means you can use it with equal efficiency for 3NF models, dimensional (star) schemas, Data Vault, or any hybrid you might have. This capability, in addition to standard SQL support, data type compatibility and the provision of necessary functions, greatly facilitates the migration of a legacy data warehouse to Snowflake. More information is available in several migration guides (SAP, Teradata, Oracle etc.) and reference manuals.

Leverage existing Data Lakes or let Snowflake be your Data Lake

Great hype erupted when the concept of the data lake emerged. But that hype soon subsided, because it was nearly impossible (for business users) to gain insights from all that data. Snowflake delivers on the promise of the data lake and provides two options:

Make Snowflake your data lake: With this approach you provide one copy of your data - a single source of truth - to all your data users, allowing them to work with more diverse data sets without requiring you to copy, move or manipulate data for specific use cases. This helps to unify your technology landscape with a single platform for many types of data workloads, eliminating the need for different services and infrastructures, and brings together the best storage and analytical attributes of both a data lake and a data warehouse. Benefits: unlimited scale and speed, simplicity, cost-effectiveness, and built-in security and end-to-end governance. A cool example of the many built-in security features: in Snowflake, data masking can be applied to semi-structured data (VARIANT columns) and is not limited to structured data alone (see the sketch after this list). If you think about it further, you will see that in Snowflake you can not only provide semi-structured data quickly (schema-on-read approach) and optimised for immediate analysis, but also enforce rules for the organisation, access and protection of the data end-to-end in the data lake and track compliance. And not to forget, Snowflake offers continuous availability by design and is always-on, like most customer-facing SaaS systems. This is an important capability, as data analysis has become critical to more and more business tasks. The two main technical features in this regard are fault resilience (Snowflake tolerates individual and correlated node failures at all levels of the architecture) and online upgrades, a process that is transparent to the user, with no downtime or performance degradation.

Snowflake can optionally also complement an existing data lake: In data engineering, so-called "multi-hop" architectures are quite common to organise tables in data lake zones/layers that correspond to different quality levels, progressively adding structure to the data: data ingestion ("Bronze" tables), transformation/feature engineering ("Silver" tables) and cleaned/aggregated data ready for consumption by business users ("Gold" tables). This allows data engineers to build a pipeline that begins with raw data as a "single source of truth" from which everything flows. Typically, this can be implemented technically in the same way in Snowflake, and one would benefit from the platform advantages for better, faster data preparation and analytics. Whether you want to keep the Bronze and Silver tables in an external storage layer depends on how long operators accept unnecessary management overhead: manual and time-consuming tasks like repairing tables, vacuuming and optimising data layouts are unknown in Snowflake. In addition, Snowflake's Time Travel enables accessing historical data at any point within a defined period, allowing you to restore tables, schemas and databases that might have been accidentally (yes, it happens...) or intentionally deleted (an example follows this list). When it comes to the Gold layer, platform decisions should be made with the end users in mind: i.e., do not exclude any users due to performance bottlenecks, and make access to the data simple and understandable, so that self-service is not only possible for full-time analysts alone. Snowflake's unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today's organisations require to manage user and query concurrency needs as they change, such as during peak and off hours.
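
As a small, hedged illustration of the two capabilities mentioned above - masking semi-structured data and Time Travel - here is a minimal sketch; the policy, table and column names are invented for the example:

    -- Hypothetical masking policy applied to a VARIANT column:
    -- non-privileged roles see NULL instead of the raw JSON payload.
    CREATE OR REPLACE MASKING POLICY mask_profile AS (val VARIANT)
        RETURNS VARIANT ->
        CASE WHEN CURRENT_ROLE() IN ('DATA_ADMIN') THEN val ELSE NULL END;

    ALTER TABLE bronze.customer_events
        MODIFY COLUMN payload SET MASKING POLICY mask_profile;

    -- Time Travel: query a table as it looked an hour ago
    -- (offset in seconds), or restore an accidentally dropped table.
    SELECT COUNT(*) FROM silver.orders AT (OFFSET => -3600);
    UNDROP TABLE silver.orders;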

Data lake architectures with Snowflake are not just theory. The Little Book of Big Success with Snowflake contains interesting customer examples, including Siemens in Germany. Would you like first-hand information? Then I recommend this session at the Snowflake Summit taking place in June: How Siemens Healthineers Drives Digital Transformation by Migrating Its Data Lake Use Cases to Snowflake. And if you are planning your Cloud Data Lake initiative, you may find some useful hints in the ebook Cloud Data Lakes For Dummies.

Data Engineers & Data Scientists: This is for you!

Data pipelines in Snowflake can be batch or continuous, and processing can happen directly within Snowflake itself. Thanks to Snowflake's multi-cluster compute approach, these pipelines can handle complex transformations without impacting the performance of other workloads (data analysts etc.). Sonra's blog article gives some nice hands-on advice on how to use Snowflake as a data transformation engine for data preparation. For automatic ingestion and data processing, Snowflake introduced Snowpipe to simplify the process of moving files from object stores into Snowflake as they arrive in an AWS, Azure or Google object store, eliminating the need for scripts and scheduling tools. A practical scenario can be found in InterWorks' Snowpipe 101. Snowflake also offers an Apache Kafka consumer connector to easily ingest streaming data from Kafka, and in some projects this integration takes a key role: Joyn, a multi-channel TV and video streaming platform in Germany, reports in their Joyn Tech Blog that Apache Kafka became the new main data source of their data warehouse, since the microservices in the backend push their data to it. In contrast, there are numerous on-premises OLTP databases that are required to offload data to analytic platforms in the cloud. How data can be ingested from OLTP databases into Kafka, with an initial snapshot followed by any change made against the tracked table(s), is discussed in detail in Confluent's Pipeline to the Cloud article. As data lands in Snowflake, table streams capture changes that have occurred since the last load. Table streams support change data capture (CDC) use cases, and tasks in Snowflake trigger processing without the need for complex scheduling routines. Together, Snowpipe auto-ingest, table streams and tasks enable the continuous movement of data within Snowflake.
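
To show how these three building blocks fit together, here is a minimal, hypothetical sketch (the stage, table and warehouse names are invented): Snowpipe auto-ingests files, a stream records the new rows, and a task consumes the stream only when there is data.

    -- 1. Snowpipe: auto-ingest JSON files as they arrive in the stage.
    CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw.orders
        FROM @raw.orders_stage
        FILE_FORMAT = (TYPE = JSON);

    -- 2. Table stream: captures changes (CDC) since the last consumption.
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- 3. Task: runs on a schedule, but only if the stream has data.
    CREATE OR REPLACE TASK raw.load_silver_orders
        WAREHOUSE = etl_wh
        SCHEDULE  = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
        INSERT INTO silver.orders
        SELECT * FROM raw.orders_stream;

    ALTER TASK raw.load_silver_orders RESUME;  -- tasks start suspended

Because the task's WHEN clause checks the stream first, no warehouse credits are consumed on empty runs - a simple but effective pattern for continuous pipelines.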

Writing code for data transformations on Snowflake is significantly faster than on other platforms. Just use the power of SQL with the tool and/or framework of your choice. If interested, you can follow the step-by-step guide Accelerating Data Engineering with Snowflake & dbt (data build tool) to see some of the benefits this popular tandem brings. And there is more to come, as Snowflake continues to extend its platform with features that help customers achieve more with less. I recommend attending these sessions at the upcoming Snowflake Summit:

- What's New: Extensibility in Snowflake and Snowpark and Java Functions Under the Hood - The Snowpark API opens up a new programming model for Snowflake, and Java functions expand the possibilities for data transformation and analysis.

- What's New: Unstructured Data Management in Snowflake - In this session, you will learn about Snowflake's new support for unstructured data and see demos on how you can store, access, process, govern and share it in a single data platform.

- What's New: Performance and Core Engine Improvements in Snowflake

- Improving Feature Engineering by Storing and Serving Features with Snowflake - In this session, you'll hear about patterns and techniques that can be applied to perform in-database feature engineering, storing and serving thousands of features with Snowflake.

The session on feature engineering recommended above is a good lead into the next use case for Snowflake: increasingly, Data Science and Data Engineering teams are turning towards feature stores to manage the data sets and data pipelines needed to productionize their machine learning applications. In this context, a feature is data used as an input signal to a predictive model, and a central feature store enables feature discovery and sharing, model training, and the serving of production data to a model for inference. The general concept and main components of a modern feature store are vividly explained in Tecton's blog article. The following overview shows to what degree, and with which technical functions, Snowflake can simplify processes in the individual phases of a typical machine learning lifecycle.
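
As a rough, hypothetical sketch of what "Snowflake as a feature store" can look like in practice (all table and column names are illustrative, not a product API): features are maintained as plain tables and assembled into training sets with SQL.

    -- Hypothetical feature table: one row per entity and computation time.
    CREATE OR REPLACE TABLE features.customer_features (
        customer_id  NUMBER,
        feature_ts   TIMESTAMP_NTZ,
        orders_30d   NUMBER,   -- e.g. order count over the last 30 days
        avg_basket   FLOAT     -- e.g. average basket value
    );

    -- Assemble a training set: the latest feature row per customer,
    -- joined to the labels the model should learn to predict.
    SELECT l.customer_id, l.churned, f.orders_30d, f.avg_basket
    FROM labels.churn l
    JOIN features.customer_features f
      ON f.customer_id = l.customer_id
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY f.customer_id ORDER BY f.feature_ts DESC) = 1;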

Many Data Scientists have confirmed to me that they spend 80% or more of their time searching for hidden data sets and on data preparation, feature storing, transformation and serving. These are tasks Snowflake can definitely help with (see my introductory talk here), and there are already exciting case studies and success stories, such as from Uniper in Germany (one of Europe's largest power generators, which eliminated data silos, reduced administrative effort, and freed up capacity for machine learning), Zabka in Poland (the largest convenience chain in Europe with 6,500 stores, which presented its digital transformation journey incl. a new store location planning tool) and Elkjøp in Norway (the market leader in consumer electronics in Northern Europe, with a presentation on how their own data science team is collaboratively using Snowflake while foregoing Spark for feature engineering).

Machine learning libraries and frameworks are rapidly evolving. With Snowflake you can avoid re-platforming and migrating your data unnecessarily. Snowflake provides connectors for Spark and Python (with Apache Arrow support) for seamless connectivity to open-source ML libraries. Try it yourself with one of these tutorials: Build a Recommendation Engine with AWS SageMaker (uses the Surprise Python library and External Functions in Snowflake to call external API endpoints), or build a time series forecasting model using Zepl (a data science notebook using Facebook's Prophet and other Python libraries, with data loaded from Snowflake's Data Marketplace). And there are more guides and other resources showcasing Snowflake's optimized integrations with leading AutoML tools such as DataRobot, Dataiku and H2O.ai. By the way, this article has mentioned various open-source frameworks, formats and platforms in the context of Snowflake. What is the Snowflake executives' point of view on this? The recent article Choosing Open Wisely answers this question and concludes as follows: "We believe in the value of open standards and open source, but also in the value of data governance and security; we believe in the value of ease of use, the power of community, and the value of abstractions that enable transparent optimisations and improvements over time." I think this statement sums up the product philosophy very well, and we will hear about more examples in the next section.

Data as a Product & Domain-driven Development

Initiatives that aim to democratise data and enable the whole organisation to treat data as a strategic asset are currently on every agenda. In this context, many projects I know of are looking at and discussing the applicability of domain-driven development with data, data meshes as an alternative to centralised data platforms, and product thinking with data. DPG Media from Belgium shared an interesting experience report on that topic worth reading: Data Mesh - A Self-service Infrastructure at DPG Media with Snowflake. Their IT department is structured in areas covering certain business domains. Business domains with huge data challenges get dedicated data engineering resources, placed in the area responsible for the domain. Simultaneously, less data-heavy domains can borrow data engineers from a central pool responsible for building data pipelines for the respective domains. Some of the positive results that have been achieved are a low entry barrier to using the new data platform, software engineering best practices brought to the data landscape, and a central team managing and supporting governance rules.

My interim conclusion is that in organisations where domain-driven development is practiced and microservices architectures are implemented, it makes sense to also think about moving ownership of data to the domains. However, the biggest challenge is not technical; it's about gaining the maturity for product owners to really think about data and treat it as a product - and no longer as a byproduct. So why not combine data with product thinking? Z. Dehghani from Thoughtworks did exactly that and shared her thoughts in a recent webinar, where she explores the principle of Data as a Product and describes how this simple change in perspective has a deep and profound impact on how we collect, serve and manage data. One key point is defining what makes a successful data product: e.g., how easy is it for data consumers to discover, understand and securely use high-quality data with a good experience? With regard to the role of a product owner, the setting of success criteria (customer satisfaction, # of users, # of downstream consuming data products/apps, etc.) and the definition of responsibilities and capabilities (deep understanding of the business domain/use cases, data semantics/syntax/regulations, long-term ownership) are pretty crucial. However, there are not only casual data users who are interested in cleaned and prepared data sets, but also data scientists who could find a goldmine of unnoticed signals in raw data for their purposes (e.g. fraud detection). So it is important to still keep raw data accessible when designing a decentralised data platform.

And as more and more companies look at implementing this latest trend, some are asking the question: does a distributed, domain-driven architecture mean the end for the central data (warehouse and/or engineering) team? To answer this question for yourself, you may find this quick assessment helpful in determining whether your company is a candidate for a data mesh. Personally, I don't think a data mesh will eliminate the need for data warehouse teams, and I expect there will be coexistence in the future. In the end, what matters is how quickly and efficiently a company can integrate the growing number of new data sources into its own platform and ensure that no one is left out who wants to actively work with data - whether as an occasional user or a full-time analyst.

As shown at the beginning in the Architectural Bird's Eye View, a Snowflake environment can easily be organised in a domain-oriented way and still make the (mutual) use of data products easy. In this context, highly relevant input can be found in the webinar Designing Snowflake for Multi-Tenant Data Applications, which discusses tenancy options, optimising Snowflake storage and compute, and security considerations. More supporting capabilities and examples to check in the context of domain-driven architectures are:

- How can Infrastructure as Code (IaC) be applied to manage Snowflake configurations in an automated and source-controlled way? Here is a guide, Terraforming Snowflake, demonstrating how to install and use Terraform to create and manage your Snowflake environment.

- DevOps: These guides provide step-by-step instructions for how to build a CI/CD pipeline for Snowflake with GitHub Actions or Azure DevOps, using a framework called schemachange as a Database Change Management (DCM) tool (see the sketch after this list).
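
To give a flavour of the schemachange approach (a hedged sketch; the object names are invented, while the versioned file naming convention V<version>__<description>.sql is schemachange's own): each migration is an ordinary SQL script, and schemachange applies scripts in version order, recording every run in a change history table so that a script executes only once.

    -- File: migrations/V1.1.0__create_orders_table.sql
    CREATE TABLE IF NOT EXISTS silver.orders (
        order_id     NUMBER,
        customer_id  NUMBER,
        order_ts     TIMESTAMP_NTZ
    );

    -- File: migrations/V1.2.0__add_order_status.sql
    -- A later, higher-versioned script evolves the same table.
    ALTER TABLE silver.orders ADD COLUMN status VARCHAR;

Committing these scripts to Git and running schemachange from GitHub Actions or Azure DevOps is what turns schema evolution into a reviewable, repeatable pipeline step.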

Practical Snowflake examples of real Data Products are of course also available; more on this in the next section of this article.

Secure Data Sharing and Data Marketplace

Snowflake Data Marketplace gives data scientists, business intelligence and analytics professionals access to more than 510 live and ready-to-query data sets from over 140 third-party data providers and data service providers (as of the beginning of June). And if you search the Snowflake Data Marketplace catalog for the data provider "Sonra", you will find data sets that fulfil the usability aspect of data products very well. First example: select the OpenStreetMap data set for Germany (or a country of your choice). The short introduction contains a link to the data set manual with detailed explanations, use cases and SQL code examples for a quick start. In this way, analysts and data scientists get ultra-fast access to location-based features - without their own data ingestion and data preparation efforts! Secondly, Sonra published a universal Geography Dimension that covers all major countries in the world, administrative levels and major towns/cities with a population > 500. You can easily join the data with your own in-house data to look up location data, and you can also use it for drilling down / rolling up in any type of location analytics. Both examples precisely reflect how to capitalise on a variety of open and commercial data sets in 16 categories, including public health, weather, location and demographics, as shown in the image below.
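
A hedged sketch of such a join (the shared database and all table/column names here are invented for illustration, not Sonra's actual schema):

    -- Join your own table to a shared geography dimension to enrich
    -- transactions with location attributes (names are illustrative).
    SELECT t.transaction_id,
           t.amount,
           g.country_name,
           g.admin_level_1   -- e.g. federal state / region
    FROM my_db.sales.transactions t
    JOIN geography_share.public.geo_dimension g
      ON t.city_id = g.city_id;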

The Snowflake Data Marketplace is based on Snowflake Secure Data Sharing, which enables sharing selected objects in a database in your own Snowflake account with other Snowflake accounts. With Data Sharing, no actual data is copied or transferred between accounts. All sharing is accomplished through Snowflake's unique services layer and metadata store. This is an important concept, because it means that shared data does not take up any storage in a consumer account and therefore does not contribute to the consumer's monthly data storage charges. The only charges to consumers are for the compute resources used to query the shared data. If you intend to set up your own Data Hub to share data at scale with your entire business ecosystem - suppliers, partners, vendors and customers, as well as business units within your own organisation - then setting up your own Snowflake Data Exchange could be the right next step for more effective collaboration with data. And as the image below already indicates, Global Snowflake utilises database replication to allow data providers to securely share data with data consumers across different regions and cloud platforms (AWS, Azure, GCP). And that's the Snowflake way = the Data Cloud, which helps to avoid data silos and provides a unified experience by operating seamlessly across multiple cloud providers and regions, from anywhere in your organisation.
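
The provider/consumer mechanics look roughly like this (a minimal sketch with invented account and object names):

    -- Provider side: create a share and grant access to selected objects.
    CREATE SHARE sales_share;
    GRANT USAGE ON DATABASE sales             TO SHARE sales_share;
    GRANT USAGE ON SCHEMA sales.public        TO SHARE sales_share;
    GRANT SELECT ON TABLE sales.public.orders TO SHARE sales_share;
    ALTER SHARE sales_share ADD ACCOUNTS = partner_account;

    -- Consumer side: the share appears as a read-only database; no data
    -- is copied, and only the consumer's own compute usage is billed.
    CREATE DATABASE shared_sales FROM SHARE provider_account.sales_share;
    SELECT COUNT(*) FROM shared_sales.public.orders;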

I hope this Snowflake tour has been useful and that the
additional resources inspire you to get hands-on and start a
first project yourself. And don't forget: Build your own data
architecture, do it smart, and give it a fancy name!
