Migrating Oracle Database To Snowflake Reference Manual
INTRODUCTION

PREPARING FOR THE MIGRATION
  Document the existing solution
  Establish a migration approach
  Capture the development and deployment processes
  Prioritize data sets for migration
  Identify the migration team
  Define the migration deadlines and budget
  Determine the migration outcomes

EXECUTING THE MIGRATION
  Establish security
  Develop a test plan
  Prepare Snowflake for loading
  Load initial data sets
  Keep data up-to-date
  Implement the test plan
  Run Oracle and Snowflake in parallel
  Redirect tools to Snowflake
  Cut over to Snowflake

ENSURING MIGRATION SUCCESS
  Identify and mitigate differences between Oracle and Snowflake
  Resolve migration issues
  Communicate migration benefits
  Need help migrating?

APPENDICES
  Appendix A - Oracle schemas to exclude when migrating to Snowflake
  Appendix B - Converting Oracle data types to Snowflake data types
  Appendix C - Query to evaluate Oracle data type usage
  Appendix D - Other known issues for Oracle to Snowflake migration
    Enforcement of primary keys & foreign keys
    DATE vs. TO_DATE()
    Date subtraction
    Updating data through a view
    Oracle syntax
    Stored procedure
    Synonyms
  Appendix E - Comparing data from Oracle with Snowflake

About Snowflake
TECHNICAL GUIDE
INTRODUCTION
This document is intended for data engineers, solution architects, and
Snowflake solution partners who need guidance on the scope and process
for migrating from Oracle Database to Snowflake.
PREPARING FOR THE MIGRATION
Successful data migration projects start with a well-designed plan.
An effective plan accounts for the many components that need to be
considered, paying particular attention to architecture and data
preparation. This section gives you a checklist of information to
gather and decisions to make before you start the actual migration.

DOCUMENT THE EXISTING SOLUTION

KEY OUTCOMES:
• List of Oracle databases that need to be migrated
• List of Oracle database objects that need to be migrated
• List of processes and tools that populate and pull data from the Oracle databases
• List of security roles, users, and permissions
• List of Snowflake accounts that exist or need creating
• List of identified issues with design, data sets, or processes that may impact the migration
• List detailing the frequency of security provisioning processes
• Documentation of the existing Oracle solution in an as-is architecture diagram
Begin preparing for the migration from Oracle to
Snowflake by determining which Oracle databases within
the Oracle system need to be migrated. Then, identify
and document the database objects within the Oracle
databases that need to be migrated, including the size of
the data, to establish the migration project scope. Plan to
exclude schemas specific to Oracle, such as SYS, SYSAUX,
ANONYMOUS, CTXSYS, and DBSNMP, which aren't needed
in Snowflake. Appendix A provides a full list of Oracle
schemas that you should exclude from the migration.
When you are unsure which databases and database
objects to migrate from Oracle database 12c/18c, you
can query the newly introduced ORACLE_MAINTAINED
column in the DBA_USERS dictionary view. This column
contains a value of “Y” for any schema that was created
via Oracle scripts. Avoid migrating unused objects unless
they are needed for audit or historical purposes.
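A query along these lines lists the schemas that were not created by Oracle scripts and are therefore migration candidates (run it as a user with access to the DBA views):

```sql
-- Schemas not maintained by Oracle are migration candidates
-- (Oracle Database 12c and later).
SELECT username
FROM   dba_users
WHERE  oracle_maintained = 'N'
ORDER  BY username;
```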
After you have identified the Oracle databases and
database objects for migration, evaluate the data sources
that populate them to determine whether they come from
an on-premises or cloud-based source. This will help you determine the
methods available for loading the data into Snowflake. Specifically,
look for the number of terabytes or petabytes of on-premises data to
load into Snowflake. If you need to load a large amount of data, you
may need to use AWS Snowball or Azure Data Box to move the data as
efficiently as possible.

Evaluate the data sources that populate the Oracle databases. Identify
and document the processes and tools that move data into the Oracle
databases and pull data from the Oracle databases (for example,
ETL/ELT tools, scripting languages, reporting and visualization tools,
data science processes, and machine learning processes). Use that
information to evaluate the level of Snowflake support for the tools
currently in use, as well as to provide guidance on which migration
approach would best fit your needs. As these are critical processes,
be sure to document areas that could present issues in the migration.

Document the roles and users that exist within the Oracle system, and
their granted permissions, to prepare for the security implementation
in Snowflake. Pay special attention to sensitive data sets and how
they are secured within the Oracle system. Also determine how
frequently security provisioning processes run, so you can create
similar security within Snowflake. In addition, document the Snowflake
accounts already set up and any Snowflake accounts necessary for the
migration, since they will affect the security implementation. If you
do not have this information readily available, a Snowflake solution
partner can help capture it.

ESTABLISH A MIGRATION APPROACH

Migration approaches fall along a spectrum, from taking the existing
solution as-is to completely reworking it. More reengineering requires
more development and testing, which extends the length of a migration
project. Therefore, unless your system is broken, we generally
recommend minimal reengineering for the first iteration.

In addition, as part of your migration, you may need to resolve
challenges that exist with your Oracle implementation, so include
these in your migration plan. Break the migration into incremental
deliverables that enable your organization to make the transition to
Snowflake and provide value to stakeholders sooner.

Use the as-is architecture diagram to create a future-state
architecture diagram for communicating the migration approach to
stakeholders and ensuring that the approach meets their requirements.

CAPTURE THE DEVELOPMENT AND DEPLOYMENT PROCESSES

KEY OUTCOMES:
• List of tools introduced with the migration
• List of tools deprecated after the migration
• List of development environments needed for the migration
• List of deployment processes used for the migration
Your organization may want to change its development or deployment
processes as part of the migration. Whether these processes change or
not, capture the development environments used for the migration (for
example, preproduction/production or development/QA/production) and
the deployment processes used for the migration (for example, the
source control repository and the method for deploying changes from
one environment to another). This information is critical to how you
implement the development and deployment.

PRIORITIZE DATA SETS FOR MIGRATION

KEY OUTCOMES:
• List of data sets to migrate first
• Method for identifying process dependencies for data sets
• Documentation of process dependencies for data sets

To build momentum for the project, identify which data sets to migrate
first. Consider high-priority data sets that have few dependencies.
Begin with a simple data set that provides a quick win and establishes
a foundation of development and deployment processes from which to
build the rest of the migration.

To prioritize data sets for migration, pay careful attention to the
process dependencies of the data sets and document those dependencies.
By identifying dependencies before beginning the migration work, you
will experience fewer challenges during the migration. When you have a
prioritized list of data sets, leverage it with the above principles
in mind. If you don't have a prioritized list, identify those data
sets and engage a Snowflake solution partner, if necessary, to help
capture this information.

Ideally, capture this documentation using an automated process that
iterates through the existing job schedules. This will minimize the
need to manually identify and document changes. An automated process
provides value throughout the migration project by more easily
identifying the ongoing changes that occur during the project. This is
important since the underlying systems are unlikely to be static
during the migration.

IDENTIFY THE MIGRATION TEAM

KEY OUTCOMES:
• List of migration team members and roles
• Contact information for all team members

To complete the migration plan, document the people involved in the
migration and the roles they will play. Team members may come from
your organization, Snowflake, or a Snowflake solution partner.

The roles required on the migration team include database
administrator, quality assurance engineer, business owner, project
manager, program manager, scrum master, and communications specialist.
A Snowflake solution partner can fulfill multiple needs, including
solution design, requirements gathering, documentation, development,
testing, delivery, project management, and training. The entire team
works together to successfully complete the migration and communicate
its progress to stakeholders.
DEFINE THE MIGRATION DEADLINES AND BUDGET

KEY OUTCOMES:
• List of business expectations for the migration deadline
• Documented budget allocated for the migration project
• Completed template for estimating Snowflake virtual warehouse costs

Business expectations for migration deadlines are an important input
to your plan. In addition, consider other information such as the
budget, the availability of resources, and the amount of required
reengineering. By gathering all of this information, you can establish
and communicate achievable deadlines, even if the deadlines differ
from the business expectations. Often businesses have migration
deadlines based on events, such as needing to deprecate a system
before a removal date. Sometimes these deadlines are not achievable.
When this happens, work with stakeholders on more realistic migration
scenarios.

Be sure to understand the migration budget. Compare the amount of
migration work and the associated costs to the available budget to
ensure that there are sufficient funds to complete the work.

A key consideration for budget planning is basic data warehouse
sizing, such as the number of compute clusters required to support the
migration and the post-migration data warehouse. A Snowflake
representative can provide a template and work with you to determine
the virtual warehouses necessary to do the work (for example, ETL/ELT
and reporting and visualization). The template calculates the number
of minutes a warehouse is expected to run each day and the number of
days it is expected to run each week. After you complete the template,
you will get an estimated annual cost.

DETERMINE THE MIGRATION OUTCOMES

KEY OUTCOMES:
• List of assumptions and high-level outcomes for the completion of the migration
• Documented plan for communicating the success of the migration project to stakeholders

As the final step of preparing for the migration, capture the
assumptions that will determine whether the migration is successful,
the high-level outcomes that should be achieved by the migration, and
the benefits those outcomes provide for stakeholders. Use this
documentation to validate that the migration project provides the
overall benefits stakeholders expect to achieve from the migration.
For example, if turning off an Oracle system is one of the desired
outcomes, the migration plan should include that outcome.

You can express this information as success or failure criteria for
the migration project. You might also include benchmarks that compare
process execution on Oracle to process execution on Snowflake. After
you compile this information, use it to communicate the success of the
migration project to stakeholders.
EXECUTING THE MIGRATION
ESTABLISH SECURITY
You can manually create roles and users when you
initially set up a Snowflake account; however, as soon
as possible, you should automate a process for creating
roles and users. You should also establish an automated
process for users to request system access. Depending
on the security requirements, for auditing purposes,
you might need to establish and document role
creation, user creation, and the assignment of users to
roles.
Although the existing Oracle system security can be
a good starting point for setting up security within
Snowflake, evaluate the Oracle security configuration
to determine if there are roles and users that you no
longer need or that you should implement differently
as part of the migration to Snowflake.
Start by creating roles for at least the first data sets
that you will migrate. Assign users to these roles based
on the work they will do for the migration. When you
complete this setup, users can log into Snowflake
to see their roles in preparation for the creation of
databases and warehouses in the next step.
You can establish common roles for developer access
to nonproduction databases, read-only access, read
and write access, and administrative access. You
might need additional roles for restricting access to
sensitive data.
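As a sketch, the roles for a first migrated data set might be created as follows; the role, database, schema, and user names are placeholders, not part of the original guide:

```sql
-- Placeholder names throughout; adapt to your environment.
CREATE ROLE IF NOT EXISTS sales_read;       -- read-only access
CREATE ROLE IF NOT EXISTS sales_readwrite;  -- read and write access

-- Read-only privileges on the Sales schema.
GRANT USAGE  ON DATABASE dev                   TO ROLE sales_read;
GRANT USAGE  ON SCHEMA   dev.sales             TO ROLE sales_read;
GRANT SELECT ON ALL TABLES IN SCHEMA dev.sales TO ROLE sales_read;

-- The read/write role inherits read-only, then adds DML privileges.
GRANT ROLE sales_read TO ROLE sales_readwrite;
GRANT INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA dev.sales
  TO ROLE sales_readwrite;

-- A migration developer assigned to the read/write role.
CREATE USER IF NOT EXISTS migration_dev DEFAULT_ROLE = sales_readwrite;
GRANT ROLE sales_readwrite TO USER migration_dev;
```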
DEVELOP A TEST PLAN

Determine and execute the appropriate level and scope of testing for
each environment (for example, schedules aren't executed in
development, but they are in QA and production, and data comparisons
between the Oracle system and Snowflake occur only for production).
Automate testing as much as possible so that it is repeatable and
provides results that you can evaluate. Ensure that acceptance
criteria for the tests are defined, agreed to, and documented.

PREPARE SNOWFLAKE FOR DATA LOADING

Create a Snowflake database for each Oracle database that you need to
migrate (for example, databases for development, QA, and production).
Within the Snowflake databases you create for each Oracle database,
create corresponding schemas.

As shown in the following two figures, Oracle Database supports two
general configurations:

• Single-instance configuration, where all database functionality and resource management are self-contained. (Figure 1)
• Multitenant configuration, which is a two-part architecture consisting of a container database (CDB) and pluggable databases (PDBs). (Figure 2)

A container database provides a layer of common resources shared
across the environment, and each pluggable database operates as a set
of schemas, schema objects, and non-schema objects that can be plugged
into and unplugged from the container database. From the user
perspective, a PDB appears as a single database, but it is actually
managed within a container that may have one or more PDBs.
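Whichever Oracle configuration you start from, the corresponding Snowflake databases and schemas might be created as follows (the environment and schema names are illustrative):

```sql
-- One database per environment, one schema per Oracle schema or PDB.
CREATE DATABASE IF NOT EXISTS dev;
CREATE DATABASE IF NOT EXISTS qa;
CREATE DATABASE IF NOT EXISTS prod;

CREATE SCHEMA IF NOT EXISTS prod.sales;
CREATE SCHEMA IF NOT EXISTS prod.marketing;
CREATE SCHEMA IF NOT EXISTS prod.hr;
```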
FIGURE 1: SINGLE-INSTANCE ORACLE DATABASE TO SNOWFLAKE DATABASE
(The Sales, Marketing, and HR schemas of the Oracle database map to corresponding schemas within a Snowflake database.)
FIGURE 2: MULTITENANT ORACLE DATABASE TO SNOWFLAKE DATABASE
(Pluggable databases shown: Sales Database, Marketing Database, and HR Database.)
Using the approach described in Figure 2 clearly identifies the
environment and uses schemas to contain the tables and views, so that
tools can more easily be redirected from the Oracle system to
Snowflake.

The Snowflake database name specifies the environment (for example,
production or QA) in the view definition. Therefore, when you migrate
a view from one environment to another, be sure to update the name.
A fully qualified schema object has the following form:

<database_name>.<schema_name>.<object_name>

For example, when migrating from a database named QA to one named
PROD, the following SQL statement:

create or replace view db_view as select * from QA.public.db_table;

would need to be modified as follows:

create or replace view db_view as select * from PROD.public.db_table;

After you create the databases and schemas in Snowflake, you can
execute the DDL for creating the database objects in Snowflake.

Create the virtual warehouses based on the information you captured
during the migration preparation. As shown in the following figure,
there should be a separate virtual warehouse for each function in the
environment. The virtual warehouses will support activities such as
data science, ad hoc user queries, and BI tools. The figure also
serves as a reference architecture for using virtual warehouses for
different workloads.
[Reference architecture diagram: source systems (cloud or on-premises)
feed S3 staging databases; data transformation tools use one or more
virtual warehouses for ETL/ELT transformations into the databases
(schemas, tables, views, etc.); a separate virtual warehouse serves
ad hoc SQL queries through the web UI; and virtual warehouse(s) for BI
serve reporting and business intelligence tools through native
connectors, ODBC, or JDBC.]
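The per-workload warehouses shown above might be created as follows; the names, sizes, and credit quota are illustrative assumptions, not recommendations from this guide:

```sql
-- Separate warehouses per workload so usage can be tracked and tuned.
CREATE WAREHOUSE IF NOT EXISTS transform_wh
  WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;
CREATE WAREHOUSE IF NOT EXISTS adhoc_wh
  WAREHOUSE_SIZE = 'SMALL'  AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;
CREATE WAREHOUSE IF NOT EXISTS bi_wh
  WAREHOUSE_SIZE = 'SMALL'  AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;

-- A resource monitor caps spend during the migration.
CREATE OR REPLACE RESOURCE MONITOR migration_rm
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 90  PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = migration_rm;
```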
Base the initial sizing of the virtual warehouses on the estimates
created while preparing for the migration. Then adjust them as needed
throughout the migration. Also, set up resource monitors to track
usage and take appropriate action when limits are reached. See the
"Managing Resource Monitors" section in the Snowflake documentation
for detailed information.

As you create the databases, database objects, and virtual warehouses,
assign them to the appropriate security roles.

LOAD INITIAL DATA SETS

To begin migrating data from the Oracle system to Snowflake, you'll
need to extract data from the Oracle system. If the Oracle system is
on-premises and you need to move terabytes or petabytes of data into
the cloud, you might need to use AWS Snowball or Azure Data Box. Add
an appropriate amount of time to the migration schedule to provision
these boxes, load them with data, transport them to the cloud data
center, and offload the data into the cloud servers.

After you have moved the data to the cloud, load the data
into Snowflake.
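Once the extracted files are in cloud storage, a load can be sketched as follows; the stage URL, file format, and table names are hypothetical:

```sql
-- External stage pointing at the extracted Oracle data (placeholder
-- URL; credentials or a storage integration are configured separately).
CREATE STAGE IF NOT EXISTS prod.sales.oracle_extracts
  URL = 's3://example-bucket/oracle-extracts/'
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

-- Bulk load one table's files, stopping on the first error so that
-- data type or column mapping problems surface early.
COPY INTO prod.sales.orders
  FROM @prod.sales.oracle_extracts/orders/
  ON_ERROR = 'ABORT_STATEMENT';
```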
See the "Overview of Data Loading" section in the Snowflake
documentation. Use this data loading to test the configuration of the
databases, database objects, virtual warehouses, and the security
you've implemented.

Depending on which Oracle environment the data came from and which
Snowflake data warehouse is populated, you could use cloning to move
data within Snowflake from one data warehouse to another. Cloning
requires fewer resources than loading the same data multiple times
into different Snowflake data warehouses.

Due to issues such as failed data type or column mappings that occur
with extraction and loading, plan to extract and load data more than
once. Also plan for time between when the initial data sets are loaded
and when the ETL/ELT processes are ready to keep the data up to date.
Begin with a subset of the data from the Oracle system, rather than
trying to load the entire contents of the Oracle system at the
beginning of the migration. Appendix B and Appendix C of this guide
provide additional data type conversion and usage information that can
further clarify the data extraction and loading process.

KEEP DATA UP TO DATE

To ensure a complete history is available in Snowflake, wait until
after you load the historical data sets from Oracle to implement the
processes for keeping the data up to date.

Whether you load data into Snowflake by updating existing Oracle
processes or by creating new processes, set up the processes on
appropriate schedules. Usually, this means using the same schedules
you used for loading the Oracle database. This is another opportunity
to evaluate whether changes to the schedules should be part of
the migration.

To ensure you populate the data in the correct order, create the
schedules based on a clear understanding of the process dependencies
you captured as part of preparing for the migration.

In addition to scheduling the processes to run, monitor the processes
so you understand and can communicate the state of the data (for
example, loading is in progress, loading completed successfully, or
loading failures occurred that need addressing). Use monitoring to
compare execution results with established SLAs to verify that SLAs
are being met within Snowflake and to identify performance and
process issues.

IMPLEMENT THE TEST PLAN

Test the Snowflake implementation after you've loaded the initial data
sets, and make sure processes are running to keep the data up to date.
Engage team members to test their data sets and applications against
Snowflake. Then, after you complete initial testing, engage additional
groups to perform tests and validate the data.

To make sure the migration has completed successfully, compare data
between the Oracle and Snowflake environments throughout the
migration. Investigate differences to determine the cause and resolve
any issues.

If part of the migration includes fixing processes that were incorrect
in the Oracle system, the test results may not match between the
systems. In such a case, use other methods to make sure the data is
correct in Snowflake. Document the reasons that data won't match
between the Oracle and Snowflake environments, and share the
documentation with the groups performing testing so they don't spend
time researching previously identified issues.

Also, compare the performance of the processes that load and consume
data to ensure Snowflake is performing as expected. Share these
comparisons with stakeholders to highlight the benefits of migrating
from Oracle to Snowflake.

RUN THE ORACLE AND SNOWFLAKE SYSTEMS IN PARALLEL

During the migration, the Oracle and Snowflake systems will run in
parallel for a period of time. Minimize the amount of time both
systems are running, but validate that the migration has completed
successfully before shutting down the Oracle system.

When both systems are running in parallel, determine how best to
compare data and performance. For example, you may need to create
hashes as data is extracted from the Oracle system, which you can then
use to compare data at the row level between the Oracle and Snowflake
systems (Appendix E explains this approach further). Perform these
comparisons in Snowflake, with resources provisioned so that comparing
data does not negatively impact the Oracle system.
REDIRECT TOOLS TO SNOWFLAKE
Refer to the list of tools you gathered while preparing for the
migration, along with the information on the level of Snowflake
support for each tool. Then update the tool connections to redirect
the tools to Snowflake after you've migrated a sufficient amount of
data for use by each tool.
Redirecting tools to Snowflake usually involves creating
copies of the existing solution that point to the Oracle
database, and updating them to point to Snowflake
instead. Compare the output of the tools to ensure the
results are the same between the Oracle and Snowflake
systems. Also evaluate the performance of the tools to
verify they are performing as expected in Snowflake.
ENSURING MIGRATION SUCCESS
NEED HELP MIGRATING?
Snowflake expert resources are available to accelerate
your migration, structure and optimize your planning
and implementation activities, and apply customer
best practices to meet your technology and business
objectives. Snowflake's Professional Services deploys
a powerful combination of data architecture expertise
and advanced technical knowledge of the platform
to deliver high-performing data strategies, proofs of
concept, and migration projects.
Our global and regional solution partners also have
extensive experience performing proofs of concept and
platform migrations. They offer services ranging from
high-level architectural recommendations to manual
code conversions. Many Snowflake partners have also
built tools to automate and accelerate the migration
process.
Whether your organization is fully staffed for a
platform migration or you need additional expertise,
Snowflake and our solution partners have the skills
and tools to accelerate your journey to cloud-built
data analytics, so you can reap the full benefits of
Snowflake quickly. To find out more, please contact
the Snowflake sales team or visit Snowflake’s Customer
Community Lodge.
APPENDICES: ORACLE TO SNOWFLAKE MIGRATION GUIDANCE
APPENDIX A
ORACLE SCHEMAS TO EXCLUDE WHEN MIGRATING TO SNOWFLAKE
The following schemas are needed only by Oracle
and shouldn't be migrated to Snowflake:
• ANONYMOUS
• APEX_XXXXXX
• CTXSYS
• DBSNMP
• EXFSYS
• LBACSYS
• MDSYS
• MGMT_VIEW
• OLAPSYS
• ORDDATA
• OWBSYS
• ORDPLUGINS
• ORDSYS
• OUTLN
• SI_INFORMTN_SCHEMA
• SYS
• SYSMAN
• SYSTEM
• WK_TEST
• WKSYS
• WKPROXY
• WMSYS
• XDB
• APEX_PUBLIC_USER
• DIP
• FLOWS_040100
• FLOWS_FILES
• MDDATA
• ORACLE_OCM
• SPATIAL_CSW_ADMIN_USR
• SPATIAL_WFS_ADMIN_USR
• XS$NULL
• BI
• HR
• OE
• PM
• IX
• SH
APPENDIX B
CONVERTING ORACLE DATA TYPES TO SNOWFLAKE DATA TYPES
Snowflake supports most basic SQL data types (with some restrictions) for use in columns, local variables, expressions, parameters,
and any other appropriate/suitable locations. Data types are automatically coerced whenever necessary and possible.
Oracle Data Type | Snowflake Data Type | Notes
FLOAT | NUMBER | Snowflake FLOAT is a 64-bit floating-point number, while FLOAT within Oracle is a subtype of NUMBER. Precision and scale will need to be consistent with those of the data values being loaded.
TIMESTAMP | TIMESTAMP_NTZ | All operations are performed without taking any time zone into account.
TIMESTAMP WITH TIME ZONE | TIMESTAMP_TZ | All operations are performed with the time zone offset specified.
TIMESTAMP WITH LOCAL TIME ZONE | TIMESTAMP_LTZ | All operations are performed in the current session's time zone, controlled by the TIMEZONE session parameter.
INTERVAL YEAR () TO MONTH | n/a | Alternative: Use the TIME_SLICE function to calculate the start and end times of fixed-width "buckets" into which data can be categorized.
INTERVAL DAY () TO SECOND | n/a | Alternative: Use the TIME_SLICE function to calculate the start and end times of fixed-width "buckets" into which data can be categorized.
ROWID | n/a | Snowflake VARCHAR can be used if ROWID values are migrated to Snowflake.
UROWID | n/a | Snowflake VARCHAR can be used if UROWID values are migrated to Snowflake.
NCLOB | VARCHAR | Maximum of 8 MB for 2-byte characters and 4 MB for 4-byte characters.
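Applying these mappings to a hypothetical Oracle table illustrates the conversion (the table and column names are not from the original guide):

```sql
-- Oracle original (illustrative):
--   CREATE TABLE orders (
--     order_id   NUMBER(10),
--     created_at TIMESTAMP WITH TIME ZONE,
--     notes      NCLOB
--   );
-- Snowflake equivalent per the mappings above:
CREATE TABLE orders (
  order_id   NUMBER(10),
  created_at TIMESTAMP_TZ,
  notes      VARCHAR
);
```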
APPENDIX C
QUERY TO EVALUATE ORACLE DATA TYPE USAGE
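The original query was not preserved in this copy of the guide; a sketch of a query with the same intent, built on the Oracle data dictionary, might look like this:

```sql
-- Count how often each data type is used across non-Oracle-maintained
-- schemas (Oracle Database 12c and later).
SELECT c.data_type,
       COUNT(*) AS column_count
FROM   dba_tab_columns c
JOIN   dba_users u
  ON   u.username = c.owner
WHERE  u.oracle_maintained = 'N'
GROUP  BY c.data_type
ORDER  BY column_count DESC;
```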
APPENDIX D
OTHER KNOWN ISSUES FOR ORACLE TO SNOWFLAKE MIGRATION
Oracle | Snowflake | Notes
DATE data type | TIMESTAMP_NTZ data type | The Snowflake DATE data type records a date without a time component; use TIMESTAMP_NTZ when the Oracle DATE values carry a time portion.
SQL format model RR | SQL format model YY | The Snowflake model format YY assumes the first two digits of the full year value as follows: 00-69 translate to 2000-2069, and 70-99 translate to 1970-1999.
TO_CHAR( <date>, 'J' ) | n/a | The Julian date SQL format model is not currently supported within Snowflake. An alternative is to create a user-defined function to calculate the Julian value from a specified date.
<date> - <date> | n/a | Use DATEDIFF instead. The direct subtraction of two dates is not currently supported within Snowflake.
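For example, a hypothetical Oracle date subtraction and its Snowflake rewrite:

```sql
-- Oracle: number of days between two dates via direct subtraction.
--   SELECT ship_date - order_date FROM orders;
-- Snowflake rewrite using DATEDIFF:
SELECT DATEDIFF('day', order_date, ship_date) FROM orders;
```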
ORACLE SYNTAX
Snowflake does not support the following Oracle SQL
syntax for creating tables (DDL):
• SEGMENT
• PCTFREE
• PCTUSED
• INITRANS
• MAXTRANS
• NOCOMPRESS
• LOGGING
• STORAGE
• TABLESPACE
• PARTITION
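As an illustration, a hypothetical Oracle CREATE TABLE with physical storage clauses and its Snowflake equivalent, which simply omits them:

```sql
-- Oracle original (illustrative):
--   CREATE TABLE sales (
--     sale_id   NUMBER(10),
--     sale_date DATE,
--     amount    NUMBER(12,2)
--   )
--   PCTFREE 10 INITRANS 1 LOGGING
--   STORAGE (INITIAL 64K)
--   TABLESPACE users;
-- Snowflake equivalent: the storage clauses are removed entirely.
CREATE TABLE sales (
  sale_id   NUMBER(10),
  sale_date TIMESTAMP_NTZ,
  amount    NUMBER(12,2)
);
```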
APPENDIX E
COMPARING DATA FROM ORACLE WITH SNOWFLAKE

Use row counts and sums of numeric data to verify that the data from
Oracle matches the data loaded into Snowflake. Or, get unique values
from columns in Oracle and compare those unique values with Snowflake
to ensure you've loaded all your data successfully.

For use cases where you need more in-depth validation, add an MD5 hash
to the data extracted from Oracle. Construct this MD5 hash using
columns that won't change when you load data into Snowflake (for
example, include key columns and attributes in the hash, but exclude
insert and update dates and timestamps that can change based on when
the data is loaded into Snowflake). As you load data into Snowflake,
generate another MD5 hash across the same set of columns and compare
it with the MD5 hash from Oracle. This allows you to compare the
contents of a row based on the MD5 hash rather than comparing each
column individually.

The table below shows example hash queries and their results:
¹ Oracle returns hash values in uppercase. MD5 hash values should always be lowercase. Use the Oracle lower function to correct for this.
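The example table did not survive in this copy of the guide; queries along these lines illustrate the approach (the table and column names are hypothetical):

```sql
-- Oracle side: hash the stable columns; RAWTOHEX and LOWER correct
-- for Oracle returning the hash as uppercase hex.
SELECT order_id,
       LOWER(RAWTOHEX(STANDARD_HASH(
         order_id || '|' || customer_id || '|' || amount, 'MD5'))) AS row_hash
FROM   orders;

-- Snowflake side: MD5 over the same concatenation; compare row_hash
-- values between the two systems by order_id.
SELECT order_id,
       MD5(order_id || '|' || customer_id || '|' || amount) AS row_hash
FROM   orders;
```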
APPENDIX F
ORACLE INSTANCE VS. SNOWFLAKE ACCOUNT

Similar to an Oracle instance, a Snowflake account encapsulates users, roles, and databases.

Oracle | Snowflake | Notes
Instance | Account | A Snowflake account is created within a single cloud provider's region, defines the Snowflake edition, controls authenticating user connections, and encapsulates all costs associated with the platform.
User | User | Snowflake users are created and managed at the account level and are independent of database schema objects.
Role | Role | Object ownership and object access control are managed at the role level using a combination of discretionary access control (DAC) and role-based access control (RBAC). Access privileges are not granted directly to a user.
Database | Database(s) | A single Snowflake account supports the creation of an unlimited number of logical databases.
Tablespace | n/a | Snowflake's unique architecture eliminates the need to manage tablespaces as well as database files and block and extent sizing.
Oracle | Snowflake | Notes
Schemas | Schemas | Snowflake schema objects are created and managed independently of a user login.
Tables | Tables | Snowflake supports permanent, transient, temporary, and clustered tables. External tables are not currently supported within Snowflake.
Table Partitions | n/a | Snowflake's unique architecture eliminates the need to manage physical table partitions.
Constraints | Constraints | Snowflake provides support for constraints as defined in the ANSI SQL standard, as well as some extensions for compatibility with other databases, such as Oracle. Snowflake supports defining and maintaining constraints but does not enforce them, except for NOT NULL constraints, which are always enforced.
Indexes | n/a | Snowflake's unique architecture eliminates the need to manage indexes.
Views | Views | A Snowflake view can be created for any valid SELECT statement.
Materialized Views | Materialized Views | Materialized views are supported in Enterprise Edition and above.
Transactions | Transactions | A Snowflake transaction is a set of SQL statements, both reads and writes, that are processed as a unit and guarantee ACID properties.
PL/SQL and Java | JavaScript | Stored procedures and user-defined functions (UDFs) within Snowflake use JavaScript as their procedural language.
PL/SQL Anonymous Blocks | n/a | Anonymous block JavaScript is not currently supported within Snowflake.
Stored Procedures | Stored Procedures | Snowflake stored procedures use JavaScript as the procedural language.
User-Defined Functions | User-Defined Functions | A UDF can contain either a SQL expression or JavaScript code, and can return either scalar or tabular results (i.e., table functions).
Sequences | Sequences | Sequences can be used for generating sequential, unique numbers.
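As a minimal sketch of the JavaScript procedural language mentioned above, a Snowflake stored procedure might look like this (the procedure and table names are hypothetical):

```sql
-- Returns the row count of the named table; illustrative only.
CREATE OR REPLACE PROCEDURE row_count(table_name VARCHAR)
RETURNS FLOAT
LANGUAGE JAVASCRIPT
AS
$$
  // Argument names are uppercased inside the JavaScript body.
  var stmt = snowflake.createStatement({
    sqlText: "SELECT COUNT(*) FROM " + TABLE_NAME
  });
  var rs = stmt.execute();
  rs.next();
  return rs.getColumnValue(1);
$$;

-- Example invocation:
-- CALL row_count('PROD.SALES.ORDERS');
```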
DML

Snowflake supports standard SQL, including a subset of ANSI SQL:1999
and the SQL:2003 analytic extensions. Snowflake also supports common
variations for a number of commands where those variations do not
conflict with each other.

Snowflake SQL reserves all ANSI keywords (with the exception of type
keywords such as CHAR, DATE, DECIMAL, etc.), as well as some
additional keywords (ASC, DESC, MINUS, etc.) that are reserved by
Oracle and other popular databases. Additionally, Snowflake reserves
the keywords REGEXP and RLIKE (which function like the ANSI reserved
keyword LIKE) and SOME (which is a synonym for the ANSI reserved
keyword ANY).

Oracle | Snowflake | Notes
ANSI SQL | ANSI SQL | ANSI-compliant SQL:1999 and SQL:2003 will transfer to Snowflake with little to no modification if schema, table, and column names remain the same.
SQL Functions | SQL Functions | Snowflake supports a wide range of scalar, aggregate, and window functions.
SQL Format Models | SQL Format Models | Snowflake supports a wide range of standard SQL format models for converting numeric and date values to text and vice versa.
ABOUT SNOWFLAKE
The Snowflake Cloud Data Platform shatters the barriers that prevent organizations
from unleashing the true value from their data. Thousands of customers deploy
Snowflake to advance their businesses beyond what was once possible by deriving
all the insights from all their data by all their business users. Snowflake equips
organizations with a single, integrated platform that offers the only data warehouse
built for any cloud; instant, secure, and governed access to their entire network
of data; and a core architecture to enable many other types of data workloads,
including a single platform for developing modern data applications.
Snowflake: Data without limits. Find out more at Snowflake.com.