Getting Started with SAP Datasphere
SAP Datasphere enables a business data fabric architecture that uniquely harmonizes mission-critical data
across the organization, unleashing business experts to make the most impactful decisions. It combines
previously discrete capabilities into a unified service for data integration, cataloging, semantic modeling, data
warehousing, and virtualizing workloads across SAP and non-SAP data.
You acquire data from sources, prepare and model it in SAP Datasphere, and expose it for consumption in SAP Analytics Cloud, Microsoft Excel, and other clients, tools, and apps.
Consume Data
All users of SAP Datasphere with any of the standard roles can consume data exposed by spaces they are
assigned to. If a user does not need to access SAP Datasphere itself, and only wants to consume data exposed
by it, they should be granted the DW Consumer role.
Integrate Data
Users with the DW Space Administrator or DW Integrator role can create connections to source systems
and databases and can schedule and monitor data replication and other data integration tasks. Space
administrators use other methods to integrate data into their space and are responsible for maintaining the list
of space users and monitoring and managing the space. They can create data access controls to secure data,
and can transport content between tenants.
Users with the Catalog User role can browse the catalog to discover and use published data and analytic assets.
See Curating and Publishing Data Assets in the Catalog [page 33].
SAP Datasphere allows you to converge data coming from SAP and third-party on-premise and cloud
environments into a single, fully-managed cloud environment, enabling your organization to radically simplify
its data warehousing landscape. It provides:
• A secure environment supporting diverse data application needs including real-time analytics, governed
data access, a data catalog, and data science (machine learning).
• Spaces, which are created and provisioned centrally to provide secure modeling environments for different
departments or use cases.
• A wide range of connections to SAP and non-SAP cloud and on-premise sources, including data lakes.
• Graphical low-code/no-code tools to support self-service modeling needs for business users.
• Powerful built-in SQL and data flow editors for sophisticated modeling and data transformation needs,
along with support for third-party tools and other SAP IDEs.
• An embedded data marketplace to consume external data products and to create internal data products.
• A business user-friendly data matching environment to enrich existing datasets with external data coming
from Data Marketplace, CSV uploads, and other third-party sources.
• A catalog to support self-service discovery of data and analytic assets, glossaries and terms, and key
performance indicators.
• Multi-dimensional modeling with powerful analytical capabilities and built-in data preview.
• A graphical impact and lineage analysis to visualize data movements, transformations, and other
dependencies.
• Cross-space collaboration and sharing of centrally governed sources for joining with local files and external
sources with support for row-level security.
• Re-use and migration of trusted and governed metadata and data models residing in on-premise SAP Business
Warehouse and SAP SQL Data Warehouse implementations.
• Provision of SAP and partner business content to support end-to-end business scenarios for various
industries and lines of business.
• Seamless integration with SAP Analytics Cloud, Microsoft Excel, and public OData APIs to support
consumption by other clients, tools, and apps.
When you are added as a user to SAP Datasphere, you receive a welcome email.
Click the Activate Account button to connect to the server and set your password.
The SAP Datasphere homepage gives you access to recent objects, quick actions, and blog posts (see The SAP
Datasphere Homepage [page 14]).
To view and edit your user profile settings, click your user icon in the shell bar and select Settings. You can
control various aspects of the user experience of SAP Datasphere and set data privacy and task scheduling
consent options.
User Account
Note
If you would like to set a profile picture for your user, there is no UI functionality to support this in
SAP Datasphere. However, you can set a profile picture by sending a POST request to the path /sap/fpa/
services/rest/epm/security/photo and uploading a file of type jpg:
POST /sap/fpa/services/rest/epm/security/photo?filename=example_logo.jpg&uuid=97DE06904D3AF7D31700CE0A318925D7&tenant=1 HTTP/1.1
If an SAP Analytics Cloud tenant is connected, you can switch to that tenant and upload the
profile picture there with UI support. For more information, see Edit Your Profile in the SAP Analytics Cloud
documentation.
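For illustration only, here is a minimal sketch of the profile-picture upload described above, written in Python with the requests library. The host name, authentication header, and upload format are assumptions; adjust them to your tenant and to however your HTTP client is authenticated.

import requests

# Assumptions: a hypothetical tenant host and a pre-obtained bearer token or
# session cookie; replace with the authentication your tenant actually uses.
HOST = "https://mytenant.eu10.hcs.cloud.sap"
HEADERS = {"Authorization": "Bearer <access-token>"}

# Query parameters as shown in the request above: file name, your user's UUID,
# and the tenant number.
params = {
    "filename": "example_logo.jpg",
    "uuid": "97DE06904D3AF7D31700CE0A318925D7",
    "tenant": "1",
}

# Send the JPG as the request body (whether the endpoint expects a raw body or
# a multipart upload is an assumption to verify against your tenant).
with open("example_logo.jpg", "rb") as f:
    response = requests.post(
        f"{HOST}/sap/fpa/services/rest/epm/security/photo",
        params=params,
        headers=HEADERS,
        data=f.read(),
    )
response.raise_for_status()
print("Upload status:", response.status_code)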
Home Screen
Control the cards that you want to display on your homepage (see The SAP Datasphere Homepage [page
14]). By default, all cards are displayed.
Controls the language, date, and number formats used in SAP Datasphere:
Setting Description
Data Access Language Select the default language in which to display text data in SAP Analytics Cloud.
To choose the data access language, click (Product Switch) Analytics, open
your Profile Settings, and edit your user preferences.
For more information, see Edit Your Profile in the SAP Analytics Cloud documentation.
Time Formatting Select how to format time values, for example:
• 8:48:53 AM or 8:48:53
• 11:03:31 PM or 23:03:31
Scale Formatting Select how to format the number scale. You can choose the system default, short (k, m, bn)
or long (Thousand, Million, Billion).
UI Settings
Controls whether the business or technical names of objects are shown by default in SAP Datasphere screens.
Note
By default, you see your object’s business name. To switch to its technical name, choose Show Technical
Name. You will then see the technical name in the Data Builder UIs for graphical views, SQL views, and
entity-relationship models.
Privacy
By default, SAP Datasphere keeps track of objects that you have viewed and provides access to those recent
objects on the SAP Datasphere homepage, in the Repository Explorer, and elsewhere. For example, in the
Repository Explorer, you can select the Recent option from the left-side navigation pane to display the last ten
objects you accessed, created, or edited. In addition, if you click in the search entry field in the Repository
Explorer, the last ten successful search queries are shown in the autosuggest selection box.
When you first log into SAP Datasphere, a popup dialog box asks whether you want to disable tracking of
objects you access, create, or edit. Clicking the Manage Settings button opens these Privacy settings, where you
can disable future tracking and optionally clear previously tracked data.
• To disable tracking of objects you access, clear the Remember My Searches and Opened Objects checkbox
and click Save.
• To clear any existing data for previous searches and recent objects you’ve accessed, click the Clear My
Data button.
Changes take effect immediately after you disable tracking of accessed objects or confirm clearing your
tracked data. If you want to re-enable object and search tracking, just reopen the Privacy settings
dialog, select the Remember My Searches and Opened Objects checkbox again, and click Save.
You can give or revoke your consent to let SAP Datasphere run scheduled tasks you own. Consent is also
required to run task chains, whether they are scheduled or you choose to run a task chain directly, without
a schedule. Scheduled tasks or task chains run asynchronously in the background according to the settings
defined in their schedules. Note that if you have not given the required consent, task chains or tasks you have
scheduled to run won't be executed.
Note
Your consent is valid for 12 months. If your consent will expire within the next four weeks, SAP Datasphere
displays a warning message when you attempt to schedule new tasks, indicating that your consent is
approaching its expiration date. After the consent has expired, a log message informs you that tasks you have
scheduled to run will no longer be executed. Renew your consent to resume task execution according to
the original schedules.
The following sections show you where to find what's new information and help in SAP Datasphere.
Help
To open the in-app help, click the question mark in the upper-right corner.
A short description gives you a general idea of what can be done on this screen. When you click this short
description, you get a longer text with conceptual information, a step-by-step procedure, or even a video tutorial.
Note
The in-app help is context sensitive. The help topics change depending on where you are in SAP
Datasphere.
When the in-app help panel is open, click the megaphone icon to view the what's new topics.
A short description gives you general information about the new or changed features, sorted by SAP Datasphere
version. When you click this short description, you get a longer and more detailed text about these new or
changed features.
Use the left navigation area to access all the apps available in SAP Datasphere.
Note
Each app requires specific privileges, and some may not be visible to you (see Roles and Privileges by App
and Feature).
The apps contained in SAP Datasphere are available in the side navigation area.
Item Description
(Home) View recent objects, data integration tasks, and SAP Datasphere blog posts in your
customizable homepage (see The SAP Datasphere Homepage [page 14]).
(Catalog) Discover, enrich, classify, and publish high-quality, trusted data and analytic assets
from across your enterprise (see Governing and Publishing Catalog Assets).
(Data Marketplace) Purchase data products from providers and download them directly into your space
(see Purchasing Data from Data Marketplace).
(Semantic Onboarding) Import semantically-rich objects from your SAP systems, the Content Network, and the
Public Data Marketplace and other marketplace contexts (see Semantic Onboarding).
(Business Builder) Create business entities, fact models, and consumption models to present your data to
analytics clients (see Modeling Data in the Business Builder [page 30]).
(Data Builder) Create or import tables and views, and create flows, task chains, and entity-relationship
diagrams (see Acquiring Data in the Data Builder [page 21] and Modeling Data in the
Data Builder).
(Data Access Controls) Create criteria-based privileges to filter the data accessible in views and business layer
objects (see Securing Data with Data Access Controls).
(Data Integration Monitor) Monitor remote tables, persisted views, flows, and task chains (see Managing and
Monitoring Data Integration).
(Connections) Create connections to source systems to allow accessing and importing data into SAP
Datasphere (see Integrating Data via Connections).
Note
To open an app in a new browser tab, right-click it and select Open App in New Tab.
Item Description
(Space Management) Set up, configure, and monitor your spaces, including assigning users to them (see
Preparing Your Space and Integrating Data [page 19]).
(System Monitor) Monitor the performance of your system and identify storage, task, out-of-memory, and
other issues (see Monitoring SAP Datasphere).
(Transport) Transfer content between tenants:
• (Packages) - Create packages and add objects from your space in preparation
for transfer to another tenant (see Creating Packages to Export).
• (Export) - Export objects from your space for transfer to another space or
tenant (see Exporting Content for Sharing with Other Tenants).
• (Import) - Import objects from another space or tenant into your space (see
Importing Content from Another Tenant).
(Data Sharing Cockpit) Become a data provider and make your data products available in Data Marketplace
(see Data Marketplace - Data Provider's Guide).
Tool Description
(Feedback) Open the Feedback survey on a separate browser tab and share your experience when
working with SAP Datasphere.
(Support) Open the Support dialog (see Request Help from SAP Support).
(Help) Open the Help panel (see How to Find Help [page 10]).
(Profile) Open the Settings dialog (see Changing SAP Datasphere Settings [page 7]) or log out.
(Product Switch) Click here and select Analytics to navigate to your organization's SAP Analytics Cloud
tenant.
Note
The Product Switch is available if an administrator has enabled it (see Enable the
Product Switch to Access an SAP Analytics Cloud Tenant) and you are assigned one or
more BI roles.
For detailed information about working with SAP Analytics Cloud, see the SAP Analytics
Cloud documentation.
The SAP Datasphere homepage gives you access to recent objects, quick actions, and blog posts. You can
choose to show, hide, and reorder cards to suit your needs.
Card Description
Welcome Card Click Learn More to watch our getting started video and access useful links.
SAP Datasphere Blog See recent posts from the SAP Datasphere Blog.
For more information about the Business Builder, see Modeling Data in the Business
Builder.
Data Integration Tasks See your most recently accessed data integration task runs.
For more information about the Data Integration Monitor, see Managing and Monitoring
Data Integration.
To reorganize the cards in your homepage, click a card header and drag it to reposition it.
Tool Description
Auto-Refresh Data Enable this switch to refresh the card data in real time. By default, the data is loaded when
you navigate to the home screen and is not refreshed.
Customize Click here to open the Settings dialog and enable or disable the display of each of the
homepage cards.
Users with the DW Administrator role can configure, manage, and monitor the SAP Datasphere tenant to
support the work of acquiring, preparing, and modeling data for analytics. They manage users and roles, create
spaces, and allocate storage to them. They prepare and monitor connectivity for data integration and perform
ongoing monitoring and maintenance of the tenant.
Either SAP will provision your tenant or you can create an instance in SAP BTP (see Creating and Configuring
Your SAP Datasphere Tenant).
• We recommend that you link your tenant to an SAP Analytics Cloud tenant (see Enable the Product Switch
to Access an SAP Analytics Cloud Tenant).
• You can enable SAP SQL data warehousing on your tenant to exchange data between your HDI containers
and your SAP Datasphere spaces without the need for data movement (see Enable SAP SQL Data
Warehousing on Your SAP Datasphere Tenant).
• You can enable the SAP HANA Cloud script server to access the SAP HANA Automated Predictive Library
(APL) and SAP HANA Predictive Analysis Library (PAL) machine learning libraries (see Enable the SAP
HANA Cloud Script Server on Your SAP Datasphere Tenant).
An administrator creates SAP Datasphere users manually, from a *.csv file, or via an identity provider (see
Managing SAP Datasphere Users).
You must assign one or more roles to each of your users via scoped roles and global roles (see Managing Roles
and Privileges). You can create your own custom roles or use the following standard roles delivered with SAP
Datasphere:
Note
Users who are space administrators primarily need scoped permissions to work with spaces,
but they also need some global permissions (such as Lifecycle when transporting content
packages). To provide such users with the full set of permissions they need, they must be
assigned to a scoped role (such as the DW Scoped Space Administrator) to receive the
necessary scoped privileges, but they also need to be assigned directly to the DW Space
Administrator role (or a custom role that is based on the DW Space Administrator role) in order
to receive the additional global privileges.
• DW Integrator (template) - Can integrate data via connections and can manage and monitor data
integration in a space.
• DW Scoped Integrator - This predefined scoped role is based on the DW Integrator role and inherits
its privileges and permissions.
• DW Modeler (template) - Can create and edit objects in the Data Builder and Business Builder and view
data in objects.
• DW Scoped Modeler - This predefined scoped role is based on the DW Modeler role and inherits its
privileges and permissions.
• DW Viewer (template) - Can view objects and view data output by views that are exposed for
consumption in spaces.
• DW Scoped Viewer - This predefined scoped role is based on the DW Viewer role and inherits its
privileges and permissions.
• Roles providing privileges to consume the data exposed by SAP Datasphere spaces:
• DW Consumer (template) - Can consume data exposed by SAP Datasphere spaces, using SAP
Analytics Cloud, and other clients, tools, and apps. Users with this role cannot log into SAP
Datasphere. It is intended for business analysts and other users who use SAP Datasphere data to
drive their visualizations, but who have no need to access the modeling environment.
• DW Scoped Consumer - This predefined scoped role is based on the DW Consumer role and
inherits its privileges and permissions.
• Roles providing privileges to work in the SAP Datasphere catalog:
• Catalog Administrator - Can set up and implement data governance using the catalog. This includes
connecting the catalog to source systems for extracting metadata, building business glossaries,
creating tags for classification, and publishing enriched catalog assets so all catalog users can find
and use them. Must be used in combination with another role such as DW Viewer or DW Modeler for
the user to have access to SAP Datasphere.
• Catalog User - Can search and discover data and analytics content in the catalog for consumption.
These users may be modelers who want to build additional content based on official, governed assets
in the catalog, or viewers who just want to view these assets. Must be used in combination with
another role such as DW Viewer or DW Modeler for the user to have access to SAP Datasphere.
All data acquisition, preparation, and modeling happens inside spaces. A space is a secure area - space data
cannot be accessed outside the space unless it is shared to another space or exposed for consumption.
An administrator must create one or more spaces. They allocate disk and in-memory storage to the space, set
its priority, and can limit how much memory and how many threads its statements can consume. See Creating
Spaces and Allocating Storage.
Prepare Connectivity
Administrators prepare SAP Datasphere for creating connections to source systems in spaces (see Preparing
Connectivity for Connections).
Administrators have access to various monitoring logs and views and can, if necessary, create database
analysis users to help troubleshoot issues (see Monitoring SAP Datasphere).
Users with the DW Space Administrator or DW Integrator role can create connections to source systems
and databases and can schedule and monitor data replication and other data integration tasks. Space
administrators use other methods to integrate data into their space and are responsible for maintaining the list
of space users and monitoring and managing the space. They can create data access controls to secure data,
and can transport content between tenants.
An administrator will assign you the DW Space Administrator role, create your space, and assign you to it. Once
this is done, you can prepare your space as follows:
• Assign SAP Datasphere users to your space (see Control User Access to Your Space).
• Optionally import SAP and partner business content to support end-to-end business scenarios for various
industries and lines of business (see Importing SAP and Partner Business Content from the Content
Network).
• Transport objects securely to and from your space (see Transporting Content Between Tenants).
• Use various monitoring and logging tools to manage your space (see Managing Your Space).
Space administrators and integrators can create connections to source systems to allow space users to
acquire data from those systems (see Integrating Data via Connections).
• Create database users to allow external tools to connect to the space and write data to Open SQL schemas
associated with the space (see Integrating Data via Database Users/Open SQL Schemas).
We recommend that you create data access controls, which can be applied to views to provide row-level
filtering of your space data (see Securing Data with Data Access Controls).
You can enable, run, schedule, and monitor data replication tasks in the (Data Integration Monitor) (see
Managing and Monitoring Data Integration).
Users with the DW Modeler role can import data directly into the Data Builder from connections and other
sources, and use flows to replicate, extract, transform and load data.
Space administrators and integrators prepare connections and other sources to allow modelers to acquire data
(see Integrating Data and Managing Spaces in SAP Datasphere).
Many connections (including most connections to SAP systems) support importing remote tables to federate
or replicate data (see Integrating Data via Connections).
You can import remote tables to make the data available in your space from the Data Builder start page, in an
entity-relationship model, or directly as a source in a view.
• To get started: In the side navigation area, click (Data Builder), select a space if necessary, and click
Import > Import Remote Tables. See Import Remote Tables.
• By default, remote tables federate data, and each time the data is used a call is made to the remote system
to load it. You can improve performance by enabling replication to store the data in SAP Datasphere.
Some connections support real-time replication and for others, you can keep your data fresh by scheduling
regular updates (see Replicate Remote Table Data).
• To optimize replication performance and reduce your data footprint, you can remove unnecessary
columns and set filters (see Restrict Remote Table Data Loads).
• To maximize access performance, you can store the replicated data in-memory (see Accelerate Table Data
Access with In-Memory Storage).
• Once a remote table is imported, it is available for use by all users of the space and can be used as a source
for views.
• You can automate sequences of data replication and loading tasks with task chains (see Creating a Task
Chain).
Many connections (including most connections to SAP systems) support loading data to SAP Datasphere via
data flows (see Integrating Data via Connections).
Data flows support a wide range of extract, transform, and load (ETL) operations.
• To get started: In the side navigation area, click (Data Builder), select a space if necessary, and click
New Data Flow to open the editor. See Creating a Data Flow.
• To add a source to your data flow, drag it from the Source Browser (see Using the Source Browser).
• In addition to connections, data flows can load and transform data from the following kinds of sources:
• Open SQL schemas (see Integrating Data via Database Users/Open SQL Schemas)
• HDI containers (see Exchanging Data with SAP SQL Data Warehousing HDI Containers).
• Objects that are already in the SAP Datasphere repository (see Add Objects from the Repository).
• Data flows load data into local tables.
• You can automate sequences of data replication and loading tasks with task chains (see Creating a Task
Chain).
Certain connections support loading data from multiple source objects to SAP Datasphere via a replication
flow. You can enable a single initial load or request initial and delta loads and perform simple projection
operations (see Creating a Replication Flow).
Create a transformation flow to load data from one or more source repository tables, apply transformations,
and output the result to a target table. You can load a full set of data or only delta changes from each source
table (see Creating a Transformation Flow).
The Import Entities wizard allows you to import entities from SAP S/4HANA Cloud and SAP S/4HANA on-
premise systems with rich metadata (see Importing Entities with Semantics from SAP S/4HANA).
SAP BW bridge enables you to use SAP BW functionality in the public cloud and to import bridge entities into
SAP Datasphere (see Importing Entities with Semantics from SAP BW∕4HANA or SAP BW Bridge).
You can import data from a CSV file to create a new local table (see Creating a Local Table from a CSV File).
Purchase data products from providers and download them directly into your space (see Purchasing Data from
Data Marketplace).
You can become a data provider and offer your own data products for sale in Data Marketplace via the Data
Sharing Cockpit (see Data Marketplace - Data Provider's Guide).
You can create and import empty tables and views to receive and prepare data:
• You can create an empty local table ready to receive data from a CSV file or from a data flow (see Creating
a Local Table).
• You can import business content prepared by SAP and partners to support end-to-end business scenarios
(see Importing SAP and Partner Business Content from the Content Network).
• You can import object definitions from a CSN/JSON file (see Importing Objects from a CSN/JSON File).
Users with the DW Modeler role can use views and intelligent lookups in the Data Builder to combine, clean, and
otherwise prepare data.
For information about identifying the semantic usage of your entities and modeling them for consumption, see
Modeling Data in the Data Builder.
You can combine, filter, enrich and otherwise prepare data in views.
• You can write SQL or SQLScript (table function) code in a powerful SQL editor (see Creating an SQL View).
• To get started: In the side navigation area, click (Data Builder), select a space if necessary, and
click New SQL View to open the editor.
• SAP Datasphere supports:
• A subset of the SQL syntax supported by SAP HANA Cloud (see SQL Reference).
• The SQLScript syntax for table user-defined functions in SAP HANA Cloud (see SQLScript
Reference).
• You can prepare your data in a graphical no code/low code environment (see Creating a Graphical View).
• To get started: In the side navigation area, click (Data Builder), select a space if necessary, and
click New Graphical View to open the editor.
• You can add and combine your sources by drag and drop (see Add a Source, Create a Join, and Create
a Union).
• You can refine, filter, and enrich your data in the diagram (see Reorder, Rename, and Exclude Columns,
Create a Column, Filter Data, and Aggregate Data).
• By default, views are virtual and must be run each time they are accessed. You can improve performance
by persisting the view (see Persist View Data).
You can join two entities even where there is no appropriate foreign key column or where its data is incomplete
or unreliable, with an intelligent lookup. You can iteratively join two entities by defining rules to match records
and then reviewing and processing the results (see Creating an Intelligent Lookup).
You can browse the catalog to discover high-quality trusted data assets to use as sources in your views and
other objects (see Finding and Accessing Data in the Catalog).
SAP Datasphere provides various ways to visualize and understand the dependencies between your entities
and other objects:
• You can visualize the objects that your object depends on (its lineage) and those that depend on it (its
impacts) by opening its impact and lineage analysis (see Impact and Lineage Analysis).
• You can visualize a set of entities and the associations between them by adding them to an entity-
relationship model (see Creating an Entity-Relationship Model).
• You can trace the source of a column in your graphical view and the transformations it has passed through
(see Visualize the Lineage of Columns and Input Parameters in a Graphical View).
Users with the DW Modeler role can add semantic information to their entities and expose them directly to
clients, tools, and apps, or combine, refine, and enrich them in tightly-focused analytic models for consumption
in SAP Analytics Cloud.
Use the Semantic Usage property to indicate the type of data contained in your entity:
• Select a Semantic Usage of Fact to indicate that your entity contains numerical measures that can be
analyzed.
Facts are entities that contain numerical measures that can be analyzed and are the principal type of object
that is consumed by BI clients (see Creating a Fact).
• To get started: Select a Semantic Usage of Fact to indicate that your entity contains numerical measures
that can be analyzed.
• You must identify at least one measure (see Specify Measures).
• You can create associations to dimensions and text entities (see Create an Association).
• To expose your data for consumption in SAP Analytics Cloud, add it to an analytic model (see Creating an
Analytic Model).
Dimensions are entities that contain master data that categorize and group the numerical data contained in
your measures (see Creating a Dimension).
• To get started: Select a Semantic Usage of Dimension to indicate that your entity contains attributes that
can be used to analyze and categorize measures defined in other entities.
• You must set at least one key column (see Set Key Columns to Uniquely Identify Records).
• You can create associations to other dimensions, text entities, and hierarchies (see Create an Association).
• You can add parent-child or level-based hierarchies to support drill-down (see Add a Hierarchy to a
Dimension).
• You can make your dimension time-dependent, so that its members can change over time (see Enable
Time-Dependency for a Dimension or Text Entity).
Text entities are entities that contain data to store strings in multiple languages for translating attributes in
other entities (see Create a Text Entity for Attribute Translation).
External hierarchies are entities that contain data to define parent-child relationships for a dimension (see
Creating an External Hierarchy).
• To get started: Select a Semantic Usage of Hierarchy to indicate that your entity contains parent-child
relationships for members in a dimension.
• You must specify the parent and child attributes and set the child attribute as a key.
Note
Parent-child and level-based hierarchies can also be defined directly in a dimension. See Add a Hierarchy to
a Dimension.
A hierarchy with directory is an entity that contains one or more parent-child hierarchies and has an
association to a directory dimension containing a list of the hierarchies. These types of hierarchy entities
can include nodes from multiple dimensions (for example, country, cost center group, and cost center) and
are commonly imported from SAP S/4HANA Cloud and SAP BW systems (see Creating a Hierarchy with
Directory).
There are two methods for exposing view data for consumption outside SAP Datasphere:
• SAP Analytics Cloud (and Microsoft Excel via an SAP add-in) do not consume view data directly. Set the
Semantic Usage of your view to Fact and then add it to an analytic model to expose it (see Creating an
Analytic Model). There is no need to enable the Expose for Consumption switch.
• Other third-party BI clients, tools, and apps can consume data from views with any Semantic Usage via
OData or ODBC if the Expose for Consumption switch is enabled.
Once your fact is ready for use, create an analytic model from it to consume its data in SAP Analytics Cloud
(see Creating an Analytic Model).
• To get started: In the side navigation area, click (Data Builder), select a space if necessary, and click
New Analytic Model to open the editor.
• You must add a fact as a source and can choose to copy all its measures, attributes and associated
dimensions to the analytic model (see Add a Source).
• You can deselect measures and attributes to leave only those that are relevant to answer your particular
analytic question.
• You can create additional calculated and restricted measures (see Add Measures).
• You can create multiple tightly-focused analytic models from a single fact, each providing only the data
needed for a particular BI context, and enriched with appropriate variables, filters, and additional measures
as necessary.
Users with the DW Modeler role can use the Business Builder editors to combine, refine, and enrich Data Builder
objects and expose lightweight, tightly-focused perspectives for consumption by SAP Analytics Cloud and
other BI clients.
• Consume Data From the Data Builder in Business Entities [page 30]
• Combine Business Entities in Fact Models and Consumption Models [page 30]
• Expose Data in Perspectives [page 31]
• Import SAP BW∕4HANA Queries [page 31]
Each business entity created in the Business Builder consumes data from a Data Builder entity. As you can, at
any time, switch the data source of a business entity to a different Data Builder entity, this loose coupling allows
you to maintain stable business entities for reporting, even as your physical data sources change.
• You can create a business entity by selecting a Data Builder entity as its source (see Creating a Business
Entity).
• You can remove unneeded measures and attributes to simplify your business entity for a particular
reporting need.
• You can enrich your business entity with new measures (including derived and calculated measures),
attributes, and other properties.
• You can change the data source of your business entity to a new Data Builder entity if necessary.
Combine your business entities into star-schemas to prepare them for consumption (see Creating a
Consumption Model).
You can use a single business entity in multiple consumption models and modify it by adding and removing
measures and attributes as appropriate for a particular reporting context.
You can, optionally, combine your business entities into an intermediate fact model and then use this as a
source for multiple consumption models (see Creating a Fact Model).
Create perspectives from a consumption model for exposure to SAP Analytics Cloud and other BI clients, MS
Excel, and other apps and tools (see Define Perspectives).
Import an SAP BW∕4HANA query, along with its supporting InfoObjects and CompositeProviders, to SAP
Datasphere (see Importing SAP BW∕4HANA Models).
You can leverage your existing investment in SAP BW by provisioning an SAP BW bridge tenant and by
importing queries from SAP BW∕4HANA.
SAP BW bridge enables you to use SAP BW functionality in the public cloud and to import bridge entities into
SAP Datasphere (see Importing Entities with Semantics from SAP BW∕4HANA or SAP BW Bridge).
Import an SAP BW∕4HANA query, along with its supporting InfoObjects and CompositeProviders, to SAP
Datasphere (see Importing SAP BW∕4HANA Models).
Users with the Catalog Administrator role publish high-quality trusted data and analytic assets, glossary terms,
and key performance indicators to the Catalog to promote their discovery and reuse.
You can select data and analytic assets for enrichment and publication in the catalog (see Editing and Enriching
Catalog Assets).
You can promote a common, consistent understanding of business terms within your organization by creating
glossary terms and publishing them in the catalog (see Create and Manage a Glossary).
You can define key performance indicators to track performance and provide an analytical basis for decision-
making (see Create and Manage Key Performance Indicators).
You can connect up to three SAP Analytics Cloud tenants to the catalog to allow you to publish stories and
other analytic assets. You can monitor the extraction of assets from the hosting SAP Datasphere tenant as well
as these tenants (see Connecting and Monitoring Source Systems).
All users of SAP Datasphere with any of the standard roles can consume data exposed by spaces they are
assigned to. If a user does not need to access SAP Datasphere itself, and only wants to consume data exposed
by it, they should be granted the DW Consumer role.
Data can be exposed as analytic models, perspectives, and views, which are accessible to clients, tools, and
apps as follows:
Analytic models (see Creating an Analytic Model) are exposed automatically.
For more information, see:
• Consume Data in SAP Analytics Cloud via a Live Connection
• Integrate with SAP Analytics Cloud for Planning
• Consume Data in Microsoft Excel via an SAP Add-In
• Consume Data in Power BI and Other Clients, Tools, and Apps via an OData Service
• Consume Data in Power BI and Other Clients, Tools, and Apps via ODBC
The workflow of consuming views with a semantic usage of Analytical Dataset in SAP Analytics Cloud
and Microsoft Excel via live connection is now deprecated. We recommend that you migrate your analytical
datasets to the new Fact semantic usage and expose your view data via analytic models (see Analytical
Datasets (Deprecated)).
SAP Analytics Cloud primarily consumes view data via OData for planning (see Integrate with
SAP Analytics Cloud for Planning).
Note
Before exposing data for consumption, you should consider applying row-level security via data access
controls (see Securing Data with Data Access Controls).
You can create a live connection from SAP Analytics Cloud to SAP Datasphere and consume data exposed as
analytic models and perspectives to create stories and analytic applications.
• Be an SAP Datasphere user with any of the standard roles. If you do not need to connect to SAP Datasphere
itself, and only consume data, then an administrator can grant you the DW Consumer role (see Standard
Roles Delivered with SAP Datasphere).
If data access controls have been applied, then the data you can consume will be filtered based on your
user ID (see Securing Data with Data Access Controls).
• Be assigned to the SAP Datasphere space exposing the data (see Control User Access to Your Space).
• Have access to an SAP Analytics Cloud tenant and have the role BI Content Creator or another role
providing equivalent privileges.
• Create or have access to an SAP Analytics Cloud live data connection to your SAP Datasphere tenant (see
Live Data Connections to SAP Datasphere in the SAP Analytics Cloud documentation).
Note
The workflow of consuming views with a semantic usage of Analytical Dataset in SAP Analytics Cloud and
Microsoft Excel via live connection is now deprecated. We recommend that you migrate your analytical
datasets to the new Fact semantic usage and expose your view data via analytic models (see Analytical
Datasets (Deprecated)).
For more information, see Consume Data in SAP Analytics Cloud via a Live Connection.
You can use SAP Datasphere as a data source for loading actuals or external data into an SAP Analytics Cloud
planning model. You can also load your SAP Analytics Cloud planning data into SAP Datasphere and combine it
with live actuals or other data as appropriate.
For more information, see Integrate with SAP Analytics Cloud for Planning.
You can create a live connection from SAP Analytics Cloud to SAP Datasphere and consume data exposed as
analytic models and perspectives in Microsoft Excel, via the SAP Analytics Cloud add-in for Microsoft Office.
• Install the add-in (see Deploying the Add-In in the SAP Analytics Cloud, Add-In for Microsoft Office
documentation).
Note
This topic focuses on the SAP Analytics Cloud add-in for Microsoft Office. You can also consume
data exposed as perspectives and views in the SAP Analysis for Microsoft Office add-in (see the SAP
Analysis for Microsoft Office documentation). Data exposed as analytic models cannot be consumed in
the SAP Analysis for Microsoft Office add-in.
• Be an SAP Datasphere user with any of the standard roles. If you do not need to connect to SAP Datasphere
itself, and only consume data, then an administrator can grant you the DW Consumer role (see Standard
Roles Delivered with SAP Datasphere).
If data access controls have been applied, then the data you can consume will be filtered based on your
user ID (see Securing Data with Data Access Controls).
• Be assigned to the SAP Datasphere space exposing the data (see Control User Access to Your Space).
• Have access to an SAP Analytics Cloud tenant and have the role BI Content Creator or another role
providing equivalent privileges.
• Create or have access to an SAP Analytics Cloud live data connection to your SAP Datasphere tenant (see
Live Data Connections to SAP Datasphere in the SAP Analytics Cloud documentation).
Note
The workflow of consuming views with a semantic usage of Analytical Dataset in SAP Analytics Cloud and
Microsoft Excel via live connection is now deprecated. We recommend that you migrate your analytical
datasets to the new Fact semantic usage and expose your view data via analytic models (see Analytical
Datasets (Deprecated)).
For more information, see Consume Data in Microsoft Excel via an SAP Add-In.
You can consume data exposed from SAP Datasphere in clients, tools, and apps via an OData service or via a
database user/Open SQL schema.
See Consume Data in Power BI and Other Clients, Tools, and Apps via an OData Service and Consume Data in
Power BI and Other Clients, Tools, and Apps via ODBC.
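As an illustration of the database-user route, the following is a minimal sketch using SAP's hdbcli Python driver rather than an ODBC driver (the pattern with pyodbc and the SAP HANA ODBC driver is analogous). The host, credentials, and object names are placeholders; use the connection details of the database user created for your space.

from hdbcli import dbapi  # SAP HANA Python client (pip install hdbcli)

# Placeholder connection details for the database user of your space.
conn = dbapi.connect(
    address="<tenant-host>.hana.prod-eu10.hanacloud.ondemand.com",  # assumption: your SAP HANA Cloud host
    port=443,
    user="MYSPACE#MYUSER",   # hypothetical database user created for the space
    password="<password>",
    encrypt=True,            # TLS is required for SAP HANA Cloud
)

cursor = conn.cursor()
# Query a view that has been exposed for consumption in the space schema
# (placeholder space and view names).
cursor.execute('SELECT TOP 10 * FROM "MYSPACE"."MY_EXPOSED_VIEW"')
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()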
You can connect to the OData API and consume data exposed as views or analytic models in SAP Analytics
Cloud and other clients, tools, and apps that are capable of accessing an OData service and authenticating via
an OAuth client.
You must:
• Be an SAP Datasphere user with any of the standard roles. If you do not need to connect to SAP Datasphere
itself, and only consume data, then an administrator can grant you the DW Consumer role (see Standard
Roles Delivered with SAP Datasphere).
If data access controls have been applied, then the data you can consume will be filtered based on your
user ID (see Securing Data with Data Access Controls).
• Be assigned to the SAP Datasphere space exposing the data (see Control User Access to Your Space).
• Obtain the following parameters for an OAuth client defined in your SAP Datasphere tenant:
• Client ID
• Secret
• OAuth2SAML Token URL - To be used in the OAuth 2.0 SAML Bearer Assertion workflow.
• OAuth2SAML Audience - To be used in the OAuth 2.0 SAML Bearer Assertion workflow.
Note
Consuming exposed data in third-party clients, tools, and apps via an OData service requires a three-
legged OAuth 2.0 flow with type authorization_code. Users must manually authenticate against SAP
Datasphere to authorize the client.
For more information, see Consume Data via the OData API.
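For illustration, here is a minimal sketch of calling the OData API with Python and the requests library, assuming you have already obtained an OAuth access token (for example, by completing the authorization_code or SAML Bearer flow with the Client ID and Secret listed above). The host, space, and asset names are placeholders, and the URL shown is illustrative; the exact endpoint patterns are described in Consume Data via the OData API.

import requests

# Placeholders: tenant host, space, and exposed asset (view or analytic model).
HOST = "https://mytenant.eu10.hcs.cloud.sap"
SPACE = "MYSPACE"
ASSET = "MY_EXPOSED_VIEW"

# Assumption: an OAuth access token obtained beforehand via the OAuth client
# parameters listed above (Client ID, Secret, token URL).
headers = {"Authorization": "Bearer <access-token>"}

# Illustrative relational consumption endpoint; verify the exact URL pattern
# in Consume Data via the OData API.
url = f"{HOST}/api/v1/dwc/consumption/relational/{SPACE}/{ASSET}/{ASSET}"

# Request the first ten rows as JSON using standard OData query options.
response = requests.get(url, headers=headers, params={"$top": 10, "$format": "json"})
response.raise_for_status()
for row in response.json().get("value", []):
    print(row)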
You can use our command line interface, datasphere, to connect to SAP Datasphere and manage certain
types of objects.
To use datasphere, you must install it (see Install or Update the SAP Datasphere Command Line Interface).
We recommend that you log in via an OAuth client (see Log into the Command Line Interface via an OAuth
Client).
Command Description
datasphere dbusers Users with the DW Space Administrator role (or equivalent privileges) can reset database user passwords (see Reset Database User Passwords via the Command Line).
datasphere marketplace Users with the DW Modeler role (or equivalent privileges) can manage data providers (see Manage Data Marketplace Data Providers via the Command Line) and data products (see Manage Data Marketplace Data Products via the Command Line).
datasphere scoped-roles Users with the DW Administrator role (or equivalent privileges) can manage scoped roles (see Manage Scoped Roles via the Command Line).
datasphere spaces Users with the DW Administrator role (or equivalent privileges) can create spaces and allocate storage and memory to them, while users with the DW Space Administrator role can manage and staff spaces (see Manage Spaces via the Command Line).
datasphere tasks Users with the DW Integrator role (or equivalent privileges) can orchestrate tasks and task chains (see Manage Tasks and Task Chains via the Command Line).
datasphere users Users with the DW Administrator role (or equivalent privileges) can manage SAP Datasphere users (see Manage Users via the Command Line).