
Commit 2b0b502

Update from SAP DITA CMS (squashed):
commit 012940352773e8e377906e3eb3e427f3882dcc42
Author: REDACTED
Date: Mon Apr 14 16:49:57 2025 +0000
Update from SAP DITA CMS 2025-04-14 16:49:57
Project: dita-all/fxd1742329840944
Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
Output: loiob8faae83b519439fb4ea9d0eb1a5f26e
Language: en-US
Builddable map: 4e1c1e1d5d1947f5875e93e7597c4f4c.ditamap

commit e85cf0be816037d21aae8698c372317511167aac
Author: REDACTED
Date: Mon Apr 14 16:49:27 2025 +0000
Update from SAP DITA CMS 2025-04-14 16:49:27
Project: dita-all/fxd1742329840944
Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
Output: loioc25299a38b6448f889a43b42c9e5897d
Language: en-US
Builddable map: 678695d903b546e5947af69e56ed42b8.ditamap

commit 89c08cbae33278ef1327d5fd88e22e4765ccfffc
Author: REDACTED
Date: Mon Apr 14 16:49:24 2025 +0000
Update from SAP DITA CMS 2025-04-14 16:49:23
Project: dita-all/fxd1742329840944
Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap

##################################################
[Remaining squash message was removed before commit...]
1 parent 911b84a commit 2b0b502

97 files changed · +1,570 −520 lines changed

In the file diffs below, lines marked [+] were added by this commit, lines marked [-] were removed, and unmarked lines are unchanged context.


docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md

Lines changed: 10 additions & 1 deletion
@@ -9,6 +9,7 @@ This topic contains the following sections:
- [Introduction to the SAP Datasphere Object Store](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_intro_to_big_data)
- [Create a File Space in the Object Store](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_create_file_space)
- [Load Data with Replication Flows](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_load_big_data)
[+] - [Integrate SAP BW Data Pushed Through the Data Product Generator](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_load_BWData)
- [Prepare Data with Apache Spark Transformation Flows](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_prepare_big_data)
- [Share Data to Standard Spaces](acquiring-and-preparing-data-in-the-object-store-2a6bc3f.md#loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_share_big_data)

@@ -23,7 +24,7 @@ This topic contains the following sections:
>
> The object store cannot be enabled in SAP Datasphere tenants provisioned prior to version 2021.03. To request the migration of your tenant, see SAP note [3268282](https://me.sap.com/notes/3268282).

[-] The object store provides an inbound layer for staging large quantities of data in a cost-effective object store. Users with a modeler role can use replication flows to replicate data to local tables \(file\) and you can optionally further prepare the data with Apache Spark transformation flows. You can then share the tables to standard spaces, where they can be used as sources for flows, views, and analytic models.![](images/BigData_Overview_09644c8.png)
[+] The object store provides an inbound layer for staging large quantities of data in a cost-effective object store. Data can be loaded by replication flows or pushed through the data product generator for SAP Business Data Cloud. You can optionally further prepare the data with Apache Spark transformation flows. You can then share the tables to standard spaces, where they can be used as sources for flows, views, and analytic models.![](images/Big_Data_with_BW_Data_Push_48daa3c.png)
@@ -46,6 +47,14 @@ Users with a modeler role can use replication flows to load data in local tables
[+] <a name="loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_load_BWData"/>

[+] ## Integrate SAP BW Data Pushed Through the Data Product Generator

[+] A user with a data integrator role can access local tables \(file\) received from SAP BW or SAP BW/4HANA. An SAP BW administrator has pushed BW data into the object store, in a dedicated BW file space, and the data is received as a local table \(file\) directly in the Data Builder \(see [Working With Local Tables (File) Received From the Data Product Generator for SAP Business Data Cloud](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/72a055fc7dad40079efa442ddd4b998e.html "An administrator in SAP BW or SAP BW/4HANA has pushed data into SAP Datasphere as a local table (file), and you now want to use it for your business case.") :arrow_upper_right:\). Updates to these tables are pushed to the inbound buffer. To process data updates from this inbound buffer to the local table \(file\) generated by SAP BW, and therefore make the data visible, a merge task has to run via a task chain \(see [Creating a Task Chain](creating-a-task-chain-d1afbc2.md)\) or via the *Local Tables \(File\)* monitor. You can monitor the inbound buffer using the *Local Tables \(File\)* monitor \(see [Monitoring Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/6b2d0073a8684ee6a59d6f47d00ec895.html "Monitor your local tables (file). Check how and when they were last updated and if new data has still to be merged.") :arrow_upper_right:\).

<a name="loio2a6bc3f6d79b4c39a01b6d58d043fbaf__section_prepare_big_data"/>

## Prepare Data with Apache Spark Transformation Flows
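
The added section above describes how data pushed by SAP BW lands in an inbound buffer and only becomes visible in the local table \(file\) after a merge task runs. The following is a minimal conceptual sketch of that behavior in Python; the buffer, table, and record structures are invented for illustration and are not the actual SAP Datasphere implementation.

```python
# Conceptual sketch only: data written to the inbound buffer is not visible in
# the local table (file) until a merge task runs. All names are invented.

inbound_buffer = []       # stands in for the folder where SAP BW / a replication flow drops files
local_table_file = {}     # stands in for the merged, queryable local table (file), keyed by id

def push_to_buffer(records):
    """Data lands in the buffer first; nothing is visible yet."""
    inbound_buffer.append(records)

def run_merge_task():
    """Triggered e.g. by a task chain or from the Local Tables (File) monitor."""
    while inbound_buffer:
        for record in inbound_buffer.pop(0):
            if record.get("deleted"):
                local_table_file.pop(record["id"], None)
            else:
                local_table_file[record["id"]] = record

push_to_buffer([{"id": 1, "quantity": 5}])
print(len(local_table_file))   # 0 -> pushed data is not yet visible
run_merge_task()
print(len(local_table_file))   # 1 -> visible only after the merge task has run
```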

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/acquiring-data-in-the-data-builder-1f15a29.md

Lines changed: 3 additions & 3 deletions
@@ -36,10 +36,10 @@ SAP Datasphere is using two types of adaptors to connect to remote tables:
- SAP HANA smart data access adaptors \(used in connections with no *Data Provisioning* option or *Data Provisioning* option = *Cloud Connector* or *Direct*\).

> ### Note:
[-] > If your source data comes from an SAP HANA On-Premise system, select the adaptor following your use case:
[+] > If your source data comes from an SAP HANA On-Premise system, select the adapter following your use case:
>
[-] > - You want to access the data remotely: SAP HANA smart data access \(Data Provisioning Option: Direct\) would be the recommended adaptor to read the data. It allows higher degree of query pushdown to the remote database, leading to better response times and less resource consumption.
[-] > - You want to replicate the data into SAP Datasphere: The preferred option for this is to use Replication Flows, see [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md). In case you require replication for remote tables, SAP HANA smart data integration \(Data Provisioning Option: Data Provisioning Agent\) is the recommended adaptor to push the data. It offers more options when loading the data, such as applying filter conditions or data partitioning.
[+] > - You want to access the data remotely: SAP HANA smart data access \(Data Provisioning Option: Direct\) would be the recommended adapter to read the data. It allows a higher degree of query pushdown to the remote database, leading to better response times and less resource consumption.
[+] > - You want to replicate the data into SAP Datasphere: The preferred option for this is to use Replication Flows, see [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md). In case you require replication for remote tables, SAP HANA smart data integration \(Data Provisioning Option: Data Provisioning Agent\) is the recommended adapter to push the data. It offers more options when loading the data, such as applying filter conditions or data partitioning.
>
> For more information on these adaptors, see [Connecting SAP HANA Cloud, SAP HANA Database to Remote Data Sources](https://help.sap.com/docs/HANA_CLOUD/db19c7071e5f4101837e23f06e576495/afa3769a2ecb407695908cfb4e3a9463.html).
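
The note above recommends smart data access for remote access \(more query pushdown\) and replication for copying data into SAP Datasphere. As a rough illustration of why pushdown reduces data movement, here is a hedged Python sketch; the data, filter, and function names are made up and do not correspond to any SAP API.

```python
# Rough illustration of the trade-off described above: pushing a filter down to
# the source moves only matching rows, while replication copies everything first.

source_rows = [{"id": i, "region": "EMEA" if i % 2 else "APAC"} for i in range(100_000)]

def remote_access_with_pushdown(region):
    # Smart-data-access style: the predicate is evaluated at the source,
    # so only matching rows travel to the consuming system.
    return [row for row in source_rows if row["region"] == region]

def replicate_then_filter(region):
    # Replication style: all rows are copied into local storage first,
    # then the filter is applied on the local copy.
    local_copy = list(source_rows)               # full data movement
    return [row for row in local_copy if row["region"] == region]

# Same result, very different amount of data moved before filtering.
assert remote_access_with_pushdown("EMEA") == replicate_then_filter("EMEA")
```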

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/add-a-source-to-a-data-flow-7b50e8e.md

Lines changed: 6 additions & 1 deletion
@@ -16,7 +16,12 @@ Add a source to read data from. You can add multiple sources and combine them to
2. Browse or search for the object you want to add on either of the tabs.

[-] - The *Repository* tab lists all the tables, views, and intelligent lookups that are available in the space \(including objects shared to the space\).. For more information, see [Add Objects from the Repository](../add-objects-from-the-repository-13fcecd.md).
[+] - The *Repository* tab lists all the tables, views, and intelligent lookups that are available in the space \(including objects shared to the space\).
[+] > ### Note:
[+] > If you use a local table with delta capture on as a source for your view, the columns *Change Date* and *Change Type* won't be displayed.
[+] For more information, see [Add Objects from the Repository](../add-objects-from-the-repository-13fcecd.md).

- The *Sources* tab lists all the connections and other data sources that have been integrated to the space and from which you can import tables. However, it shows only a limited number of records. If you can't see the sources you are looking for, use *Import from Connection* to perform a search. You can:

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-local-table-file-d21881b.md

Lines changed: 6 additions & 2 deletions
@@ -22,14 +22,18 @@ SAP Datasphere supports two types of local table to persist data:
- Local tables – Are stored on disk or in-memory \(see [Creating a Local Table](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2509fe4d86aa472b9858164b55b38077.html?locale=en-US&state=DRAFT&version=DEV)\). These tables can use delta capture and can only be created in a standard space \(with **SAP HANA Database \(Disk and In-Memory\)** storage, see [Create a Space](https://help.sap.com/docs/SAP_DATASPHERE/9f804b8efa8043539289f42f372c4862/bbd41b82ad4d4d9ba91341545f0b37e7.html?locale=en-US&state=DRAFT&version=DEV)\).
- Local tables \(file\) – Are stored on files and are intended for file storage with large amounts of data at lower cost. These tables always use delta capture and can only be created in a file space \(with **SAP HANA Data Lake Files** storage, see [Create a File Space to Load Big Data](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/947444683e524cfd9169d7671b72ba0c.html?locale=en-US&state=DRAFT&version=DEV)\).

[+] > ### Note:
[+] > Local tables \(file\) can also be created by the Data Product Generator from SAP BW or SAP BW/4HANA, and the behavior of such tables can differ. For more information, see [Working With Local Tables (File) Received From the Data Product Generator for SAP Business Data Cloud](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/72a055fc7dad40079efa442ddd4b998e.html "An administrator in SAP BW or SAP BW/4HANA has pushed data into SAP Datasphere as a local table (file), and you now want to use it for your business case.") :arrow_upper_right:.

You cannot create views and analytic models in file spaces, but you can share local tables \(file\) to standard spaces where they can be consumed by views, flows, and analytic models \(see [Sharing Entities and Task Chains to Other Spaces](../Creating-Finding-Sharing-Objects/sharing-entities-and-task-chains-to-other-spaces-64b318f.md)\).

SAP HANA Cloud, data lake allows SAP Datasphere to store and manage mass data efficiently in a secured environment. The SAP HANA native SQL on files feature gives you direct access to the data stored in the object store and enables large data-based business scenarios at lower costs.

As a local table \(file\) captures delta changes via flows, it creates different entities in the repository after it is deployed:

- An active records entity for accessing the delta capture entity through a virtual table. It excludes the delta capture columns and deleted records, and keeps only the active records.
[-] - A delta capture entity that stores information on changes found in the delta capture table. It serves as target for flows at design time. In addition, every local table \(File\) has a specific folder in file storage \(inbound buffer\) to which a replication flow writes data files to a specific target object. To process data updates from this inbound buffer to the local table \(File\), and therefore make data visible, a merge task has to run \(see [Creating a Task Chain](creating-a-task-chain-d1afbc2.md), [Monitoring Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/6b2d0073a8684ee6a59d6f47d00ec895.html "Monitor your local tables (file). Check how and when they were last updated and if new data has still to be merged.") :arrow_upper_right: and [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md).\) You can monitor the buffer merge status using the *Local Tables \(File\)* monitor \(See [Monitoring Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/6b2d0073a8684ee6a59d6f47d00ec895.html "Monitor your local tables (file). Check how and when they were last updated and if new data has still to be merged.") :arrow_upper_right:.
[+] - A delta capture entity that stores information on changes found in the delta capture table. It serves as a target for flows at design time. In addition, every local table \(file\) has a specific folder in file storage \(inbound buffer\) in which data updates are stored until they are pushed to the local table \(file\) by a merge task \(see [Creating a Task Chain](creating-a-task-chain-d1afbc2.md), [Monitoring Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/6b2d0073a8684ee6a59d6f47d00ec895.html "Monitor your local tables (file). Check how and when they were last updated and if new data has still to be merged.") :arrow_upper_right: and [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md)\). You can monitor the buffer merge status using the *Local Tables \(File\)* monitor \(see [Monitoring Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/6b2d0073a8684ee6a59d6f47d00ec895.html "Monitor your local tables (file). Check how and when they were last updated and if new data has still to be merged.") :arrow_upper_right:\).
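
The bullets above distinguish the active records entity from the delta capture entity. The following Python sketch shows one way active records could be derived from delta capture records; the column names *Change Type* and *Change Date* come from the documentation, while the keys, values, and merge rule are assumptions made purely for illustration.

```python
# Illustrative sketch: deriving "active records" from a delta capture entity.
# Change_Type / Change_Date follow the documentation; everything else is assumed.

delta_capture = [
    {"id": 1, "value": 10, "Change_Type": "I", "Change_Date": "2025-04-01"},
    {"id": 1, "value": 12, "Change_Type": "U", "Change_Date": "2025-04-03"},
    {"id": 2, "value": 99, "Change_Type": "D", "Change_Date": "2025-04-02"},
]

def active_records(rows):
    latest = {}
    for row in sorted(rows, key=lambda r: r["Change_Date"]):
        latest[row["id"]] = row                          # last change per key wins
    return [
        {k: v for k, v in row.items() if k not in ("Change_Type", "Change_Date")}
        for row in latest.values()
        if row["Change_Type"] != "D"                     # deleted records are excluded
    ]

print(active_records(delta_capture))   # -> [{'id': 1, 'value': 12}]
```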

@@ -84,7 +88,7 @@ As a local table \(file\) is capturing delta changes via flows, it creates diffe
<tr>
<td valign="top">

[-] Business Name
[+] Technical Name

</td>
<td valign="top">

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-task-chain-d1afbc2.md

Lines changed: 1 addition & 1 deletion
@@ -267,7 +267,7 @@ You can monitor the status of task chain runs from the Data Integration Monitor.
- Local Table \(File\) - Merge, Optimize or Delete Records.

> ### Note:
[-] > - Merge: Add, update or delete data into the existing local table \(file\). A replication flow writes data files to the inbound buffer \(specific folder in file storage\) of a target local table \(file\). To process data updates from this inbound buffer to the local table \(file\), and therefore make data visible, a merge task has to run..
[+] > - Merge: Add, update or delete data in the existing local table \(file\). Data updates are pushed by a replication flow or SAP BW to the inbound buffer \(specific folder in file storage\) of a target local table \(file\). To process data updates from this inbound buffer to the local table \(file\), and therefore make data visible, a merge task has to run.
> - Optimize: Improve data access performance by optimizing the layout of data in file storage \(for example, by grouping small files into larger files\).
> - Delete Records: Delete records from your local table \(file\). Under Settings, define what type of deletion you want:
> - *Delete All Records \(Mark as Deleted\)*: Records will not be physically deleted but marked as deleted and filtered out when accessing the active records of the local table. They will still consume storage, and they can still be processed by other apps that consume them.
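
The *Optimize* task described above improves read performance by rewriting many small files into fewer, larger ones. Below is a toy Python sketch of that compaction idea; the file names, sizes, and the 128 MB target are assumptions for illustration, not product settings.

```python
# Toy sketch of the "Optimize" idea: group many small files into fewer, larger
# files to improve read performance. Names, sizes and the target are assumed.

small_files = [(f"part-{i:04d}.parquet", 4) for i in range(64)]   # (name, size in MB)

def plan_compaction(files, target_mb=128):
    groups, current, size = [], [], 0
    for name, mb in files:
        current.append(name)
        size += mb
        if size >= target_mb:
            groups.append(current)
            current, size = [], 0
    if current:
        groups.append(current)
    return groups   # each group would be rewritten as one larger file

print(len(small_files), "small files ->", len(plan_compaction(small_files)), "files after optimize")
```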

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/replicate-remote-table-data-7e258a7.md

Lines changed: 4 additions & 4 deletions
@@ -17,10 +17,10 @@ SAP Datasphere is using two types of adaptors to connect to remote tables:
- SAP HANA smart data access adaptors \(used in connections with no *Data Provisioning* option or *Data Provisioning* option = *Cloud Connector* or *Direct*\).

> ### Note:
[-] > If your source data comes from an SAP HANA On-Premise system, select the adaptor following your use case:
[+] > If your source data comes from an SAP HANA On-Premise system, select the adapter following your use case:
>
[-] > - You want to access the data remotely: SAP HANA smart data access \(Data Provisioning Option: Direct\) would be the recommended adaptor to read the data. It allows higher degree of query pushdown to the remote database, leading to better response times and less resource consumption.
[-] > - You want to replicate the data into SAP Datasphere: The preferred option for this is to use Replication Flows, see [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md). In case you require replication for remote tables, SAP HANA smart data integration \(Data Provisioning Option: Data Provisioning Agent\) is the recommended adaptor to push the data. It offers more options when loading the data, such as applying filter conditions or data partitioning.
[+] > - You want to access the data remotely: SAP HANA smart data access \(Data Provisioning Option: Direct\) would be the recommended adapter to read the data. It allows a higher degree of query pushdown to the remote database, leading to better response times and less resource consumption.
[+] > - You want to replicate the data into SAP Datasphere: The preferred option for this is to use Replication Flows, see [Creating a Replication Flow](creating-a-replication-flow-25e2bd7.md). In case you require replication for remote tables, SAP HANA smart data integration \(Data Provisioning Option: Data Provisioning Agent\) is the recommended adapter to push the data. It offers more options when loading the data, such as applying filter conditions or data partitioning.
>
> For more information on these adaptors, see [Connecting SAP HANA Cloud, SAP HANA Database to Remote Data Sources](https://help.sap.com/docs/HANA_CLOUD/db19c7071e5f4101837e23f06e576495/afa3769a2ecb407695908cfb4e3a9463.html).
@@ -86,7 +86,7 @@ SAP Datasphere is using two types of adaptors to connect to remote tables:
Change how the schedule is specified, or change the owner of the schedule.

[-] For more information, see [Take Over the Ownership of a Schedule](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/4b660c0395454bd0923f732eef4ee4b2.html "Per default, the user who creates a task schedule owns the schedule which means that the job scheduling component runs the task on the owner's behalf according to the defined schedule. You can assign the ownership of the schedule to yourself.") :arrow_upper_right:.
[+] For more information, see [Modify the Owner of a Schedule](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/4b660c0395454bd0923f732eef4ee4b2.html "Per default, the user who creates a task schedule owns the schedule which means that the job scheduling component runs the task on the owner's behalf according to the defined schedule. You can assign the ownership of the schedule to yourself.") :arrow_upper_right:.

- *Delete Schedule*

docs/Acquiring-Preparing-Modeling-Data/Modeling-Data-in-the-Data-Builder/add-a-fact-to-an-analytic-model-27075ee.md

Lines changed: 13 additions & 2 deletions
@@ -2,19 +2,30 @@
# Add a Fact to an Analytic Model

[-] As a source for your analytic model, you need an object of type fact.
[+] As a source for your analytic model, you need an object of type fact or another analytic model.

## Context

The data type and semantic type information is inherited from the fact, as well as the standard aggregation behaviour for a measure. For fields with semantic type *Amount with Currency* and *Quantity with Unit* you only need to add the measure in the analytic model; the quantity or unit is added automatically.

[+] When an analytic model is used as a source, these properties are copied to the new analytic model:
[+] - All measures that are not auxiliary
[+] - All fact source attributes and attributes from all selected dimensions
[+] - Associations
[+] - Variables
[+] - Global filter
[+] - Data access controls
[+] - All attributes needed in the association to the dimension and all additional necessary attributes, e.g. for hierarchies on dimensions.

## Procedure

[-] 1. Browse or search for the object you want to add. The repository shows only the objects which can be used in an analytic model: facts.
[+] 1. Browse or search for the object you want to add. The repository shows only the objects which can be used in an analytic model: facts and analytic models.

2. Drag your source from the Repository and drop it onto the canvas.

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
[+] <!-- loio134b051ef2ed48788a0830cfcf079891 -->

[+] # Changing the Underlying Analytic Model

[+] Note the effect of changes to an analytic model which is used as a source for other analytic models.

[+] When changes are made to an analytic model which is used as a source for other analytic models, the status of the dependent analytic model\(s\) is changed to *Changes to Deploy* or *Design Time Error*, depending on the changes.

[+] When you save or deploy an analytic model that is reused by other analytic models after having changed the dimension source, measures, variables or attributes, you will get a warning that this can affect existing analytic models, and the status of dependent analytic models will be updated accordingly.
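
The new topic above describes how changing a source analytic model flips the status of the models built on top of it. Here is a toy Python sketch of that propagation; the model names and the rule for what counts as a breaking change are invented, only the status names come from the documentation.

```python
# Toy sketch of the dependency behavior described above. Status names follow the
# documentation; model names and the "breaking" rule are invented.

dependents = {"AM_Sales": ["AM_Sales_EMEA", "AM_Sales_APAC"]}
status = {"AM_Sales_EMEA": "Deployed", "AM_Sales_APAC": "Deployed"}

def change_source_model(name, breaking=False):
    new_status = "Design Time Error" if breaking else "Changes to Deploy"
    for dependent in dependents.get(name, []):
        status[dependent] = new_status

change_source_model("AM_Sales")        # e.g. a measure or variable was changed
print(status)                          # both dependent models now need attention
```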

0 commit comments
