
Commit 2dde8d3

Merge branch 'main' into remi-sap-patch-1
2 parents 5f4408d + bfd7842


95 files changed: 1383 additions, 434 deletions


docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/acquiring-data-1f15a29.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -102,7 +102,7 @@ You can import data from a CSV file to create a new local table \(see [Creating
 
 ## Purchase Data from Data Marketplace
 
-Purchase data products from providers and download them directly into your space \(see [Evaluating and Installing Marketplace Data Products](../evaluating-and-installing-marketplace-data-products-92c35ef.md)\).
+Purchase data products from providers and download them directly into your space \(see [Installing Marketplace Data Products](../installing-marketplace-data-products-92c35ef.md)\).
 
 You can become a data provider and offer your own data products for sale in Data Marketplace via the Data Sharing Cockpit \(see [Data Marketplace - Data Provider's Guide](https://help.sap.com/viewer/bb1899f0b39f415b9de29a845873d7af/DEV_CURRENT/en-US/e479b7b4c95741c7a7a1d42397984c7e.html "Users with a modeler role can create a data provider profile and publish data products to public, private, and internal Data Marketplaces.") :arrow_upper_right:\).
 
```

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/add-the-source-for-a-replication-flow-7496380.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -29,6 +29,7 @@ Define the source for your replication flow \(connection, container, and objects
 - For cloud storage providers, the container is the folder that contains the relevant dataset folders.
 
 
+- For delta loading from a Microsoft SQL Server or Azure SQL database source, the source must have a schema with the same name as the user specified in the connection. If this schema is missing, a database administrator needs to create it and grant the necessary write privileges. This schema is needed to store internal objects for delta replication.
 - You can use the SQL service exposure from SAP BTP, ABAP environment, or SAP S/4HANA Cloud, respectively, to replicate custom and standard CDS view entities if your system administration has created the relevant communication arrangements. For more information, see [Data Consumption using SAP Datasphere](https://help.sap.com/docs/btp/sap-business-technology-platform/data-consumption-using-sap-datasphere). For information about the relevant integration scenario, see [Integrating SQL Services using SAP Datasphere](https://help.sap.com/docs/btp/sap-business-technology-platform/integrating-sql-services-using-sap-datasphere).
 
 - Replication objects are the datasets that you choose for replication, for example individual CDS view entities or database tables.
```
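
The new prerequisite added above is a one-time setup step on the source database. A minimal sketch of what that setup could look like, using Python with pyodbc against Microsoft SQL Server; it is not part of the commit, and the server, database, credentials, and connection user name `dsp_repl` are assumed placeholders:

```python
# Minimal sketch: prepare a Microsoft SQL Server / Azure SQL source for delta
# replication. The schema name must match the user specified in the SAP
# Datasphere connection ("dsp_repl" is an assumed placeholder).
# Run these statements with database administrator privileges.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql.example.com;DATABASE=SourceDB;"  # assumed host and database
    "UID=dba_user;PWD=<password>"
)
cursor = conn.cursor()

# Create the schema named after the connection user and make that user its owner.
cursor.execute("CREATE SCHEMA dsp_repl AUTHORIZATION dsp_repl")

# Grant the write privilege needed to store internal delta-replication objects.
cursor.execute("GRANT CREATE TABLE TO dsp_repl")

conn.commit()
conn.close()
```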

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/cloud-storage-provider-sources-4d481a2.md
renamed to docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/cloud-storage-provider-sources-for-replication-flows-4d481a2.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -2,7 +2,7 @@
 
 <link rel="stylesheet" type="text/css" href="../css/sap-icons.css"/>
 
-# Cloud Storage Provider Sources
+# Cloud Storage Provider Sources for Replication Flows
 
 If you use a cloud storage provider as the source for your replication flow, you need to consider additional specifics and conditions.
 
```

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/configure-a-replication-flow-3f5ba0c.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -66,9 +66,9 @@ Define settings and properties for your replication flow and individual replicat
 
 - Additional settings that are only relevant for a specific source type and can be made for the replication flow itself as well as for individual replication objects. For more information, see
 
-    - [Cloud Storage Provider Sources](cloud-storage-provider-sources-4d481a2.md)
+    - [Cloud Storage Provider Sources for Replication Flows](cloud-storage-provider-sources-for-replication-flows-4d481a2.md)
 
-    - [Confluent Kafka Sources](confluent-kafka-sources-4f2d0a8.md)
+    - [Confluent Kafka Sources for Replication Flows](confluent-kafka-sources-for-replication-flows-4f2d0a8.md)
 
 - [Secure File Transfer Protocol \(SFTP\) as Targets for Your Replication Flows](secure-file-transfer-protocol-sftp-as-targets-for-your-replicati-5a14eb1.md)
 
```

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/confluent-kafka-sources-4f2d0a8.md
renamed to docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/confluent-kafka-sources-for-replication-flows-4f2d0a8.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -2,7 +2,7 @@
 
 <link rel="stylesheet" type="text/css" href="../css/sap-icons.css"/>
 
-# Confluent Kafka Sources
+# Confluent Kafka Sources for Replication Flows
 
 If you use Confluent Kafka as the source for your replication flow, you need to consider the following additional specifics and conditions.
 
```

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-replication-flow-25e2bd7.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -19,7 +19,7 @@ You can use replication flows to copy data from the following source objects:
 - Tables that have a primary key.
 
 
-    CDS views and ODP artifacts that do not have a primary key can be used as the source for a replication flow if certain prerequisites are met. For more information, see [Using an Object Without Primary Key As the Source](using-an-object-without-primary-key-as-the-source-2267a9f.md).
+    CDS views and ODP artifacts that do not have a primary key can be used as the source for a replication flow if certain prerequisites are met. For more information, see [Object Without Primary Key As Source Objects for Replication Flows](object-without-primary-key-as-source-objects-for-replication-flo-2267a9f.md).
 
 For more information about available connection types, sources, and targets, see [Connection Types Supporting Replication Flows](https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/94562426f30c475286f50a1e2b45e743.html?connection_overview-rf=yes%20(source)&connection_overview-rf=yes%20(target)&connection_overview-rf=yes%20(source%20and%20target)&connection_overview-rf=via%20connection%20type%20SAP%20ABAP%20Connections%20(source)).
 
```
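
A side note on the primary-key requirement above: for a relational source such as Microsoft SQL Server, you can check up front which tables would not qualify. A minimal sketch, not part of the commit, with connection details as assumed placeholders:

```python
# Minimal sketch: list tables in a SQL Server source that have no primary key
# and therefore do not meet the replication-flow source requirement above.
# Connection details are assumed placeholders; INFORMATION_SCHEMA is standard SQL.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql.example.com;DATABASE=SourceDB;UID=dsp_repl;PWD=<password>"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT t.TABLE_SCHEMA, t.TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES AS t
    LEFT JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS c
        ON c.TABLE_SCHEMA = t.TABLE_SCHEMA
       AND c.TABLE_NAME = t.TABLE_NAME
       AND c.CONSTRAINT_TYPE = 'PRIMARY KEY'
    WHERE t.TABLE_TYPE = 'BASE TABLE'
      AND c.CONSTRAINT_NAME IS NULL
""")
for schema, table in cursor.fetchall():
    print(f"No primary key: {schema}.{table}")
conn.close()
```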

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-task-chain-d1afbc2.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -29,7 +29,7 @@ Group multiple tasks into a task chain and run them manually once, or periodical
 
 You can create task chains that include SAP Datasphere repository objects, that is, Remote Tables and Views, Local Tables, Intelligent Lookups, Data Flows, Replication Flows \(load type *Initial Only*\), and Transformation Flows. You can also include non-repository objects such as SAP HANA Open SQL schema procedures and SAP BW Bridge process chains. In addition, you can nest other existing, locally-created or shared task chains in other task chains, as well as share task chains you've created to other spaces.
 
-n addition to the objects available from the Repository or Others tabs, you can also add two other additional objects to task chains that are only available from the task chain toolbar. The *API Task* object lets you configure and run API asks to access external systems. The *Notification Task* object lets you configure email notification for individual task chain tasks.
+In addition to the objects available from the Repository or Others tabs, you can also add two additional objects to task chains that are only available from the task chain toolbar. The *API Task* object lets you configure and run API tasks to access external systems. The *Notification Task* object lets you configure email notifications for individual task chain tasks.
 
 > ### Note:
 > For remote table and view objects included in a task chain, you have the option, by default, to replicate or persist the data associated with the corresponding remote tables or views. Or, you can choose to remove the replicated or persisted data by selecting that option in the *Activities* section of an object’s *Properties* detail display.
@@ -320,7 +320,7 @@ In addition to working with task chains in the editor, you can also:
 
 \[File Space Only\] When creating a file space, administrators have defined default *Apache Spark Applications* to run tasks \(in Workload Management\). You can update these settings to suit your needs, by object type:
 
-- *Use Default*: The default application is the application selected by an administrator during the file space creation. However, if the settings have been changed on the object level, in the data integration monitor, this value has become the default value, erasing the value defined in *Workload Management*.
+- *Use Default*: The default application is the application selected in the table settings. If no default application is defined there, the application selected by an administrator during the file space creation is used. However, if the settings have been changed on the object level, in the data integration monitor, that value becomes the default value, overriding the value defined in *Workload Management*.
 - *Define New Setting for This Task*: Select another *Apache Spark Application* that fits your needs.
 
 For more information, see [Merge or Optimize Your Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/e533b154ed3e49ce9a03e4421a5296e7.html "Local Tables (File) can store large quantities of data in the object store. You can manage this file storage with merge or optimize tasks, and allocate the required amount of compute resources that the file space can consume when processing these tasks.") :arrow_upper_right: and [Override the Default Settings to Run Your Transformation Flow (in a File Space)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/e5c4ac8ab3bf4573b86cd4f4f3118c16.html "Update the maximum amount of compute resources that the file space can consume to run a transformation flow.") :arrow_upper_right:.
```

docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-transformation-flow-in-a-file-space-b917baf.md

Lines changed: 7 additions & 7 deletions
```diff
@@ -4,16 +4,16 @@
 
 # Creating a Transformation Flow in a File Space
 
-Create transformation flows with local tables \(file\) as sources, apply various transformations, and store the resulted dataset into another local table \(file\).
+Create transformation flows with local tables \(file\), apply various transformations, and store the resulting dataset into another local table \(file\).
 
 > ### Note:
 > For additional information on working with data in the object store, see SAP note [3538038](https://me.sap.com/notes/3538038).
 
-As a Datasphere modeler, you want to model transformation flows with local tables \(file\) as sources, apply various transformations in a file space dedicated to loading and preparing large quantities of data, and store the resulted dataset into another local table \(file\).
+As a Datasphere modeler, you want to model transformation flows with local tables \(file\), shared local tables, and shared remote tables on a Delta Share runtime as sources, apply various transformations in a file space dedicated to loading and preparing large quantities of data, and store the resulting dataset into another local table \(file\).
 
 > ### Caution:
 > - You must be in a file space. For more information, see [Create a File Space to Load Data in the Object Store](https://help.sap.com/viewer/935116dd7c324355803d4b85809cec97/DEV_CURRENT/en-US/947444683e524cfd9169d7671b72ba0c.html "Create a file space and allocate compute resources to it. File spaces are intended for loading and preparing large quantities of data in an inexpensive inbound staging area and are stored in the SAP Datasphere object store.") :arrow_upper_right:.
-> - You source and target must be local table \(file\). For more information, see [Creating a Local Table \(File\)](creating-a-local-table-file-d21881b.md).
+> - Your source and target must be a local table \(file\). For more information, see [Creating a Local Table \(File\)](creating-a-local-table-file-d21881b.md).
 > - You can only create a graphical view transform.
 > - You can only preview data for source and target tables. Intermediate node transforms can’t be previewed.
@@ -23,7 +23,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
 > ### Note:
 > In a file space, you can only create a *Graphical View Transform*.
 
-3. Add a source. For more information, see [Creating a Transformation Flow in a File Space](creating-a-transformation-flow-in-a-file-space-b917baf.md). Note that you can only add a local table \(file\).
+3. Add a source. For more information, see [Add a Source to a Graphical View](../add-a-source-to-a-graphical-view-1eee180.md). Note that you can only add a local table \(file\).
 4. Add a Transformation. The *View Transform* does not support all functions available in a transformation flow created in an SAP HANA space. For more information, see [List of Functions Supported by a Transformation Flow \(in a File Space\)](list-of-functions-supported-by-a-transformation-flow-in-a-file-s-37e737f.md):
 
 
@@ -84,7 +84,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
 </td>
 <td valign="top">
 
-See [Filter Data in a Graphical View](../filter-data-in-a-graphical-view-6f6fa18.md)
+See [Filter Data in a Graphical View](../filter-data-in-a-graphical-view-6f6fa18.md)
 
 </td>
 </tr>
@@ -136,7 +136,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
 > ### Note:
 > It can only be a local table \(file\) and *Delete All Before Loading* is not supported.
 
-8. Review the properties of your transformation flow, save and deploy it. See [Creating a Transformation Flow](../creating-a-transformation-flow-f7161e6.md).
+8. Review the properties of your transformation flow, save, deploy, and run it. See [Creating a Transformation Flow](../creating-a-transformation-flow-f7161e6.md).
 
 > ### Note:
 > The transformation will be saved in the object store. While deploying, a virtual procedure will be created to enable the runtime in the file space.
@@ -147,6 +147,6 @@ As a Datasphere modeler, you want to model transformation flows with local table
 - Simulate a run that doesn't save changes in the target table by clicking *Simulate Run*. Simulating allows you to test a transformation flow and see if you get the desired outcome. Based on the result, you can decide to deploy the flow, resolve errors, or optimize the flow to improve performance.
 - Download a PLV file with a visual map of the operators and their relationships and hierarchies by clicking *Generate a SQL Analyzer Plan File*. The plan file contains detailed information about your data model that you can download for further analysis. Analyzing this file allows you to resolve errors and enhance transformation flow performance.
 
-For more information, see [Explore Transformation Flows](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7588192bf4cd4e3db43704239ba4d366.html "Use Run with Settings to explore graphical or SQL views and the entities they consume in a transformation flow.") :arrow_upper_right:
+For more information, see [Explore Transformation Flows](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7588192bf4cd4e3db43704239ba4d366.html "Use Run with Settings to explore graphical or SQL views and the entities they consume in a transformation flow.") :arrow_upper_right:.
```
