docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/acquiring-data-1f15a29.md (+1 -1)
@@ -102,7 +102,7 @@ You can import data from a CSV file to create a new local table \(see [Creating
## Purchase Data from Data Marketplace
- Purchase data products from providers and download them directly into your space \(see [Evaluating and Installing Marketplace Data Products](../evaluating-and-installing-marketplace-data-products-92c35ef.md)\).
+ Purchase data products from providers and download them directly into your space \(see [Installing Marketplace Data Products](../installing-marketplace-data-products-92c35ef.md)\).
You can become a data provider and offer your own data products for sale in Data Marketplace via the Data Sharing Cockpit \(see [Data Marketplace - Data Provider's Guide](https://help.sap.com/viewer/bb1899f0b39f415b9de29a845873d7af/DEV_CURRENT/en-US/e479b7b4c95741c7a7a1d42397984c7e.html "Users with a modeler role can create a data provider profile and publish data products to public, private, and internal Data Marketplaces."):arrow_upper_right:\).
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/add-the-source-for-a-replication-flow-7496380.md (+1 -0)
@@ -29,6 +29,7 @@ Define the source for your replication flow \(connection, container, and objects
- For cloud storage providers, the container is the folder that contains the relevant dataset folders.
+ - For delta loading from a Microsoft SQL Server or Azure SQL database source, the source must have a schema with the same name as the user specified in the connection. If this schema is missing, a database administrator needs to create it and grant the necessary write privileges. This schema is needed to store internal objects for delta replication.
- You can use the SQL service exposure from SAP BTP, ABAP environment, or SAP S/4HANA Cloud, respectively, to replicate custom and standard CDS view entities if your system administration has created the relevant communication arrangements. For more information, see [Data Consumption using SAP Datasphere](https://help.sap.com/docs/btp/sap-business-technology-platform/data-consumption-using-sap-datasphere). For information about the relevant integration scenario, see [Integrating SQL Services using SAP Datasphere](https://help.sap.com/docs/btp/sap-business-technology-platform/integrating-sql-services-using-sap-datasphere).
- Replication objects are the datasets that you choose for replication, for example individual CDS view entities or database tables.
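The prerequisite added in this hunk concerns the schema that delta replication needs on a Microsoft SQL Server or Azure SQL source. A minimal sketch of what the database administrator might run follows; the user name `DS_REPL_USER` and the exact privilege list are illustrative assumptions, not taken from the documentation:

```sql
-- Hypothetical T-SQL sketch, run by a database administrator on the source.
-- Assumes the SAP Datasphere connection signs in as the user DS_REPL_USER;
-- the schema must carry the same name as that user.
CREATE SCHEMA [DS_REPL_USER] AUTHORIZATION [DS_REPL_USER];
GO

-- Grant write privileges so the internal objects for delta replication
-- can be created and maintained in this schema.
GRANT CREATE TABLE TO [DS_REPL_USER];
GRANT ALTER, INSERT, UPDATE, DELETE, SELECT ON SCHEMA::[DS_REPL_USER] TO [DS_REPL_USER];
```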
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/cloud-storage-provider-sources-for-replication-flows-4d481a2.md
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/configure-a-replication-flow-3f5ba0c.md (+2 -2)
@@ -66,9 +66,9 @@ Define settings and properties for your replication flow and individual replicat
- Additional settings that are only relevant for a specific source type and can be made for the replication flow itself as well as for individual replication objects. For more information, see
- [Confluent Kafka Sources for Replication Flows](confluent-kafka-sources-for-replication-flows-4f2d0a8.md)
- [Secure File Transfer Protocol \(SFTP\) as Targets for Your Replication Flows](secure-file-transfer-protocol-sftp-as-targets-for-your-replicati-5a14eb1.md)
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/confluent-kafka-sources-for-replication-flows-4f2d0a8.md
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-replication-flow-25e2bd7.md (+1 -1)
@@ -19,7 +19,7 @@ You can use replication flows to copy data from the following source objects:
- Tables that have a primary key.
- CDS views and ODP artifacts that do not have a primary key can be used as the source for a replication flow if certain prerequisites are met. For more information, see [Using an Object Without Primary Key As the Source](using-an-object-without-primary-key-as-the-source-2267a9f.md).
+ CDS views and ODP artifacts that do not have a primary key can be used as the source for a replication flow if certain prerequisites are met. For more information, see [Object Without Primary Key As Source Objects for Replication Flows](object-without-primary-key-as-source-objects-for-replication-flo-2267a9f.md).
For more information about available connection types, sources, and targets, see [Connection Types Supporting Replication Flows](https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/94562426f30c475286f50a1e2b45e743.html?connection_overview-rf=yes%20(source)&connection_overview-rf=yes%20(target)&connection_overview-rf=yes%20(source%20and%20target)&connection_overview-rf=via%20connection%20type%20SAP%20ABAP%20Connections%20(source)).
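For context on the "tables that have a primary key" prerequisite quoted above: a table only qualifies as a replication flow source when its key is explicitly declared. A minimal illustration, with table and column names invented for the example:

```sql
-- Illustrative only: a table like this, with an explicit primary key,
-- can be selected as a source object for a replication flow.
CREATE TABLE SALES_ORDERS (
    ORDER_ID  INT           NOT NULL,
    CUSTOMER  NVARCHAR(80),
    AMOUNT    DECIMAL(15,2),
    CONSTRAINT PK_SALES_ORDERS PRIMARY KEY (ORDER_ID)
);
```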
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-task-chain-d1afbc2.md (+2 -2)
@@ -29,7 +29,7 @@ Group multiple tasks into a task chain and run them manually once, or periodical
You can create task chains that include SAP Datasphere repository objects, that is, Remote Tables and Views, Local Tables, Intelligent Lookups, Data Flows, Replication Flows \(load type *Initial Only*\), and Transformation Flows. You can also include non-repository objects such as SAP HANA Open SQL schema procedures and SAP BW Bridge process chains. In addition, you can nest other existing, locally-created or shared task chains in other task chains, as well as share task chains you've created to other spaces.
- n addition to the objects available from the Repository or Others tabs, you can also add two other additional objects to task chains that are only available from the task chain toolbar. The *API Task* object lets you configure and run API asks to access external systems. The *Notification Task* object lets you configure email notification for individual task chain tasks.
+ In addition to the objects available from the Repository or Others tabs, you can also add two other additional objects to task chains that are only available from the task chain toolbar. The *API Task* object lets you configure and run API tasks to access external systems. The *Notification Task* object lets you configure email notification for individual task chain tasks.
> ### Note:
> For remote table and view objects included in a task chain, you have the option, by default, to replicate or persist the data associated with the corresponding remote tables or views. Or, you can choose to remove the replicated or persisted data by selecting that option in the *Activities* section of an object’s *Properties* detail display.
@@ -320,7 +320,7 @@ In addition to working with task chains in the editor, you can also:
\[File Space Only\] When creating a file space, administrators have defined default *Apache Spark Applications* to run tasks \(in Workload Management\). You can update these settings following your needs by object types:
- - *Use Default*: The default application is the application selected by an administrator during the file space creation. However, if the settings have been changed on the object level, in the data integration monitor, this value has become the default value, erasing the value defined in *Workload Management*.
+ - *Use Default*: The default application is the application selected in the table settings. If no default application is defined there, the application selected by an administrator during the file space creation is used. However, if the settings have been changed on the object level, in the data integration monitor, this value has become the default value, erasing the value defined in *Workload Management*.
- *Define New Setting for This Task*: Select another *Apache Spark Application* that fits your needs.
For more information, see [Merge or Optimize Your Local Tables (File)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/e533b154ed3e49ce9a03e4421a5296e7.html "Local Tables (File) can store large quantities of data in the object store. You can manage this file storage with merge or optimize tasks, and allocate the required amount of compute resources that the file space can consume when processing these tasks."):arrow_upper_right: and [Override the Default Settings to Run Your Transformation Flow (in a File Space)](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/e5c4ac8ab3bf4573b86cd4f4f3118c16.html "Update the maximum amount of compute resources that the file space can consume to run a transformation flow."):arrow_upper_right:.
docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder/creating-a-transformation-flow-in-a-file-space-b917baf.md (+7 -7)
@@ -4,16 +4,16 @@
# Creating a Transformation Flow in a File Space
- Create transformation flows with local tables \(file\) as sources, apply various transformations, and store the resulted dataset into another local table \(file\).
+ Create transformation flows with local tables \(file\), apply various transformations, and store the resulting dataset into another local table \(file\).
> ### Note:
> For additional information on working with data in the object store, see SAP note [3538038](https://me.sap.com/notes/3538038).
- As a Datasphere modeler, you want to model transformation flows with local tables \(file\) as sources, apply various transformations in a file space dedicated to loading and preparing large quantities of data, and store the resulted dataset into another local table \(file\).
+ As a Datasphere modeler, you want to model transformation flows with local tables \(file\), shared local tables, and shared remote tables on a Delta Share runtime as sources, apply various transformations in a file space dedicated to loading and preparing large quantities of data, and store the resulting dataset into another local table \(file\).
> ### Caution:
> - You must be in a file space. For more information, see [Create a File Space to Load Data in the Object Store](https://help.sap.com/viewer/935116dd7c324355803d4b85809cec97/DEV_CURRENT/en-US/947444683e524cfd9169d7671b72ba0c.html "Create a file space and allocate compute resources to it. File spaces are intended for loading and preparing large quantities of data in an inexpensive inbound staging area and are stored in the SAP Datasphere object store."):arrow_upper_right:.
- > - You source and target must be local table \(file\). For more information, see [Creating a Local Table \(File\)](creating-a-local-table-file-d21881b.md).
+ > - Your source and target must be a local table \(file\). For more information, see [Creating a Local Table \(File\)](creating-a-local-table-file-d21881b.md).
> - You can only create a graphical view transform.
> - You can only preview data for source and target tables. Intermediate node transforms can’t be previewed.
@@ -23,7 +23,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
> ### Note:
> On file space, you can only create *Graphical View Transform*.
- 3. Add a source. For more information, see [Creating a Transformation Flow in a File Space](creating-a-transformation-flow-in-a-file-space-b917baf.md). Note that you can only add a local table \(file\).
+ 3. Add a source. For more information, see [Add a Source to a Graphical View](../add-a-source-to-a-graphical-view-1eee180.md). Note that you can only add a local table \(file\).
4. Add a Transformation. The *View Transform* does not support all functions available in a transformation flow created in an SAP HANA space. For more information, see [List of Functions Supported by a Transformation Flow \(in a File Space\)](list-of-functions-supported-by-a-transformation-flow-in-a-file-s-37e737f.md):
@@ -84,7 +84,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
</td>
<td valign="top">
- See [Filter Data in a Graphical View](../filter-data-in-a-graphical-view-6f6fa18.md)
+ See [Filter Data in a Graphical View](../filter-data-in-a-graphical-view-6f6fa18.md)
</td>
</tr>
@@ -136,7 +136,7 @@ As a Datasphere modeler, you want to model transformation flows with local table
> ### Note:
> It can only be a local table \(file\) and *Delete All Before Loading* is not supported.
- 8. Review the properties of your transformation flow, saveand deploy it. See [Creating a Transformation Flow](../creating-a-transformation-flow-f7161e6.md).
+ 8. Review the properties of your transformation flow, save, deploy, and run it. See [Creating a Transformation Flow](../creating-a-transformation-flow-f7161e6.md).
> ### Note:
> The transformation will be saved in the object store. While deploying, a virtual procedure will be created to enable the runtime in the file space.
@@ -147,6 +147,6 @@ As a Datasphere modeler, you want to model transformation flows with local table
- Simulate a run that doesn't save changes in the target table by clicking *Simulate Run*. Simulating allows you to test a transformation flow and see if you get the desired outcome. Based on the result, you can decide to deploy the flow, resolve errors, or to optimize the flow to improve performances.
- Download a PLV file of a visual map of the operators and their relationships and hierarchies by clicking *Generate a SQL Analyzer Plan File*. The plan file contains detailed information about your data model that you can download for further analysis. Analyzing this file allows you to resolve errors and enhance the transformation flow performances.
- For more information, see [Explore Transformation Flows](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7588192bf4cd4e3db43704239ba4d366.html "Use Run with Settings to explore graphical or SQL views and the entities they consume in a transformation flow."):arrow_upper_right:
+ For more information, see [Explore Transformation Flows](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7588192bf4cd4e3db43704239ba4d366.html "Use Run with Settings to explore graphical or SQL views and the entities they consume in a transformation flow."):arrow_upper_right:.