Commit 7d8a0ea

[SDP] Streaming Tables
1 parent df31a24 commit 7d8a0ea

File tree

1 file changed: +22 -10 lines changed

docs/declarative-pipelines/index.md

Lines changed: 22 additions & 10 deletions
@@ -83,9 +83,20 @@ libraries:
 
 Declarative Pipelines supports the following dataset types:
 
-* **Materialized Views** (datasets) that are published to a catalog
-* **Table** that are published to a catalog
-* **Views** that are not published to a catalog
+* **Materialized views** that are published to a catalog.
+* **Tables** that are published to a catalog.
+* [Streaming tables](#streaming-tables)
+* **Views** that are not published to a catalog.
+
+### Streaming Tables
+
+**Streaming tables** are tables whose content is produced by one or more streaming flows.
+
+Streaming tables can be created with the following:
+
+* [@dp.create_streaming_table](#create_streaming_table)
+* [CREATE STREAMING TABLE](../sql/SparkSqlAstBuilder.md/#visitCreatePipelineDataset)
+* [CREATE STREAMING TABLE ... AS](../sql/SparkSqlAstBuilder.md/#visitCreatePipelineDataset)
 
 ## Spark Connect Only { #spark-connect }
 

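The new docs section says a streaming table's content is produced by one or more streaming flows. As a conceptual sketch only (plain Python, not the `pyspark.pipelines` API; all names here are illustrative), one table fed by several append flows can be modeled like this:

```python
from typing import Callable, Dict, List

# Conceptual model only: a "streaming table" accumulates rows that one
# or more registered "flows" append to it. These class and method names
# are illustrative, not the Spark Declarative Pipelines API.
class StreamingTable:
    def __init__(self, name: str) -> None:
        self.name = name
        self.rows: List[Dict] = []
        self.flows: List[Callable[[], List[Dict]]] = []

    def append_flow(self, flow: Callable[[], List[Dict]]) -> None:
        # Register a flow whose output is appended to this table.
        self.flows.append(flow)

    def run_once(self) -> None:
        # Pull one micro-batch from every registered flow.
        for flow in self.flows:
            self.rows.extend(flow())

events = StreamingTable("events")
events.append_flow(lambda: [{"source": "kafka", "value": 1}])
events.append_flow(lambda: [{"source": "files", "value": 2}])
events.run_once()
print(len(events.rows))  # 2: one row from each flow
```

The point of the sketch is the many-to-one relationship: several flows can target the same streaming table, and the table's content is the union of what those flows append.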
@@ -176,13 +187,14 @@ create_sink(
 
 ```py
 create_streaming_table(
-  name: str,
-  *,
-  comment: Optional[str] = None,
-  table_properties: Optional[Dict[str, str]] = None,
-  partition_cols: Optional[List[str]] = None,
-  schema: Optional[Union[StructType, str]] = None,
-  format: Optional[str] = None,
+  name: str,
+  *,
+  comment: Optional[str] = None,
+  table_properties: Optional[Dict[str, str]] = None,
+  partition_cols: Optional[List[str]] = None,
+  cluster_by: Optional[List[str]] = None,
+  schema: Optional[Union[StructType, str]] = None,
+  format: Optional[str] = None,
 ) -> None
 ```
 
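In the signature above, everything after `name` follows the bare `*`, so it must be passed by keyword, and this commit adds `cluster_by` alongside `partition_cols`. A runnable stub with the same shape (a sketch: `schema` is narrowed to `str` to avoid a Spark dependency, and the body is empty where the real function would register the table with the pipeline) illustrates the calling convention:

```python
from typing import Dict, List, Optional

# Stub with the post-change signature from the diff. The real function
# registers a streaming table and returns None; this one does nothing.
def create_streaming_table(
    name: str,
    *,
    comment: Optional[str] = None,
    table_properties: Optional[Dict[str, str]] = None,
    partition_cols: Optional[List[str]] = None,
    cluster_by: Optional[List[str]] = None,  # new in this commit
    schema: Optional[str] = None,            # Optional[Union[StructType, str]] in Spark
    format: Optional[str] = None,
) -> None:
    pass

# Everything after `name` must be passed by keyword:
create_streaming_table("events", cluster_by=["region"])
try:
    create_streaming_table("events", "a comment")  # positional -> TypeError
except TypeError as e:
    print("rejected:", e)
```

Making the options keyword-only lets parameters such as `cluster_by` be inserted into the signature without breaking existing callers, since no caller can rely on positional order.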