This is a simple library that currently provides two methods:
- `insert_http_arrow_stream_to_sql`: Makes an HTTP request to an endpoint of the Lake API and inserts the data via bulk insert into MS SQL Server. In theory you could also get the data from any other HTTP endpoint that returns an Arrow stream and is authenticated using Basic Auth. It does not guarantee atomicity at the SQL Server level, so you will usually want to use a global temp table as the target.
- `insert_record_batch_to_sql`: Same as above, but the input is a generic `RecordBatchReader` from pyarrow.
 
It's meant to be used from Python; the logic is written in Rust.
- You can specify `Authentication=ActiveDirectoryMSI|ActiveDirectoryDefault|ActiveDirectoryInteractive` in the connection string, similar to the .NET/ODBC SQL driver. This requires the `azure-identity` package to be installed.
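For illustration, a connection string using Azure AD default credentials might look like the following. The server and database names are placeholders, and the surrounding keywords follow the familiar SQL Server connection-string syntax; only the `Authentication` keyword is the option described above:

```python
# Hypothetical connection string; replace server/database with your own.
conn_str = (
    "Server=myserver.database.windows.net;"
    "Database=mydb;"
    "Authentication=ActiveDirectoryDefault"
)
```

With `ActiveDirectoryDefault`, the `azure-identity` package resolves credentials from the environment (managed identity, Azure CLI login, etc.), so no username or password appears in the string.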
There is still a lot to do:
- Allow passing more flexible HTTP authentication options
- Add an option to read from a database and write to a flat file
- Documentation
- Tests
 
This would not have been possible without the excellent arrow-odbc-py library. Use it whenever SQL Server is not the only possible target, when you need to read from a database, or when you just need something better ;)