These are all the items relating to Microsoft Fabric workloads.
- **Get_Files_Tables_StorageSizes** - Runs on your existing Lakehouse and gets the Files and Tables storage sizes.
- **Adding Date Time Column to PySpark DataFrame** - Adds the current date and time to an existing PySpark DataFrame.
- **Azure Key Vault Auth with Service Principal** - Authenticates with a Service Principal, using credentials stored in Azure Key Vault, as needed to run the Power BI or Fabric Admin APIs.
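The Service Principal authentication in that notebook boils down to the standard OAuth2 client-credentials flow against Entra ID. Below is a minimal stdlib-only sketch of that flow; the helper names (`build_token_request`, `get_access_token`) are hypothetical, not the notebook's own, and the Key Vault retrieval is shown only as a comment since it requires a Fabric runtime.

```python
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id: str, client_id: str, client_secret: str,
                        scope: str = "https://analysis.windows.net/powerbi/api/.default"):
    """Build the OAuth2 client-credentials token request for the Power BI / Fabric APIs."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()
    return url, body

def get_access_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """POST the token request and return the bearer token (needs network access)."""
    url, body = build_token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]

# In a Fabric notebook the secret would typically be pulled from Key Vault first,
# e.g. (assuming notebookutils is available in the runtime):
# client_secret = notebookutils.credentials.getSecret(key_vault_url, "secret-name")
```

In practice you would use a library such as `azure-identity` for this; the sketch just makes the token exchange explicit.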
- **Blog - Update IR Policy** - Updates your Incremental Refresh Policy based on the dates provided in the notebook.
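The notebook's actual policy update goes through the semantic model APIs, which aren't reproduced here; this small sketch only illustrates computing the rolling-window dates an incremental refresh policy update might take. The `rolling_window` helper is hypothetical.

```python
from datetime import date

def rolling_window(today: date, years_back: int = 2):
    """Hypothetical helper: window covering the last `years_back` whole
    calendar years plus the current year to date."""
    start = date(today.year - years_back, 1, 1)
    return start, today

start, end = rolling_window(date(2024, 6, 15), years_back=2)
# start is 2022-01-01, end is 2024-06-15
```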
- **Reading Table from Another Lakehouse** - Allows you to read a Lakehouse table from a different App Workspace.
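Cross-workspace reads work because every Lakehouse table has an addressable OneLake URI. A sketch of building that path (the `abfss_table_path` helper is hypothetical; the URI shape is OneLake's documented `abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/<table>` format):

```python
def abfss_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Hypothetical helper: build the OneLake abfss URI for a table in
    another workspace's Lakehouse (workspace/item names or GUIDs both work)."""
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Tables/{table}")

path = abfss_table_path("WorkspaceA", "LakehouseA", "sales")
# In a Spark notebook that path can then be loaded as Delta:
# df = spark.read.format("delta").load(path)
```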
- **Blog - Scanner API** - Downloads the Scanner API data to a JSON file in your Lakehouse.
- **BLOG - Entra ID All Group Members** - Gets all the Entra ID groups and their members and downloads them to a JSON file in your Lakehouse.
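Pulling all groups and members from Microsoft Graph means following its `@odata.nextLink` paging until it is exhausted. A minimal sketch of that loop, with the HTTP call injected so it can be exercised without network access (`collect_pages` is a hypothetical helper, not the notebook's own function):

```python
def collect_pages(fetch, url):
    """Accumulate items across Microsoft Graph pages.

    `fetch(url)` must return the decoded JSON page; Graph pages carry their
    items in "value" and the next page's URL in "@odata.nextLink"."""
    items = []
    while url:
        page = fetch(url)
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # None on the last page ends the loop
    return items

# Against the real API, `fetch` would be a requests/urllib call carrying a
# bearer token, e.g. starting from (group id elided):
# members = collect_pages(graph_get, "https://graph.microsoft.com/v1.0/groups/{id}/members")
```

Injecting `fetch` also makes retry/throttling logic easy to wrap around one place.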
- **Blog - Get Files and Table Sizes** - Gets the storage sizes of all files and tables in your Lakehouses and Warehouses.
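Summing storage under Files/Tables is a recursive walk over directory listings. A sketch with the lister injected (in a Fabric notebook that would be `notebookutils.fs.ls`, whose entries expose `.isDir`, `.path` and `.size` - an assumption about the runtime; `total_size` itself is a hypothetical helper):

```python
def total_size(ls, path: str) -> int:
    """Recursively sum file sizes under `path`.

    `ls(path)` must return entries with .isDir, .path and .size attributes,
    matching the shape of notebookutils.fs.ls in Fabric."""
    total = 0
    for entry in ls(path):
        if entry.isDir:
            total += total_size(ls, entry.path)  # descend into subfolders
        else:
            total += entry.size
    return total

# In a Fabric notebook (assumption: notebookutils is available):
# size_bytes = total_size(notebookutils.fs.ls, "Files/")
```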
- **Blog - Reading and Writing different Lakehouses** - Shows you how to read from Lakehouse A in Workspace A and write to Lakehouse B in Workspace B.
- **Blog - Create case insensitive Warehouse** - Contains the code to create a case-insensitive Warehouse.
- **Blog - Create warehouse with Service Principal** - Contains the code to create a Warehouse using a Service Principal account.
- **Blog - Get All Entra ID Groups and Users and Licenses** - Gets all the groups, users, and user licenses in your tenant.
- **Blog - Get All Fabric Items - Actual Pure Python** - Gets all the Fabric items in your tenant using a Python-only notebook.
- **Blog - DuckDB SQL Code to read from Delta Tables** - Shows you how to read data from Lakehouse tables using SQL via DuckDB.
- **Blog - Python - DuckDB - Writing to Lakehouse Table** - Shows you how to write data from a DataFrame to a Lakehouse table using DuckDB SQL.
- **Blog - Python - Run DAX Query and write to LH Table** - Runs a DAX query against a semantic model and writes the resulting DataFrame to a Lakehouse table.
- **Blog - Python - DuckDB - Querying Multiple Tables** - Queries multiple tables using a Python notebook with DuckDB and a weather API.
- **Blog - Python - DuckDB - Looping and write once** - Shows how to loop through stop dates and write the output once to a Lakehouse table.
- **BLOG - Show Table with IR Policy** - Uses Semantic Link Labs to connect to the Tabular Object Model and show a table's Incremental Refresh Policy.
- **Blog - Export all Items to OneLake** - Exports all Fabric items from your Workspace to OneLake and Azure Blob Storage.
- **Blog - Restore from Backup Items** - Imports/restores all Fabric items from a folder in your Lakehouse Files section.
License: MIT