Hi, I’ve seen this touched on in other posts, but I wanted to ask directly: has anyone had success getting data into a Microsoft Fabric Lakehouse, and what are the best authentication methods?
I think there are potentially options using the Azure Blob Storage connector to push CSV files in, and possibly using the Fabric API as well. Before I go down the governance route and request SAS tokens or API keys, I’d like to hear about others’ experiences to make sure what I’m asking for is actually going to work.
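In case it helps frame the question: outside of FME I've seen OneLake accessed through its ADLS Gen2-compatible endpoint with Entra ID auth, which I believe is the same endpoint an Azure Blob/ADLS connector would ultimately target. A minimal sketch of that route in Python (assuming the `azure-identity` and `azure-storage-file-datalake` packages; the workspace/lakehouse names are placeholders, not real ones):

```python
# Sketch: write a CSV into a Fabric Lakehouse via the OneLake
# ADLS Gen2-compatible endpoint, authenticating with Entra ID.
# Workspace/lakehouse/file names below are placeholder assumptions.

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"


def onelake_file_path(lakehouse: str, filename: str) -> str:
    # Files for a lakehouse live under "<lakehouse>.Lakehouse/Files/"
    # within the workspace's filesystem on OneLake.
    return f"{lakehouse}.Lakehouse/Files/{filename}"


def upload_csv(workspace: str, lakehouse: str, filename: str, data: bytes) -> None:
    # Azure SDK imports are inside the function so the path helper
    # above can be exercised without the SDK installed.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Each Fabric workspace appears as a filesystem on the OneLake endpoint.
    service = DataLakeServiceClient(ONELAKE_URL, credential=DefaultAzureCredential())
    fs = service.get_file_system_client(workspace)
    fs.get_file_client(onelake_file_path(lakehouse, filename)).upload_data(
        data, overwrite=True
    )


# e.g. upload_csv("MyWorkspace", "Sales", "orders.csv", b"id,amount\n1,9.99\n")
```

No SAS token or API key is needed for this path since `DefaultAzureCredential` picks up an Entra ID identity, which may simplify the governance request.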
I have a few data rescue use cases where I need to scrape large volumes of documents, structure the outputs using FME, and then write the data to a Lakehouse so the business can use it for reporting.
I’ve managed to get data out of Lakehouses using the SQL endpoint with a SQL Database Reader, since it has Entra ID authentication integrated.
Any insights appreciated.

I'm using FME 2024.
