I'm trying to use cached lookups in mapping data flows to implement dynamic column mapping from a SQL source to a SQL destination. The idea is to persist a mapping in a metadata table, read it as a source into the data flow, and store it as a cached lookup.
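As a rough sketch of that idea: suppose the metadata table is cached in a sink named cachedMap (an assumed name), keyed on sourceColumn with a targetColumn value per row. A rule-based mapping or derived column could then look up each incoming column's target name, for example:

```
/* Assumption: cache sink "cachedMap" is keyed on sourceColumn and
   carries a targetColumn per row; names are hypothetical.
   name() returns the current column name in a rule-based mapping. */
cachedMap#lookup(name()).targetColumn
```

This is only a sketch of the pattern, not a complete mapping setup; the cache sink must be populated from the metadata source earlier in the same data flow.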
[!INCLUDE data-flow-preamble] This article provides details about the cached lookup functions supported by Azure Data Factory and Azure Synapse Analytics in mapping data flows.
ADF has added the ability to cache your data streams to a sink that writes to a cache instead of a data store, allowing you to implement what ETL tools typically refer to as cached lookups or unconnected lookups.
Many powerful use cases are enabled by this feature: you can look up reference data stored in the cache and reference it via key lookups, with different values and multiple times, without specifying separate Lookup transformations.
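For example, with a hypothetical cache sink named countryCache keyed on a country code and carrying a countryName column, the same cached data can be referenced repeatedly with different keys inside derived column expressions:

```
/* Hypothetical cache sink "countryCache", keyed on country code. */
countryCache#lookup('US').countryName
countryCache#lookup(billingCountry).countryName
countryCache#lookup(shippingCountry).countryName
```

Each `#lookup` call reads from the same in-memory cache, so no additional Lookup transformation or join is needed per reference.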
I am using Azure Data Factory mapping data flows. I am trying to write the equivalent of a SQL CASE statement to produce a value in a derived column based on a cached sink.
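One way to approximate a SQL CASE in a derived column is the data flow `case()` expression combined with a cached-sink lookup as the fallback; the sink name statusCache and its columns below are assumptions, not from the original post:

```
/* SQL: CASE WHEN status = 'A' THEN 'Active'
             WHEN status = 'I' THEN 'Inactive'
             ELSE <lookup> END
   Assumption: cache sink "statusCache" keyed on status, with a label column. */
case(status == 'A', 'Active',
     status == 'I', 'Inactive',
     statusCache#lookup(status).label)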
To reference cached data from one part of a pipeline in a data flow activity, you can follow these steps: in the data flow, add a sink transformation and select "Cache" as the sink type.
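Within the data flow, the cached sink can then be read back with the cached lookup functions; for a cache written without a key, `#output()` returns the first row and `#outputs()` the full row array. The sink name cacheSink and column maxDate here are assumptions for illustration:

```
/* Assumed cache sink "cacheSink" with a maxDate column. */
cacheSink#output().maxDate       /* first (or only) cached row */
cacheSink#outputs()[1].maxDate   /* explicit 1-based index into the cached rows */
```

Note that data flow arrays are 1-based, so `#outputs()[1]` addresses the first row.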
Loop through the table names and pass each one as a parameter to the data flow to set the table name in the dataset. ADF handles cluster creation and teardown.
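In the pipeline, that loop is typically a ForEach activity iterating over a Lookup activity's output, with each item passed into a data flow parameter. The activity name GetTableList and the parameter/column names below are assumptions:

```
/* ForEach "Items" setting (assumed Lookup activity "GetTableList"): */
@activity('GetTableList').output.value

/* Data flow parameter value inside the ForEach (assumed column "TableName"): */
@item().TableName
```

The data flow then uses that parameter in the dataset or source settings to select the table for each iteration.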