
In this post I'd like to share some knowledge, based on recent experience, about the performance of Azure Data Factory when loading data from Azure Data Lake into a database, more specifically when using the Copy Activity. What I'm talking about here comes down to the difference between loading data one file at a time and loading an entire set of files in a folder.

The screenshot below shows a typical pattern that we use, where we start off by getting a list of files that we want to load. We have a couple of tables behind this: one telling us which files are available, and a list of those files that may have already been loaded to our target.

The other screenshot shows a typical pattern we would run for each of those files. I've got a stored procedure that puts an entry into a table saying we've started loading this file, we run the Copy Activity, and then we record whether it succeeded or failed at the end.

If you're coming from an SSIS background, the ForEach Loop is a powerful technique, and it's not a big deal to loop through hundreds of files. But in Azure Data Factory, the story is a bit different. The mechanism used behind the scenes is quite different: Data Factory must provision resources, and initiating these tasks can take some time. Each one of the tasks that we see here, even the logging, starting, copy, and completion tasks, requires some start-up effort.
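To make that start-up cost concrete, here is a rough back-of-the-envelope model. The numbers are purely illustrative assumptions, not measured Data Factory figures, but they show why paying a fixed start-up delay per activity, three times per file, adds up quickly compared with a single copy over the whole folder.

```python
# Illustrative model of activity start-up overhead in a per-file loop
# vs. a single bulk copy. The startup/copy times are assumptions for
# illustration, not measured Azure Data Factory figures.

def per_file_seconds(n_files, startup=20.0, copy=2.0, activities_per_file=3):
    """Each file runs several activities (log start, copy, log completion),
    and every activity pays its own start-up delay."""
    return n_files * activities_per_file * (startup + copy)

def bulk_copy_seconds(n_files, startup=20.0, copy=2.0):
    """One Copy Activity over the whole folder pays start-up once."""
    return startup + n_files * copy

for n in (10, 100, 500):
    print(f"{n:>4} files: per-file loop {per_file_seconds(n) / 60:6.1f} min, "
          f"bulk copy {bulk_copy_seconds(n) / 60:6.1f} min")
```

Under these assumptions, 100 files cost well over an hour in the per-file loop but only a few minutes as a single copy, which is the gap this post is about.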

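For readers who haven't built this pattern, the per-file pipeline described above might look roughly like the simplified JSON fragment below. This is a hypothetical sketch, not the actual pipeline from the screenshots: the activity names (GetFileList, LogFileStart, CopyFile, LogFileComplete) and the use of Lookup, ForEach, Copy, and stored procedure activities are placeholders for the general shape.

```json
{
  "name": "LoadFilesOneByOne",
  "properties": {
    "activities": [
      {
        "name": "GetFileList",
        "type": "Lookup"
      },
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('GetFileList').output.value",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "LogFileStart",
              "type": "SqlServerStoredProcedure"
            },
            {
              "name": "CopyFile",
              "type": "Copy",
              "dependsOn": [
                { "activity": "LogFileStart", "dependencyConditions": [ "Succeeded" ] }
              ]
            },
            {
              "name": "LogFileComplete",
              "type": "SqlServerStoredProcedure",
              "dependsOn": [
                { "activity": "CopyFile", "dependencyConditions": [ "Succeeded" ] }
              ]
            }
          ]
        }
      }
    ]
  }
}
```

Each entry inside the ForEach is a separate activity run, which is exactly where the per-activity start-up cost discussed above gets paid again and again.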