Has anyone kicked the tires on using Snowflake streams and tasks to load your data vault? It's a very slick solution, but I'm a little hesitant because I don't see any way around having thousands of tasks, given that a child task can have only one predecessor. Granted, they'd all run the same code to load a hub/link/sat, and each task could be as simple as a call to a stored proc, but I'm nervous about that explosion in the number of tasks. Maybe it's not a big deal. Alternatively, we could have a single task call a stored proc that runs multiple SQL statements, or that calls other stored procs nested inside it. Any thoughts or experiences anyone can share?
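For context, here's a minimal sketch of the pattern I'm describing: one stream per staging table, and a small task chain where each task just calls a shared loader proc. All object names (`stg_customer`, `load_hub`, `dv_wh`, etc.) are made up for illustration; the proc signatures are assumptions, not a real API.

```sql
-- Hypothetical sketch: stream on a staging table feeds a root task,
-- which has a single child task (AFTER allows one predecessor here).
CREATE OR REPLACE STREAM stg_customer_stream ON TABLE stg_customer;

-- Root task: fires only when the stream has new rows, calls a generic
-- hub loader proc (assumed to exist, taking target + source names).
CREATE OR REPLACE TASK load_hub_customer
  WAREHOUSE = dv_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('STG_CUSTOMER_STREAM')
AS
  CALL load_hub('HUB_CUSTOMER', 'STG_CUSTOMER_STREAM');

-- Child task: runs after the hub load, calls an assumed sat loader.
CREATE OR REPLACE TASK load_sat_customer
  WAREHOUSE = dv_wh
  AFTER load_hub_customer
AS
  CALL load_sat('SAT_CUSTOMER', 'STG_CUSTOMER_STREAM');
```

Multiply this by every source table and every hub/link/sat it feeds and you get the task count I'm worried about; the alternative would collapse the chain into one task whose proc loops over the targets itself.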