In this post, How to start one Cloud Flow for multiple assets with the Split On setting, I would like to show you one toggle which might change your way of working 🤩
Thanks, Tomasz Poszytek for the motivation to write it down 😁
Introduction
Split On is turned on by default in Cloud Flows with automated triggers, like SharePoint's When an item is created or When an item is modified:

With this setting On, Cloud Flows with automated triggers run once for every item or document that has been created or modified.
Hint: Learn how to use Trigger Conditions to add additional criteria for starting your Cloud Flows.
Change
The fun starts when you turn Split On Off:

With that, your Cloud Flow instances will no longer start for each single item, but will run for multiple assets at once 🤯
Example
I made a simple example of how this works.
I created a list called BulkAction and uploaded 14 countries (in Polish 😜) using Excel and Edit in grid view, out-of-the-box Microsoft Lists functionality:

Next, when I checked my Cloud Flow, I saw only a single execution of the process and not 14 (one for each item created).

When I opened the Cloud Flow run history, I saw 14 items processed in the loop:

Working with data
Turning Split On Off requires a small change in how you configure the Cloud Flow, as the trigger outputs are now an array with information about ALL items or documents created between the previous run of the Power Automate timer job and the current one:

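To picture the difference, here is a minimal Python sketch of what the batched trigger body roughly looks like with Split On turned Off. The exact shape and the field names (ID, Title) are assumptions for illustration, not the real SharePoint schema:

```python
# Hypothetical, simplified shape of the trigger outputs with Split On Off:
# instead of a single item, the body carries a "value" array with every
# item from the batch. Field names here are illustrative only.
trigger_body = {
    "value": [
        {"ID": 1, "Title": "Polska"},
        {"ID": 2, "Title": "Niemcy"},
        {"ID": 3, "Title": "Francja"},
    ]
}

# One Flow run now covers the whole batch of created items:
items = trigger_body["value"]
print(len(items))  # number of items handled by this single run
```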
1.) Loop
One way of working with the data is to use an Apply to each loop. As the loop's input parameter, we add the value output from our trigger. After that, we can access all the properties of each item or document; we can find them in the dynamic content:

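The loop pattern above can be sketched in Python like this. The item fields are assumptions, not the real SharePoint column names:

```python
# A sketch of the "Apply to each" pattern: loop over the trigger's "value"
# array and work with each item's properties. Field names are illustrative.
value = [
    {"ID": 1, "Title": "Polska"},
    {"ID": 2, "Title": "Niemcy"},
]

processed = []
for item in value:  # Apply to each, with "value" as the loop input
    # Inside the loop every property of the current item is available,
    # just like picking it from the dynamic content pane.
    processed.append(f"{item['ID']}: {item['Title']}")

print(processed)
```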
2.) Data Operations
The second way that came to my mind is to use Data Operations actions like Select:

In Select, as the From property we use the value output of the trigger, and then we can provide a mapping of all the items' or documents' properties.
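In Python terms, the Select action behaves roughly like a mapping over the array. Again, the field names are illustrative assumptions:

```python
# A sketch of the Select data operation: "From" is the trigger's "value"
# array, and the mapping reshapes each item into the columns we need.
value = [
    {"ID": 1, "Title": "Polska", "Modified": "2021-01-01"},
    {"ID": 2, "Title": "Niemcy", "Modified": "2021-01-02"},
]

# Mapping: keep only the properties we care about, under new names.
selected = [{"Country": item["Title"], "Id": item["ID"]} for item in value]

print(selected)
```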
Considerations
Timer job execution
I think the most important consideration is when the timer job runs. We have no control over that, and we cannot be 100% sure when it will run.
Batch size up to 50
I also noticed during testing that when I uploaded 52 countries in my example, the Flow executed twice: one batch with 50 items and a second with only 2 items.
In further testing I noticed that, when using Edit in grid view, the batches were of irregular size, but always up to 50 items in a single batch!
Even when I uploaded all 194 countries via a Flow, it split the data into four batches (50, 50, 50, 44).
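The batching behaviour I observed can be sketched like this. Note that the 50-item cap is what I saw in my tests, not a documented limit, and real batch sizes were irregular:

```python
# Simulate how 194 created items were handed to the Flow in batches of up
# to 50 items each (the cap observed in my tests, not a documented limit).
BATCH_SIZE = 50
items = list(range(194))  # stand-ins for the 194 countries

batches = [items[i:i + BATCH_SIZE] for i in range(0, len(items), BATCH_SIZE)]

print([len(b) for b in batches])  # -> [50, 50, 50, 44]
```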
Summary
I hope you liked How to start one Cloud Flow for multiple assets with Split On 😁 Please make sure to review the considerations when planning your solution, as there might be cases where a scheduled trigger would work better.
In my opinion, Split On is a neat setting 😁 I have used this approach in a couple of solutions, and it worked nicely 😍 Please let me know your ideas on how you use it 😁