I have been thinking lately about how to make an algorithm slightly more accurate, but first I need to understand how algorithms are executed.

With my moderate programming skills, I am assuming that if you import 1-minute data, the algorithm is triggered/executed every time a new minute bar arrives. For example, the algorithm runs at 10:30am, 10:31, 10:32, etc.
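Here is a minimal sketch of that event-driven model as I understand it. All of the names (Bar, Algorithm, run_backtest) are hypothetical, not any specific platform's API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Bar:
    time: datetime   # bar close time, e.g. 10:30, 10:31, 10:32
    close: float

class Algorithm:
    def on_data(self, bar: Bar) -> None:
        # Strategy logic goes here; called once per incoming bar.
        print(f"triggered at {bar.time}: close={bar.close}")

def run_backtest(algo: Algorithm, bars: list[Bar]) -> None:
    # The engine walks the bars in time order and fires the handler
    # on each one: this is the "triggered every minute" behavior.
    for bar in bars:
        algo.on_data(bar)
```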

For a simple example, let's say we have two EMAs: one with a 5-minute period and the other with a 30-minute period. Buy when the 5-min crosses above the 30-min, and sell when it crosses below.

With the algorithm updating every minute, we only know that the two EMAs crossed with minute resolution (when the program updates at, say, 10:32, you learn whether the 5-minute EMA crossed over/under the 30-minute EMA at some point in the last 60 seconds).
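This is a sketch of that 5/30 crossover at minute resolution. The EMA smoothing formula and the sign-change crossover test are standard; the class and method names are hypothetical. Note that with one update per minute bar, a cross can only be detected at the bar close (e.g. 10:32), even if it actually happened mid-minute:

```python
from datetime import datetime

class EMA:
    """Incremental exponential moving average with period-based smoothing."""
    def __init__(self, period: int):
        self.alpha = 2.0 / (period + 1)
        self.value = None

    def update(self, price: float) -> float:
        # First price seeds the average; afterwards apply standard smoothing.
        self.value = (price if self.value is None
                      else self.alpha * price + (1.0 - self.alpha) * self.value)
        return self.value

class EmaCrossover:
    def __init__(self):
        self.fast, self.slow = EMA(5), EMA(30)
        self.prev_diff = None

    def on_minute_bar(self, time: datetime, close: float) -> None:
        diff = self.fast.update(close) - self.slow.update(close)
        if self.prev_diff is not None:
            if self.prev_diff <= 0.0 < diff:
                print(f"{time}: 5 EMA crossed above 30 EMA -> buy")
            elif self.prev_diff >= 0.0 > diff:
                print(f"{time}: 5 EMA crossed below 30 EMA -> sell")
        self.prev_diff = diff
```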

Is it possible to have the program run on every tick or second so that you know the exact moment the two EMAs cross? If the EMAs crossed at 10:31:24, you would know almost instantly, instead of 36 seconds later at the next minute update (10:32). The EMAs would also be calculated from significantly more data points rather than only 5 and 30 (the 5-minute EMA would use data from 10:26:24 to 10:31:24).

I figured this could make algos slightly more accurate, but I am not sure how to program it. My first thought was to import tick/second data (assuming the algo updates every time data comes in) and then consolidate it into 5-min and 30-min bars to get the two EMAs, but I feel that approach waits for a full bar of data before outputting a result. I would think you need a moving-window-style consolidator, something like the sketch below.
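This is one way the per-second version might look, assuming the framework can deliver second-resolution data to the handler (names are again hypothetical). Each EMA period is rescaled to seconds (5 min = 300 s, 30 min = 1800 s), which is one common way to keep the indicator comparable while updating on every second, so the cross can be flagged at, say, 10:31:24 rather than at the next minute close. Because an EMA is recursive, it does not actually need a stored window of past prices; the deque below just illustrates the moving-window view of the last 5 minutes of data:

```python
from collections import deque
from datetime import datetime, timedelta

class SecondResolutionCrossover:
    def __init__(self):
        self.fast_alpha = 2.0 / (300 + 1)    # 5-minute EMA, period in seconds
        self.slow_alpha = 2.0 / (1800 + 1)   # 30-minute EMA, period in seconds
        self.fast = None
        self.slow = None
        self.prev_diff = None
        self.window = deque()                # (time, price) for the last 300 s

    def on_second(self, time: datetime, price: float) -> None:
        # Update both EMAs on every incoming second of data.
        self.fast = (price if self.fast is None
                     else self.fast_alpha * price + (1 - self.fast_alpha) * self.fast)
        self.slow = (price if self.slow is None
                     else self.slow_alpha * price + (1 - self.slow_alpha) * self.slow)

        # Maintain a 5-minute moving window (e.g. 10:26:24 to 10:31:24).
        self.window.append((time, price))
        while self.window and self.window[0][0] < time - timedelta(seconds=300):
            self.window.popleft()

        diff = self.fast - self.slow
        if self.prev_diff is not None and (self.prev_diff <= 0) != (diff <= 0):
            side = "above" if diff > 0 else "below"
            print(f"{time}: 5-min EMA crossed {side} 30-min EMA "
                  f"({len(self.window)} points in the 5-min window)")
        self.prev_diff = diff
```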

Any feedback on this would be great. I am trying to build a better understanding of how this program works while working towards a successful algorithm.