I'm trying to add a function that does a bit of risk management for me. Currently I have the snippet below. However, I'm seeing that sometimes the function returns an absurdly small amount, like 17, when it should be returning something on the order of a few million.

Upon debugging, I further find that even if self.MaxLossAbsolute = 20k and 2*self.ATR_Now is 0.0040, adding them together gives 20,000.004, which is what you'd expect. But when I divide them in "self.MaxLossAbsolute / 2 * self.ATR_Now", the returned figure somehow gets screwed up.
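For reference, here's a minimal repro of the discrepancy with plain floats, using the 20k and 0.0040 values above (max_loss and atr are hypothetical stand-ins for self.MaxLossAbsolute and self.ATR_Now):

```python
max_loss = 20_000.0  # stand-in for self.MaxLossAbsolute
atr = 0.0040         # stand-in for 0.5 * (2*self.ATR_Now), i.e. self.ATR_Now

# Python evaluates / and * left to right at equal precedence,
# so this parses as (max_loss / 2) * atr
left_to_right = max_loss / 2 * atr

# Explicit parentheses force division by the full denominator
grouped = max_loss / (2 * atr)

print(left_to_right)  # 40.0
print(grouped)        # 2500000.0
```

The first form gives the absurdly small number, the second the expected few million.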

Anyone have any ideas what's going on?

def CalculateTradeSize(self):
    self.MaxLossAbsolute = self.Portfolio.Cash * 0.02
    self.MaxLossAbsolute + 2*self.ATR_Now

    return self.MaxLossAbsolute / 2 * self.ATR_Now