How do I count the number of floating point operations in an algorithm?
First, write down your algorithm as pseudocode. Then count the arithmetic operations the algorithm performs (per iteration, if it loops). All four operations (addition, subtraction, multiplication, and division) are usually counted the same, i.e. as 1 flop each. This is not exactly accurate, since a multiplication is built from several additions and a division from several multiplications when actually executed by a computer. However, since we are only after an estimate, it is reasonable to assume that, on average, all operations cost the same.
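As a concrete illustration, here is a minimal sketch of this counting rule applied to a naive matrix-vector product `y = A x`. The function name, the flop counter, and the use of plain nested lists are illustrative choices, not part of any standard library; the point is just that each inner-loop step contributes one multiplication and one addition, so the total comes out to `2 * n**2` flops, matching what you would get by counting on paper.

```python
def matvec_with_flop_count(A, x):
    """Compute y = A x naively and tally arithmetic operations,
    counting every +, -, *, / as one flop (hypothetical helper)."""
    n = len(A)
    y = [0.0] * n
    flops = 0
    for i in range(n):              # one pass per row of A
        acc = 0.0
        for j in range(n):
            acc += A[i][j] * x[j]   # 1 multiplication + 1 addition = 2 flops
            flops += 2
        y[i] = acc
    return y, flops

A = [[1.0, 2.0],
     [3.0, 4.0]]
x = [5.0, 6.0]
y, flops = matvec_with_flop_count(A, x)
print(y, flops)   # [17.0, 39.0], and 2 * n**2 = 8 flops for n = 2
```

Running the counter alongside the computation like this is only for checking your hand count; for the estimate itself you would normally just read the `2 * n**2` off the loop structure.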