An algorithm is a step-by-step procedure for answering a question: it processes input data to produce information. The input may be small or large, and running an algorithm on a computer costs both memory and time, so the less of each it uses, the better. Big O notation is a tool for gauging an algorithm's efficiency: it describes how the time or memory an algorithm needs grows as the size of the input data set grows.
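To make the idea concrete, here is a minimal Python sketch (an illustration, not part of the original text) that counts the steps two simple algorithms take. The function names and the step counter are assumptions for demonstration: a worst-case linear search grows in proportion to the input size, O(n), while comparing every element with every other grows quadratically, O(n²).

```python
def linear_search_steps(items, target):
    """Worst-case O(n): may have to examine every element once."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def pairwise_steps(items):
    """O(n^2): compares every element against every element."""
    steps = 0
    for a in items:
        for b in items:
            steps += 1
    return steps

for n in (10, 100):
    data = list(range(n))
    # Searching for the last element shows the worst case.
    print(n, linear_search_steps(data, n - 1), pairwise_steps(data))
```

Doubling the input size roughly doubles the linear count but quadruples the quadratic one; Big O captures exactly this growth pattern, independent of any particular machine's speed.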