Reading-Notes

Big O

There is definitely a relation between the way you solve a problem (your algorithm) and the time it takes to complete or the size of the output.

This relation can be described in five distinct ways, illustrated in the sketch after this list:

  1. Constant O(1):

    • It will always consume the same time or size no matter what the input is.
  2. Linear O(N):

    • The time or size of output will grow (or shrink) at a constant ratio relative to the input.
  3. Squared O(N^2):

    • The time or size of output will be related to the square of the input; you can extend this to cubed O(N^3) and so on.
  4. Exponential O(2^N):

    • The time or size of output will double with each additional unit of input.
  5. Logarithmic O(log N):

    • The remaining input is halved with each step, so the time or size grows very slowly as the input grows.
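To make the five classes concrete, here is a minimal Python sketch; the function names and the `items`/`target` parameters are only illustrative, not from the source notes:

```python
def constant(items):
    # O(1): one step regardless of input size
    return items[0]

def linear(items):
    # O(N): one pass over the input
    total = 0
    for x in items:
        total += x
    return total

def quadratic(items):
    # O(N^2): a nested pass for every element
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs

def exponential(n):
    # O(2^N): naive Fibonacci, the work roughly doubles with each extra unit of n
    if n < 2:
        return n
    return exponential(n - 1) + exponential(n - 2)

def logarithmic(sorted_items, target):
    # O(log N): binary search halves the remaining input on each step
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```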

The time or size can then be used to judge the efficiency of the algorithm.

Names and Values in Python

Anyone who comes from another language (like me) will think of data in terms of call by value or call by reference. In Python things are a little different: when you assign a value to a variable, the value gets its own place in memory and keeps it until it is mutated or nothing is referencing it any more (then it is deleted).

This means that setting a variable equal to another variable does not make a copy of its value; both names point to the same value. The trick is that when you reassign one of those variables, the original value stays unmutated, and a new place in memory is created for the new value, referenced by the reassigned variable.
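A small sketch of that behaviour using `id()`; the variable names here are arbitrary:

```python
a = [1, 2, 3]
b = a                    # no copy is made; b points at the same list object
print(id(a) == id(b))    # True: both names reference one object

b = [4, 5, 6]            # rebinding b creates a new object at a new address
print(id(a) == id(b))    # False: a still references the original list
print(a)                 # [1, 2, 3] -- the original value is untouched
```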

Another important thing is that reassigning a variable to a new value makes the older value disappear unless another variable is still referencing it. This matters when using functions: a function stacks up new local variables and destroys them when it finishes executing, so any value assigned to those stacked variables disappears unless it is referenced by another variable.
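A short illustration of that point, using made-up function names `build_list` and `throw_away`:

```python
def build_list():
    local = [1, 2, 3]    # bound to a local name on the function's frame
    return local         # returning it keeps the object referenced

kept = build_list()      # the list survives because `kept` now references it
print(kept)              # [1, 2, 3]

def throw_away():
    temp = [4, 5, 6]     # referenced only by the local name `temp`
    # no return: once the frame is discarded, nothing references the list
    # and it becomes eligible for deletion

throw_away()             # the [4, 5, 6] list is gone after this call
```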