Tuesday 1 April 2014

Week 11 - Sorting Algorithms

          There are many different sorting algorithms that arrange information into various orders with varying efficiency, from one of the least efficient (bubble sort) to one of the most efficient (quick sort). Each algorithm has its pros and cons: some excel at lists of smaller sizes but are weaker when dealing with large lists, meaning that their runtime grows to a staggering amount in proportion to the size of the list, while other algorithms excel when the input is a large list. The order in which the information is arranged before sorting also affects runtime, depending on how the algorithm works through the list. It is important to know the worst case runtime of an algorithm. To do this, you calculate the time complexity of the algorithm and represent it using big O notation, which describes how a function's runtime grows with the size of its input.

[Figure: f(x) in blue and g(x) in red, illustrating that f(x) is in O(g(x)).] Big O notation can be used to describe the worst case runtime of a function's algorithm. For example, bubble sort has a quadratic worst case runtime, O(n**2), which means that the runtime increases quadratically in proportion to the size of the list. If you analyze the code for bubble sort:
def bubbleSort(alist):
    # Outer loop: each pass bubbles the largest remaining value to the end.
    for passnum in range(len(alist)-1,0,-1):
        # Inner loop: compare each adjacent pair in the unsorted portion.
        for i in range(passnum):
            if alist[i]>alist[i+1]:
                temp = alist[i]          # swap the out-of-order pair
                alist[i] = alist[i+1]
                alist[i+1] = temp

         You will see that the outer loop runs on the order of n times for a list of size n, and for each pass the inner loop also runs on the order of n times, therefore n * n = n**2, so the runtime at worst is O(n**2). We care most about the highest exponent: for example, if the runtime is n**2 + 3n + 6, we say the runtime is O(n**2). Selection sort improves on bubble sort by making only one swap per pass, repeatedly moving the largest remaining value into its final position, but its runtime is still O(n**2). So is the worst case runtime of insertion sort, which inserts each value into its correct place in the sorted portion as it works from beginning to end. Shell sort is also O(n**2) at worst; it splits a list into sublists and sorts the sublists before piecing the list together again. Merge sort is a recursive algorithm that repeatedly splits a list in half, sorts the smaller lists, and merges the sorted halves back together, with a worst case runtime of O(nlogn). Quicksort improves on merge sort in that it does not need additional storage: it uses a pivot point, a value that is used to partition the list, and then calls itself recursively to sort the smaller partitions. Quicksort is O(nlogn) on average and often faster in practice than other O(nlogn) algorithms, although its worst case runtime, O(n**2), occurs only rarely, with unlucky pivot choices. There is also Timsort (O(nlogn)), a hybrid sorting algorithm that combines insertion sort and merge sort. You can also describe a lower bound on the runtime (represented by big omega) and a tight bound (represented by big theta), which come up when discussing best and average cases. However, if someone needs a sorting algorithm to sort a lot of information at once or in various different orders, then an algorithm is only as good as its worst case.
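To make the halving in merge sort and the pivot step in quicksort concrete, here is a minimal textbook-style sketch of both (not the lecture's exact code). Note that this easy-to-read quicksort builds new lists, so unlike the in-place version described above, it does use extra storage:

def merge_sort(alist):
    # Base case: a list of 0 or 1 items is already sorted.
    if len(alist) <= 1:
        return alist
    mid = len(alist) // 2
    left = merge_sort(alist[:mid])    # recursively sort each half...
    right = merge_sort(alist[mid:])
    merged = []                       # ...then merge the sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one side may have values left over
    merged.extend(right[j:])
    return merged

def quick_sort(alist):
    if len(alist) <= 1:
        return alist
    pivot = alist[0]                  # a naive pivot choice; consistently
    smaller = [x for x in alist[1:] if x < pivot]    # bad pivots give the
    larger = [x for x in alist[1:] if x >= pivot]    # rare O(n**2) worst case
    return quick_sort(smaller) + [pivot] + quick_sort(larger)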

Saturday 22 March 2014

Slog#10

             This week I continued to learn about many sorting methods, from the most optimal to the worst. In lecture, we compared runtimes of various sorting methods and determined the runtime at the best case and the worst case. Comparing sorting algorithms, you see that some perform optimally under certain conditions, like a small or random list, while other sorting algorithms excel at sorting longer lists (for example, sorting algorithms that are O(nlogn) outperform those that are quadratic by a long shot). With this in mind, I always wondered if there could be a hybrid algorithm that takes the best characteristics of two algorithms. It turns out that Timsort does just this: it is a hybrid sorting algorithm derived from insertion sort and merge sort. At the worst case, Timsort is within big O of nlogn, and at the best case it is linear (within big O of n). In the lab this week, I had to return an inorder linked list that shows the tree from the furthest left node to the furthest right; to do this we used two helper functions within the main function and recursion. For the second part of the lab, we had to write a method for the longest path, which I found to be more challenging. This week was all about sorting algorithms.
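As a rough sketch of the lab's inorder idea, here is a hedged version that assumes nodes with value, left, and right attributes and returns a plain Python list rather than the linked list the lab required:

def inorder(node):
    # An empty subtree contributes nothing.
    if node is None:
        return []
    # Left subtree, then this node, then right subtree: leftmost to rightmost.
    return inorder(node.left) + [node.value] + inorder(node.right)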

Sunday 16 March 2014

         This week I completed my homework exercise #3, parts a and b. I found part b to be easier. For this I used 3 functions to help with the main function sum_to_deepest, which had to return the sum of the keys from the root to the deepest level's leaves in a TreeList. As for part a, the function make_tree had to construct a tree when given a preorder list and an inorder list. To start, I created the functions preorder and inorder, which took a TreeList as their parameter and returned the corresponding preorder and inorder list. This helped when making the example calls and helped to better visualize the tree. The function make_tree consisted of many conditions to satisfy all the simpler possibilities, and then recursion was used to satisfy the more complex trees. Part a took the longest because I had to understand exactly what was happening with diagrams. In the lab, I continued to alter linked lists by doing the same things in various different ways. In the lectures, we learned about sorting a list using various sorting methods, considering their best case and worst case and predicting how well they run with lists of different sizes.
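Here is a minimal sketch of the recursive core of make_tree, assuming a TreeList is a nested list [value, left, right] with None for an empty tree (an assumption; my actual solution handled the simpler cases with separate conditions):

def make_tree(preorder, inorder):
    if not preorder:
        return None
    root = preorder[0]            # preorder always visits the root first
    i = inorder.index(root)       # inorder values before i form the left subtree
    left = make_tree(preorder[1:i + 1], inorder[:i])
    right = make_tree(preorder[i + 1:], inorder[i + 1:])
    return [root, left, right]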

Sunday 9 March 2014

        This week we continued working with and manipulating linked lists. The lab this week seemed easier because I can more easily visualize the structure of a linked list. I find that the simplest way to understand linked structures is to visualize them in their connected, nested form: a linked list is made up of an item (value) and the remaining list (rest), and the same nesting works for trees, for example: BST(BTNode(8, BTNode(4, BTNode(2, None, None), BTNode(6, None, None)), BTNode(12, BTNode(10, None, None), BTNode(14, None, None)))). Another way to visualize a linked list with more clarity is: 
|1-|-->|3-|-->|98-|-->|5-|-->|77-|-->|4-|-->|X-|
However, to me, it is important to view the list as being linked starting from a head, so the first example is clearer. Visualization is very important as I work on Assignment 2 part 1, where I have to set up Tree classes without much implementation other than __init__, __repr__, and __eq__. This mainly involves planning the Tree. In lectures this week, we continued learning about linked lists and I learned a new way to represent a list: by flipping a tree on its side. I had continued practice with changing linked lists in various ways. 
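A minimal sketch of the item/rest view described above (the class and attribute names here are illustrative, not necessarily the course's):

class LinkedList:
    def __init__(self, item, rest=None):
        self.item = item    # the value stored in this node
        self.rest = rest    # the remaining list, or None at the end

# The boxes-and-arrows diagram above corresponds to:
lst = LinkedList(1, LinkedList(3, LinkedList(98,
      LinkedList(5, LinkedList(77, LinkedList(4))))))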

Saturday 1 March 2014

Recursion - Slog 7

        Recursion is the act of repeating a process until a conclusion is reached by meeting a specified requirement; otherwise it continues infinitely. In computer science, recursion is used to make a problem simpler by reducing it to a cleaner and more straightforward version of itself each time. A conclusion is reached when the smaller and smaller problem reaches a base case, a condition that exits the recursion. Without the base case, Python will call the function until it runs out of memory and the program crashes. I remember the function that was taught in CSC108, is_palindrome, which checks if a word is a palindrome (reads the same forward as it does backward). I found the same example using recursion:

def is_palindrome(word):
    # Base case: a word of length 0 or 1 is automatically a palindrome.
    if len(word) == 1 or len(word) == 0:
        return True
    else:
        # Recursive step: strip the matching outer characters and recurse.
        if word[0] == word[-1]:
            return is_palindrome(word[1:-1])
        return False

The base case in this function is the first if statement, which checks if the word is of length 0 or 1 and is therefore automatically a palindrome. Otherwise, if the character at the first index is equal to the character at the last, recursion steps in to make the word smaller. If the word is 'hello', the function returns False. If the word is 'racecar', the second condition is true and the string 'aceca' goes through the function again, then 'cec', then 'e' - this is where the base case is reached and the function returns True. This is how recursion makes the problem simpler and simpler each time until it can be dealt with easily by the first statement. I find that this example makes recursion easy to understand, especially for me because I am very new to this concept. However, there is a much easier way to write this function: simply reverse the given string and compare the result:

def isP(s):
    a = s[::-1]
    return a == s

        As the level of difficulty increases, the use of recursion in your code will simplify seemingly difficult tasks using repetition, as I found with the Tower of Hanoi from assignment 1. A recursive function makes a call to itself within the function body. When you visualize the execution of a recursive function, you will see that it goes through the function over and over until the base case is True. After the base case is reached, each call returns its result to its caller (and a function body that ends without an explicit return implicitly returns None), which makes that call's activation frame on the stack go away. This continues until all frames are gone. 
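For reference, a minimal sketch of the classic three-peg Tower of Hanoi (the assignment's four-stool version and its actual names differ):

def hanoi(n, source, target, spare):
    # Base case: zero cheeses means nothing to move.
    if n > 0:
        hanoi(n - 1, source, spare, target)   # clear the top n-1 onto the spare
        print('move a cheese from', source, 'to', target)
        hanoi(n - 1, spare, target, source)   # stack them back on the target

# hanoi(3, 'A', 'C', 'B') prints the 7 moves for three cheeses.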

Saturday 15 February 2014

Trees and Assignment One

    This week, I completed the first assignment - I also started it this week, which was a mistake, but it all worked out in the end. I spent too much time being stuck on part 4: when I ran GUIController, I couldn't move the cheeses in the animation. It turns out that I made cheese_location return a number of cheeses instead of a number of stools. Such a small mistake nearly drove me crazy. Moving on to step 5, I had a day left to finish the assignment. Tour.py wasn't any easier, but thankfully I made no minor errors, so I could move along at a steady pace. Thankfully I had time for the bonus question, which was interesting. To complete the very long method (same_strategy) I had to use a for loop within a for loop within a for loop to satisfy all the necessary conditions, and I was ecstatic to see it work! I do wish I had started this assignment sooner - starting early always begins as my goal. The lab was easy this week (thankfully). In this week's lectures we learned about trees! Fun stuff. I learned how to build trees and about tree traversals. Time to study for the test! Bye

Friday 7 February 2014

Week 5: Namespaces

    This week I learned about namespaces and the order in which Python searches for names: first local (within the function body, starting at the innermost function in cases where there are functions within functions), then nonlocal, then global, and finally built-in names. During lecture, I practiced predicting where Python looks for the names being called. I've learned to bind and rebind variables using statements like 'nonlocal', and how to unbind a name using 'del something', which erases the link between the name and its object. In this week's lab, I practiced tracing recursion and writing two recursive functions that offered more insight into the process of recursion.
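Here is a small sketch of that lookup order (the variable names are just illustrative):

x = 'global'

def outer():
    x = 'enclosing'
    def inner():
        nonlocal x        # rebind outer's x instead of creating a new local
        x = 'rebound'
        print(len(x))     # len is found last of all, in the built-in scope
    inner()
    print(x)              # prints 'rebound': inner changed outer's x

outer()
print(x)                  # prints 'global': the global x was untouched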