Thursday, March 27, 2014

Week 11

Hey, friends!

I don't want to sound all cheesy and stuff, but seriously, it's been one heck of a ride in csc148. I can't believe we're almost finished with this semester already. To be honest (like really honest, not suck-up honest, I swear), I've actually found the assignments really interesting to do. Most of them make me think outside the box (like brain exercise. ugh, exercise...). There are always so many ways to get something to work, but at the end of the day it all boils down to how fast you can get the job done.

Many people tread long, winding paths to get something done, but the first commandment of computer science is "Always be lazy". By this I mean doing something in the least amount of time/steps, not sitting on your couch munching on chips (please don't hold me responsible if people start comparing pot bellies to determine who's the real master of computer science). In order to be properly lazy, we first need to know what efficiency is.

Efficiency is the extent to which resources are used to accomplish a task. In computer science, those resources are memory and runtime. We always want to "be lazy", so we always want to find a way to do something with the least amount of work. This means we need to find the most 'efficient' runtime (the fewest steps to accomplish said task) while taking up the least amount of space. For now, we'll talk about runtime, because it's the one we've been taught to control the most in this course.

We encountered efficiency in our first assignment, The Tower of Anne Hoy, where we had to come up with the most efficient algorithm to solve the puzzle. It was a relatively large portion of our mark, so my partner and I spent gruelling days pondering over it, drawing pictures and tracing code. In the end, we thought we'd found the most efficient way of doing it (17 steps on 6 cheeses and 4 stools), but our efforts were of no avail for games with larger numbers of cheeses and stools.
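If you're curious where that 17 comes from, here's a minimal sketch (not our actual assignment code, just the standard Frame-Stewart-style recurrence for four pegs) that counts the minimum number of moves:

```python
# Sketch of the Frame-Stewart-style move count for 4 stools: move i
# cheeses aside using all 4 stools, solve the remaining n - i cheeses
# like a classic 3-stool Tower of Hanoi, then move the i cheeses back.
def moves_3(n):
    """Minimum moves for n cheeses on 3 stools: 2**n - 1."""
    return 2 ** n - 1

def moves_4(n):
    """Minimum moves for n cheeses on 4 stools."""
    if n <= 1:
        return n
    return min(2 * moves_4(i) + moves_3(n - i) for i in range(1, n))

print(moves_4(6))  # 17 -- matches the 17 steps for 6 cheeses, 4 stools
```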

Efficiency is usually measured with the big-Oh of a function, which is really just a rough estimate of an upper bound on the runtime of an algorithm. For example, an algorithm with a runtime of O(n^2) will quadruple in runtime when the list doubles in size (because (2n)^2 = 4n^2, four times as much). Most of the time we'll use big-Oh to estimate the worst case of an algorithm. Big-Oh is an extremely important concept in sorting.

Sorting is a problem drenched in efficiency. People are always trying to come up with the most efficient way of sorting a list, which simply means arranging it into ascending order, or into some particular sequence. So far we have encountered several O(n^2) sorting algorithms (i.e. bubble sort, insertion sort and selection sort, plus quicksort in its worst case), and an O(nlog(n)) algorithm (merge sort).

The O(n^2) algorithms usually involve a double loop, and the O(nlog(n)) algorithms usually involve repeatedly dividing the list in two.
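To make the "double looping" concrete, here's a minimal sketch of selection sort, one of the O(n^2) crowd (my own illustration, not lab code):

```python
# The outer loop picks a position, the inner loop scans the rest of the
# list for the smallest remaining item: roughly n * n comparisons.
def selection_sort(lst):
    for i in range(len(lst)):
        smallest = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        lst[i], lst[smallest] = lst[smallest], lst[i]

L = [5, 2, 4, 1, 3]
selection_sort(L)
print(L)  # [1, 2, 3, 4, 5]
```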

I'm going to dive into some deep logical explanations, so if you don't want to read my long rambling thought process for determining big-Ohs, just skip to the end of my example explanations (I HAVE MADE THE END CLEAR):

For example, if a sorting algorithm divided a list into fifths and then linearly traversed them one by one, the algorithm would be O(n) overall. The list is always divided into 5 sections no matter how long it is, so the dividing step is O(1). But linearly traversing each fifth of the list scales with the list's length: the overall runtime is 5 times n/5 steps (one n/5 for each of the 5 sections), which works out to just n. And even if you counted it as 5n steps, constant factors get dropped and 5n is still O(n), so in the end we would say that the algorithm is O(n).
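A tiny sketch of that made-up "fifths" example, just to show where the counting comes from:

```python
import math

# Cutting the list into 5 chunks is constant bookkeeping, and walking
# every chunk once touches each of the n items exactly one time: O(n).
def traverse_in_fifths(lst):
    chunk = math.ceil(len(lst) / 5)                  # O(1) setup
    for i in range(5):                               # always 5 sections
        for item in lst[i * chunk:(i + 1) * chunk]:  # n items in total
            print(item)
```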

As such, merge sort has an O(nlogn) runtime. It keeps dividing the list into two halves until each sublist has only one element left, which gives about logn levels of splitting. Then it merges pairs of sorted sublists back into bigger sorted lists, eventually combining them into the final list; the merging at each level touches every element, so it's O(n) per level. From this, we know we do O(n) work for each of the logn levels, which means the final runtime will be O(nlogn).
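Here's a minimal sketch of merge sort itself (the standard recursive version, not any particular lecture code):

```python
# Split in half (log n levels of recursion), then merge the two sorted
# halves back together (O(n) work per level).
def merge_sort(lst):
    if len(lst) <= 1:              # base case: already sorted
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(left, right):
    """Combine two sorted lists into one sorted list in linear time."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```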

Another example, quicksort, has a runtime of O(nlogn) in its best case: if the pivot picked splits the list roughly in half every time, there are only about logn levels of splitting (halving down to size 1 takes logn steps), and the linear partitioning work at each level multiplies with that to give O(nlogn). In its worst case, it has an O(n^2) runtime. This happens when the pivot only separates off the single largest or smallest element each time, so there are n levels of O(n) work, and the overall runtime is O(n*n) => O(n^2).
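And a minimal sketch of quicksort (again, the textbook version, my own illustration):

```python
# Pick a pivot, partition smaller items left and bigger items right,
# and recurse. A good pivot halves the list (log n levels); a terrible
# pivot peels off one element per call (n levels: the O(n^2) worst case).
def quicksort(lst):
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]
    bigger = [x for x in lst[1:] if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(bigger)

print(quicksort([3, 6, 1, 5, 2, 4]))  # [1, 2, 3, 4, 5, 6]
```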
~~~~~~~~~~~~~~End of examples explanations~~~~~~~~~~~~~

Sorting and efficiency are used in everyday life. If you're into online shopping, things must be kept sorted in order for you to find them easily. If you've just submitted an assignment, the marks and papers must be kept in order for your professor or TA to easily find them, grade them, then correctly enter your marks back into MarkUs.

We are heavily reliant on sorting and efficiency, because wasted time is something we can never get back.

Friday, March 14, 2014

Week 9 -- Binary Search Trees

So... binary search trees. I keep picturing a pyramid of inverted slingshots when I think of these things, but I feel like I probably shouldn't be doing that...

Anyway, binary search trees are super useful. They're basically binary trees where everything in the left subtree is less than the root node and everything in the right subtree is greater than the root node. This makes finding stuff really easy, because you only need to compare against a node, then ignore half of the tree. This means the runtime of searching a binary search tree is very short (O(logn), as long as the tree is reasonably balanced; a completely lopsided one degrades toward O(n)).

The node of a binary search tree (BST) should initialize its value plus the left and the right subtrees... because, well, each node is supposed to have two kids: one left kid and one right kid.
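Something like this minimal sketch (hypothetical names, not our actual lab code):

```python
# A node holds its value plus left/right kids, and searching just keeps
# throwing away the half of the tree that can't contain the item.
class BSTNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def contains(node, item):
    """Return True iff item is somewhere in the tree rooted at node."""
    if node is None:
        return False                       # fell off the tree: not here
    if item == node.value:
        return True
    if item < node.value:
        return contains(node.left, item)   # ignore the right half
    return contains(node.right, item)      # ignore the left half

root = BSTNode(5, BSTNode(3, BSTNode(1)), BSTNode(8))
print(contains(root, 3), contains(root, 7))  # True False
```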

The lab this week was actually pretty challenging. Most of our lab group only managed to finish the first part, count_less. (Even then, people still felt the time constraints.)

What we had to do with count_less was count the number of items in the BST whose values were less than a given item. I think the most challenging part was determining which if and else branches we had to go through.

There are two cases: when the root exists and when it does not. If the root does not exist, you count nothing (well, for obvious reasons), and if the root does exist, there are two things you do (rough sketch after the list):
1) if the root's value is greater than or equal to the item, recurse on the left subtree (rerun the function itself on the left subtree)
2) if the root's value is less than the item, recurse on both the left and right subtrees (count to see if there are such items in either of the subtrees, because it's possible for either to happen), and add a count of one for the root itself.
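Here's my rough reconstruction of it, reusing the hypothetical BSTNode shape from the sketch above:

```python
def count_less(node, item):
    """Count how many values in the BST rooted at node are < item."""
    if node is None:
        return 0                               # base case: empty tree
    if node.value >= item:
        return count_less(node.left, item)     # only the left side can help
    return 1 + count_less(node.left, item) + count_less(node.right, item)
```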

Looking back, the eight lines of code probably shouldn't have taken that long, but I think it took us a long time to really understand BSTs, as well as to figure out how to properly return the value.

Thursday, March 6, 2014

Week 8 -- Linked List Structures

Hello again, friends. :)

This week we did a lot of linked list stuff again to help us build and/or improve our understanding of linked lists. This makes me feel like linked lists are definitely going to be on the exam, so self-note right here to go back and study the heck out of them.

So there are technically two ways of thinking about linked lists. One way is to think of each item closer to the front as wrapping the value that follows it, until there's nothing left to wrap. Literally think of a piece of candy with a billion wrappers around it: the outermost wrapper is the first element of the linked list, and the candy in the middle marks the end of the list (OM NOM NOM). The other way is to think of each node linking to the next one until it reaches an empty node, like we've been doing since last week.

Speaking of wrappers, I have to write down what a wrapper class is. I keep forgetting what it actually does (ugh!). A wrapper class is just something to keep track of information about the entire structure (as in, it makes sense of and links together the information provided by the class it encapsulates). For example, for a linked list, it keeps track of the front of the list, which makes prepending to that position easy, as well as the size of the list (which makes counting the number of nodes a lot easier). It's also where the functions should go.
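A minimal sketch of the idea (hypothetical names, not our actual class code):

```python
# LinkedListNode is the "candy wrapper" part; LinkedList is the wrapper
# class that remembers the front and the size of the whole structure.
class LinkedListNode:
    def __init__(self, value, nxt=None):
        self.value = value
        self.nxt = nxt          # the next node, or None at the end

class LinkedList:
    def __init__(self):
        self.front = None       # wrapper tracks the front...
        self.size = 0           # ...and the size of the whole list

    def prepend(self, value):
        """Stitch the old list onto a new node at the front."""
        self.front = LinkedListNode(value, self.front)
        self.size += 1
```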

Now let's talk about the functions that can go into the LinkedList wrapper class. Insertion is pretty straightforward: you literally just stitch the old list onto the new node that you'd like to prepend (that's the prepend method in the sketch above). Reversing, though, is a little bit confusing.

What the reversing function does is recurse down the list until the very last node; that node becomes the very first node of the new list. Then, coming back out of the recursion to the second-last node, it makes the new first node point to the old second-last node, making it the second node of the new list. It does this by taking the node after the current one (in the old order) and making that node point back at the current one, then cutting off the current node's old link (setting it to None), which makes the current node the back of the list for now. It does this over and over again until it's back at the very first item of the original list, which ends up pointing to None because there's no further iteration to hook anything onto it. That first item is then the new end of the list.
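In code, my understanding of that process looks roughly like this (assuming the LinkedListNode shape from the sketch above):

```python
def reverse(node):
    """Reverse the list starting at node; return the new front."""
    if node is None or node.nxt is None:
        return node                  # empty or single node: already reversed
    new_front = reverse(node.nxt)    # recurse all the way to the last node
    node.nxt.nxt = node              # make the old next node point back at us
    node.nxt = None                  # cut our old link: we're the back for now
    return new_front
```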

Here's my ugly drawing masterpiece worthy of envy from even the great Picasso himself depicting the reverse process:


SEE YOU GUYS NEXT WEEK!

Thursday, February 27, 2014

Week 7 - RECURSION!

Hi everyone!

This week is the big post about recursion. I know I've posted about recursion before but this post is going to sum[everything about recursion] (hehehe get it?)

Recursion, in computer science, is a method where the solution of a problem is built from the solutions of smaller problems that are pieces of the whole problem. For our purposes, it's basically a function calling itself repeatedly until a base case is reached.

To write recursion in Python, we call the function itself in its own function body; what we want each call to do is what differs between recursive functions. Quoting myself from week 5: "I feel like the one most essential thing in recursion is to always call the function itself in its own function body, and it will call and call again until it reaches a base case, in which case the result will be evaluated." Follow this rule and you pretty much have recursion in the BAG.

Then we can determine the solution of the main problem from the solutions of the base cases. Most of the time, we use recursion on things that are nested within themselves, especially data structures, and Trees are a prime example of this. (See the week 6 post for the full Tree explanation.)
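Since this post is going to sum[everything], here's the classic baby example of the rule (my own illustration): summing a nested list by calling the function on the smaller lists inside it.

```python
def sum_nested(lst):
    total = 0
    for item in lst:
        if isinstance(item, list):
            total += sum_nested(item)  # recursive case: a smaller list
        else:
            total += item              # base case: just a number
    return total

print(sum_nested([1, [2, 3], [4, [5]]]))  # 15
```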

This week, we encountered recursion in the labs through LinkedLists. A LinkedList is pretty much a tree where every node has a maximum of one child (think the Chinese one-child rule). The end of the list is reached when a node has no children. The problem in the labs this week was a queue taking the form of a LinkedList. Similar to week 2, I had trouble with this because I couldn't figure out how to keep a pointer on the back of the list for the enqueue function.

CONFESSIONS OF A COMPSCI GIRL: I couldn't figure out how to do enqueue this week, and my partner and I tried so many different things that wouldn't work. WE EVEN TRIED INDEXING THE LINKED LIST. WHY DID WE DO THAT? Lesson 2 in trees (and linked lists): don't index them. OMG don't do it, please please please, or else you'll face the wrath of the Computer Science gods (aka the Python shell), who will unleash the utmost horrifying consequence on you (EXCEPTION! OH GOD, NOT THE EXCEPTION)

So in the end the kind TA (Abdi, you're the best) explained everything to us and VOILA (I feel like I cheated, though, because I couldn't figure it out myself T___T). Basically, we keep a variable pointing at the last "node" of the LinkedList, and just attach new nodes onto that. REMEMBER THIS. I feel like this will help with recursion and data structures for, like, the rest of my life.
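My best reconstruction of the trick from memory (hypothetical names, same node shape as the week 8 sketch above): the queue remembers both the front and the back node, so enqueue never has to walk the whole list to find the end.

```python
class LinkedListNode:
    def __init__(self, value, nxt=None):
        self.value = value
        self.nxt = nxt

class LinkedQueue:
    def __init__(self):
        self.front = None
        self.back = None

    def enqueue(self, value):
        new_node = LinkedListNode(value)
        if self.back is None:          # empty queue: node is front AND back
            self.front = new_node
        else:
            self.back.nxt = new_node   # attach after the current back
        self.back = new_node           # the new node is now the back

    def dequeue(self):
        node = self.front
        self.front = node.nxt
        if self.front is None:         # queue emptied out
            self.back = None
        return node.value

q = LinkedQueue()
q.enqueue(1)
q.enqueue(2)
print(q.dequeue())  # 1 -- first in, first out
```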

TATA FOR NOW FRIENDS. See you all next week :) !

Thursday, February 13, 2014

Week 6 (PHOTOSYNTHESIS)

Hello once again, dear readers! This week we have a bit of a long post. Hard to summarize so much information about plants in such a short post right? ;)

So this week we delved even deeper into recursion. We learned about trees, leaves, edges and roots. Before you go and think it's about plants (for those of you who didn't attend lecture and think we suddenly turned into a biology course), it's actually about data structures.

If you think about it, we actually had an encounter with trees last week in the labs, with the SearchList. The 0th element was the root, and the 1st and 2nd elements were its children. Each of the elements could have had either more children (up to 2) or None (and None was a base case). Maybe this was Danny's way of preparing us, just in case some of us actually thought CSC148 was turning into a biology course.

But all in all, summing trees up: trees are made of nodes, and there's always one node that's referred to as the root (the one node with no parent). A node may or may not have children (and the nodes with no children are referred to as leaves), and parents are connected to children via edges. In the diagram below, we can see that 1 is the root, and 2, 3, 4 are its children. All the values are nodes, and each of the arrows connecting the nodes is an edge. Every node in a tree is itself the root of a subtree.
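Here's the diagram below rendered as a tiny code sketch, with a hypothetical Tree class: node 1 is the root, and 2, 3, 4 hang off it as its children.

```python
class Tree:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children if children else []

t = Tree(1, [Tree(2), Tree(3), Tree(4)])  # 2, 3, 4 have no kids: leaves
print([c.value for c in t.children])      # [2, 3, 4]
```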


(Source: http://new.math.uiuc.edu/math198/MA198-2009/lee522/pix/frame_tree_diagram.JPG)


Through learning about trees, we also got some practice with list comprehensions. I thought it was about time we got to use them: there are so many more things you can do with less code, and they're fast and fun to use.


(Source: YouTube.com)
Athene: What do you think about List Comprehensions, George?
HotShotGG: I THINK IT'S AWESOME! IT'S FAST, IT'S FUN.

Also, before I forget (new list comprehension functions we encountered in the lab):
Filter is a built-in function (we had to wrap its result in a tuple in the lab) that takes a function and an iterable, applies the function to each item in the iterable, and eliminates all elements for which the function evaluates to False.
Zip takes two lists and pairs the ith elements together from each list into a tuple. Each tuple then becomes the ith element of the resulting sequence. (Remember you have to assign the result to a variable so you can use it again.)
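A quick sketch of both, matching how we used them in the lab (in Python 3 they return lazy iterators, hence the tuple() around them):

```python
# filter keeps only the items where the function returns something truthy
evens = tuple(filter(lambda x: x % 2 == 0, [1, 2, 3, 4, 5, 6]))
print(evens)   # (2, 4, 6)

# zip pairs up the ith elements of the two lists into tuples
pairs = tuple(zip([1, 2, 3], ['a', 'b', 'c']))
print(pairs)   # ((1, 'a'), (2, 'b'), (3, 'c'))
```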


P.S.:
Pre-Order = Root first > Preorder of left subtree > Preorder of right subtree
In-Order = In-order of left subtree > Root > In-order of right subtree
Post-Order = Post-order of left subtree > Post-order of right subtree > Root last
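A little sketch of all three on a hypothetical binary tree node (value/left/right), which also happens to use the [a] + b + c list trick I talk about below:

```python
def preorder(node):
    if node is None:
        return []
    return [node.value] + preorder(node.left) + preorder(node.right)

def inorder(node):
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

def postorder(node):
    if node is None:
        return []
    return postorder(node.left) + postorder(node.right) + [node.value]
```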

ONE LAST THING THAT CONFUSED ME THIS WEEK, which I finally understand after some example code testing:
You can concatenate a value onto lists in an expression like ([a] + b + c), as long as the bare value is wrapped in brackets to make it a one-element list, because + only glues lists onto other lists. O:! I was wondering why some random line of code with all plus signs was returning a list, then I found the magic [] brackets for a list ._. ...


P.P.S: If you're looking for a TL;DR, sorry this is already a tldr version of this week >_<

Tuesday, February 4, 2014

Week 5

Hi again! :)

The tip of the iceberg is what I would probably call this week. We're putting to use the little bit of recursion we've learned so far: in the lab this week, pretty much every single exercise was recursion-based.

I thought it was rather exhilarating when I finally figured out how to code something, when I finally grasped the idea and the light bulb flicked on in my brain.

From what I've come across in this lab and the lectures, I'll share what I think is the key to recursion, for those of you reading this who feel like you're struggling with it (so far... I'm pretty noob too, but this is from noob to noob ;) ). The one most essential thing in recursion is to always call the function itself in its own function body; it will call and call again until it reaches a base case, at which point the result will be evaluated. Once I grasped that detail, everything kind of fell into place. An epiphany, if you must.
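The tiniest possible illustration of that rule (my own example, not from the lab):

```python
def factorial(n):
    if n == 0:                       # base case: stop calling, evaluate
        return 1
    return n * factorial(n - 1)      # the function calls itself

print(factorial(5))  # 120
```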

Anyway, I'll edit this again after Wednesday's lecture and after doing some work on assignment 1. :)

EDIT: Well, the global, local and nonlocal stuff was confusing; I'm going to go play around with the examples to try to understand it. I found the parent class things relatively easy to understand, though.

P.S. List comprehensions save lives.

SEE YOU NEXT WEEK!

Wednesday, January 29, 2014

Week 4

Hello once again, my dearest readers.

Time flies like an arrow, fruit flies like a banana. Here we are already treading through week 4 of the 2014 winter semester. The semester is almost 1/3 over, and it feels almost too calm this week.

Both the exercise and the lab this week consisted of writing unittests and catching errors in input for your code, as well as inheritance, where a subclass inherits from a superclass (lol yay lazy people, who writes new code, am I right?). The exercise was slightly challenging at first, but after a thorough reading of the handout and figuring out what the format of this kind of code should be, the program was simple to write. I find these exercises rather enjoyable, as they give me hands-on experience of how I should write the code. One can understand concepts rather easily, but putting concepts into practice is usually more difficult to accomplish.
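A bare-bones sketch of both ideas, with made-up class names (not the actual exercise code): a subclass inheriting from a superclass, plus a unittest that checks bad input raises an error.

```python
import unittest

class Shape:
    def __init__(self, area):
        if area < 0:
            raise ValueError('area cannot be negative')
        self.area = area

class Square(Shape):                   # Square inherits from Shape
    def __init__(self, side):
        super().__init__(side * side)  # reuse the superclass's code: lazy!

class TestShapes(unittest.TestCase):
    def test_square_area(self):
        self.assertEqual(Square(3).area, 9)

    def test_negative_area_raises(self):
        self.assertRaises(ValueError, Shape, -1)

if __name__ == '__main__':
    unittest.main()
```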

Also, I wanted to say that the labs are quite nice too; working with a partner is sometimes a lot less frustrating than working by yourself to solve a problem, because we have more brains to think with. Denise, if you're reading this, I just wanted to tell you you're a great lab partner and I'm glad we're working together! Thanks for pulling me back when I come across mental hurdles and (somehow) completely forget how to write classes because my mind just goes blank.

But this week...it's too calm for a week of computer science, SOMETHING FOUL MUST BE AFOOT!

It's the calm before the storm (war) named "Recursion", whose soldiers are like the Dead Army of Dunharrow. We students are like the Orcs who thought we'd win the Battle of Gondor (Recursion) easy peasy because we completely outnumber the enemy (I mean, it's only one concept, right? Recursion... pfffft.) BUT THEN, all of a sudden, Aragorn (Danny Heap) will come and summon all these undead soldiers to kill us all.

(Wait so does this mean our professor is the lost King of Recursion?)