Wednesday, April 2, 2014

Sorting and Efficiency


Sorting algorithms are among the most useful tools in a programmer's toolbox. It is often the case that a large amount of data will be entered for the computer to work with. If this data isn't in any particular order, it is far less useful. This is where sorting algorithms come in. A sorting algorithm is essentially a method of manipulating data to put it in a desired order. There are a couple of factors that indicate how well a sorting algorithm will perform, or what its efficiency will be.

When looking at an algorithm's efficiency, we are generally most concerned with the worst case, described using 'Big O' notation. That is, we ask how the number of steps grows for the input that takes the most work to complete. The function that models this behaviour can be deduced by determining which operations a given algorithm has to perform. The operations most commonly counted are comparisons between items and exchanges, or shifts, of the data. Different sorting methods will produce different counts for each, so when choosing a sorting method it is important to consider what type of data you're working with and how much of it there is.

An algorithm like quick sort would easily outperform bubble sort in almost all cases, but what if the data in question was already very nearly sorted? In that case, a bubble sort programmed to stop when it notices that no exchanges were made during a given pass would probably be more effective.
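Here is a minimal sketch of that early-exit idea (just an illustration, not taken from any course code):

```python
def bubble_sort(lst):
    """Sort lst in place, giving up early once a full pass makes no exchanges."""
    n = len(lst)
    for i in range(n - 1):
        swapped = False
        # After i passes, the largest i items have already bubbled to the end.
        for j in range(n - 1 - i):
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
                swapped = True
        if not swapped:
            break  # no exchanges this pass, so the list is already sorted
```

On data that is already nearly sorted, the early exit means only a pass or two is needed, so the work grows roughly linearly with the size of the list.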

A solid understanding of how different sorting strategies operate is crucial, although easily overlooked. If you know what the algorithm does, why do you need to know how it does it? It will do what you want it to, so why mess with what works? You might not care one bit. Indeed, a short list of items to be sorted won't cause you any trouble. On the other hand, a massive list could take the rest of your life to sort. While I personally have never dealt with an overly large data set, I have programmed (unintentionally, of course) inefficient algorithms for calculating things like the prime factors of a number. When no output has appeared even after checking back in half an hour, the importance of efficiency is apparent.

Efficiency is king when dealing with large sets of data. Most of the time that's why we are using a computer to handle the data in the first place. If you had a small amount of data, chances are you wouldn't even need one.

Image source: http://codingmash.com/tag/sorting/

Sunday, March 16, 2014

Regular Expressions


For assignment 2 we have been tasked with working on regular expressions and regular expression trees. Since I hadn't really heard anything about regular expressions, I decided to do some Googling before I began. It turns out regular expressions are quite useful. They allow for matching a whole variety of strings simply by using the correct expression. There are different types, and different languages that use regular expressions, but they are generally used in similar ways. They can be extremely useful for pattern matching, whether for something simple like looking for whitespace or for something as big as full sentences.

I spent some time struggling with the first function that I was working on: is_regex() – which determines whether or not a string is a valid regular expression. It is of course a recursive function and really doesn't seem that difficult at first glance, but there are some cases to consider which take a bit more work. The challenge is basically to determine the proper amount of nesting for parentheses inside a string. I tried a couple methods which worked on some test cases but not all of them. I ended up using a helper function to get what I needed done and it seemed like a pretty clear solution so I'm happy with that.
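I won't spell out the assignment's exact grammar here, but the general shape of the helper idea looks something like the sketch below. It assumes a made-up grammar where '0', '1', '2' and 'e' are regexes, r + '*' is a regex whenever r is, and two regexes can be joined with '|' or '.' inside parentheses; the helper's job is to find the operator sitting at nesting depth zero.

```python
def is_regex(s):
    """Return True iff s is a valid regex under the (assumed) grammar above."""
    if s in ('0', '1', '2', 'e'):
        return True
    if len(s) > 1 and s.endswith('*'):
        return is_regex(s[:-1])
    if s.startswith('(') and s.endswith(')'):
        inner = s[1:-1]
        i = top_level_operator(inner)
        return i != -1 and is_regex(inner[:i]) and is_regex(inner[i + 1:])
    return False


def top_level_operator(s):
    """Return the index of the '|' or '.' at parenthesis depth zero, or -1 if there isn't one."""
    depth = 0
    for i, ch in enumerate(s):
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
        elif ch in '|.' and depth == 0:
            return i
    return -1
```

Tracking the parenthesis depth is what lets the split happen only at the outermost operator, which is exactly the nesting problem described above.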

I'm not very far into the rest of the assignment so I better get to that and update about that next week!

Sunday, March 9, 2014

Programming in the wild...


This week we had an exercise on building trees, represented as lists, from their nodes given in a certain order, and on finding the sum of the values at the 'deepest' leaves. I found this to be the first exercise that I would describe as quite challenging. I'm not sure if it was because of my lack of knowledge about the techniques or a lack of programming intuition, but I definitely struggled with the two functions mentioned above.

I can't help but wonder if this is what programming in 'the wild' will feel like. Encountering something that wasn't just designed to be worked through as an exercise or assignment, but a real world problem that might not actually have a solution until I come up with one! It's kind of scary but at the same time exciting to think of being able to tackle these types of challenges.

In the end, after I devised a solution, the functions weren't actually all that complicated looking. Although, when you're sitting at the computer just watching the cursor blink, it seems like the solution might elude you for all eternity.

Without saying too much about the solutions to the problems: if you're looking to build a tree from an ordered list – slice it up! If you're looking to do something to the values in a tree – keep track of them during recursion!
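To make those two hints a little more concrete, here is a rough sketch of both ideas. It is not the exercise's actual code, and it assumes a simple nested-list representation where a tree is [value, left_subtree, right_subtree] and an empty subtree is None.

```python
def build_tree(ordered):
    """Build a roughly balanced tree, as nested lists, by slicing around the middle element."""
    if not ordered:
        return None
    mid = len(ordered) // 2
    return [ordered[mid], build_tree(ordered[:mid]), build_tree(ordered[mid + 1:])]


def deepest_sum(tree, depth=0):
    """Return (max_depth, sum of leaf values at that depth), tracking depth while recursing."""
    value, left, right = tree
    if left is None and right is None:
        return depth, value
    results = [deepest_sum(child, depth + 1)
               for child in (left, right) if child is not None]
    max_depth = max(d for d, _ in results)
    return max_depth, sum(s for d, s in results if d == max_depth)
```

The depth parameter is the 'keeping track' part: every recursive call passes down one more level, so each leaf knows how deep it is when it reports back.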

Sunday, March 2, 2014

Recursion part 2


Recursion is: see previous post on recursion. No, really; in a nutshell, that's recursion! Recursion is having faith that a problem has already been solved and letting it solve itself using that knowledge.

Ok, so there's a little more to it than that but since I had already made a post involving recursion, it seemed appropriate to define it that way. Recursion is a very powerful tool in any programmer's toolbox.

When a recursive function or method is executed, there is generally a condition to be met or a check to determine whether the recursive case or the base case should be executed. If the recursive case is needed, the function calls itself, and each of these calls is kept on the call stack. This continues until all that's left is the base case. Then the base case is resolved. Once this happens, all the previous recursive calls can be resolved with respect to that base case.

The reason this is so brilliant is that a lot of problems actually turn out to be the same problem wrapped up within itself many times. This variety of problem can seem quite large and daunting when viewed as a whole. However, the power of recursion allows for a very useful way of modelling such a problem.

A common example of recursion that many people will have seen is factorials in math. Each factorial is the product of its number and the previous factorial: n! = n × (n - 1)!. This is a recursive structure because every factorial can be broken down in that way.
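Written out in Python, the base case / recursive case split described above looks something like this:

```python
def factorial(n):
    """Return n!, using the definition 0! == 1 and n! == n * (n - 1)!."""
    if n == 0:
        return 1                     # base case: the smallest version of the problem
    return n * factorial(n - 1)      # recursive case: trust that the smaller problem is solved
```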

At first I struggled with the concept of recursion, but being exposed to it repeatedly has greatly helped my comprehension. The weekly labs in particular have been a great opportunity to learn recursion with the help of peers and teaching assistants.

Sunday, February 16, 2014

Trees


Well, now I realize that the photo of the tree from my previous post would have been better suited here. Oh well!

Conceptually, trees are not an incredibly difficult structure to grasp. From a young age most people have encountered a tree structure of some sort during their daily lives. Probably the most common and boring example is arithmetic expressions from math. Even if these are not thought of by the individual as a tree, they are in fact a type of tree.

Trees obviously come in many different forms, although I'm mainly interested in the computer science variety! Trees in computer science are used to represent structured collections of data. They can be thought of as an upside-down version of a real tree - with the roots at the top. Except, instead of having a vast system of roots, a computer science tree has only one root. From this root there can be any number of branches, each leading to another smaller tree.

The reason trees are so useful is their ability to link data together in a meaningful way. Consider a collection of names. Sure, you can store all those names in a Python dictionary or list, but if you store them using a tree structure, the very way they are organized imparts an extra layer of implicit information.

Let's say you wanted to store all the names belonging to a particular family. You could contain them within a list, even have them in some kind of order, but you would essentially just have a list of names. On the other hand, if you store those names using a tree, you could have, say, the great-grandfather at the top, his children below that, and their children below that. Hopefully that makes some sort of intuitive sense, considering that this type of tree is well known and referred to as a 'family tree'.
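A bare-bones version of such a tree might look like the sketch below (the class and the family members are made up for illustration, not taken from the course material):

```python
class Tree:
    """A tree: one root value and any number of child subtrees."""

    def __init__(self, value, children=None):
        self.value = value
        self.children = children if children is not None else []


# The structure itself encodes who is whose descendant.
family = Tree('Great-Grandpa', [
    Tree('Grandpa', [Tree('Dad', [Tree('Me')]),
                     Tree('Uncle')]),
    Tree('Great-Aunt'),
])
```

A flat list of those same names would lose all of that parent-child information.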

Of course, this description and these examples cover only the most basic and simplified versions of trees, but hopefully they paint a clear picture of what a tree is.

Sunday, February 2, 2014

Recursion and The Towers of Anne Hoy


This week, and some of the previous ones, have been about trying to wrap our heads around the idea of recursion and writing recursive functions. We have been shown a variety of ways that recursive strategy and thinking can be applied to problems that may seem complex at first glance.

Nature has been using recursion since long before humans figured out what it was!
A tree grows by repeating the same basic process with slight variations.

When programming recursively, a base case is conceived; in other words, the problem is broken down into its smallest possible chunk. Then an operation is performed on this chunk. The same operation is performed again and again until the problem is solved. This way of deconstructing a problem can be very useful when dealing with a simple problem that has been scaled up. An example of this from our lectures is finding how many lists are nested within a list. Finding how many lists are in one list is fairly easy, but the fact that each list could contain lists of its own makes it difficult to count the lists within other lists. This is where recursion is a great asset. All you have to be able to do is code the smallest version of the problem, and the computer will do the rest for you.
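For that lecture example, the smallest version of the problem is simply 'this isn't a list at all'. Here is a sketch of what the recursive count might look like (my own take, not the lecture code, and it counts the outermost list too):

```python
def count_lists(obj):
    """Return how many lists are nested anywhere inside obj, counting obj itself."""
    if not isinstance(obj, list):
        return 0  # smallest chunk: not a list, so nothing to count
    # one for this list, plus whatever lists are hiding inside each element
    return 1 + sum(count_lists(item) for item in obj)
```

For instance, count_lists([1, [2, [3]], [4]]) comes out to 4: the outer list plus the three inner ones.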

At first I struggled with the concept of recursion, but seeing a variety of examples has helped me get a better grasp of how it works. I can confidently say that I understand what it means when something is recursive. However, I have not had much practice with programming recursively, so I'm not yet sure how easily I will be able to apply what I've learned. Fortunately, the first assignment in the class should give me an opportunity to do just that.

The problem in assignment one is modelled after the Towers of Hanoi, a classic problem that involves moving ring-shaped objects from one post to another without violating the rule that a larger ring can never be placed on top of a smaller one. Essentially, we have been asked to solve this problem with one more post than usual – 4 instead of 3. Since recursion has been such a hot topic, I am assuming that it will be required to solve this properly. I'm still working on the starting section of the code and have not yet attempted to implement recursion, but I will. I'll be posting my progress next week.
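For reference, the classic three-post version has a famously tidy recursive solution, which looks roughly like this (this is the standard textbook approach, not my assignment code, and the fourth post changes things considerably):

```python
def move_tower(n, source, target, spare):
    """Print the moves that shift a tower of n rings from source to target, using 3 posts."""
    if n == 1:
        print('move the top ring from', source, 'to', target)
    else:
        move_tower(n - 1, source, spare, target)  # park the n - 1 smaller rings on the spare
        move_tower(1, source, target, spare)      # move the largest ring to its destination
        move_tower(n - 1, spare, target, source)  # bring the smaller rings back on top
```

Calling move_tower(3, 'A', 'C', 'B') prints the seven moves needed for a tower of three rings.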

Photo credit: ©2008-2014 crazykitty82stock
Image source: http://crazykitty82stock.deviantart.com/art/tree-stock-83313306

Thursday, January 23, 2014

Object-Oriented Programming

Hello all! This will be the first in a series of weekly slog posts for CSC148 at the University of Toronto. This week I will briefly discuss object-oriented programming.

Object-oriented programming is a programming paradigm which involves creating 'objects', which, for all intents and purposes, can be considered to mimic real-world objects. A programming object can be thought of as something like a specialized virtual machine. It has its own properties and can perform its own operations, independent of other objects. Once a blueprint or template for an object (known as a class) has been created, it can easily be duplicated as many times as necessary.
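In Python, that blueprint-and-copies relationship looks something like this (Point is just a made-up example, not something from the course):

```python
class Point:
    """The blueprint (class) for two-dimensional point objects."""

    def __init__(self, x, y):
        self.x = x  # every object gets its own copies of these attributes
        self.y = y

    def shift(self, dx, dy):
        """An operation the object performs on itself."""
        self.x += dx
        self.y += dy


# Two independent objects stamped out from the same blueprint.
p = Point(0, 0)
q = Point(3, 4)
p.shift(1, 1)  # changes p only; q is untouched
```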

My first introduction to objects was in CSC108 at U of T and was somewhat basic. Since beginning CSC148, I would like to think that I have developed a more complete understanding of this programming paradigm and its benefits and uses. We have done some lab work on objects which has been helpful. For instance, during our first lab we worked on designing and implementing a class that would take care of cataloguing the words in a text file and the frequency with which they appeared. My partner and I had no difficulty writing the skeleton (basic outline) of the class, although we ran into some trouble when trying to implement a top-n word count method. During our second lab we worked again on implementing our own class, this time working with abstract data types like a Stack and a Queue. I found this to be quite helpful, as it gave me a chance to get used to thinking about using inheritance when writing a class that is very similar to another class. Inheritance can be used to 'pick up' existing methods and attributes associated with a parent or super class.
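The flavour of that inheritance idea is sketched below. The class names and methods are my own invented example rather than the lab's actual interface: a generic Container parent supplies the shared behaviour, and each child only writes the part that makes it different.

```python
class Container:
    """A generic collection of items (hypothetical parent class)."""

    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def is_empty(self):
        return self._items == []


class Stack(Container):
    """Picks up __init__, add and is_empty from Container; only the removal order is its own."""

    def remove(self):
        return self._items.pop()  # last in, first out


class Queue(Container):
    """Same inherited methods, different removal order."""

    def remove(self):
        return self._items.pop(0)  # first in, first out
```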

To me, the usefulness of objects seems to stem from the fact that they allow the programmer to be 'lazy'. That is, they allow you to avoid re-writing code by re-using existing classes. Once a class has been written, there is no need to write it ever again! A fringe benefit, or perhaps an even greater one, is the ability to revise code that has already been released without causing any catastrophic error on the client side of things. One example of this was discussed during lecture: we had already (hypothetically) shipped our stack code and needed to make some changes after the fact.

At first I struggled to implement objects because they were so different from what I was used to (mostly procedural programming). I had conceptualized programming as telling the computer directly what to do, not creating virtual objects. However, with some practice they are becoming more and more intuitive.

I am looking forward to learning more about objects and improving my abilities using the object-oriented programming paradigm in the future.