Coding

Simple explanation of Asymptotic Notation!

  • 00:00:00 hi guys this is Keith Galli and in this
  • 00:00:03 video I'll be giving a brief explanation
  • 00:00:05 on asymptotic notation and its role in
  • 00:00:08 computer science and more specifically
  • 00:00:10 algorithms so what is asymptotic
  • 00:00:13 notation well in computer science we
  • 00:00:16 often want to know how fast a program
  • 00:00:19 process or algorithm runs however this
  • 00:00:23 isn't always easy because each one of
  • 00:00:25 our computers has different hardware
  • 00:00:27 capabilities as a result asymptotic
  • 00:00:31 notation was developed to provide a
  • 00:00:33 universal way to measure the speed and
  • 00:00:39 efficiency of an algorithm the second
  • 00:00:46 part I think that is really important to
  • 00:00:48 remember about asymptotic notation is
  • 00:00:51 that it refers to large inputs so what I
  • 00:00:58 mean by this is that imagine you are
  • 00:01:00 sorting 10 numbers maybe one algorithm
  • 00:01:03 works really well in these 10 numbers
  • 00:01:06 but if you increase the size of the list
  • 00:01:09 of numbers that you tried to sort to a
  • 00:01:11 hundred thousand maybe a second
  • 00:01:14 algorithm works a lot better in
  • 00:01:16 asymptotic notation we care about the
  • 00:01:18 behavior of an algorithm as the input
  • 00:01:21 size approaches infinity so we care
  • 00:01:23 about these large inputs let's quickly
  • 00:01:28 go over the three most important
  • 00:01:30 notations that you will have to know
  • 00:01:31 when taking any sort of algorithms class
  • 00:01:34 the first is O of n also known as Big O
  • 00:01:39 what O of n represents is an upper
  • 00:01:43 bound so if we have the function n we
  • 00:01:51 could say that it is O of n squared
  • 00:01:55 because when we have a big input any
  • 00:01:58 value evaluated at n would be less than
  • 00:02:03 that same value evaluated at n
  • 00:02:07 squared so 10,000 for example is always
  • 00:02:12 less than
  • 00:02:13 and upper bounded by 10,000 squared
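The upper-bound idea can be spot-checked numerically. This is a minimal sketch, not from the video: the helper name `is_upper_bounded` is made up, and scanning a finite range only illustrates the definition f(n) ≤ c·g(n) for all n ≥ n0 — it is not a proof.

```python
# Illustrating the Big O definition: f is O(g) if there exist constants
# c and n0 such that f(n) <= c * g(n) for every n >= n0.
def is_upper_bounded(f, g, c=1, n0=1, n_max=10_000):
    """Spot-check f(n) <= c * g(n) over a finite sample of n values."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# f(n) = n is O(n^2): with c = 1 and n0 = 1, n <= n * n always holds.
print(is_upper_bounded(lambda n: n, lambda n: n * n))  # True
```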
  • 00:02:15 and so on next we have theta of n theta of n
  • 00:02:22 represents a tight bound so what this
  • 00:02:27 means is that in algorithms we
  • 00:02:31 don't care about constants we really
  • 00:02:32 just care about that asymptotic behavior
  • 00:02:34 and when you have a really big input
  • 00:02:38 the constants don't really matter
  • 00:02:41 so let's say we have the functions n,
  • 00:02:43 3n, and 5n well 3n is always
  • 00:02:51 going to be less than 5n and n is
  • 00:02:54 always going to be less than 3n so you
  • 00:02:57 can say this function 3n right here is
  • 00:02:59 tightly bounded by n we could say that
  • 00:03:02 it is theta of n so if we ignore these
  • 00:03:07 constants it is lower bounded by a
  • 00:03:09 function of n and it's also
  • 00:03:13 upper bounded by a function of n so
  • 00:03:15 because it has this tight bound on n we
  • 00:03:18 say it's theta of n finally we'll look
  • 00:03:21 at Omega of N and as you might have
  • 00:03:24 guessed this represents a lower bound
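Both the tight bound just described (3n is theta of n) and the lower bound can be spot-checked in code. A minimal sketch with invented helper names — again, a finite scan illustrates the definitions rather than proving them:

```python
import math

def is_tightly_bounded(f, g, c1, c2, n0=1, n_max=10_000):
    """Theta: check c1 * g(n) <= f(n) <= c2 * g(n) over a sample of n."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

def is_lower_bounded(f, g, c=1, n0=1, n_max=10_000):
    """Omega: check f(n) >= c * g(n) over a sample of n."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))

# 3n is Theta(n): it is squeezed between 1*n and 5*n for every n >= 1.
print(is_tightly_bounded(lambda n: 3 * n, lambda n: n, c1=1, c2=5))  # True

# n^3 is Omega(log n): n^3 >= log n for every n >= 1.
print(is_lower_bounded(lambda n: n ** 3, math.log))  # True
```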
  • 00:03:28 for an example let's imagine that we had
  • 00:03:31 the functions n cubed and log of n
  • 00:03:41 whenever we plug in a value to n cubed
  • 00:03:44 it is always going to be higher than
  • 00:03:47 that same value plugged in to
  • 00:03:49 log n for big inputs so we can say that
  • 00:03:53 n cubed is Omega of log n
  • 00:03:59 I apologize if this is a bit messy so to recap O
  • 00:04:02 of n is an upper bound theta of n is a
  • 00:04:05 tight bound and Omega of n is a lower
  • 00:04:09 bound remember these three things
  • 00:04:11 because you're going to use them a ton
  • 00:04:13 throughout any algorithms course one
  • 00:04:15 thing that I think is important to note
  • 00:04:16 about asymptotic notation is that the
  • 00:04:19 convention is that you only care about
  • 00:04:22 the term in an expression that has the
  • 00:04:25 highest order of growth
  • 00:04:26 so in the expression n to the fifth plus
  • 00:04:28 four n cubed plus two n plus five that term is n to the fifth so if we said
  • 00:04:31 that a function was upper bounded by
  • 00:04:33 this expression we would just say it is
  • 00:04:35 O of n to the fifth well it wouldn't be
  • 00:04:40 wrong to say that it is O of n to the
  • 00:04:43 fifth plus four n cubed plus two n
  • 00:04:45 plus five that's just not how computer
  • 00:04:48 scientists write this notation because n
  • 00:04:50 to the fifth is the term that really
  • 00:04:53 dominates and controls this expression
  • 00:04:56 when you have a very large input as a
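The domination of the leading term can be seen numerically. A quick sketch (the function name `full_expression` is made up for illustration): the ratio of the whole expression to n to the fifth heads toward 1 as n grows, so the lower-order terms stop mattering.

```python
# Ratio of n^5 + 4n^3 + 2n + 5 to its leading term n^5:
# as n grows the lower-order terms become negligible and the ratio -> 1.
def full_expression(n):
    return n ** 5 + 4 * n ** 3 + 2 * n + 5

for n in (10, 1_000, 100_000):
    print(n, full_expression(n) / n ** 5)  # 1.04025 at n=10, then ever closer to 1
```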
  • 00:04:58 simple example on asymptotic notation
  • 00:05:02 let's imagine that we have two grocery
  • 00:05:04 lists the grocery list on the left it
  • 00:05:07 is sorted alphabetically so you have
  • 00:05:11 the items that start with an A on the
  • 00:05:14 top and the items that start with the Z
  • 00:05:16 on the bottom the list on the right it
  • 00:05:18 is sorted completely randomly we don't
  • 00:05:22 know where any item will fall on that
  • 00:05:23 list and let's say we have a total of n
  • 00:05:27 items all right now we want to figure
  • 00:05:33 out which of these lists is a better way
  • 00:05:35 to make a grocery list is it
  • 00:05:38 quicker to search when it's
  • 00:05:40 alphabetical or quicker when it's
  • 00:05:42 random so let's say we are looking
  • 00:05:45 for a specific item such as cucumbers in
  • 00:05:48 the list on the left we can quickly find
  • 00:05:51 cucumbers by going to the Cs and seeing
  • 00:05:53 that cucumbers is right there so we can
  • 00:05:56 say that it is lower bounded by a
  • 00:05:59 constant number of operations and so as
  • 00:06:05 I mentioned in the previous slide lower
  • 00:06:08 bound is represented by Omega so we
  • 00:06:10 could say that this is Omega of 1 where 1
  • 00:06:12 represents a constant number of
  • 00:06:14 operations however we might not always
  • 00:06:17 get so lucky to have cucumbers be
  • 00:06:19 sitting right there as the only item
  • 00:06:22 that begins with a C let's imagine that
  • 00:06:25 we had hundreds of thousands of items
  • 00:06:28 that began with a C on our grocery list
  • 00:06:30 let's say I was shopping
  • 00:06:33 for a lot of people so we had hundreds
  • 00:06:35 of thousands of items starting with C
  • 00:06:36 if this is the case then simply going
  • 00:06:40 down to the C part of our grocery list
  • 00:06:43 wouldn't be enough to you know find
  • 00:06:45 cucumbers we would have to go further
  • 00:06:47 and break our list down even further to
  • 00:06:49 find cucumbers so we would maybe have to
  • 00:06:52 look at Cu and try to figure out
  • 00:06:56 what items begin with Cu to find
  • 00:06:58 cucumbers and then if we still had a lot
  • 00:07:00 of items that begin with Cu
  • 00:07:03 we'd have to continue this process and
  • 00:07:05 break the list down even further into
  • 00:07:08 items that begin with Cuc the
  • 00:07:12 process of breaking down a list into
  • 00:07:14 smaller and smaller lists to search through
  • 00:07:16 can be called logarithmic time
  • 00:07:21 in computer science so we can say that
  • 00:07:24 this is upper bounded and the upper
  • 00:07:27 bound is on the worst-case scenario
  • 00:07:31 so in the worst-case scenario we had a
  • 00:07:32 bunch of items that begin with C and
  • 00:07:34 maybe a bunch of them started with Cu and
  • 00:07:36 we had to continue this process
  • 00:07:39 for a while so the upper bound of this
  • 00:07:41 is log of n as I mentioned in the
  • 00:07:44 previous slide
  • 00:07:47 the upper bound can be denoted
  • 00:07:50 with Big O this is Big O and we can say
  • 00:07:53 this is Big O of log n so this Big O
  • 00:07:57 means upper bound and log N is because
  • 00:08:01 there's n total items and each time we
  • 00:08:03 take a step further in trying to find
  • 00:08:05 cucumbers we break the list down into
  • 00:08:07 smaller and smaller lists on the other
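Before turning to the random list: the halving search just described can be sketched with Python's `bisect` module, which implements exactly this kind of binary search on a sorted sequence. The grocery items below are stand-ins, and the helper `contains` is made up for illustration.

```python
import bisect

# Searching an alphabetized list by repeated halving: bisect_left performs
# a binary search, taking O(log n) comparisons in the worst case.
groceries = sorted(["apples", "bread", "cucumbers", "soup", "zucchini"])

def contains(sorted_items, target):
    """Locate target by halving the search range (binary search)."""
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target

print(contains(groceries, "cucumbers"))  # True
print(contains(groceries, "pickles"))    # False
```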
  • 00:08:10 hand let's imagine that we wanted to try
  • 00:08:12 to find the same word the same item
  • 00:08:14 cucumbers in this randomly sorted list
  • 00:08:17 to do this wouldn't be as easy as the
  • 00:08:21 list on the left we would have to go
  • 00:08:24 through zucchini apples soup and just
  • 00:08:27 keep going down this list until we
  • 00:08:29 eventually found cucumbers somewhere
  • 00:08:32 down here or maybe it was up here you
  • 00:08:34 know we don't know but in the worst case
  • 00:08:36 we'd have to go through this
  • 00:08:37 entire list because we don't know where
  • 00:08:39 cucumbers would fall because there's no
  • 00:08:41 order to how this list is sorted so in
  • 00:08:44 this case we could say it is upper
  • 00:08:46 bounded
  • 00:08:49 by n so the O again means upper bounded
  • 00:08:55 and n is because we have to look through
  • 00:08:58 all n items in the worst case to find
  • 00:09:01 cucumbers so the right list has an upper
  • 00:09:05 bound of O of n now we know two upper
  • 00:09:10 bounds we know that the list on the left
  • 00:09:14 has an upper bound of O of log n and
  • 00:09:21 the list on the right has an upper
  • 00:09:26 bound of O of n so as a result of that
  • 00:09:32 we could say that the alphabetical grocery
  • 00:09:36 list is asymptotically faster than the
  • 00:09:43 randomly sorted list
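The whole comparison can be sketched by counting steps directly. This is an illustration, not the video's code: the step-counting helpers are invented, and a list of integers stands in for the n grocery items. The unsorted list forces a linear scan (O(n) worst case), while the sorted list allows binary search (O(log n) worst case).

```python
def linear_steps(items, target):
    """Scan left to right, counting comparisons until target is found."""
    for steps, item in enumerate(items, start=1):
        if item == target:
            return steps
    return len(items)  # worst case: looked at every item

def binary_steps(sorted_items, target):
    """Repeatedly halve the search range, counting comparisons."""
    lo, hi, steps = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return steps
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

n = 1_000_000
items = list(range(n))               # stand-in for n grocery items
print(linear_steps(items, n - 1))    # 1000000 -- worst case touches all n items
print(binary_steps(items, n - 1))    # about 20 steps, roughly log2 of a million
```

The gap is exactly the asymptotic claim above: a million comparisons versus about twenty.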