BigO notation

<center>
<blockquote style="padding: 5px; background-color: #FFF8DC; border: solid thin gray;">
  [[File:Exclamation.png]] This is student work which has not yet been approved as correct by the instructor
</blockquote>
</center>
[[file:Studying.png|right|frame|Case study notes<ref>http://www.flaticon.com/</ref>]]


== Introduction ==


Big O notation is the language we use for articulating how long an algorithm takes to run. It's how we compare the efficiency of different approaches to a problem.
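
To make "comparing approaches" concrete, here is a minimal sketch in Python (the function names are made up for illustration, not taken from a cited source): two ways to sum the numbers 1 through n, one whose work grows with n and one whose work stays constant.

<syntaxhighlight lang="python">
def sum_up_to(n):
    """Add the numbers 1..n one at a time: n loop iterations, so O(n)."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total


def sum_up_to_formula(n):
    """Use the closed form n*(n+1)/2: a fixed amount of work, so O(1)."""
    return n * (n + 1) // 2


# Both approaches give the same answer; big O is the vocabulary for saying
# that the second one scales better as n grows.
print(sum_up_to(1000))          # 500500
print(sum_up_to_formula(1000))  # 500500
</syntaxhighlight>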
 
With big O notation we express the runtime in terms of—brace yourself—how quickly it grows relative to the input, as the input gets arbitrarily large.


Let's break that down:


'''how quickly the runtime grows'''—Some external factors affect the time it takes for a function to run: the speed of the processor, what else the computer is running, etc. So it's hard to make strong statements about the exact runtime of an algorithm. Instead we use big O notation to express how quickly its runtime grows.
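
As a small, hypothetical illustration (Python, illustrative name): we avoid claims like "this takes 2 milliseconds", because that depends on the machine; instead we note that the number of steps grows linearly with the size of the input, a claim that holds on any hardware.

<syntaxhighlight lang="python">
def print_all(items):
    # One print per element: the step count grows in step with len(items).
    # On fast or slow hardware the statement is the same: the runtime grows O(n).
    for item in items:
        print(item)
</syntaxhighlight>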


'''relative to the input'''—Since we're not looking at an exact number, we need to phrase our runtime growth in terms of something. We use the size of the input. So we can say things like the runtime grows "on the order of the size of the input" (O(n)) or "on the order of the square of the size of the input" (O(n^2)).
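
For example, here is a hedged sketch of those two growth rates (illustrative Python, not code from the cited article): a single pass over the input grows on the order of the size of the input, while looking at every pair grows on the order of its square.

<syntaxhighlight lang="python">
def print_each(items):
    # n items, one step per item: O(n)
    for item in items:
        print(item)


def print_all_pairs(items):
    # n items paired with n items: n * n steps, so O(n^2)
    for first in items:
        for second in items:
            print(first, second)
</syntaxhighlight>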


'''as the input gets arbitrarily large'''—Our algorithm may have steps that seem expensive when n is small but are eclipsed eventually by other steps as n gets huge. For big O analysis, we care most about the stuff that grows fastest as the input grows, because everything else is quickly eclipsed as n gets very large. If you know what an asymptote is, you might see why "big O analysis" is sometimes called "asymptotic analysis."
Big O notation is like math except it's an awesome, not-boring kind of math where you get to wave your hands through the details and just focus on what's basically happening.
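
To make that hand-waving concrete, here is a hypothetical sketch (illustrative Python): a function that does a fixed chunk of work, then a single pass, then a pass over every pair performs roughly 1 + n + n^2 steps; because the n^2 term dominates as n gets huge, we drop the rest and call the whole thing O(n^2).

<syntaxhighlight lang="python">
def do_a_bunch_of_stuff(items):
    print("fixed step")            # constant work: O(1)

    for item in items:             # one pass over the input: O(n)
        print(item)

    for first in items:            # every pair of items: O(n^2)
        for second in items:
            print(first, second)

    # Roughly 1 + n + n^2 steps in total. As n grows, the n^2 term dwarfs
    # the others, so the function as a whole is described as O(n^2).
</syntaxhighlight>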


<ref>https://www.interviewcake.com/article/java/big-o-notation-time-and-space-complexity</ref>




== References ==
<references />
