Data Structures

Array, Objects, List, Dictionary

Data structures are built on primitive data types like STRING and BOOL

An ARRAY is a way to hold many values in one contiguous block of memory

var averageTemp = [];
averageTemp[0] = 79;
averageTemp[1] = 82;
averageTemp[2] = 68;

When you push to an array, the value is placed at the end

When you add a value at the beginning of an array, every existing element has to shift one place over.
This is very inefficient – O(n) for a single insert
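The difference can be sketched with the array methods `push` and `unshift` (a minimal illustration; the variable name is reused from the example above):

```javascript
// push appends in O(1); unshift must shift every existing
// element one slot over, which is O(n).
var averageTemp = [];
averageTemp.push(79);    // [79]
averageTemp.push(82);    // [79, 82]
averageTemp.unshift(68); // [68, 79, 82] -- both existing values shifted
```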

Stacks and Queues

A stack is a list where you add and remove only at the top; pushing a new value pushes all the existing values down.
This is fundamentally how programming languages order function calls and returns (the call stack).
Like when you push a tray down in a buffet line: the last one in is the first one out.
It only allows access to the top element.
“LIFO” – last in, first out.
A stack overflow happens when too many nested function calls pile up on the call stack (e.g., runaway recursion) and it runs out of room.
Peek is a way to see what's on top without removing it
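A minimal stack sketch using a plain JavaScript array (the tray names are made up for illustration):

```javascript
// push and pop only ever touch the top element (LIFO);
// peek reads the top without removing it.
var stack = [];
stack.push("tray1");
stack.push("tray2");
var top = stack[stack.length - 1]; // peek: "tray2" stays on the stack
var popped = stack.pop();          // "tray2" -- last in, first out
```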

A queue is a list where everything gets added to the end and removed from the front. Like a line: first in, first out, just like lining up to buy an iPhone.
Amazon SQS (Simple Queue Service) is a managed queue that is good for messaging
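A minimal queue sketch, again using a plain array (customer names are illustrative):

```javascript
// push adds to the back, shift removes from the front,
// so the first element in is the first one out (FIFO).
var queue = [];
queue.push("customer1");
queue.push("customer2");
var served = queue.shift(); // "customer1" -- first in, first out
```

Note that `shift` on a plain array has the same O(n) shifting cost described for arrays above; dedicated queue structures avoid this.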

Linked List
Each element (node) uses 2 pieces of memory.
One piece is the actual value; the second is a pointer to the next node in the list.
An ordinary array requires each value to sit directly next to the previous one in memory, while linked-list nodes can live anywhere, connected by their pointers.
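The two-pieces-of-memory idea can be sketched like this (the `Node` constructor is illustrative, not a built-in):

```javascript
// Each node stores a value plus a pointer (reference) to the
// next node; the nodes need not be adjacent in memory.
function Node(value) {
  this.value = value;
  this.next = null;
}

var head = new Node(1);
head.next = new Node(2);
head.next.next = new Node(3);

// Traversal follows the pointers instead of array indices.
var values = [];
for (var node = head; node !== null; node = node.next) {
  values.push(node.value);
}
```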

Key value pairs
Key = MyAddress
Value = Los Angeles

An object in JavaScript is essentially a dictionary

var myPets = {
    cat: "Mr. Hyena",
    lizard: "Mr. Big Big",
    goat: "Wolf Who Ate Wall Street"
};

Searching an array with nested for loops is O(n²)

A dictionary lets you look up values by key directly instead of scanning the “array”

Binary search tree

Useful when you do a lot of searching.
A way of structuring data so that each step splits the remaining values in half.
Uses 3 pieces of memory per value: the value itself plus pointers to the left and right children.
There are NPM tools to use this type of data structure in Javascript
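A minimal binary search tree sketch (the function names are illustrative, not from any particular NPM package). Each node holds three pieces of memory: the value plus the left and right pointers.

```javascript
function TreeNode(value) {
  this.value = value;
  this.left = null;
  this.right = null;
}

function insert(root, value) {
  if (root === null) return new TreeNode(value);
  if (value < root.value) root.left = insert(root.left, value);
  else root.right = insert(root.right, value);
  return root;
}

function contains(root, value) {
  if (root === null) return false;
  if (value === root.value) return true;
  // Each comparison discards half the tree, so searching a
  // balanced tree takes O(log n) steps.
  return value < root.value
    ? contains(root.left, value)
    : contains(root.right, value);
}

var root = null;
[8, 3, 10, 1, 6].forEach(function (v) { root = insert(root, v); });
```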

Graphs

Help GPS maps find the shortest routes.
Facebook models its social network as a graph.
Many AI systems rely heavily on graphs.
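A tiny graph sketch as an adjacency list, the simplest representation: each key maps a node to the nodes it connects to (the names are made up for illustration).

```javascript
// An undirected friendship graph: routing and social-network
// queries walk these edge lists.
var friends = {
  alice: ["bob", "carol"],
  bob:   ["alice"],
  carol: ["alice"]
};
```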

Algorithm Complexity

Head – O(1): accessing the head is constant time; the running time does not grow with input size

List items – O(n): linear time; the running time grows in direct proportion to the input size

O(n²) – “quadratic time complexity”, typical of two nested for loops

Three nested for loops ≈ O(n³)
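The quadratic case can be made concrete with a small sketch (the function name is illustrative):

```javascript
// Two nested loops over the same array do n * n units of
// work -- quadratic, O(n^2).
function countPairs(arr) {
  var count = 0;
  for (var i = 0; i < arr.length; i++) {
    for (var j = 0; j < arr.length; j++) {
      count++; // executes n * n times
    }
  }
  return count;
}
```

For a 3-element array the inner body runs 3 × 3 = 9 times; adding a third nested loop would push this to n³.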

Binary search

array = [ 1,2,3,4,5,6,7,8,9,10 ]
Search for 3 takes 3 steps
Search for 9 takes 3 steps

Even if we double the input, we only add 1 step:
array = [ 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20 ]

This is the most resilient to growth in input size.
O(log n) – faster than a linear algorithm
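A minimal iterative binary search sketch over a sorted array, matching the step counts above:

```javascript
// Each step halves the remaining range, so doubling the input
// adds only one extra step: O(log n).
function binarySearch(arr, target) {
  var low = 0;
  var high = arr.length - 1;
  while (low <= high) {
    var mid = Math.floor((low + high) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) low = mid + 1; // discard the lower half
    else high = mid - 1;                  // discard the upper half
  }
  return -1; // not found
}
```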

Example of O(n)

function solution(k, arr) {
    var map = {};
    for (var i = 0; i < arr.length; i++) {
        var tmp = k - arr[i]; // the value that would complete the pair
        if (tmp > 0 && map[tmp] === 1) {
            console.log("Found the pair:", tmp, arr[i]);
        }
        map[arr[i]] = 1; // mark this value as seen: O(1) lookup later
    }
}

Technical debt = shipping bad or inefficient code now while telling yourself you will fix it later.

# Big O Analysis

Express the running times of the following algorithms in Big O notation. Justify your responses.

Some of these are just review. A few apply Big O to algorithms we saw before.

Assume the _worst case_ running time—i.e., consider only the _maximum_ number of instructions the algorithm could take.

What is the running time of…

* Selection sort?

* Insertion sort?

* Linear search?

* Binary search?

* Finding duplicates in an array?


If you’re mathematically inclined, you might find these interesting. If not, feel free to take a stab anyway, but don’t worry too much about proving your solution.

What is the running time for an algorithm that—

* Finds all triplets `(x, y, z)` such that `x + y + z = n`, where `n` is specified by the user?

* E.g.: `threeSum(list, search)` will find all possible triplets of numbers in `list` that sum to `search`.

* Same question, but for doubles (pairs `(x, y)` such that `x + y = n`)?

* In general, what is the running time for finding n-tuples in `list` that sum to `search`?
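To reason about the triplet case, here is a hedged brute-force sketch of `threeSum` (the name comes from the exercise; this is one possible implementation, not the required answer):

```javascript
// Three nested loops try every distinct triplet, so the
// running time is O(n^3).
function threeSum(list, search) {
  var triplets = [];
  for (var i = 0; i < list.length; i++) {
    for (var j = i + 1; j < list.length; j++) {
      for (var k = j + 1; k < list.length; k++) {
        if (list[i] + list[j] + list[k] === search) {
          triplets.push([list[i], list[j], list[k]]);
        }
      }
    }
  }
  return triplets;
}
```

The same pattern with two loops gives the pairs case, suggesting how the n-tuple generalization grows.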