1. The logarithm (the math concept)

Understand how the logarithm works

Why is a logarithmic-time operation or algorithm so much faster, and therefore better, than a linear-time one?
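To make the difference concrete, here's a tiny sketch comparing how many "steps" a linear-time algorithm needs versus a logarithmic-time one as the input grows:

```python
import math

# A linear-time algorithm does work proportional to n.
# A logarithmic-time algorithm does work proportional to log2(n):
# doubling n only adds about one extra step.
for n in [1_000, 1_000_000, 1_000_000_000]:
    print(f"n = {n:>13,} | linear steps ~ {n:>13,} | log2 steps ~ {math.log2(n):.0f}")
```

A billion elements take a billion linear steps but only about thirty logarithmic ones, which is why binary search and balanced-tree operations scale so well.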

2. Graph traversal

Pre-order traversal, matrix traversal (because matrices are basically graphs in a way). You have to know how to do depth-first search and breadth-first search (DFS and BFS). You also have to know how to traverse cyclic graphs, which means keeping track of visited nodes in some sort of auxiliary data structure. This is super important: most on-site interviews will have at least one interview with a graph problem where you're going to have to traverse a graph.
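As a minimal sketch, here are iterative DFS and BFS over a small cyclic graph (the adjacency-list format and node names are just for illustration); note how the `visited` set prevents the cycle from looping forever:

```python
from collections import deque

def dfs(graph, start):
    """Iterative depth-first search; `visited` guards against cycles."""
    visited, stack, order = set(), [start], []
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        # Push neighbors in reverse so they are visited in listed order.
        stack.extend(reversed(graph[node]))
    return order

def bfs(graph, start):
    """Breadth-first search using a queue."""
    visited, queue, order = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

# A small cyclic graph: A -> B -> C -> A, and B -> D.
graph = {"A": ["B"], "B": ["C", "D"], "C": ["A"], "D": []}
```

The same two traversals, with minor tweaks, also cover matrix problems: treat each cell as a node and its four neighbors as edges.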

3. Binary Search

Binary search is way better than linear search because it runs in logarithmic time. In a lot of coding interviews you'll get a difficult problem that involves binary search in the solution: a subproblem of the greater problem will need binary search, for example because you'll repeatedly search through sorted values, and you'll want to use binary search for that.
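Here's the classic implementation, halving the search space on every iteration:

```python
def binary_search(values, target):
    """Return the index of target in a sorted list, or -1 if absent.
    Each iteration halves the remaining range, so it runs in O(log n)."""
    lo, hi = 0, len(values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if values[mid] == target:
            return mid
        elif values[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1
```

The input must be sorted; that precondition is exactly why binary search so often shows up as a subroutine after a sorting step.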

4. Sliding window technique

This involves manipulating two pointers, or two indices, at the same time as you traverse something like a string or an array. You've got a left pointer (or left index) and a right pointer (or right index), and you move them simultaneously through the array or string, usually to count frequencies of characters, or to compute sums of subarrays, where you can add the number entering on the right and subtract the number leaving on the left. You can see how that's useful. Sometimes you'll need to expand the window on the right and shrink it on the left; other times you'll have variations where you iterate through a data structure, create the two pointers, and expand them both outwards at the same time, as in the longest palindromic substring problem. The point is that being able to comfortably manipulate two pointers at the same time while traversing a data structure is very useful. A lot of problems have that as a solution, so you want to be comfortable with it.
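A minimal fixed-size example of the add-on-the-right, subtract-on-the-left idea:

```python
def max_subarray_sum(nums, k):
    """Max sum of any contiguous subarray of length k, via a sliding window.
    Instead of re-summing each window (O(n*k)), slide it: add the element
    entering on the right and subtract the one leaving on the left (O(n))."""
    window = sum(nums[:k])       # sum of the first window
    best = window
    for right in range(k, len(nums)):
        window += nums[right] - nums[right - k]
        best = max(best, window)
    return best
```

The variable-size variations (expand right, shrink left when a condition breaks) use the same two-pointer bookkeeping, just with a condition deciding when each pointer moves.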

5. Recursion

Recursion is useful for two reasons. First of all, really understanding recursion, being comfortable with it, and solving things recursively is very good from a logical perspective: it makes sure you understand how function calls work, what's returned from functions, all that good stuff. But also, there are a lot of problems that can be solved recursively much more easily than iteratively. Implementing them recursively is just much easier; the code is much cleaner and much shorter than its iterative counterpart. A good example is the in-order traversal of a tree. We have a problem on AlgoExpert called Iterative In-order Traversal that is very hard, and the reason it's hard is that traversing a tree in order iteratively is pretty difficult to implement in code, whereas recursively it's pretty trivial. So if you're really good at recursion and you understand it well, you can solve the problem easily.
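Here's how trivial the recursive version is (the `Node` class is just a minimal stand-in for whatever tree representation an interview gives you):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def in_order(node, result=None):
    """Recursive in-order traversal: left subtree, then node, then right subtree."""
    if result is None:
        result = []
    if node is not None:
        in_order(node.left, result)
        result.append(node.value)
        in_order(node.right, result)
    return result

#       2
#      / \
#     1   3
tree = Node(2, Node(1), Node(3))
```

Doing the same thing iteratively requires an explicit stack and careful state tracking, which is exactly what makes the iterative version a hard interview problem.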

6. Inverting a binary tree and reversing a linked list

Inverting a binary tree is important because it's just a very easy problem. It comes down to traversing a tree and swapping each node's left and right children.
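The whole thing fits in a few lines, just a traversal with a swap at each node:

```python
class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def invert(node):
    """Mirror the tree: swap left and right children at every node, recursively."""
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node
```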

If you're intimidated by inverting a binary tree, you probably just have a misconception about what it means to invert a binary tree.

Reversing a linked list is a little bit different. It's important to know how to do it because linked lists allow you to do certain operations much faster than arrays do. Linked list problems tend not to be too difficult algorithmically; it's basically the same thing every time, like shifting values in a linked list or rearranging a linked list. And if you know how to reverse a linked list, you'll probably know how to do most other linked list manipulations you could face in an interview, because it comes down to the same skills: knowing how to manipulate nodes, knowing how to overwrite pointers, and making sure to do so in a specific order so that you don't lose references to nodes in the linked list.
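A minimal sketch of the pointer rewiring; the whole trick is updating `head.next`, `prev`, and `head` in an order that never drops a reference:

```python
class ListNode:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def reverse(head):
    """Reverse a singly linked list by rewiring one pointer per node.
    The tuple assignment evaluates the right side first, so we redirect
    head.next to prev without losing our handle on the rest of the list."""
    prev = None
    while head is not None:
        head.next, prev, head = prev, head, head.next
    return prev
```

In a language without tuple assignment you'd use an explicit temporary (`next_node = head.next`) before overwriting `head.next`; the ordering constraint is the same.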

7. Suffix tree

Suffix trees are pretty advanced, relatively complicated data structures. But you can create a data structure that looks similar to a suffix tree just with objects, like JavaScript objects or Python dictionaries. That's very easy, and it's a very powerful data structure that can help you, especially with string problems. When you're trying to find whether a bunch of strings are contained in another string, and you want to do so quickly, then rather than iterating through all the strings and doing a bunch of substring matching, creating a suffix tree, or a tree-like structure that contains the strings, is going to be much better and quicker. It can also impress interviewers.
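Here's the dictionary-based version of that idea: a suffix trie built from nested Python dicts. (A true suffix tree compresses shared edges and is built in linear time; this simpler structure still answers substring queries in time proportional to the pattern's length.)

```python
def build_suffix_trie(text):
    """Insert every suffix of `text` into a trie of nested dicts."""
    root = {}
    for i in range(len(text)):
        node = root
        for ch in text[i:]:
            # setdefault walks to the child for ch, creating it if missing.
            node = node.setdefault(ch, {})
    return root

def contains(trie, pattern):
    """A pattern is a substring of the text iff it traces a path from the root."""
    node = trie
    for ch in pattern:
        if ch not in node:
            return False
        node = node[ch]
    return True
```

Build the trie once, then each of your many substring queries is a cheap walk down the tree instead of a fresh scan of the whole text.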

8. Heaps

Heaps are a relatively advanced data structure. Binary heaps are basically binary trees, and usually we're talking about min heaps or max heaps, which satisfy a special heap property. Heaps are super useful because you can read the minimum value of a min-heap (or the maximum value of a max-heap) instantly at the root, and insert or remove values in logarithmic time. Heaps can help a lot when you need the smallest or the largest values in groups of values, and I think that going through the exercise of constructing a min-heap or a max-heap can be very empowering, because it really shows you how the data structure works and how you can represent it very simply and elegantly with an array.
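A minimal min-heap built on a plain list, showing that elegant array representation (Python's standard library `heapq` module gives you the same operations ready-made, but writing it once is the empowering part):

```python
class MinHeap:
    """A binary min-heap stored in a flat list: the children of index i
    live at 2*i + 1 and 2*i + 2, and the minimum is always at index 0."""

    def __init__(self):
        self.items = []

    def push(self, value):
        """Append, then bubble up while smaller than the parent: O(log n)."""
        self.items.append(value)
        i = len(self.items) - 1
        while i > 0 and self.items[i] < self.items[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.items[i], self.items[parent] = self.items[parent], self.items[i]
            i = parent

    def pop(self):
        """Remove the minimum: swap root with last, then sift down: O(log n)."""
        items = self.items
        items[0], items[-1] = items[-1], items[0]
        smallest = items.pop()
        i = 0
        while True:
            left, right, best = 2 * i + 1, 2 * i + 2, i
            if left < len(items) and items[left] < items[best]:
                best = left
            if right < len(items) and items[right] < items[best]:
                best = right
            if best == i:
                return smallest
            items[i], items[best] = items[best], items[i]
            i = best
```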

9. Dynamic programming

A lot of the hardest coding interview problems that you're going to encounter can be solved with dynamic programming. Dynamic programming is a fancy term for solving a pretty complex problem by first solving a smaller version of that problem, and to solve that smaller version you solve an even smaller version, all the way down until you reach a trivial version of the problem. (Practice makes perfect.)
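A classic beginner example of that "build from the trivial case up" pattern is counting the ways to climb a staircase taking one or two steps at a time:

```python
def climb_stairs(n):
    """Number of distinct ways to climb n steps taking 1 or 2 at a time.
    Bottom-up dynamic programming: the answer for i steps is built from
    the two smaller subproblems (arrive from step i-1 or from step i-2)."""
    ways = [1, 1] + [0] * (n - 1)   # trivial cases: 1 way for 0 or 1 steps
    for i in range(2, n + 1):
        ways[i] = ways[i - 1] + ways[i - 2]
    return ways[n]
```

The table `ways` is the signature of dynamic programming: each entry is a solved subproblem that later entries reuse instead of recomputing.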

10. Sorting algorithms

Quicksort and merge sort. All the sorting algorithms are important to understand, but I think quicksort and merge sort are really the two most famous and popular fast sorting algorithms.

It's important to understand them so that you understand why they run in O(n log n) time and why they're so much better than bubble sort. You need to know how to implement them. They can also help a lot in problems where you're searching for something like the smallest or largest values.
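Here are compact sketches of both. (These favor clarity over efficiency: this quicksort allocates new lists at each level rather than partitioning in place, which is how it's usually written in interviews.)

```python
def quicksort(nums):
    """Quicksort: partition around a pivot, then sort each side.
    O(n log n) on average; the pivot choice guards against sorted inputs."""
    if len(nums) <= 1:
        return nums
    pivot = nums[len(nums) // 2]
    less = [x for x in nums if x < pivot]
    equal = [x for x in nums if x == pivot]
    greater = [x for x in nums if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

def merge_sort(nums):
    """Merge sort: split in half, sort each half recursively, merge.
    Guaranteed O(n log n) in all cases."""
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left, right = merge_sort(nums[:mid]), merge_sort(nums[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # append whichever half remains
```

Both halve the problem at every level, and each level does linear work merging or partitioning, which is exactly where the n log(n) comes from.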