Google Interview Questions
1. Describe a data structure that supports the operations INSERT and FIND-MEDIAN in the best possible asymptotic time.
2. Given a circularly sorted array, describe the fastest way to locate the largest element.
3. Given a binary search tree and an integer k, describe the most efficient way to locate two nodes of the tree that sum to k.
4. Given an arbitrarily connected graph, explain how to ascertain the reachability of a given node.
5. If you have one billion numbers and one hundred computers, what is the best way to locate the median of the numbers?
6. Describe an API for a cache, with all of the associated function signatures. Provide pseudocode for a simple implementation.
7. Implement a special type of integer iterator. It will be a wrapper over a regular integer iterator, except that this iterator will only iterate over negative numbers. Show how you'd write the next() and hasNext() functions to work with the next negative integer instead of just the next integer.
8. You are making an AI to play Tic-Tac-Toe. Write a function that takes a board position as input and returns a good position to play. Describe how you'd represent the board and how you'd determine where to play next.
9. You are given 1001 numbers in an array. There are 1000 unique numbers and 1 duplicate. How do you locate the duplicate as quickly as possible?
10. Say you are implementing exponentiation for a certain processor. How would you minimize the number of multiplications required for this operation?
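Below are rough sketches, in Java, for several of the questions above. Each one is just one possible approach, not necessarily what the interviewers had in mind, and every class, method, and variable name is my own invention.

For question 1, the usual answer is a pair of heaps: a max-heap holding the lower half of the numbers and a min-heap holding the upper half, kept within one element of each other in size. INSERT is then O(log n) and FIND-MEDIAN is O(1).

import java.util.Collections;
import java.util.PriorityQueue;

// Two-heap median structure: lower half in a max-heap, upper half in a min-heap.
// insert() costs O(log n); findMedian() costs O(1).
class MedianBag {
    private final PriorityQueue<Integer> lower = new PriorityQueue<>(Collections.reverseOrder()); // max-heap
    private final PriorityQueue<Integer> upper = new PriorityQueue<>();                           // min-heap

    void insert(int x) {
        if (lower.isEmpty() || x <= lower.peek()) lower.add(x); else upper.add(x);
        // Rebalance so the two halves never differ in size by more than one.
        if (lower.size() > upper.size() + 1) upper.add(lower.poll());
        else if (upper.size() > lower.size() + 1) lower.add(upper.poll());
    }

    double findMedian() {
        if (lower.isEmpty() && upper.isEmpty()) throw new IllegalStateException("empty");
        if (lower.size() == upper.size()) return (lower.peek() + upper.peek()) / 2.0;
        return lower.size() > upper.size() ? lower.peek() : upper.peek();
    }
}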
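For question 2, assuming the array is an ascending sorted array that has been rotated and contains distinct values, the rotation point (the minimum) can be found by binary search in O(log n), and the maximum is its circular predecessor. With duplicates the worst case degrades toward O(n).

// Largest element of a circularly sorted (rotated ascending) array of distinct values, O(log n).
class RotatedMax {
    static int findMax(int[] a) {
        int lo = 0, hi = a.length - 1;
        while (lo < hi) {
            int mid = (lo + hi) >>> 1;
            if (a[mid] > a[hi]) lo = mid + 1; // the minimum lies strictly to the right of mid
            else hi = mid;                    // the minimum is at mid or to its left
        }
        // lo now indexes the minimum (rotation point); the maximum is its circular predecessor.
        return a[(lo - 1 + a.length) % a.length];
    }
}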
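For question 3, one simple sketch: an in-order traversal of a BST yields its values in sorted order, so collect them into a list and run the classic two-pointer scan. That is O(n) time and O(n) extra space; an O(h)-space variant would use forward and backward stack-based iterators instead of a list.

import java.util.ArrayList;
import java.util.List;

class TreeNode {            // minimal BST node; field names are assumed
    int val;
    TreeNode left, right;
    TreeNode(int v) { val = v; }
}

class TwoSumBst {
    // Returns a pair {a, b} of node values with a + b == k, or null if none exists.
    static int[] findPair(TreeNode root, int k) {
        List<Integer> sorted = new ArrayList<>();
        inorder(root, sorted);                      // in-order traversal gives sorted values
        int i = 0, j = sorted.size() - 1;
        while (i < j) {                             // classic two-pointer scan on sorted data
            int sum = sorted.get(i) + sorted.get(j);
            if (sum == k) return new int[] { sorted.get(i), sorted.get(j) };
            if (sum < k) i++; else j--;
        }
        return null;
    }

    private static void inorder(TreeNode n, List<Integer> out) {
        if (n == null) return;
        inorder(n.left, out);
        out.add(n.val);
        inorder(n.right, out);
    }
}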
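For question 4, reachability is just graph search: run BFS (or DFS) from the source and check whether the target is ever visited. The sketch assumes an adjacency-list representation.

import java.util.ArrayDeque;
import java.util.Collections;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

class Reachability {
    // Breadth-first search: returns every node reachable from `start` in an adjacency-list graph.
    static Set<Integer> reachableFrom(Map<Integer, List<Integer>> adj, int start) {
        Set<Integer> seen = new HashSet<>();
        Deque<Integer> queue = new ArrayDeque<>();
        seen.add(start);
        queue.add(start);
        while (!queue.isEmpty()) {
            int u = queue.poll();
            for (int v : adj.getOrDefault(u, Collections.emptyList())) {
                if (seen.add(v)) queue.add(v); // add() returns false if v was already visited
            }
        }
        return seen;
    }
}

To decide whether t is reachable from s, test reachableFrom(adj, s).contains(t).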
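For question 6, a minimal API might expose get, put, remove, and size; a simple implementation is a bounded LRU cache, which Java's LinkedHashMap supports directly through its access-order mode and removeEldestEntry hook. The interface and all names are assumptions on my part.

import java.util.LinkedHashMap;
import java.util.Map;

// A minimal cache API; the method names are mine, not from the question.
interface Cache<K, V> {
    V get(K key);               // returns null if the key is absent
    void put(K key, V value);
    void remove(K key);
    int size();
}

// The simple implementation: a bounded LRU cache built on LinkedHashMap's access-order mode.
class LruCache<K, V> implements Cache<K, V> {
    private final Map<K, V> map;

    LruCache(final int capacity) {
        this.map = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;   // evict the least-recently-used entry when over capacity
            }
        };
    }

    public V get(K key)             { return map.get(key); }
    public void put(K key, V value) { map.put(key, value); }
    public void remove(K key)       { map.remove(key); }
    public int size()               { return map.size(); }
}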
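For question 7, the wrapper pre-fetches the next negative value from the underlying iterator so that hasNext() can answer truthfully and next() can return it.

import java.util.Iterator;
import java.util.NoSuchElementException;

// Wraps a regular Iterator<Integer> and yields only the negative values.
// The trick is to buffer the next negative element ahead of time.
class NegativeIterator implements Iterator<Integer> {
    private final Iterator<Integer> inner;
    private Integer buffered;   // next negative value, or null if none remains

    NegativeIterator(Iterator<Integer> inner) {
        this.inner = inner;
        advance();
    }

    private void advance() {
        buffered = null;
        while (inner.hasNext()) {
            int candidate = inner.next();
            if (candidate < 0) { buffered = candidate; break; }
        }
    }

    public boolean hasNext() { return buffered != null; }

    public Integer next() {
        if (buffered == null) throw new NoSuchElementException();
        Integer result = buffered;
        advance();
        return result;
    }
}

For example, wrapping an iterator over 3, -1, 4, -5 yields -1 and then -5.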
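For question 8, one tiny sketch: represent the board as a 9-character array ('X', 'O', or space) and pick a move by negamax, which is exhaustive but trivially fast on a 3x3 board. A fuller answer would also cover symmetry and immediate win/block heuristics.

// Board: a 9-element char array of 'X', 'O', or ' ' (indices 0..8, row-major).
class TicTacToe {
    private static final int[][] LINES = {
        {0,1,2},{3,4,5},{6,7,8},{0,3,6},{1,4,7},{2,5,8},{0,4,8},{2,4,6}
    };

    // Returns the index of a good move for `player`, or -1 if the board is full.
    static int bestMove(char[] board, char player) {
        int best = -1, bestScore = Integer.MIN_VALUE;
        for (int i = 0; i < 9; i++) {
            if (board[i] != ' ') continue;
            board[i] = player;
            int score = -negamax(board, other(player));
            board[i] = ' ';
            if (score > bestScore) { bestScore = score; best = i; }
        }
        return best;
    }

    // Negamax: value of the position from the point of view of the side to move.
    private static int negamax(char[] board, char toMove) {
        char w = winner(board);
        if (w != ' ') return w == toMove ? 1 : -1;
        int best = Integer.MIN_VALUE;
        boolean full = true;
        for (int i = 0; i < 9; i++) {
            if (board[i] != ' ') continue;
            full = false;
            board[i] = toMove;
            best = Math.max(best, -negamax(board, other(toMove)));
            board[i] = ' ';
        }
        return full ? 0 : best;   // a drawn (full, no winner) board scores 0
    }

    private static char other(char p) { return p == 'X' ? 'O' : 'X'; }

    private static char winner(char[] b) {
        for (int[] line : LINES)
            if (b[line[0]] != ' ' && b[line[0]] == b[line[1]] && b[line[1]] == b[line[2]])
                return b[line[0]];
        return ' ';
    }
}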
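For question 9, if nothing is known about the values, a hash set finds the repeated value in one pass. (If the values were guaranteed to be 1 through 1000 plus one repeat, subtracting the expected sum 1000 * 1001 / 2 from the actual sum would do it with no extra space.)

import java.util.HashSet;
import java.util.Set;

class FindDuplicate {
    // One pass with a hash set: O(n) time, O(n) extra space.
    static int duplicate(int[] a) {
        Set<Integer> seen = new HashSet<>();
        for (int x : a) {
            if (!seen.add(x)) return x;   // add() returns false on the second occurrence
        }
        throw new IllegalArgumentException("no duplicate present");
    }
}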
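For question 10, the standard trick is exponentiation by squaring, which needs roughly log2(exp) squarings plus one extra multiplication per set bit of the exponent, versus exp - 1 multiplications for the naive loop. Overflow and modular reduction are ignored in this sketch.

class FastPow {
    // Exponentiation by squaring over the bits of the exponent.
    static long power(long base, int exp) {
        if (exp < 0) throw new IllegalArgumentException("negative exponent");
        long result = 1;
        while (exp > 0) {
            if ((exp & 1) == 1) result *= base;  // fold in the current bit
            base *= base;                        // square for the next bit
            exp >>= 1;
        }
        return result;
    }
}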
Here are two more interview questions I was asked:
1. You have been shrunk down to the size of a nickel and tossed into a blender. You are told that the blades will start in 60 seconds. What would you do to save your life?
2. Design and describe a system/application that will most efficiently produce a report of the top 1 million Google search requests. These are the particulars (a rough sketch of one building block follows the list):
* You are given 12 servers to work with. They are all dual-processor machines with 4 GB of RAM and 4x400 GB hard drives, networked together. (Basically, nothing more than high-end PCs.)
* The log data has already been cleaned for you. It consists of 100 billion log lines, broken into twelve 320 GB files with one 40-byte search term per line.
* You may use only custom-written applications or freely available open-source software.
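One way to attack this is a MapReduce-style pipeline: each server scans its own ~320 GB file and hashes every search term to one of the 12 machines, so all occurrences of a term land on the same node; each node then counts its terms (spilling sorted runs to disk and merging them, since 4 GB of RAM will not hold a whole partition), keeps its local top 1,000,000 with a bounded min-heap, and a final pass merges the 12 partial lists into the report. The sketch below shows only the last in-memory step, picking the top k entries from a per-partition count map; the partitioning, external sort, and merge are assumed, and all names are mine.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;

class TopTerms {
    // Given per-term counts for one partition, keep the k most frequent terms
    // with a min-heap of size k: O(n log k) time, O(k) space.
    static List<Map.Entry<String, Long>> topK(Map<String, Long> counts, int k) {
        PriorityQueue<Map.Entry<String, Long>> heap =
            new PriorityQueue<>((a, b) -> Long.compare(a.getValue(), b.getValue()));
        for (Map.Entry<String, Long> e : counts.entrySet()) {
            heap.add(e);
            if (heap.size() > k) heap.poll();   // drop the currently least frequent term
        }
        List<Map.Entry<String, Long>> result = new ArrayList<>(heap);
        result.sort((a, b) -> Long.compare(b.getValue(), a.getValue()));  // most frequent first
        return result;
    }
}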