"i responded using a multi sourced BFS and in place marking, then i checked the final grid to see if any free spots were left unmarked."
Sh R. - "i responded using a multi sourced BFS and in place marking, then i checked the final grid to see if any free spots were left unmarked."See full answer
"This was a 60 minute assessment. The clock is ticking and you're being observed by a senior+ level engineer. Be ready to perform for an audience.
The implementation for the system gets broken up into three parts:
Implement creating accounts and depositing money into an account by ID
Implement transferring money with validation to ensure the accounts for the transfer both exist and that the account money is being removed from has enough money in it to perform the transfer
Implement find"
devopsjesus - "This was a 60 minute assessment. The clock is ticking and you're being observed by a senior+ level engineer. Be ready to perform for an audience.
The implementation for the system gets broken up into three parts:
Implement creating accounts and depositing money into an account by ID
Implement transferring money with validation to ensure the accounts for the transfer both exist and that the account money is being removed from has enough money in it to perform the transfer
Implement find"See full answer
"We can use dictionary to store cache items so that our read / write operations will be O(1).
Each time we read or update an existing record, we have to ensure the item is moved to the back of the cache. This will allow us to evict the first item in the cache whenever the cache is full and we need to add new records also making our eviction O(1)
Instead of normal dictionary, we will use ordered dictionary to store cache items. This will allow us to efficiently move items to back of the cache a"
Alfred O. - "We can use dictionary to store cache items so that our read / write operations will be O(1).
Each time we read or update an existing record, we have to ensure the item is moved to the back of the cache. This will allow us to evict the first item in the cache whenever the cache is full and we need to add new records also making our eviction O(1)
Instead of normal dictionary, we will use ordered dictionary to store cache items. This will allow us to efficiently move items to back of the cache a"See full answer
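A short Python sketch of the ordered-dictionary cache described above: move_to_end pushes a touched key to the back, and popitem(last=False) evicts the front when the cache is full.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # most recently used item goes to the back
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the front (least recently used)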
"\# An program that prints out the peak elements in a list of integers.
Pseudocode:
1. Define a function that takes a list of integers as input.
2. Initialize an empty list to store the peak elements.
3. Loop through the list of integers.
4. For each element, check if it is greater than its neighbors.
5. If it is, add it to the list of peak elements.
6. Return the list of peak elements.
def findpeakelements(nums):
if not nums:
return []
peaks = []
n = len(nums"
Frederick K. - "\# An program that prints out the peak elements in a list of integers.
Pseudocode:
1. Define a function that takes a list of integers as input.
2. Initialize an empty list to store the peak elements.
3. Loop through the list of integers.
4. For each element, check if it is greater than its neighbors.
5. If it is, add it to the list of peak elements.
6. Return the list of peak elements.
def findpeakelements(nums):
if not nums:
return []
peaks = []
n = len(nums"See full answer
"I firstly discuss the brute force approach in O(n^2) time complexity , than i moved to O(nlogn) tine complexity than i discussed the O(n) time complexity and O(n) space complexity . But interviewer want more optimised solution , in O(n) time complexity without using extra space ,
The solution wants O(1) space complexity i have to do changes in same array without using any space . This method is something like i have to place positive values to its original position by swapping and rest negativ"
Anni P. - "I firstly discuss the brute force approach in O(n^2) time complexity , than i moved to O(nlogn) tine complexity than i discussed the O(n) time complexity and O(n) space complexity . But interviewer want more optimised solution , in O(n) time complexity without using extra space ,
The solution wants O(1) space complexity i have to do changes in same array without using any space . This method is something like i have to place positive values to its original position by swapping and rest negativ"See full answer
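The exact prompt isn't shown in the excerpt, but the in-place swapping described (parking each positive value at the index it "belongs" to) matches the cyclic-sort trick used, for instance, by the first-missing-positive problem; a sketch under that assumption:
def first_missing_positive(nums):
    n = len(nums)
    for i in range(n):
        # keep swapping nums[i] toward its home index nums[i] - 1
        while 1 <= nums[i] <= n and nums[nums[i] - 1] != nums[i]:
            j = nums[i] - 1
            nums[i], nums[j] = nums[j], nums[i]
    for i in range(n):
        if nums[i] != i + 1:  # first slot not holding its own value
            return i + 1
    return n + 1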
"Used Recursive approach to traverse the binary search tree and sum the values of the nodes that fall within the specified range [low, high]"
Srikant V. - "Used Recursive approach to traverse the binary search tree and sum the values of the nodes that fall within the specified range [low, high]"See full answer
"These are a set of utilities used to manage the heap memory as part of an application. The C standard library implements these functions.
malloc(bytes) takes a number of bytes and returns a pointer to the start of the allocated buffer. If the allocation failed, a null pointer is returned instead.
calloc(count, size) behaves like malloc(count * size), but also zero-initializes the allocated buffer, assuming the allocation succeeded.
realloc(ptr, size) takes a pointer to a previously al"
J R. - "These are a set of utilities used to manage the heap memory as part of an application. The C standard library implements these functions.
malloc(bytes) takes a number of bytes and returns a pointer to the start of the allocated buffer. If the allocation failed, a null pointer is returned instead.
calloc(count, size) behaves like malloc(count * size), but also zero-initializes the allocated buffer, assuming the allocation succeeded.
realloc(ptr, size) takes a pointer to a previously al"
"SELECT customer_id,
order_date,
orderid AS secondearliestorderid
FROM (
SELECT order_id,
customer_id,
order_date,
ROWNUMBER() OVER (PARTITION BY customerid, orderdate ORDER BY orderid ASC) AS rank
FROM orders
)
WHERE rank = 2
ORDER BY orderdate, customerid
`"
Tiffany A. - "SELECT customer_id,
order_date,
orderid AS secondearliestorderid
FROM (
SELECT order_id,
customer_id,
order_date,
ROWNUMBER() OVER (PARTITION BY customerid, orderdate ORDER BY orderid ASC) AS rank
FROM orders
)
WHERE rank = 2
ORDER BY orderdate, customerid
`"See full answer
"Constraints: 4-direction moves; no mode switching (pick exactly one of {1=bicycle, 2=bike, 3=car, 4=bus} for the full trip).
Per-mode search:
If a mode’s per-step time/cost are uniform, run BFS on allowed cells. Then totaltime = steps × timeperstep, tie-break by steps × costper_step.
If time/cost vary by cell (given matrices), run Dijkstra per mode minimizing (totaltime, totalcost) lexicographically. Maintain the best ⟨time, cost⟩ per cell; relax when the new pair is strictly better.
S"
Rahul J. - "Constraints: 4-direction moves; no mode switching (pick exactly one of {1=bicycle, 2=bike, 3=car, 4=bus} for the full trip).
Per-mode search:
If a mode’s per-step time/cost are uniform, run BFS on allowed cells. Then totaltime = steps × timeperstep, tie-break by steps × costper_step.
If time/cost vary by cell (given matrices), run Dijkstra per mode minimizing (totaltime, totalcost) lexicographically. Maintain the best ⟨time, cost⟩ per cell; relax when the new pair is strictly better.
S"See full answer
"2 Approaches:
1) The more intuitive approach is doing a multi-source BFS from all cats and storing the distance of closest cats. Then do a dfs/bfs from rat to bread.
Time Complexity: O(mn + 4^L) where L is path length, worst case L could be mn
Space Complexity: O(m*n)
2) The first approach should be fine for interviews. But if they ask to optimize it further, you can use Binary Search. Problems like "Finding max of min distance" or "Finding min of max" could be usually solved by BS.
"
Karan K. - "2 Approaches:
1) The more intuitive approach is doing a multi-source BFS from all cats and storing the distance of closest cats. Then do a dfs/bfs from rat to bread.
Time Complexity: O(mn + 4^L) where L is path length, worst case L could be mn
Space Complexity: O(m*n)
2) The first approach should be fine for interviews. But if they ask to optimize it further, you can use Binary Search. Problems like "Finding max of min distance" or "Finding min of max" could be usually solved by BS.
"See full answer
"
Compare alternate houses i.e for each house starting from the third, calculate the maximum money that can be stolen up to that house by choosing between:
Skipping the current house and taking the maximum money stolen up to the previous house.
Robbing the current house and adding its value to the maximum money stolen up to the house two steps back.
package main
import (
"fmt"
)
// rob function calculates the maximum money a robber can steal
func maxRob(nums []int) int {
ln"
VContaineers - "
Compare alternate houses i.e for each house starting from the third, calculate the maximum money that can be stolen up to that house by choosing between:
Skipping the current house and taking the maximum money stolen up to the previous house.
Robbing the current house and adding its value to the maximum money stolen up to the house two steps back.
package main
import (
"fmt"
)
// rob function calculates the maximum money a robber can steal
func maxRob(nums []int) int {
ln"See full answer
"Sorted the array and stored the minimum difference in a variable and then traversed the array for the pairs having minimum difference"
Aashka C. - "Sorted the array and stored the minimum difference in a variable and then traversed the array for the pairs having minimum difference"See full answer
"WITH filtered_posts AS (
SELECT
p.user_id,
p.issuccessfulpost
FROM
post p
WHERE
p.postdate >= '2023-11-01' AND p.postdate < '2023-12-01'
),
post_summary AS (
SELECT
pu.user_type,
COUNT(*) AS post_attempt,
SUM(CASE WHEN fp.issuccessfulpost = 1 THEN 1 ELSE 0 END) AS post_success
FROM
filtered_posts fp
JOIN
postuser pu ON fp.userid = pu.user_id
GROUP BY
pu.user_type
)
SELECT
user_type,
post_success,
post_attempt,
CAST(postsuccess AS FLOAT) / postattempt AS postsuccessrate
FROM
po"
David I. - "WITH filtered_posts AS (
SELECT
p.user_id,
p.issuccessfulpost
FROM
post p
WHERE
p.postdate >= '2023-11-01' AND p.postdate < '2023-12-01'
),
post_summary AS (
SELECT
pu.user_type,
COUNT(*) AS post_attempt,
SUM(CASE WHEN fp.issuccessfulpost = 1 THEN 1 ELSE 0 END) AS post_success
FROM
filtered_posts fp
JOIN
postuser pu ON fp.userid = pu.user_id
GROUP BY
pu.user_type
)
SELECT
user_type,
post_success,
post_attempt,
CAST(postsuccess AS FLOAT) / postattempt AS postsuccessrate
FROM
po"See full answer