Interview Questions

Review this list of 4,477 interview questions and answers verified by hiring managers and candidates.
  • Asked at Discord
    Software Engineer
    Behavioral
    +4 more
  • Asked at Discord
    Data Scientist
    Behavioral
    +1 more
  • Asked at Discord
    Software Engineer
    Behavioral
    +1 more
  • Data Engineer
    Data Pipeline Design
  • "There are 2 questions popping into my mind: Should the 2nd job have to kick off at 12:30AM? Are there others depending on the 2nd job? If both answers are no, we may simply postpone the second job to allow sufficient time for the first one to complete. If they are yeses, we could let the 2nd job retry to a certain amount of times. Make sure that even reaching the maximum of retries won't delay or fail the following jobs."

    Anzhe M. - "There are 2 questions popping into my mind: Should the 2nd job have to kick off at 12:30AM? Are there others depending on the 2nd job? If both answers are no, we may simply postpone the second job to allow sufficient time for the first one to complete. If they are yeses, we could let the 2nd job retry to a certain amount of times. Make sure that even reaching the maximum of retries won't delay or fail the following jobs."See full answer

    Data Engineer
    Data Pipeline Design
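The retry idea in the answer above can be sketched in a few lines of Python. This is a minimal sketch, not a scheduler implementation; `run_with_retries`, the delays, and the injectable `sleep` parameter are all illustrative choices.

```python
import time

def run_with_retries(job, max_retries=3, base_delay=60, sleep=time.sleep):
    """Run `job` (a callable); retry with exponential backoff on failure.

    Returns True on success, False once retries are exhausted, so the
    scheduler can decide not to block or fail downstream jobs."""
    for attempt in range(max_retries + 1):
        try:
            job()
            return True
        except Exception:
            if attempt == max_retries:
                return False                      # give up cleanly
            sleep(base_delay * 2 ** attempt)      # 60s, 120s, 240s, ...
    return False
```

Capping the retries and returning a status instead of raising is what keeps a late upstream job from cascading into failures downstream, as the answer suggests.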
  • Asked at Databricks
    Data Engineer
    Data Pipeline Design
  • Asked at Databricks
    Data Engineer
    Data Pipeline Design
  • Asked at Databricks
    2 answers

    "Medallion architecture is a layered data architecture used in lakehouse systems. Data flows through Bronze, Silver, and Gold layers where each layer improves data quality. Bronze stores raw data, Silver contains cleaned and validated datasets, and Gold provides aggregated business-ready data for analytics and reporting bronzedf = spark.read.json("/landing/apidata") bronze_df.write.format("delta").save("/bronze/users")"

    Ramagiri P. - "Medallion architecture is a layered data architecture used in lakehouse systems. Data flows through Bronze, Silver, and Gold layers where each layer improves data quality. Bronze stores raw data, Silver contains cleaned and validated datasets, and Gold provides aggregated business-ready data for analytics and reporting bronzedf = spark.read.json("/landing/apidata") bronze_df.write.format("delta").save("/bronze/users")"See full answer

    Data Engineer
    Data Pipeline Design
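The Bronze → Silver → Gold progression described above can be illustrated without a Spark cluster. This toy version uses plain dicts in place of DataFrames; in Databricks each step would be a DataFrame transformation written to a Delta table, and all field names here are made up.

```python
def to_silver(bronze_rows):
    """Silver: drop malformed records, normalize fields, enforce types."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in bronze_rows
        if r.get("user") and r.get("amount") is not None
    ]

def to_gold(silver_rows):
    """Gold: aggregate to business-ready totals per user."""
    totals = {}
    for r in silver_rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

bronze = [
    {"user": " Alice ", "amount": "10.5"},   # raw: untrimmed, string-typed
    {"user": "bob", "amount": "2"},
    {"user": None, "amount": "99"},          # malformed: dropped in Silver
    {"user": "alice", "amount": "4.5"},
]
gold = to_gold(to_silver(bronze))            # {"alice": 15.0, "bob": 2.0}
```

The key point the answer makes survives the simplification: each layer takes the previous one as input and raises the quality bar, so consumers of Gold never see raw or malformed records.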
  • Asked at Databricks
    1 answer

    "Delta lake is a metadata layer on top of cloud storage which helps giving datalake transactional capabilities. It helps implement upsert/merge as it conforms a schema to the data assets stored in cloud. It also offers various other capabilities like liquid clustering,time travel, schema evolution,deletes."

    Nitish C. - "Delta lake is a metadata layer on top of cloud storage which helps giving datalake transactional capabilities. It helps implement upsert/merge as it conforms a schema to the data assets stored in cloud. It also offers various other capabilities like liquid clustering,time travel, schema evolution,deletes."See full answer

    Data Engineer
    Data Pipeline Design
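The upsert/merge capability mentioned above can be sketched without Delta Lake itself. This toy merge keyed on a primary key mirrors the matched-update / not-matched-insert semantics of Delta's MERGE (which additionally runs as a single atomic transaction); the table layout and key name are hypothetical.

```python
def merge_upsert(target, updates, key="id"):
    """Merge `updates` into `target` (both lists of dicts):
    rows with a matching key are updated, new keys are inserted --
    the semantics Delta Lake's MERGE provides transactionally."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        by_key.setdefault(row[key], {}).update(row)
    return sorted(by_key.values(), key=lambda r: r[key])
```

Without a transaction log over the storage layer, a plain data lake can only approximate this by rewriting whole files, which is exactly the gap the answer says Delta Lake fills.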
  • Asked at Databricks
    5 answers
    +2

    "This is yet another classic case of evolution of data landscape to account for diversities in the data formats sacrificing restrictive but key components at first and added later to make the solution more effective. Data warehouse -> Data Lake -> Data Lakehouse (Data Lake + Data Warehouse) Data warehouse - A solution to store data in central place (analytics (read) heavy) with stringent schema (structured). Very useful for historical queries and analytics. Schema on write check. Only used for"

    Karthik R. - "This is yet another classic case of evolution of data landscape to account for diversities in the data formats sacrificing restrictive but key components at first and added later to make the solution more effective. Data warehouse -> Data Lake -> Data Lakehouse (Data Lake + Data Warehouse) Data warehouse - A solution to store data in central place (analytics (read) heavy) with stringent schema (structured). Very useful for historical queries and analytics. Schema on write check. Only used for"See full answer

    Data Engineer
    Data Pipeline Design
  • Asked at Databricks
    1 answer

    "All purpose cluster remains up and running for longer duration irrespective of the job hence preferred for notebooks, adhoc work whereas job cluster spins up as per the submitted job and shuts down post the completion hence preferred for production scheduled workloads as it also offers compute isolation"

    Nitish C. - "All purpose cluster remains up and running for longer duration irrespective of the job hence preferred for notebooks, adhoc work whereas job cluster spins up as per the submitted job and shuts down post the completion hence preferred for production scheduled workloads as it also offers compute isolation"See full answer

    Data Engineer
    Data Pipeline Design
  • 1 answer

    "Clarify : "Movie watching” can happen in a Theatre On a cable TV via an OTT app, A dedicated home-theatre set-up. I will focus on OTT streaming on a Smart-TV because: It is the fastest-growing surface (70 % of long-form streaming minutes). Geography: USA Platform : Any (Android TV, Tizen etc) OTT: Netflix What are we trying to improve ? Basic Journey -->Switch on TV->Navigate to OTT->Find Movie->Play Movie->Consume Audio and Video Setup Journey -> Create account -> Setu"

    Inraathp R. - "Clarify : "Movie watching” can happen in a Theatre On a cable TV via an OTT app, A dedicated home-theatre set-up. I will focus on OTT streaming on a Smart-TV because: It is the fastest-growing surface (70 % of long-form streaming minutes). Geography: USA Platform : Any (Android TV, Tizen etc) OTT: Netflix What are we trying to improve ? Basic Journey -->Switch on TV->Navigate to OTT->Find Movie->Play Movie->Consume Audio and Video Setup Journey -> Create account -> Setu"See full answer

    Product Manager
    Product Design
  • Natalie C. - "I walked through the code for a React.js-based tic-tac-toe game written in TypeScript. The goal was to find ways to improve the code and suggest improvements. I missed some areas, like where state was being updated directly rather than using React's setState. There were issues around clear and maintainable logic and adherence to React best practices."

    Engineering Manager
    Coding
  • Jiin S. - "I would use A/B testing to see if the new feature would be incrementally beneficial. To begin, we should define the goal of the test. Let's say the new feature would increase the average number of trades by X. Then randomly assign the clients to two groups, a control and a test group. The control group doesn't see the new feature and the test group does. We could also use stratified sampling if we want to make sure we cover different customer segments. During this desig"

    Data Scientist
    Statistics & Experimentation
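The decision step of the A/B test described above can be sketched with a two-sample z-test on the difference in mean trades per client. This is a simplified sketch (large samples assumed; for small samples a t-test is the usual choice), and the numbers passed in would come from the control and test groups.

```python
from math import sqrt, erf

def two_sample_z(mean_c, var_c, n_c, mean_t, var_t, n_t):
    """Two-sided z-test for a difference in group means.

    Returns (z statistic, two-sided p-value); reject the null of
    'no effect' when p is below the chosen significance level."""
    se = sqrt(var_c / n_c + var_t / n_t)          # standard error of the difference
    z = (mean_t - mean_c) / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p
```

With 400 clients per group, variance 4, and a lift of 0.4 trades, this yields z ≈ 2.83, which is significant at the conventional 5% level; with no lift at all it returns p = 1.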
  • 2 answers

    "System Overview and Workflow: The architecture is composed of distributed microservices that communicate via an event bus (Apache Kafka). Ingestion: The process begins when the Chargeback Ingestion Service receives dispute notifications from acquiring banks through various methods (webhooks, SFTP files) and standardizes them into a canonical format. Lifecycle Management: A DisputeNotification event is published to Kafka, which the Chargeback Lifecycle Service consumes. This service acts"

    Abhishek M. - "System Overview and Workflow: The architecture is composed of distributed microservices that communicate via an event bus (Apache Kafka). Ingestion: The process begins when the Chargeback Ingestion Service receives dispute notifications from acquiring banks through various methods (webhooks, SFTP files) and standardizes them into a canonical format. Lifecycle Management: A DisputeNotification event is published to Kafka, which the Chargeback Lifecycle Service consumes. This service acts"See full answer

    Software Engineer
    System Design
  • Asked at Amazon
    4 answers
    +1

    "First i will collect relevant data,Actual problem,Necessary input and raw material then i will start design drawing,modeling,Callibration and finally handing over my work."

    Mohammed H. - "First i will collect relevant data,Actual problem,Necessary input and raw material then i will start design drawing,modeling,Callibration and finally handing over my work."See full answer

    Product Manager
    Product Design
  • Payal B. - "My brute-force approach was to read them: give an id to each paragraph and, for each token, count the number of times it has appeared. If any two rows look the same, one is a duplicate. The interviewer then guided me that he would do it with hashing."

    Data Engineer
    Coding
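The hashing approach the interviewer hinted at in the answer above can be sketched as follows: hash each normalized paragraph once and compare digests, turning pairwise comparison into a single O(n) pass. The normalization rules (collapse whitespace, lowercase) are illustrative assumptions.

```python
import hashlib

def find_duplicates(paragraphs):
    """Return (first_index, duplicate_index) pairs of duplicate paragraphs.

    Each paragraph is normalized, hashed, and looked up in a dict of
    digests already seen -- one pass instead of comparing every pair."""
    seen = {}
    dups = []
    for i, p in enumerate(paragraphs):
        normalized = " ".join(p.split()).lower()
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        if digest in seen:
            dups.append((seen[digest], i))
        else:
            seen[digest] = i
    return dups
```

The brute-force token-count idea in the answer works, but it compares every paragraph against every other; hashing replaces that quadratic comparison with a dictionary lookup.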
  • Asked at Salesforce
    2 answers

    "Bitshift the number to the right and keep track of the 1's you encounter. If you bitshift it completely and only encounter one 1, it is a power of two."

    Nils G. - "Bitshift the number to the right and keep track of the 1's you encounter. If you bitshift it completely and only encounter one 1, it is a power of two."See full answer

    Software Engineer
    Data Structures & Algorithms
    +1 more
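The bitshift idea in the answer above translates directly to code: shift right while counting set bits, and a power of two is exactly the case of a single 1 bit. A small Python sketch:

```python
def is_power_of_two(n):
    """Shift n right, counting set bits; exactly one set bit
    (and n positive) means n is a power of two."""
    if n <= 0:
        return False
    ones = 0
    while n:
        ones += n & 1   # count the lowest bit
        n >>= 1
    return ones == 1
```

The same single-set-bit observation gives the classic O(1) variant: `n > 0 and n & (n - 1) == 0`, since subtracting 1 flips the lowest set bit and everything below it.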
  • Asked at Adobe
    4 answers
    +1

    VContaineers - "Compare alternate houses, i.e. for each house starting from the third, calculate the maximum money that can be stolen up to that house by choosing between: skipping the current house and taking the maximum money stolen up to the previous house, or robbing the current house and adding its value to the maximum money stolen up to the house two steps back. package main import ( "fmt" ) // rob function calculates the maximum money a robber can steal func maxRob(nums []int) int { ln"

    Data Engineer
    Data Structures & Algorithms
    +4 more
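The skip-or-rob recurrence described in the answer above (the Go snippet is cut off in the preview) can be written compactly with two rolling variables. A Python sketch of the same dynamic program:

```python
def max_rob(nums):
    """Max total loot without robbing two adjacent houses.

    `skip` = best total if the previous house was NOT robbed,
    `rob`  = best total if it WAS; each house either extends
    `skip` or takes the better of the two so far."""
    skip = rob = 0
    for value in nums:
        skip, rob = max(skip, rob), skip + value
    return max(skip, rob)
```

Rolling the two states forward avoids storing the full DP table, so the whole thing runs in O(n) time and O(1) space.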
  • Asked at Salesforce
    Software Engineer
    Data Structures & Algorithms
    +1 more
Showing 1021-1040 of 4477