Data Analyst Mock Interview (Experienced) | Real Scenario-Based Questions & Answers 2026
Duration: 35:07


Bharath.Insights

10 chapters · 8 takeaways · 25 key terms · 5 questions

Overview

This video presents a mock interview for an experienced Data Analyst role, focused on Power BI skills. The candidate, Vun, discusses their background, project experience with clients in the financial and healthcare sectors, and technical expertise. The interview covers data sources (SQL Server, Snowflake), Power BI connection modes (Import, Direct Query, Live Connection, Composite), data modeling (star schema, relationships, normalization), licensing (Pro, Premium), report sharing, page-level security, data flows, testing procedures, deployment pipelines, incremental refresh, workspace roles (Admin, Member, Contributor, Viewer), DAX functions (time intelligence, `CALCULATE`, `USERELATIONSHIP`, `PARALLELPERIOD`), handling complex SQL queries, and domain experience in healthcare and finance. The candidate also touches on data masking and SQL concepts such as views and CTEs.


Chapters

  • Vun has a background in Electronics and Communication Engineering and a Master's in Computer Science.
  • Possesses 4.5 years of industry experience, with 3 years specifically as a Power BI developer.
  • Has worked with healthcare IT (Oracle Cerner) and financial services (TCW) clients.
  • Developed over 50 reports and dashboards for approximately 15 clients.
  • Certified in PL-300, DP-600, and DP-700.
This section establishes the candidate's foundational knowledge and breadth of experience, setting the stage for more detailed technical discussions.
Worked as a Power BI developer for Oracle Cerner (healthcare IT) and currently for Unlimited Innovations India, serving TCW (a US financial-sector client).
  • Supported TCW, an asset management client, for 6-7 months.
  • Developed dashboards and reports for C-level executives (CEO, CTO) to monitor key metrics.
  • Set up daily/monthly email subscriptions for automated report delivery.
  • Data sources include Microsoft SQL Server and Snowflake, with a plan to migrate all data to Snowflake.
  • Familiar with various data sources including Excel, CSV, and JSON.
This provides a concrete example of the candidate's recent work, demonstrating their ability to handle real-world client requirements and data environments.
Created daily dashboards for TCW's management, showing metrics like Assets Under Management (AUM) and other financial indicators, with automated email subscriptions.
  • Primarily uses Import mode (90% of the time) due to large data volumes (10M+ rows).
  • Familiar with Direct Query mode but used it less frequently in practice.
  • Aware of Live Connection mode (for SSAS datasets) and Composite mode (combining Import and Direct Query).
Understanding connection modes is crucial for performance optimization and choosing the right approach based on data size and real-time needs.
Used Import mode for large clinical datasets at a previous role to ensure performance, as Direct Query was not viable.
  • Worked with Star Schema in the most recent project, featuring a central fact table and dimension tables (Date, Product, Fund).
  • Encountered and resolved many-to-many relationships using a bridge table.
  • Describes normalization as reducing redundancy by splitting data into separate, related tables; for reporting models, the reverse process (denormalization, combining tables) is often applied to simplify the schema.
  • Can explain bidirectional relationships and their necessity in specific filtering scenarios.
Effective data modeling is the backbone of any robust Power BI solution, impacting performance, usability, and data integrity.
Used a bridge table to resolve a many-to-many relationship between two tables by creating a new table with unique values from both and establishing one-to-many relationships from the original tables to the bridge table.
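The bridge-table steps described above can be sketched in SQL. This is an illustrative example using Python's built-in sqlite3 (table and column names are assumed, not taken from the interview): a table of distinct keys is built from both sides, so each original table relates many-to-one to the bridge.

```python
import sqlite3

# Hypothetical sketch of resolving a many-to-many relationship with a
# bridge table. Table/column names (sales, budget, fund_code) are assumed.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE sales  (fund_code TEXT, amount REAL);
CREATE TABLE budget (fund_code TEXT, target REAL);
INSERT INTO sales  VALUES ('F1', 100), ('F1', 50), ('F2', 70);
INSERT INTO budget VALUES ('F1', 200), ('F2', 90), ('F3', 10);

-- Bridge table: one row per distinct key from BOTH sides, so each
-- original table gets a clean many-to-one relationship to the bridge.
CREATE TABLE fund_bridge AS
SELECT fund_code FROM sales
UNION                       -- UNION de-duplicates across both tables
SELECT fund_code FROM budget;
""")

cur.execute("SELECT COUNT(*) FROM fund_bridge")
print(cur.fetchone()[0])  # → 3 (distinct fund codes across both tables)
```

In Power BI the same idea is modeled with two one-to-many relationships from the bridge to the original tables, rather than a direct many-to-many between them.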
  • Current organization uses Premium capacity, allowing free license viewers to access reports.
  • If not using Premium capacity, viewers typically need Pro or Premium licenses.
  • Page-level security can be used to restrict access to specific pages within a report.
  • Power BI Apps and Audiences offer a more streamlined way to manage content visibility for different user groups.
Understanding licensing and sharing mechanisms is essential for deploying Power BI solutions effectively and ensuring users have appropriate access.
If a report is hosted in a Premium workspace, viewers with free licenses can access it without needing their own Pro or Premium license.
  • Performs unit testing by exporting data to Excel and cross-checking key metrics and row counts.
  • Uses separate workspaces for non-production (development) and production environments.
  • Deployment involves publishing from a non-prod workspace to a Fabric-enabled Premium workspace.
  • Utilizes Azure DevOps for source control and has a peer review process (manager tests developed reports and vice-versa).
This outlines the candidate's structured approach to development, testing, and deployment, ensuring quality and control before releasing to end-users.
After developing a report in a non-prod workspace, Vun exports data to Excel to manually verify calculations and row counts against expected values before publishing to the production workspace.
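The cross-check described above boils down to comparing counts and totals between source and report. A minimal sketch, with all sample values and the AUM metric assumed for illustration:

```python
# Hypothetical unit-test sketch: compare row count and a key metric from
# the source data against the values read off the published report.
source_rows = [("F1", 100.0), ("F1", 50.0), ("F2", 70.0)]  # sample source data

report_row_count = 3       # values transcribed from the report (assumed)
report_total_aum = 220.0

assert len(source_rows) == report_row_count, "row count mismatch"
assert abs(sum(v for _, v in source_rows) - report_total_aum) < 1e-6, "AUM mismatch"
print("checks passed")
```

In practice the source side of the comparison would come from a SQL query or an Excel export rather than a hard-coded list.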
  • Daily scheduled refreshes are managed via an on-premises gateway set up by their manager.
  • Aware of incremental refresh but has not personally implemented it.
  • Pro license allows 8 scheduled refreshes per day; Premium allows refreshes every 30 minutes.
  • Power Automate can be used for on-demand report refreshes via a button trigger.
Understanding data refresh capabilities is vital for ensuring reports contain up-to-date information, meeting business needs for timeliness.
The current project uses a scheduled daily refresh managed through an on-premises gateway to keep the data current for the financial client.
  • Familiar with Power BI workspace roles: Admin, Member (current role), Contributor, and Viewer.
  • Successfully used DAX to display textual data alongside numerical values in a matrix visual by creating custom measures and using `ISINSCOPE`.
  • Regularly uses time intelligence functions, `CALCULATE`, and `USERELATIONSHIP`.
  • Can explain and differentiate `PARALLELPERIOD` and `SAMEPERIODLASTYEAR` for time-based comparisons.
  • Understands `USERELATIONSHIP` for activating inactive relationships in DAX calculations.
This section highlights the candidate's practical application of DAX for complex requirements and their understanding of Power BI's security and organizational structure.
Created custom DAX measures to display textual data within a matrix visual, overcoming limitations of native visuals by leveraging `ISINSCOPE`.
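The `PARALLELPERIOD` / `SAMEPERIODLASTYEAR` distinction mentioned above can be illustrated outside DAX. In this plain-Python sketch (dates chosen for illustration), `SAMEPERIODLASTYEAR` shifts exactly the selected dates back one year, while `PARALLELPERIOD(..., -1, YEAR)` snaps to the full parallel period, ignoring the partial selection:

```python
from datetime import date

# Filter context: 15-20 March 2024 (a partial month, chosen for illustration).
selected = [date(2024, 3, d) for d in range(15, 21)]

# SAMEPERIODLASTYEAR: each selected date shifted back exactly one year.
same_period = [d.replace(year=d.year - 1) for d in selected]

# PARALLELPERIOD(dates, -1, YEAR): the ENTIRE previous year -- boundaries
# snap to whole periods regardless of the partial selection.
parallel = (date(2023, 1, 1), date(2023, 12, 31))

print(same_period[0], same_period[-1])  # → 2023-03-15 2023-03-20
print(parallel[0], parallel[1])
```

That is why `PARALLELPERIOD` suits whole-period comparisons (full prior year/quarter), while `SAMEPERIODLASTYEAR` suits like-for-like partial-period comparisons.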
  • Experience spans healthcare (patient data), healthcare insurance, and financial asset management.
  • Has worked with individual patient-level data in healthcare projects.
  • Acknowledges the need for data masking/hiding for sensitive information at business/executive levels, though not personally implemented.
  • Rates SQL proficiency as 6.5/10, comfortable with complex queries, temp tables, and CTEs.
  • Prefers performing transformations in SQL Server at the source rather than in Power Query/DAX.
Demonstrates adaptability across different industries and a thoughtful approach to data sensitivity and transformation strategies.
In healthcare projects, Vun could see individual patient names, ages, and conditions, but notes that for business-level reporting, data masking would likely be required.
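The masking the candidate alludes to (but did not implement) could be approached in several ways; one common pattern is deterministic pseudonymization, sketched below with assumed field names. Note that unsalted hashes of names remain vulnerable to dictionary attacks, so production systems typically add a secret salt or use tokenization.

```python
import hashlib

# Hypothetical masking sketch: replace patient names with stable pseudonyms
# so executive-level reports never expose identities. Fields are assumed.
patients = [{"name": "Alice Rao", "age": 52}, {"name": "Bob Lin", "age": 47}]

def mask(name: str) -> str:
    # Deterministic pseudonym: the same patient always maps to the same
    # token, so counts and joins on the masked column still work.
    return "PT-" + hashlib.sha256(name.encode()).hexdigest()[:8]

masked = [{"name": mask(p["name"]), "age": p["age"]} for p in patients]
print(masked[0]["name"].startswith("PT-"))  # → True
```

Deterministic tokens preserve grouping and joining in reports while hiding the raw identifier, which is usually the requirement for business-level dashboards.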
  • Distinguishes between Views (stored query definitions that persist in the database) and CTEs (named result sets scoped to a single query).
  • Performs transformations in SQL Server, including business calculations, quality checks, formatting, and percentage/window calculations.
  • Recalls creating a complex SQL query involving multiple nested CTEs and numerous joins.
  • Understands `LEFT ANTI JOIN` as a way to find rows in the left table that do not have a match in the right table.
Assesses the candidate's foundational database skills, which are critical for efficient data preparation and understanding data structures.
Created a 500-line SQL query using multiple nested CTEs and joins to aggregate and transform data before it even reached Power BI.
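The View/CTE distinction and the left-anti-join pattern can be demonstrated together. This sketch uses Python's built-in sqlite3 with assumed table names; since most dialects (including SQLite) have no `ANTI JOIN` keyword, the left-anti pattern is written as `LEFT JOIN ... WHERE right_key IS NULL`:

```python
import sqlite3

# Illustrative sketch (assumed tables): a View persists as a named object;
# a CTE exists only for the single statement that defines it.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders    (order_id INT, customer_id INT);
CREATE TABLE customers (customer_id INT, name TEXT);
INSERT INTO orders    VALUES (1, 10), (2, 10), (3, 99);
INSERT INTO customers VALUES (10, 'Acme'), (20, 'Globex');

-- A View: stored query definition, reusable by later queries.
CREATE VIEW order_counts AS
SELECT customer_id, COUNT(*) AS n FROM orders GROUP BY customer_id;
""")

# A CTE: named result set scoped to this one statement.
cur.execute("""
WITH no_orders AS (
    SELECT c.customer_id
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    WHERE o.order_id IS NULL        -- left-anti: customers with no orders
)
SELECT customer_id FROM no_orders
""")
print(cur.fetchall())  # → [(20,)]

# The View is still queryable afterwards; the CTE is not.
cur.execute("SELECT n FROM order_counts WHERE customer_id = 10")
print(cur.fetchone()[0])  # → 2
```

The anti-join here answers "which customers have no orders" — the same shape of question `LEFT ANTI JOIN` answers in engines that support it natively.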

Key takeaways

  1. Prioritize performing data transformations at the source (e.g., SQL Server) to optimize Power BI performance.
  2. Understand and apply different Power BI connection modes (Import, Direct Query) based on data volume and real-time requirements.
  3. Master data modeling techniques like Star Schema and the use of bridge tables to resolve complex relationships.
  4. Leverage Power BI Apps and Audiences for efficient and granular report sharing, especially in large organizations.
  5. Implement a structured development and deployment process, including unit testing and peer reviews, to ensure report quality.
  6. DAX functions like `CALCULATE`, time intelligence functions, and `ISINSCOPE` are essential for creating sophisticated calculations and visuals.
  7. Be aware of Power BI licensing implications for both developers and end-users when sharing reports.
  8. While working with sensitive data, consider and discuss data masking or anonymization techniques.

Key terms

Power BI · Import Mode · Direct Query · Star Schema · Fact Table · Dimension Table · Many-to-Many Relationship · Bridge Table · Bidirectional Relationship · Normalization · Premium Capacity · Pro License · Page-Level Security · Power BI Apps · Data Flows · Unit Testing · Workspace Roles · DAX · Time Intelligence Functions · USERELATIONSHIP · CTE (Common Table Expression) · View · Left Anti Join · Incremental Refresh · On-premises Gateway

Test your understanding

  1. How would you decide between using Power BI's Import mode versus Direct Query for a new data source, and what factors influence this decision?
  2. Describe a scenario where you would need to use a bridge table to resolve a data modeling challenge, and explain the steps involved.
  3. What are the key differences in licensing requirements for sharing a Power BI report when using a Premium capacity versus individual Pro licenses?
  4. Explain the purpose of page-level security in Power BI and how it differs from using Power BI Apps for content distribution.
  5. How do you ensure data accuracy and report quality before deploying a Power BI report to production, and what role does source control play in this process?

