Salesforce Analytics-Con-301 Exam (page: 2)
Salesforce Certified Tableau Consultant
Updated on: 12-Feb-2026

An executive-level workbook leverages 37 of the 103 fields included in a data source. Performance for the workbook is noticeably slower than other workbooks on the same Tableau Server.

What should the consultant do to improve performance of this workbook while following best practice?

  A. Split some visualizations on the dashboard into many smaller visualizations on the same dashboard.
  B. Connect to the data source via a custom SQL query.
  C. Use filters, hide unused fields, and aggregate values.
  D. Restrict users from accessing the workbook to reduce server load.

Answer(s): C

Explanation:

To improve the performance of a Tableau workbook, it is best practice to streamline the data being used. This can be achieved by using filters to limit the data to only what is necessary for analysis, hiding fields that are not being used to reduce the complexity of the data model, and aggregating values to simplify the data and reduce the number of rows that need to be processed. These steps can help reduce the load on the server and improve the speed of the workbook.
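As a rough illustration of why these steps help (a pandas sketch with hypothetical file and field names, not a Tableau feature), the same three ideas can be applied before the data ever reaches a visualization: read only the fields in use, filter early, and aggregate to the grain the charts actually display.

    import pandas as pd

    # Read only the fields the workbook uses (analogous to hiding unused fields).
    df = pd.read_csv("transactions.csv",
                     usecols=["Order Date", "Region", "Sales"],
                     parse_dates=["Order Date"])

    # Filter early so every downstream step processes fewer rows.
    df = df[df["Region"] == "West"]

    # Aggregate to the grain the dashboard displays (monthly totals here).
    monthly_sales = df.groupby(pd.Grouper(key="Order Date", freq="MS"))["Sales"].sum()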


Reference:

The best practices for optimizing workbook performance are well documented in Tableau's official resources, including the Tableau Help guide and the Designing Efficient Workbooks whitepaper, which provide detailed recommendations on how to streamline workbooks for better performance.



A client wants to see the average number of orders per customer per month, broken down by region. The client has created the following calculated field:
Orders per Customer: {FIXED [Customer ID]: COUNTD([Order ID])}

The client then creates a line chart that plots AVG(Orders per Customer) over MONTH(Order Date) by Region. The numbers shown by this chart are far higher than the client expects.

The client asks a consultant to rewrite the calculation so the result meets their expectation.

Which calculation should the consultant use?

  A. {INCLUDE [Customer ID]: COUNTD([Order ID])}
  B. {FIXED [Customer ID], [Region]: COUNTD([Order ID])}
  C. {EXCLUDE [Customer ID]: COUNTD([Order ID])}
  D. {FIXED [Customer ID], [Region], [Order Date]: COUNTD([Order ID])}

Answer(s): B

Explanation:

The calculation {FIXED [Customer ID], [Region]: COUNTD([Order ID])} is the correct one for this scenario. This Level of Detail (LOD) expression calculates the distinct count of orders for each customer within each region; that result is then averaged per month. This ensures the average number of orders per customer is computed at the correct granularity for each region before it is broken down by month, aligning with the client's expectations.

The client's original calculation overestimates the average number of orders per customer per month by region because it does not control granularity properly. The revised calculation must account for both the customer and the region to aggregate the data correctly:
FIXED Level of Detail Expression: The FIXED expression counts distinct order IDs for each customer within each region, ensuring that orders are grouped by both Customer ID and Region and addressing potential duplication and misaggregation.
Accurate Aggregation: Specifying both [Customer ID] and [Region] in the FIXED expression prevents the overcounting that occurs when only Customer ID is considered, especially when a customer orders from multiple regions.
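To make the difference concrete, the following pandas sketch (hypothetical data, not part of the original question) emulates both LOD expressions, with groupby(...).transform("nunique") playing the role of {FIXED ... : COUNTD(...)}:

    import pandas as pd

    orders = pd.DataFrame({
        "Customer ID": ["C1", "C1", "C1", "C2"],
        "Region":      ["East", "East", "West", "West"],
        "Order ID":    ["O1", "O2", "O3", "O4"],
    })

    # Original: {FIXED [Customer ID] : COUNTD([Order ID])}
    # C1 is credited with all 3 of its orders in every region it appears in.
    orders["Orders per Customer"] = (
        orders.groupby("Customer ID")["Order ID"].transform("nunique"))

    # Revised: {FIXED [Customer ID], [Region] : COUNTD([Order ID])}
    # C1 is credited with 2 orders in East and 1 in West.
    orders["Orders per Customer (Region)"] = (
        orders.groupby(["Customer ID", "Region"])["Order ID"].transform("nunique"))

Averaging the first column by region credits East with C1's West order as well, which is exactly the kind of overstatement the client observed.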


Reference:

Level of Detail Expressions in Tableau: These expressions allow you to specify the level of granularity you need for your calculations, independent of the visualization's level of detail, offering precise control over data aggregation. The use of {FIXED} expressions to set the granularity of a calculation is a common practice and is well documented in Tableau's official resources.



A client builds a dashboard that presents current and long-term stock measures. The data is currently at a daily level and is displayed as a bar chart showing monthly results for the current and previous years. Some measures must be presented as monthly averages.

What should the consultant recommend to limit the data source for optimal performance?

  A. Limit data to current and previous years and leave data at daily level to calculate the averages in the report.
  B. Limit data to current and previous years, move calculating averages to data layer, and aggregate dates to monthly level.
  C. Move calculating averages to data layer and aggregate dates to monthly level.
  D. Limit data to current and previous years as well as to the last day of each month to eliminate the need to use the averages.

Answer(s): B

Explanation:

For optimal performance, it is recommended to limit the data to what is necessary for analysis, which in this case would be the current and previous years. Moving the calculation of averages to the data layer and aggregating the dates to a monthly level will reduce the granularity of the data, thereby improving the performance of the dashboard. This approach aligns with best practices for optimizing workbook performance in Tableau, which suggest simplifying the data model and reducing the number of records processed.
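As an illustration of the recommended preparation (a pandas sketch with hypothetical file and field names; the same steps could equally be performed in the source database or Tableau Prep), the year limit and the monthly aggregation happen before the workbook ever queries the data:

    import pandas as pd

    daily = pd.read_csv("stock_measures.csv", parse_dates=["Date"])

    # Limit the data source to the current and previous years.
    daily = daily[daily["Date"].dt.year >= pd.Timestamp.today().year - 1]

    # Move the averages to the data layer by aggregating to the monthly level:
    # the workbook then receives roughly 24 rows per measure instead of ~730.
    monthly = (daily.groupby(pd.Grouper(key="Date", freq="MS"))["Stock Level"]
                    .mean()
                    .reset_index())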


Reference:

The recommendation is based on the guidelines provided in Tableau's official documentation on optimizing workbook performance, which includes tips on data management and aggregation for better performance.



A consultant builds a report where profit margin is calculated as SUM([Profit]) / SUM([Sales]). Three groups of users are organized on Tableau Server with the following levels of data access that they can be granted.

. Group 1: Viewers who cannot see any information on profitability
. Group 2: Viewers who can see profit and profit margin
. Group 3: Viewers who can see profit margin but not the value of profit

Which approach should the consultant use to provide the required level of access?

  A. Use user filters to access data on profitability to all groups. Then, create a calculated field that allows visibility of profit value to Group 2 and use the calculation in the view in the report.
  B. Specify in the row-level security (RLS) entitlement table individuals who can see profit, profit margin, or none of these. Then, use the table data to create user filters in the report.
  C. Use user filters to allow only Groups 2 and 3 access to data on profitability. Then, create a calculated field that limits visibility of profit value to Group 2 and use the calculation in the view in the report.
  D. Specify with user filters in each view individuals who can see profit, profit margin, or none of these.

Answer(s): C

Explanation:

The approach of using user filters to control access to data on profitability for Groups 2 and 3, combined with a calculated field that restricts the visibility of profit value to only Group 2, aligns with Tableau's best practices for managing content permissions. This method ensures that each group sees only the data they are permitted to view, with Group 1 not seeing any profitability information, Group 2 seeing both profit and profit margin, and Group 3 seeing only the profit margin without the actual profit values. This setup can be achieved through Tableau Server's permission capabilities, which allow for detailed control over what each user or group can see and interact with.
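In Tableau itself, the visibility rule would typically live in a calculated field, for example one built on the ISMEMBEROF function. The Python sketch below is only an illustration of the intended visibility matrix for the three groups in the scenario, not Tableau syntax:

    def visible_measures(user_groups):
        """Which profitability measures a viewer may see, by group membership."""
        return {
            # Only Group 2 may see the underlying profit value.
            "profit": "Group 2" in user_groups,
            # Groups 2 and 3 may see the ratio SUM([Profit]) / SUM([Sales]).
            "profit margin": bool({"Group 2", "Group 3"} & set(user_groups)),
        }

    print(visible_measures({"Group 1"}))  # {'profit': False, 'profit margin': False}
    print(visible_measures({"Group 3"}))  # {'profit': False, 'profit margin': True}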


Reference:

The solution is based on the capabilities and permission rules that are part of Tableau Server's security model, as detailed in the official Tableau documentation. These resources provide guidance on how to set up user filters and calculated fields to manage data access levels effectively.



A company has a data source for sales transactions. The data source has the following characteristics:

. Millions of transactions occur weekly.
. The transactions are added nightly.
. Incorrect transactions are revised every week on Saturday.
. The end users need to see up-to-date data daily.

A consultant needs to publish a data source in Tableau Server to ensure that all the transactions in the data source are available.

What should the consultant do to create and publish the data?

  A. Publish an incremental extract refresh every day and perform a full extract refresh every Saturday.
  B. Publish a live connection to Tableau Server.
  C. Publish an incremental refresh every Saturday.
  D. Publish an incremental extract refresh every day and publish a secondary data set containing data revisions.

Answer(s): A

Explanation:

Given the need for up-to-date data on a daily basis and weekly revisions, the best approach is to use an incremental extract refresh daily to update the data source with new transactions. On Saturdays, when incorrect transactions are revised, a full extract refresh should be performed to incorporate all revisions and ensure the data's accuracy. This strategy allows end users to access the most current data throughout the week while also accounting for any necessary corrections.
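The decision rule behind this schedule is simple enough to state in code. The sketch below only illustrates the logic; in practice the two refresh types would be configured as extract refresh schedules on Tableau Server:

    import datetime

    def refresh_type(run_date: datetime.date) -> str:
        # date.weekday(): Monday == 0 ... Sunday == 6, so Saturday == 5.
        # Saturday: a full refresh re-extracts everything, picking up the weekly revisions.
        # Other nights: an incremental refresh appends only the newly added transactions.
        return "full" if run_date.weekday() == 5 else "incremental"

    assert refresh_type(datetime.date(2026, 2, 14)) == "full"  # 2026-02-14 is a Saturday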


Reference:

The solution is based on best practices for managing data sources in Tableau Server, which recommend using incremental refreshes for frequent updates and full refreshes when significant changes or corrections are made to the data.



A Tableau Cloud client has requested a custom dashboard to help track which data sources are used most frequently in dashboards across their site.

Which two actions should the client use to access the necessary metadata? Choose two.

  A. Connect directly to the Site Content data source within the Admin Insights project.
  B. Query metadata through the GraphiQL engine.
  C. Access metadata through the Metadata API.
  D. Download metadata through Tableau Catalog.

Answer(s): B,C

Explanation:

To track which data sources are used most frequently across a site in Tableau Cloud, the client should use the GraphiQL engine and the Metadata API. The GraphiQL engine allows for interactive exploration of the metadata, making it easier to construct and test queries. The Metadata API provides access to metadata and lineage of external assets used by the content published to Tableau Cloud, which is essential for tracking data source usage.
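As a sketch of what such a query could look like, the snippet below posts a GraphQL query to the Metadata API with Python's requests library. The /api/metadata/graphql endpoint and the X-Tableau-Auth header are standard for the Metadata API, but the site URL and token here are placeholders, and the field names (publishedDatasources, downstreamWorkbooks) should be confirmed interactively in GraphiQL:

    import requests

    URL = "https://example.online.tableau.com/api/metadata/graphql"  # placeholder site
    TOKEN = "<token from a REST API sign-in>"                        # placeholder credential

    # Count the workbooks downstream of each published data source.
    query = """
    {
      publishedDatasources {
        name
        downstreamWorkbooks { name }
      }
    }
    """

    resp = requests.post(URL, json={"query": query},
                         headers={"X-Tableau-Auth": TOKEN})
    resp.raise_for_status()
    for ds in resp.json()["data"]["publishedDatasources"]:
        print(ds["name"], len(ds["downstreamWorkbooks"]))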


Reference:

The actions are based on the capabilities of the GraphiQL engine and the Metadata API as described in Tableau's official documentation and learning resources.



A client wants to report Saturday and Sunday regardless of the workbook's data source's locale settings.

Which calculation should the consultant recommend?

  A. DATEPART('weekday', [Order Date])>=6
  B. DATEPART('iso-weekday', [Order Date])>=6
  C. DATENAME('iso-weekday', [Order Date])>=6
  D. DATEPART('iso-weekday', [Order Date])=1 or DATEPART('iso-weekday', [Order Date])=7

Answer(s): D

Explanation:

The calculation DATEPART('iso-weekday', [Order Date])=1 or DATEPART('iso-weekday', [Order Date])=7 is recommended because the ISO standard fixes the weekday numbering, treating Monday as the first day of the week (1) and Sunday as the last day (7). Because that numbering never shifts with locale, the calculation identifies the same days regardless of the locale settings of the workbook's data source, ensuring that the report includes the weekend days the client specified.


Reference:

The use of the 'iso-weekday' date part in the DATEPART function is consistent with the ISO 8601 standard, which is independent of locale settings. This approach is supported by Tableau's documentation on date functions and their behavior under different locale settings.

To accurately identify weekends across different locale settings, using the 'iso-weekday' component is reliable as it is consistent across various locales:
ISO Weekday Function: The ISO standard treats Monday as the first day of the week (1), which makes Sunday the seventh day (7). This standardization helps avoid discrepancies in weekday calculations that might arise due to locale-specific settings.
Identifying Weekends: The calculation tests the 'iso-weekday' value explicitly for the two days the client wants to report rather than relying on a threshold whose meaning shifts with the locale's first day of the week, so the same days are flagged under every locale setting.


Handling Locale-Specific Settings: Using ISO standards in date functions allows for uniform results across systems with differing locale settings, essential for consistent reporting in global applications.
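The locale-independence of ISO weekday numbering is easy to verify outside Tableau. As a point of comparison (an illustration, not Tableau behavior), Python's datetime follows the same ISO 8601 convention: isoweekday() returns Monday = 1 through Sunday = 7 no matter what locale the system uses.

    import datetime

    # isoweekday() follows ISO 8601: Monday == 1 ... Sunday == 7,
    # and its result never depends on the runtime's locale settings.
    monday = datetime.date(2026, 2, 9)  # this date is a Monday
    for offset in range(7):
        d = monday + datetime.timedelta(days=offset)
        print(d.strftime("%A"), d.isoweekday())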



A client uses Tableau Data Management and notices that when they view a data source, they sometimes see a different count of workbooks in the Connected Workbooks tab compared to the lineage count in Tableau Catalog.

What is the cause of this discrepancy?

  A. Some workbooks have been connected to the data source, but do not use any fields from it.
  B. Some workbooks have not been viewed by enough users yet.
  C. Some of the workbooks connected to the data source are not visible to the user due to permissions.
  D. Some Creators have connected to the data source in Tableau Desktop but have not yet published a workbook.

Answer(s): C

Explanation:

The discrepancy between the count of workbooks in the Connected Workbooks tab and the lineage count in Tableau Catalog can occur because of user permissions. In Tableau Data Management, the visibility of connected workbooks is subject to the permissions set by administrators. If a user does not have permission to view certain workbooks, they will not see them listed in the Connected Workbooks tab, even though these workbooks are part of the data source's lineage and are counted in Tableau Catalog.


Reference:

This explanation is based on the functionality of Tableau Data Management and Tableau Catalog, which includes managing user permissions and access to workbooks. The information is supported by Tableau's official documentation on data management and security practices.





