Microsoft AB-730 (page: 2)

Microsoft AI Business Professional

Updated 17-Apr-2026

You are a merchandiser who is planning for the upcoming season.

You prompt Microsoft 365 Copilot to suggest which products to stock based on historical sales data. Without reviewing the suggestions or checking current market trends, you place a large order based solely on the output of Copilot.

What is this an example of?

  1. verification
  2. overreliance
  3. fabrication
  4. prompt injection

Answer(s): B

Explanation:

Relying solely on Microsoft 365 Copilot for high-stakes inventory decisions constitutes overreliance, a known risk where users trust AI outputs without sufficient critical evaluation or human oversight.
Key Risks of Overreliance in Inventory Management
Context Blindness: Copilot excels at structured historical data but cannot account for "real-world" nuances like sudden geopolitical shifts, local weather events, or unrecorded competitor activity.
Data Integrity Issues: AI recommendations are only as good as the input data. If historical sales data is inconsistent, incomplete, or contains "garbage" entries, Copilot will generate inaccurate forecasts.
Algorithmic Bias: If previous manual stocking decisions were biased--for example, favoring certain regions or suppliers--the AI may reinforce these patterns, leading to skewed results.
Hallucinations: Like all Large Language Models (LLMs), Copilot can "hallucinate" or provide plausible-sounding but factually incorrect data points, which can be catastrophic for large financial orders.
How to Maintain Proper Human Oversight
To mitigate these risks, organizations should treat Copilot as an assistive tool rather than a primary decision- maker.



You are comparing the difference between Microsoft 365 Copilot and Microsoft 365 Copilot Chat.

What is available in both versions?

  1. the Researcher agent
  2. Copilot Pages
  3. Copilot Notebooks

Answer(s): B

Explanation:

A key feature available in both Microsoft 365 Copilot and Microsoft 365 Copilot Chat is Enterprise Data Protection (EDP), which ensures that user data and prompts are not used to train the underlying AI models.
Both versions also support key productivity features when signed in with a work or school account, including:
Web-grounded chat: Both can generate answers based on public web data and the latest Large Language Models (LLMs).
File Uploads: Both allow users to upload files to summarize or analyze.
Image Generation: Both can create images directly in the chat interface.
*-> Copilot Pages: Both allow users to turn AI-generated responses into editable, shareable documents.
Contextual Awareness: Both can understand the context of an open file (Word, Excel, PowerPoint) or email (Outlook) in the side pane.



You are discussing Microsoft 365 Copilot with a colleague. The colleague asks which data Copilot uses to answer questions when using the Work scope.

What should you tell your colleague?

  1. Copilot provides responses based only on data that the user can access and the general knowledge that Copilot was trained on.
  2. Copilot provides responses based on all the data in your organization's Microsoft 365 environment and the general knowledge that Copilot was trained on.
  3. Copilot provides responses based only on data that the user can access.
  4. Copilot provides responses based only on the general knowledge that Copilot was trained on.

Answer(s): A

Explanation:

Microsoft 365 Copilot uses a combination of its general training data (the underlying Large Language Model) and your organization's data when operating in the "Work" scope. However, it is designed to prioritize your organizational data first, a process known as "grounding".
Here is how Microsoft 365 Copilot uses knowledge in the Work scope:
1. Grounding in Your Data (Primary Source)
When you ask a question in the work scope (e.g., in Teams, Outlook, or via the M365 app), Copilot uses Retrieval Augmented Generation (RAG) to "ground" its answer in your organization's data via Microsoft Graph.
What it accesses: Files, emails, chats, calendars, and contacts that you have permission to view.
Purpose: To provide accurate, personalized, and context-specific answers rather than generic ones.
2. General Knowledge (Fallback Source)
If Copilot cannot find relevant information within your organizational data to answer your prompt, it may use its
underlying training data to provide a response.
This is typically used for broader, conceptual, or general knowledge questions that are not specific to your company's documents or communications.



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: Yes
Yes - Microsoft 365 Copilot can surface data from Microsoft Teams chats.

Microsoft 365 Copilot can surface, summarize, and analyze data from Microsoft Teams chats, including 1:1, group chats, and channel conversations. It uses Microsoft Graph to access this information to help users catch up on missed conversations, identify key points, and find shared links or files.

Box 2: No
No - Microsoft 365 Copilot can schedule, draft, and send emails based on certain conditions.

Microsoft 365 Copilot in Outlook can draft, summarize, and assist with scheduling emails based on context and specific user-defined criteria, acting as an AI assistant to manage inboxes. However, Copilot does not send emails automatically; it requires user approval for all generated drafts to ensure accuracy and prevent premature or incorrect communication.

Box 3: Yes
Yes - Microsoft 365 Copilot can summarize, draft, and answer questions about content from Microsoft Teams

meetings.

Microsoft 365 Copilot in Teams summarizes, drafts, and analyzes meeting content, allowing users to track decisions, identify action items, and query specific discussions in real-time or after, provided the meeting is transcribed. It works with Microsoft Teams Premium or Microsoft 365 Copilot licenses to generate recaps, including chat history, and supports in-meeting queries like "where do we disagree?".



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: No
No - Microsoft 365 Copilot is trained on your organization's business data.

Microsoft 365 Copilot is designed to use your organization's business data, but it is crucial to understand that it is not "trained" on it in the traditional sense of modifying its foundation AI models. Instead, Copilot uses a process called grounding, which connects Large Language Models (LLMs) to your company's data (emails, chats, documents, etc.) in real-time to provide contextually relevant answers.

Here is a breakdown of how Microsoft 365 Copilot uses and protects your data:
How Copilot Uses Organizational Data

Real-Time Access via Microsoft Graph: Copilot uses Microsoft Graph to access data that an individual user already has permission to view, including emails, chats, meetings, and documents, to generate answers.

Contextual Understanding: It uses this information to help with tasks like summarizing a meeting, drafting

emails based on previous communication, or searching for information across documents.

Scope of Access: Copilot is restricted to the data within your organization's Microsoft 365 tenant boundary.
Data Security and Privacy Protections
*-> No Training on Data: Prompts, responses, and data accessed through Microsoft Graph are not used to train the base foundation models that power Copilot.

Data Stays Within Boundary: Customer data remains within the Microsoft 365 service boundary. Permissions Are Respected: Copilot adheres to the same access controls and security policies (e.g., sensitivity labels) that your organization has already put in place.

No Data Sharing with Third Parties: Customer data processed by Copilot is not shared with third-party models or used for advertising.

Box 2: Yes
Yes - Microsoft 365 Copilot can summarize emails in mailboxes that were shared with you.

Microsoft 365 Copilot can summarize emails and analyze content in shared or delegated mailboxes, provided the user has at least read access. Users can utilize prompts in the Copilot chat to summarize recent messages, identify key topics, or extract actions from shared accounts.

Box 3: Yes
Yes - Microsoft 365 Copilot leverages the security framework defined in Microsoft 365 to provide you with access to emails, documents, and other enterprise data.

Microsoft 365 Copilot leverages the existing security, privacy, and compliance framework of Microsoft 365 to ensure that it only accesses data a user is already authorized to view. By operating within the Microsoft 365 service boundary, Copilot respects established permissions, ensuring that sensitive information is not exposed to unauthorized users.



HOTSPOT

You use Microsoft 365 Copilot.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: No
No - You can delete your conversation history only if your administrator allows you to do so.

Users can generally delete their Microsoft 365 Copilot conversation history, which often syncs across apps like Teams and Word to protect privacy. While users can manage their own data via the privacy dashboard, specific organization-wide retention policies set by administrators may dictate how long data is stored or if it can be permanently removed.

Box 2: No
No - When you delete a conversation, the associated files stored in Microsoft SharePoint are also deleted.

Deleting a Microsoft 365 Copilot conversation does not automatically delete associated files stored in SharePoint or OneDrive. While the interaction history (prompts and answers) is removed, saved files, documents, or content generated by Copilot that were saved in Microsoft 365 apps remain intact.

Chat History vs. Files: Deleting Copilot chat history removes the conversation record but not the files linked or created in the process.

Copilot Pages/Notebooks: If a dedicated Copilot Page was created, it is stored in a SharePoint container. While individual notebook deletions are not recoverable by end-users, this generally applies to the workspace itself rather than external files referenced, say Microsoft Learn.

Persistent Data: Any content Copilot helped generate that you have already saved in Word, PowerPoint, or other apps is preserved.

Deletion Scope: The deletion process applies to activity history across Excel, Forms, Loop, OneNote, Outlook, Planner, PowerPoint, Teams, and Word.

To remove files, you must manually delete them from their stored location in SharePoint or OneDrive.

Box 3: Yes
Yes - When you uninstall Copilot agent, all the conversations associated with the agent are retained.

When you uninstall a Microsoft 365 Copilot agent, conversations and saved prompts associated with that agent are generally retained in the user's history and the Copilot Prompt Gallery. The underlying interaction data, stored in Dataverse or user mailboxes, persists for 18 months by default.

Key details regarding agent uninstallation:
Saved Prompts: Even if an agent is removed, saved prompts remain in the Microsoft Support Copilot Prompt Gallery.

*-> Conversation History: Chats with agents (including workflow agents) are retained in Microsoft 365 conversation history, not deleted upon uninstallation.

Data Retention: Copilot typically retains interaction history for 18 months, which can be managed via Microsoft

Support privacy settings.

Reinstallation: If the agent is reinstalled, previously saved prompts can be reassociated with it.

To permanently delete conversation history, users must manually clear it from their Microsoft Support privacy dashboard or through specific retention policies.



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: No
No - Microsoft 365 Copilot uses your business data to train underlying AI models.

No Training on Customer Data: Microsoft does not use customer data, prompts, or interactions with the Microsoft Graph (emails, documents, chat history) to train the foundation Large Language Models (LLMs) that power Microsoft 365 Copilot.

Data Stays within Tenant: Your data, prompts, and Copilot's responses remain within your Microsoft 365 service boundary, adhering to your organization's existing privacy, security, and compliance policies.

Enterprise Data Protection (EDP): For users with an organizational Entra ID account, Copilot operates with EDP, ensuring that prompt and response data are not used for model training and are not accessed by Microsoft employees without explicit permission.

Respects Permissions: Copilot only accesses data that a specific user already has permission to view within

their Microsoft 365 environment (SharePoint, OneDrive, etc.).

Exceptions (Consumer vs. Commercial): While commercial data (Entra ID) is excluded from training, Microsoft may use data from consumer Copilot services (like free Bing/MSN interactions) to train models, but users can opt out.

Box 2: No
No - Microsoft 365 Copilot automatically assigns permissions to resources, so that users can find information more easily.

Microsoft 365 Copilot is designed to strictly respect existing permissions and security, compliance, and privacy policies already established within an organization's Microsoft 365 environment.

Here is a breakdown of how Copilot handles permissions:

Respects Current Access Controls: Copilot only accesses data (emails, chats, documents) that an individual user is already authorized to view, based on existing role-based access controls (RBAC), SharePoint permissions, and OneDrive settings.

No Automatic Permission Changes: Copilot does not change, assign, or create new permissions for files to make them easier to find. It operates within the bounds of what is already shared with the user.

Oversharing Risk: Because Copilot relies on existing permissions, if files have been improperly or overly shared in the past, Copilot will allow users to find that information. It is not a tool to manage or "fix" permissions automatically, but rather to access data based on them.

Data Security: Copilot respects sensitivity labels and encryption (like Microsoft Purview), meaning it will not expose content that is locked down.

Box 3: Yes
Yes - Microsoft 365 Copilot uses the same underlying data access controls as other Microsoft 365 services to ensure that users are presented with only information to which they have access.

Microsoft 365 Copilot is designed to inherit and strictly adhere to the existing security, compliance, and data privacy policies established within your Microsoft 365 tenant.

Here is how Microsoft 365 Copilot ensures users only access information they are authorized to see:

Microsoft Graph Integration: Copilot uses Microsoft Graph to access user data (emails, chats, documents) only within the user's unique context and based on their existing permissions.

Permission Model Inheritance: Copilot honors SharePoint, OneDrive, and Microsoft Entra (formerly Azure AD) permissions. If a user does not have access to a document, Copilot cannot access or summarize it for them.

Data Protection Mechanisms: Copilot respects sensitivity labels and data loss prevention (DLP) policies, ensuring that sensitive data is handled appropriately.

Secure Scope: Copilot operates within the Microsoft 365 service boundary, meaning user prompts and data do not leave the secure environment to train the foundation models.

Zero Trust Approach: Copilot treats input prompts as potentially unsafe, using a zero-trust architecture to prevent data leaks.



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: Yes
Yes - Prompt injection can cause data exposure and unintended system behavior

Prompt injection in Microsoft 365 (M365) Copilot is a significant security risk that can lead to unauthorized data exposure and unintended system behavior. Through techniques like indirect prompt injection, attackers can embed hidden instructions in documents, emails, or websites that Copilot parses, causing it to override its intended, safe behavior and act on malicious commands.

Box 2: Yes
Yes - Prompt injection can be used to embed malicious or harmful instructions.

Box 3: Yes
Yes - Prompt injection is a security vulnerability unique to generative AI systems that interpret and respond to natural language inputs.

Prompt Injection is one of the most significant shifts in the "threat model" because it turns natural language into a potential delivery mechanism for malicious commands.
In the context of M365 Copilot, this risk is unique because the AI has access to your sensitive organizational data (emails, files, chats).



Page 2 of 12

Share your comments for Microsoft AB-730 exam with other users:

Terry 5/24/2023 4:41:00 PM

i can practice for exam
Anonymous


Emerys 7/29/2023 6:55:00 AM

please i need this exam.
Anonymous


Goni Mala 9/2/2023 12:27:00 PM

i need the dump
Anonymous


Lenny 9/29/2023 11:30:00 AM

i want it bad, even if cs6 maybe retired, i want to learn cs6
HONG KONG


MilfSlayer 12/28/2023 8:32:00 PM

i hate comptia with all my heart with their "choose the best" answer format as an argument could be made on every question. they say "the "comptia way", lmao no this right here boys is the comptia way 100%. take it from someone whos failed this exam twice but can configure an entire complex network that these are the questions that are on the test 100% no questions asked. the pbqs are dead on! nice work
Anonymous


Swati Raj 11/14/2023 6:28:00 AM

very good materials
UNITED STATES


Ko Htet 10/17/2023 1:28:00 AM

thanks for your support.
Anonymous


Philippe 1/22/2023 10:24:00 AM

iam impressed with the quality of these dumps. they questions and answers were easy to understand and the xengine app was very helpful to use.
CANADA


Sam 8/31/2023 10:32:00 AM

not bad but you question database from isaca
MALAYSIA


Brijesh kr 6/29/2023 4:07:00 AM

awesome contents
INDIA


JM 12/19/2023 1:22:00 PM

answer to 134 is casb. while data loss prevention is the goal, in order to implement dlp in cloud applications you need to deploy a casb.
UNITED STATES


Neo 7/26/2023 9:36:00 AM

are these brain dumps sufficient enough to go write exam after practicing them? or does one need more material this wont be enough?
SOUTH AFRICA


Bilal 8/22/2023 6:33:00 AM

i did attend the required cources and i need to be sure that i am ready to take the exam, i would ask you please to share the questions, to be sure that i am fit to proceed with taking the exam.
Anonymous


John 11/12/2023 8:48:00 PM

why only give explanations on some, and not all questions and their respective answers?
UNITED STATES


Biswa 11/20/2023 8:50:00 AM

refresh db knowledge
Anonymous


Shalini Sharma 10/17/2023 8:29:00 AM

interested for sap certification
JAPAN


ethan 9/24/2023 12:38:00 PM

could you please upload practice questions for scr exam ?
HONG KONG


vijay joshi 8/19/2023 3:15:00 AM

please upload free oracle cloud infrastructure 2023 foundations associate exam braindumps
Anonymous


Ayodele Talabi 8/25/2023 9:25:00 PM

sweating! they are tricky
CANADA


Romero 3/23/2022 4:20:00 PM

i never use these dumps sites but i had to do it for this exam as it is impossible to pass without using these question dumps.
UNITED STATES


John Kennedy 9/20/2023 3:33:00 AM

good practice and well sites.
Anonymous


Nenad 7/12/2022 11:05:00 PM

passed my first exam last week and pass the second exam this morning. thank you sir for all the help and these brian dumps.
INDIA


Lucky 10/31/2023 2:01:00 PM

does anyone who attended exam csa 8.8, can confirm these questions are really coming ? or these are just for practicing?
HONG KONG


Prateek 9/18/2023 11:13:00 AM

kindly share the dumps
UNITED STATES


Irfan 11/25/2023 1:26:00 AM

very nice content
Anonymous


php 6/16/2023 12:49:00 AM

passed today
Anonymous


Durga 6/23/2023 1:22:00 AM

hi can you please upload questions
Anonymous


JJ 5/28/2023 4:32:00 AM

please upload quetions
THAILAND


Norris 1/3/2023 8:06:00 PM

i passed my exam thanks to this braindumps questions. these questions are valid in us and i highly recommend it!
UNITED STATES


abuti 7/21/2023 6:10:00 PM

are they truely latest
Anonymous


Curtis Nakawaki 7/5/2023 8:46:00 PM

questions appear contemporary.
UNITED STATES


Vv 12/2/2023 6:31:00 AM

good to prepare in this site
UNITED STATES


praveenkumar 11/20/2023 11:57:00 AM

very helpful to crack first attempt
Anonymous


asad Raza 5/15/2023 5:38:00 AM

please upload this exam
CHINA


AI Tutor 👋 I’m here to help!