Microsoft AB-730 Exam (page: 2)
Microsoft AI Business Professional
Updated on: 28-Feb-2026

Viewing Page 2 of 9

You are a merchandiser who is planning for the upcoming season.

You prompt Microsoft 365 Copilot to suggest which products to stock based on historical sales data. Without reviewing the suggestions or checking current market trends, you place a large order based solely on the output of Copilot.

What is this an example of?

  1. verification
  2. overreliance
  3. fabrication
  4. prompt injection

Answer(s): B

Explanation:

Relying solely on Microsoft 365 Copilot for high-stakes inventory decisions constitutes overreliance, a known risk where users trust AI outputs without sufficient critical evaluation or human oversight.
Key Risks of Overreliance in Inventory Management
Context Blindness: Copilot excels at structured historical data but cannot account for "real-world" nuances like sudden geopolitical shifts, local weather events, or unrecorded competitor activity.
Data Integrity Issues: AI recommendations are only as good as the input data. If historical sales data is inconsistent, incomplete, or contains "garbage" entries, Copilot will generate inaccurate forecasts.
Algorithmic Bias: If previous manual stocking decisions were biased--for example, favoring certain regions or suppliers--the AI may reinforce these patterns, leading to skewed results.

Hallucinations: Like all Large Language Models (LLMs), Copilot can "hallucinate" or provide plausible-sounding but factually incorrect data points, which can be catastrophic for large financial orders.
How to Maintain Proper Human Oversight
To mitigate these risks, organizations should treat Copilot as an assistive tool rather than a primary decision- maker.


Reference:

https://precoro.com/blog/ai-in-procurement



You are comparing the difference between Microsoft 365 Copilot and Microsoft 365 Copilot Chat.

What is available in both versions?

  1. the Researcher agent
  2. Copilot Pages
  3. Copilot Notebooks

Answer(s): B

Explanation:

A key feature available in both Microsoft 365 Copilot and Microsoft 365 Copilot Chat is Enterprise Data Protection (EDP), which ensures that user data and prompts are not used to train the underlying AI models.
Both versions also support key productivity features when signed in with a work or school account, including:
Web-grounded chat: Both can generate answers based on public web data and the latest Large Language Models (LLMs).
File Uploads: Both allow users to upload files to summarize or analyze.
Image Generation: Both can create images directly in the chat interface.
*-> Copilot Pages: Both allow users to turn AI-generated responses into editable, shareable documents.
Contextual Awareness: Both can understand the context of an open file (Word, Excel, PowerPoint) or email (Outlook) in the side pane.


Reference:

https://support.microsoft.com/en-us/topic/how-copilot-chat-works-with-and-without-a-microsoft-365-copilot- license-5810b659-fbe0-48ee-9fe6-d731fe86cdeb



You are discussing Microsoft 365 Copilot with a colleague. The colleague asks which data Copilot uses to answer questions when using the Work scope.

What should you tell your colleague?

  1. Copilot provides responses based only on data that the user can access and the general knowledge that Copilot was trained on.
  2. Copilot provides responses based on all the data in your organization's Microsoft 365 environment and the general knowledge that Copilot was trained on.
  3. Copilot provides responses based only on data that the user can access.
  4. Copilot provides responses based only on the general knowledge that Copilot was trained on.

Answer(s): A

Explanation:

Microsoft 365 Copilot uses a combination of its general training data (the underlying Large Language Model) and your organization's data when operating in the "Work" scope. However, it is designed to prioritize your organizational data first, a process known as "grounding".
Here is how Microsoft 365 Copilot uses knowledge in the Work scope:
1. Grounding in Your Data (Primary Source)
When you ask a question in the work scope (e.g., in Teams, Outlook, or via the M365 app), Copilot uses Retrieval Augmented Generation (RAG) to "ground" its answer in your organization's data via Microsoft Graph.
What it accesses: Files, emails, chats, calendars, and contacts that you have permission to view.
Purpose: To provide accurate, personalized, and context-specific answers rather than generic ones.
2. General Knowledge (Fallback Source)
If Copilot cannot find relevant information within your organizational data to answer your prompt, it may use its underlying training data to provide a response.
This is typically used for broader, conceptual, or general knowledge questions that are not specific to your company's documents or communications.


Reference:

https://support.microsoft.com/en-us/topic/what-information-does-copilot-use-to-answer-my-prompt-934f537d- ff7d-4059-9fec-a751e4651307



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: Yes
Yes - Microsoft 365 Copilot can surface data from Microsoft Teams chats.

Microsoft 365 Copilot can surface, summarize, and analyze data from Microsoft Teams chats, including 1:1, group chats, and channel conversations. It uses Microsoft Graph to access this information to help users catch up on missed conversations, identify key points, and find shared links or files.

Box 2: No
No - Microsoft 365 Copilot can schedule, draft, and send emails based on certain conditions.

Microsoft 365 Copilot in Outlook can draft, summarize, and assist with scheduling emails based on context and specific user-defined criteria, acting as an AI assistant to manage inboxes. However, Copilot does not send emails automatically; it requires user approval for all generated drafts to ensure accuracy and prevent premature or incorrect communication.

Box 3: Yes
Yes - Microsoft 365 Copilot can summarize, draft, and answer questions about content from Microsoft Teams meetings.

Microsoft 365 Copilot in Teams summarizes, drafts, and analyzes meeting content, allowing users to track decisions, identify action items, and query specific discussions in real-time or after, provided the meeting is transcribed. It works with Microsoft Teams Premium or Microsoft 365 Copilot licenses to generate recaps, including chat history, and supports in-meeting queries like "where do we disagree?".


Reference:

https://scribill.se/en/ai-guides/how-to-use-m365-copilot-in-outlook/ https://nboldapp.com/microsoft-copilot-in-teams-revolutionizing-meetings-and-chats-with-ai/



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:

Box 1: No
No - Microsoft 365 Copilot is trained on your organization's business data.

Microsoft 365 Copilot is designed to use your organization's business data, but it is crucial to understand that it is not "trained" on it in the traditional sense of modifying its foundation AI models. Instead, Copilot uses a process called grounding, which connects Large Language Models (LLMs) to your company's data (emails, chats, documents, etc.) in real-time to provide contextually relevant answers.

Here is a breakdown of how Microsoft 365 Copilot uses and protects your data:
How Copilot Uses Organizational Data

Real-Time Access via Microsoft Graph: Copilot uses Microsoft Graph to access data that an individual user already has permission to view, including emails, chats, meetings, and documents, to generate answers.

Contextual Understanding: It uses this information to help with tasks like summarizing a meeting, drafting emails based on previous communication, or searching for information across documents.

Scope of Access: Copilot is restricted to the data within your organization's Microsoft 365 tenant boundary.
Data Security and Privacy Protections
*-> No Training on Data: Prompts, responses, and data accessed through Microsoft Graph are not used to train the base foundation models that power Copilot.

Data Stays Within Boundary: Customer data remains within the Microsoft 365 service boundary. Permissions Are Respected: Copilot adheres to the same access controls and security policies (e.g., sensitivity labels) that your organization has already put in place.

No Data Sharing with Third Parties: Customer data processed by Copilot is not shared with third-party models or used for advertising.

Box 2: Yes
Yes - Microsoft 365 Copilot can summarize emails in mailboxes that were shared with you.

Microsoft 365 Copilot can summarize emails and analyze content in shared or delegated mailboxes, provided the user has at least read access. Users can utilize prompts in the Copilot chat to summarize recent messages, identify key topics, or extract actions from shared accounts.

Box 3: Yes
Yes - Microsoft 365 Copilot leverages the security framework defined in Microsoft 365 to provide you with access to emails, documents, and other enterprise data.

Microsoft 365 Copilot leverages the existing security, privacy, and compliance framework of Microsoft 365 to ensure that it only accesses data a user is already authorized to view. By operating within the Microsoft 365 service boundary, Copilot respects established permissions, ensuring that sensitive information is not exposed to unauthorized users.


Reference:

https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-architecture https://support.microsoft.com/en-us/topic/use-copilot-in-shared-mailboxes-and-delegate-mailboxes-3e7e5130- eabe-4c19-94ea-117b2a4c14d6



HOTSPOT

You use Microsoft 365 Copilot.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: No
No - You can delete your conversation history only if your administrator allows you to do so.

Users can generally delete their Microsoft 365 Copilot conversation history, which often syncs across apps like Teams and Word to protect privacy.
While users can manage their own data via the privacy dashboard, specific organization-wide retention policies set by administrators may dictate how long data is stored or if it can be permanently removed.

Box 2: No
No - When you delete a conversation, the associated files stored in Microsoft SharePoint are also deleted.

Deleting a Microsoft 365 Copilot conversation does not automatically delete associated files stored in SharePoint or OneDrive.
While the interaction history (prompts and answers) is removed, saved files, documents, or content generated by Copilot that were saved in Microsoft 365 apps remain intact.

Chat History vs. Files: Deleting Copilot chat history removes the conversation record but not the files linked or created in the process.

Copilot Pages/Notebooks: If a dedicated Copilot Page was created, it is stored in a SharePoint container.
While individual notebook deletions are not recoverable by end-users, this generally applies to the workspace itself rather than external files referenced, say Microsoft Learn.

Persistent Data: Any content Copilot helped generate that you have already saved in Word, PowerPoint, or other apps is preserved.

Deletion Scope: The deletion process applies to activity history across Excel, Forms, Loop, OneNote, Outlook, Planner, PowerPoint, Teams, and Word.

To remove files, you must manually delete them from their stored location in SharePoint or OneDrive.

Box 3: Yes
Yes - When you uninstall Copilot agent, all the conversations associated with the agent are retained.

When you uninstall a Microsoft 365 Copilot agent, conversations and saved prompts associated with that agent are generally retained in the user's history and the Copilot Prompt Gallery. The underlying interaction data, stored in Dataverse or user mailboxes, persists for 18 months by default.

Key details regarding agent uninstallation:
Saved Prompts: Even if an agent is removed, saved prompts remain in the Microsoft Support Copilot Prompt Gallery.

*-> Conversation History: Chats with agents (including workflow agents) are retained in Microsoft 365 conversation history, not deleted upon uninstallation.

Data Retention: Copilot typically retains interaction history for 18 months, which can be managed via Microsoft Support privacy settings.

Reinstallation: If the agent is reinstalled, previously saved prompts can be reassociated with it.

To permanently delete conversation history, users must manually clear it from their Microsoft Support privacy dashboard or through specific retention policies.


Reference:

https://support.microsoft.com/en-us/office/delete-your-microsoft-365-copilot-activity-history-76de8afa-5eaf- 43b0-bda8-0076d6e0390f https://support.microsoft.com/en-us/office/delete-your-microsoft-365-copilot-activity-history-76de8afa-5eaf- 43b0-bda8-0076d6e0390f https://learn.microsoft.com/en-us/copilot/microsoft-365/flow-builder-privacy-data-subject-request-faq



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: No
No - Microsoft 365 Copilot uses your business data to train underlying AI models.

No Training on Customer Data: Microsoft does not use customer data, prompts, or interactions with the Microsoft Graph (emails, documents, chat history) to train the foundation Large Language Models (LLMs) that power Microsoft 365 Copilot.

Data Stays within Tenant: Your data, prompts, and Copilot's responses remain within your Microsoft 365 service boundary, adhering to your organization's existing privacy, security, and compliance policies.

Enterprise Data Protection (EDP): For users with an organizational Entra ID account, Copilot operates with EDP, ensuring that prompt and response data are not used for model training and are not accessed by Microsoft employees without explicit permission.

Respects Permissions: Copilot only accesses data that a specific user already has permission to view within their Microsoft 365 environment (SharePoint, OneDrive, etc.).

Exceptions (Consumer vs. Commercial): While commercial data (Entra ID) is excluded from training, Microsoft may use data from consumer Copilot services (like free Bing/MSN interactions) to train models, but users can opt out.

Box 2: No
No - Microsoft 365 Copilot automatically assigns permissions to resources, so that users can find information more easily.

Microsoft 365 Copilot is designed to strictly respect existing permissions and security, compliance, and privacy policies already established within an organization's Microsoft 365 environment.

Here is a breakdown of how Copilot handles permissions:

Respects Current Access Controls: Copilot only accesses data (emails, chats, documents) that an individual user is already authorized to view, based on existing role-based access controls (RBAC), SharePoint permissions, and OneDrive settings.

No Automatic Permission Changes: Copilot does not change, assign, or create new permissions for files to make them easier to find. It operates within the bounds of what is already shared with the user.

Oversharing Risk: Because Copilot relies on existing permissions, if files have been improperly or overly shared in the past, Copilot will allow users to find that information. It is not a tool to manage or "fix" permissions automatically, but rather to access data based on them.

Data Security: Copilot respects sensitivity labels and encryption (like Microsoft Purview), meaning it will not expose content that is locked down.

Box 3: Yes
Yes - Microsoft 365 Copilot uses the same underlying data access controls as other Microsoft 365 services to ensure that users are presented with only information to which they have access.

Microsoft 365 Copilot is designed to inherit and strictly adhere to the existing security, compliance, and data privacy policies established within your Microsoft 365 tenant.

Here is how Microsoft 365 Copilot ensures users only access information they are authorized to see:

Microsoft Graph Integration: Copilot uses Microsoft Graph to access user data (emails, chats, documents) only within the user's unique context and based on their existing permissions.

Permission Model Inheritance: Copilot honors SharePoint, OneDrive, and Microsoft Entra (formerly Azure AD) permissions. If a user does not have access to a document, Copilot cannot access or summarize it for them.

Data Protection Mechanisms: Copilot respects sensitivity labels and data loss prevention (DLP) policies, ensuring that sensitive data is handled appropriately.

Secure Scope: Copilot operates within the Microsoft 365 service boundary, meaning user prompts and data do not leave the secure environment to train the foundation models.

Zero Trust Approach: Copilot treats input prompts as potentially unsafe, using a zero-trust architecture to prevent data leaks.


Reference:

https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy https://www.coreview.com/blog/m365-copilot-security-risks



HOTSPOT

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:

Box 1: Yes
Yes - Prompt injection can cause data exposure and unintended system behavior

Prompt injection in Microsoft 365 (M365) Copilot is a significant security risk that can lead to unauthorized data exposure and unintended system behavior. Through techniques like indirect prompt injection, attackers can embed hidden instructions in documents, emails, or websites that Copilot parses, causing it to override its intended, safe behavior and act on malicious commands.

Box 2: Yes
Yes - Prompt injection can be used to embed malicious or harmful instructions.

Box 3: Yes
Yes - Prompt injection is a security vulnerability unique to generative AI systems that interpret and respond to natural language inputs.

Prompt Injection is one of the most significant shifts in the "threat model" because it turns natural language into a potential delivery mechanism for malicious commands.
In the context of M365 Copilot, this risk is unique because the AI has access to your sensitive organizational data (emails, files, chats).


Reference:

https://www.microsoft.com/en-us/msrc/blog/2025/07/how-microsoft-defends-against-indirect-prompt-injection- attacks https://www.oligo.security/academy/prompt-injection-impact-attack-anatomy-prevention



Viewing Page 2 of 9



Share your comments for Microsoft AB-730 exam with other users:

surya 7/30/2023 2:02:00 PM

please upload c_sacp_2308
CANADA


Sasuke 7/11/2023 10:30:00 PM

please upload the dump. thanks very much !!
Anonymous


V 7/4/2023 8:57:00 AM

good questions
UNITED STATES


TTB 8/22/2023 5:30:00 AM

hi, could you please update the latest dump version
Anonymous


T 7/28/2023 9:06:00 PM

this question is keep repeat : you are developing a sales application that will contain several azure cloud services and handle different components of a transaction. different cloud services will process customer orders, billing, payment, inventory, and shipping. you need to recommend a solution to enable the cloud services to asynchronously communicate transaction information by using xml messages. what should you include in the recommendation?
NEW ZEALAND


Gurgaon 9/28/2023 4:35:00 AM

great questions
UNITED STATES


wasif 10/11/2023 2:22:00 AM

its realy good
UNITED ARAB EMIRATES


Shubhra Rathi 8/26/2023 1:12:00 PM

oracle 1z0-1059-22 dumps
Anonymous


Leo 7/29/2023 8:48:00 AM

please share me the pdf..
INDIA


AbedRabbou Alaqabna 12/18/2023 3:10:00 AM

q50: which two functions can be used by an end user when pivoting an interactive report? the correct answer is a, c because we do not have rank in the function pivoting you can check in the apex app
GREECE


Rohan Limaye 12/30/2023 8:52:00 AM

best to practice
Anonymous


Aparajeeta 10/13/2023 2:42:00 PM

so far it is good
Anonymous


Vgf 7/20/2023 3:59:00 PM

please provide me the dump
Anonymous


Deno 10/25/2023 1:14:00 AM

i failed the cisa exam today. but i have found all the questions that were on the exam to be on this site.
Anonymous


CiscoStudent 11/15/2023 5:29:00 AM

in question 272 the right answer states that an autonomous acces point is "configured and managed by the wlc" but this is not what i have learned in my ccna course. is this a mistake? i understand that lightweight aps are managed by wlc while autonomous work as standalones on the wlan.
Anonymous


pankaj 9/28/2023 4:36:00 AM

it was helpful
Anonymous


User123 10/8/2023 9:59:00 AM

good question
UNITED STATES


vinay 9/4/2023 10:23:00 AM

really nice
Anonymous


Usman 8/28/2023 10:07:00 AM

please i need dumps for isc2 cybersecuity
Anonymous


Q44 7/30/2023 11:50:00 AM

ans is coldline i think
UNITED STATES


Anuj 12/21/2023 1:30:00 PM

very helpful
Anonymous


Giri 9/13/2023 10:31:00 PM

can you please provide dumps so that it helps me more
UNITED STATES


Aaron 2/8/2023 12:10:00 AM

thank you for providing me with the updated question and answers. this version has all the questions from the exam. i just saw them in my exam this morning. i passed my exam today.
SOUTH AFRICA


Sarwar 12/21/2023 4:54:00 PM

how i can see exam questions?
CANADA


Chengchaone 9/11/2023 10:22:00 AM

can you please upload please?
Anonymous


Mouli 9/2/2023 7:02:00 AM

question 75: option c is correct answer
Anonymous


JugHead 9/27/2023 2:40:00 PM

please add this exam
Anonymous


sushant 6/28/2023 4:38:00 AM

please upoad
EUROPEAN UNION


John 8/7/2023 12:09:00 AM

has anyone recently attended safe 6.0 certification? is it the samq question from here.
Anonymous


Blessious Phiri 8/14/2023 3:49:00 PM

expository experience
Anonymous


concerned citizen 12/29/2023 11:31:00 AM

52 should be b&c. controller failure has nothing to do with this type of issue. degraded state tells us its a raid issue, and if the os is missing then the bootable device isnt found. the only other consideration could be data loss but thats somewhat broad whereas b&c show understanding of the specific issues the question is asking about.
UNITED STATES


deedee 12/23/2023 5:10:00 PM

great help!!!
UNITED STATES


Samir 8/1/2023 3:07:00 PM

very useful tools
UNITED STATES


Saeed 11/7/2023 3:14:00 AM

looks a good platform to prepare az-104
Anonymous