
GitHub Copilot GH-300: Privacy and Safeguards

Try 10 focused GitHub Copilot GH-300 questions on Privacy and Safeguards, with explanations, then continue with IT Mastery.


Open the matching IT Mastery practice page for timed mocks, topic drills, progress tracking, explanations, and full practice.

  • Try GitHub Copilot GH-300 on Web
  • View the full GitHub Copilot GH-300 practice page

Topic snapshot

  • Exam route: GitHub Copilot GH-300
  • Topic area: Configure Privacy, Content Exclusions, and Safeguards
  • Blueprint weight: 14%
  • Page purpose: Focused sample questions before returning to mixed practice

How to use this topic drill

Use this page to isolate Configure Privacy, Content Exclusions, and Safeguards for GitHub Copilot GH-300. Work through the 10 questions first, then review the explanations and return to mixed practice in IT Mastery.

  • First attempt: Answer without checking the explanation first. Record the fact, rule, calculation, or judgment point that controlled your answer.
  • Review: Read the explanation even when you were correct. Record why the best answer is stronger than the closest distractor.
  • Repair: Repeat only missed or uncertain items after a short break. Record the pattern behind misses, not the answer letter.
  • Transfer: Return to mixed practice once the topic feels stable. Record whether the same skill holds up when the topic is no longer obvious.

Blueprint context: 14% of the practice outline. A focused topic score can overstate readiness if you recognize the pattern too quickly, so use it as repair work before timed mixed sets.

Sample questions

These questions are original IT Mastery practice items aligned to this topic area. They are designed for self-assessment and are not official exam questions.

Question 1

Topic: Configure Privacy, Content Exclusions, and Safeguards

A development team wants GitHub Copilot to help examine pull requests for possible issues, but their policy requires a human reviewer to validate all findings and provide the actual approval. Which Copilot feature best fits this need?

Options:

  • A. Agent Mode

  • B. Instruction files

  • C. Copilot code review

  • D. Pull request summaries

Best answer: C

Explanation: Copilot code review is the best fit when a team wants AI-assisted pull request feedback without treating AI output as approved. It helps reviewers find issues, but human validation and sign-off remain required.

The key concept is that Copilot output is assistance, not verification or approval. In a pull request workflow, the feature that matches “analyze changes and suggest review feedback” is Copilot code review. It supports reviewer productivity by identifying possible issues or improvements, while leaving responsibility for validation and approval with human reviewers.

This aligns with responsible Copilot use:

  • Use AI feedback as draft input.
  • Validate findings against the code and requirements.
  • Keep final approval with a human reviewer.

A summary feature is too shallow for review, and authoring or behavior-shaping features do not perform the review task itself.

  • Pull request summaries help explain what changed, but they do not provide the same review-focused feedback workflow.
  • Agent Mode is for more autonomous coding tasks, not the best fit for PR review assistance.
  • Instruction files guide Copilot behavior, but they are not the feature that performs pull request review.

Question 2

Topic: Configure Privacy, Content Exclusions, and Safeguards

An organization recently updated GitHub Copilot settings. A developer reports that inline suggestions stopped appearing only in apps/billing/legacy/refund.js, but suggestions still appear in other files in the same repository.

Exhibit:

Copilot access: Enabled
Content exclusions:
- repositories: none
- paths:
  - **/legacy/**
  - **/secrets/**

What is the best next step to diagnose the behavior?

Options:

  • A. Ask the developer to paste the file into Copilot Chat to test responses.

  • B. Reinstall the Copilot extension and reauthenticate the developer.

  • C. Turn off duplication detection for the repository.

  • D. Compare the file path to the organization’s content exclusion patterns.

Best answer: D

Explanation: This pattern points to an exclusion rule, not a general Copilot outage. Suggestions still work in other files, and the affected path matches a configured exclusion, so the next step is to validate that policy match.

When Copilot stops suggesting code only in specific files or paths, start by checking organization policy and the affected file context. Content exclusions are designed to prevent Copilot from using matching content as context and can also stop suggestions in those files.

In this scenario, apps/billing/legacy/refund.js matches the **/legacy/** exclusion, while other files in the repository still receive suggestions. That makes a path-based exclusion the strongest explanation.

  • Confirm the file path matches the exclusion pattern.
  • Treat the behavior as expected unless the exclusion was unintended.
  • Change policy only if the organization wants Copilot enabled for that path.

Generic client troubleshooting is less appropriate because the issue is limited to one excluded path, not the whole editor or account.

  • Reinstalling first is a weak next step because the behavior is limited to one matching path, which points to policy rather than an extension failure.
  • Pasting the file into chat tries to work around an exclusion and can expose content that the organization intended to keep out of Copilot context.
  • Disabling duplication detection changes a safeguard unrelated to why suggestions are missing in a path covered by content exclusions.
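The path-matching step above can be sketched in Python. The translator below is an illustrative approximation of glob-style exclusion matching, not GitHub's exact implementation: `**` spans whole path segments, while `*` stays within one segment.

```python
import re

def glob_to_regex(pattern: str) -> str:
    """Translate a simplified glob pattern to a regex.

    '**/' matches zero or more whole path segments, a trailing '**'
    matches anything, and '*' stays within a single segment. This is
    an illustrative approximation of content-exclusion matching.
    """
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**/", i):
            out.append(r"(?:[^/]+/)*")   # zero or more whole segments
            i += 3
        elif pattern.startswith("**", i):
            out.append(r".*")            # rest of the path
            i += 2
        elif pattern[i] == "*":
            out.append(r"[^/]*")         # within one segment only
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return "^" + "".join(out) + "$"

def matches(pattern: str, path: str) -> bool:
    return re.match(glob_to_regex(pattern), path) is not None

# The affected file matches the exclusion; a sibling outside legacy/ does not:
print(matches("**/legacy/**", "apps/billing/legacy/refund.js"))   # True
print(matches("**/legacy/**", "apps/billing/current/refund.js"))  # False
```

Under this approximation, apps/billing/legacy/refund.js matches **/legacy/** while files outside a legacy/ folder do not, which is exactly the behavior the developer reported.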

Question 3

Topic: Configure Privacy, Content Exclusions, and Safeguards

A team builds a patient-scheduling service in a private repository. Content exclusions already prevent Copilot from using data/exports/ as context. The team also wants Copilot Chat and code edits in this repo to consistently avoid logging patient identifiers, prefer the approved encryption library, and remind developers to validate generated output, without repeating those rules in every prompt. Which GitHub Copilot capability should they use?

Options:

  • A. Repository instruction files

  • B. Agent Mode

  • C. Reusable prompt files

  • D. Plan Mode

Best answer: A

Explanation: Repository instruction files are the best fit because the team needs always-on, repo-specific guidance that shapes Copilot behavior across chats and edits. That keeps Copilot useful while consistently reinforcing privacy, security, and compliance expectations.

The deciding requirement is persistent guidance, not just reusable prompting. Repository instruction files let a team encode standing rules for Copilot in that repository, such as avoiding patient-identifier logging, preferring approved libraries, and reminding developers to review generated output. This supports compliance without forcing each developer to restate the same constraints every time they use Chat or editing features.

Content exclusions and instruction files solve different problems: exclusions limit what Copilot can use as context, while instruction files steer how Copilot should respond within the allowed context. Reusable prompt files are helpful for optional task templates, but they are not automatically applied. Plan Mode and Agent Mode help organize or execute work, but they do not replace persistent repository guidance.

  • Reusable prompt files help standardize common requests, but developers must choose them explicitly rather than getting the rules automatically.
  • Plan Mode is useful for breaking work into steps, not for applying standing repository compliance instructions.
  • Agent Mode can take broader action across files, but it is not the primary mechanism for always-on repository policy guidance.
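As a concrete illustration, repository custom instructions live in a markdown file at .github/copilot-instructions.md. The rules below are hypothetical examples of how this team's policies might be encoded; the wording is not from any official source.

```markdown
<!-- .github/copilot-instructions.md -->
# Copilot instructions for this repository

- Never log patient identifiers (names, record numbers, dates of birth),
  even in examples or test fixtures.
- Use the team's approved encryption library for all cryptographic work;
  do not generate hand-rolled crypto.
- When generating code, remind the developer to validate the output
  against requirements before committing.
```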

Question 4

Topic: Configure Privacy, Content Exclusions, and Safeguards

Your org recently added a content exclusion intended to block legal/** from GitHub Copilot context. In VS Code, a developer asks Copilot Chat about src/orders/checkout.ts, but the reply references policies/docs/legal/refund-policy.md. That legal file is open in another tab, the workspace includes both storefront and policies repos, and org settings changed this week. What is the best troubleshooting step?

Options:

  • A. Check exclusion-path matching, active editor context, workspace repo scope, and applied org policy.

  • B. Rewrite the prompt to forbid legal content and rely on that response.

  • C. Enable duplication detection and rerun the same chat request.

  • D. Use Agent Mode so Copilot ignores unrelated open tabs.

Best answer: A

Explanation: When Copilot appears to use unexpected content, the first step is to verify how that content could have entered context. That means checking whether the exclusion pattern actually matches the file path, what the editor currently exposes, which repositories are in scope, and whether the organization policy applies as expected.

This scenario is about troubleshooting Copilot context, not improving the prompt. Content exclusions depend on the actual path pattern and scope, so an intended rule like legal/** may not match a file such as policies/docs/legal/refund-policy.md. Copilot context can also come from the active file, selected code, open tabs, chat context, and other repositories in a multi-root workspace.

A good troubleshooting sequence is:

  • verify the exclusion pattern matches the real file path
  • inspect open tabs, selections, and active editor state
  • confirm which repository or workspace roots are in scope
  • check that the expected organization policy is applied

Feature changes or prompt wording do not diagnose whether exclusions and policy are working correctly.

  • Duplication detection helps identify similar generated output, not why unexpected files were available as context.
  • Agent Mode changes how work is performed, but it does not replace checking exclusions, editor state, or repository scope.
  • A prompt-only workaround may hide the symptom, but it does not confirm whether the exclusion rule or organization policy is functioning.
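The anchoring pitfall can be demonstrated with Python's standard fnmatch module. This is only an approximation of product behavior (fnmatch's * also crosses /), but it shows why an unanchored variant matches where the root-anchored pattern does not:

```python
from fnmatch import fnmatch

# Hypothetical nested path from the scenario.
path = "policies/docs/legal/refund-policy.md"

# 'legal/**' only matches when 'legal' is the first path segment,
# so the nested file slips past the intended exclusion.
print(fnmatch(path, "legal/**"))     # False
print(fnmatch(path, "**/legal/**"))  # True
```

If the organization intends to exclude every legal/ folder at any depth, the pattern needs the **/ prefix; spotting exactly this kind of mismatch is the first check in option A.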

Question 5

Topic: Configure Privacy, Content Exclusions, and Safeguards

A bank wants to use GitHub Copilot in a private monorepo. app/ is approved for Copilot, but customer-data-samples/ contains regulated test records and vendor-snippets/ contains third-party reference code. Developers use supported IDEs with mixed local settings. Security wants Copilot available for app/ while reducing privacy, duplication, and insecure-suggestion risk. What is the best action?

Options:

  • A. Allow all repository content as Copilot context so suggestions are more accurate, then disable duplication detection to reduce interruptions.

  • B. Add an instruction file telling Copilot to avoid the sensitive paths, and rely on pull request review for unsafe or duplicated output.

  • C. Configure organization-level content exclusions for the sensitive paths, standardize editor settings so duplication detection and security warnings stay enabled, and manage Copilot through organization policy.

  • D. Use Copilot Agent Mode only in app/ and let the agent decide which repository files are safe to read.

Best answer: C

Explanation: Higher-risk workflows need layered safeguards, not a single workaround. The best choice excludes sensitive paths from Copilot context, keeps duplication and security protections active in the editor, and uses organization policy for consistent control.

For higher-risk Copilot use, the safest approach is to combine controls at the correct layers. Content exclusions prevent specific files or paths from being used as Copilot context, which is the right control for regulated test data and third-party reference code. Organization policy provides consistent administration for how Copilot is allowed in the repository. Editor settings then keep day-to-day safeguards active, including duplication detection and security warnings, so developers still get protection while working in approved code areas.

  • Exclude sensitive or restricted content from context.
  • Keep duplication and security safeguards enabled in supported editors.
  • Use organization policy to apply the workflow consistently.

Prompt instructions, reviewer-only processes, or choosing Agent Mode do not replace these safeguard controls.

  • Instruction files are not exclusions because they guide behavior but do not block sensitive paths from being used as Copilot context.
  • Review only is too late because pull request review does not replace built-in duplication detection or security warnings during authoring.
  • More context is not safer because allowing all repository content and disabling duplication checks increases both privacy and reuse risk.
  • Agent Mode is the wrong control surface because it helps with task execution, not with enforcing path exclusions or safeguard policy.

Question 6

Topic: Configure Privacy, Content Exclusions, and Safeguards

A company uses GitHub Copilot in a monorepo. The billing/ and customer-export/ paths contain proprietary payment logic and regulated customer code, but developers still want Copilot for the rest of the repository. What is the best next step to reduce the chance that these paths are used as Copilot context?

Options:

  • A. Configure content exclusions for the sensitive paths

  • B. Add an instruction file telling Copilot to avoid those folders

  • C. Rely on pull request review before merging Copilot-generated changes

  • D. Ask developers not to mention those files in Copilot Chat

Best answer: A

Explanation: Use content exclusions when specific repositories or paths contain confidential, regulated, or security-sensitive code. That safeguard directly limits Copilot context for those areas while allowing Copilot use elsewhere in the repository.

The key privacy safeguard here is content exclusion. When only certain paths in a repository contain sensitive material, configuring content exclusions for those paths is the best control because it is policy-based and targeted. It reduces the likelihood that Copilot uses excluded files as context while preserving productivity in the rest of the monorepo. By contrast, instruction files shape behavior but do not enforce privacy boundaries, and user reminders depend on perfect human compliance. Pull request review is still important for validating AI-generated output, but it happens after suggestions are produced and does not prevent sensitive files from being used as context. For this scenario, the strongest next step is to apply exclusions to the sensitive paths.

  • Instruction files guide Copilot responses, but they are not a privacy safeguard for blocking sensitive paths from context.
  • PR review only is a validation practice after generation, not a preventive control over what Copilot can use as context.
  • Developer reminders are easy to bypass or forget and do not provide organization-level protection.
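As a sketch, organization content exclusions are configured as YAML mapping a repository reference to a list of excluded path patterns. The repository name below is hypothetical, and the exact settings syntax should be checked against current GitHub documentation:

```yaml
# Hypothetical organization-level content exclusion for this scenario.
# Key: repository reference; values: path patterns to exclude.
acme/monorepo:
  - "billing/**"
  - "customer-export/**"
```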

Question 7

Topic: Configure Privacy, Content Exclusions, and Safeguards

A developer in the finbank-apps repository asks Copilot Chat in the IDE to explain services/payroll/tax_calc.py, but Chat says it cannot use that file as context. The developer is signed in, and Copilot is enabled for the repository.

Organization content exclusions
- Path: services/payroll/**
- Repository: finbank-regulated-archive

What is the best next step?

Options:

  • A. Switch to Agent Mode so Copilot can read the file

  • B. Paste the file contents into Chat to bypass the restriction

  • C. Review the matching exclusion rule and its scope

  • D. Disable duplication detection and security warnings, then retry

Best answer: C

Explanation: The file path is already covered by an active content exclusion, which prevents Copilot from using it as context. The best next step is to verify that matching rule and adjust it only if the organization intends to allow that content.

Content exclusions are the first thing to check when Copilot cannot use a file, path, or repository that should otherwise be available. Here, the user is signed in and Copilot is enabled, so the path pattern services/payroll/** is the most relevant cause. Troubleshooting should focus on confirming which exclusion applies and whether it was intentionally configured.

If access is actually required, the right follow-up is to work with the policy owner to change the exclusion at the correct scope. Using a different Copilot surface does not override exclusions, and manually pasting excluded source into prompts is not an appropriate workaround. Other safeguards, such as duplication detection and security warnings, address different concerns and do not control whether excluded content is used as Copilot context.

  • Manual bypass: pasting the file into Chat works around the exclusion instead of troubleshooting it and may violate privacy policy.
  • Different surface: Agent Mode does not override active content exclusions.
  • Wrong safeguard: duplication detection and security warnings do not determine whether Copilot can use excluded content.

Question 8

Topic: Configure Privacy, Content Exclusions, and Safeguards

A development team uses GitHub Copilot in a private monorepo. The compliance lead wants developers to keep using Copilot, but to receive a warning when a suggestion may closely match public code so they can review it before accepting it. Which action best meets this requirement?

Options:

  • A. Turn on security warnings for generated code.

  • B. Create an instruction file to avoid reused code.

  • C. Enable duplication detection for Copilot suggestions.

  • D. Add the repository to content exclusions.

Best answer: C

Explanation: The requirement is specifically to warn developers about suggestions that may match public code. Duplication detection is the Copilot safeguard designed for that scenario, while the other options address different concerns.

Duplication detection is the correct setting when the goal is to alert users that a Copilot suggestion may resemble public code. In this scenario, the team does not want to disable Copilot or reduce its normal usefulness; they want a safeguard that adds a review signal before code is accepted. That makes duplication detection the best fit.

Content exclusions control what repository content Copilot can use as context, not whether generated output resembles public code. Security warnings focus on potentially insecure code patterns, not similarity to public code. Instruction files can guide style, conventions, and behavior, but they do not perform match detection. The key takeaway is to choose the safeguard that directly surfaces possible public-code matches for human review.

  • The content exclusions option is about preventing selected files or paths from being used as Copilot context, not warning on output similarity.
  • The security warnings option targets risky or vulnerable code patterns, not possible matches to public code.
  • The instruction file option can influence responses, but it does not provide an actual detection or warning mechanism.

Question 9

Topic: Configure Privacy, Content Exclusions, and Safeguards

A developer asks Copilot Chat to generate tests for a function, but Copilot responds with only generic advice.

Exhibit:

Open file: services/payments/validator.py
User prompt: "Add unit tests for edge cases in this function."
Copilot reply: "I can suggest general test patterns, but I don't have file-specific context."
Organization content exclusions:
- services/payments/**
Behavior in non-excluded folders: normal file-aware responses

The team must keep the exclusion in place. Which prompt/context change best improves the result?

Options:

  • A. Add more open files from services/payments/ to the chat context.

  • B. Use a sanitized copy in a non-excluded file and prompt on that snippet.

  • C. Switch to Agent Mode and ask it to inspect the excluded folder automatically.

  • D. Rewrite the prompt to ask Copilot to infer the whole repository context.

Best answer: B

Explanation: The behavior points to an organization content exclusion, not a weak prompt alone. The best fix is to provide relevant context from a safe, non-excluded source, such as a sanitized minimal snippet, and then ask for the specific test task.

Content exclusions prevent Copilot from using matching files or paths as context. In the exhibit, file-aware help works in other folders, but not under services/payments/**, which strongly indicates the exclusion is causing the generic response.

A better prompt helps only when Copilot can access the needed context. To troubleshoot without weakening safeguards, move or recreate only the necessary, sanitized code in a non-excluded scratch file and ask a specific request against that selected snippet. Changing modes or adding more files from the same excluded path does not make that excluded content available. The key takeaway is to diagnose the missing context from policy evidence first, then supply safe, allowed context.

  • A broader prompt fails because better wording cannot recover context blocked by an exclusion.
  • An Agent Mode bypass fails because Copilot modes do not override organization content exclusions.
  • More excluded files fail because adding additional files from the same excluded path still provides no usable context.

Question 10

Topic: Configure Privacy, Content Exclusions, and Safeguards

A company uses GitHub Copilot in a private monorepo. Developers must keep using inline suggestions and Copilot Chat, but files under legal/ and customer-records/ must never be used as Copilot context. Which action best meets this requirement?

Options:

  • A. Turn off editor inline suggestions for all users.

  • B. Disable pull request summaries on github.com.

  • C. Disable Copilot Chat for the organization.

  • D. Set organization content exclusions for legal/ and customer-records/.

Best answer: D

Explanation: The requirement is to protect specific repository content without removing Copilot from developers. Content exclusions are the privacy setting designed for that purpose, unlike feature toggles that simply turn Copilot surfaces on or off.

The key distinction is between a privacy safeguard and a general feature enablement setting. Content exclusions are used when certain repositories, folders, or files must not be included as Copilot context. That directly fits the requirement to keep Copilot available while preventing legal/ and customer-records/ from being used.

By contrast, disabling Copilot Chat, inline suggestions, or pull request summaries only turns off a Copilot surface. Those settings reduce functionality, but they do not provide path-specific privacy control while preserving the rest of the developer workflow.

A practical rule is:

  • Use content exclusions to keep selected content out of Copilot context.
  • Use editor or feature settings to enable or disable Copilot capabilities.
  • Apply organization-level policy when the rule must be consistent for all users.

The best answer is the one that protects only the sensitive paths without unnecessarily disabling Copilot elsewhere.

  • Disable Chat removes one Copilot surface, but it does not address path-based privacy for sensitive folders.
  • Turn off inline suggestions is a broad feature disablement and blocks normal help even for allowed code.
  • Disable PR summaries changes a github.com feature, not which repository paths Copilot may use as context.

Continue with full practice

Use the GitHub Copilot GH-300 Practice Test page for the full IT Mastery route, mixed-topic practice, timed mock exams, explanations, and web/mobile app access.

  • Try GitHub Copilot GH-300 on Web
  • View the GitHub Copilot GH-300 Practice Test

Free review resource

Read the GitHub Copilot GH-300 Cheat Sheet on Tech Exam Lexicon, then return to IT Mastery for timed practice.

Revised on Thursday, May 14, 2026