Try 82 free Certified Public Accountant Information Systems and Controls (CPA ISC) questions across the ISC blueprint areas, with answers and explanations, then continue in Mastery Exam Prep.
This free full-length diagnostic contains 82 original Mastery Exam Prep multiple-choice questions (MCQs) spanning the three ISC blueprint areas.
The CPA ISC section also involves task-based simulations and exhibit-heavy work, so use this page as an MCQ diagnostic rather than a complete simulation of every item type. The questions are original practice questions and are not official exam questions.
Practice count note: exam sponsors can describe total questions, scored questions, task-based simulations, duration, or unscored/pretest-item rules differently. Always confirm current exam-day rules with the sponsor.
For concept review before or after this diagnostic, use the CPA ISC guide on CPAExamsMastery.com.
CPA means Certified Public Accountant. ISC means Information Systems and Controls. This page is useful when you want one uninterrupted ISC multiple-choice diagnostic before returning to systems, data, security, privacy, and SOC drills.
Use the score as a diagnostic signal, not a guarantee. Because ISC also tests task-based simulations and exhibit-heavy work, pair a high score here with continued review of systems exhibits, control objectives, report scope, and data-reliability judgment.
| Diagnostic result | Practical next step |
|---|---|
| Below 70% | Return to topic drills. Start with the topic that produced the most misses, then retake mixed sets after the explanations make sense. |
| 70-79% | Review every miss and classify it as systems/data, security/privacy, or SOC engagements. Drill the weak category before another timed attempt. |
| 80%+ | Move to timed mixed practice and focus on process boundaries, control objectives, and careful exhibit reading. |
| Repeated 75%+ on unseen timed attempts | Schedule the exam, or proceed with final review, once you can explain the risk, control, and report-scope logic behind each best answer. |
| If your misses cluster around… | What to drill next |
|---|---|
| process flow, data management, reports, or interfaces | Information systems and data management questions. Trace source, input, processing, output, and use. |
| access, privacy, confidentiality, or safeguards | Security, confidentiality, and privacy questions. Identify the objective and control type. |
| SOC reports, report type, complementary controls, or reliance | System and organization controls questions. Decide who relies on the report and why. |
| timing pressure or repeated recognition of familiar stems | Timed mixed practice in the full route. Use larger unseen sets so practice builds control judgment instead of answer memorization. |
| Item | Detail |
|---|---|
| Issuer | American Institute of Certified Public Accountants (AICPA) |
| Exam route | CPA ISC |
| Official exam name | CPA ISC — Information Systems and Controls |
| Full-length set on this page | 82 questions |
| Exam time | 240 minutes |
| Topic areas represented | 3 |
| Topic | Approximate official weight | Questions used |
|---|---|---|
| Information Systems and Data Management | 40% | 33 |
| Security, Confidentiality and Privacy | 40% | 33 |
| Considerations for System and Organization Controls Engagements | 20% | 16 |
Topic: Information Systems and Data Management
A company uses an ERP system for order-to-cash processing. The system includes:
Which item is best classified as master data?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: Customer records are master data because they are standing records used repeatedly to process many transactions. The API transfer is an interface, the sales order posting is a transaction flow, and the dashboard is a reporting output.
In an ERP or accounting information system, master data refers to core reference records that remain relatively stable and support ongoing processing. Examples include customer, vendor, employee, item, and chart-of-accounts records. Here, the customer file holds attributes such as payment terms, credit limits, and shipping addresses that are reused whenever orders are entered and processed. By contrast, the hourly API transfer describes how data moves between systems, so it is an interface. The sales order posting is an operational transaction flow because it records business activity and updates accounts. The month-end dashboard is a reporting output because it summarizes processed data for review and decision-making.
Master data consists of relatively stable reference records used repeatedly in processing transactions across the system.
Topic: Information Systems and Data Management
In a SOC 2 engagement, a payroll processor states that a key processing integrity control is a daily reconciliation of employee hours imported from client files to hours posted in the payroll application. The reconciliation report is generated automatically, and a payroll supervisor is required to review and resolve any differences before payroll is processed.
During tests of operating effectiveness for 25 business days, the practitioner finds that the reconciliation report was generated each day, but on 3 days there was no evidence of supervisor review before payroll processing. On one of those days, an import error caused 18 employee time records to be omitted until the next payroll cycle.
How should this finding be characterized?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: This is an operating effectiveness issue within processing integrity because the reconciliation control existed and was generated daily, but required supervisory review was not performed consistently. The omitted time records show the lapse affected complete and timely processing, which is central to processing integrity.
In SOC 2, processing integrity addresses whether system processing is complete, valid, accurate, timely, and authorized. Here, the control is designed to detect import differences before payroll runs: the report is generated automatically and a supervisor must review and resolve exceptions. Because that control existed but was not performed as required on 3 of 25 days, the issue is a deviation in operating effectiveness, not a design deficiency. It is also a processing integrity matter, not a security matter, because the problem involves incomplete and delayed payroll processing rather than unauthorized access. It is not a complementary user entity control issue because the facts describe a control the service organization itself is responsible for performing.
The control was appropriately designed and existed, but it did not operate as prescribed on some days, affecting completeness and timeliness of processing.
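The 3-of-25 arithmetic behind this operating-effectiveness finding can be sketched in a few lines. This is an illustrative sketch only: the `DailyTest` structure and the sample data are assumptions for the example, not part of any AICPA methodology.

```python
from dataclasses import dataclass

@dataclass
class DailyTest:
    report_generated: bool
    supervisor_reviewed: bool

def deviation_rate(results: list[DailyTest]) -> float:
    """Share of tested days on which the control did not operate as prescribed
    (the control requires both automatic report generation and supervisor review)."""
    deviations = sum(
        1 for r in results if not (r.report_generated and r.supervisor_reviewed)
    )
    return deviations / len(results)

# 25 tested business days: report generated every day, no review evidence on 3 days.
sample = [DailyTest(True, True)] * 22 + [DailyTest(True, False)] * 3
rate = deviation_rate(sample)
print(f"{rate:.0%}")  # 12%
```

Whether a 12% deviation rate is acceptable is a judgment call against the tolerable rate set during planning; the code only quantifies the deviation, it does not conclude.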
Topic: Considerations for System and Organization Controls Engagements
A CPA is planning a SOC 2 examination for ApexPay using the carve-out method for subservice organizations.
Current facts:
What should the CPA do next?
Best answer: C
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The draft is missing key elements needed for a complete system description, so the CPA should first compare it to the description criteria and have management correct the omissions. In a SOC examination, testing and reporting decisions come after the system description is complete and properly bounded.
When management’s draft system description appears incomplete, the practitioner’s next step is to compare it to the applicable description criteria and identify gaps. Here, outsourced authentication and cloud hosting affect the system boundary, so they cannot be ignored simply because the carve-out method is used. Under carve-out, management still needs to identify the subservice organizations, describe the nature of their services, and address relevant complementary subservice organization controls. Only after the description is complete and aligned with the criteria should the CPA move to testing or reporting conclusions. Written representations are concluding evidence, not a substitute for comparing the draft to the criteria, and an immediate modified report would be premature before management has a chance to revise the description.
Because the draft omits required boundary and subservice-organization disclosures, the CPA should first compare it to the description criteria and resolve the gaps with management.
Topic: Considerations for System and Organization Controls Engagements
A CPA firm is evaluating acceptance of a SOC 2 examination for AtlasPay, a service organization. AtlasPay uses VaultCo, a separate hosting provider, as a subservice organization. AtlasPay has not yet decided whether VaultCo will be presented using the inclusive method or the carve-out method. The CPA firm is independent of AtlasPay, but another practice unit of the firm performs bookkeeping services for VaultCo. What should the engagement partner do next?
Best answer: C
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The service auditor must always be independent of the service organization. When a subservice organization may be included using the inclusive method, the auditor must also consider whether relationships with that subservice organization affect independence, so the reporting method should be resolved first.
In a SOC engagement, independence is always required with respect to the service organization. A subservice organization adds an extra independence consideration when its services and controls are included in the scope through the inclusive method. Under the carve-out method, the subservice organization’s controls are excluded from the service auditor’s opinion, so the firm’s relationship with that subservice organization is not evaluated the same way for the examination scope. Here, the firm is already independent of AtlasPay, but it has a relationship with VaultCo and AtlasPay has not yet chosen inclusive or carve-out presentation. Therefore, the proper next step is to determine the intended method for VaultCo and then assess the firm’s independence implications before accepting or continuing planning.
Because the firm is already independent of the service organization, the next planning step is to determine whether the subservice organization will be included in scope, which drives whether the VaultCo relationship creates an independence issue.
Topic: Security, Confidentiality and Privacy
An online payroll processor uses a cloud-hosted payroll portal. Employees access the admin console through single sign-on from company laptops, and customers access the portal from the internet.
Recent findings:
Which control mix best reflects defense-in-depth for this environment?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: Defense in depth uses complementary controls across multiple layers and control types rather than relying on one safeguard. The combination of MFA, prompt patching, real-time monitoring, and tested immutable backups directly addresses the credential compromise, exposed server, delayed detection, and weak recovery described in the scenario.
Defense in depth means building overlapping security layers so that if one control fails, others still reduce impact. In this scenario, the weaknesses appear at several points: identity security, system hardening, detection, and recovery. MFA is a preventive control that limits misuse of stolen passwords. Prompt patching of the internet-facing server reduces the chance that known vulnerabilities can be exploited. Centralized monitoring with real-time alerts adds a detective layer so unusual bulk downloads are identified quickly instead of days later. Tested daily immutable backups and an incident response playbook are corrective and recovery measures that help restore operations after malware or file corruption. The best answer is the one that addresses all four observed gaps with coordinated preventive, detective, and corrective controls.
This option layers preventive, detective, and corrective controls across identity, system, monitoring, and recovery points that match the scenario’s specific failures.
Topic: Considerations for System and Organization Controls Engagements
A CPA is preparing training materials for new SOC 2 staff. The CPA wants support for this conclusion:
“In a SOC 2 examination, the Trust Services Criteria are the benchmarks used to evaluate controls over a system. They are organized around COSO-aligned common criteria, with supplemental criteria for certain subject matters and additional specific criteria for privacy.”
Which source best supports that conclusion?
Best answer: B
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The SOC report excerpt is the only source that directly addresses both what the Trust Services Criteria are used for and how they are structured. The other sources discuss individual controls or service features, not the framework organization of the criteria.
In a SOC 2 examination, the Trust Services Criteria serve as the control criteria against which the service organization’s system is evaluated. Their organization matters: the common criteria apply broadly and are aligned with COSO concepts, while certain subject matters use supplemental criteria, and privacy has additional specific criteria when that category is included. A source that best supports this conclusion must explicitly describe both the purpose of the criteria and their structure. Evidence about one access weakness, one vendor capability, or one successful control test may support a conclusion about a specific control or service commitment, but it does not explain how the Trust Services Criteria are organized or why they are used in the examination.
This excerpt directly states both the purpose of the Trust Services Criteria and their organization into COSO-aligned common, supplemental, and privacy-specific criteria.
Topic: Security, Confidentiality and Privacy
An entity’s Severity 1 incident response plan and actual timeline are shown below.
| Plan requirement | Time rule |
|---|---|
| Escalate to incident manager | Within 15 minutes after the analyst confirms the incident |
| Isolate affected production host | Within 30 minutes after Severity 1 classification |
| Notify privacy officer | Within 1 hour after the team determines regulated personal data may be involved |
| Preserve evidence | Capture a forensic image before reimaging any compromised host |
| Actual timeline | Event |
|---|---|
| 08:10 | SIEM alert flags unusual outbound traffic from payroll application server |
| 08:18 | Analyst confirms unauthorized access and classifies incident as Severity 1 |
| 08:27 | Incident manager notified |
| 08:40 | Payroll application server isolated from network |
| 08:50 | Log review indicates exfiltrated file may contain employee SSNs |
| 09:35 | Privacy officer notified |
| 09:50 | Server reimaged for restoration |
| 10:20 | Forensic image captured after reimaging was complete |
Based on the exhibit, which response step was inconsistent with the plan?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The timeline shows that escalation, isolation, and privacy notification all met the plan’s stated trigger points and deadlines. The only mismatch is evidence preservation, because the host was reimaged before the forensic image was captured.
To evaluate an incident timeline, compare each event to the plan’s stated trigger and required deadline or sequence. Here, the analyst confirmed the incident at 08:18 and notified the incident manager at 08:27, so escalation occurred within 15 minutes. The host was isolated at 08:40, which is within 30 minutes of Severity 1 classification. The privacy officer was notified at 09:35, which is 45 minutes after the team determined at 08:50 that employee SSNs might be involved, so that step was timely. The only violation is the evidence-preservation requirement. The plan explicitly says to capture a forensic image before reimaging a compromised host, but the server was reimaged at 09:50 and imaged only afterward at 10:20. That sequence can destroy or alter evidence needed for investigation.
The plan requires a forensic image before reimaging, but the server was reimaged at 09:50 and the image was not captured until 10:20.
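The elapsed-time comparisons above can be scripted as a worked check. The times come from the exhibit; the check labels and threshold constants are mine, written to mirror the plan's rules.

```python
from datetime import datetime

def t(s: str) -> datetime:
    return datetime.strptime(s, "%H:%M")

def minutes_between(start: str, end: str) -> int:
    return int((t(end) - t(start)).total_seconds() // 60)

# Compare each actual step to its plan trigger and deadline (or required sequence).
checks = {
    "escalation (<= 15 min after 08:18 confirmation)":
        minutes_between("08:18", "08:27") <= 15,      # 9 min
    "isolation (<= 30 min after 08:18 classification)":
        minutes_between("08:18", "08:40") <= 30,      # 22 min
    "privacy notice (<= 60 min after 08:50 determination)":
        minutes_between("08:50", "09:35") <= 60,      # 45 min
    # Sequence rule: the forensic image must precede reimaging.
    "evidence preservation (image before reimage)":
        t("10:20") < t("09:50"),                      # image came after reimage
}
for step, met in checks.items():
    print(step, "->", "OK" if met else "VIOLATION")
```

Only the evidence-preservation rule fails, matching the best answer.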
Topic: Information Systems and Data Management
A service organization sends approved customer rate changes from its billing platform to its invoicing system each night. Management says a billing supervisor reviews a daily rejected-record report and resolves errors before invoices are issued. During a walkthrough, the CPA learns that the interface does not generate any rejected-record report, and no other reconciliation compares rate changes sent to rate changes posted. Several approved rate changes were later found missing from invoices.
Which remediation best addresses this deficiency?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The problem is not a one-time failure to perform an existing control. The needed processing integrity control was never actually in place, so the best response is to design and implement a control that detects rejected or missing interface records and requires timely follow-up.
A design deficiency exists when a control, as designed or actually implemented, cannot prevent or detect errors on a timely basis. Here, management describes a daily review control, but the interface does not produce the rejected-record report and no alternative reconciliation exists. That means completeness and accuracy of transferred rate changes are not being monitored at all, making this a design deficiency. An operating deviation would be different: for example, if a valid exception report existed and the supervisor failed to review it on a particular day. In this scenario, the proper remediation is to add or redesign the control itself, such as an automated exception report or source-to-target reconciliation with timely review before invoicing.
Because no exception report or reconciliation exists, the issue is a design deficiency, so the control itself must be implemented rather than merely enforced.
Topic: Security, Confidentiality and Privacy
A company’s SOC manager concluded that recent suspicious VPN activity was a password spraying attack against employee accounts. Which source would best support that conclusion?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The authentication log is the best support because it shows the actual attack pattern: one common password attempted across many usernames. That is direct evidence of password spraying, unlike evidence about the attacker, a control weakness, or the company’s response.
To identify an attack type, the strongest support is evidence of the behavior itself. Password spraying typically uses one or a few common passwords across many accounts to avoid repeated failures on a single account and reduce lockouts. The authentication log showing one external IP trying the same password once against hundreds of usernames directly supports that conclusion. By contrast, a threat intelligence report helps identify the threat agent, not the attack technique. A finding that MFA is missing describes a vulnerability or control weakness that could allow the attack to succeed, but it does not prove the attack occurred. An incident record showing IP blocking and password resets describes the control response after detection. Evidence of reconnaissance or account enumeration would indicate an earlier attack stage, not the spraying pattern itself.
A single common password attempted across many accounts is the defining pattern of password spraying.
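The spraying signature described above (many accounts, few attempts each, from one source) can be flagged from parsed authentication events. A minimal sketch under assumed inputs: the event tuple format, the IP addresses (documentation ranges), and the thresholds are all hypothetical.

```python
from collections import defaultdict

# Hypothetical parsed authentication failures: (source_ip, username, success)
events = [("203.0.113.9", f"user{i:03d}", False) for i in range(300)]  # spraying
events += [("198.51.100.4", "jsmith", False)] * 6  # contrast: brute force on one account

def flag_spraying(events, min_accounts=50, max_attempts_per_account=2):
    """Flag source IPs that touch many accounts with few failed attempts each:
    the classic password-spraying pattern that avoids per-account lockouts."""
    per_ip = defaultdict(lambda: defaultdict(int))
    for ip, user, success in events:
        if not success:
            per_ip[ip][user] += 1
    flagged = []
    for ip, users in per_ip.items():
        if len(users) >= min_accounts and max(users.values()) <= max_attempts_per_account:
            flagged.append(ip)
    return flagged

print(flag_spraying(events))  # ['203.0.113.9']
```

Note that the brute-force IP is not flagged: it hammers one account, which is a different attack pattern, just as the explanation distinguishes.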
Topic: Considerations for System and Organization Controls Engagements
A CPA is reviewing a SOC 2 scoping worksheet for a cloud payroll platform.
| Criterion reference | Control summary |
|---|---|
| CC6 | Multifactor authentication is required for privileged remote access. |
| A1 | Recovery procedures are tested against system availability commitments. |
| C1 | Confidential customer files are encrypted and access is limited to approved personnel. |
| CC9 | Management evaluates third-party vendors before onboarding. |
Which conclusion about the Trust Services Criteria is supported by the exhibit?
Best answer: B
What this tests: Considerations for System and Organization Controls Engagements
Explanation: In the Trust Services Criteria, references beginning with CC are common criteria, while subject-specific references such as C are additional criteria. Because the control over encrypting and restricting access to confidential files is mapped to C1, it is the only option that correctly identifies an additional confidentiality criterion.
SOC 2 uses common criteria, labeled CC, across all engagements and subject matters. Additional criteria are used only when the engagement includes availability, processing integrity, confidentiality, or privacy. Those subject-specific criteria are identified by prefixes such as A, PI, C, and P. In the exhibit, CC6 and CC9 are common criteria, so they are not additional criteria. A1 is an additional availability criterion, not a common one. C1 is an additional confidentiality criterion because it addresses the protection of confidential information. This is why the control over encrypting confidential files and limiting access is the best-supported conclusion from the worksheet.
Criteria labeled C are additional criteria for confidentiality, while CC criteria are common criteria.
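The prefix logic in the worksheet can be made concrete with a small classifier. A sketch only: the mapping reflects the prefix convention described above, and the descriptive strings are mine.

```python
PREFIX_MAP = {  # Trust Services Criteria reference prefixes
    "CC": "common criteria (all SOC 2 engagements)",
    "PI": "additional criteria: processing integrity",
    "A":  "additional criteria: availability",
    "C":  "additional criteria: confidentiality",
    "P":  "additional criteria: privacy",
}

def classify_criterion(ref: str) -> str:
    # Longest-prefix match first, so "CC6" maps to CC rather than C.
    for prefix in sorted(PREFIX_MAP, key=len, reverse=True):
        if ref.startswith(prefix):
            return PREFIX_MAP[prefix]
    return "unknown reference"

for ref in ["CC6", "A1", "C1", "CC9"]:
    print(ref, "->", classify_criterion(ref))
```

Running this over the exhibit's references shows CC6 and CC9 are common criteria while A1 and C1 are additional criteria, which is exactly the distinction the question tests.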
Topic: Information Systems and Data Management
An entity’s ERP environment has the following conditions for application and server changes:
Which risk most directly reflects weak change management rather than a separate access, monitoring, or recovery weakness?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: The described conditions point to a classic change management weakness: changes can reach production without approval, testing evidence, or post-implementation review. That most directly increases the risk of unauthorized, erroneous, or incompatible changes causing bad processing results or downtime.
Change management controls are designed to ensure application and infrastructure changes are authorized, tested, approved, documented, and migrated to production in a controlled way. When developers or administrators can move changes directly to production without formal approval or retained test evidence, the main risk is that untested, unauthorized, or poorly understood changes will affect live processing. That can lead to system outages, failed integrations, inaccurate data processing, or unintended configuration changes. Emergency changes may be necessary, but they still should be documented and reviewed afterward. The other choices describe important risks in adjacent control areas, but they are more directly tied to logical access administration, security monitoring, or disaster recovery than to change management.
Weak change controls primarily increase the chance that unauthorized or untested code or configuration changes are promoted to production and disrupt operations.
Topic: Information Systems and Data Management
A manufacturer uses a CRM system for order entry and an ERP system for shipping, invoicing, and the general ledger. Approved customer orders are exported nightly from CRM to ERP.
Process facts:
Which correction is the most appropriate response to this processing integrity issue?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The main issue is not security or system architecture; it is that rejected interface records are not being reconciled or resolved. A daily control-total and sequence reconciliation, combined with exception handling for rejected orders, is the best correction because it addresses missing transactions before shipping and invoicing differences persist.
Processing integrity in an ERP interface depends on transactions being complete, accurate, valid, and processed as intended. Here, approved CRM orders can fail ERP import, yet the process continues without a business-side exception report or reconciliation. That means invalid records are silently excluded from downstream invoicing, creating a completeness failure. The best correction is to reconcile source transactions to loaded transactions using record counts and sequential order numbers, then route rejected items to the responsible business users for timely correction and reprocessing. That response directly targets the broken interface control. Strengthening item-master change approval could help reduce some invalid-code errors, but it would not detect or resolve rejected orders already missing from ERP. Encryption and API replacement do not by themselves fix the missing reconciliation control.
This directly addresses completeness and processing integrity by detecting missing orders and ensuring rejected transactions are corrected and re-entered.
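The source-to-target reconciliation the best answer describes can be sketched as a set comparison on order identifiers. The order numbers and the result-dictionary keys are illustrative assumptions, not the actual interface design.

```python
def reconcile_orders(sent_ids: list[int], loaded_ids: list[int]) -> dict:
    """Daily interface reconciliation: compare record counts and identify
    approved orders sent from CRM that never loaded into ERP."""
    sent, loaded = set(sent_ids), set(loaded_ids)
    missing = sorted(sent - loaded)       # rejected or dropped in transit
    unexpected = sorted(loaded - sent)    # loaded without a matching source record
    return {
        "sent_count": len(sent),
        "loaded_count": len(loaded),
        "missing_in_erp": missing,
        "unexpected_in_erp": unexpected,
        "in_balance": not missing and not unexpected,
    }

# Hypothetical nightly batch: orders 1001-1010 sent, two rejected on import.
result = reconcile_orders(
    list(range(1001, 1011)),
    [i for i in range(1001, 1011) if i not in (1004, 1008)],
)
print(result["missing_in_erp"])  # [1004, 1008]
```

The missing list is the exception report the scenario lacks; routing it to business users for correction before invoicing is the follow-up half of the control.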
Topic: Considerations for System and Organization Controls Engagements
A CPA firm issued a SOC 2 Type 2 report on May 15 covering January 1 through March 31. On May 25, the service auditor learns from a completed internal investigation that, during March, terminated administrators retained privileged access for several days because an access-deprovisioning script failed. The system description in the report states that privileged access for terminated personnel is removed within 24 hours, and the control was concluded to operate effectively throughout the period.
What should the service auditor do next?
Best answer: C
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The correct next step is to evaluate the newly discovered information with management to determine whether it is reliable and whether it affects the system description or the service auditor’s conclusion. Only after that assessment would the auditor decide whether revision, disclosure, or further action is necessary.
In a SOC engagement, if the service auditor becomes aware after report issuance of facts that existed at the report date and those facts might have affected the report, the auditor should not jump straight to withdrawal, ignore the matter, or switch to testing a new period. The proper response is to discuss the matter with service organization management, assess the reliability of the information, and determine whether the system description or the opinion on design or operating effectiveness is affected. Here, the newly discovered deprovisioning failure occurred during the covered period and directly contradicts both the stated control operation and the conclusion that the control operated effectively throughout the period. If the matter is confirmed and is material, the auditor would then consider revising the report and taking appropriate steps regarding report users.
When facts existing at the report date are discovered after issuance, the service auditor first evaluates their reliability and effect on the report with management before deciding on revision or user notification.
Topic: Information Systems and Data Management
A company uses a SaaS billing application hosted by a cloud provider. Relevant facts:
Under this shared-responsibility arrangement, which responsibility remains with the customer organization?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The customer retains responsibility for logical access administration in the SaaS application because the facts say it configures roles, approves SSO mappings, and performs access reviews. The delayed removal of terminated users is therefore a customer control failure, not a provider infrastructure failure.
In a shared-responsibility model, the exact split depends on the service model and the contract. Here, the arrangement is SaaS, and the stated provider responsibilities cover the underlying environment: data center security, network perimeter, operating system patching, application updates, and database backups. The customer still controls how its own users access the application, including role setup, SSO group mapping, periodic access review, and timely deprovisioning of terminated employees. Because the audit issue involves former employees retaining access, the problem falls within the customer’s retained responsibility for logical access governance. A common mistake is assuming the provider is responsible for all security in SaaS, but customers still own important user-access and data-governance controls.
Logical user access within a SaaS application remains a customer responsibility when the contract assigns role configuration and access reviews to the customer.
Topic: Information Systems and Data Management
A CPA is reviewing documentation for a sales reporting mart. Management concludes the mart uses a snowflake schema rather than a star schema because a central fact table stores measures and foreign keys, while descriptive product data is further normalized into related dimension tables. Which source best supports management’s conclusion?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The best support is the data dictionary showing FactSales linked to DimProduct, with DimProduct further linked to DimBrand and DimCategory. That layout shows a central fact table plus normalized dimension tables, which identifies a snowflake schema.
In both star and snowflake schemas, the fact table holds numeric measures and foreign keys used for reporting. The difference is how dimension data is organized. A star schema keeps descriptive attributes together in a single denormalized dimension table, while a snowflake schema normalizes that dimension into related tables, such as separate product, brand, and category tables. Because the conclusion is about schema structure, the strongest evidence is documentation that shows table names, keys, and relationships. The data dictionary excerpt with FactSales, DimProduct, DimBrand, and DimCategory directly shows normalized dimensions branching from the fact table, so it best supports the snowflake conclusion.
It directly shows a central fact table and a product dimension split into related brand and category tables, which is the defining snowflake pattern.
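The star-versus-snowflake distinction can be checked mechanically from a data dictionary's table relationships. A sketch under assumed inputs: the relationship dictionary mimics the FactSales/DimProduct/DimBrand/DimCategory layout described above.

```python
# Hypothetical data-dictionary extract: child table -> parent tables via foreign keys.
relationships = {
    "FactSales":  ["DimProduct", "DimDate", "DimStore"],
    "DimProduct": ["DimBrand", "DimCategory"],  # dimension normalized further
}

def classify_schema(fact_table: str, rels: dict[str, list[str]]) -> str:
    """Star: dimensions hang only off the fact table.
    Snowflake: at least one dimension branches into its own related tables."""
    dimensions = rels.get(fact_table, [])
    snowflaked = any(rels.get(dim) for dim in dimensions)
    return "snowflake" if snowflaked else "star"

print(classify_schema("FactSales", relationships))  # snowflake
```

Because DimProduct itself has outgoing relationships to DimBrand and DimCategory, the classifier returns snowflake, matching management's conclusion.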
Topic: Information Systems and Data Management
A CPA is documenting how a company’s ERP affects the sales, cash collections, and reporting processes before deciding what controls to test.
At month-end, several shipments made on the last day of the month appeared in accounts receivable and the general ledger on the next day, and those shipments were missing from the month-end daily sales report.
What should the CPA do next to understand how the AIS affects the sales and reporting process?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: The right next step is to trace a month-end sale through the automated ERP flow and into the report source. Because the issue is a timing difference between shipment, invoicing, subledger and general ledger posting, and report output, a walkthrough shows exactly where the AIS affects recognition and reporting.
To determine how an accounting information system affects a business process, the CPA should first map the transaction through its trigger, automated processing steps, interfaces, and report source. In this scenario, shipment confirmation triggers invoicing and the receivable/revenue entry, cash receipts are posted later from a lockbox file, nightly summaries update the general ledger, and the daily sales report comes from the invoice table. Since the symptom involves last-day shipments appearing on the next day in both accounting records and the sales report, the most useful next step is an end-to-end walkthrough of a month-end sale, including the report logic or data source. That identifies whether the timing issue arises at shipment confirmation, invoice creation, batch posting, or report generation before moving to control testing or substantive procedures.
An end-to-end walkthrough is the next step because it reveals where the transaction timing changes across the integrated sales, subledger, general ledger, and reporting flow.
Topic: Security, Confidentiality and Privacy
An ISC practitioner reviews the following source material for a retail company:
Policy and notice excerpts
- Customer personal information may be shared with third parties only for purposes described in the privacy notice and supported by recorded consent when consent is required.
- Restricted data must be encrypted at rest and in transit.
- Financing-application SSNs must be deleted 90 days after the credit decision.
Incident facts
- Marketing sent an analytics vendor a file containing customer names, email addresses, purchase history, and financing-application status.
- The file did not include SSNs or payment card data.
- The vendor stored the file unencrypted in a shared workspace for 10 days.
- The vendor used the file to build targeted advertising audiences.
- The company had no recorded customer consent for using purchase-history data for targeted advertising.
Which characterization is most supported by the source material?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The decisive distinction is privacy, not just confidentiality. The source material shows customer personal information was shared and used for targeted advertising without the recorded consent required by policy, while the unencrypted storage separately indicates a confidentiality control failure.
Privacy focuses on whether personal information is collected, used, retained, and disclosed in line with stated notice, consent, and policy obligations. Confidentiality focuses on protecting information from unauthorized disclosure, often through controls such as encryption. Here, the source material explicitly says third-party sharing must align with the privacy notice and recorded consent when required. The vendor received identifiable customer data and used it for targeted advertising without recorded consent, so the core issue is an improper use and disclosure of personal information. The fact that the file was stored unencrypted is also important, but that fact supports a separate confidentiality weakness rather than replacing the privacy conclusion. Nothing in the scenario suggests availability is the main concern, and privacy issues do not require an external cyberattack to exist.
Policy focuses on permitted use and disclosure of personal information, so missing required consent makes privacy the decisive issue even though unencrypted storage also weakens confidentiality.
Topic: Security, Confidentiality and Privacy
During a SOC 2 confidentiality walkthrough, the CPA notes the following control test result:
| Item | Observation |
|---|---|
| Data | Vendor bank account and routing numbers submitted through the onboarding portal |
| Current handling | A nightly CSV extract is copied to a shared network folder for exception review |
| Access | The folder is readable by all AP clerks, 3 interns, and 8 IT developers |
| Protection | The folder is not encrypted at rest |
| Retention | Files are retained indefinitely |
| Policy | Confidential banking data must be encrypted, accessible only to designated AP exception reviewers, and deleted after 90 days |
Which corrective response best addresses the deficiency shown in the exhibit?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The best response is the one that fixes the actual confidentiality exposure in the exhibit. The files contain sensitive banking data, are broadly accessible, unencrypted, and kept too long, so the corrective action must tighten access, protect the data at rest, and enforce the retention limit.
A corrective response should directly address the control deficiency shown, not just add a general control around it. Here, confidential vendor banking data is stored in an unencrypted shared folder, accessible to users without a business need, and retained indefinitely even though policy requires encryption, least-privilege access, and deletion after 90 days. The strongest remediation is therefore to move the files to an encrypted location, limit access to the designated reviewers only, and automate retention. Training may improve awareness, but it does not remove the inappropriate access or indefinite retention. More frequent backups address availability, not confidentiality. A reconciliation helps completeness or accuracy of processing, not protection of sensitive data.
This directly remediates the confidentiality gap by aligning storage, access, and retention with the stated policy for sensitive banking data.
Topic: Security, Confidentiality and Privacy
Accounting staff access the company’s ERP remotely through a VPN. During a recent phishing campaign, several employees disclosed their usernames and passwords on a fake login page. The investigation found successful VPN logins from unusual foreign IP addresses using those valid credentials, and no unpatched-system exploit or malware execution was identified. Management concludes that requiring multifactor authentication for VPN access is the most appropriate preventive control.
Which source best supports management’s conclusion?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The best support is evidence that attackers successfully used stolen passwords to access a password-only VPN. That directly supports multifactor authentication as the most appropriate preventive control for this remote-access risk.
Multifactor authentication is a preventive control that is especially effective when the attack path involves stolen or guessed credentials. Here, the facts show a phishing event, successful remote logins using valid usernames and passwords, and a VPN that relied only on single-factor authentication. That combination directly supports adding MFA to remote access. By contrast, missing patches would support vulnerability management, inactive contractor accounts would support deprovisioning and access review controls, and blocked port scans would support perimeter monitoring or firewall hardening. Those controls may still matter, but they do not best address the specific risk of attackers authenticating with compromised user credentials.
It directly ties the unauthorized access to stolen passwords on a single-factor remote-access process, which multifactor authentication is designed to mitigate.
Topic: Security, Confidentiality and Privacy
The company’s incident response plan states:
At 7:40 a.m., a SOC analyst sees a successful VPN login to a recently terminated employee’s account from an unfamiliar IP address. Five minutes later, database logs show that account ran an export query against a table containing customer Social Security numbers. The analyst does not yet know whether the export file left the network.
What should the analyst do next?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: This is more than a routine security event. A successful login to a terminated employee account followed by an export query against Social Security number data makes unauthorized access and possible exposure reasonably likely, so the analyst should preserve evidence and escalate as a cybersecurity incident before considering external notification.
A security event is any observable occurrence, such as an alert, login, or log entry. A cybersecurity incident is a security event that actually causes or is reasonably likely to cause unauthorized access, data exposure, malware execution, or material disruption. Here, the successful use of a terminated employee’s account strongly suggests unauthorized access, and the subsequent export query against customer SSN data raises a clear confidentiality risk. That is enough to meet the incident threshold under the stated plan, even though exfiltration is not yet confirmed. The proper next step is to preserve relevant evidence and escalate internally under the incident response plan. Customer or regulatory notification may follow later, but only after investigation and legal/compliance review determine that a reportable breach occurred.
A terminated employee account successfully accessing SSN data makes unauthorized access reasonably likely, so incident escalation and evidence preservation are required before any external reporting decision.
Topic: Information Systems and Data Management
A CPA is testing a control over configuration parameters in an acquired billing application. Production parameters are maintained in an admin console outside the CI/CD pipeline and can affect invoice approval thresholds, automatic write-off limits, and posting dates.
Management states the key control is: each Friday, the application manager reviews a system-generated report of all production parameter changes for the week and agrees each change to an approved ticket.
The CPA has already:
What should the CPA do next?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The best next step is to validate the completeness and accuracy of the parameter-change report before using it for control testing. If the population is incomplete or altered, any sample drawn from it would not provide reliable evidence about whether all production parameter changes were reviewed.
When a control relies on a system-generated report, the CPA should first determine whether that report is complete and accurate enough for the intended test. Here, the key control is the weekly review of all production parameter changes, and the application manager’s spreadsheet export is the population that would likely be used for sampling. Because configuration parameters can be changed directly in production outside the CI/CD pipeline, those changes may not appear in normal code-deployment records. Reconciling the export to independent audit logs helps confirm that the report captures all relevant parameter changes and that none have been omitted or altered. Only after establishing that reliability should the CPA select samples to inspect approvals and review evidence.
Because the control depends on a report of all parameter changes, the CPA must first verify that the report population is complete and accurate.
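The completeness check described above can be sketched as a reconciliation of change identifiers between the manager's export and an independent audit log. The `CHG-` ticket IDs below are hypothetical.

```python
def reconcile(export_ids, audit_log_ids):
    """Compare the report population to an independent source.

    Returns (changes in the audit log missing from the export,
             changes in the export absent from the audit log)."""
    missing_from_export = set(audit_log_ids) - set(export_ids)
    extra_in_export = set(export_ids) - set(audit_log_ids)
    return missing_from_export, extra_in_export

# Hypothetical populations: the manager's spreadsheet export vs. the
# database-level audit log of production parameter changes.
export = ["CHG-101", "CHG-102", "CHG-104"]
audit_log = ["CHG-101", "CHG-102", "CHG-103", "CHG-104"]

missing, extra = reconcile(export, audit_log)
print(missing)  # {'CHG-103'} -- a production change the report never captured
```

Any nonempty `missing` set means the report population is incomplete, so sampling from it would understate the true population of parameter changes.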
Topic: Security, Confidentiality and Privacy
An online retailer allows customers to post product reviews. After one review was submitted, multiple users reported that opening the product page caused their browsers to run unexpected JavaScript and redirect them to a fake sign-in page. The security analyst concluded the site experienced a stored cross-site scripting attack.
Which evidence best supports that conclusion?
- `admin' OR '1'='1' --` in the username field followed by a database error.
- `<script>window.location='https://acct-verify.example'</script>` in the database, with the page template later displaying that review to other users without output encoding.

Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The best support is the incident record showing malicious JavaScript stored in the review field and later served to other users without output encoding. That is the defining pattern of stored cross-site scripting, where untrusted input is persisted and then executed in victims’ browsers.
Cross-site scripting occurs when an application includes untrusted input in a web page in a way that lets the browser execute it as code. In a stored XSS attack, the malicious payload is saved by the application, such as in a review, comment, or profile field, and then delivered to later visitors. The strongest supporting evidence therefore shows both persistence of the script and unsafe rendering to users. The incident record does exactly that by showing a `<script>` payload stored in the database and displayed without output encoding. By contrast, a quote-based condition like `OR '1'='1'` points to SQL injection, repeated acceptance of the same authenticated request points to a replay attack, and overwritten memory with a changed return address points to buffer overflow or return-oriented exploitation.
- The `<script>` content supports stored XSS because it shows script code persisted and executed in users’ browsers.
- `OR '1'='1'` is classic SQL injection evidence because it targets database query logic, not browser-side script execution.

Stored script content that is later rendered in other users’ browsers without output encoding is direct evidence of stored cross-site scripting.
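The defense implied by "without output encoding" can be shown in a few lines: encoding untrusted review content before rendering makes the stored payload display as text instead of executing. The payload string mirrors the scenario; the `render_review` helper is hypothetical.

```python
import html

# Payload from the scenario, persisted by the application in a review field.
stored_review = "<script>window.location='https://acct-verify.example'</script>"

def render_review(review_text):
    """Output-encode untrusted input before embedding it in the page."""
    return "<p>" + html.escape(review_text) + "</p>"

print(render_review(stored_review))
# <p>&lt;script&gt;window.location=&#x27;https://acct-verify.example&#x27;&lt;/script&gt;</p>
```

With encoding, later visitors' browsers receive entity-escaped text, which breaks the stored-XSS chain even though the malicious input was saved.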
Topic: Security, Confidentiality and Privacy
A company uses a SaaS vendor to process customer billing data.
Vendor file summary:
Which response is the best correction to this issue?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The best correction is to perform the CUECs the SOC 2 report assigns to the user entity and to monitor the vendor on an ongoing basis. The problem is not solved by shifting all responsibility to the vendor or by demanding a different report that is not required by the facts.
When a service provider’s SOC report identifies complementary user entity controls, the user entity must implement those controls for the overall control environment to work as intended. Here, the company failed to review vendor-admin access monthly and failed to notify the vendor when users terminated or changed roles, creating an access-control weakness. In addition, relying only on the onboarding review is insufficient vendor due diligence; ongoing monitoring should include obtaining updated assurance reports and evaluating any carve-out implications for the subservice organization. The most appropriate remediation is therefore to implement the required user-side access controls and strengthen periodic vendor monitoring, rather than overreacting by stopping service or demanding a different report type.
The gap is the company’s failure to perform user-entity responsibilities and ongoing vendor monitoring, not a missing provider-side control.
Topic: Information Systems and Data Management
A company’s accounting environment includes staff laptops, a centralized ERP server, an operating system installed on that server, and switches and routers connecting users to shared resources. Which statement best distinguishes the primary purpose of these IT architecture components?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The correct choice assigns each component to its actual function in a typical accounting environment. Operating systems manage a device’s resources, servers provide shared services, network infrastructure connects devices and carries traffic, and end-user devices are the tools employees use to interact with the system.
In an accounting environment, these components work together but serve different purposes. An operating system is software that manages a device’s hardware resources, files, memory, and processes so applications can run. A server is the computer or virtual instance that provides shared services such as ERP processing, database access, or file storage to multiple users. Network infrastructure, including switches and routers, enables communication between devices and systems. End-user devices, such as laptops and desktops, are the machines accountants use to enter, review, and approve transactions. Distinguishing these roles is important because control, security, and availability issues often depend on which component is actually responsible for a function.
This option correctly matches each component to its primary role in an accounting environment.
Topic: Security, Confidentiality and Privacy
A CPA is reviewing privileged access for Orion Co.’s billing database. Company policy states:
Current privileged access listing:
| Account | Assigned to | Role | MFA | Last review | Notes |
|---|---|---|---|---|---|
| DBA-MBrown | M. Brown | Database admin | Enabled | 18 days ago | Normal admin |
| SQLADMIN | DBA team | Database admin | Disabled | No review on file | Used for faster troubleshooting |
| AD-JLee | J. Lee | Domain admin | Enabled | 25 days ago | Normal admin |
Based on these facts, what should the CPA do next?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The shared SQLADMIN account is the only item that departs from normal privileged-access rules, but the policy allows a narrow break-glass exception. The CPA should first inspect whether that exception was formally approved and whether the required compensating controls exist before concluding there is a deficiency or recommending removal.
When a privileged-access listing shows a shared admin account, the next step is to compare it to policy and determine whether it qualifies for an approved exception. Unique named IDs with MFA are the normal standard because they preserve accountability for privileged actions. Here, the policy permits one emergency break-glass account, but only if it is password-vaulted, tied to incident tickets, and reviewed after each use. The exhibit does not show that support for SQLADMIN, and the note “used for faster troubleshooting” raises concern that it may be ordinary shared access rather than emergency access. The CPA should therefore inspect the exception evidence first. Only after evaluating that evidence should the CPA conclude whether there is a design or operating deficiency.
The shared privileged account may be an allowed exception only if required emergency-access controls are documented and operating.
Topic: Security, Confidentiality and Privacy
A company allows a logistics partner to connect through a site-to-site VPN.
Facts:
Which source would best support that conclusion?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The firewall rule extract is the best support because it directly shows the partner connection has broader access than the stated business need. That overbroad access creates a credible path for unauthorized access or lateral movement if the partner environment is compromised.
For third-party connections, the strongest evidence of cybersecurity threat is the artifact that shows the actual trust boundary or access scope. Here, the partner only needs HTTPS access to one DMZ-hosted API, so evidence that the VPN subnet can also reach internal systems such as a domain controller, ERP database, and file server directly supports the conclusion that the connection is overprivileged. That is a classic third-party risk: compromise of the partner could be leveraged to move deeper into the company’s environment. By contrast, a partner SOC report describes the partner’s controls, a delayed file incident shows an operational issue, and an API change ticket addresses application logic. None of those proves the company exposed unnecessary internal network access through the partner connection.
It directly shows the partner VPN can reach internal systems beyond the single DMZ API required, evidencing unnecessary lateral-movement exposure.
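The over-privilege test the firewall extract supports can be sketched as a comparison of what the partner subnet can actually reach against the single documented business need. The rule tuples, hostnames, and ports below are hypothetical stand-ins for the exhibit's systems.

```python
# Documented business need: HTTPS to one DMZ-hosted API only.
BUSINESS_NEED = {("dmz-api", 443)}

# Hypothetical firewall rule extract: (source, destination, port).
firewall_rules = [
    ("partner-vpn", "dmz-api", 443),
    ("partner-vpn", "dc01", 445),        # domain controller
    ("partner-vpn", "erp-db", 1433),     # ERP database
    ("partner-vpn", "fileserver", 445),  # file server
]

# Any partner-reachable destination outside the business need is
# unnecessary exposure that enables lateral movement.
overbroad = [
    (dest, port)
    for src, dest, port in firewall_rules
    if src == "partner-vpn" and (dest, port) not in BUSINESS_NEED
]
print(overbroad)
# [('dc01', 445), ('erp-db', 1433), ('fileserver', 445)]
```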
Topic: Security, Confidentiality and Privacy
Blue Harbor Co. discovered unauthorized access to its e-commerce customer database. Management is preparing a memo for the audit committee concluding that the incident is likely to create significant financial and operational impact through response costs, required notifications, service disruption, and reputational harm.
Which source material would BEST support that conclusion?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The incident record is the strongest support because it connects the breach to concrete business consequences: notification obligations, external response spending, downtime, and customer cancellations. Those facts directly substantiate financial, operational, reporting, and reputational effects rather than merely showing how the breach occurred.
To support a conclusion about the implications of a data breach, the best evidence should address business impact, not just technical cause or occurrence. Strong support includes the number and sensitivity of records affected, whether reporting or notification is required, expected response costs, operational disruption, and signs of customer or market reaction. The incident record does all of that by linking exfiltration to legal notification requirements, approved spending for remediation and customer support, system downtime, and increased cancellations. By contrast, an access listing, a firewall log, and a delayed patch record may help explain how the breach happened or confirm unauthorized activity, but they do not by themselves support the broader conclusion about financial and operational consequences.
It directly ties the breach to reporting obligations, measurable response costs, operational downtime, and customer fallout.
Topic: Information Systems and Data Management
Delta Sports stores customer acquisition channel in its CRM, customer invoices in its ERP, and cash receipts in a treasury application. The CFO concludes, “During Q1, customers acquired through trade shows took longer on average to pay invoices than customers acquired through the website.” Which source would best support this conclusion?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The best support is an integrated SQL result that combines acquisition channel, invoice dates, and receipt dates across the CRM, ERP, and treasury systems. That source directly measures average payment time by channel, which matches the CFO’s conclusion.
To support a conclusion about payment speed by acquisition channel, the evidence must combine data from multiple sources and calculate the stated metric. The CRM provides acquisition channel, the ERP provides invoice dates, and the treasury application provides receipt dates. A joined SQL result can connect these records using common keys and compute average days from invoice date to full payment for each channel during Q1. The other sources are weaker because they either show only one part of the needed information or only document that integration is possible. Good support for financial or operational analysis should be both relevant to the conclusion and complete enough to reproduce the analysis from underlying data.
This source directly integrates the needed fields across systems and computes the exact measure used in the conclusion.
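The integrated query the explanation describes can be sketched with an in-memory SQLite database: join CRM channel data, ERP invoice dates, and treasury receipt dates, then average days to pay by acquisition channel. All table names, columns, and values are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE crm_customers     (customer_id INTEGER, channel TEXT);
CREATE TABLE erp_invoices      (invoice_id INTEGER, customer_id INTEGER,
                                invoice_date TEXT);
CREATE TABLE treasury_receipts (invoice_id INTEGER, paid_date TEXT);
INSERT INTO crm_customers VALUES (1, 'trade_show'), (2, 'website');
INSERT INTO erp_invoices VALUES (10, 1, '2024-01-05'), (20, 2, '2024-01-05');
INSERT INTO treasury_receipts VALUES (10, '2024-02-19'), (20, '2024-01-25');
""")

# Join the three systems on their common keys and compute average
# days from invoice date to payment, grouped by acquisition channel.
rows = sorted(conn.execute("""
    SELECT c.channel, AVG(julianday(t.paid_date) - julianday(i.invoice_date))
    FROM crm_customers c
    JOIN erp_invoices i      ON i.customer_id = c.customer_id
    JOIN treasury_receipts t ON t.invoice_id = i.invoice_id
    GROUP BY c.channel
""").fetchall())
print(rows)  # [('trade_show', 45.0), ('website', 20.0)]
```

Because each system contributes a field the others lack, only a joined result can reproduce the CFO's channel-level metric from underlying data.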
Topic: Security, Confidentiality and Privacy
A CPA performs a walkthrough of the employee termination access-removal process.
Documented policy:
Walkthrough observation for one terminated employee:
How should the CPA best characterize this walkthrough observation?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The documented policy clearly requires access removal within 24 hours, and the walkthrough showed removal occurring well after that deadline. Because the policy and related detective monitoring exist, the issue is best classified as a control operation failure rather than a design problem.
In a walkthrough, the CPA compares what actually happened to what documented policy requires. Here, the policy requires IT operations to disable network and VPN access within 24 hours of the HR ticket. The observed removal took until Thursday afternoon after a Monday morning ticket, so the control did not operate as required. That makes the exception an operating deviation in a preventive access-removal control. It is not primarily a design deficiency because the process includes both a required timely disablement step and a weekly exception report. It is also not automatically a security incident, because the facts show delayed deprovisioning, not confirmed unauthorized use or harm. The weekly exception report is detective and may help identify exceptions, but it does not convert the late disablement into compliance.
The policy is designed and documented, but the observed execution failed to meet the 24-hour requirement, making this an operating deviation.
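The timeliness test behind this conclusion reduces to a timestamp comparison against the 24-hour requirement. The timestamps below are hypothetical but mirror the Monday-morning ticket and Thursday-afternoon removal in the walkthrough.

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=24)  # policy: disable access within 24 hours of the HR ticket

ticket_time = datetime(2024, 6, 3, 9, 0)     # Monday 9:00 a.m. HR ticket
disabled_time = datetime(2024, 6, 6, 15, 0)  # Thursday 3:00 p.m. disablement

elapsed = disabled_time - ticket_time
is_deviation = elapsed > SLA  # True -> the control did not operate as designed
print(elapsed, is_deviation)  # 3 days, 6:00:00 True
```

Running the same comparison across the full termination population would show whether this walkthrough item is an isolated deviation or a systematic operating failure.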
Topic: Considerations for System and Organization Controls Engagements
A CPA firm is engaged as the service auditor for a payroll processor’s SOC 2 examination. The payroll processor’s system description will use the inclusive method to include a cloud hosting provider as a subservice organization. During the examination period, the CPA firm also designed and implemented the hosting provider’s privileged-access approval workflow. The firm performed no nonattest services for the payroll processor itself.
How should this relationship be characterized for independence purposes?
Best answer: D
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The inclusive method brings the subservice organization into the SOC examination’s subject matter. Because the CPA firm designed and implemented a control for that included subservice organization, the relationship should be characterized as an independence impairment.
In a SOC engagement, the service auditor must be independent of the parties whose system and controls are included in the examination. When the inclusive method is used, the subservice organization’s services and relevant controls are incorporated into the system description and the report’s subject matter. That means the service auditor must maintain independence with respect to both the service organization and the included subservice organization. Here, the CPA firm designed and implemented the hosting provider’s privileged-access approval workflow, which is management-type or nonattest involvement with the included subservice organization. That impairs independence for the SOC examination, even though the firm did not perform nonattest services for the payroll processor itself. By contrast, under a carve-out presentation, the subservice organization’s controls are excluded from the service auditor’s direct opinion.
Under the inclusive method, the subservice organization is part of the subject matter, so designing and implementing its control impairs independence.
Topic: Considerations for System and Organization Controls Engagements
A manufacturer that is not acting as a service organization wants an independent report it can share with investors, lenders, major customers, and regulators to communicate its entity-wide cybersecurity risk management program and the effectiveness of controls within that program. Which source best supports the conclusion that a SOC for Cybersecurity report is the appropriate report?
Best answer: D
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The best support is the excerpt describing an entity’s cybersecurity risk management program and a report intended for general use. That combination is the defining purpose and audience of a SOC for Cybersecurity report.
A SOC for Cybersecurity report is designed to help an entity communicate information about its cybersecurity risk management program to a broad range of users, such as investors, customers, business partners, and regulators. It is not limited to service organizations, and it is a general-use report. The report includes management’s description of the program, management’s assertion, and the practitioner’s opinion on whether the description is presented in accordance with the description criteria and whether the controls were effective to achieve the entity’s cybersecurity objectives. By contrast, SOC 1 focuses on controls relevant to user entities’ financial reporting, SOC 2 focuses on a service organization’s system and Trust Services Criteria for restricted users, and SOC 3 is general use but still relates to a service organization’s system rather than an entity-wide cybersecurity risk management program.
This excerpt matches SOC for Cybersecurity because it addresses the entity’s cybersecurity risk management program and is intended for broad, general-use distribution.
Topic: Considerations for System and Organization Controls Engagements
A CPA is reviewing a SOC 2 Type 2 report for a cloud payroll processor. Which excerpt most clearly belongs in the service auditor’s tests of controls and results section, rather than in management’s assertion, the system description, CUECs, CSOCs, or service commitments?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The excerpt about sampling terminated-user tickets and noting one late removal is the only choice that reports the service auditor’s procedure and the result of that procedure. Management assertions, system descriptions, CUECs, CSOCs, and service commitments describe claims, system facts, responsibilities, or promises, not audit test results.
In a SOC 2 Type 2 report, the tests of controls and results section explains what the service auditor actually tested and what was found. Common clues are the control tested, the procedure performed, the sample or items examined, and any exceptions or deviations noted. Management’s assertion is management’s claim about fair presentation, suitable design, and operating effectiveness. The system description explains the service organization’s system and processing environment. CUECs and CSOCs identify complementary controls expected at user entities or carved-out subservice organizations. Service commitments are promises such as availability, security, or processing expectations. Those items help users understand the system and responsibilities, but they are not the service auditor’s testing results.
This excerpt describes the auditor’s test procedure, sample, and exception noted, which is what appears in tests of controls results.
Topic: Information Systems and Data Management
A company is updating its continuity program. In workshops with process owners, management identifies payroll and order entry as critical processes, documents each process’s application, network, and third-party dependencies, sets payroll at an RTO of 4 hours and an RPO of 30 minutes, and ranks payroll ahead of expense reimbursement for recovery. How should this work be characterized?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The scenario describes classic business impact analysis activities: identifying critical business processes, mapping dependencies, and assigning recovery objectives and priorities. Those steps determine availability requirements and recovery order rather than evaluating threats, testing recovery, or managing a live incident.
A business impact analysis (BIA) helps an organization determine which business processes are most critical, what systems and external services those processes depend on, how quickly they must be restored, and how much data loss is tolerable. Typical BIA outputs include process criticality rankings, dependency mapping, recovery priorities, and availability requirements such as RTO and RPO. In this scenario, management is interviewing process owners, identifying critical functions, documenting dependencies, and setting recovery targets, all of which are hallmark BIA activities. By contrast, a risk assessment emphasizes threats, vulnerabilities, and likelihood; a disaster recovery test checks whether restoration procedures actually work; and incident response focuses on detecting, containing, and recovering from a specific event.
This work defines business criticality, supporting dependencies, and recovery targets, which are core outputs of a business impact analysis.
Topic: Information Systems and Data Management
During a walkthrough of the billing application, a CPA learns:
Which control response best addresses this change management gap?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: Configuration parameters can change transaction processing just like code can. The best response is to bring those parameter changes under formal change management with approval, testing, controlled deployment, and automated audit logging before they affect production.
The gap is that production configuration changes are bypassing formal change control. Even though code moves through CI/CD, key billing parameters are being changed directly in production without approval, testing, or a retained history of what changed. That creates risk of unauthorized, erroneous, or undocumented changes affecting billing results. The strongest control response is to treat these parameters as controlled configuration items: store them under version control, require approved change requests, test changes in nonproduction, deploy them through the controlled release process, and keep automated before-and-after logs. That combination addresses both design and implementation weaknesses by adding preventive and traceability controls. After-the-fact documentation or periodic review may help detect issues, but they do not adequately govern the change before it affects production.
This response directly adds authorization, testing, controlled deployment, and an audit trail for production configuration changes.
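A minimal sketch of the traceability piece of that control, in Python; the parameter names, ticket fields, and log shape are hypothetical, and a real implementation would sit inside the release tooling rather than application code.

```python
import datetime

# Hypothetical production parameters for a billing application.
billing_config = {"late_fee_pct": 1.5, "grace_days": 10}

audit_log = []  # automated before-and-after trail for every change

def apply_approved_change(config, key, new_value, ticket):
    """Apply a parameter change only under an approved change ticket,
    recording before and after values so the change is traceable."""
    if not ticket.get("approved"):
        raise PermissionError("change request not approved")
    audit_log.append({
        "ticket": ticket["id"],
        "key": key,
        "before": config.get(key),
        "after": new_value,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    config[key] = new_value

apply_approved_change(billing_config, "grace_days", 15,
                      {"id": "CHG-1042", "approved": True})
print(audit_log[0]["before"], "->", audit_log[0]["after"])  # 10 -> 15
```

The design choice worth noticing is that authorization is enforced *before* the production value changes, and the evidence trail is produced automatically rather than reconstructed after the fact.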
Topic: Security, Confidentiality and Privacy
A company’s incident response plan includes:
At 9:05 a.m., monitoring alerts show a successful login to the production database using an administrator account from an unapproved country. At 9:08 a.m., the service desk opens a ticket and notifies the security analyst. At 9:14 a.m., the analyst confirms that a 2 GB file containing customer records was downloaded during the session.
According to the plan, what should the security analyst do next?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The plan assigns the security analyst the immediate tasks of preserving evidence, classifying severity, and escalating likely privileged-account or customer-data incidents within the initial response timeline. External notifications, shutdown decisions, and audit review belong to other roles or later phases.
A well-designed incident response plan defines who does what, in what order, and by when. In this scenario, the analyst has already received the ticket and confirmed suspicious activity involving both a privileged account and customer data, which fits the plan’s escalation criteria. The next step is to preserve relevant evidence, complete initial triage and severity classification within the 30-minute window, and notify the Incident Response Manager immediately after classification. That sequence supports investigation integrity and timely coordination. The plan also clearly separates responsibilities: the system owner decides on taking systems offline, while Legal and the Privacy Officer evaluate external notifications after containment. Internal audit review is not the immediate next response step.
The plan requires the analyst to preserve evidence, classify likely privileged-account and customer-data events within the response window, and then escalate them immediately to the Incident Response Manager.
Topic: Security, Confidentiality and Privacy
A CPA is testing a company’s response to a malware incident. The approved incident response plan requires responders to assign a severity level within 1 hour, preserve logs, and notify the privacy officer when a system containing customer PII is involved. In the incident file, the team isolated the affected file server and restored operations, but no severity rating or privacy-officer notification was documented. The incident manager says the team used an informal shortcut because production had to resume quickly.
Which documentation or follow-up is most appropriate?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The incident file shows required plan steps were skipped, so the difference must be treated as an exception, not ignored as a minor documentation gap. The proper follow-up is to document the deviation, understand why it occurred, and determine whether the problem was noncompliance with the plan or a plan that needs revision.
When actual incident response procedures do not match the approved plan, the reviewer should document the deviation and investigate it. That means obtaining management’s explanation, confirming what actions were actually performed, and evaluating the control impact. The key follow-up is to determine whether the approved plan was adequate but not followed, which indicates an operating issue, or whether the plan itself is outdated or incomplete, which indicates a design issue. That conclusion drives remediation such as training, escalation, corrective action, or formal plan updates. Simply accepting fast recovery, filling in missing records after the fact, or changing the plan to match an unapproved shortcut would weaken governance and leave the organization unable to show that incidents are handled consistently and according to approved requirements.
A mismatch between actual incident handling and the approved plan should be documented as an exception and evaluated to determine whether the issue is poor execution or an outdated plan.
Topic: Security, Confidentiality and Privacy
A CPA is evaluating an event under the company’s documented data-handling policy.
Data handling policy
- Tax ID numbers and bank account numbers are classified as Restricted Personal Data.
- Restricted Personal Data may be shared with approved processors only for contracted services and only through the company's approved SFTP portal.
- Email attachments may not be used to transmit Restricted Personal Data, even if password protected.
Incident summary
- PayPro LLC is an approved payroll processor under a signed data processing agreement.
- During scheduled SFTP maintenance, an HR analyst emailed a password-protected spreadsheet to PayPro.
- The spreadsheet contained employee names, tax ID numbers, and bank account numbers.
- PayPro downloaded the file and then deleted the email.
Which conclusion is best supported by the exhibit?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The exhibit shows that the recipient was authorized, but the transmission method was not. Restricted personal data had to be sent through the approved SFTP portal, and the policy explicitly says email attachments are not allowed even if password protected.
The correct conclusion comes from reading the source material in sequence: classify the data, confirm whether the recipient is permitted, and then verify whether the handling method complies with policy. Tax ID numbers and bank account numbers are Restricted Personal Data. PayPro is an approved processor under contract, so sharing for payroll services is allowed in principle. However, the policy limits transmission of that data to the company’s approved SFTP portal and specifically forbids email attachments, even when password protected. Therefore, the facts support a confidentiality/data-handling control violation. The temporary SFTP outage may explain the analyst’s action, but it does not make the transmission compliant, and the policy does not prohibit all sharing with approved processors.
The policy expressly prohibits emailing restricted personal data, so using email violated the required transmission control even though the processor was approved.
Topic: Security, Confidentiality and Privacy
A company is reviewing how it reduces exposure of sensitive data across several processes.
| Process | Exhibit fact |
|---|---|
| Payment processing | The internal order system stores a surrogate value for each card number. Only the payment processor can map the surrogate back to the actual card number in a separate secured vault. |
| Analytics reporting | A weekly customer file sent to analysts replaces customer names with random IDs and shows only the birth year instead of the full date of birth. |
| Outbound communications | The email gateway scans messages and file uploads for patterns matching SSNs and payment card numbers and blocks transmission unless an exception is approved by security. |
Which conclusion is best supported by the exhibit?
Best answer: C
What this tests: Security, Confidentiality and Privacy
Explanation: The exhibit shows three different data protection concepts serving different purposes. The payment system replaces card numbers with surrogate values stored through a separate mapping vault, which is tokenization; the analyst file alters identifying details, which is data obfuscation; and the email gateway detects and blocks outbound sensitive data, which is DLP.
Tokenization reduces exposure by replacing sensitive data, such as a card number, with a non-sensitive surrogate and keeping the original value separate in a secured token vault. Data obfuscation reduces exposure by masking, generalizing, or otherwise altering data so users can work with it without seeing the full sensitive values. DLP focuses on preventing unauthorized transmission of sensitive data by monitoring and blocking outbound channels such as email or file uploads. In the exhibit, the order system stores a surrogate tied to a separate vault, so that is tokenization. The analytics file substitutes random IDs and truncates date of birth to year only, so that is obfuscation. The email gateway scans for SSNs and card numbers and blocks transmission, so that is DLP.
A surrogate mapped through a separate vault is tokenization, altering visible data for analysis is obfuscation, and scanning/blocking outbound sensitive data is DLP.
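The tokenization concept can be sketched in a few lines of Python; the class and token format are illustrative assumptions, and a production vault would be a separately secured service, not an in-process dictionary.

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: callers hold only surrogate values;
    the real card numbers live solely in this separate vault."""
    def __init__(self):
        self._vault = {}  # token -> real value, kept apart from callers

    def tokenize(self, card_number):
        token = "tok_" + secrets.token_hex(8)  # non-sensitive surrogate
        self._vault[token] = card_number
        return token

    def detokenize(self, token):
        # Only the vault holder (the payment processor, in the exhibit)
        # can map the surrogate back to the actual card number.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token.startswith("tok_") and token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

Contrast this with obfuscation, which alters the visible data itself (random IDs, birth year only), and with DLP, which inspects and blocks outbound channels rather than transforming stored values.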
Topic: Security, Confidentiality and Privacy
A payroll service company discovered that an attacker downloaded a file containing employees’ names, Social Security numbers, and bank account details. The incident response team disabled the payroll portal for 2 days while forensic specialists preserved logs and restored clean backups. Management also engaged outside legal counsel to evaluate notification requirements.
Which consequence is most likely from this breach?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The best conclusion is that this breach has both financial and operational implications beyond IT repair. The facts show sensitive data exposure, portal downtime, outside specialists, and legal review, all of which point to response costs, disruption, possible reporting, and reputational effects.
When a data breach exposes sensitive personal information, the impact usually extends beyond replacing hardware or recovering files. Direct financial effects often include forensic investigation, legal counsel, notification support, credit monitoring, public relations, and remediation activities. Operational effects can include downtime, delayed processing, and diverted staff attention while systems are investigated and restored. Reporting or notification obligations may arise because personal data was accessed, and those obligations are not removed simply because backups exist or systems are recovered. Reputational harm is also a common consequence, especially for a payroll service provider that is trusted with confidential employee information. In this scenario, the combination of data exfiltration, two days of portal outage, outside experts, and legal evaluation makes the broad business impact the most appropriate conclusion.
A breach involving sensitive personal data commonly creates remediation costs, service interruption, potential reporting obligations, and reputational harm all at once.
Topic: Considerations for System and Organization Controls Engagements
A payroll processing company is preparing a SOC 1 report. Its production environment and backups are hosted by a third-party cloud provider, and management elects the carve-out method for that subservice organization. The payroll company’s control objectives depend in part on the cloud provider’s physical security and backup infrastructure controls.
Which conclusion is most appropriate?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The correct conclusion is that, under the carve-out method, controls performed by the subservice organization can be identified as complementary subservice organization controls when those controls are necessary to achieve the service organization’s control objectives. They are relevant to users’ understanding of the system, but they are not included in the service auditor’s testing scope under carve-out.
Complementary subservice organization controls are controls at a subservice organization that the service organization expects to be in place because they are needed to meet the stated control objectives or criteria. When the carve-out method is used, the subservice organization is excluded from the scope of the service auditor’s opinion and testing. Even so, management may identify those subservice organization controls in the system description so user entities understand important assumptions about outsourced activities. This differs from the inclusive method, where the subservice organization’s controls are included in the system description and in the service auditor’s testing. It also differs from complementary user entity controls, which are controls the user entity, not the subservice organization, is expected to implement.
Under the carve-out method, relevant subservice organization controls can be presented as complementary subservice organization controls without being included in the service auditor’s scope.
Topic: Information Systems and Data Management
A CPA is evaluating change control over a retailer’s CI/CD process and notes the following:
Which control should management implement to best reduce the risk of unauthorized or insufficiently reviewed production changes?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The main problem is that developers can change production settings directly and approve their own work. The best response is to restrict production access, require independent authorization, and review emergency changes after implementation so the process preserves separation of duties.
Effective change control policies should require authorized changes, documented acceptance criteria before implementation, appropriate testing or review, logging and monitoring, and separation of duties over production access. In this scenario, automated tests already exist, but they do not address the more serious weakness: developers can bypass normal controls by editing production configuration directly and then closing their own tickets. That creates risk of unauthorized, unreviewed, or improperly implemented changes. Restricting production access to a limited release role and requiring separate approval before deployment directly strengthens access restrictions and authorization. For true emergencies, a break-glass style process can allow expedited changes, but it should still require logging and a post-implementation review.
This control directly addresses the core weakness by enforcing access restrictions, separation of duties, and authorization over production changes.
Topic: Security, Confidentiality and Privacy
An entity’s incident response plan states:
During daily monitoring, the SOC identifies 180 failed VPN login attempts from one external IP address against one employee account over 10 minutes. MFA blocked access, the account auto-locked after five attempts, and logs show no successful login, data access, privilege change, or service disruption.
What is the most appropriate conclusion?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The blocked VPN attempts are a security event because they show suspicious activity without evidence of unauthorized access, disruption, or data compromise. Since the controls worked and no harm is indicated, the facts support logging, preserving, and monitoring the event rather than launching full incident response or breach reporting.
A security event is an observable occurrence, such as failed logins, alerts, or policy violations, that may warrant review. A cybersecurity incident is a subset of events that actually compromises, or is likely to compromise, confidentiality, integrity, or availability, or otherwise triggers formal response obligations. In this scenario, the attempted access was blocked by MFA and account lockout, and there is no evidence of successful entry, privilege change, data access, or service disruption. That means the activity should still be documented and monitored, but it does not yet meet the stated threshold for incident-level escalation or reporting assessment. The key distinction is that suspicious activity alone is not automatically an incident; the classification depends on whether compromise or likely compromise is present.
The activity shows an attempted attack, but the facts given show no actual or likely compromise, so event handling is appropriate without incident-level escalation.
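The event-versus-incident threshold in the scenario can be expressed as a simple triage rule; this Python sketch assumes a plan that escalates only on actual or likely compromise of confidentiality, integrity, or availability, and the field names are hypothetical.

```python
def classify(event):
    """Sketch of event-vs-incident triage: suspicious activity alone is
    an event; compromise (or likely compromise) makes it an incident."""
    compromise_indicators = (
        event.get("successful_login"),
        event.get("data_accessed"),
        event.get("privilege_change"),
        event.get("service_disruption"),
    )
    return "incident" if any(compromise_indicators) else "event"

# The blocked VPN brute-force from the scenario: MFA and lockout held.
vpn_activity = {
    "failed_logins": 180,
    "successful_login": False,
    "data_accessed": False,
    "privilege_change": False,
    "service_disruption": False,
}
print(classify(vpn_activity))  # -> event: log, preserve, and monitor
```

Note that the 180 failed attempts never enter the decision: volume makes the event worth reviewing, but classification turns on evidence of compromise.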
Topic: Information Systems and Data Management
A manufacturer uses a permissioned blockchain to pay certain suppliers. Once a payment is signed and confirmed on the blockchain, the ERP automatically posts the transaction to cash and accounts payable in the general ledger. The AP supervisor can both prepare the payment and use the wallet’s single private key to sign and release it, and blockchain payments cannot be reversed. Which control would best address the financial reporting risk in this process?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The key risk is that one person can both initiate and authorize an irreversible blockchain payment that automatically affects the general ledger. A multi-signature wallet adds transaction-level segregation of duties before the payment is committed on-chain.
In a blockchain payment process, control over the private key is effectively control over transaction authorization. Here, the same AP supervisor can prepare and sign the payment, and the confirmed transaction automatically posts to cash and accounts payable. Because blockchain transactions are generally irreversible, an unauthorized or erroneous payment can create an immediate financial reporting misstatement. A multi-signature wallet is the best control because it requires independent approvals before the transaction is broadcast, embedding segregation of duties into the blockchain workflow itself. This directly addresses the authorization risk at the point where the payment becomes final. Controls such as more confirmations, backups, or report hashing may improve finality, recovery, or integrity evidence, but they do not prevent a single person from releasing an improper payment.
A multi-signature approval control directly reduces the risk of unauthorized irreversible blockchain payments that would automatically misstate cash and accounts payable.
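The segregation-of-duties effect of a multi-signature wallet can be sketched as a threshold check; this is a conceptual Python illustration with hypothetical signer names, not an implementation of any real wallet protocol.

```python
class MultiSigWallet:
    """Sketch of a 2-of-3 multi-signature release control: a payment
    can broadcast only after approvals from distinct authorized signers."""
    def __init__(self, signers, threshold=2):
        self.signers = set(signers)
        self.threshold = threshold
        self.approvals = {}  # payment_id -> set of approving signers

    def approve(self, payment_id, signer):
        if signer not in self.signers:
            raise PermissionError(f"{signer} is not an authorized signer")
        self.approvals.setdefault(payment_id, set()).add(signer)

    def can_broadcast(self, payment_id):
        return len(self.approvals.get(payment_id, set())) >= self.threshold

wallet = MultiSigWallet({"ap_supervisor", "treasurer", "controller"})
wallet.approve("PAY-001", "ap_supervisor")
assert not wallet.can_broadcast("PAY-001")   # one person cannot release
wallet.approve("PAY-001", "treasurer")
assert wallet.can_broadcast("PAY-001")       # independent second approval
```

Because approvals are tracked as a set of distinct signers, the AP supervisor approving twice still counts once; the control is enforced before the irreversible on-chain step, which is the whole point.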
Topic: Information Systems and Data Management
An ISC associate is extracting a population to test whether all customer shipments over $25,000 made in June 2026 had a carrier tracking number recorded.
Facts:
- orders contains one row per customer order.
- ship_date is the date goods left the warehouse.
- invoice_date is the billing date and can differ from ship_date.

SELECT order_id, customer_id, invoice_date, order_amount, tracking_no
FROM orders
WHERE invoice_date BETWEEN '2026-06-01' AND '2026-06-30'
AND order_amount > 25000;
Which conclusion is correct about the relevance of the retrieved data set to the stated objective?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: The data set is not relevant because the query filters on billing dates instead of shipment dates. Since the objective is specifically about June shipments, using invoice_date can include the wrong records and omit the right ones.
To assess SQL query relevance, compare the business or assurance objective to the table fields and filter logic used in the query. Here, the objective is to test shipments made in June 2026 that exceeded $25,000 and whether they had tracking numbers recorded. The decisive population-defining field is ship_date, because that identifies when goods actually left the warehouse. The query instead filters on invoice_date, and the facts state that invoice dates can differ from shipment dates. As a result, the extracted records may include orders invoiced in June but shipped in another month, and may miss June shipments invoiced earlier or later. That makes the retrieved data set irrelevant to the stated testing objective.
Selecting order_amount and tracking_no is not enough if the query pulls the wrong population in the first place. Including customer_id does not make a result set unreliable; extra columns can be unnecessary, but they do not defeat relevance. The filter > 25000 matches the phrase “over $25,000”; using >= 25000 would incorrectly include exactly $25,000 orders.
The objective is defined by shipment timing, and invoice dates can differ from shipment dates.
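The population error can be demonstrated with a tiny in-memory database; the two sample orders below are hypothetical and chosen so that each filter returns a different record.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
    order_id TEXT, customer_id TEXT, order_amount REAL,
    ship_date TEXT, invoice_date TEXT, tracking_no TEXT)""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?, ?, ?)",
    [
        # Shipped in June, invoiced in July: the objective includes it.
        ("A1", "C1", 30000, "2026-06-28", "2026-07-02", "TRK1"),
        # Invoiced in June, shipped in May: the objective excludes it.
        ("A2", "C2", 40000, "2026-05-30", "2026-06-03", "TRK2"),
    ],
)

def pull(date_col):
    """Run the extraction filtering on the given date column."""
    rows = conn.execute(
        f"""SELECT order_id FROM orders
            WHERE {date_col} BETWEEN '2026-06-01' AND '2026-06-30'
              AND order_amount > 25000"""
    ).fetchall()
    return [r[0] for r in rows]

print(pull("invoice_date"))  # ['A2'] -- wrong population
print(pull("ship_date"))     # ['A1'] -- matches the June-shipment objective
```

Order A1 is a June shipment the flawed query misses entirely, and A2 is a May shipment it wrongly includes, so the result set is irrelevant to the objective even though every column it returns is accurate.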
Topic: Security, Confidentiality and Privacy
An entity uses the following control activities for its VPN gateways, firewalls, and employee laptops:
Which is the best interpretation of this process?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The described activities are the core elements of vulnerability management: regular identification of weaknesses, prioritization based on risk, remediation, and follow-up validation. Its purpose is to reduce exposure before weaknesses are exploited.
Vulnerability management is an ongoing process, not a one-time task. It typically includes regularly scanning systems or devices for weaknesses, evaluating and prioritizing the findings based on factors such as severity and exposure, assigning remediation actions, and then retesting to confirm the issues were actually resolved. In this scenario, weekly authenticated scans identify weaknesses, severity ranking prioritizes them, remediation tickets drive corrective action, and rescanning verifies closure. That combination fits vulnerability management for network, device, endpoint, and remote-access environments. It is different from incident response, which addresses actual or suspected security events, and different from access administration or change management, which have narrower purposes.
The activities shown match the ongoing cycle of finding weaknesses, ranking them for action, fixing them, and confirming remediation.
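The scan, prioritize, remediate, and verify cycle can be sketched as data flow; the finding IDs, asset names, and CVSS scores below are hypothetical, and a real program would re-run the authenticated scanner at the rescan step.

```python
# Hypothetical weekly scan output for gateways, firewalls, and laptops.
findings = [
    {"id": "VULN-1", "asset": "vpn-gw-1", "cvss": 9.8, "fixed": False},
    {"id": "VULN-2", "asset": "laptop-042", "cvss": 5.3, "fixed": False},
    {"id": "VULN-3", "asset": "fw-edge", "cvss": 7.5, "fixed": False},
]

# Prioritize by severity, open remediation tickets in that order.
queue = sorted(findings, key=lambda f: f["cvss"], reverse=True)
tickets = [{"finding": f["id"], "priority": i + 1}
           for i, f in enumerate(queue)]

def rescan(finding):
    # Placeholder for re-running the scan; here we read recorded state.
    return finding["fixed"]

queue[0]["fixed"] = True  # remediation applied to the critical item
closed = [f["id"] for f in queue if rescan(f)]
print(tickets[0], closed)  # highest-severity finding is worked first
```

The cycle matters more than any single step: findings that fail the rescan stay open and re-enter the queue, which is what distinguishes vulnerability management from a one-time scan.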
Topic: Information Systems and Data Management
An entity uses the following accounts payable architecture:
At month-end, the controller notes:
- The interface log shows transaction code VCM is not mapped to any GL posting rule.

What is the best correction?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The problem is not in the system of record, the subledger, or the reporting layer. Those amounts agree and already include the credit memos. The missing balance in the GL is caused by an interface mapping failure, so the best correction is to fix that posting rule and remediate the affected GL amounts.
In an accounting information system, the system of record captures the original transaction, the subledger maintains detailed accounting balances, the interface moves data between layers, the general ledger holds summarized account balances, and the reporting layer presents information from its assigned source. Here, vendor credit memos were entered, the AP aging agrees to the AP subledger, and the BI dashboard reads from that same subledger data. That means the source transactions, detailed accounting records, and reporting layer are functioning as intended. The mismatch appears only in the GL control account, and the interface log identifies why: transaction code VCM was not mapped to a GL posting rule. The best remediation is to correct the interface so those subledger transactions post to the GL and then reconcile any periods already affected.
The source transactions and subledger are complete, so the defect is the unmapped interface rule that prevents credit memos from reaching the GL.
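A completeness check on the interface mapping makes the defect, and the fix, concrete; the posting-rule table and transaction codes below are hypothetical stand-ins for the scenario's AP interface.

```python
# Hypothetical GL posting rules for the AP interface; note VCM missing.
posting_rules = {
    "INV": ("2000-AP", "vendor invoice"),
    "PMT": ("2000-AP", "vendor payment"),
}

subledger_batch = [
    {"txn_code": "INV", "amount": 1200.00},
    {"txn_code": "VCM", "amount": -150.00},  # vendor credit memo
]

def unmapped_codes(batch, rules):
    """Completeness check the interface should run before posting:
    every subledger transaction code must map to a GL posting rule."""
    return sorted({t["txn_code"] for t in batch} - set(rules))

missing = unmapped_codes(subledger_batch, posting_rules)
print(missing)  # ['VCM'] -- the credit memos silently never reach the GL

# The correction: map the code, then repost and reconcile prior periods.
posting_rules["VCM"] = ("2000-AP", "vendor credit memo")
assert unmapped_codes(subledger_batch, posting_rules) == []
```

Running a check like this as a preventive interface control would surface the gap at posting time instead of at month-end reconciliation.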
Topic: Security, Confidentiality and Privacy
During an ISC engagement, a CPA reviews an incident summary for a company’s public customer portal:
Which attack type best fits this incident?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: This incident is best classified as a distributed denial-of-service attack because the key fact is the massive volume of traffic coming from many external IP addresses and disrupting availability. The facts do not indicate code exploitation, malware execution, or user deception.
A distributed denial-of-service attack attempts to make a system or service unavailable by flooding it with traffic or requests from many different sources. In this scenario, the portal became unavailable, the request volume spiked sharply, and the traffic came from more than 15,000 external IP addresses across multiple countries. Those facts point to a distributed attack focused on exhausting capacity rather than exploiting application logic or compromising credentials. A web application attack would usually show signs such as malicious input, abnormal database errors, or targeted exploitation of forms or APIs. Malware would involve malicious code running on a device or server, and social engineering would involve manipulating people into revealing information or taking unsafe actions. The decisive clue here is the distributed traffic flood affecting availability.
A web application attack is tempting because the traffic targeted a website, but the facts show volume-based disruption rather than exploitation of application functions or inputs. A malware attack does not fit because there are no signs of malicious code execution, infected hosts, or system compromise. A social engineering attack is incorrect because no employee or user was manipulated into disclosing information or performing an unsafe action. A distributed denial-of-service attack fits because many external sources generated enough traffic to impair system availability.
The incident centers on availability disruption caused by overwhelming traffic from many distributed external sources, which is characteristic of a DDoS attack.
Topic: Considerations for System and Organization Controls Engagements
A service organization is issuing a SOC 2 Type 2 report for the period January 1 through December 31, Year 1. The following control was described and tested:
How should this matter be characterized in the results of tests of controls?
Best answer: D
What this tests: Considerations for System and Organization Controls Engagements
Explanation: Because the control is suitably designed but one tested instance was not performed as specified, the issue is an operating effectiveness exception. In a SOC 2 Type 2 report, that deviation should be described in the results of tests for the specific control, even if no unauthorized access was identified.
In a SOC 2 Type 2 report, the results of tests of controls explain whether the described controls operated effectively over the period. Here, the control design is appropriate: quarterly privileged-access reviews with documentation. The identified problem is that one quarter’s review was completed 45 days late, so the control did not operate as described for that instance. That is an operating effectiveness exception, and the results of tests should describe the deviation for Q3. The fact that the review was later completed and no unauthorized access was detected may affect the significance of the exception, but it does not erase the deviation. This is not a design deficiency, and it is not a scope limitation because the practitioner has evidence showing what happened.
The control design is appropriate, so the late Q3 review is a deviation in operation that should be reported in the results of tests for that control.
Topic: Considerations for System and Organization Controls Engagements
A CPA is finalizing a SOC 2 Type 2 report on the security category for the period January 1-December 31, 20X5. The planned report date is February 7, 20X6.
Timeline:
What is the best interpretation for the SOC engagement?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The key fact is not when the exports occurred, but when the underlying control failure existed. Because the shared privileged credential existed during December, the CPA must perform additional procedures and evaluate whether the SOC 2 Type 2 conclusion for the covered period is affected before dating the report.
In a SOC 2 Type 2 engagement, the practitioner considers relevant events up to the report date. The critical distinction is whether the later-discovered matter arose only after the examination period or instead reveals a condition that existed during the period being reported on. Here, the unauthorized exports happened in January, but the investigation showed that the underlying control problem - shared privileged access bypassing individual authentication - existed during December, within the specified period. That means the CPA cannot treat this as only a post-period disclosure matter. Additional procedures are needed to determine the severity and effect of the control failure on operating effectiveness for the period ended December 31. The report should not be dated earlier than the date that evaluation is completed.
The January discovery revealed a control breakdown that existed during the covered period, so the CPA must reassess the period-end opinion before dating the report.
Topic: Information Systems and Data Management
Management wants a monthly report showing, by SKU, (1) net revenue and (2) average days from customer order to final delivery. For this report, net revenue equals invoiced sales minus return credits.
| System | Join fields available | Relevant fields |
|---|---|---|
| ERP Order/Invoice | order_id, invoice_id, sku | order_date, qty_invoiced, unit_price |
| Warehouse Management System | order_id, sku | ship_date, delivery_date, qty_shipped |
| CRM Returns | invoice_id, sku | return_qty, credit_amount, return_date |
| General Ledger Summary | accounting_period, product_line | total_revenue, returns_reserve |
Which conclusion is best supported by the exhibit?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The report needs multiple transaction-level sources. ERP provides order date and invoiced sales, the warehouse system provides delivery date, and CRM provides return credits, so combining those three sources is necessary to calculate both SKU-level net revenue and delivery cycle time.
When a report requires both financial and operational measures, the best data set usually combines detailed source records from each step of the process. Here, net revenue by SKU needs invoiced sales from ERP and return credits from CRM. Average days from order to final delivery needs the ERP order date and the warehouse delivery date. The shared identifiers shown in the exhibit—order_id, invoice_id, and SKU—support matching the records at a detailed level. The general ledger summary is too aggregated because it is by accounting period and product line, not by SKU or individual order. Using only one or two of the detailed systems would leave at least one required measure incomplete.
The ERP system alone lacks delivery_date and return-credit detail, so it cannot produce both requested metrics.
This is the only combination that provides detailed sales, delivery, and return-credit data needed to calculate both requested SKU-level measures.
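The three-source join can be sketched for a single SKU; the records below are hypothetical, and a real extract would join full tables on the shared keys rather than single rows.

```python
from datetime import date

erp = [  # ERP order/invoice detail
    {"order_id": "O1", "invoice_id": "I1", "sku": "S1",
     "order_date": "2026-06-01", "qty_invoiced": 10, "unit_price": 20.0},
]
wms = [  # warehouse shipment detail
    {"order_id": "O1", "sku": "S1", "delivery_date": "2026-06-06"},
]
crm = [  # return-credit detail
    {"invoice_id": "I1", "sku": "S1", "credit_amount": 30.0},
]

row = erp[0]
# Join WMS on (order_id, sku) and CRM on (invoice_id, sku).
ship = next(w for w in wms
            if (w["order_id"], w["sku"]) == (row["order_id"], row["sku"]))
ret = next((c for c in crm
            if (c["invoice_id"], c["sku"]) == (row["invoice_id"], row["sku"])),
           None)

# Net revenue = invoiced sales minus return credits.
net_revenue = (row["qty_invoiced"] * row["unit_price"]
               - (ret["credit_amount"] if ret else 0))
# Cycle time = order date (ERP) to final delivery (WMS).
days_to_delivery = (date.fromisoformat(ship["delivery_date"])
                    - date.fromisoformat(row["order_date"])).days
print(net_revenue, days_to_delivery)  # 170.0 5
```

Each metric needs a field from a different system, which is why the GL summary, aggregated by period and product line, cannot substitute for any of the three detailed sources.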
Topic: Security, Confidentiality and Privacy
A company allows managers to approve vendor payments through a mobile app on company-issued smartphones. The app keeps users signed in for 30 days, does not require multifactor authentication, and allows payment approval after only the phone’s 4-digit unlock PIN. The phones are used only on trusted networks, no suspicious apps are detected, and no phone has been lost or stolen. This situation is best classified as which mobile cybersecurity risk?
Best answer: C
What this tests: Security, Confidentiality and Privacy
Explanation: The scenario focuses on how access is granted and maintained on the mobile device. Long-lived sessions, no MFA, and reliance on only a simple device PIN indicate weak mobile access controls rather than network, malware, or loss-related risk.
Mobile cybersecurity risks should be classified by the main source of exposure. Here, the weakness is the app’s access design: it allows sensitive approvals after only a basic device PIN, keeps users signed in for an extended period, and does not require MFA. Those facts point to weak mobile access controls because authentication and session management are insufficient for a high-risk function. Insecure network exposure would involve use of untrusted or public networks that could enable interception. Mobile malware infection would involve malicious apps, code, or device compromise. Device loss or theft risk would center on a phone being misplaced or stolen. Although poor access controls can worsen the impact of loss, the direct issue described is still weak mobile access control.
The primary issue is inadequate authentication and session control on the mobile app, which is a weak mobile access control risk.
Topic: Security, Confidentiality and Privacy
A CPA is reviewing this excerpt from a company’s workforce-app notice:
Which characterization is most directly supported by the excerpt?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The excerpt is best characterized as privacy-focused because it describes how personal information is collected, used, retained, and made available for review and correction. Those features are more specific to privacy than to confidentiality, security, or availability.
Privacy addresses an entity’s practices for personal information, including notice, purpose limitation, consent, access, correction, retention, and disposal. The excerpt discusses all of those themes: it identifies the personal data collected, limits use to payroll unless separate consent is obtained, allows employees to review and correct information, and states when biometric data will be deleted. By contrast, confidentiality is about protecting designated information from unauthorized disclosure, and security is about protecting systems and information more broadly through safeguards such as access controls or monitoring. Availability concerns whether systems and data are accessible for operation and recovery. Because the excerpt centers on personal information handling and individual rights, privacy is the best classification.
The excerpt focuses on personal information practices such as collection, purpose limitation, consent, individual access, correction, and deletion, which are core privacy elements.
Topic: Security, Confidentiality and Privacy
GreenFarm Co. uses a third-party cloud platform to monitor refrigerated inventory at 40 stores.
| Item | Fact |
|---|---|
| IoT devices | Each store has freezer sensors connected to a local IoT gateway. The gateway sends data to the cloud platform using a cached API key tied to a store service account. Installation procedures require changing the gateway’s default admin password. |
| Exception noted | Store 18’s gateway was found still using the default admin password. |
| Mobile access | Store managers review alerts through a company mobile app using SSO and MFA. Logs show no unusual manager logins. |
| Incident logs | At 2:14 a.m., the cloud platform recorded successful service-account API calls from Store 18’s public IP to export six months of temperature data. At 2:20 a.m., the gateway initiated an outbound session to an unknown external host. |
| SOC excerpt | The cloud vendor’s SOC 2 report states customers are responsible for field-device configuration and local credential management. |
Based on these facts, which is the best interpretation of the cybersecurity threat?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The strongest interpretation is IoT gateway compromise that was used to access the cloud platform with the gateway’s stored service credentials. The default password remained in place, the export came from the store’s IP using the service account, and the gateway then contacted an unknown host.
This scenario points to a threat that is common in cloud-connected IoT environments: a weakly secured field device becomes the path into the cloud application. The key facts are the unchanged default admin password on the gateway, the successful API activity using the gateway’s service account, and the outbound session to an unfamiliar host shortly after the data export. Those facts support credential misuse through a compromised device, not a failure of mobile authentication or vendor-side encryption. The SOC excerpt also matters: it says the customer, not the cloud vendor, is responsible for field-device configuration and local credential management. In practice, IoT gateways that store API keys or service credentials can become a high-risk bridge between local networks and third-party cloud services if default credentials are not changed.
The default gateway password, successful service-account exports from the store IP, and the gateway’s outbound connection strongly indicate IoT-device compromise leading to unauthorized cloud access.
Topic: Information Systems and Data Management
A distributor plans to replace its legacy sales and inventory application with a new hosted ERP module.
| Fact | Detail |
|---|---|
| Process scope | The application supports about 65% of company revenue and updates shipment status, inventory, accounts receivable, and daily sales journal entries to the general ledger. |
| Go-live timing | A direct cutover is scheduled for December 29, three days before year-end close and the physical inventory count. |
| Testing status | Unit testing passed. End-to-end testing across order entry, shipping, billing, and GL posting was completed for 12 of 20 scenarios; 3 tested scenarios produced duplicate invoices, and 5 scenarios were not tested. |
| Conversion plan | The legacy system will become read-only at go-live. No parallel processing is planned. If the new system fails, orders will be captured manually in spreadsheets until issues are fixed. |
| Access/control status | Automated credit-limit and price-override approvals are configured, but the user-role access review for sales supervisors and billing clerks is scheduled for after go-live. |
| Vendor assurance | The ERP vendor has a SOC 1 Type 2 report covering its hosted infrastructure. |
Based on these facts, which is the best interpretation of the proposed conversion approach?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: This conversion plan combines a high-impact direct cutover with incomplete end-to-end testing, known duplicate invoicing errors, no practical rollback, and delayed access validation right before year-end. In that environment, the proposal creates unacceptable operational disruption, financial reporting risk, and control risk.
A conversion approach should be evaluated in light of the system’s business significance, timing, testing results, fallback capability, and control readiness. Here, the new system drives revenue, inventory, receivables, and GL postings, so defects can affect operations and financial reporting quickly. A direct cutover shortly before year-end heightens the risk because processing failures or duplicate invoices could distort revenue and inventory during close. The plan also lacks a strong fallback, since the legacy system becomes read-only and manual spreadsheets are not an equivalent recovery method. Delaying the user-role access review until after go-live adds control risk because approval workflows may operate with inappropriate access. The vendor’s SOC 1 Type 2 report may support reliance on certain hosted-service controls, but it does not replace customer-side conversion testing, configuration validation, or user access review.
This is correct because the conversion affects revenue-significant processing and key controls, yet unresolved processing errors, incomplete testing, no practical rollback, and delayed access review remain immediately before year-end.
Topic: Information Systems and Data Management
A CPA is reviewing why an April sales-detail report may be incomplete.
SELECT o.order_id, ol.line_id, ol.product_id, ol.line_amount
FROM Orders o
JOIN OrderLines ol
ON o.order_id = ol.order_id
WHERE o.order_date >= '2026-04-01'
AND o.order_date < '2026-05-01';
| Item | Fact |
|---|---|
| Orders | order_id is the primary key |
| OrderLines | line_id is the primary key |
| OrderLines.order_id | Required field, but no foreign key constraint to Orders.order_id |
| Data profiling | 214 OrderLines rows contain an order_id value that does not exist in Orders |
| Other testing | No duplicate line_id values were found |
Which database structure concern is the best interpretation of these facts?
A. A missing index on Orders.order_date is likely causing some April rows to be skipped by the query.
B. Storing line_amount in OrderLines rather than deriving it from another table is the main structural problem affecting this report.
C. Using line_id instead of a composite primary key on OrderLines is the main reason the report is incomplete.
D. The absence of a foreign key constraint between Orders and OrderLines allows orphaned detail rows that the inner join will exclude.
Best answer: D
What this tests: Information Systems and Data Management
Explanation: The key issue is missing referential integrity. Because OrderLines.order_id is not enforced as a foreign key, orphan detail rows can exist, and the report’s inner join will drop those rows from the result set.
In a relational database, referential integrity helps ensure that each child row points to a valid parent row. Here, OrderLines is the child table and Orders is the parent table. Because OrderLines.order_id is not protected by a foreign key, the database accepted 214 detail rows whose order_id does not exist in Orders. The report uses an inner join, so only rows with matches in both tables are returned. As a result, those orphaned detail rows are omitted, which directly affects completeness and can understate reported sales totals. By contrast, an index mainly affects performance, not which matching rows qualify, and the other suggested design concerns are not the best explanation of the facts provided.
A composite primary key is not needed on OrderLines; no duplicate line_id values were found, and the identified issue is unmatched parent-child rows. Storing line_amount in the detail table may raise other design questions, but it does not explain why rows are excluded by this specific join. Without an enforced foreign key, orphan OrderLines rows can exist and an inner join will omit them from the report.
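A minimal sqlite3 sketch (with hypothetical rows, not the exhibit's data) shows how an inner join silently drops orphaned detail rows, and how a LEFT JOIN anti-join check can surface them for completeness testing:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Child table deliberately created WITHOUT a foreign key constraint,
# mirroring the missing referential-integrity control in the exhibit.
cur.execute("CREATE TABLE Orders (order_id INTEGER PRIMARY KEY, order_date TEXT)")
cur.execute("CREATE TABLE OrderLines (line_id INTEGER PRIMARY KEY, "
            "order_id INTEGER NOT NULL, line_amount REAL)")
cur.execute("INSERT INTO Orders VALUES (1, '2026-04-10')")
cur.execute("INSERT INTO OrderLines VALUES (10, 1, 500.0)")    # valid child row
cur.execute("INSERT INTO OrderLines VALUES (11, 999, 250.0)")  # orphan: no parent order

# The report's inner join returns only matched rows, dropping the orphan.
matched = cur.execute("""
    SELECT COUNT(*) FROM Orders o
    JOIN OrderLines ol ON o.order_id = ol.order_id
""").fetchone()[0]

# Completeness check: LEFT JOIN anti-join finds child rows with no parent.
orphans = cur.execute("""
    SELECT COUNT(*) FROM OrderLines ol
    LEFT JOIN Orders o ON o.order_id = ol.order_id
    WHERE o.order_id IS NULL
""").fetchone()[0]

print(matched, orphans)  # 1 1
```

The 250.0 orphan line never reaches the report, which is exactly how unmatched detail rows can understate sales totals without any error message.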
Topic: Security, Confidentiality and Privacy
A company documented the following VPN security items:
VPN security notes
- Security goal: Payroll data should be accessible only by authorized employees.
- Threat noted: External actors may attempt credential stuffing against VPN accounts.
- Control activity: The SIEM creates an alert when 10 failed VPN logins occur from one IP address within 5 minutes, and a security analyst reviews open alerts each morning.
- Management metric: The SOC manager tracks the monthly average time to close security alerts.
Based on the exhibit, which item is a detective control?
Best answer: C
What this tests: Security, Confidentiality and Privacy
Explanation: Reviewing SIEM alerts for repeated failed VPN logins is a detective control because it is meant to identify suspicious activity as it occurs. The other items describe a control objective, an attack technique, and a monitoring metric rather than the detective control itself.
A detective control is intended to discover errors, anomalies, or attacks that have occurred or are underway. In the exhibit, the SIEM alert threshold and the analyst’s review of failed-login alerts are used to detect possible unauthorized access attempts, so that activity is a detective control. By contrast, making payroll data accessible only to authorized employees states the control objective, which is the desired outcome. Credential stuffing is the threat or attack technique the company is concerned about, not a control. Tracking average alert-closure time is a monitoring activity used by management to oversee security operations, but it does not itself detect the suspicious login behavior.
Reviewing SIEM alerts is a detective control because it is designed to identify potentially unauthorized login activity after it occurs.
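The alert rule in the exhibit (10 failed VPN logins from one IP within 5 minutes) can be sketched as a sliding-window check. The IPs and timestamps below are illustrative, and real SIEM rule engines are far more elaborate than this:

```python
from collections import defaultdict, deque

THRESHOLD = 10        # failed logins that trip the alert
WINDOW_SECONDS = 300  # 5-minute window from the exhibit

def detect_failed_login_bursts(events):
    """events: time-ordered iterable of (timestamp_seconds, ip) failed logins.
    Returns the set of IPs whose failures hit the threshold within the window."""
    windows = defaultdict(deque)
    alerted = set()
    for ts, ip in events:
        win = windows[ip]
        win.append(ts)
        # Slide the window: discard failures older than 5 minutes.
        while win and ts - win[0] > WINDOW_SECONDS:
            win.popleft()
        if len(win) >= THRESHOLD:
            alerted.add(ip)
    return alerted

# 10 rapid failures from one IP trip the alert; the same count spread
# over 9 minutes from another IP does not.
burst = [(i, "203.0.113.7") for i in range(10)]
slow = [(i * 60, "198.51.100.9") for i in range(10)]
print(detect_failed_login_bursts(sorted(burst + slow)))  # {'203.0.113.7'}
```

The detection logic only identifies the suspicious activity; the analyst's morning review of the open alerts is what completes the detective control described in the exhibit.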
Topic: Information Systems and Data Management
A CPA is reconciling a documented cash-receipts flowchart to a walkthrough of the actual process.
Documented flowchart:
Walkthrough results:
Which change is the best correction to the documented flowchart?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The documented flowchart wrongly assumes all receipts post automatically and omits the exception path. The best correction is to show the matched-versus-unmatched decision, the handoff to the cash application specialist, and reconciliation only after posting and exception handling are complete.
When reconciling a process narrative or walkthrough to documented process flows, the key is to identify where the documentation leaves out a branch, handoff, or sequencing detail that affects how the process actually works. Here, the flowchart incorrectly shows one straight-through posting path for all receipts. The walkthrough shows a conditional process: matched receipts post automatically, while unmatched receipts go to a cash application specialist for research before they are posted or otherwise resolved. That exception routing is a material handoff and must appear in the flow. Treasury reconciliation also belongs after the system reflects both auto-posted items and resolved exceptions, because reconciling earlier would not reflect the true processed population.
This revision fixes the missing decision point, documents the exception handoff, and aligns the reconciliation step with the actual sequence.
Topic: Information Systems and Data Management
During an ISC engagement, a CPA is assessing whether an emergency production hotfix to a billing application was managed through formal change control. Management claims the hotfix was requested, approved, tied to a specific code version, successfully tested before release, deployed through the build process, and had a documented rollback plan. Which item would best support that conclusion?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The best support is the change ticket that links all key change-management elements for the same hotfix. It provides end-to-end evidence of request, approval, version tracking, testing, deployment, and rollback readiness, while the other records support only one part of the conclusion.
For change management, the most persuasive evidence is documentation that traces a specific change through the full lifecycle. A well-maintained change ticket or change record should connect the request and approval to the exact code version, testing evidence, build or deployment details, and rollback procedure. That makes it possible to confirm the change was formally tracked and could be reversed if needed. By contrast, a version-control log mainly shows code history, a test library report shows testing activity, and a baseline configuration report shows the current approved state of production. Each of those may be useful, but none alone demonstrates the full control path for one production change.
This is the strongest evidence because one record ties the specific change to authorization, version control, testing, deployment, and rollback documentation.
Topic: Information Systems and Data Management
A CPA is helping a midsize distributor begin its business impact analysis for business continuity planning. Management has already listed its major applications. Same-day shipping drives most revenue, the warehouse management system depends on ERP order data and a third-party network connection, and management has not yet defined acceptable downtime for each process. Which action should the CPA recommend next?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The next BIA step is to work with process owners to determine which business processes are most critical, what they depend on, and how quickly they must be restored. Those facts establish recovery priorities and availability requirements for later continuity and disaster recovery planning.
A business impact analysis starts by identifying the business processes that matter most, then assessing the operational and financial effects of disruption. From there, the organization identifies dependencies such as applications, data, personnel, facilities, and third-party services that support each process. Using that information, management can set recovery priorities and define availability needs, such as acceptable downtime and recovery targets. In this scenario, management already has a list of applications, but it has not yet defined acceptable downtime, and the warehouse process depends on both internal ERP data and an external network provider. That means the next step is not testing recovery or setting generic targets; it is completing the BIA by linking critical processes to dependencies and required recovery timing.
A business impact analysis next identifies process criticality, dependencies, and recovery and availability requirements before detailed recovery testing or solution selection.
Topic: Information Systems and Data Management
A CPA is reviewing a change ticket for a cash receipts application update.
Facts:
Which action is most appropriate?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The best response is to complete integration and user acceptance testing in staging before deployment. Development is for coding and unit testing, staging is for production-like predeployment testing, and production should not be used as the primary environment for acceptance testing.
Development, staging, and production serve different purposes in a controlled change process. Development is where programmers build code and perform unit testing on individual components. Staging is a separate environment that closely mirrors production and is used for broader testing, such as integration, system, and user acceptance testing, often with masked or representative data. Production is the live environment for actual business processing, so using it for planned acceptance testing creates unnecessary risk to live transactions and system stability. Because the facts say coding and unit testing are already complete and staging mirrors production, the proper next step is to perform end-to-end and user acceptance testing in staging before release.
Staging is the production-like environment used for broader predeployment testing, while production should be reserved for live operations rather than acceptance testing.
Topic: Security, Confidentiality and Privacy
Maple Co. uses a third-party cloud HR portal that stores employee PII. During a vendor-risk review, HR management says the service provider “handles access security,” so Maple does not perform periodic user access reviews for the portal and does not have a process to notify the provider promptly when HR staff transfer or terminate.
The reviewer concludes that Maple has a gap in user-entity control responsibility and ongoing monitoring over access to the vendor-hosted system.
Which source would best support that conclusion?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The best support is the SOC 2 report excerpt that identifies complementary user entity controls. It directly shows that Maple, not just the provider, is responsible for periodic access review and timely termination notifications, which supports the conclusion about a monitoring gap.
When evaluating a third-party system, the key issue is not simply whether the vendor has strong controls, but which controls remain the customer’s responsibility. A SOC 2 report excerpt that lists complementary user entity controls is the strongest evidence because it explicitly assigns duties between the service provider and the user entity. If the report says Maple must review user access and notify the provider of personnel changes, then Maple cannot rely solely on the vendor for access control monitoring. That makes the gap a user-entity responsibility issue. By contrast, due diligence materials help assess vendor risk at onboarding, a user listing shows who currently has access, and a log extract shows a specific security event. Those sources may be useful, but they do not directly establish Maple’s ongoing control obligation.
A SOC 2 excerpt identifying complementary user entity controls directly assigns the access-review and termination-notification duties to Maple.
Topic: Security, Confidentiality and Privacy
A health benefits administrator stores members’ Social Security numbers. Analytics users only need a surrogate value, but a small customer service group must be able to retrieve the full SSN through a separate controlled service when identity verification is required. Management is deciding between tokenization and other protection techniques for the production database.
Which statement best captures the decisive distinction relevant to this choice?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The best distinction is that tokenization replaces sensitive data with a surrogate token and typically keeps the original-to-token mapping in a separate protected vault or service. Encryption, by contrast, converts the original data into ciphertext that is recovered with cryptographic keys.
This scenario calls for reducing exposure of privacy-regulated data in the production database while still allowing tightly controlled retrieval of the original value for a limited business purpose. Tokenization fits that objective because the application database can store a surrogate token instead of the actual SSN, and only an authorized detokenization service can return the original value. Encryption is also a strong control for protecting data, but its defining feature is cryptographic transformation of the actual data into ciphertext using keys, not substitution with a separate mapped token. Hashing is generally used when recovery of the original value is not needed, and masking mainly limits what users see rather than replacing the underlying stored sensitive value for controlled recovery.
Tokenization is distinguished by replacing the SSN with a surrogate token and keeping the recoverable mapping in a separate protected service or vault.
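A minimal sketch of the tokenization pattern described above: the application stores only a surrogate token, and the token-to-SSN mapping lives in a separate vault that only the controlled role may query. The class, role names, and SSN are hypothetical illustrations, not a production design:

```python
import secrets

class TokenVault:
    """Toy vault: holds the surrogate-to-original mapping apart from
    application data, and gates detokenization by caller role."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, ssn: str) -> str:
        if ssn in self._value_to_token:        # same SSN -> same token,
            return self._value_to_token[ssn]   # so analytics joins still work
        token = "tok_" + secrets.token_hex(8)  # random: no cryptographic link to the SSN
        self._token_to_value[token] = ssn
        self._value_to_token[ssn] = token
        return token

    def detokenize(self, token: str, caller_role: str) -> str:
        # Only the controlled customer-service function may recover the SSN.
        if caller_role != "customer_service":
            raise PermissionError("caller not authorized to detokenize")
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")
assert token != "123-45-6789"  # production tables hold only the surrogate
assert vault.detokenize(token, "customer_service") == "123-45-6789"
```

Contrast this with encryption, where the stored value is ciphertext derived from the SSN itself and anyone holding the key can recover it; here the token is random and recovery depends entirely on access to the separate vault service.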
Topic: Information Systems and Data Management
A CPA is reviewing the following BPMN-style vendor onboarding process to identify improvements:
| Step | Lane | Activity |
|---|---|---|
| 1 | Requestor | Submits new vendor request |
| 2 | Procurement | Approves request |
| 3 | AP | Enters vendor name, tax ID, and bank account into the sourcing system |
| 4 | Purchasing | Re-enters vendor name, tax ID, and bank account into the ERP vendor master |
| 5 | Sourcing system | Assigns vendor code S-### |
| 6 | ERP | Assigns vendor code V-### |
| 7 | AP | Matches invoices in ERP using vendor name |
| 8 | Treasury | Uses bank account data from the sourcing system for ACH payments |
During the last quarter, the company created duplicate vendor records and sent two ACH payments to outdated bank accounts.
Which change is the best correction to this process model?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The best correction is to redesign the process around a single authoritative vendor master record that feeds all downstream systems. That change addresses the root cause in the model: duplicate manual entry, different vendor IDs, and inconsistent payment data across systems.
When a business process model shows the same master data being entered into multiple systems, assigned different identifiers, and later matched by a weak field such as vendor name, the process is prone to duplicate records and data integrity failures. The strongest improvement is to create vendor data once, validate critical fields such as tax ID and bank account at setup, and distribute that approved record through an interface to ERP and payment functions. This supports a single source of truth, reduces rekeying errors, and keeps invoice matching and ACH payment data aligned. Detective steps or broader payment restrictions may help somewhat, but they do not correct the flawed process design shown in the model.
This correction removes duplicate entry points and inconsistent identifiers by establishing one authoritative vendor record for downstream systems.
Topic: Information Systems and Data Management
A company documents its vendor bank-account change process in the cash disbursement system as follows:
| Documented process | Walkthrough of actual process |
|---|---|
| Procurement manager approves the vendor bank-account change request. | AP master-data clerk updates the vendor bank account in the ERP when the email request is received. |
| AP master-data clerk updates the vendor bank account in the ERP. | Procurement manager reviews and approves the request later that day. |
| System logs the change and sends a daily change report to the controller. | System logs the change and sends a daily change report to the controller. |
Which action is the best correction for this discrepancy between the documented and actual process flow?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The documented flow requires approval before the vendor bank-account change is made, but the walkthrough showed the ERP update occurring first. The best correction is to restore the preventive approval step before the change posts, not to rely on stronger detective review or rewrite the documentation to match a weaker practice.
Reconciling actual process steps to documented process flows means checking whether key activities occur in the same sequence as designed. Here, approval is intended to authorize the bank-account change before it affects the vendor master file. In the walkthrough, the AP clerk makes the change first and the manager approves later, so the actual process no longer matches the documented control flow. That change in sequence matters because a preventive control has effectively become a detective one. A daily change report may help identify problems, but it does not prevent an unauthorized bank-account update from being posted. The strongest correction is to enforce approval before the ERP update, ideally through workflow or system restrictions, and only revise documentation if management formally approves a redesigned process.
The actual process moved approval after the update, so the best remediation is to restore approval as a preventive step before the bank-account change is posted.
Topic: Information Systems and Data Management
A CPA is evaluating processing integrity controls at a payroll service organization for a SOC 2 engagement.
Current findings:
What should the CPA do next?
Best answer: A
What this tests: Information Systems and Data Management
Explanation: The next step is to determine whether the control, as designed, can actually detect the stated risk. A file-count reconciliation may miss a truncated file, which points to a design deficiency; only after design is adequate should the missing daily review be evaluated as a possible operating deviation.
A design deficiency exists when a control, even if performed exactly as intended, is not capable of preventing or detecting the relevant error. Here, the control objective is to detect missing or incomplete payroll input files, but the walkthrough indicates that a truncated file would still be counted as one file received and one file loaded. That means the control may be incapable of detecting an incomplete file, which is a design issue. An operating deviation is different: it occurs when a properly designed control is not performed or does not operate as intended in a specific instance, such as the one day with no review evidence. Because the facts raise a possible design problem, the CPA should evaluate design first before concluding the unsigned day is merely an isolated execution failure.
A control that cannot detect truncated files has a design deficiency even if it is usually reviewed, so design capability must be evaluated before labeling the missing review as an operating deviation.
Topic: Information Systems and Data Management
A service organization transfers approved subscription invoices nightly from its Contract Management System (CMS) to its Accounts Receivable (AR) system. During SOC 2 testing of the June 30 batch, the following was noted:
| Item | Result |
|---|---|
| Approved invoices in CMS | 1,240 |
| Records received in AR | 1,240 |
| Batch completed before cutoff time | Yes |
| Invoices with valid customer IDs and approval codes | 1,240 |
| Invoices posted with incorrect amounts because of a field-mapping change | 37 |
Which processing integrity issue is best indicated by these results?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: This scenario points to an accuracy issue in system processing. The batch was complete and timely, and the invoices were valid and approved, but the field-mapping change caused 37 posted amounts to be wrong.
Processing integrity considers whether system processing is complete, accurate, timely, authorized, and valid. Here, completeness is supported because all 1,240 approved invoices in the source system were received in the AR system. Timeliness is supported because the batch finished before the cutoff. Authorization and validity are also supported because the invoices had valid customer IDs and approval codes, so the transactions themselves were legitimate and approved for processing. The problem is that a field-mapping change caused some invoice amounts to be posted incorrectly. When the system processes legitimate transactions but produces wrong values, the primary issue is accuracy. In a SOC 2 context, this could stem from a change management or interface control problem, but the affected processing integrity attribute is accuracy.
All approved invoices were received on time and had valid approvals, but 37 were posted with misstated amounts, which is an accuracy failure.
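The attribute distinction in this explanation can be illustrated with a toy reconciliation: completeness compares record populations, while accuracy compares values. The invoice IDs and amounts below are made up, with one record mis-mapped the way the exhibit describes:

```python
# Hypothetical source (CMS) and destination (AR) amounts for the same batch.
cms = {"INV-1": 100.00, "INV-2": 250.00, "INV-3": 75.00}  # approved source amounts
ar  = {"INV-1": 100.00, "INV-2": 250.00, "INV-3": 7.50}   # INV-3 mis-mapped on load

# Completeness: every approved source record arrived in AR (populations match).
complete = set(cms) == set(ar)

# Accuracy: the right records arrived, but do the posted values match?
inaccurate = [inv for inv in cms if cms[inv] != ar[inv]]

print(complete)    # True  -> completeness holds
print(inaccurate)  # ['INV-3'] -> accuracy failure
```

A record-count control alone would pass this batch, which is why interface testing after a field-mapping change should compare amounts, not just counts.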
Topic: Considerations for System and Organization Controls Engagements
A cloud-based HR platform provides payroll processing to client companies. Management wants an assurance report it can post publicly for prospective customers and other outsiders. The report should address controls over the platform’s security, availability, and confidentiality, but it does not need to include the service auditor’s detailed tests and results. Which report type best fits this request?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The best choice is SOC 3 because the company wants a public-facing report about controls over security, availability, and confidentiality at its service organization system. A SOC 2 report covers similar subject matter, but it is more detailed and intended for specified users rather than general public distribution.
This scenario points to a service organization that wants a general-use report about controls over its system using Trust Services Criteria categories. That is the purpose of a SOC 3 report. SOC 3 reports are designed for broad distribution and do not include the detailed description of the service auditor’s tests and results that appear in a SOC 2 report. A SOC 1 report is different because it addresses controls at a service organization that are relevant to user entities’ internal control over financial reporting. A SOC for Cybersecurity report is also different because it reports on an entity’s overall cybersecurity risk management program, not specifically on a service organization’s system used to provide services to customers.
A SOC 1 report is tempting because it involves a service organization, but SOC 1 is for controls relevant to user entities' financial reporting, which is not the stated purpose here. A SOC 2 report covers security, availability, and confidentiality, but it is a restricted-use report with detailed test procedures and results, unlike the public report requested. A SOC for Cybersecurity report is public-facing, but it addresses an entity-wide cybersecurity risk management program rather than a service organization's system for customer services. SOC 3 is a general-use report on a service organization's controls relevant to the Trust Services Criteria and omits the detailed test descriptions and results included in SOC 2.
Topic: Security, Confidentiality and Privacy
During a SOC 2 walkthrough, management states these service commitments and system requirements for an analytics replica of its production database:
Walkthrough observations:
Which response is the best correction to address the design deficiency?
Best answer: B
What this tests: Security, Confidentiality and Privacy
Explanation: The problem is a design mismatch between stated SOC 2 commitments and the actual control structure. The best correction is to redesign access, data minimization, and retention/disposal controls so the replica supports confidentiality and privacy commitments if the controls operate as intended.
In a SOC 2 engagement, suitability of design asks whether the controls, as designed, would be capable of meeting the entity’s service commitments and system requirements. Here, the design is deficient because broad shared access conflicts with the confidentiality commitment to limit personal information to those with a business need. The design is also deficient because the replica contains sensitive fields that analytics users do not need, which violates data minimization. Finally, the privacy commitment to delete terminated-customer data within 60 days is unsupported because there is no purge process and backup retention has not been aligned to that commitment. The best remediation is the one that fixes all three design gaps: need-to-know access, minimization of sensitive data, and documented retention/disposal procedures.
This option directly aligns the analytics environment with the stated need-to-know, data-minimization, and 60-day deletion requirements.
Topic: Considerations for System and Organization Controls Engagements
CloudPay, a payroll SaaS provider, asks a CPA to report on controls using the Trust Services Criteria.
| Exhibit | Details |
|---|---|
| Information handled | Employee names, addresses, Social Security numbers, bank account numbers, and benefit elections |
| Public commitments | Provide notice about what personal information is collected, use it only for payroll and benefits administration, allow correction requests, and delete records after the retention period |
| Controls described | Notice acknowledgment logs, workflow for correction requests, retention schedule, and secure deletion procedures |
Which Trust Services Criteria subject matter is most directly supported by this exhibit?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: Privacy is the best answer because the exhibit focuses on personal information and the entity’s commitments about notice, permitted use, correction, retention, and deletion. Those are hallmark privacy subject matters under the Trust Services Criteria.
Under the Trust Services Criteria, a practitioner may report on one or more subject matters, including security, availability, processing integrity, confidentiality, and privacy. Privacy is specifically concerned with personal information and whether it is collected, used, retained, disclosed, and disposed of in line with the entity’s commitments and system requirements. In this exhibit, the key facts are the use of employee personal information plus commitments about notice, limited use, correction requests, retention periods, and secure deletion. Those are classic privacy-oriented elements. Confidentiality can also involve sensitive information, but it focuses more broadly on protecting information designated as confidential, not on the full life-cycle obligations for personal information. Availability and security are different subject matters with different emphasis.
The exhibit centers on personal information and controls over notice, use, correction, retention, and disposal, which are privacy matters.
Topic: Security, Confidentiality and Privacy
During a walkthrough of an entity’s cyber risk program, a CPA notes the following:
How should this set of activities be characterized?
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The activities focus on communicating security expectations, improving user knowledge, and reinforcing safe actions such as recognizing phishing and protecting confidential data. That makes them security awareness training, which is an administrative preventive control rather than a detective, corrective, or access provisioning control.
Security awareness training is an administrative control used to communicate security information to personnel so they understand risks and model appropriate behaviors. Typical examples include onboarding training, periodic reminders, phishing simulations, and follow-up coaching when employees make mistakes. These activities are aimed at reducing the likelihood of user-driven security failures, especially social engineering and poor data-handling practices. They do not detect intrusions through system monitoring, restore operations after an incident, or assign user access rights. In this scenario, the entity is using recurring communication and education to improve security knowledge and encourage secure conduct, which is the defining purpose of awareness training.
These activities are designed to educate users before incidents occur and shape secure behavior, which is the purpose of security awareness training.
Topic: Security, Confidentiality and Privacy
An auditor is evaluating whether a company followed its incident response standards for a confirmed critical cybersecurity incident. Based on the exhibit, which conclusion is best supported?
| Incident response standard | Evidence from incident 24-017 |
|---|---|
| Confirm or dismiss a high-severity alert within 20 minutes of receipt. | SIEM alert received 6/3 at 08:10; analyst confirmed account compromise at 08:25. |
| Notify the incident commander within 30 minutes after analyst confirmation. | Incident commander paged at 09:40 on 6/3. |
| Contain the affected account or host within 2 hours after analyst confirmation. | Privileged account disabled and active tokens revoked at 09:55 on 6/3. |
| Send any required customer notice within 24 hours after legal approval. | Legal approved the required customer notice at 17:00 on 6/3; notices sent at 09:30 on 6/4. |
| Complete root-cause remediation within 7 calendar days after containment. | IAM patch deployed and all privileged credentials rotated at 13:00 on 6/9. |
| Complete a post-incident review within 10 calendar days after incident closure. | Incident closed at 16:00 on 6/10; post-incident review held at 10:00 on 6/21. |
Best answer: D
What this tests: Security, Confidentiality and Privacy
Explanation: The timeline satisfies the standards for alert confirmation, containment, customer notice, and remediation. It does not satisfy the 30-minute escalation requirement or the 10-day post-incident review requirement, so the correct conclusion is the one identifying those two late actions.
To evaluate incident response evidence, compare each documented action to the stated response standard. Here, the alert was confirmed 15 minutes after receipt, so identification was timely. The affected account was disabled 90 minutes after confirmation, so containment was timely. Required customer notice was sent 16.5 hours after legal approval, and remediation was completed less than 7 calendar days after containment, so those steps were timely. However, escalation was late because the incident commander was notified 75 minutes after analyst confirmation, exceeding the 30-minute limit. The post-incident review was also late because it occurred after the 10-calendar-day deadline following closure. The best-supported conclusion is therefore the one that separates the timely actions from the two missed deadlines.
The evidence meets the deadlines for alert confirmation, containment, customer notice, and remediation, but escalation took 75 minutes and the post-incident review occurred more than 10 calendar days after closure.
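The deadline comparisons above are simple elapsed-time arithmetic, which can be checked mechanically. The sketch below uses the exhibit's timestamps (the year is arbitrary, since the exhibit gives only month and day) and flags any action that exceeded its stated limit.

```python
from datetime import datetime, timedelta

# Timestamps from the exhibit for incident 24-017 (year is illustrative).
received   = datetime(2025, 6, 3, 8, 10)   # SIEM alert received
confirmed  = datetime(2025, 6, 3, 8, 25)   # analyst confirmation
escalated  = datetime(2025, 6, 3, 9, 40)   # incident commander paged
contained  = datetime(2025, 6, 3, 9, 55)   # account disabled, tokens revoked
legal_ok   = datetime(2025, 6, 3, 17, 0)   # legal approved customer notice
noticed    = datetime(2025, 6, 4, 9, 30)   # notices sent
remediated = datetime(2025, 6, 9, 13, 0)   # IAM patch, credentials rotated
closed     = datetime(2025, 6, 10, 16, 0)  # incident closed
reviewed   = datetime(2025, 6, 21, 10, 0)  # post-incident review held

# Each standard from the exhibit, expressed as "elapsed time <= limit".
checks = {
    "confirmation": confirmed - received  <= timedelta(minutes=20),
    "escalation":   escalated - confirmed <= timedelta(minutes=30),
    "containment":  contained - confirmed <= timedelta(hours=2),
    "notice":       noticed - legal_ok    <= timedelta(hours=24),
    "remediation":  remediated - contained <= timedelta(days=7),
    "review":       reviewed - closed     <= timedelta(days=10),
}
late = [name for name, met in checks.items() if not met]
print(late)  # ['escalation', 'review'] -> the two missed deadlines
```

Escalation took 75 minutes against a 30-minute limit, and the review occurred 10 days and 18 hours after closure against a 10-calendar-day limit; every other step met its deadline.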
Topic: Information Systems and Data Management
A company uses the following technology environment:
Which cloud deployment model best describes the company’s overall environment?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: Hybrid cloud deployment is correct because the company uses both a private-cloud environment reserved for its exclusive use and a public-cloud SaaS service shared with other customers. The connected authentication and data transfers show the two environments operate together.
Cloud deployment models are distinguished mainly by who has access to the infrastructure and how control is allocated. A private cloud is dedicated to a single organization, even if a third party hosts it, so exclusive-use servers and company-defined security settings point to private cloud. A public cloud serves multiple customers on shared infrastructure, so the shared SaaS expense platform is public cloud. When an organization uses both private and public cloud resources as part of one connected environment, the overall deployment model is hybrid cloud. The integration facts matter here because they show the private and public portions are part of the same operating model rather than unrelated services.
The environment combines a private cloud reserved for one company with a shared public cloud service that is connected for authentication and data exchange.
Topic: Information Systems and Data Management
An accounting department uses the following components:
How should the routers and switches be characterized in this environment?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The routers and switches are network infrastructure because their purpose is to connect devices and manage network traffic across the accounting environment. They are not the software layer, the data-processing host, or the user-facing device.
In an accounting environment, different IT architecture components serve different roles. End-user devices are the laptops or workstations employees use to access applications and enter data. Servers host applications or data and provide shared processing or storage. Operating systems manage hardware and software resources so applications can run. Network infrastructure includes components such as routers and switches that transmit, segment, and direct traffic between devices, servers, and external resources. Because the scenario describes routers and switches connecting the office LAN to the ERP system and other resources, the correct characterization is network infrastructure.
Routers and switches are network infrastructure because they provide connectivity and direct data traffic among systems, users, and applications.
Topic: Considerations for System and Organization Controls Engagements
A service organization requests a SOC 2 Type 1 report with these engagement facts:
Draft management assertion excerpt:
“Management asserts that, throughout the period January 1 through December 31, 20X5, the description fairly presents the system used to collect, use, retain, disclose, and dispose of personal information, and the related controls were suitably designed and operated effectively to provide reasonable assurance that personal information was handled in conformity with the entity’s privacy notice.”
Which interpretation is best?
Best answer: A
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The draft assertion mixes the wrong subject matter and the wrong report type. A SOC 2 Type 1 on security and confidentiality should be as of a specified date and address fair presentation and suitability of design, not operating effectiveness throughout a period or compliance with a privacy notice for personal information.
In SOC 2, management’s assertion depends on both the trust services categories included and whether the report is Type 1 or Type 2. Security and confidentiality focus on protecting the system and information designated as confidential based on service commitments and system requirements. Privacy is a separate category and is tied to personal information being collected, used, retained, disclosed, and disposed of in line with privacy commitments or a privacy notice. Type 1 is point-in-time reporting as of a specified date and addresses the fairness of the description and the suitability of control design. Type 2 adds whether controls operated effectively throughout a period. Here, the excerpt uses privacy language and asserts operating effectiveness throughout the year, so it does not match a SOC 2 Type 1 engagement limited to security and confidentiality.
The excerpt uses privacy-specific wording and Type 2 operating-effectiveness language, which does not fit a SOC 2 Type 1 assertion for security and confidentiality as of a date.
Topic: Information Systems and Data Management
A manufacturer uses the following process:
How should the integration platform and its primary monitoring implication be characterized?
Best answer: B
What this tests: Information Systems and Data Management
Explanation: The integration platform is a key supporting architecture component in the shipment-to-invoice flow. Since invoicing depends on successful interface processing, the most important monitoring is for failed, delayed, or queued messages that could lead to incomplete billing.
When an integration or middleware component sits between an operational system and the ERP, it is a critical architecture dependency for downstream processing. In this scenario, shipments are recorded in the warehouse system, but invoices are created only after the cloud integration platform transforms and sends those records to the ERP. That means an outage or processing failure in the platform can allow operations to continue while financial processing becomes incomplete or delayed. The primary control implication is therefore interface monitoring: reviewing failed transmissions, queue backlogs, and reconciliations between source shipments and generated invoices. This addresses transaction completeness. The scenario does not describe a master-data repository, an input edit check, or a backup environment.
Because ERP invoices are generated only after the platform successfully passes shipment records, the main monitoring need is for failed or queued interface messages that could cause incomplete invoicing.
Topic: Information Systems and Data Management
A distributor’s ERP generates a customer invoice only after it receives an electronic shipment confirmation from the warehouse system. During a two-day network outage, warehouse staff shipped goods using manual bills of lading and entered shipment confirmations after service was restored. Management wants to address the main AIS risk created by the outage. Which procedure is most appropriate?
Best answer: C
What this tests: Information Systems and Data Management
Explanation: The key AIS dependency is that shipment confirmation triggers billing and A/R posting. When shipments occur manually during an outage, the main risk is that some shipped orders will not be billed or will be recorded inaccurately, so reconciling shipping documents to invoices and A/R is the best response.
In an integrated accounting information system, downstream sales processing often depends on a system event from another module. Here, the warehouse system’s shipment confirmation is the event that causes the ERP to generate the customer invoice and update accounts receivable. Because shipments were made manually during the outage, the main risk is incomplete or inaccurate capture of those shipments once the system is restored. Manual bills of lading are the source record of what actually shipped, so reconciling them to invoices and A/R postings is the most direct way to identify omitted or duplicated transactions. Procedures over lockbox deposits, purchasing documents, or payroll records address different business processes and would not resolve the specific sales-cycle risk caused by the failed shipment interface.
Because shipment confirmation triggers billing in the AIS, matching manual shipping documents to later invoices and A/R postings directly tests whether all shipped orders were recorded.
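The reconciliation described above can be sketched as a simple two-way match. The document identifiers below are hypothetical, not from the scenario: manual bills of lading from the outage window are compared to invoices generated after restoration to surface both unbilled shipments and duplicate postings.

```python
# Illustrative sketch (hypothetical identifiers): match manual bills of
# lading to the invoices generated once the shipment interface recovered.
bols_shipped  = ["BOL-101", "BOL-102", "BOL-103", "BOL-104"]
invoiced_bols = ["BOL-101", "BOL-103", "BOL-103"]  # BOL-103 keyed in twice

# Completeness exceptions: goods shipped during the outage but never billed.
unbilled = [b for b in bols_shipped if b not in invoiced_bols]
# Occurrence exceptions: shipments billed more than once after re-entry.
duplicates = [b for b in set(invoiced_bols) if invoiced_bols.count(b) > 1]

print("Shipped but not invoiced:", unbilled)   # omitted billings
print("Invoiced more than once:", duplicates)  # duplicate postings
```

Both exception lists would then be traced to A/R postings, since an invoice that was generated but posted incorrectly is a separate accuracy problem.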
Topic: Security, Confidentiality and Privacy
An entity is testing a security control over unusual privileged-access activity.
Control description:
Walkthrough and test results for 20 business days:
Which conclusion is best supported by these results?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The facts support a design-versus-operation conclusion. The control structure appears appropriate because anomalies are identified daily and assigned for follow-up, but the required next-business-day response failed on 2 of 5 anomaly days, so the control did not operate effectively throughout the period tested.
Security control testing often distinguishes whether a control is designed appropriately from whether it operated effectively during the period tested. Here, the walkthrough shows a reasonable design: the SIEM produces a daily anomaly report, the analyst reviews it, and the process requires prompt ticket creation for investigation. The test results show the automated report and analyst review occurred consistently, which supports that the control exists and is being performed. However, timely response is part of the control, not an optional extra step. Because incident tickets were opened late on 2 of the 5 days with anomalies, the control did not operate as prescribed throughout the tested period. The best conclusion is therefore an operating effectiveness problem, not a design failure.
The walkthrough supports the design, but the late ticket creation on 2 of 5 anomaly days shows the control did not operate as prescribed throughout the period.
Topic: Security, Confidentiality and Privacy
A company gives remote contractors company-managed laptops and MFA-protected VPN access. One contractor was hired only to update vendor records in the accounts payable application. After connecting, the contractor can still scan most internal subnets and open shared folders containing payroll, tax, and legal files. Security monitoring shows no malware and no failed logins.
Which remediation best addresses this weakness?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The problem is excessive remote access, not malware or failed authentication. The best correction is to stop granting broad network trust through the VPN and instead allow only the specific application, functions, and data the contractor needs to perform the assigned work.
Least privilege means giving a user only the minimum permissions needed to perform assigned duties. Need-to-know narrows that further by limiting access to only the data required for the task. Zero trust avoids assuming that a user should be broadly trusted just because the user is on the VPN; access should be granted per resource and per role. In this scenario, the contractor needs only vendor-record updates in the accounts payable application, so access to internal subnets and payroll, tax, and legal folders is excessive. The best remediation is to replace broad VPN access with application-specific remote access and to restrict both functions and files to the contractor’s assignment. Application whitelisting is useful for controlling what software can run, but it does not solve overbroad data and network permissions.
This removes implicit trust from network connectivity and applies least privilege and need-to-know to both system access and data access.
Topic: Considerations for System and Organization Controls Engagements
Nimbus Hosting engaged a CPA firm for a SOC 2 Type 2 report.
Which interpretation is most appropriate?
Best answer: D
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The MFA failure existed during the covered period and before the report date, but the service auditor learned of it only afterward. That makes it a subsequently discovered fact. When such a fact likely would have affected the report, the service auditor must reconsider the issued SOC report and evaluate whether report users need to be informed.
In a SOC 1 or SOC 2 engagement, a subsequently discovered fact is information that existed at the report date but was not known to the service auditor at that time. If the fact, had it been known, likely would have affected the report, the matter is not ignored just because it was discovered later. The service auditor should discuss the matter with management, evaluate whether the report should be revised, and consider what communication to report users is needed. If management does not take appropriate action, the service auditor may need to take steps to prevent further reliance on the report. This differs from a subsequent event, which involves something occurring after the period or after the report date rather than a preexisting undiscovered condition.
The control failure existed before the report date and could have affected the report, which makes it a subsequently discovered fact requiring reconsideration of the issued report.
Topic: Information Systems and Data Management
A CPA is reviewing the design of a company’s billing-to-analytics data flow.
| Item | Fact |
|---|---|
| Reporting purpose | State-level churn and monthly renewal trend dashboards |
| Source system | Billing application |
| Columns copied nightly to analytics warehouse | customer_id, state, plan_tier, renewal_date, email, date_of_birth, full_bank_account_number |
| Warehouse protections | SSO with MFA; encrypted at rest |
| Access | 18 marketing analysts have read access to the extracted table |
| Retention in warehouse | Indefinite; no purge job configured |
| Source-system business need | Full bank account number is needed only until payment authorization is confirmed |
Based on the exhibit, which conclusion is best supported?
Best answer: D
What this tests: Information Systems and Data Management
Explanation: The stated dashboards need trend and state-level reporting, but the extract also stores email, date of birth, and full bank account numbers. Because those sensitive fields are not needed for the reporting purpose and the warehouse retains them indefinitely, the best-supported conclusion is a data minimization and lifecycle control need.
When data is extracted into an analytics store, schema design and lifecycle controls should limit copied data to the fields required for the stated business purpose and remove it when no longer needed. Here, the reporting purpose is state-level churn and monthly renewal trends, yet the nightly extract also includes email, date of birth, and full bank account number. The exhibit further states that the warehouse keeps the data indefinitely and that full bank account numbers are only needed in the source system until payment authorization is confirmed. Those facts indicate excessive extraction and excessive retention of sensitive data. SSO with MFA and encryption at rest are already present, so the strongest conclusion is not an authentication or storage-encryption gap, but a need to reduce the schema to necessary fields and apply retention or purge controls.
The exhibit shows unnecessary sensitive fields in the analytics schema and no retention limit, creating a clear data minimization and lifecycle control gap.
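The two remediations the conclusion calls for, a minimized extract schema and a retention limit, can be sketched as follows. The field names mirror the exhibit, but the sample row and the 730-day retention window are illustrative assumptions, not facts from the question.

```python
from datetime import date, timedelta

# Only the fields the churn/renewal dashboards actually need.
REPORTING_FIELDS = {"customer_id", "state", "plan_tier", "renewal_date"}
RETENTION_DAYS = 730  # illustrative policy value, not from the exhibit

def minimize(row):
    """Drop fields outside the reporting purpose (email, date_of_birth,
    full_bank_account_number never leave the source system)."""
    return {k: v for k, v in row.items() if k in REPORTING_FIELDS}

def purge_due(extract_date, today):
    """True once a row has exceeded the retention window and must be deleted
    by the nightly purge job the exhibit shows is currently missing."""
    return today - extract_date > timedelta(days=RETENTION_DAYS)

# Hypothetical source row with the full (over-broad) schema.
row = {"customer_id": 1, "state": "OH", "plan_tier": "gold",
       "renewal_date": "20X6-01-31", "email": "a@example.com",
       "date_of_birth": "1990-01-01", "full_bank_account_number": "000111222"}
print(minimize(row))  # only the four reporting fields survive the extract
```

The point of the sketch is that minimization happens at extract time, before sensitive fields ever land in the warehouse, while the purge check runs on a schedule so retention is enforced rather than indefinite.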
Topic: Considerations for System and Organization Controls Engagements
A CPA is performing annual vendor oversight for a client that uses a cloud benefits portal to store employees’ SSNs and bank account data as of 12/31/20X5. The vendor sends only this SOC 2 Type 2 excerpt:
| Item | Excerpt |
|---|---|
| Period covered | 1/1/20X5-9/30/20X5 |
| Trust Services Criteria covered | Security |
| Complementary user entity control | The client performs a quarterly privileged-access review of portal administrators |
The CPA needs evidence over confidentiality controls through 12/31/20X5, and the client has not documented the Q4 privileged-access review.
What should the CPA do next?
Best answer: D
What this tests: Considerations for System and Organization Controls Engagements
Explanation: The excerpt is not sufficient because it ends at 9/30, addresses only security, and assumes the client performed a complementary user entity control. The CPA should obtain evidence for the missing period and confidentiality objective and confirm the user entity control operated.
A SOC report supports reliance only for the period, criteria, and controls actually covered, and only when relevant complementary user entity controls are in place. Here, the excerpt stops before year-end, omits the confidentiality criterion the CPA needs, and lists a quarterly privileged-access review that the client has not documented. The proper follow-up is to obtain the full report and other evidence targeted to the gaps, such as evidence covering 10/1/20X5-12/31/20X5 and confidentiality-related controls, and to verify the client performed the stated complementary user entity control. Security coverage does not automatically satisfy confidentiality, and a representation or absence of reported breaches does not replace missing scoped evidence.
The excerpt does not cover the needed criterion, the gap period, or operation of the complementary user entity control, so additional targeted evidence is required before reliance.
Topic: Security, Confidentiality and Privacy
During an ISC review, a CPA is investigating a payroll confidentiality incident. The following facts are known:
What should the CPA do next to evaluate the most relevant control?
Best answer: A
What this tests: Security, Confidentiality and Privacy
Explanation: The key issue is not whether the user could prove identity, but why access still existed after termination. The next step is to test access revocation by tracing the termination event to timely disabling of the account and related privileges.
Authentication verifies who the user is, while authorization determines what the user is allowed to do. Access provisioning creates or grants that access, access review periodically reassesses whether access remains appropriate, and access revocation removes access when it is no longer needed. Here, the terminated supervisor successfully logged in two days after HR finalized the termination, so the immediate control concern is failed revocation. The facts already indicate the application had authentication controls and that original access approval existed. A prior quarterly access review also does not replace the need to revoke access promptly when employment ends. The most relevant next procedure is therefore to trace the termination record to evidence that the user’s account and payroll privileges were disabled on a timely basis.
Because a terminated user retained access after the termination date, the next step is to test the access revocation process tied to termination.
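The tracing procedure described above amounts to comparing each termination date against the date the corresponding account was disabled. The sketch below uses invented users and dates: any account never disabled, or disabled after the termination date, is a revocation exception.

```python
from datetime import date

# Illustrative sketch (hypothetical records): trace HR terminations to
# evidence that the account and its privileges were disabled on time.
terminations = {"jsmith": date(2025, 6, 1), "mlee": date(2025, 6, 1)}
disabled_on  = {"mlee": date(2025, 6, 1)}   # jsmith's account never disabled

# Exception: no disable record, or disabled after the termination date.
exceptions = [user for user, term_date in terminations.items()
              if disabled_on.get(user) is None or disabled_on[user] > term_date]
print(exceptions)  # ['jsmith'] -> access persisted after termination
```

In an actual test, the population would be every termination in the period, with the disable timestamp pulled from identity-management logs rather than asserted by management.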
Use the CPA ISC Practice Test page for the full practice route, mixed-topic practice, timed mock exams, and explanations.
Read the CPA ISC guide on CPAExamsMastery.com for concept review, then return here for Mastery Exam Prep practice.