Try 10 focused Java 17 1Z0-829 questions on Streams and Lambdas, with explanations, then continue with IT Mastery.
Open the matching IT Mastery practice page for timed mocks, topic drills, progress tracking, explanations, and full practice.
Try Java 17 1Z0-829 on Web
View full Java 17 1Z0-829 practice page
| Field | Detail |
|---|---|
| Exam route | Java 17 1Z0-829 |
| Topic area | Working with Streams and Lambda Expressions |
| Blueprint weight | 10% |
| Page purpose | Focused sample questions before returning to mixed practice |
Use this page to isolate Working with Streams and Lambda Expressions for Java 17 1Z0-829. Work through the 10 questions first, then review the explanations and return to mixed practice in IT Mastery.
| Pass | What to do | What to record |
|---|---|---|
| First attempt | Answer without checking the explanation first. | The fact, rule, calculation, or judgment point that controlled your answer. |
| Review | Read the explanation even when you were correct. | Why the best answer is stronger than the closest distractor. |
| Repair | Repeat only missed or uncertain items after a short break. | The pattern behind misses, not the answer letter. |
| Transfer | Return to mixed practice once the topic feels stable. | Whether the same skill holds up when the topic is no longer obvious. |
Blueprint context: 10% of the practice outline. A focused topic score can overstate readiness if you recognize the pattern too quickly, so use it as repair work before timed mixed sets.
These questions are original IT Mastery practice items aligned to this topic area. They are designed for self-assessment and are not official exam questions.
Topic: Working with Streams and Lambda Expressions
Given this functional interface, which assignment uses valid Java 17 lambda syntax?
@FunctionalInterface
interface Combiner {
int combine(String s, int n);
}
Options:
A. Combiner c = s, n -> s.length() + n;
B. Combiner c = (String s, int n) -> return s.length() + n;
C. Combiner c = (String s, n) -> { return s.length() + n; };
D. Combiner c = (String s, int n) -> { return s.length() + n; };
Best answer: D
Explanation: The valid assignment uses an explicitly typed parameter list and a block body. For two lambda parameters, parentheses are required, and a block body for a non-void functional method must return a compatible value.
Java lambda parameter lists must be consistently typed: either all parameters have explicit types, or the parameter types are inferred. Multiple parameters must be enclosed in parentheses. After ->, the body can be a single expression or a block. If the target method returns a value and the lambda uses a block body, the block must use return to provide that value.
A bare return statement is not an expression body; it must appear inside braces.
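The rules above can be sketched in a short runnable example; the class and variable names are illustrative, not part of the exam item:

```java
class LambdaForms {
    @FunctionalInterface
    interface Combiner {
        int combine(String s, int n);
    }

    public static void main(String[] args) {
        // All-explicit parameter types with a block body (option D): compiles.
        Combiner block = (String s, int n) -> { return s.length() + n; };

        // All-inferred parameter types with an expression body: also compiles.
        Combiner expr = (s, n) -> s.length() + n;

        System.out.println(block.combine("abc", 2)); // 5
        System.out.println(expr.combine("abc", 2));  // 5

        // Each rejected option fails to compile:
        // Combiner a = s, n -> s.length() + n;                      // missing parentheses
        // Combiner b = (String s, int n) -> return s.length() + n;  // bare return is not an expression
        // Combiner c = (String s, n) -> { return s.length() + n; }; // mixed explicit/inferred types
    }
}
```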
Key points:
- In option C, String s is explicit while n is inferred; parameter styles cannot be mixed.
- return is a statement and must be inside a block body.
- Two lambda parameters must be wrapped in parentheses (...).

Topic: Working with Streams and Lambda Expressions
What is the result of compiling and running the following Java 17 code?
import java.util.*;
import java.util.stream.*;
public class Demo {
public static void main(String[] args) {
var words = List.of("ant", "bee");
var stats = words.stream().collect(
Collectors.partitioningBy(
w -> w.length() > 3,
Collectors.summarizingInt(String::length)));
System.out.print(stats.containsKey(true) + " ");
System.out.print(stats.get(true).getCount() + ":" + stats.get(true).getSum());
}
}
Options:
A. A NullPointerException is thrown
B. true 2:6
C. true 0:0
D. false 0:0
Best answer: C
Explanation: The stream has no words longer than 3 characters, so the true partition is empty. However, Collectors.partitioningBy still creates entries for both true and false, applying the downstream collector to each partition.
Collectors.partitioningBy groups stream elements into exactly two partitions: true and false. Unlike a typical groupingBy result, both Boolean keys are present even if one side receives no elements. Here, the predicate w -> w.length() > 3 is false for both "ant" and "bee", so the true partition is empty. The downstream summarizingInt(String::length) collector still produces an IntSummaryStatistics object for that empty partition, whose count and sum are both 0.
The key takeaway is that an empty partition is not the same as a missing map entry.
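A minimal sketch of that distinction, contrasting partitioningBy with groupingBy on the same data (class name is illustrative):

```java
import java.util.*;
import java.util.stream.*;

class PartitionDemo {
    public static void main(String[] args) {
        var words = List.of("ant", "bee");

        // partitioningBy always produces entries for both keys,
        // even when one partition receives no elements.
        Map<Boolean, IntSummaryStatistics> stats = words.stream()
            .collect(Collectors.partitioningBy(
                w -> w.length() > 3,
                Collectors.summarizingInt(String::length)));

        System.out.println(stats.get(true).getCount());  // 0: empty partition, not null
        System.out.println(stats.get(false).getSum());   // 6: 3 + 3

        // Contrast: groupingBy simply omits keys that receive no elements.
        Map<Boolean, List<String>> grouped = words.stream()
            .collect(Collectors.groupingBy(w -> w.length() > 3));
        System.out.println(grouped.containsKey(true));   // false
    }
}
```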
Key points:
- partitioningBy includes both true and false keys.
- stats.get(true) summarizes only elements matching the predicate.
- stats.get(true) returns an empty IntSummaryStatistics, not null.

Topic: Working with Streams and Lambda Expressions
A reporting utility exposes callbacks for a stream pipeline:
record Event(String source, int severity) {}
static void export(List<Event> events,
Function<Event, Boolean> keep,
Function<Event, String> format,
Function<String, Void> write) {
events.stream()
.filter(e -> keep.apply(e))
.map(format)
.forEach(s -> write.apply(s));
}
Callers currently need boxed Boolean results and dummy null returns. The callbacks should mean: decide whether to keep an Event, convert an Event to String, and write each String as a side effect. Which refactor is best?
Options:
A. Use Predicate<Event> keep, Supplier<String> format, and Function<String,Void> write.
B. Use Predicate<Event> keep, Function<Event,String> format, and Consumer<String> write.
C. Use Supplier<Boolean> keep, Function<Event,String> format, and Consumer<String> write.
D. Use Predicate<Event> keep, UnaryOperator<Event> format, and Consumer<String> write.
Best answer: B
Explanation: The best refactor matches each callback to its role in the stream pipeline. filter needs a Predicate, map needs a value-producing Function, and forEach needs a side-effect-oriented Consumer. This avoids boxed Boolean tests and dummy null return values.
Java’s standard functional interfaces describe both parameter and result intent. A boolean-valued test should be a Predicate<T>, not a Function<T,Boolean>. A transformation from one type to another should be a Function<T,R>. A callback that receives a value and performs a side effect without producing a result should be a Consumer<T>. In this pipeline, the natural signature is Predicate<Event> for filtering, Function<Event,String> for formatting, and Consumer<String> for writing. UnaryOperator<T> is only appropriate when the input and output types are the same, and Supplier<T> is only appropriate when no input is needed.
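One possible shape of the option B refactor, with illustrative data; the record fields are taken from the question and everything else is a sketch:

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Predicate;

class ExportDemo {
    record Event(String source, int severity) {}

    // Each standard interface now matches its pipeline role.
    static void export(List<Event> events,
                       Predicate<Event> keep,           // filter: boolean-valued test
                       Function<Event, String> format,  // map: Event -> String
                       Consumer<String> write) {        // forEach: side effect, no result
        events.stream()
              .filter(keep)    // a Predicate can be passed to filter directly
              .map(format)
              .forEach(write); // a Consumer can be passed to forEach directly
    }

    public static void main(String[] args) {
        var events = List.of(new Event("app", 3), new Event("db", 1));
        // Callers pass natural lambdas: no boxed Boolean, no dummy null return.
        export(events,
               e -> e.severity() >= 2,
               e -> e.source() + ":" + e.severity(),
               System.out::println);  // prints app:3
    }
}
```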
Key points:
- A Supplier takes no input, so it could not inspect the Event.
- format maps Event to String, not Event to Event.

Topic: Working with Streams and Lambda Expressions
In Java 17, a developer writes a lambda expression or method reference in an assignment or method-call argument and receives an incompatible target type compiler error. Which rule correctly describes how the compiler determines whether that expression is legal?
Options:
A. It allows any target type with matching method names.
B. It selects the target by running overloads at runtime.
C. It gives the expression its own standalone class type.
D. It checks compatibility with the target functional interface’s abstract method.
Best answer: D
Explanation: Lambda expressions and method references are target-typed in Java 17. The compiler uses the target functional interface and checks the expression against that interface’s single abstract method signature.
Lambda expressions and method references are poly expressions: they do not have a useful standalone type by themselves. The target type comes from context, such as an assignment, method argument, return statement, or cast. That target must be a functional interface, and the compiler checks parameter and return compatibility against its function descriptor. This is why the same lambda can compile for one interface but fail for another.
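A small sketch of target typing, assuming a hypothetical StringOp interface to show the same lambda text compiling against different targets:

```java
import java.util.function.Function;
import java.util.function.UnaryOperator;

class TargetTyping {
    // Hypothetical functional interface for illustration only.
    interface StringOp { String apply(String s); }

    public static void main(String[] args) {
        // The identical lambda body compiles against three different targets,
        // because the compiler checks it against each target's single abstract method.
        Function<String, String> f = s -> s.toUpperCase();
        UnaryOperator<String> u = s -> s.toUpperCase();
        StringOp op = s -> s.toUpperCase();

        System.out.println(f.apply("ok"));   // OK
        System.out.println(op.apply("ok"));  // OK

        // Without a target type the lambda has no type at all:
        // var bad = s -> s.toUpperCase();   // compile error: cannot infer type
    }
}
```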
Topic: Working with Streams and Lambda Expressions
A developer wants to know how far this sequential stream pipeline runs before the terminal operation completes. What is printed?
var log = new ArrayList<String>();
var found = Stream.of("ax", "by", "cz", "dw")
.filter(s -> { log.add("f:" + s); return s.charAt(1) >= 'y'; })
.map(s -> { log.add("m:" + s); return s.toUpperCase(); })
.anyMatch(s -> { log.add("a:" + s); return s.startsWith("C"); });
System.out.println(found);
System.out.println(log);
Options:
A. true then [f:ax, f:by, m:by, a:BY, f:cz, m:cz, a:CZ]
B. false then [f:ax, f:by, m:by, a:BY, f:cz, m:cz, a:CZ, f:dw]
C. true then [f:ax, f:by, f:cz, f:dw, m:by, a:BY, m:cz, a:CZ]
D. true then [f:ax, f:by, m:by, a:BY, f:cz, m:cz, a:CZ, f:dw]
Best answer: A
Explanation: Sequential ordered stream pipelines process each element through the needed intermediate operations before moving on. The terminal operation anyMatch is short-circuiting, so processing stops as soon as it sees CZ, which starts with C.
The core rule is that stream intermediate operations are lazy and are driven by the terminal operation. For an ordered sequential stream, elements are considered in encounter order. The element ax is filtered out after logging f:ax. The element by passes the filter, is mapped to BY, but does not match. The element cz passes, is mapped to CZ, and makes anyMatch return true. Because anyMatch short-circuits on the first match, dw is never evaluated. A stream pipeline does not complete all filters first unless the terminal operation requires that behavior.
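For contrast, a sketch of the same filter driven by a non-short-circuiting terminal operation (count), where every element is pulled through the pipeline; class name is illustrative:

```java
import java.util.ArrayList;
import java.util.stream.Stream;

class ShortCircuitContrast {
    public static void main(String[] args) {
        var log = new ArrayList<String>();
        // count() cannot stop early when a filter is present:
        // it must test every element to know how many survive.
        long n = Stream.of("ax", "by", "cz", "dw")
            .filter(s -> { log.add("f:" + s); return s.charAt(1) >= 'y'; })
            .count();
        System.out.println(n);    // 2 ("by" and "cz" pass)
        System.out.println(log);  // [f:ax, f:by, f:cz, f:dw] -- all four tested
    }
}
```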
Key points:
- Each element passes through filter before map and anyMatch.
- anyMatch stops once CZ satisfies the predicate.
- map converts cz to CZ before the match predicate is evaluated.

Topic: Working with Streams and Lambda Expressions
An encounter-ordered List<Order> is processed in parallel. The goal is to keep the paid order discounts in encounter order and compute their total in cents.
List<Integer> discounts = new ArrayList<>();
int total = orders.parallelStream()
.filter(Order::paid)
.peek(o -> discounts.add(o.discountCents()))
.map(Order::discountCents)
.reduce(0, (a, b) -> a - b);
Which refactor is the best fix?
Options:
A. Create discounts with filter(...).map(...).toList(), then compute mapToInt(...).sum().
B. Use AtomicInteger in forEach to update the total and list.
C. Use Collections.synchronizedList, keep peek, and keep the subtraction reduce.
D. Keep peek, but change the reducer to (a, b) -> a + b.
Best answer: A
Explanation: Parallel stream operations should avoid stateful lambdas that mutate shared objects. The original code mutates an ArrayList inside peek and uses subtraction, which is not an associative reduction operation. Mapping to discounts, collecting them, and using sum() fixes both issues.
A parallel reduction must use an associative operation with a compatible identity. Subtraction is not associative, so splitting and combining stream partitions can produce results that differ from a sequential evaluation. Also, adding to an external ArrayList from peek is a stateful side effect and is unsafe in a parallel pipeline. A side-effect-free refactor first maps paid orders to discount values and lets the stream library build the list, preserving encounter order for an ordered source. Then mapToInt(Integer::intValue).sum() performs an associative numeric reduction. The key takeaway is to express parallel work as transformations and reductions, not as shared mutable updates inside lambdas.
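A minimal sketch of the option A refactor, using an illustrative Order record in place of the one from the question:

```java
import java.util.List;

class ParallelSafe {
    record Order(boolean paid, int discountCents) {}

    public static void main(String[] args) {
        var orders = List.of(
            new Order(true, 100), new Order(false, 50), new Order(true, 25));

        // Side-effect free: let the stream library build the list.
        // toList() preserves encounter order even for a parallel ordered source.
        List<Integer> discounts = orders.parallelStream()
            .filter(Order::paid)
            .map(Order::discountCents)
            .toList();

        // Associative numeric reduction: safe to split and recombine in parallel.
        int total = discounts.stream().mapToInt(Integer::intValue).sum();

        System.out.println(discounts);  // [100, 25]
        System.out.println(total);      // 125
    }
}
```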
Key points:
- Avoid the peek side effect that mutates a shared ArrayList.

Topic: Working with Streams and Lambda Expressions
Given this Java 17 code, what is printed?
var words = List.of("bear", "cat", "dog", "eel");
var result = words.stream()
.peek(s -> System.out.print(s.charAt(0)))
.filter(s -> s.length() == 3)
.findFirst()
.orElse("none");
System.out.print(":" + result);
Options:
A. bc:cat
B. bcde:cat
C. :cat
D. b:cat
Best answer: A
Explanation: findFirst() is a short-circuiting terminal operation. Because the stream has encounter order from List.of(...), elements are tested in list order until the first matching element, cat, is found.
A stream from a List is ordered, so this sequential pipeline visits elements in list encounter order. Intermediate operations such as peek() and filter() are lazy; they run only as the terminal operation requests elements. findFirst() asks for elements until it finds the first one that passes the filter. bear is peeked and rejected by the length check, then cat is peeked and accepted. The pipeline stops before dog and eel are visited. The key takeaway is that short-circuiting can prevent later elements from reaching even earlier intermediate operations such as peek().
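To see how operation order interacts with laziness, here is a sketch that moves peek after the filter; only surviving elements are peeked, so the printed prefix changes:

```java
import java.util.List;

class PeekAfterFilter {
    public static void main(String[] args) {
        var words = List.of("bear", "cat", "dog", "eel");
        // peek now sees only elements that already passed the filter.
        var result = words.stream()
            .filter(s -> s.length() == 3)
            .peek(s -> System.out.print(s.charAt(0)))
            .findFirst()
            .orElse("none");
        System.out.println(":" + result);  // c:cat ("bear" is rejected before peek)
    }
}
```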
Key points:
- findFirst() does not need to examine dog or eel after cat matches.
- peek() runs before filter(), so bear prints before being rejected.

Topic: Working with Streams and Lambda Expressions
Given the following Java 17 code, what is the result?
import java.util.stream.*;
public class Stats {
public static void main(String[] args) {
double total = IntStream.rangeClosed(1, 3)
.mapToLong(i -> i * 2L)
.mapToDouble(n -> n / 2.0)
.sum();
System.out.println(total);
}
}
Options:
A. It does not compile because mapToDouble() is unavailable after mapToLong().
B. It does not compile because sum() returns OptionalDouble.
C. It prints 6.0.
D. It prints 6.
Best answer: C
Explanation: The code compiles and prints 6.0. IntStream.mapToLong() produces a LongStream, and LongStream.mapToDouble() then produces a DoubleStream. Calling sum() on that DoubleStream returns a primitive double, not an optional value.
Primitive stream mapping methods change the stream type when their names include the target primitive type. Here, IntStream.rangeClosed(1, 3) produces 1, 2, 3. The mapToLong() call converts those to 2L, 4L, 6L. The mapToDouble() call converts them to 1.0, 2.0, 3.0, and DoubleStream.sum() returns their primitive double sum: 6.0.
The key distinction is that average() returns OptionalDouble, but sum() returns a primitive numeric result for primitive streams.
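A brief sketch of that distinction, including the empty-stream case that motivates it; class name is illustrative:

```java
import java.util.stream.IntStream;

class PrimitiveStreams {
    public static void main(String[] args) {
        // sum() on a primitive stream returns a primitive value...
        double total = IntStream.rangeClosed(1, 3)
            .mapToLong(i -> i * 2L)      // IntStream -> LongStream: 2, 4, 6
            .mapToDouble(n -> n / 2.0)   // LongStream -> DoubleStream: 1.0, 2.0, 3.0
            .sum();
        System.out.println(total);       // 6.0

        // ...even for an empty stream, where the sum is simply 0.
        System.out.println(IntStream.of().sum());                  // 0
        // average() must return OptionalDouble: an empty stream has no average.
        System.out.println(IntStream.of().average().isPresent());  // false
    }
}
```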
Key points:
- sum() returns a primitive double, so println displays 6.0.
- LongStream has a valid mapToDouble() operation.
- Do not confuse sum() with average(); only average() returns OptionalDouble.

Topic: Working with Streams and Lambda Expressions
Assume necessary imports and static imports for java.util.stream.Collectors methods exist.
record Order(String customer, String sku, int qty) {}
var orders = List.of(
new Order("Ava", "pen", 2),
new Order("Ava", "pen", 3),
new Order("Ben", "pen", 4),
new Order("Ava", "pad", 1)
);
Map<String, Map<String, Integer>> totals =
orders.stream().collect(/* collector */);
The required result maps each customer to each SKU’s total quantity, summing duplicate customer/SKU pairs. Which collector expression correctly replaces /* collector */?
Options:
A. groupingBy(Order::customer, toMap(Order::sku, Order::qty, Integer::sum))
B. groupingBy(Order::customer, summingInt(Order::qty))
C. toMap(Order::customer, o -> Map.of(o.sku(), o.qty()), (a, b) -> b)
D. groupingBy(Order::customer, toMap(Order::sku, Order::qty))
Best answer: A
Explanation: groupingBy creates the outer Map by customer, and the downstream collector creates the inner map for each group. Because Ava has two pen orders, the downstream toMap needs a merge function; Integer::sum combines those duplicate inner keys.
For Collectors.toMap, duplicate keys are not merged unless a merge function is supplied. Here, groupingBy(Order::customer, downstream) first groups orders by customer, then applies the downstream collector separately inside each customer group. The downstream toMap(Order::sku, Order::qty, Integer::sum) creates Map<String, Integer> values where duplicate SKU keys in the same customer group are summed.
This produces the required grouped result type: Map<String, Map<String, Integer>>. Omitting the merge function would compile, but collection would fail when a duplicate SKU appears within the same customer group.
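The correct collector can be sketched end to end with the question's data (class name is illustrative):

```java
import java.util.List;
import java.util.Map;
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.toMap;

class NestedCollect {
    record Order(String customer, String sku, int qty) {}

    public static void main(String[] args) {
        var orders = List.of(
            new Order("Ava", "pen", 2),
            new Order("Ava", "pen", 3),
            new Order("Ben", "pen", 4),
            new Order("Ava", "pad", 1));

        // Outer map keyed by customer; the downstream toMap builds each inner map,
        // merging duplicate SKU keys within a group via Integer::sum.
        Map<String, Map<String, Integer>> totals = orders.stream()
            .collect(groupingBy(Order::customer,
                     toMap(Order::sku, Order::qty, Integer::sum)));

        System.out.println(totals.get("Ava").get("pen"));  // 5: the two pen orders merged
        System.out.println(totals.get("Ben"));             // {pen=4}
    }
}
```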
Key points:
- Without a merge function, duplicate pen keys for Ava cause toMap to throw IllegalStateException.
- The top-level toMap replacement loses earlier inner maps for the same customer instead of grouping and summing SKUs.

Topic: Working with Streams and Lambda Expressions
A batch job assigns shard numbers. The method must create a stream containing the primitive int values 1 through 5, inclusive, and pass it to an API that requires an IntStream. With java.util.* and java.util.stream.* imported, which declaration correctly applies the Java SE 17 stream source rule?
Options:
A. IntStream ids = List.of(1, 2, 3, 4, 5).stream();
B. IntStream ids = Stream.of(1, 2, 3, 4, 5);
C. IntStream ids = IntStream.range(1, 5);
D. IntStream ids = IntStream.rangeClosed(1, 5);
Best answer: D
Explanation: Primitive range streams use IntStream.range() or IntStream.rangeClosed(). Because the requirement includes both 1 and 5 and needs a primitive IntStream, rangeClosed(1, 5) matches the bounds and stream type.
Java provides primitive stream factories for numeric ranges. IntStream.range(startInclusive, endExclusive) excludes the second argument, while IntStream.rangeClosed(startInclusive, endInclusive) includes it. Since the stream must contain exactly 1, 2, 3, 4, and 5 as primitive int values, the inclusive range factory is the direct match. Factory methods such as Stream.of(1, 2, 3) and collection stream() methods create object streams, such as Stream<Integer>, not IntStream. Those streams can be converted with mapToInt(Integer::intValue), but they are not directly assignable to IntStream.
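A short sketch of the bounds difference and of unboxing an object stream; class name is illustrative:

```java
import java.util.Arrays;
import java.util.stream.IntStream;
import java.util.stream.Stream;

class RangeDemo {
    public static void main(String[] args) {
        // rangeClosed includes the upper bound; range excludes it.
        System.out.println(Arrays.toString(
            IntStream.rangeClosed(1, 5).toArray()));  // [1, 2, 3, 4, 5]
        System.out.println(Arrays.toString(
            IntStream.range(1, 5).toArray()));        // [1, 2, 3, 4]

        // A boxed Stream<Integer> is not an IntStream, but it can be unboxed.
        IntStream unboxed = Stream.of(1, 2, 3, 4, 5).mapToInt(Integer::intValue);
        System.out.println(unboxed.sum());            // 15
    }
}
```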
Key points:
- range(1, 5) produces 1 through 4 only.
- Stream.of(1, 2, 3, 4, 5) produces Stream<Integer>.
- List.of(...).stream() also produces a boxed Stream<Integer>, not an IntStream.

Use the Java 17 1Z0-829 Practice Test page for the full IT Mastery route, mixed-topic practice, timed mock exams, explanations, and web/mobile app access.
Try Java 17 1Z0-829 on Web
View Java 17 1Z0-829 Practice Test
Read the Java 17 1Z0-829 Cheat Sheet on Tech Exam Lexicon, then return to IT Mastery for timed practice.