A Powerful Tool for Java 8 Streams: The Collector


Stream series:

Java 8 Series: Basic Syntax of Streams
Java 8 Series: A Powerful Tool for Streams, the Collector
Java 8 Series: Refactoring and Custom Collectors
Java 8 Series: The Almighty reduce
Overview

Previously we used collect(Collectors.toList()) to generate a List from a stream. List is the data structure we reach for most often in day-to-day development, but sometimes we also want a stream to produce other values, such as a Map or a Set, or even a custom data structure of our own design.

collect is driven by a Collector, a general-purpose construct that builds complex values from a stream. Pass one to the collect method and it produces the desired data structure. The Collectors utility class deserves a mention here: it encapsulates the conversion logic for common scenarios. Of course, Collectors covers only the most frequently used cases; for special needs, we write a custom collector.

As has been said before, you can tell from a method's signature on Stream whether it is an eagerly evaluated operation. The reduce operation is a good example, but sometimes people want to do more than reduce can comfortably express. That is what the collector is for: a generic construct that generates complex values from a stream. Any stream can use one by passing it to the collect method.

<R, A> R collect(Collector<? super T, A, R> collector);

<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);
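A minimal sketch of the three-argument overload in action, collecting into an ArrayList by hand (the variable names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class CollectOverloadDemo {
    public static void main(String[] args) {
        // supplier creates the container, accumulator folds in one element,
        // combiner merges two containers (used when the stream runs in parallel)
        List<Integer> numbers = Stream.of(1, 2, 3, 4)
                .collect(ArrayList::new,      // Supplier<R>
                         ArrayList::add,      // BiConsumer<R, ? super T>
                         ArrayList::addAll);  // BiConsumer<R, R>
        System.out.println(numbers); // [1, 2, 3, 4]
    }
}
```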

Supporting Interfaces

Supplier

Supplier<T> is a functional interface declaring a single get method; it is used to create and return an object of the specified type.

T: the type of object supplied

@FunctionalInterface
public interface Supplier<T> {
    T get();
}

BiConsumer

BiConsumer<T, U> is a functional interface declaring an accept method with no return value; it is used to express an operation performed on two arguments.

The interface also defines a default method, andThen, which takes another BiConsumer and returns a composed BiConsumer that performs both operations in sequence. If performing either operation throws an exception, it is relayed to the caller of the composed operation; if performing this operation throws an exception, the after operation is not performed.

@FunctionalInterface
public interface BiConsumer<T, U> {

    void accept(T t, U u);

    default BiConsumer<T, U> andThen(BiConsumer<? super T, ? super U> after) {
        Objects.requireNonNull(after);

        return (l, r) -> {
            accept(l, r);
            after.accept(l, r);
        };
    }
}
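A short sketch of andThen chaining two BiConsumer operations on the same pair of arguments:

```java
import java.util.function.BiConsumer;

public class BiConsumerAndThenDemo {
    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        BiConsumer<String, Integer> record = (k, v) -> log.append(k).append("=").append(v);
        BiConsumer<String, Integer> close  = (k, v) -> log.append(";");

        // the composed consumer runs record first, then close, on the same arguments
        record.andThen(close).accept("a", 1);
        System.out.println(log); // a=1;
    }
}
```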

BinaryOperator

The BinaryOperator<T> interface extends the BiFunction interface, fixing both parameter types and the return type of the apply method to T.

@FunctionalInterface
public interface BinaryOperator<T> extends BiFunction<T, T, T> {

    public static <T> BinaryOperator<T> minBy(Comparator<? super T> comparator) {
        Objects.requireNonNull(comparator);
        return (a, b) -> comparator.compare(a, b) <= 0 ? a : b;
    }

    public static <T> BinaryOperator<T> maxBy(Comparator<? super T> comparator) {
        Objects.requireNonNull(comparator);
        return (a, b) -> comparator.compare(a, b) >= 0 ? a : b;
    }
}
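The minBy and maxBy factories above can be exercised with any Comparator; a small sketch comparing strings by length:

```java
import java.util.Comparator;
import java.util.function.BinaryOperator;

public class MinMaxByDemo {
    public static void main(String[] args) {
        BinaryOperator<String> shorter = BinaryOperator.minBy(Comparator.comparingInt(String::length));
        BinaryOperator<String> longer  = BinaryOperator.maxBy(Comparator.comparingInt(String::length));

        // "api" (length 3) is shorter than "stream" (length 6)
        System.out.println(shorter.apply("stream", "api")); // api
        System.out.println(longer.apply("stream", "api"));  // stream
    }
}
```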

@FunctionalInterface
public interface BiFunction<T, U, R> {

    R apply(T t, U u);

    default <V> BiFunction<T, U, V> andThen(Function<? super R, ? extends V> after) {
        Objects.requireNonNull(after);
        return (T t, U u) -> after.apply(apply(t, u));
    }
}

Function

Function is a functional interface that defines a transformation: it converts a T into an R. The map method on Stream, for example, accepts a Function argument and uses it to convert T into R.

@FunctionalInterface
public interface Function<T, R> {

    /**
     * Transformation function: converts a T into an R.
     */
    R apply(T t);

    /**
     * Returns a composed function that applies before first,
     * and then applies this function.
     *
     * If evaluating either function throws an exception, it is
     * relayed to the caller of the composed function.
     * If before is null, a NullPointerException is thrown.
     */
    default <V> Function<V, R> compose(Function<? super V, ? extends T> before) {
        Objects.requireNonNull(before);
        return (V v) -> apply(before.apply(v));
    }

    /**
     * Returns a composed function that applies this function first,
     * and then applies after.
     *
     * If evaluating either function throws an exception, it is
     * relayed to the caller of the composed function.
     * If after is null, a NullPointerException is thrown.
     */
    default <V> Function<T, V> andThen(Function<? super R, ? extends V> after) {
        Objects.requireNonNull(after);
        return (T t) -> after.apply(apply(t));
    }

    /**
     * Returns a function that always returns its input argument.
     */
    static <T> Function<T, T> identity() {
        return t -> t;
    }
}
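A quick sketch showing the ordering difference between compose and andThen, plus identity:

```java
import java.util.function.Function;

public class FunctionComposeDemo {
    public static void main(String[] args) {
        Function<Integer, Integer> plusOne = x -> x + 1;
        Function<Integer, Integer> doubled = x -> x * 2;

        // compose: doubled runs first, then plusOne -> (3 * 2) + 1
        System.out.println(plusOne.compose(doubled).apply(3)); // 7
        // andThen: plusOne runs first, then doubled -> (3 + 1) * 2
        System.out.println(plusOne.andThen(doubled).apply(3)); // 8
        // identity: returns its argument unchanged
        System.out.println(Function.<Integer>identity().apply(3)); // 3
    }
}
```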
Collector

Collector is the interface for mutable reduction operations on a stream. Mutable reductions include accumulating elements into collections, concatenating strings with a StringBuilder, and computing statistics over elements such as sum, min, max or average. The Collectors class provides implementations of many common mutable reductions.

Collector<T, A, R> takes three generic parameters that fix the types involved in the mutable reduction:

T: the type of input elements
A: the mutable accumulation type of the reduction (often hidden as an implementation detail)
R: the result type of the reduction

The Collector interface declares four functions, which work together to accumulate elements into a mutable result container, optionally performing a final transformation on the result:

Supplier<A> supplier(): creates a new result container
BiConsumer<A, T> accumulator(): folds an element into the result container
BinaryOperator<A> combiner(): merges two result containers into one
Function<A, R> finisher(): performs the final transformation on the result container

Through the characteristics method of the Collector interface, a collector can declare constraints about itself: Set<Characteristics> characteristics().

Characteristics is an enum nested in Collector that declares three properties constraining the collector: CONCURRENT, UNORDERED and IDENTITY_FINISH.

CONCURRENT: the collector supports concurrency, meaning the accumulator may be invoked on the same result container from multiple threads
UNORDERED: the collector does not preserve the encounter order of the elements in the stream
IDENTITY_FINISH: the finisher is the identity function and can be skipped

Note: if a collector declares only the CONCURRENT property, and not the UNORDERED property, it only supports concurrent execution on unordered streams.

Identity and Associativity Constraints

A stream may execute sequentially or in parallel. To guarantee that both modes produce equivalent results, the collector functions must satisfy an identity constraint and an associativity constraint.

The identity constraint says that combining any partially accumulated result with an empty result container must produce an equivalent result. That is, for a partially accumulated result a of any series of accumulator and combiner calls, combiner.apply(a, supplier.get()) must be equivalent to a.

The associativity constraint says that splitting the computation must produce an equivalent result. That is, for any input elements t1 and t2, the results r1 and r2 of the following two computations must be equivalent:

A a1 = supplier.get();
accumulator.accept(a1, t1);
accumulator.accept(a1, t2);
R r1 = finisher.apply(a1);  // result without splitting

A a2 = supplier.get();
accumulator.accept(a2, t1);
A a3 = supplier.get();
accumulator.accept(a3, t2);
R r2 = finisher.apply(combiner.apply(a2, a3));  // result with splitting
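The constraint can be checked concretely. The sketch below uses hand-written components that mirror a list collector (the names supplier, accumulator and combiner are local variables, not the Collector interface itself) and verifies that the split and unsplit computations agree:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;
import java.util.function.BinaryOperator;
import java.util.function.Supplier;

public class AssociativityCheck {
    public static void main(String[] args) {
        // hand-written list-collector components
        Supplier<List<Integer>> supplier = ArrayList::new;
        BiConsumer<List<Integer>, Integer> accumulator = List::add;
        BinaryOperator<List<Integer>> combiner = (a, b) -> { a.addAll(b); return a; };

        int t1 = 1, t2 = 2;

        // unsplit computation
        List<Integer> a1 = supplier.get();
        accumulator.accept(a1, t1);
        accumulator.accept(a1, t2);

        // split computation, merged with the combiner
        List<Integer> a2 = supplier.get();
        accumulator.accept(a2, t1);
        List<Integer> a3 = supplier.get();
        accumulator.accept(a3, t2);
        List<Integer> merged = combiner.apply(a2, a3);

        System.out.println(a1.equals(merged)); // true
    }
}
```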
Creating Collectors

Custom Collectors

See Java 8 Series: Refactoring and Custom Collectors.

Based on the Collectors Utility Class

The Collectors utility class declares many commonly used collectors, allowing us to create a collector quickly. As we learned earlier, collector functions must satisfy the identity and associativity constraints. Libraries that implement reduction based on Collector, such as Stream.collect(Collector), must also observe the following constraints:

The first argument passed to the accumulator() function, both arguments passed to the combiner() function, and the argument passed to the finisher() function must be the result of a previous invocation of the supplier(), accumulator() or combiner() functions.

The implementation should not do anything with the result of any of these functions other than pass them on again to the accumulator(), combiner() or finisher() functions, or return them to the caller of the reduction operation.

If a result is passed to the combiner() or finisher() function, and the same object is not returned from that function, that object is never passed to the accumulator() function again. Once a result is passed to the combiner() or finisher() function, it is not passed to the accumulator() function again.

For non-concurrent collectors, any result returned from the supplier(), accumulator() or combiner() functions must be serially thread-confined. This lets collection occur in parallel without the collector needing any additional synchronization; the reduction implementation must ensure that the stream's elements are properly partitioned, that partitions are processed in isolation, and that results are combined only after accumulation is complete.

For concurrent collectors, the implementation is free (but not required) to perform the reduction concurrently: the accumulator() may be invoked from multiple threads on the same result container, rather than keeping partial results isolated during accumulation. A concurrent reduction is applied only if the collector has the Collector.Characteristics.UNORDERED characteristic or the original data is unordered.
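A custom collector can be assembled from the four functions with the Collector.of factory; a minimal sketch that joins strings into a bracketed list:

```java
import java.util.StringJoiner;
import java.util.stream.Collector;
import java.util.stream.Stream;

public class CustomCollectorDemo {
    public static void main(String[] args) {
        // a custom collector built from supplier, accumulator, combiner, finisher
        Collector<String, StringJoiner, String> bracketed = Collector.of(
                () -> new StringJoiner(", ", "[", "]"), // supplier: new result container
                StringJoiner::add,                       // accumulator: fold in one element
                StringJoiner::merge,                     // combiner: merge two containers
                StringJoiner::toString);                 // finisher: final transformation

        String result = Stream.of("a", "b", "c").collect(bracketed);
        System.out.println(result); // [a, b, c]
    }
}
```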

Converting to Other Collections

Despite all the chained stream operations mentioned earlier, we often still need to produce a collection at the end. For example: existing code is written against collections, so we must turn the stream back into a collection; after a series of chained operations we ultimately want to produce a concrete value; and when writing unit tests, we need to make assertions on a specific collection.

A stream can be converted to a collection with collectors such as the previously mentioned toList, which produces an instance of java.util.List. There are also toSet and toCollection, which produce instances of Set and of an arbitrary Collection, respectively.

toList

Example:

List<Integer> collectList = Stream.of(1, 2, 3, 4)
        .collect(Collectors.toList());
System.out.println("collectList: " + collectList);
Print results:
// collectList: [1, 2, 3, 4]
toSet

Example:

Set<Integer> collectSet = Stream.of(1, 2, 3, 4)
        .collect(Collectors.toSet());
System.out.println("collectSet: " + collectSet);
Print results:
// collectSet: [1, 2, 3, 4]
toCollection

Typically, when creating a collection you call the appropriate constructor to specify its concrete type:

List<Artist> artists = new ArrayList<>();

When calling the toList or toSet methods, however, you need not specify a concrete type; the stream library infers and picks a suitable implementation automatically. Sometimes we have specific requirements for the generated collection, such as wanting a TreeSet rather than whatever type the stream library chooses. For that there is toCollection, which takes as its argument a function that creates the collection.
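A short sketch of toCollection selecting the concrete type, here a TreeSet so the elements come out sorted:

```java
import java.util.TreeSet;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToCollectionDemo {
    public static void main(String[] args) {
        // toCollection lets us choose the concrete collection type ourselves
        TreeSet<Integer> sorted = Stream.of(3, 1, 4, 2)
                .collect(Collectors.toCollection(TreeSet::new));
        System.out.println(sorted); // [1, 2, 3, 4]
    }
}
```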

It is worth noting, looking at the Collectors source, that the function passed to toCollection must produce a subtype of Collection. This means toCollection cannot produce every container type; most notably, it cannot produce a Map.

toMap

To generate a Map, we call the toMap method. Because a Map holds keys and values, this method differs from toSet, toList and the rest: toMap must accept at least two arguments, one function to produce the keys and one to produce the values. The toMap method has three variants:

toMap(Function<? super T, ? extends K> keyMapper, Function<? super T, ? extends U> valueMapper)
keyMapper: the function used to produce the keys
valueMapper: the function used to produce the values

Note: if elements in the stream map to a duplicate key in the resulting Map, this variant throws java.lang.IllegalStateException: Duplicate key at runtime.
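A sketch of toMap: the two-argument form, and the form with a merge function that resolves duplicate keys instead of throwing (the example data is illustrative):

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToMapDemo {
    public static void main(String[] args) {
        // two-argument form: the word itself is the key, its length the value
        Map<String, Integer> lengths = Stream.of("a", "bb", "ccc")
                .collect(Collectors.toMap(w -> w, String::length));
        System.out.println(lengths.get("bb")); // 2

        // three-argument form: the merge function resolves duplicate keys
        // instead of throwing IllegalStateException
        Map<Integer, String> byLength = Stream.of("aa", "bb", "ccc")
                .collect(Collectors.toMap(String::length, w -> w, (a, b) -> a + "|" + b));
        System.out.println(byLength.get(2)); // aa|bb
    }
}
```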

The other variants additionally take a merge function for resolving duplicate keys, and a supplier for the map itself.

Converting to a Value

Use collect to reduce a stream to a single value. maxBy and minBy, for example, let users compute a value under a particular ordering.

averagingDouble: average, for streams of double values
averagingInt: average, for streams of int values
averagingLong: average, for streams of long values
counting: the number of elements in the stream
maxBy: the maximum element of the stream, under the given comparator
minBy: the minimum element of the stream, under the given comparator
reducing: a general reduce operation
summarizingDouble: statistics over a stream of double values, including count, min, max, sum and average
summarizingInt: statistics over a stream of int values, including count, min, max, sum and average
summarizingLong: statistics over a stream of long values, including count, min, max, sum and average
summingDouble: sum, for streams of double values
summingInt: sum, for streams of int values
summingLong: sum, for streams of long values

Example:

Optional<Integer> collectMaxBy = Stream.of(1, 2, 3, 4)
        .collect(Collectors.maxBy(Comparator.comparingInt(o -> o)));
System.out.println("collectMaxBy: " + collectMaxBy.get());
Print results:
// collectMaxBy: 4
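The summarizing collectors from the list above compute several statistics in one pass; a short sketch with summarizingInt:

```java
import java.util.IntSummaryStatistics;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SummarizingDemo {
    public static void main(String[] args) {
        // summarizingInt computes count, sum, min, max and average in one pass
        IntSummaryStatistics stats = Stream.of(1, 2, 3, 4)
                .collect(Collectors.summarizingInt(Integer::intValue));
        System.out.println(stats.getCount());   // 4
        System.out.println(stats.getSum());     // 10
        System.out.println(stats.getMin());     // 1
        System.out.println(stats.getMax());     // 4
        System.out.println(stats.getAverage()); // 2.5
    }
}
```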
Partitioning Data into Blocks

A common use of collect is to split a stream into two collections. Given a stream of numbers, we might want to separate it into a collection of even numbers and a collection of odd numbers. The first thing that comes to mind is filtering: two filter operations complete the job very simply.

But this approach has problems. First, performing two filter operations requires two streams. Second, if the filtering logic is complex, it must be executed on each stream, and the code becomes redundant.

This is where the partitioningBy method in Collectors comes in. It takes a stream and divides it into two parts: using a Predicate, it decides which part each element belongs to and returns a Map from Boolean to List. The elements in the list under the key true satisfy the condition specified by the predicate; likewise, the elements in the list under the key false do not satisfy it.

Thus, using partitioningBy, we can split a stream of numbers into an odd set and an even set.

Map<Boolean, List<Integer>> collectParti = Stream.of(1, 2, 3, 4)
        .collect(Collectors.partitioningBy(it -> it % 2 == 0));
System.out.println("collectParti: " + collectParti);
Print results:
// collectParti: {false=[1, 3], true=[2, 4]}
Data Grouping

Grouping is a more natural way to split data: rather than dividing it into true and false halves, it can group the data by arbitrary values.

Call the stream's collect method and pass in a collector. groupingBy takes a classification function to group the data, just as partitioningBy takes a Predicate to split the data into true and false parts. The classifier is a Function object, the same kind used by the map operation.

Example:

Map<Boolean, List<Integer>> collectGroup = Stream.of(1, 2, 3, 4)
        .collect(Collectors.groupingBy(it -> it > 3));
System.out.println("collectGroup: " + collectGroup);
Print results:
// collectGroup: {false=[1, 2, 3], true=[4]}
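Because the classifier may return any key type, grouping is not limited to two buckets; a sketch grouping strings by their length:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class GroupByLengthDemo {
    public static void main(String[] args) {
        // the classifier can produce any key type, not just true/false
        Map<Integer, List<String>> byLength = Stream.of("a", "bb", "cc", "ddd")
                .collect(Collectors.groupingBy(String::length));
        System.out.println(byLength); // {1=[a], 2=[bb, cc], 3=[ddd]}
    }
}
```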


Note:

Looking at the groupingBy and partitioningBy examples, their effects appear identical: both split the stream's data and return a Map. The examples may give you that misconception, but in fact they are quite different. partitioningBy divides the stream according to a predicate, so the returned map always has exactly the two keys true and false; groupingBy classifies by arbitrary values, so the returned map has one key per distinct classifier result.

Joining Strings

Sometimes we want to build a string from the elements of a stream of strings. For example, given Stream.of("1", "2", "3", "4"), we might want to format the stream as "1,2,3,4".

Without streams, we can implement this with a for loop.

ArrayList<Integer> list = new ArrayList<>();
list.add(1);
list.add(2);
list.add(3);
list.add(4);

StringBuilder sb = new StringBuilder();

for (Integer it : list) {
    if (sb.length() > 0) {
        sb.append(",");
    }
    sb.append(it);
}
System.out.println(sb.toString());
Print results:
// 1,2,3,4

In Java 8, we can do this with a stream, using Collectors.joining to collect the values, which makes it easy to obtain a string from a stream. The three-argument joining takes a delimiter (used to separate elements), a prefix and a suffix.

Example:

String strJoin = Stream.of("1", "2", "3", "4")
        .collect(Collectors.joining(",", "[", "]"));
System.out.println("strJoin: " + strJoin);
Print results:
// strJoin: [1,2,3,4]
Composing Collectors

We have seen how powerful and useful collectors are. Would they be even more powerful composed together? Look back at the grouping example: we obtained the grouped data as lists, collectGroup: {false=[1, 2, 3], true=[4]}. Suppose the requirement goes further: we do not need the grouped lists themselves, only the size of each group.

Many people's first instinct is to traverse the map and call List.size() on each value, which easily yields the count for each group.

// partition into blocks of data
Map<Boolean, List<Integer>> collectParti = Stream.of(1, 2, 3, 4)
        .collect(Collectors.partitioningBy(it -> it % 2 == 0));

Map<Boolean, Integer> mapSize = new HashMap<>();
collectParti.entrySet()
        .forEach(entry -> mapSize.put(entry.getKey(), entry.getValue().size()));

System.out.println("mapSize: " + mapSize);
Print results:
// mapSize: {false=2, true=2}

But the partitioningBy method has a variant:

Map<Boolean, Long> partiCount = Stream.of(1, 2, 3, 4)
        .collect(Collectors.partitioningBy(it -> it.intValue() % 2 == 0,
                Collectors.counting()));
System.out.println("partiCount: " + partiCount);
Print results:
// partiCount: {false=2, true=2}

In this variant of partitioningBy, we pass not only the predicate but also a second collector, which collects a subset of the final result; it is called the downstream collector. If a collector is a recipe for producing the final result, the downstream collector is a recipe for producing a partial result, used inside the main collector. This way of composing collectors makes them even more powerful in the stream library.

The collectors customized for primitive types, such as averagingInt or summarizingLong, are in fact equivalent to calling the corresponding methods on the primitive-specialized streams; beyond that, they can also be used as downstream collectors.
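groupingBy accepts a downstream collector in the same way as partitioningBy; a sketch counting the members of each group directly:

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class DownstreamDemo {
    public static void main(String[] args) {
        // groupingBy with a downstream collector: count the elements per group
        Map<Integer, Long> countByLength = Stream.of("a", "bb", "cc", "ddd")
                .collect(Collectors.groupingBy(String::length, Collectors.counting()));
        System.out.println(countByLength); // {1=1, 2=2, 3=1}
    }
}
```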
