Is Spring Batch's ItemWriter a singleton class?


Sayak Banerjee

I have a batch job with the following definitions:

<batch:job id="job">
    <batch:split id="main" task-executor="simpleAsyncTaskExecutor">
        <batch:flow>
            <batch:step id="getAccountDetails">
                <batch:tasklet ref="getAccountDetailsTasklet"/>
            </batch:step>
        </batch:flow>
        <batch:flow>
            <batch:step id="processAccounts">
                <batch:tasklet transaction-manager="transactionManager" task-executor="threadPoolTaskExecutor" throttle-limit="${processor.maxThreads}">
                    <batch:chunk reader="queueReader" writer="myCustomItemWriter" commit-interval="${processor.commitInterval}"/>
                </batch:tasklet>
            </batch:step>
        </batch:flow>
    </batch:split>
</batch:job>

myCustomItemWriter basically iterates over the list of accounts passed by the queueReader and submits them to the database.

The step scales up to 100 threads processing chunks in parallel. In myCustomItemWriter's class, I have a private property that maintains the running sum of a specific BigDecimal property for each account it handles. So if there are 10000 accounts, I will have 100 threads each handling 100 accounts, and I want to capture the sum of this attribute across all 10000 accounts.

Here's my question: Is the ItemWriter a singleton (so a single private property to maintain this amount is enough)? If not, should I define the counter as an AtomicReference bean and inject it into my writer, so that the same instance is shared by all 100 threads?
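For reference, here is a minimal sketch of what such a writer could look like if it is left as a singleton, keeping the running total in an AtomicReference<BigDecimal> so that concurrent chunks from the worker threads can accumulate it safely. The Account type, getAmount() accessor, and class name are hypothetical placeholders, and the write signature assumes a pre-5.x ItemWriter that receives a List:

import java.math.BigDecimal;
import java.util.List;
import java.util.concurrent.atomic.AtomicReference;

import org.springframework.batch.item.ItemWriter;

// Hypothetical sketch: a singleton writer that accumulates a BigDecimal total
// across all worker threads of a multi-threaded step.
public class MyCustomItemWriter implements ItemWriter<Account> {

    // Thread-safe running total shared by every worker thread.
    private final AtomicReference<BigDecimal> total =
            new AtomicReference<>(BigDecimal.ZERO);

    @Override
    public void write(List<? extends Account> accounts) throws Exception {
        for (Account account : accounts) {
            // submit the account to the database (details omitted)
            // then accumulate the BigDecimal attribute atomically
            total.accumulateAndGet(account.getAmount(), BigDecimal::add);
        }
    }

    public BigDecimal getTotal() {
        return total.get();
    }
}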

Rafik Berdi

If you annotate the writer with @Component, the default scope is singleton.

However, batch artifacts declared in the job XML can be bound to scopes tied to the batch lifecycle; in that case they are instantiated lazily and are only valid within the scope they belong to. There are two such scopes: job and step.

In your case, you can annotate your CustomItemWriter with @Scope("step"). Since you are running a multi-threaded step, each thread will create its own instance of myCustomItemWriter, and that instance is only valid for the current step execution.
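As a sketch of the step-scope approach described above (assuming annotation-based configuration alongside the XML, and the same hypothetical Account type), the writer could be declared like this:

import java.math.BigDecimal;
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

// Hypothetical sketch: a step-scoped writer. The instance is created for the
// step execution rather than living as an application-wide singleton, so any
// state it holds is only valid for that execution.
@Component("myCustomItemWriter")
@Scope("step")
public class MyCustomItemWriter implements ItemWriter<Account> {

    private BigDecimal total = BigDecimal.ZERO;

    @Override
    public void write(List<? extends Account> accounts) throws Exception {
        for (Account account : accounts) {
            // submit the account to the database (details omitted)
            total = total.add(account.getAmount());
        }
    }
}

Spring Batch also provides the @StepScope convenience annotation, which is equivalent to @Scope(value = "step", proxyMode = ScopedProxyMode.TARGET_CLASS) and is usually preferred when injecting the step-scoped bean into singleton components.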
