Related
Sayak Banerjee I have a batch job with the following definition: <batch:job id="job">
<batch:split id="main" task-executor="simpleAsyncTaskExecutor">
<batch:flow>
<batch:step id="getAccountDetails">
<batch:tasklet ref="
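The snippet above is cut off mid-attribute; for orientation, a <batch:split> that runs two flows in parallel generally takes the shape below. This is only a sketch: every name after getAccountDetails (the second step, both tasklet bean refs) is invented for illustration.

```xml
<batch:job id="job">
    <batch:split id="main" task-executor="simpleAsyncTaskExecutor">
        <batch:flow>
            <batch:step id="getAccountDetails">
                <batch:tasklet ref="getAccountDetailsTasklet"/>
            </batch:step>
        </batch:flow>
        <batch:flow>
            <batch:step id="getAddressDetails">
                <batch:tasklet ref="getAddressDetailsTasklet"/>
            </batch:step>
        </batch:flow>
    </batch:split>
</batch:job>
```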
j I am writing a Spring Batch process to migrate datasets from one system to another. In this case it's as simple as using the RowMapper implementation to build the object from the query before handing it over to the ItemWriter. The ItemWriter call to the save method
username I have a spring batch job in which an ItemWriter publishes to a web service. I have an input file that I have to convert and publish to a web service, but I can't access the database directly. My question is, how should I handle the failure response o
ChikChak : I have a Spring Boot application and I wrote some ItemWriters and ItemReaders for JSON and CSV files. I would like to add a step that compresses to GZIP and decompresses from GZIP. I wonder if it is possible to use Java streams as usual - If I
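The GZIP part of this does work with plain java.util.zip streams: wrapping the underlying stream in a GZIPOutputStream (or GZIPInputStream) is enough, independent of the Spring Batch wiring. A minimal round-trip sketch (the CSV payload here is invented sample data):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {

    // Compress raw bytes with GZIP.
    static byte[] gzip(byte[] data) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray();
    }

    // Decompress GZIP bytes back to the original payload.
    static byte[] gunzip(byte[] data) {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            return gz.readAllBytes();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String csv = "id,name\n1,alice\n2,bob\n";
        byte[] packed = gzip(csv.getBytes(StandardCharsets.UTF_8));
        String restored = new String(gunzip(packed), StandardCharsets.UTF_8);
        System.out.println(restored.equals(csv)); // prints true
    }
}
```

In a batch resource, the same wrapping would be applied to the file stream handed to the reader or writer.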
caffeine I am trying to use Spring Batch 2.2.5 with Java configuration. Here is my configuration: @Configuration
@EnableBatchProcessing
public class JobConfiguration {
@Autowired
private JobBuilderFactory jobBuilder;
@Autowired
private StepBui
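The configuration above is truncated mid-field; a minimal sketch of how such a class typically continues, assuming Spring Batch 2.2's JobBuilderFactory/StepBuilderFactory API (the bean, job, and step names here are invented):

```java
@Configuration
@EnableBatchProcessing
public class JobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilder;

    @Autowired
    private StepBuilderFactory stepBuilder;

    @Bean
    public Job exampleJob() {
        return jobBuilder.get("exampleJob")
                .start(exampleStep())
                .build();
    }

    @Bean
    public Step exampleStep() {
        return stepBuilder.get("exampleStep")
                .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                .build();
    }
}
```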
Chat day: Our processor returns a List<?> (so the ItemWriter effectively receives a List<List<?>>). Now, we observe that JdbcBatchItemWriter has not been programmed to handle item instanceof List; we need to wri
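Since JdbcBatchItemWriter does not unwrap nested lists itself, one common approach is a small delegating ItemWriter that flattens each chunk before passing it to the real writer. The flattening step itself is plain Java; the Spring wiring around it is omitted here:

```java
import java.util.ArrayList;
import java.util.List;

public class Flatten {

    // Flatten one chunk of List<List<T>> into a single List<T>
    // before handing it to the delegate writer.
    static <T> List<T> flatten(List<? extends List<? extends T>> chunk) {
        List<T> out = new ArrayList<>();
        for (List<? extends T> inner : chunk) {
            out.addAll(inner);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(flatten(List.of(List.of(1, 2), List.of(3)))); // prints [1, 2, 3]
    }
}
```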
username Currently, I can marshal the XML below with no problem, but while reading the documentation I end up with a PublicFigure object and a list of Associate items. Below is my current implementation. My XML: <PublicFigure id="101">
<Associate id="102"
Manas Saxena I want to get the last processed item in the ItemWriter and write one of the item's string values to the execution context. I am using chunk-oriented writing, so the write method will be called multiple times; how can I handle this? Here
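In chunk-oriented writing, ItemWriter.write receives one chunk per call, so the "last processed item" of each call is just the final list element. A plain-Java sketch of that part (persisting the value into the step's ExecutionContext, e.g. via a StepExecutionListener, is omitted):

```java
import java.util.List;

public class ChunkTail {

    // The write method receives one chunk per call; the last processed
    // item of that call is simply the final element of the list.
    static <T> T lastOf(List<? extends T> items) {
        return items.isEmpty() ? null : items.get(items.size() - 1);
    }

    public static void main(String[] args) {
        System.out.println(lastOf(List.of("a", "b", "c"))); // prints c
    }
}
```

Because write is called once per chunk, storing the result after each call leaves the overall last item in place when the step finishes.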
sore fingers I'm very new to Spring Batch and I'm wondering whether the step id can be accessed from an ItemReader or ItemWriter? In my case, this would allow enumeration types to be switched based on different step definitions within a single ItemReader implementation.
Alireza Alallah First of all, thanks for your attention. I combined Spring Batch and Spring Integration: I defined a workflow that retrieves files from an FTP adapter and sends them to jobChannel, then uses Spring Batch to process them. I want to write to an output channel
Dongning Min I am using Spring Batch with Java configuration (batch-core 3) and two datasources: one for reading from DB (A) and one for writing to DB (B). I want to write a job that configures an ItemReader to read data from (A) and an ItemWriter to write data to (B). (Not just d
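A common wiring for such a reader/writer pair, sketched with hypothetical names (dataSourceA, dataSourceB, the Account type, and both SQL statements are invented) and assuming Spring Batch's JdbcCursorItemReader and JdbcBatchItemWriter:

```java
@Bean
public JdbcCursorItemReader<Account> reader(@Qualifier("dataSourceA") DataSource dsA) {
    JdbcCursorItemReader<Account> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dsA);                      // read from DB (A)
    reader.setSql("SELECT id, name FROM account");  // hypothetical query
    reader.setRowMapper(new BeanPropertyRowMapper<>(Account.class));
    return reader;
}

@Bean
public JdbcBatchItemWriter<Account> writer(@Qualifier("dataSourceB") DataSource dsB) {
    JdbcBatchItemWriter<Account> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(dsB);                      // write to DB (B)
    writer.setSql("INSERT INTO account (id, name) VALUES (:id, :name)");
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
    return writer;
}
```

The @Qualifier annotations are what keep the two DataSource beans from colliding during injection.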
Oley 77 I read a flat file (e.g. a .csv file with one row per user, such as: UserId; Data1; Date2). But how do I deal with duplicate user items in the reader (there is no list of previously read users here...)? stepBuilderFactory.get("createUserStep1")
.<User, User>chunk(
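One way to drop duplicate users is to keep a set of already-seen ids and filter on it. In Spring Batch this check would typically sit in an ItemProcessor that returns null for repeats (null items are silently filtered out of the chunk); the set-based check itself is plain Java:

```java
import java.util.HashSet;
import java.util.Set;

public class SeenFilter {

    private final Set<String> seen = new HashSet<>();

    // Returns true the first time a userId appears, false for duplicates.
    public boolean firstTime(String userId) {
        return seen.add(userId);
    }

    public static void main(String[] args) {
        SeenFilter filter = new SeenFilter();
        System.out.println(filter.firstTime("u1")); // prints true
        System.out.println(filter.firstTime("u1")); // prints false
    }
}
```

Note the set grows with the input; for very large files a persisted staging table or a pre-sort on UserId would be the sturdier choice.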
Nikonj Sharma I sometimes have this problem. I am using spring-batch 3.0.7. The problem is that in the case of an org.springframework.dao.DataIntegrityViolationException on one record in the ItemWriter, even though a skipPolicy is provided (returning true for all exceptions),
Victor Suarez I have a Spring Batch application that takes a file from a Samba server and generates a new file in a different folder on the same server. However, only the ItemReader is called in the process. What is the problem? Thanks. Batch configuration: @Configu