Hi everybody, I have a problem: when I add the table prefix in the application properties, Spring Batch doesn't pick up the property and keeps the default prefix BATCH_.
Any idea?
Error:
```
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
```
Property:
```properties
spring.batch.table-prefix=SOMETHING.BATCH_
```
Spring Version
```xml
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.1.4.RELEASE</version>
</parent>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <version>2.0.4.RELEASE</version>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-batch</artifactId>
    <version>2.1.2.RELEASE</version>
</dependency>
```
Comment From: snicoll
@marianogambino there's no way for us to tell with the information you've provided. Please share a minimal sample (github repo or zip) that we can run ourselves to reproduce the issue.
(Unrelated, but you shouldn't mix and match Spring Boot versions as your pom snippet seems to indicate. As you're using the starter parent, you should remove the versions and let the parent provide the right ones.)
Comment From: marianogambino
@snicoll I'm sorry, I can't share the source code because it belongs to a business project.
I need to connect to the database (Oracle) using a specific schema, so I added spring.batch.table-prefix=SOMETHING.BATCH_ to application.properties. But when Spring runs the query SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?, it throws an exception because the table BATCH_JOB_INSTANCE does not exist [ORA-00942: table or view does not exist].
This happens because the tables live in another schema and Spring still uses the default tablePrefix BATCH_ instead of SOMETHING.BATCH_, so when it concatenates the prefix the result is BATCH_JOB_INSTANCE rather than SOMETHING.BATCH_JOB_INSTANCE.
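To make the concatenation concrete, here is a small illustrative sketch (my own, not the actual Spring Batch source) of how a prefix ends up in the query from the error above:

```java
// Illustrative only: the metadata DAOs build their SQL by substituting the
// configured table prefix into a query template.
public class PrefixExample {

    public static void main(String[] args) {
        String template = "SELECT JOB_INSTANCE_ID, JOB_NAME from %PREFIX%JOB_INSTANCE "
                + "where JOB_NAME = ? and JOB_KEY = ?";

        // Default prefix: targets BATCH_JOB_INSTANCE in the current schema -> ORA-00942 here.
        System.out.println(template.replace("%PREFIX%", "BATCH_"));

        // Schema-qualified prefix: targets SOMETHING.BATCH_JOB_INSTANCE instead.
        System.out.println(template.replace("%PREFIX%", "SOMETHING.BATCH_"));
    }
}
```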
Thank you very much!
Comment From: snicoll
Thanks for the feedback.
> I'm sorry, I can't share the source code because it belongs to a business project.
I am not asking you to share your application. A minimal sample that shows the problem would be enough. It looks like this SO question covers some of the things you're trying to do. To be sure we're talking about the same thing, please share a sample we can run that reproduces the problem.
Comment From: marianogambino
Hi @snicoll. After some research and testing, I injected a BatchConfigurer bean and added it to my AppConfig:
```java
@EnableBatchProcessing
@Configuration
public class MultipleDBConfig {

    @Bean(name = "datasource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource oracleDataSource(@Value("${spring.datasource.driverClassName}") String driver,
                                       @Value("${spring.datasource.url}") String url,
                                       @Value("${spring.datasource.username}") String user,
                                       @Value("${spring.datasource.password}") String password,
                                       @Value("${spring.datasource.poolSize}") Integer poolSize) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver);
        dataSource.setUrl(url);
        dataSource.setUsername(user);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean(name = "jdbcTemplate")
    public JdbcTemplate jdbcTemplate(@Qualifier("datasource") DataSource dsOracle) {
        return new JdbcTemplate(dsOracle);
    }

    @Primary
    @Bean(name = "dsAudit")
    @ConfigurationProperties(prefix = "audit.datasource")
    public DataSource primarydataSource(@Value("${audit.datasource.driverClassName}") String driver,
                                        @Value("${audit.datasource.url}") String url,
                                        @Value("${audit.datasource.username}") String user,
                                        @Value("${audit.datasource.password}") String password,
                                        @Value("${audit.datasource.poolSize}") Integer poolSize) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver);
        dataSource.setUrl(url);
        dataSource.setUsername(user);
        dataSource.setPassword(password);
        return dataSource;
        //return DataSourceBuilder.create().build();
    }

    @Bean(name = "auditJdbcTemplate")
    public JdbcTemplate auditJdbcTemplate(@Qualifier("dsAudit") DataSource dsOracle) {
        return new JdbcTemplate(dsOracle);
    }

    public PlatformTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public BatchConfigurer configurer(@Qualifier("dsAudit") DataSource dataSource) {
        return new DefaultBatchConfigurer() {
            @Override
            protected JobRepository createJobRepository() throws Exception {
                JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
                factory.setDataSource(dataSource);
                factory.setTransactionManager(transactionManager());
                factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
                factory.setTablePrefix("SOMETHING.BATCH_");
                factory.setDatabaseType("ORACLE");
                factory.setMaxVarCharLength(1000);
                return factory.getObject();
            }

            @Override
            protected JobLauncher createJobLauncher() throws Exception {
                SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
                jobLauncher.setJobRepository(createJobRepository());
                jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
                jobLauncher.afterPropertiesSet();
                return jobLauncher;
            }
        };
    }
}
```
The problem now is that I have two scheduled jobs: when they start, one of them does use the configured prefix (SOMETHING.BATCH_), but the other one (the errors job) keeps the default prefix (BATCH_).
This is the code that launches the jobs:
```java
@Component
public class PersonaLauncher {

    private static final Logger LOGGER = LoggerFactory.getLogger(PersonaLauncher.class);

    @Autowired
    JobLauncher jobLauncherBc;

    private final Job job;
    private final Job errorsJob;

    @Autowired
    JobRepository jobRepository;

    @Autowired
    @Qualifier("dsAudit")
    DataSource dataSourceAudit;

    @Autowired
    @Qualifier("datasource")
    DataSource dataSource;

    @Value("${maxDaysPerQuery}")
    Integer maxDaysPerQuery;

    @Value("${errors.maxRetries}")
    Integer errorsMaxRetries;

    @Value("${reader.reprocess.seconds}")
    Integer minusSeconds;

    @Autowired
    PersonaLauncher(@Qualifier(JobConstant.PUCTOBCJOB) Job job, @Qualifier(JobConstant.ERRROSTOBCJOB) Job errorsJob) {
        this.job = job;
        this.errorsJob = errorsJob;
    }

    @Scheduled(cron = "${errors.to.bc.job.cron}")
    public boolean launchErrorsJob() {
        LOGGER.info("Starting errors to Blockchain job");
        boolean flag = false;
        try {
            jobLauncherBc.run(errorsJob, newExecution(errorsJob.getName()));
        } catch (Exception e) {
            e.printStackTrace();
            return flag;
        } finally {
            return flag;
        }
    }

    @Scheduled(cron = "${database.to.bc.job.cron}")
    public boolean launchCsvFileToDatabaseJob() {
        LOGGER.info("Starting puc to Blockchain job");
        boolean flag = false;
        try {
            jobLauncherBc.run(job, newExecution(job.getName()));
        } catch (Exception e) {
            e.printStackTrace();
            return flag;
        } finally {
            return flag;
        }
    }

    private JobParameters newExecution(String jobName) throws Exception {
        Map<String, JobParameter> parameters = new HashMap<>();
        JobParameter parameter = new JobParameter(new Date());
        parameters.put(JobConstant.CURRENTTIME, parameter);
        String lastDate = LastExecutionReader.getLastExecution(dataSourceAudit, dataSource, maxDaysPerQuery);
        parameters.put(JobConstant.LASTEXECUTION, new JobParameter(calculateLastExecutionToRead(lastDate)));
        if (jobName.equals(JobConstant.ERRROSTOBCJOB)) {
            parameters.put(JobConstant.TOTALROWSCOUNT, new JobParameter(getTotalErrCount()));
        } else {
            parameters.put(JobConstant.LASTEXECUTIONTOSAVE, new JobParameter(calculateLastExecutionToSave(lastDate)));
            parameters.put(JobConstant.TOTALROWSCOUNT, new JobParameter(getTotalCount(lastDate)));
        }
        return new JobParameters(parameters);
    }
}
```
Comment From: snicoll
@marianogambino thanks for the feedback, but that's not what I had in mind by "a minimal sample (github repo or zip) that we can run ourselves to reproduce the issue".
Given you are configuring most of the infrastructure yourself, I can't really see how that's a Spring Boot problem at this point, but I've pinged @benas to get more insights.
In the meantime, I can see you asked the same question on StackOverflow. In the interest of not duplicating efforts, I am going to close this one.
Comment From: marianogambino
Hey @snicoll, I solved the problem and it's working fine now. I injected a new JobRepository created with a JobRepositoryFactoryBean. In the JobRepositoryFactoryBean I configured the data source and the table prefix, so when the jobs start, the JdbcJobInstanceDao uses the right prefix.
Here is the solution:
```java
private PlatformTransactionManager transactionManager() {
    return new ResourcelessTransactionManager();
}

private JobRepository createJobRepository() throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSourceAudit);
    factory.setTransactionManager(transactionManager());
    factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
    factory.setTablePrefix("SOMETHING.BATCH_");
    factory.setDatabaseType("ORACLE");
    factory.setMaxVarCharLength(1000);
    return factory.getObject();
}

@Bean
public JobLauncher jobLauncherBc() throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.setJobRepository(createJobRepository());
    return jobLauncher;
}
```
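For anyone wiring this up from scratch, here is a self-contained sketch of the same idea. The class name, the dsAudit qualifier, and the choice to read the prefix back from spring.batch.table-prefix instead of hard-coding it are just illustrative; adapt them to your own setup:

```java
import javax.sql.DataSource;

import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchMetadataConfig {

    // Illustrative: reuse the same property so the prefix stays in application.properties.
    @Value("${spring.batch.table-prefix:BATCH_}")
    private String tablePrefix;

    private PlatformTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // JobRepository pointing at the schema that actually holds the metadata tables.
    private JobRepository createJobRepository(DataSource dataSource) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager());
        factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
        factory.setTablePrefix(tablePrefix);          // e.g. SOMETHING.BATCH_
        factory.setDatabaseType("ORACLE");
        factory.setMaxVarCharLength(1000);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public JobLauncher jobLauncherBc(@Qualifier("dsAudit") DataSource dataSourceAudit) throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        jobLauncher.setJobRepository(createJobRepository(dataSourceAudit));
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}
```

Every job launched through jobLauncherBc then goes through a JobRepository whose DAOs use the schema-qualified prefix.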
Thank you very much for helping me!
Comment From: javaHelper
Could you please guide me here: https://stackoverflow.com/questions/59544730/spring-batch-how-to-create-metadata-tables-on-different-schema/59544931#59544931 ?
Comment From: sohail-inf
Reason for this error: Spring Boot / Spring Batch needs its metadata tables to track batch job information. If the tables are not available in the database, we need to let Spring create them. To configure this, add the following line to application.properties:
```properties
spring.batch.initialize-schema=always
```
If this doesn't work, you can create the tables manually by running the schema script. See: https://poopcode.com/bad_sql_grammar_batch_job_instance/
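Alternatively, here is a rough sketch of running the DDL that ships inside spring-batch-core yourself. The script name schema-oracle10g.sql is an assumption based on Spring Batch 4.x; check the script bundled with your version:

```java
import javax.sql.DataSource;

import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

public class BatchSchemaInitializer {

    // Runs the Oracle DDL bundled with spring-batch-core against the given DataSource.
    // Assumption: Spring Batch 4.x, where the Oracle script is schema-oracle10g.sql.
    public static void createBatchTables(DataSource dataSource) {
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
        populator.addScript(new ClassPathResource("org/springframework/batch/core/schema-oracle10g.sql"));
        populator.execute(dataSource);
    }
}
```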