In a Postgres database there is the column type bytea. According to the PostgreSQL JDBC driver documentation on Storing Binary Data, this is how to write a file into a bytea column with a PreparedStatement:

File file = new File("myimage.gif");
try (FileInputStream fis = new FileInputStream(file);
        PreparedStatement ps = conn.prepareStatement("INSERT INTO images VALUES (?, ?)"); ) {
    ps.setString(1, file.getName());
    ps.setBinaryStream(2, fis, (int) file.length());  // <-- setBinaryStream needed here
    ps.executeUpdate();
}

In the current version of JdbcClient (6.1.1), StatementSpec.param() does not support InputStream. I have looked through the underlying code and cannot find anywhere that handles stream parameters.

Without this feature, JdbcClient cannot be used when a bytea column is involved.

Comment From: sbrannen

Have you tried using Spring JDBC's support for BLOBs (Binary Large OBjects) via the LobHandler abstraction -- for example, DefaultLobHandler?
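
Something along these lines, for example (an untested sketch; jdbcTemplate, file and inputStream are placeholders, and the table/column names are made up):

LobHandler lobHandler = new DefaultLobHandler();  // delegates to setBinaryStream by default

jdbcTemplate.execute(
        "INSERT INTO images (name, data) VALUES (?, ?)",
        new AbstractLobCreatingPreparedStatementCallback(lobHandler) {
            @Override
            protected void setValues(PreparedStatement ps, LobCreator lobCreator) throws SQLException {
                ps.setString(1, file.getName());
                // stream the file content; the length is passed along for the driver
                lobCreator.setBlobAsBinaryStream(ps, 2, inputStream, (int) file.length());
            }
        });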

Comment From: dopsun

I only tried using SqlLobValue with the default LobHandler. Below is what it looks like:

InputStream inputStream = new FileInputStream(file);
SqlLobValue lobValue = new SqlLobValue(inputStream, (int) file.length());

this.jdbcClient.sql(sql)
    .param("data", lobValue)
    .update();

It throws an exception with the message "SqlLobValue only supports SQL types BLOB and CLOB".

Comment From: quaff

It should be supported at the JdbcOperations level rather than at the JdbcClient level; I can confirm that it is not supported currently.

package com.example.demo;

import static org.assertj.core.api.Assertions.assertThat;

import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.ImportAutoConfiguration;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase.Replace;
import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest;
import org.springframework.boot.testcontainers.context.ImportTestcontainers;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.springframework.boot.testcontainers.service.connection.ServiceConnectionAutoConfiguration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.jdbc.Sql;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;

@JdbcTest
@ImportAutoConfiguration(classes = ServiceConnectionAutoConfiguration.class)
@AutoConfigureTestDatabase(replace = Replace.NONE)
@Sql(statements = "create table test(value bytea)")
@ImportTestcontainers
public class PostgreSQLByteaTests {

    @Container
    @ServiceConnection
    static PostgreSQLContainer<?> container = new PostgreSQLContainer<>("postgres");

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    void test() {
        InputStream binaryStream = new ByteArrayInputStream("test".getBytes());
        assertThat(jdbcTemplate.update("insert into test values(?)", binaryStream)).isEqualTo(1);
    }

}

The exception thrown is:

org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [insert into test values(?)]
    at org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:112)
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:107)
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:116)
    at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1548)
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:677)
    at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:970)
    at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:1014)
    at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:1024)
    at com.example.demo.PostgreSQLByteaTests.test(PostgreSQLByteaTests.java:35)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
Caused by: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of java.io.ByteArrayInputStream. Use setObject() with an explicit Types value to specify the type to use.
    at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:1051)
    at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
    at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:452)
    at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:249)
    at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:181)
    at org.springframework.jdbc.core.ArgumentPreparedStatementSetter.doSetValue(ArgumentPreparedStatementSetter.java:72)
    at org.springframework.jdbc.core.ArgumentPreparedStatementSetter.setValues(ArgumentPreparedStatementSetter.java:51)
    at org.springframework.jdbc.core.JdbcTemplate.lambda$update$2(JdbcTemplate.java:973)
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:658)
    ... 7 more


Comment From: dopsun

Thanks @quaff. I just changed my code to JdbcTemplate and got the same exception as yours.

For the time being, I have changed my code to use PreparedStatement.
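
Roughly like this, staying on JdbcTemplate via a PreparedStatementCallback (just a sketch against the test table from above; binaryStream and length are whatever the caller has at hand):

// Workaround sketch: drop down to the JDBC API via a PreparedStatementCallback,
// so setBinaryStream is called on the PostgreSQL driver directly.
Integer rows = jdbcTemplate.execute("insert into test values (?)",
        (PreparedStatementCallback<Integer>) ps -> {
            ps.setBinaryStream(1, binaryStream, length);
            return ps.executeUpdate();
        });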

Comment From: quaff

@dopsun I've fixed this in https://github.com/spring-projects/spring-framework/pull/32163; please apply the patched StatementCreatorUtils to your project to verify.

Comment From: dopsun

Great @quaff

I tried your new patch and built locally. It works for my case with JdbcClient.

This is my first time building this repository from source and testing locally; it seems an IDE local cache caused my initial false errors (hence I deleted my previous comment with the error report). After restarting the IDE with a clean state, your patch works for me.

Comment From: quaff

You can try the trick of pasting the changed files into your project's src directory to override the classes in the jars.

Comment From: jhoeller

@dopsun as mentioned, for a BLOB type we have SqlLobValue... but that is indeed restricted to BLOB handling, not covering direct byte array types such as Postgres bytea here. For the Postgres bytea type, you could implement a custom SqlTypeValue that calls setBinaryStream or setBytes in its setTypeValue implementation, and pass that as an argument into JdbcClient. That's probably the best you can do with existing Spring versions.

The plain handling of an InputStream argument in StatementCreatorUtils (as in the PR) might work for simple scenarios, as long as the caller makes sure to close the InputStream afterwards, e.g. when the InputStream never reaches the PreparedStatement. However, we generally do not intend to support such plain "hot" streams as arguments. Also, JDBC drivers usually insist on knowing the length of the stream upfront in order to allocate accordingly, so a plain InputStream object won't be sufficient.

We intend to address the file use case with support for special value types, e.g. Resource handles such as FileSystemResource, that lazily retrieve the InputStream and also provide the content length upfront. Alternatively, we may also support byte[] arguments directly. I'm going to turn this issue into a corresponding enhancement request for 6.1.4.
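
Conceptually, such a value type could look roughly like the following (ResourceValue is only an illustration here, not the final API):

// Illustration only: a Resource-backed SqlTypeValue, not the eventual Spring class.
static class ResourceValue implements SqlTypeValue {

    private final Resource resource;  // e.g. a FileSystemResource

    ResourceValue(Resource resource) {
        this.resource = resource;
    }

    @Override
    public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, @Nullable String typeName)
            throws SQLException {
        try {
            // the Resource supplies the stream lazily and knows the content length upfront
            ps.setBinaryStream(paramIndex, this.resource.getInputStream(), (int) this.resource.contentLength());
        }
        catch (IOException ex) {
            throw new SQLException("Failed to read binary content from " + this.resource, ex);
        }
    }
}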

For that reason, I'm going to close the PR. Thanks for your efforts there in any case, @quaff!

Comment From: dopsun

@jhoeller Thanks.

I tried again with your suggestion of overriding setTypeValue, and it works with the code below.

static class InputStreamSqlLobValue extends SqlLobValue {
  private final InputStream stream;
  private final int length;

  public InputStreamSqlLobValue(InputStream stream, int length) {
    super(stream, length);
    this.stream = stream;
    this.length = length;
  }

  @Override
  public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, @Nullable String typeName) throws SQLException {
    ps.setBinaryStream(paramIndex, this.stream, this.length);
  }
}

Comment From: jhoeller

For the above, you could simply declare InputStreamValue implements SqlTypeValue. There is no need to extend SqlLobValue there; you can just implement the same callback interface that SqlLobValue itself implements.

As for standard value types, it looks like we will introduce a pair of SqlBinaryValue and SqlCharacterValue classes, similar to SqlLobValue in design but delegating to the corresponding PreparedStatement methods for any target SQL type (no LobHandler involved). Those value types can even have BLOB/CLOB/NCLOB support when the corresponding SQL type is specified, e.g. through a SqlParameterValue that combines Types.BLOB with a SqlBinaryValue as value object.
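
Assuming it lands roughly in that shape, usage could look like this (illustration only; SqlBinaryValue does not exist in 6.1.1 yet, and the table names are placeholders):

// Illustration of the planned value types; not available in 6.1.1.
jdbcClient.sql("insert into test values (?)")
        .param(new SqlBinaryValue(inputStream, contentLength))
        .update();

// Explicit BLOB handling by combining it with a SqlParameterValue:
jdbcClient.sql("insert into blobs values (?)")
        .param(new SqlParameterValue(Types.BLOB, new SqlBinaryValue(bytes)))
        .update();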

As a side note, a parameter of type byte[] might work as-is already (without the need for a custom value type) since most database drivers explicitly handle that when passed into PreparedStatement.setObject (which Spring's StatementCreatorUtils uses by default). We can make this more explicit through dedicated setBytes support for a given byte array in StatementCreatorUtils but that's just for consistent behavior across databases.
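
In other words, something like the following might already work against the bytea column from the test above (untested sketch; it reads the whole file into memory):

// byte[] goes through PreparedStatement.setObject, which most drivers map natively.
byte[] bytes = Files.readAllBytes(file.toPath());  // whole content in memory, so small payloads only
jdbcClient.sql("insert into test values (?)")
        .param(bytes)
        .update();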

Comment From: dopsun

In my case, the reason for choosing InputStream over byte[] is my assumption that the underlying driver will have better memory control/optimization with an InputStream, especially for relatively large files. I should admit that I have not validated this assumption with proper benchmarks.

Thanks for your suggestions for improvement. Here is the latest version, which is working:

static class InputStreamValue implements SqlTypeValue {
  private final InputStream stream;
  private final int length;

  public InputStreamValue(InputStream stream, int length) {
    this.stream = stream;
    this.length = length;
  }

  @Override
  public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, @Nullable String typeName) throws SQLException {
    ps.setBinaryStream(paramIndex, this.stream, this.length);
  }
}
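
The call site stays the same shape as my earlier SqlLobValue attempt, just with the new value type (sketch):

// Sketch of the call site (inside a method declaring throws IOException):
try (InputStream inputStream = new FileInputStream(file)) {
    this.jdbcClient.sql(sql)
        .param("data", new InputStreamValue(inputStream, (int) file.length()))
        .update();
}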