Hi,

Starting from this project: https://spring.io/blog/2015/03/22/using-google-protocol-buffers-with-spring-mvc-based-rest-services I tried to get a working controller with WebFlux / Protobuf support. This is the code after my modifications:

package com.keeneye.io.server;

import java.util.Arrays;
import java.util.Collection;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import demo.CustomerProtos;
import reactor.core.publisher.Flux;

@SpringBootApplication
public class ServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ServerApplication.class, args);
    }

    @Bean
    ProtobufHttpMessageConverter protobufHttpMessageConverter() {
        return new ProtobufHttpMessageConverter();
    }

    private CustomerProtos.Customer customer(int id, String f, String l, Collection<String> emails) {
        Collection<CustomerProtos.Customer.EmailAddress> emailAddresses = emails.stream()
                .map(e -> CustomerProtos.Customer.EmailAddress.newBuilder()
                        .setType(CustomerProtos.Customer.EmailType.PROFESSIONAL).setEmail(e).build())
                .collect(Collectors.toList());

        return CustomerProtos.Customer.newBuilder().setFirstName(f).setLastName(l).setId(id).addAllEmail(emailAddresses)
                .build();
    }

    @Bean
    CustomerRepository customerRepository() {
        Map<Integer, CustomerProtos.Customer> customers = new ConcurrentHashMap<>();
        // populate with some dummy data
        Arrays.asList(customer(1, "Chris", "Richardson", Arrays.asList("crichardson@email.com")),
                customer(2, "Josh", "Long", Arrays.asList("jlong@email.com")),
                customer(3, "Matt", "Stine", Arrays.asList("mstine@email.com")),
                customer(4, "Russ", "Miles", Arrays.asList("rmiles@email.com")))
                .forEach(c -> customers.put(c.getId(), c));

        // our lambda just delegates to Map#get(Integer)
        return id -> Flux.just(customers.get(id));
    }

}

interface CustomerRepository {
    Flux<CustomerProtos.Customer> findById(int id);
}

@RestController
class CustomerRestController {

    @Autowired
    private CustomerRepository customerRepository;

    @RequestMapping("/customers/{id}")
    Flux<CustomerProtos.Customer> customer(@PathVariable Integer id) {
        return this.customerRepository.findById(id);
    }

}

If you run this code, after having generated the protobuf classes, decoding the response fails:

cat customer.msg | protoc --encode demo.Customer customer.proto | curl -sS -X POST -H "Content-Type: application/x-protobuf" --data-binary @- http://localhost:8080/customers/1 | protoc --decode demo.Customer customer.proto

But if you switch Flux to Mono in the Java code, the decoding succeeds. More surprisingly, with Flux the decoding also succeeds if you skip the first byte of the response:

cat customer.msg | protoc --encode demo.Customer customer.proto | curl -sS -X POST  -H "Content-Type: application/x-protobuf" --data-binary @- http://localhost:8080/customers/1 | tail -c +2 | protoc --decode demo.Customer customer.proto

Is it mandatory to use Mono instead of Flux to make this work? Why does Flux work once a single byte is skipped? It seems there's an issue somewhere.
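One way to see what that extra byte could be: protobuf defines a size-delimited framing in which each message is preceded by its length encoded as a base-128 varint. This is a JDK-only sketch of that framing (the payload array just stands in for an already-serialized Customer message, not actual wire bytes from this app):

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;

// Sketch of protobuf's size-delimited framing: each message is
// preceded by its length, encoded as a base-128 varint.
public class DelimitedFraming {

    // Minimal varint encoder, as used by the protobuf wire format:
    // 7 bits per byte, high bit set on all bytes except the last.
    static void writeVarint(ByteArrayOutputStream out, int value) {
        while ((value & ~0x7F) != 0) {
            out.write((value & 0x7F) | 0x80);
            value >>>= 7;
        }
        out.write(value);
    }

    // Prepend the varint-encoded length to a serialized payload.
    static byte[] frame(byte[] payload) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writeVarint(out, payload.length);
        out.write(payload, 0, payload.length);
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] payload = new byte[42]; // pretend this is a serialized message
        byte[] framed = frame(payload);

        System.out.println(framed.length);     // 43
        System.out.println(framed[0]);         // 42 (the length prefix)
        // Dropping the prefix recovers the original payload, which is
        // exactly what `tail -c +2` does for messages under 128 bytes
        // (a single-byte varint prefix).
        System.out.println(Arrays.equals(
                Arrays.copyOfRange(framed, 1, framed.length), payload)); // true
    }
}
```

If the response really is framed this way, a message shorter than 128 bytes carries exactly one prefix byte, which would explain why skipping one byte makes protoc --decode happy again.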

Comment From: rstoyanchev

We seem to be lacking coverage of Protobuf in the Codecs section, and that is something we could improve. However, the Javadoc for ProtobufEncoder does mention this:

Flux are serialized using delimited Protobuf messages with the size of each message specified before the message itself. Single values are serialized using regular Protobuf message format (without the size prepended before the message).

And if you follow the link there, you'll read more about the delimited format used to stream messages. I did not run the above, but in all likelihood protoc expects a single message and stumbles on the initial byte(s) that encode the size of the first message.
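The delimited format described above can be consumed by splitting the stream on its varint length prefixes. Here is a JDK-only sketch of that (real code would hand each payload to CustomerProtos.Customer.parseFrom, or simply call the generated parseDelimitedFrom on the stream):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

// Sketch: split a size-delimited protobuf stream back into the
// individual serialized message payloads.
public class DelimitedReader {

    // Minimal base-128 varint decoder (length prefixes only).
    static int readVarint(InputStream in) throws IOException {
        int result = 0, shift = 0, b;
        while ((b = in.read()) != -1) {
            result |= (b & 0x7F) << shift;
            if ((b & 0x80) == 0) return result;
            shift += 7;
        }
        throw new IOException("truncated varint");
    }

    // Read length-prefixed payloads until the stream is exhausted.
    static List<byte[]> split(byte[] stream) throws IOException {
        List<byte[]> messages = new ArrayList<>();
        ByteArrayInputStream in = new ByteArrayInputStream(stream);
        while (in.available() > 0) {
            int size = readVarint(in);
            byte[] payload = new byte[size];
            if (in.read(payload) != size) {
                throw new IOException("truncated message");
            }
            messages.add(payload);
        }
        return messages;
    }

    public static void main(String[] args) throws IOException {
        // Two framed payloads: 3 bytes {1,2,3} and 2 bytes {9,9}.
        byte[] stream = {3, 1, 2, 3, 2, 9, 9};
        System.out.println(split(stream).size()); // 2
    }
}
```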

We do have integration tests in ProtobufIntegrationTests covering both Flux and Mono, so I'm closing this, but feel free to comment.

Comment From: imatmati

Hi,

Thanks, I'll have a look at the integration tests; maybe I'll find the solution there. Meanwhile, I created a Java client and got the error message: Protocol message contained an invalid tag (zero). I'll investigate through the tests you provided. Thank you very much. Have a nice day.
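For context on that error: in the protobuf wire format every field starts with a tag byte encoding (field_number << 3) | wire_type, and field number zero is invalid, which is exactly what "invalid tag (zero)" complains about. A parser that consumes a length prefix (or padding) as if it were a tag will misinterpret it; this is only an illustrative sketch of that decoding, not the actual bytes the client received:

```java
// Sketch: how a single byte is interpreted as a field tag by a
// protobuf parser. A stray 0x00 decodes to field number 0, which the
// Java runtime rejects with "Protocol message contained an invalid
// tag (zero)".
public class TagDecode {

    static int fieldNumber(int tagByte) { return tagByte >>> 3; }

    static int wireType(int tagByte) { return tagByte & 0x7; }

    public static void main(String[] args) {
        int prefix = 42; // e.g. a one-byte length prefix (message size 42)
        System.out.println(fieldNumber(prefix)); // 5 -- misread as field 5
        System.out.println(wireType(prefix));    // 2 -- misread as length-delimited
        System.out.println(fieldNumber(0x00));   // 0 -- the invalid tag case
    }
}
```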