As you can see, I am using retrieve(), but the connections are still not closing.
Code that is causing this issue:
WebClient webClient = WebClient.create("http://192.168.101.8:8066");

public Mono<City> getCityById(Integer id) {
    return webClient.get()
            .uri("/cities/" + id)
            .header(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
            .retrieve()
            .bodyToMono(City.class);
}
Total sockets opened by this API in a Docker container:
/ # lsof -p 1 | grep socket | wc -l
4
/ # lsof -p 1 | grep socket | wc -l
6
/ # lsof -p 1 | grep socket | wc -l
5
It is creating two connections for one request and then closing only one. I am not sure why.
Comment From: bclozel
The Reactor Netty client uses a connection pool in order to reuse existing connections to the same host.
- are you reusing the same client instance for the whole application?
- are you seeing the number of connections grow without limit?
You can configure the connection pool as described in the Reactor documentation by configuring your own ReactorClientHttpConnector on the WebClient builder.
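For illustration, here is a minimal sketch of that wiring, assuming reactor-netty and spring-webflux are on the classpath; the pool name, connection cap, and timeout value are placeholder assumptions, not recommended settings:

import java.time.Duration;

import org.springframework.http.client.reactive.ReactorClientHttpConnector;
import org.springframework.web.reactive.function.client.WebClient;

import reactor.netty.http.client.HttpClient;
import reactor.netty.resources.ConnectionProvider;

// Custom pool with an explicit cap and an idle timeout so unused connections are released.
ConnectionProvider provider = ConnectionProvider.builder("custom-pool")
        .maxConnections(50)
        .maxIdleTime(Duration.ofSeconds(20))
        .build();

HttpClient httpClient = HttpClient.create(provider);

// Plug the customized Reactor Netty HttpClient into the WebClient builder.
WebClient webClient = WebClient.builder()
        .baseUrl("http://192.168.101.8:8066")
        .clientConnector(new ReactorClientHttpConnector(httpClient))
        .build();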
Comment From: j-arpit
So Netty will create this pool of 500 connections by default for reuse purposes. How will I know which connections belong to the reuse pool and which ones are created by a memory leak? As far as I have checked, exchange() is causing a memory leak.
Comment From: bclozel
I don't think there's a memory leak here - this is merely about the behavior of the reactor-netty connection pool. Here you're showing that the client might be creating/borrowing several connections and that they remain in the pool for some time.
If this behavior doesn't work for you, you can configure the max idle time or max life time so that connections are closed in a timely fashion.
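As a rough sketch of those two settings (the durations below, and the background eviction interval, are placeholder values I'm assuming for illustration), they are configured on the same ConnectionProvider builder:

ConnectionProvider provider = ConnectionProvider.builder("custom-pool")
        .maxIdleTime(Duration.ofSeconds(20))        // close connections idle longer than this
        .maxLifeTime(Duration.ofSeconds(60))        // close connections older than this
        .evictInBackground(Duration.ofSeconds(30))  // periodically evict expired connections
        .build();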
Here's how you can make progress on this:
- configure the connection pool as described in the docs to better fit your use case
- if this custom configuration doesn't help and you're seeing a behavior that doesn't fit the configuration you've applied, please raise an issue in the reactor-netty project
Spring Framework and WebClient aren't handling the connection pool, so I don't think we can do anything here.
I'm closing this issue as a result. Thanks!