We have a MongoDB ChangeStream implementation in our Spring Boot application. With Spring Boot 2.4.13 (Mongo 2.4.4 and Kafka 2.5.1) we have listeners for both Kafka and MongoDB, and everything works as expected. However, after upgrading to Spring Boot 2.6.2 (Mongo 2.6.2 and Kafka 2.8.1), the application does not move past the line below during startup, which causes a timeout and the app fails to start:

```java
changes.iterator().forEachRemaining(
```

Comment From: snicoll

@sowmya-divyanand thanks for the report but there's nowhere near enough information for us to help you. If you want support, can you please share a small sample that we can run ourselves that reproduces what you've indicated? You can do so by attaching a zip to this issue or sharing a link to a GitHub repository.

Comment From: sowmya-divyanand

Note: the Mongo listener and the Kafka listener have no common functionality. They do different jobs and have no dependency on each other. Below are the MongoDB ChangeStream and Kafka listeners from the same Spring Boot application. As I stated in my earlier post, when I comment out the ChangeStream listener code, the application starts up without any issue. I strongly suspect there is a conflict between the Kafka and MongoDB listener combination.

Mongo ChangeStream Listener:

```java
@EventListener(ApplicationReadyEvent.class)
public void onApplicationEvent() {
    MongoDatabase database = mongoClient.getDatabase("user");
    MongoCollection<UserEligibilityDataEntity> collection =
            database.getCollection("user_collection", UserEligibilityDataEntity.class);
    ChangeStreamIterable<UserEligibilityDataEntity> changes = collection
            .watch(asList(Aggregates.match(
                    Filters.in("operationType", asList("insert", "update", "replace", "delete")))))
            .fullDocument(FullDocument.UPDATE_LOOKUP);

    changes.iterator().forEachRemaining((ChangeStreamDocument<UserEligibilityDataEntity> change) -> {
        List<UserEligibilityDataEntity> userEntityList = new ArrayList<>();
        UserEligibilityDataEntity userEntity = change.getFullDocument();
        userEntityList.add(userEntity);
    });
}
```
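Side note: `forEachRemaining` blocks on the change stream cursor indefinitely, and a synchronous `@EventListener` method runs on the thread that publishes the event, so this loop can hold up whatever comes after it during startup. One workaround worth trying is to run the watch loop on its own thread so startup can proceed. A minimal sketch, assuming the blocking iteration is what holds up startup (the `ChangeStreamStarter` class name and executor setup are illustrative, not from our actual code):

```java
import static java.util.Arrays.asList;

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.model.Aggregates;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.changestream.FullDocument;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class ChangeStreamStarter {

    private final MongoClient mongoClient;
    // Single background thread dedicated to the blocking change stream loop.
    private final ExecutorService changeStreamExecutor = Executors.newSingleThreadExecutor();

    public ChangeStreamStarter(MongoClient mongoClient) {
        this.mongoClient = mongoClient;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void onApplicationEvent() {
        // Submit the watch loop instead of running it on the event-publishing thread,
        // so the startup sequence is not blocked by the never-ending cursor iteration.
        changeStreamExecutor.submit(() -> {
            MongoDatabase database = mongoClient.getDatabase("user");
            MongoCollection<UserEligibilityDataEntity> collection =
                    database.getCollection("user_collection", UserEligibilityDataEntity.class);
            collection.watch(asList(Aggregates.match(
                            Filters.in("operationType", asList("insert", "update", "replace", "delete")))))
                    .fullDocument(FullDocument.UPDATE_LOOKUP)
                    .forEach(change -> {
                        // Blocks only this background thread while waiting for events.
                        UserEligibilityDataEntity userEntity = change.getFullDocument();
                        // handle userEntity...
                    });
        });
    }
}
```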

Kafka Listener:

```java
@KafkaListener(topics = "testopic", groupId = "com.test.group" /* ... */)
public void onMessage(ConsumerRecord<String, String> data, Acknowledgment ack) {

    // listener logic......

}
```
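For context, the `Acknowledgment` parameter is only resolved when the listener container is configured for manual acknowledgment; with Spring Boot's auto-configured container factory that can be set via a property (a sketch, assuming the default factory is in use):

```properties
spring.kafka.listener.ack-mode=manual_immediate
```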

Comment From: wilkinsona

@sowmya-divyanand Thanks for the additional details but I'm afraid that's not enough for us to be able to spend time trying to diagnose the problem. As Stephane requested above, can you please provide a small sample that we can run? It should be a complete, yet minimal, application. Unfortunately, we don't have time to piece things together from small snippets of code, particularly as we won't know for sure that we've reproduced your specific problem.

Comment From: sowmya-divyanand

If I have just the MongoDB listener in the application, it works fine, and vice versa: if I have the Kafka listener alone, the application starts up fine.

Comment From: sowmya-divyanand

I do not think our application will work for you. Please generate a sample Spring Boot project on 2.6.2 with both a Kafka listener and a MongoDB ChangeStream listener.

Comment From: wilkinsona

We don't need your full application, just a minimal sample that reproduces the problem you have described. As we don't know exactly what that problem is, it does not make sense for us to create an application to try to reproduce the problem.

If you would like us to spend some more time investigating, please spend some time providing a complete yet minimal sample that reproduces the problem. Unfortunately, if you are unable to do that we won't be able to help you and this issue will have to be closed.

Comment From: spring-projects-issues

If you would like us to look at this issue, please provide the requested information. If the information is not provided within the next 7 days this issue will be closed.

Comment From: spring-projects-issues

Closing due to lack of requested feedback. If you would like us to look at this issue, please provide the requested information and we will re-open the issue.