Hi, I am using ChromaDB with Ollama to embed and index data:

String chromaUrl = "http://YYYYY";
ChromaApi chromaApi = new ChromaApi(chromaUrl, new RestTemplate());

String MODEL = OllamaModel.LLAMA3.id();

var ollamaApi = new OllamaApi("http://XXXX");

//new OllamaChatModel(ollamaApi, OllamaOptions.create().withModel(MODEL).withTemperature(0.9f));

OllamaEmbeddingModel ollamaEmbeddingModel = new OllamaEmbeddingModel(ollamaApi, OllamaOptions.create().withModel(MODEL));

// third argument is the Chroma collection name, fourth enables schema initialization
ChromaVectorStore v = new ChromaVectorStore(ollamaEmbeddingModel, chromaApi, "chromadb.api.fastapi.FastAPI", true);
v.afterPropertiesSet();

List<Document> documents = List.of(
  new Document("test 1"),
  new Document("test 2"),
  new Document("test 3"));

v.add(documents);
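For retrieval I can already run a plain similarity search against the store. A minimal sketch, assuming the milestone API where SearchRequest.query(...).withTopK(...) and Document.getContent() are available:

// quick check that the documents were embedded and stored
List<Document> results = v.similaritySearch(SearchRequest.query("test").withTopK(2));
results.forEach(d -> System.out.println(d.getContent()));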

Can we have an equivalent of RetrievalQA and RetrievalQAWithSourcesChain like Python's LangChain? https://api.python.langchain.com/en/latest/chains/langchain.chains.qa_with_sources.retrieval.RetrievalQAWithSourcesChain.html

Comment From: LaCoCa

Actually, this works:

QuestionAnswerAdvisor q = new QuestionAnswerAdvisor(v, SearchRequest.query("Question ?"));

ChatClient.ChatClientRequest chatClientRequest = new ChatClient.ChatClientRequest(
    chatModel,
    "Question ?",                    // user text
    new HashMap<>(),
    null,
    new HashMap<>(),
    new ArrayList<>(),
    new ArrayList<>(),
    new ArrayList<>(),
    new ArrayList<>(),
    chatOptions,
    Collections.singletonList(q),    // the QuestionAnswerAdvisor from above
    new HashMap<>());

final ChatClient.ChatClientRequest.CallResponseSpec callResponseSpec = chatClientRequest.call();
final ChatResponse chatResponse1 = callResponseSpec.chatResponse();
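The same thing can be written with the fluent ChatClient builder instead of constructing ChatClientRequest by hand. A sketch, assuming a Spring AI milestone where ChatClient.builder(...) and defaultAdvisors(...) are available:

ChatClient chatClient = ChatClient.builder(chatModel)
    // the advisor retrieves similar documents from the vector store and adds them to the prompt (RAG)
    .defaultAdvisors(new QuestionAnswerAdvisor(v, SearchRequest.query("Question ?")))
    .build();

ChatResponse chatResponse = chatClient.prompt()
    .user("Question ?")
    .call()
    .chatResponse();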