I am using the code below for semantic caching:

from langchain.globals import set_llm_cache
from langchain.cache import RedisSemanticCache
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(<my credentials>)

# Embed prompts locally so that semantically similar questions can hit the cache
huggingface_embedding = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")

# Back the LLM cache with Redis, using the embedding model for similarity lookups
set_llm_cache(
    RedisSemanticCache(redis_url="redis://127.0.0.1:6379", embedding=huggingface_embedding)
)

question = "What is the capital of Japan?"
res = llm.invoke(question)

I have installed both the Redis server and the Redis Python client (redis-py 5.0.6, redis-cli 7.2.5), but I still get the following error:

[BUG] ValueError: Redis failed to connect: Redis cannot be used as a vector database without RediSearch >=2.4. Please head to https://redis.io/docs/stack/search/quick_start/ to know more about installing the RediSearch module within Redis Stack.

But the strange thing is that there is no 2.4 version of the RediSearch Python client available on PyPI.
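For what it's worth, the ">=2.4" in the error appears to refer to the server-side RediSearch module (shipped as part of Redis Stack), not to any Python package, which would explain why no such version exists on PyPI. You can see which modules a server has loaded with `MODULE LIST` in redis-cli; a plain `redis-server` install has none. As a small sketch (assuming Redis's standard module-version encoding of major*10000 + minor*100 + patch), the integer version reported there maps to a semantic version like this:

```python
def decode_module_version(ver: int) -> tuple:
    """Decode the integer version a Redis module reports via MODULE LIST.

    Redis modules encode versions as major*10000 + minor*100 + patch,
    e.g. 20409 -> (2, 4, 9).
    """
    major, rest = divmod(ver, 10000)
    minor, patch = divmod(rest, 100)
    return (major, minor, patch)


# RediSearch 2.4.9 would be reported as the integer 20409
print(decode_module_version(20409))  # -> (2, 4, 9)

# The check in the error message is then simply a tuple comparison
print(decode_module_version(20409) >= (2, 4, 0))  # -> True
```

So running against Redis Stack (e.g. the `redis/redis-stack` Docker image) rather than a plain Redis server should satisfy the version check.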

Comment From: sundb

Please refer to https://github.com/langchain-ai/langchain/issues/13611