Elastic, a search AI company, has announced support for Amazon Bedrock-hosted models in the Elasticsearch Open Inference API and Playground. The new integration lets developers use any large language model (LLM) available on Amazon Bedrock to build production-ready RAG applications. Shay Banon, founder and CTO of Elastic, said the integration is intended to make it easier for AWS developers to build advanced search experiences.
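As a rough sketch of what that wiring looks like, the Open Inference API registers an external model behind a named inference endpoint. The endpoint name, credentials, and model ID below are illustrative placeholders, and the exact body shape should be checked against the Elastic inference documentation for your Elasticsearch version:

```python
import json

# Hypothetical endpoint id and placeholder credentials: a sketch of the
# PUT request a developer might send to register an Amazon Bedrock chat
# model as a "completion" inference endpoint in Elasticsearch.
endpoint_id = "my-bedrock-completion"  # illustrative name

request = {
    "method": "PUT",
    "path": f"/_inference/completion/{endpoint_id}",
    "body": {
        "service": "amazonbedrock",
        "service_settings": {
            "access_key": "<AWS_ACCESS_KEY>",  # placeholder credential
            "secret_key": "<AWS_SECRET_KEY>",  # placeholder credential
            "region": "us-east-1",             # example Bedrock region
            "provider": "anthropic",           # Bedrock model provider
            "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model id
        },
    },
}

# The body would be sent to the cluster; here we just render it.
print(json.dumps(request["body"], indent=2))
```

Once registered, the same `endpoint_id` can be referenced from search and ingest features without the application handling AWS credentials directly.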
With this integration, developers can store and use embeddings, refine retrieval to ground responses in proprietary data, and more. Amazon Bedrock models can also be accessed in Playground, Elastic's low-code testing experience, giving developers additional options for A/B testing LLMs. Support for Amazon Bedrock is available now, and interested developers can refer to the Inference API and Playground blog posts for details on getting started.
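The "store and use embeddings" part can be sketched as follows, assuming a hypothetical Bedrock-backed `text_embedding` endpoint named `bedrock-titan-embeddings` has already been registered. A `semantic_text` field mapped to that endpoint has Elasticsearch generate and store embeddings at index time, and a `semantic` query then retrieves by embedding similarity (field and query text are illustrative; confirm the syntax against your Elasticsearch version):

```python
import json

# Index mapping: a semantic_text field backed by a hypothetical
# Bedrock-powered embedding endpoint, so documents are embedded on ingest.
mapping = {
    "mappings": {
        "properties": {
            "content": {
                "type": "semantic_text",
                "inference_id": "bedrock-titan-embeddings",  # hypothetical endpoint id
            }
        }
    }
}

# Retrieval: a semantic query against that field embeds the question with
# the same endpoint and matches stored document embeddings.
query = {
    "query": {
        "semantic": {
            "field": "content",
            "query": "How do I rotate my access keys?",  # example question
        }
    }
}

print(json.dumps(mapping, indent=2))
print(json.dumps(query, indent=2))
```

Retrieved passages from such a query are what a RAG application would then pass to a Bedrock completion model as grounding context.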
Elastic, listed on the NYSE as ESTC, offers search, observability, and security solutions built on the Elastic Search AI Platform. The platform is used by thousands of companies, including more than 50% of the Fortune 500. For more details, visit Elastic’s website at elastic.co.
In conclusion, Elastic’s support for Amazon Bedrock models in the Elasticsearch Open Inference API and Playground gives developers enhanced capabilities for building conversational search applications. The integration simplifies the workflow for AWS developers and broadens their options for creating next-generation search experiences, furthering Elastic’s goal of empowering users to get real-time answers from all their data.
Article Source
https://www.datanami.com/this-just-in/elasticsearch-open-inference-api-and-playground-now-support-amazon-bedrock/