
Amazon OpenSearch Service is a fully managed service provided by Amazon Web Services (AWS) for deploying, managing, and scaling OpenSearch clusters in the cloud. OpenSearch Service was formerly known as Amazon Elasticsearch Service (Amazon ES) but was renamed in 2021 following changes to the open-source project it is based on. In 2022, AWS announced OpenSearch Serverless. For organizations that don't want to manage their own OpenSearch clusters, or businesses that lack the dedicated resources or expertise to operate large clusters, OpenSearch Serverless is a valid option.


What is OpenSearch Serverless?

OpenSearch Serverless is an on-demand, auto-scaling configuration of Amazon OpenSearch Service. Instead of provisioning, sizing, and tuning clusters, users create collections (groups of indexes that support a workload), and AWS automatically provisions and scales the underlying compute, measured in OpenSearch Compute Units (OCUs), separately for indexing and search. This lets developers ingest and query data without having to manage servers, shards, or cluster capacity.
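As a rough illustration of that workflow, the sketch below uses boto3 to create an encryption policy and a time-series collection. The policy name, collection name, and region are placeholders, and a real deployment would also need network and data access policies.

```python
# A minimal sketch (not an official example) of creating an OpenSearch
# Serverless collection with boto3. Names and region are placeholders.
import json
import boto3

client = boto3.client("opensearchserverless", region_name="us-east-1")

# Serverless collections require an encryption policy before they can be created.
client.create_security_policy(
    name="app-logs-encryption",
    type="encryption",
    policy=json.dumps({
        "Rules": [{"ResourceType": "collection", "Resource": ["collection/app-logs"]}],
        "AWSOwnedKey": True,
    }),
)

# Create the collection itself; compute capacity (OCUs) is provisioned and scaled by AWS.
response = client.create_collection(name="app-logs", type="TIMESERIES")
print(response["createCollectionDetail"]["status"])  # typically "CREATING"
```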

Why Users Prefer OpenSearch Serverless to OpenSearch

While it's essential to consider the specific requirements and constraints of your organization's use case before deciding on a deployment approach, there are numerous reasons why users opt for OpenSearch Serverless over a traditional OpenSearch deployment.

  • Quick Time to Market: Serverless architectures allow for rapid development and deployment of applications. With OpenSearch Serverless, developers can create a collection and begin indexing and querying data in minutes (see the connection sketch after this list), accelerating time to market for new features and applications.
  • Event-Driven Architecture: OpenSearch Serverless fits naturally into event-driven architectures, where data arriving from streams, queues, or Lambda functions is indexed as it is produced. Because capacity adjusts automatically, collections can keep pace with real-time events and changes in data without manual resizing.
  • Flexibility and Agility: Serverless architectures provide flexibility and agility, enabling developers to experiment with new ideas, iterate quickly, and react to changing requirements. OpenSearch Serverless allows developers to build and deploy applications in a quick, iterative, and cost-effective manner.
  • Reduced Operational Overheads: OpenSearch Serverless offloads infrastructure management tasks, such as provisioning, scaling, and maintenance, to the cloud provider, AWS. This lessens the operational overhead for users, enabling them to concentrate on writing code and constructing applications rather than managing servers.
  • Scalability: Serverless architectures automatically scale compute capacity in response to workload demand, ensuring that applications can handle varying levels of traffic without manual intervention. This scalability is advantageous for applications with fluctuating workloads or unpredictable spikes in traffic.
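To make the quick-time-to-market point concrete, here is a minimal, hedged sketch of connecting to a serverless collection and indexing a document with the opensearch-py client. The collection endpoint, region, and index name are placeholder values you would replace with your own, and it assumes a recent opensearch-py version with SigV4 support.

```python
# A rough sketch, assuming an existing serverless collection and IAM credentials
# with access to it. The endpoint, region and index name are placeholders.
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

region = "us-east-1"
credentials = boto3.Session().get_credentials()
# "aoss" is the signing service name for OpenSearch Serverless requests.
auth = AWSV4SignerAuth(credentials, region, "aoss")

client = OpenSearch(
    hosts=[{"host": "your-collection-id.us-east-1.aoss.amazonaws.com", "port": 443}],
    http_auth=auth,
    use_ssl=True,
    verify_certs=True,
    connection_class=RequestsHttpConnection,
)

# Index a document; the service scales indexing and search capacity automatically.
client.index(index="app-logs", body={"level": "info", "message": "user signed in"})

# Newly indexed documents may take a moment to become searchable in serverless.
print(client.search(index="app-logs", body={"query": {"match_all": {}}}))
```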

Disclaimer

While crafting this article, we aimed to gather a diverse range of perspectives and insights from developers, engineers, and other technology leaders.

The insights presented in the following sections primarily originate from technology forums, threads, and user comments where individuals with hands-on experience using OpenSearch Serverless have shared their thoughts.

Advantages of Using OpenSearch Serverless, According to Users

“It allows me to focus on developing my application rather than worrying about the underlying infrastructure.”

“I think serverless search has been the most obvious missing link in the fence in the world of infrastructure, so I'm very happy to see this come about.”

“It can use IAM or SAML for access control, the lack of allowing username/passwords is really a positive to see from a security point of view.”

“OCUs can be shared across clusters. If you have multiple collections this can lower the cost of specific collections.”
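As a hedged illustration of the IAM-based access control praised above, the sketch below uses boto3 to create a data access policy that grants a single IAM role read and write access to one collection's indices. The policy name, collection name, and role ARN are illustrative placeholders, not values from any real account.

```python
# A minimal sketch of an OpenSearch Serverless data access policy; all names
# and the role ARN below are hypothetical placeholders.
import json
import boto3

client = boto3.client("opensearchserverless", region_name="us-east-1")

client.create_access_policy(
    name="app-logs-access",
    type="data",
    policy=json.dumps([
        {
            "Rules": [
                {
                    "ResourceType": "index",
                    "Resource": ["index/app-logs/*"],
                    "Permission": [
                        "aoss:CreateIndex",
                        "aoss:ReadDocument",
                        "aoss:WriteDocument",
                    ],
                }
            ],
            # The IAM role allowed to read and write these indices.
            "Principal": ["arn:aws:iam::123456789012:role/app-indexer"],
        }
    ]),
)
```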

Disadvantages of Using OpenSearch Serverless, According to Users

“It’s attractive, but only feasible for a log workload. Lots of missing API endpoints.”

“There are definitely things missing though (scripts for example) and the 2 scenarios you are offered (logging and search) definitely don't cover all our scenarios. There are apparently also potential cold start issues to contend with.”

“I’m trying to do something very simple. Using langchain load some docs into a OS Serverless vector store. I can connect to the vector database. When I try to write the docs to an index, it creates the index but can’t write the docs to the index. It returns a 404.”

“For the most part, it works. It's finicky for unbalanced ingestion (so if you have busy shards that kinda complicates things), I'm still monitoring JVM memory pressure (so much for AWS managed). Cold storage can't do a search, so that's kind of annoying.”

“Serverless doesn't scale up fast enough.”

“Serverless isn’t easy. It’s hard. You need real technical leadership vetting designs and you need to think really long and hard on your use cases and data access, because one misstep and it’s all a mess.”

Hosted OpenSearch from Logit.io

Integrated into Logit.io's observability platform is Hosted OpenSearch, which can be launched straight from the initial dashboard. Our solution removes the complexity of deploying and maintaining OpenSearch, as both are handled for you. Hosted OpenSearch from Logit.io is convenient, highly scalable, and can significantly reduce operational costs when compared to OpenSearch Serverless.

If you’re interested in learning more about Logit.io’s Hosted OpenSearch, don’t hesitate to contact us or begin exploring the platform and its full capabilities yourself with a 14-day free trial.

If you've enjoyed this article, why not read Mastering Observability with OpenSearch: A Comprehensive Guide or The Top 10 OpenSearch Plugins next?
