Laura Salinas for AWS

Amazon Bedrock Pricing 101 💰

I don't know about you, but I hear a lot about how difficult it can be to predict & track costs when using Amazon Bedrock 🥵

I spent the last ~2 years as a solutions architect helping customers in the ML startup space understand how they can take advantage of this easy-to-use, LLM-APIs-at-your-fingertips service, and I think the #☝🏼 question was still:

This is great-- but how can we better predict and forecast our costs as we experiment?

Today I'm hoping to pull back the curtain and share everything I do to monitor Bedrock costs in my account, and what I have been recommending to customers for years.

AWS Pricing Calculator

Tried and true, the first place to start is the pricing calculator. For all its hiccups, the calculator has come a long way since I first started using it in 2018, and it remains an easy way to forecast costs without needing an AWS account. With a single plug-and-play input field for token count you can generate an estimate and, depending on your use case, add additional context for model customization in the other fields.

The 🐘 in the room? At this time the pricing calculator only supports estimates with first party Amazon Titan or Amazon Nova models.

For 3rd party models like Anthropic, Mistral, Meta, etc. (arguably the most used on Bedrock) you'll still have to reference the Amazon Bedrock Pricing page for the specific model version and its per-token price.

Screenshot of the AWS calculator with arrows pointing to the input fields for token input and output
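If your model isn't covered by the calculator, the math is easy enough to do yourself once you have the per-token rates. Here's a minimal sketch; the two prices below are placeholders I made up, not real rates, so swap in the current numbers from the pricing page for your model:

```python
# Back-of-the-envelope Bedrock on-demand inference estimate.
# These per-1K-token prices are HYPOTHETICAL placeholders -- always
# pull the current rates for your model from the Bedrock Pricing page.
PRICE_PER_1K_INPUT = 0.003   # USD per 1,000 input tokens (placeholder)
PRICE_PER_1K_OUTPUT = 0.015  # USD per 1,000 output tokens (placeholder)

def estimate_monthly_cost(requests_per_day: int,
                          avg_input_tokens: int,
                          avg_output_tokens: int,
                          days: int = 30) -> float:
    """Rough monthly cost in USD for a steady inference workload."""
    total_input = requests_per_day * avg_input_tokens * days
    total_output = requests_per_day * avg_output_tokens * days
    return ((total_input / 1000) * PRICE_PER_1K_INPUT
            + (total_output / 1000) * PRICE_PER_1K_OUTPUT)
```

For example, `estimate_monthly_cost(1000, 500, 250)` gives you the month's bill for 1,000 requests a day averaging 500 input and 250 output tokens each. Output tokens usually cost several times more than input tokens, which is why the two rates are tracked separately.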

AWS Cost Explorer

Ok, let's dive one step deeper (and one that requires your AWS account) into tracking & understanding Bedrock charges. My favorite way to do this is with the built in Cost Explorer tool in the management console.

First off, why do I like this approach?

  • Interactive graph visuals

  • Ability to drill down by service, region, API call and more

  • Can review data up to 3 years in the past and forecast up to 1 year ahead

  • Easily save reports and export them as CSVs

Tracking

The first thing I like to do when diving into my Bedrock costs is set the appropriate filters in the Report parameters pane.

  1. Date Range: YTD (unless I'm drilling down on a specific date/time)
  2. Granularity: Monthly (again, unless I'm investigating a specific issue)
  3. Dimension: This one varies but here are the main ones I use.

    a. Usage type - I love the granularity here, and it also displays region which is helpful when operating across multiple regions and looking for a quick snapshot.

    b. API Operation - The values here more closely match the verbiage of the bill which makes troubleshooting service charges easier.

    c. Tag - Very helpful if you're in the practice of tagging workloads by team, project, business unit, etc. (Which you should be 😉)

Screenshot of the cost explorer report with usage type filter on the Bedrock service

Screenshot of the cost explorer report with api operation filter on the Bedrock service
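If you'd rather script this than click through the console, the same view is available through the Cost Explorer API. Here's a boto3 sketch that mirrors the filters above (it assumes your credentials can read Cost Explorer, and that "Amazon Bedrock" is how the service shows up in your account's cost data, so verify that value against yours):

```python
def bedrock_cost_request(start: str, end: str) -> dict:
    """Build GetCostAndUsage parameters mirroring the console filters:
    monthly granularity, grouped by usage type, Bedrock service only.
    Dates are YYYY-MM-DD strings."""
    return {
        "TimePeriod": {"Start": start, "End": end},
        "Granularity": "MONTHLY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
        # Assumption: "Amazon Bedrock" is the SERVICE dimension value
        # in my cost data -- check yours in the console first.
        "Filter": {"Dimensions": {"Key": "SERVICE",
                                  "Values": ["Amazon Bedrock"]}},
    }

def print_bedrock_costs(start: str, end: str) -> None:
    """Requires AWS credentials with Cost Explorer read access."""
    import boto3
    ce = boto3.client("ce")
    resp = ce.get_cost_and_usage(**bedrock_cost_request(start, end))
    for period in resp["ResultsByTime"]:
        for group in period["Groups"]:
            print(period["TimePeriod"]["Start"],
                  group["Keys"][0],
                  group["Metrics"]["UnblendedCost"]["Amount"])
```

Swapping the `GroupBy` key to `"OPERATION"` gets you the API Operation view, and `{"Type": "TAG", "Key": "your-tag-key"}` gets you the tag view.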

Forecasting

From the same Report parameters pane you can forecast easily by selecting how many months into the future you want the algorithm to project.

There's a bit of data science under the hood of this prediction algorithm; you can read more about that here if you're curious.

Screenshot of the cost explorer report with 3 months forecasting filter on the Bedrock service

Screenshot of the cost explorer graph report with 3 months forecasting filter on the Bedrock service
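The same forecast is exposed programmatically through the GetCostForecast API. A small boto3 sketch (the credentials and the "Amazon Bedrock" service-name filter are assumptions to verify in your own account):

```python
def bedrock_forecast_request(start: str, end: str) -> dict:
    """Build GetCostForecast parameters for Bedrock spend only.
    `start`/`end` are YYYY-MM-DD; the window defines how far ahead
    the forecast runs (up to roughly a year out)."""
    return {
        "TimePeriod": {"Start": start, "End": end},
        "Metric": "UNBLENDED_COST",
        "Granularity": "MONTHLY",
        # Same service-name assumption as before -- verify in your account.
        "Filter": {"Dimensions": {"Key": "SERVICE",
                                  "Values": ["Amazon Bedrock"]}},
    }

def fetch_forecast_total(start: str, end: str) -> str:
    """Requires AWS credentials with Cost Explorer read access.
    Returns the forecasted USD total for the window as a string."""
    import boto3
    ce = boto3.client("ce")
    resp = ce.get_cost_forecast(**bedrock_forecast_request(start, end))
    return resp["Total"]["Amount"]
```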

AWS Billing Console

While most of my use for service charge troubleshooting happens in cost explorer, sometimes it helps to see the line items in your bill directly.

For bill access head to the Bills console. In the service name search field you'll want to enter Bedrock Service and Bedrock. This will pick up both 3P and 1P model charges alongside any other native Bedrock features (like guardrails).

Screenshot of the billing console report with filters for the Bedrock services

One other aspect in the billing console I'd like to point out is the Usage Quantity column. For Bedrock specifically this is where you can find token count!

Screenshot of the billing console report with token count for Bedrock line items

AWS Budgets

I won't lie to you: I haven't used Budgets in my own AWS account, but I recommend it often given how powerful it can be for alerting before a major bill spike. You can also create actions that trigger when a budget is breached, or when spend approaches a budget that's been set.

For more information on configuring budget actions check out the documentation.

For those of you who are visual learners, this tutorial on setting budget alerts provides a great walkthrough as well.
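To make this concrete, here's a boto3 sketch that creates a monthly cost budget scoped to Bedrock, with an email alert at 80% of actual spend. The budget name, limit, email, and the "Amazon Bedrock" cost-filter value are all illustrative assumptions; check the filter value against how Bedrock appears in your own bill:

```python
def bedrock_budget(limit_usd: str, email: str) -> dict:
    """Build kwargs for budgets.create_budget: a monthly cost budget
    scoped to Bedrock that emails `email` at 80% of actual spend.
    The name and filter value here are illustrative assumptions."""
    return {
        "Budget": {
            "BudgetName": "bedrock-monthly",          # hypothetical name
            "BudgetLimit": {"Amount": limit_usd, "Unit": "USD"},
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
            # Assumption: scope to Bedrock via the Service cost filter.
            "CostFilters": {"Service": ["Amazon Bedrock"]},
        },
        "NotificationsWithSubscribers": [{
            "Notification": {
                "NotificationType": "ACTUAL",         # or "FORECASTED"
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,                    # percent of the limit
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [{"SubscriptionType": "EMAIL",
                             "Address": email}],
        }],
    }

def create_bedrock_budget(account_id: str, limit_usd: str,
                          email: str) -> None:
    """Requires AWS credentials with Budgets write permissions."""
    import boto3
    client = boto3.client("budgets")
    client.create_budget(AccountId=account_id,
                         **bedrock_budget(limit_usd, email))
```

Switching `NotificationType` to `"FORECASTED"` alerts you when the forecast (not actual spend) crosses the threshold, which gives you even earlier warning.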

Takeaways 💡

  • As with all things AWS, use the right tool for the right job. For quick-and-dirty inspection I prefer Cost Explorer; for more intricate needs there are plenty of third-party tools to explore in the AWS Marketplace (over 5,000 hits on 'cost management' alone)

  • Use multiple dimensions in cost explorer to help triage costs across several services (e.g. S3 + CloudFront spikes)

  • Tagging and setting budget alerts will be your best friends for preventing a billing jump scare

Additional Resources 📚

Sound off in the comments if you have other pricing challenges you've observed with Bedrock or anything else you'd like to learn about!
