Tip
This page only contains information on the st.cache_resource API. For a deeper dive into caching and how to use it, check out Caching.
st.cache_resource
Decorator to cache functions that return resource objects (e.g. database connections, ML models).
Cached objects can be global or session-scoped. Global resources are shared across all users, sessions, and reruns. Session-scoped resources are scoped to the current session and are removed when the session disconnects. Global resources must be thread-safe. If thread safety is an issue, consider using a session-scoped cache or storing the resource in st.session_state instead.
You can clear a function's cache with func.clear() or clear the entire cache with st.cache_resource.clear().
A function's arguments must be hashable to cache it. Streamlit makes a best effort to hash a variety of objects, but the fallback hashing method also requires that the argument be pickleable. If you have an unhashable argument (like a database connection) or an argument you want to exclude from caching, use an underscore prefix in the argument name. In this case, Streamlit will return a cached value when all other arguments match a previous function call. Alternatively, you can declare custom hashing functions with hash_funcs.
Objects cached by st.cache_resource act like singletons and can mutate. To cache data and return copies, use st.cache_data instead. To learn more about caching, see Caching overview.
Warning
Async objects are not officially supported in Streamlit. Caching async objects or objects that reference async objects may have unintended consequences. For example, Streamlit may close event loops in its normal operation and make the cached object raise an Event loop closed error.
To upvote official asyncio support, see GitHub issue #8488. To upvote support for caching async functions, see GitHub issue #8308.
| Function signature[source] | |
|---|---|
| st.cache_resource(func, *, ttl, max_entries, show_spinner, show_time=False, validate, hash_funcs=None, on_release=None, scope="global") | |
| Parameters | |
| func (callable) | The function that creates the cached resource. Streamlit hashes the function's source code. |
| ttl (float, timedelta, str, or None) | The maximum age of a returned entry from the cache. This can be one of the following values: None if entries should never expire (default), a number specifying the time in seconds, a timedelta object, or a string in a format supported by pandas.Timedelta (e.g. "1d", "1.5 days", or "1h23s"). Changes to this value will trigger a new cache to be created. |
| max_entries (int or None) | The maximum number of entries to keep in the cache, or None for an unbounded cache. When a new entry is added to a full cache, the oldest cached entry will be removed. Defaults to None. Changes to this value will trigger a new cache to be created. |
| show_spinner (bool or str) | Enable the spinner. Defaults to True, which shows a spinner when there is a "cache miss" and the cached resource is being created. If a string is passed, it is used as the spinner text. |
| show_time (bool) | Whether to show the elapsed time next to the spinner text. If this is False (default), no time is displayed. If this is True, elapsed time is displayed with a precision of 0.1 seconds. The time format is not configurable. |
| validate (callable or None) | An optional validation function for cached resources. validate is called each time the cached value is accessed. It receives the cached value as its only parameter and must return a boolean. If validate returns False, the current cached value is discarded, and the decorated function is called to compute a new value. This is useful, for example, to check the health of database connections. |
| hash_funcs (dict or None) | Mapping of types or fully qualified names to hash functions. This is used to override the behavior of the hasher inside Streamlit's caching mechanism: when the hasher encounters an object, it will first check to see if its type matches a key in this dict and, if so, will use the provided function to generate a hash for it. See below for an example of how this can be used. |
| on_release (callable or None) | A function to call when an entry is removed from the cache. The removed item is passed to the function as its only argument. Most commonly, this is used with session-scoped caches to release per-session resources, but it can also be used with max_entries or ttl. Note that TTL expiration only happens when an expired resource is accessed, so don't rely on TTL expiration to guarantee timely cleanup; expiration can happen on any script run. Ensure that on_release functions are thread-safe and don't rely on Session State. The on_release function isn't guaranteed to be called when an app is shut down. |
scope ("global" or "session") | The scope for the resource cache. If this is "global" (default), the resource is cached globally. If this is "session", the resource is removed from the cache when the session disconnects. Because a session-scoped cache is cleared when a session disconnects, an unstable network connection can cause the cache to populate multiple times in a single session. If this is a problem, you might consider adjusting the server.websocketPingInterval configuration option. |
Example
Example 1: Global cache
By default, an @st.cache_resource-decorated function uses a global cache.
Example 2: Session-scoped cache
By passing scope="session", an @st.cache_resource-decorated function uses a session-scoped cache. You can also use on_release to clean up resources when they are no longer needed.
Example 3: Unhashable arguments
By default, all parameters to a cached function must be hashable. Any parameter whose name begins with _ will not be hashed. You can use this as an "escape hatch" for parameters that are not hashable:
Example 4: Clearing a cache
A cached function's cache can be procedurally cleared:
Example 5: Custom hashing
To override the default hashing behavior, pass a custom hash function. You can do that by mapping a type (e.g. Person) to a hash function (str) like this:
Alternatively, you can map the type's fully-qualified name (e.g. "__main__.Person") to the hash function instead:
| Function signature[source] | |
|---|---|
| st.cache_resource.clear() | |
Example
In the example below, pressing the "Clear All" button clears all cache_resource caches, i.e. it removes the cached global resources from every function decorated with @st.cache_resource.
CachedFunc.clear
Clear the cached function's associated cache.
If no arguments are passed, Streamlit will clear all values cached for the function. If arguments are passed, Streamlit will clear the cached value for these arguments only.
| Function signature[source] | |
|---|---|
| CachedFunc.clear(*args, **kwargs) | |
| Parameters | |
| *args (Any) | Arguments of the cached function. |
| **kwargs (Any) | Keyword arguments of the cached function. |
Example
Using Streamlit commands in cached functions
Static elements
Since version 1.16.0, cached functions can contain Streamlit commands! For example, you can do this:
As we know, Streamlit only runs this function if it hasn’t been cached before. On this first run, the st.success message will appear in the app. But what happens on subsequent runs? It still shows up! Streamlit realizes that there is an st. command inside the cached function, saves it during the first run, and replays it on subsequent runs. Replaying static elements works for both caching decorators.
You can also use this functionality to cache entire parts of your UI:
Input widgets
You can also use interactive input widgets like st.slider or st.text_input in cached functions. Widget replay is an experimental feature at the moment. To enable it, you need to set the experimental_allow_widgets parameter:
Streamlit treats the widget like an additional input parameter to the cached function. If you move the slider, Streamlit checks whether it has already cached the function for this slider value. If yes, it returns the cached value. If not, it reruns the function using the new slider value.
Using widgets in cached functions is extremely powerful because it lets you cache entire parts of your app. But it can be dangerous! Since Streamlit treats the widget value as an additional input parameter, it can easily lead to excessive memory usage. Imagine your cached function has five sliders and returns a 100 MB DataFrame. Then we’ll add 100 MB to the cache for every permutation of these five slider values – even if the sliders do not influence the returned data! These additions can make your cache explode very quickly. Please be aware of this limitation if you use widgets in cached functions. We recommend using this feature only for isolated parts of your UI where the widgets directly influence the cached return value.
Warning
Support for widgets in cached functions is currently experimental. We may change or remove it anytime without warning. Please use it with care!
Note
Two widgets are currently not supported in cached functions: st.file_uploader and st.camera_input. We may support them in the future. Feel free to open a GitHub issue if you need them!
Still have questions?
Our forums are full of helpful information and Streamlit experts.
