Caching of AI Summaries

@Mark Barnes (Logos) I have a technical question for you. When I use the AI summarize features, does the software cache the responses so that if I ask for a summary of the same article or search result in the future it just pulls the already-summarized content? If not, could this feature be considered?
Comments
-
We cache responses in two ways.
Summaries are cached as part of the panel, so if you request the same summary in the same panel (without closing the panel), you'll get an instant response, and no AI credits will be used.
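To picture how that might work, here's a minimal sketch (not Logos's actual code; `SummaryPanel` and `summarizer` are hypothetical names) of a panel-level cache that memoizes summaries for the lifetime of the panel:

```python
# Hypothetical sketch of panel-level summary caching.
class SummaryPanel:
    def __init__(self, summarizer):
        self.summarizer = summarizer   # callable: article text -> summary
        self._cache = {}               # article key -> cached summary

    def summarize(self, article_key, article_text):
        # A repeat request in the same open panel hits the cache:
        # instant response, no AI credits consumed.
        if article_key not in self._cache:
            self._cache[article_key] = self.summarizer(article_text)
        return self._cache[article_key]

    def close(self):
        # Closing the panel discards its cache; the next request
        # starts fresh.
        self._cache.clear()
```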
We also cache summaries on the server, so two requests to summarize the same article should return identical results, even if another user previously generated the summary. In this case, we still deduct AI credits, but the number of credits deducted decreases for everyone over time. The following aren't real numbers, but let's use them to simplify the math: if we deducted ten credits per summary but found that 10% of responses were served from the cache, we'd start deducting nine credits from everyone instead of ten.

This applies to all AI features and helps keep AI output consistent across users and over time. If we improve an AI feature, we invalidate all the caching to ensure that everyone benefits from the improvements.
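The credit math above can be sketched in a couple of lines (using only the made-up numbers from this post, not real pricing):

```python
# Illustrative only: the average per-request charge once cache savings
# are shared across all users.
def effective_credits(base_credits, cache_hit_rate):
    """Average credits deducted per request, given the fraction of
    requests served from the server-side cache."""
    return base_credits * (1 - cache_hit_rate)

# Ten credits per summary with a 10% cache-hit rate works out to
# nine credits per request.
print(effective_credits(10, 0.10))  # 9.0
```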