feature: dynamic endpoints based on pre-defined sql queries #370
Comments
@nitisht can I pick this issue?
Hey @sudharsangs, you can, for sure.
Any update on this issue?
Right now there are no plans to work on this, @mrchypark. Is this something you think would be useful to you?
We can leverage DataFusion's CREATE FUNCTION support: https://github.com/apache/arrow-datafusion/blob/main/datafusion-examples/examples/function_factory.rs
@nitisht Thank you for the information. I had initially understood this feature as something similar to a materialized view. I thought it would be a useful feature if it could proactively perform calculations and cache the results in advance, although it might also execute upon request depending on the configuration. I appreciate you sharing the details, and I will definitely look into the aspects you mentioned. Thank you for your help and insights.
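To make the "compute ahead and cache" idea above concrete, here is a minimal, hypothetical Rust sketch, not Parseable code: a background task re-runs a pre-defined query on a fixed interval and readers are served the cached (possibly stale) JSON. The run_query helper, the stream name, and the tokio setup are assumptions for illustration.

```rust
// Conceptual sketch only. Assumes tokio with the "full" feature set.
use std::sync::Arc;
use std::time::Duration;
use tokio::sync::RwLock;

// Hypothetical stand-in for executing a SQL query and serializing the result.
async fn run_query(sql: &str) -> String {
    format!("{{\"query\": \"{sql}\", \"rows\": []}}")
}

#[tokio::main]
async fn main() {
    let sql = "SELECT status, COUNT(*) FROM frontend GROUP BY status";
    // Fill the cache once up front so the first read is never empty.
    let cache = Arc::new(RwLock::new(run_query(sql).await));

    // Background refresher: recompute the result every 5 minutes, mirroring
    // the refresh interval proposed in the issue.
    let refresher_cache = Arc::clone(&cache);
    tokio::spawn(async move {
        let mut tick = tokio::time::interval(Duration::from_secs(300));
        tick.tick().await; // the first tick fires immediately; skip it
        loop {
            tick.tick().await;
            *refresher_cache.write().await = run_query(sql).await;
        }
    });

    // A request handler would serve the cached result instead of re-running
    // the query, trading staleness for fast responses.
    println!("cached response: {}", &*cache.read().await);
}
```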
/bounty 300
💎 $300 bounty • Parseable
Thank you for contributing to parseablehq/parseable!
/attempt #370
/attempt #370
💡 @TomBebb submitted a pull request that claims the bounty. You can visit your bounty board to reward.
This feature could be valuable for us as well, so it's great to see a PR already in progress! Below, I'd like to share some feedback from a user perspective.

We use Parseable not only for storing but also for retrieving log data. While response times vary depending on several factors, we wouldn't describe the responses as instant when querying streams with a high volume of events (we're already using partitions). With carefully selected start and end dates, the response times are fast enough for internal use cases (up to 10–20 seconds), but this isn't sufficient when we need to retrieve logs for display to end users, who expect loading times within a few hundred milliseconds to a few seconds at most. For this reason, we currently use Parseable primarily for experimenting with internal features. A solution to further improve query response times would be fantastic. For our use cases, stale data is an acceptable trade-off for increased speed.

I'm not sure if I understood this PR comment by @nikhilsinhaparseable correctly, but our use case would likely need support for more than 10 dynamic streams at a time. We could potentially have over 100,000 distinct queries, each with varying parameters (e.g., different IDs for filtering logs). Of course, not all of these queries would need to be dynamic/cached, just the most frequent/active ones. So, while I'm not certain this feature would fully meet our needs, it does address a similar concern around improving response times.
Thanks for the detailed feedback @davidwlhlm. Just curious, have you tried https://www.parseable.com/docs/features/tiering? And if so, do you still see slow responses?
Thanks for pointing that out, we will take a look and run some experiments :)
Implement an endpoint /api/v1/query/dynamic with a payload. This API will respond with a unique URL that serves a plain JSON response to this query. The data will be refreshed every 5m, as configured in the payload.

We'd love to get feedback from the community on whether they think this is useful. Please add a 👍🏽 to the issue to show your interest.