Mixed Bulk Read Support

This is me thinking out loud, so bear with me.

TL;DR: A lot of latency could be eliminated if there were some uniform way to send multiple queries to a backend in one go and receive the data back in a single response.

Right now, I’m dreaming up a world where I can stop worrying about latency between the browser and my JSON API (and even the latency between the API and my database). At times, it’s necessary to load unrelated objects from different resources simultaneously in order to render a component. In order to load the data in one go, I’d have to create a resource on my API server for that component. The problem then becomes that my application logic is now being coupled to my API, and in some cases, modifying the API is not an option.

I’m also thinking about developing a query optimizer of sorts that would leverage promises and this kind of batching to help alleviate a lot of the pain of N+1 queries too. Basically, every request would be fed through some singleton, which could intelligently group queries (e.g. those hitting the same resource), issue them as a combined batch, and then fulfill the promises. This optimization could be done either client or server side.
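To make that singleton idea a bit more concrete, here's a toy TypeScript sketch, assuming the server exposes some way to fetch many records of one resource at once (`QueryBatcher`, `Fetcher`, and everything else here are invented for illustration, not a real implementation). Loads issued within the same tick get grouped by resource and fulfilled from one combined fetch:

```typescript
// Hypothetical signature for a "fetch many by id" backend call.
type Fetcher = (resource: string, ids: string[]) => Promise<Record<string, unknown>>;
type Waiter = { resolve: (value: unknown) => void; reject: (err: unknown) => void };

class QueryBatcher {
  // resource -> id -> promises waiting on that record
  private pending = new Map<string, Map<string, Waiter[]>>();
  private scheduled = false;

  constructor(private fetchMany: Fetcher) {}

  load(resource: string, id: string): Promise<unknown> {
    return new Promise((resolve, reject) => {
      const byId = this.pending.get(resource) ?? new Map<string, Waiter[]>();
      this.pending.set(resource, byId);
      const waiters = byId.get(id) ?? [];
      waiters.push({ resolve, reject });
      byId.set(id, waiters);
      if (!this.scheduled) {
        this.scheduled = true;
        // Flush at the end of the current tick so same-tick loads coalesce.
        queueMicrotask(() => void this.flush());
      }
    });
  }

  private async flush(): Promise<void> {
    const batches = this.pending;
    this.pending = new Map();
    this.scheduled = false;
    for (const [resource, byId] of batches) {
      try {
        // One combined request per resource instead of N individual ones.
        const results = await this.fetchMany(resource, [...byId.keys()]);
        for (const [id, waiters] of byId) {
          for (const w of waiters) w.resolve(results[id]);
        }
      } catch (err) {
        for (const waiters of byId.values()) for (const w of waiters) w.reject(err);
      }
    }
  }
}
```

The nice property is that calling code just does `batcher.load("employees", "1")` wherever it needs a record, and the N+1 collapsing happens transparently behind the promise.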

I’ve discussed something like this with a co-worker on several occasions. Our use-case is a “firehose” endpoint which can fetch multiple resource types at once. For instance, in a scheduling application it would be handy to GET all possible employees, utilization types (consulting, vacation, conference time, etc.), teams (different groupings of employees), and schedule entries in a single request.
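As a strawman for what that firehose could look like, here's one possible shape (the endpoint name, query parameter, and payload keys are all made up for illustration):

```
GET /firehose?include=employees,utilization_types,teams,schedule_entries

{
  "employees":         [ ... ],
  "utilization_types": [ ... ],
  "teams":             [ ... ],
  "schedule_entries":  [ ... ]
}
```

One open design question with this shape is how per-resource errors would be reported, since a single HTTP status can no longer describe the whole response.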

From a pure performance standpoint, inlining this data in the HTML for the initial request is likely the correct way to go (that’s what Discourse does), but there are plenty of cases where this would be nice to have.

I’d be open to implementing support for something like this in endpoints post 1.1 when we have defined semantics for extensions.

Ignoring the implementation details on the server, what would you imagine the request for something like this would look like?