Just in case anyone finds it helpful: we had the same issue with long loading times when expanding invoices, projects and other endpoints.
In the end we built a cache using the
requests_cache package in Python, downloaded all the data/queries we need first (overnight), and then work with the data locally. The downside is that the data is a day old, but for our use case that is not a problem. We query the API without expanding so many fields, since that took forever; instead we read all the entries per endpoint and merge the data locally using the cache.
With this approach we load around 100,000 entries from several endpoints; once they are in the cache, it only takes a couple of minutes to load and use them again.
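The local merge step mentioned above can be sketched like this: instead of asking the API to expand related records, fetch each endpoint separately and join the entries in memory by their foreign-key id. The field names ("project_id" etc.) are made up for illustration; your API's schema will differ.

```python
def merge_by_key(invoices, projects, key="project_id"):
    """Attach the matching project record to each invoice."""
    by_id = {p["id"]: p for p in projects}  # one O(n) index pass
    merged = []
    for inv in invoices:
        inv = dict(inv)                     # don't mutate the input
        inv["project"] = by_id.get(inv.get(key))
        merged.append(inv)
    return merged

# Tiny usage example with invented records:
invoices = [{"id": 1, "project_id": 10}, {"id": 2, "project_id": 11}]
projects = [{"id": 10, "name": "Alpha"}, {"id": 11, "name": "Beta"}]
print(merge_by_key(invoices, projects)[0]["project"]["name"])  # Alpha
```

Building the dict index first keeps the join linear, which matters once you are merging on the order of 100,000 entries.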