function history to download Bloomberg data runs slow

Shiping Yang
Shiping Yang on 22 Apr 2022
Answered: Simar on 25 Jan 2024
I am using the MATLAB function history to download data from Bloomberg.
s = a list of 70 securities.
f = three fields.
fromdate = 12/31/1999
todate = 4/20/2022
period = {'daily','actual','all_calendar_days','previous_value'};
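For reference, the call is set up roughly like this (the connection is a standard blp Desktop connection; the security and field names below are placeholders, not the actual ones I use):

c = blp;                                  % Bloomberg Desktop connection
s = mySecurityList;                       % cell array of 70 Bloomberg tickers (placeholder)
f = {'FIELD_1','FIELD_2','FIELD_3'};      % placeholders for the three fields
fromdate = '12/31/1999';
todate   = '4/20/2022';
period   = {'daily','actual','all_calendar_days','previous_value'};
d = history(c, s, f, fromdate, todate, period);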
It takes about 20 minutes to execute the command. I understand the data spans 21+ years, so a longer run time is to be expected. But when I tried to download just 2 weeks of data, it completed in a few seconds. 20 minutes for 21+ years of historical data still seems too long.
Has anyone experienced the same issue? Is there any way to optimize how I download the historical data, please?

Answers (1)

Simar
Simar on 25 Jan 2024
Hi Shiping Yang,
I understand that you are looking for strategies to optimize the data download process and reduce the time taken. Downloading a large amount of historical data from Bloomberg with the MATLAB "history" function can take a considerable amount of time.
Here are some suggestions to potentially optimize the data download process:
  • Parallel Requests: If you have access to MATLAB's Parallel Computing Toolbox, try splitting the request into smaller batches of securities and downloading them concurrently (see the parfor sketch after this list).
  • Limit Fields: Review the fields being requested. If there are fields that are not strictly needed, consider removing them from the request to reduce the amount of data being transferred.
  • Bloomberg API Limitations: The Bloomberg API itself may limit data retrieval rates, which can affect the time taken to download historical data. Check whether any such limits apply and whether they are configurable.
  • Caching: If you frequently need to access the same historical data, consider caching it locally after the initial download so that only incremental updates need to be retrieved on subsequent requests (see the caching sketch after this list).
  • Data Compression: Check whether the Bloomberg service offers data compression options during transfer, which could reduce the amount of data sent over the network.
  • Optimize MATLAB Code: Review your MATLAB code for inefficiencies. For example, preallocate memory for large datasets, use efficient data types, and avoid unnecessary loops or computations within the data retrieval script.
  • Incremental Downloads: Instead of downloading the entire dataset in one go, consider breaking it into smaller time frames and incrementally appending the results. This approach can sometimes be more efficient and allows for intermediate saving and checkpointing (the caching sketch after this list also illustrates this).
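Here is a minimal sketch of the parallel batching idea, assuming the Parallel Computing Toolbox is available and that each worker opens its own Bloomberg Desktop connection with blp (connection objects cannot be shared between workers); the batch size of 10 is just an example:

batchSize = 10;
edges = [1:batchSize:numel(s), numel(s)+1];      % batch boundaries over the 70 securities
results = cell(numel(edges)-1, 1);
parfor k = 1:numel(edges)-1
    c = blp;                                     % one Bloomberg connection per worker
    batch = s(edges(k):edges(k+1)-1);            % securities in this batch
    results{k} = history(c, batch, f, fromdate, todate, period);
    close(c);
end

Note that Bloomberg may throttle concurrent requests, so it is worth comparing a small parallel pool against the serial version before scaling up.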
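And here is a sketch combining the caching and incremental-download ideas: the 22-year window is fetched one calendar year at a time and each chunk is saved to a local MAT-file, so a rerun only requests years that are not cached yet (the cache folder and file naming are just examples):

cacheDir = 'bbg_cache';                          % example local cache folder
if ~exist(cacheDir, 'dir'), mkdir(cacheDir); end
fromdate = datenum(1999,12,31);
todate   = datenum(2022,4,20);
c = blp;
for yr = 1999:2022
    cacheFile = fullfile(cacheDir, sprintf('history_%d.mat', yr));
    if exist(cacheFile, 'file')
        continue                                 % this year is already cached locally
    end
    t1 = max(fromdate, datenum(yr,1,1));         % clamp to the requested window
    t2 = min(todate,   datenum(yr,12,31));
    d = history(c, s, f, t1, t2, period);        % one year of data for all securities
    save(cacheFile, 'd');
end
close(c);

The yearly MAT-files can then be loaded and concatenated locally, which is much faster than re-downloading the full history.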
Remember to test any changes with a smaller subset of data first to ensure that they provide a performance improvement before applying them to the full 21+ years of data.
Hope it helps!
Best Regards,
Simar
