Improve data downloading by implementing incremental trade download and candle building. This would prevent loading all past trades into memory when fetching data for a short time span, addressing memory issues with large historical datasets.
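A minimal sketch of what the incremental approach could look like, using plain pandas. The helper names (`append_new_trades`, `trades_to_candles`) and the column layout (`timestamp` in ms, `price`, `amount`) are illustrative assumptions, not freqtrade's actual data-handler API: only trades newer than the last stored timestamp get appended, and candles are resampled from the chunk rather than the full history.

```python
import pandas as pd

def append_new_trades(store: pd.DataFrame, new: pd.DataFrame) -> pd.DataFrame:
    """Append only trades newer than the last stored timestamp,
    instead of reloading the full trade history into memory."""
    if store.empty:
        return new
    last_ts = store["timestamp"].max()
    fresh = new[new["timestamp"] > last_ts]
    return pd.concat([store, fresh], ignore_index=True)

def trades_to_candles(trades: pd.DataFrame, timeframe: str = "1h") -> pd.DataFrame:
    """Build OHLCV candles from a (possibly small) chunk of trades."""
    df = trades.set_index(pd.to_datetime(trades["timestamp"], unit="ms"))
    ohlc = df["price"].resample(timeframe).ohlc()
    ohlc["volume"] = df["amount"].resample(timeframe).sum()
    # Drop empty buckets where no trades occurred
    return ohlc.dropna(subset=["open"])
```

With this shape, updating a dataset only ever holds the newly fetched chunk (plus the last timestamp of the stored data) in memory, instead of the full multi-year trade history.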
## Describe your environment

* Operating system, Python, CCXT: running in Docker on Linux
* Freqtrade Version: 2021.8

## Your question

I am downloading trades from Kraken. It seems that even if I only fetch a very short time span of trades, all the past trades are loaded into memory in a dataframe, and the new files are then produced from there. I have been fetching trades a couple of months at a time, starting at the beginning of 2017, and as I approached the present I started to run out of RAM with 55 million BTC/EUR trades. The process requires roughly 30 GB of memory; because I ran out, I had to create a swap file and use that, which of course made everything even slower. Together with producing the candles (I'm creating all timeframes up to 1h), it has been running for about 5 hours now. This is quite untenable, and I don't see myself updating my historic data any further.

My questions are:

- Is this normal, or is something wrong on my side?
- If this is indeed normal, is there