Ummmm I think you're misunderstanding me here. What I meant was that data sources can be modularized to prevent a factorial or exponential explosion in computational complexity.
In some "ideal" sense, yes, every piece of data is reconciled against every other piece of data. But practically speaking, there are many well-known techniques for making computational problems tractable, including lossy compression and modularization. Indeed, every form of life on the planet employs exactly these strategies. We should just assume that such strategies would be part of any centralized economic data processing.
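To make the complexity argument concrete, here's a minimal back-of-the-envelope sketch. It's purely illustrative (all names are hypothetical): it compares the number of pairwise reconciliation steps in a flat, all-against-all approach with a modular one, where records are first reconciled within fixed-size modules and each module then exposes a single lossy summary for cross-module reconciliation.

```python
def all_pairs_cost(n: int) -> int:
    """Every record reconciled against every other record: O(n^2)."""
    return n * (n - 1) // 2

def modular_cost(n: int, module_size: int) -> int:
    """Reconcile within each module, then reconcile the module
    summaries against each other: roughly O(n * module_size)."""
    num_modules = -(-n // module_size)  # ceiling division
    within = num_modules * all_pairs_cost(module_size)
    across = all_pairs_cost(num_modules)  # one lossy summary per module
    return within + across

if __name__ == "__main__":
    n = 1_000_000
    print(f"flat:    {all_pairs_cost(n):,} comparisons")
    print(f"modular: {modular_cost(n, module_size=1000):,} comparisons")
```

For a million records, the flat approach needs ~5 × 10^11 comparisons while the modular one needs ~5 × 10^8, a thousandfold reduction, and that's with just one level of hierarchy. Stack more levels (as biological and economic systems do) and the savings compound.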