Even using 64-bit Excel (which is not limited to 2 GB) on a high-end workstation loaded with RAM, you can't do this.
FYI: 64-bit Excel can address up to 8 terabytes of memory, but that does not mean you can actually use that much memory!
The limits OriginalGriff mentions are for a single worksheet: you can, of course, have multiple worksheets per workbook, and multiple workbooks per project.
Quote:
The limit of virtual address space for 32-bit editions of Windows-based applications is 2 gigabytes (GB). For Excel, this space is shared by the Excel application itself together with any add-ins that run in the same process. The size of the worksheet itself also affects the usage of virtual address space. Because Excel loads the worksheet into addressable memory, some worksheets that have a file size of less than 2 GB may still require Excel to use more than 2 GB of addressable memory.
The 64-bit edition of Office does not impose hard limits on file size. Instead, workbook size is limited only by available memory and system resources.
Even if you could ... you'd most likely have a mess that would take forever to recalculate.
Assuming you don't have the funds to buy a supercomputer, you need another strategy: one that breaks the computational process into steps, each of which works on a subset of the data.
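To make the idea concrete, here is a minimal sketch of that strategy in Python: instead of loading everything at once, the data is processed a fixed-size chunk at a time, keeping only a running result in memory. The column name and chunk size are made up for illustration; the same pattern applies to any per-row aggregate.

```python
import csv
import io

def chunked_sum(lines, column, chunk_size=10_000):
    """Sum one column of a CSV source, processing chunk_size rows at a time.

    Only one chunk is ever held in memory, so the source can be far
    larger than RAM (or than a worksheet's row limit).
    """
    reader = csv.DictReader(lines)
    total = 0.0
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            total += sum(float(r[column]) for r in chunk)
            chunk = []                      # discard the processed subset
    total += sum(float(r[column]) for r in chunk)  # leftover partial chunk
    return total

# Small in-memory stand-in for a multi-gigabyte file.
data = io.StringIO("value\n1\n2\n3\n4\n")
print(chunked_sum(data, "value", chunk_size=2))  # → 10.0
```

The same divide-and-aggregate approach is essentially what Power Query and databases do for you behind the scenes.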
Do investigate Power Pivot and Power Query as possible tools that would let you work with very large data sets in Excel (I have not used either one).
Read the linked articles for other ideas.
And, as OriginalGriff suggested, consider using a database.