Quote:
I'm learning about the time complexity of algorithms. As far as I know, the running time will be longer the larger the input (larger in this situation means more input). But I wonder: does the input number itself matter?
No. Compilation takes place once, when the compiler converts your code into machine code (or into an intermediate form such as IL, which is later translated to machine code at run time) and produces an EXE file that you can run as many times as you wish. Inputs aren't considered then, unless they are in the form of files which are "built" into that EXE file.
Whatever the user is going to type is input, external files are input, databases are input - but they are only a part of the performance of your app while it is running; they have no effect whatsoever on compilation.
Compile time doesn't depend on the size of input values to any significant degree; it is affected only by the size and complexity of the code and the modules the compiler has to build into your application.
Big O notation does not include one-off processes like compilation, because they are not part of the algorithm at all!