I read the following in Algorithms, 4th Edition by Robert Sedgewick and Kevin Wayne:
Our first qualitative observation about most programs is that there is a problem size that characterizes the difficulty of the computational task. Normally, the problem size is either the size of the input or the value of a command-line argument. Intuitively, the running time should increase with problem size, but the question is by how much it increases...
and
Another qualitative observation for many programs is that the running time is relatively insensitive to the input itself; it depends primarily on the problem size. If this relationship does not hold, we need to take steps to better understand and perhaps better control the running time's sensitivity to the input. But it does often hold, so we now focus on the goal of better quantifying the relationship between problem size and running time.
My question is: if the input size (the program's problem size) increases, and therefore the running time increases as well, why would the program's running time be relatively insensitive to the input? I'm confused.
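To show where my confusion lies, here is a minimal sketch of what I think the book might mean (my own example, not from the book; the class name TimingSketch and the timing approach are mine). Summing an array of n doubles always performs n additions, so the time grows with n, but for a fixed n it should be about the same whether the array holds zeros or random values:

public class TimingSketch {
    // Sum n doubles: exactly n additions, regardless of the values stored.
    public static double sum(double[] a) {
        double total = 0.0;
        for (int i = 0; i < a.length; i++) total += a[i];
        return total;
    }

    public static void main(String[] args) {
        int n = Integer.parseInt(args[0]);   // problem size from a command-line argument
        double[] zeros  = new double[n];     // all elements 0.0
        double[] random = new double[n];
        for (int i = 0; i < n; i++) random[i] = Math.random();

        long start = System.nanoTime();
        double s1 = sum(zeros);
        long tZeros = System.nanoTime() - start;

        start = System.nanoTime();
        double s2 = sum(random);
        long tRandom = System.nanoTime() - start;

        // For the same n, the two times should be roughly equal (insensitive
        // to the input itself); doubling n should roughly double both.
        System.out.println("n = " + n + ": zeros " + tZeros + " ns, random "
                + tRandom + " ns (sums: " + s1 + ", " + s2 + ")");
    }
}

If I run this with n = 1000000 and then n = 2000000, I expect both times to roughly double, yet for a fixed n the two arrays take about the same time. Is this the distinction the authors are drawing?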