There is no single correct tool, or one correct sequence of tools, that will provide insight in every situation. No doubt each of us has a bias as to which tool is most effective or efficient for a given problem. Regardless, the answer does not lie in the tool. The key is critical thinking, which requires iteration.
One might start with science- or engineering-based hypotheses (e.g., thermodynamics, entropy). If you have no hypotheses, then perhaps "data mining" can be useful. I use the term quite loosely: basically, looking at the data to see patterns (or the lack thereof). This, of course, is best done graphically.
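A minimal sketch of that kind of graphical look, assuming the historical data are already in a pandas DataFrame (the file name and column names here are purely hypothetical placeholders):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed: historical process data already collected in a CSV file.
# "temp", "pressure", and "yield_pct" are hypothetical column names.
df = pd.read_csv("process_history.csv")

# Scatter-plot matrix: a quick, purely visual scan for patterns
# (or the lack thereof) among the measured variables.
pd.plotting.scatter_matrix(df[["temp", "pressure", "yield_pct"]],
                           diagonal="hist", figsize=(8, 8))
plt.tight_layout()
plt.show()
```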
There are many forms of regression that attempt to quantify relationships (e.g., OLS, stepwise, PCR, ridge, PCA, PLS, etc.). If they prompt the investigator to explain the results (e.g., why does this factor seem significant, or why is that one not?), this can be quite useful. This is the development of hypotheses (perhaps to be evaluated later in designed experiments).
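As one illustration (not the only way to do this), fitting OLS and ridge on the same standardized factors and then asking why one coefficient dominates while others contribute little is where the hypothesis generation happens. The data below are simulated placeholders, and scikit-learn is assumed to be available:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import StandardScaler

# Hypothetical data: X holds four candidate factors, y the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=200)

Xs = StandardScaler().fit_transform(X)  # put factors on a common scale

ols = LinearRegression().fit(Xs, y)
ridge = Ridge(alpha=10.0).fit(Xs, y)

# The point is not the numbers themselves but the question they prompt:
# why does factor 0 dominate, and why do factors 2 and 3 contribute so little?
print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Ridge coefficients:", np.round(ridge.coef_, 2))
```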
Just some words of caution when using historical or observational data:
1. If a factor has been deemed critical at some time in the past (by whatever means), there may be controls on that factor that do not permit it to vary much. A regression analysis might then show the factor to be insignificant, simply because it did not vary much (see the sketch after this list).
2. Hidden confounding may be present (x's that were not identified or labeled in the data set).
3. The ability to model what happened does not mean we can predict what will happen.
4. Extrapolation of the model outside the range of collected x’s, without understanding, could be hazardous.
5. Regression on historical data does not consider the context under which the data were acquired (e.g., do you know the measurement uncertainty?).
6. Results can be easily affected by "bad" data points.
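To make caution 1 concrete, here is a small simulated sketch (all numbers are hypothetical, and statsmodels is assumed): a factor that truly drives the response, but is held in a narrow band by process controls, tends to show up as statistically insignificant simply because it was never allowed to vary.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# x1 truly matters, but controls hold it in a very narrow band.
x1 = rng.normal(loc=100.0, scale=0.02, size=n)   # tightly controlled
x2 = rng.normal(loc=50.0, scale=5.0, size=n)     # allowed to vary freely
y = 2.0 * x1 + 0.3 * x2 + rng.normal(scale=1.0, size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()

# x1's effect is real, but its p-value will typically be large here:
# the restricted range, not the physics, makes it look "insignificant".
print(fit.summary())
```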
"All models are wrong, some are useful" G.E.P. Box