Regarding the question of how to get the "correct" predictive model, there are two schools of thought:
1. develop mathematical models based solely on data analysis (e.g., neural networks, PCA)
2. understand, on a scientific basis, the relationships between the input variables and the output variable (what Deming called the analytic problem)
Being a determinist, I prefer the 2nd approach. This begins with statements of hypotheses about the relationships between inputs and outputs. It requires an understanding of inference (over what conditions do you want the model to be effective?). Then comes the appropriate "sampling plan" to acquire the data (directed sampling or experimentation). Certainly you can examine historical data, but only to help develop hypotheses, which then need to be tested.
Your data set lacks any context (as Dale suggests). There is no "meaning" to the columns, just columns of numbers. I'm not sure why Dale thinks this is a time series, as I see no times or dates in the data. So creating the correct predictive model is left to option 1 above. If you include all of the columns (224) and run Fit Model, you get an RSquare Adj of .59 and an RSquare of .83. These values are far apart, which suggests the model is over-specified (unimportant terms are in the model). If you look at the VIFs (in the Parameter Estimates table), many are above the usual threshold of 5 (or 10); VIF is a measure of multicollinearity. So the model needs to be reduced. However, there is no intelligent way to do this because there is no context.
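If you want to reproduce the same diagnostics outside of JMP, here is a minimal sketch in Python using statsmodels; it assumes the data sit in a CSV with the response in a column named "y" and every other column treated as a predictor (the file name and column name are placeholders, not from your data set):

```python
# Minimal sketch: compare R^2 vs adjusted R^2 and check VIFs for multicollinearity.
# Assumes a CSV "data.csv" with a response column "y" (hypothetical names) and
# all remaining columns used as candidate predictors.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("data.csv")                 # hypothetical file name
y = df["y"]
X = sm.add_constant(df.drop(columns="y"))    # add intercept term

fit = sm.OLS(y, X).fit()
print(f"R-squared:     {fit.rsquared:.2f}")
print(f"Adj R-squared: {fit.rsquared_adj:.2f}")  # a large gap hints at over-specification

# VIF for each predictor (skip the intercept); values above ~5-10 flag multicollinearity
vifs = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])],
    index=X.columns[1:],
)
print(vifs[vifs > 5].sort_values(ascending=False))
```

This only reproduces the diagnostics; it does not tell you which terms to drop, which is exactly the point about missing context.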
"All models are wrong, some are useful" G.E.P. Box