I think what @statpharmer was getting at was letting a data governance plan dictate the 'where, how, when, and what' of the process for managing data.
Generally speaking, I don't recommend storing everything in a single JMP data table, and I'm taking the word 'everything' literally: every single data point, from the beginning of time to the latest data point created, across all workflows coming through the LIMS system, kept in perpetuity.
The general model I saw across my customers was to have a standard process in place for storing, accessing, adding, maintaining, and distributing the data in the database(s) that is NOT tied to any single analysis or reporting application (even JMP). Married to that is some sort of SQL-based connection, for example JMP Query Builder or other internally built applications, for acquiring and distributing the data to analysis end users (see the sketch below). The users then work with the data in JMP for their specific analysis and reporting/sharing purposes. So generally the data in JMP is temporary and tied to a specific work problem. It may sit in JMP in perpetuity, but it's still scoped to a specific business/analysis issue.
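Just to make that acquisition step concrete, here's a bare-bones JSL sketch using Open Database(). The DSN, credentials, table, and column names are all made-up placeholders; your IT group would supply the real connection details:

// Pull only the problem-specific slice of LIMS data into a JMP table.
// "LIMS_PROD", "qc_results", and the column names are hypothetical.
dt = Open Database(
	"DSN=LIMS_PROD;UID=analyst;PWD=****;",  // ODBC connection string from IT
	"SELECT batch_id, assay, result, run_date
	   FROM qc_results
	  WHERE run_date >= '2024-01-01'",      // just the slice this analysis needs
	"QC Results - 2024 Review"              // name of the resulting JMP data table
);

The point being: the query pulls a bounded slice of the data for the problem at hand, and the database of record never gets touched by the analysis layer.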
Generally I found the IT organization was charged by management, from a strategic point of view, with data stewardship with respect to the contents of the database(s). These companies view this data as the 'crown jewels' of their R&D, quality, compliance, and production systems...and they want a foolproof/mistake-proof/defect-free means of managing that data. That's IT's responsibility...not the analysis end users'.
Hopefully, I've not stepped on @statpharmer too badly here? If I have...he'll let me know!