It isn’t necessarily a computer programming problem either. It is, at least in part, an IT problem, one the poster states is the primary job of his ‘lab guy’: maintaining two ancient Windows 95 computers. That person must know enough to keep troubleshooting and replacing the hardware, and certainly enough to transfer data off the aging spinning hard drives. Why not put that technical expertise into actually solving the problem long-term? Why not just run both machines in QEMU and use hardware passthrough if required? At the very least, you would rid yourself of the ticking time bomb of hardware with diminishing availability. That RAM that is no longer manufactured isn’t going to last forever. They don’t even need to know much about how it all works; guides are available, even for Windows 95.
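For the curious, the QEMU route is roughly one command. This is a hedged sketch, not a tested recipe: the image name `win95.img` and the RAM/CPU figures are assumptions (Win95 is happiest with modest RAM and an older emulated chipset), and you’d tune devices to match whatever the software expects.

```shell
# Sketch: boot an imaged Windows 95 disk under QEMU.
# win95.img, 64 MB RAM, and the Pentium CPU model are assumptions;
# the Cirrus VGA has period-appropriate Win95 drivers.
qemu-system-i386 \
  -M pc -cpu pentium -m 64 \
  -hda win95.img \
  -vga cirrus \
  -rtc base=localtime
```

From there it’s the usual Win95 driver dance inside the guest, but the host hardware becomes ordinary, replaceable commodity kit.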
Perhaps there are other hurdles, such as something running on an ISA card, but even so, eventually that isn’t going to matter. The real hurdle seems to be the software and the data it manages. Does it really have some sort of ancient hardware dependency? Maybe. But in all that time, the ‘lab guy’ whose main role is just these two machines must have had some time to experiment and figure this out. The data must be copyable, even as a straight hard drive image if it isn’t stored as flat files (extremely doubtful, but it doesn’t matter). After all, the data is, by the author’s own emphasis, CRITICAL.
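Taking a straight drive image is the boring part. A minimal sketch, run here against a scratch file for safety; in practice the source would be the old drive’s device node (e.g. `/dev/sdb` via a USB-to-IDE adapter, which is an assumption — check `lsblk` first):

```shell
# Demo on a scratch file standing in for the old drive.
SRC=demo-disk.bin
IMG=win95.img

# Create the stand-in "disk" for this demo.
dd if=/dev/urandom of="$SRC" bs=64K count=16 2>/dev/null

# noerror keeps reading past bad sectors; sync pads failed reads
# so byte offsets in the image stay aligned with the source.
dd if="$SRC" of="$IMG" bs=64K conv=noerror,sync 2>/dev/null

# Verify: the two checksums should match if every block read cleanly.
sha256sum "$SRC" "$IMG"
```

The resulting image doubles as both a backup and the virtual disk QEMU boots from, which is exactly why the “hardware on life support” and “critical data” problems collapse into one.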
If it is CRITICAL, then why don’t they give it that priority, even with only the lone ‘lab guy’ acting as IT?
Unless there’s some big edge case that simply isn’t stated, something above and beyond the software they describe, I feel like I’ve put more effort into typing these responses than it would take to solve the hardware-on-life-support side of this. Solving the software dependency side? Depending on how the datasets are logically stored, it may require a software developer, but it also may not. Simply virtualizing the environment would solve many, if not all, of these problems with minimal investment, especially for CRITICAL (their emphasis) data and with ~20 years to have figured it out. All it would take is a new computer, some media to install Linux or *BSD from, and perhaps a COTS converter if the machines use something like an LPT interface or a DE-9 serial port (you can still find modern motherboards, cards, and even laptops that support those, and cheap USB adapters certainly exist as well).
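Even the lab-instrument-on-a-serial-port case is covered: QEMU can hand the guest a real host port. A sketch under stated assumptions — `/dev/ttyUSB0` is wherever a USB serial adapter happens to enumerate, and `win95.img` is a hypothetical disk image:

```shell
# Sketch: pass a real serial port (via a cheap USB adapter) into the guest.
# /dev/ttyUSB0 is an assumption; an LPT device could similarly be passed
# with -parallel /dev/parport0 on a Linux host.
qemu-system-i386 \
  -M pc -cpu pentium -m 64 \
  -hda win95.img \
  -serial /dev/ttyUSB0
```

Inside the guest this just looks like COM1, so period software talking to an instrument shouldn’t know the difference.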
Anyway, I’m going to leave it at that. I think I’ve said a lot on the subject to numerous people and don’t have much more to add, other than that this is most likely solvable and, outside of severe edge cases, solvable without expert knowledge given the timeframe.