At that time (up to 2003), every time we upgraded the tape density, or
added new archival storage, there was a "packing" job in the background.
Every time a file was retrieved and used, if it wasn't on the densest
(newest) tape format, the file was re-saved onto the newest,
higher-density tape.
This meant that active files were constantly copied forward onto the newer
tape formats. Once all the files had been copied off a lower-density tape,
that tape was marked "retired" and removed from the tape silos.
In the background, idle tape drive time was used by a job that retrieved
the oldest tapes and copied the files on them forward.
This was enabled by tape-level metadata that included the batch number and
the date that every physical tape was put into service.
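The repacking logic amounted to something like this (a minimal sketch in
Python; the class and function names are made up for illustration, not
the actual system's code):

    # Illustrative sketch only -- names and structure are hypothetical.
    # The idea: during idle drive time, take the oldest in-service tape
    # that isn't already on the newest generation, copy its live files
    # onto the newest tape format, then retire the emptied tape.

    class Tape:
        def __init__(self, batch, in_service, generation, files):
            self.batch = batch            # manufacturing batch number
            self.in_service = in_service  # date the tape went into the silo
            self.generation = generation  # density generation; higher = newer
            self.files = files            # files still resident on this tape
            self.retired = False

    def repack_oldest(catalog, newest_generation, copy_forward):
        """Migrate the oldest lower-density tape's files to the newest format."""
        candidates = [t for t in catalog
                      if not t.retired and t.generation < newest_generation]
        if not candidates:
            return
        oldest = min(candidates, key=lambda t: t.in_service)
        for f in oldest.files:
            copy_forward(f)        # re-save onto the newest, densest tape
        oldest.files = []
        oldest.retired = True      # mark retired; pull it from the silo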
Now....
Later, at PlayStation, we investigated the Sony ODA. If we had needed the
deep archive, we would definitely have gone with the Sony Petascale
archive. It is optical, but a disc technology that is denser and of a
different formulation than Blu-ray.
Actually, I like this fork. I'm curious, do you know what the best
practice is for keeping bits around these days?
On Fri, Jan 27, 2023 at 01:42:17PM -0800, Tom Perrine wrote:
A tiny bit of a fork, but...
When I was at
SDSC.EDU we did a project for the National Archives. Gotta
love an agency that's mission is "data for the lifetime of the
Republic"...
They wanted to be sure that they could still access data at least 100
years later, even assuming that no one had accessed it in that 100-year
period.
Anyway, we looked at all the options at the time (very early 2000s).
While media lifetime was indeed understood to be critical, we
specifically called out needing to retain the software and the
encryption keys. AND the encryption algorithms!
At that time, media encryption was still quite new, and they hadn't
considered that issue. At all.
Overall, the best, most practical approach (at that time) was to
periodically copy the data forward onto new media and into new storage
software, decrypting with the old keys and algos and re-encrypting with
the new.
Only by doing this periodically, we argued, could they really be sure of
being able to recover data 100+ years from now.
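In modern terms, the crypto half of that copy-forward is basically the
following (a toy Python sketch using the cryptography package's Fernet;
the keys and function name are hypothetical, not anything the Archives
actually ran):

    # Toy illustration of decrypt-with-old / re-encrypt-with-new during a
    # periodic copy-forward.  Keys and names are hypothetical.
    from cryptography.fernet import Fernet

    def migrate_record(old_ciphertext, old_key, new_key):
        """Decrypt with the retired key, re-encrypt with the current one."""
        plaintext = Fernet(old_key).decrypt(old_ciphertext)
        return Fernet(new_key).encrypt(plaintext)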
Don't get me started on the degradation of early generation optical media
that was guaranteed for 50 years, but rusted internally within 2 years.
And of course now there are companies that specialize in providing
mothballed obsolete tape and other readers.
--tep
On Fri, Jan 27, 2023 at 6:55 AM Ron Natalie <ron(a)ronnatalie.com> wrote:
> When I worked in the intelligence industry, the government spent a lot
> of money tasking someone (I think it was Kodak) to determine the best
> media for archival storage. It included traditional 6250 9-track
> tapes and the then-popular Exabyte 8mm (which was atrociously
> short-lived). I pointed out that magnetic storage was probably always
> going to be problematic and things needed "digital refresh" if you
> really wanted to keep them.
>
>
> If you know the tape may be problematic when played back, there are
> things you can do. I was gifted the master tapes of one of the radio
> shows originated at WJHU in the '70s. I had them sent out to a company
> who "baked" them, but then they also had to redo all the splices on
> them when they were played back.
>
--
---
Larry McVoy Retired to fishing
http://www.mcvoy.com/lm/boat