Moving 40TB of Assets from B to A - nightly!
This talk shows approaches and considerations for migrating from a legacy asset system ("B-estand") to an A-EM Assets instance. It covers the available ingestion methods, calculations of the "theoretical lowest runtime" for data at this scale, how Adobe I/O can be useful in this picture, the limits of ACS Commons MCP for very long-running processes in the cloud, and the best repository structures for large reports.
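The "theoretical lowest runtime" mentioned above can be approximated as pure transfer time: total bytes divided by available bandwidth. The sketch below illustrates the idea; the 10 Gbit/s figure is an assumed example link speed, not a number from the talk.

```python
def theoretical_lowest_runtime_hours(total_bytes, bandwidth_bytes_per_s):
    """Lower bound for a migration: pure transfer time at full bandwidth,
    ignoring per-file overhead, processing and retries."""
    return total_bytes / bandwidth_bytes_per_s / 3600

TB = 10**12
# Assumption: a 10 Gbit/s link, i.e. ~1.25 GB/s in the ideal case
hours = theoretical_lowest_runtime_hours(40 * TB, 1.25 * 10**9)
print(f"{hours:.1f} h")  # -> 8.9 h
```

In practice the real runtime sits well above this bound, but the bound is useful as a sanity check for whether a nightly migration window is even feasible.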
The second part of this talk discusses how we used Dynamic Media with Open API (also known under the code name "Polaris") to make assets available to third-party systems in a flexible and efficient way. This covers authoring with the Asset Selector SDK, the search API, and the delivery capabilities, which include video streaming and image transformations.
Georg Henzler
I haven't explicitly tested it, but overall the time was mostly driven by the total byte count - so 40TB made up of 1 million smaller files would have had a similar runtime.
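The claim above can be sketched with a simple model: runtime is transfer time plus a fixed per-file cost, so for the same total byte count the file count only adds a modest overhead term. All figures below (bandwidth, per-file overhead, file counts) are illustrative assumptions, not measurements from the migration.

```python
def estimated_runtime_hours(total_bytes, file_count,
                            bandwidth_bytes_per_s, per_file_overhead_s):
    """Toy model: byte transfer time plus a fixed handling cost per file.
    With small per-file overhead, total bytes dominate the runtime."""
    transfer_s = total_bytes / bandwidth_bytes_per_s
    overhead_s = file_count * per_file_overhead_s
    return (transfer_s + overhead_s) / 3600

TB = 10**12
# Same 40TB, once as ~200k large files and once as 1 million smaller files,
# assuming ~1.25 GB/s bandwidth and ~10ms handling overhead per file
few_large = estimated_runtime_hours(40 * TB, 200_000, 1.25 * 10**9, 0.01)
many_small = estimated_runtime_hours(40 * TB, 1_000_000, 1.25 * 10**9, 0.01)
print(f"{few_large:.1f} h vs {many_small:.1f} h")  # -> 9.4 h vs 11.7 h
```

Under these assumed numbers, a 5x difference in file count changes the estimate by only a couple of hours, which matches the observation that byte count was the dominant factor.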
nschmuter
In the description of the talk they mention: "There will also be a small and very light-weight framework available on GitHub to demonstrate the patterns used." So I would expect that to be shared.
Davorin
Do you know if Adobe uses the Azure Import/Export service in case we want to migrate more than 40TB, so we could ship disks to them?
Georg Henzler
We haven't tried it - we wanted to avoid manual steps ;-)
Nilesh Mali
Will it also move user information, e.g. which user uploaded the image?
Georg Henzler
We migrated a rich set of metadata, including some metadata from the old system that we migrated purely for analysis/troubleshooting reasons and that we will delete at some point.