A simple, straightforward way to export and import an AWS DynamoDB table's data with the AWS CLI and a few scripts.
First, export all the data from the AWS DynamoDB table:
```shell
aws --profile production dynamodb scan --table-name tile-event > tile-event-export.json
```
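A quick sanity check on the export can catch a truncated scan before the later steps run. Here is a minimal sketch using a made-up two-item file in the same shape `scan` returns (the filename and item are hypothetical):

```shell
# Hypothetical miniature scan result, in the shape the export command produces.
cat > sample-scan.json <<'EOF'
{"Items": [{"id": {"S": "a"}}, {"id": {"S": "b"}}], "Count": 2, "ScannedCount": 2}
EOF

# Count should equal the number of exported items.
jq '.Count == (.Items | length)' sample-scan.json
# → true
```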
Next, convert the list of items/records (DynamoDB JSON) into individual PutRequest objects with jq:
```shell
cat tile-event-export.json | jq '{"Items": [.Items[] | {PutRequest: {Item: .}}]}' > tile-event-import.json
```
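To see what this transform produces, here is the same jq filter applied to a made-up two-item export (the filename and item contents are hypothetical):

```shell
# Hypothetical two-item export in DynamoDB JSON (attribute-type wrappers).
cat > sample-export.json <<'EOF'
{"Items": [{"id": {"S": "a"}}, {"id": {"S": "b"}}], "Count": 2}
EOF

# Each scanned item becomes the Item of a PutRequest, the element type
# that batch-write-item accepts.
jq -c '{"Items": [.Items[] | {PutRequest: {Item: .}}]}' sample-export.json
# → {"Items":[{"PutRequest":{"Item":{"id":{"S":"a"}}}},{"PutRequest":{"Item":{"id":{"S":"b"}}}}]}
```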
Transform the data if necessary:
```shell
sed 's/tile-images-prod/tile-images-pdev/g' tile-event-import.json > tile-event-import-transformed.json
```
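For example, if an item references the production bucket name, the substitution rewrites it in place (the attribute name and path here are made up):

```shell
# Hypothetical attribute pointing at the production bucket.
echo '{"bucket": {"S": "tile-images-prod/abc.png"}}' |
  sed 's/tile-images-prod/tile-images-pdev/g'
# → {"bucket": {"S": "tile-images-pdev/abc.png"}}
```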
Split all requests into 25 requests per file, with jq and awk.
(Note: there are some restrictions on the AWS DynamoDB batch-write-item request: a single BatchWriteItem operation can contain up to 25 individual PutItem and DeleteItem requests and can write up to 16 MB of data, and the maximum size of an individual item is 400 KB.)
```shell
cat tile-event-processed.awk
```
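The awk script itself isn't reproduced above, so here is one possible sketch of the split step. It assumes the transformed file has the `{"Items": [...]}` shape produced earlier, and it writes `tile-event-import-processed-N.json` files of at most 25 PutRequests each, wrapped in the request-items map keyed by table name that `batch-write-item` expects:

```shell
# Build a hypothetical 60-item input in the transformed shape (a stand-in for
# the tile-event-import-transformed.json produced by the earlier steps).
jq -n '{Items: [range(60) | {PutRequest: {Item: {id: {N: tostring}}}}]}' \
  > tile-event-import-transformed.json

# Stream one PutRequest per line, then cut the stream into files of 25,
# each wrapped as a batch-write-item request-items document.
jq -c '.Items[]' tile-event-import-transformed.json |
  awk '
    (NR - 1) % 25 == 0 {                 # every 25 requests, start a new file
      if (out) { print "]}" > out; close(out) }
      out = sprintf("tile-event-import-processed-%d.json", ++n)
      printf "{\"tile-event\": [" > out
      sep = ""
    }
    { printf "%s%s", sep, $0 > out; sep = "," }
    END { if (out) print "]}" > out }
  '
```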
Finally, import all 22 processed JSON files into the DynamoDB table:
```shell
# Feed each 25-request file to batch-write-item (loop body assumes each file
# holds a request-items map keyed by the table name).
for f in tile-event-import-processed-{1..22}.json; do
  aws --profile production dynamodb batch-write-item --request-items "file://$f"
done
```
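One caveat worth flagging: `batch-write-item` can partially fail and return the rejected requests in `UnprocessedItems`, which the loop above silently ignores. Here is a minimal sketch of checking a saved response for leftovers (the response file and its contents are made up):

```shell
# Hypothetical saved output of one batch-write-item call.
cat > response-1.json <<'EOF'
{"UnprocessedItems": {"tile-event": [{"PutRequest": {"Item": {"id": {"S": "a"}}}}]}}
EOF

# Anything non-zero here means those requests were throttled and need a retry.
jq '.UnprocessedItems | map(length) | add // 0' response-1.json
# → 1
```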