Data tokenization with transform secrets engine
Vault Enterprise feature
The transform secrets engine requires a Vault Enterprise Advanced Data Protection (ADP) license.
What is data tokenization?
Data tokenization replaces sensitive data with unique values (tokens) that are unrelated to the original value in any algorithmic sense. Therefore, the tokens do not risk exposing the plaintext, which satisfies the PCI-DSS guidance.
Vault's transform secrets engine has a data transformation method to tokenize sensitive data stored outside of Vault.
Tokenization versus FPE
When encrypting sensitive data, preservation of the original data format or length may be required to meet certain industry standards such as HIPAA or PCI. One method to fulfill this requirement is to use format preserving encryption (FPE).
However, some organizations care more about the irreversibility of the tokenized data than about preserving the original data format. For them, the transform secrets engine's FPE transformation may not meet their governance, risk, and compliance (GRC) strategy because FPE uses reversible cryptography.
Characteristics of the tokenization transformation:
- Non-reversible identification: Protect data pursuant to requirements for data irreversibility (PCI-DSS, GDPR, etc.)
- Integrated metadata: Supports metadata for identifying data type and purpose
- Extreme scale and performance: Support for performantly managing billions of tokens across clouds as well as on-premises
Prerequisites
To perform the tasks described in this tutorial, you need the following:
- Access to a Vault Enterprise license with the ADP module to run Vault in dev mode, or an HCP Vault Dedicated Plus tier cluster
- Vault binary installed
- Docker installed
- jq installed
Policy requirements
For the purpose of this tutorial, you can use the root token to work with Vault. However, it is recommended that root tokens are used only for just enough initial setup or in emergencies. As a best practice, use tokens with an appropriate set of policies based on your role in the organization.
To perform all tasks demonstrated in this tutorial, your policy must include the following permissions:
# Work with transform secrets engine
path "transform/*" {
capabilities = [ "create", "read", "update", "delete", "list" ]
}
# Enable secrets engine
path "sys/mounts/*" {
capabilities = [ "create", "read", "update", "delete", "list" ]
}
# List enabled secrets engine
path "sys/mounts" {
capabilities = [ "read", "list" ]
}
If you are not familiar with policies, refer to the policies tutorial.
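For example, one minimal way to apply such a policy (a sketch only; the transform-tutorial policy name and file name are placeholders, not part of the tutorial) is to save the policy above to a file, write it to Vault, and create a token that uses it:
$ vault policy write transform-tutorial transform-tutorial.hcl
$ vault token create -policy=transform-tutorial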
Lab setup
Open a terminal and export an environment variable with a valid Vault Enterprise license.
$ export VAULT_LICENSE=02MV4UU43BK5....
Start Vault Enterprise in a container.
$ docker run --name vault-enterprise \
    --cap-add=IPC_LOCK \
    --env VAULT_LICENSE=$(echo $VAULT_LICENSE) \
    --env VAULT_DEV_ROOT_TOKEN_ID=root \
    --env VAULT_DEV_LISTEN_ADDRESS=0.0.0.0:8200 \
    --publish 8200:8200 \
    --detach \
    --rm \
    hashicorp/vault-enterprise
The Vault dev server listens on all addresses using port 8200. The server is initialized and unsealed.

Insecure operation
Do not run a Vault dev server in production. This approach starts a Vault server with an in-memory database and runs in an insecure way.
Export an environment variable for the vault CLI to address the Vault server.
$ export VAULT_ADDR=http://127.0.0.1:8200
Export an environment variable for the vault CLI to authenticate with the Vault server.
$ export VAULT_TOKEN=root
Note
For these tasks, you can use Vault's root token. However, it is recommended that root tokens are used only for just enough initial setup or in emergencies. As a best practice, use an authentication method or token that meets the policy requirements.
The Vault server is ready to proceed with the tutorial.
Set up the transform secrets engine
Create a role named mobile-pay which is attached to a transformation named credit-card. The tokenized value will have a fixed maximum time-to-live (TTL) of 24 hours.
Enable the transform secrets engine at transform/.
$ vault secrets enable transform
Success! Enabled the transform secrets engine at: transform/
Create a role named mobile-pay with a transformation named credit-card.
$ vault write transform/role/mobile-pay transformations=credit-card
Success! Data written to: transform/role/mobile-pay
Create a transformation named credit-card which caps the generated token's time-to-live (TTL) at 24 hours.
$ vault write transform/transformations/tokenization/credit-card \
    allowed_roles=mobile-pay \
    max_ttl=24h
Example output:
Success! Data written to: transform/transformations/tokenization/credit-card
The max_ttl is an optional parameter which allows you to control how long the token should stay valid.

You can set the allowed_roles parameter to a wildcard (*) to allow all roles, or use globs at the end for pattern matching (e.g. mobile-*).

Display details about the credit-card transformation.
$ vault read transform/transformations/tokenization/credit-card
Key                 Value
---                 -----
allowed_roles       [mobile-pay]
deletion_allowed    false
mapping_mode        default
max_ttl             24h
stores              [builtin/internal]
templates           <nil>
type                tokenization
The type is set to tokenization.
Tokenize secrets
Vault client applications must have the following in their policy to perform tokenization transformations using the transform secrets engine enabled at transform/.
Required Client Policy
# To request data encoding using any of the roles
# Specify the role name in the path to narrow down the scope
path "transform/encode/mobile-pay" {
capabilities = [ "update" ]
}
# To request data decoding using any of the roles
# Specify the role name in the path to narrow down the scope
path "transform/decode/mobile-pay" {
capabilities = [ "update" ]
}
# To validate the token
path "transform/validate/mobile-pay" {
capabilities = [ "update" ]
}
# To retrieve the metadata belonging to the token
path "transform/metadata/mobile-pay" {
capabilities = [ "update" ]
}
# To check and see if the secret is tokenized
path "transform/tokenized/mobile-pay" {
capabilities = [ "update" ]
}
Encode data using the mobile-pay role and store it in a variable.
$ TOKEN_VALUE=$(vault write transform/encode/mobile-pay value=1111-2222-3333-4444 \
    transformation=credit-card \
    ttl=8h \
    metadata="Organization=HashiCorp" \
    metadata="Purpose=Travel" \
    metadata="Type=AMEX" -format=json | jq -r '.data | .encoded_value') \
    && echo encoded_value: $TOKEN_VALUE
The ttl value is an optional parameter. Remember that the max_ttl was set to 24 hours when you created the credit-card transformation. You can override that value to make the token's TTL shorter. In addition, you can set optional metadata about the data.

Example output:
encoded_value: Q4tYgFXHxUbU5XZoQsxusAzxhNyVWmCFWxAT8SGkHoYB3VkrmQGXVR
Retrieve the metadata of the token.
$ vault write transform/metadata/mobile-pay value=$TOKEN_VALUE transformation=credit-card
Key                Value
---                -----
expiration_time    2021-03-13 06:49:58.041608 +0000 UTC
metadata           map[Organization:HashiCorp Purpose:Travel Type:AMEX]
Notice the expiration_time value. Since you set the ttl parameter to 8 hours when encoding the value, the token expires 8 hours after creation rather than after the 24-hour max_ttl.

Validate the token value.
$ vault write transform/validate/mobile-pay value=$TOKEN_VALUE transformation=credit-card
Key      Value
---      -----
valid    true
Validate that the credit card number has been tokenized already.
$ vault write transform/tokenized/mobile-pay value=1111-2222-3333-4444 transformation=credit-card
Key          Value
---          -----
tokenized    true
Retrieve the original plaintext credit card value.
$ vault write transform/decode/mobile-pay transformation=credit-card value=$TOKEN_VALUE
Key              Value
---              -----
decoded_value    1111-2222-3333-4444
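Client applications often call these endpoints over the Vault HTTP API rather than the CLI. As a minimal sketch using the same role and transformation as above (and the VAULT_ADDR and VAULT_TOKEN environment variables from the lab setup), the equivalent encode and decode requests look like this:
$ curl --header "X-Vault-Token: $VAULT_TOKEN" \
    --request POST \
    --data '{"value": "1111-2222-3333-4444", "transformation": "credit-card"}' \
    $VAULT_ADDR/v1/transform/encode/mobile-pay | jq -r '.data.encoded_value'

$ curl --header "X-Vault-Token: $VAULT_TOKEN" \
    --request POST \
    --data "{\"value\": \"$TOKEN_VALUE\", \"transformation\": \"credit-card\"}" \
    $VAULT_ADDR/v1/transform/decode/mobile-pay | jq -r '.data.decoded_value'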
Convergent tokenization
Requirement
This feature requires Vault 1.11.0 or later.
If you encode the same value multiple times, it returns a different encoded value each time.
Example:
$ vault write transform/encode/mobile-pay value=5555-6666-7777-8888 transformation=credit-card
Key Value
--- -----
encoded_value Q4tYgFXHxURKc5zaVDRhi3QT35htvADKtnBpBazpoBZPG373FP1mXs
$ vault write transform/encode/mobile-pay value=5555-6666-7777-8888 transformation=credit-card
Key Value
--- -----
encoded_value Q4tYgFXHxUVpMnf4SSVQZWm1YhAG2eSGvS3dBeDNmm7fkPLnw3JV54
In some use cases, you may want to have the same encoded value for a given input so that you can query your database to count the number of entries for a given secret.
Key derivation is supported to allow the same key to be used for multiple purposes by deriving a new key based on a user-supplied context value. In this mode, convergent encryption can optionally be supported, which allows the same input values to produce the same ciphertext.
Update the mobile-pay role to add a transformation named credit-card-convergent.
$ vault write transform/role/mobile-pay transformations="credit-card,credit-card-convergent"
Success! Data written to: transform/role/mobile-pay
Create a transformation named credit-card-convergent which enables convergent encryption. When you define the transformation, set convergent=true.
$ vault write transform/transformations/tokenization/credit-card-convergent \
    allowed_roles="*" \
    convergent=true
Example output:
Success! Data written to: transform/transformations/tokenization/credit-card-convergent
Encode a value using the credit-card-convergent transformation.
$ vault write transform/encode/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent
Example output:
Key              Value
---              -----
encoded_value    DaCJhefr1oRUoY1vCtgJ8HixeazFhWHzFsNoNGeDWnyGGe76Xp8Er4UyW76xiPyQ2Eh2jGztnb3x
Run the command again.
$ vault write transform/encode/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent
It returns the same encoded value.

Example output:
Key              Value
---              -----
encoded_value    DaCJhefr1oRUoY1vCtgJ8HixeazFhWHzFsNoNGeDWnyGGe76Xp8Er4UyW76xiPyQ2Eh2jGztnb3x
Lookup token
When the transformation is configured with convergent encryption, you can look up the tokenized value (token).
Encode the value using the credit-card-convergent transformation with a time-to-live (TTL) of 8 hours.
$ vault write transform/encode/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent ttl=8h
Example output:
Key              Value
---              -----
encoded_value    MQEgMY8jwXYJNb9p177dYCttwRUcctAeo89yVmMPiLotzfe3sx2btpBGNLijo2KA9cy4hAnTfeV78D5rJwyBdC5j6tMEZgH
The encoded value (token) is longer than the one without a TTL.
Look up the token for card number "5555-6666-7777-8888".
$ vault write transform/tokens/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent
Example output:
Key       Value
---       -----
tokens    [DaCJhefr1oRUoY1vCtgJ8HixeazFhWHzFsNoNGeDWnyGGe76Xp8Er4UyW76xiPyQ2Eh2jGztnb3x]
Look up with expiration of "any".
$ vault write transform/tokens/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent expiration="any"
Example output:
Key       Value
---       -----
tokens    [DaCJhefr1oRUoY1vCtgJ8HixeazFhWHzFsNoNGeDWnyGGe76Xp8Er4UyW76xiPyQ2Eh2jGztnb3x MQEgMY8jwXYJNb9p177dYCttwRUcctAeo89yVmMPiLotzfe3sx2btpBGNLijo2KA9cy4hAnTfeV78D5rJwyBdC5j6tMEZgH]
This returns two space-separated tokens. In the absence of the expiration parameter, the command returns only tokens with no expiration. When the expiration is set to "any", it returns tokens with any expiration.

Look up tokens that have an expiration within a given range using min_expiration and max_expiration, which are RFC3339 formatted date and time values.
$ vault write transform/tokens/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent \
    min_expiration=$(echo $(date -v-60M "+%Y-%m-%dT%H:%M:%S+00:00")) \
    max_expiration=$(echo $(date -v+10d "+%Y-%m-%dT%H:%M:%S+00:00"))
Example output:
Key       Value
---       -----
tokens    [MQEgMY8jwXYJNb9p177dYCttwRUcctAeo89yVmMPiLotzfe3sx2btpBGNLijo2KA9cy4hAnTfeV78D5rJwyBdC5j6tMEZgH]
Any token that expires within the provided date range is displayed.
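The date commands above use BSD/macOS syntax (the -v flag). On Linux with GNU coreutils, an equivalent sketch of the same lookup would be:
$ vault write transform/tokens/mobile-pay value=5555-6666-7777-8888 \
    transformation=credit-card-convergent \
    min_expiration=$(date -u -d "60 minutes ago" "+%Y-%m-%dT%H:%M:%S+00:00") \
    max_expiration=$(date -u -d "10 days" "+%Y-%m-%dT%H:%M:%S+00:00")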
Key rotation
Note
The automatic key rotation requires Vault Enterprise 1.12.0 or later.
Rotating keys regularly limits the amount of data protected by a single key, which reduces the impact if that key ever becomes compromised. In this section, you are going to enable automatic key rotation for your tokenization keys.
Read the key information for the credit-card transformation.
$ vault read transform/tokenization/keys/credit-card
Key                       Value
---                       -----
auto_rotate_period        0s
latest_version            1
min_available_version     0
min_decryption_version    1
name                      credit-card
type                      aes256-gcm96
Notice that the latest_version is 1.

Rotate the key for the credit-card transformation.
$ vault write -force transform/tokenization/keys/credit-card/rotate
Success! Data written to: transform/tokenization/keys/credit-card/rotate
Read the key information again.
$ vault read transform/tokenization/keys/credit-card
Key                       Value
---                       -----
auto_rotate_period        0s
latest_version            2
min_available_version     0
min_decryption_version    1
name                      credit-card
type                      aes256-gcm96
The latest_version is now 2.

Configure the key to be automatically rotated every 90 days to reduce operational overhead.
Note
The minimum permitted value for the auto_rotate_period is 1 hour.
$ vault write transform/tokenization/keys/credit-card/config \
    auto_rotate_period=90d
Example output:
Success! Data written to: transform/tokenization/keys/credit-card/config
Verify the configuration.
$ vault read transform/tokenization/keys/credit-card
Example output:
Key                       Value
---                       -----
auto_rotate_period        2160h
latest_version            2
min_available_version     0
min_decryption_version    1
name                      credit-card
type                      aes256-gcm96
If the key becomes compromised, you can rotate the key using the transform/tokenization/keys/<transformation_name>/rotate endpoint, and then set the min_decryption_version to the latest key version so that the older (possibly compromised) key will not be able to decrypt the data.
Because the minimum rotation period you can set is 1 hour, you will need to come back later to see that the key is rotated.
Set the rotation period to 1 hour.
$ vault write transform/tokenization/keys/credit-card/config \
    auto_rotate_period=1h
Encode another value for testing.
$ export TOKEN_VALUE_2=$(vault write -format=json transform/encode/mobile-pay \
    value=1234-5678-9012-3456 transformation=credit-card ttl=8h \
    | jq -r ".data.encoded_value")
The value 1234-5678-9012-3456 is encoded with version 2 of the credit-card key, and the returned encoded value is stored in the TOKEN_VALUE_2 environment variable.

You can make sure that the environment variable holds the encoded value.
$ echo $TOKEN_VALUE_2
eRwUjS2L9e7fjTv7gFa3Kaf1LakuWc387XdECkbWBBLu6AiMCMWyG8mr1cPBkrSj6keC4DV69
Wait for at least 1 hour, and then verify that the key has been rotated.
$ vault read transform/tokenization/keys/credit-card
Key                       Value
---                       -----
auto_rotate_period        1h
latest_version            3
min_available_version     0
min_decryption_version    1
name                      credit-card
type                      aes256-gcm96
You can test that the data encoded with version 1 of the key can still be decoded because min_decryption_version is 1.
$ vault write transform/decode/mobile-pay transformation=credit-card value=$TOKEN_VALUE
Also, you should be able to decode TOKEN_VALUE_2.
$ vault write transform/decode/mobile-pay transformation=credit-card value=$TOKEN_VALUE_2
If you change the min_decryption_version to 2, you will be able to decode TOKEN_VALUE_2 but not TOKEN_VALUE.
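For example, a minimal sketch of raising the minimum decryption version (assuming your Vault version accepts the min_decryption_version parameter on the tokenization key config endpoint; verify against the transform API documentation first):
$ vault write transform/tokenization/keys/credit-card/config \
    min_decryption_version=2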
Set up external token storage
Unlike the format preserving encryption (FPE) transformation, tokenization is a stateful procedure: the mapping between tokens and various cryptographic values (a one-way HMAC of the token, encrypted metadata, etc.), including the encrypted plaintext itself, must be persisted.
At scale, this could put a lot of additional load on Vault's storage backend. To avoid this, you have the option to use external storage to persist data for the tokenization transformation.
External storage
Currently, PostgreSQL, MySQL, and MSSQL are supported as external storage for tokenization.
To demonstrate, you will run a PostgreSQL database in a Docker container, and then create a new transformation named "passport" which uses PostgreSQL as its storage rather than Vault's storage backend.
Run the PostgreSQL Docker image in a container which listens on port 5432, and sets the superuser (root) password to rootpassword.
$ docker run --name postgres -e POSTGRES_USER=root \
    -e POSTGRES_PASSWORD=rootpassword \
    -d -p 5432:5432 postgres
If the image is not present locally, Docker will pull the latest image.
Unable to find image 'postgres:latest' locally
latest: Pulling from library/postgres
52d2b7f179e3: Pull complete
d9c06b35c8a5: Pull complete
ec0d4c36c7f4: Pull complete
aa8e32a16a69: Pull complete
8950a67e90d4: Pull complete
1b47429b7c5f: Pull complete
a773f7da97bb: Pull complete
7bddc9bbcf13: Pull complete
60829730fa39: Pull complete
f3d9c845d2f3: Pull complete
cfcd43fe346d: Pull complete
576335d55cdb: Pull complete
caad4144446c: Pull complete
Digest: sha256:a5e89e5f2679863bedef929c4a7ec5d1a2cb3c045f13b47680d86f8701144ed7
Status: Downloaded newer image for postgres:latest
229201f2d39c241fb07ea6a6e6c18aa8233130341558a6998312f9fd808bf3b3
Verify that the postgres container is running.
$ docker ps | grep postgres
CONTAINER ID   IMAGE      ...   PORTS                    NAMES
befcf913da91   postgres   ...   0.0.0.0:5432->5432/tcp   postgres
Create a new role, global-id.
$ vault write transform/role/global-id transformations=passport
Success! Data written to: transform/role/global-id
Create a store which points to the postgres container.
$ vault write transform/stores/postgres \
    type=sql \
    driver=postgres \
    supported_transformations=tokenization \
    connection_string="postgresql://{{username}}:{{password}}@$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' postgres):5432/root?sslmode=disable" \
    username=root \
    password=rootpassword
Example output:
Success! Data written to: transform/stores/postgres
Create a schema in postgres to store tokenization artifacts.
$ vault write transform/stores/postgres/schema transformation_type=tokenization \
    username=root password=rootpassword
Example output:
Success! Data written to: transform/stores/postgres/schema
Create a new transformation named "passport" which points to the postgres store.
$ vault write transform/transformations/tokenization/passport \
    allowed_roles=global-id stores=postgres
Example output:
Success! Data written to: transform/transformations/tokenization/passport
Open another terminal and connect to the postgres container.
$ docker exec -it postgres bash
Start psql.
$ psql -U root
Verify that there are no entries.
$ select * from tokens;
 storage_token | key_version | ciphertext | encrypted_metadata | fingerprint | expiration_time
---------------+-------------+------------+--------------------+-------------+-----------------
(0 rows)
Return to the terminal where you were running the Vault CLI, and encode some test data.
$ vault write transform/encode/global-id \
    transformation=passport \
    value="123456789"
Example output:
Key              Value
---              -----
encoded_value    Q4tYgFXHxUS3PnQLiUnyH2JfGeEZQDFXMMaFXLU6MZfiix1tjqwgNX
Return to the postgres container, and check the data entry.
$ select * from tokens;
      storage_token       | key_version |        ciphertext         | encrypted_metadata | ...
--------------------------+-------------+---------------------------+--------------------+-...
 \x128aa3c24699...snip... |           1 | \x1ee7cc3505e31...snip... |                    | ...
(1 row)
As you encode more data, the number of entries in the table grows.
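As an illustration (not part of the original steps), you can confirm the growth from the same psql session with a simple row count; at this point it should report one row for the value you just encoded.
$ select count(*) from tokens;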
Enter \q to quit the psql session.

Enter exit to exit out of the Docker container.
Bring your own key (BYOK)
Note
This section is provided to help you understand the process of generating a key outside of Vault and importing it without sending the plaintext key material to Vault. A complete end-to-end scenario cannot be replicated in this tutorial.
When your use case requires an external key, users of Vault version 1.12.0 or greater can use BYOK functionality to import an existing encryption key that was generated outside Vault.
The target key for import can originate from an HSM or other external source, and must be prepared according to its origin before you can import it.
Note
Tokenization transformations with imported keys do not currently support convergent tokenization.
The example shown here will use a 256-bit AES key, referred to as the target key. To successfully import the target key, you must perform the following operations to prepare it.
1. Generate an ephemeral 256-bit AES key.
2. Wrap the target key using the ephemeral AES key with AES-KWP.
3. Wrap the AES key under the Vault wrapping key using RSAES-OAEP with MGF1 and either SHA-1, SHA-224, SHA-256, SHA-384, or SHA-512.
4. Delete the ephemeral AES key.
5. Append the wrapped target key to the wrapped AES key.
6. Base64 encode the result.
A specific code example for preparing and wrapping the key for import is beyond the scope of this tutorial. For more details about wrapping the key for import including instructions for wrapping key from an HSM, refer to the key wrapping guide.
Before you can wrap the key for import, you must read the wrapping key from Vault so that it can be used to prepare your key.
$ vault read -field=public_key transform/wrapping_key
-----BEGIN PUBLIC KEY-----
MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAttRvwFiW9kPtTutUeC3s
Q3PXXrCOuHrUlAqJxH/nNj7HpuERSJg+HW70b16YCU7mo+lVhpD+bMW3WVzov3v3
jXu6LSZYPZjlO8cEUtQQfydpWVC98cxjn8WCKrIeI+J211/xQQIsOT8sF9MYe11G
Pz/VGdEj+BKel7HWWD1nJ4i0Uo10iN76r0E5GioN8E5wK8L/11k/puEB3EcWhyPp
6aQ8PUbFqU8cmChd2Jg+Cj0k81sp72vOPkDJ7oAekYeR/iIYUjUINlz2Yob/qt25
AuF1t+TSLcI+RHzTetpL6/oHnbKdyM95MNq4XlELun4x2xOa4Q2EVkczwnb+X1kO
0d3FL2dAmMVxCXQnMdd/e2rJx8HxQZX02R4IfojmvfD5x/yICYrqm6bmv9RzTCxE
myQMVUYZsEfXgCKffW0vmEZOFIKvLkI5iaaHqMPk3SCGpMnniybEFt7dmXAnswpE
mtxf+5/gm+aTBOUmEWy0WYb+hHK0RXnUC9ScRWBd+eOMONEm+N7nUUS4e57+EnjH
OrjshTEU/B2BO8DvfYyuYyFzBiWzCJywj2eOaQxYdg8Y05YgDrE1GVEOceErDxT8
eoP/MQgUvVOAsD1KIAT5WmmNOCZ0HHAHjOKDdZ1FWtasNBjGJI7g5a1oCHuwl1p/
CZ2i8Rd7XOEnWZ+ld7l+qS0CAwEAAQ==
-----END PUBLIC KEY-----
The output is the (4096-bit RSA) wrapping key.
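If you plan to script the wrapping steps, it can help to save the wrapping key to a local file first (the wrapping_key.pem file name below is only an example, not something the tutorial relies on):
$ vault read -field=public_key transform/wrapping_key > wrapping_key.pem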
Use the wrapping key value in step 3 of the previously detailed preparation steps. Once you have prepared and base64 encoded the ciphertext, export the value to the environment variable IMPORT_CIPHERTEXT.
Example:
$ export IMPORT_CIPHERTEXT=ceXhQrVMuf70i2qL3DvQu/0AFhkPXAV6JyzbPdTs9A/Twjd8PGs/2XV3VhBgvhb4Fr1xWnVmIUKwxgP+emBlIqwpmoJsnkVNJSMpXP0YG+MkvheB9ATlfGXTlf6RLt7OaOtSSBxeVZQBtuWuVnbatTQiXhhC91J49V4+n1JiDjs0tRpz8hUxuzyedkjXWv8Mn0gD4nHV1GgrxLvNGPrrk2Y3xcZO4MoO3Lp447BjPTXwSwmR2rSOeW9+MsuYPfjx2dKC/uJRr6GM2wyyzaRu+5N+goNDaPzTAcNhutb/isf9wAOC3v8SGYXkF9919/ioxZKejWDscFLKBU/6O3+n7Zgf+NLJP/5qyjFOR6F/OnyXpEaOcD+5zpW0yppo/SqThS14E3jDIDmvwxiDBY0SHMBebLYx/t6Y29P+kAP3dPBXeHejHTZ0kJlQuKNcu2Ge1EkTFvR4vScvFlG4OcZIDzxwVxbS6IR83m/xeFyQhuMc/PntDvIJBnUb+AREtZCtNOcwh2Qei8iHfSttCNzROLRZ4ETaBUEKBBWKJztscDfKFOAGKiJIZeze+nVYK7sf94PMJF1r3ET75IyvwCXUEQoPtacg8nyUx6UVpp4BP6xzeOcEPKr1GPEj1kjcj43+rhozzzrs5AgTknQAcorpUwHFUezQtKWyEh/wastp6elX7V0IBL8YmKmWmbIz2hS+uH9TspQM9BnJL/ex18qgmpwIhfR40hzo
Create a new transformation role named legacy-system to use for the transformation that you will import the key into.
$ vault write transform/role/legacy-system transformations=application-form
Success! Data written to: transform/role/legacy-system
Import the key into the application-form transformation. Imported keys do not support rotation by default, so include the allow_rotation parameter and set its value to true so that you can also try rotating the imported key. Add the allowed_roles parameter and specify the legacy-system role.
$ vault write transform/transformations/tokenization/application-form/import \
ciphertext=$IMPORT_CIPHERTEXT allowed_roles=legacy-system allow_rotation=true
Success! Data written to: transform/transformations/tokenization/application-form/import
Try using the newly imported key to encode a value and some metadata.
$ vault write transform/encode/legacy-system value=887345 \
transformation=application-form \
ttl=8h \
metadata="Organization=ColoTown" \
metadata="Purpose=Server-Cage" \
metadata="Type=PIN"
Key Value
--- -----
encoded_value eRwUjS2L9e5mNKLD9Rqt6c49QfsWiqbCEWjtaM334NStmZRG4K7parh6UXJqhYpMVLT9Fe81q
The imported key is working, and the application-form transformation used it to produce the encoded value.
Note
To import subsequent versions of the key, you must use the import_version API endpoint.
Review the key information.
$ vault read transform/tokenization/keys/application-form
Key Value
--- -----
auto_rotate_period 0s
latest_version 1
min_available_version 0
min_decryption_version 1
name application-form
type aes256-gcm96
The key's latest_version is currently 1.
Rotate the key.
Note
Once an imported key is rotated within Vault, it will no longer support importing key material with the import_version endpoint.
$ vault write -force transform/tokenization/keys/application-form/rotate
Success! Data written to: transform/tokenization/keys/application-form/rotate
Check the key information once more.
$ vault read transform/tokenization/keys/application-form
Key Value
--- -----
auto_rotate_period 0s
latest_version 2
min_available_version 0
min_decryption_version 1
name application-form
type aes256-gcm96
The key's latest_version is now 2, and you can no longer import external versions of the key as it is now maintained internally by Vault.
Clean up
Unset the VAULT_TOKEN environment variable.
$ unset VAULT_TOKEN

Unset the VAULT_ADDR environment variable.
$ unset VAULT_ADDR

Unset the IMPORT_CIPHERTEXT environment variable.
$ unset IMPORT_CIPHERTEXT
Stop and remove the Postgres container.
$ docker rm postgres --force
Stop and remove the Vault Enterprise container.
$ docker rm vault-enterprise --force
Summary
The transform secrets engine's tokenization transformation replaces sensitive data with unique values (tokens) that are unrelated to the original value in any algorithmic sense. This can help organizations meet certain industry standards.
If retaining the original data format is important, refer to the Transform Secrets Engine to learn about the format preserving encryption (FPE) transformation.