credential

Manages credentials, passwords, and secrets within credential providers.

The CredentialProvider API in Arenadata Hadoop separates applications from the way their passwords/secrets are stored.

To indicate a particular provider type and location, set the hadoop.security.credential.provider.path parameter in core-site.xml or use the -provider command-line option with each of the following commands.
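For example, the property can be set in core-site.xml as follows (the keystore locations below reuse the illustrative paths from this page):

```xml
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://file/tmp/test.jceks,jceks://hdfs@nn1.example.com/my/path/test.jceks</value>
</property>
```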

This provider path is a comma-separated list of URLs that indicates the type and location of a list of providers to be consulted.

For example, the path user:///,jceks://file/tmp/test.jceks,jceks://hdfs@nn1.example.com/my/path/test.jceks indicates that:

  • the current user’s credentials file should be addressed via the User Provider;

  • the local file /tmp/test.jceks is a Java Keystore Provider;

  • the HDFS file /my/path/test.jceks on nn1.example.com is also a Java Keystore Provider store.
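To make the path format concrete, here is a minimal Python sketch (not part of Hadoop; Hadoop performs this parsing internally) that splits such a provider path into scheme and location parts:

```python
# Illustrative sketch only: split a credential provider path into
# (scheme, location) pairs, one per comma-separated URL.

def parse_provider_path(path: str) -> list[tuple[str, str]]:
    entries = []
    for url in path.split(","):
        scheme, _, location = url.partition("://")
        entries.append((scheme, location))
    return entries

providers = parse_provider_path(
    "user:///,jceks://file/tmp/test.jceks,"
    "jceks://hdfs@nn1.example.com/my/path/test.jceks"
)
print(providers)
```

The first entry resolves to the User Provider, and the two jceks entries name local and HDFS-backed Java Keystore Providers.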

The credential command is often used for provisioning passwords or secrets to a particular credential store provider.

To explicitly indicate which provider store to use, supply the -provider option.
Otherwise, for the given path of multiple providers, the first non-transient provider will be used. This may or may not be the one that you want.
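The selection rule above can be sketched as follows. This is a simplified model, not Hadoop code; it assumes that the User Provider (scheme user) is the transient one, while keystore providers are persistent:

```python
# Simplified model: with no explicit -provider option, the first
# non-transient provider in the path is the one that gets used.

TRANSIENT_SCHEMES = {"user"}  # the User Provider does not persist credentials

def pick_default_provider(provider_path: str) -> str:
    for url in provider_path.split(","):
        scheme = url.split("://", 1)[0]
        if scheme not in TRANSIENT_SCHEMES:
            return url
    raise ValueError("no non-transient provider in path")

path = "user:///,jceks://file/tmp/test.jceks"
print(pick_default_provider(path))  # jceks://file/tmp/test.jceks
```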

Providers frequently require a password or other secrets.
If a provider requires a password and is unable to find one, it will use the default password and emit a warning message that the default password is being used.

If the -strict flag is supplied, the warning message becomes an error message and the command returns immediately with an error status.
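The fallback behavior can be modeled with a short sketch. This is illustrative only; it assumes the password is supplied via the HADOOP_CREDSTORE_PASSWORD environment variable and that the well-known default keystore password is "none":

```python
import os

# Simplified model of the password fallback described above: use an
# explicit password when one is found, otherwise fall back to the
# default and warn - or fail immediately when strict mode is on.

DEFAULT_PASSWORD = "none"  # assumed default keystore password

def resolve_keystore_password(strict: bool = False) -> str:
    password = os.environ.get("HADOOP_CREDSTORE_PASSWORD")
    if password:
        return password
    if strict:
        raise RuntimeError("no keystore password configured and -strict was given")
    print("WARNING: using the default keystore password")
    return DEFAULT_PASSWORD
```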

The usage is as follows:

$ hadoop credential <subcommand> [options]
Arguments

print [-alias alias ] filename [ filename2 …​]

Prints the token fields contained in filename (filename2, and so on).
If alias is specified, prints only the tokens matching alias.
Otherwise, prints all tokens.

get URL
[-service scheme ]
[-format (java|protobuf)]
[-alias alias ]
[-renewer renewer ]
filename | filename [ filename2 …​]

Fetches tokens from a service using a URL and places them in the file.
The URL is required and must be the first parameter.

URL is the service URL, e.g. hdfs://localhost:9000.
alias will overwrite the service field in the token.
It’s intended for hosts that have external and internal names, e.g. firewall.com:14000. The filename should come last and is the name of the token file.
It will be created if it does not exist.
Otherwise, token(s) are added to an existing file.
The -service flag should only be used with a URL that starts with http or https.
The following are equivalent: hdfs://localhost:9000/ vs. http://localhost:9000 -service hdfs

append
[-format (java|protobuf)]
filename filename2 [ filename3 …​]

Appends the contents of the first N files to the last file.
When tokens with common service fields are present in multiple files, earlier file tokens are overwritten.
Thus, tokens present in the last file are always preserved.
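The merge rule above can be modeled as a dictionary update keyed by the token's service field (an illustrative sketch, not Hadoop code):

```python
# Illustrative model of the append merge rule: files are merged keyed
# by service field, and later files win on conflicts, so tokens in the
# last file are always preserved.

def append_token_files(files: list[dict[str, str]]) -> dict[str, str]:
    merged: dict[str, str] = {}
    for tokens in files:          # earlier files first ...
        merged.update(tokens)     # ... later files overwrite common services
    return merged

a = {"hdfs://nn1:9000": "tokenA"}
b = {"hdfs://nn1:9000": "tokenB", "rm:8032": "tokenC"}
print(append_token_files([a, b]))
```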

remove -alias alias
[-format (java|protobuf)]
filename [ filename2 …​]

Removes the tokens matching alias from each file specified and writes out each file using the specified format.
The alias argument must be specified.

cancel -alias alias
[-format (java|protobuf)]
filename [ filename2 …​]

Acts similarly to remove, except the tokens are also cancelled using the service specified in the token object.
The alias argument must be specified.

renew -alias alias
[-format (java|protobuf)]
filename [ filename2 …​]

For each file specified, renews the tokens matching alias and writes out each file using the specified format.
The alias argument must be specified.

import base64
[-alias alias ]
filename

Imports a Base64 token.
The alias argument will overwrite the service field in the token.

Example:

$ hadoop credential list -provider jceks://file/tmp/test.jceks