`ApiKeyAuditLog` interface for audit.
The currently supported Kafka API requests are:
Name | Type | Default | Description |
---|---|---|---|
topic | String | `.*` | Topics that match this regex will have the Interceptor applied |
apiKeys | Set[String] | | Set of Kafka API keys to be audited |
vcluster | String | `.*` | vclusters that match this regex will have the Interceptor applied |
username | String | `.*` | Usernames that match this regex will have the Interceptor applied |
consumerGroupId | String | `.*` | consumerGroupIds that match this regex will have the Interceptor applied |
topicPartitions | Set[Integer] | | Set of topic partitions to be audited |
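As a sketch, a configuration using the keys above might look like the following (the topic regex, API key names and partition values are illustrative assumptions, not taken from this page):

```json
{
  "topic": "orders-.*",
  "apiKeys": ["PRODUCE", "FETCH"],
  "vcluster": ".*",
  "username": ".*",
  "consumerGroupId": ".*",
  "topicPartitions": [0, 1, 2]
}
```

Keys left at their `.*` defaults can be omitted; they are shown here only to illustrate the full shape of the configuration.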
Key | Type | Default | Description |
---|---|---|---|
topic | String | `.*` | Topics that match this regex will have the Interceptor applied |
policies | Policy list | | List of your masking policies |
errorPolicy | String | `fail_fetch` | Determines the plugin behavior when it can't parse a fetched message: `fail_fetch` or `skip_masking` |
Key | Type | Description |
---|---|---|
name | String | Unique name to identify your policy |
fields | list | List of fields that should be obfuscated with the masking rule. Fields can be in a nested structure, using dot notation: for example `education.account.username`, `banks[0].accountNo` or `banks[*].accountNo` |
rule | Rule | The masking rule to apply (see below) |
schemaRegistryConfig | Schema registry | The schema registry in use |
Key | Type | Default | Description |
---|---|---|---|
type | Masking type | MASK_ALL | The type of masking (see below). |
maskingChar | char | `*` | The character used for masking data. |
numberOfChars | number | | Number of masked characters; required if type != MASK_ALL. |
The masking types are:

- `MASK_ALL`: all data will be masked
- `MASK_FIRST_N`: the first `n` characters will be masked
- `MASK_LAST_N`: the last `n` characters will be masked

The `errorPolicy` can be set to `fail_fetch` or `skip_masking`. The default is `fail_fetch`: in this mode, the plugin will return a failure to read the batch which the fetched record is part of, effectively blocking any consumer. In `skip_masking` mode, if there's a failure to parse a message being fetched (e.g. an encrypted record is read in), then that record is skipped and returned un-masked.
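Putting the policy and rule tables together, a masking configuration could be sketched as follows (the topic regex, policy name, field path and registry host are illustrative assumptions; the structure follows the tables above):

```json
{
  "topic": "customers-.*",
  "policies": [
    {
      "name": "mask-account-numbers",
      "fields": ["banks[*].accountNo"],
      "rule": {
        "type": "MASK_LAST_N",
        "maskingChar": "*",
        "numberOfChars": 4
      },
      "schemaRegistryConfig": {
        "type": "CONFLUENT",
        "host": "http://schema-registry:8081"
      }
    }
  ],
  "errorPolicy": "skip_masking"
}
```

With `MASK_LAST_N` and `numberOfChars: 4`, an account number such as `12345678` would be returned as `1234****`; `skip_masking` means unparseable records pass through unmasked rather than failing the fetch.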
Key | Type | Default | Description |
---|---|---|---|
type | string | CONFLUENT | The type of schema registry to use. Choose CONFLUENT (for Confluent-like schema registries, including OSS Kafka) or AWS for AWS Glue schema registries. |
additionalConfigs | map | | Additional properties that map to specific security-related parameters. For enhanced security, you can hide the sensitive values using environment variables as secrets. |
**Confluent Like** | | | Configuration for Confluent-like schema registries |
host | string | | URL of your schema registry. |
cacheSize | string | 50 | Number of schemas that can be cached locally by this Interceptor so that it doesn't have to query the schema registry every time. |
**AWS Glue** | | | Configuration for AWS Glue schema registries |
region | string | | The AWS region for the schema registry, e.g. `us-east-1`. |
registryName | string | | The name of the schema registry in AWS. Leave blank for the AWS default of `default-registry`. |
basicCredentials | string | | Access credentials for AWS. |
**AWS credentials** | | | AWS credential configuration |
accessKey | string | | The access key for the connection to the schema registry. |
secretKey | string | | The secret key for the connection to the schema registry. |
validateCredentials | bool | true | Whether the credentials provided should be validated when set. |
accountId | string | | The ID of the AWS account to use. |
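For the AWS Glue flavour, the flattened table above nests as shown in this sketch (region, registry name and credential placeholders are illustrative; a CONFLUENT-type config would instead use `host` and `cacheSize`):

```json
{
  "type": "AWS",
  "region": "us-east-1",
  "registryName": "default-registry",
  "basicCredentials": {
    "accessKey": "${AWS_ACCESS_KEY}",
    "secretKey": "${AWS_SECRET_KEY}",
    "validateCredentials": true,
    "accountId": "123456789012"
  }
}
```

The `${...}` values show sensitive settings being supplied via environment variables rather than hardcoded in the configuration.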
If you don't provide the `basicCredentials` section for the AWS Glue schema registry, the client we use to connect will instead attempt to find the connection information it needs from the environment; the required credentials can be passed this way to Gateway as part of its core configuration. Find out more about setting up AWS.

Header values can include template variables, for example `X-CLIENT_IP: "{{userIp}} testing"`.
Here are the values we can expand:
Config | Type | Description |
---|---|---|
topic | String | Regular expression that matches topics from your produce request |
headers | Map | Map of header keys to header values to inject; in a header value you can use `{{userIp}}` to inject the user IP information |
overrideIfExists | boolean | Default `false`; whether to override the header if it already exists |
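A header-injection configuration matching this table could look like the following sketch (the header names and values are illustrative assumptions):

```json
{
  "topic": "injectHeaderTopic",
  "headers": {
    "X-RAW_KEY": "a hardcoded value",
    "X-CLIENT_IP": "{{userIp}} testing"
  },
  "overrideIfExists": false
}
```

With `overrideIfExists: false`, a record that already carries an `X-CLIENT_IP` header would keep its original value.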
Produce to, then consume from, the `injectHeaderTopic` topic to see the injected headers.
For example, if the Interceptor is configured with the `password` field, it will target this field within the incoming Kafka record for encryption. Encryption uses the `keySecretId` specified in your configuration to ensure the correct key is utilized. Decryption uses the `keySecretId` provided in your Interceptor configuration, which is also stored in the header of the record on the backing Kafka.
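Based on the description above, a field-level encryption configuration might be sketched as follows (the topic regex, `fieldName`/`keySecretId` key names and values are assumptions for illustration, not confirmed by this page):

```json
{
  "topic": "customers-.*",
  "fields": [
    {
      "fieldName": "password",
      "keySecretId": "password-secret"
    }
  ]
}
```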
The Interceptor uses the envelope encryption technique to encrypt data. Here are some key terms we'll use:
Term | Definition |
---|---|
KMS | Key Management Service: A system responsible for managing and storing cryptographic keys, including the KEK. |
KEK | Key Encryption Key: A key stored in the KMS, used to encrypt the DEK. Notably, the KEK is never exposed to or known by the interceptor. |
DEK | Data Encryption Key: A key generated by the interceptor, used to encrypt the actual data. |
EDEK | Encrypted Data Encryption Key: The DEK that has been encrypted by the KEK, ensuring that the DEK remains secure when stored or transmitted. |
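Conceptually, after envelope encryption only the EDEK travels with the record, while the KEK never leaves the KMS. A stored record might be shaped like this sketch (field and header names are purely illustrative):

```json
{
  "payload": "base64( encrypt(DEK, plaintext) )",
  "headers": {
    "keySecretId": "password-secret",
    "edek": "base64( encrypt(KEK, DEK) )"
  }
}
```

To decrypt, the Interceptor sends the EDEK from the header to the KMS, receives back the DEK, and uses it to decrypt the payload.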