Is there a way to avoid AWS credentials being leaked in the logs when doing Snowflake fast write?

ecerulm

I just tried "Writing data to Snowflake" with the automatic fast-write, using an S3 connection configured with "Credentials STS with AssumeRole", and the job build logs contain the AWS credentials in plain text, like this:

{
  "context": {
    "type": "JOB_ACTIVITY",
    "authIdentifier": "xxxxxx",
    "projectKey": "TESTSNOWFLAKEWRITE",
    "jobId": "Build_DKU_TUTORIAL_BASICS_101orders_date_changed_filtered_filtered__NP__2023-08-17T11-27-15.507",
    "activityId": "compute_DKU_TUTORIAL_BASICS_101orders_date_changed_filtered_filtered_NP",
    "activityType": "recipe",
    "recipeType": "sampling",
    "recipeName": "compute_DKU_TUTORIAL_BASICS_101orders_date_changed_filtered_filtered"
  },
  "type": "SQL_QUERY",
  "id": "VjHZ3ILWd4I5kenp",
  "startTime": 1692271644872,
  "sqlQuery": {
    "connectionUsageId": "HCFdrtLewGZ86g2m",
    "connection": "xxxxx",
    "query": "COPY INTO \"SANDBOX\".\"RUBEN\".\"TESTSNOWFLAKEWRITE_DKU_TUTORIAL_BASICS_101ORDERS_DATE_CHANGED_FILTERED_FILTERED\" FROM 's3://xxxx/dataiku/dataiku-snowflake-tmp/tmp.PDpZXNdJqTYBu3Y7/' CREDENTIALS=(aws_key_id='AWS_KEY_ID' aws_secret_key='CLEARTEXTSECRETKEY' aws_token='CLEARTEXTTOKEN') FILE_FORMAT = ( TYPE=csv NULL_IF=() FIELD_DELIMITER='\\t' FIELD_OPTIONALLY_ENCLOSED_BY='\"' ESCAPE_UNENCLOSED_FIELD=NONE) FORCE=false"
  }
}

Note the COPY ... CREDENTIALS=(aws_key_id='xxxx' aws_secret_key='yyyy' aws_token='zzz') clause.
Obviously, I would prefer that the AWS credentials not be written to the logs at all (masking them would be best).
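For comparison, if the fast-write COPY could rely on a pre-created Snowflake storage integration instead of inline CREDENTIALS, nothing sensitive would ever appear in the SQL text. I don't know whether DSS can be configured to issue the COPY this way; the sketch below only shows what such a statement looks like in Snowflake (the integration name and role ARN are made-up placeholders):

-- One-time setup by an admin: the integration holds the IAM role, not the job
CREATE STORAGE INTEGRATION dataiku_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-dataiku-role'  -- hypothetical role
  STORAGE_ALLOWED_LOCATIONS = ('s3://xxxx/dataiku/dataiku-snowflake-tmp/');

-- The COPY then only references the integration by name; no keys or tokens in the statement
COPY INTO "SANDBOX"."RUBEN"."TESTSNOWFLAKEWRITE_DKU_TUTORIAL_BASICS_101ORDERS_DATE_CHANGED_FILTERED_FILTERED"
  FROM 's3://xxxx/dataiku/dataiku-snowflake-tmp/tmp.PDpZXNdJqTYBu3Y7/'
  STORAGE_INTEGRATION = dataiku_s3_int
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '\t' FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  FORCE = FALSE;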


Is there any way to prevent this?


Operating system used: Amazon Linux 2
