
How to encrypt S3 bucket credentials using OpenSSL in a shell script

 Hello Techies,

 If someone reads your shell script and you have stored a password in a variable without encryption, they can use those credentials for their own purposes. To overcome this, we encrypt the passwords so that a casual reader of the script cannot see the credentials directly. (Note: since the decryption key also lives in the script, this obscures the credentials rather than fully protecting them, but it keeps them out of plain sight.)
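To illustrate the difference, here is what the same variable looks like before and after encryption (the encrypted blob is the one produced in the steps below):

```shell
# Before: the secret is readable by anyone who opens the script.
s3_secret_key="ghijkl"

# After: only an encrypted blob is stored in the script;
# it is useless without the common key.
encrypted_s3_secret_key="U2FsdGVkX1/JqHKlUL5HM1d2gH9L0S4ScDux9uaNkh0="
```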

------------------------------------------

Example S3 bucket credentials:

Bucket_name: Archived

s3_access_key: abcdef

s3_secret_key: ghijkl


Follow the steps below to encrypt your bucket credentials.

---------------------------------------------

Step 1: Create a common key (passphrase) that will be used to encrypt the access_key & secret_key of the S3 bucket.

---------------------------------------------

key: techie

---------------------------------------------

Step 2: Use the commands below to get the base64-encoded form of your common key, and then encrypt the S3 bucket credentials with it. Note that because OpenSSL uses a random salt, the base64 ciphertext will be different on every run; use whatever output your own run produces.

echo "techie" | base64

dGVjaGllCg==




key=$(echo "dGVjaGllCg==" | base64 -d)

echo "abcdef" | openssl enc -aes-256-cbc -md sha512 -a -salt -pass pass:"${key}"

U2FsdGVkX198KxzvnjkkgROpWbZ2m/+vMZ35QQ5F6c8=




echo "ghijkl" | openssl enc -aes-256-cbc -md sha512 -a -salt -pass pass:"${key}"

U2FsdGVkX1/JqHKlUL5HM1d2gH9L0S4ScDux9uaNkh0=




Step 3: Assign the encrypted credentials from above to variables.

encrypted_s3_access_key="U2FsdGVkX198KxzvnjkkgROpWbZ2m/+vMZ35QQ5F6c8="

encrypted_s3_secret_key="U2FsdGVkX1/JqHKlUL5HM1d2gH9L0S4ScDux9uaNkh0="





Step 4: Decrypt the credentials at runtime and store them in variables, so the script can use them to log in to the bucket.

s3_access_key=$(echo "${encrypted_s3_access_key}" | openssl enc -aes-256-cbc -md sha512 -a -d -salt -pass pass:"${key}")

s3_secret_key=$(echo "${encrypted_s3_secret_key}" | openssl enc -aes-256-cbc -md sha512 -a -d -salt -pass pass:"${key}")


Step 5: Check that the credential variables are decrypted and assigned correctly.
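A minimal sketch of such a check, assuming only the `base64` and `openssl` commands from the steps above. It re-encrypts a test value and decrypts it again rather than relying on a specific ciphertext, since the salted output differs on every run:

```shell
#!/bin/sh
# Recover the common key (command substitution strips the trailing newline
# that base64 -d emits, leaving exactly "techie").
key=$(echo "dGVjaGllCg==" | base64 -d)

# Round-trip check: encrypt a fresh copy of the access key, decrypt it,
# and confirm we get the original value back.
encrypted=$(echo "abcdef" | openssl enc -aes-256-cbc -md sha512 -a -salt -pass pass:"${key}")
s3_access_key=$(echo "${encrypted}" | openssl enc -aes-256-cbc -md sha512 -a -d -salt -pass pass:"${key}")

if [ "${s3_access_key}" = "abcdef" ]; then
    echo "s3_access_key decrypted and assigned correctly"
else
    echo "decryption failed" >&2
    exit 1
fi
```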





#------------------------------------------------------------------------------------

# Your final script should look like this:

#------------------------------------------------------------------------------------

key=$(echo "dGVjaGllCg==" | base64 -d)

encrypted_s3_access_key="U2FsdGVkX198KxzvnjkkgROpWbZ2m/+vMZ35QQ5F6c8="

encrypted_s3_secret_key="U2FsdGVkX1/JqHKlUL5HM1d2gH9L0S4ScDux9uaNkh0="

s3_access_key=$(echo "${encrypted_s3_access_key}" | openssl enc -aes-256-cbc -md sha512 -a -d -salt -pass pass:"${key}")

s3_secret_key=$(echo "${encrypted_s3_secret_key}" | openssl enc -aes-256-cbc -md sha512 -a -d -salt -pass pass:"${key}")

------------------------------------------------------------------------------------------
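Once the variables are decrypted, they can be handed to whatever client accesses the bucket. A hypothetical example using the AWS CLI (assumption: your bucket is reachable through the standard `aws` tool; the bucket name `Archived` comes from the example credentials above):

```shell
# Export the decrypted values under the standard AWS CLI
# environment variable names so the aws command picks them up.
export AWS_ACCESS_KEY_ID="${s3_access_key}"
export AWS_SECRET_ACCESS_KEY="${s3_secret_key}"

# List the contents of the example bucket.
aws s3 ls "s3://Archived"
```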
