You can create multiple profiles with different credentials in each.
The credentials file and the config file are updated when you run the command aws configure. The config file is located at ~/.aws/config on Linux or macOS, or at C:\Users\USERNAME\.aws\config on Windows; the credentials file sits next to it as ~/.aws/credentials.
aws configure --profile isilon_user1
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]:
Default output format [None]: text
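After running the command above, the two files might look like this (a sketch: the key values are the examples entered above, and note that in ~/.aws/config a named profile section is written as [profile name], while in ~/.aws/credentials it is just [name]):

```
# ~/.aws/credentials
[isilon_user1]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# ~/.aws/config
[profile isilon_user1]
output = text
```

The region entry is absent because it was left blank during aws configure.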

This page auto-populates the command lines below so you can cut and paste them into your command prompt.
Some commands will not update if the relevant fields below are not populated.
Endpoint (required). Example: http://s3.isilon.local:9020
Profile: the profile you created above. Example: isilon_user1
Bucket name. Example: bucket1
File source. Example: c:\temp\file.txt
File destination. Example: /folder/object.txt

AWS CLI Supported Environment Variables
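For example, the variables below are all supported by the AWS CLI and override the profile files when set (the values shown are the sample credentials and profile from above; the CA-bundle path is a placeholder):

```shell
# Credentials (take precedence over ~/.aws/credentials)
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# Select the named profile and default output format
export AWS_PROFILE=isilon_user1
export AWS_DEFAULT_OUTPUT=text

# Trust a custom CA for a self-signed HTTPS endpoint (path is a placeholder)
export AWS_CA_BUNDLE=/etc/pki/isilon/ca.pem
```

Setting AWS_PROFILE saves typing --profile isilon_user1 on every command.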

Additional information was provided by DonR.
When troubleshooting HTTPS issues, try --no-verify-ssl: if the SSL certificate is invalid, the command will fail with a 403 error.
With --no-verify-ssl the traffic is still encrypted, but the identity of the host is no longer verified.
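For example (a sketch: the HTTPS port 9021 and the CA-bundle path are assumptions; substitute your own endpoint and profile):

```shell
# Troubleshooting only: skip certificate validation
aws s3 ls --endpoint-url https://s3.isilon.local:9021 --profile isilon_user1 --no-verify-ssl

# Preferred long-term fix: trust the cluster's CA instead of disabling verification
aws s3 ls --endpoint-url https://s3.isilon.local:9021 --profile isilon_user1 --ca-bundle /path/to/isilon-ca.pem
```

--no-verify-ssl and --ca-bundle are both global AWS CLI options, so they work with every command on this page.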

s3 high-level API calls (see the s3 high-level CLI manual)

# s3 make bucket (create bucket)
aws s3 mb s3://tgsbucket

# s3 remove bucket
aws s3 rb s3://tgsbucket
aws s3 rb s3://tgsbucket --force

# s3 ls commands
aws s3 ls
aws s3 ls s3://tgsbucket
aws s3 ls s3://tgsbucket --recursive
aws s3 ls s3://tgsbucket --recursive --human-readable --summarize
(If you are trying to list a prefix (aka folder/directory), end it with a forward slash ( / ), e.g. s3://tgsbucket/folder1/ )

# s3 cp commands
aws s3 cp getdata.php s3://tgsbucket
aws s3 cp /local/dir/data s3://tgsbucket --recursive
aws s3 cp s3://tgsbucket/getdata.php /local/dir/data
aws s3 cp s3://tgsbucket/ /local/dir/data --recursive
aws s3 cp s3://tgsbucket/init.xml s3://backup-bucket
aws s3 cp s3://tgsbucket s3://backup-bucket --recursive
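The recursive copies above can also be filtered. A sketch using the cp filter options (the *.tmp pattern is just an example):

```shell
# Copy a directory but skip temporary files (--exclude/--include apply with --recursive)
aws s3 cp /local/dir/data s3://tgsbucket --recursive --exclude "*.tmp"

# Preview what would be transferred without actually copying anything
aws s3 cp /local/dir/data s3://tgsbucket --recursive --exclude "*.tmp" --dryrun
```

--dryrun is handy for verifying a filter before running a large transfer.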

# s3 mv commands
aws s3 mv source.json s3://tgsbucket
aws s3 mv s3://tgsbucket/getdata.php /home/project
aws s3 mv s3://tgsbucket/source.json s3://backup-bucket
aws s3 mv /local/dir/data s3://tgsbucket/data --recursive
aws s3 mv s3://tgsbucket s3://backup-bucket --recursive

# s3 rm commands
aws s3 rm s3://tgsbucket/queries.txt
aws s3 rm s3://tgsbucket --recursive

Using s3api low-level calls (see the s3api low-level CLI manual)

# s3api list-buckets
aws s3api list-buckets --query "Buckets[].Name" --output (json,text,table)

# s3api Get Bucket ACL
aws s3api get-bucket-acl --bucket mybucket --output (json,text,table)

# s3api Get Bucket Location
aws s3api get-bucket-location --bucket mybucket --output (json,text,table)

# s3api Get Object ACL
aws s3api get-object-acl --bucket mybucket --key file.txt --output (json,text,table)

# s3api head bucket
aws s3api head-bucket --bucket mybucket --output (json,text,table)
An empty response means you have access; otherwise you get Access Denied.

# s3api head-object
aws s3api head-object --bucket mybucket --key file.txt --output (json,text,table)

# s3api List Multipart Uploads IN PROGRESS
aws s3api list-multipart-uploads --bucket mybucket --output (json,text,table)

# s3api List Objects
aws s3api list-objects --bucket text-content --query 'Contents[].{Key: Key, Size: Size}' --output (json,text,table)

# s3api List Objects V2
aws s3api list-objects-v2 --bucket text-content --output (json,text,table)

# s3api List Parts
aws s3api list-parts --bucket my-bucket --key 'multipart/01' --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R

# s3api Put Bucket ACL
aws s3api put-bucket-acl --bucket MyBucket --grant-full-control id=<owner-canonical-id> --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers

# s3api Put Object
aws s3api put-object --bucket text-content --key dir-1/big-video-file.mp4 --body e:\media\videos\f-sharp-3-data-services.mp4 --output (json,text,table)

# s3api Put Object ACL
aws s3api put-object-acl --bucket MyBucket --key file.txt --grant-full-control id=<owner-canonical-id> --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers

# s3api Upload Part
aws s3api upload-part --bucket my-bucket --key 'multipart/01' --part-number 1 --body part01 --upload-id "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R"

# s3api Upload Part Copy
aws s3api upload-part-copy \
--bucket my-bucket \
--key "Map_Data_June.mp4" \
--copy-source "my-bucket/copy_of_Map_Data_June.mp4" \
--part-number 1 \
--upload-id "bq0tdE1CDpWQYRPLHuNG50xAT6pA5D.m_RiBy0ggOH6b13pVRY7QjvLlf75iFdJqp_2wztk5hvpUM2SesXgrzbehG5hViyktrfANpAD0NO.Nk3XREBqvGeZF6U3ipiSm"
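The multipart calls above fit into a three-step workflow. A sketch of the full sequence (the <UploadId> and <etag-1> values are placeholders for what the earlier calls return):

```shell
# 1. Start the upload and capture the UploadId it returns
aws s3api create-multipart-upload --bucket my-bucket --key 'multipart/01' --query UploadId --output text

# 2. Upload each part with that UploadId (repeat per part, incrementing --part-number)
aws s3api upload-part --bucket my-bucket --key 'multipart/01' --part-number 1 --body part01 --upload-id "<UploadId>"

# 3. Complete the upload, listing every part number with the ETag returned by upload-part
aws s3api complete-multipart-upload --bucket my-bucket --key 'multipart/01' --upload-id "<UploadId>" \
    --multipart-upload 'Parts=[{ETag="<etag-1>",PartNumber=1}]'

# Or abort instead, which frees the storage held by the in-progress parts
aws s3api abort-multipart-upload --bucket my-bucket --key 'multipart/01' --upload-id "<UploadId>"
```

Uploads that are never completed or aborted keep consuming space; list-multipart-uploads (above) shows what is still pending.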

Make a batch file to test the system: copy and paste the lines below (after filling in the fields above and submitting) into a batch file.

@echo off

REM Build a timestamp from %date% and %time%.
REM The substring offsets assume the US date format "Ddd MM/DD/YYYY";
REM adjust them if your locale formats %date% differently.
set CUR_YYYY=%date:~10,4%
set CUR_MM=%date:~4,2%
set CUR_DD=%date:~7,2%
set CUR_HH=%time:~0,2%
REM %time% pads single-digit hours with a space, so zero-pad them instead
if %CUR_HH% lss 10 (set CUR_HH=0%time:~1,1%)

set CUR_NN=%time:~3,2%
set CUR_SS=%time:~6,2%
set CUR_MS=%time:~9,2%


echo on

date /t >%SUBFILENAME%
time /t >>%SUBFILENAME%
@echo REM ******************* Profile 1 ***************** >>%SUBFILENAME%
@echo REM List bucket3 ******************************** >>%SUBFILENAME%
@echo REM Make a bucket ******************************** >>%SUBFILENAME%
@echo REM Copy Object ******************************** >>%SUBFILENAME%
@echo REM List Bucket ******************************** >>%SUBFILENAME%
@echo REM Remove Object ******************************** >>%SUBFILENAME%
@echo REM List Bucket ******************************** >>%SUBFILENAME%
@echo REM Remove Bucket ******************************** >>%SUBFILENAME%
@echo REM List bucket3 ******************************** >>%SUBFILENAME%
@echo REM ******************* Profile 2 ***************** >>%SUBFILENAME%
@echo REM to test multiple users, copy and paste above but change to profile 2 >>%SUBFILENAME%

© 2024 Captain Jack Sparrow