SSL is not the tool you need in this case; it only protects data in transit, and you should of course already be running exclusively over encrypted connections.
The problem here is one of access rights: files should not be readable by default to anyone who can guess the object name in the bucket. At the very least, you need to be using signed URLs with a reasonably short expiration, with all other access blocked by default.
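For what it’s worth, here’s a rough sketch of what that looks like on AWS S3 with boto3 (bucket and key names are made up):

```python
# Sketch only: lock down a bucket and hand out short-lived signed URLs.
import boto3

s3 = boto3.client("s3")

# Block every form of public access at the bucket level by default.
s3.put_public_access_block(
    Bucket="example-bucket",  # placeholder name
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Grant access only via a signed URL with a short expiration.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-bucket", "Key": "reports/2023-q1.pdf"},
    ExpiresIn=300,  # seconds; the link dies after 5 minutes
)
print(url)
```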
As I mentioned in other comments, I am a noob when it comes to web-sec; please forgive what may be dumb questions.
Is it really just a permission-rights “over-exposure” issue? Or does one also need to encrypt and decrypt the data itself before it is sent to a database?
Also, if you have time, could you recommend any links to web/cloud/SaaS security best practices “for dummies”?
> As I mentioned in other comments, I am a noob when it comes to web-sec; please forgive what may be dumb questions.
There’s nothing to forgive. Asking questions and being curious is how you learn this stuff.
> Is it really just a permission-rights “over-exposure” issue?
From what I’ve read, it’s more fundamental than that: it’s a basic architecture issue. The datastore was publicly accessible, which it should never be. If they had set it up according to best practices, with an API to proxy access and handle auth, the datastore’s permissions would have been of minimal consequence unless their network was compromised (it’s still best practice to secure it and approach it with a zero-trust mindset).
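To make the pattern concrete, here’s a minimal sketch of a proxy API, assuming AWS S3 behind a small Flask app; the bucket name and the authorization helper are hypothetical:

```python
# Rough sketch of the proxy-API pattern: the app is the only thing
# that ever touches the private bucket.
import boto3
from flask import Flask, Response, abort, session

app = Flask(__name__)
app.secret_key = "change-me"  # required for sessions; use a real secret
s3 = boto3.client("s3")
BUCKET = "example-private-bucket"  # placeholder; blocks all public access

def user_may_read(user_id, key):
    # Placeholder authorization check; a real app would consult a policy or DB.
    return key.startswith(f"{user_id}/")

@app.get("/files/<path:key>")
def get_file(key):
    # Auth happens here, in the API layer; the datastore is never public.
    user_id = session.get("user_id")
    if user_id is None or not user_may_read(user_id, key):
        abort(403)
    obj = s3.get_object(Bucket=BUCKET, Key=key)
    # Return the object through the app; clients never see a bucket URL.
    return Response(obj["Body"].read(), mimetype=obj["ContentType"])
```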
> Or does one also need to encrypt and decrypt the data itself before it is sent to a database?
Generally, cloud datastores handle encryption and decryption at rest transparently, as long as the account accessing the data is authorized to use the key. They probably didn’t have encryption set up either.
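On AWS, for example, default encryption at rest is a one-time bucket setting; here’s a sketch with boto3, where the bucket name and KMS key alias are placeholders:

```python
# Sketch: turn on default SSE-KMS encryption for a bucket.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-bucket",  # placeholder name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-key",  # placeholder key
                },
                "BucketKeyEnabled": True,  # reduces KMS request costs
            }
        ]
    },
)
# From here on, new objects are encrypted at rest transparently;
# readers just need permission to decrypt with the key.
```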
> Also, if you have time, could you recommend any links to web/cloud/SaaS security best practices “for dummies”?
Here are some more resources: