All posts by Mooky Desai

SOC 2 Compliance. What, Why, and How.

What is SOC 2?

System and Organization Controls (SOC) 2 is an information security compliance standard maintained by the American Institute of Certified Public Accountants (AICPA). It’s designed to evaluate and demonstrate the cybersecurity of an organization.

To get a SOC 2, companies must create a compliant cybersecurity program and complete an audit with an AICPA-affiliated CPA. The auditor reviews and tests the cybersecurity controls to the SOC 2 standard, and writes a report documenting their findings. 

The resulting SOC 2 report facilitates sales and vendor management by providing one document that sales teams can send to potential customers for review, instead of working through cybersecurity questionnaires.

5 Trust Services Criteria

  • Security (REQUIRED)
    • Guidelines on company management and culture, risk assessments, communication, control monitoring, and cybersecurity strategy
  • Availability – uptime of a vendor’s services
    • Controls include plans to maximize uptime and restore availability after an outage.
    • Business continuity, data recovery, and backup plans are all important controls for this criterion.
  • Processing Integrity – how a vendor processes the data it collects
    • Controls are meant to evaluate that data processing is being performed in a consistent manner and that exceptions are handled appropriately.
      • It is challenging and laborious work to create the documentation needed to meet this criterion, because it requires SOC 2-specific content with detailed descriptions of how data is being processed. (Almost all other content used in a SOC 2 audit has applications outside of SOC 2; this does not.)
  • Confidentiality – keeping confidential business data confidential
    • This criterion expects vendors to identify and protect confidential data.
    • Example controls for confidentiality include encryption and data destruction. 
  • Privacy – how personal information is kept private
    • This criterion requires that vendors have a privacy policy and that personal data is collected legally and stored securely.
    • SOC 2 Privacy is more applicable to business-to-consumer companies than to business-to-business companies.

How Do I Get There?

To achieve SOC 2 Type 2 compliance, you will need to implement and maintain strong controls over your systems and processes. This will typically involve the following steps:

  • Identify the systems and processes that need to be covered by your SOC 2 Type 2 compliance efforts.
  • Develop policies and procedures to ensure that these systems and processes are secure and meet the relevant standards.
  • Implement technical and organizational controls to support these policies and procedures.
  • Test and monitor your controls to ensure that they are effective.
  • Conduct an independent audit of your controls by a certified third-party auditor.
  • Obtain a report from the auditor indicating that your controls meet the relevant standards.

Achieving SOC 2 Type 2 compliance is an ongoing process, and you will need to continually review and update your controls to ensure that they remain effective.

What Can I Do To Prepare?

  • Determine Scope
    • SOC 2 is about demonstrating your commitment to security and improving customer confidence in your security program. You should include all services and products that you expect customers to have security concerns about.
  • Identify and fill gaps
    • Evaluate your current cybersecurity program against the SOC 2 control set. Even companies with mature cybersecurity programs do not meet every single control from the get-go.
      • A number of administrative and technical security controls are often overlooked prior to getting a SOC 2, and they can be sticking points that generate a lot of additional work before and during the audit process.
  • Document – create and edit security policies and other documentation
    • Define access controls, such as multi-factor authentication (MFA). Who is required to have it? What types of apps are required to use it, versus which ones are not? What authenticator apps are allowable?
      • Most controls need to have a policy and evidence your organization is sticking to the policy created for them. It’s a lot of work – but your company will become much more secure in the process. 
  • Modify internal procedures.

What Are The Benefits?

  • Demonstrating to customers that you have strong controls in place to protect their data can help you build trust with them and increase customer satisfaction.
  • Achieving SOC 2 Type 2 compliance can also support your efforts to meet other requirements, such as the Payment Card Industry Data Security Standard (PCI DSS) and the Health Insurance Portability and Accountability Act (HIPAA).
  • Demonstrating that you have been independently audited and have met industry standards for information security can give your business a competitive advantage.
  • Achieving SOC 2 Type 2 compliance can also protect your business from potential liabilities, as it shows that you have taken reasonable steps to protect your customers’ data.
  • Finally, implementing strong controls as part of your SOC 2 Type 2 compliance efforts can help you improve the security and reliability of your systems, which can ultimately benefit your business by reducing the risk of data breaches, system failures, and other security incidents.

How Can We Help?

  • Perform a Gap Assessment – A gap assessment is crucial for taking stock of an existing cybersecurity program and finding gaps that need to be filled to get your company audit-ready.
  • Acquire and implement technical controls – if there’s a deficit, consultants help companies add the needed controls to improve security and ensure compliance.
  • Adjust policies and procedures – As we just mentioned, policies and procedures are likely not audit-ready until efforts are made to make them so.
  • Create content – The content that’s created is going to be key documentation for a SOC 2 audit. Policies, procedures, reports – a consultant can write them and get them in place.
  • Project manage – Virtual CISOs can project-manage the whole audit effort. There’s something to be said for domain-expert project managers.
  • Perform risk assessments – if this is not something you were doing before, you will be now! Risk assessments are mandatory for SOC 2 compliance, and a Virtual CISO can perform the assessment and write the report.
  • Perform vendor evaluations – Vendor management is a part of every SOC 2 compliance program. If this is not already in practice at an organization, it can be valuable to outsource the activity to an expert.
  • Perform “External Internal Audit” – Internal audits are necessary for SOC 2 compliance – they help make sure that your company is doing everything needed before the auditor catches you. Some firms don’t have an internal audit function, so an “External Internal Auditor” who is familiar with the standards and can keep the organization accountable is helpful.
  • Select an Auditor – A good Virtual CISO will know what makes a good SOC 2 auditor and can remove auditor selection from your plate. 
  • Advocate on your behalf with the Auditor – Your Virtual CISO will be with you for every audit call. They will advocate on your behalf, ensuring the auditor sets realistic compliance expectations for your organization. 

Reach out to us if you need help @ mdesai@intervision.com

Minecraft 1.17 Java Error

I downloaded the new 1.17 jar and updated my server as I have done numerous times. I received the error below this time around:

Error: LinkageError occurred while loading main class net.minecraft.server.Main
java.lang.UnsupportedClassVersionError: net/minecraft/server/Main has been compiled by a more recent version of the Java Runtime (class file version 60.0), this version of the Java Runtime only recognizes class file versions up to 55.0
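A class file version maps to a Java release as the version number minus 44, so 60.0 is Java 16 and 55.0 is Java 11 – in other words, the new server jar needed Java 16 but I was still running Java 11. A quick sanity check of that arithmetic:

```shell
# Class file version N corresponds to Java release N - 44:
# 60.0 -> Java 16, 55.0 -> Java 11.
class_file_version=60
echo "Java $((class_file_version - 44))"
```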

I updated my java version using the commands below.

sudo add-apt-repository ppa:linuxuprising/java
sudo apt update
sudo apt install oracle-java16-installer --install-recommends

After updating it, it started up with no problems.

Galaxy S21 Ultra – Pic from Video

I bought a shiny new Galaxy S21 Ultra the other day and wanted to try out the “Pic from Video” feature. After pulling my hair out a few times, I resorted to Googling. The hair pulling resumed.

The short answer here is:

Go to Camera

Select Video mode

Click the 5th Icon on the top

Scroll that bar to the right and select 8K/24

Shoot your video

From here, Samsung says…”Go to Gallery and click the little icon”…YOU CAN’T (but don’t pull your hair out).

Once you are in the Gallery, click the 3 little dots (ellipsis) in the bottom right

Select “Open in Video Player”

NOW, if you click on your screen, you will see a little icon on the top right of the screen and you can pause your video and click the icon to capture a pic.

Hope that helps. Good luck!

Fabwerx – Sun Valley

This is a long-winded 1-star review. They have way too many projects going on and also manufacture stuff. One small shop, not enough people, horrible customer service. You will wait and wait and wait and wait. They don’t answer phones, don’t respond to emails, string you along…and then post on their social media like they turn things around “while you wait”. Don’t fall for it.

About 6 months ago I bought a spankin’ new Polaris Pro XP 4 SXS (UTV). I jokingly call it the red rocket. In my constant obsessive quest to customize every vehicle I own, I pre-ordered a sweet new cage with roof, optional rear chase lights, front windshield, and any other option you could get on it. I subsequently bought a bunch of other lighting accessories to mount on the cage: a bunch of lights from Baja Designs, a few things from KC HiLites, and some 5150 187 whips.

I was looking around the internet and came across a video from UTV Source of a company called Fabwerx doing a full custom build on a Pro XP 4. Turns out, the shop happened to be right around the corner from me, so I stopped by. I met with Thad, the owner, and told him that I had a cage and a bunch of accessories that I wanted him to install for me. He gave me a reasonable price and told me to let him know when I had the cage.

Fast forward about a month and I got word that my cage would be ready in a couple of weeks. This is where the fun began.

I called him, left numerous messages, emailed him, and stopped by the shop (he wasn’t there), so I hit him up on Instagram where I saw him helping people out “while they wait”. He mentioned he was busy and would let me know when he could “fit me in”. Crickets. Hit him up again. Same story: “we are crazy busy. I’ll check the calendar and let you know”. Crickets. Pinged him AGAIN; he said sorry, crazy busy, I’ll call you tomorrow. I think you know how that went. I tried calling again, no answer. I finally just said, hey, I’m not trying to waste either of our time. If you can’t do it, just let me know. I got the “crazy busy” speech yet again. I felt like I was dealing with Dory at this point. I was frustrated and sent him a last-chance text saying, “maybe IG DMs, emails, and/or phone calls aren’t the proper way to schedule a cage install with you. Can you let me know how all your customers schedule work with you?”. I finally got an honest text from him saying, “Honestly, we are swamped, try these other shops in Simi Valley or Santa Clarita”. Finally, some honesty out of this guy.

I replied back saying thanks for being honest, if you could have told me that a month ago, I could have had my cage on already. “Good luck”.

His response? “Sorry pal. good luck to you too” followed by a thumbs down emoji. As if I had rubbed him the wrong way. Super classy.

He gets minus stars. I would steer clear of this place. They do good design and fab work, but getting them to do anything in a timely fashion won’t happen. Besides, there are plenty of other places that can do the same thing. SDR Motorsports has much better design and engineering if you ask me. I have another friend who has a Turbo S and ordered a cage and much more from Fabwerx. Same story. He had to wait over 2 months to get his cage and then another month to get a slot to install it. And this was for a cage they sell readily through UTV Source all day long.

Speaking of UTV Source, remember that video? Watch it and you will see how frustrated the owner is by the time he picked up his UTV. The first video was posted July 15th. The “reveal” video was posted September 20th. “A tornado that spun out of control”. I don’t think it was a good spin, personally.

I’m going to try a couple of other shops around my area. I’ll update this when I get my cage installed.

UPDATE: 3/4/2021

SDR Motorsports recommended a few shops to me. One was DirtDirect Motorsports in the San Fernando Valley. They are right by the San Fernando Mission, in fact. I had a brief conversation with Raul; he provided me with a reasonable quote, gave me a date to bring the car and parts, and BOOM, that was that. They are very knowledgeable, informative, and helpful – exactly what you want when you are about to drop thousands on upgrades for your baby. They have had my car for about 2 weeks and even worked directly with SDR to obtain some missing parts from their kit. I can’t say enough good things about these guys. I called numerous shops to get my project off the ground; these guys are by FAR the best. Big thanks to Tim @ SDR for being a great business owner too. It took quite some time to get my kit, but he has always been very responsive and goes the extra mile to make sure you are taken care of.

Happy off-roading everyone. Stay safe out there.

Chakros.com

Completely fraudulent website. Don’t order anything from this place. I was looking for a winch for my UTV, and it showed up in a Google Shopping search at a great price. I ordered it using PayPal; it never showed up, but my bank account was charged. All the information in the PayPal transaction had foreign characters (Chinese?), the email address never responded, and the tracking information showed that it was delivered on a date BEFORE the ship date. I have filed a claim through PayPal but am not holding my breath. Again, http://www.chakros.com…looks like they primarily sell jewelry…TOTAL FRAUD.

UPDATE (3/4/2021): PayPal determined that this was, in fact, a fraudulent website with numerous complaints and refunded me in full.

Terraform on Windows 101

Create a folder called “bin” in %USERPROFILE%

Start -> Run -> %USERPROFILE% -> create a folder called “bin”

Download Terraform

https://www.terraform.io/downloads.html
Save the .exe in the “bin” folder you created

Set Windows “PATH” Variable

System Properties -> Environment Variables
Highlight PATH
Click “Edit”
Click “New”
Add %USERPROFILE%\bin

Create a user in AWS for Terraform

In AWS, go to IAM
Create a user called “terraform”
programmatic access only
Attach existing policies directly
Administrator access (proceed with caution!)
Copy the Access Key ID (save to a credentials store like KeePass, or an Excel spreadsheet for now)
Copy the Secret Access Key (save to credentials store)
Or download the .CSV and grab the values

Create a folder called .aws on your PC

In Windows Explorer, name it “.aws.” with an extra “.” at the end of the folder name or it will throw an error (Windows strips the trailing dot automatically)

Create a credentials file

Create a new file called “credentials” in the .aws directory (no file extension)
Using the ID and Key from above, make it look like this:

[default]
aws_access_key_id=your_key_id_here
aws_secret_access_key=your_access_key_here

Save the file (again, make sure there is no .txt extension or it won’t work)
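If you would rather create this file from Git Bash than Notepad, a heredoc sidesteps the file-extension problem entirely. This is just a sketch – the key values are placeholders you would replace with your own:

```shell
# Create the credentials file with placeholder values; substitute your real keys.
AWS_DIR="${AWS_DIR:-$HOME/.aws}"
mkdir -p "$AWS_DIR"
cat > "$AWS_DIR/credentials" <<'EOF'
[default]
aws_access_key_id=your_key_id_here
aws_secret_access_key=your_access_key_here
EOF
head -n 1 "$AWS_DIR/credentials"   # shows the profile header
```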

Download and install Git for Windows

https://gitforwindows.org

Create a folder called TF_Code for your working files

I created mine on my desktop

Open Git Bash, navigate to your working directory

cd desktop
cd TF_Code

Make the directory a Git repository

git init

Create a new file with VI

vi first_code.tf
Enter the following contents:

provider "aws" {
  profile = "default"
  region  = "us-west-2"
}

resource "aws_s3_bucket" "tf_course" {
  bucket = "tf-course-uniqueID"
  acl    = "private"
}
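If you prefer to skip vi, you can write the same file from Git Bash with a heredoc. Note that “tf-course-uniqueID” is a placeholder – S3 bucket names must be globally unique:

```shell
# Write first_code.tf without an editor; replace the bucket name suffix
# with something globally unique before applying.
cat > first_code.tf <<'EOF'
provider "aws" {
  profile = "default"
  region  = "us-west-2"
}

resource "aws_s3_bucket" "tf_course" {
  bucket = "tf-course-uniqueID"
  acl    = "private"
}
EOF
grep -c '^resource' first_code.tf   # prints 1
```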

Commit the code

git add first_code.tf
git commit -m "some commit message"

Try Terraform! (in Git Bash)

terraform init
Downloads and Initializes plugins

Apply the code

terraform apply
yes (to perform the actions)

Check your AWS account (S3), you should see a new S3 bucket!

Delete the bucket

terraform plan -destroy -out=example.plan
terraform apply example.plan

Your bucket will now be deleted!

To recreate the bucket, just run the ‘terraform apply’ command again, say yes, and…BOOM, your bucket is created again!

Hope that helps. Good luck and happy computing!

SSH from PuTTY to GCP Compute Engine

First off, if you are trying to securely connect to your enterprise production network and instances, there are better (safer) methods (architectures) to do this. OSLogin or federating your Azure AD for instance, might be more secure and scalable. I run a pointless website (this one) with nothing to really lose across a handful of instances. This is a hobby.

Second, I recently got a dose of humble pie when trying to use PuTTY on Windows to connect to an Ubuntu instance in GCP. I was generally using the gcloud command line for getting my app running, but I got a wild hair up my ass this morning to try and just use PuTTY, to avoid the step of logging into Google Cloud (via Chrome) for administration. I am fairly used to AWS, where I just create an instance, download the .pem file, convert it to a .ppk with PuTTYgen, and then use that along with the default login (ec2-user or ubuntu) to connect to my Minecraft and web servers. GCP was a little different.

Once I read a few docs surfaced by Google searches, the process became much clearer than it had been from the GCP docs alone. Here is how I did it.

Download PuTTYgen if you don’t have it already.

Launch PuTTYgen.

Click on “Generate”. I used a 2048-bit RSA key.

Move your mouse around the box to generate a key.

In the “Key comment” field, replace the data there with a username you want to use to connect to your Compute Engine instance

Copy the ENTIRE contents of the public key (the data in the “public key for pasting…”) box. It should end with the username you want to connect with if you scroll down.

Click on “Save Private Key” and select a location/path that is secure (and one that you will remember!).
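If you have OpenSSH handy (Git Bash, WSL, or Linux), you can generate an equivalent key pair with ssh-keygen instead – though you would still need PuTTYgen to convert the private key to .ppk for PuTTY. This is a sketch; “jonny” and the ./gcp_key path are example values:

```shell
# Generate a 2048-bit RSA key pair; the comment (-C) is the username
# GCP will extract from the public key. No passphrase (-N "") for brevity.
rm -f ./gcp_key ./gcp_key.pub
ssh-keygen -t rsa -b 2048 -C jonny -f ./gcp_key -N "" -q
# The .pub file contents are what you paste into the SSH Keys section.
grep -o 'jonny$' ./gcp_key.pub
```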

Create a new Compute Engine instance or go to an existing instance. From the VM instances page, click on the instance name. In my case it was “minecraft001”.

At the top of the page, click on “Edit“.

Scroll almost all the way to the bottom and you will see an “SSH Keys” section.

Click on “show and edit”

Click on “+ Add Item”

Paste in the key data you copied from PuTTYgen from the step above.

  • You will notice that it extracts the username from your key on the left. This is the username you will use from PuTTY.

On the same page, click on “Save” at the bottom of the page.

On the VM instance details page, find the “External IP” section and copy the IP address (the cascaded window icon will add it to your buffer).

Now open or go back to your PuTTY client (not PuTTYgen).

Paste the IP address into your PuTTY client.

On the left side of the PuTTY client, scroll down to the “Connection” section and click the “+” to expand it

Click the “+” next to the “SSH” section

HIGHLIGHT the “Auth” section. Don’t expand it.

Click on “Browse…”

Find the Private Key file you saved from earlier (should have a .ppk file extension). Double click to select and use it.

Scroll back up and highlight the Session category.

From here you can either name your connection and save it under “Saved Sessions”…or just click the “Open” button.

It should make a connection to your Compute Instance and ask for a username. Supply the username you specified in the step above and voila! I used “jonny” in my example.

That’s it! Happy computing!

Pushing Docker Containers to GitHub

I recently went through the process of building a Dockerfile from scratch. I won’t get into the details of that process, but I did come across an error when trying to publish my package to GitHub Packages.

I tried to do a sudo docker push docker.pkg.github.com/mookyd/mymooky/mymooky:latest (my repo) and was thrown the error:

unauthorized: Your request could not be authenticated by the GitHub Packages service. Please ensure your access token is valid and has the appropriate scopes configured.

It’s pretty clear what needed to happen, but I thought my credentials would be enough since I wasn’t using a script per se. I used docker login, provided my username and password, and tried the command again. Same error.

After doing some reading, I discovered that you need to pass a “Personal Access Token” (PAT) as the password. I generated a PAT under Settings -> Developer Settings -> Personal Access Tokens. I gave the token access to the repo and to read and write packages. I then used docker login and passed the token string as the password. After that, I was able to use docker push to upload my image.

Minikube on VirtualBox on Ubuntu on VirtualBox

I recently needed a small lab environment to sharpen my Kubernetes skills, so I set up Minikube on an Ubuntu VM running 18.04.4 LTS (Bionic). The VM was created on my Windows desktop in VirtualBox. Confused yet? Some of the commands below can leave your environment insecure, so do not do this in a production, Internet-facing environment.

To get started, I downloaded and installed VirtualBox onto my Windows PC, then created an Ubuntu 18.04 VM. Make sure the number of vCPUs on your VM is greater than or equal to 2.

First step is to update your VM.

  • sudo apt-get update
  • sudo apt-get install apt-transport-https (only needed with apt 1.4 or earlier)
  • sudo apt-get upgrade

Install VirtualBox on your Ubuntu VM

  • sudo apt install virtualbox virtualbox-ext-pack

Download Minikube

  • wget https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64

Make it executable

  • sudo chmod +x minikube-linux-amd64

Move it so it’s in your PATH

  • sudo mv minikube-linux-amd64 /usr/local/bin/minikube

Download kubectl

  • curl -LO https://storage.googleapis.com/kubernetes-release/release/$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)/bin/linux/amd64/kubectl

Make it executable

  • chmod +x ./kubectl
  • sudo mv ./kubectl /usr/local/bin/kubectl

Check that it’s working properly

  • kubectl version -o json

I received an error saying docker wasn’t in $PATH. You may or may not see this error.
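A quick way to check ahead of time whether docker (or any other tool) is on your PATH is “command -v”. This is a generic sketch, not anything Minikube-specific – try it with “docker” on your own VM:

```shell
# Prints "found" or "missing" depending on whether the named command
# is on $PATH.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then echo "found"; else echo "missing"; fi
}
check_cmd sh                          # some shell is always present
check_cmd definitely-not-a-real-tool  # prints "missing"
```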

Install docker

  • curl -fsSL https://get.docker.com/ | sh

Start Minikube

  • sudo minikube start --vm-driver=virtualbox

Start the Kubernetes Dashboard

  • minikube dashboard
  • minikube dashboard --url

If you want to view the dashboard remotely, you will need to run the following command:

  • sudo kubectl proxy --address='0.0.0.0' --disable-filter=true

You will get a message saying “Starting to serve on [::]:8001”

Hopefully this helps. If you get stuck or have a way to optimize this, please comment below.

Kudos to https://computingforgeeks.com/how-to-install-minikube-on-ubuntu-18-04/ for helping me get started.