Setting up Log Analysis and the KVS Stream with DeepRacer for Cloud on EC2

Finlay Macrae
Feb 9, 2022

My last article showed how to get an EC2 (Elastic Compute Cloud) instance up and running. This one focuses on how to use it, and on configuring log analysis.

If you want to watch your model training and run log analysis, you will need to open some ports on your EC2 instance.

Search for EC2 under Services and click EC2:

Click on your instance ID:

Click on the Security tab, then click the security group name (take a note of your Public IPv4 DNS while you are here, as that’s the best way to connect to the instance from your browser later):

Now click Edit inbound rules:

Update to add TCP ports 8888 (for log analysis) and 8080 (for the viewer) like so, then click Save rules:
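If you prefer to script this instead of clicking through the console, a boto3 call along these lines should work. This is a minimal sketch: the security group ID is a placeholder for your own, and 0.0.0.0/0 opens the ports to the whole internet, so consider restricting the CIDR to your own IP.

import boto3

ec2 = boto3.client('ec2')

# sg-0123456789abcdef0 is a placeholder -- substitute your security group ID.
# 0.0.0.0/0 allows all source IPs; narrow this to your own address if you can.
ec2.authorize_security_group_ingress(
    GroupId='sg-0123456789abcdef0',
    IpPermissions=[
        {'IpProtocol': 'tcp', 'FromPort': 8888, 'ToPort': 8888,
         'IpRanges': [{'CidrIp': '0.0.0.0/0', 'Description': 'Jupyter log analysis'}]},
        {'IpProtocol': 'tcp', 'FromPort': 8080, 'ToPort': 8080,
         'IpRanges': [{'CidrIp': '0.0.0.0/0', 'Description': 'Stream viewer'}]},
    ],
)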

Now, if you are training, browse to your Public IPv4 DNS on port 8080 and you should be able to watch the car driving and see what it is seeing:

Click on the /racecar/deepracer/kvs_stream link to see the model training:

View of the model training.

If this isn’t showing, check that you are currently training and that you have this parameter in system.env set to True:

DR_KINESIS_STREAM_ENABLE=True
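You can confirm the flag without opening an editor. This sketch assumes deepracer-for-cloud is checked out at ~/deepracer-for-cloud; adjust the path to match your setup.

import os

# Path is an assumption -- point this at your own deepracer-for-cloud checkout.
env_path = os.path.expanduser('~/deepracer-for-cloud/system.env')
with open(env_path) as f:
    for line in f:
        if line.startswith('DR_KINESIS_STREAM_ENABLE'):
            print(line.strip())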

For log analysis, run dr-start-loganalysis:

Take a copy of the token you are given, then in your browser go to the Public IPv4 DNS name for your instance on port 8888. For me this is http://ec2-44-200-26-118.compute-1.amazonaws.com:8888
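If you didn’t note down the DNS name earlier, you can also look it up programmatically. This is a sketch assuming your boto3 credentials are configured; the instance ID is a placeholder for your own.

import boto3

ec2 = boto3.client('ec2')
# i-0123456789abcdef0 is a placeholder -- use your own instance ID.
resp = ec2.describe_instances(InstanceIds=['i-0123456789abcdef0'])
dns = resp['Reservations'][0]['Instances'][0]['PublicDnsName']
print(f'http://{dns}:8888')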

Paste the token in and click Log in and you’ll see something like this:

Double click on Training_analysis.ipynb

Edit the cell that loads the logs to this (swap out the model name and bucket name for the ones you see in S3):

import boto3
import os

# DeepRacerLog and pprint are imported by the notebook's earlier cells;
# if you are running this standalone, these two lines cover them.
from pprint import pprint
from deepracer.logs import DeepRacerLog

model_logs_root = 'rl-deepracer-1/'
my_bucket = 'deepracer-finlay-macrae'

# Recursively download everything under the prefix, mirroring the S3
# folder structure under the local directory.
def download_dir(client, resource, dist, local='/tmp', bucket='your_bucket'):
    paginator = client.get_paginator('list_objects')
    for result in paginator.paginate(Bucket=bucket, Delimiter='/', Prefix=dist):
        if result.get('CommonPrefixes') is not None:
            for subdir in result.get('CommonPrefixes'):
                download_dir(client, resource, subdir.get('Prefix'), local, bucket)
        for file in result.get('Contents', []):
            dest_pathname = os.path.join(local, file.get('Key'))
            if not os.path.exists(os.path.dirname(dest_pathname)):
                os.makedirs(os.path.dirname(dest_pathname))
            if not file.get('Key').endswith('/'):
                resource.meta.client.download_file(bucket, file.get('Key'), dest_pathname)

client = boto3.client('s3')
resource = boto3.resource('s3')
download_dir(client, resource, model_logs_root, '/workspace', bucket=my_bucket)

# Load the downloaded logs into a dataframe
log = DeepRacerLog(model_logs_root)
log.load()
try:
    pprint(log.agent_and_network())
    print("-------------")
    pprint(log.hyperparameters())
    print("-------------")
    pprint(log.action_space())
except Exception:
    print("Robomaker logs not available")
df = log.dataframe()

Ignore the ‘Robomaker logs not available’ warning you get.

You can see I got the values from my S3 bucket:

Modify model_logs_root each time you increment training to review your latest batch of training. You can load the logs mid-session too, so you can check progress periodically; just rerun the cell to reload the files.
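Once the dataframe is loaded, a quick aggregation is a handy sanity check that the logs came down correctly. This sketch assumes the standard deepracer-utils column names ('episode', 'reward', 'progress'); adjust if your notebook version differs.

# Summarise reward and progress per episode from the loaded dataframe.
per_episode = df.groupby('episode').agg(
    total_reward=('reward', 'sum'),
    max_progress=('progress', 'max'),
)
print(per_episode.tail(10))  # the most recent episodes from this batch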

Next article will show how to add a new track.


Finlay Macrae

Virgin Money UK IT manager and AWS DeepRacer Pro racer.