TROY CARLSON
How I Get A Django Shell On Google App Engine Standard Environment
July 21, 2019

Note: the title is a bit misleading since I don't run Django on the Google App Engine (GAE) instance itself. Rather, I run Django locally but connected to my Google Cloud SQL database. If this doesn't solve your problem or you run into roadblocks, tweet at me (@troy_carlson) and I'll see if I can help.


If you're reading this, I'm assuming you know what Django and the Django shell are. If not, you're definitely in the wrong place.

Anyway, sometimes it's convenient to access the Django shell in your production environment for administrative purposes, ad-hoc queries, etc. I recently spun up a new Django project using Google App Engine (GAE) backed by a Postgres Cloud SQL database on Google Cloud Platform (GCP). Since the database was fresh, I needed to seed it with a super user. The easiest way to do this is to use the provided createsuperuser command:

$ python3 manage.py createsuperuser

That's easy enough for adding a super user to my local development database, but how do I get one created in my production database? My first thought was to SSH into one of my production GAE instances and run the exact same command. But when I logged into my GCP dashboard to find the SSH link for my instances, that option wasn't available. Oh... it turns out SSH isn't available in the GAE Standard environment; it's only offered in the GAE Flexible environment. Wonderful. After reviewing the differences between environments, I decided I really didn't want to switch to the Flexible environment.

If you're using the Flexible environment, this is your cue to close this page and just go the SSH route. But for a Standard simpleton like myself, how do I get this damn super user created?

My next thought was to simply copy the user row from the local database into the production database. While that would get the job done, I didn't want to kick this proverbial can further down the road. What about the next time I want to do some administrative work from the shell against production data? Time to over-engineer something!

1. Overview

The solution I arrived at, and remain happy with, involves installing the Cloud SQL Proxy, modifying Django settings to play nice with the Cloud SQL Proxy, and scripting the setup/teardown to make connecting a simple one-line command:

$ django_shell_prod

Starting Cloud SQL Proxy...
Python 3.7.3
[Clang 10.0.1] on darwin
(InteractiveConsole)

>>> print('It works!')
It works!

Note that I'm working with Postgres, so the links in this post point to Postgres-related documentation; MySQL versions are also available.

2. Install and configure Cloud SQL Proxy

I followed the installation instructions to get a copy of the cloud_sql_proxy binary set up, enabled the Google SQL Admin API, and created a service account with the appropriate role(s). I chose to put the binary and all other GCP related files in a new ~/gcp directory:

$ mkdir ~/gcp
$ mv ~/cloud_sql_proxy ~/gcp/

As part of setting up a service account, I needed to generate and download a JSON key (described in the above installation instructions). I then moved that key into the ~/gcp directory (obviously your file name will look different):

$ mv ~/Downloads/your-app-12345.json ~/gcp/

The last thing I put in the ~/gcp directory was a text file containing the production database connection details:

$ touch ~/gcp/production.txt
$ echo DB_CONNECTION_NAME=your-app-id:region-id:database-name >> ~/gcp/production.txt
$ echo DB_NAME=database-name >> ~/gcp/production.txt
$ echo DB_PASSWORD=yourVERYstrongPASSWORD >> ~/gcp/production.txt

The DB_CONNECTION_NAME can be found on the instance details page in the GCP dashboard.
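For reference, the key=value format above is trivial to parse programmatically. Here's a hypothetical Python sketch (the file path and key names match the example above) of what the shell helper later in this post extracts — note it splits on the first '=' only, so a password containing '=' survives intact:

```python
def parse_db_config(path):
    """Parse simple KEY=VALUE lines from a connection-details file."""
    config = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and '=' in line:
                # partition() splits on the FIRST '=' only, so values
                # containing '=' (e.g. passwords) are preserved
                key, _, value = line.partition('=')
                config[key] = value
    return config
```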

Reminder: don't check anything from this directory into source control!

3. Django settings

A few things need to be set up in the DATABASES object in the settings.py file. NAME, PASSWORD, HOST, and PORT should be retrieved from environment variables. This setup may be more complicated when deploying to multiple environments or connecting to multiple databases, but here's roughly what mine looks like:

# settings.py

...

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('DB_NAME'),
        'USER': os.getenv('DB_USER'),
        'PASSWORD': os.getenv('DB_PASSWORD'),
        'HOST': os.getenv('DB_HOST'),
        'PORT': os.getenv('DB_PORT', default='5432'),
    }
}

...

I set a default value for PORT since Postgres' default port is 5432, but when running the Cloud SQL Proxy I use a different port so as not to conflict with my local Postgres server. Defaults can be provided for the other keys, but I personally keep the rest of my environment variables in a local .env file to reduce the risk of a rogue hard-coded default getting deployed to production and wreaking havoc.
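To make that default-port behavior concrete, here's a standalone sketch (no Django required) of how the values resolve. The env var names match the settings.py snippet above; the function name is just for illustration:

```python
import os

# Standalone sketch of the DATABASES['default'] lookup above.
# Only DB_PORT has a default (Postgres' standard port); everything
# else must come from the environment, e.g. a local .env file.
def database_config():
    return {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('DB_NAME'),
        'USER': os.getenv('DB_USER'),
        'PASSWORD': os.getenv('DB_PASSWORD'),
        'HOST': os.getenv('DB_HOST'),
        'PORT': os.getenv('DB_PORT', '5432'),
    }
```

With DB_PORT unset this falls back to '5432'; when the proxy exports DB_PORT=5433, that value wins, so the proxy connection never collides with a local Postgres server.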

4. Fish shell script

Now on to the fun part: writing custom Fish shell functions! I've been really into automating things with Fish lately since I use it as my daily-driver shell and it has a really nice scripting syntax. If you're following along and don't use Fish, you'll have to port this over to work with your shell of choice (Zsh, Bash, etc.) but the basic structure shouldn't need to change much.

Here's a summary of what needs to happen:

  • Get the production database connection details from the ~/gcp/production.txt file I created earlier.
  • Start the Cloud SQL Proxy, providing the appropriate connection name, port, and service account credential file. Redirect the output to /dev/null and background it so I can run the Django interactive shell simultaneously without the proxy output getting garbled with the shell output.
  • Start the Django shell, specifying the database environment variables for the running proxy.
  • When the shell exits, kill the backgrounded proxy job.

First, I created a new file in the Fish functions directory so it gets autoloaded (if the functions directory doesn't exist, just create it):

$ touch ~/.config/fish/functions/django_shell_prod.fish

And here's the script:

# django_shell_prod.fish

function django_shell_prod
  # Get the production database connection details from ~/gcp/production.txt
  set cnn (get_django_var DB_CONNECTION_NAME)
  set db (get_django_var DB_NAME)
  set pw (get_django_var DB_PASSWORD)

  # Specify a port other than Postgres' default port (5432)
  # to prevent conflicts with local Postgres server
  set port 5433

  # Start the proxy, redirect the output to the null device, and background it
  echo 'Starting Cloud SQL Proxy...'
  ~/gcp/cloud_sql_proxy -instances=$cnn=tcp:$port \
                        -credential_file=$HOME/gcp/your-app-12345.json \
                        2>/dev/null &

  # Give the proxy a few seconds to start up before
  # attempting to connect to it
  sleep 2

  # Set the proxy-specific, command-specific environment variables
  # and start the Django shell
  begin
    set -lx DB_NAME $db
    set -lx DB_PASSWORD $pw
    # The proxy listens on localhost when given a tcp: port; the
    # /cloudsql/... socket path only applies on App Engine itself
    set -lx DB_HOST 127.0.0.1
    set -lx DB_PORT $port
    set -lx USE_CLOUD_SQL_PROXY 1
    python3 manage.py shell
  end

  # Kill the backgrounded proxy job
  echo 'Killing Cloud SQL Proxy...'
  set proxy_job_id (jobs | grep cloud_sql_proxy | awk '{print $1}')
  if test -n "$proxy_job_id"
    kill %$proxy_job_id
  end
end

function get_django_var
  # Returns the value associated with the provided key.
  # Anchor the match to the start of the line and split on the first
  # '=' only, so passwords containing '=' come through intact.
  grep "^$argv[1]=" ~/gcp/production.txt | cut -d = -f 2-
end
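One caveat: the sleep 2 in the script is just a guess at the proxy's startup time. A more robust approach is to poll the forwarded port until it actually accepts connections. Here's a hypothetical sketch of that idea in Python (host and port matching the script above are assumptions):

```python
import socket
import time

def wait_for_port(host, port, timeout=10.0):
    """Poll until a TCP connection to (host, port) succeeds.

    Returns True as soon as a connection is accepted, or False if
    the timeout expires first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection raises OSError (refused/timeout)
            # until the proxy is actually listening
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)
    return False
```

The fish equivalent would loop on something like `nc -z 127.0.0.1 $port` instead of sleeping a fixed two seconds.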

If everything is set up correctly, the proxy start message and a healthy, happy shell prompt should appear:

$ django_shell_prod

Starting Cloud SQL Proxy...
Python 3.7.3
[Clang 10.0.1] on darwin
(InteractiveConsole)

>>> print('It works!')
It works!

Finally, I can create my super user:

>>> from django.contrib.auth.models import User
>>> user = User.objects.create_user('admin', password='password1')
>>> user.is_superuser = True
>>> user.is_staff = True
>>> user.save()

Tweet at me (@troy_carlson) if this worked for you, if you have questions, or if you can point out that this entire effort was completely pointless because this is achievable in a simpler, more appropriate way :)

Troubleshooting

Here are a few things to keep in mind if you're trying to replicate this and things break:

  • Make sure you've enabled the Google SQL Admin API.
  • Make sure your service account has the appropriate roles.
  • Make sure the database name, password, etc. you specify in ~/gcp/production.txt are correct.
  • Make sure your Django settings are reading from the correct environment variables.
  • Make sure you're providing command-specific environment variables when starting the shell.