This post covers two major updates to my web application.
1. Upgrading to Python 3.7.4 and Django 2.1.12.
2. Moving the microservices to Docker Swarm.
1. Upgrading to Python 3.7.4 and Django 2.1.12
This section covers the prerequisites and the steps to make the upgrade in a project that was on Python 3.5.3. The release notes, changelog and new features are linked below. Python 3.7.4 is not available in the Ubuntu 18.04 repositories, so it was installed from source. In addition to all the new features and the extended lifespan for the project, Python 3.7.4/3.7.8 is to Python 3 what 2.7 was to Python 2.
Python 3.7.4 release
Release schedule for Python 3.7.4
What's new in Python 3.7.4
Changelog
Django 2.1.12 release notes
Prerequisites
1) Tests: You should have reliable tests for the project. Without tests, upgrading any language or framework will be difficult.
2) Read the changelog and release notes: These help to gauge the effort required and produce an estimate. The objective here is to identify changes that have an impact, especially backward-incompatible ones. The Python version chosen also depends on the versions supported by the libraries used in the project.
Django 2.1.12 has changes that had an impact here: for example, the new model view permission created during database migrations, the database router allow_relation checks and the contrib.auth views. These were in the release notes and did show up as expected during the upgrade.
Steps
1. Build Python 3.7.4 from source and install it to a specific location. For the detailed steps see this. Here it was built with --enable-optimizations, --enable-shared and LDFLAGS set for the target system, roughly as sketched below.
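A minimal sketch of such a build, assuming a hypothetical install prefix of /opt/python3.7.4; adjust the prefix and paths for your system.

# download and unpack the 3.7.4 source
$ wget https://www.python.org/ftp/python/3.7.4/Python-3.7.4.tgz
$ tar xzf Python-3.7.4.tgz && cd Python-3.7.4
# a shared libpython needs an rpath so the interpreter can find it under the chosen prefix
$ ./configure --prefix=/opt/python3.7.4 --enable-optimizations --enable-shared LDFLAGS="-Wl,-rpath,/opt/python3.7.4/lib"
$ make -j "$(nproc)"
# altinstall avoids overwriting the distribution's python3 binary
$ sudo make altinstall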
2. Run the new Python interpreter and check the version. Also run ldd on the Python executable to make sure it resolves the libpython shared object from the correct location, for example:
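Assuming the /opt/python3.7.4 prefix from the sketch above:

# the version should report 3.7.4
$ /opt/python3.7.4/bin/python3.7 --version
# libpython3.7m should resolve under the new prefix, not a system copy
$ ldd /opt/python3.7.4/bin/python3.7 | grep libpython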
3. Identify the Python version requirements of the third-party libraries in the pip requirements file. Create a Python 3.7.4 virtual environment and install the requirements. One of the issues encountered was that numpy 1.14.0 does not have Python 3.7 wheels. Similar version upgrades for celery, kombu and scikit-learn were necessary and were also applied. This takes care of the changes to the requirements for the Python virtual environment.
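A sketch of the environment setup, assuming the prefix above and a requirements.txt at the project root (the paths and names are illustrative):

# create and activate a 3.7.4 virtual environment
$ /opt/python3.7.4/bin/python3.7 -m venv ~/venvs/project-py37
$ source ~/venvs/project-py37/bin/activate
# install the updated pins and watch for missing wheels or build failures
$ pip install --upgrade pip
$ pip install -r requirements.txt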
4. Add Django 2.1.12 to the pip requirements, for example:
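Assuming the dependencies are pinned in requirements.txt (the exact file name will vary by project):

# pin the new Django version and reinstall it in the virtual environment
$ sed -i 's/^Django==.*/Django==2.1.12/' requirements.txt
$ pip install -r requirements.txt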
5. Run the tests in the project.
5 a) Identify each test failure. Some of the breaks encountered were deprecated scikit-learn methods (such as the cross_validation module and a serialization incompatibility in the joblib machine learning model store), Django auth app migrations, database routers, the djangorestframework upgrade to 3.10.3 and the password security rules in Django.
5 b) Change the code to use the new features, move off deprecated methods and changed method signatures, and fix the failures encountered.
5 c) Do this until all tests pass.
Coverage can also be measured while running the tests, for example:
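A sketch using coverage.py with the Django test runner, assuming it is installed in the virtual environment:

# run the test suite under coverage and summarise the results
$ pip install coverage
$ coverage run --source='.' manage.py test
$ coverage report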
6. Once all tests pass, modify the Dockerfile for the Django image to build Python 3.7.4 from source and set up the Python virtual environment. Check inside the container that it is using the correct interpreter and environment, for example:
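Assuming the running container is named django and the virtual environment's bin directory is on PATH in the image (both names are illustrative):

# confirm the interpreter and environment inside the running container
$ docker exec -it django python --version
$ docker exec -it django which python
# confirm the expected Django version is installed
$ docker exec django pip freeze | grep -i django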
The docker image is now ready with the required Python version and requirements.
2. Microservices on Docker Swarm
The web application has 10 services. These are as follows:
1) Web app: The Django project running under uWSGI. Within each web app service there are 6 Django apps, each of which can be run alone or in groups.
2) Memcache: This minimises database hits and improves overall web app performance. Three instance types for three item sizes are used.
3) Celery workers: These run the long-running tasks asynchronously.
4) Celery beat: This initiates periodic tasks.
5) Flower: A tool used to monitor Celery.
6) RabbitMQ: The message broker for Celery.
7) Load balancer: nginx sitting in front of the web app service.
8) Static server: Serves static assets such as JS and CSS.
9) Media server: Serves media that users have uploaded, such as avatar pictures, machine learning models, out-of-bag data and the like.
10) Database: Two Postgres instances.
Of these, all except the database have been moved to Docker. The reasons for not moving the database services are similar to those listed below.
Also, if a database needs to be scaled, it has its own options for that. Scaling the services individually using containers, and the database using its own options, keeps things predictable.
The directory structure looks like this:
Starting the application involves initialising a Docker swarm and then deploying the stack on it. The two commands are:
$ docker swarm init --advertise-addr <network>
$ docker stack deploy -c <the-stack.yml file> <a name for your application>
Check the services that make up the stack with:
$ docker stack services <name of your application>
As mentioned in the previous post, scaling any individual service involves increasing its replicas count in the stack's yml file and redeploying the stack, for example:
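A sketch of the redeploy, reusing illustrative names for the stack file and application:

# after raising the replicas count for a service in the-stack.yml, redeploy;
# swarm reconciles the running replicas to the new count
$ docker stack deploy -c the-stack.yml myapp
# verify the updated replica counts
$ docker service ls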