Django FileDescriptor in settings


In my Django project, I want to use the MaxMind database file to acquire information about IP addresses.
The file is about 90 megabytes. I expect about 50 requests per second, and the application will be hosted on Gunicorn with 3 workers. Will I stay within those limits if I open the file on every request?

with maxminddb.open_database('GeoLite2-City.mmdb') as mmdb:
    info = mmdb.get(ip)
    return Response({'city': info['city'], 'other': 'some_other_information_about_ip'})

or should I open the file only once:

mmdb = maxminddb.open_database('GeoLite2-City.mmdb')

and use this descriptor in my views?

from django.conf import settings

info = settings.mmdb.get(ip)
return Response({'city': info['city'], 'other': 'some_other_information_about_ip'})

I am concerned about several problems:

  1. If two workers try to read the file simultaneously, will there be conflicts?
  2. Will the file descriptor be properly closed if the application fails?
  3. What is the best practice for Django apps to keep files in memory?

> Solution:

  1. No, there will be no conflicts. Two workers can read the same file simultaneously.
  2. If you’re using with, you’ll be alright.
  3. Not to keep them in memory at all, if possible.

The maxminddb module will not load the file into memory if it can avoid it (see the documentation). You shouldn’t open it in settings, but in your view.

You can also slap a lru_cache decorator on the mmdb-accessing code, so each worker will cache up to (e.g.) the 128 most recently used results. And, of course, you can replace that with the Django cache to share it between workers.

import maxminddb
from functools import lru_cache

@lru_cache(maxsize=128)  # optional: cache the most recent lookups per worker
def get_ip_info(ip):
    with maxminddb.open_database("GeoLite2-City.mmdb") as mmdb:
        return mmdb.get(ip)

def my_view(request):
    info = get_ip_info(...)
    return Response({
        "city": info["city"],
        "other": "some_other_information_about_ip",
    })
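To see why the lru_cache decorator helps, here is a minimal, self-contained sketch of the caching behavior. The fake_geo_lookup function is a hypothetical stand-in for mmdb.get, since the real call needs the GeoLite2-City.mmdb file on disk; only the caching pattern is the point.

```python
from functools import lru_cache

CALLS = 0  # counts how often the underlying (expensive) lookup actually runs

def fake_geo_lookup(ip):
    # Hypothetical stand-in for opening the database and calling mmdb.get(ip).
    global CALLS
    CALLS += 1
    return {"city": "Example City", "ip": ip}

@lru_cache(maxsize=128)  # keep up to the 128 most recently used results
def get_ip_info(ip):
    return fake_geo_lookup(ip)

get_ip_info("203.0.113.7")
get_ip_info("203.0.113.7")  # served from the cache, no second lookup
print(CALLS)  # → 1
```

Note that lru_cache is per-process, so each Gunicorn worker keeps its own cache; swapping it for Django’s cache framework (django.core.cache) would let the workers share results at the cost of a serialization round-trip.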
