
What's the best practice method for implementing common functions in a Python microservices architecture?

Background Info
I’m currently working on the software for a device that reads, writes, and processes data from many different sensors and modules.
The device runs Alpine Linux and handles most of its communication over various serial interfaces.

Because we need to handle so many different modules and processes, we decided on something like a microservices architecture. It is not full microservices in the traditional sense, since the services don’t expose individual web APIs for their I/O; instead they all communicate through Redis. The large majority of the services are written in Python, with a couple in Go.

Question
In my implementation, I’ve run up against one of those subjective best-practice questions. There are a few common functions, constants, and the like that most services will make use of. What is the best way to implement them?


Supporting Info
Here is a representation of the directory structure:

services/
├─ adc_read/
│  ├─ adc_read.ini
│  ├─ adc_read.py
├─ streamer/
│  ├─ go.mod
│  ├─ go.sum
│  ├─ streamer.ini
│  ├─ streamer.go
├─ settings_handler/
│  ├─ settings.yaml
│  ├─ settings_handler.py
│  ├─ settings_handler.ini
├─ .../
├─ common.py

I’ve tried putting a file common.py containing all the common functions (at least for the services written in Python) in the top-level services directory, but doing so leads to some strangeness because of the microservices architecture. Each service is run directly (managed by Supervisord; there is no top-level file, application, or script that runs all the services), so to import common.py from any of the Python services I have to modify the path like so: sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))). This seems pretty nasty in a microservices architecture and strikes me as an antipattern.
The same problem would occur if I wanted to create a parent class in the top-level directory for all the services to inherit from, so that option is out too (I think).

Anyone have any suggestions, ideas or alternatives besides simply adding the common functions to every single service?

Cheers!

Solution:

In a microservices architecture, especially one where you’re dealing with multiple services written in different languages (like Python and Go in your case), it’s important to handle shared functionality in a way that doesn’t break the independence of each service.

1. Create a Shared Python Package

One of the best practices here would be to create a shared Python package for your common functions and constants. Instead of having a common.py file at the top level, you could create a proper package, say shared_lib, and install it in each service that needs it. Here’s a quick outline:

  1. Create a directory structure for your package:

    shared_lib/
    ├── __init__.py
    ├── utils.py
    ├── constants.py
    
  2. Add a setup.py to make it installable:

    from setuptools import setup, find_packages
    
    setup(
        name="shared_lib",
        version="0.1",
        packages=find_packages(),
    )
    
  3. Now, in each of your services, you can install this package using pip:

    pip install /path/to/shared_lib
    

    This way, you avoid messing with PYTHONPATH and keep your code clean. Plus, it’s easy to update the shared package across services.
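For concreteness, shared_lib/utils.py might hold small, dependency-free helpers like these (clamp and scale_adc are hypothetical names for illustration, not from the question):

```python
# shared_lib/utils.py -- illustrative shared helpers (hypothetical names)


def clamp(value, low, high):
    """Restrict value to the inclusive range [low, high]."""
    return max(low, min(high, value))


def scale_adc(raw, bits=10):
    """Convert a raw ADC reading to a fraction between 0.0 and 1.0."""
    full_scale = 2 ** bits - 1
    return clamp(raw, 0, full_scale) / full_scale
```

During development, pip install -e /path/to/shared_lib installs the package in editable mode, so changes to the shared code are picked up by every service without reinstalling.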

2. Using Git Submodules

If you’re looking for something a bit lighter than creating a full package, you might consider using a Git submodule. You can move your common.py (or other shared code) to a separate repo and then add it as a submodule in each service repo. Here’s how you can do it:

  1. Create a new repo for your shared code.

  2. In each service repo, add it as a submodule:

    git submodule add <url-to-shared-repo> common
    
  3. Then, you can just import it as you normally would:

    from common import utils
    

This approach keeps things straightforward and versioned alongside your services.

3. Environment Configuration

Another option is to manage PYTHONPATH at the environment level rather than directly in your code. For example, you can set PYTHONPATH in your Supervisor config or as an environment variable in your Docker setup. This avoids the need to modify sys.path in your scripts.

export PYTHONPATH=$PYTHONPATH:/path/to/services

This way, you can import common.py across services without hardcoding anything in your Python files.
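Since the services are already managed by Supervisord, the natural place to set this is each program’s config section (all paths below are illustrative, not from the question):

```ini
; adc_read.ini -- illustrative Supervisord program section
[program:adc_read]
command=python3 /opt/services/adc_read/adc_read.py
directory=/opt/services/adc_read
; Put the top-level services directory on the import path
environment=PYTHONPATH="/opt/services"
autorestart=true
```

With the environment set here, adc_read.py can simply do import common with no sys.path manipulation.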

Conclusion

In short, the best approach depends on how complex your shared functionality is and how frequently you update it. If you need a robust, scalable solution, creating a shared package is likely the best route. If you want something simpler, using a Git submodule or managing PYTHONPATH via environment variables are viable alternatives.


This should help maintain modularity and cleanliness in your microservices setup while avoiding the pitfalls of directly manipulating PYTHONPATH within your code.
