Best practices for Python imports in a production project: handling relative/absolute imports across different directories and testing


I have the following folder structure in python:

my_project/
├── Dockerfile
├── Makefile
├── run.py
├── data/
│   ├── raw/
│   └── processed/
└── src/
    ├── __init__.py
    ├── config.py 
    ├── settings.env
    └── response/ 
        ├── __init__.py
        ├── llm.py
        ├── instances.py
        └── get_response.py

Context

I started with a simple project to get structured responses from an LLM using Python. As the project grew, I decided to make it more production-ready by adding proper structure and organization. However, I'm unsure if my current structure is optimal, particularly regarding import handling.

Current Issues

Initially, when all code was in one folder, imports were straightforward:

# When everything was in one folder
from llm import get_completion
from instances import MyClass

After restructuring and using run.py as the main entry point, I had to modify imports to work from the parent directory:

# In run.py
from src.response.llm import get_completion
from src.response.instances import MyClass

Specific Questions

Is this the correct way to structure a production-ready Python project? How should I handle imports when I want to:

  • Run tests from a separate test directory?
  • Execute files directly within their folders (e.g., for development/debugging)?
  • Use the if __name__ == '__main__': block with test code in individual files?

Do I need to modify import statements every time I run files from different locations? Is adding the project root to the Python path when running files directly the best available option? A rough sketch of the workaround I mean is below.
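
Something like this at the top of a script (the parents[2] depth is just an example for a file in src/response/; I'm not sure this is the right approach):

# Sketch of the sys.path workaround I'm referring to.
import sys
from pathlib import Path

PROJECT_ROOT = Path(__file__).resolve().parents[2]  # src/response/ -> my_project/
sys.path.insert(0, str(PROJECT_ROOT))

# Now absolute imports from the project root work even when this file
# is executed directly from its own directory.
from src.response.llm import get_completion  # noqa: E402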

Technical Details

Python version: 3.11

Current behavior: Files only run correctly when executed from the parent directory

Desired behavior: Ability to run files, tests, and debug code from any location without constantly modifying imports (if possible)

Currently I cannot run my various scripts from their own locations; I have to run them from the project root. How can I make them work from any location?

  • 1 This question is similar to: What is the best project structure for a Python application?. If you believe it’s different, please edit the question, make it clear how it’s different and/or how the answers on that question are not helpful for your problem. – Abhijit Sarkar Commented Jan 4 at 23:03
  • 1 The choices you are making are not generic, and don't affect third parties other than people using your software directly. So, there's not really a way you should be doing things, unlike when developing a package that is intended for use in other people's projects. Since the current behaviour is not the desired behaviour, it seems you're really asking: "currently I cannot run my various scripts from their own location, I have to run them from the project root - how can I make them work from any location" - modifying your code to change to the project root automatically would be one solution. – Grismar Commented Jan 4 at 23:15
  • @Grismar Thank you for your suggestion. I have edited my question in order to be more precise! I was thinking about that, adding project root to Python path when running directly, but was not sure whether this is the best possible option. – vossi Commented Jan 4 at 23:29

1 Answer

Here's how I would design it:

my_project/
├── Dockerfile
├── Makefile
├── pyproject.toml  # The newest official packaging file
├── tests/
│   └── test_stuff.py
├── data/
│   ├── raw/
│   └── processed/
└── my_project/  # More comfortable namespacing
    ├── __init__.py
    ├── __main__.py  # Run with python -m my_project
    ├── config.py 
    ├── settings.env
    └── response/ 
        ├── __init__.py
        ├── llm.py
        ├── instances.py
        └── get_response.py
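
As an illustration, __main__.py only needs to call into whatever run.py does today (main here is a placeholder, not something your code already defines):

# my_project/__main__.py -- executed by `python -m my_project`
def main() -> None:
    # Placeholder: import and call your real entry point here, e.g.
    # from my_project.response import get_response
    print("my_project entry point")

if __name__ == "__main__":
    main()

With this layout, python -m my_project works from any directory once the package is installed.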

You can then install the project in editable mode with pip install -e ., which lets you keep editing the code while it stays "installed", so tests and scripts can import it from anywhere.
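
For reference, a minimal pyproject.toml that makes pip install -e . work could look roughly like this (setuptools is just one possible build backend, and the metadata below is placeholder):

[build-system]
requires = ["setuptools>=68"]
build-backend = "setuptools.build_meta"

[project]
name = "my_project"
version = "0.1.0"
requires-python = ">=3.11"

[tool.setuptools.packages.find]
include = ["my_project*"]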

After doing so, tests can be run with python -m unittest discover or simply pytest, depending on your framework of choice. The tests just import my_project; there is no need for an if __name__ == '__main__': block, though you can add one if you wish (I rarely run specific test files, and when I do I use unittest's -k option).
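
For example, a test file can import the installed package directly, no matter where the test runner is invoked from (the test below is illustrative and assumes the editable install described above):

# tests/test_stuff.py
import unittest

from my_project.response import llm  # the module from the layout above

class TestImports(unittest.TestCase):
    def test_llm_module_is_importable(self):
        self.assertTrue(hasattr(llm, "__name__"))

if __name__ == "__main__":
    unittest.main()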

There is no right or wrong, but this is a simple viable setup that I've used in production serving millions of clients.
