I'm recreating the test scenario from the tutorial below, but I constantly get this error:
https://learn.microsoft.com/en-us/azure/databricks/notebooks/testing
ImportError while importing test module '/Workspace/python_tests/dummy_test.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.12/importlib/__init__.py:90: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
E ModuleNotFoundError: No module named 'dummy_test'
=========================== short test summary info ============================
ERROR dummy_test.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 0.45s ===============================
This is the structure of my /Workspace/python_tests directory and the contents of each file. All three files reside in the same directory:
dummy.py
import pyspark
from pyspark.sql import SparkSession
from pyspark.sql.functions import col
# Because this file is not a Databricks notebook, you
# must create a Spark session. Databricks notebooks
# create a Spark session for you by default.
spark = SparkSession.builder.appName('integrity-tests').getOrCreate()
def dummy():
    return "I am a dummy"
dummy_test.py
# import pytest
import pyspark
from dummy import *
def test_dummy():
    assert dummy() == "I am a dummy"
pyTestRunner (notebook)
%pip install pytest
import pytest
import sys
# Skip writing pyc files on a readonly filesystem.
sys.dont_write_bytecode = True
# Run pytest.
retcode = pytest.main([".", "-v", "-p", "no:cacheprovider"])
# Fail the cell execution if there are any test failures.
assert retcode == 0, "The pytest invocation failed. See the log for details."
I had the same issue when I found this post. I was able to resolve it by moving dummy_test.py into a subfolder of the tests folder from which I execute pytest (i.e., a subfolder of the directory where your pyTestRunner notebook lies). So:
tests/
    py_test_runner.ipynb
    dummy_tests/
        dummy_test.py
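With that layout the runner cell can stay exactly as in the question; pytest simply collects dummy_test.py from the subfolder. If you prefer to be explicit, here is a minimal sketch of the same cell pointed only at the subfolder (assuming the notebook's working directory is the tests folder, %pip install pytest has already run in an earlier cell, and dummy.py is still importable, e.g. because it sits next to the runner):

import sys
import pytest

# Avoid writing .pyc files on a read-only filesystem.
sys.dont_write_bytecode = True

# Collect only the tests inside the dummy_tests subfolder.
retcode = pytest.main(["dummy_tests", "-v", "-p", "no:cacheprovider"])

assert retcode == 0, "The pytest invocation failed. See the log for details."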
Hope this helps someone encountering the same issue and landing at this question.
python -m pytest python_tests_folder. In a Databricks context this may be different. More information about the -m option can be found here: stackoverflow.com/questions/7610001/… – Bouke, Jan 12 at 19:58
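For what it's worth, a hedged sketch of what that -m style invocation could look like from inside a Databricks notebook, assuming the tests live under /Workspace/python_tests and that spawning a child interpreter is acceptable in your environment (an illustration, not the approach documented in the tutorial):

import subprocess
import sys

# python -m pytest runs pytest in a child interpreter and, unlike the bare
# pytest command, also puts that process's current directory on sys.path.
result = subprocess.run(
    [sys.executable, "-m", "pytest", "/Workspace/python_tests", "-v", "-p", "no:cacheprovider"],
    capture_output=True,
    text=True,
)

print(result.stdout)
print(result.stderr)
assert result.returncode == 0, "The pytest invocation failed. See the output above."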