Python multiprocessing.Queue put kills process in Docker container

Posted by brvh (Guest)
I'm running a small multiprocessing test in a Docker container. The test uses multiprocessing.Queue with a consumer and a producer. The minimal code that leads to the crash can be found below, together with the Dockerfile. In the code, the main process starts a producer process that receives a reference to the Queue and uses Queue.put() to send an integer back to the main process. What I'm seeing is that the producer process gets killed during the call to .put(); no exception is raised whatsoever.

Any ideas on why the .put() is killing the process? This works locally (macOS) but not inside a container built from the official Python base image. The Python version is 3.9.16.

Code:
import multiprocessing as mp
import time
import traceback
from typing import Any, Optional

import psutil

base_file = "logs.txt"


def main() -> None:
    queue: Any = mp.Queue()
    print("Queue created")
    print("Starting producer process")

    p = mp.get_context("spawn").Process(target=producer, args=(queue,), daemon=True)
    p.start()
    print(f"Main: producer started: {p.pid}")

    alive = True
    while alive:
        alive = p.is_alive()
        print(f"Ha ha ha staying alive, producer: {p.is_alive()}")
        time.sleep(1)

    print("Every process is dead :( ")

def producer(q: mp.Queue) -> None:
    with open(f"producer.{base_file}", "w") as f:
        print("Producer: started", file=f, flush=True)
        current_value: int = 0
        while True:
            print(f"Producer: Adding value {current_value} to queue", file=f, flush=True)
            try:
                q.put(current_value, block=False)
            except BaseException as e:
                print(f"Producer: exception: {e}", file=f, flush=True)
                print(f"{traceback.format_exc()}", file=f, flush=True)
                raise e
            print(f"Producer: Value {current_value} added to queue", file=f, flush=True)
            print("Producer: Sleeping for 1 second", file=f, flush=True)
            time.sleep(1)
            current_value += 1


if __name__ == "__main__":
    main()
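Since the producer dies without any traceback, one quick check (a debugging sketch, not part of the original post) is the child's Process.exitcode after it dies: a negative value means the child was terminated by that signal, and -9 (SIGKILL) inside a container typically points at a memory limit or the kernel OOM killer, since SIGKILL cannot be caught by Python at all.

```python
import multiprocessing as mp
import os
import signal


def doomed_producer() -> None:
    # Stand-in for a producer that the kernel kills outright:
    # SIGKILL cannot be caught, so no Python traceback appears.
    os.kill(os.getpid(), signal.SIGKILL)


def run_and_report() -> int:
    # "fork" keeps this sketch import-safe; the exitcode check
    # works exactly the same with the "spawn" context.
    ctx = mp.get_context("fork")
    p = ctx.Process(target=doomed_producer)
    p.start()
    p.join()
    # Process.exitcode is negative when the child died from a
    # signal: -9 here means SIGKILL (what the OOM killer sends).
    return p.exitcode


if __name__ == "__main__":
    print(f"producer exitcode: {run_and_report()}")
```

Logging p.exitcode in the main loop above would distinguish a signal kill from a clean exit without touching the producer's log file.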

Code:
FROM python:3.9.16

RUN apt-get update && apt-get install -y gettext git mime-support && apt-get clean

RUN python3 -m pip install psutil

COPY ./multiprocessing_e2e.py /src/multiprocessing_e2e.py

WORKDIR /src

CMD ["python", "-u", "multiprocessing_e2e.py"]
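Not an answer from the thread, but one detail worth checking in the code above: mp.Queue() is created from the default context while the Process comes from get_context("spawn"). The default start method is fork on Linux but spawn on macOS since Python 3.8, which lines up with the macOS-vs-container difference being reported. A minimal sketch that creates both from a single context (using "fork" here so the snippet also runs when imported; the same pattern applies with "spawn"):

```python
import multiprocessing as mp
import time
from typing import List


def producer(q) -> None:
    # Same producer loop as in the question, minus the file logging.
    value = 0
    while True:
        q.put(value, block=False)
        time.sleep(0.2)
        value += 1


def consume(n: int) -> List[int]:
    # Key point: create the Queue from the SAME context as the
    # Process. mp.Queue() uses the default start method (fork on
    # Linux, spawn on macOS for Python 3.8+), so mixing it with a
    # spawn-context Process behaves differently per platform.
    ctx = mp.get_context("fork")
    queue = ctx.Queue()
    p = ctx.Process(target=producer, args=(queue,), daemon=True)
    p.start()
    values = [queue.get(timeout=10) for _ in range(n)]
    p.terminate()
    p.join()
    return values


if __name__ == "__main__":
    print(consume(3))
```

Whether this is the actual cause of the silent kill is an assumption; but keeping the Queue and Process in one context removes one platform-dependent variable from the reproduction.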