Using 100% of all cores with the multiprocessing module

  • Thread starter: Ggggggg (Guest)
I have two pieces of code that I'm using to learn about multiprocessing in Python 3.1. My goal is to use 100% of all the available processors. However, the code snippets here only reach 30%-50% on all processors.

Is there any way to 'force' Python to use all 100%? Is the OS (Windows 7, 64-bit) limiting Python's access to the processors? While the code snippets below are running, I open Task Manager and watch the processors spike, but they never reach and maintain 100%. In addition, I can see multiple python.exe processes created and destroyed along the way. How do these processes relate to processors? For example, if I spawn 4 processes, each process isn't using its own core. Instead, what are the processes using? Are they sharing all cores? And if so, is it the OS that is forcing the processes to share the cores?

code snippet 1


Code:
import multiprocessing

def worker():
    # Worker function: counts to 1000, printing each value
    print('Worker')
    x = 0
    while x < 1000:
        print(x)
        x += 1
    return

if __name__ == '__main__':
    jobs = []
    for i in range(50):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
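
A note on why this snippet may not saturate the CPU: each worker spends nearly all of its time in print(), which is console I/O rather than computation, so the processes mostly wait on the console instead of burning CPU cycles. Below is a minimal sketch of a CPU-bound variant (cpu_worker and the loop bound of 10 ** 7 are illustrative choices, not part of the original post):

Code:
import multiprocessing

def cpu_worker():
    # Pure computation, no print() calls, so the process is never
    # blocked on console I/O
    total = 0
    for x in range(10 ** 7):
        total += x * x
    return total

if __name__ == '__main__':
    # One process per logical core
    jobs = [multiprocessing.Process(target=cpu_worker)
            for _ in range(multiprocessing.cpu_count())]
    for p in jobs:
        p.start()
    for p in jobs:
        p.join()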

code snippet 2


Code:
from multiprocessing import Process, Lock

def f(l, i):
    # Acquire the shared lock so only one worker counts and prints at a time
    l.acquire()
    print('worker ', i)
    x = 0
    while x < 1000:
        print(x)
        x += 1
    l.release()

if __name__ == '__main__':
    lock = Lock()
    for num in range(50):
        Process(target=f, args=(lock, num)).start()
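
Worth noting about snippet 2: the Lock means only one worker runs its counting loop at a time, so the work is effectively serialized and cannot keep all cores busy. On the broader question: spawned processes are not pinned to particular cores; the OS scheduler is free to move them between cores, so several processes sharing all cores is normal behaviour. A minimal sketch that typically drives every core to 100% by matching the worker count to cpu_count() is below (the function name burn and the workload of 10 ** 7 iterations are assumptions for illustration, not from the original post):

Code:
from multiprocessing import Pool, cpu_count

def burn(n):
    # CPU-bound work only: no printing, just arithmetic
    total = 0
    for x in range(n):
        total += x * x
    return total

if __name__ == '__main__':
    # One worker process per logical core; the OS scheduler spreads
    # the processes across the cores until the work runs out.
    pool = Pool(processes=cpu_count())
    try:
        results = pool.map(burn, [10 ** 7] * cpu_count())
    finally:
        pool.close()
        pool.join()
    print(results)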