[python-users] multiprocessing shared memory ?
Python Meeting Düsseldorf
info at pyddf.de
Mon Mar 9 18:50:53 CET 2020
Hello Joachim,

how are you calling the function?

Your script appears to be missing this:

if __name__ == '__main__':
    with_shared_memory()
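For reference, a complete sketch of what the corrected script might look like, assuming the function just needs to be invoked under that guard. One caveat beyond the guard itself: ShareableList has a fixed length and no append() method, so this sketch writes results by index instead of the data.append(i) used in the original do_work (which would raise AttributeError in the workers).

```python
from multiprocessing import Process
from multiprocessing.managers import SharedMemoryManager


def do_work(data, von, bis):
    # Each worker fills its own slice of the shared list by index;
    # ShareableList is fixed-length, so append() is not available.
    for i in range(von, bis):
        data[i] = i


def with_shared_memory():
    with SharedMemoryManager() as smm:
        sl = smm.ShareableList([0] * 2000)
        # Divide the work among two processes, storing partial results in sl
        p1 = Process(target=do_work, args=(sl, 0, 1000))
        p2 = Process(target=do_work, args=(sl, 1000, 2000))
        p1.start()
        p2.start()
        p1.join()
        p2.join()  # Wait for both workers to finish
        return sum(sl)  # Consolidate the partial results now in sl


# On Windows, multiprocessing uses the "spawn" start method: each child
# process re-imports this module. Without the guard, any top-level call
# would run again in every child; with it, only the parent executes it.
if __name__ == '__main__':
    print(with_shared_memory())  # prints 1999000
```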
Best regards,
--
Marc-Andre Lemburg
eGenix.com
Professional Python Services directly from the Experts (#1, Mar 09 2020)
>>> Python Projects, Coaching and Support ... https://www.egenix.com/
>>> Python Product Development ... https://consulting.egenix.com/
________________________________________________________________________
::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48
D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
Registered at Amtsgericht Duesseldorf: HRB 46611
https://www.egenix.com/company/contact/
https://www.malemburg.com/
On 3/9/2020 6:01 PM, Joachim Sasse wrote:
> Hello,
> the small test function with_shared_memory does not work, i.e. the
> print statement is not executed, so the whole function never runs.
>
> Can anyone help?
> I am using Python 3.8.2 on Windows 10
>
> Regards, J. Sasse
>
> from multiprocessing import Process, Queue, managers
> import time
>
>
> def with_shared_memory():
>     with managers.SharedMemoryManager() as smm:
>         sl = smm.ShareableList(range(2000))
>         # Divide the work among two processes, storing partial results in sl
>         p1 = Process(target=do_work, args=(sl, 0, 1000))
>         print(p1)
>         p2 = Process(target=do_work, args=(sl, 1000, 2000))
>         p1.start()
>         p2.start()  # A multiprocessing.Pool might be more efficient
>         p1.join()
>         p2.join()  # Wait for all work to complete in both processes
>         total_result = sum(sl)  # Consolidate the partial results now in sl
>         print(total_result)
>
>
> def do_work(data, von, bis):
>     print(von, bis)
>     for i in range(von, bis):
>         data.append(i)
More information about the python-users mailing list