Today a colleague brought me a problem to look at: multiple processes share a variable, but after one process modifies it, the change does not show up in the other processes.
At first I thought the modification failed to take effect because the global declaration was missing, but in practice the global approach across multiple processes also only allows reading, not writing. The incorrect example code is as follows:
```python
import multiprocessing

# Declare a global variable
share_var = ["start flag"]

def sub_process(process_name):
    # Try to use the global variable via a global declaration, as in a single-process program
    global share_var
    share_var.append(process_name)
    # Unfortunately, in multiprocessing this only works for reads:
    # the modification is not seen by the other processes
    for item in share_var:
        print(f"{process_name}-{item}")
    pass

def main_process():
    process_list = []
    # Create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # Create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # Start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
```
The execution result shows that the modification made in process 1 does not appear in process 2 (note that, as with multithreading, under a heavier workload process 1 would not necessarily run before process 2).
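Assuming process 1 happens to run before process 2 (the scheduling is not guaranteed), the printout would look roughly like this; note that process 2 never sees the element appended by process 1:

```
process 1-start flag
process 1-process 1
process 2-start flag
process 2-process 2
```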
The correct approach is to share the variable through a multiprocessing.Manager(). Reference: http://www.javashuo.com/article/p-axnbfdnv-v.html
```python
import multiprocessing

# The shared variable and shared lock cannot be defined as globals and referenced
# with global -- that raises an error; they have to be passed in as arguments
def sub_process(process_name, share_var, share_lock):
    # Acquire the lock
    share_lock.acquire()
    share_var.append(process_name)
    # Release the lock
    share_lock.release()
    for item in share_var:
        print(f"{process_name}-{item}")
    pass

def main_process():
    # Single-value form: typecode is the type code, value is the initial value
    # share_var = multiprocessing.Manager().Value(typecode, value)
    # Array form: typecode is the element type, sequence is the initial contents
    # share_var = multiprocessing.Manager().Array(typecode, sequence)
    # Dict form
    # share_var = multiprocessing.Manager().dict()
    # List form
    share_var = multiprocessing.Manager().list()
    share_var.append("start flag")
    # Declare a shared lock
    share_lock = multiprocessing.Manager().Lock()
    process_list = []
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, share_var, share_lock))
    process_list.append(tmp_process)
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, share_var, share_lock))
    process_list.append(tmp_process)
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
```
The execution result shows that the modification made in process 1 does appear in process 2 (again, as with multithreading, under a heavier workload process 1 would not necessarily run before process 2).
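Assuming process 1 happens to append before process 2 reads the list, the printout would look roughly like this; this time process 2 does see the element appended by process 1:

```
process 1-start flag
process 1-process 1
process 2-start flag
process 2-process 1
process 2-process 2
```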
If the shared value is a number or a single character, typecode can be one of the following (note the quotes):
| Type Code | C Type | Python Type |
|-----------|--------|-------------|
| 'c' | char | character |
| 'b' | signed char | int |
| 'B' | unsigned char | int |
| 'u' | Py_UNICODE | unicode character |
| 'h' | signed short | int |
| 'H' | unsigned short | int |
| 'i' | signed int | int |
| 'I' | unsigned int | int |
| 'l' | signed long | int |
| 'L' | unsigned long | int |
| 'f' | float | float |
| 'd' | double | float |
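As a minimal sketch of how the quoted type codes above are used (not from the original post; the variable names are just for illustration), a Manager-backed Value and Array can be created and updated like this:

```python
import multiprocessing

if __name__ == "__main__":
    manager = multiprocessing.Manager()

    # 'i' -> signed int (see the table above)
    share_int = manager.Value('i', 0)
    share_int.value += 1
    print(share_int.value)               # 1

    # 'd' -> double; the sequence gives the initial contents
    share_arr = manager.Array('d', [0.0, 1.5, 3.0])
    share_arr[0] = 9.9
    print(share_arr[0], len(share_arr))  # 9.9 3
```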
If the value is a string, typecode can take the forms in the first column below (note: no quotes, these are ctypes types):
| ctypes type | C type | Python type |
|-------------|--------|-------------|
| c_bool | _Bool | bool (1) |
| c_char | char | 1-character string |
| c_wchar | wchar_t | 1-character unicode string |
| c_byte | char | int/long |
| c_ubyte | unsigned char | int/long |
| c_short | short | int/long |
| c_ushort | unsigned short | int/long |
| c_int | int | int/long |
| c_uint | unsigned int | int/long |
| c_long | long | int/long |
| c_ulong | unsigned long | int/long |
| c_longlong | __int64 or long long | int/long |
| c_ulonglong | unsigned __int64 or unsigned long long | int/long |
| c_float | float | float |
| c_double | double | float |
| c_longdouble | long double | float |
| c_char_p | char * (NUL terminated) | string or None |
| c_wchar_p | wchar_t * (NUL terminated) | unicode or None |
| c_void_p | void * | int/long or None |
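And a minimal sketch of the unquoted ctypes form for a string value (again not from the original post; note that a Manager-backed Value stores the Python value itself, so the typecode here is mostly descriptive):

```python
import multiprocessing
from ctypes import c_char_p

if __name__ == "__main__":
    manager = multiprocessing.Manager()

    # ctypes type passed without quotes, as described above
    share_str = manager.Value(c_char_p, "start flag")
    print(share_str.value)   # start flag

    share_str.value = "changed"
    print(share_str.value)   # changed
```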
My colleague also wanted to share a file object, and asked whether the approach above can only share dicts and lists, with no way to share an arbitrary object.
Looking back at it: the typecode for Value and Array must be a type that exists in C, and beyond those there are only the dict() and list() methods, so it seems the approach above cannot share an arbitrary instantiated object.
But as noted earlier, while the global approach cannot modify a shared variable, reading it is fine; so merely referencing an object can still be done with global.
```python
import multiprocessing
import threading

# Instantiate a global file object
file_obj = open("1.txt", "a")
share_lock = threading.Lock()

def sub_process(process_name):
    global file_obj, share_lock
    share_lock.acquire()
    file_obj.writelines(f"{process_name}")
    share_lock.release()
    pass

def main_process():
    process_list = []
    # Create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # Create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # Start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
```
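One caveat with the code above: the threading.Lock is copied into each child process, so it only guards threads within a single process. If real cross-process mutual exclusion around the writes is wanted, a multiprocessing.Lock can be passed to the children instead; the following is a sketch of that variant (not the original author's code). The file object is still picked up via global, matching the approach above; only the lock changes.

```python
import multiprocessing

# Global file object, as in the example above
file_obj = open("1.txt", "a")

def sub_process(process_name, lock):
    # The file object is still referenced via global; only the lock is passed in
    global file_obj
    with lock:
        file_obj.write(f"{process_name}\n")
        file_obj.flush()

def main_process():
    # A multiprocessing.Lock works across processes, unlike threading.Lock
    lock = multiprocessing.Lock()
    process_list = [
        multiprocessing.Process(target=sub_process, args=(f"process {i}", lock))
        for i in (1, 2)
    ]
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
```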
The global approach cannot modify the variable (for instance, modify its member attributes); most of the time that is good enough, but it never feels like a complete solution. Is there a way that does allow modification? Yes: use BaseManager. Example code follows.
Reference: https://blog.csdn.net/jacke121/article/details/82658471
```python
import multiprocessing
from multiprocessing.managers import BaseManager
import threading

# The lock can be referenced via global or passed through Process; either works
share_lock = threading.Lock()

# Define the class whose instances we want to share
class Test():
    def __init__(self):
        self.test_list = ["start flag"]

    def test_function(self, arg):
        self.test_list.append(arg)

    def print_test_list(self):
        for item in self.test_list:
            print(f"{item}")

def sub_process(process_name, obj):
    global share_lock
    share_lock.acquire()
    obj.test_function(f"{process_name}")
    share_lock.release()
    obj.print_test_list()
    pass

def main_process():
    # To register the built-in open instead, it would look like this:
    # manager = BaseManager()
    # # Be sure to register before start(), otherwise the registration has no effect
    # manager.register('open', open)
    # manager.start()
    # obj = manager.open("1.txt", "a")

    # To be more direct, demonstrate with an instance of the Test class
    manager = BaseManager()
    # Be sure to register before start(), otherwise the registration has no effect
    manager.register('Test', Test)
    manager.start()
    obj = manager.Test()
    process_list = []
    # Create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, obj))
    process_list.append(tmp_process)
    # Create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, obj))
    process_list.append(tmp_process)
    # Start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
```
The execution result shows that the modification made in process 1 does appear in process 2 (again, as with multithreading, under a heavier workload process 1 would not necessarily run before process 2).
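Assuming process 1 runs first, the printout would look roughly like this (process 2's listing includes the element appended by process 1):

```
start flag
process 1
start flag
process 1
process 2
```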
References:
https://blog.51cto.com/11026142/1874807
https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing