Redis connections not released after Celery task completes

I'm using Redis for two things: 1) as my Celery backend, and 2) as a lock holder for my Celery tasks.

Here's a sample of the code I'm running:

import os
import time
import random
import logging

import redis
from celery import Celery

logger = logging.getLogger(__name__)
app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')

def get_redis():
    url = os.environ.get("REDIS_URL")
    if url:
        r = redis.from_url(url)  # use secure connection for heroku
    else:
        r = redis.Redis()  # use unauthed connection locally
    return r

@app.task(bind=True, max_retries=10)
def test_delay_task(self, task_id):
    ''' Each task will try to grab a lock; once it does, it will sleep 5 seconds,
    then print and exit.
    '''
    have_lock = False
    r = get_redis()
    lock = r.lock('mws_api')
    try:
        have_lock = lock.acquire(blocking=False)
        if have_lock:
            logger.warning("{} Lock Acquired".format(task_id))
            time.sleep(5)
            logger.warning('Test Task {} successful!'.format(task_id))
        else:
            logger.warning("{} Lock In Use, Retrying".format(task_id))
            self.request.retries = 1
            self.retry(countdown=5 * random.uniform(0.8, 1.2))
    finally:
        if have_lock:
            lock.release()
        # We'll come back to this code, but it partially works
        # c = r.info()['connected_clients']
        # print("Disconnecting Redis | Connections: {}".format(c))
        # r.connection_pool.disconnect()

@app.task(bind=True, max_retries=10)
def test_parallel_tasks(self):
    ''' Runs 5 consecutive tasks, each of which will try to grab the lock and run. '''
    for i in range(5):
        test_delay_task.delay(i)

When I run this, the Redis connection count climbs sharply. I measure it with this:

def get_connected_clients():
    try:
        connections = 0
        while True:
            time.sleep(.25)
            c = get_redis().info()['connected_clients']
            # c = redis.Redis().info()['connected_clients']
            if c != connections:
                now = datetime.datetime.now()
                print("{} | Active Connections: {}".format(now, c))
                connections = c
            else:
                continue
    except KeyboardInterrupt:
        print("Shutting Down")

The results:

Celery Starts
2017-11-04 01:29:51.463512 | Active Connections: 7
2017-11-04 01:29:52.477220 | Active Connections: 12

Run Task
2017-11-04 01:30:18.755118 | Active Connections: 33
2017-11-04 01:30:23.847573 | Active Connections: 34
2017-11-04 01:30:24.101263 | Active Connections: 39
2017-11-04 01:30:24.610450 | Active Connections: 40
2017-11-04 01:30:28.944949 | Active Connections: 41
2017-11-04 01:30:30.208845 | Active Connections: 43
2017-11-04 01:30:33.780812 | Active Connections: 42
2017-11-04 01:30:34.548651 | Active Connections: 43
2017-11-04 01:30:34.804526 | Active Connections: 44
2017-11-04 01:30:35.058731 | Active Connections: 47
2017-11-04 01:30:39.626745 | Active Connections: 48
2017-11-04 01:30:40.648594 | Active Connections: 49
Task Complete
Wait
Kill Celery
2017-11-04 01:31:57.766001 | Active Connections: 45
2017-11-04 01:31:58.786042 | Active Connections: 5
2017-11-04 01:31:59.291814 | Active Connections: 3

These connections never go away, as far as I can tell, unless I shut Celery down and restart it. Running the task again only increases the number of open connections; it never decreases until I kill Celery. After 3 runs, active connections sit at 77.


If I uncomment the commented-out code in the task above, it seems to help, but the total connection count still looks high to me. Here's what several runs look like now:

Started with Disconnect Code Uncommented
2017-11-04 01:37:44.773113 | Active Connections: 29
2017-11-04 01:37:54.689032 | Active Connections: 33
2017-11-04 01:37:59.789031 | Active Connections: 32
2017-11-04 01:38:01.057219 | Active Connections: 33
2017-11-04 01:38:02.330613 | Active Connections: 36
2017-11-04 01:38:06.139188 | Active Connections: 35
2017-11-04 01:38:07.917854 | Active Connections: 36
2017-11-04 01:38:13.016428 | Active Connections: 35
2017-11-04 01:39:11.848758 | Active Connections: 36
Second Run
2017-11-04 01:39:18.224475 | Active Connections: 38
2017-11-04 01:39:22.043765 | Active Connections: 37
2017-11-04 01:39:23.061727 | Active Connections: 38
2017-11-04 01:39:38.106320 | Active Connections: 37
Third Run
2017-11-04 01:40:49.623050 | Active Connections: 38
2017-11-04 01:40:54.480170 | Active Connections: 37
2017-11-04 01:40:55.501791 | Active Connections: 38
2017-11-04 01:41:00.330222 | Active Connections: 37
2017-11-04 01:41:03.643833 | Active Connections: 38
2017-11-04 01:41:08.735973 | Active Connections: 37
2017-11-04 01:41:10.257756 | Active Connections: 38
2017-11-04 01:41:15.348323 | Active Connections: 37
2017-11-04 01:41:17.137816 | Active Connections: 38
2017-11-04 01:41:22.241020 | Active Connections: 37

OK, so with all that said, my question is: why aren't my connections closing, and how do I fix it? I'll need to run similar code but with 100+ parallel tasks, not just the 5 I'm using here.
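For what it's worth, one way to keep the count bounded at higher parallelism (a minimal sketch, not taken from the post; the max_connections value is an illustrative assumption to tune) is to share one module-level ConnectionPool instead of building a fresh client, and therefore a fresh pool, inside every task:

import os
import redis

# One pool per worker process, created at import time. Clients built on it
# reuse its sockets instead of opening new ones on every get_redis() call.
REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0")
_pool = redis.ConnectionPool.from_url(REDIS_URL, max_connections=10)

def get_redis():
    return redis.Redis(connection_pool=_pool)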

Below is the code that seems to be working; at least, I can no longer reproduce the original problem with it. Note the app.conf.broker_pool_limit = 0 setting and the connection_pool.disconnect() call. Here's what broker_pool_limit does:

The maximum number of connections that can be open in the connection pool. If set to None or 0 the connection pool will be disabled and connections will be established and closed for every use.

import os
import time
import random
import datetime
import logging

import redis

logging.basicConfig()
logger = logging.getLogger(__name__)

from celery import Celery
from celery.contrib import rdb

app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')
app.conf.broker_pool_limit = 0

def get_redis():
    url = os.environ.get("REDIS_URL")
    if url:
        r = redis.from_url(url)  # use secure connection for heroku
    else:
        r = redis.Redis()  # use unauthed connection locally
    return r

@app.task(bind=True, max_retries=10)
def test_delay_task(self, task_id):
    ''' Each task will try to grab a lock; once it does, it will sleep 5 seconds,
    then print and exit.
    '''
    have_lock = False
    redis_cli = get_redis()
    lock = redis_cli.lock('mws_api')
    try:
        have_lock = lock.acquire(blocking=False)
        if have_lock:
            logger.warning("{} Lock Acquired".format(task_id))
            time.sleep(5)
            logger.warning('Test Task {} successful!'.format(task_id))
        else:
            logger.warning("{} Lock In Use, Retrying".format(task_id))
            self.request.retries = 1
            self.retry(countdown=5 * random.uniform(0.8, 1.2))
    finally:
        if have_lock:
            lock.release()
        # Drop every socket this client's pool is holding so the worker
        # doesn't accumulate idle connections between tasks.
        redis_cli.connection_pool.disconnect()

@app.task(bind=True, max_retries=10)
def test_parallel_tasks(self):
    ''' Runs 5 consecutive tasks, each of which will try to grab the lock and run. '''
    for i in range(5):
        test_delay_task.delay(i)

def get_connected_clients():
    ''' Poll Redis and print whenever the connected_clients count changes. '''
    try:
        connections = 0
        while True:
            time.sleep(.25)
            c = get_redis().info()['connected_clients']
            if c != connections:
                now = datetime.datetime.now()
                print("{} | Active Connections: {}".format(now, c))
                connections = c
            else:
                continue
    except KeyboardInterrupt:
        print("Shutting Down")

When running this code, once every worker has had a chance to handle a request, each worker holds just one connection, plus the handful of connections held by the main Celery process.
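If disabling the broker pool feels too blunt, another option worth considering is one long-lived client per worker process, created via Celery's worker_process_init signal (the signal is standard Celery; the structure and names below, like init_worker_redis and locked_task, are my own illustrative sketch):

import redis
from celery.signals import worker_process_init

# Filled in after the worker forks, so a socket is never shared across
# processes; every task in this worker then reuses the same connection.
redis_client = None

@worker_process_init.connect
def init_worker_redis(**kwargs):
    global redis_client
    redis_client = redis.Redis()

@app.task(bind=True, max_retries=10)
def locked_task(self, task_id):
    # Lock is a context manager: acquired on entry, released on exit.
    with redis_client.lock('mws_api', blocking_timeout=0.1):
        pass  # do the real work here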

Connection math

For this script, the main Celery process needs 8 connections, and the ipython shell needs 4 connections once it has queried some tasks. So the initial spike is caused by the Celery main process needing that many connections. With broker_pool_limit left unset, it needs 10 connections initially.
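To do this kind of attribution yourself, Redis's CLIENT LIST is handy: it reports one entry per open connection, including the peer address, so you can see which process is holding what (a small sketch; the grouping by address is just one way to slice it):

import redis
from collections import Counter

r = redis.Redis()
# Each entry in client_list() describes one open connection;
# 'addr' is the client's "ip:port" peer address.
by_addr = Counter(c['addr'].split(':')[0] for c in r.client_list())
for addr, count in by_addr.most_common():
    print("{} holds {} connection(s)".format(addr, count))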

Latest Update