Does a Node.js redis error occur because of a crawler?



I am using redis v4.0.6.

Does an error occur if a crawler connects repeatedly?

My code is as follows:

private setRedisClient() {
  const client = createClient({
    url: `redis://${process.env.REDIS_URL}:${process.env.REDIS_PORT}`,
    isolationPoolOptions: {
      max: 10,
      min: 0,
      maxWaitingClients: 10,
      fifo: true,
      autostart: true,
      idleTimeoutMillis: 30000,
    },
  });
  client.on('error', (e: any) => {
    throw e;
  });
  return client;
}

try {
  this.setRedisClient();
  await this.client.connect();
  const res = await this.client.hGetAll(key);
  await this.client.quit();
  return res;
} catch (error) {
  console.log(error);
  return undefined;
}

Refreshing the page repeatedly causes this error:

ClientClosedError: The client is closed
at RedisSocket.quit (/home/ec2-user/buyer-service/node_modules/@redis/client/dist/lib/client/socket.js:72:19)
at Commander.QUIT (/home/ec2-user/buyer-service/node_modules/@redis/client/dist/lib/client/index.js:222:71)
at RedisClient.getAll (/home/ec2-user/buyer-service/dist/infra/cache/redis/cache.redis.client.js:31:31)
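For context on where this error can come from: in the first snippet, the client returned by `setRedisClient()` is never assigned to `this.client`, so `this.client.connect()` / `this.client.quit()` may run against a client that a previous request already closed. A sequential flow (connect, read, then quit, each awaited in turn) avoids quitting before the read completes. Below is a minimal sketch of that ordering; `StubClient` is a hypothetical in-memory stand-in for the real node-redis v4 client, used only so the example runs without a live Redis server:

```typescript
// Sequential flow: connect -> hGetAll -> quit, each awaited in turn.
// StubClient is a hypothetical stand-in for the node-redis v4 client,
// so the ordering can be shown without a live server.
type Hash = Record<string, string>;

class StubClient {
  private open = false;
  constructor(private store: Record<string, Hash>) {}
  async connect(): Promise<void> { this.open = true; }
  async hGetAll(key: string): Promise<Hash> {
    if (!this.open) throw new Error('The client is closed');
    return this.store[key] ?? {};
  }
  async quit(): Promise<void> { this.open = false; }
}

async function getAllSequential(key: string): Promise<Hash | undefined> {
  const client = new StubClient({ user: { name: 'alice' } });
  try {
    await client.connect();                // 1. open the connection
    const res = await client.hGetAll(key); // 2. read while still connected
    await client.quit();                   // 3. close only after the read
    return res;
  } catch (error) {
    console.log(error);
    return undefined;
  }
}
```

With a real client, the same three awaits in this order ensure `quit()` cannot close the socket before `hGetAll` has resolved.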

I am not good at English.

I would appreciate your help.
Latest update:

public async getAll(key: string): Promise<any> {
  try {
    const client = this.setRedisClient();
    const [_, hGetAll, _a] = await Promise.all([client.connect(), client.hGetAll(key), client.quit()]);
    return hGetAll;
  } catch (error) {
    console.log(error);
    return undefined;
  }
}
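Note on the updated `getAll`: `Promise.all` starts `connect()`, `hGetAll()`, and `quit()` at the same time, so `quit()` does not wait for the read, and the connection can be closed while `hGetAll` is still in flight, which would match the `ClientClosedError` above. A minimal sketch demonstrating the ordering; `fakeClient` is a hypothetical stand-in for the real Redis client, with a short delay simulating the network round-trip:

```typescript
// Demonstrates why Promise.all([connect(), hGetAll(key), quit()]) is unsafe:
// all three promises start immediately, so quit() does not wait for hGetAll().
// fakeClient is a hypothetical stand-in for the real node-redis client.
const calls: string[] = [];

const fakeClient = {
  async connect(): Promise<void> { calls.push('connect'); },
  async hGetAll(_key: string): Promise<Record<string, string>> {
    // Simulate a network round-trip that finishes after quit() has run.
    await new Promise((resolve) => setTimeout(resolve, 10));
    calls.push('hGetAll');
    return { field: 'value' };
  },
  async quit(): Promise<void> { calls.push('quit'); },
};

async function raced(): Promise<string[]> {
  // Same shape as the updated getAll: all three started concurrently.
  await Promise.all([fakeClient.connect(), fakeClient.hGetAll('k'), fakeClient.quit()]);
  return calls; // quit lands before hGetAll here
}
```

With a real client this interleaving means the read is attempted on a connection that is already closing, so awaiting the three calls one after another is the safer shape.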