NEAR Lake framework data retrieval experiencing delays

I’m implementing a blockchain data listener using the NEAR Lake framework to monitor mainnet transactions. The problem is that data fetching takes an extremely long time: the message_queue.get() call often blocks for several minutes before returning a result, and sometimes it freezes at that point entirely.

Here’s my current implementation:

import time

from loguru import logger  # the {}-style log calls below assume loguru

handler, message_queue = create_streamer(settings)
while True:
    begin_time = int(time.time())
    logger.info("Starting fetch at:{}", begin_time)
    blockchain_message = await message_queue.get()
    finish_time = int(time.time())
    # elapsed time is finish_time - begin_time
    logger.info("message_queue.get() took:{}", finish_time - begin_time)
    logger.info(f"Processing Block #{blockchain_message.block.header.height} with {len(blockchain_message.shards)} shards")
    begin_time = int(time.time())
    await process_blockchain_data(blockchain_message)
    finish_time = int(time.time())
    logger.info("process_blockchain_data took:{}", finish_time - begin_time)

Sample output (from an earlier version that logged begin_time - finish_time, which is why the duration shows as negative; the actual wait was 282 seconds):
Starting fetch at:1660711985
message_queue.get() took:-282
Processing Block #71516637 with 4 shards
process_blockchain_data took:0
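As an aside, since the timing code bit me once already: a small helper I now use for these measurements, built on time.monotonic() instead of time.time() so clock adjustments can never produce a negative or skewed duration. The timed name and its signature are my own, not part of the Lake framework:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, log=print):
    # time.monotonic() is immune to system clock adjustments (NTP, DST),
    # unlike time.time(), and has sub-second resolution.
    start = time.monotonic()
    try:
        yield
    finally:
        elapsed = time.monotonic() - start  # finish minus start, never negative
        log(f"{label} took {elapsed:.3f}s")
```

Used around the fetch, e.g. `with timed("message_queue.get()"): blockchain_message = await message_queue.get()`.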

Is there a configuration setting I’m missing that could improve performance? Are there known rate limits or network issues that might cause these delays?
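In case it helps narrow things down, here is a stripped-down version of the pattern I’m considering: offloading per-block processing to background tasks so the loop returns to queue.get() immediately, in case slow processing is what lets the queue back up. All names here (producer, process, consumer) are stand-ins I made up to keep the sketch self-contained; they are not Lake framework APIs:

```python
import asyncio

async def producer(queue: asyncio.Queue) -> None:
    # Stand-in for the Lake streamer task; pushes fake "block heights".
    for height in range(3):
        await queue.put(height)
        await asyncio.sleep(0.01)
    await queue.put(None)  # sentinel: no more messages

async def process(message: int, results: list) -> None:
    # Stand-in for process_blockchain_data; simulates slow per-block work.
    await asyncio.sleep(0.05)
    results.append(message)

async def consumer(queue: asyncio.Queue, results: list) -> None:
    tasks = []
    while True:
        message = await queue.get()
        if message is None:
            break
        # Offload processing to a task so the loop goes straight back to
        # queue.get() instead of awaiting each block's processing inline.
        tasks.append(asyncio.create_task(process(message, results)))
    await asyncio.gather(*tasks)

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    await asyncio.gather(producer(queue), consumer(queue, results))
    return results

if __name__ == "__main__":
    print(sorted(asyncio.run(main())))
```

The trade-off is that blocks may finish processing out of order, so this only applies if per-block handling is independent.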