Hey folks, I have a doubt regarding Messenger/chat systems in general. See the attached screenshot for the system design I'm referring to. Every message sent from a user/client flows through a chat server into a message sync queue awaiting delivery to another user, and also gets stored in a key-value store. But after that, if the chat server (#2 in this case) that has to read that message is unavailable, wouldn't all the subsequent messages in the queue get blocked? My doubt isn't just WRT this system: in general, if a specific server has to read messages from a message queue, wouldn't its unavailability delay delivery of all subsequent messages? Or am I missing something here? Though not relevant, blind tax: TC: 40 LPA YOE: 5.5 #facebook #slack #whatsapp #systemdesign #productdesign #google #instagram #microsoft
There are different kinds of queues. I think the one mentioned here is a normal one; then there are pub/sub-type queues, where as soon as the event happens the next logic is triggered. These are called async queues. This video explains exactly what you are asking about: https://youtu.be/J6CBdSCB_fY
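To make the distinction concrete, here's a toy in-memory pub/sub sketch in Python (not a real broker, all names hypothetical): each subscriber registers a callback, and publishing an event triggers every callback immediately, so no consumer sits blocked behind another one the way it would at the head of a shared FIFO queue.

```python
from collections import defaultdict

class PubSub:
    """Toy pub/sub bus: publish() fires subscriber callbacks as soon as
    the event happens, instead of parking messages in one shared queue."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber for this topic is triggered right away.
        for callback in self.subscribers[topic]:
            callback(message)

bus = PubSub()
received = []
bus.subscribe("chat.user_b", received.append)
bus.publish("chat.user_b", "hello")
print(received)  # ['hello']
```

In a real system the callbacks would be network pushes to the recipient's chat server, but the shape of the decoupling is the same.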
Oh ok. Also, is it practically possible to have a separate queue for each user/client, so that they can pull their latest messages from it?
I thought about this for some time; let me share my opinion. Short answer: yes, it should be possible, and it looks good in terms of pros and cons (I would love to discuss the bottlenecks).

Say the client has an API available (assuming only text messages are supported):

getChat(my_id, receiver_id): JSON of (id, message_text, timestamp, status)

Whenever we open a chat, this API is used. On the backend we have a database storing the chat, plus a queue capturing in-transit messages (for example, a message sent by A but not yet received by B when B calls getChat). When B calls getChat, the response is built by reading the chat from the db and also checking B's own queue for any messages addressed to B. We then move those messages to the db and remove them from B's read queue; as we remove them, we notify sender A that B has read the message by pushing an ack to client A. B now gets the chat including the last message from A.

Some factors of this system:

Availability: what if the queue fails? A's message won't be sent to B. In that case we can make use of A's write queue and push the message to B's queue again. Also, I'm observing that acknowledging each interaction with a queue makes the system more reliable.

Cost: are we paying a higher cost because we maintain a queue per user? We can make use of an async queue if we want to merge queues, and keep the same acknowledgement scheme with that system too. I see no downside compared with our current solution.

Scalability: what's the load on the db? It's fine. Load on the queues? With this approach, a new user just needs a new queue.

Storage: how do we store separate queues? We need a system that can manage a large number of queues.

Overall it looks good, with minor details that need to be planned carefully.
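The getChat flow above can be sketched in a few lines of Python. This is a toy model, not an implementation: `db`, `queues`, `send_message`, and `get_chat` are all hypothetical names, the "db" is a dict, and the per-user queue is a `deque`. It just shows the mechanics of draining the recipient's queue into the db and acking the sender.

```python
from collections import defaultdict, deque
from time import time

db = defaultdict(list)        # (sender_id, receiver_id) -> delivered messages
queues = defaultdict(deque)   # receiver_id -> in-transit messages
read_receipts = []            # acks pushed back to senders

def send_message(sender_id, receiver_id, text):
    # Message is in transit: it sits in the recipient's own queue.
    queues[receiver_id].append(
        {"from": sender_id, "text": text, "ts": time(), "status": "sent"}
    )

def get_chat(my_id, sender_id):
    # Drain my queue: move in-transit messages into the db,
    # then notify each sender that their message was read.
    while queues[my_id]:
        msg = queues[my_id].popleft()
        msg["status"] = "read"
        db[(msg["from"], my_id)].append(msg)
        read_receipts.append((msg["from"], my_id))  # push ack to sender
    return db[(sender_id, my_id)]

send_message("A", "B", "hi B")
chat = get_chat("B", "A")
print([m["text"] for m in chat])  # ['hi B']
```

A real version would persist the db, replicate the queues, and ack each queue operation as described above, but the read-drain-ack loop is the core of the idea.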
I understand this “sync queue” has to be partitioned in some manner, like topic partitions in Kafka. If that's the architecture, how do you think other consumers can be blocked if one of them is hung?
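A minimal sketch of that partitioning idea, assuming Python and purely hypothetical names: messages are keyed by recipient id, the key hashes to one partition, and each partition has its own independent consumer. A hung consumer only stalls its own partition; the others keep draining.

```python
from collections import defaultdict
from zlib import crc32

NUM_PARTITIONS = 3
partitions = defaultdict(list)  # partition index -> ordered messages

def route(recipient_id, message):
    # Kafka-style keyed routing: same recipient always lands on
    # the same partition, preserving per-recipient ordering.
    p = crc32(recipient_id.encode()) % NUM_PARTITIONS
    partitions[p].append(message)
    return p

def drain(partition, hung_partitions=frozenset()):
    # Each consumer reads only its own partition.
    if partition in hung_partitions:
        return []  # this consumer is stuck; others are unaffected
    out = list(partitions[partition])
    partitions[partition].clear()
    return out

p = route("user_b", "m1")
route("user_c", "m2")

stuck = drain(p, hung_partitions={p})  # hung consumer delivers nothing
# its messages stay queued; once it recovers (or is reassigned),
# everything drains, and no other partition was ever blocked:
rest = [m for q in range(NUM_PARTITIONS) for m in drain(q)]
```

So per-recipient ordering still holds within a partition, but one stuck consumer can't create the global head-of-line blocking the original question worries about.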
I assumed this one to be a regular FIFO queue. Is that not right?
A plain FIFO queue would not be suitable in this use case. You get lots of benefits like replication, durability, checkpointing, etc. with a Kafka-like system.