I'm wondering if I could use autofaiss + pyspark to store ~100 billion vectors. I read in a thread about the faiss package that Milvus is essentially faiss, just already distributed, so my idea is to build indices with autofaiss and then distribute the data across a bunch of nodes (rough sketch below).

Do you think this is a reasonable solution if I need to store that many vectors?
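To make the question concrete, here is roughly what I have in mind. This is only a sketch: I'm assuming autofaiss's distributed pyspark mode (`distributed="pyspark"` plus `nb_indices_to_keep` on `build_index`), and all paths, memory budgets, and the app name are placeholders, not values I've tested at this scale.

```python
# Rough sketch: build many FAISS index shards on Spark executors with autofaiss.
# Assumptions: embeddings are stored as shards on a shared filesystem (HDFS here),
# and autofaiss's distributed pyspark mode works as described in its docs.
from pyspark.sql import SparkSession
from autofaiss import build_index

# Spark session so autofaiss can fan the index-building work out to executors.
spark = (
    SparkSession.builder
    .appName("autofaiss-distributed")  # placeholder app name
    .getOrCreate()
)

build_index(
    embeddings="hdfs://root/embeddings",        # folder of embedding shards (placeholder path)
    index_path="hdfs://root/knn.index",         # output index location (placeholder path)
    index_infos_path="hdfs://root/infos.json",
    max_index_memory_usage="100G",              # per-index memory budget (guess)
    current_memory_available="200G",            # executor memory available (guess)
    distributed="pyspark",                      # build index shards on Spark executors
    nb_indices_to_keep=10,                      # keep several smaller indices instead of one huge one
    temporary_indices_folder="hdfs://root/tmp/autofaiss_indices",
)
```

The part I'm least sure about is the serving side: with `nb_indices_to_keep > 1` I'd end up with multiple index files, and I'd still need something on top to route queries to the right nodes and merge results, which is presumably what Milvus handles for you.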