# Amazon Products CLIP ViT-B/32

## Dataset Description

This dataset contains pre-computed embeddings of product image-text pairs from Amazon, designed for evaluating vector database performance on multi-modal data. The embeddings were generated with OpenAI's CLIP ViT-B/32 model.

### Purpose

A benchmark dataset for evaluating vector database performance, designed specifically for use with VectorDBBench.

### Dataset Summary

- Total Training Samples: 400,000 images
- Test Queries: 1,000 texts
- Ground Truth: Top-1000 nearest neighbors per query
- Embedding Dimension: 512
- Embedding Model: CLIP ViT-B/32
- Source Data: Marqo/amazon-products-eval
## Dataset Structure

### Data Splits

| Split | Samples | Description |
|---|---|---|
| `train` | 400,000 | Training image embeddings (randomly sampled from the source images) |
| `test` | 1,000 | Test query embeddings (randomly sampled from the corresponding source texts) |
| `neighbors.parquet` | 1,000 | Top-1000 nearest neighbors for each test query |
### Data Fields

**train & test**

- `id` (int64): Unique identifier for each product
- `emb` (List[float64]): 512-dimensional L2-normalized embedding vector

**neighbors.parquet**

- `id` (int64): Query identifier (matches the `test` split)
- `neighbors_id` (List[int64]): List of 1,000 nearest neighbor IDs from the `train` set
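As a quick sanity check, the fields can be inspected directly after loading; a minimal sketch, loading only the `test` split to keep the download small:

```python
from datasets import load_dataset

# Load only the 1,000-row test split
test = load_dataset("cryptolab-playground/amazon-products-clip-vit-b-32", split="test")

row = test[0]
print(row["id"])        # integer query identifier
print(len(row["emb"]))  # 512
```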
## Dataset Creation

### Source Data

The dataset is derived from approximately 1M product image-text pairs in Marqo/amazon-products-eval:

- Train: 400,000 randomly sampled images
- Test: 1,000 texts randomly sampled from the corresponding products
### Preprocessing

- Data Preparation: image and text embeddings were generated with the embedding model below
- Normalization: all embeddings are L2-normalized

### Embedding Generation

- Model: sentence-transformers/clip-ViT-B-32
- Dimension: 512
- Normalization: L2-normalized
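New query embeddings compatible with this dataset can be produced with the same model; a minimal sketch (the query text here is made up for illustration):

```python
from sentence_transformers import SentenceTransformer

# CLIP ViT-B/32 via sentence-transformers; encodes both text and PIL images
model = SentenceTransformer("sentence-transformers/clip-ViT-B-32")

# L2-normalized 512-d text embedding, directly comparable to the `emb` vectors
query_emb = model.encode(["red running shoes"], normalize_embeddings=True)
print(query_emb.shape)  # (1, 512)
```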
### Ground Truth Generation

Ground truth nearest neighbors were computed using:

- Method: Flat search (brute-force)
- Metric: Cosine similarity
- K: Top-1000 neighbors per query
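Because the embeddings are L2-normalized, cosine similarity reduces to a dot product, so a flat search of this kind takes only a few lines of NumPy. This is a sketch, not the exact script used to build the dataset; `query_embs` and `corpus_embs` stand for the test and train embedding matrices:

```python
import numpy as np

def exact_top_k(query_embs, corpus_embs, k=1000):
    """Brute-force top-k by cosine similarity on L2-normalized vectors."""
    # For 1,000 x 400,000 float32 scores this needs roughly 1.6 GB of RAM
    scores = query_embs @ corpus_embs.T  # cosine == dot product when normalized
    # argpartition finds the top-k quickly; the final argsort orders them
    top_k = np.argpartition(-scores, k, axis=1)[:, :k]
    order = np.argsort(-np.take_along_axis(scores, top_k, axis=1), axis=1)
    return np.take_along_axis(top_k, order, axis=1)
```

Note that this returns row positions in the corpus matrix; mapping them to `id` values assumes the `train` split is stored in `id` order, which is worth verifying before comparing against `neighbors.parquet`.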
## Usage

### Loading the Dataset

```python
from datasets import load_dataset

# Load the train and test splits
dataset = load_dataset("cryptolab-playground/amazon-products-clip-vit-b-32")
```
### Evaluation Example

```python
import numpy as np
import pandas as pd
from datasets import load_dataset

# Load data
dataset = load_dataset("cryptolab-playground/amazon-products-clip-vit-b-32")
neighbors = pd.read_parquet(
    "hf://datasets/cryptolab-playground/amazon-products-clip-vit-b-32/neighbors.parquet"
)

# Convert to numpy arrays
train_embeddings = np.array(dataset["train"]["emb"])
test_embeddings = np.array(dataset["test"]["emb"])

# Example: compute recall@10
def compute_recall_at_k(retrieved_ids, neighbors_ids, k=10):
    """
    Compute Recall@K.

    Args:
        retrieved_ids: List of retrieved neighbor IDs
        neighbors_ids: List of ground-truth neighbor IDs
        k: Number of top results to consider
    """
    retrieved_k = set(retrieved_ids[:k])
    neighbors_k = set(neighbors_ids[:k])
    if len(neighbors_k) == 0:
        return 0.0
    return len(retrieved_k & neighbors_k) / len(neighbors_k)

# Use with your vector database
# ... insert your vector DB search code here ...
```
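To benchmark a vector database end to end, replace the hypothetical `search` call below with your DB's query API and average recall over the test set. This sketch assumes the `neighbors` rows are aligned with the `test` split order (they share `id` values, so they can also be joined explicitly):

```python
recalls = []
for i in range(len(test_embeddings)):
    # Hypothetical: your vector DB returns the top-10 train ids for this query
    retrieved = search(test_embeddings[i], k=10)
    ground_truth = list(neighbors.iloc[i]["neighbors_id"])
    recalls.append(compute_recall_at_k(retrieved, ground_truth, k=10))

print(f"Mean recall@10: {np.mean(recalls):.4f}")
```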
## Use Cases

- Vector database performance benchmarking
- Approximate nearest neighbor (ANN) algorithm evaluation
- Retrieval system testing on product images

## Limitations

- Domain-Specific: optimized for product images; may not generalize to other domains
- Language: English only
- Ground Truth: based on cosine similarity between embeddings, not on human relevance judgments

## License

Apache 2.0 (same as the source dataset)
## Citation

If you use this dataset, please cite:

```bibtex
@dataset{amazon_products_clip_vit_b_32,
  author    = {CryptoLab, Inc.},
  title     = {Amazon Products CLIP ViT-B/32},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/datasets/cryptolab-playground/amazon-products-clip-vit-b-32}
}
```
### Source Dataset Citation

```bibtex
@software{zhu2024marqoecommembed_2024,
  author  = {Tianyu Zhu and Jesse Clark},
  month   = oct,
  title   = {{Marqo Ecommerce Embeddings - Foundation Model for Product Embeddings}},
  url     = {https://github.com/marqo-ai/marqo-ecommerce-embeddings/},
  version = {1.0.0},
  year    = {2024}
}
```
### Embedding Model Citation

```bibtex
@misc{clipvitb32,
  title  = {CLIP ViT-B/32},
  author = {OpenAI},
  year   = {2021},
  url    = {https://huggingface.co/sentence-transformers/clip-ViT-B-32}
}
```
## Acknowledgments

- Original dataset: Marqo/amazon-products-eval
- Embedding model: sentence-transformers/clip-ViT-B-32
- Benchmark framework: VectorDBBench