Geralt-Targaryen nielsr HF Staff committed on
Commit 57cafb1 · verified · 1 Parent(s): 18ac3ed

Improve model card: Add pipeline tag, library name, and paper link (#1)


- Improve model card: Add pipeline tag, library name, and paper link (68659111f6edd5a94f96de9c2e29ecccf1d61df9)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1)
  1. README.md +9 -3
README.md CHANGED
@@ -1,13 +1,19 @@
  ---
- license: apache-2.0
+ base_model:
+ - Qwen/Qwen3-0.6B
  datasets:
  - codefuse-ai/F2LLM
  language:
  - en
- base_model:
- - Qwen/Qwen3-0.6B
+ license: apache-2.0
+ pipeline_tag: feature-extraction
+ library_name: transformers
  ---

+ # F2LLM Technical Report: Matching SOTA Embedding Performance with 6 Million Open-Source Data
+
+ This repository contains the F2LLM-0.6B model presented in the paper [F2LLM Technical Report: Matching SOTA Embedding Performance with 6 Million Open-Source Data](https://huggingface.co/papers/2510.02294).
+
  F2LLMs (Foundation to Feature Large Language Models) are foundation models directly finetuned on 6 million high-quality query-document pairs (available in [codefuse-ai/F2LLM](https://huggingface.co/datasets/codefuse-ai/F2LLM)) covering a diverse range of retrieval, classification, and clustering data, curated solely from open-source datasets without any synthetic data. These models are trained with homogeneous macro batches in a single stage, without sophisticated multi-stage pipelines.

  ## Usage
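
The updated metadata tags this repository for `feature-extraction` with the `transformers` library. As a rough illustration of what that pipeline tag implies, the sketch below embeds a few texts with the model through `transformers`. It is a minimal sketch only: the repository id `codefuse-ai/F2LLM-0.6B` and the mean-pooling recipe are assumptions, and the model card's own Usage section (outside this diff hunk) may prescribe a different procedure.

```python
# Minimal sketch (assumptions: repo id "codefuse-ai/F2LLM-0.6B", mean pooling);
# the model card's actual Usage section may differ.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "codefuse-ai/F2LLM-0.6B"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

if tokenizer.pad_token is None:  # defensive: make batched padding possible
    tokenizer.pad_token = tokenizer.eos_token

texts = [
    "What is the capital of France?",
    "Paris is the capital of France.",
]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state             # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (batch, seq, 1)
    emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # mean pooling
    emb = torch.nn.functional.normalize(emb, dim=-1)      # unit-length embeddings

print(emb @ emb.T)  # cosine similarity matrix
```

Mean pooling is just one common choice for decoder-based embedding models; last-token pooling is another, so check the repository's Usage section for the intended recipe.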