Paper: [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084)
This is a sentence-transformers model fine-tuned from BAAI/bge-m3. It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
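The pooling and normalization stages can be illustrated in isolation. Below is a minimal sketch (assuming PyTorch; the random tensor stands in for the XLM-RoBERTa encoder's output, and `cls_pool_and_normalize` is an illustrative helper, not part of the library):

```python
import torch
import torch.nn.functional as F

def cls_pool_and_normalize(last_hidden_state: torch.Tensor) -> torch.Tensor:
    """Mimic modules (1) and (2) above: CLS-token pooling followed by
    L2 normalization, so that dot products equal cosine similarities."""
    cls = last_hidden_state[:, 0]        # (1) Pooling: take the [CLS] vector
    return F.normalize(cls, p=2, dim=1)  # (2) Normalize: scale to unit length

# Dummy encoder output: batch of 3 sequences, 7 tokens, 1024 dimensions
hidden = torch.randn(3, 7, 1024)
emb = cls_pool_and_normalize(hidden)
print(emb.shape)  # torch.Size([3, 1024])
```

Because the final module normalizes every vector to unit length, cosine similarity between two embeddings reduces to a plain dot product.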
First install the Sentence Transformers library:
```shell
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Jrinky/model4")

# Run inference
sentences = [
    'What is the significance of the first written mention of Metylovice, and in which year did it occur',
    'The Olešná Stream flows through the municipality. History\nThe first written mention of Metylovice is in a deed of Bishop Dětřich from 1299. From the second half of the 17th century, tanning developed in the village, thanks to which the originally agricultural village began to prosper and grow. Brick houses began to replace the original wooden ones and the education and cultural life of the inhabitants increased. Sights\nThe most important monument is the Church of All Saints.',
    'Users could also get discounts when they bought the coins in bulk and earn coins through certain apps on the Appstore. In 2014, with the release of the Fire Phone, Amazon offered app developers 500,000 Amazon Coins for each paid app or app with in-app purchasing developed and optimized for the Fire Phone.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
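For semantic search over a larger corpus, the same embeddings can be ranked directly. A minimal sketch with NumPy (the `top_k` helper and the toy 4-dimensional vectors are illustrative stand-ins for real 1024-dimensional embeddings):

```python
import numpy as np

def top_k(query_emb: np.ndarray, corpus_embs: np.ndarray, k: int = 2):
    """Rank corpus rows by similarity to the query. Because this model's
    Normalize module outputs unit-length vectors, a dot product equals
    cosine similarity."""
    scores = corpus_embs @ query_emb
    order = np.argsort(-scores)[:k]
    return [(int(i), float(scores[i])) for i in order]

# Toy unit vectors standing in for real sentence embeddings
corpus = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.6, 0.8, 0.0, 0.0]])
query = np.array([0.8, 0.6, 0.0, 0.0])
print(top_k(query, corpus))  # the third corpus entry (index 2) ranks first
```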
Columns: anchor and positive

| | anchor | positive |
|---|---|---|
| type | string | string |
| anchor | positive |
|---|---|
| What was the birth date and place of Helena Binder, also known as Blanche Blotto | Born June 13, 1955 in Batavia, New York. Helena Binder, aka Blanche Blotto (keyboards, vocals; 1978-1980). |
| What incidents involving Israeli soldiers occurred in the occupied West Bank on Tuesday | Also Tuesday, Israeli soldiers fired a barrage of gas bombs and concussion grenades at a Palestinian home in the Masafer Yatta area, south of Hebron, in the southern part of the occupied West Bank, wounding an entire family, including children. On Tuesday evening, Israeli soldiers invaded the al-Maghayir village northeast of Ramallah, in the central West Bank, after many illegal colonizers attacked Palestinian cars. In related news, the soldiers shot three Palestinian construction workers near the illegal Annexation Wall, west of Hebron, in the southern part of the occupied West Bank, and abducted them. |
| How was the Mosbrucher Maar formed, and when did it occur | The Mosbrucher Weiher, also called the Mosbrucher Maar, is a silted up maar east of the municipal boundary of the village of Mosbruch in the county Vulkaneifel in Germany. It is located immediately at the foot of the 675-metre-high Hochkelberg, a former volcano. The floor of the maar is in the shape of an elongated oval and is about 700×500 metres in size, its upper boundary has a diameter of about 1,300 × 1,050 metres. This makes the Mosbrucher Maar the third largest of the maars in the western Eifel region. The Üßbach stream flows past and close to the Mosbrucher Weiher. Origin |
Loss: `selfloss.Infonce` with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```
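These parameters describe an InfoNCE-style contrastive objective over in-batch negatives. A minimal sketch (assuming PyTorch; `infonce_loss` is an illustrative stand-in for the custom `selfloss.Infonce` class, analogous to sentence-transformers' `MultipleNegativesRankingLoss`):

```python
import torch
import torch.nn.functional as F

def infonce_loss(anchor_emb: torch.Tensor, positive_emb: torch.Tensor,
                 scale: float = 20.0) -> torch.Tensor:
    """In-batch-negatives InfoNCE: the i-th anchor should score highest
    against the i-th positive; every other positive in the batch acts as
    a negative. cos_sim plus scale=20.0 matches the parameters above."""
    a = F.normalize(anchor_emb, dim=1)
    p = F.normalize(positive_emb, dim=1)
    logits = scale * (a @ p.T)     # scaled cosine similarities
    labels = torch.arange(len(a))  # diagonal entries are the targets
    return F.cross_entropy(logits, labels)

anchors = torch.randn(4, 16)
positives = anchors + 0.01 * torch.randn(4, 16)  # positives near their anchors
print(infonce_loss(anchors, positives))  # small loss: the diagonal dominates
```

The scale factor sharpens the softmax so that the correct pair must clearly outrank the in-batch negatives.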
Columns: anchor and positive

| | anchor | positive |
|---|---|---|
| type | string | string |
| anchor | positive |
|---|---|
| What architectural features are present on the front and southern sides of the Martínez Adobe house | The front and southern sides of the house have wooden wrap-around porches at each level. Wood shingles of either cedar or redwood originally covered the roof. The Martínez Adobe is now part of the John Muir National Historic Site and is open to the public. See also |
| What are the cognitive aspects being assessed in relation to TBI, and how do they impact the rehabilitation services for individuals, including warfighters with hearing problems | “Within AASC, we’ve been very proactive as part of interdisciplinary teams assessing TBI. Another area we’re looking at involves cognitive aspects associated with TBI and mild TBI and the best approach to providing rehabilitative services.” |
| What are the benefits mentioned by BIO President & CEO Jim Greenwood regarding the energy title programs in rural America | BIO President & CEO Jim Greenwood said, “The important energy title programs authorized and funded in this bill are just beginning to have a positive impact in revitalizing rural America, fueling economic growth and creating well-paying opportunities where we need it most -- in manufacturing, energy, agriculture and forestry. These programs can also help meet our responsibilities to revitalize rural areas, reduce dependence on foreign oil, and renew economic growth. |
Loss: `selfloss.Infonce` with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```
Non-default hyperparameters:

```
eval_strategy: steps
per_device_train_batch_size: 2
per_device_eval_batch_size: 2
learning_rate: 2e-05
num_train_epochs: 5
warmup_ratio: 0.1
fp16: True
batch_sampler: no_duplicates
```

All hyperparameters:

```
overwrite_output_dir: False
do_predict: False
eval_strategy: steps
prediction_loss_only: True
per_device_train_batch_size: 2
per_device_eval_batch_size: 2
per_gpu_train_batch_size: None
per_gpu_eval_batch_size: None
gradient_accumulation_steps: 1
eval_accumulation_steps: None
learning_rate: 2e-05
weight_decay: 0.0
adam_beta1: 0.9
adam_beta2: 0.999
adam_epsilon: 1e-08
max_grad_norm: 1.0
num_train_epochs: 5
max_steps: -1
lr_scheduler_type: linear
lr_scheduler_kwargs: {}
warmup_ratio: 0.1
warmup_steps: 0
log_level: passive
log_level_replica: warning
log_on_each_node: True
logging_nan_inf_filter: True
save_safetensors: True
save_on_each_node: False
save_only_model: False
restore_callback_states_from_checkpoint: False
no_cuda: False
use_cpu: False
use_mps_device: False
seed: 42
data_seed: None
jit_mode_eval: False
use_ipex: False
bf16: False
fp16: True
fp16_opt_level: O1
half_precision_backend: auto
bf16_full_eval: False
fp16_full_eval: False
tf32: None
local_rank: 0
ddp_backend: None
tpu_num_cores: None
tpu_metrics_debug: False
debug: []
dataloader_drop_last: False
dataloader_num_workers: 0
dataloader_prefetch_factor: None
past_index: -1
disable_tqdm: False
remove_unused_columns: True
label_names: None
load_best_model_at_end: False
ignore_data_skip: False
fsdp: []
fsdp_min_num_params: 0
fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
fsdp_transformer_layer_cls_to_wrap: None
accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
deepspeed: None
label_smoothing_factor: 0.0
optim: adamw_torch
optim_args: None
adafactor: False
group_by_length: False
length_column_name: length
ddp_find_unused_parameters: None
ddp_bucket_cap_mb: None
ddp_broadcast_buffers: False
dataloader_pin_memory: True
dataloader_persistent_workers: False
skip_memory_metrics: True
use_legacy_prediction_loop: False
push_to_hub: False
resume_from_checkpoint: None
hub_model_id: None
hub_strategy: every_save
hub_private_repo: False
hub_always_push: False
gradient_checkpointing: False
gradient_checkpointing_kwargs: None
include_inputs_for_metrics: False
eval_do_concat_batches: True
fp16_backend: auto
push_to_hub_model_id: None
push_to_hub_organization: None
mp_parameters: 
auto_find_batch_size: False
full_determinism: False
torchdynamo: None
ray_scope: last
ddp_timeout: 1800
torch_compile: False
torch_compile_backend: None
torch_compile_mode: None
dispatch_batches: None
split_batches: None
include_tokens_per_second: False
include_num_input_tokens_seen: False
neftune_noise_alpha: None
optim_target_modules: None
batch_eval_metrics: False
eval_on_start: False
prompts: None
batch_sampler: no_duplicates
multi_dataset_batch_sampler: proportional
```

| Epoch | Step | Training Loss | Validation Loss |
|---|---|---|---|
| 0.0961 | 100 | 0.2849 | 0.0915 |
| 0.1921 | 200 | 0.0963 | 0.0511 |
| 0.2882 | 300 | 0.069 | 0.0459 |
| 0.3842 | 400 | 0.0622 | 0.0445 |
| 0.4803 | 500 | 0.0544 | 0.0441 |
| 0.5764 | 600 | 0.0615 | 0.0418 |
| 0.6724 | 700 | 0.0573 | 0.0416 |
| 0.7685 | 800 | 0.0524 | 0.0435 |
| 0.8646 | 900 | 0.0523 | 0.0398 |
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
Base model: BAAI/bge-m3