This model is a fine-tuned version of ParsBERT on the PersianQA dataset for extractive Persian question answering. It achieves a loss of 1.7297 on the evaluation set.
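A minimal usage sketch through the transformers question-answering pipeline. The repo id below is an assumption for illustration only; substitute the actual Hub id of this checkpoint.

```python
# Usage sketch; the repo id is an assumed placeholder, not confirmed by this card.
from transformers import pipeline

qa = pipeline("question-answering", model="ForutanRad/bert-fa-QA-v1")  # assumed repo id

result = qa(
    question="پایتخت ایران کجاست؟",      # "What is the capital of Iran?"
    context="تهران پایتخت ایران است.",   # "Tehran is the capital of Iran."
)
print(result["answer"], result["score"])
```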
ParsBERT is a monolingual language model based on Google's BERT architecture. It is pre-trained on large Persian corpora covering a wide range of writing styles and subjects.
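A sketch of the fine-tuning starting point, assuming the commonly used HooshvareLab/bert-fa-base-uncased checkpoint as the ParsBERT base model:

```python
# Load a ParsBERT checkpoint with a question-answering head; the head is randomly
# initialized here and only becomes useful after fine-tuning on PersianQA.
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

base = "HooshvareLab/bert-fa-base-uncased"  # assumed ParsBERT base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForQuestionAnswering.from_pretrained(base)
```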
BERT itself is a bidirectional transformer pretrained with a combination of a masked language modeling objective and next-sentence prediction on a large unlabeled corpus.
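A small illustration of the masked language modeling objective, using the English bert-base-uncased checkpoint purely as an example:

```python
# The model predicts the token hidden behind [MASK], which is the pretraining signal
# behind masked language modeling.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```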
Note that BERT-based question answering is extractive: the model is supplied a context passage together with the question and predicts the span of that context which answers it.
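A sketch of what that extraction looks like under the hood, again with the assumed repo id from above: the model scores every context token as a possible answer start and end, and the best-scoring span is decoded back to text.

```python
# Manual span extraction from start/end logits; the repo id is an assumed placeholder.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

name = "ForutanRad/bert-fa-QA-v1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "پایتخت ایران کجاست؟"
context = "تهران پایتخت ایران است."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```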