BERT (language model)

Bidirectional Encoder Representations from Transformers (BERT) is a language model introduced in October 2018 by researchers at Google. It learns to represent text as a sequence of vectors through self-supervised learning, and uses the transformer encoder architecture. BERT was notable for its dramatic improvement over previous state-of-the-art models, and as an early example of a large language model. As of 2020, BERT was a ubiquitous baseline in natural language processing (NLP) experiments.
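The self-supervised objective behind BERT's pretraining is masked language modeling: some input tokens are hidden and the model is trained to predict them from the surrounding context. The sketch below shows only the data-preparation step in plain Python (function name, tokenization by whitespace, and the single `[MASK]` replacement rule are illustrative simplifications; BERT's actual scheme masks about 15% of tokens and replaces them with `[MASK]` only 80% of the time, using a random or unchanged token otherwise):

```python
import random

MASK_TOKEN = "[MASK]"

def make_mlm_example(tokens, mask_prob=0.15, seed=0):
    """Build a masked-language-modeling training pair.

    Returns (inputs, labels): `inputs` has roughly `mask_prob` of its tokens
    replaced by [MASK]; `labels` holds the original token at each masked
    position and None elsewhere, so a training loss would be computed only
    on the masked positions.
    """
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(MASK_TOKEN)   # hide the token from the model
            labels.append(tok)          # remember what it has to predict
        else:
            inputs.append(tok)
            labels.append(None)         # unmasked positions carry no loss
    return inputs, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
inputs, labels = make_mlm_example(tokens, mask_prob=0.3, seed=42)
```

Because the labels come from the text itself, no human annotation is needed; this is what makes the pretraining self-supervised.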
original url: https://en.wikipedia.org/wiki/BERT_(language_model)
date created: 2019-10-10T16:19:45Z
date modified: 2024-09-06T02:51:28Z
