Show simple item record

dc.contributor.author: De La Rosa, Javier
dc.contributor.author: Ponferrada, Eduardo G.
dc.contributor.author: Villegas, Paulo
dc.contributor.author: De Prado Salas, Pablo Gonzalez
dc.contributor.author: Romero, Manu
dc.contributor.author: Grandury, Maria
dc.date.accessioned: 2024-05-30T09:09:23Z
dc.date.available: 2024-05-30T09:09:23Z
dc.date.created: 2022-07-01T12:11:40Z
dc.date.issued: 2022
dc.identifier.citation: Revista de Procesamiento de Lenguaje Natural (SEPLN). 2022, 68, 13-23.
dc.identifier.issn: 1135-5948
dc.identifier.uri: https://hdl.handle.net/11250/3131976
dc.language.iso: eng
dc.title: BERTIN: Efficient Pre-Training of a Spanish Language Model using Perplexity Sampling
dc.title.alternative: BERTIN: Efficient Pre-Training of a Spanish Language Model using Perplexity Sampling
dc.type: Peer reviewed
dc.type: Journal article
dc.description.version: publishedVersion
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.doi: 10.26342/2022-68-1
dc.identifier.cristin: 2036642
dc.source.journal: Revista de Procesamiento de Lenguaje Natural (SEPLN)
dc.source.volume: 68
dc.source.pagenumber: 13-23


Associated file(s)


This item appears in the following collection(s)
