Generative Models

Training a German LLM from Scratch 🦜, 14 Nov. 2024 (posts)
This article is not finished and will be updated. The research group I work with has access to a small GPU cluster, which occasionally sits idle. To avoid wasting valuable compute resources (idle GPUs essentially burn money through opportunity costs), I decided to train a German GPT-2-style model from scratch, using only German text. Existing German models available on Hugging Face have 137M parameters and a context length of 1024 tokens, which is quite limited compared to recently released …
Categories: Deep Learning
1806 Words, Tagged with: Deep Learning · Generative Models · LLM
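As a rough sanity check on figures like "137M parameters", the size of a GPT-2-style model can be estimated from its configuration alone. The sketch below assumes the standard GPT-2 decoder layout (4x MLP expansion, tied input/output embeddings); exact counts vary with vocabulary size and whether the embeddings are tied.

```python
def gpt2_param_count(vocab_size, context_length, d_model, n_layers):
    """Rough parameter count for a GPT-2-style decoder with tied
    input/output embeddings (shared weights, counted once)."""
    embeddings = vocab_size * d_model + context_length * d_model
    # Per block: QKV projection (3*d^2 + 3d), attention output projection
    # (d^2 + d), MLP with 4x hidden size (8*d^2 + 5d), two layer norms (4d).
    per_block = 12 * d_model**2 + 13 * d_model
    final_layer_norm = 2 * d_model
    return embeddings + n_layers * per_block + final_layer_norm

# GPT-2 small: vocab 50257, context 1024, d_model 768, 12 layers
print(gpt2_param_count(50257, 1024, 768, 12))  # ~124M
```

With a larger tokenizer vocabulary (as German models often use), the embedding term alone pushes the total noticeably higher, which is one way a nominally "small" model ends up at 137M.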
Mining the Bundestag, 22 Jan. 2023 (posts)
Did you know the German parliament publishes protocols for all of its proceedings in PDF format? It is relatively straightforward to download and parse them, so we can easily collect a dataset of transcripts of what seems to be every speech in the Bundestag since the Second World War. My original idea was to mine the speeches for word associations. Some words will be associated with other words based on the intended connotation, and this association might change over time as the connotations …
Categories: Data Mining
1024 Words, Tagged with: Bundestag · Data Mining · Generative Models
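The word-association mining mentioned above can be approximated with simple co-occurrence counting: words that frequently appear within a small window of each other are candidates for an association. This is a minimal sketch, not the post's actual pipeline; the window size and whitespace tokenization are assumptions.

```python
from collections import Counter

def cooccurrence_counts(speeches, window=5):
    """Count how often two words appear within `window` tokens of each
    other across a corpus of speeches; high counts hint at associations."""
    pairs = Counter()
    for speech in speeches:
        tokens = speech.lower().split()  # naive whitespace tokenization
        for i, w in enumerate(tokens):
            for v in tokens[i + 1 : i + 1 + window]:
                pairs[tuple(sorted((w, v)))] += 1  # order-insensitive pair
    return pairs

counts = cooccurrence_counts([
    "die energiewende ist eine grosse chance",
    "die energiewende ist eine herausforderung",
])
print(counts[("energiewende", "ist")])  # 2
```

Bucketing the speeches by legislative period before counting would expose how the associations shift over time, which is the question the post raises.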
Mining tagesschau.de, 26 Nov. 2022 (posts)
I like to read tagesschau.de, so I wrote a script to scrape it at regular intervals. My original goal was to determine which articles stay on the front page the longest, which ones allow commenting (a feature that seems to have been disabled almost entirely since March 2020), and whether articles are modified after their initial release without this being mentioned, because I sometimes feel that headlines change. Dataset Creation Tagesschau provides a JSON API, so fetching all of the articles is …
Categories: Data Mining
1040 Words, Tagged with: Tagesschau · Generative Models · Data Mining
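Once the JSON payload is fetched, extracting the fields needed for the front-page and modification analysis is a small parsing step. The field names below ("news", "title", "date") are assumptions for illustration, not a documented schema of the Tagesschau API.

```python
import json

def extract_articles(api_response):
    """Pull (title, date) pairs from a Tagesschau-style JSON payload.
    The keys "news", "title", and "date" are assumed here; adjust them
    to whatever the actual API response contains."""
    return [(item["title"], item["date"])
            for item in api_response.get("news", [])]

sample = json.loads('{"news": [{"title": "Beispiel", "date": "2022-11-26"}]}')
print(extract_articles(sample))  # [('Beispiel', '2022-11-26')]
```

Storing each fetch with a timestamp makes it possible to diff successive snapshots of the same article and catch silent headline changes.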
On Outlier Exposure with Generative Models, 23 Nov. 2022 (papers)
Our paper On Outlier Exposure with Generative Models has been accepted at the NeurIPS Machine Learning Safety Workshop. Abstract While Outlier Exposure reliably increases the performance of Out-of-Distribution detectors, it requires a set of available outliers during training. In this paper, we propose Generative Outlier Exposure (GOE), which alleviates the need for available outliers by using generative models to sample synthetic outliers from low-density regions of the data distribution. The …
Categories: Anomaly Detection
110 Words, Tagged with: MLSW · Generative Models · Anomaly Detection