gpt-2-output-dataset

Dataset of GPT-2 outputs for research in detection, biases, and more

  • Owner: openai/gpt-2-output-dataset
  • License: MIT License

This dataset contains:

  • 250K documents from the WebText test set
  • For each GPT-2 model (trained on the WebText training set), 250K random samples (temperature 1, no truncation) and 250K samples generated with Top-K 40 truncation (see the sketch after this list)

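Top-k truncation restricts sampling at each step to the k most probable next tokens before renormalizing, while temperature-1 sampling with no truncation draws from the full model distribution. A minimal NumPy sketch of the idea (illustrative only, not the code used to generate these samples):

```python
import numpy as np

def sample_next_token(logits, k=40, temperature=1.0, rng=None):
    """Sample one token id from a vector of logits, keeping only the top-k candidates.

    With k=None this reduces to plain temperature-1 sampling, i.e. the
    "random samples (temperature 1, no truncation)" setting above.
    """
    if rng is None:
        rng = np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if k is not None:
        # Mask out everything below the k-th largest logit.
        cutoff = np.sort(logits)[-k]
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Example: a toy 6-token vocabulary with k=3.
print(sample_next_token([2.0, 1.5, 0.3, -1.0, -2.0, 0.1], k=3))
```
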
We look forward to the research produced using this data!

Download

For each model, we have a training split of 250K generated examples, as well as validation and test splits of 5K examples.

All data is located in Google Cloud Storage, under the directory gs://gpt-2/output-dataset/v1.

There, you will find files:

  • webtext.${split}.jsonl
  • small-117M.${split}.jsonl
  • small-117M-k40.${split}.jsonl
  • medium-345M.${split}.jsonl
  • medium-345M-k40.${split}.jsonl
  • large-762M.${split}.jsonl
  • large-762M-k40.${split}.jsonl
  • xl-1542M.${split}.jsonl
  • xl-1542M-k40.${split}.jsonl

where ${split} is one of train, test, or valid.

We've provided a script to download all of them, in download_dataset.py.
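If you prefer not to use the script, individual files can also be fetched over HTTPS. The sketch below assumes the bucket is exposed at the standard public Google Cloud Storage endpoint (https://storage.googleapis.com/gpt-2/output-dataset/v1/) and that each line of a .jsonl file is a self-contained JSON object with a text field; both are assumptions to verify against a downloaded file, not guarantees from this README.

```python
import json
import requests  # third-party: pip install requests

BASE_URL = "https://storage.googleapis.com/gpt-2/output-dataset/v1"
FILENAME = "webtext.test.jsonl"  # any filename from the list above

# Stream one file to disk.
with requests.get(f"{BASE_URL}/{FILENAME}", stream=True) as resp:
    resp.raise_for_status()
    with open(FILENAME, "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)

# Parse it: one JSON document per line.
with open(FILENAME, encoding="utf-8") as f:
    docs = [json.loads(line) for line in f]
print(len(docs), "documents; first one starts:", docs[0].get("text", "")[:80])
```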

Finetuned model samples

Additionally, we encourage research on detection of finetuned models. We have released data under gs://gpt-2/output-dataset/v1-amazonfinetune/ with samples from a GPT-2 full model finetuned to output Amazon reviews.

Detectability baselines

We're interested in seeing research in detectability of GPT-2 model family generations.

We provide some initial analysis of two baselines, as well as code for the better baseline.

Overall, we are able to achieve accuracies in the mid-90s for Top-K 40 generations, and mid-70s to high-80s (depending on model size) for random generations. We also find some evidence that adversaries can evade detection via finetuning from released models.
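As a rough illustration of how the detection task is set up (real WebText documents vs. generated samples), here is a hedged sketch of a simple bag-of-words detector using TF-IDF features and logistic regression with scikit-learn. The file names match the listing above, but the text field, the choice of model file, and the feature settings are assumptions for illustration; this is not the code behind the numbers quoted here.

```python
import json

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline


def load_texts(path, limit=None):
    # Each line of the .jsonl files is assumed to be a JSON object with a "text" field.
    texts = []
    with open(path, encoding="utf-8") as f:
        for i, line in enumerate(f):
            if limit is not None and i >= limit:
                break
            texts.append(json.loads(line)["text"])
    return texts


# Real (WebText) vs. generated (here: xl-1542M with top-k 40) documents.
real_train = load_texts("webtext.train.jsonl", limit=10000)
fake_train = load_texts("xl-1542M-k40.train.jsonl", limit=10000)
real_valid = load_texts("webtext.valid.jsonl")
fake_valid = load_texts("xl-1542M-k40.valid.jsonl")

X_train = real_train + fake_train
y_train = [1] * len(real_train) + [0] * len(fake_train)
X_valid = real_valid + fake_valid
y_valid = [1] * len(real_valid) + [0] * len(fake_valid)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), max_features=200_000),
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_valid, clf.predict(X_valid)))
```

Swapping the xl-1542M-k40 files for the temperature-1 xl-1542M files gives a feel for why untruncated samples are harder to detect, consistent with the accuracy ranges reported above.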

Data removal requests

If you believe your work is included in WebText and would like us to remove it, please let us know at webtextdata@openai.com.

Main metrics

Overview

  • Name with owner: openai/gpt-2-output-dataset
  • Primary language: Python
  • Program languages: Python (language count: 2)
  • License: MIT License

Owner activity

  • Created at: 2019-05-03 02:58:09
  • Pushed at: 2023-12-13 03:03:19
  • Last commit at: 2023-12-12 19:03:19
  • Release count: 0

User engagement

  • Stargazers count: 2k
  • Watchers count: 74
  • Fork count: 549
  • Commits count: 22
  • Has issues: enabled
  • Issues count: 49
  • Open issues count: 28
  • Pull requests count: 1
  • Open pull requests count: 2
  • Closed pull requests count: 4

Project settings

  • Has wiki: enabled