Metadata-Version: 2.1
Name: openprompt
Version: 1.0.1
Summary: An open source framework for prompt-learning.
Home-page: https://github.com/thunlp/OpenPrompt
Author: Ning Ding, Shengding Hu, Weilin Zhao, Yulin Chen
Author-email: dingn18@mails.tsinghua.edu.cn
License: Apache
Keywords: PLM,prompt,AI,NLP
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.6.0
Description-Content-Type: text/markdown

# Hard Prompts Made Easy: Discrete Prompt Tuning for Language Models

## To generate prompts from a full dataset, run the following lines

Note that `discrete_prompt_generation` selects our method, while `fluentPrompt` selects the FluentPrompt baseline; setting both `fluentPrompt` and `zero_beta` to 1 gives the AutoPrompt SGD variant (see the sketch after the baseline command below).


To run our baseline, run the following commands:
```
# Create and activate a fresh environment
conda create --name test python=3.9
conda activate test

# Install dependencies and the package itself
pip install -r requirements.txt
python setup.py install

# Choose the dataset and enable our method
export datasetname=sst2
export discrete_prompt_generation=1

python src/train_prompts.py --model=gpt2 --plm_eval_mode=True \
    --template_id=0 --verbalizer_id=0 --warmup_step_prompt=0 \
    --model_name_or_path=gpt2-large --with_tracking=True \
    --dataset=$datasetname --dataset_holdout=5000 --eval_every_steps=100 \
    --fluency_weight=0 --max_steps=5000 --optimizer=Adafactor \
    --prompt_lr=0.3 --seed=100 --soft_token_num=10 \
    --discrete_prompt_generation=$discrete_prompt_generation
```
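To run the FluentPrompt and AutoPrompt SGD baselines mentioned above, only the method switches should need to change. The sketch below is an assumption, not a verified invocation: it presumes `train_prompts.py` accepts `--fluentPrompt` and `--zero_beta` flags by analogy with `--discrete_prompt_generation`, and that setting a switch to 0 disables it; check the argument parser in `src/train_prompts.py` for the exact names.

```
# Assumed variant runs -- the --fluentPrompt and --zero_beta flag names are
# inferred by analogy with --discrete_prompt_generation and are not verified.

# Shared arguments, identical to the baseline command above
SHARED="--model=gpt2 --plm_eval_mode=True --template_id=0 --verbalizer_id=0 \
  --warmup_step_prompt=0 --model_name_or_path=gpt2-large --with_tracking=True \
  --dataset=$datasetname --dataset_holdout=5000 --eval_every_steps=100 \
  --fluency_weight=0 --max_steps=5000 --optimizer=Adafactor --prompt_lr=0.3 \
  --seed=100 --soft_token_num=10"

# FluentPrompt baseline
python src/train_prompts.py $SHARED --discrete_prompt_generation=0 --fluentPrompt=1

# AutoPrompt SGD version: FluentPrompt with zero_beta set to 1
python src/train_prompts.py $SHARED --discrete_prompt_generation=0 --fluentPrompt=1 --zero_beta=1
```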
