Quick Start

Easily run LLMs locally

Using Ollama:

  1. Download Ollama from the official Ollama website.

  2. Download large language models:
    To ensure fairness, the following open-source models are recommended:

    If your computer cannot run these models, you are welcome to use a quantized version or other open-source models.

  3. We provide several example programs implementing various prompt engineering techniques to help you get started with the competition. You can use these examples as a foundation for developing your own advanced prompt engineering techniques. They are available in a GitHub repository, which you can find here.
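Once a model is pulled, Ollama serves it over a local HTTP API (by default at `http://localhost:11434/api/generate`). The sketch below shows one way to send a prompt to that endpoint from Python using only the standard library; the model name `llama2` is just an example placeholder, so substitute whichever model you actually pulled.

```python
import json
import urllib.request

# Ollama's default local generate endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama2") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    "llama2" is a placeholder; pass the name of the model you pulled.
    stream=False asks for a single JSON response instead of a stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama2") -> str:
    """Send the prompt to a locally running Ollama server and return the text."""
    body = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama server and a pulled model):
# print(generate("Describe a simple Science Birds level layout."))
```

The example programs in the repository wrap calls like this with their prompt engineering logic; the request-building step is separated out here so you can inspect or modify the payload before sending it.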

Using LM Studio:

  1. Download LM Studio from the official LM Studio website.

  2. Download large language models from the “Search” page:
    To ensure fairness, the following open-source models are recommended:

    If your computer cannot run these models, you are welcome to use a quantized version or other open-source models.

  3. Once the model is downloaded, navigate to the “Local Server” tab, select the model, and click “Start Server”.
    For more details, see Prompt Engineering for Science Birds Level Generation and Beyond.

  4. You can start from the tutorial code in this GitHub repository.

  5. We provide several example programs implementing various prompt engineering techniques to help you get started with the competition. You can use these examples as a foundation for developing your own advanced prompt engineering techniques. They are available in a GitHub repository, which you can find here.
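LM Studio's local server exposes an OpenAI-compatible chat endpoint (by default at `http://localhost:1234/v1/chat/completions`). The sketch below builds a few-shot prompt, one of the prompt engineering techniques the examples demonstrate, and sends it to that endpoint; the in-context example text and the `"local-model"` name are illustrative placeholders, since LM Studio serves whichever model you selected.

```python
import json
import urllib.request

# LM Studio's default local, OpenAI-compatible chat endpoint.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"


def build_few_shot_messages(task: str) -> list:
    """Assemble a few-shot chat prompt: a system instruction, one worked
    example (user turn plus assistant answer), then the real task.

    The example level description here is purely illustrative.
    """
    return [
        {"role": "system",
         "content": "You generate Science Birds level descriptions."},
        {"role": "user", "content": "Describe a level with one tower."},
        {"role": "assistant",
         "content": "A single stone tower, three blocks tall, with a pig on top."},
        {"role": "user", "content": task},
    ]


def chat(task: str) -> str:
    """Send the few-shot prompt to a running LM Studio local server."""
    payload = {
        "model": "local-model",  # placeholder; LM Studio uses the loaded model
        "messages": build_few_shot_messages(task),
        "temperature": 0.7,
    }
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]


# Example (requires LM Studio's Local Server to be started):
# print(chat("Describe a level with two towers and a slingshot-facing gap."))
```

Because the server speaks the OpenAI chat format, the same message-building code works unchanged with other OpenAI-compatible backends, which makes it easy to swap models while iterating on prompts.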