Developer Quickstart

Overview

This guide helps developers get started with GABM quickly. For details on how to contribute, please refer to the Developer Guide.

Your choice of development environment is your own, but we recommend Visual Studio Code.

Prerequisites

You need:

  • A GitHub account to contribute.

  • Git version 2.43 or above.

  • Python version 3.12 or above.

  • pip version 25.2 or above.

  • GNU Make version 4.3 or above.

To check your Python version run:

python3 --version

To check your pip version run:

pip --version

To check your GNU Make version run:

make --version
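The individual checks above can be combined into a single loop (a convenience sketch, not part of the GABM repository):

```shell
# Print the version of each prerequisite, or flag it as missing.
checked=0
for tool in git python3 pip make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version | head -n 1)"
  else
    echo "$tool: NOT FOUND"
  fi
  checked=$((checked + 1))
done
```

The `head -n 1` keeps only the first line of output, since tools such as GNU Make print several lines of license text after the version.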

You may have multiple versions of Python and GNU Make on your system. Make sure the commands above resolve to the versions GABM requires; the simplest way is to adjust your PATH so those versions are found first.
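For example, to make a newer Python the first match on your PATH (the directory /opt/python312/bin below is hypothetical; substitute wherever your preferred versions live):

```shell
# Prepend the directory containing the desired tool versions so the
# shell finds them before any older system copies.
export PATH="/opt/python312/bin:$PATH"
echo "PATH now starts with: ${PATH%%:*}"
```

Add the export line to your shell profile (e.g. ~/.bashrc) to make the change permanent.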

With the pre-requisites in place, the following steps should have you set up in a matter of minutes.

1. Fork and Clone

  • Fork the GABM repository to your own GitHub account using the “Fork” button and choosing “Create a new fork”.

  • Clone your fork locally using the following command, replacing <your-GitHub-username> with your actual GitHub username:

git clone https://github.com/<your-GitHub-username>/GABM.git

To keep your fork up to date with the main repository, add the upstream remote:

git remote add upstream https://github.com/compolis/GABM.git
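Once both remotes are configured, origin should point at your fork and upstream at the main repository. This sketch reproduces that layout in a throwaway repository so you can see the expected git remote output (the fork URL is hypothetical):

```shell
# Build a scratch repo with the same two-remote layout as a GABM fork.
scratch=$(mktemp -d)
cd "$scratch"
git init -q demo && cd demo
git remote add origin https://github.com/example-user/GABM.git
git remote add upstream https://github.com/compolis/GABM.git
git remote -v

# Routine sync of your fork (run inside your real clone; assumes the
# default branch is named main):
#   git fetch upstream
#   git merge upstream/main
remotes=$(git remote)
```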

2. Install Dependencies

Change into the cloned GABM directory and, from the project root, install all runtime, development, and documentation dependencies:

pip install -r requirements-dev.txt
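Before installing, you may want to isolate GABM's dependencies in a virtual environment. This is a general Python convention rather than a documented GABM requirement:

```shell
# Create a virtual environment in the project root.
python3 -m venv .venv
# Either activate it:
#   . .venv/bin/activate
# or call its interpreter directly; this pip is scoped to .venv:
.venv/bin/python -m pip --version
```

With the environment active (or using .venv/bin/python -m pip), the install command above places packages inside .venv instead of your system Python.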

Important: The torch package (PyTorch) is required for Apertus LLM model inference and can take several minutes to install. The download size is typically hundreds of megabytes, and the installed package may require over 1 GB of disk space. Please ensure you have sufficient storage and bandwidth before installing. For GPU support, follow the instructions at pytorch.org to match your environment. Using local installs of Apertus models is optional.

Warning about LLM model downloads:

If you want to use a local LLM (such as Apertus), see HUGGING_FACE.md for full instructions, including authentication and troubleshooting. Downloading the model weights from Hugging Face can require a very large amount of disk space (10–20 GB or more per model) and a fast, stable internet connection. The download and setup of these models is optional for most users. If you only want to use API-based LLMs (OpenAI, GenAI, DeepSeek, etc.), you do not need to download any local models.

If you do want to use a local LLM, ensure you have at least 20 GB of free disk space and be prepared for a long download time.

3. Set Up LLM API Keys

Create data/api_key.csv with your API keys for LLM providers. For all supported providers (including PublicAI/Apertus), see API_KEYS.md for format and details.
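A quick sanity check before running the setup target below (a sketch; the CSV's exact column format is documented in API_KEYS.md, so it is not reproduced here):

```shell
# Confirm the key file exists where the setup tooling expects it.
if [ -f data/api_key.csv ]; then
  status=found
else
  status=missing
fi
echo "data/api_key.csv: $status"
```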

To initialize your environment, test your API keys, and generate model lists and caches for all supported LLMs, run:

make setup-llms

This will:

  • Check for all required API keys (OpenAI, GenAI, DeepSeek, PublicAI)

  • Test each key with a default prompt and model

  • Populate data/llm/ directories with:

    • models.json and models.txt files for each LLM, detailing available models

    • Response caches: prompt_response_cache.pkl and prompt_response_cache.jsonl for test prompts sent to the default model of each service

  • Create log files in data/logs/llm and data/logs/setup_llms.log for diagnostics

  • Report any issues, such as missing or malformed api_key.csv, or problems with API keys/services

If you need to clear all caches and model lists (for a fresh start or troubleshooting), use:

make clear-caches

Note: For each LLM service, a directory is created in data/llm, for example data/llm/openai. The model list files (models.json, models.txt) list the models available from that service, and the cache files (prompt_response_cache.pkl, prompt_response_cache.jsonl) hold prompt/response data from it.
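Because the .jsonl cache is plain text with one JSON object per line, it is easy to inspect from the shell. The sample records below are invented for illustration; the real field names are whatever make setup-llms writes:

```shell
# Create a two-record sample cache, then count and peek at it.
tmp=$(mktemp -d)
cat > "$tmp/prompt_response_cache.jsonl" <<'EOF'
{"prompt": "Hello", "response": "Hi there"}
{"prompt": "What is 2+2?", "response": "4"}
EOF
wc -l < "$tmp/prompt_response_cache.jsonl"   # one line per record
head -n 1 "$tmp/prompt_response_cache.jsonl"
```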

4. Run Tests and Build Documentation

  • Run tests:

make test

  • Build documentation:

make docs

  • View the docs in your browser by opening docs/_build/html/index.html.
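The opener command differs per platform (open on macOS, xdg-open on most Linux desktops), and a small guard avoids a confusing error when the docs have not been built yet (a convenience sketch):

```shell
index=docs/_build/html/index.html
if [ -f "$index" ]; then
  # Try the common openers in turn; the final fallback is Python's
  # cross-platform webbrowser module.
  xdg-open "$index" 2>/dev/null || open "$index" 2>/dev/null \
    || python3 -m webbrowser "$index"
  status=opened
else
  echo "Docs not built yet; run 'make docs' first"
  status=unbuilt
fi
```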

Troubleshooting

If you encounter errors, double-check the setup steps above. If you need help, open an issue on GitHub.

Additional Resources