...

  • Prerequisites for the meta-llama2 model:

    • GPU instance with an NVIDIA A100-80GB running Ubuntu 22.04

    • Install NVIDIA drivers and Docker

      Code Block
      languagebash
      #!/bin/bash
      
      # Install kernel headers matching the running kernel (needed to build the driver)
      sudo apt-get install -y linux-headers-$(uname -r)
      
      # Add NVIDIA's CUDA apt repository for this Ubuntu release and install the drivers
      distribution=$(. /etc/os-release;echo $ID$VERSION_ID | sed -e 's/\.//g')
      wget https://developer.download.nvidia.com/compute/cuda/repos/$distribution/x86_64/cuda-keyring_1.0-1_all.deb
      sudo dpkg -i cuda-keyring_1.0-1_all.deb
      sudo apt-get update
      sudo apt-get -y install cuda-drivers
      
      # Install the base container toolkit and generate the CDI specification
      sudo apt-get update
      sudo apt-get install -y nvidia-container-toolkit-base
      
      nvidia-ctk --version
      sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml
      
      # Install Docker and enable it as a service
      curl https://get.docker.com | sh \
        && sudo systemctl --now enable docker
      
      # Add the signed libnvidia-container apt repository
      distribution=$(. /etc/os-release;echo $ID$VERSION_ID) \
        && curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
        && curl -s -L https://nvidia.github.io/libnvidia-container/$distribution/libnvidia-container.list | \
          sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
          sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
      
      # Install the container toolkit and register the NVIDIA runtime with Docker
      sudo apt-get update
      sudo apt-get install -y nvidia-container-toolkit
      sudo nvidia-ctk runtime configure --runtime=docker
      
      # Restart Docker and verify GPU access from inside a container
      sudo systemctl restart docker
      sudo docker run --rm --runtime=nvidia --gpus all nvidia/cuda:11.6.2-base-ubuntu20.04 nvidia-smi
  • Run the following Docker command on the instance above (its IP address will serve as LLAMA2-API-SERVER-IPADDRESS):

    Code Block
    docker run --gpus all --network host --rm -d -p 5000:5000 zbio/roostai-llama2-13b:1.1.3
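    Once the container is running, the model endpoint can be built from the instance IP. A minimal sketch, assuming the server listens on port 5000 at the /generate path as described below; the IP shown is an RFC 5737 documentation address, and the JSON payload shape in the commented-out probe is an assumption, not confirmed by this page:

    Code Block
    languagebash
    ```shell
    #!/bin/bash
    # Replace with your instance's actual IP (example/documentation address shown)
    LLAMA2_API_SERVER_IPADDRESS="203.0.113.10"

    # Build the endpoint URL expected by RoostGPT
    ENDPOINT="http://${LLAMA2_API_SERVER_IPADDRESS}:5000/generate"
    echo "$ENDPOINT"

    # With the container up, a probe might look like this (uncomment to run;
    # the request body format is a hypothetical placeholder):
    # curl -s -X POST "$ENDPOINT" -H 'Content-Type: application/json' \
    #   -d '{"prompt": "Hello"}'
    ```

    The echoed URL is the value to paste into the RoostGPT configuration in the steps below.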

...

  • Provide the Roost Token.

  • Select the open-source option for the Generative AI Model.

  • Provide the open-source model endpoint in the following format: http://<LLAMA2-API-SERVER-IPADDRESS>:5000/generate.

  • Set the rest of the values according to the test type and the code repo; more information about which values are required for each test can be found here.

  • Go to your workspace explorer; right-click a file or folder and you will see a Generate tests using RoostGPT option.

    • Right-click the API-Spec file if you want to generate API or integration tests.

    • Right-click any other file or folder to generate Unit tests or API tests from source code.
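    Before triggering generation, the endpoint value entered above can be sanity-checked against the required http://<LLAMA2-API-SERVER-IPADDRESS>:5000/generate format. A minimal sketch; the IP below is an RFC 5737 documentation address, not a real server:

    Code Block
    languagebash
    ```shell
    #!/bin/bash
    # Validate the configured endpoint string before saving the RoostGPT settings.
    ENDPOINT="http://203.0.113.10:5000/generate"

    # Accept only http://<IPv4>:5000/generate, matching the format required above
    if echo "$ENDPOINT" | grep -Eq '^http://[0-9.]+:5000/generate$'; then
      echo "endpoint format OK"
    else
      echo "endpoint format invalid"
    fi
    ```

    A check like this catches common mistakes (missing port, trailing slash, https instead of http) before a test run fails against the model server.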

RoostGPT UI Usage

  • If the RoostGPT UI is available, users can log in to the Roost URL.

  • Go to the RoostGPT tab to define and trigger test generation.

    • UI requires Bitbucket credentials with permissions to create PRs and read repos

    • UI requires LLAMA2-API-SERVER-IPADDRESS

    • Optionally, UI requires third-party credentials for integration with tools such as:

      • Jira

      • ELKS

...