Merge branch 'main' into shortfin_invoke
stellaraccident authored Sep 3, 2024
2 parents c1b03e8 + 89cc4c5 commit 6a8e608
Showing 6 changed files with 2,339 additions and 0 deletions.
44 changes: 44 additions & 0 deletions .github/workflows/ci-tuner.yml
@@ -0,0 +1,44 @@
name: CI - Tuner

on:
  workflow_dispatch:
  pull_request:
  push:
    branches:
      - main

concurrency:
  group: ${{ github.workflow }}-${{ github.event.number || github.sha }}
  cancel-in-progress: true

permissions:
  contents: read

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4.1.7

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10.12'

      - name: Install dev dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r tuner/requirements-dev.txt
      - name: Install tuner dependencies
        run: |
          pip install -r tuner/requirements-tuner.txt
          python -m pip install \
            --find-links https://iree.dev/pip-release-links.html \
            --upgrade \
            iree-compiler iree-runtime
      - name: Run tuner tests
        run: pytest tuner/
67 changes: 67 additions & 0 deletions tuner/README.md
@@ -0,0 +1,67 @@
# IREE dispatch auto-tuning scripts
`libtuner.py` is the core Python script that provides the fundamental functions for the tuning loop. It imports `candidate_gen.py` for candidate generation. To implement the full tuning loop, `libtuner.py` must be driven by a separate Python script that uses the `TuningClient` API it provides.

## Prerequisites
[Optional] Using a virtual environment:
```shell
cd tuner
python -m venv .venv
source .venv/bin/activate
```
Install Python dependencies:
```shell
pip install -r ./requirements-tuner.txt
```
Using IREE's Python bindings:
- Building with CMake:
```shell
-DIREE_BUILD_PYTHON_BINDINGS=ON \
-DPython3_EXECUTABLE="$(which python)"
```
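  For reference, a full configure-and-build invocation might look like this (run from an IREE source checkout; the `../iree-build/` layout and the Ninja generator follow the IREE getting-started guide and are assumptions, so adapt them to your setup):
```shell
cmake -G Ninja -B ../iree-build/ -S . \
  -DCMAKE_BUILD_TYPE=RelWithDebInfo \
  -DIREE_BUILD_PYTHON_BINDINGS=ON \
  -DPython3_EXECUTABLE="$(which python)"
cmake --build ../iree-build/
```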
- Set the environment variables:
```shell
source ../iree-build/.env && export PYTHONPATH
```
For more information, refer to the [IREE documentation](https://iree.dev/building-from-source/getting-started/#python-bindings).
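A quick way to verify the bindings are importable from the active environment (optional sanity check):
```shell
python -c "import iree.compiler, iree.runtime; print(iree.compiler.__file__)"
```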

### Overall flow

1. Symlink all scripts and mlir/irpa files in your build dir.
- Symlink `iree-build-dir/tools` inside `tuning`.
- Symlink ML model MLIR and weights based on `unet.sh`.
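   - Example (a rough sketch; the build-dir location and the model/weight file names are placeholders for your own paths):
```shell
# Run from the tuning dir: link the IREE tools and the model artifacts referenced by unet.sh.
ln -s ../iree-build/tools ./tools
ln -s /path/to/unet.mlir ./unet.mlir
ln -s /path/to/unet_weights.irpa ./unet_weights.irpa
```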

2. Copy the attention/matmul spec as `config.mlir` in the tuning dir.
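   - Example (the source location of the spec is a placeholder):
```shell
cp /path/to/attention_and_matmul_spec.mlir ./config.mlir
```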

3. Temporarily comment out all the existing configs in `config.mlir`.
- Example:
```mlir
// , @match_mmt_2048x10240x1280 -> @apply_op_config
// , @match_mmt_2048x1280x5120 -> @apply_op_config
// , @match_mmt_2048x1280x1280 -> @apply_op_config
```

4. Compile a baseline unet.
```shell
./unet.sh winograd unet.mlir -o unet_baseline.vmfb --iree-hal-dump-executable-files-to=dump-winograd
```

5. Find the matmul to tune and copy the `*_benchmark.mlir` file to the build dir.
```shell
cp dump-winograd/*_141_*benchmark.mlir ./141.mlir
```

6. Run the tuning script.
- Example:
```shell
python punet_autotune.py 141.mlir --devices=hip://GPU-0,hip://GPU-4 --num-candidates=1024
```

7. Check the winning candidate in `result_summary.log`, then find and copy its transform spec.

8. Paste the transform spec into `config.mlir` and uncomment it.

9. Add the match function to the entry point in `config.mlir`.
- Example:
```mlir
@match_something -> @apply_op_config
```
