Adapt calibration reporting to new calibration implementation
ricardarosemann committed Jan 17, 2025
1 parent b32fba3 commit 4d4b1b7
Showing 10 changed files with 126 additions and 59 deletions.
4 changes: 3 additions & 1 deletion .buildlibrary
@@ -1,4 +1,4 @@
ValidationKey: '1420426'
ValidationKey: '1447560'
AutocreateReadme: yes
AcceptedWarnings:
- 'Warning: package ''.*'' was built under R version'
@@ -7,3 +7,5 @@ AcceptedNotes: Namespaces in Imports field not imported from\:\n *.ggplot2. .kni
.mip. .piamPlotComparison. .purrr.
allowLinterWarnings: no
enforceVersionUpdate: no
AutocreateCITATION: yes
skipCoverage: no
25 changes: 16 additions & 9 deletions .github/workflows/check.yaml
@@ -23,14 +23,14 @@ jobs:
- uses: r-lib/actions/setup-r-dependencies@v2
with:
extra-packages: |
any::lucode2
any::covr
any::madrat
any::magclass
any::citation
any::gms
any::goxygen
any::GDPuc
lucode2
covr
madrat
magclass
citation
gms
goxygen
GDPuc
# piam packages also available on CRAN (madrat, magclass, citation,
# gms, goxygen, GDPuc) will usually have an outdated binary version
# available; by using extra-packages we get the newest version
@@ -44,6 +44,13 @@ jobs:
[ -f requirements.txt ] && python -m pip install --upgrade pip wheel || true
[ -f requirements.txt ] && pip install -r requirements.txt || true
- name: Run pre-commit checks
shell: bash
run: |
python -m pip install pre-commit
python -m pip freeze --local
pre-commit run --show-diff-on-failure --color=always --all-files
- name: Verify validation key
shell: Rscript {0}
run: lucode2:::validkey(stopIfInvalid = TRUE)
@@ -63,6 +70,6 @@ jobs:
shell: Rscript {0}
run: |
nonDummyTests <- setdiff(list.files("./tests/testthat/"), c("test-dummy.R", "_snaps"))
if(length(nonDummyTests) > 0) covr::codecov(quiet = FALSE)
if(length(nonDummyTests) > 0 && !lucode2:::loadBuildLibraryConfig()[["skipCoverage"]]) covr::codecov(quiet = FALSE)
env:
NOT_CRAN: "true"
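The coverage gate added to the workflow above can be sketched as follows. `lucode2:::loadBuildLibraryConfig()` is an internal helper, so this is a hedged illustration of the intended logic (assuming the config is the YAML `.buildlibrary` file shown in this commit), not its exact implementation:

```r
# Sketch: run codecov only if real tests exist and the package
# has not opted out of coverage via skipCoverage in .buildlibrary.
nonDummyTests <- setdiff(list.files("./tests/testthat/"),
                         c("test-dummy.R", "_snaps"))

cfg <- yaml::read_yaml(".buildlibrary")        # assumption: config lives here
skipCoverage <- isTRUE(cfg[["skipCoverage"]])  # "no" in YAML parses to FALSE

if (length(nonDummyTests) > 0 && !skipCoverage) {
  covr::codecov(quiet = FALSE)
}
```

With `skipCoverage: no` (as set in this commit's `.buildlibrary`), coverage still runs; flipping it to `yes` would silently skip the `covr::codecov` upload.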
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@
exclude: '^tests/testthat/_snaps/.*$'
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 2c9f875913ee60ca25ce70243dc24d5b6415598c # frozen: v4.6.0
rev: cef0300fd0fc4d2a87a85fa2093c6b283ea36f4b # frozen: v5.0.0
hooks:
- id: check-case-conflict
- id: check-json
@@ -15,7 +15,7 @@ repos:
- id: mixed-line-ending

- repo: https://github.com/lorenzwalthert/precommit
rev: bae853d82da476eee0e0a57960ee6b741a3b3fb7 # frozen: v0.4.3
rev: 3b70240796cdccbe1474b0176560281aaded97e6 # frozen: v0.4.3.9003
hooks:
- id: parsable-R
- id: deps-in-desc
@@ -25,4 +25,4 @@
- id: readme-rmd-rendered
- id: use-tidy-description
ci:
autoupdate_schedule: quarterly
autoupdate_schedule: weekly
4 changes: 2 additions & 2 deletions CITATION.cff
@@ -2,8 +2,8 @@ cff-version: 1.2.0
message: If you use this software, please cite it using the metadata from this file.
type: software
title: 'reportbrick: Reporting package for BRICK'
version: 0.7.1
date-released: '2024-10-10'
version: 0.7.2
date-released: '2025-01-17'
abstract: This package contains BRICK-specific routines to report model results. The
main functionality is to generate a mif-file from a given BRICK model run folder.
authors:
4 changes: 2 additions & 2 deletions DESCRIPTION
@@ -1,8 +1,8 @@
Type: Package
Package: reportbrick
Title: Reporting package for BRICK
Version: 0.7.1
Date: 2024-10-10
Version: 0.7.2
Date: 2025-01-17
Authors@R: c(
person("Robin", "Hasse", , "robin.hasse@pik-potsdam.de",
role = c("aut", "cre"),
1 change: 1 addition & 0 deletions NAMESPACE
@@ -49,6 +49,7 @@ importFrom(tidyr,replace_na)
importFrom(tidyr,separate)
importFrom(tidyr,unite)
importFrom(utils,capture.output)
importFrom(utils,read.csv)
importFrom(utils,tail)
importFrom(utils,write.csv)
importFrom(yaml,read_yaml)
87 changes: 61 additions & 26 deletions R/reportCalibration.R
@@ -7,7 +7,8 @@
#' @author Ricarda Rosemann
#'
#' @importFrom tidyr crossing replace_na
#' @importFrom utils write.csv
#' @importFrom utils read.csv write.csv
#' @importFrom yaml read_yaml
#' @export

reportCalibration <- function(gdx) {
@@ -18,69 +19,87 @@ reportCalibration <- function(gdx) {
path <- dirname(gdx)

allFiles <- list.files(path = path,
pattern = paste0(sub("\\.gdx$", "", basename(gdx)),
pattern = paste0(sub("_0\\.gdx$", "", basename(gdx)),
"_\\d{1,3}\\.gdx"))
maxIter <- max(as.numeric(sub(".*_(\\d{1,3})\\.gdx$", "\\1", allFiles)))

gdxHist <- file.path(path, "historic.gdx")

# Read config
cfg <- read_yaml(file = file.path(path, "config", "config_COMPILED.yaml"))

# Read relevant time periods
tCalib <- readGdxSymbol(gdx, "tCalib", asMagpie = FALSE)[["ttot"]]
tCalib <- cfg[["calibperiods"]]

# Read different parameters from gdx files for all iterations
# TODO: Could potentially replace this call by mip::getPlotData (but that uses gdxrrw and can only start counting at 1) # nolint
targetFunction <- .readGdxIter(gdx, "p_f", maxIter, asMagpie = FALSE)
calibOptim <- cfg[["switches"]][["RUNTYPE"]] == "calibrationOptimization"

stepSize <- .readGdxIter(gdx, "p_alpha", maxIter, asMagpie = FALSE)
# Read diagnostic parameters from csv
stepSize <- read.csv(file.path(path, "stepSizeParamsIter.csv")) %>%
select(-"delta", -"phiDeriv") %>%
rename(value = "alpha")
descDirCon <- read.csv(file.path(path, "deviationConIter.csv")) %>%
rename(value = "d") %>%
.replaceVarName()
descDirRen <- read.csv(file.path(path, "deviationRenIter.csv")) %>%
rename(value = "d")

# Potentially shift the time filter to a later stage if I want to save and plot pure stock/flow data
p_stock <- .readGdxIter(gdx, "p_stock", maxIter, asMagpie = FALSE, ttotFilter = tCalib, replaceVar = TRUE)
v_stock <- .readGdxIter(gdx,
if (calibOptim) "p_stock" else "v_stock",
maxIter, asMagpie = FALSE, ttotFilter = tCalib, replaceVar = TRUE)

p_construction <- .readGdxIter(gdx, "p_construction", maxIter,
v_construction <- .readGdxIter(gdx,
if (calibOptim) "p_construction" else "v_construction",
maxIter,
asMagpie = FALSE, ttotFilter = tCalib, replaceVar = TRUE)

p_renovation <- .readGdxIter(gdx, "p_renovation", maxIter,
v_renovation <- .readGdxIter(gdx,
if (calibOptim) "p_renovation" else "v_renovation", maxIter,
asMagpie = FALSE, ttotFilter = tCalib)

# Read historical gdx files
p_stockHist <- .replaceVarName(readGdxSymbol(gdx, "p_stockHist", asMagpie = FALSE))
p_stockHist <- .replaceVarName(readGdxSymbol(gdxHist, "p_stockHist", asMagpie = FALSE))

p_constructionHist <- .replaceVarName(readGdxSymbol(gdx, "p_constructionHist", asMagpie = FALSE))
p_constructionHist <- .replaceVarName(readGdxSymbol(gdxHist, "p_constructionHist", asMagpie = FALSE))

p_renovationHist <- readGdxSymbol(gdx, "p_renovationHist", asMagpie = FALSE)
p_renovationHist <- readGdxSymbol(gdxHist, "p_renovationHist", asMagpie = FALSE)


# COMPUTE DEVIATIONS ---------------------------------------------------------

p_stockDev <- .computeDeviation(p_stock, p_stockHist)
v_stockDev <- .computeDeviation(v_stock, p_stockHist)

p_constructionDev <- .computeDeviation(p_construction, p_constructionHist)
v_constructionDev <- .computeDeviation(v_construction, p_constructionHist)

p_renovationDev <- .computeDeviation(p_renovation, p_renovationHist)
v_renovationDev <- .computeDeviation(v_renovation, p_renovationHist)

# Start writing output quantities --------------------------------------------
# WRITE DIAGNOSTIC OUTPUT ----------------------------------------------------

out <- list()

out[["targetFunction"]] <- targetFunction

out[["stepSize"]] <- stepSize

out[["descDirCon"]] <- .computeAvg(descDirCon, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr", "ttot"))

out[["descDirRen"]] <- .computeAvg(descDirRen, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr", "ttot"))

# AGGREGATE QUANTITIES -------------------------------------------------------

# Aggregate across all dimensions
out[["stockDevAgg"]] <- .computeSumSq(p_stockDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))
out[["stockDevAgg"]] <- .computeSumSq(v_stockDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))

out[["conDevAgg"]] <- .computeSumSq(p_constructionDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))
out[["conDevAgg"]] <- .computeSumSq(v_constructionDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))

out[["renDevAgg"]] <- .computeSumSq(p_renovationDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))
out[["renDevAgg"]] <- .computeSumSq(v_renovationDev, rprt = c("iteration", "reg", "typ", "loc", "inc"))

out[["flowDevAgg"]] <- .computeFlowSum(out[["conDevAgg"]], out[["renDevAgg"]])

# Aggregate by heating system (hs)
out[["stockDevHs"]] <- .computeSumSq(p_stockDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))
out[["stockDevHs"]] <- .computeSumSq(v_stockDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))

out[["conDevHs"]] <- .computeSumSq(p_constructionDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))
out[["conDevHs"]] <- .computeSumSq(v_constructionDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))

out[["renDevHs"]] <- .computeSumSq(p_renovationDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))
out[["renDevHs"]] <- .computeSumSq(v_renovationDev, rprt = c("iteration", "reg", "typ", "loc", "inc", "hsr"))

out[["flowDevHs"]] <- .computeFlowSum(out[["conDevHs"]], out[["renDevHs"]])

@@ -141,7 +160,7 @@ reportCalibration <- function(gdx) {
# Loop over all iterations and read in gdx files
res <- data.frame()
for (i in seq(0, maxIter)) {
fileName <- file.path(dirname(gdx), paste0(gsub("\\.gdx$", "", basename(gdx)), "_", i, ".gdx"))
fileName <- file.path(dirname(gdx), paste0(gsub("_0\\.gdx$", "", basename(gdx)), "_", i, ".gdx"))
if (file.exists(fileName)) {
res <- rbind(res, readGdxSymbol(fileName, symbol, asMagpie = asMagpie) %>%
mutate(iteration = i))
@@ -209,6 +228,22 @@ reportCalibration <- function(gdx) {

}

#' Compute the average of the values in a data frame
#'
#' @param df data frame, containing the data to be evaluated
#' @param rprt character, column names for which the averages should be reported separately
#' @returns data frame with averages as value column
#'
#' @importFrom dplyr %>% across any_of .data group_by summarise
#'
.computeAvg <- function(df, rprt = "") {

df %>%
group_by(across(any_of(rprt))) %>%
summarise(value = mean(.data[["value"]], na.rm = TRUE), .groups = "drop")

}
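On a toy data frame, the new `.computeAvg` helper groups by whichever of the requested reporting columns are present and averages the `value` column, dropping `NA`s. A minimal sketch of the same `dplyr` pattern:

```r
library(dplyr)

df <- data.frame(
  iteration = c(1, 1, 2, 2),
  reg       = "DEU",
  value     = c(2, 4, NA, 6)
)

df %>%
  group_by(across(any_of(c("iteration", "reg")))) %>%
  summarise(value = mean(.data[["value"]], na.rm = TRUE), .groups = "drop")
# iteration 1 -> value 3; iteration 2 -> value 6 (NA dropped by na.rm = TRUE)
```

Using `any_of(rprt)` means columns listed in `rprt` but absent from `df` are silently ignored, so the same call works for both the construction and renovation deviation tables despite their differing dimensions.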

#' Compute the sum of the squares in a data frame
#'
#' @param df data frame, containing the data to be evaluated
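The pattern change in `reportCalibration` assumes the new calibration writes one gdx per iteration as `<run>_0.gdx`, `<run>_1.gdx`, …, with the function receiving the iteration-0 file. Stripping the trailing `_0.gdx` (rather than just `.gdx`, as before) keeps the base run name intact when reconstructing the other iterations; a small illustration:

```r
# Assumed naming scheme: <run>_<iteration>.gdx, entry point is iteration 0
gdx  <- "output/calibration_0.gdx"
base <- sub("_0\\.gdx$", "", basename(gdx))   # "calibration"

# Reconstruct the gdx path of iteration i (as in .readGdxIter):
iterFile <- function(i) {
  file.path(dirname(gdx), paste0(base, "_", i, ".gdx"))
}

iterFile(3)  # "output/calibration_3.gdx"
```

With the old `sub("\\.gdx$", ...)` the base would have become `"calibration_0"`, so iteration files would have been searched under the wrong prefix.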
15 changes: 7 additions & 8 deletions README.md
@@ -1,8 +1,8 @@
# Reporting package for BRICK

R package **reportbrick**, version **0.7.1**
R package **reportbrick**, version **0.7.2**

[![CRAN status](https://www.r-pkg.org/badges/version/reportbrick)](https://cran.r-project.org/package=reportbrick) [![R build status](https://github.com/pik-piam/reportbrick/workflows/check/badge.svg)](https://github.com/pik-piam/reportbrick/actions) [![codecov](https://codecov.io/gh/pik-piam/reportbrick/branch/master/graph/badge.svg)](https://app.codecov.io/gh/pik-piam/reportbrick) [![r-universe](https://pik-piam.r-universe.dev/badges/reportbrick)](https://pik-piam.r-universe.dev/builds)
[![CRAN status](https://www.r-pkg.org/badges/version/reportbrick)](https://cran.r-project.org/package=reportbrick) [![R build status](https://github.com/pik-piam/reportbrick/workflows/check/badge.svg)](https://github.com/pik-piam/reportbrick/actions) [![codecov](https://codecov.io/gh/pik-piam/reportbrick/branch/master/graph/badge.svg)](https://app.codecov.io/gh/pik-piam/reportbrick) [![r-universe](https://pik-piam.r-universe.dev/badges/reportbrick)](https://pik-piam.r-universe.dev/builds)

## Purpose and Functionality

@@ -38,16 +38,15 @@ In case of questions / problems please contact Robin Hasse <robin.hasse@pik-potsdam.de>

To cite package **reportbrick** in publications use:

Hasse R, Rosemann R (2024). _reportbrick: Reporting package for BRICK_. R package version 0.7.1, <https://github.com/pik-piam/reportbrick>.
Hasse R, Rosemann R (2025). "reportbrick: Reporting package for BRICK - Version 0.7.2."

A BibTeX entry for LaTeX users is

```latex
@Manual{,
title = {reportbrick: Reporting package for BRICK},
@Misc{,
title = {reportbrick: Reporting package for BRICK - Version 0.7.2},
author = {Robin Hasse and Ricarda Rosemann},
year = {2024},
note = {R package version 0.7.1},
url = {https://github.com/pik-piam/reportbrick},
date = {2025-01-17},
year = {2025},
}
```
20 changes: 12 additions & 8 deletions inst/plotsCalibrationReporting/plotsCalibration.Rmd
@@ -107,14 +107,6 @@ if (length(filePath) == 1) {
```

## Target function

```{r target function}
.createCalibrationPlot(data, "targetFunction", outPath, outName = outName,
color = color, savePlots = savePlots)
```

## Absolute stock and flow deviations

@@ -189,6 +181,7 @@ if (length(filePath) == 1) {
```


## Step size

```{r step size}
@@ -197,3 +190,14 @@ if (length(filePath) == 1) {
color = color, savePlots = savePlots)
```


## Descent direction by heating system

```{r descent direction}
.createCalibrationPlot(data, "descDirCon", outPath, outName = outName,
color = "hsr", savePlots = savePlots)
.createCalibrationPlot(data, "descDirRen", outPath, outName = outName,
color = "hsr", savePlots = savePlots)
```
19 changes: 19 additions & 0 deletions man/dot-computeAvg.Rd

