chore(doc): update docs and add link checker #382

Merged · 1 commit · Nov 15, 2023
2 changes: 1 addition & 1 deletion docs/book.toml
@@ -7,7 +7,7 @@ title = "Boavizta cloud scanner 📡"

# Deactivate link checker by default
# cargo install mdbook-linkcheck
# [output.linkcheck]
[output.linkcheck]

[output.html]
#theme = "my-theme"
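mdbook-linkcheck supports further options under the `[output.linkcheck]` section; a possible sketch (key names taken from the mdbook-linkcheck README — verify against your installed version, and the `exclude` pattern is only an example):

```toml
[output.linkcheck]
# Don't fail the whole build when the checker binary isn't installed.
optional = true
# Also verify outbound web links, not just intra-book ones.
follow-web-links = true
# Regex patterns for links to skip (e.g. rate-limited hosts).
exclude = ["crates\\.io"]
# "error" fails the build on broken links; "warn" only reports them.
warning-policy = "warn"
```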
7 changes: 5 additions & 2 deletions docs/src/SUMMARY.md
@@ -7,6 +7,11 @@
- [Quickstart - using cargo 🦀](tutorials/quickstart-rust-cli.md)
- [Quickstart as serverless ⚡](tutorials/quickstart-serverless.md)

# Explanations

- [Methodology](explanations/methodology.md)
- [How we process workload](explanations/processing-workload.md)

# How-to guides

- [Building CLI](how-to/building-cli.md)
@@ -16,8 +21,6 @@
- [Setup monitoring dashboard](how-to/set-up-dashboard.md)
- [Filtering by tags](how-to/filter-by-tags.md)
- [Using a private instance of Boavizta API](how-to/using-private-boaviztapi.md)
- [Source of data](explanations/source-of-data.md)
- [How we process workload](explanations/processing-workload.md)

# Reference

16 changes: 16 additions & 0 deletions docs/src/explanations/methodology.md
@@ -0,0 +1,16 @@
# Methodology and source of data

Cloud scanner uses the Boavizta methodology to estimate the impacts of cloud resources.

The methodology of Boavizta is described in [Digital & environment : How to evaluate server manufacturing footprint, beyond greenhouse gas emissions? | Boavizta](https://boavizta.org/en/blog/empreinte-de-la-fabrication-d-un-serveur)

Impact data is retrieved from [BOAVIZTA reference data API](https://github.com/Boavizta/boaviztapi/) v1.0.x.

The results are similar to what you can visualize in [Datavizta](http://datavizta.boavizta.org/cloudimpact), but with automated inventory.

⚠ Cloud scanner **underestimates the impacts of cloud resources**. Because it only considers _instances_ and _block storage_, many impacts (network, potential redundancy, cloud control plane) are not included in the estimation.

See also [other limits](../reference/limits.md).

- https://www.boavizta.org/en
- https://boavizta.org/en/blog/empreinte-de-la-fabrication-d-un-serveur
7 changes: 0 additions & 7 deletions docs/src/explanations/source-of-data.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/src/reference/limits.md
@@ -47,7 +47,7 @@ Carbon intensity of electricity is not (yet) real time. It uses a yearly extrapolation

### Allocation of manufacture impacts

Today, cloud scanner returns the manufacture impacts of a resource corresponding to _the entire lifecycle_ of the ressource. The manufacture impacts returned for a VM are the same if you use it one hours or several year. Said differently we do _not_ amortize the manufacturing impacts over the duration of use.
Today, cloud scanner returns the manufacture impacts of a resource corresponding to _the entire lifecycle_ of the resource. The manufacture impacts returned for a VM are the same whether you use it for one hour or several years. To say it differently, we do _not_ amortize the manufacturing impacts over the duration of use.

### We do not provide margins of error

5 changes: 0 additions & 5 deletions docs/src/reference/source-of-data.md

This file was deleted.

4 changes: 2 additions & 2 deletions docs/src/reference/testing.md
@@ -1,8 +1,8 @@
# Testing

Whne launched with `cargo test -- --include-ignored` some the unit tests require a specific instance to run (when launched ).
When launched with `cargo test -- --include-ignored`, some of the unit tests require a specific instance to be running.

> These integration tests requiere specific instance to be up and running to pass. This means they are tied to a specific cloud account.
> These integration tests require specific instances to be up and running to pass. This means they are tied to a specific cloud account.
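Such environment-dependent tests are typically gated with Rust's `#[ignore]` attribute, so a plain `cargo test` skips them and `--include-ignored` runs them. A minimal sketch (the helper function and instance id below are hypothetical, not the repo's actual test code):

```rust
// Hypothetical check; a real integration test would query the cloud API
// to verify the dedicated test instance is up.
fn instance_is_running(instance_id: &str) -> bool {
    !instance_id.is_empty()
}

#[test]
#[ignore = "requires a specific cloud instance to be up"]
fn scan_live_instance() {
    // Placeholder id: replace with the id of the dedicated test instance.
    assert!(instance_is_running("i-0123456789abcdef0"));
}
```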

Commands to start or stop instances:

10 changes: 6 additions & 4 deletions docs/src/tutorials/quickstart-dashboard-docker.md
@@ -1,8 +1,10 @@
# Quick start : display dashboard using docker-compose 🐳

No installation needed, you will run a public docker image of cloud-scanner CLI and Boavizta API.
Visualize the live impacts of your account in a dashboard.

All data remain local (this docker-compose stack uses a _private instance_ of Boavizta API).
No installation needed: you will run public docker images of the cloud-scanner CLI, Boavizta API, Prometheus and Grafana to get access to a demo dashboard.

All data will remain local to your environment (this docker-compose stack uses a _private instance_ of Boavizta API).

## Pre-requisites

@@ -31,6 +33,6 @@ docker-compose up
- ⚠ This docker-compose example is **not** intended for production deployment, but rather for quick testing.
- ports of all services are exposed.
- Grafana is served over HTTP with the default login.
- You may have to update the line mapping your AWS profile (Replace `AWS_PROFILE=${AWS_PROFILE}` by `AWS_PROFILE=the-real-name-of-your-profile`).
- You may have to update the line mapping your AWS profile (replace `AWS_PROFILE=${AWS_PROFILE}` with `AWS_PROFILE=the-real-name-of-your-profile`) when using Podman. Podman compose does not seem to map the host's environment variables to the containers.
- In corporate environments, you may need to provide your certificates authorities certificates (`ca-certificates`) to the cloud-scanner container (uncomment the mapping line in the docker-compose file).
- For the demo, we deliberately set a short metrics scrapping interval (30 seconds in this demo). In production deloymnent, you may want to increase this metric scraping interval in the prometheus configuration file.
- For the demo, we deliberately set a short metrics scraping interval (30 seconds). In a production environment, you may want to increase this scraping interval to reduce API calls and the volume of data. The scraping period is set in the Prometheus configuration file.
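As a sketch, the scrape interval lives in the Prometheus configuration (`prometheus.yml`); the job name and target below are assumptions, so adjust them to match the stack's actual configuration:

```yaml
global:
  # Demo value; raise this in production (e.g. 5m) to reduce
  # API calls and the volume of stored data.
  scrape_interval: 30s

scrape_configs:
  - job_name: "cloud-scanner"            # hypothetical job name
    static_configs:
      - targets: ["cloud-scanner:8000"]  # hypothetical host:port
```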