chore: reshuffle files, loadtest is now the main module
ctron committed Sep 4, 2024
1 parent 290acd0 commit 3f190d7
Showing 22 changed files with 550 additions and 299 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -14,3 +14,6 @@ Cargo.lock
*.pdb

/.idea

report.*
baseline.json
23 changes: 20 additions & 3 deletions Cargo.toml
@@ -1,8 +1,25 @@
[package]
name = "scale-testing"
name = "loadtest"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[workspace]

[dependencies]
[dependencies]
anyhow = "1"
chrono = "0.4"
clap = { version = "4", features = ["derive", "env"] }
goose = "=0.17.3-dev"
goose-eggs = "0.5.3-dev"
humantime = "2"
log = "0.4"
openid = "0.14"
reqwest = "0.12"
tokio = { version = "1.38.0", features = ["sync"] }

[patch.crates-io]
#goose = { path = "../../goose" }
#goose-eggs = { path = "../../goose-eggs" }

goose = { git = "https://github.com/ctron/goose", branch = "feature/baseline_1" }
goose-eggs = { git = "https://github.com/ctron/goose-eggs", branch = "feature/uptick_deps_1" }
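
The new dependency set pairs clap (with the `derive` and `env` features) with openid, matching the OIDC environment variables used in the README below. A minimal, hypothetical sketch of how such settings could be read through clap's env support; the struct and field names are assumptions for illustration, not the crate's actual code:

```rust
use clap::Parser;

/// Hypothetical OIDC options; names are assumptions, not the crate's actual code.
#[derive(Parser, Debug)]
struct AuthOptions {
    /// OIDC issuer, e.g. http://localhost:8090/realms/trustify
    #[arg(long, env = "ISSUER_URL")]
    issuer_url: String,

    /// OIDC client id
    #[arg(long, env = "CLIENT_ID")]
    client_id: String,

    /// OIDC client secret
    #[arg(long, env = "CLIENT_SECRET")]
    client_secret: String,
}

fn main() {
    // Values can come from command-line flags or from the environment variables above.
    let opts = AuthOptions::parse();
    println!("issuer: {}", opts.issuer_url);
}
```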
102 changes: 26 additions & 76 deletions README.md
@@ -1,88 +1,38 @@
# scale-testing
# trustify loadtest

[![ci](https://github.com/trustification/scale-testing/actions/workflows/ci.yaml/badge.svg)](https://github.com/trustification/scale-testing/actions/workflows/ci.yaml)
A set of simple [goose](https://book.goose.rs/) load tests against the web and REST endpoints.

Utility for testing trustification at scale.
## quickstart

This tool helps replicate existing SBOM files (SPDX or CycloneDX) in order to augment an existing data set by multiplying the number of SBOM files.
1. Ensure trustify is running.

For instance, let's say we have a total of 1,000 SBOMs (500 SPDX and 500 CycloneDX) and we'd like a total of 10K SBOM files for our scale test; we can then run the tool with a replication size of 10.
2. Set environment variables for OIDC authentication:
```bash
export ISSUER_URL="http://localhost:8090/realms/trustify"
export CLIENT_ID="testing-user"
export CLIENT_SECRET="****************"
```

The tool replicates existing SBOMs by copying each file's content and changing its file name and its key records.
To change the wait times between HTTP invocations, set the following env vars:

## Usage
After installing the trustification/scale-testing repo,
```bash
export WAIT_TIME_FROM=1
export WAIT_TIME_TO=2
```

We can run the tool by providing the replication size, the source directory, and the destination directory:
Alternatively, for no wait times between HTTP invocations, set these env vars to 0.

We need to provide a source of SBOMs, here located in /SBOMs, which contains SBOM files in JSON format.
3. To load trustify endpoints with 3 concurrent users:
```bash
cargo run --release --bin loadtest -- --host http://localhost:8080 -u 3
```

Also, the target directory must not exist; this ensures we're not erasing an existing test set.
To stop the load test, hit Ctrl-C, which should generate aggregate statistics.

```sh
$ rm -rf /data-set
$ cargo run -- 10 /SBOMs/ /data-set/
To load trustify endpoints with 10 concurrent users, generating an HTML report:

The latter replicates each SBOM file available in /SBOMs/ 10 times.
```bash
cargo run --release -- --host http://localhost:8080 --report-file=report.html --no-reset-metrics -u 10
```

Each replicated SBOM file will be created under its corresponding batch directory under `/data-set`.

> Example

```sh
$ cargo run -- 2 ./SBOMs ./data-set
Compiling scale-testing v0.1.0 (/home/gildub/github.com/gildub/scale-testing)
Finished dev [unoptimized + debuginfo] target(s) in 0.24s
Running `target/debug/scale-testing 2 ./SBOMs ./data-set`
Replication multiplier 2
Source directory ./SBOMs
Destination directory ./data-set
successfully wrote to metadata file
successfully wrote to metadata file
Amending version: "version": 1,
successfully wrote to ./data-set/batch1/A7ED160707AB4BC.replicate1.cdx.json
Amending version: "version": 1,
successfully wrote to ./data-set/batch2/A7ED160707AB4BC.replicate2.cdx.json
Amending name: "name": "quarkus-2.13",
Amending documentNameSpacekey: "documentNamespace": "https://access.redhat.com/security/data/sbom/beta/spdx/quarkus-2.13-1a6ac4c55918a44fb3bada1b7e7d12f887d67be4",
successfully wrote to ./data-set/batch1/quarkus-2.replicate1.13.json
Amending name: "name": "quarkus-2.13",
Amending documentNameSpacekey: "documentNamespace": "https://access.redhat.com/security/data/sbom/beta/spdx/quarkus-2.13-1a6ac4c55918a44fb3bada1b7e7d12f887d67be4",
successfully wrote to ./data-set/batch2/quarkus-2.replicate2.13.json
```

```sh
$ tree data-set/
data-set/
├── batch1
│   ├── metadata
│   │   └── metadata.json
│   ├── A7ED160707AB4BC.replicate1.cdx.json
│   └── quarkus-2.replicate1.13.json
└── batch2
    ├── metadata
    │   └── metadata.json
    ├── A7ED160707AB4BC.replicate2.cdx.json
    └── quarkus-2.replicate2.13.json
5 directories, 6 files
```

## Using bombastic_walker

### Prepare initial SBOMs data set

Provide an initial set of SBOM files to be replicated, e.g. `/SBOMs/`.
Use only SPDX files for now, because the CycloneDX files tested were rejected with the following error: `JSON: Unsupported CycloneDX version: 1.4`

### Replicate SBOMs

Run the replication tool to multiply your existing set of SBOM files:

`cargo run -- 10 /SBOMs /data-set`

### Use the replicated SBOMs

The bombastic_walker can then consume each replicated SBOM batch, for example in devmode:

`RUST_LOG=info cargo run -p trust bombastic walker --sink http://localhost:8082 --source /data-set/batch1/ --devmode -3`
4. More goose run-time options can be found [here](https://book.goose.rs/getting-started/runtime-options.html).
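
The new `src/main.rs` is not shown on this page, but the quickstart above implies a goose-based entry point. A minimal standalone sketch of such a load test with goose 0.17; the scenario name and request path are illustrative assumptions, not the crate's actual code:

```rust
use goose::prelude::*;

// Hit the root of the web UI; the path is an assumption for illustration.
async fn get_index(user: &mut GooseUser) -> TransactionResult {
    let _response = user.get("/").await?;
    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), GooseError> {
    GooseAttack::initialize()?
        .register_scenario(
            scenario!("WebEndpoints").register_transaction(transaction!(get_index)),
        )
        .execute()
        .await?;
    Ok(())
}
```

Run-time options such as `--host`, `-u`, and `--report-file` from the quickstart are parsed by goose itself, so the sketch needs no extra CLI handling.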
2 changes: 0 additions & 2 deletions loadtests/.gitignore

This file was deleted.

25 changes: 0 additions & 25 deletions loadtests/Cargo.toml

This file was deleted.

38 changes: 0 additions & 38 deletions loadtests/README.md

This file was deleted.

135 changes: 0 additions & 135 deletions loadtests/src/main.rs

This file was deleted.

2 changes: 2 additions & 0 deletions replicator/.clippy.toml
@@ -0,0 +1,2 @@
allow-unwrap-in-tests = true
allow-expect-in-tests = true
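
These clippy settings relax the `clippy::unwrap_used` and `clippy::expect_used` restriction lints inside tests only. A small library-style sketch of the effect, assuming those lints are enabled crate-wide (the function and lint attribute here are illustrative assumptions):

```rust
// Assumes the crate opts into the restriction lints; illustrative only.
#![warn(clippy::unwrap_used, clippy::expect_used)]

pub fn parse_port(s: &str) -> Option<u16> {
    s.parse().ok()
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parses_port() {
        // `allow-unwrap-in-tests = true` keeps clippy from flagging this unwrap.
        assert_eq!(parse_port("8080").unwrap(), 8080);
    }
}
```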
19 changes: 19 additions & 0 deletions replicator/.gitignore
@@ -0,0 +1,19 @@
# Generated by Cargo
# will have compiled files and executables
debug/
target/

# Remove Cargo.lock from gitignore if creating an executable, leave it for libraries
# More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
Cargo.lock

# These are backup files generated by rustfmt
**/*.rs.bk

# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb

/.idea

report.*
baseline.json
6 changes: 6 additions & 0 deletions replicator/Cargo.toml
@@ -0,0 +1,6 @@
[package]
name = "scale-testing"
version = "0.1.0"
edition = "2021"

[workspace]