Probitas

Probitas is a scenario-based testing framework. It provides intuitive APIs for writing integration tests for APIs, databases, message queues, and other backend services.

Features

  • Scenario-based testing: Define tests as readable scenarios with setup, steps, and cleanup
  • Multi-protocol support: HTTP, gRPC, GraphQL, SQL, Redis, MongoDB, and message queues with unified APIs
  • Type-safe: Full type inference through the builder chain
  • Fluent assertions: Natural syntax like expect(result).toBeOk().toHaveJsonMatching({...})
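To make the fluent style concrete, here is a minimal sketch (an illustration of the pattern, not Probitas internals) of how a chainable assertion like `expect(result).toBeOk().toHaveStatus(200)` can be built: each matcher throws on failure and returns `this` on success.

```typescript
// Sketch of a fluent assertion chain. The HttpResult shape and matcher
// names mirror the examples in this README but are assumptions, not the
// framework's actual types.
interface HttpResult {
  ok: boolean;
  status: number;
  json: Record<string, unknown>;
}

class HttpExpectation {
  constructor(private readonly res: HttpResult) {}

  toBeOk(): this {
    if (!this.res.ok) throw new Error("expected response to be ok");
    return this;
  }

  toHaveStatus(status: number): this {
    if (this.res.status !== status) {
      throw new Error(`expected status ${status}, got ${this.res.status}`);
    }
    return this;
  }

  toHaveJsonMatching(subset: Record<string, unknown>): this {
    for (const [key, value] of Object.entries(subset)) {
      if (this.res.json[key] !== value) {
        throw new Error(`expected json.${key} to equal ${String(value)}`);
      }
    }
    return this;
  }
}

const expectHttp = (res: HttpResult) => new HttpExpectation(res);

// Passes silently; any failing matcher throws and stops the chain.
expectHttp({ ok: true, status: 200, json: { id: 1 } })
  .toBeOk()
  .toHaveStatus(200)
  .toHaveJsonMatching({ id: 1 });
```

Returning `this` from every matcher is what lets failures surface at the exact call that broke, while successful checks read as a single sentence.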

Installation

Install the CLI to run scenarios. Choose your preferred method:

# Shell installer (requires Deno v2.x+)
curl -fsSL https://raw.githubusercontent.com/probitas-test/cli/main/install.sh | bash

# Homebrew (macOS/Linux)
brew install probitas-test/tap/probitas

# Nix
nix run github:probitas-test/cli

See Installation Guide for detailed options including Nix flake integration for project-level dependency management.

Quick Start

Initialize a Project

Create a new Probitas project with example files:

# Create probitas/ directory with example files
probitas init

# Create custom directory
probitas init -d scenarios

# Overwrite existing files
probitas init --force

This creates:

  • example.probitas.ts - Example scenario file
  • probitas.jsonc - Configuration file with defaults
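For orientation, a configuration file along these lines might look as follows. The key names here are assumptions for illustration (the README only mentions that includes/excludes and a reporter exist); consult the generated `probitas.jsonc` for the actual schema.

```jsonc
{
  // Glob patterns controlling scenario discovery (key names assumed)
  "includes": ["probitas/**/*.probitas.ts"],
  "excludes": ["**/fixtures/**"],
  // Default output format: "list" or "json"
  "reporter": "list"
}
```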

Your First Scenario

import { client, expect, scenario } from "jsr:@probitas/probitas";

export default scenario("User API Test")
  .resource("http", () =>
    client.http.createHttpClient({
      url: "http://localhost:8080",
    }))
  .step("GET /users/1", async (ctx) => {
    const { http } = ctx.resources;
    const res = await http.get("/users/1");

    expect(res)
      .toBeOk()
      .toHaveStatus(200)
      .toHaveJsonMatching({ id: 1 });
  })
  .build();

File Naming Convention

Scenario files should use the .probitas.ts extension and be placed in the probitas/ directory:

probitas/
  auth.probitas.ts
  user-crud.probitas.ts
  payment-flow.probitas.ts
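The convention is easy to check programmatically; a sketch of the predicate the discovery step effectively applies (simplified here to a suffix check):

```typescript
// Matches the `.probitas.ts` naming convention described above.
const isScenarioFile = (path: string): boolean =>
  path.endsWith(".probitas.ts");

isScenarioFile("probitas/auth.probitas.ts"); // true
isScenarioFile("probitas/helpers.ts");       // false
```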

Running Scenarios

# Run all scenarios
probitas run

# Run scenarios with specific tag
probitas run -s tag:example

# Run with different reporter
probitas run --reporter json

Code Quality Commands

Probitas provides commands to maintain code quality in your scenario files:

# Format scenario files
probitas fmt

# Lint scenario files
probitas lint

# Type-check scenario files
probitas check

These commands use the same file discovery mechanism as probitas run, respecting your includes and excludes configuration.
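The shared discovery step can be pictured as an include pass followed by an exclude pass. The sketch below simplifies patterns to substring checks for illustration; the CLI presumably uses glob matching.

```typescript
// Sketch of include/exclude filtering over a candidate file list.
// Pattern semantics are simplified (substring match) for illustration.
function discover(
  files: string[],
  includes: string[],
  excludes: string[],
): string[] {
  return files.filter((f) =>
    includes.some((p) => f.includes(p)) &&
    !excludes.some((p) => f.includes(p))
  );
}

discover(
  ["probitas/auth.probitas.ts", "probitas/fixtures/seed.probitas.ts"],
  [".probitas.ts"],
  ["fixtures"],
);
// keeps only "probitas/auth.probitas.ts"
```

Because `run`, `fmt`, `lint`, and `check` all go through the same filter, a file excluded from one is excluded from all of them.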

fmt

Formats scenario files:

probitas fmt                    # Format all discovered scenarios
probitas fmt probitas/auth/     # Format specific directory

lint

Lints scenario files. Automatically excludes rules that conflict with scenario imports (no-import-prefix, no-unversioned-import):

probitas lint                   # Lint all discovered scenarios
probitas lint probitas/auth/    # Lint specific directory

check

Type-checks scenario files:

probitas check                  # Type-check all discovered scenarios
probitas check probitas/auth/   # Type-check specific directory

Common Options

All code quality commands support these options:

--include <pattern>   # Include pattern for file discovery (repeatable)
--exclude <pattern>   # Exclude pattern for file discovery (repeatable)
--config <path>       # Path to probitas config file
-v, --verbose         # Verbose output
-q, --quiet           # Suppress output

Tag-Based Filtering

Organize scenarios with tags for easy filtering:

probitas run -s tag:auth              # Match tag
probitas run -s "tag:critical,tag:auth"  # AND logic
probitas run -s "!tag:slow"              # NOT logic
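The selector grammar can be read off the examples above: comma-separated terms are ANDed, and a leading `!` negates a term. A sketch of that evaluation (inferred from the examples, not taken from the CLI source):

```typescript
// Evaluates a selector like "tag:critical,tag:auth" or "!tag:slow"
// against a scenario's tag list. Grammar inferred from the README.
function matchesSelector(selector: string, tags: string[]): boolean {
  return selector.split(",").every((term) => {
    const negated = term.startsWith("!");
    const tag = (negated ? term.slice(1) : term).replace(/^tag:/, "");
    const present = tags.includes(tag);
    return negated ? !present : present;
  });
}

matchesSelector("tag:critical,tag:auth", ["auth", "critical"]); // true
matchesSelector("!tag:slow", ["slow"]);                         // false
```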

Reporters

Choose output format based on your needs:

  • list - Detailed human-readable output (default)
  • json - Machine-readable JSON
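The difference between the two is easiest to see side by side. A sketch of the idea (the result shape and exact output are illustrative, not the CLI's actual schema):

```typescript
// Same results, two renderings: human-readable lines vs. parseable JSON.
interface StepResult {
  name: string;
  passed: boolean;
}

function report(results: StepResult[], reporter: "list" | "json"): string {
  if (reporter === "json") return JSON.stringify(results);
  return results
    .map((r) => `${r.passed ? "PASS" : "FAIL"} ${r.name}`)
    .join("\n");
}
```

The `list` reporter is convenient during development; `json` is the natural choice when piping results into CI tooling.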

Available Clients

All clients are accessed via the `client` namespace; the examples below use `client.http`, `client.grpc`, and `client.sql.postgres`.

Core Concepts

Scenario

A scenario is a complete test case composed of:

  • Name: Descriptive identifier for the test
  • Resources: Managed dependencies (clients, connections)
  • Setup hooks: Initialization code with cleanup callbacks
  • Steps: Sequential test operations with assertions

Builder Pattern

scenario(name, options?)
  .resource(name, factoryFn, options?)    // Register resources (factory function)
  .setup(name?, fn, options?)             // Add setup/cleanup hooks (name optional)
  .step(name?, fn, options?)              // Define test steps (name optional)
  .build()                                // Create immutable definition
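A stripped-down sketch of this pattern, showing why `build()` yields an immutable definition (an illustration only, not Probitas's implementation):

```typescript
// Minimal builder: each call returns `this` for chaining, and build()
// freezes the accumulated definition so it cannot change afterwards.
type StepFn = (ctx: unknown) => void | Promise<void>;

class ScenarioBuilder {
  private steps: { name: string; fn: StepFn }[] = [];

  constructor(private readonly name: string) {}

  step(name: string, fn: StepFn): this {
    this.steps.push({ name, fn });
    return this;
  }

  build() {
    // Copy then freeze: later calls on the builder cannot mutate the
    // definition handed out here.
    return Object.freeze({ name: this.name, steps: [...this.steps] });
  }
}

const def = new ScenarioBuilder("User API Test")
  .step("GET /users/1", () => {})
  .build();
```

The same shape extends naturally to `resource()` and `setup()`: each registers a factory or hook on the builder and returns it for further chaining.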

Expect API

The `expect()` function auto-dispatches based on result type:

import { client, expect } from "jsr:@probitas/probitas";

await using http = client.http.createHttpClient({
  url: "http://localhost:8080",
});
const httpResponse = await http.get("/users/1");

// HTTP response
expect(httpResponse)
  .toBeOk()
  .toHaveStatus(200)
  .toHaveJsonMatching({ id: 1 });

await using pg = await client.sql.postgres.createPostgresClient({
  url: "postgres://user:pass@localhost/db",
});
const sqlResult = await pg.query("SELECT * FROM users WHERE id = $1", [1]);

// SQL result
expect(sqlResult)
  .toHaveRowCount(1)
  .toHaveRowsMatching({ name: "Alice" });

await using grpc = client.grpc.createGrpcClient({ url: "localhost:50051" });
const grpcResponse = await grpc.call("users.UserService", "GetUser", {
  id: "123",
});

// gRPC response
expect(grpcResponse)
  .toBeOk()
  .toHaveDataMatching({ id: "123" });

Included Utilities

Probitas re-exports useful libraries for convenience:

import {
  assertSpyCalls,
  // Test data generation
  faker,
  // Time control
  FakeTime,
  // Template literal dedent
  outdent,
  // Error handling
  raise,
  // Mocking
  spy,
  stub,
  tryOr,
} from "jsr:@probitas/probitas";
