
Source


A document or knowledge artifact uploaded to a Rhesis project that provides context for test generation or grounds the evaluation of AI responses.

Also known as: source document, knowledge source

Overview

Sources are files or documents that you attach to a project in Rhesis. They serve as the knowledge foundation for test generation and as reference material for evaluating whether your AI application's responses are accurate and grounded.

Supported File Types

Rhesis supports a range of document formats as sources:

  • Text
  • Documents
  • Spreadsheets
  • Web
  • Archives (containing supported file types)

Use Cases

Test Generation: Provide a product manual, FAQ document, or knowledge base article as a source, then use the DocumentSynthesizer to generate test cases based on its content. This ensures your tests reflect real user questions about your actual product.
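Conceptually, document-based test generation chunks the source and produces test inputs per chunk. The sketch below illustrates that flow with a template standing in for the LLM; the actual DocumentSynthesizer's interface and prompting are internal to Rhesis, so the function and field names here are assumptions.

```python
# Illustrative sketch: turning a source document into candidate test inputs.
# Rhesis's DocumentSynthesizer uses an LLM; a template stands in for it here.

def chunk_text(text: str, max_chars: int = 200) -> list[str]:
    """Split a document into paragraph-based chunks of bounded size."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if len(current) + len(para) > max_chars and current:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks

def synthesize_tests(document: str, tests_per_chunk: int = 1) -> list[dict]:
    """Create templated test cases per chunk (stand-in for LLM generation)."""
    tests = []
    for i, chunk in enumerate(chunk_text(document)):
        for _ in range(tests_per_chunk):
            tests.append({
                "input": f"Based on the product docs, explain: {chunk[:60]}...",
                "context": chunk,
                "source_chunk": i,
            })
    return tests

manual = (
    "Resetting your password:\n\nOpen Settings and choose Security.\n\n"
    "Billing:\n\nInvoices are issued monthly."
)
for test in synthesize_tests(manual):
    print(test["input"])
```

Because each test carries its originating chunk as context, generated tests stay traceable back to the source material they were derived from.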

RAG Evaluation: When testing a Retrieval-Augmented Generation system, sources serve as the ground truth corpus. Rhesis can evaluate whether your application's responses are consistent with the provided sources.
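A crude way to picture "consistent with the provided sources" is token overlap between the response and the source corpus. Rhesis's actual evaluation is LLM-based; the overlap score below is an assumption used purely to make the idea concrete.

```python
# Minimal sketch of checking whether a response is consistent with sources,
# using token overlap as a crude stand-in for LLM-based evaluation.
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def support_score(response: str, sources: list[str]) -> float:
    """Fraction of response tokens that appear in at least one source."""
    resp = tokens(response)
    if not resp:
        return 0.0
    corpus = set().union(*(tokens(s) for s in sources))
    return len(resp & corpus) / len(resp)

sources = ["The warranty covers parts and labor for two years."]
print(support_score("The warranty lasts two years.", sources))      # high overlap
print(support_score("Refunds are issued within 30 days.", sources)) # low overlap
```

A grounded answer scores near 1.0 against the corpus, while an answer about material absent from the sources scores near 0.0.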

Context for Metrics: Some metrics (such as faithfulness or groundedness) require access to the source material to determine whether the model's response is factually grounded.
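Faithfulness-style metrics typically score at the sentence level: each response sentence is checked against the source, and the score is the fraction supported. Real implementations use an LLM judge; the exact-word-match heuristic below is an assumption for the sketch.

```python
# Sketch of a sentence-level groundedness score: the fraction of response
# sentences whose content words all appear in the source material.
import re

def content_words(text: str) -> set[str]:
    stop = {"the", "a", "an", "is", "are", "in", "of", "and", "to"}
    return {w for w in re.findall(r"[a-z]+", text.lower()) if w not in stop}

def groundedness(response: str, source: str) -> float:
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", response.strip()) if s]
    if not sentences:
        return 0.0
    src = content_words(source)
    supported = sum(1 for s in sentences if content_words(s) <= src)
    return supported / len(sentences)

source = "Shipping takes five business days. Returns are free within 30 days."
response = "Shipping takes five business days. Returns cost ten dollars."
print(groundedness(response, source))
```

Here the first sentence is fully supported by the source while the second introduces an unsupported claim, so the response scores 0.5. Without access to the source material, no such metric can be computed at all, which is why these metrics require sources.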

Uploading Sources

  1. Navigate to the Knowledge section of your project
  2. Click "Upload Source"
  3. Select one or more files
  4. Rhesis automatically extracts and indexes the content
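The "extract and index" step in point 4 can be pictured as building an inverted index from extracted text, mapping each term to the sources that contain it. Rhesis's actual indexing pipeline is internal; this is only a conceptual sketch.

```python
# Conceptual sketch of indexing extracted source text: an inverted index
# maps each term to the set of source names that contain it.
import re
from collections import defaultdict

def build_index(sources: dict[str, str]) -> dict[str, set[str]]:
    """Map each term to the set of source names containing it."""
    index = defaultdict(set)
    for name, text in sources.items():
        for term in re.findall(r"[a-z0-9]+", text.lower()):
            index[term].add(name)
    return index

sources = {
    "faq.md": "How do I reset my password?",
    "manual.pdf": "The admin panel controls password policy.",
}
index = build_index(sources)
print(sorted(index["password"]))  # both sources mention "password"
```

An index like this is what lets later steps (test generation, grounding checks) quickly find which sources are relevant to a given topic.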

Automatic Metadata Generation

When you upload a source, Rhesis uses an LLM to automatically generate a name and description for the document based on its content, reducing manual data entry.
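To make the idea concrete, here is a heuristic stand-in for that step: derive a name from the first line and a description from the first sentence of the body. Rhesis uses an LLM for this, so these heuristics, the function name, and the returned fields are all assumptions of the sketch.

```python
# Illustrative stand-in for LLM-based metadata generation: derive a name
# and description from the document text (heuristics are assumptions).

def generate_metadata(text: str, filename: str) -> dict[str, str]:
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    name = lines[0][:60] if lines else filename       # first line as title
    body = " ".join(lines[1:]) or name
    description = body.split(". ")[0][:160]           # first sentence
    return {"name": name, "description": description}

doc = "Returns Policy\nItems may be returned within 30 days. Refunds take a week."
meta = generate_metadata(doc, "returns.txt")
print(meta["name"], "-", meta["description"])
```

Whether produced by an LLM or a heuristic, the point is the same: usable metadata appears without manual entry, and the filename serves as a fallback when the content offers no obvious title.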

Relationship to Knowledge

Sources are the individual files within the broader Knowledge section of a project. The Knowledge section aggregates all sources for a project and makes them available for test generation and evaluation.

Best Practices

  • Upload sources that reflect the actual knowledge your application is expected to use or cite
  • Use descriptive filenames so the auto-generated metadata accurately describes the document's content
  • Refresh sources when your underlying knowledge base changes to keep generated tests aligned with current content
  • For RAG evaluation, ensure sources cover the same corpus as your retrieval system to avoid false negatives
