We don’t want your data. We want it to flow.

Observatory reads from where your data already lives and writes to where your dashboards already are. No new database to provision, no new format to learn, no new vendor to lock you in. The library is the connector — nothing more.


THE DATA FLOW

Sources In

Jira · ServiceNow · SAP · Salesforce · Pega · Dynamics 365 · Zendesk · Odoo

Through the Engine

Observatory · Polars + C++

Outputs Anywhere

Power BI · Tableau · Parquet · CSV / JSON

READ ANYTHING. COMPUTE ONCE. WRITE ANYWHERE.


Read Anything

Eight enterprise platforms supported out of the box, with schema auto-detection that infers case keys, event timestamps, and activity columns. A custom adapter for a proprietary system is about a hundred lines of Python; no consulting engagement required.
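To make "a hundred lines of Python" concrete, here is a minimal sketch of what such an adapter reduces to: mapping a proprietary export's column names onto the three fields the engine needs. The names below (ColumnMap, normalize_row) are illustrative, not Observatory's actual adapter API.

```python
# Hypothetical adapter sketch: project a proprietary export's columns onto
# the canonical event shape (case key, event timestamp, activity).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ColumnMap:
    case_key: str        # column holding the case identifier
    timestamp: str       # column holding the event time
    activity: str        # column holding the activity name
    ts_format: str = "%Y-%m-%d %H:%M:%S"

def normalize_row(row: dict, cmap: ColumnMap) -> dict:
    """Turn one raw export row into a canonical event record."""
    return {
        "case_id": row[cmap.case_key],
        "timestamp": datetime.strptime(row[cmap.timestamp], cmap.ts_format),
        "activity": row[cmap.activity],
    }

# Example: a legacy ticketing export with non-standard column names.
legacy = ColumnMap(case_key="TICKET_NO", timestamp="EVT_TS", activity="STEP_NAME")
event = normalize_row(
    {"TICKET_NO": "T-1042", "EVT_TS": "2024-07-01 09:15:00", "STEP_NAME": "Triage"},
    legacy,
)
```

Everything downstream of normalize_row is source-agnostic, which is why the adapter layer stays thin.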


Compute Once

A single Polars pipeline handles every source. The same Lean diagnostics, the same conformance checks, the same metric definitions across Jira and SAP and Pega. No per-source rewrites. The numbers are comparable because the math is identical.
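The "identical math" claim can be sketched in a few lines: one metric function, applied unchanged to events from any source. The cycle_time_hours function below is an illustrative stand-in, not one of Observatory's shipped metrics.

```python
# Sketch of "compute once": the same metric function runs on events from
# every source, so the resulting numbers are directly comparable.
from collections import defaultdict
from datetime import datetime

def cycle_time_hours(events):
    """Hours from first to last event per case; one definition for all sources."""
    spans = defaultdict(list)
    for e in events:
        spans[e["case_id"]].append(e["timestamp"])
    return {
        case: (max(ts) - min(ts)).total_seconds() / 3600
        for case, ts in spans.items()
    }

jira_events = [
    {"case_id": "J-1", "timestamp": datetime(2024, 7, 1, 9, 0)},
    {"case_id": "J-1", "timestamp": datetime(2024, 7, 1, 15, 0)},
]
sap_events = [
    {"case_id": "S-7", "timestamp": datetime(2024, 7, 2, 8, 0)},
    {"case_id": "S-7", "timestamp": datetime(2024, 7, 2, 20, 0)},
]

print(cycle_time_hours(jira_events))  # {'J-1': 6.0}
print(cycle_time_hours(sap_events))   # {'S-7': 12.0}
```

No per-source branch exists anywhere in the computation; only the adapter layer knows where the events came from.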


Write Anywhere

Power BI-ready Parquet. Tableau-ready CSV. JSON for custom dashboards. Star schema for warehouse loaders. All exports are standard formats — no proprietary encoding, no vendor-locked viewer required to read your own outputs.
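"No vendor-locked viewer" is easy to demonstrate: the same computed table can be written to standard formats with nothing but the Python standard library. This is a sketch of the principle, not Observatory's export code, and value_yield values here are made-up sample numbers.

```python
# The same result table serialized to two open formats with stdlib alone.
import csv
import io
import json

rows = [
    {"case_id": "J-1", "value_yield": 0.82},
    {"case_id": "S-7", "value_yield": 0.64},
]

# CSV for Tableau or any spreadsheet tool.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["case_id", "value_yield"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# JSON for custom dashboards.
json_text = json.dumps(rows, indent=2)
```

Any tool that reads CSV or JSON today will read these outputs in ten years; that is the whole point of standard formats.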


Auto-Configured

Drop in any export. We’ll figure it out.

No schema mapping. No source-specific configuration. detect_platform() reads the column shape, the timestamp patterns, and the case-key cardinality, then routes to the correct ingestion adapter. The same short script works on every source.

# One script. Eight sources. Zero config.
from ghostcitadel import Observatory

for export in ["jira_q3.csv", "servicenow_q3.csv",
               "pega_q3.csv", "sap_q3.csv"]:
    obs = Observatory(export)
    obs.detect_platform()
    obs.mine(metrics=["value_yield"])
    obs.export(f"{export}.parquet")

# ─────────────────────────────────────────
# jira_q3.csv         → detected: Jira (98% conf)
# servicenow_q3.csv   → detected: ServiceNow (96%)
# pega_q3.csv         → detected: Pega Infinity (94%)
# sap_q3.csv          → detected: SAP S/4HANA (97%)
# 4 exports processed in 1.8s
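One way to picture what detection by "column shape" might look like internally: score each known platform's signature columns against the export header. The signatures and scoring below are assumptions for illustration, not Observatory's actual detection logic.

```python
# Illustrative platform detection: fraction of a platform's signature
# columns present in the export header. Signatures here are invented.
SIGNATURES = {
    "Jira": {"issue_key", "status", "updated"},
    "ServiceNow": {"sys_id", "incident_state", "sys_updated_on"},
}

def sketch_detect(header):
    """Return (platform, confidence) for the best-matching signature."""
    best, best_score = None, 0.0
    for platform, sig in SIGNATURES.items():
        score = len(sig & set(header)) / len(sig)
        if score > best_score:
            best, best_score = platform, score
    return best, best_score

platform, conf = sketch_detect(["issue_key", "status", "updated", "assignee"])
print(platform, conf)  # Jira 1.0
```

A production detector would also weigh timestamp formats and case-key cardinality, as the paragraph above describes, but the routing idea is the same.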

Information Follows the Customer.

No proprietary formats. No required consulting contracts. No SaaS account that gates your access to your own data. If you decide to leave, your outputs stay readable, your pipeline stays portable, and your metrics stay yours.

Standard Formats Only

Parquet, CSV, JSON, star schema. No vendor-encoded outputs. Anything Observatory writes is readable by any standard tool, today and in ten years — without our involvement.

Open Pipeline

The Python and Polars layers are open source. The C++ kernel ships with a deterministic build pipeline. Your team can fork, audit, or extend the entire stack without asking permission.

Portable Metrics

Every metric definition is documented, versioned, and reproducible from the source event log. If you switch tools, your historical metrics remain comparable. Your KPI history doesn’t reset.
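A minimal sketch of what "documented, versioned, and reproducible" can mean in practice: the definition travels with the result, and the number can always be recomputed from raw events. MetricDef and value_yield below are illustrative, not Observatory's actual schema.

```python
# Sketch of a versioned, reproducible metric: the formula and its version
# are recorded alongside the result, so history stays comparable.
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDef:
    name: str
    version: str
    formula: str  # human-readable definition, recomputable from the event log

VALUE_YIELD = MetricDef(
    name="value_yield",
    version="1.2.0",
    formula="value-adding event count / total event count per case",
)

def value_yield(events, value_adding):
    """Recompute the metric from raw events; no hidden state to reset."""
    hits = sum(1 for e in events if e in value_adding)
    return hits / len(events)

score = value_yield(["Triage", "Fix", "Wait", "Close"], {"Triage", "Fix", "Close"})
print(VALUE_YIELD.version, score)  # 1.2.0 0.75
```

Because the inputs are the raw event log and the definition is pinned to a version, switching tools means re-running the formula, not rebuilding the history.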

