Quickstart

In under ten minutes, you'll connect a datasource, register a semantic concept, and watch Colrows compile your first governed query end-to-end.

Don't have a Snowflake/Databricks instance handy?

Colrows ships with a sample dataset (anonymized retail orders) you can wire up in one click. Skip Step 2 below - choose Sample dataset when prompted.

Before you start

You'll need:

  • A modern browser (Chrome, Edge, Firefox, or Safari).
  • Read-only credentials for the datasource you want to connect - JDBC URL, username, and password.
  • If your warehouse sits behind a private network, the IP allow-list values from Datasources → Network.

Step-by-step

  1. Create your workspace

    Go to cloud.colrows.com/register and sign up with email or SSO. Your first workspace is created automatically and you become its first administrator.

  2. Connect a datasource

    From the side nav, choose Datasources → Add datasource. Select a connector (Snowflake, Databricks, Postgres, ClickHouse, Trino, Oracle, or SQL Server), enter the JDBC URL and credentials, and choose how Colrows should reach the database - direct, SSH tunnel, or SSL.

    # Example: Snowflake
    JDBC URL  = jdbc:snowflake://acme.snowflakecomputing.com
    Warehouse = COMPUTE_WH
    Database  = ANALYTICS
    Role      = COLROWS_READER
    Auth      = username + password (or key-pair)

    Colrows tests the connection, then crawls the schema. Tables, columns, and inferred relationships are written into the semantic graph as candidate nodes.
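    Before pasting credentials in, it can help to sanity-check the JDBC URL itself. The helper below is illustrative only (it is not part of Colrows): it strips the `jdbc:` prefix and splits out the driver and host so you can confirm you are pointing at the right endpoint.

```python
from urllib.parse import urlparse

def parse_jdbc_url(jdbc_url: str) -> dict:
    """Split a JDBC URL into driver and host parts for a quick sanity check.
    (Illustrative helper, not a Colrows API.)"""
    if not jdbc_url.startswith("jdbc:"):
        raise ValueError("JDBC URLs start with the 'jdbc:' prefix")
    # Drop the 'jdbc:' prefix so urlparse can handle the remainder.
    parsed = urlparse(jdbc_url[len("jdbc:"):])
    return {
        "driver": parsed.scheme,   # e.g. 'snowflake', 'postgresql'
        "host": parsed.hostname,   # the warehouse endpoint
        "port": parsed.port,       # None if the driver default applies
    }

info = parse_jdbc_url("jdbc:snowflake://acme.snowflakecomputing.com")
print(info["driver"], info["host"])
```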

  3. Register your first semantic concept

    Open Consensus → New concept. Give the concept a business name, an optional definition, and bind it to a column or expression in the source. Colrows generates an embedding for the definition and links it to the underlying schema as an anchor edge.

    Concept name     : Active Customer
    Definition       : A customer who has placed at least one
                       non-refunded order in the last 90 days.
    Anchor           : analytics.orders.customer_id
    Grain            : per customer
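    To see how a definition like this behaves as a predicate, here is a minimal sketch that expresses "Active Customer" as SQL over a toy orders table (run in-memory with sqlite3). The table and column names are assumptions for illustration, not the schema Colrows derives from your source.

```python
import sqlite3

# Toy orders table: one recent order, one stale order, one refunded order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id TEXT, order_date TEXT, refunded INTEGER);
    INSERT INTO orders VALUES
        ('c1', date('now', '-10 days'), 0),   -- recent, non-refunded: counts
        ('c2', date('now', '-200 days'), 0),  -- outside the 90-day window
        ('c3', date('now', '-5 days'), 1);    -- recent but refunded
""")

# The 'Active Customer' definition as a SQL predicate.
active = conn.execute("""
    SELECT COUNT(DISTINCT customer_id)
    FROM orders
    WHERE refunded = 0
      AND order_date >= date('now', '-90 days')
""").fetchone()[0]
print(active)  # only c1 qualifies
```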

  4. Compile your first query

    Open the SQL Editor or Colrows AI, then ask:

    How many active customers did we have in EMEA last month?

    Colrows compiles the question through the semantic layer. You'll see four traces:

    1. Resolution - which concepts and entities were bound.
    2. Join path proof - the deterministic graph traversal that connected customer to order.
    3. Constraint solve - applied policies (region scope, PII restrictions) and the resolved time window.
    4. Dialect SQL - the final, dialect-perfect statement that ran against your warehouse.

    Every trace is point-in-time reproducible: each run records the semantic state that was active when it executed, so replaying a past question binds to that recorded state rather than to whatever the graph looks like today.
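    The pinning idea can be sketched in a few lines. Everything here (the `Trace` record, the version map) is illustrative, not a Colrows API: the point is that replay reads the version captured in the trace, not the current one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trace:
    question: str
    semantic_version: int  # the state the query compiled against

# Two versions of a definition; version 2 is a later edit.
semantic_states = {
    1: "Active Customer = non-refunded order in the last 90 days",
    2: "Active Customer = non-refunded order in the last 30 days",
}

current_version = 1
trace = Trace("How many active customers in EMEA last month?", current_version)
current_version = 2  # the definition changes the next day

def replay(trace: Trace) -> str:
    # Replay binds to the version recorded in the trace, not the latest.
    return semantic_states[trace.semantic_version]

print(replay(trace))  # still the 90-day definition
```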

  5. Invite your team

    Go to Settings → Users and add teammates by email. Assign each one a persona (analyst, engineer, viewer) - the persona binds them to an allowed subgraph at compile time, so authorization is built into every query they run, not bolted on as a row-mask after the fact.
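    The difference between compile-time scoping and an after-the-fact row mask can be sketched as follows. The personas, node names, and `compile_query` function are illustrative assumptions, not Colrows internals: the key property is that an out-of-scope reference fails before any SQL exists.

```python
# Each persona maps to an allowed subgraph of semantic nodes.
ALLOWED_SUBGRAPH = {
    "analyst": {"orders", "customers", "regions"},
    "viewer":  {"orders"},
}

def compile_query(persona: str, referenced_nodes: set[str]) -> str:
    allowed = ALLOWED_SUBGRAPH[persona]
    blocked = referenced_nodes - allowed
    if blocked:
        # Authorization happens here, at compile time, rather than as a
        # row mask applied to result sets afterwards.
        raise PermissionError(f"{persona} cannot reach: {sorted(blocked)}")
    return f"-- SQL touching {sorted(referenced_nodes)}"

print(compile_query("analyst", {"orders", "customers"}))
```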

What you've just done

You've stood up a four-stage runtime that resolves intent, proves joins, applies policy, and generates dialect-perfect SQL - without writing a single transformation. Everything you do from here builds on the same primitives.
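The four stages can be pictured as a tiny pipeline. This is a toy sketch, not Colrows code: every function, join, and constraint below is an assumption made purely to show how resolution, join proof, constraint solving, and SQL emission compose.

```python
def resolve(question: str) -> dict:
    # Stage 1: bind concepts and entities from the question (hard-coded here).
    return {"concept": "Active Customer", "region": "EMEA", "window": "last month"}

def prove_join(intent: dict) -> list[str]:
    # Stage 2: a deterministic traversal customers -> orders (assumed graph).
    return ["customers.customer_id = orders.customer_id"]

def solve_constraints(intent: dict) -> list[str]:
    # Stage 3: apply region scope and resolve the time window.
    return [f"region = '{intent['region']}'", "order_date >= :month_start"]

def emit_sql(joins: list[str], constraints: list[str]) -> str:
    # Stage 4: emit the final statement.
    return ("SELECT COUNT(DISTINCT customer_id) FROM orders JOIN customers ON "
            + " AND ".join(joins) + " WHERE " + " AND ".join(constraints))

intent = resolve("How many active customers did we have in EMEA last month?")
sql = emit_sql(prove_join(intent), solve_constraints(intent))
print(sql)
```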

Where to go next