Quickstart
In under ten minutes, you'll connect a datasource, register a semantic concept, and watch Colrows compile your first governed query end-to-end.
Colrows ships with a sample dataset (anonymized retail orders) that you can wire up in one click: skip Step 2 below and choose Sample dataset when prompted.
Before you start
You'll need:
- A modern browser (Chrome, Edge, Firefox, or Safari).
- Read-only credentials for the datasource you want to connect - JDBC URL, username, and password.
- If your warehouse sits behind a private network, the IP allow-list values from Datasources → Network.
Step-by-step
1. Create your workspace
Go to cloud.colrows.com/register and sign up with email or SSO. Your first workspace is created automatically, and you become its first administrator.
2. Connect a datasource
From the side nav, choose Datasources → Add datasource. Select a connector (Snowflake, Databricks, Postgres, ClickHouse, Trino, Oracle, or SQL Server), enter the JDBC URL and credentials, and choose how Colrows should reach the database - direct, SSH tunnel, or SSL.

```
# Example: Snowflake
JDBC URL  = jdbc:snowflake://acme.snowflakecomputing.com
Warehouse = COMPUTE_WH
Database  = ANALYTICS
Role      = COLROWS_READER
Auth      = username + password (or key-pair)
```

Colrows tests the connection, then crawls the schema. Tables, columns, and inferred relationships are written into the semantic graph as candidate nodes.
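The fields above combine into a single JDBC URL. A minimal sketch, assuming Snowflake's standard JDBC URL parameters (`warehouse`, `db`, `role`); the helper name is illustrative, not part of Colrows:

```python
# Illustrative helper: assemble a Snowflake JDBC URL from the fields shown
# in the example above. Not a Colrows API - just the URL format.
def snowflake_jdbc_url(account: str, warehouse: str, database: str, role: str) -> str:
    base = f"jdbc:snowflake://{account}.snowflakecomputing.com"
    params = f"?warehouse={warehouse}&db={database}&role={role}"
    return base + params

url = snowflake_jdbc_url("acme", "COMPUTE_WH", "ANALYTICS", "COLROWS_READER")
print(url)
# jdbc:snowflake://acme.snowflakecomputing.com?warehouse=COMPUTE_WH&db=ANALYTICS&role=COLROWS_READER
```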
3. Register your first semantic concept
Open Consensus → New concept. Give the concept a business name, an optional definition, and bind it to a column or expression in the source. Colrows generates an embedding for the definition and links it to the underlying schema as an anchor edge.

```
Concept name : Active Customer
Definition   : A customer who has placed at least one non-refunded order in the last 90 days.
Anchor       : analytics.orders.customer_id
Grain        : per customer
```

4. Compile your first query
Open the SQL Editor or Colrows AI, then ask:

```
How many active customers did we have in EMEA last month?
```

Colrows compiles the question through the semantic layer. You'll see four traces:
- Resolution - which concepts and entities were bound.
- Join path proof - the deterministic graph traversal that connected `customer` to `order`.
- Constraint solve - the applied policies (region scope, PII restrictions) and the resolved time window.
- Dialect SQL - the final, dialect-perfect statement that ran against your warehouse.
Every trace is point-in-time reproducible: re-running the same question later replays the semantic state that was active at the original execution.
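The four traces above can be sketched as plain data. A toy compiler for the example question - every name here is assumed for illustration and is not the Colrows API; the SQL is an ANSI-flavored stand-in for the dialect output:

```python
from datetime import date, timedelta

# Toy sketch of the four compile traces for:
# "How many active customers did we have in EMEA last month?"
def compile_question(today: date) -> dict:
    # 1. Resolution: bind phrases to semantic-graph nodes.
    resolution = {
        "active customers": "concept:Active Customer",
        "EMEA": "entity:region=EMEA",
        "last month": "time:previous_calendar_month",
    }
    # 2. Join path proof: deterministic traversal customer -> order.
    join_path = ["analytics.customer", "analytics.orders"]
    # 3. Constraint solve: region policy plus the concept's 90-day
    #    window, anchored to the end of the requested month.
    month_end = today.replace(day=1) - timedelta(days=1)
    constraints = {
        "region": "EMEA",
        "order_status": "!= 'refunded'",
        "window_start": month_end - timedelta(days=90),
        "window_end": month_end,
    }
    # 4. Dialect SQL: the final statement sent to the warehouse.
    sql = (
        "SELECT COUNT(DISTINCT o.customer_id) FROM analytics.orders o "
        "WHERE o.region = 'EMEA' AND o.status <> 'refunded' "
        f"AND o.order_date BETWEEN '{constraints['window_start']}' "
        f"AND '{constraints['window_end']}'"
    )
    return {"resolution": resolution, "join_path": join_path,
            "constraints": constraints, "sql": sql}

traces = compile_question(date(2024, 5, 15))
print(traces["constraints"]["window_end"])  # 2024-04-30
```

Note how the time window is resolved at constraint-solve time, before any SQL exists - this is what makes the trace reproducible rather than dependent on when the query happens to run.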
5. Invite your team
Go to Settings → Users and add teammates by email. Assign each one a persona (`analyst`, `engineer`, or `viewer`) - the persona binds them to an allowed subgraph at compile time, so authorization is built into every query they run, not bolted on as a row mask after the fact.
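Compile-time authorization can be sketched as pruning against an allowed subgraph. A minimal illustration - the persona scopes and table names below are assumptions, not Colrows' actual model:

```python
# Each persona maps to a subgraph of tables it may touch (assumed scopes).
ALLOWED_SUBGRAPH = {
    "analyst":  {"analytics.orders", "analytics.customer"},
    "engineer": {"analytics.orders", "analytics.customer", "analytics.raw_events"},
    "viewer":   {"analytics.orders"},
}

def compile_plan(persona: str, tables_needed: set[str]) -> set[str]:
    """Admit a plan only if every table it needs lies in the persona's subgraph."""
    blocked = tables_needed - ALLOWED_SUBGRAPH[persona]
    if blocked:
        # Compilation fails up front: the plan never reaches the warehouse.
        raise PermissionError(f"{persona} cannot reach {sorted(blocked)}")
    return tables_needed

compile_plan("analyst", {"analytics.orders", "analytics.customer"})  # ok
```

The design point is that a disallowed plan fails at compile time, instead of running and returning masked rows.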
What you've just done
You've stood up a four-stage runtime that resolves intent, proves joins, applies policy, and generates dialect-perfect SQL - without writing a single transformation. Everything you do from here builds on the same primitives.
Where to go next
- Core concepts - Understand compile-then-execute, metrics-as-state, and join path proof in depth.
- Consensus - Add metrics, events, constraints, and personas to your semantic graph.
- Colrows AI - Use natural language safely; every answer is grounded in governed semantics.
- Access control - Define personas, scopes, and row/column-level predicates that shape every plan.