SELECT expression [AS alias], ... [FROM table_name | (subquery)] [WHERE condition] [GROUP BY expression, ...] [HAVING condition] [ORDER BY expression [ASC|DESC], ...] [LIMIT n | ALL] [OFFSET n] [FORMAT JSON | JSONEachRow | TabSeparated]
⚠️ No JOIN or UNION — single-table queries only. Use subqueries for nesting.
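A minimal sketch of the clause order (my_dataset and the meaning of blob1 are placeholders; SUM(_sample_interval) is the sampling-aware row count, see below):

SELECT blob1 AS path, SUM(_sample_interval) AS hits FROM my_dataset WHERE timestamp > now() - INTERVAL 1 HOUR GROUP BY path ORDER BY hits DESC LIMIT 5 FORMAT JSONEachRow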
| Column | Type | Description |
|---|---|---|
| blob1–blob20 | String | Dimension/label values |
| double1–double20 | Float64 | Numeric metric values |
| index1 | String | Sampling index key |
| timestamp | DateTime | Auto-populated event time |
| _sample_interval | Float64 | Sampling weight (use as the weight in aggregations) |
💡 Use AS to alias: blob1 AS user_id
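For example, pulling raw rows with readable aliases (what each blob/double holds depends on what your Worker wrote; blob2 as country is assumed here):

SELECT timestamp, blob1 AS city, blob2 AS country, double1 AS temperature FROM weather ORDER BY timestamp DESC LIMIT 20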
SHOW TABLES      -- list datasets
SHOW TIMEZONES   -- all supported TZs
SHOW TIMEZONE    -- current TZ (Etc/UTC)
Append FORMAT JSONEachRow to any SHOW statement.
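For example:

SHOW TABLES FORMAT JSONEachRow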
| Function | Description |
|---|---|
| count() | Row count |
| countIf(expr) | Count where expr is true |
| sum(col) | Sum of values |
| sumIf(col, cond) | Conditional sum |
| avg(col) | Mean of values |
| avgIf(col, cond) | Conditional mean |
| min(col) | Minimum value |
| max(col) | Maximum value |
| quantileExactWeighted(q, col, weight) | Weighted quantile |
| topK(N, col) | Top N most common values |
| topKWeighted(N, col, weight) | Weighted top N |
| argMin(arg, val) | arg at min(val) |
| argMax(arg, val) | arg at max(val) |
SELECT SUM(_sample_interval * double1) / SUM(_sample_interval) AS avg_val FROM my_dataset
SELECT SUM(_sample_interval) AS true_count FROM my_dataset
SELECT quantileExactWeighted(0.95, double1, _sample_interval) AS p95 FROM my_dataset
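Sampling-aware top-N, following the topKWeighted signature above (blob1 as the dimension of interest is an assumption):

SELECT topKWeighted(5, blob1, _sample_interval) AS top_values FROM my_dataset WHERE timestamp > now() - INTERVAL 1 DAY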
SELECT toStartOfInterval(timestamp, INTERVAL 5 MINUTE) AS t, blob1 AS city, SUM(_sample_interval * double1) / SUM(_sample_interval) AS avg_humidity FROM weather WHERE timestamp > now() - INTERVAL 1 DAY GROUP BY t, city ORDER BY t ASC
| Function | Returns |
|---|---|
| now() | Current DateTime |
| today() | Current Date |
| toStartOfInterval(ts, INTERVAL n UNIT) | Rounded DateTime |
| toDate(ts) | Extract date part |
| toHour(ts) | Hour (0–23) |
| toMinute(ts) | Minute (0–59) |
| toDayOfWeek(ts) | Day (1=Mon) |
| formatDateTime(ts, fmt) | Formatted string |
| toUnixTimestamp(ts) | Unix seconds |
| fromUnixTimestamp(n) | DateTime from Unix |
INTERVAL units: SECOND, MINUTE, HOUR, DAY, WEEK, MONTH
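A sketch of daily roll-ups using toDate, with event counts via the sampling weight as above (my_dataset is a placeholder):

SELECT toDate(timestamp) AS day, SUM(_sample_interval) AS events FROM my_dataset WHERE timestamp > now() - INTERVAL 7 DAY GROUP BY day ORDER BY day ASC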
= != < > <= >= IS NULL IS NOT NULL
blob1 LIKE 'prefix%'    -- % = any chars
blob1 LIKE '_bc%'       -- _ = single char
blob1 NOT LIKE 'test%'
AND, OR, NOT, IN (...), NOT IN (...)
+ - * / % (standard math)
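Combining these in a WHERE clause (the blob/double meanings are assumed for illustration):

SELECT SUM(_sample_interval) AS matches FROM my_dataset WHERE blob1 IN ('GET', 'POST') AND blob2 LIKE '/api/%' AND double1 >= 500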
| Function | Description |
|---|---|
| if(cond, then, else) | Ternary |
| ifNull(col, default) | NULL fallback |
| multiIf(c1,v1,c2,v2,...,else) | CASE-like |
| CASE WHEN...THEN...END | Standard CASE |
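For example, bucketing a metric with multiIf (treating double1 as a latency in ms is an assumption):

SELECT multiIf(double1 < 100, 'fast', double1 < 1000, 'ok', 'slow') AS bucket, SUM(_sample_interval) AS events FROM my_dataset GROUP BY bucket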
| Function | Description |
|---|---|
| length(s) | String length |
| lower(s) / upper(s) | Case convert |
| trim(s) | Strip whitespace |
| concat(s1, s2, ...) | Concatenate |
| substring(s, offset, len) | Substring |
| toString(val) | Cast to string |
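For example, building a composite label (blob1/blob2 as host and path is an assumption):

SELECT concat(lower(blob1), ':', blob2) AS key, SUM(_sample_interval) AS events FROM my_dataset GROUP BY key ORDER BY events DESC LIMIT 10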
| Function | Output |
|---|---|
| toInt32(expr) | 32-bit signed int |
| toInt64(expr) | 64-bit signed int |
| toUInt8(expr) | 8-bit unsigned int |
| toFloat32(expr) | 32-bit float |
| toFloat64(expr) | 64-bit float |
| toString(expr) | String |
| toDateTime(expr) | DateTime |
| toDate(expr) | Date |
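Handy when a numeric value was written into a blob column (assumed here for blob1):

SELECT toInt64(blob1) AS user_id, SUM(_sample_interval) AS events FROM my_dataset GROUP BY user_id ORDER BY events DESC LIMIT 10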
| Function | Description |
|---|---|
| abs(x) | Absolute value |
| ceil(x) | Ceiling |
| floor(x) | Floor |
| round(x [,n]) | Round to n decimals |
| sqrt(x) | Square root |
| log(x) / log2(x) / log10(x) | Logarithms |
| exp(x) | eˣ |
| pow(x, y) | xʸ |
| greatest(x, y) / least(x, y) | Max/min of two values |
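For example, histogram-style bucketing with floor (double1 as the metric is an assumption):

SELECT floor(double1 / 100) * 100 AS bucket, SUM(_sample_interval) AS events FROM my_dataset GROUP BY bucket ORDER BY bucket ASC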
| Function | Description |
|---|---|
| hex(val) | To hex string |
| unhex(str) | From hex string |
| base64Encode(str) | Base64 encode |
| base64Decode(str) | Base64 decode |
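For example, inspecting the raw bytes of a label value:

SELECT blob1, hex(blob1) AS blob1_hex FROM my_dataset LIMIT 5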
| Function | Description |
|---|---|
| bitAnd(a, b) | Bitwise AND |
| bitOr(a, b) | Bitwise OR |
| bitXor(a, b) | Bitwise XOR |
| bitNot(a) | Bitwise NOT |
| bitShiftLeft(a, n) | Left shift |
| bitShiftRight(a, n) | Right shift |
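For example, testing a flag bit when double1 stores a bitmask (an assumption for illustration; cast to an integer first):

SELECT bitAnd(toInt32(double1), 4) != 0 AS flag_set, SUM(_sample_interval) AS events FROM my_dataset GROUP BY flag_set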
# curl example
curl "https://api.cloudflare.com/client/v4/accounts/$ACCOUNT/analytics_engine/sql" \
  -H "Authorization: Bearer $TOKEN" \
  --data "SELECT blob1 AS city, SUM(_sample_interval * double1) / SUM(_sample_interval) AS avg_temp FROM weather WHERE timestamp > now() - INTERVAL 1 DAY GROUP BY city ORDER BY avg_temp DESC LIMIT 10"
Token needs: Account Analytics → Read permission. Default response format is JSON (includes meta + data + rows). Use FORMAT JSONEachRow for streaming, or TabSeparated for CSV-style output (e.g., Grafana).
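For example, the same style of query with a tab-separated response for CSV-friendly tooling:

SELECT blob1 AS city, SUM(_sample_interval) AS events FROM weather GROUP BY city ORDER BY events DESC FORMAT TabSeparated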
| Topic | Note |
|---|---|
| No JOINs | Single table per query only |
| Sampling | Use _sample_interval in all aggs for accuracy |
| avg() | Incorrect under sampling — use weighted sum instead |
| WHERE vs HAVING | WHERE = before group, HAVING = after aggregation |
| LIKE | % = any chars, _ = exactly one char |
| Timezone | Always Etc/UTC — convert with formatDateTime |
| Columns | blob1–20 (strings), double1–20 (numbers), index1 (sampling key) |
| Table creation | Auto-created on first writeDataPoint() — no DDL needed |
SELECT blob1 AS user, SUM(_sample_interval) AS events FROM my_table GROUP BY user HAVING events > 100 ORDER BY events DESC
SELECT * FROM ( SELECT blob1 AS user, SUM(_sample_interval) AS n FROM events GROUP BY user ) WHERE n > 50