
Aggregate data by Unix timestamp in CrateDB


Question

I'm very new to SQL and time-series databases. I'm using CrateDB (which, I think, is based on PostgreSQL). I want to aggregate the data by hour, day, week and month. The data is stored with Unix timestamps. The following is my sample table:

|sensorid | timestamp  | reading |
==================================
|1        | 1604192522 | 10      |
|1        | 1604192702 | 9.65    |
|2        | 1605783723 | 8.1     |
|2        | 1601514122 | 9.6     |
|2        | 1602292210 | 10      |
|2        | 1602291611 | 12      |
|2        | 1602291615 | 10      |

I tried a query using FROM_UNIXTIME, but it is not supported. Please help me.

I'm looking for hourly data like this:

sensorid | reading         | timestamp
1        | 19.65 (10+9.65) | 1604192400 (starting hour unix time)
2        | 8.1             | 1605783600 (starting hour unix time)
2        | 9.6             | 1601514000 (starting hour unix time)
2        | 32 (10+12+10)   | 1602291600 (starting hour unix time)

For monthly data I'm looking for something like:

sensorid | reading              | timestamp
1        | 24.61 (10+9.65+8.1)  | 1604192400 (starting month unix time)
2        | 41.6 (9.6+10+12+10)  | 1601510400 (starting month unix time)

Best answer

A straight-forward approach is:

SELECT
    (date '1970-01-01' + unixtime * interval '1 second')::date AS date,
    extract(hour from date '1970-01-01' + unixtime * interval '1 second') AS hour,
    count(c."user") AS count   -- "user" must be quoted: USER is a reserved word
FROM core c
GROUP BY 1, 2;
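Adapted to the question's schema, hourly buckets can also be computed with plain integer arithmetic, since an hour is exactly 3600 seconds. The table name readings is an assumption (the question never names the table), and "timestamp" is quoted because it collides with the type name:

SELECT
    sensorid,
    sum(reading) AS reading,
    "timestamp" - "timestamp" % 3600 AS hour_start   -- floor the epoch value to the hour
FROM readings
GROUP BY 1, 3;

For the sample row with timestamp 1604192522, the modulo is 122, so the bucket is 1604192400, matching the expected hourly output above.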

If you are content with having the date and time in the same column (which would seem more helpful to me), you can use date_trunc():

SELECT
    date_trunc('hour', date '1970-01-01' + unixtime * interval '1 second') AS date_hour,
    count(c."user") AS count
FROM core c
GROUP BY 1;   -- only one non-aggregate column, so GROUP BY 1 (not 1, 2)
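For month and week buckets, fixed-length arithmetic does not work, but the same date_trunc() result can be converted back to a Unix timestamp with extract(epoch from ...), which matches the output format the question asks for (table and column names again assumed from the sample):

SELECT
    sensorid,
    sum(reading) AS reading,
    extract(epoch from date_trunc('month', to_timestamp("timestamp"))) AS month_start
FROM readings
GROUP BY 1, 3;

Note that date_trunc() truncates in the session time zone; set it to UTC if the buckets should align with UTC calendar months.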

Another answer

You can convert a unix timestamp to a date/time value using to_timestamp(). You can aggregate along multiple dimensions at the same time using grouping sets. So, you might want:

select date_trunc('year', v.ts) as year,
       date_trunc('month', v.ts) as month,
       date_trunc('week', v.ts) as week,
       date_trunc('day', v.ts) as day,
       date_trunc('hour', v.ts) as hour,
       count(*), avg(reading), sum(reading)
from t cross join lateral
     (values (to_timestamp(timestamp))) v(ts)
group by grouping sets ( (year), (month), (week), (day), (hour) );
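The query above aggregates across all sensors together; to keep the per-sensor breakdown shown in the question, sensorid can be added to each grouping set (a sketch under the same assumed names, reduced to two sets for brevity):

select sensorid,
       date_trunc('month', v.ts) as month,
       date_trunc('hour', v.ts) as hour,
       sum(reading)
from t cross join lateral
     (values (to_timestamp("timestamp"))) v(ts)
group by grouping sets ( (sensorid, month), (sensorid, hour) );

Rows produced by the monthly set have NULL in the hour column, and vice versa, so the two granularities are easy to tell apart in the result.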