How can I import nanosecond timestamps into QuestDB via the HTTP CSV import?

I want to import a CSV file with nanosecond timestamps into QuestDB via an HTTP curl request. After researching a bit I found the timestamp formats supported by QuestDB, but none of them seems to fit this case. When I don't provide a type, the column defaults to LONG, and when I provide the type TIMESTAMP it only gives errors. I know I can import the timestamps as LONG and convert them to a timestamp afterwards, but since I am dealing with large amounts of data it would be best to import them directly in the right format. Does anyone know how to do this?

curl --location 'http://localhost:9000/imp?name=table1' \
  --form 'schema="[{\"name\":\"unixtime\", \"type\": \"timestamp\", \"format\":\"<WHICH_FORMAT?>\"}, ... ]"' \
  --form 'data=@-'

The simplest way to reproduce this is by creating a CSV file “test.csv”:

unixtime, value
1577980740000000000, 0

Then start QuestDB via Docker:

docker run \
  -p 9000:9000 -p 9009:9009 -p 8812:8812 -p 9003:9003 \
  questdb/questdb

Run the curl command to import the CSV:

curl -F data=@test.csv 'http://localhost:9000/imp'

And visit http://localhost:9000 to view the imported data.

When I've run into similar problems, what I've done is create the table beforehand, so QuestDB already knows the column's data type is a timestamp.
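For reference, a minimal sketch of the DDL you could run in the QuestDB web console (or via the `/exec` endpoint) before importing. The table and column names are taken from the example above; using INT for `value` and making `unixtime` the designated timestamp are my assumptions:

```sql
-- Hypothetical schema matching the sample CSV above
CREATE TABLE table1 (
  unixtime TIMESTAMP,
  value INT
) TIMESTAMP(unixtime);
```

With the table in place, the `/imp` request can target it with `name=table1` and the importer will try to parse the column as a timestamp rather than defaulting to LONG.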

Unfortunately, when importing a CSV with nanosecond resolution, QuestDB will wrongly assume the values are microseconds, so the dates won't be correct.

If you create the table beforehand AND remove the three trailing zeroes from each timestamp before sending the CSV, that would work. Maybe this is workable for you, but otherwise I cannot think of a different way to parse it right now.
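As a sketch of that workaround (the awk invocation and the `test_micros.csv` file name are my own, not from QuestDB docs): drop the last three digits of each nanosecond value, truncating it to microseconds, then import the resulting file with the same curl command as before.

```shell
# Recreate the sample file from above (skip if you already have it):
printf 'unixtime, value\n1577980740000000000, 0\n' > test.csv

# Truncate nanosecond timestamps (first column) to microseconds by
# dropping the last three digits; the header line is passed through:
awk -F', ' 'NR==1 {print; next} {print substr($1, 1, length($1)-3) ", " $2}' \
  test.csv > test_micros.csv

# Then import as before (assumes QuestDB is running locally):
# curl -F data=@test_micros.csv 'http://localhost:9000/imp?name=table1'
```

This is a one-pass stream transformation, so it should stay cheap even for large files.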
