@javier
I’m struggling with data migration from InfluxDB to QuestDB, and nothing I discussed with Nick Woolmer was working really well. He had the idea to contact you about transferring data with Telegraf from InfluxDB to QuestDB, so the data arrives in QuestDB without errors and in dense format.
Hi,
Sorry I didn’t reply earlier. I was travelling last week for a conference and didn’t notice this.
I have successfully converted sparse data to dense data using a Telegraf aggregator, so metrics with the same timestamp get merged into a single dense row.
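To illustrate what I mean by dense (the measurement, tag, and field names below are made up for illustration, not from your data), three sparse line-protocol points that share a series key and timestamp get merged into one row with all fields:
# Sparse input: one field per line, same series key and timestamp
sensors,plant=A temperature=21.5 1700000000000000000
sensors,plant=A humidity=40.2 1700000000000000000
sensors,plant=A pressure=1013 1700000000000000000
# Dense output after the merge aggregator: one row with all fields
sensors,plant=A temperature=21.5,humidity=40.2,pressure=1013 1700000000000000000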
I have an example in this repository: GitHub - questdb/opcua-questdb-telegraf (playground for OPC UA data ingestion into QuestDB using Telegraf), and another in this one: GitHub - javier/questdb-basic-monitoring (basic QuestDB monitoring using Grafana, collecting QuestDB metrics via Telegraf).
In both I am using the same approach.
In your case, you want to migrate from InfluxDB into QuestDB, so one option is to use Telegraf to read from InfluxDB and write into QuestDB. I did that once like this.
Telegraf config file (saved as sparse_to_dense.conf)
[agent]
  omit_hostname = true

# Accept metrics over the InfluxDB 2.x HTTP API
[[inputs.influxdb_v2_listener]]
  ## Address and port to host the InfluxDB listener on
  ## (Double check the port. It needs to be available)
  service_address = ":8099"

# Merge metrics into multi-field metrics by series key
[[aggregators.merge]]
  period = "876000h" # Set the period to 100 years (876,000 hours)
  grace = "876000h"  # Set the grace period to 100 years (876,000 hours)
  drop_original = true

# This is the QuestDB destination
[[outputs.influxdb_v2]]
  urls = ["http://127.0.0.1:9000"]
  content_encoding = "identity" # Important to ensure no gzip encoding
Start Telegraf
telegraf --config sparse_to_dense.conf --debug
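If you still need to produce the line-protocol file (output.lp below), one way is InfluxDB 2.x's influxd inspect export-lp command, which dumps a bucket as line protocol. The bucket ID and engine path here are placeholders for your own setup, so adjust them accordingly:
influxd inspect export-lp \
  --bucket-id <your-bucket-id> \
  --engine-path /var/lib/influxdb2/engine \
  --output-path output.lp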
Use the InfluxDB CLI to write the exported data into the Telegraf proxy
In my case, I have a dockerized InfluxDB, so I am running this directly from Docker.
docker exec -it influxdb2 influx write --bucket javier --file output.lp --host http://host.docker.internal:8099
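Once the write finishes, you can double-check that the data landed in QuestDB via its REST API. The table name here is just a placeholder for whatever measurement you migrated:
curl -G http://127.0.0.1:9000/exec \
  --data-urlencode "query=SELECT count() FROM 'your_measurement';"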
Hopefully this helps!