Make sure to always keep backups of your data; there are many simple ways to lose it (for example, uninstalling Gadgetbridge erases its data folder).
You can use Data export or Data auto export to get a copy of your data. Here is how:
The Data export function exports a database with your activity data, sports activities and other data, like preference settings. Activity data is exported as an SQLite database. Sports activities (if captured as GPX) are exported as GPX files. Preference data is exported as XML files. All these files are written into a folder called files in the internal storage area available to Gadgetbridge, for example /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/. This is a system folder: it is erased if you uninstall Gadgetbridge, so you must copy the data into some permanent storage. Make sure to copy/move/back up your files after every export!
So the backup/restore procedure is:
- Backup: copy everything from /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/ into some other location.
- Restore: copy the files back into /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/ and run Data import.
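For example, with USB debugging enabled you can pull the whole export folder to a computer using adb (a minimal sketch; gadgetbridge-backup is just an example destination directory, and on newer Android versions accessing this path may require root, as in the shell session further below):

adb pull /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files gadgetbridge-backup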
The export produces the following files:
- Export_preference - your global preferences, XML file
- Export_preference_xx:xx:xx:xx:xx:xx - device specific preferences, XML files
- Gadgetbridge - activity, sleep and other activity data, SQLite database
- gadgetbridge-track-2020-04-14T17_14_29+02_00.gpx - GPX file (might contain GPS, heart rate...)
The Gadgetbridge activity database is exported in SQLite format and can be used for further analysis or visualization.
Under the main menu, go to Database management. Exported data will be in the Gadgetbridge file under a directory shown on the screen.
For this example we'll assume the file to be /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/Gadgetbridge.
Some Android versions might not have created /storage/emulated/0 yet, resulting in a Gadgetbridge error saying 'Cannot find export path'. This is probably because there was no /storage/sdcard0/ directory. After inserting external storage and trying to export again, Android probably creates the /storage/sdcard0/ directory; note that /storage/sdcard0/ is still in local storage.
A file previously exported via the export function can be re-imported (current data will be overwritten!) by placing the database file into /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/Gadgetbridge.
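To restore from a computer, the saved database can be pushed back before running Data import (a sketch under the same assumptions as the adb example above):

adb push gadgetbridge-backup/Gadgetbridge /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/Gadgetbridge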
Gadgetbridge data can be automatically exported periodically for external backups, sync, processing etc. To set up auto export, go to Settings → Auto export → Auto export enabled → Yes.
Select the export directory and the name of the exported file via Export location. Set the interval period (in hours) via Export interval.
You can test the auto export via the menu Database management → AutoExport → Run AutoExport now.
This shows how to access the previously exported database on Debian GNU/Linux 9; the sqlite3 and android-tools-adb packages are required.
To list all tables in the database:
$ adb shell
OnePlus3:/ $ su
OnePlus3:/ # cd /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files
OnePlus3:/ # sqlite3 Gadgetbridge
SQLite version 3.19.4 2017-08-18 19:28:12
Enter ".help" for usage hints.
sqlite> .table
ACTIVITY_DESCRIPTION NOTIFICATION_FILTER
ACTIVITY_DESC_TAG_LINK NOTIFICATION_FILTER_ENTRY
ALARM PEBBLE_HEALTH_ACTIVITY_OVERLAY
BASE_ACTIVITY_SUMMARY PEBBLE_HEALTH_ACTIVITY_SAMPLE
CALENDAR_SYNC_STATE PEBBLE_MISFIT_SAMPLE
DEVICE PEBBLE_MORPHEUZ_SAMPLE
DEVICE_ATTRIBUTES TAG
HPLUS_HEALTH_ACTIVITY_OVERLAY USER
HPLUS_HEALTH_ACTIVITY_SAMPLE USER_ATTRIBUTES
ID115_ACTIVITY_SAMPLE XWATCH_ACTIVITY_SAMPLE
MI_BAND_ACTIVITY_SAMPLE ZE_TIME_ACTIVITY_SAMPLE
NO1_F1_ACTIVITY_SAMPLE android_metadata
To export to CSV:
sqlite> .headers on
sqlite> .mode csv
sqlite> .output out.csv
sqlite> select * from BASE_ACTIVITY_SUMMARY;
Reference: SO.
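The query below pulls raw timestamps, human-readable local datetimes, steps and heart rate for a chosen time window from MI_BAND_ACTIVITY_SAMPLE, filters out invalid heart-rate samples (255) and uses the LAG window function to compute the gap between consecutive samples in minutes: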
SELECT "TIMESTAMP",datetime("TIMESTAMP",'unixepoch','localtime') as DATETIME, STEPS, HEART_RATE, ("TIMESTAMP"- LAG("TIMESTAMP") OVER())/60 as TIME_DIFF
FROM MI_BAND_ACTIVITY_SAMPLE
WHERE "TIMESTAMP" BETWEEN (strftime('%s','2019-08-02 16:15:00','utc')) and (strftime('%s','2019-08-03 23:15:00','utc'))
and HEART_RATE<>255
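And to compute the average number of steps per day over the whole database: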
select avg(a) from (select strftime('%Y.%m.%d', datetime(timestamp, 'unixepoch')) as d,sum(STEPS)as a from MI_BAND_ACTIVITY_SAMPLE group by d)
You can merge data from a previous export (for example from another Xiaomi device) into a new one by opening your current export and inserting all data from the older file:
Make a fresh export as described in Data export above and open it in sqlite3:
sqlite3 Gadgetbridge
Attach the older file with the previous data in the same sqlite3 instance:
ATTACH 'Gadgetbridge_old' as old;
Insert old data into fresh backup:
insert into MI_BAND_ACTIVITY_SAMPLE SELECT * from old.MI_BAND_ACTIVITY_SAMPLE;
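If the old and new exports overlap in time, the plain INSERT can fail with a UNIQUE constraint error on duplicate timestamps. A hedged variant that keeps the rows already in the fresh export and silently skips duplicates (standard SQLite):

insert or ignore into MI_BAND_ACTIVITY_SAMPLE SELECT * from old.MI_BAND_ACTIVITY_SAMPLE;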
Close sqlite3, make sure to put this fresh Gadgetbridge file back into /storage/emulated/0/Android/data/nodomain.freeyourgadget.gadgetbridge/files/ and perform a Data import.
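The following Python script is one way to process the exported database further: it computes step totals per day, week, month and year with SQL, then plots each aggregation as a bar chart with matplotlib. Run it from the folder containing the exported Gadgetbridge file.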
import sqlite3
import matplotlib.pyplot as plt
import numpy as np

conn = sqlite3.connect('Gadgetbridge')
c = conn.cursor()

# samples must have strictly more steps than this to be counted
min_steps_per_minute = 0

# sum steps per day, week, month and year
d = c.execute("select strftime('%Y.%m.%d', datetime(timestamp, 'unixepoch')) as d, sum(STEPS) from MI_BAND_ACTIVITY_SAMPLE where STEPS > ? group by d", (min_steps_per_minute,)).fetchall()
w = c.execute("select strftime('%Y.%W', datetime(timestamp, 'unixepoch')) as d, sum(STEPS) from MI_BAND_ACTIVITY_SAMPLE where STEPS > ? group by d", (min_steps_per_minute,)).fetchall()
m = c.execute("select strftime('%Y.%m', datetime(timestamp, 'unixepoch')) as d, sum(STEPS) from MI_BAND_ACTIVITY_SAMPLE where STEPS > ? group by d", (min_steps_per_minute,)).fetchall()
y = c.execute("select strftime('%Y', datetime(timestamp, 'unixepoch')) as d, sum(STEPS) from MI_BAND_ACTIVITY_SAMPLE where STEPS > ? group by d", (min_steps_per_minute,)).fetchall()
print("all avg:", c.execute("select avg(STEPS) from MI_BAND_ACTIVITY_SAMPLE where STEPS > ?", (min_steps_per_minute,)).fetchall())

# turn the (label, total) rows into dictionaries for plotting
db = {x[0]: x[1] for x in d}
wb = {x[0]: x[1] for x in w}
mb = {x[0]: x[1] for x in m}
yb = {x[0]: x[1] for x in y}

fig, ax = plt.subplots(4)

def doit(where, what, color, label):
    # draw one bar chart per aggregation level
    where.bar(
        np.arange(len(what)),
        list(what.values()),
        0.3,
        tick_label=list(what.keys()),
        label=label,
        color=color,
    )
    where.legend()

doit(ax[3], yb, "g", "steps/year")
doit(ax[2], mb, "b", "steps/month")
doit(ax[1], db, "r", "steps/day")
doit(ax[0], wb, "g", "steps/week")

# rotate the x-axis tick labels on all subplots
for axis in fig.axes:
    plt.sca(axis)
    plt.xticks(rotation=65)
plt.show()

c.close()
conn.close()
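For Pebble Health devices, daily step totals can be computed directly from PEBBLE_HEALTH_ACTIVITY_SAMPLE: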
select date(TIMESTAMP, 'unixepoch') as "Date", sum(STEPS) as "Steps"
from PEBBLE_HEALTH_ACTIVITY_SAMPLE
group by date(TIMESTAMP, 'unixepoch');
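Sleep per night can be summed from PEBBLE_HEALTH_ACTIVITY_OVERLAY, where RAW_KIND = 1 marks sleep; the 4-hour shift in the GROUP BY makes a night that crosses midnight count as a single night: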
select
round(sum(TIMESTAMP_TO-TIMESTAMP_FROM)/3600.0,1) as "Duration",
sum(TIMESTAMP_TO-TIMESTAMP_FROM)/3600 as "Hours",
sum(TIMESTAMP_TO-TIMESTAMP_FROM)%3600/60 as "Minutes",
datetime(min(TIMESTAMP_FROM), 'unixepoch') as "Start",
datetime(max(TIMESTAMP_TO), 'unixepoch') as "End"
from PEBBLE_HEALTH_ACTIVITY_OVERLAY
where RAW_KIND = 1
group by date(TIMESTAMP_FROM, 'unixepoch', '+4 hours', 'start of day');
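A finer breakdown by overlay kind (RAW_KIND 1 = sleep, 2 = deep sleep, 3 = nap, 4 = deep nap), again grouped per night: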
select
round(sum(case when RAW_KIND = 1 then TIMESTAMP_TO-TIMESTAMP_FROM else 0 end)/3600.0,1) as "Sleep Duration",
round(sum(case when RAW_KIND = 2 then TIMESTAMP_TO-TIMESTAMP_FROM else 0 end)/3600.0,1) as "Deep Sleep Duration",
round(sum(case when RAW_KIND = 3 then TIMESTAMP_TO-TIMESTAMP_FROM else 0 end)/3600.0,1) as "Nap Duration",
round(sum(case when RAW_KIND = 4 then TIMESTAMP_TO-TIMESTAMP_FROM else 0 end)/3600.0,1) as "Deep Nap Duration",
datetime(min(TIMESTAMP_FROM), 'unixepoch') as "Start",
datetime(max(TIMESTAMP_TO), 'unixepoch') as "End"
from PEBBLE_HEALTH_ACTIVITY_OVERLAY
where RAW_KIND in (1, 2, 3, 4)
group by date(TIMESTAMP_FROM, 'unixepoch', '+4 hours', 'start of day');
To merge historical MiFit data into the Gadgetbridge database, follow these steps (the script itself is shown below):
1. Get your MiFit data via a GDPR data request, either from MiFit or via this URL.
2. Make an export of the database in Gadgetbridge (and make an extra backup of this exported database).
3. Unzip the received MiFit data and place the .csv files into a single folder (see the list of required files below).
4. Put the exported Gadgetbridge database file into the same folder.
5. Put the script into the same folder.
6. Either remove the "numbers" from the .csv file names, or rename the xxx_file_name variables in the script.
7. You may need to edit device_id and user_id; for most people (with one device) these can stay as they are.
8. Run the script with python3: python3 import_from_mifit.py
9. Re-import the updated Gadgetbridge database file into Gadgetbridge.
#!/usr/bin/env python3
import csv
import datetime
import sys
import sqlite3
import random
# import script to get MiFit data into Gadgetbridge database
# what this tool does:
# - it checks if a particular record (based on timestamp) is in database
# - if record does not exist, it is created:
# - steps are added
# - for sleep, separate minute based records are created
# - all records will have heart rate measurements, if available in the data
# what this tool does not:
# - doesn't import activities
# it can damage your data, make plenty of backups to be able to roll back at any point
# 1) get your MiFit data via GDPR data request, URL:
# https://account.xiaomi.com/pass/serviceLogin?callback=https%3A%2F%2Faccount.xiaomi.com%2Fsts%2Foauth%3Fsign%3D7QjKYjTipB1s7OliGXsWt1OL9sE%253D%26followup%3Dhttps%253A%252F%252Faccount.xiaomi.com%252Foauth2%252Fauthorize%253Fskip_confirm%253Dfalse%2526client_id%253D428135909242707968%2526pt%253D1%2526redirect_uri%253Dhttps%25253A%25252F%25252Fapi-mifit-cn.huami.com%25252Fhuami.health.loginview.do_not%2526_locale%253Dde_DE%2526response_type%253Dcode%2526scope%253D1%25252016001%25252020000%2525206000%2526_sas%253Dtrue%26sid%3Doauth2.0&sid=oauth2.0&lsrp_appName=In%20%24%7BMi%20Fit%7D%24%20%C3%BCber%20das%20Mi-Konto%20anmelden&_customDisplay=20&scope=1%206000%2016001&_locale=de_DE&_ssign=2%26V1_oauth2.0%266qHWjGF3kaRWOWCGQdM8gIt6lR8%3D
# 2) make an export of database in Gadgetbridge
# - make an extra backup of this exported database
# 3) unzip received MiFit data and place the .csv files into a single folder (see list of required files below)
# 4) put exported Gadgetbridge database file into the same folder
# 5) put this script into the same folder
# 6) either remove "numbers" from the .csv file names, or rename the xxx_file_name variables below
# 7) you may need to edit device_id and user_id. for most people (with one device) this will remain as is below
# 8) run this script with python3:
# python import_from_mifit.py
# 9) re-import the updated Gadgetbridge database file to GB
activity_file_name = "ACTIVITY_MINUTE.csv"
hr_file_name1 = "HEARTRATE.csv"
hr_file_name2 = "HEARTRATE_AUTO.csv"
sleep_file_name = "SLEEP.csv"
database = "Gadgetbridge"
device_id = 1
user_id = 1
# do not edit below
conn = sqlite3.connect(database)
cursor = conn.cursor()
# build HR dictionary, keyed by unix timestamp (int)
hr = {}
data = csv.reader(open(hr_file_name1), delimiter=",")
# example row: 1572088219,81
next(data)  # skip header
for line in data:
    hr[int(line[0])] = int(line[1])
data = csv.reader(open(hr_file_name2), delimiter=",")
# example row: 2017-07-14,23:35,54
next(data)  # skip header
for line in data:
    date = "{0},{1}".format(*line)
    dt = datetime.datetime.strptime(date, "%Y-%m-%d,%H:%M")
    # interpret the CSV times as UTC
    timestamp = int(dt.replace(tzinfo=datetime.timezone.utc).timestamp())
    hr[timestamp] = int(line[2])
# steps
data = csv.reader(open(activity_file_name), delimiter=",")
# example row: 2017-07-04,13:18,11
next(data)  # skip header
for line in data:
    date = "{0},{1}".format(*line)
    dt = datetime.datetime.strptime(date, "%Y-%m-%d,%H:%M")
    # interpret the CSV times as UTC
    timestamp = int(dt.replace(tzinfo=datetime.timezone.utc).timestamp())
    w = {"timestamp": timestamp}
    # only insert if no record with this timestamp exists yet
    r = cursor.execute(
        "SELECT * from MI_BAND_ACTIVITY_SAMPLE where TIMESTAMP=$timestamp", w
    ).fetchone()
    if r:
        # record already exists, skip it
        pass
    else:
        steps = int(line[2])
        heart_rate = hr.get(timestamp, 255)  # 255 = no heart rate measurement
        raw_intensity = random.randint(10, 130)  # MiFit export has no intensity, fake it
        if steps < 80:
            raw_kind = 1  # slow walking
        elif steps < 100:
            raw_kind = 1  # 3 = fast walking, unsupported by GB
        else:
            raw_kind = 4  # running
        print("inserting", steps, heart_rate)
        cursor.execute(
            "INSERT INTO MI_BAND_ACTIVITY_SAMPLE VALUES (?,?,?,?,?,?,?)",
            (timestamp, device_id, user_id, raw_intensity, steps, raw_kind, heart_rate),
        )
# sleep
data = csv.reader(open(sleep_file_name), delimiter=",")
# 2017-07-05,1499317099,45,348,0,1499206440,1499230020
next(data) # skip header
for line in data:
    deep_sleep = int(line[2])   # minutes of deep sleep
    light_sleep = int(line[3])  # minutes of light sleep
    timestamp = int(line[5])    # sleep start, unix epoch
    ts_to = int(line[6])        # sleep end, unix epoch (unused below)
    w = {}
    # deep sleep: create one minute-long record per minute of deep sleep
    for i in range(0, deep_sleep):
w["timestamp"] = timestamp
r = cursor.execute(
"SELECT * from MI_BAND_ACTIVITY_SAMPLE where TIMESTAMP=$timestamp", (w)
).fetchone()
if r:
# print("record exists", r,line[2])
pass
else:
heart_rate = hr.get(timestamp, 255)
print("inserting sleep", timestamp)
steps = 0
            raw_kind = 123  # deep sleep
            raw_intensity = random.choice([20, 2, 7] + [0] * 20)  # mostly zero intensity
cursor.execute(
"INSERT INTO MI_BAND_ACTIVITY_SAMPLE VALUES (?,?,?,?,?,?,?)",
(
timestamp,
device_id,
user_id,
raw_intensity,
steps,
raw_kind,
heart_rate,
),
)
timestamp = timestamp + 60
    # light sleep: create one minute-long record per minute of light sleep
    for i in range(0, light_sleep):
w["timestamp"] = timestamp
r = cursor.execute(
"SELECT * from MI_BAND_ACTIVITY_SAMPLE where TIMESTAMP=$timestamp", (w)
).fetchone()
if r:
# print("record exists", r,line[2])
pass
else:
heart_rate = hr.get(timestamp, 255)
print("inserting sleep", timestamp)
steps = 0
            raw_kind = 121  # light sleep
            raw_intensity = random.choice([20, 2, 7] + [0] * 20)  # mostly zero intensity
cursor.execute(
"INSERT INTO MI_BAND_ACTIVITY_SAMPLE VALUES (?,?,?,?,?,?,?)",
(
timestamp,
device_id,
user_id,
raw_intensity,
steps,
raw_kind,
heart_rate,
),
)
timestamp = timestamp + 60
conn.commit()
conn.close()
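After the script finishes and commits, a quick sanity check before re-importing (a sketch, run in the same folder) prints the sample count and the earliest sample, which should now reach back to the start of the MiFit history:

sqlite3 Gadgetbridge "select count(*), datetime(min(TIMESTAMP), 'unixepoch') from MI_BAND_ACTIVITY_SAMPLE;"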