Recently, I wrote this article about installing Pi-Hole on your Raspberry Pi to keep your home network safe.
After installing Pi-Hole, getting comfortable with it, and looking at all the data, I felt that I wanted to do a bit more with the data it generates. After exhausting the options available within Pi-Hole itself, I turned to my favorite analytics platform, Tableau.
So, how do I go about connecting Tableau to my Pi-Hole? After doing a bunch of research and implementing a solution, the final setup allows me to play with the data to my heart's desire.
Have a look here

You can see how many requests from AdColony and Google Analytics are automatically being blocked. More importantly, I can now use the power of Tableau to identify trends and take action on my Pi-Hole.
High-level flow
Here is the high-level flow for my process:
- Connect to the Pi-Hole database on my Raspberry Pi
- Run a query that extracts the data you need
- Upload the data to Google Drive
- Connect Tableau to Google Drive
- Create the dashboard against the data
Once I was comfortable with the entire process, I automated it, and now new files show up in my Google Drive and Tableau connects to the updated data.
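To give a sense of what that automation can look like, here is a hedged sketch of a crontab entry. The script name and schedule below are illustrative placeholders, not the exact ones I use:

```shell
# Illustrative crontab entry (edit your crontab with: crontab -e).
# Runs a hypothetical export-and-upload script every day at 02:00.
0 2 * * * /pihole.queries/scripts/daily_export_and_upload.sh
```

The actual scripts that such a wrapper would call are covered step by step below.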
Detailed Instructions
Connect to the Pi-Hole database on my Raspberry Pi
Here is the command to connect to the SQLite database:
sudo pihole-FTL sqlite3 /etc/pihole/pihole-FTL.db
To get CSV output with a header row, pass these options when connecting:
sudo pihole-FTL sqlite3 /etc/pihole/pihole-FTL.db -header -csv
Run a query that creates the data that you need
Here is the SQL query I want to run:
Select client.name, queries.*, datetime(timestamp, 'unixepoch', 'localtime') as datetime from queries, \
(select ip, max(id), name from client_by_id group by ip) client where client.ip = queries.client
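The datetime(timestamp, 'unixepoch', 'localtime') call is what turns the raw Unix epoch seconds that Pi-Hole stores into a readable timestamp. As a quick sketch (not part of the pipeline), you can see the same kind of conversion from the shell with GNU date:

```shell
# Convert a Unix epoch value to a readable timestamp, the same idea
# as the SQL datetime(..., 'unixepoch') call. Epoch 0 in UTC:
date -u -d @0 +"%Y-%m-%d %H:%M:%S"   # prints 1970-01-01 00:00:00
```

The 'localtime' modifier in the query additionally shifts the result into the Pi's configured time zone.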
I want to dump the results into a file, so I will redirect the output by adding
> /pihole.queries/results/queries_with_ts_all_data.csv
To run a single command that connects to the DB, runs the SQL query, and then exports the results to a CSV, run this:
sudo pihole-FTL sqlite3 /etc/pihole/pihole-FTL.db -header -csv \
"Select client.name, queries.*, datetime(timestamp, 'unixepoch', 'localtime') as datetime from queries, \
(select ip, max(id), name from client_by_id group by ip) client where client.ip = queries.client" \
> /pihole.queries/results/queries_with_ts_all_data.csv
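One thing worth adding before the upload step is a quick check that the export actually produced data. This is my own hedged sketch, not something the original pipeline does; the stand-in file below exists only to make the example self-contained:

```shell
# Create a stand-in CSV so this sketch is runnable anywhere; in the
# real pipeline this would be the exported query results file.
csv=$(mktemp /tmp/queries_XXXXXX.csv)
printf 'name,id\nlocalhost,1\n' > "$csv"

# Upload only if the file is non-empty and has more than just the
# header row (wc -l counts lines; 1 line would mean header only).
if [ -s "$csv" ] && [ "$(wc -l < "$csv")" -gt 1 ]; then
    echo "OK to upload: $csv"
else
    echo "Export looks empty, skipping upload" >&2
fi
```

A guard like this keeps an empty or failed export from overwriting a good file in Google Drive.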
Upload the data to Google Drive
Okay, let’s be clear: if you just want to pull all the data from the DB and connect Tableau to it, you are done. You have the CSV file, and Tableau connects to it natively. You can create any dashboard on that data.
CONGRATS!!!
However, if you would like to automate the process every day, you can use rclone to do just that. Here is the process for rclone configuration:
a. Install rclone on your Raspberry Pi. I believe I used this link for instructions.
b. Set up a new remote drive
rclone config
c. Configure the Google Drive setup using this link
You don’t have to create your own drive ID in this case because I am only uploading files once a week, but if you want more control, you will want to do that using this link. FWIW, I ended up doing it anyway because I wasn’t sure I wanted to use the global drive for security reasons.
d. Create the command to upload the files
rclone copy -vv --drive-allow-import-name-change --drive-import-formats csv \
/pihole.queries/results/ gdrive:pi-hole
The above command will simply take all the files in the /pihole.queries/results/ directory and upload them to a sub-folder that I have created in my Google Drive called “pi-hole”. Keep in mind, the word “gdrive” is the name of the new remote that I created in rclone. If you don’t remember what name you gave in step b. above, you can simply type
rclone config
to see the list of current remotes (rclone listremotes will also print just their names).
Bringing it all together
Now, you can run a cron job that calls a script. The first command in the script will connect to the DB, export the dataset into a CSV file, and place it in a location. The next command in the script will copy the file to Google Drive using rclone.
Here is my script
. /pihole.queries/sql/export_daily_data.sql
. /pihole.queries/scripts/upload_daily_file_gdrive.sh
Let’s open both of those steps. The first step above is actually a script with these contents:
sudo pihole-FTL sqlite3 /etc/pihole/pihole-FTL.db -header -csv \
"Select client.name, queries.*, datetime(timestamp, 'unixepoch', 'localtime') as datetime from queries, \
(select ip, max(id), name from client_by_id group by ip) client where client.ip = queries.client" \
> /pihole.queries/results/queries_with_ts_all_data.csv
The second step is a script with these contents:
rclone copy -vv --drive-allow-import-name-change --drive-import-formats csv \
/pihole.queries/results/ gdrive:pi-hole
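One caveat with chaining the two scripts: if the export step fails, the upload step would still run and could push a stale file. A small hedged sketch of how set -e stops a chain at the first failure, with echo commands standing in for the real steps:

```shell
# set -e makes the shell exit at the first failing command, so a
# failed "export" step prevents the "upload" step from ever running.
sh -c 'set -e; false; echo "upload step ran"' \
    || echo "export failed, upload skipped"
# prints: export failed, upload skipped
```

Adding set -e at the top of the wrapper script is a cheap way to get this behavior.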
This is it, folks! You have now configured your Pi-Hole to export the data you need and upload it to your Google Drive so you can have fun with Tableau. The best part is that the entire process is automated.
CAN’T GET ENOUGH?
Okay, so you want to do even more, or you have some questions about this process, like I did:
- Do I really want to export the entire data from pi.hole every day?
- How long will that take?
- Does it impact my pi.hole performance when I run the query?
- Could I just get the data for one day at a time and upload that incremental data?
- How big can this data get?
- Will it affect how much data I am using from my ISP every day by resending all the older data?
So here’s where I went even further: I have scripts that are customized to create a new file every day with a date-specific name. Here is that command:
pihole-FTL sqlite3 /etc/pihole/pihole-FTL.db -header -csv \
"Select client.name, queries.*, \
datetime(timestamp, 'unixepoch', 'localtime') as datetime from queries, \
(select ip, max(id), name from client_by_id group by ip) client \
where client.ip = queries.client \
and strftime('%Y-%m-%d', datetime(timestamp, 'unixepoch', 'localtime')) \
= strftime('%Y-%m-%d', datetime('now', 'localtime', '-1 days'))" \
> /pihole.queries/results/daily/queries_with_ts-"$(date +"%Y-%m-%d").csv"
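The last line above relies on shell command substitution to stamp the output filename with the current date. As a quick illustration of how that part expands on its own (the echo is just for demonstration):

```shell
# Build the date-stamped filename the same way the export command does:
# $(date +"%Y-%m-%d") expands to today's date before redirection happens.
fname="queries_with_ts-$(date +"%Y-%m-%d").csv"
echo "$fname"   # e.g. queries_with_ts-2024-05-01.csv
```

Because the SQL filters on yesterday's date while the filename uses today's, each daily file holds the previous day's completed queries.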
Conclusion
Benefits of Visualizing Pi-Hole Data with Tableau
Exporting the Pi-Hole data to a CSV file and visualizing it with Tableau can offer several benefits:
- Understand Network Traffic: By visualizing the Pi-Hole data in Tableau, you can gain insights into the types of traffic on your network and identify potential security risks.
- Monitor Performance: By tracking the amount of traffic and the types of requests being made, you can monitor the performance of your network and identify potential bottlenecks.
- Optimize Blocklists: By analyzing the data in Tableau, you can identify additional domains to block and optimize the blocklists to improve performance and privacy.
- Customize Visualizations: Tableau provides powerful visualization tools that allow you to create customized dashboards and reports to display the Pi-Hole data in a way that is most useful for you.
Benefits of Using Rclone
Using rclone to automatically upload the CSV file to your Google Drive offers several benefits:
- Automatic Backup: By automatically uploading the CSV file to your Google Drive, you can ensure that you always have a backup of the Pi-Hole data.
- Easy Sharing: By uploading the CSV file to your Google Drive, you can easily share the data with others or use it in other applications.
- Remote Access: By uploading the CSV file to your Google Drive, you can access the data from anywhere with an internet connection.