Alright, so today I decided to mess around with DuckDB and see if I could mount some external data sources. It sounded kinda cool, and honestly, I was just curious.
Getting Started
First things first, I fired up my terminal. Nothing fancy, just my regular setup. I already had DuckDB installed (if you don’t, it’s super easy, just check their website). I started by just playing around in the DuckDB CLI, you know, getting a feel for things.
The Experiment
My main goal was to see if I could access a CSV file I had lying around, like it was a regular DuckDB table. I remembered reading something about being able to “mount” files, so I went digging.

I found this command: INSTALL 'httpfs'; LOAD 'httpfs';
I had no idea what the command meant at the time; it turns out httpfs is the extension that lets DuckDB read files straight over HTTP. I simply typed it into my DuckDB command line, it spit out a bunch of messages, but nothing that looked like an error, so I assumed it worked. Good start!
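If you want to double-check that the extension really installed and loaded, DuckDB has a built-in duckdb_extensions() function you can query; something like this should do it:

-- Sanity check: httpfs should show up as installed and loaded
SELECT extension_name, installed, loaded
FROM duckdb_extensions()
WHERE extension_name = 'httpfs';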
Mounting the CSV
Then I found the key part: how to read remote files. The command looks like this:
SELECT * FROM read_csv_auto('https://...');
I just replaced https://... with my own CSV file path. I ran the query, and BAM! It worked! The data from my CSV was right there, like magic. I could query it, filter it, everything. Pretty neat!
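To give a flavour of what "query it, filter it" means here, this is the kind of thing you can run once the file is reachable (the URL and the amount column are made-up placeholders, not my actual data):

-- Filter and sort the remote CSV like any other table
SELECT *
FROM read_csv_auto('https://example.com/data.csv')  -- placeholder URL
WHERE amount > 100                                   -- placeholder column
ORDER BY amount DESC
LIMIT 10;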
Playing Around
Once I had the basic setup working, I started experimenting. I tried a few different CSV files, some bigger, some smaller. Everything seemed to work smoothly. I also had a Parquet file lying around, and pointing the query directly at it works fine too!

SELECT * FROM 'my_file.parquet';
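Since my whole goal was to treat external files like regular DuckDB tables, one nice follow-up (just a sketch here, with a made-up view name and a placeholder URL) is to wrap a remote file in a view so you can query it by name from then on:

-- Hypothetical view over a remote CSV; the URL is a placeholder
CREATE VIEW my_remote_data AS
    SELECT * FROM read_csv_auto('https://example.com/data.csv');

-- Now it behaves like any other table in the database
SELECT count(*) FROM my_remote_data;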
Wrapping Up
Overall, I was pretty impressed with how easy it was to mount and query external data with DuckDB. It’s definitely something I’ll keep in mind for future projects. It’s like having a super-powered data Swiss Army knife.
So that’s it, that’s my little adventure with DuckDB mounts. Not exactly rocket science, but a fun little experiment, and I learned something new. Hope you found it somewhat interesting, or at least not a complete waste of time!