chaz6 18 hours ago [-]
When I got the update I looked through the settings and there appears to be no way to disable it. I do not want AI anywhere near my database. I only use it for testing/staging, so at least I hopefully don't have to worry about it wrecking production.
It is nice that they have the default set to "None". However, having this feature in pgAdmin at all is a distraction from the project.
If it is just calling an API anyway, then I don't want to have this in my db admin tool. It also exposes additional surface area for potential data leakage.
bensyverson 18 hours ago [-]
Worth pointing out that Postgres is perfectly usable without an admin dashboard at all
smartbit 18 hours ago [-]
Note: AI features must be enabled in the server configuration
LLM_ENABLED = True
in config.py for these preferences to be available.
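For context, a minimal sketch of that setting, assuming the common pgAdmin pattern of putting deployment-specific overrides in `config_local.py` rather than editing `config.py` directly:

```python
# config_local.py -- sketch of a deployment-specific override file.
# Assumption: your pgAdmin install reads overrides from config_local.py
# on top of the shipped config.py defaults.
LLM_ENABLED = False  # keep the AI assistant disabled server-wide
```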
OptionOfT 16 hours ago [-]
I did not enable this and yet I got the panel in the UI.
ziml77 18 hours ago [-]
What's the danger? It can see the schemas to help it generate the queries but it can't run anything on its own. Also you have to give the application credentials to an AI provider for the feature to work. So, you can just not do that.
adamas 18 hours ago [-]
There don't need to be concrete dangers to not want non-deterministic features in an application.
justinclift 11 hours ago [-]
> What's the danger?
Hallucinated ideas about what needs doing, what commands to run, etc.
So, data that's no longer reliable (ie could be subtly changed), or even outright data loss.
Natfan 5 hours ago [-]
just don't accept bogus changes it suggests? this is why having a human in the loop is a very good idea
giancarlostoro 10 hours ago [-]
If you aren't using it, is it even a danger?
lateforwork 16 hours ago [-]
Did you miss this:
"This feature requires an AI provider to be configured in Preferences > AI."
Yeah, no thanks. I switched to dbeaver already anyway, because pgadmin was picky about which postgres versions it could connect to. Too much of a hassle to set up a new version from source back when I tried. With dbeaver I just run ./dbeaver from the extracted .tar.gz. dbeaver is also not a web interface, but a real desktop application (Java, though).
Click on the "Reset layout" button in the query tool (located in the top right corner), and it will move the "AI Assistant" tab to the right. Now, when you query a table, it will default to the Query tab as always.
webprofusion 6 hours ago [-]
I've used similar with SQL Server Management Studio (GH copilot) and it's pretty useful for database work and gnarly queries.
This is great, but I'd prefer to see a refit of their UI first, it's currently a bit slow and looks prehistoric.
Fuzzwah 12 hours ago [-]
While everyone else is posting top level comments about which tools they're using rather than pgAdmin, I've been a huge fan of Beekeeper Studio since I tried out a range of postgresql db apps such as DBeaver, Postico, etc a few years ago.
I was on the prowl for a new DB Management tool, after pgAdmin 4 shifted to their web based client crap.
I never came across this. I found DBeaver and have been using it since.
SOLAR_FIELDS 10 hours ago [-]
I found DBGate to be a pretty good cross platform FOSS option
aitchnyu 18 hours ago [-]
Might as well let us choose our own AI subscription for our tools. I always hated the sparkle icons in MongoDB Compass (db browsing tool), CloudWatch (logs), etc., which are wired to a useless model. So I always chose to write Python scripts to query Postgres and other DBs and render pretty tables in the CLI.
zbentley 17 hours ago [-]
Eh, as someone generally on the skeptical end of the spectrum for a lot of AI-assisted ops tasks, exploratory query generation is a great use case for it.
I’m highly proficient in code, only average at SQL, and am routinely tasked to answer one-off questions or prototype reporting queries against highly complex schemas of thousands of tables (owned by multiple teams and changing all the time, with wildly insufficient shared DAO libraries or code APIs for constructing novel queries). My skill breakdown and situation aren’t optimal, certainly, but they aren’t uncommon either.
In that context, being able to ask "write a query that returns the last ten addresses of each of the highest-spending customers, but only if those addresses are in the shipment system and are residences, not businesses" is a huge time-saver. Like, I could figure out the schemas of the ten tables involved in those queries and write those joins by hand, slowly. That would take time and, depending on how the data and queries change, the approach might get stale fast.
jplaz 15 hours ago [-]
Switched from DBeaver to DataGrip and I couldn't be happier.
swasheck 10 hours ago [-]
i want to love datagrip but it's big, slow, memory-hungry, and presents an unfamiliar paradigm compared to most tools i've used for admin tasks. other than this last issue, do you have any suggestions for streamlining the experience?
stuaxo 17 hours ago [-]
If I can use this with a local LLM it could be useful.
zbentley 17 hours ago [-]
Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.
kay_o 15 hours ago [-]
Ollama is included by default; just add the endpoint URL yourself.
msavara 16 hours ago [-]
No thank you. One of the worst ads for python that exists. The only one worse than pgAdmin is Windows 11.
allthetime 14 hours ago [-]
postico is really nice on macos
testbjjl 9 hours ago [-]
Now I don’t need to copy, paste, take screenshots or use Claude? This will save me minutes per year.
david_iqlabs 11 hours ago [-]
[flagged]
david_iqlabs 11 hours ago [-]
[flagged]
naranha 17 hours ago [-]
The only interface that works efficiently for me with LLMs is the chatbot interface. I'd rather copy and paste snippets into the chat box than have IDEs and other tools guess what I might want to ask the AI.
The first thing I do with these integrations is look for how I can remove them.
And then you have to supply an API key (see here https://www.pgedge.com/blog/ai-features-in-pgadmin-configura... )
You don't get AI for free!
https://www.beekeeperstudio.io