From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Joshua Berkus <josh(at)agliodbs(dot)com>
Cc: Stephen Frost <sfrost(at)snowman(dot)net>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Potential autovacuum optimization: new tables
Date: 2012-10-13 20:05:53
Message-ID: 19389.1350158753@sss.pgh.pa.us
Lists: pgsql-hackers
Joshua Berkus <josh(at)agliodbs(dot)com> writes:
> I've been going over the notes and email archives from the period
> where Matt O'Connor and I arrived at the current settings. All of our
> testing was devoted to autovacuum, not autoanalyze.
> Our mistake was assuming that the same formula which worked well for
> vacuum would work well for analyze.
Ah. Okay, maybe we can agree that that wasn't a good idea.
> So, problem #1 is coming up with a mathematical formula. My initial
> target values are in terms of # of rows in the table vs. # of writes
> before analyze is triggered:
> 1 : 3
> 10 : 5
> 100 : 10
> 1000 : 100
> 100000 : 2000
> 1000000 : 5000
> 10000000 : 25000
> 100000000 : 100000
I don't really see that we need to bend over backwards to exactly match
some data points that you made up out of thin air. How about
ceil(sqrt(N)) to start with?
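As a quick comparison, the suggested ceil(sqrt(N)) can be tabulated against the target values proposed above (a minimal Python sketch; the variable names are illustrative, not from the thread):

```python
import math

# Josh's proposed thresholds: rows in table -> writes before analyze triggers
targets = {
    1: 3,
    10: 5,
    100: 10,
    1000: 100,
    100000: 2000,
    1000000: 5000,
    10000000: 25000,
    100000000: 100000,
}

for n, target in targets.items():
    # Tom's suggested starting formula
    suggested = math.ceil(math.sqrt(n))
    print(f"N={n:>10}  target={target:>7}  ceil(sqrt(N))={suggested:>6}")
```

The formula tracks the proposed values loosely: it triggers sooner on mid-sized tables (e.g. 32 vs. 100 at N=1000) and later on very large ones (10000 vs. 100000 at N=100000000), which is the kind of rough fit being discussed rather than an exact match to the data points.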
regards, tom lane