Differentiating an integral that grows like log asymptotically














Suppose I have a continuous function $f(x)$ that is non-increasing and always stays between $0$ and $1$, and it is known that



$$ \int_0^t f(x)\, dx = \log t + o(\log t), \qquad t \to \infty.$$



Unfortunately one has no control over the error term $o(\log t)$ (other than what is implied by the above asymptotic behaviour and the properties of $f$). My question is whether it is possible to conclude that



$$ f(t) = \frac{1}{t} + o(t^{-1}), \qquad t \to \infty.$$



Update: as Raziel's response shows, the claim above does not hold in general, and the problem is related to de Haan theory (which I am not very familiar with). I would therefore like to ask whether it is still possible to find some $C > 0$ such that, for $t$ sufficiently large,



$$ \frac{1}{Ct} \le f(t) \le \frac{C}{t}.$$



---------Old follow-up question below; please ignore---------



If this is possible, a follow-up question is whether the claim can be extended to $f$ that has countably many jumps (and is continuous otherwise). Of course I will still be assuming that $f(x) \in [0,1]$ and that the function is non-increasing, which in particular means that the jumps are negative and the size of the jump at $x_i$ (if any) is bounded by $f(x_i)$.



(One may start by proposing a solution to the toy problem with $f$ being differentiable if it simplifies the problem and offers any useful insights.)
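For illustration, here is a minimal numerical sketch (Python with NumPy; the example $f(x)=\frac{1}{1+x}$ is chosen purely for illustration and is not meant to be representative) of one admissible $f$ for which both the hypothesis and the conjectured conclusion hold:

```python
import numpy as np

# Illustrative example only: f(x) = 1/(1+x) is continuous, non-increasing,
# takes values in (0, 1], and F(t) = int_0^t f(x) dx = log(1+t) = log t + o(log t).
f = lambda x: 1.0 / (1.0 + x)
F = lambda t: np.log1p(t)  # exact antiderivative of this particular f

for t in 10.0 ** np.arange(2, 12, 2):
    print(f"t = {t:.0e}:  F(t)/log(t) = {F(t) / np.log(t):.4f},  t*f(t) = {t * f(t):.4f}")
# Both ratios tend to 1, i.e. this particular f satisfies the hypothesis and the
# conjectured conclusion f(t) = 1/t + o(1/t); the answers below show that the
# implication nevertheless fails for other admissible f.
```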










pr.probability real-analysis ca.classical-analysis-and-odes probability-distributions asymptotics

asked 19 hours ago by random_person (edited 15 hours ago)

  • Am I missing something? Is this not l'Hospital's rule? – Venkataramana, 19 hours ago

  • @Venkataramana I could be wrong, but aren't you suggesting the converse of L'Hospital's rule (if the ratio of two functions has a limit, then the ratio of the derivatives has the same limit), which does not seem to be true in general? I have no control over $o(\log t)$ and just can't say much about its derivative, and I am not sure how L'Hospital may be applied. – random_person, 19 hours ago

  • This belongs to Karamata's Tauberian theory; it is a kind of monotone density theorem for de Haan classes, I believe. I'll look into the Bingham–Goldie–Teugels book later today and write more. – Mateusz Kwaśnicki, 18 hours ago
















2 Answers


















The answer is no, even in the smooth case. Take for example:



$$
f(x) = \frac{2}{x} + \frac{\cos(\log(x))}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
\int_0^t f(x)\, dx = C + 2\log(t) + \sin(\log(t)) = 2\log(t) + o(\log(t))
$$
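For illustration, a quick numerical check of this counterexample (a Python sketch assuming NumPy and SciPy are available; the integral is started at $x=1$ rather than $0$, which only shifts it by a constant, and the substitution $u=\log x$ keeps the quadrature trivial):

```python
import numpy as np
from scipy.integrate import quad

# The counterexample above; starting the integral at x = 1 instead of modifying
# f near 0 only changes F(t) by an additive constant.
f = lambda x: (2.0 + np.cos(np.log(x))) / x

for t in [1e2, 1e4, 1e6, 1e8]:
    # substitute u = log x:  int_1^t f(x) dx = int_0^{log t} (2 + cos u) du
    F, _ = quad(lambda u: 2.0 + np.cos(u), 0.0, np.log(t))
    print(f"t = {t:.0e}:  F(t)/(2 log t) = {F / (2 * np.log(t)):.4f},"
          f"  t*f(t) = {t * f(t):.4f}")
# F(t)/(2 log t) -> 1, so the integral is 2 log(t) + o(log(t)), yet
# t*f(t) = 2 + cos(log t) oscillates between 1 and 3 and has no limit,
# so f(t) is not of the form 2/t + o(1/t).
```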



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is simply a slowly varying function (as is any function asymptotic to $\log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular variation. Encyclopedia of Mathematics and its Applications, 27. Cambridge: Cambridge University Press (1989). ZBL0667.26003.






answered 18 hours ago by Raziel (edited 18 hours ago)













  • What about bounds? Is it possible to show that there exists some $C > 0$ such that $\frac{1}{Ct} \le f(t) \le \frac{C}{t}$ for $t$ sufficiently large? – random_person, 18 hours ago

  • The original asymptotic equation is equivalent to $f$ being in the right de Haan class. A two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though; I will type it later today on a computer, if you like. – Mateusz Kwaśnicki, 17 hours ago

  • @MateuszKwaśnicki Thanks, I am looking forward to your answer. – random_person, 15 hours ago

  • @random_person: I just started to type when Iosif Pinelis gave essentially the same construction. I can only add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein. – Mateusz Kwaśnicki, 15 hours ago



















The post by Raziel shows that the answer to the original question is no. The OP then asked, in a comment to that post, if one can still conclude that $f(t)\asymp\frac1t$ (as $t\to\infty$); as usual, $a\asymp b$ means here that $\limsup|\frac ab+\frac ba|<\infty$.



Let us show that the answer is still no. E.g., for $j=0,1,\dots$ let $t_j:=e^{j^2}$,
\begin{equation}
c_j:=\frac{\ln t_{j+1}-\ln t_j}{t_{j+1}-t_j}\sim\frac{2j}{t_{j+1}} \tag{1}
\end{equation}
(as $j\to\infty$), and
\begin{equation}
f(x):=c_j\quad\text{for}\quad x\in[t_j,t_{j+1}),
\end{equation}
with $f:=c_0=\frac1{e-1}$ on $[0,t_0)$.
Let also $F(t):=\int_0^t f(x)\,dx$.

Then $f$ is nonincreasing, $0<f\le1$, $F(t_j)=c_0+\ln t_j\sim c_0+\ln t_{j+1}=F(t_{j+1})$, whence $F(t)\sim\ln t$ (as $t\to\infty$), whereas $f(t_{j+1}-)=c_j$ is much greater than $\frac1{t_{j+1}}$, by (1).
We also see that $f(t_j)=c_j$ is much less than $\frac1{t_j}$, again by (1).
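For illustration, a small numerical check of these two claims (Python with NumPy; $j$ is capped at $12$ here only so that $t_{j+1}=e^{169}$ stays within double-precision range):

```python
import numpy as np

# The step-function construction above: t_j = e^{j^2},
# c_j = (ln t_{j+1} - ln t_j) / (t_{j+1} - t_j), and f = c_j on [t_j, t_{j+1}).
J = 12
t = np.exp(np.arange(J + 2, dtype=float) ** 2)           # t_0, ..., t_{J+1}
c = (np.log(t[1:]) - np.log(t[:-1])) / (t[1:] - t[:-1])  # c_0, ..., c_J

# F(t_j) = c_0*t_0 + sum_{i<j} c_i*(t_{i+1}-t_i), which telescopes to c_0 + ln t_j.
F = c[0] * t[0] + np.concatenate(([0.0], np.cumsum(c * (t[1:] - t[:-1]))))
print("max |F(t_j) - (c_0 + ln t_j)| =", np.max(np.abs(F - (c[0] + np.log(t)))))

for j in (4, 8, 12):
    # f(t_{j+1}-) * t_{j+1} grows without bound, while f(t_j) * t_j tends to 0.
    print(f"j={j:2d}:  t_(j+1)*f(t_(j+1)-) = {t[j+1] * c[j]:.3f},"
          f"  t_j*f(t_j) = {t[j] * c[j]:.3e}")
```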



The only condition missed here is the continuity of $f$, as $f$ is not left-continuous at $t_{j+1}$ for $j=0,1,\dots$. This omission is quite easy, but tedious, to fix by approximation. For instance, one can replace the above $f$ on every interval $[t_{j+1}-c_02^{-j},t_{j+1}]$ by the linear interpolation of $f$ on the same interval. Then instead of the value $c_0+\ln t_{j+1}$ of $F(t_{j+1})$ we will have $b_j+\ln t_{j+1}\sim c_0+\ln t_j=F(t_j)$ for some $b_j\in[0,c_0]$, and instead of $f(t_{j+1}-)=c_j$ being much greater than $\frac1{t_{j+1}}$, we will have that $f(t_{j+1}-c_0)=c_j$ is much greater than $\frac1{t_{j+1}-c_0}$.






answered 15 hours ago by Iosif Pinelis (edited 13 hours ago)













  • Thanks for the nice counter-example. Would it still be possible to establish the upper bound $f(t) \le \frac{C}{t}$, though? – random_person, 15 hours ago

  • @random_person : This very example shows that the upper bound $\frac Ct$ on $f(t)$ is impossible in general, as we have $f(t_{j+1}-)/\frac1{t_{j+1}}\to\infty$. – Iosif Pinelis, 15 hours ago

  • Oh, I have asked a dumb question. I actually want to ask if a lower bound $f(t) \ge \frac{1}{Ct}$ is possible. – random_person, 15 hours ago

  • @random_person : I have now added a sentence showing that, in the same example, the lower bound $\frac1{Ct}$ on $f(t)$ is not possible either. – Iosif Pinelis, 15 hours ago

  • I am feeling so embarrassed that I missed this observation... thank you so much for your patience, and again for your counter-example. – random_person, 15 hours ago










