Jekyll2021-01-25T10:43:11-08:00https://alexisakira.github.io/feed.xmlAlexis Akira Todapersonal descriptionAlexis Akira TodaGPIF Award2021-01-25T00:00:00-08:002021-01-25T00:00:00-08:00https://alexisakira.github.io/posts/2021/01/blog-post<p>It is a great honor to announce that I have been awarded the 4th
<a href="https://www.gpif.go.jp/en/investment/research/awards/announcement_04.html">Government Pension Investment Fund (GPIF) Finance Award</a>
under the auspices of the Ministry of Health, Labour and Welfare and the Ministry of Education, Culture, Sports, Science and Technology of Japan.
I am very happy that my research has been recognized.</p>Alexis Akira TodaHomothetic theory of the saving rate of the rich2021-01-14T00:00:00-08:002021-01-14T00:00:00-08:00https://alexisakira.github.io/posts/2021/01/blog-post<p>My paper “<a href="https://doi.org/10.1016/j.jet.2021.105193">A Theory of the Saving Rate of the Rich</a>”
was accepted at <a href="https://www.sciencedirect.com/journal/journal-of-economic-theory">Journal of Economic Theory</a>. I started this project in early 2020.
Initially, the paper was about rigorously establishing the asymptotic linearity of policy functions when preferences are homothetic and
the constraint is asymptotically homogeneous of degree 1. The result is not surprising, but the proof is difficult. As my coauthor and I worked on the proof, we
discovered that the asymptotic slope of the policy function can be zero, which was surprising. When the asymptotic marginal propensity to consume (MPC) is zero,
an infinitely wealthy agent saves 100% of their wealth, which can explain the empirical puzzle that wealthy people save a lot even though it seems unnecessary.
So instead of focusing on just a mathematical fact, we decided to frame the paper as a new theory of the saving rate of the rich.</p>
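<p>A one-line heuristic for this mechanism (my own notation, not taken from the paper): if the consumption function \(c(w)\) is asymptotically linear in wealth \(w\) with asymptotic MPC \(\bar{c}\), then the asymptotic saving rate out of wealth is pinned down by \(\bar{c}\):</p>

```latex
% Heuristic sketch in my own notation (not the paper's):
% c(w) = consumption at wealth w, \bar{c} = asymptotic MPC.
\begin{equation*}
  c(w) = \bar{c} w + o(w) \quad (w \to \infty)
  \quad \Longrightarrow \quad
  \lim_{w \to \infty} \left(1 - \frac{c(w)}{w}\right) = 1 - \bar{c}.
\end{equation*}
```

<p>So a zero asymptotic MPC (\(\bar{c} = 0\)) corresponds to an asymptotic saving rate of 100%, without any appeal to non-homothetic preferences.</p>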
<p>Some economists make a big deal about non-homothetic preferences to explain the high saving rate of the rich. According to our theory, non-homotheticity is
unnecessary. It is nice to have an intuition, but sometimes the intuition is incorrect, as in this case, where homotheticity can generate a zero asymptotic MPC.
This is a good example of why any theoretical argument should be based on theorems and proofs, not on intuition.</p>Alexis Akira TodaFinancial lesson from 20202021-01-01T00:00:00-08:002021-01-01T00:00:00-08:00https://alexisakira.github.io/posts/2021/01/blog-post<p>I have been investing for over 20 years. After learning about the capital asset pricing model and the mutual fund theorem and reading “<a href="https://www.amazon.com/Random-Walk-Down-Wall-Street/dp/0393358380">A Random Walk Down Wall Street</a>” and “<a href="https://www.amazon.com/Stocks-Long-Run-Definitive-Investment/dp/0071800514">Stocks for the Long Run</a>”, I have been more or less consistently investing in low-cost index ETFs such as <a href="https://investor.vanguard.com/etf/profile/VTI">VTI</a> and <a href="https://investor.vanguard.com/etf/profile/VXUS">VXUS</a>. This allowed me to stay in the market at the bottom in March 2009 and not miss the bull market since then, despite some of my colleagues advising me that stocks were overpriced. My kids’ 529 funds have roughly tripled in nominal value. 
I have been maxing out my 403b, 457b, and Roth IRA contributions and joking that I could retire if I chose to. Based on theory and experience, I preach the importance of passive investing to students in my finance class.</p>
<p>However, I have to confess that I made my worst financial mistakes in 2020. During the COVID crash in March 2020, I bought stocks with extra cash. This was a good decision. But then the stock market rebounded, and because the government imposed lockdowns, I became convinced that there would be a second stock market crash, so in April 2020 I liquidated my Roth IRA stock position. In addition, I bought put options on the S&amp;P 500 index. Of course the stock market kept rising, my put options expired out of the money, and I missed the recovery while sitting on cash. The good news is that I only touched my Roth IRA account, whose value was much smaller than that of my other accounts.</p>
<p>Based on this hard lesson, I have changed my investment strategy slightly. Instead of investing in ETFs, I now make automatic contributions to an old-fashioned mutual fund, namely <a href="https://investor.vanguard.com/mutual-funds/profile/VTWAX">VTWAX</a>. I know ETFs are more efficient, but their liquidity tempts me to trade actively. I view old-fashioned mutual funds as a commitment device for passive investing.</p>Alexis Akira TodaNecessity of HARA for the concavity of consumption functions2020-12-18T00:00:00-08:002020-12-18T00:00:00-08:00https://alexisakira.github.io/posts/2020/12/blog-post<p>My paper “<a href="https://doi.org/10.1016/j.jmateco.2020.102460">Necessity of Hyperbolic Absolute Risk Aversion for the Concavity of Consumption Functions</a>”
was accepted at <a href="https://www.journals.elsevier.com/journal-of-mathematical-economics">Journal of Mathematical Economics</a>. The publication process was
quite efficient. I came up with the idea in late September 2020 and wrote a short paper. After getting rejected from a different journal, I sent it to JME.
I am very happy that it came out less than three months after I started the project. Journals in economics tend to be very slow in the review
process, perhaps because many papers tend to be long and unfocused. We should all write concise and focused papers.</p>
<p>This is my 30th published paper. I hope to maintain my productivity.</p>Alexis Akira TodaGreat Barrington Declaration2020-10-07T00:00:00-07:002020-10-07T00:00:00-07:00https://alexisakira.github.io/posts/2020/10/blog-post<p>I have just signed the <a href="https://gbdeclaration.org/">Great Barrington Declaration</a>.</p>Alexis Akira TodaMy unpublished COVID-19 paper is now my most cited paper2020-09-29T00:00:00-07:002020-09-29T00:00:00-07:00https://alexisakira.github.io/posts/2020/09/blog-post<p>Recently, my working paper <a href="https://arxiv.org/abs/2003.11221">Susceptible-Infected-Recovered (SIR) Dynamics of COVID-19 and Economic Impact</a> has surpassed my <a href="https://doi.org/10.1016/j.jebo.2012.04.012">JEBO paper</a> in terms of citation counts, and has become my most cited paper. My COVID-19 paper is one of the very first written by an economist on this topic, and it appeared in the first issue of the working paper series <a href="https://cepr.org/sites/default/files/news/CovidEcon1%20final.pdf">Covid Economics</a>. Although I am no longer working on this paper because the COVID-19 situation was changing too quickly to keep up with (especially in March 2020, when I wrote the paper), I am glad that it has made some impact. 
In fact, it was featured in <a href="https://voxeu.org/article/early-draconian-social-distancing-may-be-suboptimal-fighting-covid-19-epidemic#">VoxEU</a> and <a href="https://fortune.com/2020/05/04/reopening-reopen-economy-coronavirus-covid-19-lifting-lockdown-economists">Fortune</a> articles.</p>
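<p>For reference, the textbook SIR dynamics in the paper’s title can be sketched as follows. This is a generic forward-Euler discretization with made-up parameter values, not the paper’s calibration:</p>

```python
# Textbook SIR dynamics (generic sketch; parameters are illustrative,
# NOT the paper's calibration). beta = transmission rate, gamma =
# recovery rate, basic reproduction number R0 = beta / gamma.
def sir(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=300, dt=0.1):
    """Integrate dS = -beta*S*I, dI = beta*S*I - gamma*I, dR = gamma*I."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
    return s, i, r

s, i, r = sir()
# New infections decline once S < gamma/beta (here 1/3, the herd-immunity
# threshold); the epidemic then burns out with S below that threshold.
print(f"final susceptible share: {s:.3f}")
```

<p>A standard property of this model is that, absent intervention, the epidemic ends only after the susceptible share falls below the herd-immunity threshold \(\gamma/\beta\), which is the sense in which mitigation delays rather than prevents herd immunity.</p>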
<p>Although the data analysis in my COVID-19 paper is now completely outdated, I believe its message is still valid. From the very beginning of the COVID-19 epidemic, I have been consistently skeptical of drastic mitigation measures such as the closure of schools, restaurants, and other businesses. These measures work only in totalitarian countries where the government has the ability to enforce the lockdown, and they probably only appear to work in such countries because totalitarian regimes do not report cases accurately anyway. It is unfortunate that the US (in particular, California) has (at least initially) implemented draconian measures. Although I have no ability to do research in applied microeconomics, I am sure one can estimate the causal effects of lockdowns on economic outcomes, educational achievement, suicide, etc., and those negative effects will be big. The problem with lockdowns is that they ignore the tradeoff between lives and everything else, and it is not even clear whether they save lives (because the pandemic will not end until we achieve herd immunity, and lockdowns only delay it). I think policy makers should focus on mild measures such as mask-wearing and hand-washing, which have clear benefits and little implementation cost. Though not many people are saying so openly, I think we are moving in that (right) direction.</p>Alexis Akira TodaBerkeley banning junk food in checkout aisles is nonsense paternalism2020-09-25T00:00:00-07:002020-09-25T00:00:00-07:00https://alexisakira.github.io/posts/2020/09/blog-post<p>As a long-time California resident, I am very well aware of government red tape. But the recent move by the city of <a href="https://sanfrancisco.cbslocal.com/2020/09/23/berkeley-healthy-checkout-ordinance-junk-food-ban-grocery-store-checkout-aisle/">Berkeley to ban placing junk food and beverages at checkout aisles</a> is complete nonsense.</p>
<p>There are two reasons. First, this type of regulation unnecessarily constrains business owners and indirectly increases the cost to customers.</p>
<p>Second, and more importantly, we have the right to do whatever we want as long as we don’t interfere with other people’s freedom. Although I almost never eat junk food and I don’t think it is wise to consume a lot of it, I don’t care if others eat junk food because they have every right to do so, and their consumption of junk food causes me no harm at all. If somebody becomes obese or gets diabetes from eating junk food, that’s their problem, not mine. This is a good example for thinking about negative externalities. Junk food and tobacco may be (and I believe are) bad for health. However, a crucial difference between junk food and tobacco is that while smoking causes harm to others through passive smoking, I can think of no reason why junk food would cause harm to others. Thus, while I would understand regulating the use or taxing the sale of tobacco products, I see no justification for regulating junk food.</p>
<p>Some people are paternalistic and believe that what is bad for them is also bad for others, and thus should be regulated. They should wake up. Because we all have different preferences and beliefs, something bad for you need not be bad for others. No regulation is needed unless there is an externality.</p>Alexis Akira TodaDifference between `$$…`, `\[…`, and `\begin{equation*}…`2020-09-15T00:00:00-07:002020-09-15T00:00:00-07:00https://alexisakira.github.io/posts/2020/09/blog-post<p>I have been using \(\LaTeX\) for over 20 years now. When I write an unnumbered displayed equation on a single line, I have been using <code class="language-plaintext highlighter-rouge">$$...$$</code> because it was simple. I didn’t understand why some people use <code class="language-plaintext highlighter-rouge">\[...\]</code>, because the latter takes more time to type and is not obviously easier to read. Today I read <a href="https://tex.stackexchange.com/questions/503/why-is-preferable-to">this article</a> and learned that <code class="language-plaintext highlighter-rouge">$$...$$</code> is incorrect. From now on, I will switch to <code class="language-plaintext highlighter-rouge">\begin{equation*}...\end{equation*}</code> because it is easy to read and we can add equation numbering by deleting the <code class="language-plaintext highlighter-rouge">*</code> if we change our mind.</p>Alexis Akira TodaNaming things after each other2020-09-04T00:00:00-07:002020-09-04T00:00:00-07:00https://alexisakira.github.io/posts/2020/09/blog-post<p>I have recently read the article “<a href="http://nautil.us/issue/89/the-dark-side/why-mathematicians-should-stop-naming-things-after-each-other">Why Mathematicians Should Stop Naming Things After Each Other</a>”. The same logic seems to apply to economics.</p>
<p>A long time ago when I read McKenzie’s paper “<a href="https://dx.doi.org/10.2307/1907539">On Equilibrium in Graham’s Model of World Trade and Other Competitive Systems</a>”, my first impression was that the title was bad. It does not say what properties of equilibrium the paper is about; the paper has nothing to do with international trade; but above all, I had no idea what “Graham’s model” meant. Maybe <a href="https://en.wikipedia.org/wiki/Frank_Dunstone_Graham">Graham</a> was well-known in the 1940s, but I wonder how many economists remember him now.</p>
<p>One lesson may be that if we want our papers to still be readable in the distant future, we should not use concepts named after people, unless the names are already very well established, such as Nash equilibrium or Pareto distribution. Instead of the Bewley-Huggett-Aiyagari model (or any subset of these names), we should use “heterogeneous-agent model”; instead of the Krusell-Smith model, we should use “heterogeneous-agent model with aggregate risk”, etc. But if we insist on this direction, we would have to use “recursive preferences with constant relative risk aversion and constant elasticity of intertemporal substitution” instead of Epstein-Zin preferences, which may be too long. (If so, we may perhaps use “CES (constant elasticity of substitution) recursive preferences”.)</p>Alexis Akira TodaInequality on spectral abscissa2020-08-12T00:00:00-07:002020-08-12T00:00:00-07:00https://alexisakira.github.io/posts/2020/08/blog-post<p>Let \(A, B\) be square (complex) matrices such that \(|B| \le A\). Then it is well known that \(\rho(B) \le \rho(|B|) \le \rho(A)\), where \(\rho\) denotes the <a href="https://en.wikipedia.org/wiki/Spectral_radius">spectral radius</a> (largest absolute value of all eigenvalues). See, for example, Theorem 8.4.5 of <a href="https://www.amazon.com/Matrix-Analysis-Second-Roger-Horn/dp/0521548233">Horn and Johnson (2013)</a>. In my recent <a href="https://arxiv.org/abs/2009.08010">paper</a>, we needed the spectral abscissa (largest real part of all eigenvalues) instead of the spectral radius. By analogy, we can ask the following question: if \(A, B\) are square complex matrices such that \(\mathrm{Re}\, b_{nn} \le a_{nn}\) for all \(n\) and \(|b_{nn'}| \le a_{nn'}\) for all \(n \neq n'\), is it true that \(\zeta(B) \le \zeta(A)\), where \(\zeta\) denotes the spectral abscissa?</p>
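<p>As a quick numerical sanity check (my own sketch using NumPy, not code from the paper), one can probe the inequality on a random example; the comparison matrix below takes real parts on the diagonal and moduli off the diagonal, following the notation above:</p>

```python
# Numerical sanity check of the spectral abscissa inequality
# (my own sketch; notation follows the post, code is not from the paper).
import numpy as np

def spectral_abscissa(M):
    """Largest real part of the eigenvalues of M."""
    return np.max(np.linalg.eigvals(M).real)

rng = np.random.default_rng(42)
n = 4

# Random complex matrix B.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Comparison matrix C: c_nn = Re b_nn on the diagonal, |b_nn'| off it.
C = np.abs(B)
np.fill_diagonal(C, np.diag(B).real)

# Dominating matrix A: a_nn >= Re b_nn and a_nn' >= |b_nn'| entrywise.
A = np.abs(B) + rng.random((n, n))
np.fill_diagonal(A, np.diag(B).real + rng.random(n))

zB, zC, zA = map(spectral_abscissa, (B, C, A))
assert zB <= zC + 1e-8 and zC <= zA + 1e-8
```

<p>The second inequality \(\zeta(C) \le \zeta(A)\) follows from entrywise dominance of matrices with nonnegative off-diagonal entries (shift by a large multiple of the identity and apply the spectral radius result).</p>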
<p>It turns out that the answer is yes. The main part of the proof uses the fact that \(\zeta(B) \le \zeta(C)\), where \(C\) is the matrix with diagonal entries \(c_{nn} = \mathrm{Re}\, b_{nn}\) and off-diagonal entries \(c_{nn'} = |b_{nn'}|\). This result is Corollary 1 of <a href="https://doi.org/10.1016/0022-247X(75)90038-4">Deutsch (1975)</a>. The reason I am writing this post is that this result is not well known: in fact, Deutsch (1975) has been cited only once, and it’s a self-citation! This is a good example of how a theorem is eternal and may be useful to somebody in the far distant future.</p>Alexis Akira Toda