**Mathematicization of Microeconomics: The Principle of Using Quantitative Techniques**

Microeconomics is the study of how individuals and firms make resource-allocation decisions to maximize profit. One might classify those activities as purely human behavior. Thus, before any application of mathematics to the economic market, we should validate the rationale behind it.

Even though economic behavior carries a lot of noise, it can be abstracted into simpler patterns once proper assumptions are made. The best-known example may be stock price fluctuation over a short period.

In economic theory, the random price fluctuation over a short time corresponds to the differing transaction prices agreed between the most recent buyers and sellers; with thousands of parties trading simultaneously at every moment, stock prices exhibit randomness.

From observing that randomness, people have found, perhaps surprisingly, that if we treat every price change as occurring over a time interval *t* → 0, the pattern is well described by the Wiener process.

The abstraction of stock randomness into the Wiener process is the foundation of the Black-Scholes formula, one of the most useful tools in derivative pricing. The relationship between mathematics and the economy thus runs from human behavior to scientific principles and back to human behavior again. The transformation between these stages is what quantitative developers and researchers should focus on.
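As a minimal sketch of this abstraction, a Wiener process can be simulated by summing independent Gaussian increments whose standard deviation scales with √dt, and a stock path can then be built on top of it via geometric Brownian motion, as in the Black-Scholes setup. The function names below (`wiener_path`, `gbm_price`) are illustrative, not from any particular library.

```python
import math
import random

def wiener_path(n_steps: int, dt: float, seed: int = 0) -> list[float]:
    """Simulate a standard Wiener process W_t on a grid of n_steps points."""
    rng = random.Random(seed)
    w = [0.0]  # W_0 = 0 by definition
    for _ in range(n_steps):
        # Increments are independent N(0, dt): std dev scales with sqrt(dt).
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return w

def gbm_price(s0: float, mu: float, sigma: float,
              n_steps: int, dt: float, seed: int = 0) -> list[float]:
    """Geometric Brownian motion: S_t = s0 * exp((mu - sigma^2/2) t + sigma W_t)."""
    w = wiener_path(n_steps, dt, seed)
    return [s0 * math.exp((mu - 0.5 * sigma ** 2) * (i * dt) + sigma * w_i)
            for i, w_i in enumerate(w)]
```

With `dt = 1/252` (one trading day), a 100-step path gives a plausible-looking short-horizon price series that stays strictly positive, as GBM guarantees.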

**Mathematical Model Construction and Method Application**

The two major components of this transformation are the use of mathematical models and of mathematical methods, corresponding respectively to the static description of a product and the dynamic process for dealing with it.

Take the risk management field as an example.

Corporations would like to measure the underlying risk of a financial decision.

To quantify the maximum loss under general circumstances, they build a Value-at-Risk (VaR) model that estimates, at a confidence level *c*, how much could be lost in the worst case: *P*(*w* < *w*₀ − VaR) = 1 − *c*, where *w*₀ is the initial wealth and *w* is the final payoff. Regardless of how VaR performs in practice, it is a solid description of risk in financial decision-making. How to compute it introduces our next concept, method application.

It is common in the real world that people have no distribution model describing the payoff, which makes calculating the above probability extremely difficult. In that case, we can use an evolving historical simulation to estimate the current VaR: since we have past data for our financial product, we can sort the historical returns and take the worst (1 − *c*) quantile as our current VaR.

This process is very flexible, as we can change the weights on historical data if we believe old data are less reliable. My point is that model and method selection and application should follow the realistic situations of the economic world; these approaches aim to make the market more efficient.

Another well-known instance is Bayes' theorem in machine learning. The equation is *p*(*θ*|*D*) = *p*(*D*|*θ*) *p*(*θ*) / *p*(*D*), or equivalently *p*(*θ*|*D*) ∝ *p*(*D*|*θ*) *p*(*θ*). This expression says that the posterior distribution can be updated from the prior distribution and the data. With a proper shrinkage operator and cross-validation techniques, we can evolve a mathematical model toward better parameter estimation, which leads to better model performance.
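As a concrete instance of this updating rule, the beta-binomial model is conjugate: a Beta(α, β) prior combined with binomial data of *s* successes and *f* failures yields a Beta(α + *s*, β + *f*) posterior in closed form, directly from *p*(*θ*|*D*) ∝ *p*(*D*|*θ*) *p*(*θ*). This is a minimal illustrative sketch; the function names are my own.

```python
def beta_binomial_update(alpha: float, beta: float,
                         successes: int, failures: int) -> tuple[float, float]:
    """Conjugate Bayesian update: Beta(alpha, beta) prior + binomial data
    -> Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha: float, beta: float) -> float:
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)
```

Starting from a uniform Beta(1, 1) prior and observing 7 successes in 10 trials, the posterior is Beta(8, 4), whose mean 8/12 ≈ 0.667 is the data update of the prior mean 0.5.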

Moreover, it goes without saying that the model or method itself is subject to real-world circumstances. For example, LASSO, Ridge Regression, and Elastic Net are all considered valid shrinkage operators, so the selection of a constraint cannot be based on mathematical reasonableness alone but must also rest on case-specific performance.
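To make the contrast between these shrinkage operators concrete, here is a minimal one-dimensional, no-intercept regression sketch (an illustrative simplification, not a full implementation): the L2 penalty of Ridge shrinks the coefficient smoothly toward zero, while the L1 penalty of LASSO soft-thresholds it and can set it to exactly zero.

```python
def ols_slope(x: list[float], y: list[float]) -> float:
    """Ordinary least squares slope for y ~ b*x (no intercept)."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / sxx

def ridge_slope(x: list[float], y: list[float], lam: float) -> float:
    """Minimize sum (y - b x)^2 + lam * b^2: L2 shrinks smoothly."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)

def lasso_slope(x: list[float], y: list[float], lam: float) -> float:
    """Minimize sum (y - b x)^2 + lam * |b|: L1 soft-thresholds,
    so small coefficients snap to exactly zero."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    b = sxy / sxx
    t = lam / (2 * sxx)  # soft-threshold level
    if b > t:
        return b - t
    if b < -t:
        return b + t
    return 0.0
```

On data with true slope 2, Ridge only pulls the estimate toward zero, while a large enough LASSO penalty zeroes it out entirely; which behavior is preferable is exactly the case-specific question raised above.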

**The Future of the Quantitative Field**

I believe the shift from human decision-making to computer automation in the economic market will continue, especially on the microeconomic side. If we admit that society's evolution is an irresistible process, then the use of quantitative techniques effectively reduces alpha from the whole-market perspective. As more transactions are made by designed algorithms, traditional trading patterns will lose their competitiveness and exit the market, which means the market as a whole will become a better booster for society.

However, there are concerns about such evolution.

The current state of the HFT market shows that overuse of techniques in the financial market can lead to an arms race in capital speculation. Some HFT algorithms tend to trick other algorithms into false signals (for example, false judgments in iceberg-order detection), provoking bad moves.

Besides, billions in equipment investment in computers and electric cables is made just to cut transaction latency by microseconds. These behaviors can be harmful to the market, especially to start-up companies. Therefore, regulation is important in the quantitative field, and developers should be very careful not to waste resources in alpha-diminishing areas.

**Written by Dennis Deng**
