Optimizing the measurement of the impact of market structure on liquidity and volatility

Liquidity and volatility are the two barometers by which stock markets are assessed in terms of attractiveness, profitability and efficiency. Several macroeconomic and microstructure variables condition the level of liquidity, which directly affects the asset allocation decisions of different investor profiles (institutional and individual) and therefore the dynamics of the market as a whole. Volatility is the regulatory component that provides information on the level of risk characterizing the market. The appreciation of these two elements is therefore of considerable help to fund managers looking to optimize their equity pockets. In this work, we use the liquidity ratio as a proxy variable for the liquidity of the Moroccan stock market, in order to estimate the indicators and factors that determine its short- and long-term variability. The appropriate econometric method is to estimate a vector error correction model (VECM), which has the property of determining the long- and short-term relationships between the variables. The volatility of the MASI index is the subject of a second estimate to capture the shape of its evolution.


Introduction
The high volumes recorded cyclically, driven by IPOs and placements by institutional and foreign investors, leave the prices of companies listed on the Casablanca Stock Exchange vulnerable to manipulation and directly affect the efficiency of the market.
The drop in volumes of more than 75% after the 2007 crisis (including round-trip transactions, which skew real volumes) produced a liquidity rate that never exceeded 20%, while it is 130% in Turkey and 50% in South Africa, as shown in Figure 1.
Historically, and especially after the financial crisis of 2008, the low volumes led to a lack of liquidity, with concrete consequences on the behavior of investors [1]:
- an impact on the process of equilibrium price formation and therefore on the valuation of listed shares;
- a disconnection between the fundamentals of listed companies and their stock market prices (minimal degree of integration of the Casablanca stock exchange into the Moroccan economy);
- a limitation of diversification strategies for investors' portfolios (especially institutional ones);
- reluctance of fund managers to overweight their equity pockets [2];
- the weak dynamics of the secondary market, which negatively impacts the primary market by minimizing IPOs.
An analysis of the evolution of volumes and the liquidity ratio (calculated as the ratio of volumes to capitalization) allows us to confirm these findings.
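As a minimal sketch of the ratio just defined (the figures below are invented for illustration, not actual Casablanca Stock Exchange data), the liquidity ratio can be computed directly from volumes and capitalization:

```python
import pandas as pd

# Illustrative annual figures (billions of MAD); NOT actual exchange data.
df = pd.DataFrame(
    {"volume": [120.0, 60.0, 30.0], "capitalization": [600.0, 500.0, 450.0]},
    index=[2007, 2009, 2012],
)

# Liquidity ratio = traded volume / market capitalization, as defined above.
df["liquidity_ratio"] = df["volume"] / df["capitalization"]
print(df["liquidity_ratio"])
```

With these invented numbers the ratio falls from 0.20 to about 0.07, mimicking the post-crisis decline described in the text.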
From 2010, the main indicators of the Casablanca stock exchange recorded a remarkable decline in terms of market capitalization and volumes traded. All these indicators show that the Moroccan stock market suffers from a lack of liquidity.

Literature review
The inventory-risk models of Demsetz (1968), Stoll (1978), Ho and Stoll (1981) and Biais (1993) assert that the level of liquidity is linked to the holding of securities in inventory [3,4]. The cost of holding shares corresponds to the interest that could be earned by investing the cash value of the securities held. The liquidity of securities therefore depends indirectly on the level of short- and long-term interest rates, represented by the bond market and the treasury bill market. The assets issued on these markets constitute an alternative to listed equities, which are cyclically considered less profitable [5]. Likewise, real estate is considered a safe haven when the gains made on the stock markets are minimal.

Common factors of liquidity
Assuming a common liquidity factor amounts to assuming that, during a liquidity shock, the entire market is affected as a whole.
Liquidity would therefore have two components: the first, specific, representing determinants tied to individual listed securities; the second, systematic, integrating characteristics common to all securities and linked to the market [6]. There are generally two categories of factors: macroeconomic factors and structural factors specific to the vagaries of the stock market.
Variables linked to the organization and activity of the market can also explain common variations in liquidity. They include volume, number of transactions (every transaction on the Casablanca Stock Exchange is the result of at least two contracts), number of contracts traded, index volatility, free float and market capitalization [7]. Another variable that may explain the common variations in liquidity is the share of foreign investment in the stock market: a high share of this variable reflects a strong attractiveness of the stock market and therefore more liquidity.
Security-specific determinants of liquidity may act alone or in interaction with other factors. In addition, high profitability on the market attracts investors more strongly; this leads to an increase in activity and, consequently, a modification of the liquidity parameters of the securities making up the market index [8].

Data and methodology
The data used in this analysis come from national sources (DAPS, pension funds, insurance companies, ASFIM, Ministry of Finance, Bank AL-Maghrib, AMMC and Casablanca Stock Exchange). The analysis covers several macroeconomic and microstructure variables.

Definition of variables
In Table 1 we have defined a code for each variable used in this research.

Econometric analysis of the relationship between the liquidity ratio and the parameters of the capital market
We take the variable RL, the liquidity ratio, as a proxy variable for the liquidity of the Moroccan stock market. The analysis covers several macroeconomic and microstructure variables. The appropriate econometric method is to estimate a vector error correction model (VECM), which has the property of determining the long- and short-term relationships between the variables [9]. The interactions of the variables likely to influence liquidity and investment on the Casablanca stock exchange are illustrated in Figure 2.

Correlation between variables
The correlation matrix in Figure 3 shows high correlation coefficients computed on several variables taken in pairs. These coefficients make it possible to detect certain links upstream. However, interpretation is difficult because the number of variables is large.
The correlation analysis can be used to sort the explanatory variables, leading to the most appropriate specification for our model of the liquidity ratio.
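This screening step can be sketched as follows with pandas; the series below are synthetic stand-ins (the names MASI, VOLUME and M3 are taken from the variable list, but the numbers are simulated, not the actual data):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 120  # e.g. 120 monthly observations

# Synthetic stand-ins for the study's variables (simulated, illustrative only).
masi = rng.normal(0, 1, n).cumsum()            # index level (random walk)
volume = 0.8 * masi + rng.normal(0, 1, n)      # volume correlated with the index
m3 = rng.normal(0, 1, n).cumsum()              # money aggregate (independent here)
data = pd.DataFrame({"MASI": masi, "VOLUME": volume, "M3": m3})

# Pairwise Pearson correlations, as in the matrix of Figure 3.
corr = data.corr()
print(corr.round(2))

# One common screening rule: before estimating, drop one variable of any
# pair whose |correlation| is very high (say above 0.9) to limit collinearity.
```

The simulated MASI/VOLUME pair comes out strongly correlated by construction, illustrating the kind of upstream link the matrix is meant to reveal.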

Study of stationarity
To obtain a non-spurious estimate of this relationship in a non-stationary statistical universe, it is recommended first to apply a protocol of preliminary statistical tests (a spurious regression is a situation in which using non-stationary time series in a linear regression produces erroneous results). First, we must determine the order of integration of the variables [10]. Given the importance of this step for what follows, several stationarity tests are used, starting with the augmented Dickey-Fuller (ADF) unit root test.
Like the ADF test, which accounts for autocorrelation in the series, the Phillips-Perron (PP) test handles autocorrelation and additionally allows for a heteroscedastic dimension in the series. The KPSS test is based on the decomposition of the series studied into a deterministic part, a random walk and a white noise; it is therefore a test of nullity of the variance of the residual of the random walk. The null hypothesis of the KPSS test is stationarity: for the series to be considered stationary, the KPSS statistic must be lower than the critical value.
The ADF test is carried out in three stages: first, test the significance of the coefficient (b) associated with the trend (t); then, if the latter is not significant, test the significance of the constant (c). If neither is significant, a third equation, comprising only the lagged variable, is estimated. It is the value of the coefficient (a) that indicates the presence of a unit root or, on the contrary, leads us to conclude that the series is stationary in level.
The application of the various stationarity tests to the four series considered leads to the results grouped in Table 2.
The results obtained for the variables in level indicate that the series are not stationary at the 1% threshold; the statistical tests agree in rejecting the stationarity hypothesis. The simple Dickey-Fuller tests (1979) not only assess the stationarity of a series and detect the existence of a trend, but also indicate how to make a non-stationary series stationary. As for the tests carried out on the first-difference series, they make it possible to reject the null hypothesis of non-stationarity for all the series at the 1% threshold. The series used thus have the same order of integration, and we can therefore adopt a VECM.

Implementation and results of cointegration tests
Once the order of integration of the series has been determined, the next step is to examine the presence of any long-run cointegrating relationships between the variables. This analysis follows the Johansen (1988) cointegration test procedure, considered more efficient than the two-step strategy of Engle and Granger (1987) when the sample is small and the variables are numerous.
Indeed, there can be several stationary linear combinations between integrated variables of order one. In the Johansen method, the determination of the dimension of the cointegrating space is done by estimating an autoregressive model. The advantage of this method is that it takes into account several specifications for the long-term relationship: presence of a trend / constant or not in the cointegration space.
Modeling in the presence of non-stationary series (at the start) led us to identify a possible long-term equilibrium relationship between the model variables. For this reason, we use the Johansen procedure based on the estimation of an autoregressive vector model.
One of the most important steps preceding the multivariate Johansen cointegration test is finding the optimal number of lags. The choice of the number of lags can significantly affect the test results (the work of Boswijk and Franses (1992), Gonzalo (1994) and Ho and Sorensen (1996) clearly underlined this point). If the number of lags is insufficient, the model may retain autocorrelation in its residual term; if, on the other hand, the order of the VAR is too large, the tests tend to overestimate the number of cointegrating relationships. The number of lags is determined from the usual information criteria, such as the Akaike, Schwarz and Hannan-Quinn criteria. These criteria weigh the information contributed by additional lags in the model.
When variables are integrated of order 1 and cointegrated (Engle and Granger 1987, Granger 1988, Johansen 1988), it is necessary to resort to error correction models (VECM) to test causality in the short and long run. Short-term causality is obtained from the coefficients associated with the differenced explanatory variables, while long-term causality comes from the variables entering the cointegration vector. However, given the limitations and weakness of unit root tests in small samples, Johansen cointegration tests tend to reject the hypothesis of no cointegration (Toda and Yamamoto, 1995), in particular because of a possible under-parameterization of the VAR models linked to the loss of degrees of freedom caused by adding lags. Figure 4 recapitulates the test strategy.
Dickey and Fuller assume a non-autocorrelated error term (et) in all three models. In most cases this assumption may not hold, and if it does not, the values tabulated by Dickey and Fuller are no longer correct. Lags of the endogenous variable are then added to account for an autocorrelated error term.
Thus, for a choice of p lags, corresponding to an autocorrelation of order p + 1 of the innovations in an AR(1) representation, the three models used to build the ADF test are as follows. The test is carried out like the simple DF tests; only the statistical tables differ. To find the number of lags p (Table 3), several approaches can be considered, among them the Akaike or Schwarz criteria, or starting from a sufficiently large value of p.
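The three ADF regressions referred to above do not appear in the text; in standard notation (consistent with the coefficients a, b and c and the trend t named earlier), they read:

```latex
% Model 3: constant and trend
\Delta y_t = c + b\,t + a\,y_{t-1} + \sum_{j=1}^{p} \phi_j \,\Delta y_{t-j} + \varepsilon_t

% Model 2: constant, no trend
\Delta y_t = c + a\,y_{t-1} + \sum_{j=1}^{p} \phi_j \,\Delta y_{t-j} + \varepsilon_t

% Model 1: neither constant nor trend
\Delta y_t = a\,y_{t-1} + \sum_{j=1}^{p} \phi_j \,\Delta y_{t-j} + \varepsilon_t
```

In each model the null hypothesis of a unit root is a = 0, tested with the ADF statistic against the appropriate Dickey-Fuller table.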
Author calculations: we performed the cointegration test based on the comparison of the likelihood ratio with its critical value. The hypotheses of the test, formulated in Table 4, are: H0: there is a cointegrating relationship; H1: there is no cointegrating relationship. Johansen's cointegration test (Table 5) informs us on the number of cointegration relations and their functional form, following the trace and maximum eigenvalue criteria and the Akaike and Schwarz information criteria (as presented in the test results table in the appendices).
At the 5% significance level, the null hypothesis of the existence of a cointegration relationship between the model variables is accepted: there are at most four (4) cointegrating relationships between the variables of our model. For a given significance threshold, the null hypothesis of a cointegration relationship between the variables is accepted if the value of the trace statistic (TR) is less than its tabulated critical value (Osterwald-Lenum, 1992). Conversely, a trace value greater than its critical value implies that there is no cointegrating relationship between the variables.
The presence of a unit root in each of the series studied is established using an augmented Dickey-Fuller test. The number of lags is set at p = 0, since this is the conclusion indicated by the AIC and SBC information criteria. The table in the appendix presents the results of the cointegration tests under the different deterministic hypotheses.

Residue tests
From the residual graph in Figure 5 we can validate, visually, the quality of our estimate, since we were able to reproduce a considerable part of the estimated series.

Heteroscedasticity test
This test (Table 6) has as its null hypothesis the homoscedasticity of the residuals. Its result shows that we accept this hypothesis at the 5% threshold, since the probability is 0.64, which is greater than 0.05.
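The text does not name the test behind Table 6; one common choice for residuals of this kind is an ARCH-LM test, which can be sketched as follows (on simulated homoscedastic residuals, not the model's):

```python
import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(4)
resid = rng.normal(0, 1, 300)   # illustrative homoscedastic residuals

# H0: no ARCH effects, i.e. the residuals are homoscedastic.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(resid, nlags=4)
print(f"ARCH-LM statistic: {lm_stat:.2f}, p-value: {lm_pvalue:.3f}")
# A p-value above 0.05 means homoscedasticity is not rejected,
# which is the reading applied to the 0.64 probability in Table 6.
```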

Normality test
The objective of the normality test is to verify whether the residuals follow a normal distribution. This test is based on the skewness and kurtosis coefficients before concluding with the Jarque-Bera test, as shown in Figure 6.
The displayed probability of 0.65 allows us to validate our model and say that the residuals are normally distributed.
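The Jarque-Bera test combines exactly those two moments; a sketch on simulated residuals (not the model's) shows the statistic and the associated probability:

```python
import numpy as np
from statsmodels.stats.stattools import jarque_bera

rng = np.random.default_rng(5)
resid = rng.normal(0, 1, 500)   # illustrative residuals drawn from a normal law

# JB aggregates deviations of skewness from 0 and kurtosis from 3;
# H0: the residuals are normally distributed.
jb_stat, jb_pvalue, skew, kurt = jarque_bera(resid)
print(f"JB = {jb_stat:.2f}, p = {jb_pvalue:.3f}, "
      f"skew = {skew:.3f}, kurtosis = {kurt:.3f}")
# A p-value above 0.05 (such as the 0.65 reported above) does not
# reject normality of the residuals.
```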
Likewise, the analysis of the correlogram displayed in the appendix makes it possible to conclude that there are no autocorrelations between the residuals, which further confirms the validity and relevance of the estimated model.
According to the VECM estimated and validated by the various tests listed above, the liquidity ratio of the Moroccan stock market, which provides information on the transactional dynamics of the stock market, is:
- negatively linked to the volatility of the MASI index in the short term (−0.34e-05) and in the long term (−2.06e-05);
- positively linked to the number of contracts registered on the central market in the short term (5.81e-07) and in the long term (7.22e-07);
- negatively linked to the liquidity ratio in the long term (−0.14);
- negatively linked to the M3 aggregate in the long term (0.04);
- positively linked to the MASI index in the long term (0.04);
- positively linked to the 52-week secondary market yield curve in the long term (1.50);
- negatively linked to the Share variable (part of the equity pocket in the portfolio of institutional investors) in the long term (−0.51).

Conclusion
This work, which aims to describe the structure of the financial market in order subsequently to determine its optimal structure, is based on an aggregated modeling of the main financial variables, such as interest rates, economic growth and stock market liquidity. The approach bases its analysis on vector error correction models (VECM), which have the advantage of being both simple and rich in lessons. Likewise, the estimated coefficients confirm the findings relating to the liquidity of the economy, which was also affected by the slowdown in the stock of liquid investment aggregates. This situation is explained by investors' enthusiasm for short-term Treasury bills, which negatively impacted the net asset value of bond UCITS. In addition, the slowdown in holdings of treasury bills and negotiable debt securities also contributed to the drop in the outstanding amount of liquid assets. Finally, the sluggishness of the stock market has had a negative impact on the outstanding amount of equity and diversified UCITS, whose growth rate has declined in recent years.
The Treasury's issuance policy is characterized by the greater weight of medium- and long-term subscriptions. This occurred in connection with investors' shift towards the secondary market, following the scale of Treasury issuance on the primary market [11]. The changes recorded along the different segments of the yield curve resulted in a remarkable drop in the average annual remuneration rates weighted by maturity. The recovery of the private bond market has also prompted investors to abandon equity investments.