Int. J. Simul. Multidisci. Des. Optim.
Volume 13, 2022
Simulation and Optimization for Industry 4.0
Article Number: 9
Number of pages: 10
DOI: https://doi.org/10.1051/smdo/2021040
Published online: 06 January 2022
Research Article
Optimization of the impact measurement of market structure on liquidity and volatility
1 Engineering Sciences Laboratory – ENSA, Ibn Tofail University, Kenitra, Morocco
2 Research Laboratory in Management Sciences Organizations, ENCG, Ibn Tofail University, Kenitra, Morocco
* e-mail: rhouas.sara@gmail.com
Received: 7 May 2021
Accepted: 15 November 2021
Liquidity and volatility are the two barometers by which stock markets are assessed in terms of attractiveness, profitability and efficiency. Several macroeconomic and microstructure variables condition the level of liquidity, which directly impacts the asset allocation decisions of different investor profiles (institutional and individual) and therefore the dynamics of the market as a whole. Volatility is the regulatory component that provides information on the level of risk that characterizes the market. The appreciation of these two elements is therefore of considerable help to fund managers looking to optimize their equity pockets. In this work, we use the liquidity ratio as a proxy variable for the liquidity of the Moroccan stock market, in order to estimate the indicators and factors that determine its short- and long-term variability. The appropriate econometric method is to estimate a vector error correction model (VECM), which has the property of determining the long- and short-term relationships between the variables. The volatility of the MASI index is the subject of a second estimate to capture the shape of its evolution over time.
Key words: Liquidity / volatility / optimization / stock market / ratio / VECM / MASI / CSE
© S. Rhouas et al., Published by EDP Sciences, 2022
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
The high volumes recorded cyclically, driven by IPOs and by placements from institutional and foreign investors, make the prices of companies listed on the Casablanca Stock Exchange easy to manipulate and directly affect the efficiency of the market.
The drop in volumes of more than 75% after the 2007 crisis (including round-trip transactions, which skew real volumes) left a liquidity rate that never exceeded 20%, compared with about 130% in Turkey and 50% in South Africa, as shown in Figure 1.
Historically, and especially after the financial crisis of 2008, the low volumes led to a lack of liquidity, with concrete consequences on the behavior of investors [1]:
The impact on the process of equilibrium price formation and therefore the valuation of listed shares;
Disconnection between the fundamentals of the values of listed companies and their stock market prices (minimal degree of integration of the Casablanca stock exchange into the Moroccan economy);
Limitation of diversification strategies for investors' portfolios (especially institutional ones);
Reluctance of fund managers to overweight their equity pockets [2];
The weak dynamics of the secondary market negatively impacts the primary market by minimizing IPOs.
An analysis of the evolution of volumes and the liquidity ratio (calculated as the ratio of volumes to capitalization) allows us to confirm these findings.
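For illustration, the ratio can be computed in a few lines of pandas; the figures below are hypothetical and are not the exchange's actual data.

import pandas as pd

# Hypothetical monthly figures (in MAD); column names and values are illustrative.
data = pd.DataFrame(
    {"volume": [4.2e9, 3.1e9, 2.8e9],
     "market_cap": [520e9, 515e9, 508e9]},
    index=pd.period_range("2019-01", periods=3, freq="M"),
)

# Liquidity ratio (RL) as defined in the text: traded volume over capitalization.
data["RL"] = data["volume"] / data["market_cap"]
print(data["RL"])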
From 2010, the main indicators of the Casablanca stock exchange recorded a remarkable decline in terms of market capitalization and volumes traded. All these indicators show that the Moroccan stock market suffers from a lack of liquidity.
Fig. 1 Illustration of the low liquidity rate of the Casablanca stock exchange.
2 Literature review
The inventory-risk models of Demsetz (1968), Stoll (1978), Ho and Stoll (1981) and Biais (1993) assert that the level of liquidity is linked to the holding of securities inventories [3,4]. The cost of holding shares is equated with the interest that could be earned by investing the monetary value of the securities held. The liquidity of securities therefore depends indirectly on the level of short- and long-term interest rates, represented by the bond market and the treasury-bill market. Assets issued on these markets constitute an alternative to listed equities, which are cyclically considered less profitable [5]. Likewise, real estate is considered a safe haven when the gains made on the stock markets are minimal.
2.1 Common factors of liquidity
Assuming a common liquidity factor amounts to assuming that, during a liquidity shock, the entire market is affected as a whole.
Liquidity would therefore have two components: the first is specific, representing determinants tied to individual listed securities; the second is systematic, integrating characteristics common to all securities and linked to the market [6]. There are generally two categories of factors, namely macroeconomic factors and structural factors specific to the vagaries of the stock market.
Variables linked to the organization and activity of the market can also explain common variations in liquidity. They include volume, number of transactions (every transaction on the Casablanca Stock Exchange is the result of at least two contracts), number of contracts traded, index volatility, free float and market capitalization [7]. Another variable that may explain common variations in liquidity is the share of foreign investment in the stock market: a high value of this variable reflects a strong attractiveness of the stock market and therefore more liquidity.
These factors relate to specific determinants of liquidity, which may act alone or in interaction with other factors to influence it. In addition, high profitability on the market attracts investors more strongly. This leads to an increase in activity and, consequently, a change in the liquidity parameters of the securities making up the market index [8].
3 Data and methodology
The data used in this analysis come from national sources (DAPS, pension funds, insurance companies, ASFIM, Ministry of Finance, Bank AL-Maghrib, AMMC and Casablanca Stock Exchange). The analysis covers several macroeconomic and microstructure variables.
3.1 Definition of variables
In Table 1 we have defined a code for each variable used in this research.
Table 1. Coding of estimated variables.
3.2 Econometric analysis of the relationship between the liquidity ratio and the parameters of the capital market
We take the variable RL, representing the liquidity ratio, as a proxy variable for the liquidity of the Moroccan stock market. The analysis covers several macroeconomic and microstructure variables. The appropriate econometric method is to estimate a vector error correction model (VECM), which has the property of determining the long- and short-term relationships between the variables [9]. The interaction of the variables likely to influence liquidity and investments on the Casablanca stock exchange is clarified in Figure 2.
3.2.1 Correlation between variables
The correlation matrix (Fig. 3) shows high correlation coefficients calculated on several variables taken in pairs. These coefficients make it possible to detect certain links upstream. However, interpretation is difficult because the number of variables is large.
The correlation analysis can be used to sort the explanatory variables, leading to the most appropriate model to use for our modeling of the liquidity ratio.
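As an illustration, a small pandas/numpy helper of the following kind could perform this sorting by flagging the most strongly correlated pairs; the variable codes follow Table 1 and are assumptions here.

import numpy as np
import pandas as pd

def correlation_screen(data: pd.DataFrame, threshold: float = 0.7) -> pd.DataFrame:
    """List the variable pairs whose absolute Pearson correlation exceeds `threshold`."""
    corr = data.corr()                                      # matrix of Figure 3
    upper = np.triu(np.ones(corr.shape, dtype=bool), k=1)   # keep each pair once
    pairs = corr.where(upper).stack()                       # (var_i, var_j) -> coefficient
    strong = pairs[pairs.abs() > threshold]
    return strong.sort_values(key=abs, ascending=False).rename("corr").reset_index()

# Example usage (column names assumed):
# correlation_screen(df[["RL", "VOLMASI", "NCMC", "M3", "MASI", "CTM52S", "PART"]])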
Fig. 2 Interaction of variables likely to influence liquidity and investments on the Casablanca Stock Exchange.
Fig. 3 Correlation matrix.
3.3 Study of stationarity
To obtain a non-spurious estimate of this relationship in a non-stationary statistical universe, it is recommended to first resort to a protocol of preliminary statistical tests (spurious regression is a situation in which using non-stationary time series in a linear regression yields erroneous results).
First, we must determine the order of integration of the variables [10]. Given the importance of this step for what follows, one must resort to several stationarity tests: the augmented Dickey-Fuller (ADF) unit root test, the Phillips-Perron (PP) test and the KPSS test.
The ADF test takes into account the presence of autocorrelation in the series; the PP test additionally considers the hypothesis of a heteroscedastic dimension in the series. The KPSS test is based on the decomposition of the series studied into a deterministic part, a random walk and a white noise; it is therefore a test of nullity of the variance of the residual of the random walk. The null hypothesis of the KPSS test is the stationarity hypothesis. Thus, for the series to be considered stationary, the KPSS statistic must be lower than the critical value.
The ADF test is carried out in three stages: first, test the significance of the coefficient (b) associated with the trend (t); then, if the latter is not significant, test the significance of the constant (c); otherwise, a third equation, comprising only the lagged variable, is estimated. It is the value of the coefficient (a) which indicates the presence of a unit root or, on the contrary, leads us to conclude that the series is stationary in level.
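A compact way to run this protocol is sketched below with statsmodels; the series and column names are assumptions, and the Phillips-Perron test, not available in statsmodels, would come from the arch package (arch.unitroot.PhillipsPerron).

import pandas as pd
from statsmodels.tsa.stattools import adfuller, kpss

def stationarity_report(series: pd.Series) -> pd.Series:
    """ADF (H0: unit root) and KPSS (H0: stationarity) for a single series."""
    x = series.dropna()
    adf_stat, adf_p, *_ = adfuller(x, regression="ct", autolag="AIC")
    kpss_stat, kpss_p, *_ = kpss(x, regression="ct", nlags="auto")
    return pd.Series({"ADF stat": adf_stat, "ADF p": adf_p,
                      "KPSS stat": kpss_stat, "KPSS p": kpss_p})

# Run on each series in level, then on its first difference, e.g.:
# stationarity_report(df["RL"]); stationarity_report(df["RL"].diff())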
The application of the various stationarity tests to the four series considered leads to the results grouped in Table 2.
The results obtained for the level variables indicate that the series are not stationary at the 1% threshold; indeed, the statistical tests agree in rejecting the stationarity hypothesis. The simple Dickey-Fuller tests (1979) not only assess the stationarity of a series or detect the existence of a trend, but also suggest how to render a non-stationary series stationary. As for the tests carried out on the first-difference series, they make it possible to reject the null hypothesis of non-stationarity for all the series at the 1% threshold. The series used have the same order of integration, and therefore we can adopt a VECM model.
Table 2. Summary of stationarity test results.
3.4 Implementation and results of cointegration tests
Once the order of integration of the series has been determined, the next step is to examine the presence of any long-run cointegrating relationships between the variables. This analysis is done according to the cointegration test procedure of Johansen (1988), considered more efficient than the two-step strategy of Engle and Granger (1987) when the sample is small and the variables are numerous.
Indeed, there can be several stationary linear combinations between variables integrated of order one. In the Johansen method, the dimension of the cointegrating space is determined by estimating an autoregressive model. The advantage of this method is that it takes into account several specifications for the long-term relationship: presence or absence of a trend/constant in the cointegration space.
Modeling in the presence of non-stationary series (in levels) led us to identify a possible long-term equilibrium relationship between the model variables. For this reason, we use the Johansen procedure, based on the estimation of a vector autoregressive model.
One of the most important steps preceding the multivariate Johansen cointegration test is finding the optimal number of lags. The choice of the number of lags can significantly affect the test results (the work of Boswijk and Franses (1992), Gonzalo (1994) and Ho and Sorensen (1996) clearly underlines this point). If the number of lags is insufficient, the model may retain autocorrelation in its residual term; if, on the other hand, the order of the VAR is too large, the tests tend to overestimate the number of cointegrating relationships. The number of lags is determined from the usual information criteria, such as the Akaike, Schwarz and Hannan-Quinn criteria. These criteria are based on the information contributed by additional lags in the model.
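A minimal sketch of this lag-selection step is given below with statsmodels; the data file and the variable codes (following Table 1) are assumptions, and this is an illustration rather than the software actually used by the authors.

import pandas as pd
from statsmodels.tsa.api import VAR

# `levels` stands for a DataFrame of the model variables in levels
# (hypothetical file and column names).
levels = pd.read_csv("cse_variables.csv", index_col=0, parse_dates=True)

order = VAR(levels.dropna()).select_order(maxlags=8)  # AIC, BIC (Schwarz), HQ, FPE
print(order.summary())   # retain the lag length favoured by the criteria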
When variables are integrated of order 1 and cointegrated (Engle and Granger, 1987; Granger, 1988; Johansen, 1988), it is necessary to resort to error correction models (VECM) to test causality in the short- and long-term models. Short-term causality is obtained from the coefficients associated with the differenced explanatory variables, while long-term causality comes from the variables used in the cointegration vector. However, in view of the limitations and weakness of unit root tests for small samples, Johansen cointegration tests tend to reject the hypothesis of no cointegration (Toda and Yamamoto, 1995), in particular in connection with a possible under-parameterization of the VAR model linked to the loss of degrees of freedom caused by the addition of lags. Figure 4 recapitulates the progress of the test strategy.
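As an illustration of how this test strategy could be reproduced outside Eviews, the following sketch applies the Johansen trace test and a VECM with statsmodels; the DataFrame `levels` is the same assumed dataset as in the previous sketch and the cointegration rank shown is illustrative.

from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Johansen trace test on the level variables (det_order=0 places a constant
# in the cointegration relation).
joh = coint_johansen(levels.dropna(), det_order=0, k_ar_diff=1)
print(joh.lr1)   # trace statistics for r = 0, 1, ...
print(joh.cvt)   # 90% / 95% / 99% critical values

# With the rank retained from the trace test, the VECM separates the short-run
# (differenced) coefficients from the long-run error-correction term.
vecm_res = VECM(levels.dropna(), k_ar_diff=1, coint_rank=1,
                deterministic="ci").fit()
print(vecm_res.summary())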
Our model is estimated from the significant variables (after elimination of the non-significant variables) and from dummy variables introduced to correct for certain cyclical events that influenced the ordinary correlations of the variables. The model is presented as follows:
Model estimated on Eviews
D(RL) = 0.17 − 2.34e-05*D(VOLMASI) + 5.81e-07*D(NCMC) − 0.14*RL(−1) − 0.04*LOG(M3(−1)) + 0.04*LOG(MASI(−1)) + 1.5*CTM52S(−1) − 0.516*PART(−1) + 7.22e-07*NCMC(−1) − 2.06e-05*VOLMASI(−1) − 0.087*DUM0409 − 0.08*DUM0812 + 0.04*DUM0504 + 0.05*DUM0512 − 0.04*DUM0612
GARCH = 9.7e-06 + 0.83*RESID (−1)^2 + 0.23*GARCH(−1)
R² = 0.6667.
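The conditional-variance equation above is a GARCH(1,1) specification. As a hedged illustration of how an analogous model could be fitted outside Eviews, the following sketch uses the Python arch package; the file name and return series are assumptions, not the authors' data.

import pandas as pd
from arch import arch_model

# `masi_returns` is an assumed Series of MASI index returns (hypothetical file).
masi_returns = pd.read_csv("masi_returns.csv", index_col=0,
                           parse_dates=True).squeeze("columns")

garch = arch_model(masi_returns.dropna() * 100, mean="Constant",
                   vol="GARCH", p=1, q=1)
res = garch.fit(disp="off")
print(res.params)   # omega, alpha[1], beta[1] play the roles of
                    # 9.7e-06, 0.83 and 0.23 in the output above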
Dickey and Fuller assume a non-autocorrelated error term (εt) in all three models. In most cases, however, this assumption does not hold; when it fails, the values tabulated by Dickey and Fuller are no longer correct. Lags of the endogenous variable are then added to account for an autocorrelated error term.
Thus, for a choice of p lags, corresponding to an autocorrelation of order p + 1 of the innovations in an AR(1) representation, the three models used to develop the ADF test are as follows:
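The three regressions do not survive in the extracted text; in their standard augmented form, written with the notation used above (a for the coefficient on the lagged level, b for the trend, c for the constant), they read:

Model 3 (constant and trend):  Δy_t = c + b·t + a·y_{t−1} + Σ_{j=1..p} φ_j·Δy_{t−j} + ε_t
Model 2 (constant, no trend):  Δy_t = c + a·y_{t−1} + Σ_{j=1..p} φ_j·Δy_{t−j} + ε_t
Model 1 (no constant, no trend):  Δy_t = a·y_{t−1} + Σ_{j=1..p} φ_j·Δy_{t−j} + ε_t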
The test is carried out in the same way as the simple DF tests; only the statistical tables differ.
To find the number of lags p (Table 3), several approaches can be considered, among them the Akaike or Schwarz criteria, or starting from a sufficiently large value of p.
Source: author's calculations.
We performed the cointegration test based on the comparison of the likelihood ratio with its critical value. The hypotheses of the test are formulated as follows (Table 4):
H0: There is a cointegrating relationship;
H1: There is no cointegrating relationship.
Johansen's cointegration test (Table 5) informs us about the number of cointegrating relations and their functional form, following the trace and maximum eigenvalue criteria and the Akaike and Schwarz information criteria (as presented in the test results table in the appendices).
At the 5% significance level, the null hypothesis of the existence of a cointegrating relationship between the model variables is accepted. There are at most four (4) cointegrating relationships between the variables of our model.
For a given significance threshold, the null hypothesis of the existence of a cointegrating relationship between the model variables is accepted if the value of the trace statistic (TR) is less than its tabulated critical value (Osterwald-Lenum, 1992). Conversely, a trace value greater than its critical value implies that there is no cointegrating relationship between the variables.
The presence of a unit root in each of the series studied is established using an augmented Dickey-Fuller test. The number of lags is set at p = 0, as indicated by the AIC and SBC information criteria. The table in the appendix presents the results of the cointegration tests according to the different deterministic hypotheses.
Fig. 4 The progress of the test strategy.
Table 3. Determination of the number of model lags.
Table 4. Summary of Johansen's cointegration test.
Table 5. Johansen's cointegration test.
3.5 Residual tests
The residual graph (Fig. 5) allows us to validate, visually, the quality of our estimate, since a considerable part of the estimated series is reproduced.
Fig. 5 Residual tests.
3.6 Heteroscedasticity test
This test (Table 6) has homoscedasticity of the residuals as its null hypothesis. Its result shows that we accept this hypothesis at the 5% threshold, since the probability is 0.64, which is greater than 0.05.
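A sketch of this ARCH-LM test with statsmodels is given below, assuming `resid` holds the residuals of the estimated equation (here taken, for illustration, from the VECM fitted in the earlier sketch).

import numpy as np
from statsmodels.stats.diagnostic import het_arch

# `resid` stands for the residuals of the estimated equation (assumption).
resid = np.asarray(vecm_res.resid)[:, 0]

lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(resid, nlags=4)
print(f"ARCH-LM p-value: {lm_pvalue:.2f}")   # > 0.05: homoscedasticity not rejected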
Table 6. Heteroskedasticity test: ARCH.
3.7 Normality test
The objective of the normality test is to verify whether the residuals follow a normal distribution or not. This test is based on the skewness and kurtosis coefficients and concludes with the Jarque-Bera test, as shown in Figure 6.
The displayed probability of 0.65 allows us to validate our model and say that the residuals are normally distributed.
Likewise, the analysis of the correlogram displayed in the appendix makes it possible to conclude that there are no autocorrelations between the residuals, which further confirms the validity and relevance of the estimated model.
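Both diagnostics can be reproduced on the same assumed residual series, as a sketch with statsmodels.

from statsmodels.stats.stattools import jarque_bera
from statsmodels.stats.diagnostic import acorr_ljungbox

# Jarque-Bera (H0: normality, based on skewness and kurtosis) and Ljung-Box
# (H0: no residual autocorrelation) on the assumed residual series `resid`.
jb_stat, jb_pvalue, skew, kurt = jarque_bera(resid)
print(f"Jarque-Bera p-value: {jb_pvalue:.2f}")   # > 0.05: normality not rejected

print(acorr_ljungbox(resid, lags=[12]))          # large p-values: no autocorrelation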
Fig. 6 Normality of residuals.
4 Interpretations and discussions of the results
The estimation of the error correction model allowed the writing of the following equation:
D(RL) = 0.17 − 2.34e-05*D(VOLMASI) + 5.81e-07*D(NCMC) − 0.14*RL(−1) − 0.04*LOG(M3(−1)) + 0.04*LOG(MASI(−1)) + 1.50*CTM52S(−1) − 0.51*PART(−1) + 7.22e-07*NCMC(−1) − 2.06e-05*VOLMASI(−1) − 0.087*DUM0409 − 0.08*DUM0812 + 0.04*DUM0504 + 0.05*DUM0512 − 0.04*DUM0612 + AR(12).
According to the VECM model estimated and validated by the various tests listed above, the liquidity ratio of the Moroccan stock market, which provides information on the transactional dynamics of the stock market, is:
Negatively linked to the volatility of the MASI index in the short term (−2.34e-05) and in the long term (−2.06e-05);
Positively linked to the number of contracts registered on the central market, in the short term (5.81e-07) and in the long term (7.22e-07);
Negatively linked to the long-term liquidity ratio (−0.14);
Negatively linked to the long-term M3 aggregate (−0.04);
Positively linked to the long-term MASI index (0.04);
Positively linked to the 52-week long-term secondary-market yield curve (1.50);
Negatively linked to the long-term Share variable (share of the equity pocket in institutional investors' portfolios) (−0.51).
5 Conclusion
This work, which aims to describe the structure of the financial market in order to determine its optimal structure subsequently, is based on an aggregated modeling of the main financial variables such as interest rates, economic growth and stock market liquidity. The approach bases its analyses on vector error correction models (VECM), which have the advantage of being both simple and rich in lessons.
Likewise, the estimated coefficients confirm the findings relating to the liquidity of the economy, which was also affected by the slowdown in the stock of liquid investment aggregates. This situation is explained by investors' enthusiasm for short-term Treasury bills, which negatively impacted the net asset value of bond UCITS. In addition, the slowdown in holdings of treasury bills and negotiable debt securities also contributed to the drop in the rate of outstanding liquid assets. Finally, the sluggishness of the stock market has had a negative impact on the outstanding amount of equity and diversified UCITS, whose growth rate has declined in recent years.
The Treasury's issuance policy is characterized by the greater weight of medium- and long-term subscriptions. This situation arose in connection with the orientation of investors towards the secondary market, following the large volume of Treasury fund-raising on the primary market [11]. The changes recorded on the different segments of the yield curve resulted in a remarkable drop in the average annual remuneration rates weighted by maturity. The recovery in the private bond market has also prompted investors to move away from equity investments.
Acknowledgement
Rhouas Sara thanks the National Scientific Research and Technology Center (CNRST) for providing her with a national Doctoral Research Scholarship.
References
- K. Bel Hadj Miled, F. Darwez, Le comportement mimétique sur le marché Financier Tunisien avant et après la révolution, Int. J. Econ. Strategic Manage. Bus. Process. 2014, International Conference on Business, Economics, Marketing & Management Research (BEMM'14) [Google Scholar]
- E.A. Nenu, G. Vintilă, Ş.C. Gherghina, The impact of capital structure on risk and firm performance: Empirical evidence for the Bucharest Stock Exchange listed companies, Int. J. Finan. Stud. 6 (2018) [Google Scholar]
- S. Stereńczak, State-dependent stock liquidity premium: The case of the Warsaw stock exchange, Int. J. Financial Stud. 8, 13 (2020) [CrossRef] [Google Scholar]
- M.E. Blume, D.B. Keim, Institutional investors and stock market liquidity: Trends and relationships, Finance Department, the Wharton School, University of Pennsylvania (2012) [Google Scholar]
- N. El Hami, M. Bouchekourte, Optimising liquidity with modified particle swarm optimization application: Case of Casablanca stock exchange, in: 2016 4th IEEE International Colloquium on Information Science and Technology (CiSt), 2016, pp. 725–729, doi: 10.1109/CIST.2016.7804981 [CrossRef] [Google Scholar]
- M.D. Chinn, H. Ito, What matters for financial development? Capital controls, institutions, and interactions, J. Develop. Econ. 81, 163–192 (2006), JEL classification: F36; F43; G28 [CrossRef] [Google Scholar]
- Y. Amihud, Illiquidity and stock returns: cross-section and time-series effects, J. Finan. Mar. 5, 31–56 (2002) [CrossRef] [Google Scholar]
- A. Beber, M.W. Brandt, M. Cosemans, M. Verardo, Ownership crowded with style: Institutional investors, liquidity, and liquidity risk, Netspar and the Duisenberg School of Finance (2012) [Google Scholar]
- D. Herlemont, Econometrics of Financial Markets with R-project, YATS Finances & Technologies (2014) [Google Scholar]
- P.P. Dungore, S.H. Patel, Analysis of volatility volume and open interest for nifty index futures using GARCH analysis and VAR model, Int. J. Financial Stud. 9 (2021) [Google Scholar]
- F.A. Sulehri, A. Ali, Impact of political uncertainty on Pakistan Stock Exchange: An event study approach, J. Adv. Stud. Finan. 11 (2020), JASF XI, 2 (22) (Winter 2020) [Google Scholar]
Cite this article as: Sara Rhouas, Mustapha Bouchekourte, Norelislam El Hami, Optimization of the impact measurement of market structure on liquidity and volatility, Int. J. Simul. Multidisci. Des. Optim. 13, 9 (2022)