Tuesday, December 31, 2019

Macro and Micro Environmental Analysis

Macro: The macro environment refers to the overwhelming external factors that firms cannot influence but which can affect their business if not addressed. The economy of Malaysia had been growing healthily; however, growth dropped in 2012. In addition, the inflation rate fell during 2012 from a whopping 2.7 percent to a satisfactory 1.3 percent. These factors directly affected Malaysia's inflation and unemployment rates: both the inflation rate and the unemployment rate fell.

Micro: The micro environment refers to the internal factors of a business environment that can affect a business's operation. These factors are suppliers, shareholders, competitors, customers and distributors. They have played a very big role in the performance of Proton. Proton has a big problem with its suppliers, as well as a drop in its market share, since it faces a major competitor in the market, Perodua. The other factor that has affected Proton is its customers: Proton is facing difficulties in dealing with them, and is in fact losing customers due to poor service and a lack of trust.

Cross Culture and Global Issues: Being in the automobile industry, this firm has to deal with other cultures on a daily basis. Cultures are hard-to-define values, norms, and traditions, and understanding culture is an extremely complex undertaking. Various theories of culture have been proposed, but perhaps the most prominent is Geert Hofstede's.

Hofstede's Cultural Dimensions:

Collectivism and Individualism: This refers to the extent to which the people of a country are willing to work together. In collectivistic societies people tend to put the group's needs first, setting aside personal goals, whereas people in individualistic cultures pursue personal or individual attainment over that of the group.

Power Distance: This refers to the extent to which people accept hierarchical positions of authority in the business environment. In high power distance societies like Pakistan, India and Bangladesh, people tend to respect authority derived from hierarchical or social status rather than personal achievement, unlike in low power distance societies like France and Italy.

Uncertainty Avoidance: This refers to the extent to which people accept change in society. Countries like Indonesia, North Korea, and Japan represent high uncertainty avoidance, whereby they dislike change.

Masculinity and Femininity: This relates to the role of women in different cultures. Masculine cultures believe the male to be the dominant part of the family and the only one allowed to support the family financially. Cultures with a low masculinity score show females to be an important part of the workforce.

Global issues: When it comes to addressing global issues in an automobile company such as Proton, most of the issues are relevant: rising costs, taxation, global financial crises, regulation, and safety and health concerns can all be traced back to an automobile manufacturer like Proton.
There are many factors that Proton has to consider before going overseas. For example, tax regimes differ across countries, and economic conditions will affect sales; these are some of the issues that Proton has to address.

Business Volatility and Risk Management: In the automobile industry, it took thousands of failed attempts before a vehicle that worked was finally produced. With every attempt that was shot down, the amount of risk involved kept increasing, but it was a necessary evil to reach the point where cars are now made with precision and with the safety features needed for safe driving. In the case of Proton, there are a number of risks involved which the managers in charge have to think about. First, since the automobile industry is a huge business, the risk of investing enormous capital is the prime risk that decides the fate of the operation. Second, there is the risk of economic decline or inflation sweeping over, which would directly affect businesses such as carmakers. Third comes the fluctuating cost of steel globally, which has been increasing. The point is not to enumerate the risks involved; the point is to manage the risks relating to the merger effectively.

Branding Success and Challenges: Branding is a concept that originated as far back as the 1200s in Sweden, where insignias were burned onto horses to differentiate them from others. For a company that wants its name out in the market, it takes extensive marketing, heavy funding, and a lot of patience, because building a brand takes a while. Reaching brand awareness is only the first step; then the company needs to reach its target market, and once that is done comes brand loyalty, whereby customers prefer choosing a brand they have become loyal to. If a brand presents a particularly negative perception of the company, then it may take a long time and much effort to change the perception of the masses. For example, when the Lexus was first revealed and appreciated by the masses, it was not widely known that it belonged to Toyota, because of Toyota's perception as an economical and affordable automotive brand.

Business Sustainability: It is one thing to start a business but another to sustain it. Sustaining a business does not only mean running it profitably; it also includes the social and environmental obligations, risks and opportunities it has. There are a number of ways in which companies can ensure their business sustainability, which would be as follows: shareholder engagement; environmental management systems; reporting and disclosure. If Proton implements these sustainable strategies into its operation and monitors their quality, sustainability can be expected.

Tuckman's Theories on Teamwork: Introduction. For the subject of Business and Commercial Awareness, we were given an assignment to make a business plan for Proton, in line with Dr Mahathir's statement that "our immediate plan would be to change its strategy from being a maker of cheap cars to become a world standard car manufacturer".

Stage One: Forming. The class was divided into groups of five members; each member was given a department to work on, and these departments were Finance, Operations, Human Resources, and Marketing. Unfortunately, I was given the Finance department, where I had to analyse Proton's current financial performance.
However, before the assignment my knowledge of finance was substantially lower than that of an average final-year Business Administration student, as I have had a mindset of not paying attention to things that do not interest me for as far back as I can remember.

Stage Two: Storming. We then moved into the second stage, storming, in which we started to push against boundaries. We had many conflicts over the nature of our working styles, since each member worked differently. For example, in the marketing department, which Amir and Khider were handed, they had many issues in solving the problems, which caused frustration for the entire group. Likewise, I was given the Finance department, which was a big challenge for me. As the teamwork was not clearly defined, we felt overwhelmed by the workload. I believed it was mission impossible for us to overcome the issues we were facing in this stage; we could not even come up with a framework for our strategies, which was a big disappointment. We ended up working these issues out with Dr Tan, and finally we could move to the next step.

Stage Three: Norming. In this stage we finally identified our goals, and each member became fully aware of the methods and strategies that should be adopted in order to achieve them. Trust and appreciation were also built between us. The leader of our group, Ben, was a big motivation for us; he helped each of us take responsibility for progress towards achieving the goal. Most of the discussions were conducted through a Facebook page.

Stage Four: Performing. In this stage we solved all the problems and issues by using appropriate controls, and we achieved an effective and satisfying result. It was amazing that we worked collaboratively to achieve our goal, and the members cared for each other. For example, Brain Scot was very helpful with my part in the Finance department, advising me on how to finalise it. In addition, the commitment of the group members towards the group work increased positively compared with the first stage.

Monday, December 23, 2019

The Emergence of Yellow Power

The common perspective of the civil rights movement is often seen from one angle: the African American civil rights movement towards racial equality. Though this movement had significant historical context in American history, the pursuits of other minorities, such as the Asian American civil rights movement, are often undermined and overlooked. Yet the Asian American movement surpassed the efforts of the African American movement despite the social and cultural obstacles it faced in integrating into a new society. Through intrinsic cultural unity and the influences of the African American civil rights movement, the Asian American civil rights movement achieved more success than the African American civil rights movement.

The Asian American civil rights movement was set into motion as Asian Americans took charge, challenging Asian stereotypes and promoting equal rights. Since then, African and Asian Americans have moved together in the pursuit of racial equality at a similar pace. To fight for racial equality and combat bigotry, African Americans formed the NAACP just as Japanese Americans formed the JACL, similar organizations both opposed to racial discrimination and intended to assist their members in rising up in society. These councils significantly impacted the United States as they promoted the cause of social equality and influenced similar organizations all over the country. Acting essentially as a catalyst for the civil rights movement, such organizations played a major role in giving voice to the people and unifying citizens reluctant to voice their opinions in a white-dominated country. For example, a similar council was created, known as the Jackson Street Community Council, which promoted business, social groups, and public services. This created ethnic cooperation between blacks and Asians in the community because of the prior competition for jobs between the colored regions of Seattle. This council helped dissolve tension, not only between Japanese and African Americans, but also between Japanese and whites, which stemmed from the resentment of the war and Japanese internment.

Sunday, December 15, 2019

Extreme Conditional Value at Risk: A Coherent Scenario for Risk Management

CHAPTER ONE

1. INTRODUCTION

Extreme financial losses that occurred during the 2007-2008 financial crisis reignited questions of whether existing methodologies, which are largely based on the normal distribution, are adequate and suitable for the purpose of risk measurement and management. The major assumptions employed in these frameworks are that financial returns are independently and identically distributed, and follow the normal distribution. However, weaknesses in these methodologies have long been identified in the literature. Firstly, it is now widely accepted that financial returns are not normally distributed; they are asymmetric, skewed, leptokurtic and fat-tailed. Secondly, it is a known fact that financial returns exhibit volatility clustering, so the assumption of independently distributed returns is violated. The combined evidence concerning the stylized facts of financial returns necessitates adapting existing methodologies, or developing new ones, that account for all the stylised facts of financial returns explicitly. In this paper, I discuss two related measures of risk: extreme value-at-risk (EVaR) and extreme conditional value-at-risk (ECVaR). I argue that ECVaR is a better measure of extreme market risk than the EVaR utilised by Kabundi and Mwamba (2009), since it is coherent and captures the effects of extreme market events. In contrast, even though EVaR captures the effect of extreme market events, it is non-coherent.

1.1 BACKGROUND

Markowitz (1952), Roy (1952), Sharpe (1964), Black and Scholes (1973), and Merton's (1973) major toolkit in the development of modern portfolio theory (MPT) and the field of financial engineering consisted of the means, variances, correlations and covariances of asset returns. In MPT, the variance, or equivalently the standard deviation, was the panacea measure of risk. A major assumption employed in this theory is that financial asset returns are normally distributed. Under this assumption, extreme market events rarely happen. When they do occur, risk managers can simply treat them as outliers and disregard them when modelling financial asset returns.

The assumption of normally distributed asset returns is too simplistic for use in financial modelling of extreme market events. During extreme market activity similar to the 2007-2008 financial crisis, financial returns exhibit behaviour that is beyond what the normal distribution can model. Starting with the work of Mandelbrot (1963), there is increasingly convincing empirical evidence suggesting that asset returns are not normally distributed: they exhibit asymmetric behaviour, 'fat tails', and higher kurtosis than the normal distribution can accommodate. The implication is that extreme negative returns do occur, and are more frequent than predicted by the normal distribution. Therefore, measures of risk based on the normal distribution will underestimate the risk of portfolios and lead to huge financial losses, and potentially insolvencies of financial institutions. To mitigate the effects of inadequate risk capital buffers stemming from the underestimation of risk by normality-based financial modelling, risk measures such as EVaR that go beyond the assumption of normally distributed returns have been developed.
However, EVaR is non-coherent, just like the VaR from which it is developed. The implication is that, even though it captures the effects of extreme market events, it is not a good measure of risk since it does not reflect diversification, a contradiction of one of the cornerstones of portfolio theory. ECVaR naturally overcomes these problems, since it is coherent and can capture extreme market events.

1.2 RESEARCH PROBLEM

The purpose of this paper is to develop extreme conditional value-at-risk (ECVaR), and propose it as a better measure of risk than EVaR under conditions of extreme market activity, with financial returns that exhibit volatility clustering and are not normally distributed. Kabundi and Mwamba (2009) have proposed EVaR as a better measure of extreme risk than the widely used VaR; however, it is non-coherent. ECVaR is coherent and captures the effect of extreme market activity, thus it is more suited to modelling extreme losses during market turmoil, and it reflects diversification, which is an important requirement for any risk measure in portfolio theory.

1.3 RELEVANCE OF THE STUDY

The assumption that financial asset returns are normally distributed understates the possibility of infrequent extreme events whose impact is more detrimental than that of more frequent events. Using VaR and CVaR under this assumption underestimates the riskiness of assets and portfolios, and eventually leads to huge losses and bankruptcies during times of extreme market activity. There are many adverse effects of using the normal distribution in the measurement of financial risk, the most visible being the loss of money due to underestimating risk. During the global financial crisis, a number of banks and non-financial institutions suffered huge financial losses; some went bankrupt and failed, partly because of inadequate capital allocation stemming from the underestimation of risk by models that assumed normally distributed returns.

Measures of risk that do not assume normality of financial returns have been developed. One such measure is EVaR (Kabundi and Mwamba (2009)). EVaR captures the effect of extreme market events; however, it is not coherent. As a result, EVaR is not a good measure of risk, since it does not reflect diversification. In financial markets characterised by multiple sources of risk and extreme market volatility, it is important to have a risk measure that is coherent and can capture the effect of extreme market activity. ECVaR is advocated to fulfil this role of measuring extreme market risk while conforming to portfolio theory's wisdom of diversification.

1.4 RESEARCH DESIGN

Chapter 2 will present a literature review of the risk measurement methodologies currently used by financial institutions, in particular VaR and CVaR. I also discuss the strengths and weaknesses of these measures. Another risk measure not widely known thus far is EVaR. I discuss EVaR as an advancement in risk measurement methodologies, and argue that EVaR is not a good measure of risk since it is non-coherent. This leads to the next chapter, which presents ECVaR as a better risk measure that is coherent and can capture extreme market events.

Chapter 3 will be concerned with extreme conditional value-at-risk (ECVaR) as a convenient modelling framework that naturally overcomes the normality assumption of asset returns in the modelling of extreme market events.
This is followed by a comparative analysis of EVaR and ECVaR using financial data covering both the pre-crisis and the financial crisis periods. Chapter 4 will be concerned with data sources, preliminary data description, and the estimation of EVaR and ECVaR. Chapter 5 will discuss the empirical results and the implications for risk measurement. Finally, chapter 6 will give conclusions and highlight directions for future research.

CHAPTER 2: RISK MEASUREMENT AND THE EMPIRICAL DISTRIBUTION OF FINANCIAL RETURNS

2.1 Risk Measurement in Finance: A Review of Its Origins

The concept of risk was known for many years before Markowitz's portfolio theory (MPT). Bernoulli (1738) solved the St. Petersburg paradox and derived fundamental insights into risk-averse behaviour and the benefits of diversification. In his formulation of expected utility theory, Bernoulli did not define risk explicitly; however, he inferred it from the shape of the utility function (Butler et al. (2005:134); Brachinger and Weber (1997:236)). Irving Fisher (1906) suggested the use of variance to measure economic risk. Von Neumann and Morgenstern (1947) used expected utility theory in the analysis of games and consequently deduced much of the modern understanding of decision making under risk or uncertainty. Therefore, contrary to popular belief, the concept of risk was known well before MPT.

Even though the concept of risk was known before MPT, Markowitz (1952) first provided a systematic algorithm to measure risk, using the variance in the formulation of the mean-variance model, for which he won the Nobel Prize in 1990. The development of the mean-variance model inspired research into decision making under risk and the development of risk measures. The study of risk and decision making under uncertainty (which is treated the same as risk in most cases) stretches across disciplines. In decision science and psychology, Coombs and Pruitt (1960), Pruitt (1962), Coombs (1964), Coombs and Meyer (1969), and Coombs and Huang (1970a, 1970b) studied the perception of gambles and how their preference is affected by their perceived risk. In economics, finance and measurement theory, Markowitz (1952, 1959), Tobin (1958), Pratt (1964), Pollatsek and Tversky (1970), Luce (1980) and others investigated portfolio selection and the measurement of the risk of those portfolios, and of gambles in general. Their collective work produced a number of risk measures that vary in how they rank the riskiness of options, portfolios, or gambles. Though the risk measures vary, Pollatsek and Tversky (1970:541) recognise that they share the following:

(1) Risk is regarded as a property of choosing among options.
(2) Options can be meaningfully ordered according to their riskiness.
(3) As suggested by Irving Fisher in 1906, the risk of an option is somehow related to the variance or dispersion in its outcomes.

In addition to these basic properties, Markowitz regards risk as a 'bad', implying something that is undesirable. Since Markowitz (1952), many risk measures, such as the semi-variance, the absolute deviation, and the lower semi-variance (see Brachinger and Weber (1997)), have been developed; however, the variance continued to dominate empirical finance. It was in the 1990s that a new measure, VaR, was popularised and became the industry standard risk measure. I present this risk measure in the next section.
2.2 Value-at-Risk (VaR)

2.2.1 Definition and concepts

Besides these basic ideas concerning risk measures, there is no universally accepted definition of risk (Pollatsek and Tversky, 1970:541); as a result, risk measures continue to be developed. J.P. Morgan and Reuters (1996) pioneered a major breakthrough in the advancement of risk measurement with the use of value-at-risk (VaR), followed by the Basel Committee recommendation that banks could use it for their internal risk management. VaR is concerned with measuring the risk of a financial position due to the uncertainty regarding future levels of interest rates, stock prices, commodity prices, and exchange rates. The risk resulting from the movement of these market factors is called market risk. VaR is the expected maximum loss of a financial position with a given level of confidence over a specified horizon. VaR provides answers to the question: what is the maximum loss that I can incur over, say, the next ten days with 99 percent confidence? Put differently, what is the maximum loss that will be exceeded only one percent of the time over the next ten days?

I illustrate the computation of VaR using one of the available methods, namely parametric VaR. I denote by $r_t$ the rate of return and by $P_t$ the portfolio value at time $t$. Then $r_t$ is given by

$r_t = \dfrac{P_t - P_{t-1}}{P_{t-1}}$ (1)

The actual loss (the negative of the profit, which is $P_t - P_{t-1}$) is given by

$L_t = -(P_t - P_{t-1})$ (2)

When $r_t$ is normally distributed (as is normally assumed), the variable $z_t = (r_t - \mu)/\sigma$ has a standard normal distribution with mean $0$ and standard deviation $1$. We can calculate VaR from the following equation:

$\Pr(r_t \le -\mathrm{VaR}) = 1 - c$ (3)

where $c$ implies a confidence level. If we assume a 99% confidence level, we have

$\Pr\left(z_t \le \dfrac{-\mathrm{VaR} - \mu}{\sigma}\right) = 0.01$ (4)

In (4) we have $-2.33$ as our standardised threshold at the 99% confidence level, and we will exceed this VaR only 1% of the time. From (4), it can be shown that the 99% confidence VaR is given by

$\mathrm{VaR}_{0.99} = 2.33\,\sigma - \mu$ (5)

Generalising from (5), we can state the $\alpha$-quantile VaR of the distribution as follows:

$\mathrm{VaR}_{\alpha} = \sigma\,\Phi^{-1}(\alpha) - \mu$ (6)

where $\Phi^{-1}$ is the inverse of the standard normal distribution function. VaR is an intuitive measure of risk that can be easily implemented, which is evident in its wide use in the industry. However, is it an optimal measure? The next section addresses the limitations of VaR.
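To make the parametric calculation in equations (1)-(6) concrete, the following is a minimal Python sketch; the function names and the simulated data are illustrative assumptions rather than part of any cited methodology. It computes the Gaussian VaR of equation (6) alongside a historical (empirical-quantile) VaR for comparison.

```python
import numpy as np
from scipy.stats import norm

def parametric_var(returns, alpha=0.99):
    """Gaussian VaR of equation (6): sigma * Phi^{-1}(alpha) - mu,
    expressed as a positive loss fraction of portfolio value."""
    mu = np.mean(returns)
    sigma = np.std(returns, ddof=1)
    return sigma * norm.ppf(alpha) - mu

def historical_var(returns, alpha=0.99):
    """Non-parametric VaR: the empirical alpha-quantile of losses."""
    losses = -np.asarray(returns)
    return np.quantile(losses, alpha)

# Illustration on simulated daily returns (i.i.d. normal here purely
# for demonstration; real returns are fat-tailed, as argued below).
rng = np.random.default_rng(0)
r = rng.normal(0.0005, 0.01, size=1000)
print(parametric_var(r), historical_var(r))
```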
2.2.2 Limitations of VaR

Artzner et al. (1997, 1999) developed a set of axioms such that, if a risk measure satisfies them, that risk measure is 'coherent'. The implication of coherent measures of risk is that "it is not possible to assign a function for measuring risk unless it satisfies these axioms" (Mitra, 2009:8). Risk measures that satisfy these axioms can be considered universal and optimal, since they are founded on the same generally accepted mathematical axioms. Let $\rho$ be a risk measure defined on two portfolios $X$ and $Y$. Then $\rho$ is coherent if it satisfies the following axioms:

(1) Monotonicity: if $X \le Y$ then $\rho(X) \ge \rho(Y)$. We interpret the monotonicity axiom to mean that higher losses are associated with higher risk.

(2) Homogeneity: $\rho(\lambda X) = \lambda\,\rho(X)$ for $\lambda \ge 0$. Assuming that there is no liquidity risk, the homogeneity axiom means that risk is not a function of the quantity of a stock purchased; therefore we cannot reduce or increase risk by investing different amounts in the same stock.

(3) Translation invariance: $\rho(X + c\,r_f) = \rho(X) - c$, where $r_f$ is a riskless security. This means that investing in a riskless asset does not increase risk with certainty.

(4) Sub-additivity: $\rho(X + Y) \le \rho(X) + \rho(Y)$. Possibly the most important axiom, sub-additivity ensures that a risk measure reflects diversification: the combined risk of two portfolios is no greater than the sum of the risks of the individual portfolios.

VaR does not satisfy the most important axiom, sub-additivity, thus it is non-coherent. Moreover, VaR tells us what we can expect to lose if an extreme event does not occur; it does not tell us the extent of the losses we can incur if a "tail" event occurs. VaR is therefore not an optimal measure of risk. The non-coherence, and therefore non-optimality, of VaR as a measure of risk led to the development of conditional value-at-risk (CVaR) by Artzner et al. (1997, 1999), and Uryasev and Rockafeller (1999). I discuss CVaR in the next section.

2.3 Conditional Value-at-Risk

CVaR is also known as "expected shortfall" (ES), "tail VaR", or "tail conditional expectation", and it measures risk beyond VaR. Yamai and Yoshiba (2002) define CVaR as the conditional expectation of losses given that the losses exceed VaR. Mathematically, CVaR is given by the following:

$\mathrm{CVaR}_{\alpha} = E\left[L \mid L \ge \mathrm{VaR}_{\alpha}\right]$ (7)

CVaR offers more insight concerning risk than VaR, in that it tells us what we can expect to lose if the losses exceed VaR. Unfortunately, the finance industry has been slow in adopting CVaR as its preferred risk measure. This is despite the fact that "the actuarial/insurance community has tended to pick up on developments in financial risk management much more quickly than financial risk managers have picked up on developments in actuarial science" (Dowd and Blake (2006:194)). Hopefully, the effects of the financial crisis will change this observation.

In much of the application of VaR and CVaR, returns have been assumed to be normally distributed. However, it is widely accepted that returns are not normally distributed. The implication is that VaR and CVaR as currently used in finance will not capture extreme losses. This will lead to the underestimation of risk and inadequate capital allocation across business units; in times of market stress when extra capital is required, it will be inadequate, and this may lead to the insolvency of financial institutions. Methodologies that can capture extreme events are therefore needed. In the next section, I discuss the empirical evidence on financial returns, and thereafter discuss extreme value theory (EVT) as a suitable framework for modelling extreme losses.
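As a concrete reading of equation (7), the sketch below estimates CVaR empirically as the average of losses at or beyond the VaR quantile, and, under the normal model, via the closed form $\mathrm{CVaR}_{\alpha} = \sigma\,\phi(\Phi^{-1}(\alpha))/(1-\alpha) - \mu$, where $\phi$ is the standard normal density. The function names and simulated data are again illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def historical_cvar(returns, alpha=0.99):
    """Empirical CVaR (expected shortfall): the mean loss, conditional
    on the loss reaching at least the empirical VaR, per equation (7)."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def normal_cvar(returns, alpha=0.99):
    """CVaR under normality: sigma * phi(z_alpha) / (1 - alpha) - mu."""
    mu, sigma = np.mean(returns), np.std(returns, ddof=1)
    z = norm.ppf(alpha)
    return sigma * norm.pdf(z) / (1 - alpha) - mu

rng = np.random.default_rng(0)
r = rng.normal(0.0005, 0.01, size=1000)
print(historical_cvar(r), normal_cvar(r))
```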
2.4 The Empirical Distribution of Financial Returns

Back in 1947, Geary wrote, "Normality is a myth; there never was, and never will be, a normal distribution" (as cited by Krishnaiah (1980:279)). Today this remark is supported by a voluminous amount of empirical evidence against normally distributed returns; nevertheless, normality continues to be the workhorse of empirical finance. If the normality assumption fails to pass empirical tests, why are practitioners so obsessed with the bell curve? Could their obsession be justified? To uncover some of the possible responses to these questions, let us first look at the importance of being normal, and then at the dangers of incorrectly assuming normality.

2.4.1 The Importance of Being Normal

The normal distribution is the most widely used distribution in statistical analysis in all fields that utilise statistics to explain phenomena. The normal distribution can be assumed for a population, and it gives a rich set of mathematical results (Mardia, 1980:279). In other words, the mathematical representations are tractable and easy to implement. A population can simply be explained by its mean and variance when the normal distribution is assumed. The panacea advantage is that the modelling process under the normality assumption is very simple. In fields that deal with natural phenomena, such as physics and geology, the normal distribution has unequivocally succeeded in explaining the variables of interest. The same cannot be said of the finance field. The normal probability distribution has been subject to rigorous empirical rejection: a number of stylized facts of asset returns, statistical tests of normality, and the occurrence of extreme negative returns dispute the normal distribution as the underlying data-generating process for asset returns. We briefly discuss these empirical findings next.

2.4.2 Deviations from Normality

Ever since Mandelbrot (1963), Fama (1963), and Fama (1965), among others, it has been a known fact that asset returns are not normally distributed. The combined empirical evidence since the 1960s points to the following stylized facts of asset returns:

(1) Volatility clustering: periods of high volatility tend to be followed by periods of high volatility, and periods of low volatility tend to be followed by low volatility.
(2) Autoregressive price changes: a price change depends on price changes in past periods.
(3) Skewness: positive price changes and negative price changes are not of the same magnitude.
(4) Fat tails: the probabilities of extreme negative (or positive) returns are much larger than predicted by the normal distribution.
(5) Time-varying tail thickness: more extreme losses occur during turbulent market activity than during normal market activity.
(6) Frequency-dependent fat tails: high-frequency data tend to be more fat-tailed than low-frequency data.

In addition to these stylized facts of asset returns, the extreme events of the 1974 German banking crisis, the 1978 banking crisis in Spain, the 1990s Japanese banking crisis, September 2001, and the 2007-2008 US experience (BIS, 2004) could not have happened under the normal distribution. Alternatively, we could just have treated them as outliers and disregarded them; however, experience has shown that even those who are obsessed with the Gaussian distribution could not ignore the detrimental effects of the 2007-2008 global financial crisis.

With these empirical facts known to the quantitative finance community, what is the motivation for the continued use of the normality assumption? It could be that those who stick with the normality assumption know only how to deal with normally distributed data. It is their hammer; everything that comes their way seems like a nail! As Esch (2010) notes, even those who do have other tools to deal with non-normal data continue to use the normal distribution on the grounds of parsimony. However, "representativity should not be sacrificed for simplicity" (Fabozzi et al., 2011:4). Better modelling frameworks to deal with the extreme values that are characteristic of departures from normality have been developed. Extreme value theory is one such methodology: it has enjoyed success in fields outside finance, and has been used to model financial losses with success. In the next chapter, I present extreme value-based methodologies as a practical and better way to overcome non-normality in asset returns.
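These stylized facts can be checked on any return series with simple moment-based diagnostics. The sketch below, a minimal illustration with assumed function names and simulated Student-t data, reports sample skewness, excess kurtosis, and the Jarque-Bera statistic, which jointly tests whether both moments match those of the normal distribution.

```python
import numpy as np
from scipy import stats

def normality_report(returns):
    """Skewness and excess kurtosis (both 0 under normality), plus the
    Jarque-Bera test statistic and its p-value."""
    r = np.asarray(returns)
    jb = stats.jarque_bera(r)
    return {"skewness": stats.skew(r),
            "excess_kurtosis": stats.kurtosis(r),  # Fisher definition
            "jb_statistic": jb.statistic,
            "jb_pvalue": jb.pvalue}

rng = np.random.default_rng(0)
fat_tailed = 0.01 * rng.standard_t(df=4, size=2520)  # ~10 years, daily
print(normality_report(fat_tailed))  # p-value near 0: normality rejected
```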
CHAPTER 3: EXTREME VALUE THEORY: A SUITABLE AND ADEQUATE FRAMEWORK?

3.1 Extreme Value Theory

Extreme value theory was developed to model extreme natural phenomena such as floods, extreme winds, and temperature, and is well established in fields such as engineering, insurance, and climatology. It provides a convenient way to model the tails of distributions so as to capture non-normal activity. Since it concentrates on the tails of distributions, it has been adopted to model asset returns in times of extreme market activity (see Embrechts et al. (1997); McNeil and Frey (2000); Danielsson and de Vries (2000)). Gilli and Kellezi (2003) point out two related ways of modelling extreme events. The first describes the maximum loss through a limit distribution known as the generalised extreme value (GEV) distribution, a family of asymptotic distributions that describe normalised maxima or minima. The second provides an asymptotic distribution that describes the limit distribution of scaled excesses over high thresholds, and is known as the generalised Pareto distribution (GPD). The two limit distributions result in two approaches to EVT-based modelling: the block of maxima method and the peaks over threshold method respectively.

3.2 The Block of Maxima Method

Let us consider independent and identically distributed (i.i.d.) random variables $X_1, \ldots, X_n$ with common distribution function $F$. Let $M_n = \max(X_1, \ldots, X_n)$ be the maximum of the first $n$ random variables, and let $x_F$ be the upper end point of $F$. The corresponding results for the minima can be obtained from the following identity:

$\min(X_1, \ldots, X_n) = -\max(-X_1, \ldots, -X_n)$ (8)

$M_n$ almost surely converges to $x_F$, whether it is finite or infinite. Following Embrechts et al. (1997), and Shanbhag and Rao (2003), the limit theory finds norming constants $a_n > 0$ and $b_n$, and a non-degenerate distribution function $H$, in such a way that the distribution function of a normalised version of $M_n$ converges to $H$ as follows:

$\Pr\left(\dfrac{M_n - b_n}{a_n} \le x\right) = F^n(a_n x + b_n) \to H(x)$, as $n \to \infty$ (9)

$H$ is an extreme value distribution function, and $F$ is in the domain of attraction of $H$ (written $F \in \mathrm{MDA}(H)$) if equation (9) holds for suitable values of $a_n$ and $b_n$. It can also be said that two extreme value distribution functions $H$ and $H^*$ belong to the same family if $H^*(x) = H(ax + b)$ for some $a > 0$, $b$, and all $x$. Fisher and Tippett (1928), De Haan (1970, 1976), Weissman (1978), and Embrechts et al. (1997) show that the limit distribution function $H$ belongs to one of the following three families for some $\alpha > 0$:

Frechet: $\Phi_\alpha(x) = 0$ for $x \le 0$, and $\Phi_\alpha(x) = \exp(-x^{-\alpha})$ for $x > 0$ (10)

Weibull: $\Psi_\alpha(x) = \exp(-(-x)^{\alpha})$ for $x \le 0$, and $\Psi_\alpha(x) = 1$ for $x > 0$ (11)

Gumbel: $\Lambda(x) = \exp(-e^{-x})$ for all real $x$ (12)

Any extreme value distribution can be classified as one of the three types in (10), (11) and (12). $\Phi_\alpha$, $\Psi_\alpha$ and $\Lambda$ are the standard extreme value distributions, and the corresponding random variables are called standard extreme random variables. For alternative characterisations of the three distributions, see Nagaraja (1988), and Khan and Beg (1987).

3.3 The Generalized Extreme Value Distribution

The three distribution functions given in (10), (11) and (12) above can be combined into one three-parameter distribution called the generalised extreme value (GEV) distribution, given by

$H_{\xi,\mu,\sigma}(x) = \exp\left\{-\left(1 + \xi\,\dfrac{x - \mu}{\sigma}\right)^{-1/\xi}\right\}$, with $1 + \xi\,\dfrac{x - \mu}{\sigma} > 0$ (13)

We denote the GEV by $H_{\xi,\mu,\sigma}$, and the values $\xi > 0$, $\xi < 0$ and $\xi \to 0$ give rise to the three distribution functions in (10)-(12). In equation (13), $\mu$, $\sigma$ and $\xi$ represent the location parameter, the scale parameter, and the tail-shape parameter respectively: $\xi > 0$ corresponds to the Frechet distribution, $\xi < 0$ corresponds to the Weibull distribution, and the case $\xi \to 0$ reduces to the Gumbel distribution. To obtain the estimates of $(\xi, \mu, \sigma)$ we use the maximum likelihood method, following Kabundi and Mwamba (2009). To start with, we fit the sample of maximum losses $M_1, \ldots, M_m$ to a GEV. Thereafter, we use the maximum likelihood method to estimate the parameters of the GEV from the logarithm of the likelihood function, given by

$\ln L(\xi,\mu,\sigma) = -m \ln\sigma - \left(1 + \dfrac{1}{\xi}\right)\sum_{i=1}^{m}\ln\left(1 + \xi\,\dfrac{M_i - \mu}{\sigma}\right) - \sum_{i=1}^{m}\left(1 + \xi\,\dfrac{M_i - \mu}{\sigma}\right)^{-1/\xi}$ (14)

To obtain the estimates of $(\xi, \mu, \sigma)$ we take partial derivatives of equation (14) with respect to $\xi$, $\mu$ and $\sigma$, and equate them to zero.
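A minimal Python sketch of this block-maxima fit follows; the block size, the simulated data and the function names are assumptions for illustration only. Note that scipy parameterises the GEV in (13) with shape $c = -\xi$, so a fitted $c < 0$ indicates the heavy-tailed Frechet case.

```python
import numpy as np
from scipy.stats import genextreme

def block_maxima(losses, block_size=21):
    """Maximum loss in each non-overlapping block of `block_size`
    observations (roughly one trading month for daily data)."""
    losses = np.asarray(losses)
    n_blocks = len(losses) // block_size
    blocks = losses[:n_blocks * block_size].reshape(n_blocks, block_size)
    return blocks.max(axis=1)

# Illustrative data: fat-tailed Student-t returns.
rng = np.random.default_rng(1)
returns = 0.01 * rng.standard_t(df=4, size=2520)
maxima = block_maxima(-returns)        # losses are negated returns

# Maximum likelihood fit of the GEV in (13)-(14).
c, loc, scale = genextreme.fit(maxima)
xi = -c  # convert scipy's shape convention to the xi of equation (13)
print(f"xi={xi:.3f}, mu={loc:.4f}, sigma={scale:.4f}")
```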
3.3.1 Extreme Value-at-Risk

EVaR, defined as the maximum likelihood quantile estimator of the fitted GEV, is by definition given by

$H_{\hat\xi,\hat\mu,\hat\sigma}\left(\widehat{\mathrm{EVaR}}_{\alpha}\right) = \alpha$ (15)

The quantity $\widehat{\mathrm{EVaR}}_{\alpha}$ is the $\alpha$-quantile of $H_{\hat\xi,\hat\mu,\hat\sigma}$, and I denote it the alpha-percent VaR, specified as follows, following Kabundi and Mwamba (2009) and Embrechts et al. (1997):

$\widehat{\mathrm{EVaR}}_{\alpha} = \hat\mu - \dfrac{\hat\sigma}{\hat\xi}\left[1 - \left(-\ln\alpha\right)^{-\hat\xi}\right]$ (16)

Even though EVaR captures extreme losses, by extension from VaR it is non-coherent. As such, it cannot be used for the purpose of portfolio optimisation, since it does not reflect diversification. To overcome this problem, in the next section I extend CVaR to ECVaR so as to capture extreme losses coherently.

3.3.2 Extreme Conditional Value-at-Risk (ECVaR): An Extreme Coherent Measure of Risk

I extend EVaR to ECVaR in the same manner that VaR is extended to CVaR. ECVaR can therefore be expressed as follows:

$\mathrm{ECVaR}_{\alpha} = E\left[L \mid L > \mathrm{EVaR}_{\alpha}\right]$ (17)

In the following chapter, I describe the data and its sources.
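Before turning to the data, here is a minimal sketch of how EVaR and ECVaR could be computed from fitted GEV parameters; the parameter values below are assumptions for illustration. EVaR is the GEV quantile of equation (16), and the ECVaR of equation (17) is approximated by averaging quantiles over the tail, using the identity $\mathrm{ECVaR}_{\alpha} = \frac{1}{1-\alpha}\int_{\alpha}^{1} \mathrm{EVaR}_u\,du$ for a continuous loss distribution; the tail mean is finite only when $\xi < 1$.

```python
import numpy as np
from scipy.stats import genextreme

def evar(alpha, c, loc, scale):
    """EVaR_alpha: the alpha-quantile of the fitted GEV, equation (16)."""
    return genextreme.ppf(alpha, c, loc=loc, scale=scale)

def ecvar(alpha, c, loc, scale, n_grid=100_000):
    """ECVaR_alpha of equation (17), computed as the average of GEV
    quantiles above alpha; finite only for tail shape xi = -c < 1."""
    u = np.linspace(alpha, 1.0, n_grid, endpoint=False)
    return genextreme.ppf(u, c, loc=loc, scale=scale).mean()

# Example with assumed fitted parameters (xi = 0.2, so scipy c = -0.2):
c, loc, scale = -0.2, 0.02, 0.008
print(evar(0.99, c, loc, scale), ecvar(0.99, c, loc, scale))
```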
CHAPTER 4: DATA DESCRIPTION

I will use the stock market indices of five advanced economies, comprising the United States, Japan, Germany, France, and the United Kingdom, and five emerging economies, comprising Brazil, Russia, India, China, and South Africa. Possible sources of the data are I-Net Bridge, Bloomberg, and the individual countries' central banks.

CHAPTER 5: DISCUSSION OF EMPIRICAL RESULTS

In this chapter, I will discuss the empirical results. Specifically, the adequacy of ECVaR will be discussed relative to that of EVaR. Implications for risk measurement will also be discussed in this chapter.

CHAPTER 6: CONCLUSIONS

This chapter will give concluding remarks, and directions for future research.

References

[1] Markowitz, H.M.: 1952, Portfolio selection. Journal of Finance 7, 77-91.
[2] Roy, A.D.: 1952, Safety first and the holding of assets. Econometrica 20(3), 431-449.
[3] Sharpe, W.F.: 1964, Capital asset prices: a theory of market equilibrium under conditions of risk. The Journal of Finance 19(3), 425-442.
[4] Black, F., and Scholes, M.: 1973, The pricing of options and corporate liabilities. Journal of Political Economy 81, 637-654.
[5] Merton, R.C.: 1973, The theory of rational option pricing. Bell Journal of Economics and Management Science, Spring.
[6] Artzner, Ph., Delbaen, F., Eber, J.-M., and Heath, D.: 1997, Thinking coherently. Risk 10(11), 68-71.
[7] Artzner, Ph., Delbaen, F., Eber, J.-M., and Heath, D.: 1999, Coherent measures of risk. Mathematical Finance 9(3), 203-228.
[8] Bernoulli, D.: 1954, Exposition of a new theory on the measurement of risk. Econometrica 22(1), 23-36. Translation of a paper originally published in Latin in St. Petersburg in 1738.
[9] Butler, J.C., Dyer, J.S., and Jia, J.: 2005, An empirical investigation of the assumptions of risk-value models. Journal of Risk and Uncertainty 30(2), 133-156.
[10] Brachinger, H.W., and Weber, M.: 1997, Risk as a primitive: a survey of measures of perceived risk. OR Spektrum 19, 235-250.
[11] Fisher, I.: 1906, The Nature of Capital and Income. Macmillan.
[12] von Neumann, J., and Morgenstern, O.: 1947, Theory of Games and Economic Behavior, 2nd ed. Princeton University Press.
[13] Coombs, C.H., and Pruitt, D.G.: 1960, Components of risk in decision making: probability and variance preferences. Journal of Experimental Psychology 60, 265-277.
[14] Pruitt, D.G.: 1962, Pattern and level of risk in gambling decisions. Psychological Review 69, 187-201.
[15] Coombs, C.H.: 1964, A Theory of Data. New York: Wiley.
[16] Coombs, C.H., and Meyer, D.E.: 1969, Risk preference in coin-toss games. Journal of Mathematical Psychology 6, 514-527.
[17] Coombs, C.H., and Huang, L.C.: 1970a, Polynomial psychophysics of risk. Journal of Experimental Psychology 7, 317-338.
[18] Markowitz, H.M.: 1959, Portfolio Selection: Efficient Diversification of Investments. Yale University Press, New Haven, USA.
[19] Tobin, J.E.: 1958, Liquidity preference as behavior towards risk. Review of Economic Studies, 65-86.
[20] Pratt, J.W.: 1964, Risk aversion in the small and in the large. Econometrica 32, 122-136.
[21] Pollatsek, A., and Tversky, A.: 1970, A theory of risk. Journal of Mathematical Psychology 7, 540-553.
[22] Luce, R.D.: 1980, Several possible measures of risk. Theory and Decision 12, 217-228.
[23] J.P. Morgan and Reuters: 1996, RiskMetrics Technical Document. Available at http://riskmetrics.comrmcovv.html
[24] Uryasev, S., and Rockafeller, R.T.: 1999, Optimization of conditional value-at-risk. Available at http://www.gloriamundi.org
[25] Mitra, S.: 2009, Risk measures in quantitative finance. Available online.
[26] Geary, R.C.: 1947, Testing for normality. Biometrika 34, 209-242.
[27] Mardia, K.V.: 1980, in P.R. Krishnaiah, ed., Handbook of Statistics, Vol. 1. North-Holland Publishing Company, 279-320.
[28] Mandelbrot, B.: 1963, The variation of certain speculative prices. Journal of Business 36, 394-419.
[29] Fama, E.: 1963, Mandelbrot and the stable Paretian hypothesis. Journal of Business 36, 420-429.
[30] Fama, E.: 1965, The behavior of stock market prices. Journal of Business 38, 34-105.
[31] Esch, D.: 2010, Non-normality facts and fallacies. Journal of Investment Management 8(1), 49-61.
[32] Stoyanov, S.V., Rachev, S., Racheva-Iotova, B., and Fabozzi, F.J.: 2011, Fat-tailed models for risk estimation. Journal of Portfolio Management 37(2). Available at http://www.iijournals.com/doi/abs/10.3905/jpm.2011.37.2.107
[33] Embrechts, P., Kluppelberg, C., and Mikosch, T.: 1997, Modelling Extremal Events for Insurance and Finance. Springer.
[34] McNeil, A., and Frey, R.: 2000, Estimation of tail-related risk measures for heteroscedastic financial time series: an extreme value approach. Journal of Empirical Finance 7(3-4), 271-300.
[35] Danielsson, J., and de Vries, C.: 2000, Value-at-risk and extreme returns. Annales d'Economie et de Statistique 60, 239-270.
[36] Gilli, M., and Kellezi, E.: 2003, An application of extreme value theory for measuring risk. Department of Econometrics, University of Geneva, Switzerland. Available from: http://www.gloriamundi.org/picsresources/mgek.pdf
[37] Shanbhag, D.N., and Rao, C.R.: 2003, Extreme value theory, models and simulation. Handbook of Statistics, Vol. 21. Elsevier Science B.V.
[38] Fisher, R.A., and Tippett, L.H.C.: 1928, Limiting forms of the frequency distribution of the largest or smallest member of a sample. Proc. Cambridge Philos. Soc. 24, 180-190.
[39] De Haan, L.: 1970, On Regular Variation and Its Application to the Weak Convergence of Sample Extremes. Mathematical Centre Tract 32, Mathematisch Centrum, Amsterdam.
[40] De Haan, L.: 1976, Sample extremes: an elementary introduction. Statistica Neerlandica 30, 161-172.
[41] Weissman, I.: 1978, Estimation of parameters and large quantiles based on the k largest observations. J. Amer. Statist. Assoc. 73, 812-815.
[42] Nagaraja, H.N.: 1988, Some characterizations of continuous distributions based on regressions of adjacent order statistics and record values. Sankhya A 50, 70-73.
[43] Khan, A.H., and Beg, M.I.: 1987, Characterization of the Weibull distribution by conditional variance. Sankhya A 49, 268-271.
[44] Kabundi, A., and Mwamba, J.W.M.: 2009, Extreme value at risk: a scenario for risk management. SAJE, forthcoming.

Saturday, December 7, 2019

Synthesis Preserving Artifacts Essay

Priceless artifacts, made centuries ago, should by no means belong to an individual. Who gets to decide who owns them? Were they created for an individual or a nation? Artifacts, which hold the culture, stories and even the past of civilizations, should belong to everyone. "Does it really matter who owns a particular artifact, whether it is a museum in the first world or a nation in the third world?" (Source A). The question of whether to remove an artifact from its place of origin, and of who it belongs to, has long been debated. Removing historic artifacts from their place of origin is in fact essential and necessary to ensure a fuller understanding of human history that can be shared by the world.

Through global warming and natural disasters, treasured artifacts left in the spots where they were found are likely to be destroyed. Warfare and terrorist attacks can also lead to the disappearance of a treasured artifact. "In 2001, the ruling Taliban blew up this 175 foot tall Buddha, which dated back fifteen hundred years" (Source C). This event is a perfect example of why artifacts need to be preserved and how, if left alone, they can be destroyed. The statue of Buddha represented the pride and strength of the religion, standing tall for years. By putting priceless and treasured artifacts in a secured place like a museum, they will have the protection and the ability to continue to exist and teach others about the past.

Besides educating ordinary people, museums also serve as research centers for historians, archeologists and other scholars. It is through their studies that we know so much today about the lives of our ancient ancestors. If the artifacts were scattered around the globe, it would be more difficult for experts to travel in order to study them. Museums are therefore major contributors to the advancement of knowledge about our past. "They are not in the collection of the art museum for the art museum. They are there for the public. Besides, we have too many examples where presumed countries of origin could not preserve their antiquities: Afghanistan and Iraq, only most recently" (Source A). With the advanced facilities and security that museums of today provide, artifacts have a higher chance of being preserved. The protection of these artifacts is also questionable in countries in financial difficulty; if a country fails to protect them, then in no way should they be kept by it.

"What does it mean, exactly, for something to belong to a people?" (Source E). To what extent does an artifact belong to a group of people or to an individual? Modern countries should not claim monuments and statues merely on the basis that they were found in their land, when in reality the artifacts date back centuries before the country's borders were even drawn. "Most of Nigeria's cultural heritage was produced before the modern Nigerian state existed. So why should Nigeria have a special claim on those objects, buried in the forest and forgotten for so long?" (Source E). Every individual, regardless of their cultural background, should have the same rights of ownership over the artifacts. Only by preserving artifacts in a secured and well-equipped building will treasured artifacts last long enough for future generations to study and appreciate them.
Removing historic artifacts from their place of origin is in fact a necessary act to ensure a fuller understanding of human history that can be shared by the world.