889 results for Realized volatility
Abstract:
Whether ethical screening affects portfolio performance is an important question that is yet to be settled in the literature. This paper aims to shed further light on this question by examining the performance of a large global sample of Islamic equity funds (IEFs) from 1984 to 2010. We find that IEFs underperform conventional funds by an average of 40 basis points per month, consistent with the underperformance hypothesis. In line with popular media claims that Islamic funds are a safer investment, IEFs outperformed conventional funds during the recent banking crisis. However, we find no such outperformance for other crises or high-volatility periods. Based on fund holdings-based data, we provide evidence of a negative curvilinear relation between fund performance and ethical screening intensity, consistent with a return trade-off for being more ethical.
Abstract:
The Kyoto Protocol is remarkable among global multilateral environmental agreements for its efforts to depoliticize compliance. However, attempts to create autonomous, arm’s length and rule-based compliance processes with extensive reliance on putatively neutral experts were only partially realized in practice in the first commitment period from 2008 to 2012. In particular, the procedurally constrained facilitative powers vested in the Facilitative Branch were circumvented, and expert review teams (ERTs) assumed pivotal roles in compliance facilitation. The ad hoc diplomatic and facilitative practices engaged in by these small teams of technical experts raise questions about the reliability and consistency of the compliance process. For the future operation of the Kyoto compliance system, it is suggested that ERTs should be confined to more technical and procedural roles, in line with their expertise. There would then be greater scope for the Facilitative Branch to assume a more comprehensive facilitative role, safeguarded by due process guarantees, in accordance with its mandate. However, if – as appears likely – the future compliance trajectories under the United Nations Framework Convention on Climate Change will include a significant role for ERTs without oversight by the Compliance Committee, it is important to develop appropriate procedural safeguards that reflect and shape the various technical and political roles these teams currently play.
Abstract:
The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data” is a relatively new concept with the potential to significantly reshape the industry. Despite the impressive value being realized from Big Data technologies in other parts of the marketplace, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry and compares them to practices and philosophies that have emerged in organizations that are leading the way in Big Data. The comparison shows that, in companies widely considered to be leaders in Big Data analytics, data is regarded as a valuable asset in its own right; this is usually not true within the oil and gas industry, where data is frequently regarded as descriptive information about a physical asset rather than as something valuable in and of itself. The paper then discusses how the industry could potentially extract more value from data, and concludes with a series of policy-related questions to this end.
Abstract:
Industrial production and supply chains face increased demands for mass customization and tightening regulations on the traceability of goods, leading to higher requirements concerning flexibility, adaptability, and transparency of processes. Technologies for the 'Internet of Things', such as smart products and semantic representations, pave the way for future factories and supply chains to fulfill these challenging market demands. In this chapter, a backend-independent approach for information exchange in open-loop production processes based on Digital Product Memories (DPMs) is presented. By storing order-related data directly on the item, relevant lifecycle information is attached to the product itself. In this way, information handover between several stages of the value chain, with a focus on the manufacturing phase of a product, is realized. In order to report best practices regarding the application of DPMs in the domain of industrial production, system prototype implementations focusing on the use case of producing and handling a smart drug case are illustrated.
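To make the idea of order-related data travelling with the item concrete, the following minimal Python sketch models a DPM as an append-only record serialized for backend-independent handover. The field names, stages, and JSON format are illustrative assumptions, not the chapter's actual DPM schema.

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class DigitalProductMemory:
    """Illustrative DPM record: order data plus append-only lifecycle
    events stored with the item rather than in a central backend."""
    product_id: str
    order_data: dict
    events: list = field(default_factory=list)

    def record_event(self, stage: str, data: dict) -> None:
        # Each value-chain stage appends its own entry to the memory.
        self.events.append({
            "stage": stage,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data": data,
        })

    def serialize(self) -> str:
        # Backend-independent handover: the memory travels as plain JSON.
        return json.dumps(asdict(self))

# Hypothetical handover across two production stages of a smart drug case.
dpm = DigitalProductMemory("case-0042", {"order": "A-17", "variant": "blister-12"})
dpm.record_event("filling", {"lot": "L-339", "units": 12})
dpm.record_event("quality_check", {"passed": True})
print(dpm.serialize())
```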
Abstract:
Recent 'Global Burden of Disease' studies have provided quantitative evidence of the significant role air pollution plays as a human health risk factor (Lim et al., The Lancet, 380: 2224–2260, 2012). Tobacco smoke, including second-hand smoke, household air pollution from solid fuels and ambient particulate matter are among the top risks, leading to lower life expectancy around the world. Indoor air constitutes an environment particularly rich in different types of pollutants, originating from indoor sources as well as penetrating from outdoors, mixing, interacting or growing (when considering microbes) under the protective enclosure of the building envelope. Therefore, it is not a simple task to follow the dynamics of the processes occurring there, or to quantify the outcomes of the processes in terms of pollutant concentrations and other characteristics. This is further complicated by limitations such as building access for the purpose of air quality monitoring, or the instrumentation which can be used indoors, because of its possible interference with the occupants' comfort (due to large size, generated noise or the amount of air drawn). European studies apportioned contributions of indoor versus outdoor sources of indoor air contaminants in 26 European countries and quantified IAQ-associated DALYs (Disability-Adjusted Life Years) in those countries (Jantunen et al., Promoting actions for healthy indoor air (IAIAQ), European Commission Directorate General for Health and Consumers, Luxembourg, 2011). At the same time, there has been an increase in research efforts around the world to better understand the sources, composition, dynamics and impacts of indoor air pollution. Particular focus has been directed towards contemporary sources, novel pollutants and new detection methods. The importance of exposure assessment and personal exposure, the majority of which occurs in various indoor microenvironments, has also been realized. Overall, this emerging knowledge has been providing input for global assessments of indoor environments, the impact of indoor pollutants, and their science-based management and control. It was a major outcome of recent international conferences that interdisciplinarity, and especially better collaboration between exposure and indoor sciences, would be of high benefit for the health-related evaluation of environmental stress factors and pollutants. A very good example is the combination of biomonitoring with indoor air, particle and dust analysis to study the exposure routes of semi-volatile organic compounds (SVOCs). We have adopted the idea of combining the forces of exposure and indoor sciences for this Special Issue, identified new and challenging topics, and attracted colleagues who are top researchers in their fields to provide their inputs. The Special Issue includes papers which collectively present advances in current research topics and, in our view, build the bridge between indoor and exposure sciences.
Abstract:
Enterprise Architecture Management (EAM) is discussed in academia and industry as a vehicle to guide IT implementations, alignment, compliance assessment, or technology management. Still, a lack of knowledge prevails about how EAM can be successfully used and how positive impact can be realized from it. To determine these factors, we identify EAM success factors and measures through literature reviews and exploratory interviews and propose a theoretical model that explains key factors and measures of EAM success. We test our model with data collected from a cross-sectional survey of 133 EAM practitioners. In a confirmatory analysis, the results support the impact of four distinct EAM success factors, ‘EAM product quality’, ‘EAM infrastructure quality’, ‘EAM service delivery quality’, and ‘EAM organizational anchoring’, and two important EAM success measures, ‘intentions to use EAM’ and ‘organizational and project benefits’. We found the construct ‘EAM organizational anchoring’ to be a core focal concept that mediated the effect of success factors such as ‘EAM infrastructure quality’ and ‘EAM service quality’ on the success measures. We also found that ‘EAM satisfaction’ was irrelevant to determining or measuring success. We discuss implications for theory and EAM practice.
Abstract:
2,4,6-Trinitrotoluene (TNT) is one of the most commonly used nitroaromatic explosives in the landmine, military and mining industries. This article demonstrates rapid and selective identification of TNT by surface-enhanced Raman spectroscopy (SERS) using 6-aminohexanethiol (AHT) as a new recognition molecule. First, Meisenheimer complex formation between AHT and TNT is confirmed by the development of a pink colour and the appearance of a new band around 500 nm in the UV-visible spectrum. A solution Raman spectroscopy study also supported AHT:TNT complex formation by demonstrating changes in the vibrational stretching of the AHT molecule between 2800 and 3000 cm−1. For SERS analysis, a self-assembled monolayer (SAM) of AHT is formed over the gold nanostructure (AuNS) SERS substrate in order to selectively capture TNT onto the surface. Electrochemical desorption and X-ray photoelectron studies are performed on the AHT SAM-modified surface to examine the presence of free amine groups with the appropriate orientation for complex formation. Further, an AHT and butanethiol (BT) mixed-monolayer system is explored to improve the AHT:TNT complex formation efficiency. Using a 9:1 AHT:BT mixed monolayer, a very low limit of detection (LOD) of 100 fM TNT was realized. The new method delivers high selectivity towards TNT over 2,4-DNT and picric acid. Finally, real sample analysis is demonstrated by the extraction and SERS detection of 302 pM of TNT from spiked samples.
Abstract:
The importance of modelling correlation has long been recognised in the field of portfolio management, with large-dimensional multivariate problems increasingly becoming the focus of research. This paper provides a straightforward and commonsense approach toward investigating a number of models used to generate forecasts of the correlation matrix for large-dimensional problems. We find evidence in favour of assuming equicorrelation across various portfolio sizes, particularly during times of crisis. During periods of market calm, however, the suitability of the constant conditional correlation model cannot be discounted, especially for large portfolios. A portfolio allocation problem is used to compare the forecasting methods: the global minimum variance portfolio and the Model Confidence Set are used to compare methods, while portfolio weight stability and relative economic value are also considered.
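As a minimal illustration of the allocation problem used in such comparisons, the Python sketch below builds a covariance matrix from an equicorrelation forecast and computes global minimum variance weights via the standard formula w = Σ⁻¹1 / (1′Σ⁻¹1). The common correlation of 0.3 and the volatilities are illustrative assumptions, not values from the paper.

```python
import numpy as np

def equicorrelation_matrix(n_assets: int, rho: float) -> np.ndarray:
    """Forecast correlation matrix under equicorrelation: ones on the
    diagonal, a common rho everywhere else."""
    return (1 - rho) * np.eye(n_assets) + rho * np.ones((n_assets, n_assets))

def gmv_weights(corr: np.ndarray, vols: np.ndarray) -> np.ndarray:
    """Global minimum variance weights w = Sigma^{-1} 1 / (1' Sigma^{-1} 1),
    with Sigma assembled from the correlation matrix and volatilities."""
    sigma = np.outer(vols, vols) * corr
    inv_one = np.linalg.solve(sigma, np.ones(len(vols)))
    return inv_one / inv_one.sum()

# Illustrative inputs: 5 assets, common correlation 0.3, assumed volatilities.
corr = equicorrelation_matrix(5, 0.3)
vols = np.array([0.15, 0.20, 0.12, 0.18, 0.25])
print(gmv_weights(corr, vols))  # weights tilt toward the low-volatility assets
```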
Abstract:
Unlike US and Continental European jurisdictions, Australian monetary policy announcements are not followed promptly by projection materials or comprehensive summaries that explain the decision process. This information is disclosed two weeks later, when the explanatory minutes of the Reserve Bank board meeting are released. This paper is the first study to exploit these features of the Australian monetary policy environment in order to examine the differential impact of monetary policy announcements and explanatory statements on the Australian interest rate futures market. We find that both monetary policy announcements and explanatory minutes releases have a significant impact on the implied yield and volatility of Australian interest rate futures contracts. When the differential impact of these announcements is examined using the full sample, no statistically significant difference is found. However, when the sample is partitioned into stable periods and the Global Financial Crisis, a differential impact is evident. Further, contrary to the findings of Kim and Nguyen (2008), Lu et al. (2009), and Smales (2012a), the response along the yield curve is found not to differ between the short and medium terms.
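As a rough sketch of the kind of event-window comparison such studies perform (not the authors' actual methodology), the Python snippet below measures the change in implied yield around each release and compares average absolute responses across the two announcement types; the window length, series names and timestamps are placeholder assumptions.

```python
import numpy as np
import pandas as pd

def event_window_changes(yields: pd.Series, event_times, window="30min"):
    """Change in implied yield from just before to just after each event.
    `yields` is a time-indexed series of implied yields on an interest
    rate futures contract; `event_times` are release timestamps."""
    delta = pd.Timedelta(window)
    changes = []
    for t in event_times:
        before = yields.asof(t - delta)  # last observation before the event
        after = yields.asof(t + delta)   # last observation within the window after
        changes.append(after - before)
    return np.array(changes)

# Hypothetical usage, assuming `yields`, `announcements`, `minutes_releases`:
# resp_ann = event_window_changes(yields, announcements)
# resp_min = event_window_changes(yields, minutes_releases)
# print(np.abs(resp_ann).mean(), np.abs(resp_min).mean())
```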
Abstract:
There are currently 23,500 level crossings in Australia, broadly divided into active level crossings, with flashing lights, and passive level crossings, controlled by stop and give way signs. The current strategy is to upgrade passive level crossings with active controls annually within a given budget, but the 5,900 public passive crossings are too numerous for all of them to be upgraded. The rail industry is therefore considering alternative options to treat more crossings. One such option is to use lower-cost equipment with a reduced safety integrity level, but with a design that fails to a safe state: if the system cannot determine whether a train is approaching, the crossing changes to a passive crossing. This is implemented by having a STOP sign come up in front of the flashing lights. While such a design is considered safe in engineering terms, questions remain about human factors. In order to evaluate whether this approach is safe, we conducted a driving simulator study in which participants were familiarized with the new active crossing before the signage was changed to a passive crossing. Our results show that drivers treated the new crossing as an active crossing after the novelty effect had passed. While most participants did not experience difficulties with the crossing being turned back into a passive crossing, a number of participants had difficulty stopping in time at their first encounter with such a passive crossing. Worse, a number of drivers never realized the signage had changed, highlighting the link between the decision to brake and stop at an active crossing and the flashing of the lights. These results show the potential human factors issues of changing an active crossing to a passive crossing upon failure of train detection.
Abstract:
This thesis improves our insight into the effects of using biodiesels on the particulate matter emissions of diesel engines and contributes to our understanding of their potential adverse health effects. The novelty of this project is the use of biodiesel fuel with a controlled chemical composition, which enables us to relate changes in the physicochemical properties of particles to specific properties of the biodiesel. For the first time, the possibility of a correlation between the volatility and the reactive oxygen species concentration of the particles and the saturation, oxygen content and carbon chain length of the fuel is investigated.
Abstract:
To date, a number of two-dimensional (2D) topological insulators (TIs) have been realized in Group 14 elemental honeycomb lattices, but all are inversion-symmetric. Here, based on first-principles calculations, we predict a new family of 2D inversion-asymmetric TIs with sizeable bulk gaps, from 105 meV to 284 meV, in X2–GeSn (X = H, F, Cl, Br, I) monolayers, making them in principle suitable for room-temperature applications. The nontrivial topological characteristics of inverted band orders are identified in pristine X2–GeSn with X = (F, Cl, Br, I), whereas H2–GeSn undergoes a nontrivial band inversion at 8% lattice expansion. Topologically protected edge states are identified in X2–GeSn with X = (F, Cl, Br, I), as well as in strained H2–GeSn. More importantly, the edges of these systems, which exhibit single-Dirac-cone characteristics located exactly in the middle of their bulk band gaps, are ideal for dissipationless transport. Thus, Group 14 elemental honeycomb lattices provide a fascinating playground for the manipulation of quantum states.
Abstract:
This paper studies arts industries in all 366 US metropolitan statistical areas between 1980 and 2010. Our analysis provides evidence that the arts are an important component of many regional economies, but also highlights their volatility. After radical growth and diffusion between 1980 and 2000, the arts industries in the last decade have been defined more by shrinkage and reconcentration in fewer metropolitan areas. Further, we find that the vast majority of metros have strengths in particular sets of arts industries. As we discuss in the conclusion, these conditions present challenges and opportunities for urban cultural policy that goes beyond the current focus on the arts as consumption amenities.
Abstract:
Aims: To discuss ethical issues that may arise in using wastewater analysis (WWA) to monitor illicit drug use in the general population and in entertainment precincts, prisons, schools and workplaces. Method: We review current applications of WWA and identify ethical and social issues that may be raised by current and projected future uses of this method. Results: WWA of drug residues is a promising method of monitoring illicit drug use that may overcome some limitations of other monitoring methods. When used for monitoring purposes in large populations, WWA does not raise major ethical concerns because individuals are not identified and the prospects of harming residents of catchment areas are remote. When WWA is used in smaller catchment areas (entertainment venues, prisons, schools or workplaces), its results could possibly affect the occupants adversely in indirect ways. Researchers will need to take care in reporting their results to reduce media misreporting. Fears about the possible use of WWA for mass individual surveillance by drug law enforcement officials are unlikely to be realized, but will need to be addressed because they may adversely affect public support for this type of research. Conclusions: Using wastewater analysis to monitor illicit drug use in large populations does not raise major ethical concerns, but researchers need to minimize possible adverse consequences when studying smaller populations, such as workers, prisoners and students.
Abstract:
This paper investigates several competing procedures for computing the prices of vanilla European options, such as puts, calls and binaries, in which the underlying model has a characteristic function that is known in semi-closed form. The algorithms investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. Their performance is assessed in simulation experiments in which an analytical solution is available, and also for a simple affine model of stochastic volatility in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. However, there are two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a case for pricing options using a full-range Fourier series as opposed to a half-range Fourier cosine series if a large number of options are to be priced in as short a time as possible.
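To give a flavour of the technique being compared, here is a minimal Python sketch of half-range Fourier cosine series density recovery from a known characteristic function, used to price a European put. It assumes the Black-Scholes characteristic function, illustrative parameters and a plain Riemann sum; it is a sketch of the general approach, not the paper's implementation.

```python
import numpy as np

def bs_charfn(u, s0, r, sigma, T):
    """Characteristic function of log(S_T) under Black-Scholes (illustrative)."""
    mu = np.log(s0) + (r - 0.5 * sigma**2) * T
    return np.exp(1j * u * mu - 0.5 * sigma**2 * u**2 * T)

def cos_density(x, charfn, a, b, N=256):
    """Half-range Fourier cosine series approximation of a density on [a, b]:
    f(x) ~ sum_k F_k cos(k*pi*(x - a)/(b - a)), with the k = 0 term halved."""
    u = np.arange(N) * np.pi / (b - a)
    F = 2.0 / (b - a) * (charfn(u) * np.exp(-1j * u * a)).real
    F[0] *= 0.5
    return F @ np.cos(np.outer(u, x - a))

# Illustrative parameters: at-the-money European put with 3-month maturity.
s0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 0.25
width = 8 * sigma * np.sqrt(T)          # truncation interval half-width
a, b = np.log(s0) - width, np.log(s0) + width
x = np.linspace(a, b, 2001)
f = cos_density(x, lambda u: bs_charfn(u, s0, r, sigma, T), a, b)
payoff = np.maximum(K - np.exp(x), 0.0)
price = np.exp(-r * T) * np.sum(payoff * f) * (x[1] - x[0])  # Riemann sum
print(price)  # close to the closed-form Black-Scholes put value
```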