187 results for Realized volatility
Abstract:
The value of information technology (IT) is often realized only when the technology continues to be used after users' initial acceptance. However, previous research on continuing IT usage is limited in that it dismisses the importance of mental goals in directing users' behaviors and inadequately accommodates the group context of users. This in-progress paper synthesizes several streams of literature to conceptualize continuing IT usage as a set of multilevel constructs and to view IT usage behavior as directed and energized by a set of mental goals. Drawing on self-regulation theory from social psychology, the paper proposes a process model that positions continuing IT usage as multiple-goal pursuit. An agent-based modeling approach is suggested to further explore the causal and analytical implications of the proposed process model.
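As a hedged illustration of what such an agent-based model might look like (all class names, goal names and parameters below are hypothetical, not taken from the paper), each user can be represented as an agent running a self-regulation loop: compare each goal's target with its current state, direct effort at the largest discrepancy, and continue using the IT only when it serves the currently focal goal:

```python
import random

class UserAgent:
    """Minimal agent: pursues several mental goals; IT use is one means."""
    def __init__(self, goals):
        # goals: dict mapping goal name -> (target level, current level)
        self.goals = goals

    def largest_discrepancy(self):
        # Self-regulation: focus on the goal furthest from its target.
        return max(self.goals, key=lambda g: self.goals[g][0] - self.goals[g][1])

    def step(self, it_effectiveness):
        # The agent uses the IT with a probability tied to how well the
        # IT serves the focal goal; otherwise usage lapses this period.
        focus = self.largest_discrepancy()
        uses_it = it_effectiveness.get(focus, 0.0) > random.random()
        target, current = self.goals[focus]
        progress = it_effectiveness.get(focus, 0.0) if uses_it else 0.05
        self.goals[focus] = (target, min(target, current + progress))
        return uses_it

# Hypothetical group-level run: share of agents still using the IT over time.
agents = [UserAgent({"productivity": (1.0, 0.2), "social": (1.0, 0.5)})
          for _ in range(100)]
effectiveness = {"productivity": 0.3, "social": 0.1}
for t in range(10):
    usage_rate = sum(a.step(effectiveness) for a in agents) / len(agents)
    print(f"t={t}: {usage_rate:.0%} of agents used the IT")
```

Aggregating individual usage decisions into a group-level usage rate is one simple way such a simulation could accommodate the multilevel (individual and group) character of continuing IT usage.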
Abstract:
An analysis of the emissions from 14 CNG and 5 Diesel buses was conducted during April and May 2006. Tests were conducted in both steady state and transient driving modes on a vehicle dynamometer utilising a CVS dilution system. This article focuses on the volatile properties of particles from 4 CNG and 4 Diesel vehicles within this group, with priority given to the previously uninvestigated CNG emissions produced at transient loads. Particle number concentration data were collected by three CPCs (TSI 3022, 3010 and 3782 WCPC) with D50 cut-offs of 5 nm, 10 nm and 20 nm, respectively. Size distribution data were collected using a TSI 3080 SMPS with a 3025 CPC during the steady state driving modes. During transient cycles, monodisperse "slices" of between 5 nm and 25 nm were measured. The volatility of these particles was determined by placing a thermodenuder before the 3022 and the SMPS and measuring the reduction in particle number concentration as the temperature in the thermodenuder was increased. This reduction was then normalised against the total particle count given by the 3010 CPC to provide high-resolution information on the reduction in particle concentration with respect to temperature.
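A minimal sketch of the normalisation step described above (function names and the numbers are illustrative, not from the study): the denuded count at each thermodenuder temperature is divided by the simultaneous reference count from the undenuded CPC, so that fluctuations in the raw emission rate cancel out of the volatility curve:

```python
def remaining_fraction(denuded_counts, reference_counts):
    """Normalise thermodenuded particle counts (e.g. from the TSI 3022)
    against simultaneous reference counts (e.g. from the TSI 3010).

    Both arguments: dicts mapping thermodenuder temperature (deg C)
    to particle number concentration (#/cm^3).
    """
    return {t: denuded_counts[t] / reference_counts[t]
            for t in denuded_counts}

# Illustrative numbers only: the surviving fraction falls as the
# thermodenuder temperature rises, indicating volatile particles.
denuded = {30: 9.0e4, 100: 4.5e4, 200: 1.0e4}
reference = {30: 1.0e5, 100: 1.0e5, 200: 1.0e5}
for temp, frac in remaining_fraction(denuded, reference).items():
    print(f"{temp} C: {frac:.0%} of particles remain")
```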
Abstract:
Particle emissions from a fleet of 14 CNG and 5 Diesel buses were measured in both transient and steady state modes on a chassis dynamometer with a CVS dilution system. Several transient DT80 cycles and four steady state modes (0, 25, 50 and 100% of maximum load) were measured for each bus tested. Particle number concentration data were collected by three CPCs (TSI 3022, 3010 and 3782 WCPC) with D50 cut-offs of 5, 10 and 20 nm, respectively. Size distributions were measured with a TSI 3080 SMPS with a 3025 CPC during the steady state modes, and particle mass emissions were measured with a TSI DustTrak. Particle mass emissions for Diesel buses were up to two orders of magnitude higher than for CNG buses. Particle number emissions during steady state modes for Diesel buses were 2 to 5 times higher than for CNG buses at all of the tested loads. For the DT80 transient cycle, on the other hand, particle number emissions were up to 3 times higher for the CNG buses. More detailed analysis of the transient cycles revealed that this was due to high particle number emissions from CNG buses during the acceleration parts of the cycles. Particles emitted by the CNG buses during acceleration were in the nucleation mode, with the majority being smaller than 10 nm, and volatility measurements showed that they were highly volatile.
Abstract:
The debate about the democratic significance of these trends—a more aggressively inquisitorial media environment, greater public participation in political communication, a more accessible and transparent (at least in appearance) political class—continues, not least in Australia. This essay was written in the first half of 2013, a time of extreme political volatility in Australia, and in the run-up to a general election following three years of minority Labor government. By that stage in the political cycle, Prime Minister Julia Gillard had survived not one but two attempts at leadership "spills", ministers had resigned or been sacked for disloyalty to the leader, major policy initiatives had been dumped, reversed or quietly dropped, and a Coalition opposition was confidently looking forward to a landslide majority in the election of September that year. Labor's internal party turmoil, rather than the Coalition's policy prospectus (which remained sketchy and vague right up to the eve of the election), was widely assumed to be the cause of the former's poor standing in the opinion polls.
Abstract:
Maintenance decisions for large-scale asset systems are often beyond an asset manager's capacity to handle. The presence of a number of possibly conflicting decision criteria, the large number of possible maintenance policies, and the reality of budget constraints often produce complex problems in which the underlying trade-offs are not apparent to the asset manager. This paper presents the decision support tool "JOB" (Justification and Optimisation of Budgets), which has been designed to help asset managers of large systems assess, select, interpret and optimise the effects of their maintenance policies in the presence of limited budgets. This decision support capability is realized through an efficient, scalable backtracking-based algorithm for the optimisation of maintenance policies, while enabling the user to view a number of solutions near this optimum and explore trade-offs with other decision criteria. To assist the asset manager in selecting between various policies, JOB also provides a Multiple Criteria Decision Making capability. In this paper, the JOB tool is presented and its applicability demonstrated for the maintenance of a complex power plant system.
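The abstract does not detail the algorithm, but a hedged sketch of a backtracking search of the kind described (asset data, policy names and costs below are hypothetical) would enumerate one maintenance policy per asset, prune any partial assignment that already exceeds the budget, and keep all complete solutions so near-optimal alternatives can be inspected:

```python
def search(assets, budget, chosen=(), cost=0.0, benefit=0.0, found=None):
    """Backtracking over per-asset maintenance policies under a budget.

    assets: list (one entry per asset) of lists of
    (policy_name, cost, benefit) options. Returns every feasible
    complete policy so the user can explore solutions near the optimum.
    """
    if found is None:
        found = []
    if cost > budget:                      # prune: budget exceeded
        return found
    if len(chosen) == len(assets):         # complete assignment
        found.append((benefit, cost, chosen))
        return found
    for name, c, b in assets[len(chosen)]:
        search(assets, budget, chosen + (name,), cost + c, benefit + b, found)
    return found

# Hypothetical two-asset example with a limited budget.
assets = [
    [("overhaul", 80, 10.0), ("inspect", 20, 4.0), ("defer", 0, 1.0)],
    [("overhaul", 60, 8.0), ("inspect", 15, 3.0), ("defer", 0, 0.5)],
]
solutions = sorted(search(assets, budget=100), reverse=True)
for benefit, cost, policy in solutions[:3]:    # optimum plus near-optima
    print(f"benefit={benefit}, cost={cost}, policy={policy}")
```

Returning the ranked set of feasible policies, rather than a single optimum, mirrors the tool's stated goal of letting the asset manager explore trade-offs against other decision criteria.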
Abstract:
In this paper we propose a new multivariate GARCH model with a time-varying conditional correlation structure. The time-varying conditional correlations change smoothly between two extreme states of constant correlations according to a predetermined or exogenous transition variable. An LM test is derived to test the constancy of correlations, and LM and Wald tests to test the hypothesis of partially constant correlations. Analytical expressions for the test statistics and the required derivatives are provided to make computations feasible. An empirical example based on daily return series of five frequently traded stocks in the S&P 500 stock index completes the paper.
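For concreteness, the correlation structure described above can be written in a common smooth-transition form (the notation here is generic and may differ from the paper's own):

```latex
% H_t: conditional covariance matrix; D_t: diagonal matrix of conditional
% standard deviations from univariate GARCH equations.
H_t = D_t R_t D_t, \qquad
R_t = \bigl(1 - G(s_t)\bigr) R_{(1)} + G(s_t)\, R_{(2)}, \qquad
G(s_t) = \frac{1}{1 + e^{-\gamma (s_t - c)}}, \quad \gamma > 0
% R_(1), R_(2): the two extreme constant-correlation matrices;
% s_t: the predetermined or exogenous transition variable.
```

In this formulation, constant correlations correspond to $R_{(1)} = R_{(2)}$, which is the kind of restriction an LM constancy test examines; partially constant correlations restrict only some elements of the two matrices to be equal.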
Abstract:
The upstream oil & gas industry has been contending with massive data sets and monolithic files for many years, but "Big Data"—that is, the ability to apply more sophisticated types of analytical tools to information in a way that extracts new insights or creates new forms of value—is a relatively new concept that has the potential to significantly reshape the industry. Despite the impressive amount of value being realized through Big Data technologies in other parts of the marketplace, much of the data collected within the oil & gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This paper examines existing data management practices in the upstream oil & gas industry and compares them to practices and philosophies that have emerged in organizations that are leading the Big Data revolution. The comparison shows that such organizations regard data as a valuable asset in its own right, whereas within the oil & gas industry data is frequently regarded as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how upstream oil & gas companies could extract more value from data, and concludes with a series of specific technical and management-related recommendations to this end.
Abstract:
Interpolation techniques for spatial data have been applied frequently in various fields of the geosciences. Although most conventional interpolation methods assume that first- and second-order statistics are sufficient to characterize random fields, researchers have now realized that these methods cannot always provide reliable interpolation results, since geological and environmental phenomena tend to be very complex, presenting non-Gaussian distributions and/or non-linear inter-variable relationships. This paper proposes a new approach to the interpolation of spatial data that can be applied with great flexibility. Suitable cross-variable higher-order spatial statistics are developed to measure the spatial relationship between the random variable at an unsampled location and those in its neighbourhood. Given the computed cross-variable higher-order spatial statistics, the conditional probability density function (CPDF) is approximated via polynomial expansions and then used to determine the interpolated value at the unsampled location as an expectation. In addition, the uncertainty associated with the interpolation is quantified by constructing prediction intervals for the interpolated values. The proposed method is applied to a mineral deposit dataset, and the results demonstrate that it outperforms kriging methods in uncertainty quantification. The introduction of the cross-variable higher-order spatial statistics noticeably improves the quality of the interpolation, since it enriches the information that can be extracted from the observed data; this benefit is substantial when working with data that are sparse or have non-trivial dependence structures.
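In outline (the notation here is generic rather than the paper's own), the interpolated value is the conditional expectation under the approximated CPDF:

```latex
% Interpolated value at the unsampled location u_0, given the n
% neighbouring observations z(u_1), ..., z(u_n):
\hat{z}(\mathbf{u}_0)
  = \mathrm{E}\!\left[\, Z(\mathbf{u}_0) \mid z(\mathbf{u}_1), \dots, z(\mathbf{u}_n) \,\right]
  = \int z \, \hat{f}\bigl(z \mid z(\mathbf{u}_1), \dots, z(\mathbf{u}_n)\bigr)\, dz
% where the CPDF \hat{f} is approximated by a polynomial series whose
% coefficients are computed from the cross-variable higher-order spatial
% statistics (moments beyond the covariance) of the neighbourhood data.
```

Because $\hat{f}$ is a full density rather than a point estimate, prediction intervals for $\hat{z}(\mathbf{u}_0)$ follow directly from its quantiles, which is where the uncertainty quantification advantage over kriging arises.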
Abstract:
Whether ethical screening affects portfolio performance is an important question that is yet to be settled in the literature. This paper aims to shed further light on the question by examining the performance of a large global sample of Islamic equity funds (IEFs) from 1984 to 2010. We find that IEFs underperform conventional funds by an average of 40 basis points per month, consistent with the underperformance hypothesis. In line with popular media claims that Islamic funds are a safer investment, IEFs outperformed conventional funds during the recent banking crisis; however, we find no such outperformance for other crises or high-volatility periods. Using fund holdings data, we provide evidence of a negative curvilinear relation between fund performance and ethical screening intensity, consistent with a return trade-off to being more ethical.
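One standard way to specify the negative curvilinear relation reported above (a generic specification for illustration, not necessarily the paper's exact model) is a quadratic term in screening intensity:

```latex
% Fund performance as a concave function of ethical screening intensity:
\mathrm{Perf}_i = \alpha + \beta_1 \,\mathrm{Screen}_i
                + \beta_2 \,\mathrm{Screen}_i^{2}
                + \boldsymbol{\gamma}'\mathbf{x}_i + \varepsilon_i ,
\qquad \beta_2 < 0
% x_i: fund-level controls. A negative beta_2 gives the concave shape:
% performance deteriorates increasingly fast as screening intensifies.
```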
Abstract:
The Kyoto Protocol is remarkable among global multilateral environmental agreements for its efforts to depoliticize compliance. However, attempts to create autonomous, arm's-length and rule-based compliance processes with extensive reliance on putatively neutral experts were only partially realized in practice in the first commitment period from 2008 to 2012. In particular, the procedurally constrained facilitative powers vested in the Facilitative Branch were circumvented, and expert review teams (ERTs) assumed pivotal roles in compliance facilitation. The ad hoc diplomatic and facilitative practices engaged in by these small teams of technical experts raise questions about the reliability and consistency of the compliance process. For the future operation of the Kyoto compliance system, it is suggested that ERTs be confined to more technical and procedural roles, in line with their expertise. There would then be greater scope for the Facilitative Branch to assume a more comprehensive facilitative role, safeguarded by due process guarantees, in accordance with its mandate. However, if, as appears likely, future compliance trajectories under the United Nations Framework Convention on Climate Change include a significant role for ERTs without oversight by the Compliance Committee, it will be important to develop appropriate procedural safeguards that reflect and shape the various technical and political roles these teams currently play.
Abstract:
The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but "Big Data" is a relatively new concept that has the potential to significantly reshape the industry. Despite the impressive amount of value being realized through Big Data technologies in other parts of the marketplace, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry and compares them to practices and philosophies that have emerged in organizations that are leading the way in Big Data. The comparison shows that companies widely considered to be leaders in Big Data analytics regard data as a valuable asset—but this is usually not true within the oil and gas industry, insofar as data is frequently regarded there as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how the industry could extract more value from data, and concludes with a series of policy-related questions to this end.
Abstract:
Industrial production and supply chains face increased demands for mass customization and tightening regulations on the traceability of goods, leading to higher requirements concerning the flexibility, adaptability, and transparency of processes. Technologies for the 'Internet of Things', such as smart products and semantic representations, pave the way for future factories and supply chains to meet these challenging market demands. In this chapter, a backend-independent approach for information exchange in open-loop production processes based on Digital Product Memories (DPMs) is presented. By storing order-related data directly on the item, relevant lifecycle information is attached to the product itself. In this way, information handover between several stages of the value chain, with a focus on the manufacturing phase of a product, is realized. To report best practices regarding the application of DPMs in the domain of industrial production, system prototype implementations focusing on the use case of producing and handling a smart drug case are illustrated.
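As a hedged sketch of the core idea of attaching lifecycle information to the product itself (the types and field names below are illustrative, not the chapter's API), a digital product memory can be modeled as an append-only event log carried with the item, readable by any stage of the open-loop process without a shared backend:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MemoryEntry:
    stage: str        # value-chain stage, e.g. "manufacturing"
    event: str        # what happened to the item at that stage
    timestamp: str    # ISO 8601, so entries are self-describing

@dataclass
class DigitalProductMemory:
    """Order-related data stored directly on (or with) the item, so each
    stage of an open-loop process can read what earlier stages recorded,
    independently of any particular backend system."""
    product_id: str
    entries: list = field(default_factory=list)

    def record(self, stage, event):
        self.entries.append(MemoryEntry(
            stage, event, datetime.now(timezone.utc).isoformat()))

# Illustrative handover for the smart drug case use case.
dpm = DigitalProductMemory("drug-case-0042")
dpm.record("manufacturing", "filled with customer-specific order")
dpm.record("logistics", "handed over to distributor")
for e in dpm.entries:
    print(e.stage, "-", e.event, "@", e.timestamp)
```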
Abstract:
Recent 'Global Burden of Disease' studies have provided quantitative evidence of the significant role air pollution plays as a human health risk factor (Lim et al., The Lancet, 380: 2224–2260, 2012). Tobacco smoke, including second-hand smoke, household air pollution from solid fuels and ambient particulate matter are among the top risks, leading to lower life expectancy around the world. Indoor air constitutes an environment particularly rich in different types of pollutants, originating from indoor sources as well as penetrating from outdoors, mixing, interacting or growing (when considering microbes) under the protective enclosure of the building envelope. Therefore, it is not a simple task to follow the dynamics of the processes occurring there, or to quantify the outcomes of those processes in terms of pollutant concentrations and other characteristics. This is further complicated by limitations such as building access for the purpose of air quality monitoring, or restrictions on the instrumentation that can be used indoors because of its possible interference with occupants' comfort (due to large size, generated noise or the amount of air drawn). European studies have apportioned the contributions of indoor versus outdoor sources to indoor air contaminants in 26 European countries and quantified the IAQ-associated DALYs (Disability-Adjusted Life Years) in those countries (Jantunen et al., Promoting actions for healthy indoor air (IAIAQ), European Commission Directorate General for Health and Consumers, Luxembourg, 2011). At the same time, there has been an increase in research efforts around the world to better understand the sources, composition, dynamics and impacts of indoor air pollution. Particular focus has been directed towards contemporary sources, novel pollutants and new detection methods. The importance of exposure assessment and of personal exposure, the majority of which occurs in various indoor microenvironments, has also been realized. Overall, this emerging knowledge has been providing input for global assessments of indoor environments, the impact of indoor pollutants, and their science-based management and control. It was a major outcome of recent international conferences that interdisciplinarity, and especially better collaboration between exposure and indoor sciences, would be of high benefit for the health-related evaluation of environmental stress factors and pollutants. A very good example is the combination of biomonitoring with indoor air, particle and dust analysis to study the exposure routes of semi-volatile organic compounds (SVOCs). We have adopted the idea of combining the forces of exposure and indoor sciences for this Special Issue, identified new and challenging topics, and attracted colleagues who are top researchers in their fields to provide their inputs. The Special Issue includes papers which collectively present advances in current research topics and, in our view, build a bridge between indoor and exposure sciences.
Abstract:
Enterprise Architecture Management (EAM) is discussed in academia and industry as a vehicle to guide IT implementations, alignment, compliance assessment, and technology management. Still, little is known about how EAM can be successfully used and how positive impact can be realized from it. To determine these factors, we identify EAM success factors and measures through literature reviews and exploratory interviews, and propose a theoretical model that explains key factors and measures of EAM success. We test our model with data collected from a cross-sectional survey of 133 EAM practitioners. A confirmatory analysis of the model confirms the impact of four distinct EAM success factors ('EAM product quality', 'EAM infrastructure quality', 'EAM service delivery quality', and 'EAM organizational anchoring') on two important EAM success measures, 'intentions to use EAM' and 'organizational and project benefits'. We found the construct 'EAM organizational anchoring' to be a core focal concept that mediated the effect of success factors such as 'EAM infrastructure quality' and 'EAM service quality' on the success measures. We also found that 'EAM satisfaction' was irrelevant to determining or measuring success. We discuss implications for theory and EAM practice.
Abstract:
2,4,6-Trinitrotoluene (TNT) is one of the most commonly used nitroaromatic explosives in landmines and in the military and mining industries. This article demonstrates rapid and selective identification of TNT by surface-enhanced Raman spectroscopy (SERS) using 6-aminohexanethiol (AHT) as a new recognition molecule. First, Meisenheimer complex formation between AHT and TNT is confirmed by the development of a pink colour and the appearance of a new band around 500 nm in the UV-visible spectrum. A solution Raman spectroscopy study also supports AHT:TNT complex formation, demonstrating changes in the vibrational stretching of the AHT molecule between 2800 and 3000 cm−1. For SERS analysis, a self-assembled monolayer (SAM) of AHT is formed over a gold nanostructure (AuNS) SERS substrate in order to selectively capture TNT onto the surface. Electrochemical desorption and X-ray photoelectron studies are performed on the AHT SAM-modified surface to examine the presence of free amine groups with an orientation appropriate for complex formation. Further, a mixed monolayer system of AHT and butanethiol (BT) is explored to improve the AHT:TNT complex formation efficiency. Using a 9:1 AHT:BT mixed monolayer, a very low limit of detection (LOD) of 100 fM TNT was realized. The new method delivers high selectivity towards TNT over 2,4-DNT and picric acid. Finally, real sample analysis is demonstrated by the extraction and SERS detection of 302 pM of TNT from a spiked sample.