971 results for Empirical Modeling

Relevance: 30.00%

Publisher:

Abstract:

Conservation of free-ranging cheetah (Acinonyx jubatus) populations is multifaceted and needs to be addressed from an ecological, biological and management perspective. There is a wealth of published research, each study focusing on a particular aspect of cheetah conservation. Identifying the most important factors, making sense of various (and sometimes contrasting) findings, and making decisions when little or no empirical data are available, are everyday challenges facing conservationists. Bayesian networks (BNs) provide a statistical modeling framework that enables analysis and integration of information addressing different aspects of conservation. There has been increased interest in the use of BNs to model conservation issues; however, the development of more sophisticated BNs, utilizing object-oriented (OO) features, is still at the frontier of ecological research. We describe an integrated, parallel modeling process followed during a BN modeling workshop held in Namibia to combine expert knowledge and data about free-ranging cheetahs. The aim of the workshop was to obtain a more comprehensive view of the current viability of the free-ranging cheetah population in Namibia, and to predict the effect different scenarios may have on its future viability. A complementary aim was to identify influential parameters of the model, to more effectively target those parameters having the greatest impact on population viability. The BN was developed by aggregating diverse perspectives from local and independent scientists, agents from the national ministry, conservation agency members and local fieldworkers. This integrated BN approach facilitates OO modeling in a multi-expert context, lending itself to a series of integrated, yet independent, subnetworks describing different scientific and management components.
We created three subnetworks in parallel: a biological, ecological and human factors network, which were then combined to create a complete representation of free-ranging cheetah population viability. Such OOBNs have widespread relevance to the effective and targeted conservation management of vulnerable and endangered species.
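The subnetwork integration described above rests on standard Bayesian-network machinery. As a minimal sketch of the idea (not the workshop's actual OOBN: the nodes, states and probabilities below are invented for illustration), a two-parent viability node can be queried by direct enumeration:

```python
# Hypothetical two-parent viability node with exact inference by enumeration.
p_habitat = {"good": 0.6, "poor": 0.4}      # P(Habitat)
p_conflict = {"low": 0.7, "high": 0.3}      # P(HumanConflict)

# P(Viability = high | Habitat, HumanConflict)
p_viable = {
    ("good", "low"): 0.80, ("good", "high"): 0.45,
    ("poor", "low"): 0.40, ("poor", "high"): 0.10,
}

def prob_viable_high():
    """Marginal P(Viability = high), summing over all parent states."""
    return sum(p_habitat[h] * p_conflict[c] * p_viable[(h, c)]
               for h in p_habitat for c in p_conflict)

def prob_habitat_good_given_viable():
    """Posterior P(Habitat = good | Viability = high) by Bayes' rule."""
    joint = sum(p_conflict[c] * p_viable[("good", c)] for c in p_conflict)
    return p_habitat["good"] * joint / prob_viable_high()
```

In a full OOBN, each of these nodes would itself be the output of a biological, ecological or human-factors subnetwork.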


Airport efficiency is important because it has a direct impact on customer safety and satisfaction, and therefore on the financial performance and sustainability of airports, airlines, and affiliated service providers. This is especially so in a world characterized by an increasing volume of both domestic and international air travel; price and other forms of competition between rival airports, airport hubs and airlines; and rapid and sometimes unexpected changes in airline routes and carriers. It also reflects expansion in the number of airports handling regional, national, and international traffic and the growth of complementary airport facilities, including industrial, commercial, and retail premises. This has fostered a steadily increasing volume of research aimed at modeling and providing best-practice measures and estimates of airport efficiency using mathematical and econometric frontiers. The purpose of this chapter is to review these various methods as they apply to airports throughout the world. Apart from discussing the strengths and weaknesses of the different approaches and their key findings, the chapter also examines the steps researchers face as they move through the modeling process in defining airport inputs and outputs and the purported efficiency drivers. Accordingly, the chapter provides guidance to those conducting empirical research on airport efficiency and serves as an aid to aviation regulators, airport operators, and others interpreting airport efficiency research outcomes.
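Among the mathematical frontier methods such a review would cover, data envelopment analysis (DEA) is a common choice. A rough sketch of input-oriented, constant-returns DEA as a linear program follows; the two-airport data set is invented, and `scipy` is assumed to be available:

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, k):
    """Input-oriented, constant-returns-to-scale DEA efficiency of unit k.

    Solves:  min theta  s.t.  lam @ X <= theta * X[k],
                              lam @ Y >= Y[k],  lam >= 0.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n_units = X.shape[0]
    c = np.r_[1.0, np.zeros(n_units)]            # decision vars: [theta, lam]
    A_in = np.hstack([-X[k][:, None], X.T])      # lam@X - theta*X[k] <= 0
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # -lam@Y <= -Y[k]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(0.0, None)] * (n_units + 1),
                  method="highs")
    return float(res.x[0])

# Two hypothetical airports: one input (staff), one output (flights handled)
X = [[2.0], [4.0]]   # airport A uses 2 units of input, B uses 4
Y = [[4.0], [4.0]]   # both produce the same output
```

Airport A defines the frontier (efficiency 1.0), while B could in principle produce the same output with half its input (efficiency 0.5); econometric alternatives such as stochastic frontier analysis would add a noise term to this deterministic frontier.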


The irradiance profile around the receiver tube (RT) of a parabolic trough collector (PTC) is a key outcome of optical performance that affects the overall energy performance of the collector. Thermal performance evaluation of the RT relies on the appropriate determination of the irradiance profile. This article explains a technique in which empirical equations were developed to calculate the local irradiance as a function of angular location on the RT of a standard PTC, using a rigorously verified Monte Carlo ray-tracing model. A large range of test conditions, including daily normal insolation, spectral selective coatings and glass envelope conditions, were selected for this purpose from the data published by Dudley et al. [1]. The R² values of the equations are excellent, varying between 0.9857 and 0.9999. These equations can therefore be used confidently to produce a realistic non-uniform boundary heat flux profile around the RT at normal incidence for conjugate heat transfer analyses of the collector. The required inputs to the equations are the daily normal insolation and the spectral selective properties of the collector components. Since the equations are polynomial functions, data-processing software can be employed to calculate the flux profile easily and quickly. The ultimate goal of this research is to make concentrating solar power technology cost-competitive with conventional energy technology by facilitating its ongoing research.
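The paper's equations themselves are not reproduced here. As a generic sketch of the approach (fit a polynomial to a flux-versus-angle profile and report R²), with a synthetic cosine-shaped profile standing in for ray-tracing output:

```python
import numpy as np

# Synthetic non-uniform flux profile around the receiver tube; the shape and
# magnitudes are illustrative assumptions, not the paper's ray-tracing data.
theta = np.linspace(0.0, 2.0 * np.pi, 181)   # angular location on the RT [rad]
flux = 30.0 + 25.0 * np.cos(theta)           # local flux, arbitrary units

coeffs = np.polyfit(theta, flux, deg=8)      # empirical polynomial equation
fitted = np.polyval(coeffs, theta)

ss_res = float(np.sum((flux - fitted) ** 2))
ss_tot = float(np.sum((flux - flux.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot            # goodness of fit
```

Because the resulting model is an ordinary polynomial, evaluating the flux boundary condition at any angular location reduces to a single `np.polyval` call.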


Hot spot identification (HSID) aims to identify potential sites—roadway segments, intersections, crosswalks, interchanges, ramps, etc.—with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor injury and property damage only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst the concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean as most methods in practice do, which corresponds more closely with how hot spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression.
Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to society, while reducing the influence of under-reported PDO and minor injury crashes and overcoming the limitations of the traditional NB model in dealing with the preponderance-of-zeros problem and right-skewed data.
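The quantile-regression idea can be sketched by minimizing the check (pinball) loss directly. The linear model and synthetic data below are illustrative assumptions, not the paper's crash data, and `scipy` is assumed to be available:

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, x, y, tau):
    """Check (pinball) loss for the linear model y ~ beta[0] + beta[1]*x."""
    resid = y - (beta[0] + beta[1] * x)
    return np.sum(np.where(resid >= 0.0, tau * resid, (tau - 1.0) * resid))

def fit_quantile(x, y, tau):
    """Estimate the tau-th conditional quantile line by minimising the loss."""
    res = minimize(pinball_loss, x0=[1.0, 1.0], args=(x, y, tau),
                   method="Nelder-Mead")
    return res.x

# Synthetic stand-in for equivalent-PDO crashes vs. exposure (illustrative)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, size=200)

b0, b1 = fit_quantile(x, y, tau=0.5)   # median regression
```

Refitting with, say, `tau=0.9` targets an upper quantile of the crash distribution, which is what makes the approach suitable for flagging disproportionately high-risk sites rather than average ones.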


Purpose – Context-awareness has emerged as an important principle in the design of flexible business processes. The goal of the research is to develop an approach to extend context-aware business process modeling toward location-awareness. The purpose of this paper is to identify and conceptualize location-dependencies in process modeling. Design/methodology/approach – This paper uses a pattern-based approach to identify location-dependency in process models. The authors design specifications for these patterns, present illustrative examples, and evaluate the identified patterns through a literature review of published process cases. Findings – This paper introduces location-awareness as a new perspective to extend context-awareness in BPM research, introducing relevant location concepts such as location-awareness and location-dependencies. The authors identify five basic location-dependent control-flow patterns that can be captured in process models, and identify location-dependencies in several existing case studies of business processes. Research limitations/implications – The authors focus exclusively on the control-flow perspective of process models. Further work needs to extend the research to address location-dependencies in process data or resources. Further empirical work is needed to explore the determinants and consequences of modeling location-dependencies. Originality/value – As existing literature mostly focuses on the broad context of business processes, location is still treated as a “second-class citizen” in process modeling, in theory and in practice. This paper discusses the vital role of location-dependencies within business processes. The proposed five basic location-dependent control-flow patterns are novel and useful for explaining location-dependency in business process models. They provide a conceptual basis for further exploration of location-awareness in the management of business processes.
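The five patterns are not enumerated in this abstract. As a purely hypothetical illustration of location-dependent control flow, an exclusive (XOR) gateway can route a case by its location attribute; the activity and region names below are invented:

```python
# Hypothetical "location-dependent branching": an exclusive (XOR) gateway
# whose outgoing branch depends on the case's location attribute.
# Region names and routing rules are invented for illustration.

def route_inspection(case):
    """Choose the next activity from the case's current location."""
    location = case["location"]
    if location == "on-site":
        return "physical_inspection"
    if location in {"branch-office", "headquarters"}:
        return "desk_review"
    return "remote_video_inspection"   # default branch for other locations
```

In a process model this condition would sit on the gateway's outgoing sequence flows, making the location-dependency explicit rather than buried in activity logic.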


In this paper we propose a new multivariate GARCH model with a time-varying conditional correlation structure. The time-varying conditional correlations change smoothly between two extreme states of constant correlations according to a predetermined or exogenous transition variable. An LM test is derived for the constancy of correlations, and LM and Wald tests for the hypothesis of partially constant correlations. Analytical expressions for the test statistics and the required derivatives are provided to make computations feasible. An empirical example based on daily return series of five frequently traded stocks in the S&P 500 stock index completes the paper.
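The smooth transition between the two extreme correlation states can be sketched with a logistic transition function. The exact specification in the paper may differ, and all parameter values below are illustrative:

```python
import math

def transition(s, gamma, c):
    """Logistic transition G(s) in (0, 1); gamma controls smoothness and
    c locates the transition in the exogenous variable s."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def conditional_correlation(s, rho_low, rho_high, gamma=5.0, c=0.0):
    """Correlation moving smoothly between two extreme constant states:
    rho(s) = (1 - G(s)) * rho_low + G(s) * rho_high."""
    g = transition(s, gamma, c)
    return (1.0 - g) * rho_low + g * rho_high
```

As the transition variable moves from far below to far above the location `c`, the correlation glides from one constant-correlation regime to the other; testing `gamma = 0` (no transition) corresponds to the constancy hypothesis.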


Organizational and technological systems analysis and design practices such as process modeling have received much attention in recent years. However, while knowledge about related artifacts such as models, tools, or grammars has substantially matured, little is known about the actual tasks and interaction activities that are conducted as part of analysis and design acts. In particular, the key role of the facilitator has not been researched extensively to date. In this paper, we propose a new conceptual framework that can be used to examine facilitation behaviors in process modeling projects. The framework distinguishes four behavioral styles in facilitation (the driving engineer, the driving artist, the catalyzing engineer, and the catalyzing artist) that a facilitator can adopt. To distinguish between the four styles, we provide a set of ten behavioral anchors that underpin facilitation behaviors. We also report on a preliminary empirical exploration of our framework through interviews with experienced analysts in six modeling cases. Our research provides a conceptual foundation for an emerging theory describing and explaining different behaviors associated with process modeling facilitation, offers first preliminary empirical results about facilitation in modeling projects, and provides a fertile basis for examining facilitation in other conceptual modeling activities.


Enterprise Architecture Management (EAM) is discussed in academia and industry as a vehicle to guide IT implementations, alignment, compliance assessment, or technology management. Still, little is known about how EAM can be successfully used and how positive impact can be realized from it. To determine these factors, we identify EAM success factors and measures through literature reviews and exploratory interviews and propose a theoretical model that explains key factors and measures of EAM success. We test our model with data collected from a cross-sectional survey of 133 EAM practitioners. In a confirmatory analysis, the results confirm the impact of four distinct EAM success factors (‘EAM product quality’, ‘EAM infrastructure quality’, ‘EAM service delivery quality’, and ‘EAM organizational anchoring’) and two important EAM success measures (‘intentions to use EAM’ and ‘Organizational and Project Benefits’). We found the construct ‘EAM organizational anchoring’ to be a core focal concept that mediated the effect of success factors such as ‘EAM infrastructure quality’ and ‘EAM service delivery quality’ on the success measures. We also found that ‘EAM satisfaction’ was irrelevant to determining or measuring success. We discuss implications for theory and EAM practice.


Employees’ safety climate perceptions dictate their safety behavior because individuals act based on their perceptions of reality. Extensive empirical research in applied psychology has confirmed this relationship. However, few efforts have been made to investigate the factors contributing to a favorable safety climate in construction research. As an initial effort to address this knowledge gap, this paper examines factors contributing to a psychological safety climate, an operationalization of safety climate at the individual level and hence the basic element of safety climate at higher levels. A multiperspective framework of contributors to a psychological safety climate is estimated by a structural equation modeling technique using individual questionnaire responses from a random sample of construction project personnel. The results inform management of three routes to a psychological safety climate: a client’s proactive involvement in safety management, a workforce-friendly workplace created by the project team, and transformational supervisors’ communication about safety matters with the workforce. This paper contributes to the field of construction engineering and management by highlighting a broader contextual influence on the systematic formation of psychological safety climate perceptions.


This thesis addresses modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well known time series models such as GARCH, ACD and CARR models. They are able to capture many well established features of financial time series including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that have values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships of the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model as well as an expression for the autocorrelation function are obtained by writing the MCARR model as a first order autoregressive process with random coefficients. 
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
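The conditional-mean recursion shared by these MEM/CARR specifications can be sketched as follows; the parameter values and the exponential error distribution are illustrative assumptions, not estimates from the thesis:

```python
import numpy as np

def carr_filter(x, omega, alpha, beta):
    """Conditional-mean recursion of a CARR/MEM model for a non-negative
    series x (e.g. daily price range):
    mu[t] = omega + alpha * x[t-1] + beta * mu[t-1]."""
    mu = np.empty_like(x)
    mu[0] = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    for t in range(1, len(x)):
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    return mu

# Simulate ranges with unit-mean exponential multiplicative errors
rng = np.random.default_rng(42)
omega, alpha, beta = 0.1, 0.2, 0.7   # illustrative, stationary (alpha+beta<1)
n = 5000
x = np.empty(n)
mu_t = omega / (1.0 - alpha - beta)
for t in range(n):
    x[t] = mu_t * rng.exponential(1.0)
    mu_t = omega + alpha * x[t] + beta * mu_t

mu_hat = carr_filter(x, omega, alpha, beta)   # filtered conditional means
```

Replacing the exponential error with an inverse gamma or mixture distribution, or stacking several such equations with a copula linking the errors, gives the CARR-IG, MCARR and VMEM variants the thesis studies.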


In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images. Classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability. Earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with. This allows the separation of the essential from the conventional. To determine if the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs. We also ask if accuracy versus effort trade-offs can be controlled after training. For another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner. We then ask if problem-specific organization is necessary.
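The tree-like, per-input control of the accuracy-effort trade-off can be sketched with a two-stage cascade. All costs, rules and inputs below are invented for illustration; real stages would be trained classifiers:

```python
# Two-stage cascade: a cheap stage labels confident inputs and delegates
# uncertain ones to an expensive stage.

CHEAP_COST, EXPENSIVE_COST = 1.0, 10.0   # per-input computational effort

def cheap_stage(x):
    """Crude rule: reliable only far from the 0.5 decision boundary."""
    return 1.0 if x > 0.6 else 0.0

def expensive_stage(x):
    """Accurate (here: exact) rule at the true boundary 0.5."""
    return 1.0 if x >= 0.5 else 0.0

def cascade(x, band=(0.4, 0.6)):
    """Return (label, effort); delegate only inputs in the uncertain band."""
    if band[0] <= x <= band[1]:
        return expensive_stage(x), CHEAP_COST + EXPENSIVE_COST
    return cheap_stage(x), CHEAP_COST

inputs = [0.1, 0.45, 0.55, 0.9]
labels = [cascade(x)[0] for x in inputs]
effort = sum(cascade(x)[1] for x in inputs)
```

Widening or narrowing the delegation band is one concrete way to control the trade-off after training: a wider band buys accuracy at the price of effort, a narrower band does the reverse.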


Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems, which are the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both being located in South-Western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is in the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. 
I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
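For orientation, the classic Levins model is the non-spatial ancestor of the patch-network models used in such metapopulation studies; its occupancy dynamics and analytical equilibrium can be checked numerically (the rate values are illustrative assumptions):

```python
# Levins metapopulation model: dp/dt = c*p*(1 - p) - e*p, where p is the
# fraction of occupied patches, c the colonization rate, e the extinction rate.

def levins_equilibrium(c, e):
    """Analytical equilibrium occupancy p* = 1 - e/c (0 if e >= c)."""
    return max(0.0, 1.0 - e / c)

def simulate_levins(c, e, p0=0.1, dt=0.01, steps=20000):
    """Euler integration of the occupancy dynamics."""
    p = p0
    for _ in range(steps):
        p += dt * (c * p * (1.0 - p) - e * p)
    return p

c, e = 0.5, 0.2   # illustrative colonization and extinction rates
```

The spatially explicit models in the thesis replace the single occupancy fraction with patch-specific colonization and extinction rates that depend on patch area and connectivity, but the persistence condition (colonization outpacing extinction) has the same flavor.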


Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require accurate volatility estimates. The task has become more challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure; examined together, the two form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS. Of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions that increases monotonically when moving from the median quantile to the uppermost quantile (95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaptions markets.
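Factor decompositions of this kind are typically obtained by principal component analysis. As a sketch on a synthetic panel dominated by a level factor (the panel construction is an assumption for illustration, not the thesis data):

```python
import numpy as np

# Synthetic IV panel: a dominant "level" factor, a smaller term-structure
# tilt, and idiosyncratic noise; PCA via SVD recovers the variance shares.
rng = np.random.default_rng(1)
n_days, n_points = 500, 20
level = rng.normal(0.0, 1.0, size=(n_days, 1))    # common level shifts
slope = rng.normal(0.0, 0.3, size=(n_days, 1))    # term-structure tilt
grid = np.linspace(-1.0, 1.0, n_points)           # maturity/moneyness axis
panel = (level + slope * grid
         + rng.normal(0.0, 0.05, size=(n_days, n_points)))

centered = panel - panel.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)                   # variance share per PC
```

With the construction above the first component absorbs most of the variance, mirroring how a level factor dominates empirical IVS decompositions while term-structure and higher-order factors explain progressively less.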


In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. Results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
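The realized-variance estimator itself is simple to sketch: sum squared intraday returns and compare with the true daily variance of a simulated constant-volatility process (the simulation settings below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
sigma_daily = 0.02   # assumed true daily volatility (2%)
n_intraday = 78      # e.g. 5-minute returns in a 6.5-hour session
n_days = 250

# Constant-volatility intraday returns: variance sigma_daily^2 / n_intraday,
# so the squared returns of one day sum to sigma_daily^2 in expectation.
r = rng.normal(0.0, sigma_daily / np.sqrt(n_intraday),
               size=(n_days, n_intraday))

rv = (r ** 2).sum(axis=1)   # realized variance, one estimate per day
```

Under the idealized assumptions above the estimator is unbiased; adding autocorrelation to the returns (a stand-in for microstructure noise) biases it, which is the effect the third essay quantifies by simulation.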


This paper examines how volatility in financial markets can best be modeled. Specifically, it investigates how well linear and nonlinear volatility models absorb skewness and kurtosis. The examination covers the Nordic stock markets: Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study are exposed to asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
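A representative nonlinear specification for the asymmetry discussed here is GJR-GARCH, in which negative shocks receive extra weight in the variance recursion. The paper's exact models are not named in the abstract, so this is a generic sketch with illustrative parameter values:

```python
def gjr_next_variance(r, sigma2, omega=0.02, alpha=0.05, gamma=0.10, beta=0.85):
    """One step of a GJR-GARCH(1,1) recursion: negative shocks (r < 0)
    get the extra weight gamma, producing an asymmetric news impact."""
    indicator = 1.0 if r < 0.0 else 0.0
    return omega + (alpha + gamma * indicator) * r * r + beta * sigma2

# News impact: a negative shock of the same size raises next-period
# variance more than a positive one.
up = gjr_next_variance(+1.0, sigma2=1.0)
down = gjr_next_variance(-1.0, sigma2=1.0)
```

Setting `gamma = 0` collapses the recursion to plain (linear-in-squared-shocks) GARCH, which is the sense in which the paper's finding, that linear models usually suffice on these markets, amounts to the asymmetry term adding only modest value.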