916 results for Failure time analysis


Relevance:

40.00%

Publisher:

Abstract:

In their dialogue entitled “The Food Service Industry Environment: Market Volatility Analysis,” Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, preface the discussion by saying: “Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry.” That weighty thesis statement presumes that hospitality executives and managers do not fully comprehend the environment in which they operate. The authors provide a contrast, which conventional wisdom would seem to support and satisfy. “Broadly speaking, the operating environment of an organization is represented by its task domain,” say the authors. “This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups.” These are dynamic actors and the underpinnings of change, say the authors by way of citation. “The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm. Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur,” De Noble and Olsen add in support of their thesis statement. The article is packed, and that is not necessarily a bad thing, with tables, both survey-driven and empirically driven, to illustrate market volatility. One such table is the Bates and Eldredge outline, Table 6 in the article. “This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm,” say De Noble and Olsen. “It is, however, only a suggested outline,” they advise. “…risk should be incorporated into every investment decision, especially in a volatile environment,” say the authors. De Noble and Olsen close with an intriguing formula to gauge volatility in an environment.

Relevance:

40.00%

Publisher:

Abstract:

Being at-risk is a growing problem in the U.S. because of disturbing societal trends such as unemployment, divorce, substance abuse, child abuse and neglect, and the new threat of terrorist violence. Resilience characterizes individuals who rebound from or adapt to adversities such as these, and academic resilience distinguishes at-risk students who succeed in school despite hardships. The purpose of this research was to perform a meta-analysis to examine the power of resilience and to suggest ways educators might improve academic resilience, which was operationalized by satisfactory test scores and grades. In order to find all studies relevant to academic resilience in at-risk kindergarten through 12th-grade students, extensive electronic and hard-copy searches were conducted, resulting in a database of 421 articles. Two hundred eighty-seven of these were rejected quickly because they were not empirical research. Upon further examination, another 106 were rejected for not meeting study protocol criteria. Ultimately, 28 studies were coded for study-level descriptors and effect-size variables. Protective factors for resilience were found to originate in physical, psychological, and behavioral domains on proximal/intraindividual, transitional/intrafamilial, or distal/extrafamilial levels. Effect sizes (ESs) for these were weighted, and the means for each level or category were interpreted by commonly accepted benchmarks. Mean effect sizes for the proximal (M = .27) and transitional (M = .15) levels were small but significant. The mean effect size for the distal level was not significant. This supported the hypotheses that the proximal level was the source of most protective factors for academic resilience in at-risk students, followed by the transitional level. The distal effect size warranted further research, particularly in light of the small number of studies (n = 11) contributing effect sizes to that category.
A homogeneity test indicated that a search for moderators, i.e., study variables affecting outcomes, was justified. “Category” was the largest moderator. Graphs of weighted mean effect sizes in the physical, psychological, and behavioral domains were plotted for each level to better illustrate the findings of the meta-analysis. Suggestions were made for combining resilience development with aspects of positive psychology to promote resilience in the schools.
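The weighting and homogeneity testing described above follow standard meta-analytic practice. As a minimal sketch (this is not the dissertation's own code, and the effect sizes and variances below are illustrative only), a fixed-effect inverse-variance weighted mean and Cochran's Q statistic can be computed as:

```python
import math

def weighted_mean_effect_size(effects, variances):
    """Fixed-effect meta-analytic mean via inverse-variance weighting.

    effects   -- per-study effect sizes (e.g. standardized mean differences)
    variances -- their sampling variances
    Returns (weighted mean, standard error of the mean).
    """
    weights = [1.0 / v for v in variances]
    mean = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

def q_homogeneity(effects, variances):
    """Cochran's Q statistic; a Q large relative to k-1 degrees of freedom
    suggests heterogeneity, i.e. a search for moderators is justified."""
    mean, _ = weighted_mean_effect_size(effects, variances)
    return sum((d - mean) ** 2 / v for d, v in zip(effects, variances))

# Hypothetical study-level data (not the dissertation's):
effects = [0.31, 0.22, 0.28, 0.15]
variances = [0.010, 0.015, 0.008, 0.020]
mean, se = weighted_mean_effect_size(effects, variances)
q = q_homogeneity(effects, variances)
```

A Q exceeding the chi-square critical value on k-1 degrees of freedom is the usual trigger for the moderator analysis the abstract mentions.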

Relevance:

40.00%

Publisher:

Abstract:

To promote regional or mutual improvement, numerous interjurisdictional efforts to share tax bases have been attempted. Most of these efforts fail to be consummated. Motivations to share revenues include narrowing fiscal disparities, enhancing regional cooperation and economic development, rationalizing land use, and minimizing revenue losses caused by competition to attract and keep businesses. Various researchers have developed theories to aid understanding of why interjurisdictional cooperation efforts succeed or fail. Walter Rosenbaum and Gladys Kammerer studied two contemporaneous Florida local-government consolidation attempts. Boyd Messinger subsequently tested their Theory of Successful Consolidation on nine consolidation attempts. Paul Peterson's dual theories on Modern Federalism posit that all governmental levels attempt to further economic development and that politicians act in ways that either further their futures or cement job security. Actions related to the latter theory often interfere with the former. Samuel Nunn and Mark Rosentraub sought to learn how interjurisdictional cooperation evolves. Through multiple case studies they developed a model framing interjurisdictional cooperation in four dimensions. This dissertation investigates the ability of the above theories to help predict the success or failure of regional tax-base revenue-sharing attempts. A research plan was formed that used five sequenced steps to gather data, analyze it, and conclude whether hypotheses concerning the application of these theories were valid. The primary analytical tools were multiple case studies, cross-case analysis, and pattern matching. Data were gathered from historical records, questionnaires, and interviews. The results of this research indicate that Rosenbaum-Kammerer theory can be a predictor of success or failure in implementing tax-base revenue sharing if it is amended as suggested by Messinger and further modified by a recommendation in this dissertation. Peterson's Functional and Legislative theories considered together were able to predict revenue-sharing proposal outcomes. Many of the indicators of interjurisdictional cooperation forwarded in the Nunn-Rosentraub model appeared in the cases studied, but the model was not a reliable forecasting instrument.

Relevance:

40.00%

Publisher:

Abstract:

Construction projects are complex endeavors that require the involvement of different professional disciplines in order to meet various, often conflicting project objectives. The level of complexity and the multi-objective nature of construction projects lend themselves to collaborative design and construction approaches such as integrated project delivery (IPD), in which the relevant disciplines work together during project conception, design, and construction. Traditionally, the main objectives of construction projects have been to build in the least amount of time at the lowest possible cost, so the inherent and well-established relationship between cost and time has been the focus of many studies. The importance of being able to effectively model relationships among multiple objectives in building construction has been emphasized in a wide range of research. In general, the trade-off relationship between time and cost is well understood, and there is ample research on the subject. However, despite the growing emphasis on sustainable building design, the relationships between time and environmental impact, and between cost and environmental impact, have not been fully investigated. The objectives of this research were mainly to analyze and identify the relationships of time, cost, and environmental impact, in terms of CO2 emissions, at different levels of a building (material level, component level, and building level) during the pre-use phase, including manufacturing and construction, and the relationships of life-cycle cost and life-cycle CO2 emissions during the usage phase. Additionally, this research aimed to develop a robust simulation-based multi-objective decision-support tool, called SimulEICon, which took construction data uncertainty into account and was capable of incorporating life-cycle assessment information into the decision-making process. The findings of this research supported the trade-off relationship between time and cost at different building levels. Moreover, the relationship between time and CO2 emissions exhibited trade-off behavior at the pre-use phase. Interestingly, the relationship between cost and CO2 emissions was proportional at the pre-use phase, and the same pattern persisted from construction into the usage phase. Understanding the relationships between these objectives is key to successfully planning and designing environmentally sustainable construction projects.
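The trade-offs described above amount to a multi-objective minimization over time, cost, and CO2 emissions. A minimal sketch of the core idea, identifying the non-dominated (Pareto-optimal) design alternatives, is shown below; this is a generic illustration with hypothetical numbers, not SimulEICon itself:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(alternatives):
    """Return the non-dominated subset of (time, cost, co2) tuples."""
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b != a)]

# Hypothetical design alternatives: (duration in days, cost in $1000s, CO2 in tonnes)
designs = [(120, 900, 450), (100, 1100, 500), (140, 850, 400), (120, 950, 470)]
front = pareto_front(designs)
```

Here the last alternative is dominated (same duration as the first but higher cost and emissions), so only the first three survive; a decision-support tool would then present that front to the planner.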

Relevance:

40.00%

Publisher:

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top-event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
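The conventional conjugate Poisson-Gamma updating that PEWMA is compared against has a simple closed form: with a Gamma(alpha, beta) prior on the failure rate and n failures observed over exposure t, the posterior is Gamma(alpha + n, beta + t). A minimal sketch with hypothetical yearly failure counts (not the thesis's implementation or data):

```python
def gamma_poisson_update(alpha, beta, failures, exposure):
    """Conjugate Bayesian update for a Poisson failure rate.

    Prior:     rate ~ Gamma(alpha, beta), beta being the rate parameter
    Data:      `failures` events observed over `exposure` time units
    Posterior: Gamma(alpha + failures, beta + exposure)
    """
    return alpha + failures, beta + exposure

def gamma_mean(alpha, beta):
    """Posterior mean of the failure rate."""
    return alpha / beta

# Sequential updating over hypothetical yearly failure counts:
alpha, beta = 1.0, 10.0            # weak prior: mean rate 0.1 per year
for n, t in [(2, 1.0), (0, 1.0), (1, 1.0)]:
    alpha, beta = gamma_poisson_update(alpha, beta, n, t)
rate = gamma_mean(alpha, beta)     # (1+3)/(10+3) = 4/13
```

Note that this scheme weights all historical data equally, which is exactly the behavior the abstract says PEWMA improves on over long time spans: PEWMA exponentially discounts older observations so the estimate can track a drifting failure rate.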

Relevance:

40.00%

Publisher:

Abstract:

The exploration and development of oil and gas reserves located in harsh offshore environments are characterized by high risk. Some of these reserves would be uneconomical if produced using conventional drilling technology, owing to increased drilling problems and prolonged non-productive time. Seeking new ways to reduce drilling cost and minimize risks has led to the development of managed pressure drilling techniques, which address the drawbacks of conventional overbalanced and underbalanced drilling. As managed pressure drilling techniques evolve, many questions related to safety and operating pressure regimes remain unanswered, and quantitative risk assessment techniques are often used to answer them. Quantitative risk assessment is conducted for the various stages of drilling operations: drilling ahead, tripping, casing, and cementing. A diagnostic model for analyzing the rotating control device, the main component of managed pressure drilling techniques, is also studied. The Noisy-OR logic concept is explored to capture the unique relationship between casing and cementing operations in leading to well integrity failure, and to model the critical components of the constant bottom-hole pressure variant of managed pressure drilling during tripping. Relevant safety functions and inherent safety principles are utilized to improve well integrity operations. A loss-function modelling approach is adopted to enable dynamic consequence analysis of blowout risk for real-time decision making. Aggregating the blowout loss categories, comprising production, asset, human health, environmental-response, and reputation losses, yields a risk estimate based on a dynamically determined probability of occurrence. Lastly, the various sub-models developed for the stages and sub-operations of drilling and the consequence modelling approach are integrated for a holistic risk analysis of drilling operations.
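The Noisy-OR gate mentioned above has a standard closed form: the effect occurs unless every active cause independently fails to produce it, optionally with a "leak" term for background causes. A minimal sketch with hypothetical cause strengths (the cause probabilities below are illustrative, not taken from the thesis):

```python
def noisy_or(cause_probs, present, leak=0.0):
    """Noisy-OR gate: probability the effect (e.g. well integrity failure)
    occurs given which parent causes are active.

    cause_probs -- p_i: probability that cause i alone produces the effect
    present     -- booleans: whether each cause is active
    leak        -- background probability of the effect with no cause active
    """
    prob_no_effect = 1.0 - leak
    for p, active in zip(cause_probs, present):
        if active:
            prob_no_effect *= (1.0 - p)
    return 1.0 - prob_no_effect

# Hypothetical casing and cementing defect causes and their strengths:
p_fail = noisy_or([0.2, 0.1], present=[True, True], leak=0.01)
# 1 - 0.99 * 0.8 * 0.9 = 0.2872
```

The appeal for fault-tree work is parsimony: a Noisy-OR node needs one parameter per parent cause instead of a full conditional probability table over all parent combinations.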

Relevance:

40.00%

Publisher:

Abstract:

Finite-Difference Time-Domain (FDTD) algorithms are well-established tools of computational electromagnetism. Because of their practical implementation as computer codes, they are affected by numerical artefacts and noise. In order to obtain better results, we propose using Principal Component Analysis (PCA), based on multivariate statistical techniques. PCA has been successfully used for the analysis of noise and spatio-temporal structure in sequences of images. It allows a straightforward discrimination between the numerical noise and the actual electromagnetic variables, and a quantitative estimation of their respective contributions. Moreover, the FDTD results can be filtered to remove the effect of the noise. In this contribution we show how the method can be applied to several FDTD simulations: the propagation of a pulse in vacuum, and the analysis of two-dimensional photonic crystals. In the latter case, PCA has revealed hidden electromagnetic structures related to actual modes of the photonic crystal.
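The filtering step can be sketched as PCA via an SVD of the snapshot matrix, keeping only the leading components. The example below is a minimal illustration on a synthetic rank-one "field" plus noise, assuming snapshots are stacked as rows; it is not the authors' actual FDTD data or code:

```python
import numpy as np

def pca_filter(snapshots, n_components):
    """Filter a sequence of field snapshots with PCA.

    snapshots    -- array of shape (n_time, n_space), one snapshot per row
    n_components -- number of principal components to keep
    Returns the reconstruction using only the leading components.
    """
    mean = snapshots.mean(axis=0)
    centered = snapshots - mean
    # SVD of the centered data: rows of vt are spatial modes,
    # singular values rank them by captured variance
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    s_trunc = np.zeros_like(s)
    s_trunc[:n_components] = s[:n_components]
    return mean + (u * s_trunc) @ vt

# Synthetic demo: one spatial mode oscillating in time, plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)
clean = np.outer(np.sin(0.1 * np.arange(50)), np.sin(x))   # rank-1 signal
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
filtered = pca_filter(noisy, n_components=1)
```

Because the numerical noise spreads its variance over many weak components while the physical field concentrates in a few strong ones, truncating the expansion removes most of the noise while preserving the field.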

Relevance:

40.00%

Publisher:

Abstract:

Peer reviewed

Relevance:

40.00%

Publisher:

Abstract:

The work is supported in part by NSFC (Grant no. 61172070), IRT of Shaanxi Province (2013KCT-04), and EPSRC (Grant no. Ep/1032606/1).

Relevance:

40.00%

Publisher:

Abstract:

Peer reviewed

Relevance:

40.00%

Publisher:

Abstract:

The work is supported in part by NSFC (Grant no. 61172070), IRT of Shaanxi Province (2013KCT-04), and EPSRC (Grant no. Ep/1032606/1).