14 results for Equifinality


Relevância:

10.00%

Publicador:

Resumo:

Rock mass is widely recognized as a geological body consisting of rock blocks and discontinuities. Its deformation and failure are determined not only by the rock blocks but also, and more importantly, by the discontinuities. The mutual cutting and combination of discontinuities control the mechanical properties of the rock mass, and their complex intersections produce strong anisotropy, especially under the influence of in-situ (ground) stress. Engineering practice has shown that brittle failure of hard rock often occurs at working stresses far below the yield and compressive strengths, and that such failure is directly related to the propagation of fractures from discontinuities; fracture propagation is the essence of hard-rock failure. The discontinuous mechanical behaviour of rock mass can therefore be studied rigorously by combining statistical analysis of discontinuities with fracture mechanics. According to the superposition principle of fracture mechanics, either Problem A or Problem C may be chosen. Problem A computes the crack-tip stress and displacement fields of internal discontinuities by numerical methods, whereas Problem C computes them under the assumption that the overall stress field of the rock mass is already known. Problem C thus avoids the complex mutual interference of the stress fields of individual discontinuities (the "crack-system problem" of fracture mechanics), but it requires field measurement of the in-situ stress in the rock mass. On this basis, the linear superposition of the strain energies of the discontinuities is scientifically sound. The main difference between rock mass fracture mechanics and the fracture mechanics of other materials can be expressed as follows: for other materials the analysis usually faces Problem A and cannot avoid the multi-crack puzzle, whereas rock mass fracture mechanics addresses Problem C, which avoids the interference puzzle of multiple discontinuities through in-situ stress measurement. With Problem C as the basis, fracture mechanics can be applied conveniently to rock mass. The statistical fracture constitutive relations for rock mass introduced in this article are based on Problem C and on the linear superposition of discontinuity strain energies. This constitutive relation has several merits: first, it is a physical rather than an empirical constitutive relation; second, it is well suited to describing the anisotropy of rock mass; third, it accounts for exogenous factors such as the in-situ stress. The statistical fracture constitutive relation is therefore a suitable approach to physically based, anisotropic, stress-dependent rock mass problems. Building on the statistical fracture constitutive relations of previous researchers, this article improves the distribution function of discontinuities: the limitation of the (one-parameter) negative exponential distribution in regression analysis is derived, and the two-parameter negative exponential distribution is advocated instead. To address two-dimensional stability problems on key engineering cross-sections, the planar flexibility tensor of the rock mass is derived and a two-dimensional statistical fracture constitutive relation for penetrating (through-going) fractures is established on the basis of penetrating-fracture mechanics.
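As an aside on the distributional point above, a minimal sketch of the two forms in standard notation (the symbols λ and x₀ are generic and not taken from the thesis): the one-parameter negative exponential forces the spacing density to peak at zero spacing, whereas the two-parameter (shifted) form introduces a location parameter.

```latex
% One-parameter negative exponential spacing distribution
f(x) = \lambda\, e^{-\lambda x}, \qquad x \ge 0
% Two-parameter (shifted) negative exponential spacing distribution
f(x) = \lambda\, e^{-\lambda (x - x_0)}, \qquad x \ge x_0 \ge 0
```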
Drawing on research into crack-tip plasticity for penetrating fractures, for example Irwin's equivalent-crack (plastic-zone) correction, this article establishes a way to deal with the stress singularity and plastic yielding at discontinuity tips. Deformation parameters remain a central topic in rock mass mechanics. After excavation of the dam foundation of the Xiaowan hydroelectric power station, a large number of unloading cracks developed in the foundation rock mass, and its mechanical behaviour became intricate and strongly anisotropic. Three sets of discontinuities dominate the dam foundation: gently dipping discontinuities, steeply dipping discontinuities, and schistosity planes; most of them have undergone partial unloading and loosening. According to the in-situ stress data, the stress field of the dam foundation is highly non-uniform, being strongly affected by the tectonic stress field, the self-weight stress field, the geometric boundary conditions of the excavation, and excavation unloading. The complexity of the discontinuities and the heterogeneity of the stress field make the mechanical behaviour of the foundation rock mass intricate and variable; if the relevant influencing factors are not taken into account as fully as possible, major errors are likely. This article calculates the elastic modulus of the rock mass after excavation of the Xiaowan dam foundation trench. The calculation region covers every monolith of the Xiaowan concrete double-curvature arch dam, and the penetrating-fracture or buried-fracture statistical constitutive relation is adopted for different monoliths as appropriate. The statistical fracture constitutive relation is well suited to the strongly anisotropic and heterogeneous rock mass of the Xiaowan dam foundation. A comparative analysis of the elastic moduli obtained from the statistical fracture constitutive relation, from inclined-plane loading tests and from RMR-based estimates shows that the three methods agree closely, so the statistical fracture constitutive relations can be considered trustworthy. In summary, building on previous work, this article: argues the case for Problem C of the superposition principle in fracture mechanics; establishes a two-dimensional statistical fracture constitutive relation for penetrating fractures in rock mass; demonstrates the limitation of the negative exponential distribution and improves on it; improves the three-dimensional statistical fracture constitutive relation for buried fractures; develops an equivalent treatment of the plastic zone at discontinuity tips; and calculates the rock mass elastic modulus on two-dimensional cross-sections. The overall research approach follows the "statistical rock mass mechanics" of Wu Faquan (1992).
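For reference, the Irwin-type plastic-zone correction mentioned above can be sketched in standard fracture-mechanics notation (plane stress; the symbols are generic and not taken from the thesis):

```latex
% First-order estimate of the plastic zone size at the crack tip (plane stress)
r_y = \frac{1}{2\pi}\left(\frac{K_I}{\sigma_{ys}}\right)^{2}
% Irwin's equivalent (effective) crack replaces a by a + r_y
a_{\mathrm{eff}} = a + r_y, \qquad K_{I,\mathrm{eff}} = Y\,\sigma\,\sqrt{\pi a_{\mathrm{eff}}}
```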

Relevância:

10.00%

Publicador:

Resumo:

There are now considerable expectations that semi-distributed models are useful tools for supporting catchment water quality management. However, insufficient attention has been given to evaluating the uncertainties inherent to this type of model, especially those associated with the spatial disaggregation of the catchment. The Integrated Nitrogen in Catchments model (INCA) is subjected to an extensive regionalised sensitivity analysis in application to the River Kennet, part of the groundwater-dominated upper Thames catchment, UK. The main results are: (1) model output was generally insensitive to land-phase parameters, very sensitive to groundwater parameters, including initial conditions, and significantly sensitive to in-river parameters; (2) INCA was able to produce good fits simultaneously to the available flow, nitrate and ammonium in-river data sets; (3) representing parameters as heterogeneous over the catchment (206 calibrated parameters) rather than homogeneous (24 calibrated parameters) produced a significant improvement in fit to nitrate but no significant improvement to flow, and caused a deterioration in ammonium performance; (4) the analysis indicated that calibrating the flow-related parameters first and then calibrating the remaining parameters (as opposed to calibrating all parameters together) was not a sensible strategy in this case; (5) even the parameters to which the model output was most sensitive suffered from high uncertainty, due to spatial inconsistencies in the estimated optimum values, parameter equifinality and the sampling error associated with the calibration method; (6) soil and groundwater nutrient and flow data are needed to reduce uncertainty in initial conditions, residence times and nitrogen transformation parameters, and long-term historic data are needed so that key responses to changes in land-use management can be assimilated. The results indicate the general difficulty of reconciling the questions which catchment nutrient models are expected to answer with typically limited data sets and limited knowledge about suitable model structures. The results demonstrate the importance of analysing semi-distributed model uncertainties prior to model application, and illustrate the value and limitations of using Monte Carlo-based methods for doing so. (c) 2005 Elsevier B.V. All rights reserved.
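A minimal sketch of the regionalised (Monte Carlo) sensitivity analysis workflow described here, with a synthetic toy response standing in for INCA and purely illustrative parameter names; nothing below reproduces the paper's setup.

```python
# Minimal sketch of a regionalised sensitivity analysis (RSA), assuming a
# hypothetical toy model in place of INCA; parameter names are illustrative only.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
n_runs = 5000

# Hypothetical parameter ranges (uniform priors)
params = {
    "groundwater_residence_time": rng.uniform(10.0, 500.0, n_runs),   # days
    "denitrification_rate": rng.uniform(0.0, 0.5, n_runs),            # 1/day
    "in_river_nitrification_rate": rng.uniform(0.0, 0.3, n_runs),     # 1/day
}

def toy_model(gw_rt, denit, nitr):
    """Stand-in for a catchment model run: returns a scalar goodness-of-fit."""
    # Entirely synthetic response surface, used only to make the sketch runnable.
    return np.exp(-((gw_rt - 200.0) / 150.0) ** 2) - 0.3 * denit + 0.05 * nitr

fit = toy_model(params["groundwater_residence_time"],
                params["denitrification_rate"],
                params["in_river_nitrification_rate"])

# Split runs into behavioural / non-behavioural sets by a fit threshold
behavioural = fit >= np.quantile(fit, 0.9)

# Sensitivity = separation of the two marginal parameter distributions (KS statistic)
for name, values in params.items():
    res = ks_2samp(values[behavioural], values[~behavioural])
    print(f"{name}: KS D = {res.statistic:.2f} (p = {res.pvalue:.1e})")
```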

Relevância:

10.00%

Publicador:

Resumo:

The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data for constraining the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in the stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggested that there were some retention processes or losses, either in peatland/wetland areas or in the river, which were not included in the INCA-N model. The results of the study suggested that soft data offer a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based both on runoff and/or nutrient concentration data and on the qualitative knowledge of experimentalists. (c) 2006 Elsevier B.V. All rights reserved.
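A minimal sketch of GLUE constrained by soft data, using a hypothetical one-parameter leaching model and invented acceptance bounds in place of INCA-N and the measured ranges; it only illustrates the mechanics of rejecting runs that violate soft-data constraints.

```python
# Minimal GLUE sketch with a 'soft data' constraint; model, data and bounds
# are synthetic stand-ins, not those of the paper.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 10000

# Sample a hypothetical parameter (e.g. a net mineralisation scaling factor)
theta = rng.uniform(0.0, 2.0, n_runs)

obs_nitrate = np.array([3.2, 4.1, 5.0, 4.4, 3.0, 2.1])   # mg N/l, synthetic
driver = np.array([1.0, 1.3, 1.6, 1.4, 0.9, 0.6])        # synthetic forcing

def toy_model(theta):
    sim = theta[:, None] * 3.0 * driver[None, :]           # simulated concentrations
    annual_mineralisation = theta * 55.0                   # kg N/ha/yr, synthetic scaling
    return sim, annual_mineralisation

sim, mineralisation = toy_model(theta)

# Hard-data likelihood: inverse error variance (a common informal GLUE choice)
sse = ((sim - obs_nitrate[None, :]) ** 2).sum(axis=1)
likelihood = 1.0 / sse

# Soft-data constraint: reject runs whose annual mineralisation falls outside
# an experimentally plausible range (hypothetical bounds).
behavioural = (likelihood > np.quantile(likelihood, 0.8)) & \
              (mineralisation > 30.0) & (mineralisation < 90.0)

w = likelihood[behavioural] / likelihood[behavioural].sum()
print("retained runs:", behavioural.sum())
print("posterior parameter range:", np.quantile(theta[behavioural], [0.05, 0.95]))
print("weighted mean simulated concentrations:",
      (w[:, None] * sim[behavioural]).sum(axis=0).round(2))
```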

Relevância:

10.00%

Publicador:

Resumo:

The Integrated Catchments model of Phosphorus dynamics (INCA-P) was applied to the River Lugg to determine the key factors controlling delivery of phosphorus to the main channel and to quantify the relative contribution of diffuse and point sources to the in-stream phosphorus (P) load under varying hydrological conditions. The model is able to simulate the seasonal and inter-annual variations in the in-stream total-phosphorus concentrations. However, difficulties in simulating diffuse inputs arise due to equifinality in the model structure and parameters. The River Lugg is split into upper and lower reaches. The upper reaches are dominated by grassland and woodland, so the patterns in the stream-water total-phosphorus concentrations are typical of diffuse-source inputs; application of the model gives an estimated 9:1 ratio of diffuse to point contributions to the in-stream P load. In the lower reaches, which are more intensively cultivated and urbanised, the stream-water total-phosphorus concentration dynamics are influenced more by point sources; the simulated diffuse-to-point contribution to the in-stream P load is 1:1. The model set-up and simulations are used to identify the key source areas of P in the catchment, the P contribution of the Lugg to the River Wye during years with contrasting precipitation inputs, and the uptake and release of P from within-reach sediment. In addition, model scenarios are run to identify the impacts of likely P reductions at sewage treatment works on the in-stream soluble reactive P concentrations, and the suitability of this as a management option for reducing eutrophication is assessed.

Relevância:

10.00%

Publicador:

Resumo:

The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the Lambourn and Pang river systems to integrate current process knowledge and available data to test two hypotheses, and thereby determine the key factors and processes controlling the movement of nitrate at the catchment scale in lowland, permeable river systems: (i) that the in-stream nitrate concentrations were controlled by two end-members only, groundwater and soil water, and (ii) that the groundwater was the key store of nitrate in these river systems. Neither hypothesis was proved true or false. Due to equifinality in the model structure and parameters, at least two alternative models provided viable explanations for the observed in-stream nitrate concentrations. One model demonstrated that the seasonal pattern in the stream-water nitrate concentrations was controlled mainly by the mixing of groundwater and soil-water inputs. An alternative model demonstrated that in-stream processes were important. It is hoped that further measurements of nitrate concentrations in the catchment soil water, groundwater and stream may constrain the model and help determine the correct structure, though other recent studies suggest that these data may serve only to highlight the heterogeneity of the system. Thus, when making model-based assessments and forecasts, it is recommended that all possible models are used and the range of forecasts compared. In this study both models suggest that cereal production contributed approximately 50% of the simulated in-stream nitrate load in the two catchments, and that the point-source contribution to the in-stream load was minimal. (c) 2006 Elsevier B.V. All rights reserved.
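The two end-member hypothesis can be written compactly; in standard conservative-mixing notation (the symbols are generic, not taken from the paper), the groundwater fraction f follows directly from the three concentrations:

```latex
C_{\mathrm{stream}} = f\,C_{\mathrm{gw}} + (1 - f)\,C_{\mathrm{soil}}
\quad\Longrightarrow\quad
f = \frac{C_{\mathrm{stream}} - C_{\mathrm{soil}}}{C_{\mathrm{gw}} - C_{\mathrm{soil}}}
```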

Relevância:

10.00%

Publicador:

Resumo:

Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and the unavoidably subjective assumptions of input variance, simple and simplistic models may produce outputs similar to those of more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs of development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
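A minimal sketch of the kind of Monte Carlo comparison described above, contrasting an aggregated and a disaggregated residual-valuation model; all input distributions and figures are hypothetical and are not those used in the paper.

```python
# Minimal sketch of a Monte Carlo development viability (residual value) appraisal,
# comparing an aggregated and a disaggregated cost model; all figures are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

gdv = rng.normal(10_000_000, 800_000, n)          # gross development value, GBP

# Aggregated model: one lumped cost ratio applied to GDV
lumped_cost_ratio = rng.normal(0.75, 0.05, n)
residual_aggregated = gdv * (1.0 - lumped_cost_ratio)

# Disaggregated model: build cost, fees, finance and profit sampled separately
build = rng.normal(5_500_000, 400_000, n)
fees = build * rng.normal(0.12, 0.02, n)
finance = (build + fees) * rng.normal(0.07, 0.01, n)
profit = gdv * rng.normal(0.17, 0.02, n)
residual_disaggregated = gdv - build - fees - finance - profit

for name, r in [("aggregated", residual_aggregated),
                ("disaggregated", residual_disaggregated)]:
    print(f"{name}: mean = {r.mean():,.0f}, std = {r.std():,.0f}, "
          f"P(residual < 0) = {(r < 0).mean():.2%}")
```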

Relevância:

10.00%

Publicador:

Resumo:

Management Control System (MCS) research is undergoing turbulent times. Long associated only with the cybernetic instruments of management accounting, MCS are increasingly seen as complex systems comprising not only formal accounting-driven instruments but also informal mechanisms of control based on organizational culture. And not only have the means of MCS changed; researchers increasingly apply MCS to organizational goals other than strategy implementation.

Taking the question "How do I design a well-performing MCS?" as a starting point, this dissertation aims to provide a comprehensive and integrated overview of the current state of MCS research. Opting for a definition of MCS that is broad in terms of means (all formal as well as informal MCS instruments) but focused in terms of objectives (behavioral control only), the dissertation contributes to MCS theory by (a) developing an integrated (contingency) model of MCS, describing its contingencies as well as its subcomponents, (b) refining the equifinality model of Gresov/Drazin (1997), and (c) synthesizing research findings from contingency and configuration research on MCS, taking into account case studies on topics such as ambidexterity, equifinality and time as a contingency.

Relevância:

10.00%

Publicador:

Resumo:

The ultimate intent of this dissertation was to broaden and strengthen our understanding of IT implementation by emphasizing research efforts on the dynamic nature of the implementation process. More specifically, efforts were directed toward opening the "black box" and providing the story that explains how and why contextual conditions and implementation tactics interact to produce project outcomes. In pursuit of this objective, the dissertation was aimed at theory building and adopted a case study methodology combining qualitative and quantitative evidence. Specifically, it examined the implementation process, use and consequences of three clinical information systems at Jackson Memorial Hospital, a large tertiary care teaching hospital. As a preliminary step toward the development of a more realistic model of system implementation, the study proposes a new set of research propositions reflecting the dynamic nature of the implementation process. Findings clearly reveal that successful implementation projects are likely to be those where key actors envision end goals, anticipate challenges ahead, and recognize and seize opportunities. It was also found that IT implementation is characterized by the systems-theory notion of equifinality, that is, there are likely several equally effective ways to achieve a given end goal. The selection of a particular implementation strategy appears to be a rational process in which actions and decisions are largely influenced by the degree to which key actors recognize the mediating role of each tactic and are motivated to act. The nature of the implementation process is also characterized by the concept of "duality of structure," that is, context and actions mutually influence each other. Another key finding suggests that there is no underlying program that regulates the process of change and moves it from one given point toward a subsequent and already prefigured end. For this reason, the implementation process cannot be thought of as a series of activities performed in a sequential manner, as conceived in stage models. Finally, it was found that IT implementation is punctuated by a certain indeterminacy: only when substantial efforts are focused on what to look for and think about is it less likely that unfavorable and undesirable consequences will occur.

Relevância:

10.00%

Publicador:

Resumo:

The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design-of-experiment-aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to identify the significant parameters and provide a more reliable sensitivity analysis for improving parameterization during hydrological modeling; in the case study it achieved better calibration results along with an improved sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and parameter solution (ParaSol). The results showed that SUFI-2 performed better than the other two methods based on the calibration and uncertainty analysis results, and demonstrated that the proposed evaluation scheme is capable of selecting the most suitable uncertainty method for a given case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control equifinality. The results showed that the SMC-CUA method provided better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and controlled parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response-based statistical evaluation method (RESEM) was proposed for estimating uncertainty propagation effects and providing long-term predictions of hydrological responses under changing climatic conditions; using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total uncertainty propagated from statistical downscaling; using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two sources can be compared and quantified. The feasibility of all the methods was tested using hypothetical and real-world case studies. The proposed methods can also be integrated into a hydrological modeling system to better support hydrological studies under changing climatic conditions, and the results of such an integrated system can serve as scientific references for decision makers seeking to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
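As a small illustration of the band-based summary statistics commonly reported alongside SUFI-2-type analyses (not a reproduction of any method proposed here), the sketch below computes the p-factor (coverage of the 95% prediction band) and r-factor (relative band width) for a synthetic ensemble.

```python
# Minimal sketch of p-factor / r-factor computation for an ensemble of simulations;
# observations and ensemble are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_ensemble, n_time = 500, 365

obs = 5.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, n_time)) + rng.normal(0, 0.5, n_time)
sims = obs[None, :] + rng.normal(0, 1.0, (n_ensemble, n_time))   # synthetic ensemble

lower = np.percentile(sims, 2.5, axis=0)
upper = np.percentile(sims, 97.5, axis=0)

p_factor = np.mean((obs >= lower) & (obs <= upper))              # coverage of the 95% band
r_factor = np.mean(upper - lower) / np.std(obs)                  # relative band width

print(f"p-factor = {p_factor:.2f}  (ideally close to 1)")
print(f"r-factor = {r_factor:.2f}  (ideally around 1 or below)")
```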

Relevância:

10.00%

Publicador:

Resumo:

The identification of transport parameters by inverse modeling often suffers from equifinality or parameter correlation when models are fitted to observations of the solute breakthrough in column outflow experiments. This parameter uncertainty can be addressed by the application of multiple experimental designs, such as column experiments in open-flow mode and the recently proposed closed-flow mode. The latter is characterized by recirculation of the column effluent into the solution supply vessel that feeds the inflow. Depending on the experimental conditions, the solute concentration in the solution supply vessel and the effluent follows a damped sinusoidal oscillation. As a result, the closed-flow experiment provides additional observables in the breakthrough curve. The evaluation of these emergent features allows intrinsic control over boundary conditions and affects the uncertainty of parameters in inverse modeling. We present a comprehensive sensitivity analysis to illustrate the potential application of closed-flow experiments. We show that the sensitivity with respect to the apparent dispersion can be controlled by the experimenter, leading, for optimal settings, to a decrease in parameter uncertainty of an order of magnitude compared to classical experiments. With these findings we are also able to reduce the equifinality found in situations where rate-limited interactions impede a proper determination of the apparent dispersion and rate coefficients. Furthermore, we show the expected breakthrough curve for equilibrium and kinetic sorption, the latter showing strong similarities to the behavior found for completely mixed batch reactor experiments. This renders the closed-flow mode a useful complementary approach to classical column experiments.
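As a rough illustration of the closed-flow configuration described above, the following sketch treats the column as a chain of well-mixed cells fed by, and recirculating into, a well-mixed supply vessel; all volumes and the flow rate are hypothetical and sorption is omitted. The vessel concentration decays and then oscillates, damped, toward the fully mixed value.

```python
# Minimal sketch of a closed-flow column experiment: a well-mixed supply vessel
# feeding a column represented by N mixed cells in series, with the effluent
# recirculated to the vessel. Geometry and rates are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

Q = 2.0          # flow rate, mL/min
V_vessel = 20.0  # supply vessel volume, mL
V_column = 30.0  # column pore volume, mL
N = 60           # number of mixed cells (low numerical dispersion)
V_cell = V_column / N

def rhs(t, c):
    """c[0] = vessel concentration, c[1:] = cell concentrations (inlet -> outlet)."""
    dc = np.empty_like(c)
    dc[0] = Q / V_vessel * (c[-1] - c[0])          # vessel fed by column effluent
    dc[1] = Q / V_cell * (c[0] - c[1])             # first cell fed by vessel
    dc[2:] = Q / V_cell * (c[1:-1] - c[2:])        # downstream cells
    return dc

c0 = np.zeros(N + 1)
c0[0] = 1.0                                        # tracer initially only in the vessel

t_end = 200.0
sol = solve_ivp(rhs, (0.0, t_end), c0, t_eval=np.linspace(0.0, t_end, 400))

c_vessel = sol.y[0]
print("final (well-mixed) concentration ~", c_vessel[-1].round(3))
print("coarse sample of the vessel concentration (damped oscillation):")
print(c_vessel[:200:20].round(3))
```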

Relevância:

10.00%

Publicador:

Resumo:

The domestication of plants and animals marks one of the most significant transitions in human, and indeed global, history. Traditionally, study of the domestication process was the exclusive domain of archaeologists and agricultural scientists; today it is an increasingly multidisciplinary enterprise that has come to involve the skills of evolutionary biologists and geneticists. Although the application of new information sources and methodologies has dramatically transformed our ability to study and understand domestication, it has also generated increasingly large and complex datasets, the interpretation of which is not straightforward. In particular, challenges of equifinality, evolutionary variance, and emergence of unexpected or counter-intuitive patterns all face researchers attempting to infer past processes directly from patterns in data. We argue that explicit modeling approaches, drawing upon emerging methodologies in statistics and population genetics, provide a powerful means of addressing these limitations. Modeling also offers an approach to analyzing datasets that avoids conclusions steered by implicit biases, and makes possible the formal integration of different data types. Here we outline some of the modeling approaches most relevant to current problems in domestication research, and demonstrate the ways in which simulation modeling is beginning to reshape our understanding of the domestication process.
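As one concrete instance of the explicit, simulation-based modeling approaches discussed above, here is a minimal forward Wright-Fisher sketch of a domestication-favoured allele under genic selection and drift; the population size, selection coefficient and starting frequency are arbitrary illustrative values, not estimates from any domestication study.

```python
# Minimal sketch of an explicit forward (Wright-Fisher) simulation of a
# domestication-favoured allele under drift and selection; all values hypothetical.
import numpy as np

rng = np.random.default_rng(3)

def wright_fisher(p0=0.05, s=0.02, n_e=500, generations=300, replicates=200):
    """Simulate allele-frequency trajectories under genic selection coefficient s."""
    p = np.full(replicates, p0)
    traj = np.empty((generations, replicates))
    for g in range(generations):
        # Deterministic change due to genic selection, then binomial drift
        p_sel = p * (1 + s) / (1 + s * p)
        p = rng.binomial(2 * n_e, p_sel) / (2 * n_e)
        traj[g] = p
    return traj

traj = wright_fisher()
final = traj[-1]
print("mean final frequency:", final.mean().round(3))
print("fixation proportion:", (final == 1.0).mean().round(3))
print("loss proportion:", (final == 0.0).mean().round(3))
```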

Relevância:

10.00%

Publicador:

Resumo:

The ability to estimate the impact of ongoing climate change on the hydrological behaviour of hydro-systems is essential for anticipating the unavoidable and necessary adaptations our societies will have to consider. In this context, this doctoral project evaluates the sensitivity of future hydrological projections to: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analysed in calibration and validation following the four configurations of the differential split-sample test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, parameter equifinality is then taken into account by considering, for each type of calibration, the outputs associated with equifinal parameter sets. Finally, to assess the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon River catchment in Quebec. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this doctoral work, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions, followed by the lack of robustness of parameter identification. The hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. More effort should therefore be invested in improving model robustness for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase robustness. This work provides a detailed answer regarding our capacity to diagnose the impacts of climate change on the water resources of the Au Saumon catchment, and proposes an original methodological framework that can be directly applied or adapted to other hydro-climatic contexts.
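To illustrate the differential split-sample logic referred to above, a minimal sketch with synthetic data and a one-parameter toy runoff model (nothing here is taken from the thesis): years are split into dry and wet sub-periods, the model is calibrated on each, and performance is cross-checked on the other.

```python
# Minimal sketch of a differential split-sample test (DSST): calibrate a toy
# one-parameter runoff model on climatically contrasted sub-periods and evaluate
# it on the others. Data and model are synthetic stand-ins.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(11)
n_years, days = 20, 365
precip = rng.gamma(2.0, 3.0, (n_years, days))                    # mm/day, synthetic
true_coeff = 0.35
flow = true_coeff * precip + rng.normal(0, 0.5, precip.shape)    # synthetic "observed" flow

# Split years into dry and wet sub-periods by annual precipitation
annual_p = precip.sum(axis=1)
order = np.argsort(annual_p)
periods = {"dry": order[: n_years // 2], "wet": order[n_years // 2:]}

def nse(obs, sim):
    return 1.0 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

def calibrate(years):
    obj = lambda c: -nse(flow[years], c * precip[years])
    return minimize_scalar(obj, bounds=(0.0, 1.0), method="bounded").x

for cal_name, cal_years in periods.items():
    c = calibrate(cal_years)
    for val_name, val_years in periods.items():
        score = nse(flow[val_years], c * precip[val_years])
        print(f"calibrated on {cal_name}, validated on {val_name}: NSE = {score:.3f}")
```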

Relevância:

10.00%

Publicador:

Resumo:

Modern hydrological challenges, whether in forecasting or related to climate change, call for the exploration of new modelling approaches to fill current gaps and improve the assessment of uncertainties. The approach addressed in this thesis is the multimodel (MM) approach. The innovation lies in how the multimodel presented in this study is constructed: rather than calibrating the models individually and then combining them, a collective calibration is performed on the average of the 12 selected lumped conceptual models. One of the challenges raised by this novel approach is the large number of parameters (82), which complicates calibration and use and can lead to equifinality problems. The solution proposed in this thesis is a sensitivity analysis that allows the weakly influential parameters to be fixed, thereby reducing the total number of parameters to calibrate. An optimization procedure with calibration and validation is then used to evaluate the performance of the multimodel and of its reduced version, and to improve understanding of both. The sensitivity analysis is carried out with the Morris method, which yields a 51-parameter version of the MM (MM51) that performs as well as the original 82-parameter MM while reducing potential equifinality problems. The calibration and validation results of the MM under the split-sample test (SST) are compared with those of the 12 models calibrated individually. This analysis shows that the individual models composing the MM perform less well than when calibrated independently. This decrease in individual performance, necessary to obtain good overall MM performance, is accompanied by an increase in the diversity of the outputs of the MM's member models; such diversity is particularly valuable for hydrological applications requiring uncertainty assessment. All of these results improve the understanding and optimization of the multimodel, which facilitates not only its calibration but also its potential use in an operational context.
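A minimal, hand-rolled sketch of Morris elementary-effects screening of the kind described, with a three-parameter synthetic function standing in for the 82-parameter multimodel; the trajectory design is simplified (fixed positive step on the unit cube) and is not the thesis's setup.

```python
# Minimal sketch of Morris elementary-effects screening, implemented directly
# with one-at-a-time trajectories on a synthetic toy function.
import numpy as np

rng = np.random.default_rng(5)
n_params, n_traj, delta = 3, 50, 0.2

def toy_model(x):
    # Synthetic response: x[0] strongly influential, x[1] weakly, x[2] inert
    return 5.0 * x[0] + 0.5 * np.sin(2 * np.pi * x[1]) + 0.0 * x[2]

effects = [[] for _ in range(n_params)]
for _ in range(n_traj):
    x = rng.uniform(0.0, 1.0 - delta, n_params)        # base point in the unit cube
    y = toy_model(x)
    for i in rng.permutation(n_params):                # perturb one factor at a time
        x_new = x.copy()
        x_new[i] += delta
        y_new = toy_model(x_new)
        effects[i].append((y_new - y) / delta)          # elementary effect of factor i
        x, y = x_new, y_new

for i, ee in enumerate(effects):
    ee = np.asarray(ee)
    mu_star, sigma = np.abs(ee).mean(), ee.std()        # screening measures
    print(f"x{i}: mu* = {mu_star:.2f}, sigma = {sigma:.2f}")
```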

Relevância:

10.00%

Publicador:

Resumo:

Miles and Snow's configurational theory has received a great deal of attention from many investigators. Framing the Miles and Snow typology within the concept of organizational configuration, the main purpose of this paper is to evaluate empirically what configurational theories postulate: that higher organizational performance is associated with resemblance to one of the defined ideal types. However, as it is often assumed that an organization can increase performance by selecting the hybrid type best adjusted to its own exogenous environment, the relation between the organization's effectiveness and the alignment of its hybrid configuration with the respective environment type was also analyzed. The assumption of equifinality was also considered, because configurational theory assumes that all the ideal types can potentially achieve the same level of performance. A multiple regression model was estimated to test whether misfit relative to the ideal and hybrid types has a significant impact on organizational effectiveness, and analysis of variance and the Kruskal-Wallis test were used to test for equality of performance across the different organization types. In short, the empirical results obtained confirm what the theory postulates.
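As an illustration of the equifinality check described above (not the paper's data or exact procedure), a minimal sketch using scipy's Kruskal-Wallis test on synthetic performance scores for the four Miles and Snow types.

```python
# Minimal sketch of the equifinality check: a Kruskal-Wallis test of whether
# organizational performance differs across Miles and Snow types; data are synthetic.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(9)

# Hypothetical performance scores for firms classified into the four types
performance = {
    "defender": rng.normal(3.5, 0.8, 40),
    "prospector": rng.normal(3.6, 0.9, 35),
    "analyzer": rng.normal(3.5, 0.7, 45),
    "reactor": rng.normal(3.1, 0.9, 30),
}

h_stat, p_value = kruskal(*performance.values())
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
print("Equal performance across types (equifinality) is not rejected"
      if p_value > 0.05 else "Performance differs significantly across types")
```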