920 results for Ecosystem-level models
Abstract:
A Pilot-Scale Engineered Ecosystem (PSEE), operated for over two years in sub-tropical conditions, produced an effluent with COD (median 38 mg/L) and TSS (median 3 mg/L) levels comparable to those required by the AS/NZS 1547:2000 Onsite Domestic Wastewater Management standard. Only partial nitrification was achieved because dissimilatory nitrate reduction to ammonia occurred; however, the level of NH4-N was reduced by 75% and total inorganic nitrogen by 53%. Phosphorus was not removed by the system due to the lack of regular sludge removal. Mass balances around the system showed that bacteria removed 36% of the influent nitrogen and 76% of the influent COD. Algae and plants were shown to remove 5% of the influent nitrogen and 6% of the influent phosphorus. The challenges of developing a sustainable on-site wastewater treatment system were largely met by minimising chemical, energy and labour inputs, eliminating the need for frequent sludge handling, and producing an effluent quality suitable for re-use in non-potable applications. However, sludge removal from the system needs to be adequately managed to avoid excessive accumulation, as this can cause a range of negative impacts.
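The removal figures above follow from simple influent/effluent mass balances. A minimal sketch of the underlying arithmetic, using hypothetical concentrations rather than values from the study:

```python
# Removal fraction from an influent/effluent mass balance.
# Concentrations below are illustrative placeholders, not data from the PSEE study.
def removal_fraction(influent_mg_per_l: float, effluent_mg_per_l: float) -> float:
    """Fraction of a constituent removed between influent and effluent."""
    return (influent_mg_per_l - effluent_mg_per_l) / influent_mg_per_l

# Example: an assumed influent NH4-N of 60 mg/L reduced to 15 mg/L in the effluent.
print(f"NH4-N removal: {removal_fraction(60.0, 15.0):.0%}")  # -> 75%
```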
Abstract:
Modeling volcanic phenomena is complicated by free surfaces that often support large rheological gradients. Analytical solutions and analogue models provide explanations for fundamental characteristics of lava flows, but more sophisticated models, incorporating improved physics and rheology, are needed to capture realistic events. To advance our understanding of the flow dynamics of highly viscous lava in Peléean lava dome formation, axisymmetric Finite Element Method (FEM) models of generic endogenous dome growth have been developed. We use a novel technique, the level-set method, which tracks a moving interface while leaving the mesh unaltered. The model equations are formulated in an Eulerian framework. In this paper we test the quality of this technique in our numerical scheme against existing analytical and experimental models of lava dome growth which assume a constant Newtonian viscosity. We then compare our model, using an effective viscosity, against analytical solutions for real lava domes extruded on Soufrière, St. Vincent, W.I. in 1979 and Mount St. Helens, USA in October 1980. The level-set method is found to be computationally light and robust enough to model the free surface of a growing lava dome. Also, modeling the extruded lava with a constant pressure head naturally results in a drop in extrusion rate with increasing dome height, which can explain lava dome growth observables more appropriately than a fixed extrusion rate. From the modeling point of view, the level-set method will ultimately provide an opportunity to capture more of the physics while benefiting from the numerical robustness of regular grids.
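The level-set technique referred to here tracks the interface implicitly as the zero contour of a field advected on a fixed grid. A minimal one-dimensional sketch of a single advection step, assuming a first-order upwind discretisation and a uniform velocity (far simpler than the paper's axisymmetric FEM formulation):

```python
import numpy as np

def advect_level_set(phi: np.ndarray, velocity: float, dx: float, dt: float) -> np.ndarray:
    """One first-order upwind step of phi_t + v * phi_x = 0.

    The zero contour of phi implicitly tracks the moving interface
    (here, the lava free surface) while the grid itself is never altered.
    Periodic boundaries are used purely for brevity.
    """
    if velocity >= 0:
        dphi = (phi - np.roll(phi, 1)) / dx   # backward difference
    else:
        dphi = (np.roll(phi, -1) - phi) / dx  # forward difference
    return phi - dt * velocity * dphi

# Signed-distance initialisation with the interface at x = 0.5 on a unit domain.
x = np.linspace(0.0, 1.0, 101)
phi = x - 0.5
phi = advect_level_set(phi, velocity=0.1, dx=x[1] - x[0], dt=0.05)
```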
Abstract:
Wildlife-habitat models are an important tool in wildlife management today, and by far the majority of these predict aspects of species distribution (abundance or presence) as a proxy measure of habitat quality. Unfortunately, few are tested on independent data, and of those that are, few show useful predictive skill. We demonstrate that six critical assumptions underlie distribution-based wildlife-habitat models, all of which must be valid for the model to predict habitat quality. We outline these assumptions in a meta-model, and discuss methods for their validation. Even where all six assumptions show a high level of validity, there is still a strong likelihood that the model will not predict habitat quality. However, the meta-model does suggest habitat quality can be predicted more accurately if distributional data are ignored and variables more indicative of habitat quality are modelled instead.
Abstract:
This article tests the hypothesis of opportunistic and partisan cycle models using a new large data set of Brazilian municipalities over the 1989-2005 period. The results show an increase in total and current expenditures and a decrease in municipal investments, local tax revenues, and budget surplus in election years. They also show that partisan ideology exerts a relative influence on the performance of the local public accounts. These results confirm that both opportunistic and partisan cycles have occurred in the management of the budgets of Brazilian municipalities after the end of the military government.
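Tests of this kind are commonly run as panel regressions of fiscal outcomes on an election-year indicator with municipality and year effects. A minimal sketch under that assumption; the file and column names are hypothetical placeholders, not the authors' specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical municipality-year panel; columns are illustrative only:
# municipality, year, current_spending, election_year (0/1), left_party (0/1).
df = pd.read_csv("municipal_budgets.csv")

# Two-way fixed effects via municipality and year dummies. A positive coefficient
# on election_year is consistent with an opportunistic cycle; the party term probes
# a partisan effect.
model = smf.ols(
    "current_spending ~ election_year + left_party + C(municipality) + C(year)",
    data=df,
).fit()
print(model.summary())
```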
Abstract:
Objective: To compare the triggering performance of mid-level ICU mechanical ventilators with that of a standard ICU mechanical ventilator. Design: Experimental bench study. Setting: The respiratory care laboratory of a university-affiliated teaching hospital. Subject: A computerized mechanical lung model, the IngMar ASL5000. Interventions: Ten mid-level ICU ventilators were compared to an ICU ventilator at two levels of lung model effort, three combinations of respiratory mechanics (normal, COPD and ARDS) and two modes of ventilation, volume and pressure assist/control, giving a total of 12 conditions. Measurements and main results: Performance varied widely among ventilators. Mean inspiratory trigger time was < 100 ms for only half of the tested ventilators. The mean inspiratory delay time (time from initiation of the breath to return of airway pressure to baseline) was longer than that of the ICU ventilator for all tested ventilators except one. The pressure drop during triggering (Ptrig) was comparable with that of the ICU ventilator for only two ventilators. Expiratory settling time (time for pressure to return to baseline) showed the greatest variability among ventilators. Conclusions: Differences in triggering were identified among these mid-level ICU ventilators and between them and the ICU ventilator. Some of these ventilators had a much poorer triggering response at high inspiratory effort than the ICU ventilator, and they do not perform as well as ICU ventilators in patients with high ventilatory demand.
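The delay metrics reported here can be computed directly from a sampled airway-pressure trace. A minimal sketch of the inspiratory delay time under an assumed definition, using a synthetic signal rather than the bench data:

```python
import numpy as np

def inspiratory_delay_ms(pressure: np.ndarray, effort_start_idx: int,
                         baseline: float, fs_hz: float) -> float:
    """Time (ms) from the start of simulated patient effort until airway
    pressure first returns to baseline. Definition assumed for illustration."""
    after = pressure[effort_start_idx:]
    recovered = np.flatnonzero(after >= baseline)
    return float("nan") if recovered.size == 0 else 1000.0 * recovered[0] / fs_hz

fs = 1000.0                       # 1 kHz sampling (assumed)
t = np.arange(0.0, 0.5, 1.0 / fs)
baseline = 5.0                    # assumed baseline pressure (cmH2O)
# Synthetic trace: a triggering dip below baseline from 0.05 s that recovers at 0.15 s.
trace = np.where((t >= 0.05) & (t < 0.15), baseline - 2.0, baseline)
print(inspiratory_delay_ms(trace, effort_start_idx=int(0.05 * fs),
                           baseline=baseline, fs_hz=fs))  # -> 100.0 ms
```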
Abstract:
Impulsivity based on Gray's [Gray, J. A. (1982). The neuropsychology of anxiety: an enquiry into the functions of the septo-hippocampal system. New York: Oxford University Press; Gray, J. A. (1991). The neurophysiology of temperament. In J. Strelau & A. Angleitner (Eds.), Explorations in temperament: international perspectives on theory and measurement. London: Plenum Press] physiological model of personality was hypothesised to be more predictive of goal-oriented criteria within the workplace than scales derived from Eysenck's [Eysenck, H. J. (1967). The biological basis of personality. Springfield, IL: Charles C. Thomas] physiological model of personality. Results confirmed the hypothesis and also showed that Gray's scale of impulsivity was generally a better predictor than attributional style and interest in money. Results were interpreted as providing support for Gray's Behavioural Activation System, which moderates response to reward. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). Joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when it was applied to the six basic generations: both parents (P1 and P2), F1, F2, and both backcross generations (B1 and B2) derived from crossing the F1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study to quantify the power of the method. Factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest levels of power were observed for the genetic models defined with simple inheritance; e.g., the power was greater than 90% for the one-major-gene model, regardless of the population size and major-gene heritability. Lower levels of power were observed for the genetic models with complex inheritance (major genes and polygenes), low heritability, small population sizes and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two-major-gene model with a heritability value of 0.3 and population sizes of 100 individuals. The JSA methodology was then applied to a previously studied sorghum data-set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model: the presence of the two major genes was confirmed, with the addition of an unspecified number of polygenes.
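A flavour of how segregation-model power can be estimated by simulation: generate a generation segregating for one additive major gene plus a polygenic background, fit a three-component normal mixture by maximum likelihood, and count how often a likelihood-ratio test detects the major gene. This sketch uses only an F2 generation and arbitrary parameter values; the JSA itself fits all six generations jointly:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

def simulate_f2(n, effect, h2_poly, resid_sd=1.0):
    """F2 phenotypes: one additive major gene (1:2:1 genotype ratio) plus a
    normally distributed polygenic background. Values are illustrative only."""
    geno = rng.choice([-1.0, 0.0, 1.0], size=n, p=[0.25, 0.5, 0.25])
    poly_sd = resid_sd * np.sqrt(h2_poly / (1.0 - h2_poly))
    return effect * geno + rng.normal(0.0, poly_sd, n) + rng.normal(0.0, resid_sd, n)

def neg_loglik_mixture(params, y):
    """Major-gene model: three-component normal mixture with 1:2:1 weights."""
    mu, a, sd = params
    sd = abs(sd) + 1e-8
    dens = (0.25 * stats.norm.pdf(y, mu - a, sd)
            + 0.50 * stats.norm.pdf(y, mu, sd)
            + 0.25 * stats.norm.pdf(y, mu + a, sd))
    return -np.sum(np.log(dens + 1e-300))

def lrt_major_gene(y):
    """Likelihood-ratio statistic: mixture (major gene) vs. a single normal."""
    fit = optimize.minimize(neg_loglik_mixture, x0=[y.mean(), y.std(), y.std()],
                            args=(y,), method="Nelder-Mead")
    null_nll = -np.sum(stats.norm.logpdf(y, y.mean(), y.std()))
    return 2.0 * (null_nll - fit.fun)

# Crude power estimate: share of replicates whose LRT exceeds a chi-square
# critical value (degrees of freedom chosen here purely for illustration).
crit = stats.chi2.ppf(0.95, df=2)
power = np.mean([lrt_major_gene(simulate_f2(200, effect=1.0, h2_poly=0.3)) > crit
                 for _ in range(100)])
print(f"approximate power: {power:.2f}")
```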
Abstract:
Observations of accelerating seismic activity prior to large earthquakes in natural fault systems have raised hopes for intermediate-term earthquake forecasting. If this phenomenon does exist, then what causes it to occur? Recent theoretical work suggests that the accelerating seismic release sequence is a symptom of increasing long-wavelength stress correlation in the fault region. A more traditional explanation, based on Reid's elastic rebound theory, argues that an accelerating sequence of seismic energy release could be a consequence of increasing stress in a fault system whose stress moment release is dominated by large events. Both of these theories are examined using two discrete models of seismicity: a Burridge-Knopoff block-slider model and an elastic continuum-based model. Both models display an accelerating release of seismic energy prior to large simulated earthquakes. In both models there is a correlation between the rate of seismic energy release and both the total root-mean-squared stress and the level of long-wavelength stress correlation. Furthermore, both models exhibit a systematic increase in the number of large events at high stress and high long-wavelength stress correlation levels. These results suggest that either explanation is plausible for the accelerating moment release in the models examined. A statistical model based on the Burridge-Knopoff block-slider is constructed which indicates that stress alone is sufficient to produce an accelerating release of seismic energy with time prior to a large earthquake.
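A minimal spring-block ("Burridge-Knopoff-like") sketch of the kind of discrete seismicity model discussed above: blocks are loaded uniformly until one reaches a failure threshold, a slipping block passes part of its stress to its neighbours, and the resulting cascades produce a sequence of events whose cumulative release can be inspected for acceleration. The loading rule and parameter values are illustrative simplifications, not the specific models analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

n_blocks = 200
threshold = 1.0      # failure stress
alpha = 0.2          # fraction of a slipping block's stress passed to each neighbour
stress = rng.uniform(0.0, threshold, n_blocks)

def drive_and_relax(stress):
    """Load all blocks uniformly to the next failure, then cascade the slip."""
    stress = stress + (threshold - stress.max())
    released = 0.0
    unstable = np.flatnonzero(stress >= threshold)
    while unstable.size:
        for i in unstable:
            released += stress[i]
            give = alpha * stress[i]
            stress[i] = 0.0
            if i > 0:
                stress[i - 1] += give
            if i < n_blocks - 1:
                stress[i + 1] += give
        unstable = np.flatnonzero(stress >= threshold)
    return stress, released

events = []
for _ in range(5000):
    stress, released = drive_and_relax(stress)
    events.append(released)
print("largest simulated event:", max(events))
```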
Abstract:
As inorganic arsenic is a proven human carcinogen, significant effort has been made in recent decades to understand arsenic carcinogenesis using animal models, including rodents (rats and mice) and larger mammals such as beagles and monkeys. Transgenic animals were also used to test the carcinogenic effect of arsenicals, but until recently all models had failed to mimic satisfactorily the actual mechanism of arsenic carcinogenicity. However, within the past decade successful animal models have been developed using the most common strains of mice or rats. Thus dimethylarsinic acid (DMA), an organic arsenic compound which is the major metabolite of inorganic arsenicals in mammals, has been proven to be tumorigenic in such animals. Reports of successful cancer induction in animals by inorganic arsenic (arsenite and arsenate) have been rare, and most carcinogenesis studies have used organic arsenicals such as DMA combined with other tumor initiators. Although such experiments used high concentrations of arsenicals for the promotion of tumors, animal models using doses of arsenical species close to the exposure levels of humans in endemic areas are obviously the most significant. Almost all researchers have used drinking water or food as the pathway for the development of animal model test systems in order to mimic chronic arsenic poisoning in humans; such pathways seem more likely to achieve desirable results. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
Abstract:
Developments in computer and three-dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect this model is a 'virtual ecosystem' laboratory to address local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
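The L-system formalism mentioned here is, at its core, parallel string rewriting: every symbol is replaced by its successor at each step, and the resulting string is interpreted as plant structure. A minimal sketch with a generic textbook rule set, not the plant models described in the paper:

```python
def rewrite(axiom: str, rules: dict[str, str], iterations: int) -> str:
    """Apply the production rules to every symbol in parallel, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# A simple branching structure: F = grow a segment, [ and ] = push/pop a branch,
# + and - = turn. An insect-movement simulation would walk the resulting string,
# querying the properties attached to each module of the virtual plant.
rules = {"F": "F[+F]F[-F]F"}
print(rewrite("F", rules, iterations=2))
```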
Abstract:
This paper examines the performance of Portuguese equity funds investing in the domestic and in the European Union market, using several unconditional and conditional multi-factor models. In terms of overall performance, we find that National funds are neutral performers, while European Union funds under-perform the market significantly. These results do not seem to be a consequence of management fees. Overall, our findings are supportive of the robustness of conditional multi-factor models. In fact, Portuguese equity funds seem to be relatively more exposed to small caps and more value-oriented. Also, they present strong evidence of time-varying betas and, in the case of the European Union funds, of time-varying alphas too. Finally, in terms of market timing, our tests suggest that mutual fund managers in our sample do not exhibit any market timing abilities. Nevertheless, we find some evidence of time-varying conditional market timing abilities, but only at the individual fund level.
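In conditional multi-factor models of the kind evaluated here, betas (and possibly alphas) are allowed to vary with lagged public information variables, which amounts to adding interaction terms to the factor regression. A minimal sketch; the file and column names are hypothetical placeholders, not the paper's data or exact specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly data: fund excess return, factor returns, and lagged
# information variables (columns are illustrative only).
df = pd.read_csv("fund_returns.csv")

formula = (
    "excess_return ~ mkt + smb + hml"
    " + mkt:lagged_dividend_yield + mkt:lagged_term_spread"  # time-varying beta
    " + lagged_dividend_yield + lagged_term_spread"          # time-varying alpha
)
fit = smf.ols(formula, data=df).fit()
print(fit.params)
```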
Abstract:
Current software development relies increasingly on non-trivial coordination logic for combining autonomous services, often running on different platforms. As a rule, however, in typical non-trivial software systems, such a coordination layer is strongly weaved within the application at the source code level. Therefore, its precise identification becomes a major methodological (and technical) problem whose importance in any program understanding or refactoring process cannot be overstated. Open access to source code, as granted in OSS certification, provides an opportunity for the development of methods and technologies to extract, from source code, the relevant coordination information. This paper is a step in this direction, combining a number of program analysis techniques to automatically recover coordination information from legacy code. Such information is then expressed as a model in Orc, a general-purpose orchestration language.
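One ingredient of such a recovery pipeline can be pictured as walking a program's abstract syntax tree and collecting calls to coordination primitives. The sketch below does this for Python source with an assumed set of primitive names; the paper's approach combines several analyses over legacy code and emits an Orc model, which is not reproduced here:

```python
import ast

# Hypothetical set of coordination primitives to look for.
COORDINATION_CALLS = {"send", "receive", "publish", "subscribe", "invoke"}

def collect_coordination_calls(source: str):
    """Return (call name, line number) for every call to an assumed primitive."""
    calls = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = getattr(node.func, "attr", getattr(node.func, "id", None))
            if name in COORDINATION_CALLS:
                calls.append((name, node.lineno))
    return calls

example = "def handler(bus):\n    msg = bus.receive()\n    bus.publish(process(msg))\n"
print(collect_coordination_calls(example))  # [('receive', 2), ('publish', 3)]
```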
Abstract:
The Ambient Assisted Living (AAL) area is in constant evolution, providing new technologies to users and enhancing the level of security and comfort ensured by home platforms. The Ambient Assisted Living for All (AAL4ALL) project aims to develop a new AAL concept, supported by a unified ecosystem and certification process that enables a heterogeneous environment. The concepts of Intelligent Environments and Ambient Intelligence, and the foundations of Ambient Assisted Living, are all presented in the framework of this project. In this work, we consider a specific platform developed in the scope of AAL4ALL, called UserAccess. The architecture of the platform and its role within the overall AAL4ALL concept, the implementation of the platform, and the available interfaces are presented. In addition, its feasibility is validated through a series of tests.
Abstract:
A growing number of corporate failure prediction models has emerged since the 1960s. The economic and social consequences of business failure can be dramatic, so it is no surprise that the issue has been of growing interest in academic research as well as in business practice. The main purpose of this study is to compare the predictive ability of five models: three based on statistical techniques (Discriminant Analysis, Logit and Probit) and two based on Artificial Intelligence (Neural Networks and Rough Sets). The five models were applied to a dataset of 420 non-bankrupt firms and 125 bankrupt firms belonging to the textile and clothing industry, over the period 2003-09. Results show that all the models performed well, with an overall correct classification level higher than 90% and a type II error always below 2%. The type I error increases as we move away from the year prior to failure. Our models contribute to the discussion of the causes of corporate financial distress and can be used to assist the decisions of creditors, investors and auditors. Additionally, this research can be of great value to designers of national economic policies that aim to reduce industrial unemployment.
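A minimal sketch of one of the compared model families (a logit classifier) together with the type I and type II error rates discussed above. The file, column names and train/test split are hypothetical placeholders, not the study's dataset or estimation procedure:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical data: financial ratios per firm plus a 'bankrupt' flag (1 = failed).
df = pd.read_csv("firms.csv")
X, y = df.drop(columns="bankrupt"), df["bankrupt"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = clf.predict(X_test)

# Type I error: failed firms classified as healthy (the costlier mistake for creditors).
type_i = np.mean(pred[np.asarray(y_test) == 1] == 0)
# Type II error: healthy firms classified as failed.
type_ii = np.mean(pred[np.asarray(y_test) == 0] == 1)
print(f"type I error: {type_i:.1%}   type II error: {type_ii:.1%}")
```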