959 results for ecological box-model


Relevance:

90.00%

Publisher:

Abstract:

The Argentine ant (Linepithema humile) is among the most invasive species: native to South America, it has now invaded numerous areas around the world. This doctoral thesis presents a first integrated, multiscale analysis of the distribution of the Argentine ant using ecological niche models. According to the results obtained, the Argentine ant is expected to reach a wider distribution than at present. The model predictions agree with the currently known distribution and, in addition, indicate areas near the coast and major rivers as highly favourable for the species. These results support the idea that the Argentine ant is not currently in equilibrium with its environment. Furthermore, under climate change, the distribution of the Argentine ant is expected to expand towards higher latitudes in both hemispheres and to contract in the tropics at global scales.

Relevance:

90.00%

Publisher:

Abstract:

Using a flexible chemical box model with full heterogeneous chemistry, intercepts of chemically modified Langley plots have been computed for the 5 years of zenith-sky NO2 data from Faraday in Antarctica (65°S). By using these intercepts as the effective amount in the reference spectrum, drifts in the zero of total vertical NO2 were much reduced. The error in the zero of total NO2 is ±0.03 × 10¹⁵ molec cm⁻² from one year to another. This error is small enough to determine trends in midsummer and any variability in denoxification between midwinters. The technique also suggests a more sensitive method for determining N2O5 from zenith-sky NO2 data.
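As a rough illustration of the Langley-plot idea above, the regression below recovers the amount absorbed in the reference spectrum from the intercept of a fit of differential slant columns against modelled slant columns. All numbers are hypothetical, not the Faraday data.

```python
import numpy as np

# Hypothetical synthetic twilight measurements: a fixed "true" vertical
# column (VCD) observed through a range of zenith-sky air-mass factors (AMF).
rng = np.random.default_rng(0)
amf = np.linspace(5.0, 17.0, 30)   # zenith-sky NO2 air-mass factors
vcd = 3.0e15                       # assumed vertical column, molec cm^-2
scd_ref = 1.2e16                   # amount absorbed in the reference spectrum

# Measured differential slant columns: the true slant column minus the
# (unknown) reference-spectrum amount, plus measurement noise.
dscd = vcd * amf - scd_ref + rng.normal(0.0, 1e14, amf.size)

# A Langley-type plot regresses DSCD against the air-mass factor; the
# intercept estimates minus the reference-spectrum amount.
slope, intercept = np.polyfit(amf, dscd, 1)
scd_ref_est = -intercept
```

Using the fitted intercept as the effective reference amount, as the paper does, removes the arbitrary zero offset that otherwise drifts between years.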

Relevance:

90.00%

Publisher:

Abstract:

Biosecurity is a great challenge to policy-makers globally. Biosecurity policies aim either to prevent invasions before they occur or to eradicate and/or effectively manage invasive species and diseases once an invasion has occurred. Such policies have traditionally been directed towards professional producers in natural-resource-based sectors, including agriculture. Given the wide scope of issues threatened by invasive species and diseases, it is important to account for the several types of stakeholders involved. We investigate the problem of an invasive insect pest feeding on an agricultural crop with heterogeneous producers: profit-oriented professional farmers and utility-oriented hobby farmers. We start from an ecological-economic model conceptually similar to the one developed by Eiswerth and Johnson [Eiswerth, M.E. and Johnson, W.S., 2002. Managing nonindigenous invasive species: insights from dynamic analysis. Environmental and Resource Economics 23, 319-342.] and extend it in three ways. First, we make explicit the relationship between the invaded-state carrying capacity and farmers' planting decisions. Second, we add another producer type into the framework and hence account for the existence of both professional and hobby farmers. Third, we provide a theoretical contribution by discussing two alternative types of equilibria. We also apply the model to an empirical case to extract a number of stylised facts and in particular to assess: a) under which circumstances the invasion is likely not to be controllable; and b) how extending control policies to hobby farmers could affect both types of producers. (C) 2008 Elsevier B.V. All rights reserved.
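A minimal sketch of an Eiswerth/Johnson-style pest dynamic with the paper's first extension, a carrying capacity tied to planting decisions, and its second, a hobby-farmer type assumed not to control the pest. All parameter values and the functional forms are hypothetical, chosen only to illustrate the structure.

```python
import numpy as np

def simulate(years=50, r=0.4, control=0.1, prof_area=80.0, hobby_area=20.0):
    """Logistic pest growth with control applied only on professional land."""
    # Extension 1: carrying capacity scales with total planted host-crop area.
    K = prof_area + hobby_area
    x = 1.0                                   # initial pest population
    path = []
    for _ in range(years):
        growth = r * x * (1.0 - x / K)
        # Extension 2: only the professional share of the landscape is treated.
        removal = control * x * prof_area / (prof_area + hobby_area)
        x = max(x + growth - removal, 0.0)
        path.append(x)
    return np.array(path)

invaded = simulate(control=0.0)    # no control: pest approaches capacity K
controlled = simulate(control=0.5) # strong control on professional land
```

Comparing the two runs shows the kind of question the paper asks empirically: whether control effort confined to professional farmers suffices when hobby farmers host the pest uncontrolled.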

Relevance:

90.00%

Publisher:

Abstract:

SeaWiFS (Sea-viewing Wide Field-of-view Sensor) chlorophyll data revealed strong interannual variability in fall phytoplankton dynamics in the Gulf of Maine, with 3 general features in any one year: (1) rapid chlorophyll increases in response to storm events in fall; (2) gradual chlorophyll increases in response to seasonal wind- and cooling-induced mixing that gradually deepens the mixed layer; and (3) the absence of any observable fall bloom. We applied a mixed-layer box model and a 1-dimensional physical-biological numerical model to examine the influence of physical forcing (surface wind, heat flux, and freshening) on the mixed-layer dynamics and its impact on the entrainment of deep-water nutrients and thus on the appearance of the fall bloom. The model results suggest that during early fall, the surface mixed-layer depth is controlled by both wind- and cooling-induced mixing. Strong interannual variability in mixed-layer depth has a direct impact on short- and long-term vertical nutrient fluxes and thus the fall bloom. Phytoplankton concentrations over time are sensitive to initial pre-bloom profiles of nutrients. The strength of the initial stratification can affect the modeled phytoplankton concentration, while the timing of intermittent freshening events is related to the significant interannual variability of fall blooms.
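The entrainment mechanism at the heart of such a mixed-layer box model can be sketched in a few lines: when mixing deepens the layer, nutrient-rich deep water is diluted into the box. The function and parameter values below are a hypothetical illustration, not the study's model.

```python
def entrain(n_mixed, n_deep, h_old, h_new):
    """Mixed-layer nutrient concentration after deepening from h_old to h_new.

    Conserves the nutrient inventory: the old layer's content plus the
    entrained deep water, divided by the new layer depth.
    """
    if h_new <= h_old:          # shoaling detrains water; no enrichment
        return n_mixed
    return (n_mixed * h_old + n_deep * (h_new - h_old)) / h_new

# A storm deepens a 20 m layer to 35 m; deep water holds 8 mmol m^-3 nitrate.
n_after = entrain(n_mixed=0.5, n_deep=8.0, h_old=20.0, h_new=35.0)
```

This is why both storm events and gradual seasonal deepening can fuel a fall bloom: each deepening step raises the mixed-layer nutrient concentration toward the deep value.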

Relevance:

90.00%

Publisher:

Abstract:

During the Paleocene-Eocene Thermal Maximum (PETM), rapid release of isotopically light C to the ocean-atmosphere system elevated the greenhouse effect and warmed temperatures by 5-7 °C for ~10⁵ yr. The response of planktic ecosystems and productivity to the dramatic climate changes of the PETM may represent a significant feedback to the carbon cycle changes, but has been difficult to document. We examine Sr/Ca ratios in calcareous nannofossils in sediments spanning the PETM at three open-ocean sites as a new approach to examining productivity and ecological shifts in calcifying plankton. The large heterogeneity in Sr/Ca among different nannofossil genera indicates that nannofossil Sr/Ca reflects primary productivity-driven geochemical signals and not diagenetic overprinting. Elevated Sr/Ca ratios in several genera and constant ratios in other genera suggest increased overall productivity in the Atlantic sector of the Southern Ocean during the PETM. Dominant nannofossil genera in tropical Atlantic and Pacific sites show Sr/Ca variations during the PETM which are comparable to background variability prior to the PETM. Despite acidification of the ocean there was not a productivity crisis among calcifying phytoplankton. We use the Pandora ocean box model to explore possible mechanisms for PETM productivity change. If independent proxy evidence for more stratified conditions in the Southern Ocean during the PETM is robust, then maintenance of stable or increased productivity there likely reflects increased nutrient inventories of the ocean. Increased nutrient inventories could have resulted from climatically enhanced weathering and would have important implications for burial rates of organic carbon and stabilization of climate and the carbon cycle.

Relevance:

90.00%

Publisher:

Abstract:

Hydroxyl radical (OH) is the primary oxidant in the troposphere, initiating the removal of numerous atmospheric species including greenhouse gases, pollutants that are detrimental to human health, and ozone-depleting substances. Because of the complexity of OH chemistry, models vary widely in their OH chemistry schemes and resulting methane (CH4) lifetimes. The current state of knowledge concerning global OH abundances is often contradictory. This body of work encompasses three projects that investigate tropospheric OH from a modeling perspective, with the goal of improving the atmospheric chemistry community's knowledge of the lifetime of CH4. First, measurements taken during the airborne CONvective TRansport of Active Species in the Tropics (CONTRAST) field campaign are used to evaluate OH in global models. A box model constrained to measured variables is used to infer concentrations of OH along the flight track. Results are used to evaluate global model performance, argue against the existence of a proposed "OH Hole" in the tropical Western Pacific, and investigate the implications of high-O3/low-H2O filaments for chemical transport to the stratosphere. While methyl chloroform-based estimates of global mean OH suggest that models are overestimating OH, we report evidence that these models are actually underestimating OH in the tropical Western Pacific. The second project examines OH within global models to diagnose differences in CH4 lifetime. I developed an approach to quantify the roles of OH precursor field differences (O3, H2O, CO, NOx, etc.) using a neural network method. This technique enables us to approximate the change in CH4 lifetime resulting from variations in individual precursor fields. The dominant factors driving CH4 lifetime differences between models are O3, CO, and J(O3-O1D). My third project evaluates the effect of climate change on global fields of OH using an empirical model.
Observations of H2O and O3 from satellite instruments are combined with a simulation of tropical expansion to derive changes in global mean OH over the past 25 years. We find that increasing H2O and increasing width of the tropics tend to increase global mean OH, countering the increasing CH4 sink and resulting in well-buffered global tropospheric OH concentrations.
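The attribution idea in the second project can be sketched as follows: fit a surrogate mapping precursor abundances to a CH4 lifetime, then swap in one model's field at a time to see how much of the inter-model lifetime difference it explains. A linear surrogate on synthetic data stands in for the neural network here; all values are hypothetical.

```python
import numpy as np

# Synthetic training set: normalised precursor "fields" (O3, H2O, CO, NOx)
# and a CH4 lifetime that depends on them linearly (an assumption made
# purely so the example is self-contained and exactly learnable).
rng = np.random.default_rng(1)
X = rng.uniform(0.5, 1.5, size=(200, 4))
true_w = np.array([3.0, -2.0, 1.5, -0.5])   # hypothetical sensitivities (yr)
tau = 9.0 + X @ true_w                       # synthetic CH4 lifetime (yr)

# Fit the surrogate (intercept + one weight per precursor).
w, *_ = np.linalg.lstsq(np.c_[np.ones(200), X], tau, rcond=None)

model_a = np.array([1.0, 1.0, 1.0, 1.0])     # baseline precursor state
model_b = np.array([1.2, 1.0, 1.0, 1.0])     # a model with 20% more O3
# Attribution: lifetime change predicted from swapping the O3 field alone.
d_tau_o3 = (np.r_[1.0, model_b] - np.r_[1.0, model_a]) @ w
```

Repeating the swap for each precursor field decomposes an inter-model lifetime gap into per-precursor contributions, which is the role the neural network plays in the thesis.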

Relevance:

90.00%

Publisher:

Abstract:

Artificial Intelligence (AI) and Machine Learning (ML) are data analysis techniques that can provide very accurate predictions. They are widely adopted across industries to improve efficiency and decision-making, and they are also used to build intelligent systems. Their success rests on complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the field of Explainable AI (XAI) has become prominent in recent years. XAI comprises models and techniques for understanding the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between explanation stability and reliability. We subsequently put forward GLEAMS, a model-agnostic interpretable surrogate which needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios from both the dataset and model perspectives. Finally, we argue that synthetic data are an emerging trend in AI, increasingly used to train complex models instead of the original data. To explain the outcomes of such models, we must guarantee that the synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
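The instability that the proposed Stability Indices quantify can be demonstrated with a generic LIME-like procedure (this is not the authors' implementation): repeated local surrogate fits around the same point yield different feature attributions because the perturbation sample is random. The black-box function and kernel below are hypothetical.

```python
import numpy as np

def black_box(X):
    """A hypothetical model to explain: quadratic in x0, linear in x1."""
    return X[:, 0] ** 2 + 0.5 * X[:, 1]

def local_coefs(x0, n=500, width=0.5, rng=None):
    """One LIME-like run: weighted linear fit on perturbations around x0."""
    rng = rng if rng is not None else np.random.default_rng()
    Xp = x0 + rng.normal(0.0, width, size=(n, x0.size))     # perturb x0
    w = np.exp(-np.sum((Xp - x0) ** 2, axis=1) / width**2)  # proximity kernel
    A = np.c_[np.ones(n), Xp] * w[:, None]                  # weighted design
    coefs, *_ = np.linalg.lstsq(A, black_box(Xp) * w, rcond=None)
    return coefs[1:]                                        # attributions

x0 = np.array([1.0, 0.0])
runs = np.array([local_coefs(x0, rng=np.random.default_rng(s))
                 for s in range(20)])
instability = runs.std(axis=0)   # larger spread = less stable explanation
```

Averaged over runs the attributions approach the local gradient (about 2 for x0, 0.5 for x1), but any single run deviates; summarising that run-to-run spread is precisely the job of a stability index.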

Relevance:

80.00%

Publisher:

Abstract:

Neglecting health effects from indoor pollutant emissions and exposure, as currently done in Life Cycle Assessment (LCA), may result in product or process optimizations at the expense of workers' or consumers' health. To close this gap, methods for considering indoor exposure to chemicals are needed to complement the methods for outdoor human exposure assessment already in use. This paper summarizes the work of an international expert group on the integration of human indoor and outdoor exposure in LCA, within the UNEP/SETAC Life Cycle Initiative. A new methodological framework is proposed for a general procedure to include human-health effects from indoor exposure in LCA. Exposure models from occupational hygiene and household indoor air quality studies and practices are critically reviewed, and recommendations are provided on the appropriateness of various model alternatives in the context of LCA. A single-compartment box model is recommended for use as a default in LCA, enabling one to screen occupational and household exposures consistently with the existing models used to assess outdoor emissions in a multimedia environment. An initial set of model parameter values was collected. The comparison between indoor and outdoor human exposure per unit of emission shows that for many pollutants, intake per unit of indoor emission may be several orders of magnitude higher than for outdoor emissions. It is concluded that indoor exposure should be routinely addressed within LCA.
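The recommended single-compartment box model reduces, at steady state, to a few lines: concentration is emission rate over ventilation rate, so the intake fraction depends only on occupancy, breathing rate, and air exchange. The numbers below are illustrative, not the initiative's parameter set.

```python
def indoor_intake_fraction(air_changes_per_h, volume_m3, occupants,
                           breathing_m3_per_h=0.8):
    """Fraction of an indoor emission inhaled by occupants at steady state.

    For emission rate G, the well-mixed box gives C = G / Q with
    Q = air_changes_per_h * volume_m3. The inhaled dose rate is
    C * occupants * breathing rate, so the intake fraction G cancels out of.
    """
    q_m3_per_h = air_changes_per_h * volume_m3   # ventilation flow
    return occupants * breathing_m3_per_h / q_m3_per_h

# A 250 m^3 office at 1 air change per hour with 5 occupants:
if_indoor = indoor_intake_fraction(1.0, 250.0, 5)
```

With these assumed values the intake fraction is 0.016, i.e. 16,000 per million; typical outdoor intake fractions are orders of magnitude smaller, which is the paper's central comparison.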

Relevance:

80.00%

Publisher:

Abstract:

Paltridge found reasonable values for the most significant climatic variables by maximizing the material-transport part of entropy production in a simple box model. Here, we analyse Paltridge's box model to obtain the energy and entropy balance equations separately. Derived expressions for global entropy production, which is a function of the radiation field, and even for its material-transport component, are shown to differ from those used by Paltridge. Plausible climatic states are found at extrema of these parameters. Feasible results are also obtained by minimizing the radiation part of entropy production, in agreement with one of Planck's results. Finally, globally averaged values of the entropy flux of radiation and of material entropy production are obtained for two dynamical extreme cases: an earth with uniform temperature, and an earth in radiative equilibrium at each latitude.
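A minimal two-box version of the Paltridge idea: solve the steady energy balance of a warm and a cold box coupled by a meridional heat transport F, and select the F that maximises the material entropy production F·(1/T_cold − 1/T_warm). The linearised emission law and forcing values are standard textbook-style assumptions, not Paltridge's actual parameters.

```python
import numpy as np

A, B = 204.0, 2.17             # linearised OLR = A + B*T (W m^-2, T in deg C)
S_warm, S_cold = 300.0, 160.0  # assumed absorbed solar per box (W m^-2)

def temperatures(F):
    """Steady-state box temperatures (K) for poleward heat transport F."""
    t_warm = (S_warm - A - F) / B + 273.15   # warm box exports F
    t_cold = (S_cold - A + F) / B + 273.15   # cold box imports F
    return t_warm, t_cold

# Scan F and evaluate the material entropy production for each value.
F_grid = np.linspace(0.0, 60.0, 601)
sigma = np.array([F * (1.0 / tc - 1.0 / tw)
                  for F, (tw, tc) in ((F, temperatures(F)) for F in F_grid)])
F_mep = F_grid[np.argmax(sigma)]             # maximum-entropy-production state
```

The entropy production vanishes both at F = 0 (no transport) and where the boxes equilibrate, so a maximum exists in between; Paltridge's observation was that the climate appears to sit near such a maximum.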

Relevance:

80.00%

Publisher:

Abstract:

A wide range of modelling algorithms is used by ecologists, conservation practitioners, and others to predict species ranges from point locality data. Unfortunately, the amount of data available is limited for many taxa and regions, making it essential to quantify the sensitivity of these algorithms to sample size. This is the first study to address this need by rigorously evaluating a broad suite of algorithms with independent presence-absence data from multiple species and regions. We evaluated predictions from 12 algorithms for 46 species (from six different regions of the world) at three sample sizes (100, 30, and 10 records). We used data from natural history collections to run the models, and evaluated the quality of model predictions with area under the receiver operating characteristic curve (AUC). With decreasing sample size, model accuracy decreased and variability increased across species and between models. Novel modelling methods that incorporate both interactions between predictor variables and complex response shapes (i.e. GBM, MARS-INT, BRUTO) performed better than most methods at large sample sizes but not at the smallest sample sizes. Other algorithms were much less sensitive to sample size, including an algorithm based on maximum entropy (MAXENT) that had among the best predictive power across all sample sizes. Relative to other algorithms, a distance metric algorithm (DOMAIN) and a genetic algorithm (OM-GARP) had intermediate performance at the largest sample size and among the best performance at the lowest sample size. No algorithm predicted consistently well with small sample size (n < 30) and this should encourage highly conservative use of predictions based on small sample size and restrict their use to exploratory modelling.
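The evaluation metric used above, the area under the ROC curve, can be computed directly from presence and absence scores via the equivalent Mann-Whitney rank statistic. This is a generic sketch of the metric, not the study's evaluation code.

```python
import numpy as np

def auc(scores_presence, scores_absence):
    """AUC as the probability that a random presence site outscores a
    random absence site, counting ties as half a win."""
    sp = np.asarray(scores_presence, float)
    sa = np.asarray(scores_absence, float)
    wins = ((sp[:, None] > sa[None, :]).sum()
            + 0.5 * (sp[:, None] == sa[None, :]).sum())
    return wins / (sp.size * sa.size)

# A model that ranks all presences above all absences scores AUC = 1.0;
# uninformative scores give ~0.5.
perfect = auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1])
```

Because AUC depends only on the ranking of sites, it lets algorithms with very different output scales (MAXENT, GBM, DOMAIN, etc.) be compared on equal footing across sample sizes.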

Relevance:

80.00%

Publisher:

Abstract:

It has been shown in organizational settings that trust is a crucial factor in different kinds of outcomes, and consequently, building employee trust in the employer is a goal for all kinds of organizations. Although it is recognized that trust in organizations operates on multiple levels, at present there is no clear consensus on the concept of trust within the organization. One can have trust in particular people (i.e. interpersonal trust) or in organized systems (i.e. impersonal trust). Until recently organizational trust has been treated mainly as an interpersonal phenomenon. However, the interpersonal approach is limited. Scholars studying organizational trust have thus far focused only on specific dimensions of impersonal trust, and none have taken a comprehensive approach. The first objective in this study was to develop a construct and a scale encompassing the impersonal element of organizational trust. The second objective was to examine the effects of various HRM practices on the impersonal dimensions of organizational trust. Moreover, although the “black box” model of HRM is widely studied, there have been only a few attempts to unlock the box. Previous studies on the HRM-performance link refer to trust, and this work contributes to the literature in considering trust an impersonal issue in the relationship between HRM, trust, and performance. The third objective was thus to clarify the role of impersonal trust in the relationship between HRM and performance. The study is divided into two parts comprising the Introduction and four separate publications. Each publication addresses a distinct sub-question, whereas the Introduction discusses the overall results in the light of the individual sub-questions. The study makes two major contributions to the research on trust. Firstly, it offers a framework describing the construct of impersonal trust, which to date has not been clearly articulated in the research on organizational trust. 
Secondly, a comprehensive, psychometrically sound, operationally valid scale for measuring impersonal trust was developed. In addition, the study makes an empirical contribution to research on strategic HRM. First, it shows that HRM practices affect impersonal trust, and it considers the HRM-trust link specifically in terms of impersonal organizational trust: each of the six HRM practices in focus is shown to be connected to impersonal trust. A further contribution lies in unlocking the black box. The study explores the impersonal element of organizational trust and its mediating role between HRM practices and performance, identifying the path by which HRM contributes to performance through the mediator of impersonal trust. It is shown that HRM designed specifically to enhance employees' impersonal trust in the organization has a positive effect on performance.

Relevance:

80.00%

Publisher:

Abstract:

Imagine the potential of an organization whose business and IT processes are well aligned and capable of responding both reactively and proactively to external and internal changes. The Philips IT Infrastructure and Operations department (I&O) is undergoing a series of transformation activities to help the Philips business keep up with these changes. I&O would serve a critical function in any business sector; given that I&O's strategy switched from "design, build and run" to "specify, acquire and performance manage", that function is amplified. In 2013, I&O's biggest transformation programme, I&O Futures, engaged multiple interdisciplinary departments and programs in decommissioning legacy processes and restructuring new processes with respect to the Information Technology Infrastructure Library (ITIL), helping I&O to achieve a common infrastructure and operating platform (CI&OP). The author joined I&O Futures in early 2014 and contributed to CI&OP release 1, during which a model named Bing Box was designed and evaluated through the lens of Six Sigma's structured define-measure-analyze-improve-control (DMAIC) improvement approach. The Bing Box model was intended, first, to combine business and IT principles, namely Lean IT, Agile, ITIL best practices, and aspect-oriented programming (AOP), into a single framework. Second, the author implemented modularized optimization cycles according to the defined framework in Philips' ITIL-based processes, in order to enhance business process performance as well as to increase the efficiency of the optimization cycles themselves. What is unique about this thesis is that the Bing Box model not only provides comprehensive optimization approaches and principles for business process performance, but also integrates and standardizes optimization modules for the optimization process itself.
The research followed a design-research guideline that seeks to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. Chapter 2 reviews current research on Lean Six Sigma, Agile, AOP and ITIL, aiming to identify the broad conceptual bases for this study. Chapter 3 describes the process of constructing the Bing Box model. Chapter 4 covers the adoption of the Bing Box model: two implementation cases validated by stakeholders through observations and interviews. Chapter 5 contains the concluding remarks, the limitations of this research, and future research areas. Chapter 6 provides the references used in this thesis.

Relevance:

80.00%

Publisher:

Abstract:

Context and objectives. This thesis proposes an ecological conceptual model for better understanding violence in schools. The objectives of this research are to: (1) estimate the effect of individual, contextual, and environmental factors on the risk of victimisation; and (2) test for interactions between these factors. Methodology. Students from 16 primary schools in the greater Montreal metropolitan area took part in a self-report survey covering several dimensions of school victimisation. Descriptive analyses were first conducted to characterise violence in these schools. A generalised hierarchical linear model (GHLM) was then used to estimate the effects of individual, contextual, and environmental variables on the risk of victimisation. Results. The multilevel analyses show that individual, contextual, and environmental variables influence the probability of being a victim of verbal or physical violence or of violence on social media. The most delinquent students are also those who report the most prior victimisation. However, these results are not entirely attributable to individual characteristics. The risk of victimisation is reduced when "guardians" intervene to end the conflict and when victims defend themselves. Finally, the risk of victimisation is lower in schools with larger enrolments. Interpretation. The results suggest that several factors unrelated to victims and offenders help explain the process of school victimisation. The role of guardians, as well as school size, is central to understanding these incidents.

Relevance:

80.00%

Publisher:

Abstract:

Thermally activated building systems are building components that form part of a room's enclosing surfaces and can be supplied with a heating or cooling medium through an integrated pipe system, thereby heating or cooling the room. Under this definition, the range of constructions extends from heated and chilled ceilings, through floor slabs with core-embedded pipes, to underfloor heating. The extremely slow-responding systems among them are deliberately used to decouple energy supply from room energy demand in time, in the interest of rational energy use, e.g. active cooling of the slab at night and passive cooling of the room by the cool slab during the day. Building and plant concepts that include slow-responding thermally activated building systems require modern building simulation tools in a competent and responsible design process, in order to make sound statements about comfort and energy demand. Within these tools, thermally activated building systems are represented by calculation components based on mathematical-physical models that solve the component's multidimensional transient heat conduction problem. Until now, two fundamentally different solution approaches were available, both rooted in physical modelling and each imposing limits on representable geometry or computation speed. This thesis documents a new approach, termed experimental modelling. Through system identification, the parameters of a compact black-box model can be determined from experimentally obtained data series; the resulting model reproduces the input-output behaviour of the corresponding, arbitrarily constructed, thermally activated component with sufficient accuracy.
The measurement data series can be generated by highly accurate calculations which, because of their level of detail, would be unsuitable for direct use in building simulation. The application of system identification to the two-dimensional heat conduction problem, and the demonstration of its suitability, is carried out on six very different constructions of thermally activated building systems and confirms very small temperature and energy balance errors. Comparisons between black-box models obtained via system identification and physical models for two floor constructions show that the former can also serve as a reference for accuracy assessments. The practicality of the new modelling approach is demonstrated in case studies involving full-year simulations of an exemplary office, with variations in component design and operation. For this purpose, the black-box model is integrated into the commercial building and plant simulation program CARNOT. The acceptable computation times for a single-zone building model, combined with the high accuracy, confirm the suitability of the new modelling approach.
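The system-identification step can be illustrated with the simplest possible black-box structure, a first-order ARX model y[k] = a·y[k-1] + b·u[k-1], fitted by least squares to data generated by a reference simulation. The parameter values below are hypothetical, not the thesis's component models.

```python
import numpy as np

# "Reference simulation": a slow first-order thermal response, standing in
# for the detailed heat-conduction calculation that generates the data series.
a_true, b_true = 0.95, 0.05
rng = np.random.default_rng(2)
u = rng.uniform(0.0, 1.0, 500)   # supply-temperature input sequence
y = np.zeros(501)
for k in range(500):
    y[k + 1] = a_true * y[k] + b_true * u[k]

# System identification: least-squares fit of the ARX parameters from the
# generated input/output series. The cheap fitted model can then replace
# the expensive one inside a building simulation.
Phi = np.c_[y[:-1], u]
a_est, b_est = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
```

With noise-free data the parameters are recovered essentially exactly; the thesis's contribution is showing that the same idea scales to arbitrarily constructed thermally activated components with small temperature and energy balance errors.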

Relevance:

80.00%

Publisher:

Abstract:

This paper is based on alkyl nitrate measurements made over the North Atlantic as part of the International Consortium for Research on Atmospheric Transport and Transformation (ICARTT). The focus is on the analysis of air samples collected on the UK BAe-146 aircraft during the Intercontinental Transport of Ozone and Precursors (ITOP) project, but air samples collected on board the NASA DC-8 and NOAA WP-3D aircraft as part of a Lagrangian experiment are also used. The ratios between the alkyl nitrates and their parent hydrocarbons are compared with those expected from chemical theory. Further, a box model is run to investigate the temporal evolution of the alkyl nitrates in three Lagrangian case studies and compared to observations. The air samples collected during ITOP do not appear to be strongly influenced by oceanic sources, but rather are influenced by emissions from the N.E. United States and from Alaskan fires. There also appears to be a widespread common source of ethyl nitrate and 1-propyl nitrate other than from their parent hydrocarbons. The general agreement between the alkyl nitrate data and photochemical theory suggests that during the first few days of transport from the source region, photochemical production of alkyl nitrates, and thus ozone, had taken place. The observations in the more photochemically processed air masses are consistent with the alkyl nitrate production reactions no longer dominating the peroxy radical self/cross reactions. Further, the results also suggest that the rates of photochemical processing in the Alaskan smoke plumes were small.
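The "photochemical clock" reasoning above, that alkyl nitrate to parent hydrocarbon ratios grow with processing time, can be sketched with a toy box model. The rate constants, OH level, and branching ratio below are generic illustrative values, not those used in the paper.

```python
# RH + OH -> products, a fraction alpha of which is the alkyl nitrate RONO2;
# the nitrate itself is lost more slowly by its own OH reaction.
k_rh, k_an, alpha = 4.0e-12, 3.0e-13, 0.04  # cm^3 s^-1; branching ratio
oh = 1.0e6                                   # molec cm^-3, assumed constant
dt, steps = 600.0, 24 * 6 * 5                # 10-min steps over 5 days

rh, an = 1.0, 0.0                            # arbitrary initial units
ratios = []
for _ in range(steps):
    loss_rh = k_rh * oh * rh * dt            # parent hydrocarbon consumed
    an += alpha * loss_rh - k_an * oh * an * dt
    rh -= loss_rh
    ratios.append(an / rh)
# The RONO2/RH ratio rises monotonically, so it dates the air mass.
```

Comparing observed ratios against such a clock is how the paper infers that photochemical production of alkyl nitrates, and hence ozone, occurred during the first days of transport.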