896 results for New career models


Relevance: 30.00%

Abstract:

Improvements in the resolution of satellite imagery have enabled the extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in the observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine the outline of a 2006 flood event on the River Dee, North Wales, UK, from a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation at each point is recorded as the LiDAR DEM elevation. Using a planar water surface interpolated between the gauged upstream and downstream water elevations as an approximation, the water surface elevations at points along the flood extent are compared with their 'expected' values. The errors between the two are roughly normally distributed; however, when plotted against their coordinates they show clear spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors with the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this t-test are selected on the basis of their error, and the selection criteria enable evaluation of how sensitive the choice of optimum parameter set is to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error. The identification of significant error (RMSE = 0.8 m) between the approximate expected elevations and the actual observed elevations from the remotely sensed data emphasises the limitations of using these data deterministically within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
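As a minimal illustration of the error analysis described above, the sketch below compares hypothetical observed shoreline elevations against a planar water surface interpolated between gauged upstream and downstream levels and reports the RMSE. All names and numbers are illustrative stand-ins, not the study's data.

```python
# Sketch: errors between observed water surface elevations (LiDAR DEM
# values at points along a SAR-derived flood outline) and a planar
# water surface between the gauged upstream and downstream levels.
# All inputs here are hypothetical, not the paper's data.
import numpy as np

def planar_expected(chainage_m, reach_length_m, z_upstream, z_downstream):
    """Linearly interpolate a planar water surface along the reach."""
    frac = np.asarray(chainage_m) / reach_length_m
    return z_upstream + frac * (z_downstream - z_upstream)

# Hypothetical points at ~100 m intervals along a 5 km reach.
chainage = np.arange(0.0, 5000.0, 100.0)
expected = planar_expected(chainage, 5000.0, z_upstream=12.0, z_downstream=9.5)
observed = expected + np.random.normal(0.0, 0.8, size=expected.size)  # toy errors

errors = observed - expected
rmse = np.sqrt(np.mean(errors ** 2))
print(f"RMSE between observed and planar-expected elevations: {rmse:.2f} m")
```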

Relevance: 30.00%

Abstract:

Changes in ocean circulation associated with internal climate variability have a major influence on upper-ocean temperatures, particularly in regions such as the North Atlantic that are relatively well observed and therefore over-represented in the observational record. As a result, global estimates of upper-ocean heat content can give misleading estimates of the roles of natural and anthropogenic factors in causing oceanic warming. We present a method to quantify ocean warming that filters out the natural internal variability from both observations and climate simulations and better isolates externally forced air-sea heat flux changes. We obtain a much clearer picture of the drivers of oceanic temperature change and are able to detect the effects of both anthropogenic and volcanic influences simultaneously in the observed record. Our results show that climate models are capable of capturing in remarkable detail the externally forced component of ocean temperature evolution over the last five decades.
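The abstract does not spell out the filtering method, but a common way to remove internal variability is to regress a temperature series on a circulation index and subtract the congruent component. The sketch below illustrates that idea on synthetic data and should not be read as the paper's actual procedure.

```python
# Regression-based filtering sketch: remove the component of an
# upper-ocean temperature series congruent with a variability index,
# leaving a residual dominated by the externally forced signal.
# The index and series are synthetic stand-ins; the paper's method
# may differ.
import numpy as np

years = np.arange(1960, 2011)
forced = 0.01 * (years - 1960)                     # toy forced warming trend
index = np.sin(2 * np.pi * (years - 1960) / 60.0)  # toy internal-variability index
temps = forced + 0.15 * index + np.random.normal(0, 0.02, years.size)

# Least-squares regression of temperature on [index, constant].
A = np.column_stack([index, np.ones_like(index)])
coef, *_ = np.linalg.lstsq(A, temps, rcond=None)
filtered = temps - coef[0] * index                 # subtract congruent part

raw_trend = np.polyfit(years, temps, 1)[0]
filt_trend = np.polyfit(years, filtered, 1)[0]
print(f"trend before/after filtering: {raw_trend:.4f} / {filt_trend:.4f} K/yr")
```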

Relevance: 30.00%

Abstract:

The MarQUEST (Marine Biogeochemistry and Ecosystem Modelling Initiative in QUEST) project was established to develop improved descriptions of marine biogeochemistry suited to the next generation of Earth system models. We review progress in these areas, highlighting the advances that have been made and identifying the key gaps that remain in developing the marine component of next-generation Earth system models. The following issues are discussed, and where appropriate results are presented: the choice of model structure; scaling processes from physiology to functional types; the sensitivity of ecosystem models to changes in the physical environment; the role of the coastal ocean; and new methods for the evaluation and comparison of ecosystem and biogeochemistry models. We make recommendations as to where future investment in marine ecosystem modelling should be focused, highlighting a generic software framework for model development, improved hydrodynamic models, better parameterisation of new and existing models, reanalysis tools, and ensemble simulations. The final challenge is to ensure that experimental and observational scientists are stakeholders in the models, and vice versa.

Relevance: 30.00%

Abstract:

This paper presents a new method for the inclusion of nonlinear demand and supply relationships within a linear programming model. An existing method for this purpose is described first and its shortcomings are pointed out. We then show how the new approach overcomes those difficulties: it provides a more accurate and smooth (rather than kinked) approximation of the nonlinear functions, and it handles equilibrium under perfect competition rather than only the monopolistic case. The workings of the proposed method are illustrated by extending a previously available sectoral model of UK agriculture.
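The paper's own smooth formulation is not reproduced in the abstract. As background, the sketch below shows the classic separable-programming device the authors improve upon: a concave surplus objective with a nonlinear (inverse) demand function is piecewise-linearised so that a pure LP can locate the competitive equilibrium. All parameters are hypothetical.

```python
# Piecewise-linear (kinked) approximation of a concave net-surplus
# objective inside an LP, the standard technique the paper refines.
# Inverse demand p = a - b*q, constant unit cost c; the competitive
# equilibrium maximises integral(a - b*x)dx - c*q. All numbers are toys.
import numpy as np
from scipy.optimize import linprog

a, b, c = 100.0, 0.5, 20.0        # demand intercept/slope and unit cost
q_max, n_seg = 200.0, 40          # grid for the piecewise approximation
width = q_max / n_seg

# Marginal net surplus on each segment, evaluated at segment midpoints.
mids = (np.arange(n_seg) + 0.5) * width
marginal = (a - b * mids) - c

# linprog minimises, so negate; each segment variable is bounded by its
# width. Concavity means the LP fills segments in the correct order.
res = linprog(c=-marginal, bounds=[(0.0, width)] * n_seg, method="highs")
q_star = res.x.sum()
print(f"approximated equilibrium quantity: {q_star:.1f}")
print(f"exact quantity (a - c)/b        : {(a - c) / b:.1f}")
```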

Relevance: 30.00%

Abstract:

MOTIVATION: The accurate prediction of the quality of 3D models is a key component of successful protein tertiary structure prediction methods. Currently, clustering- or consensus-based Model Quality Assessment Programs (MQAPs) are the most accurate methods for predicting 3D model quality; however, they are often CPU-intensive, as they carry out multiple structural alignments in order to compare numerous models. In this study, we describe ModFOLDclustQ - a novel MQAP that compares 3D models of proteins without the need for CPU-intensive structural alignments by utilising the Q measure for model comparisons. The ModFOLDclustQ method is benchmarked against the top established methods in terms of both accuracy and speed. In addition, the ModFOLDclustQ scores are combined with those from our earlier ModFOLDclust method to form a new method, ModFOLDclust2, which aims to provide increased prediction accuracy with negligible computational overhead. RESULTS: The ModFOLDclustQ method is competitive with leading clustering-based MQAPs for the prediction of global model quality, yet it is up to 150 times faster than the previous version of the ModFOLDclust method at comparing models of small proteins (<60 residues) and over 5 times faster at comparing models of large proteins (>800 residues). Furthermore, a significant improvement in accuracy can be gained over the previous clustering-based MQAPs by combining the scores from ModFOLDclustQ and ModFOLDclust to form the new ModFOLDclust2 method, with little impact on the overall time taken for each prediction. AVAILABILITY: The ModFOLDclustQ and ModFOLDclust2 methods are available to download from: http://www.reading.ac.uk/bioinf/downloads/ CONTACT: l.j.mcguffin@reading.ac.uk
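For readers unfamiliar with the Q measure, the sketch below illustrates the general idea of superposition-free model comparison: because two models of the same protein share a residue correspondence, their pairwise C-alpha distance matrices can be compared directly, with no structural alignment. The scoring details of ModFOLDclustQ may differ; coordinates here are random stand-ins.

```python
# Q-style comparison of two 3D models of the same sequence: residue
# pairs are scored on how similar their C-alpha distances are in the
# two models, so no superposition is needed. Illustrative only; the
# exact ModFOLDclustQ scoring (e.g. pair-dependent sigma) may differ.
import numpy as np

def q_score(coords_a, coords_b, sigma=1.0):
    """Mean Gaussian agreement between the two models' distance matrices."""
    da = np.linalg.norm(coords_a[:, None] - coords_a[None, :], axis=-1)
    db = np.linalg.norm(coords_b[:, None] - coords_b[None, :], axis=-1)
    iu = np.triu_indices(len(coords_a), k=1)     # unique residue pairs
    return np.exp(-((da[iu] - db[iu]) ** 2) / (2 * sigma ** 2)).mean()

# Toy example: two random 60-residue models (identical models score 1.0).
m1 = np.random.rand(60, 3) * 30
m2 = m1 + np.random.normal(0, 0.5, m1.shape)
print(f"Q-style similarity: {q_score(m1, m2):.3f}")
```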

Relevance: 30.00%

Abstract:

This chapter introduces agent-based models (ABMs), their construction, and the pros and cons of their use. Although relatively new, ABMs have great potential in ecotoxicological research, their primary advantage being the realistic simulations that can be constructed, in particular their explicit handling of space and time. Examples of their use in ecotoxicology are provided, drawn mainly from different implementations of the ALMaSS system. These examples demonstrate how multiple stressors, landscape structure, toxicological detail, animal behavior, and socioeconomic effects can, and should, be taken into account when constructing simulations for risk assessment. As in ecological systems, behavior at the system level in an ABM is not simply the mean of the component responses but emerges from the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.
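The point that system-level behaviour is not the mean of component responses can be made with a deliberately tiny toy ABM (not ALMaSS): each agent follows a simple mortality rule, but an interaction term makes the population-level dose response nonlinear. Everything below is illustrative.

```python
# Minimal agent-based toy: individual mortality depends on a stressor
# dose, individual sensitivity, and an interaction with the state of
# the rest of the population, so the population response is nonlinear.
import random

def run_population(dose, n_agents=500, steps=50):
    alive = [True] * n_agents
    sens = [random.uniform(0.5, 1.5) for _ in range(n_agents)]
    for _ in range(steps):
        dead_frac = 1 - sum(alive) / n_agents
        for i in range(n_agents):
            if alive[i]:
                # Interaction term: stress rises as neighbours die off.
                stress = dose * sens[i] * (1 + dead_frac)
                if random.random() < stress / 100:
                    alive[i] = False
    return sum(alive) / n_agents

for dose in (0.5, 1.0, 2.0):
    print(f"dose {dose}: survival {run_population(dose):.2f}")
```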

Relevance: 30.00%

Abstract:

1. Jerdon's courser Rhinoptilus bitorquatus is a nocturnally active cursorial bird known only from a small area of scrub jungle in Andhra Pradesh, India, and is listed as critically endangered by the IUCN. Information on its habitat requirements is needed urgently to underpin conservation measures. We quantified the habitat features that correlated with the use of different areas of scrub jungle by Jerdon's coursers, and developed a model to map potentially suitable habitat over large areas from satellite imagery and facilitate the design of surveys of the species' distribution.

2. We used 11 arrays of 5-m long tracking strips consisting of smoothed fine soil to detect the footprints of Jerdon's coursers, and measured tracking rates (tracking events per strip night). We counted the numbers of bushes and trees, and described other attributes of vegetation and substrate, in a 10-m square plot centred on each strip. We obtained reflectance data from Landsat 7 satellite imagery for the pixel within which each strip lay.

3. We used logistic regression models to describe the relationship between the tracking rate of Jerdon's coursers and the characteristics of the habitat around the strips, using ground-based survey data and satellite imagery.

4. Jerdon's coursers were most likely to occur where the density of large (>2 m tall) bushes was in the range 300-700 ha⁻¹ and where the density of smaller bushes was less than 1000 ha⁻¹. This habitat was detectable using satellite imagery.

5. Synthesis and applications. The occurrence of Jerdon's courser is strongly correlated with the density of bushes and trees, which is in turn affected by grazing of domestic livestock, woodcutting, and the mechanical clearance of bushes to create pasture, orchards and farmland. It is likely that there is an optimal level of grazing and woodcutting that would maintain or create suitable conditions for the species. Knowledge of the species' distribution is incomplete and there is considerable pressure from human use of apparently suitable habitats; hence, distribution mapping is a high conservation priority. A two-step procedure is proposed, in which ground surveys of bush density calibrate satellite image-based mapping of potential habitat, and the resulting maps are used to select priority areas for Jerdon's courser surveys. The use of tracking strips to study habitat selection and distribution has potential in studies of other scarce and secretive species.
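As a sketch of the modelling step in point 3, the code below fits a logistic regression of tracking events against large-bush density with a quadratic term, so that suitability can peak at intermediate densities as reported in point 4. The data are simulated, not the study's.

```python
# Logistic regression of tracking events on bush density; the quadratic
# term lets the fitted probability peak at intermediate densities.
# Simulated data with a peak near 500 bushes/ha, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
density = rng.uniform(0, 1500, 400)          # large bushes per ha
x = density / 100.0                          # rescale for conditioning
X = np.column_stack([x, x ** 2])

# Simulate presence/absence with suitability peaking at ~500/ha.
logit = -3 + 1.2 * x - 0.12 * x ** 2
y = rng.random(400) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(C=1e6).fit(X, y)  # near-unregularised fit
for d in (100, 500, 1200):
    p = model.predict_proba([[d / 100.0, (d / 100.0) ** 2]])[0, 1]
    print(f"density {d:4d}/ha -> P(tracking event) = {p:.2f}")
```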

Relevance: 30.00%

Abstract:

In this paper, we list some new orthogonal main-effects plans for three-level designs for 4, 5 and 6 factors in 18 runs and compare them with designs obtained from the existing L-18 orthogonal array. We show that these new designs have better projection properties and can provide better parameter estimates for a range of possible models. Additionally, we study designs in smaller run sizes for use when there are insufficient resources to perform an 18-run experiment. Plans for three-level designs for 4, 5 and 6 factors in 13 to 17 runs are given. We show that the best designs here are efficient and deserve strong consideration in many practical situations.
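The defining property of an orthogonal main-effects plan, balance of level combinations across every pair of columns, can be checked mechanically. The sketch below does so for the standard L9(3^4) array as an example; the paper's new plans are not reproduced here.

```python
# Orthogonality check for a three-level main-effects plan: in every
# pair of columns, each of the 9 level combinations must appear equally
# often. The 9-run L9(3^4) array is a standard example.
import numpy as np
from itertools import combinations, product

L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

def is_orthogonal(design, levels=3):
    for i, j in combinations(range(design.shape[1]), 2):
        counts = [np.sum((design[:, i] == a) & (design[:, j] == b))
                  for a, b in product(range(levels), repeat=2)]
        if len(set(counts)) != 1:        # level combinations unbalanced
            return False
    return True

print(is_orthogonal(L9))   # True: each combination appears once per pair
```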

Relevance: 30.00%

Abstract:

In this review we describe how concepts of shoot apical meristem function have developed over time. The role of the scientist is emphasized, as proposer, receiver and evaluator of ideas about the shoot apical meristem. Models have become increasingly popular over the last 250 years, and we consider their role. They provide valuable grounding for the development of hypotheses, but they also have a strong human element, and their uptake relies on varying degrees of persuasion. The most influential models are probably those that most data support, consolidating them as an insight into reality; but they also work by altering how we see meristems, influencing the data we collect and the questions we consider meaningful.

Relevance: 30.00%

Abstract:

A review is given of the mechanics of cutting, ranging from the slicing of thin floppy offcuts (where there is negligible elasticity and no permanent deformation of the offcut) to the machining of ductile metals (where there is severe permanent distortion of the offcut/chip). Materials scientists employ the former conditions to determine the fracture toughness of ‘soft’ solids such as biological materials and foodstuffs. In contrast, traditional analyses of metalcutting are based on plasticity and friction only, and do not incorporate toughness. The machining theories are inadequate in a number of ways but a recent paper has shown that when ductile work of fracture is included many, if not all, of the shortcomings are removed. Support for the new analysis is given by examination of FEM simulations of metalcutting which reveal that a ‘separation criterion’ has to be employed at the tool tip. Some consideration shows that the separation criteria are versions of void-initiation-growth-and-coalescence models employed in ductile fracture mechanics. The new analysis shows that cutting forces for ductile materials depend upon the fracture toughness as well as plasticity and friction, and reveals a simple way of determining both toughness and flow stress from cutting experiments. Examples are given for a wide range of materials including metals, polymers and wood, and comparison is made with the same properties independently determined using conventional testpieces. Because cutting can be steady state, a new way is presented for simultaneously measuring toughness and flow stress at controlled speeds and strain rates.
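The claim that toughness and flow stress can both be recovered from cutting experiments can be illustrated under a simplified Atkins-type force relation, F/w = tau_y * gamma * t + R: force per unit width is linear in uncut chip thickness, so the slope yields the plastic (flow-stress) term and the intercept the fracture toughness. The exact form of the relation and all numbers below are assumptions for illustration, not the paper's data.

```python
# Extracting toughness and flow stress from cutting data, assuming the
# simplified relation F/w = (tau_y * gamma) * t + R: a linear fit of
# force per unit width against uncut chip thickness gives the flow-
# stress term as the slope and the toughness R as the intercept.
import numpy as np

# Hypothetical steady-state cutting data for a polymer:
t = np.array([0.5, 1.0, 1.5, 2.0, 2.5]) * 1e-4       # chip thickness (m)
F_per_w = np.array([5.1, 9.2, 12.9, 17.2, 20.8]) * 1e3  # force/width (N/m)

slope, intercept = np.polyfit(t, F_per_w, 1)
gamma = 2.0                          # assumed shear strain in the chip
print(f"fracture toughness R ~ {intercept:.0f} J/m^2")
print(f"flow stress tau_y   ~ {slope / gamma / 1e6:.1f} MPa")
```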

Relevance: 30.00%

Abstract:

This is the first of two articles presenting a detailed review of the historical evolution of mathematical models applied in the development of building technology, covering both conventional and intelligent buildings. After presenting the technical differences between conventional and intelligent buildings, this article reviews the existing mathematical models, the levels of abstraction at which they operate, and their links to the intelligent-buildings literature. The advantages and limitations of the applied mathematical models are identified, and the models are classified in terms of their application range and goal. We then describe how the early mathematical models, mainly physical models applied to conventional buildings, have faced new challenges in the design and management of intelligent buildings, and how this has led to the use of models that offer more flexibility to cope with various uncertainties. In contrast with the early modelling techniques, approaches based on neural networks, expert systems, fuzzy logic and genetic models provide a promising means of accommodating these complications, as intelligent buildings now require integrated technologies that involve solving complex, multi-objective, integrated decision problems.

Relevance: 30.00%

Abstract:

Mathematical models have been devoted to many different aspects of building studies and have brought about a significant shift in the way we view buildings. From this background a new conception of the building has emerged, known as the intelligent building, which requires the integration of a variety of complex computer-based systems. Research relevant to intelligent buildings continues to grow at an ever faster pace. This paper reviews, without complex mathematical detail, the mathematical models described in the literature that draw on different mathematical methodologies and are intended for intelligent building studies. The models are discussed under a broad classification; the level of mathematical abstraction of each applied model is detailed and linked to its literature. The goal of this paper is to present a comprehensive account of the achievements and status of mathematical models in intelligent building research, and to suggest future directions for modelling.

Relevance: 30.00%

Abstract:

Reports that heat processing of foods induces the formation of acrylamide have heightened interest in the chemistry, biochemistry, and safety of this compound. Acrylamide-induced neurotoxicity, reproductive toxicity, genotoxicity, and carcinogenicity are potential human health risks based on animal studies. Because human exposure to acrylamide can come from both external sources and the diet, there is a need to develop a better understanding of its formation and distribution in food and its role in human health. To contribute to this effort, experts from eight countries have presented data on the chemistry, analysis, metabolism, pharmacology, and toxicology of acrylamide. Specifically covered are the following aspects: exposure from the environment and the diet; biomarkers of exposure; risk assessment; epidemiology; mechanism of formation in food; biological alkylation of amino acids, peptides, proteins, and DNA by acrylamide and its epoxide metabolite glycidamide; neurotoxicity, reproductive toxicity, and carcinogenicity; protection against adverse effects; and possible approaches to reducing levels in food. Cross-fertilization of ideas among the several disciplines in which an interest in acrylamide has developed, including food science, pharmacology, toxicology, and medicine, will provide a better understanding of the chemistry and biology of acrylamide in food and can lead to the development of food processes that decrease the acrylamide content of the diet.