952 results for "Non-response model approach"


Relevance: 50.00%

Abstract:

This study explored the utility of the impact response surface (IRS) approach for investigating model ensemble crop yield responses under a large range of changes in climate. IRSs of spring and winter wheat Triticum aestivum yields were constructed from a 26-member ensemble of process-based crop simulation models for sites in Finland, Germany and Spain across a latitudinal transect. The sensitivity of modelled yield to systematic increments of changes in temperature (-2 to +9°C) and precipitation (-50 to +50%) was tested by modifying values of baseline (1981 to 2010) daily weather, with CO2 concentration fixed at 360 ppm. The IRS approach offers an effective method of portraying model behaviour under changing climate as well as advantages for analysing, comparing and presenting results from multi-model ensemble simulations. Though individual model behaviour occasionally departed markedly from the average, ensemble median responses across sites and crop varieties indicated that yields decline with higher temperatures and decreased precipitation and increase with higher precipitation. Across the uncertainty ranges defined for the IRSs, yields were more sensitive to temperature than precipitation changes at the Finnish site while sensitivities were mixed at the German and Spanish sites. Precipitation effects diminished under higher temperature changes. While the bivariate and multi-model characteristics of the analysis impose some limits to interpretation, the IRS approach nonetheless provides additional insights into sensitivities to inter-model and inter-annual variability. Taken together, these sensitivities may help to pinpoint processes such as heat stress, vernalisation or drought effects requiring refinement in future model development.
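The sensitivity sweep behind an impact response surface can be sketched in a few lines. This is a minimal illustration, not the study's ensemble setup: `perturb_weather` applies the additive temperature and relative precipitation changes to baseline daily series, and the crop models are hypothetical stand-in callables (a real process-based model takes far richer inputs).

```python
import numpy as np

def perturb_weather(temps, precip, dT, dP):
    """Apply an additive temperature change (deg C) and a relative
    precipitation change (fraction) to baseline daily weather."""
    return temps + dT, precip * (1.0 + dP)

def impact_response_surface(temps, precip, yield_models,
                            dT_range=np.arange(-2.0, 10.0),
                            dP_range=np.arange(-0.5, 0.51, 0.25)):
    """Build an ensemble-median yield surface over a (dT, dP) grid.
    `yield_models` is a list of callables mapping perturbed weather to yield."""
    surface = np.empty((len(dT_range), len(dP_range)))
    for i, dT in enumerate(dT_range):
        for j, dP in enumerate(dP_range):
            t, p = perturb_weather(temps, precip, dT, dP)
            surface[i, j] = np.median([m(t, p) for m in yield_models])
    return surface
```

Plotting such a surface as filled contours over the (dT, dP) plane gives the IRS itself.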

Relevance: 50.00%

Abstract:

It has been reported that, for certain colour samples, the chromatic adaptation transform CAT02 embedded in the CIECAM02 colour appearance model predicts corresponding colours with negative tristimulus values (TSVs), which can cause problems in certain applications. To overcome this problem, a mathematical approach is proposed for modifying CAT02. The approach minimises the colour differences between the corresponding colours obtained from visual observations and those predicted by the model, subject to a non-negativity constraint on the predicted TSVs: a constrained non-linear optimization problem. By solving this problem, a new matrix is found. The performance of the CAT02 transform with various matrices, including the original CAT02 matrix and the new matrix, was tested using visual datasets and the optimum colours. Test results show that CAT02 with the new matrix predicted corresponding colours without negative TSVs for all optimum colours and for the colour matching functions of the two CIE standard observers under the test illuminants considered. However, the accuracy of the new matrix in predicting the visual data is approximately 1 CIELAB colour-difference unit worse than that of the original CAT02. This indicates that some accuracy has to be sacrificed to satisfy the non-negativity constraint on the TSVs of the corresponding colours.
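The shape of such a constrained optimisation can be sketched with SciPy. Everything here is a hypothetical stand-in: the test colours, the von Kries gains `D`, and the simplified CAT02-style transform (cone space via `M`, diagonal scaling, back via `M`-inverse) are illustrative, not the paper's actual datasets or model; only the structure (minimise colour difference subject to elementwise non-negativity of predicted TSVs) follows the abstract.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-ins: test colours (XYZ rows), "observed" corresponding
# colours, and fixed von Kries adaptation gains.
rng = np.random.default_rng(0)
X_src = rng.uniform(0.2, 1.0, size=(20, 3))   # source tristimulus values
D = np.diag([1.05, 0.98, 0.92])               # von Kries gains (illustrative)
X_obs = X_src @ D                             # pretend visual data

def predict(M_flat, X):
    """CAT02-style transform: to cone space via M, scale by D, map back."""
    M = M_flat.reshape(3, 3)
    return X @ M.T @ D @ np.linalg.inv(M).T

def objective(M_flat):
    # Mean squared difference to the observed corresponding colours
    return np.mean((predict(M_flat, X_src) - X_obs) ** 2)

def nonneg(M_flat):
    # SLSQP inequality constraints use the convention g(x) >= 0
    return predict(M_flat, X_src).ravel()

M0 = np.eye(3).ravel()
res = minimize(objective, M0, method="SLSQP",
               constraints=[{"type": "ineq", "fun": nonneg}])
```

A real implementation would use a perceptual colour-difference formula rather than a squared XYZ distance, and would constrain predictions for the optimum colours and standard-observer colour matching functions.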

Relevance: 50.00%

Abstract:

The mechanical response of the cornea subjected to a non-contact air-jet tonometry diagnostic test represents an interplay between its geometry, the corneal material behavior and the loading. The objective is to study this interplay to better understand and interpret the results obtained with a non-contact tonometry test. A patient-specific finite element model of a healthy eye, accounting for the load-free configuration, was used. The corneal tissue was modeled as an anisotropic hyperelastic material with two preferential directions. Three different sets of parameters within the human experimental range obtained from inflation tests were considered. The influence of the IOP was studied by considering four pressure levels (10–28 mmHg), whereas the influence of corneal thickness was studied by inducing a uniform variation (300–600 microns). A computational fluid dynamics (CFD) air-jet simulation determined the pressure loading exerted on the anterior corneal surface. The maximum apex displacement showed a linear variation with IOP for all materials examined. In contrast, the maximum apex displacement followed a cubic relation with corneal thickness. In addition, a significant sensitivity of the apical displacement to the corneal stiffness was obtained. This behavior is explained by the fact that the cornea experiences bending when subjected to an air-puff loading, causing the anterior surface to work in compression whereas the posterior surface works in tension. Hence, collagen fibers located at the anterior surface do not contribute to load bearing. Non-contact tonometry devices give useful information that can nonetheless be misleading, since the corneal deformation is the result of the interaction between the mechanical properties, IOP and geometry. Therefore, a non-contact tonometry test alone is not sufficient to evaluate the individual contributions of these factors, and a complete in vivo characterization would require more than one test to independently determine the membrane and bending corneal behavior.

Relevance: 50.00%

Abstract:

Descriptive models of social response are concerned with identifying and discriminating between different types of response to social influence. In a previous article (Nail, MacDonald, & Levy, 2000), the authors demonstrated that 4 conceptual dimensions are necessary to adequately distinguish between such phenomena as conformity, compliance, contagion, independence, and anticonformity in a single model. This article expands the scope of the authors' 4-dimensional approach by reviewing selected experimental and cultural evidence, further demonstrating the integrative power of the model. This review incorporates political psychology, culture and aggression, self-persuasion, group norms, prejudice, impression management, psychotherapy, pluralistic ignorance, bystander intervention/nonintervention, public policy, close relationships, and implicit attitudes.

Relevance: 50.00%

Abstract:

This paper presents a new method for producing a functional-structural plant model that simulates response to different growth conditions, yet does not require detailed knowledge of underlying physiology. The example used to present this method is the modelling of the mountain birch tree. This new functional-structural modelling approach is based on linking an L-system representation of the dynamic structure of the plant with a canonical mathematical model of plant function. Growth indicated by the canonical model is allocated to the structural model according to probabilistic growth rules, such as rules for the placement and length of new shoots, which were derived from an analysis of architectural data. The main advantage of the approach is that it is relatively simple compared to the prevalent process-based functional-structural plant models and does not require a detailed understanding of underlying physiological processes, yet it is able to capture important aspects of plant function and adaptability, unlike simple empirical models. This approach, combining canonical modelling, architectural analysis and L-systems, thus fills the important role of providing an intermediate level of abstraction between the two extremes of deeply mechanistic process-based modelling and purely empirical modelling. We also investigated the relative importance of various aspects of this integrated modelling approach by analysing the sensitivity of the standard birch model to a number of variations in its parameters, functions and algorithms. The results show that using light as the sole factor determining the structural location of new growth gives satisfactory results. Including the influence of additional regulating factors made little difference to global characteristics of the emergent architecture. Changing the form of the probability functions and using alternative methods for choosing the sites of new growth also had little effect.
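A stochastic L-system of the kind used for such structural models can be sketched very compactly. The rule set and probabilities below are illustrative only, not the birch model's rules derived from architectural data: `A` is an apex that either branches or merely elongates, and `F`, `[`, `]`, `+`, `-` are the usual turtle-graphics constants.

```python
import random

# Illustrative stochastic rules: an apex 'A' branches with probability 0.6
# or elongates with probability 0.4 (probabilities are made up, not the
# paper's architecture-derived values).
RULES = {"A": [(0.6, "F[+A][-A]"), (0.4, "FA")]}

def rewrite(axiom, steps, rng):
    s = axiom
    for _ in range(steps):
        out = []
        for ch in s:
            options = RULES.get(ch)
            if options is None:
                out.append(ch)          # constants copy through unchanged
            else:
                r, acc = rng.random(), 0.0
                for p, succ in options:  # sample a successor by probability
                    acc += p
                    if r <= acc:
                        out.append(succ)
                        break
        s = "".join(out)
    return s

print(rewrite("A", 3, random.Random(1)))
```

In the paper's approach, the canonical function model would additionally gate how much of this structural growth is actually realised each season.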

Relevance: 50.00%

Abstract:

Exploratory analysis of data in all sciences seeks to find common patterns to gain insights into the structure and distribution of the data. Typically, visualisation methods like principal components analysis are used, but these methods are not easily able to deal with missing data, nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure in the data is through the use of linked plots, or brushing, while ignoring the missing data. In this technical report we discuss a complementary approach based on a non-linear probabilistic model. The generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate far more structure than a two-dimensional principal components plot could, and to deal at the same time with missing data. We show that using the generative topographic mapping provides an optimal method to explore the data while being able to replace missing values in a dataset, particularly where a large proportion of the data is missing.
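The imputation mechanism can be illustrated in a few lines. This is not a trained GTM: the mixture centres below are hand-specified rather than the images of a latent grid under a learned mapping. What it shares with GTM-based imputation is the mechanism itself: posterior responsibilities are computed from the observed dimensions only and then used to average the centres in the missing dimensions.

```python
import numpy as np

# Illustrative centres standing in for the images of a GTM latent grid,
# and an assumed inverse noise variance (both hypothetical).
centres = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
beta = 4.0

def impute(x):
    """x has np.nan in missing positions; returns a completed copy."""
    obs = ~np.isnan(x)
    # Responsibilities from the observed dimensions only (equal priors)
    d2 = ((centres[:, obs] - x[obs]) ** 2).sum(axis=1)
    r = np.exp(-0.5 * beta * d2)
    r /= r.sum()
    out = x.copy()
    # Fill each missing coordinate with the responsibility-weighted centre
    out[~obs] = r @ centres[:, ~obs]
    return out

print(impute(np.array([1.0, np.nan])))
```

In a real GTM the centres are `W @ phi(x_k)` for grid points `x_k`, with `W` and `beta` fitted by EM, but the responsibility-weighted fill is the same.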

Relevance: 50.00%

Abstract:

Cystic fibrosis (CF) is the most common lethal inherited disease among Caucasians and arises due to mutations in a chloride channel, the cystic fibrosis transmembrane conductance regulator. A hallmark of this disease is the chronic bacterial infection of the airways, which is usually associated with pathogens such as Pseudomonas aeruginosa and S. aureus and, increasingly, B. cepacia. The excessive inflammatory response leads to irreversible lung damage and, in the long term, to the death of the patient at around 40 years of age. Understanding the pathogenesis of CF currently relies on animal models, such as those employing genetically modified mice, and on single-cell-type culture models, which are grown either as polarised or non-polarised epithelium in vitro. Whilst these approaches partially enable the study of disease progression in CF, both types of models have inherent limitations. The overall aim of this thesis was to establish a multicellular co-culture model of normal and CF human airways in vitro, which helps to partially overcome these limitations and permits analysis of cell-to-cell communication in the airways. These models could then be used to examine the co-ordinated response of the airways to infection with relevant pathogens, in order to validate this approach over animal and single-cell models. Therefore, epithelial cell lines of non-CF and CF background were employed in a co-culture model together with human pulmonary fibroblasts. Co-cultures were grown on collagen-coated permeable supports at an air-liquid interface to promote epithelial cell differentiation. The models were characterised, and essential features for investigating CF infections and inflammatory responses were investigated and analysed.

A pseudostratified-like epithelial cell layer was established at the air-liquid interface (ALI) of mono- and co-cultures, and cell layer integrity was verified by tight junction (TJ) staining and transepithelial resistance (TER) measurements. Mono- and co-cultures were also found to secrete the airway mucin MUC5AC. Investigating the influence of bacterial infections proved most challenging when intact S. aureus, B. cepacia and P. aeruginosa were used. CF mono- and co-cultures were found to mimic the hyperinflammatory state found in CF, which was confirmed by analysing the IL-8 secretion of these models. These co-culture models will help to elucidate the role fibroblasts play in the inflammatory response to bacteria and will provide a useful testing platform to further investigate the dysregulated airway responses seen in CF.

Relevance: 50.00%

Abstract:

Exploratory analysis of petroleum geochemical data seeks to find common patterns to help distinguish between different source rocks, oils and gases, and to explain their source, maturity and any intra-reservoir alteration. However, at the outset, one is typically faced with (a) a large matrix of samples, each with a range of molecular and isotopic properties, (b) a spatially and temporally unrepresentative sampling pattern, (c) noisy data and (d) often, a large number of missing values. This inhibits analysis using conventional statistical methods. Typically, visualisation methods like principal components analysis are used, but these methods are not easily able to deal with missing data nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure in the data is through the use of linked plots, or brushing, while ignoring the missing data. In this paper we introduce a complementary approach based on a non-linear probabilistic model. Generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, while also dealing with missing data. We show how using generative topographic mapping also provides an optimal method with which to replace missing values in two geochemical datasets, particularly where a large proportion of the data is missing.

Relevance: 50.00%

Abstract:

Data Envelopment Analysis (DEA) is recognized as a modern approach to assessing the performance of a set of homogeneous Decision Making Units (DMUs) that use similar inputs to produce similar outputs. While DEA is commonly used with precise data, several approaches have recently been introduced for evaluating DMUs with uncertain data. In the existing approaches, much of the information on uncertainty is lost. For example, in the defuzzification approach, α-levels and fuzzy ranking are not considered. In the tolerance approach, the inequality or equality signs are fuzzified, but the fuzzy coefficients (inputs and outputs) are not treated directly. The purpose of this paper is to develop a new model to evaluate DMUs under uncertainty using fuzzy DEA and to incorporate α-levels into the model under a fuzzy environment. An example is given to illustrate this method in detail.
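For reference, the crisp (non-fuzzy) input-oriented CCR model that fuzzy DEA variants extend can be solved as a linear program in multiplier form: maximise the weighted outputs of the DMU under evaluation, subject to its weighted inputs equalling one and no DMU exceeding efficiency one. A minimal sketch with SciPy, using made-up toy data (the paper's fuzzy model itself is not reproduced here):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables z = [u (output weights), v (input weights)]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o
    A_ub = np.hstack([Y, -X])                          # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy data: 3 DMUs, one input, one output
X = np.array([[2.0], [4.0], [5.0]])
Y = np.array([[1.0], [2.0], [2.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```

A fuzzy extension would replace the crisp coefficients in `X` and `Y` with intervals obtained from α-cuts and solve the resulting pair of LPs per α-level.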

Relevance: 50.00%

Abstract:

Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, which depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive use of energy as well as reducing energy inputs is the main focus of this paper, in order to optimize energy consumption in grape production. In this study we use a two-stage methodology to find the association of energy efficiency and performance explained by farmers' specific characteristics. In the first stage, a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water-for-irrigation energies. In the second stage, farm-specific variables such as farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The results of the first stage show substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers was in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers' use of chemicals such as insecticides, herbicides and fungicides was considerably lower than that of inefficient ones. The results also revealed that more educated farmers are more energy-efficient than their less educated counterparts.

Relevance: 50.00%

Abstract:

This chapter explains a functional integral approach to an impurity in the Tomonaga–Luttinger model. The Tomonaga–Luttinger model of one-dimensional (1D) strongly correlated electrons gives a striking example of non-Fermi-liquid behavior. For simplicity, the chapter considers only a single-mode Tomonaga–Luttinger model, with one species of right- and left-moving electrons, thus omitting spin indices and eventually considering the simplest linearized model of a single-valley parabolic electron band. Standard operator bosonization is one of the most elegant methods developed in theoretical physics. The main advantage of bosonization, in either standard or functional form, is that including the quartic electron–electron interaction does not substantially change the free action. The chapter demonstrates how to develop the formalism of bosonization based on the functional integral representation of observable quantities within the Keldysh formalism.
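In one common convention (which may differ from the chapter's own notation), the single-mode bosonization identities for the right- and left-moving fields read:

```latex
% Bosonization of chiral fermions (a is a short-distance cutoff):
\psi_{R,L}(x) \sim \frac{1}{\sqrt{2\pi a}}\, e^{\pm i\sqrt{4\pi}\,\phi_{R,L}(x)},
\qquad
\rho_{R,L}(x) = \pm \frac{1}{\sqrt{\pi}}\, \partial_x \phi_{R,L}(x).
```

A density-density interaction such as \(g\,\rho_R\rho_L\) is quartic in the fermions but only quadratic in \(\partial_x\phi\), which is why the bosonic action remains Gaussian once the interaction is included.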

Relevance: 50.00%

Abstract:

The paper presents a new network-flow interpretation of Łukasiewicz's logic based on models with increased effectiveness. The obtained results show that the presented network-flow models may, in principle, work for multi-valued logics with more than three states of the variables, i.e. with a finite set of states in the interval from 0 to 1. The described models make it possible to formulate various logical functions. If the results of a given model, contained in the obtained values of the arc-flow functions, are used as input data for other models, then other sophisticated logical structures can be successfully interpreted in Łukasiewicz's logic. The obtained models allow Łukasiewicz's logic to be studied with the specific, effective methods of network-flow programming. In particular, the specific peculiarities and results pertaining to the 'traffic capacity of the network arcs' function can be exploited. Based on the introduced network-flow approach, it is possible to interpret other multi-valued logics, such as those of E. Post, L. Brauer, Kolmogorov, etc.
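For reference, the standard Łukasiewicz connectives on [0, 1], which any such network-flow encoding must reproduce (these are the textbook definitions, independent of the paper's models), are:

```python
# Basic Lukasiewicz connectives on the real interval [0, 1].
def l_and(a, b):      # strong conjunction (Lukasiewicz t-norm)
    return max(0.0, a + b - 1.0)

def l_or(a, b):       # strong disjunction (bounded sum)
    return min(1.0, a + b)

def l_not(a):         # negation
    return 1.0 - a

def l_implies(a, b):  # Lukasiewicz implication
    return min(1.0, 1.0 - a + b)

# Restricting values to {0, 1/2, 1} gives the three-valued logic; any
# finite chain {0, 1/n, ..., 1} is closed under these operations.
print(l_implies(0.5, 0.0))   # → 0.5
```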

Relevance: 50.00%

Abstract:

We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output.

Relevance: 50.00%

Abstract:

Recent changes to the legislation on chemicals and cosmetics testing call for a change in the paradigm regarding the current 'whole animal' approach for identifying chemical hazards, including the assessment of potential neurotoxins. Accordingly, since 2004, we have worked on the development of the integrated co-culture of post-mitotic, human-derived neurons and astrocytes (NT2.N/A), for use as an in vitro functional central nervous system (CNS) model. We have used it successfully to investigate indicators of neurotoxicity. For this purpose, we used NT2.N/A cells to examine the effects of acute exposure to a range of test chemicals on the cellular release of brain-derived neurotrophic factor (BDNF). It was demonstrated that the release of this protective neurotrophin into the culture medium (above that of control levels) occurred consistently in response to sub-cytotoxic levels of known neurotoxic, but not non-neurotoxic, chemicals. These increases in BDNF release were quantifiable, statistically significant, and occurred at concentrations below those at which cell death was measurable, which potentially indicates specific neurotoxicity, as opposed to general cytotoxicity. The fact that the BDNF immunoassay is non-invasive, and that NT2.N/A cells retain their functionality for a period of months, may make this system useful for repeated-dose toxicity testing, which is of particular relevance to cosmetics testing without the use of laboratory animals. In addition, the production of NT2.N/A cells without the use of animal products, such as fetal bovine serum, is being explored, with the aim of producing a fully humanised cellular model.

Relevance: 50.00%

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created, comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases.

The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and revealed room for improvement for Mexican organisations in flood management.
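The facility-location component of such a system can be illustrated with a toy p-median problem: choose p emergency facilities so that the total distance from facilities to the demand points they serve is minimal. The brute-force sketch and the distance matrix below are made up for illustration and are unrelated to the thesis's actual models, which would also handle stock, actors and flood screening.

```python
from itertools import combinations

def p_median(dist, p):
    """Choose p facility sites minimising total demand-point distance.
    dist[i][j] = distance from candidate site i to demand point j.
    Brute force: fine for small illustrative instances only."""
    best = None
    for sites in combinations(range(len(dist)), p):
        # Each demand point is served by its nearest opened site
        cost = sum(min(dist[i][j] for i in sites)
                   for j in range(len(dist[0])))
        if best is None or cost < best[0]:
            best = (cost, sites)
    return best

# Toy instance: 4 candidate shelters serving 5 communities
dist = [[2, 9, 5, 7, 1],
        [6, 1, 8, 2, 9],
        [3, 8, 1, 6, 4],
        [7, 2, 9, 1, 8]]
print(p_median(dist, 2))
```

Real instances would be solved with a MIP solver, with floodable candidate sites removed beforehand by the cartographic model.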