30 results for INTEGRATING DIRECT-METHODS
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven direct-method algorithms can take a very long time, since their cost grows with the size of the matrix. The computational complexity of stochastic Monte Carlo methods, by contrast, depends only on the number of chains and the length of those chains. The computing power needed by the inherently parallel Monte Carlo methods can be supplied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, show how the method can be implemented on the Grid, and demonstrate how efficiently the method scales on multiple processors. (C) 2007 Elsevier B.V. All rights reserved.
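The abstract gives no implementation details; as a minimal serial sketch of the underlying idea (not the paper's load-balanced Grid implementation), the entries of B⁻¹ can be estimated from independent random walks that sum the Neumann series B⁻¹ = Σₖ Cᵏ with C = I − B, assuming the spectral radius of C is below 1. All names and parameters here are illustrative.

```python
import numpy as np

def mc_inverse(B, n_walks=20000, p_stop=0.3, rng=None):
    """Monte Carlo estimate of B^{-1} via the Neumann series
    B^{-1} = sum_k C^k with C = I - B (requires spectral radius(C) < 1).
    Each row is estimated from independent random walks with uniform
    transitions, re-weighted by the entries of C so the tally is unbiased."""
    rng = np.random.default_rng(rng)
    n = B.shape[0]
    C = np.eye(n) - B
    inv = np.zeros((n, n))
    for i in range(n):                      # row i: walks started at state i
        for _ in range(n_walks):
            state, weight = i, 1.0
            inv[i, state] += weight         # k = 0 term (C^0 = I)
            while rng.random() > p_stop:    # geometric walk length
                nxt = int(rng.integers(n))  # uniform transition, prob 1/n
                weight *= C[state, nxt] / ((1.0 - p_stop) / n)
                state = nxt
                inv[i, state] += weight     # k-th term of the series
    return inv / n_walks
```

Because each walk is independent, the walks can be distributed across processors with no communication, which is what makes the method attractive for Grid computing.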
Abstract:
In positron emission tomography and single photon emission computed tomography studies using D2 dopamine (DA) receptor radiotracers, a decrease in radiotracer binding potential (BP) is usually interpreted in terms of increased competition with synaptic DA. However, some data suggest that this signal may also reflect agonist (DA)-induced increases in D2 receptor (D2R) internalization, a process which would presumably also decrease the population of receptors available for binding to hydrophilic radioligands. To advance interpretation of alterations in D2 radiotracer BP, direct methods of assessment of D2R internalization are required. Here, we describe a confocal microscopy-based approach for the quantification of agonist-dependent receptor internalization. The method relies upon double-labeling of the receptors with antibodies directed against intracellular as well as extracellular epitopes. Following agonist stimulation, DA D2R internalization was quantified by differentiating, in optical cell sections, the signal due to the staining of the extracellular from intracellular epitopes of D2Rs. Receptor internalization was increased in the presence of the D2 agonists DA and bromocriptine, but not the D1 agonist SKF38393. Pretreatment with either the D2 antagonist sulpiride, or inhibitors of internalization (phenylarsine oxide and high molarity sucrose), blocked D2-agonist induced receptor internalization, thus validating this method in vitro. This approach therefore provides a direct and streamlined methodology for investigating the pharmacological and mechanistic aspects of D2R internalization, and should inform the interpretation of results from in vivo receptor imaging studies.
Abstract:
Controlling Armillaria infections by physical and chemical methods alone is at present inadequate, ineffective, or impractical. Effective biological control, either alone or integrated with another control strategy, appears necessary. Biological control agents act against Armillaria by inhibiting or preventing its rhizomorphic and mycelial development, by limiting it to substrate already occupied, by actively pre-empting the substrate, or by eliminating the pathogen from substrate it has already occupied. Among the most thoroughly investigated antagonists of Armillaria are Trichoderma species. Depending on the particular isolate of a Trichoderma species, control may be achieved by competition, production of antibiotics, or mycoparasitism. The level of control is also influenced by the growth and carrier substrate of the antagonist, the time of application in relation to the occurrence of the disease, and several environmental conditions. Among a range of other antagonists are several cord-forming fungi and an isolate of Dactylium dendroides. Integrating biological methods with an appropriate chemical method could control the disease more effectively. However, it is essential to determine whether the antagonist or the fungicide should be applied first, and the time interval between the two applications.
Abstract:
1. Wildlife managers often require estimates of abundance. Direct methods of estimation are often impractical, especially in closed-forest environments, so indirect methods such as dung or nest surveys are increasingly popular.
2. Dung and nest surveys typically have three elements: surveys to estimate abundance of the dung or nests; experiments to estimate the production (defecation or nest construction) rate; and experiments to estimate the decay or disappearance rate. The last of these is usually the most problematic, and was the subject of this study.
3. The design of experiments to allow robust estimation of mean time to decay was addressed. In most studies to date, dung or nests have been monitored until they disappear. Instead, we advocate that fresh dung or nests are located, with a single follow-up visit to establish whether the dung or nest is still present or has decayed.
4. Logistic regression was used to estimate probability of decay as a function of time, and possibly of other covariates. Mean time to decay was estimated from this function.
5. Synthesis and applications. Effective management of mammal populations usually requires reliable abundance estimates. The difficulty in estimating abundance of mammals in forest environments has increasingly led to the use of indirect survey methods, in which abundance of sign, usually dung (e.g. deer, antelope and elephants) or nests (e.g. apes), is estimated. Given estimated rates of sign production and decay, sign abundance estimates can be converted to estimates of animal abundance. Decay rates typically vary according to season, weather, habitat, diet and many other factors, making reliable estimation of mean time to decay of signs present at the time of the survey problematic. We emphasize the need for retrospective rather than prospective rates, propose a strategy for survey design, and provide analysis methods for estimating retrospective rates.
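The analysis described in point 4 can be sketched as follows, assuming single-follow-up data of the form (time since deposition, decayed or not). The Newton-Raphson fit and the numerical integration of the survival curve are illustrative choices, not the authors' exact implementation, and all names are hypothetical.

```python
import numpy as np

def fit_decay(times, decayed, iters=50):
    """Logistic regression of decay status on elapsed time:
    P(decayed by t) = 1 / (1 + exp(-(b0 + b1 * t))),
    fitted by Newton-Raphson (equivalently, IRLS)."""
    X = np.column_stack([np.ones_like(times), times])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                   # IRLS weights
        H = X.T @ (W[:, None] * X)          # observed information
        beta += np.linalg.solve(H, X.T @ (decayed - p))
    return beta

def mean_time_to_decay(beta, t_max=1000.0, dt=0.1):
    """Mean time to decay = integral of the survival curve
    S(t) = 1 - P(decayed by t), evaluated numerically."""
    t = np.arange(0.0, t_max, dt)
    surv = 1.0 - 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * t)))
    return surv.sum() * dt
```

Other covariates (season, habitat, rainfall) would simply be added as further columns of the design matrix.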
Abstract:
Differential thermal expansion over the range 90-210 K has been applied successfully to determine the crystal structure of chlorothiazide from synchrotron powder diffraction data using direct methods. Key to the success of the approach is the use of a multi-data-set Pawley refinement to extract a set of reflection intensities that is more 'single-crystal-like' than those extracted from a single data set. The improvement in reflection intensity estimates is quantified by comparison with reference single-crystal intensities. (C) 2008 International Union of Crystallography. Printed in Singapore. All rights reserved.
Abstract:
A number of computationally reliable direct methods for pole assignment by feedback have recently been developed. These direct procedures do not necessarily produce robust solutions to the problem, however, that is, solutions in which the assigned poles are insensitive to perturbations in the closed-loop system. This difficulty is illustrated here with results from a recent algorithm presented in this TRANSACTIONS, and its causes are examined. A measure of robustness is described, and techniques for testing and improving robustness are indicated.
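The phenomenon can be illustrated with a small example (not the system or algorithm analysed in the paper): feedback assigns the poles exactly in both cases below, but the conditioning of the closed-loop eigenvector matrix, one standard robustness measure, differs enormously. All names are illustrative.

```python
import numpy as np

# Double integrator in controllable companion form (illustrative system).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

def place_companion(poles):
    """State feedback u = -K x assigning the given poles.  For this
    companion form, A - B K has characteristic polynomial
    s^2 + k2 s + k1, so K follows directly from the desired polynomial
    (s - p1)(s - p2) = s^2 - (p1 + p2) s + p1 p2."""
    p1, p2 = poles
    k1 = p1 * p2              # constant coefficient
    k2 = -(p1 + p2)           # linear coefficient
    return np.array([[k1, k2]])

def robustness(K):
    """Condition number of the closed-loop eigenvector matrix: large
    values mean the assigned poles move a lot under small perturbations
    of A - B K."""
    _, V = np.linalg.eig(A - B @ K)
    return np.linalg.cond(V)

K_far  = place_companion([-1.0, -2.0])    # well-separated poles
K_near = place_companion([-1.0, -1.001])  # nearly defective closed loop
```

Both gains place their poles exactly, yet the nearly defective choice is orders of magnitude more sensitive, which is the distinction between exact and robust pole assignment.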
Integrating methods for developing sustainability indicators that can facilitate learning and action
Abstract:
Bossel's (2001) systems-based approach for deriving comprehensive indicator sets provides one of the most holistic frameworks for developing sustainability indicators. It ensures that indicators cover all important aspects of system viability, performance, and sustainability, and recognizes that a system cannot be assessed in isolation from the systems upon which it depends and which in turn depend upon it. In this reply, we show how Bossel's approach is part of a wider convergence toward integrating participatory and reductionist approaches to measure progress toward sustainable development. However, we also show that further integration of these approaches may be able to improve the accuracy and reliability of indicators to better stimulate community learning and action. Only through active community involvement can indicators facilitate progress toward sustainable development goals. To engage communities effectively in the application of indicators, these communities must be actively involved in developing, and even in proposing, indicators. The accuracy, reliability, and sensitivity of the indicators derived from local communities can be ensured through an iterative process of empirical and community evaluation. Communities are unlikely to invest in measuring sustainability indicators unless monitoring provides immediate and clear benefits. However, in the context of goals, targets, and/or baselines, sustainability indicators can more effectively contribute to a process of development that matches local priorities and engages the interests of local people.
Abstract:
Answering many of the critical questions in conservation, development and environmental management requires integrating the social and natural sciences. However, understanding the array of available quantitative methods and their associated terminology presents a major barrier to successful collaboration. We provide an overview of quantitative socio-economic methods that distils their complexity into a simple taxonomy. We outline how each has been used in conjunction with ecological models to address questions relating to the management of socio-ecological systems. We review the application of social and ecological quantitative concepts to agro-ecology and classify the approaches used to integrate the two disciplines. Our review included all published integrated models from 2003 to 2008 in 27 journals that publish agricultural modelling research. Although our focus is on agro-ecology, many of the results are broadly applicable to other fields involving an interaction between human activities and ecology. We found 36 papers that integrated social and ecological concepts in a quantitative model. Four different approaches to integration were used, depending on the scale at which human welfare was quantified. Most models viewed humans as pure profit maximizers, both when calculating welfare and predicting behaviour. Synthesis and applications. We reached two main conclusions based on our taxonomy and review. The first is that quantitative methods that extend predictions of behaviour and measurements of welfare beyond a simple market value basis are underutilized by integrated models. The second is that the accuracy of prediction for integrated models remains largely unquantified. Addressing both problems requires researchers to reach a common understanding of modelling goals and data requirements during the early stages of a project.
Abstract:
The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
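The truncated variant can be sketched on a toy problem (operational assimilation problems are vastly larger): each outer Gauss-Newton step solves the linearized least-squares problem only approximately, here by a few conjugate-gradient iterations on the normal equations. The exponential-fit problem and all names are illustrative; note that with only two parameters two CG steps are already exact, so the truncation only bites at large scale.

```python
import numpy as np

def gauss_newton(r, J, x0, n_outer=20, n_inner=2):
    """Truncated Gauss-Newton: each outer step solves the linearized
    problem  min ||J dx + r||  approximately via n_inner conjugate-
    gradient iterations on (J^T J) dx = -J^T r."""
    x = x0.astype(float)
    for _ in range(n_outer):
        Jx, rx = J(x), r(x)
        Amat = Jx.T @ Jx
        b = -Jx.T @ rx
        dx = np.zeros_like(x)       # truncated inner solve by CG
        res, p = b.copy(), b.copy()
        for _ in range(n_inner):
            if res @ res == 0.0:    # inner problem already solved
                break
            Ap = Amat @ p
            alpha = (res @ res) / (p @ Ap)
            dx += alpha * p
            new = res - alpha * Ap
            beta = (new @ new) / (res @ res)
            res, p = new, new + beta * p
        x += dx
    return x

# Illustrative zero-residual problem: fit y = x0 * exp(x1 * t).
t = np.linspace(0.0, 1.0, 30)
y = 2.0 * np.exp(-1.5 * t)
resid = lambda x: x[0] * np.exp(x[1] * t) - y
jac = lambda x: np.column_stack([np.exp(x[1] * t),
                                 x[0] * t * np.exp(x[1] * t)])
```

The "perturbed" variant in the paper's terminology would instead replace `jac` by a simplified approximate Jacobian, keeping the inner solve exact.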
Abstract:
Satellite observed data for flood events have been used to calibrate and validate flood inundation models, providing valuable information on the spatial extent of the flood. Improvements in the resolution of this satellite imagery have enabled indirect remote sensing of water levels by using an underlying LiDAR DEM to extract the water surface elevation at the flood margin. In addition to comparison of the spatial extent, this now allows direct comparison between modelled and observed water surface elevations. Using a 12.5 m ERS-1 image of a flood event in 2006 on the River Dee, North Wales, UK, both of these data types are extracted and each is assessed for its value in the calibration of flood inundation models. A LiDAR-guided snake algorithm is used to extract an outline of the flood from the satellite image. From the extracted outline a binary grid of wet/dry cells is created at the same resolution as the model; using this, the spatial extent of the modelled and observed flood can be compared with a measure of fit between the two binary patterns of flooding. Water heights are extracted using points at intervals of approximately 100 m along the extracted outline, and Student's t-test is used to compare modelled and observed water surface elevations. A LISFLOOD-FP model of the catchment is set up using LiDAR topographic data resampled to the 12.5 m resolution of the satellite image, and calibration of the friction parameter in the model is undertaken using each of the two approaches. Comparison between the two approaches highlights the sensitivity of the spatial measure of fit to uncertainty in the observed data and the potential drawbacks of using the spatial extent when parts of the flood are contained by the topography.
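A measure of fit between binary wet/dry grids of the kind described is commonly computed as the intersection over union of the two flooded areas (often written F = A/(A + B + C), with A cells wet in both, B wet only in the model, C wet only in the observation). The abstract does not specify the exact variant used, so treat this as an assumption; the function name is illustrative.

```python
import numpy as np

def flood_fit(model_wet, obs_wet):
    """Measure of fit between modelled and observed binary flood maps:
    cells wet in both divided by cells wet in either, i.e. intersection
    over union of the flooded areas.  Returns 1.0 for a perfect match
    and 0.0 for disjoint extents."""
    model_wet = np.asarray(model_wet, dtype=bool)
    obs_wet = np.asarray(obs_wet, dtype=bool)
    both = np.logical_and(model_wet, obs_wet).sum()
    either = np.logical_or(model_wet, obs_wet).sum()
    return both / either
```

Calibration then amounts to sweeping the friction parameter and keeping the value that maximizes this score against the satellite-derived extent.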
Abstract:
The 'direct costs' attributable to 30 different endemic diseases of farm animals in Great Britain are estimated using a standardised method to construct a simple model for each disease that includes consideration of disease prevention and treatment costs. The models so far developed provide a basis for further analyses including cost-benefit analyses for the economic assessment of disease control options. The approach used reflects the inherent livestock disease information constraints, which limit the application of other economic analytical methods. It is a practical and transparent approach that is relatively easily communicated to veterinary scientists and policy makers. The next step is to develop the approach by incorporating wider economic considerations into the analyses in a way that will demonstrate to policy makers and others the importance of an economic perspective to livestock disease issues.
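As a minimal sketch of the kind of simple per-disease model described, the direct cost might combine output losses, treatment, and prevention. The structure and all figures below are invented placeholders, not estimates or the model from the paper.

```python
def direct_cost(population, prevalence, loss_per_case,
                treatment_per_case, prevention_per_head):
    """Hypothetical direct-cost model: output losses and treatment for
    affected animals, plus prevention spread across the whole herd."""
    cases = population * prevalence
    return (cases * (loss_per_case + treatment_per_case)
            + population * prevention_per_head)
```

A model of this shape makes the trade-off explicit: raising prevention spend is worthwhile only if it reduces prevalence enough to offset the per-head cost, which is the kind of comparison a subsequent cost-benefit analysis would formalize.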
Abstract:
This paper examines the potential of using Participatory Farm Management methods to examine the suitability of a technology with farmers prior to on-farm trials. A study examining the suitability of green manuring as a technology for use with wet season tomato producers in Ghana is described. Findings from this case study demonstrate that Participatory Budgeting can be used by farmers and researchers to analyse current cultivation practices, identify the options for including green manures into the system and explore the direct and wider resource implications of the technology. Scored Causal Diagrams can be used to identify farmers' perceptions of the relative importance of the problem that the technology seeks to address. The use of the methods in this ex ante evaluation process appears to have the potential to improve the effectiveness and efficiency of the adaptive research process. This ensures that technologies subsequently examined in trials are relevant to farmers' interests, existing systems and resources, thereby increasing the chances of farmer adoption. It is concluded that this process has potential for use with other technologies and in other farming systems. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
In many river floodplains in the UK, there has been a long history of flood defence, land reclamation and water regime management for farming. In recent years, however, changing European and national policies with respect to farming, environment and flood management are encouraging a re-appraisal of land use in rural areas. In particular, there is scope to develop, through the use of appropriate promotional mechanisms, washland areas, which will simultaneously accommodate winter inundation, support extensive farming methods, deliver environmental benefits, and do this in a way which can underpin the rural economy. This paper explores the likely economic impacts of the development of flood storage and washland creation. In doing so, consideration is given to feasibility of this type of development, the environmental implications for a variety of habitats and species, and the financial and institutional mechanisms required to achieve implementation. (C) 2007 Elsevier Ltd. All rights reserved.