87 results for Level-Set Method


Relevance: 30.00%

Abstract:

An efficient numerical method is presented for the solution of the Euler equations governing the compressible flow of a real gas. The scheme is based on the approximate solution of a specially constructed set of linearised Riemann problems. An average of the flow variables across the interface between cells is required, and this is chosen to be the arithmetic mean for computational efficiency, which is in contrast to the usual square root averaging. The scheme is applied to a test problem for five different equations of state.
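For readers unfamiliar with the averaging step, the sketch below contrasts the arithmetic-mean interface average described in the abstract with the classical square-root (Roe) average; it is a minimal illustration for density and velocity only, and the function names are ours, not the paper's.

```python
import numpy as np

def arithmetic_average(rho_l, u_l, rho_r, u_r):
    """Interface state from a plain arithmetic mean (the averaging
    choice described in the abstract, favoured for efficiency)."""
    return 0.5 * (rho_l + rho_r), 0.5 * (u_l + u_r)

def roe_average(rho_l, u_l, rho_r, u_r):
    """Classical Roe (square-root, density-weighted) average,
    shown for comparison with the arithmetic mean."""
    w_l, w_r = np.sqrt(rho_l), np.sqrt(rho_r)
    rho_hat = w_l * w_r                                # Roe-averaged density
    u_hat = (w_l * u_l + w_r * u_r) / (w_l + w_r)      # Roe-averaged velocity
    return rho_hat, u_hat
```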

Relevance: 30.00%

Abstract:

A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5°-resolution range from approximately 50% at 1 mm h−1 to 20% at 14 mm h−1. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%–80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm day−1) in comparison with the random error resulting from infrequent satellite temporal sampling (8%–35% at the same rain rate). Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%–15% at 5 mm day−1, with proportionate reductions in latent heating sampling errors.
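As a rough illustration of the database-search step, the sketch below weights each simulated profile by a Gaussian measure of its radiative consistency with the observed brightness temperatures and composites the weighted profiles. The array names and the assumed diagonal channel uncertainties are ours; this is a generic version of the idea, not the algorithm's actual implementation.

```python
import numpy as np

def bayesian_composite(tb_obs, tb_db, profiles_db, sigma):
    """Composite database profiles, weighted by radiative consistency.

    tb_obs:      observed brightness temperatures, shape (n_channels,)
    tb_db:       simulated brightness temperatures, shape (n_profiles, n_channels)
    profiles_db: database properties (e.g. surface rain rate), shape (n_profiles, ...)
    sigma:       assumed per-channel uncertainties, shape (n_channels,)
    """
    chi2 = np.sum(((tb_db - tb_obs) / sigma) ** 2, axis=1)
    weights = np.exp(-0.5 * chi2)          # Gaussian likelihood of each profile
    weights /= weights.sum()
    # Best estimate = weighted mean over radiatively consistent profiles.
    return weights @ profiles_db
```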

Relevance: 30.00%

Abstract:

We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
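A minimal sketch of the graded-mesh idea, assuming polynomial grading of the nodes toward a corner at x = 0; the grading exponent q is illustrative, not the value prescribed by the paper's analysis.

```python
import numpy as np

def graded_mesh(n, length, q=3.0):
    """Mesh on [0, length] with nodes accumulating at the corner x = 0.

    The grading x_j = length * (j / n)**q concentrates the smallest
    elements near the corner, where the diffracted field varies fastest.
    """
    j = np.arange(n + 1)
    return length * (j / n) ** q
```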

Relevance: 30.00%

Abstract:

The method of entropy has been useful in evaluating inconsistency in human judgments. This paper applies an entropy-based decision support system called e-FDSS to multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). The system is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrated the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. The results showed that decision vectors that do not take into account the uncertainty embedded in the evaluation process are indeed biased; quantifying the deviations of these decisions provides a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.
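As a concrete illustration of how entropy can quantify the information carried by each risk criterion, the sketch below computes objective criterion weights with the standard Shannon entropy weighting method; this is a generic version of the technique, not the e-FDSS implementation itself.

```python
import numpy as np

def entropy_weights(scores):
    """Objective criterion weights from Shannon entropy.

    scores: (n_alternatives, n_criteria) matrix of non-negative ratings.
    A criterion whose ratings vary more across alternatives carries
    more information (lower entropy) and so receives a larger weight.
    """
    p = scores / scores.sum(axis=0)              # column-normalised proportions
    m = scores.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    entropy = -plogp.sum(axis=0) / np.log(m)     # per-criterion entropy in [0, 1]
    diversity = 1.0 - entropy                    # degree of diversification
    return diversity / diversity.sum()
```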

Relevance: 30.00%

Abstract:

We are developing computational tools supporting the detailed analysis of the dependence of neural electrophysiological response on dendritic morphology. We approach this problem by combining simulations of faithful models of neurons (real-life experimental morphological data combined with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all the analyzed traces and so amounts to a reduction of the noise in the data. The method allows for an automatic extraction of the relevant physiological parameters necessary for further statistical analysis. In order to illustrate the usefulness of this procedure for analyzing voltage traces, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of the structure-activity relationship in single neurons.
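The abstract does not give the recognition algorithm itself; as a minimal stand-in, the sketch below detects spikes as upward threshold crossings of the voltage trace, with the threshold value purely illustrative.

```python
import numpy as np

def detect_spikes(v, t, threshold=-20.0):
    """Spike times from a voltage trace via upward threshold crossings.

    v: membrane potential samples (mV), as a NumPy array
    t: matching time stamps (ms)
    A spike is registered wherever v crosses `threshold` from below.
    """
    above = v >= threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return t[crossings]
```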

Relevance: 30.00%

Abstract:

Sea-level rise is an important aspect of climate change because of its impact on society and ecosystems. Here we present an intercomparison of results from ten coupled atmosphere-ocean general circulation models (AOGCMs) for sea-level changes simulated for the twentieth century and projected to occur during the twenty-first century in experiments following scenario IS92a for greenhouse gases and sulphate aerosols. The model results suggest that the rate of sea-level rise due to thermal expansion of sea water has increased during the twentieth century, but the small set of tide gauges with long records might not be adequate to detect this acceleration. The rate of sea-level rise due to thermal expansion continues to increase throughout the twenty-first century, and the projected total is consequently larger than in the twentieth century; for 1990-2090 it amounts to 0.20-0.37 m. This wide range results from systematic uncertainty in the modelling of climate change and of heat uptake by the ocean. The AOGCMs agree that sea-level rise is expected to be geographically non-uniform, with some regions experiencing as much as twice the global average, and others practically zero, but they do not agree about the geographical pattern. The lack of agreement indicates that we cannot currently have confidence in projections of local sea-level changes, and reveals a need for detailed analysis and intercomparison in order to understand and reduce the disagreements.

Relevance: 30.00%

Abstract:

In this contribution we aim at anchoring Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level agents (i.e., intra-individual) to examine whether they generate better predictions than standard statistical approaches concerning both the intention to perform a behavior and the behavior itself. Moreover, this contribution tests to what extent the predictive validity of models of attitude such as the Theory of Planned Behavior (TPB) or the Model of Goal-directed Behavior (MGB) depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run by considering different deviations from rationality of the agents, using a trembling-hand method. Two data sets, concerning respectively the consumption of soft drinks and physical activity, were used. Three key findings emerged from the simulations. First, compared to the standard statistical approach, the agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, the introduction of varying degrees of deviation from rationality in agents' behavior can lead to an improvement in the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective for evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as the interactions between individuals.
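A minimal sketch of the trembling-hand perturbation described above: with probability epsilon the agent deviates from its intended (rational) action and picks uniformly at random. The function name and parameterisation are ours, not the paper's.

```python
import random

def trembling_hand_choice(intended_action, actions, epsilon):
    """Execute the intended action with probability 1 - epsilon;
    otherwise 'tremble' and pick uniformly among all actions."""
    if random.random() < epsilon:
        return random.choice(actions)
    return intended_action
```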

Relevance: 30.00%

Abstract:

Pollination is one of the most important ecosystem services in agroecosystems and supports food production. Pollinators are potentially at risk of exposure to pesticides, the main route of exposure being direct contact with, and in some cases ingestion of, contaminated materials such as pollen, nectar, flowers and foliage. To date there are no suitable methods for predicting pesticide exposure for pollinators, so official procedures to assess pesticide risk are based on a Hazard Quotient. Here we develop a procedure to assess exposure and risk for pollinators based on the foraging behaviour of honeybees (Apis mellifera), using this species as an indicator representative of pollinating insects. The method was applied in 13 European field sites with different climatic, landscape and land-use characteristics. The level of risk during the crop growing season was evaluated as a function of the active ingredients used and the application regime. Risk levels were primarily determined by the agronomic practices employed (i.e. crop type, pest control method, pesticide use), and there was a clear partitioning of risk through time. Generally the risk was higher in sites cultivated with permanent crops, such as vineyards and olives, than in annual crops, such as cereals and oilseed rape. The greatest level of risk was generally found at the beginning of the growing season for annual crops and later, in June-July, for permanent crops.
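For context, the first-tier Hazard Quotient mentioned above is conventionally computed as the field application rate divided by the honeybee LD50. The sketch below shows that arithmetic; the trigger value of 50 noted in the comment is the commonly used screening threshold, not a detail taken from this paper.

```python
def hazard_quotient(application_rate_g_per_ha, ld50_ug_per_bee):
    """Hazard Quotient for first-tier pollinator risk screening:
    field application rate (g a.i./ha) divided by the honeybee
    contact LD50 (ug a.i./bee). Values above a trigger (commonly 50)
    flag the need for higher-tier assessment."""
    return application_rate_g_per_ha / ld50_ug_per_bee
```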

Relevance: 30.00%

Abstract:

Polymer-stabilised liquid crystals are systems in which a small amount of monomer is dissolved within a liquid crystalline host, and then polymerised in situ to produce a network. The progress of the polymerisation, performed within electro-optic cells, was studied by establishing an analytical method novel to these systems. Samples were prepared by photopolymerisation of the monomer under well-defined reaction conditions; subsequent immersion in acetone caused the host and any unreacted monomer to dissolve. High performance liquid chromatography was used to separate and detect the various solutes in the resulting solutions, enabling the amount of unreacted monomer for a given set of conditions to be quantified. Longer irradiations cause a decrease in the proportion of unreacted monomer since more network is formed, while a more uniform LC director alignment (achieved by decreasing the sample thickness) or a higher level of order (achieved by decreasing the polymerisation temperature) promotes faster reactions.

Relevance: 30.00%

Abstract:

This paper describes the method and findings of a survey designed to explore the economic benefits of the adoption of Bacillus thuringiensis (Bt) cotton for smallholder farmers in the Republic of South Africa. The study found reason for cautious optimism in that the Bt variety generally resulted in a per hectare increase in yields and value of output with a reduction in pesticide costs, which outweighed the increase in seed costs to give a substantial increase in gross margins. Thus, these preliminary results suggest that Bt cotton is good for smallholder cotton farmers and the environment.
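A purely hypothetical worked example of the gross-margin arithmetic behind this finding; none of the numbers below come from the survey.

```python
def gross_margin(yield_kg_per_ha, price_per_kg, seed_cost, pesticide_cost,
                 other_costs=0.0):
    """Per-hectare gross margin: value of output minus variable costs.
    All figures are per hectare and purely illustrative."""
    return yield_kg_per_ha * price_per_kg - (seed_cost + pesticide_cost + other_costs)

# Hypothetical illustration of the trade-off the survey measured:
# higher seed cost, higher yield, and lower pesticide cost for Bt.
conventional = gross_margin(600, 2.0, seed_cost=100, pesticide_cost=150)
bt_variety   = gross_margin(750, 2.0, seed_cost=180, pesticide_cost=60)
```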

Relevance: 30.00%

Abstract:

This paper describes the method and findings of the first independent survey of smallholder farmers in the Republic of South Africa designed to explore the economic benefits of their adoption of Bt cotton. The study found that the Bt variety generally resulted in a per hectare increase in yields, value of output and reduction of pesticide costs which outweighed the increase in seed costs to give a substantial increase in gross margins. There are several surveys being carried out at the moment on different aspects of the Makhathini experience. The Monitor will be reporting on their results as these become available.

Relevance: 30.00%

Abstract:

The ever-increasing demand for high image quality requires fast and efficient methods for noise reduction, and the best-known order-statistics filter is the median filter. A method is presented to calculate the median of a set of N W-bit integers in W/B time steps. Blocks containing B-bit slices are used to find B bits of the median at a time, using a novel quantum-like representation that allows the median to be computed in an accelerated manner compared to the best-known method (W time steps). The general method allows a variety of designs to be synthesised systematically. A further novel architecture, to calculate the median for a moving set of N integers, is also discussed.
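The paper's B-bit-slice scheme is hardware oriented, but the baseline it accelerates, finding the median one bit per step from the most significant bit down (W steps for W-bit integers), can be sketched in software as a radix select. This is our illustration of the baseline, not the paper's W/B-step architecture.

```python
def bitwise_median(values, width):
    """Lower median of non-negative integers, decided one bit per step,
    from the most significant bit down (width steps in total)."""
    candidates = list(values)
    k = (len(values) - 1) // 2            # rank of the lower median
    result = 0
    for bit in reversed(range(width)):
        zeros = [v for v in candidates if not (v >> bit) & 1]
        if k < len(zeros):                # median lies in the zero-bit group
            candidates = zeros
        else:                             # median has this bit set
            k -= len(zeros)
            candidates = [v for v in candidates if (v >> bit) & 1]
            result |= 1 << bit
    return result
```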

Relevance: 30.00%

Abstract:

We present an approach for dealing with coarse-resolution Earth observations (EO) in terrestrial ecosystem data assimilation schemes. The use of coarse-scale observations in ecological data assimilation schemes is complicated by spatial heterogeneity and nonlinear processes in natural ecosystems. If these complications are not appropriately dealt with, then the data assimilation will produce biased results. The “disaggregation” approach that we describe in this paper combines frequent coarse-resolution observations with temporally sparse fine-resolution measurements. We demonstrate the approach using a demonstration data set based on measurements of an Arctic ecosystem. In this example, normalized difference vegetation index observations are assimilated into a “zero-order” model of leaf area index and carbon uptake. The disaggregation approach conserves key ecosystem characteristics regardless of the observation resolution and estimates the carbon uptake to within 1% of the demonstration data set “truth.” Assimilating the same data in the normal manner, but without the disaggregation approach, results in carbon uptake being underestimated by 58% at an observation resolution of 250 m. The disaggregation method allows the combination of multiresolution EO and improves the spatial resolution of the analysis if observations are located on a grid that shifts from one observation time to the next. Additionally, the approach is not tied to a particular data assimilation scheme, model, or EO product and can cope with complex observation distributions, as it makes no implicit assumptions of normality.
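The bias that disaggregation avoids follows from Jensen's inequality: for a nonlinear observation operator, the operator applied to the mean state differs from the mean of the operator applied to the fine-scale states. The sketch below demonstrates this with a hypothetical saturating NDVI-LAI relation; the functional form and numbers are ours, not the paper's.

```python
import numpy as np

def h(lai):
    """Hypothetical nonlinear observation operator: NDVI as a
    saturating function of leaf area index (illustrative only)."""
    return 0.9 * (1.0 - np.exp(-0.6 * lai))

# Heterogeneous fine-scale pixels inside one coarse observation footprint.
lai_fine = np.array([0.2, 0.5, 1.0, 4.0])

ndvi_of_mean = h(lai_fine.mean())    # naive: apply h to the mean state
mean_of_ndvi = h(lai_fine).mean()    # consistent with what a coarse sensor sees
# The two differ (Jensen's inequality), so comparing a coarse NDVI
# against h(mean state) introduces exactly the bias discussed above.
```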

Relevance: 30.00%

Abstract:

This letter presents an effective approach for selecting appropriate terrain modeling methods when forming a digital elevation model (DEM). This approach achieves a balance between modeling accuracy and modeling speed. A terrain complexity index is defined to represent a terrain's complexity. A support vector machine (SVM) classifies terrain surfaces as either complex or moderate based on this index together with the terrain elevation range. The classification result recommends a terrain modeling method for a given data set in accordance with its required modeling accuracy. Sample terrain data from the lunar surface are used in constructing an experimental data set. The results have shown that the terrain complexity index properly reflects the terrain complexity, and the SVM classifier derived from both the terrain complexity index and the terrain elevation range is more effective and generic than one designed from either feature alone. The statistical results have shown that the average classification accuracy of the SVMs is about 84.3% ± 0.9% across terrain types (complex or moderate). For various ratios of complex and moderate terrain types in a selected data set, the DEM modeling speed increases by up to 19.5% for a given DEM accuracy.
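A minimal sketch of the classification step, assuming scikit-learn's SVC and the two features named in the letter; the feature values, labels, and recommended methods are hypothetical placeholders.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: one row per terrain sample, with the two
# features used in the letter (terrain complexity index, elevation range),
# labelled 0 = moderate, 1 = complex.
X = np.array([[0.12, 40.0], [0.18, 55.0], [0.71, 310.0], [0.83, 420.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="rbf").fit(X, y)

# Recommend a modeling method based on the predicted terrain class.
terrain_class = clf.predict([[0.65, 280.0]])[0]
method = "high-accuracy interpolator" if terrain_class == 1 else "fast interpolator"
```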

Relevance: 30.00%

Abstract:

A fingerprint method for detecting anthropogenic climate change is applied to new simulations with a coupled ocean-atmosphere general circulation model (CGCM) forced by increasing concentrations of greenhouse gases and aerosols covering the years 1880 to 2050. In addition to the anthropogenic climate change signal, the space-time structure of the natural climate variability for near-surface temperatures is estimated from instrumental data over the last 134 years and two 1000 year simulations with CGCMs. The estimates are compared with paleoclimate data over 570 years. The space-time information on both the signal and the noise is used to maximize the signal-to-noise ratio of a detection variable obtained by applying an optimal filter (fingerprint) to the observed data. The inclusion of aerosols slows the predicted future warming. The probability that the observed increase in near-surface temperatures in recent decades is of natural origin is estimated to be less than 5%. However, this number is dependent on the estimated natural variability level, which is still subject to some uncertainty.
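A compact sketch of the optimal-fingerprint projection underlying this kind of detection method: the fingerprint is the signal pattern weighted by the inverse noise covariance, f = C⁻¹s, and the detection variable is its projection onto the observations. The function and array names are ours, as a generic illustration rather than the paper's exact formulation.

```python
import numpy as np

def detection_variable(y_obs, signal, noise_cov):
    """Project observations onto the optimal fingerprint f = C^{-1} s,
    which maximises the signal-to-noise ratio of the detection variable."""
    fingerprint = np.linalg.solve(noise_cov, signal)
    return fingerprint @ y_obs

def signal_to_noise(signal, noise_cov):
    """SNR of the optimally filtered signal: sqrt(s^T C^{-1} s)."""
    return np.sqrt(signal @ np.linalg.solve(noise_cov, signal))
```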