21 results for embedded level set

in CentAUR: Central Archive, University of Reading - UK


Relevance: 100.00%

Abstract:

The level set method is commonly used to address image noise removal. Existing studies concentrate mainly on determining the speed function of the evolution equation. Based on the idea of a Canny operator, this letter introduces a new method of controlling the level set evolution, in which the edge strength is taken into account in choosing curvature flows for the speed function and the normal to edge direction is used to orient the diffusion of the moving interface. The addition of an energy term to penalize the irregularity allows for better preservation of local edge information. In contrast with previous Canny-based level set methods that usually adopt a two-stage framework, the proposed algorithm can execute all the above operations in one process during noise removal.
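The kind of evolution the letter describes can be sketched with a generic edge-weighted curvature flow (a minimal sketch assuming an explicit finite-difference scheme and a Perona-Malik-style edge-stopping function g = 1/(1 + |grad u|^2/k^2); this is not the letter's exact speed function):

```python
import numpy as np

def edge_weighted_curvature_flow(u, n_iter=50, dt=0.1, k=10.0):
    """Smooth an image by level-set curvature motion u_t = g * |grad u| * kappa,
    where g is an edge-stopping function that slows diffusion at strong edges."""
    u = u.astype(float).copy()
    for _ in range(n_iter):
        ux = (np.roll(u, -1, 1) - np.roll(u, 1, 1)) / 2.0
        uy = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / 2.0
        uxx = np.roll(u, -1, 1) - 2.0 * u + np.roll(u, 1, 1)
        uyy = np.roll(u, -1, 0) - 2.0 * u + np.roll(u, 1, 0)
        uxy = (np.roll(np.roll(u, -1, 0), -1, 1) - np.roll(np.roll(u, -1, 0), 1, 1)
               - np.roll(np.roll(u, 1, 0), -1, 1) + np.roll(np.roll(u, 1, 0), 1, 1)) / 4.0
        grad2 = ux ** 2 + uy ** 2
        # |grad u| * curvature: the standard curvature-flow speed term
        curv = (uxx * uy ** 2 - 2.0 * ux * uy * uxy + uyy * ux ** 2) / (grad2 + 1e-8)
        # edge strength controls the flow: little smoothing across strong edges
        g = 1.0 / (1.0 + grad2 / k ** 2)
        u += dt * g * curv
    return u

rng = np.random.default_rng(0)
noisy = rng.normal(100.0, 5.0, (32, 32))
smooth = edge_weighted_curvature_flow(noisy, n_iter=30)
```

Because the edge term and the curvature flow act inside one update, smoothing and edge preservation happen in a single process rather than in separate stages.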

Relevance: 80.00%

Abstract:

This paper presents a two-stage image restoration framework intended for a novel rectangular poor-pixels detector which, with its miniature size, light weight and low power consumption, is of great value in micro vision systems. To meet the demand for fast processing, only a few measured images, shifted at the subpixel level, are needed for the fusion operation, fewer than are required in traditional approaches. A preliminary restored image is produced by linear interpolation, using maximum likelihood estimation with a least-squares method. After noise removal via Canny-operator-based level set evolution, the final high-quality restored image is obtained. Experimental results demonstrate the effectiveness of the proposed framework, a sensible step towards subsequent image understanding and object identification.
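The least-squares fusion step can be illustrated with a deliberately simplified 1-D sketch (hypothetical setup: two half-pixel-shifted low-resolution observations of an unknown signal, fused with numpy's least-squares solver; the paper's detector model and maximum-likelihood details are not reproduced here):

```python
import numpy as np

def pair_average_matrix(n_hi, shift):
    """Each low-res sample is the mean of two adjacent high-res samples,
    offset by a subpixel (one high-res sample) shift."""
    n_lo = (n_hi - shift) // 2
    A = np.zeros((n_lo, n_hi))
    for i in range(n_lo):
        A[i, shift + 2 * i] = A[i, shift + 2 * i + 1] = 0.5
    return A

x = np.sin(np.linspace(0, 4 * np.pi, 64))          # unknown high-res signal
A = np.vstack([pair_average_matrix(64, s) for s in (0, 1)])
y = A @ x                                           # two shifted low-res observations
# least-squares fusion of the shifted observations into one high-res estimate
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
err = float(np.max(np.abs(x_hat - x)))              # small reconstruction error
```

The point of the sketch is that several subpixel-shifted low-resolution views jointly constrain a finer grid, so a single least-squares solve recovers detail no individual view contains.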

Relevance: 40.00%

Abstract:

Simultaneous observations of cloud microphysical properties were obtained by in-situ aircraft measurements and ground-based radar/lidar. Widespread mid-level stratus cloud was present below a temperature inversion (~5 °C magnitude) at 3.6 km altitude. Localised convection (peak updraft 1.5 m s−1) was observed 20 km west of the radar station. This was associated with convergence at 2.5 km altitude. The convection was unable to penetrate the inversion capping the mid-level stratus. The mid-level stratus cloud was vertically thin (~400 m), horizontally extensive (covering hundreds of km) and persisted for more than 24 h. The cloud consisted of supercooled water droplets and small concentrations of large (~1 mm) stellar/plate-like ice which slowly precipitated out. This ice was nucleated at temperatures greater than −12.2 °C and less than −10.0 °C (cloud-top and cloud-base temperatures, respectively). No ice seeding from above the cloud layer was observed. This ice was formed by primary nucleation, either through the entrainment of efficient ice nuclei from above/below cloud, or by the slow stochastic activation of immersion-freezing ice nuclei contained within the supercooled drops. Above cloud top, significant concentrations of sub-micron aerosol were observed, consisting of a mixture of sulphate and carbonaceous material, a potential source of ice nuclei. Particle number concentrations were measured in the size range 0.1 … Precipitation from the mid-level stratus evaporated before reaching the surface, whereas rates of up to 1 mm h−1 were observed below the convective feature. There is strong evidence for the Hallett-Mossop (HM) process of secondary ice particle production leading to the formation of the observed precipitation: (1) ice concentrations in the convective feature were more than an order of magnitude greater than the concentration of primary ice in the overlying stratus; (2) large concentrations of small pristine columns were observed at the ~−5 °C level together with liquid water droplets and a few rimed ice particles; (3) columns were larger and increasingly rimed at colder temperatures. Calculated ice-splinter production rates are consistent with observed concentrations if the condition that only droplets greater than 24 μm can generate secondary ice splinters is relaxed. This case demonstrates the importance of understanding the formation of ice at slightly supercooled temperatures, as it can lead to secondary ice production and the formation of precipitation in clouds which might not otherwise be considered significant precipitation sources.

Relevance: 30.00%

Abstract:

A new spectral-based approach is presented to find orthogonal patterns in gridded weather/climate data. The method is based on optimizing the interpolation error variance. The optimally interpolated patterns (OIP) are then given by the eigenvectors of the interpolation error covariance matrix, obtained using the cross-spectral matrix. The formulation of the approach is presented, and the application to low-dimension stochastic toy models and to various reanalysis datasets is performed. In particular, it is found that the lowest-frequency patterns correspond to the largest eigenvalues, that is, variances, of the interpolation error matrix. The approach has been applied to the Northern Hemispheric (NH) and tropical sea level pressure (SLP) and to the Indian Ocean sea surface temperature (SST). Two main OIP patterns are found for the NH SLP, representing respectively the North Atlantic Oscillation and the North Pacific pattern. The leading tropical SLP OIP represents the Southern Oscillation. For the Indian Ocean SST, the leading OIP pattern shows a tripole-like structure, having one sign over the eastern and north- and southwestern parts and the opposite sign in the remaining parts of the basin. The pattern is also found to have a high lagged correlation with the Niño-3 index at a 6-month lag.
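The core eigen-decomposition idea can be illustrated with a toy example (a sketch under stated assumptions: a plain sample covariance of synthetic gridded anomalies stands in for the paper's interpolation-error covariance built from the cross-spectral matrix):

```python
import numpy as np

# Plant a single spatial mode in synthetic "gridded" data, then recover it
# as the eigenvector of the covariance matrix with the largest eigenvalue.
rng = np.random.default_rng(1)
n_time, n_grid = 500, 40
pattern = np.sin(np.linspace(0, np.pi, n_grid))          # planted spatial mode
data = (np.outer(rng.normal(size=n_time), pattern)
        + 0.1 * rng.normal(size=(n_time, n_grid)))       # mode + noise
anom = data - data.mean(axis=0)                           # anomalies
cov = anom.T @ anom / (n_time - 1)                        # sample covariance
evals, evecs = np.linalg.eigh(cov)                        # ascending eigenvalues
leading = evecs[:, -1]                                    # largest-variance pattern
r = np.corrcoef(leading, pattern)[0, 1]                   # |r| close to 1
```

As in the paper, the patterns are orthogonal by construction because they are eigenvectors of a symmetric covariance matrix.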

Relevance: 30.00%

Abstract:

1. Habitat fragmentation can affect pollinator and plant population structure in terms of species composition, abundance, area covered and density of flowering plants. This, in turn, may affect pollinator visitation frequency, pollen deposition, seed set and plant fitness. 2. A reduction in the quantity of flower visits can be coupled with a reduction in the quality of pollination service and hence the plants' overall reproductive success and long-term survival. Understanding the relationship between plant population size and/or isolation and pollination limitation is of fundamental importance for plant conservation. 3. We examined flower visitation and seed set of 10 different plant species from five European countries to investigate the general effects of plant population size and density, both within (patch level) and between populations (population level), on seed set and pollination limitation. 4. We found evidence that the effects of area and density of flowering plant assemblages were generally more pronounced at the patch level than at the population level. We also found that patch and population level together influenced flower visitation and seed set, and the latter increased with increasing patch area and density, but this effect was only apparent in small populations. 5. Synthesis. By using an extensive pan-European data set on flower visitation and seed set we have identified a general pattern in the interplay between the attractiveness of flowering plant patches for pollinators and density dependence of flower visitation, and also a strong plant species-specific response to habitat fragmentation effects. This can guide efforts to conserve plant-pollinator interactions, ecosystem functioning and plant fitness in fragmented habitats.

Relevance: 30.00%

Abstract:

Farming systems research is a multi-disciplinary, holistic approach to solving the problems of small farms. Small and marginal farmers are the core of the Indian rural economy, constituting 0.80 of the total farming community but possessing only 0.36 of the total operational land. The declining trend of per capita land availability poses a serious challenge to the sustainability and profitability of farming. Under such conditions, it is appropriate to integrate land-based enterprises such as dairy, fishery, poultry, duckery, apiary, and field and horticultural cropping within the farm, with the objective of generating adequate income and employment for these small and marginal farmers under a set of farm constraints and varying levels of resource availability and opportunity. The integration of different farm enterprises can be achieved with the help of a linear programming model. For the current review, integrated farming systems models were developed, by way of illustration, for the marginal, small, medium and large farms of eastern India using linear programming. Risk analyses were carried out for different levels of income and enterprise combinations. The fishery enterprise was shown to be less risk-prone, whereas the crop enterprise involved greater risk. In general, the degree of risk increased with increasing level of income. With increases in farm income and risk level, resource use efficiency increased. Medium and large farms proved more profitable than small and marginal farms, with a higher level of resource use efficiency and return per Indian rupee (Rs) invested. Among the different enterprises of integrated farming systems, a chain of interaction and resource flow was observed. In order to make farming profitable and improve resource use efficiency at the farm level, the synergy among interacting components of farming systems should be exploited. In the process of technology generation, transfer and other developmental efforts at the farm level (contrary to the discipline- and commodity-based approaches, which have a tendency to be piecemeal and in isolation), it is desirable to place a whole-farm scenario before the farmers to enhance their farm income, thereby motivating them towards more efficient and sustainable farming.
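The enterprise-mix optimisation the review describes can be sketched as a small linear programme (all coefficients below are hypothetical illustrations, not the review's actual model):

```python
from scipy.optimize import linprog

# Toy allocation for a 1-ha farm: choose areas (ha) for crop, dairy fodder
# and a fish pond to maximise net return under land and labour constraints.
returns = [30_000, 45_000, 60_000]      # Rs/ha: crop, dairy, fishery (hypothetical)
labour = [120, 200, 90]                 # person-days/ha (hypothetical)
res = linprog(c=[-r for r in returns],  # linprog minimises, so negate returns
              A_ub=[[1, 1, 1],          # total land constraint
                    labour],            # total labour constraint
              b_ub=[1.0, 160],          # 1 ha of land, 160 person-days
              bounds=[(0, None)] * 3)
best_return = -res.fun                  # optimal net return (Rs)
```

With these numbers the solver allocates the whole hectare to the fishery enterprise, the highest-return, lowest-labour option; in a fuller model, risk constraints and enterprise interactions would pull the solution towards a mix.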

Relevance: 30.00%

Abstract:

A series of experiments was completed to investigate the impact of enzyme addition at ensiling on in vitro rumen degradation of maize silage. Two commercial products, Depol 40 (D; Biocatalysts Ltd., Pontypridd, UK) and Liquicell 2500 (L; Specialty Enzymes and Biochemicals, Fresno, CA, USA), were used. In experiment 1, the pH optima over a pH range of 4.0-6.8 and the stability of D and L under changing pH (4.0, 5.6, 6.8) and temperature (15 and 39 °C) conditions were determined. In experiment 2, D and L were applied at three levels to whole-crop maize at ensiling, using triplicate 0.5 kg capacity laboratory minisilos. A completely randomized design with a factorial arrangement of treatments was used. One set of treatments was stored at room temperature, whereas another set was stored at 40 °C during the first 3 weeks of fermentation and then at room temperature. Silages were opened after 120 days. Results from experiment 1 indicated that the xylanase activity of both products showed an optimal pH of about 5.6, although the response differed between enzymes, whereas the endoglucanase activity was inversely related to pH. Both products retained at least 70% of their xylanase activity after 48 h of incubation at 15 or 39 °C. In experiment 2, enzymes reduced (P < 0.05) silage pH, regardless of storage temperature and enzyme level. Depol 40 reduced (P < 0.05) the starch contents of the silages, due to its high alpha-amylase activity. This effect was more noticeable in the silages stored at room temperature. Addition of L reduced (P < 0.05) neutral detergent fiber (NDF) and acid detergent fiber (ADF) contents. In vitro rumen degradation, assessed using the Reading Pressure Technique (RPT), showed that L increased (P < 0.05) the initial 6 h gas production (GP) and organic matter degradability (OMD), but did not affect (P > 0.05) the final extent of OMD, indicating that this preparation acted on the rumen-degradable material. In contrast, silages treated with D had reduced (P < 0.05) rates of gas production and OMD. These enzymes, regardless of ensiling temperature, can be effective in improving the nutritive quality of maize silage when applied at ensiling. However, the biochemical properties of the enzymes (i.e., enzymic activities, optimum pH) may have a crucial role in dictating the nature of the responses. (C) 2003 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Nonregular two-level fractional factorial designs are designs which cannot be specified in terms of a set of defining contrasts. The aliasing properties of nonregular designs can be compared using a generalisation of the minimum aberration criterion called minimum G2-aberration. Until now, the only nontrivial designs known to have minimum G2-aberration are designs for n runs and m ≥ n−5 factors. In this paper, a number of construction results are presented which allow minimum G2-aberration designs to be found for many of the cases with n = 16, 24, 32, 48, 64 and 96 runs and m ≥ n/2−2 factors.
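The G2-aberration criterion ranks designs by their generalized wordlength pattern, which can be computed directly from the design matrix via J-characteristics. A minimal sketch, assuming the standard definition A_k = Σ_{|S|=k} (J_S/n)² (this illustrates the criterion, not the paper's construction results):

```python
import numpy as np
from itertools import combinations

def gwlp(D):
    """Generalized wordlength pattern (A_1..A_m) of a two-level design with
    entries +/-1; minimum G2-aberration sequentially minimises A_1, A_2, ..."""
    n, m = D.shape
    A = [0.0] * (m + 1)
    for k in range(1, m + 1):
        for S in combinations(range(m), k):
            J = abs(D[:, list(S)].prod(axis=1).sum())   # J-characteristic of S
            A[k] += (J / n) ** 2
    return A[1:]

# Regular 2^(3-1) design with defining relation I = ABC:
a = np.array([-1, -1, 1, 1])
b = np.array([-1, 1, -1, 1])
D = np.column_stack([a, b, a * b])
pattern = gwlp(D)   # one word of length 3 -> [0.0, 0.0, 1.0]
```

For a regular design the pattern recovers the ordinary wordlength pattern; for nonregular designs the J-characteristics take values strictly between 0 and n, which is exactly what the G2 generalisation captures.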

Relevance: 30.00%

Abstract:

It is generally acknowledged that population-level assessments provide a better measure of response to toxicants than assessments of individual-level effects. Population-level assessments generally require the use of models to integrate potentially complex data about the effects of toxicants on life-history traits, and to provide a relevant measure of ecological impact. Building on excellent earlier reviews, we here briefly outline the modelling options in population-level risk assessment. Modelling is used to calculate population endpoints from available data, which are often about individual life histories, the ways that individuals interact with each other, the environment and other species, and the ways individuals are affected by pesticides. As population endpoints, we recommend the use of population abundance, population growth rate, and the chance of population persistence. We recommend two types of model: simple life-history models distinguishing two life-history stages, juveniles and adults; and spatially explicit individual-based landscape models. Life-history models are very quick to set up and run, and they provide a great deal of insight. At the other extreme, individual-based landscape models provide the greatest verisimilitude, albeit at the cost of greatly increased complexity. We conclude with a discussion of the implications of the severe problems of parameterising models.
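A two-stage life-history model of the kind recommended here can be sketched in a few lines (hypothetical parameter values; the population growth rate endpoint is the dominant eigenvalue of the stage-projection matrix):

```python
import numpy as np

# Juvenile/adult projection matrix: row 1 = reproduction, row 2 = survival.
f = 2.0    # fecundity: juveniles produced per adult per time step (hypothetical)
s_j = 0.3  # juvenile survival / maturation probability (hypothetical)
s_a = 0.5  # adult survival (hypothetical)
M = np.array([[0.0, f],
              [s_j, s_a]])
lam = max(np.linalg.eigvals(M).real)        # growth rate; > 1 means growth

# A toxicant that halves juvenile survival lowers the growth rate:
M_tox = np.array([[0.0, f],
                  [s_j / 2.0, s_a]])
lam_tox = max(np.linalg.eigvals(M_tox).real)
```

This is what makes such models quick to set up and run: the whole toxicant assessment reduces to comparing eigenvalues of 2x2 matrices, with persistence assessed by whether the growth rate stays above 1.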

Relevance: 30.00%

Abstract:

Objectives: To assess the potential source of variation that the surgeon may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy, involving 43 surgeons. The primary end point of the trials was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, that is, a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted by maximum likelihood in SAS. Results: There were many convergence problems. These were resolved using a variety of approaches, including treating all effects as fixed during initial model building, modelling the variance of a parameter on a logarithmic scale, and centring continuous covariates. The initial model-building process indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power: the variance estimates were small with large standard errors, indicating that their precision may be questionable.
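The hierarchical structure described in the Methods can be illustrated by simulation (all effect sizes are hypothetical; this sketches patients nested within surgeons with a random intercept on the log-odds scale, not the SAS model actually fitted):

```python
import numpy as np

# Patients nested within surgeons, sparse binary outcome (major complication).
rng = np.random.default_rng(42)
n_surgeons, patients_per_surgeon = 43, 32          # ~1376 patients in total
baseline_logit = np.log(0.10 / 0.90)               # ~10% overall complication rate
surgeon_effect = rng.normal(0.0, 0.3, n_surgeons)  # random intercepts (hypothetical SD)
logits = np.repeat(baseline_logit + surgeon_effect, patients_per_surgeon)
p = 1.0 / (1.0 + np.exp(-logits))
complication = rng.binomial(1, p)                  # sparse binary outcome
rate = complication.mean()                         # close to 0.10 overall
```

Simulations like this make the power problem in the Conclusions concrete: with a 10% event rate and only a few dozen outcomes per surgeon, a modest random-intercept variance is hard to distinguish from zero.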

Relevance: 30.00%

Abstract:

The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy, and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as "hyperthymestria" [1] and "The Seven Sins of Memory" defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.

Relevance: 30.00%

Abstract:

This paper is concerned with the uniformization of a system of affine recurrence equations. This transformation is used in the design (or compilation) of highly parallel embedded systems (VLSI systolic arrays, signal processing filters, etc.). In this paper, we present and implement an automatic system to achieve uniformization of systems of affine recurrence equations. We unify the results from many earlier papers, develop some theoretical extensions, and then propose effective uniformization algorithms. Our results can be used in any high-level synthesis tool based on polyhedral representation of nested loop computations.
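Uniformization replaces affine (long-range) dependences with uniform (constant-distance) ones by propagating values through pipeline variables. A textbook-style toy example, not the paper's general algorithm:

```python
# The affine recurrence  Y[i] = A[0] + X[i]  makes every point i depend on
# the fixed point 0 (a broadcast: dependence distance grows with i).
def affine_version(A, X):
    return [A[0] + x for x in X]

# Uniformized form: A[0] is carried along a pipeline variable P, so every
# point depends only on its immediate neighbour (constant distance 1):
#   P[0] = A[0];  P[i] = P[i-1];  Y[i] = P[i] + X[i]
def uniform_version(A, X):
    P, Y = [A[0]], []
    for i, x in enumerate(X):
        if i > 0:
            P.append(P[i - 1])    # uniform dependence (i <- i-1)
        Y.append(P[i] + x)
    return Y

X = [1, 2, 3, 4]
same = affine_version([10], X) == uniform_version([10], X)   # True
```

Constant-distance dependences are what make the computation mappable onto a systolic array, since each cell then needs only local, fixed interconnections.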

Relevance: 30.00%

Abstract:

The method of entropy has been useful in evaluating inconsistency in human judgments. This paper illustrates an entropy-based decision support system, called e-FDSS, for multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). It is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrates the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. The results showed that when the amount of uncertainty embedded in the evaluation process is not taken into account, the decision vectors are biased; quantifying the deviations of these decisions provides a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.
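The entropy idea can be illustrated with the standard entropy-weight method (a generic sketch with hypothetical scores; e-FDSS's fuzzy-logic and genetic-algorithm layers are omitted):

```python
import numpy as np

# Rows = projects, columns = risk criteria (hypothetical survey scores).
X = np.array([[7.0, 2.0, 5.0],
              [6.0, 8.0, 4.0],
              [5.0, 5.0, 6.0]])
P = X / X.sum(axis=0)                         # normalise each criterion column
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)    # per-criterion entropy in [0, 1]
d = 1.0 - entropy                             # divergence: low entropy = informative
weights = d / d.sum()                         # criteria that discriminate weigh more
```

Here the second criterion receives the largest weight because its scores are most dispersed across projects; a criterion on which all projects score alike has maximal entropy and contributes almost nothing to the ranking.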

Relevance: 30.00%

Abstract:

Sea-level rise is an important aspect of climate change because of its impact on society and ecosystems. Here we present an intercomparison of results from ten coupled atmosphere-ocean general circulation models (AOGCMs) for sea-level changes simulated for the twentieth century and projected to occur during the twenty-first century in experiments following scenario IS92a for greenhouse gases and sulphate aerosols. The model results suggest that the rate of sea-level rise due to thermal expansion of sea water has increased during the twentieth century, but the small set of tide gauges with long records might not be adequate to detect this acceleration. The rate of sea-level rise due to thermal expansion continues to increase throughout the twenty-first century, and the projected total is consequently larger than in the twentieth century; for 1990-2090 it amounts to 0.20-0.37 m. This wide range results from systematic uncertainty in modelling of climate change and of heat uptake by the ocean. The AOGCMs agree that sea-level rise is expected to be geographically non-uniform, with some regions experiencing as much as twice the global average and others practically zero, but they do not agree about the geographical pattern. The lack of agreement indicates that we cannot currently have confidence in projections of local sea-level changes, and reveals a need for detailed analysis and intercomparison in order to understand and reduce the disagreements.