954 results for Objective method


Relevance:

30.00%

Publisher:

Abstract:

In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms that show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
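As an illustration of the channel-based merging step, here is a minimal sketch of the mutual information between regions and histogram bins and of the loss incurred by merging two regions; the array-based representation is an assumption for illustration, not the paper's implementation.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(R;B) from a joint histogram over
    regions R (rows) and intensity bins B (columns)."""
    p = joint / joint.sum()
    pr = p.sum(axis=1, keepdims=True)   # marginal over regions
    pb = p.sum(axis=0, keepdims=True)   # marginal over bins
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (pr @ pb)[nz])).sum())

def merge_loss(joint, i, j):
    """Loss of mutual information incurred by merging regions i and j."""
    merged = np.delete(joint, j, axis=0)
    merged[i if i < j else i - 1] = joint[i] + joint[j]
    return mutual_information(joint) - mutual_information(merged)
```

A greedy merger would evaluate `merge_loss` for all region pairs, merge the pair with the smallest loss, and repeat until a stopping criterion (e.g. a target region count) is reached.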

Relevance:

30.00%

Publisher:

Abstract:

Many emerging Internet applications, such as TV over Internet, radio over Internet, and multi-point video streaming, have resource requirements such as bandwidth consumption, end-to-end delay, and packet loss rate. It is therefore necessary to formulate a proposal that specifies and provides the resources this type of application needs to function properly. In this thesis, we propose a multi-objective traffic engineering scheme that uses different distribution trees for many multicast flows. We use a multipath approach for each egress node to obtain a multi-tree approach and, in this way, create different multicast trees. Our proposal also solves for the fraction of traffic split across the multiple trees. The proposal can be applied in MPLS networks by establishing explicit routes for multicast events. The first objective is to combine the following weighted objectives into a single aggregated metric: maximum link utilization, hop count, total bandwidth consumption, and total end-to-end delay. We have formulated this multi-objective function (the MHDB-S model), and the results obtained show that the various weighted objectives are reduced and the maximum link utilization is minimized. The problem is NP-hard, so an algorithm is proposed to optimize the different objectives; its behaviour is similar to that obtained with the model. Since egress nodes may join or leave the tree during a multicast transmission, this thesis also proposes a multi-objective traffic engineering scheme that uses different trees for dynamic multicast groups (in which the egress nodes can change during the lifetime of the connection). If a multicast tree is recomputed from scratch, considerable CPU time may be consumed and all communications using the tree are temporarily interrupted. To alleviate these drawbacks, we propose an optimization model (the dynamic MHDB-D model) that reuses the multicast trees previously computed by the static MHDB-S model and adds new egress nodes. Using the weighted-sum method to solve the analytical model is not necessarily correct, because the solution space may be non-convex and some solutions can therefore be missed. In addition, other types of objectives have appeared in other research works. For these reasons, a new model, called GMM, is proposed, and a new algorithm based on multi-objective evolutionary algorithms (MOEAs), inspired by the Strength Pareto Evolutionary Algorithm (SPEA), is proposed to solve it. To handle the dynamic case with this generalized model, we propose a new dynamic model and a computational solution using probabilistic breadth-first search (BFS). Finally, we ran various tests and simulations to evaluate the proposed optimization scheme.
The main contributions of this thesis are the taxonomy; the multi-objective optimization models for the static and dynamic multicast cases (MHDB-S and MHDB-D); the algorithms that provide computational solutions to these models; and the generalized models for the static and dynamic cases (GMM and dynamic GMM), together with the computational solutions using MOEAs and probabilistic BFS.
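As an illustration of the aggregated metric in the MHDB-S model, here is a minimal sketch; the field names and weight ordering are assumptions for illustration, not taken from the thesis.

```python
def aggregate_metric(trees, weights):
    """Weighted single-metric form of the four objectives named above.
    Each tree carries its hop count, bandwidth, end-to-end delay, and
    per-link utilizations; the field names are illustrative only."""
    w_u, w_h, w_b, w_d = weights
    max_util = max(u for t in trees for u in t["link_utilization"])
    hops = sum(t["hop_count"] for t in trees)
    bandwidth = sum(t["bandwidth"] for t in trees)
    delay = sum(t["end_to_end_delay"] for t in trees)
    return w_u * max_util + w_h * hops + w_b * bandwidth + w_d * delay
```

Minimizing this scalar over candidate tree sets is the weighted-sum approach; as the abstract notes, a non-convex solution space is what motivates the later move to MOEAs.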

Relevance:

30.00%

Publisher:

Abstract:

Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform the objective comparison between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying this method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
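The paper's trajectory-building step comes from dynamic scene analysis; as a stand-in, the following sketch links feature points across time levels by greedy nearest-neighbour matching, which conveys the idea but is not the published technique.

```python
import numpy as np

def link_trajectories(frames, max_dist):
    """Greedily link feature points across consecutive time levels by
    nearest neighbour; unmatched points start new trajectories.
    `frames` is a list of arrays of (x, y) coordinates, one per time level."""
    trajectories = [[tuple(p)] for p in frames[0]]
    active = list(range(len(trajectories)))
    for pts in frames[1:]:
        pts = [tuple(p) for p in pts]
        still_active, claimed = [], set()
        for ti in active:
            last = np.asarray(trajectories[ti][-1])
            match = None
            if pts:
                dists = [np.linalg.norm(last - np.asarray(p)) for p in pts]
                match = next((int(k) for k in np.argsort(dists)
                              if dists[k] <= max_dist and int(k) not in claimed),
                             None)
            if match is not None:
                claimed.add(match)
                trajectories[ti].append(pts[match])
                still_active.append(ti)
        for k, p in enumerate(pts):   # points with no parent start new tracks
            if k not in claimed:
                trajectories.append([p])
                still_active.append(len(trajectories) - 1)
        active = still_active
    return trajectories
```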

Relevance:

30.00%

Publisher:

Abstract:

Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP-Department of Energy (DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences for the spatial distribution of mean intensities show that the ECMWF reanalysis (ERA15) is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15. Also shown are results based on using the same techniques to diagnose the tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.

Relevance:

30.00%

Publisher:

Abstract:

A new objective climatology of polar lows in the Nordic (Norwegian and Barents) seas has been derived from a database of diagnostics of objectively identified cyclones spanning the period January 2000 to April 2004. There are two distinct parts to this study: the development of the objective climatology and a characterization of the dynamical forcing of the polar lows identified. Polar lows are an intense subset of polar mesocyclones. Polar mesocyclones are distinguished from other cyclones in the database as those that occur in cold air outbreaks over the open ocean. The difference between the wet-bulb potential temperature at 700 hPa and the sea surface temperature (SST) is found to be an effective discriminator between the atmospheric conditions associated with polar lows and other cyclones in the Nordic seas. A verification study shows that the objective identification method is reliable in the Nordic seas region. After demonstrating success at identifying polar lows using the above method, the dynamical forcing of the polar lows in the Nordic seas is characterized. Diagnostics of the ratio of mid-level vertical motion attributable to quasi-geostrophic forcing from upper and lower levels (U/L ratio) are used to determine the prevalence of a recently proposed category of extratropical cyclogenesis, type C, for which latent heat release is crucial to development. Thirty-one percent of the objectively identified polar low events (36 of 115) exceeded the U/L ratio of 4.0, previously identified as a threshold for type C cyclones. There is a contrast between polar lows to the north and south of the Nordic seas. In the southern Norwegian Sea, the population of polar low events is dominated by type C cyclones. These possess strong convection and weak low-level baroclinicity. Over the Barents and northern Norwegian seas, the well-known cyclogenesis types A and B dominate. These possess stronger low-level baroclinicity and weaker convection.
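A minimal sketch of the two diagnostics described above; the cold-air-outbreak threshold on the wet-bulb potential temperature at 700 hPa minus SST is a placeholder (the paper derives its own discriminating value), while the U/L ratio threshold of 4.0 is the one quoted.

```python
def classify_polar_low(theta_w_700, sst, ul_ratio, stability_threshold=-3.0):
    """Two-step check: (1) cold-air-outbreak test on theta_w(700 hPa) - SST,
    with a placeholder threshold in kelvin, and (2) the published
    U/L ratio > 4.0 criterion for type C cyclogenesis."""
    if theta_w_700 - sst > stability_threshold:
        return "not a polar low candidate"
    return "type C polar low" if ul_ratio > 4.0 else "type A/B polar low"
```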

Relevance:

30.00%

Publisher:

Abstract:

A method is presented for determining the time to first division of individual bacterial cells growing on agar media. Bacteria were inoculated onto agar-coated slides and viewed by phase-contrast microscopy. Digital images of the growing bacteria were captured at intervals and the time to first division estimated by calculating the "box area ratio". This is the area of the smallest rectangle that can be drawn around an object, divided by the area of the object itself. The box area ratios of cells were found to increase suddenly during growth at a time that correlated with cell division as estimated by visual inspection of the digital images. This was caused by a change in the orientation of the two daughter cells that occurred when sufficient flexibility arose at their point of attachment. This method was used successfully to generate lag time distributions for populations of Escherichia coli, Listeria monocytogenes and Pseudomonas aeruginosa, but did not work with the coccoid organism Staphylococcus aureus. This method provides an objective measure of the time to first cell division, whilst automation of the data processing allows a large number of cells to be examined per experiment.
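The box area ratio is simple to compute from a binary object mask; the sketch below assumes an axis-aligned bounding rectangle and an illustrative jump threshold, since the abstract does not specify either.

```python
import numpy as np

def box_area_ratio(mask):
    """Box area ratio of a binary object mask: area of the smallest
    axis-aligned bounding rectangle divided by the object's own area."""
    rows, cols = np.nonzero(mask)
    box_area = (rows.max() - rows.min() + 1) * (cols.max() - cols.min() + 1)
    return box_area / mask.sum()

def time_to_first_division(ratios, times, jump=0.2):
    """Return the first time at which the box area ratio jumps by more
    than `jump` between successive images (threshold is illustrative)."""
    for t, (a, b) in zip(times[1:], zip(ratios, ratios[1:])):
        if b - a > jump:
            return t
    return None
```

A single round cell nearly fills its bounding box (ratio near 4/pi), whereas two daughter cells bending at their attachment point force a much larger box, which is what produces the sudden jump.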

Relevance:

30.00%

Publisher:

Abstract:

The usefulness of motor subtypes of delirium is unclear owing to inconsistency in subtyping methods and a lack of validation with objective measures of activity. The activity of 40 patients was measured over 24 h with a discrete accelerometer-based activity monitor. The continuous wavelet transform (CWT), with various mother wavelets, was applied to accelerometry data from three randomly selected patients with DSM-IV delirium who were readily divided into hyperactive, hypoactive, and mixed motor subtypes. A classification tree used the periods of overall movement measured by the accelerometer-based monitor as the determining factors for classifying these delirious patients. The data used to build the classification tree were the minimum, maximum, standard deviation, and number of coefficient values generated over a range of scales by the CWT. The classification tree was subsequently used to assign the motor subtypes of the remaining patients. This classification system shows how delirium subtypes can be categorized in relation to overall motoric behavior, and it successfully defined the motor subtypes of the other patients. Motor subtypes of delirium defined by observed ward behavior differ in electronically measured activity levels.
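A sketch of the CWT feature-extraction step using PyWavelets; the choice of mother wavelet, scale range, and the "large coefficient" threshold are assumptions, as the abstract does not fix them.

```python
import numpy as np
import pywt  # PyWavelets

def cwt_features(accel, scales=np.arange(1, 65), wavelet="morl"):
    """Summary features of the CWT of an accelerometry trace, in the
    spirit of the paper: minimum, maximum, standard deviation, and a
    count of large coefficients over a range of scales."""
    coef, _ = pywt.cwt(accel, scales, wavelet)
    return {
        "min": float(coef.min()),
        "max": float(coef.max()),
        "std": float(coef.std()),
        "n_large": int((np.abs(coef) > coef.std()).sum()),
    }
```

The resulting feature vectors could then be fed to an off-the-shelf decision tree (e.g. scikit-learn's DecisionTreeClassifier) to reproduce the classification-tree step.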

Relevance:

30.00%

Publisher:

Abstract:

Using the integral manifold approach, a composite control—the sum of a fast control and a slow control—is derived for a particular class of non-linear singularly perturbed systems. The fast control is designed completely at the outset, thus ensuring the stability of the fast transients of the system and, furthermore, the existence of the integral manifold. A new method is then presented which simplifies the derivation of a slow control such that the singularly perturbed system meets a preselected design objective to within some specified order of accuracy. Though this approach is, by its very nature, ad hoc, the underlying procedure is easily extended to more general classes of singularly perturbed systems by way of three examples.
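For reference, the generic singularly perturbed form underlying this setup reads as follows (a sketch of the standard formulation; the paper's particular class of non-linear systems adds further structure):

```latex
\dot{x} = f(x, z, u), \qquad
\varepsilon \dot{z} = g(x, z, u), \qquad 0 < \varepsilon \ll 1,
\qquad u = u_f + u_s .
```

The fast control $u_f$ stabilizes the fast transients so that an integral manifold $z = h(x, u_s, \varepsilon)$ exists; on that manifold the slow dynamics reduce to $\dot{x} = f\bigl(x, h(x, u_s, \varepsilon), u_s\bigr)$, and the slow control $u_s$ is designed against this reduced model to meet the design objective to the specified order of accuracy in $\varepsilon$.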

Relevance:

30.00%

Publisher:

Abstract:

Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
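A minimal sketch of aggregating ordinal factor scores; the four factor names and the min-aggregation rule are invented for illustration, since the paper defines its own factors within its goal-oriented framework.

```python
from enum import IntEnum

class Confidence(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def requirements_confidence(scores):
    """Overall confidence as the minimum over the factor scores,
    i.e. the weakest factor bounds the whole assessment."""
    return Confidence(min(scores.values()))

overall = requirements_confidence({
    "completeness": Confidence.MEDIUM,
    "stability": Confidence.HIGH,
    "stakeholder_agreement": Confidence.MEDIUM,
    "understanding": Confidence.HIGH,
})  # -> Confidence.MEDIUM
```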

Relevance:

30.00%

Publisher:

Abstract:

Foundation construction is a key determinant of success in construction engineering. Among the many deep-excavation methods, the diaphragm wall method is used more frequently in Taiwan than anywhere else in the world. The traditional approach to sequencing diaphragm wall unit construction activities is to establish each phase of the sequence heuristically. However, this creates conflicts between the final phase of the construction and the unit construction, and adversely affects the planned construction time. To avoid this situation, this study applies management science to diaphragm wall unit construction and formulates the sequencing task as a multi-objective combinatorial optimization problem. Because the mathematical model of the problem is multi-objective and combinatorially explosive (it belongs to the class of NP-complete problems), a 2-type self-learning neural network (SLNN) is proposed to solve the sequencing problem for N = 12, 24, and 36 diaphragm wall units. To assess the reliability of the results, the study compares the SLNN against a random search method. The tests show that the SLNN is superior to random search in both solution quality and solving efficiency.
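The random search baseline used for comparison is straightforward to sketch; the cost function is left abstract here, since the abstract does not define the objectives in detail.

```python
import random

def random_search_sequence(n_units, cost_fn, iterations=10_000):
    """Random-search baseline: sample random construction orders and
    keep the best under a user-supplied multi-objective cost function
    (e.g. a weighted sum of the sequencing objectives)."""
    best_order, best_cost = None, float("inf")
    order = list(range(n_units))
    for _ in range(iterations):
        random.shuffle(order)
        cost = cost_fn(order)
        if cost < best_cost:
            best_order, best_cost = list(order), cost
    return best_order, best_cost
```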

Relevance:

30.00%

Publisher:

Abstract:

Previous studies have reported that cheese curd syneresis kinetics can be monitored by the dilution of chemical tracers, such as Blue Dextran, in whey. The objective of this study was to evaluate an improved tracer method for monitoring the whey volumes expelled over time during syneresis. Two experiments with different ranges of milk fat (0-5% and 2.3-3.5%) were carried out in an 11 L double-O laboratory-scale cheese vat. Tracer was added to the curd-whey mixture during the cutting phase of cheese making, and samples were taken at 10 min intervals up to 75 min after cutting. The volume of whey expelled was measured gravimetrically, and the dilution of tracer in the whey was measured by absorbance at 620 nm. The volumes of whey expelled were significantly reduced at higher milk fat levels. Whey yield was predicted with a standard error of prediction (SEP) ranging from 3.2 to 6.3 g whey/100 mL of milk and a coefficient of variation (CV) ranging from 2.03 to 2.7% at the different milk fat levels.
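The tracer principle reduces to a dilution calculation; this sketch assumes a linear Beer-Lambert calibration at 620 nm, which is the general principle rather than the paper's exact calibration model.

```python
def whey_volume_from_tracer(tracer_mass_mg, absorbance, k_abs):
    """Estimate expelled whey volume (mL) by tracer dilution: the tracer
    stays in the whey phase, so its concentration is mass/volume, and
    absorbance is assumed proportional to concentration (A = k * c),
    giving V = k * m / A. `k_abs` comes from a calibration curve."""
    concentration = absorbance / k_abs        # mg/mL of tracer in whey
    return tracer_mass_mg / concentration     # mL of whey expelled
```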

Relevance:

30.00%

Publisher:

Abstract:

Integrated Arable Farming Systems (IAFS), which involve a reduction in the use of off-farm inputs, are attracting considerable research interest in the UK. The objectives of these systems experiments are to compare their financial performance with that from conventional or current farming practices. To date, this comparison has taken little account of any environmental benefits (or disbenefits) of the two systems. The objective of this paper is to review the assessment methodologies available for the analysis of environmental impacts. To illustrate the results of this exercise, the methodology and environmental indicators chosen are then applied to data from one of the LINK Integrated Farming Systems experimental sites. Data from the Pathhead site in Southern Scotland are used to evaluate the use of invertebrates and nitrate loss as environmental indicators within IAFS. The results suggest that between 1992 and 1995 the biomass of earthworms fell by 28 kg per hectare on the integrated rotation and rose by 31 kg per hectare on the conventional system. This led to environmental costs ranging between £2.24 and £13.44 per hectare for the integrated system and gains of between £2.48 and £14.88 for the conventional system. In terms of nitrate, the integrated system had an estimated loss of £72.21 per hectare in comparison to £149.40 per hectare on the conventional system. Conclusions are drawn about the advantages and disadvantages of this type of analytical framework. Keywords: Farming systems; IAFS; Environmental valuation; Economics; Earthworms; Nitrates; Soil fauna
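The quoted per-hectare figures are consistent with a single valuation range per kilogram of earthworm biomass, a reading inferred here rather than stated in the abstract:

```python
# Both the integrated-system costs (28 kg/ha lost) and the
# conventional-system gains (31 kg/ha gained) imply the same
# valuation range per kg of earthworm biomass:
for delta_kg, low_gbp, high_gbp in [(28, 2.24, 13.44), (31, 2.48, 14.88)]:
    print(f"{low_gbp / delta_kg:.2f} to {high_gbp / delta_kg:.2f} GBP/kg")
# -> 0.08 to 0.48 GBP/kg in both cases
```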

Relevance:

30.00%

Publisher:

Abstract:

The accurate assessment of dietary exposure is important in investigating associations between diet and disease. Research in nutritional epidemiology, which has resulted in a large amount of information on associations between diet and chronic diseases in the last decade, relies on accurate assessment methods to identify these associations. However, most dietary assessment instruments rely to some extent on self-reporting, which is prone to systematic bias affected by factors such as age, gender, social desirability and approval. Nutritional biomarkers are not affected by these and therefore provide an additional, alternative method to estimate intake. However, there are also some limitations in their application: they are affected by inter-individual variations in metabolism and other physiological factors, and they are often limited to estimating intake of specific compounds and not entire foods. It is therefore important to validate nutritional biomarkers to determine specific strengths and limitations. In this perspective paper, criteria for the validation of nutritional markers and future developments are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Controllers for feedback substitution schemes demonstrate a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subjected to arbitrary noise power gain and robustness constraints, a Pareto-front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a genetic algorithm (GA) to perform a multi-objective optimization for the controller weights (MOGA). A clonal selection algorithm is used to further provide a directed search of the GA towards the Pareto front. We demonstrate that with the proposed methodology, it is possible to design higher order controllers with superior performance in terms of response time, noise power gain and robustness.
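A Pareto front over candidate controllers can be extracted with a simple non-dominance filter; the objective tuple layout below is illustrative and not tied to the paper's LSDP weights or MOGA encoding.

```python
def pareto_front(designs):
    """Filter candidate controllers to the Pareto front. Each design is
    (name, objectives) with all objectives to be minimized, e.g.
    (response_time, noise_power_gain, -robustness_margin)."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [(name, obj) for name, obj in designs
            if not any(dominates(other, obj)
                       for _, other in designs if other != obj)]
```

In the paper's scheme, the MOGA population plays the role of `designs` and the clonal selection step biases the search toward this front rather than filtering it after the fact.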

Relevance:

30.00%

Publisher:

Abstract:

A new technique for objective classification of boundary layers is applied to ground-based vertically pointing Doppler lidar and sonic anemometer data. The observed boundary layer has been classified into nine different types based on those in the Met Office ‘Lock’ scheme, using vertical velocity variance and skewness, along with attenuated backscatter coefficient and surface sensible heat flux. This new probabilistic method has been applied to three years of data from Chilbolton Observatory in southern England and a climatology of boundary-layer type has been created. A clear diurnal cycle is present in all seasons. The most common boundary-layer type is stable with no cloud (30.0% of the dataset). The most common unstable type is well mixed with no cloud (15.4%). Decoupled stratocumulus is the third most common boundary-layer type (10.3%) and cumulus under stratocumulus occurs 1.0% of the time. The occurrence of stable boundary-layer types is much higher in the winter than the summer and boundary-layer types capped with cumulus cloud are more prevalent in the warm seasons. The most common diurnal evolution of boundary-layer types, occurring on 52 days of our three-year dataset, is that of no cloud with the stability changing from stable to unstable during daylight hours. These results are based on 16393 hours, 62.4% of the three-year dataset, of diagnosed boundary-layer type. This new method is ideally suited to long-term evaluation of boundary-layer type parametrisations in weather forecast and climate models.
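A toy hard-label sketch in the spirit of the classifier, covering only a few of the nine types; all thresholds are placeholders, and the published method assigns probabilities rather than hard labels.

```python
def boundary_layer_type(w_variance, w_skewness, cloud_detected, heat_flux):
    """Illustrative rules over the same inputs the classifier uses:
    surface sensible heat flux for stability, vertical-velocity variance
    and skewness for the mixing regime, attenuated backscatter for cloud."""
    if heat_flux < 0:                           # stable surface layer
        return "stable, cloudy" if cloud_detected else "stable, no cloud"
    if not cloud_detected:
        return ("well mixed, no cloud" if w_variance > 0.1
                else "stable, no cloud")
    # negative skewness suggests cloud-top-driven (decoupled) mixing
    return "decoupled stratocumulus" if w_skewness < 0 else "cumulus-capped"
```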