57 results for "optimization of production processes"


Relevance:

100.00%

Publisher:

Abstract:

In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices, obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: the sequential forward selection, the sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of the above dimensionality reduction methods shows that genetic algorithms result in lower dimension feature vectors and improved classification performance.
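The greedy wrapper search compared above can be illustrated with a minimal sketch of sequential forward selection. The `toy_score` function below is an invented stand-in for the cross-validated accuracy of the paper's neural network classifiers, used only to make the sketch runnable.

```python
# Minimal sketch of sequential forward selection (SFS), one of the
# three feature selection techniques compared in the abstract.
# `score` stands in for classifier performance on a feature subset.

def sequential_forward_selection(features, score, k):
    """Greedily grow a feature subset, at each step adding the
    feature that improves the score the most; stop early when no
    candidate improves the score, or when k features are chosen."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break  # no candidate improves the score
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy scoring function (invented): rewards picking features 0 and 2,
# and mildly penalizes any other feature in the subset.
def toy_score(subset):
    target = {0, 2}
    return len(target & set(subset)) - 0.1 * len(set(subset) - target)

print(sequential_forward_selection(range(5), toy_score, k=3))  # [0, 2]
```

Sequential floating forward selection extends this loop with backward removal steps after each addition, which lets it escape some of the greedy choices shown here.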

Relevance:

100.00%

Publisher:

Abstract:

Current advanced cloud infrastructure management solutions allow scheduling actions that dynamically change the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose a dynamically generated scaling model for the VMs hosting the services of distributed applications, which reacts to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing Service Level Agreements (SLAs) that control the scaling of distributed services, combining data analysis mechanisms with application benchmarking across multiple VM configurations. By processing the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical SLA parameters. By combining this set of predictor metrics with a heuristic for selecting appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems. We show how dynamically generated SLAs can successfully control the management of distributed services scaling.
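To make the idea of an inferred scaling rule concrete, the sketch below applies a hypothetical threshold rule to a predictor metric. The metric name, thresholds, and step size are invented for illustration; they are not the metrics or rules discovered in the paper.

```python
# Hypothetical SLA scaling rule: if a predictor metric (here, average
# CPU utilization across a service's VMs) exceeds a scale-out
# threshold, add a VM; if it drops below a scale-in threshold, remove
# one, never going under a minimum. All values are invented.

def scaling_decision(cpu_util, n_vms, scale_out_at=0.8,
                     scale_in_at=0.3, min_vms=1):
    """Return the target VM count for the next control interval."""
    if cpu_util > scale_out_at:
        return n_vms + 1          # scale out: workload too high
    if cpu_util < scale_in_at and n_vms > min_vms:
        return n_vms - 1          # scale in: capacity underused
    return n_vms                  # within bounds: no change

print(scaling_decision(0.9, 2))   # 3 (scale out)
print(scaling_decision(0.1, 2))   # 1 (scale in)
print(scaling_decision(0.5, 2))   # 2 (no change)
```

In the paper's setting, the thresholds would be derived from the benchmarking data rather than fixed by hand, and the rule would be evaluated per service type.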

Relevance:

100.00%

Publisher:

Abstract:

This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, with gains in accuracy coming at the price of computational time. The contribution of this work is two-fold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
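For context, the classical expected improvement (EI) criterion that the proposed quantile-based criterion generalizes can be computed in closed form from the kriging mean and standard deviation. The sketch below implements the standard noise-free EI formula for minimization, not the paper's quantile variant.

```python
# Classical expected improvement for kriging-based minimization.
# Given the kriging mean m(x), standard deviation s(x), and current
# best observation f_min:
#   EI(x) = (f_min - m) * Phi(u) + s * phi(u),  u = (f_min - m) / s
# where Phi and phi are the standard normal CDF and PDF.
from math import erf, exp, pi, sqrt

def expected_improvement(m, s, f_min):
    if s <= 0.0:
        return max(f_min - m, 0.0)  # no uncertainty: plain improvement
    u = (f_min - m) / s
    Phi = 0.5 * (1.0 + erf(u / sqrt(2.0)))    # standard normal CDF
    phi = exp(-0.5 * u * u) / sqrt(2.0 * pi)  # standard normal PDF
    return (f_min - m) * Phi + s * phi

# A point predicted at the current best but with high uncertainty
# still has positive EI, which is what drives exploration.
print(round(expected_improvement(m=0.0, s=1.0, f_min=0.0), 4))  # 0.3989
```

The paper's criterion replaces terms of this formula with kriging quantiles so that observations of heterogeneous precision can be compared on a common footing.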

Relevance:

100.00%

Publisher:

Abstract:

When observers are presented with two visual targets appearing in the same position in close temporal proximity, a marked reduction in detection performance for the second target has often been reported: the so-called attentional blink (AB) phenomenon. Several studies found a decrement in P300 amplitudes during the attentional blink period similar to that observed in detection performance for the second target. However, whether the parallel courses of second-target performance and the corresponding P300 amplitudes resulted from the same underlying mechanisms remained unclear. The aim of our study was therefore to investigate whether the mechanisms underlying the AB can be assessed by fixed-links modeling, and whether this kind of assessment would reveal the same, or at least related, processes in the behavioral and electrophysiological data. On both levels of observation, three highly similar processes could be identified: an increasing, a decreasing, and a u-shaped trend. Corresponding processes from the behavioral and electrophysiological data were substantially correlated, with the two u-shaped trends showing the strongest association with each other. Our results provide evidence for the assumption that the same mechanisms underlie attentional blink task performance at the electrophysiological and behavioral levels, as assessed by fixed-links models.
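Fixed-links models decompose observed variables into latent variables with fixed loadings, such as constant, linear, and quadratic trends. As a rough stdlib-only analogue (actual fixed-links models are structural equation models fitted with dedicated software), the sketch below projects an invented lag curve onto orthogonal constant, linear, and u-shaped components; the data are made up for illustration.

```python
# Project a performance curve y (one value per lag) onto three
# mutually orthogonal trend components: constant, linear, and
# centered quadratic (u-shaped). For symmetric, equally spaced
# lags these three vectors are orthogonal, so each coefficient is
# a simple projection <p, y> / <p, p>.

def trend_coefficients(y):
    n = len(y)
    t = [i - (n - 1) / 2 for i in range(n)]          # centered lags
    q = [ti * ti for ti in t]
    qbar = sum(q) / n
    basis = [[1.0] * n, t, [qi - qbar for qi in q]]  # orthogonal set
    return [sum(b * yi for b, yi in zip(p, y)) / sum(b * b for b in p)
            for p in basis]

# Invented second-target accuracy across five lags, dipping in the
# middle as in a typical attentional blink curve.
coeffs = trend_coefficients([0.9, 0.6, 0.5, 0.6, 0.9])
print(coeffs)  # positive third coefficient indicates a u-shaped trend
```

A fixed-links analysis additionally treats these trend components as latent variables with individual-difference variance, which is what allows correlating the behavioral and electrophysiological trends.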

Relevance:

100.00%

Publisher:

Abstract:

This multi-phase study examined the influence of retrieval processes on children's metacognitive processes in relation to, and in interaction with, achievement level and age. First, N = 150 high- and low-achieving 9/10- and 11/12-year-olds watched an educational film and predicted their test performance. The children then solved a cloze test on the film content, including answerable and unanswerable items, and gave confidence judgments for every answer. Finally, the children withdrew answers that they believed to be incorrect. All children showed adequate metacognitive processes before and during test taking, with 11/12-year-olds outperforming 9/10-year-olds when characteristics of ongoing retrieval processes were considered. As to the influence of achievement level, high achievers proved to be more accurate than low achievers in their metacognitive monitoring and control. Results suggest that both cognitive resources (operationalized through achievement level) and mnemonic experience (assessed through age) fuel metacognitive development. Nevertheless, when facing higher demands on retrieval processes, experience seems to play the more important role.

Relevance:

100.00%

Publisher:

Abstract:

A measurement of the ZZ production cross section in proton-proton collisions at √s = 7 TeV using data recorded by the ATLAS experiment at the Large Hadron Collider is presented. In a data sample corresponding to an integrated luminosity of 4.6 fb⁻¹ collected in 2011, events are selected that are consistent either with two Z bosons decaying to electrons or muons, or with one Z boson decaying to electrons or muons and a second Z boson decaying to neutrinos. The ZZ(*) → ℓ⁺ℓ⁻ℓ′⁺ℓ′⁻ and ZZ → ℓ⁺ℓ⁻νν̄ cross sections are measured in restricted phase-space regions. These results are then used to derive the total cross section for ZZ events produced with both Z bosons in the mass range 66 to 116 GeV, σ_tot(ZZ) = 6.7 ± 0.7 (stat.) +0.4/−0.3 (syst.) ± 0.3 (lumi.) pb, which is consistent with the Standard Model prediction of 5.89 +0.22/−0.18 pb calculated at next-to-leading order in QCD. The normalized differential cross sections in bins of various kinematic variables are presented. Finally, the differential event yield as a function of the transverse momentum of the leading Z boson is used to set limits on anomalous neutral triple gauge boson couplings in ZZ production.

Relevance:

100.00%

Publisher:

Abstract:

Ecosystem management policies increasingly emphasize provision of multiple, as opposed to single, ecosystem services. Management for such "multifunctionality" has stimulated research into the role that biodiversity plays in providing desired rates of multiple ecosystem processes. Positive effects of biodiversity on indices of multifunctionality are consistently found, primarily because species that are redundant for one ecosystem process under a given set of environmental conditions play a distinct role under different conditions or in the provision of another ecosystem process. Here we show that the positive effects of diversity (specifically community composition) on multifunctionality indices can also arise from a statistical fallacy analogous to Simpson's paradox (where aggregating data obscures causal relationships). We manipulated soil faunal community composition in combination with nitrogen fertilization of model grassland ecosystems and repeatedly measured five ecosystem processes related to plant productivity, carbon storage, and nutrient turnover. We calculated three common multifunctionality indices based on these processes and found that the functional complexity of the soil communities had a consistent positive effect on the indices. However, only two of the five ecosystem processes also responded positively to increasing complexity, whereas the other three responded neutrally or negatively. Furthermore, none of the individual processes responded to both the complexity and the nitrogen manipulations in a manner consistent with the indices. Our data show that multifunctionality indices can obscure relationships that exist between communities and key ecosystem processes, leading us to question their use in advancing theoretical understanding, and in informing management decisions, about how biodiversity is related to the provision of multiple ecosystem services.
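The aggregation problem described above can be shown with invented numbers: an averaging-style multifunctionality index can rise with community complexity even when one of the underlying processes declines. The process names and values below are made up for the sketch; they are not the study's measurements.

```python
# Toy illustration of how an averaged multifunctionality index can
# mask opposing responses of individual ecosystem processes.
# Standardized process rates at three soil community complexity
# levels (low -> high); all numbers invented.
productivity  = [0.2, 0.5, 0.9]   # responds positively to complexity
carbon_store  = [0.4, 0.4, 0.4]   # responds neutrally
nutrient_turn = [0.8, 0.7, 0.6]   # responds negatively

def averaging_index(*processes):
    """Mean of standardized process rates at each complexity level,
    one common form of multifunctionality index."""
    return [sum(vals) / len(vals) for vals in zip(*processes)]

index = averaging_index(productivity, carbon_store, nutrient_turn)
print(index)  # rises with complexity despite the declining process
```

Because the index only reports the mean, its positive trend is compatible with neutral or negative trends in individual processes, which is the Simpson's-paradox-like fallacy the abstract warns against.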