895 results for "multiple approach"


Relevance: 40.00%

Abstract:

[EN] In this paper we present a variational technique for the reconstruction of 3D cylindrical surfaces. Roughly speaking, by a cylindrical surface we mean a surface that can be parameterized, using the projection on a cylinder, in terms of two coordinates representing the displacement and the angle in a cylindrical coordinate system, respectively. The starting point for our method is a set of different views of a cylindrical surface, together with precomputed disparity map estimations between pairs of images. The proposed variational technique is based on an energy minimization that balances, on the one hand, the regularity of the cylindrical function given by the distance of the surface points to the cylinder axis and, on the other hand, the distance between the projection of the surface points on the images and the location expected from the precomputed disparity maps. One interesting advantage of this approach is that we regularize the 3D surface by means of a bi-dimensional minimization problem. We show some experimental results for large stereo sequences.

Relevance: 40.00%

Abstract:

[EN] In recent years we have developed several methods for 3D reconstruction. We began with the problem of reconstructing a 3D scene from a stereoscopic pair of images, developing methods based on energy functionals that produce dense disparity maps while preserving discontinuities at image boundaries. We then moved to the problem of reconstructing a 3D scene from multiple views (more than two). The multiple-view reconstruction method relies on the stereoscopic one: for every pair of consecutive images we estimate a disparity map, and then we apply a robust method that searches for good correspondences through the sequence of images. Recently we have proposed several methods for 3D surface regularization, a postprocessing step necessary for smoothing the final surface, which can be affected by noise or mismatched correspondences. These regularization methods are interesting because they use information from the reconstruction process and not only from the 3D surface. We have tackled all these problems from an energy minimization approach: we derive the associated Euler-Lagrange equation of the energy functional, and we approach the solution of the underlying partial differential equation (PDE) using a gradient descent method.
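The energy-minimization pipeline this abstract describes can be illustrated with a minimal sketch: a 1D quadratic smoothing energy whose Euler-Lagrange gradient is followed by explicit gradient descent. This is only an analogue of the approach under stated assumptions, not the authors' actual functional; the function names and the parameters lam and tau are hypothetical.

```python
def denoise_gd(f, lam=1.0, tau=0.1, iters=500):
    """Minimize E(u) = sum_i (u_i - f_i)^2 + lam * sum_i (u_{i+1} - u_i)^2
    by gradient descent.  The i-th component of the gradient (the discrete
    Euler-Lagrange expression) is 2(u_i - f_i) + 2*lam*(left - right)."""
    u = list(f)
    n = len(u)
    for _ in range(iters):
        g = []
        for i in range(n):
            left = u[i] - u[i - 1] if i > 0 else 0.0      # backward difference
            right = u[i + 1] - u[i] if i < n - 1 else 0.0  # forward difference
            g.append(2.0 * (u[i] - f[i]) + 2.0 * lam * (left - right))
        u = [ui - tau * gi for ui, gi in zip(u, g)]
    return u

def energy(u, f, lam=1.0):
    """Evaluate the energy being minimized (fidelity + regularity)."""
    fid = sum((a - b) ** 2 for a, b in zip(u, f))
    reg = sum((u[i + 1] - u[i]) ** 2 for i in range(len(u) - 1))
    return fid + lam * reg
```

The step size tau must stay below 2 divided by the largest Hessian eigenvalue (here at most 2 + 8*lam) for the descent to be stable, mirroring the time-step restriction of the explicit PDE scheme.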

Relevance: 40.00%

Abstract:

In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected in many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies that avoid the bias carried by aggregated analyses. Starting from the collected disease counts, and calculating expected disease counts by means of reference population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low, either because of the low population underlying the area or the rarity of the disease under study. Disease mapping models, and other techniques for screening disease rates across the map to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without leaving the preliminary study perspective that an analysis of SMR indicators is asked for. We implement control of the FDR, a quantity largely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous.
The Bayesian paradigm offers a way to overcome the inappropriateness of p-value based methods. Another peculiarity of the present work is to propose a hierarchical full Bayesian model for FDR estimation in testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié (1991) model, often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood typical of a hierarchical Bayesian model allows a single test (i.e. a test in a single area) to be evaluated by means of all the observations in the map under study, rather than just by the single observation. This improves test power in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i's themselves. FDR-hat can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat is not higher than a prefixed value; we call these FDR-hat based decision (or selection) rules. The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation of the FDR produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié (1991) model.
A simulation study was set up to evaluate the model's accuracy in FDR estimation, the sensitivity and specificity of the decision rule, and the goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider the FDR estimation in sets constituted by all the b_i's lower than a threshold t. We show graphs of FDR-hat and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. Varying the threshold, one can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (by the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against FDR-hat, we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the over-smoothing of the relative risk estimates, we compare box-plots of such estimates in high-risk areas (known by simulation), obtained with both our model and the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence conservative FDR control) in the scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aim. In such scenarios we have good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low, but specificity is high. In these scenarios the use of an FDR-hat = 0.05 or FDR-hat = 0.10 based selection rule can be suggested.
In cases where the number of true alternative hypotheses (the number of truly high-risk areas) is small, FDR values up to 0.15 are also well estimated, and FDR-hat = 0.15 based decision rules gain power while maintaining a high specificity. On the other hand, in the scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of an FDR-hat = 0.05 based decision rule. In such scenarios, FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 based decision rules cannot be suggested, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in the non-small area and large risk level scenarios. A case study is finally presented to show how the method can be used in epidemiology.
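The FDR-hat based selection rule described in this abstract reduces to averaging the posterior null probabilities over a candidate rejection set. A minimal sketch, assuming only that the b_i's are given (e.g. from an MCMC run); the function name is ours:

```python
def fdr_select(b, alpha=0.05):
    """Given posterior null probabilities b_i (one per area), flag as
    high-risk as many areas as possible while keeping the estimated FDR
    -- the average b_i over the rejected set -- at or below alpha.
    Returns the indices of the selected areas."""
    order = sorted(range(len(b)), key=lambda i: b[i])  # most suspect first
    selected, running = [], 0.0
    for k, i in enumerate(order, start=1):
        running += b[i]
        if running / k <= alpha:   # running mean = FDR-hat of the first k areas
            selected = order[:k]
        else:
            break  # mean of sorted values is nondecreasing, so we can stop
    return selected
```

Because the b_i's are sorted in ascending order, the running mean never decreases, so the largest admissible rejection set is found in a single pass.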

Relevance: 40.00%

Abstract:

As more investigations into factors affecting the quality of life of patients with multiple sclerosis (MS) are undertaken, it is becoming increasingly apparent that certain comorbidities and associated symptoms commonly found in these patients differ in incidence, pathophysiology and other factors compared with the general population. Many of these MS-related symptoms are frequently ignored in assessments of disease status and are often not considered to be associated with the disease. Research into how such comorbidities and symptoms can be diagnosed and treated within the MS population is lacking. This information gap adds further complexity to disease management and represents an unmet need in MS, particularly as early recognition and treatment of these conditions can improve patient outcomes. In this manuscript, we sought to review the literature on the comorbidities and symptoms of MS and to summarize the evidence for treatments that have been or may be used to alleviate them.

Relevance: 40.00%

Abstract:

A combinatorial protocol (CP) is introduced here and interfaced with multiple linear regression (MLR) for variable selection. The efficiency of CP-MLR rests primarily on restricting the entry of correlated variables into the model development stage. It has been used for the analysis of the Selwood et al. data set [16], and the obtained models are compared with those reported from the GFA [8] and MUSEUM [9] approaches. For this data set, CP-MLR identified three highly independent models (27, 28 and 31) with Q2 values in the range 0.632-0.518; these models are also divergent and unique. Even though the present study does not share any models with the GFA [8] and MUSEUM [9] results, several descriptors are common to all these studies, including the present one. A simulation was also carried out on the same data set to explain model formation in CP-MLR. The results demonstrate that the proposed method should be able to offer solutions for data sets with 50 to 60 descriptors in a reasonable time frame. By carefully selecting the inter-parameter correlation cutoff values in CP-MLR, one can identify divergent models and handle data sets larger than the present one without excessive computer time.
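The correlation-cutoff filter at the heart of CP-MLR can be sketched as follows: enumerate descriptor subsets, keep only those whose pairwise correlations all stay below the cutoff, and pass each survivor to MLR. This is an illustration of the idea, not the published protocol; the function names and the 0.5 cutoff are ours.

```python
from itertools import combinations

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def admissible_subsets(X, size, r_cut=0.5):
    """CP-style filter: enumerate descriptor subsets of a given size whose
    pairwise |r| all stay below r_cut; each surviving subset would then be
    fitted by MLR.  X maps descriptor name -> list of values."""
    names = sorted(X)
    r = {(a, b): pearson(X[a], X[b]) for a, b in combinations(names, 2)}
    out = []
    for combo in combinations(names, size):
        if all(abs(r[(a, b)]) < r_cut for a, b in combinations(combo, 2)):
            out.append(combo)
    return out
```

Tightening r_cut prunes more of the combinatorial search space, which is what keeps the 50-60 descriptor case tractable.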

Relevance: 40.00%

Abstract:

When observers are presented with two visual targets appearing in the same position in close temporal proximity, a marked reduction in detection performance for the second target has often been reported, the so-called attentional blink (AB) phenomenon. Several studies found a decrement of P300 amplitudes during the attentional blink period similar to that observed in detection performance for the second target. However, whether the parallel courses of second-target performance and the corresponding P300 amplitudes result from the same underlying mechanisms remained unclear. The aim of our study was therefore to investigate whether the mechanisms underlying the AB can be assessed by fixed-links modeling, and whether this kind of assessment would reveal the same, or at least related, processes in the behavioral and electrophysiological data. On both levels of observation three highly similar processes could be identified: an increasing, a decreasing and a U-shaped trend. Corresponding processes from the behavioral and electrophysiological data were substantially correlated, with the two U-shaped trends showing the strongest association with each other. Our results support the assumption that the same mechanisms underlie attentional blink task performance at the electrophysiological and behavioral levels, as assessed by fixed-links models.

Relevance: 40.00%

Abstract:

The reliability of a measurement refers to unsystematic error in observed responses. Investigating the prevalence of random error in stated estimates of willingness to pay (WTP) is important for understanding why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported by a review of contingent valuation reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better handled by using multiple indicators to measure the latent WTP distribution. This latent variable approach is demonstrated with data obtained from a WTP study for stormwater pollution abatement. Attitude variables were employed as a way of assessing the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitudes of the WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies. (C) 2003 Elsevier B.V. All rights reserved.
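The abstract does not name a specific reliability coefficient for its multiple-indicator approach; as one common illustration, Cronbach's alpha measures how consistently a set of indicators taps the same latent quantity. A sketch under that assumption (the latent-variable models actually used in the study would be fitted with structural equation software):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of indicator columns (each a list of
    scores from the same respondents):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Alpha approaches 1 when the indicators move together (high reliability) and 0 when their covariation vanishes, which is the intuition behind using several attitude items rather than a single WTP question.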

Relevance: 40.00%

Abstract:

Petroleum pipelines are the nervous system of the oil industry, transporting crude oil from sources to refineries and petroleum products from refineries to demand points. The efficient operation of these pipelines therefore determines the effectiveness of the entire business. Pipeline route selection plays a major role in designing an effective pipeline system, as the health of a pipeline depends on its terrain. Present practice in route selection for petroleum pipelines is governed by factors such as shortest distance, constructability, minimal effects on the environment, and approachability. Although this reduces capital expenditure, it often proves uneconomical once life cycle costing is considered. This study presents a route selection model based on the Analytic Hierarchy Process (AHP), a multiple attribute decision making technique. AHP considers all the above factors, along with operability and maintainability, interactively. The approach is demonstrated through a case study of pipeline route selection from an Indian perspective. A cost-benefit comparison of the shortest route (conventionally selected) and the optimal route establishes the effectiveness of the model.
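The AHP step the abstract relies on turns a matrix of pairwise factor comparisons into priority weights. A minimal sketch using the row geometric-mean method, a standard approximation to the principal eigenvector (the paper does not state which computation it uses, and the function name is ours):

```python
def ahp_priorities(M):
    """Approximate AHP priority weights from a pairwise comparison matrix M,
    where M[i][j] says how much more important criterion i is than j
    (and M[j][i] = 1/M[i][j]).  Row geometric means are normalized to sum
    to 1; for a perfectly consistent matrix this equals the eigenvector."""
    n = len(M)
    gm = []
    for row in M:
        p = 1.0
        for v in row:
            p *= v
        gm.append(p ** (1.0 / n))   # geometric mean of the row
    s = sum(gm)
    return [g / s for g in gm]
```

For route selection, each criterion's weight would then multiply the candidate routes' scores on that criterion, and the weighted sums rank the routes.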

Relevance: 40.00%

Abstract:

Resource allocation is one of the major decision problems arising in higher education. Resources must be allocated optimally so that the performance of universities can be improved. This paper applies an integrated multiple criteria decision making approach to the resource allocation problem. In the approach, the Analytic Hierarchy Process (AHP) is first used to determine the priority, or relative importance, of proposed projects with respect to the goals of the university. Then, a Goal Programming (GP) model incorporating the AHP priority, system, and resource constraints is formulated to select the best set of projects without exceeding the limited available resources. The projects include 'hardware' (tangible university infrastructure) and 'software' (intangible effects that can benefit the university, its members, and its students). Two commercial packages are used: Expert Choice for determining the AHP priority ranking of the projects, and LINDO for solving the GP model. Copyright © 2007 Inderscience Enterprises Ltd.
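The selection step can be pictured with a much-simplified stand-in for the GP model: choose the subset of projects with the highest total AHP priority that fits the budget. The paper's actual model is solved with LINDO and handles multiple goals and constraints; the brute-force search below only illustrates the priority-under-resources trade-off and uses made-up names.

```python
from itertools import combinations

def select_projects(priority, cost, budget):
    """Pick the subset of projects with the highest total AHP priority
    without exceeding the budget.  Brute force over all subsets, which is
    fine for a handful of candidate projects."""
    names = list(priority)
    best, best_score = (), 0.0
    for k in range(len(names) + 1):
        for combo in combinations(names, k):
            c = sum(cost[p] for p in combo)
            s = sum(priority[p] for p in combo)
            if c <= budget and s > best_score:
                best, best_score = combo, s
    return set(best)
```

A real GP formulation would instead minimize weighted deviations from several goals at once (budget, staffing, space), which is why the paper pairs AHP with GP rather than a single objective.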

Relevance: 40.00%

Abstract:

There are still few explanations of the micro-level practices by which top managers influence employee commitment to multiple strategic goals. This paper argues that, through their language, top managers can construct a context for commitment to multiple strategic goals. We therefore propose a rhetoric-in-context approach to illuminate some of the micro practices through which top managers influence employee commitment. Based upon an empirical study of the rhetorical practices through which top managers influence academic commitment to multiple strategic goals in university contexts, we demonstrate relationships between rhetoric and context. Specifically, we show that rhetorical influences over commitment to multiple goals are associated with the historical context for multiple goals, the degree to which top managers' rhetoric instantiates a change in that context, and the internal consistency of the rhetorical practices used by top managers. Copyright © 2007 SAGE Publications.

Relevance: 40.00%

Abstract:

The importance of the changeover process in the manufacturing industry is becoming widely recognised. Changeover is the complete process of switching from the manufacture of one product to that of an alternative product, until the specified production and quality rates are reached. Initiatives to improve changeover exist in industry, as better changeover processes typically contribute to improved quality performance. A high-quality and reliable changeover process can be achieved through continuous or radical improvements. This research examines the changeover process of Saudi Arabian manufacturing firms, because Saudi Arabia's government is focused on expanding GDP and increasing the number of export manufacturing firms; furthermore, it is encouraging foreign manufacturing firms to invest within Saudi Arabia. These initiatives require that Saudi manufacturing businesses develop their changeover practice in order to compete in the market and achieve the government's objectives. The aim of this research is therefore to discover the current status of changeover process implementation in Saudi Arabian manufacturing businesses. To achieve this aim, the main objective is to develop a conceptual model for understanding and examining the effectiveness of the changeover process within Saudi Arabian manufacturing firms, facilitating identification of the activities that affect the reliability and quality of the process. To provide a comprehensive understanding of this area, the research first explores the concept of quality management and its relationship to firm performance and to the performance of manufacturing changeover. An extensive body of literature on lean manufacturing and changeover practice was reviewed.
A research conceptual model was identified based on this review, focused on providing high-quality and reliable manufacturing changeover processes during set-up in a dynamic environment. Exploratory research was conducted in sample Saudi manufacturing firms to understand the features of the changeover process within the manufacturing sector, and as a basis for modifying the proposed conceptual model. Qualitative methods were employed, with semi-structured interviews, direct observations and documentation, in order to understand the real situation, such as actual daily practice and the current status of the changeover process in the field. The research instrument, the Changeover Effectiveness Assessment Tool (CEAT), was developed to evaluate changeover practices. A pilot study was conducted to examine the CEAT proposed for the main research; consequently, the conceptual model was modified and the CEAT improved in response to the pilot study findings. Case studies were then conducted within eight Saudi manufacturing businesses, assessing the implementation of manufacturing changeover practice in the lighting and medical products sectors. These two sectors were selected because their operation strategy is batch production and because they fulfilled the research sampling strategy. The outcomes of the research improved the conceptual model, ultimately to facilitate firms' adoption and rapid implementation of a high-quality and reliable changeover during the set-up process. The main finding of this research is that the Quality factors scored the lowest levels compared to the other factors, which are People, Process and Infrastructure. This research contributes by enabling Saudi businesses to implement the changeover process through adoption of the conceptual model; in addition, guidelines for facilitating implementation are provided in this thesis.
Therefore, this research provides insight to enable the Saudi manufacturing industry to be more responsive to rapidly changing customer demands.
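The finding that Quality scored lowest among the four factors implies some per-factor roll-up of assessment ratings. A hypothetical sketch of such a roll-up (the thesis' actual CEAT scoring scheme is not given here; the function, the 1-5 scale and the factor names as dictionary keys are our assumptions):

```python
def ceat_summary(ratings):
    """Hypothetical CEAT-style roll-up: average the ratings collected for
    each factor (e.g. on a 1-5 scale) and report the weakest factor, which
    would be the first target for changeover improvement."""
    means = {factor: sum(v) / len(v) for factor, v in ratings.items()}
    weakest = min(means, key=means.get)
    return means, weakest
```

Applied across the eight case-study firms, a summary like this would surface the pattern reported above, with Quality trailing People, Process and Infrastructure.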