964 results for Weighted sum


Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: We aimed to evaluate the added value of diffusion-weighted imaging (DWI) over standard magnetic resonance imaging (MRI) for detecting post-treatment cervical cancer recurrence. The detection accuracy of T2-weighted (T2W) images was compared with that of T2W MRI combined with either dynamic contrast-enhanced (DCE) MRI or DWI. METHODS: Thirty-eight women with clinically suspected uterine cervical cancer recurrence more than six months after treatment completion were examined with 1.5 Tesla MRI including T2W, DCE, and DWI sequences. Disease was confirmed histologically and correlated with MRI findings. The diagnostic performance of T2W imaging and of its combination with either DCE or DWI was analyzed; sensitivity, positive predictive value, and accuracy were calculated. RESULTS: Thirty-six women had histologically proven recurrence. The accuracy for recurrence detection was 80% with T2W/DCE MRI and 92.1% with T2W/DWI. The addition of DCE sequences did not significantly improve the diagnostic ability of T2W imaging, and this sequence combination produced two false-positive and seven false-negative results. The T2W/DWI combination yielded a positive predictive value of 100% and only three false negatives. CONCLUSION: The addition of DWI to T2W sequences considerably improved the diagnostic ability of MRI. Our results support the inclusion of DWI in the initial MRI protocol for the detection of cervical cancer recurrence, leaving DCE sequences as an option for uncertain cases.
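The reported T2W/DWI figures can be cross-checked from a confusion matrix. The sketch below (plain Python; the counts TP = 33, TN = 2, FP = 0, FN = 3 are reconstructed from the abstract's 38 patients, 36 recurrences, three false negatives and 100% PPV, and are not stated explicitly in it) reproduces the published 92.1% accuracy:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, positive predictive value and accuracy from counts."""
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, ppv, accuracy

# Counts consistent with the reported T2W/DWI results: 38 patients,
# 36 recurrences, three false negatives and no false positives.
sens, ppv, acc = diagnostic_metrics(tp=33, fp=0, tn=2, fn=3)
print(f"sensitivity {sens:.1%}, PPV {ppv:.0%}, accuracy {acc:.1%}")
```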

Relevance: 20.00%

Publisher:

Abstract:

Master's in Actuarial Science

Relevance: 20.00%

Publisher:

Abstract:

Obnoxious single-facility location models aim to find the best location for an undesired facility. "Undesired" is usually expressed in relation to so-called demand points, which represent locations hindered by the facility. Because obnoxious facility location models are, as a rule, multimodal, the standard techniques of convex analysis used for locating desirable facilities in the plane may become trapped in local optima instead of reaching the desired global optimum. It is assumed that having more optima coincides with being harder to solve. In this thesis the multimodality of obnoxious single-facility location models is investigated in order to determine which models pose challenging facility location problems and which are suitable for site selection. Selected for this purpose are the obnoxious facility models that appear most important in the literature: the maximin model, which maximizes the minimum distance from the demand points to the obnoxious facility; the maxisum model, which maximizes the sum of distances from the demand points to the facility; and the minisum model, which minimizes the sum of damage the facility inflicts on the demand points. All models are measured with Euclidean distances, and some also with the rectilinear distance metric. Furthermore, a suitable algorithm is selected for testing multimodality; of the algorithms tested in this thesis, Multistart is the most appropriate. A small numerical experiment shows that maximin models have on average the most optima, of which the model locating an obnoxious line segment has the most. Maxisum models have few optima and are thus not very hard to solve. Among the minisum models, those with the most optima are the models that take wind into account. In general, the generic models have fewer optima than their weighted versions. Models measured with the rectilinear norm have more solutions than the same models measured with the Euclidean norm. For the maximin models in the numerical example this can be explained because the shape of the norm coincides with a bound of the feasible area, so not all solutions are distinct optima. The difference found in the number of optima of the maxisum and minisum models cannot be explained by this phenomenon.
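As an illustration of why Multistart suits these multimodal problems, here is a minimal, hypothetical sketch (plain Python; made-up demand points in a unit-square feasible region, and simple hill climbing standing in for the local search) of a multistart solver for the Euclidean maximin model:

```python
import math
import random

# Hypothetical demand points inside a unit-square feasible region.
demand = [(0.2, 0.3), (0.8, 0.7), (0.5, 0.9), (0.1, 0.8)]

def maximin(x, y):
    """Maximin objective: distance to the nearest demand point."""
    return min(math.hypot(x - px, y - py) for px, py in demand)

def local_search(x, y, rng, step=0.1, iters=200):
    """Hill climbing with random perturbations, clamped to the square."""
    best = maximin(x, y)
    for _ in range(iters):
        nx = min(1.0, max(0.0, x + rng.uniform(-step, step)))
        ny = min(1.0, max(0.0, y + rng.uniform(-step, step)))
        if (cand := maximin(nx, ny)) > best:
            x, y, best = nx, ny, cand
        else:
            step *= 0.99  # shrink the neighbourhood when no improvement
    return x, y, best

def multistart(restarts=50, seed=0):
    """Multistart: run the local search from many random starting points
    and keep the best local optimum found."""
    rng = random.Random(seed)
    runs = [local_search(rng.random(), rng.random(), rng)
            for _ in range(restarts)]
    return max(runs, key=lambda r: r[2])

x, y, val = multistart()
print(f"best location ({x:.2f}, {y:.2f}), maximin distance {val:.2f}")
```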

Relevance: 20.00%

Publisher:

Abstract:

This report discusses the calculation of analytic second-order bias corrections for the maximum likelihood estimates (MLEs, for short) of the unknown parameters of distributions used in quality and reliability analysis. It is well known that MLEs are widely used to estimate the unknown parameters of probability distributions due to their various desirable properties; for example, MLEs are asymptotically unbiased, consistent, and asymptotically normal. However, many of these properties depend on extremely large sample sizes. Properties such as unbiasedness may not be valid for small or even moderate sample sizes, which are more common in real data applications. Therefore, bias-correction techniques for the MLEs are desired in practice, especially when the sample size is small. Two popular techniques to reduce the bias of the MLEs are the 'preventive' and 'corrective' approaches. Both can reduce the bias of the MLEs to order O(n⁻²), but the 'preventive' approach does not have an explicit closed-form expression; consequently, we mainly focus on the 'corrective' approach in this report. To illustrate the importance of bias correction in practice, we apply the bias-corrected method to two popular lifetime distributions: the inverse Lindley distribution and the weighted Lindley distribution. Numerical studies based on the two distributions show that the considered bias-correction technique is highly recommended over commonly used estimators without bias correction. Therefore, special attention should be paid when estimating the unknown parameters of probability distributions in scenarios where the sample size is small or moderate.
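The 'corrective' (Cox-Snell-type) idea can be illustrated on a distribution simpler than the Lindley family: for the exponential rate the analytic first-order bias is rate/n, so the corrected estimator is (n - 1)/n times the MLE. The simulation below is a sketch under that stand-in assumption, not the report's Lindley computations:

```python
import random
import statistics

def mle_rate(sample):
    """MLE of the exponential rate parameter: 1 / sample mean."""
    return 1.0 / statistics.mean(sample)

def corrected_rate(sample):
    """'Corrective' first-order bias correction: the analytic bias of the
    exponential-rate MLE is rate/n, so subtracting a plug-in estimate of
    it gives (n - 1)/n times the MLE."""
    n = len(sample)
    return (n - 1) / n * mle_rate(sample)

rng = random.Random(42)
true_rate, n, reps = 2.0, 10, 20000
mles, corr = [], []
for _ in range(reps):
    sample = [rng.expovariate(true_rate) for _ in range(n)]
    mles.append(mle_rate(sample))
    corr.append(corrected_rate(sample))

mean_mle, mean_corr = statistics.mean(mles), statistics.mean(corr)
print(f"mean MLE       {mean_mle:.3f}")   # inflated: E[MLE] = n/(n-1) * rate ~ 2.22
print(f"mean corrected {mean_corr:.3f}")  # close to the true rate 2.0
```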

Relevance: 20.00%

Publisher:

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the various approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables differing in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate result, accommodating non-stationary coefficient behavior and demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality of life.
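Geographically weighted regression fits a separate weighted least-squares model at each location, with kernel weights that decay with distance, so coefficients may vary over space. A minimal one-predictor sketch (plain Python; synthetic data in which the true slope drifts with location, and an arbitrary bandwidth value) looks like this:

```python
import math
import random

# Synthetic data: the true slope of y on x drifts with location,
# which a single global OLS fit would average away.
rng = random.Random(1)
pts = []
for i in range(60):
    loc = i / 20.0                          # locations 0.0 .. 2.95
    x = rng.uniform(0, 5)
    y = (1 + loc) * x + rng.gauss(0, 0.1)   # local slope = 1 + loc
    pts.append((loc, x, y))

def gwr_slope(at, bandwidth=0.5):
    """Local weighted least-squares slope at `at`, with Gaussian kernel
    weights that decay with distance from `at`."""
    w = [math.exp(-((loc - at) / bandwidth) ** 2) for loc, _, _ in pts]
    sw = sum(w)
    xbar = sum(wi * x for wi, (_, x, _) in zip(w, pts)) / sw
    ybar = sum(wi * y for wi, (_, _, y) in zip(w, pts)) / sw
    num = sum(wi * (x - xbar) * (y - ybar) for wi, (_, x, y) in zip(w, pts))
    den = sum(wi * (x - xbar) ** 2 for wi, (_, x, _) in zip(w, pts))
    return num / den

# The fitted coefficient varies over space, tracking the local slope.
b_near, b_far = gwr_slope(0.1), gwr_slope(2.9)
print(f"slope at 0.1: {b_near:.2f}, slope at 2.9: {b_far:.2f}")
```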

Relevance: 20.00%

Publisher:

Abstract:

This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is still undergoing, like statistical analysis in general, a transformation to high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions, which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection-based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases. We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. In particular, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels, and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series close to the non-stationary case the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
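For intuition, the classical univariate CUSUM statistic for a change in the mean takes the form max_k |S_k - (k/n) S_n| / (sigma * sqrt(n)), with the maximizing k as the change-point estimate. A minimal sketch (plain Python; synthetic data with a single abrupt shift, and a deliberately naive global scale estimate) is:

```python
import math
import random

def cusum_changepoint(x):
    """CUSUM statistic for a change in the mean and the argmax location:
    T = max_k |S_k - (k/n) S_n| / (sigma * sqrt(n))."""
    n = len(x)
    total = sum(x)
    mean = total / n
    # Naive global scale estimate; it is inflated under the alternative.
    sigma = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    s, best, best_k = 0.0, -1.0, 0
    for k in range(1, n):
        s += x[k - 1]                 # partial sum S_k
        c = abs(s - k * total / n)
        if c > best:
            best, best_k = c, k
    return best / (sigma * math.sqrt(n)), best_k

rng = random.Random(0)
series = ([rng.gauss(0.0, 1.0) for _ in range(100)]
          + [rng.gauss(2.0, 1.0) for _ in range(100)])  # shift at k = 100
stat, khat = cusum_changepoint(series)
print(f"CUSUM statistic {stat:.2f}, estimated change point {khat}")
```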

Relevance: 20.00%

Publisher:

Abstract:

This thesis addresses the problem of production scheduling and optimization in a multi-machine environment with constraints on material resources in a plastic extrusion plant. The minimization of the total weighted tardiness is the economic criterion around which this study revolves, as it is a very important criterion for meeting deadlines. We propose an exact approach, via a mathematical formulation capable of producing optimal solutions, and a heuristic approach based on two solution-construction methods (serial and parallel) and a set of neighbourhood search methods (simulated annealing, tabu search, GRASP and a genetic algorithm) with five neighbourhood variants. To remain fully consistent with the reality of the plastics industry, we took into account certain very common characteristics, such as tool changeover times on the machines when one production order follows another on a given machine. The availability of the extruders and extrusion dies represents the bottleneck in this scheduling problem. Series of experiments based on test problems were carried out to evaluate the quality of the solutions obtained with the different proposed algorithms. The analysis of the results showed that the solution-construction methods alone are not sufficient to ensure good results and that the neighbourhood search methods yield solutions of very high quality. The choice of neighbourhood is important for refining the quality of the solution obtained. Keywords: scheduling, optimization, extrusion, mathematical formulation, heuristic, simulated annealing, tabu search, GRASP, genetic algorithm
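The total weighted tardiness objective and a neighbourhood search over it can be sketched as follows (plain Python; the job data are made up, and a plain first-improvement pairwise-swap descent stands in for the metaheuristics studied in the thesis):

```python
# Hypothetical jobs on a single machine: (processing time, due date, weight).
jobs = [(3, 4, 2), (2, 6, 1), (4, 5, 3), (1, 9, 2), (5, 12, 1)]

def weighted_tardiness(seq):
    """Total weighted tardiness: sum of w_j * max(0, C_j - d_j)."""
    t, total = 0, 0
    for j in seq:
        p, d, w = jobs[j]
        t += p                       # completion time C_j
        total += w * max(0, t - d)
    return total

def swap_local_search(seq):
    """First-improvement descent over the pairwise-swap neighbourhood."""
    seq = list(seq)
    best = weighted_tardiness(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for j in range(i + 1, len(seq)):
                seq[i], seq[j] = seq[j], seq[i]
                val = weighted_tardiness(seq)
                if val < best:
                    best, improved = val, True
                else:
                    seq[i], seq[j] = seq[j], seq[i]   # undo the swap
    return seq, best

start = list(range(len(jobs)))
order, cost = swap_local_search(start)
print(f"start cost {weighted_tardiness(start)}, improved order {order}, cost {cost}")
```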

Relevance: 20.00%

Publisher:

Abstract:

The introduction of molecular criteria into the classification of diffuse gliomas has added interesting practical implications to glioma management. This has created a new clinical need for correlating imaging characteristics with glioma genotypes, a field known as radiogenomics or imaging genomics. While many studies have primarily focused on advanced magnetic resonance imaging (MRI) techniques for radiogenomics purposes, conventional MRI sequences still remain the reference point in the study and characterization of brain tumours. A different approach may rely on diffusion-weighted imaging (DWI), which is considered a "conventional" sequence in line with recently published directions on glioma imaging; in a non-invasive way, it can provide direct insight into the microscopic physical properties of tissues. Considering that isocitrate dehydrogenase (IDH) gene mutations may reflect alterations in metabolism, cellularity, and angiogenesis, which may manifest as characteristic features on MRI, the identification of specific MRI biomarkers could be of great interest in managing patients with brain gliomas. My study aimed to evaluate the presence of specific MRI-derived biomarkers of IDH molecular status through conventional MRI and DWI sequences.

Relevance: 20.00%

Publisher:

Abstract:

In the steelmaking industry, galvanizing is a treatment applied to protect steel from corrosion. The air knife effect (AKE) occurs when nozzles emit a stream of air onto the surfaces of a steel strip to remove excess zinc. In our work we formalized the problem of controlling the AKE and, with the R&D department of Marcegaglia SpA, implemented a deep learning model able to drive the AKE, which we call the controller. It takes as input a tuple of the physical conditions of the process line (t, h, s) together with the target value of the zinc coating (c), and generates the expected tuple (pres, dist) to drive the mechanical nozzles towards the target coating (c). We designed the structure of the network according to the requirements, and we collected and explored the data set of historical data from the smart factory. We designed the loss function as a sum of three components: the minimization of the difference between the coating produced by the network and the target value we want to reach, and two weighted minimization components for pressure and distance. In our solution we construct a second module, named the coating net, to predict the zinc coating resulting from the AKE when given conditions are applied to the production line. Its structure consists of a linear component and a deep nonlinear "residual" component learned from empirical observations. The predictions made by the coating net are used as ground truth in the loss function of the controller. By tuning the weights of the different components of the loss function, it is possible to train models with slightly different optimization purposes. In our tests we compared the regularization of different strategies with the standard one under conditions of optimal estimation for both; the overall accuracy is ±3 g/m² from the target for all of them. Lastly, we analyzed how the controller modeled the current solutions with the new logic: the sub-optimal values of pres and dist can be optimized by 50% and 20%.
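The three-component loss can be sketched as a weighted sum (plain Python; the squared-error form of each term, the weight values and the function name are assumptions for illustration, as the abstract does not give the exact expressions):

```python
# Hedged sketch: squared-error terms and these weight values are
# illustrative assumptions, not the thesis's exact loss function.
def controller_loss(c_pred, c_target, pres, dist,
                    w_c=1.0, w_p=0.1, w_d=0.1):
    """Weighted sum of three components: the coating error against the
    target, plus two weighted penalty terms on pressure and distance."""
    return (w_c * (c_pred - c_target) ** 2
            + w_p * pres ** 2
            + w_d * dist ** 2)

loss = controller_loss(c_pred=140.0, c_target=142.0, pres=0.5, dist=1.2)
print(round(loss, 3))
```

Tuning w_c, w_p and w_d trades target tracking against the pressure and distance penalties, which is how models with slightly different optimization purposes can be trained from the same data.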

Relevance: 10.00%

Publisher:

Abstract:

Our objective was to investigate spinal cord (SC) atrophy in amyotrophic lateral sclerosis (ALS) patients and to determine whether it correlates with clinical parameters. Forty-three patients with ALS (25 males) and 43 age- and gender-matched healthy controls underwent MRI on a 3T scanner. We used T1-weighted 3D images covering the whole brain and the cervical SC to estimate cervical SC area and eccentricity at the C2/C3 level using validated software (SpineSeg). Disease severity was quantified with the ALSFRS-R and ALS Severity scores. SC areas of patients and controls were compared with a Mann-Whitney test, and linear regression was used to investigate associations between SC area and clinical parameters. The mean age of patients and mean disease duration were 53.1 ± 12.2 years and 34.0 ± 29.8 months, respectively. The two groups differed significantly in SC areas (67.8 ± 6.8 mm² vs. 59.5 ± 8.4 mm², p < 0.001), while eccentricity values were similar in both groups (p = 0.394). SC areas correlated with disease duration (r = -0.585, p < 0.001), ALSFRS-R score (r = 0.309, p = 0.044) and the ALS Severity scale (r = 0.347, p = 0.022). In conclusion, patients with ALS have SC atrophy but no flattening, and SC areas correlated with disease duration and functional status. These data suggest that quantitative MRI of the SC may be a useful biomarker in the disease.

Relevance: 10.00%

Publisher:

Abstract:

Primary craniocervical dystonia (CCD) is generally attributed to functional abnormalities in the cortico-striato-pallido-thalamocortical loops, but cerebellar pathways have also been implicated in neuroimaging studies. Hence, our purpose was to perform a volumetric evaluation of the infratentorial structures in CCD. We compared 35 DYT1/DYT6-negative patients with CCD and 35 healthy controls. Cerebellar volume was evaluated using manual volumetry (DISPLAY software) and infratentorial volume by voxel-based morphometry (VBM) of gray matter (GM) segments derived from T1-weighted 3 T MRI using the SUIT tool (SPM8/Dartel). We used t-tests to compare infratentorial volumes between groups. Cerebellar volume was (1.14 ± 0.17) × 10² cm³ for controls and (1.13 ± 0.14) × 10² cm³ for patients (p = 0.74). VBM demonstrated GM increase in the left I-IV cerebellar lobules and GM decrease in the left lobules VI and Crus I and in the right lobules VI, Crus I and VIIIb. In a secondary analysis, VBM also demonstrated GM increase in the brainstem, mostly in the pons. While gray matter increase is observed in the anterior lobe of the cerebellum and in the brainstem, atrophy is concentrated in the posterior lobe of the cerebellum, demonstrating a differential pattern of infratentorial involvement in CCD. This study shows subtle structural abnormalities of the cerebellum and brainstem in primary CCD.

Relevance: 10.00%

Publisher:

Abstract:

Health economic evaluations require estimates of expected survival from patients receiving different interventions, often over a lifetime. However, data on the patients of interest are typically only available for a much shorter follow-up time, from randomised trials or cohorts. Previous work showed how to use general population mortality to improve extrapolations of the short-term data, assuming a constant additive or multiplicative effect on the hazards for all-cause mortality for study patients relative to the general population. A more plausible assumption may be a constant effect on the hazard for the specific cause of death targeted by the treatments. To address this problem, we use independent parametric survival models for cause-specific mortality among the general population. Because causes of death are unobserved for the patients of interest, a polyhazard model is used to express their all-cause mortality as a sum of latent cause-specific hazards. Assuming proportional cause-specific hazards between the general and study populations then allows us to extrapolate mortality of the patients of interest to the long term. A Bayesian framework is used to jointly model all sources of data. By simulation, we show that ignoring cause-specific hazards leads to biased estimates of mean survival when the proportion of deaths due to the cause of interest changes through time. The methods are applied to an evaluation of implantable cardioverter defibrillators for the prevention of sudden cardiac death among patients with cardiac arrhythmia. After accounting for cause-specific mortality, substantial differences are seen in estimates of life years gained from implantable cardioverter defibrillators.
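The polyhazard construction can be sketched with Weibull cause-specific hazards: all-cause survival is S(t) = exp(-sum_k H_k(t)), and a proportional effect on one cause simply scales that cause's cumulative hazard. The sketch below uses made-up parameter values for illustration, not the paper's fitted model:

```python
import math

# Hypothetical cause-specific Weibull hazards (shape, scale in years).
causes = {"cardiac": (1.5, 12.0), "other": (1.2, 20.0)}

def weibull_cum_hazard(t, shape, scale):
    """Weibull cumulative hazard: H(t) = (t / scale) ** shape."""
    return (t / scale) ** shape

def survival(t, causes, hr_by_cause=None):
    """All-cause survival under a polyhazard model,
    S(t) = exp(-sum_k HR_k * H_k(t)), where HR_k is an optional
    proportional effect on cause k's hazard (default 1)."""
    hr_by_cause = hr_by_cause or {}
    total = sum(hr_by_cause.get(name, 1.0) * weibull_cum_hazard(t, a, b)
                for name, (a, b) in causes.items())
    return math.exp(-total)

s_control = survival(10.0, causes)
# The intervention is assumed to act only on the cardiac-specific hazard.
s_treated = survival(10.0, causes, {"cardiac": 0.5})
print(f"10-year survival: control {s_control:.3f}, treated {s_treated:.3f}")
```

Because only the targeted cause's hazard is scaled, the implied survival gain depends on how much of the overall mortality that cause accounts for, which is exactly why ignoring cause-specific hazards can bias mean survival estimates.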

Relevance: 10.00%

Publisher:

Abstract:

The El Niño-Southern Oscillation (ENSO) is a climatic phenomenon related to the inter-annual variability of global meteorological patterns, influencing sea surface temperature and rainfall variability. It influences human health indirectly through extreme temperature and moisture conditions that may accelerate the spread of some vector-borne viral diseases, such as dengue fever (DF). This work examines the spatial distribution of the association between ENSO and DF in the countries of the Americas during 1995-2004, a period that includes the 1997-1998 El Niño, one of the most important climatic events of the 20th century. Data on the Southern Oscillation Index (SOI), indicating El Niño-La Niña activity, were obtained from the Australian Bureau of Meteorology. The annual DF incidence (AIy) by country was computed using Pan American Health Organization data. SOI and AIy values were standardised as deviations from the mean and plotted as bar-line graphs. The regression coefficient values between SOI and AIy (rSOI,AI) were calculated and spatially interpolated with an inverse distance weighted algorithm. The results indicate that of the five years registering the highest numbers of cases (1998, 2002, 2001, 2003 and 1997), four had El Niño activity. In the southern hemisphere, the annual spatial weighted mean centre of the epidemics moved southward, from 6° 31' S in 1995 to 21° 12' S in 1999, and the rSOI,AI values were negative in Cuba, Belize, Guyana and Costa Rica, indicating a synchrony between higher DF incidence rates and higher El Niño activity. The rSOI,AI map allows visualisation of a graded surface with higher values of ENSO-DF association for Mexico, Central America, the northern Caribbean islands and the extreme north-northwest of South America.
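Inverse distance weighted (IDW) interpolation, as used here for the rSOI,AI surface, estimates a value at an unsampled point as a weighted mean of known values with weights 1/d^p. A minimal sketch (plain Python; the station coordinates and values are made up for illustration):

```python
# Hypothetical station coordinates with rSOI,AI-like values at each.
stations = [(0.0, 0.0, -0.8), (1.0, 0.0, -0.2), (0.0, 1.0, 0.4), (1.0, 1.0, 0.1)]

def idw(points, x, y, power=2.0):
    """Inverse distance weighted estimate at (x, y): a weighted mean of
    the known values with weights 1 / distance**power."""
    num = den = 0.0
    for px, py, value in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return value             # query point coincides with a station
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# At the centre all four stations are equidistant, so the estimate is
# the plain mean of their values: (-0.8 - 0.2 + 0.4 + 0.1) / 4 = -0.125.
print(idw(stations, 0.5, 0.5))
```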