9 results for Weighted sum
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The single machine scheduling problem with a common due date and non-identical ready times for the jobs is examined in this work. Performance is measured by the minimization of the weighted sum of earliness and tardiness penalties of the jobs. Since this problem is NP-hard, the application of constructive heuristics that exploit specific characteristics of the problem to improve their performance is investigated. The proposed approaches are examined through a computational comparative study on a set of 280 benchmark test problems with up to 1000 jobs.
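For context, a sketch of the objective described above, written in generic notation that is assumed here rather than taken from the paper (per-job earliness/tardiness weights α_j and β_j, completion times C_j, and a common due date d):

```latex
% Illustrative notation only (not the paper's own symbols):
% E_j and T_j are the earliness and tardiness of job j with respect to
% the common due date d, and C_j is its completion time.
\min \; \sum_{j=1}^{n} \left( \alpha_j E_j + \beta_j T_j \right),
\qquad E_j = \max\{0,\; d - C_j\},
\qquad T_j = \max\{0,\; C_j - d\}.
```

Non-identical ready times r_j additionally constrain each job to start no earlier than its ready time.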
Abstract:
In this paper we address the "skull-stripping" problem in 3D MR images. We propose a new method that employs an efficient and unique histogram analysis. A fundamental component of this analysis is an algorithm for partitioning a histogram based on the position of the maximum deviation from a Gaussian fit. In our experiments we use a comprehensive image database, including both synthetic and real MRI, and compare our method with two other well-known methods, namely BSE and BET. For all datasets we achieved superior results. Our method is also largely independent of parameter tuning and very robust across considerable variations in noise ratio.
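A minimal sketch of the histogram-partitioning idea described above, assuming NumPy/SciPy, a 1-D intensity histogram, and a least-squares Gaussian fit; this illustrates the general technique only and is not the authors' implementation (function names and parameters are hypothetical):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    """Gaussian model used to fit the histogram."""
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def split_at_max_deviation(intensities, bins=256):
    """Partition a 1-D intensity histogram at the bin where the observed
    counts deviate most from a least-squares Gaussian fit (illustrative
    sketch only, not the paper's algorithm)."""
    intensities = np.asarray(intensities, dtype=float)
    counts, edges = np.histogram(intensities, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Initial guesses: peak height, mean, and spread of the intensities.
    p0 = [counts.max(), intensities.mean(), intensities.std()]
    params, _ = curve_fit(gaussian, centers, counts, p0=p0, maxfev=10000)
    deviation = np.abs(counts - gaussian(centers, *params))
    # The splitting intensity is the bin centre of maximum deviation.
    return centers[np.argmax(deviation)]
```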
Abstract:
In this paper we study germs of polynomials formed by the product of semi-weighted homogeneous polynomials of the same type, which we call semi-weighted homogeneous arrangements. It is shown how the Lê numbers of such polynomials are computed using only their weights and degree of homogeneity. A key point of the main theorem is to determine the so-called polar ratio of this polynomial class. An important consequence is the description of the Euler characteristic of the Milnor fibre of such arrangements depending only on their weights and degree of homogeneity. The constancy of the Lê numbers in families formed by such arrangements is shown, with the deformed terms having weighted degree greater than the weighted degree of the initial germ. Moreover, using results of Massey applied to families of function germs, we obtain the constancy of the homology of the Milnor fibre in this family of semi-weighted homogeneous arrangements.
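For readers unfamiliar with the terminology, the following standard definition is given as background and is not quoted from the paper:

```latex
% Background definition (standard, not quoted from the paper).
% f is weighted homogeneous of type (w_1, \dots, w_n; d) if
f(\lambda^{w_1} x_1, \dots, \lambda^{w_n} x_n)
  = \lambda^{d} f(x_1, \dots, x_n)
  \quad \text{for all } \lambda \in \mathbb{C}^{*}.
% A germ is semi-weighted homogeneous of the same type if it can be
% written as f = f_0 + g, where f_0 is weighted homogeneous of type
% (w_1, \dots, w_n; d) and every monomial of g has weighted degree
% greater than d.
```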
Abstract:
We use QCD sum rules to study possible B_c-like molecular states. We consider isoscalar J^P = 0^+ and J^P = 1^+ D^(*)B^(*) molecular currents. We take into account the contributions of condensates up to dimension eight and work at leading order in α_s. We obtain masses around 7 GeV for these states. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Objective: In chronic renal failure patients under hemodialysis (HD) treatment, the availability of simple, safe, and effective tools to assess body composition enables accurate evaluation of body composition, in spite of the changes in body fluids that occur in dialysis therapy, thus contributing to the planning and monitoring of nutritional treatment. We evaluated the performance of bioelectrical impedance analysis (BIA) and the skinfold thickness sum (SKF) for assessing fat mass (FM) in chronic renal failure patients before (BHD) and after (AHD) HD, using air displacement plethysmography (ADP) as the standard method. Design: This single-center cross-sectional trial compared the FM of 60 HD patients (29 women, 31 men), estimated BHD and AHD by multifrequency BIA and by SKF, with the estimates obtained by the reference method, ADP. Fat-free mass (FFM) was also obtained by subtracting total body fat from the individual's total weight. Results: Mean FM estimated by ADP BHD was 17.95 ± 0.99 kg (30.11% ± 1.30%), with a 95% confidence interval (CI) of 16.00 to 19.90 (27.56 to 32.66); mean FM estimated AHD was 17.92 ± 1.11 kg (30.04% ± 1.40%), with a 95% CI of 15.74 to 20.10 (27.28 to 32.79). In neither study period did the FM and FFM estimates (both kg and %) obtained by the SKF method differ from ADP; however, BIA underestimated FM and overestimated FFM (both kg and %) when compared with ADP. Conclusion: The SKF method, but not BIA, showed results similar to ADP and can be considered adequate for FM evaluation in HD patients. (C) 2012 by the National Kidney Foundation, Inc. All rights reserved.
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic-model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that results in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on the model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.
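As background (notation assumed here, not taken from the paper), the promotion time cure model mentioned for comparison arises from a Poisson number N ~ Poisson(θ) of latent competing causes with activation-time distribution F(t):

```latex
% Background: standard promotion time cure model (for comparison only;
% notation is assumed here, not the paper's).
S_{\mathrm{pop}}(t) = \exp\{-\theta F(t)\}, \qquad
p_{0} = \lim_{t \to \infty} S_{\mathrm{pop}}(t) = e^{-\theta}.
```

The model proposed in the paper replaces the Poisson count by a compound weighted Poisson distribution, which is what gives it the additional flexibility in dispersion noted above.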
Abstract:
We study the action of a weighted Fourier–Laplace transform on the functions in the reproducing kernel Hilbert space (RKHS) associated with a positive definite kernel on the sphere. After defining a notion of smoothness implied by the transform, we show that smoothness of the kernel implies the same smoothness for the generating elements (spherical harmonics) in the Mercer expansion of the kernel. We prove a reproducing property for the weighted Fourier–Laplace transform of the functions in the RKHS and embed the RKHS into spaces of smooth functions. Some relevant properties of the embedding are considered, including compactness and boundedness. The approach taken in the paper includes two important notions of differentiability characterized by weighted Fourier–Laplace transforms: fractional derivatives and Laplace–Beltrami derivatives.
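As background (illustrative notation, not quoted from the paper), the Mercer expansion of a continuous positive definite kernel on the sphere S^m referred to above can be written in terms of spherical harmonics:

```latex
% Background: Mercer expansion of a positive definite kernel on S^m
% (illustrative notation; Y_{k,j} are spherical harmonics of degree k,
%  d_k is the dimension of that harmonic space, and a_k \ge 0).
K(x, y) = \sum_{k \ge 0} a_{k} \sum_{j=1}^{d_k} Y_{k,j}(x)\, Y_{k,j}(y),
\qquad x, y \in S^{m}.
```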
Abstract:
We use QCD sum rules to study the recently observed charmonium-like structure Z_c^+(3900) as a tetraquark state. We evaluate the three-point function and extract the coupling constants of the Z_c^+ J/ψ π^+, Z_c^+ η_c ρ^+ and Z_c^+ D^+ D̄^{*0} vertices and the corresponding decay widths in these channels. The results obtained are in good agreement with the experimental data and support the tetraquark picture of this state.
Abstract:
We study, within the QCD sum rule framework, the possible existence of a charmed pentaquark that we call Θ_c(3250). On the QCD side we work at leading order in α_s and consider condensates up to dimension 10. The mass obtained, m_{Θ_c} = (3.21 ± 0.13) GeV, is compatible with the mass of the structure seen by the BaBar Collaboration in the decay channel B^- → p̄ Σ_c^{++} π^- π^-.