974 results for Symmetry Ratio Algorithm


Relevance:

20.00%

Publisher:

Abstract:

This letter to the Editor comments on the article "When 'neutral' evidence still has probative value (with implications from the Barry George Case)" by N. Fenton et al. [1, 2014].

Relevance:

20.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance-criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution that generates different alternative starting solutions of similar quality, attained by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Moreover, the experiments show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
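
Since the abstract enumerates the three ILS stages (local search, perturbation, acceptance), a minimal Python sketch of that skeleton may help fix ideas. The makespan recursion is the standard PFSP one; the insertion local search, random-swap perturbation, and improve-only acceptance used here are illustrative placeholders, not the actual ILS-ESP rules.

```python
# Minimal ILS skeleton for the PFSP; operator choices are illustrative.
import random

def makespan(seq, p):
    """Completion time of the last job on the last machine.
    p[j][m] = processing time of job j on machine m."""
    m = len(p[0])
    c = [0.0] * m
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def local_search(seq, p):
    """First-improvement insertion neighbourhood."""
    best = makespan(seq, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for j in range(len(seq)):
                if i == j:
                    continue
                cand = seq[:i] + seq[i + 1:]
                cand.insert(j, seq[i])
                cost = makespan(cand, p)
                if cost < best:
                    seq, best, improved = cand, cost, True
                    break
            if improved:
                break
    return seq, best

def ils(p, iters=200, seed=0):
    rng = random.Random(seed)
    seq = list(range(len(p)))            # trivial initial order
    seq, best = local_search(seq, p)
    for _ in range(iters):
        cand = seq[:]                    # perturbation: swap two random jobs
        i, j = rng.sample(range(len(cand)), 2)
        cand[i], cand[j] = cand[j], cand[i]
        cand, cost = local_search(cand, p)
        if cost < best:                  # parameter-free acceptance: improve only
            seq, best = cand, cost
    return seq, best

p = [[3, 4, 2], [2, 5, 1], [4, 1, 3], [3, 3, 3]]   # 4 jobs x 3 machines
print(ils(p, iters=50))
```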

Relevance:

20.00%

Publisher:

Abstract:

Introduction: An impaired ability to oxidize fat may be a factor in the aetiology of obesity (3). Moreover, the exercise intensity eliciting the maximal fat oxidation rate (MFO), termed Fatmax, has been reported to be lower in obese (O) than in lean (L) individuals (4). However, differences in fat oxidation rates (FORs) during exercise between O and L remain equivocal, and little is known about FORs at high intensities (>60%) in O compared with L. This study aimed to characterize fat oxidation kinetics over a large range of intensities in L and O.

Methods: 12 healthy L [body mass index (BMI): 22.8±0.4] and 16 healthy O men (BMI: 38.9±1.4) performed a submaximal incremental test (Incr) to determine whole-body fat oxidation kinetics using indirect calorimetry. After a 15-min resting period (Rest) and a 10-min warm-up at 20% of maximal power output (MPO, determined by a maximal incremental test), the power output was increased by 7.5% MPO every 6 min until the respiratory exchange ratio reached 1.0. Venous lactate and glucose and plasma concentrations of epinephrine (E), norepinephrine (NE), insulin, and non-esterified fatty acids (NEFA) were assessed at each step. A mathematical model (SIN) (1), including three variables (dilatation, symmetry, translation), was used to characterize fat oxidation kinetics (normalized by fat-free mass) and to determine Fatmax and MFO.

Results: FOR at Rest and MFO were not significantly different between groups (p≥0.1). FORs were similar from 20-60% (p≥0.1) and significantly lower from 65-85% in O than in L (p≤0.04). Fatmax was significantly lower in O than in L (46.5±2.5 vs 56.7±1.9%, respectively; p=0.005). Fat oxidation kinetics in O was characterized by a similar translation (p=0.2), a significantly lower dilatation (p=0.001), and a tendency towards a left-shifted symmetry (p=0.09) compared with L. Plasma E, insulin, and NEFA were significantly higher in L than in O (p≤0.04). There were no significant differences in glucose, lactate, or plasma NE between groups (p≥0.2).

Conclusion: The study showed that O presented a lower Fatmax and a lower reliance on fat oxidation at high, but not at moderate, intensities. This may be linked to: i) higher insulin and lower E concentrations in O, which may blunt lipolysis; ii) a higher percentage of type II and a lower percentage of type I muscle fibres (5); and iii) a decreased mitochondrial content (2), which may reduce FORs at high intensities and Fatmax. These findings may have implications for prescribing an appropriate exercise intensity to optimize fat oxidation in O.

References: 1. Cheneviere et al., Med Sci Sports Exerc, 2009. 2. Holloway et al., Am J Clin Nutr, 2009. 3. Kelley et al., Am J Physiol, 1999. 4. Perez-Martin et al., Diabetes Metab, 2001. 5. Tanner et al., Am J Physiol Endocrinol Metab, 2002.
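
The SIN model referenced above (ref. 1) characterizes the FOR-versus-intensity curve with a sine-based function whose dilatation, symmetry, and translation parameters shape the curve; Fatmax and MFO are then read off the fit. The exact published equation is not reproduced here, so the functional form below is an assumed stand-in with the same three parameter roles, meant only to illustrate the fit-then-extract-peak workflow on invented data.

```python
import numpy as np
from scipy.optimize import curve_fit

def sin_model(x, a, d, s, t):
    # ASSUMED form: amplitude a, dilatation d, symmetry s, translation t
    z = np.clip(np.pi * x ** d + t, 0.0, np.pi)   # stay on one sine arch
    return a * np.sin(z) ** s

# intensity (fraction of MPO) vs fat oxidation rate: invented values
x = np.array([0.20, 0.275, 0.35, 0.425, 0.50, 0.575, 0.65, 0.725, 0.80])
y = np.array([4.1, 5.0, 5.6, 5.9, 5.7, 5.0, 3.8, 2.2, 0.8])

popt, _ = curve_fit(sin_model, x, y, p0=[6.0, 1.0, 1.0, 0.0],
                    bounds=([0.1, 0.2, 0.2, -1.0], [20.0, 3.0, 3.0, 1.0]))
grid = np.linspace(0.2, 0.9, 701)
fit = sin_model(grid, *popt)
print(f"Fatmax ~ {100 * grid[fit.argmax()]:.1f}% MPO, MFO ~ {fit.max():.2f}")
```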

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The hospital readmission rate has been proposed as an important outcome indicator computable from routine statistics. However, most commonly used measures raise conceptual issues. OBJECTIVES: We sought to evaluate the usefulness of a computerized algorithm for identifying avoidable readmissions on the basis of minimum bias, criterion validity, and measurement precision. RESEARCH DESIGN AND SUBJECTS: A total of 131,809 hospitalizations of patients discharged alive from 49 hospitals were used to compare the predictive performance of risk-adjustment methods. A random sample of 570 medical records of discharge/readmission pairs in 12 hospitals was reviewed to estimate the predictive value of the screening for potentially avoidable readmissions. MEASURES: Potentially avoidable readmissions, defined as readmissions that were related to a condition of the previous hospitalization, were not expected as part of a program of care, and occurred within 30 days after the previous discharge, were identified by a computerized algorithm. Unavoidable readmissions were treated as censored events. RESULTS: A total of 5.2% of hospitalizations were followed by a potentially avoidable readmission, 17% of them in a different hospital. The predictive value of the screen was 78%; 27% of screened readmissions were judged clearly avoidable. The correlations between the hospital rate of clearly avoidable readmissions and the rate of all readmissions, the rate of potentially avoidable readmissions, and the ratio of observed to expected readmissions were 0.42, 0.56, and 0.66, respectively. Adjustment models using clinical information performed better. CONCLUSION: Adjusted rates of potentially avoidable readmissions are scientifically sound enough to warrant their inclusion in hospital quality surveillance.
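
The screening rule stated in MEASURES can be sketched directly: flag a readmission as potentially avoidable when it falls within 30 days of the previous discharge, is related to a condition of that stay, and is not planned. The record fields and the two predicates below are hypothetical placeholders; the actual algorithm's clinical logic is far richer than this skeleton.

```python
from datetime import date
from dataclasses import dataclass

@dataclass
class Stay:
    admit: date
    discharge: date
    planned: bool          # expected as part of a program of care?
    related: bool          # related to a condition of the previous stay?

def potentially_avoidable(prev: Stay, readmit: Stay) -> bool:
    within_30d = (readmit.admit - prev.discharge).days <= 30
    return within_30d and readmit.related and not readmit.planned

pair = (Stay(date(2024, 1, 2), date(2024, 1, 9), False, False),
        Stay(date(2024, 1, 20), date(2024, 1, 25), False, True))
print(potentially_avoidable(*pair))   # True: day 11, related, unplanned
```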

Relevance:

20.00%

Publisher:

Abstract:

The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine, which can handle only one job at a time, so as to minimize the maximum lateness. Each job becomes available for processing at its release date, requires a known processing time and, after processing finishes, is delivered after a certain time. There can also exist precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem assigns a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
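
The building block named above, the preemptive Longest Tail Heuristic, runs the released job with the largest tail (delivery time) at every instant and preempts whenever a newly released job carries a larger tail; for one machine with release dates and preemption, and no time-lags, this rule is the classical optimal one. The sketch below is a minimal event-driven version of that rule only; the paper's chain-form time-lag machinery is not reproduced.

```python
import heapq

def preemptive_longest_tail(jobs):
    """jobs: list of (release r, processing p, tail q).
    Returns the objective max over jobs of (completion time + tail)."""
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
    remaining = [p for _, p, _ in jobs]
    t, i, obj = 0, 0, 0
    ready = []                          # max-heap on tail, via negation
    while i < len(order) or ready:
        if not ready:                   # idle until the next release
            t = max(t, jobs[order[i]][0])
        while i < len(order) and jobs[order[i]][0] <= t:
            j = order[i]
            heapq.heappush(ready, (-jobs[j][2], j))
            i += 1
        _, j = ready[0]
        # run job j until it finishes or the next release interrupts it
        nxt = jobs[order[i]][0] if i < len(order) else float("inf")
        run = min(remaining[j], nxt - t)
        t += run
        remaining[j] -= run
        if remaining[j] == 0:
            heapq.heappop(ready)
            obj = max(obj, t + jobs[j][2])
    return obj

print(preemptive_longest_tail([(0, 4, 5), (1, 2, 8), (3, 3, 1)]))  # 11
```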

Relevance:

20.00%

Publisher:

Abstract:

We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
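
Both methods are SVDs of transformed matrices, and the weighted logratio biplot mentioned above can be sketched directly: log the correspondence matrix, double-centre it with the row and column masses as weights, and decompose. A minimal sketch under the usual weighted-LRA conventions follows; the exact scalings of the paper's ratio map may differ in detail.

```python
import numpy as np

def weighted_logratio(N):
    P = N / N.sum()                       # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
    L = np.log(P)
    # weighted double-centring: remove weighted row and column log-means
    L = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c
    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U / np.sqrt(r)[:, None] * sv   # principal row coordinates
    cols = Vt.T / np.sqrt(c)[:, None]     # standard column coordinates
    return rows, cols, sv

N = np.array([[10., 20., 30.], [15., 15., 30.], [40., 10., 10.]])
rows, cols, sv = weighted_logratio(N)
print(np.round(sv, 4))
```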

Relevance:

20.00%

Publisher:

Abstract:

γ-Hydroxybutyric acid (GHB) is an endogenous short-chain fatty acid popular as a recreational drug due to its sedative and euphoric effects, but also often implicated in drug-facilitated sexual assaults owing to its disinhibiting and amnesic properties. Whilst discrimination between endogenous and exogenous GHB, as required in intoxication cases, may be achieved by determining the carbon isotope content, such information has not yet been exploited to answer the source-inference questions of interest to forensic investigation and intelligence. However, potential isotopic fractionation effects occurring throughout the metabolism of GHB may be a major concern in this regard. Thus, urine specimens from six healthy male volunteers who ingested prescription GHB sodium salt, marketed as Xyrem®, were analysed by means of gas chromatography/combustion/isotope ratio mass spectrometry to assess this issue. A very narrow range of δ13C values, spreading from −24.81‰ to −25.06‰, was observed, whilst the mean δ13C value of Xyrem® was −24.99‰. Since the urine samples and the prescription drug could not be distinguished by statistical analysis, carbon isotopic effects, and any subsequent influence on δ13C values through GHB metabolism as a whole, could be ruled out. Thus, a link may be established between GHB as a raw material and GHB found in a biological fluid, bringing relevant information to source-inference evaluation. Therefore, this study supports a diversified scope of exploitation for stable isotopes characterized in biological matrices, from investigations of intoxication cases to drug intelligence programmes.
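
For readers unfamiliar with the δ notation used above: δ13C expresses the 13C/12C ratio of a sample relative to the VPDB reference, in per mil. A small sketch of that standard conversion follows; the sample ratio used is an illustrative number chosen to land near the Xyrem values quoted, not a measured datum.

```python
R_VPDB = 0.0111802                  # accepted 13C/12C ratio of VPDB

def delta13C(r_sample: float) -> float:
    """delta 13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

print(round(delta13C(0.010901), 2))   # about -25 per mil, as for Xyrem
```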

Relevance:

20.00%

Publisher:

Abstract:

Power transformations of positive data tables, prior to applying the correspondence analysis algorithm, are shown to open up a family of methods with direct connections to the analysis of log-ratios. Two variations of this idea are illustrated. The first approach is simply to power the original data and perform a correspondence analysis; this method is shown to converge to unweighted log-ratio analysis as the power parameter tends to zero. The second approach is to apply the power transformation to the contingency ratios, that is, the values in the table relative to expected values based on the marginals; this method converges to weighted log-ratio analysis, or the spectral map. Two applications are described: first, a matrix of population genetic data which is inherently two-dimensional, and second, a larger cross-tabulation with higher dimensionality, from a linguistic analysis of several books.
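
The convergence as the power tends to zero rests on the familiar Box-Cox limit, (x^α − 1)/α → log x as α → 0, which is why powered tables behave like logged ones for small powers. A quick numeric check of that limit, with arbitrary values, follows.

```python
import numpy as np

x = np.array([0.5, 1.0, 2.0, 5.0])
for a in (1.0, 0.5, 0.1, 0.01, 0.001):
    box_cox = (x ** a - 1.0) / a
    print(a, np.max(np.abs(box_cox - np.log(x))))
# The gap shrinks toward 0, matching the stated limit as the power -> 0.
```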

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a Pyramidal Classification Algorithm, which together with an appropriate aggregation index produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, Minimum and UPGMA. The results shown in this paper may help to choose between the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
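
Implementing pyramids is beyond a short sketch, but the comparison the simulations perform, namely which aggregation index (Maximum, Minimum, or UPGMA) best recovers a known structure, can be illustrated with their ordinary hierarchical counterparts: complete, single, and average linkage. The cophenetic correlation used as the fit score below is a common stand-in for such a criterion, not necessarily the paper's measure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# two well-separated Gaussian clusters as the known "original structure"
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
d = pdist(X)

for method in ("complete", "single", "average"):   # Maximum / Minimum / UPGMA
    Z = linkage(d, method=method)
    score, _ = cophenet(Z, d)
    print(f"{method:8s} cophenetic correlation = {score:.3f}")
```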

Relevance:

20.00%

Publisher:

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
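
The distributional-equivalence property asserted above is easy to check numerically: merging two rows with identical conditional profiles should leave the weighted solution unchanged. The sketch below compares the weighted log-ratio singular values of a table and of its merged counterpart; the helper mirrors the usual weighted-LRA recipe and the table entries are made up.

```python
import numpy as np

def wlra_sv(N):
    """Singular values of the weighted log-ratio decomposition of N."""
    P = N / N.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    L = np.log(P)
    L = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c
    return np.linalg.svd(np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :],
                         compute_uv=False)

N = np.array([[10., 20., 30.],
              [ 5., 10., 15.],    # same conditional profile as row 0
              [40., 10., 10.]])
M = np.array([[15., 30., 45.],    # rows 0 and 1 merged
              [40., 10., 10.]])
print(np.round(wlra_sv(N), 6))    # nonzero values match those of M
print(np.round(wlra_sv(M), 6))
```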

Relevance:

20.00%

Publisher:

Abstract:

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
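
As a hedged illustration of such a randomized procedure: the sketch below mixes a small pool of experts (Markov predictors of increasing order) with exponential weights and predicts 1 with the mixture probability. The expert pool and learning rate are illustrative choices in the spirit of individual-sequence prediction, not the paper's construction.

```python
import math, random

def predict_sequence(bits, orders=(0, 1, 2), eta=0.5, seed=0):
    rng = random.Random(seed)
    counts = {k: {} for k in orders}        # per-order context counts
    weights = {k: 1.0 for k in orders}      # one expert per Markov order
    mistakes = 0
    for t, y in enumerate(bits):
        preds = {}
        for k in orders:
            ctx = tuple(bits[max(0, t - k):t])
            n1, n = counts[k].get(ctx, (0, 0))
            preds[k] = (n1 + 1) / (n + 2)   # Laplace estimate of P(next = 1)
        z = sum(weights.values())
        p1 = sum(weights[k] * preds[k] for k in orders) / z
        guess = 1 if rng.random() < p1 else 0   # randomized prediction
        mistakes += (guess != y)
        for k in orders:                    # exponential weight update
            weights[k] *= math.exp(-eta * abs(preds[k] - y))
            ctx = tuple(bits[max(0, t - k):t])
            n1, n = counts[k].get(ctx, (0, 0))
            counts[k][ctx] = (n1 + y, n + 1)
    return mistakes / len(bits)

bits = [1, 0] * 200                         # a deterministic pattern
print(predict_sequence(bits))               # mistake rate falls toward 0
```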

Relevance:

20.00%

Publisher:

Abstract:

Behavioral biology plays a central role in evolutionary biology, mainly because of the antagonistic relations that occur in sexual reproduction. One such relation involves the effect of reproduction on future life expectancy. In this scenario, changes in the male operational sex ratio could lead to an increase in mortality due to costs associated with excessive courtship and mating displays. Thus, this work experimentally altered the male sex ratio of Drosophila melanogaster Meigen, 1830 to determine its impact on mortality. The results indicated that mortality increases as the sex ratio changes, including modifications in the survivorship curve type and in the curve concavity, as measured by entropy.
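
The entropy used to quantify survivorship-curve concavity is plausibly a life-table entropy of the Keyfitz type, H = −Σ l(x) ln l(x) / Σ l(x); that identification is an assumption here, and the survivorship schedules below are invented, not the study's data. Lower H goes with a more rectangular (type I) curve, higher H with heavier early mortality.

```python
import numpy as np

def keyfitz_entropy(lx):
    """lx: survivorship l(x) at successive ages, with l(0) = 1."""
    lx = np.asarray(lx, dtype=float)
    lx = lx[lx > 0]                      # ln(0) is undefined; drop zeros
    return -np.sum(lx * np.log(lx)) / np.sum(lx)

type_I = [1.0, 0.99, 0.97, 0.9, 0.6, 0.2, 0.05]     # deaths late in life
type_III = [1.0, 0.4, 0.2, 0.12, 0.08, 0.05, 0.03]  # heavy early mortality
print(round(keyfitz_entropy(type_I), 3),             # smaller H
      round(keyfitz_entropy(type_III), 3))           # larger H
```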

Relevance:

20.00%

Publisher:

Abstract:

Effects of female diet and age on offspring sex ratio of the solitary parasitoid Pachycrepoideus vindemmiae (Rondani) (Hymenoptera, Pteromalidae). Theory predicts that females of parasitoid wasps adjust their offspring sex ratio to environmental conditions in the oviposition patch, but the diet and age of females may also affect this adjustment. Our aim was to test the effects of female diet and age on the offspring sex ratio of the solitary parasitoid wasp Pachycrepoideus vindemmiae (Rondani, 1875). Our results showed that females fed with honey produced a significantly less female-biased offspring sex ratio than those fed only with water. Offspring sex ratio (male percentage) decreased with female age or longevity at the beginning of oviposition but increased at the end, suggesting sperm limitation in P. vindemmiae females at the end of oviposition, when a higher frequency of unfertilized eggs was laid. Females also laid more unfertilized eggs at the beginning of oviposition, which would be necessary to ensure mating among the offspring. Male offspring developed faster and emerged earlier, which would also reduce the risk of virginity in offspring with a female-biased sex ratio.