902 results for Extended rank likelihood
Abstract:
In this era of mature highway systems, a new set of problems faces the highway engineer. The existing infrastructure has aged to or beyond the design life of the original pavement design. In many cases, increased commercial traffic is creating the need for additional load-carrying capacity, causing state highway engineers to consider new alternatives for rehabilitation of existing surfaces. Alternative surface materials, thicknesses, and methods of installation must be identified to meet the needs of individual pavements and budgets. With overlays being one of the most frequently used rehabilitation alternatives, it is important to learn more about the limitations and potential performance of thin bonded portland cement overlays and subsequent rehabilitation. The Iowa ultra-thin project demonstrated the application of thin portland cement concrete overlays as a rehabilitation technique. It combined the variables of base preparation, overlay thickness, slab size, and fiber enhancement into a series of test sections over a 7.2-mile length. This report identifies the performance of the overlays in terms of deflection reduction, reduced cracking, and improved bonding between the portland cement concrete (PCC) and asphalt cement concrete (ACC) base layers. The original research project was designed to evaluate the variables over a 5-year period. A second project provided the opportunity to test overlay rehabilitation techniques and to continue measurement of the original overlay performance for 5 additional years. All performance indicators identified exceptional performance over the 10-year evaluation period for each of the variable combinations considered. The report summarizes the research methods and results and identifies future research ideas to aid the pavement overlay designer in the successful implementation of ultra-thin portland cement concrete overlays as an alternative pavement rehabilitation technique.
Abstract:
This letter to the Editor comments on the article "When 'neutral' evidence still has probative value (with implications from the Barry George Case)" by N. Fenton et al. (2014) [1].
Abstract:
We present a polyhedral framework for establishing general structural properties of optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
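As a concrete, much simpler illustration of the kind of priority-index policy this framework characterizes, the sketch below implements the classic c·mu rule for multiclass scheduling: each class is ranked by the product of its holding cost and service rate, and higher-index classes receive priority. This is not the paper's adaptive-greedy algorithm, and all costs and rates are made up for illustration.

```python
def cmu_priorities(holding_costs, service_rates):
    """Rank job classes by the c·mu index (holding cost times service rate),
    highest priority first. A minimal example of a class-ranking index policy."""
    index = {k: holding_costs[k] * service_rates[k] for k in holding_costs}
    return sorted(index, key=index.get, reverse=True)

costs = {"A": 3.0, "B": 1.0, "C": 2.5}   # holding cost per unit time (illustrative)
rates = {"A": 1.0, "B": 4.0, "C": 1.5}   # service completion rates (illustrative)
print(cmu_priorities(costs, rates))       # → ['B', 'C', 'A']
```

Here class B wins despite its low holding cost because it is served fast (index 4.0), ahead of C (3.75) and A (3.0); the adaptive-greedy algorithm of the paper generalizes this kind of ranking beyond the cases where a fixed closed-form index exists.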
Abstract:
AIM: To document the feasibility and report the results of dosing darbepoetin-alpha at extended intervals up to once monthly (QM) in a large dialysis patient population. MATERIAL: 175 adult patients treated, at 23 Swiss hemodialysis centres, with stable doses of any erythropoiesis-stimulating agent who were switched by their physicians to darbepoetin-alpha treatment at prolonged dosing intervals (every 2 weeks [Q2W] or QM). METHOD: Multicentre, prospective, observational study. Patients' hemoglobin (Hb) levels and other data were recorded 1 month before conversion (baseline) to an extended darbepoetin-alpha dosing interval, at the time of conversion, and once monthly thereafter up to the evaluation point (maximum of 12 months or until loss to follow-up). RESULTS: Data for 161 evaluable patients from 23 sites were included in the final analysis. At 1 month prior to conversion, 73% of these patients were receiving darbepoetin-alpha weekly (QW) and 27% of the patients biweekly (Q2W). After a mean follow-up of 9.5 months, 34% received a monthly (QM) dosing regimen, 52% of the patients were receiving darbepoetin-alpha Q2W, and 14% QW. The mean (SD) Hb concentration at baseline was 12.3 ± 1.2 g/dl, compared to 11.9 ± 1.2 g/dl at the evaluation point. The corresponding mean weekly darbepoetin-alpha dose was 44.3 ± 33.4 µg at baseline and 37.7 ± 30.8 µg at the evaluation point. CONCLUSIONS: Conversion to extended darbepoetin-alpha dosing intervals of up to QM, with maintenance of initial Hb concentrations, was successful for the majority of stable dialysis patients.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating, the combination of likelihoods, and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
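The identification of Bayesian updating with a simple addition/perturbation can be made concrete on the finite simplex, the special case covered by the original Aitchison structure. The Python sketch below (function names are ours, not the paper's) checks that perturbing a prior by a likelihood, i.e. posterior ∝ prior × likelihood, is plain vector addition in centered log-ratio coordinates.

```python
import numpy as np

def close(x):
    """Closure: rescale a positive vector to sum to 1 (a point in the simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(p, q):
    """Aitchison perturbation: the 'addition' of the simplex vector space."""
    return close(np.asarray(p) * np.asarray(q))

def clr(p):
    """Centered log-ratio transform: an isometry into a real vector space."""
    logp = np.log(p)
    return logp - logp.mean()

# Bayesian updating as perturbation: posterior ∝ prior × likelihood.
prior = close([0.5, 0.3, 0.2])
likelihood = close([0.2, 0.2, 0.6])   # P(data | each hypothesis), closed
posterior = perturb(prior, likelihood)

# In clr coordinates, perturbation becomes ordinary vector addition.
print(np.allclose(clr(posterior), clr(prior) + clr(likelihood)))  # → True
```

The closure constant dropped by `close` is exactly the normalizing constant of Bayes' rule, which is what the abstract generalizes to the "closing constant" of A2(P).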
Abstract:
This paper analyzes the nature of health care provider choice in the case of patient-initiated contacts, with special reference to a National Health Service setting, where monetary prices are zero and general practitioners act as gatekeepers to publicly financed specialized care. We focus our attention on the factors that may explain the continuously increasing use of hospital emergency visits as opposed to other provider alternatives. An extended version of a discrete choice model of demand for patient-initiated contacts is presented, allowing for individual and town-residence-size differences in perceived quality (preferences) between alternative providers and including travel and waiting time as non-monetary costs. Results of a nested multinomial logit model of provider choice are presented. Individual choice between alternatives considers, in a repeated nested structure, self-care, primary care, hospital, and clinic emergency services. Welfare implications and income effects are analyzed by computing compensating variations and by simulating the effects of user fees by levels of income. Results indicate that the compensating variation per visit is higher than the direct marginal cost of emergency visits; consequently, emergency visits do not appear to be an inefficient alternative even for non-urgent conditions.
Abstract:
Expected utility theory (EUT) has been challenged as a descriptive theory in many contexts, and medical decision analysis is no exception. Several researchers have suggested that rank-dependent utility theory (RDUT) may accurately describe how people evaluate alternative medical treatments. Recent research in this domain has addressed a relevant feature of RDU models (probability weighting), but to date no direct test of the theory has been made. This paper provides a test of the main axiomatic difference between EUT and RDUT when health profiles are used as outcomes of risky treatments. Overall, EU best described the data. However, evidence of the editing and cancellation operations hypothesized in Prospect Theory and Cumulative Prospect Theory was apparent in our study: we found that RDU outperformed EU in the presentation of the risky treatment pairs in which the common outcome was not obvious. The influence of framing effects on the performance of RDU and their importance as a topic for future research are discussed.
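The probability-weighting feature that separates RDU from EU can be shown in a small numerical sketch (the utility and weighting functions below are illustrative choices, not those estimated in the study). RDU ranks outcomes from best to worst and weights each utility by a difference of transformed decumulative probabilities, so with the identity weighting function it reduces exactly to EU.

```python
import numpy as np

def expected_utility(outcomes, probs, u):
    """Classic EU: probability-weighted average of utilities."""
    return sum(p * u(x) for x, p in zip(outcomes, probs))

def rank_dependent_utility(outcomes, probs, u, w):
    """RDU: sort outcomes best-first, then weight utilities by differences
    of the transformed decumulative probability w(.)."""
    order = np.argsort(outcomes)[::-1]           # best outcome first
    x = np.asarray(outcomes, dtype=float)[order]
    p = np.asarray(probs, dtype=float)[order]
    decum = np.cumsum(p)                          # rank-dependent cumulatives
    weights = np.diff(np.concatenate(([0.0], w(decum))))
    return float(np.sum(weights * u(x)))

u = np.sqrt                     # illustrative concave utility
w = lambda q: q ** 0.7          # illustrative probability weighting function

lottery = ([100.0, 25.0, 0.0], [0.1, 0.4, 0.5])
print(expected_utility(*lottery, u))              # → 3.0
print(rank_dependent_utility(*lottery, u, w))     # > 3.0: small p of the best
                                                  # outcome is overweighted here
```

With w(q) = q the two valuations coincide, which is the sense in which probability weighting is the axiomatic wedge between the two theories tested in the paper.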
Abstract:
OBJECTIVES: To provide a global, up-to-date picture of the prevalence, treatment, and outcomes of Candida bloodstream infections in intensive care unit patients and compare Candida with bacterial bloodstream infection. DESIGN: A retrospective analysis of the Extended Prevalence of Infection in the ICU Study (EPIC II). Demographic, physiological, infection-related and therapeutic data were collected. Patients were grouped as having Candida, Gram-positive, Gram-negative, and combined Candida/bacterial bloodstream infection. Outcome data were assessed at intensive care unit and hospital discharge. SETTING: EPIC II included 1265 intensive care units in 76 countries. PATIENTS: Patients in participating intensive care units on study day. INTERVENTIONS: None. MEASUREMENT AND MAIN RESULTS: Of the 14,414 patients in EPIC II, 99 patients had Candida bloodstream infections for a prevalence of 6.9 per 1000 patients. Sixty-one patients had candidemia alone and 38 patients had combined bloodstream infections. Candida albicans (n = 70) was the predominant species. Primary therapy included monotherapy with fluconazole (n = 39), caspofungin (n = 16), and a polyene-based product (n = 12). Combination therapy was infrequently used (n = 10). Compared with patients with Gram-positive (n = 420) and Gram-negative (n = 264) bloodstream infections, patients with candidemia were more likely to have solid tumors (p < .05) and appeared to have been in an intensive care unit longer (14 days [range, 5-25 days], 8 days [range, 3-20 days], and 10 days [range, 2-23 days], respectively), but this difference was not statistically significant. Severity of illness and organ dysfunction scores were similar between groups. 
Patients with Candida bloodstream infections, compared with patients with Gram-positive and Gram-negative bloodstream infections, had the greatest crude intensive care unit mortality rates (42.6%, 25.3%, and 29.1%, respectively) and longer intensive care unit lengths of stay (median [interquartile range]) (33 days [18-44], 20 days [9-43], and 21 days [8-46], respectively); however, these differences were not statistically significant. CONCLUSION: Candidemia remains a significant problem in intensive care unit patients. In the EPIC II population, Candida albicans was the most common organism and fluconazole remained the predominant antifungal agent used. Candida bloodstream infections are associated with high intensive care unit and hospital mortality rates and resource use.
Abstract:
Liquid-chromatography (LC) high-resolution (HR) mass spectrometry (MS) analysis can record HR full scans, a technique of detection that shows comparable selectivity and sensitivity to ion transitions (SRM) performed with triple-quadrupole (TQ)-MS but that allows de facto determination of "all" ions, including drug metabolites. This could be of potential utility in in vivo drug metabolism and pharmacovigilance studies in order to gain a more comprehensive insight into drug biotransformation profile differences in patients. This simultaneous quantitative and qualitative (Quan/Qual) approach has been tested with 20 patients chronically treated with tamoxifen (TAM). The absolute quantification of TAM and three metabolites in plasma was realized using HR- and TQ-MS and compared. The same LC-HR-MS analysis allowed the identification and relative quantification of 37 additional TAM metabolites. A number of new metabolites were detected in patients' plasma, including metabolites identified as didemethyl-trihydroxy-TAM-glucoside and didemethyl-tetrahydroxy-TAM-glucoside conjugates, corresponding to TAM with six and seven biotransformation steps, respectively. Multivariate analysis allowed relevant patterns of metabolites and ratios to be associated with TAM administration and CYP2D6 genotype. Two hydroxylated metabolites, α-OH-TAM and 4'-OH-TAM, were newly identified as putative CYP2D6 substrates. The relative quantification was precise (<20%), and the semiquantitative estimation suggests that metabolite levels are non-negligible. Metabolites could play an important role in drug toxicity, but their impact on drug-related side effects has been partially neglected due to the tremendous effort needed with previous MS technologies. With present HR-MS, this situation should evolve thanks to the straightforward determination of drug metabolites, enlarging the possibilities for studying inter- and intra-patient drug metabolism variability and related effects.
Abstract:
During the last 2 years, several novel genes that encode glucose transporter-like proteins have been identified and characterized. Because of their sequence similarity with GLUT1, these genes appear to belong to the family of solute carriers 2A (SLC2A, protein symbol GLUT). Sequence comparisons of all 13 family members allow the definition of characteristic sugar/polyol transporter signatures: (1) the presence of 12 membrane-spanning helices, (2) seven conserved glycine residues in the helices, (3) several basic and acidic residues at the intracellular surface of the proteins, (4) two conserved tryptophan residues, and (5) two conserved tyrosine residues. On the basis of sequence similarities and characteristic elements, the extended GLUT family can be divided into three subfamilies, namely class I (the previously known glucose transporters GLUT1-4), class II (the previously known fructose transporter GLUT5, together with GLUT7, GLUT9, and GLUT11), and class III (GLUT6, 8, 10, 12, and the myo-inositol transporter HMIT1). Functional characteristics have been reported for some of the novel GLUTs. Like GLUT1-4, they exhibit a tissue/cell-specific expression (GLUT6, leukocytes, brain; GLUT8, testis, blastocysts, brain, muscle, adipocytes; GLUT9, liver, kidney; GLUT10, liver, pancreas; GLUT11, heart, skeletal muscle). GLUT6 and GLUT8 appear to be regulated by subcellular redistribution, because they are targeted to intracellular compartments by dileucine motifs in a dynamin-dependent manner. Sugar transport has been reported for GLUT6, 8, and 11; HMIT1 has been shown to be an H+/myo-inositol co-transporter. Thus, the members of the extended GLUT family exhibit a surprisingly diverse substrate specificity, and the definition of sequence elements determining this substrate specificity will require a full functional characterization of all members.
Abstract:
Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to Cramér-Rao bounds.
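The style of evaluation described (simulated data, bias and variance checked against Cramér-Rao bounds) can be sketched on a toy problem. The example below estimates the mean of Gaussian measurements rather than the paper's polarimetric parameters; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy analogue of the evaluation above: the ML estimator of the mean of
# Gaussian measurements is the sample mean; its Cramér-Rao bound is sigma^2/n.
true_mean, sigma, n, trials = 2.0, 0.5, 200, 5000

estimates = np.empty(trials)
for t in range(trials):
    samples = rng.normal(true_mean, sigma, n)
    estimates[t] = samples.mean()   # ML estimate under Gaussian noise

crb = sigma**2 / n                   # Cramér-Rao bound for this problem
bias = estimates.mean() - true_mean
print(f"bias = {bias:.2e}, variance = {estimates.var():.2e}, CRB = {crb:.2e}")
```

Running this shows the empirical bias is negligible and the empirical variance sits at the bound, because the sample mean is an efficient estimator here; for the paper's estimators, efficiency only holds approximately, hence "close to" the bounds.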
Abstract:
A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to the previous ones: the effects of the selection of the sample, the observational errors, the galactic rotation and the interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors is based on the extensive use of numerical methods, so avoiding the need for simplification of the equations and thus the bias they could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.