844 results for Random Allocation
Abstract:
Let Ξ ⊆ ℝ^d be a set of centers chosen according to a Poisson point process in ℝ^d. Let ψ be an allocation of ℝ^d to Ξ in the sense of the Gale-Shapley marriage problem, with the additional feature that every center ξ ∈ Ξ has an appetite given by a nonnegative random variable α. Generalizing some previous results, we study large deviations for the distance of a typical point x ∈ ℝ^d to its center ψ(x) ∈ Ξ, subject to some restrictions on the moments of α.
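The setting is easy to explore numerically. The following Python sketch, offered only as an illustration, uses a greedy distance-based assignment with per-center capacities as a crude stand-in for the full Gale-Shapley stable allocation; the intensity, the exponential appetite distribution, and the capacity rule are assumptions, not choices from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Centers: a Poisson number of uniform points in the unit square (d = 2).
    intensity = 50.0
    n_centers = rng.poisson(intensity)
    centers = rng.uniform(size=(n_centers, 2))
    appetites = rng.exponential(scale=1.0, size=n_centers)  # the alphas

    # "Typical" points x to be allocated, and integer capacities derived
    # from the appetites (an illustrative discretization).
    points = rng.uniform(size=(2000, 2))
    capacity = np.ceil(appetites * len(points) / appetites.sum()).astype(int)

    # Greedy allocation: sweep (point, center) pairs by increasing distance,
    # respecting each center's capacity; a stand-in for stable allocation.
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    pairs = np.dstack(np.unravel_index(np.argsort(d, axis=None), d.shape))[0]
    assigned = np.full(len(points), -1)
    load = np.zeros(n_centers, dtype=int)
    for i, j in pairs:
        if assigned[i] < 0 and load[j] < capacity[j]:
            assigned[i], load[j] = j, load[j] + 1

    ok = assigned >= 0
    print("mean |x - psi(x)|:", d[np.arange(len(points))[ok], assigned[ok]].mean())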
Abstract:
OBJECTIVE To determine if adequacy of randomisation and allocation concealment is associated with changes in effect sizes (ES) when comparing physical therapy (PT) trials with and without these methodological characteristics. DESIGN Meta-epidemiological study. PARTICIPANTS A random sample of randomised controlled trials (RCTs) included in meta-analyses in the PT discipline was identified. INTERVENTION Data extraction, including assessments of random sequence generation and allocation concealment, was conducted independently by two reviewers. To determine the association between random sequence generation, allocation concealment, and ES, a two-level analysis was conducted using a meta-meta-analytic approach. PRIMARY AND SECONDARY OUTCOME MEASURES Association between random sequence generation and allocation concealment and ES in PT trials. RESULTS 393 trials included in 43 meta-analyses, analysing 44 622 patients, contributed to this study. Adequate random sequence generation and appropriate allocation concealment were accomplished in only 39.7% and 11.5% of PT trials, respectively. Although trials with inappropriate allocation concealment tended to overestimate the treatment effect when compared with trials with adequate concealment of allocation, the difference was not statistically significant (ES=0.12; 95% CI -0.06 to 0.30). When pooling our results with those of Nuesch et al., we obtained a pooled, statistically significant value (ES=0.14; 95% CI 0.02 to 0.26). There was no difference in ES between trials with appropriate and inappropriate random sequence generation (ES=0.02; 95% CI -0.12 to 0.15). CONCLUSIONS Our results suggest that when evaluating the risk of bias of primary RCTs in the PT area, systematic reviewers and clinicians implementing research into practice should pay attention to these biases, since they could exaggerate treatment effects. Systematic reviewers should perform a sensitivity analysis including trials with low risk of bias in these domains as the primary analysis and/or in combination with less restrictive analyses. Authors and editors should make sure that allocation concealment and random sequence generation are properly reported in trial reports.
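For readers who want to reproduce the pooling step, here is a minimal Python sketch of fixed-effect inverse-variance pooling, assuming independent estimates and symmetric 95% CIs (so SE = CI width / (2 x 1.96)); whether this matches the paper's exact pooling model is an assumption, and the second input tuple below is a hypothetical placeholder, since the estimate of Nuesch et al. is not given here.

    import math

    def pool(estimates):
        """Fixed-effect inverse-variance pooling of (es, ci_low, ci_high)."""
        weights, weighted = [], []
        for es, lo, hi in estimates:
            se = (hi - lo) / (2 * 1.96)       # SE from a symmetric 95% CI
            w = 1.0 / se ** 2
            weights.append(w)
            weighted.append(w * es)
        es_pooled = sum(weighted) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        return es_pooled, es_pooled - 1.96 * se_pooled, es_pooled + 1.96 * se_pooled

    # First tuple from the results above; the second is hypothetical.
    print(pool([(0.12, -0.06, 0.30), (0.17, 0.01, 0.33)]))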
Finite mixture regression model with random effects: application to neonatal hospital length of stay
Abstract:
A two-component mixture regression model that allows simultaneously for heterogeneity and dependency among observations is proposed. By specifying random effects explicitly in the linear predictor of the mixture probability and the mixture components, parameter estimation is achieved by maximising the corresponding best linear unbiased prediction-type log-likelihood. Approximate residual maximum likelihood estimates are obtained via an EM algorithm in the manner of generalised linear mixed models (GLMMs). The method can be extended to a g-component mixture regression model with component densities from the exponential family, leading to the development of the class of finite mixture GLMMs. For illustration, the method is applied to analyse neonatal length of stay (LOS). It is shown that identification of pertinent factors that influence hospital LOS can provide important information for health care planning and resource allocation.
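As a concrete illustration of the estimation idea, here is a minimal Python sketch of EM for a two-component mixture of linear regressions with Gaussian components; it assumes a single mixing weight and a common variance, and omits the random effects and REML machinery described above.

    import numpy as np

    def em_mixture_regression(X, y, n_iter=100, seed=0):
        """Two-component Gaussian mixture of linear regressions via EM."""
        n, p = X.shape
        rng = np.random.default_rng(seed)
        beta = rng.normal(size=(2, p))       # per-component coefficients
        pi, sigma2 = 0.5, float(np.var(y))   # mixing weight, common variance
        for _ in range(n_iter):
            # E-step: responsibility of each component for each observation
            resid = y[None, :] - beta @ X.T                  # shape (2, n)
            dens = np.exp(-0.5 * resid**2 / sigma2) * np.array([[pi], [1 - pi]])
            r = dens / dens.sum(axis=0, keepdims=True)
            # M-step: weighted least squares per component, then update
            # the mixing weight and the shared variance
            for k in range(2):
                w = r[k]
                beta[k] = np.linalg.solve((X.T * w) @ X, (X.T * w) @ y)
            pi = r[0].mean()
            resid = y[None, :] - beta @ X.T
            sigma2 = float((r * resid**2).sum() / n)
        return beta, pi, sigma2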
Abstract:
The buffer allocation problem (BAP) is a well-known difficult problem in the design of production lines. We present a stochastic algorithm for solving the BAP, based on the cross-entropy method, a new paradigm for stochastic optimization. The algorithm involves the following iterative steps: (a) the generation of buffer allocations according to a certain random mechanism, followed by (b) the modification of this mechanism on the basis of cross-entropy minimization. Through various numerical experiments we demonstrate the efficiency of the proposed algorithm and show that the method can quickly generate (near-)optimal buffer allocations for fairly large production lines.
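The two iterative steps translate naturally into code. The following Python sketch of the cross-entropy method for the BAP assumes a placeholder throughput() evaluator and independent per-slot sampling distributions, and ignores any total-buffer budget constraint; all parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def throughput(alloc):
        # Stand-in objective; in practice this is a production-line
        # simulator or an analytical throughput evaluation.
        return -float(np.var(alloc))          # toy: rewards balanced buffers

    def cross_entropy_bap(n_slots=5, max_buf=20, n_samples=200,
                          elite_frac=0.1, n_iter=50, smooth=0.7):
        # p[i, b] = probability that slot i is given buffer level b
        p = np.full((n_slots, max_buf + 1), 1.0 / (max_buf + 1))
        for _ in range(n_iter):
            # (a) generate buffer allocations from the random mechanism
            samples = np.array([[rng.choice(max_buf + 1, p=p[i])
                                 for i in range(n_slots)]
                                for _ in range(n_samples)])
            scores = np.array([throughput(s) for s in samples])
            n_elite = max(1, int(elite_frac * n_samples))
            elite = samples[np.argsort(scores)[-n_elite:]]
            # (b) modify the mechanism via cross-entropy minimization:
            # the update is the elite empirical distribution, smoothed
            for i in range(n_slots):
                counts = np.bincount(elite[:, i], minlength=max_buf + 1)
                p[i] = smooth * counts / counts.sum() + (1 - smooth) * p[i]
        return p.argmax(axis=1)

    print(cross_entropy_bap())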
Abstract:
An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, Emergency Managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but not with the incorporation of the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input that is determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic has been developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA), in which the initial solution and the cooling rate were determined via a design of experiments. The experiments showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner. Finally, an illustrative example shows that the meta-model is practical.
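As a sketch of the search component, here is a generic simulated annealing loop in Python with a geometric cooling schedule (the δ above), assuming a weighted-sum scalarization of the objectives; the neighborhood move, the objectives, and all parameters are placeholders rather than the dissertation's MOSA.

    import math, random

    def anneal(initial, objectives, weights, neighbor,
               t0=1.0, delta=0.999, n_iter=20000, seed=0):
        """Simulated annealing on a weighted sum of several objectives."""
        rng = random.Random(seed)
        cost = lambda x: sum(w * f(x) for w, f in zip(weights, objectives))
        x = initial
        cx = cost(x)
        best, cbest = x, cx
        t = t0
        for _ in range(n_iter):
            y = neighbor(x, rng)
            cy = cost(y)
            # Accept improvements always; accept worsenings with
            # Boltzmann probability exp(-(cy - cx) / t).
            if cy <= cx or rng.random() < math.exp((cx - cy) / t):
                x, cx = y, cy
                if cx < cbest:
                    best, cbest = x, cx
            t *= delta         # gradual cooling: the delta found above
        return best, cbest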
Abstract:
In the half-duplex relay channel applying the decode-and-forward protocol, the relay introduces energy into the channel, as observed at the destination, over random time intervals. Consequently, during simulation the average signal power seen at the destination becomes known only at run-time. Therefore, in order to obtain specific performance measures at the signal-to-noise ratio (SNR) of interest, strategies are required to adjust the noise variance during the simulation run. It is necessary that these strategies yield the same performance as measured under real-world conditions. This paper introduces three noise power allocation strategies and demonstrates their applicability using numerical and simulation results.
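One generic way to realize such a run-time adjustment is sketched below in Python: the noise variance is derived from the signal power measured so far and the target SNR. This is an illustrative estimator, not one of the paper's three specific strategies.

    import numpy as np

    rng = np.random.default_rng(0)

    def noise_std_for_target_snr(received, snr_db):
        """Noise standard deviation that realizes a target SNR, given the
        average signal power measured at the destination so far."""
        p_signal = np.mean(np.abs(received) ** 2)
        p_noise = p_signal / (10 ** (snr_db / 10))
        return np.sqrt(p_noise)

    # Stand-in received samples; with a relay, the power contributed over
    # random time intervals is only known once the run is under way.
    rx = rng.normal(size=10_000)
    sigma = noise_std_for_target_snr(rx, snr_db=10.0)
    noisy = rx + rng.normal(scale=sigma, size=rx.size)
    print("measured SNR (dB):", 10 * np.log10(np.mean(rx**2) / sigma**2))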
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions.
Abstract:
In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit using the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that uses solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as the initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, particularly for large-scale auctions.
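To make the approach concrete, here is a minimal Python sketch of a biased random-key genetic algorithm with a greedy decoder: bids are scanned in key order and accepted if their items are still free. The bid format, parameters, and decoder are illustrative and do not correspond to any of the six variants above.

    import random

    def decode(keys, bids):
        """Greedy decoder: visit bids in random-key order; accept a bid if
        none of its items has been taken yet. Returns the total profit."""
        taken, profit = set(), 0.0
        for i in sorted(range(len(bids)), key=lambda i: keys[i]):
            price, items = bids[i]
            if not (items & taken):
                taken |= items
                profit += price
        return profit

    def brkga(bids, pop=50, elite=10, mutants=5, gens=200, bias=0.7, seed=0):
        rng = random.Random(seed)
        n = len(bids)
        P = [[rng.random() for _ in range(n)] for _ in range(pop)]
        for _ in range(gens):
            P.sort(key=lambda c: -decode(c, bids))            # best first
            nxt = P[:elite]                                   # elitism
            nxt += [[rng.random() for _ in range(n)] for _ in range(mutants)]
            while len(nxt) < pop:                             # biased crossover:
                e, o = rng.choice(P[:elite]), rng.choice(P[elite:])
                nxt.append([e[i] if rng.random() < bias else o[i]
                            for i in range(n)])               # favor elite genes
            P = nxt
        return max(decode(c, bids) for c in P)

    # Toy auction: items are sets, prices are first-price bids.
    bids = [(10.0, {1, 2}), (7.0, {2, 3}), (6.0, {3})]
    print(brkga(bids))                                        # expect 16.0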
Abstract:
Isosorbide succinate moieties were incorporated into the poly(L-lactide) (PLLA) backbone in order to obtain a new class of biodegradable polymers with enhanced properties. This paper describes the synthesis and characterization of four types of low-molecular-weight copolymers. Copolymer I was obtained from monomer mixtures of L-lactide, isosorbide, and succinic anhydride; II from oligo(L-lactide) (PLLA), isosorbide, and succinic anhydride; III from oligo(isosorbide succinate) (PIS) and L-lactide; and IV from transesterification reactions between PLLA and PIS. MALDI-TOF MS and 13C-NMR analyses gave evidence that co-oligomerization was successfully attained in all cases. The data suggested that product I is a random co-oligomer and products II-IV are block co-oligomers.
Abstract:
Consider a random medium consisting of N points randomly distributed so that there is no correlation among the distances separating them. This is the random link model, which is the high-dimensionality limit (mean-field approximation) of the Euclidean random point structure. In the random link model, at discrete time steps, a walker moves to the nearest point that has not been visited in the last μ steps (the memory), producing a deterministic partially self-avoiding walk (the tourist walk). We have analytically obtained the distribution of the number n of points explored by the walker with memory μ = 2, as well as the joint distribution of the transient and period. This result enables us to explain the abrupt change in the exploratory behavior between the cases μ = 1 (memoryless walker, driven by extreme value statistics) and μ = 2 (walker with memory, driven by combinatorial statistics). In the μ = 1 case, the mean number of newly visited points in the thermodynamic limit (N >> 1) is just ⟨n⟩ = e = 2.72..., while in the μ = 2 case, the mean number ⟨n⟩ of visited points grows proportionally to N^(1/2). This result also allows us to establish an equivalence between the random link model with μ = 2 and the random map (uncorrelated back-and-forth distances) with μ = 0, and to explain the abrupt change between the probabilities for a null transient time and subsequent ones.
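The walk itself is straightforward to simulate. The Python sketch below implements the tourist walk on the random link model, assuming i.i.d. uniform (and possibly asymmetric) distances; N, the number of steps, and the distance law are illustrative choices.

    import numpy as np

    def tourist_walk(N=100, mu=2, steps=1000, seed=0):
        """Distinct points visited by the tourist walk on the random link model."""
        rng = np.random.default_rng(seed)
        d = rng.uniform(size=(N, N))               # uncorrelated "distances"
        d[np.eye(N, dtype=bool)] = np.inf          # no self-moves
        path, current = [0], 0
        for _ in range(steps):
            dist = d[current].copy()
            dist[path[-mu:]] = np.inf              # forbid the memory window
            current = int(np.argmin(dist))
            path.append(current)
        return len(set(path))

    print(tourist_walk(mu=1), tourist_walk(mu=2))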
Abstract:
Objective: The aim of this study was to assess the effects of 830 and 670 nm laser radiation on malondialdehyde (MDA) concentration in random skin-flap survival. Background Data: Low-level laser therapy (LLLT) has been reported to be successful in stimulating the formation of new blood vessels and activating superoxide dismutase delivery, thus helping to inhibit free-radical action and consequently reducing necrosis. Materials and Methods: Thirty Wistar rats were used, divided into three groups of 10 rats each. A random skin flap was raised on the dorsum of each animal. Group 1 was the control group; group 2 received 830 nm laser radiation; and group 3 was submitted to 670 nm laser radiation. The animals underwent laser therapy with an energy density of 36 J/cm² immediately after surgery and on each of the 4 days following surgery. The laser radiation was applied at one point, 2.5 cm from the flap's cranial base. The percentage of skin-flap necrosis area was calculated 7 days postoperatively using the paper-template method, and a skin sample was collected immediately afterwards to determine the MDA concentration. Results: Statistically significant differences were found between the necrosis percentages, with higher values in group 1 compared with groups 2 and 3. Groups 2 and 3 did not present statistically significant differences (p > 0.05). Group 3 had a lower MDA concentration than the control group (p < 0.05). Conclusion: LLLT was effective in increasing random skin-flap viability in rats, and the 670 nm laser was efficient in reducing the MDA concentration.
Abstract:
Mature weight breeding values were estimated using a multi-trait animal model (MM) and a random regression animal model (RRM). Data consisted of 82 064 weight records from 8 145 animals, recorded from birth to eight years of age. Weights at standard ages were considered in the MM. All models included contemporary groups as fixed effects, and age of dam (linear and quadratic effects) and animal age as covariates. In the RRM, mean trends were modelled through a cubic regression on orthogonal polynomials of animal age, and direct and maternal genetic effects and direct and maternal permanent environmental effects were included as random. Legendre polynomials of orders 4, 3, 6 and 3 were used for the direct genetic, maternal genetic, direct permanent environmental and maternal permanent environmental effects, respectively, considering five classes of residual variances. Mature weight (five years) direct heritability estimates were 0.35 (MM) and 0.38 (RRM). The rank correlation between sires' breeding values estimated by MM and RRM was 0.82. However, if the top 2% (12) or 10% (62) of the young sires were selected based on MM-predicted breeding values, only 71% and 80% of the same sires, respectively, would be selected using RRM estimates. The RRM modelled the changes in the (co)variances with age adequately, and greater breeding value accuracies can be expected using this model.
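The covariates behind such a random regression model are easy to construct. The Python sketch below builds the Legendre-polynomial design columns from ages standardized to [-1, 1]; the ages and the order are illustrative, and the full mixed-model equations are omitted.

    import numpy as np
    from numpy.polynomial import legendre

    def legendre_design(ages, order):
        """Design matrix whose column j is P_j evaluated at standardized age."""
        a = np.asarray(ages, dtype=float)
        x = 2 * (a - a.min()) / (a.max() - a.min()) - 1    # map to [-1, 1]
        return np.stack([legendre.legval(x, np.eye(order + 1)[j])
                         for j in range(order + 1)], axis=1)

    Z = legendre_design(ages=[1, 365, 730, 1825, 2920], order=3)
    print(Z.shape)   # (5, 4): one row per record, one column per coefficient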
Abstract:
Imprinted inactivation of the paternal X chromosome in marsupials is the primordial mechanism of dosage compensation for X-linked genes between females and males in therians. In eutherian mammals, X chromosome inactivation (XCI) evolved into a random process in cells of the embryo proper, where either the maternal or the paternal X can be inactivated. However, species such as mouse and bovine have maintained imprinted XCI exclusively in extraembryonic tissues. The existence of imprinted XCI in humans remains controversial, with studies based on the analyses of only one or two X-linked genes in different extraembryonic tissues. Here we readdress this issue in human term placenta by performing a robust analysis of allele-specific expression of 22 X-linked genes, including XIST, using 27 SNPs in transcribed regions. We show that XCI is random in the human placenta, and that this organ is arranged in relatively large patches of cells with either the maternal or the paternal X inactive. In addition, this analysis indicated heterogeneous maintenance of gene silencing along the inactive X, which, combined with the extensive mosaicism found in the placenta, can explain the lack of agreement among previous studies. Our results illustrate the differences in the XCI mechanism between humans and mice, and highlight the importance of addressing imprinted XCI in other species in order to understand the evolution of dosage compensation in placental mammals.
Abstract:
It is shown that the recently considered families of generalized matrix ensembles which give rise to an orthogonally invariant stable Lévy ensemble can be generated by the simple procedure of dividing Gaussian matrices by a random variable. The nonergodicity of this kind of disordered ensemble is investigated. It is shown that the same procedure applied to random graphs gives rise to a family that interpolates between the Erdős-Rényi and scale-free models.
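The generating procedure admits a very short illustration. In the Python sketch below, a symmetric Gaussian (GOE-like) matrix is divided by a single scalar random variable; the chi-distributed divisor is an assumed, illustrative choice and not necessarily the one that yields the stable Lévy ensemble.

    import numpy as np

    rng = np.random.default_rng(0)

    def divided_gaussian(n, divisor_sampler):
        """Symmetric Gaussian matrix divided by one shared random scalar."""
        g = rng.normal(size=(n, n))
        h = (g + g.T) / np.sqrt(2)                 # GOE-like symmetric matrix
        return h / divisor_sampler()               # heavy tails from the divisor

    # Illustrative divisor: chi-distributed with few degrees of freedom,
    # which makes the matrix-element distribution heavy-tailed.
    m = divided_gaussian(200, lambda: np.sqrt(rng.chisquare(df=2) / 2))
    print(np.linalg.eigvalsh(m)[:3])               # a few extreme eigenvalues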
Abstract:
A photoluminescence (PL) study of individual electron states localized in a random potential is performed in artificially disordered superlattices embedded in a wide parabolic well. The valence-band bowing of the parabolic potential provides a variation of the emission energies which splits the optical transitions corresponding to different wells within the random potential. A blueshift of the PL lines emitted by individual random wells, observed with increasing disorder strength, is demonstrated. Varying the temperature and magnetic field made it possible to distinguish the behavior of electrons localized in individual wells of the random potential.
Abstract:
The transition of plasmons from a propagating to a localized state was studied in disordered systems formed in GaAs/AlGaAs superlattices by impurities and by an artificial random potential. Both the localization length and the linewidth of the plasmons were measured by Raman scattering. The vanishing dependence of the plasmon linewidth on the disorder strength was shown to be a manifestation of strong plasmon localization. A theoretical approach based on representing the plasmon wave function in a Gaussian form accounted well for the experimental data.