853 results for Solving Rule
Abstract:
BACKGROUND: A simple prognostic model could help identify patients with pulmonary embolism who are at low risk of death and are candidates for outpatient treatment. METHODS: We randomly allocated 15,531 retrospectively identified inpatients who had a discharge diagnosis of pulmonary embolism from 186 Pennsylvania hospitals to derivation (67%) and internal validation (33%) samples. We derived our rule to predict 30-day mortality using classification tree analysis and patient data routinely available at initial examination as potential predictor variables. We used data from a European prospective study to externally validate the rule among 221 inpatients with pulmonary embolism. We determined mortality and nonfatal adverse medical outcomes across derivation and validation samples. RESULTS: Our final model consisted of 10 patient factors (age ≥ 70 years; history of cancer, heart failure, chronic lung disease, chronic renal disease, and cerebrovascular disease; and clinical variables of pulse rate ≥ 110 beats/min, systolic blood pressure < 100 mm Hg, altered mental status, and arterial oxygen saturation < 90%). Patients with none of these factors were defined as low risk. The 30-day mortality rates for low-risk patients were 0.6%, 1.5%, and 0% in the derivation, internal validation, and external validation samples, respectively. The rates of nonfatal adverse medical outcomes were less than 1% among low-risk patients across all study samples. CONCLUSIONS: This simple prediction rule accurately identifies patients with pulmonary embolism who are at low risk of short-term mortality and other adverse medical outcomes. Prospective validation of this rule is important before its implementation as a decision aid for outpatient treatment.
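As a concrete illustration, the rule reduces to an all-clear check over the ten factors above: a patient is low risk exactly when none is present. Below is a minimal sketch in Python; the field names are assumptions chosen for illustration, while the thresholds are those stated in the abstract.

```python
# Minimal sketch of the low-risk check; field names are illustrative
# assumptions, thresholds are the ten factors listed in the abstract.
def is_low_risk(patient: dict) -> bool:
    """Return True if the patient has none of the ten risk factors."""
    factors = [
        patient["age"] >= 70,
        patient["history_cancer"],
        patient["history_heart_failure"],
        patient["history_chronic_lung_disease"],
        patient["history_chronic_renal_disease"],
        patient["history_cerebrovascular_disease"],
        patient["pulse_bpm"] >= 110,
        patient["systolic_bp_mmhg"] < 100,
        patient["altered_mental_status"],
        patient["sao2_pct"] < 90,
    ]
    return not any(factors)

example = {
    "age": 58, "history_cancer": False, "history_heart_failure": False,
    "history_chronic_lung_disease": False, "history_chronic_renal_disease": False,
    "history_cerebrovascular_disease": False, "pulse_bpm": 92,
    "systolic_bp_mmhg": 128, "altered_mental_status": False, "sao2_pct": 96,
}
print(is_low_risk(example))  # True: no risk factors present
```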
Abstract:
In this article, we analyze the rationale for introducing outlier payments into a prospective payment system for hospitals under adverse selection and moral hazard. The payer has only two instruments: a fixed price for patients whose treatment cost is below a threshold and a cost-sharing rule for outlier patients. We show that a fixed-price policy is optimal when the hospital is sufficiently benevolent. When the hospital is weakly benevolent, a mixed policy solving a trade-off between rent extraction, efficiency, and dumping deterrence must be preferred. We show how the optimal combination of fixed price and partially cost-based payment depends on the degree of benevolence of the hospital, the social cost of public funds, and the distribution of patient severity.
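Schematically, the two-instrument scheme described above can be written as a transfer rule; the symbols below (fixed price P, outlier threshold c̄, cost-sharing fraction α) are notation chosen here for illustration rather than the authors':

\[
T(c) =
\begin{cases}
P, & c \le \bar{c},\\
P + \alpha\,(c - \bar{c}), & c > \bar{c},
\end{cases}
\qquad 0 \le \alpha \le 1.
\]

Setting α = 0 recovers the pure fixed-price policy; α > 0 yields the mixed, partially cost-based policy whose optimal level trades off rent extraction, efficiency, and dumping deterrence.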
Abstract:
Genes affect not only the behavior and fitness of their carriers but also that of other individuals. According to Hamilton's rule, whether a mutant gene will spread in the gene pool depends on the effects of its carrier on the fitness of all individuals in the population, each weighted by its relatedness to the carrier. However, social behaviors may affect not only recipients living in the generation of the actor but also individuals living in subsequent generations. In this note, I evaluate space-time relatedness coefficients for localized dispersal. These relatedness coefficients weight the selection pressures on long-lasting behaviors, which stem from a multigenerational gap between phenotypic expression by actors and the resulting environmental feedback on the fitness of recipients. Explicit values of space-time relatedness coefficients reveal that they can be surprisingly large for typical dispersal rates, even for hundreds of generations in the future.
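For reference, the classical two-party form of Hamilton's rule states that a helping behavior is favored when

\[
r\,b > c,
\]

where b is the fitness benefit to the recipient, c the fitness cost to the actor, and r their relatedness. A schematic rendering of the multigenerational setting considered here replaces r with space-time relatedness coefficients and sums over recipients at all spatial and temporal distances, i.e. a condition of the form \(\sum_i r_i b_i > c\); the exact coefficients for localized dispersal are the quantities evaluated in this note.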
Abstract:
Some people cannot buy products without first touching them, believing that doing so provides more assurance and information and reduces uncertainty. The international consumer marketing literature offers an instrument to measure consumers' need for physical contact, called Need for Touch (NFT). This paper analyzes whether the Need for Touch structure is empirically consistent. Based on a literature review, we propose six hypotheses to assess the nomological, convergent, and discriminant validity of the phenomenon. Of these, the data supported four in the predicted direction. Need for Touch was associated with Need for Input and with Need for Cognition. Need for Touch was not associated with traditional marketing channels. The results also showed the dual characterization of Need for Touch as a bi-dimensional construct. The moderator effect indicated that when the consumer has a higher (vs. lower) autotelic Need for Touch score, the experiential motivation for shopping plays a more (vs. less) important role in impulsive motivation. Our Study 3 supports the NFT structure and shows new associations with the need for unique products and dependent decisions.
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distance among points in the data space matters. Taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space, converting an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under general conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data, in order to determine how profitable the clustering is and which of the presented methods is the most suitable.
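The cluster-then-solve strategy can be sketched compactly. The following Python sketch partitions demand points with k-means (scikit-learn) and runs a simple simulated annealing over facility choices inside each cluster; the cluster count, cooling schedule, and single-facility move neighborhood are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: partition a large LA instance with k-means, then solve each
# piece with simulated annealing over facility-site choices.
import numpy as np
from sklearn.cluster import KMeans

def assignment_cost(points, facilities):
    """Total distance from each point to its nearest chosen facility."""
    d = np.linalg.norm(points[:, None, :] - facilities[None, :, :], axis=2)
    return d.min(axis=1).sum()

def anneal_la(points, n_fac, iters=2000, t0=1.0, cooling=0.999, rng=None):
    """Pick n_fac facility sites among the points via simulated annealing."""
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.choice(len(points), n_fac, replace=False)
    cur = assignment_cost(points, points[idx])
    best, best_idx, t = cur, idx.copy(), t0
    for _ in range(iters):
        cand = idx.copy()
        cand[rng.integers(n_fac)] = rng.integers(len(points))  # move one facility
        cost = assignment_cost(points, points[cand])
        if cost < cur or rng.random() < np.exp((cur - cost) / t):
            idx, cur = cand, cost
            if cur < best:
                best, best_idx = cur, idx.copy()
        t *= cooling
    return best_idx, best

# Partition a large instance into simpler subproblems, then solve each.
points = np.random.default_rng(1).uniform(0, 100, size=(1000, 2))
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(points)
total = sum(anneal_la(points[labels == c], n_fac=3)[1] for c in range(5))
print(f"clustered LA total cost: {total:.1f}")
```

Solving the five small subproblems independently is what makes the decomposition pay off: each annealing run searches a far smaller facility-choice space than the original 1,000-point instance.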
Abstract:
This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with star topology, using the Demand Assigned Multiple Access (DAMA) protocol in the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) in the physical layer.
Abstract:
In this paper, we propose a methodology to determine the most efficient and least costly crew pairings. We develop the methodology as an optimization algorithm, implemented in the Java programming language on the open-source Eclipse IDE, to solve crew scheduling problems.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by simply incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
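The overall loop is the standard ILS skeleton: construct a biased-randomized starting sequence, improve it by local search, then alternate perturbation, local search, and acceptance. Below is a minimal Python sketch under assumptions of ours: a total-work-based biased construction, job-reinsertion local search, a two-job-swap perturbation, and an accept-if-no-worse criterion; the actual ILS-ESP operators differ in their details.

```python
# Minimal ILS skeleton for the PFSP; operator choices are illustrative.
import random

def makespan(seq, p):
    """Permutation flowshop makespan; p[j][m] = time of job j on machine m."""
    m = len(p[0])
    c = [0] * m
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def local_search(seq, p):
    """First-improvement job reinsertion until no move helps."""
    best = makespan(seq, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            job, rest = seq[i], seq[:i] + seq[i + 1:]
            for k in range(len(rest) + 1):
                cand = rest[:k] + [job] + rest[k:]
                if makespan(cand, p) < best:
                    seq, best, improved = cand, makespan(cand, p), True
                    break
            if improved:
                break
    return seq, best

def biased_start(p, rng):
    """Biased-randomized ordering: usually the job with most total work,
    occasionally a near-best one."""
    pool = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    while pool:
        k = min(rng.randrange(3) if rng.random() < 0.3 else 0, len(pool) - 1)
        seq.append(pool.pop(k))
    return seq

def ils_esp(p, iters=200, seed=0):
    rng = random.Random(seed)
    seq, best = local_search(biased_start(p, rng), p)
    cur, cur_val = seq, best
    for _ in range(iters):
        pert = cur[:]
        i, j = rng.sample(range(len(pert)), 2)  # perturb: swap two jobs
        pert[i], pert[j] = pert[j], pert[i]
        cand, val = local_search(pert, p)
        if val <= cur_val:                      # simple acceptance rule
            cur, cur_val = cand, val
            if val < best:
                seq, best = cand, val
    return seq, best

# Tiny 4-job, 3-machine instance.
p = [[5, 3, 4], [2, 6, 3], [4, 2, 5], [3, 4, 2]]
print(ils_esp(p))
```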
Abstract:
In this paper, I consider a general and informationally efficient approach to determine the optimal access rule and show that there exists a simple rule that achieves the Ramsey outcome as the unique equilibrium when networks compete in linear prices without network-based price discrimination. My approach is informationally efficient in the sense that the regulator is required to know only the marginal cost structure, i.e. the marginal cost of making and terminating a call. The approach is general in that access prices can depend not only on the marginal costs but also on the retail prices, which can be observed by consumers and therefore by the regulator as well. In particular, I consider the set of linear access pricing rules which includes any fixed access price, the Efficient Component Pricing Rule (ECPR) and the Modified ECPR as special cases. I show that in this set, there is a unique access rule that achieves the Ramsey outcome as the unique equilibrium as long as there exists at least a mild degree of substitutability among networks' services.
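Schematically, a linear access pricing rule of the kind considered here sets the access charge as an affine function of the observable retail price,

\[
a = \alpha + \beta\, p,
\]

where a is the per-unit access charge, p the retail price, and the coefficients α, β may depend on the marginal costs of making and terminating a call. Taking β = 0 gives a fixed access price, while ECPR-style rules tie the access charge to the retail price net of avoided marginal retail cost. The notation here is chosen for illustration; the paper characterizes which rule in this family yields the Ramsey outcome.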
Abstract:
Studying the geographic variation of phenotypic traits can provide key information about the potential adaptive function of alternative phenotypes. Gloger's rule posits that animals should be dark- vs. light-colored in warm and humid vs. cold and dry habitats, respectively. The rule is based on the assumption that melanin pigments and/or dark coloration confer selective advantages in warm and humid regions. This rule may not apply, however, if genes for color are acting on other traits conferring fitness benefits in specific climes. Covariation between coloration and climate will therefore depend on the relative importance of coloration or melanin pigments and the genetically correlated physiological and behavioral processes that enable an animal to deal with climatic factors. The Barn Owl (Tyto alba) displays three melanin-based plumage traits, and we tested whether geographic variation in these traits at the scale of the North American continent supported Gloger's rule. An analysis of variation of pheomelanin-based reddish coloration and of the number and size of black feather spots in 1,369 museum skin specimens showed that geographic variation was correlated with ambient temperature and precipitation. Owls were darker red in color and displayed larger but fewer black feather spots in colder regions. Owls also exhibited more and larger black spots in regions where the climate was dry in winter. We propose that the associations between pigmentation and ambient temperature are of opposite sign for reddish coloration and spot size vs. the number of spots because selection exerted by climate (or a correlated variable) is plumage trait-specific or because plumage traits are genetically correlated with different adaptations.
Abstract:
Directed evolution of life through millions of years, such as increasing adult body size, is one of the most intriguing patterns displayed by fossil lineages. Processes and causes of such evolutionary trends are still poorly understood. Ammonoids (externally shelled marine cephalopods) are well known to have experienced repetitive morphological evolutionary trends of their adult size, shell geometry and ornamentation. This study analyses the evolutionary trends of the family Acrochordiceratidae Arthaber, 1911 from the Early to Middle Triassic (251–228 Ma). Exceptionally large and bed-rock-controlled collections of this ammonoid family were obtained from strata of Anisian age (Middle Triassic) in north-west Nevada and north-east British Columbia. They enable quantitative and statistical analyses of its morphological evolutionary trends. This study demonstrates that the monophyletic clade Acrochordiceratidae underwent the classical evolute to involute evolutionary trend (i.e. increasing coiling of the shell), an increase in its shell adult size (conch diameter) and an increase in the indentation of its shell suture shape. These evolutionary trends are statistically robust and seem more or less gradual. Furthermore, they are nonrandom with the sustained shift in the mean, the minimum and the maximum of studied shell characters. These results can be classically interpreted as being constrained by the persistence and common selection pressure on this mostly anagenetic lineage characterized by relatively moderate evolutionary rates. Increasing involution of ammonites is traditionally interpreted by increasing adaptation mostly in terms of improved hydrodynamics. However, this trend in ammonoid geometry can also be explained as a case of Cope's rule (increasing adult body size) instead of functional explanation of coiling, because both shell diameter and shell involution are two possible paths for ammonoids to accommodate size increase.
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling n classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
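For orientation, the classical Smith's rule for linear costs sequences jobs in nonincreasing order of the index

\[
\nu_j = \frac{c_j}{\mathbb{E}[S_j]},
\]

the holding-cost rate saved per unit of expected processing time. A schematic rendering of the extension described above attaches an index to each extended class (i, n), comparing the marginal reduction in the expected holding-cost rate from working a class-i job when n are present against the expected processing time spent, e.g. \(\nu_{i,n} = \bigl(h_i(n) - h_i(n-1)\bigr)/\mathbb{E}[S_i]\) with h_i the convex holding-cost rate; this rendering is schematic, and the precise PCL-based indices are those derived in the paper.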
Abstract:
In today's competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need of developing efficient methods to solve complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems, and are usually associated with cleaning operations and changing tools and shapes in machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times and delivery times; the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
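As an illustration of a priority dispatching rule for the first problem, the Python sketch below schedules a single machine with release dates r, processing times p, delivery times q, and sequence-dependent setups s, greedily picking among released jobs by a delivery-time-minus-setup priority and reporting the maximum of completion-plus-delivery times. The priority formula is an assumption of ours, not one of the paper's rules.

```python
# Sketch of a dispatching rule for 1 | r_j, s_ij, q_j | Lmax-style problems.
def dispatch(r, p, q, s):
    n = len(p)
    done, seq, t, last, lmax = set(), [], 0, None, 0
    while len(done) < n:
        avail = [j for j in range(n) if j not in done and r[j] <= t]
        if not avail:                       # idle until the next release
            t = min(r[j] for j in range(n) if j not in done)
            continue
        # Priority: prefer long delivery time (urgent), short setup.
        j = max(avail, key=lambda a: q[a] - (s[last][a] if last is not None else 0))
        t += (s[last][j] if last is not None else 0) + p[j]
        lmax = max(lmax, t + q[j])          # completion plus delivery time
        seq.append(j); done.add(j); last = j
    return seq, lmax

r = [0, 1, 3]; p = [4, 2, 3]; q = [5, 8, 2]
s = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
print(dispatch(r, p, q, s))  # ([0, 1, 2], 15)
```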
Abstract:
1.1 Fundamentals Chest pain is a common complaint in primary care patients (1 to 3% of all consultations) (1) and its aetiology can be miscellaneous, from harmless to potentially life-threatening conditions. In primary care practice, the most prevalent aetiologies are: chest wall syndrome (43%), coronary heart disease (12%) and anxiety (7%) (2). In up to 20% of cases, potentially serious conditions such as cardiac, respiratory or neoplastic diseases underlie chest pain. In this context, a large number of laboratory tests are run (42%) and over 16% of patients are referred to a specialist or hospitalized (2).

A cardiovascular origin of chest pain can threaten the patient's life, and the investigations run to exclude a serious condition can be expensive and involve a large number of exams or referral to a specialist, often without real clinical need. In emergency settings, up to 80% of chest pain cases are due to cardiovascular events (3), and scoring methods have been developed to identify conditions such as coronary heart disease (CHD) quickly and efficiently (4-6). In primary care, a cardiovascular origin is present in only about 12% of patients with chest pain (2), and general practitioners (GPs) need to exclude as safely as possible a potentially serious condition underlying chest pain. A simple clinical prediction rule (CPR) like those available in emergency settings may therefore help GPs save time and spare extra investigations in ruling out CHD in primary care patients. Such a tool may also help GPs reassure patients whose chest pain has a more common origin.