916 results for Approximate Sum Rule


Relevance: 20.00%

Abstract:

Directed evolution of life through millions of years, such as increasing adult body size, is one of the most intriguing patterns displayed by fossil lineages. The processes and causes of such evolutionary trends are still poorly understood. Ammonoids (externally shelled marine cephalopods) are well known to have experienced repetitive morphological evolutionary trends in their adult size, shell geometry and ornamentation. This study analyses the evolutionary trends of the family Acrochordiceratidae Arthaber, 1911 from the Early to Middle Triassic (251–228 Ma). Exceptionally large and bedrock-controlled collections of this ammonoid family were obtained from strata of Anisian age (Middle Triassic) in north-west Nevada and north-east British Columbia. They enable quantitative and statistical analyses of its morphological evolutionary trends. This study demonstrates that the monophyletic clade Acrochordiceratidae underwent the classical evolute-to-involute evolutionary trend (i.e. increasing coiling of the shell), an increase in its adult shell size (conch diameter) and an increase in the indentation of its shell suture shape. These evolutionary trends are statistically robust and appear more or less gradual. Furthermore, they are nonrandom, with sustained shifts in the mean, minimum and maximum of the studied shell characters. These results can be classically interpreted as being constrained by persistent, common selection pressure on this mostly anagenetic lineage, which is characterized by relatively moderate evolutionary rates. Increasing involution of ammonites is traditionally interpreted as increasing adaptation, mostly in terms of improved hydrodynamics. However, this trend in ammonoid geometry can also be explained as a case of Cope's rule (increasing adult body size) rather than by a functional explanation of coiling, because both shell diameter and shell involution are two possible paths for ammonoids to accommodate size increase.

Relevance: 20.00%

Abstract:

We extend Aumann's theorem [Aumann 1987], which derives correlated equilibria as a consequence of common priors and common knowledge of rationality, by explicitly allowing for non-rational behavior. We replace the assumption of common knowledge of rationality with a substantially weaker one, joint p-belief of rationality, under which agents believe the other agents are rational with probability p or more. We show that behavior in this case constitutes a kind of correlated equilibrium satisfying certain p-belief constraints, that it varies continuously in the parameter p and that, for p sufficiently close to one, it is with high probability supported on strategies that survive the iterated elimination of strictly dominated strategies. Finally, we extend the analysis to characterizing rational expectations of interim types, to games of incomplete information, and to the case of non-common priors.

Relevance: 20.00%

Abstract:

Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex in the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL), which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
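
For reference, the classical Smith's rule that the abstract generalizes orders job classes by the ratio of (linear) holding-cost rate to expected processing time. The sketch below illustrates only that baseline rule, not the paper's extended-class indices; all names and numbers are illustrative.

```python
from dataclasses import dataclass

@dataclass
class JobClass:
    name: str
    holding_cost_rate: float      # linear holding cost per unit time (c_j)
    mean_processing_time: float   # expected processing time E[p_j]

def smith_order(jobs):
    """Classical Smith's rule: serve classes in decreasing order of c_j / E[p_j].

    This is the priority index for the linear-cost case; the paper's dynamic
    index rule extends it to convex holding costs via extended classes.
    """
    return sorted(jobs,
                  key=lambda j: j.holding_cost_rate / j.mean_processing_time,
                  reverse=True)

if __name__ == "__main__":
    jobs = [JobClass("A", 3.0, 2.0), JobClass("B", 5.0, 4.0), JobClass("C", 1.0, 0.5)]
    print([j.name for j in smith_order(jobs)])  # -> ['C', 'A', 'B']
```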

Relevance: 20.00%

Abstract:

1.1 Fundamentals. Chest pain is a common complaint in primary care patients (1 to 3% of all consultations) (1) and its aetiology is varied, ranging from harmless to potentially life-threatening conditions. In primary care practice, the most prevalent aetiologies are chest wall syndrome (43%), coronary heart disease (12%) and anxiety (7%) (2). In up to 20% of cases, potentially serious conditions such as cardiac, respiratory or neoplastic diseases underlie chest pain. In this context, a large number of laboratory tests are run (42%) and over 16% of patients are referred to a specialist or hospitalized (2). A cardiovascular origin of chest pain can threaten the patient's life, and the investigations run to exclude a serious condition can be expensive and involve a large number of exams or referral to a specialist, often without real clinical need. In emergency settings, up to 80% of chest pain presentations are due to cardiovascular events (3) and scoring methods have been developed to identify conditions such as coronary heart disease (CHD) quickly and efficiently (4-6). In primary care, a cardiovascular origin is present in only about 12% of patients with chest pain (2) and general practitioners (GPs) need to exclude as safely as possible a potentially serious condition underlying chest pain. A simple clinical prediction rule (CPR), like those available in emergency settings, may therefore help GPs and spare time and extra investigations in ruling out CHD in primary care patients. Such a tool may also help GPs reassure patients whose chest pain has a more common origin.

Relevance: 20.00%

Abstract:

The paper deals with a bilateral accident situation in which victims have heterogeneous costs of care. With perfect information, efficient care by the injurer rises with the victim's cost. When the injurer cannot observe the victim's type at all, and this fact can be verified by Courts, first-best cannot be implemented with the use of a negligence rule based on the first-best levels of care. Second-best leads the injurer to intermediate care, and the two types of victims to choose the best response to it. This second-best solution can be easily implemented by a negligence rule with second-best as due care. We explore imperfect observation of the victim's type, characterizing the optimal solution and examining the different legal alternatives when Courts cannot verify the injurers' statements. Counterintuitively, we show that there is no difference at all between the use by Courts of a rule of complete trust and a rule of complete distrust towards the injurers' statements. We then relate the findings of the model to existing rules and doctrines in Common Law and Civil Law legal systems.

Relevance: 20.00%

Abstract:

The Attorney General’s Consumer Protection Division receives hundreds of calls and consumer complaints every year. Follow these tips to avoid unexpected expenses and disappointment. This record is about: Price-Gouging Rule in Effect in Storm- and Flood-damaged Counties

Relevance: 20.00%

Abstract:

Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
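
As a concrete baseline, Holm's (1979) stepdown procedure mentioned above can be sketched directly; this is only the classical method the paper seeks to improve upon, not the resampling-based construction it proposes.

```python
def holm_stepdown(p_values, alpha=0.05):
    """Holm's (1979) stepdown procedure controlling the familywise error rate.

    Sort p-values in increasing order, compare the i-th smallest with
    alpha / (k - i + 1), stop at the first non-rejection, and reject all
    hypotheses examined before it. Returns a boolean rejection list in the
    original order of the hypotheses.
    """
    k = len(p_values)
    order = sorted(range(k), key=lambda i: p_values[i])
    reject = [False] * k
    for step, idx in enumerate(order):            # step = 0, 1, ..., k-1
        if p_values[idx] <= alpha / (k - step):   # alpha/(k - i + 1) with i = step + 1
            reject[idx] = True
        else:
            break                                 # stepdown: stop at first failure
    return reject

print(holm_stepdown([0.001, 0.04, 0.03, 0.20]))   # -> [True, False, False, False]
```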

Relevance: 20.00%

Abstract:

We study the standard economic model of unilateral accidents, in its simplest form, assuming that the injurers have limited assets. We identify a second-best optimal rule that selects as due care the minimum of first-best care and a level of care that takes into account the wealth of the injurer. We show that such a rule in fact maximizes the precautionary effort by a potential injurer. The idea is counterintuitive: being softer on an injurer, in terms of the required level of care, actually improves the incentives to take care when he is potentially insolvent. We extend the basic result to an entire population of potentially insolvent injurers, and find that the optimal general standards of care do depend on wealth and the distribution of income. We also show the conditions for the result that higher income levels in a given society call for higher levels of care for accidents.

Relevance: 20.00%

Abstract:

We study how to promote compliance with rules in everyday situations. Having access to unique data on the universe of users of all public libraries in Barcelona, we test the effect of sending email messages with different contents. We find that users return their items earlier if asked to do so in a simple email. Emails reminding users of the penalties associated with late returns are more effective than emails with only a generic reminder. We find differential treatment effects by user types. The characteristics we analyze are previous compliance, gender, age, and nationality.

Relevance: 20.00%

Abstract:

This paper considers a general and informationally efficient approach to determine the optimal access pricing rule for interconnected networks. It shows that there exists a simple rule that achieves the Ramsey outcome as the unique equilibrium when networks compete in linear prices without network-based price discrimination. The approach is informationally efficient in the sense that the regulator is required to know only the marginal cost structure, i.e. the marginal cost of making and terminating a call. The approach is general in that access prices can depend not only on the marginal costs but also on the retail prices, which can be observed by consumers and therefore by the regulator as well. In particular, I consider the set of linear access pricing rules which includes any fixed access price, the Efficient Component Pricing Rule (ECPR) and the Modified ECPR as special cases. I show that in this set, there is a unique rule that implements the Ramsey outcome as the unique equilibrium independently of the underlying demand conditions.
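
For orientation, the Efficient Component Pricing Rule (ECPR) referred to above has a standard textbook form: access charge equals the cost of terminating the call plus the opportunity cost of the lost retail margin. The sketch below illustrates only that baseline form, not the paper's modified ECPR or its Ramsey-implementing rule, and all numbers are made up.

```python
def ecpr_access_price(retail_price, marginal_cost_origination, marginal_cost_termination):
    """Textbook ECPR: termination cost plus the opportunity cost of the lost retail sale
    (retail price minus the network's full marginal cost of making and terminating a call)."""
    full_marginal_cost = marginal_cost_origination + marginal_cost_termination
    opportunity_cost = retail_price - full_marginal_cost
    return marginal_cost_termination + opportunity_cost

# Illustrative numbers per call minute: retail price 0.10, origination cost 0.02,
# termination cost 0.03, giving an ECPR access price of 0.08.
print(ecpr_access_price(0.10, 0.02, 0.03))  # -> 0.08
```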

Relevance: 20.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
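
The final selection step described above is a penalized empirical-risk minimization over one candidate per class. The sketch below is only schematic: the generic penalty sqrt(log(cover size)/n) stands in for the paper's tight, empirically determined bound on estimation error, and all names and numbers are hypothetical.

```python
import math

def select_model(candidates, X2, Y2, cover_sizes, n, loss):
    """Pick the candidate rule minimizing empirical risk plus a complexity penalty.

    `candidates` holds one representative rule per model class (chosen on the
    first half of the data), `cover_sizes` the size of each class's empirical
    cover, and the risk is evaluated on the held-out second half (X2, Y2).
    """
    best, best_score = None, float("inf")
    for f, cover_size in zip(candidates, cover_sizes):
        emp_risk = sum(loss(f(x), y) for x, y in zip(X2, Y2)) / len(X2)
        penalty = math.sqrt(math.log(max(cover_size, 2)) / n)   # generic stand-in penalty
        score = emp_risk + penalty
        if score < best_score:
            best, best_score = f, score
    return best

# Toy usage with two constant predictors and squared loss (all values hypothetical).
candidates = [lambda x: 0.0, lambda x: 1.0]
X2, Y2 = [0.1, 0.2, 0.3], [0.9, 1.1, 1.0]
best = select_model(candidates, X2, Y2, cover_sizes=[4, 16], n=6,
                    loss=lambda yhat, y: (yhat - y) ** 2)
print(best(0.0))  # -> 1.0, the second candidate wins here
```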

Relevance: 20.00%

Abstract:

INTRODUCTION: A clinical decision rule to improve the accuracy of a diagnosis of influenza could help clinicians avoid unnecessary use of diagnostic tests and treatments. Our objective was to develop and validate a simple clinical decision rule for diagnosis of influenza. METHODS: We combined data from 2 studies of influenza diagnosis in adult outpatients with suspected influenza: one set in California and one in Switzerland. Patients in both studies underwent a structured history and physical examination and had a reference standard test for influenza (polymerase chain reaction or culture). We randomly divided the dataset into derivation and validation groups and then evaluated simple heuristics and decision rules from previous studies and 3 rules based on our own multivariate analysis. Cutpoints for stratification of risk groups in each model were determined using the derivation group before evaluating them in the validation group. For each decision rule, the positive predictive value and likelihood ratio for influenza in low-, moderate-, and high-risk groups, and the percentage of patients allocated to each risk group, were reported. RESULTS: The simple heuristics (fever and cough; fever, cough, and acute onset) were helpful when positive but not when negative. The most useful and accurate clinical rule assigned 2 points for fever plus cough, 2 points for myalgias, and 1 point each for duration <48 hours and chills or sweats. The risk of influenza was 8% for 0 to 2 points, 30% for 3 points, and 59% for 4 to 6 points; the rule performed similarly in derivation and validation groups. Approximately two-thirds of patients fell into the low- or high-risk group and would not require further diagnostic testing. CONCLUSION: A simple, valid clinical rule can be used to guide point-of-care testing and empiric therapy for patients with suspected influenza.
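
The scoring rule reported in the abstract is simple enough to express directly. The following minimal sketch reproduces the published point assignments and risk groups for illustration only; it is not validated clinical software.

```python
def influenza_score(fever_and_cough, myalgias, onset_under_48h, chills_or_sweats):
    """Point score from the abstract: 2 points for fever plus cough, 2 for myalgias,
    1 for symptom duration < 48 hours, 1 for chills or sweats (total 0-6)."""
    return 2 * fever_and_cough + 2 * myalgias + onset_under_48h + chills_or_sweats

def risk_group(score):
    """Map the score to the reported risk of influenza:
    8% for 0-2 points, 30% for 3 points, 59% for 4-6 points."""
    if score <= 2:
        return "low (8%)"
    if score == 3:
        return "moderate (30%)"
    return "high (59%)"

print(risk_group(influenza_score(True, True, False, True)))  # 5 points -> high (59%)
```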

Relevance: 20.00%

Abstract:

Simple linear correlations were studied between rainfall and runoff erosivity parameters and the soil losses caused by erosive rainfall on a Vertic Non-Calcic Brown soil. The data, corresponding to the years 1986-1990, were obtained at the Sumé (PB) experimental station, belonging to the Universidade Federal da Paraíba (UFPB). The rainfall and runoff erosivity parameters studied were: (a) total rainfall depth (P), in mm; (b) maximum intensities (In) over periods of 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60 and 120 minutes, in mm h-1; (c) total kinetic energy, by the Wischmeier & Smith method (Ec) and by the Wagner & Massambani method (EcW), in MJ ha-1; (d) sum of the kinetic energy of intensities greater than 10 mm h-1 (Ec > 10 and EcW > 10), in MJ ha-1; (e) sum of the kinetic energy of intensities greater than 25 mm h-1 (Ec > 25 and EcW > 25), in MJ ha-1; (f) products of the total kinetic energy and the maximum rainfall intensities over increasing time intervals (EIn), namely EI5, EI10, EI15, EI20, EI25, EI30, EIW30, EI35, EI40, EI45, EI50, EI55, EI60 and EI120, in MJ mm ha-1 h-1; (g) products of the total rainfall depth and the maximum rainfall intensities over increasing time intervals (PIn), namely PI5, PI10, PI15, PI20, PI25, PI30, PI35, PI40, PI45, PI50, PI55, PI60 and PI120, in mm² h-1; and (h) runoff volume (Vu), in m³. The runoff volume (Vu) was the parameter that best estimated soil losses at Sumé (PB) (r = 0.812). Among the rainfall erosivity parameters, the one best correlated with soil losses was PI25 (r = 0.753). The Wischmeier & Smith and Wagner & Massambani equations, used to compute the total kinetic energy of the rainfall, showed the same degree of precision in estimating soil losses.
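
The analysis above rests on simple (Pearson) linear correlations between each erosivity parameter and the observed soil loss. The sketch below shows that computation generically; the per-storm values are hypothetical, not data from the Sumé station.

```python
import math

def pearson_r(x, y):
    """Pearson linear correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-storm values: an erosivity parameter (e.g. PI25, in mm^2 h-1)
# against the measured soil loss (t ha-1) for the same storms.
pi25 = [120.0, 340.5, 80.2, 510.9, 210.3]
soil_loss = [0.4, 1.6, 0.2, 2.9, 0.9]
print(round(pearson_r(pi25, soil_loss), 3))
```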

Relevance: 20.00%

Abstract:

The effects of clearing the caatinga vegetation on the soil and water losses caused by erosive rainfall on a Luvisol were studied. The data, for the years 1983-1990, were obtained at the Sumé (PB) Experimental Station, belonging to the Universidade Federal da Paraíba (UFPB). The treatments consisted of two cleared plots, one plot with native caatinga, one plot with regrowing caatinga, two macroplots with native caatinga and two cleared macroplots. On the cleared plots, soil losses were 61.7 and 47.7 t ha-1 and water losses were 224.2 and 241.0 mm. Compared with the cleared plot, the plot with native caatinga reduced soil loss by about 98% and water loss by about 73%. On the cleared macroplots, annual soil losses of 31 and 26 t ha-1 and water losses of 151.3 and 131.5 mm were observed. On the macroplots with caatinga, soil losses were reduced by approximately 99% and water losses by 90% relative to the cleared macroplots.