999 results for Pseudo-Bayesian Design


Relevance: 20.00%

Publisher:

Abstract:

According to the 1972 Clean Water Act, the Environmental Protection Agency (EPA) established a set of regulations for the National Pollutant Discharge Elimination System (NPDES). The purpose of these regulations is to reduce pollution of the nation’s waterways. In addition to other pollutants, the NPDES regulates stormwater discharges associated with industrial activities, municipal storm sewer systems, and construction sites. Phase II of the NPDES stormwater regulations, which went into effect in Iowa in 2003, applies to construction activities that disturb more than one acre of ground. The regulations also require certain communities with Municipal Separate Storm Sewer Systems (MS4) to perform education, inspection, and regulation activities to reduce stormwater pollution within their communities. Iowa does not currently have a resource to provide guidance on the stormwater regulations to contractors, designers, engineers, and municipal staff. The Statewide Urban Design and Specifications (SUDAS) manuals are widely accepted as the statewide standard for public improvements. The SUDAS Design manual currently contains a brief chapter (Chapter 7) on erosion and sediment control; however, it is outdated, and Phase II of the NPDES stormwater regulations is not discussed. In response to the need for guidance, this chapter was completely rewritten. It now describes the need for erosion and sediment control and explains the NPDES stormwater regulations. It provides information for the development and completion of Stormwater Pollution Prevention Plans (SWPPPs) that comply with the stormwater regulations, as well as the proper design and implementation of 28 different erosion and sediment control practices. In addition to the design chapter, this project also updated a section in the SUDAS Specifications manual (Section 9040), which describes the proper materials and methods of construction for the erosion and sediment control practices.


In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one that is defined in expectation rather than an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient even when it is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important in order for the regulator to induce the optimal alignment between pollution level and abatement costs.
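The mechanism described above can be illustrated with a deliberately stylized two-firm sketch (this is not the paper's model; all functional forms and numbers are hypothetical). Firm 2's delivery coefficient `d` differs from the trading ratio `t`, so the equilibrium split of emissions, and hence delivered pollution, varies with firm 2's abatement-cost state even though the permit cap binds; the regulator then picks the cap so that pollution hits the target only in expectation.

```python
import numpy as np

# Stylized two-firm permit market (hypothetical functional forms, not the
# paper's model). Firm i has baseline emissions u[i] and linear marginal
# abatement cost c[i]*(u[i] - e[i]); firm 2's cost slope is random. Firm 1
# surrenders 1 permit per unit emitted, firm 2 surrenders t permits (the
# trading ratio); delivered pollution is e1 + d*e2 with d the (known)
# delivery coefficient. Note t != d, per the paper's point (3).

u = np.array([10.0, 10.0])
d = 0.5          # delivery coefficient for firm 2
t = 0.8          # trading ratio chosen by the regulator

def delivered_pollution(cap, c2, c1=1.0):
    """Competitive-equilibrium delivered pollution for a given permit cap."""
    c = np.array([c1, c2])
    w = np.array([1.0, t])                   # permits per unit emission
    # cost minimization: c[i]*(u[i]-e[i]) = p*w[i]; clearing: w @ e = cap
    p = (w @ u - cap) / (w**2 / c).sum()     # permit price
    e = u - p * w / c
    return e[0] + d * e[1]

target = 6.0
states, probs = [0.5, 2.0], [0.5, 0.5]       # low / high cost state for firm 2

# Expected delivered pollution is linear in the cap, so two evaluations
# pin down the cap that hits the ex ante target exactly.
f = lambda cap: sum(pr * delivered_pollution(cap, c2)
                    for pr, c2 in zip(probs, states))
cap = (target - f(0.0)) / (f(1.0) - f(0.0))

low = delivered_pollution(cap, states[0])
high = delivered_pollution(cap, states[1])
print(f"cap={cap:.3f}  pollution(low cost)={low:.3f}  pollution(high cost)={high:.3f}")
```

In this sketch the realized pollution differs across cost states while its expectation equals the target, and the cap that achieves this is not numerically equal to the target, echoing the paper's finding (2).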


Optimum experimental designs depend on the design criterion, the model, and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass, such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I hope to discuss a variety of models for such experiments. Then I will discuss constrained mixture experiments, when not all of the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.

References

Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.

Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225–239. Dordrecht: Kluwer.

Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344–360.
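A minimal sketch of the kind of symmetry the abstract mentions: for a second-order Scheffé polynomial in three mixture components, the classical {3,2} simplex-lattice design (vertices plus edge midpoints) scores better on the D-criterion than an arbitrary six-point design. The competing point set below is hypothetical, generated at random inside the simplex.

```python
import numpy as np

# D-criterion comparison for a quadratic Scheffe model in three mixture
# components x1 + x2 + x3 = 1. Model terms (no intercept, per Scheffe):
# x1, x2, x3, x1*x2, x1*x3, x2*x3.

def model_matrix(points):
    return np.array([[x1, x2, x3, x1 * x2, x1 * x3, x2 * x3]
                     for x1, x2, x3 in points])

def d_criterion(points):
    X = model_matrix(points)
    return np.linalg.det(X.T @ X)   # larger is better (D-optimality)

# {3,2} simplex-lattice design: three vertices plus three edge midpoints,
# the classical design for the quadratic Scheffe model.
lattice = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
           (.5, .5, 0), (.5, 0, .5), (0, .5, .5)]

# A competing six-point design: random interior points of the simplex.
rng = np.random.default_rng(0)
random_design = [tuple(p) for p in rng.dirichlet(np.ones(3), size=6)]

print("lattice D-criterion:", d_criterion(lattice))
print("random D-criterion: ", d_criterion(random_design))
```

With six points and six model terms, the matrix is square, so the criterion reduces to det(X)², which the symmetric lattice maximizes.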


Asphalt pavement recycling has grown dramatically over the last few years as a viable technology to rehabilitate existing asphalt pavements. Iowa's current Cold In-place Recycling (CIR) practice utilizes a generic recipe specification to define the characteristics of the CIR mixture. As CIR continues to evolve, the desire to place CIR mixture with specific engineering properties requires the use of a mix design process. A new mix design procedure was developed for Cold In-place Recycling using foamed asphalt (CIR-foam) in consideration of its predicted field performance. The new laboratory mix design process was validated against various Reclaimed Asphalt Pavement (RAP) materials to determine its consistency over a wide range of RAP materials available throughout Iowa. The performance tests, which include the dynamic modulus, dynamic creep, and raveling tests, were conducted to evaluate the consistency of the new CIR-foam mix design process to ensure reliable mixture performance over a wide range of traffic and climatic conditions. The “lab designed” CIR will allow the pavement designer to take the properties of the CIR into account when determining the overlay thickness.


Granular shoulders are an important element of the transportation system and are constantly subjected to performance problems due to wind- and water-induced erosion, rutting, edge drop-off, and slope irregularities. Such problems can directly affect drivers’ safety and often require regular maintenance. The present research study was undertaken to investigate the factors contributing to these performance problems and to propose new ideas to design and maintain granular shoulders while keeping ownership costs low. This report includes observations made during a field reconnaissance study, findings from an effort to stabilize the granular and subgrade layer at six shoulder test sections, and the results of a laboratory box study where a shoulder section overlying a soft foundation layer was simulated. Based on the research described in this report, the following changes are proposed to the construction and maintenance methods for granular shoulders:

• A minimum CBR value for the granular and subgrade layer should be selected to alleviate edge drop-off and rutting formation.

• For those constructing new shoulder sections, the design charts provided in this report can be used as a rapid guide based on an allowable rut depth. The charts can also be used to predict the behavior of existing shoulders.

• In the case of existing shoulder sections overlying soft foundations, the use of geogrid or fly ash stabilization proved to be an effective technique for mitigating shoulder rutting.


The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions, and in this way, produce a global vision of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made on the unknown parameters in this problem.
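One common wording of the scenario can be computed directly; the sketch below works through a single formulation under explicit assumptions (two offenders each leaving one stain, one stain matching the suspect, an illustrative profile frequency gamma), not the paper's full analysis.

```python
# One formulation of the two-trace problem (a sketch, not the paper's
# analysis): two offenders each left one stain; stain A matches the
# suspect's profile, stain B does not; gamma is the frequency of the
# matching profile in the relevant population (value illustrative).
gamma = 0.05

# Hp: the suspect is one of the two offenders, equally likely to have left
# either stain; the other offender is a random member of the population.
# Hd: both offenders are random members of the population.
# If the suspect left stain B, B would carry his profile, contradicting
# the evidence, hence the zero term.
p_e_hp = 0.5 * 1.0 * (1 - gamma) + 0.5 * 0.0
p_e_hd = gamma * (1 - gamma)

lr = p_e_hp / p_e_hd
print(f"likelihood ratio = {lr:.1f}")   # equals 1/(2*gamma) = 10.0
```

Under this wording the value of the evidence is 1/(2·gamma); other ways of phrasing the competing propositions lead to different quantities, which is precisely the divergence the paper sets out to map.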


In many areas of economics there is a growing interest in how expertise and preferences drive individual and group decision making under uncertainty. Increasingly, we wish to estimate such models to quantify which of these drive decision making. In this paper we propose a new channel through which we can empirically identify expertise and preference parameters by using variation in decisions over heterogeneous priors. Relative to existing estimation approaches, our "Prior-Based Identification" extends the possible environments which can be estimated, and also substantially improves the accuracy and precision of estimates in those environments which can be estimated using existing methods.


BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory, and multimodal brain imaging data of acute ischemic stroke patients in the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we provide the design and methods used to create ASTRAL and present baseline data of our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours, or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis, were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even in the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institute of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic stroke and dissections presented with the most severe clinical picture. There was a significant number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intraarterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. In the 1358 patients (78%) who underwent acute computed tomography angiography, 717 patients (52.8%) had significant abnormalities. Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), focal hypoperfusion was demonstrated in 786 patients (73.6%). CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging data. It is characterized by a high proportion of minor and unknown-onset strokes, short onset-to-admission time for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.


Cost systems have been shown to have developed considerably in recent years, and activity-based costing (ABC) has been shown to be a contribution to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported of the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils, as they are obliged to calculate the cost of the services subject to taxation (e.g., waste collection) and are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish public sector. Through semi-structured interviews with 28 subjects within one city council, it contains a case study of cost management. The paper contains extracts from the interviews, and a number of factors are identified which contribute to the successful development of the cost management system. Following the case study, a number of other city councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding, and political incentives as factors which had an influence on system success or failure.
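The cost calculation the abstract alludes to can be sketched in miniature: overhead is pooled by activity and then traced to services through activity drivers. All cost pools, driver quantities, and service names below are hypothetical, chosen only to mirror the waste-collection example.

```python
# Minimal activity-based costing sketch (all figures hypothetical).
# Overhead is assigned to activity cost pools, then traced to services
# via activity-driver consumption, as a city council might do for
# services subject to taxation.

cost_pools = {"vehicle fleet": 400_000, "personnel admin": 150_000}

# Units of each activity's driver consumed by each service.
drivers = {
    "waste collection": {"vehicle fleet": 70, "personnel admin": 40},
    "street cleaning":  {"vehicle fleet": 30, "personnel admin": 60},
}

# Cost per driver unit for each activity.
totals = {a: sum(s[a] for s in drivers.values()) for a in cost_pools}
rates = {a: cost_pools[a] / totals[a] for a in cost_pools}

# Trace pooled costs to services.
service_cost = {
    svc: sum(rates[a] * use for a, use in consumption.items())
    for svc, consumption in drivers.items()
}
print(service_cost)
```

A useful sanity check on any such system is conservation: the costs traced to services must add back up to the original pools, which the numbers above satisfy.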


The interpretation of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is based on a 4-factor model, which is only partially compatible with the mainstream Cattell-Horn-Carroll (CHC) model of intelligence measurement. The structure of cognitive batteries is frequently analyzed via exploratory factor analysis and/or confirmatory factor analysis. With classical confirmatory factor analysis, almost all cross-loadings between latent variables and measures are fixed to zero in order to allow the model to be identified. However, inappropriate zero cross-loadings can contribute to poor model fit, distorted factors, and biased factor correlations; most important, they do not necessarily faithfully reflect theory. To deal with these methodological and theoretical limitations, we used a new statistical approach, Bayesian structural equation modeling (BSEM), among a sample of 249 French-speaking Swiss children (8-12 years). With BSEM, zero-fixed cross-loadings between latent variables and measures are replaced by approximate zeros, based on informative, small-variance priors. Results indicated that a direct hierarchical CHC-based model with 5 factors plus a general intelligence factor better represented the structure of the WISC-IV than did the 4-factor structure and the higher order models. Because a direct hierarchical CHC model was more adequate, it was concluded that the general factor should be considered as a breadth rather than a superordinate factor. Because it was possible for us to estimate the influence of each of the latent variables on the 15 subtest scores, BSEM improved both the understanding of the structure of intelligence tests and the clinical interpretation of the subtest scores.
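The "approximate zero" idea can be seen on a single coefficient. The toy below is not the authors' WISC-IV model: it is a one-parameter conjugate-normal illustration (simulated data, hypothetical loadings) of how a small-variance N(0, τ²) prior shrinks a cross-loading toward zero without fixing it at exactly zero, in contrast to the classical zero constraint.

```python
import numpy as np

# Toy illustration of a small-variance prior on one cross-loading
# (simulated data; not the authors' model). A subtest score y loads on
# its own factor x1 and, weakly, on a second factor x2.

rng = np.random.default_rng(1)
n = 100
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 0.8 * x1 + 0.15 * x2 + rng.standard_normal(n) * 0.5  # true cross-loading 0.15

def map_cross_loading(tau, sigma=0.5):
    """Posterior mean of the cross-loading under lambda ~ N(0, tau^2),
    with the main loading held at its true value for simplicity."""
    resid = y - 0.8 * x1
    return (x2 @ resid / sigma**2) / (x2 @ x2 / sigma**2 + 1 / tau**2)

ols = map_cross_loading(tau=1e6)    # effectively flat prior (no shrinkage)
bsem = map_cross_loading(tau=0.05)  # informative small-variance prior

print(f"flat prior: {ols:.3f}  small-variance prior: {bsem:.3f}  fixed-to-zero: 0")
```

The shrunken estimate sits strictly between the unconstrained estimate and zero; in a full BSEM this happens jointly for every cross-loading, which is why ill-fitting hard zeros stop distorting the factors.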


This dissertation presents a study of the role of graphic design in the visual identity projects of city tourism brands. The focus is on the coherence between the brand's graphic visuality and the socioeconomic and cultural positioning of the cities as tourism ventures. The study of city brand positioning was based on the book Competitive Identity (ANHOLT, 2007), as well as on the Anholt city branding index (2006) and the partial updates of that index (ANHOLT, 2009 and 2011). In addition, the graphic marks of 30 cities, together with the corresponding data on their positioning as tourism ventures, were collected from the cities' official websites. Taking these 30 cities, each with a tourism brand mark, as a basis, a visual classification of the marks was proposed around three main categories: conceptual categorization, kinetic-sensory categorization, and visual categorization. Based on this information and on the classification of the visuality of the graphic marks surveyed, a comparative study was carried out to establish the coherence between the visual communication of the graphic mark and the socioeconomic and cultural positioning of the tourist cities. In this light, the brands of the cities of São Paulo and Melbourne are highlighted as one national and one international example of applied graphic creativity and of coherence between the positioning of the tourism venture and the visual identity of the brand.


We present a novel numerical approach for the comprehensive, flexible, and accurate simulation of poro-elastic wave propagation in 2D polar coordinates. An important application of this method and its extensions will be the modeling of complex seismic wave phenomena in fluid-filled boreholes, which represents a major, and as of yet largely unresolved, computational problem in exploration geophysics. In view of this, we consider a numerical mesh, which can be arbitrarily heterogeneous, consisting of two or more concentric rings representing the fluid in the center and the surrounding porous medium. The spatial discretization is based on a Chebyshev expansion in the radial direction and a Fourier expansion in the azimuthal direction and a Runge-Kutta integration scheme for the time evolution. A domain decomposition method is used to match the fluid-solid boundary conditions based on the method of characteristics. This multi-domain approach allows for significant reductions of the number of grid points in the azimuthal direction for the inner grid domain and thus for corresponding increases of the time step and enhancements of computational efficiency. The viability and accuracy of the proposed method has been rigorously tested and verified through comparisons with analytical solutions as well as with the results obtained with a corresponding, previously published, and independently bench-marked solution for 2D Cartesian coordinates. Finally, the proposed numerical solution also satisfies the reciprocity theorem, which indicates that the inherent singularity associated with the origin of the polar coordinate system is adequately handled.
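The radial building block of such a scheme can be sketched compactly. The function below is the standard Chebyshev differentiation matrix on Gauss-Lobatto points (Trefethen's classic construction), the kind of operator a spectral collocation code would apply in the radial direction; azimuthal derivatives would instead use an FFT, and the check at the end is only a generic spectral-accuracy test, not the poro-elastic solver itself.

```python
import numpy as np

# Standard Chebyshev differentiation matrix on Gauss-Lobatto points,
# the usual workhorse for the radial direction in spectral collocation.

def cheb(N):
    """Return the (N+1)x(N+1) differentiation matrix D and points x."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)            # Gauss-Lobatto points
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))     # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                         # diagonal via row sums
    return D, x

# Spectral accuracy check: differentiating x**2 recovers 2x to rounding error.
D, x = cheb(8)
err = np.max(np.abs(D @ x**2 - 2 * x))
print("max derivative error:", err)
```

Because differentiation of low-degree polynomials is exact for this operator, the error is at the level of machine rounding; in the multi-domain setting described above, one such matrix acts per concentric ring, with the characteristics-based matching applied at the ring interfaces.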


This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
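The point about the correction factor can be checked by direct computation. The sketch below uses illustrative numbers (population size, database size, and match probability are all hypothetical) and a uniform prior over possible sources; it is one simple formalization of the scenario, not the paper's network model.

```python
# Database-search scenario by enumeration (illustrative numbers): a
# population of N possible sources, a database of n of them including the
# suspect; the search shows the suspect matches the crime stain and the
# other n-1 database members do not. gamma is the random-match probability.
N, n, gamma = 10_000, 1_000, 0.001

# Likelihood of the full search result under "person k is the source",
# with a uniform prior 1/N over the population:
lik_suspect = (1 - gamma) ** (n - 1)            # matches for sure; others chance non-match
lik_db_other = 0.0                              # that member was observed not to match
lik_outsider = gamma * (1 - gamma) ** (n - 1)   # suspect matches by chance

post_suspect = lik_suspect / (lik_suspect + (N - n) * lik_outsider)

# Likelihood ratio for the suspect against any single remaining
# alternative source: the (1-gamma) factors cancel, leaving 1/gamma,
# with no division by the database size n.
lr = lik_suspect / lik_outsider

# For comparison, the posterior from a probative match alone (no search,
# so N-1 alternative sources and none excluded):
post_no_search = 1.0 / (1.0 + (N - 1) * gamma)
print(lr, post_suspect, post_no_search)
```

The computation reproduces both conclusions drawn above: the likelihood ratio stays at 1/gamma rather than being reduced by a factor of n, and the exclusion of the other database members makes the posterior for the suspect slightly larger than in the no-search case, not smaller.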