983 results for design-build
Abstract:
According to the 1972 Clean Water Act, the Environmental Protection Agency (EPA) established a set of regulations for the National Pollutant Discharge Elimination System (NPDES). The purpose of these regulations is to reduce pollution of the nation's waterways. In addition to other pollutants, the NPDES regulates stormwater discharges associated with industrial activities, municipal storm sewer systems, and construction sites. Phase II of the NPDES stormwater regulations, which went into effect in Iowa in 2003, applies to construction activities that disturb more than one acre of ground. The regulations also require certain communities with Municipal Separate Storm Sewer Systems (MS4) to perform education, inspection, and regulation activities to reduce stormwater pollution within their communities. Iowa does not currently have a resource to provide guidance on the stormwater regulations to contractors, designers, engineers, and municipal staff. The Statewide Urban Design and Specifications (SUDAS) manuals are widely accepted as the statewide standard for public improvements. The SUDAS Design manual currently contains a brief chapter (Chapter 7) on erosion and sediment control; however, it is outdated, and Phase II of the NPDES stormwater regulations is not discussed. In response to the need for guidance, this chapter was completely rewritten. It now describes the need for erosion and sediment control and explains the NPDES stormwater regulations. It provides information for the development and completion of Stormwater Pollution Prevention Plans (SWPPPs) that comply with the stormwater regulations, as well as the proper design and implementation of 28 different erosion and sediment control practices. In addition to the design chapter, this project also updated a section in the SUDAS Specifications manual (Section 9040), which describes the proper materials and methods of construction for the erosion and sediment control practices.
Abstract:
In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one that is defined in expectation rather than an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient even when it is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important in order for the regulator to induce the optimal alignment between pollution level and abatement costs.
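The regulator's problem described above can be sketched in symbols (the notation here is illustrative, not taken from the paper): let $e_i$ be firm $i$'s emissions, $C_i(e_i)$ its abatement cost, $d_i$ its delivery coefficient, $t_i$ the trading ratio applied to it, $L$ the permit cap, and $\bar{P}$ the ex ante pollution target.

```latex
% Minimize expected abatement cost subject to (a) an *expected* pollution
% target and (b) the permit-market constraint set by cap L and ratios t_i:
\min_{L,\,t}\ \mathbb{E}\!\left[\textstyle\sum_i C_i(e_i)\right]
\quad \text{s.t.} \quad
\mathbb{E}\!\left[\textstyle\sum_i d_i\, e_i\right] \le \bar{P},
\qquad
\textstyle\sum_i t_i\, e_i \le L .
```

Because the target binds only in expectation, the paper's results (2) and (3) correspond to the optimal $L$ not necessarily equaling $\bar{P}$, and the optimal $t_i$ not necessarily equaling $d_i$ even when $d_i$ is known.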
Abstract:
Optimum experimental designs depend on the design criterion, the model, and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass, such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I will hope to discuss a variety of models for such experiments. Then I will discuss constrained mixture experiments, when not all the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I would hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.
References
Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.
Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225–239. Dordrecht: Kluwer.
Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344–360.
Validation of the New Mix Design Process for Cold In-Place Rehabilitation Using Foamed Asphalt, 2007
Abstract:
Asphalt pavement recycling has grown dramatically over the last few years as a viable technology to rehabilitate existing asphalt pavements. Iowa's current Cold In-place Recycling (CIR) practice utilizes a generic recipe specification to define the characteristics of the CIR mixture. As CIR continues to evolve, the desire to place CIR mixtures with specific engineering properties requires the use of a mix design process. A new mix design procedure was developed for Cold In-place Recycling using foamed asphalt (CIR-foam) in consideration of its predicted field performance. The new laboratory mix design process was validated against various Reclaimed Asphalt Pavement (RAP) materials to determine its consistency over the wide range of RAP materials available throughout Iowa. Performance tests, which included dynamic modulus, dynamic creep, and raveling tests, were conducted to evaluate the consistency of the new CIR-foam mix design process and to ensure reliable mixture performance over a wide range of traffic and climatic conditions. The "lab designed" CIR will allow the pavement designer to take the properties of the CIR into account when determining the overlay thickness.
Abstract:
Granular shoulders are an important element of the transportation system and are constantly subjected to performance problems due to wind- and water-induced erosion, rutting, edge drop-off, and slope irregularities. Such problems can directly affect drivers' safety and often require regular maintenance. The present research study was undertaken to investigate the factors contributing to these performance problems and to propose new ideas to design and maintain granular shoulders while keeping ownership costs low. This report includes observations made during a field reconnaissance study, findings from an effort to stabilize the granular and subgrade layer at six shoulder test sections, and the results of a laboratory box study where a shoulder section overlying a soft foundation layer was simulated. Based on the research described in this report, the following changes are proposed to the construction and maintenance methods for granular shoulders:
• A minimum CBR value for the granular and subgrade layer should be selected to alleviate edge drop-off and rutting formation.
• For those constructing new shoulder sections, the design charts provided in this report can be used as a rapid guide based on an allowable rut depth. The charts can also be used to predict the behavior of existing shoulders.
• In the case of existing shoulder sections overlying soft foundations, the use of geogrid or fly ash stabilization proved to be an effective technique for mitigating shoulder rutting.
Abstract:
Despite many successful projects, some public agencies and contractors have been hesitant to use concrete overlays. This lack of confidence has been based on a number of factors, including the misperception that concrete overlays are expensive or difficult to build. This guide will help readers understand concrete overlays and develop confidence in their application. The guide provides the key elements of the six major types of concrete overlays along with specifics on materials, typical sections, and important construction elements.
Abstract:
BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory, and multimodal brain imaging data of acute ischemic stroke patients in the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we provide the design and methods used to create ASTRAL and present baseline data of our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours, or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis, were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even in the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institute of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic stroke and dissections presented with the most severe clinical picture. There was a significant number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intra-arterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. In 1358 patients (78%) who underwent acute computed tomography angiography, 717 patients (52.8%) had significant abnormalities.
Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), focal hypoperfusion was demonstrated in 786 patients (73.6%). CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging. It is characterized by a high proportion of minor and unknown-onset strokes, short onset-to-admission time for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.
Abstract:
Juvenile or adult fish can alter their behaviour and rely on an innate and adaptive immune system to avoid or counteract pathogens, while fish embryos have to depend on egg characteristics and may be only partly protected by a developing immune system that builds up from a certain age on. We developed an infection protocol that allows testing the reaction of individual whitefish embryos (Coregonus palaea) to repeated exposures to Pseudomonas fluorescens, an opportunistic bacterial fish pathogen. We used a full-factorial in vitro breeding design to separately test the effects of paternal and maternal contributions to the embryos' susceptibility to different kinds of pathogen exposure. We found that a first non-lethal exposure had immunosuppressive effects: pre-exposed embryos were more susceptible to later challenges with the same pathogen. At intermediate and high levels of pathogen intensity, maternal effects turned out to be crucial for the embryos' tolerance of infection. Paternal (i.e. genetic) effects played a significant role at the strongest level of infection, i.e. the embryos' own genetics already explained some of the variation in embryo susceptibility. Our findings suggest that whitefish embryos are largely protected by maternally transmitted substances but build up some innate immunocompetence of their own several days before hatching.
Abstract:
Today, information technology is strategically important to the goals and aspirations of business enterprises, governments, and institutions of higher education such as universities. Universities are facing new challenges in the emerging global economy, which is characterized by the importance of providing faster communication services and improving the productivity and effectiveness of individuals. Among these challenges is providing an information network that supports the demands and diversification of university activities. A new network architecture, which is a set of design principles for building a network, is one of the pillar bases. It is the cornerstone that enables the university's faculty, researchers, students, administrators, and staff to discover, learn, reach out, and serve society. This thesis focuses on network architecture definitions and fundamental components. The three most important characteristics of a high-quality architecture are that it is an open network architecture, it has service-oriented characteristics, and it is an IP network based on packets. There are four important components in the architecture: Services and Network Management, Network Control, Core Switching, and Edge Access. The theoretical contribution of this study is a reference model for the architecture of a university campus network that can be followed or adapted to build a robust yet flexible network that responds to next-generation requirements. The results are relevant as a complete reference guide to the process of building a campus network, which nowadays plays a very important role. Accordingly, the research gives university networks a structured, modular model that is reliable, robust, and can easily grow.
Abstract:
Cost systems have been shown to have developed considerably in recent years, and activity-based costing (ABC) has been shown to be a contribution to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported on the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils, as they are obliged to calculate the cost of the services subject to taxation (e.g., waste collection). City councils must have a cost system in place to calculate the cost of services, as they are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish public sector. Through semi-structured interviews with 28 subjects within one city council, it presents a case study of cost management. The paper contains extracts from the interviews, and a number of factors are identified which contribute to the successful development of the cost management system. Following the case study, a number of other city councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding, and political incentives as factors which had an influence on system success or failure.
Abstract:
This dissertation presents a study of the role of graphic design in the visual identity projects of city tourism brands. The focus is on the coherence of the brand's graphic visuality with the socioeconomic and cultural positioning of cities as tourism enterprises. The study of city brand positioning was based on the book Competitive Identity (ANHOLT, 2007), on the Anholt city branding index (2006), and on partial updates of that index (ANHOLT, 2009 and 2011). In addition, the graphic brands of 30 cities, together with data on their positioning as tourism enterprises, were collected from the cities' official websites. Taking these 30 cities with a tourism graphic brand as a basis, a visual classification was proposed around three main categories: conceptual categorization, kinetic-sensory categorization, and visual categorization. Based on this information and on the classification of the visuality of the graphic brands studied, a comparative study was carried out to establish coherences between the visual communication of the graphic brand and the socioeconomic and cultural positioning of tourist cities. On this basis, the brands of the cities of São Paulo and Melbourne are highlighted as national and international examples of applied graphic creativity and of coherence between the positioning of the tourism enterprise and the brand's visual identity.
Abstract:
Firms compete by choosing both a price and a design from a family of designs that can be represented as demand rotations. Consumers engage in costly sequential search among firms. Each time a consumer pays a search cost he observes a new offering. An offering consists of a price quote and a new good, where goods might vary in the extent to which they are good matches for the consumer. In equilibrium, only two design-styles arise: either the most niche, where consumers are likely to either love or loathe the product, or the broadest, where consumers are likely to have similar valuations. In equilibrium, different firms may simultaneously offer both design-styles. We perform comparative statics on the equilibrium and show that a fall in search costs can lead to higher industry prices and profits and lower consumer surplus. Our analysis is related to discussions of how the internet has led to the prevalence of niche goods and the "long tail" phenomenon.
Abstract:
We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega(n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
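The quantity being bounded can be restated compactly (the notation below is assumed for illustration, not quoted from the paper): with $D(P,Q)$ the mean squared distortion of quantizer $Q$ under source distribution $P$, and $D^{*}(P)$ the optimal distortion achievable by a quantizer of the same size, the result is a two-sided bound on the minimax expected redundancy.

```latex
% Expected distortion redundancy of an empirically designed quantizer Q_n,
% trained on n i.i.d. points from P supported on a bounded subset of R^d:
J(n) \;=\; \inf_{Q_n}\,\sup_{P}\ \Big(\mathbb{E}\,D(P,Q_n) \;-\; D^{*}(P)\Big),
\qquad
c_1\, n^{-1/2} \;\le\; J(n) \;\le\; c_2\, n^{-1/2}
\quad \text{for some constants } c_1, c_2 > 0 .
```

The infimum runs over all design algorithms mapping $n$ training points to a quantizer, and the supremum over all admissible source distributions; the matching $n^{-1/2}$ rates on both sides are what make the minimax redundancy "asymptotically on the order of $n^{-1/2}$."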