975 results for Design for Repair, DfR, Design for X, sustainability, wind turbine, gearbox, repair
Abstract:
According to the 1972 Clean Water Act, the Environmental Protection Agency (EPA) established a set of regulations for the National Pollutant Discharge Elimination System (NPDES). The purpose of these regulations is to reduce pollution of the nation’s waterways. In addition to other pollutants, the NPDES regulates stormwater discharges associated with industrial activities, municipal storm sewer systems, and construction sites. Phase II of the NPDES stormwater regulations, which went into effect in Iowa in 2003, applies to construction activities that disturb more than one acre of ground. The regulations also require certain communities with Municipal Separate Storm Sewer Systems (MS4) to perform education, inspection, and regulation activities to reduce stormwater pollution within their communities. Iowa does not currently have a resource to provide guidance on the stormwater regulations to contractors, designers, engineers, and municipal staff. The Statewide Urban Design and Specifications (SUDAS) manuals are widely accepted as the statewide standard for public improvements. The SUDAS Design manual currently contains a brief chapter (Chapter 7) on erosion and sediment control; however, it is outdated, and Phase II of the NPDES stormwater regulations is not discussed. In response to the need for guidance, this chapter was completely rewritten. It now describes the need for erosion and sediment control and explains the NPDES stormwater regulations. It provides information for the development and completion of Stormwater Pollution Prevention Plans (SWPPPs) that comply with the stormwater regulations, as well as the proper design and implementation of 28 different erosion and sediment control practices. In addition to the design chapter, this project also updated a section in the SUDAS Specifications manual (Section 9040), which describes the proper materials and methods of construction for the erosion and sediment control practices.
Abstract:
In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one that is defined in expectation rather than an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient even when it is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important in order for the regulator to induce the optimal alignment between pollution level and abatement costs.
Abstract:
OBJECTIVE: A new tool to quantify visceral adipose tissue (VAT) over the android region of a total body dual-energy x-ray absorptiometry (DXA) scan has recently been reported. The measurement, CoreScan, is currently available on Lunar iDXA densitometers. The purpose of the study was to determine the precision of the CoreScan VAT measurement, which is critical for understanding the utility of this measure in longitudinal trials. DESIGN AND METHODS: VAT precision was characterized in both an anthropomorphic imaging phantom (measured on 10 Lunar iDXA systems) and a clinical population consisting of obese women (n = 32). RESULTS: The intrascanner precision for the VAT phantom across 9 quantities of VAT mass (0-1,800 g) ranged from 28.4 to 38.0 g. The interscanner precision ranged from 24.7 to 38.4 g. There was no statistical dependence on the quantity of VAT for either the inter- or intrascanner precision result (p = 0.670). Combining inter- and intrascanner precision yielded a total phantom precision estimate of 47.6 g for VAT mass, which corresponds to a 4.8% coefficient of variation (CV) for a 1 kg VAT mass. Our clinical population, who completed replicate total body scans with repositioning between scans, showed a precision of 56.8 g on an average VAT mass of 1110.4 g. This corresponds to a 5.1% CV. Hence, the in vivo precision result was similar to the phantom precision result. CONCLUSIONS: The study suggests that CoreScan has a relatively low precision error in both phantoms and obese women and therefore may be a useful addition to clinical trials where interventions are targeted towards changes in visceral adiposity.
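The coefficient-of-variation figures quoted in the abstract follow directly from the precision and mean-mass values; a quick arithmetic check (all numbers taken from the abstract itself):

```python
# Quick check of the CV figures quoted in the abstract.
# CV% = 100 * precision (g) / mean VAT mass (g).

def cv_percent(precision_g, mass_g):
    """Coefficient of variation, in percent."""
    return 100.0 * precision_g / mass_g

# Phantom: 47.6 g total precision on a nominal 1 kg VAT mass -> ~4.8% CV
phantom_cv = cv_percent(47.6, 1000.0)

# In vivo: 56.8 g precision on an average VAT mass of 1110.4 g -> ~5.1% CV
in_vivo_cv = cv_percent(56.8, 1110.4)

print(round(phantom_cv, 1), round(in_vivo_cv, 1))  # 4.8 5.1
```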
Abstract:
Optimum experimental designs depend on the design criterion, the model, and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass, such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I hope to discuss a variety of models for such experiments. Then I will discuss constrained mixture experiments, when not all the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.
References
Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.
Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225–239. Dordrecht: Kluwer.
Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344–360.
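Scheffé polynomials drop the intercept and pure-power terms because the mixture proportions sum to one; a minimal sketch of the second-order (quadratic) Scheffé model for a three-component mixture (the coefficient values below are illustrative only, not from any real glass experiment):

```python
# Second-order Scheffé polynomial for a q = 3 mixture:
#   y = sum_i b_i x_i + sum_{i<j} b_ij x_i x_j,  with x_1 + x_2 + x_3 = 1.
# Coefficients are made up for illustration.
from itertools import combinations

def scheffe_quadratic(x, linear, cross):
    """Evaluate a quadratic Scheffé polynomial at mixture point x.

    x      : component proportions, must sum to 1
    linear : [b_1, ..., b_q]
    cross  : {(i, j): b_ij for i < j}
    """
    assert abs(sum(x) - 1.0) < 1e-9, "mixture proportions must sum to 1"
    y = sum(b * xi for b, xi in zip(linear, x))
    y += sum(cross[i, j] * x[i] * x[j]
             for i, j in combinations(range(len(x)), 2))
    return y

linear = [2.0, 1.0, 3.0]
cross = {(0, 1): 0.5, (0, 2): -1.0, (1, 2): 0.0}

# Pure blends recover the linear coefficients; interior points also
# pick up the cross (blending) terms.
print(scheffe_quadratic([1.0, 0.0, 0.0], linear, cross))  # 2.0
print(scheffe_quadratic([1/3, 1/3, 1/3], linear, cross))
```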
Validation of the New Mix Design Process for Cold In-Place Rehabilitation Using Foamed Asphalt, 2007
Abstract:
Asphalt pavement recycling has grown dramatically over the last few years as a viable technology to rehabilitate existing asphalt pavements. Iowa's current Cold In-place Recycling (CIR) practice utilizes a generic recipe specification to define the characteristics of the CIR mixture. As CIR continues to evolve, the desire to place CIR mixture with specific engineering properties requires the use of a mix design process. A new mix design procedure was developed for Cold In-place Recycling using foamed asphalt (CIR-foam) in consideration of its predicted field performance. The new laboratory mix design process was validated against various Reclaimed Asphalt Pavement (RAP) materials to determine its consistency over a wide range of RAP materials available throughout Iowa. The performance tests, which include dynamic modulus test, dynamic creep test and raveling test, were conducted to evaluate the consistency of a new CIR-foam mix design process to ensure reliable mixture performance over a wide range of traffic and climatic conditions. The “lab designed” CIR will allow the pavement designer to take the properties of the CIR into account when determining the overlay thickness.
Abstract:
Granular shoulders are an important element of the transportation system and are constantly subjected to performance problems due to wind- and water-induced erosion, rutting, edge drop-off, and slope irregularities. Such problems can directly affect drivers’ safety and often require regular maintenance. The present research study was undertaken to investigate the factors contributing to these performance problems and to propose new ideas to design and maintain granular shoulders while keeping ownership costs low. This report includes observations made during a field reconnaissance study, findings from an effort to stabilize the granular and subgrade layer at six shoulder test sections, and the results of a laboratory box study where a shoulder section overlying a soft foundation layer was simulated. Based on the research described in this report, the following changes are proposed to the construction and maintenance methods for granular shoulders:
• A minimum CBR value for the granular and subgrade layer should be selected to alleviate edge drop-off and rutting formation.
• For those constructing new shoulder sections, the design charts provided in this report can be used as a rapid guide based on an allowable rut depth. The charts can also be used to predict the behavior of existing shoulders.
• In the case of existing shoulder sections overlying soft foundations, the use of geogrid or fly ash stabilization proved to be an effective technique for mitigating shoulder rutting.
Abstract:
Cost systems have been shown to have developed considerably in recent years, and activity-based costing (ABC) has been shown to be a contribution to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported of the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils as they are obliged to calculate the cost of the services subject to taxation (e.g., waste collection, etc.). City councils must have a cost system in place to calculate the cost of services, as they are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish Public Sector. Through semi-structured interviews with 28 subjects within one City Council, it contains a case study of cost management. The paper contains extracts from interviews, and a number of factors are identified which contribute to the successful development of the cost management system. Following the case study, a number of other City Councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding, and political incentives as factors which had an influence on system success or failure.
Abstract:
This dissertation presents a study of the role of graphic design in the visual identity projects of city tourism brands. The focus is on the coherence between the graphic visuality of the brand and the socioeconomic and cultural positioning of cities as tourism enterprises. The study of city brand positioning was based on the book Competitive Identity (ANHOLT, 2007), on the Anholt city branding index (2006), and on the partial updates of that index (ANHOLT, 2009 and 2011). In addition, the graphic brands of 30 cities, together with the corresponding data on their positioning as tourism enterprises, were collected from the cities' official websites. Based on these 30 cities, each with a city tourism graphic brand, a visual classification was proposed around three main categories: conceptual categorization, kinetic-sensorial categorization, and visual categorization. Drawing on this information and on the classification of the visuality of the graphic brands studied, a comparative study was carried out to establish coherences between the visual communication of the graphic brand and the socioeconomic and cultural positioning of the tourist cities. In this light, the brands of the cities of São Paulo and Melbourne are highlighted as a national and an international example of applied graphic creativity and of coherence between the positioning of the tourism enterprise and the visual identity of the brand.
Abstract:
Firms compete by choosing both a price and a design from a family of designs that can be represented as demand rotations. Consumers engage in costly sequential search among firms. Each time a consumer pays a search cost he observes a new offering. An offering consists of a price quote and a new good, where goods might vary in the extent to which they are good matches for the consumer. In equilibrium, only two design-styles arise: either the most niche, where consumers are likely to either love or loathe the product, or the broadest, where consumers are likely to have similar valuations. In equilibrium, different firms may simultaneously offer both design-styles. We perform comparative statics on the equilibrium and show that a fall in search costs can lead to higher industry prices and profits and lower consumer surplus. Our analysis is related to discussions of how the internet has led to the prevalence of niche goods and the "long tail" phenomenon.
Abstract:
We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega(n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
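The "empirically designed" quantizer in results like this one is commonly obtained by Lloyd's algorithm (k-means) on the training sample; the bounds in the abstract hold for any design algorithm, so the following one-dimensional sketch is just one concrete design procedure, not the paper's method:

```python
# Design a k-level scalar quantizer from training data with Lloyd's
# algorithm, then measure its empirical mean squared distortion.
import random

def lloyd_1d(data, k, iters=50):
    """Alternate nearest-neighbor partitioning and centroid updates."""
    codebook = sorted(random.sample(data, k))
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Centroid update; keep the old codepoint if a cell is empty.
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return codebook

def distortion(data, codebook):
    """Empirical mean squared distortion of the quantizer on data."""
    return sum(min((x - c) ** 2 for c in codebook) for x in data) / len(data)

random.seed(0)
train = [random.gauss(0.0, 1.0) for _ in range(2000)]
cb = lloyd_1d(train, k=4)
print(sorted(cb), distortion(train, cb))
```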
Abstract:
The demands of representative design, as formulated by Egon Brunswik (1956), set a high methodological standard. Both experimental participants and the situations with which they are faced should be representative of the populations to which researchers claim to generalize results. Failure to observe the latter has led to notable experimental failures in psychology from which economics could learn. It also raises questions about the meaning of testing economic theories in abstract environments. Logically, abstract tests can only be generalized to abstract realities and these may or may not have anything to do with the empirical realities experienced by economic actors.
Abstract:
Protein-protein interactions encode the wiring diagram of cellular signaling pathways, and their deregulation underlies a variety of diseases, such as cancer. Inhibiting protein-protein interactions with peptide derivatives is a promising way to develop new biological and therapeutic tools. Here, we develop a general framework to computationally handle hundreds of non-natural amino acid sidechains and predict the effect of inserting them into peptides or proteins. We first generate all structural files (pdb and mol2), as well as parameters and topologies for standard molecular mechanics software (CHARMM and Gromacs). Accurate predictions of rotamer probabilities are provided using a novel combined knowledge- and physics-based strategy. Non-natural sidechains are useful to increase peptide ligand binding affinity. Our results obtained on non-natural mutants of a BCL9 peptide targeting beta-catenin show very good correlation between predicted and experimental binding free-energies, indicating that such predictions can be used to design new inhibitors. Data generated in this work, as well as PyMOL and UCSF Chimera plug-ins for user-friendly visualization of non-natural sidechains, are all available at http://www.swisssidechain.ch. Our results enable researchers to rapidly and efficiently work with hundreds of non-natural sidechains.
Abstract:
The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based on two GB/SA parameters solely, was tested on two different external sets of molecules. On the Martel druglike test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to six well-established empirical methods. On the 17-drug test set, our model outperformed all compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and tridimensional molecular graphics capability lay the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
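The error metrics quoted above (Pearson r, MAE, RMSE) are standard for benchmarking log P predictors; a minimal sketch of fitting a two-descriptor multiple linear model and scoring it with those metrics, using synthetic data as a stand-in for the two GB/SA parameters (none of the numbers below come from the iLOGP work):

```python
# Fit log P on two descriptors by least squares, then compute the
# metrics reported in the abstract: MAE, RMSE, and Pearson r.
# The descriptors and "experimental" values are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Design matrix [1, x1, x2]: intercept plus two hypothetical descriptors.
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_coef = np.array([0.3, 1.2, -0.8])
logp = X @ true_coef + rng.normal(scale=0.5, size=n)  # noisy "experiment"

coef, *_ = np.linalg.lstsq(X, logp, rcond=None)
pred = X @ coef

mae = np.mean(np.abs(pred - logp))
rmse = np.sqrt(np.mean((pred - logp) ** 2))
r = np.corrcoef(pred, logp)[0, 1]
print(round(mae, 2), round(rmse, 2), round(r, 2))
```

With the noise level chosen here the fit recovers the synthetic coefficients closely, so MAE and RMSE land near the injected noise scale; MAE is always bounded above by RMSE for the same residuals.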