991 results for Prestress losses
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY SERVICES WITH PRIOR ARRANGEMENT
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY SERVICES WITH PRIOR ARRANGEMENT
Abstract:
We suggest a model for data losses in a single node (memory buffer) of a packet-switched network (like the Internet) which reduces to one-dimensional discrete random walks with unusual boundary conditions. By construction, the model has critical behavior with a sharp transition from exponentially small to finite losses with increasing data arrival rate. We show that for a finite-capacity buffer at the critical point the loss rate exhibits strong fluctuations and non-Markovian power-law correlations in time, in spite of the Markovian character of the data arrival process.
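The threshold behaviour described above can be illustrated numerically. The sketch below is a minimal simplified version, not the paper's exact model (which has unusual boundary conditions): a discrete-time finite-capacity buffer where a packet arrives with probability `p` and one is served with probability `q` per step, so buffer occupancy performs a biased one-dimensional random walk; all parameter values are hypothetical.

```python
import random

def simulate_buffer(capacity, p_arrival, p_service, steps, seed=0):
    """Discrete-time random walk for buffer occupancy.

    Each step, a packet arrives with probability p_arrival and one is
    served with probability p_service; arrivals to a full buffer are
    lost. Returns the fraction of arriving packets that were dropped.
    """
    rng = random.Random(seed)
    occupancy = 0
    arrivals = losses = 0
    for _ in range(steps):
        if rng.random() < p_arrival:
            arrivals += 1
            if occupancy < capacity:
                occupancy += 1
            else:
                losses += 1  # buffer full: packet dropped
        if occupancy > 0 and rng.random() < p_service:
            occupancy -= 1
    return losses / arrivals if arrivals else 0.0

# Below the transition (arrival rate < service rate) losses are
# exponentially small; above it they become finite.
low = simulate_buffer(capacity=50, p_arrival=0.3, p_service=0.6, steps=100_000)
high = simulate_buffer(capacity=50, p_arrival=0.9, p_service=0.6, steps=100_000)
```

In this toy version the transition sits at p ≈ q; the paper's point is that, at the critical point, the loss process itself exhibits strong fluctuations and non-Markovian power-law correlations in time.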
Abstract:
We report on high power issues related to the reliability of fibre Bragg gratings inscribed with an infrared femtosecond laser using the point-by-point writing method. Conventionally, fibre Bragg gratings have usually been written in fibres using ultraviolet light, either holographically or using a phase mask. Since the coating is highly absorbing in the UV, this process normally requires that the protective polymer coating is stripped prior to inscription, with the fibre then being recoated. This results in a time consuming fabrication process that, unless great care is taken, can lead to fibre strength degradation, due to the presence of surface damage. The recent development of FBG inscription using NIR femtosecond lasers has eliminated the requirement for the stripping of the coating. At the same time the ability to write gratings point-by-point offers the potential for great flexibility in the grating design. There is, however, a requirement for reliability testing of these gratings, particularly for use in telecommunications systems where high powers are increasingly being used in long-haul transmission systems making use of Raman amplification. We report on a study of such gratings which has revealed the presence of broad spectrum power losses. When high powers are used, even at wavelengths far removed from the Bragg condition, these losses produce an increase in the fibre temperature due to absorption in the coating. We have monitored this temperature rise using the wavelength shift in the grating itself. At power levels of a few watts, various temperature increases were experienced ranging from a few degrees up to the point where the buffer completely melts off the fibre at the grating site. Further investigations are currently under way to study the optical loss mechanisms in order to optimise the inscription mechanism and minimise such losses.
Abstract:
This paper examines investors' reactions to dividend reductions or omissions conditional on past earnings and dividend patterns for a sample of eighty-two U.S. firms that incurred an annual loss. We document that the market reaction for firms with long patterns of past earnings and dividend payouts is significantly more negative than for firms with less-established past earnings and dividend records. Our results can be explained by the following line of reasoning. First, consistent with DeAngelo, DeAngelo, and Skinner (1992), a loss following a long stream of earnings and dividend payments represents an unreliable indicator of future earnings. Thus, established firms have lower loss reliability than less-established firms. Second, because current earnings and dividend policy are substitute means of forecasting future earnings, lower loss reliability increases the information content of dividend reductions. Therefore, given the presence of a loss, the longer the stream of prior earnings and dividend payments, (1) the lower the loss reliability and (2) the more reliably dividend cuts are perceived as an indication that earnings difficulties will persist in the future.
Abstract:
The southern Everglades mangrove ecotone is characterized by extensive dwarf Rhizophora mangle L. shrub forests with a seasonally variable water source (Everglades – NE Florida Bay) and residence times ranging from short to long. We conducted a leaf leaching experiment to understand the influence that water source and its corresponding water quality have on (1) the early decay of R. mangle leaves and (2) the early exchange of total organic carbon (TOC) and total phosphorus (TP) between leaves and the water column. Newly senesced leaves collected from lower Taylor River (FL) were incubated in bottles containing water from one of three sources (Everglades, ambient mangrove, and Florida Bay) that spanned a range of salinity from 0 to 32‰, [TOC] from 710 to 1400 μM, and [TP] from 0.17 to 0.33 μM. We poisoned half the bottles in order to quantify abiotic processes (i.e., leaching) and assumed that non-poisoned bottles represented both biotic (i.e., microbial) and abiotic processes. We sacrificed bottles after 1, 2, 5, 10, and 21 days of incubation and quantified changes in leaf mass and changes in water column [TOC] and [TP]. We saw 10–20% loss of leaf mass after 24 h—independent of water treatment—that leveled off by Day 21. After 3 weeks, non-poisoned leaves lost more mass than poisoned leaves, and there was only an effect of salinity on mass loss in poisoned incubations—with greatest leaching-associated losses in Everglades freshwater. Normalized concentrations of TOC in the water column increased by more than two orders of magnitude after 21 days with no effect of salinity and no difference between poisoned and non-poisoned treatments. However, normalized [TP] was lower in non-poisoned incubations as a result of immobilization by epiphytic microbes. This immobilization was greatest in Everglades freshwater and reflects the high P demand in this ecosystem.
Immobilization of leached P in mangrove water and Florida Bay water was delayed by several days and may indicate an initial microbial limitation by labile C during the dry season.
Abstract:
Financial survival in the hotel and restaurant business can depend upon a mastery of the basic principles of risk management. This article explains the series of steps leading to the successful implementation of the risk management techniques most appropriate for a given hotel or restaurant.
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were (i) Conventional - tillage of beds and inter-rows, with residual herbicides used; (ii) Improved - only the beds were tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single-tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. Improved practices, with 30% lower atrazine application rates than the conventional system, reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine loads and a >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite herbicide losses was illustrated by the high concentration of diuron (14 μg L-1) recorded in runoff that occurred >2.5 months after herbicide application in a 1st ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded a 12.3% lower yield compared to the Conventional practice.
These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
Abstract:
Climate change and carbon (C) sequestration are a major focus of research in the twenty-first century. Globally, soils store about 300 times the amount of C that is released per annum through the burning of fossil fuels (Schulze and Freibauer 2005). Land clearing and the introduction of agricultural systems have led to rapid declines in soil C reserves. The recent introduction of conservation agricultural practices has not reversed the decline in soil C content, although it has minimized the rate of decline (Baker et al. 2007; Hulugalle and Scott 2008). Lal (2003) estimated the quantum of C pools in the atmosphere, terrestrial ecosystems, and oceans and reported a "missing C" component in the world C budget. Though not yet proven, this could be linked to C losses through runoff and soil erosion (Lal 2005) and to a lack of C accounting in inland water bodies (Cole et al. 2007). Land management practices that minimize microbial respiration and soil organic C (SOC) decline, such as minimum tillage or no tillage, were extensively studied in the past, but the soil erosion and runoff studies monitoring those management systems focused on other nutrients such as nitrogen (N) and phosphorus (P).
Abstract:
Hollow, cylindrical, prismatic light guides (CPLGs) are optical components that, using total internal reflection (TIR), are able to transmit high-diameter light beams in daylight and artificial lighting applications without relevant losses. It is necessary to study the prism defects of their surfaces to quantify the behavior of these optical components. In this Letter, we analyze a CPLG made of a transparent dielectric material. Scanning electron microscopy (SEM) and the topographic optical profilometry by absorption in fluids (TOPAF) imaging technique are conducted to determine if there are defects in the corners of the prisms. A model for light guide transmittance that is dependent on prism defects is proposed. Finally, a simulation and an experimental study are carried out to check the validity of the proposed model.
Abstract:
This paper generalizes the model of Salant et al. (1983; Quarterly Journal of Economics, Vol. 98, pp. 185–199) to a successive oligopoly model with product differentiation. Upstream firms produce differentiated goods, retailers compete in quantities, and supply contracts are linear. We show that if retailers buy from all producers, downstream mergers do not affect wholesale prices. Our result replicates that of Salant et al., where mergers are not profitable unless the size of the merged firm exceeds 80 per cent of the industry. This result is robust to the type of competition.
Abstract:
In the scope of the discussions about microgeneration (and microgrids), the avoided electrical losses are often pointed out as an important value to be credited to those entities. Therefore, methods to assess the impact of microgeneration on losses must be developed in order to support the definition of a suitable regulatory framework for the economic integration of microgeneration on distribution networks. This paper presents an analytical method to quantify the value of avoided losses that microgeneration may produce on LV networks. Intervals of expected avoided losses are used to account for the variation of avoided losses due to the number, size and location of microgenerators, as well as for the kind of load distribution on LV networks.
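Why location and size matter for avoided losses can be seen in a minimal sketch (hypothetical numbers, not the paper's analytical method): on a radial feeder, each segment's loss grows with the square of the current it carries, so a microgenerator that offsets load far from the substation reduces the current, and hence the losses, in every upstream segment.

```python
def feeder_losses(segment_resistances, net_currents):
    """I^2 R losses on a radial feeder: the current in a segment
    equals the sum of the net currents drawn downstream of it."""
    loss = 0.0
    for i, r in enumerate(segment_resistances):
        through = sum(net_currents[i:])  # current carried by segment i
        loss += through**2 * r
    return loss

# Three-segment feeder (ohms) with three load points (amps).
resistances = [0.05, 0.05, 0.05]
loads = [10.0, 10.0, 10.0]

base = feeder_losses(resistances, loads)
# A 5 A microgenerator at the far end offsets load locally,
# so every segment carries less current.
with_mg = feeder_losses(resistances, [10.0, 10.0, 5.0])
avoided = base - with_mg
```

Repeating such a calculation over the possible numbers, sizes and locations of microgenerators yields the intervals of expected avoided losses the paper works with.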
Abstract:
Pesticide application has been described by many researchers as a very inefficient process. In some cases, reports indicate that only 0.02% of the applied product contributes to effective control of the target problem. The main factor influencing pesticide application is the droplet size formed at the spray nozzle. Many parameters affect droplet dynamics, such as wind, temperature and relative humidity. Small droplets are biologically more active, but they are affected by evaporation and drift; large droplets, on the other hand, do not promote a good distribution of the product on the target. Given the risk of contaminating non-target areas and the high costs involved in application, knowledge of the droplet size is therefore of fundamental importance in application technology. When sophisticated droplet-analysis technology is unavailable, it is common to sample droplets with artificial targets such as water-sensitive paper. In field sampling, water-sensitive papers are placed on the trial plots where the product will be applied. When droplets impinge on the paper, its yellow surface is stained dark blue, making the droplets easy to recognize. The droplets collected on these papers span a range of sizes, so determining the droplet size distribution gives the mass distribution of the material and hence the efficiency of the product application. The stains produced by the droplets show a spread factor proportional to their respective initial sizes. One methodology for analyzing the droplets is to count and measure them under a microscope. A Porton N-G12 graticule, which shows equally spaced class intervals in a geometric progression of √2, is coupled to the microscope lens. The droplet size parameters most frequently used are the Volumetric Median Diameter (VMD) and the Numeric Median Diameter (NMD).
At the VMD, a representative droplet sample is divided into two parts of equal volume, such that one part contains droplets smaller than the VMD and the other contains droplets larger than the VMD. The same process is used to obtain the NMD, which divides the sample into two equal parts with respect to droplet number. The ratio between VMD and NMD allows the droplet uniformity to be evaluated. The cumulative probability curves of droplet volume and number are then plotted on log-scale paper (cumulative probability versus the median diameter of each size class). The graph provides the median diameter at the x-axis point corresponding to the 50% value on the y-axis. This whole process is very slow and subject to operator error. Therefore, to reduce the difficulty involved in droplet measurement, a numerical model was developed, implemented in an easy and accessible computational language, which yields approximate VMD and NMD values with good precision. The inputs to this model are the frequencies of the droplet sizes collected on the water-sensitive paper, observed through the Porton N-G12 graticule fitted to the microscope. With these data, the cumulative distributions of droplet volume and number are evaluated. The resulting plots allow the VMD and NMD to be obtained by linear interpolation, since the curves are approximately linear near the middle of the distributions. These values are essential for evaluating droplet uniformity and for estimating the volume deposited on the observed paper from the droplet density (droplets/cm2). This methodology for estimating droplet volume was developed within Project 11.0.94.224 of CNPMA/EMBRAPA.
Observed data from aerial herbicide spraying samples, collected by the Project in Pelotas/RS county, were used to compare values obtained with the manual graphical method against those obtained with the model. The model reproduced, with great precision, the VMD and NMD values for each sampled collector, allowing the quantity of deposited product and, consequently, the quantity lost to drift to be estimated. The variability plots of VMD and NMD showed that the number of droplets reaching the collectors had a small dispersion, while the deposited volume showed a wide interval of variation, probably because of the strong action of air turbulence on the droplet distribution, emphasizing the need for a deeper study to verify these influences on drift.
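The interpolation step described above can be sketched as follows (a minimal illustration with hypothetical droplet counts, not the project's actual code): weight each size class by droplet count for the NMD, or by count times diameter cubed for the VMD, and interpolate linearly where the cumulative distribution crosses 50%.

```python
def median_diameter(class_mid_diameters, counts, by_volume):
    """Estimate VMD (by_volume=True) or NMD (by_volume=False) by
    linear interpolation on the cumulative distribution."""
    # Weight each class by droplet count (NMD) or by total droplet
    # volume in the class, proportional to count * d**3 (VMD).
    weights = [n * d**3 if by_volume else n
               for d, n in zip(class_mid_diameters, counts)]
    total = sum(weights)
    cum = 0.0
    for i, w in enumerate(weights):
        prev = cum
        cum += w
        if cum >= total / 2:
            # Interpolate linearly within the class that crosses 50%.
            d_lo = class_mid_diameters[i - 1] if i > 0 else class_mid_diameters[i]
            d_hi = class_mid_diameters[i]
            if cum == prev:
                return d_hi
            frac = (total / 2 - prev) / (cum - prev)
            return d_lo + frac * (d_hi - d_lo)
    return class_mid_diameters[-1]

# Hypothetical droplet counts per size class (class mid-diameters in
# micrometres), e.g. from a Porton graticule count on water-sensitive paper.
diam = [25, 50, 100, 200, 400]
counts = [120, 80, 40, 15, 3]
nmd = median_diameter(diam, counts, by_volume=False)
vmd = median_diameter(diam, counts, by_volume=True)
# A VMD/NMD ratio well above 1 indicates a wide spread of droplet sizes.
```

Because volume weighting scales with d³, a few large droplets dominate the VMD while the NMD tracks the many small ones, which is why the ratio of the two serves as the uniformity measure.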
Abstract:
The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, the process can be complex to tackle and may sound discouraging to neophytes. In this context, there is an increasing need for simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising and in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses for precast RC buildings damaged by the 2012 earthquake is created. A statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at the large scale.
Furthermore, they may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.