953 results for "Clear channel assessment"


Relevance:

30.00%

Publisher:

Abstract:

People in many parts of the world, India included, face an immense challenge in meeting their basic water needs. The crisis stems not from a lack of fresh water but from its availability in adequate quality. Environmental quality objectives should be developed in order to define acceptable loads on the terrain. There have been a number of initiatives in water quality monitoring, but the next step towards improving water quality has not kept the required pace. Today there is a growing need to create awareness among citizens of the different technologies available for improving water quality. Monitoring helps to show how land and water use affect the quality of water and assists in estimating the extent of pollution. Once these issues are recognized, people can work towards local solutions to manage this indispensable resource effectively. Groundwaters are extremely precious resources, and in many countries, India among them, they represent the most important drinking water supply. They are generally microbiologically pure and, in most cases, need no treatment. This communication is intended to act as a guide to the various tools and techniques available for groundwater quality assessment, and to suggest precautionary measures for embarking on environmental management. The study is imperative given that groundwater is the exclusive source of drinking water in the region, which not only makes the situation alarming but also calls for immediate attention. The scope of this work is fairly broad. Water quality in Ernakulam district is deteriorating because of rapid urbanization. The closure of several water bodies through land development and construction prevents rainwater from infiltrating the ground and recharging the aquifers. Most of the aquifers are being polluted by industrial effluents and by the chemicals and fertilizers used in agriculture.
Such serious issues require proper monitoring of groundwater, and steps must be taken towards remedial measures. This study helps in the overall protection and sustainability of the rich groundwater resource. The socio-economic aspects covered could be used for conducting further individual case studies and for suggesting remedial measures on a scientific basis. The specific study, taken up for 15 sites, can be further extended to the sources of pollution, especially industrial and agricultural ones.

Relevance:

30.00%

Publisher:

Abstract:

The conventional method for the assessment of acute inhalation toxicity (OECD Test Guideline 403, 1981) uses the death of animals as an endpoint to identify the median lethal concentration (LC50). A new OECD Testing Guideline, called the Fixed Concentration Procedure (FCP), is being prepared to provide an alternative to Test Guideline 403. Unlike Test Guideline 403, the FCP does not provide a point estimate of the LC50, but aims to identify an airborne exposure level that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonized System of Classification and Labelling scheme (GHS). The FCP has been validated using statistical simulation rather than by in vivo testing. The statistical simulation approach predicts the GHS classification outcome and the numbers of deaths and animals used in the test for imaginary substances with a range of LC50 values and dose-response curve slopes. This paper describes the FCP and reports the results of the statistical simulation study assessing its properties. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 403, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LC50 value.
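The simulation logic described above can be sketched as a Monte Carlo draw over a probit-style dose-response curve. This is an illustrative reconstruction, not the guideline's actual design: the group size, concentration, LC50 and slope values below are invented.

```python
import math
import random

def p_death(conc, lc50, slope):
    """Probit-style dose-response: probability of death at a given
    airborne concentration, from the LC50 and the log10-dose slope."""
    z = slope * (math.log10(conc) - math.log10(lc50))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_deaths(conc, lc50, slope, n_animals=5, n_runs=10_000, seed=1):
    """Monte Carlo estimate of the expected number of deaths in one
    test group exposed at a single fixed concentration."""
    rng = random.Random(seed)
    p = p_death(conc, lc50, slope)
    total = sum(sum(rng.random() < p for _ in range(n_animals))
                for _ in range(n_runs))
    return total / n_runs

# At the LC50 itself each animal dies with probability 0.5, so a group
# of 5 averages about 2.5 deaths.
print(expected_deaths(2.0, lc50=2.0, slope=4.0))
```

Repeating such runs across a grid of LC50 values and slopes is what lets the classification outcomes and animal usage of a procedure be assessed without animal testing.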

Relevance:

30.00%

Publisher:

Abstract:

Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can consistently separate the highest accuracy models from the lowest. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers; however, further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post-filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering-based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available.
Conversely, the true MQAP methods tested can often be used as effective post-filters for re-ranking a few models from individual fold recognition servers, and further improvements can be achieved using a consensus of these methods.
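The consensus idea can be illustrated with a much simpler combination rule than the paper's neural network: z-score each method's scores and average them. The method names and scores below are invented.

```python
import statistics

def rank_by_consensus(scores_by_method):
    """Rank candidate models by averaging per-method z-scores
    (a simple stand-in for a learned combination of true MQAPs)."""
    models = sorted(next(iter(scores_by_method.values())))
    combined = {m: 0.0 for m in models}
    for scores in scores_by_method.values():
        mu = statistics.mean(scores.values())
        sd = statistics.pstdev(scores.values()) or 1.0
        for m in models:
            combined[m] += (scores[m] - mu) / sd
    return sorted(models, key=lambda m: combined[m], reverse=True)

# Hypothetical scores from three "true" MQAPs for three candidate models:
scores = {
    "mqap_a": {"model1": 0.71, "model2": 0.55, "model3": 0.62},
    "mqap_b": {"model1": 0.64, "model2": 0.58, "model3": 0.61},
    "mqap_c": {"model1": 0.69, "model2": 0.50, "model3": 0.66},
}
print(rank_by_consensus(scores))
```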

Relevance:

30.00%

Publisher:

Abstract:

As control systems have developed and the implications of poor hygienic practices have become better known, the evaluation of the hygienic status of premises has become more critical. The assessment of the overall status of premises hygiene can provide useful management data indicating whether the premises are improving or whether, whilst still meeting legal requirements, they might be failing to maintain previously high standards. Since the creation of the Meat Hygiene Service (MHS) in the United Kingdom, one of the aims of the service has been to monitor hygiene on different premises, to provide a means of comparing standards, and to identify and encourage improvements. This desire led to the implementation of a scoring system known as the Hygiene Assessment System (HAS). This paper analyses English slaughterhouses' HAS scores between 1998 and 2005, outlining the main incidents throughout this period. Although the scores rose initially, the later results displayed a clear decrease in the general hygiene scores. These revealing results coincide with the start of a new meat inspection system in which, after several years of discussion, risk-based inspection is finally becoming a reality within Europe. The paper considers the implications of these changes for the way hygiene standards will be monitored in the future.

Relevance:

30.00%

Publisher:

Abstract:

The General Packet Radio Service (GPRS) has been developed for the mobile radio environment to allow migration from the traditional circuit-switched connection to a more efficient packet-based communication link, particularly for data transfer. GPRS requires the addition of not only the GPRS software protocol stack but also more baseband functionality for the mobile, since new coding schemes have been defined, along with uplink status flag detection, multislot operation and dynamic coding scheme detection. This paper concentrates on evaluating the performance of the GPRS coding scheme detection methods in the presence of a multipath fading channel with a single co-channel interferer, as a function of various soft-bit data widths. It has been found that compressing the soft-bit data widths at the output of the equalizer to save memory can influence the likelihood decision of the coding scheme detection function and hence contribute to the overall performance loss of the system. Coding scheme detection errors can therefore force the channel decoder either to select the incorrect decoding scheme or to have no clear decision on which coding scheme to use, resulting in the decoded radio block failing the block check sequence and contributing to the block error rate. For correct performance simulation, the performance of the full coding scheme detection must be taken into account.
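The effect described, where narrowing the soft-bit width degrades the likelihood decision, can be illustrated with a toy soft-decision detector. The marker patterns and bit widths here are invented for illustration and are not the actual GPRS burst format.

```python
def quantize(soft, n_bits):
    """Saturating uniform quantizer for equalizer soft outputs in
    [-1, 1], keeping n_bits (sign plus magnitude) per sample."""
    levels = 2 ** (n_bits - 1) - 1
    q = max(-levels, min(levels, round(soft * levels)))
    return q / levels

def correlate(soft_bits, pattern):
    """Soft correlation against one candidate marker pattern (+1/-1);
    a larger value means that coding scheme is more likely."""
    return sum(s * p for s, p in zip(soft_bits, pattern))

def detect_scheme(soft_bits, patterns):
    """Pick the coding scheme whose marker pattern correlates best."""
    return max(patterns, key=lambda k: correlate(soft_bits, patterns[k]))

patterns = {
    "CS-1": [1, 1, 1, 1, -1, -1, -1, -1],
    "CS-2": [1, -1, 1, -1, 1, -1, 1, -1],
}
# Noisy soft bits out of the equalizer for a transmitted CS-2 marker:
soft = [0.9, -0.7, 0.8, -0.6, 0.5, -0.9, 0.7, -0.4]
print(detect_scheme(soft, patterns))
print(detect_scheme([quantize(s, 3) for s in soft], patterns))
```

With deeper fades, coarse quantization pushes the candidate correlations together; a wrong or ambiguous winner sends the radio block to the wrong decoder, which is the error mechanism the paper quantifies.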

Relevance:

30.00%

Publisher:

Abstract:

SCIENTIFIC SUMMARY

Globally averaged total column ozone has declined over recent decades due to the release of ozone-depleting substances (ODSs) into the atmosphere. Now, as a result of the Montreal Protocol, ozone is expected to recover from the effects of ODSs as ODS abundances decline in the coming decades. However, a number of factors in addition to ODSs have led to and will continue to lead to changes in ozone. Discriminating between the causes of past and projected ozone changes is necessary, not only to identify the progress in ozone recovery from ODSs, but also to evaluate the effectiveness of climate and ozone protection policy options.

Factors Affecting Future Ozone and Surface Ultraviolet Radiation

• At least for the next few decades, the decline of ODSs is expected to be the major factor affecting the anticipated increase in global total column ozone. However, several factors other than ODSs will affect the future evolution of ozone in the stratosphere. These include changes in (i) stratospheric circulation and temperature due to changes in long-lived greenhouse gas (GHG) abundances, (ii) stratospheric aerosol loading, and (iii) source gases of highly reactive stratospheric hydrogen and nitrogen compounds. Factors that amplify the effects of ODSs on ozone (e.g., stratospheric aerosols) will likely decline in importance as ODSs are gradually eliminated from the atmosphere.

• Increases in GHG emissions can both positively and negatively affect ozone. Carbon dioxide (CO2)-induced stratospheric cooling elevates middle and upper stratospheric ozone and decreases the time taken for ozone to return to 1980 levels, while projected GHG-induced increases in tropical upwelling decrease ozone in the tropical lower stratosphere and increase ozone in the extratropics. Increases in nitrous oxide (N2O) and methane (CH4) concentrations also directly impact ozone chemistry, but the effects are different in different regions.

• The Brewer-Dobson circulation (BDC) is projected to strengthen over the 21st century and thereby affect ozone amounts. Climate models consistently predict an acceleration of the BDC or, more specifically, of the upwelling mass flux in the tropical lower stratosphere of around 2% per decade as a consequence of GHG abundance increases. A stronger BDC would decrease the abundance of tropical lower stratospheric ozone, increase poleward transport of ozone, and could reduce the atmospheric lifetimes of long-lived ODSs and other trace gases. While simulations showing faster ascent in the tropical lower stratosphere to date are a robust feature of chemistry-climate models (CCMs), this has not been confirmed by observations and the responsible mechanisms remain unclear.

• Substantial ozone losses could occur if stratospheric aerosol loading were to increase in the next few decades, while halogen levels are high. Stratospheric aerosol increases may be caused by sulfur contained in volcanic plumes entering the stratosphere or from human activities. The latter might include attempts to geoengineer the climate system by enhancing the stratospheric aerosol layer. The ozone losses mostly result from enhanced heterogeneous chemistry on stratospheric aerosols. Enhanced aerosol heating within the stratosphere also leads to changes in temperature and circulation that affect ozone.

• Surface ultraviolet (UV) levels will not be affected solely by ozone changes but also by the effects of climate change and by air quality change in the troposphere. These tropospheric effects include changes in clouds, tropospheric aerosols, surface reflectivity, and tropospheric sulfur dioxide (SO2) and nitrogen dioxide (NO2). The uncertainties in projections of these factors are large. Projected increases in tropospheric ozone are more certain and may lead to reductions in surface erythemal ("sunburning") irradiance of up to 10% by 2100. Changes in clouds may lead to decreases or increases in surface erythemal irradiance of up to 15% depending on latitude.

Expected Future Changes in Ozone

Full ozone recovery from the effects of ODSs and return of ozone to historical levels are not synonymous. In this chapter a key target date is chosen to be 1980, in part to retain the connection to previous Ozone Assessments. Noting, however, that decreases in ozone may have occurred in some regions of the atmosphere prior to 1980, 1960 return dates are also reported. The projections reported on in this chapter are taken from a recent compilation of CCM simulations. The ozone projections, which also form the basis for the UV projections, are limited in their representativeness of possible futures since they mostly come from CCM simulations based on a single GHG emissions scenario (scenario A1B of Emissions Scenarios. A Special Report of Working Group III of the Intergovernmental Panel on Climate Change, Cambridge University Press, 2000) and a single ODS emissions scenario (adjusted A1 of the previous (2006) Ozone Assessment). Throughout this century, the vertical, latitudinal, and seasonal structure of the ozone distribution will be different from what it was in 1980. For this reason, ozone changes in different regions of the atmosphere are considered separately.

• The projections of changes in ozone and surface clear-sky UV are broadly consistent with those reported on in the 2006 Assessment.

• The capability of making projections and attribution of future ozone changes has been improved since the 2006 Assessment. Use of CCM simulations from an increased number of models extending through the entire period of ozone depletion and recovery from ODSs (1960–2100), as well as sensitivity simulations, has allowed more robust projections of long-term changes in the stratosphere and of the relative contributions of ODSs and GHGs to those changes.

• Global annually averaged total column ozone is projected to return to 1980 levels before the middle of the century and earlier than when stratospheric halogen loading returns to 1980 levels. CCM projections suggest that this early return is primarily a result of GHG-induced cooling of the upper stratosphere because the effects of circulation changes on tropical and extratropical ozone largely cancel. Global (90°S–90°N) annually averaged total column ozone will likely return to 1980 levels between 2025 and 2040, well before the return of stratospheric halogens to 1980 levels between 2045 and 2060.

• Simulated changes in tropical total column ozone from 1960 to 2100 are generally small. The evolution of tropical total column ozone in models depends on the balance between upper stratospheric increases and lower stratospheric decreases. The upper stratospheric increases result from declining ODSs and a slowing of ozone destruction resulting from GHG-induced cooling. Ozone decreases in the lower stratosphere mainly result from an increase in tropical upwelling. From 1960 until around 2000, a general decline is simulated, followed by a gradual increase to values typical of 1980 by midcentury. Thereafter, although total column ozone amounts decline slightly again toward the end of the century, by 2080 they are no longer expected to be affected by ODSs. Confidence in tropical ozone projections is compromised by the fact that simulated decreases in column ozone to date are not supported by observations, suggesting that significant uncertainties remain.

• Midlatitude total column ozone is simulated to evolve differently in the two hemispheres. Over northern midlatitudes, annually averaged total column ozone is projected to return to 1980 values between 2015 and 2030, while for southern midlatitudes the return to 1980 values is projected to occur between 2030 and 2040. The more rapid return to 1980 values in northern midlatitudes is linked to a more pronounced strengthening of the poleward transport of ozone due to the effects of increased GHG levels, and effects of Antarctic ozone depletion on southern midlatitudes. By 2100, midlatitude total column ozone is projected to be above 1980 values in both hemispheres.

• October-mean Antarctic total column ozone is projected to return to 1980 levels after midcentury, later than in any other region, and yet earlier than when stratospheric halogen loading is projected to return to 1980 levels. The slightly earlier return of ozone to 1980 levels (2045–2060) results primarily from upper stratospheric cooling and resultant increases in ozone. The return of polar halogen loading to 1980 levels (2050–2070) in CCMs is earlier than in empirical models that exclude the effects of GHG-induced changes in circulation. Our confidence in the drivers of changes in Antarctic ozone is higher than for other regions because (i) ODSs exert a strong influence on Antarctic ozone, (ii) the effects of changes in GHG abundances are comparatively small, and (iii) projections of ODS emissions are more certain than those for GHGs. Small Antarctic ozone holes (areas of ozone <220 Dobson units, DU) could persist to the end of the 21st century.

• March-mean Arctic total column ozone is projected to return to 1980 levels two to three decades before polar halogen loading returns to 1980 levels, and to exceed 1980 levels thereafter. While CCM simulations project a return to 1980 levels between 2020 and 2035, most models tend not to capture observed low temperatures and thus underestimate present-day Arctic ozone loss, such that it is possible that this return date is biased early. Since the strengthening of the Brewer-Dobson circulation through the 21st century leads to increases in springtime Arctic column ozone, by 2100 Arctic ozone is projected to lie well above 1960 levels.

Uncertainties in Projections

• Conclusions dependent on future GHG levels are less certain than those dependent on future ODS levels since ODS emissions are controlled by the Montreal Protocol. For the six GHG scenarios considered by a few CCMs, the simulated differences in stratospheric column ozone over the second half of the 21st century are largest in the northern midlatitudes and the Arctic, with maximum differences of 20–40 DU between the six scenarios in 2100.

• There remain sources of uncertainty in the CCM simulations. These include the use of prescribed ODS mixing ratios instead of emission fluxes as lower boundary conditions, the range of sea surface temperatures and sea ice concentrations, missing tropospheric chemistry, model parameterizations, and model climate sensitivity.

• Geoengineering schemes for mitigating climate change by continuous injections of sulfur-containing compounds into the stratosphere, if implemented, would substantially affect stratospheric ozone, particularly in polar regions. Ozone losses observed following large volcanic eruptions support this prediction. However, sporadic volcanic eruptions provide limited analogs to the effects of continuous sulfur emissions. Preliminary model simulations reveal large uncertainties in assessing the effects of continuous sulfur injections.

Expected Future Changes in Surface UV

While a number of factors, in addition to ozone, affect surface UV irradiance, the focus in this chapter is on the effects of changes in stratospheric ozone on surface UV. For this reason, clear-sky surface UV irradiance is calculated from ozone projections from CCMs.

• Projected increases in midlatitude ozone abundances during the 21st century, in the absence of changes in other factors, in particular clouds, tropospheric aerosols, and air pollutants, will result in decreases in surface UV irradiance. Clear-sky erythemal irradiance is projected to return to 1980 levels on average in 2025 for the northern midlatitudes, and in 2035 for the southern midlatitudes, and to fall well below 1980 values by the second half of the century. However, actual changes in surface UV will be affected by a number of factors other than ozone.

• In the absence of changes in other factors, changes in tropical surface UV will be small because changes in tropical total column ozone are projected to be small. By the middle of the 21st century, the model projections suggest surface UV to be slightly higher than in the 1960s, very close to values in 1980, and slightly lower than in 2000. The projected decrease in tropical total column ozone through the latter half of the century will likely result in clear-sky surface UV remaining above 1960 levels. Average UV irradiance is already high in the tropics due to naturally occurring low total ozone columns and high solar elevations.

• The magnitude of UV changes in the polar regions is larger than elsewhere because ozone changes in polar regions are larger. For the next decades, surface clear-sky UV irradiance, particularly in the Antarctic, will continue to be higher than in 1980. Future increases in ozone and decreases in clear-sky UV will occur at slower rates than those associated with the ozone decreases and UV increases that occurred before 2000. In Antarctica, surface clear-sky UV is projected to return to 1980 levels between 2040 and 2060, while in the Arctic this is projected to occur between 2020 and 2030. By 2100, October surface clear-sky erythemal irradiance in Antarctica is likely to be between 5% below to 25% above 1960 levels, with considerable uncertainty. This is consistent with multi-model-mean October Antarctic total column ozone not returning to 1960 levels by 2100. In contrast, by 2100, surface clear-sky UV in the Arctic is projected to be 0–10% below 1960 levels.
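The "return to 1980 levels" dates quoted throughout can be read off any simulated ozone time series by a simple search; a minimal sketch, with invented illustrative Dobson unit values rather than CCM output:

```python
def return_to_baseline_year(series, baseline_year=1980):
    """First year after the ozone minimum at which total column ozone
    recovers to its baseline-year value. `series` maps year -> DU."""
    baseline = series[baseline_year]
    years = sorted(series)
    min_year = min(years, key=lambda y: series[y])
    for y in years:
        if y > min_year and series[y] >= baseline:
            return y
    return None  # no return to baseline within the simulated period

# Invented global-mean column ozone values (DU):
ozone = {1980: 301.0, 2000: 284.0, 2020: 291.0, 2035: 302.0, 2050: 305.0}
print(return_to_baseline_year(ozone))
```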

Relevance:

30.00%

Publisher:

Abstract:

There is growing concern about reducing greenhouse gas emissions all over the world. In the recent Post-Copenhagen Report on Climate Change, the U.K. set targets of a 34% reduction in emissions before 2020 and 80% before 2050, compared to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry in order to achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when using these two tools. Besides, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is being fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), with little consideration given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. An application framework is also proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
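The cost half of the proposed trade-off reduces to standard discounting. A minimal life cycle cost sketch with hypothetical design options (real LCC tools also cover replacement, maintenance profiles and end-of-life stages):

```python
def life_cycle_cost(capital, annual_cost, years, discount_rate):
    """Net present value of capital plus discounted annual running
    costs over the study period."""
    pv_running = sum(annual_cost / (1.0 + discount_rate) ** t
                     for t in range(1, years + 1))
    return capital + pv_running

# Option A: higher capital cost, lower running cost (e.g. better fabric);
# Option B: the reverse. All figures are invented.
option_a = life_cycle_cost(1_000_000, 40_000, years=30, discount_rate=0.035)
option_b = life_cycle_cost(800_000, 60_000, years=30, discount_rate=0.035)
print(option_a < option_b)
```

Pairing each option's LCC with its LCA impact over the same study period (e.g. kgCO2e) gives the two-axis comparison that the proposed BIM-integrated tool would automate.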

Relevance:

30.00%

Publisher:

Abstract:

We report on the consistency of water vapour line intensities in selected spectral regions between 800 and 12,000 cm−1 under atmospheric conditions, using sun-pointing Fourier transform infrared spectroscopy. Measurements were made across a number of days at both a low- and a high-altitude field site, sampling a relatively moist and a relatively dry atmosphere. Our data suggest that across most of the 800–12,000 cm−1 spectral region, water vapour line intensities in recent spectral line databases are generally consistent with what was observed. However, we find that HITRAN-2008 water vapour line intensities are systematically lower by up to 20% in the 8000–9200 cm−1 spectral interval relative to other spectral regions. This discrepancy is essentially removed when two new linelists are used: UCL08, a compilation of linelists and ab initio calculations, and one based on recent laboratory measurements by Oudot et al. (2010) [10] in the 8000–9200 cm−1 spectral region. This strongly suggests that the H2O line strengths in the HITRAN-2008 database are indeed underestimated in this spectral region and in need of revision. The calculated global-mean clear-sky absorption of solar radiation is increased by about 0.3 W m−2 when using either the UCL08 or the Oudot line parameters in the 8000–9200 cm−1 region instead of HITRAN-2008. We also found that the effect of isotopic fractionation of HDO is evident in the 2500–2900 cm−1 region of the observations.
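The direction of the reported 0.3 W m−2 change follows from Beer-Lambert absorption: stronger lines absorb more. A schematic single-line sketch in arbitrary units, not a line-by-line radiative transfer calculation:

```python
import math

def absorbed_fraction(line_strength, column_amount):
    """Beer-Lambert absorbed fraction for one schematic line:
    1 - exp(-S * u), with S the line strength and u the absorber amount."""
    return 1.0 - math.exp(-line_strength * column_amount)

weak = absorbed_fraction(1.0, 0.05)       # HITRAN-2008-like strength
stronger = absorbed_fraction(1.2, 0.05)   # lines ~20% stronger (UCL08/Oudot-like)
print(stronger > weak)
```

Note that because absorption saturates, a 20% increase in line strength yields somewhat less than a 20% increase in absorbed flux, which is why the global-mean effect is modest.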

Relevance:

30.00%

Publisher:

Abstract:

The variability of renewable energy is widely recognised as a challenge for integrating high levels of renewable generation into electricity systems. However, to explore its implications effectively, variability itself should first be clearly understood. This is particularly true for national electricity systems with high planned penetration of renewables and limited interconnection such as the UK. Variability cannot be considered as a distinct resource property with a single measurable parameter, but is a multi-faceted concept best described by a range of distinct characteristics. This paper identifies relevant characteristics of variability, and considers their implications for energy research. This is done through analysis of wind, solar and tidal current resources, with a primary focus on the Bristol Channel region in the UK. The relationship with electricity demand is considered, alongside the potential benefits of resource diversity. Analysis is presented in terms of persistence, distribution, frequency and correlation between supply and demand. Marked differences are seen between the behaviours of the individual resources, and these give rise to a range of different implications for system integration. Wind shows strong persistence and a useful seasonal pattern, but also a high spread in energy levels at timescales beyond one or two days. The solar resource is most closely correlated with electricity demand, but is undermined by night-time zero values and an even greater spread of monthly energy delivered than wind. In contrast, the tidal resource exhibits very low persistence, but also much greater consistency in energy values assessed across monthly time scales. Whilst this paper focuses primarily on the behaviour of resources, it is noted that discrete variability characteristics can be related to different system impacts. 
Persistence and predictability are relevant for system balancing, whereas statistical distribution is more relevant when exploring issues of asset utilisation and energy curtailment. Areas of further research are also identified, including the need to assess the value of predictability in relation to other characteristics.
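The persistence and correlation characteristics used in the analysis reduce to standard statistics; a minimal sketch with invented toy series (a steadily ramping, wind-like series versus an alternating, tidal-like one):

```python
import statistics

def autocorrelation(x, lag):
    """Lag-k autocorrelation: a simple persistence measure."""
    mu = statistics.mean(x)
    var = sum((v - mu) ** 2 for v in x)
    cov = sum((x[t] - mu) * (x[t + lag] - mu) for t in range(len(x) - lag))
    return cov / var

def pearson(a, b):
    """Correlation between a supply series and a demand series."""
    mu_a, mu_b = statistics.mean(a), statistics.mean(b)
    num = sum((x - mu_a) * (y - mu_b) for x, y in zip(a, b))
    den = (sum((x - mu_a) ** 2 for x in a) *
           sum((y - mu_b) ** 2 for y in b)) ** 0.5
    return num / den

ramp = [1, 2, 3, 4, 5, 6, 7, 8]             # persistent, wind-like
alternating = [1, -1, 1, -1, 1, -1, 1, -1]  # cyclic, tidal-like
print(autocorrelation(ramp, 1), autocorrelation(alternating, 1))
```

The persistent series scores high at lag 1 and the cyclic one negative, mirroring the contrast drawn above between wind and tidal behaviour.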

Relevance:

30.00%

Publisher:

Abstract:

Sub-lethal carbon monoxide (CO) exposure is frequently associated with myocardial arrhythmias and our recent studies have demonstrated that these may be attributable to modulation of cardiac Na+ channels, causing an increase in the late current and an inhibition of the peak current. Using a recombinant expression system, we demonstrate that CO inhibits peak human Nav1.5 current amplitude without activation of the late Na+ current observed in native tissue. Inhibition was associated with a hyperpolarizing shift in the steady-state inactivation properties of the channels and was unaffected by modification of channel gating induced by anemone toxin (rATX-II). Systematic pharmacological assessment indicated that no recognised CO-sensitive intracellular signalling pathways appeared to mediate CO inhibition of Nav1.5. Inhibition was, however, markedly suppressed by inhibition of nitric oxide (NO) formation, but NO donors did not mimic or occlude channel inhibition by CO, indicating that NO alone did not account for the actions of CO. Exposure of cells to dithiothreitol immediately before CO exposure also dramatically reduced the magnitude of current inhibition. Similarly, L-cysteine and N-ethylmaleimide significantly attenuated the inhibition caused by CO. In the presence of DTT and the NO inhibitor L-NAME, the ability of CO to inhibit Nav1.5 was almost fully prevented. Our data indicate that inhibition of peak Na+ current (which can lead to Brugada-syndrome like arrhythmias) occurs via a mechanism distinct from induction of the late current, requires NO formation and is dependent on channel redox state.

Relevance:

30.00%

Publisher:

Abstract:

Social domains are classes of interpersonal processes, each with distinct procedural rules underpinning mutual understanding, emotion regulation and action. We describe the features of three domains of family life – safety, attachment and discipline/expectation – and contrast them with exploratory processes in terms of the emotions expressed, the role of certainty versus uncertainty, and the degree of hierarchy in an interaction. We argue that everything that people say and do in family life carries information about the type of interaction they are engaged in – that is, the domain. However, sometimes what they say or how they behave does not make the domain clear, or participants in the social interactions are not in the same domain (there is a domain mismatch). This may result in misunderstandings, irresolvable arguments or distress. We describe how it is possible to identify domains and judge whether they are clear or unclear, and matched or mismatched, in observed family interactions and in accounts of family processes. This then provides a focus for treatment and helps to define criteria for evaluating outcomes.

Relevance:

30.00%

Publisher:

Abstract:

A study on benthic ecosystem health was performed to assess the environmental quality of the Montevideo coastal zone, in view of the construction of a new sanitation system. Data were compared to previous research undertaken 10 years earlier, and the biochemical composition of organic matter, heavy metals, phytopigments, benthic diatoms, macrofauna community structure and a biotic index (AMBI) were used as proxies. Results indicate an environmental quality gradient, with the worst conditions at the inner stations of Montevideo Bay and an improvement towards the adjacent coastal zone. Higher levels of chromium, lead, phaeopigments and organic biopolymers, together with poor benthic macrofauna and diatom communities, characterised the hypertrophic innermost portion of Montevideo Bay. Data indicated a clear deterioration of the adjacent coastal zone compared with that observed 10 years earlier. The complementary use of approaches not applied before (benthic diatoms and organic biopolymers) alongside those formerly applied improves our assessment of the trophic status and environmental health of the area.
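The AMBI biotic index mentioned above weights the proportions of fauna assigned to five ecological groups; a sketch using the standard coefficients (the example community fractions are invented):

```python
def ambi(g1, g2, g3, g4, g5):
    """AZTI Marine Biotic Index from fractions (summing to 1) of fauna
    in ecological groups I (sensitive) to V (first-order opportunists)."""
    assert abs((g1 + g2 + g3 + g4 + g5) - 1.0) < 1e-9
    return 0.0 * g1 + 1.5 * g2 + 3.0 * g3 + 4.5 * g4 + 6.0 * g5

# Opportunist-dominated (degraded) vs sensitive-dominated (healthy):
inner_bay = ambi(0.0, 0.05, 0.15, 0.30, 0.50)
outer_coast = ambi(0.80, 0.15, 0.05, 0.0, 0.0)
print(inner_bay, outer_coast)
```

Higher scores correspond to degraded, opportunist-dominated communities like the hypertrophic inner stations; lower scores to healthier communities towards the adjacent coastal zone.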

Relevance:

30.00%

Publisher:

Abstract:

The onset of the financial crisis in 2008 and the European sovereign debt crisis in 2010 renewed the interest of macroeconomists in the role played by credit in business cycle fluctuations. The purpose of the present work is to present empirical evidence on the monetary policy transmission mechanism in Brazil, with a special eye on the role played by the credit channel, using different econometric techniques. It comprises three articles. The first presents a review of the literature on financial frictions, with a focus on the overlaps between credit activity and monetary policy. It highlights how the sharp disruptions in the financial markets spurred central banks in developed and emerging nations to deploy a broad set of non-conventional tools to overcome the damage to financial intermediation. A chapter is dedicated to the challenges faced by policymakers in emerging markets, and Brazil in particular, in the highly integrated global capital market. The second article investigates the implications of the credit channel of the monetary policy transmission mechanism in the case of Brazil, using a structural FAVAR (SFAVAR) approach. The term "structural" comes from the estimation strategy, which generates factors that have a clear economic interpretation. The results show that unexpected shocks in the proxies for the external finance premium and the credit volume produce large and persistent fluctuations in inflation and economic activity, accounting for more than 30% of the forecast error variance of the latter over a three-year horizon. Counterfactual simulations demonstrate that the credit channel amplified the economic contraction in Brazil during the acute phase of the global financial crisis in the last quarter of 2008, and thus gave an important impulse to the recovery period that followed.
In the third article, I make use of Bayesian estimation of a classical New Keynesian DSGE model, incorporating the financial accelerator channel developed by Bernanke, Gertler and Gilchrist (1999). The results present evidence in line with that already seen in the previous article: disturbances to the external finance premium (represented here by credit spreads) trigger significant responses in aggregate demand and inflation, and monetary policy shocks are amplified by the financial accelerator mechanism.
Keywords: Macroeconomics, Monetary Policy, Credit Channel, Financial Accelerator, FAVAR, DSGE, Bayesian Econometrics
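The financial accelerator channel invoked above ties the external finance premium to borrower balance sheets: the premium rises with leverage, so shocks that erode net worth raise borrowing costs and amplify downturns. A schematic sketch (the power functional form follows the spirit of Bernanke, Gertler and Gilchrist; the elasticity value is an invented calibration, not the paper's estimate):

```python
def external_finance_premium(net_worth, assets, elasticity=0.05):
    """Premium over the risk-free rate as an increasing function of
    leverage (assets / net worth); zero premium for fully
    self-financed borrowers."""
    leverage = assets / net_worth
    return leverage ** elasticity - 1.0

# A shock that halves net worth while assets are unchanged raises the
# premium, tightening credit exactly when activity is already falling.
before = external_finance_premium(net_worth=0.5, assets=1.0)
after = external_finance_premium(net_worth=0.25, assets=1.0)
print(before < after)
```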

Relevance:

30.00%

Publisher:

Abstract:

Dredging of sediments from the Santos Channel is necessary to allow the transit of ships operating in the Port of Santos. The disposal areas for the dredged material are located in the coastal zone, in front of Santos Bay. This study aimed to assess the quality of the sediments from the Santos Channel and from the current and former disposal areas, using whole-sediment toxicity tests with amphipods and elutriate toxicity tests with sea urchin embryos. The samples from the Santos Channel were found to be the most toxic: all samples from that area were significantly toxic. In addition, some samples from the disposal areas also exhibited toxicity. The results therefore showed that the sediments present evidence of degraded quality, but further studies should be conducted to determine the relationships between contamination and toxicity. The results also suggest that the disposal of dredged sediments should be re-evaluated.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)