336 results for hurricane


Relevance:

10.00%

Publisher:

Abstract:

This report results from long-term fish monitoring studies supported by the National Park Service (NPS) at Virgin Islands National Park since 1988, an effort that is now a joint NPS and NOAA collaboration. Reef fish monitoring data collected from 1988 to 2006 within Virgin Islands National Park (VINP) and on adjacent reefs around St. John, U.S. Virgin Islands (USVI) were analyzed to provide information on the status of reef fishes during the monitoring period. Monitoring projects were initiated by the NPS in the 1980s to provide useful data for the evaluation of resources and for the development of a long-term monitoring program. Monthly monitoring was conducted at two reef sites (Yawzi Point and Cocoloba Cay) for 2.5 years, starting in November 1988, to document the monthly and seasonal variability in reef fish assemblages. Hurricane Hugo, a powerful Category 4 storm, struck the USVI in September 1989, resulting in considerable damage to the reefs around St. John. Abundance of fishes was lower at both sites following the storm; however, a greater effect was observed at Yawzi Point, which experienced a more direct impact from the hurricane. The storm affected species differently, with some showing only small, short-term declines in abundance, and others, such as the numerically abundant blue chromis (Chromis cyanea), a planktivorous damselfish, exhibiting larger declines and a longer recovery period. This report provides: 1) an evaluation of the sampling methods and sample sizes used during the sampling period, 2) an evaluation of the spatial and temporal variability in reef fish assemblages at selected reef sites inside and outside of VINP, and 3) an evaluation of trends over 17 years of monitoring at the four reference sites. Comparisons of methods were conducted to standardize assessments among years. Several methods were used to evaluate sample size requirements for reef fish monitoring, and the results provided a statistically robust justification for sample allocation.
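The abstract does not say which sample-size evaluation methods were used; purely as an illustrative sketch, one common approach is to bootstrap the precision of mean abundance from existing transect counts for a range of candidate sample sizes. All counts below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def precision_vs_sample_size(counts, sizes, n_boot=2000):
    """For each candidate sample size, bootstrap the relative standard
    error of mean fish abundance from existing transect counts."""
    counts = np.asarray(counts, dtype=float)
    out = {}
    for n in sizes:
        means = [rng.choice(counts, size=n, replace=True).mean()
                 for _ in range(n_boot)]
        out[n] = np.std(means) / counts.mean()   # relative SE of the mean
    return out

# Hypothetical counts of blue chromis on 30 existing transects
counts = rng.poisson(lam=25, size=30)
for n, rse in precision_vs_sample_size(counts, sizes=[5, 10, 20, 40]).items():
    print(f"n = {n:2d} transects -> relative SE of mean abundance ~ {rse:.2f}")
```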

Relevance:

10.00%

Publisher:

Abstract:

Plate anchors are increasingly being used to moor large floating offshore structures in deep and ultra-deep water. These facilities impart substantial vertical uplift loading to plate anchors. However, extreme operating conditions such as hurricane loading often result in partial system failures, with significant changes in the orientation of the remaining intact mooring lines. The purpose of this study is to investigate the undrained pure translational (parallel to the plate) and torsional bearing capacity of anchor plates idealized as square and rectangular plates. Moreover, the interaction response of plate anchors under combined translational and torsional loading is studied using a modified plastic limit analysis (PLA) approach. The previous PLA formulation, which did not account for shear-normal force interaction on the vertical end faces of the plate, provides an exact solution to the idealized problem of an infinitely thin plate but only an approximate solution to the problem of a plate of finite thickness. This is also confirmed by the three-dimensional finite element (FE) results, since the PLA values increasingly exceed the FE results as the thickness of the plate increases. By incorporating the shear-normal interaction relationship, the modified solution improves both the torsional bearing capacity factors and the plate interaction responses, which show satisfactory agreement with the FE results. The interaction relationship is then obtained for square and rectangular plates of different aspect ratios and thicknesses. The new interaction relationships could also be used as an associated plastic failure locus for combined shear and torsional loading to predict plastic displacements and rotations in the translational and torsional loading modes. Copyright © 2011 by the International Society of Offshore and Polar Engineers (ISOPE).
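The study's own interaction relationships are not reproduced in this abstract. Purely as a generic illustration of how a combined-loading failure locus is used, the sketch below checks a translational-torsional load state against a hypothetical normalized power-law locus; the exponents and capacity values are placeholders, not the fitted results of the paper.

```python
def interaction_utilization(H, T, H_ult, T_ult, p=2.0, q=2.0):
    """Utilization of a combined load state (translation H, torsion T) against a
    hypothetical power-law locus (H/H_ult)**p + (T/T_ult)**q = 1.
    A value >= 1.0 means the failure locus is reached."""
    return (H / H_ult) ** p + (T / T_ult) ** q

# Placeholder pure-loading capacities for an illustrative square plate
H_ult, T_ult = 1200.0, 450.0          # kN and kN*m (assumed values)
u = interaction_utilization(H=800.0, T=200.0, H_ult=H_ult, T_ult=T_ult)
print(f"utilization = {u:.2f} ({'locus reached' if u >= 1.0 else 'within locus'})")
```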

Relevance:

10.00%

Publisher:

Abstract:

The vertical growth of shoots of the seagrass Thalassia testudinum Banks ex Konig was examined in four meadows spanning a range of wave exposure in the Mexican Caribbean, to elucidate its magnitude and its relationship to sediment dynamics. Average internodal length varied between 0.17 and 12.75 mm and was greatest in the meadow that experienced the greatest burial by sand waves moved by Hurricane Gilbert (September 1988). Internodal length showed annual cycles, confirmed by flower scars that always preceded or coincided with the annual minimum internodal length. These annual cycles on the shoot allowed estimation of annual leaf production, which averaged between 14.2 and 19.3 leaves per shoot per year. High vertical shoot growth was associated with long internodes and a high leaf production rate, which increased with increasing vertical shoot growth to a maximum of approximately 25 leaves per shoot per year at vertical growth of about 30 mm per year or more. Average internodal length showed substantial interannual differences arising from perturbations associated with the passage of Hurricane Gilbert. After 1988, plants that survived moderate burial responded with enhanced vertical growth and increased leaf production, whereas plants that experienced erosion showed reduced vertical growth. The variability in shoot vertical growth of T. testudinum can therefore be separated into seasonal changes in plant growth and long-term variability associated with episodic perturbations involving sediment redistribution by hurricanes.
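The abstract describes reconstructing annual leaf production from the annual cycle of internodal lengths along each shoot. A minimal sketch of that idea, assuming the internodal lengths are already listed from the oldest to the youngest internode (the actual measurement protocol is not given here), is to count the internodes, one per leaf, between successive annual minima:

```python
import numpy as np
from scipy.signal import find_peaks

def leaves_per_year(internode_lengths_mm, min_spacing=10):
    """Estimate annual leaf production by counting internodes (one per leaf)
    between successive annual minima of internodal length along a shoot."""
    x = np.asarray(internode_lengths_mm, dtype=float)
    minima, _ = find_peaks(-x, distance=min_spacing)   # indices of annual minima
    return np.diff(minima)                             # leaves produced per annual cycle

# Hypothetical shoot: ~17 leaves per year superimposed on a seasonal cycle
t = np.arange(60)
lengths = 6 + 5 * np.sin(2 * np.pi * t / 17) + np.random.default_rng(1).normal(0, 0.3, 60)
print(leaves_per_year(lengths))     # roughly [17 17]
```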

Relevance:

10.00%

Publisher:

Abstract:

The response of the South China Sea (SCS) to Typhoon Imbudo was examined using the Princeton Ocean Model (POM). The results indicated that SST decreased by 2-6 degrees C, with a rightward-biased response, as Typhoon Imbudo passed across the SCS. Owing to strong mixing, the mixed layer (ML) deepened by as much as 10-60 m and the ML heat budget lost 824.78 W/m², a loss dominated by vertical mixing. Through heat transported downward from the upper ML, the temperature below the ML increased and oscillated near the inertial period. Furthermore, strong inertial currents were generated by the storm, with maximum currents of up to 1.4 m/s in the upper ML.
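For reference, the inertial period mentioned above follows from the Coriolis parameter alone, T = 2π/f with f = 2Ω sin(latitude); the sketch below evaluates it at an assumed latitude of 18°N in the SCS (the study's exact analysis latitude is not stated here).

```python
import math

def inertial_period_hours(lat_deg, omega=7.2921e-5):
    """Inertial period T = 2*pi / f, with Coriolis parameter f = 2*Omega*sin(lat)."""
    f = 2.0 * omega * math.sin(math.radians(lat_deg))
    return 2.0 * math.pi / f / 3600.0

# Assumed latitude of ~18 deg N in the northern South China Sea
print(f"inertial period ~ {inertial_period_hours(18.0):.1f} h")
```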

Relevance:

10.00%

Publisher:

Abstract:

Recent investigations show that, at certain incidence angles, normalized radar cross sections measured by C-band microwave sensors decrease with wind speed under high-wind conditions instead of increasing, as they do at low to moderate wind speeds. This creates ambiguities in high wind speed retrievals from synthetic aperture radar (SAR). In the present work, four geophysical model functions (GMFs) are studied, namely the high-wind C-band model 4 (CMOD4HW), C-band model 5 (CMOD5), the high-wind vertically polarized GMF (HWGMF_VV), and the high-wind horizontally polarized GMF (HWGMF_HH). Our focus is on model behaviours relative to wind speed ambiguities. We show that, except for CMOD4HW, the GMFs exhibit the wind speed ambiguity problem. To address this problem in high wind speed retrievals from SAR, we focus on hurricanes and propose a method to remove the speed ambiguity using the dominant hurricane wind structure.
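Neither the GMF coefficients nor the authors' ambiguity-removal procedure are reproduced in this abstract. Purely as a toy illustration of the ambiguity itself, the sketch below inverts a hypothetical non-monotonic sigma0(wind) curve and uses a prior speed estimate, such as one from an assumed parametric hurricane wind structure, to choose between the two candidate solutions.

```python
import numpy as np

def toy_gmf(u):
    """Hypothetical NRCS (linear units) that peaks near ~35 m/s and then
    decreases, mimicking the high-wind dampening that causes ambiguity."""
    return 0.08 * u * np.exp(-u / 35.0)

def retrieve_speed(sigma0, prior_speed, u_grid=np.linspace(1, 80, 2000)):
    """Find all wind speeds consistent with sigma0, then keep the candidate
    closest to a prior estimate (e.g., from a hurricane wind-structure model)."""
    resid = np.abs(toy_gmf(u_grid) - sigma0)
    # local minima of the residual are the candidate solutions
    cand = u_grid[(resid < np.roll(resid, 1)) & (resid < np.roll(resid, -1))]
    return min(cand, key=lambda u: abs(u - prior_speed))

sigma0 = toy_gmf(55.0)                            # observed NRCS from a 55 m/s wind
print(retrieve_speed(sigma0, prior_speed=20.0))   # picks the low-speed branch
print(retrieve_speed(sigma0, prior_speed=60.0))   # picks the high-speed branch
```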

Relevance:

10.00%

Publisher:

Abstract:

During the summer of 2007 the United Kingdom experienced some of the worst flooding in its history, with the city of Hull amongst the worst affected. Similarly, the city of New Orleans, USA, was subject to severe flooding in August 2005 as a result of Hurricane Katrina. The study found that both the UK and US government disaster management programmes were ill-prepared for these flood events, and many parallel issues have been identified and discussed. The conditions of vulnerability that are evident in developing countries are not widely present in the UK or US, but this must not be allowed to lead to complacency and a lack of preparation and awareness. The cost in terms of mortality is relatively low compared to similar events in developing countries; however, the economic implications are considerable and must be addressed.

Relevance:

10.00%

Publisher:

Abstract:

In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800,000 trials with an average of 1,000 catastrophic events per trial. This is sufficient to capture the risk for a global multi-peril reinsurance portfolio covering a range of perils, including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge and riverine flooding, and wildfire. Such simulations are both computation- and data-intensive, making the application of high-performance computing techniques desirable.

In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss arising from the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand-tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units and Intel Xeon Phi many-core accelerators.
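The paper's lookup structures and accelerator kernels are not described in this abstract; as a much simplified sketch of the simulation it outlines, the code below aggregates annual losses over hypothetical trials (primary uncertainty: which catalogue events occur) with a lognormal draw per event standing in for secondary uncertainty, and reads a probable maximum loss off a high quantile of the resulting distribution. The catalogue, rates and trial count are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(n_trials, event_rate, mean_losses, cv=0.8):
    """Annual portfolio loss per trial. Primary uncertainty: a Poisson number of
    events sampled from the event catalogue. Secondary uncertainty: a lognormal
    draw around each event's mean loss (coefficient of variation `cv`)."""
    sigma = np.sqrt(np.log(1.0 + cv**2))
    mu_adj = -0.5 * sigma**2            # so each draw has mean equal to the mean loss
    losses = np.zeros(n_trials)
    for t in range(n_trials):
        n_events = rng.poisson(event_rate)
        idx = rng.integers(0, len(mean_losses), size=n_events)
        draws = mean_losses[idx] * rng.lognormal(mu_adj, sigma, size=n_events)
        losses[t] = draws.sum()
    return losses

# Hypothetical event catalogue: 10,000 events with mean losses in $M
catalogue = rng.pareto(2.5, size=10_000) * 5.0
annual = simulate_annual_losses(n_trials=10_000, event_rate=1000, mean_losses=catalogue)
print(f"PML (1-in-250-year) ~ ${np.quantile(annual, 1 - 1/250):,.0f}M")
```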

Relevance:

10.00%

Publisher:

Abstract:

In his essay Anti-Object, Kengo Kuma proposes that architecture cannot and should not be understood as an object alone but instead always as a series of networks and connections, relationships within space and through form. Some of these relationships are tangible, others are invisible. Stan Allen and James Corner have also called for an architecture that is more performative and operative – ‘less concerned with what buildings look like and more concerned with what they do’ – as a means of effecting a more intimate and promiscuous relationship between infrastructure, urbanism and buildings. According to Allen, this expanding field offers a reclamation of some of the areas ceded by architecture following disciplinary specialization:

‘Territory, communication and speed are properly infrastructural problems and architecture as a discipline has developed specific technical means to deal with these variables. Mapping, projection, calculation, notation and visualization are among architecture’s traditional tools for operating at the very large scale’.

The motorway may not look like it – partly because we are no longer accustomed to thinking about it as such – but it is a site for and of architecture, a territory where architecture can be critical and active. If the limits of the discipline have narrowed, then one of the functions of a school of architecture must be an attempt to occupy those areas of the built environment where architecture is no longer present, or has yet to reach. If this is a project about the reclamation of a landscape, it is also a challenge to some of the boundaries that surround architecture and often confine it, as Kuma suggests, to the appreciation of isolated objects.

M:NI 2014-15
We tend to think of the motorway as a thing or an object, something that has a singular function. Historically this is how it has been seen, with engineers designing bridges and embankments and suchlike with zeal … These objects, like the M3 Urban Motorway (Belfast’s own Westway), are beautiful of course, but they have caused considerable damage to the city they were inflicted upon.

Actually, it’s the fact that we have seen the motorway as a solid object that has caused this problem. The motorway actually is a fluid and dynamic thing, and it should be seen as such: in fact it’s not an organ at all but actually tissue – something that connects rather than is. Once we start to see the motorway as tissue, it opens up new propositions about what the motorway is, is used for and does. This new dynamic and connective view unlocks the stasis of the motorway as edifice, and allows adaptation to happen: adaptation to old contexts that were ignored by the planners, and adaptation to new contexts that have arisen because of or in spite of our best efforts.

Motorways as tissue are more than just infrastructures: they are landscapes. These landscapes can be seen as surfaces on which flows take place, not only of cars, buses and lorries, but also of the globalized goods carried and the lifestyles and mobilities enabled. Here the infinite speed of urban change and of thought transcends the declared speed limit [70 mph] of the motorway, in that a consignment of bananas can cause soil erosion in Ecuador, or the delivery of a new iPhone can unlock connections and ideas the world over.

So what is this new landscape to be like? It may be a parallax-shifting, cognitive looking glass; a dronescape of energy transformation; a collective farm; or maybe part of a hospital. But what is for sure is that it is never fixed nor static: it pulses like a heartbeat through that most bland of landscapes, the countryside. It transmits forces like a Caribbean hurricane creating surf on an Atlantic storm beach: alien forces that mutate and re-form these places, screaming into new, unclear and unintended futures.

And this future is clear: the future is urban. In this small rural country, motorways as tissue have made the whole of it – countryside, mountain, sea and town – into one singular, homogeneous, hyper-connected, generic city.

Goodbye, place. Hello, surface!

Relevance:

10.00%

Publisher:

Abstract:

Resilience is the capacity to adapt to threats and to mitigate or avoid a risk; it can be found in hazard-resistant buildings or in adaptable social systems (Pelling, 2003). Consequently, the concept can also be understood as the capacity to rebuild a neighbourhood with stronger and more viable components. Almost four years after Hurricane Katrina, New Orleans is regarded as an open-air laboratory in which the level of resilience of its communities can be examined. The current state of reconstruction differs widely from one neighbourhood to another. The Holy Cross historic district is one of the city's oldest neighbourhoods; this vulnerable community is known for its cultural heritage, apparent not only in its unique architecture but also in its social relations. One of the main challenges in rebuilding Holy Cross is finding a way to reconcile preservation of the built heritage and of its old urban fabric with new development plans, in order to create a sustainable community. This study examines the roles of the actors involved in the reconstruction process and their effectiveness in creating a more sustainable, resilient and affordable Holy Cross, so as to encourage the return of its residents. It also presents current efforts to propose sustainable reconstruction projects while preserving the neighbourhood's heritage character.

Relevance:

10.00%

Publisher:

Abstract:

In my doctoral thesis, I study three important factors that characterize international trade: technological differences across countries, entry barriers in the form of fixed costs, and international migration. The first chapter analyzes whether technological differences across countries can explain specialization in international trade. To measure the degree of specialization, I compute concentration indices for the value of imports and exports and decompose total concentration into the extensive product margin (the number of products traded) and the intensive product margin (the volume of products traded). Using detailed product-level trade data for 160 countries, my results show that exports are more concentrated than imports, that specialization occurs mainly at the intensive product margin, and that larger economies have more diversified imports and exports because they trade more products. Given these facts, I assess the ability of the Eaton-Kortum model, the leading model of Ricardian trade theory, to account for the empirical evidence. The results show that specialization through comparative advantage induced by technological differences can explain the facts both qualitatively and quantitatively. I also evaluate the role of the key determinants of specialization: the strength of comparative advantage, the elasticity of substitution, and geography. One implication of these results is that it matters how far production volatility, measured by GDP volatility, is driven by the specialization of exports and imports. Given the trade-off between trade openness and production volatility, the gains from trade may be smaller than previously estimated. Consequently, alternative trade policies, such as gradual trade opening combined with production diversification to reduce export concentration, may prove a better strategy than a laissez-faire approach.

Using the relationship between market size and the entry of firms and products, the second chapter assesses whether entry barriers in the form of fixed export costs arise at the firm level or at the product level. If fixed costs arise at the firm level, a multi-product firm has a production cost advantage over other firms because it can spread the fixed costs across several products; in that case, international trade is characterized by few firms exporting many products. If fixed costs arise at the product level, the entry of a product is associated with the entry of several firms: once the first firm enters and pays the product's fixed costs, it creates a spillover that lowers the fixed costs for rival firms, and international trade is characterized by many firms selling different varieties of the same product. Using detailed data on 40 exporting countries across 180 destination markets, my results show that entry barriers arise mainly at the product level. A larger market favours the expansion of a greater number of firms within a product category rather than allowing multi-product firms to grow across a range of products. Looking at the change in the number of exporters within a product category in given destinations, I find that the firm entry rate increases significantly after a product first enters a market; I infer from this that the first entrant lowers fixed costs for subsequent firms. My research also shows that, despite greater competition in the product market, firms earn higher export revenues and are more likely to remain in international markets. These findings are consistent with the hypothesis that the spillover induces the entry of rival firms and allows firms to produce at a larger scale. This research yields a number of important conclusions. First, trade policies that encourage the entry of new products, for example by promoting products in destination markets, generate spillovers that translate into a higher firm participation rate and export growth. Second, consumers in the importing country can benefit from lower product prices through the reduction of technical barriers to trade. Third, when policy experiments are conducted in the form of trade-cost reductions, it is customary to consider only a fall in marginal costs and to assess the impact on consumer welfare; an important element of trade agreements, however, is the reduction of technical barriers to trade through the negotiation of common product standards. Neglecting the existence of entry barriers and the consequences of industry reallocation understates the impact of trade reforms.

The third chapter considers the role of information in facilitating international trade. Immigrants reduce transaction costs in international trade by providing information about trading opportunities with their home country. Using detailed geographic data on immigration and imports in the United States between 1970 and 2005, I quantify the effect of new immigrants on US demand for imported intermediate goods. To establish causality between trade and migration, I exploit the large inflow of Central American immigrants after Hurricane Mitch. The results show that a ten percent increase in immigrants raised the demand for imported intermediate goods by 1.5 percent. My results are robust to reverse-causality concerns in which the decision to emigrate is driven by trading opportunities.
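The abstract does not specify which concentration index the first chapter uses; purely as an illustrative sketch, the code below computes a Herfindahl-style concentration index over hypothetical product-level export values and reports a simple extensive/intensive split: the number of products actually traded versus how concentrated value is among those traded products.

```python
import numpy as np

def concentration_summary(values):
    """Illustration only: concentration of export value split into an extensive
    margin (how many products are traded) and an intensive margin (a Herfindahl
    index over the value shares of the traded products)."""
    v = np.asarray(values, dtype=float)
    traded = v[v > 0]
    shares = traded / traded.sum()
    return {
        "extensive_margin_products": int(traded.size),
        "intensive_margin_herfindahl": float((shares**2).sum()),
    }

rng = np.random.default_rng(7)
exports = rng.pareto(1.2, size=5000) * (rng.random(5000) < 0.4)   # many zero lines
print(concentration_summary(exports))
```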

Relevance:

10.00%

Publisher:

Abstract:

When hurricanes come into contact with the built and natural environment, public authorities sometimes have no choice but to declare a mandatory evacuation of the population located in at-risk areas. Because of the unpredictability of how a disaster unfolds and of human behaviour, evacuation operations face significant uncertainty. Past experience has shown that information and communication technologies (ICT) have the potential to improve the state of the art in evacuation management. Despite this recognition, empirical research on the subject remains limited to date. This case study of New York City explores how integrating ICT into the operational planning of organizations with transportation responsibilities can improve their response to events and influence the overall success of the disaster management system. The analysis is based on information gathered through semi-structured interviews with New York City transportation and disaster management organizations as well as with academic experts. The results highlight the potential of ICT for internal decision-making. Although ICT are widely recognized as effective means of exchanging information within and between organizations, these uses face technological, organizational, structural and systemic constraints. This observation made it possible to identify the constraints experienced in everyday practices of urban systems management.

Relevance:

10.00%

Publisher:

Abstract:

What are the effects of natural disasters on electoral results? Some authors claim that catastrophes have a negative effect on the survival of leaders in a democracy because voters have a propensity to punish politicians for not preventing, or for poorly handling, a crisis. In contrast, this paper finds that these events might be beneficial for leaders. Disasters are linked to leader survival through clientelism: they generate an inflow of resources in the form of aid, which increases the money available for buying votes. Analyzing the rainy season of 2010-2011 in Colombia, considered the country's worst disaster in history, I use a difference-in-differences strategy to show that incumbent parties benefited from the disaster in the local elections. The result is robust to different specifications and alternative explanations. Moreover, places receiving more aid, and those with judicial evidence of vote-buying irregularities, are more likely to reelect the incumbent, supporting the mechanism proposed by this paper.
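The paper's data and exact specification are not shown in this abstract; as a generic illustration of a difference-in-differences estimate of the kind described, the sketch below regresses a simulated incumbent vote share on a hypothetical disaster-affected indicator, a post-disaster indicator and their interaction. All variable names and numbers are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 800   # hypothetical municipality-election observations

df = pd.DataFrame({
    "affected": rng.integers(0, 2, n),   # hit by the 2010-11 rainy season?
    "post": rng.integers(0, 2, n),       # election held after the disaster?
})
# Simulated incumbent vote share with a +5 pp effect for affected x post
df["incumbent_share"] = (40 + 3 * df["affected"] + 2 * df["post"]
                         + 5 * df["affected"] * df["post"]
                         + rng.normal(0, 4, n))

# Difference-in-differences: the coefficient on affected:post is the estimate
model = smf.ols("incumbent_share ~ affected * post", data=df).fit()
print(model.params["affected:post"])
```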

Relevance:

10.00%

Publisher:

Abstract:

Records of Atlantic basin tropical cyclones (TCs) since the late nineteenth century indicate a very large upward trend in storm frequency. This increase in documented TCs has been previously interpreted as resulting from anthropogenic climate change. However, improvements in observing and recording practices provide an alternative interpretation for these changes: recent studies suggest that the number of potentially missed TCs is sufficient to explain a large part of the recorded increase in TC counts. This study explores the influence of another factor—TC duration—on observed changes in TC frequency, using a widely used Atlantic hurricane database (HURDAT). It is found that the occurrence of short-lived storms (duration of 2 days or less) in the database has increased dramatically, from less than one per year in the late nineteenth–early twentieth century to about five per year since about 2000, while medium- to long-lived storms have increased little, if at all. Thus, the previously documented increase in total TC frequency since the late nineteenth century in the database is primarily due to an increase in very short-lived TCs. The authors also undertake a sampling study based upon the distribution of ship observations, which provides quantitative estimates of the frequency of missed TCs, focusing just on the moderate to long-lived systems with durations exceeding 2 days in the raw HURDAT. Upon adding the estimated numbers of missed TCs, the time series of moderate to long-lived Atlantic TCs show substantial multidecadal variability, but neither time series exhibits a significant trend since the late nineteenth century, with a nominal decrease in the adjusted time series. Thus, to understand the source of the century-scale increase in Atlantic TC counts in HURDAT, one must explain the relatively monotonic increase in very short-duration storms since the late nineteenth century. While it is possible that the recorded increase in short-duration TCs represents a real climate signal, the authors consider that it is more plausible that the increase arises primarily from improvements in the quantity and quality of observations, along with enhanced interpretation techniques. These have allowed National Hurricane Center forecasters to better monitor and detect initial TC formation, and thus incorporate increasing numbers of very short-lived systems into the TC database.
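HURDAT's record format is not reproduced here; assuming the track records have already been parsed into per-storm lists of observation times, the duration-based classification described above reduces to a sketch like the following.

```python
from datetime import datetime
from collections import Counter

def count_short_lived_per_year(storm_tracks, threshold_days=2.0):
    """storm_tracks: dict mapping storm id -> sorted list of observation datetimes.
    Counts, per season, the storms whose track duration is <= threshold_days."""
    per_year = Counter()
    for obs_times in storm_tracks.values():
        duration_days = (obs_times[-1] - obs_times[0]).total_seconds() / 86400.0
        if duration_days <= threshold_days:
            per_year[obs_times[0].year] += 1
    return per_year

# Hypothetical two-storm example
tracks = {
    "AL011901": [datetime(1901, 7, 1, 0), datetime(1901, 7, 2, 6)],    # ~1.25 days
    "AL021901": [datetime(1901, 8, 10, 0), datetime(1901, 8, 20, 0)],  # 10 days
}
print(count_short_lived_per_year(tracks))   # Counter({1901: 1})
```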

Relevance:

10.00%

Publisher:

Abstract:

Systematic climate shifts have been linked to multidecadal variability in observed sea surface temperatures in the North Atlantic Ocean [1]. These links are extensive, influencing a range of climate processes such as hurricane activity [2] and African Sahel [3-5] and Amazonian [5] droughts. The variability is distinct from historical global-mean temperature changes and is commonly attributed to natural ocean oscillations [6-10]. A number of studies have provided evidence that aerosols can influence long-term changes in sea surface temperatures [11, 12], but climate models have so far failed to reproduce these interactions [6, 9] and the role of aerosols in decadal variability remains unclear. Here we use a state-of-the-art Earth system climate model to show that aerosol emissions and periods of volcanic activity explain 76 per cent of the simulated multidecadal variance in detrended 1860–2005 North Atlantic sea surface temperatures. After 1950, simulated variability is within observational estimates; our estimates for 1910–1940 capture twice the warming of previous-generation models but do not explain the entire observed trend. Other processes, such as ocean circulation, may also have contributed to variability in the early twentieth century. Mechanistically, we find that the inclusion of aerosol–cloud microphysical effects, which were included in few previous multimodel ensembles, dominates the magnitude (80 per cent) and the spatial pattern of the total surface aerosol forcing in the North Atlantic. Our findings suggest that anthropogenic aerosol emissions influenced a range of societally important historical climate events such as peaks in hurricane activity and Sahel drought. Decadal-scale model predictions of regional Atlantic climate will probably be improved by incorporating aerosol–cloud microphysical interactions and estimates of future concentrations of aerosols, emissions of which are directly addressable by policy actions.
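The model, forcing data and filtering behind the 76 per cent figure are not given here; purely as a generic illustration of a "fraction of multidecadal variance explained" statistic, the sketch below detrends two hypothetical annual series, smooths them to isolate slow variability, and reports the R² of a regression of one on the other.

```python
import numpy as np

def multidecadal_variance_explained(obs, simulated, window=11):
    """Fraction of multidecadal variance in `obs` explained by `simulated`:
    detrend both, smooth with a running mean to isolate slow variability,
    then compute R^2 of a linear regression of obs on simulated."""
    def detrend_smooth(x):
        t = np.arange(len(x))
        x = x - np.polyval(np.polyfit(t, x, 1), t)             # remove linear trend
        return np.convolve(x, np.ones(window) / window, mode="valid")
    o, s = detrend_smooth(np.asarray(obs)), detrend_smooth(np.asarray(simulated))
    slope, intercept = np.polyfit(s, o, 1)
    resid = o - (slope * s + intercept)
    return 1.0 - resid.var() / o.var()

# Hypothetical 1860-2005 annual series sharing a slow (multidecadal) component
rng = np.random.default_rng(5)
t = np.arange(146)
slow = 0.3 * np.sin(2 * np.pi * t / 65)
obs = slow + 0.01 * t + rng.normal(0, 0.15, t.size)
sim = slow + rng.normal(0, 0.05, t.size)
print(f"variance explained ~ {multidecadal_variance_explained(obs, sim):.2f}")
```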