38 results for diagnose


Relevance:

10.00%

Publisher:

Abstract:

Multiple linear regression is used to diagnose the signal of the 11-yr solar cycle in zonal-mean zonal wind and temperature in the 40-yr ECMWF Re-Analysis (ERA-40) dataset. The results of previous studies are extended to 2008 using data from ECMWF operational analyses. This analysis confirms that the solar signal found in previous studies is distinct from that of volcanic aerosol forcing resulting from the eruptions of El Chichón and Mount Pinatubo, but it highlights the potential for confusion of the solar signal and lower-stratospheric temperature trends. A correction to an error that is present in previous results of Crooks and Gray, stemming from the use of a single daily analysis field rather than monthly averaged data, is also presented.
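The regression approach can be illustrated with a minimal sketch (the synthetic data, regressor names and amplitudes below are hypothetical, not the ERA-40 analysis): the solar signal is estimated as the least-squares coefficient on a solar-cycle index included alongside other regressors, here a constant and a linear trend.

```python
import numpy as np

# Hypothetical illustration: regress a temperature series onto a solar-cycle
# index plus a linear trend and a constant, via ordinary least squares.
rng = np.random.default_rng(0)
n_months = 360
solar_index = np.sin(2 * np.pi * np.arange(n_months) / 132)  # ~11-yr cycle
trend = np.arange(n_months) / n_months

# Synthetic "observations": solar amplitude 0.5 K, trend -0.3 K, plus noise.
temperature = 0.5 * solar_index - 0.3 * trend + 0.1 * rng.standard_normal(n_months)

# Design matrix: [constant, trend, solar index]
X = np.column_stack([np.ones(n_months), trend, solar_index])
coeffs, *_ = np.linalg.lstsq(X, temperature, rcond=None)
print(coeffs)  # coeffs[2] estimates the solar-cycle amplitude
```

Including the trend regressor in the same fit is what allows the solar signal to be separated from lower-stratospheric temperature trends, the confusion the abstract warns about.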

Relevance:

10.00%

Publisher:

Abstract:

The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. 
Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.

Relevance:

10.00%

Publisher:

Abstract:

Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
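A minimal sketch of how a discrete ensemble can be scored under the logarithmic rule (the Gaussian kernel dressing used to turn members into a density is an assumption for illustration, not the paper's method):

```python
import math

def log_score(ensemble, observation, sigma):
    """Negative log of a kernel-dressed ensemble density at the observation.

    Each member is dressed with a Gaussian kernel of width sigma; the forecast
    density is the equal-weight mixture. Lower scores are better (this is the
    "ignorance" form of the logarithmic scoring rule).
    """
    density = sum(
        math.exp(-0.5 * ((observation - m) / sigma) ** 2)
        / (sigma * math.sqrt(2 * math.pi))
        for m in ensemble
    ) / len(ensemble)
    return -math.log(density)

# A sharp ensemble centred on the outcome beats a displaced one.
obs = 1.0
print(log_score([0.9, 1.0, 1.1], obs, sigma=0.2))  # small score
print(log_score([2.0, 2.1, 2.2], obs, sigma=0.2))  # large score: obs in the tail
```

Choosing the initial ensemble to minimise the expected value of such a score, rather than to match the unknown true state, is the optimisation the abstract argues for.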

Relevance:

10.00%

Publisher:

Abstract:

Liquid clouds play a profound role in the global radiation budget but it is difficult to retrieve their vertical profile remotely. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly scattered signal. These returns potentially contain information on the vertical profile of the extinction coefficient, but are challenging to interpret due to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behavior of Gauss-Newton and quasi-Newton optimization schemes is compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables returns from spaceborne lidar and radar subject to multiple scattering to be exploited more rigorously than previously possible.
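The Gauss-Newton scheme mentioned above can be sketched on a toy nonlinear least-squares problem (fitting an exponential decay); this stands in for the variational retrieval's inner optimization only, and the lidar forward model itself is not shown:

```python
import numpy as np

def residual(params, t, y):
    """Residual r = data - model for the toy model a*exp(-b*t)."""
    a, b = params
    return y - a * np.exp(-b * t)

def jacobian(params, t):
    """Jacobian dr/dparams (note the minus signs from y - model)."""
    a, b = params
    return np.column_stack([-np.exp(-b * t), a * t * np.exp(-b * t)])

t = np.linspace(0.0, 4.0, 50)
y = 3.0 * np.exp(-0.7 * t)        # noise-free synthetic data
x = np.array([2.0, 0.5])          # first guess

# Gauss-Newton: solve (J^T J) step = J^T r, then x <- x - step.
for _ in range(20):
    r = residual(x, t, y)
    J = jacobian(x, t)
    x = x - np.linalg.solve(J.T @ J, J.T @ r)
print(x)  # should approach the true parameters [3.0, 0.7]
```

Near a zero-residual solution Gauss-Newton converges quadratically, which is why its convergence behavior is worth comparing against quasi-Newton alternatives.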

Relevance:

10.00%

Publisher:

Abstract:

We diagnose forcing and climate feedbacks in benchmark sensitivity experiments with the new Met Office Hadley Centre Earth system climate model HadGEM2-ES. To identify the impact of newly included biogeophysical and chemical processes, results are compared to a parallel set of experiments performed with these processes switched off, and with different couplings to the biogeochemistry. In abrupt carbon dioxide quadrupling experiments we find that the inclusion of these processes does not alter the global climate sensitivity of the model. However, when the change in carbon dioxide is uncoupled from the vegetation, or when the model is forced with a non-carbon dioxide forcing – an increase in solar constant – new feedbacks emerge that make the climate system less sensitive to external perturbations. We identify a strong negative dust-vegetation feedback on climate change that is small in standard carbon dioxide sensitivity experiments due to the physiological/fertilization effects of carbon dioxide on plants in this model.

Relevance:

10.00%

Publisher:

Abstract:

The drag produced by 2D orographic gravity waves trapped at a temperature inversion and waves propagating in the stably stratified layer existing above are explicitly calculated using linear theory, for a two-layer atmosphere with neutral static stability near the surface, mimicking a well-mixed boundary layer. For realistic values of the flow parameters, trapped lee wave drag, which is given by a closed analytical expression, is comparable to propagating wave drag, especially in moderately to strongly non-hydrostatic conditions. In resonant flow, both drag components substantially exceed the single-layer hydrostatic drag estimate used in most parametrization schemes. Both drag components are optimally amplified for a relatively low-level inversion and Froude numbers Fr ≈ 1. While propagating wave drag is maximized for approximately hydrostatic flow, trapped lee wave drag is maximized for l_2 a = O(1) (where l_2 is the Scorer parameter in the stable layer and a is the mountain width). This roughly happens when the horizontal scale of trapped lee waves matches that of the mountain slope. The drag behavior as a function of Fr for l_2 H = 0.5 (where H is the inversion height) and different values of l_2 a shows good agreement with numerical simulations. Regions of parameter space with high trapped lee wave drag correlate reasonably well with those where lee wave rotors were found to occur in previous nonlinear numerical simulations including frictional effects. This suggests that trapped lee wave drag, besides giving a relevant contribution to low-level drag exerted on the atmosphere, may also be useful to diagnose lee rotor formation.

Relevance:

10.00%

Publisher:

Abstract:

Considerable debate surrounds the source of the apparently ‘anomalous’ [1] increase of atmospheric methane concentrations since the mid-Holocene (5,000 years ago) compared to previous interglacial periods as recorded in polar ice core records [2]. Proposed mechanisms for the rise in methane concentrations relate either to methane emissions from anthropogenic early rice cultivation [1,3] or an increase in natural wetland emissions from tropical [4] or boreal sources [5,6]. Here we show that our climate and wetland simulations of the global methane cycle over the last glacial cycle (the past 130,000 years) recreate the ice core record and capture the late Holocene increase in methane concentrations. Our analyses indicate that the late Holocene increase results from natural changes in the Earth's orbital configuration, with enhanced emissions in the Southern Hemisphere tropics linked to precession-induced modification of seasonal precipitation. Critically, our simulations capture the declining trend in methane concentrations at the end of the last interglacial period (115,000–130,000 years ago) that was used to diagnose the Holocene methane rise as unique. The difference between the two time periods results from differences in the size and rate of regional insolation changes and the lack of glacial inception in the Holocene. Our findings also suggest that no early agricultural sources are required to account for the increase in methane concentrations in the 5,000 years before the industrial era.

Relevance:

10.00%

Publisher:

Abstract:

The RAPID-MOCHA array has observed the Atlantic Meridional Overturning Circulation (AMOC) at 26.5°N since 2004. During 2009/10, there was a transient 30% weakening of the AMOC driven by anomalies in geostrophic and Ekman transports. Here, we use simulations based on the Met Office Forecast Ocean Assimilation Model (FOAM) to diagnose the relative importance of atmospheric forcings and internal ocean dynamics in driving the anomalous geostrophic circulation of 2009/10. Data-assimilating experiments with FOAM accurately reproduce the mean strength and depth of the AMOC at 26.5°N. In addition, agreement between simulated and observed stream functions in the deep ocean is improved when we calculate the AMOC using a method that approximates the RAPID observations. The main features of the geostrophic circulation anomaly are captured by an ensemble of simulations without data assimilation. These model results suggest that the atmosphere played a dominant role in driving recent interannual variability of the AMOC.

Relevance:

10.00%

Publisher:

Abstract:

In a series of papers, Killworth and Blundell have proposed to study the effects of a background mean flow and topography on Rossby wave propagation by means of a generalized eigenvalue problem formulated in terms of the vertical velocity, obtained from a linearization of the primitive equations of motion. However, it has been known for a number of years that this eigenvalue problem contains an error, which Killworth was prevented from correcting himself by his unfortunate passing and whose correction is therefore taken up in this note. Here, the author shows in the context of quasigeostrophic (QG) theory that the error can ultimately be traced to the fact that the eigenvalue problem for the vertical velocity is fundamentally a nonlinear one (the eigenvalue appears both in the numerator and denominator), unlike that for the pressure. The reason that this nonlinear term is lacking in the Killworth and Blundell theory comes from neglecting the depth dependence of a depth-dependent term. This nonlinear term is shown on idealized examples to alter significantly the Rossby wave dispersion relation in the high-wavenumber regime but is otherwise irrelevant in the long-wave limit, in which case the eigenvalue problems for the vertical velocity and pressure are both linear. In the general dispersive case, however, one should first solve the generalized eigenvalue problem for the vertical structure of the pressure and, if needed, diagnose the vertical structure of the vertical velocity from the latter.
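The distinction matters because a standard generalized eigenvalue problem A x = λ B x (as for the pressure) can be solved directly, whereas a problem nonlinear in λ cannot. A minimal sketch, with arbitrary 2×2 matrices standing in for a discretized vertical-structure operator (assuming B is invertible so the problem reduces to standard form):

```python
import numpy as np

# Toy generalized eigenvalue problem A x = lambda B x; the matrices are
# arbitrary examples, not a discretized Rossby-wave operator.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# With B invertible, reduce to a standard problem: B^{-1} A x = lambda x.
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))

# Verify A x = lambda B x for each eigenpair.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * (B @ v))
print(np.sort(eigvals.real))
```

A problem where λ also appears inside A (the nonlinear case the note identifies for the vertical velocity) would instead require iteration or a reformulation, which is why solving for the pressure structure first is the recommended route.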

Relevance:

10.00%

Publisher:

Abstract:

As a part of the Atmospheric Model Intercomparison Project (AMIP), the behaviour of 15 general circulation models has been analysed in order to diagnose and compare the ability of the different models in simulating Northern Hemisphere midlatitude atmospheric blocking. In accordance with the established AMIP procedure, the 10-year model integrations were performed using prescribed, time-evolving monthly mean observed SSTs spanning the period January 1979–December 1988. Atmospheric observational data (ECMWF analyses) over the same period have also been used to verify the models' results. The models involved in this comparison represent a wide spectrum of model complexity, with different horizontal and vertical resolution, numerical techniques and physical parametrizations, and exhibit large differences in blocking behaviour. Nevertheless, a few common features can be found, such as the general tendency to underestimate both blocking frequency and the average duration of blocks. The problem of the possible relationship between model blocking and model systematic errors has also been assessed, although without resorting to ad hoc numerical experimentation it is impossible to relate with certainty particular model deficiencies in representing blocking to precise parts of the model formulation.
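Objective blocking diagnostics of the era typically flag a longitude as blocked from meridional gradients of 500 hPa geopotential height. The sketch below uses the widely cited Tibaldi-Molteni-style criterion as an illustration; the AMIP comparison may have used a different diagnostic, so treat this as a hedged example only.

```python
def is_blocked(z_north, z_central, z_south,
               lat_north=80.0, lat_central=60.0, lat_south=40.0):
    """Flag one longitude as blocked, from 500 hPa geopotential heights (m).

    GHGS > 0 indicates reversed (easterly) flow south of the ridge;
    GHGN < -10 m/deg indicates strong westerlies to the north.
    """
    ghgs = (z_central - z_south) / (lat_central - lat_south)
    ghgn = (z_north - z_central) / (lat_north - lat_central)
    return ghgs > 0.0 and ghgn < -10.0

# Blocked: heights increase poleward up to 60N, then drop sharply.
print(is_blocked(z_north=5200.0, z_central=5600.0, z_south=5500.0))  # True
# Unblocked: heights decrease monotonically poleward (zonal westerlies).
print(is_blocked(z_north=5300.0, z_central=5500.0, z_south=5700.0))  # False
```

Blocking frequency is then the fraction of days each longitude satisfies such a criterion, which is the quantity the models tend to underestimate.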


Relevance:

10.00%

Publisher:

Abstract:

Coronal mass ejections (CMEs) can be continuously tracked through a large portion of the inner heliosphere by direct imaging in visible and radio wavebands. White light (WL) signatures of solar wind transients, such as CMEs, result from Thomson scattering of sunlight by free electrons and therefore depend on both viewing geometry and electron density. The Faraday rotation (FR) of radio waves from extragalactic pulsars and quasars, which arises due to the presence of such solar wind features, depends on the line-of-sight magnetic field component B∥ and the electron density. To understand coordinated WL and FR observations of CMEs, we perform forward magnetohydrodynamic modeling of an Earth-directed shock and synthesize the signatures that would be remotely sensed at a number of widely distributed vantage points in the inner heliosphere. Removal of the background solar wind contribution reveals the shock-associated enhancements in WL and FR. While the efficiency of Thomson scattering depends on scattering angle, WL radiance I decreases with heliocentric distance r roughly as I ∝ r⁻³. The sheath region downstream of the Earth-directed shock is well viewed from the L4 and L5 Lagrangian points, demonstrating the benefits of these points in terms of space weather forecasting. The spatial position of the main scattering site, r_sheath, and the mass of plasma at that position, M_sheath, can be inferred from the polarization of the shock-associated enhancement in WL radiance. From the FR measurements, the local B∥ at r_sheath can then be estimated. Simultaneous observations in polarized WL and FR can not only be used to detect CMEs, but also to diagnose their plasma and magnetic field properties.
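The FR dependence on electron density and line-of-sight field can be made concrete via the rotation measure, RM = 0.81 ∫ n_e B∥ dl (a standard convention with n_e in cm⁻³, B∥ in microgauss and path length in parsecs). The sketch below evaluates this integral numerically; the uniform-slab profile is an arbitrary illustration, not a CME model.

```python
# 1 au expressed in parsecs, for converting the path-length unit.
AU_IN_PC = 1.0 / 206265.0

def rotation_measure(n_e, b_par, dl_au):
    """Trapezoidal RM (rad/m^2) along a line of sight sampled at equal steps.

    n_e in cm^-3, b_par (line-of-sight field) in microgauss, dl_au in au.
    """
    integrand = [n * b for n, b in zip(n_e, b_par)]
    trapz = sum(0.5 * (integrand[i] + integrand[i + 1])
                for i in range(len(integrand) - 1)) * dl_au
    return 0.81 * trapz * AU_IN_PC

# Uniform slab: n_e = 10 cm^-3, B_par = 50 uG over 1 au of path,
# so the integral is 500 and RM = 0.81 * 500 / 206265 rad/m^2.
n = [10.0] * 11
b = [50.0] * 11
print(rotation_measure(n, b, dl_au=0.1))
```

Because the same n_e also sets the Thomson-scattered WL radiance, combining the two measurements lets B∥ be separated out, which is the diagnostic leverage the abstract describes.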

Relevance:

10.00%

Publisher:

Abstract:

Doctor-patient jokes are universally popular because of the information asymmetries within the diagnostic relationship. We contend that entrepreneurial diagnosis is present in markets where consumers are unable to diagnose their own problems and, instead, may rely on the entrepreneur to diagnose them. Entrepreneurial diagnosis is a cognitive skill possessed by the entrepreneur. It is an identifiable subset of entrepreneurial judgment and can be modeled – which we attempt to do. In order to overcome the information asymmetries and exploit opportunities, we suggest that entrepreneurs must invest in market-making innovations (as distinct from product innovations) such as trustworthy reputations. The diagnostic entrepreneur described in this paper represents a creative response to difficult diagnostic problems and helps to explain the success of many firms whose products are not particularly innovative but which are perceived as offering high standards of service. These firms are trusted not only for their truthfulness about the quality of their product, but for their honesty, confidentiality and understanding in helping customers identify the product most appropriate to their needs.

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses ECG classification after parametrizing the ECG waveforms in the wavelet domain. The aim of the work is to develop an accurate classification algorithm that can be used to diagnose cardiac beat abnormalities detected using a mobile platform such as smart-phones. Continuous-time recurrent neural network classifiers are considered for this task. Records from the European ST-T Database are decomposed in the wavelet domain using discrete wavelet transform (DWT) filter banks and the resulting DWT coefficients are filtered and used as inputs for training the neural network classifier. Advantages of the proposed methodology are the reduced memory requirement for the signals, which is of relevance to mobile applications, as well as an improvement in the generalization ability of the neural network due to the more parsimonious representation of the signal at its inputs.
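The DWT filter-bank idea can be sketched with a minimal single-level Haar transform (real ECG systems would use deeper decompositions and wavelets such as Daubechies; the toy signal below is illustrative only, not ST-T Database data):

```python
import math

def haar_dwt(signal):
    """Single-level Haar DWT: (approximation, detail) for an even-length signal.

    Each adjacent pair (a, b) maps to (a+b)/sqrt(2) and (a-b)/sqrt(2),
    halving the number of coefficients per band while preserving energy.
    """
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

beat = [0.0, 0.1, 0.9, 1.0, 0.2, 0.1, 0.0, 0.0]  # toy "beat", not real ECG
approx, detail = haar_dwt(beat)
print(len(approx), len(detail))  # half the samples in each band
```

Because the transform is orthonormal, the coefficients carry the same energy as the raw samples in half as many values per band, which is the parsimonious representation that shrinks the classifier's input.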

Relevance:

10.00%

Publisher:

Abstract:

The South Asian monsoon is one of the most significant manifestations of the seasonal cycle. It directly impacts nearly one third of the world’s population and also has substantial global influence. Using 27-year integrations of a high-resolution atmospheric general circulation model (Met Office Unified Model), we study changes in South Asian monsoon precipitation and circulation when horizontal resolution is increased from approximately 200 to 40 km at the equator (N96 to N512, 1.9° to 0.35°). The high resolution, integration length and ensemble size of the dataset make this the most extensive dataset used to evaluate the resolution sensitivity of the South Asian monsoon to date. We find a consistent pattern of JJAS precipitation and circulation changes as resolution increases, which include a slight increase in precipitation over peninsular India, changes in Indian and Indochinese orographic rain bands, increasing wind speeds in the Somali Jet, increasing precipitation over the Maritime Continent islands and decreasing precipitation over the northern Maritime Continent seas. To diagnose which resolution-related processes cause these changes we compare them to published sensitivity experiments that change regional orography and coastlines. Our analysis indicates that improved resolution of the East African Highlands results in the improved representation of the Somali Jet and further suggests that improved resolution of orography over Indochina and the Maritime Continent results in more precipitation over the Maritime Continent islands at the expense of reduced precipitation further north. We also evaluate the resolution sensitivity of monsoon depressions and lows, which contribute more precipitation over northeast India at higher resolution. We conclude that while increasing resolution at these scales does not solve the many monsoon biases that exist in GCMs, it has a number of small, beneficial impacts.