14 results for "Frane Appenniniche Back analysis colate stabilizzazione versanti"

in CentAUR: Central Archive University of Reading - UK


Relevance: 30.00%

Publisher:

Abstract:

Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help facilitate maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved in interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over the 10-year period, it is concluded that the HDI and its components are difficult to interpret, as methodologies have changed significantly and the 'averaging' nature of the HDI can hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist centred methods commonly advocated for indicators of sustainable development.
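
As a rough, hedged illustration of the aggregation being critiqued, the following sketch computes a composite index as an equal-weight arithmetic mean of min-max normalized component indicators, in the spirit of the original HDI formulation; the goalposts, component names and country values are purely illustrative and are not taken from any Human Development Report.

```python
# Minimal sketch of an HDI-style composite index: each component indicator is
# min-max normalized against fixed "goalposts" and the results are averaged
# with equal weights (the original HDI used an arithmetic mean; later HDRs
# switched to a geometric mean). All numbers below are illustrative only.

def normalize(value, lower, upper):
    """Scale a raw indicator onto [0, 1] using fixed goalposts."""
    return (value - lower) / (upper - lower)

def composite_index(indicators, goalposts):
    """Equal-weight arithmetic mean of normalized component indicators."""
    scores = [normalize(indicators[k], *goalposts[k]) for k in indicators]
    return sum(scores) / len(scores)

# Hypothetical goalposts and country values (not real HDR data).
goalposts = {
    "life_expectancy": (25.0, 85.0),    # years
    "adult_literacy": (0.0, 100.0),     # per cent
    "log_gdp_per_capita": (2.0, 4.6),   # log10 of PPP US$
}
country = {"life_expectancy": 52.3, "adult_literacy": 64.0, "log_gdp_per_capita": 3.1}

print(f"Composite index: {composite_index(country, goalposts):.3f}")
```

Disaggregation, as the paper recommends, simply means reporting the individual normalized scores alongside the single averaged figure.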

Relevance: 30.00%

Publisher:

Abstract:

A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same whether the solar variation is quantified using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperature is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer-term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be drawn from studies of the response to shorter-period forcing changes.
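
The abstract does not reproduce the fit equations, but the general approach it describes can be sketched as follows: each forcing series is convolved with its own exponential response function (time constant τ) and the amplitudes are then found by least squares. Everything below (the forcing series, the time constants and the noise) is synthetic and only illustrates the structure of such a fit, not the paper's actual data or results.

```python
# Rough sketch of a multivariate fit in which each radiative forcing input is
# convolved with its own exponential response (time constant tau) before the
# amplitudes are found by ordinary least squares. All series are synthetic.
import numpy as np

def lagged_response(forcing, tau, dt=1.0):
    """Convolve a forcing series with a normalized exponential response."""
    t = np.arange(0, 10 * tau, dt)
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()
    return np.convolve(forcing, kernel)[: len(forcing)]

rng = np.random.default_rng(0)
n_years = 50
solar = np.sin(2 * np.pi * np.arange(n_years) / 11.0)    # ~11-year cycle
anthropogenic = np.linspace(0.0, 1.0, n_years)            # slow linear rise
volcanic = np.zeros(n_years)
volcanic[[12, 30]] = -1.0                                  # sporadic eruptions

# Build the design matrix from the lag-convolved forcings (taus are assumed).
X = np.column_stack([
    lagged_response(solar, tau=1.0),
    lagged_response(anthropogenic, tau=10.0),
    lagged_response(volcanic, tau=2.0),
    np.ones(n_years),                                      # offset term
])
truth = np.array([0.05, 0.6, 0.2, 0.0])
temperature = X @ truth + 0.05 * rng.standard_normal(n_years)

amplitudes, *_ = np.linalg.lstsq(X, temperature, rcond=None)
print("Fitted amplitudes (solar, anthropogenic, volcanic, offset):", amplitudes)
```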

Relevance: 30.00%

Publisher:

Abstract:

Individuals are typically co-infected by a diverse community of microparasites (e.g. viruses or protozoa) and macroparasites (e.g. helminths). Vertebrates respond to these parasites differently, typically mounting T helper type 1 (Th1) responses against microparasites and Th2 responses against macroparasites. These two responses may be antagonistic such that hosts face a 'decision' of how to allocate potentially limiting resources. Such decisions at the individual host level will influence parasite abundance at the population level which, in turn, will feed back upon the individual level. We take a first step towards a complete theoretical framework by placing an analysis of optimal immune responses under microparasite-macroparasite co-infection within an epidemiological framework. We show that the optimal immune allocation is quantitatively sensitive to the shape of the trade-off curve and qualitatively sensitive to life-history traits of the host, microparasite and macroparasite. This model represents an important first step in placing optimality models of the immune response to co-infection into an epidemiological framework. Ultimately, however, a more complete framework is needed to bring together the optimal strategy at the individual level and the population-level consequences of those responses, before we can truly understand the evolution of host immune responses under parasite co-infection.

Relevance: 30.00%

Publisher:

Abstract:

X-ray reflectivity (XR) and grazing incidence X-ray diffraction (GIXD) have been used to examine an oxyethylene-b-oxybutylene (E23B8) copolymer film at the air-water interface. The XR data were fitted using both a one-layer and a two-layer model that yielded the film thickness, roughness, and electron density. The best fit to the experimental data was obtained using a two-layer model (representing the oxyethylene and oxybutylene blocks, respectively), which showed a rapid thickening of the copolymer film at pressures above 7 mN/m. The large roughness values found indicate a significant degree of intermixing between the blocks and support the GIXD data, which showed no long-range lateral ordering within the layer. It was found from the electron density model results that there is a large film densification at 7 mN/m, possibly suggesting conformational changes within the film, even though no such change occurs on the pressure-area isotherm at the same surface pressure.
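
To illustrate what a "two-layer model" represents in such a fit, the sketch below builds an electron-density profile from two slabs (one per block) between air and water, with error-function smearing at each interface to represent roughness. The thicknesses, densities and roughness values are placeholders, not the fitted parameters reported in the paper.

```python
# Sketch of a two-layer "box" model of the electron-density profile normal to
# the air-water interface, with error-function roughness at each interface.
# All parameter values below are illustrative, not the paper's fitted values.
import numpy as np
from scipy.special import erf

def two_layer_profile(z, rho=(0.30, 0.36), thickness=(12.0, 18.0),
                      sigma=(3.0, 4.0, 3.5), rho_water=0.334):
    """Electron density (e/A^3) vs depth z (A); air at z < 0, water below the film."""
    # Slab densities from air (0) through the two blocks to the water subphase.
    densities = [0.0, rho[0], rho[1], rho_water]
    interfaces = [0.0, thickness[0], thickness[0] + thickness[1]]
    profile = np.full_like(z, densities[0], dtype=float)
    for z_i, s_i, (lo, hi) in zip(interfaces, sigma,
                                  zip(densities[:-1], densities[1:])):
        # Each interface contributes a smooth step of height (hi - lo).
        profile += 0.5 * (hi - lo) * (1.0 + erf((z - z_i) / (np.sqrt(2.0) * s_i)))
    return profile

z = np.linspace(-20.0, 60.0, 400)
rho_z = two_layer_profile(z)
print(f"Density at mid-film: {rho_z[np.argmin(np.abs(z - 15.0))]:.3f} e/A^3")
```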

Relevance: 30.00%

Publisher:

Abstract:

A first step in interpreting the wide variation in trace gas concentrations measured over time at a given site is to classify the data according to the prevailing weather conditions. In order to classify measurements made during two intensive field campaigns at Mace Head, on the west coast of Ireland, an objective method of assigning data to different weather types has been developed. Air-mass back trajectories arriving at the site during 1995–1997, calculated using winds from ECMWF analyses, were allocated to clusters based on a statistical analysis of the latitude, longitude and pressure of the trajectory at 12 h intervals over 5 days. The robustness of the analysis was assessed by using an ensemble of back trajectories calculated for four points around Mace Head. Separate analyses were made for each of the 3 years, and for four 3-month periods. The use of these clusters in classifying ground-based ozone measurements at Mace Head is described, including the need to exclude data which have been influenced by local perturbations to the regional flow pattern, for example by sea breezes. Even with a limited data set, based on 2 months of intensive field measurements in 1996 and 1997, there are statistically significant differences in ozone concentrations in air from the different clusters. The limitations of this type of analysis for the classification and interpretation of ground-based chemistry measurements are discussed.
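
As an illustration of this kind of trajectory clustering, the sketch below clusters synthetic 5-day back trajectories described by their latitude, longitude and pressure at 12 h intervals. k-means is used purely for convenience; the abstract does not state which clustering algorithm was applied, and the trajectory data are invented.

```python
# Illustrative sketch: cluster 5-day back trajectories described by their
# latitude, longitude and pressure at 12 h intervals (11 points per variable).
# k-means and the synthetic trajectories are assumptions for illustration only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_traj, n_steps = 200, 11          # 10 x 12 h steps plus the arrival time

# Synthetic "trajectories": lat, lon and pressure stacked into one feature vector,
# random-walking back in time from the Mace Head arrival point.
lat = 53.3 + np.cumsum(rng.normal(0, 1.0, (n_traj, n_steps)), axis=1)
lon = -9.9 + np.cumsum(rng.normal(0, 2.0, (n_traj, n_steps)), axis=1)
prs = 950.0 + np.cumsum(rng.normal(0, 10.0, (n_traj, n_steps)), axis=1)
features = np.hstack([lat, lon, prs])

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)
print("Trajectories per cluster:", np.bincount(kmeans.labels_))
```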

Relevance: 30.00%

Publisher:

Abstract:

Research into the topic of liquidity has greatly benefited from the availability of data. When bid-ask spreads were inaccessible to researchers, Roll (1984) provided a conceptual model that estimated the effective bid-ask spread from regular time series data recorded at a daily or longer interval. Later, data availability improved and researchers were able to address questions regarding the factors that influence spreads and the relationship between spreads and risk, return and liquidity. More recently, transaction data have been used to measure the effective spread, and researchers have been able to refine the concepts of liquidity to include the impact of transactions on price movements (Clayton and MacKinnon, 2000) on a trade-by-trade basis. This paper aims to use techniques that combine elements from all three approaches and, by studying US data over a relatively long time period, to throw light on earlier research as well as to reveal the changes in liquidity over the period, controlling for extraneous factors such as the market, age and size of the REIT. It also reveals some comparable results for the UK market over the same period.
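
For context, Roll's (1984) estimator infers the effective spread from the negative first-order serial covariance of price changes, spread = 2·sqrt(−cov(Δp_t, Δp_{t−1})). The sketch below applies it to a simulated price series with bid-ask bounce; the numbers are illustrative and not drawn from the REIT data studied in the paper.

```python
# Sketch of Roll's (1984) estimator: the effective spread is inferred from the
# negative first-order serial covariance of price changes,
#     spread = 2 * sqrt(-Cov(dp_t, dp_{t-1})),
# which only yields a real value when that covariance is negative.
# The price series below is simulated, not real REIT data.
import numpy as np

def roll_spread(prices):
    """Estimate the effective bid-ask spread from a daily price series."""
    dp = np.diff(prices)
    cov = np.cov(dp[1:], dp[:-1])[0, 1]
    return 2.0 * np.sqrt(-cov) if cov < 0 else np.nan

# Simulate trades bouncing between bid and ask around a random-walk mid price.
rng = np.random.default_rng(2)
true_spread = 0.10
mid = 50.0 + np.cumsum(rng.normal(0, 0.05, 1000))
trade_side = rng.choice([-0.5, 0.5], size=1000)        # hit the bid or lift the ask
observed = mid + trade_side * true_spread

print(f"Estimated spread: {roll_spread(observed):.3f} (true {true_spread:.2f})")
```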

Relevance: 30.00%

Publisher:

Abstract:

Research on the topic of liquidity has greatly benefited from the improved availability of data. Researchers have addressed questions regarding the factors that influence bid-ask spreads and the relationship between spreads and risk, return and liquidity. Intra-day data have been used to measure the effective spread and researchers have been able to refine the concepts of liquidity to include the price impact of transactions on a trade-by-trade analysis. The growth in the creation of tax-transparent securities has greatly enhanced the visibility of securitized real estate, and has naturally led to the question of whether the increased visibility of real estate has caused market liquidity to change. Although the growth in the public market for securitized real estate has occurred in international markets, it has not been accompanied by universal publication of transaction data. Therefore this paper develops an aggregate daily data-based test for liquidity and applies the test to US data in order to check for consistency with the results of prior intra-day analysis. If the two approaches produce similar results, we can apply the same technique to markets in which less detailed data are available and offer conclusions on the liquidity of a wider set of markets.

Relevance: 30.00%

Publisher:

Abstract:

Hardcore brownfield sites, that is, long-term derelict and vacant brownfield sites which are often contaminated, form a significant proportion of brownfield land in many cities, not only in the UK but also in other countries. The recent economic recession has placed the economic viability of such sites in jeopardy. This paper compares the approaches for bringing hardcore brownfield sites back into use in England and Japan by focusing on ten case studies in Manchester and Osaka, using an 'agency'-based framework. The findings are set in the context of (i) national brownfield and related policy agendas; (ii) recent trends in land and property markets in both England and Japan; and (iii) city-level comparisons of brownfields in Manchester and Osaka. The research, which was conducted during 2009-10, suggests that hardcore brownfield sites have been badly affected by the recent recession in both Manchester and Osaka. Despite this, not only is there evidence that hardcore sites have been successfully regenerated in both cities, but also that the critical success factors (CSFs) operating in bringing sites back into use share a large degree of commonality. These CSFs include the presence of strong potential markets, seeing the recession as an opportunity, long-term vision, strong branding, strong partnerships, integrated development, and getting infrastructure into place. Finally, the paper outlines the policy implications of the research.

Relevance: 30.00%

Publisher:

Abstract:

Data from various stations having different measurement record periods between 1988 and 2007 are analyzed to investigate surface ozone concentrations, long-term trends, and seasonal changes in and around Ireland. Time series statistical analysis is performed on the monthly mean data using seasonal and trend decomposition procedures and the Box-Jenkins approach (autoregressive integrated moving average). In general, ozone concentrations in the Irish region are found to have a negative trend at all sites except the coastal sites of Mace Head and Valentia. Data from the most polluted Dublin city site show a very strong negative trend of −0.33 ppb/yr with a 95% confidence limit of 0.17 ppb/yr (i.e., −0.33 ± 0.17) for the period 2002−2007, and for the site near the city of Cork, the trend is found to be −0.20 ± 0.11 ppb/yr over the same period. The negative trend at other sites is more pronounced when the data span is taken from around the year 2000 to 2007. The rural sites of Wexford and Monaghan also show very strong negative trends of −0.99 ± 0.13 and −0.58 ± 0.12 ppb/yr, respectively, for the period 2000−2007. Mace Head, a site that is representative of ozone changes in air advected from the Atlantic to Europe in the marine planetary boundary layer, shows a positive trend of about +0.16 ± 0.04 ppb per annum over the entire period 1988−2007, but this positive trend has weakened during recent years (e.g., in the period 2001−2007). Cluster analysis of back trajectories is performed for the stations having a long record of data, Mace Head and Lough Navar. For Mace Head, the northern and western clean-air sectors show a similar positive trend (+0.17 ± 0.02 ppb/yr for the northern sector and +0.18 ± 0.02 ppb/yr for the western sector) over the whole period, but partial analysis for the clean western sector at Mace Head shows different trends during different time periods, with a decrease in the positive trend since 1988 indicating a deceleration in the ozone trend for Atlantic air masses entering Europe.
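
As a hedged illustration of the seasonal-and-trend decomposition step, the sketch below applies an STL decomposition to a synthetic monthly-mean ozone series and expresses the trend component in ppb per year; the series, and the use of STL specifically, are assumptions for illustration rather than the paper's exact procedure.

```python
# Sketch: decompose a monthly-mean ozone series into trend, seasonal and
# residual components (STL) and fit a straight line to the trend component to
# express it in ppb per year. The series below is synthetic, not station data.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(3)
months = pd.date_range("1988-01", "2007-12", freq="MS")
t_years = np.arange(len(months)) / 12.0
ozone = (35.0
         + 0.16 * t_years                                              # slow upward trend
         + 5.0 * np.sin(2 * np.pi * np.arange(len(months)) / 12.0)     # seasonal cycle
         + rng.normal(0, 1.5, len(months)))                            # month-to-month noise
series = pd.Series(ozone, index=months)

decomposition = STL(series, period=12).fit()
slope, intercept = np.polyfit(t_years, decomposition.trend, 1)
print(f"Estimated trend: {slope:+.2f} ppb/yr")
```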

Relevance: 30.00%

Publisher:

Abstract:

The nonlinearity of high-power amplifiers (HPAs) has a crucial effect on the performance of multiple-input multiple-output (MIMO) systems. In this paper, we investigate the performance of MIMO orthogonal space-time block coding (OSTBC) systems in the presence of nonlinear HPAs. Specifically, we propose a constellation-based compensation method for HPA nonlinearity for the case where the HPA parameters are known at the transmitter and receiver, in which the constellation and decision regions of the distorted transmitted signal are derived in advance. Furthermore, for the scenario where the HPA parameters are unknown, a sequential Monte Carlo (SMC)-based compensation method for the HPA nonlinearity is proposed, which first estimates the channel-gain matrix by means of the SMC method and then uses the SMC-based algorithm to detect the desired signal. The performance of the MIMO-OSTBC system under study is evaluated in terms of average symbol error probability (SEP), total degradation (TD) and system capacity, in uncorrelated Nakagami-m fading channels. Numerical and simulation results are provided and show the effects on performance of several system parameters, such as the parameters of the HPA model, the output back-off (OBO) of the nonlinear HPA, the numbers of transmit and receive antennas, the modulation order of the quadrature amplitude modulation (QAM), and the number of SMC samples. In particular, it is shown that the constellation-based compensation method can efficiently mitigate the effect of HPA nonlinearity with low complexity and that the SMC-based detection scheme efficiently compensates for HPA nonlinearity when the HPA parameters are unknown.
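
The abstract does not specify the HPA model; as a hedged example of how a memoryless nonlinearity distorts a QAM constellation, the sketch below applies the widely used Saleh AM/AM and AM/PM model at a given back-off. The Saleh parameters and the back-off value are illustrative assumptions, not values taken from the paper.

```python
# Sketch of how a memoryless nonlinear HPA distorts a QAM constellation.
# The Saleh AM/AM and AM/PM model is used here only as an example; the paper's
# abstract does not state which HPA model was assumed, and the parameters and
# back-off below are illustrative.
import numpy as np

def saleh_hpa(x, alpha_a=2.1587, beta_a=1.1517, alpha_p=4.0033, beta_p=9.1040):
    """Apply Saleh AM/AM and AM/PM distortion to complex baseband samples."""
    r = np.abs(x)
    gain = alpha_a / (1.0 + beta_a * r ** 2)                  # AM/AM
    phase_shift = alpha_p * r ** 2 / (1.0 + beta_p * r ** 2)  # AM/PM (radians)
    return gain * x * np.exp(1j * phase_shift)

# 16-QAM symbols scaled by an input back-off so the HPA is driven near saturation.
levels = np.array([-3, -1, 1, 3], dtype=float)
constellation = np.array([a + 1j * b for a in levels for b in levels])
constellation /= np.sqrt(np.mean(np.abs(constellation) ** 2))  # unit average power

backoff_db = 6.0
scale = 10 ** (-backoff_db / 20.0)
distorted = saleh_hpa(scale * constellation)

rotation = np.angle(distorted * np.conj(constellation))
print("Max AM/PM phase rotation (deg):", np.degrees(rotation.max()))
```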

Relevance: 30.00%

Publisher:

Abstract:

Background: Targeting Induced Local Lesions IN Genomes (TILLING) is increasingly being used to generate and identify mutations in target genes of crop genomes. TILLING populations of several thousand lines have been generated in a number of crop species including Brassica rapa. Genetic analysis of mutants identified by TILLING requires an efficient, high-throughput and cost-effective genotyping method to track the mutations through numerous generations. High resolution melt (HRM) analysis has been used in a number of systems to identify single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels), enabling the genotyping of different types of samples. HRM is ideally suited to high-throughput genotyping of multiple TILLING mutants in complex crop genomes. To date it has been used to identify mutants and genotype single mutations. The aim of this study was to determine whether HRM can facilitate downstream analysis of multiple mutant lines identified by TILLING in order to characterise allelic series of EMS-induced mutations in target genes across a number of generations in complex crop genomes. Results: We demonstrate that HRM can be used to genotype allelic series of mutations in two genes, BraA.CAX1.a and BraA.MET1.a, in Brassica rapa. We analysed 12 mutations in BraA.CAX1.a and five in BraA.MET1.a over two generations, including a back-cross to the wild type. Using a commercially available HRM kit and the Lightscanner™ system, we were able to detect mutations in heterozygous and homozygous states for both genes. Conclusions: Using HRM genotyping on TILLING-derived mutants, it is possible to generate an allelic series of mutations within multiple target genes rapidly. Lines suitable for phenotypic analysis can be isolated approximately 8-9 months (3 generations) from receiving M3 seed of Brassica rapa from the RevGenUK TILLING service.

Relevance: 30.00%

Publisher:

Abstract:

The Back to the Future trilogy incorporates several different generic elements, including aspects of the fifties teen movie, science fiction, comedy and the western. These different modes playfully intertwine with each other, creating a complex world of repetitions, echoes and modulations. This essay seeks to interrogate the construction of generic elements and the play between them through a close analysis of a repeated performance. Genre is signalled through various strategies employed within the construction of mise-en-scène, and a significant portion of this, as I would like to argue, is transmitted through performance. The material detail of a performance, incorporating gesture, movement, voice, and even surrounding elements such as costume, as well as the way it is presented within a film, is key to the establishment, invocation and coherence of genre. Furthermore, attention to the complexity of performance details, particularly in the manner in which they reverberate across texts, demonstrates the intricacy of genre and its inherent mutability. The Back to the Future trilogy represents a specific interest in the flexibility of genre. Within each film, and especially across all three, aspects of various genres are interlaced through both visual and narrative detail, thus constructing a dense layer of references both within and without the texts. To explore this patterning in more detail I will interrogate the contribution of performance to generic play through close analysis of Thomas F. Wilson's performance of Biff/Griff/Buford Tannen and his central encounter with Marty McFly (Michael J. Fox) in each film. These moments take place in a fifties diner, a 1980s retro diner and a saloon respectively, each space contributing to the similarities and differences in each repetition. Close attention to Wilson's performance of each related character, which contains both modulations and repetitions used specifically to situate each film's central generic theme, demonstrates how deeply embedded the play between genres, and their flexibility, is within the trilogy.

Relevance: 30.00%

Publisher:

Abstract:

Long Term Evolution (LTE) based networks lack native support for circuit-switched (CS) services. The Evolved Packet System (EPS), which includes the Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) and the Evolved Packet Core (EPC), is a purely all-IP packet system. This introduces the problem of how to provide voice call support when a user is within an LTE network and how to ensure voice service continuity when the user moves out of the LTE coverage area. Different technologies have been proposed for providing voice to LTE users and for ensuring that the service continues outside LTE networks. The aim of this paper is to analyze and evaluate the overall performance of these technologies along with Single Radio Voice Call Continuity (SRVCC) inter-RAT handover to the Universal Terrestrial Radio Access Network / GSM EDGE Radio Access Network (UTRAN/GERAN). The possible solutions for providing voice calls and service continuity over LTE-based networks are Circuit Switched Fallback (CSFB), Voice over LTE via Generic Access (VoLGA), Voice over LTE (VoLTE) based on IMS/MMTel with SRVCC, and over-the-top (OTT) services such as Skype. This paper focuses mainly on the 3GPP standard solutions for implementing voice over LTE. The paper compares various aspects of these solutions and suggests a possible roadmap that mobile operators can adopt to provide seamless voice over LTE.

Relevance: 30.00%

Publisher:

Abstract:

Flow in geophysical fluids is commonly summarized by coherent streams, for example, conveyor belt flows in extratropical cyclones or jet streaks in the upper troposphere. Typically, parcel trajectories are calculated from the flow field and subjective thresholds are used to distinguish coherent streams of interest. This methodological contribution develops a more objective approach to distinguishing coherent airstreams within extratropical cyclones. Agglomerative clustering is applied to trajectories, along with a method to identify the optimal number of cluster classes. The methodology is applied to trajectories associated with the low-level jets of a well-studied extratropical cyclone. For computational efficiency, a constraint that trajectories must pass through these jet regions is applied prior to clustering; the partitioning into different airstreams is then performed by the agglomerative clustering. It is demonstrated that the methodology can identify the salient flow structures of cyclones: the warm and cold conveyor belts. A test focusing on the airstreams terminating at the tip of the bent-back front further demonstrates the success of the method, in that it can distinguish fine-scale flow structure such as descending sting-jet airstreams.
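
A minimal sketch of the clustering step follows, assuming a silhouette criterion to pick the number of cluster classes (the abstract does not say which optimality criterion was used) and synthetic trajectory feature vectors in place of real parcel trajectories.

```python
# Sketch: agglomerative clustering of parcel trajectories, with the number of
# cluster classes chosen by a silhouette criterion. The silhouette score is an
# illustrative selection rule only, and the trajectory data are synthetic.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(4)
# Synthetic trajectories: positions at successive times flattened into one
# feature vector per trajectory, drawn from three distinct flow regimes.
centres = rng.normal(0, 5, (3, 30))
features = np.vstack([c + rng.normal(0, 1.0, (60, 30)) for c in centres])

best_k, best_score = None, -1.0
for k in range(2, 9):
    labels = AgglomerativeClustering(n_clusters=k, linkage="ward").fit_predict(features)
    score = silhouette_score(features, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"Selected {best_k} clusters (silhouette = {best_score:.2f})")
```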