967 results for Feature Model


Relevance: 30.00%

Abstract:

Traditionally, the cusp has been described in terms of a time-stationary feature of the magnetosphere which allows access of magnetosheath-like plasma to low altitudes. Statistical surveys of data from low-altitude spacecraft have shown the average characteristics and position of the cusp. Recently, however, it has been suggested that the ionospheric footprint of flux transfer events (FTEs) may be identified as variations of the “cusp” on timescales of a few minutes. In this model, the cusp can vary in form between a steady-state feature in one limit and a series of discrete ionospheric FTE signatures in the other limit. If this time-dependent cusp scenario is correct, then the signatures of the transient reconnection events must be able, on average, to reproduce the statistical cusp occurrence previously determined from the satellite observations. In this paper, we predict the precipitation signatures which are associated with transient magnetopause reconnection, following recent observations of the dependence of dayside ionospheric convection on the orientation of the interplanetary magnetic field (IMF). We then employ a simple model of the longitudinal motion of FTE signatures to show how such events can easily reproduce the local time distribution of cusp occurrence probabilities, as observed by low-altitude satellites. This is true even in the limit where the cusp is a series of discrete events. Furthermore, we investigate the existence of double cusp patches predicted by the simple model and show how these events may be identified in the data.

Relevance: 30.00%

Abstract:

The topography of many floodplains in the developed world has now been surveyed with high resolution sensors such as airborne LiDAR (Light Detection and Ranging), giving accurate Digital Elevation Models (DEMs) that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X WorldDEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution Synthetic Aperture Radar (SAR) images for calibrating, validating and assimilating observations into flood inundation models in order to improve these. This paper discusses an additional use of SAR flood extents, namely to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving this DEM for future flood modelling and other studies. The method is based on the fact that for larger rivers the water elevation generally changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, heights of adjacent pixels along a small section of waterline can be regarded as samples with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate estimate. While this reduces the height errors along a waterline, a waterline is only a linear feature in a two-dimensional space. However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must in general be no higher than the corrected heights along that waterline, whereas DEM heights not enclosed by the lower waterline must in general be no lower than the corrected heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the reduced errors on the corrected waterline heights. The method was tested on a section of the TanDEM-X Intermediate DEM (IDEM) covering an 11 km reach of the Warwickshire Avon, England. Flood extents from four COSMO-SkyMed images were available at various stages of a flood in November 2012, and a LiDAR DEM was available for validation. In the area covered by the flood extents, the original IDEM heights had a mean difference from the corresponding LiDAR heights of 0.5 m with a standard deviation of 2.0 m, while the corrected heights had a mean difference of 0.3 m with a standard deviation of 1.2 m. These figures show that significant reductions in IDEM height bias and error can be made using the method, with the corrected error being only 60% of the original. Even if only a single SAR image obtained near the peak of the flood was used, the corrected error was only 66% of the original. The method should also be capable of improving the final TanDEM-X DEM and other DEMs, and may also be of use with data from the SWOT (Surface Water and Ocean Topography) satellite.
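The core correction step described above lends itself to a short sketch. Below is a minimal Python illustration of the waterline averaging idea, treating a short section of waterline as samples with a common population mean; the function name and window size are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def correct_waterline_heights(heights, window=5):
    """Replace each waterline pixel height with the mean of the heights in a
    small window of adjacent waterline pixels, treating the waterline locally
    as a quasi-contour (samples sharing a common population mean).

    heights: 1-D array of DEM heights sampled along the flood-extent boundary,
             ordered along the waterline. Window size is illustrative.
    """
    n = len(heights)
    corrected = np.empty(n)
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        corrected[i] = heights[lo:hi].mean()  # local averaging reduces noise
    return corrected

# If height errors are roughly independent with standard deviation sigma,
# averaging k samples shrinks the error on each corrected height to about
# sigma / sqrt(k), consistent with the error reductions reported above.
```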

Relevance: 30.00%

Abstract:

The general circulation models used to simulate global climate typically feature resolution too coarse to reproduce many smaller-scale processes, which are crucial to determining the regional responses to climate change. A novel approach to downscaling climate change scenarios is presented which includes the interactions between the North Atlantic Ocean and the European shelves as well as their impact on the North Atlantic and European climate. The goal of this paper is to introduce the global ocean-regional atmosphere coupling concept and to show the potential benefits of this model system, referred to as ROM, for simulating present-day climate. A global ocean-sea ice-marine biogeochemistry model (MPIOM/HAMOCC) with regionally high horizontal resolution is coupled to an atmospheric regional model (REMO) and a global terrestrial hydrology model (HD) via the OASIS coupler. Moreover, results obtained with ROM using NCEP/NCAR reanalysis and ECHAM5/MPIOM CMIP3 historical simulations as boundary conditions are presented and discussed for the North Atlantic and North European region. The validation of all the model components, i.e., ocean, atmosphere, terrestrial hydrology, and ocean biogeochemistry, is performed and discussed. The careful and detailed validation of ROM provides evidence that the proposed model system improves the simulation of many aspects of the regional climate, most notably the ocean, even though some biases persist in other model components, thus leaving potential for future improvement. We conclude that ROM is a powerful tool to estimate possible impacts of climate change on the regional scale.

Relevance: 30.00%

Abstract:

This study examines the variability of the South American monsoon system (SAMS) over tropical South America (SA). The onset, end, and total rainfall during the summer monsoon are investigated using precipitation pentad estimates from the Global Precipitation Climatology Project (GPCP) for 1979-2006. Likewise, the variability of SAMS characteristics is examined in ten Intergovernmental Panel on Climate Change (IPCC) global coupled climate models for the twentieth century (1981-2000) and for a future scenario of global change (A1B) (2081-2100). It is shown that most IPCC models misrepresent the intertropical convergence zone and therefore do not capture the actual annual cycle of precipitation over the Amazon and northwest SA. Most models can correctly represent the spatiotemporal variability of the annual cycle of precipitation in central and eastern Brazil, such as the correct phase of the dry and wet seasons, onset dates, duration of the rainy season, and total accumulated precipitation during the summer monsoon for the twentieth-century runs. Nevertheless, poor representation of the total monsoonal precipitation over the Amazon and northeast Brazil is observed in a large majority of the models. Overall, MIROC3.2-hires, MIROC3.2-medres and MRI-CGCM2.3.2 show the most realistic representation of SAMS's characteristics such as onset, duration, total monsoonal precipitation, and its interannual variability. On the other hand, ECHAM5, GFDL-CM2.0 and GFDL-CM2.1 have the least realistic representation of the same characteristics. For the A1B scenario, the most coherent feature observed in the IPCC models is a reduction in precipitation over central-eastern Brazil during the summer monsoon, compared with the present climate. The IPCC models do not indicate statistically significant changes in SAMS onset and demise dates for the same scenario.
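As a rough illustration of how an onset date can be extracted from a pentad precipitation series: the abstract does not state the exact criterion used, so the threshold-and-persistence rule below is an assumption, not the paper's method:

```python
import numpy as np

def onset_pentad(precip, threshold, persist=3):
    """Return the index of the first pentad whose precipitation exceeds
    `threshold` and stays above it for `persist` consecutive pentads.
    A generic pentad-based onset criterion for illustration only."""
    above = precip > threshold
    for i in range(len(above) - persist + 1):
        if above[i:i + persist].all():
            return i
    return None  # no onset detected in this record

# Example: 73 pentads (one year) of synthetic precipitation for a grid point.
rng = np.random.default_rng(0)
precip = np.clip(rng.normal(3.0, 2.0, 73), 0, None)
print(onset_pentad(precip, threshold=4.0))
```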

Relevance: 30.00%

Abstract:

In the last decade, there has been renewed interest in biologically active peptides in fields like allergy, autoimmune diseases and antibiotic therapy. Mast cell degranulating peptides mimic G-protein receptors, showing different activity levels even among homologous peptides. Another important feature is their ability to interact directly with membrane phospholipids, in a fast and concentration-dependent way. The mechanism of action of peptide HR1 on model membranes was investigated in comparison with other mast cell degranulating peptides (Mastoparan, Eumenitin and Anoplin) to evidence the features that modulate their selectivity. Using vesicle leakage, single-channel recordings and zeta-potential measurements, we demonstrated that HR1 preferentially binds to anionic bilayers, accumulates, folds, and, at very low concentrations, is able to insert and create membrane-spanning ion-selective pores. We discuss the ion-selectivity character of the pores based on the neutralization or screening of the peptides' charges by the bilayer head-group charges or dipoles.

Relevance: 30.00%

Abstract:

The Sznajd model (SM) has been successfully employed in recent years to describe opinion propagation in a community. In particular, it has been claimed that its transient is able to reproduce some scale properties observed in data from proportional elections in different countries, if the community structure (the network) is scale-free. In this work, we investigate the properties of the transient of a particular version of the SM, introduced by Bernardes and co-authors in 2002. We studied the behavior of the model in networks of different topologies through the time evolution of an order parameter known as interface density, and concluded that regular lattices with high dimensionality also lead to a power-law distribution of the number of candidates with v votes. We also show that the particular absorbing state achieved in the stationary state (that is, the winning candidate) is related to a particular feature of the model that may not be realistic in all situations.
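For readers unfamiliar with the dynamics, here is a minimal sketch of the two-state Sznajd rule and the interface-density order parameter on a 1-D ring; the study itself uses richer topologies and a multi-candidate variant, so this is illustrative only:

```python
import numpy as np

def sznajd_step(spins, rng):
    """One update of the two-state Sznajd model on a 1-D ring: a pair of
    agreeing neighbors imposes its opinion on the pair's outer neighbors."""
    n = len(spins)
    i = rng.integers(n)
    j = (i + 1) % n
    if spins[i] == spins[j]:
        spins[(i - 1) % n] = spins[i]
        spins[(j + 1) % n] = spins[i]

def interface_density(spins):
    """Fraction of nearest-neighbor pairs holding different opinions,
    the order parameter tracked in the text."""
    return np.mean(spins != np.roll(spins, 1))

rng = np.random.default_rng(1)
spins = rng.choice([-1, 1], size=1000)
for t in range(200_000):
    sznajd_step(spins, rng)
print(interface_density(spins))  # decays toward 0 as consensus is approached
```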

Relevance: 30.00%

Abstract:

Axelrod's model for culture dissemination offers a nontrivial answer to the question of why there is cultural diversity, given that people's beliefs have a tendency to become more similar to each other's as they interact repeatedly. The answer depends on the two control parameters of the model, namely, the number F of cultural features that characterize each agent, and the number q of traits that each feature can take on, as well as on the size A of the territory or, equivalently, on the number of interacting agents. Here, we investigate the dependence of the number C of distinct coexisting cultures on the area A in Axelrod's model, the culture-area relationship, through extensive Monte Carlo simulations. We find a non-monotonic culture-area relation, for which the number of cultures decreases when the area grows beyond a certain size, provided that q is smaller than a threshold value q_c = q_c(F) and F ≥ 3. In the limit of infinite area, this threshold value signals the onset of a discontinuous transition between a globalized regime marked by a uniform culture (C = 1) and a completely polarized regime where all C = q^F possible cultures coexist. Otherwise, the culture-area relation exhibits the typical behavior of the species-area relation, i.e., a monotonically increasing curve whose slope is steep at first and steadily levels off at some maximum diversity value.
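A minimal sketch of the lattice dynamics being simulated (array layout and function names are assumptions for illustration, not the authors' code):

```python
import numpy as np

def axelrod_step(culture, rng):
    """One interaction of Axelrod's model on an L x L lattice.
    culture: (L, L, F) integer array of traits in {0, ..., q-1}.
    An agent interacts with a random neighbor with probability equal to
    their cultural overlap, then copies one feature on which they differ."""
    L = culture.shape[0]
    x, y = rng.integers(L, size=2)
    dx, dy = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
    a = culture[x, y]
    b = culture[(x + dx) % L, (y + dy) % L]
    overlap = np.mean(a == b)
    if 0 < overlap < 1 and rng.random() < overlap:
        f = rng.choice(np.nonzero(a != b)[0])  # a feature they disagree on
        a[f] = b[f]                            # adopt the neighbor's trait

def count_cultures(culture):
    """Number C of distinct coexisting cultures on the lattice."""
    flat = culture.reshape(-1, culture.shape[-1])
    return len({tuple(row) for row in flat})
```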

Relevance: 30.00%

Abstract:

An important feature of Axelrod's model for culture dissemination or social influence is the emergence of many multicultural absorbing states, despite the fact that the local rules that specify the agents' interactions are explicitly designed to decrease the cultural differences between agents. Here we re-examine the problem of introducing an external, global interaction (the mass media) into the rules of Axelrod's model: in addition to their nearest neighbors, each agent has a certain probability p of interacting with a virtual neighbor whose cultural features are fixed from the outset. Most surprisingly, this apparently homogenizing effect actually increases the cultural diversity of the population. We show that, contrary to previous claims in the literature, even a vanishingly small value of p is sufficient to destabilize the homogeneous regime for very large lattice sizes.
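The mass-media modification amounts to one extra branch in the interaction rule. A sketch reusing the conventions of the previous snippet, where the fixed vector `media` plays the role of the virtual neighbor (again an illustration, not the authors' code):

```python
def axelrod_media_step(culture, media, p, rng):
    """With probability p the chosen agent interacts with a fixed external
    culture vector `media` instead of a lattice neighbor; otherwise the
    ordinary nearest-neighbor rule of the previous snippet applies."""
    L = culture.shape[0]
    x, y = rng.integers(L, size=2)
    if rng.random() < p:                       # virtual (mass media) neighbor
        b = media
    else:                                      # ordinary nearest neighbor
        dx, dy = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
        b = culture[(x + dx) % L, (y + dy) % L]
    a = culture[x, y]
    overlap = (a == b).mean()
    if 0 < overlap < 1 and rng.random() < overlap:
        f = rng.choice((a != b).nonzero()[0])
        a[f] = b[f]
```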

Relevance: 30.00%

Abstract:

The Open Provenance Architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to independently record documentation, and allowing the workflow to be executed in any environment. Another noticeable feature is that we distinguish between the data recorded about what has occurred, process documentation, and the provenance of a data item, which is all that caused the data item to be as it is and is obtained as the result of a query over process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.
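A toy illustration of the distinction drawn above between recorded process documentation and provenance-as-query; class and method names are hypothetical, not the OPA API:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Relationship:
    """A causal assertion recorded as process documentation:
    `effect` was caused by `cause` (either may be an event or a data item)."""
    cause: str
    effect: str

@dataclass
class ProcessDocumentation:
    """Hypothetical in-memory store of causal assertions; the real
    architecture defines its own data model and query interface."""
    relationships: set = field(default_factory=set)

    def record(self, cause, effect):
        self.relationships.add(Relationship(cause, effect))

    def provenance(self, item):
        """Provenance of `item`: everything that transitively caused it,
        obtained as a query over the recorded documentation."""
        result, frontier = set(), {item}
        while frontier:
            causes = {r.cause for r in self.relationships
                      if r.effect in frontier and r.cause not in result}
            result |= causes
            frontier = causes
        return result
```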

Relevance: 30.00%

Abstract:

Excessive labor turnover may be considered, to a great extent, an undesirable feature of a given economy, owing to considerations such as underinvestment in human capital by firms. Understanding the determinants and the evolution of turnover in a particular labor market is therefore of paramount importance, including for policy design. The present paper proposes an econometric analysis of turnover in the Brazilian labor market, based on a partial observability bivariate probit model. This model considers the interdependence of decisions taken by workers and firms, helping to elucidate the causes that lead each of them to end an employment relationship. The Employment and Unemployment Survey (PED) conducted by the State System of Data Analysis (SEADE) and by the Inter-Union Department of Statistics and Socioeconomic Studies (DIEESE) provides data at the individual worker level, allowing for the estimation of the joint probabilities of decisions to quit or stay on the job on the worker's side, and to retain or fire the employee on the firm's side, during a given time period. The estimated parameters relate these probabilities to the characteristics of workers and job contracts, and to potential macroeconomic determinants in different time periods. The results confirm the theoretical prediction that the probability of termination of an employment relationship tends to fall as the worker acquires specific skills. The results also show that the establishment of a formal employment relationship reduces the probability of a quit decision by the worker, and also the firm's firing decision in non-industrial sectors. With regard to the evolution of quit probability over time, the results show that an increase in the unemployment rate inhibits quitting, although this effect tends to wane as the unemployment rate rises.
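The partial observability structure can be made concrete with a short sketch of the likelihood in the style of Poirier's (1980) partial-observability bivariate probit: only the product of the worker's and the firm's latent decisions is observed, here whether the employment relationship continues. Variable names below are illustrative, not the paper's:

```python
import numpy as np
from scipy.stats import multivariate_normal

def neg_loglik(params, X1, X2, y):
    """Negative log-likelihood of a partial-observability bivariate probit.
    y = 1 only if BOTH latent decisions (worker stays, firm retains) are
    positive; the individual decisions are unobserved. A generic sketch."""
    k1 = X1.shape[1]
    b1, b2 = params[:k1], params[k1:-1]
    rho = np.tanh(params[-1])          # keep correlation inside (-1, 1)
    z = np.column_stack([X1 @ b1, X2 @ b2])
    cov = np.array([[1.0, rho], [rho, 1.0]])
    # P(both latent decisions positive) = bivariate normal CDF at (z1, z2)
    p = np.array([multivariate_normal.cdf(zi, mean=[0, 0], cov=cov)
                  for zi in z])
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Minimized with e.g. scipy.optimize.minimize over (b1, b2, atanh(rho)).
```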

Relevance: 30.00%

Abstract:

In price-competition models, a positive consumer search cost alone does not generate an equilibrium with price dispersion. Dynamic switching-cost models, by contrast, consistently generate this phenomenon, which is well documented for retail prices. Although both literatures are vast, few models have attempted to combine the two frictions in a single framework. This paper presents a dynamic model of price competition in which identical consumers face both search costs and switching costs. The equilibrium generates price dispersion. Moreover, since consumers must commit to a fixed sample of firms before prices are set, only two prices are considered before each purchase. This result is independent of the size of the consumer's individual search cost.

Relevance: 30.00%

Abstract:

It is well known that cointegration between the levels of two variables (labeled Y_t and y_t in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before additional tests on the values of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler, and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models allowing several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced rank models. Their performance is evaluated in a Monte Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
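For reference, the standard Campbell-Shiller-style algebra behind these two necessary conditions can be written as follows (the notation is assumed for illustration and may differ from the paper's):

```latex
% Present-value relation between Y_t and y_t:
Y_t \;=\; \theta(1-\delta)\sum_{i=0}^{\infty}\delta^{i}\,\mathbb{E}_t\,y_{t+i}.

% If y_t is I(1), cointegration holds: the spread S_t = Y_t - \theta y_t is
% stationary and equals discounted expected future growth,
S_t \;=\; \theta\sum_{i=1}^{\infty}\delta^{i}\,\mathbb{E}_t\,\Delta y_{t+i}
\quad\Longrightarrow\quad
S_t \;=\; \delta\,\mathbb{E}_t\!\left[S_{t+1} + \theta\,\Delta y_{t+1}\right].

% The second necessary condition: the model's one-step forecast error
\xi_{t+1} \;=\; S_{t+1} + \theta\,\Delta y_{t+1} - \tfrac{1}{\delta}\,S_t
% must be orthogonal to the past, \mathbb{E}_t[\xi_{t+1}] = 0, which is the
% kind of moment condition the GMM-based tests exploit.
```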

Relevance: 30.00%

Abstract:

It is well known that cointegration between the levels of two variables (e.g. prices and dividends) is a necessary condition to assess the empirical validity of a present-value model (PVM) linking them. The work on cointegration, namely on long-run co-movements, has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. This amounts to investigating whether short-run co-movements stemming from common cyclical feature restrictions are also present in such a system. In this paper we test for the presence of such co-movements in long- and short-term interest rates and in prices and dividends for the U.S. economy. We focus on the potential improvement in forecasting accuracy when imposing these two types of restrictions derived from economic theory.

Relevance: 30.00%

Abstract:

This paper has two original contributions. First, we show that the present-value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something that has already been investigated in that VECM context by Johansen and Swensen (1999, 2011) but has not been discussed before with this new emphasis. We also provide the present-value reduced rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced rank restrictions. The reason why appropriate common cyclical feature restrictions might improve forecasting is that they impose natural exclusion restrictions, preventing the estimation of useless parameters which would otherwise increase the forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present-value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the levels of real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.

Relevance: 30.00%

Abstract:

We investigate a neutrino mass model in which the neutrino data are accounted for by bilinear R-parity violating supersymmetry with anomaly mediated supersymmetry breaking. We focus on the CERN Large Hadron Collider (LHC) phenomenology, studying the reach of generic supersymmetry search channels with leptons, missing energy and jets. A special feature of this model is the existence of long-lived neutralinos and charginos which decay inside the detector, leading to detached vertices. We demonstrate that the largest reach is obtained in the displaced-vertex channel and that practically all of the reasonable parameter space will be covered with an integrated luminosity of 10 fb^-1. We also compare the displaced-vertex reaches of the LHC and the Tevatron.