987 results for Extragalactic Distance Scale


Relevance:

30.00%

Publisher:

Abstract:

1. Adaptation of plant populations to local environments has been shown in many species, but local adaptation is not always apparent and the spatial scales of differentiation are not well known. In a reciprocal transplant experiment we tested whether: (i) three widespread grassland species are locally adapted at a European scale; (ii) detection of local adaptation depends on competition with the local plant community; and (iii) local differentiation between neighbouring populations from contrasting habitats can be stronger than differentiation at a European scale.
2. Seeds of Holcus lanatus, Lotus corniculatus and Plantago lanceolata from a Swiss, a Czech and a UK population were sown in a reciprocal transplant experiment in fields with environmental conditions similar to those of the source sites. Seedling emergence, survival, growth and reproduction were recorded for two consecutive years.
3. The effect of competition was tested by comparing individuals in weeded monocultures with plants sown together with species from the local grassland community. To compare large-scale vs. small-scale differentiation, a neighbouring population from a contrasting habitat (wet-dry contrast) was compared with the 'home' and 'foreign' populations.
4. In P. lanceolata and H. lanatus, a significant home-site advantage was detected in fitness-related traits, indicating local adaptation. In L. corniculatus, an overall superiority of one provenance was found.
5. The detection of local adaptation depended on competition with the local plant community. In the absence of competition, the home-site advantage was underestimated in P. lanceolata and overestimated in H. lanatus.
6. A significant population differentiation between contrasting local habitats was found. In some traits, this small-scale differentiation was greater than the large-scale differentiation between countries.
7. Our results indicate that local adaptation in real plant communities cannot necessarily be predicted from plants grown in weeded monocultures, and that tests of the relationship between fitness and geographical distance have to account for habitat-dependent small-scale differentiation. Given the strong small-scale differentiation, a local provenance from a different habitat may not be the best choice in ecological restoration if distant populations from a more similar habitat are available.


The surface mass balance for Greenland and Antarctica has been calculated using model data from an AMIP-type experiment for the period 1979–2001, using the ECHAM5 spectral transform model at different triangular truncations. There is a significant reduction in the calculated ablation at the highest model resolution, T319, with an equivalent grid distance of ca. 40 km. As a consequence, the T319 model has a positive surface mass balance for both ice sheets during the period. For Greenland, the lower-resolution models, T106 and T63, on the other hand, have a much stronger ablation, leading to a negative surface mass balance. Calculations have also been undertaken for a climate change experiment using the IPCC scenario A1B, with a T213 resolution (corresponding to a grid distance of some 60 km), comparing two 30-year periods at the end of the twentieth century and the end of the twenty-first century, respectively. For Greenland there is a change of 495 km³/year, going from a positive to a negative surface mass balance and corresponding to a sea level rise of 1.4 mm/year. For Antarctica there is an increase in the positive surface mass balance of 285 km³/year, corresponding to a sea level fall of 0.8 mm/year. The surface mass balance changes of the two ice sheets lead to a sea level rise of 7 cm at the end of this century compared to the end of the twentieth century. Other possible mass losses, such as those due to changes in the calving of icebergs, are not considered. It appears that such changes must increase significantly, several times more than the surface mass balance changes, if the ice sheets are to make a major contribution to sea level rise this century. The model calculations indicate large inter-annual variations in all relevant parameters, making it impossible to identify robust trends from the examined periods at the end of the twentieth century. The calculated inter-annual variations are similar in magnitude to observations.
The 30-year trend in surface mass balance at the end of the twenty-first century is significant. The increase in precipitation on the ice sheets follows the Clausius-Clapeyron relation closely and is the main reason for the increase in the surface mass balance of Antarctica. On Greenland, precipitation in the form of snow is gradually starting to decrease and cannot compensate for the increase in ablation. Another factor is the proportionally larger temperature increase on Greenland, leading to larger ablation. It follows that a modest increase in temperature will not be sufficient to offset the increase in accumulation, but this will change when the temperature increase goes beyond a critical limit. Calculations show that such a limit for Greenland might well be passed during this century. For Antarctica this will take much longer, probably well into the following centuries.
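
The volume-to-sea-level conversions quoted above can be checked with simple arithmetic, assuming the quoted volumes are water equivalent and a global ocean area of about 3.61 × 10⁸ km² (both assumptions, not stated in the text):

```python
# Quick check of the quoted surface-mass-balance / sea-level numbers.
# Assumes water-equivalent volumes and an ocean area of ~3.61e8 km^2.

OCEAN_AREA_KM2 = 3.61e8  # global ocean area, km^2 (assumed)

def sea_level_mm(volume_km3):
    """Convert a water-equivalent volume (km^3) to mm of global sea level."""
    return volume_km3 / OCEAN_AREA_KM2 * 1e6  # km of sea level -> mm

greenland_mm = sea_level_mm(495.0)   # Greenland SMB change, ~1.4 mm/yr rise
antarctica_mm = sea_level_mm(285.0)  # Antarctic SMB change, ~0.8 mm/yr fall
```

Both values reproduce the rates stated in the abstract to the quoted precision.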


Purpose – This paper aims to investigate the scale and drivers of cross-border real estate development in Western Europe and Central and Eastern Europe.
Design/methodology/approach – Placing cross-border real estate development within the framework of foreign direct investment (FDI), conceptual complexities in characterizing the notional real estate developer are emphasized. Drawing upon a transaction database, this paper proxies cross-border real estate development flows with asset sales by developers.
Findings – Much higher levels of market penetration by international real estate developers are found in the less mature markets of Central and Eastern Europe. The analysis suggests a complex range of determinants, with physical distance remaining a consistent barrier to cross-border development flows.
Originality/value – This analysis adds significant value to the understanding of cross-border real estate development flows. A detailed examination of the issues, based on a rigorous empirical analysis through gravity modelling, is offered. The gravity framework is one of the most robustly confirmed empirical regularities in international economics and is commonly applied to trade, FDI, migration and foreign portfolio investment, inter alia. This paper assesses the extent to which it provides useful insights into the pattern of cross-border real estate development flows.
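
The gravity framework referenced above can be illustrated with a toy calculation: flows proportional to the product of market masses divided by distance, with the distance elasticity recovered by a log-log regression. All numbers here are synthetic illustrations, not data from the study:

```python
import math

# Gravity law for bilateral flows: flow_ij ∝ m_i * m_j / dist_ij.
# We generate flows from the law and recover the distance elasticity.

def gravity_flow(m_i, m_j, dist):
    """Deterministic gravity law with a distance elasticity of -1."""
    return m_i * m_j / dist

dists = [100.0, 200.0, 400.0, 800.0, 1600.0]
flows = [gravity_flow(100.0, 100.0, d) for d in dists]

# recover the elasticity b from log(flow) = a + b*log(dist) by least squares
xs = [math.log(d) for d in dists]
ys = [math.log(f) for f in flows]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
# b ≈ -1: flows halve each time distance doubles
```

In empirical work the same regression is run on observed flows with GDP, distance and institutional controls on the right-hand side.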


This paper shows that value creation by multinational enterprises (MNEs) is the result of activities where geographic distance effects can be overcome. We submit that geographic distance has a relatively low impact on international research and development (R&D) investments, owing to the spiky nature of innovation and to the unique ability of MNEs to absorb and transfer knowledge on a global scale. On the one hand, MNEs need to set up their labs as close as possible to specialized technology clusters where valuable knowledge is concentrated, largely regardless of distance from their home base. On the other hand, MNEs have historically developed technical and organizational competencies that enable them to transfer knowledge within their internal networks and across technology clusters at relatively low cost. Using data on R&D and manufacturing investments of 6320 firms in 59 countries, we find that geographic distance has a lower negative impact on the probability of setting up R&D labs than manufacturing plants. Furthermore, once measures of institutional proximity are accounted for, MNEs are equally likely to set up R&D labs in nearby or in more remote locations. This result is driven by MNEs based in Triad countries, whereas for non-Triad MNEs the effect of geographic distance on cross-border R&D is negative and significant.


A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg-de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its 'distance' to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
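
The importance resampling step described above can be sketched in a few lines. The scalar state, the Gaussian observation error and all numbers are illustrative assumptions, not taken from the paper:

```python
import math
import random

random.seed(1)

def importance_resample(ensemble, observation, obs_std):
    """Weight each member by its likelihood given the observation
    (Bayes' theorem with Gaussian observation error), then resample so
    that high-weight members are duplicated and low-weight members are
    largely dropped."""
    weights = [math.exp(-0.5 * ((m - observation) / obs_std) ** 2)
               for m in ensemble]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(ensemble, weights=weights, k=len(ensemble))

prior = [random.gauss(0.0, 2.0) for _ in range(500)]  # forecast ensemble
posterior = importance_resample(prior, observation=1.5, obs_std=0.5)
```

After resampling, the ensemble mean has moved toward the observation while the ensemble size is unchanged; in a real filter each member would be a full model state, not a scalar.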


The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface with coherence of angular properties (i.e. slope, aspect) between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the national elevation dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)).
The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, owing to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid point artifacts generated where predicted points share the location of original data points; and (iii) using a small value of the nugget effect, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
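
The three concepts above can be sketched with a minimal ordinary-kriging routine: a small local point set, jittered coordinates, and a deliberately small nugget. The variogram parameters and the planar toy terrain are invented for illustration, not taken from the study:

```python
import math
import random

random.seed(0)

def exp_variogram(h, nugget=0.01, sill=1.0, rng=300.0):
    """Exponential variogram with a small nugget (point iii), so the
    interpolated surface is not over-smoothed."""
    if h == 0.0:
        return 0.0
    return nugget + (sill - nugget) * (1.0 - math.exp(-h / rng))

def solve(a, b):
    """Tiny Gaussian elimination with partial pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(pts, vals, target):
    """Ordinary kriging from a small local neighbourhood (point i), with
    the coordinates jittered slightly (point ii) to avoid singular
    systems when a prediction point coincides with a data point."""
    eps = 1e-6
    pts = [(x + random.uniform(-eps, eps), y + random.uniform(-eps, eps))
           for x, y in pts]
    n = len(pts)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    # kriging system: [gamma_ij 1; 1 0] [w; mu] = [gamma_i0; 1]
    a = [[exp_variogram(dist(pts[i], pts[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [exp_variogram(dist(p, target)) for p in pts] + [1.0]
    w = solve(a, b)[:n]
    return sum(wi * vi for wi, vi in zip(w, vals))

# resample a 3x3 "90 m" grid at an intermediate point
pts = [(x, y) for x in (0, 90, 180) for y in (0, 90, 180)]
vals = [100 + 0.1 * x + 0.05 * y for x, y in pts]  # simple planar terrain
z = ordinary_kriging(pts, vals, (45.0, 45.0))
```

A production run would of course use a kriging library with a fitted variogram rather than this hand-rolled solver.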


The scale mixtures of the skew-normal (SMSN) distribution form a class of asymmetric thick-tailed distributions that includes the skew-normal (SN) distribution as a special case. The main advantage of this class of distributions is that its members are easy to simulate and have a nice hierarchical representation facilitating easy implementation of the expectation-maximization algorithm for maximum-likelihood estimation. In this paper, we assume an SMSN distribution for the unobserved value of the covariates and a symmetric scale mixture of the normal distribution for the error term of the model. This provides a robust alternative to parameter estimation in multivariate measurement error models. Specific distributions examined include univariate and multivariate versions of the SN, skew-t, skew-slash and skew-contaminated normal distributions. The results and methods are applied to a real data set.
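
The hierarchical representation that makes these distributions easy to simulate can be sketched directly: a skew-normal draw from two independent normals, and a skew-t draw by dividing by the square root of a Gamma mixing variable. The parameter values below are illustrative:

```python
import math
import random

random.seed(7)

def rskew_normal(delta):
    """One draw from a standard skew-normal via the stochastic
    representation Z = delta*|H0| + sqrt(1 - delta^2)*H1, with H0, H1
    independent standard normals and |delta| < 1."""
    h0, h1 = random.gauss(0, 1), random.gauss(0, 1)
    return delta * abs(h0) + math.sqrt(1 - delta * delta) * h1

def rskew_t(delta, nu):
    """Skew-t as a scale mixture of the skew-normal: divide by the
    square root of an independent Gamma(nu/2, rate nu/2) variable."""
    u = random.gammavariate(nu / 2.0, 2.0 / nu)  # (shape, scale)
    return rskew_normal(delta) / math.sqrt(u)

sample = [rskew_normal(0.9) for _ in range(20000)]
mean = sum(sample) / len(sample)
expected = 0.9 * math.sqrt(2 / math.pi)  # E[Z] = delta * sqrt(2/pi)
```

The sample mean matches the known skew-normal mean, which is the kind of check that also validates an EM implementation built on this hierarchy.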


In 2003 the first distance teacher education programme started at Dalarna University, on a small scale compared with today, when a large part of the teacher education is delivered as distance education. From this point of view it seems important to ask the question: how can you become a successful distance student? This paper is based on a case study. Data were collected from earlier research reports, study registers and a group interview. The most important factors appeared to be motivation, life situation, discipline, experience from earlier studies and/or work, and good relations with other students and the university teachers.


Running hydrodynamic models interactively allows both visual exploration and changes to the model state during simulation. One of the main characteristics of an interactive model is that it should provide immediate feedback to the user, for example responding to changes in model state or view settings. For this reason, such features are usually only available for models with a relatively small number of computational cells, which are used mainly for demonstration and educational purposes. It would be useful if interactive modelling also worked for the models typically used in consultancy projects involving large-scale simulations. This poses a number of technical challenges related to the combination of the model itself and the visualisation tools (scalability, implementation of an appropriate API for control of and access to the internal state). While model parallelisation is increasingly addressed by the environmental modelling community, little effort has been spent on developing a high-performance interactive environment. What can we learn from other high-end visualisation domains, such as 3D animation, gaming and virtual globes (Autodesk 3ds Max, Second Life, Google Earth), that also focus on efficient interaction with 3D environments? In these domains high efficiency is usually achieved by the use of computer graphics algorithms such as surface simplification depending on the current view and the distance to objects, and efficient caching of aggregated representations of object meshes. We investigate how these algorithms can be re-used in the context of interactive hydrodynamic modelling without significant changes to the model code, allowing model operation on both multi-core CPU personal computers and high-performance computer clusters.
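
A hypothetical sketch of the distance-based level-of-detail (LOD) selection used in games and virtual globes: cells far from the camera are rendered from a coarser, cached aggregation of the mesh. The function name and thresholds are illustrative assumptions, not taken from any of the tools named above:

```python
def select_lod(distance_to_camera, max_level=4, full_detail_within=100.0):
    """Coarsen the mesh one level each time the viewing distance doubles."""
    level = 0
    threshold = full_detail_within
    while distance_to_camera > threshold and level < max_level:
        level += 1
        threshold *= 2.0
    return level  # 0 = full resolution, max_level = coarsest cached mesh

near = select_lod(50.0)      # full detail close to the camera
mid = select_lod(150.0)      # one simplification level
far = select_lod(1.0e6)      # clamped to the coarsest cached level
```

The same idea transfers to hydrodynamic grids: the model keeps computing on the full mesh, while the renderer walks a precomputed hierarchy of simplified surfaces.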


The majority of biometric researchers focus on the accuracy of matching using biometric databases, including iris databases, while scalability and speed issues have been neglected. In applications such as identification at airports and borders, it is critical for the identification system to have a low response time. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a previously developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition, and its performance on iris databases of various scales. This paper investigates several classifiers widely used in the iris recognition literature, considering response time along with accuracy. The existing Gauss-Laguerre wavelet based iris coding scheme, which shows perfect discrimination with a rotary Hamming distance classifier, is used for iris coding. The performance of the classifiers is compared using small, medium, and large scale databases. The comparison shows that OPF responds faster on the large scale database, thus performing better than the more accurate but slower Bayesian classifier.
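
The rotary Hamming distance matcher mentioned above takes the minimum normalized Hamming distance over circular bit shifts, which compensates for eye rotation between captures. The code length and shift range below are arbitrary toy values, not the paper's:

```python
def rotary_hamming(code_a, code_b, max_shift=8):
    """Minimum normalized Hamming distance over circular shifts of code_b."""
    n = len(code_a)
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        rotated = code_b[s % n:] + code_b[:s % n]
        d = sum(a != b for a, b in zip(code_a, rotated)) / n
        best = min(best, d)
    return best

iris_code = [1, 0, 1, 1, 0, 0, 1, 0] * 4       # 32-bit toy iris code
rotated_code = iris_code[3:] + iris_code[:3]   # same iris, rotated by 3 bits
```

The shift search is what makes exhaustive one-to-many matching expensive at scale, which is exactly the response-time problem the paper targets with OPF.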


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


Methods of recording soil erosion using photographs exist, but they are not commonly considered in scientific studies. Digital images may hold a substantial amount of information that can be extracted quickly in different ways. The investigation of several metrics initially developed for landscape ecology analysis constitutes one such method. In this study we applied a method based on landscape metrics to quantify the spatial configuration of surface micro-topography and erosion-related features, in order to generate a possible complementary tool for environmental management. In a 3.7 m wide and 9.7 m long soil box used during a rainfall simulation study, digital images were systematically acquired at four stages: (a) when the soil was dry; (b) after a short-duration rain for initial wetting; (c) after the first erosive rain; and (d) after the second erosive rain. Thirteen locations were established in the box and digital photos were taken at these locations with the camera positioned at the same orthogonal distance from the soil surface under the same ambient light intensity. The digital photos were converted into binary images and seven landscape metrics were analyzed: percentage of land, number of patches, density of patches, largest patch index, edge density, shape index, and fractal dimension. Digital images were an appropriate tool because they can generate data very quickly. The landscape metrics were sensitive to changes in soil surface micro-morphology, especially after the first erosive rain event, indicating significant erosional feature development between the initial wetting and the first erosive rainfall. The method is considered suitable for studying spatial patterns of soil micro-topography evolution under rainfall events, which bear similarity to landscape-scale pattern evolution driven by eco-hydrological processes. Although much more study is needed to calibrate the landscape metrics at the micro-scale, this study is a step forward in demonstrating the advantages of the method.
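
Two of the listed metrics, number of patches and percentage of land, reduce to connected-component counting on the binary image. A minimal sketch, assuming 4-connectivity and a made-up 4×4 image:

```python
def patch_metrics(grid):
    """Number of patches (4-connected components of 1s) and percentage
    of land (share of 1-cells) for a binary image."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in grid]
    patches = 0
    ones = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                ones += 1
                if not seen[r][c]:
                    patches += 1
                    stack = [(r, c)]  # flood-fill this patch
                    seen[r][c] = True
                    while stack:
                        i, j = stack.pop()
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ni, nj = i + di, j + dj
                            if (0 <= ni < rows and 0 <= nj < cols
                                    and grid[ni][nj] == 1 and not seen[ni][nj]):
                                seen[ni][nj] = True
                                stack.append((ni, nj))
    pland = 100.0 * ones / (rows * cols)
    return patches, pland

image = [[1, 1, 0, 0],
         [0, 0, 0, 1],
         [0, 1, 0, 1],
         [0, 1, 0, 0]]
n_patches, pland = patch_metrics(image)
```

The remaining metrics (edge density, shape index, fractal dimension) build on the same patch labelling by measuring each component's perimeter and area.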


Objective: To verify the effects of exercise intensity deception by the Borg scale on the ratings of perceived exertion (RPE), heart rate (HR) and performance responses during a constant power output open-loop exercise.
Methods: Eight healthy men underwent a maximal incremental test on a cycle ergometer to identify the peak power output (PPO) and the heart rate deflection point (HRDP). Subsequently, they performed a constant power output trial to exhaustion at the HRDP intensity, under deception (DEC) and informed (INF) conditions. In DEC, subjects were told that they would be cycling at an intensity corresponding to two categories below the RPE quantified at the HRDP; in INF, subjects were told that they would cycle at the exact intensity corresponding to the RPE quantified at the HRDP.
Results: The PPO and the power output at the HRDP obtained in the maximal incremental tests were 247.5 ± 32.1 W and 208.1 ± 27.1 W, respectively. No significant difference in time to exhaustion was found between the DEC (525 ± 244 s) and INF (499 ± 224 s) trials. The slope and the first and second measurements of the RPE and HR parameters showed no significant differences between trials.
Conclusions: Psychophysiological variables such as RPE and HR, as well as performance, were not affected when exercise intensity was deceptively manipulated via RPE scores. This may suggest that unaltered RPE during exercise acts as a regulator of performance in this open-loop exercise.


The origin of cosmic rays at all energies is still uncertain. In this paper, we present and explore an astrophysical scenario to produce cosmic rays with energies ranging from below 10^15 to 3 × 10^20 eV. We show here that just our Galaxy and the radio galaxy Cen A, each with their own galactic cosmic-ray particles but with those from the radio galaxy pushed up in energy by a relativistic shock in the jet emanating from the active black hole, are sufficient to describe the most recent data in the PeV to near-ZeV energy range. Data are available over this entire energy range from the KASCADE, KASCADE-Grande, and Pierre Auger Observatory experiments. The energy spectrum calculated here correctly reproduces the measured spectrum beyond the knee and, contrary to widely held expectations, no other extragalactic source population is required to explain the data, even at energies far below the general cutoff expected at 6 × 10^19 eV, the Greisen-Zatsepin-Kuz'min turnoff due to interaction with the cosmic microwave background. We present several predictions for the source population, the cosmic-ray composition, and the propagation to Earth which can be tested in the near future.
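
A toy sketch of the two-component idea: a power-law galactic spectrum with a cutoff near the knee, plus the same spectral shape boosted in energy by a factor standing in for the relativistic shock. None of the numbers are from the paper; they are purely illustrative:

```python
import math

def galactic_flux(e_ev):
    """Toy galactic component: power law with an exponential cutoff
    near the knee (~3e15 eV). Normalization is arbitrary."""
    return e_ev ** -2.7 * math.exp(-e_ev / 3e15)

def boosted_flux(e_ev, boost=100.0, norm=1e-6):
    """Galactic-like component shifted up in energy by `boost`
    (illustrative stand-in for the shock acceleration)."""
    return norm * galactic_flux(e_ev / boost) / boost

def total_flux(e_ev):
    """Sum of the two components."""
    return galactic_flux(e_ev) + boosted_flux(e_ev)

# below the knee the galactic component dominates; at the highest
# energies only the boosted component survives its cutoff
```

The crossover between the two components is what lets a single boosted population cover the region between the knee and the GZK cutoff.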


The current cosmological dark sector (dark matter plus dark energy) is challenging our comprehension of the physical processes taking place in the Universe. Recently, some authors have tried to falsify the basic underlying assumptions of this dark matter-dark energy paradigm. In this Letter, we show that oversimplifications of the measurement process may produce false positives in any consistency test based on the globally homogeneous and isotropic Λ cold dark matter (ΛCDM) model and its expansion history based on distance measurements. In particular, when local inhomogeneity effects due to clumped matter or voids are taken into account, an apparent violation of the basic assumptions (Copernican Principle) seems to be present. Conversely, the amplitude of the deviations also probes the degree of reliability underlying the phenomenological Dyer-Roeder procedure by confronting its predictions with the accuracy of the weak lensing approach. Finally, a new method is devised to reconstruct the effects of the inhomogeneities in a ΛCDM model, and some suggestions of how to distinguish clumpiness (or void) effects from different cosmologies are discussed.
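
The ΛCDM expansion history enters such consistency tests through distance measures. A minimal sketch of the comoving distance in a flat ΛCDM model, with illustrative parameter values (H0 = 70 km/s/Mpc, Ωm = 0.3) rather than values used in the Letter:

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance(z, h0=70.0, om=0.3):
    """Comoving distance in Mpc for flat LCDM: c * integral_0^z dz'/H(z'),
    with H(z) = H0*sqrt(om*(1+z)^3 + (1-om)), by trapezoidal integration."""
    n = 1000
    dz = z / n
    total = 0.0
    for i in range(n + 1):
        zi = i * dz
        hz = h0 * math.sqrt(om * (1.0 + zi) ** 3 + (1.0 - om))
        w = 0.5 if i in (0, n) else 1.0
        total += w / hz
    return C_KM_S * total * dz

d_low = comoving_distance(0.01)  # low-z limit: D ≈ c*z/H0
```

Inhomogeneity corrections of the Dyer-Roeder type modify the distance along a specific line of sight, which is exactly where the apparent violations discussed above can enter.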