942 results for Data Driven Modeling


Relevance:

30.00%

Publisher:

Abstract:

Modeling and predicting co-occurrences of events is a fundamental problem of unsupervised learning. In this contribution we develop a statistical framework for analyzing co-occurrence data in a general setting where elementary observations are joint occurrences of pairs of abstract objects from two finite sets. The main challenge for statistical models in this context is to overcome the inherent data sparseness and to estimate the probabilities for pairs which were rarely observed or even unobserved in a given sample set. Moreover, it is often of considerable interest to extract grouping structure or to find a hierarchical data organization. A novel family of mixture models is proposed that explains the observed data by a finite number of shared aspects or clusters. This provides a common framework for statistical inference and structure discovery and also includes several recently proposed models as special cases. Adopting the maximum likelihood principle, EM algorithms are derived to fit the model parameters. We develop improved versions of EM which largely avoid overfitting problems and overcome the inherent locality of EM-based optimization. Among the broad variety of possible applications, e.g., in information retrieval, natural language processing, data mining, and computer vision, we have chosen document retrieval, the statistical analysis of noun/adjective co-occurrence, and the unsupervised segmentation of textured images to test and evaluate the proposed algorithms.
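The aspect-model EM described in the abstract can be sketched in a few lines. The following is a minimal, illustrative implementation (the function name and the dense-array formulation are assumptions, not the paper's code): the E-step computes responsibilities P(a|x,y) from the current parameters, and the M-step re-estimates P(a), P(x|a), and P(y|a) from expected counts. The smoothed joint estimate assigns nonzero probability even to unobserved pairs, which is the point of the approach for sparse data.

```python
import numpy as np

def aspect_model_em(N, K, iters=50, seed=0):
    """EM for a simple aspect (mixture) model of co-occurrence counts N[x, y]."""
    rng = np.random.default_rng(seed)
    X, Y = N.shape
    p_a = np.full(K, 1.0 / K)                  # P(a)
    p_xa = rng.dirichlet(np.ones(X), size=K)   # P(x|a), shape (K, X)
    p_ya = rng.dirichlet(np.ones(Y), size=K)   # P(y|a), shape (K, Y)
    for _ in range(iters):
        # E-step: responsibilities P(a|x, y), shape (K, X, Y)
        joint = p_a[:, None, None] * p_xa[:, :, None] * p_ya[:, None, :]
        resp = joint / joint.sum(axis=0, keepdims=True)
        # M-step: expected counts attributed to each aspect
        na = resp * N[None, :, :]
        p_a = na.sum(axis=(1, 2)) / N.sum()
        p_xa = na.sum(axis=2) / na.sum(axis=(1, 2))[:, None]
        p_ya = na.sum(axis=1) / na.sum(axis=(1, 2))[:, None]
    # smoothed estimate of P(x, y): rare/unseen pairs get nonzero probability
    return (p_a[:, None, None] * p_xa[:, :, None] * p_ya[:, None, :]).sum(axis=0)
```

Note that this basic EM has the locality problem the abstract mentions: the result depends on the random initialization, which is what the improved EM variants are meant to address.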

Relevance:

30.00%

Publisher:

Abstract:

Tsunoda et al. (2001) recently studied the nature of object representation in monkey inferotemporal cortex using a combination of optical imaging and extracellular recordings. In particular, they examined IT neuron responses to complex natural objects and "simplified" versions thereof. In that study, in 42% of the cases, optical imaging revealed a decrease in the number of activation patches in IT as stimuli were "simplified". However, in 58% of the cases, "simplification" of the stimuli actually led to the appearance of additional activation patches in IT. Based on these results, the authors propose a scheme in which an object is represented by combinations of active and inactive columns coding for individual features. We examine the patterns of activation caused by the same stimuli as used by Tsunoda et al. in our model of object recognition in cortex (Riesenhuber 99). We find that object-tuned units can show a pattern of appearance and disappearance of features identical to the experiment. Thus, the data of Tsunoda et al. appear to be in quantitative agreement with a simple object-based representation in which an object's identity is coded by its similarities to reference objects. Moreover, the agreement of simulations and experiment suggests that the simplification procedure used by Tsunoda et al. (2001) is not necessarily an accurate method to determine neuronal tuning.

Relevance:

30.00%

Publisher:

Abstract:

Existing fuel taxes play a major role in determining the welfare effects of exempting the transportation sector from measures to control greenhouse gases. To study this phenomenon we modify the MIT Emissions Prediction and Policy Analysis (EPPA) model to disaggregate the household transportation sector. This improvement requires an extension of the GTAP data set that underlies the model. The revised and extended facility is then used to compare economic costs of cap-and-trade systems differentiated by sector, focusing on two regions: the USA, where fuel taxes are low, and Europe, where fuel taxes are high. We find that the interplay between carbon policies and pre-existing taxes leads to different results in these regions: in the USA, exemption of transport from such a system would increase the welfare cost of achieving a national emissions target, while in Europe such exemptions would correct pre-existing distortions and reduce the cost.

Relevance:

30.00%

Publisher:

Abstract:

LiDAR (Light Detection and Ranging) technology, based on scanning the terrain with an airborne laser rangefinder, allows the construction of Digital Surface Models (DSM) through simple interpolation, as well as Digital Terrain Models (DTM) through the identification and removal of objects present on the terrain (buildings, bridges, or trees). The Geomatics Laboratory of the Politecnico di Milano - Como Campus - developed a LiDAR data filtering algorithm based on interpolation with bilinear and bicubic splines with Tychonov regularization in a least-squares approach. However, in many cases more refined and complex models are still necessary, in which differentiating between buildings and vegetation becomes mandatory. This may be the case for some hydrological risk prevention models, where vegetation is not needed, or for the three-dimensional modeling of urban centers, where vegetation is a problematic factor. (...)
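The filtering approach rests on spline interpolation with Tychonov (Tikhonov) regularization in a least-squares setting. As a hedged one-dimensional analogue (this is not the Como laboratory's actual algorithm; the basis, penalty, and names are illustrative), one can fit a piecewise-linear spline to scattered heights by solving (AᵀA + λLᵀL)c = Aᵀz, where A is the spline design matrix and L penalizes second differences of the coefficients:

```python
import numpy as np

def hat_design(x, knots):
    """Design matrix of piecewise-linear (hat) basis functions on the knots."""
    x = np.asarray(x, float)
    knots = np.asarray(knots, float)
    A = np.zeros((len(x), len(knots)))
    for j in range(len(knots)):
        if j > 0:  # rising edge from knots[j-1] to knots[j]
            m = (x >= knots[j - 1]) & (x <= knots[j])
            A[m, j] = (x[m] - knots[j - 1]) / (knots[j] - knots[j - 1])
        if j < len(knots) - 1:  # falling edge from knots[j] to knots[j+1]
            m = (x >= knots[j]) & (x < knots[j + 1])
            A[m, j] = (knots[j + 1] - x[m]) / (knots[j + 1] - knots[j])
    return A

def tikhonov_fit(x, z, knots, lam=1.0):
    """Least-squares spline fit with a Tychonov second-difference penalty."""
    A = hat_design(x, knots)
    n = len(knots)
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]  # discrete second difference
    coef = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ np.asarray(z, float))
    return A @ coef, coef
```

Because the penalty vanishes on linear trends, a smooth terrain profile passes through unchanged while high-frequency noise (or, at larger scale, small objects) is damped as λ grows.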

Relevance:

30.00%

Publisher:

Abstract:

This article presents recent WMR (wheeled mobile robot) navigation experiences using local perception knowledge provided by monocular and odometer systems. A narrow local perception horizon is used to plan safe trajectories towards the objective. Monocular data are therefore used to obtain real-time local information by building two-dimensional occupancy grids through time integration of the frames. Path planning is accomplished using attraction potential fields, while trajectory tracking is performed using model predictive control techniques. The approach is tested in indoor situations on the available laboratory platform, a differential-drive mobile robot.
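The planning step above, attraction potential fields over an occupancy grid, can be illustrated with a small sketch. This is a generic formulation with illustrative gains and function names, not the authors' implementation: the goal contributes a quadratic attractive potential, occupied cells contribute a short-range repulsive potential, and a path is obtained by greedy descent on the combined field.

```python
import numpy as np

def potential_field(grid, goal, k_att=1.0, k_rep=50.0, d0=3.0):
    """Combined attraction/repulsion potential over a 2-D occupancy grid (1 = occupied)."""
    h, w = grid.shape
    yy, xx = np.mgrid[0:h, 0:w]
    u_att = 0.5 * k_att * ((yy - goal[0]) ** 2 + (xx - goal[1]) ** 2)
    obs = np.argwhere(grid == 1)
    if len(obs):
        # distance from every cell to the nearest occupied cell
        d = np.sqrt((yy[..., None] - obs[:, 0]) ** 2 +
                    (xx[..., None] - obs[:, 1]) ** 2).min(axis=-1)
        d = np.maximum(d, 1e-6)
        u_rep = np.where(d < d0, 0.5 * k_rep * (1.0 / d - 1.0 / d0) ** 2, 0.0)
    else:
        u_rep = np.zeros_like(u_att)
    return u_att + u_rep

def descend(U, start, steps=200):
    """Greedy 8-connected descent toward the potential minimum."""
    pos, path = tuple(start), [tuple(start)]
    for _ in range(steps):
        nbrs = [(pos[0] + dy, pos[1] + dx)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if 0 <= pos[0] + dy < U.shape[0] and 0 <= pos[1] + dx < U.shape[1]]
        nxt = min(nbrs, key=lambda p: U[p])
        if U[nxt] >= U[pos]:
            break  # local minimum reached
        pos = nxt
        path.append(pos)
    return path
```

The well-known weakness of this formulation is the possibility of local minima between start and goal, which is one reason to pair it with a receding-horizon tracking controller rather than rely on the field alone.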

Relevance:

30.00%

Publisher:

Abstract:

This work extends previously developed research on the use of local model predictive control in differential-drive mobile robots. Experimental results are presented as a way to improve the methodology by considering aspects such as trajectory accuracy and time performance. In this sense, the cost function and the prediction horizon are important aspects to consider. The aim of the present work is to test the control method by measuring trajectory-tracking accuracy and time performance. Moreover, strategies for integration with the perception system and path planning are briefly introduced; monocular image data can be used to plan safe trajectories using goal-attraction potential fields.
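The receding-horizon idea can be illustrated with a minimal sketch for a differential-drive (unicycle) kinematic model: at each control step, candidate velocity pairs are forward-simulated over the prediction horizon and scored by a cost combining tracking error and control effort. The sampled control set, function names, and the weights q and r are illustrative assumptions, not the paper's tuning.

```python
import numpy as np
from itertools import product

def simulate(state, v, w, N, dt=0.1):
    """Forward-simulate unicycle kinematics (x, y, theta) for N steps."""
    x, y, th = state
    traj = []
    for _ in range(N):
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        th += w * dt
        traj.append((x, y))
    return np.array(traj)

def mpc_step(state, ref, N=10, q=1.0, r=0.05):
    """Pick (v, w) minimizing tracking error plus control effort over the horizon."""
    best, best_cost = None, np.inf
    for v, w in product(np.linspace(0.0, 1.0, 11), np.linspace(-1.0, 1.0, 21)):
        traj = simulate(state, v, w, N)
        cost = q * np.sum((traj - ref) ** 2) + r * (v ** 2 + w ** 2)
        if cost < best_cost:
            best, best_cost = (v, w), cost
    return best
```

In a full controller only the first velocity pair is applied before the horizon is re-optimized; the trade-off between q, r, and the horizon length N is exactly the cost-function/prediction-horizon tuning the abstract highlights.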

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a simple Ordered Probit model to analyse the monetary policy reaction function of the Colombian Central Bank. There is evidence that the reaction function is asymmetric, in the sense that the Bank increases the Bank rate when the gap between observed inflation and the inflation target (lagged once) is positive, but it does not reduce the Bank rate when the gap is negative. This behaviour suggests that the Bank is more interested in fulfilling the announced inflation target rather than in reducing inflation excessively. The forecasting performance of the model, both within and beyond the estimation period, appears to be particularly good.
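An ordered probit of this kind can be written down compactly. The sketch below is illustrative, not the paper's exact specification: it assumes a three-outcome coding of the policy rate decision (cut, hold, raise) with monotone cutpoints, and gives the negative log-likelihood for maximum-likelihood estimation with scipy.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def ordered_probit_nll(params, X, y, n_cat=3):
    """Negative log-likelihood of an ordered probit with n_cat ordered outcomes.

    params = [beta..., first cutpoint, log-gaps between remaining cutpoints]
    so the cutpoints are strictly increasing by construction.
    """
    k = X.shape[1]
    beta = params[:k]
    cuts = np.cumsum(np.concatenate([[params[k]], np.exp(params[k + 1:])]))
    xb = X @ beta
    upper = np.concatenate([cuts, [np.inf]])
    lower = np.concatenate([[-np.inf], cuts])
    p = norm.cdf(upper[y] - xb) - norm.cdf(lower[y] - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))
```

With the inflation gap (lagged once) as the regressor, an asymmetry of the kind described in the abstract would show up as the estimated probability of a rate increase responding to positive gaps while the probability of a cut stays flat for negative ones.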

Relevance:

30.00%

Publisher:

Abstract:

During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975-79, Catalonia 1980-89) and after (USA and Catalonia 1990-2001). Survival in Catalonia in the 1980-89 period was worse than in the USA during 1975-79, but the differences disappeared in 1990-2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
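The core transfer step, applying the estimated hazard ratio to the US age- and stage-specific hazard rates and converting back to survival, reduces to a cumulative-hazard computation, S(t) = exp(-Σ h·Δt). A minimal sketch (annual piecewise-constant hazards are assumed; names are illustrative):

```python
import numpy as np

def transfer_survival(h_us, hr, dt=1.0):
    """Apply a hazard ratio to reference (US) hazard rates and return survival.

    h_us : piecewise-constant annual hazard rates for one age/stage group
    hr   : estimated hazard ratio (target region vs. reference)
    """
    h_target = hr * np.asarray(h_us, float)
    # survival from the cumulative hazard: S(t) = exp(-sum of h*dt up to t)
    return np.exp(-np.cumsum(h_target * dt))
```

A hazard ratio above 1 (worse survival in the target region, as in Catalonia 1980-89 versus the USA) uniformly lowers the transferred survival curve.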

Relevance:

30.00%

Publisher:

Abstract:

Atmospheric downwelling longwave radiation (DLR) is an important component of the terrestrial energy budget; since it is strongly related to the greenhouse effect, it remarkably affects the climate. In this study, I evaluate the estimation of the downwelling longwave irradiance at the terrestrial surface for cloudless and overcast conditions using a one-dimensional radiative transfer model (RTM), specifically the Santa Barbara DISORT Atmospheric Radiative Transfer (SBDART) model. The calculations performed with this model were compared with pyrgeometer measurements at three European sites: Girona (NE of the Iberian Peninsula), Payerne (in the east of Switzerland), and Heselbach (in the Black Forest, Germany). Several sensitivity studies based on the radiative transfer model have shown that, for cloudless-sky conditions, special attention must be paid to the input temperature and water-content profiles; for overcast conditions, similar sensitivity studies have shown that, besides the atmospheric profiles, the cloud base height is very relevant, at least for optically thick clouds. The estimation of DLR in places where radiosoundings are not available is also explored, either by using atmospheric profiles spatially interpolated from the gridded analysis data provided by the European Centre for Medium-Range Weather Forecasts (ECMWF), or by applying a real radiosounding from a nearby site. Calculations have been compared with measurements at all sites. During cloudless-sky conditions, when radiosoundings were available, calculations show differences with measurements of -2.7 ± 3.4 W m-2 (Payerne). When no in situ radiosoundings were available, differences between modeling and measurements were about 0.3 ± 9.4 W m-2 (Girona).
During overcast sky conditions, when in situ radiosoundings and cloud properties (derived from an algorithm that uses spectral infrared and microwave ground-based measurements) were available (Black Forest), calculations show differences with measurements of -0.28 ± 2.52 W m-2. When using atmospheric profiles from the ECMWF and fixed values of liquid water path and droplet effective radius (Girona), calculations show differences with measurements of 4.0 ± 2.5 W m-2. For all analyzed sky conditions, it has been confirmed that estimations from radiative transfer modeling are remarkably better than those obtained from simple parameterizations of atmospheric emissivity.
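For context, the "simple parameterizations of atmospheric emissivity" that the RTM outperforms typically estimate clear-sky DLR as ε(T, e)·σT⁴ from screen-level variables alone. A Brunt-type sketch follows; the coefficients are illustrative textbook values and site-dependent, not those evaluated in this study:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m-2 K-4

def dlr_brunt(t_air, e_hpa, a=0.52, b=0.065):
    """Clear-sky downwelling longwave via a Brunt-type effective emissivity.

    t_air : screen-level air temperature (K)
    e_hpa : screen-level water vapour pressure (hPa)
    a, b  : empirical coefficients (illustrative, site-dependent)
    """
    eps = a + b * np.sqrt(e_hpa)
    return eps * SIGMA * t_air ** 4
```

Such formulas use no profile information at all, which is why an RTM fed with realistic temperature and humidity profiles can do markedly better.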

Relevance:

30.00%

Publisher:

Abstract:

Conservation planning requires identifying pertinent habitat factors and locating geographic locations where land management may improve habitat conditions for high priority species. I derived habitat models and mapped predicted abundance for the Golden-winged Warbler (Vermivora chrysoptera), a species of high conservation concern, using bird counts, environmental variables, and hierarchical models applied at multiple spatial scales. My aim was to understand habitat associations at multiple spatial scales and create a predictive abundance map for purposes of conservation planning for the Golden-winged Warbler. My models indicated a substantial influence of landscape conditions, including strong positive associations with total forest composition within the landscape. However, many of the associations I observed were counter to reported associations at finer spatial extents; for instance, I found Golden-winged Warblers negatively associated with several measures of edge habitat. No single spatial scale dominated, indicating that this species is responding to factors at multiple spatial scales. I found Golden-winged Warbler abundance was negatively related with Blue-winged Warbler (Vermivora cyanoptera) abundance. I also observed a north-south spatial trend suggestive of a regional climate effect that was not previously noted for this species. The map of predicted abundance indicated a large area of concentrated abundance in west-central Wisconsin, with smaller areas of high abundance along the northern periphery of the Prairie Hardwood Transition. This map of predicted abundance compared favorably with independent evaluation data sets and can thus be used to inform regional planning efforts devoted to conserving this species.

Relevance:

30.00%

Publisher:

Abstract:

Common Loon (Gavia immer) is considered an emblematic and ecologically important example of aquatic-dependent wildlife in North America. The northern breeding range of Common Loon has contracted over the last century as a result of habitat degradation from human disturbance and lakeshore development. We focused on the state of New Hampshire, USA, where a long-term monitoring program conducted by the Loon Preservation Committee has been collecting biological data on Common Loon since 1976. The Common Loon population in New Hampshire is distributed throughout the state across a wide range of lake-specific habitats, water quality conditions, and levels of human disturbance. We used a multiscale approach to evaluate the association of Common Loon and breeding habitat within three natural physiographic ecoregions of New Hampshire. These multiple scales reflect Common Loon-specific extents such as territories, home ranges, and lake-landscape influences. We developed ecoregional multiscale models and compared them to single-scale models to evaluate model performance in distinguishing Common Loon breeding habitat. Based on information-theoretic criteria, there is empirical support for both multiscale and single-scale models across all three ecoregions, warranting a model-averaging approach. Our results suggest that the Common Loon responds to both ecological and anthropogenic factors at multiple scales when selecting breeding sites. These multiscale models can be used to identify and prioritize the conservation of preferred nesting habitat for Common Loon populations.
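The information-theoretic model averaging described above is commonly implemented with Akaike weights: each model's AIC is converted into a relative weight, and predictions are averaged under those weights. A minimal generic sketch, not tied to the loon models' structure:

```python
import numpy as np

def akaike_weights(aic):
    """Akaike weights: relative support for each model given its AIC score."""
    aic = np.asarray(aic, float)
    delta = aic - aic.min()          # AIC differences relative to the best model
    w = np.exp(-0.5 * delta)
    return w / w.sum()

def model_average(predictions, aic):
    """Model-averaged prediction: each model's output weighted by its Akaike weight."""
    w = akaike_weights(aic)
    return np.tensordot(w, np.asarray(predictions, float), axes=1)
```

When, as here, both multiscale and single-scale models retain empirical support (small AIC differences), no single model dominates the weights and averaging is the defensible choice.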

Relevance:

30.00%

Publisher:

Abstract:

Several global quantities are computed from the ERA40 reanalysis for the period 1958-2001 and explored for trends. These are discussed in the context of changes to the global observing system. Temperature, integrated water vapor (IWV), and kinetic energy are considered. The ERA40 global mean temperature in the lower troposphere has a trend of +0.11 K per decade over the period of 1979-2001, which is slightly higher than the MSU measurements, but within the estimated error limit. For the period 1958-2001 the warming trend is 0.14 K per decade but this is likely to be an artifact of changes in the observing system. When this is corrected for, the warming trend is reduced to 0.10 K per decade. The global trend in IWV for the period 1979-2001 is +0.36 mm per decade. This is about twice as high as the trend determined from the Clausius-Clapeyron relation assuming conservation of relative humidity. It is also larger than results from free climate model integrations driven by the same observed sea surface temperature as used in ERA40. It is suggested that the large trend in IWV does not represent a genuine climate trend but an artifact caused by changes in the global observing system such as the use of SSM/I and more satellite soundings in later years. Recent results are in good agreement with GPS measurements. The IWV trend for the period 1958-2001 is still higher but reduced to +0.16 mm per decade when corrected for changes in the observing systems. Total kinetic energy shows an increasing global trend. Results from data assimilation experiments strongly suggest that this trend is also incorrect and mainly caused by the huge changes in the global observing system in 1979. When this is corrected for, no significant change in global kinetic energy from 1958 onward can be found.
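The per-decade trends quoted above are ordinary least-squares slopes rescaled to a 10-year interval, and a first check for an observing-system artifact is to compare slopes before and after a major change such as 1979. A minimal generic sketch, not ERA40's actual processing:

```python
import numpy as np

def decadal_trend(years, values):
    """Ordinary least-squares linear trend, rescaled to change per decade."""
    slope = np.polyfit(np.asarray(years, float), np.asarray(values, float), 1)[0]
    return slope * 10.0

def trends_around(years, values, breakpoint):
    """Trends per decade before and after a change in the observing system."""
    years = np.asarray(years, float)
    values = np.asarray(values, float)
    pre = years < breakpoint
    return (decadal_trend(years[pre], values[pre]),
            decadal_trend(years[~pre], values[~pre]))
```

A large jump between the two segment trends, without a physical explanation, is the kind of signature that motivates the corrections applied in the reanalysis diagnostics.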

Relevance:

30.00%

Publisher:

Abstract:

The banded organization of clouds and zonal winds in the atmospheres of the outer planets has long fascinated observers. Several recent studies in the theory and idealized modeling of geostrophic turbulence have suggested possible explanations for the emergence of such organized patterns, typically involving highly anisotropic exchanges of kinetic energy and vorticity within the dissipationless inertial ranges of turbulent flows dominated (at least at large scales) by ensembles of propagating Rossby waves. The results from an attempt to reproduce such conditions in the laboratory are presented here. Achievement of a distinct inertial range turns out to require an experiment on the largest feasible scale. Deep, rotating convection on small horizontal scales was induced by gently and continuously spraying dense, salty water onto the free surface of the 13-m-diameter cylindrical tank on the Coriolis platform in Grenoble, France. A “planetary vorticity gradient” or “β effect” was obtained by use of a conically sloping bottom and the whole tank rotated at angular speeds up to 0.15 rad s−1. Over a period of several hours, a highly barotropic, zonally banded large-scale flow pattern was seen to emerge with up to 5–6 narrow, alternating, zonally aligned jets across the tank, indicating the development of an anisotropic field of geostrophic turbulence. Using particle image velocimetry (PIV) techniques, zonal jets are shown to have arisen from nonlinear interactions between barotropic eddies on a scale comparable to either a Rhines or “frictional” wavelength, which scales roughly as (β/Urms)−1/2. This resulted in an anisotropic kinetic energy spectrum with a significantly steeper slope with wavenumber k for the zonal flow than for the nonzonal eddies, which largely follows the classical Kolmogorov k−5/3 inertial range. 
Potential vorticity fields show evidence of Rossby wave breaking and the presence of a “hyperstaircase” with radius, indicating instantaneous flows that are supercritical with respect to the Rayleigh–Kuo instability criterion and in a state of “barotropic adjustment.” The implications of these results are discussed in light of zonal jets observed in planetary atmospheres and, most recently, in the terrestrial oceans.
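The Rhines scaling quoted above, with jets emerging near a wavelength that goes as (U_rms/β)^1/2, implies a quick estimate of how many alternating jets fit across the 13-m tank. A back-of-envelope sketch follows; the parameter values in the test are illustrative, not the experiment's measured ones:

```python
import numpy as np

def rhines_wavelength(u_rms, beta):
    """Rhines scale L_R ~ (U_rms / beta)**0.5, where zonal jets are expected to form.

    u_rms : rms eddy velocity (m/s)
    beta  : planetary vorticity gradient (1/(m s)), here set by the sloping bottom
    """
    return np.sqrt(u_rms / beta)

def expected_jets(tank_diameter, u_rms, beta):
    """Rough count of alternating jets fitting across the tank."""
    return tank_diameter / rhines_wavelength(u_rms, beta)
```

The need for a jet spacing well below the tank diameter, while keeping an inertial range above the forcing scale, is why an experiment of this size was required.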

Relevance:

30.00%

Publisher:

Abstract:

Two wavelet-based control variable transform schemes are described and are used to model some important features of forecast error statistics for use in variational data assimilation. The first is a conventional wavelet scheme and the other is an approximation of it. Their ability to capture the position and scale-dependent aspects of covariance structures is tested in a two-dimensional latitude-height context. This is done by comparing the covariance structures implied by the wavelet schemes with those found from the explicit forecast error covariance matrix, and with a non-wavelet-based covariance scheme used currently in an operational assimilation scheme. Qualitatively, the wavelet-based schemes show potential at modeling forecast error statistics well without giving preference to either position or scale-dependent aspects. The degree of spectral representation can be controlled by changing the number of spectral bands in the schemes, and the least number of bands that achieves adequate results is found for the model domain used. Evidence is found of a trade-off between the localization of features in positional and spectral spaces when the number of bands is changed. By examining implied covariance diagnostics, the wavelet-based schemes are found, on the whole, to give results that are closer to diagnostics found from the explicit matrix than from the nonwavelet scheme. Even though the nature of the covariances has the right qualities in spectral space, variances are found to be too low at some wavenumbers and vertical correlation length scales are found to be too long at most scales. The wavelet schemes are found to be good at resolving variations in position and scale-dependent horizontal length scales, although the length scales reproduced are usually too short. The second of the wavelet-based schemes is often found to be better than the first in some important respects, but, unlike the first, it has no exact inverse transform.

Relevance:

30.00%

Publisher:

Abstract:

The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggests it as a promising tool for global monitoring of atmospheric temperature change. With the aim to investigate and quantify how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001-2025: one control run with natural variability only and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On the basis of this, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the "observed" climatology and the "true" climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001-2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment.
They indicated that observational and sampling errors (both contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS so that optimized climate monitoring shall combine information from all climatic key variables retrievable from GNSS RO data.