49 results for Model-Data Integration and Data Assimilation
Abstract:
Report on the scientific sojourn at Simon Fraser University, Canada, from July to September 2007. General context: landscape change in recent years is having significant impacts on biodiversity in many Mediterranean areas. Land abandonment, urbanisation and especially fire are profoundly transforming large areas in the Western Mediterranean basin, and we know little about how these changes influence species distribution, and in particular how species will respond to further change in a context of global change, including climate change. General objectives: to integrate landscape and population dynamics models in a platform that allows capturing species distribution responses to landscape changes and assessing the impact on species distribution of different scenarios of further change. Specific objective 1: to develop a landscape dynamic model capturing fire and forest succession dynamics in Catalonia, linked to a stochastic landscape occupancy model (SLOM) (or spatially explicit population model, SEPM) for the Ortolan bunting, a species strongly linked to fire-related habitat in the region. Predictions from the occupancy or spatially explicit population Ortolan bunting model (SEPM) should be evaluated using data from the DINDIS database, which tracks bird colonisation of recently burnt large areas (>50 ha). Through a number of different SEPM scenarios with different values for a number of parameters, we should be able to assess different hypotheses about the factors driving bird colonisation of newly burnt patches. These factors are mainly landscape context (i.e. difficulty of reaching the patch, and potential presence of coloniser sources), dispersal constraints, type of regenerating vegetation after fire, and species characteristics (niche breadth, etc.).
Abstract:
The remarkable increases in trade flows and in migratory flows of highly educated people are two important features of the globalization of the last decades. This paper extends a two-country model of inter- and intra-industry trade to a rich environment featuring technological differences, skill differences and the possibility of international labor mobility. The model is used to explain the patterns of trade and migration as countries remove barriers to trade and to labor mobility. We parameterize the model to match the features of the Western and Eastern European members of the EU and analyze first the effects of the trade liberalization which occurred between 1989 and 2004, and then the gains and losses from the migration which is expected to occur if legal barriers to labor mobility are substantially reduced. Lower barriers to migration would result in significant migration of skilled workers from Eastern European countries. Interestingly, this would not only benefit the migrants and most Western European workers but, via trade, it would also benefit the workers remaining in Eastern Europe. Key Words: Skilled Migration, Gains from Variety, Real Wages, Eastern-Western Europe. JEL Codes: F12, F22, J61.
Abstract:
This paper examines a dataset which is modeled well by the Poisson-Log Normal process and by this process mixed with Log Normal data, which are both turned into compositions. This generates compositional data that has zeros without any need for conditional models or assuming that there is missing or censored data that needs adjustment. It also enables us to model dependence on covariates and within the composition.
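The zero-generating mechanism described above can be sketched as follows; the parameter values are hypothetical, and the closure step (dividing counts by their total) is the standard way of turning counts into a composition:

```python
import numpy as np

rng = np.random.default_rng(42)

def pln_composition(mu, sigma, n_samples):
    """Sample compositions from a Poisson-log-normal process.

    mu, sigma: mean and std-dev vectors of the underlying log-normal rates.
    Returns an (n_samples, D) array of compositions (rows summing to 1),
    which may contain exact zeros from the Poisson draw.
    """
    rates = rng.lognormal(mean=mu, sigma=sigma, size=(n_samples, len(mu)))
    counts = rng.poisson(rates)              # integer counts; zeros are natural
    totals = counts.sum(axis=1, keepdims=True)
    totals[totals == 0] = 1                  # guard against all-zero rows
    return counts / totals

comps = pln_composition(mu=np.array([0.0, 1.0, 2.0]),
                        sigma=np.array([0.5, 0.5, 0.5]),
                        n_samples=1000)
print((comps == 0).mean())   # a non-trivial fraction of components are exact zeros
```

No conditional model or censoring adjustment is needed: zeros are produced by the count stage itself, before closure.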
Abstract:
Background: The analysis and usage of biological data is hindered by the spread of information across multiple repositories and the difficulties posed by different nomenclature systems and storage formats. In particular, there is an important need for data unification in the study and use of protein-protein interactions. Without good integration strategies, it is difficult to analyze the whole set of available data and its properties. Results: We introduce BIANA (Biologic Interactions and Network Analysis), a tool for biological information integration and network management. BIANA is a Python framework designed to achieve two major goals: i) the integration of multiple sources of biological information, including biological entities and their relationships, and ii) the management of biological information as a network where entities are nodes and relationships are edges. Moreover, BIANA uses properties of proteins and genes to infer latent biomolecular relationships by transferring edges to entities sharing similar properties. BIANA is also provided as a plugin for Cytoscape, which allows users to visualize and interactively manage the data. A web interface to BIANA providing basic functionalities is also available. The software can be downloaded under GNU GPL license from http://sbi.imim.es/web/BIANA.php. Conclusions: BIANA's approach to data unification solves many of the nomenclature issues common to systems dealing with biological data. BIANA can easily be extended to handle new specific data repositories and new specific data types. The unification protocol allows BIANA to be a flexible tool suitable for different user requirements: non-expert users can use a suggested unification protocol while expert users can define their own specific unification rules.
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time that public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature, but that has grown in importance in recent years. Using administrative and household-level data we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as the early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
Abstract:
This paper studies the generation and transmission of international cycles in a multi-country model with production and consumption interdependencies. Two sources of disturbance are considered and three channels of propagation are compared. In the short run the contemporaneous correlation of disturbances determines the main features of the transmission. In the medium run production interdependencies account for the transmission of technology shocks and consumption interdependencies account for the transmission of government shocks. Technology disturbances, which are mildly correlated across countries, are more successful than government expenditure disturbances in reproducing actual data. The model also accounts for the low cross-country consumption correlations observed in the data.
Abstract:
Understanding the mechanism through which financial globalization affects economic performance is crucial for evaluating the costs and benefits of opening financial markets. This paper is a first attempt at disentangling the effects of financial integration on the two main determinants of economic performance: productivity (TFP) and investment. I provide empirical evidence from a sample of 93 countries observed between 1975 and 1999. The results suggest that financial integration has a positive direct effect on productivity, while it spurs capital accumulation only with some delay and indirectly, since capital follows the rise in productivity. I control for indirect effects of financial globalization through banking crises. Such episodes depress both investment and TFP, though they are triggered by financial integration only to a minor extent. The paper also provides a discussion of a simple model of the effects of financial integration, and shows additional empirical evidence supporting it.
Abstract:
In this article we examine the potential effect of market structure on hospital technical efficiency as a measure of performance, controlling for ownership and regulation. This study is relevant for evaluating the potential effects of recommended and initiated deregulation policies intended to promote market reforms in the context of a European National Health Service. Our goal was reached through three main empirical stages. Firstly, using patient origin data from hospitals in the region of Catalonia in 1990, we estimated geographic hospital markets through the Elzinga-Hogarty approach, based on patient flows. Then we measured the market level of concentration using the Herfindahl-Hirschman index. Secondly, technical and scale efficiency scores for each hospital were obtained by specifying a Data Envelopment Analysis. According to the data, nearly two-thirds of the hospitals operate under the production frontier, with an average efficiency score of 0.841. Finally, the determinants of the efficiency scores were investigated using a censored regression model. Special attention was paid to testing the hypothesis that there is an efficiency improvement in more competitive markets. The results suggest that the number of competitors in the market contributes positively to technical efficiency, and there is some evidence that the differences in efficiency scores are attributable to several environmental factors such as ownership, market structure and regulation effects.
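As a minimal sketch of the concentration measure used in the first empirical stage, the Herfindahl-Hirschman index is the sum of squared market shares over the hospitals in a market; the discharge volumes below are hypothetical:

```python
# Herfindahl-Hirschman index on the conventional 0-10000 scale.
def herfindahl(shares):
    """HHI for market shares given as fractions summing to 1."""
    return 10000 * sum(s * s for s in shares)

discharges = [5000, 3000, 2000]        # hypothetical hospital volumes in one market
total = sum(discharges)
shares = [d / total for d in discharges]
print(herfindahl(shares))              # about 3800 for this example
```

Lower HHI values indicate less concentrated, more competitive markets, which is the dimension the efficiency regressions in the abstract condition on.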
Abstract:
Can we reconcile the predictions of the altruism model of the family with the evidence on parental monetary transfers in the US? This paper provides a new assessment of this question. I expand the altruism model by introducing effort of the child and by relaxing the assumption of perfect information of the parent about the labor market opportunities of the child. First, I solve and simulate a model of altruism and labor supply under imperfect information. Second, I use cross-sectional data to test the following prediction of the model: are parental transfers especially responsive to the income variations of children who are very attached to the labor market? The results of the analysis suggest that imperfect information accounts for many of the patterns of intergenerational transfers in the US.
Abstract:
Different climatic simulations have been obtained by using a 2-dimensional horizontal energy balance model (EBM), which has been constrained to satisfy several extremal principles on dissipation and convection. Moreover, two different versions of the model, with fixed and with variable cloud cover, have been used. The assumption of an extremal type of behaviour for the climatic system can acquire additional support depending on the similarities found with measured data for past conditions as well as with usual projections for possible future scenarios.
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets to retrieve the vertical profile of the refractive index gradient in the boundary layer is presented, and their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset to reproduce the local propagation conditions accurately, while lower-resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Model) are able to capture a tendency towards superrefraction but not to detect ducting conditions. Observing the ray tracing of the centre, lower and upper limits of the radar antenna 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
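Refractivity-gradient retrievals like those compared above are typically judged against standard threshold values. A minimal sketch of such a classifier, using the commonly quoted thresholds in N-units per km (these thresholds come from standard radio-propagation practice, not from the abstract itself):

```python
# Classify radio propagation conditions from the vertical refractivity
# gradient dN/dh (N-units per km). Commonly used thresholds:
#   dN/dh <= -157  -> ducting
#   -157 < dN/dh <= -79 -> superrefraction
#   -79  < dN/dh <= 0   -> normal refraction
#   dN/dh > 0           -> subrefraction
def propagation_class(dN_dh):
    if dN_dh <= -157:
        return "ducting"
    if dN_dh <= -79:
        return "superrefraction"
    if dN_dh <= 0:
        return "normal"
    return "subrefraction"

print(propagation_class(-40))    # "normal" (close to the standard-atmosphere gradient)
print(propagation_class(-200))   # "ducting"
```

Under this scheme, the abstract's finding reads as: the NWP model profiles reach the superrefraction band but rarely cross the ducting threshold that the radiosoundings detect.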
Abstract:
Weather radar observations are currently the most reliable method for remote sensing of precipitation. However, a number of factors affect the quality of radar observations and may seriously limit automated quantitative applications of radar precipitation estimates, such as those required in Numerical Weather Prediction (NWP) data assimilation or in hydrological models. In this paper, a technique to correct two different problems typically present in radar data is presented and evaluated. The aspects dealt with are non-precipitating echoes - caused either by permanent ground clutter or by anomalous propagation of the radar beam (anaprop echoes) - and topographical beam blockage. The correction technique is based on the computation of realistic beam propagation trajectories from recent radiosonde observations instead of assuming standard radio propagation conditions. The correction consists of three different steps: 1) calculation of a Dynamic Elevation Map, which provides the minimum clutter-free antenna elevation for each pixel within the radar coverage; 2) correction for residual anaprop, checking the vertical reflectivity gradients within the radar volume; and 3) topographical beam blockage estimation and correction using a geometric optics approach. The technique is evaluated with four case studies in the region of the Po Valley (N Italy) using a C-band Doppler radar and a network of raingauges providing hourly precipitation measurements. The case studies cover different seasons, different radio propagation conditions, and both stratiform and convective precipitation events. After applying the proposed correction, a comparison of the radar precipitation estimates with raingauges indicates a general reduction in both the root mean squared error and the fractional error variance, indicating the efficiency and robustness of the procedure. Moreover, the technique presented is not computationally expensive, so it seems well suited to implementation in an operational environment.
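For contrast with the radiosonde-based trajectories the technique computes, here is a minimal sketch of the standard-conditions baseline it replaces: beam centre height under the effective Earth radius (k = 4/3) model. The function name and parameter choices are illustrative, not from the paper.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def beam_height(range_m, elev_deg, k=4.0 / 3.0, antenna_height_m=0.0):
    """Height (m) of the beam centre at slant range range_m and elevation
    elev_deg, assuming standard refraction via the k*a effective Earth radius."""
    ka = k * EARTH_RADIUS_M
    theta = math.radians(elev_deg)
    return (math.sqrt(range_m**2 + ka**2 + 2.0 * range_m * ka * math.sin(theta))
            - ka + antenna_height_m)

print(round(beam_height(100_000.0, 0.5)))   # roughly 1.46 km at 100 km, 0.5 deg
```

When the actual refractivity gradient departs from standard conditions (superrefraction, ducting), the real beam sits below this estimate, which is precisely why the paper's Dynamic Elevation Map is recomputed from recent radiosonde profiles.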
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hour (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected outputs from the model (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure has the ability to integrate the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) to obtain a robust methodology for implementing it in an operational and dynamic way.
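The dynamic weighting in the first two hours could be sketched as follows; the specific weighting scheme and quality indexes here are hypothetical, illustrating only the structure the abstract describes (radar-weighted early, model-only from t+3 h):

```python
# Hypothetical blend of radar advection (ADV) and phase-corrected model
# output (MCO): weights decay/grow linearly with lead time and are scaled
# by per-source quality indexes q_adv, q_mco in [0, 1].
def blend(adv_rain, mco_rain, q_adv, q_mco, lead_h):
    """Weighted QPF blend for t+1 h and t+2 h; model-only from t+3 h."""
    if lead_h >= 3:
        return mco_rain                    # radar skill too low beyond t+2 h
    w_adv = q_adv * (1.0 - lead_h / 3.0)   # radar weight fades with lead time
    w_mco = q_mco * (lead_h / 3.0)         # model weight grows with lead time
    total = w_adv + w_mco
    if total == 0:
        return mco_rain
    return (w_adv * adv_rain + w_mco * mco_rain) / total

print(blend(adv_rain=4.0, mco_rain=2.0, q_adv=0.9, q_mco=0.6, lead_h=1))
```

At t+1 h the radar advection dominates, so the blended value lies closer to the ADV estimate; by t+3 h the output is purely the model forecast.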
Abstract:
A simple holographic model is presented and analyzed that describes chiral symmetry breaking and the physics of the meson sector in QCD. This is a bottom-up model that incorporates string theory ingredients like tachyon condensation, which is expected to be the main manifestation of chiral symmetry breaking in the holographic context. As a model for glue, the Kuperstein-Sonnenschein background is used. The structure of the flavor vacuum is analyzed in the quenched approximation. Chiral symmetry breaking is shown at zero temperature. Above the deconfinement transition chiral symmetry is restored. A complete holographic renormalization is performed and the chiral condensate is calculated for different quark masses both at zero and non-zero temperatures. The 0++, 0-+, 1++ and 1-- meson trajectories are analyzed and their masses and decay constants are computed. The asymptotic trajectories are linear. The model has one phenomenological parameter beyond those of QCD that affects the 1++ and 0-+ sectors. Fitting this parameter we obtain very good agreement with data. The model improves in several ways the popular hard-wall and soft-wall bottom-up models.