998 results for Multidimensional modelling


Relevance:

20.00%

Publisher:

Abstract:

This paper establishes a general framework for metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables while preserving all the good properties of principal axis methods such as principal components and correspondence analysis, based on the singular-value decomposition, including the decomposition of variance into components along principal axes, which provides the numerical diagnostics known as contributions. The idea is inspired by the chi-square distance in correspondence analysis, which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters that are estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path of decomposing a matrix and displaying its rows and columns in biplots.
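As a minimal sketch of the classical path described above (with the column weights fixed rather than estimated; the weight-estimation step is the paper's contribution), weighting each variable amounts to scaling its column by the square root of its weight before the singular-value decomposition. The data matrix and weights below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))        # individuals-by-variables data matrix
X = X - X.mean(axis=0)              # centre each variable

w = np.array([1.0, 0.5, 2.0, 1.0])  # hypothetical per-variable weights

# Weighted distances between individuals equal ordinary Euclidean
# distances after scaling each column by sqrt(weight)
Xw = X * np.sqrt(w)

# Classical path: SVD of the weighted matrix yields biplot coordinates
U, s, Vt = np.linalg.svd(Xw, full_matrices=False)
rows = U * s       # principal coordinates of individuals
cols = Vt.T        # directions of the (weighted) variables

# Decomposition of variance along principal axes ("contributions")
variance_along_axes = s**2 / (s**2).sum()
```

Because `Vt` has orthonormal rows, distances among the row coordinates reproduce the weighted distances among individuals exactly.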


This paper evaluates new evidence on price setting practices and inflation persistence in the euro area with respect to its implications for macro modelling. It argues that several of the most commonly used assumptions in micro-founded macro models are seriously challenged by the new findings.


This paper examines properties of optimal poverty assistance programs under different informational environments using an income maintenance framework. To that end, we make both the income-generating ability and the disutility of labor of individuals unobservable, and compare the resulting benefit schedules with those of programs found in the United States since Welfare Reform (1996). We find that optimal programs closely resemble a Negative Income Tax with a benefit reduction rate that depends on the distribution of population characteristics. A policy of workfare (unpaid public sector work) is inefficient when disutility of labor is unobservable, but minimum work requirements (for paid work) may be used in that same environment. The distortions to work incentives and the presence of minimum work requirements depend on the observability and relative importance of the population's characteristics.
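A Negative Income Tax schedule of the kind the optimal programs resemble can be sketched in a few lines: a guarantee is phased out at the benefit reduction rate until the break-even income. The guarantee and rate below are hypothetical, chosen only for illustration:

```python
def nit_benefit(earnings, guarantee=8000.0, reduction_rate=0.5):
    """Negative Income Tax transfer: the guarantee is phased out at the
    benefit reduction rate; benefit hits zero at guarantee/reduction_rate."""
    return max(0.0, guarantee - reduction_rate * earnings)

# A household with no earnings receives the full guarantee;
# at the break-even income (16000 here) the transfer is zero.
full = nit_benefit(0)          # 8000.0
partial = nit_benefit(4000)    # 6000.0
none = nit_benefit(20000)      # 0.0
```

In the paper the reduction rate is not a free constant but depends on the distribution of population characteristics; this sketch only shows the shape of the schedule.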


Youth is one of the phases in the life-cycle when some of the most decisive life transitions take place. Entering the labour market or leaving the parental home are events with important consequences for the economic well-being of young adults. In this paper, the interrelationship between employment, residential emancipation and poverty dynamics is studied for eight European countries by means of an econometric model with feedback effects. Results show that genuine state dependence in youth poverty is positive and highly significant. The evidence shows a strong causal effect between poverty and leaving home in Scandinavian countries; however, time in economic hardship there does not last long. In Southern Europe, instead, youth tend to leave their parental home much later in order to avoid falling into a poverty state that is more persistent. Past poverty has negative consequences on the likelihood of employment.
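The notion of state dependence can be illustrated with a toy first-order Markov simulation: if the probability of being poor is higher for those who were poor last period, the raw state dependence is positive. The transition probabilities below are hypothetical, and note that this sketch shows only *raw* dependence; separating genuine state dependence from unobserved heterogeneity is what requires the econometric model with feedback effects used in the paper:

```python
import random

random.seed(42)
P_ENTER, P_STAY = 0.10, 0.45   # hypothetical entry and persistence probabilities

# Simulate one long poverty history as a first-order Markov chain
T = 50000
history = [False]
for _ in range(T):
    p = P_STAY if history[-1] else P_ENTER
    history.append(random.random() < p)

n_poor = sum(history[:-1])
# Raw state dependence: P(poor_t | poor_{t-1}) - P(poor_t | non-poor_{t-1})
stay = sum(b for a, b in zip(history, history[1:]) if a) / n_poor
enter = sum(b for a, b in zip(history, history[1:]) if not a) / (T - n_poor)
state_dependence = stay - enter
```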


Despite the importance of supplier inducement and brand loyalty in the drug purchasing process, little empirical evidence is to be found with regard to the influence that these factors exert on patients' decisions. Under the new scenario of easier access to information, patients are becoming more demanding and even go as far as questioning their physician's prescription. Furthermore, new regulation also encourages patients to adopt an active role in the decision between brand-name and generic drugs. Using a stated preference model based on a choice survey, I have found evidence of how significant the physician's prescription and the pharmacist's recommendation become throughout the drug purchase process, and to what extent brand loyalty influences the final decision. As far as we are aware, this paper is the first to explicitly take consumers' preferences into account rather than focusing on the behavior of health professionals.
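Stated preference choice data of this kind are typically analysed with a logit-style choice model, in which each alternative's attributes enter a linear utility and choice probabilities follow a softmax. The attributes and coefficient values below are entirely hypothetical, not the paper's estimates:

```python
import math

# Hypothetical utility coefficients for a conditional-logit-style model
BETA = {"price": -0.08, "prescribed": 1.2, "recommended": 0.6, "brand": 0.9}

def choice_probs(alternatives):
    """Logit choice probabilities over a set of drug alternatives."""
    utils = [sum(BETA[k] * v for k, v in alt.items()) for alt in alternatives]
    m = max(utils)                       # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utils]
    z = sum(exps)
    return [e / z for e in exps]

brand   = {"price": 12.0, "prescribed": 1, "recommended": 0, "brand": 1}
generic = {"price": 6.0,  "prescribed": 0, "recommended": 1, "brand": 0}
p_brand, p_generic = choice_probs([brand, generic])
```

With these made-up coefficients the physician's prescription and brand loyalty outweigh the price advantage and the pharmacist's recommendation, so the brand-name drug gets the larger probability.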


Context There are no evidence syntheses available to guide clinicians on when to titrate antihypertensive medication after initiation. Objective To model the blood pressure (BP) response after initiating antihypertensive medication. Data sources Electronic databases including Medline, Embase, the Cochrane Register and reference lists up to December 2009. Study selection Trials that initiated antihypertensive medication as single therapy in hypertensive patients who were either drug naive or had a placebo washout from previous drugs. Data extraction Office BP measurements at a minimum of two-weekly intervals for a minimum of 4 weeks. An asymptotic model of BP response was assumed, and non-linear mixed effects modelling was used to calculate model parameters. Results Eighteen trials that recruited 4168 patients met the inclusion criteria. The time to reach 50% of the maximum estimated BP-lowering effect was about 1 week (systolic 0.91 weeks, 95% CI 0.74 to 1.10; diastolic 0.95, 0.75 to 1.15). Models incorporating drug class as a source of variability did not improve the fit to the data. Incorporating the presence of a titration schedule improved model fit for both systolic and diastolic pressure. Titration increased both the predicted maximum effect and the time taken to reach 50% of the maximum (systolic 1.2 vs 0.7 weeks; diastolic 1.4 vs 0.7 weeks). Conclusions Estimates of the maximum efficacy of antihypertensive agents can be made early after starting therapy. This knowledge will guide clinicians in deciding whether a newly started antihypertensive agent is likely to be effective at controlling BP.
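One common asymptotic form (not necessarily the exact parameterization used in the review) lets the effect approach its maximum hyperbolically, reaching exactly half the maximum at a characteristic time. Here `et50 = 0.91` weeks is the pooled systolic estimate quoted above, while `e_max` (the maximum BP fall in mmHg) is a hypothetical value:

```python
def bp_effect(t_weeks, e_max=10.0, et50=0.91):
    """Asymptotic BP response: the fall reaches the fraction
    t/(et50 + t) of the maximum effect e_max by week t.
    e_max is hypothetical; et50 is the pooled systolic estimate."""
    return e_max * t_weeks / (et50 + t_weeks)

# At t = et50 the predicted fall is exactly half the maximum effect,
# and the response flattens out thereafter
half = bp_effect(0.91)      # 5.0 with e_max = 10.0
late = bp_effect(12.0)      # close to e_max
```

The practical point matches the conclusion: most of the achievable effect is predictable within the first weeks, so a drug that has produced little response by then is unlikely to reach an adequate effect later.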


Designing an efficient sampling strategy is of crucial importance for habitat suitability modelling. This paper compares four such strategies, namely 'random', 'regular', 'proportional-stratified' and 'equal-stratified', to investigate (1) how they affect prediction accuracy and (2) how sensitive they are to sample size. To compare them, a virtual species approach (Ecol. Model. 145 (2001) 111) in a real landscape, based on reliable data, was chosen. The distribution of the virtual species was sampled 300 times using each of the four strategies at four sample sizes. The sampled data were then fed into a GLM to make two types of prediction: (1) habitat suitability and (2) presence/absence. Comparing the predictions to the known distribution of the virtual species allows model accuracy to be assessed. Habitat suitability predictions were assessed by Pearson's correlation coefficient and presence/absence predictions by Cohen's K agreement coefficient. The results show the 'regular' and 'equal-stratified' sampling strategies to be the most accurate and most robust. We propose the following to improve sample design: (1) increase sample size, (2) prefer systematic to random sampling and (3) include environmental information in the design.
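The four strategies can be sketched on a one-dimensional grid of cells; the stratum layout and sample size below are hypothetical. 'Random' draws cells uniformly, 'regular' takes every k-th cell, 'equal-stratified' draws the same number per stratum, and 'proportional-stratified' draws in proportion to stratum area:

```python
import random

random.seed(0)
N, SAMPLE = 100, 20                     # cells on a transect, sample size
strata = [0] * 50 + [1] * 30 + [2] * 20  # hypothetical habitat strata

random_s = random.sample(range(N), SAMPLE)          # 'random'
regular_s = list(range(0, N, N // SAMPLE))          # 'regular': every 5th cell

def stratified(equal):
    """'equal-stratified' (same n per stratum) or
    'proportional-stratified' (n proportional to stratum area)."""
    counts = {s: strata.count(s) for s in set(strata)}
    out = []
    for s, n in counts.items():
        k = SAMPLE // len(counts) if equal else round(SAMPLE * n / N)
        cells = [i for i, v in enumerate(strata) if v == s]
        out += random.sample(cells, k)
    return out

equal_strat = stratified(True)
prop_strat = stratified(False)
```

Note that equal stratification oversamples rare strata relative to their area, which is precisely what makes it (like regular sampling) spread observations more evenly across environmental space.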


Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected by two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets for retrieving the vertical profile of the refractive index gradient in the boundary layer is presented; in particular, their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High resolution radiosounding data are identified as the best available dataset to reproduce the local propagation conditions accurately, while lower resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Model) are able to capture a tendency towards superrefraction but not to detect ducting conditions. Observing the ray tracing of the centre, lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
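A common single-layer summary of the ray paths that such multilayer models trace is the effective-Earth-radius approximation, in which standard refraction corresponds to a factor of 4/3 and superrefraction to a larger factor that bends the beam towards the ground. This is only a sketch of the standard geometry, not the paper's multilayer model; the range, elevation, and superrefractive factor below are illustrative values:

```python
import math

A_EARTH = 6371.0e3   # Earth radius in metres

def beam_height(r, elev_deg, k_e=4/3, h0=0.0):
    """Beam-centre height (m) above the radar at slant range r (m),
    under the effective-Earth-radius model; k_e = 4/3 is standard
    propagation, larger k_e mimics superrefraction."""
    re = k_e * A_EARTH
    th = math.radians(elev_deg)
    return math.sqrt(r**2 + re**2 + 2.0 * r * re * math.sin(th)) - re + h0

# At 100 km range and 0.5 deg elevation, superrefraction (illustrative
# k_e = 2) keeps the beam lower than standard propagation does
h_std = beam_height(100e3, 0.5)
h_sup = beam_height(100e3, 0.5, k_e=2.0)
```

Applying the same formula to the lower and upper 3-dB limits of the beam (by shifting `elev_deg` by half the beamwidth) shows how a ducting layer changes the measured volume.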


Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for improving the model profiles in the lowest levels of the troposphere.
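The first step of such a physically based prediction — turning NWP temperature, humidity and pressure fields into an anaprop indicator — often uses the modified refractivity M(z) = N(z) + 0.157·z (z in metres): a layer in which M decreases with height (equivalently dN/dz < −157 N-units/km) traps the beam and can produce ducting clutter. The profile below is a hypothetical example, not model output:

```python
def find_ducts(z, n_units):
    """Return (z_bottom, z_top) layers where modified refractivity
    M = N + 0.157*z decreases with height, i.e. ducting layers."""
    m = [n + 0.157 * h for n, h in zip(n_units, z)]
    return [(z[i], z[i + 1]) for i in range(len(z) - 1) if m[i + 1] < m[i]]

# Hypothetical profile: a sharp N decrease between 100 m and 200 m
# (dN/dz = -300 N-units/km, well below the -157 ducting threshold)
z = [0, 100, 200, 300]          # heights in metres
n_units = [320, 310, 280, 276]  # refractivity N at each height
ducts = find_ducts(z, n_units)
```

In the application described above, the expensive PEM propagation step would then be run only where such layers are diagnosed, which is one way a hybrid scheme keeps the computation near real-time.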


High N concentrations in biosolids are one of the strongest reasons for their agricultural use. However, it is essential to understand the fate of N in soils treated with biosolids, both for plant nutrition and for managing the environmental risk of NO3-N leaching. This work aimed at evaluating the risk of NO3-N leaching from a Spodosol and an Oxisol, each treated with 0.5-8.0 dry Mg ha-1 of fresh tertiary sewage sludge, composted biosolids, limed biosolids, heat-dried biosolids or solar-irradiated biosolids. Results indicated that under similar application rates NO3-N accumulated up to three times more in the 20 cm topsoil of the Oxisol than of the Spodosol. However, the higher water content held at field capacity in the Oxisol compensated for the greater nitrate concentrations. A 20% NO3-N loss from the root zone in the amended Oxisol could be expected. Depending on the biosolids type, 42 to 76% of the NO3-N accumulated in the Spodosol could be expected to leach from the amended 20 cm topsoil. The NO3-N expected to leach from the Spodosol ranged from 0.8 times (composted sludge) to 3.5 times (limed sludge) the amounts leaching from the Oxisol treated alike. Nevertheless, the risk of NO3-N groundwater contamination as a result of a single biosolids land application at 0.5-8.0 dry Mg ha-1 could be considered low.


This research describes the process followed to assemble a Social Accounting Matrix for Spain for the year 2000 (SAMSP00). As argued in the paper, this process attempts to reconcile ESA95 conventions with the requirements of applied general equilibrium modelling. In particular, problems related to the level of aggregation of net taxation data and to the valuation system used for expressing the monetary value of input-output transactions have received special attention. Since the adoption of ESA95 conventions, input-output transactions have preferably been valued at basic prices, which imposes additional difficulties on modellers interested in computing applied general equilibrium models. This paper addresses these difficulties by developing a procedure that allows SAM-builders to change the valuation system of input-output transactions conveniently. In addition, this procedure produces new data related to net taxation information.
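The core accounting idea behind changing the valuation system can be sketched on a toy two-sector table: stripping net taxes on products out of each purchasers'-price transaction and re-entering them as a separate row keeps each sector's column total (its outlays) unchanged, which is what a balanced SAM requires. The flows and tax figures below are hypothetical, and trade and transport margins are ignored for brevity:

```python
import numpy as np

# Hypothetical 2-sector flows at purchasers' prices and per-cell net taxes
purchasers = np.array([[100.0, 40.0],
                       [60.0, 80.0]])
net_taxes = np.array([[10.0, 4.0],
                      [6.0, 8.0]])

# Basic-price flows: strip net taxes on products out of each transaction...
basic = purchasers - net_taxes
# ...and re-enter them as a separate net-taxes-on-products row, so each
# column (sector outlay) total of the SAM is preserved
tax_row = net_taxes.sum(axis=0)
```

A real revaluation also reallocates trade and transport margins; this toy example only shows why the tax row keeps the matrix balanced.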


This contribution builds upon a former paper by the authors (Lipps and Betz 2004), in which a stochastic population projection for East and West Germany was performed. The aim was to forecast relevant population parameters and their distribution in a consistent way. We now present some modifications that have been modelled since. First, population parameters for the entire German population are modelled. To overcome the modelling problem posed by the structural break in the East during reunification, we show that the East's adaptation of the relevant figures can by now be considered complete. As a consequence, German parameters can be modelled using the West German historic patterns together with the start-off population of Germany as a whole. Second, a new model to simulate age-specific fertility rates is presented, based on a quadratic spline approach. This offers greater flexibility to model various age-specific fertility curves. The simulation results are compared with the scenario-based official forecasts for Germany in 2050. For some population parameters (e.g. the dependency ratio), it can be shown that the range spanned by the medium and extreme variants corresponds to the σ-intervals in the stochastic framework. It therefore seems more appropriate to treat this range as a σ-interval covering about two thirds of the true distribution.
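The flexibility argument for piecewise quadratics can be illustrated with a toy fit: a single quadratic forces a symmetric fertility curve, whereas two quadratic pieces joined at a knot can follow the asymmetric rise and decline of age-specific rates. This is not the authors' spline model (no continuity constraints are imposed at the knot), just the piecewise-quadratic idea, and the age-specific fertility rates below are hypothetical:

```python
import numpy as np

ages = np.array([15, 20, 25, 30, 35, 40, 45], dtype=float)
asfr = np.array([0.010, 0.050, 0.095, 0.085, 0.045, 0.015, 0.003])  # hypothetical

# Two quadratic pieces joined at a knot (age 30) follow the asymmetric
# shape that a single symmetric quadratic cannot capture
knot = 30.0
left = np.polyfit(ages[ages <= knot], asfr[ages <= knot], 2)
right = np.polyfit(ages[ages >= knot], asfr[ages >= knot], 2)

def fertility(a):
    """Rough piecewise-quadratic age-specific fertility rate."""
    return np.polyval(left if a <= knot else right, a)

# Summing rates over single years of age approximates the TFR
tfr = sum(fertility(a) for a in np.arange(15.0, 46.0))
```

A genuine quadratic spline would additionally enforce continuity (and smoothness) at the knots, which is what makes the authors' approach suitable for simulating whole families of fertility curves.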