879 results for General allocation model


Relevance:

100.00%

Abstract:

In most clinical trials, missing data presents a statistical problem in evaluating a treatment's efficacy. There are many methods commonly used to handle missing data; however, these methods leave room for bias to enter the study. This thesis was a secondary analysis of data from TIME, a phase 2 randomized clinical trial conducted to evaluate the safety and the effect of the administration timing of bone marrow mononuclear cells (BMMNC) in subjects with acute myocardial infarction (AMI). We evaluated the effect of missing data by comparing the variance inflation factor (VIF) of the effect of therapy between all subjects and only those subjects with complete data. Using the general linear model, an unbiased solution for the VIF of the treatment's efficacy was derived with the weighted least squares method so as to incorporate missing data. Two groups were identified from the TIME data: (1) all subjects and (2) subjects with complete data (baseline and follow-up measurements). After the general solution for the VIF was found, it was implemented in Excel 2010 to evaluate the TIME data, and the resulting numerical values from the two groups were compared to assess the effect of missing data. The VIF values from the TIME study were considerably lower in the group that retained subjects with missing data. By design, we varied the correlation factor in order to evaluate the VIFs of both groups. As the correlation factor increased, the VIF values increased at a faster rate in the group with only complete data. Furthermore, while varying the correlation factor, we also varied the number of subjects with missing data to see how missing data affects the VIF. When the number of subjects with only baseline data was increased, the VIF values in the complete-data-only group rose sharply, while the group that retained subjects with missing data showed a steady, consistent increase in the VIF. The same was seen when we varied the group with follow-up-only data. In essence, the VIFs increased steadily when missing data were not ignored; when missing data were ignored, as in our comparison group, the VIF values increased sharply as the correlation increased.
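The abstract does not spell out its exact VIF construction or the Excel implementation; as a rough, hedged illustration of the underlying comparison, the sketch below computes the generalized-least-squares variance of a treatment-effect estimate in a two-occasion (baseline/follow-up) design, once using complete cases only and once retaining partially observed subjects, across a grid of within-subject correlations. The function name, group sizes and design-matrix convention are illustrative assumptions.

```python
import numpy as np

def tau_variance(rho, n_complete=60, n_base_only=20, n_follow_only=20,
                 complete_only=False):
    """GLS variance of the treatment-effect estimate in a two-occasion
    (baseline/follow-up) design with within-subject correlation rho.
    Design columns: [intercept, follow-up, treatment x follow-up]."""
    Sigma = np.array([[1.0, rho], [rho, 1.0]])   # per-subject covariance
    Sigma_inv = np.linalg.inv(Sigma)
    info = np.zeros((3, 3))
    # Complete cases: half treated, half control, both occasions observed.
    X_treat = np.array([[1, 0, 0], [1, 1, 1]], dtype=float)
    X_ctrl = np.array([[1, 0, 0], [1, 1, 0]], dtype=float)
    info += (n_complete / 2) * (X_treat.T @ Sigma_inv @ X_treat)
    info += (n_complete / 2) * (X_ctrl.T @ Sigma_inv @ X_ctrl)
    if not complete_only:
        # A subject observed once contributes x x' / var = x x' (unit variance).
        x_base = np.array([1.0, 0.0, 0.0])       # baseline only
        x_ft = np.array([1.0, 1.0, 1.0])         # follow-up only, treated
        x_fc = np.array([1.0, 1.0, 0.0])         # follow-up only, control
        info += n_base_only * np.outer(x_base, x_base)
        info += (n_follow_only / 2) * (np.outer(x_ft, x_ft) + np.outer(x_fc, x_fc))
    return np.linalg.inv(info)[2, 2]             # Var(tau_hat)

for rho in (0.0, 0.3, 0.6, 0.9):
    print(f"rho={rho:.1f}  all subjects: {tau_variance(rho):.4f}  "
          f"complete cases only: {tau_variance(rho, complete_only=True):.4f}")
```

Retaining the partially observed subjects only adds rows to the information matrix, which is why the all-subjects variance can never exceed the complete-case variance at any given correlation.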

Relevance:

100.00%

Abstract:

This paper explores the potential usefulness of an AGE model with a Melitz-type trade specification for assessing the economic effects of technical regulations, taking the EU ELV/RoHS directives as an example. Simulation experiments reveal that: (1) when the importer's preference for variety (PfV) is not strong, raising the fixed cost of exporting to the EU market causes exports of the targeted commodities (motor vehicles and parts for ELV, and electronic equipment for RoHS) to the EU from outside regions/countries to expand while trade within the EU shrinks; (2) if the PfV is not strong, policy changes that reduce the number of firms enable the surviving high-productivity producers to expand into large-scale mass producers that fully enjoy economies of scale; and (3) as the strength of the importer's PfV is varied from zero to unity, there is a threshold value at which the simulation results and their interpretation change completely.
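The abstract does not say how the strength of the importer's preference for variety is parameterized; a Benassy-style CES aggregator, in which the taste for variety enters as a separate parameter nu, is one common way to make it an explicit dial, sketched here purely as an assumption:

```latex
% CES aggregator over N varieties with an explicit taste-for-variety
% parameter \nu (Benassy-style); \sigma > 1 is the elasticity of
% substitution between varieties.
Q \;=\; N^{\,\nu - \frac{1}{\sigma - 1}}
        \left( \sum_{i=1}^{N} q_i^{\frac{\sigma - 1}{\sigma}} \right)^{\frac{\sigma}{\sigma - 1}}
```

In the symmetric case q_i = q this collapses to Q = N^nu (Nq): with the total quantity Nq held fixed, nu = 0 means no gain from variety at all, nu = 1/(sigma - 1) recovers the standard Dixit-Stiglitz form, and sweeping nu from zero to unity covers the range the abstract reports, including the threshold behavior it describes.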

Relevance:

100.00%

Abstract:

The ECHAM-1 T21/LSG coupled ocean-atmosphere general circulation model (GCM) is used to simulate climatic conditions at the last interglacial maximum (Eemian, 125 kyr BP). The results reflect the expected surface temperature changes (with respect to the control run) due to the amplification (reduction) of the seasonal cycle of insolation in the Northern (Southern) Hemisphere. A number of simulated features agree with previous results from atmospheric GCM simulations (e.g., intensified summer southwest monsoons), except in the Northern Hemisphere poleward of 30 degrees N, where dynamical feedbacks in the North Atlantic and North Pacific increase zonal temperatures about 1 degree C above what would be predicted from simple energy balance considerations. As this is the same area where most of the terrestrial geological data originate, this result suggests that previous estimates of Eemian global average temperature might have been biased by sample distribution. This conclusion is supported by the fact that the estimated global temperature increase of only 0.3 degrees C relative to the control run has previously been shown to be consistent with CLIMAP sea surface temperature estimates. Although the Northern Hemisphere summer monsoon is intensified, globally averaged precipitation over land is within about 1% of the present value, contravening some geological inferences but not the deep-sea delta(13)C estimates of terrestrial carbon storage changes. Winter circulation changes in the northern Arabian Sea, driven by strong cooling on land, are as large as the summer circulation changes that are the usual focus of interest, suggesting that interpreting variations in the Arabian Sea sedimentary record solely in terms of the summer monsoon response could sometimes lead to errors. A small monsoonal response over northern South America suggests that interglacial paleotrends in this region were not due solely to El Nino variations.

Relevance:

100.00%

Abstract:

Federal Railroad Administration, Office of Safety, Washington, D.C.

Relevance:

100.00%

Abstract:

A new general linear model (GLM) beamformer method is described for processing magnetoencephalography (MEG) data. A standard nonlinear beamformer is used to determine the time course of neuronal activation for each point in a predefined source space. A Hilbert transform gives the envelope of oscillatory activity at each location in any chosen frequency band (not necessary in the case of sustained (DC) fields), enabling the general linear model to be applied and a volumetric T-statistic image to be determined. The new method is illustrated by a two-source simulation (sustained field and 20 Hz) and is shown to provide accurate localization. The method is also shown to accurately localize the increasing and decreasing gamma activities to the temporal and frontal lobes, respectively, in the case of a scintillating scotoma. The new method brings the advantages of the general linear model to the analysis of MEG data and should prove useful for the localization of changing patterns of activity across all frequency ranges, including DC (sustained fields).
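As a minimal sketch of the pipeline the abstract describes (beamformer time courses, band-limited Hilbert envelope, GLM, one statistic per source location), assuming the source time courses have already been reconstructed by the beamformer; the function name, array shapes, and the convention that the first design column is the effect of interest are all assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def glm_beamformer_tstats(source_ts, design, band, fs):
    """GLM on Hilbert envelopes of beamformer source time courses.

    source_ts: (n_sources, n_times) time series reconstructed by the
               beamformer for each point in the source space
    design:    (n_times, n_regressors); first column = effect of interest
    band:      (low, high) frequency band in Hz; fs: sampling rate in Hz
    Returns one t statistic per source location.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, source_ts, axis=1)
    env = np.abs(hilbert(filtered, axis=1))       # oscillatory envelope
    XtX_inv = np.linalg.inv(design.T @ design)
    beta = env @ design @ XtX_inv                 # OLS fit per source
    resid = env - beta @ design.T
    dof = design.shape[0] - design.shape[1]
    sigma2 = (resid ** 2).sum(axis=1) / dof
    se = np.sqrt(sigma2 * XtX_inv[0, 0])
    return beta[:, 0] / se                        # flattened T image
```

Mapping the returned statistics back onto the source grid gives the volumetric T-statistic image the abstract refers to; for sustained (DC) fields the filtering/envelope step would be skipped, as the abstract notes.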

Relevance:

100.00%

Abstract:

Macroeconomic policy makers are typically concerned with several indicators of economic performance. We therefore propose to tackle the design of macroeconomic policy using Multicriteria Decision Making (MCDM) techniques. More specifically, we employ Multiobjective Programming (MP) to seek so-called efficient policies. The MP approach is combined with a computable general equilibrium (CGE) model; we chose a CGE model because such models have the dual advantage of being consistent with standard economic theory while allowing one to measure the effect of a specific policy with real data. Applying the proposed methodology to Spain (via the 1995 Social Accounting Matrix), we first quantified the trade-offs between two specific policy objectives, growth and inflation, when designing fiscal policy. We then constructed a frontier of efficient policies involving real growth and inflation. In doing so, we found that policy in 1995 Spain displayed some degree of inefficiency with respect to these two objectives. We then offer two sets of policy recommendations that, ostensibly, could have helped Spain at the time. The first deals with efficiency independent of the importance policy makers attach to growth and inflation (we label this set general policy recommendations). The second depends on which objective policy makers regard as more important, increasing growth or controlling inflation (we label this set objective-specific recommendations).
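The paper's CGE model cannot be reproduced from the abstract; as a hedged stand-in, the sketch below traces an efficient frontier for a toy bi-objective problem by weighted-sum scalarization, one standard MP device for generating efficient points. The two response functions are invented placeholders for the model's growth and inflation responses to fiscal instruments:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-ins for the CGE model's reduced-form responses: growth and
# inflation as functions of two fiscal instruments (hypothetical forms).
def growth(x):    return 3.0 + 0.8 * x[0] + 0.3 * x[1] - 0.25 * x[0] ** 2
def inflation(x): return 2.0 + 0.6 * x[0] ** 2 + 0.2 * x[1] ** 2

frontier = []
for w in np.linspace(0.05, 0.95, 10):
    # Weighted-sum scalarization: maximize growth, penalize inflation.
    obj = lambda x: -w * growth(x) + (1 - w) * inflation(x)
    res = minimize(obj, x0=np.zeros(2), bounds=[(-2, 2), (-2, 2)])
    frontier.append((growth(res.x), inflation(res.x)))

for g, p in frontier:
    print(f"growth={g:.3f}  inflation={p:.3f}")
```

Each weight w yields one efficient policy; sweeping w traces out the growth-inflation frontier, which is the object the paper constructs for 1995 Spain.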

Relevance:

100.00%

Abstract:

Characterization of damping forces in a vibrating structure has long been an active area of research in structural dynamics. In spite of a large amount of research, the understanding of damping mechanisms remains underdeveloped. A major reason is that, unlike inertia and stiffness forces, it is in general not clear which state variables govern the damping forces. The most common approach is to use 'viscous damping', in which the instantaneous generalized velocities are the only relevant state variables. However, viscous damping is by no means the only damping model within the scope of linear analysis: any model that makes the energy dissipation functional non-negative is a candidate for a valid damping model. This paper develops methodologies for the identification of such general damping models responsible for energy dissipation in a vibrating structure. The method uses experimentally identified complex modes and complex natural frequencies; it does not assume any fixed damping model a priori (e.g., viscous damping) but seeks to determine the parameters of a general damping model described by the so-called 'relaxation function'. The proposed method and several related issues are discussed through a numerical example of a linear array of damped spring-mass oscillators.
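The identification procedure itself cannot be reproduced from the abstract, but its forward ingredients are standard; as a sketch, the state-space eigendecomposition below produces the complex natural frequencies and complex modes of the kind of damped spring-mass chain used in the paper's numerical example. The chain layout (a single damper at the first mass, which makes the damping non-proportional) and all parameter values are assumptions:

```python
import numpy as np

n, m, k, c = 5, 1.0, 100.0, 0.4   # oscillators, mass, stiffness, damper

# Chain of spring-mass oscillators; the lone damper at the first mass
# makes the damping non-proportional, so the modes are genuinely complex.
M = m * np.eye(n)
K = np.zeros((n, n))
for i in range(n):
    K[i, i] = 2 * k
    if i > 0:
        K[i, i - 1] = K[i - 1, i] = -k
C = np.zeros((n, n))
C[0, 0] = c

# First-order (state-space) form: eigenvalues come in conjugate pairs
# lambda = -zeta*omega +/- i*omega_d; eigenvectors are the complex modes.
A = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
lam, modes = np.linalg.eig(A)
order = np.argsort(np.abs(lam.imag))
for l in lam[order][::2]:                 # one eigenvalue per conjugate pair
    omega = abs(l)                        # undamped natural frequency
    zeta = -l.real / omega                # modal damping ratio
    print(f"omega = {omega:7.3f} rad/s   zeta = {zeta:.4f}")
```

An identification method of the kind the paper proposes would run in the opposite direction: take such measured complex modes and frequencies as input and fit the parameters of a relaxation-function damping model to them.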

Relevance:

100.00%

Abstract:

The Mara River Basin (MRB) is endowed with pristine biodiversity, socio-cultural heritage and natural resources. The purpose of my study is to develop and apply an integrated water resource allocation framework for the MRB based on hydrological processes, water demand and economic factors. The basin was partitioned into twelve sub-basins, and the rainfall-runoff processes were modeled using the Soil and Water Assessment Tool (SWAT), achieving satisfactory Nash-Sutcliffe efficiencies of 0.68 for calibration and 0.43 for validation at the Mara Mines station. The impact and uncertainty of climate change on the hydrology of the MRB were assessed using SWAT and three scenarios of statistically downscaled outputs from twenty Global Circulation Models. The results predict the wet season getting wetter and the dry season getting drier, with a general increasing trend in annual rainfall through 2050. Three blocks of water demand (environmental, normal and flood) were estimated from consumptive water use by humans, wildlife, livestock, tourism, irrigation and industry. Water demand projections suggest that human consumption is expected to surpass irrigation as the highest-demand sector by 2030. Monthly water volumes were estimated at the current minimum reliability in three blocks: reserve (>95%), normal (80–95%) and flood (40%, for more than 5 months in a year). The assessment of water price and marginal productivity showed that current water use hardly responds to changes in the price or productivity of water. Finally, a water allocation model was developed and applied to investigate the optimal monthly allocation among sectors and sub-basins by maximizing the use value and hydrological reliability of water. Model results demonstrated that the status of the reserve and normal volumes can be improved to 'low' or 'moderate' by updating the existing reliability to meet prevailing demand. Flow volumes and rates for four scenarios of reliability were presented. The results show that the water allocation framework can be used as a comprehensive tool in the management of the MRB and could possibly be extended to similar watersheds.
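The thesis's allocation model is not given in the abstract; as a hedged illustration of the optimization step it describes, the sketch below solves a single-month, single-sub-basin allocation as a linear program that maximizes total use value subject to the available volume, with sector demands as upper bounds and reserve requirements as lower bounds. All sector names, values and volumes are invented placeholders:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical monthly figures for one sub-basin (units: 10^6 m^3).
sectors   = ["environment", "human", "livestock", "irrigation", "tourism"]
value     = np.array([5.0, 4.0, 2.5, 1.5, 2.0])   # use value per unit of water
demand    = np.array([8.0, 6.0, 3.0, 9.0, 1.0])   # sector demand (upper bound)
reserve   = np.array([8.0, 2.0, 0.0, 0.0, 0.0])   # non-negotiable minimum
available = 20.0                                   # monthly flow volume

# Maximize total use value subject to the water balance; linprog
# minimizes, so the objective is negated.
res = linprog(c=-value,
              A_ub=np.ones((1, len(sectors))), b_ub=[available],
              bounds=list(zip(reserve, demand)))
for s, a in zip(sectors, res.x):
    print(f"{s:11s} allocated {a:5.2f}")
```

The full framework would repeat this kind of problem across months and sub-basins and add the hydrological-reliability terms the abstract mentions; the sketch only shows the core value-maximizing allocation.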

Relevance:

90.00%

Abstract:

This paper explores the philosophical origins of appropriation of Information Systems (IS) using Marxian and other socio-cultural theory. It provides an in-depth examination of appropriation and its application in extant IS theory. We develop a three-tier model using Marx’s foundational concepts and from this generate four propositions that we test in an empirical example of IS in anesthesia. Using Marxian theory, this paper seeks common ground among existing theories of technology appropriation in IS research. This work contributes to IS research by (1) opening philosophical discussions on appropriation and the human ↔ technology nexus, (2) drawing on these varying perspectives to propose a general conceptualization of technology appropriation and (3) providing a starting point towards a general causal model of technology appropriation.

Relevance:

90.00%

Abstract:

This paper explores the philosophical roots of appropriation within Marx's theories and socio-cultural studies in an attempt to seek common ground among existing theories of technology appropriation in IS research. Drawing on appropriation perspectives from Adaptive Structuration Theory, the Model of Technology Appropriation and the Structurational Model of Technology for comparison, we aim to generate a Marxian model that provides a starting point toward a general causal model of technology appropriation. This paper opens a philosophical discussion on the phenomenon of appropriation in the IS community, directing attention to foundational concepts in the human-technology nexus using ideas conceived by Marx.

Relevance:

90.00%

Abstract:

Multiresolution techniques are extensively used in the signal processing literature. This paper has two parts. In the first part, we derive a relationship between the general degradation model (Y = BX + W) at coarse and fine resolutions. In the second part, we develop a signal restoration scheme in a multiresolution framework and demonstrate through experiments that knowledge of the relationship between the degradation models at different resolutions helps in obtaining a computationally efficient restoration scheme.
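The derivation itself is in the paper rather than the abstract; under the common assumption that the coarse resolution is reached through a linear analysis operator H (low-pass filtering followed by downsampling), the kind of relationship referred to can be sketched as follows:

```latex
Y = BX + W \quad\Longrightarrow\quad HY \;=\; HBX + HW .
```

If a coarse-scale blur B_c satisfies HB = B_c H (exact, for instance, when B is shift-invariant and compatible with the decimation filter), then writing Y_c = HY, X_c = HX and W_c = HW gives

```latex
Y_c \;=\; B_c X_c + W_c ,
```

i.e. the degradation model keeps the same form at the coarse resolution, which is what allows restoration to be carried out cheaply on the coarse signal and then refined at the fine scale.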