628 results for Maximizing
Abstract:
We analyze a model of 'postelection politics', in which (unlike in the more common Downsian models of 'preelection politics') politicians cannot make binding commitments prior to elections. The game begins with an incumbent politician in office, and voters adopt reelection strategies that are contingent on the policies implemented by the incumbent. We generalize previous models of this type by introducing heterogeneity in voters' ideological preferences, and analyze how voters' reelection strategies constrain the policies chosen by a rent-maximizing incumbent. We first show that virtually any policy (and any feasible level of rent for the incumbent) can be sustained in a Nash equilibrium. Then, we derive a 'median voter theorem': the ideal point of the median voter, and the minimum feasible level of rent, are the unique outcomes in any strong Nash equilibrium. We then introduce alternative refinements that are less restrictive. In particular, Ideologically Loyal Coalition-proof equilibrium also leads uniquely to the median outcome.
Abstract:
This paper analyzes whether the Congressional budget process (instituted in 1974) leads to lower aggregate spending than does the piecemeal appropriations process that preceded it. Previous theoretical analysis, using spatial models of legislator preferences, is inconclusive. This paper uses a model of interest group lobbying, where a legislature determines spending on a national public good and on subsidies to subsets of the population that belong to nationwide sector-specific interest groups. In the appropriations process, the Appropriations Committee proposes a budget, maximizing the joint welfare of voters and the interest groups, that leads to overspending on subsidies. In the budget process, a Budget Committee proposes an aggregate level of spending (the budget resolution); the Appropriations Committee then proposes a budget. If the lobby groups are not subject to a binding resource constraint, the two institutional structures lead to identical outcomes. With such a constraint, however, there is a free rider problem among the groups in lobbying the Budget Committee, as each group only obtains a small fraction of the benefits from increasing the aggregate budget. If the number of groups is sufficiently large, each takes the budget resolution as given, and lobbies only the Appropriations Committee. The main results are that aggregate spending is lower, and social welfare higher, under the budget process; however, provision of the public good is suboptimal. The paper also presents two extensions: the first endogenizes the enforcement of the budget resolution by incorporating the relevant procedural rules into the model. The second analyzes statutory budget rules that limit spending levels, but can be revised by a simple majority vote. In each case, the free rider problem prevents the groups from securing the required changes to procedural and budget rules.
Abstract:
This paper examines how US and proposed international law relate to the recovery of archaeological data from historic shipwrecks. It argues that US federal admiralty law of salvage gives far less protection to historic submerged sites than do US laws protecting archaeological sites on US federal and Indian lands. The paper offers a simple model in which the net present value of the salvage and archaeological investigation of an historic shipwreck is maximized. It is suggested that salvage law gives insufficient protection to archaeological data, but that UNESCO's Convention on the Protection of the Underwater Cultural Heritage goes too far in the other direction. It is also suggested that a move towards maximizing the net present value of a wreck would be promoted if the US admiralty courts explicitly tied the size of salvage awards to the quality of the archaeology performed.
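The trade-off that such a net-present-value model captures can be illustrated with a small numerical sketch; the cash flows, timing, and discount rate below are hypothetical and are not the paper's actual model. Rapid commercial salvage returns revenue sooner but destroys most archaeological value, while careful excavation preserves data at the cost of time-discounted returns.

```python
# Hypothetical sketch of the net-present-value comparison described above;
# all cash flows and the discount rate are invented for illustration only.

def npv(cash_flows, rate=0.05):
    """Discount a list of (year, value) cash flows to present value."""
    return sum(value / (1 + rate) ** year for year, value in cash_flows)

# Option A: rapid commercial salvage -- artifacts sold in year 1,
# but most contextual archaeological data is lost.
rapid_salvage = [(1, 900_000), (1, 50_000)]        # artifact sales + residual data value

# Option B: careful archaeological excavation -- returns arrive later,
# but the documented data retains long-term value.
careful_excavation = [(4, 700_000), (4, 400_000)]  # artifact sales + documented data value

print("NPV, rapid salvage:      ", round(npv(rapid_salvage)))
print("NPV, careful excavation: ", round(npv(careful_excavation)))
# Tying salvage awards to the quality of the archaeology performed would shift
# this comparison toward the excavation-style outcome, as the paper suggests.
```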
Abstract:
Methods of tax collection employed by modern governments seem dull when compared to the rich variety observed in history. Whereas most governments today typically use salaried agents to collect taxes, various other types of contractual relationships have been observed in history, including sharing arrangements that divide the tax revenue between the government and collectors at fixed proportions, negotiated payment schemes based on the tax base, and sale of the revenue to a collector in exchange for a lump-sum payment determined at auction. We propose an economic theory of tax collection that can coherently explain the temporal and spatial variation in contractual forms. We begin by offering a simple classification of tax collection schemes observed in history. We then develop a general economic model of tax collection that specifies the costs and benefits of alternative schemes and identifies the conditions under which a government would choose one contractual form over another in maximizing net revenue. Finally, we use the conclusions of the model to explain some of the well-known patterns of tax collection observed in history and how choices varied over time and space.
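How a government might rank the contractual forms named above by expected net revenue can be sketched as follows; the monitoring costs, revenue share, and lump-sum bid are invented parameters, not values from the paper.

```python
# Hypothetical sketch of ranking tax-collection contracts by expected net revenue.
# All parameters (monitoring costs, shares, revenue draws) are assumptions.
import random

random.seed(0)
gross_revenue = [random.gauss(100, 20) for _ in range(10_000)]  # uncertain tax base

def salaried(rev, wage=10, monitoring=8):
    # Government keeps the revenue but pays wages and monitoring costs.
    return rev - wage - monitoring

def sharing(rev, collector_share=0.25):
    # Revenue split at a fixed proportion; little monitoring needed.
    return rev * (1 - collector_share)

def tax_farming(rev, lump_sum=70):
    # Collector bids a lump sum at auction and bears all revenue risk.
    return lump_sum

for name, scheme in [("salaried", salaried), ("sharing", sharing), ("tax farming", tax_farming)]:
    expected = sum(scheme(r) for r in gross_revenue) / len(gross_revenue)
    print(f"{name:12s} expected net revenue: {expected:6.1f}")
```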
Abstract:
In 2004, Houston had one of the lowest childhood immunization levels among major US metropolitan areas, at 65% for the 4:3:1:3:3 vaccination series. Delays in the receipt of scheduled vaccinations may be related to missed opportunities caused by health care providers' lack of knowledge about catch-up regimens and contraindications for pediatric vaccination. The objectives of this study were to identify, measure, and report on VFC provider-practice characteristics, knowledge of catch-up regimens and contraindications, and use of reminder/recall (R/R) and moved-or-gone-elsewhere (MOGE) practices among providers with high (>80%) and low (<70%) immunization coverage among 19- to 35-month-old children. The sampling frame consisted of 187 Vaccines for Children (VFC) providers with 2004 Clinic Assessment Software Application (CASA) scores. Data were collected by personal interview with each participating practice provider. Only ten VFC providers succeeded in maximizing vaccinations for every vignette, and no provider administered the maximum possible number of vaccinations at visit 2 for all six vignettes. Both coverage groups most frequently administered pneumococcal conjugate vaccine (PCV), Haemophilus influenzae type b (Hib) vaccine, and diphtheria, tetanus and acellular pertussis (DTaP) vaccine, and most frequently omitted varicella zoster vaccine (VZV) and measles, mumps, and rubella (MMR) vaccine.
Abstract:
The research project is an extension of a series of administrative science and health care research projects evaluating the influence of external context, organizational strategy, and organizational structure on organizational success or performance. The research relies on the assumption that there is no single best approach to the management of organizations (contingency theory). As organizational effectiveness depends on an appropriate mix of factors, organizations may be equally effective based on differing combinations of factors. The external context of the organization is expected to influence internal organizational strategy and structure, and in turn the internal measures affect performance (discriminant theory). The research considers the relationship between external context and organizational performance. The unit of study is the health maintenance organization (HMO): an organization that accepts, in exchange for a fixed, advance capitation payment, contractual responsibility to assure the delivery of a stated range of health services to a voluntarily enrolled population. With the current Federal resurgence of interest in the HMO as a major component of the health care system, attention must be directed at maximizing the development of HMOs from the limited resources available. Increased skills are needed in both Federal and private evaluation of HMO feasibility in order to prevent resource investment in projects that will fail, while concurrently identifying potentially successful projects that would not be considered under current standards. The research considers 192 factors measuring the contextual milieu (social, educational, economic, legal, demographic, health, and technological factors). Through intercorrelation and principal components data reduction techniques, these were reduced to 12 variables. Two measures of HMO performance were identified: (1) HMO status (operational or defunct), and (2) a principal components factor score combining eight measures of performance. The relationship between HMO context and performance was analysed using correlation and stepwise multiple regression methods. In each case it was concluded that the external contextual variables are not predictive of success or failure of the study Health Maintenance Organizations. This suggests that the performance of an HMO may depend on internal organizational factors. These findings have policy implications, as contextual measures are used as a major determinant in HMO feasibility analysis and as a factor in the allocation of limited Federal funds.
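A present-day analogue of the data-reduction and regression steps described here (reducing 192 contextual measures to a handful of components before relating context to performance) might look like the sketch below; the data are synthetic and the original study's exact procedure is not reproduced.

```python
# Synthetic sketch of the principal-components data reduction described in the
# abstract; data, component count, and model choice are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(150, 192))   # 150 hypothetical HMOs x 192 contextual measures
y = rng.normal(size=150)          # composite performance factor score

pca = PCA(n_components=12)        # reduce the contextual milieu to 12 components
X_reduced = pca.fit_transform(X)

model = LinearRegression().fit(X_reduced, y)
print("Variance explained by 12 components:", round(pca.explained_variance_ratio_.sum(), 3))
print("R^2 of context on performance:      ", round(model.score(X_reduced, y), 3))
# With purely random data, an R^2 near zero mirrors the study's finding that
# contextual variables did not predict HMO performance.
```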
Abstract:
Genome-wide association studies (GWAS) have rapidly become a standard method for disease gene discovery. Many recent GWAS indicate that for most disorders only a few common variants are implicated, and the associated SNPs explain only a small fraction of the genetic risk. The current study incorporated gene network information into gene-based analysis of GWAS data for Crohn's disease (CD). The purpose was to develop statistical models that boost the power to identify disease-associated genes and gene subnetworks by maximizing the use of existing biological knowledge from multiple sources. The results revealed that a Markov random field (MRF)-based mixture model incorporating direct neighborhood information from a single gene network is not efficient in identifying CD-related genes from the GWAS data. Relying solely on direct neighborhood information may explain the low efficiency of these models. Alternative MRF models that look beyond direct neighborhood information need to be developed in future work for this purpose.
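The direct-neighborhood information that such an MRF mixture model exploits can be illustrated with a minimal sketch in which each gene's association score is smoothed with the scores of its immediate network neighbors; the toy network, scores, and smoothing weight are assumptions, not the study's model.

```python
# Minimal sketch of using direct-neighborhood information from a gene network;
# the toy network, gene-level scores, and smoothing weight are assumptions.
import networkx as nx

g = nx.Graph()
g.add_edges_from([("NOD2", "ATG16L1"), ("ATG16L1", "IRGM"), ("NOD2", "IL23R")])

# Hypothetical gene-based association scores (e.g., -log10 p from a gene-level test).
score = {"NOD2": 6.2, "ATG16L1": 1.1, "IRGM": 0.8, "IL23R": 4.5}

def smoothed(gene, weight=0.5):
    """Blend a gene's own score with the mean score of its direct neighbors."""
    neighbors = list(g.neighbors(gene))
    neighbor_mean = sum(score[n] for n in neighbors) / len(neighbors)
    return (1 - weight) * score[gene] + weight * neighbor_mean

for gene in score:
    print(f"{gene:8s} raw={score[gene]:.1f} smoothed={smoothed(gene):.2f}")
# An MRF mixture model formalizes this idea probabilistically; as the study notes,
# using only direct neighbors may dilute strong signals rather than amplify them.
```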
Abstract:
Clinical trials are often unsuccessful because of the inability to recruit a sufficient number of patients. The Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT), the largest antihypertensive trial ever conducted, provided highly generalizable results and successfully recruited over 42,000 participants. The overall purpose of this study was to examine the association of investigator characteristics with antihypertensive (AHT) participant recruitment in ALLHAT. This secondary data analysis used data from the ALLHAT investigator profile survey and related investigator characteristics to recruitment success. The sample comprised 502 investigators, with recruitment data from 37,947 AHT participants. Recruitment was dichotomized by categorizing all sites with recruitment numbers at or above the overall median recruitment number of 46 as "Successful Recruitment". Frequency distributions and univariate and multivariate logistic regression analyses were conducted. When adjusting for all other factors, Hispanic ethnicity, suburban setting, Department of Veterans Affairs Medical Center (VAMC) site type, number of clinical site staff working on the trial, study coordinator hours per week, medical conference sessions attended, the investigator's primary goal, and the likelihood that a physician would convince a patient to continue on randomized treatment had significant impacts on the recruitment success of ALLHAT investigators. Most of the ALLHAT investigators described their primary commitment as being to their patients rather than to scientific knowledge alone. However, investigators who distinguished themselves as leaders in research had greater recruitment success than investigators who were leaders in clinical practice. ALLHAT was a highly successful trial that proved that community-based cardiovascular trials can be implemented on a large scale. Exploring the characteristics of ALLHAT investigators provides data that can be generalized to sponsors, sites, and others interested in maximizing clinical trial recruitment numbers. Future studies should further evaluate investigator and study coordinator factors that affect cardiovascular clinical trial recruitment success.
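The analytic approach described, dichotomizing recruitment at the overall median of 46 and relating success to investigator characteristics with logistic regression, can be sketched as follows on synthetic data; the predictors and effect sizes are invented, not ALLHAT data.

```python
# Synthetic sketch of dichotomizing recruitment at the median and fitting a
# logistic regression; data and predictors here are made up, not ALLHAT data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 502                                   # number of investigators in the analysis
staff = rng.integers(1, 10, n)            # clinical site staff working on the trial
coord_hours = rng.integers(5, 41, n)      # study coordinator hours per week
recruited = rng.poisson(40 + 2 * staff + coord_hours / 4)

success = (recruited >= np.median(recruited)).astype(int)   # at/above median = success

X = sm.add_constant(np.column_stack([staff, coord_hours]))
fit = sm.Logit(success, X).fit(disp=0)
print(fit.summary(xname=["intercept", "site_staff", "coordinator_hours"]))
```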
Abstract:
Maximizing data quality may be especially difficult in trauma-related clinical research. Strategies are needed to improve data quality and assess the impact of data quality on clinical predictive models. This study had two objectives. The first was to compare missing data between two multi-center trauma transfusion studies: a retrospective study (RS) using medical chart data with minimal data quality review, and the PRospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study with standardized quality assurance. The second objective was to assess the impact of missing data on clinical prediction algorithms by evaluating blood transfusion prediction models using PROMMTT data. RS (2005-06) and PROMMTT (2009-10) investigated trauma patients receiving ≥ 1 unit of red blood cells (RBC) at ten Level I trauma centers. Missing data were compared for 33 variables collected in both studies using mixed effects logistic regression (including random intercepts for study site). Massive transfusion (MT) patients received ≥ 10 RBC units within 24 h of admission. Correct classification percentages for three MT prediction models were evaluated using complete case analysis and multiple imputation based on the multivariate normal distribution. A sensitivity analysis for missing data was conducted to estimate the upper and lower bounds of correct classification under best and worst case assumptions about the missing data. Most variables (17/33 = 52%) had <1% missing data in RS and PROMMTT. Of the remaining variables, 50% demonstrated less missingness in PROMMTT, 25% had less missingness in RS, and 25% were similar between studies. Missing percentages for MT prediction variables in PROMMTT ranged from 2.2% (heart rate) to 45% (respiratory rate). For variables missing >1%, study site was associated with missingness (all p ≤ 0.021). Survival time predicted missingness for 50% of RS and 60% of PROMMTT variables. Complete case proportions for the MT models ranged from 41% to 88%. Complete case analysis and multiple imputation produced similar correct classification results. Sensitivity analysis upper-lower bound ranges for the three MT models were 59-63%, 36-46%, and 46-58%. Prospective collection of ten-fold more variables with data quality assurance reduced overall missing data. Study site and patient survival were associated with missingness, suggesting that data were not missing completely at random and that complete case analysis may lead to biased results. Evaluating clinical prediction model accuracy may be misleading in the presence of missing data, especially with many predictor variables. The proposed sensitivity analysis, estimating correct classification under upper (best case) and lower (worst case) bounds, may be more informative than multiple imputation, which produced results similar to complete case analysis.
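The sensitivity-analysis idea, bounding correct classification by treating every unclassifiable (missing-data) case as correctly classified in the best case and misclassified in the worst case, can be illustrated with a toy example; the outcomes, predictions, and missingness pattern below are invented.

```python
# Toy sketch of bounding correct classification when some predictions cannot be
# made because of missing predictor data; all values here are invented.
import numpy as np

truth = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])                 # observed MT outcome
pred = np.array([1, 0, 0, 1, 0, None, 1, None, None, 0], dtype=object)  # None = not classifiable

observed = np.array([p is not None for p in pred])
correct_observed = sum(int(p == t) for p, t in zip(pred[observed], truth[observed]))

n = len(truth)
n_missing = n - observed.sum()
lower = correct_observed / n                  # worst case: every missing case misclassified
upper = (correct_observed + n_missing) / n    # best case: every missing case classified correctly

print(f"Complete-case accuracy: {correct_observed / observed.sum():.2f}")
print(f"Sensitivity bounds:     {lower:.2f} - {upper:.2f}")
```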
Abstract:
Corn is planted earlier each year, and earlier planting is one important component in maximizing grain yield. Earlier planting dates can be attributed to larger farms, less spring tillage, improvements in corn hybrids, improved drainage systems, and better seed treatments. Research conducted at the ISU Northwest Research Farm from 2006 through 2009 showed that the planting window for 98 percent or greater yield potential in northwest Iowa is April 15 to May 9. A yield potential of 95 percent or greater can be realized from April 15 to May 18. A study was conducted from 2009 through 2011 at the Northwest Research Farm to determine how corn planted in early April compares with corn planted in the recommended planting window for the area.
Abstract:
Three ice type regimes at Ice Station Belgica (ISB), during the 2007 International Polar Year SIMBA (Sea Ice Mass Balance in Antarctica) expedition, were characterized and assessed for elevation, snow depth, ice freeboard and thickness. Analyses of the probability distribution functions showed great potential for satellite-based altimetry for estimating ice thickness. In question is the altimeter sampling density required for reasonably accurate estimation of snow surface elevation, given inherent spatial averaging. This study assesses the number of laser altimeter 'hits' of the ISB floe, taken as a representative Antarctic floe of mixed first- and multi-year ice types, required to statistically recreate the in situ-determined ice thickness and snow depth distributions based on the fractional coverage of each ice type. Estimates of the fractional coverage and spatial distribution of the ice types, referred to as ice 'towns', for the 5 km² floe were obtained by in situ mapping and photo-visual documentation. Simulated ICESat altimeter tracks, with spot size ~70 m and spacing ~170 m, sampled the floe's towns, generating a buoyancy-derived ice thickness distribution. 115 altimeter hits were required to statistically recreate the regional thickness mean and distribution for a three-town assemblage of mixed first- and multi-year ice, and 85 hits for a two-town assemblage of first-year ice only: equivalent to 19.5 and 14.5 km, respectively, of continuous altimeter track over a floe region of similar structure. The results have significant implications for modeling the sea-ice sampling performance of the ICESat laser altimeter record, as well as for maximizing the sampling characteristics of satellite and airborne laser and radar altimetry missions for sea-ice thickness.
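The buoyancy-derived conversion from measured freeboard and snow depth to ice thickness follows the standard hydrostatic-equilibrium relation used in such retrievals, sketched below; the density values are nominal assumptions rather than the values used in this study.

```python
# Standard hydrostatic-equilibrium conversion from ice freeboard and snow depth
# to ice thickness, as typically used in buoyancy-derived retrievals; the density
# values below are nominal assumptions, not necessarily those used in the study.

RHO_WATER = 1024.0   # kg/m^3, sea water
RHO_ICE   = 915.0    # kg/m^3, sea ice
RHO_SNOW  = 320.0    # kg/m^3, snow

def ice_thickness(ice_freeboard_m, snow_depth_m):
    """h_i = (rho_w * F_i + rho_s * h_s) / (rho_w - rho_i)."""
    return (RHO_WATER * ice_freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - RHO_ICE)

# Example: 0.10 m ice freeboard under 0.25 m of snow
print(f"{ice_thickness(0.10, 0.25):.2f} m ice thickness")
```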
Abstract:
We determined the distribution of lipids (n-alkanes and n-alkan-2-ones) in present-day peat-forming plants in the Roñanzas Bog in northern Spain. Consistent with the observations of others, the n-alkanes of most Sphagnum (moss) species maximized at C23, whereas those of the other plants maximized at higher molecular weight (C27 to C31). We show for the first time that plants other than seagrass and Sphagnum moss contain n-alkan-2-ones. Almost all the species analysed showed an n-alkan-2-one distribution between C21 and C31 with an odd/even predominance, maximizing at C27 or C29, except ferns, which maximized at lower molecular weight (C21–C23). We also observed that microbial degradation can be a major contributor to the n-alkan-2-one distribution in sediments, as opposed to a direct input of ketones from plants.
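The two descriptors used throughout this abstract, the carbon number at which a homologue series maximizes and its odd-over-even predominance, can be computed from relative abundances as in the sketch below; the abundance values are invented and are not the Roñanzas data.

```python
# Illustrative calculation of the chain-length maximum and a simple odd-over-even
# predominance ratio for a homologue series; abundances are invented values.
abundance = {21: 5, 22: 2, 23: 8, 24: 3, 25: 12, 26: 4, 27: 25, 28: 6, 29: 20, 30: 5, 31: 10}

c_max = max(abundance, key=abundance.get)
odd = sum(v for c, v in abundance.items() if c % 2 == 1)
even = sum(v for c, v in abundance.items() if c % 2 == 0)

print(f"Series maximizes at C{c_max}")
print(f"Odd/even predominance: {odd / even:.1f}")
```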
Abstract:
We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring the vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; biorecognition of antiBSA produces a shift in the reflectivity curve related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good agreement with experimental data, while reducing computation time from one day to 15 minutes, giving a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of different variables (diameter, height and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication (limited by aspect ratio and pillar proximity) and the fluidic points of view.
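A minimal normal-incidence transfer-matrix sketch conveys the idea of the simplified multilayer model (effective-index pillar layer over SiO2 on Si); the effective index and the dispersion-free optical constants below are placeholders, whereas in the paper the effective index is obtained from a BPM calculation.

```python
# Minimal normal-incidence transfer-matrix sketch of the simplified multilayer
# model (effective-index pillar layer / SiO2 / Si substrate). The effective index
# and constant (dispersion-free) optical constants are placeholder assumptions.
import numpy as np

def reflectivity(wavelength_nm, layers, n_in=1.0, n_sub=3.8):
    """layers: list of (refractive_index, thickness_nm); returns |r|^2 at normal incidence."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    num = n_in * M[0, 0] + n_in * n_sub * M[0, 1] - M[1, 0] - n_sub * M[1, 1]
    den = n_in * M[0, 0] + n_in * n_sub * M[0, 1] + M[1, 0] + n_sub * M[1, 1]
    return abs(num / den) ** 2

stack = [(1.25, 420.0),   # hypothetical effective index for the SU8 pillar layer
         (1.46, 1000.0)]  # 1 um SiO2 on the Si substrate
for wl in range(500, 901, 100):
    print(wl, "nm ->", round(reflectivity(wl, stack), 3))
```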
Abstract:
This paper outlines the problems found in the parallelization of SPH (Smoothed Particle Hydrodynamics) algorithms using Graphics Processing Units. Results of several parallel GPU implementations are shown in terms of speed-up and scalability compared to the sequential CPU codes. The most problematic stage in the GPU-SPH algorithms is the one responsible for locating neighboring particles and building the vectors where this information is stored, since these specific algorithms raise many difficulties for data-level parallelization. Because neighbor location using linked lists does not expose enough data-level parallelism, two new approaches have been proposed to minimize bank conflicts in the writing and subsequent reading of the neighbor lists. The first strategy proposes an efficient coordination between CPU and GPU, using GPU algorithms for those stages that allow a straightforward parallelization, and sequential CPU algorithms for those instructions that involve some kind of vector reduction. This coordination provides a relatively orderly reading of the neighbor lists in the interactions stage, achieving a speed-up factor of x47 in this stage. However, since the construction of the neighbor lists is quite expensive, the overall speed-up is x41. The second strategy seeks to maximize the use of the GPU in the neighbor location process by executing a specific vector sorting algorithm that allows some data-level parallelism. Although this strategy succeeds in improving the speed-up of the neighbor location stage, the global speed-up of the interactions stage falls, due to inefficient reading of the neighbor vectors. Some changes to these strategies are proposed, aimed at maximizing the computational load of the GPU and using the GPU texture units, in order to reach the maximum speed-up for such codes. Several practical applications have been added to the GPU codes described. First, the classical dam-break problem is studied. Second, the wave impact of the sloshing fluid contained in LNG vessel tanks is simulated as a practical example of particle methods.
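A serial Python sketch of the cell-based linked-list neighbor search that these GPU strategies parallelize is given below; it illustrates only the data structure and the neighborhood query, and none of the GPU-specific issues (bank conflicts, texture units) are represented. The particle positions are random.

```python
# Serial sketch of the cell-based neighbor search that the GPU strategies
# parallelize; particle data are random and GPU-specific details are omitted.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
h = 0.05                                   # smoothing length / interaction radius
pos = rng.random((2000, 2))                # particle positions in a unit square

# Bin particles into cells of side h (the "linked list" of the abstract).
cells = defaultdict(list)
for i, (x, y) in enumerate(pos):
    cells[(int(x / h), int(y / h))].append(i)

def neighbors(i):
    """Return indices of particles within distance h of particle i."""
    cx, cy = int(pos[i, 0] / h), int(pos[i, 1] / h)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in cells.get((cx + dx, cy + dy), []):
                if j != i and np.linalg.norm(pos[i] - pos[j]) <= h:
                    found.append(j)
    return found

print("Particle 0 has", len(neighbors(0)), "neighbors within h")
```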
Abstract:
production, during the summer of 2010. This farm is integrated in AIMCRA, the Spanish research network for sugar beet development, which, regarding irrigation, focuses on maximizing water savings and cost reduction. According to AIMCRA's perspective of promoting irrigation best practices, it is essential to understand the soil response to irrigation, i.e. the maximum irrigation length for each soil infiltration capacity. The use of humidity sensors provides the foundation for characterizing the soil's behavior during irrigation events and, therefore, for establishing bounds on irrigation length and irrigation interval. In order to understand to what extent the farmer's performance at the Tordesillas farm could potentially have been improved, this study aims to determine suitable irrigation lengths and intervals for the given soil properties and evapotranspiration rates. To this end, several humidity sensors were installed: (1) a Frequency Domain Reflectometry (FDR) EnviroScan probe taking readings at 10, 20, 40 and 60 cm depth, and (2) different Time Domain Reflectometry (TDR) Echo 2 and Cr200 probes buried in a 50 cm x 30 cm x 50 cm pit and placed along the walls at 10, 20, 30 and 40 cm depth. Moreover, in order to define the soil properties, a textural analysis at the Tordesillas farm was conducted, and data from the Tordesillas meteorological station were used.
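The kind of boundary the study seeks, how long and how often to irrigate given the soil's water-holding and infiltration capacities and the crop's evapotranspiration, can be sketched with a simple water-balance calculation; every parameter below is an illustrative assumption, not a measurement from the Tordesillas farm.

```python
# Hypothetical sketch of deriving irrigation interval and length from soil water
# holding capacity, evapotranspiration, and application rate; the soil and ET
# values are illustrative and are not measurements from the Tordesillas farm.
FIELD_CAPACITY_MM = 120.0     # plant-available water in the root zone at field capacity
ALLOWED_DEPLETION = 0.45      # fraction of available water the crop may deplete
ET_MM_PER_DAY = 6.0           # peak crop evapotranspiration
APPLICATION_MM_PER_H = 8.0    # sprinkler application rate
INFILTRATION_MM_PER_H = 10.0  # soil infiltration capacity (upper bound on usable rate)

deficit_mm = FIELD_CAPACITY_MM * ALLOWED_DEPLETION          # water to replace per event
interval_days = deficit_mm / ET_MM_PER_DAY                  # days between irrigations
rate = min(APPLICATION_MM_PER_H, INFILTRATION_MM_PER_H)     # avoid exceeding infiltration
length_hours = deficit_mm / rate                            # irrigation event length

print(f"Irrigate every {interval_days:.1f} days for {length_hours:.1f} hours")
```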