960 results for Improved sequential algebraic algorithm
Abstract:
Surface characteristics represent a critical issue facing pavement owners and the concrete paving industry. The traveling public has come to expect smoother, quieter, and better-drained pavements, all without compromising safety. The overall surface characteristics issue is extremely complex, since all pavement surface properties, including texture, noise, friction, splash/spray, rolling resistance, reflectivity/illuminance, and smoothness, are intricately related. The following needs and gaps related to achieving desired pavement surface characteristics must be addressed: determining how changes in one surface characteristic affect, either beneficially or detrimentally, other characteristics of the pavement; determining the long-term surface and acoustic durability of different textures; and developing, evaluating, and standardizing new data collection and analysis tools. It is clear that an overall strategic and coordinated research approach must be developed and pursued to address these needs and gaps.
Abstract:
In order to improve the immunogenicity of currently available non-replicating poxvirus HIV vaccine vectors, NYVAC was genetically modified through re-insertion of two host range genes (K1L and C7L), resulting in restored replicative capacity in human cells. In the present study these vectors, expressing either a combination of the HIV-1 clade C antigens Env, Gag, Pol, and Nef, or a combination of Gag, Pol, and Nef, were evaluated for safety and immunogenicity in rhesus macaques, which were immunized at weeks 0, 4, and 12 either by scarification (the conventional poxvirus route of immunization), by intradermal injection, or by intramuscular injection (the route used in previous vaccine studies). Replication-competent NYVAC-C-KC vectors induced higher HIV-specific responses, as measured by IFN-γ ELISpot assay, than the replication-defective NYVAC-C vectors. Application through scarification required only one immunization to induce maximum HIV-specific immune responses, while simultaneously inducing relatively lower anti-vector responses. In contrast, two to three immunizations were required when the NYVAC-C-KC vectors were given by intradermal or intramuscular injection, and these routes tended to generate slightly lower responses. Responses were predominantly directed against Env in the animals that received NYVAC-C-KC vectors expressing HIV-1 Env, Gag, Pol, and Nef, while Gag responses were dominant in the animals immunized with NYVAC-C-KC expressing HIV-1 Gag, Pol, and Nef. The current study demonstrates that replication-competent NYVAC vectors were well tolerated and showed increased immunogenicity compared to replication-defective vectors. Further studies are needed to evaluate the most efficient route of immunization and to explore the use of these replication-competent NYVAC vectors in prime/boost combination with gp120 protein-based vaccine candidates. This study was performed within the Poxvirus T-cell Vaccine Discovery Consortium (PTVDC), which is part of the CAVD program.
Abstract:
The large spatial inhomogeneity in the transmit B1 field (B1+) observable in human MR images at high static magnetic field (B0) severely impairs image quality. To overcome this effect in brain T1-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times (MP2RAGE). By combining the two images in a novel fashion, it was possible to create T1-weighted images in which the result was free of proton density contrast, T2 contrast, reception bias field, and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize the contrast-to-noise ratio per unit of time between brain tissues and to minimize the effect of B1+ variations through space. Images of high anatomical quality and excellent brain tissue differentiation, suitable for applications such as segmentation and voxel-based morphometry, were obtained at 3 and 7 T. From such T1-weighted images, acquired within 12 min, high-resolution 3D T1 maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). The T1 maps were validated in phantom experiments. In humans, the T1 values obtained at 7 T were 1.15+/-0.06 s for white matter (WM) and 1.92+/-0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T1 values obtained (0.81+/-0.03 s for WM and 1.35+/-0.05 s for GM) were once again found to be in very good agreement with values in the literature.
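The combination of the two inversion-time images described above can be sketched in a few lines; this is a minimal illustration of the standard MP2RAGE ratio image (the function and array names are hypothetical, and a practical pipeline would add noise regularization):

```python
import numpy as np

def mp2rage_combine(gre_ti1, gre_ti2):
    """Combine two complex GRE images acquired at different inversion
    times into a uniform T1-weighted image, MP2RAGE-style.

    The ratio is bounded in [-0.5, 0.5] and, to first order, cancels
    proton density, T2* and reception bias field contributions, since
    those factors multiply both images identically."""
    num = np.real(gre_ti1 * np.conj(gre_ti2))
    den = np.abs(gre_ti1) ** 2 + np.abs(gre_ti2) ** 2
    # Avoid division by zero in background voxels.
    return np.where(den > 0, num / den, 0.0)
```

Because numerator and denominator share every voxel-wise multiplicative factor, the result depends only on the T1-driven difference between the two inversion contrasts.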
Abstract:
Starting in February 1994, 20 patients (pts) with a median age of 50 years (range 41-63) from 7 European centers have been included. Complete data were obtained in 16 patients so far. CPCs were mobilized with chemotherapy (epirubicin 75 mg/m2/d, d1 + d2) followed by G-CSF 5 µg/kg/d for 14 days. HD chemotherapy consisted of 3 sequential courses of the ICE regimen (ifosfamide 10 g/m2, carboplatin 1200 mg/m2, and etoposide 1200 mg/m2) under CPC protection and G-CSF 5 µg/kg/d. Of the 16 pts, 12 completed the full program (3 cycles). One pt died of septic shock before receiving any ICE course. One pt died during the first ICE course of renal insufficiency. Two pts had only 2 courses because of toxicity. Among the 16 pts, the response rate (RR) was: 7 CR, 6 PR, 1 PD; 3 pts were not evaluable due to early withdrawal (overall RR: 13/16 = 81%). Thirty-nine cycles of HD chemotherapy were given, with a median hematological recovery of 9 days (range 7-12) until neutrophil counts > 1.0 x 10^9/l and 9 days (range 7-17) until platelets > 20 x 10^9/l. No cumulative hematological toxicity was seen. Accrual of patients is still ongoing and updated results will be presented.
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine, which can handle only one job at a time, minimizing the maximum lateness. Each job is available for processing at its release date, requires a known processing time, and after finishing processing it is delivered after a certain time. There can also exist precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem consists in assigning a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls of the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
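The preemptive Longest Tail Heuristic invoked above can be sketched as follows. This is a minimal illustration of the building block only, not the paper's chain-time-lag algorithm; the job encoding as (release, processing, tail) triples and the event-driven loop are assumptions:

```python
import heapq

def preemptive_longest_tail(jobs):
    """Preemptive Longest Tail Heuristic for one machine.

    jobs: list of (release, processing, tail) triples.  At every moment
    the available job with the largest tail (delivery time) runs; a newly
    released job with a larger tail preempts the current one.  Returns
    max_j (C_j + q_j), which is optimal for the preemptive relaxation and
    hence a lower bound for the non-preemptive problem."""
    n = len(jobs)
    order = sorted(range(n), key=lambda j: jobs[j][0])  # by release date
    heap = []   # entries: (-tail, job index, remaining processing time)
    t, i, best = 0, 0, 0
    while i < n or heap:
        if not heap and jobs[order[i]][0] > t:
            t = jobs[order[i]][0]              # idle until next release
        while i < n and jobs[order[i]][0] <= t:
            r, p, q = jobs[order[i]]
            heapq.heappush(heap, (-q, order[i], p))
            i += 1
        negq, j, rem = heapq.heappop(heap)     # largest-tail available job
        # Run until the job finishes or the next release event occurs.
        run = rem if i >= n else min(rem, jobs[order[i]][0] - t)
        t += run
        if run < rem:
            heapq.heappush(heap, (negq, j, rem - run))  # possibly preempted
        else:
            best = max(best, t - negq)         # job j completes at t
    return best
```

For example, with jobs `[(0, 3, 5), (1, 2, 10)]` the second job preempts the first at time 1 and the objective is 13, whereas the non-preemptive sequence starting with job 0 would give 15.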
Abstract:
In this paper we study, as in Jeon-Menicucci (2009), competition between sellers when each of them sells a portfolio of distinct products to a buyer having limited slots. This paper considers sequential pricing and complements our main paper (Jeon-Menicucci, 2009), which considers simultaneous pricing. First, Jeon-Menicucci (2009) find that under simultaneous individual pricing, equilibrium often does not exist and hence the outcome is often inefficient. By contrast, equilibrium always exists under sequential individual pricing, and we characterize it in this paper. We find that each seller faces a trade-off between the number of slots he occupies and surplus extraction per product, and there is no particular reason that this leads to an efficient allocation of slots. Second, Jeon-Menicucci (2009) find that when bundling is allowed, there always exists an efficient equilibrium, but inefficient equilibria can also exist due to pure bundling (for physical products) or slotting contracts. Under sequential pricing, we find that all equilibria are efficient regardless of whether firms can use slotting contracts, both for digital goods and for physical goods. Therefore, sequential pricing presents an even stronger case for laissez-faire in the matter of bundling than simultaneous pricing.
Abstract:
In experiments with two-person sequential games we analyze whether responses to favorable and unfavorable actions depend on the elicitation procedure. In our hot treatment the second player responds to the first player's observed action, while in our cold treatment we follow the strategy method and have the second player decide on a contingent action for each and every possible first-player move, without first observing this move. Our analysis centers on the degree to which subjects deviate from the maximization of their pecuniary rewards as a response to others' actions. Our results show no difference in behavior between the two treatments. We also find evidence of the stability of subjects' preferences with respect to their behavior over time and the consistency of their choices as first and second mover.
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
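The general idea of combining simple predictors can be sketched with a standard exponentially weighted aggregation scheme. This is an illustration of the aggregation principle, not the paper's exact construction; the expert set and the learning rate `eta` are assumptions:

```python
import math

def aggregate_predict(experts, sequence, eta=0.5):
    """Online aggregation of expert forecasts under squared loss.

    experts: list of functions mapping the observed past (a list) to a
    real-valued forecast.  At each step the combined forecast is the
    weighted average of the expert forecasts, with weights that decay
    exponentially in each expert's cumulative squared error."""
    k = len(experts)
    losses = [0.0] * k          # cumulative squared error per expert
    preds = []
    for t, y in enumerate(sequence):
        past = sequence[:t]
        f = [e(past) for e in experts]
        w = [math.exp(-eta * L) for L in losses]
        s = sum(w)
        preds.append(sum(wi * fi for wi, fi in zip(w, f)) / s)
        for i in range(k):      # update after y is revealed
            losses[i] += (f[i] - y) ** 2
    return preds
```

For instance, aggregating a last-value predictor and a constant-zero predictor on the sequence 1, 1, 1, ... shifts the weight toward the last-value expert, so the combined forecast approaches 1.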
Abstract:
In this work we study older workers' (50-64) labor force transitions after a health/disability shock. We find that the probability of continuing to work decreases with both age and the severity of the shock. Moreover, we find strong interactions between age and severity in the 50-64 age range and none in the 30-49 age range. Regarding demographics, we find that being female and being married reduce the probability of continuing to work. On the contrary, being the main breadwinner and higher education and skill levels increase it. Interestingly, the effect of some demographics changes sign when we look at transitions from inactivity to work. This is the case for being married or having a working spouse. Undoubtedly, leisure complementarities should play a role in the latter case. Since the data we use contain very detailed information on disabilities, we are able to evaluate the marginal effect of each type of disability on either the probability of continuing to work or that of returning to work. Some of these results may have strong policy implications.
Abstract:
Confidence intervals in econometric time series regressions suffer from notorious coverage problems. This is especially true when the dependence in the data is noticeable and sample sizes are small to moderate, as is often the case in empirical studies. This paper suggests using the studentized block bootstrap and discusses practical issues, such as the choice of the block size. A particular data-dependent method is proposed to automate that choice. As a side note, it is pointed out that symmetric confidence intervals are preferred over equal-tailed ones, since they exhibit improved coverage accuracy. The improvements in small-sample performance are supported by a simulation study.
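A block-bootstrap symmetric interval for the mean of a dependent series can be sketched as follows. This is a simplified illustration, not the paper's procedure: the studentizing scale used here is the naive i.i.d. standard error rather than a dependence-consistent variance estimator, and the block length is fixed by hand instead of being chosen by the paper's data-dependent rule:

```python
import random
import statistics

def block_bootstrap_ci(x, block_len, n_boot=2000, alpha=0.05, seed=0):
    """Symmetric studentized moving-blocks bootstrap CI for the mean.

    Resamples overlapping blocks of length block_len to preserve local
    dependence, studentizes each bootstrap mean, and inverts the
    bootstrap distribution of |t*| to get a symmetric interval."""
    rng = random.Random(seed)
    n = len(x)
    mean = statistics.fmean(x)
    se = statistics.stdev(x) / n ** 0.5      # crude studentizing scale
    blocks = [x[i:i + block_len] for i in range(n - block_len + 1)]
    tstats = []
    for _ in range(n_boot):
        sample = []
        while len(sample) < n:               # concatenate random blocks
            sample.extend(rng.choice(blocks))
        sample = sample[:n]
        m = statistics.fmean(sample)
        s = statistics.stdev(sample) / n ** 0.5
        tstats.append(abs(m - mean) / s)     # symmetric: use |t*|
    tstats.sort()
    q = tstats[int((1 - alpha) * n_boot)]    # bootstrap critical value
    return mean - q * se, mean + q * se
```

Using the quantile of |t*| rather than the two tail quantiles separately is what makes the interval symmetric, which is the variant the abstract argues has better coverage accuracy.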
Abstract:
We propose a rule of decision-making, the sequential procedure guided by routes, and show that three influential boundedly rational choice models can be equivalently understood as special cases of this rule. In addition, the sequential procedure guided by routes is instrumental in showing that the three models are intimately related. We show that choice with a status-quo bias is a refinement of rationalizability by game trees, which, in turn, is a refinement of sequential rationalizability. Thus, we provide a sharp taxonomy of these choice models, and show that they all can be understood as choice by sequential procedures.
Abstract:
We prove the algebraic equality between Jennrich's (1970) asymptotic $X^2$ test for equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
Abstract:
Small-sample properties are of fundamental interest when only limited data are available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by the lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses.
Abstract:
Price bubbles in an Arrow-Debreu valuation equilibrium in an infinite-time economy are a manifestation of the lack of countable additivity of the valuation of assets. In contrast, known examples of price bubbles in sequential equilibrium in infinite time cannot be attributed to the lack of countable additivity of valuation. In this paper we develop a theory of valuation of assets in sequential markets (with no uncertainty) and study the nature of price bubbles in light of this theory. We consider an operator, called the payoff pricing functional, that maps a sequence of payoffs to the minimum cost of an asset holding strategy that generates it. We show that the payoff pricing functional is linear and countably additive on the set of positive payoffs if and only if there is no Ponzi scheme, provided that there is no restriction on long positions in the assets. In the known examples of equilibrium price bubbles in sequential markets, valuation is linear and countably additive. The presence of a price bubble indicates that the asset's dividends can be purchased in sequential markets at a cost lower than the asset's price. We also present examples of equilibrium price bubbles in which valuation is nonlinear and not countably additive.