896 results for Sequential quadratic programming
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field.
We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
Abstract:
AIMS: To investigate empirically the hypothesized relationship between counsellor motivational interviewing (MI) skills and patient change talk (CT) by analysing the articulation between counsellor behaviours and patient language during brief motivational interventions (BMI) addressing at-risk alcohol consumption. DESIGN: Sequential analysis of psycholinguistic codes obtained by two independent raters using the Motivational Interviewing Skill Code (MISC), version 2.0. SETTING: Secondary analysis of data from a randomized controlled trial evaluating the effectiveness of BMI in an emergency department. PARTICIPANTS: A total of 97 patients tape-recorded when receiving BMI. MEASUREMENTS: MISC variables were categorized into three counsellor behaviours (MI-consistent, MI-inconsistent and 'other') and three kinds of patient language (CT, counter-CT (CCT) and utterances not linked with the alcohol topic). Observed transition frequencies, conditional probabilities and significance levels based on odds ratios were computed using sequential analysis software. FINDINGS: MI-consistent behaviours were the only counsellor behaviours that were significantly more likely to be followed by patient CT. Those behaviours were significantly more likely to be followed by patient change exploration (CT and CCT) while MI-inconsistent behaviours and 'other' counsellor behaviours were significantly more likely to be followed by utterances not linked with the alcohol topic and significantly less likely to be followed by CT. MI-consistent behaviours were more likely after change exploration, whereas 'other' counsellor behaviours were more likely only after utterances not linked with the alcohol topic. CONCLUSIONS: Findings lend support to the hypothesized relationship between MI-consistent behaviours and CT, highlight the importance of patient influence on counsellor behaviour and emphasize the usefulness of MI techniques and spirit during brief interventions targeting change enhancement.
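The lag-1 sequential analysis summarized above (transition frequencies and conditional probabilities from coded counsellor-patient exchanges) can be sketched as follows; the code labels and the toy session are hypothetical illustrations, not the MISC coding scheme or the study's actual software:

```python
from collections import Counter

# Hypothetical coded session: counsellor codes (MICO = MI-consistent,
# MIIN = MI-inconsistent, OTH = other) interleaved with patient codes
# (CT = change talk, CCT = counter-change talk, NC = not alcohol-related).
session = ["MICO", "CT", "MIIN", "NC", "MICO", "CCT", "OTH", "NC", "MICO", "CT"]

def transition_stats(codes, antecedents, consequents):
    """Lag-1 transition frequencies and conditional probabilities."""
    pairs = Counter(zip(codes, codes[1:]))   # count adjacent code pairs
    totals = Counter(codes[:-1])             # how often each antecedent occurs
    stats = {}
    for a in antecedents:
        for c in consequents:
            n = pairs[(a, c)]
            stats[(a, c)] = (n, n / totals[a] if totals[a] else 0.0)
    return stats

stats = transition_stats(session,
                         ["MICO", "MIIN", "OTH"],
                         ["CT", "CCT", "NC"])
for (a, c), (n, p) in stats.items():
    print(f"{a} -> {c}: n={n}, P={p:.2f}")
```

In the toy session, MI-consistent behaviour is followed by change talk in two of its three occurrences, mirroring the kind of conditional-probability evidence the study reports.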
Abstract:
In this work we present a Web-based tool developed to reinforce the teaching and learning of introductory programming courses. From the teacher's perspective, the system introduces important gains with respect to the classical teaching methodology: it reinforces lecture and laboratory sessions, makes it possible to give personalized attention to students, assesses their degree of participation and, most importantly, performs a continuous assessment of each student's progress. From the student's perspective, it provides a learning framework, consisting of a help environment and a correction environment, which facilitates their personal work. With this tool, students are more motivated to program.
Abstract:
OBJECTIVE To better define the concordance of visual loss in patients with nonarteritic anterior ischemic optic neuropathy (NAION). METHODS The medical records of 86 patients with bilateral sequential NAION were reviewed retrospectively, and visual function was assessed using visual acuity, Goldmann visual fields, color vision, and relative afferent pupillary defect. A quantitative total visual field score and score per quadrant were analyzed for each eye using the numerical Goldmann visual field scoring method. RESULTS Outcome measures were visual acuity, visual field, color vision, and relative afferent pupillary defect. A statistically significant correlation was found between fellow eyes for multiple parameters, including logMAR visual acuity (P = .01), global visual field (P < .001), superior visual field (P < .001), and inferior visual field (P < .001). The mean deviation of total (P < .001) and pattern (P < .001) deviation analyses was significantly less between fellow eyes than between first and second eyes of different patients. CONCLUSIONS Visual function between fellow eyes showed a fair to moderate correlation that was statistically significant. The pattern of vision loss was also more similar in fellow eyes than between eyes of different patients. These results may allow better prediction of visual outcome for the second eye in patients with NAION.
Abstract:
A sequential treatment design was chosen in this trial to ensure complete resistance to single-agent non-steroidal aromatase inhibitor (AI) and trastuzumab, both given as monotherapy before receiving the combination of a non-steroidal AI and trastuzumab. Key eligibility criteria included postmenopausal patients with advanced, measurable, human epidermal growth factor receptor-2 (HER-2)-positive disease (assessed by FISH, ratio ≥2), hormone receptor (HR)-positive disease, and progression on prior treatment with a non-steroidal AI, e.g. letrozole or anastrozole, either in the adjuvant or in the advanced setting. Patients received standard-dose trastuzumab monotherapy in step 1 and, upon disease progression, continued trastuzumab in combination with letrozole in step 2. The primary endpoint was the clinical benefit rate (CBR) in step 2. In total, 13 patients were enrolled. In step 1, six patients (46%) achieved CBR. Median time to progression (TTP) was 161 days (95% confidence interval (CI): 82-281). In step 2, CBR was observed in eight of the 11 evaluable patients (73%), including one patient with a partial response. Median TTP for the 11 patients was 188 days (95% CI: 77-not reached). As all patients served as their own control, the results of this proof-of-concept trial suggest that complete resistance to both AI and trastuzumab can be overcome in a proportion of patients by combined treatment with AI and trastuzumab. Our results appear promising for a new treatment strategy that offers a chemotherapy-free option for at least a subset of patients with HR-positive, HER-2-positive breast cancer over a clinically relevant time period.
Abstract:
The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based either on the integration of projects into optimization models or on iterative procedures with information exchange between two organizational levels. The methodologies applied afterwards are based on variational inequalities, bilevel programming and linear or nonlinear complementarity. Their foundations and different applications related to project evaluation are explored. In fact, these new tools are closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
Abstract:
SNARE complexes are required for membrane fusion in the endomembrane system. They contain coiled-coil bundles of four helices, three (Q(a), Q(b), and Q(c)) from target (t)-SNAREs and one (R) from the vesicular (v)-SNARE. NSF/Sec18 disrupts these cis-SNARE complexes, allowing reassembly of their subunits into trans-SNARE complexes and subsequent fusion. Studying these reactions in native yeast vacuoles, we found that NSF/Sec18 activates the vacuolar cis-SNARE complex by selectively displacing the vacuolar Q(a) SNARE, leaving behind a Q(bc)R subcomplex. This subcomplex serves as an acceptor for a Q(a) SNARE from the opposite membrane, leading to Q(a)-Q(bc)R trans-complexes. Activity tests of vacuoles with diagnostic distributions of inactivating mutations over the two fusion partners confirm that this distribution accounts for a major share of the fusion activity. The persistence of the Q(bc)R cis-complex and the formation of the Q(a)-Q(bc)R trans-complex are both sensitive to the Rab-GTPase inhibitor, GDI, and to mutations in the vacuolar tether complex, HOPS (HOmotypic fusion and vacuolar Protein Sorting complex). This suggests that the vacuolar Rab-GTPase, Ypt7, and HOPS restrict cis-SNARE disassembly and thereby bias trans-SNARE assembly into a preferred topology.
Abstract:
A new formula for glomerular filtration rate estimation in the pediatric population from 2 to 18 years of age has been developed by the University Unit of Pediatric Nephrology. This quadratic formula, accessible online, allows pediatricians to adjust drug dosage and/or follow up renal function more precisely and in an easy manner.
Abstract:
Business process designers take into account the resources that the processes will need but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solve this problem, together with a comparison of the two approaches and of the proposed algorithms for solving them.
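As a rough illustration of a cost that combines time- and usage-dependent rates (the function, tariff structure and names below are assumptions for this sketch, not the report's formalization):

```python
# Hypothetical cost model: a resource's cost for an activity combines a
# usage-dependent rate (per hour of work) with a time-dependent energy
# tariff that varies by hour of day.
def activity_cost(start, duration, usage_rate, tariff):
    """Cost of an activity on [start, start+duration), with `tariff` a list
    of (hour_from, price_per_hour) breakpoints sorted by hour_from."""
    def price_at(t):
        p = tariff[0][1]
        for h, price in tariff:
            if t >= h:
                p = price          # last breakpoint not after t applies
        return p
    # integrate the energy tariff in unit (hourly) steps for simplicity
    energy = sum(price_at(start + i) for i in range(duration))
    return usage_rate * duration + energy

# Example: flat usage rate of 2 per hour, cheap energy before hour 8.
tariff = [(0, 1.0), (8, 3.0)]
cost = activity_cost(6, 4, 2.0, tariff)   # hours 6,7 at 1.0; hours 8,9 at 3.0
```

Under such a model, shifting the same activity into the cheap-tariff window changes its cost even though its resource usage is unchanged, which is why scheduling must happen at enactment time.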
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest-scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest-scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest-scoring gene ending in a given exon can be obtained by appending the exon to the best among the highest-scoring genes ending at each compatible preceding exon. The algorithm presented here relies on the simple fact that this best preceding gene can be stored and updated, which requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally defined in the so-called Gene Model, which simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem; in particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
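A minimal sketch of the linear-scan assembly idea, assuming exon compatibility reduces to non-overlap (the paper's frame compatibility and Gene Model constraints are omitted, and only the best score is tracked, not the gene itself):

```python
# Hypothetical exons as (start, end, score) triples, assumed distinct.
def best_gene_score(exons):
    """Sweep exons by start (acceptor) position while 'closing' exons by
    end (donor) position, keeping a running best over all closed chains,
    so each exon is touched a constant number of times."""
    by_start = sorted(exons, key=lambda e: e[0])
    by_end = sorted(exons, key=lambda e: e[1])
    best_closed = 0.0   # best chain score over exons ending before current start
    best = {}           # best chain score ending at each exon
    j = 0
    overall = 0.0
    for s, e, sc in by_start:
        # "close" every exon ending strictly before this exon starts
        while j < len(by_end) and by_end[j][1] < s:
            best_closed = max(best_closed, best[by_end[j]])
            j += 1
        best[(s, e, sc)] = best_closed + sc
        overall = max(overall, best[(s, e, sc)])
    return overall

# Example: the first and third exons are compatible; the second overlaps both.
score = best_gene_score([(0, 10, 5.0), (5, 15, 7.0), (12, 20, 4.0)])
```

Because `best_closed` summarizes all compatible predecessors, no exon-by-exon lookback is needed, which is what removes the quadratic factor.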
Abstract:
Wireless “MIMO” systems, employing multiple transmit and receive antennas, promise a significant increase of channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of Bayesian equations, a (Rao–Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
Abstract:
In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that also the number of channel taps is unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
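A greatly simplified illustration of the particle-filter recursion behind such estimators; it tracks a single real-valued channel tap rather than a random set of taps, so the paper's RST modelling and Rao-Blackwellization are not represented:

```python
import math
import random

def particle_filter(ys, xs, n=500, q=0.05, r=0.1, seed=1):
    """Bootstrap particle filter for a scalar tap h_t with random-walk
    dynamics h_t = h_{t-1} + N(0, q) and pilot observations
    y_t = h_t * x_t + N(0, r). Returns the posterior-mean estimates."""
    rng = random.Random(seed)
    parts = [rng.gauss(0, 1) for _ in range(n)]   # prior particles
    estimates = []
    for y, x in zip(ys, xs):
        # propagate through the random-walk state model
        parts = [h + rng.gauss(0, q) for h in parts]
        # weight by the Gaussian observation likelihood
        ws = [math.exp(-0.5 * ((y - h * x) / r) ** 2) for h in parts]
        tot = sum(ws)
        ws = [w / tot for w in ws]
        estimates.append(sum(w * h for w, h in zip(ws, parts)))
        # multinomial resampling to avoid weight degeneracy
        parts = rng.choices(parts, weights=ws, k=n)
    return estimates

# Toy run: true tap near 0.7, all-ones pilot symbols, lightly noisy pilots.
ys = [0.7 + 0.01 * ((-1) ** i) for i in range(30)]
est = particle_filter(ys, [1.0] * 30)
```

The propagate/weight/resample loop is the generic skeleton; the paper replaces the scalar state with a finite random set of taps and marginalizes the amplitudes analytically.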
Abstract:
In this work we study older workers' (50-64) labor force transitions after a health/disability shock. We find that the probability of continuing to work decreases with both age and the severity of the shock. Moreover, we find strong interactions between age and severity in the 50-64 age range and none in the 30-49 age range. Regarding demographics, we find that being female and being married reduce the probability of continuing to work; on the contrary, being the main breadwinner and having higher education and skill levels increase it. Interestingly, the effect of some demographic variables changes sign when we look at transitions from inactivity to work, as is the case for being married or having a working spouse. Undoubtedly, leisure complementarities play a role in the latter case. Since the data we use contain very detailed information on disabilities, we are able to evaluate the marginal effect of each type of disability on both the probability of continuing to work and that of returning to work. Some of these results may have strong policy implications.