273 results for "Optimal matching analysis".
Abstract:
A three-phase LPME (liquid-phase microextraction) method for the enantioselective analysis of the venlafaxine (VF) metabolites O-desmethylvenlafaxine (ODV) and N-desmethylvenlafaxine (NDV) in microsomal preparations is described for the first time. The assay involves chiral HPLC separation of the drug and its metabolites on a Chiralpak AD column under normal-phase elution with detection at 230 nm. The LPME procedure was optimized using multifactorial experiments, and the following optimal conditions were established: sample agitation at 1,750 rpm, 20 min of extraction, 0.1 mol/L acetic acid as the acceptor phase, 1-octanol as the organic phase and donor-phase pH adjusted to 10.0. Under these conditions, the mean recoveries were 41% and 42% for (-)-(R)-ODV and (+)-(S)-ODV, respectively, and 47% and 48% for (-)-(R)-NDV and (+)-(S)-NDV, respectively. The method presented quantification limits of 200 ng/mL and was linear over the concentration range of 200-5,000 ng/mL for all analytes. The validated method was employed to study the in vitro biotransformation of VF using the rat liver microsomal fraction. The results demonstrated the enantioselective biotransformation of VF.
Abstract:
In a decentralized setting, the game-theoretical prediction is that only strong blockings are allowed to rupture the structure of a matching. This paper argues that, under indifferences, weak blockings should also be considered when these blockings come from the grand coalition. This solution concept requires stability plus Pareto optimality. A characterization of the set of Pareto-stable matchings for the roommate and marriage models is provided in terms of individually rational matchings whose blocking pairs, if any, are formed with unmatched agents. These matchings always exist and provide economic intuition about how blocking can be carried out by non-trading agents, so that transactions need not be undone as agents reach the set of stable matchings. Some properties of the Pareto-stable matchings shared by the marriage and roommate models are also obtained.
Abstract:
A stable matching rule is used as the outcome function for the Admission game, where colleges behave straightforwardly and the students' strategies are given by their preferences over the colleges. We show that the college-optimal stable matching rule implements the set of stable matchings via the Nash equilibrium (NE) concept. For any other stable matching rule, the strategic behavior of the students may lead to outcomes that are not stable under the true preferences. We then introduce uncertainty about the matching selected and prove that the natural solution concept is that of NE in the strong sense. A general result shows that the random stable matching rule, as well as any stable matching rule, implements the set of stable matchings via NE in the strong sense. Precise answers are given to the strategic questions raised.
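The college-optimal stable matching rule mentioned in this abstract is classically produced by college-proposing deferred acceptance (Gale-Shapley). As an illustration only — a minimal one-seat-per-college Python sketch with hypothetical names, not the paper's formal construction:

```python
def college_optimal_match(college_prefs, student_prefs):
    """College-proposing deferred acceptance with one seat per college.
    The proposing side (colleges) obtains its optimal stable matching."""
    free = list(college_prefs)                 # colleges still proposing
    next_pick = {c: 0 for c in college_prefs}  # index of next student to try
    held = {}                                  # student -> tentatively held college
    rank = {s: {c: i for i, c in enumerate(p)} for s, p in student_prefs.items()}
    while free:
        c = free.pop()
        s = college_prefs[c][next_pick[c]]
        next_pick[c] += 1
        if s not in held:
            held[s] = c                        # student tentatively accepts
        elif rank[s][c] < rank[s][held[s]]:
            free.append(held[s])               # student trades up to c
            held[s] = c
        else:
            free.append(c)                     # proposal rejected
    return held

colleges = {'A': ['x', 'y'], 'B': ['x', 'y']}  # hypothetical preference lists
students = {'x': ['B', 'A'], 'y': ['A', 'B']}
college_optimal_match(colleges, students)      # {'x': 'B', 'y': 'A'}
```
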
Abstract:
This article assesses whether innovators outperform non-innovators in Brazilian manufacturing during 1996-2002. To do so, we begin with a simple theoretical model and test the impacts of technological innovation (treatment) on innovating firms (treated) by employing propensity score matching techniques. Correcting for survivorship bias in the period, we verified that, on average, the accomplishment of technological innovations produces positive and significant impacts on the employment, net revenue, labor productivity, capital productivity, and market share of the firms. However, this result was not observed for the mark-up. In particular, net revenue reflects the impacts of the innovations most robustly. Quantitatively speaking, innovating firms experienced 10.8-12.5 percentage points (p.p. henceforth) higher growth in employment, 18.1-21.7 p.p. higher growth in net revenue, 10.8-11.9 p.p. higher growth in labor productivity, 11.8-12.0 p.p. higher growth in capital productivity, and 19.9-24.3 p.p. higher growth in market share, relative to the average of the non-innovating firms in the control group. It was also observed that the combination of product and process innovations, relative to other forms of innovation, has the strongest impact on the performance of Brazilian firms.
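The matching step behind such treatment-effect estimates can be sketched as nearest-neighbor matching on the propensity score, followed by averaging treated-minus-matched-control outcome differences (the ATT). A minimal Python sketch, assuming the scores have already been estimated (e.g. by a logistic regression) and using purely illustrative numbers:

```python
def att_nn_match(treated, controls):
    """Average treatment effect on the treated via 1-nearest-neighbor
    propensity score matching with replacement.
    treated/controls: lists of (propensity_score, outcome) pairs."""
    diffs = []
    for ps_t, y_t in treated:
        # closest control unit on the propensity score
        _, y_c = min(controls, key=lambda c: abs(c[0] - ps_t))
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)

treated = [(0.8, 12.0), (0.6, 10.0)]              # illustrative data
controls = [(0.75, 9.0), (0.55, 8.5), (0.2, 5.0)]
att_nn_match(treated, controls)                   # (3.0 + 1.5) / 2 = 2.25
```
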
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make an inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal theoretical efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference is that macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required to solve the model as an optimization problem, such as dropping the full vintaging of the capital stock and representing fewer explicit technological options, likely affect the results.
Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
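The efficiency condition the abstract imposes on the recursive version — the CO2 allowance price rising at the interest rate, so that the discounted price is equalized across periods — can be written out directly. A small sketch with hypothetical numbers (the $25/tCO2 starting price and 4% rate are illustrative assumptions, not the paper's data):

```python
def allowance_price_path(p0, interest_rate, n_years):
    """Inter-temporally efficient allowance price path: with banking and
    borrowing, the price grows at the interest rate, p_t = p0 * (1+r)^t,
    so abatement effort is allocated efficiently through time."""
    return [p0 * (1 + interest_rate) ** t for t in range(n_years)]

# hypothetical $25/tCO2 starting price, 4% interest rate
allowance_price_path(25.0, 0.04, 3)  # approximately [25.0, 26.0, 27.04]
```
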
Abstract:
Background: Meta-analysis is increasingly being employed as a screening procedure in large-scale association studies to select promising variants for follow-up studies. However, standard methods for meta-analysis require the assumption of an underlying genetic model, which is typically unknown a priori. This drawback can introduce model misspecification, causing power to be suboptimal, or require the evaluation of multiple genetic models, which increases the number of false-positive associations, ultimately wasting resources on fruitless replication studies. We used simulated meta-analyses of large genetic association studies to investigate naive strategies of genetic model specification to optimize screenings of genome-wide meta-analysis signals for further replication. Methods: Different methods, meta-analytical models and strategies were compared in terms of power and type-I error. Simulations were carried out for a binary trait over a wide range of true genetic models, genome-wide thresholds, minor allele frequencies (MAFs), odds ratios and between-study heterogeneity (tau²). Results: Among the investigated strategies, a simple Bonferroni-corrected approach that fits both multiplicative and recessive models was found to be optimal in most examined scenarios, reducing the likelihood of false discoveries and enhancing power in scenarios with small MAFs, either in the presence or absence of heterogeneity. Nonetheless, this strategy is sensitive to tau² whenever the susceptibility allele is common (MAF ≥ 30%), resulting in an increased number of false-positive associations compared with an analysis that considers only the multiplicative model. Conclusion: Invoking a simple Bonferroni adjustment and testing both multiplicative and recessive models is fast and optimal in large meta-analysis-based screenings. However, care must be taken when the examined variants are common, where specification of the multiplicative model alone may be preferable.
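The recommended screening strategy reduces to a one-line decision rule: fit both the multiplicative and the recessive model and compare the smaller p-value against a Bonferroni-halved threshold. A minimal sketch (the 5e-8 genome-wide threshold is a conventional default assumed here, not taken from the paper):

```python
def screen_variant(p_multiplicative, p_recessive, alpha=5e-8):
    """Bonferroni-corrected two-model screen: because two genetic models
    are tested per variant, each p-value is judged against alpha / 2."""
    return min(p_multiplicative, p_recessive) < alpha / 2

screen_variant(1e-9, 0.3)    # True: a recessive-blind fit would still flag it
screen_variant(4e-8, 4e-8)   # False: neither model clears the halved threshold
```
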
Abstract:
Background: Understanding how clinical variables affect stress distribution facilitates optimal prosthesis design and fabrication, and may lead to fewer mechanical failures as well as improved implant longevity. Purpose: In this study, the many clinical variations present in implant-supported prostheses were analyzed by the 3-D finite element method. Materials and Methods: A geometric model representing the anterior segment of a human mandible treated with 5 implants supporting a framework was created to perform the tests. The variables introduced into the computer model were cantilever length, elastic modulus of cancellous bone, abutment length, implant length, and framework alloy (AgPd or CoCr). The computer was programmed with the physical properties of the materials as derived from the literature, and a 100 N vertical load was used to simulate the occlusal force. Images with the fringes of stress were obtained, and the maximum stress at each site was plotted in graphs for comparison. Results: Stresses clustered at the elements closest to the loading point. The stress increase was found to be proportional to the increase in cantilever length and inversely proportional to the increase in the elastic modulus of cancellous bone. Increasing the abutment length resulted in a decrease of stress on implants and framework. A stress decrease could not be demonstrated with implants longer than 13 mm. A stiffer framework may allow better stress distribution. Conclusion: The relative physical properties of the many materials involved in an implant-supported prosthesis system affect the way stresses are distributed.
Abstract:
The optimal formulation for the preparation of amaranth flour films plasticized with glycerol or sorbitol was obtained by multi-response analysis. The optimization aimed to achieve films with higher resistance to breakage, moderate elongation and lower solubility in water. The influence of plasticizer concentration (Cg, glycerol, or Cs, sorbitol) and process temperature (Tp) on the mechanical properties and solubility of the amaranth flour films was initially studied by response surface methodology (RSM). The optimized conditions obtained were Cg 20.02 g glycerol/100 g flour at Tp 75 degrees C, and Cs 29.6 g sorbitol/100 g flour at Tp 75 degrees C. Characterization of the films prepared with these formulations revealed that the optimization methodology employed in this work was satisfactory. Sorbitol was the more suitable plasticizer: it furnished amaranth flour films that were more resistant to breakage and less permeable to oxygen, owing to its greater miscibility with the biopolymers present in the flour and its lower affinity for water. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
The main goal of this work was to evaluate the thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 degrees C. The distribution coefficients of oil at equilibrium were used to calculate the enthalpy, entropy and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, especially at higher temperatures. The influence of the water level in the solvent and of temperature was also analysed using response surface methodology (RSM). The extraction yield was highly affected by both independent variables. A joint analysis of the thermodynamic parameters and the RSM results indicates the optimal levels of solvent hydration and temperature at which to perform the extraction process.
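The thermodynamic step described — obtaining enthalpy, entropy and free energy changes from equilibrium distribution coefficients — typically runs through the van't Hoff relation ln K = -ΔH/(RT) + ΔS/R. A hedged sketch with illustrative numbers (not the paper's data):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff(K1, T1, K2, T2):
    """Estimate dH and dS from equilibrium constants at two temperatures
    via van't Hoff, then dG at T1 (dG = dH - T1*dS = -R*T1*ln(K1))."""
    dH = R * math.log(K2 / K1) / (1 / T1 - 1 / T2)  # slope of ln K vs 1/T
    dS = R * math.log(K1) + dH / T1                  # intercept term
    dG = dH - T1 * dS
    return dH, dS, dG

# illustrative distribution coefficients at 50 and 100 degrees C
dH, dS, dG = vant_hoff(2.0, 323.15, 3.0, 373.15)
dG < 0  # True: negative free energy change, i.e. a spontaneous process
```
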
An improved estimate of leaf area index based on the histogram analysis of hemispherical photographs
Abstract:
Leaf area index (LAI) is a key parameter that affects the surface fluxes of energy, mass, and momentum over vegetated lands, but observational measurements are scarce, especially in remote areas with complex canopy structure. In this paper we present an indirect method to calculate LAI based on the analysis of histograms of hemispherical photographs. The optimal threshold value (OTV), the gray level required to separate the background (sky) from the foreground (leaves), was calculated analytically using the entropy crossover method (Sahoo, P.K., Slaaf, D.W., Albert, T.A., 1997. Threshold selection using a minimal histogram entropy difference. Optical Engineering 36(7), 1976-1981). The OTV was used to calculate LAI using the well-known gap fraction method. This methodology was tested in two different ecosystems, Amazon forest and pasturelands in Brazil. In general, the error between observed and calculated LAI was approximately 6%. The methodology presented is suitable for the calculation of LAI since it is responsive to sky conditions, automatic, easy to implement, faster than commercially available software, and requires less data storage. (C) 2008 Elsevier B.V. All rights reserved.
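Given the OTV, the gap fraction step reduces to counting above-threshold (sky) pixels and inverting a Beer-Lambert light-extinction law. A minimal sketch; the extinction coefficient k = 0.5 and the single-angle form are simplifying assumptions here, since the full gap-fraction inversion depends on view zenith angle:

```python
import math

def lai_from_gray_levels(pixels, threshold, k=0.5):
    """LAI from a thresholded hemispherical image: the gap fraction is the
    share of sky pixels, and LAI = -ln(gap_fraction) / k (Beer-Lambert).
    pixels: flat list of gray levels; threshold: e.g. the entropy-based OTV."""
    sky = sum(1 for p in pixels if p > threshold)  # background = bright sky
    gap_fraction = sky / len(pixels)
    return -math.log(gap_fraction) / k

# toy 'image': 40% sky pixels above a hypothetical threshold of 128
lai_from_gray_levels([200] * 40 + [30] * 60, 128)  # ~1.83
```
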
Abstract:
Background: Allergic lung inflammation is impaired in diabetic rats and is restored by insulin treatment. In the present study we investigated the effect of insulin on the signaling pathways triggered by allergic inflammation in the lung and on the release of selected mediators. Methods: Diabetic male Wistar rats (alloxan, 42 mg/kg, i.v., 10 days) and matching controls were sensitized by s.c. injections of ovalbumin (OA) in aluminium hydroxide 14 days before the OA (1 mg/0.4 ml) or saline intratracheal challenge. A group of diabetic rats was treated with neutral protamine Hagedorn insulin (NPH, 4 IU, s.c.) 2 h before the OA challenge. Six hours after the challenge, bronchoalveolar lavage (BAL) was performed to assess mediator release, and lung tissue was homogenized for Western blotting analysis of signaling pathways. Results: Relative to non-diabetic rats, the diabetic rats exhibited a significant reduction in OA-induced phosphorylation of the extracellular signal-regulated kinase (ERK, 59%), p38 (53%), protein kinase B (Akt, 46%), protein kinase C (PKC)-alpha (63%) and PKC-delta (38%) in lung homogenates following the antigen challenge. Activation of the NF-kappa B p65 subunit and phosphorylation of I kappa B alpha were almost completely suppressed in diabetic rats. Reduced expression of inducible nitric oxide synthase (iNOS, 32%) and cyclooxygenase-2 (COX-2, 46%) in the lung homogenates was also observed. The BAL concentrations of prostaglandin E2 (PGE2), nitric oxide (NO) and interleukin (IL)-6 were reduced in diabetic rats (74%, 44% and 65%, respectively), whereas the cytokine-induced neutrophil chemoattractant (CINC)-2 concentration was not different from that of the control animals. Treatment of diabetic rats with insulin completely or partially restored all of these parameters, although this protocol of insulin treatment only partially reduced the blood glucose levels.
Conclusion: The data presented show that insulin regulates the MAPK, PI3K, PKC and NF-kappa B pathways, the expression of the inducible enzymes iNOS and COX-2, and the levels of NO, PGE2 and IL-6 in the early phase of allergic lung inflammation in diabetic rats. It is suggested that insulin is required for optimal transduction of the intracellular signals that follow allergic stimulation. Copyright (C) 2010 S. Karger AG, Basel
Abstract:
A new method for the characterization and analysis of aggregate particles in asphaltic mixtures is reported. By combining multiscale representation of the particles, curvature estimation, and discriminant analysis for optimal separation of the mixture categories, a particularly effective and comprehensive methodology is obtained. The potential of the methodology is illustrated for three important types of particles used in asphaltic mixtures, namely basalt, gabbro, and gravel. The results show that gravel particles are markedly distinct from the other two types, with the gabbro category exhibiting intermediate geometrical properties. The importance of each considered measurement in discriminating between the three categories of particles was also quantified in terms of the adopted discriminant analysis.
Abstract:
2D electrophoresis is a well-known method for protein separation that is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation, and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register the two images by finding spot correspondences. Although it may seem a simple task, manual processing of this kind of image is generally very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). To solve this problem, this paper proposes a new quadratic assignment formulation, together with a correspondence estimation algorithm based on graph matching, that takes into account the structural information between the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
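The correspondence problem can be illustrated at toy scale with a plain minimum-cost assignment between spot coordinates; note this drops the quadratic (pairwise structural) term that is the paper's actual contribution. A brute-force Python sketch for tiny, equal-size spot sets with hypothetical coordinates:

```python
from itertools import permutations

def match_spots(spots_a, spots_b):
    """Minimum total squared-distance assignment between two equal-size
    spot sets, found by brute force (viable only for a handful of spots)."""
    def cost(perm):
        return sum((ax - spots_b[j][0]) ** 2 + (ay - spots_b[j][1]) ** 2
                   for (ax, ay), j in zip(spots_a, perm))
    return list(min(permutations(range(len(spots_b))), key=cost))

a = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]   # detected spots, gel A
b = [(10.2, 0.1), (0.1, -0.2), (5.2, 4.9)]  # same spots, shuffled plus noise
match_spots(a, b)  # [1, 2, 0]: a[i] corresponds to b[match[i]]
```

A full quadratic assignment additionally penalizes mismatched inter-spot distances, which is what makes the matching robust to the non-linear deformations and outliers the abstract mentions.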
Abstract:
Background: Human infection by the pork tapeworm Taenia solium affects more than 50 million people worldwide, particularly in underdeveloped and developing countries. Cysticercosis, which arises from larval encystation, can be life-threatening and difficult to treat. Here, we investigate for the first time the transcriptome of the clinically relevant cysticercus larval form. Results: Using Expressed Sequence Tags (ESTs) produced by the ORESTES method, a total of 1,520 high-quality ESTs were generated from 20 ORESTES cDNA mini-libraries. Their analysis revealed fragments of genes with promising applications, including 51 ESTs matching antigens previously described in other species, as well as 113 sequences representing proteins with potential extracellular localization and obvious applications in immunodiagnosis or vaccine development. Conclusion: The set of sequences described here will contribute to deciphering the expression profile of this important parasite and will be informative for genome assembly and annotation, as well as for studies of intra- and inter-specific sequence variability. Genes of interest for developing new diagnostic and therapeutic tools are described and discussed.
Abstract:
OBJECTIVES: This study assessed bone density gain and its relationship with periodontal clinical parameters in a case series of a regenerative therapy procedure. MATERIAL AND METHODS: Using a split-mouth study design, 10 pairs of infrabony defects from 15 patients were treated with a pool of bovine bone morphogenetic proteins associated with a collagen membrane (test sites) or a collagen membrane only (control sites). Periodontal healing was monitored clinically and radiographically for six months. Standardized pre-surgical and 6-month postoperative radiographs were digitized for digital subtraction analysis, which showed relative bone density gains of 0.034 ± 0.423 in the test group and 0.105 ± 0.423 in the control group (p>0.05). RESULTS: As regards the size of the area of bone density change, the influence of the therapy was detected in 2.5 mm² in the test group and 2 mm² in the control group (p>0.05). Additionally, no correlation was observed between the favorable clinical results and the bone density gain measured by digital subtraction radiography (p>0.05). CONCLUSIONS: The findings of this study suggest that the observed clinical benefit of the regenerative therapy did not come with significant bone density gains. Long-term evaluation may lead to different conclusions.