937 results for Optimal matching analysis.
Abstract:
The use of iodine as a catalyst and either acetic or trifluoroacetic acid as a derivatizing reagent for determining the enantiomeric composition of acyclic and cyclic aliphatic chiral alcohols was investigated. Optimal conditions were selected according to the molar ratio of alcohol to acid, the reaction time, and the reaction temperature. The chiral stability of the stereocenters was then examined: no isomerization was observed when acetic acid was used, whereas partial isomerization was detected with trifluoroacetic acid. A series of chiral alcohols of widely varying structural types was then derivatized with acetic acid under the optimal conditions. The resolution of the enantiomeric esters and of the free chiral alcohols was measured on a capillary gas chromatograph equipped with a CP Chirasil-DEX CB column. The best resolutions were obtained with the 2-pentyl acetates (α = 3.00) and the 2-hexyl acetates (α = 1.95). This method provides a very simple and efficient experimental workup procedure for analyzing chiral alcohols by chiral-phase GC.
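The reported separation factors (α = 3.00 and 1.95) follow from standard chromatographic arithmetic. Below is a minimal sketch of how α and the peak resolution Rs are computed; the retention times, dead time, and peak widths are invented placeholders, not values from the study.

```python
# Illustrative calculation of the GC separation factor (alpha) and peak
# resolution (Rs) for an enantiomer pair. Retention times, dead time and peak
# widths are made-up placeholders, not values from the study.

def separation_factor(t1, t2, t0):
    """alpha = k2 / k1, with retention factors k = (tR - t0) / t0."""
    k1 = (t1 - t0) / t0
    k2 = (t2 - t0) / t0
    return k2 / k1

def resolution(t1, t2, w1, w2):
    """Rs = 2 * (t2 - t1) / (w1 + w2), with baseline peak widths in time units."""
    return 2.0 * (t2 - t1) / (w1 + w2)

t0 = 1.2                        # column dead time (min), placeholder
t_first, t_second = 6.0, 15.6   # enantiomer retention times (min), placeholders
print(f"alpha = {separation_factor(t_first, t_second, t0):.2f}")   # 3.00 with these numbers
print(f"Rs    = {resolution(t_first, t_second, 0.25, 0.30):.2f}")
```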
Abstract:
In the present research we set forth a new, simple trade-off model for calculating how much debt, and by extension how much equity, a company should have, using easily available information and computing the cost of debt dynamically on the basis of the effect that the company's capital structure has on its bankruptcy risk. The proposed model was applied to the companies that made up the Dow Jones Industrial Average (DJIA) in 2007, using consolidated financial data from 1996 to 2006 published by Bloomberg. The simplex optimization method was used to find the debt level that maximizes firm value. We then compared the estimated debt with the companies' actual debt using the nonparametric Mann-Whitney test. The results indicate that 63% of the companies do not show a statistically significant difference between actual and estimated debt.
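As a rough illustration of the two computational steps mentioned above, the sketch below runs a Nelder-Mead simplex search over a toy trade-off value function (a stand-in for the paper's model, not its actual specification) and then applies the Mann-Whitney U test to synthetic estimated versus observed debt ratios.

```python
# Sketch: Nelder-Mead (simplex) search for the value-maximising debt ratio on a
# toy trade-off function, followed by a Mann-Whitney U comparison of estimated
# and observed debt across firms. All inputs are synthetic.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import mannwhitneyu

def firm_value(debt_ratio, tax_rate=0.35, distress_k=2.0):
    """Toy trade-off: tax shield grows linearly, expected distress cost grows
    quadratically with leverage. Not the model from the paper."""
    d = float(np.clip(debt_ratio, 0.0, 1.0))
    return 100.0 * (1.0 + tax_rate * d - distress_k * d ** 2)

# Nelder-Mead simplex search for the debt ratio that maximises firm value.
res = minimize(lambda d: -firm_value(d[0]), x0=[0.2], method="Nelder-Mead")
optimal_debt = res.x[0]
print(f"value-maximising debt ratio ~ {optimal_debt:.3f}")

# Compare model-implied vs. observed debt ratios (synthetic data for 30 firms).
rng = np.random.default_rng(0)
estimated = rng.normal(optimal_debt, 0.05, 30)
observed = rng.normal(optimal_debt + 0.02, 0.08, 30)
stat, p = mannwhitneyu(estimated, observed, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```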
Abstract:
Psychophysical studies suggest that humans preferentially use a narrow band of low spatial frequencies for face recognition. Here we asked whether artificial face recognition systems show improved recognition performance at the same spatial frequencies as humans. To this end, we estimated recognition performance over a large database of face images by computing three discriminability measures: Fisher Linear Discriminant Analysis, Non-Parametric Discriminant Analysis, and Mutual Information. To address frequency dependence, discriminabilities were measured as a function of (filtered) image size. All three measures revealed a maximum at the same image sizes, whose spatial frequency content corresponds to the psychophysically determined frequencies. Our results therefore support the notion that the critical band of spatial frequencies for face recognition in humans and machines follows from inherent properties of face images, and that the use of these frequencies is associated with optimal face recognition performance.
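For readers unfamiliar with the discriminability measures, the sketch below computes one of them, a multi-class Fisher criterion J = trace(Sw^-1 Sb), as a function of image size. The "faces" are random arrays standing in for a real database, so only the mechanics of the measurement are illustrated.

```python
# Multi-class Fisher criterion evaluated at several image sizes.
# Synthetic data; block-average resizing stands in for proper filtering.
import numpy as np

def fisher_criterion(X, y):
    """X: (n_samples, n_features), y: integer class labels."""
    mean_all = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
    return np.trace(np.linalg.pinv(Sw) @ Sb)

def resize(img, size):
    """Crude block-average downsampling to size x size."""
    h, w = img.shape
    ys = np.linspace(0, h, size + 1, dtype=int)
    xs = np.linspace(0, w, size + 1, dtype=int)
    return np.array([[img[ys[i]:ys[i+1], xs[j]:xs[j+1]].mean()
                      for j in range(size)] for i in range(size)])

rng = np.random.default_rng(1)
faces = rng.normal(size=(60, 64, 64))          # 20 "identities" x 3 images each
labels = np.repeat(np.arange(20), 3)

for size in (8, 16, 32):
    X = np.array([resize(f, size).ravel() for f in faces])
    print(size, f"J = {fisher_criterion(X, labels):.3f}")
```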
Abstract:
Drug combinations can improve angiostatic cancer treatment efficacy and enable the reduction of side effects and drug resistance. Combining drugs is non-trivial due to the high number of possibilities. We applied a feedback system control (FSC) technique with a population-based stochastic search algorithm to navigate the large parametric space of nine angiostatic drugs at four concentrations and identify optimal low-dose drug combinations. This implied an iterative approach of in vitro testing of endothelial cell viability followed by algorithm-based analysis. The optimal synergistic drug combination, containing erlotinib, BEZ-235 and RAPTA-C, was reached within a small number of iterations. The final drug combinations showed enhanced endothelial cell specificity, synergistically inhibited proliferation (p < 0.001) but not migration of endothelial cells, and drove increased numbers of endothelial cells into apoptosis (p < 0.01). Successful translation of this drug combination was achieved in two preclinical in vivo tumor models. Tumor growth was inhibited synergistically and significantly (p < 0.05 and p < 0.01, respectively) using reduced drug doses as compared with optimal single-drug concentrations. Under the applied conditions, single-drug monotherapies had no or negligible activity in these models. We suggest that FSC can be used for rapid identification of effective, reduced-dose, multi-drug combinations for the treatment of cancer and other diseases.
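A population-based stochastic search of the kind referred to above can be sketched as a simple evolutionary loop over nine drugs at four dose levels. The in vitro viability readout is replaced here by a synthetic scoring function, and this is not the specific FSC algorithm used in the study.

```python
# Evolutionary-style stochastic search over 9 drugs at 4 concentration levels
# (0-3). The assay is mocked by a synthetic scoring function; only the
# iterate-measure-update loop is illustrated.
import numpy as np

rng = np.random.default_rng(42)
N_DRUGS, N_LEVELS, POP, GENERATIONS = 9, 4, 20, 15

def measure_efficacy(combo):
    """Placeholder for the in vitro assay: higher is better, with a mild
    penalty on the number of non-zero (i.e. administered) drugs."""
    weights = np.linspace(1.0, 0.2, N_DRUGS)
    return float(combo @ weights - 0.4 * np.count_nonzero(combo))

population = rng.integers(0, N_LEVELS, size=(POP, N_DRUGS))
for gen in range(GENERATIONS):
    scores = np.array([measure_efficacy(c) for c in population])
    elite = population[np.argsort(scores)[-POP // 2:]]        # keep best half
    children = []
    for _ in range(POP - len(elite)):
        a, b = elite[rng.integers(len(elite), size=2)]        # two random elites
        child = np.where(rng.random(N_DRUGS) < 0.5, a, b)     # uniform crossover
        mutate = rng.random(N_DRUGS) < 0.1                    # random point mutations
        child[mutate] = rng.integers(0, N_LEVELS, size=mutate.sum())
        children.append(child)
    population = np.vstack([elite] + children)

best = population[np.argmax([measure_efficacy(c) for c in population])]
print("best combination (dose level per drug):", best)
```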
Abstract:
The present study aimed at evaluating heterotic group formation in guava based on quantitative descriptors, using an artificial neural network (ANN). To this end, eight quantitative descriptors were evaluated. Large genetic variability was found for the eight quantitative traits in the 138 guava genotypes. The artificial neural network technique determined that the optimal number of groups was three. Grouping consistency was verified by linear discriminant analysis, which yielded a correct classification rate of 86%. It was concluded that the artificial neural network method is effective for detecting genetic divergence and heterotic group formation.
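The grouping-plus-validation workflow can be illustrated as follows; KMeans is used here as a stand-in for the paper's neural-network clustering step, and the descriptor matrix is synthetic.

```python
# Synthetic illustration of the grouping-consistency check: genotypes are
# clustered into three groups and the grouping is validated with linear
# discriminant analysis via cross-validated re-classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(138, 8))            # 138 genotypes x 8 descriptors (synthetic)

groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, groups, cv=5).mean()
print(f"cross-validated correct classification: {100 * accuracy:.1f}%")
```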
Abstract:
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered an optimal biomarker for assessing mercury exposure, the lack of harmonization of sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to Human Biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), each followed by a corresponding web conference, were organized between March 2011 and February 2012. The ICI/EQUAS exercises used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS proved to be a new and effective tool for improving analytical performance and building capacity. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
Abstract:
Objectives: We present a retrospective analysis of a single-institution experience with radiosurgery (RS) for brain metastases (BM) using Gamma Knife (GK) and Linac. Methods: From July 2010 to July 2012, 28 patients (with 83 lesions) underwent RS with GK and 35 patients (with 47 lesions) with Linac. The primary outcome was local progression-free survival (LPFS); the secondary outcome was overall survival (OS). Apart from a standard statistical analysis, we used a Cox regression model with shared frailty to model the within-patient correlation (a preliminary evaluation showed a significant frailty effect, meaning that the correlation within patients could not be ignored). Results: The mean follow-up period was 11.7 months (median 7.9, range 1.7-22.7) for GK and 18.1 months (median 17, range 7.5-28.7) for Linac. The median number of lesions per patient was 2.5 (1-9) for GK compared with 1 (1-3) for Linac. The GK group contained more radioresistant lesions (melanoma) and more lesions located in functional areas. The median dose was 24 Gy (GK) compared with 20 Gy (Linac). The actuarial LPFS rate for GK at 3, 6, 9, 12, and 17 months was 96.96, 96.96, 96.96, 88.1, and 81.5%, and remained stable up to 32 months; for Linac at 3, 6, 12, 17, 24, and 33 months it was 91.5, 91.5, 91.5, 79.9, 55.5, and 17.1%, respectively (p = 0.03, chi-square test). After the Cox regression analysis with shared frailty, the difference between groups was not statistically significant. The median overall survival was 9.7 months for the GK group and 23.6 months for the Linac group. Uni- and multivariate analyses showed that a lower GPA score and non-controlled systemic status were associated with lower OS. Cox regression analysis adjusting for these two parameters showed comparable OS rates. Conclusions: In this comparative report between GK and Linac, the preliminary analysis showed that more difficult cases were treated by GK, with patients harboring more lesions, radioresistant tumors, and lesions in highly functional locations; in this sense, the groups were very heterogeneous at baseline. After the Cox frailty model, the LPFS rates were very similar (p > 0.05). OS was also similar after adjusting for systemic status and GPA score (p > 0.05). The technical reasons for choosing GK over Linac were anatomical locations in highly functional areas, histology, technical limitations of Linac movements (especially for lower posterior fossa locations), or closeness of multiple lesions to highly functional areas preventing optimal dosimetry with Linac.
Abstract:
In this diploma work, the advantages of coherent anti-Stokes Raman scattering (CARS) spectrometry and various methods for the quantitative analysis of substance composition based on it are considered. The basic methods and concepts of adaptive analysis are presented. On the basis of these methods, an algorithm for automatically measuring the size of the scattering band of a target component in a CARS spectrum is developed. The algorithm uses the known full spectrum of the target substance and compares it with the measured CARS spectrum. The shape of the difference spectrum is used as feedback to control the accuracy of the matching. To exclude the influence of the background in CARS spectra, the difference spectrum is analysed by means of its second derivative. The algorithm was tested on simple simulated spectra and on experimentally acquired spectra of organic compounds.
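A minimal sketch of the matching idea described above: the known target spectrum is scaled to fit the measured CARS spectrum, with the second derivative of the difference spectrum serving as the feedback that ignores a slowly varying background. The spectra and noise levels below are synthetic.

```python
# Synthetic demonstration: the known spectrum of the target substance is scaled
# to fit a measured CARS-like spectrum containing a slowly varying background;
# the second derivative of the difference spectrum is the feedback signal, so
# the smooth background does not bias the estimate.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(0, 100, 200)
reference = np.exp(-(x - 40) ** 2 / 4) + 0.5 * np.exp(-(x - 70) ** 2 / 9)  # known target spectrum
true_amount = 2.3
background = 0.02 * x + 0.5                                                # slowly varying background
measured = (true_amount * reference + background
            + np.random.default_rng(3).normal(0, 0.005, x.size))

def roughness(scale):
    """Sum of squared second differences of the residual: a smooth,
    background-only residual gives a small value."""
    residual = measured - scale * reference
    return float(np.sum(np.diff(residual, n=2) ** 2))

result = minimize_scalar(roughness, bounds=(0.0, 10.0), method="bounded")
print(f"estimated amount of target component: {result.x:.2f} (true value {true_amount})")
```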
Abstract:
Objectives: The aim of the study was to combine clinical results from the European cohort of the REVERSE study with the costs associated with adding cardiac resynchronization therapy (CRT) to optimal medical therapy (OMT) in patients with mildly symptomatic (NYHA I-II) or asymptomatic left ventricular dysfunction and markers of cardiac dyssynchrony in Spain. Methods: A Markov model comparing CRT + OMT (CRT-ON) with OMT only (CRT-OFF) was developed for a retrospective cost-effectiveness analysis. Raw data were derived from the literature and expert opinion, reflecting the clinical and economic consequences of patient management in Spain. The time horizon was 10 years. Both costs (euro 2010) and effects were discounted at 3 percent per annum. Results: CRT-ON showed higher total costs than CRT-OFF; however, CRT reduced the length of ICU stay by 94 percent (0.006 versus 0.091 days) and general ward stay by 34 percent (0.705 versus 1.076 days). Surviving CRT-ON patients (88.2 percent versus 77.5 percent) remained in a better functional class for longer and achieved an improvement of 0.9 life-years gained (LYG) and 0.77 quality-adjusted life-years (QALYs). CRT-ON proved to be cost-effective after 6 years, except for the 7th year owing to battery depletion. At 10 years, the results were €18,431 per LYG and €21,500 per QALY gained. Probabilistic sensitivity analysis showed CRT-ON was cost-effective in 75.4 percent of the cases at 10 years. Conclusions: The use of CRT added to OMT represents an efficient use of resources in patients suffering from heart failure in NYHA functional classes I and II.
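The cost-effectiveness arithmetic behind such results can be illustrated with a toy calculation: yearly costs and QALYs for the two strategies are discounted at 3% per annum over a 10-year horizon and combined into an incremental cost per QALY gained. All yearly inputs below are invented placeholders, not values from the model.

```python
# Toy discounting and ICER calculation for two strategies over a 10-year
# horizon. All yearly costs and QALY values are placeholders.
DISCOUNT = 0.03
YEARS = 10

def discounted_total(values_per_year, rate=DISCOUNT):
    """Sum of yearly values discounted at the given annual rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values_per_year))

# Placeholder yearly costs (EUR) and QALYs for CRT-ON vs CRT-OFF.
cost_on  = [30000] + [1500] * (YEARS - 1)    # device plus follow-up
cost_off = [2500] * YEARS                    # OMT only
qaly_on  = [0.80] * YEARS
qaly_off = [0.72] * YEARS

d_cost = discounted_total(cost_on) - discounted_total(cost_off)
d_qaly = discounted_total(qaly_on) - discounted_total(qaly_off)
print(f"incremental cost per QALY gained: {d_cost / d_qaly:,.0f} EUR")
```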
Abstract:
BACKGROUND AND OBJECTIVES: Sudden cardiac death (SCD) is a severe burden in modern medicine. Aldosterone antagonists (AAs) are reported to be effective in reducing mortality in patients with heart failure (HF) or after myocardial infarction (MI). Our study aimed to assess the efficacy of AAs on mortality, including SCD, hospital admission, and several common adverse effects. METHODS: We searched Embase, PubMed, Web of Science, the Cochrane Library and clinicaltrials.gov for randomized controlled trials (RCTs) assigning AAs to patients with HF or post MI through May 2015. The comparator was standard medication or placebo, or both. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Event rates were compared using a random-effects model. Prospective RCTs of AAs with durations of at least 8 weeks were selected if they included at least one of the following outcomes: SCD, all-cause/cardiovascular mortality, all-cause/cardiovascular hospitalization and common side effects (hyperkalemia, renal function deterioration and gynecomastia). RESULTS: Data from 19,333 patients enrolled in 25 trials were included. In patients with HF, this treatment significantly reduced the risk of SCD by 19% (RR 0.81; 95% CI, 0.67-0.98; p = 0.03); all-cause mortality by 19% (RR 0.81; 95% CI, 0.74-0.88, p < 0.00001) and cardiovascular death by 21% (RR 0.79; 95% CI, 0.70-0.89, p < 0.00001). In post-MI patients, the corresponding risk reductions were 20% (RR 0.80; 95% CI, 0.66-0.98; p = 0.03), 15% (RR 0.85; 95% CI, 0.76-0.95, p = 0.003) and 17% (RR 0.83; 95% CI, 0.74-0.94, p = 0.003), respectively. Across both subgroups, the relative risks decreased by 19% (RR 0.81; 95% CI, 0.71-0.92; p = 0.002) for SCD, 18% (RR 0.82; 95% CI, 0.77-0.88, p < 0.0001) for all-cause mortality and 20% (RR 0.80; 95% CI, 0.74-0.87, p < 0.0001) for cardiovascular mortality in patients treated with AAs. Hospitalizations were also significantly reduced, while common adverse effects were significantly increased. CONCLUSION: Aldosterone antagonists appear to be effective in reducing SCD and other mortality events, compared with placebo or standard medication, in patients with HF and/or after an MI.
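A random-effects pooling of risk ratios of the kind used in such a meta-analysis can be sketched with DerSimonian-Laird weights; the trial counts below are invented for illustration only.

```python
# DerSimonian-Laird random-effects pooling of log risk ratios.
# Trial 2x2 counts are synthetic, for illustration only.
import numpy as np

# (events_treatment, n_treatment, events_control, n_control) per trial
trials = [(45, 800, 60, 810), (30, 400, 38, 395), (12, 150, 18, 148),
          (80, 1600, 95, 1580), (22, 300, 31, 305)]

log_rr, var = [], []
for a, n1, c, n0 in trials:
    log_rr.append(np.log((a / n1) / (c / n0)))
    var.append(1/a - 1/n1 + 1/c - 1/n0)          # variance of the log risk ratio
log_rr, var = np.array(log_rr), np.array(var)

# DerSimonian-Laird estimate of the between-trial variance tau^2.
w_fixed = 1 / var
pooled_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_rr - pooled_fixed) ** 2)
tau2 = max(0.0, (Q - (len(trials) - 1)) /
           (np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)))

w_random = 1 / (var + tau2)
pooled = np.sum(w_random * log_rr) / np.sum(w_random)
se = np.sqrt(1 / np.sum(w_random))
rr = np.exp(pooled)
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```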
Abstract:
Whether from an urbanistic, a social, or a governance point of view, the evolution of cities is a major challenge for our contemporary societies. By offering the possibility of analysing existing spatial and social configurations or attempting to simulate future ones, geographic information systems have become indispensable in urban planning and management. In five years, the population of the city of Lausanne grew from 134'700 to 140'570 inhabitants, while enrolment in the public schools increased from 12'200 to 13'500 students.
Associated with a considerable harmonisation process of compulsory schooling in Switzerland, this demographic rise drove the schooling services, in collaboration with the University of Lausanne, to set up and develop GIS solutions capable of tackling various spatial issues. Established in 1989, the school districts had to be redefined so that they fit the reality of a continuously changing urban and political landscape. In a context of mobility and sustainability, an attribution system for public transport subsidies, based on the distance between residence and school and on the age of the students, was designed. The implementation of these projects required the construction of geographical databases as well as the elaboration of the new analysis methods presented in this thesis; the thesis thus proceeded through a permanent dialectic between theoretical research and practical necessities. The first part of this work focuses on the analysis of the city's pedestrian network. Its morphology is investigated through multi-scale approaches to the concept of centrality. The first conception, named straightness centrality, stipulates that being central is being connected to the others in a straight line. The second, undoubtedly more intuitive, is called closeness centrality and expresses the fact that being central is being close to the others. The goal of the methods developed is to evaluate the connectivity and walkability of the network, along with suggesting possible improvements (the creation of pedestrian shortcuts). The third and final theoretical section presents and develops an algorithm for regularised optimal transport. By minimising home-to-school distances and respecting school capacities, the algorithm enables the production of student allocation schemes. The implementation of the Lagrange multipliers offers a visualisation of the spatial cost associated with the schooling infrastructures and the students' home locations. The second part of this thesis recounts the principal aspects of three projects carried out in the context of school management, namely the construction of an attribution system for public transport subsidies, a school redistricting process, and the simulation of student pedestrian flows to school.
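One common form of regularised optimal transport is entropic regularisation solved by Sinkhorn iterations; the thesis may use a different regulariser, but the sketch below shows the same ingredients: home-school distances as costs, school capacities as marginals, and dual potentials playing the role of the Lagrange multipliers mentioned above. Locations and capacities are synthetic.

```python
# Entropy-regularised optimal transport assignment of students to schools
# under capacity constraints, via Sinkhorn iterations. Synthetic data.
import numpy as np

rng = np.random.default_rng(5)
students = rng.random((300, 2))                       # home coordinates
schools = np.array([[0.2, 0.3], [0.7, 0.6], [0.5, 0.9]])
capacity = np.array([120, 120, 60], dtype=float)      # seats per school

cost = np.linalg.norm(students[:, None, :] - schools[None, :, :], axis=2)
a = np.full(len(students), 1.0)                       # one seat needed per student
b = capacity                                          # seats offered per school
eps = 0.02                                            # regularisation strength

K = np.exp(-cost / eps)
u = np.ones(len(a))
for _ in range(500):                                  # Sinkhorn fixed-point iterations
    v = b / (K.T @ u)
    u = a / (K @ v)
transport = u[:, None] * K * v[None, :]               # students x schools coupling

assignment = transport.argmax(axis=1)                 # hard assignment for display
print("students per school:", np.bincount(assignment, minlength=len(schools)))
print("mean home-school distance:",
      round(float(cost[np.arange(len(students)), assignment].mean()), 3))
```

Up to a factor of the regularisation strength, the logarithms of the scaling vectors u and v are the dual potentials, i.e. the quantities that can be mapped to visualise the spatial cost of schools and of the students' home locations.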
Abstract:
This paper studies the incidence and consequences of the mismatch between formal education and the educational requirements of jobs in Estonia during the years 1997-2003. We find large wage penalties associated with the phenomenon of educational mismatch. Moreover, the incidence and the wage penalty of mismatches increase with age. This suggests that structural educational mismatches can occur after fast transition periods. Our results are robust to various methodologies and, more importantly, to departures from the exogeneity assumptions inherent in the matching estimators used in our analysis.
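A bare-bones matching estimator of the kind alluded to above can be sketched on simulated data: "mismatched" workers are paired with the well-matched worker having the closest propensity score, and the average log-wage gap is reported. The covariates, sample, and estimator here are far simpler than those used in the actual study.

```python
# Nearest-neighbour propensity-score matching on simulated data to estimate a
# wage penalty of educational mismatch. Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 2000
age = rng.uniform(20, 60, n)
schooling = rng.normal(13, 2.5, n)
mismatch = (rng.random(n) < 1 / (1 + np.exp(-0.03 * (age - 40)))).astype(int)
log_wage = 1.5 + 0.06 * schooling + 0.01 * age - 0.12 * mismatch + rng.normal(0, 0.3, n)

X = np.column_stack([age, schooling])
pscore = LogisticRegression().fit(X, mismatch).predict_proba(X)[:, 1]

treated = np.where(mismatch == 1)[0]
controls = np.where(mismatch == 0)[0]
gaps = []
for i in treated:
    j = controls[np.argmin(np.abs(pscore[controls] - pscore[i]))]   # nearest neighbour
    gaps.append(log_wage[i] - log_wage[j])
print(f"estimated log-wage penalty of mismatch: {np.mean(gaps):.3f}")
```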
Abstract:
In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact method to obtain surface topography is to apply photometric stereo in the estimation of surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations to standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur. Two Wiener filter-based methods are proposed, one of which is optimal in the sense of the surface power spectral density given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
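Gradient-field integration with noise-aware damping can be sketched as a frequency-domain least-squares integrator with a Wiener-style regularisation term; this is a generic Frankot-Chellappa-type scheme, not the specific estimators proposed in the thesis.

```python
# Frequency-domain integration of noisy gradient fields with a Wiener-style
# damping term; generic sketch with a synthetic test surface.
import numpy as np

def integrate_gradients(p, q, damping=1e-3):
    """Recover a surface z from gradient fields p = dz/dx and q = dz/dy by
    regularised least squares in the Fourier domain."""
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2.0 * np.pi
    wy = np.fft.fftfreq(h) * 2.0 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX ** 2 + WY ** 2 + damping          # damping suppresses noise-dominated frequencies
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                                # absolute height is unobservable from gradients
    return np.real(np.fft.ifft2(Z))

# Self-check on a synthetic periodic surface with noisy gradients.
rng = np.random.default_rng(2)
y, x = np.mgrid[0:128, 0:128]
z = np.sin(2 * np.pi * x / 32) * np.cos(2 * np.pi * y / 16)
p = np.gradient(z, axis=1) + rng.normal(0, 0.02, z.shape)
q = np.gradient(z, axis=0) + rng.normal(0, 0.02, z.shape)
z_rec = integrate_gradients(p, q)
print("reconstruction RMS error:",
      round(float(np.sqrt(np.mean((z_rec - (z - z.mean())) ** 2))), 4))
```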
Abstract:
A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all three solvents (water, ethanol and acetone) were better than the binary mixtures in generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters of this analytical method, namely the time for the chromophoric reaction to stabilize, the wavelength of the absorption maximum to be monitored, the reference standard, and the concentration of sodium carbonate, were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w/v, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
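The mixture-design modelling mentioned above can be illustrated by fitting a Scheffé quadratic model to yields measured at simplex-centroid design points and searching the simplex for the best blend; the yield values below are placeholders that merely mimic the reported pattern of ternary blends outperforming binary ones.

```python
# Scheffe quadratic mixture model fitted to placeholder yields at
# simplex-centroid design points (water, ethanol, acetone).
import numpy as np

# Design points: pure solvents, 1:1 binaries, 1:1:1 ternary centroid.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [.5, .5, 0], [.5, 0, .5], [0, .5, .5],
              [1/3, 1/3, 1/3]], dtype=float)
yield_pct = np.array([12.0, 15.5, 14.8, 18.2, 17.6, 19.0, 22.0])   # placeholders

def scheffe_terms(X):
    """Linear plus binary-interaction terms of the Scheffe quadratic model."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

beta, *_ = np.linalg.lstsq(scheffe_terms(X), yield_pct, rcond=None)

# Grid search over the simplex for the blend maximising the predicted yield.
best, best_mix = -np.inf, None
for w in np.arange(0, 1.01, 0.05):
    for e in np.arange(0, 1.01 - w, 0.05):
        mix = np.array([[w, e, 1 - w - e]])
        pred = scheffe_terms(mix) @ beta
        if pred[0] > best:
            best, best_mix = pred[0], mix[0]
print("predicted best blend (water, ethanol, acetone):", best_mix.round(2),
      f"yield ~ {best:.1f}%")
```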