843 results for "Relevance feature"


Relevance: 20.00%

Publisher:

Abstract:

Background: Adult community-acquired pneumonia (CAP) is a relevant worldwide cause of morbidity and mortality; however, the aetiology often remains uncertain and therapy is empirical. We applied conventional and molecular diagnostics to identify viruses and atypical bacteria associated with CAP in Chile. Methods: We used sputum and blood cultures, IgG/IgM serology and molecular diagnostic techniques (PCR, reverse transcriptase PCR) for the detection of classical and atypical bacteria (Mycoplasma pneumoniae, Chlamydia pneumoniae, Legionella pneumophila) and respiratory viruses (adenovirus, respiratory syncytial virus (RSV), human metapneumovirus, influenza virus, parainfluenza virus, rhinovirus, coronavirus) in adults >18 years old presenting with CAP in Santiago from February 2005 to September 2007. Severity was assessed at admission with Fine's pneumonia severity index. Results: Among the 356 enrolled adults, a single bacterial pathogen was detected in 92 cases (26%), a single viral pathogen in 80 cases (22%), mixed bacterial and viral infection in 60 cases (17%), and no pathogen in 124 cases (35%). Streptococcus pneumoniae and RSV were the most common bacterial and viral pathogens identified. Detection of infectious agents by PCR provided greater sensitivity than conventional techniques. To our surprise, no relationship was observed between clinical severity and single versus mixed infections. Conclusions: The use of molecular diagnostics expanded the detection of viruses and atypical bacteria in adults with CAP, whether as single infections or coinfections. Clinical severity and outcome were independent of the aetiological agents detected.

Relevance: 20.00%

Publisher:

Abstract:

Atomic physics plays an important role in determining the evolutionary stages of a wide range of laboratory and cosmic plasmas. Knowledge of the underlying atomic processes is therefore the main contribution to our ability to model, diagnose and control plasma sources. Of particular importance are reliable low-temperature dielectronic recombination (DR) rate coefficients. This thesis provides systematically calculated DR rate coefficients for lithium-like beryllium and sodium ions via ∆n = 0 doubly excited resonant states. The calculations are based on complex-scaled relativistic many-body perturbation theory in an all-order formulation within the single- and double-excitation coupled-cluster scheme, including radiative corrections. Comparison of DR resonance parameters (energy levels, autoionization widths, radiative transition probabilities and strengths) between our theoretical predictions and heavy-ion storage-ring experiments (CRYRING in Stockholm and TSR in Heidelberg) shows good agreement. The intruder-state problem is a principal obstacle to general application of the coupled-cluster formalism to doubly excited states. We have therefore developed a technique designed to avoid the intruder-state problem, based on a convenient partitioning of the Hilbert space and a reformulation of the conventional set of pair equations. The general aspects of this development are discussed, and the effectiveness of its numerical implementation (within the non-relativistic framework) is selectively illustrated on autoionizing doubly excited states of helium.
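For context, the resonance parameters listed above (energies, autoionization and radiative rates, statistical weights) determine plasma rate coefficients through the standard isolated-resonance approximation; this textbook expression is recalled here for orientation and is not taken from the thesis itself. The contribution of a single resonance d to the DR rate coefficient at electron temperature \(T_e\) is

\[
\alpha_d^{\mathrm{DR}}(T_e) \;=\; \left(\frac{2\pi\hbar^{2}}{m_e k_B T_e}\right)^{3/2} \frac{g_d}{2\,g_i}\, \frac{A_a\, A_r}{A_a + A_r}\, \exp\!\left(-\frac{E_d}{k_B T_e}\right),
\]

where \(E_d\) is the resonance energy relative to the initial state, \(g_d\) and \(g_i\) are the statistical weights of the doubly excited and initial states, and \(A_a\) and \(A_r\) are the autoionization and radiative-stabilization rates; the total rate coefficient is the sum over all resonances d.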

Relevance: 20.00%

Publisher:

Abstract:

[EN] The human face provides useful information during interaction; therefore, any system integrating Vision-Based Human-Computer Interaction requires fast and reliable face and facial feature detection. Different approaches have addressed this ability, but only open-source implementations have been extensively used by researchers. A good example is the Viola-Jones object detection framework, which has been used frequently, particularly in the context of facial processing.

Relevance: 20.00%

Publisher:

Abstract:

[EN] In this paper, we experimentally study the combination of face and facial feature detectors to improve face detection performance. The face detection problem, as suggested by recent face detection challenges, is still not solved. Face detectors traditionally fail in large-scale problems and/or when the face is occluded or different head rotations are present. The combination of face and facial feature detectors is evaluated on a public database. The results obtained evidence an improvement in the positive detection rate while reducing the false detection rate. Additionally, we show that the integration of facial feature detectors provides useful information for pose estimation and face alignment.
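As a rough illustration of the general idea (not the authors' implementation), the sketch below keeps a Viola-Jones face candidate only if a facial feature detector fires inside it; the OpenCV cascade files, thresholds and file name used here are illustrative assumptions.

```python
# Hedged sketch: validate Viola-Jones face candidates with a facial feature
# (eye) detector. Illustrative only; parameters and cascades are assumptions.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_validated_faces(bgr_image):
    """Return face boxes that also contain at least one detected eye."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    validated = []
    for (x, y, w, h) in faces:
        roi = gray[y:y + h // 2, x:x + w]          # eyes lie in the upper half
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=3)
        if len(eyes) > 0:                          # feature evidence -> keep face
            validated.append((x, y, w, h))
    return validated

if __name__ == "__main__":
    img = cv2.imread("group_photo.jpg")            # hypothetical input image
    print(detect_validated_faces(img))
```

Requiring facial-feature evidence inside each candidate box is one simple way to trade a slightly lower recall of the raw face detector for a reduced false detection rate, in the spirit of the combination studied in the paper.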

Relevance: 20.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring hybridization from a tissue of interest to probes on a small glass or plastic slide. These data are characterized by a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens.

One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the group of individuals. Even though methods to analyse the data are by now well developed and close to reaching a standard organization (through the effort of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method exists to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches for handling open problems posed by clinicians in specific experimental designs.

In Chapter 1, starting from a necessary biological introduction, we review microarray technologies and all the important steps of an experiment, from the production of the array to the quality controls, ending with the preprocessing steps used in the data analyses in the rest of the dissertation. In Chapter 2 a critical review of standard analysis methods is provided, stressing the problems that remain open.

In Chapter 3 a method is introduced to address the issue of unbalanced design in microarray experiments. Experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes; however, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared with that of SAM and LIMMA [3] over two simulated data sets generated from beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of the differentially expressed genes.
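A minimal sketch of the resampling scheme just described, assuming a genes-by-samples NumPy array per class and using a plain Welch t-test as a stand-in for the SAM statistic (the thesis uses SAM itself; function names, thresholds and iteration counts here are illustrative assumptions):

```python
# Hedged sketch of a MultiSAM-style resampling scheme (illustrative only):
# repeatedly compare the less populated class (LPC) with equal-size random
# subsamples of the more populated class (MPC), and score each probe by how
# often it comes out as differentially expressed.
import numpy as np
from scipy import stats

def multisam_scores(expr_lpc, expr_mpc, n_iter=1000, p_cut=0.01, seed=0):
    """expr_lpc, expr_mpc: arrays of shape (n_probes, n_samples_in_class).
    Returns an integer score per probe in the range 0..n_iter."""
    rng = np.random.default_rng(seed)
    n_probes, n_lpc = expr_lpc.shape
    scores = np.zeros(n_probes, dtype=int)
    for _ in range(n_iter):
        cols = rng.choice(expr_mpc.shape[1], size=n_lpc, replace=False)
        subset = expr_mpc[:, cols]
        # probe-wise two-sample test between the LPC and the MPC subsample
        _, pvals = stats.ttest_ind(expr_lpc, subset, axis=1, equal_var=False)
        scores += (pvals < p_cut).astype(int)
    return scores

# Example usage with simulated data (200 probes, 8 LPC vs 60 MPC samples):
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    lpc = rng.normal(size=(200, 8))
    mpc = rng.normal(size=(200, 60))
    lpc[:10] += 2.0                     # make the first 10 probes differential
    s = multisam_scores(lpc, mpc, n_iter=100)
    print("probes with score > 30:", np.where(s > 30)[0])
```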
Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data.

In Chapter 4 a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine (RVM) [4] is described. Looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role: in some cases similarities can give useful and, sometimes, even more important information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second. In this work we show that RVM could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumour-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class, G2, as a test set, obtaining for each G2 sample the probability of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
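A toy sketch of the posterior-probability idea: train a probabilistic classifier on G1 versus G3 and read off class-membership probabilities for the G2 samples. scikit-learn has no built-in RVM, so a logistic-regression classifier stands in here purely for illustration; the data shapes and numbers below are hypothetical.

```python
# Hedged sketch: classify G1 vs G3, then use posterior probabilities to judge
# which class the G2 samples resemble. Logistic regression stands in for the
# RVM used in the thesis (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def grade2_similarity(X_g1, X_g3, X_g2):
    """X_*: arrays of shape (n_samples, n_genes). Returns, for each G2 sample,
    the posterior probability of belonging to the G1-like class."""
    X_train = np.vstack([X_g1, X_g3])
    y_train = np.array([0] * len(X_g1) + [1] * len(X_g3))   # 0 = G1, 1 = G3
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    # column 0 of predict_proba corresponds to class 0, i.e. P(G1-like | sample)
    return model.predict_proba(X_g2)[:, 0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g1 = rng.normal(0.0, 1.0, size=(67, 500))   # hypothetical expression data
    g3 = rng.normal(1.0, 1.0, size=(54, 500))
    g2 = rng.normal(0.3, 1.0, size=(100, 500))  # closer to G1 by construction
    p_g1 = grade2_similarity(g1, g3, g2)
    print("mean P(G1-like) over G2 samples:", p_g1.mean())
```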

Relevance: 20.00%

Publisher:

Abstract:

This thesis evaluated in vivo and in vitro enamel permeability under different physiological and clinical conditions by means of SEM inspection of replicas of the enamel surface obtained from polyvinyl siloxane impressions subsequently cast in polyether impression material. This non-invasive, risk-free technique allows the evaluation of fluid outflow from the enamel surface and is able to detect the presence of small quantities of fluid, visualized as droplets. Fluid outflow on the enamel surface represents enamel permeability. This property is of paramount importance in enamel physiology and pathology, although its effective role in adhesion, caries pathogenesis and prevention is still not fully understood.

The aim of the studies proposed was to evaluate changes in enamel permeability under different conditions and to correlate the findings with current knowledge about enamel physiology, caries pathogenesis, and fluoride and etching treatments. To corroborate the data, the replica technique was supported by other specific techniques such as Raman and IR spectroscopy and EDX analysis.

The first study visualized fluid movement through dental enamel in vivo, confirmed that enamel is a permeable substrate and demonstrated that age and enamel permeability are closely related. Samples from subjects of different ages showed a decreasing number and size of droplets with increasing age: freshly erupted permanent teeth showed many droplets covering the entire enamel surface, and droplets in permanent teeth were prominent along the enamel perikymata. These results, obtained through SEM inspection of replicas, allowed innovative remarks on enamel physiology. An analogous test was developed to evaluate the permeability of primary enamel. The results of this second study showed that primary enamel has a substantial permeability, with droplets covering the entire enamel surface without any specific localization, in accordance with its histological features, and without the age-related changes that are signs of post-eruptive maturation. These results agree with clinical data showing a higher caries susceptibility for primary enamel and suggest a strong relationship between this susceptibility and enamel permeability.

Topical fluoride application represents the gold standard for caries prevention, although the mechanism of the cariostatic effect of fluoride still needs to be clarified. The effects of topical fluoride application on enamel permeability were evaluated; in particular, two different treatments (NaF and APF), with different pH, were examined. The major product of topical fluoride application was the deposition of CaF2-like globules. Inspection of replicas before and after both treatments, at different time intervals and after specific additional clinical interventions, showed that such globules formed in vivo could be removed by professional toothbrushing, sonically, and chemically by KOH. With respect to enamel permeability, the results showed that fluoride treatments temporarily reduced enamel water permeability even when the CaF2-like globules were removed. The in vivo persistence of decreased enamel permeability after removal of the CaF2 globules was demonstrated for 1 h for NaF-treated teeth and for at least 7 days for APF-treated teeth. Important clinical considerations follow from these results.
In fact, the caries-preventing action of fluoride application may be due, in part, to its ability to decrease enamel water permeability, and CaF2-like globules seem to be indirectly involved in enamel protection over time by maintaining low permeability. Other results, obtained by metallographic microscopy and SEM/EDX analyses of fluoride-releasing and non-fluoride-releasing orthodontic resins, demonstrated the relevance of topical fluoride application in decreasing demineralization marks and modifying the chemical composition of the enamel in the treated area. The data obtained in both experiments confirmed the efficacy of fluoride in caries prevention and contribute to clarifying its mechanism of action.

Adhesive dentistry is the gold standard for caries treatment and tooth rehabilitation and is founded on important chemical and physical principles involving both the enamel and dentine substrates. In particular, acid etching of dental enamel is usually employed in bonding procedures to increase microscopic roughness. Different acids have been tested in the literature, and several etching procedures have been suggested. The acid-induced structural transformations in enamel after different etching treatments were evaluated by means of Raman and IR spectroscopy, and these findings were correlated with enamel permeability. Conventional etching with 37% phosphoric acid gel (H3PO4) for 30 s and etching with 15% HCl for 120 s were investigated. Raman and IR spectroscopy showed that treatment with both hydrochloric and phosphoric acids induced a decrease in the carbonate content of the enamel apatite. At the same time, both acids induced the formation of HPO4^2- ions. After H3PO4 treatment the bands due to the organic component of enamel decreased in intensity, while they increased after HCl treatment. Replicas of H3PO4-treated enamel showed a strongly reduced permeability, while replicas of samples treated with 15% HCl showed a maintained permeability. A decrease in the enamel organic component, as observed after H3PO4 treatment, involves a decrease in enamel permeability, while an increase in organic matter (achieved by HCl treatment) maintains enamel permeability. These results suggest a correlation between the amount of organic matter, enamel permeability and caries.

The results of the different studies carried out in this thesis contribute to clarifying and improving knowledge about enamel properties, with important repercussions for theoretical and clinical aspects of Dentistry.

Relevance: 20.00%

Publisher:

Abstract:

Nowadays, it is clear that the goal of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient in comparison with conventional ones, because fewer by-products mean lower costs for raw materials, for separation and for disposal treatments; they also imply an increase in productivity and, as a consequence, smaller reactors can be used. In addition, an indirect gain could derive from the better public image of a company marketing sustainable products or processes.

In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. Potentially, the impact of these productions on the environment could have been much worse than it is, if a continuous effort had not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, contributing to moving the chemical industry in the direction of a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation and innovative reactor configurations and process design. In order to implement all these ideas in real projects, the development of more efficient reactions is one primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency (their usual definitions are recalled below). In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound. As a matter of fact, only in a few oxidation reactions is a total, or close to total, conversion achieved, and usually the selectivity is limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, sometimes the cost of the oxidant further penalizes the process.
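For reference, the conventional definitions of these metrics are given below (standard textbook forms, not taken from the thesis; conventions vary, e.g. yield may be referred to the reactant fed or to the reactant converted):

\[
X_A = \frac{n_{A,\mathrm{in}} - n_{A,\mathrm{out}}}{n_{A,\mathrm{in}}}, \qquad
S_P = \frac{n_P/\nu_P}{\left(n_{A,\mathrm{in}} - n_{A,\mathrm{out}}\right)/\nu_A}, \qquad
Y_P = X_A\, S_P, \qquad
\mathrm{STY} = \frac{\dot{m}_P}{V_{\mathrm{reactor}}},
\]

where \(X_A\) is the conversion of reactant A, \(S_P\) and \(Y_P\) are the selectivity and yield to product P (with \(\nu\) the stoichiometric coefficients), and the space-time yield STY is the amount of product obtained per unit reactor volume and unit time.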
During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry.

In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The "greener" approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric to a catalytic oxidation), which also avoids the production of dangerous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to efficiently activate hydrogen peroxide. The overall process would be carried out in two steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol; the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction object of my investigation, consists of Nb2O5-SiO2 prepared by the sol-gel technique. The catalytic tests were first carried out under conditions that simulate the in-situ generation of hydrogen peroxide, that is, using a low concentration of the oxidant. Then, experiments were carried out using a higher hydrogen peroxide concentration. The study of the reaction mechanism was fundamental for obtaining indications about the best operating conditions and for improving the selectivity to menadione.

In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone. This can be considered a case of how economics can drive the sustainability issue; in fact, the new process, which yields phenol directly, besides avoiding the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), might be economically convenient with respect to the conventional process, if a high selectivity to phenol were obtained. Titanium silicalite-1 (TS-1) is the catalyst chosen for this reaction. By comparing the reactivity results obtained with TS-1 samples having different chemical-physical properties, and analysing in detail the effect of the more important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism.

Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an improvement of the selectivity to the para-dihydroxylated compound and a decrease of the selectivity to the ortho isomer would be desirable. Also in this case, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers and, finally, to make the industrial process more flexible, in order to adapt the process performance to fluctuations in market requirements. The reaction was carried out both in a batch stirred reactor and in a recirculating fixed-bed reactor. In the first system, the effect of various reaction parameters on catalytic behaviour was investigated: type of solvent or co-solvent, and particle size. With the second reactor type, I investigated the possibility of using a continuous system and of shaping the catalyst as extrudates (instead of powder), in order to avoid the catalyst filtration step.

Finally, Part D deals with the study of a new process for the valorisation of glycerol by means of its transformation into valuable chemicals. This molecule is nowadays produced in large amounts as a co-product of biodiesel synthesis; therefore, it is considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the oxidation of glycerol in the liquid phase, with hydrogen peroxide and TS-1; however, the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein: the latter can be obtained by dehydration of glycerol and then oxidized to acrylic acid (the two steps are summarised below). The oxidation step from acrolein to acrylic acid is already optimized at the industrial level; therefore, we decided to investigate in depth the first step of the process.
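As a reminder of the overall route (standard stoichiometry, not taken from the thesis), the two steps are:

\[
\mathrm{C_3H_8O_3\ (glycerol)} \;\longrightarrow\; \mathrm{C_3H_4O\ (acrolein)} + 2\,\mathrm{H_2O}
\]
\[
\mathrm{C_3H_4O} + \tfrac{1}{2}\,\mathrm{O_2} \;\longrightarrow\; \mathrm{C_3H_4O_2\ (acrylic\ acid)}
\]

where the first, acid-catalysed step is a double dehydration and the second is a selective oxidation.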
I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (one of the main problems usually met in glycerol dehydration). Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, in this way combining the acid functionality (necessary for the dehydration step) with the redox functionality (necessary for the oxidation step).

In conclusion, during my PhD work I investigated reactions that apply the "green chemistry" rules and strategies; in particular, I studied new, greener approaches for the synthesis of chemicals (Part A and Part B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).