30 results for discriminants of number fields
at Université de Lausanne, Switzerland
Abstract:
Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and the real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction has met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. 
convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial autocorrelation in the original space, which complicates the use of classical geostatistics. The challenges explored during the thesis are manifold. First, the complexity of models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
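The regression setting described above can be sketched in a few lines. This is a minimal illustration, not the thesis implementation: the "terrain features" and target are synthetic stand-ins for DEM-derived predictors and station wind speeds, and the hyperparameter grid is an assumption; only the overall scheme (kernel SVR with cross-validated complexity control) reflects the abstract.

```python
# Minimal sketch of kernel-based spatial regression on terrain features.
# Feature names and synthetic data are illustrative assumptions; the thesis
# uses features extracted from a high-resolution digital elevation model.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
n = 200
# Columns stand in for: x, y, elevation, slope, curvature
X = rng.normal(size=(n, 5))
# Synthetic nonlinear target plus noise (stand-in for measured wind speed)
y = np.sin(X[:, 2]) + 0.5 * X[:, 3] ** 2 + 0.1 * rng.normal(size=n)

# Model complexity (C) and kernel width (gamma) are tuned by cross-validation,
# mirroring the smoothness/noise trade-off mentioned in the abstract.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(model, {"svr__C": [1, 10], "svr__gamma": [0.1, 1.0]}, cv=3)
grid.fit(X, y)
pred = grid.predict(X)  # one estimate per location
```

The same pipeline would be evaluated on held-out stations to obtain the realistic performance estimates the abstract refers to.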
Abstract:
Background: Simultaneous polydrug use (SPU) may represent a greater incremental risk factor for human health than concurrent polydrug use (CPU). However, few studies have examined these patterns of use in relation to health issues, particularly with regard to the number of drugs used. Methods: In the present study, we have analyzed data from a representative sample of 5734 young Swiss males from the Cohort Study on Substance Use Risk Factors. Exposure to drugs (i.e., alcohol, tobacco, cannabis, and 15 other illicit drugs), as well as mental, social and physical factors, were studied through regression analysis. Results: We found that individuals engaging in CPU and SPU followed the known stages of drug use, involving initial experiences with licit drugs (e.g., alcohol and tobacco), followed by use of cannabis and then other illicit drugs. In this regard, two classes of illicit drugs were identified: first, uppers, hallucinogens and sniffed drugs; and then "harder" drugs (ketamine, heroin, and crystal meth), which were only consumed by polydrug users who were already taking numerous drugs. Moreover, we observed an association between the number of drugs used simultaneously and social issues (i.e., social consequences and aggressiveness). In fact, the more often the participants simultaneously used substances, the more likely they were to experience social problems. In contrast, we did not find any relationship between SPU and depression, anxiety, health consequences, or health. Conclusions: We identified some associations with SPU that were independent of CPU. Moreover, we found that the number of concurrently used drugs can be a strong factor associated with mental and physical health, although their simultaneous use may not significantly contribute to this association. Finally, the negative effects related to the use of one substance might be counteracted by the use of an additional substance.
Abstract:
The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data exploratory tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and thematic feature subsets, both in the context of feature selection and optimal parameters determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
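The core idea of MKL in this setting, a learned convex combination of kernels built on thematic feature subsets, can be illustrated briefly. In this sketch the split into "coordinates" and "terrain" subsets, the fixed weights, and all parameter values are assumptions for illustration; an actual MKL solver would optimize the weights jointly with the SVR.

```python
# Illustrative multiple-kernel combination for SVR: a convex mix of RBF
# kernels computed on thematic feature subsets. The subsets, weights, and
# parameters are illustrative assumptions, not the paper's configuration.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 6))
y = X[:, 0] ** 2 + np.sin(X[:, 3]) + 0.1 * rng.normal(size=150)

coords, terrain = X[:, :2], X[:, 2:]          # two thematic feature subsets
K1 = rbf_kernel(coords, coords, gamma=0.5)    # kernel on coordinates
K2 = rbf_kernel(terrain, terrain, gamma=0.5)  # kernel on terrain features

# MKL learns nonnegative weights d_m summing to 1; fixed here for illustration.
d = np.array([0.3, 0.7])
K = d[0] * K1 + d[1] * K2                     # combined kernel matrix

svr = SVR(kernel="precomputed", C=10.0).fit(K, y)
pred = svr.predict(K)
```

Reading off which subsets receive large weights is what makes MKL usable as the data-exploration and feature-selection tool the abstract describes.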
Abstract:
Counts performed on dissociated cell cultures of E10 chick embryo dorsal root ganglia (DRG) showed after 4-6 days of culture a pronounced decline of the neuronal population in neuron-enriched cultures and a net gain in the number of ganglion cells in mixed DRG cell cultures (containing both neurons and nonneuronal cells). In the latter case, the increase in the number of neurons was found to depend on NGF and to average 119% in defined medium or 129% in horse serum-supplemented medium after 6 days of culture. The lack of [3H]thymidine incorporation into the neuronal population indicated that the newly formed ganglion cells were not generated by proliferation. On the contrary, the differentiation of postmitotic neuroblasts present in the nonneuronal cell compartment was supported by sequential microphotographs of selected fields taken every hour for 48-55 hr after 3 days of culture. Apparently nonneuronal flat dark cells exhibited morphological changes and gradually evolved into neuronal ovoid and refringent cell bodies with expanding neurites. The ultrastructural organization of these evolving cells corresponded to that of primitive or intermediate neuroblasts. The neuronal nature of these rounding up cell bodies was indeed confirmed by the progressive expression of various neuronal cell markers (150 and 200-kDa neurofilament triplets, neuron specific enolase, and D2/N-CAM). Besides a constant lack of immunoreactivity for tyrosine hydroxylase, somatostatin, parvalbumin, and calbindin-D 28K and a lack of cytoenzymatic activity for carbonic anhydrase, all the newly produced neurons expressed three main phenotypic characteristics: a small cell body, a strong immunoreactivity to MAG, and substance P. Hence, ganglion cells newly differentiated in culture would meet characteristics ascribed to small B sensory neurons and more specifically to a subpopulation of ganglion cells containing substance P-immunoreactive material.
Abstract:
OBJECTIVE: As part of the WHO ICD-11 development initiative, the Topic Advisory Group on Quality and Safety explores meta-features of morbidity data sets, such as the optimal number of secondary diagnosis fields. DESIGN: The Health Care Quality Indicators Project of the Organization for Economic Co-Operation and Development collected Patient Safety Indicator (PSI) information from administrative hospital data of 19-20 countries in 2009 and 2011. We investigated whether three countries that expanded their data systems to include more secondary diagnosis fields showed increased PSI rates compared with six countries that did not. Furthermore, administrative hospital data from six of these countries and two American states, California (2011) and Florida (2010), were analysed for distributions of coded patient safety events across diagnosis fields. RESULTS: Among the participating countries, increasing the number of diagnosis fields was not associated with any overall increase in PSI rates. However, high proportions of PSI-related diagnoses appeared beyond the sixth secondary diagnosis field. The distribution of three PSI-related ICD codes was similar in California and Florida: 89-90% of central venous catheter infections and 97-99% of retained foreign bodies and accidental punctures or lacerations were captured within 15 secondary diagnosis fields. CONCLUSIONS: Six to nine secondary diagnosis fields are inadequate for comparing complication rates using hospital administrative data; at least 15 (and perhaps more with ICD-11) are recommended to fully characterize clinical outcomes. Increasing the number of fields should improve the international and intra-national comparability of data for epidemiologic and health services research, utilization analyses and quality of care assessment.
Abstract:
The purpose of this paper is to review the scientific literature from August 2007 to July 2010. The review is focused on more than 420 published papers. The review will not cover information coming from international meetings available only in abstract form. Fingermarks constitute an important chapter with coverage of the identification process as well as detection techniques on various surfaces. We note that the research has been very dense, both at exploring and understanding current detection methods and at bringing groundbreaking techniques to increase the number of marks detected from various objects. The recent report from the US National Research Council (NRC) is a milestone that has promoted a critical discussion on the state of forensic science and its associated research. We can expect a surge of interest in research in relation to cognitive aspects of mark and print comparison, establishment of relevant forensic error rates, and statistical modelling of the selectivity of marks' attributes. Other biometric means of forensic identification, such as footmarks or earmarks, are also covered in the report. Compared to previous years, we noted a decrease in the number of submissions in these areas. No doubt the NRC report has set the seed for further investigation of these fields as well.
Abstract:
The response of Arabidopsis to stress caused by mechanical wounding was chosen as a model to compare the performances of high-resolution quadrupole-time-of-flight (Q-TOF) and single stage Orbitrap (Exactive Plus) mass spectrometers in untargeted metabolomics. Both instruments were coupled to ultra-high pressure liquid chromatography (UHPLC) systems set under identical conditions. The experiment was divided into two steps: the first analyses involved sixteen unwounded plants, half of which were spiked with pure standards that are not present in Arabidopsis. The second analyses compared the metabolomes of mechanically wounded plants to unwounded plants. Data from both systems were extracted using the same feature detection software and submitted to unsupervised and supervised multivariate analysis methods. Both mass spectrometers were compared in terms of number and identity of detected features, capacity to discriminate between samples, repeatability and sensitivity. Although analytical variability was lower for the UHPLC-Q-TOF, the results for the two detectors were generally quite similar, both of them proving to be highly efficient at detecting even subtle differences between plant groups. Overall, sensitivity was found to be comparable, although the Exactive Plus Orbitrap provided slightly lower detection limits for specific compounds. Finally, to evaluate the potential of the two mass spectrometers for the identification of unknown markers, mass and spectral accuracies were calculated on selected identified compounds. While both instruments showed excellent mass accuracy (< 2.5 ppm for all measured compounds), better spectral accuracy was recorded on the Q-TOF. Taken together, our results demonstrate that comparable performances can be obtained at acquisition frequencies compatible with UHPLC on Q-TOF and Exactive Plus MS, which may thus be equivalently used for plant metabolomics.
Abstract:
OBJECTIVES/HYPOTHESIS: Facial nerve regeneration is limited in some clinical situations: with long grafts, in aged patients, and when the delay between nerve lesion and repair is prolonged. This deficient regeneration is due to the limited number of regenerating nerve fibers, their immaturity, and the unresponsiveness of Schwann cells after a long period of denervation. This study proposes to apply glial cell line-derived neurotrophic factor (GDNF) on facial nerve grafts via nerve guidance channels to improve the regeneration. METHODS: Two situations were evaluated: immediate and delayed grafts (repair 7 months after the lesion). Each group contained three subgroups: a) graft without channel; b) graft with a channel without neurotrophic factor; and c) graft with a GDNF-releasing channel. A functional analysis was performed with clinical observation of facial nerve function and a nerve conduction study at 6 weeks. Histological analysis was performed by counting myelinated fibers within the graft and distal to the graft. Central evaluation was assessed with Fluoro-Ruby retrograde labeling and Nissl staining. RESULTS: This study showed that GDNF allowed an increase in the number and the maturation of nerve fibers, as well as in the number of retrogradely labeled neurons, in delayed anastomoses. In contrast, after immediate repair, the regenerated nerves in the presence of GDNF showed inferior results compared to the other groups. CONCLUSIONS: GDNF is a potent neurotrophic factor to improve facial nerve regeneration in grafts performed several months after the nerve lesion. However, GDNF should not be used for immediate repair, as it possibly inhibits the nerve regeneration.
Abstract:
Purpose: Collaboration and interprofessional practices are highly valued in health systems everywhere, partly based on the rationale that they improve outcomes of care for people with complex health problems, such as low back pain. Research in the area of low back pain also supports the involvement of different health professionals in the interventions for people who present this condition. The aim of this study was to identify factors influencing the interprofessional practices of physiotherapists working in private settings with people with low back pain. Relevance: Physiotherapists, like other health professionals, are encouraged to engage in interprofessional practices in their daily work. However, to date, very little is known of their interprofessional practices, especially in private settings. Understanding physiotherapists' interprofessional practices and their influencing factors will notably advance knowledge relating to the organisation of physiotherapy services for people with low back pain. Participants: Participants in this study were 13 physiotherapists, including 10 women and 3 men, having between 3 and 22 years of professional experience and working in one of 10 regions of the Province of Quebec (Canada). In order to obtain maximal variation in the perspectives, participants were selected using a recruitment matrix including three criteria: duration of professional experience, work location, and physical proximity with other professionals. Methods: This was a descriptive qualitative study using face-to-face semi-structured interviews as the main method of data collection. An interview guide was developed based on an evidence-derived frame of reference. Each interview lasted between 55 and 95 minutes and was transcribed verbatim. Analysis: Qualitative analyses took the form of content analysis, encompassing data coding and general thematic regrouping. NVivo version 8 was used to assist data organisation and analysis. 
Results: Multiple factors influencing the interprofessional practices of physiotherapists were identified. The main factors include the consulting person's health condition, the extent of knowledge of health professionals' roles and fields of practice, the proximity and availability of professional resources, as well as daily work schedules. Conclusions: Our findings highlight the influence of multiple factors on physiotherapists' interprofessional practices, including professional practice and organisational issues. However, further research on the interprofessional practices of physiotherapists is still required. Research priorities targeting the views of other health professionals, as well as those of service users, would enhance our comprehension of the interprofessional practices of physiotherapists. Implications: This study provides new insights that improve our understanding of the interprofessional practices of physiotherapists working in private settings with people with low back pain, more specifically of the factors influencing these practices. Based on our findings, implementing changes such as improving current and future health professionals' knowledge of the fields and roles of other health professionals through training may contribute to positively influencing interprofessional practices. Keywords: Interprofessional practices; Private practice; Low back pain. Funding acknowledgements: This research was supported in part by a B.E. Schnurr Memorial Fund Research Grant administered by the Physiotherapy Foundation of Canada, as well as by a clinical research partnership in physiotherapy between the Quebec Rehabilitation Research Network (REPAR) and the Ordre professionnel de la physiothérapie du Québec (OPPQ). KP received doctoral-level scholarships from the Canadian Institutes of Health Research (CIHR) and the Institut de recherche Robert-Sauvé en santé et en sécurité du travail (IRSST). CE Dionne is a FRSQ senior Research Scholar. 
Ethics approval: This project was approved by the ethics research committee of the Institut de réadaptation en déficience physique de Québec.
Abstract:
In 2007, several international studies brought useful information for the daily work of internists in hospital settings. This summary is of course subjective, but it reflects the readings, interests, and questions of the chief residents of the Department of internal medicine, who wrote this article as an original trip through the medical literature. This trip will allow you to review some aspects of important fields such as heart failure, diabetes, endocarditis, COPD, and quality of care. Besides illustrating the growing diversity of the fields covered by internal medicine, these various topics also underline the uncertainty internists have to face in a modern practice directed towards evidence.
Abstract:
Right from the beginning of the development of the medical specialty of Physical and Rehabilitation Medicine (PRM) the harmonization of the fields of competence and the specialist training across Europe was always an important issue. The initially informal European collaboration was formalized in 1963 under the umbrella of the European Federation of PRM. The European Academy of PRM and the UEMS section of PRM started to contribute in 1969 and 1974 respectively. In 1991 the European Board of Physical and Rehabilitation Medicine (EBPRM) was founded with the specific task of harmonizing education and training in PRM in Europe. The EBPRM has progressively defined curricula for the teaching of medical students and for the postgraduate education and training of PRM specialists. It also created a harmonized European certification system for medical PRM specialists, PRM trainers and PRM training sites. European teaching initiatives for PRM trainees (European PRM Schools) were promoted and learning material for PRM trainees and PRM specialists (e-learning, books and e-books, etc.) was created. For the future the Board will have to ensure that a minimal specific undergraduate curriculum on PRM based on a detailed European catalogue of learning objectives will be taught in all medical schools in Europe as a basis for the general medical practice. To stimulate the harmonization of national curricula, the existing postgraduate curriculum will be expanded by a syllabus of competencies related to PRM and a catalogue of learning objectives to be reached by all European PRM trainees. The integration of the certifying examination of the PRM Board into the national assessment procedures for PRM specialists will also have to be promoted.
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N(r): 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to N(r)^(-1/2). 
From dose volume histograms, the surviving fraction of cells, equivalent uniform dose (EUD), and TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and results were compared between the adjusted spherical and cellular models with similar comparability. The TCP values from the macroscopic tumor models were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
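The per-cell adjustment step described in the methods can be sketched numerically. This is only an illustrative sketch of the idea: the Gaussian adjustment of relative width N(r)^(-1/2) follows the abstract, but the linear-quadratic survival parameters and the Poisson form of TCP are assumptions added for the illustration, not values or choices taken from the paper.

```python
# Sketch of the adjusted simple-sphere step for one radial bin: each cell
# receives the bin-average absorbed dose plus a Gaussian adjustment of
# relative width N(r)^(-1/2), after which survival and TCP follow.
# Alpha/beta values and the Poisson TCP are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
decays_per_cell = 100          # N(r), one of the ratios studied
n_cells = 10_000               # cells in this radial bin (illustrative)
bin_dose = 8.0                 # Gy, average absorbed dose in the bin (assumed)

# Per-cell dose with statistical spread N(r)^(-1/2), clipped at zero
rel_sigma = decays_per_cell ** -0.5
cell_dose = bin_dose * (1.0 + rel_sigma * rng.normal(size=n_cells))
cell_dose = np.clip(cell_dose, 0.0, None)

# Illustrative linear-quadratic cell survival probabilities
alpha, beta = 0.3, 0.03        # Gy^-1, Gy^-2 (assumed values)
surv = np.exp(-alpha * cell_dose - beta * cell_dose ** 2)

# Poisson TCP: probability that no clonogenic cell survives
tcp = float(np.exp(-surv.sum()))
```

At low N(r) the spread widens, which is exactly the low-activity statistical effect the adjusted model is designed to preserve.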
Abstract:
This paper presents a review of methodology for semi-supervised modeling with kernel methods, when the manifold assumption is guaranteed to be satisfied. It concerns environmental data modeling on natural manifolds, such as the complex topographies of mountainous regions, where environmental processes are highly influenced by the relief. These relations, possibly regionalized and nonlinear, can be modeled from data with machine learning using digital elevation models in semi-supervised kernel methods. The range of tools and methodological issues discussed in the study includes feature selection and semi-supervised Support Vector algorithms. A real case study devoted to data-driven modeling of meteorological fields illustrates the discussed approach.
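The manifold-assumption idea, letting abundant unlabeled points shape the decision function learned from a few labeled ones, can be illustrated with a standard graph-based semi-supervised kernel method. This sketch uses scikit-learn's LabelSpreading rather than the paper's Support Vector algorithms, and the data are synthetic stand-ins for DEM-derived features; both substitutions are assumptions for illustration.

```python
# Sketch of semi-supervised learning with an RBF kernel: only a few points
# carry labels, and the unlabeled points (marked -1) influence the solution
# through the kernel graph. LabelSpreading stands in for the semi-supervised
# SVM variants discussed in the paper; data are synthetic.
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 4))               # e.g. elevation, slope, curvature, aspect
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)

y = y_true.copy()
y[30:] = -1                                  # only 30 labeled points; -1 = unlabeled

model = LabelSpreading(kernel="rbf", gamma=0.5).fit(X, y)
pred = model.transduction_                   # labels inferred for every point
```

In the environmental setting, the unlabeled points would be the dense grid of DEM locations, which is what makes the manifold assumption natural there.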
Abstract:
For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the supermatrix approach that is widely used). In this study, seven of the most commonly used supertree methods are investigated by using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether the performance levels were affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods showed a poorer performance or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, that is, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set sizes and missing data. 
Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in the increase in computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.