994 results for Real quadratic fields
Abstract:
Neuropathic pain is defined as pain caused by a lesion of the somatosensory nervous system. It is characterized by exaggerated, spontaneous pain, or pain triggered by stimuli that are normally non-painful (allodynia) or painful (hyperalgesia). Although it affects 7% of the population, its biological mechanisms have not yet been elucidated. Studying variations in gene expression in key tissues of the sensory pathways (notably the dorsal root ganglion and the dorsal horn of the spinal cord) at different time points after a peripheral nerve injury could reveal new therapeutic targets. These variations can be detected with high sensitivity by reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR). To guarantee reliable results, recent guidelines recommend validating the reference genes used for data normalization ("Minimum information for publication of quantitative real-time PCR experiments", Bustin et al. 2009). After searching the literature for reference genes frequently used in our SNI (spared nerve injury) model of peripheral neuropathic pain and in nervous tissue in general, we established a list of promising candidates: actin beta (Actb), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We evaluated the expression stability of these genes in the dorsal root ganglion and in the dorsal horn at different time points after nerve injury (SNI) by calculating coefficients of variation and by using the geNorm algorithm, which compares expression levels between the candidates and determines the most stable remaining pair of genes.
It was also possible to rank the genes by stability and to identify the number of genes required for the most accurate normalization. The genes most often cited as references in the SNI model were GAPDH, HMBS, Actb, HPRT1 and 18S. Only HPRT1 and 18S had previously been validated in RT-qPCR arrays. In our study, all the genes tested in the dorsal root ganglion and in the dorsal horn met the stability criterion of an M-value below 1. However, with a coefficient of variation (CV) above 50% in the dorsal root ganglion, 18S could not be retained. The most stable pair of genes in the dorsal root ganglion was HPRT1 and Actb; in the dorsal horn it was RPL29 and RPL13a. We therefore ranked and validated Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as usable reference genes in the dorsal horn for the rat SNI model; in the dorsal root ganglion, 18S did not meet our criteria. We also determined that a combination of two stable reference genes suffices for accurate normalization. Expression changes of potential genes of interest under identical experimental conditions (SNI, tissue and time points post SNI) can now be measured on the basis of reliable normalization. This will make it possible not only to identify regulations potentially important in the genesis of neuropathic pain, but also to observe the different phenotypes evolving over time after nerve injury.
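The two stability measures used above (per-gene CV and the geNorm M-value) can be sketched in a few lines of Python. The expression values in the test below are invented for illustration, and this is only a schematic of geNorm's pairwise-variation idea, not the published algorithm's full stepwise-exclusion procedure:

```python
import numpy as np

def m_values(expr):
    """geNorm-style stability. expr is a (samples, genes) array of relative
    expression quantities. M_j is the mean, over all other genes k, of the
    standard deviation of log2(expr_j / expr_k) across samples.
    Lower M means more stable expression."""
    log2 = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        pair_sd = [np.std(log2[:, j] - log2[:, k], ddof=1)
                   for k in range(n_genes) if k != j]
        M[j] = np.mean(pair_sd)
    return M

def cv(expr):
    """Coefficient of variation (%) per gene across samples."""
    return expr.std(axis=0, ddof=1) / expr.mean(axis=0) * 100.0
```

A gene whose expression swings strongly across time points (like 18S in the dorsal root ganglion above) would show both the largest M-value and a CV exceeding the 50% cut-off.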
Abstract:
Objective: Aspergillus species are the main pathogens causing invasive fungal infections, but the prevalence of other mould species is rising. Resistance to antifungals among these newly emerging pathogens presents a challenge for the management of infections. Conventional susceptibility testing of non-Aspergillus species is laborious and often difficult to interpret. We evaluated a new method for real-time susceptibility testing of moulds based on their growth-related heat production. Methods: Laboratory and clinical strains of Mucor spp. (n = 4), Scedosporium spp. (n = 4) and Fusarium spp. (n = 5) were used. Conventional MICs were determined by microbroth dilution. Isothermal microcalorimetry was performed at 37 °C using Sabouraud dextrose broth (SDB) inoculated with 10⁴ spores/ml (determined by microscopic enumeration). SDB without antifungals was used for evaluation of growth characteristics. Detection time was defined as heat flow exceeding 10 µW. For susceptibility testing, serial dilutions of amphotericin B, voriconazole, posaconazole and caspofungin were used. The minimal heat inhibitory concentration (MHIC) was defined as the lowest antifungal concentration inhibiting 50% of the heat produced by the growth control at 48 h (or at 24 h for Mucor spp.). Susceptibility tests were performed in duplicate. Results: The tested mould genera had distinctive heat flow profiles, with a median detection time (range) of 3.4 h (1.9-4.1 h) for Mucor spp., 11.0 h (7.1-13.7 h) for Fusarium spp. and 29.3 h (27.4-33.0 h) for Scedosporium spp. The graph shows heat flow (in duplicate) of one representative strain from each genus (dashed line marks the detection limit). Species belonging to the same genus showed similar heat production profiles. The table shows MHIC and MIC ranges for the tested moulds and antifungals. Conclusions: Microcalorimetry allowed rapid detection of growth of slow-growing species, such as Fusarium spp. and Scedosporium spp.
Moreover, microcalorimetry offers a new approach for antifungal susceptibility testing of moulds, correlating with conventional MIC values. Interpretation of calorimetric susceptibility data is easy, and real-time data on the effect of different antifungals on mould growth are obtained in addition. This method may be used to investigate the mechanisms of action of antifungals, new substances and drug-drug combinations.
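The two read-outs defined in Methods (detection time and MHIC) reduce to simple threshold rules. The sketch below uses hypothetical numbers and assumes that the cumulative heat for each antifungal concentration has already been integrated from the heat-flow curve at the read-out time:

```python
def detection_time(t_hours, heat_flow_uw, threshold_uw=10.0):
    """Return the first time point (h) at which the heat flow exceeds
    the detection threshold (10 µW in the study), or None if never."""
    for t, hf in zip(t_hours, heat_flow_uw):
        if hf > threshold_uw:
            return t
    return None

def mhic(concs, total_heats, control_heat):
    """Minimal heat inhibitory concentration: the lowest antifungal
    concentration whose cumulative heat at the read-out time is at most
    50% of the growth control's; None if no concentration achieves this."""
    for conc, heat in sorted(zip(concs, total_heats)):
        if heat <= 0.5 * control_heat:
            return conc
    return None
```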
Abstract:
Digital holographic microscopy (DHM) is a new imaging technique that provides quantitative phase images with high accuracy and stability, making it possible to explore a large variety of relevant processes, occurring on time scales from picoseconds to days, in fields including materials research as well as cell biology. As a non-invasive, real-time imaging technique, DHM is particularly well suited for high-throughput screening.
Abstract:
The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Thus, ionization chambers are calibrated in absorbed dose to water, which is the same quantity as that calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation at the level of the secondary-particle range. Therefore, the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in a low-density medium, D̄(w,Q,V₁)^low, averaged over a scoring volume V₁ for a geometry where V₁ is filled with the low-density medium, to the absorbed dose to water, D̄(w,Q,V₂)^low, averaged over a volume V₂ for a geometry where V₂ is filled with water. In the Monte Carlo simulations, D̄(w,Q,V₂)^low is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, in accordance with the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behavior of this correction factor as a function of depth in the phantom, photon beam energy, phantom density and field size.
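Written out, the correction factor defined above is a ratio of two volume-averaged doses. The notation is reconstructed from the text (the original symbols were garbled), so the exact sub/superscript convention is an assumption:

```latex
% Correction factor: mean absorbed dose to water, in the low-density
% geometry, scored over V_1 (filled with the low-density medium),
% divided by the same quantity scored over V_2 (filled with water,
% i.e. the chamber volume replaced by water)
k \;=\; \frac{\bar{D}^{\,\mathrm{low}}_{w,Q}(V_1)}{\bar{D}^{\,\mathrm{low}}_{w,Q}(V_2)}
```

The numerator corresponds to the TPS-like quantity (dose to water in the unperturbed low-density medium) and the denominator to the chamber-like quantity, so multiplying the chamber reading by k removes the volume-averaging perturbation.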
Abstract:
Lymphocytic choriomeningitis virus (LCMV) is a rare cause of central nervous system disease in humans. Screening by real-time RT-PCR assay is of interest in cases of aseptic meningitis of unknown etiology. A specific LCMV real-time RT-PCR assay, based on the detection of genomic sequences of the viral nucleoprotein (NP), was developed to assess the presence of LCMV in cerebrospinal fluid (CSF) samples sent for viral screening to a Swiss university hospital laboratory. A 10-fold dilution series assay using a plasmid containing the cDNA of the viral NP of the LCMV isolate Armstrong (Arm) 53b demonstrated the high sensitivity of the assay, with a lowest detection limit of ≤50 copies per reaction. High sensitivity was confirmed by dilution series assays in a pool of human CSF using four different LCMV isolates (Arm53b, WE54, Traub and E350), with observed detection limits of ≤10 PFU/ml (Arm53b and WE54) and 1 PFU/ml (Traub and E350). Analysis of 130 CSF samples showed no cases of acute infection. The absence of positive cases was confirmed by a published PCR assay detecting all Old World arenaviruses. This study validates a specific and sensitive real-time RT-PCR assay for the diagnosis of LCMV infections. The results show that LCMV infections are extremely rare in hospitalized patients in western Switzerland.
Abstract:
This report describes the development of a SYBR Green I-based real-time polymerase chain reaction (PCR) protocol for detection on the ABI Prism 7000 instrument. Primers targeting the gene encoding the SSU rRNA were designed to amplify DNA from Schistosoma mansoni with high specificity in a real-time quantitative PCR system. The limit of detection of parasite DNA was 10 fg of purified genomic DNA, i.e., less than the equivalent of one parasite cell (genome ≈ 580 fg DNA). The efficiency was 0.99 and the correlation coefficient (R²) was 0.97. When different copy numbers of the target amplicon were used as standards, the assay could detect at least 10 copies of the specific target. The primers were designed to amplify a 106 bp DNA fragment (Tm 83 °C). The assay was highly specific for S. mansoni and did not recognize DNA from closely related non-schistosome trematodes. The real-time PCR allowed accurate quantification of S. mansoni DNA, and no time-consuming post-PCR detection of amplification products by gel electrophoresis was required. The assay is potentially able to quantify S. mansoni DNA (and indirectly parasite burden) in a variety of samples, such as snail tissue, serum and feces from patients, and cercaria-infested water. Thus, these PCR protocols have the potential to be used as tools for monitoring schistosome transmission and for quantitative diagnosis of human infection.
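The efficiency and R² figures quoted above come from a standard curve of Ct versus log₁₀ of the starting quantity. A minimal sketch of that calculation follows (function name and test numbers are ours; the efficiency formula E = 10^(−1/slope) − 1 is the standard one for qPCR standard curves):

```python
import numpy as np

def standard_curve(log10_qty, ct):
    """Fit Ct = slope * log10(quantity) + intercept by least squares and
    return (efficiency, R^2). A perfect doubling per cycle gives a slope
    of -1/log10(2) ~ -3.32 and an efficiency of 1.0 (i.e. 100%)."""
    log10_qty = np.asarray(log10_qty, dtype=float)
    ct = np.asarray(ct, dtype=float)
    slope, intercept = np.polyfit(log10_qty, ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    predicted = slope * log10_qty + intercept
    ss_res = np.sum((ct - predicted) ** 2)
    ss_tot = np.sum((ct - ct.mean()) ** 2)
    return efficiency, 1.0 - ss_res / ss_tot
```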
Abstract:
The European Central Bank (ECB) has commissioned us to build a database for storing daily records related to a set of indicators. When the indicators are recorded, if any business rule previously defined by the ECB is violated, a series of alerts is raised; these are also stored in the database for later processing. With the information stored in this database (DB), the ECB, using an application to exploit these data, will be able to monitor the health of European banks in real time and support decision-making. We are also asked to implement a statistics module containing a set of precalculated data to simplify querying.
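As a rough sketch of the record-plus-alerts design described above: each insert is checked against the business rules, and any violations are stored as alerts alongside the record. The actual ECB rules are not given in the text, so the two rules below are purely illustrative:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Alert:
    rule: str
    message: str

# Hypothetical business rules; the real ones are defined by the ECB.
RULES = [
    ("non_negative", lambda rec: rec["value"] >= 0,
     "indicator value is negative"),
    ("not_future", lambda rec: rec["day"] <= date.today(),
     "record is dated in the future"),
]

def store_record(db, rec):
    """Insert a daily indicator record; rule violations are stored as
    alerts alongside the record for later processing."""
    alerts = [Alert(name, msg)
              for name, check, msg in RULES if not check(rec)]
    db.setdefault("records", []).append(rec)
    db.setdefault("alerts", []).extend(alerts)
    return alerts
```

In a production system these checks would typically live in the database itself (constraints or triggers); the Python version only illustrates the data flow.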
Abstract:
This thesis aims to investigate the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. The thesis develops along two principal lines, namely: In the first part, we focus on two univariate risk models, i.e., deflated risk and reinsurance risk models. Therein we investigate their tail expansions under certain tail conditions on the common risks. Our main results are illustrated by typical examples and numerical simulations. Finally, the findings are formulated into applications in insurance, for instance, approximations of Value-at-Risk, conditional tail expectations, etc. The second part of the thesis is devoted to the following three bivariate models: The first model is concerned with bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible due to a tuning parameter, and their asymptotic distributions are obtained under some second-order bivariate slowly varying conditions on the model. Then, we give some examples and present a small Monte Carlo simulation study, followed by an application to a real data set from insurance. The objective of our second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are extensively useful in statistical applications and are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which distinguish the tail dependence structure as shown by our principal results.
The third bivariate risk model is concerned with the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
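The tail dependence coefficient studied in the second bivariate model has a standard definition, which we reproduce here for orientation; whether the thesis uses exactly this (upper-tail) convention is an assumption on our part:

```latex
% Upper tail dependence coefficient of (X_1, X_2)
% with marginal distribution functions F_1, F_2
\lambda_U \;=\; \lim_{u \uparrow 1}
  \Pr\!\left( F_1(X_1) > u \;\middle|\; F_2(X_2) > u \right)
```

A value λ_U > 0 indicates asymptotic dependence in the upper tail (joint extremes occur with non-negligible probability), while λ_U = 0 indicates asymptotic independence.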
Abstract:
Adolescent health surveys, like those for other segments of the population, tend to remain in the hands of researchers, where they have no real impact on the way critical health issues are dealt with by policy makers or other professionals directly connected with young people in their everyday work. This paper reviews important issues concerning the dissemination of survey results among professionals from various fields. The content, length and wording of the messages should be tailored to the audience one wants to reach, as should the type of channels used for their diffusion. Survey data sets can be used to select priorities for interventions: ad hoc presentations, attractive summaries and brochures, or even films expressing young people's opinions have been used by European public health professionals to make data sets usable in various local, regional and national contexts. CONCLUSION: The impact of these diffusion strategies is, however, difficult to assess, and the strategies themselves need to be refined. The adequate delivery of survey findings, as well as advocacy and lobbying activities, requires specific skills which can be taken on by specialized professionals. Ultimately, it is the researchers' responsibility to ensure that such tasks are effectively performed.
Abstract:
The possibilities offered by virtuality are of great importance in education and in every aspect related to it. Libraries and documentation centres are clearly no strangers to this new virtual environment, enabled by social, economic and, above all, technological change, which has given librarians and documentalists access to vast amounts of information and documentation, allowing them to act as intermediaries between this new situation and the use that the various types of users can make of it.
Abstract:
Reinforcement learning (RL) is a well-suited technique for robot learning, as it can learn in unknown environments and in real time. The main difficulties in adapting classic RL algorithms to robotic systems are the generalization problem and the correct observation of the Markovian state. This paper attempts to solve the generalization problem by proposing the semi-online neural-Q_learning algorithm (SONQL). The algorithm uses the classic Q_learning technique with two modifications. First, a neural network (NN) approximates the Q_function, allowing the use of continuous states and actions. Second, a database of the most representative learning samples accelerates and stabilizes convergence. The term semi-online refers to the fact that the algorithm uses past learning samples as well as the current one. Nevertheless, the algorithm is able to learn in real time while the robot is interacting with the environment. The paper shows simulated results with the "mountain-car" benchmark and real results with an underwater robot in a target-following behavior.
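The two modifications described above (function approximation of the Q-function and a database of stored learning samples) can be sketched as follows. A linear model stands in for the paper's neural network, the replay database is a plain list, and all hyperparameters are invented, so this only illustrates the structure of SONQL, not the published algorithm:

```python
import random
import numpy as np

class ReplayQLearner:
    """Q-learning with (i) a function approximator for Q(s, a) and
    (ii) a small database of stored samples replayed at each update,
    so past experience is reused alongside the current sample."""

    def __init__(self, n_features, n_actions, lr=0.05, gamma=0.95,
                 db_size=100, replay_batch=8):
        self.w = np.zeros((n_actions, n_features))  # linear Q approximator
        self.lr, self.gamma = lr, gamma
        self.db, self.db_size = [], db_size
        self.replay_batch = replay_batch

    def q(self, s):
        """Q-values for all actions in state s (a feature vector)."""
        return self.w @ s

    def update(self, s, a, r, s_next, done):
        """Store the current sample, then do TD updates on a random
        batch drawn from the database (which includes the new sample)."""
        self.db.append((s, a, r, s_next, done))
        self.db = self.db[-self.db_size:]  # keep the most recent samples
        batch = random.sample(self.db, min(self.replay_batch, len(self.db)))
        for s_, a_, r_, sn_, d_ in batch:
            target = r_ if d_ else r_ + self.gamma * np.max(self.q(sn_))
            td_error = target - self.q(s_)[a_]
            self.w[a_] += self.lr * td_error * s_
```

In the real algorithm the database keeps the most *representative* samples rather than simply the most recent, and the linear model is replaced by an NN trained by backpropagation.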
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robots. In order to speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred to and tested successfully on a real robot. Future work plans to continue the learning process online on the real robot while performing the mentioned task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
Abstract:
This paper deals with the problem of navigation for an unmanned underwater vehicle (UUV) through image mosaicking. It represents a first step towards a real-time vision-based navigation system for a small-class, low-cost UUV. We propose a navigation system composed of: (i) an image mosaicking module which provides velocity estimates; and (ii) an extended Kalman filter based on the hydrodynamic equations of motion, previously identified for this particular UUV. The resulting system is able to estimate the position and velocity of the robot. Moreover, it is able to deal with the visual occlusions that usually appear when the sea bottom does not have enough visual features to solve the correspondence problem in a certain area of the trajectory.
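The predict/update structure of the proposed navigation system can be sketched with a one-dimensional linear Kalman filter in which the mosaicking module supplies velocity measurements. The paper's filter is an EKF built on the vehicle's identified hydrodynamic model, so this stand-in (with invented noise parameters) only illustrates the structure:

```python
import numpy as np

class VelocityFusedKF:
    """1-D Kalman filter: predict position/velocity with a constant-
    velocity motion model, correct with velocity measurements coming
    from an image mosaicking module."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(2)                 # state: [position, velocity]
        self.P = np.eye(2)                   # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # motion model
        self.Q = 1e-3 * np.eye(2)            # process noise (assumed)
        self.H = np.array([[0.0, 1.0]])      # mosaicking measures velocity
        self.R = np.array([[0.05]])          # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, v_meas):
        y = np.array([v_meas]) - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + self.R         # innovation cov.
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

During a visual occlusion no velocity measurement is available, so only predict() is called and the motion model bridges the gap, which mirrors the occlusion-handling behavior described above.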
Abstract:
The automatic interpretation of conventional traffic signs is very complex and time-consuming. This paper concerns an automatic warning system for driving assistance. It does not interpret the standard traffic signs on the roadside; the proposal is to add to the existing signs another type of traffic sign whose information can be more easily interpreted by a processor. The information to be added is profuse, and therefore the most important objective is the robustness of the system. The basic idea of this new philosophy is that the co-pilot system for automatic warning and driving assistance can more easily interpret the information contained in the new sign, while the human driver only has to interpret the "classic" sign. One of the codings that has been tested with good results, and which seems to us easy to implement, has a rectangular shape and 4 vertical bars of different colours. The size of these signs is equivalent to that of conventional signs (approximately 0.4 m²). The colour information of the sign can be easily interpreted by the proposed processor, and the interpretation is much easier and quicker than for the information shown by the pictographs of classic signs.
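As an illustration of how compactly such a bar sign could encode information: with four colours in four positions, each bar can be read as a base-4 digit. The paper does not specify the colour coding, so the mapping below is hypothetical:

```python
# Hypothetical colour-to-digit mapping (the paper does not define one).
COLOUR_DIGIT = {"red": 0, "green": 1, "blue": 2, "yellow": 3}

def decode_sign(bars):
    """Decode a sign of 4 vertical colour bars into an integer code,
    reading the left-most bar as the most significant base-4 digit."""
    code = 0
    for colour in bars:
        code = code * 4 + COLOUR_DIGIT[colour]
    return code
```

Four colours over four bars give 4⁴ = 256 distinct codes, i.e. enough to index a sizeable table of standard sign meanings, which a processor can look up far faster than it could classify a pictograph.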