922 results for Efficient error correction


Relevance:

20.00%

Publisher:

Abstract:

The involvement of the cerebellum in migraine pathophysiology is not well understood. We used a biparametric approach at high-field MRI (3 T) to assess the structural integrity of the cerebellum in 15 migraineurs with aura (MWA), 23 migraineurs without aura (MWoA), and 20 healthy controls (HC). High-resolution T1 relaxation maps were acquired together with magnetization transfer images in order to probe microstructural and myelin integrity. Clusterwise analysis was performed on T1 and magnetization transfer ratio (MTR) maps of the cerebellum of MWA, MWoA, and HC using an ANOVA and a non-parametric clusterwise permutation F test, with age and gender as covariates and correction for the familywise error rate. In addition, mean MTR and T1 in frontal regions known to be highly connected to the cerebellum were computed. Clusterwise comparison among groups showed a cluster of lower MTR in the right Crus I of MWoA patients vs. HC and MWA subjects (p = 0.04). Univariate and bivariate analyses of the T1 and MTR contrasts showed that MWoA patients had longer T1 and lower MTR in the right and left pars orbitalis compared to MWA (p < 0.01 and 0.05, respectively), but no differences were found with HC. Lower MTR and longer T1 point to a loss of macromolecules and/or micro-edema in Crus I and the pars orbitalis in MWoA patients vs. HC and vs. MWA. The pathophysiological implications of these findings are discussed in light of recent literature.
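The non-parametric permutation F test mentioned above can be sketched in miniature. This is an illustrative Python version under my own assumptions (a single measure per subject; the study's clusterwise variant additionally records per-permutation maximum cluster statistics to control the familywise error rate):

```python
import numpy as np

def f_statistic(groups):
    """One-way ANOVA F statistic for a list of 1-D samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def permutation_f_test(groups, n_perm=10000, seed=0):
    """p-value: fraction of group-label permutations with F >= observed F."""
    rng = np.random.default_rng(seed)
    sizes = [len(g) for g in groups]
    pooled = np.concatenate(groups)          # copy; originals untouched
    f_obs = f_statistic(groups)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                  # permute labels by shuffling values
        splits = np.split(pooled, np.cumsum(sizes)[:-1])
        if f_statistic(splits) >= f_obs:
            count += 1
    return f_obs, (count + 1) / (n_perm + 1)
```

The permutation null requires no distributional assumption, which is why such tests are common in neuroimaging group comparisons.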

Relevance:

20.00%

Publisher:

Abstract:

We study preconditioning techniques for discontinuous Galerkin discretizations of isotropic linear elasticity problems in primal (displacement) formulation. We propose subspace correction methods, based on a splitting of the vector-valued piecewise-linear discontinuous finite element space, that are optimal with respect to the mesh size and the Lamé parameters. The pure displacement, the mixed, and the traction-free problems are discussed in detail. We present a convergence analysis of the proposed preconditioners and include numerical examples that validate the theory and assess the performance of the preconditioners.
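As a hedged illustration of the general framework (notation is mine, not taken from the paper): an additive subspace correction preconditioner built from a splitting V = Σ_i V_i with restriction operators R_i typically takes the form

```latex
B^{-1} \;=\; \sum_{i} R_i^{T} A_i^{-1} R_i ,
\qquad
A_i \;=\; R_i \, A \, R_i^{T} ,
```

and optimality in the sense of the abstract means that the condition number \(\kappa(B^{-1}A)\) is bounded by a constant independent of the mesh size \(h\) and the Lamé parameters \(\lambda,\mu\).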

Relevance:

20.00%

Publisher:

Abstract:

Report on the scientific sojourn carried out at the University of California, Berkeley, from September to December 2007. Environmental niche modelling (ENM) techniques are powerful tools for predicting species' potential distributions. In the last ten years, a plethora of novel methodological approaches and modelling techniques have been developed. During three months I stayed at the University of California, Berkeley, working under the supervision of Dr. David R. Vieites. The aim of our work was to quantify the error committed by these techniques and also to test how an increase in sample size affects the resulting predictions. Using the MaxEnt software we generated predictive distribution maps, from different sample sizes, of the Eurasian quail (Coturnix coturnix) in the Iberian Peninsula. The quail is a generalist species from a climatic point of view, but a habitat specialist. The resulting distribution maps were compared with the real distribution of the species, obtained from recent bird atlases of Spain and Portugal. Results show that ENM techniques can commit important errors when predicting the distribution of generalist species. Moreover, an increase in sample size is not necessarily associated with better model performance. We conclude that a deep knowledge of a species' biology and the variables affecting its distribution is crucial for optimal modelling; the lack of such knowledge can lead to wrong conclusions.
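The comparison step described above (predicted map versus atlas distribution) can be sketched as a simple confusion-matrix evaluation. This is an illustrative Python sketch under my own assumptions (a thresholded continuous suitability map and a binary atlas raster), not the exact procedure used in the report:

```python
import numpy as np

def map_agreement(predicted, observed, threshold=0.5):
    """Compare a continuous suitability map with a binary atlas map.

    predicted: 2-D array of suitability scores in [0, 1] (e.g. MaxEnt output)
    observed:  2-D boolean array of atlas presence/absence
    Returns (sensitivity, specificity, overall accuracy).
    """
    pred = predicted >= threshold
    tp = np.sum(pred & observed)       # presences correctly predicted
    tn = np.sum(~pred & ~observed)     # absences correctly predicted
    fp = np.sum(pred & ~observed)
    fn = np.sum(~pred & observed)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / observed.size
    return sensitivity, specificity, accuracy
```

For a climatic generalist, high predicted suitability almost everywhere inflates sensitivity while specificity collapses, which is one way the errors described above can manifest.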

Relevance:

20.00%

Publisher:

Abstract:

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One issue not specifically addressed in the formalism occurs when the count rates reaching the detector are high enough to cause camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. During WB planar (sweep) imaging, however, the amount of imaged activity in the detector's field of view varies as a function of time, and the camera saturation is therefore time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for the relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed that takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity samarium-153 in an ellipsoid phantom. A complete set of parameters necessary for count-rate-to-activity conversion, including build-up and attenuation coefficients, was also obtained from unsaturated phantom data in order to convert corrected count rates to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life obtained from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that accounts for the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
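As an illustration of the kind of Newton-style inversion the abstract describes, the following sketch assumes the standard paralyzable dead-time model m = n·exp(−nτ) (observed rate m, true rate n, dead time τ); the paper's exact model and parameter values may differ, and all names here are mine:

```python
import math

def true_rate(m, tau, tol=1e-10, max_iter=100):
    """Invert the paralyzable dead-time model m = n * exp(-n * tau)
    for the true count rate n, given observed rate m, via Newton's method.
    Starting from n = m converges to the physical (low-rate) branch."""
    n = m
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m            # model residual
        df = (1.0 - n * tau) * math.exp(-n * tau) # derivative d f / d n
        step = f / df
        n -= step
        if abs(step) < tol * max(n, 1.0):
            return n
    raise RuntimeError("Newton iteration did not converge")

def correct_sweep(observed_rates, tau):
    """Time-dependent correction: invert the model at every time sample of a
    whole-body sweep, since the in-view activity varies with detector motion."""
    return [true_rate(m, tau) for m in observed_rates]
```

Applying the inversion per time sample is what makes the correction time dependent, matching the varying activity in the field of view during the sweep.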

Relevance:

20.00%

Publisher:

Abstract:

Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational resources. Grid enables access to these resources, but it does not guarantee any quality of service, nor does it provide performance isolation: one user's job can influence the performance of another user's job. A further problem is that Grid users typically belong to the scientific community, and their jobs require specific and customized software environments; providing the right environment for each user is difficult in Grid because of its dispersed and heterogeneous nature. Cloud computing, by contrast, provides full customization and control, but there is no job-submission procedure as simple as the one in Grid. Grid computing can provide customized resources and performance to the user through virtualization. A virtual machine can join the Grid as an execution node, or it can itself be submitted as a job with user jobs inside. The first method gives quality of service and performance isolation; the second additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, providing performance isolation together with customization and administration; the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request, and two scenarios are described to achieve this goal. In the first scenario, users submit their customized virtual machines as jobs; a virtual machine joins the Grid pool when it is powered on. In the second scenario, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server were used to deploy and test the scenarios. Condor supports virtual machine jobs, and scenario 1 is deployed using the Condor VM universe.
The second scenario uses the VMware VIX API to script the powering on and off of the remote virtual machines. The experimental results show that, because scenario 2 does not need to transfer the virtual machine image, the virtual machine comes live in the pool much faster. In scenario 1, the virtual machine runs as a Condor job, so it is easy to administer; the only pitfall of scenario 1 is the network traffic.
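A Condor VM-universe submit description for scenario 1 might look roughly like the following sketch; the directive names follow the Condor documentation of that era and should be verified against the installed version, and all paths and values are hypothetical:

```
# Sketch of a Condor VM universe submit file (illustrative values only)
universe   = vm
vm_type    = vmware
vm_memory  = 512
vmware_dir = /path/to/user_vm
vmware_should_transfer_files = True
log        = vm_job.log
queue
```

In the VM universe the virtual machine image itself is the job, which is what lets Condor manage its lifecycle like any other job in scenario 1.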

Relevance:

20.00%

Publisher:

Abstract:

The role of the Saccharomyces cerevisiae peroxisomal acyl-coenzyme A (acyl-CoA) thioesterase (Pte1p) in fatty acid beta-oxidation was studied by analyzing the in vitro kinetic activity of the purified protein as well as by measuring the carbon flux through the beta-oxidation cycle in vivo, using the synthesis of peroxisomal polyhydroxyalkanoate (PHA) from the polymerization of 3-hydroxyacyl-CoAs as a marker. The amount of PHA synthesized from the degradation of 10-cis-heptadecenoic, tridecanoic, undecanoic, or nonanoic acids was equivalent or slightly reduced in the pte1Delta strain compared with wild type. In contrast, a strong reduction in PHA synthesized from heptanoic acid and 8-methyl-nonanoic acid was observed for the pte1Delta strain compared with wild type. The poor catabolism of 8-methyl-nonanoic acid via beta-oxidation in pte1Delta negatively impacted the degradation of 10-cis-heptadecenoic acid and reduced the ability of the cells to grow efficiently in medium containing such fatty acids. An increase in the proportion of short-chain 3-hydroxyacid monomers was observed in PHA synthesized in pte1Delta cells grown on a variety of fatty acids, indicating a reduction in the metabolism of short-chain acyl-CoAs in these cells. A purified histidine-tagged Pte1p showed high activity toward short- and medium-chain-length acyl-CoAs, including butyryl-CoA, decanoyl-CoA and 8-methyl-nonanoyl-CoA. The kinetic parameters measured for the purified Pte1p are consistent with the involvement of this enzyme in the efficient metabolism of short straight- and branched-chain fatty acyl-CoAs by the beta-oxidation cycle.

Relevance:

20.00%

Publisher:

Abstract:

In this paper the iterative MSFV method is extended to include the sequential implicit simulation of time-dependent problems involving the solution of a system of pressure-saturation equations. To control numerical errors in the simulation results, an error estimate based on the residual of the MSFV approximate pressure field is introduced. In the initial time steps of the simulation, iterations are employed until a specified accuracy in pressure is achieved. This initial solution is then used to improve the localization assumption at later time steps. Additional iterations in the pressure solution are employed only when the pressure residual becomes larger than a specified threshold value. The efficiency of the strategy and the error-control criteria are investigated numerically. This paper also shows that it is possible to derive an a priori estimate and control, based on the allowed pressure-equation residual, that guarantees the desired accuracy in the saturation calculation.
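The residual-based iteration control described above is, at its core, a simple adaptive loop. The sketch below uses illustrative callables standing in for one MSFV smoothing cycle and the pressure-residual evaluation (these names are my assumptions, not the paper's operators):

```python
def adaptive_pressure_solve(p, saturation, one_iteration, residual_norm,
                            tol, max_iter=50):
    """Sequential-implicit step with residual-based iteration control.

    one_iteration(p, saturation) -> improved pressure field (one cycle)
    residual_norm(p, saturation) -> scalar residual of the pressure equation
    Iterations are spent only while the residual exceeds `tol`, mirroring
    the error-control strategy described in the abstract.
    """
    it = 0
    while residual_norm(p, saturation) > tol and it < max_iter:
        p = one_iteration(p, saturation)
        it += 1
    return p, it
```

The a priori control the paper derives amounts to choosing `tol` from the accuracy desired in the subsequent saturation solve.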

Relevance:

20.00%

Publisher:

Abstract:

Many cell surface glycoproteins are anchored in the lipid bilayer by a glycosylphosphatidyl-inositol (GPI) structure. Recently, a number of cell lines which are deficient in the biosynthesis and/or addition of this anchor have been described. In this report, we summarize the current knowledge on these lines and discuss their potential use to isolate the genes involved in the GPI anchor biosynthetic pathway with a specific emphasis on L cell fibroblasts.

Relevance:

20.00%

Publisher:

Abstract:

The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Ionization chambers are therefore calibrated in absorbed dose to water, which is also the quantity calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation at the level of the secondary-particle range, and the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in a low-density medium, D̄(w,Q,V1)^low, averaged over a scoring volume V1 for a geometry where V1 is filled with the low-density medium, to the absorbed dose to water, D̄(w,Q,V2)^low, averaged over a volume V2 for a geometry where V2 is filled with water. In the Monte Carlo simulations, D̄(w,Q,V2)^low is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, in accordance with the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behaviour of this correction factor as a function of depth in the phantom, photon beam energy, phantom density and field size.
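Written out explicitly (the overbar denotes a volume average; the notation is lightly normalized from the abstract), the proposed correction factor is the ratio

```latex
k \;=\;
\frac{\overline{D}^{\,\mathrm{low}}_{w,Q,V_1}}
     {\overline{D}^{\,\mathrm{low}}_{w,Q,V_2}} ,
```

where, as I read the abstract, the scoring volume \(V_1\) contains the low-density medium and \(V_2\) contains water in place of the chamber volume.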

Relevance:

20.00%

Publisher:

Abstract:

Summary: Forensic science - both as a source of and as a remedy for errors potentially leading to judicial error - was studied empirically in this research. The tools used were a comprehensive literature review, experimental tests on the influence of observational biases in fingermark comparison, and semi-structured interviews with heads of forensic science laboratories/units in Switzerland and abroad. The literature review covered, among other areas, the quality of forensic science work in general, the complex interaction between science and law, and specific propositions as to error sources not directly related to that interaction. A list of potential error sources all the way from the crime scene to the writing of the report was established as well. For the empirical tests, the ACE-V (Analysis, Comparison, Evaluation, and Verification) process of fingermark comparison was selected as an area of special interest for the study of observational biases, owing to its heavy reliance on visual observation and to recent cases of misidentification. Results of the tests performed with forensic science students suggest that the decision-making stages are the most vulnerable to stimuli inducing observational biases. In the semi-structured interviews, eleven senior forensic scientists answered questions on several subjects, for example on potential and existing error sources in their work, on the limitations of what can be done with forensic science, and on the possibilities and tools for minimising errors. Training and education to raise the quality of forensic science were discussed, together with possible solutions to minimise the risk of errors in forensic science. In addition, the length of time that samples of physical evidence are kept was determined. Results indicate considerable agreement on most subjects among the international participants. Their opinions on possible explanations for the occurrence of such problems, and on the relative weight of such errors in the three stages of crime scene, laboratory, and report writing, disagree, however, with opinions widely represented in the existing literature. Through the present research it was therefore possible to obtain a better view of the interaction between forensic science and judicial error and to propose practical recommendations to minimise its occurrence.

Relevance:

20.00%

Publisher:

Abstract:

Students in the first cycle of ESO (compulsory secondary education) face the difficulties of a new educational stage and the need to demonstrate their linguistic competence in drafting and correcting a variety of texts. Information and communication technologies can help develop these skills (and should be made to do so), and should act as a motivating element so that students feel the need to produce their own written work that is linguistically appropriate, coherent and well organised. It is also necessary that the assessment of these tasks helps them improve their written expression in Catalan, both grammatically and orthographically, by helping them establish systems of self-correction and error rectification. Public exposure of their work (in a medium such as the Internet) should reinforce the motivational aspect of writing, offering them new expectations of communication and encouraging them to take part in the virtual space opened up by this window on the world. My work is intended as an educational project to improve the written expression of first-year ESO students with the help of information and communication technology tools.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Mild neurocognitive disorders (MND) affect a subset of HIV+ patients under effective combination antiretroviral therapy (cART). In this study, we used an innovative multi-contrast magnetic resonance imaging (MRI) approach at high field to assess the presence of micro-structural brain alterations in MND+ patients. METHODS: We enrolled 17 MND+ and 19 MND- patients with undetectable HIV-1 RNA and 19 healthy controls (HC). MRI acquisitions at 3T included MP2RAGE for T1 relaxation times, Magnetization Transfer (MT), T2* and Susceptibility Weighted Imaging (SWI) to probe micro-structural integrity and iron deposition in the brain. Statistical analysis used permutation-based tests and correction for the family-wise error rate. Multiple regression analysis was performed between the MRI data and (i) neuropsychological results and (ii) HIV infection characteristics. A linear discriminant analysis (LDA) based on the MRI data was performed between MND+ and MND- patients and cross-validated with a leave-one-out test. RESULTS: Our data revealed loss of structural integrity and micro-oedema in MND+ patients compared to HC in the global white and cortical gray matter, as well as in the thalamus and basal ganglia. Multiple regression analysis showed a significant influence of sub-cortical nuclei alterations on the executive index of MND+ patients (p = 0.04 and R² = 95.2). The LDA distinguished MND+ and MND- patients with a classification quality of 73% after cross-validation. CONCLUSION: Our study shows micro-structural brain tissue alterations in MND+ patients under effective therapy and suggests that multi-contrast MRI at high field is a powerful approach to discriminate between HIV+ patients on cART with and without mild neurocognitive deficits.
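The LDA with leave-one-out cross-validation described in the Methods can be sketched generically. This illustrative Python version applies a two-class Fisher discriminant to synthetic feature vectors; the function name, the ridge term, and the midpoint threshold are all my assumptions, not the study's implementation:

```python
import numpy as np

def fisher_lda_loo(X, y):
    """Leave-one-out classification accuracy of a two-class Fisher LDA.

    X: (n_samples, n_features) array of features (e.g. imaging metrics)
    y: (n_samples,) array of 0/1 labels (e.g. MND- vs MND+)
    A small ridge term keeps the pooled within-class covariance invertible.
    """
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i                 # hold out sample i
        Xt, yt = X[mask], y[mask]
        mu0 = Xt[yt == 0].mean(axis=0)
        mu1 = Xt[yt == 1].mean(axis=0)
        centered = np.vstack([Xt[yt == 0] - mu0, Xt[yt == 1] - mu1])
        Sw = centered.T @ centered / len(yt)     # pooled within-class covariance
        Sw += 1e-6 * np.eye(X.shape[1])          # ridge regularization
        w = np.linalg.solve(Sw, mu1 - mu0)       # discriminant direction
        threshold = w @ (mu0 + mu1) / 2.0        # midpoint decision rule
        pred = int(w @ X[i] > threshold)
        correct += pred == y[i]
    return correct / n
```

Leave-one-out is a natural choice at these sample sizes (17 + 19 patients), since it uses all but one subject for training at each fold.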

Relevance:

20.00%

Publisher:

Abstract:

Schistosomes are endoparasites causing a serious human disease called schistosomiasis. Quantifying parasite genetic diversity is an essential component of understanding schistosomiasis epidemiology and disease transmission patterns. In this paper, we propose a novel, rapid, low-cost and efficient DNA extraction method for the egg, larval and adult stages of Schistosoma mansoni. One euro is enough to perform 60,000 DNA extraction reactions, and the protocol is fast (only 15 min of incubation and 5 handling steps).