965 results for diagnostic techniques and procedure
Abstract:
The In Situ Analysis System (ISAS) was developed to produce gridded fields of temperature and salinity that preserve as much as possible the time and space sampling capabilities of the Argo network of profiling floats. Since the first global re-analysis performed in 2009, the system has evolved, and a careful delayed-mode processing of the 2002-2012 dataset has been carried out using version 6 of ISAS, updating the statistics to produce the ISAS13 analysis. This latest version is now implemented as the operational analysis tool at the Coriolis data centre. The robustness of the results with respect to the system evolution is explored through global quantities of climatological interest: the Ocean Heat Content and the Steric Height. Estimates of errors consistent with the methodology are computed. This study shows that building reliable statistics on the fields is fundamental to improving the monthly estimates and to determining the absolute error bars. The new mean fields and variances deduced from the ISAS13 re-analysis and dataset show significant changes relative to the previous ISAS estimates, in particular in the Southern Ocean, justifying the iterative procedure. During the decade covered by Argo, the intermediate waters appear warmer and saltier in the North Atlantic and fresher in the Southern Ocean than in the WOA05 long-term mean. At the inter-annual scale, the impact of ENSO on the Ocean Heat Content and Steric Height is observed during the 2006-2007 and 2009-2010 events captured by the network.
Abstract:
Aims: To investigate the use of diffusion-weighted magnetic resonance imaging (DWI) and apparent diffusion coefficient (ADC) values in the diagnosis of hemangioma. Materials and methods: The study population consisted of 72 patients with liver masses larger than 1 cm (72 focal lesions). DWI examination with a b value of 600 s/mm² was carried out for all patients. After the DWI examination, an ADC map was created and ADC values were measured for the 72 liver masses and for normal liver tissue (control group). The average ADC values of normal liver tissue and focal liver lesions, the cut-off ADC values, and the diagnostic sensitivity and specificity of the ADC map in diagnosing hemangioma and benign and malignant lesions were investigated. Results: Of the 72 liver masses, 51 were benign and 21 were malignant. Benign lesions comprised 38 hemangiomas and 13 simple cysts. Malignant lesions comprised 9 hepatocellular carcinomas and 12 metastases. The highest ADC values were measured for cysts (3.782±0.53×10⁻³ mm²/s) and hemangiomas (2.705±0.63×10⁻³ mm²/s). The average ADC value of hemangiomas was significantly higher than that of malignant lesions and the normal control group (p<0.001). The average ADC value of cysts was significantly higher than that of hemangiomas and the normal control group (p<0.001). To distinguish hemangiomas from malignant liver lesions, a cut-off ADC value of 1.800×10⁻³ mm²/s had a sensitivity of 97.4% and a specificity of 90.9%. To distinguish hemangioma from normal liver parenchyma, a cut-off value of 1.858×10⁻³ mm²/s had a sensitivity of 97.4% and a specificity of 95.7%. To distinguish benign liver lesions from malignant liver lesions, a cut-off value of 1.800×10⁻³ mm²/s had a sensitivity of 96.1% and a specificity of 90.0%. Conclusion: DWI and quantitative measurement of ADC values can be used in the differential diagnosis of benign and malignant liver lesions and also in the diagnosis and differentiation of hemangiomas.
When dynamic examination cannot distinguish vascular metastases and other lesions from hemangioma, DWI and ADC values can be useful in the primary and differential diagnosis. The technique does not require contrast material, so it can safely be used in patients with renal failure.
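As an illustration of how a single ADC cut-off yields sensitivity/specificity figures like those reported above, the sketch below classifies a lesion as benign when its ADC is at or above the threshold and counts the outcomes. The ADC values and labels are hypothetical, not the study's data:

```python
# Minimal sketch: sensitivity/specificity of an ADC cut-off for "benign"
# (benign lesions show higher ADC than malignant ones).

def cutoff_metrics(adc_values, labels, cutoff):
    """Classify as benign when ADC >= cutoff; return (sensitivity, specificity)
    for detecting benign lesions."""
    tp = sum(1 for a, y in zip(adc_values, labels) if y == "benign" and a >= cutoff)
    fn = sum(1 for a, y in zip(adc_values, labels) if y == "benign" and a < cutoff)
    tn = sum(1 for a, y in zip(adc_values, labels) if y == "malignant" and a < cutoff)
    fp = sum(1 for a, y in zip(adc_values, labels) if y == "malignant" and a >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical ADC values in units of 1e-3 mm^2/s
adc = [2.7, 3.1, 2.4, 1.1, 0.9, 1.5]
lab = ["benign", "benign", "benign", "malignant", "malignant", "malignant"]
sens, spec = cutoff_metrics(adc, lab, cutoff=1.8)  # cut-off of 1.800e-3 mm^2/s
```

Sweeping the cut-off over the observed ADC range and recomputing these two numbers is what traces out the ROC curve from which an optimal threshold is chosen.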
Abstract:
The Ocean Model Intercomparison Project (OMIP) aims to provide a framework for evaluating, understanding, and improving the ocean and sea-ice components of global climate and earth system models contributing to the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses these aims in two complementary manners: (A) by providing an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing, and (B) by providing a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) offering details for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows that of the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, as well as the ocean/sea-ice OMIP simulations. The bulk of this paper offers scientific rationale for saving these diagnostics.
Abstract:
The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
Abstract:
Despite a low incidence in developed countries, gastrointestinal taeniasis should be suspected in patients with abdominal pain, diarrhea, anemia, and/or malabsorption of unknown origin, even more so if they come from endemic regions or areas with poor hygienic and alimentary habits. Diagnosis is traditionally reached by identifying the parasite in stools, but more recently both serological and immunological approaches are also available. Based on a patient diagnosed by gastroscopy, a literature review was undertaken of patients diagnosed by endoscopy. We discuss endoscopy as diagnostic modality, and the effectiveness and safety that endoscopic treatment may provide in view of the potential risk for neurocysticercosis.
Abstract:
Dissertation for the Integrated Master's Degree in Veterinary Medicine
Abstract:
The photochemistry of the pesticides triadimenol and triadimefon was studied on cellulose and beta-cyclodextrin (beta-CD) under controlled and natural conditions, using diffuse reflectance techniques and chromatographic analysis. The photochemistry of triadimenol occurs from the chlorophenoxyl moiety, while the photodegradation of triadimefon also involves the carbonyl group. The formation of the 4-chlorophenoxyl radical is one of the major reaction pathways for both pesticides and leads to 4-chlorophenol. Triadimenol also undergoes photooxidation and dechlorination, leading to triadimefon and dechlorinated triadimenol, respectively. The other main reaction process of triadimefon involves alpha-cleavage from the carbonyl group, leading to decarbonylated compounds. Triadimenol undergoes photodegradation at 254 nm but was found to be stable at 313 nm, while triadimefon degrades under both conditions. Both pesticides undergo photochemical decomposition under solar radiation, with the initial degradation rate per unit area of triadimefon one order of magnitude higher than that observed for triadimenol on both supports. The degradation rates of the pesticides were somewhat lower in beta-CD than on cellulose. The photoproduct distribution of triadimenol and triadimefon is similar for the different irradiation conditions, indicating an intramolecular energy transfer from the chlorophenoxyl moiety to the carbonyl group in the latter pesticide.
Development of new scenario decomposition techniques for linear and nonlinear stochastic programming
Abstract:
A classical approach for dealing with two- and multi-stage optimization problems under uncertainty is scenario analysis. To this end, the uncertainty in some of the problem data is modeled by random vectors with stage-specific finite supports. Each realization represents a scenario. Using scenarios, it is possible to study simpler versions (subproblems) of the original problem. As a scenario decomposition technique, the progressive hedging algorithm is one of the most popular methods for solving multi-stage stochastic programming problems. Despite its complete decomposition by scenario, the efficiency of the progressive hedging method is very sensitive to certain practical aspects, such as the choice of the penalty parameter and the handling of the quadratic term in the augmented Lagrangian objective function. For the choice of the penalty parameter, we examine some of the popular methods and propose a new adaptive strategy that aims to track the progress of the algorithm more closely. Numerical experiments on instances of multi-stage stochastic linear problems suggest that most existing techniques may either converge prematurely to a suboptimal solution or converge to the optimal solution, but at a very slow rate. In contrast, the new strategy appears robust and efficient: it converged to optimality in all our experiments and was the fastest in most cases. Regarding the handling of the quadratic term, we review existing techniques and propose the idea of replacing the quadratic term with a linear one. Although we have yet to test our method, our intuition is that it will alleviate some of the numerical and theoretical difficulties of the progressive hedging method.
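The progressive hedging iteration discussed above can be sketched on a toy two-stage problem with quadratic scenario costs f_s(x) = (x − d_s)². The scenario data, penalty value, and closed-form subproblem solution below are illustrative choices, not taken from the thesis:

```python
import numpy as np

# Toy progressive hedging (scenario decomposition) sketch.
# Augmented-Lagrangian subproblem for each scenario s:
#   min_x (x - d_s)^2 + w_s * x + (rho/2) * (x - xbar)^2
# whose first-order condition gives the closed form used in the loop.

d = np.array([1.0, 2.0, 4.0])   # scenario data (illustrative)
p = np.array([0.5, 0.3, 0.2])   # scenario probabilities
rho = 1.0                       # penalty parameter (the choice PH is sensitive to)

x = d.copy()                    # init: solve each scenario subproblem alone
xbar = p @ x                    # implementable (non-anticipative) solution
w = rho * (x - xbar)            # scenario multipliers

for _ in range(200):
    x = (2.0 * d - w + rho * xbar) / (2.0 + rho)  # per-scenario subproblems
    xbar = p @ x                                   # aggregation step
    w = w + rho * (x - xbar)                       # multiplier update
```

At convergence the scenario solutions agree with the aggregate, which here is the minimizer of the expected cost, Σ p_s d_s = 1.9. A too-small or too-large `rho` slows this agreement down considerably, which is the practical sensitivity the thesis targets with its adaptive penalty strategy.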
Abstract:
This report has two major objectives. First, it presents the results of an action research project conducted at my high school concerning the use of graphic organizers and their effects on students' written expression abilities. The findings from this action research project indicate that the use of graphic organizers can prove beneficial to students. The second major objective of this report is to provide a reflection on and evaluation of my experiences as a participant in the Michigan Teacher Excellence Program (MiTEP). This program provided middle and high school science teachers with an opportunity to develop research-based pedagogy techniques and the skills necessary to serve as leaders within the public school science community. The action research project described in the first chapter of this report was a collaborative project I participated in during my enrollment in ED 5705 at Michigan Technological University. I worked closely with two other teachers in my building, Brytt Ergang and James Wright. We met several times to develop a research question and a procedure for testing it. Each of us investigated how the use of graphic organizers by students in our classrooms might impact their performance on writing assessments. We each collected data from several of our classes; in my case, I collected data from 2 different classes over 2 different assignments. The data were collected and the results analyzed separately for each classroom. After the individual classroom data and corresponding analyses were compiled, my fellow collaborators and I met to discuss our findings and worked together to write a conclusion based on our combined results across all of our classes.
Abstract:
This review summarizes the available literature descriptions of neural mobilization (NM) techniques and neural provocation tests (NPT) for the lower limb (LL). Data were compiled in May 2016 using the MEDLINE database, Google Scholar, and the library of the European University of Madrid. After applying the inclusion/exclusion criteria, 5 books and 14 journal publications were found to be of interest and were used for data extraction. Results: the 8 different LL NM techniques identified are applied in a rhythmic, alternating oscillatory cycle, starting from an initial position from which the therapist moves the limb toward a final position. LL NPTs are useful tools for differential diagnosis and for selecting the proper LL NM procedure. There is no consensus on the repetition intervals or the amount of tensile load during NPT; nevertheless, it is normally performed at a rate of 2-4 seconds per complete movement cycle, for 1-5 minutes, 3-5 times a week. Although increasingly popular in clinical practice, LL NM treatment techniques are found to be sparsely described and lack proper standardization with regard to therapeutic dosage.
Abstract:
In Portugal, Veterinary Pathology is developing rapidly, and in recent years we have witnessed the emergence of private laboratories and the restructuring of universities, polytechnics, and public laboratories. The Portuguese Society of Animal Pathology, through its actions and its associates, has kept the discussion open among its peers in order to standardize the criteria for describing, classifying, and evaluating the cases that are the subject of our daily work. One of the latest challenges concerns the use of routine histochemical and immunohistochemical techniques, in an effort to establish standardized panels for tumour diagnosis, which could eventually reduce the cost of each analysis. For this purpose a simple survey was built, in which all collaborators answered questions about the markers used for carcinoma, sarcoma, and round cell tumour diagnosis, as well as general questions related to the subject. We obtained twenty-one answers, from public and private laboratories. In general, immunohistochemical and histochemical methods are used for diagnosis in most cases. Wide-spectrum cytokeratins are universally used to confirm carcinoma, and vimentin for sarcoma. The CD3 marker is used by all laboratories to identify T lymphocytes. For the diagnosis of B-cell lymphoma, the marker used is not consensual. Each laboratory has different markers for more specific situations, and only two laboratories perform PCR techniques for diagnosis. These data will be presented to promote an extended discussion, namely to reach a consensus where different markers are used.
Abstract:
This thesis deals with optimization techniques and modeling of vehicular networks. Using integer linear programming (ILP) models and heuristic ones, it was possible to study the performance of vehicular services in 5G networks. Through the Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) paradigms, it was possible to study the performance of different classes of service, such as the Ultra-Reliable Low-Latency Communications (URLLC) class and the enhanced Mobile BroadBand (eMBB) class, and how the functional split can have positive effects on network resource management. Two different protection techniques have been studied: Shared Path Protection (SPP) and Dedicated Path Protection (DPP). These protections make it possible to achieve different network reliability requirements, according to the needs of the end user. Moreover, a simulator developed in Python made it possible to study the dynamic allocation of resources in a 5G metro network. Through different provisioning algorithms and different dynamic resource management techniques, useful results have been obtained for understanding the needs of the vehicular networks that will exploit 5G. Finally, two models are shown for reconfiguring backup resources when shared resource protection is used.
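The difference between the two protection schemes mentioned above can be sketched on a toy topology: under DPP every demand reserves its full backup bandwidth, while under SPP backup capacity on a link is shared by demands whose working paths cannot fail at the same time (single-link failure model). The topology, demands, and link names below are hypothetical, not the thesis's models:

```python
from collections import defaultdict

# Each demand: (working-path links, backup-path links, bandwidth).
# The two working paths are link-disjoint, so their backups may share capacity.
demands = [
    ({"A-B"}, {"A-C", "C-B"}, 10),
    ({"A-D", "D-B"}, {"A-C", "C-B"}, 10),
]

dpp = defaultdict(int)                        # DPP: per-link dedicated backup capacity
spp = defaultdict(lambda: defaultdict(int))   # SPP: backup link -> working link -> demand

for working, backup, bw in demands:
    for link in backup:
        dpp[link] += bw                # dedicated: always reserve the full bandwidth
        for w in working:
            spp[link][w] += bw         # shared: capacity needed only if link w fails

dpp_total = sum(dpp.values())
# Under single-failure sharing, each backup link sizes for its worst-case failure.
spp_total = sum(max(per_failure.values()) for per_failure in spp.values())
```

Here DPP reserves 40 units of backup capacity while SPP needs only 20, at the cost of weaker guarantees under multiple simultaneous failures, which is the reliability trade-off the thesis explores.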
Abstract:
The aim of this doctoral project was to develop an alternative technique, relative to those reported in the literature, for the repair of perineal hernia in dogs. The technique resolves the defect with the aid of a polypropylene prosthetic mesh supporting the traditional perineal herniorrhaphy, performed with interrupted sutures of the muscles that form the pelvic diaphragm. This procedure is less invasive for the animal than the technique the literature defines as the gold standard, namely the transposition of the internal obturator muscle (Shaughnessy and Monnet, 2015), and than other transposition techniques such as the transposition of the semitendinosus muscle (Morello et al., 2015) and of the superficial gluteal muscle (Bellenger & Canfield, 2002), with faster post-operative recovery and simpler management by the owner, while maintaining excellent outcomes. The development of this procedure arises from the need for a technique that is simple to perform and closes the pelvic diaphragm well enough to prevent recurrence and minimize post-operative complications. The aim of the project was to combine traditional perineal herniorrhaphy with prosthetic meshes so as to reconstruct the pelvic diaphragm with the muscles intended for that task and, through the mesh, provide additional support to the rhaphy, avoiding muscle transpositions that would entail a more demolitive operation and slower patient recovery. Thirty dogs with perineal hernia, for a total of 50 hernias, treated with the proposed operation were included in the study. For each case, the signalment, history, diagnostic work-up, and subsequent post-operative follow-ups were collected, recording any complications or recurrence of the hernia.
Assessing brain connectivity through electroencephalographic signal processing and modeling analysis
Abstract:
Brain functioning relies on the interaction of several neural populations connected through complex connectivity networks, enabling the transmission and integration of information. Recent advances in neuroimaging techniques, such as electroencephalography (EEG), have deepened our understanding of the reciprocal roles played by brain regions during cognitive processes. The underlying idea of this PhD research is that EEG-related functional connectivity (FC) changes in the brain may incorporate important neuromarkers of behavior and cognition, as well as brain disorders, even at subclinical levels. However, a complete understanding of the reliability of the wide range of existing connectivity estimation techniques is still lacking. The first part of this work addresses this limitation by employing Neural Mass Models (NMMs), which simulate EEG activity and offer a unique tool to study interconnected networks of brain regions in controlled conditions. NMMs were employed to test FC estimators like Transfer Entropy and Granger Causality in linear and nonlinear conditions. Results revealed that connectivity estimates reflect information transmission between brain regions, a quantity that can be significantly different from the connectivity strength, and that Granger causality outperforms the other estimators. A second objective of this thesis was to assess brain connectivity and network changes on EEG data reconstructed at the cortical level. Functional brain connectivity has been estimated through Granger Causality, in both temporal and spectral domains, with the following goals: a) detect task-dependent functional connectivity network changes, focusing on internal-external attention competition and fear conditioning and reversal; b) identify resting-state network alterations in a subclinical population with high autistic traits. 
Connectivity-based neuromarkers, compared to the canonical EEG analysis, can provide deeper insights into brain mechanisms and may drive future diagnostic methods and therapeutic interventions. However, further methodological studies are required to fully understand the accuracy and information captured by FC estimates, especially concerning nonlinear phenomena.
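As a minimal illustration of the time-domain Granger causality estimation discussed in this abstract, the sketch below uses synthetic signals (not the thesis's EEG pipeline): x "Granger-causes" y when including x's past reduces the one-step prediction error of y beyond y's own past. The one-lag model and the log-variance-ratio index are simplifying assumptions:

```python
import numpy as np

# Synthetic coupled signals: x drives y with one sample of delay.
rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

def granger_index(src, dst):
    """One-lag Granger index ln(var_restricted / var_full) for predicting dst.
    Positive values mean src's past improves the prediction of dst."""
    Y = dst[1:]
    X_r = np.column_stack([np.ones_like(Y), dst[:-1]])            # own past only
    X_f = np.column_stack([np.ones_like(Y), dst[:-1], src[:-1]])  # plus src's past
    res_r = Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]
    res_f = Y - X_f @ np.linalg.lstsq(X_f, Y, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

gc_xy = granger_index(x, y)   # clearly positive: x drives y
gc_yx = granger_index(y, x)   # near zero: no influence in the reverse direction
```

The asymmetry between the two indices is what a directed FC estimator is expected to recover; as noted above, how faithfully such estimates track true coupling strength, especially under nonlinearity, is exactly what the NMM simulations were used to probe.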