936 results for Data Acquisition Methods.
Abstract:
Master's practicum report in Mathematics Teaching in the 3rd Cycle of Basic Education and in Secondary Education
Abstract:
Doctoral thesis in Environmental and Molecular Biology
Abstract:
Doctoral thesis in History - Specialty in the Early Modern Period
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialization in Medical Informatics)
Abstract:
Mechanical ventilation is an artificial way to help a patient breathe. The procedure is used to support patients with respiratory diseases; however, in many cases it can provoke lung damage, acute respiratory disease, or organ failure. With the goal of detecting possible patient breathing problems early, a set of limit values was defined for several variables monitored by the ventilator (average ventilation pressure, dynamic compliance, flow, peak, plateau and support pressures, positive end-expiratory pressure, respiratory rate) in order to create critical events. A critical event is verified when a patient's value stays above or below the normal range for a certain period of time. The limit values were defined after conducting a literature review and meetings with physicians specialized in the area. This work uses data streaming and intelligent agents to process the values collected in real time and classify them as critical or not. Real data provided by an intensive care unit were used to design and test the solution. The study made it possible to understand the importance of introducing critical events for mechanically ventilated patients. In some cases a value is considered critical (it can trigger an alarm), yet it is a single, instantaneous event with no clinical significance for the patient. The introduction of critical events, which combine a value range with a pre-defined duration, improves the decision-making process by decreasing the number of false positives and giving a better understanding of the patient's condition.
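As a rough illustration of the critical-event rule described above, the sketch below flags an event only when a monitored variable stays outside its normal range for a predefined duration, so instantaneous excursions do not fire. The variable name, limits, duration, and sampling interval are all assumed for illustration, not the study's actual configuration.

```python
# A minimal sketch of duration-based critical-event detection, assuming
# illustrative limits and variable names (not the study's real thresholds).
from dataclasses import dataclass

@dataclass
class Limit:
    low: float          # lower bound of the normal range
    high: float         # upper bound of the normal range
    duration_s: int     # seconds a value must stay out of range

# Hypothetical normal range for respiratory rate (breaths/min).
LIMITS = {"respiratory_rate": Limit(low=8.0, high=30.0, duration_s=60)}

def detect_critical_events(samples, variable, sample_interval_s=5):
    """Flag a critical event when `variable` stays outside its normal
    range for at least the configured duration; shorter excursions are
    single instantaneous events with no clinical significance."""
    limit = LIMITS[variable]
    needed = limit.duration_s // sample_interval_s  # consecutive out-of-range samples
    events, run = [], 0
    for i, value in enumerate(samples):
        if value < limit.low or value > limit.high:
            run += 1
            if run == needed:                 # duration threshold just reached
                events.append(i - needed + 1) # index where the excursion began
        else:
            run = 0
    return events

# Example: a 70 s excursion above 30 breaths/min triggers exactly one event.
readings = [18]*10 + [34]*14 + [18]*10
print(detect_critical_events(readings, "respiratory_rate"))  # [10]
```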
Abstract:
The main objective of this thesis was to produce a detailed report on flooding with specific reference to the Clare River catchment. Past flooding in the catchment was assessed with specific reference to the November 2009 flood event, and a Geographic Information System was used to produce a graphical representation of its spatial distribution. Flood risk is prominent within the Clare River catchment, especially in the region of Claregalway. The November 2009 events produced significant fluvial flooding from the Clare River, resulting in considerable damage to property; there were also hidden costs, such as the economic impact of the closure of the N17 until the floodwater subsided. Land use and channel conditions are traditional factors long recognised for their effect on flooding processes; these were examined in the context of the Clare River catchment to determine whether they had any significant effect on flood flows. Climate change has become recognised as a factor that may produce more significant and frequent flood events in the future. Many experts expect climate change to increase the intensity and duration of rainfall in western Ireland, which would have significant implications for the Clare River catchment, already vulnerable to flooding. Flood estimation techniques are a key aspect of understanding and preparing for flood events. This study uses methods based on the statistical analysis of recorded data, and methods based on a design rainstorm and a rainfall-runoff model, to estimate flood flows. These provide a mathematical basis for evaluating the impacts of various factors on flooding and for generating practical design floods, which can be used in the design of flood relief measures. The final element of the thesis comprises the author's recommendations on how flood risk management techniques can reduce existing flood risk in the Clare River catchment. Future implications for flood risk from factors such as climate change and poor planning practices are also considered.
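To make the statistical branch of flood estimation concrete, the sketch below fits a Gumbel (EV1) distribution to a series of annual maximum flows and derives design floods for several return periods. The flow series is invented for illustration, not Clare River gauge data, and the thesis itself may use different distributions or methods.

```python
# A minimal sketch of flood frequency analysis on annual maxima, assuming a
# Gumbel (EV1) distribution; flow values are hypothetical, in m^3/s.
import numpy as np
from scipy import stats

annual_max_flows = np.array([82, 95, 110, 77, 130, 105, 98, 88,
                             142, 120, 91, 101, 115, 86, 99])

# Fit location and scale of the Gumbel distribution to the record.
loc, scale = stats.gumbel_r.fit(annual_max_flows)

# Design flood: the flow with annual exceedance probability 1/T.
for T in (10, 50, 100):
    q_design = stats.gumbel_r.ppf(1 - 1/T, loc=loc, scale=scale)
    print(f"{T}-year flood: {q_design:.1f} m^3/s")
```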
Abstract:
Description of a costing model developed by a digital production librarian to determine the cost of putting an item into the Claremont Colleges Digital Library at the Claremont University Consortium. The case study includes variables such as material types and funding sources, data collection methods, and formulas and calculations for analysis. The model is useful for grant applications, cost allocations, and budgeting for digital project coordinators and digital library projects.
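A minimal sketch of a per-item costing formula of the kind the case study describes is shown below; the hourly rate, overhead factor, material types, and time estimates are placeholder assumptions, not the Claremont figures.

```python
# A minimal sketch of a per-item digitization costing model; all rates and
# time estimates are assumed placeholders, not the case study's values.
HOURLY_RATE = 25.0      # staff cost per hour (assumed)
OVERHEAD_FACTOR = 1.3   # indirect costs such as storage and QA (assumed)

# Minutes of labor per unit, by material type (assumed values).
MINUTES_PER_UNIT = {"photograph": 12, "text_page": 4, "audio_minute": 6}

def item_cost(material_type, units=1):
    """Estimate the cost to put one item into the digital library:
    labor time x hourly rate, scaled by an overhead factor."""
    minutes = MINUTES_PER_UNIT[material_type] * units
    return (minutes / 60) * HOURLY_RATE * OVERHEAD_FACTOR

# Example: a 30-page text item.
print(f"${item_cost('text_page', units=30):.2f}")
```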
Abstract:
Study carried out during a stay at Imperial College London, Great Britain, between September and December 2006. Having a good, well-defined geometry is essential for efficiently solving many computational models and obtaining results comparable to the real problem. Medical image reconstruction makes it possible to transform images obtained with capture techniques into geometries in numerical data formats. This text explains qualitatively the stages that make up the medical image reconstruction process, up to finally obtaining a triangular mesh that can be processed by computational algorithms. The process starts at the MRI scanner of The Royal Brompton Hospital in London, from which images are obtained and then processed with the CONGEN10 and SURFGEN tools for a MATLAB environment. These tools were developed by researchers of the Bioflow group of the Department of Aeronautical Engineering at Imperial College London, and the last section of the text discusses an example of an artery that enters as a medical image and comes out as a triangular mesh that can be processed with any software or algorithm that works with meshes.
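The pipeline above relies on the MATLAB tools CONGEN10 and SURFGEN; as an analogous open-source sketch of the final step, the code below extracts a triangular surface mesh from a segmented 3D volume with marching cubes (scikit-image), using a synthetic sphere in place of a segmented MRI artery.

```python
# A minimal sketch of image-to-mesh extraction via marching cubes; the
# synthetic sphere stands in for a segmented MRI volume (an assumption,
# not the CONGEN10/SURFGEN pipeline itself).
import numpy as np
from skimage import measure

# Synthetic binary volume: a sphere as a stand-in for a segmented artery.
x, y, z = np.mgrid[-32:32, -32:32, -32:32]
volume = (x**2 + y**2 + z**2) < 20**2

# Marching cubes returns vertices and triangular faces of the isosurface,
# ready for any mesh-based solver or viewer.
verts, faces, normals, values = measure.marching_cubes(volume.astype(float),
                                                       level=0.5)
print(verts.shape, faces.shape)  # (n_vertices, 3), (n_triangles, 3)
```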
Abstract:
Big sports events like the 2008 European Football Championship are a challenge for anti-doping activities, particularly when the sports event is hosted by two different countries and there are two laboratories accredited by the World Anti-Doping Agency. This challenges the logistics of sample collection as well as the chemical analyses, which must be carried out in a timely manner. The following paper discusses the handling of whereabouts information for each athlete and the therapeutic use exemption system, experiences in sample collection and transportation of blood and urine samples, and the results of the chemical analysis in two different accredited laboratories. An overview of the analytical results of blood profiling and growth hormone testing in comparison with the distribution of the normal population is also presented.
Abstract:
PURPOSE: The Cancer Vaccine Consortium of the Cancer Research Institute (CVC-CRI) conducted a multicenter HLA-peptide multimer proficiency panel (MPP) with a group of 27 laboratories to assess the performance of the assay. EXPERIMENTAL DESIGN: Participants used commercially available HLA-peptide multimers and a well-characterized common source of peripheral blood mononuclear cells (PBMC). The frequency of CD8+ T cells specific for two HLA-A2-restricted model antigens was measured by flow cytometry. The panel design allowed participants to use their preferred staining reagents and locally established protocols for cell labeling, data acquisition, and analysis. RESULTS: We observed significant differences across laboratories in both the performance characteristics of the assay and the reported frequencies of specific T cells. These results emphasize the need to identify the critical variables behind the observed variability to allow harmonization of the technique across institutions. CONCLUSIONS: Three key recommendations emerged that would likely reduce assay variability and thus move toward harmonization of this assay: (1) use more than two colors for the staining, (2) collect at least 100,000 CD8 T cells, and (3) use a background control sample to appropriately set the analytical gates. We also provide more insight into the limitations of the assay and identify additional protocol steps that potentially impact the quality of the data generated and should therefore serve as primary targets for systematic analysis in future panels. Finally, we propose initial guidelines for harmonizing assay performance, including the introduction of standard operating protocols to allow adequate training of technical staff and auditing of test analysis procedures.
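A minimal sketch of recommendations (2) and (3) is given below: the frequency of multimer-positive CD8 T cells is reported only when at least 100,000 CD8 T cells were acquired, after subtracting the rate observed in a background control sample. The event counts are invented for illustration.

```python
# A minimal sketch of background-corrected specific T-cell frequency,
# following recommendations (2) and (3); event counts are invented.
def specific_frequency(multimer_pos, cd8_total, background_pos, background_total):
    """Frequency of antigen-specific CD8 T cells after subtracting the
    rate seen in a background control used to set the analytical gates."""
    if cd8_total < 100_000:
        # Recommendation (2): collect at least 100,000 CD8 T cells.
        raise ValueError("fewer than 100,000 CD8 T cells acquired")
    raw = multimer_pos / cd8_total
    background = background_pos / background_total
    return max(raw - background, 0.0)

# Example: 420 multimer+ events among 150,000 CD8 T cells, with a
# background control showing 30 events among 150,000.
print(f"{specific_frequency(420, 150_000, 30, 150_000):.4%}")  # ~0.26%
```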
Abstract:
OBJECTIVE: To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. DESIGN: Cohort of protocols of randomised controlled trials and subsequent full journal publications. SETTING: Six research ethics committees in Switzerland, Germany, and Canada. DATA SOURCES: 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003, and 515 subsequent full journal publications. RESULTS: Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials planned subgroup analyses more often than investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, the authors stated that the subgroup analyses were prespecified, but this was not supported by 28 (34.6%) of the corresponding protocols. In 86 publications the authors claimed a subgroup effect, but only 36 (41.9%) of the corresponding protocols reported a planned subgroup analysis. CONCLUSIONS: Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than a third of statements about subgroup prespecification in publications of randomised controlled trials had no documentation in the corresponding protocols. Definitive judgments regarding the credibility of claimed subgroup effects are not possible without access to the protocols and analysis plans of randomised controlled trials.
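For concreteness, the sketch below shows the kind of prespecified interaction test that the protocols rarely planned: a logistic regression with a treatment-by-subgroup interaction term, run here on simulated trial data (the data and effect sizes are illustrative only, not from the cohort studied).

```python
# A minimal sketch of a prespecified test for treatment-by-subgroup
# interaction in a two-arm trial; the simulated data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 800
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),   # randomised arm
    "subgroup": rng.integers(0, 2, n),    # e.g. sex or age stratum
})
# Binary outcome with a modest treatment effect and no true interaction.
logit = -0.5 + 0.4 * df["treatment"] + 0.1 * df["subgroup"]
df["outcome"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("outcome ~ treatment * subgroup", data=df).fit(disp=0)
# The interaction term's p-value is the planned test of a subgroup effect.
print(model.pvalues["treatment:subgroup"])
```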
Abstract:
Tobacco consumption is a global epidemic responsible for a vast burden of disease. With pharmacological properties sought after by consumers and responsible for addiction issues, nicotine is the main driver of this phenomenon. Accordingly, smokeless tobacco products are growing in popularity in sport, owing to potential performance-enhancing properties and the absence of adverse effects on the respiratory system. Nevertheless, nicotine does not appear on the 2011 World Anti-Doping Agency (WADA) Prohibited List or Monitoring Program, for lack of a comprehensive large-scale prevalence survey. This work therefore describes a one-year monitoring study of urine specimens from professional athletes of different disciplines covering 2010 and 2011. A method for the detection and quantification of nicotine, its major metabolites (cotinine, trans-3-hydroxycotinine, nicotine-N'-oxide and cotinine-N-oxide) and minor tobacco alkaloids (anabasine, anatabine and nornicotine) was developed, relying on ultra-high-pressure liquid chromatography coupled to triple quadrupole mass spectrometry (UHPLC-TQ-MS/MS). A simple and fast dilute-and-shoot sample treatment was performed, followed by hydrophilic interaction chromatography-tandem mass spectrometry (HILIC-MS/MS) operated in positive electrospray ionization (ESI) mode with multiple reaction monitoring (MRM) data acquisition. After method validation, assessing the prevalence of nicotine consumption in sport involved the analysis of 2185 urine samples, covering 43 different sports. Concentration distributions of major nicotine metabolites, minor nicotine metabolites and tobacco alkaloids ranged from 10 ng/mL (LLOQ) to 32,223, 6670 and 538 ng/mL, respectively. Compounds of interest were detected at trace levels in 23.0% of urine specimens, with concentration levels corresponding to an exposure within the last three days for 18.3% of samples. Likewise, hypothesizing conservative concentration limits for active nicotine consumption prior to and/or during sport practice (50 ng/mL for nicotine, cotinine and trans-3-hydroxycotinine; 25 ng/mL for nicotine-N'-oxide, cotinine-N-oxide, anabasine, anatabine and nornicotine) revealed a prevalence of 15.3% among athletes. While this number may appear lower than the worldwide smoking prevalence of around 25%, focusing the study on selected sports highlighted more alarming findings. Indeed, active nicotine consumption in ice hockey, skiing, biathlon, bobsleigh, skating, football, basketball, volleyball, rugby, American football, wrestling and gymnastics was found to range between 19.0% and 55.6%. Therefore, considering the adverse effects of smoking on the respiratory tract and the numerous health threats detrimental to sport practice at the top level, the likelihood of smokeless tobacco consumption for performance enhancement is strongly supported.
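A minimal sketch of the prevalence rule stated above follows: a urine sample counts as active nicotine consumption if any analyte reaches its threshold (50 ng/mL for nicotine, cotinine and trans-3-hydroxycotinine; 25 ng/mL for the remaining compounds). The sample values are invented for illustration.

```python
# A minimal sketch of the paper's threshold rule for active consumption;
# the two sample records below are invented, not study data.
THRESHOLDS_NG_ML = {
    "nicotine": 50, "cotinine": 50, "trans-3-hydroxycotinine": 50,
    "nicotine-N'-oxide": 25, "cotinine-N-oxide": 25,
    "anabasine": 25, "anatabine": 25, "nornicotine": 25,
}

def is_active_consumption(sample):
    """True if any measured analyte is at or above its threshold."""
    return any(conc >= THRESHOLDS_NG_ML[analyte]
               for analyte, conc in sample.items())

samples = [
    {"cotinine": 310.0, "trans-3-hydroxycotinine": 120.0},  # active-use profile
    {"cotinine": 4.0},                                      # trace exposure
]
prevalence = sum(map(is_active_consumption, samples)) / len(samples)
print(f"prevalence: {prevalence:.1%}")  # 50.0%
```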
Abstract:
The project "Computer-Assisted Laboratory Using Conventional Office Tools" was carried out at the Faculty of Physics of the Universitat de Barcelona during 2007 and 2008 (a two-year project). The main objective of this project is to demonstrate that the most common office software tools can be used for computer-assisted laboratory (LAO) experiments. In particular, it proposes using Excel together with its macros (Visual Basic for Applications, VBA) in laboratory practicals of courses in the Applied Physics area. Excel is a spreadsheet well known and widely used by both teachers and students. In this work we show concrete examples covering the different control and data acquisition techniques: programming of the serial (RS-232) and parallel ports, and the GPIB interface. These techniques are implemented through Excel VBA macros. The rest of the LAO application programming, the graphical representation and the data processing, is done very simply through the usual handling of a spreadsheet. Carrying out the project has demonstrated the convenience of this methodology. At present, practically all the LAO practicals for which the Department of Applied Physics is responsible use spreadsheet-based programming. The response of the students has been very positive. The combination of this tool's features with VBA programming has enormous potential and probably represents a simple way of introducing both students and teachers to the world of programming.
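The project implements acquisition in Excel VBA macros; as an analogous sketch in Python rather than VBA, the code below reads newline-terminated readings from an instrument over RS-232 with pyserial. The port name, baud rate, and message framing are assumptions.

```python
# A minimal sketch of RS-232 data acquisition with pyserial, analogous to
# the project's VBA serial-port macros; port and framing are assumed.
import serial  # pip install pyserial

def acquire(port="/dev/ttyUSB0", baudrate=9600, n_readings=10):
    """Read newline-terminated numeric readings from an instrument."""
    readings = []
    with serial.Serial(port, baudrate=baudrate, timeout=2) as link:
        for _ in range(n_readings):
            line = link.readline().decode("ascii", errors="ignore").strip()
            if line:
                readings.append(float(line))
    return readings

if __name__ == "__main__":
    print(acquire())
```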
Multimodel inference and multimodel averaging in empirical modeling of occupational exposure levels.
Abstract:
Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables that influence exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors, and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data, and it permits evaluating, to some extent, the model selection uncertainty that is seldom mentioned in current practice.
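The core computation, turning AIC values for the candidate model set into Akaike weights and a model-averaged prediction, can be sketched as below; the AIC values and per-model predictions are illustrative placeholders, not the paper's results.

```python
# A minimal sketch of Burnham-Anderson Akaike weights and multimodel
# averaging; the AIC values and predictions are invented for illustration.
import numpy as np

def akaike_weights(aic):
    """Akaike weight of each model: the probability that it is the best
    approximating model in the candidate set, given the data."""
    aic = np.asarray(aic, dtype=float)
    delta = aic - aic.min()          # AIC differences from the best model
    rel = np.exp(-0.5 * delta)       # relative likelihoods of the models
    return rel / rel.sum()           # normalize to weights summing to 1

aics = [1012.4, 1013.1, 1017.8]              # hypothetical candidate models
weights = akaike_weights(aics)
predictions = np.array([2.10, 2.25, 1.90])   # each model's predicted log exposure
print(weights.round(3), float(weights @ predictions))  # model-averaged prediction
```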
Abstract:
Automated systems requiring stability or motion control can be found in more and more fields. UAV and global positioning applications are the most common for this type of system, since they need very precise motion control. To carry out this process, inertial measurement units (IMUs) are used: through properly positioned accelerometers and gyroscopes, plus correction of the error the latter may introduce, they provide an acceleration and an angular velocity from which the path travelled by these units can be extracted. The IMU, combined with a GPS by means of a Kalman filter, provides greater accuracy, as well as a starting point (provided by the GPS), a route that can be plotted on a map and, if the GPS signal is lost, the ability to keep acquiring data from the IMU. These data can be collected and processed by an FPGA, which in turn can be synchronized with a PDA so that the user can see the motion of the system represented. This work focuses on the operation of the IMU and on data acquisition with the FPGA. It also introduces the Kalman filter for correcting sensor error.
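A minimal one-dimensional sketch of the IMU/GPS fusion described above follows: the filter predicts position and velocity from IMU acceleration and corrects with GPS fixes when available, dead-reckoning on the IMU alone when the GPS signal drops. The motion model and noise levels are assumptions, not the project's parameters.

```python
# A minimal 1D Kalman filter fusing IMU acceleration (predict) with GPS
# position fixes (correct); noise levels and motion model are assumed.
import numpy as np

dt = 0.1                                 # IMU sample period (s)
F = np.array([[1, dt], [0, 1]])          # state transition for [position, velocity]
B = np.array([0.5 * dt**2, dt])          # control input: acceleration
H = np.array([[1.0, 0.0]])               # GPS measures position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[4.0]])                    # GPS noise, ~2 m std (assumed)

x = np.zeros(2)                          # initial state at the first GPS fix
P = np.eye(2)

def step(x, P, accel, gps_pos=None):
    # Predict from the IMU acceleration (dead reckoning).
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Correct only when a GPS fix is available.
    if gps_pos is not None:
        y = gps_pos - H @ x                    # innovation
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# One second of constant 1 m/s^2 acceleration, with a GPS fix every 5th sample.
for k in range(10):
    true_pos = 0.5 * (k + 1)**2 * dt**2
    gps = true_pos + np.random.normal(0, 2) if k % 5 == 4 else None
    x, P = step(x, P, accel=1.0, gps_pos=gps)
print(x)  # estimated [position, velocity]
```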