945 results for "Lead Analysis Data processing"
Abstract:
This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis.
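The APEN computation described above follows the standard Approximate Entropy definition. The sketch below is an illustrative NumPy implementation (not the authors' code), using the abstract's parameter choices of window size 4 and a tolerance of 0.2 times the series standard deviation; the normalization by total drawing completion time mentioned in the abstract would be applied to the returned value afterwards.

```python
import numpy as np

def approximate_entropy(x, m=4, r_factor=0.2):
    """Approximate Entropy (ApEn) of a 1-D time series.

    m        : embedding window size (4 in the abstract)
    r_factor : similarity threshold as a fraction of the series SD (0.2)
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(m):
        n = len(x) - m + 1
        # embed the series into overlapping windows of length m
        windows = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of windows
        dists = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        # fraction of windows within tolerance r of each window (self-matches included)
        c = np.mean(dists <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

A perfectly regular series yields an ApEn near zero, while an irregular one yields a larger value, which is what makes the score a measure of timing variability.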
The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment. The score generated by this method is henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity. The score generated by this method is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between the HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55). CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced level. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD.
The APEN score was correlated with neither of the other two methods, indicating that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. The APEN score also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
Abstract:
Allergic asthma represents an important public health issue, most common in the paediatric population, characterized by airway inflammation that may lead to changes in the volatiles secreted via the lungs. Exhaled breath is therefore a potential matrix with relevant metabolomic information to characterize this disease. Progress in biochemistry, health sciences and related areas depends on instrumental advances, and a high-throughput, sensitive technique such as comprehensive two-dimensional gas chromatography–time of flight mass spectrometry (GC × GC–ToFMS) was considered. The application of GC × GC–ToFMS to the analysis of the exhaled breath of 32 children with allergic asthma, of whom 10 also had allergic rhinitis, and 27 control children allowed the identification of several hundred compounds belonging to different chemical families. Multivariate analysis, using Partial Least Squares-Discriminant Analysis in tandem with Monte Carlo Cross Validation, was performed to assess the predictive power and to help the interpretation of recovered compounds possibly linked to oxidative stress, inflammation or other cellular processes that may characterize asthma. The results suggest that the model is robust, considering the high classification rate, sensitivity, and specificity. A pattern of six alkanes characterized the asthmatic population: nonane, 2,2,4,6,6-pentamethylheptane, decane, 3,6-dimethyldecane, dodecane, and tetradecane. To explore future clinical applications, and considering the future role of molecular-based methodologies, a compound set was established to give rapid access to the information in exhaled breath, reducing the time of data processing and thus making the method more expeditious for clinical purposes.
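The validation scheme described above, Monte Carlo cross-validation scored by classification rate, sensitivity and specificity, can be sketched as below. To keep the example self-contained, a simple nearest-centroid classifier stands in for the PLS-DA model; the repeated random splitting and the three metrics are the point of the sketch, and all data and names are illustrative.

```python
import numpy as np

def mccv_metrics(X, y, n_splits=100, test_frac=0.3, rng=None):
    """Monte Carlo cross-validation of a nearest-centroid classifier.

    Stand-in for the PLS-DA model in the abstract. y is a 0/1 label
    vector (1 = asthmatic in the abstract's setting). Returns mean
    classification rate, sensitivity and specificity over random splits.
    """
    rng = np.random.default_rng(rng)
    n = len(y)
    acc, sens, spec = [], [], []
    for _ in range(n_splits):
        idx = rng.permutation(n)
        n_test = int(test_frac * n)
        test, train = idx[:n_test], idx[n_test:]
        # class centroids estimated from the training split only
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[test] - c0, axis=1)
        d1 = np.linalg.norm(X[test] - c1, axis=1)
        pred = (d1 < d0).astype(int)
        truth = y[test]
        acc.append(np.mean(pred == truth))
        sens.append(np.mean(pred[truth == 1] == 1) if (truth == 1).any() else np.nan)
        spec.append(np.mean(pred[truth == 0] == 0) if (truth == 0).any() else np.nan)
    return np.nanmean(acc), np.nanmean(sens), np.nanmean(spec)
```

Averaging over many random splits, rather than a single hold-out, is what gives Monte Carlo cross-validation its robustness with small clinical cohorts.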
Abstract:
Hypertensive syndromes in pregnancy (HSP) are among the major complications of the pregnancy and postpartum period and can lead to premature birth and subsequent hospitalization of the newborn in the Neonatal Intensive Care Unit (NICU). This study aimed to analyze the perceptions, meanings and feelings of mothers regarding hypertensive syndromes in pregnancy and premature obstetric labor. The research was qualitative and adopted the Social Representations Theory (SRT), in the Central Nucleus Theory approach, as its theoretical-methodological framework. The study included 70 women with a mean age of 29 years, predominantly educated to high-school level, most of them married or in a consensual union, primiparous, with a prevalence of cesarean delivery occurring between 32 and 37 weeks of pregnancy. The data were collected from May to December 2008 at the Maternity School Januário Cicco in Natal through the following instruments: a questionnaire including questions about socio-demographic status; the Free Words Association Test (FWAT) and verbalized mental image construction, using three stimuli (pregnancy with high blood pressure, preterm birth, and the NICU); and an interview with the following guiding question: what did it mean for you to have a pregnancy with high blood pressure and, consequently, the birth of a premature baby? Data analysis was performed using a multi-method approach, processing the data with EVOC (Ensemble de Programmes Permettant l'Analyse des Évocations) and ALCESTE (Analyse Lexicale par Contexte d'un Ensemble de Segments de Texte) and thematic analysis in categories. The results are presented in four thematic units under the following representative universes: HSP, prematurity as a result of HSP, the NICU, and the social representations of mothers on the hypertensive disorder of pregnancy followed by premature birth and hospitalization of the child in the NICU.
The results obtained by the multi-method analyses showed similar constructions and point to death as the central nucleus, with negative aspects, coping strategies, need of care, knowledge about the disease, fragility and meanings of the NICU as peripheral elements. It is considered that the perceptions, meanings and feelings of puerperal women in relation to HSP and to premature delivery constitute a negative social representation, with representational elements that may have influenced the adverse effects of the disease and its consequences. We suggest acting on the peripheral elements of this representation, with adequate orientation, early diagnosis, effective conduct, a receptive attitude on the part of the team, health promotion measures and effective public policies, in order to improve the care provided to puerperal women, making them feel welcomed and minimizing their suffering.
Abstract:
The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10³⁴ cm⁻² s⁻¹ (10²⁷ cm⁻² s⁻¹). At the core of the CMS detector sits a high-magnetic-field, large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron flux-return yoke is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12500 t.
Abstract:
Cassava starch has been shown to make transparent and colorless flexible films without any previous chemical treatment. The functional properties of edible films are influenced by starch properties, including chain conformation, molecular bonding, crystallinity, and water content. Fourier-transform infrared (FTIR) spectroscopy in combination with attenuated total reflectance (ATR) has been applied to the elucidation of the structure and conformation of carbohydrates. This technique, associated with chemometric data processing, can indicate the relationship between the structural parameters and the functional properties of cassava starch-based edible films. Successful prediction of the functional property values of the starch-based films was achieved by partial least squares regression of the data. The results showed that the presence of the hydroxyl group on carbon 6 of the cyclic part of glucose is directly correlated with the functional properties of cassava starch films.
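The partial least squares regression used above to predict film properties from ATR-FTIR spectra can be sketched as a minimal single-response PLS1 (NIPALS with deflation). This is an illustrative implementation under simplified assumptions, not the chemometric software actually used in the study:

```python
import numpy as np

def pls1_fit_predict(X, y, X_new, n_comp=2):
    """Minimal PLS1 regression: relates spectral predictors X
    (rows = samples, columns = e.g. ATR-FTIR absorbances) to a
    functional property y and predicts it for new samples X_new."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)        # weight vector
        t = Xc @ w                    # scores
        tt = t @ t
        p = Xc.T @ t / tt             # X loadings
        q = (yc @ t) / tt             # y loading
        Xc = Xc - np.outer(t, p)      # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.inv(P.T @ W) @ q   # coefficients in original X space
    return (X_new - x_mean) @ B + y_mean
```

With as many components as predictors, PLS1 coincides with ordinary least squares; in chemometric practice far fewer components are retained to regularize collinear spectral data.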
Abstract:
The grinding process is usually the last finishing process of a precision component in the manufacturing industries. This process is used for manufacturing parts of different materials, so it demands results such as low roughness, dimensional and shape error control, and optimum tool life, with minimum cost and time. Damage to the parts is very expensive, since the previous processes and the grinding itself are wasted when the part is damaged at this stage. This work aims to investigate the efficiency of digital signal processing tools applied to acoustic emission signals in order to detect thermal damage in the grinding process. To accomplish this goal, an experimental campaign of 15 runs was carried out on a surface grinding machine operating with an aluminum oxide grinding wheel and ABNT 1045 and VC131 steels. The acoustic emission signals were acquired from a fixed sensor placed on the workpiece holder. A high-sampling-rate acquisition system, at 2.5 MHz, was used to collect the raw acoustic emission instead of the root mean square value usually employed. In each test the AE data were analyzed off-line, with the results compared to inspection of each workpiece for burn and other metallurgical anomalies. A number of statistical signal processing tools have been evaluated.
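As an illustration of the kind of statistics computed from raw AE records, the sketch below derives a few features (RMS, kurtosis, crest factor) commonly screened for grinding-burn detection. The abstract does not specify which statistics were evaluated, so this feature set is an assumption:

```python
import numpy as np

def ae_features(signal):
    """Simple statistics of a raw acoustic-emission record
    (illustrative feature set for burn detection)."""
    x = np.asarray(signal, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))                       # signal energy level
    std = x.std()
    kurtosis = np.mean((x - x.mean()) ** 4) / std ** 4   # peakedness (Pearson)
    crest = np.max(np.abs(x)) / rms                      # peak-to-RMS ratio
    return {"rms": rms, "kurtosis": kurtosis, "crest": crest}
```

Burn events tend to raise the AE energy and produce burst-like transients, which is why amplitude and shape statistics like these are natural candidates.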
Abstract:
Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, like a large cloud-dependent offset occurring at Northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with the prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement over the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although the quantitative impact on SGP 3.0 vertical columns is not significant.
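The slant columns, air mass factors and vertical columns examined in the study are linked by a single basic relation; the sketch below states it in code form. This is the generic DOAS conversion, not the actual SGP 3.0 implementation:

```python
def vertical_column_density(scd, amf):
    """Convert a DOAS slant column density (SCD, molecules/cm^2) into a
    vertical column density via the air mass factor: VCD = SCD / AMF.
    Generic relation only; the operational processors add calibration,
    cloud and stratospheric corrections on top of this."""
    return scd / amf
```

Because the AMF enters as a divisor, any unexplained feature in the air mass factor propagates directly into the retrieved vertical column, which is why both quantities were inspected in the delta-validation.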
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Long-term dietary exposures to lead in young children were calculated by combining food consumption data of 11 European countries categorised using harmonised broad food categories with occurrence data on lead from different Member States (pan-European approach). The results of the assessment in children living in the Netherlands were compared with a long-term lead intake assessment in the same group using Dutch lead concentration data and linking the consumption and concentration data at the highest possible level of detail. Exposures obtained with the pan-European approach were higher than the national exposure calculations. For both assessments cereals contributed most to the exposure. The lower dietary exposure in the national study was due to the use of lower lead concentrations and...
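The exposure calculation described above combines per-category food consumption with lead occurrence data. A minimal sketch of that combination follows, with hypothetical category names and values; the real assessments link consumption and concentration data at a much finer level of detail:

```python
def chronic_dietary_exposure(consumption_g_day, concentration_mg_kg, body_weight_kg):
    """Long-term dietary lead exposure in mg per kg body weight per day.

    consumption_g_day    : mean consumption per food category (g/day)
    concentration_mg_kg  : mean lead concentration per category (mg/kg food)
    Category names and values are illustrative only.
    """
    intake_mg_day = sum(consumption_g_day[food] / 1000.0 * concentration_mg_kg[food]
                        for food in consumption_g_day)
    return intake_mg_day / body_weight_kg
```

The contribution of each category to the sum is what identifies cereals as the main contributor in both assessments.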
Abstract:
This PhD thesis focused on the development and application of a chemical methodology (Py-GC-MS) and of data-processing methods based on multivariate data analysis (chemometrics). The chromatographic and mass spectrometric data obtained with this technique are particularly suitable for interpretation by chemometric methods such as PCA (Principal Component Analysis) for data exploration and SIMCA (Soft Independent Modelling of Class Analogy) for classification. As a first approach, some issues related to the field of cultural heritage were discussed, with particular attention to the differentiation of binders used in painting. A marker of egg tempera, esterified phosphoric acid (a pyrolysis product of lecithin), was determined using HMDS (hexamethyldisilazane) rather than TMAH (tetramethylammonium hydroxide) as the derivatizing reagent. The validity of analytical pyrolysis as a tool to characterize and classify different types of bacteria was also verified. The FAME chromatographic profiles represent an important tool for bacterial identification. Because of the complexity of the chromatograms, it was possible to characterize the bacteria only at the genus level, while differentiation at the species level was achieved by means of chemometric analysis. To perform this study, the normalized peak areas of the fatty acids were taken into account. Chemometric methods were applied to the experimental datasets. The obtained results demonstrate the effectiveness of analytical pyrolysis and chemometric analysis for the rapid characterization of bacterial species. Application to samples of bacterial (Pseudomonas mendocina), fungal (Pleurotus ostreatus) and mixed biofilms was also performed. A comparison of the chromatographic profiles established the possibility to: • Differentiate the bacterial and fungal biofilms according to their FAME profiles.
• Characterize the fungal biofilm by means of the typical pattern of pyrolytic fragments derived from the saccharides present in the cell wall. • Identify the markers of bacterial and fungal biofilm in the same mixed-biofilm sample.
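Exploratory PCA of the kind used throughout the thesis can be sketched as an SVD of the mean-centred peak-area matrix. An illustrative implementation (not the thesis's chemometric software):

```python
import numpy as np

def pca_scores(X, n_comp=2):
    """PCA by SVD of the mean-centred data matrix.

    Rows of X are samples (e.g. pyrograms), columns are normalized peak
    areas; the returned score matrix is what would be plotted for data
    exploration and class separation.
    """
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_comp] * s[:n_comp]
```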
Abstract:
Advances in biomedical signal acquisition systems for motion analysis have led to low-cost and ubiquitous wearable sensors, which can be used to record movement data in different settings. This implies the potential availability of large amounts of quantitative data. It is then crucial to identify and extract the information of clinical relevance from the large amount of available data. This quantitative and objective information can be an important aid for clinical decision making. Data mining is the process of discovering such information in databases through data processing, selection of informative data, and identification of relevant patterns. The databases considered in this thesis store motion data from wearable sensors (specifically accelerometers) and clinical information (clinical data, scores, tests). The main goal of this thesis is to develop data mining tools which can provide quantitative information to the clinician in the field of movement disorders. This thesis focuses on motor impairment in Parkinson's disease (PD). Different databases related to Parkinson subjects in different stages of the disease were considered. Each database is characterized by the data recorded during a specific motor task performed by different groups of subjects. The data mining techniques used in this thesis are feature selection (used to find relevant information and to discard useless or redundant data), classification, clustering, and regression. The aims were to identify subjects at high risk for PD, characterize the differences between early PD subjects and healthy ones, characterize PD subtypes, and automatically assess the severity of symptoms in the home setting.
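As one concrete example of the feature-selection step mentioned above, a simple correlation-based filter that ranks accelerometer-derived features against a clinical score might look like this (an illustrative sketch, not the thesis's actual method):

```python
import numpy as np

def rank_features(X, y):
    """Filter-style feature selection: rank the columns of X (candidate
    features) by the absolute Pearson correlation with the target y
    (e.g. a clinical severity score). Returns column indices, most
    informative first."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # per-column Pearson correlation, vectorized
    r = (Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(r))
```

Keeping only the top-ranked features before classification or regression is the "discard useless or redundant data" step in its simplest form; wrapper and embedded methods refine this idea.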
Abstract:
This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, used to reduce the data sent by the network nodes and so prolong the network lifetime as much as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression implemented on single nodes to CS for signal ensembles, trying to exploit the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered in real deployments. The best trade-off between reconstruction quality and power consumption is then investigated. The usage of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are compared against a real dataset, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
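The CS reconstruction problem at the heart of the thesis — recovering a sparse signal from far fewer random measurements than samples — can be illustrated with Orthogonal Matching Pursuit, a simple greedy recovery algorithm. This is a generic sketch, not the thesis's specific DCS/KCS implementations:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedy recovery of a k-sparse x
    from m < n measurements y = Phi @ x."""
    n = Phi.shape[1]
    support, residual = [], y.copy()
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # re-fit the selected atoms jointly by least squares
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x
```

On a sensor node the energy saving comes from transmitting only the m compressed measurements; the costly reconstruction runs at the sink.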
Abstract:
In recent years, radar sensor networks for localization and tracking in indoor environments have generated more and more interest, especially for anti-intrusion security systems. These networks often use Ultra Wide Band (UWB) technology, which consists in sending very short (a few nanoseconds) impulse signals. This approach guarantees high resolution and accuracy, as well as other advantages such as low price, low power consumption and robustness to narrow-band interference (jamming). In this thesis the overall data processing chain (implemented in the MATLAB environment) is discussed, starting from experimental measurements from the sensor devices and ending with the 2D visualization of target movements over time, focusing mainly on the detection and localization algorithms. Moreover, two different scenarios and both single- and multiple-target tracking are analyzed.
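A basic building block of the localization step can be illustrated with least-squares trilateration from range estimates: subtracting one anchor's range equation from the others linearizes the system in the unknown target position. This is a generic sketch, not the thesis's MATLAB processing chain:

```python
import numpy as np

def locate_2d(anchors, ranges):
    """2-D target position from range measurements to known anchors.

    For each anchor i: (x - xi)^2 + (y - yi)^2 = ri^2. Subtracting the
    first anchor's equation cancels the quadratic terms, leaving a
    linear system solved by least squares (needs >= 3 anchors).
    """
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2 * (anchors[1:] - anchors[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With noisy UWB ranges the least-squares solution averages the inconsistencies across anchors; a tracking filter would then smooth successive position estimates over time.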
Abstract:
We previously showed that lifetime cumulative lead dose, measured as lead concentration in the tibia bone by X-ray fluorescence, was associated with persistent and progressive declines in cognitive function and with decreases in MRI-based brain volumes in former lead workers. Moreover, larger region-specific brain volumes were associated with better cognitive function. These findings motivated us to explore a novel application of path analysis to evaluate effect mediation. Voxel-wise path analysis, at face value, represents the natural evolution of voxel-based morphometry methods to answer questions of mediation. Application of these methods to the former lead worker data demonstrated potential limitations in this approach, with a tendency for results to be strongly biased towards the null hypothesis (lack of mediation). Moreover, a complementary analysis using anatomically derived region-of-interest (ROI) volumes yielded opposing results, suggesting evidence of mediation. Specifically, in the ROI-based approach, there was evidence that the association of tibia lead with function in three cognitive domains was mediated through the volumes of total brain, frontal gray matter, and/or possibly cingulate. A simulation study was conducted to investigate whether the voxel-wise results arose from an absence of localized mediation or from more subtle defects in the methodology. The simulation results showed the same null bias as seen in the lead worker data. Both the lead worker results and the simulation study suggest that a null bias in voxel-wise path analysis limits its inferential utility for producing confirmatory results.
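The mediation question examined above can be illustrated with the classic product-of-coefficients estimate: regress the mediator on the exposure (path a), then the outcome on the mediator adjusting for exposure (path b); the indirect effect is a·b. A minimal sketch with the variable roles of the abstract, not its voxel-wise pipeline:

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate.

    x : exposure (e.g. tibia lead), m : mediator (e.g. regional brain
    volume), y : outcome (e.g. cognitive score). Returns a*b, where
    a = slope of m on x and b = slope of y on m adjusting for x.
    """
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]
    return a * b
```

An estimate of a·b near zero despite strong x-m and m-y associations is the kind of null-biased behaviour the voxel-wise analysis exhibited; the ROI-based analysis instead produced non-null indirect effects.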