931 results for CHD Prediction, Blood Serum Data Chemometrics Methods
Abstract:
This review examines recent evidence linking exposure to aluminium with the aetiology of breast cancer. The human population is exposed to aluminium throughout daily life, including through diet, application of antiperspirants, use of antacids and vaccination. Aluminium has now been measured in a range of human breast structures at higher levels than in blood serum, and experimental evidence suggests that the tissue concentrations measured have the potential to adversely influence breast epithelial cells, including the generation of genomic instability, the induction of anchorage-independent proliferation and interference in oestrogen action. The presence of aluminium in the human breast may also alter the breast microenvironment, causing disruption to iron metabolism, oxidative damage to cellular components, inflammatory responses and alterations to the motility of cells. The main research need is now to investigate whether the concentrations of aluminium measured in the human breast can lead in vivo to any of the effects observed in cells in vitro; this would be aided by the identification of biomarkers specific to aluminium action.
Abstract:
Modeling aging and age-related pathologies presents a substantial analytical challenge given the complexity of gene-environment influences and interactions operating on an individual. A top-down systems approach is used to model the effects of lifelong caloric restriction, which is known to extend life span in several animal models. The metabolic phenotypes of caloric-restricted (CR; n = 24) and pair-housed control-fed (CF; n = 24) Labrador Retriever dogs were investigated by use of orthogonal projection to latent structures discriminant analysis (OPLS-DA) to model both generic and age-specific responses to caloric restriction from the ¹H NMR blood serum profiles of young and older dogs. Three aging metabolic phenotypes were resolved: (i) an aging metabolic phenotype independent of diet, characterized by high levels of glutamine, creatinine, methylamine, dimethylamine, trimethylamine N-oxide, and glycerophosphocholine and decreasing levels of glycine, aspartate, creatine, and citrate, indicative of metabolic changes associated largely with muscle mass; (ii) an aging metabolic phenotype specific to CR dogs, consisting of relatively lower levels of glucose, acetate, choline, and tyrosine and relatively higher serum levels of phosphocholine with increased age in the CR population; (iii) an aging metabolic phenotype specific to CF dogs, including lower levels of lipoprotein fatty acyl groups and allantoin and relatively higher levels of formate with increased age in the CF population. There was no diet metabotype that consistently differentiated the CF and CR dogs irrespective of age. Glucose consistently discriminated between feeding regimes in older dogs (≥312 weeks), being relatively lower in the CR group. However, creatine and amino acids (valine, leucine, isoleucine, lysine, and phenylalanine) were lower in the younger CR dogs (<312 weeks), suggestive of differences in energy source utilization. ¹H NMR spectroscopic analysis of longitudinal serum profiles enabled an unbiased evaluation of the metabolic markers modulated by a lifetime of caloric restriction and showed differences in the metabolic phenotype of aging due to caloric restriction, which contributes to longevity studies in caloric-restricted animals. Furthermore, OPLS-DA provided a framework in which significant metabolites relating to life extension could be differentiated and integrated with aging processes.
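As a brief illustration of the discriminant-analysis step described above, the following sketch uses PLS-DA via scikit-learn as a simplified stand-in for OPLS-DA (which additionally separates predictive from orthogonal variation); the data shapes and values are placeholders, not the study's spectra.

```python
# PLS-DA as a simplified stand-in for OPLS-DA; placeholder data only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(48, 2000))   # 48 dogs x 2000 binned 1H NMR spectral intensities
y = np.repeat([0.0, 1.0], 24)     # class labels: 0 = control-fed (CF), 1 = caloric-restricted (CR)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
scores = pls.transform(X)         # latent-variable scores: how well the diet groups separate
loadings = pls.x_loadings_        # spectral variables (candidate metabolites) driving the separation
```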
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm in which the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs.
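For concreteness, here is a minimal mpi4py sketch of the straightforward parallel formulation the abstract refers to, in which every iteration ends with a global reduction; the data are placeholders, and the communication-relaxed formulation proposed in the work is not reproduced here.

```python
# Straightforward parallel k-means: one global reduction (Allreduce) per
# iteration -- the scalability bottleneck discussed above. Placeholder data.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rng = np.random.default_rng(comm.Get_rank())
points = rng.normal(size=(10_000, 2))      # this process's partition of the data
k = 8
centroids = np.zeros((k, 2))
if comm.Get_rank() == 0:
    centroids = rng.normal(size=(k, 2))    # initial centroids chosen on one process
comm.Bcast(centroids, root=0)

for _ in range(20):
    # assign each local point to its nearest centroid
    labels = np.argmin(((points[:, None, :] - centroids) ** 2).sum(axis=-1), axis=1)
    local_sums = np.zeros((k, 2))
    local_counts = np.zeros(k)
    for j in range(k):
        members = points[labels == j]
        local_sums[j] = members.sum(axis=0)
        local_counts[j] = len(members)
    # the global reduction required at each iteration step
    sums = np.empty_like(local_sums)
    counts = np.empty_like(local_counts)
    comm.Allreduce(local_sums, sums, op=MPI.SUM)
    comm.Allreduce(local_counts, counts, op=MPI.SUM)
    centroids = sums / np.maximum(counts, 1)[:, None]
```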
Abstract:
Residential electricity demand in most European countries accounts for a major proportion of overall electricity consumption. The timing of residential electricity demand has significant impacts on carbon emissions and system costs. This paper reviews the data and methods used in time use studies in the context of residential electricity demand modelling. It highlights key issues which are likely to become more topical for research on the timing of electricity demand following the roll-out of smart meters.
Abstract:
Current methods for initialising coupled atmosphere-ocean forecasts often rely on the use of separate atmosphere and ocean analyses, the combination of which can leave the coupled system imbalanced at the beginning of the forecast, potentially accelerating the development of errors. Using a series of experiments with the European Centre for Medium-Range Weather Forecasts coupled system, the magnitude and extent of these so-called initialisation shocks are quantified, and their impact on forecast skill is measured. It is found that forecasts initialised by separate ocean and atmospheric analyses do exhibit initialisation shocks in lower atmospheric temperature when compared to forecasts initialised using a coupled data assimilation method. These shocks result in as much as a doubling of root-mean-square error on the first day of the forecast in some regions, and in increases that are sustained for the duration of the 10-day forecasts performed here. However, the impacts of this choice of initialisation on forecast skill, assessed using independent datasets, were found to be negligible, at least over the limited period studied. Larger initialisation shocks are found to follow a change in either the atmospheric or ocean model component between the analysis and forecast phases: changes in the ocean component can lead to sea surface temperature shocks of more than 0.5 K in some equatorial regions during the first day of the forecast. Implications for the development of coupled forecast systems, particularly with respect to coupled data assimilation methods, are discussed.
Abstract:
Sensitive quantitation of multiple cytokines can provide important diagnostic information during infection, inflammation and immunopathology. In this study, sensitive immunoassay detection of the human cytokines IL-1β, IL-6, IL-12p70 and TNFα is shown for singleplex and multiplex formats using a novel miniaturized ELISA platform. The platform uses a disposable plastic multi-syringe aspirator (MSA) integrating 8 disposable fluoropolymer microfluidic test strips, each containing an array of ten 200 μm i.d. microcapillaries coated with a set of monoclonal antibodies. Each MSA device thus performs 10 tests on 8 samples, delivering 80 measurements. Unprecedented levels of sensitivity were obtained with the novel fluoropolymer microfluidic material and simple colorimetric detection in a flatbed scanner. The limits of detection for singleplex cytokine detection ranged from 2.0 to 15.0 pg/ml, i.e. 35 to 713 femtomolar, and the intra- and inter-assay coefficients of variation (CV) remained within 10%. In addition, a triplex immunoassay was developed for measuring IL-1β, IL-12p70 and TNFα simultaneously from a given sample in the pg/ml range. These assays permit high-sensitivity measurement with a rapid (<15 min) assay, or detection from undiluted blood serum. The portability, speed and low cost of this system make it highly suited to point-of-care testing and field diagnostics applications.
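For reference, the mass-to-molar conversion behind the femtomolar figures is straightforward; as a worked example (taking a molecular weight of roughly 21 kDa, approximately that of IL-6; this value is our assumption, not stated in the abstract):

```latex
% C_molar = C_mass / M_W
C \;=\; \frac{15\ \mathrm{pg\,ml^{-1}}}{21\,000\ \mathrm{g\,mol^{-1}}}
  \;=\; \frac{15\times10^{-9}\ \mathrm{g\,l^{-1}}}{2.1\times10^{4}\ \mathrm{g\,mol^{-1}}}
  \;\approx\; 7.1\times10^{-13}\ \mathrm{M} \;\approx\; 710\ \mathrm{fM}
```

which is consistent with the abstract's upper figure of 713 fM; the molar limits depend on each cytokine's molecular weight, which is why the mass and molar ranges do not map one-to-one.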
Abstract:
Estimating trajectories and parameters of dynamical systems from observations is a problem frequently encountered in various branches of science; geophysicists, for example, refer to this problem as data assimilation. Unlike in estimation problems with exchangeable observations, in data assimilation the observations cannot easily be divided into separate sets for estimation and validation; this creates serious problems, since simply using the same observations for estimation and validation might result in overly optimistic performance assessments. To circumvent this problem, a result is presented which allows us to estimate this optimism, thus allowing for a more realistic performance assessment in data assimilation. The presented approach becomes particularly simple for data assimilation methods employing a linear error feedback (such as synchronization schemes, nudging, incremental 3D-Var and 4D-Var, and various Kalman filter approaches). Numerical examples considering a high gain observer confirm the theory.
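For concreteness, the class of schemes singled out here, those with a linear error feedback, can be written in the generic form below (our notation, not the paper's), where M is the model dynamics, H the observation operator, y_k the observations at step k, and K the feedback gain; nudging, incremental 3D-Var/4D-Var and Kalman-type filters correspond to different choices of K.

```latex
\hat{x}_{k+1} = \mathcal{M}(\hat{x}_k) + K\,\bigl(y_k - H\hat{x}_k\bigr)
```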
Abstract:
Accurate knowledge of the location and magnitude of ocean heat content (OHC) variability and change is essential for understanding the processes that govern decadal variations in surface temperature, quantifying changes in the planetary energy budget, and developing constraints on the transient climate response to external forcings. We present an overview of the temporal and spatial characteristics of OHC variability and change as represented by an ensemble of dynamical and statistical ocean reanalyses (ORAs). Spatial maps of the 0–300 m layer show large regions of the Pacific and Indian Oceans where the interannual variability of the ensemble mean exceeds the ensemble spread, indicating that OHC variations are well constrained by the available observations over the period 1993–2009. At deeper levels, the ORAs are less well constrained by observations, with the largest differences across the ensemble mostly associated with areas of high eddy kinetic energy, such as the Southern Ocean and boundary current regions. Spatial patterns of OHC change for the period 1997–2009 show good agreement in the upper 300 m and are characterized by a strong dipole pattern in the Pacific Ocean. There is less agreement in the patterns of change at deeper levels, potentially linked to differences in the representation of ocean dynamics, such as water mass formation processes. However, the Atlantic and Southern Oceans are regions in which many ORAs show widespread warming below 700 m over the period 1997–2009. Annual time series of global and hemispheric OHC change for 0–700 m show the largest spread for the data-sparse Southern Hemisphere, and a number of ORAs appear to be subject to a large initialization ‘shock’ over the first few years. In agreement with previous studies, a number of ORAs exhibit enhanced ocean heat uptake below 300 and 700 m during the mid-1990s or early 2000s. The ORA ensemble mean (±1 standard deviation) of rolling 5-year trends in full-depth OHC shows a relatively steady heat uptake of approximately 0.9 ± 0.8 W m⁻² (expressed relative to Earth’s surface area) between 1995 and 2002, which reduces to about 0.2 ± 0.6 W m⁻² between 2004 and 2006, in qualitative agreement with recent analyses of Earth’s energy imbalance. There is a marked reduction in the ensemble spread of OHC trends below 300 m as the Argo profiling float observations become available in the early 2000s. In general, we suggest that ORAs should be treated with caution when employed to understand past ocean warming trends, especially when considering the deeper ocean, where there is little in the way of observational constraints. The current work emphasizes the need to better observe the deep ocean, both to provide observational constraints for future ocean state estimation efforts and to develop improved models and data assimilation methods.
Abstract:
The dynamical processes that lead to open cluster disruption cause a cluster's mass to decrease. To investigate such processes from the observational point of view, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated. Due to their nature, distinguishing them from field star fluctuations is still an unresolved issue. In this work, we developed a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: to produce distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, considering the isochrone solution, in order to verify their similarity. If the object cannot be statistically considered a field fluctuation, we derive its probable age, distance modulus, reddening and uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to separate the best OCR candidates for studies of kinematics and chemical composition. The study of the possible OCRs will certainly provide a deeper understanding of OCR properties and constraints for theoretical models, including insights into the evolution of open clusters and their dissolution rates.
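As a rough illustration of the kind of object-versus-field comparison in step (iv), the sketch below contrasts the colour distribution of stars inside a candidate's limiting radius with that of an equal-size comparison field using a two-sample Kolmogorov-Smirnov test; this is a deliberately simplified stand-in for the authors' CMD diagnostic, with placeholder data.

```python
# Simplified object-vs-field comparison: two-sample KS test on (J - Ks) colours.
# A stand-in for the authors' CMD similarity diagnostic; data are placeholders.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
colour_object = rng.normal(0.6, 0.15, size=40)  # (J - Ks) of stars inside the limiting radius
colour_field = rng.normal(0.8, 0.30, size=40)   # (J - Ks) of comparison-field stars

stat, p_value = ks_2samp(colour_object, colour_field)
if p_value < 0.05:  # illustrative threshold
    print("Colour distributions differ: unlikely to be a mere field fluctuation")
else:
    print("Consistent with a background field fluctuation")
```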
Abstract:
Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hinder the follow-up of the components in time and consequently contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool to analyse jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, varying exhaustively the quantities associated with the method. Our results show that even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within 1 per cent. Even for a non-precessing jet, our optimization method successfully identified the lack of precession.
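To illustrate the core of the cross-entropy method for continuous multi-extremal optimization (the generic optimizer, not the authors' precession model), a minimal implementation might look like the following, with a placeholder objective function:

```python
# Generic cross-entropy optimizer: sample candidates from a Gaussian, keep the
# elite fraction, refit the Gaussian to the elite, and repeat until it shrinks
# around an optimum. The objective below is a placeholder, not a precession model.
import numpy as np

def cross_entropy_minimize(objective, dim, iters=100, n_samples=200, n_elite=20, seed=0):
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.full(dim, 5.0)   # broad initial sampling distribution
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(n_samples, dim))
        scores = np.array([objective(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]  # best-performing candidates
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-12
    return mean

# Placeholder objective: squared distance to a known optimum at (1, -2, 0.5)
best = cross_entropy_minimize(lambda p: ((p - np.array([1.0, -2.0, 0.5])) ** 2).sum(), dim=3)
```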
Abstract:
We present a new technique for obtaining model fits to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained from the task IMFIT of the traditional Astronomical Image Processing System (AIPS) package when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions with more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As with any model fitting performed in the image plane, caution is required when analyzing images constructed from a poorly sampled (u, v) plane.
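As a sketch of the model image and performance function described above, the following builds an image from a sum of elliptical Gaussian components and computes the summed squared difference against an observed map; the parameterization is one plausible reading of the six parameters, not necessarily the authors' exact convention.

```python
# Model image as a sum of elliptical Gaussians plus the squared-difference
# performance function; parameterization is an assumption, values are placeholders.
import numpy as np

def elliptical_gaussian(x, y, x0, y0, peak, a, ecc, theta):
    """One component: peak position (x0, y0), peak intensity, major-axis scale a,
    eccentricity ecc, and orientation angle theta of the major axis."""
    b = a * np.sqrt(1.0 - ecc ** 2)                       # minor-axis scale
    xr = (x - x0) * np.cos(theta) + (y - y0) * np.sin(theta)
    yr = -(x - x0) * np.sin(theta) + (y - y0) * np.cos(theta)
    return peak * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))

def performance(params, x, y, observed):
    """Sum of squared differences between model and observed images;
    params holds six values per Gaussian component."""
    model = sum(elliptical_gaussian(x, y, *p) for p in params.reshape(-1, 6))
    return ((model - observed) ** 2).sum()

# Example: one synthetic component as the "observed" map, one trial model
yy, xx = np.mgrid[0:64, 0:64].astype(float)
obs = elliptical_gaussian(xx, yy, 32, 32, 1.0, 6.0, 0.5, 0.3)
print(performance(np.array([30.0, 30.0, 0.9, 5.0, 0.4, 0.2]), xx, yy, obs))
```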
Abstract:
We analyzed ostriches from a well-equipped farm located in southeastern Brazil for the presence of Salmonella spp. The bacterium was investigated in 80 samples of ostrich droppings, 90 eggs, 30 samples of feed and 30 samples of rodent droppings. Additionally, at the slaughterhouse the bacterium was investigated in droppings, caecal content, spleen, liver and carcasses from 90 slaughtered ostriches from the studied farm. The blood serum of these animals was also harvested and subjected to serum plate agglutination using a commercial Salmonella Pullorum antigen. No Salmonella spp. was detected in any of the eggs, caecal content, liver, spleen, carcass or dropping samples from ostriches and rodents. However, Salmonella Javiana and Salmonella enterica subsp. enterica 4,12:i:- were isolated from some samples of feed. The serological test was negative for all samples. Good sanitary farming management and the application of HACCP principles and GMP during the slaughtering process could explain the absence of Salmonella spp. in the tested samples.
Abstract:
Low-density lipoprotein (LDL), often known as "bad cholesterol", is one of the factors responsible for increasing the risk of coronary artery disease. For this reason, the cholesterol present in the LDL particle has become one of the main parameters to be quantified in routine clinical diagnosis. A number of tools are available to assess LDL particles and estimate the cholesterol concentration in the blood. The most common methods to quantify LDL in plasma are density gradient ultracentrifugation and nuclear magnetic resonance (NMR). However, these techniques require specialized equipment and can take a long time to provide results. In this paper, we report on the increase of the europium emission in aqueous solutions of the europium-oxytetracycline complex in the presence of LDL. This increase is proportional to the LDL concentration in the solution. This phenomenon can be used to develop a method to quantify the number of LDL particles in a sample. A comparison between the performance of oxytetracycline and that of tetracycline in the complexes is also made.
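Since the reported emission increase is proportional to the LDL concentration, quantification reduces to a linear calibration curve; here is a minimal sketch with entirely made-up numbers:

```python
# Linear calibration: fit Eu3+ emission intensity against known LDL standards,
# then invert the fit for an unknown sample. All values are illustrative only.
import numpy as np

ldl_standards = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # LDL concentration (arbitrary units)
emission = np.array([10.2, 14.8, 20.1, 30.3, 49.9])   # measured emission intensity (a.u.)

slope, intercept = np.polyfit(ldl_standards, emission, 1)  # least-squares line
unknown_intensity = 25.0
ldl_estimate = (unknown_intensity - intercept) / slope
print(f"Estimated LDL concentration: {ldl_estimate:.2f} (arbitrary units)")
```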
Abstract:
This degree project was carried out by two students at Högskolan Dalarna in collaboration with the IT consultancy Istone Concrevi. Edsbyverken, a furniture manufacturer based in Edsbyn, is a customer of Istone and needs to improve its delivery reliability. The company also perceives a lack of cohesion between its administrative staff and its manufacturing staff, and Edsbyverken hopes that increased cohesion will contribute to better delivery reliability. Cohesion is a concept common in sports, particularly team sports, where it is often associated with success. The research strategy used in the study is design and creation, which focuses on creating new IT products, or artefacts. The study resulted in an artefact of the instantiation type, in the form of an application developed using a user-centred and agile way of working. The purpose of the study is to investigate and test how theories from sports psychology can be used in a systems development project, with the aim of having both the artefact and the development process promote cohesion within the organization. The data collection methods used in the study are interviews and questionnaires. The interviews were used to gather background information about the organization and requirements for the app, and the questionnaires to collect feedback on the app and to evaluate the impact of the development process and the app on cohesion. The results of the data collection are in many cases divergent, ranging from positive assessments indicating that cohesion improved to negative ones indicating that no effect occurred. On balance, however, the majority consider that this research project did not succeed in improving cohesion, although some employees at Edsbyverken believe that cohesion may increase in the longer term. The conclusion regarding sports psychology is that, when linked to systems development, it has a limited area of application and that many of the existing theories on cohesion are not usable or suitable for systems development work.
Abstract:
A challenge for the clinical management of Parkinson's disease (PD) is the large within- and between-patient variability in symptom profiles, as well as the emergence of motor complications, which represent a significant source of disability in patients. This thesis deals with the development and evaluation of methods and systems for supporting the management of PD using repeated measures, consisting of subjective assessments of symptoms and objective assessments of motor function through fine motor tests (spirography and tapping), collected by means of a telemetry touch screen device. One aim of the thesis was to develop methods for objective quantification and analysis of the severity of motor impairments represented in spiral drawings and tapping results. This was accomplished by first quantifying the digitized movement data with time series analysis and then using them in data-driven modelling to automate the assessment of symptom severity. The objective measures were then analysed with respect to subjective assessments of motor conditions. Another aim was to develop a method for providing information content comparable to that of clinical rating scales by combining subjective and objective measures into composite scores, using time series analysis and data-driven methods. The scores represent six symptom dimensions and an overall test score reflecting the global health condition of the patient. In addition, the thesis presents the development of a web-based system providing a visual representation of symptoms over time, allowing clinicians to remotely monitor the symptom profiles of their patients. The quality of the methods was assessed by reporting different metrics of validity, reliability and sensitivity to treatment interventions and to natural PD progression over time. Results from two studies demonstrated that the methods developed for the fine motor tests had good metrics, indicating that they are appropriate for quantitatively and objectively assessing the severity of motor impairments of PD patients. The fine motor tests captured different symptoms: spiral drawing impairment and tapping accuracy related to dyskinesias (involuntary movements), whereas tapping speed related to bradykinesia (slowness of movements). A longitudinal data analysis indicated that the six symptom dimensions and the overall test score contained important elements of the information in the clinical scales and can be used to measure the effects of PD treatment interventions and disease progression. A usability evaluation of the web-based system showed that the information presented in the system was comparable to qualitative clinical observations, and the system was recognized as a tool that will assist in the management of patients.