16 results for Detection and segmentation

in Helda - Digital Repository of University of Helsinki


Relevance: 100.00%

Abstract:

Knowing the chromosomal areas or actual genes affecting the traits under selection would add information to be used in selection decisions, potentially leading to a higher genetic response. The first objective of this study was to map quantitative trait loci (QTL) affecting economically important traits in the Finnish Ayrshire population. The second objective was to investigate the effects of using QTL information in marker-assisted selection (MAS) on the genetic response and on the linkage disequilibrium between different parts of the genome. Whole-genome scans were carried out on a granddaughter design with 12 half-sib families and a total of 493 sons. Twelve traits were studied: milk yield, protein yield, protein content, fat yield, fat content, somatic cell score (SCS), mastitis treatments, other veterinary treatments, days open, fertility treatments, non-return rate, and calf mortality. The average spacing of the typed markers was 20 cM, with 2 to 14 markers per chromosome. Associations between markers and traits were analyzed with multiple marker regression. Significance was determined by permutation, and genome-wise P-values were obtained by Bonferroni correction. The benefits from MAS were investigated by simulation: a conventional progeny testing scheme was compared to a scheme where QTL information was used within families to select among full-sibs in the male path. Two QTL on different chromosomes were modelled, and the effects of different starting frequencies of the favourable alleles and different sizes of the QTL effects were evaluated. A large number of QTL, 48 in total, were detected at the 5% chromosome-wise significance level or higher. QTL for milk production were found on 8 chromosomes, for SCS on 6, for mastitis treatments on 1, for other veterinary treatments on 5, for days open on 7, for fertility treatments on 7, for calf mortality on 6, and for non-return rate on 2 chromosomes. In the simulation study, the total genetic response was faster with MAS than with conventional selection, and the advantage of MAS persisted over the studied generations. The rate of response and the difference between the selection schemes clearly reflected the changes in allele frequencies of the favourable QTL. The disequilibrium between the polygenes and the QTL was always negative, and it was larger with larger QTL effects. The disequilibrium between the two QTL was larger with QTL of large effect, and it was somewhat larger with MAS in scenarios with starting frequencies below 0.5 for QTL of moderate size and below 0.3 for large QTL. In conclusion, several QTL affecting economically important traits of dairy cattle were detected. Further studies are needed to verify these QTL, check their presence in the current breeding population, look for pleiotropy, and fine-map the most interesting QTL regions. The results of the simulation studies show that using MAS together with embryo transfer to pre-select young bulls within families is a useful approach for increasing the genetic merit of AI bulls compared to conventional selection.
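
As context for the significance testing described above, the following is a minimal sketch of permutation-based, chromosome-wise significance for a marker-trait association. It assumes a single-marker linear regression for brevity; the study itself used multiple marker regression within half-sib families, so the model and names here are illustrative only.

```python
# A minimal sketch of permutation-based significance testing for a
# marker-trait association, assuming a simple single-marker regression.
import numpy as np

rng = np.random.default_rng(0)

def f_statistic(genotypes: np.ndarray, phenotypes: np.ndarray) -> float:
    """F-statistic of a simple regression of phenotype on genotype."""
    x = genotypes - genotypes.mean()
    y = phenotypes - phenotypes.mean()
    beta = (x @ y) / (x @ x)
    resid = y - beta * x
    ss_model = beta ** 2 * (x @ x)
    ss_error = resid @ resid
    return ss_model / (ss_error / (len(y) - 2))

def chromosome_wise_pvalue(genotypes, phenotypes, n_perm=10_000) -> float:
    """Fraction of phenotype permutations whose test statistic meets or
    exceeds the observed one (with the usual +1 correction)."""
    observed = f_statistic(genotypes, phenotypes)
    exceed = sum(
        f_statistic(genotypes, rng.permutation(phenotypes)) >= observed
        for _ in range(n_perm)
    )
    return (exceed + 1) / (n_perm + 1)
```

A genome-wise P-value in the Bonferroni sense is then min(1, k·p), where p is the chromosome-wise P-value and k the number of chromosome-wise tests.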

Relevance: 100.00%

Abstract:

Megasphaera cerevisiae, Pectinatus cerevisiiphilus, Pectinatus frisingensis, Selenomonas lacticifex, Zymophilus paucivorans and Zymophilus raffinosivorans are strictly anaerobic Gram-stain-negative bacteria that are able to spoil beer by producing off-flavours and turbidity. They have only been isolated from the beer production chain. The species are phylogenetically affiliated to the Sporomusa sub-branch in the class "Clostridia". Routine cultivation methods for the detection of strictly anaerobic bacteria in breweries are time-consuming and do not allow species identification. The main aim of this study was to utilise DNA-based techniques to improve the detection and identification of the Sporomusa sub-branch beer-spoilage bacteria and to increase understanding of their biodiversity, evolution and natural sources. Practical PCR-based assays were developed for monitoring M. cerevisiae, Pectinatus species and the group of Sporomusa sub-branch beer spoilers throughout the beer production process. The developed assays reliably differentiated the target bacteria from other brewery-related microbes. Contaminant detection in process samples (10-1,000 cfu/ml) could be accomplished in 2-8 h. Low levels of viable cells in finished beer (≤10 cfu/100 ml) were usually detected after 1-3 d of culture enrichment. The time saving compared to cultivation methods was up to 6 d. Based on a polyphasic approach, this study revealed the existence of three new anaerobic spoilage species in the beer production chain, i.e. Megasphaera paucivorans, Megasphaera sueciensis and Pectinatus haikarae. The description of these species enabled the establishment of phenotypic and DNA-based methods for their detection and identification. The 16S rRNA gene based phylogenetic analysis of the Sporomusa sub-branch showed that the genus Selenomonas originates from several ancestors and will require reclassification. Moreover, Z. paucivorans and Z. raffinosivorans were found to be, in fact, members of the genus Propionispira. This relationship implies that they were carried into breweries along with plant material. The brewery-related Megasphaera species formed a distinct sub-group that did not include any sequences from other sources, suggesting that M. cerevisiae, M. paucivorans and M. sueciensis may be uniquely adapted to the brewery ecosystem. M. cerevisiae was also shown to exhibit remarkable resistance to many brewery-related stress conditions, which may partly explain why it is a brewery contaminant. This study showed that DNA-based techniques provide more rapid and specific information about the presence and identity of strictly anaerobic spoilage bacteria in the beer production chain than is possible using cultivation methods. This should bring financial benefits to the industry and better product quality for customers. In addition, the DNA-based analyses provided new insight into the biodiversity, natural sources and relationships of the Sporomusa sub-branch bacteria. The data can be exploited for the taxonomic classification of these bacteria and for the surveillance and control of contaminations.

Relevance: 100.00%

Abstract:

Topic detection and tracking (TDT) is an area of information retrieval research that focuses on news events. The problems TDT deals with include segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news stories that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems; in particular, it has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of the class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural-language temporal expressions occurring in the text and use them to anchor the rest of the terms onto the time-line. When comparing documents for event-based similarity, we look not only at matching terms but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system. The news reflects changes in the real world, and in order to keep up, the system has to change its behavior based on the contents of the news stream. We put forward two strategies for rebuilding the topic representations and report experimental results. We ran experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30% depending on the experimental setup; the gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
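
As an illustration of the class-wise comparison described above, here is a minimal sketch in which each document is split into per-class term vectors and the final similarity is a weighted combination of per-class cosine similarities. The class names, weights, example terms, and the choice of cosine as the per-class measure are illustrative assumptions, not the thesis's exact formulation.

```python
# A minimal sketch of semantic-class document similarity: documents are
# compared class-wise (persons, organizations, locations, general
# terms) and the per-class similarities are combined with weights.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def class_wise_similarity(doc1: dict, doc2: dict, weights: dict) -> float:
    """doc1, doc2 map a semantic class to a Counter of its terms;
    the result is the weighted mean of per-class similarities."""
    total = sum(weights.values())
    return sum(
        w * cosine(doc1.get(cls, Counter()), doc2.get(cls, Counter()))
        for cls, w in weights.items()
    ) / total

weights = {"persons": 1.0, "organizations": 1.0, "locations": 1.0, "terms": 2.0}
doc_a = {"persons": Counter({"halonen": 2}), "terms": Counter({"election": 3})}
doc_b = {"persons": Counter({"halonen": 1}), "terms": Counter({"election": 2})}
print(class_wise_similarity(doc_a, doc_b, weights))
```

Because each class carries its own measure, the cosine for, say, `locations` could be swapped for a taxonomy-based distance without touching the rest of the combination.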

Relevance: 100.00%

Abstract:

Background. Kidney transplantation (KTX) is considered the best treatment for terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost due to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff '97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed on 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years' follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. Serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years. Older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function (DGF) increased over the years, while acute rejections (AR) decreased significantly. Sub-clinical AR was observed in 4% of cases and did not affect long-term allograft function or CADI. The recipients' drug treatment was modified over the course of the studies: mycophenolate mofetil, tacrolimus, statins and blockers of the renin-angiotensin system were more frequently prescribed after 2001. Patients with a higher ΔCADI had lower GFR during follow-up. A CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity; neither cystatin C nor other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration at 2 h correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN has been affected by two major factors: the donors' characteristics and the recipients' hypertension. The increased prevalence of DGF might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The reason for the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate for use in clinical trials and for monitoring kidney allografts.
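
The abstract compares creatinine, cystatin C and unnamed GFR estimates. Purely as an example of what such an estimate looks like, here is a sketch of the four-variable MDRD equation, one widely used creatinine-based formula; there is no claim that this particular equation was among those evaluated in the thesis.

```python
# A sketch of one standard creatinine-based GFR estimate, the
# four-variable MDRD equation (IDMS-traceable coefficient 175).
# Offered only as an example of the kind of estimate compared in the
# thesis; the exact formulas evaluated there are not stated.
def egfr_mdrd(serum_creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR in ml/min/1.73 m^2."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 55-year-old woman with serum creatinine 1.4 mg/dl
print(round(egfr_mdrd(1.4, 55, female=True, black=False)))  # ~39
```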

Relevance: 100.00%

Abstract:

PDF file, link above.

Relevance: 100.00%

Abstract:

A method was developed for the relative radiometric calibration of a single multitemporal Landsat TM image, of several multitemporal images overlapping each other, and of several multitemporal images covering different geographic locations. The radiometrically calibrated difference images were used for detecting rapid changes in forest stands, and the nonparametric kernel method was applied for change detection. The accuracy of the change detection was estimated by inspecting the image analysis results in the field. The change classification was applied for controlling the quality of the continuously updated forest stand information, the aim being to ensure that all man-made changes and any forest damage were correctly updated, including the attribute and stand delineation information. The image analysis results were compared with the registered treatments and the stand information base, and stands with discrepancies between these two information sources were recommended for field inspection.
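
As an illustration of the kind of procedure involved, here is a minimal sketch of relative radiometric calibration between two image dates: a linear gain and offset are fitted on pixels assumed unchanged (pseudo-invariant), the target image is normalized to the reference, and their difference forms the change-detection input. The pixel-selection rule and least-squares fit are illustrative assumptions, not the exact method of the study.

```python
# A minimal sketch of relative radiometric calibration between two
# co-registered image dates, followed by difference-image formation.
import numpy as np

def relative_calibration(reference: np.ndarray, target: np.ndarray,
                         invariant_mask: np.ndarray):
    """Fit reference ≈ gain * target + offset on pixels assumed to be
    unchanged, then return the normalized target and the difference
    image used for change detection."""
    x = target[invariant_mask].ravel()
    y = reference[invariant_mask].ravel()
    gain, offset = np.polyfit(x, y, 1)   # least-squares line
    normalized = gain * target + offset
    return normalized, normalized - reference
```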

Relevance: 100.00%

Abstract:

The present challenge in drug discovery is to synthesize new compounds efficiently and in minimal time. The trend is towards carefully designed and well-characterized compound libraries, because fast and effective synthesis methods easily produce thousands of new compounds, and the need for rapid and reliable analysis methods increases at the same time. Quality assessment, including identification and purity tests, is highly important, since false (negative or positive) results, for instance in tests of biological activity or in the determination of early ADME parameters in vitro (the pharmacokinetic study of drug absorption, distribution, metabolism, and excretion), must be avoided. This thesis summarizes the principles of classical planar chromatographic separation combined with ultraviolet (UV) and mass spectrometric (MS) detection, and introduces powerful, rapid, easy, low-cost alternative tools and techniques for the qualitative and quantitative analysis of small drug or drug-like molecules. High-performance thin-layer chromatography (HPTLC) was introduced and evaluated for fast semi-quantitative assessment of the purity of synthesis target compounds, and the HPTLC methods were compared with liquid chromatography (LC) methods. Electrospray ionization mass spectrometry (ESI-MS) and atmospheric pressure matrix-assisted laser desorption/ionization MS (AP-MALDI-MS) were used to identify and confirm the product zones on the plate. AP-MALDI-MS was rapid and easy to carry out directly on the plate without scraping. The preparative layer chromatography (PLC) method was used to isolate target compounds from crude synthesis products and purify them for bioactivity and preliminary ADME tests. Ultra-thin-layer chromatography (UTLC) with AP-MALDI-MS and desorption electrospray ionization mass spectrometry (DESI-MS) was introduced and studied for the first time. Because of the thinner adsorbent layer, the monolithic UTLC plate provided 10-100 times better sensitivity in MALDI analysis than HPTLC plates. Limits of detection (LODs) down to the low picomole range were demonstrated for UTLC AP-MALDI and UTLC DESI-MS. In a comparison of AP and vacuum MALDI MS detection for UTLC plates, desorption from the irregular surface of the plates with the combination of an external AP-MALDI ion source and an ion trap instrument provided clearly less variation in mass accuracy than the vacuum MALDI time-of-flight (TOF) instrument. The performance of two-dimensional (2D) UTLC separation with AP-MALDI-MS was studied for the first time. The influence of the urine matrix on the separation and on repeatability was evaluated with benzodiazepines as model substances in human urine, and the applicability of 2D UTLC AP-MALDI-MS was demonstrated in the detection of metabolites in an authentic urine sample.
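
As a side note on how detection limits like these are conventionally assigned: one common criterion (ICH) derives the LOD from the calibration curve as below, where σ is the standard deviation of the response and S the slope of the calibration line. The abstract does not state which criterion was used here (this one or, e.g., a signal-to-noise ratio of 3), so this is context only.

```latex
% ICH-style estimate of the limit of detection from calibration data
\mathrm{LOD} = \frac{3.3\,\sigma}{S}
```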

Relevance: 100.00%

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become common practice in dairy husbandry, and in 2006 about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper spends monitoring animals often decreases, which has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in the loose housing of cattle. Lameness causes losses in milk production and leads to the early culling of animals; these costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method, and the results are subjective. A four-balance system for measuring the leg-load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion-scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed, and a probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts: 5,074 measurements from 37 cows were used to train the model, and its ability to detect lameness was evaluated on a validation dataset of 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to support on-farm decision making and can be used in a real-time lameness monitoring system.
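
For illustration, a minimal sketch of a probabilistic neural network (Parzen-window) classifier of the kind chosen in the thesis: each class is represented by a Gaussian kernel density over its training patterns, and a new measurement is assigned to the class with the highest estimated density. The feature layout and smoothing parameter are illustrative; the actual inputs were leg-load and kick statistics from robotic milkings.

```python
# A minimal sketch of a probabilistic neural network (PNN) classifier:
# one Parzen-window density estimate per class, Gaussian kernel.
import numpy as np

class PNN:
    def __init__(self, sigma: float = 0.5):
        self.sigma = sigma  # kernel smoothing parameter

    def fit(self, X: np.ndarray, y: np.ndarray):
        self.classes = np.unique(y)
        self.patterns = {c: X[y == c] for c in self.classes}
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        def density(x, patterns):
            d2 = ((patterns - x) ** 2).sum(axis=1)
            return np.exp(-d2 / (2 * self.sigma ** 2)).mean()
        return np.array([
            max(self.classes, key=lambda c: density(x, self.patterns[c]))
            for x in X
        ])
```

A fuller PNN would also weight each class density by its prior probability (lameness being the rarer class), which is one way to tune the balance between missed cases and false alarms.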

Relevance: 90.00%

Abstract:

The synchronization of neuronal activity, especially in the beta- (14-30 Hz) /gamma- (30 80 Hz) frequency bands, is thought to provide a means for the integration of anatomically distributed processing and for the formation of transient neuronal assemblies. Thus non-stimulus locked (i.e. induced) gamma-band oscillations are believed to underlie feature binding and the formation of neuronal object representations. On the other hand, the functional roles of neuronal oscillations in slower theta- (4 8 Hz) and alpha- (8 14 Hz) frequency bands remain controversial. In addition, early stimulus-locked activity has been largely ignored, as it is believed to reflect merely the physical properties of sensory stimuli. With human neuromagnetic recordings, both the functional roles of gamma- and alpha-band oscillations and the significance of early stimulus-locked activity in neuronal processing were examined in this thesis. Study I of this thesis shows that even the stimulus-locked (evoked) gamma oscillations were sensitive to high-level stimulus features for speech and non-speech sounds, suggesting that they may underlie the formation of early neuronal object representations for stimuli with a behavioural relevance. Study II shows that neuronal processing for consciously perceived and unperceived stimuli differed as early as 30 ms after stimulus onset. This study also showed that the alpha band oscillations selectively correlated with conscious perception. Study III, in turn, shows that prestimulus alpha-band oscillations influence the subsequent detection and processing of sensory stimuli. Further, in Study IV, we asked whether phase synchronization between distinct frequency bands is present in cortical circuits. This study revealed prominent task-sensitive phase synchrony between alpha and beta/gamma oscillations. Finally, the implications of Studies II, III, and IV to the broader scientific context are analysed in the last study of this thesis (V). I suggest, in this thesis that neuronal processing may be extremely fast and that the evoked response is important for cognitive processes. I also propose that alpha oscillations define the global neuronal workspace of perception, action, and consciousness and, further, that cross-frequency synchronization is required for the integration of neuronal object representations into global neuronal workspace.
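
For illustration, a minimal sketch of how phase synchrony between band-limited signals is commonly quantified: the phase-locking value (PLV), generalized to n:m ratios for cross-frequency synchrony such as alpha to beta/gamma. The use of the Hilbert transform on pre-filtered signals and the example ratio are illustrative assumptions; the thesis's exact analysis pipeline is not specified in the abstract.

```python
# A minimal sketch of n:m phase-locking value (PLV) estimation between
# two narrow-band signals. Signals are assumed already band-pass
# filtered; the Hilbert transform yields instantaneous phases, and the
# PLV is the length of the mean resultant vector of the generalized
# phase difference (1 = perfect locking, 0 = no locking).
import numpy as np
from scipy.signal import hilbert

def plv(x: np.ndarray, y: np.ndarray, n: int = 1, m: int = 1) -> float:
    """n:m phase-locking value between narrow-band signals x and y."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * (n * phase_x - m * phase_y)))))

# plv(alpha, beta, n=2, m=1) would test 2:1 locking, i.e. a beta signal
# running at twice the alpha frequency with a stable phase relation.
```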

Relevance: 90.00%

Abstract:

Yersinia enterocolitica and Yersinia pseudotuberculosis are among the major enteropathogenic bacteria causing infections in humans in many industrialized countries. In Finland, Y. pseudotuberculosis caused 10 outbreaks among humans during 1997-2008. Some of these outbreaks were very extensive, involving over 400 cases, mainly children attending schools and day-care. Y. enterocolitica, by contrast, has mainly caused a large number of sporadic human infections in Finland. Y. pseudotuberculosis is widespread in nature, causing infections in a variety of domestic and wild animals. Foodborne transmission of human infections has long been suspected; however, attempts to trace the pathogen were unsuccessful before this study, which epidemiologically linked Y. pseudotuberculosis to a specific food item. Furthermore, due to modern food distribution systems, foodborne outbreaks usually involve many geographically separate infection clusters that are difficult to identify as parts of the same outbreak. Among pathogenic Y. enterocolitica, the global predominance of one genetically homogeneous type (bioserotype 4/O:3) is a challenge to the development of genetic typing methods discriminatory enough for epidemiological purposes, for example for tracing infections back to their sources. Furthermore, the diagnostics of Y. enterocolitica infections is hampered because clinical laboratories easily misidentify some other members of the Yersinia species (Y. enterocolitica-like species) as Y. enterocolitica, which results in misleading information on the prevalence and clinical significance of various Yersinia isolates. The aim of this study was to develop and optimize molecular typing methods for use in epidemiological investigations of Y. enterocolitica and Y. pseudotuberculosis, particularly in active surveillance and outbreak investigations of Y. pseudotuberculosis isolates. A further aim was to develop a simplified set of phenotypic tests that could be used in routine diagnostic laboratories for the correct identification of Y. enterocolitica and Y. enterocolitica-like species. A pulsed-field gel electrophoresis (PFGE) method designed here for typing Y. pseudotuberculosis was efficient in linking geographically dispersed and apparently unrelated Y. pseudotuberculosis infections as parts of the same outbreak, and it proved useful in active laboratory-based surveillance of Y. pseudotuberculosis outbreaks. Throughout the study period, information was obtained about the diversity of genotypes among outbreak- and non-outbreak-related strains of human origin. To our knowledge, this was also the first study to epidemiologically link a Y. pseudotuberculosis outbreak of human illness to a specific food item, iceberg lettuce. A novel epidemiological typing method based on the use of a repeated genomic region (YeO:3RS) as a probe was developed for the detection of and differentiation between strains of Y. enterocolitica subspecies palearctica. This method increased the discrimination in a set of 106 previously PFGE-typed Finnish Y. enterocolitica bioserotype 4/O:3 strains, among which two main PFGE genotypes had prevailed. The simplified method developed here was a more reliable tool than the commercially available biochemical test kits for differentiation between Y. enterocolitica and Y. enterocolitica-like species. In Finland, the methods developed for Y. enterocolitica and Y. pseudotuberculosis have been used to improve identification protocols and in subsequent outbreak investigations.

Relevance: 90.00%

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces; speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it becomes necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost; it becomes necessary to find trade-offs between accuracy and effort. We study the efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling the efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. As another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
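
For illustration, a minimal sketch of a tree-like, effort-aware classification process of the kind the framework models: cheap classifiers run first and delegate to more expensive ones only when their confidence is low. The cascade shape, the confidence rule and the threshold are illustrative assumptions, not the thesis's specific design.

```python
# A minimal sketch of confidence-gated delegation down a cascade of
# increasingly expensive classifiers, avoiding unnecessary computation
# on a per-input basis.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Stage:
    cost: float                                       # relative effort
    classify: Callable[[object], Tuple[int, float]]   # -> (label, confidence)

def cascade(stages: List[Stage], x, threshold: float = 0.9):
    """Return (label, effort spent). Early stages answer directly when
    confident; uncertain inputs are delegated to later stages."""
    spent = 0.0
    label = None
    for stage in stages:
        spent += stage.cost
        label, confidence = stage.classify(x)
        if confidence >= threshold:   # confident enough: stop early
            break
    return label, spent
```

Raising or lowering `threshold` shifts the accuracy-effort balance without retraining, which is one way the post-training control asked about above could be realized.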

Relevance: 90.00%

Abstract:

Bioremediation, the exploitation of the intrinsic ability of environmental microbes to degrade and remove harmful compounds from nature, is considered an environmentally sustainable and cost-effective means of environmental clean-up. However, a comprehensive understanding of the biodegradation potential of microbial communities and of their response to decontamination measures is required for the effective management of bioremediation processes. In this thesis, the potential of hydrocarbon-degradative genes as indicators of aerobic hydrocarbon biodegradation was investigated. Small-scale functional gene macro- and microarrays targeting aliphatic, monoaromatic and low-molecular-weight polyaromatic hydrocarbon biodegradation were developed in order to monitor the biodegradation of mixtures of hydrocarbons simultaneously. The validity of the array analysis in monitoring hydrocarbon biodegradation was evaluated in microcosm studies and in field-scale bioremediation processes by comparing the hybridization signal intensities to hydrocarbon mineralization, real-time polymerase chain reaction (PCR), dot blot hybridization, and both chemical and microbiological monitoring data. The results obtained by real-time PCR, dot blot hybridization and gene array analysis were in good agreement with hydrocarbon biodegradation in laboratory-scale microcosms, and the mineralization of several hydrocarbons could be monitored simultaneously using gene array analysis. In the field-scale bioremediation processes, the detection and enumeration of hydrocarbon-degradative genes provided important additional information for process optimization and design. In creosote-contaminated groundwater, gene array analysis demonstrated that the aerobic biodegradation potential present at the site, but restrained under the oxygen-limited conditions, could be successfully stimulated with aeration and nutrient infiltration. During the ex situ bioremediation of diesel oil- and lubrication oil-contaminated soil, the functional gene array analysis revealed inefficient hydrocarbon biodegradation caused by poor aeration during composting. The functional gene array specifically detected the upper and lower biodegradation pathways required for the complete mineralization of hydrocarbons, and bacteria representing 1% of the microbial community could be detected without prior PCR amplification. Molecular biological monitoring methods based on functional genes provide powerful tools for the development of more efficient remediation processes, and the parallel detection of several functional genes using functional gene array analysis is an especially promising tool for monitoring the biodegradation of mixtures of hydrocarbons.

Relevance: 90.00%

Abstract:

The literature review elucidates the mechanism of oxidation in proteins and amino acids, gives an overview of the detection and analysis of protein oxidation products, and provides information about β-lactoglobulin and studies carried out on modifications of this protein under certain conditions. The experimental research included the fractionation of the tryptic peptides of β-lactoglobulin using preparative HPLC-MS and monitoring of the oxidation process of these peptides via reversed-phase HPLC-UV. The peptides to be oxidized were selected with respect to their content of amino acids susceptible to oxidation and were fractionated according to their m/z values. These peptides were IPAVFK (m/z 674), ALPMHIR (m/z 838), LIVTQTMK (m/z 934) and VLVLDTDYK (m/z 1066). Even though it was not possible to isolate the target peptides completely, due to co-elution of various fractions, the percentages of the target peptides in the samples were sufficient to carry out the oxidation procedure. The IPAVFK and VLVLDTDYK fractions were found to yield the oxidation products reviewed in the literature; however, unoxidized peptides were still present in high amounts after 21 days of oxidation. The UV data at 260 and 280 nm enabled monitoring of both the main peptides and the oxidation products, owing to the absorbance of the aromatic side-chains these peptides possess. The ALPMHIR and LIVTQTMK fractions were oxidatively consumed rapidly, and oxidation products of these peptides were observed even on day 0. The high rates of depletion of these peptides were attributed to the presence of His (H) and the sulfur-containing side-chain of Met (M). In conclusion, the selected peptides hold potential to be utilized as marker peptides in β-lactoglobulin oxidation.
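
The reported m/z values can be sanity-checked from standard residue masses: a tryptic peptide's singly protonated mass is the sum of its monoisotopic residue masses plus water plus a proton. The short sketch below reproduces the four values above to within about 1 Da (the quoted figures appear to be rounded nominal masses).

```python
# Sanity check of the reported m/z values from standard monoisotopic
# amino acid residue masses (Da): [M+H]+ = sum(residues) + H2O + H+.
MONO = {
    "A": 71.03711, "D": 115.02694, "F": 147.06841, "H": 137.05891,
    "I": 113.08406, "K": 128.09496, "L": 113.08406, "M": 131.04049,
    "P": 97.05276, "Q": 128.05858, "R": 156.10111, "T": 101.04768,
    "V": 99.06841, "Y": 163.06333,
}
WATER, PROTON = 18.01056, 1.00728

def mh_plus(peptide: str) -> float:
    """Monoisotopic [M+H]+ of a peptide given in one-letter code."""
    return sum(MONO[aa] for aa in peptide) + WATER + PROTON

for pep in ("IPAVFK", "ALPMHIR", "LIVTQTMK", "VLVLDTDYK"):
    print(pep, round(mh_plus(pep), 2))   # ~674.4, 837.5, 933.5, 1065.6
```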

Relevance: 90.00%

Abstract:

The occurrence of occupational chronic solvent encephalopathy (CSE) appears to be decreasing, but new cases are still revealed every year. To prevent CSE and the early retirement of solvent-exposed workers, actions should focus on early CSE detection and diagnosis, and identifying the work tasks and solvent exposures associated with a high risk of CSE is crucial. Clinical and exposure data on all 128 cases diagnosed with CSE as an occupational disease in Finland during 1995-2007 were collected from the patient records at the Finnish Institute of Occupational Health (FIOH) in Helsinki. Data on the number of exposed workers in Finland were gathered from the Finnish Job-Exposure Matrix (FINJEM), and the number of employed persons from the national workforce survey. We analyzed the work tasks and solvent exposure of the CSE patients and the findings in brain magnetic resonance imaging (MRI), quantitative electroencephalography (QEEG), and event-related potentials (ERP). The annual number of new cases diminished from 18 to 3, and the incidence of CSE decreased from 8.6 to 1.2 per million employed per year. The incidence of CSE was highest in workers whose main exposure was to aromatic hydrocarbons; during 1995-2006 this incidence decreased from 1.2 to 0.3 per 1,000 exposed workers per year. The work tasks with the highest incidence of CSE were floor laying and lacquering, wooden surface finishing, and industrial, metal, or car painting. Among 71 CSE patients, brain MRI revealed atrophy or white matter hyperintensities or both in 38% of the cases. Atrophy, which was associated with the duration of exposure, was most frequently located in the cerebellum and in the frontal or parietal brain areas. QEEG in a group of 47 patients revealed increased power of the theta band in the frontal brain area. In a group of 86 patients, the P300 amplitude of the auditory ERP was decreased, but at the individual level all the amplitude values were classified as normal. In 11 CSE patients and 13 age-matched controls, ERPs elicited by a multimodal paradigm, including an auditory, a visual detection, and a recognition memory task under single- and dual-task conditions, corroborated the decrease of the auditory P300 amplitude in CSE patients in the single-task condition. In the dual-task conditions, the auditory P300 component was unrecognizable more often in patients than in controls. Due to the paucity and non-specificity of the findings, brain MRI serves mainly for differential diagnostics in CSE. QEEG and the auditory P300 are insensitive at the individual level and not useful in the clinical diagnostics of CSE. A multimodal ERP paradigm may, however, provide a more sensitive method for diagnosing slight cognitive disturbances such as those in CSE.