22 results for "Detection and representation"


Relevance: 90.00%

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is twofold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. Regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
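The accuracy-versus-effort control described above can be sketched as a cascade of classifiers with a tunable confidence threshold: cheap classifiers handle easy inputs, and only low-confidence inputs are delegated to more expensive stages. The function, stages, and thresholds below are illustrative assumptions, not the framework of the thesis itself.

```python
# Sketch of confidence-based delegation in a classifier cascade.
# All classifiers, costs, and thresholds are illustrative assumptions.

def cascade_predict(x, stages, threshold):
    """Run increasingly expensive classifiers; stop when confident.

    stages: list of (classify, cost), where classify(x) returns
            (label, confidence in [0, 1]).
    threshold: delegation rule -- raising it trades effort for
               accuracy, and can be tuned after training.
    Returns (label, total_effort_spent).
    """
    effort = 0
    label, conf = None, 0.0
    for classify, cost in stages:
        effort += cost
        label, conf = classify(x)
        if conf >= threshold:      # confident enough: stop early,
            break                  # avoiding unnecessary computation
    return label, effort

# Toy stages: a cheap, rough classifier and a costly, accurate one.
cheap = (lambda x: (("big" if x > 10 else "small"), 0.6), 1)
costly = (lambda x: (("big" if x > 7 else "small"), 0.95), 10)

# Low threshold: the cheap stage suffices (effort 1).
print(cascade_predict(9, [cheap, costly], threshold=0.5))
# High threshold: the input is delegated to the costly stage (effort 11).
print(cascade_predict(9, [cheap, costly], threshold=0.9))
```

Because the threshold is a parameter of the delegation rule rather than of the trained classifiers, the same cascade can be re-tuned to a different accuracy/effort operating point after training.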

Relevance: 90.00%

Abstract:

Bioremediation, which is the exploitation of the intrinsic ability of environmental microbes to degrade and remove harmful compounds from nature, is considered to be an environmentally sustainable and cost-effective means of environmental clean-up. However, a comprehensive understanding of the biodegradation potential of microbial communities and their response to decontamination measures is required for the effective management of bioremediation processes. In this thesis, the potential to use hydrocarbon-degradative genes as indicators of aerobic hydrocarbon biodegradation was investigated. Small-scale functional gene macro- and microarrays targeting aliphatic, monoaromatic and low-molecular-weight polyaromatic hydrocarbon biodegradation were developed in order to simultaneously monitor the biodegradation of mixtures of hydrocarbons. The validity of the array analysis in monitoring hydrocarbon biodegradation was evaluated in microcosm studies and field-scale bioremediation processes by comparing the hybridization signal intensities to hydrocarbon mineralization, real-time polymerase chain reaction (PCR), dot blot hybridization and both chemical and microbiological monitoring data. The results obtained by real-time PCR, dot blot hybridization and gene array analysis were in good agreement with hydrocarbon biodegradation in laboratory-scale microcosms. Mineralization of several hydrocarbons could be monitored simultaneously using gene array analysis. In the field-scale bioremediation processes, the detection and enumeration of hydrocarbon-degradative genes provided important additional information for process optimization and design. In creosote-contaminated groundwater, gene array analysis demonstrated that the aerobic biodegradation potential present at the site, though restrained under oxygen-limited conditions, could be successfully stimulated with aeration and nutrient infiltration.
During ex situ bioremediation of diesel oil- and lubrication oil-contaminated soil, the functional gene array analysis revealed inefficient hydrocarbon biodegradation, caused by poor aeration during composting. The functional gene array specifically detected the upper and lower biodegradation pathways required for complete mineralization of hydrocarbons. Bacteria representing 1% of the microbial community could be detected without prior PCR amplification. Molecular biological monitoring methods based on functional genes provide powerful tools for the development of more efficient remediation processes. The parallel detection of several functional genes using functional gene array analysis is an especially promising tool for monitoring the biodegradation of mixtures of hydrocarbons.

Relevance: 90.00%

Abstract:

The literature review elucidates the mechanism of oxidation in proteins and amino acids and gives an overview of the detection and analysis of protein oxidation products, as well as information about β-lactoglobulin and studies on modifications of this protein under certain conditions. The experimental research included the fractionation of the tryptic peptides of β-lactoglobulin using preparative HPLC-MS and the monitoring of the oxidation of these peptides by reversed-phase HPLC-UV. The peptides to be oxidized were selected for their content of oxidation-susceptible amino acids and fractionated according to their m/z values. These peptides were IPAVFK (m/z 674), ALPMHIR (m/z 838), LIVTQTMK (m/z 934) and VLVLDTDYK (m/z 1066). Even though it was not possible to isolate the target peptides alone, owing to co-elution of various fractions, the proportions of target peptides in the samples were sufficient to carry out the oxidation procedure. The IPAVFK and VLVLDTDYK fractions yielded the oxidation products reviewed in the literature; however, unoxidized peptides were still present in high amounts after 21 days of oxidation. The UV data at 260 and 280 nm enabled monitoring of both the main peptides and the oxidation products, owing to the absorbance of the aromatic side-chains these peptides possess. The ALPMHIR and LIVTQTMK fractions were oxidatively consumed rapidly, and their oxidation products were observed even on day 0. The high depletion rates of these peptides were attributed to the presence of His (H) and the sulfur-containing side-chain of Met (M). In conclusion, the selected peptides hold potential as marker peptides of β-lactoglobulin oxidation.

Relevance: 90.00%

Abstract:

The occurrence of occupational chronic solvent encephalopathy (CSE) appears to be decreasing, but new cases still emerge every year. To prevent CSE and the early retirement of solvent-exposed workers, actions should focus on early detection and diagnosis of CSE, and identifying the work tasks and solvent exposures associated with a high risk of CSE is crucial. Clinical and exposure data of all 128 cases diagnosed with CSE as an occupational disease in Finland during 1995-2007 were collected from the patient records at the Finnish Institute of Occupational Health (FIOH) in Helsinki. Data on the number of exposed workers in Finland were gathered from the Finnish Job-Exposure Matrix (FINJEM), and the number of employed from the national workforce survey. We analyzed the work tasks and solvent exposure of CSE patients and the findings in brain magnetic resonance imaging (MRI), quantitative electroencephalography (QEEG), and event-related potentials (ERP). The annual number of new cases diminished from 18 to 3, and the incidence of CSE decreased from 8.6 to 1.2 per million employed per year. The incidence of CSE was highest in workers whose main exposure was to aromatic hydrocarbons; during 1995-2006 this incidence decreased from 1.2 to 0.3 per 1,000 exposed workers per year. The work tasks with the highest incidence of CSE were floor laying and lacquering, wooden surface finishing, and industrial, metal, and car painting. Among 71 CSE patients, brain MRI revealed atrophy, white matter hyperintensities, or both in 38% of the cases. Atrophy, which was associated with the duration of exposure, was most frequently located in the cerebellum and in the frontal or parietal brain areas. QEEG in a group of 47 patients revealed increased power of the theta band in the frontal brain area. In a group of 86 patients, the P300 amplitude of the auditory ERP was decreased, but at the individual level all amplitude values were classified as normal.
In 11 CSE patients and 13 age-matched controls, ERPs elicited by a multimodal paradigm, which included an auditory task, a visual detection task, and a recognition memory task under single- and dual-task conditions, corroborated the decrease of auditory P300 amplitude in CSE patients in the single-task condition. In dual-task conditions, the auditory P300 component was unrecognizable more often in patients than in controls. Owing to the paucity and non-specificity of the findings, brain MRI serves mainly for differential diagnostics in CSE. QEEG and auditory P300 are insensitive at the individual level and not useful in the clinical diagnostics of CSE. A multimodal ERP paradigm may, however, provide a more sensitive method for diagnosing slight cognitive disturbances such as those in CSE.

Relevance: 90.00%

Abstract:

This thesis treats Githa Hariharan's first three novels, The Thousand Faces of Night (1992), The Ghosts of Vasu Master (1994) and When Dreams Travel (1999), as a trilogy in which Hariharan studies the effects of storytelling from different perspectives. The thesis examines the relationship between embedded storytelling and the primary narrative level, and the impact of character-bound storytelling on personal development and interpersonal relationships. Thus, I assume that an analysis of the instabilities between characters, and of the tensions between sets of values introduced through storytelling, displays the development of the central characters in the novels. My focus is on the tensions between different sets of values and knowledge systems and their impact on gender negotiations. The tensions are articulated through resistance and/or adherence to a cultural narrative, that is, a formula which underlies specific narratives. Conveyed or disputed by embedded storytelling, the cultural narrative circumscribes what it means to be gendered in Hariharan's novels. The analysis centres on how the narratee in The Thousand Faces of Night and the storyteller in The Ghosts of Vasu Master relate to and are affected by the storytelling, and how they perceive their gendered positions. The analysis of the third novel, When Dreams Travel, focuses on storytelling and its relationship to power and representation. I argue that Hariharan's use of embedded storytelling is a way to renegotiate and even reconceptualise gender. Hariharan's primary concern in all three novels is the tension between tradition, or repetition, and change, and how it affects gender. Although the novels feature ancient mythical heroes, mice and flies, or are set in a fantasy world of jinnis and ghouls, they are, I argue, deeply concerned with contemporary issues.
Whereas the first novel questions the demands set by family and society on the individual, the second strives to articulate an ethical relationship between the self and the other. The third novel examines the relationship between representation and power. These issues could not be more contemporary, and they loom large over both the regional and global arenas.

Relevance: 90.00%

Abstract:

The relationship between the Orthodox Churches and the World Council of Churches (WCC) developed into a crisis just before the 8th Assembly of the WCC in Harare, Zimbabwe in 1998. The Special Commission on Orthodox Participation in the WCC (SC), inaugurated in Harare, worked during the period 1999-2002 to solve the crisis and to secure Orthodox participation in the WCC. The purpose of this study is: 1) to clarify the theological motives for the inauguration of the SC and the theological argumentation of the Orthodox criticism; 2) to write a reliable history and analysis of the SC; 3) to outline the theological argumentation that structures the debate; and 4) to investigate the ecclesiological questions that arise from the SC material. The study spans the years 1998 to 2006, from the WCC Harare Assembly to the Porto Alegre Assembly. Hence, the initiation and immediate reception of the Special Commission are included in the study. The sources of this study are all the material produced by and for the SC. The method employed is systematic analysis. The focus of the study is on theological argumentation; the historical context and political motives that played a part in Orthodox-WCC relations are not discussed in detail. The study shows how the initial, specific and individual Orthodox concerns developed into a profound ecclesiological discussion and also led to concrete changes in WCC practices, the best known of which is the change to decision-making by consensus. The Final Report of the SC contains five main themes, namely ecclesiology, decision-making, worship/common prayer, membership and representation, and social and ethical issues. The main achievement of the SC was that it secured Orthodox membership in the WCC. The ecclesiological conclusions made in the Final Report are twofold. On the one hand, it confirms that the very act of belonging to the WCC means a commitment to discuss the relationship between a church and churches.
The SC recommended that baptism be added as a criterion for membership in the WCC and that the member churches continue to work towards the mutual recognition of each other's baptism. These elements strengthen the ecclesiological character of the WCC. On the other hand, when the Final Report discusses common prayer, the ecclesiological conclusions are much more cautious, and the ecclesiological neutrality of the WCC is emphasized several times. The SC repeatedly emphasized that the WCC is a fellowship of churches. The concept of koinonia, which has otherwise been important in recent ecclesiological questions, was not much applied by the SC. A comparison of the results of the SC with parallel ecclesiological documents of the WCC (Nature and Mission of the Church, Called to Be the One Church) shows that they all acknowledge the different ecclesiological starting points of the member churches and, following from that, a variety of legitimate views on the relation of the Church to the churches. Despite the shift from preserving koinonia to promises of eschatological koinonia, all the documents affirm that the goal of the ecumenical movement is still full, visible unity.

Relevance: 90.00%

Abstract:

Microbes in natural and artificial environments, as well as in the human body, are a key part of the functional properties of these complex systems. The presence or absence of certain microbial taxa is a correlate of functional status, such as the risk of disease or the course of the metabolic processes of a microbial community. As microbes are highly diverse and mostly not cultivable, molecular markers such as gene sequences are a potential basis for the detection and identification of key types. The goal of this thesis was to study molecular methods for the identification of microbial DNA in order to develop a tool for the analysis of environmental and clinical DNA samples. Particular emphasis was placed on specificity of detection, which is a major challenge when analyzing complex microbial communities. The approach taken in this study was the application and optimization of enzymatic ligation of DNA probes coupled with microarray read-out for high-throughput microbial profiling. The results show that fungal phylotypes and human papillomavirus genotypes could be accurately identified from pools of PCR amplicons generated from purified sample DNA. Approximately 1 ng/μl of sample DNA was needed for representative PCR amplification, as measured by comparisons between clone sequencing and microarray. A minimum of 0.25 amol/μl of PCR amplicons was detectable amongst 5 ng/μl of background DNA, suggesting that the detection limit of the test, comprising a ligation reaction followed by microarray read-out, was approximately 0.04%. Detection from sample DNA directly was shown to be feasible with probes that form a circular molecule upon ligation, followed by PCR amplification of the probe. In this approach, the minimum detectable relative amount of target genome was found to be 1% of all genomes in the sample, as estimated from 454 deep-sequencing results.
The signal-to-noise ratio of contact-printed microarrays could be improved by using an internal microarray hybridization control oligonucleotide probe together with a computational algorithm. The algorithm was based on the identification of a bias in the microarray data and correction of that bias, as shown with simulated and real data. The results further suggest that semiquantitative detection is possible by ligation detection, allowing estimation of target abundance in a sample. In practice, however, comprehensive sequence information on full-length rRNA genes is needed to support probe design for complex samples. This study shows that the DNA microarray has the potential to serve as an accurate microbial diagnostic platform that takes advantage of increasing sequence data and replaces traditional, less efficient methods that still dominate routine testing in laboratories. The data suggest that a ligation-reaction-based microarray assay can be optimized to a degree that allows good signal-to-noise and semiquantitative detection.
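The general idea of using an internal hybridization control to correct array-to-array intensity bias can be sketched as follows. The normalization rule, probe names, and intensity values below are illustrative assumptions, not the actual algorithm or data of the thesis.

```python
# Sketch of bias correction via an internal hybridization control:
# each array is scaled so that its control probe reads 1.0, so
# systematic array-to-array intensity bias cancels before probes are
# compared. All names and numbers are illustrative assumptions.

def correct_bias(arrays, control="CTRL"):
    """Scale each array's spot intensities by its control probe."""
    corrected = []
    for spots in arrays:                 # spots: {probe_name: intensity}
        scale = spots[control]           # per-array bias estimate
        corrected.append({p: i / scale for p, i in spots.items()})
    return corrected

# Two arrays of the same sample with a 2x overall intensity bias.
raw = [
    {"CTRL": 100.0, "fungalA": 50.0, "fungalB": 10.0},
    {"CTRL": 200.0, "fungalA": 100.0, "fungalB": 20.0},
]
norm = correct_bias(raw)
# After correction, the biological probes agree across arrays.
print(norm[0]["fungalA"], norm[1]["fungalA"])  # 0.5 0.5
```

Because the corrected intensities are expressed relative to a common control, they can also support the kind of semiquantitative abundance estimates mentioned above, within whatever dynamic range the hybridization chemistry allows.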