917 results for post-processing method
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and its ability to uniquely identify and distinguish individuals with a high degree of certainty. This demand, however, has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples in which the evidence contains more than one contributor. Additional processing to separate the different cell types, needed to simplify the final data interpretation, further adds to the already cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Typically, less than 10% of the male DNA is recovered using the standard extraction protocol for rape kits; almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual intervention and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.
Abstract:
This paper provides an introduction to issues surrounding the participation rights of young people in research, the implications of their growing involvement in research, and the ethical implications related to consent. The unique contribution of this paper is that it considers children’s rights in respect of the increasing opportunities for young people to take part in evaluation research. The aim of this paper, therefore, is first to acknowledge the growing involvement of young people in research and the implications of ensuring that their rights of participation are respected. Secondly, we consider children’s rights legislation and our obligations as researchers to implement it. Finally, we explore consent as an issue in its own right, as well as the practicalities of accessing participants. This paper postulates that any research about young people should involve and prioritise them at all stages of the research process, including participation in decision-making. We conclude by identifying five key principles which we believe can help to facilitate the fulfilment of post-primary pupils’ ability to consent to participate in trials and evaluative research.
Abstract:
In settings of intergroup conflict, identifying contextually relevant risk factors for youth development is an important task. In Vukovar, Croatia, a city devastated during the war in the former Yugoslavia, ethno-political tensions remain. The current study utilized a mixed-method approach to identify two salient community-level risk factors (ethnic tension and general antisocial behavior) and related emotional insecurity responses (ethnic and non-ethnic insecurity) among youth in Vukovar. In Study 1, focus group discussions (N=66) with mothers, fathers, and adolescents 11 to 15 years old were analyzed using the Constant Comparative Method, revealing two types of risk and insecurity responses. In Study 2, youth (N=227, 58% male, M=15.88, SD=1.12 years old) responded to quantitative scales developed from the focus groups; discriminant validity was demonstrated and path analyses established predictive validity between each type of risk and insecurity. First, community ethnic tension (i.e., threats related to war/ethnic identity) significantly predicted ethnic insecurity for all youth (β=.41, p<.001). Second, experience with community antisocial behavior (i.e., general crime found in any context) predicted non-ethnic community insecurity for girls (β=.32, p<.05), but not for boys. These findings are the first to show multiple forms of emotional insecurity at the community level; implications for future research are discussed.
Abstract:
Amphibian skin secretions are unique sources of bioactive molecules, particularly bioactive peptides. In this study, the skin secretion of the white-lipped tree frog (Litoria infrafrenata) was obtained to identify peptides with putative therapeutic potential. By utilizing skin secretion-derived mRNA, a cDNA library was constructed, a frenatin gene was cloned and its encoded peptides were deduced and confirmed using RP-HPLC, MALDI-TOF and MS/MS. The deduced peptides were identified as frenatin 4.1 (GFLEKLKTGAKDFASAFVNSIKGT) and a post-translationally modified peptide, frenatin 4.2 (GFLEKLKTGAKDFASAFVNSIK.NH2). Antimicrobial activity of the peptides was assessed by determining their minimal inhibitory concentrations (MICs) using standard model microorganisms. Through studying structure–activity relationships, analogues of the two peptides were designed, resulting in the synthesis of frenatin 4.1a (GFLEKLKKGAKDFASALVNSIKGT) and frenatin 4.2a (GFLLKLKLGAKLFASAFVNSIK.NH2). Both analogues exhibited improved antimicrobial activities, especially frenatin 4.2a, which displayed significantly enhanced broad-spectrum antimicrobial efficacy. The peptide modifications applied in this study may provide new ideas for the generation of leads for the design of antimicrobial peptides with therapeutic applications.
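As a side illustration of the sequence comparison implied above, a minimal sketch (not from the study) that locates the residue substitutions between the reported parent peptide frenatin 4.1 and its analogue frenatin 4.1a:

```python
def substitutions(parent, analogue):
    """Return 1-based positions where two equal-length peptide sequences differ."""
    assert len(parent) == len(analogue)
    return [(i + 1, a, b)
            for i, (a, b) in enumerate(zip(parent, analogue))
            if a != b]

# Sequences as reported in the abstract.
frenatin_4_1  = "GFLEKLKTGAKDFASAFVNSIKGT"
frenatin_4_1a = "GFLEKLKKGAKDFASALVNSIKGT"
print(substitutions(frenatin_4_1, frenatin_4_1a))  # [(8, 'T', 'K'), (17, 'F', 'L')]
```

The comparison recovers the two point substitutions (Thr8→Lys and Phe17→Leu) that distinguish the designed analogue from the natural peptide.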
Abstract:
In the casting of metals, tundish flow, welding, converters, and other metal processing applications, the behaviour of the fluid surface is important. In aluminium alloys, for example, oxides formed on the surface may be drawn into the body of the melt, where they act as faults in the solidified product and affect cast quality. For this reason, wave behaviour, air entrapment, and other effects need to be described accurately in the presence of heat transfer and possibly phase change. The authors have developed a single-phase algorithm for modelling this problem. The Scalar Equation Algorithm (SEA; see Refs. 1 and 2) enables the transport of the property discontinuity representing the free surface through a fixed grid. An extension of this method to unstructured mesh codes is presented here, together with validation. The new method employs a TVD flux limiter in conjunction with a ray-tracing algorithm to ensure a sharply bounded interface. Applications of the method are in the filling and emptying of mould cavities, with heat transfer and phase change.
Abstract:
This study examined the properties of ERP effects elicited by unattended (spatially uncued) objects using a short-lag repetition-priming paradigm. Same or different common objects were presented in a yoked prime-probe trial either as intact images or slightly scrambled (half-split) versions. Behaviourally, only objects in a familiar (intact) view showed priming. An enhanced negativity was observed at parietal and occipito-parietal electrode sites within the time window of the posterior N250 after the repetition of intact, but not split, images. An additional post-hoc N2pc analysis of the prime display indicated that this result could not be attributed to differences in salience between familiar intact and split views. These results demonstrate that spatially unattended objects undergo visual processing, but only if shown in familiar views, indicating a role of holistic processing of objects that is independent of attention.
Abstract:
Background For decades film has proved to be a powerful form of communication. Whether produced as entertainment, art or documentary, films have the capacity to inform and move us. Film is a highly attractive teaching instrument and an appropriate teaching method in health education. It is a valuable tool for studying situations of the greatest significance to human beings, such as pain, disease and death. Objectives The objectives were to determine how this helps students engage with their role as health care professionals; to determine how they view the personal experience of illness, disease, disability or death; and to determine how this may impact upon their provision of patient care. Design, Setting and Participants The project was underpinned by a film selection determined by careful review, intensive scrutiny, contemplation and discourse by the research team. Seven films were selected, ranging across animation, foreign, documentary, biopic and Hollywood drama. Each film was shown discretely, in an acoustic lecture theatre projected onto a large screen, to pre-registration student nurses (adult, child and mental health) across each year of study from different cohorts (n = 49). Method A mixed qualitative method approach consisted of audio-recorded 5-minute reactions post film screening; coded questionnaires; and a focus group. Findings were drawn from the impact of the films through thematic analysis of data sets and subjective text condensation, categorised as: new insights looking through patient eyes; evoking emotion in student nurses; spiritual care; going to the movies to learn about the patient experience; self-discovery through films; using films to link theory to practice. Results Deeper learning through film as a powerful medium was identified in meeting the objectives of the study. Integration of film into the pre-registration curriculum, pedagogy, teaching and learning is recommended.
Conclusion The teaching potential of film stems from the visual process linked to human emotion and experience. Its impact has the power not only to help in learning the values that underpin nursing, but also to foster respect for the patient experience of disease, disability, death and its reality.
Abstract:
Introduction Compounds exhibiting antioxidant activity have received much interest in the food industry because of their potential health benefits. Carotenoids such as lycopene, which in the human diet derives mainly from tomatoes (Solanum lycopersicum), have attracted much attention in this respect, and the study of their extraction, processing and storage procedures is of importance. Optical techniques potentially offer advantageous non-invasive and specific methods to monitor them. Objectives To obtain both fluorescence and Raman information to ascertain whether ultrasound-assisted extraction from tomato pulp has a detrimental effect on lycopene. Method Use of time-resolved fluorescence spectroscopy to monitor carotenoids in a hexane extract obtained from tomato pulp with application of ultrasound treatment (583 kHz). The resultant spectra were a combination of scattering and fluorescence. Because of their different timescales, decay-associated spectra could be used to separate the fluorescence and Raman information. This simultaneous acquisition of two complementary techniques was coupled with a very high time-resolution measurement of the lycopene fluorescence lifetime. Results Spectroscopic data showed the presence of phytofluene and chlorophyll in addition to lycopene in the tomato extract. The time-resolved spectral measurement containing both fluorescence and Raman data, coupled with high-resolution time-resolved measurements, in which a lifetime of ~5 ps was attributed to lycopene, indicated that lycopene was unaltered by the ultrasound treatment. Detrimental changes were, however, observed in both the chlorophyll and phytofluene contributions. Conclusion Extracted lycopene appeared unaffected by ultrasound treatment, while other constituents (chlorophyll and phytofluene) were degraded.
Abstract:
Current hearing-assistive technology performs poorly in noisy multi-talker conditions. The goal of this thesis was to establish the feasibility of using EEG to guide acoustic processing in such conditions. To attain this goal, this research developed a model via the constructive research method, relying on a literature review. Several approaches have yielded improvements in the performance of hearing-assistive devices under multi-talker conditions, namely beamforming spatial filtering, model-based sparse coding shrinkage, and onset enhancement of the speech signal. Prior research has shown that electroencephalography (EEG) signals contain information about whether a person is actively listening, what the listener is listening to, and where the attended sound source is. This thesis constructed a model for using EEG information to control beamforming, model-based sparse coding shrinkage, and onset enhancement of the speech signal. The purpose of this model is to propose a framework for using EEG signals to control sound processing so as to select a single talker in a noisy environment containing multiple talkers speaking simultaneously. On a theoretical level, the model showed that EEG can control acoustical processing. An analysis of the model identified a requirement for real-time processing and showed that the model inherits the computationally intensive properties of acoustical processing, although the model itself is of low complexity and places a relatively small load on computational resources. A research priority is to develop a prototype that controls hearing-assistive devices with EEG. This thesis concludes by highlighting challenges for future research.
Abstract:
In this master’s thesis, I examine the development of writer-characters and metafiction from John Irving’s The World According to Garp to Last Night in Twisted River, and how this development relates to the shift from late twentieth-century postmodern literary theory to twenty-first-century post-postmodern literary theory. The purpose of my study is to determine how the prominently postmodern feature of metafiction, created through the writer-character’s stories-within-stories, has changed in form and function in the two novels, published thirty years apart, and what features this may indicate for future post-postmodern theory. I establish my theoretical framework on the development of metafiction largely on late twentieth-century models of author and authorship as discussed by Roland Barthes, Wayne Booth and Michel Foucault. I base my close analysis of metafiction mostly on Linda Hutcheon’s model of overt and covert metafiction. At the end of my study, I examine Irving’s later novel through Suzanne Rohr’s models of reality constitution and fictional reality. The analysis of the two novels focuses on excerpts that feature the writer-characters, their stories-within-stories, the novels’ other characters, and the narrators’ evaluations of these. I draw examples from both novels, but I explain my choice of focus at the beginning of each section. Through this, I establish a method of analysis that best illustrates the development as a continuum from pre-existing postmodern models and theories to the formation of new post-postmodern theory. Based on my findings, the thesis argues that twenty-first-century literary theory has moved away from the postmodern overt deconstruction of the narrative and its meaning. New post-postmodern literary theory reacquires the previously deconstructed boundaries that define reality and truth and re-establishes them as having intrinsic value that cannot be disputed.
In establishing fictional reality as self-governing and non-intrudable, post-postmodern theory takes a stance against postmodern nihilism, which underlines the re-founded, unquestionable value of the text’s reality. To continue mapping other possible features of future post-postmodern theory, I recommend further analysis focused solely on John Irving’s novels published in the twenty-first century.
Abstract:
In this paper, we demonstrate a digital signal processing (DSP) algorithm for improving the spatial resolution of images captured by CMOS cameras. The basic approach is to reconstruct a high-resolution (HR) image from a shift-related low-resolution (LR) image sequence. The aliasing relationship between the Fourier transforms of discrete and continuous images in the frequency domain is used for mapping the LR images to an HR image. The method of projection onto convex sets (POCS) is applied to trace the best estimate of pixel matching from the LR images to the reconstructed HR image. Computer simulations and preliminary experimental results have shown that the algorithm works effectively for post-capture processing in CMOS cameras. It can also be applied to HR digital image reconstruction where the shift information of the LR image sequence is known.
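The reconstruction idea can be illustrated with a toy POCS-style sketch. This is not the paper's frequency-domain algorithm: it assumes an idealized decimation-by-2 camera model with known integer sub-pixel shifts, so that each LR frame imposes the consistency constraint `HR[dy::2, dx::2] == LR`, and projecting onto each constraint set in turn fills in the HR grid exactly.

```python
import numpy as np

def reconstruct_hr(lr_frames, shifts, factor=2):
    """Toy POCS-style reconstruction: each LR frame defines the convex
    constraint set {HR : HR[dy::factor, dx::factor] == LR}; repeatedly
    projecting the estimate onto each set recovers the HR image
    (exactly, when the shifts cover all sampling phases)."""
    h, w = lr_frames[0].shape
    hr = np.zeros((h * factor, w * factor))
    for _ in range(3):  # a few POCS sweeps; one suffices in this ideal case
        for lr, (dy, dx) in zip(lr_frames, shifts):
            hr[dy::factor, dx::factor] = lr  # projection onto the constraint set
    return hr

# Usage: simulate four shifted LR captures of a known HR scene, then recover it.
rng = np.random.default_rng(0)
truth = rng.random((8, 8))
shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
frames = [truth[dy::2, dx::2] for dy, dx in shifts]
print(np.allclose(reconstruct_hr(frames, shifts), truth))  # True
```

In realistic settings the shifts are non-integer and the frames are blurred and noisy, so the projections only approximate the constraints and the POCS iteration converges to an estimate rather than the exact scene.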
Abstract:
The handling and processing of fish in Uganda has until recently been carried out exclusively by artisanal fishermen and fish processors. Their operations have left much to be desired, as the product is often of low quality and its keeping time is limited. Fresh fish has been handled without refrigeration, but with the recent establishment of commercial fish processing plants a cold chain of fish distribution is being set up for the domestic and export markets. Some of the fishermen are beginning to ice their catch immediately after reaching the shore. It is hoped that fishmongers will increasingly find it more profitable to market their products iced. This will make fish available to a larger sector of the population, and in the process post-harvest losses will be reduced.
Abstract:
We present a detailed analysis of the application of a multi-scale Hierarchical Reconstruction method to solving a family of ill-posed linear inverse problems. When the observations of the unknown quantity of interest and the observation operators are known, these inverse problems are concerned with the recovery of the unknown from its observations. Although the observation operators we consider are linear, they are inevitably ill-posed in various ways. We recall in this context the classical Tikhonov regularization method with a stabilizing function that targets the specific ill-posedness of the observation operators and preserves desired features of the unknown. Having studied the mechanism of Tikhonov regularization, we propose a multi-scale generalization of the Tikhonov regularization method, the so-called Hierarchical Reconstruction (HR) method. The first introduction of the HR method can be traced back to the Hierarchical Decomposition method in image processing. The HR method successively extracts information from the previous hierarchical residual into the current hierarchical term at a finer hierarchical scale. As the sum of all the hierarchical terms, the hierarchical sum from the HR method provides a reasonable approximate solution to the unknown when the observation matrix satisfies certain conditions with specific stabilizing functions. Compared to the Tikhonov regularization method on the same inverse problems, the HR method is shown to decrease the total number of iterations, reduce the approximation error, and offer control of the approximation distance between the hierarchical sum and the unknown, thanks to its use of a ladder of finitely many hierarchical scales. We report numerical experiments supporting our claims about these advantages of the HR method over the Tikhonov regularization method.
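For readers unfamiliar with the two schemes being compared, a schematic sketch in standard notation (the dyadic scale schedule shown is the common choice in the hierarchical decomposition literature, not necessarily the one used in this work):

```latex
% Tikhonov regularization: data fidelity plus a stabilizing functional J
% weighted by a single parameter \lambda.
x_\lambda = \arg\min_x \, \|Ax - b\|^2 + \lambda \, J(x)

% Hierarchical Reconstruction: refine the scale (e.g. \lambda_j = 2^{-j}\lambda_0)
% and fit each new term to the residual left by the previous level:
u_0 = \arg\min_u \, \|Au - b\|^2 + \lambda_0 J(u), \qquad r_0 = b - A u_0
u_{j+1} = \arg\min_u \, \|Au - r_j\|^2 + \lambda_{j+1} J(u), \qquad r_{j+1} = r_j - A u_{j+1}

% The hierarchical sum approximates the unknown:
x \approx \sum_{j=0}^{J} u_j
```

The sketch makes the key difference visible: Tikhonov commits to one scale λ, while HR sweeps a ladder of scales, letting coarse terms capture dominant features and finer terms absorb the remaining residual.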
Abstract:
Edge-labeled graphs have proliferated rapidly over the last decade due to the increased popularity of social networks and the Semantic Web. In social networks, relationships between people are represented by edges and each edge is labeled with a semantic annotation. Hence, a huge single graph can express many different relationships between entities. The Semantic Web represents each single fragment of knowledge as a triple (subject, predicate, object), which is conceptually identical to an edge from subject to object labeled with a predicate. A set of triples constitutes an edge-labeled graph on which knowledge inference is performed. Subgraph matching has been extensively used as a query language for patterns in the context of edge-labeled graphs. For example, in social networks, users can specify a subgraph matching query to find all people that have certain neighborhood relationships. Heavily used fragments of the SPARQL query language for the Semantic Web and graph queries of other graph DBMSs can also be viewed as subgraph matching over large graphs. Though subgraph matching has been extensively studied as a query paradigm in the Semantic Web and in social networks, a user can get a large number of answers in response to a query. These answers can be shown to the user in accordance with an importance ranking. In this thesis proposal, we present four different scoring models along with scalable algorithms to find the top-k answers via a suite of intelligent pruning techniques. The suggested models consist of a practically important subset of the SPARQL query language augmented with some additional useful features. The first model, called Substitution Importance Query (SIQ), identifies the top-k answers whose scores are calculated from the matched vertices' properties in each answer, in accordance with a user-specified notion of importance.
The second model, called Vertex Importance Query (VIQ), identifies important vertices in accordance with a user-defined scoring method that builds on top of various subgraphs articulated by the user. Approximate Importance Query (AIQ), our third model, allows partial and inexact matchings and returns the top-k of them with user-specified approximation terms and scoring functions. In the fourth model, called Probabilistic Importance Query (PIQ), a query consists of several sub-blocks: one mandatory block that must be mapped and other blocks that can be opportunistically mapped. The probability is calculated from various aspects of the answers, such as the number of mapped blocks and the vertices' properties in each block, and the top-k most probable answers are returned. An important distinguishing feature of our work is that we allow the user a huge amount of freedom in specifying: (i) what patterns and approximations they consider important, (ii) how to score answers - irrespective of whether they are vertices or substitutions, and (iii) how to combine and aggregate scores generated by multiple patterns and/or multiple substitutions. Because so much power is given to the user, indexing is more challenging than in situations where additional restrictions are imposed on the queries the user can ask. The proposed algorithms for the first model can also be used for answering SPARQL queries with ORDER BY and LIMIT, and the method for the second model also works for SPARQL queries with GROUP BY, ORDER BY and LIMIT. We test our algorithms on multiple real-world graph databases, showing that our algorithms are far more efficient than popular triple stores.
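The top-k-with-pruning idea shared by all four models can be sketched generically. Here `score`, `bound`, and the candidate stream are hypothetical stand-ins (not the thesis's model-specific scoring functions or indexes): `bound(c)` is assumed to be an optimistic upper bound on `score(c)`, which lets candidates that cannot enter the current top-k be skipped without full evaluation.

```python
import heapq

def topk_answers(candidates, score, bound, k):
    """Generic top-k retrieval with upper-bound pruning.
    `score(c)` is the (possibly expensive) exact score of a candidate answer;
    `bound(c)` is a cheap optimistic bound with bound(c) >= score(c)."""
    heap = []  # min-heap of (score, candidate), kept at size <= k
    for c in candidates:
        # Prune: once k answers are held, a candidate whose optimistic
        # bound cannot beat the current k-th best is skipped unscored.
        if len(heap) == k and bound(c) <= heap[0][0]:
            continue
        s = score(c)
        if len(heap) < k:
            heapq.heappush(heap, (s, c))
        elif s > heap[0][0]:
            heapq.heapreplace(heap, (s, c))
    return sorted(heap, reverse=True)

# Usage: integer candidates with identity score and bound, k = 3.
print(topk_answers(range(10), lambda c: c, lambda c: c, 3))  # [(9, 9), (8, 8), (7, 7)]
```

The tighter the bound, the more candidates are pruned before scoring; the models above differ precisely in how scores and such bounds are derived from matched vertices, subgraphs, or mapped blocks.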