943 results for NON-HUMAN PRIMATE


Relevance:

90.00%

Publisher:

Abstract:

Chlamydia pneumoniae is an obligate intracellular respiratory pathogen that causes 10% of community-acquired pneumonia and has been associated with cardiovascular disease. Both whole-genome sequencing and specific gene typing suggest that there is relatively little genetic variation in human isolates of C. pneumoniae. To date, there has been little genomic analysis of strains from human cardiovascular sites. The genotypes of C. pneumoniae present in human atherosclerotic carotid plaque were analysed, and several polymorphisms were found in the variable domain 4 (VD4) region of the outer-membrane protein A (ompA) gene and in the intergenic region between the ygeD and uridine kinase genes (ygeD-urk). One genotype was the same as a genotype reported previously from human respiratory and cardiovascular isolates, while another was identical to a genotype from non-human sources (frog/koala).
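
The genotyping step summarised here can be pictured as a position-by-position comparison of an isolate's ompA VD4 sequence against known reference genotypes. A minimal, hedged sketch follows; the sequences and genotype labels are short invented stand-ins, not real C. pneumoniae data or the study's actual typing pipeline.

```python
# Illustrative comparison of an ompA VD4 query against reference genotypes of
# equal length. All sequences below are invented for demonstration purposes.

REFERENCES = {
    "human respiratory/cardiovascular": "ATGGCTACCGTTAGC",
    "frog/koala":                       "ATGGCTACAGTTAGT",
}

def polymorphisms(query: str, reference: str):
    """Return (position, reference base, query base) for every mismatch."""
    return [(i, r, q) for i, (r, q) in enumerate(zip(reference, query)) if r != q]

def closest_genotype(query: str):
    """Pick the reference genotype with the fewest mismatches to the query."""
    scored = {name: polymorphisms(query, ref) for name, ref in REFERENCES.items()}
    best = min(scored, key=lambda name: len(scored[name]))
    return best, scored[best]

isolate_vd4 = "ATGGCTACAGTTAGT"          # hypothetical carotid-plaque isolate
genotype, diffs = closest_genotype(isolate_vd4)
print(genotype, diffs)                   # matches the frog/koala-like genotype here
```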

Relevance:

90.00%

Publisher:

Abstract:

Risk management in healthcare comprises a group of complex, coordinated actions implemented to improve the quality of healthcare services and to guarantee patient safety. Risks cannot be eliminated, but they can be controlled with risk assessment methods derived from industrial applications; among these, Failure Mode, Effects and Criticality Analysis (FMECA) is a widely used methodology. The main purpose of this work is the analysis of failure modes of the Home Care (HC) service provided by the local healthcare unit of Naples (ASL NA1), focusing attention on human and non-human factors according to the organizational framework selected by the WHO. © Springer International Publishing Switzerland 2014.
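
As a rough illustration of how FMECA prioritises failure modes, the sketch below computes the usual Risk Priority Number (severity × occurrence × detection). The failure modes, scores and 1-10 scales are hypothetical examples for a home-care service, not figures from the study.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One failure mode of a home-care service (hypothetical examples)."""
    description: str
    severity: int     # 1 (negligible) .. 10 (catastrophic)
    occurrence: int   # 1 (rare) .. 10 (frequent)
    detection: int    # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number, the usual FMECA criticality index."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("Wrong drug dose administered at home", severity=9, occurrence=3, detection=5),
    FailureMode("Missed scheduled nursing visit", severity=5, occurrence=4, detection=3),
    FailureMode("Incomplete hand-over between caregivers", severity=6, occurrence=6, detection=6),
]

# Rank failure modes so corrective actions target the highest criticality first.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN={fm.rpn:4d}  {fm.description}")
```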

Relevance:

90.00%

Publisher:

Abstract:

This project posits a link between representations of animals or animality and representations of illness in the Victorian novel, and examines the narrative uses and ideological consequences of such representations. Figurations of animality and illness in Victorian fiction have been examined extensively as distinct phenomena, but examining their connection allows for a more complex view of the role of sympathy in the Victorian novel. The commonplace in novel criticism is that Victorian authors, whether effectively or not, constructed their novels with a view to the expansion of sympathy. This dissertation intervenes in the discussion of the Victorian novel as a vehicle for sympathy by positing that texts and scenes in which representations of illness and animality are conjoined reveal where the novel draws the boundaries of the human, and the often surprising limits it sets on sympathetic feeling. In such moments, textual cues train or direct readerly sympathies in ways that suggest a particular definition of the human, but that direction of sympathy is not always towards an enlarged sympathy, or an enlarged definition of the human. There is an equally (and increasingly) powerful antipathetic impulse in many of these texts, which estranges readerly sympathy from putatively deviant, degenerate, or dangerous groups. These two opposing impulses—the sympathetic and the antipathetic—often coexist in the same novel or even the same scene, creating an ideological and affective friction, and both draw on the same tropes of illness and animality. Examining the intersection of these different discourses of sympathy, illness, and animality in these novels reveals the way that major Victorian debates about human nature, evolution and degeneration, and moral responsibility shaped the novels of the era as vehicles for both antipathy and sympathy. Focusing on the novels of the Brontës and Thomas Hardy, this dissertation examines in depth the interconnected ways that representations of animals and animality and representations of illness function in the Victorian novel, as they allow authors to explore or redefine the boundary between the human and the non-human, the boundary between sympathy and antipathy, and the limits of sympathy itself.

Relevance:

90.00%

Publisher:

Abstract:

Scheduling problems are generally NP-hard combinatorial problems, and a lot of research has been done to solve these problems heuristically. However, most of the previous approaches are problem-specific, and research into the development of a general scheduling algorithm is still in its infancy. Mimicking the natural evolutionary process of the survival of the fittest, Genetic Algorithms (GAs) have attracted much attention in solving difficult scheduling problems in recent years. Some obstacles exist when using GAs: there is no canonical mechanism to deal with constraints, which are commonly met in most real-world scheduling problems, and small changes to a solution are difficult. To overcome both difficulties, indirect approaches have been presented (in [1] and [2]) for nurse scheduling and driver scheduling, where GAs are used by mapping the solution space, and separate decoding routines then build solutions to the original problem. In our previous indirect GAs, learning is implicit and is restricted to the efficient adjustment of weights for a set of rules that are used to construct schedules. The major limitation of those approaches is that they learn in a non-human way: like most existing construction algorithms, once the best weight combination is found, the rules used in the construction process are fixed at each iteration. However, normally a long sequence of moves is needed to construct a schedule, and using fixed rules at each move is thus unreasonable and not coherent with human learning processes. When a human scheduler is working, he normally builds a schedule step by step following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not completed yet, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this research we intend to design more human-like scheduling algorithms, by using ideas derived from Bayesian Optimization Algorithms (BOA) and Learning Classifier Systems (LCS) to implement explicit learning from past solutions. BOA can be applied to learn to identify good partial solutions and to complete them by building a Bayesian network of the joint distribution of solutions [3]. A Bayesian network is a directed acyclic graph with each node corresponding to one variable, and each variable corresponding to an individual rule by which a schedule will be constructed step by step. The conditional probabilities are computed according to an initial set of promising solutions. Subsequently, each new instance for each node is generated by using the corresponding conditional probabilities, until values for all nodes have been generated. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the Bayesian network is updated again using the current set of good rule strings. The algorithm thereby tries to explicitly identify and mix promising building blocks. It should be noted that for most scheduling problems the structure of the network model is known and all the variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data. Thus learning can amount to 'counting' in the case of multinomial distributions.
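
A minimal sketch of the 'counting' and sampling steps described above, under the simplifying assumption of a fixed chain-structured network in which the rule chosen at each construction step depends only on the rule chosen at the previous step; the constants and example rule strings are illustrative, not taken from the nurse or driver scheduling data.

```python
import random
from collections import defaultdict

N_RULES = 4      # construction rules available at each step (illustrative)
N_STEPS = 6      # length of a rule string: one rule choice per scheduling move
LAPLACE = 1.0    # smoothing so unseen rule combinations keep non-zero probability

def learn_model(promising_strings):
    """Estimate P(rule_t | rule_{t-1}) by counting over promising rule strings."""
    counts = defaultdict(lambda: defaultdict(float))
    for string in promising_strings:
        prev = None                                  # dummy parent for the first step
        for rule in string:
            counts[prev][rule] += 1.0
            prev = rule
    model = {}
    for parent, child_counts in counts.items():
        total = sum(child_counts.values()) + LAPLACE * N_RULES
        model[parent] = [(child_counts.get(r, 0.0) + LAPLACE) / total
                         for r in range(N_RULES)]
    return model

def sample_string(model):
    """Generate a new rule string node by node from the learned conditionals."""
    uniform = [1.0 / N_RULES] * N_RULES
    string, prev = [], None
    for _ in range(N_STEPS):
        rule = random.choices(range(N_RULES), weights=model.get(prev, uniform))[0]
        string.append(rule)
        prev = rule
    return string

# Three 'promising' rule strings stand in for the fitness-selected population;
# a problem-specific decoder would turn each string into an actual schedule.
promising = [[0, 1, 1, 2, 3, 3], [0, 1, 2, 2, 3, 3], [1, 1, 2, 2, 3, 0]]
print(sample_string(learn_model(promising)))
```
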
In the LCS approach, each rule has a strength showing its current usefulness in the system, and this strength is constantly assessed [4]. To implement sophisticated learning based on previous solutions, an improved LCS-based algorithm is designed, which consists of the following three steps. The initialization step assigns each rule at each stage a constant initial strength; rules are then selected by using the roulette-wheel strategy. The next step reinforces the strengths of the rules used in the previous solution, keeping the strength of unused rules unchanged. The selection step selects fitter rules for the next generation. It is envisaged that the LCS part of the algorithm will be used as a hill climber for the BOA algorithm. This is exciting and ambitious research, which might provide the stepping-stone for a new class of scheduling algorithms. Data sets from nurse scheduling and mall problems will be used as test-beds. It is envisaged that once the concept has been proven successful, it will be implemented into general scheduling algorithms. It is also hoped that this research will give some preliminary answers about how to include human-like learning into scheduling algorithms and may therefore be of interest to researchers and practitioners in the areas of scheduling and evolutionary computation. References 1. Aickelin, U. and Dowsland, K. (2003) 'Indirect Genetic Algorithm for a Nurse Scheduling Problem', Computers & Operations Research (in press). 2. Li, J. and Kwan, R.S.K. (2003) 'Fuzzy Genetic Algorithm for Driver Scheduling', European Journal of Operational Research 147(2): 334-344. 3. Pelikan, M., Goldberg, D. and Cantu-Paz, E. (1999) 'BOA: The Bayesian Optimization Algorithm', IlliGAL Report No. 99003, University of Illinois. 4. Wilson, S. (1994) 'ZCS: A Zeroth Level Classifier System', Evolutionary Computation 2(1): 1-18.
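
The three-step strength update can be sketched as follows; the stage/rule counts, reward value and roulette-wheel selection shown here are illustrative assumptions rather than the thesis's actual parameters.

```python
import random

N_STAGES, N_RULES = 5, 4
INITIAL_STRENGTH, REWARD = 10.0, 2.0

# 1. Initialization: every rule at every stage starts with the same strength.
strength = [[INITIAL_STRENGTH] * N_RULES for _ in range(N_STAGES)]

def roulette_select(weights):
    """Roulette-wheel selection: pick an index with probability proportional to strength."""
    return random.choices(range(len(weights)), weights=weights)[0]

def build_rule_string():
    """Pick one rule per stage; a problem-specific decoder would build the schedule."""
    return [roulette_select(strength[stage]) for stage in range(N_STAGES)]

def reinforce(rule_string):
    """2. Reinforcement: strengthen the rules used in the retained solution,
    leaving unused rules unchanged."""
    for stage, rule in enumerate(rule_string):
        strength[stage][rule] += REWARD

# 3. Selection of fitter rules then follows from the updated strengths feeding
# back into the next round of roulette-wheel choices.
best_so_far = build_rule_string()   # in practice, kept only if its schedule scores well
reinforce(best_so_far)
print(best_so_far, strength[0])
```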

Relevance:

90.00%

Publisher:

Abstract:

Stem cell therapy for ischaemic stroke is an emerging field in light of an increasing number of patients surviving with permanent disability. Several allogeneic and autologous cell types are now in clinical trials, with preliminary evidence of safety, and some clinical studies have reported functional improvements in some patients. After initial safety evaluation in a Phase 1 study, the conditionally immortalised human neural stem cell line CTX0E03 is currently in a Phase 2 clinical trial (PISCES-II). Previous pre-clinical studies conducted by ReNeuron Ltd showed evidence of functional recovery in the Bilateral Asymmetry test up to 6 weeks following transplantation into the rodent brain, 4 weeks after middle cerebral artery occlusion. Resting-state fMRI is increasingly used to investigate brain function in health and disease, and may also act as a predictor of recovery due to known network changes in the post-stroke recovery period. Resting-state methods have also been applied to non-human primates and rodents, which have been found to have resting-state networks analogous to those of humans. The sensorimotor resting-state network of rodents is impaired following experimental focal ischaemia of the middle cerebral artery territory. However, the effect of stem cell implantation on brain functional networks has not previously been investigated. Prior studies assessed sensorimotor function following sub-cortical implantation of CTX0E03 cells in the rodent post-stroke brain, but without MRI assessment of functional improvements. This thesis presents research on the effect of sub-cortical implantation of CTX0E03 cells on the resting-state sensorimotor network and on sensorimotor deficits in the rat following experimental stroke, using protocols based on previous work with this cell line. The work in this thesis identified functional tests of appropriate sensitivity for long-term dysfunction suitable for this laboratory, and investigated non-invasive monitoring of the physiological variables required to optimise BOLD signal stability within a high-field MRI scanner. Following experimental stroke, rats demonstrated the expected sensorimotor dysfunction and changes in the resting-state sensorimotor network. CTX0E03 cells did not improve post-stroke functional outcome (compared to previous studies), and no changes in resting-state sensorimotor network activity were observed. However, in control animals, we observed changes in functional networks due to the stereotaxic procedure, illustrating the sensitivity of resting-state fMRI to stereotaxic procedures. We hypothesise that the damage caused by cell or vehicle implantation may have prevented functional and network recovery, and that this has not been identified previously because different functional tests were applied. The findings in this thesis represent one of few pre-clinical studies of resting-state fMRI network changes post-stroke, and the only one to date applying this technique to evaluate functional outcomes following a clinically applicable human neural stem cell treatment for ischaemic stroke. It was found that the injury caused by stereotaxic injection should be taken into account when assessing the effectiveness of treatment.
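
For readers unfamiliar with the resting-state measure referred to here, the sketch below shows one common way a sensorimotor network is summarised: the Pearson correlation between left and right somatosensory ROI time series, Fisher z-transformed for group statistics. The synthetic time series and ROI names are assumptions for illustration, not the thesis's acquisition or analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints = 300

# Stand-in BOLD time series for left/right primary somatosensory cortex (S1);
# in practice these would be extracted from preprocessed rs-fMRI data.
shared = rng.standard_normal(n_timepoints)
s1_left = shared + 0.5 * rng.standard_normal(n_timepoints)
s1_right = shared + 0.5 * rng.standard_normal(n_timepoints)

# Inter-hemispheric functional connectivity: Pearson correlation of the two ROIs.
# A drop in this correlation after MCA occlusion is the kind of network change
# the abstract refers to.
r = np.corrcoef(s1_left, s1_right)[0, 1]

# Fisher z-transform is the usual step before group-level statistics.
z = np.arctanh(r)
print(f"S1 left-right correlation r = {r:.2f}, Fisher z = {z:.2f}")
```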

Relevance:

90.00%

Publisher:

Abstract:

Comparative and evolutionary developmental analyses seek to discover the similarities and differences between humans and non-human species that illuminate both the evolutionary foundations of our nature that we share with other animals, and the distinctive characteristics that make human development unique. As our closest animal relatives, with whom we most recently shared common ancestry, non-human primates have been particularly important in this endeavour. Such studies, focused on social learning, traditions, and culture, have discovered much about the ‘how’ of social learning, concerned with key underlying processes such as imitation and emulation. One of the core discoveries is that the adaptive adjustment of social learning options to different contexts is not unique to human infants; therefore, multiple new strands of research have begun to focus on more subtle questions about when, from whom, and why such learning occurs. Here we review illustrative studies on both human infants and young children and on non-human primates to identify the similarities shared more broadly across the primate order, and the apparent specialisms that distinguish human development. The adaptive biases in social learning discussed include those modulated by task comprehension, experience, conformity to majorities, and the age, skill, proficiency and familiarity of potential alternative cultural models.

Relevance:

80.00%

Publisher:

Abstract:

Globality generates increasingly diffuse networks of human and non-human innovators, carriers and icons of exotic, polyethnic cosmopolitan difference; and this diffusion is increasingly hard to ignore or police (Latour 1993). In fact, such global networks of material-symbolic exchange can frequently have the unintended consequence of promoting status systems and cultural relationships founded on uncosmopolitan values such as cultural appropriation and status-based social exclusion. Moreover, this material-symbolic engagement with cosmopolitan difference could also be rather mundane, engaged in routinely without any great reflexive consciousness or capacity to destabilise current relations of cultural power, or interpreted unproblematically as just one component of a person’s social environment. Indeed, Beck’s (2006) argument is that cosmopolitanism, in an age of global risk, is being forced upon us unwillingly, so there should be no surprise if it is a bitter pill for some to swallow. Within these emergent cosmopolitan networks, which we call ‘cosmoscapes’, there is no certainty about the development of ethical or behavioural stances consistent with claims foundational to the current literature on cosmopolitanism. Reviewing historical and contemporary studies of globality and its dynamic generative capacity, this paper considers such literatures in the context of studies of cultural consumption and social status. When one positions these diverse bodies of literature against one another, it becomes clear that the possibility of widespread cosmopolitan cultural formations is largely unpromising.

Relevance:

80.00%

Publisher:

Abstract:

Because aesthetics can have a profound effect upon the human relationship to the non-human environment, the importance of aesthetics to ecologically sustainable designed landscapes has been acknowledged. However, in recognition that the physical forms of designed landscapes are an expression of the social values of the time, some design professionals have called for a new aesthetic, one that reflects these current ecological concerns. To address this, some authors have suggested various theoretical design frameworks upon which such an aesthetic could be based. Within these frameworks there is an underlying theme that the patterns and processes of natural systems have the potential to form a new aesthetic for landscape design: an aesthetic based on fractal rather than Euclidean geometry. Perry, Reeves and Sim (2008) have shown that it is possible to differentiate between different landscape forms by fractal analysis. However, this research also shows that individual scenes from within very different landscape forms can possess the same fractal properties. Early data, revealed by transforming landscape images from the spatial to the frequency domain using the fast Fourier transform, suggest that fractal patterning can have a significant effect within the landscape. In fact, it may be argued that any landscape design that includes living processes will include some design element whose ultimate form can only be expressed through the mathematics of fractal geometry. This paper will present ongoing research into the potential role of fractal geometry as a basis for a new form language, a language that may articulate an aesthetic for landscape design that echoes our ecological awakening.
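
One common spectral route from an image to a fractal measure, consistent with the FFT approach mentioned above, is to fit the slope of the radially averaged power spectrum and convert it to a fractal dimension under fractional-Brownian assumptions (D = (8 - beta) / 2). The sketch below is illustrative only and is not the procedure of Perry, Reeves and Sim (2008).

```python
import numpy as np

def spectral_fractal_dimension(image: np.ndarray) -> float:
    """Estimate a fractal dimension from the slope of the radially averaged
    power spectrum, assuming P(f) ~ f**(-beta) and D = (8 - beta) / 2."""
    n = min(image.shape)
    img = image[:n, :n] - image[:n, :n].mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2

    # Radial spatial frequency of every pixel in the shifted spectrum.
    y, x = np.indices(power.shape)
    centre = (n - 1) / 2.0
    radius = np.hypot(x - centre, y - centre).astype(int)

    # Radially averaged power spectrum, skipping the DC component.
    radial_power = np.bincount(radius.ravel(), weights=power.ravel())
    counts = np.bincount(radius.ravel())
    valid = slice(1, n // 2)
    mean_power = radial_power[valid] / counts[valid]
    freqs = np.arange(n // 2)[1:]

    # Slope of log(power) vs log(frequency) gives the spectral exponent beta.
    beta = -np.polyfit(np.log(freqs), np.log(mean_power), 1)[0]
    return (8.0 - beta) / 2.0

# Synthetic stand-in for a landscape image; a real analysis would load a photograph.
demo = np.cumsum(np.cumsum(np.random.default_rng(1).standard_normal((256, 256)),
                           axis=0), axis=1)
print(f"Estimated fractal dimension: {spectral_fractal_dimension(demo):.2f}")
```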

Relevance:

80.00%

Publisher:

Abstract:

Streptococcus pyogenes, also known as Group A Streptococcus (GAS), has been associated with a range of diseases, from mild pharyngitis and pyoderma to more severe invasive infections such as streptococcal toxic shock. GAS also causes a number of non-suppurative post-infectious diseases such as rheumatic fever, rheumatic heart disease and glomerulonephritis. The large GAS disease burden necessitates a prophylactic vaccine that could target the diverse GAS emm types circulating globally. Anti-GAS vaccine strategies have focused primarily on the GAS M-protein, an extracellular virulence factor anchored to the GAS cell wall. As opposed to the hypervariable N-terminal region, the C-terminal portion of the protein is highly conserved among different GAS emm types and is the focus of a leading GAS vaccine candidate, J8-DT/alum. The vaccine candidate J8-DT/alum was shown to be immunogenic in mice, rabbits and non-human primates (hamadryas baboons). Similar responses to J8-DT/alum were observed after subcutaneous and intramuscular immunization with J8-DT/alum in mice and in rabbits. Further assessment of parameters that may influence the immunogenicity of J8-DT demonstrated that the immune responses were identical in male and female mice, and that the use of alum as an adjuvant in the vaccine formulation significantly increased its immunogenicity, resulting in a long-lived serum IgG response. Contrary to previous findings, the data in this thesis indicate that a primary immunization with J8-DT/alum (50 μg) followed by a single boost is sufficient to generate a robust immune response in mice. As expected, the IgG response to J8-DT/alum was a Th2-type response consisting predominantly of the isotype IgG1 accompanied by lower levels of IgG2a. Intramuscular vaccination of rabbits with J8-DT/alum demonstrated that an increase in the dose of J8-DT/alum up to 500 μg does not have an impact on the serum IgG titers achieved. Similar to the immune response in mice, immunization with J8-DT/alum in baboons also established that a 60 μg dose, compared to either 30 μg or 120 μg, was sufficient to generate a robust immune response. Interestingly, mucosal infection of naive baboons with an M1 GAS strain did not induce a J8-specific serum IgG response. As J8-DT/alum-mediated protection has been previously reported to be due to the J8-specific antibody formed, the efficacy of J8-DT antibodies was determined in vitro and in vivo. In vitro opsonization and in vivo passive transfer confirmed the protective potential of J8-DT antibodies. A reduction in the bacterial burden after challenge with a bioluminescent M49 GAS strain in mice that were passively administered J8-DT IgG established that protection due to J8-DT was mediated by antibodies. The GAS burden in infected mice was monitored using bioluminescent imaging in addition to traditional CFU assays. Bioluminescent GAS strains including the ‘rheumatogenic’ M1 GAS could not be generated due to limitations with transformation of GAS; however, an M49 GAS strain was utilized during BLI. The M49 serotype is traditionally a ‘nephritogenic’ serotype associated with post-streptococcal glomerulonephritis. Anti-J8-DT antibodies have now been shown to be protective against multiple GAS strains such as M49 and M1.
This study evaluated the immunogenicity of J8-DT/alum in different species of experimental animals in preparation for phase I human clinical trials and provided the groundwork for the development of a rapid, non-invasive assay for the evaluation of vaccine candidates.

Relevance:

80.00%

Publisher:

Abstract:

Background: This research addresses the development of a digital stethoscope for use with a telehealth communications network to allow doctors to examine patients remotely (a digital telehealth stethoscope). A telehealth stethoscope would allow remote auscultation of patients who do not live near a major hospital. Travelling from remote areas to major hospitals is expensive for patients, and a telehealth stethoscope could result in significant cost savings. Using a stethoscope requires great skill. To design a telehealth stethoscope that meets doctors’ expectations, the use of existing stethoscopes in clinical contexts must be examined. Method: Observations were conducted of 30 anaesthetic preadmission consultations. The observations were video-taped. Interactions between the doctor, the patient and non-human elements in the consultation were “coded” to transform the video into data. The data were analysed to reveal essential aspects of the interactions. Results: The analysis has shown that the doctor controls the interaction during auscultation. The conduct of auscultation draws heavily on the doctor’s tacit knowledge, allowing the doctor to treat the acoustic stethoscope as infrastructure – that is, the stethoscope sinks into the background and becomes completely transparent in use. Conclusion: Two important, and related, implications for the design of a telehealth stethoscope have arisen from this research. First, as a telehealth stethoscope will be a shared device, doctors will not be able to make use of their existing expertise in using their own stethoscopes. Very simply, a telehealth stethoscope will sound different to a doctor’s own stethoscope. Second, the collaborative interaction required to use a telehealth stethoscope will have to be invented and refined. A telehealth stethoscope will need to be carefully designed to address these issues and result in successful use. This research challenges the concept of a telehealth stethoscope by raising questions about the ease and confidence with which doctors could use such a device.

Relevance:

80.00%

Publisher:

Abstract:

Real-world business processes are resource-intensive. In work environments human resources usually multitask, both human and non-human resources are typically shared between tasks, and multiple resources are sometimes necessary to undertake a single task. However, current Business Process Management Systems focus on task-resource allocation in terms of individual human resources only and lack support for a full spectrum of resource classes (e.g., human or non-human, application or non-application, individual or teamwork, schedulable or unschedulable) that could contribute to tasks within a business process. In this paper we develop a conceptual data model of resources that takes into account the various resource classes and their interactions. The resulting conceptual resource model is validated using a real-life healthcare scenario.
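
One way to picture such a conceptual resource model is as a small typed data model in which every resource carries its class memberships and tasks may require several resources at once. The sketch below is an illustrative encoding with invented names, not the model validated in the paper.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Origin(Enum):
    HUMAN = "human"
    NON_HUMAN = "non-human"

class Kind(Enum):
    APPLICATION = "application"
    NON_APPLICATION = "non-application"

@dataclass
class Resource:
    name: str
    origin: Origin
    kind: Kind
    schedulable: bool           # can its availability be planned in advance?

@dataclass
class Team:
    """A teamwork resource: several individual resources acting as one unit."""
    name: str
    members: List[Resource]

@dataclass
class Task:
    """A task may require several resources of different classes at once."""
    name: str
    required: List[object] = field(default_factory=list)   # Resource or Team

# A healthcare-style example: one task sharing human and non-human resources.
surgeon = Resource("surgeon", Origin.HUMAN, Kind.NON_APPLICATION, schedulable=True)
theatre = Resource("operating theatre", Origin.NON_HUMAN, Kind.NON_APPLICATION, True)
pacs = Resource("PACS imaging system", Origin.NON_HUMAN, Kind.APPLICATION, False)
team = Team("surgical team", [surgeon])

operation = Task("perform operation", [team, theatre, pacs])
print([r.name for r in operation.required])
```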

Relevance:

80.00%

Publisher:

Abstract:

Wellness is now seen as central to redefining the National Health agenda. There is growing evidence that contact with nature, and physical activity in nature, has considerable positive effects on human health. At the most basic level, humanity is reliant on the natural world for resources such as air and water. However, a growing body of research is finding that, beyond this fundamental relationship, exposure to the non-human natural world can also positively enhance perceptions of physiological, emotional, psychological and spiritual health in ways that cannot be satisfied by alternative means. Theoretical explanations for this have posited that non-human nature might 1) relieve mental fatigue, 2) trigger deep reflection, 3) provide an opportunity for nurturing and 4) rekindle innate connections. In this paper the authors show how human wellness is strongly connected to our relationship with the natural world. The paper also points to how non-human nature could be better utilised for enhancing human health and wellness.

Relevance:

80.00%

Publisher:

Abstract:

Real-world business processes rely on the availability of scarce, shared resources, both human and non-human. Current workflow management systems support allocation of individual human resources to tasks but lack support for the full range of resource types used in practice, and the inevitable constraints on their availability and applicability. Based on past experience with resource-intensive workflow applications, we derive generic requirements for a workflow system which can use its knowledge of resource capabilities and availability to help create feasible task schedules. We then define the necessary architecture for implementing such a system and demonstrate its practicality through a proof-of-concept implementation. This work is presented in the context of a real-life surgical care process observed in a number of German hospitals.
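
The core capability described above, checking that a proposed task schedule is feasible given resource capabilities and availability, can be sketched as follows; the data layout and example values are assumptions for illustration, not the paper's architecture.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Resource:
    name: str
    capabilities: set
    booked: List[Tuple[int, int]]          # already committed (start, end) slots

@dataclass
class ScheduledTask:
    name: str
    needs: str                              # required capability
    start: int
    end: int
    assigned: str                           # resource name

def overlaps(a: Tuple[int, int], b: Tuple[int, int]) -> bool:
    return a[0] < b[1] and b[0] < a[1]

def feasible(tasks: List[ScheduledTask], resources: Dict[str, Resource]) -> bool:
    """Every task must get a capable resource that is free over its interval."""
    for t in tasks:
        r = resources.get(t.assigned)
        if r is None or t.needs not in r.capabilities:
            return False                    # wrong or unknown resource
        if any(overlaps((t.start, t.end), slot) for slot in r.booked):
            return False                    # resource not available at that time
        r.booked.append((t.start, t.end))   # commit the slot for later tasks
    return True

resources = {
    "or-1": Resource("or-1", {"operating theatre"}, []),
    "dr-a": Resource("dr-a", {"surgery"}, [(8, 10)]),
}
plan = [
    ScheduledTask("pre-op check", "surgery", 10, 11, "dr-a"),
    ScheduledTask("operation", "operating theatre", 11, 13, "or-1"),
]
print(feasible(plan, resources))   # True for this toy plan
```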

Relevance:

80.00%

Publisher:

Abstract:

Microbial pollution in water periodically affects human health in Australia, particularly in times of drought and flood, and there is an increasing need for the control of waterborne microbial pathogens. Methods that allow the determination of the origin of faecal contamination in water are generally referred to as Microbial Source Tracking (MST). Various approaches have been evaluated as indicators of microbial pathogens in water samples, including detection of different microorganisms and various host-specific markers. However, to date there has been no universal MST method that can reliably determine the source (human or animal) of faecal contamination, and the use of multiple approaches is therefore frequently advised. MST is currently recognised as a research tool rather than something to be included in routine practice. The main focus of this research was to develop novel and universally applicable methods to meet the demand for MST methods in routine testing of water samples. Escherichia coli was chosen initially as the target organism for our studies as, historically and globally, it is the standard indicator of microbial contamination in water. In this thesis, three approaches are described: single nucleotide polymorphism (SNP) genotyping, clustered regularly interspaced short palindromic repeat (CRISPR) screening using high resolution melt analysis (HRMA), and phage detection development based on CRISPR types. The advantage of combining SNP genotyping with CRISPR analysis is discussed in this study. For the first time, highly discriminatory single nucleotide polymorphism interrogation of an E. coli population was applied to identify host-specific clusters. Six human-specific and one animal-specific SNP profile were revealed. SNP genotyping was successfully applied in field investigations of the Coomera watershed, South-East Queensland, Australia. Four human profiles [11], [29], [32] and [45] and the animal-specific SNP profile [7] were detected in water. Two human-specific profiles, [29] and [11], were found to be prevalent in the samples over a period of years. Rainfall (24 and 72 hours), tide height and time, general land use (rural, suburban), season, distance from the river mouth and salinity showed no relationship with the diversity of SNP profiles present in the Coomera watershed (p values > 0.05). Nevertheless, the SNP genotyping method is able to identify and distinguish between human- and non-human-specific E. coli isolates in water sources within one day. In some samples, only mixed profiles were detected. To further investigate host-specificity in these mixed profiles, a CRISPR screening protocol was developed and used on the set of E. coli isolates previously analysed for SNP profiles. CRISPR loci, which record previous DNA coliphage attacks, were considered a promising tool for detecting host-specific markers in E. coli. Spacers in CRISPR loci could also reveal the dynamics of virulence in E. coli as well as in other pathogens in water. Although host-specificity was not observed in the set of E. coli analysed, CRISPR alleles were shown to be useful in detecting the geographical site of sources. HRMA allows determination of 'different' and 'same' CRISPR alleles and can be introduced in water monitoring as a cost-effective and rapid method.
Overall, we show that the identified human-specific SNP profiles [11], [29], [32] and [45] can be useful as marker genotypes globally for the identification of human faecal contamination in water. The SNP typing approach developed in the current study can be used in water monitoring laboratories as an inexpensive, high-throughput and easily adapted protocol. A unique approach based on E. coli spacers was developed to search for unknown phages and to examine host-specificity in phage sequences. Preliminary experiments on recombinant plasmids showed the possibility of using this method for recovering phage sequences. Future studies will determine the host-specificity of DNA phage genotyping as soon as the first reliable sequences can be acquired. No doubt, only the application of multiple approaches in MST will allow the character of microbial contamination to be identified with higher confidence and reliability.
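
In routine screening, the marker genotypes identified here could be applied by mapping each typed isolate's SNP profile to a source class and tallying the result per sample. The sketch below uses the profile IDs from the abstract; the isolate data are invented for illustration.

```python
from collections import Counter

# Host-specific SNP profile IDs reported in the abstract.
HUMAN_PROFILES = {11, 29, 32, 45}
ANIMAL_PROFILES = {7}

def classify_profile(profile_id: int) -> str:
    """Map one isolate's SNP profile ID to a contamination source class."""
    if profile_id in HUMAN_PROFILES:
        return "human"
    if profile_id in ANIMAL_PROFILES:
        return "animal"
    return "unresolved"

def summarise_sample(profile_ids):
    """Tally isolate classifications for one water sample."""
    return Counter(classify_profile(p) for p in profile_ids)

# Hypothetical isolates typed from a single Coomera water sample.
sample = [29, 29, 11, 7, 52]
print(summarise_sample(sample))   # Counter({'human': 3, 'animal': 1, 'unresolved': 1})
```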