885 results for Nature of turbidity


Relevance: 100.00%

Abstract:

The most common and conventional method for removing turbidity from water is coagulation with alum or iron salts, followed by settling of the precipitate in suitably designed clarifiers and then filtration. However, the sludge produced is bulky, difficult to dewater, and accumulates in dumping grounds, causing environmental problems. Synthetic polymers such as polyacrylamide and polyethylene oxide have been investigated for their ability to remove turbidity. They overcome many of the disadvantages of conventional methods, but are cost-effective only when rapid flocculation and a reduction in sludge volume are demanded. Considering this situation, it was felt that more readily available and eco-friendly materials should be developed for removing turbidity from water. The results of our studies in this direction are presented in this thesis. The thesis comprises nine chapters, with a common bibliography at the end. Chapter 1 gives an introduction to the nature of the turbidity and colour usually present in water. Chapter 2 discusses the nature and availability of the principal material used in these studies, namely chitosan. Chapters 3 to 8, which deal with the actual experimental work, are each subdivided into (a) introduction, (b) materials and methods, (c) results and discussion, and (d) conclusions. Chapter 9 summarises the entire work so as to put the results and conclusions into proper perspective.

Relevance: 100.00%

Abstract:

A series of seven cerium double-decker complexes with various tetrapyrrole ligands, including porphyrinates, phthalocyaninates, and 2,3-naphthalocyaninates, has been prepared by previously described methodologies and characterized by elemental analysis and a range of spectroscopic methods. The molecular structures of two heteroleptic [(na)phthalocyaninato](porphyrinato) complexes have also been determined by X-ray diffraction analysis; both exhibit a slightly distorted square antiprismatic geometry with two domed ligands. Because the tetrapyrrole ligands span very different electronic properties, these compounds have been systematically investigated for the effects of the ligands on the valence of the cerium center. On the basis of the spectroscopic (UV-vis, near-IR, IR, and Raman), electrochemical, and structural data of these compounds, compared with those of the other rare earth(III) counterparts reported earlier, it has been found that the cerium center adopts an intermediate valence in these complexes. It assumes a virtually trivalent state in cerium bis(tetra-tert-butylnaphthalocyaninate) as a result of the two electron-rich naphthalocyaninato ligands, which facilitate the delocalization of electron density from the ligands to the metal center. For the rest of the cerium double-deckers, the cerium center is predominantly tetravalent. The valences (3.59-3.68) have been quantified according to their L(III)-edge X-ray absorption near-edge structure (XANES) profiles.
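The abstract does not state the quantification procedure; a common approach with cerium L(III)-edge XANES (an assumption here, not taken from the source) is to fit the white line as a superposition of Ce(III)-like and Ce(IV)-like components and take the mean valence as the intensity-weighted average:

    v = 3 + I_{4+} / (I_{3+} + I_{4+})

where I_{3+} and I_{4+} are the integrated intensities of the two fitted components. On this reading, the reported valences of 3.59-3.68 would correspond to Ce(IV)-like spectral weights of roughly 59-68%.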

Relevance: 100.00%

Abstract:

John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
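Frazer's five elements (a code script, development rules, a mapping to a virtual model, an environment, and selection criteria) map naturally onto the skeleton of a genetic algorithm. The following Python sketch is purely illustrative and is not Frazer's actual software; every name, rule and number in it is invented for the example:

    # A minimal sketch (not Frazer's software) of his five elements:
    # code script, development rules, mapping to a virtual model,
    # environment, and selection criteria. All details are invented.
    import random

    GENES = 16                # length of the "genetic code script"
    ENVIRONMENT = 0.6         # a stand-in environmental parameter

    def develop(script):
        # Development rules: map the bit script to a virtual model,
        # here just a column of heights derived from pairs of bits.
        return [script[i] + script[i + 1] for i in range(0, GENES, 2)]

    def fitness(model):
        # Selection criteria: prefer models whose mean height matches
        # what the (toy) environment favours.
        mean = sum(model) / len(model)
        return -abs(mean - 2 * ENVIRONMENT)

    def evolve(pop_size=30, generations=40):
        pop = [[random.randint(0, 1) for _ in range(GENES)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda s: fitness(develop(s)), reverse=True)
            survivors = pop[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(survivors)):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(GENES)
                child = a[:cut] + b[cut:]          # crossover
                if random.random() < 0.1:          # mutation
                    i = random.randrange(GENES)
                    child[i] ^= 1
                children.append(child)
            pop = survivors + children
        pop.sort(key=lambda s: fitness(develop(s)), reverse=True)
        return develop(pop[0])

    print(evolve())

The point the sketch tries to capture is the one the text makes: the architect-programmer designs the code, the rules and the selection criteria, while the form itself emerges beyond their full control.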

Relevance: 100.00%

Abstract:

There exists a general consensus in the science education literature around the goal of enhancing students' and teachers' views of nature of science (NOS). An emerging area of research in science education explores NOS and argumentation, and the aim of this study was to explore the effectiveness of a science content course incorporating explicit NOS and argumentation instruction on preservice primary teachers' views of NOS. A constructivist perspective guided the study, and the research strategy employed was case study research. Five preservice primary teachers were selected for intensive investigation in the study, which incorporated explicit NOS and argumentation instruction, and utilised scientific and socioscientific contexts for argumentation to provide opportunities for participants to apply their NOS understandings to their arguments. Four primary sources of data were used to provide evidence for the interpretations, recommendations, and implications that emerged from the study. These data sources included questionnaires and surveys, interviews, audio- and video-taped class sessions, and written artefacts. Data analysis involved the formation of various assertions that informed the major findings of the study, and a variety of validity and ethical protocols were considered during the analysis to ensure the findings and interpretations emerging from the data were valid. Results indicated that the science content course was effective in enabling four of the five participants' views of NOS to be changed. All of the participants expressed predominantly limited views of the majority of the examined NOS aspects at the commencement of the study. Many positive changes were evident at the end of the study, with four of the five participants expressing partially informed and/or informed views of the majority of the examined NOS aspects. A critical analysis of the effectiveness of the various course components designed to facilitate the development of participants' views of NOS in the study led to the identification of three factors that mediated the development of participants' NOS views: (a) contextual factors (including context of argumentation, and mode of argumentation), (b) task-specific factors (including argumentation scaffolds, epistemological probes, and consideration of alternative data and explanations), and (c) personal factors (including perceived previous knowledge about NOS, appreciation of the importance and utility value of NOS, and durability and persistence of pre-existing beliefs). A consideration of the above factors informs recommendations for future studies that seek to incorporate explicit NOS and argumentation instruction as a context for learning about NOS.

Relevance: 100.00%

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts: variation over and above that accounted for by the Poisson density. The extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, which is tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31-40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, exploring additional dispersion functions, using an independent data set, and presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
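As a sketch of the setup described (the exact specification used in the paper may differ, so treat this as an assumption), a negative binomial crash model with a covariate-dependent dispersion parameter can be written as

    y_i ~ NegBin(\mu_i, \alpha_i),
    E[y_i] = \mu_i = \exp(\beta^T x_i),
    Var(y_i) = \mu_i + \alpha_i \mu_i^2,

where y_i is the crash count at site i and x_i collects the flow and geometric covariates in the mean structure. The conventional fixed-dispersion model sets \alpha_i = \alpha for all sites, while dispersion functions of the kind explored here take a form such as \alpha_i = \exp(\gamma^T z_i) for some covariate vector z_i. The study's central question is then whether \gamma remains significantly different from zero once the mean function \mu_i is well specified.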

Relevance: 100.00%

Abstract:

Globally, teaching has become more complex and more challenging over recent years, with new and increased demands being placed on teachers by students, their families, governments and wider society. Teachers work with more diverse communities in times characterised by volatility, uncertainty and moral ambiguity. Societal, political, economic and cultural shifts have transformed the contexts in which teachers work and have redefined the ways in which teachers interact with students. This qualitative study uses phenomenographic methods to explore the nature of pedagogic teacher-student interactions. The data analysis reveals five qualitatively different ways in which teachers experience pedagogic engagements with students. The resultant categories of description ranged from information providing, in which teachers were viewed as transmitters of a body of knowledge, through to mentoring, in which teachers were perceived as significant others in the lives of students, with their influence extending beyond the walls of the classroom and beyond the years of schooling. The paper concludes by arguing that if teachers are to prepare students for the challenges and opportunities of changing times, teacher education programs need to consider ways to facilitate the development of mentoring capacities in new teachers.

Relevance: 100.00%

Abstract:

As consumers become better educated and more skeptical of traditional advertising, alternate forms of marketing communication have emerged that aim to influence audiences unobtrusively. One such example is product placement, which has attracted ongoing debate as to whether it is covert, unethical, and influences consumption. The current article examines the nature and practice of product placement in this light. It presents a taxonomy of product placement attributes based on current marketing practice and examines whether this is, indeed, a covert marketing strategy. Further, it presents a conceptualization of the influence of product placement on consumer welfare. We highlight that the many forms of product placement necessitate independent evaluation to determine ethical and regulatory standards. Operational solutions for developing public policy are offered.

Relevance: 100.00%

Abstract:

This study explored youth caregiving for a parent with multiple sclerosis (MS) from multiple perspectives, and examined associations between caregiving and child negative (behavioural-emotional difficulties, somatisation) and positive (life satisfaction, positive affect, prosocial behaviour) adjustment outcomes over time. A total of 88 families participated; 85 parents with MS, 55 partners and 130 children completed questionnaires at Time 1. Child caregiving was assessed with the Youth Activities of Caregiving Scale (YACS). Child and parent questionnaire data were collected at Time 1, and child data were collected again 12 months later (Time 2). Factor analysis of the child and parent YACS data replicated the four factors (instrumental, social-emotional, personal-intimate, and domestic-household care), all of which were psychometrically sound. The YACS factors were related to parental illness and caregiving context variables that reflected increased caregiving demands. The Time 1 instrumental and social-emotional care domains were associated with poorer Time 2 adjustment, whereas personal-intimate care was related to better adjustment and domestic-household care was unrelated to adjustment. Children and their parents agreed most on personal-intimate, instrumental and total caregiving, and least on domestic-household and social-emotional care. The findings delineate the key dimensions of young caregiving in MS and the differential links between caregiving activities and youth adjustment.