838 results for: Biology teaching. Undergraduate curriculum. Understanding of nature. Complexity
Abstract:
Some schools do not have ideal access to laboratory space and supplies. Computer simulations of laboratory activities can be a cost-effective way of presenting experiences to students, but are those simulations as effective at conveying content concepts? This study compared the use of traditional lab activities illustrating the principles of cell respiration and photosynthesis in an introductory high school biology class with virtual simulations of the same activities. Additionally, student results were analyzed to assess whether conceptual understanding was affected by the complexity of the simulation. Although all student groups posted average gains between the pre- and post-tests, coupled with positive effect sizes, students who completed the wet-lab version of an activity consistently outperformed students who completed the virtual simulation of the same activity. There was no significant difference between the use of more and less complex simulations. Students also tended to rate the wet-lab experience higher on a motivation and interest inventory.
Abstract:
This report shares my efforts in developing a solid unit of instruction that has a clear focus on student outcomes. I have been a teacher for 20 years and have been writing and revising curricula for much of that time. However, most has been developed without the benefit of current research on how students learn and did not focus on what and how students are learning. My journey as a teacher has involved a lot of trial and error. My traditional method of teaching is to look at the benchmarks (now content expectations) to see what needs to be covered. My unit consists of having students read the appropriate sections in the textbook, complete worksheets, watch a video, and take some notes. I try to include at least one hands-on activity, one or more quizzes, and the traditional end-of-unit test consisting mostly of multiple choice questions I find in the textbook. I try to be engaging, make the lessons fun, and hope that at the end of the unit my students get whatever concepts I've presented so that we can move on to the next topic. I want to increase students' understanding of science concepts and their ability to connect that understanding to the real world. However, sometimes I feel that my lessons are missing something. For a long time I have wanted to develop a unit of instruction that I know is an effective tool for the teaching and learning of science. In this report, I describe my efforts to reform my curricula using the "Understanding by Design" process. I want to see if this style of curriculum design will help me be a more effective teacher and if it will lead to an increase in student learning. My hypothesis is that this new (for me) approach to teaching will lead to increased understanding of science concepts among students because it is based on purposefully thinking about learning targets based on "big ideas" in science.
For my reformed curricula I incorporate lessons from several outstanding programs I've been involved with, including EpiCenter (Purdue University), Incorporated Research Institutions for Seismology (IRIS), the Master of Science Program in Applied Science Education at Michigan Technological University, and the Michigan Association for Computer Users in Learning (MACUL). In this report, I present the methodology for how I developed a new unit of instruction based on the Understanding by Design process. I present several lessons and learning plans I've developed for the unit that follow the 5E Learning Cycle as appendices at the end of this report. I also include the results of pilot testing one of the lessons. Although the lesson I pilot-tested was not as successful in increasing student learning outcomes as I had anticipated, the development process I followed was helpful in that it required me to focus on important concepts. Conducting the pilot test was also helpful because it led me to identify ways in which I could improve the lesson in the future.
Abstract:
Atrial fibrillation (AF) is the most common cardiac arrhythmia, and is responsible for the highest number of rhythm-related disorders and cardioembolic strokes worldwide. Intracardiac signal analysis during the onset of paroxysmal AF led to the discovery of the pulmonary veins as a triggering source of AF, which in turn led to the development of pulmonary vein ablation, an established curative therapy for drug-resistant AF. Complex, multicomponent and rapid electrical activity widely involving the atrial substrate characterizes persistent/permanent AF. The widespread nature of the problem and the complexity of the signals in persistent AF reduce the success rate of ablation therapy. Although signal processing applied to the extraction of relevant features from these complex electrograms has helped to improve the efficacy of ablation therapy in persistent/permanent AF, an improved understanding of these complex signals should help to identify the sources of AF and further increase the success rate of ablation therapy.
Abstract:
Predicting the impacts of ocean acidification on coastal ecosystems requires an understanding of the effects on macroalgae and their grazers, as these underpin the ecology of rocky shores. Whilst calcified coralline algae (Rhodophyta) appear to be especially vulnerable to ocean acidification, there is a lack of information concerning calcified brown algae (Phaeophyta), which are not obligate calcifiers but are still important producers of calcium carbonate and organic matter in shallow coastal waters. Here, we compare ecological shifts in subtidal rocky shore systems along CO2 gradients created by volcanic seeps in the Mediterranean and Papua New Guinea, focussing on abundant macroalgae and grazing sea urchins. In both the temperate and tropical systems the abundances of grazing sea urchins declined dramatically along CO2 gradients. Temperate and tropical species of the calcifying macroalgal genus Padina (Dictyotaceae, Phaeophyta) showed reductions in CaCO3 content with CO2 enrichment. In contrast to other studies of calcified macroalgae, however, we observed an increase in the abundance of Padina spp. in acidified conditions. Reduced sea urchin grazing pressure and significant increases in photosynthetic rates may explain the unexpected success of decalcified Padina spp. at elevated levels of CO2. This is the first study to provide a comparison of ecological changes along CO2 gradients between temperate and tropical rocky shores. The similarities we found in the responses of Padina spp. and sea urchin abundance at several vent systems increase confidence in predictions of the ecological impacts of ocean acidification over a large geographical range.
Abstract:
Headed on the first page with the words "Nomenclatura hebraica," this handwritten volume is a vocabulary with the Hebrew word in the left column and the English translation on the right. While the book is arranged in sections by letter, individual entries do not appear in strict alphabetical order. The small vocabulary varies greatly and includes entries like enigma, excommunication, and martyr, as well as cucumber and maggot. There are translations of the astrological signs at the end of the volume. A poem is written at the bottom of the last page in a different hand: "Women when good the best of saints/ that bright seraphick lovely/ she, who nothing of an angel/ wants but truth & immortality./ Verse 2: Who silken limbs & charming/ face. Keeps nature warm."
Abstract:
Previous research on computers and graphics calculators in mathematics education has examined effects on curriculum content and students’ mathematical achievement and attitudes while less attention has been given to the relationship between technology use and issues of pedagogy, in particular the impact on teachers’ professional learning in specific classroom and school environments. This observation is critical in the current context of educational policy making, where it is assumed – often incorrectly – that supplying schools with hardware and software will increase teachers’ use of technology and encourage more innovative teaching approaches. This paper reports on a research program that aimed to develop better understanding of how and under what conditions Australian secondary school mathematics teachers learn to effectively integrate technology into their practice. The research adapted Valsiner’s concepts of the Zone of Proximal Development, Zone of Free Movement and Zone of Promoted Action to devise a theoretical framework for analysing relationships between factors influencing teachers’ use of technology in mathematics classrooms. This paper illustrates how the framework may be used by analysing case studies of a novice teacher and an experienced teacher in different school settings.
Abstract:
The aims of this study were to investigate the beliefs concerning the philosophy of science held by practising science teachers and to relate those beliefs to their pupils' understanding of the philosophy of science. Three philosophies of science, differing in the way they relate experimental work to other parts of the scientific enterprise, are described. By the use of questionnaire techniques, teachers of four extreme types were identified. These are: the H type or hypothetico-deductivist teacher, who sees experiments as potential falsifiers of hypotheses or of logical deductions from them; the I type or inductivist teacher, who regards experiments mainly as a way of increasing the range of observations available for recording before patterns are noted and inductive generalisation is carried out; the V type or verificationist teacher, who expects experiments to provide proof and to demonstrate the truth or accuracy of scientific statements; and the O type, who has no discernible philosophical beliefs about the nature of science or its methodology. Following interviews of selected teachers to check their responses to the questionnaire and to determine their normal teaching methods, an experiment was organised in which parallel groups were given H, I and V type teaching in the normal school situation during most of one academic year. Using pre-test and post-test scores on a specially developed test of pupil understanding of the philosophy of science, it was shown that pupils were positively affected by their teacher's implied philosophy of science. There was also some indication that V type teaching improved marks obtained in school science examinations, but appeared to discourage the more able from continuing the study of science. Effects were also noted on the vocabulary used by pupils to describe scientists and their activities.
Abstract:
There is increasing pressure on university staff to provide ever more information and resources to students. This study investigated student opinions on (audio) podcasts and (video) vodcasts and how well they met requirements and aided learning processes. Two experiments at Aston University looked at student opinion on, and usage of, podcasts and vodcasts for a selection of their psychology lectures. Recordings were produced first using a hand-held camcorder, and then using the in-house media department; WebCT was used to distribute the podcasts and vodcasts, and attitude questionnaires were circulated at two time points. Overall, students indicated that podcasts and vodcasts were a beneficial additional resource for learning, particularly when used in conjunction with lecturers' slides and as a tool for revision/assessment. The online material translated into students having an increased understanding of the material, which supplemented and enhanced their learning without being a substitute for traditional lectures. There is scope for the provision of portable media files to become standard practice within higher education, integrating distance and online learning with traditional approaches to improve teaching and learning.
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized in large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins.
A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P.
pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself.
Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations on the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, and even in ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins. Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature.
They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, able to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment.
We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
The past decade has seen a drive to give all pupils the opportunity to study a Modern Foreign Language (MFL) in schools in England, making the teaching and learning of foreign languages part of the primary school curriculum. The Languages for All: Languages for Life (DfES, 2002) policy was introduced through the National Languages Strategy with an objective to increase the nation's language capability. Raising the educational standard for all pupils is another government initiative, with a strong emphasis on inclusion. As the Languages for All policy stresses the importance and benefits of language learning, and inclusion suggests equality and provision for all, this study examines the inclusion of all key stage 2 pupils in foreign language learning and describes the perceptions and experiences of pupils, particularly those identified as having special educational needs (SEN), in their performances and negotiations in learning French. As a small-scale, qualitative, ethnographically informed study, this research is based on participant observation and semi-structured interviews with pupils, teachers of French, teaching assistants and parents. This study draws upon Nussbaum's capabilities approach and Bourdieu's concepts as theoretical foundations to analyse the ‘inclusive’ French classroom. As the capabilities approach takes people as ends, not means, and goes beyond a focus on resources, it lends itself to critical thinking on issues around inclusion in education. In this context, the researcher investigates the experiences of pupils who struggle with foreign language learning because of their abilities or disabilities, and frames the discussion around the capabilities approach. The study also focuses on motivation and identity in foreign language learning, and draws upon Bourdieu's concepts of capital, habitus and field to analyse how the participants make sense of and respond to their own circumstances in relation to their performances in the language learning process.
This research thus considers Bourdieu's concepts for a deeper understanding of issues of inequality in learning French, and takes up Nussbaum's insight that pupils may differ in what learning French means to them: it is not how they differ, but the difference in their capability to choose and achieve what they value, that should matter. The findings indicate that although, initially, the French classroom appears ‘inclusive’ due to the provision and practices of inclusion, a closer look shows it to be exclusionary. In addition, responses from the participants on the usefulness and benefits of foreign language learning contradict the objectives of the Languages for All policy, illustrating the complexity of the ‘inclusive’ MFL classroom. This research concludes that structural and interpersonal practices of inclusion contribute to the disguising of exclusion in a classroom deemed ‘inclusive’. The implications are that an understanding and consideration of other aspects of life, such as well-being, interests, needs and values, should form a necessary part of the language policy.
Abstract:
The ultimate intent of this dissertation was to broaden and strengthen our understanding of IT implementation by focusing research efforts on the dynamic nature of the implementation process. More specifically, efforts were directed toward opening the "black box" and providing the story that explains how and why contextual conditions and implementation tactics interact to produce project outcomes. In pursuit of this objective, the dissertation was aimed at theory building and adopted a case study methodology combining qualitative and quantitative evidence. Specifically, it examined the implementation process, use and consequences of three clinical information systems at Jackson Memorial Hospital, a large tertiary care teaching hospital. As a preliminary step toward the development of a more realistic model of system implementation, the study proposes a new set of research propositions reflecting the dynamic nature of the implementation process. Findings clearly reveal that successful implementation projects are likely to be those where key actors envision end goals, anticipate challenges ahead, and recognize the presence of and seize opportunities. It was also found that IT implementation is characterized by the systems theory of equifinality; that is, there are likely several equally effective ways to achieve a given end goal. The selection of a particular implementation strategy appears to be a rational process where actions and decisions are largely influenced by the degree to which key actors recognize the mediating role of each tactic and are motivated to action. The nature of the implementation process is also characterized by the concept of "duality of structure"; that is, context and actions mutually influence each other. Another key finding suggests that there is no underlying program that regulates the process of change and moves it from one given point toward a subsequent and already prefigured end.
For this reason, the implementation process cannot be thought of as a series of activities performed in a sequential manner, as conceived in stage models. Finally, it was found that IT implementation is punctuated by a certain indeterminacy. Results suggest that unfavorable and undesirable consequences are less likely to occur only when substantial efforts are focused on what to look for and think about.
Abstract:
The purpose of this inquiry was to investigate the impact of a large, urban school district's experience in implementing a mandated school improvement plan and to examine how that plan was perceived, interpreted, and executed by those charged with the task. The research addressed the following questions: First, by whom was the district implementation plan designed, and what factors were considered in its construction? Second, what impact did the district implementation plan have on those charged with its implementation? Third, what impact did the district plan have on the teaching and learning practices of a particular school? Fourth, what aspects of the implementation plan were perceived as most and least helpful by school personnel in achieving stated goals? Last, what were the intended and unintended consequences of an externally mandated and directed plan for improving student achievement? The implementation process was measured against Fullan's model as expounded upon in The Meaning of Educational Change (1982) and The New Meaning of Educational Change (1990). The Banya implementation model (1993), because it added a dimension not adequately addressed by Fullan, was also considered. A case study was used as the methodological framework of this qualitative study. Sources of data used in this inquiry included document analysis, participant observations in situ, follow-up interviews, the "long" interview, and triangulation. The study was conducted over a twelve-month period. Findings were obtained from the content analysis of interview transcripts of multiple participants. Results were described and interpreted using the Fullan and Banya models as the descriptive framework. A cross-case comparison of the multiple perspectives of the same phenomena by various participants was constructed. The study concluded that the school district's implementation plan to improve student achievement was closely aligned to Fullan's model, although not intentionally.
The research also showed that where there was common understanding at all levels of the organization as to the expectations for teachers, the level of support to be provided, and the availability of resources, successful implementation occurred. The areas where successful implementation did not occur were those where the complexity of the changes was underestimated and processes for dealing with unintended consequences were not considered or adequately addressed. The unique perspectives of the various participants, from the superintendent to the classroom teacher, are described. Finally, recommendations for enhancement of implementation are offered and possible topics for further research studies are postulated.
Abstract:
Proofs by induction are central to many computer science areas such as data structures, theory of computation, programming languages, program efficiency (time complexity), and program correctness. Proofs by induction can also improve students' understanding and performance of computer science concepts such as programming languages, algorithm design, and recursion, as well as serve as a medium for teaching them. Even though students are exposed to proofs by induction in many courses of their curricula, they still have difficulties understanding and performing them. This impacts the whole course of their studies, since proofs by induction are omnipresent in computer science. Specifically, students do not gain conceptual understanding of induction early in the curriculum and, as a result, they have difficulties applying it to more advanced areas later on in their studies. The goal of my dissertation is twofold: (1) identifying sources of computer science students' difficulties with proofs by induction, and (2) developing a new approach to teaching proofs by induction by way of an interactive and multimodal electronic book (e-book). For the first goal, I undertook a study to identify possible sources of computer science students' difficulties with proofs by induction. Its results suggest that there is a close correlation between students' understanding of inductive definitions and their understanding and performance of proofs by induction. In designing and developing my e-book, I took into consideration the results of my study, as well as the drawbacks of the current methodologies of teaching proofs by induction for computer science. I designed my e-book to be used as a standalone and complete educational environment. I also conducted a study on the effectiveness of my e-book in the classroom.
The results of my study suggest that, unlike the current methodologies of teaching proofs by induction for computer science, my e-book helped students overcome many of their difficulties and gain conceptual understanding of proofs by induction.
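As an illustration of the kind of proof at issue (a standard textbook example of my own, not one drawn from the e-book), a proof by induction establishes a base case and an inductive step:

```latex
\textbf{Claim.} For every integer $n \ge 1$, $\displaystyle\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$.

\textbf{Base case} ($n = 1$): $\sum_{k=1}^{1} k = 1 = \frac{1 \cdot 2}{2}$.

\textbf{Inductive step.} Assume the claim holds for some $n \ge 1$. Then
\[
\sum_{k=1}^{n+1} k = \frac{n(n+1)}{2} + (n+1) = \frac{(n+1)(n+2)}{2},
\]
which is the claim for $n + 1$. By induction, the claim holds for all $n \ge 1$. \qed
```

The two-part structure (base case, then a step that consumes the induction hypothesis) is exactly what, per the abstract, students often fail to connect with the inductive definitions that justify it.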
Abstract:
I conducted this study to provide insights toward a deeper understanding of the association between culture and writing by building, assessing, and refining a conceptual model of second language writing. To do this, I examined culture and coherence, as well as the relationship between them, through a mixed methods research design. Coherence is an important and complex concept in ESL/EFL writing. I studied the concept of coherence in the research context of contrastive rhetoric, comparing the coherence quality of argumentative essays written by undergraduates in Mainland China and their U.S. peers. In order to analyze the complex concept of coherence, I synthesized five linguistic theories of coherence: Halliday and Hasan's cohesion theory, Carroll's theory of coherence, Enkvist's theory of coherence, Topical Structure Analysis, and Toulmin's Model. Based on this synthesis, 16 variables were generated. Across these 16 variables, a Hotelling's T² test was conducted to detect differences in argumentative coherence between the essays written by the two groups of participants. To complement the statistical analysis, I conducted 30 interviews with the writers in the study. Participants' responses were analyzed with open and axial coding. By analyzing the empirical data, I refined the conceptual model, adding more categories and establishing associations among them. The study found that U.S. students made use of more pronominal reference, while Chinese students adopted more lexical devices of reiteration and more extended parallel progression. The interview data implied that this difference may be associated with differences in linguistic features and rhetorical conventions between Chinese and English. As far as Toulmin's Model is concerned, Chinese students scored higher on data than their U.S. peers.
According to the interview data, this may be because Toulmin's Model, modified as three elements of argument, has long been widely taught in Chinese writing instruction, while U.S. interview participants said they were not taught to write essays according to Toulmin's Model. Implications were drawn from the process of textual data analysis and the formulation of a structural model defining coherence. These implications are aimed at informing writing instruction, assessment, peer review, and self-revision.
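The multivariate comparison across the 16 coherence variables corresponds to a two-sample Hotelling's T² test. A minimal sketch of that statistic follows, on synthetic data; the variable count, group sizes, and values here are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats


def hotelling_t2(X, Y):
    """Two-sample Hotelling's T^2 test: do two groups differ jointly
    across several variables? X and Y are (observations x variables)."""
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)                      # mean-vector difference
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)  # pooled covariance
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)    # T^2 statistic
    f = (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p) * t2          # F-distributed form
    p_value = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, p_value


# Synthetic example: 3 coherence variables, two groups of 30 writers each.
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, size=(30, 3))
group_b = rng.normal(0.5, 1.0, size=(30, 3))
t2, p = hotelling_t2(group_a, group_b)
print(f"T^2 = {t2:.2f}, p = {p:.4f}")
```

Unlike running 16 separate univariate t-tests, the T² test controls the overall error rate by testing all variables in a single multivariate comparison.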
Abstract:
Most reef-building corals are known to engage in symbiosis not only with unicellular dinoflagellates of the genus Symbiodinium; they also sustain highly complex symbiotic associations with other microscopic organisms such as bacteria, fungi, and viruses. The details of these non-pathogenic interactions remain largely unclear. The impetus of this study is to gain a better understanding of the symbiotic interactions between marine bacteria and a variety of coral species of differing morphologies. Studies have shown that certain bacterial orders associate specifically with certain coral species, making these symbiotic consortia non-random. Consequently, both corals and bacteria may emit chemical cues that enable the two parties to find one another and establish the symbiosis. One potential chemical cue is the compound DMSP (dimethylsulfoniopropionate) and its sulphur derivatives. Reef-building corals are believed to be major producers of DMSP and its derivatives during times of stress; as a result, corals could potentially attract their bacterial consortia through DMSP production. Corals may be able to adapt to fluctuating environmental conditions by shifting their bacterial communities toward those that aid survival. This attraction may stem from the capability of a variety of marine bacteria to catabolize DMSP through different metabolically significant pathways, which may be necessary for the survival of these mutualistic interactions. To test the hypothesis that coral-produced DMSP plays a role in attracting symbiotic bacteria, this study used high-throughput sequencing paired with bacterial isolation techniques to characterize the microbial community of the stony coral Porites astreoides.
We conducted DMSP swarming and chemotaxis assays to determine the response of these coral-associated bacterial isolates to DMSP at differing concentrations. Preliminary data from this study suggest that six of the ten bacterial isolates are capable of unidirectional motility; these six isolates are also capable of swarming motility up an increasing DMSP concentration gradient, indicating a form of positive chemotaxis by the bacteria toward DMSP. By better understanding the dynamics that drive the associations between bacterial communities and corals, we can further aid coral protection and conservation efforts. This study also further elucidates the significance of DMSP in the survival of corals under times of stress.