900 results for Visualization Using Computer Algebra Tools


Relevance:

100.00%

Abstract:

For the last two decades, higher education institutions have been actively engaged in the use of online technologies with the aim of transforming the ways we teach and learn to improve students’ learning experiences and outcomes. However, despite significant investment in infrastructure and training and a wide-scale uptake of such technologies, the promised transformative effect on student learning is yet to be actualised outside of small pockets of innovation. In this paper, we argue that one of the factors contributing to the lack of qualitative, large-scale transformation is students’ lack of preparedness and experience in using online tools for academic purposes. Focusing on students’ experience of a learning activity that used blogging to promote critical thinking and reflection, we draw on data from a doctoral study to demonstrate how a techno-literacy framework can be used to analyse the nuances of students’ preparedness to put such technologies to work within a formal higher education setting. We argue that, although contemporary university students are largely operationally literate when using online learning tools, they often lack the cultural and critical skills required to use such technologies in a meaningful way to support powerful learning. We conclude that, for online learning technologies to transform learning, students need to be supported to develop these higher order techno-literacies.

Relevance:

100.00%

Abstract:

The purpose of this project is to develop a three-dimensional block model for a garnet deposit at Alder Gulch, Madison County, Montana. Garnets occur in the Precambrian metamorphic Red Wash gneiss and similar rocks in the vicinity. This project seeks to model the percentage of garnet in a deposit called the Section 25 deposit using the Surpac software. The data available for this work are drillhole, trench and grab-sample data obtained from previous exploration of the deposit. The creation of the block model involves validating the data, creating composites of assayed garnet percentages and computing basic statistics on the composites using Surpac's statistical tools. Variogram analysis will be conducted on the composites to quantify the continuity of the garnet mineralization. A three-dimensional block model will then be created and filled with estimates of garnet percentage using different reserve-estimation methods, and the results compared.
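As a concrete illustration of one of the simpler reserve-estimation methods that could be compared, the sketch below estimates a block's garnet percentage from nearby composites by inverse-distance weighting. The data layout and function names are hypothetical illustrations, not Surpac functionality:

```python
import math

def idw_estimate(block_center, samples, power=2.0):
    """Inverse-distance-weighted estimate of grade at a block centroid.

    samples: list of ((x, y, z), garnet_pct) composite values.
    """
    num, den = 0.0, 0.0
    for xyz, grade in samples:
        d = math.dist(block_center, xyz)
        if d < 1e-9:              # sample coincides with the centroid
            return grade
        w = 1.0 / d ** power      # closer composites get more weight
        num += w * grade
        den += w
    return num / den

# Toy composites: ((easting, northing, elevation), garnet %)
composites = [((0, 0, 0), 10.0), ((10, 0, 0), 20.0), ((0, 10, 0), 30.0)]
# The three samples are equidistant from (5, 5, 0), so the
# estimate reduces to their plain average.
print(idw_estimate((5, 5, 0), composites))  # ~20.0
```

In practice, the variogram analysis described above would motivate kriging weights rather than the simple 1/d^power weights used here.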

Relevance:

100.00%

Abstract:

Adeno-associated viral (AAV) vectors are among the most widely used gene transfer systems in basic and pre-clinical research and have been employed in more than 160 clinical trials. AAV vectors are commonly produced in producer cell lines such as HEK293 by co-transfection with a so-called vector plasmid and one (in this work) or two so-called helper plasmids. The vector plasmid contains the transgene cassette of interest (TEC) flanked by AAV’s inverted terminal repeats (ITRs), which serve as packaging signals, whereas the helper plasmid provides the required AAV and helper virus functions in trans. A pivotal aspect of AAV vectorology is the manufacturing of AAV vectors free from impurities arising during the production process. These impurities include capsids containing prokaryotic sequences, e.g. antibiotic resistance genes originating from the producer plasmids. In the first part of the thesis we aimed at improving the safety of AAV vectors. As we found that encapsidated prokaryotic sequences (using the ampicillin resistance gene as an indicator) cannot be removed by standard purification methods, we investigated whether the producer plasmids could be replaced by minicircles (MCs). MCs are circular DNA constructs that contain no functional or coding prokaryotic sequences; they consist only of the TEC and a short sequence required for production and purification. MC counterparts of a vector plasmid encoding enhanced green fluorescent protein (eGFP) and of a helper plasmid encoding AAV serotype 2 (AAV2) and helper adenovirus (Ad) genes were designed and produced by PlasmidFactory (Bielefeld, Germany). Using all four possible combinations of plasmids and MCs, single-stranded AAV2 vectors (ssAAV) and self-complementary AAV vectors (scAAV) were produced and characterized for vector quantity, quality and functionality.
The analyses showed that plasmids can be replaced by MCs without decreasing vector production efficiency or vector quality. MC-derived scAAV vector preparations even outperformed plasmid-derived preparations, displaying up to 30-fold improved transduction efficiencies. Using MCs as tools, we found that the vector plasmid is the main source of encapsidated prokaryotic sequences. Remarkably, plasmid-derived scAAV vector preparations contained a much higher relative amount of prokaryotic sequences (up to 26.1%, relative to TEC) than ssAAV vector preparations (up to 2.9%). By replacing both plasmids with MCs, the amount of functional prokaryotic sequences could be decreased to below the limit of quantification. Additional analyses for DNA impurities other than prokaryotic sequences showed that scAAV vectors generally contained a higher amount of non-vector DNA (e.g. adenoviral sequences) than ssAAV vectors. For both ssAAV and scAAV vector preparations, MC-derived vectors tended to contain lower amounts of foreign DNA. None of the vectors tested was found to induce immunogenicity. In summary, we demonstrated that the quality of AAV vector preparations can be significantly improved by replacing producer plasmids with MCs. Upon transduction of a target tissue, AAV vector genomes predominantly remain in an episomal state, as duplex DNA circles or concatemers. These episomal forms mediate long-term transgene expression in terminally differentiated cells, but are lost in proliferating cells due to cell division. Therefore, in the second part of the thesis, in cooperation with Claudia Hagedorn and Hans J. Lipps (University Witten/Herdecke), an AAV vector genome was equipped with an autonomous replication element, a scaffold/matrix attachment region (S/MAR).
An AAV-S/MAR vector encoding eGFP and a blasticidin resistance gene, and a control vector with the same TEC but lacking the S/MAR element (AAV-ΔS/MAR), were produced and transduced into highly proliferative HeLa cells. Antibiotic pressure was employed to select for cells stably maintaining the vector genome. AAV-S/MAR-transduced cells yielded a higher number of colonies than AAV-ΔS/MAR-transduced cells. Colonies derived from each vector transduction were picked and cultured further; they remained eGFP-positive (up to 70 days, the maximum cultivation period) even in the absence of antibiotic selection pressure. Interestingly, the mitotic stability of both AAV-S/MAR and the control vector AAV-ΔS/MAR was found to result from episomal maintenance of the vector genome. This finding indicates that, under specific conditions such as the mild selection pressure we employed, “common” AAV vectors persist episomally. Thus, the S/MAR element increases the establishment frequency of stable episomes but is not a prerequisite.

Relevance:

100.00%

Abstract:

Across the nation, librarians work with caregivers and children to encourage engagement in their early literacy programs. However, these early literacy programs that libraries provide have been left mostly undocumented by research, especially through quantitative methods. Valuable Initiatives in Early Learning that Work Successfully (VIEWS2) was designed to test new ways to measure the effectiveness of these early literacy programs for young children (birth to kindergarten), using a mixed-methods, quasi-experimental design. Using two innovative tools, researchers collected data at 120 public library storytimes in the first year of research, observing approximately 1,440 children ranging from birth to 60 months of age. Analysis of year-one data showed a correlation between the early literacy content of the storytime program and children’s outcomes in terms of early literacy behaviors. These findings demonstrate that young children who attend public library storytimes are responding to the early literacy content in the storytime programs.

Relevance:

100.00%

Abstract:

This study took place in the context of the Networked School (École en réseau) initiative in Québec. An elementary class used a collaborative digital space (Knowledge Forum) to carry out arts-education activities: the appreciation of a corpus of artworks and the creation of an art object. The progression of the written discourse around these two shared objects was studied. Our results show that the students succeeded in advancing their contributions on both shared objects. By interacting with one another, they created shared knowledge artefacts and developed an artistic language of their own. Based on these findings, pedagogical implications are formulated for encouraging student participation in arts-education activities using digital tools that support collaboration.

Relevance:

100.00%

Abstract:

Introduction: Some cancers are preventable if exposure to carcinogenic substances in the environment is avoided. In Colombia, Cundinamarca is one of the departments with the largest increases in the cancer mortality rate, and in the municipality of Sibaté residents have expressed concern about the rise of the disease. In global environmental health, georeferencing applied to the study of health phenomena has been used successfully, with valid results. This study proposed using geographic information tools to generate spatio-temporal analyses that make the behaviour of cancer in Sibaté visible and support hypotheses of environmental influences on clusters of cases. Objective: To obtain the incidence and prevalence of cancer cases among inhabitants of Sibaté and to georeference the cases over a five-year period, based on a review of records. Methodology: Descriptive, exploratory, cross-sectional study of all cancer diagnoses between 2010 and 2014 found in the archives of the municipal Health Secretariat. Only patients who were permanent residents of the municipality and were diagnosed with cancer between 2010 and 2014 were included. For each case, gender, age, socioeconomic stratum, educational level, occupation and marital status were recorded. The date of diagnosis was used for the temporal analysis; the home address, cancer type and geographic coordinates were used for the spatial analysis. Geographic coordinates were generated with a Garmin GPS device, and maps were created with the locations of the patients' homes. The information was processed with Epi Info 7. Results: 107 cancer cases were found registered at the Sibaté Health Secretariat: 66 women and 41 men. Across both genders, 30.93% of the cases involved the reproductive system, 18.56% the digestive system and 17.53% the integumentary system. Two large spatial clusters were found in the study area: one in the Pablo Neruda neighbourhood with 12 cases (21.05%) and one in the urban core of Sibaté with 38 cases (66.67%). Conclusion: The study corroborated that geographic analysis with spatio-temporal and exposure variables can be a tool for generating hypotheses about associations between cancer cases and environmental factors.
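The descriptive tallies reported above (case counts by organ system and a crude cumulative incidence) follow a simple pattern that can be sketched as below; the record layout and the population figure are invented for illustration and are not the study's data:

```python
from collections import Counter

# Hypothetical registry rows: (year_of_diagnosis, sex, organ_system)
cases = [
    (2010, "F", "reproductive"), (2011, "M", "digestive"),
    (2012, "F", "reproductive"), (2013, "F", "integumentary"),
    (2014, "M", "reproductive"),
]

by_system = Counter(system for _, _, system in cases)
total = len(cases)
for system, n in by_system.most_common():
    print(f"{system}: {n} case(s) ({100 * n / total:.2f}%)")

# Crude 5-year cumulative incidence per 100,000 inhabitants,
# using an invented mid-period population for the municipality.
population = 38_412  # assumption, not from the study
print(f"5-year cumulative incidence: {1e5 * total / population:.1f} per 100,000")
```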

Relevance:

60.00%

Abstract:

There is a major effort in medical imaging to develop algorithms to extract information from DTI and HARDI, which provide detailed information on brain integrity and connectivity. Although the images have recently advanced to provide extraordinarily high angular resolution and spatial detail, including an entire manifold of information at each point in the 3D images, there has been no readily available means to view the results. This impedes developments in HARDI research, which needs methods to check the plausibility and validity of image processing operations on HARDI data, and to appreciate data features or invariants that might serve as a basis for new directions in image segmentation, registration, and statistics. We present a set of tools to provide interactive display of HARDI data, including both a local rendering application and an off-screen renderer that works with a web-based viewer. Visualizations are presented after registration and averaging of HARDI data from 90 human subjects, revealing important details that there would be no direct way to appreciate using conventional display of scalar images.

Relevance:

60.00%

Abstract:

Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has focused on the wide range of 3D and simulation applications suitable for construction processes. Despite developments in interoperability and standardization of products, VT usage has remained very low when it comes to communicating with and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research a positivistic methodology that includes the comparison of 3D models and traditional 2D methods is proposed. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented using both methods.

Relevance:

60.00%

Abstract:

Background: Along the internal carotid artery (ICA), atherosclerotic plaques are often located in its cavernous sinus (parasellar) segments (pICA). Studies indicate that the incidence of pre-atherosclerotic lesions is linked with the complexity of the pICA; however, the pICA shape has never been objectively characterized. Our study aims to provide objective mathematical characterizations of the pICA shape. Methods and results: Three-dimensional (3D) computer models, reconstructed from contrast-enhanced computed tomography (CT) data of 30 randomly selected patients (60 pICAs), were analyzed with modern visualization software and new mathematical algorithms. As objective measures of pICA shape complexity, we provide calculations of the curvature energy, torsion energy, and total complexity of 3D skeletons of the pICA lumen. We further measured the posterior knee of the so-called "carotid siphon" with a virtual goniometer and performed correlations between the objective mathematical calculations and the subjective angle measurements. Conclusions: Firstly, our study provides mathematical characterizations of the pICA shape, which can serve as objective reference data for analyzing connections between pICA shape complexity and vascular diseases. Secondly, we provide an objective method for creating such data. Thirdly, we evaluate the usefulness of subjective goniometric measurements of the angle of the posterior knee of the carotid siphon.
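To make the notion of curvature energy concrete: for a space curve it is the integral of squared curvature over arc length, E = ∫ κ² ds. A generic discrete estimate over a polyline skeleton (turning angle per unit length at each interior vertex) can be sketched as follows; this is an illustration under stated assumptions, not the paper's algorithm:

```python
import math

def curvature_energy(points):
    """Approximate the bending energy E = integral of kappa^2 ds of a 3D
    polyline, using turning angles at interior vertices as a discrete
    curvature estimate (a common heuristic, not the paper's method)."""
    energy = 0.0
    for i in range(1, len(points) - 1):
        a = [points[i][k] - points[i - 1][k] for k in range(3)]
        b = [points[i + 1][k] - points[i][k] for k in range(3)]
        la = math.sqrt(sum(c * c for c in a))
        lb = math.sqrt(sum(c * c for c in b))
        cos_t = sum(x * y for x, y in zip(a, b)) / (la * lb)
        theta = math.acos(max(-1.0, min(1.0, cos_t)))  # turning angle
        ds = 0.5 * (la + lb)       # arc length attributed to the vertex
        kappa = theta / ds         # discrete curvature
        energy += kappa ** 2 * ds
    return energy

# Sanity checks: a straight segment has zero bending energy,
# while a circle of radius R has energy close to 2*pi/R.
line = [(t, 0.0, 0.0) for t in range(10)]
R, N = 2.0, 400
circle = [(R * math.cos(2 * math.pi * i / N),
           R * math.sin(2 * math.pi * i / N), 0.0) for i in range(N + 1)]
print(curvature_energy(line), curvature_energy(circle))
```

A torsion-energy measure would follow the same pattern with a discrete torsion estimate in place of the turning angle.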

Relevance:

60.00%

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamical systems that govern various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating user observation and understanding of the flow field in a clear manner. My research mainly focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints from which to observe flow fields become critical questions. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When projecting 3D streamlines to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions and their interconnections in the transformed space.
Most viewpoint selection methods consider only external viewpoints outside of the flow field, which do not convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside of the flow field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views of the flow field for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to deal with unsteady flow fields. Besides flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we have developed a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
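The viewpoint-path idea can be illustrated with a standard clamped cubic B-spline evaluated by De Boor's algorithm: with a clamped knot vector the camera path starts at the first internal viewpoint and ends at the last, smoothly approximating those in between. The viewpoints below are invented, and this is a generic sketch, not the implementation from [54]:

```python
def find_span(t, knots, p, n):
    """Index k such that knots[k] <= t < knots[k+1] (clamped knot vector)."""
    if t >= knots[n + 1]:
        return n
    for i in range(p, n + 1):
        if knots[i] <= t < knots[i + 1]:
            return i

def deboor(t, knots, ctrl, p=3):
    """Evaluate a degree-p B-spline curve at parameter t via De Boor's algorithm."""
    n = len(ctrl) - 1
    k = find_span(t, knots, p, n)
    d = [list(ctrl[k - p + j]) for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            i = k - p + j
            alpha = (t - knots[i]) / (knots[i + p - r + 1] - knots[i])
            d[j] = [(1 - alpha) * d[j - 1][c] + alpha * d[j][c] for c in range(3)]
    return tuple(d[p])

# Five hypothetical internal viewpoints inside the flow field.
viewpoints = [(0, 0, 0), (2, 4, 1), (5, 3, 2), (7, 6, 1), (9, 2, 0)]
# Clamped cubic knot vector: the path interpolates the end viewpoints.
knots = [0, 0, 0, 0, 0.5, 1, 1, 1, 1]
path = [deboor(i / 20, knots, viewpoints) for i in range(21)]
print(path[0], path[-1])  # endpoints coincide with the first and last viewpoint
```

A path that strictly passes through every interior viewpoint would instead require solving for control points that make the spline interpolate them.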

Relevance:

50.00%

Abstract:

Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Amongst the methods used to support this process are personalised process visualisations. In this paper, we review the development of this visualisation trend, and then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support to the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalised aspects, which we believe enhance the process of communication between analysts and stakeholders in BPM process (re)design activities.

Relevance:

50.00%

Abstract:

In this age of rapidly evolving technology, teachers are encouraged to adopt ICTs by government, syllabus, school management, and parents. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In particular, in the science classroom, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified what technologies science teachers use on a regular basis, and whether some purchased technologies have proven to be too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers in relation to the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, resulting in the development of the PETTaL model, which captured the salient factors in the data.
This model incorporated usability theory from the Human-Computer Interaction literature, and education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus etc.), Environment (classroom / learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility etc.) and Learners (academic ability, diversity, behaviour etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing an agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration, and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET can make the tacit knowledge acquired by experienced teachers about technology use in classrooms explicit to novice teachers. Additionally, the PET could be used as a research tool to discover a teacher’s professional development needs. Therefore, the outcomes of this study can aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classrooms.
This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies for the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in the data from this study, responds to calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.

Relevance:

50.00%

Abstract:

In this paper we present an update on our novel visualization technologies for cellular immune interactions from both large-scale spatial and temporal perspectives. We do so with a primary motive: to present a visually and behaviourally realistic environment to the community of experimental biologists and physicians, such that their knowledge and expertise may be more readily integrated into the model creation and calibration process. Visualization aids understanding, as we rely on visual perception to make crucial decisions. For example, with our initial model, we can visualize the dynamics of an idealized lymphatic compartment, with antigen-presenting cells (APC) and cytotoxic T lymphocyte (CTL) cells. The visualization technology presented here offers the researcher the ability to start, pause, zoom in, zoom out and navigate in three dimensions through an idealized lymphatic compartment.
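A minimal agent-based sketch of the kind of APC-CTL dynamics such a compartment visualizes might look like the following; all parameters (compartment size, contact radius, agent counts, step length) are invented for illustration and are not taken from the model described above:

```python
import random

random.seed(1)

SIZE = 50.0      # idealized cubic lymphatic compartment (arbitrary units)
CONTACT = 2.0    # distance at which an APC can present antigen to a CTL

def step(pos):
    """One random-walk step, reflected at the compartment walls."""
    new = []
    for c in pos:
        c += random.uniform(-1.0, 1.0)
        c = min(max(c, 0.0), SIZE)
        new.append(c)
    return tuple(new)

apcs = [tuple(random.uniform(0, SIZE) for _ in range(3)) for _ in range(20)]
ctls = [tuple(random.uniform(0, SIZE) for _ in range(3)) for _ in range(50)]

contacts = 0
for _ in range(200):                  # simulated time steps
    apcs = [step(p) for p in apcs]
    ctls = [step(p) for p in ctls]
    for a in apcs:
        for c in ctls:
            if sum((x - y) ** 2 for x, y in zip(a, c)) < CONTACT ** 2:
                contacts += 1
print("APC-CTL contact events:", contacts)
```

In an interactive visualization, each time step would be rendered so the researcher can pause and navigate through the compartment while the agent positions update.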