843 results for computer technology enhanced pedagogy
Abstract:
This appendix is divided into three sections. The first section contains abstracts of each of the eight computer programs in the system, instructions for keypunching the three input documents, and computer operating instructions for each program. The second section contains system flowcharts for the entire system as well as program flowcharts for each program. The last section contains PL/I program listings of each program.
Abstract:
In this thesis we present the design of a systematic, integrated computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interviewed key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals of future trends. Finally, we suggest a generic design process for environment scanning.
Abstract:
A computer program to adjust roadway profiles has been developed to serve as an aid to the county engineers of the State of Iowa. Many hours are spent reducing field notes and calculating adjusted roadway profiles to prepare an existing roadway for paving that will produce a high-quality ride and be as maintenance-free as possible. Since the computer is well suited to long, tedious tasks, programming this work for a computer frees the engineer from them and leaves more time for solving engineering problems. The type of roadway that this program is designed to adjust is a road that at some time in its history was graded to a finished subgrade. After a period of time, this road is to receive a finished paved surface. The problem then arises whether to bring the existing roadway up to the designed grade or to make profile adjustments and compromise between the existing and the design profiles. To achieve the latter using this program, the engineer needs to give the computer only a minimum amount of information.
Abstract:
Wild-type A75/17-Canine distemper virus (CDV) is a highly virulent strain, which induces a persistent infection in the central nervous system (CNS) with demyelinating disease. Wild-type A75/17-CDV, which is unable to replicate in cell lines to detectable levels, was adapted to grow in Vero cells and was designated A75/17-V. Sequence comparison between the two genomes revealed seven nucleotide differences located in the phosphoprotein (P), the matrix (M) and the large (L) genes. The P gene is polycistronic and encodes two auxiliary proteins, V and C, besides the P protein. The mutations resulted in amino acid changes in the P and V, but not in the C protein, as well as in the M and L proteins. Here, a rescue system was developed for the A75/17-V strain, which was shown to be attenuated in vivo, but retains a persistent infection phenotype in Vero cells. In order to track the recombinant virus, an additional transcription unit coding for the enhanced green fluorescent protein (eGFP) was inserted at the 3' proximal position in the A75/17-V cDNA clone. Reverse genetics technology will allow us to characterize the genetic determinants of A75/17-V CDV persistent infection in cell culture.
Abstract:
OBJECTIVE: Our aim was to evaluate a fluorescence-based enhanced-reality system to assess intestinal viability in a laparoscopic mesenteric ischemia model. MATERIALS AND METHODS: A small bowel loop was exposed, and 3 to 4 mesenteric vessels were clipped in 6 pigs. Indocyanine green (ICG) was administered intravenously 15 minutes later. The bowel was illuminated with an incoherent light source laparoscope (D-light-P, Karl Storz). The ICG fluorescence signal was analyzed with ad hoc imaging software (VR-RENDER), which provides a digital perfusion cartography that was superimposed onto the intraoperative laparoscopic image [augmented reality (AR) synthesis]. Five regions of interest (ROIs) were marked under AR guidance (1, 2a-2b, and 3a-3b, corresponding to the ischemic, marginal, and vascularized zones, respectively). One hour later, capillary blood samples were obtained by puncturing the bowel serosa at the identified ROIs, and lactates were measured using the EDGE analyzer. A surgical biopsy of each intestinal ROI was sent for mitochondrial respiratory rate assessment and for metabolite quantification. RESULTS: Mean capillary lactate levels were 3.98 (SD = 1.91) versus 1.05 (SD = 0.46) versus 0.74 (SD = 0.34) mmol/L at ROI 1 versus 2a-2b (P = 0.0001) versus 3a-3b (P = 0.0001), respectively. Mean maximal mitochondrial respiratory rate was 104.4 (±21.58) pmolO2/second/mg at ROI 1 versus 191.1 ± 14.48 (2b, P = 0.03) versus 180.4 ± 16.71 (3a, P = 0.02) versus 199.2 ± 25.21 (3b, P = 0.02). Alanine, choline, ethanolamine, glucose, lactate, myo-inositol, phosphocholine, scyllo-inositol, and valine showed statistically significantly different concentrations between ischemic and nonischemic segments. CONCLUSIONS: Fluorescence-based AR may effectively detect the boundary between the ischemic and the vascularized zones in this experimental model.
Abstract:
Computed Tomography Angiography (CTA) images are the standard for assessing peripheral artery disease (PAD). This paper presents a Computer-Aided Detection (CAD) and Computer-Aided Measurement (CAM) system for PAD. The CAD stage detects the arterial network using a 3D region-growing method and a fast 3D morphology operation. The CAM stage aims to accurately measure artery diameters from the detected vessel centerline, compensating for the partial volume effect using Expectation Maximization (EM) and a Markov Random Field (MRF). The system has been evaluated on phantom data and applied to fifteen CTA datasets, where the detection accuracy for stenosis was 88% and the measurement error was within 8%.
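The detection stage described above is a classic seeded region growing. A minimal 3D sketch of the idea follows; the intensity window and the 6-connectivity are illustrative assumptions, since the abstract does not give the paper's actual parameters:

```python
from collections import deque
import numpy as np

def region_grow_3d(volume, seed, lo, hi):
    """Grow a region from `seed`, accepting 6-connected voxels whose
    intensity lies in [lo, hi] (a sketch of the CAD detection idea)."""
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2]
                    and not mask[nz, ny, nx]
                    and lo <= volume[nz, ny, nx] <= hi):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask
```

Real CTA pipelines would follow this with morphology to clean the mask and a centerline extraction for the measurement stage, as the abstract describes.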
Abstract:
OBJECTIVE: Gadolinium-enhanced pulmonary magnetic resonance angiography (MRA) can be an option in patients with a history of adverse reaction to iodinated contrast material or with renal insufficiency. Radiation is also avoided. The aim of this study was to prospectively compare the diagnostic value of MRA with that of a diagnostic strategy taking into account catheter angiography, computed tomography angiography (CTA), and ventilation-perfusion (VQ) lung scintigraphy. MATERIAL AND METHODS: Magnetic resonance angiography was performed in 48 patients with clinically suspected pulmonary embolism (PE) using fast gradient echo coronal acquisition with gadolinium. Interpretation was done with native coronal images and multiplanar maximum intensity projection reconstructions. Results were compared to catheter angiography (n=15), CTA (n=34), and VQ (n=45), as well as 6- to 12-month clinical follow-up, according to a sequenced reference tree. RESULTS: The final diagnosis of PE was retained in 11 patients (23%). There were two false negatives and no false positives with MRA. Computed tomography angiography resulted in no false negatives or false positives. Magnetic resonance angiography had a sensitivity of 82% and a specificity of 100%. CONCLUSION: In our study, pulmonary MRA had a sensitivity of 82% and a specificity of 100% for the diagnosis of PE, slightly less sensitive than CTA. In the diagnostic algorithm of PE, pulmonary MRA should be considered as an alternative to CTA when iodine contrast injection or radiation is a significant concern.
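The reported figures follow directly from the confusion counts given in the abstract (11 PE-positive patients, 2 false negatives, no false positives among 48 patients). A small sketch of the arithmetic:

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

tp, fn = 11 - 2, 2    # 11 patients with PE, 2 missed by MRA
tn, fp = 48 - 11, 0   # remaining 37 patients, no false positives

print(round(100 * sensitivity(tp, fn)))   # 82
print(round(100 * specificity(tn, fp)))   # 100
```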
Abstract:
Background: Combining different sources of information to improve the available biological knowledge is a current challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, the right kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of biological knowledge.
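The two-step scheme described above (one kernel per data source, then a combined kernel fed to kernel PCA) can be sketched as follows. The RBF kernel choice and the equal weighting of the two sources are illustrative assumptions, not necessarily the paper's:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF (Gaussian) kernel matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(K, n_components):
    """Embed samples using the leading components of the centered kernel."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # double centering
    w, v = np.linalg.eigh(Kc)                    # eigenvalues ascending
    idx = np.argsort(w)[::-1][:n_components]     # take the largest ones
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# One kernel per source, combined by equally weighted averaging
# (hypothetical data standing in for two heterogeneous sources):
rng = np.random.default_rng(0)
X1 = rng.normal(size=(30, 5))    # e.g. expression-like data
X2 = rng.normal(size=(30, 8))    # e.g. clinical covariates
K = 0.5 * rbf_kernel(X1, 0.1) + 0.5 * rbf_kernel(X2, 0.1)
scores = kernel_pca(K, 2)        # samples embedded in 2 components
```

Because a convex combination of valid kernels is itself a valid kernel, the averaging step keeps the combined matrix positive semidefinite, which is what licenses running kernel PCA on it.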
Abstract:
The paper presents a method of analyzing rigid frames using conjugate beam theory. The development of the method is given along with an example. The method has been used to write a computer program for the analysis of twin box culverts. The culverts may be analyzed under any fill height and any of the standard truck loadings, and the wall and slab thicknesses are increased by the program as necessary. The final output is the steel requirements for both moment and shear, together with the slab and wall thicknesses.
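The conjugate beam idea is to load a fictitious beam with the M/EI diagram of the real beam: the conjugate beam's shear then gives the real slope, and its bending moment gives the real deflection. A numerical sketch for the textbook case of a simply supported beam with a midspan point load, where the closed form is P*L^3/(48*EI); the values used are illustrative, not from the paper:

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integration (kept local to avoid NumPy version issues)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def midspan_deflection_conjugate(P, L, EI, n=200_001):
    x = np.linspace(0.0, L, n)
    # Real-beam bending moment for a central point load.
    M = np.where(x <= L / 2, P * x / 2, P * (L - x) / 2)
    w = M / EI                      # elastic load on the conjugate beam
    R = _trapz(w, x) / 2            # conjugate reaction (load is symmetric)
    half = x <= L / 2
    # Conjugate-beam bending moment at midspan = real-beam deflection there.
    return R * (L / 2) - _trapz(w[half] * (L / 2 - x[half]), x[half])

P, L, EI = 10e3, 6.0, 2.0e7
print(midspan_deflection_conjugate(P, L, EI))   # matches P*L^3/(48*EI)
```

The same bookkeeping, applied member by member with the appropriate conjugate supports, is what lets the method handle rigid frames such as the box culverts described above.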
Abstract:
This master's thesis was done for Convergens Oy. Convergens is an electronics design house specializing in embedded systems and telecommunications technology. The goal of the thesis was to design a computer board for telecommunications applications for a customer, who supplied the requirement specifications. The work is limited to the design of a prototype of the device. The work mainly covers the design of the computer for a WLAN base station. The base station can be installed in offices, warehouses, and shops, as well as in a moving vehicle. These considerations were taken into account in the design, and the device's battery can be charged from, among other sources, a car battery. Wireless technologies are spreading rapidly, and the base station of this work offers a serious alternative with its numerous features, including GPS, Bluetooth, and Ethernet support. In addition to wireless technologies, embedded systems are also becoming increasingly common, and nowadays microprocessors can be found almost anywhere. The processor used in this project is competitive in terms of speed and provides several different interfaces. In the future the computer board will also receive WiMAX support, which increases the base station's future value to the customer. The Freescale MPC8321E processor chosen for the project is based on the PowerPC architecture and had only just appeared on the market. This brought an additional challenge, since not all information about the processor was yet available. The mechanics brought their own challenges, as they constrained the printed circuit board size so tightly that hardly any spare board area remained. For this reason the DDR memories, for example, were challenging to route, since the memory traces must be of roughly equal length. Linux is used as the operating system in the project. The design began in spring 2007, and a working prototype was ready in early autumn. Prototype testing showed that the computer board can meet all of the customer's requirements.
The faults and optimizations found during prototype testing are to be corrected in the production model, so the prototype provides a good basis for further design.
Abstract:
The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. The technology changed from ARP to NMT and later to GSM. The main application of the technology, however, was voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet. Other wireless technologies, for instance wireless LANs, were also introduced. Telephony had developed from analog to digital in fixed networks, which allowed easy integration of fixed and mobile networks. This development opened completely new functionality to computers and mobile phones, and it also initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the new competitive opportunities this created for firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry through understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and it was therefore valuable to integrate the Dynamic Capability view of the firm into this research. Dynamic capabilities are an application of the Resource-Based View (RBV) of the firm. As stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining the two. The theoretical contribution of the research lies in the development of a methodology integrating the theories of the RBV, dynamic capabilities, and strategic positioning.
The research approach has been constructive, owing to the actual managerial problems that initiated the study. The requirement for iterative and innovative progress in the research supported the chosen approach. The study applies known methods in product development, for instance the innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools such as industry analysis and the scenario method. As the main result, the thesis presents the strategic innovation process, in which new business concepts describe the alternative resource configurations and scenarios describe the alternative competitive environments; this can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study has also produced approximately 250 new innovations for the participating firms, reduced technology uncertainty, supported strategic infrastructural decisions in the firms, and produced a knowledge bank including data from 43 ICT and 19 paper industry firms between 1999 and 2004. The methods presented in this research are also applicable to other industries.
Abstract:
It has been convincingly argued that computer simulation modeling differs from traditional science. If we understand simulation modeling as a new way of doing science, the manner in which scientists learn about the world through models must also be considered differently. This article examines how researchers learn about environmental processes through computer simulation modeling. Proposing a conceptual framework anchored in a performative philosophical approach, we examine two modeling projects undertaken by research teams in England, both aiming to inform flood risk management. One of the modeling teams operated in the research wing of a consultancy firm; the other consisted of university scientists taking part in an interdisciplinary project experimenting with public engagement. We found that in the first context the use of standardized software was critical to the process of improvisation: the obstacles that emerged concerned data, and they were resolved by exploiting affordances for generating, organizing, and combining scientific information in new ways. In the second context, an environmental competency group, the obstacles related to the computer program, and affordances emerged from combining experience-based knowledge with the scientists' skill, enabling a reconfiguration of the mathematical structure of the model and allowing the group to learn about local flooding.
Abstract:
Awareness is required to support all forms of cooperation. In Computer Supported Collaborative Learning (CSCL), awareness can be used to enhance collaborative opportunities across physical distances and in computer-mediated environments. Shared Knowledge Awareness (SKA) aims to increase students' perception of the knowledge they share in a collaborative learning scenario, and it also concerns the understanding the group has of that shared knowledge. However, it is very difficult to produce accurate awareness indicators based on informal message exchange among the participants. Therefore, we propose a semantic system for cooperation that makes use of formal methods for knowledge representation based on Semantic Web technologies. From this semantics-enhanced repository and its messages, more accurate awareness indicators can be computed.
Abstract:
This paper aims to better understand the development of students' learning processes when participating actively in a specific Computer Supported Collaborative Learning system called KnowCat. To this end, a longitudinal case study was designed, in which eighteen university students took part in a 12-month (two-semester) learning project. During this period, the students followed an instructional process that used some key features of the KnowCat design to support and improve their interaction processes, especially peer learning. Our research involved both supervising the students' collaborative learning throughout the project and focusing our analysis on the qualitative evolution of the students' interaction processes and on the development of metacognitive learning processes. The results reveal that the instructional application of the CSCL-KnowCat system may favour and improve the development of the students' metacognitive learning processes. Additionally, the implications for the design of computer supported collaborative learning networks and related pedagogical issues are discussed.