836 results for E-labs reusability
Abstract:
The physical and biological carbon pumps in the different hydrographic and biogeochemical regimes of the Atlantic Sector of the Southern Ocean are controlled by a series of coupled physical, chemical and biological processes, and a project named Eddy-Pump was designed to study them. The Eddy-Pump field campaign was carried out during RV Polarstern Cruise ANT-XXVIII/3 between January and March 2012. Particular emphasis was placed on the differences that occur along the axis of the Antarctic Circumpolar Current (ACC) with its associated mesoscale eddy field. The study sites were selected to represent (1) the central ACC, with its regular separation into different frontal jets, investigated by a meridional transect along 10°E; (2) a large-scale bloom west of the Mid-Atlantic Ridge, which lasted several months, with conspicuously chlorophyll-poor waters immediately to its east, studied by a three-dimensional mesoscale survey centred at 12°40'W; and (3) the Georgia Basin north of the island of South Georgia, which regularly features an extended and dense phytoplankton bloom, investigated by a mesoscale survey centred at 38°12'W. While Eddy-Pump is an interdisciplinary project by design, here we focus on describing the variable physical environment within which the different biogeochemical regimes developed. To describe the physical environment we use measurements of temperature, salinity and density, of mixed-layer turbulence parameters, of dynamic heights and horizontal current vectors, and of flow trajectories obtained from surface drifters and submerged floats. This serves as background information for the analyses of biological and chemical processes and of biogeochemical fluxes addressed by other papers in this issue. The section along 10°E between 44°S and 53°S showed a classical ACC structure with well-known hydrographic fronts: the Subantarctic Front (SAF) at 46.5°S, the Antarctic Polar Front (APF) split in two, at 49.25°S and 50.5°S, and the Southern Polar Front (SPF) at 52.5°S. Each front was associated with strong eastward flows. The West Mid-Atlantic Ridge Survey showed a weak and poorly resolved meander structure between the APF and the SPF. During the first eight days of the survey, the oceanographic conditions at the Central Station at 12°40'W remained reasonably constant. After that, however, conditions became more variable in the thermocline, with conspicuous temperature inversions and interleaving, and the temperature in the surface layer decreased. At the very end of the observation period the conditions in the thermocline returned to being similar to those observed early in the period, although with an elevated mixed-layer temperature. The period of enhanced thermohaline variability was accompanied by increased currents. The Georgia Basin Survey showed a very strong zonal jet at its northern edge, which connects to a large cyclonic meander that itself joins an anticyclonic eddy in the southeastern quadrant. The water-mass contrasts in this survey were stronger than in the West Mid-Atlantic Ridge Survey and similar to those encountered along 10°E, except that the warm and saline surface water typical of the northern side of the SAF was not covered by the Georgia Basin Survey. Mixed layers found during Eddy-Pump were typically deep but varied between the three survey areas; the mean depths and standard deviations of the mixed layer were 77.2±24.7 m along 10°E, 66.7±17.7 m at the West Mid-Atlantic Ridge, and 36.8±10.7 m in the Georgia Basin.
Abstract:
This paper presents a mobile testbed, namely the Heavy Duty Planetary Rover (HDPR), that was designed and constructed at the Automation and Robotics Laboratories (ARL) of the European Space Agency to fulfill the lab's internal needs in the context of long-range rover exploration, as well as to provide the means to perform in situ testing of novel algorithms. We designed a rover that: a) is able to reliably perform long-range routes, and b) carries an abundance of sensors (both current rover technology and more futuristic ones). The testbed includes all the additional hardware and software (i.e. ground control station, UAV, networking, mobile power) needed to allow prompt deployment in the field. The paper describes the system and reports on our experiences during the first experiments with the testbed.
Abstract:
This paper focuses on two basic issues: the anxiety-generating nature of the interpreting task and the relevance of interpreter trainees' academic self-concept. The first has already been acknowledged, although not extensively researched, in several papers; the second has only been mentioned briefly in the interpreting literature. This study examines the relationship between the anxiety and academic self-concept constructs among interpreter trainees. An adapted version of the Foreign Language Anxiety Scale (Horwitz et al., 1986), the Academic Autoconcept Scale (Schmidt, Messoulam & Molina, 2008) and a background information questionnaire were used to collect data. Student's t-test results indicated that female students reported experiencing significantly higher levels of anxiety than male students; no significant gender difference in self-concept levels was found. Correlation analysis results suggested, on the one hand, that younger would-be interpreters suffered from higher anxiety levels and that students with higher marks tended to have lower anxiety levels; and, on the other hand, that younger students had lower self-concept levels and higher-ability students held higher self-concept levels. In addition, the results revealed that students with higher anxiety levels tended to have lower self-concept levels. Based on these findings, recommendations for interpreting pedagogy are discussed.
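As a hedged illustration of the two analyses reported above (run on invented placeholder scores, not the study's data), a Python sketch of the independent-samples t-test and a correlation check might look like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
anxiety_f = rng.normal(3.4, 0.6, 40)  # hypothetical anxiety scores, female trainees
anxiety_m = rng.normal(3.0, 0.6, 25)  # hypothetical anxiety scores, male trainees

# Independent-samples t-test for the gender difference in anxiety
t, p = stats.ttest_ind(anxiety_f, anxiety_m)
print(f"t = {t:.2f}, p = {p:.4f}")

# Pearson correlation between anxiety and academic self-concept
anxiety = np.concatenate([anxiety_f, anxiety_m])
self_concept = 5 - anxiety + rng.normal(0, 0.4, anxiety.size)  # placeholder scores
r, p = stats.pearsonr(anxiety, self_concept)
print(f"r = {r:.2f}, p = {p:.4f}")  # a negative r mirrors the reported pattern
```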
Abstract:
Intelligent Tutoring Systems (ITSs) are computerized systems for learning by doing. These systems provide students with immediate, customized feedback on learning tasks. An ITS typically consists of several interconnected modules. This research focuses on the distribution of the ITS module that provides expert knowledge services. Distributing such an expert knowledge module calls for an architectural style, because a style provides a standard interface, which increases the reusability and interoperability of the module. To provide expert knowledge modules in a distributed way, we need to answer the research question: 'How can we compare and evaluate the REST, Web services and Plug-in architectural styles for the distribution of the expert knowledge module in an intelligent tutoring system?'. We present an assessment method for selecting an architectural style. Applying the assessment method to the three styles, we selected REST as the style that best supports the distribution of expert knowledge modules, and we also analyzed the trade-offs that come with selecting REST. We present a prototype and architectural views based on REST to demonstrate that the assessment method correctly scores REST as an appropriate architectural style for the distribution of expert knowledge modules.
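A minimal sketch of what exposing an expert knowledge module over REST could look like; the endpoint name, payload shape and toy domain check below are assumptions for illustration, not the paper's prototype:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/steps/diagnose", methods=["POST"])
def diagnose_step():
    """Check one student step against the expert knowledge module."""
    step = request.get_json()               # e.g. {"task": "eq-12", "input": "2x = 6"}
    correct = step.get("input") == "x = 3"  # stand-in for a real domain reasoner
    hint = None if correct else "Divide both sides by 2."
    return jsonify({"correct": correct, "hint": hint})

if __name__ == "__main__":
    app.run(port=5000)
```

The uniform, resource-oriented interface is what gives the module the standard, reusable interface that such an assessment rewards.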
Abstract:
With the new academic year structure encouraging more in-term assessment to replace end-of-year examinations, one of the problems we face is assessing students and keeping track of individual student learning without overloading students and staff with excessive assessment burdens.
In the School of Electronics, Electrical Engineering and Computer Science, we have constructed a system that allows students to self-assess their capability, on a simple Yes/No/Don't Know scale, against fine-grained learning outcomes for a module. As the term progresses, students update their record as appropriate, including selecting a Learnt option to reflect improvements they have gained as part of their studies.
In the system, each learning outcome is linked to the relevant teaching sessions (lectures and labs) and to online resources that students can access at any time. Students can structure their own learning experience to suit their needs and preferences in order to attain the learning outcomes.
The system keeps a history of each student's record, allowing the lecturer to observe how students' abilities progress over the term and to compare that progression to assessment results. The system also keeps a record of any resource links the student has clicked on, together with the related learning outcome.
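A hedged sketch of the kind of record such a system might keep; all names and types below are illustrative assumptions, not the School's implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class State(Enum):
    YES = "Yes"
    NO = "No"
    DONT_KNOW = "Don't Know"
    LEARNT = "Learnt"  # selected when study has improved the ability

@dataclass
class OutcomeRecord:
    outcome_id: str                                      # fine-grained learning outcome
    sessions: list[str] = field(default_factory=list)    # linked lectures/labs
    resources: list[str] = field(default_factory=list)   # linked online resources
    history: list[tuple[datetime, State]] = field(default_factory=list)
    clicks: list[tuple[datetime, str]] = field(default_factory=list)

    def update(self, state: State) -> None:
        """Append a timestamped self-assessment, preserving the history."""
        self.history.append((datetime.now(), state))

    def log_click(self, resource: str) -> None:
        """Record which resource link the student followed, and when."""
        self.clicks.append((datetime.now(), resource))
```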
Initial work compares the accuracy of the students' self-assessments with their performance on the related questions in the traditional end-of-year examination.
Abstract:
Wireless sensor networks (WSNs) differ from conventional distributed systems in many respects. The resource limitations of sensor nodes, the ad-hoc communication and topology of the network, and an unpredictable deployment environment are difficult non-functional constraints that must be carefully taken into account when developing software systems for a WSN. Thus, more research is needed on designing, implementing and maintaining software for WSNs. This thesis contributes to research in this area by presenting an approach to WSN application development that improves the reusability, flexibility, and maintainability of the software. Firstly, we present a programming model and software architecture for describing WSN applications independently of the underlying operating system and hardware. The proposed architecture is described and realized using the Model-Driven Architecture (MDA) standard in order to achieve satisfactory levels of encapsulation and abstraction when programming sensor nodes. In addition, we study different non-functional constraints of WSN applications and propose two approaches to optimizing an application to satisfy these constraints. A prototype framework was built to demonstrate the solutions developed in the thesis. The framework implements the programming model and the multi-layered software architecture as components; a graphical interface, code-generation components and supporting tools are also included to help developers design, implement, optimize, and test WSN software. Finally, we evaluate and critically assess the proposed concepts through two case studies. The first case study, a framework evaluation, assesses the ease with which novice and intermediate users can develop correct and power-efficient WSN applications, the portability achieved by developing applications at a high level of abstraction, and the overhead the framework introduces in terms of the footprint and executable code size of the application. In the second case study, we discuss the design, implementation and optimization of a real-world application named TempSense, in which a sensor network monitors the temperature within an area.
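The platform-independence idea can be sketched briefly: application logic is written once against an abstract node model, while platform-specific classes realize it, in the spirit of MDA. The class and method names below are assumptions for illustration, not the thesis's API:

```python
from abc import ABC, abstractmethod

class SensorNode(ABC):
    """Platform-independent model of a sensor node."""
    @abstractmethod
    def read_temperature(self) -> float: ...
    @abstractmethod
    def send(self, payload: bytes) -> None: ...

class TinyOSNode(SensorNode):
    """Hypothetical platform-specific realization of the model."""
    def read_temperature(self) -> float:
        return 21.5                     # would wrap the platform's sensing API
    def send(self, payload: bytes) -> None:
        print("radio <-", payload)      # would wrap the platform's radio stack

def sample_and_report(node: SensorNode) -> None:
    """Application logic written once, against the abstract model."""
    node.send(f"{node.read_temperature():.1f}".encode())

sample_and_report(TinyOSNode())
```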
Abstract:
The semiconductor industry's urge towards faster, smaller and cheaper integrated circuits has led the industry to smaller node devices. The integrated circuits now under volume production belong to the 22 nm and 14 nm technology nodes. In 2007, the 45 nm technology came with the revolutionary high-k/metal gate structure. The 22 nm technology utilizes a fully depleted tri-gate transistor structure, and the 14 nm technology is a continuation of it: Intel is using second-generation tri-gate technology in 14 nm devices. After 14 nm, the semiconductor industry is expected to continue scaling with 10 nm devices followed by 7 nm; recently, IBM announced the successful production of 7 nm node test chips. This is how the nanoelectronics industry is proceeding with its scaling trend. The present technology nodes require selective deposition and selective removal of materials; atomic layer deposition and atomic layer etching are the respective techniques used for these. Atomic layer deposition remains a forward-looking manufacturing approach that deposits materials and films in exact places. In addition to the nano/microelectronics industry, ALD is also widening its application areas and acceptance. The use of ALD equipment in industry exhibits a diversification trend: large-area, batch-processing, particle-ALD and plasma-enhanced ALD equipment are becoming prominent in industrial applications. In this work, the development of an atomic layer deposition tool with microwave plasma capability is described, one that is affordable even for lightly funded research labs.
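For context, the self-limiting cycle that an ALD tool automates can be sketched as a simple pulse/purge loop; the step names, timings and growth per cycle below are illustrative, not the described tool's actual recipe:

```python
import time

RECIPE = [
    ("precursor pulse", 0.1),        # chemisorbs at most one monolayer
    ("purge", 2.0),                  # removes excess precursor and by-products
    ("plasma co-reactant pulse", 0.5),
    ("purge", 2.0),
]

def run_ald(cycles: int, growth_per_cycle_nm: float = 0.1) -> float:
    """Run the cycle loop and return the nominal film thickness."""
    for _ in range(cycles):
        for step, seconds in RECIPE:
            time.sleep(seconds)      # placeholder for valve/plasma control
    return cycles * growth_per_cycle_nm  # thickness scales with cycle count

print(run_ald(5), "nm")              # ~0.5 nm after 5 cycles
```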
Abstract:
The aim of this work was the recovery of carbohydrates and amino acids from biomass-derived solutions as different fractions by ultra- and nanofiltering them with various membranes. The work was carried out as a study for Senson Oy between early winter 2015 and spring 2016. The theoretical part reviews nanofiltration and its various industrial applications, and briefly covers the other pressure-driven membrane filtration processes; it focuses in particular on the membranes used in nanofiltration and on their fouling mechanisms. The experimental part concentrates on the recovery of carbohydrates and amino acids from three biomass-derived solutions. The study covered ultrafiltration, clarification of the ultrafiltration concentrate, and fractionation of the ultrafiltration permeate by nanofiltration. Particular attention was also paid to membrane fouling and cleanability, and to the usability of the membranes after cleaning. In ultrafiltration, the yield of the studied carbohydrates into the permeate was good, about 90 %, for all three solutions, and no significant fouling of the membranes used was observed. In the clarification experiments on the ultrafiltration concentrates, the turbidity-causing components were removed from all solutions with over 94 % efficiency. In nanofiltration, the monosaccharides were separated from the larger carbohydrate components either completely or almost completely (97–100 %); of the membranes used, considerable fouling was observed only on membrane 2. Based on the results, nanofiltration can be said to be an effective way to separate small monosaccharides from larger carbohydrate compounds.
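As a worked illustration of the reported quantities (using assumed textbook definitions, not formulas from the thesis), observed rejection and permeate yield can be computed as follows:

```python
def observed_rejection(c_feed: float, c_permeate: float) -> float:
    """R = 1 - c_permeate / c_feed (a standard definition)."""
    return 1.0 - c_permeate / c_feed

def permeate_yield(m_permeate: float, m_feed: float) -> float:
    """Fraction of a component that ends up in the permeate."""
    return m_permeate / m_feed

# A monosaccharide passing an ultrafiltration membrane almost freely:
print(f"R = {observed_rejection(10.0, 9.0):.2f}")   # 0.10 -> low rejection
print(f"yield = {permeate_yield(9.0, 10.0):.0%}")   # ~90 %, as reported
```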
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The evolution and maturation of Cloud Computing created an opportunity for the emergence of new Cloud applications. High-Performance Computing (HPC), a class of complex problem solving, arises as a new business consumer by taking advantage of what the Cloud offers, leaving behind expensive datacenter management and difficult grid development. Now in an advanced maturation phase, today's Cloud has shed many of its drawbacks, becoming more and more efficient and widespread. Performance enhancements, price drops due to mass adoption, and customizable on-demand services have attracted attention from other markets. HPC, although a very well established field, has traditionally been narrow in its deployment options, running on dedicated datacenters or large computing grids. The main problems with the usual placement are the initial cost and underused resources, which not all research labs can afford. The main objective of this work was to investigate new technical solutions to allow the deployment of HPC applications on the Cloud, with particular emphasis on private on-premise resources, the lower end of the chain, which reduces costs. The work includes many experiments and analyses to identify obstacles and technology limitations. The feasibility of the objective was tested with new modeling, a new architecture, and the migration of several applications. The final application integrates public and private Cloud resources in a simplified way, together with HPC application scheduling, deployment and management. It uses a well-defined user role strategy, based on federated authentication, and a seamless procedure for daily use that balances low cost and performance.
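The private-first placement idea can be illustrated with a small sketch: prefer the cheaper on-premise private Cloud and burst to a public provider only when private capacity runs out. Pool names, capacities and prices below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    free_cores: int
    cost_per_core_hour: float

def place(job_cores: int, pools: list[Pool]) -> str:
    """Pick the cheapest pool that can host the job."""
    candidates = [p for p in pools if p.free_cores >= job_cores]
    if not candidates:
        raise RuntimeError("no capacity; queue the job")
    best = min(candidates, key=lambda p: p.cost_per_core_hour)
    best.free_cores -= job_cores
    return best.name

pools = [Pool("private", 64, 0.00), Pool("public", 10_000, 0.05)]
print(place(48, pools))  # -> private
print(place(48, pools))  # -> public (private now has only 16 free cores)
```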
Abstract:
Due to diminishing petroleum reserves, an unsteady market situation and the environmental concerns associated with the utilization of fossil resources, the utilization of renewables for the production of energy and chemicals (biorefining) has gained considerable attention. Biomass is the only sustainable source of organic compounds that has been proposed as a petroleum equivalent for the production of fuels, chemicals and materials. In fact, it would not be wrong to say that the only viable way to sustainably meet our future energy and material requirements lies with a bio-based economy built on biomass-based industries and products. This has prompted biomass valorization (biorefining) to become an important area of industrial research. While many disciplines of science are involved in the realization of this effort, catalysis and knowledge of chemical technology are considered particularly important for eventually making this vision come true. Traditionally, catalyst research for biomass conversion has focused primarily on commercially available catalysts such as zeolites and silica, and on various metals (Pt, Pd, Au, Ni) supported on them. Nevertheless, the main drawbacks of these catalysts are high material cost, low activity and limited reusability, all of which render them less attractive in industrial-scale applications (poor activity for the price). Thus, there is a particular need to develop active, robust and cost-efficient catalytic systems capable of converting complex biomass molecules. Saccharification, esterification, transesterification and acetylation are important chemical processes in the valorization chain of biomasses (and several biomass components) for the production of platform chemicals, transportation fuels, food additives and materials. In the current work, various novel acidic carbons were synthesized from wastes generated by biodiesel and allied industries, and employed as catalysts in the aforementioned reactions. The structure and surface properties of the novel materials were investigated by XRD, XPS, elemental analysis, SEM, TEM, TPD and N2-physisorption techniques. The agro-industrial-waste-derived, sulfonic acid functionalized novel carbons exhibited excellent catalytic activity in the aforementioned reactions and easily outperformed liquid H2SO4 and conventional solid acids (zeolites, ion-exchange resins etc.). The experimental results indicated a strong influence of catalyst pore structure (pore size, pore volume), concentration of –SO3H groups and surface properties on the activity and selectivity of these catalysts. A large-pore catalyst with high –SO3H density exhibited the highest esterification and transesterification activity, and was successfully employed in biodiesel production from fatty acids and low-grade acidic oils. A catalyst decay model was also proposed for biodiesel production, which explained that the catalyst loses its activity mainly through active-site blocking by the adsorption of impurities and by-products. The large-pore sulfonated catalyst also exhibited good catalytic performance in the selective synthesis of triacetin via acetylation of glycerol with acetic anhydride, and outperformed the best zeolite, H-Y, with respect to reusability.
It also demonstrated equally good activity in the acetylation of cellulose to soluble cellulose acetates, with the possibility to control cellulose acetate yield and quality (degree of substitution, DS) by a simple adjustment of the reaction time and acetic anhydride concentration. In contrast, the small-pore, highly functionalized catalysts, obtained by a hydrothermal method and from protein-rich waste (Jatropha de-oiled waste cake, DOWC), were active and selective in the esterification of glycerol with fatty acids to monoglycerides and in the saccharification of cellulosic materials, respectively. The operational stability and reusability of the catalysts were found to depend on the stability of the –SO3H function (leaching) as well as on active-site blocking due to the adsorption of impurities during the reaction. Thus, our results corroborate the potential of DOWC-derived sulfated mesoporous active carbons as efficient integrated solid acid catalysts for the valorization of biomass to platform chemicals, biofuel, bio-additives, surfactants and cellulose esters.
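As a generic illustration of this kind of deactivation (an assumed first-order textbook form, not the fitted model from the work), relative activity under progressive site blocking can be sketched as:

```python
import math

def activity(t_hours: float, k_d: float = 0.05) -> float:
    """Relative activity a(t) = exp(-k_d * t) for an assumed decay constant."""
    return math.exp(-k_d * t_hours)

for cycle, t in enumerate([0, 10, 20, 30], start=1):
    print(f"reuse {cycle}: relative activity = {activity(t):.2f}")
```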
Abstract:
The life cycle of software applications is in general very short, with extremely volatile requirements. Under these conditions, programmers need development tools and techniques offering an extreme level of productivity. We consider code reuse the most prominent approach to solving that problem. Our proposal uses the advantages provided by Aspect-Oriented Programming to build a reusable framework capable of making both the programmer and the application oblivious to data persistence, thus avoiding the need to write any line of code for that concern. Besides the productivity benefits, software quality increases. This paper describes the current state of the art, identifying the main challenge in building a complete and reusable framework for Orthogonal Persistence in concurrent environments with support for transactions. The present work also includes a successfully developed prototype of that framework, capable of freeing the programmer from implementing any read or write data operations. This prototype is backed by an object-oriented database and, in the future, will also use a relational database and support transactions.
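The obliviousness idea can be sketched in Python, with the caveat that the framework itself relies on Aspect-Oriented Programming over an object-oriented database; the decorator-based interception and the shelve store below are stand-in assumptions:

```python
import shelve

def persistent(cls):
    """Intercept attribute writes, advice-style, and mirror them to storage."""
    original_setattr = cls.__setattr__

    def __setattr__(self, name, value):
        original_setattr(self, name, value)
        with shelve.open("store.db") as db:
            db[f"{cls.__name__}:{id(self)}:{name}"] = value

    cls.__setattr__ = __setattr__
    return cls

@persistent
class Customer:
    def __init__(self, name):
        self.name = name  # persisted transparently; no explicit write code

c = Customer("Ada")
c.name = "Grace"          # also intercepted and persisted
```

The application class contains no persistence calls at all; the cross-cutting concern lives entirely in the interception layer, which is the property the paper's framework pursues with aspects.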
Abstract:
Cancer and cardio-vascular diseases are the leading causes of death world-wide. Caused by systemic genetic and molecular disruptions in cells, these disorders are the manifestation of profound disturbance of normal cellular homeostasis. People suffering from or at high risk for these disorders need early diagnosis and personalized therapeutic intervention. Successful implementation of such clinical measures can significantly improve global health. However, the development of effective therapies is hindered by the challenges of identifying the genetic and molecular determinants of disease onset; and, in cases where therapies already exist, the main challenge is to identify the molecular determinants that drive resistance to them. Due to progress in sequencing technologies, access to large-scale genome-wide biological data now extends far beyond a few experimental labs to the global research community. The unprecedented availability of the data has revolutionized the capabilities of computational researchers, enabling them to collaboratively address long-standing problems from many different perspectives. Likewise, this thesis tackles two main public health related challenges using data-driven approaches. Numerous association studies have been proposed to identify genomic variants that determine disease. However, their clinical utility remains limited due to their inability to distinguish causal variants from associated variants. In this thesis, we first propose a simple scheme that improves association studies in a supervised fashion, and demonstrate its applicability by identifying genomic regulatory variants associated with hypertension. Next, we propose a coupled Bayesian regression approach -- eQTeL -- which leverages epigenetic data to estimate regulatory and gene-interaction potential, and identifies combinations of regulatory genomic variants that explain the variance in gene expression. On human heart data, eQTeL not only explains a significantly greater proportion of expression variance in samples, but also predicts gene expression more accurately than other methods. Using simulations, we demonstrate that eQTeL accurately detects causal regulatory SNPs, particularly those with small effect sizes. Using various functional data, we show that SNPs detected by eQTeL are enriched for allele-specific protein binding and histone modifications, which potentially disrupt the binding of core cardiac transcription factors and are spatially proximal to their targets. eQTeL SNPs capture a substantial proportion of the genetic determinants of expression variance, and we estimate that 58% of these SNPs are putatively causal. So far, the challenge of identifying molecular determinants of cancer resistance could be addressed only through labor-intensive and costly experimental studies, and for experimental drugs such studies are infeasible. Here we take a fundamentally different data-driven approach to understand the evolving landscape of emerging resistance. We introduce a novel class of genetic interactions in cancer termed synthetic rescues (SRs), which denote a functional interaction between two genes where a change in the activity of one vulnerable gene (which may be a target of a cancer drug) is lethal, but subsequently altered activity of its partner rescuer gene restores cell viability. Next we describe a comprehensive computational framework -- termed INCISOR -- for identifying SRs underlying cancer resistance.
Applying INCISOR to mine The Cancer Genome Atlas (TCGA), a large collection of cancer patient data, we identified the first pan-cancer SR networks, composed of interactions common to many cancer types. We experimentally test and validate a subset of these interactions involving the master regulator gene mTOR. We find that rescuer genes become increasingly activated as breast cancer progresses, testifying to pervasive ongoing rescue processes. We show that SRs can be utilized to successfully predict patients' survival and response to the majority of current cancer drugs and, importantly, to predict the emergence of drug resistance from the initial tumor biopsy. Our analysis suggests a potential new strategy for enhancing the effectiveness of existing cancer therapies by targeting their rescuer genes to counteract resistance. The thesis provides statistical frameworks that can harness ever-increasing high-throughput genomic data to address challenges in determining the molecular underpinnings of hypertension, cardiovascular disease and cancer resistance. We discover novel molecular mechanistic insights that will advance progress in early disease prevention and personalized therapeutics. Our analyses shed light on the fundamental biological understanding of gene regulation and interaction, and open up exciting avenues of translational applications in risk prediction and therapeutics.
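As a toy baseline for the kind of single-variant association signal that approaches like eQTeL refine (simulated data, purely for illustration, not the thesis's method), an eQTL test regresses gene expression on genotype dosage:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
genotype = rng.integers(0, 3, 200)                    # 0/1/2 allele dosage
expression = 0.4 * genotype + rng.normal(0, 1, 200)   # simulated additive effect

slope, intercept, r, p, se = stats.linregress(genotype, expression)
print(f"effect = {slope:.2f} per allele, p = {p:.2e}")
```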
Abstract:
This study was designed to obtain information on the prevalence of electronic technology, in terms of availability and use, in classrooms in Newfoundland and Labrador. An online survey was developed and delivered electronically to a randomly chosen sample of 800 K-12 educators in Newfoundland & Labrador's English School District during winter 2016. In total, 377 surveys were completed. Among other things, the findings showed that SMART Boards and iPads were seeing significant use, while the use of computer labs and of various social media tools was not particularly high.