907 results for Integration of Programming Techniques
Abstract:
The Millennial generation is changing the way of learning, prompting educational institutions to adapt better to young people's needs by incorporating technologies into education. Based on this premise, we have reviewed the prominent reports on the integration of ICT into education with the aim of evidencing how education is changing, and will change, to meet the needs of Millennials with ICT support. We conclude that most of the investments have simply resulted in an increase in computers and access to the Internet, with teachers reproducing traditional approaches to education and e-learning being seen as complementary to face-to-face education. While it would seem that the use of ICT is not revolutionizing learning, it is facilitating the personalization, collaboration and ubiquity of learning.
Abstract:
Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned on the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches, where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at the smaller lags, where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to exert much more direct control on the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
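The two ideas described above — proposing each perturbation as a fresh draw from the geophysics-conditioned distribution of a single cell, and keeping only a small-lag covariance term in the objective — can be sketched in a 1-D toy. This is an illustrative sketch under assumed Gaussian conditional distributions, not the authors' implementation; all names and parameters are assumptions.

```python
import math
import random

def experimental_covariance(field, lag):
    # Average product of deviations at separation `lag` (1-D for brevity).
    m = sum(field) / len(field)
    pairs = [(field[i] - m) * (field[i + lag] - m) for i in range(len(field) - lag)]
    return sum(pairs) / len(pairs)

def objective(field, target_cov, small_lags):
    # Covariance constraint restricted to small lags only, as in the abstract.
    return sum((experimental_covariance(field, h) - target_cov[h]) ** 2
               for h in small_lags)

def anneal(geo_mean, geo_sd, target_cov, small_lags,
           n_iter=5000, t0=1.0, cooling=0.999):
    """geo_mean/geo_sd: per-cell Gaussian conditional distribution
    derived from the geophysical data (an assumption for illustration)."""
    random.seed(0)
    # Start from one draw of the geophysics-conditioned distribution.
    field = [random.gauss(geo_mean[i], geo_sd[i]) for i in range(len(geo_mean))]
    e, t = objective(field, target_cov, small_lags), t0
    for _ in range(n_iter):
        i = random.randrange(len(field))
        old = field[i]
        # Perturbation = fresh draw from the cell's conditional distribution;
        # this is the only place geophysical information enters.
        field[i] = random.gauss(geo_mean[i], geo_sd[i])
        e_new = objective(field, target_cov, small_lags)
        if e_new > e and random.random() >= math.exp((e - e_new) / t):
            field[i] = old          # Metropolis rejection: revert the draw
        else:
            e = e_new               # accept the perturbation
        t *= cooling                # cool the temperature
    return field, e
```

Because every proposal is itself a conditional draw, realizations honour the geophysics by construction, and the annealing only has to enforce the small-lag covariance — which is why convergence is cheap relative to swap-based schemes.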
Abstract:
The objective of this work was to evaluate the accuracy of digestion techniques using nitric and perchloric acid at the ratios of 2:1, 3:1, and 4:1 v v-1, in one- or two-step digestion, to estimate chromium contents in cattle feces, using sodium molybdate as a catalyst. Fecal standards containing known chromium contents (0, 2, 4, 6, 8, and 10 g kg-1) were produced from feces of five animals. The chromium content in cattle feces is accurately estimated using digestion techniques based on nitric and perchloric acids, at a 3:1 v v-1 ratio, in one-step digestion, with sodium molybdate as a catalyst.
Abstract:
This paper presents a programming environment for supporting learning in STEM, particularly mobile robotic learning. It was designed to support progressive learning for people with and without previous knowledge of programming and/or robotics. The environment is multi-platform and built with open-source tools. Perception, mobility, communication, navigation and collaborative behaviour functionalities can be programmed for different mobile robots. A learner is able to programme robots using different programming languages and editor interfaces: a graphic programming interface (basic level), an XML-based meta-language (intermediate level) or the ANSI C language (advanced level). The environment translates programmes into the different languages either transparently for learners or explicitly on learners' demand. Learners can access proposed challenges and example-based learning interfaces. The environment was designed for extensibility, adaptive interfaces, persistence and low software/hardware coupling. Functionality tests were performed to verify the programming environment's specifications. UV BOT mobile robots were used in these tests.
Abstract:
The high cost of feed ingredients, the use of non-renewable sources of phosphate and the dramatic increase in the environmental load resulting from the excessive land application of manure are major challenges for the livestock industry. Precision feeding is proposed as an essential approach to improve the utilization of dietary nitrogen, phosphorus and other nutrients and thus reduce feeding costs and nutrient excretion. Precision feeding requires accurate knowledge of the nutritional value of feedstuffs and animal nutrient requirements, the formulation of diets in accordance with environmental constraints, and the gradual adjustment of the dietary nutrient supply to match the requirements of the animals. After the nutritional potential of feed ingredients has been precisely determined and has been improved by the addition of enzymes (e.g. phytases) or feed treatments, the addition of environmental objectives to the traditional feed formulation algorithms can promote the sustainability of the swine industry by reducing nutrient excretion in swine operations with small increases in feeding costs. Increasing the number of feeding phases can also contribute to significant reductions in nutrient excretion and feeding costs. However, the use of precision feeding techniques in which pigs are fed individually with daily tailored diets can further improve the efficiency with which pigs utilize dietary nutrients. Precision feeding involves the use of feeding techniques that allow the provision of the right amount of feed with the right composition at the right time to each pig in the herd. Using this approach, it has been estimated that feeding costs can be reduced by more than 4.6%, and nitrogen and phosphorus excretion can both be reduced by more than 38%. 
Moreover, the integration of precision feeding techniques into large-group production systems can provide real-time off-farm monitoring of feed and animals for optimal slaughter and production strategies, thus improving the environmental sustainability of pork production, animal well-being and meat-product quality.
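One concrete way such daily tailored diets are realized is by blending two premix feeds, one formulated above and one below the expected range of requirements, in proportions that match each pig's estimated requirement for the limiting nutrient. A minimal sketch follows; the function names and nutrient values are illustrative assumptions, not the authors' system.

```python
def blend_ratio(requirement, low, high):
    """Fraction of the high-nutrient premix needed so that the blend's
    nutrient concentration equals the pig's estimated daily requirement."""
    if not low <= requirement <= high:
        raise ValueError("requirement outside the range spanned by the premixes")
    return (requirement - low) / (high - low)

def daily_ration(requirement, low, high, feed_kg):
    """Split the day's feed allowance between the two premixes."""
    x = blend_ratio(requirement, low, high)
    return {"high_premix_kg": round(x * feed_kg, 3),
            "low_premix_kg": round((1 - x) * feed_kg, 3)}
```

For example, with premixes containing 7 and 12 g of digestible lysine per kg and an estimated requirement of 9 g/kg, 40% of a 2.5 kg daily ration would come from the high-nutrient premix; the ratio is recomputed as the requirement curve changes over the growing period.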
Abstract:
Micronization techniques based on supercritical fluids (SCFs) are promising for the production of particles with controlled size and distribution. The interest of the pharmaceutical field in the development of SCF techniques is increasing due to the need for clean processes, reduced consumption of energy, and their several possible applications. The food field is still far from applying SCF micronization techniques, but there is increasing interest, mainly for the processing of products with high added value. The aim of this study is to use SCF micronization techniques for the production of particles of pharmaceuticals and food ingredients with controlled particle size and morphology, and to examine their production on a semi-industrial scale. The results obtained are also used to understand the processes from the perspective of broader application within the pharmaceutical and food industries. Certain pharmaceuticals, a biopolymer and a food ingredient have been tested using supercritical antisolvent micronization (SAS) or supercritical assisted atomization (SAA) techniques. The reproducibility of the SAS technique has been studied using physically different apparatuses and on both laboratory and semi-industrial scales. Moreover, a comparison between semi-continuous and batch modes has been performed. The behaviour of the system during the SAS process has been observed using a windowed precipitation vessel. The micronized powders have been characterized by particle size and distribution, morphology and crystallinity. Several analyses have been performed to verify whether the SCF process modified the structure of the compound or caused degradation or contamination of the product. The different powder morphologies obtained have been linked to the position of the process operating point with respect to the vapour-liquid equilibrium (VLE) of the systems studied, that is, mainly to the position of the mixture critical point (MCP).
Spherical micro-, submicro- and nanoparticles, expanded microparticles (balloons) and crystals were obtained by SAS. The obtained particles were amorphous or showed different degrees of crystallinity and, in some cases, had different pseudo-polymorphic or polymorphic forms. A compound that could not be processed using SAS was micronized by SAA, and amorphous particles, stable in vials at room temperature, were obtained. The SCF micronization techniques studied proved to be effective and versatile for the production of particles for several uses. Furthermore, the findings of this study and the acquired knowledge of the proposed processes allow a more conscious application of SCF techniques to obtain products with the desired characteristics, and enable the use of their principles in broader applications.
Abstract:
The current challenge in a context of major environmental change is to anticipate the responses of species to future landscape and climate scenarios. In the Mediterranean basin, climate change is one of the most powerful driving forces of fire dynamics, with fire frequency and impact having markedly increased in recent years. Species distribution modelling plays a fundamental role in this challenge, but better integration of available ecological knowledge is needed to adequately guide conservation efforts. Here, we quantified changes in the habitat suitability of an early-succession bird in Catalonia, the Dartford Warbler (Sylvia undata), globally evaluated as Near Threatened on the IUCN Red List. We assessed potential changes in species distributions between 2000 and 2050 under different fire management and climate change scenarios and described landscape dynamics using a spatially explicit fire-succession model that simulates fire impacts on the landscape and post-fire regeneration (the MEDFIRE model). Dartford Warbler occurrence data were acquired at two different spatial scales from: 1) the Atlas of European Breeding Birds (EBCC) and 2) the Catalan Breeding Bird Atlas (CBBA). Habitat suitability was modelled using five widely used modelling techniques in an ensemble forecasting framework. Our results indicate considerable habitat suitability losses (ranging between 47% and 57% in baseline scenarios), which were modulated to a large extent by fire regime changes derived from fire management policies and climate change. These results highlight the need to take the spatial interaction between climate change, fire-mediated landscape dynamics and fire management policies into account in order to coherently anticipate habitat suitability changes for early-succession bird species.
We conclude that fire management programs need to be integrated into conservation plans to effectively preserve sparsely forested and early succession habitats and their associated species in the face of global environmental change.
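The ensemble-forecasting step — combining suitability maps from several modelling techniques into one consensus prediction and then quantifying suitability loss between two dates — can be sketched as below. The score-weighted committee average, the 0.5 suitability threshold and all names are illustrative assumptions, not the study's actual protocol.

```python
def ensemble_suitability(predictions, scores):
    """Consensus habitat suitability per grid cell: average the per-model
    predictions, weighting each model by its evaluation score (e.g., TSS)."""
    total = sum(scores.values())
    n_cells = len(next(iter(predictions.values())))
    return [sum(predictions[m][cell] * scores[m] for m in predictions) / total
            for cell in range(n_cells)]

def suitability_loss(current, future, threshold=0.5):
    """Percentage of currently suitable cells (>= threshold) that become
    unsuitable under the future scenario."""
    suitable = [i for i, v in enumerate(current) if v >= threshold]
    lost = sum(1 for i in suitable if future[i] < threshold)
    return 100.0 * lost / len(suitable) if suitable else 0.0
```

Running the loss calculation on consensus maps for 2000 and for each 2050 fire-management/climate scenario is what yields range statements such as the 47-57% losses reported above.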
Abstract:
The purpose of this Master's thesis was to evaluate the post-acquisition integration process. The purpose of integration is to adapt the acquired company into a functioning part of the corporate group. The empirical problem of the thesis was the generally acknowledged complexity of integration management. Likewise, the academic literature lacked a coherent model for evaluating integration. The research subject was an acquisition in which a large Finnish information technology company bought a majority shareholding in a medium-sized Czech software company. The study generated a model of integration management for a knowledge-based organization. According to the model, integration consists of three distinct but mutually supporting areas: the convergence of organizational cultures, the levelling of knowledge capital, and the harmonization of intra-group processes. Of these, the latter two can be managed directly, whereas integration management can only act as a catalyst for cultural convergence: organizational culture spreads only through the interactions of those involved. In addition, the study showed how an acquisition is a revolutionary phase in a company's development. The first period of integration is revolutionary: the largest and most visible manageable changes are pursued then, so that the integration can proceed to evolutionary development. Revolutionary integration is driven by the integration management, whereas evolutionary integration proceeds through the actions and interactions of the participants (the members of the organization) themselves.
Abstract:
The management and conservation of coastal waters in the Baltic is challenged by a number of complex environmental problems, including eutrophication and habitat degradation. Demands for a more holistic, integrated and adaptive framework of ecosystem-based management emphasize the importance of appropriate information on the status and changes of the aquatic ecosystems. The thesis focuses on the spatiotemporal aspects of environmental monitoring in the extensive and geomorphologically complex coastal region of SW Finland, where the acquisition of spatially and temporally representative monitoring data is inherently challenging. Furthermore, the region is subject to multiple human interests and uses. A holistic geographical approach is emphasized, as it is ultimately the physical conditions that set the frame for any human activity. Characteristics of the coastal environment were examined using water quality data from the database of the Finnish environmental administration and Landsat TM/ETM+ images. A basic feature of the complex aquatic environment in the Archipelago Sea is its high spatial and temporal variability; this foregrounds the importance of geographical information as a basis of environmental assessments. While evidence of a consistent water turbidity pattern was observed, the coastal hydrodynamic realm is also characterized by high spatial and temporal variability. It is therefore also crucial to consider the spatial and temporal representativeness of field monitoring data. Remote sensing, despite its limitations, may facilitate the evaluation of hydrodynamic conditions in the coastal region and the spatial extrapolation of in situ data. Additionally, remotely sensed images can be used in the mapping of many of those coastal habitats that need to be considered in environmental management.
With regard to surface water monitoring, only a small fraction of the currently available data stored in the Hertta-PIVET register can be used effectively in scientific studies and environmental assessments. Long-term consistent data collection from established sampling stations should be emphasized, but research-type seasonal assessments producing abundant data should also be encouraged. Thus a more comprehensive coordination of field work efforts is called for. The integration of remote sensing and various field measurement techniques would be especially useful in the complex coastal waters. The integration and development of monitoring systems in Finnish coastal areas also requires further scientific assessment of monitoring practices. A holistic approach to the gathering and management of environmental monitoring data could be a cost-effective way of serving a multitude of information needs, and would fit the holistic, ecosystem-based management regimes that are currently being strongly promoted in Europe.
Abstract:
This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles obtained by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions, including growth and reproduction, and perturbations of steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS, which measures several steroids simultaneously, was historically the first standard method of analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content of a sample, has been implemented in several fields, including doping analysis, clinical studies, in vivo and in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented, with a focus on their ability to obtain relevant information on the steroid pattern.
The future technical requirements for improving steroid analysis will also be presented.
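In targeted SRM assays of the kind discussed above, quantification is routinely done by external calibration: a line is fitted to the detector responses of calibration standards, and unknown concentrations are read off that line. A minimal least-squares sketch (illustrative only; real assays add internal standards, weighted regression and QC checks):

```python
def fit_calibration(concentrations, peak_areas):
    """Ordinary least-squares calibration line: area = slope * conc + intercept,
    fitted to the responses of the calibration standards."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(peak_areas) / n
    sxx = sum((x - mx) ** 2 for x in concentrations)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concentrations, peak_areas))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(peak_area, slope, intercept):
    """Back-calculate an unknown's concentration from its SRM peak area."""
    return (peak_area - intercept) / slope
```

The same calibration logic underlies both the targeted panels and the quantitative follow-up of candidates flagged by untargeted steroidomics.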
Abstract:
Tampere University of Technology is undergoing a degree reform that started in 2013. One of the major changes in the reform was the integration of compulsory Finnish, Swedish and English language courses into substance courses at the bachelor's level. The integration of content and language courses aims at higher-quality language learning, more fluent studies, and increased motivation toward language studies. In addition, integration is an opportunity to optimize the use of resources and to offer courses that are better tailored to the students' field of study and to the skills needed in working life. The reform also aims to increase and develop co-operation between different departments at the university and to develop scientific follow-up. This paper gives an overview of the integration process conducted at TUT and gives examples of adjunct CLIL implementations in three different languages.
Abstract:
Objective The present study aimed to describe a case series in which a preoperative diagnosis of intestinal complications secondary to accidentally ingested dietary foreign bodies was made by multidetector-row computed tomography (MDCT), with emphasis on complementary findings yielded by volume rendering techniques (VRT) and curved multiplanar reconstructions (MPR). Materials and Methods The authors retrospectively assessed five patients with surgically confirmed intestinal complications (perforation and/or obstruction) secondary to unsuspected ingested dietary foreign bodies, consecutively treated at their institution between 2010 and 2012. Demographic, clinical, laboratory and radiological data were analyzed. VRT and curved MPR were subsequently performed. Results A preoperative diagnosis of intestinal complications was made in all cases. In one case the presence of a foreign body was not initially identified as the causal factor, and the use of the complementary techniques facilitated its retrospective identification. In all cases these tools allowed a better depiction of the entire foreign body on a single image section, contributing to the assessment of its morphology. Conclusion Although the use of the complementary techniques did not have a direct impact on diagnostic performance in most cases of this series, they may provide a better depiction of foreign bodies' morphology on a single image section.
Abstract:
Selling is a much-maligned, often undervalued subject whose inadequate showing in business schools is in inverse proportion to the many job opportunities it offers and the importance of salespeople in bringing income to companies. The purpose of this research is to increase the understanding of customer-oriented selling and to examine the influence of a customer-oriented philosophy on the selling process, the applicability of selling techniques to this philosophy, and their importance to salespeople. The empirical section of the study is twofold. First, the data for the qualitative part were collected by conducting five thematic interviews among sales consultants and case-company representatives. The findings of the study indicate that customer-oriented selling requires activity from salespeople. In the customer-oriented personal selling process, salespeople invest time in the preplanning, need analysis and benefit demonstration stages. However, the findings propose that salespeople today must also have the basic capabilities for executing the traditional sales process, and the balance between the traditional and consultative selling processes will change as the duration of the relationship between the salesperson and customer increases. The study also proposes that selling techniques still belong to the customer-oriented selling process, although their roles might be modest. This thesis mapped 75 selling techniques, and the quantitative part of the study explored which selling techniques are considered important by salespeople in the direct selling industry when they sell to new and existing customers. The response rate of the survey was 69.5%.
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that this is a hard skill to learn, and the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two different visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies on what kinds of effects the introduction and usage of these tools have on students' opinions and performance, and what the implications are from a teacher's point of view. The results from the studies in this thesis show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which was shown in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements in the learning task. These kinds of tools can help us to cope with the fact that many programming courses are overcrowded relative to their limited teaching resources. They allow us to tackle this problem by utilizing automatic assessment in the exercises that are most suitable to be done on the web (like tracing and simulation), since this supports students' independent learning regardless of time and place.
In summary, we can use our courses' resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even increase the performance of the students. There are also methodological results from this thesis which contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. First, we need to start to deliver the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching like TRAKLA2 and ViLLE. Second, we have relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This can transform academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and overall increase the knowledge of best practices. In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time conduct multi-national research projects easily.
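The web-based tracing exercises with automatic grading and immediate feedback discussed above can be illustrated with a toy grader: it runs the exercise program one statement at a time, records each value of the traced variable, and compares the student's predictions step by step. This is a hypothetical sketch, not the implementation of TRAKLA2 or ViLLE.

```python
def grade_trace(code, var, student_answers):
    """Grade a variable-tracing exercise: compare the student's predicted
    values of `var` after each line with the values produced by actually
    executing the code (a toy oracle; it assumes one statement per line)."""
    env = {}
    history = []
    for line in code.splitlines():
        exec(line, env)                 # run one statement of the exercise
        if var in env:
            history.append(env[var])    # record the traced variable's value
    correct = sum(1 for got, want in zip(student_answers, history)
                  if got == want)
    score = correct / len(history) if history else 0.0
    # Immediate feedback pinpoints exactly where the student's trace diverged.
    feedback = [f"step {i + 1}: expected {want}, you answered {got}"
                for i, (got, want) in enumerate(zip(student_answers, history))
                if got != want]
    return score, feedback
```

For the program `x = 1; x = x + 2; x = x * 3`, a student answering 1, 3, 8 would score 2/3 and be told exactly which step went wrong, which is the kind of targeted, instant feedback that manual grading cannot scale to.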
Abstract:
Software systems are expanding and becoming increasingly present in everyday activities. The constantly evolving society demands that they deliver more functionality, be easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable financial loss or life endangerment. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting.
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about the integration of formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as the related artefacts, e.g. models. These have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems by, e.g., controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide the metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, which are based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies. The results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and a strategy towards quality improvement.