47 results for Multiple Programming
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that this is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies about what kind of effects the introduction and usage of these tools have on students' opinions and performance, and what kind of implications there are from a teacher's point of view. The results from the studies in this thesis show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tool motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. These kinds of tools can help us cope with the fact that many programming courses are overcrowded and teaching resources are limited. The tools allow us to tackle this problem by utilizing automatic assessment in exercises that are most suitable to be done on the web (such as tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. There are also methodological results from this thesis which contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen together with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have the relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This approach can turn academic teaching into publications, and by utilizing it we can significantly increase the adoption of new tools and techniques and overall increase the knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time conduct multi-national research projects easily.
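As a rough illustration of the automatically assessed, web-based tracing exercises discussed in the abstract above (a minimal sketch only; the function names and feedback format are hypothetical and do not reflect the actual TRAKLA2 or ViLLE implementations), a grader can compare a student's predicted execution trace against a reference trace and return immediate feedback:

    # Minimal sketch of automatic assessment for a tracing exercise.
    # Hypothetical example; the real TRAKLA2/ViLLE graders are far richer.

    def reference_trace(values):
        """Reference: state of the list after each pass of bubble sort."""
        data = list(values)
        trace = []
        for i in range(len(data) - 1):
            for j in range(len(data) - 1 - i):
                if data[j] > data[j + 1]:
                    data[j], data[j + 1] = data[j + 1], data[j]
            trace.append(tuple(data))
        return trace

    def grade(student_trace, values):
        """Compare the student's predicted states to the reference, pass by pass."""
        expected = reference_trace(values)
        correct = sum(1 for s, e in zip(student_trace, expected) if tuple(s) == e)
        return correct == len(expected), f"{correct}/{len(expected)} passes traced correctly"

    ok, feedback = grade([(2, 1, 3, 5), (1, 2, 3, 5), (1, 2, 3, 5)], [5, 2, 1, 3])
    print(ok, feedback)   # True 3/3 passes traced correctly

Immediate, per-pass feedback of this kind is one concrete way to realize the higher engagement levels the abstract refers to.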
Abstract:
This final project was made for the Broadband/Implementation department of TeliaSonera Finland. The question to be examined is whether the operator should replace multiple ADSL connections implemented over a leased line with Multi-Dwelling access based on an Ethernet/optical fibre access network. The project starts by describing the technology related to these access network solutions and presents the technology that is used in TeliaSonera Finland's access network. It continues from the technology to describe the problem with some of TeliaSonera's ADSL implementations. The problem is the implementations done over a leased line, which can, over the years, cost TeliaSonera as much as a possible investment to extend the network would, when several lines are leased to the same building. The project proposes Multi-Dwelling access as a solution to this problem and defines the circumstances in which to use it. After a satisfactory solution has been found, the project examines how implementation of the solution might alter the network, and a new problem is found: when commonly used to replace the need for ADSL implementations, Multi-Dwelling access would significantly increase optical cable congestion near the operator's POP. Finally, this project also proposes a technical change to the existing way of implementing Multi-Dwelling access with EPON technology.
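The cost comparison sketched in the abstract above can be made concrete with a simple break-even check; the figures below are invented placeholders and not TeliaSonera prices:

    # Hypothetical break-even check: recurring leased-line cost versus a
    # one-off investment in Multi-Dwelling (Ethernet/optical fibre) access
    # to the same building.

    def leased_line_cost(lines, monthly_fee, years):
        return lines * monthly_fee * 12 * years

    def prefer_multi_dwelling(lines, monthly_fee, years, investment):
        """True if the fibre investment pays for itself within the horizon."""
        return leased_line_cost(lines, monthly_fee, years) >= investment

    # Made-up example: 4 leased lines at 120 per month over 5 years
    # versus a 20 000 one-off network extension.
    print(leased_line_cost(4, 120, 5))              # 28800
    print(prefer_multi_dwelling(4, 120, 5, 20000))  # True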
Abstract:
Background: Multiple sclerosis (MS) is a demyelinating disease of the central nervous system, which mainly affects young adults. In Finland, approximately 2500 of the 6000 MS patients have relapsing MS and are treated with disease modifying drugs (DMD): interferon-β (IFN-β-1a or IFN-β-1b) and glatiramer acetate (GA). Depending on the IFN-β preparation used, 2 % to 40 % of patients develop neutralizing antibodies (NAbs), which abolish the biological effects of IFN-β, leading to reduced clinical and MRI-detected efficacy. According to the Finnish Current Care Guidelines and the European Federation of Neurological Societies (EFNS) guidelines, it is suggested to measure the presence of NAbs during the first 24 months of IFN-β therapy. Aims: The aim of this thesis was to measure the bioactivity of IFN-β therapy by focusing on the induction of MxA protein (myxovirus resistance protein A) and its correlation to neutralizing antibodies (NAbs). A new MxA EIA assay was set up to offer an easier and more rapid method for MxA protein detection in clinical practice. In addition, the tolerability and safety of GA were evaluated in patients who had discontinued IFN-β therapy due to side effects or lack of efficacy. Results: NAbs developed towards the end of 12 months of treatment, and binding antibodies were detectable before or in parallel with them. The NAb titer correlated negatively with the amount of MxA protein, and the mean pre-injection MxA levels never returned to the true baseline in NAb-negative patients but tended to drop in the NAb-positive group. The test results of the MxA EIA and the flow cytometric analysis showed a significant correlation. GA reduced the relapse rate and was a safe and well-tolerated therapy in IFN-β-intolerant MS patients. Conclusions: NAbs inhibit the induction of MxA protein, which can be used as a surrogate marker of the bioactivity of IFN-β therapy. Compared to flow cytometric analysis and the NAb assay, MxA EIA seemed to be a sensitive and more practical method for measuring the actual bioactivity of IFN-β treatment in clinical use, which is of value also from a cost-effectiveness perspective.
Abstract:
Wild guess, lucky guess, good guess - hazarding at a multiple-choice test of listening comprehension
Abstract:
This bachelor's thesis examines introductory programming instruction, a central topic in basic computer science education, and the problems associated with it. The work looks into basic teaching methods and approaches for programming, as well as solutions that can make the teaching more effective. These solutions include, among others, the choice of programming language, finding a suitable development environment, and identifying teaching aids that support the course. In addition, the selection of activities related to running the course, such as exercises and possible weekly assignments, is part of this work. The thesis itself approaches the topic by studying the suitability of Python for introductory programming teaching, for example by comparing it with other common teaching languages such as C, C++ and Java. It examines the strengths and weaknesses of the language and investigates whether Python can naturally be used as the primary teaching language. The work also considers what should be taught on such a course, how the course would be most effectively run, and what kind of technical framework would be sensible to choose for its implementation.
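To illustrate the kind of comparison the thesis makes, a typical first exercise written in Python is shown below (a generic textbook-style example, not material from the thesis); the equivalent C or Java version would additionally need explicit types, a main function or class, and a compile step, which is the usual argument for Python as a first teaching language:

    # A first exercise in Python: read numbers and report their average.
    def average(numbers):
        return sum(numbers) / len(numbers)

    values = [int(token) for token in input("Give numbers: ").split()]
    print("Average:", average(values))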
Abstract:
The main objective of this master's thesis is to study robot programming using simulation software, and how to embed the simulation software into the company's own robot controlling software. A further goal is to study a new communication interface to the assembly line's components, more precisely how to connect the robot cell to this new communication system. Conveyor lines in which the conveyors use the new communication standard are already available. The robot cell is not yet capable of communicating with other devices using the new communication protocols. The main problem among robot manufacturers is that they all have their own communication systems and programming languages. There was no common programming language for programming the robots of all the different manufacturers until the RRS (Realistic Robot Simulation) standards were developed. RRS-II makes it possible to create the robot programs in the simulation software and gives a common user interface for different robot manufacturers' robots. This thesis will present the RRS-II standard and the robot manufacturers' situation regarding RRS-II support. The thesis presents how the simulation software can be embedded into the company's own robot controlling software and how the robot cell can be connected to the CAMX (Computer Aided Manufacturing using XML) communication system.
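As a very rough sketch of XML-based equipment messaging of the kind CAMX builds on (the element names and message structure below are hypothetical placeholders and are not taken from the CAMX/IPC specifications), a robot cell status message could be assembled as follows:

    # Hypothetical example of building an XML status message for a robot cell.
    # Element names are illustrative only; the real CAMX message schemas are
    # defined by the IPC standards and are not reproduced here.
    import xml.etree.ElementTree as ET

    def build_status_message(cell_id, state):
        root = ET.Element("EquipmentStatus")
        ET.SubElement(root, "CellID").text = cell_id
        ET.SubElement(root, "State").text = state
        return ET.tostring(root, encoding="unicode")

    print(build_status_message("robot-cell-01", "IDLE"))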
Abstract:
The underlying cause of many human autoimmune diseases is unknown, but several environmental factors are implicated in triggering the self-destructive immune reactions. Multiple sclerosis (MS) is a chronic autoimmune disease of the central nervous system, potentially leading to persistent neurological deterioration. The cause of MS is not known, and apart from immunomodulatory treatments there is no cure. In the early phase of the disease, relapsing-remitting MS (RR-MS) is characterized by unpredictable exacerbations of the neurological symptoms called relapses, which can occur at intervals ranging from 4 weeks to several years. Microbial infections are known to be able to trigger MS relapses, and patients are instructed to avoid all factors that might increase the risk of infections, to use antibiotics properly and to take care of dental hygiene. Among the environmental factors known to increase susceptibility to infections, high ambient levels of inhalable particulate matter affect all people within a geographical region. During the period of interest in this thesis, the occurrence of MS relapses could be effectively reduced by injections of interferon, which has immunomodulatory and antiviral properties. In this thesis, ecological and epidemiological analyses were used to study the possible connection between MS relapse occurrence, population-level viral infections and air quality factors, as well as the effects of interferon medication. Hospital archive data were collected retrospectively from 1986 to 2001, a period ranging from when interferon medication first became available until just before other disease-modifying MS therapies arrived on the market. The grouped data were studied with logistic regression and intervention analysis, and individual patient data with survival analysis. Interferons proved to be effective in the treatment of MS in this observational study, as the number of MS exacerbations was lower during interferon use than before interferon treatment. A statistically significant temporal relationship between MS relapses and inhalable particulate matter (PM10) concentrations was found in this study, which implies that MS patients are affected by exposure to PM10. Interferon probably protected against the effect of PM10, because a significant increase in the risk of exacerbations was observed only in MS patients without interferon medication following environmental exposure to population-level specific viral infections and PM10. Apart from being antiviral, interferon could thus also attenuate the enhancement of immune reactions caused by ambient air PM10. The retrospective approach utilizing carefully constructed hospital records proved to be an economical and reliable source of MS disease information for statistical analyses.
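A minimal sketch of the grouped-data logistic regression step mentioned above is shown below; the numbers are synthetic and the variable names hypothetical, and the actual analyses also included intervention and survival models that are not reproduced here:

    # Sketch of a logistic regression of relapse occurrence on PM10 exposure
    # and interferon use, with synthetic data for illustration only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    pm10 = rng.uniform(5, 60, n)          # hypothetical PM10 levels (ug/m3)
    interferon = rng.integers(0, 2, n)    # 1 = on interferon treatment
    logit = -2.0 + 0.03 * pm10 - 0.8 * interferon
    relapse = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([pm10, interferon]))
    model = sm.Logit(relapse, X).fit(disp=False)
    print(model.params)   # intercept, PM10 effect, interferon effect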
Abstract:
Most Finnish periodical magazines have a website, often an online service. The objective of this thesis is to understand the magazines' resources and capabilities and to match them with the goals and objectives of online strategies. The theoretical part of the thesis focuses on explaining and classifying resources, capabilities, goals and objectives, and on applying them to the Finnish magazine publishing context. The empirical part is a comparative case study of four magazines. The findings indicate that, with cooperating, advertising and community hosting capabilities, magazines can utilize their human, brand, content and customer base resources. These resources can in turn be directed towards reaching profitability, customer-centricity and brand congruency goals.
Abstract:
Agile software development has grown in popularity since the Agile Manifesto was declared in 2001. However, there is a strong belief that agile methods are not suitable for embedded, critical or real-time software development, even though multiple studies and cases show otherwise. This thesis presents a custom agile process that can be used in embedded software development. The reasons for the presumed unfitness of agile methods in embedded software development have mainly been based on the feeling that these methods provide no real control, no strict discipline and less rigorous engineering practices. One starting point is therefore to provide a light process with a disciplined approach to embedded software development. Agile software development has gained popularity because there are still big issues in software development as a whole: projects fail due to schedule slips, budget overruns or failure to meet the business needs. This does not change when talking about embedded software development. These issues are still valid, with multiple new ones arising from the quite complex and hard domain in which embedded software developers work. These issues are another starting point for this thesis. The thesis is based heavily on Feature Driven Development (FDD), a software development methodology that can be seen as a runner-up to the most popular agile methodologies. FDD as such is quite process oriented and lacks a few practices commonly considered extremely important in agile development methodologies. In order for FDD to gain acceptance in the software development community, it needs to be modified and enhanced. This thesis presents an improved custom agile process that can be used in embedded software development projects varying in size from 10 to 500 persons. The process is based on Feature Driven Development and, where suitable, on Extreme Programming, Scrum and Agile Modeling. Finally, this thesis presents how the new process responds to the common issues in embedded software development. The process of creating the new process is evaluated in a retrospective, and guidelines for such process creation work are introduced. These emphasize agility also in process development, through early and frequent deliveries and the teamwork needed to create a suitable process.
Abstract:
Western societies have been faced with the fact that overweight, impaired glucose regulation and elevated blood pressure are already prevalent in pediatric populations. This will inevitably mean an increase in later manifestations of cardio-metabolic diseases. The dilemma has been suggested to stem from fetal life, and it is surmised that the early nutritional environment plays an important role in the process called programming. The aim of the present study was to characterize early nutritional determinants associated with cardio-metabolic risk factors in fetuses, infants and children. Further, the study was designed to establish whether dietary counseling initiated in early pregnancy can modify this cascade. Healthy mother-child pairs (n=256) participating in a dietary intervention study were followed from early pregnancy to childhood. The intervention included detailed dietary counseling by a nutritionist targeting saturated fat intake in excess of recommendations and fiber consumption below recommendations. Cardio-metabolic programming was studied by characterizing the offspring's cardio-metabolic risk factors, such as over-activation of the autonomic nervous system, elevated blood pressure and an adverse metabolic status (e.g. a high serum split proinsulin concentration). Fetal cardiac sympathovagal activation was measured during labor. Postnatally, the children's blood pressure was measured at six-month and four-year follow-up visits. Further, the infants' metabolic status was assessed by means of growth and serum biomarkers (32-33 split proinsulin, leptin and adiponectin) at the age of six months. This study showed that fetal cardiac sympathovagal activity was positively associated with maternal pre-pregnancy body mass index, indicating adverse cardio-metabolic programming in the offspring. Further, a reduced risk of high split proinsulin in infancy and lower blood pressure in childhood were found in those offspring whose mothers' weight gain and amount and type of dietary fat during pregnancy were as recommended. Of note, maternal dietary counseling from early pregnancy onwards could ameliorate the offspring's metabolic status by reducing the risk of a high split proinsulin concentration, although it had no effect on the other cardio-metabolic markers in the offspring. In the postnatal period, breastfeeding proved to entail benefits for cardio-metabolic programming. Finally, the recommended dietary protein and total fat content in the child's diet were important nutritional determinants reducing blood pressure at the age of four years. The intrauterine and immediate postnatal periods comprise a window of opportunity for interventions aiming to reduce the risk of cardio-metabolic disorders, and bring the prospect of achieving health benefits within one generation.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
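To give a flavour of the invariants-first workflow described in the abstract above without Socos, invariant diagrams or PVS (a minimal sketch only, not the thesis' notation), the loop invariant of a simple summation routine can be written down before the loop and checked after every step; in invariant-based programming the corresponding conditions are discharged by the theorem prover rather than checked at run time:

    # Invariant-first sketch: the invariant is stated before the loop body,
    # and each step is checked against it (here at run time; in the method
    # described above these checks become verification conditions).

    def array_sum(a):
        i, s = 0, 0
        # Invariant: 0 <= i <= len(a) and s == sum(a[:i])
        assert 0 <= i <= len(a) and s == sum(a[:i])
        while i < len(a):
            s += a[i]
            i += 1
            assert 0 <= i <= len(a) and s == sum(a[:i])   # invariant maintained
        # Postcondition follows from the invariant and the negated loop guard.
        assert i == len(a) and s == sum(a)
        return s

    print(array_sum([3, 1, 4, 1, 5]))   # 14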