985 results for Event-Driven Programming
Abstract:
Web application performance testing is an emerging and important field of software engineering. As web applications become more commonplace and complex, the need for performance testing will only increase. This paper discusses common concepts, practices and tools that lie at the heart of web application performance testing. A pragmatic, hands-on approach is assumed where applicable; real-life examples of test tooling, execution and analysis are presented right next to the underpinning theory. At the client-side, web application performance is primarily driven by the amount of data transmitted over the wire. At the server-side, selection of programming language and platform, implementation complexity and configuration are the primary contributors to web application performance. Web application performance testing is an activity that requires delicate coordination between project stakeholders, developers, system administrators and testers in order to produce reliable and useful results. Proper test definition, execution, reporting and repeatable test results are of utmost importance. Open-source performance analysis tools such as Apache JMeter, Firebug and YSlow can be used to realise effective web application performance tests. A sample case study using these tools is presented in this paper. The sample application was found to perform poorly even under the moderate load incurred by the sample tests.
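To make the measurement concrete, below is a minimal load-test sketch in Python: a number of concurrent virtual users fetch a single URL and latency percentiles are reported. The target URL, user count and request count are hypothetical placeholders; the case study itself used Apache JMeter, Firebug and YSlow rather than this script.

```python
# Minimal load-test sketch: USERS concurrent virtual users fetching one URL,
# reporting latency percentiles. The target URL is a hypothetical placeholder.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://localhost:8080/"  # hypothetical application under test
USERS = 20                         # concurrent virtual users
REQUESTS_PER_USER = 10

def one_user() -> list[float]:
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET) as resp:
            resp.read()            # include transfer time, not just first byte
        latencies.append(time.perf_counter() - start)
    return latencies

with ThreadPoolExecutor(max_workers=USERS) as pool:
    results = [lat for user in pool.map(lambda _: one_user(), range(USERS))
               for lat in user]

print(f"median: {statistics.median(results) * 1000:.1f} ms")
print(f"p95:    {statistics.quantiles(results, n=20)[18] * 1000:.1f} ms")
```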
Abstract:
The degradability of Zn-EDTA by a catechol-driven Fenton reaction was studied. Response surface methodology with a central composite design was employed to maximize the degradation of this complex. Theoretical speciation calculations were in good agreement with the experimental results. Fenton and Fenton-type treatments are typically thought to be applicable only in the highly acidic range, which represents a major operational constraint. Interestingly, at optimized concentrations, this CAT-driven Fenton reaction at pH 5.5 achieved 100% Zn-EDTA degradation and 60% COD and 17% TOC removals, using small amounts of CAT (50 µM), Fe(III) (445 µM) and H2O2 (20 mM) with no evident ferric sludge.
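For readers unfamiliar with the method, the following sketch shows the response-surface step of a central composite design: a quadratic model fitted by least squares and its stationary point located as a candidate optimum. The design points and responses are illustrative placeholders, not the paper's measured Zn-EDTA data.

```python
# Response-surface fit for a two-factor central composite design:
# y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2, by least squares.
# All design points and responses below are made-up placeholders.
import numpy as np

# coded factor levels (e.g., x1 = CAT dose, x2 = H2O2 dose) and responses
x1 = np.array([-1, -1, 1, 1, 0, 0, -1.41, 1.41, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1.41, 1.41])
y = np.array([55, 62, 70, 88, 95, 96, 50, 80, 58, 75])  # % degradation (illustrative)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(beta, 2))

# stationary point of the fitted surface (candidate optimum in coded units):
# solve grad y = 0, i.e. [[2*b11, b12], [b12, 2*b22]] @ x = -[b1, b2]
A = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
xs = np.linalg.solve(A, -beta[1:3])
print("stationary point (coded units):", np.round(xs, 2))
```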
Abstract:
Fusarium Head Blight (FHB) is a disease of great concern in wheat (Triticum aestivum). Due to its relatively narrow susceptible phase and environmental dependence, the pathosystem is suitable for modeling. In the present work, a mechanistic model for estimating an infection index of FHB was developed. The model is process-based, driven by rates, rules and coefficients for estimating the dynamics of flowering, airborne inoculum density and infection frequency. The latter is a function of temperature during an infection event (IE), which is defined based on a combination of daily records of precipitation and mean relative humidity. The daily infection index is the product of the daily proportion of susceptible tissue available, the infection frequency and the spore cloud density. The model was evaluated with an independent dataset of epidemics recorded in experimental plots (five years and three planting dates) at Passo Fundo, Brazil. Four models using different factors were tested, and the results showed that all were able to explain variation in disease incidence and severity. A model that uses a correction factor extending host susceptibility, together with daily spore cloud density to account for post-flowering infections, was the most accurate, explaining 93% of the variation in disease severity and 69% of the variation in disease incidence according to regression analysis.
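The core bookkeeping of the model can be sketched in a few lines: the daily infection index as the product of susceptible tissue available, temperature-driven infection frequency during an infection event, and spore cloud density. The functional forms, thresholds and parameter values below are illustrative placeholders, not the paper's fitted ones.

```python
# Sketch of the daily infection index: susceptible tissue * infection
# frequency (temperature response) * spore cloud density, gated by whether
# the day qualifies as an infection event. All forms/values are placeholders.
def infection_frequency(mean_temp_c: float) -> float:
    """Hypothetical temperature response, peaking near 25 C."""
    if not 10.0 <= mean_temp_c <= 35.0:
        return 0.0
    return max(0.0, 1.0 - abs(mean_temp_c - 25.0) / 12.0)

def is_infection_event(precip_mm: float, mean_rh_pct: float) -> bool:
    """IE defined from daily precipitation and mean relative humidity
    (thresholds here are placeholders, not the paper's definition)."""
    return precip_mm >= 0.3 or mean_rh_pct >= 80.0

def daily_infection_index(susceptible_frac: float, mean_temp_c: float,
                          spore_density: float, precip_mm: float,
                          mean_rh_pct: float) -> float:
    if not is_infection_event(precip_mm, mean_rh_pct):
        return 0.0
    return susceptible_frac * infection_frequency(mean_temp_c) * spore_density

# example day: 40% of tissue susceptible, 24 C, relative spore density 0.6
print(daily_infection_index(0.4, 24.0, 0.6, precip_mm=5.0, mean_rh_pct=85.0))
```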
Abstract:
Agile coaching of a project team is one way to aid the learning of agile methods. The objective of this thesis is to present an agile coaching plan and to follow how complying with the plan affects the project teams. In addition, how the agile methods work in the projects is followed. Two projects are used to support the research. From the thesis point of view, the task in the first project is to coach the project team and two new coaches. The task in the second project is also to coach the project team, but this time with one of the new coaches acting as the coach. The projects utilize the agile methods Scrum and Extreme Programming; in the latter project, test-driven development, continuous integration and pair programming are examined more closely. The results of the work are based on observations from the projects and on the analysis derived from those observations. The results are divided into the effects of the coaching and the functionality of the agile methods in the projects. Because of the small sample, the results are indicative. The presented plan for coaching the agile methods needs further development, but the results on the functionality of the agile methods are encouraging.
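As a minimal illustration of the test-driven development practice the thesis concentrates on, the sketch below writes the test first and then just enough code to make it pass. The function under test is a made-up example, not taken from the coached projects.

```python
# Minimal test-driven development cycle: the failing test is written first,
# then just enough implementation to make it pass. Example only.
import unittest

def fizzbuzz(n: int) -> str:
    # written *after* the test below, to make it pass
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

class TestFizzBuzz(unittest.TestCase):
    def test_multiples(self):
        self.assertEqual(fizzbuzz(3), "Fizz")
        self.assertEqual(fizzbuzz(5), "Buzz")
        self.assertEqual(fizzbuzz(15), "FizzBuzz")
        self.assertEqual(fizzbuzz(7), "7")

if __name__ == "__main__":
    unittest.main()
```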
Abstract:
This thesis was made for the business segment of a large forest industry company. The purpose of the study was to improve the performance of the business segment's order-to-delivery process. The study proceeded in three phases. The first phase was to define customer expectations in the market. The second phase was to analyse the performance and the operations of the order-to-delivery process and to define any challenges or problems in serving the customers. The third and final phase was to improve the performance of the order-to-delivery process within the scope defined by the first two phases. The analysis showed that delivery reliability is an essential but challenging issue in the case company's markets. From a delivery reliability standpoint, the most challenging factors were the information flow distortions detected both within the company and across the whole supply chain, and the lack of horizontal control over the multi-stage process.
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. The thesis includes results from multiple empirical studies on the effects that the introduction and usage of these tools have on students' opinions and performance, and on the implications from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which showed in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded while teaching resources are limited: they allow us to tackle this problem by applying automatic assessment to the exercises best suited to the web (such as tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. The thesis also presents methodological results that contribute to the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to the tool and to the graphical notation it uses. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This approach can turn academic teaching into publications, significantly increase the adoption of new tools and techniques, and overall increase the knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time conduct multi-national research projects with ease.
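As a rough illustration of the automatic assessment with immediate feedback described above, the sketch below grades a student's predicted variable trace against the trace produced by actually executing the code, in the spirit of the tracing exercises in TRAKLA2 and ViLLE. The exercise, scoring and feedback text are illustrative placeholders, not the tools' actual implementation.

```python
# Sketch of auto-graded tracing with immediate feedback: execute a small
# program, record its variable trace, and grade a student's prediction.
def reference_trace() -> list[int]:
    """Values of `total` after each iteration of: total += i for i in 1..4."""
    trace, total = [], 0
    for i in range(1, 5):
        total += i
        trace.append(total)
    return trace

def grade(student_trace: list[int]) -> tuple[int, str]:
    expected = reference_trace()
    for step, (want, got) in enumerate(zip(expected, student_trace), start=1):
        if want != got:
            return step - 1, f"Step {step}: expected total={want}, you answered {got}."
    if len(student_trace) < len(expected):
        return len(student_trace), "Trace incomplete: the loop runs four times."
    return len(expected), "Correct! Full marks."

score, feedback = grade([1, 3, 5, 10])
print(score, feedback)  # immediate feedback: the error is located at step 3
```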
Abstract:
Agile software development has grown in popularity since the agile manifesto was declared in 2001. However, there is a strong belief that agile methods are not suitable for embedded, critical or real-time software development, even though multiple studies and cases show otherwise. This thesis presents a custom agile process that can be used in embedded software development. The reasons for the presumed unfitness of agile methods in embedded software development have mainly been based on a feeling that these methods provide no real control, no strict discipline and less rigorous engineering practices. One starting point is therefore to provide a light process with a disciplined approach to embedded software development. Agile software development has gained popularity because there are still big issues in software development as a whole: projects fail due to schedule slips, budget overruns or failure to meet business needs. This does not change when talking about embedded software development. These issues remain valid, with multiple new ones arising from the quite complex and hard domain in which embedded software developers work. These issues are another starting point for this thesis. The thesis is based heavily on Feature Driven Development (FDD), a software development methodology that can be seen as a runner-up to the most popular agile methodologies. FDD as such is quite process-oriented and lacks a few practices commonly considered extremely important in agile development methodologies; in order to gain acceptance in the software development community it needs to be modified and enhanced. This thesis presents an improved custom agile process that can be used in embedded software development projects whose size varies from 10 to 500 persons. The process is based on Feature Driven Development and, in suitable parts, on Extreme Programming, Scrum and Agile Modeling. Finally, the thesis presents how the new process responds to the common issues in embedded software development. The process of creating the new process is evaluated in a retrospective, and guidelines for such process creation work are introduced. These emphasize agility also in process development, through early and frequent deliveries and the teamwork needed to create a suitable process.
Abstract:
Western societies have been faced with the fact that overweight, impaired glucose regulation and elevated blood pressure are already prevalent in pediatric populations. This will inevitably mean an increase in later manifestations of cardio-metabolic diseases. The dilemma has been suggested to stem from fetal life, and it is surmised that the early nutritional environment plays an important role in the process called programming. The aim of the present study was to characterize early nutritional determinants associated with cardio-metabolic risk factors in fetuses, infants and children. Further, the study was designed to establish whether dietary counseling initiated in early pregnancy can modify this cascade. Healthy mother-child pairs (n=256) participating in a dietary intervention study were followed from early pregnancy to childhood. The intervention included detailed dietary counseling by a nutritionist, targeting saturated fat intake in excess of recommendations and fiber consumption below recommendations. Cardio-metabolic programming was studied by characterizing the offspring's cardio-metabolic risk factors, such as over-activation of the autonomic nervous system, elevated blood pressure and an adverse metabolic status (e.g. a high serum split proinsulin concentration). Fetal cardiac sympathovagal activation was measured during labor. Postnatally, the children's blood pressure was measured at six-month and four-year follow-up visits. Further, the infants' metabolic status was assessed by means of growth and serum biomarkers (32-33 split proinsulin, leptin and adiponectin) at the age of six months. The study showed that fetal cardiac sympathovagal activity was positively associated with maternal pre-pregnancy body mass index, indicating adverse cardio-metabolic programming in the offspring. Further, a reduced risk of high split proinsulin in infancy and lower blood pressure in childhood were found in those offspring whose mothers' weight gain and amount and type of dietary fats during pregnancy were as recommended. Of note, maternal dietary counseling from early pregnancy onwards could ameliorate the offspring's metabolic status by reducing the risk of a high split proinsulin concentration, although it had no effect on the other cardio-metabolic markers in the offspring. In the postnatal period, breastfeeding proved to entail benefits in cardio-metabolic programming. Finally, the recommended dietary protein and total fat content in the child's diet were important nutritional determinants reducing blood pressure at the age of four years. The intrauterine and immediate postnatal periods comprise a window of opportunity for interventions aiming to reduce the risk of cardio-metabolic disorders, and bring the prospect of achieving health benefits within one generation.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code, proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
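The thesis constructs its verified sorting algorithm as invariant diagrams discharged by PVS and Yices; as an executable caricature of the same discipline, the sketch below states the loop invariant of a selection sort explicitly and checks it at run time. A run-time assertion is of course far weaker than the machine-checked proofs the Socos tool produces.

```python
# Selection sort with its loop invariant made explicit and checked at run
# time, mimicking (not replacing) the invariant-based development workflow.
def is_sorted(xs, hi):
    return all(xs[i] <= xs[i + 1] for i in range(hi - 1))

def selection_sort(xs: list) -> list:
    n = len(xs)
    for i in range(n):
        # Invariant: xs[0:i] is sorted, and every element of xs[0:i]
        # is <= every element of xs[i:n].
        assert is_sorted(xs, i)
        assert all(a <= b for a in xs[:i] for b in xs[i:])
        m = min(range(i, n), key=xs.__getitem__)  # index of smallest remaining
        xs[i], xs[m] = xs[m], xs[i]
    assert is_sorted(xs, n)  # postcondition
    return xs

print(selection_sort([4, 1, 3, 2]))  # [1, 2, 3, 4]
```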
Abstract:
Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject to their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland, the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics in which formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation, at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind and has a simple syntax compared with many other popular languages. The aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
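As a flavour of the first approach, here is one possible rendering of a small structured derivation: each step is a transformation whose justification is written out explicitly in braces. The example equation is illustrative and not taken from the thesis.

```latex
% One possible rendering of a structured derivation (requires amsmath):
% every step carries an explicit justification in curly braces.
\begin{align*}
        & \; 3x + 6 = 0 \\
\equiv\ & \quad \{\text{subtract } 6 \text{ from both sides}\} \\
        & \; 3x = -6 \\
\equiv\ & \quad \{\text{divide both sides by } 3 \neq 0\} \\
        & \; x = -2
\end{align*}
```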
Abstract:
The objective of this master's thesis was to examine technology-based smart home devices and services. The topic was approached through two basic theories, transaction cost theory and the resource-based view, in order to build a basis for the thesis. The conceptual framework was discussed in terms of networks, value networks and service systems, which provide a useful framework for service development. The needs of the elderly living at home were discussed in order to find out which technology-based services could be used to satisfy those needs. Segmentation and need data collected previously during proactive home visits were exploited, and additionally a survey targeted at experts and professionals of the social and health care sector was conducted to verify the needs. Finally, the results of the survey were analyzed using the quality function deployment method to identify the most important and suitable service offerings for the elderly. The analysis concluded that social media and monitoring services are the most useful technology-based services; however, traditional home services will still maintain their necessity too.
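The prioritization step can be sketched as follows: customer-need importance weights from the survey are propagated through a need-versus-service relationship matrix (conventionally scored 1/3/9) to rank the offerings. The needs, services, weights and scores below are illustrative placeholders, not the thesis data.

```python
# Quality function deployment sketch: need weights @ relationship matrix
# = service priority scores. All inputs are made-up placeholders.
import numpy as np

needs = ["safety", "social contact", "daily monitoring"]
weights = np.array([5, 4, 3])  # importance ratings from the expert survey

services = ["social media", "monitoring service", "traditional home service"]
# relationship matrix: rows = needs, columns = services (1 weak ... 9 strong)
R = np.array([
    [1, 9, 3],
    [9, 1, 3],
    [3, 9, 1],
])

scores = weights @ R
for service, score in sorted(zip(services, scores), key=lambda p: -p[1]):
    print(f"{service}: {score}")
```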
Abstract:
The objective of this bachelor's thesis is to examine the effect of stock splits on the market value of shares in Finland in 1996-2007. The phenomenon is examined by means of the event study method, and the final number of stock splits studied is 38. No abnormal returns were found at the announcement of the splits, so according to this data investors did not regard a split as a positive signal from management. By contrast, a positive price reaction was found for those shares for which the listed company announced a dividend distribution alongside the split.
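The event study logic can be sketched briefly: expected returns come from a market-model regression over an estimation window, abnormal returns are the differences around the announcement, and these are cumulated and tested. The return series below are random placeholders, not the 38 Finnish splits studied.

```python
# Event study sketch: market-model expected returns, abnormal returns
# around the event, cumulative abnormal return and a simple t-test.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 140)                  # daily market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.008, 140)

est, event = slice(0, 120), slice(120, 140)             # estimation / event window
beta, alpha = np.polyfit(market[est], stock[est], 1)    # market model fit

abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.cumsum()[-1]                             # cumulative abnormal return
t_stat = abnormal.mean() / (abnormal.std(ddof=1) / np.sqrt(abnormal.size))
print(f"CAR over event window: {car:.4f}, t = {t_stat:.2f}")
```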
Abstract:
In this thesis, simple methods have been sought to lower the teacher's threshold for starting to apply constructive alignment in instruction. From the phases of the instructional process, aspects that the teacher can improve with little effort have been identified. Teachers were interviewed in order to find out what students actually learn in computer science courses. A quantitative analysis of the structured interviews showed that in addition to subject-specific skills and knowledge, students learn many other skills that should be mentioned in the learning outcomes of the course. The students' background, such as their prior knowledge, learning style and culture, affects how they learn in a course. A survey was conducted to map the learning styles of computer science students and to see whether their cultural background affected their learning style. A statistical analysis of the data indicated that computer science students are different learners than engineering students in general and that there is a connection between a student's culture and learning style. In this thesis, a simple self-assessment scale based on Bloom's revised taxonomy has been developed. A statistical analysis of the test results indicates that in general the scale is quite reliable, but single students still slightly overestimate or underestimate their knowledge levels. For students, being able to follow their own progress is motivating, and for the teacher, self-assessment results give information about how the class is proceeding and what the level of the students' knowledge is.
Abstract:
A direct-driven permanent magnet synchronous generator is one of the most promising topologies for megawatt-range wind power applications. The rotational speed of a direct-driven generator is very low compared with traditional electrical machines, and this low rotational speed requires a high torque to produce megawatt-range power. The special features of direct-driven generators caused by the low speed and high torque are discussed in this doctoral thesis. Low speed and high torque set high demands on torque quality: the cogging torque and the load torque ripple must be as low as possible to prevent mechanical failures. In this doctoral thesis, various methods to improve the torque quality are compared with each other. Rotor surface shaping, magnet skew, magnet shaping, and the asymmetrical placement of magnets and stator slots are studied not only in terms of torque quality; their effects on the electromagnetic performance and manufacturability of the machine are also discussed. The heat transfer of the direct-driven generator must be designed to handle the copper losses of the stator winding, which carries a high current density, and to keep the temperature of the magnets low enough. The cooling system of the direct-driven generator, applying doubly radial air cooling with numerous radial cooling ducts, was modeled with a lumped-parameter-based thermal network. The performance of the cooling system was discussed in both steady and transient states, and the effect of the number and width of the radial cooling ducts was explored. A large number of radial cooling ducts drastically increases the impact of the stack end-area effects, because the stator stack then consists of numerous substacks. The effects of the radial cooling ducts on the effective axial length of the machine were studied by analyzing the cross-section of the machine in the axial direction, and a method to compensate for the magnet end-area leakage was considered. The effects of the cooling ducts and the stack end-area effects on the no-load voltages and inductances of the machine were explored by using numerical analysis tools based on the three-dimensional finite element method. The electrical efficiency of the permanent magnet machine with different control methods was estimated analytically over the whole speed and torque range, and the electrical efficiencies achieved with the most common control methods were compared with each other. The stator voltage increase caused by the armature reaction was analyzed, and the effect of inductance saturation as a function of load current was implemented in the analytical efficiency calculation.
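The thermal modeling idea can be sketched compactly: a lumped-parameter network with losses injected at the nodes and conductances between nodes and to the cooling air, solved as a linear system for steady-state temperatures. The three-node topology and all values below are illustrative placeholders; the actual model in the thesis covers numerous radial cooling ducts and far more nodes.

```python
# Lumped-parameter thermal network sketch: node heat balances form a
# linear system G @ T = P + G_cool * T_cool. All values are placeholders.
import numpy as np

# nodes: 0 = stator winding, 1 = stator iron, 2 = magnets
g01, g12 = 8.0, 3.0             # inter-node thermal conductances [W/K]
g0c, g1c, g2c = 2.0, 6.0, 1.5   # conductances to the cooling air [W/K]
P = np.array([1500.0, 400.0, 80.0])  # losses injected at each node [W]
T_cool = 40.0                   # cooling-air temperature [degC]

# conductance matrix from the node heat balances
G = np.array([
    [g01 + g0c, -g01,             0.0],
    [-g01,      g01 + g12 + g1c, -g12],
    [0.0,      -g12,              g12 + g2c],
])
T = np.linalg.solve(G, P + np.array([g0c, g1c, g2c]) * T_cool)
print("steady-state node temperatures [degC]:", np.round(T, 1))
```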