28 results for conditions of contact
Abstract:
In nature, variation in, for example, herbivory, wind exposure, moisture and pollution often creates variation in physiological stress and plant productivity. This variation is seldom clear-cut, but rather results in clines of decreasing growth and productivity towards the high-stress end. These clines of unidirectionally changing stress are generally known as ‘stress gradients’. Through its effect on plant performance, stress has the capacity to fundamentally alter the ecological relationships between individuals, and through variation in survival and reproduction it also causes evolutionary change, i.e. local adaptations to stress and eventually speciation. In certain conditions local adaptations to environmental stress have been documented in a matter of just a few generations. In plant-plant interactions, intensities of both negative interactions (competition) and positive ones (facilitation) are expected to vary along stress gradients. The stress-gradient hypothesis (SGH) suggests that net facilitation will be strongest in conditions of high biotic and abiotic stress, while a more recent ‘humpback’ model predicts strongest net facilitation at intermediate levels of stress. Plant interactions along stress gradients, however, are affected by a multitude of confounding factors, making studies of facilitation-related theories challenging. Among these factors are plant ontogeny, spatial scale, and local adaptation to stress. The last of these has very rarely been included in facilitation studies, despite the potential co-occurrence of local adaptations and changes in net facilitation along stress gradients. Current theory would predict both competitive effects and facilitative responses to be weakest in populations locally adapted to withstand high abiotic stress. This thesis is based on six experiments, conducted both in greenhouses and in the field in Russia, Norway and Finland, with mountain birch (Betula pubescens subsp. czerepanovii) as the model species. The aims were to study potential local adaptations in multiple stress gradients (both natural and anthropogenic), changes in plant-plant interactions under conditions of varying stress (as predicted by SGH), potential mechanisms behind intraspecific facilitation, and factors confounding plant-plant facilitation, such as spatiotemporal, ontogenetic, and genetic differences. I found rapid evolutionary adaptations (occurring within a time-span of 60 to 70 years) towards heavy-metal resistance around two copper-nickel smelters, a phenomenon that has resulted in a trade-off of decreased performance in pristine conditions. Heavy-metal-adapted individuals had lowered nickel uptake, indicating a possible mechanism behind the detected resistance. Seedlings adapted to heavy-metal toxicity were not co-resistant to other forms of abiotic stress, but showed co-resistance to biotic stress by being consumed to a lesser extent by insect herbivores. Conversely, populations from conditions of high natural stress (wind, drought etc.) showed no local adaptations, despite much longer evolutionary time scales. Due to decreasing emissions, I was unable to test SGH in the pollution gradients. In natural stress gradients, however, plant performance was in accordance with SGH, with the strongest host-seedling facilitation found at the high-stress sites in two different stress gradients.
Factors confounding this pattern included (1) plant size / ontogenetic status, with seedling-seedling interactions being competition-dominated and host-seedling interactions potentially switching towards competition with seedling growth, and (2) spatial distance, with competition dominating at very short planting distances, and facilitation being strongest at a distance of circa ¼ benefactor height. I found no evidence for changes in facilitation with respect to the evolutionary histories of plant populations. Despite the support for SGH, it may be that the ‘humpback’ model is more relevant when the main stressor is resource-related, while what I studied were the effects of ‘non-resource’ stressors (i.e. heavy-metal pollution and wind). The results have potential practical applications: the utilisation of locally adapted seedlings and plant facilitation may increase the success of future restoration efforts in industrial barrens as well as in other wind-exposed sites. The findings also have implications with regard to the effects of global change in subarctic environments: the documented potential of mountain birch for rapid evolutionary change, together with the general lack of evolutionary ‘dead ends’ due to not (over)specialising to current natural conditions, increases the chances of this crucial forest-forming tree persisting even under the anticipated climate change.
Abstract:
The dissertation is based on four articles dealing with the purification of recalcitrant lignin-containing waters. Lignin, a complex substance that is recalcitrant to most treatment technologies, seriously hinders waste management in the pulp and paper industry. Therefore, lignin degradation is studied using wet oxidation (WO) as the process method. Special attention is paid to the improvement in biodegradability and the reduction of lignin content, since these are of particular importance for any subsequent biological treatment. In most cases wet oxidation is used not as a complete mineralization method but as a pre-treatment, in order to eliminate toxic components and to reduce the high level of organics. The combination of wet oxidation with a biological treatment can be a good option due to its effectiveness and its relatively low technology cost. The literature part gives an overview of Advanced Oxidation Processes (AOPs). A hot oxidation process, wet oxidation (WO), is investigated in detail and is the AOP used in the research. The background and main principles of wet oxidation, its industrial applications, the combination of wet oxidation with other water treatment technologies, the principal reactions in WO, and key aspects of modelling and reaction kinetics are presented. An overview is also given of wood composition and lignin characterization (chemical composition, structure and origin), lignin-containing waters, lignin degradation and reuse possibilities, and purification practices for lignin-containing waters. The aim of the research was to investigate the effect of the operating conditions of WO, such as temperature, partial pressure of oxygen, pH and initial concentration of the wastewater, on the efficiency, and to enhance the process and estimate optimal conditions for WO of recalcitrant lignin waters. Two different waters were studied (a lignin water model solution and debarking water from the paper industry) in order to establish conditions as appropriate as possible. Due to the great importance of reusing and minimizing industrial residues, further research was carried out using residual ash from an Estonian power plant as a catalyst in wet oxidation of lignin-containing water. Developing a kinetic model that includes parameters such as TOC in the prediction gives the opportunity to estimate the amount of emerging inorganic substances (the degradation rate of the waste) and not only the decrease of COD and BOD. The target compound to be degraded, lignin, is included in the model through its COD value (CODlignin). Such a kinetic model can be valuable in developing WO treatment processes for lignin-containing waters, or for other wastewaters containing one or more target compounds. In the first article, wet oxidation of "pure" lignin water was investigated as a model case with the aim of degrading lignin and enhancing water biodegradability. The experiments were performed at various temperatures (110-190°C), partial oxygen pressures (0.5-1.5 MPa) and pH values (5, 9 and 12). The experiments showed that increasing the temperature notably improved the process efficiency: 75% lignin reduction was detected at the lowest temperature tested, and lignin removal improved to 100% at 190°C. The effect of temperature on the COD removal rate was smaller, but clearly detectable; 53% of the organics were oxidized at 190°C. The effect of pH was seen mostly in lignin removal: increasing the pH enhanced the lignin removal efficiency from 60% to nearly 100%. A good biodegradability ratio (over 0.5) was generally achieved.
The aim of the second article was to develop a mathematical model for wet oxidation of "pure" lignin water using lumped characteristics of the water (COD, BOD, TOC) and the lignin concentration. The model agreed well with the experimental data (R² = 0.93 at pH 5 and 12), and the concentration changes during wet oxidation adequately followed the experimental results. The model also correctly showed the trend of biodegradability (BOD/COD) changes. In the third article, the purpose of the research was to estimate optimal conditions for wet oxidation (WO) of debarking water from the paper industry. The WO experiments were performed at various temperatures, partial oxygen pressures and pH values. The experiments showed that lignin degradation and organics removal are affected remarkably by temperature and pH. Lignin reductions of 78-97% were detected under the different WO conditions. An initial pH of 12 caused faster removal of the tannin/lignin content, but an initial pH of 5 was more effective for removal of total organics, represented by COD and TOC. Most of the decrease in the concentrations of organic substances occurred in the first 60 minutes. The aim of the fourth article was to compare the behaviour of two reaction kinetic models, based on experiments on wet oxidation of industrial debarking water under different conditions. The simpler model took into account only the changes in COD, BOD and TOC; the advanced model was similar to the model used in the second article. Comparing the results of the models, the second model was found to be more suitable for describing the kinetics of wet oxidation of debarking water. The significance of the reactions involved was compared on the basis of the model: for instance, lignin degraded first to other chemically oxidizable compounds rather than directly to biodegradable products. Catalytic wet oxidation (CWO) of lignin-containing waters is briefly presented at the end of the dissertation. Two completely different catalysts were used: a commercial Pt catalyst and waste power plant ash. CWO showed good performance: using 1 g/L of residual ash gave lignin removal of 86% and COD removal of 39% at 150°C (a lower temperature and pressure than with WO). It was noted that the ash catalyst caused remarkable lignin degradation already during the pre-heating: at 'zero' time, 58% of the lignin was degraded. In general, wet oxidation is not recommended as a complete mineralization method, but as a pre-treatment phase to eliminate toxic or poorly biodegradable components and to reduce the high level of organics. Biological treatment is an appropriate post-treatment method, since easily biodegradable organic matter remains after the WO process. The combination of wet oxidation with subsequent biological treatment can be an effective option for the treatment of lignin-containing waters.
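To illustrate the kind of lumped kinetic scheme described above, the sketch below integrates a generic two-step model in which the target compound's COD (e.g. CODlignin) is first oxidized to chemically oxidizable intermediates, part of which is then converted to biodegradable matter and part mineralized. The scheme, rate constants and first-order kinetics are illustrative assumptions, not the parameters fitted in the articles.

```python
# Minimal sketch of a lumped wet-oxidation kinetic scheme (illustrative only):
#   lignin COD -> chemically oxidizable intermediates -> biodegradable matter + CO2/H2O
# Rate constants, the biodegradable split and first-order kinetics are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def wo_rates(t, y, k1, k2, f_bio):
    cod_lignin, cod_intermediate, bod = y
    r1 = k1 * cod_lignin           # lignin oxidized to intermediate compounds
    r2 = k2 * cod_intermediate     # intermediates oxidized further
    # A fraction f_bio of the oxidized intermediates becomes biodegradable matter
    # (BOD); the rest is mineralized and leaves the COD balance.
    return [-r1, r1 - r2, f_bio * r2]

k1, k2, f_bio = 0.05, 0.02, 0.6    # assumed rate constants (1/min) and split
y0 = [1000.0, 200.0, 50.0]         # assumed initial COD fractions, mg O2/L
sol = solve_ivp(wo_rates, (0.0, 120.0), y0, args=(k1, k2, f_bio), dense_output=True)

t = np.linspace(0.0, 120.0, 7)
cod_lig, cod_int, bod = sol.sol(t)
total_cod = cod_lig + cod_int + bod
for ti, cod, b in zip(t, total_cod, bod):
    print(f"t = {ti:5.1f} min   COD = {cod:6.1f}   BOD/COD = {b / cod:4.2f}")
```

With such a scheme, both the decrease of total COD and the trend of the BOD/COD ratio can be followed over reaction time, which is the behaviour the lumped models in the articles are reported to capture.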
Abstract:
The aim of this master's thesis is to develop an algorithm for calculating the cable network of the CHGRES heat and power station. The algorithm takes into account the important aspects that influence cable network reliability. Moreover, based on the developed algorithm, the optimal solution for modernizing the cable system, from both an economic and a technical point of view, was obtained. The condition of the existing cable lines shows that replacement is necessary; otherwise, faults would occur, and the company would lose not only money but also its prestige. As a solution, XLPE single-core cables are more profitable than the other types of cable considered in this work. Moreover, the dependence of the short-circuit current on the number of 10/110 kV transformers connected in parallel between the main grid and the 10 kV busbar under consideration is presented, together with how it affects the final decision. Furthermore, the company's losses in the power (capacity) market due to a fault situation are presented; these losses are comparable to the investment needed to replace the existing cable system.
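The dependence of the short-circuit current on the number of parallel transformers follows from the reduction of the equivalent source impedance as units are added. The sketch below illustrates the standard IEC-style estimate; all ratings and grid data are assumed example values, not the CHGRES data used in the thesis.

```python
# Illustrative estimate of the three-phase short-circuit current at a 10 kV busbar
# fed through N identical transformers in parallel. All ratings are assumed
# example values, not the station data analysed in the thesis.
import math

U_N = 10.5e3        # nominal busbar voltage, V
C_FACTOR = 1.1      # IEC 60909 voltage factor for maximum short-circuit current
S_K_GRID = 2000e6   # assumed short-circuit power of the feeding grid, VA
S_T = 40e6          # assumed rated power of one transformer, VA
U_K = 0.105         # assumed transformer short-circuit impedance, p.u.

z_grid = C_FACTOR * U_N**2 / S_K_GRID      # grid impedance referred to the 10 kV side
z_trafo = U_K * U_N**2 / S_T               # impedance of a single transformer

for n in (1, 2, 3, 4):
    z_total = z_grid + z_trafo / n         # n transformers in parallel
    i_k = C_FACTOR * U_N / (math.sqrt(3) * z_total)
    print(f"{n} transformer(s): Ik ≈ {i_k / 1e3:5.1f} kA")
```

The estimate shows why adding parallel transformers raises the short-circuit current at the busbar, which in turn constrains the choice of cable and switchgear.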
Abstract:
A set of models was built in Aspen Plus to simulate the direct synthesis of hydrogen peroxide in a micro-reactor system. The process model can be used to carry out material balance calculations under various experimental conditions. Three thermodynamic property methods were compared by calculating gas solubility, and the UNIQUAC-RK method was finally selected for the process model. Two different operation modes with corresponding operating conditions were proposed as the starting point for future experiments. Simulations of these two modes were carried out to obtain information on the material streams. Moreover, some hydrodynamic parameters, such as the gas and liquid superficial velocities and the gas holdup, were also calculated with the improved process model. These parameters showed the proposed experimental conditions to be reasonable to some extent. The influence of operating conditions, including temperature, pressure and circulation ratio, was analyzed for the first operation mode, in which pure oxygen is fed into the dissolving tank and a hydrogen-carbon dioxide mixture is fed directly into the microreactor. The preferred operating conditions for the system are a low temperature (2°C) and a high pressure (30 bar) in the dissolving tank. A high circulation ratio might be beneficial in the sense that more oxygen could be dissolved and fed into the reactor, but the hydrodynamics of the microreactor must also be considered. Furthermore, additional operating conditions for the reactor gas/liquid feeds in both operation modes were proposed to provide guidance for future experiment design, and the corresponding hydrodynamic parameters were also calculated. Finally, safety was considered from a thermodynamic point of view: there is no explosion danger under the given experimental plan, since the released reaction heat will not cause solvent vaporization inside the microchannels. The improvement of the process model still needs further study based on future experimental results.
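As a simple illustration of the hydrodynamic parameters mentioned above, the sketch below computes gas and liquid superficial velocities from volumetric feed rates and the channel cross-section, and estimates the gas holdup under a homogeneous (no-slip) flow assumption. The channel dimensions and flow rates are assumed placeholder values, not the conditions proposed in the thesis.

```python
# Illustrative calculation of superficial velocities and a no-slip gas-holdup
# estimate for a rectangular microchannel. Dimensions and flow rates are
# assumed example values, not the actual experimental plan.
WIDTH = 0.6e-3      # channel width, m
DEPTH = 0.3e-3      # channel depth, m
Q_GAS = 2.0e-8      # gas volumetric flow at channel conditions, m^3/s
Q_LIQ = 1.0e-8      # liquid volumetric flow, m^3/s

area = WIDTH * DEPTH                 # cross-sectional area, m^2
u_gas = Q_GAS / area                 # gas superficial velocity, m/s
u_liq = Q_LIQ / area                 # liquid superficial velocity, m/s

# Homogeneous-flow (no-slip) approximation: holdup equals the volumetric quality.
gas_holdup = u_gas / (u_gas + u_liq)

print(f"u_gas = {u_gas:.3f} m/s")
print(f"u_liq = {u_liq:.3f} m/s")
print(f"gas holdup ≈ {gas_holdup:.2f} (no-slip assumption)")
```

In practice a slip correlation would refine the holdup estimate, but the no-slip value is often used as a first check that the proposed feed rates keep the two-phase flow in a workable regime.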
Abstract:
There are several filtration applications in the pulp and paper industry where the capacity and cost-effectiveness of the processes are of importance. Ultrafiltration is used to clean process water. Ultrafiltration is a membrane process that separates a certain component or compound from a liquid stream: the pressure difference across the membrane sieves molecules smaller than 0.001-0.02 μm through the membrane. When optimizing the capacity of the filtration process, online information about the condition of the membrane is needed. Fouling and compaction of the membrane both affect the capacity of the filtration process. In fouling, a "cake" layer builds up on the surface of the membrane; this layer blocks molecules from passing through the membrane, thereby decreasing the yield of the process. In compaction, the membrane structure is flattened because of the high pressure applied. A higher pressure increases the capacity but may damage the structure of the membrane permanently. Information about the compaction is needed to operate the filters effectively. The objective of this study was to develop an accurate system for online monitoring of the condition of the membrane using ultrasound reflectometry. Measurements of ultrafiltration membrane compaction were made successfully utilizing ultrasound. The results were confirmed by permeate flux decline, measurements of compaction with a micrometer, mechanical compaction using a hydraulic piston, and scanning electron microscopy (SEM). The scientific contribution of this thesis is to introduce a secondary ultrasound transducer to determine the speed of sound in the fluid used. The speed of sound is highly dependent on the temperature and pressure in the filters. When the exact speed of sound is obtained from the reference transducer, the effect of temperature and pressure is eliminated, and this speed is then used to calculate distances with higher accuracy. As the accuracy and resolution of the ultrasound measurement increase, the method can be applied to a wider range of applications, especially processes where fouling layers are thinner because of smaller macromolecules. With the help of the reference transducer, membrane compaction of 13 μm was measured at a pressure of 5 bar. The result was verified with the permeate flux decline, which indicated that compaction had taken place. Measurements of compaction with a micrometer showed compaction of 23–26 μm; the results are in the same range and confirm the compaction. Mechanical compaction measurements were made using a hydraulic piston, and the result was the same 13 μm as obtained with ultrasound time-domain reflectometry (UTDR). A scanning electron microscope (SEM) was used to study the structure of the samples before and after compaction.
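The role of the reference transducer described above can be illustrated with a simple time-of-flight calculation: the known reference path yields the in-situ speed of sound, which then converts echo delays from the measurement transducer into distances. All numbers below are assumed placeholder values, not measurements from the study.

```python
# Illustrative ultrasound time-domain reflectometry (UTDR) distance estimate.
# A reference transducer with a known path length gives the speed of sound at the
# prevailing temperature and pressure; this speed converts the echo delays of the
# measurement transducer into distances. All numbers are assumed examples.

REF_PATH = 10.0e-3          # known one-way reference path length, m
t_ref = 13.245e-6           # measured round-trip time on the reference path, s

speed_of_sound = 2.0 * REF_PATH / t_ref   # in-situ speed of sound, m/s

def echo_distance(round_trip_time_s: float) -> float:
    """One-way distance to the reflecting interface (membrane surface)."""
    return speed_of_sound * round_trip_time_s / 2.0

t_before = 26.600e-6        # echo delay before pressurization, s
t_after = 26.583e-6         # echo delay after compaction, s

compaction = echo_distance(t_before) - echo_distance(t_after)
print(f"speed of sound ≈ {speed_of_sound:.0f} m/s")
print(f"compaction ≈ {compaction * 1e6:.1f} µm")
```

With these assumed delays the estimate comes out at roughly a dozen micrometres, i.e. the same order of magnitude as the compaction reported in the study, which shows why sub-nanosecond timing and an accurate speed of sound are both needed.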
Abstract:
Developed from human activities, mathematical knowledge is bound to the world and cultures that men and women experience. One can say that mathematics is rooted in humans’ everyday life, an environment where people reach agreement regarding certain “laws” and principles in mathematics. Through interaction with worldly phenomena and people, children will always gain experience that they can then in turn use to understand future situations. Consequently, the environment in which a child grows up plays an important role in what that child experiences and what possibilities for learning that child has. Variation theory, a branch of phenomenographical research, defines human learning as changes in understanding and acting towards a specific phenomenon. Variation theory implies a focus on that which it is possible to learn in a specific learning situation, since only a limited number of critical aspects of a phenomenon can be simultaneously discerned and focused on. The aim of this study is to discern how toddlers experience and learn mathematics in a daycare environment. The study focuses on what toddlers experience, how their learning experience is formed, and how toddlers use their understanding to master their environment. Twenty-three children were observed videographically during everyday activities. The videographic methodology aims to describe and interpret human actions in natural settings. The children are aged from 1 year, 1 month to 3 years, 9 months. Descriptions of the toddlers’ actions and communication with other children and adults are analyzed phenomenographically in order to discover how the children come to understand the different aspects of mathematics they encounter. The study’s analysis reveals that toddlers encounter various mathematical concepts, similarities and differences, and the relationship between parts and whole. Children form their understanding of such aspects in interaction with other children and adults in their everyday life. The results also show that for a certain type of learning to occur, some critical conditions must exist. Variation, simultaneity, reasonableness and fixed points are critical conditions of learning that appear to be important for toddlers’ learning. These four critical conditions are integral parts of the learning process. How children understand mathematics influences how they use mathematics as a tool to master their surrounding world. The results of the study’s analysis of how children use their understanding of mathematics shows that children use mathematics to uphold societal rules, to describe their surrounding world, and as a tool for problem solving. Accordingly, mathematics can be considered a very important phenomenon that children should come into contact with in different ways and which needs to be recognized as a necessary part of children’s everyday life. Adults working with young children play an important role in setting perimeters for children’s experiences and possibilities to explore mathematical concepts and phenomena. Therefore, this study is significant as regards understanding how children learn mathematics through everyday activities.
Abstract:
In this report, we summarize the results of our part of the ÄLYKOP project on customer value creation at the intersection of the health care, ICT, forest and energy industries. The research aims to describe how industry transformation and convergence create new possibilities, business opportunities and even new industries. The report consists of findings that were presented earlier in academic publications. It discusses customer value, service provision and the resource basis of the novel concepts through multiple theoretical frameworks. The report is divided into three main sections: theoretical background, a discussion of the health care industry, and evaluations of novel smart home concepts. Transaction cost economics and the resource-based view of the firm provide the theoretical basis for analyzing the described phenomena. The health care industry analysis describes the most important changes in the demand conditions of health care services and explores the features that are likely to open new business opportunities for a solution provider. The third part of the report, on the smart home business, illustrates a few potential concepts that can be considered to provide solutions to the economic problems arising from the aging of the population. The results provide several recommendations for smart home platform developers in the public and private sectors. According to the analysis, public organizations dominate service provision, and private markets are currently in an emergent state. We argue that public-private partnerships are necessary for creating key suppliers. Indeed, paying attention to appropriate regulation, service specifications and technology standards would foster the diffusion of new services. The dynamics of the service provision networks are driven by the need for new capabilities required for adapting business concepts to the new competitive situation. Finally, the smart home framework revealed links between conventionally distant business areas such as health care and energy distribution. The platform integrates functionalities for different purposes which, however, draw on the same resource basis.
Abstract:
The offset printing process is complex and involves the meeting of two essentially complex materials, printing ink and paper, upon which the final product is formed. It can therefore be expected that a multitude of chemical and physical interactions and mechanisms take place at the ink-paper interface. Interactions between ink and paper are of interest to both papermakers and ink producers, as they wish to achieve better quality in the final product. The objective of this work is to clarify the combined influence of paper coating structure, printing ink and fountain solution on ink setting and the problems related to ink setting. A further aim is to identify the mechanisms that influence ink setting problems, and to be able to counteract them by changing the properties of the coating layer or of the ink. The work carried out for this thesis included the use of many techniques, ranging from standard paper and printability tests to advanced optical techniques for the detection of ink filaments during ink levelling. Modern imaging methods were applied for assessing the sizes of remaining ink filaments and the distribution of ink components inside pigment coating layers. A gravimetric filtration method and assessment of print rub using the Ink-Surface-Interaction Tester (ISIT) were utilized to study the influence of ink properties on ink setting. Chemical interactions were observed with the help of modified thin-layer chromatography and contact angle measurements using both conventional and high-speed imaging. The results of the papers in this thesis link the press operational parameters to filament sizes and show the influence of these parameters on the filament size distribution; the relative importance of the press operation parameters was shown to vary. The size distribution of the filaments is important in predicting the ink setting behaviour, which was highlighted by the dynamic gloss and ink setting studies. Prediction of ink setting behaviour was further improved by using separate permeability factors for different ink types in connection with filtration equations. The roles of ink components were studied in connection with ink absorption and the mechanism of print rub. The total solids content and the ratio of linseed oil to mineral oil were found to determine the degree of print rub on coated papers. Wax addition improved print rub resistance, but did not decrease print rub as much as lowering the total solids content of the ink. Linseed oil was shown to absorb into the pigment coating pores by a mechanism of adsorption to the pore walls, which highlights the need for sufficient pore surface area for improved chromatographic separation of ink components. These results should help press operators, suppliers of printing presses, papermakers and suppliers to papermakers to better understand the material and operating conditions of the press as they relate to various print quality issues. Even though paper is in competition with electronic media, high-quality printed products are still in demand, and the results should provide useful information for this segment of the industry.
Abstract:
Human trafficking is not a new phenomenon. It has existed in various forms for ages around the world. Some researchers have even compared it to slavery, calling it the modern form of slavery in the 21st century. This study is particularly interested in the role of work-related human trafficking in Finnish business. In order for something to be called work-related human trafficking, the concepts of forced labour and human trafficking have to overlap. From the economic point of view, human trafficking is governed by the laws of supply and demand. In many countries the global pressure on cutting costs has created two trends: the increased supply of migrant workers and the deregulation of labour markets. These competitive pressures can have an adverse impact on the conditions of employment and, in the worst cases, can lead to forced labour and trafficking. In fact, trafficking has become one of the most profitable illicit industries worldwide, generating tremendous profits due to its low costs. Therefore, it is important to investigate the phenomenon from the business point of view. This is a qualitative study conducted using theme interviews as the research approach. Altogether 13 interviews were conducted, and some secondary data was used, in order to find out what the role of human trafficking is in Finnish business. The sectors investigated are the Finnish construction and service sectors. The theoretical framework used in this study follows the stakeholder approach. The relevant stakeholder groups for this study are: ‘institutions and authorities’, ‘law enforcement’, ‘management’ and ‘employees – potential victims’ of trafficking. With the help of these stakeholder groups, a holistic picture of the phenomenon is formed. It can be concluded that the role of human trafficking is complicated, but it does exist in Finnish business. It appears in low-cost business sectors where the demand for cheap labour is high. Thus, the victims are often foreigners who do not know the culture or the Finnish conditions of employment. Especially smaller Finnish companies are at risk of getting involved in human trafficking or related exploitation cases, since monitoring is much scarcer in these firms than in larger companies. The risk of human trafficking and exploitation is also higher at the bottom of complicated subcontracting chains or when using foreign recruitment agencies. Thus, the study argues that active and intensive collaboration between a company's different stakeholder groups is needed in order to prevent work-related human trafficking from flourishing in Finland.
Abstract:
In the present work, the bifurcational behaviour of the solutions of the Rayleigh equation and of the corresponding spatially distributed system is analysed. The conditions for oscillatory and monotonic loss of stability are obtained. In the case of oscillatory loss of stability, the linear spectral problem is analysed. For the nonlinear problem, recurrence formulas for the general term of the asymptotic approximation of the self-oscillations are found, and the stability of the periodic mode is analysed. The Lyapunov-Schmidt method is used for the asymptotic approximation. The correspondence between the periodic solutions of the ODE and of the PDE is investigated, and the influence of diffusion on the frequency of the self-oscillations is analysed. Several numerical experiments are performed in order to support the theoretical findings.
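For reference, a standard textbook form of the Rayleigh equation and one natural diffusive (spatially distributed) extension are shown below; the notation and the exact form of the distributed system are assumptions here and may differ from the parametrization used in the thesis.

```latex
% Standard Rayleigh equation (ODE) and an assumed diffusively coupled analogue.
\begin{align}
  \ddot{u} - \mu\left(1 - \dot{u}^{2}\right)\dot{u} + u &= 0, \\
  u_{tt} - \mu\left(1 - u_{t}^{2}\right)u_{t} + u &= \varepsilon\, u_{xx},
\end{align}
```

Here $\mu$ is the bifurcation parameter controlling the loss of stability of the trivial equilibrium, and $\varepsilon$ scales the diffusion term whose influence on the self-oscillation frequency is studied.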
Abstract:
In this thesis, the bifurcational behavior of the solutions of the Langford system is analysed. The equilibria of the Langford system are found, and their stability is discussed. The conditions for loss of stability are found, and the periodic solution of the system is approximated. We consider three types of boundary condition for the spatially distributed Langford system: Neumann conditions, Dirichlet conditions, and Neumann conditions with the additional requirement of zero average. We apply the Lyapunov-Schmidt method to the spatially distributed Langford system for the asymptotic approximation of the periodic mode, and we analyse the influence of diffusion on the behavior of the self-oscillations. We also perform numerical experiments and compare them with the analytical results.
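For context, one commonly cited form of the Langford system is reproduced below; the exact coefficients and parametrization are an assumption here and may differ from the version used in the thesis.

```latex
% One commonly cited form of the Langford system (parametrization may differ
% from the one used in the thesis).
\begin{align}
  \dot{x} &= (z - \beta)\,x - \delta\,y, \\
  \dot{y} &= \delta\,x + (z - \beta)\,y, \\
  \dot{z} &= \lambda + \alpha z - \tfrac{z^{3}}{3}
             - \left(x^{2} + y^{2}\right)\left(1 + \rho z\right)
             + \varepsilon z x^{3}.
\end{align}
```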
Abstract:
The oxidation potential of pulsed corona discharge towards aqueous impurities is limited with respect to certain refractory compounds. It may be enhanced by combining the discharge with catalysis/photocatalysis, as developed for homogeneous gas-phase reactions. The objective of the work was to test the hypothesis that the oxidation potential is enhanced by combining the discharge with TiO2 photocatalysis applied to aqueous solutions of refractory oxalate. Meglumine acridone acetate was included to meet practical needs. Experimental research was undertaken into the oxidation of aqueous solutions at various target pollutant concentrations, pH values and pulse repetition rates, using both plain electrodes and electrodes with TiO2 attached to their surface. The results showed no positive influence of the photocatalyst: the pollutants were oxidized at rates that were identical within the accuracy of the measurements. Possible explanations for the observed inefficiency include low UV irradiance, the screening effect of water, and the generally low oxidation rate of photocatalytic reactions. Further studies might combine the electric discharge with ozone decomposition/radical formation catalysts.
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke the CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfil the service goals. Designing and developing such services for advanced scenarios with REST constraints requires rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language; UML has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; this is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
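As a hedged illustration of this kind of automated consistency check (not the translation tool developed in the thesis), the sketch below loads an OWL 2 ontology with the owlready2 package and uses a reasoner to list unsatisfiable classes; the ontology file name is a placeholder.

```python
# Minimal sketch of an OWL 2 satisfiability check using the owlready2 package.
# Illustrates the kind of reasoning step described above; "booking_service.owl"
# is a placeholder file name, not an artefact of the thesis tool chain.
from owlready2 import get_ontology, sync_reasoner, default_world

onto = get_ontology("file://booking_service.owl").load()

with onto:
    sync_reasoner()  # runs the bundled HermiT reasoner

unsatisfiable = list(default_world.inconsistent_classes())
if unsatisfiable:
    print("Unsatisfiable concepts (interface designs that cannot be implemented):")
    for cls in unsatisfiable:
        print(" -", cls)
else:
    print("All interface concepts are satisfiable.")
```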
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach: test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace back the unfulfilled service goals to detect faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required; we do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
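To make the idea of generated pre- and post-condition guards concrete, the sketch below shows what such a skeleton might look like for a hotel room booking resource, in the spirit of the pedagogical example above. The class, method names, state values and in-memory storage are hypothetical illustrations, not output of the thesis's code generation tool.

```python
# Hypothetical skeleton of a stateful REST resource with pre-/post-condition
# guards, in the spirit of the generated code described above. Names, states
# and the in-memory storage are illustrative assumptions, not tool output.
class PreconditionError(Exception):
    pass

class PostconditionError(Exception):
    pass

class HotelBookingResource:
    """Booking resource with a simple state machine: CREATED -> CONFIRMED -> CANCELLED."""

    def __init__(self):
        self.state = "CREATED"
        self.room = None

    def put_confirmation(self, room_number: int) -> dict:
        # Precondition: a booking can only be confirmed while it is still CREATED.
        if self.state != "CREATED":
            raise PreconditionError("PUT /booking/confirmation requires state CREATED")

        # --- body to be filled in by the service developer ---
        self.room = room_number
        self.state = "CONFIRMED"
        # ------------------------------------------------------

        # Postcondition: the booking must now be CONFIRMED with a room assigned.
        if self.state != "CONFIRMED" or self.room is None:
            raise PostconditionError("confirmation must leave the booking CONFIRMED")
        return {"state": self.state, "room": self.room}

    def delete_booking(self) -> dict:
        # Precondition: only a confirmed booking can be cancelled.
        if self.state != "CONFIRMED":
            raise PreconditionError("DELETE /booking requires state CONFIRMED")

        self.state = "CANCELLED"

        # Postcondition: the booking must now be CANCELLED.
        if self.state != "CANCELLED":
            raise PostconditionError("cancellation must leave the booking CANCELLED")
        return {"state": self.state}

booking = HotelBookingResource()
print(booking.put_confirmation(room_number=101))
print(booking.delete_booking())
```

The precondition guards reject requests that violate the allowed request sequence, while the postcondition checks flag an implementation that does not fulfil the behavioral interface, which is the division of responsibility described in the abstract.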