37 results for computer-based instrumentation


Relevance:

30.00%

Publisher:

Abstract:

This master's thesis was done for Convergens Oy. Convergens is an electronics design house specialising in embedded systems and telecommunications technology. The goal of the thesis was to design a computer board for telecommunications applications for a customer, who provided the requirement specifications. The work is limited to the design of a prototype of the device, mainly the computer of a WLAN access point. The access point can be installed in offices, warehouses, and shops, as well as in a moving vehicle. These requirements were taken into account in the design; for example, the device's battery can be charged from a car battery. Wireless technologies are spreading rapidly, and with its numerous features the access point designed in this work offers a serious alternative: it includes, among other things, GPS, Bluetooth, and Ethernet support. In addition to wireless technologies, embedded systems are also spreading rapidly, and nowadays microprocessors can be found almost anywhere. The processor used in this project is competitive in terms of speed and provides several different interfaces. WiMAX support is also planned for the computer board, which will increase the future value of the access point to the customer. The Freescale MPC8321E processor chosen for the project is based on the PowerPC architecture and had only just come onto the market; this brought an additional challenge, as not all information about the processor was yet available. The mechanics brought their own challenges, constraining the size of the printed circuit board so that hardly any spare board area remained. For this reason the DDR memories, for example, were challenging to route, since the memory traces must be of nearly equal length. Linux is used as the operating system. The design work began in spring 2007, and a working prototype was ready by early autumn. Testing showed that the computer board is able to meet all of the customer's requirements. The faults and optimisations found during prototype testing are to be addressed in the production model, so the prototype provides a good basis for further design work.
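The length-matching requirement for the DDR traces mentioned above is easy to state programmatically. The following is a minimal illustrative sketch (not from the thesis; the trace names and tolerance are invented) of the kind of check a designer might script against lengths exported from a PCB tool:

```python
# Illustrative sketch: check that a group of DDR memory traces is
# length-matched within a routing tolerance. Names and tolerance are
# made up for the example, not taken from the thesis.

def check_length_matching(trace_lengths_mm, tolerance_mm=2.5):
    """Return traces deviating from the group mean by more than the tolerance."""
    mean_length = sum(trace_lengths_mm.values()) / len(trace_lengths_mm)
    return {
        name: length - mean_length
        for name, length in trace_lengths_mm.items()
        if abs(length - mean_length) > tolerance_mm
    }

# Example: byte-lane data traces of a DDR interface.
traces = {"DQ0": 51.2, "DQ1": 50.8, "DQ2": 54.9, "DQ3": 51.0}
print(check_length_matching(traces))   # -> DQ2 needs a meander
```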

Relevance:

30.00%

Publisher:

Abstract:

This master's thesis studies and presents, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the process of natural evolution. An artificial evolution process evaluates the fitness of each individual, where the individuals are solution candidates. The next population of candidate solutions is formed from the good properties of the current population by applying different mutation and crossover operations. The literature was searched for different kinds of evolutionary algorithm applications related to software engineering; the applications found were classified and presented, and the necessary basics of evolutionary algorithms were also covered. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling in parallel computing, allocating modules to subsystems, N-version programming, test data generation, and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer-Aided Software Engineering tools based on evolutionary algorithms.
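To make the evaluate-select-vary cycle described above concrete, here is a minimal, self-contained sketch of a generational evolutionary algorithm. The toy fitness function (maximising the number of 1-bits) and all parameter values are illustrative choices, not taken from the thesis:

```python
import random

# Minimal generational evolutionary algorithm: evaluate fitness, select
# parents, apply crossover and mutation to form the next population.

def fitness(individual):
    return sum(individual)              # toy objective: count of 1-bits

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(individual, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in individual]

def evolve(pop_size=30, genome_len=20, generations=50):
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: the fitter of two random individuals.
        def select():
            a, b = random.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b
        population = [mutate(crossover(select(), select()))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

print(evolve())
```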

Relevance:

30.00%

Publisher:

Abstract:

This work presents a new way to provide location-dependent information to the users of wireless networks. The information is delivered to each user without knowing anything about the user's identity. HTTP was chosen as the application-level protocol, which enables the system to deliver information to most users, even though they use widely differing terminal devices. The system operates as an extension of an intercepting web proxy. Based on the contents of various databases, the system decides whether or not information is delivered. The system also includes simple software for locating users with the accuracy of a single access point. Although the presented solution aims at providing location-based advertisements, it is easily adaptable to delivering any type of information to users.
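As a rough illustration of the decision logic described above, the following sketch maps an anonymous client to its access point and looks up location-specific content; all names, addresses, and data are hypothetical:

```python
# Illustrative sketch of the proxy extension's decision step: map an
# anonymous client to its access point, look up location-specific
# content, and decide whether to deliver it. All data is hypothetical.

ACCESS_POINT_OF_CLIENT = {"10.0.3.17": "ap-lobby", "10.0.4.2": "ap-cafe"}
CONTENT_BY_ACCESS_POINT = {"ap-cafe": "Coffee discount today!"}

def content_for_request(client_ip):
    """Return location-dependent content for a client, or None.
    The client's identity is never used, only its access point."""
    ap = ACCESS_POINT_OF_CLIENT.get(client_ip)
    if ap is None:
        return None                      # location unknown: deliver nothing
    return CONTENT_BY_ACCESS_POINT.get(ap)

print(content_for_request("10.0.4.2"))   # -> "Coffee discount today!"
print(content_for_request("10.0.3.17"))  # -> None (no content for ap-lobby)
```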

Relevance:

30.00%

Publisher:

Abstract:

Market segmentation first emerged as early as the 1950s and has since been one of the basic concepts of marketing. Most segmentation research has, however, concentrated on consumer markets, while the segmentation of business and industrial markets has received less attention. The goal of this study is to create a segmentation model for industrial markets from the viewpoint of a provider of information technology products and services. The purpose is to determine whether the case company's current customer databases enable effective segmentation, to identify suitable segmentation criteria, and to assess whether and how the databases should be developed to enable more effective segmentation. The intention is to create a single model shared by the different business units; the goals of the different units must therefore be taken into account to avoid conflicts of interest. The research methodology is a case study. Both secondary sources and primary sources, such as the case company's own databases and interviews, were used. The starting point of the study was the research problem: can database-driven segmentation be used for profitable customer relationship management in the SME sector? The aim is to create a segmentation model that exploits the data held in the databases without compromising the conditions for effective and profitable segmentation. The theoretical part examines segmentation in general, with an emphasis on industrial market segmentation, aiming to give a clear picture of the different approaches to the topic and to deepen the view of the most important theories. The analysis of the databases revealed clear deficiencies in the customer data. Basic contact information exists, but data for segmentation purposes is very limited. The flow of data from resellers and wholesalers should be improved in order to obtain end-customer information. Segmentation based on the current data relies mainly on secondary information such as industry and company size, and even this information is not available for all the companies in the databases.

Relevance:

30.00%

Publisher:

Abstract:

This thesis concentrates on developing a practical local-approach methodology, based on micromechanical models, for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach have been studied in detail: the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation. Firstly, considerable effort was devoted to the numerical integration and computer implementation of the non-trivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration and thus greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and of other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true mid-point algorithm (a = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure. Secondly, an assessment has been made of the consistency of current local failure criteria for ductile fracture: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit-load failure criterion. Significant differences in the ductility predicted by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void-matrix geometry, Thomason's failure criterion has been modified, and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model. Hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or the void nucleation parameters essentially control material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound. Thirdly, a local-approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. Using void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be well predicted with the present methodology.
This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local-approach methodology is in the analysis of fracture behaviour and crack development, as well as in the structural integrity assessment of practical problems involving non-homogeneous materials. Finally, a procedure for possible engineering application of the present methodology is suggested and discussed.
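For orientation, the generalized mid-point family referred to above can be written for a generic evolution equation in its standard textbook form; the thesis applies such a family to the Gurson-Tvergaard flow equations, and the parameter a below corresponds to the value a = 0.5 singled out as most accurate:

```latex
\[
  y_{n+1} = y_n + \Delta t \, f\!\left(y_{n+a}\right),
  \qquad
  y_{n+a} = (1 - a)\, y_n + a\, y_{n+1},
  \qquad a \in [0, 1].
\]
% a = 0 recovers the explicit Euler-forward scheme, a = 1 the backward
% Euler scheme, and a = 0.5 the true mid-point rule.
```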

Relevance:

30.00%

Publisher:

Abstract:

International energy and climate strategies also set Finland's commitments to increasing the use of renewable energy sources and reducing greenhouse gas emissions. The target can be achieved by, for example, increasing the use of energy wood. Finland's forest biomass potential is significant compared with its current use; however, increased use will change forest management and wood harvesting methods. The thesis examined the potential for integrated pulp and paper mills to increase bioenergy production. The effects of two bioenergy production technologies on the carbon footprint of an integrated LWC mill were studied at mill level with a cradle-to-customer approach. The LignoBoost process and FT diesel production were chosen as the bioenergy cases. The data for the LignoBoost process were obtained from Metso and for the FT diesel process from Neste Oil; the rest of the information is based on the literature and on the databases of the KCL-ECO life-cycle computer program and Ecoinvent. In both case studies, the carbon footprint was reduced. From the results it can be concluded that a fossil-fuel-free pulp mill is achievable with the LignoBoost process. By using steam from the FT diesel process, the amount of auxiliary fuel can be reduced considerably and the bark boiler can be replaced. Through the choice of auxiliary fuels used in heat production at the paper mill and of the production methods for purchased electricity, the carbon footprints can be reduced even further in both cases.
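For readers unfamiliar with the bookkeeping behind a carbon footprint, the cradle-to-customer figure is essentially a sum of activity amounts multiplied by emission factors. The sketch below illustrates only this arithmetic; the activities and factor values are placeholders, not data from the thesis:

```python
# Minimal sketch of carbon-footprint accounting: fossil CO2-equivalent
# emissions as the sum of activity amounts times emission factors.
# All names and numbers are invented placeholders.

EMISSION_FACTORS = {          # kg CO2e per unit of activity (placeholder)
    "purchased_electricity_kWh": 0.20,
    "auxiliary_fuel_GJ": 75.0,
    "transport_tkm": 0.06,
}

def carbon_footprint(activities):
    """Sum activity amounts (in the factor's unit) times emission factors."""
    return sum(amount * EMISSION_FACTORS[name]
               for name, amount in activities.items())

mill_case = {"purchased_electricity_kWh": 1.5e6,
             "auxiliary_fuel_GJ": 4.0e3,
             "transport_tkm": 2.0e5}
print(f"{carbon_footprint(mill_case):,.0f} kg CO2e")
```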

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, computer software for defining the geometry of a centrifugal compressor impeller is designed and implemented. The project was carried out under the supervision of the Laboratory of Fluid Dynamics at Lappeenranta University of Technology. This thesis is similar to the thesis by Tomi Putus (2009), in which the flow channel of a centrifugal compressor impeller was researched and commonly used design practices were reviewed. Putus wrote computer software that can be used to define the impeller's three-dimensional geometry based on the basic geometrical dimensions given by a preliminary design. The software designed in this thesis is broadly similar, but it uses a different programming language (C++) and a different way of defining the shape of the impeller's meridional projection.
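The abstract does not state how the meridional projection is parameterised; one common choice is to describe the hub and shroud contours with Bezier curves. The following sketch, with invented control points, shows de Casteljau evaluation of such a contour in the (axial, radial) plane:

```python
# Illustrative sketch: a Bezier curve evaluated by de Casteljau's
# algorithm as one possible definition of a meridional contour. The
# curve type and control points are assumptions, not from the thesis.

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    points = list(control_points)
    while len(points) > 1:
        points = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
                  for (x0, y0), (x1, y1) in zip(points, points[1:])]
    return points[0]

# Hub contour in the (axial z, radial r) plane: invented control points.
hub_ctrl = [(0.0, 0.04), (0.03, 0.04), (0.05, 0.08), (0.05, 0.12)]
hub_curve = [de_casteljau(hub_ctrl, i / 20) for i in range(21)]
print(hub_curve[0], hub_curve[10], hub_curve[-1])
```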

Relevance:

30.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving, after each addition, that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
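As a loose illustration of the invariant-first workflow (Socos proves such conditions statically; the runtime assertions below merely mimic the idea in plain Python), here is an insertion sort whose loop invariant is stated before the loop body and checked at every step:

```python
# Sketch of the invariant-first idea transplanted to runtime assertions:
# the loop invariant of an insertion sort is stated and checked on every
# iteration, and the postcondition is checked at the end.

def is_sorted(xs):
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

def insertion_sort(xs):
    xs = list(xs)
    for i in range(1, len(xs)):
        # Invariant: the prefix xs[0:i] is sorted.
        assert is_sorted(xs[:i])
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    assert is_sorted(xs)        # postcondition
    return xs

print(insertion_sort([3, 1, 4, 1, 5, 9, 2, 6]))
```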

Relevance:

30.00%

Publisher:

Abstract:

The central theme of this thesis is the emancipation and further development of learning activity in higher education in the context of the ongoing digital transformation of our societies. It was developed in response to the highly problematic mainstream approach to the digital re-instrumentation of teaching and studying practices in contemporary higher education. The mainstream approach is largely based on centralisation, standardisation, commoditisation, and commercialisation, while re-producing the general patterns of control, responsibility, and dependence that are characteristic of activity systems of schooling. Whereas much of educational research and development focuses on the optimisation and fine-tuning of schooling, the overall inquiry underlying this thesis has been carried out from an explicitly critical position and within a framework of action science. It thus conceptualises learning activity in higher education not only as an object of inquiry but also as an object to engage with and to intervene in from a perspective of intentional change. The knowledge-constituting interest of this type of inquiry can be tentatively described as a combination of heuristic-instrumental (guidelines for contextualised action and intervention), practical-phronetic (deliberation of value-rational aspects of means and ends), and developmental-emancipatory (deliberation of issues of power, self-determination, and growth) aspects. Its goal is the production of orientation knowledge for educational practice. The thesis provides an analysis, argumentation, and normative claim on why the development of learning activity should be turned into an object of individual|collective inquiry and intentional change in higher education, and why the current state of affairs in higher education actually impedes such a development. It argues for a decisive shift of attention to the intentional emancipation and further development of learning activity as an important cultural instrument for human (self-)production within the digital transformation. The thesis also attempts an in-depth exploration of the type of methodological rationale that can be applied to an object of inquiry (developing learning activity) that is at the same time conceptualised as an object of intentional change within the ongoing digital transformation. The result of this retrospective reflection is the formulation of "optimally incomplete" guidelines for educational R&D practice that share the practical-phronetic (value-related) and developmental-emancipatory (power-related) orientations that drove the overall inquiry. In addition, the thesis formulates the instrumental-heuristic knowledge claim that the conceptual instruments adapted and validated in a series of intervention studies provide means to intervene effectively in existing higher education practice to support the necessary development of (increasingly emancipated) networked learning activity. It suggests that digital networked instruments (tools and services) should generally be considered and treated as transient elements within critical systemic intervention research in higher education. It further argues for the predominant use of loosely coupled digital networked instruments that allow for individual|collective ownership, control, (co-)production, and re-use in other contexts and for other purposes.
Since the range of digital instrumentation options is continuously expanding and currently shows no signs of an imminent slow-down or consolidation, individual and collective exploration and experimentation in this realm needs to be systematically incorporated into higher education practice.

Relevance:

30.00%

Publisher:

Abstract:

The maintenance of the electric distribution network is a topical question for distribution system operators because of the increasing significance of failure costs. In this dissertation, the maintenance practices of distribution system operators are analyzed and a theory for scheduling maintenance activities and reinvestments of distribution components is created. The scheduling is based on the deterioration of components and on failure rates that increase with aging. A dynamic programming algorithm is used to solve the maintenance problem posed by the increasing failure rates of the network. Other drivers of network maintenance, such as environmental and regulatory reasons, are outside the scope of this thesis, as are tree trimming of line corridors and major disturbances of the network. For the optimization, four dynamic programming models are presented and tested. The models were programmed in VBA. Two different test networks are used for the testing. Because distribution system operators prefer to operate on larger component groups, optimal timing for component groups is also analyzed. A maintenance software package was created to apply the presented theories in practice, and an overview of the program is presented.
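A minimal sketch of the scheduling idea, assuming a single component whose failure rate grows with age: at each year the operator keeps, maintains, or replaces the component, and dynamic programming minimises the expected discounted cost. All numbers and the aging model are invented placeholders, not the dissertation's models:

```python
from functools import lru_cache

# Toy dynamic program: choose keep / maintain / replace each year for an
# aging component so that expected discounted cost is minimised.

FAILURE_COST, MAINT_COST, REPLACE_COST = 10_000.0, 800.0, 5_000.0
DISCOUNT = 0.95
HORIZON, MAX_AGE = 20, 30

def failure_rate(age):
    # Placeholder aging model: failure probability grows with age.
    return min(0.02 + 0.01 * age, 0.9)

@lru_cache(maxsize=None)
def cost_to_go(year, age):
    """Minimal expected discounted cost from `year` onward, given age."""
    if year == HORIZON:
        return 0.0
    keep = failure_rate(age) * FAILURE_COST \
        + DISCOUNT * cost_to_go(year + 1, min(age + 1, MAX_AGE))
    # Maintenance is assumed to halve the component's effective age.
    maintain = MAINT_COST + failure_rate(age // 2) * FAILURE_COST \
        + DISCOUNT * cost_to_go(year + 1, min(age // 2 + 1, MAX_AGE))
    replace = REPLACE_COST + failure_rate(0) * FAILURE_COST \
        + DISCOUNT * cost_to_go(year + 1, 1)
    return min(keep, maintain, replace)

print(f"Expected cost, 10-year-old component: {cost_to_go(0, 10):,.0f}")
```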

Relevance:

30.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle for using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
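The default execution model for guarded commands described above fits in a few lines. The sketch below runs a toy model (Euclid's gcd as two guarded commands, not an example from the thesis): enabled commands are those whose guards hold, one is chosen nondeterministically, and execution stops when no guard holds:

```python
import random

# Guarded-command execution loop: nondeterministically pick and execute
# an enabled command until no guard holds.

state = {"x": 36, "y": 60}

commands = [
    (lambda s: s["x"] > s["y"], lambda s: s.update(x=s["x"] - s["y"])),
    (lambda s: s["y"] > s["x"], lambda s: s.update(y=s["y"] - s["x"])),
]

while True:
    enabled = [act for guard, act in commands if guard(state)]
    if not enabled:
        break                      # no guard holds: the system terminates
    random.choice(enabled)(state)  # nondeterministic choice

print(state)                       # -> {'x': 12, 'y': 12}, i.e. gcd(36, 60)
```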

Relevance:

30.00%

Publisher:

Abstract:

Modern machine structures are often fabricated by welding. From a fatigue point of view, structural details, and especially welded details, are the most prone to fatigue damage and failure. Design against fatigue requires information on the fatigue resistance of a structure's critical details and on the stress loads that act on each detail. Even though dynamic simulation of flexible bodies is already an established method for analyzing structures, obtaining the stress history of a structural detail during dynamic simulation is a challenging task, especially when the detail has a complex geometry. In particular, analyzing the stress history of every structural detail within a single finite element model can be overwhelming, since the number of nodal degrees of freedom needed in the model may require an impractical amount of computational effort. The purpose of computer simulation is to reduce the number of prototypes and speed up the product development process. Also, to take operator influence into account, real-time models, i.e., simplified and computationally efficient models, are required. This, in turn, requires stress computation to be efficient if it is to be performed during dynamic simulation. The research looks back at the theoretical background of multibody dynamic simulation and the finite element method to find suitable parts for a new approach to efficient stress calculation. This study proposes that the problem of stress calculation during dynamic simulation can be greatly simplified by combining the floating frame of reference formulation with modal superposition and a sub-modeling approach. In practice, the proposed approach can be used to efficiently generate the relevant fatigue-assessment stress history for a structural detail during or after a dynamic simulation. Numerical examples are presented to demonstrate the proposed approach in practice. The results show that the approach is applicable and can be used as proposed.
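The computational core of such stress recovery by modal superposition reduces to one matrix product: the stress history at a detail is the set of precomputed modal stress shapes weighted by the modal coordinates from the simulation. The sketch below illustrates just this step, with invented array sizes and random data standing in for real results:

```python
import numpy as np

# Sketch of stress recovery by modal superposition:
#   sigma(t) = sum_i sigma_i * q_i(t),
# i.e. modal stress shapes (from one FE solve) times the modal
# coordinate histories (from the multibody simulation).

n_modes, n_steps, n_stress_points = 4, 1000, 6

# Modal stresses at the detail, one column per mode (placeholder data).
modal_stress = np.random.default_rng(0).normal(size=(n_stress_points, n_modes))

# Modal coordinate histories q_i(t) (placeholder data).
q = np.random.default_rng(1).normal(size=(n_modes, n_steps))

stress_history = modal_stress @ q          # shape (n_stress_points, n_steps)
print(stress_history.shape)
```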

Relevance:

30.00%

Publisher:

Abstract:

Measurement is a tool for research; it is therefore important that the measuring process is carried out correctly, without distorting the signal or the measured phenomenon. In recent decades, research on thermoelectric phenomena has increasingly focused on transverse effects. The transverse Seebeck effect makes it possible to produce thinner and faster heat flux sensors than before. Studies of the transverse Seebeck effect have so far concentrated on materials, so this master's thesis studies the instrumentation of a heat flux sensor based on the transverse Seebeck effect. The thesis examines an equivalent circuit of transverse Seebeck effect heat flux sensors, their connection to electronics, and the selection and design of a suitable amplifier type. The research is carried out as a case study involving Gradient Heat Flux Sensors and an electric motor. In this work, a general equivalent circuit is presented for a heat flux sensor based on the transverse Seebeck effect. An amplifier was designed for the sensor of the case study, and a solution was produced for measuring the local heat flux of the electric motor so as to improve electromagnetic compatibility.
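As a hedged illustration of the measurement chain discussed above (sensor, amplifier, recorded signal), the sketch below converts a recorded voltage back to a heat flux. The sensitivity, area, and gain values are assumptions for the example, not parameters of the case-study sensor:

```python
# Illustrative measurement-chain arithmetic for a heat flux sensor: the
# sensor voltage is proportional to absorbed heat, the amplifier scales
# it, and the recorded signal is inverted back to a flux. All parameter
# values below are assumed placeholders.

SENSITIVITY_mV_per_W = 10.0     # sensor volt-watt sensitivity (assumed)
SENSOR_AREA_m2 = 1.0e-4         # active area (assumed)
AMP_GAIN = 100.0                # amplifier gain (assumed)

def heat_flux_W_per_m2(recorded_voltage_V):
    """Invert the chain: recorded V -> sensor mV -> absorbed W -> W/m^2."""
    sensor_mV = recorded_voltage_V / AMP_GAIN * 1000.0
    absorbed_W = sensor_mV / SENSITIVITY_mV_per_W
    return absorbed_W / SENSOR_AREA_m2

print(heat_flux_W_per_m2(0.5))   # recorded 0.5 V -> 5000.0 W/m^2
```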

Relevance:

30.00%

Publisher:

Abstract:

Software plays an important role in our society and economy. Software development is an intricate process comprising many different tasks: gathering requirements, designing new solutions that fulfill these requirements, and implementing these designs in a programming language as a working system. As a consequence, the development of high-quality software is a core problem in software engineering. This thesis focuses on the validation of software designs. The analysis of designs is of great importance, since errors originating in designs may appear in the final system, and it is considered economical to rectify problems as early in the software development process as possible. Practitioners often create and visualize designs using modeling languages, one of the most popular being the Unified Modeling Language (UML). The analysis of designs can be done manually, but for large systems mechanisms are needed that analyze the designs automatically. In this thesis, we propose an automatic approach to analyzing UML-based designs using logic reasoners. The approach firstly defines translations of UML-based designs into a language understandable by reasoners, in the form of logic facts, and secondly shows how to use logic reasoners to infer the logical consequences of these facts. We have implemented the proposed translations in the form of a tool that can be used with any standard-compliant UML modeling tool. Moreover, we validate the proposed approach by automatically checking hundreds of UML-based designs, consisting of thousands of model elements, available in an online model repository. The proposed approach is limited in scope, but it is fully automatic and does not require any expertise in logic languages from the user. We exemplify the approach with two applications: the validation of domain-specific languages and the validation of web service interfaces.
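To give a flavour of the two steps, the sketch below shrinks them to a single rule: generalization relations from a hypothetical UML model become logic facts, and a reasoner-style fixpoint computation infers their transitive closure, flagging cyclic inheritance as an inconsistency:

```python
# Sketch of the translate-then-infer idea: UML generalizations as logic
# facts, and a fixpoint computation of their consequences. The class
# names and the single rule are invented for illustration.

# Step 1: facts extracted from a (hypothetical) UML model.
generalizes = {("Vehicle", "Car"), ("Car", "SportsCar"), ("SportsCar", "Vehicle")}

# Step 2: infer the transitive closure of the generalization relation.
closure = set(generalizes)
changed = True
while changed:
    changed = False
    for a, b in list(closure):
        for c, d in list(closure):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True

cycles = {a for a, b in closure if a == b}
print("Inconsistent (cyclic) classes:", cycles or "none")
```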

Relevance:

30.00%

Publisher:

Abstract:

The overall goal of the study was to describe nurses' acceptance of an Internet-based support system in the care of adolescents with depression. The data were collected in four phases during the period 2006–2010 from nurses working in adolescent psychiatric outpatient clinics and from professionals working with adolescents in basic public services. In the first phase, the nurses' anticipated perceptions of the usefulness of the Internet-based support system before its implementation were explored. In the second phase, the nurses' perceived ease of computer and Internet use and their attitudes toward it were explored. In the third phase, the features of the support system and its implementation process were described. In the fourth phase, the nurses' behavioural intention and actual use of the Internet-based support system in psychiatric outpatient care after one year of use were described. The Technology Acceptance Model (TAM) was used to structure the various research phases. Several benefits of using the Internet-based support system in the care of adolescents with depression were identified from the nurses' perspective. The nurses' technology skills were good and their attitudes towards computer use were positive. The support system was developed in various phases to meet the adolescents' needs. Before the implementation of an information technology (IT) based support system, it is important to pay attention to the nurses' IT training, technology support, resources, and safety, as well as to ethical issues related to the support system. After one year of using the system, the nurses perceived the Internet-based support system to be useful in the care of adolescents with depression. The adolescents' independent work with the support system at home and the programme's systematic character were experienced as conducive to the treatment. However, the Internet-based support system was integrated only partly into the nurse-adolescent interaction, even though the nurses' perceptions of it were positive. The use of the IT-based system as part of the adolescents' depression care was seen positively and its benefits were recognized, which serves as a good basis for future IT-based techniques. Successful implementation of IT-based support systems needs a systematic implementation plan and commitment on the part of the organization and its managers. Support and evaluation of the implementation of an IT-based system should pay attention to changes in the nurses' work styles. Health care organizations should be offered more flexible opportunities to utilize IT-based systems in direct patient care in the future.