11 results for Optimal control problems

in Helda - Digital Repository of University of Helsinki


Relevance:

40.00%

Publisher:

Abstract:

Phosphorus is a nutrient needed in crop production. While boosting crop yields, it may also accelerate eutrophication in the surface waters receiving the phosphorus runoff. The privately optimal level of phosphorus use is determined by the input and output prices and the crop response to phosphorus. Socially optimal use also takes into account the impact of phosphorus runoff on water quality. Increased eutrophication decreases the economic value of surface waters by deteriorating fish stocks, curtailing the potential for recreational activities and increasing the probability of mass algae blooms. In this dissertation, the optimal use of phosphorus is modelled as a dynamic optimization problem. The potentially plant-available phosphorus accumulated in the soil is treated as a dynamic state variable, the control variable being the annual phosphorus fertilization. For the crop response to phosphorus, the state variable is more important than the annual fertilization. The level of this state variable is also a key determinant of the runoff of dissolved, reactive phosphorus. The loss of particulate phosphorus due to erosion is also considered in the thesis, as well as its mitigation by constructing vegetative buffers. The dynamic model is applied to crop production on clay soils. At the steady state, the analysis focuses on the effects of prices, damage parameterization, the discount rate and the soil phosphorus carryover capacity on optimal steady-state phosphorus use. The economic instruments needed to sustain the social optimum are also analyzed. According to the results, the economic incentives should be conditioned directly on soil phosphorus values rather than on annual phosphorus applications. The results also emphasize the substantial effect that differences between the discount rates of the farmer and the social planner have on the optimal instruments. The thesis analyzes the optimal soil phosphorus paths from alternative initial levels and examines how the erosion susceptibility of a parcel affects these optimal paths. The results underline the significance of the prevailing soil phosphorus status for optimal fertilization levels. With very high initial soil phosphorus levels, both the privately and socially optimal phosphorus application levels are close to zero as the state variable is driven towards its steady state. The soil phosphorus processes are slow; therefore, depleting high-phosphorus soils may take decades. The thesis also presents a methodologically interesting phenomenon in problems of maximizing a flow of discounted payoffs. When both the benefits and the damages are related to the same state variable, the steady-state solution may have an interesting property under very general conditions: the tail of the payoffs of the privately optimal path, as well as its steady state, may provide higher social welfare than the respective tail of the socially optimal path. The result is formalized and applied to the framework of optimal phosphorus use developed in the thesis.
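To make the structure of such a problem concrete, here is a minimal numerical sketch of a one-state dynamic fertilization problem solved by value iteration. It is not the model of the thesis: the linear carryover rule, the concave yield response, the quadratic runoff-damage term and all parameter values are assumptions chosen purely for illustration.

```python
import numpy as np

# Hypothetical one-state dynamic fertilisation problem (illustration only).
# State: plant-available soil phosphorus P; control: annual application x.
beta = 0.95            # discount factor
carryover = 0.8        # assumed fraction of soil P carried over to the next year
uptake = 0.1           # assumed conversion of applied P into the soil pool
price_crop, price_p = 0.15, 1.0   # assumed output and input prices
damage = 0.6           # assumed weight on quadratic runoff damage

P_grid = np.linspace(0.0, 30.0, 301)    # soil-P states
x_grid = np.linspace(0.0, 60.0, 121)    # annual applications

def payoff(P, x):
    yield_response = 4000.0 * P / (10.0 + P)   # assumed concave crop response to soil P
    runoff_damage = damage * P**2               # assumed convex damage from dissolved-P runoff
    return price_crop * yield_response - price_p * x - runoff_damage

def next_state(P, x):
    return carryover * P + uptake * x           # assumed linear carryover dynamics

V = np.zeros_like(P_grid)
for _ in range(500):                            # value iteration to a fixed point
    Pn = next_state(P_grid[:, None], x_grid[None, :])
    Q = payoff(P_grid[:, None], x_grid[None, :]) + beta * np.interp(Pn, P_grid, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-6:
        V = V_new
        break
    V = V_new

policy = x_grid[Q.argmax(axis=1)]               # optimal application as a function of soil P
idx = lambda p: int(np.argmin(np.abs(P_grid - p)))
print("optimal application at soil P = 5 :", policy[idx(5.0)])
print("optimal application at soil P = 25:", policy[idx(25.0)])
```

With these made-up numbers the computed policy applies heavily below the steady state and applies nothing above it, mirroring the qualitative result above that optimal applications are close to zero when the initial soil phosphorus level is very high.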

Relevance:

30.00%

Publisher:

Abstract:

Pitch discrimination is a fundamental property of the human auditory system. Our understanding of pitch-discrimination mechanisms is important from both theoretical and clinical perspectives. The discrimination of spectrally complex sounds is crucial in the processing of music and speech. Current methods of cognitive neuroscience can track the brain processes underlying sound processing either with precise temporal resolution (EEG and MEG) or with precise spatial resolution (PET and fMRI). A combination of different techniques is therefore required in contemporary auditory research. One of the problems in comparing the EEG/MEG and fMRI methods, however, is the fMRI acoustic noise. In the present thesis, EEG and MEG in combination with behavioral techniques were used, first, to define the ERP correlates of automatic pitch discrimination across a wide frequency range in adults and neonates and, second, to determine the effect of recorded acoustic fMRI noise on those adult ERP and ERF correlates during passive and active pitch discrimination. Pure tones and complex 3-harmonic sounds served as stimuli in the oddball and matching-to-sample paradigms. The results suggest that pitch discrimination in adults, as reflected by MMN latency, is most accurate in the 1000-2000 Hz frequency range, and that pitch discrimination is facilitated further by adding harmonics to the fundamental frequency. Newborn infants are able to discriminate a 20% frequency change in the 250-4000 Hz frequency range, whereas the discrimination of a 5% frequency change remained unconfirmed. Furthermore, the effect of the fMRI gradient noise on the automatic processing of pitch change was more prominent for tones with frequencies exceeding 500 Hz, overlapping with the spectral maximum of the noise. When the fundamental frequency of the tones was lower than the spectral maximum of the noise, fMRI noise had no effect on MMN and P3a, whereas the noise delayed and suppressed N1 and exogenous N2. Noise also suppressed the N1 amplitude in a matching-to-sample working memory task. However, the task-related difference observed in the N1 component, suggesting a functional dissociation between the processing of spatial and non-spatial auditory information, was partially preserved in the noise condition. Noise hampered feature-coding mechanisms more than it hampered the mechanisms of change detection, involuntary attention, and the segregation of the spatial and non-spatial domains of working memory. The data presented in the thesis can be used to develop clinical ERP-based frequency-discrimination protocols and combined EEG and fMRI experimental paradigms.

Relevance:

30.00%

Publisher:

Abstract:

The analysis of sequential data is required in many diverse areas such as telecommunications, stock market analysis, and bioinformatics. A basic problem related to the analysis of sequential data is the sequence segmentation problem. A sequence segmentation is a partition of the sequence into a number of non-overlapping segments that cover all data points, such that each segment is as homogeneous as possible. This problem can be solved optimally using a standard dynamic programming algorithm. In the first part of the thesis, we present a new approximation algorithm for the sequence segmentation problem. This algorithm has a smaller running time than the optimal dynamic programming algorithm, while having a bounded approximation ratio. The basic idea is to divide the input sequence into subsequences, solve the problem optimally in each subsequence, and then appropriately combine the solutions to the subproblems into one final solution. In the second part of the thesis, we study alternative segmentation models that are devised to better fit the data. More specifically, we focus on clustered segmentations and segmentations with rearrangements. While in the standard segmentation of a multidimensional sequence all dimensions share the same segment boundaries, in a clustered segmentation the multidimensional sequence is segmented in such a way that the dimensions are allowed to form clusters, and each cluster of dimensions is then segmented separately. We formally define the problem of clustered segmentation and show experimentally that segmenting sequences using this model leads to solutions with smaller error for the same model cost. Segmentation with rearrangements is a novel variation of the segmentation problem: in addition to partitioning the sequence, we also seek to apply a limited amount of reordering so that the overall representation error is minimized. We formulate the problem of segmentation with rearrangements and show that it is NP-hard to solve or even to approximate. We devise effective algorithms for the proposed problem, combining ideas from dynamic programming and from outlier-detection algorithms for sequences. In the final part of the thesis, we discuss the problem of aggregating the results of several segmentation algorithms on the same set of data points. In this case, we are interested in producing a partitioning of the data that agrees as much as possible with the input partitions. We show that this problem can be solved optimally in polynomial time using dynamic programming. Furthermore, we show that not all data points are candidates for segment boundaries in the optimal solution.
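For reference, the standard dynamic programming algorithm mentioned above can be sketched as follows for a one-dimensional sequence with squared-error segment cost; this is the textbook O(n²k) formulation (with O(1) segment costs from prefix sums), not code from the thesis.

```python
import numpy as np

def optimal_segmentation(x, k):
    """Split sequence x into k contiguous segments minimising total squared error,
    with each segment represented by its mean. Standard O(n^2 * k) dynamic program."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    pref = np.concatenate(([0.0], np.cumsum(x)))
    pref2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def seg_cost(i, j):                 # squared error of segment x[i:j]
        s, s2, m = pref[j] - pref[i], pref2[j] - pref2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    dp = np.full((k + 1, n + 1), INF)   # dp[p][j]: best cost of covering x[:j] with p segments
    back = np.zeros((k + 1, n + 1), dtype=int)
    dp[0][0] = 0.0
    for p in range(1, k + 1):
        for j in range(p, n + 1):
            for i in range(p - 1, j):
                c = dp[p - 1][i] + seg_cost(i, j)
                if c < dp[p][j]:
                    dp[p][j], back[p][j] = c, i
    # Recover the segment boundaries by backtracking.
    bounds, j = [], n
    for p in range(k, 0, -1):
        i = back[p][j]
        bounds.append((i, j))
        j = i
    return dp[k][n], bounds[::-1]

cost, segments = optimal_segmentation([1, 1, 1, 9, 9, 9, 5, 5], k=3)
print(cost, segments)   # error 0.0 with segments [(0, 3), (3, 6), (6, 8)]
```

The approximation algorithm of the thesis improves on this by running such dynamic programs only on subsequences of the input and then combining the partial solutions.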

Relevance:

30.00%

Publisher:

Abstract:

The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even to approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
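As an illustration of the definition (not of the efficient techniques developed in the dissertation), the sketch below evaluates the NML normalizing sum and the resulting stochastic complexity for the simplest discrete case, the Bernoulli model, where the sum collapses to n+1 terms; for larger alphabets and richer model families the sum grows quickly with the data size, which is exactly the computational problem the thesis addresses.

```python
import math

def bernoulli_parametric_complexity(n):
    """Normalizing sum C(n) = sum_k  C(n, k) * (k/n)^k * ((n-k)/n)^(n-k)
    of the Bernoulli NML distribution (convention 0^0 = 1).
    Direct summation like this is only feasible for small n."""
    total = 0.0
    for k in range(n + 1):
        p = k / n
        term = math.comb(n, k)
        if 0 < k < n:
            term *= p**k * (1 - p)**(n - k)
        total += term
    return total

def stochastic_complexity(ones, n):
    """Code length -log2 P_NML(x) for a binary sequence of length n containing `ones` ones."""
    p = ones / n
    log_ml = 0.0
    if 0 < ones < n:
        log_ml = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
    return -log_ml + math.log2(bernoulli_parametric_complexity(n))

# Code length in bits for a length-100 binary sequence with 30 ones;
# it is below 100 bits because the data are compressible relative to the model class.
print(stochastic_complexity(ones=30, n=100))
```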

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network, simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are interested not only in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating-dominating codes are more appropriate. This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating-dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs, which are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating-dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
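As one concrete example of the graph problems involved, the classical minimum dominating set admits a simple greedy logarithmic-factor approximation in the centralised setting: repeatedly pick the vertex whose closed neighbourhood covers the most still-undominated vertices. The sketch below shows this textbook greedy only to make the problem statement concrete; it is not one of the local algorithms or approximation schemes contributed by the thesis.

```python
def greedy_dominating_set(adj):
    """Greedy logarithmic-factor approximation for minimum dominating set.
    adj: dict mapping each vertex to the set of its neighbours."""
    undominated = set(adj)                 # vertices not yet covered
    dominating = set()
    while undominated:
        # Choose the vertex whose closed neighbourhood covers most undominated vertices.
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        dominating.add(best)
        undominated -= {best} | adj[best]
    return dominating

# Example: a path on five vertices 0-1-2-3-4.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(greedy_dominating_set(adj))          # {1, 3}: vertex 1 covers 0-2, vertex 3 covers 2-4
```

Identifying codes and locating-dominating codes strengthen this requirement: the set of chosen vertices seen in each vertex's neighbourhood must also be distinct, which is what makes it possible not only to detect but also to locate an intruder.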

Relevance:

30.00%

Publisher:

Abstract:

Background: Both maternal and fetal complications are increased in diabetic pregnancies. Although hypertensive complications are increased in pregnant women with pregestational diabetes, reports on hypertensive complications in women with gestational diabetes mellitus (GDM) have been contradictory. Congenital malformations and macrosomia are the main fetal complications in Type 1 diabetic pregnancies, whereas fetal macrosomia and birth trauma, but not congenital malformations, are increased in GDM pregnancies. Aims: To study the frequency of hypertensive disorders in gestational diabetes mellitus; to evaluate the risk of macrosomia and brachial plexus injury (Erb’s palsy) and the ability of the 2-hour oral glucose tolerance test (OGTT) combined with the 24-hour glucose profile to distinguish between low and high risk of fetal macrosomia among women with GDM; to evaluate the relationship between glycemic control and the risk of fetal malformations in pregnancies complicated by Type 1 diabetes mellitus; and to assess the effect of glycemic control on the occurrence of preeclampsia and pregnancy-induced hypertension (PIH) in Type 1 diabetic pregnancies. Subjects: A total of 986 women with GDM and 203 women with borderline glucose intolerance (one abnormal value in the OGTT) with a singleton pregnancy, 488 pregnant women with Type 1 diabetes (691 pregnancies and 709 offspring), and 1154 pregnant non-diabetic women (1181 pregnancies and 1187 offspring) were investigated. Results: In a prospective study of 81 GDM patients, the combined frequency of preeclampsia and PIH was higher than in 327 non-diabetic controls (19.8% vs 6.1%, p<0.001). On the other hand, in the 203 women with only one abnormal value in the OGTT, the rate of hypertensive complications did not differ from that of the controls. Both the GDM women and those with only one abnormal value in the OGTT had higher pre-pregnancy weights and BMIs than the controls. In a retrospective study involving 385 insulin-treated and 520 diet-treated GDM patients and 805 non-diabetic control pregnant women, fetal macrosomia occurred more often in the insulin-treated GDM pregnancies (18.2%, p<0.001) than in the diet-treated GDM pregnancies (4.4%) or the control pregnancies (2.2%). The rate of Erb’s palsy in vaginally delivered infants was 2.7% in the insulin-treated group and 2.4% in the diet-treated group, compared with 0.3% in the controls (p<0.001). The cesarean section rate was more than twice as high in the insulin-treated GDM patients as in the controls (42.3% vs 18.6%). A major fetal malformation was observed in 30 (4.2%) of the 709 newborn infants in Type 1 diabetic pregnancies and in 10 (1.4%) of the 735 controls (RR 3.1, 95% CI 1.6–6.2). Even women whose HbA1c levels (normal values below 5.6%) were only slightly increased in early pregnancy (between 5.6 and 6.8%) had a relative risk of fetal malformation of 3.0 (95% CI 1.2–7.5). Only diabetic patients with a normal HbA1c level (<5.6%) in early pregnancy had the same low risk of fetal malformations as the controls. Preeclampsia was diagnosed in 12.8% and PIH in 11.4% of the 616 Type 1 diabetic women without diabetic nephropathy. The corresponding frequencies among the 854 control women were 2.7% (OR 5.2, 95% CI 3.3–8.4) for preeclampsia and 5.6% (OR 2.2, 95% CI 1.5–3.1) for PIH. Multiple logistic regression analysis indicated that glycemic control, nulliparity, diabetic retinopathy and duration of diabetes were statistically significant independent predictors of preeclampsia. The adjusted odds ratios for preeclampsia were 1.6 (95% CI 1.3–2.0) for each 1%-unit increment in the HbA1c value during the first trimester and 0.6 (95% CI 0.5–0.8) for each 1%-unit decrement during the first half of pregnancy. In contrast, changes in glycemic control during the second half of pregnancy did not alter the risk of preeclampsia. Conclusions: In Type 1 diabetic pregnancies it is extremely important to achieve optimal glycemic control before pregnancy and to maintain it throughout pregnancy in order to decrease the complication rates both in the mother and in her offspring. The rate of fetal macrosomia and birth trauma in GDM pregnancies, especially in the group of insulin-treated women, is still relatively high. New strategies for the screening, diagnosis, and treatment of GDM must be developed in order to decrease fetal and neonatal complications.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The four scientific articles comprising this doctoral dissertation offer new information on the presentation and construction of addiction in the mass media during the period 1968-2008. Diachronic surveys as well as quantitative and qualitative content analyses were undertaken to discern trends during the period in question and to investigate underlying conceptions of the problems in contemporary media presentations. The research material for the first three articles consists of a sample of 200 texts from Finland's biggest daily newspaper, Helsingin Sanomat, from the period 1968-2006. The fourth study examines English-language tabloid material published on the Internet in 2005-2008. A number of principal trends are identified. In addition to a significant increase in addiction reporting over time, the study shows that an internalisation of addiction problems took place in the media presentations under study. The phenomenon is portrayed and tackled from within the problems themselves, often from the viewpoint of the individuals concerned. The tone becomes more personal, and technical and detailed accounts are more and more frequent. Secondly, the concept of addiction is broadened; this can be dated to the 1990s. The concept undergoes a conventionalisation: it is used more frequently in a manner that is not thought to require explanation. The word 'riippuvuus' (the closest equivalent to 'addiction' in Finnish) was adopted more commonly in the reporting at the same time, in the 1990s. Thirdly, the results highlight individual self-governance as a superordinate principle in contemporary descriptions of addiction. If the principal demarcation in earlier texts was between 'us' and 'them', it is now focused primarily on the individual's competence and ability to govern the self, to restrain and master one's behaviour. Finally, in the fourth study, investigating textual constructions of female celebrities (Amy Winehouse, Britney Spears and Kate Moss) in Internet tabloids, various relations and functions of addiction problems, intoxication, body and gender were observed to function as cultural symbols. Addiction becomes a sign, or a style, that represents different significations in relation to the main characters in the tabloid stories. Tabloids, as a genre, play an important role by introducing other images of the problems than those featured in mainstream media. The study is positioned within the framework of modernity theory and its views on the need for self-reflexivity and biographies as tools for the creation and definition of the self. Traditional institutions such as the church, occupation, family, etc., no longer play an important role in self-definition. This circumstance creates a need for a culture conveying stories of success and failure in relation to which the individual can position their own behaviour and life content. I propose that 'addiction', as a theme in media reporting, resolves the conflict that emanates from the ambivalence between the accessibility and the individualisation of consumer society, on the one hand, and the problematic behavioural patterns (addictions) that they may induce, on the other.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The overlapping sound pressure waves that enter our brain via the ears and auditory nerves must be organized into a coherent percept. Modelling the regularities of the auditory environment and detecting unexpected changes in these regularities, even in the absence of attention, is a necessary prerequisite for orientating towards significant information as well as for speech perception and communication, for instance. The processing of auditory information, in particular the detection of changes in the regularities of the auditory input, gives rise to neural activity in the brain that is seen as a mismatch negativity (MMN) response of the event-related potential (ERP) recorded by electroencephalography (EEG). As the recording of MMN requires neither a subject's behavioural response nor attention towards the sounds, it can be done even with subjects who have problems in communicating or difficulties in performing a discrimination task, for example aphasic and comatose patients, newborns, and even fetuses. Thus with MMN one can follow the evolution of central auditory processing from the very early, often critical stages of development, and also in subjects who cannot be examined with the more traditional behavioural measures of auditory discrimination. Indeed, recent studies show that central auditory processing, as indicated by MMN, is affected in different clinical populations, such as schizophrenic patients, as well as during normal aging and abnormal childhood development. Moreover, the processing of auditory information can be selectively impaired for certain auditory attributes (e.g., sound duration, frequency) and can also depend on the context of the sound changes (e.g., speech or non-speech). Although its advantages over behavioural measures are undeniable, a major obstacle to the larger-scale routine use of the MMN method, especially in clinical settings, is the relatively long duration of its measurement. Typically, approximately 15 minutes of recording time is needed to measure the MMN for a single auditory attribute. Recording a complete central auditory processing profile consisting of several auditory attributes would thus require from one hour to several hours. In this research, I have contributed to the development of new fast multi-attribute MMN recording paradigms in which several types and magnitudes of sound changes are presented in both speech and non-speech contexts, in order to obtain a comprehensive profile of auditory sensory memory and discrimination accuracy in a short measurement time (altogether approximately 15 minutes for 5 auditory attributes). The speed of the paradigms makes them highly attractive for clinical research, their reliability brings fidelity to longitudinal studies, and the language context is especially suitable for studies on language impairments such as dyslexia and aphasia. In addition, I have presented an even more ecological paradigm and, more importantly in view of the theory of MMN, an interesting result in which the MMN responses are recorded entirely without a repetitive standard tone. All in all, these paradigms contribute to the development of the theory of auditory perception and increase the feasibility of MMN recordings in both basic and clinical research. Moreover, they have already proven useful in studying, for instance, dyslexia, Asperger syndrome and schizophrenia.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Maternal drug abuse during pregnancy endangers the future health and wellbeing of the infant and the growing child. With maternal abstinence these problems would never occur, so they are in principle entirely preventable. Buprenorphine is widely used in opioid maintenance treatment as a substitute medication. In Finland, buprenorphine misuse increased steadily during the 2000s; in 2009 almost one third of the clientele of substance treatment units were in treatment because of buprenorphine dependence. At the Helsinki Women's Clinic the first child with prenatal buprenorphine exposure was born in 2001. During 1992-2001, 524 women were followed at special antenatal clinics of the three maternity hospitals in the Helsinki capital area (Women's Clinic, Maternity Hospital, Jorvi Hospital) because of substance abuse problems. Three control women were drawn from the birth register for each case woman and matched for parity and for the place and date of the index birth. According to register data, the mortality rate was 38-fold higher among cases than controls within 6-15 years after the index birth. In particular, the risk of violent or accidental death was increased. The women with substance misuse problems also had an elevated risk of viral hepatitis and psychiatric morbidity, and they were more often reimbursed for psychopharmaceuticals. Disability pensions and rehabilitation allowances were granted more often to cases than to controls. In total, 626 children were born from these pregnancies. According to register data, 38% of these children were placed in out-of-home care as part of child protection services by the age of two years, and half of them by the age of 12 years; the median follow-up time was 5.8 years. The risk of out-of-home care was associated with factors identifiable during the pre- and perinatal period. In 2002-2005, 67 pregnant women with buprenorphine dependence were followed up at the Helsinki University Hospital, Department of Obstetrics and Gynecology. Their pregnancies were uneventful. The prematurity rate was similar to that in the national statistics, and there were no more major anomalies. The neonates were lighter than in the national statistics. They were also born in good condition, with no perinatal hypoxia as defined by standard clinical parameters or by certain biochemical markers in the cord blood: erythropoietin, S100 and cardiac troponin T. Almost 80% of the newborns developed neonatal abstinence syndrome (NAS), and two thirds of them needed morphine medication for it. Maternal smoking of more than ten cigarettes per day aggravated NAS, whereas benzodiazepine use attenuated it. An infant's highest urinary norbuprenorphine concentration during the first 3 days of life correlated with the duration of morphine treatment. The average length of the infants' hospital stay was 25 days.

Relevance:

30.00%

Publisher:

Abstract:

Soft tissue sarcomas are malignant tumours of mesenchymal origin. Because of their infiltrative growth pattern, simple enucleation of the tumour leads to a high rate of local recurrence; instead, these tumours should be resected with a rim of normal tissue around the tumour. Data on the adequate margin width are scarce. At Helsinki University Central Hospital (HUCH) a multidisciplinary treatment group was started in 1987. Surgical resection with a wide margin (2.5 cm) is the primary aim; in case of a narrower margin, radiation therapy is necessary. The role of adjuvant chemotherapy remains unclear. Our aims were to study local control by surgical margin and to develop a new prognostic tool to aid decision-making on which patients should receive adjuvant chemotherapy. Patients with soft tissue sarcoma of the extremity or the trunk wall referred to HUCH during 1987-2002 form the material in Studies I and II; the external validation material comes from the Lund University sarcoma registry. A smallest surgical margin of at least 2.5 centimetres yielded local control of 89 per cent at five years. The amputation rate was 9 per cent. The proposed prognostic model, with necrosis, vascular invasion, size on a continuous scale, depth, location and grade as predictors, worked well both in the Helsinki material and in the validation material, and it also showed good calibration. Based on the present study, we recommend a smallest surgical margin of 2-3 centimetres in soft tissue sarcoma irrespective of grade. Improvement in local control was present but modest for margins wider than 1 centimetre. In cases where gaining a wider margin would lead to a considerable loss of function, a smaller margin combined with radiation therapy is to be considered. Patients treated with inadequate margins should be offered radiation therapy irrespective of tumour grade. Our new prognostic model for estimating the 10-year survival probability of patients with soft tissue sarcoma of the extremities or trunk wall showed good discrimination and calibration. For the time being the prognostic model is available for scientific use and further validation; in the future, the model may aid clinical decision-making. For operable osteosarcoma, neoadjuvant multidrug chemotherapy followed by delayed surgery and multidrug adjuvant chemotherapy is the treatment of choice. Overall survival rates at five years are approximately 75 per cent in modern trials of classical osteosarcoma. All patients diagnosed with osteosarcoma in Finland and reported to the Finnish Cancer Registry during 1971-2005 form the material in Studies III and IV. The limb-salvage rate increased from 23 per cent to 78 per cent during 1971-2005. The 10-year sarcoma-specific survival for the whole study population improved from 32 per cent to 62 per cent, and it was 75 per cent for patients with a local high-grade osteosarcoma of the extremity diagnosed during 1991-2005. This study outlines the improved prognosis of osteosarcoma patients in Finland with modern chemotherapy. The 10-year survival rates are good also by international standards; nonetheless, the limb-salvage rate remains inferior to those reported for highly selected patient series. Overall, centralisation of osteosarcoma treatment would most likely improve both survival and limb-salvage rates even further.