37 results for High quality (HQ)
Abstract:
Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, due to their geomorphological importance as the reference surface for gravitation-driven material flow, as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on decision-making based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented in a 5-50 m grid and used at application scales of 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, making analytical and simulation-based error propagation analyses, and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and the morphological gross errors, which are detectable with the presented visualisation methods. In addition, the use of a global characterisation of DEM error is a gross generalisation of reality, due to the small extent of the areas in which the assumption of stationarity is not violated. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis.
The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the error model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
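The simulation-based approach described above can be sketched in outline: generate many realisations of a spatially autocorrelated error field (here via process convolution, i.e. smoothing white noise with a Gaussian kernel), add each to the DEM, recompute a derivative such as slope, and summarise the spread. The following is a minimal illustrative sketch with assumed parameter values (10 m grid, 1 m vertical error, a synthetic surface), not the thesis's actual implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def correlated_error(shape, sigma_z, corr_range, rng):
    """One realisation of a DEM error field by process convolution:
    white noise smoothed with a Gaussian kernel, then rescaled so the
    marginal standard deviation equals sigma_z."""
    white = rng.standard_normal(shape)
    field = gaussian_filter(white, sigma=corr_range)
    return field * (sigma_z / field.std())

def slope(dem, cell=10.0):
    """Slope magnitude (rise/run) from central differences."""
    gy, gx = np.gradient(dem, cell)
    return np.hypot(gx, gy)

# Synthetic 'true' surface (purely illustrative).
x, y = np.meshgrid(np.linspace(0, 1, 80), np.linspace(0, 1, 80))
dem_true = 50 * np.sin(3 * x) * np.cos(2 * y)

def propagated_std(corr_range, n=100):
    """Monte Carlo spread of the slope error for a given correlation range."""
    base = slope(dem_true)
    errs = [slope(dem_true + correlated_error(dem_true.shape, 1.0, corr_range, rng)) - base
            for _ in range(n)]
    return float(np.std(errs))

# Compare a practically uncorrelated error model with a correlated one.
print("uncorrelated:", round(propagated_std(1e-6), 4))
print("correlated  :", round(propagated_std(5.0), 4))
```

For a first-derivative product such as slope, high-frequency (uncorrelated) noise propagates more strongly than smooth correlated noise; the thesis's point is that this ordering is application-dependent and does not hold for all derivatives.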
Abstract:
Despite much research on forest biodiversity in Fennoscandia, the exact mechanisms of species declines in dead-wood dependent fungi are still poorly understood. In particular, there is only limited information on why certain fungal species have responded negatively to habitat loss and fragmentation, while others have not. Understanding the mechanisms behind species declines would be essential for the design and development of ecologically effective and scientifically informed conservation measures, and management practices that would promote biodiversity in production forests. In this thesis I study the ecology of polypores and their responses to forest management, with a particular focus on why some species have declined more than others. The data considered in the thesis comprise altogether 98,318 dead-wood objects, with 43,085 observations of 174 fungal species. Out of these, 1,964 observations represent 58 red-listed species. The data were collected from 496 sites, including woodland key habitats, clear-cuts with retention trees, mature managed forests, and natural or natural-like forests in southern Finland and Russian Karelia. I show that the most relevant way of measuring resource availability can differ to a great extent between species seemingly sharing the same resources. It is thus critical to measure the availability of resources in a way that takes into account the ecological requirements of the species. The results show that connectivity at the local, landscape and regional scales is important especially for the highly specialized species, many of which are also red-listed. Habitat loss and fragmentation affect not only species diversity but also the relative abundances of the species and, consequently, species interactions and fungal successional pathways. Changes in species distributions and abundances are likely to affect the food chains in which wood-inhabiting fungi are involved, and thus the functioning of the whole forest ecosystem. 
The findings of my thesis highlight the importance of protecting well-connected, large and high-quality forest areas to maintain forest biodiversity. Small habitat patches distributed across the landscape are likely to contribute only marginally to the protection of red-listed species, especially if habitat quality is not substantially higher than in ordinary managed forests, as is the case with woodland key habitats. Key habitats might supplement the forest protection network if they were delineated more broadly and if the harvesting of individual trees were prohibited within them. Taking the landscape perspective into account in the design and development of conservation measures is critical when striving to halt the decline of forest biodiversity in an ecologically effective manner.
Abstract:
Herbivorous insects comprise a major part of terrestrial biodiversity, and their interactions with their host plants and natural enemies are of vast ecological importance. A large body of research demonstrates that the ecology and evolution of these insects may be affected by trophic interactions, by abiotic influences, and by intraspecific processes, but so far research on these individual aspects has rarely been combined. This thesis uses the leaf-mining moth Tischeria ekebladella and the pedunculate oak (Quercus robur) as a case study to assess how spatial variation in trophic interactions and the physical distribution of host trees jointly affect the distribution, dynamics and evolution of a host-specific herbivore. With respect to habitat quality, Tischeria ekebladella experiences considerable variation at several spatial scales. Most of this variation occurs at small scales, notably among leaves and shoots within individual trees. While this could hypothetically cause moths to evolve the ability to select leaves and shoots of high quality, I did not find any coupling between female preference and offspring performance. Based on my studies of temporal variation in resource quality, I therefore propose that unpredictable temporal changes in the relative rankings of individual resource units may make it difficult for females to predict the fate of their developing offspring. With respect to intraspecific processes, my results suggest that limited moth dispersal in relation to the spatial distribution of oak trees plays a key role in determining the regional distribution of Tischeria ekebladella. The distribution of the moth is aggregated at the landscape level, with local leaf-miner populations less likely to be present where oaks are scarce.
A modelling exercise based on empirical dispersal estimates revealed that the moth population on Wattkast, an island in south-western Finland, is spatially structured overall, but that the relative importance of local and regional processes for tree-specific moth dynamics varies drastically across the landscape. To conclude, my work on the oak-Tischeria ekebladella system demonstrates that the local abundance and regional distribution of a herbivore may be more strongly influenced by the spatial location of host trees than by their relative quality. Hence, it reveals the importance of considering spatial context in the study of herbivorous insects, and forms a bridge between the classical fields of plant-insect interactions and spatial ecology.
Abstract:
Throughout the history of Linnean taxonomy, species have been described with varying degrees of justification. Many descriptions have been based on only a few ambiguous morphological characters. Moreover, species have been considered natural, well-defined units, whereas higher taxa have been treated as disparate, non-existent creations. In the present thesis a few such cases were studied in detail. Often the species-level descriptions were based on only a few specimens, and the variation previously thought to be interspecific was found to be intraspecific. In some cases morphological characters were sufficient to resolve the evolutionary relationships between the taxa, but generally more resolution was gained by the addition of molecular evidence. However, both morphological and molecular data were found to be deceptive in some cases. The DNA sequences of morphologically similar specimens were found to differ distinctly in some cases, whereas in other closely related species the morphology of specimens with identical DNA sequences differed substantially. This study counsels caution when evolutionary relationships are studied using only one source of evidence or a very limited number of characters (e.g. barcoding). Moreover, it emphasizes the importance of high-quality data as well as the utilization of proper methods when making scientific inferences. Properly conducted analyses produce robust results that can be utilized in numerous interesting ways. The present thesis considered two such extensions of systematics. A novel hypothesis on the origin of bioluminescence in Elateriformia beetles is presented, tying it to the development of the clicking mechanism in the ancestors of these animals. An entirely different type of extension of systematics is the proposed high value of the white sand forests in maintaining the diversity of beetles in the Peruvian Amazon. White sand forests are under growing pressure from human activities that lead to deforestation.
They were found to harbor an extremely diverse beetle fauna, and many taxa were specialists living only in this unique habitat. In comparison to the predominant clay soil forests, considerably more elateroid beetles belonging to all studied taxonomic levels (species, genus, tribe, and subfamily) were collected in white sand forests. This evolutionary diversity is hypothesized to be due to a combination of factors: (1) the forest structure, which favors the fungus-plant interactions important for the elateroid beetles, (2) the old age of the forest type, favoring the survival of many evolutionary lineages, and (3) the widespread distribution and fragmentation of the forests in the Miocene, favoring speciation.
Abstract:
High-quality platelet analytics requires specialized knowledge and skills. Such analytics were applied to platelet activation and aggregation responses in a prospective controlled study of patients with the Finnish type of amyloidosis. The 20 patients with AGel amyloidosis displayed a delayed and more profound platelet shape change than healthy siblings and healthy volunteers, which may be related to altered fragmentation of mutated gelsolin during platelet activation. Alterations in platelet shape change have not previously been reported in association with platelet disorders. In the rare Bernard-Soulier syndrome with the Asn45Ser mutation of glycoprotein (GP) IX, the diagnostic defect in the expression of the GPIb-IX-V complex was characterized in seven Finnish patients, an exceptionally large patient series even by international standards. When measuring thrombopoietin in serial samples of amniotic fluid and cord blood from 15 pregnant women with confirmed or suspected fetal alloimmune thrombocytopenia, the lower limit of detection could be extended. The results confirmed that thrombopoietin is already present in amniotic fluid. The application of various non-invasive means for diagnosing thrombocytopenia (TP) revealed that techniques for estimating the proportion of young, i.e. large, platelets, such as the direct measurement of reticulated platelets and the mean platelet size, would be useful for evaluating platelet kinetics in a given patient. Due to the different kinetics of thrombopoietin and of the increase of young platelets in the circulation, these measurements may have the most predictive value when measured from simultaneous samples. Platelet autoantibodies were present not only in isolated autoimmune TP but also in patients without TP, in whom the disappearance of platelets might be compensated by increased production. The autoantibodies may also persist after TP has been cured.
Simultaneous demonstration of increased young platelets (or increased mean platelet volume) in peripheral blood and the presence of platelet-associated IgG specificities to major glycoproteins (GPIb-IX and GPIIb-IIIa) may be considered diagnostic for autoimmune TP. Measurement of a soluble marker as a sign of thrombin activation and progressing deterioration of platelet components was applied to analyze the alterations caused by several stress factors (storage, transportation and lack of continuous shaking under controlled conditions) in platelet products. GPV measured as a soluble factor in the platelet storage medium correlated well with an array of other measurements commonly applied in the characterization of stored platelets. The benefits of measuring a soluble analyte in a quantitative assay were evident.
Abstract:
The main obstacle to the application of high-quality diamond-like carbon (DLC) coatings has been the lack of adhesion to the substrate as the coating thickness is increased. The aim of this study was to improve the filtered pulsed arc discharge (FPAD) method. With this method it is possible to achieve the high DLC coating thicknesses necessary for practical applications. The energy of the carbon ions was measured with an optoelectronic time-of-flight method. An in situ cathode polishing system, used for stabilizing the process yield and the carbon ion energies, is presented. Simultaneously, the quality of the coatings can be controlled. To optimise the quality of the deposition process, a simple, fast and inexpensive method using silicon wafers as test substrates was developed. This method was used for evaluating the suitability of a simplified arc-discharge set-up for the deposition of the adhesion layer of DLC coatings. A whole new group of materials discovered by our research group, the diamond-like carbon polymer hybrid (DLC-p-h) coatings, is also presented. The parent polymers used in these novel coatings were polydimethylsiloxane (PDMS) and polytetrafluoroethylene (PTFE). The energy of the plasma ions was found to increase when the anode-cathode distance and the arc voltage were increased. A constant deposition rate for continuous coating runs was obtained with the in situ cathode polishing system. The novel DLC-p-h coatings were found to be water- and oil-repellent and harder than any polymer. The lowest sliding angle ever measured on a solid surface, 0.15 ± 0.03°, was measured on a DLC-PDMS-h coating. In the FPAD system, carbon ions can be accelerated to the high energies (≈ 1 keV) necessary for the optimal adhesion (the substrate breaks in the adhesion and quality test) of ultra-thick (up to 200 µm) DLC coatings by increasing the anode-cathode distance and using high voltages (up to 4 kV).
Excellent adhesion can also be obtained with the simplified arc-discharge device. To maintain a high process yield (5 µm/h over a surface area of 150 cm²) and to stabilize the carbon ion energies and the high quality (sp³ fraction up to 85%) of the resulting coating, an in situ cathode polishing system must be used. The DLC-PDMS-h coating is a superior candidate coating material for anti-soiling applications where hardness is also required.
Abstract:
Pitfalls in the treatment of persons with dementia. Persons with dementia require high-quality health care, rehabilitation and sufficient social services to support their autonomy and to postpone permanent institutionalization. This study sought to investigate possible pitfalls in the care of patients with dementia: hip fracture rehabilitation, the use of inappropriate or antipsychotic medication, and the social and medicolegal services offered to dementia caregiving families. Three different Finnish samples from the years 1999-2005 were used, with mean ages of 78 to 86 years. After a hip fracture operation, a weight-bearing restriction, especially in the group of patients with dementia, was associated with a longer rehabilitation period (73.5 days vs. 45.5 days, p=0.03) and the inability to learn to walk after six weeks (p<0.001). Almost half (44%) of the pre-surgery home-dwellers with dementia in our sample required permanent hospitalization after hip fracture. Potentially inappropriate drugs (PIDs) were used by 36.2% of nursing home and hospital patients. The most common PIDs in Finland were temazepam over 15 mg/day, oxybutynin, and dipyridamole. However, PID use failed to predict mortality or the use of health services. Nearly half (48.4%) of the nursing home and hospital patients with dementia used antipsychotic medication. The two-year mortality did not differ among users of conventional antipsychotics, users of atypical antipsychotics, and non-users (45.3% vs. 32.1% vs. 49.6%, p=0.195). The mean number of hospital admissions was highest among non-users (p=0.029). A high number of medications (HR 1.12, p<0.001) and the use of physical restraints (HR 1.72, p=0.034) predicted higher mortality at two years, while the use of atypical antipsychotics (HR 0.49, p=0.047) showed a protective effect, if any.
The services most often offered to caregiving families of persons with Alzheimer's disease (AD) included financial support from the community (36%), technical devices (33%), physiotherapy (32%), and respite care in nursing homes (31%). The services most often needed included physiotherapy for the spouse with dementia (56%), financial support (50%), house cleaning (41%), and home respite (40%). Only a third of the caregivers were satisfied with these services, and 69% felt unable to influence the range of services offered. The use of legal guardians was quite rare (only 4.3%), while 37.8% used financial powers of attorney. Almost half (47.9%) of the couples expressed an unmet need to discuss medico-legal issues with their doctor, while only 9.9% stated that their doctor had informed them of such matters. Although many practical methods already exist to develop the medical and social care of persons with AD, these patients and their families require better planning and tailoring of such services. In this way, society could offer these elderly persons a better quality of life while economizing on its financial resources. This study was supported by the Social Insurance Institution of Finland, and part of it was made in cooperation with the Central Union of the Welfare for the Aged, Finland.
Abstract:
The superconducting (or cryogenic) gravimeter (SG) is based on the levitation of a superconducting sphere in a stable magnetic field created by currents in superconducting coils. Depending on frequency, it is capable of detecting gravity variations as small as 10⁻¹¹ m s⁻². For a single event, the detection threshold is higher, conservatively about 10⁻⁹ m s⁻². Due to its high sensitivity and low drift rate, the SG is eminently suitable for the study of geodynamical phenomena through their gravity signatures. I present investigations of Earth dynamics with the superconducting gravimeter GWR T020 at Metsähovi from 1994 to 2005. The history and key technical details of the installation are given. The data processing methods and the development of the local tidal model at Metsähovi are presented. The T020 is part of the worldwide GGP (Global Geodynamics Project) network, which consists of 20 operating stations. The data of T020 and of the other participating SGs are available to the scientific community. The SG T020 has been used as a long-period seismometer to study microseismicity and the Earth's free oscillations. The annual variation, spectral distribution, amplitude and sources of microseisms at Metsähovi are presented. Free oscillations excited by three large earthquakes were analyzed: the spectra, attenuation and rotational splitting of the modes. The lowest modes of all the different oscillation types were studied, i.e. the radial mode 0S0, the "football mode" 0S2, and the toroidal mode 0T2. The very low-level (0.01 nm s⁻¹) incessant excitation of the Earth's free oscillations was detected with T020. The recovery of global and regional variations in gravity with the SG requires the modelling of local gravity effects. The most important of these is hydrology. The variation in the groundwater level at Metsähovi, as measured in a borehole in the fractured bedrock, correlates significantly (0.79) with gravity.
The influences of local precipitation, soil moisture and snow cover are detectable in the gravity record. The gravity effect of the variation in atmospheric mass and that of the non-tidal loading by the Baltic Sea were investigated together, as sea level and air pressure are correlated. Using Green's functions, it was calculated that a 1-metre uniform layer of water in the Baltic Sea increases the gravity at Metsähovi by 31 nm s⁻² and that the vertical deformation is -11 mm. The regression coefficient for sea level is 27 nm s⁻² m⁻¹, which is 87% of the uniform model. These studies were complemented with temporal height variations derived from the GPS data of the Metsähovi permanent station. The long time series at Metsähovi demonstrated the high quality of the data and the correctly applied offset and drift corrections. The superconducting gravimeter T020 has proved to be an eminent and versatile tool in studies of Earth dynamics.
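The 87% figure above is the ratio of the observed sea-level admittance to the modelled uniform-layer effect. A one-line check using only the numbers quoted in the text:

```python
# Modelled gravity effect at Metsähovi of a 1 m uniform Baltic Sea water
# layer, and the observed regression coefficient for sea level
# (both values taken from the text; units nm s^-2 per metre).
modelled = 31.0
observed = 27.0

ratio = observed / modelled
print(f"{ratio:.0%}")  # -> 87%
```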
Abstract:
The magnetic field of the Earth is 99% of internal origin and is generated in the liquid outer core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. Presently, this theory is the basis of, e.g., the IGRF (International Geomagnetic Reference Field) models, which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and the non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a pure geocentric axial dipole field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been estimated using several methods. In this work, the testing rests on the frequency dependence of inclination with respect to latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) terms produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that during the long-term course of continental drift, the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of the observations, as well as of the age and rock type, have been tested. For the comparison between theoretical and empirical distributions, chi-square testing was applied. In addition, spatiotemporal binning was used effectively to remove the errors caused by multiple observations. The modelling from igneous rock data shows that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10% of GAD).
Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased as a function of the age of the rocks. The distribution calculated from the so-called key poles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole 1% of GAD. In no earlier study have rocks older than 400 Ma given a result so close to GAD, but low inclinations have been prominent especially in the sedimentary data. Despite these results, a greater amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to confirm that the dipole model holds true.
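The inclination test described above rests on a simple closed-form relation: for a geocentric axial dipole, tan I = 2 tan λ, where I is the inclination and λ the site latitude. The sketch below (an illustrative aside, not the thesis's code; the sample size and 10° bin width are assumptions) computes the theoretical GAD inclination frequency distribution for sites spread uniformly over the globe, which is the kind of reference distribution an empirical data set can be chi-square tested against.

```python
import numpy as np

rng = np.random.default_rng(42)

def gad_inclination(lat_deg):
    """GAD relation: tan(I) = 2 tan(latitude); returns I in degrees."""
    return np.degrees(np.arctan(2.0 * np.tan(np.radians(lat_deg))))

# Uniform sampling of the sphere: sin(latitude) is uniform on [-1, 1].
lat = np.degrees(np.arcsin(rng.uniform(-1, 1, 100_000)))
inc = gad_inclination(lat)

# Expected GAD frequency of |I| in 10-degree bins; empirical inclination
# distributions can be compared against this with a chi-square test.
counts, edges = np.histogram(np.abs(inc), bins=np.arange(0, 91, 10))
freq = counts / counts.sum()
print(np.round(freq, 3))
```

Adding axial quadrupole (G2) or octupole (G3) terms changes the field components and hence shifts this distribution, which is what makes the inclination frequencies diagnostic of non-dipole contributions.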
Abstract:
Physics teachers are in a key position to form the attitudes and conceptions of future generations toward science and technology, as well as to educate future generations of scientists. Therefore, good teacher education is one of the key areas of physics departments' education programs. This dissertation is a contribution to the research-based development of high-quality physics teacher education, designed to meet three central challenges of good teaching. The first challenge relates to the organization of physics content knowledge. The second challenge, connected to the first one, is to understand the role of experiments and models in (re)constructing the content knowledge of physics for the purposes of teaching. The third challenge is to provide pre-service physics teachers with opportunities and resources for reflecting on and assessing their knowledge and experience of physics and physics education. This dissertation demonstrates how these challenges can be met when the content knowledge of physics, the relevant epistemological aspects of physics and the pedagogical knowledge of teaching and learning physics are combined. The theoretical part of this dissertation is concerned with designing two didactical reconstructions for the purposes of physics teacher education: the didactical reconstruction of processes (DRoP) and the didactical reconstruction of structures (DRoS). This part starts by taking into account the required professional competencies of physics teachers, the pedagogical aspects of teaching and learning, and the benefits of graphical ways of representing knowledge. It then continues with a conceptual and philosophical analysis of physics, especially with an analysis of the role of experiments and models in constructing knowledge. This analysis is condensed in the form of an epistemological reconstruction of knowledge justification. Finally, these two parts are combined in the design and production of the DRoP and DRoS.
The DRoP captures the knowledge formation of physical concepts and laws in a concise and simplified form while still retaining authenticity with respect to the processes by which the concepts were formed. The DRoS is used for representing the structural knowledge of physics, the connections between physical concepts, quantities and laws, to varying extents. Both DRoP and DRoS are represented in graphical form by means of flow charts consisting of nodes and directed links connecting the nodes. The empirical part discusses two case studies that show how the three challenges are met through the use of DRoP and DRoS and how the outcomes of teaching solutions based on them are evaluated. The research approach is qualitative; it aims at an in-depth evaluation and understanding of the usefulness of the didactical reconstructions. The data, which were collected from the advanced course for prospective physics teachers during 2001-2006, consisted of DRoP and DRoS flow charts made by students, together with student interviews. The first case study discusses how student teachers used DRoP flow charts to understand the process of forming knowledge about the law of electromagnetic induction. The second case study discusses how student teachers learned to understand the development of physical quantities related to the temperature concept by using DRoS flow charts. In both studies, attention is focused on the use of DRoP and DRoS to organize knowledge and on the role of experiments and models in this organization process. The results show that the students' understanding of physics knowledge production improved and their knowledge became more organized and coherent. It is shown that the flow charts, and the didactical reconstructions behind them, had an important role in attaining these positive learning results.
On the basis of the results reported here, the designed learning tools have been adopted as a standard part of the teaching solutions used in the physics teacher education courses in the Department of Physics, University of Helsinki.
Abstract:
The successful interaction between leaders and their followers is central to the overall functioning of a company. The increasingly multinational nature of modern business and the resulting multicultural and increasingly heterogeneous workforce impose specific challenges on the development of high-quality work relationships. The Western multinational companies that have started operations in China are facing these challenges. This study examines the quality of leader-follower relationships between Western expatriate leaders and their Chinese followers, as well as between Chinese leaders and their Chinese followers, in Western-owned subsidiaries in China. The focus is on the influence of personal, interpersonal and behavioural factors (personality, values, cultural knowledge, perceived and actual similarity, interactional justice, and follower performance) and on the work-related implications of these relationships (job attitudes and organisational citizenship behaviour). One interesting finding of this study is that Chinese followers perceive their Western leaders more favourably than their Chinese leaders. The results also indicate that Chinese and Western leaders' perceptions of their followers are not influenced favourably by the same follower characteristics. In a similar vein, Chinese followers value different traits in Western versus Chinese leaders. These results, as well as the numerous more specific findings of the study, have practical implications for international human resource management in areas such as selection, placement and training. Due to the differing effects of personal and interpersonal factors across groups, it is difficult to achieve the “perfect match” between leader and follower characteristics that would simultaneously contribute to high-quality relationships for Chinese and Western leaders as well as for followers.
However, the results indicate that the ability of organisations to enhance the quality of leader-follower relations by selecting and matching people with suitable characteristics may provide an effective means for organisations to increase positive job attitudes and hence influence work-related outcomes.
Abstract:
The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Implications of cholinesterases' involvement in disease-modifying processes have increased interest in this research area. Drug discovery and development is a long and expensive process that takes on average 13.5 years and costs approximately 0.9 billion US dollars. Drug attrition in the clinical phases is common for several reasons, e.g., the poor bioavailability of compounds leading to low efficacy or to toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades. These compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes. Both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound amounts and to speed up the process. In this contribution, natural extract, natural pure compound and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects.
To be able to screen the libraries, two assays were evaluated as screening tools and adapted to the special features of each library. The assays were validated to produce high-quality data. Cholinesterase inhibitors with varying potency and selectivity were found in both the natural product and synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from compounds in synthetic compound libraries, which further supports the view of complementarity, especially if high structural diversity is the criterion for selecting compounds for a library.
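Screening assay validation of the kind described above is often summarised with the Z'-factor, which measures how well an assay separates its positive and negative controls. The abstract does not name the validation metric actually used in the thesis, so the following Python sketch, with entirely hypothetical plate-control signals, only illustrates the general idea.

```python
import statistics

def z_prime(pos: list[float], neg: list[float]) -> float:
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    By convention, values above ~0.5 indicate an excellent assay."""
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    mu_p, mu_n = statistics.mean(pos), statistics.mean(neg)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Hypothetical control readings (arbitrary signal units, not thesis data).
positive = [980.0, 1010.0, 995.0, 1005.0, 990.0]  # uninhibited enzyme
negative = [102.0, 98.0, 101.0, 99.0, 100.0]      # fully inhibited
print(round(z_prime(positive, negative), 3))
```

A low Z'-factor would signal that the assay window is too narrow or too noisy for reliable hit detection, which is why such a check precedes any large screen.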
Resumo:
Stroke is a major cause of death and disability, incurs significant costs to healthcare systems, and inflicts a severe burden on society as a whole. Stroke care in Finland has been described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level with personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, a 1.8% improvement annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or by 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years and 5 months. No change could be seen in median SAH patient survival, >10 years. Stroke prevalence was 82 000, 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care for stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care. 
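The number-needed-to-treat figure quoted above is simply the reciprocal of the absolute risk reduction between treated and untreated groups. The abstract does not report the underlying event rates, so the rates in this Python sketch are invented purely to show the arithmetic that yields an NNT of 32.

```python
def nnt(control_rate: float, treated_rate: float) -> float:
    """Number needed to treat = 1 / absolute risk reduction (ARR)."""
    arr = control_rate - treated_rate
    if arr <= 0:
        raise ValueError("treatment shows no benefit")
    return 1 / arr

# Hypothetical one-year rates of death or institutional care
# (NOT the study's actual figures): an ARR of 1/32 reproduces NNT = 32.
print(round(nnt(0.40, 0.36875)))  # -> 32
```

The confidence interval reported in the study (26 to 42) would correspondingly follow from the uncertainty in the two case-mix-adjusted event rates.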
The mean first-year healthcare cost of a stroke patient was ~€20 000, and among survivors ~€10 000 annually thereafter. Only part of this cost was incurred by stroke, as the same patients cost ~€5 000 over the year prior to stroke. Total lifetime costs after a first-ever stroke were ~€85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is used in the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. It is possible, in a suitable setting with high-quality administrative registries and a common identifier, to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcomes.
Resumo:
Paraserianthes falcataria is a very fast-growing, light-wood tree species that has recently gained wide interest in Indonesia for industrial wood processing. At the moment, the P. falcataria plantations managed by smallholders lack predefined management programmes for commercial wood production. The general objective of this study was to model the growth and yield of Paraserianthes falcataria stands managed by smallholders in Ciamis, West Java, Indonesia, and to develop management scenarios for different production objectives. In total, 106 circular sample plots with over 2 300 P. falcataria trees were assessed in a smallholder plantation inventory. In addition, information on market prices of P. falcataria wood was collected through rapid appraisals among industries. A tree growth model based on the Chapman-Richards function was developed for three different site qualities, and stand management scenarios were developed under three management objectives: (1) low initial stand density with low-intensity stand management, (2) high initial stand density with medium intensity of intervention, and (3) high initial stand density with strong silvicultural interventions repeated more than once. In general, the 9 recommended scenarios have rotation ages varying from 4 to 12 years, planting densities from 4x4 meters (625 trees ha-1) to 3x2 meters (1 666 trees ha-1), and thinnings removing 30 to 60% of the standing trees. The highest annual income would be generated on high-quality sites with a scenario combining an initial planting density of 3x2 m (1 666 trees ha-1), one thinning removing 55% of the standing trees at the age of 2 years, and a clear cut at the age of 4 years.
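The Chapman-Richards function mentioned above has the general form y(t) = A(1 − e^(−kt))^p, where A is the asymptotic maximum, k the rate parameter, and p the shape parameter. The thesis's fitted parameters for the three site qualities are not given in the abstract, so the values in this Python sketch are illustrative only.

```python
import math

def chapman_richards(t: float, A: float, k: float, p: float) -> float:
    """Chapman-Richards growth curve: y(t) = A * (1 - exp(-k * t)) ** p.
    A = asymptote, k = rate parameter, p = shape parameter."""
    return A * (1 - math.exp(-k * t)) ** p

# Illustrative (not fitted) parameters for stand height on a good site.
for age in (2, 4, 8, 12):
    height = chapman_richards(age, A=35.0, k=0.25, p=1.4)
    print(f"age {age} yr: {height:.1f} m")
```

In practice, the parameters would be estimated separately for each site quality by nonlinear regression against the inventory plot data, and the fitted curves then drive the rotation-age and thinning decisions in each scenario.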
Resumo:
Potato virus Y (PVY) currently causes the greatest yield and quality losses in potato (Solanum tuberosum L.) worldwide. Although the yield loss caused by PVY alone is difficult to measure, it has been estimated at 20-80%. The most important route of spread of the virus is virus-infected seed potato. High-quality seed potato is a prerequisite for the production of table, food-industry, and starch potatoes. Visual inspection of the crop generally underestimates the incidence of PVY. Laboratory testing gives more accurate information on the infection rate of the harvested crop. A problem in testing for PVY is that it is not detected as reliably in samples taken from dormant tubers as in potatoes that have already passed dormancy. Various methods, from chemicals (Rindite, bromoethane) to plant hormones (e.g. gibberellic acid) and altered storage conditions (cold and heat treatment), have been tried for breaking potato dormancy, but the results have been variable. In this thesis, an oxygen-carbon dioxide treatment (40% O2 and 20% CO2) with different treatment durations was used to break potato dormancy. The aim was to determine whether the treatment affects sprouting and releases dormancy earlier than it would break naturally, or affects the detection of PVY. A further aim was to determine whether PVY can be detected by ELISA (enzyme-linked immunosorbent assay) as reliably from other plant parts (tuber, sprout) as from the potato leaf, which is currently the standard sample material. The effects of the sprouting treatment on dormancy release were variable and could hardly be generalised. Nor was the treatment observed to affect the detection of PVY infection when different sample materials were tested. When the performance of the different plant parts in the test was compared, tuber material was found to underestimate PVY infection in all experiments. 
Sprout material also generally underestimated PVY infection in the ELISA determinations. The most reliable test material was the potato leaf.
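A common decision rule in ELISA-based virus testing is to call a sample positive when its absorbance exceeds some multiple of the mean of the healthy (negative) controls. The thesis's actual cutoff is not stated in the abstract, so this Python sketch uses an illustrative 2x rule with made-up absorbance values.

```python
import statistics

def elisa_positive(sample_od: float, negative_controls: list[float],
                   factor: float = 2.0) -> bool:
    """Illustrative cutoff rule: a sample is scored PVY-positive if its
    optical density exceeds `factor` times the mean negative-control OD."""
    return sample_od > factor * statistics.mean(negative_controls)

negatives = [0.08, 0.10, 0.09]           # hypothetical healthy-plant ODs
print(elisa_positive(0.45, negatives))   # leaf-like strong signal -> True
print(elisa_positive(0.15, negatives))   # weak tuber-like signal -> False
```

Under such a rule, the underestimation reported for tuber and sprout material would correspond to infected samples whose absorbance falls below the cutoff, which is why the leaf remained the most reliable test material.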