874 results for development methods
Abstract:
INTRODUCTION: Focal therapy may reduce the toxicity of current radical treatments while maintaining the oncological benefit. Irreversible electroporation (IRE) has been proposed to be tissue selective and so might have favourable characteristics compared to the currently used prostate ablative technologies. The aim of this trial is to determine the adverse events, genito-urinary side effects and early histological outcomes of focal IRE in men with localised prostate cancer. METHODS: This is a single-centre prospective development (stage 2a) study following the IDEAL recommendations for evaluating new surgical procedures. Twenty men who have MRI-visible disease localised in the anterior part of the prostate will be recruited. The sample size permits a precision estimate around key functional outcomes. Inclusion criteria include PSA ≤ 15 ng/ml, Gleason score ≤ 4 + 3, stage T2N0M0 and absence of clinically significant disease outside the treatment area. Treatment delivery will be changed in an adaptive, iterative manner so as to allow optimisation of the IRE protocol. After focal IRE, men will be followed for 12 months using validated patient-reported outcome measures (IPSS, IIEF-15, UCLA-EPIC, EQ-5D, FACT-P, MAX-PC). Early disease control will be evaluated by mpMRI and targeted transperineal biopsy of the treated area at 6 months. DISCUSSION: The NEAT trial will assess the early functional and disease control outcomes of focal IRE using an adaptive design. Our protocol can provide guidance for designing an adaptive trial to assess new surgical technologies in the challenging landscape of health technology assessment in prostate cancer treatment.
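As a rough illustration of what "precision" means with a sample of 20 men (the protocol itself does not state this calculation, so the figures below are purely an assumed worked example), the half-width of a 95% confidence interval for an observed proportion is at most about 22 percentage points:

```latex
% Illustrative only: 95% CI half-width for a proportion p with n = 20 (worst case p = 0.5)
\text{half-width} \;=\; 1.96\,\sqrt{\frac{p(1-p)}{n}}
\;\le\; 1.96\,\sqrt{\frac{0.25}{20}} \;\approx\; 0.22
```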
Abstract:
Molecular diagnosis using real-time polymerase chain reaction (PCR) may allow earlier diagnosis of rickettsiosis. We developed a duplex real-time PCR that amplifies (1) DNA of any rickettsial species and (2) DNA of both typhus group rickettsia, that is, Rickettsia prowazekii and Rickettsia typhi. Primers and probes were selected to amplify a segment of the 16S rRNA gene of Rickettsia spp. for the pan-rickettsial PCR and the citrate synthase gene (gltA) for the typhus group rickettsia PCR. Analytical sensitivity was 10 copies of control plasmid DNA per reaction. No cross-amplification was observed when testing human DNA and 22 pathogens or skin commensals. Real-time PCR was applied to 16 clinical samples. Rickettsial DNA was detected in the skin biopsies of three patients. In one patient with severe murine typhus, the typhus group PCR was positive in a skin biopsy from a petechial lesion and seroconversion was later documented. The two other patients with negative typhus group PCR suffered from Mediterranean and African spotted fever, respectively; in both cases, skin biopsy was performed on the eschar. Our duplex real-time PCR showed a good analytical sensitivity and specificity, allowing early diagnosis of rickettsiosis among three patients, and recognition of typhus in one of them.
Abstract:
INTRODUCTION: Intrauterine Growth Restriction (IUGR) is a multifactorial disease defined by an inability of the fetus to reach its growth potential. IUGR not only increases the risk of neonatal mortality/morbidity, but also the risk of metabolic syndrome during adulthood. Certain placental proteins have been shown to be implicated in IUGR development, such as proteins from the GH/IGF axis and angiogenesis/apoptosis processes. METHODS: Twelve patients with term IUGR pregnancy (birth weight < 10th percentile) and 12 CTRLs were included. mRNA was extracted from the fetal part of the placenta and submitted to a subtraction method (Clontech PCR-Select cDNA Subtraction). RESULTS: One candidate gene identified was the long non-coding RNA NEAT1 (nuclear paraspeckle assembly transcript 1). NEAT1 is the core component of a subnuclear structure called the paraspeckle. This structure is responsible for the retention of hyperedited mRNAs in the nucleus. Overall, NEAT1 mRNA expression was 4.14 (±1.16)-fold increased in IUGR vs. CTRL placentas (P = 0.009). NEAT1 was exclusively localized in the nuclei of the villous trophoblasts and was expressed in more nuclei and with greater intensity in IUGR placentas than in CTRLs. PSPC1, one of the three main proteins of the paraspeckle, co-localized with NEAT1 in the villous trophoblasts. The expression of NEAT1_2 mRNA, the long isoform of NEAT1, was only modestly increased in IUGR vs. CTRL placentas. DISCUSSION/CONCLUSION: The increase in NEAT1 and its co-localization with PSPC1 suggests an increase in paraspeckles in IUGR villous trophoblasts. This could lead to increased retention of important mRNAs in villous trophoblast nuclei. Given that the villous trophoblasts are crucial for the barrier function of the placenta, this could in part explain placental dysfunction in idiopathic IUGR fetuses.
Abstract:
Disasters are often perceived as fast, random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of consequences of inappropriate actions and decisions, as well as of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies and development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organisations needed a quantitative assessment of the underlying factors of risk in order to raise awareness among policymakers and to prioritise disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb on both the physical environment and socio-economic parameters) and several thousand hours of processing. A global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors of several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-hazard risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including in front of 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. The characterisation of risk at the global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
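The abstract does not spell out the functional form of the global risk model. A common formulation in this line of work, given here purely as an assumed illustration (the symbols K, PhExp, V_i and the coefficients are not taken from the thesis), expresses expected losses as a multiplicative function of physical exposure and socio-economic vulnerability indicators, fitted as a log-linear regression:

```latex
% Illustrative (assumed) multiplicative risk model:
% K      : expected losses for a given hazard (e.g. people killed)
% PhExp  : physical exposure (population in hazard-affected areas x hazard frequency)
% V_i    : socio-economic vulnerability indicators (e.g. poverty, governance)
\ln K \;=\; c \;+\; \alpha \,\ln(\mathit{PhExp}) \;+\; \sum_{i} \beta_i \,\ln(V_i) \;+\; \varepsilon
```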
Abstract:
BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on the entire relevant research evidence. This, however, can rarely be achieved because a considerable amount of research findings are not published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches to modify effect sizes to adjust a pooled estimate when the presence of publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows: (1) to systematically review methodological articles which focus on non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses; (2) to appraise strengths and weaknesses of methods, the resources they require, and the conditions under which the method could be used, based on findings of included studies. We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form is developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings) will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
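For concreteness, one widely used formal test that a review of this kind would likely cover is Egger's regression test for funnel-plot asymmetry. The sketch below is illustrative only (the protocol above does not prescribe this or any particular method, and the study data are invented): it regresses the standardised effect on precision and inspects the intercept.

```python
# Minimal sketch of Egger's regression test for funnel-plot asymmetry.
# Illustrative only; effect sizes and standard errors below are hypothetical.
import numpy as np
import statsmodels.api as sm

def eggers_test(effects, std_errors):
    """Regress standardised effects on precision; a non-zero intercept suggests asymmetry."""
    effects = np.asarray(effects, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    standardised = effects / std_errors        # standard normal deviates
    precision = 1.0 / std_errors
    X = sm.add_constant(precision)             # intercept column + precision
    fit = sm.OLS(standardised, X).fit()
    return fit.params[0], fit.pvalues[0]       # intercept and its p-value

# Hypothetical log odds ratios and standard errors from ten studies
effects = [0.42, 0.35, 0.51, 0.28, 0.60, 0.45, 0.30, 0.55, 0.70, 0.25]
ses     = [0.10, 0.12, 0.20, 0.15, 0.30, 0.18, 0.11, 0.25, 0.35, 0.09]
b0, p = eggers_test(effects, ses)
print(f"Egger intercept = {b0:.2f}, p = {p:.3f}")  # a small p hints at small-study effects
```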
Abstract:
BACKGROUND: Several European HIV observational databases have, over the last decade, accumulated a substantial number of resistance test results and developed large sample repositories. There is a need to link these efforts together. Here we describe the development of such a tool, which allows these databases to be bound together in a distributed fashion in which control and data remain with the cohorts rather than in a classic data merger. METHODS: As proof of concept we entered two basic queries into the tool: available resistance tests and available samples. We asked for patients still alive after 1998-01-01 and between 180 and 195 cm in height, and how many samples or resistance tests would be available for these patients. The queries were uploaded with the tool to a central web server, from which each participating cohort downloaded the queries with the tool and ran them against their database. The numbers gathered were then submitted back to the server, and we could accumulate the number of available samples and resistance tests. RESULTS: We obtained the following results from the cohorts on available samples/resistance tests: EuResist: not available/11,194; EuroSIDA: 20,716/1,992; ICONA: 3,751/500; Rega: 302/302; SHCS: 53,783/1,485. In total, 78,552 samples and 15,473 resistance tests were available amongst these five cohorts. Once these data items have been identified, it is trivial to generate lists of relevant samples that would be useful for ultra-deep sequencing in addition to the already available resistance tests. Soon the tool will include small analysis packages that allow each cohort to pull a report on their cohort profile and also survey emerging resistance trends in their own cohort. CONCLUSIONS: We plan on providing this tool to all cohorts within the Collaborative HIV and Anti-HIV Drug Resistance Network (CHAIN) and will provide the tool free of charge to others for any non-commercial use. The potential of this tool is to ease collaborations in projects requiring data, that is, to speed up the identification of novel resistance mutations by increasing the number of observations across multiple cohorts instead of waiting for single cohorts or studies to reach the critical number needed to address such issues.
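The abstract describes the tool's round trip only at a high level: the central server distributes queries, each cohort runs them locally, and only aggregate counts are returned. The sketch below is a minimal, hypothetical illustration of that pattern; all names, fields and the in-memory "databases" are invented for the example and do not reflect the actual tool.

```python
# Hypothetical sketch of the distributed-query pattern described above: a centrally
# defined query, local execution inside each cohort, and only aggregate counts
# (never patient-level data) sent back to the central server.
from dataclasses import dataclass
from datetime import date

@dataclass
class Query:
    alive_after: date
    min_height_cm: int
    max_height_cm: int

def run_locally(patients, query):
    """Run inside one cohort: count matching samples and resistance tests."""
    samples = tests = 0
    for p in patients:
        if (p["last_known_alive"] >= query.alive_after
                and query.min_height_cm <= p["height_cm"] <= query.max_height_cm):
            samples += p["n_samples"]
            tests += p["n_resistance_tests"]
    return {"samples": samples, "resistance_tests": tests}  # aggregates only

def accumulate(per_cohort_counts):
    """Run on the central server: sum the aggregates uploaded by each cohort."""
    total = {"samples": 0, "resistance_tests": 0}
    for counts in per_cohort_counts.values():
        total["samples"] += counts["samples"]
        total["resistance_tests"] += counts["resistance_tests"]
    return total

# Toy example with invented records for two fictitious cohorts.
query = Query(alive_after=date(1998, 1, 1), min_height_cm=180, max_height_cm=195)
cohort_a = [{"last_known_alive": date(2001, 5, 3), "height_cm": 185,
             "n_samples": 4, "n_resistance_tests": 1}]
cohort_b = [{"last_known_alive": date(1999, 2, 1), "height_cm": 190,
             "n_samples": 2, "n_resistance_tests": 2}]
print(accumulate({"A": run_locally(cohort_a, query), "B": run_locally(cohort_b, query)}))
```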
Abstract:
Reconstruction of defects in the craniomaxillofacial (CMF) area has mainly been based on bone grafts or metallic fixing plates and screws. Particularly in the case of large calvarial and/or craniofacial defects caused by trauma, tumours or congenital malformations, there is a need for reliable reconstruction biomaterials, because bone grafts or metallic fixing systems do not completely fulfill the criteria for the best possible reconstruction methods in these complicated cases. In this series of studies, the usability of fibre-reinforced composite (FRC) was studied as a biostable, nonmetallic alternative material for reconstructing artificially created bone defects in frontal and calvarial areas of rabbits. The experimental part of this work describes the different stages of the product development process, from the first in vitro tests with resin-impregnated fibre-reinforced composites to the in vivo animal studies in which this FRC was tested as an implant material for reconstructing different-sized bone defects in rabbit frontal and calvarial areas. In the first in vitro study, the FRC was polymerised in contact with bone or blood in the laboratory. The polymerised FRC samples were then incubated in water, which was analysed for residual monomer content by using high performance liquid chromatography (HPLC). It was found that this in vitro polymerisation in contact with bone and blood did not markedly increase the residual monomer leaching from the FRC. In the second in vitro study, different adhesive systems were tested in fixing the implant to the bone surface. This was done to find an alternative implant fixing system to screws and pins. On the basis of this study, it was found that the surface of the calvarial bone needed both mechanical and chemical treatments before the resin-impregnated FRC could be properly fixed onto it. In three animal studies performed with rabbit frontal bone defects and critical size calvarial bone defect models, biological responses to the FRC implants were evaluated. On the basis of these evaluations, it can be concluded that the FRC, based on E-glass (electrical glass) fibres forming a porous fibre veil, enables the ingrowth of connective tissues into the inner structures of the material, as well as bone formation and mineralization inside the fibre veil. Bone formation could be enhanced by using bioactive glass granules fixed to the FRC implants. FRC-implanted bone defects healed partly; no total healing of the defects was achieved. Biological responses to the resin-impregnated composite implant during the follow-up time, at a maximum of 12 weeks, seemed to depend on the polymerisation time of the resin matrix of the FRC. Both of the studied resin systems used in the FRC were photopolymerised, and heat-induced post-polymerisation was additionally used.
Abstract:
The purpose of this bachelor's thesis was to chart scientific research articles to present contributing factors to medication errors made by nurses in a hospital setting, and to introduce methods to prevent medication errors. Additionally, international and Finnish research was combined and the findings were reflected on in relation to the Finnish health care system. A literature review was conducted of 23 scientific articles. Data were searched systematically from the CINAHL, MEDIC and MEDLINE databases, as well as manually. The literature was analysed and the findings combined using inductive content analysis. The findings revealed that both organisational and individual factors contributed to medication errors. High workload, communication breakdowns, an unsuitable working environment, distractions and interruptions, and similar medication products were identified as organisational factors. Individual factors included nurses' inability to follow protocol, inadequate knowledge of medications and personal qualities of the nurse. Developing and improving the physical environment, error reporting, and medication management protocols were emphasised as methods to prevent medication errors. Investing in the staff's competence and well-being was also identified as a prevention method. The number of Finnish articles was small, and therefore the applicability of the findings to Finland is difficult to assess. However, the findings seem to fit the Finnish health care system relatively well. Further research is needed to identify the factors that contribute to medication errors in Finland. This is a necessity for the development of methods to prevent medication errors that fit into the Finnish health care system.
Abstract:
This piece of work, Identification of Research Portfolio for Development of Filtration Equipment, aims at presenting a novel approach to identifying promising research topics in the field of design and development of filtration equipment and processes. The proposed approach consists of identifying technological problems often encountered in filtration processes. The sources of information for problem retrieval were patent documents and scientific papers that discussed filtration equipment and processes. The problem identification method adopted in this work focused on the semantic nature of a sentence in order to generate a series of subject-action-object structures. This was achieved with software called Knowledgist. A list of problems often encountered in filtration processes that have been mentioned in patent documents and scientific papers was generated. These problems were carefully studied and categorized. Suggestions were made on the various classes of these problems that need further investigation in order to propose a research portfolio. The uses and importance of other methods of information retrieval were also highlighted in this work.
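The subject-action-object extraction itself was performed with the Knowledgist software. As a rough, hypothetical illustration of the same idea (not of Knowledgist's actual algorithm), dependency parsing with an open-source NLP library can be used to pull out such triples:

```python
# Illustrative subject-action-object (SAO) extraction via dependency parsing with spaCy.
# This only sketches the general idea; it does not reproduce the Knowledgist tool.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def sao_triples(text):
    """Yield (subject, action, object) triples found around each verb."""
    doc = nlp(text)
    for token in doc:
        if token.pos_ != "VERB":
            continue
        subjects = [c.text for c in token.children if c.dep_ in ("nsubj", "nsubjpass")]
        objects = [c.text for c in token.children if c.dep_ in ("dobj", "obj")]
        for s in subjects:
            for o in objects:
                yield (s, token.lemma_, o)

sentence = "The filter cake clogs the membrane and reduces the flow rate."
for triple in sao_triples(sentence):
    print(triple)  # e.g. ('cake', 'clog', 'membrane'); exact output depends on the parser
```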
Abstract:
Background and Aims: Eosinophilic Esophagitis (EoE) has been reported with increasing frequency over the last two decades. However, it is still unknown whether this reflects a true increase in incidence or just an increased awareness by gastroenterologists. Therefore, we evaluated the incidence and cumulative prevalence of EoE in Olten county over the last 20 years. Methods: Olten county is an area of approximately 91,000 inhabitants without pronounced demographic changes in the last two decades. EoE evaluation is based upon two gastroenterology centers and one pathology center. No public programs for increased EoE awareness were implemented in this region. All EoE patients diagnosed from 1989 to 2009 were entered prospectively into the Olten county database. Results: Forty-six patients (76% males, mean age 41±16 yrs) were diagnosed with EoE from 1989 to 2009. Ninety-four percent presented with dysphagia. In 70% of the patients concomitant allergies were found. The number of upper endoscopies per year was stable during the entire observation period. An average annual incidence rate of 2/100,000 was found (range 0-8) with a marked increase in the period from 2001 to 2009. A current cumulative EoE prevalence of 43/100,000 inhabitants was calculated. The mean diagnostic delay (time from first symptoms to diagnosis) was 4.3 years from 1989 to 1998 and 4.8 years from 1999 to 2009. Conclusions: Over the last 20 years, a significant increase in EoE incidence was found in a stable indicator region of Switzerland. The constant rate of upper endoscopies, the constant diagnostic delay, as well as the lack of EoE awareness programs in Olten county indicate a true increase in EoE incidence.
Abstract:
Modern Java-enabled mobile phones are developing rapidly in terms of processor power, the amount of memory and the features offered by new operating system versions. Device screens will remain small, but diverse multimedia content (audio, video and images) can nevertheless be considerably enhanced with JSR 234, the Advanced Multimedia Supplements. Advanced audio features in particular are a welcome addition, as recent developments have turned mobile phones into portable music players as well. In this master's thesis, a specific part of the JSR 234 specification was implemented in an environment consisting of the third edition of the Series 60 software platform and the Symbian OS v9.1 operating system. The resulting Java API offers application developers a simpler approach to Symbian's effects API while hiding the complexity of the underlying operating system. The implementation must be thoroughly tested to ensure that it strictly complies with the JSR 234 specification. Several different testing methods are presented in this thesis with the aim of achieving the best possible quality in the project.
Abstract:
Current trends such as globalisation, the turbulence of our environment, rising living standards, a growing need for security and the speed of technological development underline the need to anticipate change. To remain competitive, companies must collect, analyse and exploit business information that supports them in anticipating the actions of authorities, competitors and customers. Innovation and the development of new concepts, the assessment of competitors' activities and customers' needs, among other things, require anticipatory assessment. Weak signals play a central role in how organisations prepare for future events. The purpose of this thesis is to create and develop an understanding and management of weak signals, and to develop a conceptual and practical approach for promoting anticipatory activity. The classification of weak signal types is based on their characteristics with respect to time, strength and integration into the business. The different types of weak signals and their features set the boundary conditions for collecting quality factors and, further, for developing a quality system and a tool based on a mathematical model. The quality factors of weak signals have been gathered from all areas of the weak signal concept. The analysed and targeted quality variables make it possible to develop pre-analysis and ICT tools based on the use of a mathematical model. To achieve the objectives of the thesis, a Business Intelligence literature review was first carried out. The weak signal process and system are based on the compiled Business Intelligence system. Business integration and the development of a systematic method were examined as the key development areas. Collecting weak signal methods and definitions and integrating them into a defined process create the foundation of the new concept, to which the typing and quality factors are linked. To enable the examination of practical activity and implementation, a Business Intelligence market survey (n=156) was carried out, together with a summary of other available market surveys. In-depth interviews (n=21) were used to verify the validity of the qualitative analysis. In addition, four practical projects were analysed and their summaries linked to the development of the new concept. The process can be divided into two categories: companies' market signals with a one-year foresight horizon, and public sector network projects developing the creation of a foresight structure for anticipatory activity 7-15 years ahead. The study was limited mainly to the area of external information. IT tools and the development of the final quality system were left outside the scope of the study. The development of the weak signal concept, which was the objective of the thesis, met the expectations set for it. The systematic examination and development of weak signals can be advanced by exploiting Business Intelligence systematics. Business Intelligence systematics is used to support business planning in large companies. However, organisations have not generally exploited weak signals foresight activity based on qualitative analysis. Realising the benefits of integrating external and internal information and of the systematics in support of SMEs requires significant investment in the form of public funding and development support. Foresight has indeed produced numerous public sector reports, but few practical implementations. On the other hand, the analysed cases show that organisations do not necessarily need a dedicated project manager to develop business support; however, the right person must be found to take business responsibility and to commit to the matter.
Abstract:
To enable a mathematically and physically sound execution of the fatigue test and a correct interpretation of its results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases answering the several questions raised in the treatment of test data, with application of the methods and formulae in the document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). Generally, the questions in the data sheets involve several aspects: estimation of the necessary sample size, verification of the statistical equivalence of the collated sets of data, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis serves as a demonstration of the various statistical methods and helps to develop a sound procedure for creating reliable calculation rules for fatigue analysis.
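As one concrete illustration of the kind of calculation covered by such a guide, the sketch below fits a log-log S-N curve to hypothetical test data and derives a characteristic curve as the mean curve shifted down by a multiple of the residual standard deviation. The stress/cycle data and the k factor are placeholders, not values taken from IIW-XIII-2138-06, which tabulates k as a function of sample size and survival probability.

```python
# Minimal sketch: fit log10(N) = a + b*log10(S) by least squares and derive a
# characteristic curve as mean - k * (residual standard deviation).
# Data and k are invented for illustration only.
import numpy as np

stress_ranges = np.array([200.0, 180.0, 160.0, 140.0, 120.0, 100.0])   # MPa (hypothetical)
cycles        = np.array([8.0e4, 1.5e5, 3.1e5, 6.4e5, 1.6e6, 3.9e6])   # cycles to failure (hypothetical)

x = np.log10(stress_ranges)
y = np.log10(cycles)

b, a = np.polyfit(x, y, 1)            # slope b and intercept a of the mean curve
residuals = y - (a + b * x)
stdev = residuals.std(ddof=2)         # two fitted parameters

k = 2.0  # placeholder shift factor; the guide tabulates it by sample size

def characteristic_life(stress_range):
    """Characteristic (lower-bound) number of cycles at a given stress range."""
    return 10.0 ** (a + b * np.log10(stress_range) - k * stdev)

print(f"mean curve: log10(N) = {a:.2f} + ({b:.2f})*log10(S)")
print(f"characteristic N at 150 MPa: {characteristic_life(150.0):.3g}")
```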
Abstract:
The 2010 Position Development Conference addressed four questions related to the impact of previous fractures on 10-year fracture risk as calculated by FRAX(®). To address these questions, PubMed was searched on the keywords "fracture, epidemiology, osteoporosis." Titles of retrieved articles were reviewed for an indication that risk for future fracture was discussed. Abstracts of these articles were reviewed for an indication that one or more of the questions listed above was discussed. For those that did, the articles were reviewed in greater detail to extract the findings and to find additional past work and citing works that also bore on the questions. The official positions and the supporting literature review are presented here. FRAX(®) underestimates fracture probability in persons with a history of multiple fractures (good, A, W). FRAX(®) may underestimate fracture probability in individuals with prevalent severe vertebral fractures (good, A, W). While there is evidence that hip, vertebral, and humeral fractures appear to confer greater risk of subsequent fracture than fractures at other sites, quantification of this incremental risk in FRAX(®) is not possible (fair, B, W). FRAX(®) may underestimate fracture probability in individuals with a parental history of non-hip fragility fracture (fair, B, W). Limitations of the methodology include performance by a single reviewer, preliminary review of the literature being confined to titles, and secondary review being limited to abstracts. Limitations of the evidence base include publication bias, overrepresentation of persons of European descent in the published studies, and technical differences in the methods used to identify prevalent and incident fractures. Emerging topics for future research include fracture epidemiology in non-European populations and men, the impact of fractures in family members other than parents, and the genetic contribution to fracture risk.