915 results for Performance-Based Design
Abstract:
OBJECTIVES: To investigate the frequency of interim analyses, stopping rules, and data safety and monitoring boards (DSMBs) in protocols of randomized controlled trials (RCTs); to examine these features across different reasons for trial discontinuation; and to identify discrepancies in reporting between protocols and publications. STUDY DESIGN AND SETTING: We used data from a cohort of RCT protocols approved between 2000 and 2003 by six research ethics committees in Switzerland, Germany, and Canada. RESULTS: Of 894 RCT protocols, 289 (32.3%) prespecified interim analyses, 153 (17.1%) stopping rules, and 257 (28.7%) DSMBs. Overall, 249 of 894 RCTs (27.9%) were discontinued prematurely, mostly for reasons such as poor recruitment, administrative reasons, or unexpected harm. Forty-six of 249 RCTs (18.4%) were discontinued for early benefit or futility; of those, 37 (80.4%) were stopped outside a formal interim analysis or stopping rule. Among 515 published RCTs, there were discrepancies between protocols and publications for interim analyses (21.1%), stopping rules (14.4%), and DSMBs (19.6%). CONCLUSION: Two-thirds of RCT protocols did not consider interim analyses, stopping rules, or DSMBs. Most RCTs discontinued for early benefit or futility were stopped without a prespecified mechanism. When assessing trial manuscripts, journals should require access to the protocol.
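To make the notion of a prespecified stopping rule concrete, below is a minimal sketch of O'Brien-Fleming-type group-sequential boundaries, the kind of formal mechanism most of the prematurely stopped trials above lacked. The constant 2.004 is an approximate tabulated value for K = 3 equally spaced looks at two-sided alpha = 0.05; a real trial would take exact boundaries from group-sequential tables or software.

```python
# Sketch of O'Brien-Fleming-type stopping boundaries (illustrative constants).
import math

def obrien_fleming_bounds(K: int, c_final: float = 2.004) -> list[float]:
    """Critical |z| at looks k = 1..K; the boundary shrinks as data accumulate."""
    return [c_final * math.sqrt(K / k) for k in range(1, K + 1)]

bounds = obrien_fleming_bounds(3)        # approximately [3.471, 2.454, 2.004]
interim_z = [1.8, 2.6]                   # hypothetical z-statistics at looks 1 and 2

for k, (z, b) in enumerate(zip(interim_z, bounds), start=1):
    verdict = "STOP early" if abs(z) > b else "continue"
    print(f"look {k}: |z| = {abs(z):.2f} vs boundary {b:.3f} -> {verdict}")
```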
Abstract:
The growing use of direct oral anticoagulants, in particular among older subjects, raises questions about the limits of evidence-based medicine. The phase III studies that validated the efficacy and safety profile of these molecules (dabigatran, rivaroxaban, apixaban, edoxaban) in both of their indications, venous thromboembolic disease and non-valvular atrial fibrillation, raise concerns in four major areas: the financial support from pharmaceutical companies, the conflicts of interest many authors have with industry, the study design (exclusively non-inferiority studies), and the poor representativeness of the older subjects included. All these points are discussed using data from subgroup analyses, post-marketing studies, and recent meta-analyses. The lack of data on very old subjects with frailty or comorbidities remains the main concern arising from these phase III studies.
Abstract:
Objective: We used demographic and clinical data to design practical classification models for prediction of neurocognitive impairment (NCI) in people with HIV infection. Methods: The study population comprised 331 HIV-infected patients with available demographic, clinical, and neurocognitive data collected using a comprehensive battery of neuropsychological tests. Classification and regression trees (CART) were developed to obtain detailed and reliable models to predict NCI. Following a practical clinical approach, NCI was considered the main variable for study outcomes, and analyses were performed separately in treatment-naïve and treatment-experienced patients. Results: The study sample comprised 52 treatment-naïve and 279 treatment-experienced patients. In the first group, the variables identified as the best predictors of NCI were CD4 cell count and age (correct classification [CC]: 79.6%, 3 final nodes). In treatment-experienced patients, the variables most closely related to NCI were years of education, nadir CD4 cell count, central nervous system penetration-effectiveness score, age, employment status, and confounding comorbidities (CC: 82.1%, 7 final nodes). In patients with an undetectable viral load and no comorbidities, we obtained a fairly accurate model in which the main variables were nadir CD4 cell count, current CD4 cell count, time on current treatment, and past highest viral load (CC: 88%, 6 final nodes). Conclusion: Practical classification models to predict NCI in HIV infection can be obtained using demographic and clinical variables. An approach based on CART analyses may facilitate screening for HIV-associated neurocognitive disorders and complement clinical information about risk and protective factors for NCI in HIV-infected patients.
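As an illustration of the CART approach described above, the sketch below fits a small decision tree on synthetic data with scikit-learn. The feature names echo predictors named in the abstract, but the data, labels, and fitted tree are invented for demonstration and do not reproduce the study's models.

```python
# Toy CART model for predicting NCI; data and labels are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.integers(50, 1200, n),   # nadir CD4 cell count (synthetic)
    rng.integers(20, 75, n),     # age in years (synthetic)
    rng.integers(4, 20, n),      # years of education (synthetic)
])
y = (X[:, 0] < 350).astype(int)  # toy labelling rule standing in for NCI status

# A shallow tree keeps the model interpretable, mirroring the small number
# of final nodes (3-7) reported in the abstract.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
tree.fit(X, y)

print(export_text(tree, feature_names=["nadir_cd4", "age", "education_years"]))
print("training accuracy:", tree.score(X, y))
```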
Abstract:
INTRODUCTION: Alcohol use is one of the leading modifiable morbidity and mortality risk factors among young adults. STUDY DESIGN: Two-group, parallel randomized controlled trial with follow-up at 1 and 6 months. SETTING/PARTICIPANTS: Internet-based study in a general population sample of young men with low-risk drinking, recruited between June 2012 and February 2013. INTERVENTION: Internet-based brief alcohol primary prevention intervention (IBI). The IBI aims to prevent an increase in alcohol use; it consists of normative feedback, feedback on consequences, the calorific value of alcohol, computed blood alcohol concentration, and an indication that the reported alcohol use is associated with no or limited health risks. Intervention group participants received the IBI; control group (CG) participants completed only an assessment. MAIN OUTCOME MEASURES: Alcohol use (number of drinks per week) and binge drinking prevalence. Analyses were conducted in 2014-2015. RESULTS: Of 4365 men invited to participate, 1633 did so; 896 reported low-risk drinking and were randomized (IBI: n = 451; CG: n = 445). At baseline, 1 month, and 6 months, the mean (SD) number of drinks/week was 2.4 (2.2), 2.3 (2.6), and 2.5 (3.0) for IBI, and 2.4 (2.3), 2.8 (3.7), and 2.7 (3.9) for CG. Binge drinking, absent at baseline, was reported by 14.4% (IBI) and 19.0% (CG) at 1 month and by 13.3% (IBI) and 13.0% (CG) at 6 months. At 1 month, beneficial intervention effects were observed on the number of drinks/week (p = 0.05). No significant differences were observed at 6 months. CONCLUSION: We found protective short-term effects of a primary prevention IBI. TRIAL REGISTRATION: Controlled-Trials.com ISRCTN55991918.
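The intervention includes a computed blood alcohol concentration among its feedback elements. A common way to compute such an estimate is the Widmark formula, sketched below; whether the IBI uses exactly this formula is an assumption, and the constants are textbook approximations.

```python
# Widmark-style BAC estimate (illustrative; constants are approximations).
def widmark_bac(grams_alcohol: float, weight_kg: float,
                hours_elapsed: float, male: bool = True) -> float:
    """Estimated blood alcohol concentration in g/kg (per mille)."""
    r = 0.68 if male else 0.55   # Widmark body-water distribution factor
    beta = 0.15                  # typical elimination rate, g/kg per hour
    bac = grams_alcohol / (weight_kg * r) - beta * hours_elapsed
    return max(bac, 0.0)

# Example: ~3 standard drinks (30 g ethanol) for a 75 kg man, two hours in.
print(f"estimated BAC: {widmark_bac(30.0, 75.0, 2.0):.2f} g/kg")
```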
Abstract:
PURPOSE: Advanced Practice Lung Cancer Nurses (APLCN) are well established in several countries, but their role has yet to be established in Switzerland. Developing an innovative nursing role requires a structured approach to guide successful implementation and to meet the overarching goal of improved nursing-sensitive patient outcomes. The "Participatory, Evidence-based, Patient-focused process for guiding the development, implementation, and evaluation of advanced practice nursing" (PEPPA framework) is one such approach, developed in the context of the Canadian health system. The purpose of this article is to describe the development of an APLCN model at a Swiss academic medical center, as part of a specialized Thoracic Cancer Center, and to evaluate the applicability of the PEPPA framework in this process. METHOD: In order to develop and implement the APLCN role, we applied the first seven phases of the PEPPA framework. RESULTS: This article demonstrates the applicability of the PEPPA framework for APLCN development. The framework allowed us to i) identify key components of an APLCN model responsive to lung cancer patients' health needs, ii) identify role facilitators and barriers, iii) implement the APLCN role, and iv) design a feasibility study of this new role. CONCLUSIONS: The PEPPA framework provides a structured process for implementing novel Advanced Practice Nursing roles in a local context, particularly where such roles are in their infancy. Two key points in the process are assessing patients' health needs and involving key stakeholders.
Abstract:
Landslide processes can have direct and indirect consequences affecting human lives and activities. In order to improve landslide risk management procedures, this PhD thesis investigates the capabilities of active LiDAR and RaDAR sensors for landslide detection and characterization at regional scales, spatial risk assessment over large areas, and slope instability monitoring and modelling at site-specific scales. At regional scales, we first demonstrated the capabilities of recent boat-based mobile LiDAR to model the topography of the Normandy coastal cliffs. By comparing annual acquisitions, we also validated our approach to detect surface changes and thus map rock collapses, landslides, and toe erosion affecting the shoreline at a county scale. Then, we applied a spaceborne InSAR approach to detect large slope instabilities in Argentina. Based on both the phase and amplitude of RaDAR signals, we extracted decisive information to detect, characterize, and monitor two previously unknown extremely slow landslides, and to quantify water level variations of a nearby dam reservoir. Finally, advanced investigations into fragmental rockfall risk assessment were conducted along roads of the Val de Bagnes by improving the Slope Angle Distribution approach and the FlowR software. Both rock-mass-failure susceptibilities and relative frequencies of block propagation were thereby assessed, and rockfall hazard and risk maps could be established at the valley scale. At slope-specific scales, in the Swiss Alps, we first integrated ground-based InSAR and terrestrial LiDAR acquisitions to map, monitor, and model the Perraire rock slope deformation. By interpreting both methods individually and in an integrated way, we delimited the rockslide borders, computed volumes, and highlighted non-uniform translational displacements along a wedge failure surface. Finally, we studied specific requirements and practical issues encountered in early warning systems of some of the most studied landslides worldwide. As a result, we highlighted valuable key recommendations for designing new reliable systems; in addition, we underlined conceptual issues that must be solved to improve current procedures. To sum up, the diversity of situations investigated provided extensive experience that revealed the potential and limitations of both methods and highlighted the necessity of their complementary and integrated use.
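The surface-change detection step (comparing annual LiDAR acquisitions) boils down to differencing elevation models and thresholding the result. The sketch below shows that core idea on synthetic grids; the thesis's actual processing chain (point-cloud registration, filtering, uncertainty modelling) is of course far more involved.

```python
# DEM-of-difference change detection on synthetic grids.
import numpy as np

rng = np.random.default_rng(1)
dem_2012 = rng.normal(100.0, 5.0, (50, 50))   # elevations in metres, 1 m cells
dem_2013 = dem_2012.copy()
dem_2013[20:25, 30:35] -= 2.0                 # synthetic rock collapse: 2 m of loss

diff = dem_2013 - dem_2012                    # DEM of difference (DoD)
threshold = 0.5                               # m; smaller changes treated as noise
changed = np.abs(diff) > threshold

lost = -diff[changed & (diff < 0)].sum()      # m^3, assuming 1 m x 1 m cells
print(f"cells with significant change: {changed.sum()}")
print(f"estimated eroded volume: {lost:.1f} m^3")
```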
Abstract:
The presence of e-portfolios in educational centres, companies, and administrations has emerged strongly during recent years, creating very different practices arising from different objectives and purposes. This situation has led researchers and practitioners to design and implement e-portfolios with little reference to previous knowledge of them; consequently, developments are disparate, and many of the processes and dimensions used both in development and use are unnecessarily complex. In order to minimize these inconveniences, unify these developmental processes, and improve the results of the implementation and use of e-portfolios, it seemed necessary to create a network of researchers, teachers, and trainers from different universities and institutions who are interested in the investigation and practice of e-portfolios in Spain. The Network on e-portfolio was therefore created in 2006, funded by the Spanish Ministry of Education and led by the Universitat Oberta de Catalunya. Besides the goals associated with the creation of this network, which we wanted to share with other European researchers and experts from other continents, we also present in this paper some data from the first study carried out on the use of e-portfolios in our country, showing where we are and which trends are the most important for the near future.
Abstract:
Network virtualisation is gaining considerable attention as a solution to the ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time, while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
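A heavily simplified, single-agent sketch of the learning idea in this abstract: a substrate-node agent uses tabular Q-learning to decide what fraction of requested capacity to reserve for a hosted virtual node, trading off QoS violations against wasted resources. The state, action, and reward definitions are illustrative assumptions, not the paper's actual formulation.

```python
# Tabular Q-learning for capacity reservation (illustrative formulation).
import random

random.seed(0)
LOAD_LEVELS = 5                    # discretised demand of the hosted virtual node
ACTIONS = [0.25, 0.5, 0.75, 1.0]   # fraction of requested capacity to reserve
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = [[0.0] * len(ACTIONS) for _ in range(LOAD_LEVELS)]

def reward(load_level: int, reserved: float) -> float:
    demand = (load_level + 1) / LOAD_LEVELS
    if reserved < demand:                    # under-provisioning degrades QoS
        return -2.0 * (demand - reserved)    # (packet drops, link delay)
    return -(reserved - demand)              # over-provisioning wastes substrate

for _ in range(5000):
    s = random.randrange(LOAD_LEVELS)        # observe current load
    if random.random() < EPS:                # epsilon-greedy exploration
        a = random.randrange(len(ACTIONS))
    else:
        a = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
    r = reward(s, ACTIONS[a])
    s2 = random.randrange(LOAD_LEVELS)       # next observed load (i.i.d. here)
    Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])

for s in range(LOAD_LEVELS):
    best = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
    print(f"load level {s}: reserve {ACTIONS[best]:.0%} of requested capacity")
```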
Abstract:
This thesis considers aspects related to the design and standardisation of transmission systems for wireless broadcasting, comprising terrestrial and mobile reception. The purpose is to identify which factors influence the technical decisions and what issues could be better considered in the design process in order to assess different use cases, service scenarios, and end-user quality. Further, the necessity of cross-layer optimisation for efficient data transmission is emphasised and means to take this into consideration are suggested. The work is mainly related to terrestrial and mobile digital video broadcasting systems, but many of the findings can be generalised to other transmission systems and design processes. The work has led to three main conclusions. First, it is discovered that there are no sufficiently accurate error criteria for measuring the subjective perceived audiovisual quality that could be utilised in transmission system design. Means for designing new error criteria for mobile TV (television) services are suggested, and similar work related to other services is recommended. Second, it is suggested that in addition to commercial requirements there should be technical requirements setting the framework for the design process of a new transmission system. The technical requirements should include the assessed reception conditions, technical quality of service, and service functionalities. Reception conditions comprise radio channel models, receiver types, and antenna types. Technical quality of service consists of bandwidth, timeliness, and reliability. Of these, the thesis focuses on radio channel models and error criteria (reliability) as two of the most important design challenges and provides means to optimise transmission parameters based on them. Third, the thesis argues that the most favourable development for wireless broadcasting would be a single system suitable for all scenarios of wireless broadcasting. It is claimed that there are no major technical obstacles to achieving this and that the recently published second-generation digital terrestrial television broadcasting system provides a good basis. The challenges and opportunities of a universal wireless broadcasting system are discussed mainly from the technical perspective, but briefly also from commercial and regulatory aspects.
Abstract:
The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for machine design variables. The goals of minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, and of maximizing the momentary heat recovery output, while satisfying the given constraints and taking into account the uncertainty in the models, were successfully met. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of a system is presented and implemented in a Matlab environment. Markov Chain Monte Carlo (MCMC) methods are also used to take into account the uncertainty in the models. Results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger, even though its construction is more expensive (160 EUR/m2 compared with 50 EUR/m2 for a dry heat exchanger). It was found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa; the effect of the uncertainty in the models was identified in a simplified case of minimizing the area of a dry heat exchanger.
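At the heart of NSGA-II is non-dominated sorting of candidate designs against competing objectives. The sketch below applies that core idea to a toy two-objective heat-exchanger problem (minimise LCC, maximise heat recovery); the cost and recovery models are invented stand-ins, not the study's validated thermo-economic models.

```python
# Non-dominated (Pareto) filtering of candidate heat-exchanger designs.
import math
import random

random.seed(2)

def evaluate(design):
    """Toy thermo-economic model: returns (LCC in EUR, heat recovery in kW)."""
    area, velocity = design                       # m^2, m/s
    capex = 160.0 * area                          # wet exchanger, EUR/m^2
    recovery = 50.0 * (1 - math.exp(-area / 30.0)) * min(velocity / 1.5, 1.0)
    opex = 5.0 * area + 40.0 * velocity ** 2      # pumping cost grows with velocity
    return capex + 15 * opex, recovery            # 15-year horizon, no discounting

def dominates(a, b):
    # a dominates b: no worse on both objectives (LCC minimised, recovery
    # maximised) and strictly better on at least one.
    return a[0] <= b[0] and a[1] >= b[1] and (a[0] < b[0] or a[1] > b[1])

designs = [(random.uniform(5, 120), random.uniform(0.5, 3.0)) for _ in range(200)]
evals = [(d, evaluate(d)) for d in designs]
front = [(d, f) for d, f in evals
         if not any(dominates(g, f) for _, g in evals if g is not f)]

for (area, vel), (lcc, q) in sorted(front):
    print(f"area {area:6.1f} m^2  v {vel:.2f} m/s  "
          f"LCC {lcc:9.0f} EUR  recovery {q:5.1f} kW")
```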
Abstract:
Particulate nanostructures are increasingly used for analytical purposes. Such particles are often generated by chemical synthesis from non-renewable raw materials. Generation of uniform nanoscale particles is challenging, and particle surfaces must be modified to make the particles biocompatible and water-soluble. Usually nanoparticles are functionalized with binding molecules (e.g., antibodies or their fragments) and, if needed, a label substance. Overall, producing nanoparticles for use in bioaffinity assays is a multistep process requiring several manufacturing and purification steps. This study describes a biological method of generating functionalized protein-based nanoparticles with specific binding activity on the particle surface and label activity inside the particles. Traditional chemical bioconjugation of the particle and specific binding molecules is replaced with genetic fusion of the binding molecule gene and the particle backbone gene. The entire particle shell and binding moieties are synthesized from generic raw materials by bacteria, and fermentation is combined with a simple purification method based on inclusion bodies. The label activity is introduced during purification. The process results in particles that are ready to use as reagents in bioaffinity assays. Apoferritin was used as the particle body, and the system was demonstrated using three different binding moieties: a small protein, a peptide, and a single-chain Fv antibody fragment, which represents a complex protein including a disulfide bridge. If needed, Eu3+ was used as the label substance. The results showed that the production system yielded pure protein preparations, and the particles were of homogeneous size when visualized with transmission electron microscopy. The passively introduced label was stably associated with the particles, and binding molecules genetically fused to the particle specifically bound target molecules. The functionality of the particles in bioaffinity assays was successfully demonstrated with two types of assays: as labels and in a particle-enhanced agglutination assay. This biological production procedure has many advantages that make the process especially suited to applications with frequent and recurring requirements for homogeneous functional particles. The production process of ready, functional, water-soluble particles follows the principles of "green chemistry" and is upscalable, fast, and cost-effective.
Abstract:
Meeting design is one of the most critical prerequisites for the success of facilitated meetings, but how to achieve this success is not yet fully understood. This study presents a descriptive model of the design of technology-supported meetings based on literature findings about the key factors contributing to the success of collaborative meetings, linking these factors to the meeting design steps by exploring how facilitators consider the factors in practice in their design process. The empirical part includes a multiple-case study conducted among 12 facilitators. The case concentrates on the GSS laboratory at LUT, which has been working on facilitation and GSS for the last fifteen years. The study also includes 'control' cases from two comparable institutions. The results of this study highlight both the variances and the commonalities among facilitators in how they design collaboration processes. The design thinking of facilitators at all levels of experience is found to be largely consistent, so the key design factors as well as their role across the design process can be outlined. Session goals, group composition, supporting technology, motivational aspects, physical constraints, and correct design practices were found to be the key factors in design thinking. These factors are further categorized into three types (controllable, constraining, and guiding design factors), since the study findings indicate that a factor's type affects its importance in design. Furthermore, the order in which these factors are considered in the design process is outlined.
Abstract:
The layout design process of the packaging laboratory at Lappeenranta University of Technology is documented in this thesis. Layout planning methods are discussed in general. The systematic layout planning procedure is presented in more detail, as it is utilised in the layout planning of the packaging laboratory. General demands for a research laboratory are discussed from both the machine and product perspectives. The possibilities for commercial food processing in the laboratory are considered from the point of view of foodstuff processing regulations and hygiene demands. The layout planning process is documented and different layout possibilities are presented. The different layout drafts are evaluated, and one draft is developed into the final layout of the packaging laboratory. A guideline for technical planning and implementation based on the final layout is given.