910 results for Multiple Baseline Design


Relevance:

30.00%

Publisher:

Abstract:

Background: This paper describes the results of a feasibility study for a randomised controlled trial (RCT). Methods: Twenty-nine members of the UK Dermatology Clinical Trials Network (UK DCTN) expressed an interest in recruiting for this study. Of these, 17 obtained full ethics and Research & Development (R&D) approval, and 15 successfully recruited patients into the study. A total of 70 participants with a diagnosis of cellulitis of the leg were enrolled over a 5-month period. These participants were largely recruited from medical admissions wards, although some were identified from dermatology, orthopaedic, geriatric and general surgery wards. Data were collected on patient demographics, clinical features and willingness to take part in a future RCT. Results: Although cellulitis is a relatively common condition, patients were difficult to locate through our network of UK DCTN clinicians, largely because patients were rarely seen by dermatologists and admissions were not co-ordinated centrally. In addition, the impact of the proposed exclusion criteria was high: only 26 (37%) of those enrolled in the study fulfilled all of the inclusion criteria for the subsequent RCT and were willing to be randomised to treatment. Of the 70 participants identified during the study as having cellulitis of the leg (as confirmed by a dermatologist), only 59 (84%) had all three of the defining features: i) erythema, ii) oedema, and iii) warmth with acute pain/tenderness upon examination. Twenty-two (32%) patients had experienced a previous episode of cellulitis within the last 3 years. The median time to recurrence (estimated as the time since the most recent previous attack) was 205 days (95% CI 102 to 308). Service users were generally supportive of the trial, although several expressed concerns about taking antibiotics for lengthy periods and felt that multiple morbidity/old age would limit entry into a 3-year study.
Conclusion: This pilot study has been crucial in highlighting some key issues for the conduct of a future RCT. As a result of these findings, changes have been made to i) the planned recruitment strategy, ii) the proposed inclusion criteria and iii) the definition of cellulitis for use in the future trial.
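As a rough illustration of how a median time to recurrence and its confidence interval can be reported, the sketch below computes a percentile-bootstrap CI on hypothetical recurrence times; the trial's individual patient data are not reproduced here, and the original analysis may have used a different interval method.

```python
import random
import statistics

def median_ci(times, n_boot=2000, alpha=0.05, seed=42):
    """Median with a simple percentile-bootstrap confidence interval."""
    rng = random.Random(seed)
    med = statistics.median(times)
    boots = sorted(
        statistics.median(rng.choices(times, k=len(times)))
        for _ in range(n_boot)
    )
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return med, lo, hi

# Hypothetical recurrence times in days (not the trial's data).
times = [30, 60, 90, 150, 205, 260, 310, 400, 520, 700]
med, lo, hi = median_ci(times)
```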

Relevance:

30.00%

Publisher:

Abstract:

Background and Purpose—High blood pressure (BP) is present in 80% of patients with acute ischemic stroke and is independently associated with poor outcome. There are few data examining the relationship between admission BP and acute CT findings. Methods—TAIST was a randomized controlled trial assessing 10 days of treatment with tinzaparin versus aspirin in 1489 patients with acute ischemic stroke (within 48 hours of onset) and admission BP ≤220/120 mm Hg. CT brain scans were performed before randomization and after 10 days. The relationships between baseline BP and adjudicated CT findings were assessed, and odds ratios per 10 mm Hg change in BP were calculated. Results—Higher systolic BP (SBP) was associated with abnormal CT scans through independent associations with chronic changes of leukoaraiosis (OR, 1.12; 95% CI, 1.05–1.17) and old infarction (OR, 1.12; 95% CI, 1.06–1.17) at baseline, and with signs of visible infarction at day 10 (OR, 1.06; 95% CI, 1.00–1.13). A lower SBP was associated with signs of acute infarction (OR, 0.94; 95% CI, 0.89–0.99). Hemorrhagic transformation, dense middle cerebral artery sign, mass effect, and cerebral edema at day 10 were not independently associated with baseline BP. Conclusion—Although high baseline BP is independently associated with a poor outcome after stroke, this was not shown to act through an association with increased hemorrhagic transformation, cerebral edema, or mass effect; the trial design may be suboptimal for detecting this. Higher SBP is associated with visible infarction on day 10 scans. The influence of changing BP in acute stroke on CT findings remains to be ascertained.
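Reporting odds ratios "per 10 mm Hg change in BP" amounts to rescaling a per-mm Hg logistic-regression coefficient; a minimal sketch, with a hypothetical slope chosen only to illustrate an OR near 1.12:

```python
import math

def odds_ratio_per_delta(beta_per_unit, delta=10.0):
    """OR for a `delta`-unit increase, given a per-unit log-odds slope."""
    return math.exp(beta_per_unit * delta)

# Hypothetical slope: 0.0113 log-odds per mm Hg of SBP.
or_10 = odds_ratio_per_delta(0.0113, delta=10)   # ≈ 1.12
```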

Relevance:

30.00%

Publisher:

Abstract:

Social network sites (SNS), such as Facebook, Google+ and Twitter, have attracted hundreds of millions of daily users since their appearance. Within SNS, users connect to each other, express their identity, disseminate information and cooperate by interacting with their connected peers. The increasing popularity and ubiquity of SNS usage, together with the invaluable record of user behaviors and connections, have given rise to many applications and business models. We look into several important problems within the social network ecosystem. The first is the SNS advertisement allocation problem. The other two concern trust mechanism design in the social network setting: local trust inference and global trust evaluation. In SNS advertising, we study the problem of advertisement allocation from the ad platform's angle, and discuss its differences from the advertising model in the search engine setting. By leveraging the connection between social networks and hyperbolic geometry, we propose to solve the problem approximately using hyperbolic embedding and convex optimization. A hyperbolic embedding method, \hcm, is designed for the SNS ad allocation problem, and several components are introduced to realize the optimization formulation. We show the advantages of our new approach compared to the baseline integer programming (IP) formulation. In studying trust mechanisms in social networks, we consider the existence of distrust (i.e. negative trust) relationships, and differentiate between the concepts of local trust and global trust in the social network setting. For local trust inference, we propose a 2-D trust model and, based on it, develop a semiring-based trust inference framework. For global trust evaluation, we consider a general setting with conflicting opinions, and propose a consensus-based approach to solve this complex problem in signed trust networks.
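As a toy illustration of semiring-based trust inference (ignoring the distrust dimension of the 2-D model described above), trust along a path can be taken as the weakest link (min) and trust between two users as the best path (max); the node names and trust values below are hypothetical:

```python
def infer_trust(nodes, direct):
    """Propagate trust with a (max, min) semiring:
    trust along a path is its weakest link (min);
    trust between two users is their best path (max)."""
    t = {(u, v): direct.get((u, v), 0.0) for u in nodes for v in nodes}
    for u in nodes:
        t[(u, u)] = 1.0              # full self-trust
    for k in nodes:                  # Floyd-Warshall over the semiring
        for u in nodes:
            for v in nodes:
                t[(u, v)] = max(t[(u, v)], min(t[(u, k)], t[(k, v)]))
    return t

# Hypothetical direct trust statements between four users.
direct = {("a", "b"): 0.9, ("b", "c"): 0.6, ("a", "d"): 0.5, ("d", "c"): 0.8}
t = infer_trust(["a", "b", "c", "d"], direct)
# a trusts c via b with min(0.9, 0.6) = 0.6, via d with min(0.5, 0.8) = 0.5
```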

Relevance:

30.00%

Publisher:

Abstract:

Motivation: Influenza A viral heterogeneity remains a significant threat due to unpredictable antigenic drift in seasonal influenza and antigenic shifts caused by the emergence of novel subtypes. The annual review of multivalent influenza vaccines targets strains of influenza A and B likely to be predominant in future influenza seasons, but this does not induce broad, cross-protective immunity against emergent subtypes. Better strategies are needed to prevent future pandemics. Cross-protection can be achieved by activating CD8+ and CD4+ T cells against highly-conserved regions of the influenza genome. We combine available experimental data with informatics-based immunological predictions to help design vaccines potentially able to induce cross-protective T cells against multiple influenza subtypes. Results: To exemplify our approach we designed two epitope ensemble vaccines comprising highly-conserved and experimentally-verified immunogenic influenza A epitopes as putative non-seasonal influenza vaccines; one specifically targets the US population and the other is a universal vaccine. The USA-specific vaccine comprised 6 CD8+ T cell epitopes (GILGFVFTL, FMYSDFHFI, GMDPRMCSL, SVKEKDMTK, FYIQMCTEL, DTVNRTHQY) and 3 CD4+ epitopes (KGILGFVFTLTVPSE, EYIMKGVYINTALLN, ILGFVFTLTVPSERG). The universal vaccine comprised 8 CD8+ epitopes (FMYSDFHFI, GILGFVFTL, ILRGSVAHK, FYIQMCTEL, ILKGKFQTA, YYLEKANKI, VSDGGPNLY, YSHGTGTGY) and the same 3 CD4+ epitopes. Our USA-specific vaccine has a population protection coverage (PPC; the portion of the population potentially responsive to one or more component epitopes of the vaccine) of over 96% and 95% coverage of observed influenza subtypes. The universal vaccine has a PPC value of over 97% and 88% coverage of observed subtypes.
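A simplified sketch of a population protection coverage calculation, under the strong assumption that per-epitope coverages are independent; real PPC estimates are computed from HLA allele and genotype frequencies, and the coverage values below are hypothetical:

```python
def ppc(coverages):
    """Population protection coverage under an independence assumption:
    probability that a random individual responds to at least one epitope."""
    miss = 1.0
    for p in coverages:
        miss *= (1.0 - p)           # probability of responding to none
    return 1.0 - miss

# Hypothetical per-epitope population coverage fractions.
value = ppc([0.45, 0.40, 0.30, 0.25])
```

Adding epitopes can only increase PPC under this model, which is why ensemble vaccines combine several epitopes restricted by different HLA alleles.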

Relevance:

30.00%

Publisher:

Abstract:

The overarching theme of this thesis is mesoscale optical and optoelectronic design of photovoltaic and photoelectrochemical devices. In a photovoltaic device, light absorption and charge carrier transport are coupled together on the mesoscale, and in a photoelectrochemical device, light absorption, charge carrier transport, catalysis, and solution species transport are all coupled together on the mesoscale. The work discussed herein demonstrates that simulation-based mesoscale optical and optoelectronic modeling can lead to detailed understanding of the operation and performance of these complex mesostructured devices, serve as a powerful tool for device optimization, and efficiently guide device design and experimental fabrication efforts. In-depth studies of two mesoscale wire-based device designs illustrate these principles—(i) an optoelectronic study of a tandem Si|WO3 microwire photoelectrochemical device, and (ii) an optical study of III-V nanowire arrays.

The study of the monolithic, tandem Si|WO3 microwire photoelectrochemical device begins with the development of an optoelectronic model and its validation against experiment. This study capitalizes on synergy between experiment and simulation to demonstrate the model’s predictive power for extractable device voltage and light-limited current density. The developed model is then used to understand the limiting factors of the device and optimize its optoelectronic performance. The results of this work reveal that high fidelity modeling can facilitate unequivocal identification of limiting phenomena, such as parasitic absorption via excitation of a surface plasmon-polariton mode, and quick design optimization, achieving over a 300% enhancement in optoelectronic performance over a nominal design for this device architecture, which would be time-consuming and challenging to achieve via experiment.

The work on III-V nanowire arrays also starts as a collaboration of experiment and simulation aimed at gaining understanding of unprecedented, experimentally observed absorption enhancements in sparse arrays of vertically-oriented GaAs nanowires. To explain this resonant absorption in periodic arrays of high index semiconductor nanowires, a unified framework that combines a leaky waveguide theory perspective and that of photonic crystals supporting Bloch modes is developed in the context of silicon, using both analytic theory and electromagnetic simulations. This detailed theoretical understanding is then applied to a simulation-based optimization of light absorption in sparse arrays of GaAs nanowires. Near-unity absorption in sparse, 5% fill fraction arrays is demonstrated via tapering of nanowires and multiple wire radii in a single array. Finally, experimental efforts are presented towards fabrication of the optimized array geometries. A hybrid self-catalyzed and selective area MOCVD growth method is used to establish morphology control of GaP nanowire arrays. Similarly, morphology and pattern control of nanowires is demonstrated with ICP-RIE of InP. Optical characterization of the InP nanowire arrays gives proof of principle that tapering and multiple wire radii can lead to near-unity absorption in sparse arrays of InP nanowires.
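The quoted 5% fill fraction is a simple geometric quantity; assuming a square-lattice array of cylindrical wires (the actual lattice used in the experiments may differ), fill fraction relates wire radius and array pitch as follows:

```python
import math

def fill_fraction(radius_nm, pitch_nm):
    """Areal fill fraction of a square-lattice array of vertical wires."""
    return math.pi * radius_nm**2 / pitch_nm**2

def pitch_for_fill(radius_nm, target_ff):
    """Pitch giving a target fill fraction for a fixed wire radius."""
    return radius_nm * math.sqrt(math.pi / target_ff)

# Hypothetical numbers: a 5% fill fraction with 100 nm radius wires
# requires a pitch of roughly 793 nm.
p = pitch_for_fill(100.0, 0.05)
```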

Relevance:

30.00%

Publisher:

Abstract:

This Open Access article is distributed under the terms of the Creative Commons Attribution Noncommercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis in Veterinary Sciences, specialty of Biological and Biomedical Sciences

Relevance:

30.00%

Publisher:

Abstract:

Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and as a consequence spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest based k Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area applications of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method called variable plot sampling, in the Ford Forest of Michigan Tech, to derive a standing volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
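A minimal sketch of the imputation idea: each target plot or pixel receives the mean attribute of its k most similar reference plots. The RF-kNN approach measures similarity with a random-forest proximity; plain Euclidean distance in feature space is substituted here only to keep the example self-contained, and all feature vectors and biomass values are hypothetical.

```python
import math

def knn_impute(target_feats, ref_feats, ref_attrs, k=3):
    """Impute a forest attribute as the mean over the k nearest
    reference plots. RF-kNN would replace this Euclidean distance
    with a random-forest proximity measure."""
    dists = sorted(
        (math.dist(target_feats, f), a) for f, a in zip(ref_feats, ref_attrs)
    )
    nearest = [a for _, a in dists[:k]]
    return sum(nearest) / len(nearest)

# Hypothetical spectral features and plot biomass values (t/ha).
refs = [(0.2, 0.5), (0.3, 0.4), (0.8, 0.1), (0.7, 0.2)]
biomass = [120.0, 110.0, 40.0, 55.0]
est = knn_impute((0.25, 0.45), refs, biomass, k=2)
```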

Relevance:

30.00%

Publisher:

Abstract:

Non-orthogonal multiple access (NOMA) is emerging as a promising multiple access technology for fifth generation cellular networks to address fast-growing mobile data traffic. It applies superposition coding at the transmitter, allowing simultaneous allocation of the same frequency resource to multiple intra-cell users, and successive interference cancellation at the receivers to cancel intra-cell interference. User pairing and power allocation (UPPA) is a key design aspect of NOMA. Existing UPPA algorithms are mainly based on exhaustive search, whose high computational complexity can severely affect NOMA performance. A fast proportional fairness (PF) scheduling based UPPA algorithm is proposed to address the problem. The novel idea is to form user pairs around the users with the highest PF metrics, with pre-configured fixed power allocation. System-level simulation results show that the proposed algorithm is significantly faster than the existing exhaustive search algorithm (seven times faster in a scenario with 20 users) with negligible throughput loss.
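A simplified sketch of PF-based pairing with fixed power allocation; the paper's exact pairing and power rules are not reproduced, and the partner-selection rule and values below are illustrative only:

```python
def pf_pairing(rates, avg_thpt, power_near=0.2):
    """Pair users around the highest-PF users with a fixed power split.
    PF metric = instantaneous rate / long-term average throughput."""
    pf = sorted(range(len(rates)),
                key=lambda u: rates[u] / avg_thpt[u], reverse=True)
    pairs = []
    used = set()
    for u in pf:
        if u in used:
            continue
        # Partner: best remaining PF user. The near (stronger) user gets
        # the smaller power share, as in fixed-power NOMA.
        partner = next((v for v in pf if v not in used and v != u), None)
        used.add(u)
        if partner is None:
            pairs.append((u, None, 1.0))       # unpaired user, full power
        else:
            used.add(partner)
            near, far = (u, partner) if rates[u] >= rates[partner] else (partner, u)
            pairs.append((near, far, power_near))
    return pairs

# Hypothetical instantaneous rates and average throughputs for 4 users.
rates = [10.0, 2.0, 7.0, 4.0]
avg = [5.0, 1.0, 7.0, 2.0]
pairs = pf_pairing(rates, avg)
```

This greedy pass visits each user once, which is what makes it fast relative to exhaustively scoring every candidate pair.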

Relevance:

30.00%

Publisher:

Abstract:

Cardiovascular disease (CVD) is the leading cause of death worldwide. With atherosclerosis as the underlying cause for many CVD events, prevention or reduction of subclinical atherosclerotic plaque burden (SAPB) through a healthier lifestyle may have substantial public health benefits. The objective was to describe the protocol of a randomized controlled trial investigating the effectiveness of a 30-month worksite-based lifestyle program aimed to promote cardiovascular health in participants having a high or a low degree of SAPB compared with standard care. We will conduct a randomized controlled trial including middle-aged bank employees from the Progression of Early Subclinical Atherosclerosis cohort, stratified by SAPB (high SAPB n = 260, low SAPB n = 590). Within each stratum, participants will be randomized 1:1 to receive a lifestyle program or standard care. The program consists of 3 elements: (a) 12 personalized lifestyle counseling sessions using Motivational Interviewing over a 30-month period, (b) a wrist-worn physical activity tracker, and (c) a sit-stand workstation. Primary outcome measure is a composite score of blood pressure, physical activity, sedentary time, body weight, diet, and smoking (ie, adapted Fuster-BEWAT score) measured at baseline and at 1-, 2-, and 3-year follow-up. The study will provide insights into the effectiveness of a 30-month worksite-based lifestyle program to promote cardiovascular health compared with standard care in participants with a high or low degree of SAPB.
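The 1:1 randomization within SAPB strata can be sketched with permuted-block allocation; the block size and seed below are arbitrary, and the trial's actual randomization procedure may differ:

```python
import random

def stratified_allocation(strata_sizes, block=4, seed=7):
    """1:1 allocation to two arms within each stratum, permuted blocks."""
    rng = random.Random(seed)
    alloc = {}
    for stratum, n in strata_sizes.items():
        arms = []
        while len(arms) < n:
            b = ["lifestyle"] * (block // 2) + ["standard"] * (block // 2)
            rng.shuffle(b)               # balanced within each block
            arms.extend(b)
        alloc[stratum] = arms[:n]
    return alloc

# Stratum sizes from the protocol: high SAPB n=260, low SAPB n=590.
alloc = stratified_allocation({"high": 260, "low": 590})
```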

Relevance:

30.00%

Publisher:

Abstract:

Studies of the concept of physical activity (PA) are diverse and present different conceptions; its relationship with quality of life is usually framed within medical discourse, which favors physical activity from a purely biological standpoint. Although this perspective is important, studies relating quality of life and PA are based on well-being and perceived health status; such studies have not been conducted from the standpoint of living conditions and social context. While the medical view and objective studies are relevant, since they yield statistics that support recommendations on physical activity, this document presents a qualitative investigation, through a documentary review, of the concept of physical activity, its practices and its relationship with quality of life as addressed by different authors. The PubMed database was chosen for its emphasis on health publications; articles published between 2004 and 2014 that study the concept of physical activity, its practices and its relations with quality of life were selected, and finally analyzed through the lens of models of social determination and social determinants. In this way the authors' positions regarding the concept, its practices and the relations that may emerge with quality of life are analyzed. The research found biological, psychological, social and cultural tendencies, in which the authors make the medical position clear, since most investigations center their relations on functionality, and it is through a therapeutic vision that they seek the well-being and satisfaction of patients suffering from any disease.
In addition, emergent categories appear, such as the body as an advertising medium, rapidly advancing cybernetics, and the role of power in physical activity, which may be considered in further studies.

Relevance:

30.00%

Publisher:

Abstract:

Forest biomass has gained increasing importance in the world economy and in the evaluation of forest development and monitoring. It has been identified as a global strategic reserve, due to its applications in bioenergy, bioproduct development and issues related to reducing greenhouse gas emissions. The estimation of above ground biomass is frequently done with allometric functions per species using plot inventory data; an adequate sampling design and intensity for an error threshold is required, and the estimation per unit area is done using an extrapolation method. This procedure is labour-demanding and costly. The main goal of this study is the development of allometric functions for the estimation of above ground biomass with ground cover as the independent variable, for forest areas of holm oak (Quercus rotundifolia), cork oak (Quercus suber) and umbrella pine (Pinus pinea) in multiple use systems. Ground cover per species was derived from crown horizontal projection obtained by processing high resolution satellite images (orthorectified, geometrically and atmospherically corrected) with a multi-resolution segmentation method and object-oriented classification. Forest inventory data were used to estimate plot above ground biomass with published allometric functions at tree level. The developed functions were fitted for monospecies stands and for multispecies stands of Quercus rotundifolia and Quercus suber, and of Quercus suber and Pinus pinea. Stand composition was accounted for by adding dummy variables to distinguish monospecies from multispecies stands. The models showed good performance. Noteworthy is that the dummy variables, reflecting the differences between species, improved the models: significant differences were found between above ground biomass estimates from the functions with and without the dummy variables. An error threshold of 10% corresponds to stand areas of about 40 ha.
This method enables evaluation over the whole area, without requiring extrapolation procedures, for the three species, which frequently occur in multispecies stands.
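The dummy-variable strategy can be sketched as an ordinary least squares fit of a log-log allometric model, ln(AGB) = b0 + b1·ln(GC) + b2·dummy, where the dummy flags multispecies stands; the functional form and all plot data below are hypothetical, not the study's fitted functions.

```python
import math

def ols(X, y):
    """Ordinary least squares via normal equations + Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for c in range(k):                       # forward elimination, pivoting
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for q in range(c, k):
                A[r][q] -= f * A[c][q]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in reversed(range(k)):             # back substitution
        beta[r] = (b[r] - sum(A[r][q] * beta[q]
                              for q in range(r + 1, k))) / A[r][r]
    return beta

# ln(AGB) = b0 + b1*ln(GC) + b2*dummy  (dummy=1 for multispecies stands).
# Hypothetical plots: (ground cover %, dummy, AGB in Mg/ha).
plots = [(20, 0, 14.0), (40, 0, 30.0), (60, 0, 47.0),
         (20, 1, 11.0), (40, 1, 24.0), (60, 1, 38.0)]
X = [[1.0, math.log(gc), float(d)] for gc, d, _ in plots]
y = [math.log(agb) for _, _, agb in plots]
b0, b1, b2 = ols(X, y)
```

A negative b2 here would indicate that, at equal ground cover, multispecies stands carry less above ground biomass than monospecies stands.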

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, we deal with the design of experiments in the drug development process, focusing on the design of clinical trials for treatment comparisons (Part I) and the design of preclinical laboratory experiments for protein development and manufacturing (Part II). In Part I we propose a multi-purpose design methodology for sequential clinical trials. We derive optimal allocations of patients to treatments for testing the efficacy of several experimental groups, while also taking into account ethical considerations. We first consider exponential responses for survival trials and then present a unified framework for heteroscedastic experimental groups that encompasses the general ANOVA set-up. The very good performance of the suggested optimal allocations, in terms of both inferential and ethical characteristics, is illustrated analytically and through several numerical examples, including comparisons with other designs proposed in the literature. Part II concerns the planning of experiments for processes composed of multiple steps in the context of preclinical drug development and manufacturing. Following the Quality by Design paradigm, the objective of the multi-step design strategy is the definition of the manufacturing design space of the whole process; because we consider the interactions among the subsequent steps, our proposal ensures the quality and safety of the final product, enabling more flexibility and process robustness in manufacturing.
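For heteroscedastic groups, a classical Neyman-type benchmark allocates patients in proportion to each group's standard deviation; the thesis derives its own, ethically constrained allocations, so this is only a reference point, and the standard deviations below are hypothetical:

```python
def neyman_allocation(sigmas):
    """Allocation proportions for heteroscedastic groups:
    w_i proportional to sigma_i (Neyman-type benchmark)."""
    total = sum(sigmas)
    return [s / total for s in sigmas]

# Hypothetical standard deviations for a control and two experimental arms.
w = neyman_allocation([2.0, 1.0, 1.0])
```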

Relevance:

30.00%

Publisher:

Abstract:

Network monitoring is of paramount importance for effective network management: it allows operators to constantly observe the network’s behavior to ensure it is working as intended, and can trigger both automated and manual remediation procedures in case of failures and anomalies. The concept of SDN decouples the control logic from legacy network infrastructure to perform centralized control over multiple switches in the network; in this context, the switches’ only responsibility is to forward packets according to the flow control instructions provided by the controller. However, as current SDN switches expose only simple per-port and per-flow counters, the controller has to do almost all the processing to determine the network state, which causes significant communication overhead and excessive latency for monitoring purposes. The absence of programmability in the SDN data plane prompted the advent of programmable switches, which allow developers to customize the data-plane pipeline and implement novel programs operating directly in the switches. This means that certain monitoring tasks can be offloaded to programmable data planes, enabling fine-grained monitoring even at very high packet processing speeds. Given the central importance of network monitoring exploiting programmable data planes, the goal of this thesis is to enable a wide range of monitoring tasks in programmable switches, with a specific focus on those equipped with programmable ASICs. Indeed, most network monitoring solutions available in the literature do not take the computational and memory constraints of programmable switches into due account, preventing, de facto, their implementation in commodity switches. This thesis shows that such network monitoring tasks can nonetheless be executed in programmable switches.
Our evaluations show that the contributions in this thesis could be used by network administrators as well as network security engineers to better understand network status through different monitoring metrics, and thus prevent network infrastructure and service outages.
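One classic example of a monitoring task that fits data-plane constraints is approximate per-flow counting with a count-min sketch, which needs only a small fixed amount of memory plus hashing and counter increments per packet; this is an illustrative Python sketch of the data structure, not the thesis's switch implementation:

```python
import hashlib

class CountMin:
    """Count-min sketch: compact per-flow counters of the kind that fit
    the limited SRAM of programmable switch ASICs."""
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.rows = [[0] * width for _ in range(depth)]

    def _idx(self, key, row):
        # One independent hash per row, derived via a per-row salt.
        h = hashlib.blake2b(key.encode(), salt=bytes([row] * 8)).digest()
        return int.from_bytes(h[:8], "big") % self.width

    def add(self, key, count=1):
        for r in range(self.depth):
            self.rows[r][self._idx(key, r)] += count

    def query(self, key):
        # Minimum over rows: collisions only inflate counts, never deflate.
        return min(self.rows[r][self._idx(key, r)] for r in range(self.depth))

cm = CountMin()
for _ in range(100):
    cm.add("10.0.0.1->10.0.0.2")
heavy = cm.query("10.0.0.1->10.0.0.2")
```

Because the estimate can only overcount, thresholding `query` results is a safe way to flag heavy-hitter flows in the data plane.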