44 results for Multiple Baseline Design

at Université de Lausanne, Switzerland


Relevance: 90.00%

Abstract:

Questions: A multiple-plot design was developed for permanent vegetation plots. How reliable are the different methods used in this design, and which changes can we measure? Location: Alpine meadows (2430 m a.s.l.) in the Swiss Alps. Methods: Four inventories were obtained from 40 m² plots: four subplots (0.4 m²) with a list of species, two 10 m transects with the point method (50 points on each), one subplot (4 m²) with a list of species and visual cover estimates as percentages, and the complete plot (40 m²) with a list of species and visual estimates in classes. This design was tested by five to seven experienced botanists in three plots. Results: Whatever the sampling size, only 45-63% of the species were seen by all the observers. However, the majority of the overlooked species had cover < 0.1%. Pairs of observers overlooked 10-20% fewer species than single observers. The point method was the best method for cover estimation, but it took much longer than visual cover estimates, and 100 points allowed for the monitoring of only a very limited number of species. The visual estimate as a percentage was more precise than estimates in classes. Working in pairs did not improve the estimates, but one botanist repeating the survey was more reliable than a succession of different observers. Conclusion: Lists of species are insufficient for monitoring. It is necessary to add cover estimates to allow for subsequent interpretations in spite of the overlooked species. The choice of method depends on the available resources: the point method is time-consuming but gives precise data for a limited number of species, while visual estimates are quick but record only large changes in cover. Constant pairs of observers improve the reliability of the records.
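The point-method arithmetic above is simple enough to sketch. In the sketch below the species names and hit counts are hypothetical; the 100-point total matches the two 50-point transects in the design, which is why the smallest detectable cover is 1% and only the more abundant species can be tracked this way.

```python
# Minimal sketch of a point-intercept cover estimate (hypothetical data).

def point_intercept_cover(hits_per_species, total_points):
    """Percent cover per species: hits / total points x 100."""
    return {sp: 100.0 * hits / total_points
            for sp, hits in hits_per_species.items()}

# Two 10 m transects x 50 points = 100 points per plot.
hits = {"Poa alpina": 17, "Carex curvula": 42, "Leontodon helveticus": 3}
cover = point_intercept_cover(hits, total_points=100)
print(cover)
```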

Relevance: 80.00%

Abstract:

BACKGROUND: Many patients with an implantable cardioverter-defibrillator (ICD) have indications for magnetic resonance imaging (MRI). However, MRI is generally contraindicated in ICD patients because of potentially hazardous interactions between the MRI scanner and the ICD system. OBJECTIVE: The purpose of this study was to use preclinical computer modeling, animal studies, and bench and scanner testing to demonstrate the safety of an ICD system developed for 1.5-T whole-body MRI. METHODS: MRI hazards were assessed and mitigated using multiple approaches: design decisions to increase safety and reliability, modeling and simulation to quantify clinical MRI exposure levels, animal studies to quantify the physiologic effects of MRI exposure, and bench testing to evaluate safety margins. RESULTS: Modeling estimated the incidence of a chronic change in pacing capture threshold of >0.5 V and >1.0 V to be less than 1 in 160,000 and less than 1 in 1,000,000 cases, respectively. Modeling also estimated the incidence of unintended cardiac stimulation to be less than 1 in 1,000,000 cases. Animal studies demonstrated no delay in ventricular fibrillation detection and no reduction in ventricular fibrillation amplitude at clinical MRI exposure levels, even with multiple exposures. Bench and scanner testing demonstrated performance and safety against all other MRI-induced hazards. CONCLUSION: A preclinical strategy that includes comprehensive computer modeling, animal studies, and bench and scanner testing predicts that an ICD system developed for the magnetic resonance environment is safe and poses very low risk when exposed to 1.5-T normal operating mode whole-body MRI.

Relevance: 40.00%

Abstract:

BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory, and multimodal brain imaging data of acute ischemic stroke patients in the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we describe the design and methods used to create ASTRAL and present baseline data of our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even in the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institutes of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic strokes and dissections presented with the most severe clinical picture. There was a significant number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-known-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intraarterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-known-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. Of the 1358 patients (78%) who underwent acute computed tomography angiography, 717 (52.8%) had significant abnormalities. Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), 786 (73.6%) showed focal hypoperfusion. CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging data. It is characterized by a high proportion of minor and unknown-onset strokes, short onset-to-admission times for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.

Relevance: 40.00%

Abstract:

BACKGROUND: Multiple electrode aggregometry (MEA) is a point-of-care test evaluating platelet function and the efficacy of platelet inhibitors. In MEA, the electrical impedance of whole blood is measured after addition of a platelet activator; reduced impedance implies platelet dysfunction or the presence of platelet inhibitors. MEA plays an increasingly important role in the management of perioperative platelet dysfunction. In vitro, midazolam, propofol, lidocaine and magnesium have known antiplatelet effects, which may interfere with MEA interpretation. OBJECTIVE: To evaluate the extent to which MEA results are modified in the presence of these drugs. DESIGN: An in-vitro study using blood collected from healthy volunteers. SETTING: Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland, 2010 to 2011. PATIENTS: Twenty healthy volunteers. INTERVENTION: Baseline MEA was measured using four activators: arachidonic acid, ADP, TRAP-6 and collagen. The study drugs were then added in three increasing, clinically relevant concentrations. MAIN OUTCOME MEASURE: MEA was compared with baseline for each study drug. RESULTS: Midazolam, propofol and lidocaine showed no effect on MEA at any concentration. Magnesium at 2.5 mmol/l had a significant effect on the ADP and TRAP tests (31 ± 13 and 96 ± 39 AU, versus 73 ± 21 and 133 ± 28 AU at baseline, respectively), and a less pronounced effect at 1 mmol/l on the ADP test (39 ± 0 AU). CONCLUSION: Midazolam, propofol and lidocaine do not interfere with MEA measurements. In patients treated with high to normal doses of magnesium, MEA results for the ADP and TRAP tests should be interpreted with caution. TRIAL REGISTRATION: ClinicalTrials.gov (no. NCT01454427).

Relevance: 40.00%

Abstract:

OBJECTIVE: Intervention during the pre-psychotic period of illness holds the potential of delaying or even preventing the onset of a full-threshold disorder, or at least of reducing the impact of such a disorder if it does develop. The first step in realizing this aim was achieved more than 10 years ago with the development and validation of criteria for the identification of young people at ultra-high risk (UHR) of psychosis. Results of three clinical trials have been published that provide mixed support for the effectiveness of psychological and pharmacological interventions in preventing the onset of psychotic disorder. METHOD: The present paper describes a fourth study, now undertaken, in which young people who met UHR criteria were randomized to one of three treatment groups: cognitive therapy plus risperidone (CogTher + Risp: n = 43), cognitive therapy plus placebo (CogTher + Placebo: n = 44), and supportive counselling plus placebo (Supp + Placebo: n = 28). A fourth group of young people who did not agree to randomization was also followed up (monitoring: n = 78). Baseline characteristics of the participants are provided. RESULTS AND CONCLUSION: The present study improves on the previous studies in that treatment was provided for 12 months, allowing the independent contributions of psychological and pharmacological treatments in preventing transition to psychosis in the UHR cohort, and their effects on levels of psychopathology and functioning, to be directly compared. Issues associated with recruitment and randomization are discussed.

Relevance: 40.00%

Abstract:

OBJECTIVE: To compare the effects of sodium bicarbonate and lactate buffers for continuous veno-venous hemodiafiltration (CVVHDF) in critically ill patients. DESIGN AND SETTING: Prospective crossover controlled trial in the surgical and medical ICUs of a university hospital. PATIENTS: Eight patients with multiple organ dysfunction syndrome (MODS) requiring CVVHDF. INTERVENTION: Each patient received the two buffers in a randomized sequence over two consecutive days. MEASUREMENTS AND RESULTS: The following variables were determined: acid-base parameters, lactate production and utilization (¹³C-lactate infusion), glucose turnover (6,6-²H₂-glucose), and gas exchange (indirect calorimetry). No side effects were observed during lactate administration. Baseline arterial acid-base variables were equal with the two buffers. Arterial lactate (2.9 versus 1.5 mmol/l), glycemia (+18%) and glucose turnover (+23%) were higher in the lactate period. Bicarbonate and glucose losses in CVVHDF were substantial, but lactate elimination was not. Infusing ¹³C-lactate increased plasma lactate levels equally with the two buffers. Lactate clearance (7.8±0.8 vs 7.5±0.8 ml/kg per min in the bicarbonate and lactate periods) and endogenous production rates (14.0±2.6 vs 13.6±2.6 mmol/kg per min) were similar. ¹³C-lactate was used as a metabolic substrate, as shown by ¹³CO₂ excretion. Glycemia and metabolic rate increased significantly and similarly in the two periods during lactate infusion. CONCLUSION: Lactate was rapidly cleared from the blood of critically ill patients without acute liver failure requiring CVVHDF, being transformed into glucose or oxidized. Lactate did not exert undesirable effects apart from moderate hyperglycemia, and achieved effects on acid-base balance comparable to bicarbonate.

Relevance: 40.00%

Abstract:

Quantitative information from magnetic resonance imaging (MRI) may substantiate clinical findings and provide additional insight into the mechanism of clinical interventions in therapeutic stroke trials. The PERFORM study is exploring the efficacy of terutroban versus aspirin for secondary prevention in patients with a history of ischemic stroke. We report on the design of an exploratory longitudinal MRI follow-up study that was performed in a subgroup of the PERFORM trial. An international multi-centre longitudinal follow-up MRI study was designed for different MR systems, employing safety and efficacy readouts: new T2 lesions, new DWI lesions, whole-brain volume change, hippocampal volume change, changes in tissue microstructure as depicted by mean diffusivity and fractional anisotropy, vessel patency on MR angiography, and the presence and development of microbleeds. A total of 1,056 patients (men and women ≥ 55 years) were included. The data analysis included 3D reformation, image registration of different contrasts, tissue segmentation, and automated lesion detection. This large international multi-centre study demonstrates how new MRI readouts can be used to provide key information on the evolution of cerebral tissue lesions and of the macrovasculature after atherothrombotic stroke in a large sample of patients.

Relevance: 40.00%

Abstract:

In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays, and measures should be taken to limit potential variation in results. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has previously been published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant, taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA whole-blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory, where they are stored in ultra-low-temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit for purpose and robust.

Relevance: 40.00%

Abstract:

PURPOSE: To investigate the mechanism(s) of resistance to the RAF inhibitor vemurafenib, we conducted a comprehensive analysis of the genetic alterations occurring in metastatic lesions from a patient with a BRAF(V600E)-mutant cutaneous melanoma who, after a first response, underwent rechallenge with this drug. EXPERIMENTAL DESIGN: We obtained blood and tissue samples from a patient diagnosed with a BRAF(V600E)-mutant cutaneous melanoma who was treated with vemurafenib and achieved a near-complete response. At progression, he received additional lines of chemo/immunotherapy and was successfully rechallenged with vemurafenib. Exome and RNA sequencing were conducted on a pretreatment tumor and two subcutaneous resistant metastases: one that was present at baseline and had previously responded to vemurafenib (PV1), and one that arose de novo after reintroduction of the drug (PV2). A culture established from PV1 was also analyzed. RESULTS: We identified two NRAS-activating somatic mutations, Q61R and Q61K, affecting two main subpopulations in metastasis PV1, and an alternative BRAF splicing involving exons 4-10 in metastasis PV2. These alterations, known to confer resistance to RAF inhibitors, were tumor specific and mutually exclusive, and were not detected in pretreatment tumor samples. In addition, the oncogenic PIK3CA(H1047R) mutation was detected in a subpopulation of PV1, but this mutation did not seem to play a major role in vemurafenib resistance in this metastasis. CONCLUSIONS: This work describes the coexistence within the same patient of different molecular mechanisms of resistance to vemurafenib affecting different metastatic sites. These findings have direct implications for the clinical management of BRAF-mutant melanoma. Clin Cancer Res; 19(20); 5749-57. ©2013 AACR.

Relevance: 30.00%

Abstract:

STUDY DESIGN: Prospective, controlled, observational outcome study using clinical, radiographic, and patient/physician-based questionnaire data, with patient outcomes at 12 months of follow-up. OBJECTIVE: To validate appropriateness criteria for low back surgery. SUMMARY OF BACKGROUND DATA: Most surgical treatment failures are attributed to poor patient selection, but no widely accepted consensus exists on detailed indications for appropriate surgery. METHODS: Appropriateness criteria for low back surgery were developed by a multispecialty panel using the RAND appropriateness method. Based on the panel criteria, a prospective study compared outcomes of patients appropriately and inappropriately treated at a single institution, with assessment at 12 months of follow-up. Included were patients with low back pain and/or sciatica referred to the neurosurgical department. Information about symptoms, neurologic signs, health-related quality of life (SF-36), disability status (Roland-Morris), and pain intensity (VAS) was assessed at baseline and at 6 and 12 months of follow-up. The appropriateness criteria were applied prospectively to each clinical situation, outside of the clinical setting, with the surgeon and patients blinded to the results of the panel decision. The patients were then stratified into two groups: appropriate treatment group (ATG) and inappropriate treatment group (ITG). RESULTS: Overall, 398 patients completed all forms at 12 months. Treatment was considered appropriate for 365 participants and inappropriate for 33 participants. The mean improvement in the SF-36 physical component score at 12 months was significantly higher in the ATG (mean: 12.3 points) than in the ITG (mean: 6.8 points) (P = 0.01), as was the mean improvement in the SF-36 mental component score (ATG mean: 5.0 points; ITG mean: -0.5 points) (P = 0.02). Improvement was also significantly higher in the ATG for mean VAS back pain (ATG mean: 2.3 points; ITG mean: 0.8 points; P = 0.02) and the Roland-Morris disability score (ATG mean: 7.7 points; ITG mean: 4.2 points; P = 0.004). The ATG also had a greater improvement in mean VAS for sciatica (4.0 points) than the ITG (2.8 points), but the difference was not significant (P = 0.08). The SF-36 General Health score declined in both groups after 12 months; however, the decline was worse in the ITG (mean decline: 8.2 points) than in the ATG (mean decline: 1.2 points) (P = 0.04). Overall, ATG patients had significantly greater improvement at 12 months than ITG patients, both statistically and clinically. CONCLUSION: To our knowledge, our study is the first to assess the utility of appropriateness criteria for low back surgery at 1-year follow-up with multiple outcome dimensions. Our results confirm the hypothesis that application of appropriateness criteria can significantly improve patient outcomes.

Relevance: 30.00%

Abstract:

Background: Infection with EBV and a lack of vitamin D may be important environmental triggers of MS. 1,25-(OH)2D3 mediates a shift of antigen-presenting cells (APC) and CD4+ T cells to a less inflammatory profile. Although CD8+ T cells do express the vitamin D receptor, a direct effect of 1,25-(OH)2D3 on these cells has not been demonstrated until now. Since CD8+ T cells are important immune mediators of the inflammatory response in MS, we examined whether vitamin D directly affects the CD8+ T cell response, and more specifically whether it modulates the EBV-specific CD8+ T cell response. Material and Methods: To explore whether vitamin D status may influence the pattern of the EBV-specific CD8+ T cell response, PBMC of 10 patients with early MS and 10 healthy controls (HC) were stimulated with a pool of immunodominant 8-10-mer peptide epitopes known to elicit CD8+ T cell responses. PBMC were stimulated with this EBV CD8 peptide pool, medium (negative control) or anti-CD3/anti-CD28 beads (positive control). The following assays were performed: ELISPOT to assess the secretion of IFN-gamma by T cells in general; cytometric bead array (CBA) and ELISA to determine which cytokines were released by EBV-specific CD8+ T cells after six days of culture; and intracellular cytokine staining to determine which T cell subtypes secreted given cytokines. To examine whether vitamin D could directly modulate CD8+ T cell immune responses, we depleted CD4+ T cells using negative selection. Results: We found that pre-treatment with vitamin D had an anti-inflammatory action on both EBV-specific CD8+ T cells and CD3/CD28-stimulated T cells: secretion of pro-inflammatory cytokines (IFN-gamma and TNF-alpha) was decreased, whereas secretion of anti-inflammatory cytokines (IL-5 and TGF-beta) was increased. At baseline, CD8+ T cells of early MS patients showed higher secretion of TNF-alpha and lower secretion of IL-5. Addition of vitamin D did not restore these cytokines to the levels observed in HC. Vitamin D-pretreated CD8+ T cells exhibited decreased secretion of IFN-gamma and TNF-alpha, even after depletion of CD4+ T cells from the culture. Conclusion: Vitamin D has a direct anti-inflammatory effect on CD8+ T cells, independently of CD4+ T cells. CD8+ T cells of patients with early MS are less responsive to the anti-inflammatory effect of vitamin D than those of HC, pointing toward an intrinsic dysregulation of CD8+ T cells. The modulation of EBV-specific CD8+ T cells by vitamin D suggests that there may be interplay between these two major environmental factors of MS. This study was supported by a grant from the Swiss National Foundation (PP00P3-124893) and by an unrestricted research grant from Bayer to RDP.

Relevance: 30.00%

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. The distribution of sequence tags in the genome is not uniform, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool for ChIP-Seq data that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome.
We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
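The two analysis ideas discussed in the summary above, filtering artifactual tag pile-ups and unbiased random subsampling of tags, can be sketched minimally as follows. The bin size, fold threshold, and tag positions are hypothetical; real pipelines use more principled thresholds (e.g. Poisson models or curated blacklist regions).

```python
import random
from collections import Counter

def filter_hotspots(tag_positions, bin_size=1000, max_fold=10):
    """Drop tags falling in bins whose count exceeds max_fold x the mean
    bin count -- a crude stand-in for the hot-spot filtering discussed
    above."""
    bins = Counter(pos // bin_size for pos in tag_positions)
    mean = sum(bins.values()) / len(bins)
    keep = {b for b, count in bins.items() if count <= max_fold * mean}
    return [pos for pos in tag_positions if pos // bin_size in keep]

def subsample_tags(tag_positions, n, seed=0):
    """Unbiased random sample of n tags, without replacement."""
    return random.Random(seed).sample(tag_positions, n)

# Hypothetical data: a uniform background plus one artifactual pile-up
# of 500 extra tags at position 500.
tags = list(range(0, 100_000, 100)) + [500] * 500
filtered = filter_hotspots(tags)
print(len(tags), len(filtered))  # -> 1500 990
```

Subsampling the filtered tags at several depths and re-running peak calling on each sample is one simple way to check which conclusions are robust to sequencing depth.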

Relevance: 30.00%

Abstract:

While mobile technologies can provide highly personalized services to mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where a user's behaviors, movements and habits can be linked to their personal identity. In this thesis, I study privacy issues in the mobile context, focusing in particular on the design of an adaptive privacy management system for context-aware mobile devices, and I explore the role of personalization and of control over the user's personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single sign-on solution that uses the user's context information to protect private information on the smartphone. To validate this solution, I first show that a user's context constitutes a unique identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. Following a design science research paradigm, I then implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the roles of personalization and control in this model, and how these two elements interact with privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the tradeoff between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how the user's context information can protect private data, it also suggests new elements for privacy-related business models.

Relevance: 30.00%

Abstract:

DREAM is an initiative that allows researchers to assess how well their methods or approaches can describe and predict networks of interacting molecules [1]. Each year, recently acquired datasets are released to predictors ahead of publication. Researchers typically have about three months to predict the masked data or network of interactions, using any predictive method. Predictions are assessed prior to an annual conference, where the best predictions are unveiled and discussed. Here we present the strategy we used to make a winning prediction for the DREAM3 phosphoproteomics challenge. We used Amelia II, a multiple-imputation software package developed by Gary King, James Honaker and Matthew Blackwell [2] in the context of the social sciences, to predict the 476 out of 4624 measurements that had been masked for the challenge. To choose the best possible multiple-imputation parameters for the challenge, we evaluated how transforming the data and varying the imputation parameters affected the ability to predict additionally masked data. We discuss the accuracy of our findings and show that multiple imputation applied to this dataset is a powerful method for accurately estimating the missing data. We postulate that multiple-imputation methods might become an integral part of experimental design, as a means to achieve cost savings or to increase the number of samples that can be handled for a given cost.
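The parameter-selection idea described above, masking additional known values and scoring how well they are re-predicted, can be sketched as follows. Column-mean imputation is a deliberately simple stand-in for Amelia II's EM-based multiple imputation, and the data layout (rows of floats with `None` for missing) is hypothetical; the evaluation-by-masking loop is the point.

```python
import random

def column_means(rows):
    """Mean of each column, ignoring missing (None) entries."""
    means = []
    for j in range(len(rows[0])):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return means

def masking_error(rows, frac=0.1, seed=0):
    """Hide a fraction of the observed entries, impute them, and return
    the mean absolute error on the hidden values."""
    rng = random.Random(seed)
    observed = [(i, j) for i, row in enumerate(rows)
                for j, v in enumerate(row) if v is not None]
    masked = rng.sample(observed, max(1, int(frac * len(observed))))
    work = [list(row) for row in rows]
    truth = {}
    for i, j in masked:
        truth[(i, j)] = work[i][j]
        work[i][j] = None  # hide the value before imputing
    means = column_means(work)
    return sum(abs(means[j] - truth[(i, j)]) for i, j in masked) / len(masked)
```

Sweeping the data transformation and the imputation parameters while minimizing this masking error is a direct way to pick the settings that best predict additionally masked data, as the authors describe.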

Relevance: 30.00%

Abstract:

Analyzing the relationship between the baseline value and the subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear to non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for analyzing the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, between-individual variability of change, and regression to the mean. Ideally, one relies on multiple repeated measurements at different times, and a linear random-effects model is a standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method, which consists of adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance, provides accurate estimates. The adequacy of a method for assessing the relationship between the baseline value and subsequent change thus depends on the number of data waves, the availability of information on measurement error, and the between-individual variability of change.
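Blomqvist's correction mentioned above is simple enough to verify by simulation. The sketch below uses hypothetical BMI-like numbers, assumes the measurement-error variance is known, and simulates a true slope of zero so that the bias induced by shared measurement error (regression to the mean) and its removal are both visible.

```python
import random

def blomqvist_corrected_slope(x_obs, change_obs, error_var):
    """Regression slope of change on baseline, plus Blomqvist's correction:
    beta = (b_obs + lam) / (1 - lam), lam = error_var / Var(observed baseline)."""
    n = len(x_obs)
    mx = sum(x_obs) / n
    mc = sum(change_obs) / n
    vx = sum((x - mx) ** 2 for x in x_obs) / (n - 1)
    cov = sum((x - mx) * (c - mc) for x, c in zip(x_obs, change_obs)) / (n - 1)
    b_obs = cov / vx
    lam = error_var / vx
    return b_obs, (b_obs + lam) / (1 - lam)

# Simulated data: the true change is unrelated to the true baseline
# (true slope = 0), but measurement error shared between baseline and
# change induces a spurious negative observed slope.
rng = random.Random(42)
n, sd_true, sd_err = 20000, 2.0, 1.0
t1 = [rng.gauss(18.0, sd_true) for _ in range(n)]    # true baseline
t2 = [t + rng.gauss(0.5, 1.0) for t in t1]           # true follow-up
x1 = [t + rng.gauss(0.0, sd_err) for t in t1]        # observed baseline
x2 = [t + rng.gauss(0.0, sd_err) for t in t2]        # observed follow-up
change = [b - a for a, b in zip(x1, x2)]

b_obs, b_corr = blomqvist_corrected_slope(x1, change, sd_err ** 2)
print(f"observed slope {b_obs:.3f}, corrected slope {b_corr:.3f}")
# The observed slope is pulled toward -sd_err^2 / (sd_true^2 + sd_err^2)
# = -0.2, while the corrected slope recovers the true value of ~0.
```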