910 results for Multiple criteria analysis
Abstract:
The surrounding capsule of Streptococcus pneumoniae has been identified as a major virulence factor and is targeted by pneumococcal conjugate vaccines (PCV). However, nonencapsulated Streptococcus pneumoniae (Non-Ec-Sp) have also been isolated globally, mainly in carriage studies. It is unknown whether Non-Ec-Sp evolve sporadically, whether they have high antibiotic non-susceptibility rates, and whether they have a unique, specific gene content. Here, whole genome sequencing of 131 Non-Ec-Sp isolates sourced from 17 different locations around the world was performed. Results revealed a deep-branching classic lineage that is distinct from multiple sporadic lineages. The sporadic lineages clustered with a previously sequenced, global collection of encapsulated S. pneumoniae (Ec-Sp) isolates, while the classic lineage consists mainly of the frequently identified multi-locus sequence types ST344 (n=39) and ST448 (n=40). All ST344 and nine ST448 isolates had high non-susceptibility rates to β-lactams and other antimicrobials. Analysis of the accessory genome revealed that the classic Non-Ec-Sp contained a higher number of mobile elements than Ec-Sp and sporadic Non-Ec-Sp. Adherence assays with selected classic and sporadic Non-Ec-Sp revealed that the presence of an integrative conjugative element (ICE) results in increased adherence to human epithelial cells (P=0.005). In contrast, sporadic Non-Ec-Sp lacking the ICE showed greater growth in vitro, possibly resulting in improved fitness. In conclusion, Non-Ec-Sp isolates from the classic lineage have evolved separately. They have spread globally, are well adapted to nasopharyngeal carriage and are able to coexist with Ec-Sp. Due to the continued use of pneumococcal conjugate vaccines, Non-Ec-Sp may become more prevalent.
Abstract:
BACKGROUND Retinal optical coherence tomography (OCT) permits quantification of retinal layer atrophy relevant to the assessment of neurodegeneration in multiple sclerosis (MS). Measurement artefacts may limit the use of OCT in MS research. OBJECTIVE An expert task force convened with the aim of providing guidance on the use of validated quality control (QC) criteria for OCT in MS research and clinical trials. METHODS A prospective multi-centre (n = 13) study. Peripapillary ring scan QC rating of an OCT training set (n = 50) was followed by a test set (n = 50). Inter-rater agreement was calculated using kappa statistics. Results were discussed at a round table after the assessment had taken place. RESULTS The inter-rater QC agreement was substantial (kappa = 0.7). Disagreement was highest for judging signal strength (kappa = 0.40). Future steps to resolve these issues were discussed. CONCLUSION Substantial agreement for QC assessment was achieved with the aid of the OSCAR-IB criteria. The task force has developed a website for free online training and QC certification. The criteria may prove useful for future research and trials in MS using OCT as a secondary outcome measure in a multi-centre setting.
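The agreement statistic used here is Cohen's kappa, which corrects the observed agreement between raters for the agreement expected by chance. A minimal sketch of the two-rater computation with hypothetical pass/fail ratings (the study itself involved many raters across 13 centres, for which a multi-rater generalization such as Fleiss' kappa would apply):

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b, categories):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    # Observed agreement: fraction of scans both raters labelled identically.
    p_o = np.mean(a == b)
    # Chance agreement: product of each rater's marginal category frequencies.
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical QC ratings ("pass"/"fail") of 10 ring scans by two raters.
rater1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
rater2 = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
print(cohens_kappa(rater1, rater2, ["pass", "fail"]))  # ≈ 0.47 for these data
```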
Abstract:
Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences or patterns of family formation. However, to date it remains unclear how to handle the problems inevitably caused by missing values in such analyses. Multiple Imputation (MI) offers a possible solution to this problem, but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analysis using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition that examines the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
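Optimal matching scores the dissimilarity between two categorical state sequences as an edit distance computed by dynamic programming. A minimal sketch with uniform, hypothetical costs (applied work, e.g. with the TraMineR package in R, typically uses data-driven substitution cost matrices):

```python
import numpy as np

def optimal_matching_distance(seq_a, seq_b, sub_cost=2.0, indel_cost=1.0):
    """Optimal matching: edit distance between state sequences with
    substitution and insertion/deletion costs (dynamic programming)."""
    n, m = len(seq_a), len(seq_b)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1) * indel_cost
    d[0, :] = np.arange(m + 1) * indel_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub_cost
            d[i, j] = min(d[i - 1, j - 1] + sub,      # substitute
                          d[i - 1, j] + indel_cost,   # delete
                          d[i, j - 1] + indel_cost)   # insert
    return d[n, m]

# Hypothetical monthly states in a school-to-work transition:
# E = education, W = work, U = unemployment.
print(optimal_matching_distance("EEEWWW", "EEUWWW"))  # 2.0 (one substitution)
```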
Abstract:
OBJECTIVE To assess the 5-year survival of metal-ceramic and all-ceramic tooth-supported fixed dental prostheses (FDPs) and to describe the incidence of biological, technical and esthetic complications. METHODS Medline (PubMed), Embase and Cochrane Central Register of Controlled Trials (CENTRAL) searches (2006-2013) were performed for clinical studies focusing on tooth-supported FDPs with a mean follow-up of at least 3 years. This was complemented by an additional hand search and the inclusion of 10 studies from a previous systematic review [1]. Survival and complication rates were analyzed using robust Poisson regression models to obtain summary estimates of 5-year proportions. RESULTS Forty studies reporting on 1796 metal-ceramic and 1110 all-ceramic FDPs fulfilled the inclusion criteria. Meta-analysis of the included studies indicated an estimated 5-year survival rate of metal-ceramic FDPs of 94.4% (95% CI: 91.2-96.5%). The estimated survival rate of reinforced glass ceramic FDPs was 89.1% (95% CI: 80.4-94.0%), the survival rate of glass-infiltrated alumina FDPs was 86.2% (95% CI: 69.3-94.2%) and the survival rate of densely sintered zirconia FDPs was 90.4% (95% CI: 84.8-94.0%) after 5 years of function. Even though the survival rates of all-ceramic FDPs were lower than that of metal-ceramic FDPs, the differences did not reach statistical significance except for the glass-infiltrated alumina FDPs (p=0.05). A significantly higher incidence of caries in abutment teeth was observed for densely sintered zirconia FDPs compared to metal-ceramic FDPs. Significantly more framework fractures were reported for reinforced glass ceramic FDPs (8.0%) and glass-infiltrated alumina FDPs (12.9%) compared to metal-ceramic FDPs (0.6%) and densely sintered zirconia FDPs (1.9%) after 5 years of function. However, the incidence of ceramic fractures and loss of retention was significantly higher (p=0.018 and p=0.028, respectively) for densely sintered zirconia FDPs compared to all other types of FDPs. CONCLUSIONS Survival rates of all types of all-ceramic FDPs were lower than those reported for metal-ceramic FDPs. The incidence of framework fractures was significantly higher for reinforced glass ceramic and glass-infiltrated alumina FDPs, and the incidence of ceramic fractures and loss of retention was significantly higher for densely sintered zirconia FDPs compared to metal-ceramic FDPs.
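Poisson models of this kind rest on a constant-hazard assumption, under which an annual event rate λ maps to a 5-year survival of exp(-5λ). A minimal sketch of that conversion, using the 94.4% metal-ceramic figure from the abstract (the implied rate is computed here for illustration, not reported in the abstract):

```python
import math

def five_year_survival(rate_per_100_years):
    """Constant-hazard (Poisson) assumption: S(T) = exp(-T * rate).
    The rate is expressed per 100 prosthesis-years."""
    return math.exp(-5.0 * rate_per_100_years / 100.0)

# Invert the reported 5-year survival of metal-ceramic FDPs (94.4%)
# to the implied failure rate per 100 FDP-years.
implied_rate = -math.log(0.944) / 5.0 * 100.0
print(implied_rate)                      # ≈ 1.15 failures per 100 FDP-years
print(five_year_survival(implied_rate))  # ≈ 0.944, recovering the estimate
```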
Abstract:
BACKGROUND Non-steroidal anti-inflammatory drugs (NSAIDs) are the backbone of osteoarthritis pain management. We aimed to assess the effectiveness of different preparations and doses of NSAIDs on osteoarthritis pain in a network meta-analysis. METHODS For this network meta-analysis, we considered randomised trials comparing any of the following interventions: NSAIDs, paracetamol, or placebo, for the treatment of osteoarthritis pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the reference lists of relevant articles for trials published between Jan 1, 1980, and Feb 24, 2015, with at least 100 patients per group. The prespecified primary and secondary outcomes were pain and physical function, and were extracted in duplicate for up to seven timepoints after the start of treatment. We used an extension of multivariable Bayesian random effects models for mixed multiple treatment comparisons with a random effect at the level of trials. For the primary analysis, a random walk of first order was used to account for multiple follow-up outcome data within a trial. Preparations with different total daily doses were considered separately in the analysis. To assess a potential dose-response relation, we used preparation-specific covariates, assuming linearity on log relative dose. FINDINGS We identified 8973 manuscripts from our search, of which 74 randomised trials with a total of 58 556 patients were included in this analysis. 23 nodes concerning seven different NSAIDs or paracetamol with specific daily doses of administration, or placebo, were considered. All preparations, irrespective of dose, improved point estimates of pain symptoms when compared with placebo. For six interventions (diclofenac 150 mg/day; etoricoxib 30 mg/day, 60 mg/day, and 90 mg/day; and rofecoxib 25 mg/day and 50 mg/day), the probability that the difference to placebo is at or below a prespecified minimum clinically important effect for pain reduction (effect size [ES] -0·37) was at least 95%. Among maximally approved daily doses, diclofenac 150 mg/day (ES -0·57, 95% credibility interval [CrI] -0·69 to -0·46) and etoricoxib 60 mg/day (ES -0·58, -0·73 to -0·43) had the highest probability of being the best intervention, both with 100% probability of reaching the minimum clinically important difference. Treatment effects increased as drug dose increased, but the corresponding tests for a linear dose effect were significant only for celecoxib (p=0·030), diclofenac (p=0·031), and naproxen (p=0·026). We found no evidence that treatment effects varied over the duration of treatment. Model fit was good, and between-trial heterogeneity and inconsistency were low in all analyses. All trials were deemed to have a low risk of bias for blinding of patients. Effect estimates did not change in sensitivity analyses with two additional statistical models and accounting for methodological quality criteria in meta-regression analysis. INTERPRETATION On the basis of the available data, we see no role for single-agent paracetamol, irrespective of dose, in the treatment of patients with osteoarthritis. We provide sound evidence that diclofenac 150 mg/day is the most effective NSAID available at present in terms of improving both pain and function. Nevertheless, in view of the safety profile of these drugs, physicians need to consider our results together with all known safety information when selecting the preparation and dose for individual patients.
FUNDING Swiss National Science Foundation (grant number 405340-104762) and Arco Foundation, Switzerland.
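The probabilities quoted in the findings above (e.g. "at least 95%") are posterior tail probabilities: the share of posterior draws of the effect size falling at or below the minimum clinically important effect. A minimal sketch with hypothetical Gaussian draws standing in for the actual MCMC output (the location -0.57 echoes the diclofenac 150 mg/day point estimate; the spread is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws of the effect size versus placebo for one
# preparation; in the real analysis these come from the Bayesian network
# meta-analysis (MCMC samples).
posterior_es = rng.normal(loc=-0.57, scale=0.06, size=10_000)

MCID = -0.37  # prespecified minimum clinically important effect size

# Probability that the effect is at or below the MCID threshold.
print(np.mean(posterior_es <= MCID))  # close to 1.0 for these draws
```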
Abstract:
Background: Multiple True-False-Items (MTF-Items) might offer some advantages over one-best-answer questions (TypeA), as they allow more than one correct answer and may better represent clinical decisions. However, in medical education assessment, MTF-Items are seldom used. Summary of Work: In this literature review, existing findings on MTF-Items and on TypeA were compared along the Ottawa Criteria for Good Assessment, i.e. (1) reproducibility, (2) feasibility, (3) validity, (4) acceptance, (5) educational effect, (6) catalytic effects, and (7) equivalence. We conducted a literature search on ERIC and Google Scholar including papers from the years 1935 to 2014. We used the search terms “multiple true-false”, “true-false”, “true/false”, and “Kprim” combined with “exam”, “test”, and “assessment”. Summary of Results: We included 29 out of 33 studies. Four of them were carried out in the medical field. Compared to TypeA, MTF-Items are associated with (1) higher reproducibility, (2) lower feasibility, (3) similar validity, (4) higher acceptance, and (5) a higher educational effect; no studies were found on (6) catalytic effects or (7) equivalence. Discussion and Conclusions: While studies show overall good characteristics of MTF-Items according to the Ottawa criteria, this type of question seems to be rather seldom used. One reason might be the reported lower feasibility. Overall, the literature base is still weak. Furthermore, only 14% of the literature is from the medical domain. Further studies to better understand the characteristics of MTF-Items in the medical domain are warranted. Take-home messages: Overall, the literature base is weak and further studies are therefore needed. Existing studies show that MTF-Items have higher reliability, acceptance and educational effect, but are more difficult to produce.
Abstract:
In this dissertation, the cytogenetic characteristics of bone marrow cells from 41 multiple myeloma patients were investigated. These cytogenetic data were correlated with the total DNA content as measured by flow cytometry. Both the cytogenetic information and the DNA content were then correlated with clinical data to determine whether the diagnosis and prognosis of multiple myeloma could be improved. One hundred percent of the patients demonstrated abnormal chromosome numbers per metaphase. The average chromosome number per metaphase ranged from 42 to 49.9, with a mean of 44.99. The percent hypodiploidy ranged from 0-100% and the percent hyperdiploidy from 0-53%. Detailed cytogenetic analyses were very difficult to perform because of the paucity of mitotic figures and the poor chromosome morphology; thus, detailed chromosome banding analysis on these patients was impossible. Thirty-seven percent of the patients had normal total DNA content, whereas 63% had abnormal amounts of DNA (one patient with less than normal amounts and 25 patients with greater than normal amounts of DNA). Several clinical parameters were used in the statistical analyses: tumor burden, patient status at biopsy, patient response status, past therapy, type of treatment and percent plasma cells. Statistically significant correlations were found only among these clinical parameters: pretreatment tumor burden versus patient response, patient biopsy status versus patient response, and past therapy versus patient response. No correlations were found between percent hypodiploid, diploid, hyperdiploid or DNA content and the patient response status, nor were any found for patients with: (a) normal plasma cells, low pretreatment tumor mass burden and more than 50% of the analyzed metaphases with 46 chromosomes; (b) normal amounts of DNA, low pretreatment tumor mass burden and more than 50% of the metaphases with 46 chromosomes; (c) normal amounts of DNA and normal quantities of plasma cells; (d) abnormal amounts of DNA, abnormal amounts of plasma cells, high pretreatment tumor mass burden and less than 50% of the metaphases with 46 chromosomes. Technical drawbacks of both cytogenetic and DNA content analysis in these multiple myeloma patients are discussed, along with the lack of correlation between DNA content and chromosome number. Refined chromosome banding analysis awaits technical improvements before we can understand which chromosomal material (if any) makes up the "extra" amounts of DNA in these patients. None of the correlations tested can be used as diagnostic or prognostic aids for multiple myeloma.
Abstract:
Objective: In this secondary data analysis, three statistical methodologies were implemented to handle cases with missing data in a motivational interviewing and feedback study. The aim was to evaluate the impact that these methodologies have on the data analysis. Methods: We first evaluated whether the assumption of missing completely at random held for this study. We then conducted a secondary data analysis using a mixed linear model to handle missing data with three methodologies: (a) complete case analysis, (b) multiple imputation with an explicit model containing the outcome variables, time, and the interaction of time and treatment, and (c) multiple imputation with an explicit model containing the outcome variables, time, the interaction of time and treatment, and additional covariates (e.g., age, gender, smoking status, years in school, marital status, housing, race/ethnicity, and whether participants play on an athletic team). Several comparisons were conducted, including the following: 1) the motivational interviewing with feedback group (MIF) vs. the assessment only group (AO), the motivational interviewing group (MIO) vs. AO, and the feedback only group (FBO) vs. AO; 2) MIF vs. FBO; and 3) MIF vs. MIO. Results: We first evaluated the patterns of missingness in this study: about 13% of participants showed monotone missing patterns, and about 3.5% showed non-monotone missing patterns. We then evaluated the assumption of missing completely at random using Little's missing completely at random (MCAR) test, which yielded a Chi-Square test statistic of 167.8 with 125 degrees of freedom and an associated p-value of p=0.006, indicating that the data could not be assumed to be missing completely at random. After that, we compared whether the three different strategies reached the same results. For the comparison between MIF and AO, as well as the comparison between MIF and FBO, only the multiple imputation with additional covariates reached different results under uncongenial and congenial models. For the comparison between MIF and MIO, all the methodologies for handling missing values obtained different results. Discussion: The study indicated, first, that missingness was crucial in this study. Second, understanding the assumptions of the model was important, since we could not identify whether the data were missing at random or missing not at random. Future research should therefore focus on exploring further sensitivity analyses under the missing not at random assumption.
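The reported p-value follows directly from the chi-square reference distribution of Little's test statistic; a minimal check of that arithmetic:

```python
from scipy import stats

# Little's MCAR test statistic is asymptotically chi-square distributed.
# The abstract reports chi2 = 167.8 with 125 degrees of freedom.
chi2_stat, df = 167.8, 125
p_value = stats.chi2.sf(chi2_stat, df)  # upper tail probability
print(f"p = {p_value:.3f}")  # ≈ 0.006, matching the reported value
```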
Abstract:
Satellites and space equipment are exposed to diffuse acoustic fields during launch. The use of adequate techniques to model the response to acoustic loads is a fundamental task during the design and verification phases, and the modal density of each element must be considered to identify the correct methodology. In this paper, selection criteria are presented for choosing the correct modelling technique depending on the frequency range. The response of a satellite model to acoustic loads is presented, determining the modal densities of each component in different frequency ranges. The paper proposes how to select the mathematical method for each modal density range and examines the differences in the response estimates due to the different techniques used. In addition, methodologies to analyse the intermediate frequency range of the system are discussed. The results are compared with data obtained in an experimental modal test.
Abstract:
Steam Generator Tube Rupture (SGTR) sequences in Pressurized Water Reactors are known to be among the most demanding transients for the operating crew. SGTR sequences are a special kind of transient: they can lead to radiological releases without core damage or containment failure, since they constitute a direct path from the reactor coolant system to the environment. The first methodology used to perform the Deterministic Safety Analysis (DSA) of an SGTR did not credit operator action for the first 30 min of the transient, assuming that the operating crew was able to stop the primary-to-secondary leakage within that period of time. However, the real SGTR accident cases that have occurred in the USA and around the world demonstrated that operators usually take more than 30 min to stop the leakage in actual sequences. Several methodologies have been proposed to overcome this limitation by crediting operator actions from the beginning of the transient, as is done in Probabilistic Safety Analysis. This paper presents the results of comparing different assumptions regarding the single failure criterion and the operator actions taken from the most common methodologies included in different Deterministic Safety Analyses. A single failure criterion that has not previously been analysed in the literature is also proposed and analysed here. The comparison is performed with a Westinghouse three-loop PWR model (Almaraz NPP) in the TRACE code, using best estimate assumptions but including deterministic hypotheses such as the single failure criterion or loss of offsite power. The behaviour of the reactor is quite diverse depending on the assumptions made regarding the operator actions. On the other hand, although the hypotheses include strong conservatisms, such as the single failure criterion, all the results are quite far from the regulatory limits. In addition, some improvements to the Emergency Operating Procedures to minimize the offsite release from the damaged SG in case of an SGTR are outlined, taking into account the offsite dose sensitivity results.
Abstract:
The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. The technique was demonstrated using experimental data obtained on the osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on the experimental data to obtain the individual objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were then used simultaneously to choose the best alternative among the set of non-dominated solutions. The proposed multi-criteria optimization and decision-making technique can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent and adequate final compromise solution or food process. It can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
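In the AHP step, criteria weights are derived as the principal eigenvector of a pairwise comparison matrix. A minimal sketch with a hypothetical 3×3 judgment matrix (the study's actual criteria comparisons are not given in the abstract):

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three criteria
# (say, water loss vs. solute gain vs. colour): entry [i, j] encodes how
# much more important criterion i is than criterion j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The AHP priority (weight) vector is the principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()
print(weights)  # criteria weights summing to 1

# Consistency check: CR = ((lambda_max - n) / (n - 1)) / RI, with RI = 0.58
# for n = 3; judgments are conventionally acceptable when CR < 0.1.
n = A.shape[0]
CR = ((eigvals.real[k] - n) / (n - 1)) / 0.58
print(CR)
```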
Abstract:
In Operational Modal Analysis (OMA) of a structure, the data acquisition process may be repeated many times. In these cases, the analyst has several similar records for the modal analysis of the structure, obtained at different time instants (multiple records). The solution obtained varies from one record to another, sometimes considerably. The differences are due to several reasons: statistical estimation errors, changes in the external (unmeasured) forces that modify the output spectra, the appearance of spurious modes, etc. Combining the results of the different individual analyses is not straightforward. To solve the problem, we propose to estimate the parameters jointly using all the records. This can be done in a very simple way using state space models and computing the estimates by maximum likelihood. The method provides a single result for the modal parameters that optimally combines all the records.
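Because independent records are assumed to share the same underlying model, the joint log-likelihood is simply the sum of the per-record log-likelihoods, each evaluated with a Kalman filter. A minimal sketch with a toy scalar state-space model standing in for a full modal model (the AR(1)-plus-noise parameterization and the simulated data are assumptions for illustration, not the paper's):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, records):
    """Negative joint log-likelihood of an AR(1)-plus-noise state-space
    model: the per-record Kalman-filter log-likelihoods are summed."""
    a, q, r = params  # transition coeff., state noise var., meas. noise var.
    if not (abs(a) < 1 and q > 0 and r > 0):
        return np.inf  # keep the optimizer inside the valid region
    total = 0.0
    for y in records:
        x, P = 0.0, q / (1 - a**2)  # stationary prior for the first step
        for obs in y:               # scalar Kalman filter
            S = P + r               # innovation variance
            total += -0.5 * (np.log(2 * np.pi * S) + (obs - x) ** 2 / S)
            K = P / S               # Kalman gain
            x = a * (x + K * (obs - x))    # filter update, then predict
            P = a**2 * (1 - K) * P + q
    return -total

# Simulate a few placeholder records from one common model.
rng = np.random.default_rng(1)
def simulate(n, a=0.8, q=0.5, r=1.0):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal(scale=np.sqrt(q))
    return x + rng.normal(scale=np.sqrt(r), size=n)

records = [simulate(300) for _ in range(5)]
res = minimize(neg_loglik, x0=[0.5, 1.0, 1.0], args=(records,),
               method="Nelder-Mead")
print(res.x)  # a single joint estimate of (a, q, r) from all five records
```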