950 results for Statistical mixture-design optimization
Abstract:
Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed.

Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells.

Data transformations such as log, square root and 1/x did not yield normality as measured by the Shapiro-Wilk test for normality. A modulus transformation, used for distributions with abnormal kurtosis values, also did not produce normality.

Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and three variables (DNA concentration, shape and elongation) showed the strongest evidence of bimodality and were studied further.

Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components (1 or 2). These summary measures were used as predictors of disease severity in several proportional odds logistic regression models.
The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+) and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results of changes in the mean levels and proportions of the components at the lower severity levels.
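The two-component, equal-variance mixture fit with a bootstrapped log-likelihood ratio described above can be sketched as follows. This is a minimal illustration (an EM fit plus a parametric bootstrap of the likelihood-ratio statistic), not the dissertation's actual code; iteration counts, bootstrap replicates and the simulated data are illustrative.

```python
import numpy as np
from scipy.stats import norm

def em_two_gaussian(x, iters=200, tol=1e-8):
    """EM fit of a 1-D two-component Gaussian mixture with equal variances."""
    x = np.asarray(x, float)
    mu = np.quantile(x, [0.25, 0.75])      # crude starting means
    sigma, pi = x.std(), 0.5
    ll_old = -np.inf
    for _ in range(iters):
        # E-step: responsibility of component 1 for each cell
        p1 = pi * norm.pdf(x, mu[0], sigma)
        p2 = (1.0 - pi) * norm.pdf(x, mu[1], sigma)
        tot = p1 + p2
        r = p1 / tot
        # M-step: update mixing proportion, means and the common variance
        pi = r.mean()
        mu = np.array([(r * x).sum() / r.sum(),
                       ((1 - r) * x).sum() / (1 - r).sum()])
        sigma = np.sqrt((r * (x - mu[0]) ** 2
                         + (1 - r) * (x - mu[1]) ** 2).mean())
        ll = np.log(tot).sum()
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, mu, sigma, ll

def bootstrap_lrt(x, n_boot=200, seed=0):
    """Parametric bootstrap of 2*(logL2 - logL1) for 1 vs. 2 components."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    ll1 = norm.logpdf(x, x.mean(), x.std()).sum()   # single-Gaussian null
    stat = 2.0 * (em_two_gaussian(x)[3] - ll1)
    null = []
    for _ in range(n_boot):
        xb = rng.normal(x.mean(), x.std(), x.size)  # simulate under the null
        ll1b = norm.logpdf(xb, xb.mean(), xb.std()).sum()
        null.append(2.0 * (em_two_gaussian(xb)[3] - ll1b))
    p = (np.sum(np.asarray(null) >= stat) + 1) / (n_boot + 1)
    return stat, p
```

The bootstrap is needed because the usual chi-square reference distribution for the likelihood ratio does not apply when testing the number of mixture components.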
Abstract:
Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients.

The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving "abdominal pain" serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims to cover theoretical as well as practical aspects of the system developed.

The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
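The AHP, which the ANP generalizes, derives priority weights as the principal eigenvector of a pairwise-comparison matrix. A minimal sketch of Saaty's eigenvector method and consistency ratio follows; the example matrix of judgments is hypothetical, not taken from the abdominal-pain case study.

```python
import numpy as np

def ahp_priorities(A):
    """Priority weights: principal right eigenvector of the
    pairwise-comparison matrix, normalized to sum to one."""
    A = np.asarray(A, float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(A):
    """Saaty's CR = CI / RI; judgments with CR < 0.1 are usually accepted."""
    A = np.asarray(A, float)
    n = A.shape[0]
    if n <= 2:
        return 0.0   # 1x1 and 2x2 matrices are always consistent
    lam = np.max(np.linalg.eigvals(A).real)
    ci = (lam - n) / (n - 1)
    ri = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49][n - 1]
    return ci / ri

# hypothetical pairwise judgments over three candidate diagnoses
A = [[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]]
w = ahp_priorities(A)
```

In the ANP, such local priority vectors become columns of a supermatrix whose limit powers capture dependence and feedback among clusters.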
Abstract:
Offset printing is a common method to produce large amounts of printed matter. We consider a real-world offset printing process that is used to imprint customer-specific designs on napkin pouches. The printing technology used yields a number of specific constraints. The planning problem consists of allocating designs to printing-plate slots such that the given customer demand for each design is fulfilled, all technological and organizational constraints are met and the total overproduction and setup costs are minimized. We formulate this planning problem as a mixed-binary linear program, and we develop a multi-pass matching-based savings heuristic. We report computational results for a set of problem instances devised from real-world data.
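A toy version of the slot-allocation idea can illustrate the trade-off the paper optimizes: each design occupies at least one slot on a plate that is printed R times, producing slots × R copies per design, and overproduction is whatever exceeds demand. The greedy rule below (give each spare slot to the design that currently forces the longest run) is a simple illustration, not the paper's mixed-binary program or its matching-based savings heuristic; all numbers are hypothetical.

```python
import math

def allocate_slots(demands, n_slots):
    """Heuristic slot allocation for a single printing plate:
    design i gets slots[i] >= 1 slots, the plate is printed `run` times,
    yielding slots[i] * run copies; minimize total overproduction."""
    n = len(demands)
    assert n <= n_slots, "need at least one slot per design"
    slots = [1] * n
    # assign each spare slot to the design with the largest demand per slot,
    # i.e. the one that currently dictates the run length
    for _ in range(n_slots - n):
        i = max(range(n), key=lambda j: demands[j] / slots[j])
        slots[i] += 1
    run = max(math.ceil(d / k) for d, k in zip(demands, slots))
    over = sum(k * run - d for d, k in zip(demands, slots))
    return slots, run, over

# hypothetical demands for three designs on a 7-slot plate
slots, run, over = allocate_slots([800, 400, 200], 7)
```

When demands are proportional to the slot counts, as here, the heuristic achieves zero overproduction; the real problem adds setup costs and technological constraints that make an exact mixed-binary formulation necessary.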
Abstract:
PURPOSE Confidence intervals (CIs) are integral to the interpretation of the precision and clinical relevance of research findings. The aim of this study was to ascertain the frequency of reporting of CIs in leading prosthodontic and dental implantology journals and to explore possible factors associated with improved reporting. MATERIALS AND METHODS Thirty issues of nine journals in prosthodontics and implant dentistry were accessed, covering the years 2005 to 2012: The Journal of Prosthetic Dentistry, Journal of Oral Rehabilitation, The International Journal of Prosthodontics, The International Journal of Periodontics & Restorative Dentistry, Clinical Oral Implants Research, Clinical Implant Dentistry and Related Research, The International Journal of Oral & Maxillofacial Implants, Implant Dentistry, and Journal of Dentistry. Articles were screened and the reporting of CIs and P values recorded. Other information, including study design, region of authorship, involvement of methodologists, and ethical approval, was also obtained. Univariable and multivariable logistic regression were used to identify characteristics associated with reporting of CIs. RESULTS Interrater agreement for the data extraction was excellent (kappa = 0.88; 95% CI: 0.87 to 0.89). CI reporting was limited, with mean reporting across journals of 14%. CI reporting was associated with journal type, study design, and involvement of a methodologist or statistician. CONCLUSIONS Reporting of CIs in implant dentistry and prosthodontic journals requires improvement. Improved reporting will aid appraisal of the clinical relevance of research findings by providing a range of values within which the effect size lies, giving the end user the opportunity to interpret the results in relation to clinical practice.
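A CI of the kind whose reporting the study audits can be computed in a few lines. This is a minimal normal-approximation (Wald) sketch; the counts are hypothetical, chosen only to echo the 14% reporting rate mentioned above.

```python
import math
from statistics import NormalDist

def wald_ci(estimate, se, level=0.95):
    """Two-sided normal-approximation (Wald) confidence interval."""
    z = NormalDist().inv_cdf(0.5 + level / 2)   # ~1.96 for 95%
    return estimate - z * se, estimate + z * se

def proportion_ci(successes, n, level=0.95):
    """Wald CI for a proportion, e.g. the share of articles reporting CIs."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return wald_ci(p, se, level)

# hypothetical: 14 of 100 screened articles reported CIs
lo, hi = proportion_ci(14, 100)
```

The interval (roughly 7% to 21% here) conveys the precision of the estimate, which is exactly the information lost when only a P value is reported.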
Abstract:
OBJECTIVE The cost-effectiveness of cast nonprecious frameworks has increased their prevalence in cemented implant crowns. The purpose of this study was to assess the effect of the design and height of the retentive component of a standard titanium implant abutment on the fit, possible horizontal rotation, and retention force of cast nonprecious alloy crowns prior to cementation. MATERIALS AND METHODS Two abutment designs were examined: Type A with a 6° taper and 8 antirotation planes (Straumann Tissue-Level RN) and Type B with a 7.5° taper and 1 antirotation plane (SICace implant). Both types were analyzed using 60 crowns: 20 with full abutment height (6 mm), 20 with medium abutment height (4 mm), and 20 with minimal abutment height (2.5 mm). The marginal and internal fit and the degree of possible rotation were evaluated using polyvinylsiloxane impressions under a light microscope (×50 magnification). To measure the retention force, a custom force-measuring device was employed. STATISTICAL ANALYSIS One-sided Wilcoxon rank-sum tests with Bonferroni-Holm corrections, Fisher's exact tests, and Spearman's rank correlation coefficient were used. RESULTS Type A exhibited increased marginal gaps (primary end-point: 55 ± 20 μm vs. 138 ± 59 μm, P < 0.001) but less rotation (P < 0.001) than Type B. The internal fit was also better for Type A than for Type B (P < 0.001). The retention force of Type A (2.49 ± 3.2 N) was higher (P = 0.019) than that of Type B (1.27 ± 0.84 N). Reduction in abutment height did not affect the variables observed. CONCLUSION Less-tapered abutments with more antirotation planes provide an increase in the retention force, which confines the horizontal rotation but widens the marginal gaps of the crowns. Thus, casting of nonprecious crowns with Type A abutments may result in clinically unfavorable marginal gaps.
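The Bonferroni-Holm-corrected rank-sum comparisons used above can be sketched as follows. SciPy's `mannwhitneyu` implements the Wilcoxon rank-sum (Mann-Whitney U) test; the simulated measurements are hypothetical, only loosely echoing the reported group statistics.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def holm_adjust(pvals):
    """Holm step-down adjustment of a family of p-values."""
    p = np.asarray(pvals, float)
    m = p.size
    order = np.argsort(p)
    adj = np.empty(m)
    running = 0.0
    for rank, idx in enumerate(order):
        # multiply the k-th smallest p-value by (m - k + 1),
        # enforcing monotonicity of the adjusted values
        running = max(running, (m - rank) * p[idx])
        adj[idx] = min(1.0, running)
    return adj

# hypothetical marginal-gap samples (um) echoing the reported 55 vs. 138
rng = np.random.default_rng(1)
gaps_a = rng.normal(55, 20, 20)
gaps_b = rng.normal(138, 59, 20)
p_gap = mannwhitneyu(gaps_a, gaps_b, alternative="less").pvalue
```

The Holm procedure controls the family-wise error rate across the three end-points (fit, rotation, retention force) while being uniformly more powerful than a plain Bonferroni correction.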
Abstract:
This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA's "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. The assessment of SBSS in an SST system architecture has shown that both an operational SBSS and a well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. In particular, the early deployment of a demonstrator, made possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing, etc.) until the final products can be offered to the users, with low technological effort and risk. The SBSS system concept takes the ESA SST System Requirements into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are considered. The paper presents details of the system concept, candidate micro-satellite platforms, the instrument design and the operational modes. Note that the detailed results of performance simulations for space debris coverage and cataloguing accuracy are presented in a separate paper, "Capability of a Space-based Space Surveillance System to Detect and Track Objects in GEO, MEO and LEO Orbits" by J. Silha (AIUB) et al., IAC-14, A6, 1.1x25640.
Abstract:
INTRODUCTION Even though arthroplasty of the ankle joint is considered to be an established procedure, only about 1,300 endoprostheses are implanted in Germany annually. Arthrodeses of the ankle joint are performed almost three times more often. This may be due to the availability of the procedure - more than twice as many providers perform arthrodesis - as well as the postulated high frequency of revision procedures of arthroplasties in the literature. In those publications, however, there is often no clear differentiation between revision surgery with exchange of components, subsequent interventions due to complications and subsequent surgery not associated with complications. The German Orthopaedic Foot and Ankle Association's (D. A. F.) registry for total ankle replacement collects data pertaining to perioperative complications as well as cause, nature and extent of the subsequent interventions, and postoperative patient satisfaction. MATERIAL AND METHODS The D. A. F.'s total ankle replacement register is a nation-wide, voluntary registry. After giving written informed consent, the patients can be added to the database by participating providers. Data are collected during hospital stay for surgical treatment, during routine follow-up inspections and in the context of revision surgery. The information can be submitted in paper-based or online formats. The survey instruments are available as minimum data sets or scientific questionnaires which include patient-reported outcome measures (PROMs). The pseudonymous clinical data are collected and evaluated at the Institute for Evaluative Research in Medicine, University of Bern/Switzerland (IEFM). The patient-related data remain on the register's module server in North Rhine-Westphalia, Germany. The registry's methodology as well as the results of the revisions and patient satisfaction for 115 patients with a two year follow-up period are presented. 
Statistical analyses are performed with SAS™ (Version 9.4, SAS Institute, Inc., Cary, NC, USA). RESULTS About 2½ years after the register was launched, there were 621 datasets on primary implantations, 1,427 on follow-ups and 121 records on re-operations available. 49 % of the patients received their implants due to post-traumatic osteoarthritis, 27 % because of a primary osteoarthritis and 15 % of patients suffered from a rheumatic disease. More than 90 % of the primary interventions proceeded without complications. Subsequent interventions were recorded for 84 patients, which corresponds to a rate of 13.5 % with respect to the primary implantations. It should be noted that these secondary procedures also include two-stage procedures not due to a complication. "True revisions" are interventions with exchange of components due to mechanical complications and/or infection and were present in 7.6 % of patients. 415 of the patients commented on their satisfaction with the operative result during the last follow-up: 89.9 % evaluated their outcome as excellent or good, 9.4 % as moderate and only 0.7 % (3 patients) as poor. In these three cases, component loosening or symptomatic subtalar joint (USG) osteoarthritis was present. Two-year follow-up data using the American Orthopaedic Foot and Ankle Society Ankle-Hindfoot Scale (AOFAS-AHS) are already available for 115 patients. The median AOFAS-AHS score increased from 33 points preoperatively to more than 80 points three to six months postoperatively. This increase remained nearly constant over the entire two-year follow-up period. CONCLUSION Covering less than 10 % of the approximately 240 providers in Germany and approximately 12 % of the annually implanted total ankle replacements, the D. A. F. register is still far from being seen as a national registry.
Nevertheless, geographical coverage and inclusion of "high-" (more than 100 total ankle replacements a year) and "low-volume surgeons" (less than 5 total ankle replacements a year) make the register representative for Germany. The registry data show that the number of subsequent interventions and in particular the "true revision" procedures are markedly lower than the 20 % often postulated in the literature. In addition, a high level of patient satisfaction over the short and medium term is recorded. From the perspective of the authors, these results indicate that total ankle arthroplasty - given a correct indication and appropriate selection of patients - is not inferior to an ankle arthrodesis concerning patients' satisfaction and function. First valid survival rates can be expected about 10 years after the register's start.
Abstract:
I introduce the new mgof command to compute distributional tests for discrete (categorical, multinomial) variables. The command supports large-sample tests for complex survey designs and exact tests for small samples, as well as classic large-sample χ²-approximation tests based on Pearson's X², the likelihood ratio, or any other statistic from the power-divergence family (Cressie and Read, 1984, Journal of the Royal Statistical Society, Series B (Methodological) 46: 440–464). The complex survey correction is based on the approach by Rao and Scott (1981, Journal of the American Statistical Association 76: 221–230) and parallels the survey design correction used for independence tests in svy: tabulate. mgof computes the exact tests by using Monte Carlo methods or exhaustive enumeration. mgof also provides an exact one-sample Kolmogorov–Smirnov test for discrete data.
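The power-divergence family and a Monte Carlo "exact" p-value of the kind mgof computes can be illustrated in a few lines. This is a Python sketch, not mgof itself (mgof is a Stata command); the observed and expected counts are hypothetical.

```python
import numpy as np
from scipy.stats import power_divergence

obs = np.array([45, 30, 15, 10])   # hypothetical observed counts
exp = np.array([40, 30, 20, 10])   # hypothesized expected counts

# three members of the power-divergence family (Cressie and Read, 1984)
x2, p_x2 = power_divergence(obs, exp, lambda_=1)     # Pearson's X2
g2, p_g2 = power_divergence(obs, exp, lambda_=0)     # likelihood ratio G2
cr, p_cr = power_divergence(obs, exp, lambda_=2/3)   # Cressie-Read statistic

# Monte Carlo p-value for X2: simulate multinomial samples under the null
rng = np.random.default_rng(0)
sims = rng.multinomial(obs.sum(), exp / exp.sum(), size=2000)
sim_stats = ((sims - exp) ** 2 / exp).sum(axis=1)
p_mc = (np.sum(sim_stats >= x2) + 1) / (2000 + 1)
```

With adequate cell counts, as here, the Monte Carlo p-value closely tracks the large-sample χ² approximation; the Monte Carlo (or exhaustive-enumeration) route matters precisely when the cells are too sparse for the approximation to hold.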
Abstract:
XENON is a dark matter direct detection project, consisting of a time projection chamber (TPC) filled with liquid xenon as the detection medium. The construction of the next-generation detector, XENON1T, is presently taking place at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy. It aims at a sensitivity to spin-independent cross sections of 2 × 10⁻⁴⁷ cm² for WIMP masses around 50 GeV/c², which requires a background reduction by two orders of magnitude compared to XENON100, the current-generation detector. An active system that is able to tag muons and muon-induced backgrounds is critical for this goal. A water Cherenkov detector of ~10 m height and diameter has therefore been developed, equipped with 8-inch photomultipliers and clad with a reflective foil. We present the design and optimization study for this detector, which has been carried out with a series of Monte Carlo simulations. The muon veto will reach very high detection efficiencies for muons (>99.5%) and for showers of secondary particles from muon interactions in the rock (>70%). Similar efficiencies will be obtained for XENONnT, the upgrade of XENON1T, which will later improve the WIMP sensitivity by another order of magnitude. With the Cherenkov water shield studied here, the background from muon-induced neutrons in XENON1T is negligible.
Abstract:
OBJECTIVES The objectives of the present study were to investigate temporal/spectral sound-feature processing in preschool children (4 to 7 years old) with peripheral hearing loss compared with age-matched controls. The results verified the presence of statistical learning, which was diminished in children with hearing impairments (HIs), and elucidated possible perceptual mediators of speech production. DESIGN Perception and production of the syllables /ba/, /da/, /ta/, and /na/ were recorded in 13 children with normal hearing and 13 children with HI. Perception was assessed physiologically through event-related potentials (ERPs) recorded by EEG in a multifeature mismatch negativity paradigm and behaviorally through a discrimination task. Temporal and spectral features of the ERPs during speech perception were analyzed, and speech production was quantitatively evaluated using speech motor maximum performance tasks. RESULTS Proximal to stimulus onset, children with HI displayed a difference in map topography, indicating diminished statistical learning. In later ERP components, children with HI exhibited reduced amplitudes specifically in the N2 and the early parts of the late discriminative negativity components, which are associated with temporal and spectral control mechanisms. Abnormalities of speech perception were only subtly reflected in speech production, as the lone difference found in speech production studies was a mild delay in regulating speech intensity. CONCLUSIONS In addition to previously reported deficits of sound-feature discrimination, the present study results reflect diminished statistical learning in children with HI, which plays an early and important, but so far neglected, role in phonological processing. Furthermore, the lack of corresponding behavioral abnormalities in speech production implies that impaired perceptual capacities do not necessarily translate into productive deficits.
Abstract:
Development of interfaces for sample introduction from high pressures is important for real-time online hyphenation of chromatographic and other separation devices with mass spectrometry (MS) or accelerator mass spectrometry (AMS). Momentum separators can reduce unwanted low-density gases and introduce the analyte into the vacuum. In this work, the axial jet separator, a new momentum interface, is characterized by theory and empirical optimization. The mathematical model describes the different axial penetration of the components of a jet-gas mixture and explains the empirical results for injections of CO2 in helium into MS and AMS instruments. We show that the performance of the new interface is sensitive to the nozzle size, showing good qualitative agreement with the mathematical model. Smaller nozzle sizes are preferable due to their higher inflow capacity. The CO2 transmission efficiency of the interface into an MS instrument is ~14% (CO2/helium separation factor of 2.7). The interface receives and delivers flows of ~17.5 mL/min and ~0.9 mL/min, respectively. For the interfaced AMS instrument, the ionization and overall efficiencies are 0.7-3% and 0.1-0.4%, respectively, for CO2 amounts of 4-0.6 µg C, which is only slightly lower compared to conventional systems using intermediate trapping. The ionization efficiency depends on the carbon mass flow in the injected pulse and is suppressed at high CO2 flows. Relative to a conventional jet separator, the transmission efficiency of the axial jet separator is lower, but its performance is less sensitive to misalignments.
Abstract:
This work deals with parallel optimization of expensive objective functions that are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. As of today, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear to be the most prominent approaches to batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis' formula was recently established. Since the computational burden of this selection rule is still an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which facilitates its maximization using gradient-based ascent algorithms. Substantial computational savings are shown in application. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs relying on UCB with gradient-based EI local optimization finally appears to be a sound option for batch design in distributed Gaussian process optimization.
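For intuition, the single-point (q = 1) Expected Improvement and its gradient with respect to the Gaussian-process posterior parameters have a simple closed form; the multipoint criterion treated in the paper generalizes this through Tallis' formula for truncated multivariate normals. A minimal sketch for minimization, not the paper's multipoint implementation:

```python
from statistics import NormalDist

def expected_improvement(mu, sigma, fmin):
    """Single-point EI at a candidate x for minimization, together with
    its derivatives w.r.t. the GP posterior mean and standard deviation:
    EI = (fmin - mu) * Phi(z) + sigma * phi(z), z = (fmin - mu) / sigma."""
    if sigma <= 0.0:
        return max(fmin - mu, 0.0), 0.0, 0.0
    nd = NormalDist()
    z = (fmin - mu) / sigma
    Phi, phi = nd.cdf(z), nd.pdf(z)
    ei = (fmin - mu) * Phi + sigma * phi
    # the phi-terms cancel, leaving the remarkably simple derivatives
    return ei, -Phi, phi   # EI, dEI/dmu, dEI/dsigma
```

These one-dimensional cancellations are the analogue of the simplifications that make a closed-form multipoint gradient worthwhile: the gradient is cheap once the posterior mean and covariance derivatives are available, enabling gradient-based ascent instead of derivative-free search.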
Abstract:
OBJECTIVES To assess the presence of within-group comparisons with baseline in a subset of leading dental journals and to explore possible associations with a range of study characteristics including journal and study design. STUDY DESIGN AND SETTING Thirty consecutive issues of five leading dental journals were electronically searched. The conduct and reporting of statistical analyses involving comparisons against baseline or otherwise, along with the manner of interpretation of the results, were assessed. Descriptive statistics were obtained, and chi-square and Fisher's exact tests were undertaken to test the association between trial characteristics and overall study interpretation. RESULTS A total of 184 studies were included, with the highest proportion published in Journal of Endodontics (n = 84, 46%) and most involving a single center (n = 157, 85%). Overall, 43 studies (23%) presented interpretation of their outcomes based solely on comparisons against baseline. Inappropriate use of baseline testing was found to be less likely in interventional studies (P < 0.001). CONCLUSION Use of comparisons with baseline appears to be common among both observational and interventional research studies in dentistry. Enhanced conduct and reporting of statistical tests are required to ensure that inferences from research studies are appropriate and informative.
Abstract:
A longitudinal study of three discrete online public access catalog (OPAC) design enhancements examined the possible effects such changes may have on circulation and resource sharing within the automated library consortium environment. Statistical comparisons were made of both circulation and interlibrary loan (ILL) figures from the year before enhancement to the year after implementation. Data from sixteen libraries covering a seven-year period were studied in order to determine the degree to which patrons may or may not utilize increasingly broader OPAC ILL options over time. Results indicated that while ILL totals increased significantly after each OPAC enhancement, such gains did not result in significant corresponding changes in total circulation.
Abstract:
The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. 
Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
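The gamma-type dose comparisons underlying such verifications can be illustrated with a brute-force one-dimensional gamma index. This is a simplified sketch, not the NAT metrics developed in this work; the 3%/2 mm tolerances and the dose profiles are illustrative.

```python
import numpy as np

def gamma_index_1d(ref, evl, spacing, dd=0.03, dta=2.0):
    """Brute-force 1-D gamma index. dd is the dose-difference criterion
    as a fraction of the reference maximum, dta the distance-to-agreement
    criterion in mm, spacing the grid spacing in mm. A point passes when
    gamma <= 1, i.e. some evaluated point lies within the combined
    dose/distance acceptance ellipse."""
    ref = np.asarray(ref, float)
    evl = np.asarray(evl, float)
    xe = np.arange(evl.size) * spacing
    dmax = ref.max()
    gamma = np.empty(ref.size)
    for i in range(ref.size):
        dist2 = ((xe - i * spacing) / dta) ** 2
        dose2 = ((evl - ref[i]) / (dd * dmax)) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

# identical profiles agree exactly; a 1 % scaling still passes 3 %/2 mm
ref = np.array([10.0, 40.0, 100.0, 40.0, 10.0])
g_same = gamma_index_1d(ref, ref, spacing=1.0)
g_scaled = gamma_index_1d(ref, 1.01 * ref, spacing=1.0)
pass_rate = (g_scaled <= 1.0).mean()
```

Summary statistics over such per-pixel values (the percent of pixels failing gamma, or scalar indices like the NAT index described above) are what make 2D film measurements usable as a single pass/fail verification number.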