890 results for distributed meta classifiers


Relevance:

20.00%

Publisher:

Abstract:

We propose a new approach and related indicators for globally distributed software support and development based on a 3-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. With the new performance indicators, based on lead times and their variation under fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
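The indicator scheme described above rests on simple lead-time statistics. A minimal sketch in Python; the function names and sample lead times are illustrative, not data from the case company:

```python
from statistics import mean, median

def lead_time_stats(lead_times_days):
    """Summarize lead times (in days) for a support/development workflow."""
    ordered = sorted(lead_times_days)
    p90 = ordered[int(0.9 * (len(ordered) - 1))]  # simple 90th-percentile pick
    return {"mean": mean(ordered), "median": median(ordered), "p90": p90}

def reduction(before, after):
    """Relative lead-time reduction between two periods, e.g. 0.5 == 50%."""
    return 1 - mean(after) / mean(before)

before = [10, 12, 8, 20, 14, 30, 16]   # hypothetical lead times, period 1
after = [5, 6, 4, 10, 7, 15, 8]        # hypothetical lead times, period 2
print(lead_time_stats(after))
print(f"reduction: {reduction(before, after):.0%}")  # reduction: 50%
```

Tracking the variation (e.g. the spread between median and 90th percentile) alongside the mean is what gives such indicators their control-procedure value.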

Relevance:

20.00%

Publisher:

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange.
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature of secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
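The trusted-third-party solution recalled in the abstract can be illustrated with a toy all-or-nothing mediator. This is a simplified sketch of the idea only, not the thesis's protocol; the class, participant and item names are hypothetical:

```python
class TrustedThirdParty:
    """Toy fair-exchange mediator: items are released only once every
    participant has deposited, so honest parties get all-or-nothing."""

    def __init__(self, participants):
        self.expected = set(participants)
        self.deposits = {}

    def deposit(self, sender, item):
        if sender in self.expected:       # ignore non-participants
            self.deposits[sender] = item

    def settle(self):
        # Safety: release nothing unless everyone has deposited.
        if set(self.deposits) != self.expected:
            return None
        # Each participant receives every other participant's item.
        return {p: {q: it for q, it in self.deposits.items() if q != p}
                for p in self.expected}

ttp = TrustedThirdParty({"alice", "bob"})
ttp.deposit("alice", "item-A")
print(ttp.settle())                    # None: bob has not deposited yet
ttp.deposit("bob", "item-B")
print(ttp.settle()["alice"])           # {'bob': 'item-B'}
```

The all-or-nothing `settle` step captures the safety half of the specification; liveness in the real protocol additionally requires that a settlement eventually happens despite Byzantine participants.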

Relevance:

20.00%

Publisher:

Abstract:

The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged and various parallel and distributed environments have been designed and implemented. Each of the environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best environment for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications.
Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm incorporates the Message Passing Interface (MPI) with threads to provide a methodology to write parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread, so the scheduling does not affect the execution of the parallel application. Performance results show that MPIT achieves considerable improvements over conventional MPI applications.
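The overlap of communication and computation that MPIT obtains with a dedicated communication thread can be mimicked in plain Python; here threads and a queue stand in for MPI, and all names and the toy workload are illustrative:

```python
import queue
import threading

def communicator(outbox, delivered):
    """Dedicated communication thread: drains the outbox while the
    main thread keeps computing (mimics MPIT's overlap of
    communication and computation)."""
    while True:
        msg = outbox.get()
        if msg is None:              # sentinel: shut down
            break
        delivered.append(msg)        # stand-in for an MPI send

outbox, delivered = queue.Queue(), []
t = threading.Thread(target=communicator, args=(outbox, delivered))
t.start()

partial = 0
for i in range(5):                   # "computation" proceeds while comms drain
    partial += i * i
    outbox.put(("partial", i, partial))

outbox.put(None)                     # signal the communication thread to stop
t.join()
print(delivered[-1])                 # ('partial', 4, 30)
```

Because the main loop never blocks on delivery, scheduling decisions (as in MPIT) can also be made inside the communication thread without stalling the computation.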

Relevance:

20.00%

Publisher:

Abstract:

Technological development brings more and more complex systems to the consumer markets. The time required for bringing a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power, and distributed simulation can be used to meet these demands. Distributed simulation, however, has its own problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, communication protocols, partitioning of the problem, distribution of the problem, capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capability in the form of idle workstations. The use of this computing power for distributed simulation requires the simulation to adapt to a changing load situation, which in turn requires all or part of the simulation work to be removed from a workstation when the owner wishes to use it again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
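The kind of load-balancing decision described can be sketched with a simple one-shot threshold rule; this is a generic illustration, not Diworse's actual algorithm, and the hostnames, loads and threshold are hypothetical:

```python
def rebalance(assignments, loads, threshold):
    """One-shot greedy pass: move simulation partitions off any
    workstation whose external load exceeds `threshold`, onto the
    least-loaded workstation (target loads are not updated, so this
    is deliberately simplistic)."""
    moved = dict(assignments)
    for part, host in assignments.items():
        if loads[host] > threshold:
            target = min(loads, key=loads.get)
            if loads[target] <= threshold:
                moved[part] = target
    return moved

# The owner has returned to ws1, so its external load jumped to 0.9.
loads = {"ws1": 0.9, "ws2": 0.1, "ws3": 0.2}
assignments = {"p0": "ws1", "p1": "ws2", "p2": "ws3"}
print(rebalance(assignments, loads, 0.5))  # {'p0': 'ws2', 'p1': 'ws2', 'p2': 'ws3'}
```

A real system would also account for the cost of migrating partition state, which is one reason different load-balancing approaches trade off differently.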

Relevance:

20.00%

Publisher:

Abstract:

Results of 14 randomized controlled trials of acupuncture for chronic pain were pooled in a meta-analysis and analysed in three subgroups according to site of pain, and in two subgroups each according to type of trial, type of treatment, type of control, 'blindness' of participating agents, trial size, and type of journal in which results were published. While few individual trials had statistically significant results, pooled results of many subgroups attained statistical significance in favour of acupuncture. Various potential sources of bias, including problems with blindness, precluded a conclusive finding, although most results apparently favoured acupuncture.
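The pooling step in such a meta-analysis is commonly fixed-effect inverse-variance weighting, which is why pooled subgroups can reach significance even when individual trials do not. A sketch with hypothetical trial effects, not the acupuncture data:

```python
from math import sqrt

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: each trial is weighted
    by 1/variance; returns the pooled estimate and its standard error."""
    weights = [1 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = sqrt(1 / sum(weights))
    return est, se

# Hypothetical effect sizes (e.g. log-odds ratios) from three small trials
effects = [0.4, 0.1, 0.3]
variances = [0.04, 0.09, 0.02]
est, se = pooled_effect(effects, variances)
print(f"pooled = {est:.3f}, 95% CI = ({est - 1.96*se:.3f}, {est + 1.96*se:.3f})")
```

The pooled standard error shrinks as precise trials are added, so the combined confidence interval can exclude zero even when every single-trial interval includes it.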

Relevance:

20.00%

Publisher:

Abstract:

Previously, a single nucleotide polymorphism (SNP), rs9939609, in the FTO gene showed a much stronger association with all-cause mortality than expected from its association with body mass index (BMI), body fat mass index (FMI) and waist circumference (WC). This finding implies that the SNP has strong pleiotropic effects on adiposity and on adiposity-independent pathological pathways that lead to increased mortality. To investigate this further, we conducted a meta-analysis of similar data from 34 longitudinal studies including 169,551 adult Caucasians, among whom 27,100 died during follow-up. Linear regression showed that the minor allele of the FTO SNP was associated with greater BMI (n = 169,551; 0.32 kg m⁻²; 95% CI 0.28-0.32, P < 1 × 10⁻³²), WC (n = 152,631; 0.76 cm; 0.68-0.84, P < 1 × 10⁻³²) and FMI (n = 48,192; 0.17 kg m⁻²; 0.13-0.22, P = 1.0 × 10⁻¹³). Cox proportional hazard regression analyses for mortality showed that the hazard ratio (HR) for the minor allele of the FTO SNP was 1.02 (1.00-1.04, P = 0.097), but the apparent excess risk was eliminated after adjustment for BMI and WC (HR: 1.00; 0.98-1.03, P = 0.662) and for FMI (HR: 1.00; 0.96-1.04, P = 0.932). In conclusion, this study does not support that the FTO SNP is associated with all-cause mortality independently of the adiposity phenotypes.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the variability of bond strength test results of adhesive systems (AS) and to correlate the results with clinical parameters from clinical studies investigating cervical restorations. MATERIALS AND METHODS: Regarding the clinical studies, the internal database which had previously been used for a meta-analysis on cervical restorations was updated with clinical studies published between 2008 and 2012 by searching the PubMed and SCOPUS databases. PubMed and the International Association for Dental Research abstracts online were searched for laboratory studies on microtensile, macrotensile and macroshear bond strength tests. The inclusion criteria were (1) dentin, (2) testing of at least four adhesive systems, (3) same diameter of composite and (4) 24 h of water storage prior to testing. The clinical outcome variables were retention loss, marginal discoloration, detectable margins, and a clinical index comprising the three parameters by weighting them. Linear mixed models which included a random study effect were calculated for both the laboratory and the clinical studies. The variability was assessed by calculating a ratio of variances, dividing the variance among the estimated bonding effects obtained in the linear mixed models by the sum of all variance components estimated in these models. RESULTS: Thirty-two laboratory studies fulfilled the inclusion criteria, comprising 183 experiments. Of those, 86 used the microtensile test evaluating 22 adhesive systems (AS). Twenty-seven used the macrotensile test with 17 AS, and 70 used the macroshear test with 24 AS. For 28 AS the results from clinical studies were available. Microtensile and macrotensile results were moderately correlated (Spearman rho = 0.66, p = 0.007), as were microtensile and macroshear (Spearman rho = 0.51, p = 0.03), but not macroshear and macrotensile (Spearman rho = 0.34, p = 0.22). The effect of the adhesive system was significant for microtensile and macroshear (p < 0.001) but not for macrotensile.
The effect of the adhesive system could explain 36% of the variability of the microtensile test, 27% of the macrotensile and 33% of the macroshear test. For the clinical trials, about 49% of the variability of retained restorations could be explained by the adhesive system. With respect to the correlation between bond strength tests and clinical parameters, only a moderate correlation between micro- and macrotensile test results and marginal discoloration was demonstrated. However, no correlation between these tests and retention loss or marginal integrity was shown. The correlation improved when more studies were included compared to assessing only one study. SIGNIFICANCE: The high variability of bond strength test results highlights the need to establish individual acceptance levels for a given test institute. The weak correlation of bond-strength test results with clinical parameters leads to the conclusion that one should not rely solely on bond strength tests to predict the clinical performance of an adhesive system, but should also conduct other laboratory tests, such as tests of the marginal adaptation of fillings in extracted teeth and of the retention loss of restorations in non-retentive cavities after artificial aging.
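The variance ratio described in the methods can be written out directly. The variance components below are hypothetical numbers chosen only to reproduce a 36% share like the microtensile figure, not values from the study:

```python
def explained_fraction(effect_variance, other_components):
    """Fraction of total variability attributable to the adhesive system:
    var(effects) / (var(effects) + sum of remaining variance components),
    as estimated from a linear mixed model's variance components."""
    return effect_variance / (effect_variance + sum(other_components))

# Hypothetical components: adhesive-effect variance 9.0; study-level and
# residual variances 12.0 and 4.0.
print(round(explained_fraction(9.0, [12.0, 4.0]), 2))  # 0.36
```

The closer this fraction is to 1, the more the test discriminates between adhesives rather than between studies or specimens.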

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Pneumonia is the leading cause of death in young children in developing countries, but early diagnosis and intervention can effectively reduce mortality. We aimed to assess the diagnostic value of clinical signs and symptoms to identify radiological pneumonia in children younger than 5 years and to review the accuracy of WHO criteria for diagnosis of clinical pneumonia. METHODS: We searched Medline (PubMed), Embase (Ovid), the Cochrane Database of Systematic Reviews, and reference lists of relevant studies, without date restrictions, to identify articles assessing clinical predictors of radiological pneumonia in children. Selection was based on: design (diagnostic accuracy studies), target disease (pneumonia), participants (children aged <5 years), setting (ambulatory or hospital care), index test (clinical features), and reference standard (chest radiography). Quality assessment was based on the 2011 Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) criteria. For each index test, we calculated sensitivity and specificity and, when the tests were assessed in four or more studies, calculated pooled estimates using a bivariate model and hierarchical summary receiver operating characteristic plots for meta-analysis. FINDINGS: We included 18 articles in our analysis. The WHO-approved signs of age-related fast breathing (six studies; pooled sensitivity 0·62, 95% CI 0·26-0·89; specificity 0·59, 0·29-0·84) and lower chest wall indrawing (four studies; 0·48, 0·16-0·82; 0·72, 0·47-0·89) showed poor diagnostic performance in the meta-analysis. Features with the highest pooled positive likelihood ratios were respiratory rate higher than 50 breaths per min (1·90, 1·45-2·48), grunting (1·78, 1·10-2·88), chest indrawing (1·76, 0·86-3·58), and nasal flaring (1·75, 1·20-2·56).
Features with the lowest pooled negative likelihood ratios were cough (0·30, 0·09-0·96), history of fever (0·53, 0·41-0·69), and respiratory rate higher than 40 breaths per min (0·43, 0·23-0·83). INTERPRETATION: No single clinical feature was sufficient to diagnose pneumonia definitively. Combining clinical features in a decision tree might improve diagnostic performance, but the addition of new point-of-care tests for the diagnosis of bacterial pneumonia would help to attain an acceptable level of accuracy. FUNDING: Swiss National Science Foundation.
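Likelihood ratios follow directly from sensitivity and specificity. A quick sketch using the abstract's pooled values for age-related fast breathing; note that the study's published pooled LRs came from a joint bivariate model, so they need not equal this naive point calculation:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic sign:
    LR+ = sens / (1 - spec), LR- = (1 - sens) / spec."""
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

# Pooled values for age-related fast breathing from the abstract
lr_pos, lr_neg = likelihood_ratios(0.62, 0.59)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")  # LR+ = 1.51, LR- = 0.64
```

An LR+ barely above 1 and an LR- well above 0.1, as here, is exactly what "poor diagnostic performance" means: the sign shifts the post-test probability only slightly in either direction.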

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: (1) To assess the outcomes of minimally invasive simple prostatectomy (MISP) for the treatment of symptomatic benign prostatic hyperplasia in men with large prostates and (2) to compare them with open simple prostatectomy (OSP). METHODS: A systematic review of outcomes of MISP for benign prostatic hyperplasia with meta-analysis was conducted. The article selection process was conducted according to the PRISMA guidelines. RESULTS: Twenty-seven observational studies with 764 patients were analyzed. The mean prostate volume was 113.5 ml (95 % CI 106-121). The mean increase in Qmax was 14.3 ml/s (95 % CI 13.1-15.6), and the mean improvement in IPSS was 17.2 (95 % CI 15.2-19.2). Mean duration of operation was 141 min (95 % CI 124-159), and the mean intraoperative blood loss was 284 ml (95 % CI 243-325). One hundred and four patients (13.6 %) developed a surgical complication. In comparative studies, length of hospital stay (WMD -1.6 days, p = 0.02), length of catheter use (WMD -1.3 days, p = 0.04) and estimated blood loss (WMD -187 ml, p = 0.015) were significantly lower in the MISP group, while the duration of operation was longer than in OSP (WMD 37.8 min, p < 0.0001). There were no differences in improvements in Qmax, IPSS and perioperative complications between the two procedures. The small study sizes, publication bias, lack of systematic complication reporting and short follow-up are limitations. CONCLUSIONS: MISP seems to be an effective and safe treatment option. It provides improvements in Qmax and IPSS similar to those of OSP. Despite taking longer, it results in less blood loss and a shorter hospital stay. Prospective randomized studies comparing OSP, MISP and laser enucleation are needed to define the standard surgical treatment for large prostates.

Relevance:

20.00%

Publisher:

Abstract:

The control of the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are kept independent of negotiation processes and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
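Message integrity and sender authentication between agents can be sketched with an HMAC over the message body. This is a generic illustration of the standard technique, not the paper's actual mechanism; the shared key and agent names are hypothetical:

```python
import hashlib
import hmac
import json

def sign(key, payload):
    """Serialize a protocol message deterministically and attach an HMAC
    tag so the receiving agent can verify integrity and sender identity."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return body, tag

def verify(key, body, tag):
    """Constant-time check that `tag` matches `body` under the shared key."""
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

shared_key = b"db-broker-key"   # hypothetical pre-shared agent key
body, tag = sign(shared_key, {"agent": "db_broker", "step": "fetch_history"})
print(verify(shared_key, body, tag))           # True
print(verify(shared_key, body + b"x", tag))    # False: tampered message
```

Privacy would additionally require encrypting `body`; an HMAC alone only proves who sent the message and that it was not altered.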

Relevance:

20.00%

Publisher:

Abstract:

IMPORTANCE: Cerebral amyloid-β aggregation is an early pathological event in Alzheimer disease (AD), starting decades before dementia onset. Estimates of the prevalence of amyloid pathology in persons without dementia are needed to understand the development of AD and to design prevention studies. OBJECTIVE: To use individual participant data meta-analysis to estimate the prevalence of amyloid pathology as measured with biomarkers in participants with normal cognition, subjective cognitive impairment (SCI), or mild cognitive impairment (MCI). DATA SOURCES: Relevant biomarker studies identified by searching studies published before April 2015 using the MEDLINE and Web of Science databases and through personal communication with investigators. STUDY SELECTION: Studies were included if they provided individual participant data for participants without dementia and used an a priori defined cutoff for amyloid positivity. DATA EXTRACTION AND SYNTHESIS: Individual records were provided for 2914 participants with normal cognition, 697 with SCI, and 3972 with MCI aged 18 to 100 years from 55 studies. MAIN OUTCOMES AND MEASURES: Prevalence of amyloid pathology on positron emission tomography or in cerebrospinal fluid according to AD risk factors (age, apolipoprotein E [APOE] genotype, sex, and education) estimated by generalized estimating equations. RESULTS: The prevalence of amyloid pathology increased from age 50 to 90 years from 10% (95% CI, 8%-13%) to 44% (95% CI, 37%-51%) among participants with normal cognition; from 12% (95% CI, 8%-18%) to 43% (95% CI, 32%-55%) among patients with SCI; and from 27% (95% CI, 23%-32%) to 71% (95% CI, 66%-76%) among patients with MCI. APOE-ε4 carriers had 2 to 3 times higher prevalence estimates than noncarriers. 
The age at which 15% of the participants with normal cognition were amyloid positive was approximately 40 years for APOE ε4ε4 carriers, 50 years for ε2ε4 carriers, 55 years for ε3ε4 carriers, 65 years for ε3ε3 carriers, and 95 years for ε2ε3 carriers. Amyloid positivity was more common in highly educated participants but not associated with sex or biomarker modality. CONCLUSIONS AND RELEVANCE: Among persons without dementia, the prevalence of cerebral amyloid pathology as determined by positron emission tomography or cerebrospinal fluid findings was associated with age, APOE genotype, and presence of cognitive impairment. These findings suggest a 20- to 30-year interval between first development of amyloid positivity and onset of dementia.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: This is the first meta-analysis on the efficacy of composite resin restorations in anterior teeth. The objective of the present meta-analysis was to verify whether specific material classes, tooth conditioning methods and operational procedures influence the result for Class III and Class IV restorations. MATERIAL AND METHODS: The SCOPUS and PubMed databases were searched for clinical trials on anterior resin composites without restricting the search to the year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) minimum number of restorations at last recall = 20; (3) report on drop-out rate; (4) report of operative technique and materials used in the trial; and (5) utilization of Ryge or modified Ryge evaluation criteria. For the statistical analysis, a linear mixed model was used with random effects to account for the heterogeneity between the studies. p-Values smaller than 0.05 were considered significant. RESULTS: Of the 84 clinical trials, 21 studies met the inclusion criteria, 14 of them for Class III restorations, 6 for Class IV restorations and 1 for closure of diastemata; the latter was included in the Class IV group. Twelve of the 21 studies started before 1991 and 18 before 2001. The estimated median overall success rate (without replacement) after 10 years for Class III composite resin restorations was 95% and for Class IV restorations 90%. The main reason for the replacement of Class IV restorations was bulk fracture, which occurred significantly more frequently with microfilled composites than with hybrid and macrofilled composites. Caries adjacent to restorations was infrequent in most studies and accounted for only about 2.5% of all replaced restorations after 10 years, irrespective of the cavity class. Class III restorations with glass ionomer derivatives suffered significantly more loss of anatomical form than did fillings with other types of material.
When the enamel was acid-etched and no bonding agent was applied, significantly more restorations showed marginal staining and detectable margins compared to enamel etching with enamel bonding or the total-etch technique; fillings with self-etching systems fell between these two outcomes. Bevelling of the enamel was associated with a significantly reduced deterioration of the anatomical form compared to no bevelling, but not with less marginal staining or fewer detectable margins. The type of isolation (absolute/relative) had a statistically significant influence on marginal caries which, however, might be a random finding.

Relevance:

20.00%

Publisher:

Abstract:

Coffee, a major dietary source of caffeine, is among the most widely consumed beverages in the world and has received considerable attention regarding health risks and benefits. We conducted a genome-wide (GW) meta-analysis of predominantly regular-type coffee consumption (cups per day) among up to 91,462 coffee consumers of European ancestry, with top single-nucleotide polymorphisms (SNPs) followed up in ~30,062 and 7,964 coffee consumers of European and African-American ancestry, respectively. Studies from both stages were combined in a trans-ethnic meta-analysis. Confirmed loci were examined for putative functional and biological relevance. Eight loci, including six novel loci, met GW significance (log₁₀ Bayes factor (BF) > 5.64) with per-allele effect sizes of 0.03-0.14 cups per day. Six are located in or near genes potentially involved in pharmacokinetics (ABCG2, AHR, POR and CYP1A2) and pharmacodynamics (BDNF and SLC6A4) of caffeine. Two map to the GCKR and MLXIPL genes, which are related to metabolic traits but lack known roles in coffee consumption. Enhancer and promoter histone marks populate the regions of many confirmed loci, and several potential regulatory SNPs are highly correlated with the lead SNP of each. SNP alleles near GCKR, MLXIPL, BDNF and CYP1A2 that were associated with higher coffee consumption have previously been associated with smoking initiation, higher adiposity and fasting insulin and glucose, but lower blood pressure and favorable lipid, inflammatory and liver enzyme profiles (P < 5 × 10⁻⁸). Our genetic findings among European and African-American adults reinforce the role of caffeine in mediating habitual coffee consumption and may point to molecular mechanisms underlying inter-individual variability in the pharmacological and health effects of coffee.