983 results for Operable Adaptive Diagnostic Scale (OADS)


Relevance: 30.00%

Abstract:

OBJECTIVE To evaluate the resistance of Aedes aegypti to temephos Fersol 1G (temephos 1% w/w) associated with the adaptive disadvantage of insect populations in the absence of selection pressure. METHODS A diagnostic dose of 0.28 mg a.i./L and doses between 0.28 mg a.i./L and 1.40 mg a.i./L were used. Vector populations collected between 2007 and 2008 in the city of Campina Grande, state of Paraíba, were evaluated. To evaluate competition in the absence of selection pressure, insect populations with initial frequencies of 20.0%, 40.0%, 60.0%, and 80.0% resistant individuals were produced and subjected to the diagnostic dose for two months. Evaluation of the development of aquatic and adult stages allowed comparison of the life cycles in susceptible and resistant populations and construction of fertility life tables. RESULTS No mortality was observed in Ae. aegypti populations subjected to the diagnostic dose of 0.28 mg a.i./L. The decreased mortality observed in populations containing 20.0%, 40.0%, 60.0%, and 80.0% resistant insects indicates that temephos resistance is unstable in the absence of selection pressure. A comparison of the life cycles indicated differences in the duration and viability of the larval phase, but no differences were observed in embryo development, sex ratio, adult longevity, and number of eggs per female. CONCLUSIONS The fertility life table results indicated that some populations had reproductive disadvantages compared with the susceptible population in the absence of selection pressure, indicating the presence of a fitness cost in populations resistant to temephos.

Relevance: 30.00%

Abstract:

Even though Software Transactional Memory (STM) is one of the most promising approaches to simplify concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: when only a small fraction of the memory is under contention, lightweight barriers may be used for the rest. In this work, we propose a new solution based on adaptive object metadata (AOM) to promote the use of a fast path for objects that are not under contention. We show that this approach makes the performance of an STM competitive with the best fine-grained lock-based approaches in some of the most challenging benchmarks. (C) 2015 Elsevier Inc. All rights reserved.
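The fast-path idea can be illustrated with a toy sketch (Python rather than the managed-runtime setting of the original work; the class, its fields, and the inflation threshold are all hypothetical): an object starts with no STM metadata and is accessed directly, and metadata is added ("inflated") only once contention is observed, forcing later accesses onto the slow, barrier-protected path.

```python
import threading

class AOMObject:
    """Toy model of adaptive object metadata: no lock object exists
    until the object has been seen under contention."""

    def __init__(self, value):
        self.value = value
        self.lock = None       # no metadata yet => fast path
        self.conflicts = 0

    def read(self):
        if self.lock is None:  # fast path: plain read, no barrier
            return self.value
        with self.lock:        # slow path: barrier-protected read
            return self.value

    def write(self, value, contended=False):
        if contended:
            self.conflicts += 1
            if self.lock is None and self.conflicts >= 2:
                self.lock = threading.Lock()  # inflate metadata
        if self.lock is None:
            self.value = value                # fast path
        else:
            with self.lock:                   # slow path
                self.value = value
```

Uncontended objects never pay for the barrier; once inflated, an object keeps its metadata. This asymmetry, cheap accesses for the common uncontended case, is what the abstract exploits.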

Relevance: 30.00%

Abstract:

BACKGROUND: Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places that conventional endoscopy cannot. However, the output of this technique is an 8-hour video whose analysis by an expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. METHOD: The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. RESULTS: The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path toward a deeper study of the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
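The second-order measures at the heart of the method can be sketched in a few lines, assuming a small integer-valued image: a normalized co-occurrence matrix is built for one pixel displacement, and two classical measures (contrast and energy) are read off it. The function names and the single displacement are illustrative; the paper's full pipeline adds wavelet scales, three color channels, and higher-order statistical moments.

```python
def cooccurrence(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one displacement."""
    h, w = len(img), len(img[0])
    m = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[img[y][x]][img[y2][x2]] += 1
                pairs += 1
    return [[v / pairs for v in row] for row in m]

def contrast(m):
    # second-order measure: entries weighted by squared gray-level difference
    return sum((i - j) ** 2 * m[i][j]
               for i in range(len(m)) for j in range(len(m)))

def energy(m):
    # second-order measure: sum of squared matrix entries
    return sum(v * v for row in m for v in row)
```

A flat image has zero contrast and maximal energy, while a checkerboard maximizes contrast; statistics of such measures over many patches are what feed the neural-network classifier.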

Relevance: 30.00%

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from the data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns; this is a possible limitation of the method for implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs137 activity from measurements taken in the region of Briansk following the Chernobyl accident.
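The core idea of mixing kernels at several spatial scales can be sketched as follows. This is a simplified, hypothetical form: in the paper the mixture is optimized from data, whereas here the scales and weights are fixed illustrative values.

```python
import math

def rbf(x, y, scale):
    """Gaussian RBF kernel in one dimension."""
    return math.exp(-((x - y) ** 2) / (2.0 * scale ** 2))

def multiscale_kernel(x, y, scales=(0.1, 10.0), weights=(0.5, 0.5)):
    """Convex combination of RBF kernels at two spatial scales:
    the short-scale term follows local anomalies, the large-scale
    term captures the regional trend."""
    return sum(w * rbf(x, y, s) for s, w in zip(scales, weights))
```

Any non-negative combination of valid kernels is itself a valid kernel, so such a mixture can be plugged into a standard SVR solver unchanged; learning the weights is what makes the method adaptive to the spatial scale of the data.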

Relevance: 30.00%

Abstract:

We present a novel spatiotemporal-adaptive Multiscale Finite Volume (MsFV) method, based on the natural idea that the global coarse-scale problem has a longer characteristic time than the local fine-scale problems. As a consequence, the global problem can be solved with larger time steps than the local problems. In contrast to the pressure-transport splitting usually employed in the standard MsFV approach, we propose to start directly with a local-global splitting that retains the original degree of coupling locally. This is crucial for highly non-linear systems or in the presence of physical instabilities. To obtain an accurate and efficient algorithm, we devise new adaptive criteria for the global update that are based on changes of coarse-scale quantities rather than fine-scale quantities, as is routinely done in the adaptive MsFV method. By means of a complexity analysis we show that the adaptive approach gives a noticeable speed-up with respect to the standard MsFV algorithm. In particular, it is efficient for large upscaling factors, which is important for multiphysics problems. Based on the observation that local time stepping acts as a smoother, we devise a self-correcting algorithm that incorporates information from previous times to improve the quality of the multiscale approximation. We present results of multiphase flow simulations both for Darcy-scale and multiphysics (hybrid) problems, in which a local pore-scale description is combined with a global Darcy-like description. The novel spatiotemporal-adaptive multiscale method based on the local-global splitting is not limited to porous media flow problems; it can be extended to any system described by a set of conservation equations.
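The adaptive global-update criterion can be caricatured in a few lines: cheap local fine-scale steps run every iteration, while the expensive global coarse-scale solve fires only when a monitored coarse-scale quantity has drifted beyond a tolerance. The driver below is a scalar toy, not the authors' algorithm; all names and the relative-drift test are illustrative.

```python
def adaptive_time_loop(coarse_solve, local_step, q0, n_steps, tol=0.05):
    """Run n_steps local steps; trigger the global solve only when the
    coarse-scale quantity q has changed by more than tol (relative)
    since the last global update."""
    q = q0
    q_global = q0          # value of q at the last global update
    global_updates = 0
    for _ in range(n_steps):
        q = local_step(q)  # local fine-scale step (small dt, cheap)
        if abs(q - q_global) > tol * abs(q_global):
            q = coarse_solve(q)   # global coarse-scale update (expensive)
            q_global = q
            global_updates += 1
    return q, global_updates
```

With a slowly drifting quantity the global problem is solved only a few times per ten local steps; skipping the expensive solve this way is the source of the speed-up the complexity analysis quantifies.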

Relevance: 30.00%

Abstract:

Purpose: To evaluate the diagnostic value and image quality of CT with filtered back projection (FBP) compared with adaptive statistical iterative reconstruction (ASIR) images in body stuffers with ingested cocaine-filled packets. Methods and Materials: Twenty-nine body stuffers (mean age 31.9 years, 3 women) suspected of ingestion of cocaine-filled packets underwent routine-dose 64-row multidetector CT with FBP (120 kV, pitch 1.375, 100-300 mA with automatic tube current modulation (auto mA), rotation time 0.7 sec, collimation 2.5 mm), secondarily reconstructed with 30% and 60% ASIR. In 13 (44.83%) of the body stuffers, cocaine-filled packets were detected and confirmed by exact analysis of the faecal content, including verification of the number (range 1-25). Three radiologists independently and blindly evaluated anonymized CT examinations (29 FBP-CT and 68 ASIR-CT) for the presence and number of cocaine-filled packets, indicating observer confidence, and graded them for diagnostic quality, image noise, and sharpness. Sensitivity, specificity, area under the receiver operating characteristic (ROC) curve Az, and interobserver agreement between the 3 radiologists for FBP-CT and ASIR-CT were calculated. Results: Increasing the percentage of ASIR significantly diminished objective image noise (p < 0.001). Overall sensitivity and specificity for the detection of the cocaine-filled packets were 87.72% and 76.15%, respectively. The difference in ROC area Az between the reconstruction techniques was significant (p = 0.0101): 0.938 for FBP-CT, 0.916 for 30% ASIR-CT, and 0.894 for 60% ASIR-CT. Conclusion: Despite the evident image noise reduction obtained by ASIR, the diagnostic value for detecting cocaine-filled packets decreases with increasing ASIR percentage.

Relevance: 30.00%

Abstract:

The French Analytical Questionnaire of Diagnostic of the Personality (Q.A.P., 1996), which succeeded the characterological questionnaire of Berger (1950), is of interest to psychologists and characterologists because it is based on a very coherent theoretical corpus. The questionnaire is divided into two parts: the first consists of three factor (fundamental) scales that determine the characterological type of individuals; the second is made up of nine complementary scales that characterize the personality of subjects more precisely. We carried out a structural validation of the questionnaire using a large sample (n = 865). Several factor analyses were conducted on both parts of the test. We also performed a reliability analysis of each scale using Cronbach's alpha and a homogeneity analysis of each question. These analyses allowed us to evaluate the instrument and to establish that the factorial structure of the test corresponds to the theoretical one developed by the French school of characterology. The Q.A.P. is globally reliable, in particular its first part, which is very consistent. The second part is less reliable, partly owing to correlations between its scales. We also carried out a correlational analysis between the first part of the Q.A.P., the questionnaire of Berger, and the Rapid Questionnaire of Diagnostic of the Personality (Q.R.D.P., 1996); the Q.A.P. appears to be the most reliable of the three. Finally, we evaluated the impact of gender, age, and profession on the factors of the Q.A.P.
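Cronbach's alpha, used above for the reliability analysis, can be computed directly from raw item scores. The formula is the standard one, alpha = k/(k-1) * (1 - sum of item variances / variance of totals); the helper names are ours.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: `items` holds one list of scores per
    questionnaire item, all of the same length (one score per
    respondent). Sample variance (n-1 denominator) is used."""
    k = len(items)
    n_resp = len(items[0])

    def var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly parallel items give alpha = 1, while weakly or negatively correlated items drive it down, which is how the lower reliability of the second part of the Q.A.P. shows up numerically.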

Relevance: 30.00%

Abstract:

Background: Obesity is a major risk factor for type 2 diabetes mellitus (T2DM). A proper anthropometric characterisation of T2DM risk is essential for disease prevention and clinical risk assessment. Methods: Longitudinal study in 37 733 participants (63% women) of the Spanish EPIC (European Prospective Investigation into Cancer and Nutrition) cohort without prevalent diabetes. Detailed questionnaire information was collected at baseline and anthropometric data were gathered following standard procedures. A total of 2513 verified incident T2DM cases occurred after 12.1 years of mean follow-up. Multivariable Cox regression was used to calculate hazard ratios of T2DM by levels of anthropometric variables. Results: Overall and central obesity were independently associated with T2DM risk. BMI showed the strongest association with T2DM in men, whereas waist-related indices were stronger independent predictors in women. Waist-to-height ratio revealed the largest area under the ROC curve in men and women, with optimal cut-offs at 0.60 and 0.58, respectively. The most discriminative waist circumference (WC) cut-off values were 99.4 cm in men and 90.4 cm in women. Absolute risk of T2DM was higher in men than women for any combination of age, BMI and WC categories, and remained low in normal-waist women. The population risk of T2DM attributable to obesity was 17% in men and 31% in women. Conclusions: Diabetes risk was associated with higher overall and central obesity indices even at normal BMI and WC values. The measurement of waist circumference in the clinical setting is strongly recommended for the evaluation of future T2DM risk in women.
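ROC-optimal cut-offs such as the waist-to-height values above are commonly chosen by maximizing Youden's J (sensitivity + specificity - 1). A small sketch, assuming scores where a higher value means higher presumed risk; the function and data are illustrative, not the EPIC analysis itself.

```python
def youden_cutoff(scores, labels):
    """Return (threshold, J) maximizing Youden's J, scanning each
    observed score as a candidate cut-off (label 1 = case, and a
    subject is test-positive when score >= threshold)."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_c, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= c)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < c)
        j = tp / pos + tn / neg - 1   # sensitivity + specificity - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j
```

Maximizing J weights sensitivity and specificity equally; other clinical trade-offs would shift the chosen cut-off.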

Relevance: 30.00%

Abstract:

INTRODUCTION Obesity is an unfavorable prognostic factor in breast cancer (BC) patients regardless of menopausal status and treatment received. However, the association between obesity and survival outcome by pathological subtype requires further clarification. METHODS We performed a retrospective analysis including 5,683 operable BC patients enrolled in four randomized clinical trials (GEICAM/9906, GEICAM/9805, GEICAM/2003-02, and BCIRG 001) evaluating anthracyclines and taxanes as adjuvant treatments. Our primary aim was to assess the prognostic effect of body mass index (BMI) on disease recurrence, breast cancer mortality (BCM), and overall mortality (OM). A secondary aim was to detect differences of such prognostic effects by subtype. RESULTS Multivariate survival analyses adjusting for age, tumor size, nodal status, menopausal status, surgery type, histological grade, hormone receptor status, human epidermal growth factor receptor 2 (HER2) status, chemotherapy regimen, and under-treatment showed that obese patients (BMI 30.0 to 34.9) had similar prognoses to those of patients with a BMI < 25 (reference group) in terms of recurrence (Hazard Ratio [HR] = 1.08, 95% Confidence Interval [CI] = 0.90 to 1.30), BCM (HR = 1.02, 0.81 to 1.29), and OM (HR = 0.97, 0.78 to 1.19). Patients with severe obesity (BMI ≥ 35) had a significantly increased risk of recurrence (HR = 1.26, 1.00 to 1.59, P = 0.048), BCM (HR = 1.32, 1.00 to 1.74, P = 0.050), and OM (HR = 1.35, 1.06 to 1.71, P = 0.016) compared to our reference group. The prognostic effect of severe obesity did not vary by subtype. CONCLUSIONS Severely obese patients treated with anthracyclines and taxanes present a worse prognosis regarding recurrence, BCM, and OM than patients with BMI < 25. The magnitude of the harmful effect of BMI on survival-related outcomes was similar across subtypes.

Relevance: 30.00%

Abstract:

Major climatic and geological events, but also population history (secondary contacts), have generated cycles of population isolation and connection over long and short periods. Recent empirical and theoretical studies suggest that fast evolutionary processes might be triggered by such events, as commonly illustrated in ecology by the adaptive radiation of cichlid fishes (isolation and reconnection of lakes and watersheds) and in epidemiology by the fast adaptation of the influenza virus (isolation and reconnection in hosts). We test whether cyclic population isolation and connection provide the raw material (standing genetic variation) for species evolution and diversification. Our analytical results demonstrate that population isolation and connection can provide populations with a high excess of genetic diversity compared with what is expected at equilibrium. This excess is either cyclic (high allele turnover) or accumulates with time, depending on the duration of the isolation and connection periods and the mutation rate. We show that diversification rates of animal clades are associated with specific periods of climatic cycles in the Quaternary. We finally discuss the importance of our results for macroevolutionary patterns and for the inference of population history from genomic data.

Relevance: 30.00%

Abstract:

BACKGROUND: The quality of colon cleansing is a major determinant of quality of colonoscopy. To our knowledge, the impact of bowel preparation on the quality of colonoscopy has not been assessed prospectively in a large multicenter study. Therefore, this study assessed the factors that determine colon-cleansing quality and the impact of cleansing quality on the technical performance and diagnostic yield of colonoscopy. METHODS: Twenty-one centers from 11 countries participated in this prospective observational study. Colon-cleansing quality was assessed on a 5-point scale and was categorized on 3 levels. The clinical indication for colonoscopy, diagnoses, and technical parameters related to colonoscopy were recorded. RESULTS: A total of 5832 patients were included in the study (48.7% men, mean age 57.6 [15.9] years). Cleansing quality was lower in elderly patients and in patients in the hospital. Procedures in poorly prepared patients were longer, more difficult, and more often incomplete. The detection of polyps of any size depended on cleansing quality: odds ratio (OR) 1.73; 95% confidence interval (CI) [1.28, 2.36] for intermediate-quality compared with low-quality preparation; and OR 1.46; 95% CI [1.11, 1.93] for high-quality compared with low-quality preparation. For polyps >10 mm in size, corresponding ORs were 1.0 for low-quality cleansing, OR 1.83; 95% CI [1.11, 3.05] for intermediate-quality cleansing, and OR 1.72; 95% CI [1.11, 2.67] for high-quality cleansing. Cancers were not detected less frequently in the case of poor preparation. CONCLUSIONS: Cleansing quality critically determines quality, difficulty, speed, and completeness of colonoscopy, and is lower in hospitalized patients and patients with higher levels of comorbid conditions. The proportion of patients who undergo polypectomy increases with higher cleansing quality, whereas colon cancer detection does not seem to critically depend on the quality of bowel preparation.

Relevance: 30.00%

Abstract:

This PhD thesis addresses the issue of scalable media streaming in large-scale networking environments. Multimedia streaming is one of the largest sinks of network resources, and this trend is still growing, as testified by the success of services like Skype, Netflix, Spotify and Popcorn Time (BitTorrent-based). In traditional client-server solutions, when the number of consumers increases, the server becomes the bottleneck. To overcome this problem, the Content-Delivery Network (CDN) model was invented: the server copies the media content to CDN servers located at strategic points in the network. However, CDNs require heavy infrastructure investment around the world, which is expensive. Peer-to-peer (P2P) solutions are another way to achieve the same result. These solutions are naturally scalable, since each peer can act as both a receiver and a forwarder. Most of the proposed streaming solutions in P2P networks focus on routing scenarios to achieve scalability. However, these solutions cannot work properly for video-on-demand (VoD) streaming when the resources of the media server are not sufficient. Replication is a solution that can be used in these situations. This thesis provides a family of replication-based media streaming protocols that are scalable, efficient and reliable in P2P networks. First, it presents SCALESTREAM, a replication-based streaming protocol that adaptively replicates media content on different peers to increase the number of consumers that can be served in parallel. The adaptiveness of this solution relies on the fact that it takes into account constraints such as the bandwidth capacity of peers to decide when to add or remove replicas. SCALESTREAM routes media blocks to consumers over a tree topology, assuming a reliable network composed of homogeneous peers in terms of bandwidth.
Second, this thesis proposes RESTREAM, an extended version of SCALESTREAM that addresses the issues raised by unreliable networks composed of heterogeneous peers. Third, it proposes EAGLEMACAW, a multiple-tree replication streaming protocol in which two distinct trees, named EAGLETREE and MACAWTREE, are built in a decentralized manner on top of an underlying mesh network. These two trees collaborate to serve consumers efficiently and reliably: the EAGLETREE is in charge of improving efficiency, while the MACAWTREE guarantees reliability. Finally, this thesis provides TURBOSTREAM, a hybrid replication-based streaming protocol in which a tree overlay is built on top of a mesh overlay network. Both overlays cover all peers of the system and collaborate to improve efficiency and reduce latency in streaming media to consumers. This protocol is implemented and tested in a real networking environment using the PlanetLab Europe testbed, composed of peers distributed across Europe.
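The bandwidth-driven add/remove-replica decision described for SCALESTREAM can be caricatured with a back-of-envelope calculation; the function and its formula are illustrative, not the protocol's actual policy.

```python
import math

def needed_replicas(consumers, stream_rate, peer_upload):
    """Number of replicas needed so that the aggregate upload capacity
    of the replica-holding peers covers total consumer demand; at
    least one replica is always kept."""
    demand = consumers * stream_rate     # e.g. Mbit/s requested in total
    return max(1, math.ceil(demand / peer_upload))
```

When a peer joins or leaves, `consumers` or `peer_upload` changes; comparing the recomputed value with the current replica count tells an adaptive protocol whether to add or remove a replica.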

Relevance: 30.00%

Abstract:

The Family Attitude Scale (FAS) is a self-report measure of critical or hostile attitudes and behaviors towards another family member, and demonstrates an ability to predict relapse in psychoses. Data are not currently available on a French version of the scale. The present study developed a French version of the FAS, using a large general population sample to test its internal structure, criterion validity and relationships with the respondents' symptoms and psychiatric diagnoses, and examined the reciprocity of FAS ratings by respondents and their partners. A total of 2072 adults from an urban population undertook a diagnostic interview and completed self-report measures, including an FAS about their partner. A subset of participants had partners who also completed the FAS. Confirmatory factor analyses revealed an excellent fit by a single-factor model, and the FAS demonstrated a strong association with dyadic adjustment. FAS scores of respondents were affected by their anxiety levels and mood, alcohol and anxiety diagnoses, and moderate reciprocity of attitudes and behaviors between the partners was seen. The French version of the FAS has similarly strong psychometric properties to the original English version. Future research should assess the ability of the French FAS to predict relapse of psychiatric disorders.

Relevance: 30.00%

Abstract:

The stable co-existence of two haploid genotypes or two species is studied in a spatially heterogeneous environment subjected to a mixture of soft selection (within-patch regulation) and hard selection (outside-patch regulation) and where two kinds of resource are available. This is analysed both at an ecological time-scale (short term) and at an evolutionary time-scale (long term). At an ecological scale, we show that co-existence is very unlikely if the two competitors are symmetrical specialists exploiting different resources. In this case, the most favourable conditions are met when the two resources are equally available, a situation that should favour generalists at an evolutionary scale. Alternatively, low within-patch density dependence (soft selection) enhances the co-existence between two slightly different specialists of the most available resource. This results from the opposing forces that are acting in hard and soft regulation modes. In the case of unbalanced accessibility to the two resources, hard selection favours the most specialized genotype, whereas soft selection strongly favours the less specialized one. Our results suggest that competition for different resources may be difficult to demonstrate in the wild even when it is a key factor in the maintenance of adaptive diversity. At an evolutionary scale, a monomorphic invasive evolutionarily stable strategy (ESS) always exists. When a linear trade-off exists between survival in one habitat versus that in another, this ESS lies between an absolute adjustment of survival to niche size (for mainly soft-regulated populations) and absolute survival (specialization) in a single niche (for mainly hard-regulated populations). This suggests that environments in agreement with the assumptions of such models should lead to an absence of adaptive variation in the long term.

Relevance: 30.00%

Abstract:

Accurate modeling of flow instabilities requires computational tools able to deal with several interacting scales, from the scale at which fingers are triggered up to the scale at which their effects need to be described. The Multiscale Finite Volume (MsFV) method offers a framework to couple fine- and coarse-scale features by solving a set of localized problems which are used both to define a coarse-scale problem and to reconstruct the fine-scale details of the flow. The MsFV method can be seen as an upscaling-downscaling technique, which is computationally more efficient than standard discretization schemes and more accurate than traditional upscaling techniques. We show that, although the method has proven accurate in modeling density-driven flow under stable conditions, the accuracy of the MsFV method deteriorates in the case of unstable flow, and an iterative scheme is required to control the localization error. To avoid large computational overhead due to the iterative scheme, we suggest several adaptive strategies both for flow and transport. In particular, the concentration gradient is used to identify a front region where instabilities are triggered and an accurate (iteratively improved) solution is required. Outside the front region the problem is upscaled and both flow and transport are solved only at the coarse scale. This adaptive strategy leads to very accurate solutions at roughly the same computational cost as the non-iterative MsFV method. In many circumstances, however, an accurate description of flow instabilities requires a refinement of the computational grid rather than a coarsening. For these problems, we propose a modified iterative MsFV, which can be used as a downscaling method (DMsFV). Compared to other grid refinement techniques, the DMsFV clearly separates the computational domain into refined and non-refined regions, which can be treated separately and matched later.
This gives great flexibility to employ different physical descriptions in different regions, where different equations could be solved, offering an excellent framework to construct hybrid methods.
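The gradient-based front detection described above can be sketched in one dimension: cells where the concentration jump across a face exceeds a threshold are flagged for the accurate (iteratively improved) treatment, while the rest stay on the coarse grid. This is a hypothetical helper, not the authors' code.

```python
def front_cells(conc, threshold):
    """Indices of 1-D cells lying on a concentration front: a cell is
    flagged when the jump across either of its faces exceeds the
    threshold."""
    flagged = set()
    for i in range(len(conc) - 1):
        if abs(conc[i + 1] - conc[i]) > threshold:   # steep gradient
            flagged.update((i, i + 1))
    return sorted(flagged)
```

Confining the expensive iterative solve to the flagged cells is what keeps the adaptive scheme near the cost of the non-iterative MsFV method while still resolving the instabilities at the front.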