874 results for "Filmic approach methods"


Relevance: 30.00%

Abstract:

The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.

Relevance: 30.00%

Abstract:

BACKGROUND/AIMS: For many therapeutic decisions in Crohn's disease (CD), high-grade evidence is lacking. To assist clinical decision-making, explicit panel-based appropriateness criteria were developed by an international, multidisciplinary expert panel. METHODS: 10 gastroenterologists, 3 surgeons and 2 general practitioners from 12 European countries assessed the appropriateness of therapy for CD using the RAND Appropriateness Method. Their assessment was based on a recent literature review of the subject, combined with their own expert clinical judgment. Panelists rated clinical indications and treatment options using a 9-point scale (1 = extremely inappropriate; 9 = extremely appropriate). These scenarios were then discussed in detail at the panel meeting and re-rated. Median ratings and disagreement were used to aggregate ratings into three assessment categories: appropriate (A), uncertain (U) and inappropriate (I). RESULTS: 569 specific indications were rated, dealing with 9 clinical presentations: mild/moderate luminal CD (n = 104), severe CD (n = 126), steroid-dependent CD (n = 25), steroid-refractory CD (n = 37), fistulizing CD (n = 49), fibrostenotic CD (n = 35), maintenance of medical remission of CD (n = 84), maintenance of surgical remission (n = 78), drug safety in pregnancy (n = 24) and use of infliximab (n = 7). Overall, 146 indications (26%) were judged appropriate, 129 (23%) uncertain and 294 (52%) inappropriate. Frank disagreement was low (14% overall), with the greatest disagreement (54% of scenarios) being observed for treatment of steroid-refractory disease. CONCLUSIONS: Detailed explicit criteria for the appropriate use of therapy for CD were developed for the first time by a European expert panel. Disease location, severity and previous treatments were the main factors taken into account.
User-friendly access to EPACT criteria is available via an Internet site, www.epact.ch, allowing prospective evaluation and improvement of appropriateness of current CD therapy.
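The rating-aggregation step described in the abstract can be sketched in code. This is a minimal Python sketch assuming common RAND/UCLA-style cut-offs (median 7-9 appropriate, 1-3 inappropriate, disagreement forcing "uncertain"); the panel's exact rules are not reproduced in the abstract, so the cut-offs and the disagreement definition below are assumptions.

```python
import statistics

def classify_indication(ratings):
    """Aggregate 9-point panel ratings into appropriate/uncertain/inappropriate.

    Simplified RAND/UCLA-style rule (assumed, not the paper's exact rule):
    median 7-9 -> appropriate, 1-3 -> inappropriate, otherwise uncertain;
    panel disagreement downgrades the indication to uncertain.
    """
    n = len(ratings)
    med = statistics.median(ratings)
    # one common disagreement definition: at least a third of panelists
    # in each extreme tertile (ratings 1-3 and 7-9)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= n / 3 and high >= n / 3:
        return "uncertain"
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"
```

With 15 panelists, an indication rated mostly 7-9 without a dissenting extreme block comes out "appropriate", mirroring the A/U/I categories above.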

Relevance: 30.00%

Abstract:

This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. Parametric analysis is limited to estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to data prior to employing kernel-based methods. We use a log-transformation and an optimal transformation amongst a class of transformations that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix of this paper all the R code that has been used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
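The transform-then-smooth idea can be illustrated compactly. The paper's Appendix uses R; the sketch below is an independent Python analogue on synthetic lognormal claim severities, with a hand-rolled Gaussian KDE (Silverman's rule-of-thumb bandwidth) and the change-of-variables back-transform f_X(x) = f_Y(log x)/x. All data and parameters here are illustrative.

```python
import numpy as np

def gaussian_kde_1d(data, bandwidth=None):
    """Plain Gaussian kernel density estimator (Silverman's rule of thumb)."""
    data = np.asarray(data, dtype=float)
    n = data.size
    if bandwidth is None:
        bandwidth = 1.06 * data.std(ddof=1) * n ** (-0.2)
    def pdf(x):
        z = (np.atleast_1d(np.asarray(x, dtype=float))[:, None] - data) / bandwidth
        return np.exp(-0.5 * z ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
    return pdf

rng = np.random.default_rng(7)
claims = rng.lognormal(mean=8.0, sigma=1.2, size=400)   # synthetic claim severities

# estimate the density on the log scale, then back-transform:
# if Y = log X, then f_X(x) = f_Y(log x) / x
f_log = gaussian_kde_1d(np.log(claims))

def claim_density(x):
    x = np.asarray(x, dtype=float)
    return f_log(np.log(x)) / x
```

Smoothing on the log scale avoids the boundary and bandwidth problems a fixed-width kernel has on heavily right-skewed claim data, which is the point the abstract makes about pre-transformation.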

Relevance: 30.00%

Abstract:

We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and quality of the optical flow estimation.
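A truncated Newton iteration of the kind the abstract builds on can be sketched generically: the Newton system is solved only approximately by conjugate gradients, truncated by a forcing tolerance, followed by a backtracking line search. This is a minimal Newton-CG sketch on the standard Rosenbrock test function, not the paper's FMG/OPT multilevel scheme, and all tolerances are illustrative.

```python
import numpy as np

def truncated_newton(f, grad, hessp, x0, max_outer=100, tol=1e-8):
    """Truncated-Newton (Newton-CG) sketch: H p = -g is solved approximately
    by CG, stopped early by a forcing tolerance or non-positive curvature."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        g_norm = np.linalg.norm(g)
        if g_norm < tol:
            break
        # inner CG on H p = -g, truncated
        p = np.zeros_like(x)
        r = -g
        d = r.copy()
        eps = min(0.5, np.sqrt(g_norm)) * g_norm   # forcing sequence
        for _ in range(2 * x.size):
            Hd = hessp(x, d)
            curv = d @ Hd
            if curv <= 1e-14:                      # non-positive curvature
                if not p.any():
                    p = -g                         # fall back to steepest descent
                break
            alpha = (r @ r) / curv
            p = p + alpha * d
            r_new = r - alpha * Hd
            if np.linalg.norm(r_new) < eps:
                break
            d = r_new + ((r_new @ r_new) / (r @ r)) * d
            r = r_new
        # Armijo backtracking line search
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x

# demo on the Rosenbrock test function
def f(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

def hessp(x, v):  # exact Hessian-vector product
    h11 = 2 - 400 * x[1] + 1200 * x[0] ** 2
    return np.array([h11 * v[0] - 400 * x[0] * v[1],
                     -400 * x[0] * v[0] + 200 * v[1]])

x_star = truncated_newton(f, grad, hessp, np.array([-1.2, 1.0]))
```

The FMG/OPT idea described above replaces the CG step by a coarse-grid correction used as the search direction, keeping the same line-search scaffolding.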

Relevance: 30.00%

Abstract:

This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which the traditional optimization methods, both of exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
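The biased-randomization idea can be illustrated on a toy 0/1 knapsack: instead of always taking the greedy choice, a quasi-geometric distribution over the greedy ranking keeps the heuristic's bias while diversifying the solutions it generates. The distribution, parameter value and instance below are illustrative, not taken from the paper.

```python
import math
import random

def biased_random_pick(sorted_candidates, beta=0.3):
    """Pick an index from a quasi-geometric distribution: rank 0 is the most
    likely, but every candidate keeps non-zero probability (beta -> 1
    recovers the deterministic greedy choice)."""
    u = 1.0 - random.random()                        # u in (0, 1]
    idx = int(math.log(u) / math.log(1.0 - beta))
    return sorted_candidates[idx % len(sorted_candidates)]

def biased_greedy_knapsack(items, capacity, beta=0.3, trials=200, seed=0):
    """Multi-start biased-randomized greedy for 0/1 knapsack.
    items are (weight, value) pairs."""
    random.seed(seed)
    best = 0
    for _ in range(trials):
        pool = sorted(items, key=lambda it: it[1] / it[0], reverse=True)
        cap, value = capacity, 0
        while pool:
            it = biased_random_pick(pool, beta)      # biased, not purely greedy
            pool.remove(it)
            if it[0] <= cap:
                cap -= it[0]
                value += it[1]
        best = max(best, value)
    return best
```

Because every rank keeps some probability, repeated runs explore many alternative good solutions "for free", which is exactly the behavior the abstract advertises for irregular solution spaces.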

Relevance: 30.00%

Abstract:

BACKGROUND: Baseline physical activity data are needed to effectively plan programs and policies to prevent noncommunicable diseases, but for many African countries these data are lacking. PURPOSE: To describe and compare levels and patterns of physical activity among adults across 22 African countries. METHODS: Data from 57,038 individuals from 22 countries (11 national and 11 subnational samples) that participated in the STEPwise approach to chronic disease risk factor surveillance (2003-2009) were analyzed in 2010. The validated Global Physical Activity Questionnaire (GPAQ) was used to assess days and duration of physical activity at work, for transport, and during leisure time in a typical week. RESULTS: Overall, 83.8% of men and 75.7% of women met WHO physical activity recommendations (at least 150 minutes of moderate activity per week or equivalent). Country prevalence ranged from 46.8% (Mali) to 96.0% (Mozambique). Physical activity, both at work and for transport, including walking, had large contributions to overall physical activity, while physical activity during leisure time was rare in the analyzed countries. CONCLUSIONS: Physical activity levels varied greatly across African countries and population subgroups. Leisure time activity was consistently low. These data will be useful to inform policymakers and to guide interventions to promote physical activity.
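The WHO threshold mentioned above can be sketched in code. The MET weights of 8 (vigorous) and 4 (moderate and transport) and the 600 MET-minutes/week cut-off are the conventional GPAQ analysis values; they are assumed here rather than quoted from the paper.

```python
# MET weights conventionally used in GPAQ analysis (assumed values):
# vigorous activity counts 8 METs, moderate and transport activity 4 METs
MET_WEIGHTS = {
    "work_vigorous": 8, "work_moderate": 4,
    "transport": 4,
    "leisure_vigorous": 8, "leisure_moderate": 4,
}

def weekly_met_minutes(minutes):
    """minutes: dict mapping activity domain -> minutes in a typical week."""
    return sum(MET_WEIGHTS[d] * m for d, m in minutes.items())

def meets_who_recommendation(minutes):
    # 150 min of moderate activity per week (or equivalent) corresponds
    # to >= 600 MET-minutes under the weights above
    return weekly_met_minutes(minutes) >= 600
```

Under this scheme 75 minutes of vigorous activity is "equivalent" to 150 minutes of moderate activity, which is the equivalence the abstract's parenthesis refers to.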

Relevance: 30.00%

Abstract:

We present a novel spatiotemporal-adaptive Multiscale Finite Volume (MsFV) method, which is based on the natural idea that the global coarse-scale problem has a longer characteristic time than the local fine-scale problems. As a consequence, the global problem can be solved with larger time steps than the local problems. In contrast to the pressure-transport splitting usually employed in the standard MsFV approach, we propose to start directly with a local-global splitting that makes it possible to locally retain the original degree of coupling. This is crucial for highly non-linear systems or in the presence of physical instabilities. To obtain an accurate and efficient algorithm, we devise new adaptive criteria for the global update that are based on changes of coarse-scale quantities rather than on fine-scale quantities, as is routinely done in the adaptive MsFV method. By means of a complexity analysis we show that the adaptive approach gives a noticeable speed-up with respect to the standard MsFV algorithm. In particular, it is efficient in the case of large upscaling factors, which is important for multiphysics problems. Based on the observation that local time stepping acts as a smoother, we devise a self-correcting algorithm which incorporates information from previous times to improve the quality of the multiscale approximation. We present results of multiphase flow simulations both for Darcy-scale and multiphysics (hybrid) problems, in which a local pore-scale description is combined with a global Darcy-like description. The novel spatiotemporal-adaptive multiscale method based on the local-global splitting is not limited to porous media flow problems, but can be extended to any system described by a set of conservation equations.

Relevance: 30.00%

Abstract:

The control of endemic diseases has not attained the desired level of effectiveness in spite of the use of modern, efficient technologies. The classic interventionist approach to the control of schistosomiasis is centered on systematic control of the snail hosts combined with large-scale medical treatment, and is usually carried out without social concern for the assisted communities. It is easy to understand the interest and the ethical commitment of public health research in producing studies in which the biological and social determinants as well as the cultural components are considered, and which also encompass historical dimensions and symbolic representations. In view of the recent political decision in favor of decentralization of health administration to the municipal level, we suggest in the present paper an integrated approach to the epidemiological diagnosis of an endemic situation at the local level. Theoretical and methodological aspects from both epidemiology and anthropology are discussed. Epidemiological methods can be used to detect the dependent variables (those related to human infection) and the independent variables (demographic, economic, sanitary and social). Another methodological approach, of an anthropological/ethnographic nature, can be conducted in order to articulate the knowledge on the various dimensions or determinant levels of the disease. Mutual comprehension between researchers and the people under investigation of the dynamic transmission process would be relevant for a joint construction, at the local level, of programmed actions for the control of endemic diseases. This would extend reflections on the health/disease process as a whole.

Relevance: 30.00%

Abstract:

It is generally accepted that most plant populations are locally adapted. Yet, understanding how environmental forces give rise to adaptive genetic variation is a challenge in conservation genetics and crucial to the preservation of species under rapidly changing climatic conditions. Environmental variation, phylogeographic history, and population demographic processes all contribute to spatially structured genetic variation; however, few current models attempt to separate these confounding effects. To illustrate the benefits of using a spatially explicit model for identifying potentially adaptive loci, we compared outlier locus detection methods with a recently developed landscape genetic approach. We analyzed 157 loci from samples of the alpine herb Gentiana nivalis collected across the European Alps. Principal coordinates of neighbor matrices (PCNM), eigenvectors that quantify multi-scale spatial variation present in a data set, were incorporated into a landscape genetic approach relating AFLP frequencies with 23 environmental variables. Four major findings emerged. 1) Fifteen loci were significantly correlated with at least one predictor variable (adjusted R² > 0.5). 2) Models including PCNM variables identified eight more potentially adaptive loci than models run without spatial variables. 3) When compared to outlier detection methods, the landscape genetic approach detected four of the same loci plus 11 additional loci. 4) Temperature, precipitation, and solar radiation were the three major environmental factors driving potentially adaptive genetic variation in G. nivalis. Techniques presented in this paper offer an efficient method for identifying potentially adaptive genetic variation and associated environmental forces of selection, providing an important step forward for the conservation of non-model species under global change.
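The PCNM construction referred to above can be sketched roughly: eigenvectors of a double-centred, truncated distance matrix serve as extra spatial regressors alongside the environmental variables. The truncation threshold, data and regression below are ad hoc, hypothetical simplifications, not the paper's analysis.

```python
import numpy as np

def pcnm_vectors(coords, k=5):
    """Crude PCNM/spatial-eigenvector sketch (ad-hoc truncation, no
    eigenvalue sign filtering as done in full PCNM analyses)."""
    D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    t = np.percentile(D[D > 0], 25)            # truncation distance (ad hoc)
    W = np.where(D <= t, D, 4.0 * t)           # classic rule: distant pairs -> 4t
    n = len(coords)
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ (W ** 2) @ J                # Gower double-centring
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(vals)[::-1]             # largest eigenvalues first
    return vecs[:, order[:k]]

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(40, 2))     # hypothetical sampling sites
env = rng.normal(size=(40, 3))                 # hypothetical environmental predictors
freq = 0.4 * env[:, 0] + rng.normal(scale=0.5, size=40)  # toy allele frequency

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_env = r_squared(env, freq)
r2_env_spatial = r_squared(np.column_stack([env, pcnm_vectors(coords)]), freq)
```

Comparing `r2_env` with `r2_env_spatial` mirrors the paper's finding that adding spatial eigenvectors changes which loci appear environmentally associated.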

Relevance: 30.00%

Abstract:

High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and the Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
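A toy illustration of the "bouncing" integration idea: on synthetic paired matrices with a planted co-module, iterating scores back and forth between the two data sets amounts to a power iteration that concentrates on the planted genes and drugs. This sketch is only loosely inspired by the Ping-Pong Algorithm, which additionally applies sparsity thresholds at each bounce; everything below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n_genes, n_drugs, n_samples = 60, 40, 30
E = rng.normal(size=(n_genes, n_samples))   # expression (genes x samples)
R = rng.normal(size=(n_drugs, n_samples))   # drug response (drugs x samples)

# plant a co-module: genes 0-9 and drugs 0-7 co-vary over samples 0-11
signal = rng.normal(size=12)
E[:10, :12] += 3.0 * signal
R[:8, :12] += 3.0 * signal

# bounce a gene score vector through both data sets and back;
# this is a power iteration on E R^T R E^T
g = np.ones(n_genes)
for _ in range(50):
    s = E.T @ g                  # gene scores -> sample scores (expression side)
    d = R @ s                    # sample scores -> drug scores (response side)
    g = E @ (R.T @ d)            # and back to gene scores
    g = g / np.linalg.norm(g)

top_genes = set(int(i) for i in np.argsort(-np.abs(g))[:10])
top_drugs = set(int(i) for i in np.argsort(-np.abs(d))[:8])
```

The recovered gene and drug sets together form the co-module: a coherent pattern spanning both data sets rather than a module found in either one alone.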

Relevance: 30.00%

Abstract:

With the availability of new-generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion requires costly and time-consuming gap closure, especially when the genome contains highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, withholding genome data during the gap closure stage is clearly medically counterproductive. We thus investigated the feasibility of a "dirty genome" approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from the Genome Sequencer 20 and Solexa platforms. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. This work constitutes proof of principle for the dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics, to rapidly identify new immunogenic proteins useful for developing specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty-genome-sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful for developing DNA-based diagnostic tests. All these diagnostic tools will allow further evaluation of the pathogenic potential of this obligate intracellular bacterium.

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: Recent evidence suggests that there may be more than one Gilles de la Tourette syndrome (GTS)/tic disorder phenotype. However, little is known about the common patterns of these GTS/tic disorder-related comorbidities. In addition, sex-specific phenomenological data of GTS/tic disorder-affected adults are rare. Therefore, this community-based study used latent class analyses (LCA) to investigate sex-related and non-sex-related subtypes of GTS/tic disorders and their most common comorbidities. METHODS: The data were drawn from the PsyCoLaus study (n = 3691), a population-based survey conducted in Lausanne, Switzerland. LCA were performed on the data of 80 subjects manifesting motor/vocal tics during their childhood/adolescence. Comorbid attention-deficit hyperactivity disorder (ADHD), obsessive-compulsive disorder, depressive, phobia and panic symptoms/syndromes comprised the selected indicators. The resultant classes were characterized by psychosocial correlates. RESULTS: In LCA, four latent classes provided the best fit to the data. We identified two male-related classes. The first class exhibited both ADHD and depression. The second class comprised males with only depression. Class three was a female-related class depicting obsessive thoughts/compulsive acts, phobias and panic attacks. This class manifested high psychosocial impairment. Class four had a balanced sex proportion and comorbid symptoms/syndromes such as phobias and panic attacks. The complementary occurrence of comorbid obsessive thoughts/compulsive acts and ADHD impulsivity was remarkable. CONCLUSIONS: To the best of our knowledge, this is the first study applying LCA to community data of GTS symptoms/tic disorder-affected persons. Our findings support the utility of differentiating GTS/tic disorder subphenotypes on the basis of comorbid syndromes.

Relevance: 30.00%

Abstract:

PURPOSE: Retinal detachment (RD) is a major complication of cataract surgery, which can be treated by either primary vitrectomy without indentation or the scleral buckling procedure. The aim of this study is to compare the results of these two techniques for the treatment of pseudophakic RD. PATIENTS AND METHODS: The charts of 40 patients (40 eyes) treated with scleral buckling for a primary pseudophakic RD were retrospectively studied and compared to the charts of 32 patients (32 eyes) treated with primary vitrectomy without scleral buckle during the same period by the same surgeons. To obtain comparable samples, patients with giant retinal tears, vitreous hemorrhage, and severe preoperative proliferative vitreoretinopathy (PVR) were not included. Minimal follow-up was 6 months. RESULTS: The primary success rate was 84% in the vitrectomy group and 82.5% in the ab-externo group. Final anatomical success was observed in 100% of cases in the vitrectomy group and in 95% of cases in the ab-externo group. Final visual acuity was 0.5 or better in 44% of cases in the vitrectomy group and 37.5% in the ab-externo group. The duration of the surgery was significantly lower in the ab-externo group, whereas the hospital stay tended to be lower in the vitrectomy group. In the vitrectomy group, postoperative PVR developed in 3 eyes and new or undetected breaks were responsible for failure of the initial procedure in 2 eyes. CONCLUSION: Primary vitrectomy appears to be as effective as scleral buckling procedures for the treatment of pseudophakic RD.
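The headline comparison (84% vs. 82.5% primary success) can be checked with a Fisher exact test. The counts below are reconstructed from the reported percentages and group sizes (84% of 32 ≈ 27; 82.5% of 40 = 33) and are therefore approximations for illustration, not data from the paper.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]],
    summing the probabilities of all tables no more likely than the
    observed one (the usual two-sided convention)."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):                      # hypergeometric probability
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# vitrectomy: ~27/32 successes; ab externo: 33/40 successes (reconstructed)
p = fisher_exact_two_sided(27, 32 - 27, 33, 40 - 33)
```

As expected from two nearly identical proportions, the p-value is far from significance, consistent with the conclusion that the techniques perform similarly on this endpoint.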

Relevance: 30.00%

Abstract:

Oscillations have been increasingly recognized as a core property of neural responses that contribute to spontaneous, induced, and evoked activities within and between individual neurons and neural ensembles. They are considered as a prominent mechanism for information processing within and communication between brain areas. More recently, it has been proposed that interactions between periodic components at different frequencies, known as cross-frequency couplings, may support the integration of neuronal oscillations at different temporal and spatial scales. The present study details methods based on an adaptive frequency tracking approach that improve the quantification and statistical analysis of oscillatory components and cross-frequency couplings. This approach allows for time-varying instantaneous frequency, which is particularly important when measuring phase interactions between components. We compared this adaptive approach to traditional band-pass filters in their measurement of phase-amplitude and phase-phase cross-frequency couplings. Evaluations were performed with synthetic signals and EEG data recorded from healthy humans performing an illusory contour discrimination task. First, the synthetic signals in conjunction with Monte Carlo simulations highlighted two desirable features of the proposed algorithm vs. classical filter-bank approaches: resilience to broad-band noise and oscillatory interference. Second, the analyses with real EEG signals revealed statistically more robust effects (i.e. improved sensitivity) when using an adaptive frequency tracking framework, particularly when identifying phase-amplitude couplings. This was further confirmed after generating surrogate signals from the real EEG data. Adaptive frequency tracking appears to improve the measurements of cross-frequency couplings through precise extraction of neuronal oscillations.
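A minimal phase-amplitude coupling computation in the spirit of the classical filter-bank approach the paper compares against: brick-wall FFT band-pass filters, analytic signals, and a mean-vector-length modulation index on a synthetic coupled signal. All frequencies and parameters are illustrative; a real analysis would use proper filters or the paper's adaptive frequency tracking.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain construction
    (equivalent to the Hilbert-transform approach)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def brickwall_bandpass(x, lo, hi, fs):
    """Crude FFT brick-wall band-pass, good enough for a demo."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, x.size)

fs = 500.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# synthetic EEG-like signal: 60 Hz amplitude locked to the 6 Hz phase
slow = np.sin(2 * np.pi * 6 * t)
sig = slow + (1 + 0.8 * slow) * np.sin(2 * np.pi * 60 * t) \
      + 0.1 * rng.normal(size=t.size)

phase = np.angle(analytic_signal(brickwall_bandpass(sig, 4, 8, fs)))
amp = np.abs(analytic_signal(brickwall_bandpass(sig, 40, 80, fs)))
# mean-vector-length modulation index (a standard PAC measure)
mi = np.abs(np.mean(amp * np.exp(1j * phase)))
```

For an uncoupled signal (constant fast-band amplitude) the same index collapses toward zero; the fixed band edges used here are exactly what adaptive frequency tracking is meant to avoid when the oscillation frequency drifts.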

Relevance: 30.00%

Abstract:

The two main alternative methods used to identify key sectors within the input-output approach, the Classical Multiplier method (CMM) and the Hypothetical Extraction method (HEM), are formally and empirically compared in this paper. Our findings indicate that the main distinction between the two approaches stems from the role of internal effects. These internal effects are quantified under the CMM, while under the HEM only external impacts are considered. In our comparison we find, however, that CMM backward measures are more influenced by within-block effects than the forward indices proposed under this approach. The conclusions of this comparison allow us to develop a hybrid proposal that combines the two existing approaches. This hybrid model has the advantage of making it possible to distinguish and disaggregate external effects from those that are purely internal. The proposal is also of interest in terms of policy implications. Indeed, the hybrid approach may provide useful information for the design of "second best" stimulus policies that aim at a more balanced perspective between overall economy-wide impacts and their sectoral distribution.
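The two linkage measures being compared can be sketched on a hypothetical three-sector table: CMM backward linkages as column sums of the Leontief inverse, and HEM sector importance as the total output lost when a sector's inter-industry links are hypothetically extracted. The coefficient matrix and outputs below are invented for illustration.

```python
import numpy as np

# hypothetical 3-sector technical-coefficient matrix A and gross outputs x
A = np.array([[0.20, 0.30, 0.10],
              [0.10, 0.10, 0.30],
              [0.30, 0.20, 0.10]])
x = np.array([100.0, 80.0, 120.0])
f = (np.eye(3) - A) @ x                  # implied final demand

# CMM: backward linkages are the column sums of the Leontief inverse
L = np.linalg.inv(np.eye(3) - A)
backward = L.sum(axis=0)

def extraction_loss(j):
    """HEM: total output lost when sector j's inter-industry links
    (its row and column of A) are removed, final demand held fixed."""
    A_ext = A.copy()
    A_ext[:, j] = 0.0
    A_ext[j, :] = 0.0
    x_ext = np.linalg.inv(np.eye(3) - A_ext) @ f
    return x.sum() - x_ext.sum()

hem = np.array([extraction_loss(j) for j in range(3)])
```

The CMM column sums include a sector's own (internal) multiplier effects, whereas the extraction loss is purely external, which is precisely the distinction the paper's hybrid proposal is built around.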