99 results for Gordon, Matthew: Sociolinguistics: Method and Interpretation


Relevance:

100.00%

Publisher:

Abstract:

AIMS: c-Met is an emerging biomarker in pancreatic ductal adenocarcinoma (PDAC); there is no consensus regarding the immunostaining scoring method for this marker. We aimed to assess the prognostic value of c-Met overexpression in resected PDAC, and to develop a robust and reproducible scoring method for c-Met immunostaining in this setting. METHODS AND RESULTS: c-Met immunostaining was graded according to the validated MetMab score, a classic visual scale combining surface and intensity (SI score), or a simplified score (high c-Met: ≥20% of tumour cells with strong membranous staining), in stage I-II PDAC. A computer-assisted classification method (Aperio software) was developed. Clinicopathological parameters were correlated with disease-free survival (DFS) and overall survival (OS). One hundred and forty-nine patients were analysed retrospectively in a two-step process. Thirty-seven samples (whole slides) were analysed as a pre-run test. Reproducibility values were optimal with the simplified score (kappa = 0.773); high c-Met expression (7/37) was associated with shorter DFS [hazard ratio (HR) 3.456, P = 0.0036] and OS (HR 4.257, P = 0.0004). c-Met expression was concordant on whole slides and tissue microarrays in 87.9% of samples, and quantifiable with a specific computer-assisted algorithm. In the whole cohort (n = 131), patients with c-Met-high tumours (36/131) had significantly shorter DFS (9.3 versus 20.0 months, HR 2.165, P = 0.0005) and OS (18.2 versus 35.0 months, HR 1.832, P = 0.0098) in univariate and multivariate analyses. CONCLUSIONS: c-Met expression assessed with the simplified score is an independent prognostic marker in stage I-II PDAC that may help to identify patients with a high risk of tumour relapse and poor survival.
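The simplified score reduces to a single cut-off, which the sketch below makes explicit. It is illustrative only: the function name and cell counts are assumptions, and the study itself used visual grading plus an Aperio-based algorithm rather than this code.

```python
# Hedged sketch of the simplified c-Met scoring rule (high c-Met:
# >= 20% of tumour cells with strong membranous staining). The cell
# counts would come from visual or computer-assisted evaluation.

def simplified_cmet_score(strong_membranous_cells: int,
                          tumour_cells: int,
                          cutoff: float = 0.20) -> str:
    """Return 'high' or 'low' according to the simplified score."""
    if tumour_cells <= 0:
        raise ValueError("no tumour cells counted")
    fraction = strong_membranous_cells / tumour_cells
    return "high" if fraction >= cutoff else "low"

# Illustrative sample: 250 of 1000 tumour cells stain strongly -> 'high'
print(simplified_cmet_score(250, 1000))
```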

Relevance:

100.00%

Publisher:

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models;
- features implementation of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.

The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
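As a flavour of the probabilistic machinery the book builds on, the sketch below applies Bayes' theorem in odds form, the elementary update that Bayesian networks chain together across many variables; the numbers are invented for illustration and are not drawn from the book.

```python
# Minimal sketch of Bayesian updating in odds form, the elementary
# calculation underlying each node of a Bayesian network. All numbers
# are invented for illustration.

def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: posterior odds = prior odds x LR."""
    return prior_odds * likelihood_ratio

# Hypothetical finding E with P(E | Hp) = 0.99 and P(E | Hd) = 0.01:
lr = 0.99 / 0.01                  # likelihood ratio of the finding
print(posterior_odds(0.1, lr))    # prior odds 1:10 -> posterior odds 9.9:1
```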

Relevance:

100.00%

Publisher:

Abstract:

Quantification is a major problem when using histology to study the influence of ecological factors on tree structure. This paper presents a method to prepare and analyse transverse sections of the cambial zone and of the conductive phloem in bark samples; the following paper (II) presents the automated measurement procedure. Part I describes and discusses the preparation method and the influence of tree age on the observed structure. Highly contrasted images of samples extracted at breast height during dormancy were analysed with an automatic image analyser. Between three young (38-year-old) and three old (147-year-old) trees, age-related differences were identified by size and shape parameters, at both the cell and tissue levels. In the cambial zone, older trees had larger and more rectangular fusiform initials. In the phloem, sieve tubes were also larger, but their shape did not change and the area available for sap conduction was similar in both categories. Nevertheless, the alterations were limited and required statistical analysis to be identified and confirmed. The physiological implications of the structural changes are discussed.
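The kind of per-cell size and shape measurement involved can be sketched with standard image-analysis tools; the file name, the library choice (scikit-image), and the two parameters below are assumptions for illustration, not the authors' actual pipeline (which is detailed in part II).

```python
# Hedged sketch of per-cell size/shape measurement on a thresholded
# section image; 'cells_binary.png' is a placeholder input.

from skimage import io, measure

binary = io.imread("cells_binary.png") > 0   # thresholded cell lumina
labels = measure.label(binary)               # one label per cell

for region in measure.regionprops(labels):
    area = region.area              # size parameter (in pixels)
    rectangularity = region.extent  # area / bounding-box area: closer to
                                    # 1 means a more rectangular outline
    print(region.label, area, round(rectangularity, 2))
```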

Relevance:

100.00%

Publisher:

Abstract:

Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough, so the corresponding measurement is only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min) that incorporates the correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, both against actual measured trough levels and against predictions obtained by a reference method. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less with Bayesian TDM than with linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
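The core of the MAP-ρ approach can be sketched as penalized least squares on log-transformed parameters, with the prior covariance matrix carrying the correlation between clearance and volume. Everything below (the mono-exponential steady-state model, prior values, error magnitudes) is an illustrative assumption, not the published population model.

```python
# Hedged sketch of Bayesian MAP estimation of individual PK parameters
# from a single randomly timed sample, with a prior covariance that
# encodes correlation between CL and V (the MAP-rho idea).

import numpy as np
from scipy.optimize import minimize

dose, tau = 400.0, 24.0          # mg once daily (typical imatinib regimen)
t_obs, c_obs = 10.0, 1.5         # h after dose, mg/L (random sample)

mu = np.array([np.log(14.0), np.log(250.0)])  # prior means: ln CL (L/h), ln V (L)
omega = np.array([[0.10, 0.05],               # prior covariance; off-diagonal
                  [0.05, 0.12]])              # term is the CL-V correlation
sigma = 0.20                                  # residual error (log scale)

def css(t, cl, v):
    """Steady-state concentration, one-compartment bolus approximation."""
    ke = cl / v
    return dose / v * np.exp(-ke * t) / (1.0 - np.exp(-ke * tau))

def neg_log_posterior(theta):
    cl, v = np.exp(theta)
    resid = (np.log(c_obs) - np.log(css(t_obs, cl, v))) / sigma
    prior = (theta - mu) @ np.linalg.inv(omega) @ (theta - mu)
    return 0.5 * (resid**2 + prior)

theta_map = minimize(neg_log_posterior, mu).x   # MAP estimate of ln CL, ln V
cl_i, v_i = np.exp(theta_map)
print("predicted C_min:", css(tau, cl_i, v_i))  # trough just before next dose
```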

Relevance:

100.00%

Publisher:

Abstract:

Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (inclusion still ongoing) divided into 4 groups. The aim is to evaluate adherence and to offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years), monitored for an average duration of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. Five patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. According to the questionnaire completed at the end of the monitoring period, patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily). Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected from previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. Longer follow-up will allow a better evaluation of our method and its impact. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.
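For readers unfamiliar with the two metrics, the sketch below computes persistence and execution from MEMS opening dates under common operational definitions; both the definitions and the dates are assumptions for illustration and may differ from the study protocol.

```python
# Illustrative computation of the two adherence metrics from MEMS data.
# Assumed conventions: persistence = time on treatment / monitoring
# period; execution = doses taken while persistent / doses prescribed.

from datetime import date

prescribed_days = [date(2024, 1, d) for d in range(1, 22)]   # 3-week cycle
mems_openings = {date(2024, 1, d) for d in range(1, 20)}     # recorded events

last_taken = max(mems_openings)
persistence = (last_taken - prescribed_days[0]).days / \
              (prescribed_days[-1] - prescribed_days[0]).days
on_treatment = [d for d in prescribed_days if d <= last_taken]
execution = sum(d in mems_openings for d in on_treatment) / len(on_treatment)

print(f"persistence {persistence:.0%}, execution {execution:.0%}")
```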

Relevance:

100.00%

Publisher:

Abstract:

Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis (ICA). The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 ± 22% at a specificity of 86 ± 7% (mean ± SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
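The general shape of such a pipeline, ICA-derived features feeding a classifier that separates epileptiform events from eye blinks, can be sketched as follows. The feature choice, the classifier, and the synthetic data are assumptions; the paper's specific application of ICA is not reproduced here.

```python
# Hedged sketch: ICA features feeding a binary classifier
# (epileptiform event vs. eye blink). Synthetic placeholders stand
# in for real EEG segments.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

# X: (n_segments, n_channels * n_samples) flattened EEG segments;
# y: 1 = epileptiform, 0 = eye blink.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32 * 64))
y = rng.integers(0, 2, 200)

ica = FastICA(n_components=10, random_state=0)
features = ica.fit_transform(X)          # per-segment component weights

clf = LogisticRegression(max_iter=1000).fit(features, y)
print("training accuracy:", clf.score(features, y))
```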

Relevance:

100.00%

Publisher:

Abstract:

Explicitly correlated coupled-cluster calculations of intermolecular interaction energies for the S22 benchmark set of Jurecka, Sponer, Cerny, and Hobza (Phys. Chem. Chem. Phys. 2006, 8, 1985) are presented. Results obtained with the recently proposed CCSD(T)-F12a method and augmented double-zeta basis sets are found to be in very close agreement with basis-set-extrapolated conventional CCSD(T) results. Furthermore, we propose a dispersion-weighted MP2 (DW-MP2) approximation that combines the good accuracy of MP2 for complexes with predominantly electrostatic bonding with that of SCS-MP2 for dispersion-dominated ones. The MP2-F12 and SCS-MP2-F12 correlation energies are weighted by a switching function that depends on the relative HF and correlation contributions to the interaction energy. For the S22 set, this yields a mean absolute deviation of 0.2 kcal/mol from the CCSD(T)-F12a results. The method, which allows accurate results to be obtained at low cost, is also tested for a number of dimers that are not in the training set.
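The abstract specifies only that the two correlation energies are mixed by a switching function of the relative HF and correlation contributions; the logistic form and the parameters a and b below are assumptions chosen to reproduce the qualitative behaviour (the MP2 weight grows when the HF term dominates), not the published fit.

```python
# Hedged sketch of the dispersion-weighted MP2 idea; energies in
# kcal/mol, all values illustrative.

import math

def dw_mp2(e_hf, e_mp2_corr, e_scs_corr, a=0.0, b=1.0):
    """Blend MP2 and SCS-MP2 correlation energies by an assumed
    logistic switch on the relative HF contribution."""
    ratio = e_hf / (e_hf + e_mp2_corr)             # relative HF contribution
    w = 1.0 / (1.0 + math.exp(-b * (ratio - a)))   # assumed switching function
    # Large HF share (electrostatic complex) -> w near 1 -> MP2 weight high;
    # small HF share (dispersion-dominated) -> weight shifts to SCS-MP2.
    return e_hf + w * e_mp2_corr + (1.0 - w) * e_scs_corr

# Hydrogen-bond-like case: attractive HF term dominates the interaction.
print(dw_mp2(e_hf=-5.0, e_mp2_corr=-2.0, e_scs_corr=-1.5))
```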

Relevance:

100.00%

Publisher:

Abstract:

This paper reports on the purpose, design, methodology and target audience of e-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, interpretation of raw data and communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses focussed on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. In order to help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of the independence and self-confidence needed to solve practical inference problems. E-learning was chosen as the general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and levels of field experience - reporting officers, (chief) scientists, forensic coordinators, but also lawyers - who can all interact directly from their personal workplaces without consideration of distances, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, but also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view of interpretation across forensic disciplines, laboratories and judicial systems.

Relevance:

100.00%

Publisher:

Abstract:

A new method is used to estimate the sediment volumes of glacial valleys. The method is based on the concept of the sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed with an iterative routine that replaces the altitude of each point of the DEM by the mean value of its neighbours minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the free Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results are in good agreement with previous estimations based on seismic profiles and gravimetric modelling, with the exception of a few particular locations. The results from the present method and those from the seismic interpretation differ slightly from the results of the gravimetric data; this discrepancy may result from the presence of large buried landslides at the bottom of the Rhone Valley.
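The iterative routine lends itself to a direct sketch: inside the mapped valley, each cell is repeatedly replaced by the mean of its four neighbours minus a fixed offset until the surface stabilizes, converging toward the quadratic bedrock surface described above. File names, the offset, and the convergence tolerance are placeholders.

```python
# Hedged sketch of the sloping-local-base-level excavation; cells
# outside the valley mask stay fixed and pin the surface at the edges.

import numpy as np

dem = np.loadtxt("srtm_dem.txt")            # filled-valley elevations (m)
mask = np.loadtxt("valley_mask.txt") > 0    # True inside the alluvial valley
delta = 0.5                                 # fixed lowering per step (m)

bedrock = dem.copy()
for _ in range(100_000):
    neigh = (np.roll(bedrock, 1, 0) + np.roll(bedrock, -1, 0) +
             np.roll(bedrock, 1, 1) + np.roll(bedrock, -1, 1)) / 4.0
    update = neigh[mask] - delta            # excavate only inside the valley
    if np.max(np.abs(update - bedrock[mask])) < 1e-3:   # quasi-stationary
        break
    bedrock[mask] = update

# Sediment volume: fill thickness summed over cells times cell area
# (~92 m x 92 m for SRTM).
volume = np.sum(dem[mask] - bedrock[mask]) * 92.0**2
print(f"estimated sediment volume: {volume:.3e} m^3")
```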

Relevance:

100.00%

Publisher:

Abstract:

The determination of line crossing sequences between rollerball pens and laser printers presents difficulties that may not be overcome using traditional techniques. This research aimed to study the potential of digital microscopy and 3-D laser profilometry to determine line crossing sequences between a toner and an aqueous ink line. Different paper types, rollerball pens and writing pressures were tested. Correct opinions on the sequence were given for all case scenarios, using both techniques. When the toner was printed before the ink, a light reflection was observed in all crossing specimens, while this was never observed in the other sequence types. The 3-D laser profilometry, although more time-consuming, presented the main advantage of providing quantitative results. The findings confirm the potential of 3-D laser profilometry and demonstrate the efficiency of digital microscopy as a new technique for determining the sequence of line crossings involving rollerball pen ink and toner.

With the mass marketing of laser printers and the popularity of rollerball pens, the determination of line crossing sequences between such instruments is encountered by forensic document examiners. This type of crossing presents difficulties for the optical microscopic techniques used for line crossings involving ballpoint pens or gel pens and toner (1-4). Indeed, the rollerball's aqueous ink penetrates through the toner and is absorbed by the fibers of the paper, leaving the examiner with the impression that the toner is above the ink even when it is not (5). Novotny and Westwood (3) investigated the possibility of determining aqueous ink and toner crossing sequences by microscopic observation of the intersection before and after toner removal. A major disadvantage of their study resides in the destruction of the sample, as the toner line is scraped off to see what lies underneath. The aim of this research was to investigate ways to overcome these difficulties through digital microscopy and three-dimensional (3-D) laser profilometry. The former has been used as a technique for determining sequences between gel pen and toner printing strokes, but provided less conclusive results than an optical stereomicroscope (4). 3-D laser profilometry, which allows one to observe and measure the topography of a surface, has been the subject of a number of recent studies in this area. Berx and De Kinder (6) and Schirripa Spagnolo (7,8) have tested the application of laser profilometry to determine the sequence of intersections of several lines. The results obtained in these studies overcome disadvantages of other methods applied in this area, such as the scanning electron microscope or the atomic force microscope. The main advantages of 3-D laser profilometry include the ease of implementation of the technique and its nondestructive nature, which does not require sample preparation (8-10). Moreover, the technique is reproducible and offers a high degree of freedom along the vertical axis (up to 1000 μm). However, when the paper surface presents a given roughness, and the pen impressions alter the paper with a depth similar to the roughness of the medium, the results are not always conclusive (8). In this case it becomes difficult to distinguish which characteristics can be imputed to the pen impressions and which to the quality of the paper surface. This important limitation is assessed here by testing papers of variable quality (different grammage and finishing) and different writing pressures. The authors will therefore assess the limits of the 3-D laser profilometry technique and determine whether the method can overcome such constraints. Second, the authors will investigate the use of digital microscopy, because it presents a number of advantages: it is efficient, user-friendly, and provides an objective evaluation and interpretation.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Few epidemiological studies have addressed the health of workers exposed to novel manufactured nanomaterials. The small current workforce will necessitate pooling international cohorts. METHOD: A road map was defined for a globally harmonized framework for the careful choice of materials, exposure characterization, identification of study populations, definition of health endpoints, evaluation of appropriateness of study designs, data collection and analysis, and interpretation of the results. RESULTS: We propose a road map to reach global consensus on these issues. The proposed strategy should ensure that the costs of action are not disproportionate to the potential benefits and that the approach is pragmatic and practical. CONCLUSIONS: We should aim to go beyond the collection of health complaints, illness statistics, or even counts of deaths; the manifestation of such clear endpoints would indicate a failure of preventive measures.

Relevance:

100.00%

Publisher:

Abstract:

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or coming from the same source, and thus to support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by comparison of the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first-line triage that may help target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
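The best-performing comparison reported above can be sketched as follows; the hue-histogram profile, the bin count, and the file names are illustrative assumptions, since the prototype's exact region-of-interest extraction and filter settings are not reproduced here.

```python
# Hedged sketch of profile extraction (Hue filter) and comparison with
# a Canberra distance; smaller distances suggest a possible common source.

import numpy as np
from PIL import Image
from scipy.spatial.distance import canberra

def hue_profile(path: str, bins: int = 64) -> np.ndarray:
    """Normalised hue histogram over a document image (or a region of it)."""
    hue = np.asarray(Image.open(path).convert("RGB").convert("HSV"))[..., 0]
    hist, _ = np.histogram(hue, bins=bins, range=(0, 255), density=True)
    return hist

d = canberra(hue_profile("licence_a.png"), hue_profile("licence_b.png"))
print("Canberra distance:", d)
```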

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Accurate placement of an external ventricular drain (EVD) for the treatment of hydrocephalus is of paramount importance for its functionality and in order to minimize morbidity and complications. The aim of this study was to compare two different drain insertion assistance tools with the traditional free-hand anatomical landmark method, and to measure efficacy, safety and precision. METHODS: Ten cadaver heads were prepared by opening large bone windows centered on Kocher's points on both sides. Nineteen physicians, divided into two groups (trainees and board-certified neurosurgeons), performed EVD insertions. The target for the ventricular drain tip was the ipsilateral foramen of Monro. Each participant inserted the external ventricular catheter in three different ways: 1) free-hand by anatomical landmarks; 2) neuronavigation-assisted (NN); and 3) XperCT-guided (XCT). The number of ventricular hits and dangerous trajectories, the time to proceed, the radiation exposure of patients and physicians, the distance of the catheter tip to target, and the size of deviations projected in the orthogonal planes were measured and compared. RESULTS: Insertion using XCT increased the probability of ventricular puncture from 69.2% to 90.2% (p = 0.02). Non-assisted placements were significantly less precise (catheter tip to target distance 14.3 ± 7.4 mm versus 9.6 ± 7.2 mm, p = 0.0003). The insertion time increased from 3.04 ± 2.06 min to 7.3 ± 3.6 min (p < 0.001). The X-ray exposure for XCT was 32.23 mSv, but could be reduced to 13.9 mSv if patients were initially imaged in the hybrid operating suite. No supplementary radiation exposure is needed for NN if patients are initially imaged according to a navigation protocol. CONCLUSION: This ex vivo study demonstrates significantly improved accuracy and safety using either the NN- or the XCT-assisted method. Therefore, efforts should be undertaken to implement these new technologies into daily clinical practice. However, the accuracy of an EVD placement has to be balanced against its urgency, as the image-guided insertion techniques entail a longer preparation time due to specific image acquisition and trajectory planning.

Relevance:

100.00%

Publisher:

Abstract:

This work describes the ab initio procedure employed to build an activation model for the alpha 1b-adrenergic receptor (alpha 1b-AR). The first version of the model was progressively modified and refined by means of a many-step iterative procedure in which the model was validated against experiment at each upgrading step. A combined simulation (molecular dynamics) and experimental mutagenesis approach was used to determine the structural and dynamic features characterizing the inactive and active states of alpha 1b-AR. The latest version of the model has been successfully challenged with respect to its ability to interpret and predict the functional properties of a large number of mutants. The iterative approach employed to describe alpha 1b-AR activation in terms of molecular structure and dynamics allows further refinement of the model toward the prediction and interpretation of an ever-increasing number of experimental data.

Relevance:

100.00%

Publisher:

Abstract:

The dispersal process, by which individuals or other dispersing agents such as gametes or seeds move from birthplace to a new settlement locality, has important consequences for the dynamics of genes, individuals, and species. Many of the questions addressed by ecology and evolutionary biology require a good understanding of species' dispersal patterns. Much effort has thus been devoted to overcoming the difficulties associated with dispersal measurement. In this context, genetic tools have long been the focus of intensive research, providing a great variety of potential solutions to measuring dispersal. This methodological diversity is reviewed here to help (molecular) ecologists find their way toward dispersal inference and interpretation and to stimulate further developments.