955 results for Gordon, Matthew: Sociolinguistics: Method and Interpretation


Relevance:

100.00%

Publisher:

Abstract:

Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min), incorporating correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels, and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). 
Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of the Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less with Bayesian TDM than with linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
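One of the comparator approaches, extrapolation of a randomly timed level to trough along a typical elimination constant, can be sketched as follows. The 18 h half-life, the sampling times and the concentration are illustrative assumptions, not values taken from the study:

```python
import math

def extrapolate_to_trough(c_obs, t_obs_h, t_trough_h, half_life_h=18.0):
    """Extrapolate a randomly timed plasma level to the trough time,
    assuming mono-exponential decline at a typical elimination rate
    (the 18 h half-life is an illustrative value, not the study's)."""
    ke = math.log(2) / half_life_h        # typical elimination constant
    dt = t_trough_h - t_obs_h             # time remaining until trough
    return c_obs * math.exp(-ke * dt)

# A level of 2000 ng/mL drawn 6 h post-dose, extrapolated to a 24 h trough:
c_min_pred = extrapolate_to_trough(2000.0, 6.0, 24.0)   # dt = 18 h, ~1000 ng/mL
```

Because the single observed level carries all the individual information here, this approach cannot shrink extreme observations toward the population, which is what the Bayesian MAP approaches add.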


Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (inclusion still ongoing), divided into 4 groups. The aim is to evaluate adherence and to offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years), monitored for an average duration of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. Five patients underwent cyclic treatment (2 weeks out of 3) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily), according to the questionnaire results obtained at the end of the monitoring period.
Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities in order to maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. With longer follow-up, a better evaluation of our method and its impact will be possible. Interpretation of the results of the patients in the other groups of this ongoing trial will provide us with information for a more detailed analysis.
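Persistence and quality of execution from MEMS opening records can be computed along these lines. This is a simplified sketch for a once-daily regimen; the exact definitions used in the study may differ, and the dates are invented:

```python
from datetime import date, timedelta

def adherence_metrics(openings, start, end):
    """Two MEMS-style adherence parameters for a once-daily regimen:
    persistence - fraction of the monitoring period elapsed before the
                  last recorded opening (no premature discontinuation)
    execution   - fraction of monitored days with at least one opening"""
    period = (end - start).days + 1
    persistence = ((max(openings) - start).days + 1) / period
    execution = len({d for d in openings if start <= d <= end}) / period
    return persistence, execution

# A patient who opened the device on 9 of 10 monitored days:
days = [date(2024, 1, 1) + timedelta(days=i) for i in range(9)]
persistence, execution = adherence_metrics(days, date(2024, 1, 1),
                                           date(2024, 1, 10))
```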


Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and discrimination of them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
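The PCA comparator mentioned above (feature extraction by principal components before classification) can be sketched with plain NumPy. This is an illustrative reconstruction on surrogate data, not the authors' code:

```python
import numpy as np

def pca_features(windows, n_components=3):
    """Project EEG windows (n_windows x n_samples) onto their leading
    principal components, obtained from the SVD of the centred data."""
    X = windows - windows.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # rows of Vt = axes
    return X @ Vt[:n_components].T

rng = np.random.default_rng(0)
epochs = rng.standard_normal((50, 128))   # 50 surrogate EEG windows
features = pca_features(epochs)           # 50 x 3 feature matrix
```

A classifier would then be trained on these features (or on ICA-derived ones, as in the paper) to separate epileptiform events from eye blinks.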


Explicitly correlated coupled-cluster calculations of intermolecular interaction energies for the S22 benchmark set of Jurecka, Sponer, Cerny, and Hobza (Phys. Chem. Chem. Phys. 2006, 8, 1985) are presented. Results obtained with the recently proposed CCSD(T)-F12a method and augmented double-zeta basis sets are found to be in very close agreement with basis-set-extrapolated conventional CCSD(T) results. Furthermore, we propose a dispersion-weighted MP2 (DW-MP2) approximation that combines the good accuracy of MP2 for complexes with predominantly electrostatic bonding with that of SCS-MP2 for dispersion-dominated ones. The MP2-F12 and SCS-MP2-F12 correlation energies are weighted by a switching function that depends on the relative HF and correlation contributions to the interaction energy. For the S22 set, this yields a mean absolute deviation of 0.2 kcal/mol from the CCSD(T)-F12a results. The method, which yields accurate results at low cost, is also tested for a number of dimers that are not in the training set.
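The dispersion-weighted mixing step can be illustrated as follows. The sigmoid form and the parameters `a` and `b` are placeholders, not the fitted switching function from the paper, and the energies are invented:

```python
import math

def dw_mp2(e_mp2, e_scs_mp2, e_hf, e_corr, a=1.0, b=0.0):
    """Dispersion-weighted MP2: blend the MP2 and SCS-MP2 interaction
    energies with a weight that switches on the relative HF and
    correlation contributions (a, b are illustrative parameters)."""
    ratio = e_hf / e_corr          # electrostatic vs. dispersion character
    w = 1.0 / (1.0 + math.exp(-a * (ratio + b)))   # weight on plain MP2
    return w * e_mp2 + (1.0 - w) * e_scs_mp2

# Purely correlation-bound case (no HF contribution): equal-weight blend.
e_blend = dw_mp2(-10.0, -6.0, e_hf=0.0, e_corr=-5.0)   # -> -8.0
```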


This paper reports on the purpose, design, methodology and target audience of e-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting the results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, interpretation of raw data and communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses that focussed on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. In order to help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in the current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of independence and self-confidence to solve practical inference problems. E-learning was chosen as a general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and fields of experience, such as reporting officers, (chief) scientists and forensic coordinators, but also lawyers, all of whom can interact directly from their personal workplaces without consideration of distances, travel expenses or time schedules.
In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, but also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view with regard to interpretation across forensic disciplines, laboratories and judicial systems.
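The core evaluative step taught in such courses, combining prior odds with a likelihood ratio, can be shown in a toy calculation. The numbers are purely illustrative:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds x LR."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    """Convert odds back to a probability."""
    return odds / (1.0 + odds)

# Prior odds of 1:100 combined with evidence reported as LR = 1000:
post = posterior_odds(0.01, 1000.0)      # -> 10.0 (i.e. odds of 10:1)
prob = odds_to_probability(post)         # -> ~0.909
```

Keeping the likelihood ratio (the scientist's domain) separate from the prior and posterior odds (the court's domain) is precisely the division of labour such courses emphasize.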


A new method is used to estimate the sediment volumes of glacial valleys. This method is based on the concept of the sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by a progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed using an iterative routine that replaces the altitude of a point of the DEM by the mean value of its neighbors minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the freely available Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results obtained are in good agreement with previous estimations based on seismic profiles and gravimetric modeling, with the exception of some particular locations. The results from the present method and those from the seismic interpretation differ slightly from the results of the gravimetric data. This discrepancy may result from the presence of large buried landslides at the bottom of the Rhone Valley.
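The iterative excavation routine described above can be sketched as follows. The fixed value, iteration count and toy grid are illustrative choices; the actual implementation has additional controls:

```python
import numpy as np

def excavate(dem, valley_mask, drop=1.0, n_iter=500):
    """Estimate a bedrock surface by repeatedly replacing each valley
    cell with the mean of its four neighbours minus a fixed value,
    holding cells outside the valley mask at the surface altitude."""
    z = dem.astype(float).copy()
    for _ in range(n_iter):
        # mean of the 4-neighbours of every interior cell
        nb = (z[:-2, 1:-1] + z[2:, 1:-1] + z[1:-1, :-2] + z[1:-1, 2:]) / 4.0
        inner = valley_mask[1:-1, 1:-1]
        z[1:-1, 1:-1][inner] = np.minimum(nb[inner] - drop,
                                          dem[1:-1, 1:-1][inner])
    return z

dem = np.zeros((5, 5))          # flat valley fill at altitude 0
mask = np.zeros((5, 5), bool)
mask[1:4, 1:4] = True           # mapped limits of the alluvial valley
bedrock = excavate(dem, mask)   # curved surface, deepest at the centre
```

The iteration converges to a surface whose curvature is set by the fixed value, which is why the result is quadratic in 2D; the fill volume is then the difference between the DEM and this surface.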


The determination of line crossing sequences between rollerball pens and laser printers presents difficulties that may not be overcome using traditional techniques. This research aimed to study the potential of digital microscopy and 3-D laser profilometry to determine line crossing sequences between a toner and an aqueous ink line. Different paper types, rollerball pens, and writing pressures were tested. Correct opinions on the sequence were given for all case scenarios using both techniques. When the toner was printed before the ink, a light reflection was observed in all crossing specimens, while this was never observed in the other sequence types. 3-D laser profilometry, although more time-consuming, presented the main advantage of providing quantitative results. The findings confirm the potential of 3-D laser profilometry and demonstrate the efficiency of digital microscopy as a new technique for determining the sequence of line crossings involving rollerball pen ink and toner.

With the mass marketing of laser printers and the popularity of rollerball pens, the determination of line crossing sequences between such instruments is encountered by forensic document examiners. This type of crossing presents difficulties for the optical microscopic techniques developed for line crossings involving ballpoint pens or gel pens and toner (1-4). Indeed, the rollerball's aqueous ink penetrates through the toner and is absorbed by the fibers of the paper, leaving the examiner with the impression that the toner is above the ink even when it is not (5). Novotny and Westwood (3) investigated the possibility of determining aqueous ink and toner crossing sequences by microscopic observation of the intersection before and after toner removal. A major disadvantage of their study resides in the destruction of the sample by scraping off the toner line to see what was underneath.
The aim of this research was to investigate ways to overcome these difficulties through digital microscopy and three-dimensional (3-D) laser profilometry. The former has been used as a technique for the determination of sequences between gel pen and toner printing strokes, but provided less conclusive results than an optical stereomicroscope (4). 3-D laser profilometry, which allows one to observe and measure the topography of a surface, has been the subject of a number of recent studies in this area. Berx and De Kinder (6) and Schirripa Spagnolo (7,8) have tested the application of laser profilometry to determine the sequence of intersections of several lines. The results obtained in these studies overcome the disadvantages of other methods applied in this area, such as the scanning electron microscope or the atomic force microscope. The main advantages of 3-D laser profilometry include the ease of implementation of the technique and its nondestructive nature, which does not require sample preparation (8-10). Moreover, the technique is reproducible and presents a high degree of freedom on the vertical axis (up to 1000 μm). However, when the paper surface presents a given roughness, if the pen impressions alter the paper to a depth similar to the roughness of the medium, the results are not always conclusive (8). In that case it becomes difficult to distinguish which characteristics can be imputed to the pen impressions and which to the quality of the paper surface. This important limitation is assessed here by testing different types of paper of variable quality (of different grammage and finishing) and different writing pressures. The authors will therefore assess the limits of the 3-D laser profilometry technique and determine whether the method can overcome such constraints. Second, the authors will investigate the use of digital microscopy, because it presents a number of advantages: it is efficient, user-friendly, and provides an objective evaluation and interpretation.


OBJECTIVE: Few epidemiological studies have addressed the health of workers exposed to novel manufactured nanomaterials. The small current workforce will necessitate pooling international cohorts. METHOD: A road map was defined for a globally harmonized framework for the careful choice of materials, exposure characterization, identification of study populations, definition of health endpoints, evaluation of appropriateness of study designs, data collection and analysis, and interpretation of the results. RESULTS: We propose a road map to reach global consensus on these issues. The proposed strategy should ensure that the costs of action are not disproportionate to the potential benefits and that the approach is pragmatic and practical. CONCLUSIONS: We should aim to go beyond the collection of health complaints, illness statistics, or even counts of deaths; the manifestation of such clear endpoints would indicate a failure of preventive measures.


Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, some of which were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first-triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
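The profile-comparison step, a Canberra distance between image-derived profiles, can be sketched as follows. The profiles below are invented; the actual filters and regions of interest are described in the article:

```python
import numpy as np

def canberra(p, q, eps=1e-12):
    """Canberra distance: a sum of elementwise relative differences,
    which weights small profile values as strongly as large ones."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    return float(np.sum(np.abs(p - q) / (np.abs(p) + np.abs(q) + eps)))

# Two hypothetical Hue-filter profiles from scanned documents:
identical = canberra([0.2, 0.5, 0.3], [0.2, 0.5, 0.3])   # -> 0.0
different = canberra([0.2, 0.5, 0.3], [0.4, 0.1, 0.5])   # ~ 1.25
```

Documents whose pairwise distances fall below a calibrated threshold would then be grouped as candidates for a common source.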


It is well known that regression analyses involving compositional data need special attention because the data are not of full rank. For a regression analysis where both the dependent and independent variables are components, we propose a transformation of the components emphasizing their roles as dependent and independent variables. A simple linear regression can then be performed on the transformed components. The regression line can be depicted in a ternary diagram, facilitating the interpretation of the analysis in terms of components. An example with time-budgets illustrates the method and the graphical features.
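The general idea of transforming components before a simple linear regression can be illustrated with the standard additive log-ratio transform. The paper proposes its own transformation; this is only a generic analogue, with invented time-budget compositions:

```python
import numpy as np

def alr(parts):
    """Additive log-ratio transform of compositions (rows sum to 1),
    taking the last component as the reference part."""
    parts = np.asarray(parts, float)
    return np.log(parts[:, :-1] / parts[:, -1:])

# Invented 3-part time-budget compositions for predictor and response:
x = np.array([[0.2, 0.3, 0.5], [0.3, 0.3, 0.4], [0.4, 0.3, 0.3]])
y = np.array([[0.1, 0.4, 0.5], [0.2, 0.4, 0.4], [0.3, 0.4, 0.3]])
X, Y = alr(x), alr(y)
slope, intercept = np.polyfit(X[:, 0], Y[:, 0], 1)   # regression on transforms
```

Working in transformed coordinates sidesteps the rank deficiency of the raw compositions; the fitted line can be mapped back and drawn in the ternary diagram.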


We have developed an easy method for the synthesis of thirteen compounds derived from 1,2,4-triazoles through a carboxylic acid and hydrazinophthalazine reaction, with a 75-85% yield mediated by the use of agents such as 1-ethyl-3-(3'-dimethylaminopropyl)-carbodiimide hydrochloride and 1-hydroxybenzotriazole. The operational simplicity of this method and the good yield of products make it valuable for the synthesis of new compounds with pharmacological activity.


This paper describes the development and validation of a UV-Visible spectrophotometric method for the quantitation of genistein and genistin in soy dry extracts, after reaction with aluminum chloride. The method was shown to be linear (r² = 0.9999), precise (R.S.D. < 2%), accurate (recovery of 101.56%) and robust. Seven samples of soy dry extracts were analyzed by the validated spectrophotometric method and by RP-HPLC. Genistein concentrations determined by spectrophotometry (0.63%−16.05%) were slightly higher than the values obtained by HPLC analysis (0.40%−12.79%); however, the results of the two methods showed a strong correlation.
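Linearity figures such as r² come from a least-squares calibration line. A sketch with an idealized synthetic calibration (not the soy-extract data):

```python
import numpy as np

def calibration_stats(conc, absorbance):
    """Fit a least-squares calibration line and return slope,
    intercept and the coefficient of determination r^2."""
    slope, intercept = np.polyfit(conc, absorbance, 1)
    pred = slope * conc + intercept
    ss_res = np.sum((absorbance - pred) ** 2)
    ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # ug/mL standards (invented)
absn = 0.05 * conc + 0.002                    # ideal linear response
slope, intercept, r2 = calibration_stats(conc, absn)
```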


A spectrophotometric method for the indirect determination of captopril (CP) in pharmaceutical formulations is proposed. The procedure is based on the oxidation of captopril by potassium dichromate and the determination of the excess oxidant through its reaction with diphenylcarbazide (DPC). Under optimum conditions, a good linear relationship (r = 0.9997) was obtained in the range of 0.08−3.5 µg mL-1. The limits of detection and quantitation of the assay were 0.024 and 0.08 µg mL-1, respectively. The results obtained for captopril determination in pharmaceuticals using the proposed method and those obtained with the US Pharmacopoeia method were in good agreement at the 95% confidence level.
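Detection and quantitation limits of this kind are commonly derived from the blank response and the calibration slope via the ICH formulas. A sketch with invented numbers (the paper does not state which formula it used):

```python
def lod_loq(blank_sd, slope):
    """ICH-style limits from the standard deviation of the blank
    response and the calibration slope:
    LOD = 3.3 * sd / slope, LOQ = 10 * sd / slope."""
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

# Illustrative inputs, not the paper's raw data:
lod, loq = lod_loq(blank_sd=0.0012, slope=0.15)   # -> (0.0264, 0.08) ug/mL
```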


The purpose of the thesis is to generate scenarios of the future purposes and uses of ships, suitable for STX Finland Cruise Oy to design and build, over a 50-year time span, by applying the Delphi method and an open innovation approach in a future workshop. The scenarios were mapped out with the help of two Delphi survey rounds and one future workshop. Both surveys and the workshop each involved some twenty experts representing various fields. On the basis of the first survey round, four subject areas were selected for analysis: purposes for the use of ships; energy efficiency of cruises and ships; cost efficiency of sea transportation and vacation; and the views and expectations of the customers in the future. As a result of the future workshop, four themes were established, which were studied further during the second Delphi round: future service and operation concepts; versatile uses of the space in ships; communication of the environmental benefits of ships and future energy solutions; and social interaction between passengers onboard. In addition to generating the scenarios, a further aim of the thesis is to implement the Delphi method and workshop activity as foresight tools for STX Europe and to produce a chart of a future shipbuilding foresight community that can serve the open innovation processes in the maritime cluster as a whole.


This dissertation approaches the manifestations of ideology in U.S. Strategic Communication. The discussion relates Strategic Communication to the Enlightenment narratives and suggests that these narratives maintain similar social and political functions. The dissertation aims to address the key contents and mechanisms of Strategic Communication by covering the perspectives of (i) communication as leadership and (ii) communication as discourse, i.e. practice and contents. Throughout the empirical part of the dissertation, the communication-theoretical discussion is supported by a methodological framework that bridges Critical Discourse Analysis (CDA) and functional language theory. According to the principles of CDA, Strategic Communication is treated as ideological, hegemonic discourse that impacts the social order. The primary method of analysis is transitivity analysis, which is concerned with how language and its patterns construe reality. This analysis is complemented by a discussion of the rituals of production and interpretation, which can be treated as visual extensions of textual transitivity. The concept of agency is the key object of analysis. From the perspective of leadership, Strategic Communication is essentially a leadership model through which the organization defines itself, its aims and its legitimacy. This dissertation arrives at the conclusion that Strategic Communication is used not only as a concept for managing Public Relations and information operations; it is an essential asset in the inter-organizational management of its members. Current developments indicate that the concept is moving towards even heavier measures of control. From the perspective of language and discourse, the key narratives of Strategic Communication are advocated with the intrinsic values of democracy and technological progress as the prerequisites of ethics and justice. The transitivity patterns reveal highly polarized agency.
The agency of the Self is typically outsourced to technology. Further, the transitivity patterns demonstrate how the effects-centric paradigm of warfare has created a lexicon that is ideologically exclusive. It has led to the development of two mutually exclusive sets of vocabulary, in which the descriptions of legitimate action exclude Others by default. These ideological discourses have become naturalized in the official vocabulary of strategic planning and leadership. Finally, the analysis of the images of the captures and deaths of Saddam Hussein, Osama bin Laden and Muammar Gaddafi brings the discussion back to the themes of the Enlightenment by demonstrating how democracy is framed to serve political purposes. The images of democracy are essentially images of violence. Contrary to the official, instrumental and humanitarian narratives of Strategic Communication, it is the grammar of expressive, violent rituals that serves as the instrument of unity.