84 results for process analysis
at Université de Lausanne, Switzerland
Formulation and Implementation of Air Quality Control Programmes: Patterns of Interest Consideration
Abstract:
This article investigates some central aspects of the relationships between programme structure and implementation of sulphur dioxide air quality control policies. Previous implementation research, primarily adopting American approaches, has neglected the connections between the processes of programme formulation and implementation. 'Programme', as the key variable in implementation studies, has been defined too narrowly. On the basis of theoretical and conceptual reflections and provisional empirical results from studies in France, Italy, England, and the Federal Republic of Germany, the authors demonstrate that an integral process analysis using a more extended programme concept is necessary if patterns of interest recognition in policies are to be discovered. Otherwise, the still important question of critical social science cannot be answered: what is the impact of special interests upon implementation processes?
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
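The likelihood-ratio interpretation from Bayes' theorem can be illustrated with a minimal sketch for a single scalar ink feature. The Gaussian densities and all numbers below are hypothetical illustrations, not values from the study:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a Gaussian N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(e, mu_h1, sd_h1, mu_h2, sd_h2):
    """LR = P(E | H1) / P(E | H2): how much more probable the measured ink
    feature e is if the two inks share a source (H1) than if they do not (H2)."""
    return normal_pdf(e, mu_h1, sd_h1) / normal_pdf(e, mu_h2, sd_h2)

# Hypothetical numbers: a similarity score of 0.9 that clusters tightly near
# 1.0 under H1 and spreads broadly under H2. An LR above 1 supports H1.
lr = likelihood_ratio(0.9, mu_h1=1.0, sd_h1=0.1, mu_h2=0.5, sd_h2=0.5)
```

In a real forensic application the two densities would be estimated from an ink reference database, which is exactly why the abstract insists on reproducible analytical practice.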
Abstract:
In this research, we analyse the contact-specific mean of the final cooperation probability, distinguishing on the one hand between contacts with household reference persons and with other eligible household members, and on the other hand between first and later contacts. Data come from two Swiss Household Panel surveys. The interviewer-specific variance is higher for first contacts, especially in the case of the reference person. For later contacts with the reference person, the contact-specific variance dominates. This means that interaction effects and situational factors are decisive. The contact number has negative effects on the performance of contacts with the reference person, and positive effects in the case of other persons. The time elapsed since the previous contact also has negative effects in the case of reference persons. The result of the previous contact has strong effects, especially in the case of the reference person. These findings call for a quick completion of the household grid questionnaire, assigning the best interviewers to conduct the first contact. While obtaining refusals has negative effects, obtaining other contact results has only weak effects on the interviewer's subsequent contact outcome. Using the same interviewer for contacts has no positive effects.
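The contrast between interviewer-specific and contact-specific variance can be sketched as a simple one-way variance decomposition. This is an illustrative estimator on invented 0/1 cooperation outcomes, not the panel study's actual multilevel model:

```python
def variance_components(groups):
    """One-way decomposition of outcomes into a between-group component
    (here: interviewer-specific variance) and a within-group component
    (contact-specific variance). Population-style estimators are used so
    the two parts sum exactly to the total variance."""
    values = [v for g in groups for v in g]
    n = len(values)
    grand_mean = sum(values) / n
    between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2
                  for g in groups) / n
    within = sum((v - sum(g) / len(g)) ** 2
                 for g in groups for v in g) / n
    return between, within

# Invented 0/1 cooperation outcomes, grouped by interviewer.
groups = [[1, 1, 0], [0, 0, 1]]
between, within = variance_components(groups)
```

A dominant `within` component corresponds to the paper's finding for later contacts: the contact situation, not the interviewer, drives the outcome.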
Abstract:
The objective of this work was to combine the advantages of the dried blood spot (DBS) sampling process with highly sensitive and selective negative-ion chemical ionization tandem mass spectrometry (NICI-MS-MS) to analyze recent antidepressants, including fluoxetine, norfluoxetine, reboxetine, and paroxetine, in micro whole blood samples (i.e., 10 microL). Before analysis, DBS samples were punched out, and antidepressants were simultaneously extracted and derivatized in a single step by use of pentafluoropropionic acid anhydride and 0.02% triethylamine in butyl chloride for 30 min at 60 degrees C under ultrasonication. Derivatives were then separated on a gas chromatograph coupled with a triple-quadrupole mass spectrometer operating in negative selected reaction monitoring mode for a total run time of 5 min. To establish the validity of the method, trueness, precision, and selectivity were determined on the basis of the guidelines of the "Société Française des Sciences et des Techniques Pharmaceutiques" (SFSTP). The assay was found to be linear in the concentration ranges 1 to 500 ng mL(-1) for fluoxetine and norfluoxetine and 20 to 500 ng mL(-1) for reboxetine and paroxetine. Despite the small sampling volume, the limit of detection was estimated at 20 pg mL(-1) for all the analytes. The stability of DBS was also evaluated at -20 degrees C, 4 degrees C, 25 degrees C, and 40 degrees C for up to 30 days. Furthermore, the method was successfully applied to a pharmacokinetic investigation performed on a healthy volunteer after oral administration of a single 40-mg dose of fluoxetine. Thus, this validated DBS method combines a single extraction-derivatization step with a fast and sensitive GC-NICI-MS-MS technique. Using microliter blood samples, this procedure offers a patient-friendly tool for many biomedical applications, such as checking treatment adherence, therapeutic drug monitoring, toxicological analyses, or pharmacokinetic studies.
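The linearity check behind a reported concentration range such as 1-500 ng/mL can be sketched with an ordinary least-squares calibration line. The standards and instrument responses below are invented illustration values, not the paper's data:

```python
def fit_calibration(conc, resp):
    """Ordinary least-squares fit resp = a + b * conc, the usual way a
    linear concentration range is verified during method validation."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
         / sum((x - mx) ** 2 for x in conc))
    a = my - b * mx
    return a, b

# Hypothetical calibration standards spanning 1-500 ng/mL and invented
# detector responses roughly proportional to concentration.
a, b = fit_calibration([1, 50, 100, 250, 500], [0.9, 41.0, 80.5, 198.0, 402.0])

def back_calculate(resp):
    """Concentration estimated from a measured response via the fitted line."""
    return (resp - a) / b
```

Back-calculating the standards and checking their deviation against acceptance criteria (per guidelines such as SFSTP's) is then a matter of applying `back_calculate` to each calibration response.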
Abstract:
Deliberate fires appear to be borderless and timeless events, creating a serious security problem. There have been many attempts to develop approaches to tackle this problem, but unfortunately acting effectively against deliberate fires has proven a complex challenge. This article reviews the current situation relating to deliberate fires: what do we know, how serious is the situation, how is it being dealt with, and what challenges are faced when developing a systematic and global methodology to tackle the issues? The repetitive nature of some types of deliberate fires will also be discussed. Finally, drawing on the reality of repetition within deliberate fires and encouraged by successes obtained against previous repetitive crimes (such as property crimes or drug trafficking), we will argue that the use of the intelligence process cycle as a framework to allow a follow-up and systematic analysis of fire events is a relevant approach. This is the first article in a series of three. It introduces the context and discusses the background issues in order to provide better underpinning knowledge to managers and policy makers planning to tackle this issue. The second part will present a methodology developed to detect and identify repetitive fire events from a set of data, and the third part will discuss the analysis of these data to produce intelligence.
Abstract:
We have suggested previously that both the negatively and positively charged residues of the highly conserved Glu/Asp-Arg-Tyr (E/DRY) motif play an important role in the activation process of the alpha(1b)-adrenergic receptor (AR). In this study, R143 of the E/DRY sequence in the alpha(1b)-AR was mutated into several amino acids (Lys, His, Glu, Asp, Ala, Asn, and Ile). The charge-conserving mutation of R143 into lysine not only preserved the maximal agonist-induced response of the alpha(1b)-AR, but also conferred a high degree of constitutive activity on the receptor. Both basal and agonist-induced phosphorylation levels were significantly increased for the R143K mutant compared with those of the wild-type receptor. Other substitutions of R143 resulted in receptor mutants with either a small increase in constitutive activity (R143H and R143D), impairment (R143H, R143D), or complete loss of receptor-mediated response (R143E, R143A, R143N, R143I). The R143E mutant displayed a small but significant increase in basal phosphorylation despite being severely impaired in receptor-mediated response. Interestingly, all the arginine mutants displayed increased affinity for agonist binding compared with the wild-type alpha(1b)-AR. A correlation was found between the extent of the affinity shift and the intrinsic activity of the agonists. The analysis of the receptor mutants using the allosteric ternary complex model, in conjunction with the results of molecular dynamics simulations on the receptor models, supports the hypothesis that mutations of R143 can drive the isomerization of the alpha(1b)-AR into different states, highlighting the crucial role of this residue in the activation process of the receptor.
Abstract:
In 1903, the eastern slope of Turtle Mountain (Alberta) was affected by a 30 Mm3 rockslide, the Frank Slide, that resulted in more than 70 casualties. Assuming that the main discontinuity sets, including bedding, control part of the slope morphology, the structural features of Turtle Mountain were investigated using a digital elevation model (DEM). Using new landscape analysis techniques, we have identified three main joint and fault sets. These results are in agreement with the sets identified through field observations. Landscape analysis techniques using a DEM confirm and refine the most recent geological model of the Frank Slide. The rockslide was initiated along bedding and a fault at the base of the slope and propagated up slope by a regressive process following a surface composed of pre-existing discontinuities. The DEM analysis also permits the identification of important geological structures along the 1903 slide scar. Based on the so-called Sloping Local Base Level (SLBL), an estimation was made of the presently unstable volumes in the main scar delimited by the cracks and around the south area of the scar (South Peak). The SLBL is a method permitting a geometric interpretation of the failure surface based on a DEM. Finally, we propose a failure mechanism, permitting the progressive failure of the rock mass, that considers gently dipping wedges (30°). The prisms or wedges defined by two discontinuity sets permit the creation of a failure surface by progressive failure. Such structures are more commonly observed in recent rockslides. This method is efficient and is recommended as a preliminary analysis prior to field investigation.
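The SLBL idea can be sketched in one dimension under a common simplifying assumption: each interior node of the profile is iteratively lowered to the mean of its two neighbours (minus an optional tolerance) whenever that mean lies below it, so the surface converges to a curve beneath the topography. The profile values are invented:

```python
def slbl_1d(z, tolerance=0.0, iterations=1000):
    """1D sketch of a Sloping Local Base Level (SLBL) routine: iteratively
    lower each interior node toward the mean of its neighbours, producing a
    candidate failure surface below the topographic profile."""
    s = list(z)
    for _ in range(iterations):
        changed = False
        for i in range(1, len(s) - 1):
            target = (s[i - 1] + s[i + 1]) / 2.0 - tolerance
            if target < s[i]:
                s[i] = target
                changed = True
        if not changed:
            break
    return s

# Invented ridge profile (elevations in metres).
profile = [0.0, 5.0, 9.0, 10.0, 9.0, 5.0, 0.0]
surface = slbl_1d(profile)
# 1D stand-in for an unstable volume: area between profile and SLBL surface.
area_proxy = sum(p - s for p, s in zip(profile, surface))
```

On a real DEM the same update runs over a 2D grid and, with a nonzero tolerance, yields the curved failure surface used to estimate unstable volumes such as those at South Peak.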
Abstract:
Immunocompetent microglia play an important role in the pathogenesis of Alzheimer's disease (AD). Antimicroglial antibodies in the cerebrospinal fluid (CSF) in clinically diagnosed AD patients have been previously recorded. Here, we report the results of the analysis of the CSF from 38 autopsy cases: 7 with definite AD; 14 with mild and 10 with moderate Alzheimer's type pathology; and 7 controls. Antimicroglial antibodies were identified in 70% of patients with definite AD, in 80% of patients with moderate and in 28% of patients with mild Alzheimer's type pathology. CSF antimicroglial antibodies were not observed in any of the control cases. The results show that CSF antimicroglial antibodies are present in the majority of patients with definite AD and also in cases with moderate Alzheimer's type changes. They may also indicate dysregulation of microglial function. Together with previous observations, these findings indicate that compromised immune defense mechanisms play an important role in the pathogenesis of AD.
Abstract:
There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support an effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process, (b) the analyses of investigative problems along principal perspectives: entities and their relationships, time and space, quantitative aspects and (c) visualisation methods as a mode of expression of a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role for supporting decisions. Among them, link-charts are scrutinised for their abilities to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyser for a collaborative approach integrating forensic data thus calls for better specifications.
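Stripped of its visual layer, a link chart is just entities and the relations connecting them. A minimal sketch of that underlying data structure, with invented entity names:

```python
# Relations between entities, as they might be extracted from forensic case
# data (all names are invented for illustration).
links = [
    ("case_A", "phone_123"),
    ("case_B", "phone_123"),
    ("case_B", "vehicle_XY"),
]

def neighbours(entity, links):
    """Entities directly connected to `entity` in the link chart."""
    out = set()
    for a, b in links:
        if a == entity:
            out.add(b)
        elif b == entity:
            out.add(a)
    return out

# The shared phone connects the two cases: the kind of relation a
# link chart is meant to make visible to investigators.
shared = neighbours("phone_123", links)
```

How such a structure is laid out and annotated on screen is precisely the design question the abstract raises, since a poor layout can bias the reasoning it is meant to support.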
Abstract:
In many fields, the spatial clustering of sampled data points has important consequences. Therefore, several indices have been proposed to assess the level of clustering affecting datasets (e.g. the Morisita index, Ripley's K-function and Rényi's generalized entropy). The classical Morisita index measures how many times more likely it is to select two measurement points from the same quadrat (the data set is covered by a regular grid of changing size) than it would be for a random distribution generated from a Poisson process. The multipoint version (k-Morisita) takes into account k points, with k >= 2. The present research deals with a new development of the k-Morisita index for (1) monitoring network characterization and (2) detection of patterns in monitored phenomena. From a theoretical perspective, a connection between the k-Morisita index and multifractality has also been found and highlighted on a mathematical multifractal set.
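The index described above can be computed directly from quadrat counts; k = 2 recovers the classical Morisita index. The counts below are invented illustrations:

```python
def morisita(counts, k=2):
    """k-point Morisita index over quadrat counts n_i:
    I_k = Q^(k-1) * sum_i n_i(n_i-1)...(n_i-k+1) / (N(N-1)...(N-k+1)),
    where Q is the number of quadrats and N the total number of points.
    Values near 1 indicate a random (Poisson) pattern, above 1 clustering,
    below 1 regularity."""
    Q = len(counts)
    N = sum(counts)

    def falling(n, k):
        prod = 1
        for j in range(k):
            prod *= (n - j)
        return prod

    return Q ** (k - 1) * sum(falling(n, k) for n in counts) / falling(N, k)

# Invented quadrat counts: 8 points piled into one quadrat vs. evenly spread.
clustered = [8, 0, 0, 0]
even = [2, 2, 2, 2]
```

With these counts the clustered pattern scores 4.0 (strong clustering) and the even pattern scores below 1 (more regular than random), matching the interpretation in the abstract.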
Abstract:
Purpose/Objective(s): RT with TMZ is the standard for GBM. dd TMZ causes prolonged MGMT depletion in mononuclear cells and possibly in tumor. The RTOG 0525 trial (ASCO 2011) did not show an advantage from dd TMZ for survival or progression-free survival. We conducted exploratory, hypothesis-generating subset analyses to detect possible benefit from dd TMZ. Materials/Methods: Patients were randomized to std (150-200 mg/m2 x 5 d) or dd TMZ (75-100 mg/m2 x 21 d) q 4 weeks for 6-12 cycles. Eligibility included age >= 18, KPS >= 60, and >= 1 cm2 tissue for prospective MGMT analysis for stratification. Further analyses were performed for all randomized patients ("intent-to-treat", ITT) and for all patients starting protocol therapy (SPT). Subset analyses were performed by RPA class (III, IV, V), KPS (90-100 vs. <90), resection (partial, total), gender (female, male), and neurologic dysfunction (nf = none, minor, moderate). Results: No significant difference was seen for median OS (16.6 vs. 14.9 months) or PFS (5.5 vs. 6.7 months, p = 0.06). MGMT methylation was linked to improved OS (21.2 vs. 14 months, p < 0.0001) and PFS (8.7 vs. 5.7 months, p < 0.0001). For the ITT group (n = 833), there was no OS benefit from dd TMZ in any subset. Two subsets showed a PFS benefit for dd TMZ: RPA class III (6.2 vs. 12.6 months, HR 0.69, p = 0.03) and nf = minor (HR 0.77, p = 0.01). For RPA III, dd dramatically delayed progression, but post-progression dd patients died more quickly than std. A similar pattern was observed for nf = minor. For the SPT group (n = 714), there was neither a PFS nor an OS benefit for dd TMZ overall. For RPA class III and nf = minor, there was a PFS benefit for dd TMZ (HR 0.73, p = 0.08; HR 0.77, p = 0.02). For the nf = moderate subset, in both ITT and SPT, the std arm showed superior OS (14.4 vs. 10.9 months) compared to dd, without improved PFS (HR 1.46, p = 0.03; and HR 1.74, p = 0.01). In terms of methylation status within this subset, there were more methylated patients in the std arm of the ITT subset (n = 159; 32 vs. 24%). For the SPT subset (n = 124), methylation status was similar between arms. Conclusions: This study did not demonstrate improved OS for dd TMZ in any subgroup, but for two highly functional subgroups, PFS was significantly increased. These data generate the testable hypothesis that intensive treatment may selectively improve disease control in those most likely able to tolerate dd therapy. Interpretation should be cautious given the small sample sizes, multiple comparisons, and other confounders. Acknowledgment: This project was supported by RTOG grant U10 CA21661 and CCOP grant U10 CA37422 from the National Cancer Institute (NCI).
Abstract:
PURPOSE OF REVIEW: The mechanisms involved in the formation of red blood cell (RBC) microparticles in vivo as well as during erythrocyte storage are reviewed, and the potential role of microparticles in transfusion medicine is described. RECENT FINDINGS: Microparticle release is an integral part of the erythrocyte ageing process, preventing early removal of RBCs. Proteomics analyses have outlined the key role of the band 3-ankyrin anchoring complex and the occurrence of selective RBC membrane remodelling mechanisms in microparticle formation. The presence of several RBC antigens, expressed on microparticles, has been demonstrated. The potential deleterious effects of RBC microparticles in transfused recipients, including hypercoagulability, microcirculation impairment and immunosuppression, are discussed. SUMMARY: Formation and role of RBC microparticles are far from being completely understood. Combining various approaches to elucidate these mechanisms could improve blood product quality and transfusion safety. Implementation of RBC microparticles as biomarkers in the laboratory routine needs to overcome technical barriers involved in their analysis.
Abstract:
This contribution explores the role of international standards in the rules governing the internationalisation of the service economy. It analyses, on a cross-institutional basis, patterns of authority in the institutional setting of service standards in the European and American contexts. The entry into force of the World Trade Organisation (WTO) in 1995 gave international standards a major role in harmonising the technical specifications of goods and services traded on the global market. Despite the careful wording of the WTO, a whole range of international bodies still have the capacity to define generic as well as detailed technical specifications affecting how swelling offshore services are expected to be traded on a worldwide basis. The analysis relies on global political economy approaches to identify constitutive patterns of authority mediating between the political and the economic spheres in a transnational space. It extends to the area of service standards the assumption that the process of globalisation does not oppose states and markets but is a joint expression of both of them, including new patterns and agents of structural change through formal and informal power and regulatory practices. The paper argues that service standards reflect the significant development of a form of transnational hybrid authority that blurs the distinction between private and public actors, whose scope spreads all the way from physical measures to societal values, and which reinforces the deterritorialisation of regulatory practices in contemporary capitalism. It provides evidence of this argument by analysing the current European strategy regarding service standardisation in response to several programming mandates of the European Commission, and American views on the future development of service standards.