915 results for New methodology
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. The latter requirement arises from the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
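The specific comparison algorithms are detailed in Part II of the cited series; as a rough, hypothetical illustration of the kind of automatic, objective comparison the abstract describes, the sketch below scores the similarity of two HPTLC densitometric profiles with a simple Pearson correlation. The profile arrays and the choice of metric are assumptions for illustration, not the authors' published algorithm.

```python
import numpy as np

def normalize(profile):
    """Scale a densitometric trace to unit area so traces of different
    overall intensity can be compared."""
    profile = np.asarray(profile, dtype=float)
    return profile / profile.sum()

def similarity(profile_a, profile_b):
    """Pearson correlation between two normalized HPTLC profiles sampled
    at the same retention-factor positions; values near 1 suggest the
    inks are analytically indistinguishable."""
    a, b = normalize(profile_a), normalize(profile_b)
    return float(np.corrcoef(a, b)[0, 1])

# Illustrative traces for a questioned ink and a library reference.
questioned = [0.10, 0.80, 0.30, 0.05, 0.60]
reference = [0.12, 0.75, 0.35, 0.04, 0.58]
print(similarity(questioned, reference))
```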
Abstract:
The objective of this work was to develop a low-cost portable damage detection tool to assess and predict damage areas in highway bridges. The proposed tool was based on standard vibration-based damage identification (VBDI) techniques but was extended to a new approach based on operational traffic load. The methodology was tested using numerical simulations, laboratory experiments, and field testing.
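The abstract does not specify the VBDI algorithm used; the sketch below is a minimal, hypothetical illustration of the general idea behind vibration-based damage identification: estimate dominant frequencies from measured acceleration and flag shifts relative to a baseline. The sampling rate, peak-picking approach and synthetic signals are illustrative assumptions.

```python
import numpy as np

def natural_frequencies(acceleration, fs, n_peaks=3):
    """Rough modal-frequency estimate: return the frequencies of the
    strongest spectral peaks of an acceleration record sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(acceleration))
    freqs = np.fft.rfftfreq(len(acceleration), d=1.0 / fs)
    peak_idx = np.argsort(spectrum[1:])[-n_peaks:] + 1  # skip the DC bin
    return np.sort(freqs[peak_idx])

def frequency_shift(baseline, current):
    """Relative change of each identified frequency; sustained drops can
    indicate stiffness loss, i.e. possible damage."""
    baseline, current = np.asarray(baseline), np.asarray(current)
    return (current - baseline) / baseline

# Synthetic records: the 'current' signal has slightly lower frequencies.
fs = 200.0
t = np.arange(0, 10, 1 / fs)
baseline_record = np.sin(2 * np.pi * 3.1 * t) + 0.5 * np.sin(2 * np.pi * 8.4 * t)
current_record = np.sin(2 * np.pi * 2.9 * t) + 0.5 * np.sin(2 * np.pi * 8.0 * t)
print(frequency_shift(natural_frequencies(baseline_record, fs, 2),
                      natural_frequencies(current_record, fs, 2)))
```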
Abstract:
The following paper introduces a new approach to the analysis of offensive play in football. The main aim of this study was therefore to create an instrument for collecting information for the analysis of offensive actions and game interactions. The observation instrument used to accomplish this objective consists of a combination of field formats (FC) and category systems (SC). This observational methodology is a particular strategy of the scientific method whose objective is to analyse perceptible behaviour as it occurs in its habitual context, allowing it to be formally recorded and quantified with an ad hoc instrument; once the systematic registration of behaviour has been transformed into quantitative data with the required levels of reliability and validity, the relations between these behaviours can be analysed. The codifications undertaken to date in various football matches have shown that the instrument serves the purposes for which it was developed, allowing further research into offensive game methods in football.
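As a hypothetical illustration of how an observation instrument combining field formats and category systems might be represented for quantitative analysis, the sketch below records coded offensive events and tallies their frequencies. The field names and action categories are assumptions, not the instrument's actual codes.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class OffensiveEvent:
    """One recorded observation: the field-format part gives the context
    (match, minute, pitch zone) and the categorical part codes the action."""
    match_id: str
    minute: int
    zone: str      # e.g. "defensive", "middle", "attacking"
    action: str    # e.g. "pass", "dribble", "shot"

def action_frequencies(events):
    """Turn the systematic registration of behaviours into simple counts."""
    return Counter(event.action for event in events)

events = [OffensiveEvent("match-1", 12, "attacking", "pass"),
          OffensiveEvent("match-1", 13, "attacking", "shot")]
print(action_frequencies(events))
```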
Abstract:
Understanding the oxidative reactivity of nanoparticles (NPs; <100 nm) could substantially contribute to explaining their toxicity. We attempted to refine the use of 2′,7′-dichlorodihydrofluorescein (DCFH) to characterize NP generation of reactive oxygen species (ROS). Several fluorescent probes have been applied to testing oxidative reactivity, but although DCFH is one of the most popular probes for the detection of ROS, its application to NPs has produced unexplained, wide variability in results. Without a uniform methodology, validating even robust results is impossible. This study therefore identified sources of conflicting results and investigated ways of reducing the occurrence of artefactual results. Existing techniques were tested and combined (using their most desirable features) to form a more reliable method for the measurement of NP reactivity in aqueous dispersions. We also investigated suitable sample concentration ranges necessary for determining ROS generation. Specifically, ultrafiltration and time-resolved scans of absorbance spectra were used to study possible optical interference when using high sample concentrations. Robust results were achieved with a 5 µM DCFH working solution containing 0.5 unit/mL horseradish peroxidase (HRP), prepared in ethanol. Sonication in the DCFH-HRP working solution provided more stable data with a relatively clean background. The optimal particle concentration depends on the type of NP and was in general in the µg/mL range. The major reasons for previously reported conflicting results were interference arising from differing experimental approaches and NP sample concentrations. The protocol presented here could form the basis of a standardized method for applying DCFH to detect ROS generation by NPs.
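As a small arithmetic aid to the reported working conditions, the sketch below applies the standard dilution relation C1·V1 = C2·V2 to prepare a 5 µM DCFH working solution; the stock concentration and final volume are assumed values for illustration.

```python
def dilution_volume(stock_conc_um, target_conc_um, final_volume_ml):
    """C1*V1 = C2*V2: volume of stock (mL) needed to reach the target
    concentration in the stated final volume."""
    return target_conc_um * final_volume_ml / stock_conc_um

# Hypothetical 1 mM (1000 uM) DCFH stock diluted to the reported 5 uM
# working solution in 10 mL of solvent (stock and volume are assumed).
print(dilution_volume(1000.0, 5.0, 10.0))  # -> 0.05 mL of stock
```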
Abstract:
Knowledge about spatial biodiversity patterns is a basic criterion for reserve network design. Although herbarium collections hold large quantities of information, the data are often scattered and cannot supply complete spatial coverage. Alternatively, herbarium data can be used to fit species distribution models, and their predictions can be used to provide complete spatial coverage and derive species richness maps. Here, we build on previous efforts to propose an improved compositionalist framework for using species distribution models to better inform conservation management. We illustrate the approach with models fitted with six different methods and combined using an ensemble approach for 408 plant species in a tropical and megadiverse country (Ecuador). As a complementary view to the traditional richness-hotspot methodology, which consists of a simple stacking of species distribution maps, the compositionalist modelling approach used here combines separate predictions for different pools of species to identify areas of alternative suitability for conservation. Our results show that the compositionalist approach better captures the established protected areas than the traditional richness-hotspot strategy and allows the identification of areas in Ecuador that would optimally complement the current protection network. Further studies should aim at refining the approach with more groups and additional species information.
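For comparison with the compositionalist approach, the traditional richness-hotspot methodology mentioned in the abstract is a simple stacking of species distribution maps; the sketch below shows that stacking step only. The array shapes, threshold and random suitability values are illustrative assumptions.

```python
import numpy as np

def stacked_richness(suitability_maps, threshold=0.5):
    """Traditional richness map: threshold each species' suitability
    surface to presence/absence and sum across species per grid cell."""
    maps = np.asarray(suitability_maps)   # shape: (species, rows, cols)
    presence = maps >= threshold
    return presence.sum(axis=0)

# Three hypothetical species modelled on a tiny 2 x 2 grid.
suitability = np.random.default_rng(0).random((3, 2, 2))
print(stacked_richness(suitability))
```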
Abstract:
Citalopram, a new bicyclic antidepressant, is the most selective serotonin reuptake inhibitor. In a number of double-blind controlled studies, citalopram was compared to placebo and to known tricyclic antidepressants. These studies have shown its efficacy and good safety. The inefficacy of psychotropic treatment in at least 20% of depressed patients has led a number of authors to propose original drug combinations and associations, such as antidepressant/lithium (Li), antidepressant/sleep deprivation (agrypnia), antidepressant/ECT, or antidepressant/LT3. The aim of this investigation is to evaluate the clinical effectiveness and safety of a combined citalopram/lithium treatment in therapy-resistant patients, taking into account serotonergic function, as tested by the fenfluramine/prolactin test, and drug pharmacokinetics and pharmacogenetics of metabolism. DESIGN OF THE STUDY: A washout period of 3 days before initiating the treatment is included. After an open treatment phase of 28 days (D) with citalopram (20 mg D1-D3; 40 mg D4-D14; 40 or 60 mg D15-D28; concomitant medication allowed: chloral, clorazepate), the nonresponding patients [less than 50% improvement in the total score on the 21-item Hamilton Depression Rating Scale (HDRS)] are selected and treated with or without Li (randomized under double-blind conditions: citalopram/Li or citalopram/placebo) during the following week of treatment (D29-D35). Thereafter, all patients included in the double-blind phase subsequently receive open treatment with citalopram/Li for 7 days (D36-D42). The hypothesis of a relationship between serotonergic function in patients, as assessed by the fenfluramine/prolactin test (D1), and the clinical response to citalopram (and Li) is tested. It is also evaluated whether the pharmacogenetic status of the patients, as determined by the mephenytoin/dextromethorphan test (D0-D28), is related to the metabolism of fenfluramine and citalopram, and to the clinical response. CLINICAL ASSESSMENT: Patients with a diagnosis of major depressive disorder according to DSM-III undergo clinical assessment at D1, D7, D14, D28, D35 and D42 with the HDRS (Hamilton Depression Rating Scale, 21 items), CGI (Clinical Global Impression), VAS (visual analogue scales for self-rating of depression) and UKU (side-effect rating scale), together with clinical laboratory examinations, as well as ECG and monitoring of weight, pulse and blood pressure at D1, D28 and D35. Fenfluramine/prolactin test: A butterfly needle is inserted in a forearm vein at 7 h 45 and kept patent with Liquemine. Samples for plasma prolactin and d- and l-fenfluramine determinations are drawn at 8 h 15 (baseline). Patients are given 60 mg fenfluramine (as a racemate) at 8 h 30. Kinetic points are determined at 9 h 30, 10 h 30, 11 h 30, 12 h 30 and 13 h 30. Plasma levels of d- and l-fenfluramine are determined by gas chromatography and prolactin by IRMA. Mephenytoin/dextromethorphan test: Patients empty their bladders before the test; they are then given 25 mg dextromethorphan and 100 mg mephenytoin (as a racemate) at 8 h 00. They collect all urine during the following 8 hours. The metabolic ratios are determined by gas chromatography [dextromethorphan/dextrorphan ratio greater than 0.3 = poor metabolizer (PM); mephenytoin/4-OH-mephenytoin greater than 5.6, or mephenytoin S/R greater than 0.8 = PM]. Citalopram plasma levels: Plasma levels of citalopram, desmethylcitalopram and didesmethylcitalopram are determined by gas chromatography-mass spectrometry. RESULTS OF THE PILOT STUDY:
The investigation was preceded by a pilot study of 14 patients, using the above-mentioned protocol, except that all nonresponders were medicated with citalopram/Li from D28 to D42. The mean total score (n = 14) on the 21-item Hamilton scale was significantly reduced after the treatment, i.e. from 26.93 +/- 5.80 on D1 to 8.57 +/- 6.90 on D35 (p less than 0.001).
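The urinary metabolic-ratio cut-offs quoted in the protocol above lend themselves to a simple classification rule; the sketch below applies those thresholds to label a patient as a poor metabolizer (PM) or not. The function and the "EM" (non-PM) label are illustrative, not part of the original study.

```python
def metabolizer_status(dm_dx_ratio, meph_4oh_ratio=None, meph_sr_ratio=None):
    """Apply the cut-offs quoted in the abstract:
    dextromethorphan/dextrorphan ratio > 0.3         -> PM
    mephenytoin/4-OH-mephenytoin > 5.6 or S/R > 0.8  -> PM
    'EM' is used here simply to mean 'not a poor metabolizer'."""
    dextromethorphan = "PM" if dm_dx_ratio > 0.3 else "EM"
    mephenytoin = "EM"
    if (meph_4oh_ratio is not None and meph_4oh_ratio > 5.6) or \
       (meph_sr_ratio is not None and meph_sr_ratio > 0.8):
        mephenytoin = "PM"
    return {"dextromethorphan": dextromethorphan, "mephenytoin": mephenytoin}

# A patient above both cut-offs is classified as PM on both probes.
print(metabolizer_status(0.45, meph_sr_ratio=0.9))
```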
Abstract:
The purpose of this project is to develop an investment analysis model that integrates the capabilities of four types of analysis for use in evaluating interurban transportation system improvements. The project will also explore the use of new data warehousing and mining techniques to design the types of databases required for supporting such a comprehensive transportation model. The project consists of four phases. The first phase, which is documented in this report, involves development of the conceptual foundation for the model. Prior research is reviewed in Chapter 1, which is composed of three major sections providing demand modeling background information for passenger transportation, transportation of freight (manufactured products and supplies), and transportation of natural resources and agricultural commodities. Material from the literature on geographic information systems makes up Chapter 2. Database models for the national and regional economies and for the transportation and logistics network are conceptualized in Chapter 3. Demand forecasting of transportation service requirements is introduced in Chapter 4, with separate sections for passenger transportation, freight transportation, and transportation of natural resources and commodities. Characteristics and capacities of the different modes, modal choices, and route assignments are discussed in Chapter 5. Chapter 6 concludes with a general discussion of the economic impacts and feedback of multimodal transportation activities and facilities.
Abstract:
Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze such situations and to give recommendations. Among the numerous MCDM methods, two large families are the multi-attribute utility theory based methods and the outranking methods. Traditionally, both method families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in situations where exact values are not available. Different variants of SMAA allow handling all types of MCDM problems. They support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI for sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods has been developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of SMAA methodology, including a definition of a unified framework.
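As the abstract notes, SMAA methods rely on simulation to produce descriptive indices; the sketch below is a minimal Monte Carlo computation of SMAA-2-style rank acceptability indices under an additive utility model with no weight information. The criteria values, utility model and sample size are illustrative assumptions rather than the thesis's implementation.

```python
import numpy as np

def rank_acceptability(values, n_samples=10000, seed=0):
    """Monte Carlo estimate of SMAA-2-style rank acceptability indices.
    values: (alternatives, criteria) matrix of partial utilities in [0, 1].
    Weights are drawn uniformly from the simplex (no preference information).
    Returns a matrix whose entry (i, r) is the share of simulations in
    which alternative i obtained rank r (rank 0 = best)."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    n_alt, n_crit = values.shape
    counts = np.zeros((n_alt, n_alt))
    for _ in range(n_samples):
        w = rng.exponential(size=n_crit)      # normalized exponentials give
        w /= w.sum()                          # uniform weights on the simplex
        utilities = values @ w
        order = np.argsort(-utilities)        # best alternative first
        for rank, alt in enumerate(order):
            counts[alt, rank] += 1
    return counts / n_samples

# Three hypothetical alternatives scored on two criteria.
print(rank_acceptability([[0.9, 0.2], [0.5, 0.6], [0.1, 0.9]], n_samples=2000))
```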
Abstract:
BACKGROUND: Human RNA polymerase III (pol III) transcription is regulated by several factors, including the tumor suppressors P53 and Rb, and the proto-oncogene c-Myc. In yeast, which lacks these proteins, a central regulator of pol III transcription, called Maf1, has been described. Maf1 is required for repression of pol III transcription in response to several signal transduction pathways and is broadly conserved in eukaryotes. METHODOLOGY/PRINCIPAL FINDINGS: We show that human endogenous Maf1 can be co-immunoprecipitated with pol III and associates in vitro with two pol III subunits, the largest subunit RPC1 and the alpha-like subunit RPAC2. Maf1 represses pol III transcription in vitro and in vivo and is required for maximal pol III repression after exposure to MMS or rapamycin, treatments that both lead to Maf1 dephosphorylation. CONCLUSIONS/SIGNIFICANCE: These data suggest that Maf1 is a major regulator of pol III transcription in human cells.
Abstract:
The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. Technologies changed from ARP to NMT and later to GSM. The main application of the technology, however, was voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet. Other wireless technologies, for instance wireless LANs, were also introduced. Telephony had developed from analog to digital in fixed networks, which allowed easy integration of fixed and mobile networks. This development opened completely new functionality to computers and mobile phones. It also initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the opportunities this opened for new competition between firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry through understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and it was therefore valuable to integrate the dynamic capability view of the firm in this research. Dynamic capabilities are an application of the resource-based view (RBV) of the firm. As is stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining these two. The theoretical contribution of the research is the development of a methodology integrating the theories of the RBV, dynamic capabilities and strategic positioning. The research approach has been constructive, owing to the actual managerial problems that initiated the study. The requirement for iterative and innovative progress in the research supported the chosen research approach. The study applies known methods in product development, for instance the innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools such as industry analysis and the scenario method. As the main result, the thesis presents the strategic innovation process, in which new business concepts are used to describe alternative resource configurations and scenarios as alternative competitive environments; this can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study has also resulted in approximately 250 new innovations for the participating firms, reduced technology uncertainty, helped strategic infrastructural decisions in the firms, and produced a knowledge bank including data from 43 ICT and 19 paper industry firms between the years 1999-2004. The methods presented in this research are also applicable to other industries.
Abstract:
There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in research offers a challenging target for a methodological endeavor. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level, and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches, neoclassical and evolutionary, to be analyzed. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. Methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria. The aim is to examine how the different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters of the study utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantages through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty due to the finite sample size. The revised decision limit for the Immunotech IGF-I - Orion P-III-NP assay combination did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays.
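The abstract describes decision limits set at 99.99% specificity with an allowance for finite-sample uncertainty; the sketch below follows that general logic under a simple normal model, not the exact GH-2000 procedure. The score data, the normality assumption and the allowance multiplier are assumptions for illustration.

```python
import math
from statistics import NormalDist

def decision_limit(scores, specificity=0.9999, allowance_z=1.645):
    """Illustrative decision-limit estimate: a normal-theory upper quantile
    of a marker score in presumed-negative athletes, inflated by an
    allowance reflecting the finite-sample uncertainty of that quantile."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
    z = NormalDist().inv_cdf(specificity)
    quantile = mean + z * sd
    # Approximate standard error of the estimated quantile under normality.
    se = sd * math.sqrt(1.0 / n + z ** 2 / (2.0 * (n - 1)))
    return quantile + allowance_z * se

# Hypothetical marker scores from a presumed-negative reference group.
scores = [0.10, -0.30, 0.50, 0.20, -0.10, 0.00, 0.40, -0.20, 0.30, 0.10]
print(decision_limit(scores))
```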
Abstract:
Background: Antiretroviral therapy has changed the natural history of human immunodeficiency virus (HIV) infection in developed countries, where it has become a chronic disease. This clinical scenario requires a new approach to simplify follow-up appointments and facilitate access to healthcare professionals. Methodology: We developed a new internet-based home care model covering the entire management of chronic HIV-infected patients. This was called Virtual Hospital. We report the results of a prospective randomised study performed over two years, comparing standard care received by HIV-infected patients with Virtual Hospital care. HIV-infected patients with access to a computer and broadband were randomised to be monitored either through Virtual Hospital (Arm I) or through standard care at the day hospital (Arm II). After one year of follow-up, patients switched their care to the other arm. Virtual Hospital offered four main services: Virtual Consultations, Telepharmacy, Virtual Library and Virtual Community. A technical and clinical evaluation of Virtual Hospital was carried out. Findings: Of the 83 randomised patients, 42 were monitored during the first year through Virtual Hospital (Arm I) and 41 through standard care (Arm II). Baseline characteristics of patients were similar in the two arms. The level of technical satisfaction with the virtual system was high: 85% of patients considered that Virtual Hospital improved their access to clinical data and they felt comfortable with the videoconference system. Neither clinical parameters [level of CD4+ T lymphocytes, proportion of patients with an undetectable level of viral load (p = 0.21) and compliance levels 90% (p = 0.58)] nor the evaluation of quality of life or psychological questionnaires changed significantly between the two types of care. Conclusions: Virtual Hospital is a feasible and safe tool for the multidisciplinary home care of chronic HIV patients. Telemedicine should be considered as an appropriate support service for the management of chronic HIV infection.
Abstract:
Ever since the inception of economics over two hundred years ago, the tools at the discipline's disposal have grown more and more sophisticated. This book provides a historical introduction to the methodology of economics through the eyes of economists. The story begins with John Stuart Mill's seminal essay of 1836 on the definition and method of political economy, which is then followed by an examination of how the actual practices of economists changed over time to such an extent that they not only altered their methods of enquiry, but also their self-perception as economists. Beginning as intellectuals and journalists operating to a large extent in the public sphere, they transformed into experts who developed their tools of research increasingly behind the scenes. No longer did they try to influence policy agendas through public discourse; rather, they targeted policymakers directly, with instruments that presented them as independent and objective policy advisors, the tools of the trade changing all the while. In order to shed light on this evolution of economic methodology, this book takes carefully selected snapshots from the discipline's history. It tracks the process of development through the nineteenth and twentieth centuries, analysing the growth of empirical and mathematical modelling. It also looks at the emergence of the experiment in economics, in addition to the similarities and differences between modelling and experimentation. This book will be relevant reading for students and academics in the fields of economic methodology, history of economics, and history and philosophy of the social sciences.