933 results for [JEL:C33] Mathematical and Quantitative Methods - Econometric Methods: Multiple
Abstract:
National malaria control programmes have the responsibility to develop a policy for malaria disease management based on a set of defined criteria, such as efficacy, side effects, cost and compliance. These will fluctuate over time, and national guidelines will require periodic re-assessment and revision. Changing a drug policy is a major undertaking that can take several years before being fully operational. The standard methods on which a decision can be based are the in vivo and the in vitro tests. The latter allow a quantitative measurement of the drug response and the assessment of several drugs at once. However, in terms of drug policy change, their results may be difficult to interpret, although they may be used as an early warning system for second- or third-line drugs. The new WHO 14-day in vivo test mainly addresses treatment failure and changes in haematological parameters in sick children. It gives valuable information on whether a drug still 'works'. None of these methods is well suited for large-scale studies. Molecular methods based on the detection of mutations in parasite molecules targeted by antimalarial drugs could be attractive tools for surveillance. However, their relationship with in vivo test results needs to be established.
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type I error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
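As background for the classical method this paper improves upon, here is a minimal sketch of Holm's (1979) stepdown procedure; the function name and example p-values are illustrative only:

```python
import numpy as np

def holm_stepdown(pvalues, alpha=0.05):
    """Holm's (1979) stepdown procedure: controls the familywise
    error rate at level alpha without any dependence assumptions.
    Returns a boolean array: True where the hypothesis is rejected.
    """
    p = np.asarray(pvalues, dtype=float)
    k = p.size
    order = np.argsort(p)          # step down from the smallest p-value
    reject = np.zeros(k, dtype=bool)
    for step, idx in enumerate(order):
        # Critical value alpha/(k - step) grows at each step; the
        # monotonicity of these critical values is what makes stepping valid.
        if p[idx] <= alpha / (k - step):
            reject[idx] = True
        else:
            break                  # stop at the first non-rejection
    return reject

# Example: five simultaneous tests
print(holm_stepdown([0.001, 0.012, 0.03, 0.04, 0.5], alpha=0.05))
# -> [ True  True False False False]
```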
Abstract:
Two concentration methods for fast and routine determination of caffeine (using HPLC-UV detection) in surface water and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) concentration with octadecyl silica sorbents. A common “offline” SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. The development of an “online” SPE method based on a mini-SPE column, containing 100 mg of the same sorbent, directly connected to the HPLC system allows the method detection limit to be lowered to 10 ng L−1 with a sample volume of only 100 mL. The “offline” SPE method is applied to the analysis of caffeine in wastewater samples, whereas the “online” method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
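As a rough check on why SPE preconcentration lowers the detection limit (a standard analytical-chemistry relation, not stated in the abstract), the enrichment factor EF relates the sample and eluate volumes, and the method detection limit scales roughly inversely with it:

$$\mathrm{EF} = \frac{V_{\text{sample}}}{V_{\text{eluate}}}, \qquad \mathrm{MDL}_{\text{method}} \approx \frac{\mathrm{MDL}_{\text{instrumental}}}{\mathrm{EF}}$$

For the offline procedure above, EF ≈ 1000 mL / 2 mL = 500. The online method reaches a tenfold lower MDL with a tenfold smaller sample, which is consistent with the entire retained analyte being transferred to the analytical column rather than only an aliquot of the eluate.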
Abstract:
Within the European project Hydroptimet (INTERREG IIIB-MEDOCC programme), a limited area model (LAM) intercomparison is performed for intense events that caused heavy damage to people and property. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for giving a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift in the forecast error and to identify the error sources that affected each model's forecast. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
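For reference, the non-parametric skill scores mentioned here are simple functions of the 2x2 contingency table (hits, false alarms, misses, correct negatives). A minimal sketch with the usual definitions; the counts and the rain threshold in the example are illustrative, not the study's data:

```python
import numpy as np

def skill_scores(hits, false_alarms, misses, correct_negatives):
    """Standard QPF verification scores from a 2x2 contingency table,
    computed against a rain/no-rain threshold."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)                    # probability of detection
    far = b / (a + b)                    # false alarm ratio
    csi = a / (a + b + c)                # critical success index (threat score)
    a_random = (a + b) * (a + c) / n     # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)  # equitable threat score
    return {"POD": pod, "FAR": far, "CSI": csi, "ETS": ets}

# Example: hypothetical counts of rain gauges above a 50 mm/24 h threshold
print(skill_scores(hits=30, false_alarms=10, misses=15, correct_negatives=145))
```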
Abstract:
This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single determinant built from a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method to calculate non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry of the Green's functions in its target density. Breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut the computational time dramatically.
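For orientation, the algorithms compared here all share a Metropolis-Hastings accept/reject core. The sketch below shows that core for an ordinary random-walk sampler on a toy target, not the reptation or Bounce algorithm itself (whose "state" is an entire path grown and shrunk at its ends); all names and the target are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(log_target, x0, n_steps, step=0.5):
    """Generic random-walk Metropolis-Hastings sampler."""
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)
        # Accept with probability min(1, pi(x')/pi(x)); the proposal is
        # symmetric, so the proposal-density ratio cancels.
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x.copy())
    return np.array(samples)

# Toy example: sample a 1-D standard normal target
chain = metropolis_hastings(lambda x: -0.5 * np.sum(x**2), np.zeros(1), 5000)
print(chain.mean(), chain.std())
```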
Abstract:
A study was conducted to estimate variation among laboratories and between manual and automated techniques of measuring pressure on the resulting gas production profiles (GPP). Eight feeds (molassed sugarbeet feed, grass silage, maize silage, soyabean hulls, maize gluten feed, whole crop wheat silage, wheat, glucose) were milled to pass a 1 mm screen and sent to three laboratories (ADAS Nutritional Sciences Research Unit, UK; Institute of Grassland and Environmental Research (IGER), UK; Wageningen University, The Netherlands). Each laboratory measured GPP over 144 h using standardised procedures with manual pressure transducers (MPT) and automated pressure systems (APS). The APS at ADAS used a pressure transducer and bottles in a shaking water bath, while the APS at Wageningen and IGER used a pressure sensor and bottles held in a stationary rack. Apparent dry matter degradability (ADDM) was estimated at the end of the incubation. GPP were fitted to a modified Michaelis-Menten model assuming a single phase of gas production, and were described in terms of the asymptotic volume of gas produced (A), the time to half A (B), the time of maximum gas production rate (t_RM gas) and the maximum gas production rate (R_M gas). There were effects (P<0.001) of substrate on all parameters. However, MPT produced more (P<0.001) gas, but with longer (P<0.001) B and t_RM gas (P<0.05) and lower (P<0.001) R_M gas compared to APS. There was no difference between apparatus in ADDM estimates. Interactions occurred between substrate and apparatus, substrate and laboratory, and laboratory and apparatus. However, when mean values for MPT were regressed against values from the individual laboratories, relationships were good (i.e., adjusted R² = 0.827 or higher). Good relationships were also observed with APS, although they were weaker than for MPT (i.e., adjusted R² = 0.723 or higher). The relationships between mean MPT and mean APS data were also good (i.e., adjusted R² = 0.844 or higher). Data suggest that, although laboratory and method of measuring pressure are sources of variation in GPP estimation, it should be possible using appropriate mathematical models to standardise data among laboratories, so that data from one laboratory could be extrapolated to others. This would allow development of a database of GPP data from many diverse feeds.
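The abstract names the fitted parameters but not the exact functional form. Assuming the widely used Groot et al. (1996)-type sigmoidal curve as one plausible "modified Michaelis-Menten" parameterisation, a minimal fitting sketch (the incubation data below are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def gpp(t, A, B, C):
    """Sigmoidal gas production curve: A = asymptotic gas volume,
    B = time at which A/2 is reached, C = shape parameter.
    (Assumed form; the paper says only 'modified Michaelis-Menten'
    and names the parameters A and B.)"""
    return A / (1.0 + (B / t) ** C)

# Illustrative incubation data: time (h) and cumulative gas (mL)
t = np.array([2, 4, 8, 12, 24, 48, 72, 96, 144], dtype=float)
gas = np.array([4, 10, 25, 38, 62, 78, 83, 85, 86], dtype=float)

(A, B, C), _ = curve_fit(gpp, t, gas, p0=[90, 15, 1.5])

# Maximum rate R_M gas and its time t_RM gas, found numerically from the fit
tt = np.linspace(0.5, 144, 2000)
rate = np.gradient(gpp(tt, A, B, C), tt)
i = rate.argmax()
print(f"A={A:.1f} mL, B={B:.1f} h, t_RM={tt[i]:.1f} h, R_M={rate[i]:.2f} mL/h")
```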
Abstract:
Our new molecular understanding of immune priming states that dendritic cell (DC) activation is absolutely pivotal for the expansion and differentiation of naïve T lymphocytes, and it follows that understanding DC activation is essential for understanding and designing vaccine adjuvants. This chapter describes how dendritic cells can be used as a core tool to provide detailed quantitative and predictive immunomics information about how adjuvants function. The role of distinct antigen, costimulation, and differentiation signals from activated DC in priming is explained. Four categories of input signals which control DC activation – direct pathogen detection, sensing of injury or cell death, indirect activation via endogenous proinflammatory mediators, and feedback from activated T cells – are compared and contrasted. Practical methods for studying adjuvants using DC are summarised, and the importance of DC subset choice, simulating T cell feedback, and the use of knockout cells is highlighted. Finally, five case studies are examined that illustrate the benefit of DC activation analysis for understanding vaccine adjuvant function.
Abstract:
Grammar has always been an important part of language learning. Based on various theories, such as the universal grammar theory (Chomsky, 1959) and the input theory (Krashen, 1970), explicit and implicit teaching methods have been developed. Research shows that both methods may have benefits and disadvantages. The attitude towards English grammar teaching methods in schools has also changed, and nowadays grammar teaching methods and learning strategies, as a part of language mastery, are one of the discussion topics among linguists. This study focuses on teacher and learner experiences and beliefs about teaching English grammar and the difficulties learners may face. The aim of the study is to conduct a literature review and to find out what scientific knowledge exists concerning the previously named topics. Along with this, the relevant steering documents are investigated, focusing on grammar teaching at Swedish upper secondary schools. Chomsky's universal grammar theory and Krashen's input hypotheses provide the theoretical background for the study. The study was conducted applying qualitative and quantitative methods. A systematic search of four databases (LIBRIS, ERIC, LLBA and Google Scholar) was used to collect relevant publications. The results show that published research identifies different grammar areas that are perceived as problematic for learners all over the world. The most common explanation of these difficulties is the influence of the learner's L1. Research also presents teachers' and learners' beliefs about the benefits of grammar teaching methods. An effective combination of teaching methods needs to be found to fit learners' expectations and individual needs. Together, these methods can contribute to achieving higher language proficiency levels and can therefore be successfully applied at Swedish upper secondary schools.
Abstract:
The acidic ninhydrin spectrophotometric method (ANSM) for quantitative determination of free and bound sialic acid in milk glycoproteins has proved to be fast and efficient for routine detection of the fraudulent addition of rennet whey to fluid milk. In this research the ANSM was compared with the high performance liquid chromatography (HPLC) method internationally recommended for caseinomacropeptide (CMP) determination, which, besides its high accuracy, is more sophisticated and requires trained personnel. For several sample conditions (raw milk and milk with variable added amounts of rennet cheese whey), the methods showed an excellent linear correlation, with r = 0.981 when milk was deproteinized with a final trichloroacetic acid (TCA) concentration of 120 g.L-1. The best correlations were obtained with final TCA concentrations of 100 g.L-1 and 80 g.L-1 (r = 0.992 and 0.993, respectively).
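For concreteness, this kind of method comparison reduces to a linear regression between paired measurements from the two methods; a minimal sketch, with made-up paired values rather than the paper's data:

```python
import numpy as np
from scipy.stats import linregress

# Illustrative paired measurements (e.g. CMP-bound sialic acid, mg/L)
# by the two methods on the same spiked-milk samples; values invented
ansm = np.array([12.1, 18.4, 25.2, 31.8, 40.5])
hplc = np.array([11.8, 18.9, 24.7, 32.4, 41.0])

fit = linregress(ansm, hplc)
print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.2f}, r={fit.rvalue:.3f}")
```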
Abstract:
A methodology to define favorable areas in petroleum and mineral exploration is applied, which consists of weighting the exploratory variables in order to characterize their importance as exploration guides. The exploration data are spatially integrated in the selected area to establish the association between variables and deposits, and the relationships among the distribution, topology, and indicator pattern of all variables. Two methods of statistical analysis were compared. The first is Weights of Evidence modeling, a conditional-probability approach (Agterberg, 1989a); the second is Principal Components Analysis (Pan, 1993). In the conditional method, the favorability estimation is based on the probability of joint occurrence of deposit and variable, with the weights defined as natural logarithms of likelihood ratios. In the multivariate analysis, the cells which contain deposits are selected as control cells and the weights are determined by eigendecomposition, being represented by the coefficients of the eigenvector associated with the system's largest eigenvalue. The two weighting techniques and complementary procedures were tested on two case studies: 1. the Recôncavo Basin, Northeast Brazil (for petroleum), and 2. the Itaiacoca Formation of the Ribeira Belt, Southeast Brazil (for Pb-Zn Mississippi Valley Type deposits). The applied methodology proved to be easy to use and of great assistance in predicting favorability over large areas, particularly in the initial phase of exploration programs.
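Since the log-likelihood-ratio weighting is the computational core of the first method, a minimal sketch for a single binary evidence layer may help; the grid, deposit counts and function name below are synthetic and illustrative, not the case-study data:

```python
import numpy as np

def weights_of_evidence(evidence, deposits):
    """Weights of Evidence for one binary exploration layer.
    evidence, deposits: boolean arrays over the same grid cells.
    W+ (W-) is the log likelihood ratio of the evidence being
    present (absent) in deposit vs. non-deposit cells."""
    b, d = np.asarray(evidence, bool), np.asarray(deposits, bool)
    w_plus = np.log((b & d).sum() / d.sum()) - np.log((b & ~d).sum() / (~d).sum())
    w_minus = np.log((~b & d).sum() / d.sum()) - np.log((~b & ~d).sum() / (~d).sum())
    return w_plus, w_minus, w_plus - w_minus   # contrast C measures association

# Illustrative 1000-cell grid: 20 deposit cells, evidence covering 15 of them
rng = np.random.default_rng(1)
deposits = np.zeros(1000, bool); deposits[:20] = True
evidence = rng.random(1000) < 0.10
evidence[:15] = True
print(weights_of_evidence(evidence, deposits))
```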
Abstract:
Society's increasing aversion to technological risk requires the development of inherently safer and environmentally friendlier processes, while assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools are specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous to humans and the environment are used or stored. The tools are mainly devoted to application in the "conceptual" and "basic design" stages, when the project is still open to changes (due to the large number of degrees of freedom) which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (some of them complex), but are quick and easy to apply in practice. Their full evaluation is possible even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indices, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, "ad hoc" tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes and plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of nitrobenzaldehyde isomers) provided the input data necessary to demonstrate the method for inherent safety assessment of materials.
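To make the aggregation idea concrete, here is a minimal, generic sketch of KPI normalization against site-specific reference burdens followed by policy-weighted aggregation; the function name, indicator set, reference values and weights are all illustrative assumptions, not the criteria actually defined in the thesis:

```python
import numpy as np

def sustainability_profile(kpis, references, weights):
    """Generic sketch: each raw indicator is scaled by a site-specific
    reference value, then the normalized scores are combined with
    policy-driven weights. (Illustrative only.)"""
    kpis, references, weights = map(np.asarray, (kpis, references, weights))
    normalized = kpis / references   # <1 means better than the reference burden
    return normalized, float(np.dot(weights, normalized) / weights.sum())

# Hypothetical economic / societal / environmental KPIs for one design option
norm, index = sustainability_profile(
    kpis=[1.2e6, 0.8, 340.0],        # e.g. cost (EUR), fatality index, CO2 (t/y)
    references=[1.5e6, 1.0, 300.0],  # site-specific reference burdens
    weights=[0.4, 0.35, 0.25],       # sustainability-policy weights
)
print(norm, index)
```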