872 results for [JEL:C21] Mathematical and Quantitative Methods - Econometric Methods: Single Equation Models


Relevance:

100.00%

Publisher:

Abstract:

Evaluation, selection, and ultimately decision making are important issues that engineers face over the course of long-running projects. Engineers apply mathematical and non-mathematical methods to make accurate and correct decisions whenever needed. As widespread as these methods are, the effect of the chosen method on the outputs achieved and the decisions made remains open to question. This is even more controversial and contested where the evaluation is made among non-quantitative alternatives. In civil engineering and construction management problems, the criteria include both quantitative and qualitative ones, such as aesthetics, construction duration, building and operation costs, and environmental considerations. As a result, decision making frequently takes place among non-quantitative alternatives. It should be noted that traditional comparison methods, which rely on clear-cut and inflexible mathematics, have long been criticized. This paper presents a brief review of traditional methods of evaluating alternatives and offers a new decision-making method based on fuzzy calculations. The main focus of this research is engineering issues that have a flexible nature and vague borders. The suggested method makes the evaluation analyzable for decision makers and is capable of handling multi-criteria and multi-referee problems. To ease the calculations, a program named DeMA is introduced.
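The fuzzy evaluation described here can be sketched with triangular fuzzy numbers and a weighted aggregation; the helper names, weights, and ratings below are illustrative assumptions, not the DeMA program's actual interface.

```python
# Minimal sketch of fuzzy multi-criteria evaluation with triangular
# fuzzy numbers (l, m, u); illustrative only, not the DeMA program itself.

def tfn_scale(tfn, w):
    """Scale a triangular fuzzy number by a crisp criterion weight."""
    l, m, u = tfn
    return (l * w, m * w, u * w)

def tfn_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tfn) / 3.0

def fuzzy_score(ratings, weights):
    """Aggregate fuzzy criterion ratings into one crisp score."""
    total = (0.0, 0.0, 0.0)
    for tfn, w in zip(ratings, weights):
        total = tfn_add(total, tfn_scale(tfn, w))
    return defuzzify(total)

# Two design alternatives rated on aesthetics, duration, cost (0-10 scale).
weights = [0.5, 0.3, 0.2]
alt_a = [(6, 7, 8), (4, 5, 6), (7, 8, 9)]
alt_b = [(3, 4, 5), (8, 9, 10), (5, 6, 7)]
scores = {"A": fuzzy_score(alt_a, weights), "B": fuzzy_score(alt_b, weights)}
best = max(scores, key=scores.get)
```

The fuzzy ranges let qualitative judgements ("roughly 7 out of 10") enter the comparison without forcing a single crisp value on each criterion.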


In many modeling situations in which parameter values can only be estimated or are subject to noise, the appropriate mathematical representation is a stochastic ordinary differential equation (SODE). However, unlike the deterministic case in which there are suites of sophisticated numerical methods, numerical methods for SODEs are much less sophisticated. Until a recent paper by K. Burrage and P.M. Burrage (1996), the highest strong order of a stochastic Runge-Kutta method was one. But K. Burrage and P.M. Burrage (1996) showed that by including additional random variable terms representing approximations to the higher order Stratonovich (or Ito) integrals, higher order methods could be constructed. However, this analysis applied only to the one Wiener process case. In this paper, it will be shown that in the multiple Wiener process case all known stochastic Runge-Kutta methods can suffer a severe order reduction if there is non-commutativity between the functions associated with the Wiener processes. Importantly, however, it is also suggested how this order can be repaired if certain commutator operators are included in the Runge-Kutta formulation. (C) 1998 Elsevier Science B.V. and IMACS. All rights reserved.
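As context for the stochastic Runge-Kutta discussion, here is a minimal sketch of the baseline Euler-Maruyama scheme (strong order 0.5) for a single-Wiener-process SODE; the drift/diffusion functions and parameters are illustrative, and the higher-order Burrage and Burrage methods add further random-variable terms not shown here.

```python
import numpy as np

# Euler-Maruyama: the simplest strong-order-0.5 scheme for the SODE
# dX = a(X) dt + b(X) dW. Higher-order stochastic Runge-Kutta methods
# add random-variable terms approximating Stratonovich/Ito integrals.

def euler_maruyama(a, b, x0, t_end, n_steps, rng):
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Wiener increment ~ N(0, dt)
        x = x + a(x) * dt + b(x) * dw
    return x

# Geometric Brownian motion: dX = mu*X dt + sigma*X dW, X(0) = 1.
mu, sigma = 0.05, 0.2
rng = np.random.default_rng(0)
paths = [euler_maruyama(lambda x: mu * x, lambda x: sigma * x,
                        1.0, 1.0, 500, rng) for _ in range(2000)]
mean_x = float(np.mean(paths))  # analytically, E[X(1)] = exp(mu)
```

For a single Wiener process this converges as expected; the order reduction the paper analyses arises only with multiple non-commuting Wiener processes, which this sketch does not cover.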


Quantitative market data has traditionally been used throughout marketing and business as a tool to inform and direct design decisions. However, in our changing economic climate, businesses need to innovate and create products their customers will love. Deep customer insight methods move beyond simply questioning customers and aim to provoke true emotional responses in order to reveal new opportunities that go beyond functional product requirements. This paper explores traditional market research methods and compares them to methods used to gain deep customer insights. This study reports on a collaborative research project with seven small-to-medium enterprises and four multinational organisations. Firms were introduced to a design-led innovation approach and were taught the different methods for gaining deep customer insights. Interviews were conducted to understand the experience and outcomes of pre-existing research methods and deep customer insight approaches. The findings conclude that deep customer insights are unlikely to be revealed through traditional market research techniques. The theoretical outcome of this study is a complementary methods matrix, providing guidance on appropriate research methods in accordance with a project's timeline.


The occurrence of occupational chronic solvent encephalopathy (CSE) appears to be decreasing, but new cases are still revealed every year. To prevent CSE and the early retirement of solvent-exposed workers, actions should focus on early CSE detection and diagnosis. Identifying the work tasks and solvent exposures associated with a high risk of CSE is crucial. Clinical and exposure data on all 128 cases diagnosed with CSE as an occupational disease in Finland during 1995-2007 were collected from the patient records at the Finnish Institute of Occupational Health (FIOH) in Helsinki. Data on the number of exposed workers in Finland were gathered from the Finnish Job-Exposure Matrix (FINJEM), and the number of employed from the national workforce survey. We analyzed the work tasks and solvent exposure of CSE patients and the findings in brain magnetic resonance imaging (MRI), quantitative electroencephalography (QEEG), and event-related potentials (ERP). The annual number of new cases diminished from 18 to 3, and the incidence of CSE decreased from 8.6 to 1.2 per million employed per year. The highest incidence of CSE was in workers whose main exposure was to aromatic hydrocarbons; during 1995-2006 this incidence decreased from 1.2 to 0.3 per 1,000 exposed workers per year. The work tasks with the highest incidence of CSE were floor laying and lacquering, wooden surface finishing, and industrial, metal, or car painting. Among 71 CSE patients, brain MRI revealed atrophy, white matter hyperintensities, or both in 38% of the cases. Atrophy, which was associated with the duration of exposure, was most frequently located in the cerebellum and in the frontal or parietal brain areas. QEEG in a group of 47 patients revealed increased power of the theta band in the frontal brain area. In a group of 86 patients, the P300 amplitude of the auditory ERP was decreased, but at the individual level all the amplitude values were classified as normal.
In 11 CSE patients and 13 age-matched controls, ERPs elicited by a multimodal paradigm comprising an auditory task, a visual detection task, and a recognition memory task under single- and dual-task conditions corroborated the decrease in auditory P300 amplitude in CSE patients in the single-task condition. In dual-task conditions, the auditory P300 component was unrecognizable more often in patients than in controls. Due to the paucity and non-specificity of the findings, brain MRI serves mainly for differential diagnostics in CSE. QEEG and auditory P300 are insensitive at the individual level and are not useful in the clinical diagnostics of CSE. A multimodal ERP paradigm may, however, provide a more sensitive method for diagnosing slight cognitive disturbances such as CSE.
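The incidence figures above follow the usual rate formula (new cases per persons at risk per year); a minimal sketch, with illustrative denominators rather than the exact FINJEM and workforce-survey counts.

```python
def incidence(cases, population, per=1_000_000):
    """Annual incidence: new cases per `per` persons at risk."""
    return cases * per / population

# Illustrative denominators (not the study's exact figures): 18 new CSE
# cases among ~2.1 million employed gives roughly the 8.6/million cited,
# and 3 cases among ~2.5 million gives roughly 1.2/million.
rate_start = incidence(18, 2_100_000)
rate_end = incidence(3, 2_500_000)
```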


This article examines the role that qualitative methods can play in the study of children's racial attitudes and behaviour. It does this by discussing a number of examples taken from a qualitative, ethnographic study of five- and six-year-old children in an English multi-ethnic, inner-city primary school. The examples are used to highlight the limitations of research that relies solely on quantitative methods and the potential that qualitative methods have for addressing these limitations. Within this context the article contrasts the strengths and weaknesses of qualitative and quantitative methods in the study of children's racial attitudes and identities. The article concludes by arguing that a much more integrated multi-method approach is needed in this area and sets out some of the most effective ways this could be achieved.


Doctoral thesis, Mathematics (Operational Research), Universidade do Algarve, 2009.


This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues of concern include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single determinant built from a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method to calculate non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry of the Green's functions in its target density. Breaking this symmetry gives poorer results. Using a short reptile in the Bounce method does not alter the quality of the results, which suggests that in future applications one can use a shorter reptile to cut the computational time dramatically.
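The accept/reject step underlying Metropolis-Hastings reptation QMC can be illustrated with a generic random-walk Metropolis sampler on a toy target density; this is a hedged stand-in for exposition only, not the reptile-based algorithm itself.

```python
import math
import random

# Generic Metropolis-Hastings with a symmetric Gaussian proposal: a toy
# stand-in for the accept/reject step at the heart of reptation QMC and
# the Bounce method (the real algorithms move whole "reptiles" of beads).

def metropolis_hastings(log_density, x0, n_samples, step, rng):
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Symmetric proposal: accept with probability min(1, pi(x')/pi(x)).
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Sample a standard normal target (log-density up to an additive constant).
rng = random.Random(42)
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000, 1.5, rng)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The detailed-balance property of this accept/reject rule is what the time-reversal symmetry discussion above generalises to path (reptile) space.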


A study was conducted to estimate variation among laboratories and between manual and automated techniques of measuring pressure on the resulting gas production profiles (GPP). Eight feeds (molassed sugarbeet feed, grass silage, maize silage, soyabean hulls, maize gluten feed, whole crop wheat silage, wheat, glucose) were milled to pass a 1 mm screen and sent to three laboratories (ADAS Nutritional Sciences Research Unit, UK; Institute of Grassland and Environmental Research (IGER), UK; Wageningen University, The Netherlands). Each laboratory measured GPP over 144 h using standardised procedures with manual pressure transducers (MPT) and automated pressure systems (APS). The APS at ADAS used a pressure transducer and bottles in a shaking water bath, while the APS at Wageningen and IGER used a pressure sensor and bottles held in a stationary rack. Apparent dry matter degradability (ADDM) was estimated at the end of the incubation. GPP were fitted to a modified Michaelis-Menten model assuming a single phase of gas production, and were described in terms of the asymptotic volume of gas produced (A), the time to half A (B), the time of maximum gas production rate (t_RM gas) and the maximum gas production rate (R_M gas). There were effects (P<0.001) of substrate on all parameters. MPT produced more (P<0.001) gas, but with longer B (P<0.001), longer t_RM gas (P<0.05) and lower R_M gas (P<0.001) compared to APS. There was no difference between apparatus in ADDM estimates. Interactions occurred between substrate and apparatus, substrate and laboratory, and laboratory and apparatus. However, when mean values for MPT were regressed against those from the individual laboratories, the relationships were good (i.e., adjusted R² = 0.827 or higher). Good relationships were also observed with APS, although they were weaker than for MPT (i.e., adjusted R² = 0.723 or higher). The relationships between mean MPT and mean APS data were also good (i.e., adjusted R² = 0.844 or higher).
The data suggest that, although laboratory and method of measuring pressure are sources of variation in GPP estimation, it should be possible using appropriate mathematical models to standardise data among laboratories, so that data from one laboratory could be extrapolated to others. This would allow the development of a database of GPP data from many diverse feeds. (c) 2005 Published by Elsevier B.V.
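Fitting a GPP curve to recover the parameters A, B, t_RM gas and R_M gas can be sketched as a nonlinear least-squares problem. The Groot-type sigmoidal form G(t) = A / (1 + (B/t)^C) is assumed here as a stand-in for the paper's exact modified Michaelis-Menten parameterisation, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sigmoidal gas production model: A is the asymptotic gas volume and
# B the time at which half of A has been produced. This parameterisation
# is an assumption standing in for the paper's modified Michaelis-Menten
# model; the 144 h profile below is synthetic, not measured data.

def gpp(t, A, B, C):
    return A / (1.0 + (B / t) ** C)

rng = np.random.default_rng(1)
t = np.linspace(2, 144, 30)                      # sampling times (h)
y = gpp(t, 250.0, 18.0, 2.5) + rng.normal(0, 2.0, t.size)  # noisy profile

# Nonlinear least-squares fit with a rough initial guess.
(A, B, C), _ = curve_fit(gpp, t, y, p0=[200.0, 10.0, 2.0])
```

The maximum rate R_M gas and its time t_RM gas can then be obtained from the fitted curve's derivative, which is how per-apparatus differences in the abstract's parameters would be quantified.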


Our new molecular understanding of immune priming states that dendritic cell (DC) activation is absolutely pivotal for the expansion and differentiation of naïve T lymphocytes, and it follows that understanding DC activation is essential to understand and design vaccine adjuvants. This chapter describes how dendritic cells can be used as a core tool to provide detailed quantitative and predictive immunomics information about how adjuvants function. The role of distinct antigen, costimulation, and differentiation signals from activated DC in priming is explained. Four categories of input signals that control DC activation – direct pathogen detection, sensing of injury or cell death, indirect activation via endogenous proinflammatory mediators, and feedback from activated T cells – are compared and contrasted. Practical methods for studying adjuvants using DC are summarised, and the importance of DC subset choice, simulating T cell feedback, and the use of knockout cells is highlighted. Finally, five case studies are examined that illustrate the benefit of DC activation analysis for understanding vaccine adjuvant function.


Grammar has always been an important part of language learning. Based on various theories, such as universal grammar theory (Chomsky, 1959) and the input theory (Krashen, 1970), explicit and implicit teaching methods have been developed. Research shows that both methods may have benefits and disadvantages. Attitudes towards English grammar teaching methods in schools have also changed, and nowadays grammar teaching methods and learning strategies, as a part of language mastery, are among the discussion topics of linguists. This study focuses on teacher and learner experiences and beliefs about teaching English grammar and the difficulties learners may face. The aim of the study is to conduct a literature review and to find out what scientific knowledge exists concerning these topics. Along with this, the relevant steering documents are investigated, focusing on grammar teaching at Swedish upper secondary schools. Chomsky's universal grammar theory and Krashen's input hypotheses provide the theoretical background for the study. The study was conducted applying qualitative and quantitative methods. A systematic search of four databases (LIBRIS, ERIC, LLBA, and Google Scholar) was used to collect relevant publications. The results show that the publications identify various grammar areas that are perceived as problematic for learners all over the world. The most common explanation for these difficulties is the influence of the learner's L1. The research presents teachers' and learners' beliefs about the benefits of grammar teaching methods. An effective combination of teaching methods needs to be found to fit learners' expectations and individual needs. Together, these can contribute to achieving higher language proficiency levels and can therefore be successfully applied at Swedish upper secondary schools.


The acidic ninhydrin spectrophotometric method (ANSM) for the quantitative determination of free and bound sialic acid in milk glycoproteins has proved fast and efficient for routine detection of the fraudulent addition of rennet whey to fluid milk. In this research the ANSM was compared with the high-performance liquid chromatography (HPLC) method, internationally recommended for caseinomacropeptide (CMP) determination, which besides its high accuracy is more sophisticated and requires trained personnel. For several sample conditions (raw milk and milk with variable added amounts of rennet cheese whey), the methods showed an excellent linear correlation, with r = 0.981 when milk was deproteinized with a final trichloroacetic acid (TCA) concentration of 120 g/L. The best correlations were seen with final TCA concentrations of 100 g/L and 80 g/L: r = 0.992 and 0.993, respectively.
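The method comparison above rests on the Pearson correlation between paired measurements from the two assays; a minimal sketch with illustrative values, not the paper's actual ANSM/HPLC data.

```python
import numpy as np

# Pearson correlation between two assay methods on the same samples.
# The paired values below are illustrative stand-ins, not the paper's
# ANSM and HPLC measurements.
ansm = np.array([0.12, 0.25, 0.41, 0.58, 0.74, 0.90])
hplc = np.array([0.10, 0.27, 0.40, 0.60, 0.72, 0.93])

r = float(np.corrcoef(ansm, hplc)[0, 1])
# r close to 1 indicates the simpler ANSM tracks the HPLC reference,
# which is the basis for accepting it as a routine screening method.
```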


A methodology to define favorable areas in petroleum and mineral exploration is applied, which consists of weighting the exploratory variables in order to characterize their importance as exploration guides. The exploration data are spatially integrated in the selected area to establish the association between variables and deposits, and the relationships among the distribution, topology, and indicator pattern of all variables. Two methods of statistical analysis were compared. The first is Weights of Evidence modeling, a conditional probability approach (Agterberg, 1989a), and the second is Principal Components Analysis (Pan, 1993). In the conditional method, the favorability estimate is based on the probability of joint occurrence of deposits and variables, with the weights defined as natural logarithms of likelihood ratios. In the multivariate analysis, the cells that contain deposits are selected as control cells, and the weights are determined by eigendecomposition, being represented by the coefficients of the eigenvector associated with the system's largest eigenvalue. The two weighting techniques and complementary procedures were tested on two case studies: 1. the Recôncavo Basin, Northeast Brazil (for petroleum), and 2. the Itaiacoca Formation of the Ribeira Belt, Southeast Brazil (for Pb-Zn Mississippi Valley Type deposits). The applied methodology proved easy to use and of great assistance in predicting favorability over large areas, particularly in the initial phase of exploration programs. © 1998 International Association for Mathematical Geology.
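The conditional-probability weighting can be sketched directly: the positive and negative weights W+ and W- are natural logarithms of likelihood ratios computed from cell counts, and their difference (the contrast) measures the strength of evidence. The cell counts below are illustrative.

```python
import math

# Weights-of-evidence sketch: W+ and W- relate the presence/absence of an
# exploration variable B to the presence/absence of deposits D, using
# grid-cell counts (the counts here are illustrative, not from a survey).

def weights_of_evidence(n_bd, n_b, n_d, n_total):
    """n_bd: cells with both variable and deposit; n_b: cells with the
    variable; n_d: cells with a deposit; n_total: all cells."""
    p_b_given_d = n_bd / n_d                         # P(B | D)
    p_b_given_nd = (n_b - n_bd) / (n_total - n_d)    # P(B | not D)
    w_plus = math.log(p_b_given_d / p_b_given_nd)
    w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_nd))
    contrast = w_plus - w_minus                      # evidence strength
    return w_plus, w_minus, contrast

# 1000 cells, 10 with deposits; the variable covers 100 cells, 8 of
# which contain a deposit: a strongly favorable indicator.
w_plus, w_minus, contrast = weights_of_evidence(8, 100, 10, 1000)
```

Summing W+ (or W-) over conditionally independent variables updates the prior log-odds of a deposit in each cell, which is how the favorability map is assembled.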


Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, while assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements in alternative design options, to allow trade-offs among contradictory aspects and to prevent the “risk shift”. In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous to humans and the environment are used or stored. The tools were mainly devoted to application in the stages of “conceptual” and “basic” design, when the project is still open to changes (due to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools makes a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design.
The proposed decision support system was based on the development of a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (including complex ones), but are easy and swift to apply in practice. Their full evaluation is also possible starting from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previous methods proposed for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing “out of control” conditions. In the assessment of layout plans, “ad hoc” tools were developed to account for domino escalation hazards and safety economics. The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant, and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal).
An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data necessary to demonstrate the method for the inherent safety assessment of materials.
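One way the KPI comparison and aggregation could look: each impact indicator is normalised against a site-specific reference burden and combined with policy weights into a single index. All names, weights, and numbers below are illustrative assumptions, not the thesis's actual indicator set or aggregation criteria.

```python
# Hedged sketch of aggregating leading KPIs into a sustainability index.
# Lower is better: each impact KPI is scaled by a site-specific reference
# burden, then the normalised scores are combined with policy weights.

def sustainability_index(kpis, references, weights):
    """Return (weighted index, normalised profile) for a design option."""
    normalised = {k: kpis[k] / references[k] for k in kpis}
    index = sum(weights[k] * normalised[k] for k in kpis)
    return index, normalised

# Illustrative figures for one design option (units are arbitrary).
kpis = {"economic": 1.2e6, "societal": 3.0, "environmental": 450.0}
references = {"economic": 1.0e6, "societal": 5.0, "environmental": 500.0}
weights = {"economic": 0.4, "societal": 0.3, "environmental": 0.3}

index, profile = sustainability_index(kpis, references, weights)
```

Comparing the index (and the per-KPI profile, which flags any "risk shift" between impact categories) across design options is the kind of trade-off analysis the decision support system automates.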


Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and taking into account that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of computer science-derived data analysis methods, predominantly machine learning approaches, was proposed and explored in this study. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational outcomes based on literature-derived databases, and to compare, using cross-validation and data splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the result of the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared to regression estimates, the results showed better accuracy of decision tree/ensemble techniques for the categorical case, while neural networks were better for the estimation of continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy.
Estimations based on literature-based databases using machine learning techniques might provide an advantage when applied to other methodologies that combine expert inputs with current exposure measurements, like the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point for independence from expert judgment.
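The model comparison described above can be sketched with cross-validation scores for an ensemble method against linear regression. The features, nonlinear target, and scikit-learn estimators below are illustrative stand-ins for the study's literature-derived solvent databases and its specific models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Cross-validated comparison of an ensemble method with a traditional
# regression model on synthetic data with a nonlinear determinant-exposure
# relationship (illustrative, not the study's actual databases).
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 4))          # 4 hypothetical determinants
y = np.sin(2 * X[:, 0]) + rng.normal(0, 0.1, 300)  # nonlinear in one feature

rf_r2 = cross_val_score(RandomForestRegressor(random_state=0),
                        X, y, cv=5).mean()     # mean R^2 across 5 folds
lin_r2 = cross_val_score(LinearRegression(), X, y, cv=5).mean()
```

When the determinant-exposure relationship is strongly nonlinear, as here, the ensemble's cross-validated R² exceeds the linear model's, mirroring the kind of advantage the study reports for tree/ensemble methods.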