32 results for [JEL:C1] Mathematical and Quantitative Methods - Econometric and Statistical Methods: General


Relevance:

100.00%

Publisher:

Abstract:

A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
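As a concrete point of reference for the approaches named above (a standard textbook form, not a statement of the book's own results), the naive mean field and TAP fixed-point equations for an Ising-type model with couplings $J_{ij}$, fields $h_i$ and inverse temperature $\beta$ read:

    % Factorized (mean field) approximation to an intractable p(x):
    %   q(x) = \prod_i q_i(x_i), chosen to minimize KL(q || p).
    \begin{align}
      m_i &= \tanh\Big(\beta\Big(h_i + \sum_j J_{ij}\, m_j\Big)\Big)
            && \text{(naive mean field)} \\
      m_i &= \tanh\Big(\beta\Big(h_i + \sum_j J_{ij}\, m_j
            - \beta\, m_i \sum_j J_{ij}^2\,\big(1 - m_j^2\big)\Big)\Big)
            && \text{(TAP, with Onsager reaction term)}
    \end{align}

The extra reaction term is what the TAP approach adds in order to incorporate correlations that the factorized (naive mean field) approximation neglects.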

Relevance:

100.00%

Publisher:

Abstract:

The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).
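As a minimal illustration of the density estimates and ANOVA analyses described above (the counts, region names, and quadrat area below are hypothetical, not data from the article), lesion density per quadrat can be compared across brain regions as follows:

    # Sketch: lesion density from quadrat counts, compared across regions with one-way ANOVA.
    # All counts and the quadrat area are hypothetical illustration values.
    import numpy as np
    from scipy import stats

    quadrat_area_mm2 = 0.25  # assumed sample-field area, purely illustrative

    # Hypothetical counts of lesions (e.g. neuritic plaques) per quadrat, by cortical region
    counts = {
        "frontal":   [3, 5, 4, 6, 2, 5],
        "temporal":  [8, 7, 9, 6, 10, 8],
        "occipital": [1, 2, 1, 3, 2, 1],
    }

    densities = {region: np.array(c) / quadrat_area_mm2 for region, c in counts.items()}
    for region, d in densities.items():
        print(f"{region}: mean density = {d.mean():.1f} lesions/mm^2")

    # One-way ANOVA across regions, as in the analyses the review discusses
    f_stat, p_value = stats.f_oneway(*densities.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")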

Relevance:

100.00%

Publisher:

Abstract:

We present a theoretical method for a direct evaluation of the average and reliability error exponents in low-density parity-check error-correcting codes using methods of statistical physics. Results for the binary symmetric channel are presented for codes of both finite and infinite connectivity.

Relevance:

100.00%

Publisher:

Abstract:

We propose a new mathematical model for efficiency analysis, which combines DEA methodology with an older idea, ratio analysis. Our model, called DEA-R, treats all possible "output/input" ratios as outputs within the standard DEA model. Although DEA and DEA-R generate different summary measures of efficiency, the two measures are comparable. Our mathematical and empirical comparisons establish the validity of the DEA-R model in its own right. The key advantage of DEA-R over DEA is that it allows the model to be integrated effectively with experts' opinions via flexible restrictive conditions on individual "output/input" pairs. © 2007 Springer Science+Business Media, LLC.
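Below is a minimal sketch of one common multiplier-form reading of DEA-R, in which every "output/input" ratio is treated as an output against a single unit input; the data, the helper dea_r_efficiency, and the use of scipy.optimize.linprog are illustrative assumptions rather than the authors' exact formulation:

    # Sketch of a DEA-R-style multiplier model: each output/input ratio is treated as an
    # output of a DEA model with a single constant input. Hypothetical data.
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: 4 DMUs, 2 inputs, 2 outputs
    X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0]])    # inputs  (DMU x input)
    Y = np.array([[10.0, 6.0], [8.0, 9.0], [12.0, 7.0], [9.0, 11.0]]) # outputs (DMU x output)

    # Ratio matrix R[j, k] = y_rj / x_ij for every (output r, input i) pair k
    n_dmu = X.shape[0]
    R = np.array([(Y[j][:, None] / X[j][None, :]).ravel() for j in range(n_dmu)])

    def dea_r_efficiency(o: int) -> float:
        """Max sum_k u_k*R[o,k]  s.t.  sum_k u_k*R[j,k] <= 1 for all j,  u_k >= 0."""
        c = -R[o]                                       # linprog minimizes, so negate
        res = linprog(c, A_ub=R, b_ub=np.ones(n_dmu))   # default bounds give u_k >= 0
        return -res.fun

    for o in range(n_dmu):
        print(f"DMU {o}: DEA-R efficiency = {dea_r_efficiency(o):.3f}")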

Relevance:

100.00%

Publisher:

Abstract:

In the present work, the most important parameters of heat pump systems and of solar-assisted heat pump systems were analysed quantitatively. Ideal and real Rankine cycles applied to the heat pump, with and without subcooling and superheating, were studied using practically recommended values for their thermodynamic parameters. Comparative characteristics of refrigerants were analysed with regard to their applicability in heat pumps for domestic heating and their effect on the performance of the system. Curves for the variation of the coefficient of performance as a function of condensing and evaporating temperatures were prepared for R12.

Air, water and earth as low-grade heat sources, and basic heat pump design factors for integrated heat pumps with thermal stores and for solar-assisted heat pumps in series, parallel and dual configurations, were studied. The analysis of the relative performance of these systems demonstrated that the dual system presents advantages in domestic applications. An account of energy requirements for space and water heating in the domestic sector in the U.K. is presented. The expected primary energy saving from using heat pumps to meet the heating demand of the domestic sector was found to be of the order of 7%.

The availability of solar energy under U.K. climatic conditions and the characteristics of the solar radiation were studied. Tables and graphical representations for calculating the incident solar radiation on a tilted roof were prepared and are given in section IV. To analyse and calculate the heating load for the system, new mathematical and graphical relations were developed in section V.

A domestic space and water heating system is described and studied. It comprises three main components: a solar radiation absorber (the normal roof of a house), a split heat pump, and a thermal store. A mathematical study of the heat exchange characteristics in the roof structure was carried out, permitting evaluation of the energy collected by the roof acting as a radiation absorber and of its efficiency. An indication of the relative contributions from the three low-grade sources (ambient air, solar boost, and heat loss from the house to the roof space during operation) is given in section VI, together with the average seasonal performance and the energy saving for a prototype system tested at the University of Aston. The seasonal performance was found to be 2.6 and the energy saving from using the system studied was 61%. A new store configuration to reduce wasted heat losses is also discussed in section VI.
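The thesis's Rankine-cycle COP curves for R12 depend on refrigerant property data; purely as a hedged illustration of how the coefficient of performance varies with condensing and evaporating temperature, the ideal (Carnot) heating COP can be tabulated as follows (this simplified formula is not taken from the thesis):

    # Illustrative only: ideal (Carnot) heating COP versus condensing/evaporating temperature.
    # Real Rankine-cycle COPs for R12, as studied in the thesis, are lower and need property data.
    def carnot_heating_cop(t_evap_c: float, t_cond_c: float) -> float:
        """COP_heating = T_cond / (T_cond - T_evap), temperatures in kelvin."""
        t_evap, t_cond = t_evap_c + 273.15, t_cond_c + 273.15
        return t_cond / (t_cond - t_evap)

    for t_evap in (-5, 0, 5, 10):      # typical low-grade source temperatures (deg C)
        for t_cond in (45, 55):        # typical domestic heating condensing temperatures (deg C)
            cop = carnot_heating_cop(t_evap, t_cond)
            print(f"T_evap = {t_evap:>3} C, T_cond = {t_cond} C  ->  ideal COP = {cop:.1f}")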

Relevance:

100.00%

Publisher:

Abstract:

An apparatus was developed to project spinning golf balls directly onto golf greens. This employed a modified baseball practice machine with two counter-rotating pneumatic wheels. The speed of the wheels could be varied independently, allowing backspin to be given to the ball. The ball was projected into a darkened enclosure where the motion of the ball before and after impacting with the turf was recorded using a still camera and a stroboscope. The resulting photographs contained successive images of the ball on a single frame of film. The apparatus was tested on eighteen golf courses, resulting in 721 photographs of impacts. Statistical analysis was carried out on the results from the photographs and, from this, two types of green emerged. On the first, the ball tended to rebound with topspin, while on the second, the ball retained backspin after impact if the initial backspin was greater than about 350 rad s-1. Eleven tests were devised to determine the characteristics of greens, and statistical techniques were used to analyse the relationships between these tests. These showed the effects of the green characteristics on ball/turf impacts. It was found that the ball retained backspin on greens that were freely drained and had less than 60% Poa annua (annual meadow grass) in their swards. Visco-elastic models were used to simulate the impact of the ball with the turf. Impacts were simulated by considering the ball to be rigid and the turf to be a two-layered system consisting of springs and dampers. The model showed good agreement with experiment and was used to simulate impacts from two different shots onto two contrasting types of green.
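The thesis's two-layer spring-and-damper turf model is not reproduced here; as a heavily simplified, single-layer (Kelvin-Voigt) sketch of the same idea, the normal component of a ball/turf impact can be integrated as follows (all parameter values are hypothetical):

    # Single-layer Kelvin-Voigt sketch of the normal (vertical) ball/turf impact.
    # The thesis uses a two-layer spring-damper turf model; this simplification and all
    # parameter values below are hypothetical, for illustration only.
    m = 0.0459      # golf ball mass, kg
    k = 1.0e5       # assumed turf stiffness, N/m
    c = 60.0        # assumed turf damping, N s/m
    v0 = -15.0      # assumed vertical impact speed, m/s (downwards)

    dt = 1e-6       # time step, s
    z, v = 0.0, v0  # z < 0 means the ball is compressing the turf surface
    for _ in range(2_000_000):                       # safety cap; contact lasts milliseconds
        force = -k * z - c * v if z < 0.0 else 0.0   # turf acts only while compressed
        v += (force / m) * dt
        z += v * dt
        if z >= 0.0 and v > 0.0:                     # ball has left the surface moving upwards
            break

    print(f"rebound speed = {v:.2f} m/s, coefficient of restitution ~ {abs(v / v0):.2f}")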

Relevance:

100.00%

Publisher:

Abstract:

Although crisp data are fundamentally indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. These imprecise or vague data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem, and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary in intervals. We develop a fuzzy version of the conventional MPI model by using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and exhibit the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
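For context (the standard definition, not the authors' specific profit formulation), the conventional Malmquist productivity index between periods $t$ and $t+1$ is usually written in terms of distance functions $D_o$ as:

    % Conventional (distance-function-based) Malmquist productivity index for DMU o;
    % the paper's profit MPI and its fuzzy/interval extensions build on this form.
    \[
      \mathrm{MPI}_o =
      \left[
        \frac{D_o^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D_o^{t}\!\left(x^{t}, y^{t}\right)}
        \cdot
        \frac{D_o^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D_o^{t+1}\!\left(x^{t}, y^{t}\right)}
      \right]^{1/2},
    \]
    with $\mathrm{MPI}_o > 1$ indicating productivity growth from period $t$ to $t+1$.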

Relevance:

100.00%

Publisher:

Abstract:

The postgenomic era, as manifested, inter alia, by proteomics, offers unparalleled opportunities for the efficient discovery of safe, efficacious, and novel subunit vaccines targeting a tranche of modern major diseases. A negative corollary of this opportunity is the risk of becoming overwhelmed by this embarrassment of riches. Informatics techniques, which address issues of data management and, through prediction, shortcut the experimental process, can be of enormous benefit in leveraging the proteomic revolution. In this disquisition, we evaluate proteomic approaches to the discovery of subunit vaccines, focussing on viral, bacterial, fungal, and parasite systems. We also adumbrate the impact that proteomic analysis of host-pathogen interactions can have. Finally, we review relevant methods for the prediction of the immunome, with special emphasis on quantitative methods, and for the subcellular localization of proteins within bacteria.

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated.

Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is a computationally intensive process. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). More recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise this data processing and rendering time. These techniques include standard processing methods, which comprise a set of algorithms to process the raw (interference) data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate.

OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of OCT systems. Currently, investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and the quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding has led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
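A minimal sketch of the standard FD-OCT processing chain mentioned above (background subtraction, windowing, Fourier transform, log-magnitude), written here with CuPy as an assumed GPU back end; the custom system described in the thesis is not reproduced, and k-space resampling and dispersion compensation are omitted:

    # Sketch of GPU-accelerated FD-OCT A-scan generation with CuPy (assumed back end).
    # Omits k-space resampling and dispersion compensation for brevity.
    import cupy as cp

    def spectra_to_ascans(raw_spectra: cp.ndarray) -> cp.ndarray:
        """raw_spectra: (n_ascans, n_pixels) spectrometer frames already on the GPU."""
        background = raw_spectra.mean(axis=0, keepdims=True)     # background/DC estimate
        fringes = raw_spectra - background                       # remove the background term
        window = cp.hanning(raw_spectra.shape[1])                # apodisation to reduce side lobes
        depth_profiles = cp.fft.fft(fringes * window, axis=1)    # spectrum -> depth (A-scan)
        half = raw_spectra.shape[1] // 2
        magnitude = cp.abs(depth_profiles[:, :half])             # keep one side of the transform
        return 20.0 * cp.log10(magnitude + 1e-12)                # log scale for display

    # Usage sketch (names are hypothetical):
    # frames = cp.asarray(camera_frames)     # host -> GPU transfer of raw spectrometer frames
    # bscan = spectra_to_ascans(frames)      # one processed B-scan, still on the GPU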

Relevance:

100.00%

Publisher:

Abstract:

This study draws upon effectuation and causation as examples of planning-based and flexible decision-making logics, and investigates dynamics in the use of both logics. The study applies a longitudinal process research approach to investigate strategic decision-making in new venture creation over time. Combining qualitative and quantitative methods, we analyze 385 decision events across nine technology-based ventures. Our observations suggest a hybrid perspective on strategic decision-making, demonstrating how effectuation and causation logics are combined, and how entrepreneurs' emphasis on these logics shifts and re-shifts over time. We inductively derive a dynamic model that extends the literature on strategic decision-making in venture creation.

Relevance:

100.00%

Publisher:

Abstract:

We survey articles covering how hedge fund returns are explained, using largely non-linear multifactor models that examine the non-linear pay-offs and exposures of hedge funds. We provide an integrated view of the implicit factor and statistical factor models that are largely able to explain the hedge fund return-generating process. We present their evolution through time by discussing pioneering studies that made a significant contribution to knowledge, and also recent innovative studies that examine hedge fund exposures using advanced econometric methods. This is the first review that analyzes very recent studies that explain a large part of hedge fund variation. We conclude by presenting some gaps for future research.
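As a hedged illustration of the kind of non-linear multifactor specification the surveyed studies use (the factor choice and all series below are synthetic, not drawn from any particular paper), an option-like factor can be added to a simple factor regression as follows:

    # Sketch: hedge fund excess returns regressed on a market factor plus an option-like
    # (non-linear) factor max(market, 0). All series are synthetic/hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 120                                   # 10 years of monthly observations
    market = rng.normal(0.005, 0.04, n)       # hypothetical market excess returns
    option_like = np.maximum(market, 0.0)     # non-linear pay-off, as in option-based factors
    fund = 0.002 + 0.3 * market + 0.4 * option_like + rng.normal(0.0, 0.01, n)

    X = np.column_stack([np.ones(n), market, option_like])
    coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
    alpha, beta_mkt, beta_opt = coef
    print(f"alpha = {alpha:.4f}, market beta = {beta_mkt:.2f}, option-like beta = {beta_opt:.2f}")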

Relevance:

100.00%

Publisher:

Abstract:

UK schools and universities are trying to remedy a nationally recognized skills shortage in quantitative methods among graduates. This article describes and analyses a research project in German Studies funded by the Economic and Social Research Council (ESRC) aimed at addressing the issue. The interdisciplinary pilot project introduced quantitative methods into undergraduate curricula not only in Linguistics, but also in German Studies.

Relevance:

100.00%

Publisher:

Abstract:

The efficacy of a specially constructed Gallager-type error-correcting code for communication in a Gaussian channel is examined. The construction is based on the introduction of complex matrices, used in both encoding and decoding, which comprise sub-matrices of cascading connection values. Finite-size effects are estimated in order to compare the results with the bounds set by Shannon. The critical noise level achieved for certain code rates and infinitely large systems nearly saturates the bounds set by Shannon, even when the connectivity used is low.
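The "bounds set by Shannon" referred to above are, for the memoryless Gaussian channel, given by the standard capacity formula (stated here only for context, not as part of the paper's results):

    % Shannon capacity of the additive white Gaussian noise channel (bits per channel use):
    \[
      C \;=\; \tfrac{1}{2}\,\log_2\!\left(1 + \frac{S}{N}\right),
    \]
    so a rate-$R$ code can be decoded reliably only when $R < C$, i.e. when the
    signal-to-noise ratio $S/N$ exceeds the corresponding threshold.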

Relevance:

100.00%

Publisher:

Abstract:

The rapid global loss of biodiversity has led to a proliferation of systematic conservation planning methods. In spite of their utility and mathematical sophistication, these methods only provide approximate solutions to real-world problems where there is uncertainty and temporal change. The consequences of errors in these solutions are seldom characterized or addressed. We propose a conceptual structure for exploring the consequences of input uncertainty and oversimplified approximations to real-world processes for any conservation planning tool or strategy. We then present a computational framework based on this structure to quantitatively model species representation and persistence outcomes across a range of uncertainties. These include factors such as land costs, landscape structure, species composition and distribution, and temporal changes in habitat. We demonstrate the utility of the framework using several reserve selection methods including simple rules of thumb and more sophisticated tools such as Marxan and Zonation. We present new results showing how outcomes can be strongly affected by variation in problem characteristics that are seldom compared across multiple studies. These characteristics include number of species prioritized, distribution of species richness and rarity, and uncertainties in the amount and quality of habitat patches. We also demonstrate how the framework allows comparisons between conservation planning strategies and their response to error under a range of conditions. Using the approach presented here will improve conservation outcomes and resource allocation by making it easier to predict and quantify the consequences of many different uncertainties and assumptions simultaneously. Our results show that without more rigorously generalizable results, it is very difficult to predict the amount of error in any conservation plan. These results imply the need for standard practice to include evaluating the effects of multiple real-world complications on the behavior of any conservation planning method.
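One of the "simple rules of thumb" contrasted above with Marxan and Zonation is greedy complementarity-based site selection; a hedged sketch (with hypothetical occurrence data and a representation target of one site per species) is:

    # Greedy complementarity rule of thumb for reserve selection: repeatedly pick the site
    # that covers the most not-yet-represented species. Occurrence data are hypothetical.
    presence = {                      # site -> set of species recorded there
        "A": {"sp1", "sp2", "sp3"},
        "B": {"sp3", "sp4"},
        "C": {"sp4", "sp5", "sp6"},
        "D": {"sp1", "sp6"},
    }

    def greedy_reserves(presence: dict[str, set[str]]) -> list[str]:
        unmet = set().union(*presence.values())   # species still needing representation
        chosen = []
        while unmet:
            site = max(presence, key=lambda s: len(presence[s] & unmet))
            if not presence[site] & unmet:
                break                             # remaining species cannot be covered
            chosen.append(site)
            unmet -= presence[site]
        return chosen

    print(greedy_reserves(presence))   # ['A', 'C'] covers all six hypothetical species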

Relevance:

100.00%

Publisher:

Abstract:

The project objective was to develop a reliable selection procedure for matching contact lens materials with individual wearers through the identification of a biochemical marker for assessment of the in-eye performance of contact lenses. There is a need for such a procedure because one of the main reasons for contact lens wearers ceasing to wear contact lenses is poor end-of-day comfort, i.e. the lenses become intolerable to the wearer as the day progresses. The selection of an optimal material for individual wearers has the potential to reduce drop-out, hence increasing the overall contact lens population, and to improve contact lens comfort for established wearers.

Using novel analytical methods and statistical techniques, we were able to investigate the interactions between the composition of the tear film and of the biofilm deposited on the contact lenses and contact lens performance. The investigations were limited to studying the lipid components of the tear film; the lipid layer, which plays a key role in preventing evaporation and stabilising the tear film, has been reported to be significantly thinner and to show different mixing characteristics during contact lens wear. Different lipid families were found to influence symptomatology, in vivo tear film structure and stability, as well as ocular integrity. Whereas symptomatology was affected by both the tear film lipid composition and the nature of the lipid deposition, the structure of the tear film and its stability were mainly influenced by the tear film lipid composition. Ocular integrity also appeared to be influenced by the nature of the lipid deposition.

Potential markers within the lipid species have been identified and could be applied as follows. When required, in order to identify a problematic wearer or to match the contact lens material to the contact lens wearer, tear samples collected by the clinician could be dispatched to an analytical laboratory where lipid analysis could be carried out by HPLC. A colorimetric kit based on the lipid markers could also be developed and used by clinicians directly in practice; such a kit would involve tear sampling and classification according to colour into "Problem", "Border line" and "Good" contact lens wearer groups. A test kit would also have wider scope for marketing in other areas such as general dry-eye pathology.