25 results for [JEL:C70] Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - General


Relevance: 100.00%

Abstract:

It is proposed that, for rural secondary schoolgirls, school is a site of contestation. Rural girls attempt to 'use' school as a means of resisting traditional patriarchal definitions of a 'woman's place'. In their efforts, the girls are thwarted by aspects of the school itself, by the behaviour and attitudes of the boys in school, and by the 'careers advice' which they receive. It is argued that the girls perceive school as being of greater importance to them than is the case for the boys, and that these gender-differentiated perceptions are related to the 'social' lives of the girls and boys and to their future employment prospects; unlike the boys, the girls experience considerable restrictions in both of these areas. This theory was grounded in an ethnographic study conducted in and around a village in a rural county in England. As well as developing the theory through ethnography, the thesis contains tests of certain hypotheses generated by the theory. These hypotheses relate to the gender-differentiated perspectives of secondary school pupils with regard to school itself, life outside school, and expectations for the future. The quantitative methods used to test these hypotheses confirm that girls tend to be more positively orientated to school than boys, to feel less able to engage in preferred activities outside school time, and to be more willing to move away from the area. For comparative purposes these hypotheses were also tested in two other rural locations, and the results indicate the need for further quantitative research into the context of girls' schooling in such locations. A critical review of the literature is presented, as is a detailed discussion of the research process itself.

Relevance: 100.00%

Abstract:

Aims: Previous data suggest heterogeneity in the laminar distribution of the pathology in the molecular disorder frontotemporal lobar degeneration (FTLD) with transactive response (TAR) DNA-binding protein of 43 kDa (TDP-43) proteinopathy (FTLD-TDP). To study this heterogeneity, we quantified the changes in density across the cortical laminae of neuronal cytoplasmic inclusions, glial inclusions, neuronal intranuclear inclusions, dystrophic neurites, surviving neurones, abnormally enlarged neurones, and vacuoles in regions of the frontal and temporal lobe. Methods: Changes in the density of histological features across cortical gyri were studied in 10 sporadic cases of FTLD-TDP using quantitative methods and polynomial curve fitting. Results: Our data suggest that the laminar neuropathology in sporadic FTLD-TDP is highly variable. Most commonly, neuronal cytoplasmic inclusions, dystrophic neurites and vacuolation were abundant in the upper laminae, while glial inclusions, neuronal intranuclear inclusions, abnormally enlarged neurones, and glial cell nuclei were abundant in the lower laminae. TDP-43-immunoreactive inclusions affected more of the cortical profile in longer-duration cases; their distribution varied with disease subtype, but was unrelated to Braak tangle score. Different TDP-43-immunoreactive inclusions were not spatially correlated. Conclusions: The laminar distribution of pathological features in 10 sporadic cases of FTLD-TDP is heterogeneous and may be accounted for, in part, by disease subtype and disease duration. In addition, the feedforward and feedback cortico-cortical connections may be compromised in FTLD-TDP. © 2012 The Authors. Neuropathology and Applied Neurobiology © 2012 British Neuropathological Society.
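The laminar analysis rests on fitting polynomials to feature densities sampled across cortical depth. A minimal sketch of that step, using hypothetical density values rather than the study's data:

```python
import numpy as np

# Hypothetical inclusion densities sampled at increasing normalized depth
# across the cortical laminae (0 = pial surface, 1 = white-matter boundary).
depth = np.linspace(0.0, 1.0, 11)
density = 5.0 - 6.0 * depth + 2.0 * depth ** 2   # noiseless toy profile

# Fit a second-order polynomial to the density-depth profile, as in
# laminar-distribution analyses; the fitted curve's shape (monotonic,
# U-shaped, peaked) indicates whether the upper or lower laminae carry
# the bulk of the pathology.
coeffs = np.polyfit(depth, density, deg=2)
fitted = np.poly1d(coeffs)

print(np.round(coeffs, 2))   # -> [ 2. -6.  5.]
```

On this noiseless toy profile the fit recovers the generating coefficients exactly; with real count data one would compare fits of increasing degree and test whether higher-order terms improve the fit significantly.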

Relevance: 100.00%

Abstract:

Although crisp data are indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. Such data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary within intervals. We develop a fuzzy version of the conventional MPI model using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
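The core interval idea can be sketched in a few lines. This is a generic interval-arithmetic illustration with hypothetical aggregated revenue/cost figures and a simplified three-way grouping, not the paper's exact DEA formulation or its six-group scheme:

```python
# Hypothetical interval data for one DMU: aggregated revenue (p . y) and
# cost (c . x) each vary within an interval (lo, hi).
revenue = (90.0, 110.0)
cost = (50.0, 60.0)

def interval_ratio(num, den):
    """Bounds of num/den when numerator and denominator vary
    independently within positive intervals."""
    return num[0] / den[1], num[1] / den[0]

def classify(bounds):
    """Simplified grouping by where the index interval sits relative to 1
    (the paper distinguishes six groups; three suffice to show the idea)."""
    lo, hi = bounds
    if lo >= 1.0:
        return "progress over the whole interval"
    if hi <= 1.0:
        return "regress over the whole interval"
    return "indeterminate: interval straddles 1"

bounds = interval_ratio(revenue, cost)
print(bounds)            # -> (1.5, 2.2)
print(classify(bounds))  # -> progress over the whole interval
```

The lower bound pairs the least favourable realizations (smallest revenue, largest cost) and the upper bound the most favourable, which is exactly how an interval for the overall profit MPI of each DMU is obtained before grouping.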

Relevance: 100.00%

Abstract:

The typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive white Gaussian noise channel and the binary-input Laplace channel are considered as specific channel models.

Relevance: 100.00%

Abstract:

The postgenomic era, as manifested, inter alia, by proteomics, offers unparalleled opportunities for the efficient discovery of safe, efficacious, and novel subunit vaccines targeting a tranche of modern major diseases. A negative corollary of this opportunity is the risk of becoming overwhelmed by this embarrassment of riches. Informatics techniques, working to address issues of data management and, through prediction, to shortcut the experimental process, can be of enormous benefit in leveraging the proteomic revolution. In this disquisition, we evaluate proteomic approaches to the discovery of subunit vaccines, focussing on viral, bacterial, fungal, and parasite systems. We also adumbrate the impact that proteomic analysis of host-pathogen interactions can have. Finally, we review methods relevant to the prediction of the immunome, with special emphasis on quantitative methods, and to the subcellular localization of proteins within bacteria.

Relevance: 100.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica, and their use to characterise several properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated.

Quantitative methods were developed both to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing can take from several minutes to a few hours, making data processing a significant bottleneck. An alternative is expensive hardware-based processing such as field-programmable gate arrays (FPGAs); more recently, however, graphics processing unit (GPU) based methods have been developed to minimise processing and rendering time. These techniques include standard-processing methods: a set of algorithms that process the raw interference data recorded by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier-domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its throughput is limited by the camera capture rate.

OCT-phantoms have been heavily used for the qualitative characterisation and fine tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was gained. This understanding led to several pieces of research that are not only relevant to OCT but have broader importance: extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is useful in other fields.
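The standard FD-OCT processing chain (background subtraction, spectral apodization, Fourier transform of the interferogram to yield an A-scan) can be sketched on the CPU with NumPy; a GPU implementation maps the same steps onto batched FFTs. The single-reflector interferogram below is simulated, and the spectrum is assumed already resampled to be linear in wavenumber k:

```python
import numpy as np

# Simulated spectral interferogram for a single reflector at depth bin 64:
# S(k) = 1 + cos(2*pi*z*k/n), sampled linearly in wavenumber k.
n = 1024
k = np.arange(n)
depth_bin = 64
spectrum = 1.0 + np.cos(2 * np.pi * depth_bin * k / n)

# Standard FD-OCT chain: remove the DC background, apodize with a window
# to suppress sidelobes, inverse-FFT, take the magnitude -> A-scan.
ac = spectrum - spectrum.mean()             # background (DC) subtraction
ac *= np.hanning(n)                         # spectral apodization
a_scan = np.abs(np.fft.ifft(ac))[: n // 2]  # one-sided depth profile

print(int(np.argmax(a_scan)))               # -> 64 (reflector depth bin)
```

In a real system the same per-A-scan pipeline is applied to thousands of camera lines per frame, which is why offloading the FFTs to a GPU removes the processing bottleneck.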

Relevance: 100.00%

Abstract:

This study draws upon effectuation and causation as examples of planning-based and flexible decision-making logics, and investigates dynamics in the use of both logics. The study applies a longitudinal process research approach to investigate strategic decision-making in new venture creation over time. Combining qualitative and quantitative methods, we analyze 385 decision events across nine technology-based ventures. Our observations suggest a hybrid perspective on strategic decision-making, demonstrating how effectuation and causation logics are combined, and how entrepreneurs’ emphasis on these logics shifts and re-shifts over time. We induce a dynamic model which extends the literature on strategic decision-making in venture creation.

Relevance: 100.00%

Abstract:

Ageing is a natural phenomenon of the human lifecycle, yet it is still not understood what causes the deterioration of the human body near the end of the lifespan. One popular theory is the Free Radical Theory of Ageing, which proposes that oxidative damage to biomolecules causes ageing of tissues. The ageing population is affected by many chronic diseases. This study focused on sarcopenia (muscle loss in ageing) and obesity as two models for comparing oxidative damage to muscle proteins in mice. The aim of the study was to develop advanced mass spectrometry methods to detect specific oxidative modifications to mouse muscle proteins, including oxidation, nitration, chlorination, and carbonyl group formation; western blotting was also used to provide complementary information on the oxidative state of proteins from aged and obese muscle. Mass spectrometry proved to be a powerful tool, enabling identification of the types of modifications present, the sites at which they occurred, and the percentage of the peptide populations that were modified. Targeted and semi-targeted mass spectrometry methods were optimised for the identification and quantitation of oxidised residues in muscle proteins. The development of the quantitative methods enabled comparison of mass spectrometry instruments: both the Time-of-Flight and QTRAP systems showed advantages of their different mass analysers for quantifying oxidative modifications. Several oxidised residues were characterised and quantified in both the obese and sarcopenic models, and higher levels of oxidation were found compared with their control counterparts. The modifications identified were oxidation of proline, tyrosine and tryptophan, dioxidation of methionine, allysine formation, and nitration of tyrosine; quantification, however, was performed on methionine-dioxidation- and cysteine-trioxidation-containing residues in SERCA.
The combination of measuring residue susceptibility and functional studies could contribute to understanding the overall role of oxidation in ageing and obesity.
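The "percentage of the peptide population that is modified" reduces to an intensity ratio between the modified and unmodified forms of the same peptide. A minimal label-free sketch, with hypothetical intensities for a SERCA peptide (the peptide and values are illustrative, not from the study):

```python
# Hypothetical extracted-ion intensities for one SERCA peptide observed
# in unmodified and methionine-dioxidised forms (arbitrary units).
intensities = {
    "unmodified": 8.0e6,
    "Met dioxidation (+31.9898 Da)": 2.0e6,   # mass shift of two oxygens
}

def modified_fraction(intensities, modified_key):
    """Percentage of the peptide population carrying the modification,
    estimated from the label-free intensity ratio of the two forms."""
    total = sum(intensities.values())
    return 100.0 * intensities[modified_key] / total

pct = modified_fraction(intensities, "Met dioxidation (+31.9898 Da)")
print(pct)   # -> 20.0
```

This assumes the two forms ionise with comparable efficiency; in practice that assumption is checked, which is one reason targeted methods were optimised on the two instrument platforms.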

Relevance: 100.00%

Abstract:

We study the dynamical properties of RZ-DPSK encoded sequences, focusing on the instabilities in the soliton train that lead to distortion of the transmitted information. The problem is reformulated within the framework of the complex Toda chain model, which allows a simplified description of the optical soliton dynamics. We elucidate how the bit composition of the pattern affects the initial (linear) stage of the train dynamics and explain the general mechanisms behind the appearance of unstable collective soliton modes. We then discuss the nonlinear regime using asymptotic properties of the pulse stream at large propagation distances, and analyse the dynamical behaviour of the train, classifying different scenarios for the pattern instabilities. Both approaches are based on the machinery of Hermitian and non-Hermitian lattice analysis. © 2010 IEEE.
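The lattice analysis boils down to eigenvalues of a nearest-neighbour interaction matrix whose couplings depend on the phase pattern. The toy model below is an illustration of that mechanism, not the paper's exact linearization: couplings q_k = exp(-(r + i*dphi_k)) encode pulse separation r and the phase difference dphi_k written by the DPSK pattern, and the matrix is complex symmetric (non-Hermitian), so eigenvalues leaving the real axis signal unstable collective modes:

```python
import cmath
import numpy as np

def interaction_matrix(phases, r=2.0):
    """Toy nearest-neighbour interaction matrix for a pulse train:
    couplings depend on separation r and adjacent phase differences."""
    q = [cmath.exp(-(r + 1j * (phases[k + 1] - phases[k])))
         for k in range(len(phases) - 1)]
    n = len(phases)
    m = np.zeros((n, n), dtype=complex)
    for k, qk in enumerate(q):
        m[k, k + 1] = qk
        m[k + 1, k] = qk          # symmetric, NOT conjugated: non-Hermitian
    return m

# In-phase pattern: real couplings, real spectrum (no oscillatory growth).
in_phase = np.linalg.eigvals(interaction_matrix([0.0, 0.0, 0.0]))
# Quadrature phase steps: couplings pick up a factor -i, and the
# collective-mode spectrum moves off the real axis.
quadrature = np.linalg.eigvals(
    interaction_matrix([0.0, cmath.pi / 2, cmath.pi]))

print(np.max(np.abs(in_phase.imag)) < 1e-9)    # True
print(np.max(np.abs(quadrature.imag)) > 0.1)   # True
```

The same diagnostic (how the bit pattern moves eigenvalues in the complex plane) is what distinguishes the stable and unstable pattern scenarios in the linear stage.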

Relevance: 100.00%

Abstract:

By mixing concepts from game-theoretic analysis and real options theory, an investment decision in a competitive market can be seen as a "game" between firms, since firms implicitly take into account other firms' reactions to their own investment actions. We review two decades of real option game models, suggesting which critical problems have been "solved" by considering game theory, and which significant problems have not yet been adequately addressed. We provide some insights on plausible empirical applications, or shortfalls in applications to date, and suggest some promising avenues for future research.
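The core tension in a real option game can be shown with a stylized one-shot example (all payoffs hypothetical): waiting preserves the option value for both firms, but the fear of being preempted pushes each firm to invest early:

```python
# Stylized investment-timing game between two firms. Payoffs (row, col):
# investing first captures the market; if both invest, competition erodes
# profits; if both wait, each keeps its real option alive.
payoffs = {
    ("invest", "invest"): (1.0, 1.0),
    ("invest", "wait"):   (4.0, 0.0),
    ("wait",   "invest"): (0.0, 4.0),
    ("wait",   "wait"):   (3.0, 3.0),
}

def pure_nash(payoffs):
    """All pure-strategy Nash equilibria of a 2x2 bimatrix game."""
    acts = ("invest", "wait")
    eq = []
    for r in acts:
        for c in acts:
            best_r = all(payoffs[(r, c)][0] >= payoffs[(a, c)][0] for a in acts)
            best_c = all(payoffs[(r, c)][1] >= payoffs[(r, a)][1] for a in acts)
            if best_r and best_c:
                eq.append((r, c))
    return eq

print(pure_nash(payoffs))   # -> [('invest', 'invest')]
```

With these payoffs "invest" is a dominant strategy, so the unique equilibrium is mutual early investment even though mutual waiting pays more: competition destroys the option value of waiting, which is precisely the interaction real option game models formalize in continuous time.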