994 results for SUGGESTED METHODS
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to its very small time steps, so faster approximations such as the Poisson and Binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of the Euler-Maruyama, Milstein and even higher order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
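To make the leap idea concrete, here is a minimal sketch of the Poisson τ-leap update - the Euler-Maruyama analogue for an SDE driven by Poisson noise - applied to a toy birth-death system. The reaction rates and time step are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy birth-death system: 0 -> X at rate k1, X -> 0 at rate k2 * x
k1, k2 = 10.0, 0.1

def propensities(x):
    return np.array([k1, k2 * x])

nu = np.array([+1, -1])  # state change caused by each reaction channel

def poisson_tau_leap(x0, t_end, tau):
    """Poisson tau-leap: over each interval of length tau, channel j
    fires Poisson(a_j(x) * tau) times - the analogue of an Euler step
    for a Poisson-driven SDE."""
    x, t = x0, 0.0
    while t < t_end:
        k = rng.poisson(propensities(x) * tau)  # firings per channel
        x = max(x + int(nu @ k), 0)             # state must stay non-negative
        t += tau
    return x

# Crude check against the exact stationary mean k1 / k2 = 100
samples = [poisson_tau_leap(0, 50.0, 0.01) for _ in range(200)]
print(np.mean(samples), np.var(samples))
```

The higher order and Runge-Kutta analogues discussed in the abstract refine this basic update; the sketch shows only the lowest order member of the family.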
Abstract:
Regulation is subject to information asymmetries that can lead to allocative and productive inefficiencies. One solution, suggested by Shleifer in 1985 and now adopted by many regulatory bodies around the world, is 'benchmarking', sometimes called 'yardstick competition'. In this paper we consider Shleifer's original approach to benchmarking and contrast it with the actual use of benchmarking by UK regulatory bodies in telecommunications, water and the energy sector since the privatizations of the 1980s and early 1990s. We find that benchmarking plays only one part, and sometimes a small part, in the setting of regulatory price caps in the UK. We also find that in practice benchmarking has been subject to a number of difficulties, which mean that it is never likely to be more than one tool in the regulator's armoury. The UK's experience provides lessons for regulation internationally.
Abstract:
This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR as justified in terms of the business case, stakeholders are of great importance to corporations. In the UK, the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed on the basis of an enlightened shareholder approach. Within this approach, the importance of stakeholders other than shareholders is recognised as instrumental in delivering shareholder value. Given the importance of these other stakeholders, it follows that corporate management should measure and manage stakeholder performance. Two general approaches could be adopted for this: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary measures. To consider these approaches further, the paper examines their possible use for two stakeholder groups: employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources, and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.
Abstract:
Two contrasting multivariate statistical methods, viz., principal components analysis (PCA) and cluster analysis, were applied to the study of neuropathological variations between cases of Alzheimer's disease (AD). To compare the two methods, 78 cases of AD were analyzed, each characterised by measurements of 47 neuropathological variables. Both methods of analysis revealed significant variations between AD cases. These variations were related primarily to differences in the distribution and abundance of senile plaques (SP) and neurofibrillary tangles (NFT) in the brain. Cluster analysis classified the majority of AD cases into five groups which could represent subtypes of AD. However, PCA suggested that variation between cases was more continuous, with no distinct subtypes. Hence, PCA may be a more appropriate method than cluster analysis in the study of neuropathological variations between AD cases.
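As a rough illustration of how the two methods are applied to a cases-by-variables matrix (the data below are random stand-ins, not the study's measurements), a Python sketch might look like this:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Stand-in for the study's data: 78 cases x 47 neuropathological variables
rng = np.random.default_rng(1)
X = rng.normal(size=(78, 47))

# Standardise so variables on different scales contribute comparably
Xs = StandardScaler().fit_transform(X)

# PCA: project cases onto the leading components; a continuous spread
# with no distinct clumps argues against discrete subtypes
scores = PCA(n_components=2).fit_transform(Xs)

# Cluster analysis: cut a Ward dendrogram into five groups, mirroring
# the five putative subtypes reported in the abstract
groups = fcluster(linkage(Xs, method="ward"), t=5, criterion="maxclust")
print(scores.shape, np.bincount(groups)[1:])
```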
Abstract:
Counts of Pick bodies (PB), Pick cells (PC), senile plaques (SP) and neurofibrillary tangles (NFT) were made in the frontal and temporal cortex of patients with Pick's disease (PD). Lesions were stained histologically with hematoxylin and eosin (HE) and the Bielschowsky silver impregnation method, and labeled immunohistochemically with antibodies raised to ubiquitin and tau. The greatest numbers of PB were revealed by immunohistochemistry. Counts of PB revealed by ubiquitin and tau were highly positively correlated, which suggested that the two antibodies recognized virtually identical populations of PB. The greatest numbers of PC were revealed by HE, followed by the anti-ubiquitin antibody. However, the correlation between counts was poor, suggesting that HE and ubiquitin revealed different populations of PC. The greatest numbers of SP and NFT were revealed by the Bielschowsky method, indicating the presence of Alzheimer-type lesions not revealed by immunohistochemistry. In addition, more NFT were revealed by the anti-ubiquitin than by the anti-tau antibody. The data suggested that in PD: (i) the anti-ubiquitin and anti-tau antibodies were equally effective at labeling PB; (ii) both HE and anti-ubiquitin should be used to quantitate PC; and (iii) the Bielschowsky method should be used to quantitate SP and NFT.
Abstract:
An investigation of behavioural patterns that form a basis for termite control in the Australasian region was undertaken using laboratory colonies of the subterranean termite Reticulitermes santonensis (Feytaud). The study attempted to build a picture of the behavioural elements of individuals in a colony and, based on this, trophallaxis, aggression and cannibalism were investigated in detail. Preliminary study of food transmission showed that 'workers' played a major role in the distribution of food. It was found that, among the factors responsible for the release of trophallactic behaviour, the presence of the 'right odour' between participants was important. It also appeared that the role taken by individuals depended on whether they were hungry or fully fed. Antennal palpation was shown by donors and acceptors alike, and this seemed to be excitatory in function. Introduction of aliens into nests elicited aggression, and these aliens were often killed. Factors eliciting aggression were investigated, and colony odour was found to be important. Further investigations revealed that the development of colony odour was governed by genetic and environmental mechanisms. Termite response to injury and death was also governed by odour. In the case of injury, either the fresh haemolymph from the wound or some component of the haemolymph evoked cannibalism. Necrophagic behaviour was found to be released by fatty acids found in the corpses. Finally, the response of colonies to nestmates carrying arsenic trioxide was investigated. It was found that living and freshly dead arsenic-carrying nestmates were treated like normal nestmates, resulting in high initial mortality. However, poisoned cadavers soon became repellent and were buried, thus preventing further spread of the poison to the rest of the colony. This suggested that control of subterranean termites by arsenic trioxide is unlikely to be fully effective, especially in those species capable of developing secondary reproductives from survivors and thus rebuilding the community.
Abstract:
This thesis is an exploration of the organisation and functioning of the human visual system using the non-invasive functional imaging modality magnetoencephalography (MEG). Chapters one and two provide an introduction to the human visual system and magnetoencephalographic methodologies. These chapters subsequently describe the methods by which MEG can be used to measure neuronal activity from the visual cortex. Chapter three describes the development and implementation of novel analytical tools, including beamforming based analyses, spectrographic movies and an optimisation of group imaging methods. Chapter four focuses on the use of established and contemporary analytical tools in the investigation of visual function. This begins with an investigation of visually evoked and induced responses, covering visual evoked potentials (VEPs) and event related synchronisation/desynchronisation (ERS/ERD). Chapter five describes the use of novel methods in the investigation of cortical contrast response and demonstrates distinct contrast response functions in striate and extra-striate regions of visual cortex. Chapter six uses synthetic aperture magnetometry (SAM) to investigate visual cortical gamma oscillations in response to various visual stimuli, concluding that pattern is central to their generation and that their amplitude increases linearly as a function of stimulus contrast, consistent with results from invasive electrode studies in the macaque monkey. Chapter seven describes the use of driven visual stimuli and tuned SAM methods in a pilot study of retinotopic mapping using MEG, finding that activity in the primary visual cortex can be distinguished in four quadrants and two eccentricities of the visual field. Chapter eight describes a novel implementation of the SAM beamforming method in the investigation of a subject with migraine visual aura; the method reveals desynchronisation of the alpha and gamma frequency bands in occipital and temporal regions contralateral to observed visual abnormalities. The final chapter summarises the main conclusions and suggests further work.
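Beamformer analyses such as SAM compute sensor weightings from the data covariance and a forward model. As a point of reference, here is a minimal sketch of the generic unit-gain LCMV weight formula (the numbers are invented, and this is only the textbook formula, not the thesis's SAM pipeline):

```python
import numpy as np

def lcmv_weights(cov, leadfield):
    """Unit-gain LCMV beamformer weights for one source location:
    w = C^{-1} l / (l^T C^{-1} l), with C the sensor covariance and
    l the lead field of the source."""
    c_inv_l = np.linalg.solve(cov, leadfield)
    return c_inv_l / (leadfield @ c_inv_l)

# Invented example: 5 sensors, synthetic positive definite covariance
rng = np.random.default_rng(2)
A = rng.normal(size=(5, 5))
cov = A @ A.T + 5 * np.eye(5)
l = rng.normal(size=5)
w = lcmv_weights(cov, l)
print(w @ l)  # unit gain at the modelled source: prints 1.0
```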
Abstract:
Optical layouts incorporating a binary phase diffractive grating and a standard micro-objective were used for femtosecond microfabrication of periodic structures in fused silica. Two beams, generated in a Talbot-type interferometer, interfered on the surface and in the bulk of the sample. The suggested method allows better control over the transverse size of the grating pitch, and thus control of the reflection strength of a waveguide or fibre grating. We present examples of direct inscription of sub-micrometre periodic structures using 267 nm femtosecond laser radiation.
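The pitch of such structures follows from standard two-beam interference, Λ = λ / (2 sin θ), with θ the half-angle between the beams. A small sketch of the relation (the half-angle below is an assumed value, not one reported in the paper):

```python
import math

def fringe_period_nm(wavelength_nm, half_angle_deg):
    """Two-beam interference period: Lambda = lambda / (2 sin(theta)),
    where theta is the half-angle between the interfering beams."""
    return wavelength_nm / (2 * math.sin(math.radians(half_angle_deg)))

# With the paper's 267 nm radiation and an assumed 15 degree half-angle
print(fringe_period_nm(267, 15))  # ~516 nm, i.e. sub-micrometre
```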
Abstract:
* The work is partially supported by Grant no. NIP917 of the Ministry of Science and Education – Republic of Bulgaria.
Abstract:
The paper reviews some axioms of additivity concerning ranking methods used for generalized tournaments with possible missing values and multiple comparisons. It is shown that one of the most natural properties, called consistency, has strong links to independence of irrelevant comparisons, an axiom judged unfavourable when players have different opponents. Therefore, some directions for weakening consistency are suggested, and several ranking methods - the score, generalized row sum and least squares methods, as well as fair bets and two of its variants (one of them entirely new) - are analysed as to whether they satisfy the properties discussed. It turns out that least squares and generalized row sum with an appropriate parameter choice preserve the relative ranking of two objects if the ranking problems added have the same comparison structure.
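To make the least squares method concrete, here is a minimal sketch on an invented four-player ranking problem (not an example from the paper): the ratings r solve the Laplacian system L r = s, where s is the score vector of net wins.

```python
import numpy as np

# Invented results: R[i, j] = net score of i against j (wins minus
# losses over multiple comparisons); absent pairs have no matches
R = np.array([[ 0,  2,  1,  0],
              [-2,  0,  1,  0],
              [-1, -1,  0,  3],
              [ 0,  0, -3,  0]])
M = np.array([[0, 2, 1, 0],   # number of comparisons per pair
              [2, 0, 1, 0],
              [1, 1, 0, 3],
              [0, 0, 3, 0]])

s = R.sum(axis=1)               # score vector: net wins per player
L = np.diag(M.sum(axis=1)) - M  # Laplacian of the comparison multigraph

# Least squares ratings: L is singular, so take the minimum-norm
# solution, which fixes the ratings to sum to zero
r = np.linalg.pinv(L) @ s
print(np.round(r, 3))
```

The additivity axioms in the paper then ask how such ratings behave when two ranking problems with the same comparison structure M are added together.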
Abstract:
The purpose of this research was to compare the delivery methods practiced by higher education faculty teaching distance courses with recommended or emerging standard instructional delivery methods for distance education. Previous research shows that traditional-type instructional strategies have been used in distance education and that there has been no training in distance teaching. Secondary data, however, appear to suggest emerging practices which could be pooled toward the development of standards. This is a qualitative study based on the constant comparative analysis approach of grounded theory. Participants (N = 5) were full-time faculty teaching distance education courses. The observation method used was unobtrusive content analysis of videotaped instruction. Triangulation of data was accomplished through one-on-one in-depth interviews and through the literature review. Because non-media content was also analyzed, a special time-sampling technique was designed by the researcher - influenced by content analyst theories of media-related data - to sample the portions of the videotaped instruction that were observed and counted. A standardized interview guide was used to collect data from the in-depth interviews. Coding was based on categories drawn from the review of literature and from Cranton and Weston's (1989) typology of instructional strategies. The data were observed, counted, tabulated, analyzed, and interpreted solely by the researcher; systematic and rigorous data collection and analysis, however, lent credibility to the data. The findings of this study supported the proposition that there are no standard instructional practices for distance teaching. Further, the findings revealed that of the emerging practices suggested by proponents and by faculty who teach distance education courses, few were practiced even minimally. A noted example was the use of lecture and questioning: questioning was used a great deal as a teaching tool with students at the originating site, but not with distance students; lectures were given, but were mostly conducted in traditional fashion - long in duration and with no interactive component. It can be concluded from the findings that while there are no standard practices for instructional delivery in distance education, there appears to be sufficient information from secondary and empirical data to initiate some standard instructional practices. Grounded in this research, therefore, is the theory that the way to arrive at instructional delivery standards for televised distance education is a pooling of the tacitly agreed-upon emerging practices of proponents and practicing instructors. Implicit in this theory is a need for experimental research so that these emerging practices can be tested, tried, and proven, ultimately resulting in formal standards for instructional delivery in television education.
Abstract:
In an attempt to improve students' functional understanding of plagiarism, a variety of approaches were tried within the context of a more comprehensive information literacy program. Sessions were taught as a one-hour "module" inside a required communications skills class at a small private university. Approaches taken included control sessions (a straightforward PowerPoint presentation of the material), direct instruction sessions (featuring mostly direct lecture but with some seatwork as well), and student-centered sessions (utilizing role playing and group exercises). Students were taught basic content and definitions regarding plagiarism, what circumstances or instances constitute plagiarism, where to go for help in avoiding plagiarism, and what constitutes appropriate paraphrasing and citation. Pre-test and post-test scores determined students' functional understanding primarily by their ability to recognize properly and improperly paraphrased text, their content understanding by their combined total score on a multiple-choice quiz, and their attitude and conceptual understanding by their ability to recognize circumstances that would constitute plagiarism. While students improved across all methods, the study was unable to identify one that performed significantly better than the others. The results supported the need for more education with regard to plagiarism and suggested a need for more time on task and/or a mixed approach to conveying the content.
Abstract:
On the basis of data on benthic foraminifera and sediment biogeochemistry (contents of total organic carbon, calcium carbonate and biogenic opal) in two cores (1265 and 1312 m water depth) from the southeastern Sakhalin slope and one core (839 m water depth) from the southwestern Kamchatka slope, variations of the oxygen minimum zone in the Okhotsk Sea during the last 50 ka are reconstructed. The oxygen minimum zone was less pronounced during the MIS 2 cooling, which is suggested to have been caused by maximal expansion of the sea ice cover, decreased marine productivity and increased production of oxygenated Okhotsk Sea Intermediate Water (OSIW). A two-step strengthening of the oxygen minimum zone during the warmings of Terminations 1a and 1b was combined with (1) enhanced oxygen consumption due to the decomposition of large amounts of organic matter in the water column and bottom sediments, driven by increased marine productivity and the supply of terrigenous material from submerged northern shelves; (2) sea ice cover retreat and reduced OSIW production; and (3) free inflow of oxygen-depleted intermediate water from the North Pacific.
Abstract:
OBJECTIVE: To demonstrate the application of causal inference methods to observational data in the obstetrics and gynecology field, particularly causal modeling and semi-parametric estimation. BACKGROUND: Human immunodeficiency virus (HIV)-positive women are at increased risk for cervical cancer and its treatable precursors. Determining whether potential risk factors such as hormonal contraception are true causes is critical for informing public health strategies as longevity increases among HIV-positive women in developing countries. METHODS: We developed a causal model of the factors related to combined oral contraceptive (COC) use and cervical intraepithelial neoplasia 2 or greater (CIN2+) and modified the model to fit the observed data, drawn from women in a cervical cancer screening program at HIV clinics in Kenya. Assumptions required for substantiation of a causal relationship were assessed. We estimated the population-level association using semi-parametric methods: g-computation, inverse probability of treatment weighting, and targeted maximum likelihood estimation. RESULTS: We identified 2 plausible causal paths from COC use to CIN2+: via HPV infection and via increased disease progression. Study data enabled estimation of the latter only with strong assumptions of no unmeasured confounding. Of 2,519 women under 50 screened per protocol, 219 (8.7%) were diagnosed with CIN2+. Marginal modeling suggested a 2.9% (95% confidence interval 0.1%, 6.9%) increase in prevalence of CIN2+ if all women under 50 were exposed to COC; the significance of this association was sensitive to method of estimation and exposure misclassification. CONCLUSION: Use of causal modeling enabled clear representation of the causal relationship of interest and the assumptions required to estimate that relationship from the observed data. Semi-parametric estimation methods provided flexibility and reduced reliance on correct model form. Although selected results suggest an increased prevalence of CIN2+ associated with COC, evidence is insufficient to conclude causality. Priority areas for future studies to better satisfy causal criteria are identified.
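As a toy illustration of the simplest of the three estimators named above, g-computation (the data are simulated stand-ins, not the study's cohort, and the confounder structure is invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated stand-ins: W = measured confounders, A = COC exposure (0/1),
# Y = CIN2+ diagnosis (0/1); the coefficients below are arbitrary
rng = np.random.default_rng(3)
n = 2519
W = rng.normal(size=(n, 2))
A = rng.binomial(1, 1 / (1 + np.exp(-W[:, 0])))
Y = rng.binomial(1, 1 / (1 + np.exp(-(0.3 * A + W[:, 1] - 2))))

# G-computation: fit an outcome regression on (A, W), then average the
# predicted risks under "everyone exposed" vs. "everyone unexposed"
model = LogisticRegression().fit(np.column_stack([A, W]), Y)
risk1 = model.predict_proba(np.column_stack([np.ones(n), W]))[:, 1].mean()
risk0 = model.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1].mean()
print(f"g-computation risk difference: {risk1 - risk0:.4f}")
```

IPTW and TMLE would reweight or update this outcome model using a propensity score for A; the marginal contrast being estimated is the same.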