905 results for Expectation Maximization
Abstract:
While planning the GAIN International Study of gavestinel in acute stroke, a sequential triangular test was proposed but not implemented. Before the trial commenced, it was agreed to evaluate the sequential design retrospectively, comparing the resulting analyses, trial durations and sample sizes in order to assess the potential of sequential procedures for future stroke trials. This paper presents four sequential reconstructions of the GAIN study made under various scenarios. For the data as observed, the sequential design would have reduced the trial sample size by 234 patients and shortened its duration by 3 or 4 months. Had the study not achieved a recruitment rate that far exceeded expectation, the advantages of the sequential design would have been much greater. Sequential designs appear to be an attractive option for trials in stroke. Copyright 2004 S. Karger AG, Basel
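Since the abstract turns on the mechanics of a sequential triangular test, a minimal sketch of such stopping boundaries may help. It assumes Whitehead's common parameterization with equal type I and type II error rates and an illustrative reference effect; it does not reproduce the actual GAIN design values.

```python
# Sketch of Whitehead-style triangular-test boundaries on the (Z, V)
# score/information scale. The parameterization (alpha = beta, reference
# effect theta_ref) is an illustrative assumption, not the GAIN design.
import numpy as np
from math import log

def triangular_boundaries(theta_ref, alpha, V):
    a = 2.0 * log(1.0 / (2.0 * alpha)) / theta_ref  # boundary intercept
    c = theta_ref / 4.0                             # boundary slope parameter
    upper = a + c * V          # stop for efficacy when Z >= upper
    lower = -a + 3.0 * c * V   # stop for futility when Z <= lower
    return upper, lower

V = np.linspace(0.0, 600.0, 7)  # accumulating Fisher information at interim looks
up, lo = triangular_boundaries(theta_ref=0.2, alpha=0.025, V=V)
for v, u, l in zip(V, up, lo):
    print(f"V={v:6.1f}  continue while {l:7.2f} < Z < {u:7.2f}")
```

The two boundaries converge (here at V = a/c, roughly 600 for these values), which is what caps the maximum sample size below that of the corresponding fixed design.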
Abstract:
Floral nectar spurs are widely considered to influence pollinator behaviour in orchids. Spurs of 21 orchid species selected from within four molecularly circumscribed clades of subtribe Orchidinae (based on Platanthera s.l., Gymnadenia-Dactylorhiza s.l., Anacamptis s.l., Orchis s.s.) were examined under light and scanning electron microscopes in order to estimate correlations between nectar production (categorized as absent, trace, reservoir), interior epidermal papillae (categorized as absent, short, medium, long) and epidermal cell striations (categorized as apparently absent, weak, moderate, strong). Closely related congeneric species scored similarly, but more divergent species showed less evidence of phylogenetic constraints. Nectar secretion was negatively correlated with striations and positively correlated with papillae, which were especially frequent and large in species producing substantial reservoirs of nectar. We speculate that the primary function of the papillae is conserving energy through nectar resorption, and explain the presence of large papillae in a minority of deceit-pollinated species by arguing that the papillae improve pollination because they fulfil a tactile expectation of pollinating insects. In contrast, the prominence of striations may be a 'spandrel', simply reflecting the thickness of the overlying cuticle. Developmentally, the spur is an invagination of the labellum; it is primarily vascularized by a single 'U'-shaped primary strand, with smaller strands present in some species. Several suggestions are made for developing further, more targeted research programmes. © 2009 The Linnean Society of London, Botanical Journal of the Linnean Society, 2009, 160, 369-387.
Abstract:
This paper considers the problem of estimation when one of a number of populations, assumed normal with known common variance, is selected on the basis of its having the largest observed mean. Conditional on selection of the population, the observed mean is a biased estimate of the true mean. This problem arises in the analysis of clinical trials in which selection is made between a number of experimental treatments that are compared with each other, either with or without an additional control treatment. Attempts to obtain approximately unbiased estimates in this setting have been proposed by Shen [2001. An improved method of evaluating drug effect in a multiple dose clinical trial. Statist. Medicine 20, 1913–1929] and Stallard and Todd [2005. Point estimates and confidence regions for sequential trials involving selection. J. Statist. Plann. Inference 135, 402–419]. This paper explores the problem in the simple setting in which two experimental treatments are compared in a single analysis. It is shown that in this case the estimate of Stallard and Todd is the maximum-likelihood estimate (m.l.e.), and this is compared with the estimate proposed by Shen. In particular, it is shown that the m.l.e. has infinite expectation whatever the true value of the mean being estimated. We show that there is no conditionally unbiased estimator, and propose a new family of approximately conditionally unbiased estimators, comparing these with the estimators suggested by Shen.
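The conditional bias at the heart of this problem is easy to reproduce by simulation. A minimal sketch, with two arms of equal true mean and illustrative sample sizes (not the authors' setting):

```python
# Monte Carlo illustration of selection bias: the observed mean of the arm
# selected for being largest overestimates its true mean. All numbers here
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0])          # two experimental treatments, equal true means
sigma, n, reps = 1.0, 50, 200_000  # known common variance, per-arm sample size

xbar = rng.normal(mu, sigma / np.sqrt(n), size=(reps, 2))  # observed arm means
selected = xbar.max(axis=1)        # mean of whichever arm was selected

print(f"true mean of selected arm:   {mu[0]:.4f}")
print(f"E[observed mean | selected]: {selected.mean():.4f}")  # clearly > 0
```

For two arms with equal true means, the expected selected mean lies sigma/sqrt(pi*n) above the truth (about 0.08 here), which is exactly the bias that conditionally unbiased estimators aim to remove.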
Abstract:
Microsatellite lengths change over evolutionary time through a process of replication slippage. A recently proposed model of this process holds that the expansionary tendencies of slippage mutation are balanced by point mutations breaking longer microsatellites into smaller units, and that this process gives rise to the observed frequency distributions of uninterrupted microsatellite lengths. We refer to this as the slippage/point-mutation theory. Here we derive the theory's predictions for interrupted microsatellites comprising regions of perfect repeats, termed segments, separated by dinucleotide interruptions containing point mutations. These predictions are tested by reference to the frequency distributions of segments of AC microsatellite in the human genome, and several predictions are shown not to be supported by the data, as follows. The estimated slippage rates are relatively low for the first four repeats and then rise, initially linearly, with length, in accordance with previous work. However, contrary to expectation and the experimental evidence, the inferred slippage rates decline in segments above 10 repeats. Point mutation rates are also found to be higher within microsatellites than elsewhere. The theory provides an excellent fit to the frequency distribution of peripheral segment lengths but fails to explain why internal segments are shorter. Furthermore, there are fewer microsatellites with many segments than predicted. The frequencies of interrupted microsatellites decline geometrically with microsatellite size measured in number of segments, so that for each additional segment, the number of microsatellites is 33.6% less. Overall we conclude that the detailed structure of interrupted microsatellites cannot be reconciled with the existing slippage/point-mutation theory of microsatellite evolution, and we suggest that microsatellites are stabilized by processes acting on interior rather than peripheral segments.
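The reported geometric decline implies a simple expected-count model: if each additional segment retains a fraction r ≈ 0.664 of microsatellites, counts follow N_k = N_1 · r^(k-1). A sketch with a hypothetical starting count (only the 33.6% figure comes from the abstract):

```python
# Expected microsatellite counts under the geometric decline reported above.
# N1 is a hypothetical starting count, used purely for illustration.
import numpy as np

r = 1.0 - 0.336               # retention ratio per additional segment (from the text)
N1 = 10_000                   # hypothetical count of single-segment microsatellites
segments = np.arange(1, 8)
counts = N1 * r ** (segments - 1)

for k, n in zip(segments, counts):
    print(f"{k} segment(s): ~{n:7.0f} expected")
```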
Abstract:
A series of government initiatives has raised both the profile of ICT in the curriculum and the expectation that high-quality teaching and learning resources will be accessible across electronic networks. In order for e-learning resources such as websites to have the maximum educational impact, teachers need to be involved in their design and development. Use-case analysis provides a means of defining user requirements and other constraints in such a way that software developers can produce e-learning resources which reflect teachers' professional knowledge and support their classroom practice. It has some features in common with the participatory action research used to develop other aspects of classroom practice. Two case studies are presented: one involves the development of an on-line resource centred on transcripts of original historical documents; the other describes how 'Learning how to Learn', a major distributed research project funded under the ESRC Teaching and Learning Research Programme, is using use-case analysis to develop web resources and services.
Abstract:
Over-involved parenting is commonly hypothesized to be a risk factor for the development of anxiety disorders in childhood. This parenting style may result from parental attempts to prevent child distress, based on expectations that the child will be unable to cope in a challenging situation. Naturalistic studies are limited in their ability to disentangle the overlapping contributions of child and parent factors in driving parental behaviours. To overcome this difficulty, an experimental study was conducted in which parental expectations of child distress were manipulated and the effects on parent behaviour and child mood were assessed. Fifty-two children (aged 7-11 years) and their primary caregivers participated. Parents were allocated to either a "positive" or a "negative" expectation group. Observations were made of the children and their parents interacting whilst completing a difficult anagram task. Parents given negative expectations of their child's response displayed higher levels of involvement. No differences were found on indices of child mood and behaviour, and possible explanations for this are considered. The findings are consistent with suggestions that increased parental involvement may be a "natural" reaction to enhanced perceptions of child vulnerability and an attempt to avoid child distress.
Abstract:
We present a conceptual architecture for a Group Support System (GSS) to facilitate Multi-Organisational Collaborative Groups (MOCGs) initiated by local government and including external organisations of various types. MOCGs consist of individuals from several organisations which have agreed to work together to solve a problem. The expectation is that more can be achieved working in harmony than separately. Work is done interdependently, rather than independently in diverse directions. Local government, faced with solving complex social problems, deploys MOCGs to enable solutions across organisational, functional, professional and juridical boundaries, by involving statutory, voluntary, community, not-for-profit and private organisations. This is not a silver bullet, as it introduces new pressures. Each member organisation has its own goals, operating context and particular approaches, which can be expressed as its norms and business processes. Organisations working together must find ways of eliminating differences or mitigating their impact in order to reduce the risks of collaborative inertia and conflict. A GSS is an electronic collaboration system that facilitates group working and can offer assistance to MOCGs. Many existing GSSs have been developed primarily for single-organisation collaborative groups, and although some issues are common, MOCGs face some difficulties peculiar to them and others that they experience to a greater extent: a diversity of primary organisational goals among members; different funding models and other pressures; more significant differences in other information systems, both technologically and in their use, than single organisations; and greater variation in acceptable approaches to solving problems. In this paper, we analyse the requirements of MOCGs led by local government agencies, leading to a conceptual architecture for an e-government GSS that captures the relationships between 'goal', 'context', 'norm', and 'business process'. Our models capture the dynamics of the circumstances surrounding each individual representing an organisation in a MOCG, along with the dynamics of the MOCG itself as a separate community.
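As a hedged illustration of the conceptual architecture (not the authors' actual model), the relationships between 'goal', 'context', 'norm' and 'business process' could be captured roughly as follows; all class and field names are assumptions:

```python
# Toy data model for the four concepts the architecture relates. Purely
# illustrative: the names and structure are assumptions, not the paper's design.
from dataclasses import dataclass, field

@dataclass
class Norm:
    description: str                       # e.g. "decisions need committee sign-off"

@dataclass
class BusinessProcess:
    name: str
    governed_by: list[Norm] = field(default_factory=list)

@dataclass
class Organisation:
    name: str
    goals: list[str] = field(default_factory=list)   # the member's own goals
    context: str = ""                                # its operating context
    processes: list[BusinessProcess] = field(default_factory=list)

@dataclass
class MOCG:
    """The collaborative group itself, a community distinct from its members."""
    shared_problem: str
    members: list[Organisation] = field(default_factory=list)

    def unshared_goals(self) -> set[str]:
        # naive illustration: goals held by only one member are candidates
        # for the mediation the paper argues a GSS must support
        all_goals = [g for m in self.members for g in m.goals]
        return {g for g in all_goals if all_goals.count(g) == 1}
```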
Abstract:
Conflation of academic copyright issues with respect to books (whether text books, research monographs or popularisations) and research articles is rife in the academic publishing industry. A charitable interpretation is that this is because, to publishers, they are all effectively the same: a product produced for commercial benefit. An uncharitable interpretation is that this is a classic Fear, Uncertainty and Doubt approach, in an attempt to delay the inevitable move to Open Access (OA) to research articles. To authors, however, research articles and books are generally very different things. Research articles are produced without the expectation of direct financial return, whereas books generally include some consideration of financial return. Taylor’s “Copyright and research: an academic publisher’s perspective” (SCRIPT-ed 4:2) falls wholesale into this mental trap; in particular, his lauding of the position paper of the Association of American Professional and Scholarly Publishers shows a lack of understanding of the continuing huge loss to scholarship caused by the lack of OA to research articles. It should be regarded as a categorical imperative for scholars to embrace OA to research articles.
Abstract:
Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd.
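As a toy illustration of the kind of decision an economically enhanced resource manager faces, the sketch below greedily admits SLAs by revenue per unit of demanded resource under a single capacity constraint. This is not SORMA code; the data and the greedy knapsack heuristic are assumptions, not the EERM's actual techniques.

```python
# Toy revenue-maximizing admission of SLAs under one capacity constraint.
# Greedy by revenue density; illustrative only, not the EERM's actual method.
def allocate(slas, capacity):
    """slas: list of (name, resource_demand, revenue) tuples."""
    chosen, revenue = [], 0.0
    for name, demand, price in sorted(slas, key=lambda s: s[2] / s[1], reverse=True):
        if demand <= capacity:       # admit only if the SLA still fits
            capacity -= demand
            chosen.append(name)
            revenue += price
    return chosen, revenue

slas = [("sla-a", 40, 100.0), ("sla-b", 30, 90.0), ("sla-c", 50, 110.0)]
print(allocate(slas, capacity=100))  # -> (['sla-b', 'sla-a'], 190.0)
```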
Abstract:
Most active-contour methods are based either on maximizing the image contrast under the contour or on minimizing the sum of squared distances between contour and image 'features'. The Marginalized Likelihood Ratio (MLR) contour model uses a contrast-based measure of goodness-of-fit for the contour and thus falls into the first class. The point of departure from previous models consists in marginalizing this contrast measure over unmodelled shape variations. The MLR model naturally leads to the EM Contour algorithm, in which pose optimization is carried out by iterated least-squares, as in feature-based contour methods. The difference with respect to other feature-based algorithms is that the EM Contour algorithm minimizes squared distances from Bayes least-squares (marginalized) estimates of contour locations, rather than from 'strongest features' in the neighborhood of the contour. Within the framework of the MLR model, alternatives to the EM algorithm can also be derived: one of these alternatives is the empirical-information method. Tracking experiments demonstrate the robustness of pose estimates given by the MLR model, and support the theoretical expectation that the EM Contour algorithm is more robust than either feature-based methods or the empirical-information method. © 2005 Elsevier B.V. All rights reserved.
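The contrast drawn above, Bayes least-squares (marginalized) estimates versus 'strongest features', can be sketched in one dimension. This is an assumed toy reconstruction rather than the paper's implementation: pose is reduced to a single translation and features to candidate edge positions along each normal.

```python
# 1-D EM-style contour fitting sketch (illustrative, not the MLR paper's code).
# E-step: weight candidate features by likelihood under the current pose and
# form a weighted-mean (Bayes least-squares) edge estimate per contour point.
# M-step: least-squares pose update; for a pure translation this is the mean
# residual between the estimates and the model points.
import numpy as np

rng = np.random.default_rng(1)
true_shift = 1.3
model_pts = np.linspace(0.0, 10.0, 20)            # contour points (1-D stand-in)
edges = model_pts[:, None] + true_shift + rng.normal(0.0, 0.2, (20, 1))
clutter = model_pts[:, None] + rng.uniform(-4.0, 4.0, (20, 4))
candidates = np.concatenate([edges, clutter], axis=1)  # true edge + clutter features

shift, sigma = 0.0, 1.0
for _ in range(20):
    w = np.exp(-0.5 * ((candidates - (model_pts[:, None] + shift)) / sigma) ** 2)
    expected = (w * candidates).sum(axis=1) / w.sum(axis=1)   # E-step
    shift = (expected - model_pts).mean()                     # M-step

print(f"estimated shift: {shift:.2f} (true {true_shift})")
```

Averaging over likelihood-weighted candidates, rather than snapping to the single strongest feature, is what gives the marginalized approach its robustness to clutter.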
Abstract:
Ten projects constructed in Ghana between 2003 and 2010 are examined and analysed to ascertain the reliability of the cost estimates provided for the projects. Cost estimates for five of the projects were calculated by consultants and estimates for the remaining five were calculated by contractors. Cost estimates prepared by contractors seemed to be closer to actual costs than estimates calculated by consultants. Projects estimated by consultants experienced an average cost overrun of 40% and time overrun of 62%, whereas projects priced by contractors experienced an average cost overrun of 6% and time overrun of 41%. It seemed that contractors had a better understanding of the actual construction processes and a clearer expectation of the needs of the client, and hence an ability to produce estimates closer to reality. Construction clients in Ghana should rely on contractors for more realistic cost estimates, as estimates by consultants may be inaccurate. Where consultants are employed, an allowance of up to 40% should be added to the estimated costs as a margin for inaccuracy.
Abstract:
The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be defined as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions from the outputs of the simulations up to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology to study general Climate Change problems on virtually any time scale by resorting to only well selected simulations, and by taking full advantage of ensemble methods. The specific case of globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
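The Lorenz 96 test bed and the response of an expectation value to a forcing change can be sketched directly. The parameters below (36 variables, F = 8, the energy observable) are illustrative assumptions, and the crude two-run finite difference stands in for the paper's far more careful ensemble methodology.

```python
# Lorenz 96 model and a rough estimate of the linear response of the mean
# energy <E> to a change in the forcing F. Illustrative parameters only.
import numpy as np

def l96_step(x, F, dt=0.01):
    """One RK4 step of dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    def rhs(x):
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    k1 = rhs(x); k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2); k4 = rhs(x + dt * k3)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def mean_energy(F, n=36, steps=50_000, burn=10_000, seed=0):
    x = np.random.default_rng(seed).standard_normal(n)
    total = 0.0
    for t in range(steps):
        x = l96_step(x, F)
        if t >= burn:
            total += 0.5 * (x ** 2).sum()   # energy observable
    return total / (steps - burn)

dF = 0.5
e0, e1 = mean_energy(8.0), mean_energy(8.0 + dF)
print(f"finite-difference response d<E>/dF ~ {(e1 - e0) / dF:.2f}")
```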
Abstract:
Theoretical models suggest that decisions about diet, weight and health status are endogenous within a utility maximization framework. In this article, we model these behavioural relationships in a fixed-effect panel setting using a simultaneous equation system, with a view to determining whether economic variables can explain the trends in calorie consumption, obesity and health in Organization for Economic Cooperation and Development (OECD) countries and the large differences among the countries. The empirical model shows that progress in medical treatment and health expenditure mitigates mortality from diet-related diseases, despite rising obesity rates. While the model accounts for endogeneity and serial correlation, results are affected by data limitations.
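A schematic of the estimation strategy (a fixed-effect within-transformation followed by instrumenting an endogenous regressor) can be sketched on synthetic data. Variable names, the instrument, and every coefficient below are assumptions for illustration, not the article's data or model.

```python
# Fixed-effect 2SLS sketch on synthetic panel data: obesity and mortality are
# jointly determined (correlated errors), and z plays the instrument's role.
import numpy as np

rng = np.random.default_rng(0)
n_country, n_year = 30, 40
entity = np.repeat(np.arange(n_country), n_year)     # country index per observation
alpha = rng.normal(0, 1, n_country)[entity]          # country fixed effects
z = rng.normal(size=entity.size)                     # instrument (assumed exogenous)
u = rng.normal(size=entity.size)                     # shared error => endogeneity
obesity = alpha + 0.8 * z + u
mortality = alpha - 0.5 * obesity + 0.7 * u + rng.normal(size=entity.size)

def demean(v):                                       # within (fixed-effect) transform
    means = np.bincount(entity, v) / np.bincount(entity)
    return v - means[entity]

y, x, iv = demean(mortality), demean(obesity), demean(z)
x_hat = iv * (iv @ x) / (iv @ iv)                    # first stage: project x on iv
beta = (x_hat @ y) / (x_hat @ x_hat)                 # second stage
print(f"2SLS effect of obesity on mortality: {beta:.3f} (true -0.5)")
```

Plain OLS on the same data would be biased by the shared error component, which is the endogeneity the simultaneous-equation approach is designed to handle.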
Abstract:
A methodology for identifying equatorial waves is used to analyze the multilevel 40-yr ECMWF Re-Analysis (ERA-40) data for two different years (1992 and 1993) to investigate the behavior of the equatorial waves under opposite phases of the quasi-biennial oscillation (QBO). A comprehensive view of 3D structures and of zonal and vertical propagation of equatorial Kelvin, westward-moving mixed Rossby–gravity (WMRG), and n = 1 Rossby (R1) waves in different QBO phases is presented. Consistent with expectation based on theory, upward-propagating Kelvin waves occur more frequently during the easterly QBO phase than during the westerly QBO phase. However, the westward-moving WMRG and R1 waves show the opposite behavior. The presence of vertically propagating equatorial waves in the stratosphere also depends on the upper tropospheric winds and tropospheric forcing. Typical propagation parameters such as the zonal wavenumber, zonal phase speed, period, vertical wavelength, and vertical group velocity are found. In general, waves in the lower stratosphere have a smaller zonal wavenumber, shorter period, faster phase speed, and shorter vertical wavelength than those in the upper troposphere. All of the waves in the lower stratosphere show an upward group velocity and downward phase speed. When the phase of the QBO is not favorable for waves to propagate, their phase speed in the lower stratosphere is larger and their period is shorter than in the favorable phase, suggesting Doppler shifting by the ambient flow and a filtering of the slow waves. Tropospheric WMRG and R1 waves in the Western Hemisphere also show upward phase speed and downward group velocity, with an indication of their forcing from middle latitudes. Although the waves observed in the lower stratosphere are dominated by “free” waves, there is evidence of some connection with previous tropical convection in the favorable year for the Kelvin waves in the warm water hemisphere and WMRG and R1 waves in the Western Hemisphere, which is suggestive of the importance of convective forcing for the existence of propagating coupled Kelvin waves and midlatitude forcing for the existence of coupled WMRG and R1 waves.