935 results for "Power to decide process"
Abstract:
Female choice based on multiple male traits has been documented in many species, but the functions of such multiple traits are still under debate. The satin bowerbird has a polygynous mating system in which males attract females to bowers for mating; females choose mates based on multiple aspects of males and their bowers. In this paper, we demonstrate that females use some cues to decide which males to examine closely and other cues to decide which males to mate with. Female visitation rates to bowers were significantly related to male size and to the males' 'solitary' display rates, and, to a lesser extent, to the numbers of bower decorations. After controlling for female visitation rates, a male's mating success was significantly related to his size and to the rate at which he 'painted' his bower with saliva and chewed-up plant material.
Abstract:
There is a general form of an argument, which I call the 'argument from vagueness', that attempts to show that objects persist by perduring, via the claim that vagueness is never ontological in nature and thus that composition is unrestricted. I argue that even if we grant that vagueness is always the result of semantic indeterminacy rather than ontological vagueness, and thus also grant that composition is unrestricted, it does not follow that objects persist by perduring. Unrestricted mereological composition lacks the power to ensure that there exist instantaneous objects that wholly overlap persisting objects at times, and thus lacks the power to ensure that there exists anything that could be called a temporal part. Even if we grant that such instantaneous objects exist, however, I argue that it does not follow that objects perdure. To show this, I briefly outline a coherent version of three-dimensionalism that grants just such an assumption. Thus considerations pertaining to the nature of vagueness need not lead us inevitably to accept perdurantism.
Abstract:
Background: The identification and characterization of genes that influence the risk of common, complex multifactorial disease primarily through interactions with other genes and environmental factors remains a statistical and computational challenge in genetic epidemiology. We have previously introduced a genetic programming optimized neural network (GPNN) as a method for optimizing the architecture of a neural network to improve the identification of gene combinations associated with disease risk. The goal of this study was to evaluate the power of GPNN for identifying high-order gene-gene interactions. We were also interested in applying GPNN to a real data analysis in Parkinson's disease. Results: We show that GPNN has high power to detect even relatively small genetic effects (2–3% heritability) in simulated data models involving two- and three-locus interactions. The limits of detection were reached under conditions with very small heritability (
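For orientation, here is a minimal sketch of the kind of power experiment described above: case-control data are simulated under an assumed XOR-style two-locus penetrance model, and a plain likelihood-ratio test on a logistic model stands in for the detector. This is not the authors' GPNN, and every parameter value below is an illustrative assumption.

```python
# Hypothetical power simulation for a two-locus (epistatic) effect.
# NOT the authors' GPNN: a likelihood-ratio test is the stand-in detector.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)

def simulate(n=400, maf=0.4, effect=0.6):
    """Case-control data under an assumed XOR-like two-locus model."""
    g1 = rng.binomial(2, maf, n)   # genotype at locus 1 (0/1/2 minor alleles)
    g2 = rng.binomial(2, maf, n)   # genotype at locus 2
    # risk is raised only when exactly one locus carries a minor allele
    logit = -0.5 + effect * ((g1 > 0) ^ (g2 > 0)).astype(float)
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
    return g1, g2, y

def detects(g1, g2, y, alpha=0.05):
    """LR test: main effects only vs. main effects plus interaction."""
    X0 = sm.add_constant(np.column_stack([g1, g2]))
    X1 = sm.add_constant(np.column_stack([g1, g2, g1 * g2]))
    ll0 = sm.Logit(y, X0).fit(disp=0).llf
    ll1 = sm.Logit(y, X1).fit(disp=0).llf
    return chi2.sf(2 * (ll1 - ll0), df=1) < alpha

hits = sum(detects(*simulate()) for _ in range(200))
print(f"estimated power: {hits / 200:.2f}")  # fraction of runs detecting the effect
```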
Abstract:
Presence-absence surveys are a commonly used method for monitoring broad-scale changes in wildlife distributions. However, the lack of power of these surveys for detecting population trends is problematic for their application in wildlife management. Options for improving power include increasing the sampling effort or arbitrarily relaxing the type I error rate. We present an alternative, whereby targeted sampling of particular habitats in the landscape using information from a habitat model increases power. The advantage of this approach is that it does not require a trade-off with either cost or the Pr(type I error) to achieve greater power. We use a demographic model of koala (Phascolarctos cinereus) population dynamics and simulations of the monitoring process to estimate the power to detect a trend in occupancy for a range of strategies, thereby demonstrating that targeting particular habitat qualities can improve power substantially. If the objective is to detect a decline in occupancy, the optimal strategy is to sample high-quality habitats. Alternatively, if the objective is to detect an increase in occupancy, the optimal strategy is to sample intermediate-quality habitats. The strategies with the highest power remained the same under a range of parameter assumptions, although observation error had a strong influence on the optimal strategy. Our approach specifically applies to monitoring for detecting long-term trends in occupancy or abundance. This is a common and important monitoring objective for wildlife managers, and we provide guidelines for more effectively achieving it.
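As a minimal sketch of the simulation idea, the snippet below assumes a simple declining-occupancy process with imperfect detection and uses a logistic trend test as the detector; the site numbers, decline rate, and detection probability are invented stand-ins, not the paper's koala demography.

```python
# Illustrative power calculation for detecting a decline in occupancy
# from presence-absence surveys. All parameter values are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

def one_survey(n_sites=100, n_years=10, psi0=0.6, decline=0.05, p_detect=0.8):
    """Yearly presence-absence records under a declining true occupancy."""
    years, obs = [], []
    for t in range(n_years):
        psi = psi0 * (1 - decline) ** t                        # true occupancy in year t
        present = rng.random(n_sites) < psi
        detected = present & (rng.random(n_sites) < p_detect)  # imperfect detection
        years.extend([t] * n_sites)
        obs.extend(detected.astype(int))
    return np.asarray(years, dtype=float), np.asarray(obs)

def trend_detected(years, obs, alpha=0.05):
    """Logistic regression of detections on year; one-sided test for decline."""
    fit = sm.Logit(obs, sm.add_constant(years)).fit(disp=0)
    return fit.params[1] < 0 and fit.pvalues[1] / 2 < alpha

power = np.mean([trend_detected(*one_survey()) for _ in range(200)])
print(f"estimated power to detect the decline: {power:.2f}")
```

Repeating the calculation while restricting the simulated sites to particular habitat-quality classes is the kind of comparison the paper uses to rank sampling strategies.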
Abstract:
There have been many models developed by scientists to assist decision-makers in making socio-economic and environmental decisions. It is now recognised that the dominant paradigm is shifting towards making decisions with stakeholders, rather than for stakeholders. Our paper investigates two case studies where group model building has been undertaken for maintaining biodiversity in Australia. The first case study focuses on the preservation and management of green spaces and biodiversity in metropolitan Melbourne under the umbrella of the Melbourne 2030 planning strategy. A geographical information system is used to collate a number of spatial datasets encompassing a range of cultural and natural asset data layers, including existing open spaces, waterways, threatened fauna and flora, ecological vegetation covers, registered cultural heritage sites, and existing land parcel zoning. Group model building is incorporated into the study by eliciting weightings and ratings of importance for each dataset from urban planners to formulate different urban green system scenarios. The second case study focuses on modelling ecoregions from spatial datasets for the state of Queensland. The modelling combines collaborative expert knowledge and a vast amount of environmental data to build biogeographical classifications of regions. An information elicitation process is used to capture expert knowledge of ecoregions as geographical descriptions, and to transform this into prior probability distributions that characterise regions in terms of environmental variables. This prior information is combined with measured data on the environmental variables within a Bayesian modelling technique to produce the final classified regions. We describe how linked views between descriptive information, mapping and statistical plots are used to decide upon representative regions that satisfy a number of criteria for biodiversity and conservation. This paper discusses the advantages and problems encountered when undertaking group model building. Future research will extend the group model building approach to include interested individuals and community groups.
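To make the prior-plus-data step concrete, here is a hypothetical sketch: each candidate region carries an expert-elicited prior probability and a Gaussian characterisation of one environmental variable, and Bayes' rule classifies a single grid cell. The region names and all numbers are invented for illustration.

```python
# Toy Bayesian classification of a grid cell into an ecoregion.
# Priors, means and spreads are invented placeholders, not Queensland data.
import numpy as np
from scipy.stats import norm

regions = ["wet_tropics", "dry_savanna", "coastal"]  # hypothetical names
prior   = np.array([0.2, 0.5, 0.3])                  # expert-elicited P(region)
mu      = np.array([1800.0, 500.0, 1100.0])          # mean annual rainfall (mm)
sigma   = np.array([250.0, 150.0, 200.0])            # spread per region

def classify(rainfall_mm):
    """Posterior over regions given one cell's rainfall measurement."""
    likelihood = norm.pdf(rainfall_mm, mu, sigma)    # P(data | region)
    posterior = prior * likelihood
    posterior /= posterior.sum()                     # normalise
    return regions[int(np.argmax(posterior))], posterior

label, posterior = classify(950.0)
print(label, np.round(posterior, 3))
```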
Abstract:
A plethora of process modeling techniques has been proposed over the years. One way of evaluating and comparing the scope and completeness of techniques is by way of representational analysis. The purpose of this paper is to examine how process modeling techniques have developed over the last four decades. The basis of the comparison is the Bunge-Wand-Weber representation model, a benchmark used for the analysis of grammars that purport to model the real world and the interactions within it. This paper presents a comparison of representational analyses of several popular process modeling techniques and has two main outcomes. First, it provides insights, within the boundaries of a representational analysis, into the extent to which process modeling techniques have developed over time. Second, the findings also indicate areas in which the underlying theory seems to be over-engineered or lacking in specialization.
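As a toy illustration of one measure such an analysis yields, the snippet below computes the share of benchmark constructs a grammar can represent and the constructs it misses (its construct deficit); the construct sets are abbreviated placeholders, not the full Bunge-Wand-Weber model.

```python
# Toy representational analysis: coverage of a benchmark construct set.
# Both construct sets are abbreviated placeholders for illustration only.
BWW = {"thing", "property", "state", "event", "transformation", "system", "history"}

techniques = {
    "early_flowchart_like": {"event", "transformation"},
    "richer_grammar": {"thing", "property", "state", "event", "transformation"},
}

for name, covered in techniques.items():
    completeness = len(covered & BWW) / len(BWW)
    deficit = sorted(BWW - covered)
    print(f"{name}: completeness={completeness:.0%}, missing={deficit}")
```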
Abstract:
In his mimetic theory of desire, René Girard presents Christ as the ideal model to be followed, since Jesus demonstrated how it is possible to resolve conflicts without linking them to vengeance or violence. Through his victimization on the Cross, Jesus reveals the whole truth of who we are and who God is; by manifesting his innocence, he turns back upon himself the accusation of those who remain within the circle of self-justification through the transfer of guilt. Thus Christ decides, of his own free will, to forgive. This is a new form of forgiveness, which we call the novum. Grounded in love, it comes from outside and breaks the circle of violence. The novum reveals a new way of relating to the people who have harmed us, of trying to understand who we are through the Other. This new mimesis values life, freedom, care for one's neighbour and reconciliation more than offerings and sacrifices. It is a dialectical overcoming: although in this process the decision to forgive belongs to the subject (a one-way street), the decision to reconcile also depends on the offended/offender (a two-way street). Through the novum it is possible to change the meaning of the past, to undo fatality, and to stop living as a hostage to guilt. This attitude enables the subject to look to the future with hope. By focusing on the Passion and the Resurrection, the subject discovers who he really is and may decide to follow the Christocentric model. This decision leads him out of violent mimesis and toward shaping the will, so as then to decide to forgive the one who offended him. The subject finally recognizes novum forgiveness as a model which, when imitated and given, is able to remake the person of its giver as well as the one who is forgiven.
Abstract:
The availability of 'omics' technologies is transforming scientific approaches to physiological problems from a reductionist to a holistic viewpoint. This is of profound importance in nutrition, since the integration of multiple systems, from gene expression on the synthetic side through to metabolic enzyme activity on the degradative side, combines to govern nutrient availability to tissues. Protein activity is central to the process of nutrition, from the initial absorption of nutrients via uptake carriers in the gut, through distribution and transport in the blood, to metabolism by degradative enzymes in tissues and excretion through renal tubule exchange proteins. Therefore, global profiling of the proteome, defined as the entire protein complement of the genome expressed in a particular cell or organ, or in plasma or serum, at a particular time, offers the potential for identifying important biomarkers of nutritional state that respond to alterations in diet. The present review considers the published evidence of nutritional modulation of the proteome in vivo, which has expanded exponentially over the last three years. It highlights some of the challenges faced by researchers using proteomic approaches to understand the interactions of diet with genomic and metabolic-phenotypic variables in normal populations.
Abstract:
Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecule physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help domain experts (screening scientists, chemists, biologists, etc.) understand the data and draw meaningful conclusions. We also benchmark these principled methods against better-known visualization approaches, principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets encountered during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks do. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool lets the domain experts explore the projections produced by the visualization algorithms, and provides facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software.
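For reference, here is a sketch of the simplest benchmark mentioned above: a PCA projection of a high-dimensional descriptor matrix down to two dimensions for plotting. GTM and HGTM are not part of scikit-learn, so only the PCA comparator is shown, on synthetic stand-in data.

```python
# Baseline 2-D projection of compound descriptors with PCA (synthetic data).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 40))  # 500 "compounds" x 40 descriptors (synthetic)
X[:100] += 2.0                  # shift one block to mimic an active cluster

coords = PCA(n_components=2).fit_transform(X)
print(coords.shape)             # (500, 2): ready for a scatter plot
```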
Abstract:
We combine the replica approach from statistical physics with a variational approach to analyze learning curves analytically. We apply the method to Gaussian process regression. As a main result, we derive approximate relations between empirical error measures, the generalization error, and the posterior variance.
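For orientation, these are the standard Gaussian process regression quantities the abstract relates; the replica/variational derivation itself is not reproduced here.

```latex
% Posterior mean and variance of GP regression at a test input x_*,
% given n training inputs with kernel matrix K, kernel vector k_*,
% targets y, and noise variance \sigma^2:
\[
  \bar{f}(x_*) = \mathbf{k}_*^{\top} \left(K + \sigma^2 I\right)^{-1} \mathbf{y},
  \qquad
  \sigma_*^2(x_*) = k(x_*, x_*) - \mathbf{k}_*^{\top} \left(K + \sigma^2 I\right)^{-1} \mathbf{k}_* .
\]
% The learning curve is the generalization error averaged over test inputs
% and over training sets D_n of size n:
\[
  \varepsilon(n) = \mathbb{E}_{D_n}\,\mathbb{E}_{x_*}
  \big[ \left( f(x_*) - \bar{f}(x_*) \right)^{2} \big].
\]
```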
Abstract:
With accelerating industrialization, an ever-increasing population and mass deforestation, 'sustainability' as a concept has only recently been popularized in Bangladesh. This paper sheds light on the sustainable development process in Bangladesh. It points out the major challenges to this process and identifies the motivating factors for a sustainable society in Bangladesh. The paper concludes with some strategies that are considered essential for the development of a sustainable society in Bangladesh, e.g., a strong and effective regulatory framework, emphasis on rural entrepreneurship, development of indigenous technology, and an integrated environmental management system.
Abstract:
University-to-business technology transfer poses specific challenges beyond those encountered in industry more widely. This paper examines the issues in university-to-business technology transfer in the UK and USA and presents the results of a survey of UK and US university technology transfer officers. Findings indicate significant differences in the motivations of universities in each country to transfer technology, in the consistency of university technology transfer policies, and in the accessibility of university technologies to business. The study also looks at perceived barriers to university-to-business technology transfer and offers suggestions for possible improvements to the process.
Abstract:
Applying systems thinking to designing, managing, and improving business processes has yielded a new "holonic-based" process modeling methodology. The theoretical background and the methodology are described using examples taken from a large organization that designs and manufactures capital goods equipment and operates within a complex and dynamic environment. A key point of differentiation of this methodology is that it allows a set of models to be produced without taking a task-breakdown approach; instead it uses systems thinking and a construct known as the "holon" to build process descriptions as a system of systems (i.e., a holarchy). The process-oriented holonic modeling methodology has been used for total quality management and business process engineering exercises in different industrial sectors, and it builds models that connect the strategic vision of a company to its operational processes. Exercises have been conducted in response to environmental pressures to make operations align with strategic thinking as well as to become increasingly agile and efficient. This methodology is best applied in environments of high complexity, low volume, and high variety, where repeated learning opportunities are few and far between (e.g., large development projects).
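As a toy sketch of the holarchy idea, the structure below treats each holon as simultaneously a whole (it owns sub-holons) and a part (it sits inside a parent), so a process description becomes a recursive system of systems rather than a task breakdown. All names are invented.

```python
# Toy holarchy: a recursive "system of systems" of process holons.
# The process names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Holon:
    name: str
    parts: list["Holon"] = field(default_factory=list)

    def describe(self, depth=0):
        """Print the holarchy as an indented tree."""
        print("  " * depth + self.name)
        for part in self.parts:
            part.describe(depth + 1)

enterprise = Holon("deliver_capital_equipment", [
    Holon("win_order", [Holon("bid"), Holon("negotiate")]),
    Holon("fulfil_order", [Holon("design"), Holon("manufacture"), Holon("commission")]),
])
enterprise.describe()
```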
Abstract:
Purpose: The purpose of the paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation, and helps evaluate the performance of healthcare services dynamically. Design/methodology/approach: The paper uses logical framework analysis (LFA), a matrix approach to project planning, for managing quality. This has been applied to three acute healthcare services (operating room utilization, accident and emergency, and intensive care) in order to demonstrate its effectiveness. Findings: The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications: The paper shows LFA applied to three service processes in one hospital; ideally the approach should be tested in several hospitals and on other services as well. Practical implications: The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value: The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed to identify quality issues in healthcare delivery, and corrective measures are taken for superior performance, there has been no integrated approach that can identify and analyse issues, provide solutions to resolve them, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions and improve process performance. This study introduces an integrated and uniform quality management tool that links operations to organizational strategy.
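For readers unfamiliar with LFA, here is a skeleton of the standard logframe matrix, four planning levels crossed with four columns; the healthcare cell entries are hypothetical placeholders, not taken from the paper.

```python
# Skeleton of a logical framework (logframe) matrix as used in LFA.
# Cell contents are hypothetical placeholders.
levels  = ["goal", "purpose", "outputs", "activities"]
columns = ["narrative_summary", "indicators", "means_of_verification", "assumptions"]

logframe = {level: {col: "" for col in columns} for level in levels}
logframe["purpose"]["narrative_summary"] = "improve operating room utilization"
logframe["purpose"]["indicators"] = "utilized hours / available hours per month"

for level in levels:
    print(level, logframe[level])
```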