5 results for Bayesian-inference
at Brock University, Canada
Abstract:
Many arthropods exhibit behaviours precursory to social life, including adult longevity, parental care, nest loyalty and mutual tolerance, yet there are few examples of social behaviour in this phylum. The small carpenter bees, genus Ceratina, provide important insights into the early stages of sociality. I described the biology and social behaviour of five facultatively social species which exhibit all of the preadaptations for successful group living, yet present ecological and behavioural characteristics that seemingly disfavour frequent colony formation. These species are socially polymorphic, with both solitary and social nests collected in sympatry. Social colonies consist of two adult females, one contributing both foraging and reproductive effort and a second that remains at the nest as a passive guard. Cooperative nesting provides no overt reproductive benefits over solitary nesting, although brood survival tends to be greater in social colonies. Three main theories explain cooperation among conspecifics: mutual benefit, kin selection and manipulation. Lifetime reproductive success calculations revealed that mutual benefit does not explain social behaviour in this group, as social colonies have lower per capita lifetime reproductive success than solitary nests. Genetic pedigrees constructed from allozyme data indicate that kin selection might contribute to the maintenance of social nesting, as social colonies consist of full sisters and thus some indirect fitness benefits are inherently bestowed on subordinate females as a result of remaining to help their dominant sister. These data suggest that the origin of sociality in ceratinines has principal costs and that the great ecological success of highly eusocial lineages occurred well after social origins. Ecological constraints such as resource limitation, unfavourable weather conditions and parasite pressure have long been considered some of the most important selective pressures for the evolution of sociality. I assessed the fitness consequences of these three ecological factors for the reproductive success of solitary and social colonies and found that nest sites were not limiting, and the frequency of social nesting was consistent across brood-rearing seasons. Local weather varied between seasons but was not correlated with reproductive success. Severe parasitism resulted in low reproductive success and total nest failure in solitary nests. Social colonies had higher reproductive success and were never extirpated by parasites. I suggest that social nesting represents a form of bet-hedging. The high frequency of solitary nests suggests that this is the optimal strategy when parasite pressure is low. However, social colonies have a selective advantage over solitary nesting females during periods of extreme parasite pressure. Finally, the small carpenter bees are recorded from all continents except Antarctica. I constructed the first molecular phylogeny of ceratinine bees based on four gene regions of selected species covering representatives from all continents and ecological regions. Maximum parsimony and Bayesian inference tree topologies and fossil dating support an African origin followed by an Old World invasion and New World radiation. All known Old World ceratinines form social colonies while New World species are largely solitary; thus geography and phylogenetic inertia are likely predictors of social evolution in this genus.
This integrative approach not only describes the behaviour of several previously unknown or little-known Ceratina species, but also highlights the fact that this is an important, though previously unrecognized, model for studying evolutionary transitions from solitary to social behaviour.
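The bet-hedging claim in the abstract above lends itself to a small worked example: a solitary strategy can have the higher average success when parasite outbreaks are rare, yet lose out over many seasons because its occasional total failures drag down long-run (geometric-mean) fitness. The sketch below is a minimal illustration with made-up brood counts and outbreak probabilities; none of the values come from the thesis.

```python
# Illustrative bet-hedging comparison (hypothetical numbers, not thesis data).
# A strategy can have lower arithmetic-mean success per season yet still be
# favoured over many seasons if it avoids catastrophic failures, because
# long-run fitness tracks the geometric mean of per-season success.
import random
from statistics import geometric_mean

def season_success(strategy, parasitized):
    """Assumed per-capita brood produced in one season."""
    if strategy == "solitary":
        return 0.1 if parasitized else 3.0   # near-total failure under attack
    # "social": the guard limits parasite losses, but success is shared
    return 2.0 if parasitized else 2.2

def long_run_fitness(strategy, p_parasite, seasons=1000, seed=1):
    rng = random.Random(seed)
    outcomes = [season_success(strategy, rng.random() < p_parasite)
                for _ in range(seasons)]
    return geometric_mean(outcomes)

for p in (0.05, 0.20, 0.50):
    solo, social = (long_run_fitness(s, p) for s in ("solitary", "social"))
    print(f"parasite pressure {p:.2f}: solitary {solo:.2f}, social {social:.2f}")
```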
Abstract:
This thesis explores the debate and issues regarding the status of visual inferences in the optical writings of Rene Descartes, George Berkeley and James J. Gibson. It gathers arguments from across their works and synthesizes an account of visual depth perception that accurately reflects the larger, metaphysical implications of their philosophical theories. Chapters 1 and 2 address the Cartesian and Berkeleyan theories of depth perception, respectively. For Descartes and Berkeley the debate can be put in the following way: How is it possible that we experience objects as appearing outside of us, at various distances, if objects appear inside of us, in the representations of the individual's mind? Thus, the Descartes-Berkeley component of the debate takes place exclusively within a representationalist setting. Representational theories of depth perception are rooted in the scientific discovery that objects project a merely two-dimensional patchwork of forms on the retina. I call this the "flat image" problem. This poses the problem of depth in terms of a difference between two- and three-dimensional orders (i.e., a gap to be bridged by one inferential procedure or another). Chapter 3 addresses Gibson's ecological response to the debate. Gibson argues that the perceiver cannot be flattened out into a passive, two-dimensional sensory surface. Perception is possible precisely because the body and the environment already have depth. Accordingly, the problem cannot be reduced to a gap between two- and three-dimensional givens, a gap crossed with a projective geometry. The crucial difference is not one of a dimensional degree. Chapter 3 explores this theme and attempts to excavate the empirical and philosophical suppositions that lead Descartes and Berkeley to their respective theories of indirect perception. Gibson argues that the notion of visual inference, which is necessary to substantiate representational theories of indirect perception, is highly problematic. To elucidate this point, the thesis steps into the representationalist tradition, in order to show that problems that arise within it demand a turn toward Gibson's information-based doctrine of ecological specificity (which is to say, the theory of direct perception). Chapter 3 concludes with a careful examination of Gibsonian affordances as the sole objects of direct perceptual experience. The final section provides an account of affordances that locates the moving, perceiving body at the heart of the experience of depth; an experience which emerges in the dynamical structures that cross the body and the world.
Abstract:
The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs to firms, investors and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of a cut-off point and sampling procedures were found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower costs of misclassification in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model, when the ratio of the cost of Type I errors to the cost of Type II errors is high, is relatively consistent across all sampling methods. Such an advantage of the Bayesian model may make it more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also allays the concerns of a range of user groups, including auditors, shareholders, employees, suppliers, rating agencies, and creditors, with respect to assessing failure risk.
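To make the cut-off point discussion concrete, here is a minimal sketch of how an empirical cut-off can be chosen from a training sample when Type I errors (missed bankruptcies) cost more than Type II errors (false alarms). The score distributions, cost ratio and firm counts are hypothetical and stand in for whichever model (Bayesian, Hazard or Mixed Logit) produced the probabilities; this is not the study's actual procedure or data.

```python
# Illustrative only: how the cut-off point and asymmetric Type I / Type II
# costs determine expected misclassification cost for any probability-of-
# bankruptcy score. All numbers and distributions here are made up.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predicted bankruptcy probabilities from a training sample.
p_bankrupt = rng.beta(5, 2, size=200)     # firms that later failed
p_healthy  = rng.beta(2, 5, size=2000)    # firms that survived
prior_bankrupt = len(p_bankrupt) / (len(p_bankrupt) + len(p_healthy))

C_TYPE1 = 10.0   # cost of classifying a failing firm as healthy (Type I)
C_TYPE2 = 1.0    # cost of classifying a healthy firm as failing (Type II)

def expected_cost(cutoff):
    type1_rate = np.mean(p_bankrupt < cutoff)    # missed bankruptcies
    type2_rate = np.mean(p_healthy >= cutoff)    # false alarms
    return (prior_bankrupt * C_TYPE1 * type1_rate
            + (1 - prior_bankrupt) * C_TYPE2 * type2_rate)

# "Empirical" cut-off: the grid value minimising cost on the training sample.
grid = np.linspace(0.01, 0.99, 99)
best = min(grid, key=expected_cost)
print(f"naive 0.5 cut-off:      cost {expected_cost(0.5):.4f}")
print(f"empirical cut-off {best:.2f}: cost {expected_cost(best):.4f}")
```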
Abstract:
Complex networks can arise naturally and spontaneously from all things that act as a part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous, but are currently poorly understood. A number of algorithms, designed by humans, have been proposed to describe the organizational behaviour of real-world networks, and breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences have recently resulted. These algorithms, called graph models, represent significant human effort. Deriving accurate graph models is non-trivial, time-intensive, challenging and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure. Furthermore, to the best of the author's knowledge, it is the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than models currently used in the literature.
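The flavour of the approach can be sketched in a few lines, though this is only a toy stand-in and not the operators, graph representation or fitness measures used in the thesis: candidate programs (here, tiny expression trees that turn a node's degree into an attachment weight) grow networks, are scored against a summary property of a target network, and are varied over generations.

```python
# Toy illustration of GP-based graph-model inference (not the thesis's system):
# each individual is a small expression tree mapping a node's current degree
# to an attachment weight; a graph grown with that rule is compared against a
# target network's degree sequence to obtain its fitness.
import random

rng = random.Random(42)
FUNCS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
TERMS = ["deg", 1.0, 2.0]

def random_tree(depth=2):
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMS)
    return (rng.choice(list(FUNCS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, deg):
    if tree == "deg":
        return float(deg)
    if isinstance(tree, float):
        return tree
    name, left, right = tree
    return FUNCS[name](evaluate(left, deg), evaluate(right, deg))

def grow_degrees(tree, n=200, m=2):
    degrees = [m] * (m + 1)                               # small seed clique
    for _ in range(m + 1, n):
        weights = [max(evaluate(tree, d), 1e-6) for d in degrees]
        targets = set(rng.choices(range(len(degrees)), weights=weights, k=m))
        for t in targets:
            degrees[t] += 1
        degrees.append(len(targets))
    return degrees

def fitness(tree, target):
    degs = sorted(grow_degrees(tree), reverse=True)
    return sum(abs(a - b) for a, b in zip(degs, sorted(target, reverse=True)))

def mutate(tree):
    # Point mutation: replace one child of the root with a fresh random subtree.
    if not isinstance(tree, tuple):
        return random_tree(1)
    name, left, right = tree
    return (name, random_tree(1), right) if rng.random() < 0.5 else (name, left, random_tree(1))

# Target network: grown by linear preferential attachment ("deg" itself).
target = grow_degrees("deg")
population = [random_tree() for _ in range(30)]
for _ in range(20):
    population.sort(key=lambda t: fitness(t, target))
    parents = population[:10]
    population = parents + [mutate(rng.choice(parents)) for _ in range(20)]
population.sort(key=lambda t: fitness(t, target))
print("best evolved attachment rule:", population[0])
```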
Object-Oriented Genetic Programming for the Automatic Inference of Graph Models for Complex Networks
Abstract:
Complex networks are systems of entities that are interconnected through meaningful relationships. The result of the relations between entities forms a structure that has a statistical complexity that is not formed by random chance. In the study of complex networks, many graph models have been proposed to model the behaviours observed. However, constructing graph models manually is tedious and problematic, and many of the models proposed in the literature have been cited as having inaccuracies with respect to the complex networks they represent. Recently, an approach that automates the inference of graph models was proposed by Bailey [10]. The proposed methodology employs genetic programming (GP) to produce graph models that approximate various properties of an exemplary graph of a targeted complex network. However, a great deal is already known about complex networks in general, and often specific knowledge is held about the network being modelled. This knowledge, albeit incomplete, is important in constructing a graph model, yet it is difficult to incorporate using existing GP techniques. Thus, this thesis proposes a novel GP system which can incorporate incomplete expert knowledge to assist in the evolution of a graph model. Inspired by existing graph models, an abstract graph model was developed to serve as an embryo for inferring graph models of some complex networks. The GP system and abstract model were used to reproduce well-known graph models. The results indicated that the system was able to evolve models that produced networks with structural similarities to the networks generated by the respective target models.
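One way to picture the "embryo" idea of fixing expert knowledge while evolving only the open decisions: a hand-written growth skeleton exposes plug-in hooks that a GP system would fill in. The class below is a hypothetical illustration of that separation, not the abstract graph model defined in the thesis.

```python
# Hypothetical sketch of an "embryo" graph model: the growth skeleton encodes
# fixed expert knowledge, while the plug-in decision functions are the parts
# a GP system would evolve. Illustrative only.
import random
from typing import Callable, List, Tuple

class EmbryoGraphModel:
    def __init__(self,
                 attachment_weight: Callable[[int], float],   # evolved by GP
                 edges_per_node: Callable[[int], int],        # evolved by GP
                 seed: int = 0):
        self.attachment_weight = attachment_weight
        self.edges_per_node = edges_per_node
        self.rng = random.Random(seed)

    def generate(self, n: int) -> List[Tuple[int, int]]:
        """Fixed (expert-written) growth loop; only the hooks vary."""
        edges, degrees = [(0, 1)], [1, 1]                     # expert-chosen seed
        for new in range(2, n):
            k = max(1, min(self.edges_per_node(new), len(degrees)))
            weights = [max(self.attachment_weight(d), 1e-6) for d in degrees]
            targets = set(self.rng.choices(range(len(degrees)),
                                           weights=weights, k=k))
            for t in targets:
                edges.append((new, t))
                degrees[t] += 1
            degrees.append(len(targets))
        return edges

# Example plug-ins: linear preferential attachment, two edges per new node.
model = EmbryoGraphModel(attachment_weight=lambda d: float(d),
                         edges_per_node=lambda _new: 2)
print(len(model.generate(100)), "edges generated")
```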