879 results for estimating conditional probabilities


Relevance: 20.00%

Abstract:

To understand the evolution of well-organized social behaviour, we must first understand the mechanism by which collective behaviour is established. In this study, the mechanisms of collective behaviour in a colony of social insects were studied in terms of the transition probability between active and inactive states, which is linked to mutual interactions. The active and inactive states of the social insects were statistically extracted from the velocity profiles. From the duration distributions of the two states, we found that (1) the durations of active and inactive states follow an exponential law, and (2) pair interactions increase the transition probability from the inactive to the active state. The regulation of the transition probability by paired interactions suggests that such interactions control the populations of active and inactive workers in the colony.
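Under the exponential law described in this abstract, the maximum-likelihood estimate of a transition rate is simply the reciprocal of the mean state duration. A minimal sketch with simulated durations (the rates, sample sizes, and time units are invented for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated state durations (seconds); the true mean durations (8 s
# inactive, 3 s active) are invented for illustration.
inactive_durations = rng.exponential(scale=8.0, size=500)
active_durations = rng.exponential(scale=3.0, size=500)

# For exponentially distributed durations, the MLE of the transition
# rate out of a state is 1 / (mean duration in that state).
rate_inactive_to_active = 1.0 / inactive_durations.mean()
rate_active_to_inactive = 1.0 / active_durations.mean()

print(f"inactive -> active: {rate_inactive_to_active:.3f} per s")
print(f"active -> inactive: {rate_active_to_inactive:.3f} per s")
```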

Relevance: 20.00%

Abstract:

Economic theory makes no predictions about social factors affecting decisions under risk. We examine situations in which a decision maker decides for herself and another person under conditions of payoff equality, and compare them to individual decisions. By estimating a structural model, we find that responsibility leaves utility curvature unaffected, but accentuates the subjective distortion of very small and very large probabilities for both gains and losses. We also find that responsibility reduces loss aversion, but that these results only obtain under some specific definitions of the latter. These results serve to generalize and reconcile some of the still largely contradictory findings in the literature. They also have implications for financial agency, which we discuss.
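The subjective distortion of very small and very large probabilities is commonly modeled with a probability-weighting function. A hedged sketch using the one-parameter Prelec form, a standard choice in this literature; the abstract does not specify which functional form the authors estimate, and alpha = 0.65 is an arbitrary illustrative value:

```python
import math

def prelec_weight(p: float, alpha: float = 0.65) -> float:
    """One-parameter Prelec weighting function w(p) = exp(-(-ln p)^alpha).
    With alpha < 1, small probabilities are overweighted and large ones
    underweighted, mirroring the distortion pattern described above."""
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    return math.exp(-((-math.log(p)) ** alpha))

print(prelec_weight(0.01))  # noticeably larger than 0.01
print(prelec_weight(0.99))  # noticeably smaller than 0.99
```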

Relevance: 20.00%

Abstract:

Research in Bid Tender Forecasting Models (BTFM) has been in progress since the 1950s. None of the models developed were easy-to-use tools for bidding practitioners, because of the advanced mathematical apparatus and massive data inputs they required. This scenario began to change in 2012 with the development of the Smartbid BTFM, a comparatively simple model that presents a series of graphs enabling any project manager to study competitors using a relatively short historical tender dataset. However, despite the advantages of this new model, it is still necessary to study all the auction participants as an indivisible group; that is, the original BTFM was not devised for analyzing the behavior of a single bidding competitor or a subgroup of them. The present paper addresses that limitation and presents a stand-alone methodology for estimating future competitors' bidding behaviors separately.

Relevance: 20.00%

Abstract:

Human observers exhibit large systematic distance-dependent biases when estimating the three-dimensional (3D) shape of objects defined by binocular image disparities. This has led some to question the utility of disparity as a cue to 3D shape and whether accurate estimation of 3D shape is at all possible. Others have argued that accurate perception is possible, but only with large continuous perspective transformations of an object. Using a stimulus that is known to elicit large distance-dependent perceptual bias (random dot stereograms of elliptical cylinders), we show that, contrary to these findings, the simple adoption of a more naturalistic viewing angle completely eliminates this bias. Using behavioural psychophysics, coupled with a novel surface-based reverse-correlation methodology, we show that it is binocular edge and contour information that allows for accurate and precise perception, and that observers actively exploit and sample this information when it is available.

Relevance: 20.00%

Abstract:

Background: Accurate dietary assessment is key to understanding nutrition-related outcomes and is essential for estimating dietary change in nutrition-based interventions. Objective: The objective of this study was to assess the pan-European reproducibility of the Food4Me food-frequency questionnaire (FFQ) in assessing the habitual diet of adults. Methods: Participants from the Food4Me study, a 6-mo, Internet-based, randomized controlled trial of personalized nutrition conducted in the United Kingdom, Ireland, Spain, Netherlands, Germany, Greece, and Poland, were included. Screening and baseline data (both collected before commencement of the intervention) were used in the present analyses, and participants were included only if they completed FFQs at screening and at baseline within a 1-mo timeframe before the commencement of the intervention. Sociodemographic (e.g., sex and country) and lifestyle [e.g., body mass index (BMI, in kg/m2) and physical activity] characteristics were collected. Linear regression, correlation coefficients, concordance (percentage) in quartile classification, and Bland-Altman plots for daily intakes were used to assess reproducibility. Results: In total, 567 participants (59% female), with a mean ± SD age of 38.7 ± 13.4 y and BMI of 25.4 ± 4.8, completed both FFQs within 1 mo (mean ± SD: 19.2 ± 6.2 d). Exact plus adjacent classification of total energy intake in participants was highest in Ireland (94%) and lowest in Poland (81%). Spearman correlation coefficients (r) in total energy intake between FFQs ranged from 0.50 for obese participants to 0.68 and 0.60 in normal-weight and overweight participants, respectively. Bland-Altman plots showed a mean difference between FFQs of −10 kcal/d, with the agreement deteriorating as energy intakes increased. There was little variation in reproducibility of total energy intakes between sex and age groups.
Conclusions: The online Food4Me FFQ was shown to be reproducible across 7 European countries when administered within a 1-mo period to a large number of participants. The results support the utility of the online Food4Me FFQ as a reproducible tool across multiple European populations. This trial was registered at clinicaltrials.gov as NCT01530139.
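The Bland-Altman analysis mentioned in the Methods reduces to the mean of the paired differences (the bias) plus limits of agreement at ±1.96 SD of the differences. A minimal sketch on simulated paired intakes (all numbers are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated paired energy intakes (kcal/d) from two FFQ administrations;
# the -10 kcal/d shift and the noise level are invented.
ffq1 = rng.normal(2000.0, 400.0, size=200)
ffq2 = ffq1 + rng.normal(-10.0, 150.0, size=200)

diff = ffq2 - ffq1
bias = diff.mean()                     # mean difference between administrations
half_width = 1.96 * diff.std(ddof=1)   # half-width of 95% limits of agreement

print(f"bias: {bias:.1f} kcal/d")
print(f"95% limits of agreement: [{bias - half_width:.1f}, {bias + half_width:.1f}]")
```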

Relevance: 20.00%

Abstract:

This study determined the sensory shelf life of a commercial brand of chocolate and carrot cupcakes, aiming to extend the current 120-day shelf life to 180 days. Appearance, texture, flavor and overall quality of cakes stored for six different storage times were evaluated by 102 consumers. The data were analyzed by analysis of variance and linear regression. For both flavors, texture showed the greatest loss in acceptance during the storage period, with a mean acceptance close to indifference on the hedonic scale at 120 days. Nevertheless, appearance, flavor and overall quality remained acceptable up to 150 days. The end of shelf life was estimated at about 161 days for chocolate cakes and 150 days for carrot cakes. This study showed that the current 120 days of shelf life can be extended to 150 days for carrot cake and to 160 days for chocolate cake. However, the 180 days of shelf life desired by the company were not achieved.

PRACTICAL APPLICATIONS

This research shows the adequacy of using sensory acceptance tests to determine the shelf life of two food products (chocolate and carrot cupcakes). This practical application is useful because the precise determination of the shelf life of a food product is of vital importance for its commercial success. The maximum storage time should always be evaluated in the development or reformulation of new products, and after changes in packaging or storage conditions. Once the physical-chemical and microbiological stability of a product is guaranteed, sensory changes that could affect consumer acceptance will determine the end of the shelf life of a food product. Thus, the use of sensitive and reliable methods to estimate the sensory shelf life of a product is very important. The findings show the importance of determining the shelf life of each product separately and of avoiding the use of the shelf life estimated for a specific product on other, similar products.
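Shelf-life estimation of this kind typically fits a linear model of mean acceptance against storage time and solves for the time at which the fitted score reaches an acceptability limit. A hedged sketch with invented hedonic scores and an assumed cut-off of 5.5 (the abstract does not report the exact scores or limit used):

```python
import numpy as np

# Invented mean hedonic scores (9-point scale) at six storage times (days).
days = np.array([0.0, 30.0, 60.0, 90.0, 120.0, 150.0])
scores = np.array([7.8, 7.4, 7.0, 6.6, 6.1, 5.7])

# Fit score = a + b*t by least squares, then solve for the storage time
# at which the fitted score falls to an assumed acceptability limit.
b, a = np.polyfit(days, scores, 1)
limit = 5.5  # assumed cut-off near "indifference" on the hedonic scale
shelf_life_days = (limit - a) / b

print(f"estimated sensory shelf life: {shelf_life_days:.0f} days")
```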

Relevance: 20.00%

Abstract:

In arthropods, most cases of morphological dimorphism within males are the result of a conditional evolutionarily stable strategy (ESS) with status-dependent tactics. In conditionally male-dimorphic species, the status distributions of male morphs often overlap, and the environmentally cued threshold model (ET) states that the degree of overlap depends on the genetic variation in the distribution of the switchpoints that determine which morph is expressed in each value of status. Here we describe male dimorphism and alternative mating behaviors in the harvestman Serracutisoma proximum. Majors express elongated second legs and use them in territorial fights; minors possess short second legs and do not fight, but rather sneak into majors' territories and copulate with egg-guarding females. The static allometry of second legs reveals that major phenotype expression depends on body size (status), and that the switchpoint underlying the dimorphism presents a large amount of genetic variation in the population, which probably results from weak selective pressure on this trait. With a mark-recapture study, we show that major phenotype expression does not result in survival costs, which is consistent with our hypothesis that there is weak selection on the switchpoint. Finally, we demonstrate that switchpoint is independent of status distribution. In conclusion, our data support the ET model prediction that the genetic correlation between status and switchpoint is low, allowing the status distribution to evolve or to fluctuate seasonally, without any effect on the position of the mean switchpoint.

Relevance: 20.00%

Abstract:

Information to guide decision making is especially urgent in human-dominated landscapes in the tropics, where urban and agricultural frontiers are still expanding in an unplanned manner. Nevertheless, most studies that have investigated the influence of landscape structure on species distribution have not considered the heterogeneity of altered habitats of the matrix, which is usually high in human-dominated landscapes. Using the distribution of small mammals in forest remnants and in the four main altered habitats in an Atlantic forest landscape, we investigated (1) how the explanatory power of models describing species distribution in forest remnants varies between landscape structure variables that do or do not incorporate matrix quality, and (2) the importance of spatial scale for analyzing the influence of landscape structure. We used standardized sampling in remnants and altered habitats to generate two indices of habitat quality, corresponding to the abundance and to the occurrence of small mammals. For each remnant, we calculated habitat quantity and connectivity at different spatial scales, considering or not the quality of surrounding habitats. The incorporation of matrix quality increased model explanatory power across all spatial scales for half the species that occurred in the matrix, but only when taking into account the distance between habitat patches (connectivity). These connectivity models were also less affected by spatial scale than habitat quantity models. The few consistent responses to the variation in spatial scales indicate that, despite their small size, small mammals perceive landscape features at large spatial scales. The matrix quality index corresponding to species occurrence performed better than or similarly to that of species abundance.
Results indicate the importance of the matrix for the dynamics of fragmented landscapes and suggest that relatively simple indices can improve our understanding of species distribution, and could be applied in modeling, monitoring and managing complex tropical landscapes.
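One standard way to fold matrix quality into a connectivity index, in the spirit of the comparison above, is an incidence-function form in which each surrounding patch contributes its area discounted by distance and, optionally, by a habitat-quality weight. The functional form, the parameter alpha, and every number below are illustrative assumptions, not the indices used in the study:

```python
import math

# Incidence-function-style connectivity for one focal patch:
#   S = sum_j exp(-alpha * d_j) * A_j * q_j
# with an optional quality weight q_j for the surrounding habitat.
alpha = 0.01  # 1/m; inverse of an assumed mean dispersal distance

# (distance d_j in m, area A_j in ha, quality weight q_j in [0, 1])
neighbours = [(100.0, 5.0, 1.0), (300.0, 20.0, 0.4), (800.0, 50.0, 0.8)]

def connectivity(patches, use_quality=True):
    return sum(math.exp(-alpha * d) * area * (q if use_quality else 1.0)
               for d, area, q in patches)

print(connectivity(neighbours, use_quality=False))  # structure only
print(connectivity(neighbours, use_quality=True))   # with matrix quality
```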

Relevance: 20.00%

Abstract:

Most techniques used for estimating the age of Sotalia guianensis (van Bénéden, 1864) (Cetacea; Delphinidae) are very expensive and require sophisticated equipment for preparing histological sections of teeth. The objective of this study was to test a more affordable and much simpler method, involving manual wear of teeth followed by decalcification and observation under a stereomicroscope. This technique has been employed successfully with larger species of Odontoceti. Twenty-six specimens were selected, and one tooth of each specimen was worn and demineralized for growth-layer reading. Growth layers were evidenced in all specimens; however, in 4 of the 26 teeth, not all the layers could be clearly observed. In these teeth, there was a significant decrease in growth-layer-group thickness, thus hindering the layer count. The juxtaposition of layers hindered the reading of larger numbers of layers by the wear-and-decalcification technique. Analysis of more than 17 layers in a single tooth proved inconclusive. The method applied here proved to be efficient in estimating the age of Sotalia guianensis individuals younger than 18 years. This method could simplify the study of the age structure of the overall population, and allows the use of the more expensive methodologies to be confined to more specific studies of older specimens. It also enables classification into calf, young, and adult classes, which is important for general population studies.

Relevance: 20.00%

Abstract:

In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function lambda(t), t >= 0. This rate function also depends on some parameters that need to be estimated. Two forms of lambda(t), t >= 0 are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is performed using a Bayesian formulation based on the Gibbs sampling algorithm. The assignment of prior distributions for the parameters is made in two stages. In the first stage, non-informative prior distributions are considered. Using the information provided by the first stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered. In some cases the best fit is the Weibull form, and in others the best option is the exponentiated-Weibull form. Copyright (C) 2007 John Wiley & Sons, Ltd.
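For the Weibull-form rate, the expected number of exceedances in [0, T] has a closed form: integrating lambda(t) = (beta/sigma)(t/sigma)^(beta-1) gives (T/sigma)^beta. A short sketch with illustrative parameters (not fitted to the Mexico City data):

```python
def weibull_rate(t: float, shape: float, scale: float) -> float:
    """Weibull-form rate lambda(t) of the non-homogeneous Poisson process."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def expected_exceedances(T: float, shape: float, scale: float) -> float:
    """Integral of lambda(t) over [0, T]: the expected number of
    exceedances in that window, which equals (T / scale) ** shape."""
    return (T / scale) ** shape

# Illustrative parameters only (not fitted to the monitoring data):
shape, scale = 1.3, 50.0
print(expected_exceedances(365.0, shape, scale))
```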

Relevance: 20.00%

Abstract:

Estimating the sizes of hard-to-count populations is a challenging and important problem that occurs frequently in social science, public health, and public policy. This problem is particularly pressing in HIV/AIDS research because estimates of the sizes of the most at-risk populations (illicit drug users, men who have sex with men, and sex workers) are needed for designing, evaluating, and funding programs to curb the spread of the disease. A promising new approach in this area is the network scale-up method, which uses information about the personal networks of respondents to make population size estimates. However, if the target population has low social visibility, as is likely to be the case in HIV/AIDS research, scale-up estimates will be too low. In this paper we develop a game-like activity that we call the game of contacts in order to estimate the social visibility of groups, and report results from a study of heavy drug users in Curitiba, Brazil (n = 294). The game produced estimates of social visibility that were consistent with qualitative expectations but of surprising magnitude. Further, a number of checks suggest that the data are of high quality. While motivated by the specific problem of population size estimation, our method could be used by researchers more broadly and adds to long-standing efforts to combine the richness of social network analysis with the power and scale of sample surveys. (C) 2010 Elsevier B.V. All rights reserved.
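The basic scale-up estimator divides the total number of hidden-population members that respondents report knowing by the total size of their personal networks, then scales by the size of the general population. If only a fraction tau of ties to the hidden population are visible to respondents, the naive estimate is too low by that factor and is corrected by dividing by tau. A toy sketch (all numbers invented, not the Curitiba results):

```python
# Toy network scale-up estimate; every number here is invented.
N = 1_000_000  # size of the general population

# y_i: hidden-population members each respondent reports knowing.
reported_hidden = [2, 0, 1, 3, 0, 1]
# d_i: estimated personal network size of each respondent.
network_sizes = [300, 250, 400, 500, 200, 350]

naive_estimate = sum(reported_hidden) / sum(network_sizes) * N

# Visibility adjustment: divide by the assumed fraction of visible ties.
tau = 0.5  # assumed social visibility
adjusted_estimate = naive_estimate / tau

print(round(naive_estimate), round(adjusted_estimate))
```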

Relevance: 20.00%

Abstract:

We discuss potential caveats when estimating topologies of 3D brain networks from surface recordings. It is virtually impossible to record activity from all single neurons in the brain, and one has to rely on techniques that measure average activity at sparsely located (non-invasive) recording sites. Effects of this spatial sampling in relation to structural network measures like centrality and assortativity were analyzed using multivariate classifiers. A simplified model of 3D brain connectivity incorporating both short- and long-range connections served for testing. To mimic M/EEG recordings, we sampled this model via non-overlapping regions and weighted nodes and connections according to their proximity to the recording sites. We used various complex network models for reference and tried to classify sampled versions of the "brain-like" network as one of these archetypes. It was found that sampled networks may substantially deviate in topology from the respective original networks for small sample sizes. For experimental studies this may imply that surface recordings can yield network structures that do not agree with the generating 3D network. (C) 2010 Elsevier Inc. All rights reserved.
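The spatial-sampling effect described here can be illustrated with a toy model: a ring lattice with sparse long-range shortcuts stands in for the 3D network, and retaining a random subset of nodes stands in for sparse surface recording. Everything below is an illustrative assumption, not the authors' model:

```python
import random

random.seed(7)

# Ring lattice with sparse long-range shortcuts; all sizes arbitrary.
N = 400
edges = set()
for i in range(N):
    for k in (1, 2):  # short-range ring connections
        edges.add(frozenset((i, (i + k) % N)))
for _ in range(40):   # sparse long-range connections
    edges.add(frozenset(random.sample(range(N), 2)))

def mean_degree(nodes, edge_set):
    deg = {v: 0 for v in nodes}
    for e in edge_set:
        a, b = tuple(e)
        if a in deg and b in deg:  # keep only edges among retained nodes
            deg[a] += 1
            deg[b] += 1
    return sum(deg.values()) / len(deg)

full = mean_degree(range(N), edges)
kept = set(random.sample(range(N), N // 4))  # "record" 25% of the nodes
sampled = mean_degree(kept, edges)

print(full, sampled)  # the sampled network is much sparser than the original
```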

Relevance: 20.00%

Abstract:

In a previous paper, we developed a phenomenological-operator technique aiming to simplify the estimate of losses due to dissipation in cavity quantum electrodynamics. In this paper, we apply that technique to estimate losses during an entanglement concentration process in the context of dissipative cavities. In addition, some results, previously used without proof to justify our phenomenological-operator approach, are now formally derived, including an equivalent way to formulate the Wigner-Weisskopf approximation.

Relevance: 20.00%

Abstract:

There has been great interest in deciding whether a combinatorial structure satisfies some property, or in estimating the value of some numerical function associated with this combinatorial structure, by considering only a randomly chosen substructure of sufficiently large, but constant size. These problems are called property testing and parameter testing, where a property or parameter is said to be testable if it can be estimated accurately in this way. The algorithmic appeal is evident, as, conditional on sampling, this leads to reliable constant-time randomized estimators. Our paper addresses property testing and parameter testing for permutations in a subpermutation perspective; more precisely, we investigate permutation properties and parameters that can be well approximated based on a randomly chosen subpermutation of much smaller size. In this context, we use a theory of convergence of permutation sequences developed by the present authors [C. Hoppen, Y. Kohayakawa, C.G. Moreira, R.M. Sampaio, Limits of permutation sequences through permutation regularity, Manuscript, 2010, 34pp.] to characterize testable permutation parameters along the lines of the work of Borgs et al. [C. Borgs, J. Chayes, L. Lovász, V.T. Sós, B. Szegedy, K. Vesztergombi, Graph limits and parameter testing, in: STOC'06: Proceedings of the 38th Annual ACM Symposium on Theory of Computing, ACM, New York, 2006, pp. 261-270.] in the case of graphs. Moreover, we obtain a permutation result in the direction of a famous result of Alon and Shapira [N. Alon, A. Shapira, A characterization of the (natural) graph properties testable with one-sided error, SIAM J. Comput. 37 (6) (2008) 1703-1727.] stating that every hereditary graph property is testable. (C) 2011 Elsevier B.V. All rights reserved.
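Parameter testing by bounded sampling can be illustrated with a toy permutation parameter: the inversion density, estimated from a fixed number of randomly chosen index pairs rather than all ~n^2/2 of them. A sketch (the parameter and the sample sizes are chosen for illustration only):

```python
import random

random.seed(42)

def inversion_density_exact(perm):
    """Fraction of index pairs (i < j) with perm[i] > perm[j]."""
    n = len(perm)
    inv = sum(1 for i in range(n) for j in range(i + 1, n) if perm[i] > perm[j])
    return inv / (n * (n - 1) / 2)

def inversion_density_sampled(perm, samples=20_000):
    """Estimate the same parameter from randomly chosen index pairs --
    a toy instance of parameter testing by bounded sampling."""
    n = len(perm)
    hits = 0
    for _ in range(samples):
        i, j = sorted(random.sample(range(n), 2))
        if perm[i] > perm[j]:
            hits += 1
    return hits / samples

perm = list(range(1000))
random.shuffle(perm)  # a uniform random permutation: density near 1/2

d_exact = inversion_density_exact(perm)
d_sampled = inversion_density_sampled(perm)
print(d_exact, d_sampled)
```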

Relevance: 20.00%

Abstract:

When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.
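The max-min backup behind MDP-IP solution methods can be sketched on a toy problem where each action's imprecise transition model is a small credal set given by its extreme points: nature minimizes expected future value over the set, and the agent maximizes over actions. All numbers below are invented, and real MDP-IP solvers (including the factored methods of the paper) handle far richer constraint sets:

```python
# Robust value iteration on a toy 2-state MDP with imprecise transitions.
# Each (state, action) maps to a reward and a credal set of candidate
# transition distributions over the next state.
GAMMA = 0.9

MODEL = {
    (0, 'a'): (1.0, [[0.8, 0.2], [0.6, 0.4]]),
    (0, 'b'): (0.5, [[0.5, 0.5], [0.3, 0.7]]),
    (1, 'a'): (0.0, [[0.9, 0.1], [0.7, 0.3]]),
    (1, 'b'): (2.0, [[0.1, 0.9], [0.2, 0.8]]),
}

V = [0.0, 0.0]
for _ in range(200):  # gamma-contraction: 200 backups is ample here
    V = [
        max(
            reward + GAMMA * min(  # nature picks the worst distribution
                sum(p * V[s2] for s2, p in enumerate(dist)) for dist in credal
            )
            for (reward, credal) in (MODEL[(s, a)] for a in ('a', 'b'))
        )
        for s in (0, 1)
    ]

print([round(v, 3) for v in V])
```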