968 results for chance


Relevance: 10.00%

Abstract:

This research examined how retrospective self-assessments of performance are affected by major depression. To test the validity of the depressive realism versus the selective processing hypotheses, aggregate posttest performance estimates (PTPEs) were obtained from clinically depressed patients and an age-matched comparison group across 4 decision tasks (object recognition, general knowledge, social judgment, and line-length judgment). As expected on the basis of previous findings, both groups were underconfident in their PTPEs, consistently underestimating the percentage of questions they had answered correctly. Contrary to depressive realism, and in partial support of the selective processing account, this underconfidence effect was not reduced but modestly exacerbated in the depressed patients. Further, whereas the PTPEs of the comparison group exceeded those expected on the basis of chance alone, those of the depressed individuals did not. The results provide no support for the depressive realism account and suggest that negative biases contribute to metacognitive information processing in major depression.

Relevance: 10.00%

Abstract:

Recognition as a cue to judgment in a novel, multi-option domain (the Sunday Times Rich List) is explored. As in previous studies, participants were found to make use of name recognition as a cue to the presumed wealth of individuals. Recognized names were judged, at above-chance levels, to be the richest from amongst the set presented. This effect persisted across situations in which more than one name was recognized; recognition was used as an inclusion criterion for the sub-set of names to be considered the richest of the set presented. However, when the question was reversed, and a “poorest” judgment was required, use of recognition as an exclusion criterion was observed only when a single name was recognized. Reaction times when making these judgments also show a distinction between “richest” and “poorest” questions, with recognition of none of the options taking the longest time to judge in the “richest” question condition and full recognition of all the names presented taking longest to judge in the “poorest” question condition. Implications for decision-making using simple heuristics are discussed.
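The inclusion-criterion rule described in this abstract (recognized names form the candidate set for a "richest" judgment) can be sketched as follows. This is a minimal illustration of the recognition heuristic, not the study's procedure; the names, tie-breaking rule, and function signature are all assumptions.

```python
# Hedged sketch of a recognition-based choice rule: recognized names form
# the candidate ("inclusion") set for a "richest" judgment, and the chooser
# guesses among them when recognition alone cannot discriminate.
import random

def judge_richest(options, recognized, rng=random.Random(0)):
    """Pick the presumed richest option, using recognition as an
    inclusion criterion; fall back to a pure guess if nothing is recognized."""
    candidates = [name for name in options if name in recognized]
    if not candidates:          # none recognized: choose at random
        candidates = list(options)
    return rng.choice(candidates)

options = ["Name A", "Name B", "Name C"]
print(judge_richest(options, recognized={"Name B"}))  # → "Name B"
```

With a single recognized name the rule is deterministic, which matches the above-chance accuracy the abstract reports; with several recognized names it merely narrows the set, as described.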

Relevance: 10.00%

Abstract:

Virtual Reality (VR) is widely used in visualizing medical datasets. This interest has emerged due to the usefulness of its techniques and features. Such features include immersion, collaboration, and interactivity. In a medical visualization context, immersion is important, because it allows users to interact directly and closely with detailed structures in medical datasets. Collaboration, on the other hand, is beneficial, because it gives medical practitioners the chance to share their expertise and offer feedback and advice in a more effective and intuitive approach. Interactivity is crucial in medical visualization and simulation systems, because responsive and instantaneous actions are key attributes in applications such as surgical simulations. In this paper we present a case study that investigates the use of VR in a collaborative networked CAVE environment from a medical volumetric visualization perspective. The study will present a networked CAVE application, which has been built to visualize and interact with volumetric datasets. We will summarize the advantages of such an application and the potential benefits of our system. We will also describe the aspects related to this application area and the relevant issues of such implementations.

Relevance: 10.00%

Abstract:

Introduction: In Brazil, since the 1988 Constitution, daycare has been a right of the child, a duty of the State, and an option for the family. Given that child development is a complex process resulting from the interaction of biological potential with the social and cultural environment in which the child is embedded, daycare centres constitute an environmental factor influencing the development of children's cognitive, motor, and social skills. Identifying situations that may compromise the development of children attending daycare centres is therefore essential for designing policies and strategies that help improve the quality of the services these institutions provide. Objective: to determine the prevalence of delayed neuropsychomotor development in infants attending public daycare centres in the city of João Pessoa/PB and to analyse factors associated with child development. Methods: from March to June 2012, a cross-sectional study was conducted in the nursery classes of the Reference Centres for Early Childhood Education (CREI) of the Municipal Education Network of João Pessoa/PB, covering the population of children aged 6 to 18 months and their respective mothers (biological or surrogate). The outcome studied was neuropsychomotor development, assessed with the Denver Developmental Screening Test II. Biological, maternal, social, and demographic explanatory variables were investigated through a questionnaire administered to the mother/guardian, examination of the child's health record, and a form about the daycare centre completed through observation of the physical environment and interviews with managers.

Relevance: 10.00%

Abstract:

The word tyche (plural tychai) denotes an ancient Greek concept encompassing many aspects of fortune – chance, fate, luck, occurrence, even achievement, success, and wealth – both good and bad. As a personification of that concept, the goddess Tyche came to symbolize the fate and fortune of rulers and through them their cities; she thus emerged as the preeminent city goddess throughout the Hellenistic and Roman worlds. Her name is etymologically related to the verb tynchanein (“to hit, meet with, be favored with, happen accidentally”). The connection between the noun and verb is so close that it is difficult to distinguish in Greek literature between the deity and the abstraction.

Relevance: 10.00%

Abstract:

The meridional overturning circulation (MOC) is part of a global ocean circulation that redistributes heat from equatorial to polar regions. In the Atlantic the MOC carries heat northward (the Atlantic Heat Conveyor), which is released to the atmosphere and maintains UK temperatures 3 to 5°C higher than elsewhere at similar latitudes. However, the present strength and structure of the MOC may not continue. The 2007 IPCC assessment report (IPCC, 2007) suggests that there is less than a 10% chance of abrupt changes during the 21st century, but a greater than 90% chance that the MOC will slow by an average of 25% compared to pre-industrial levels, offsetting some of the warming over the European sector of the North Atlantic and contributing to the rate of sea-level rise. Daily observations using the RAPID MOC mooring array at 26.5°N are providing a continuous and growing time series of the MOC strength and structure, but the five-year record is at present too short to establish trends in the annual mean MOC. Other observations do not at present provide a coherent Atlantic-wide picture of MOC variability, and there is little evidence of any long-term slowing. Ocean assimilation models suggest a slowing over the past decade of around 10%. However, models still have many problems in representing ocean circulation, and conclusions of change are very uncertain.

Relevance: 10.00%

Abstract:

Cyanobacteria (blue-green algae) blooms in water bodies present serious public health issues with attendant economic and ecological impacts. Llyn Tegid (Lake Bala) is an important conservation and amenity asset within Snowdonia National Park, Wales, which since the mid-1990s has experienced multiple toxic cyanobacteria blooms threatening the ecology and tourism-dependent local economy. Multiple working hypotheses explain the emergence of this problem, including climate change, land management linked to increased nutrient flux, hydromorphological alterations or changing trophic structure - any of which may operate individually or cumulatively to impair lake function. This paper reports the findings of a sediment fingerprinting study using dated lake cores to explore the linkages between catchment and lake management practices and the emergence of the algal bloom problem. Since 1900 AD, lake-bed sedimentation rates have varied from 0.06 to 1.07 g cm−2 yr−1, with a pronounced acceleration since the early 1980s. Geochemical analysis revealed increases in the concentrations of total phosphorus (TP), calcium and heavy metals such as zinc and lead, consistent with eutrophication and a rising pollution burden, particularly since the late 1970s. An uncertainty-inclusive sediment fingerprinting approach was used to apportion the relative fluxes from the major catchment land cover types of improved pasture, rough grazing, forestry and channel banks. This showed that improved pasture and channel banks are the dominant diffuse sources of sediment in the catchment, though forestry sources were important historically. Conversion of rough grazing to improved grassland, coupled with intensified land management and year-round livestock grazing, is concluded to provide the principal source of rising TP levels.
Lake Habitat Survey and particle size analysis of lake cores demonstrate the hydromorphological impact of the River Dee Regulation Scheme, which controls water level and periodically diverts flow into Llyn Tegid from the adjacent Afon Tryweryn catchment. This hydromorphological impact has also been most pronounced since the late 1970s. It is concluded that an integrated approach combining land management to reduce agricultural runoff with improved water level regulation enabling recovery of littoral macrophytes offers the greatest chance of halting the ongoing cyanobacteria issue in Llyn Tegid.

Relevance: 10.00%

Abstract:

Global climate and weather models tend to produce rainfall that is too light and too regular over the tropical ocean. This is likely because of convective parametrizations, but the problem is not well understood. Here, distributions of precipitation rates are analyzed for high-resolution UK Met Office Unified Model simulations of a 10 day case study over a large tropical domain (∼20°S–20°N and 42°E–180°E). Simulations with 12 km grid length and parametrized convection have too many occurrences of light rain and too few of heavier rain when interpolated onto a 1° grid and compared with Tropical Rainfall Measuring Mission (TRMM) data. In fact, this version of the model appears to have a preferred scale of rainfall around 0.4 mm h−1 (10 mm day−1), unlike observations of tropical rainfall. On the other hand, 4 km grid length simulations with explicit convection produce distributions much more similar to TRMM observations. The apparent preferred scale at lighter rain rates seems to be a feature of the convective parametrization rather than the coarse resolution, as demonstrated by results from 12 km simulations with explicit convection and 40 km simulations with parametrized convection. In fact, coarser resolution models with explicit convection tend to have even more heavy rain than observed. Implications for models using convective parametrizations, including interactions of heating and moistening profiles with larger scales, are discussed. One important implication is that the explicit convection 4 km model has temperature and moisture tendencies that favour transitions in the convective regime. Also, the 12 km parametrized convection model produces a more stable temperature profile at its extreme high-precipitation range, which may reduce the chance of very heavy rainfall. 
Further study is needed to determine whether unrealistic precipitation distributions are due to some fundamental limitation of convective parametrizations or whether parametrizations can be improved, in order to better simulate these distributions.
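Rain-rate distribution comparisons of the kind this abstract describes are commonly built as frequency histograms over logarithmically spaced rate bins, so that light drizzle and heavy convective rain are resolved on the same axis. The sketch below uses synthetic rates, not TRMM or Unified Model output, and the bin limits are illustrative assumptions.

```python
# Sketch: bin precipitation rates (mm/h) into logarithmically spaced bins,
# a common way to compare modelled and observed tropical rain-rate
# distributions. The rates below are synthetic placeholders.
import math
import random

def rate_histogram(rates, n_bins=8, lo=0.01, hi=100.0):
    """Return (bin_edges, fractional frequency per bin) for rates in mm/h."""
    edges = [lo * (hi / lo) ** (i / n_bins) for i in range(n_bins + 1)]
    counts = [0] * n_bins
    for r in rates:
        if lo <= r < hi:
            i = int(math.log(r / lo) / math.log(hi / lo) * n_bins)
            counts[i] += 1
    total = sum(counts) or 1
    return edges, [c / total for c in counts]

rng = random.Random(0)
rates = [rng.lognormvariate(-1.0, 1.5) for _ in range(1000)]
edges, freqs = rate_histogram(rates)
print([round(f, 3) for f in freqs])
```

Comparing such histograms between a model and observations would reveal the kind of mismatch the abstract reports: too much weight at light rates and too little at heavy ones.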

Relevance: 10.00%

Abstract:

We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. 
However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50% chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding of the drivers of climate change.
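The "50% chance per year" figure implies a simple compound probability for the decade as a whole. The arithmetic below is an illustration of that implication, not part of the forecast, and it assumes the yearly chances are independent, which is a simplification.

```python
# If each year independently has a 50% chance of setting a new global
# temperature record (as the forecast states for 2013 onwards), the chance
# of at least one new record within k years is 1 - 0.5**k.
# Independence across years is an illustrative assumption.
def p_record_within(k, p_per_year=0.5):
    return 1 - (1 - p_per_year) ** k

for k in (1, 3, 5):
    print(f"within {k} year(s): {p_record_within(k):.1%}")
```

Under this simplification a new record becomes very likely within only a few years, which is why the forecast's per-year figure is a strong statement about the decade.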

Relevance: 10.00%

Abstract:

This study presents the first global-scale multi-sectoral regional assessment of the magnitude and uncertainty in the impacts of climate change avoided by emissions policies. The analysis suggests that the most stringent emissions policy considered here – which gives a 50% chance of remaining below a 2°C temperature rise target – reduces impacts by 20–65% by 2100 relative to a ‘business-as-usual’ pathway (A1B) which reaches 4°C, and can delay impacts by several decades. Effects vary between sector and region, and there are few noticeable effects of mitigation policy by 2030. The impacts avoided by 2100 are more strongly influenced by the date and level at which emissions peak than the rate of decline of emissions, with an earlier and lower emissions peak avoiding more impacts. The estimated proportion of impacts avoided at the global scale is relatively robust despite uncertainty in the spatial pattern of climate change, but the absolute amount of avoided impacts is considerably more variable and therefore uncertain.

Relevance: 10.00%

Abstract:

Effectively preparing and planning for Customer Relationship Management (CRM) strategy is critical to CRM implementation success. The lack of a common and systematic way to implement CRM means that focus must be placed on the pre-implementation stage to give the best chance of success. Although existing CRM implementation approaches evidence the need to concentrate mostly on the pre-implementation stage, they fail to address some key issues, raising the need for a generic framework that addresses CRM strategy analysis. This paper proposes a framework to support effective CRM pre-implementation strategy development.

Relevance: 10.00%

Abstract:

The BFKL equation and the kT-factorization theorem are used to obtain predictions for F2 in the small Bjorken-x region over a wide range of Q2. The dependence on the parameters, especially on those concerning the infrared region, is discussed. After a background fit to recent experimental data obtained at DESY HERA and at Fermilab (E665 experiment) we find that the predicted, almost Q2-independent BFKL slope λ ≳ 0.5 appears to be too steep at lower Q2 values. Thus there seems to be a chance that future HERA data can distinguish between pure BFKL and conventional field theoretic renormalization group approaches. © 1995 The American Physical Society.

Relevance: 10.00%

Abstract:

The great majority of plant species in the tropics require animals to achieve pollination, but the exact role of floral signals in attraction of animal pollinators is often debated. Many plants provide a floral reward to attract a guild of pollinators, and it has been proposed that floral signals of non-rewarding species may converge on those of rewarding species to exploit the relationship of the latter with their pollinators. In the orchid family (Orchidaceae), pollination is almost universally animal-mediated, but a third of species provide no floral reward, which suggests that deceptive pollination mechanisms are prevalent. Here, we examine floral colour and shape convergence in Neotropical plant communities, focusing on certain food-deceptive Oncidiinae orchids (e.g. Trichocentrum ascendens and Oncidium nebulosum) and rewarding species of Malpighiaceae. We show that the species from these two distantly related families are often more similar in floral colour and shape than expected by chance and propose that a system of multifarious floral mimicry—a form of Batesian mimicry that involves multiple models and is more complex than a simple one model–one mimic system—operates in these orchids. The same mimetic pollination system has evolved at least 14 times within the species-rich Oncidiinae throughout the Neotropics. These results help explain the extraordinary diversification of Neotropical orchids and highlight the complexity of plant–animal interactions.
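"More similar than expected by chance" claims of the kind made in this abstract are typically backed by a null-model comparison such as a permutation test. The sketch below is a minimal version of that idea with invented trait values, not the study's data or method; the function name and distance measure are assumptions.

```python
# Minimal permutation test: is the mean trait distance between two groups
# (e.g. floral colour scores for orchids vs. Malpighiaceae) smaller than
# expected under a random relabelling of the pooled values?
# Trait values and the distance metric are invented for illustration.
import random

def perm_test_similarity(group_a, group_b, n_perm=2000, seed=1):
    def mean_cross_distance(a, b):
        return sum(abs(x - y) for x in a for y in b) / (len(a) * len(b))

    observed = mean_cross_distance(group_a, group_b)
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    as_similar = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if mean_cross_distance(a, b) <= observed:  # as similar or more
            as_similar += 1
    # Small p-value => groups are more similar than chance relabelling allows.
    return observed, as_similar / n_perm

obs, p = perm_test_similarity([0.51, 0.49, 0.50], [0.52, 0.48, 0.50])
print(f"observed distance {obs:.3f}, p = {p:.3f}")
```

Real analyses of floral colour and shape would use multivariate trait spaces and phylogenetically informed null models, but the logic of comparing an observed similarity against shuffled relabellings is the same.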

Relevance: 10.00%

Abstract:

Multicellularity evolved well before 600 million years ago, and all multicellular animals have evolved since then with the need to protect against pathogens. There is no reason to expect their immune systems to be any less sophisticated than ours. The vertebrate system, based on rearranging immunoglobulin-superfamily domains, appears to have evolved partly as a result of chance insertion of RAG genes by horizontal transfer. Remarkably sophisticated systems for expansion of immunological repertoire have evolved in parallel in many groups of organisms. Vaccination of invertebrates against commercially important pathogens has been empirically successful, and suggests that the definition of an adaptive and innate immune system should no longer depend on the presence of memory and specificity, since these terms are hard to define in themselves. The evolution of randomly created immunological repertoire also carries with it the potential for generating autoreactive specificities and consequent autoimmune damage. While invertebrates may use systems analogous to ours to control autoreactive specificities, they may have evolved alternative mechanisms which operate at the level of individuals-within-populations rather than cells-within-individuals, by linking self-reactive specificities to regulatory pathways and non-self-reactive specificities to effector pathways.