81 results for Historical Artifacts


Relevance: 20.00%

Abstract:

From the 1950s up to the early 1990s, the All-India data show an ever-declining share of informal credit in the total outstanding debt of rural households. Contemporaneous micro-level studies, using more qualitative research methodologies, provide evidence that questions the strength of this trend, and more recent All-India credit surveys show, first, a levelling, and then a rise, in the share of rural informal credit in 1990/91 and 2000/01, respectively. Drawing on the findings of a study of village moneylenders in Rajasthan, the paper notes three lessons. First, informal financial agents have not disappeared from the rural financial landscape in India. Second, formal-sector financial institutions can learn much about rural financial service needs from the financial products and processes of their informal counterparts. Third, a national survey of informal agents, similar to the 1921 Census survey of indigenous bankers and moneylenders, would provide valuable pointers towards policy options for the sector. A recent Reserve Bank of India Report on Moneylender Legislation not only explores incentive mechanisms to better ensure fair practice, but also proposes provision for a new category of loan providers that would explicitly link the rural informal and formal financial sectors.

Relevance: 20.00%

Abstract:

PopABC is a computer package for inferring the pattern of demographic divergence of closely related populations and species. The software performs coalescent simulation in the framework of approximate Bayesian computation (ABC). PopABC can also be used to perform Bayesian model choice to discriminate between different demographic scenarios. The program can be used for research or for teaching purposes.
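
The abstract does not show PopABC's own interface, so the following is only a minimal sketch of the rejection-sampling idea behind approximate Bayesian computation, the framework the package builds on. The function names, the toy normal model, and the tolerance are illustrative assumptions, not PopABC code.

    import numpy as np

    rng = np.random.default_rng(42)

    # Observed data: a summary statistic computed from the "real" sample.
    observed = rng.normal(loc=2.0, scale=1.0, size=100)
    obs_stat = observed.mean()

    def simulate_stat(theta, n=100):
        # Simulate one data set under parameter theta and summarize it.
        return rng.normal(loc=theta, scale=1.0, size=n).mean()

    def rejection_abc(n_draws=20_000, eps=0.05):
        # Rejection ABC: draw theta from the prior, keep it only if the
        # simulated summary lands within tolerance eps of the observed one.
        prior_draws = rng.uniform(-5.0, 5.0, size=n_draws)  # flat prior
        accepted = [t for t in prior_draws
                    if abs(simulate_stat(t) - obs_stat) < eps]
        return np.array(accepted)

    posterior = rejection_abc()
    print(f"accepted {posterior.size} draws; posterior mean ~ {posterior.mean():.2f}")

The accepted draws approximate the posterior without ever evaluating a likelihood, which is what makes the approach workable for coalescent models.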

Relevance: 20.00%

Abstract:

Background and Purpose: Clinical research into the treatment of acute stroke is complicated, costly, and often unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume.

Methods: Determination of the relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo trial patients from the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect in terms of a measure such as the modified Rankin score (mRS) is found. The sample size to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III.

Results: The odds ratios for mRS correspond roughly to the square root of odds ratios for lesion volume, implying that for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated by a phase II trial of 126 patients comparing the same 2 treatment arms.

Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
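
The square-root relation in the Results section yields the quoted factor of four directly: if the detectable log odds ratio on lesion volume is twice that on mRS, and sample size scales with the inverse square of the log odds ratio, the lesion-volume trial needs about a quarter of the patients. A rough sketch of that arithmetic, where the odds ratio of 1.3 and the variance constant k are illustrative assumptions rather than values from the paper:

    from math import log
    from statistics import NormalDist

    def n_per_arm(log_or, alpha=0.05, power=0.9, k=4.0):
        # Normal-approximation sample size, proportional to 1/(log OR)^2.
        # k bundles the outcome-specific variance term and is assumed here.
        z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
        return k * z**2 / log_or**2

    or_mrs = 1.3             # hypothetical clinically relevant mRS odds ratio
    or_lesion = or_mrs ** 2  # the abstract's relation: OR_mRS ~ sqrt(OR_lesion)

    print(n_per_arm(log(or_mrs)) / n_per_arm(log(or_lesion)))  # exactly 4.0

The ratio is 4 regardless of the odds ratio chosen, because it depends only on the squared ratio of the two log odds ratios.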

Relevance: 20.00%

Abstract:

We investigate the performance of phylogenetic mixture models in reducing a well-known and pervasive artifact of phylogenetic inference known as the node-density effect, comparing them to partitioned analyses of the same data. The node-density effect refers to the tendency for the amount of evolutionary change in longer branches of phylogenies to be underestimated compared to that in regions of the tree where there are more nodes and thus branches are typically shorter. Mixture models allow more than one model of sequence evolution to describe the sites in an alignment without prior knowledge of the evolutionary processes that characterize the data or how they correspond to different sites. If multiple evolutionary patterns are common in sequence evolution, mixture models may be capable of reducing node-density effects by characterizing the evolutionary processes more accurately. In gene-sequence alignments simulated to have heterogeneous patterns of evolution, we find that mixture models can reduce node-density effects to negligible levels or remove them altogether, performing as well as partitioned analyses based on the known simulated patterns. The mixture models achieve this without knowledge of the patterns that generated the data and even in some cases without specifying the full or true model of sequence evolution known to underlie the data. The latter result is especially important in real applications, as the true model of evolution is seldom known. We find the same patterns of results for two real data sets with evidence of complex patterns of sequence evolution: mixture models substantially reduced node-density effects and returned better likelihoods compared to partitioning models specifically fitted to these data. We suggest that the presence of more than one pattern of evolution in the data is a common source of error in phylogenetic inference and that mixture models can often detect these patterns even without prior knowledge of their presence in the data. Routine use of mixture models alongside other approaches to phylogenetic inference may often reveal hidden or unexpected patterns of sequence evolution and can improve phylogenetic inference.
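
As a minimal illustration of the mechanism described above (not the authors' software), the sketch below computes a mixture log-likelihood in which each site's likelihood is a weighted sum over two substitution processes, so no site has to be assigned to a process in advance, in contrast to a partitioned analysis. The per-site likelihoods and weights are invented numbers standing in for the output of Felsenstein's pruning algorithm on a fixed tree.

    import numpy as np

    # Toy per-site likelihoods under two candidate processes, e.g. a slow-
    # and a fast-evolving substitution model, for a four-site alignment.
    lik_slow = np.array([0.30, 0.25, 0.02, 0.01])  # P(site i | slow process)
    lik_fast = np.array([0.05, 0.04, 0.20, 0.18])  # P(site i | fast process)

    weights = np.array([0.5, 0.5])  # mixture weights, normally estimated by ML

    # Mixture model: every site is a weighted sum over both processes.
    mixture_loglik = np.log(weights[0] * lik_slow + weights[1] * lik_fast).sum()

    # Partitioned analysis, by contrast, assigns sites to processes up front.
    partition_loglik = np.log(lik_slow[:2]).sum() + np.log(lik_fast[2:]).sum()

    print(mixture_loglik, partition_loglik)

The mixture form is what lets these models accommodate heterogeneous evolution without prior knowledge of which sites follow which pattern.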

Relevance: 20.00%

Abstract:

This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon the control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p0. Next, the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored.
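
For the binary case mentioned above, the abstract notes that the Bayesian answer coincides with the frequentist one under 'non-informative' priors, so the sketch below shows only that classical one-sample calculation; p0, the alternative, alpha, and power are illustrative assumptions, not values from the paper.

    from math import ceil, sqrt
    from statistics import NormalDist

    def binomial_sample_size(p0, p1, alpha=0.05, power=0.9):
        # Normal-approximation size for testing success rate p0 against a
        # one-sided alternative p1 > p0, the "specified magnitude" of benefit.
        za = NormalDist().inv_cdf(1 - alpha)
        zb = NormalDist().inv_cdf(power)
        num = za * sqrt(p0 * (1 - p0)) + zb * sqrt(p1 * (1 - p1))
        return ceil((num / (p1 - p0)) ** 2)

    # Hypothetical: is the success rate 0.5, or better by a clinically
    # relevant difference of 0.15?
    print(binomial_sample_size(p0=0.5, p1=0.65))  # about 92 observations

Under an informative prior the Bayesian calculation would generally give a different (typically smaller) answer; this sketch covers only the non-informative case where the two approaches agree.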

Relevance: 20.00%

Abstract:

The partitioning of minor trivalent actinides (An) from lanthanides (Ln) is one of the challenges in the chemical treatment of nuclear waste. The optimal ligand to carry out the separation of An(III) and Ln(III) using solvent extraction has to meet several important criteria: high selectivity towards the solute, chemical and radiolytic stability, stripping possibilities and recycling of the organic phase, high separation factors, and good distribution ratios, to name just a few. This overview traces the chronological development of each extraction ligand family and highlights some milestones along the way. Further developments in the organic synthesis of extracting ligands are expected.

Relevance: 20.00%

Abstract:

How does the manipulation of visual representations play a role in the practices of generating, evolving and exchanging knowledge? The role of visual representation in mediating knowledge work is explored in a study of design work of an architectural practice, Edward Cullinan Architects. The intensity of interactions with visual representations in the everyday activities on design projects is immediately striking. Through a discussion of observed design episodes, two ways are articulated in which visual representations act as 'artefacts of knowing'. As communication media they are symbolic representations, rich in meaning, through which ideas are articulated, developed and exchanged. Furthermore, as tangible artefacts they constitute material entities with which to interact and thereby develop knowledge. The communicative and interactive properties of visual representations constitute them as central elements of knowledge work. The paper explores emblematic knowledge practices supported by visual representation and concludes by pinpointing avenues for further research.
