Rural financial institutions and agents in India: a historical and contemporary comparative analysis
Abstract:
From the 1950s up to the early 1990s, the All-India data show an ever-declining share of informal credit in the total outstanding debt of rural households. Contemporaneous micro-level studies, using more qualitative research methodologies, provide evidence that questions the strength of this trend, and more recent All-India credit surveys show first a levelling and then a rise in the share of rural informal credit in 1990/91 and 2000/01, respectively. Drawing on the findings of a study of village moneylenders in Rajasthan, the paper notes three lessons. First, informal financial agents have not disappeared from the rural financial landscape in India. Second, formal-sector financial institutions can learn much about rural financial service needs from the financial products and processes of their informal counterparts. Third, a national survey of informal agents, similar to the 1921 Census survey of indigenous bankers and moneylenders, would provide valuable pointers towards policy options for the sector. A recent Reserve Bank of India Report on Moneylender Legislation not only explores incentive mechanisms to better ensure fair practice, but also proposes provision for a new category of loan providers that would explicitly link the rural informal and formal financial sectors.
Abstract:
PopABC is a computer package for inferring the pattern of demographic divergence of closely related populations and species. The software performs coalescent simulation in the framework of approximate Bayesian computation (ABC). PopABC can also be used to perform Bayesian model choice to discriminate between different demographic scenarios. The program can be used either for research or for education and teaching purposes.
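For readers unfamiliar with the ABC framework mentioned above, the sketch below shows a minimal, generic rejection-ABC sampler in Python. It is purely illustrative: the function and variable names are assumptions of this sketch, the toy model is a normal mean rather than a coalescent simulation, and nothing here reflects PopABC's actual interface.

```python
import numpy as np

def abc_rejection(observed_stats, prior_sampler, simulate, n_draws=100_000, tolerance=0.1):
    """Generic rejection ABC: keep parameter draws whose simulated summary
    statistics fall within `tolerance` of the observed summaries."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()                          # draw parameters from the prior
        sim_stats = simulate(theta)                      # simulate data, reduce to summaries
        if np.linalg.norm(sim_stats - observed_stats) < tolerance:
            accepted.append(theta)
    return np.array(accepted)                            # approximate posterior sample

# Toy usage: infer the mean of a normal model from its sample mean.
rng = np.random.default_rng(0)
observed = np.array([rng.normal(1.0, 1.0, 50).mean()])
posterior = abc_rejection(
    observed_stats=observed,
    prior_sampler=lambda: rng.uniform(-5, 5),
    simulate=lambda mu: np.array([rng.normal(mu, 1.0, 50).mean()]),
)
```

In the same spirit, ABC model choice compares how often simulations from competing demographic scenarios are accepted, but that step is not shown here.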
Abstract:
Background and Purpose: Clinical research into the treatment of acute stroke is complicated, costly, and has often been unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume. Methods: The relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo-arm patients in the Virtual International Stroke Trials Archive. The size of effect on lesion volume that would lead to a clinically relevant treatment effect on a clinical measure, such as the modified Rankin score (mRS), is determined. The sample size needed to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III. Results: The odds ratios for mRS correspond roughly to the square root of odds ratios for lesion volume, implying that for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated from a phase II trial of 126 patients comparing the same 2 treatment arms. Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
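The one-fourth figure in the Results follows from standard power arithmetic; the sketch below uses generic notation and is not taken from the paper. If the treatment effect on the log-odds scale for lesion volume is roughly twice that for mRS, i.e. $\log \mathrm{OR}_{\mathrm{mRS}} \approx \tfrac{1}{2}\log \mathrm{OR}_{\mathrm{vol}}$, and the sample size required at a fixed significance level and power scales as $n \propto (\log \mathrm{OR})^{-2}$, then

$$\frac{n_{\mathrm{vol}}}{n_{\mathrm{mRS}}} \approx \left(\frac{\log \mathrm{OR}_{\mathrm{mRS}}}{\log \mathrm{OR}_{\mathrm{vol}}}\right)^{2} = \left(\tfrac{1}{2}\right)^{2} = \frac{1}{4}.$$

Relaxing the significance level and power for phase II also shrinks the usual multiplier $(z_{1-\alpha} + z_{1-\beta})^{2}$, which is how the 1518-patient phase III example can map to a phase II trial of 126 patients rather than simply 1518/4, i.e. about 380.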
Abstract:
This paper presents a simple Bayesian approach to sample size determination in clinical trials. The trial is required to be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p0. Next, the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright (C) 2007 John Wiley & Sons, Ltd.
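For context, the classical frequentist calculation that the abstract says this Bayesian formulation resembles (and matches under 'non-informative' priors in the known-variance case) can be sketched as follows. The function name, arguments, and defaults are illustrative assumptions of this sketch, not taken from the paper.

```python
import math
from scipy.stats import norm

def two_sample_n_per_arm(delta, sigma, alpha=0.05, power=0.9):
    """Classical frequentist sample size per arm for comparing the means of two
    normal streams with known common variance sigma**2: two-sided level alpha,
    target power to detect a true difference delta."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value of the two-sided test
    z_beta = norm.ppf(power)           # quantile corresponding to the target power
    return math.ceil(2 * (sigma * (z_alpha + z_beta) / delta) ** 2)

# Example: detecting a half-standard-deviation difference needs about 85 patients per arm.
print(two_sample_n_per_arm(delta=0.5, sigma=1.0, alpha=0.05, power=0.9))
```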
Abstract:
The partitioning of minor trivalent actinides (An) from lanthanides (Ln) is one of the challenges in the chemical treatment of nuclear waste. An optimal ligand for the separation of An(III) and Ln(III) by solvent extraction has to meet several important criteria: high selectivity towards the solute, chemical and radiolytic stability, the possibility of stripping and recycling the organic phase, high separation factors, and good distribution ratios, to name just a few. This overview traces a chronological line through the development of each extractant ligand family and highlights some milestones. Further developments in the organic synthesis of extracting ligands are expected.
Abstract:
In this article, we provide an initial insight into the study of machine intelligence (MI) and what it means for a machine to be intelligent. We discuss how MI has progressed to date and consider future scenarios as realistically and logically as possible. To do this, we unravel one of the major stumbling blocks to the study of MI: the field that has become widely known as "artificial intelligence".