929 results for "Frequently asked question"
Abstract:
In a cross-sectional study of 400 randomly selected smallholder dairy farms in the Tanga and Iringa regions of Tanzania, 14.2% (95% confidence interval (CI) = 11.6-17.3) of cows had developed clinical mastitis during the previous year. The point prevalence of subclinical mastitis, defined as a quarter positive by the California Mastitis Test (CMT) or by bacteriological culture, was 46.2% (95% CI = 43.6-48.8) and 24.3% (95% CI = 22.2-26.6), respectively. In a longitudinal disease study in Iringa, the incidence of clinical mastitis was 31.7 cases per 100 cow-years. A randomised intervention trial indicated that intramammary antibiotics significantly reduced the proportion of bacteriologically positive quarters in the short term (14 days post-infusion), but teat dipping had no detectable effect on bacteriological infection or CMT-positive quarters. Other risk and protective factors identified from both the cross-sectional and longitudinal studies included Boran breeding (odds ratio (OR) = 3.40, 95% CI = 1.00-11.57, P < 0.05 for clinical mastitis, and OR = 3.51, 95% CI = 1.29-9.55, P < 0.01 for a CMT-positive quarter), while the practice of residual calf suckling was protective for a bacteriologically positive quarter (OR = 0.63, 95% CI = 0.48-0.81, P ≤ 0.001) and for a CMT-positive quarter (OR = 0.69, 95% CI = 0.63-0.75, P < 0.001). A mastitis training course for farmers and extension officers was held, and the knowledge gained and the use of different methods of dissemination were assessed over time. In a subsequent randomised controlled trial, there were strong associations between knowledge gained and both the individual question asked and the combination of dissemination methods (village meeting, video and handout) used. This study demonstrated that both clinical and subclinical mastitis are common in smallholder dairying in Tanzania, and that some of the risk and protective factors for mastitis can be addressed by practical management of dairy cows following effective knowledge transfer. (c) 2006 Elsevier B.V. All rights reserved.
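The abstract quotes odds ratios with 95% confidence intervals throughout. As a minimal illustration of how such figures are typically derived (not taken from the study's data), the sketch below computes an odds ratio and its Wald 95% confidence interval from an invented 2x2 exposure-by-outcome table.

```python
# Hedged illustration: odds ratio and Wald 95% CI from a hypothetical 2x2 table.
# All counts below are invented and do not come from the Tanzanian study.
import math

# Rows = exposure (e.g. Boran breeding yes/no), columns = outcome (CMT-positive yes/no)
a, b = 35, 65     # exposed: cases, non-cases
c, d = 90, 610    # unexposed: cases, non-cases

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```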
Abstract:
Background: The present paper investigates the question of a suitable basic model for the number of scrapie cases in a holding, and applications of this knowledge to the estimation of scrapie-affected holding population sizes and the adequacy of control measures within holdings. Is the number of scrapie cases proportional to the size of the holding, in which case holding size should be incorporated into the parameter of the error distribution for the scrapie counts? Or is there a different, potentially more complex, relationship between case count and holding size, in which case the information about the size of the holding would be better incorporated as a covariate in the modelling? Methods: We show that this question can be appropriately addressed via a simple zero-truncated Poisson model, in which the hypothesis of proportionality enters as a special offset model. Model comparisons can be achieved by means of likelihood ratio testing. The procedure is illustrated by means of surveillance data on classical scrapie in Great Britain. Furthermore, the model with the best fit is used to estimate the size of the scrapie-affected holding population in Great Britain by means of two capture-recapture estimators: the Poisson estimator and the generalized Zelterman estimator. Results: No evidence could be found for the hypothesis of proportionality. In fact, there is some evidence that this relationship follows a curved line which increases for small holdings up to a maximum, after which it declines again. Furthermore, it is pointed out how crucial the correct model choice is when applied to capture-recapture estimation on the basis of zero-truncated Poisson models as well as on the basis of the generalized Zelterman estimator. Estimators based on the proportionality model return very different and unreasonable estimates for the population sizes. Conclusion: Our results stress the importance of an adequate modelling approach to the association between holding size and the number of cases of classical scrapie within a holding. Reporting artefacts and speculative biological effects are hypothesized as the underlying causes of the observed curved relationship. The lack of adjustment for these artefacts might well render ineffective the current strategies for the control of the disease.
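The model comparison described above treats proportionality as an offset model nested within a log-linear covariate model for zero-truncated Poisson counts, compared via a likelihood-ratio test. The sketch below illustrates that idea under invented placeholder data (not the Great Britain surveillance data) and is not the authors' implementation.

```python
# Minimal sketch: offset (proportionality) vs. covariate model for
# zero-truncated Poisson case counts, compared by a likelihood-ratio test.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln
from scipy.stats import chi2

# Illustrative placeholder data: case counts on affected holdings (all >= 1
# by construction, since only affected holdings are observed) and holding sizes.
cases = np.array([1, 1, 2, 1, 3, 1, 2, 5, 1, 2], dtype=float)
sizes = np.array([40, 55, 80, 120, 150, 200, 260, 300, 420, 600], dtype=float)
log_size = np.log(sizes)

def ztp_neg_loglik(mu):
    """Negative log-likelihood of zero-truncated Poisson counts with mean vector mu."""
    ll = cases * np.log(mu) - mu - gammaln(cases + 1.0) - np.log(1.0 - np.exp(-mu))
    return -ll.sum()

# Proportionality (offset) model: log(mu) = b0 + log(size), slope fixed at 1.
fit_offset = minimize(lambda b: ztp_neg_loglik(np.exp(b[0] + log_size)), x0=[-4.0])

# Covariate model: log(mu) = b0 + b1 * log(size), slope estimated freely.
fit_covar = minimize(lambda b: ztp_neg_loglik(np.exp(b[0] + b[1] * log_size)), x0=[-4.0, 1.0])

# The offset model is the covariate model with b1 fixed at 1, so the models
# are nested and the LR statistic is chi-squared with 1 degree of freedom.
lr_stat = 2.0 * (fit_offset.fun - fit_covar.fun)
p_value = chi2.sf(lr_stat, df=1)
print(f"LR = {lr_stat:.2f}, p = {p_value:.3f}")
```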
Abstract:
This note considers variance estimation for population size estimators based on capture–recapture experiments. Whereas a diversity of estimators of the population size has been suggested, the question of estimating the associated variances is less frequently addressed. This note points out that the technique of conditioning can be applied here successfully, which also allows us to identify the sources of variation: the variance due to estimation of the model parameters and the binomial variance due to sampling n units from a population of size N. The technique is applied to estimators typically used in capture–recapture experiments in continuous time, including the estimators of Zelterman and Chao, and improves upon previously used variance estimators. In addition, knowledge of the variances associated with the estimators of Zelterman and Chao allows the suggestion of a new estimator as a weighted sum of the two. The decomposition of the variance into the two sources also allows a new understanding of how resampling techniques such as the bootstrap could be used appropriately. Finally, the sample size question for capture–recapture experiments is addressed. Since the variance of population size estimators increases with the sample size, it is suggested that relative measures, such as the observed-to-hidden ratio or the completeness-of-identification proportion, be used when approaching the question of sample size choice.
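For readers unfamiliar with the two estimators named above, the following minimal sketch computes the standard Chao and Zelterman point estimates of population size from zero-truncated identification counts. The frequency data are hypothetical, and the note's conditional variance decomposition and weighted estimator are not reproduced here.

```python
# Hedged sketch: Chao and Zelterman population size estimators from
# zero-truncated count data (each observed unit identified at least once).
import numpy as np

counts = np.array([1, 1, 1, 2, 1, 3, 1, 2, 1, 1, 2, 4])  # illustrative only
n = counts.size                 # number of observed (identified) units
f1 = np.sum(counts == 1)        # units identified exactly once
f2 = np.sum(counts == 2)        # units identified exactly twice

# Chao's lower-bound estimator: N_hat = n + f1^2 / (2 * f2)
N_chao = n + f1**2 / (2 * f2)

# Zelterman's estimator: lambda_hat = 2*f2/f1, then a Horvitz-Thompson-type
# correction N_hat = n / (1 - exp(-lambda_hat))
lam = 2 * f2 / f1
N_zelterman = n / (1 - np.exp(-lam))

print(f"n = {n}, f1 = {f1}, f2 = {f2}")
print(f"Chao estimate:      {N_chao:.1f}")
print(f"Zelterman estimate: {N_zelterman:.1f}")
```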
Abstract:
Education and ethnicity cannot be discussed without taking language into account. This paper will argue that any discussion of ethnic minorities cannot ignore the question of language, nor can any discussion of human rights ignore the question of language rights. Unfortunately, in today's globalised world, governments and minorities are faced with conflicting pressures: on the one hand, for the development and use of education in a global/international language; on the other, for the use and development of mother-tongue, local or indigenous languages in education. Language complexity and ethnic plurality were largely brought about by the creation of nation-states, which were spread around the world through European colonialism. European languages and formal education systems were used as a means of political and economic control. The legacy left by the colonial powers has complicated ethnic relations and has frequently led to conflict. While there is now greater recognition of the importance of language both for economic and educational development and for human rights, the forces of globalisation are leading towards uniformity in the languages used, in culture and even in education. They are working against the development of language rights for smaller groups. We are witnessing a sharp decline in the number of languages spoken. Only those languages which are numerically, economically and politically strong are likely to survive. As a result, many linguistic and ethnic groups are in danger of being further marginalised. This paper will illustrate this thesis both historically and from several contemporary societies, showing how certain policies have exacerbated ethnic conflict while others seek to promote harmony and reconciliation. Why this should be so will be explored. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Two experiments investigated the effects of active processing of risk information on participants' understanding and judgments. It was hypothesized that more active processing would lead to better understanding and to differences in affective judgments (e.g. increased satisfaction and reduced perceived risk to health). In both experiments participants were given a written scenario in which they were prescribed a fictitious medication, said to cause side effects in 2% of the people who took it. Before answering a series of written questions, participants in the active conditions of both experiments were asked to carry out a reflective task (portraying the size of the risk on a bar chart in Experiment 1 and answering a reflective question in Experiment 2). The results showed that active participants rated the likelihood of experiencing possible side effects significantly lower than passive participants (Experiment 1), and that active participants were significantly more satisfied with the information and judged the perceived risk to health from taking the medication significantly lower than passive participants (Experiment 2). In both experiments, active participants were significantly more accurate in their probability and frequency estimates. The studies demonstrate that active processing of risk information leads to improved understanding of the information given. This has important implications for risk communication: in the context of health, better understanding should lead to improved decision-making and health outcomes. Copyright (C) 2004 John Wiley & Sons, Ltd.
Abstract:
This paper discusses RFID implants for identification via a sensor network, brain–computer implants linked into a wireless network, and biometric identification via body sensors. The use of a network as a means of remote monitoring of humans opens up a range of potential uses. Where implanted identification is concerned, this immediately offers high-security access to specific areas by means of an RFID device alone. If a neural implant is employed, then clearly the information exchanged with a network can take on a much richer form, allowing for identification and response to an individual's needs based on the signals apparent on their nervous system.
Abstract:
The recent poor performance of the equity market in the UK has meant that real estate is increasingly being seen as an attractive addition to the mixed-asset portfolio. However, determining whether the good returns enjoyed by real estate are a temporary or a long-term phenomenon is a question that remains largely unanswered. In other words, there is little or no evidence to indicate whether real estate should play a consistent role in the mixed-asset portfolio over short- and long-term investment horizons. Consistency in this context refers to the ability of an asset to maintain a positive allocation in an efficient portfolio over different holding periods. Such consistency is a desirable trait for any investment, but it takes on particular significance when real estate is considered, as the asset class is generally perceived to be a long-term investment because of its illiquidity. From an institutional investor's perspective, it is therefore crucial to determine whether real estate can reasonably be expected to maintain a consistent allocation in the mixed-asset portfolio in both the short and the long run, and at what percentage. To address the question of consistency, the allocation of real estate in the mixed-asset portfolio was calculated over holding periods ranging from 5 to 25 years.
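As a loose illustration of the consistency notion described above (not the paper's data, asset set, or optimiser), the sketch below computes long-only minimum-variance weights from invented annual returns over holding periods of 5 to 25 years and reports whether real estate keeps a positive allocation at each horizon.

```python
# Hedged illustration of "consistency": does real estate keep a positive weight
# in a long-only minimum-variance portfolio as the holding period lengthens?
# All return and covariance figures are invented for the sketch.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# 25 years of hypothetical annual returns for equities, bonds and real estate
returns = rng.multivariate_normal(
    mean=[0.08, 0.05, 0.07],
    cov=[[0.030, 0.004, 0.006],
         [0.004, 0.006, 0.002],
         [0.006, 0.002, 0.012]],
    size=25)
assets = ["equities", "bonds", "real estate"]

def min_variance_weights(R):
    """Long-only minimum-variance weights for a (years x assets) return matrix."""
    cov = np.cov(R, rowvar=False)
    k = cov.shape[0]
    res = minimize(lambda w: w @ cov @ w,
                   x0=np.full(k, 1.0 / k),
                   bounds=[(0.0, 1.0)] * k,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

for horizon in (5, 10, 15, 20, 25):   # holding periods in years
    w = min_variance_weights(returns[:horizon])
    print(f"{horizon:>2}-year horizon:",
          ", ".join(f"{a} {wi:.1%}" for a, wi in zip(assets, w)))
```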