940 results for Veterinary instruments and apparatus


Relevance:

100.00%

Publisher:

Abstract:

Linear alkylbenzenes (LAB), formed by the AlCl3- or HF-catalyzed alkylation of benzene, are common raw materials for surfactant manufacture. Normally they are sulphonated using SO3 or oleum to give the corresponding linear alkylbenzene sulphonates in >95% yield. As concern has grown about the environmental impact of surfactants, questions have been raised about the trace levels of unreacted raw materials, linear alkylbenzenes, and the minor impurities present in them. With the advent of modern analytical instruments and techniques, namely GC/MS, the opportunity has arisen to identify the exact nature of these impurities and to determine the actual levels of them present in commercial linear alkylbenzenes. The object of the proposed study was to separate, identify and quantify major and minor components (1-10%) in commercial linear alkylbenzenes. The focus of this study was on the structure elucidation and determination of impurities and on the qualitative determination of them in all analyzed linear alkylbenzene samples. A gas chromatography/mass spectrometry (GC/MS) study was performed on five samples from the same manufacturer (different production dates), followed by the analyses of ten commercial linear alkylbenzenes from four different suppliers. All the major components, namely linear alkylbenzene isomers, followed the same elution pattern, with the 2-phenyl isomer eluting last. The individual isomers were identified by interpretation of their electron impact and chemical ionization mass spectra. The percent isomer distribution was found to differ from sample to sample. Average molecular weights were calculated using two methods, GC and GC/MS, and compared with the results reported on the Certificates of Analysis (C.O.A.) provided by the manufacturers of commercial linear alkylbenzenes. The GC results in most cases agreed with the reported values, whereas the GC/MS results were significantly lower, by between 0.41 and 3.29 amu.
The minor components, impurities such as branched alkylbenzenes and dialkyltetralins, eluted according to their molecular weights. Their fragmentation patterns were studied using the electron impact ionization mode, and their molecular weight ions were confirmed by a 'soft ionization technique', chemical ionization. The level of impurities present in the analyzed commercial linear alkylbenzenes was expressed as a percentage of the total sample weight, as well as in mg/g. The percentage of impurities was observed to vary between 4.5% and 16.8%, with the highest being in sample "I". Quantitation (mg/g) of impurities such as branched alkylbenzenes and dialkyltetralins was done using cis/trans-1,4,6,7-tetramethyltetralin as an internal standard. Samples were analyzed using a GC/MS system operating under full scan and single ion monitoring data acquisition modes. The latter data acquisition mode, which offers higher sensitivity, was used to analyze all samples under investigation for the presence of linear dialkyltetralins. Dialkyltetralins were reported quantitatively, whereas branched alkylbenzenes were reported semi-qualitatively. The GC/MS method that was developed during the course of this study allowed identification of some other trace impurities present in commercial LABs. Compounds such as non-linear dialkyltetralins, dialkylindanes, diphenylalkanes and alkylnaphthalenes were identified, but their detailed structure elucidation and quantitation were beyond the scope of this study. However, further investigation of these compounds will be the subject of a future study.
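The GC-based average molecular weight mentioned above is an area-weighted mean over the LAB homologue peaks. A minimal sketch of that calculation follows; the homologue masses and relative peak areas are illustrative placeholders, not data from the study.

```python
# Area-weighted average molecular weight of a LAB sample from GC peak
# areas. The homologue list and areas below are invented for illustration.
homologues = {
    "C10-LAB": (218.4, 12.0),   # (molecular weight in amu, relative peak area)
    "C11-LAB": (232.4, 30.0),
    "C12-LAB": (246.4, 33.0),
    "C13-LAB": (260.5, 25.0),
}

total_area = sum(area for _, area in homologues.values())
avg_mw = sum(mw * area for mw, area in homologues.values()) / total_area
print(f"area-weighted average MW: {avg_mw:.2f} amu")
```

The same arithmetic applies to GC/MS data; the two methods differ only in how the peak areas are obtained, which is why their averages can disagree by a few amu.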

Relevance:

100.00%

Publisher:

Abstract:

It is well known that standard asymptotic theory is not valid, or is extremely unreliable, in models with identification problems or weak instruments [Dufour (1997, Econometrica), Staiger and Stock (1997, Econometrica), Wang and Zivot (1998, Econometrica), Stock and Wright (2000, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. One possible way out consists in using a variant of the Anderson-Rubin (1949, Ann. Math. Stat.) procedure. The latter, however, allows one to build exact tests and confidence sets only for the full vector of the coefficients of the endogenous explanatory variables in a structural equation, and in general does not yield inference on individual coefficients. This problem may in principle be overcome by using projection techniques [Dufour (1997, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. AR-type procedures are emphasized because they are robust to both weak instruments and instrument exclusion. However, these techniques could previously be implemented only by using costly numerical methods. In this paper, we provide a complete analytic solution to the problem of building projection-based confidence sets from Anderson-Rubin-type confidence sets. The solution involves the geometric properties of "quadrics" and can be viewed as an extension of usual confidence intervals and ellipsoids. Only least squares techniques are required for building the confidence intervals. We also study by simulation how "conservative" projection-based confidence sets are. Finally, we illustrate the proposed methods by applying them to three different examples: the relationship between trade and growth in a cross-section of countries, returns to education, and a study of production functions in the U.S. economy.
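The object being constructed above can be illustrated by brute force: invert the Anderson-Rubin test over a parameter grid, then project the joint confidence set onto one coordinate. This simulation sketch only shows what is computed; the paper's contribution is the analytic, least-squares-only solution via quadrics, which the grid search here does not reproduce.

```python
import numpy as np
from scipy import stats

# Simulated linear IV model with two endogenous regressors.
rng = np.random.default_rng(0)
n, k = 200, 4                                  # observations, instruments

Z = rng.normal(size=(n, k))                    # instruments
Pi = rng.normal(size=(k, 2))                   # first-stage coefficients
V = rng.normal(size=(n, 2))
Y = Z @ Pi + V                                 # endogenous regressors
beta_true = np.array([1.0, -0.5])
u = 0.8 * V[:, 0] + rng.normal(size=n)         # structural error, endogenous
y = Y @ beta_true + u

Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)         # projection onto span(Z)
Mz = np.eye(n) - Pz
crit = stats.f.ppf(0.95, k, n - k)             # exact F critical value

def ar_stat(b):
    """AR statistic: F-test that Z has no explanatory power for y - Y b."""
    e = y - Y @ b
    return (e @ Pz @ e / k) / (e @ Mz @ e / (n - k))

# AR confidence set: grid points the 5% AR test does not reject.
grid = np.linspace(-2.0, 2.0, 81)
accepted = [(b1, b2) for b1 in grid for b2 in grid
            if ar_stat(np.array([b1, b2])) <= crit]

# Projection-based confidence interval for beta1 alone: the range of b1
# over the joint AR confidence set.
b1_vals = [b1 for b1, _ in accepted]
print(f"projection CI for beta1: [{min(b1_vals):.2f}, {max(b1_vals):.2f}]")
```

Because projection takes the extremes of the joint set, the resulting interval is conservative, which is exactly the property the paper studies by simulation.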

Relevance:

100.00%

Publisher:

Abstract:

The article sets out the concept of a State-to-State human transfer agreement, of which extradition and deportation are specialised forms. Asylum sharing agreements are other variations, which the article explores in more detail. Human transfer agreements always affect at least the right to liberty and the freedom of movement, but other rights will also be at issue to some extent. The article shows how human rights obligations limit State discretion in asylum sharing agreements and considers how past and present asylum sharing arrangements in Europe and North America deal with these limits, if at all. The article suggests changes in the way asylum sharing agreements are drafted: for example, providing for a treaty committee would allow existing agreements to conform better to international human rights instruments and would facilitate State compliance with their human rights obligations.

Relevance:

100.00%

Publisher:

Abstract:

The past decade has seen growing interest in the econometric literature in the problems posed by weak instrumental variables, that is, situations where the instruments are only weakly correlated with the variable to be instrumented. Indeed, it is well known that when instruments are weak, the distributions of the Student, Wald, likelihood ratio and Lagrange multiplier statistics are no longer standard and often depend on nuisance parameters. Several empirical studies, notably on models of returns to education [Angrist and Krueger (1991, 1995), Angrist et al. (1999), Bound et al. (1995), Dufour and Taamouti (2007)] and of asset pricing (C-CAPM) [Hansen and Singleton (1982, 1983), Stock and Wright (2000)], in which the instruments are weakly correlated with the variable to be instrumented, have shown that the use of these statistics often leads to unreliable results. One remedy for this problem is the use of identification-robust tests [Anderson and Rubin (1949), Moreira (2002), Kleibergen (2003), Dufour and Taamouti (2007)]. However, there is no econometric literature on the quality of identification-robust procedures when the available instruments are endogenous, or both endogenous and weak. This raises the question of what happens to identification-robust inference procedures when some instrumental variables assumed to be exogenous are not in fact exogenous. More precisely, what happens if an invalid instrument is added to a set of valid instruments? Do these procedures behave differently? And if the endogeneity of the instruments poses major difficulties for statistical inference, can one propose test procedures that select instruments when they are both strong and valid?
Is it possible to propose instrument selection procedures that remain valid even under weak identification? This thesis focuses on structural models (simultaneous equations models) and answers these questions through four essays. The first essay is published in the Journal of Statistical Planning and Inference 138 (2008), 2649-2661. In this essay, we analyze the effects of instrument endogeneity on two identification-robust test statistics, the Anderson and Rubin statistic (AR, 1949) and the Kleibergen statistic (K, 2003), with or without weak instruments. First, when the parameter controlling instrument endogeneity is fixed (does not depend on the sample size), we show that all these procedures are in general consistent against the presence of invalid instruments (that is, they detect the presence of invalid instruments) regardless of instrument quality (strong or weak). We also describe cases where this consistency may fail, but where the asymptotic distribution is modified in a way that could lead to size distortions even in large samples. This includes, in particular, cases where the two-stage least squares estimator remains consistent but the tests are asymptotically invalid. Next, when the instruments are locally exogenous (that is, the endogeneity parameter converges to zero as the sample size increases), we show that these tests converge to noncentral chi-square distributions, whether the instruments are strong or weak. We also characterize the situations in which the noncentrality parameter is zero and the asymptotic distribution of the statistics remains the same as in the case of valid instruments (despite the presence of invalid instruments).
The second essay studies the impact of weak instruments on Durbin-Wu-Hausman (DWH) specification tests and on the Revankar and Hartley (1973) test. We provide a finite-sample and large-sample analysis of the distribution of these tests under the null hypothesis (size) and the alternative (power), including cases where identification is deficient or weak (weak instruments). Our finite-sample analysis provides several new insights as well as extensions of earlier procedures. Indeed, the characterization of the finite-sample distribution of these statistics allows the construction of exact Monte Carlo tests of exogeneity even with non-Gaussian errors. We show that these tests are typically robust to weak instruments (their size is controlled). Moreover, we provide a characterization of the power of the tests that clearly exhibits the factors determining power. We show that the tests have no power when all the instruments are weak [as in Guggenberger (2008)]. However, power exists as long as at least one instrument is strong. The conclusion of Guggenberger (2008) concerns the case where all instruments are weak (a case of minor practical interest). Our asymptotic theory under weakened assumptions confirms the finite-sample theory. Furthermore, we present a Monte Carlo analysis indicating that: (1) the ordinary least squares estimator is more efficient than two-stage least squares when the instruments are weak and the endogeneity is moderate [a conclusion similar to that of Kiviet and Niemczyk (2007)]; (2) pre-test estimators based on exogeneity tests perform excellently relative to two-stage least squares. This suggests that the instrumental variables method should only be applied when one is confident of having strong instruments.
Thus, the conclusions of Guggenberger (2008) are mixed and could be misleading. We illustrate our theoretical results through simulation experiments and two empirical applications: the relationship between trade openness and economic growth, and the well-known problem of returns to education. The third essay extends the Wald-type exogeneity test proposed by Dufour (1987) to the case where the regression errors have a non-normal distribution. We propose a new version of this test that is valid even in the presence of non-Gaussian errors. Unlike the usual exogeneity test procedures (the Durbin-Wu-Hausman and Revankar-Hartley tests), the Wald test makes it possible to address a problem common in empirical work, namely testing the partial exogeneity of a subset of variables. We propose two new pre-test estimators based on the Wald test that perform better (in terms of mean squared error) than the usual IV estimator when the instruments are weak and the endogeneity is moderate. We also show that this test can serve as an instrument selection procedure. We illustrate the theoretical results with two empirical applications: the well-known wage equation model [Angrist and Krueger (1991, 1999)] and returns to scale [Nerlove (1963)]. Our results suggest that a mother's education explains her son's dropping out of school, that output is an endogenous variable in the estimation of the firm's cost, and that the price of fuel is a valid instrument for output. The fourth essay addresses two very important problems in the econometric literature. First, although the original or extended Wald test makes it possible to build confidence regions and to test linear restrictions on covariances, it assumes that the parameters of the model are identified.
When identification is weak (instruments weakly correlated with the variable to be instrumented), this test is in general no longer valid. This essay develops an identification-robust (weak-instrument-robust) inference procedure for building confidence regions for the covariance matrix between the regression errors and the (possibly endogenous) explanatory variables. We provide analytic expressions for the confidence regions and characterize the necessary and sufficient conditions under which they are bounded. The proposed procedure remains valid even in small samples, and it is also asymptotically robust to heteroskedasticity and autocorrelation of the errors. The results are then used to develop identification-robust tests of partial exogeneity. Monte Carlo simulations indicate that these tests control their size and have power even when the instruments are weak. This allows us to propose a valid instrument selection procedure even in the presence of an identification problem. The instrument selection procedure is based on two new pre-test estimators that combine the usual IV estimator and partial IV estimators. Our simulations show that: (1) like the ordinary least squares estimator, the partial IV estimators are more efficient than the usual IV estimator when the instruments are weak and the endogeneity is moderate; (2) the pre-test estimators overall perform excellently compared with the usual IV estimator. We illustrate our theoretical results with two empirical applications: the relationship between trade openness and economic growth, and the returns-to-education model.
In the first application, earlier studies concluded that the instruments were not too weak [Dufour and Taamouti (2007)], whereas they are very weak in the second [Bound (1995), Doko and Dufour (2009)]. Consistent with our theoretical results, we find unbounded confidence regions for the covariance in the case where the instruments are quite weak.
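The Durbin-Wu-Hausman exogeneity test studied in the second essay has a standard regression-based form: add the first-stage residual to the structural equation and t-test its coefficient. The sketch below illustrates that mechanic on simulated data; the degree of endogeneity (0.6) and all other numbers are assumptions of the illustration, not results from the thesis.

```python
import numpy as np
from scipy import stats

# Simulated structural model with one possibly endogenous regressor.
rng = np.random.default_rng(1)
n = 500

z = rng.normal(size=(n, 2))                    # instruments
v = rng.normal(size=n)                         # first-stage error
x = z @ np.array([1.0, 0.5]) + v               # strong first stage
u = 0.6 * v + rng.normal(size=n)               # corr(u, v) != 0 => x endogenous
y = 2.0 + 1.5 * x + u

# First stage: residual of x on the instruments (with constant).
Z1 = np.column_stack([np.ones(n), z])
v_hat = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]

# Augmented structural regression; exogeneity <=> coefficient on v_hat = 0.
X = np.column_stack([np.ones(n), x, v_hat])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ coef
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
t_vhat = coef[2] / np.sqrt(cov[2, 2])
p_value = 2 * stats.t.sf(abs(t_vhat), n - X.shape[1])
print(f"DWH t = {t_vhat:.2f}, p = {p_value:.2g}")
```

With strong instruments, as here, the test detects the endogeneity; the thesis's point is precisely that this behaviour degrades when the instruments are weak or themselves invalid.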

Relevance:

100.00%

Publisher:

Abstract:

In this thesis certain important aspects of heavy metal toxicity have been worked out. Recent studies have clearly shown that when experimental media contain more than one heavy metal, the metals can conspicuously influence the toxic reaction of the animals both in quantity and in nature. The experimental results available on individual metal toxicity show that, in the majority of cases, unrealistically high concentrations of dissolved metals are involved. A remarkable number of factors have been shown to influence metal toxicity, such as various environmental factors (particularly temperature and salinity), the condition of the organism, and the ability of some marine organisms to adapt to metallic contamination. Further, some of the more sensitive functions, such as embryonic and larval development, growth and fecundity, oxygen utilization and the activity of various enzymes, are demonstrably sensitive in the presence of heavy metals, although some of these functions can be compensated for by adaptive processes. If the presence of a single metal at higher concentrations can affect the life functions of marine animals, more than one metal in the experimental media should manifest such effects on a greater scale. The majority of heavy metal combinations bring about a synergistic reaction, commonly known as synergism or more-than-additive toxicity. The work presented in this thesis comprises the lethal and sublethal toxicities of different salt forms of copper and silver to the brown mussel Perna indica. During the present investigation, sublethal concentrations of copper and silver were studied for their effects on survival, oxygen consumption, filtration, accumulation and depuration in Perna indica. The results are presented under different sections to make the presentation meaningful.
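Mixture toxicity of the kind discussed above is commonly classified with the toxic-unit approach: each metal's concentration in the mixture is divided by its single-metal LC50, and the sum of these toxic units indicates whether the joint action is additive, synergistic or antagonistic. The sketch below uses invented LC50 and mixture values, not data from the thesis.

```python
# Toxic-unit screening of a binary metal mixture. All concentrations are
# illustrative assumptions (mg/L), not measurements from the study.
lc50 = {"Cu": 0.10, "Ag": 0.02}               # single-metal 96-h LC50s
mixture_at_lc50 = {"Cu": 0.03, "Ag": 0.008}   # mixture causing 50% mortality

tu_sum = sum(mixture_at_lc50[m] / lc50[m] for m in lc50)
if tu_sum < 1:
    verdict = "more-than-additive (synergistic)"
elif tu_sum > 1:
    verdict = "less-than-additive (antagonistic)"
else:
    verdict = "strictly additive"
print(f"sum of toxic units = {tu_sum:.2f} -> {verdict}")
```

A sum below one means 50% mortality was reached with less than one "full dose" of combined toxicant, i.e. the metals reinforce each other, which is the synergism the abstract describes.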

Relevance:

100.00%

Publisher:

Abstract:

Swift heavy ion induced changes in the microstructure and surface morphology of vapor-deposited Fe–Ni based metallic glass thin films have been investigated using atomic force microscopy, X-ray diffraction and transmission electron microscopy. Ion beam irradiation was carried out at room temperature with a 103 MeV Au^9+ beam at fluences ranging from 3 × 10^11 to 3 × 10^13 ions/cm^2. The atomic force microscopy images were subjected to power spectral density analysis and roughness analysis using image analysis software. Clusters were found in the images of the as-deposited samples, indicating that film growth is dominated by the island growth mode. The as-deposited films were amorphous as evidenced by X-ray diffraction; however, high resolution transmission electron microscopy revealed short range atomic order in the samples, with crystallites of around 3 nm embedded in an amorphous matrix. The X-ray diffraction patterns of the as-deposited films after irradiation do not show any appreciable changes, indicating that the passage of swift heavy ions stabilizes the short range atomic ordering or even creates further amorphization. The crystallinity of the as-deposited Fe–Ni based films was improved by thermal annealing, and diffraction results indicated that ion beam irradiation of the annealed samples results in grain fragmentation. On bombarding the annealed films, the surface roughness initially decreased and then, at higher fluences, increased. The observed change in surface morphology of the irradiated films is attributed to the interplay between ion-induced sputtering, volume diffusion and surface diffusion.
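The power spectral density and roughness analysis mentioned above amounts to an FFT of the AFM height map followed by a radial average over spatial frequency. A self-contained sketch on a synthetic Gaussian height map (not a measured image) is:

```python
import numpy as np

# Synthetic AFM-like height map; scale and pixel size are illustrative.
rng = np.random.default_rng(2)
N, pixel = 256, 2.0                          # image size (px), pixel size (nm)
h = rng.normal(scale=1.5, size=(N, N))       # heights in nm

rms = np.sqrt(np.mean((h - h.mean()) ** 2))  # RMS roughness

# 2-D power spectral density, then radial average over isotropic frequency.
H = np.fft.fftshift(np.fft.fft2(h - h.mean()))
psd2d = (np.abs(H) ** 2) / (N * N)
f = np.fft.fftshift(np.fft.fftfreq(N, d=pixel))
fy, fx = np.meshgrid(f, f, indexing="ij")
fr = np.hypot(fx, fy)                        # radial spatial frequency

bins = np.linspace(0.0, fr.max(), 60)
idx = np.digitize(fr.ravel(), bins)
counts = np.bincount(idx, minlength=bins.size + 1)
sums = np.bincount(idx, weights=psd2d.ravel(), minlength=bins.size + 1)
radial_psd = sums[1:bins.size] / np.maximum(counts[1:bins.size], 1)

print(f"RMS roughness: {rms:.2f} nm, radial PSD bins: {radial_psd.size}")
```

The slope and knee of the radial PSD are what distinguish growth modes (e.g. island growth) and what shift when sputtering and diffusion compete during irradiation.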

Relevance:

100.00%

Publisher:

Abstract:

We have investigated the effects of swift heavy ion irradiation on thermally evaporated, 44 nm thick, amorphous Co77Fe23 thin films on silicon substrates using 100 MeV Ag^7+ ions at fluences of 1 × 10^11, 1 × 10^12, 1 × 10^13 and 3 × 10^13 ions/cm^2. The structural modifications upon swift heavy ion irradiation were investigated using glancing angle X-ray diffraction. The surface morphological evolution of the films with irradiation was studied using atomic force microscopy. Power spectral density analysis was used to correlate the roughness variation with the structural modifications observed by X-ray diffraction. Magnetic measurements were carried out using vibrating sample magnetometry, and the observed variation in the coercivity of the irradiated films is explained on the basis of stress relaxation. Magnetic force microscopy images were analyzed using the scanning probe image processor software; these results are in agreement with those obtained by vibrating sample magnetometry. The magnetic and structural properties are correlated.
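The coercivity variation reported from the VSM loops is read off as the field at which the magnetization branch crosses zero. A minimal sketch of that extraction on a synthetic tanh-shaped branch (with an assumed coercivity of 50 Oe, not a value from the study) is:

```python
import numpy as np

# Synthetic ascending hysteresis branch: M/Ms = tanh((H - Hc) / width).
Hc_true, width = 50.0, 40.0
H = np.linspace(-400.0, 400.0, 801)              # applied field (Oe)
M = np.tanh((H - Hc_true) / width)               # normalized magnetization

# Bracket the zero crossing and linearly interpolate H where M = 0.
i = np.flatnonzero(np.diff(np.sign(M)) > 0)[0]
Hc_est = H[i] - M[i] * (H[i + 1] - H[i]) / (M[i + 1] - M[i])
print(f"estimated coercivity: {Hc_est:.1f} Oe")
```

On real VSM data the same interpolation is applied to both branches and the two crossings averaged; a fluence-dependent shift of this number is what the stress-relaxation argument explains.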

Relevance:

100.00%

Publisher:

Abstract:

This dissertation presents a study of changes in the governance of higher education in Vietnam. The central aim of the research is to investigate the origin of, and changes in, the power relationship between the Vietnamese state and the higher education institutions (HEIs), which results mainly from the interaction of these two actors. The power of the two actors is socially constructed and is determined chiefly by their usefulness and their contributions to higher education. This work focuses in particular on the aspect of teaching quality. The study adopts a general governance perspective to explore the relationship between the state and the HEIs, and it uses resource dependence theory (RDT) to examine the behaviour of the HEIs in response to a changing environment characterized by policy shifts and declining funding. Through an empirical investigation of government policy as well as the internal governance and practices of four leading universities, the study concludes that, under the pressure to generate income, Vietnamese universities have developed both strategies and tactics to control resource flows and legitimacy. The decision-making and goal-setting committees, which consist of a majority of academics, are more powerful than the managers; university initiatives therefore largely involve academics. Based on the evolving patterns of resource contributions by academics and students to higher education, the study predicts an emerging governance configuration in which the dimensions of academic self-governance and the competitive market grow stronger and state regulation increases in a rational manner.
The country's current institutional design and administrative system, with their specific weighting and coordination mechanisms, also described as an effective supervisory system between the three key actors (the state, the HEIs/academics and the students), will need a long time to be identified and established. In the current phase of searching for such a system, the government should strengthen management tools such as accreditation, reward-based and market-based instruments, and information-based decision-making. In addition, it is necessary to increase policy transparency and to disclose more information.

Relevance:

100.00%

Publisher:

Abstract:

Constraints to the introduction of enhanced biosecurity systems are rarely considered in sufficient detail when population medicine specialists initiate new control schemes. The main objective of our research was to investigate and compare the different attitudes constraining improvement in biosecurity among cattle and sheep farmers, practising veterinary surgeons and the auxiliary industries in Great Britain (GB). This study was carried out utilizing farmer focus groups, a questionnaire survey of veterinary practitioners and a telephone survey of auxiliary industry representatives. It appears that farmers and veterinarians have their own relatively clear definitions of biosecurity in relation to some major diseases threatening GB agriculture. Overall, farmers believe that other stakeholders, such as the government, should make a greater contribution towards biosecurity within GB. Conversely, veterinary practitioners saw their clients' ability or willingness to invest in biosecurity measures as a major constraint. Veterinary practitioners also felt that additional proof of efficacy was needed and/or that the potential economic benefits of proposed farm biosecurity practices should be better demonstrated. The auxiliary industries, in general, were not certain of their role in biosecurity, although study participants highlighted zoonoses as part of the issue and suggested that most of the constraints operated at farm level. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

In a cross-sectional study of 400 randomly selected smallholder dairy farms in the Tanga and Iringa regions of Tanzania, 14.2% (95% confidence interval (CI) = 11.6-17.3) of cows had developed clinical mastitis during the previous year. The point prevalence of subclinical mastitis, defined as a quarter positive by the California Mastitis Test (CMT) or by bacteriological culture, was 46.2% (95% CI = 43.6-48.8) and 24.3% (95% CI = 22.2-26.6), respectively. In a longitudinal disease study in Iringa, the incidence of clinical mastitis was 31.7 cases per 100 cow-years. A randomised intervention trial indicated that intramammary antibiotics significantly reduced the proportion of bacteriologically positive quarters in the short term (14 days post-infusion), but teat dipping had no detectable effect on bacteriological infection or CMT-positive quarters. Other risk and protective factors were identified from both the cross-sectional and longitudinal studies: these included Boran breeding as a risk factor (odds ratio (OR) = 3.40, 95% CI = 1.00-11.57, P < 0.05 for clinical mastitis, and OR = 3.51, 95% CI = 1.29-9.55, P < 0.01 for a CMT-positive quarter), while the practice of residual calf suckling was protective for a bacteriologically positive quarter (OR = 0.63, 95% CI = 0.48-0.81, P <= 0.001) and for a CMT-positive quarter (OR = 0.69, 95% CI = 0.63-0.75, P < 0.001). A mastitis training course for farmers and extension officers was held, and the knowledge gained and the use of different methods of dissemination were assessed over time. In a subsequent randomised controlled trial, there were strong associations between knowledge gained and both the individual question asked and the combination of dissemination methods (village meeting, video and handout) used.
This study demonstrated that both clinical and subclinical mastitis are common in smallholder dairying in Tanzania, and that some of the risk and protective factors for mastitis can be addressed by practical management of dairy cows following effective knowledge transfer. (c) 2006 Elsevier B.V. All rights reserved.
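Odds ratios with 95% confidence intervals, like those quoted above, come from a standard 2x2 contingency calculation (the Woolf logit method for the interval). A sketch with invented counts, not the study's data, is:

```python
import math

# 2x2 table (counts are illustrative placeholders):
#            cases   non-cases
# exposed      a        b
# unexposed    c        d
a, b = 12, 38
c, d = 20, 330

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # Woolf standard error
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```

An interval whose lower bound sits at or near 1.00, as for the clinical-mastitis OR of 3.40 above, is why the associated P-value hovers at the 0.05 boundary.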

Relevance:

100.00%

Publisher:

Abstract:

Biological emergencies such as the appearance of an exotic transboundary or emerging disease can become disasters. The question that faces Veterinary Services in developing countries is how to balance resources dedicated to active insurance measures, such as border control, surveillance, working with the governments of developing countries, and investing in improving veterinary knowledge and tools, with passive measures, such as contingency funds and vaccine banks. There is strong evidence that the animal health situation in developed countries has improved and is relatively stable. In addition, through trade with other countries, developing countries are becoming part of the international animal health system, the status of which is improving, though with occasional setbacks. However, despite these improvements, the risk of a possible biological disaster still remains, and has increased in recent times because of the threat of bioterrorism. This paper suggests that a model that combines decision tree analysis with epidemiology is required to identify critical points in food chains that should be strengthened to reduce the risk of emergencies and prevent emergencies from becoming disasters.
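The decision-tree analysis proposed above can be illustrated with a two-option expected-cost comparison: spend more up front on active insurance measures, or rely on cheaper passive measures and accept a higher chance that an incursion becomes a disaster. All probabilities and costs below are invented for illustration.

```python
# Expected annual cost of each strategy, decision-tree style.
p_outbreak = 0.05                  # assumed annual probability of incursion
cost_disaster = 200e6              # assumed cost if the outbreak becomes a disaster

options = {
    "active":  {"upfront": 4e6, "p_disaster_given_outbreak": 0.10},
    "passive": {"upfront": 1e6, "p_disaster_given_outbreak": 0.60},
}

expected = {}
for name, o in options.items():
    expected[name] = (o["upfront"]
                      + p_outbreak * o["p_disaster_given_outbreak"] * cost_disaster)
    print(f"{name}: expected annual cost = {expected[name] / 1e6:.1f} M")
```

Coupling such a tree with epidemiological estimates of the branch probabilities is what lets the critical points in the food chain be ranked by how much they reduce expected loss.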

Relevance:

100.00%

Publisher:

Abstract:

A study was conducted to estimate variation among laboratories and between manual and automated techniques of measuring pressure on the resulting gas production profiles (GPP). Eight feeds (molassed sugarbeet feed, grass silage, maize silage, soyabean hulls, maize gluten feed, whole crop wheat silage, wheat, glucose) were milled to pass a 1 mm screen and sent to three laboratories (ADAS Nutritional Sciences Research Unit, UK; Institute of Grassland and Environmental Research (IGER), UK; Wageningen University, The Netherlands). Each laboratory measured GPP over 144 h using standardised procedures with manual pressure transducers (MPT) and automated pressure systems (APS). The APS at ADAS used a pressure transducer and bottles in a shaking water bath, while the APS at Wageningen and IGER used a pressure sensor and bottles held in a stationary rack. Apparent dry matter degradability (ADDM) was estimated at the end of the incubation. GPP were fitted to a modified Michaelis-Menten model assuming a single phase of gas production, and were described in terms of the asymptotic volume of gas produced (A), the time to half A (B), the time of maximum gas production rate (t_RM gas) and the maximum gas production rate (R_M gas). There were effects (P<0.001) of substrate on all parameters. However, MPT produced more (P<0.001) gas, but with longer (P<0.001) B and t_RM gas (P<0.05) and lower (P<0.001) R_M gas, compared to APS. There was no difference between apparatus in ADDM estimates. Interactions occurred between substrate and apparatus, substrate and laboratory, and laboratory and apparatus. However, when mean values for MPT were regressed against values from the individual laboratories, relationships were good (i.e., adjusted R^2 = 0.827 or higher). Good relationships were also observed with APS, although they were weaker than for MPT (i.e., adjusted R^2 = 0.723 or higher). The relationships between mean MPT and mean APS data were also good (i.e., adjusted R^2 = 0.844 or higher).
Data suggest that, although laboratory and method of measuring pressure are sources of variation in GPP estimation, it should be possible, using appropriate mathematical models, to standardise data among laboratories so that data from one laboratory could be extrapolated to others. This would allow development of a database of GPP data from many diverse feeds. (c) 2005 Published by Elsevier B.V.
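The curve fit described above can be sketched with one common parameterisation of the modified Michaelis-Menten (Groot-type) model, G(t) = A t^C / (B^C + t^C), where A is the asymptotic gas volume and B the time to half A; whether the ring test used exactly this parameterisation is an assumption here, and the data points below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def gas(t, A, B, C):
    """Single-phase modified Michaelis-Menten gas production curve."""
    return A * t**C / (B**C + t**C)

# Synthetic 144-h profile: true (A, B, C) = (220, 18, 1.6) plus small noise.
t = np.array([2, 4, 8, 12, 24, 48, 72, 96, 144], dtype=float)        # h
noise = np.array([1.2, -0.8, 2.0, -1.5, 0.9, -2.1, 1.4, 0.3, -0.6])  # ml
G = gas(t, 220.0, 18.0, 1.6) + noise

popt, _ = curve_fit(gas, t, G, p0=(200.0, 20.0, 1.5))
A, B, C = popt
# For C > 1 the maximum rate occurs at the inflection point:
t_rm = B * ((C - 1) / (C + 1)) ** (1 / C)
print(f"A = {A:.1f} ml, B = {B:.1f} h, t_RM = {t_rm:.1f} h")
```

Fitting each laboratory-apparatus combination this way, then regressing the fitted parameters across laboratories, is the standardisation route the abstract proposes.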

Relevance:

100.00%

Publisher:

Abstract:

Classical risk assessment approaches for animal diseases are influenced by the probability of release, the probability of exposure and the consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, potential risks of exposure and infection, both to animals and to people, extend through a chain of economic activities related to producing, buying and selling animals and products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems, and to arrive at effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed so as to produce practical and cost-effective risk management options agreed by the actors and players in those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which seems to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.
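The release-exposure-consequence structure above can be made concrete by chaining conditional probabilities along the value-chain nodes and multiplying by an expected loss. The nodes, probabilities and loss below are invented placeholders for illustration only.

```python
# Risk = P(release) x P(exposure | release) x consequence, decomposed along
# hypothetical value-chain nodes. All numbers are illustrative assumptions.
chain = [
    # (node, P(pathogen passes this node, given it reached it))
    ("import/border",      0.02),
    ("live-animal market", 0.50),
    ("smallholder farm",   0.80),
]
consequence_loss = 5e6      # assumed expected loss if farms become infected

p_chain = 1.0
for node, p in chain:
    p_chain *= p
risk = p_chain * consequence_loss
print(f"P(chain) = {p_chain:.4f}, expected loss = {risk:,.0f}")
```

Recomputing the product after tightening one node's probability shows which link in the chain buys the largest risk reduction per unit of control effort, which is the practical output value chain analysis feeds into the assessment.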