71 results for Center for Disease Control.
Abstract:
Since 2000-2001, dengue virus type 1 has circulated in the Pacific region. However, in 2007, type 4 reemerged and has almost completely displaced the strains of type 1. If only 1 serotype circulates at any time and is replaced approximately every 5 years, DENV-3 may reappear in 2012.
Abstract:
Mycobacterium lentiflavum, a slow-growing nontuberculous mycobacterium, is a rare cause of human disease. It has been isolated from environmental samples worldwide. To assess the clinical significance of M. lentiflavum isolates reported to the Queensland Tuberculosis Control Centre, Australia, during 2001-2008, we explored the genotypic similarity and geographic relationship between isolates from humans and potable water in the Brisbane metropolitan area. A total of 47 isolates from 36 patients were reported; 4 patients had clinically significant disease. M. lentiflavum was cultured from 13 of 206 drinking water sites. These sites overlapped geographically with home addresses of the patients who had clinically significant disease. Automated repetitive sequence-based PCR genotyping showed a dominant environmental clone closely related to clinical strains. This finding suggests potable water as a possible source of M. lentiflavum infection in humans.
Abstract:
Humankind has been dealing with all kinds of disasters since the dawn of time. The risk and impact of disasters producing mass casualties worldwide are increasing, due partly to global warming as well as to population growth, increasing population density and the ageing of the population. China, as a country with a large population, vast territory, and complex climatic and geographical conditions, has been plagued by all kinds of disasters. Disaster health management has traditionally been a relatively arcane discipline within public health. However, SARS, avian influenza, earthquakes and floods, along with the need to be better prepared for the Olympic Games in China, have brought disasters, their management and their potential for large-scale health consequences for populations to the attention of the public, the government and the international community alike. As a result, significant improvements were made to the disaster management policy framework, as well as changes to systems and structures to incorporate an improved disaster management focus. This involved the upgrade of the Centres for Disease Control and Prevention (CDC) throughout China to monitor and better control health consequences, particularly of infectious disease outbreaks. However, as could be seen in the Southern China Snow Storm and the Wenchuan Earthquake in 2008, there remains a lack of integrated disaster management and efficient medical rescue, which has been costly in terms of economics and health for China. In the context of a very large and complex country, there is a need to better understand whether these changes have resulted in effective management of the health impacts of such incidents. To date, the health consequences of disasters, particularly in China, have not been a major focus of study. The main aim of this study is to analyse and evaluate disaster health management policy in China and, in particular, its ability to effectively manage the health consequences of disasters.
Flood has been selected for this study as it is a common and significant disaster type in China and throughout the world. This information will then be used to guide conceptual understanding of the health consequences of floods. A secondary aim of the study is to compare disaster health management in China and Australia, as these countries differ in their length of experience in having a formalised policy response. The final aim of the study is to determine the extent to which Walt and Gilson's (1994) model of policy explains how disaster management policy in China was developed and implemented from the aftermath of SARS in 2003 to the present day. This study has utilised a case study methodology. A document analysis and literature search of Chinese and English sources were undertaken to analyse and produce a chronology of disaster health management policy in China. Additionally, three detailed case studies of flood health management in China were undertaken, along with three case studies in Australia, in order to examine the policy response and any health consequences stemming from the floods. A total of 30 key international disaster health management experts were surveyed to identify fundamental elements and principles of a successful policy framework for disaster health management. Key policy ingredients were identified from the literature, the case studies and the survey of experts. Walt and Gilson's (1994) policy model, which focuses on the actors, content, context and process of policy, was found to be a useful model for analysing disaster health management policy development and implementation in China. This thesis is divided into four parts. Part 1 is a brief overview of the issues and context to set the scene. Part 2 examines the conceptual and operational context, including the international literature, government documents and the operational environment for disaster health management in China. Part 3 examines primary sources of information to inform the analysis.
This involves two key studies:
• A comparative analysis of the management of floods in China and Australia
• A survey of international experts in the field of disaster management, so as to inform the evaluation of the policy framework in existence in China and the criteria upon which the expression of that policy could be evaluated.
Part 4 describes the key outcomes of this research, which include:
• A conceptual framework for describing the health consequences of floods
• A conceptual framework for disaster health management
• An evaluation of the disaster health management policy and its implementation in China.
The research outcomes clearly identified that the most significant improvements are to be derived from improvements in the generic management of disasters, rather than the health aspects alone. Thus, the key findings and recommendations tend to focus on generic issues. The key findings of this research include the following:
• The health consequences of floods may be described in terms of time as 'immediate', 'medium term' and 'long term', and also in relation to causation as 'direct' and 'indirect' consequences of the flood. These two aspects form a matrix which in turn guides management responses.
• Disaster health management in China requires a more comprehensive response throughout the cycle of prevention, preparedness, response and recovery, but it also requires a more concentrated effort on policy implementation to ensure the translation of the policy framework into effective incident management.
• The policy framework in China is largely of international standard, with a sound legislative base. In addition, the development of the Centres for Disease Control and Prevention has provided the basis for a systematic approach to health consequence management. However, the key weaknesses in the current system include:
  o The lack of a key central structure to provide the infrastructure with vital support for policy development, implementation and evaluation.
  o The lack of well-prepared local response teams similar to local government based volunteer groups in Australia.
• The system lacks structures to coordinate government action at the local level. The result is a poorly coordinated local response and a lack of clarity regarding the point at which escalation of the response to higher levels of government is advisable. These result in higher levels of risk and negative health impacts.
The key recommendations arising from this study are:
1. Disaster health management policy in China should be enhanced by incorporating disaster management considerations into policy development, and by requiring a disaster management risk analysis and disaster management impact statement for development proposals.
2. China should transform existing organisations to establish a central organisation similar to the Federal Emergency Management Agency (FEMA) in the USA or Emergency Management Australia (EMA) in Australia. This organisation would be responsible for leading nationwide preparedness through planning, standards development, education and incident evaluation, and for providing operational support to national and local government bodies in the event of a major incident.
3. China should review national and local plans to reflect consistency in planning, and to emphasise the advantages of the integrated planning process.
4. China should enhance community resilience through community education and the development of a local volunteer organisation. It should develop a national strategy which sets direction and standards in regard to education and training, and which requires system testing through exercises. Other initiatives may include the development of a local volunteer capability with appropriate training to assist professional response agencies, such as police and fire services, in a major incident. An existing organisation such as the Communist Party may be an appropriate structure to provide this response in a cost-effective manner.
5. China should continue development of professional emergency services, particularly ambulance services, to ensure an effective infrastructure is in place to support the emergency response in disasters.
6. Funding for disaster health management should be enhanced, not only from government but also from other sources such as donations and insurance. A more transparent mechanism is necessary to ensure that funding is disseminated according to the needs of the people affected.
7. Emphasis should be placed on prevention and preparedness, especially on effective disaster warnings.
8. China should develop local disaster health management infrastructure utilising existing resources wherever possible. Strategies for enhancing local infrastructure could include the identification of local resources (including military resources) which could be made available to support disaster responses, together with operational procedures to access those resources.
Implementation of these recommendations should better position China to reduce the significant health consequences experienced each year from major incidents such as floods, and to provide an increased level of confidence to the community about the country's capacity to manage such events.
Abstract:
At the beginning of the pandemic (H1N1) 2009 outbreak, we estimated the potential surge in demand for hospital-based services in 4 Health Service Districts of Queensland, Australia, using the FluSurge model. Modifications to the model were made on the basis of emergent evidence and results provided to local hospitals to inform resource planning for the forthcoming pandemic. To evaluate the fit of the model, a comparison between the model's predictions and actual hospitalizations was made. In early 2010, a Web-based survey was undertaken to evaluate the model's usefulness. Predictions based on modified assumptions arising from the new pandemic gained better fit than results from the default model. The survey identified that the modeling support was helpful and useful to service planning for local hospitals. Our research illustrates an integrated framework involving post hoc comparison and evaluation for implementing epidemiologic modeling in response to a public health emergency.
Abstract:
Maize streak virus (MSV; Genus Mastrevirus, Family Geminiviridae) occurs throughout Africa, where it causes what is probably the most serious viral crop disease on the continent. It is obligately transmitted by as many as six leafhopper species in the Genus Cicadulina, but mainly by C. mbila Naudé and C. storeyi. In addition to maize, it can infect over 80 other species in the Family Poaceae. Whereas 11 strains of MSV are currently known, only the MSV-A strain is known to cause economically significant streak disease in maize. Severe maize streak disease (MSD) manifests as pronounced, continuous parallel chlorotic streaks on leaves, with severe stunting of the affected plant and, usually, a failure to produce complete cobs or seed. Natural resistance to MSV in maize, and/or maize infections caused by non-maize-adapted MSV strains, can result in narrow, interrupted streaks and no obvious yield losses. MSV epidemiology is primarily governed by environmental influences on its vector species, resulting in erratic epidemics every 3-10 years. Even in epidemic years, disease incidences can vary from a few infected plants per field, with little associated yield loss, to 100% infection rates and complete yield loss. Taxonomy: The only virus species known to cause MSD is MSV, the type member of the Genus Mastrevirus in the Family Geminiviridae. In addition to the MSV-A strain, which causes the most severe form of streak disease in maize, 10 other MSV strains (MSV-B to MSV-K) are known to infect barley, wheat, oats, rye, sugarcane, millet and many wild, mostly annual, grass species. Seven other mastrevirus species, many with host and geographical ranges partially overlapping those of MSV, appear to infect primarily perennial grasses.
Physical properties: MSV and all related grass mastreviruses have single-component, circular, single-stranded DNA genomes of approximately 2700 bases, encapsidated in 22 × 38-nm geminate particles comprising two incomplete T = 1 icosahedra, with 22 pentameric capsomers composed of a single 32-kDa capsid protein. Particles are generally stable in buffers of pH 4-8. Disease symptoms: In infected maize plants, streak disease initially manifests as minute, pale, circular spots on the lowest exposed portion of the youngest leaves. The only leaves that develop symptoms are those formed after infection, with older leaves remaining healthy. As the disease progresses, newer leaves emerge containing streaks up to several millimetres in length along the leaf veins, with primary veins being less affected than secondary or tertiary veins. The streaks are often fused laterally, appearing as narrow, broken, chlorotic stripes, which may extend over the entire length of severely affected leaves. Lesion colour generally varies from white to yellow, with some virus strains causing red pigmentation on maize leaves and abnormal shoot and flower bunching in grasses. Reduced photosynthesis and increased respiration usually lead to a reduction in leaf length and plant height; thus, maize plants infected at an early stage become severely stunted, producing undersized, misshapen cobs or giving no yield at all. Yield loss in susceptible maize is directly related to the time of infection: Infected seedlings produce no yield or are killed, whereas plants infected at later times are proportionately less affected. Disease control: Disease avoidance can be practised by only planting maize during the early season when viral inoculum loads are lowest. Leafhopper vectors can also be controlled with insecticides such as carbofuran. However, the development and use of streak-resistant cultivars is probably the most effective and economically viable means of preventing streak epidemics. 
Naturally occurring tolerance to MSV (meaning that, although plants become systemically infected, they do not suffer serious yield losses) has been found, which has primarily been attributed to a single gene, msv-1. However, other MSV resistance genes also exist, and improved resistance has been achieved by concentrating these within individual maize genotypes. Whereas true MSV immunity (meaning that plants cannot be symptomatically infected by the virus) has been achieved in lines that include multiple small-effect resistance genes together with msv-1, it has proven difficult to transfer this immunity into commercial maize genotypes. An alternative resistance strategy using genetic engineering is currently being investigated in South Africa. Useful websites: 〈http://www.mcb.uct.ac.za/MSV/mastrevirus.htm〉; 〈http://www.danforthcenter.org/iltab/geminiviridae/geminiaccess/mastrevirus/Mastrevirus.htm〉. © 2009 Blackwell Publishing Ltd.
Abstract:
Alcohol-related driving is a longstanding, serious problem in China (Li, Xie, Nie, & Zhang, 2012). On 1 May 2011, a national law criminalizing drunk driving came into force, imposing serious penalties, including jail, for driving with a blood alcohol level above 80 mg/100 ml. This pilot study, undertaken a year after the introduction of the law, sought traffic police officers' perceptions of drink driving and the practice of breath alcohol testing (BAT) in a large city in Guangdong Province, southern China. A questionnaire survey and semi-structured interviews were used to gain an in-depth understanding of issues relevant to alcohol-related driving. Fifty-five traffic police officers were recruited for the survey, and six traffic police officers with a variety of working experience, including roadside alcohol breath testing, traffic crash investigation and police resourcing, were interviewed individually. The officers were recruited by the first author with the assistance of staff from the Guangdong Institute of Public Health, Centre for Disease Control and Prevention (CDC). Interview participants reported three primary reasons why people drink and drive: 1) being prepared to take the chance of not being apprehended by police; 2) the strong traditional Chinese drinking culture; and 3) insufficient public awareness of the harmfulness of drink driving. Problems associated with the process of breath alcohol testing were described and fit broadly into two categories: resourcing and avoiding detection. It was reported that there were insufficient traffic police officers to conduct routine traffic policing, including alcohol testing. Police BAT equipment was considered sufficient for routine traffic situations but not for highway traffic operations. Local media and posters are used by the Public Security Bureau, which is responsible for education about safe driving, but participants thought that the education campaigns were limited in scope.
Participants also described detection avoidance strategies used by drivers including: changing route; ignoring a police instruction to stop; staying inside the vehicle with windows and doors locked to avoid being tested; intentionally not performing breath tests correctly; and arguing with officers. This pilot study provided important insights from traffic police in one Chinese city which suggest there may be potential unintended effects of introducing more severe penalties including a range of strategies reportedly used by drivers to avoid detection. Recommendations for future research include a larger study to confirm these findings and examine the training and education of drivers; the focus and reach of publicity; and possible resource needs to support police enforcement.
Abstract:
Wolbachia pipientis is an endosymbiotic bacterium present in diverse insect species. Although it is well studied for its dramatic effects on host reproductive biology, little is known about its effects on other aspects of host biology, despite its presence in a wide array of host tissues. This study examined the effects of three Wolbachia strains on two different Drosophila species, using a laboratory performance assay for insect locomotion in response to olfactory cues. The results demonstrate that Wolbachia infection can have significant effects on host responsiveness that vary with respect to the Wolbachia strain-host species combination. The wRi strain, native to Drosophila simulans, increases the basal activity level of the host insect as well as its responsiveness to food cues. In contrast, the wMel strain and the virulent wMelPop strain, native to Drosophila melanogaster, cause slight decreases in responsiveness to food cues but do not alter basal activity levels in the host. Surprisingly, the virulent wMelPop strain has very little impact on host responsiveness in D. simulans. This novel strain-host relationship was artificially created previously by transinfection. These findings have implications for understanding the evolution and spread of Wolbachia infections in wild populations and for Wolbachia-based vector-borne disease control strategies currently being developed.
Abstract:
BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). 
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
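As a sanity check, the phlebitis risk ratio reported above can be approximated directly from the pooled raw counts. A minimal sketch (note: the published RR of 1.14, 95% CI 0.93 to 1.39, comes from the formal meta-analysis, so a crude ratio computed from raw totals differs slightly):

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Crude risk ratio with a Wald (log-scale) confidence interval."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent proportions
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Phlebitis: clinically indicated 186/2365 vs routine (3-day) change 166/2441
rr, lo, hi = risk_ratio(186, 2365, 166, 2441)
```

The crude estimate (RR ≈ 1.16, 95% CI ≈ 0.95 to 1.41) is close to, but not identical with, the pooled meta-analytic figures in the abstract.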
Abstract:
Bayesian networks (BNs) provide a statistical modelling framework which is ideally suited for modelling the many factors and components of complex problems such as healthcare-acquired infections. The methicillin-resistant Staphylococcus aureus (MRSA) organism is particularly troublesome since it is resistant to standard treatments for staphylococcal infections. Overcrowding and understaffing are believed to increase infection transmission rates and also to inhibit the effectiveness of disease control measures. Clearly the mechanisms behind MRSA transmission and containment are very complicated, and control strategies may only be effective when used in combination. BNs are growing in popularity in general and in the medical sciences in particular. A recent Current Contents search of the number of published BN journal articles showed a fivefold increase in general, and a sixfold increase in medical and veterinary science, from 2000 to 2009. This chapter introduces the reader to Bayesian network (BN) modelling and an iterative modelling approach used to build and test the BN created to investigate the possible role of high bed occupancy in transmission of MRSA while simultaneously taking into account other risk factors.
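To illustrate the kind of enumeration that underlies BN inference, the following toy network (the structure and all probabilities are invented for illustration, not figures from the chapter) links bed occupancy and hand hygiene to MRSA transmission, computes the marginal transmission probability, and queries a posterior via Bayes' rule:

```python
# Toy BN:  Occupancy -> Transmission <- HandHygiene
# Prior and conditional probabilities are illustrative assumptions only.
P_occ_high = 0.3   # P(bed occupancy is high)
P_hyg_good = 0.7   # P(hand-hygiene compliance is good)

# P(MRSA transmission | occupancy_high, hygiene_good)
P_trans = {
    (True,  True):  0.05,
    (True,  False): 0.20,
    (False, True):  0.02,
    (False, False): 0.10,
}

def p_transmission():
    """Marginal P(transmission), enumerating all parent states."""
    total = 0.0
    for occ in (True, False):
        for hyg in (True, False):
            p_occ = P_occ_high if occ else 1 - P_occ_high
            p_hyg = P_hyg_good if hyg else 1 - P_hyg_good
            total += p_occ * p_hyg * P_trans[(occ, hyg)]
    return total

def p_high_occ_given_transmission():
    """Posterior P(high occupancy | transmission) by Bayes' rule."""
    joint = P_occ_high * sum(
        (P_hyg_good if hyg else 1 - P_hyg_good) * P_trans[(True, hyg)]
        for hyg in (True, False)
    )
    return joint / p_transmission()
```

Observing transmission raises the probability that occupancy was high (from 0.30 to about 0.48 in this toy example); real BN tools perform the same computation over much larger networks.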
Abstract:
Objective: To evaluate the economic burden of malignant neoplasms in Shandong province in order to provide scientific evidence for policy-making. Methods: The main sources for this study were data from the third sampling survey of causes of death in 2006 and the cancer prevalence survey in 2007 in Shandong province. The direct medical cost was calculated based on the survey data. The indirect costs due to mortality and morbidity were estimated with the human capital approach, based on disability-adjusted life year data derived from the two surveys and gross domestic product (GDP) data. The total economic burden was the sum of the direct medical cost and the indirect cost. The uncertainty analysis was conducted according to the methodology of the global burden of disease study. Results: The estimated total cost of cancer in Shandong province in 2006 was 18 057 million Yuan RMB (95% confidence interval: 16 817 - 19 393 million), which accounted for 0.83% of the total GDP. The direct medical cost, indirect mortality cost and indirect morbidity cost accounted for 17.28%, 78.53% and 4.20% of the total economic burden of malignant neoplasms, respectively. Liver, lung and stomach cancer were the three tumours with the heaviest economic burden, together accounting for more than one half (57.83%) of the total economic burden of all cancers. The uncertainty of the total burden estimate was around ±7%, mainly derived from the uncertainty of the indirect economic burden. Conclusion: The influence of cancers on the social economy is dominated by the loss of productivity, especially productivity loss due to premature death. Liver, lung and stomach cancer are the major cancers for disease control and prevention in Shandong province.
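The component costs implied by the reported shares can be reconstructed by simple arithmetic; a sketch using the figures from the abstract (the shares are rounded, so they sum to 100.01% rather than exactly 100%):

```python
# Reconstruct component costs from the reported total and percentage shares.
total_cost = 18_057  # million Yuan RMB, Shandong province, 2006

shares = {
    "direct medical": 0.1728,
    "indirect (mortality)": 0.7853,
    "indirect (morbidity)": 0.0420,
}

components = {name: total_cost * share for name, share in shares.items()}
# e.g. direct medical cost is roughly 3 120 million Yuan
```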
Abstract:
Objective: To describe the trends of overall mortality and major causes of death in the Shandong population from 1970 to 2005, and to quantitatively estimate the influential factors. Methods: Trends of overall mortality and major causes of death were described using indicators such as mortality rates and age-adjusted death rates, comparing three large-scale mortality surveys in Shandong province. A difference decomposing method was applied to estimate the contributions of demographic and non-demographic factors to the change in mortality. Results: The total mortality changed only slightly from the 1970s but increased from the 1990s. However, both age-adjusted and age-specific mortality rates decreased significantly. The mortality of Group I diseases, including infectious diseases as well as maternal and perinatal diseases, decreased drastically. By contrast, the mortality of non-communicable chronic diseases (NCDs), including cardiovascular diseases (CVDs) and cancer, and of injuries increased. The maintenance of recent overall mortality was caused by the interaction of demographic and non-demographic factors, which worked in opposite directions. Non-demographic factors were responsible for the decrease in Group I diseases and the increase in injuries. With respect to the increase of NCDs as a whole, demographic factors might take the full responsibility, while non-demographic factors were the opposing force reducing mortality. Nevertheless, for some leading NCDs, such as CVDs and cancer, the increase was mainly due to non-demographic rather than demographic factors. Conclusion: Through the interaction of population ageing and the enhancement of non-demographic effects, overall mortality in Shandong will maintain a balance or rise slightly in the coming years. Group I diseases in Shandong have been effectively brought under control. Strategies for disease control and prevention should shift to chronic diseases, especially leading NCDs such as CVDs and cancer.
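The "difference decomposing method" mentioned above is commonly realised as a Kitagawa-style decomposition, which splits the change in a crude death rate into an age-composition (demographic) part and an age-specific-rate (non-demographic) part. A sketch with hypothetical two-age-band data (not figures from the study):

```python
def kitagawa(rates1, rates2, pops1, pops2):
    """Decompose the change in crude death rate between two periods into
    a rate (non-demographic) part and a composition (demographic) part."""
    c1 = [p / sum(pops1) for p in pops1]  # age-group shares, period 1
    c2 = [p / sum(pops2) for p in pops2]  # age-group shares, period 2
    rate_part = sum((a + b) / 2 * (r2 - r1)
                    for a, b, r1, r2 in zip(c1, c2, rates1, rates2))
    comp_part = sum((r1 + r2) / 2 * (b - a)
                    for a, b, r1, r2 in zip(c1, c2, rates1, rates2))
    return rate_part, comp_part

# Hypothetical data: two age bands (young, old), deaths per 1000
rates_1970 = [2.0, 30.0]
rates_2005 = [0.5, 25.0]
pop_1970 = [800, 200]  # younger population structure
pop_2005 = [600, 400]  # aged population structure

rate_part, comp_part = kitagawa(rates_1970, rates_2005, pop_1970, pop_2005)
```

In this toy example age-specific rates fall (negative rate part) while ageing pushes crude mortality up (positive composition part), and the two parts sum exactly to the change in the crude rate, mirroring the opposing forces described in the abstract.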
Abstract:
In early April 1998, the Centre for Disease Control in Darwin was notified of a possible case of dengue which appeared to have been acquired in the Northern Territory. Because dengue is not endemic to the Northern Territory, locally acquired infection has significant public health implications, particularly for vector identification and control to limit the spread of infection. Dengue IgM serology was positive on two occasions, but the illness was eventually presumptively identified as Kokobera infection. This case illustrates the complexity of interpreting flavivirus serology. Determining the cause of infection requires consideration of the clinical illness, the incubation period, the laboratory results and vector presence. Waiting for confirmation of results, before the institution of the public health measures necessary for a true case of dengue, was ultimately justified in this case. This is a valid approach in the Northern Territory, but may not be applicable to areas of Australia with established vectors for dengue. Commun Dis Intell 1998;22:105-107.
Abstract:
In early April 1998 the Centre for Disease Control (CDC) in Darwin was notified of a case with positive dengue serology. The illness appeared to have been acquired in the Northern Territory (NT). Because dengue is not endemic to the NT, locally acquired infection has significant public health implications, particularly for vector identification and control to limit the spread of infection. Dengue IgM serology was positive on two occasions but the illness was eventually presumptively identified as Kokobera infection. This case illustrates some important points about serology. The interpretation of flavivirus serology is complex and can be misleading, despite recent improvements. The best method of determining the cause of infection is still attempting to reconcile clinical illness details with incubation times and vector presence, as well as laboratory results. This approach ultimately justified the initial period of waiting for confirmatory results in this case, before the institution of public health measures necessary for a true case of dengue.
Abstract:
On 18 September 1998 the Centre for Disease Control (CDC), Darwin was notified of an outbreak of gastroenteritis predominantly affecting adults in a Top End coastal community. There had been no previous presentations to the community clinic in the month of September with vomiting or diarrhoea. On 14 September, a green turtle (Chelonia mydas) was cooked and distributed throughout the community. Water collected from a water hole near the community (known as the aerator) was used as drinking water at the cook site and to cook the meat. In addition, there were reports that kava, a plant-derived tranquilliser, had been consumed the night before using water from the same source. An investigation was conducted to determine the aetiology and source and to instigate prevention and control measures.