984 results for Biology, Microbiology | Hispanic American Studies | Health Sciences, Epidemiology
Abstract:
Although substantial progress has been made in the treatment and prevention of HIV/AIDS and in raising knowledge and awareness of the disease, the CDC reports that over 21% of people infected with HIV are unaware of their HIV serostatus. Thirty-one percent of people infected with HIV are diagnosed late in the disease progression, often too late to prevent transmission or the progression of HIV to AIDS. The CDC has set a goal of increasing the number of people aware of their HIV serostatus by 5% by the year 2010. This study examined the association between decision-making and risk-taking (assessed using the decision-making confidence and risk-taking scales of the Texas Christian University Self Rating Form, TCU/SRF) and HIV testing behaviors within a population of heterosexuals at risk for HIV infection living in Harris County, Texas (N=923). Data used in the study were obtained during the first cycle of the National HIV Behavioral Surveillance among heterosexuals at risk for HIV infection (NHBS-HET1), conducted from October 2006 to June 2007. Eighty percent of the study population reported testing for HIV at some point in their lives. The results showed that individuals who scored high (>3.3) on the decision-making confidence scale of the TCU/SRF were more likely to have been tested for HIV than those who scored low on the scale (OR=2.02, 95% CI=1.44–2.84), and that individuals who scored low on the risk-taking scale of the TCU/SRF were more likely to have been tested for HIV than those who scored high on the scale (OR=1.65, 95% CI=1.2–2.31). Several demographic factors were also assessed for their association with HIV testing behaviors; only sex was found to be associated with HIV testing.
The findings suggest that risk-taking and decision-making are predictors of HIV testing behaviors, such as prior HIV testing, among heterosexuals living in high-risk areas of Houston, Texas, and that interventions designed to improve the risk-taking and decision-making attributes of this population might improve HIV testing rates.
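Odds ratios like those reported above come from a 2×2 table of exposure (e.g., high vs. low decision-making confidence) against testing status. A minimal sketch, using hypothetical counts rather than the study's data, with the standard Woolf (log) method for the 95% confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
         a = exposed & tested,   b = exposed & not tested
         c = unexposed & tested, d = unexposed & not tested
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(300, 80, 250, 135)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With larger cell counts the interval tightens, which is why the study's N of 923 can support two-decimal confidence bounds.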
Abstract:
Many public health agencies and researchers are interested in comparing hospital outcomes (for example, morbidity, mortality, and hospitalization) across areas and hospitals. However, because rates vary among hospitals due to several biases, we are interested in controlling for these biases and assessing real differences in clinical practice. In this study, we compared between-hospital variation in rates of severe intraventricular haemorrhage (IVH) in infants under a frequentist statistical approach versus Bayesian hierarchical models in a simulation study. The template data set for the simulation comprised the numbers of infants with severe IVH in 24 intensive care units of the Australian and New Zealand Neonatal Network from 1995 to 1997, reflecting the rate of severe IVH in preterm babies. We evaluated the severe IVH rates for the 24 hospitals with two hierarchical models in the Bayesian approach, comparing their performance with the shrunken rates from the frequentist method. Bayesian Gamma-Poisson (BGP) and Beta-Binomial (BBB) models were used in the Bayesian approach, and the shrunken estimator of a Gamma-Poisson (FGP) hierarchical model fitted by maximum likelihood was calculated as the frequentist approach. To simulate data, the total number of infants in each hospital was kept fixed, and we analyzed the simulated data under both the Bayesian and frequentist models with two true parameters for the severe IVH rate: one was the observed rate, and the other was the expected severe IVH rate after adjusting for five predictor variables in the template data. The bias in the estimated rates of severe IVH showed that the Bayesian models gave less variable estimates than the frequentist model. We also discussed and compared the results from the three models to examine the variation in severe IVH rates using 20th-centile rates and the avoidable number of severe IVH cases.
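The shrinkage idea behind a Gamma-Poisson hierarchical model can be sketched in a few lines: with counts y_i ~ Poisson(n_i·λ_i) and a Gamma(α, β) prior on the hospital rates λ_i, the posterior mean (α + y_i)/(β + n_i) pulls each raw rate toward the prior mean α/β. All numbers below are hypothetical, and the prior is fit crudely by the method of moments rather than the maximum likelihood approach used in the study:

```python
# Empirical-Bayes Gamma-Poisson shrinkage of hospital event rates.
# y[i] = severe events, n[i] = infants at hospital i (hypothetical data).
y = [2, 9, 4, 20, 1, 7]
n = [50, 120, 80, 200, 40, 100]

rates = [yi / ni for yi, ni in zip(y, n)]
m = sum(rates) / len(rates)                              # mean of raw rates
v = sum((r - m) ** 2 for r in rates) / (len(rates) - 1)  # variance of raw rates

# Method-of-moments Gamma(alpha, beta): mean = alpha/beta, var = alpha/beta^2
beta = m / v
alpha = m * beta

# Posterior mean shrinks each raw rate toward the prior mean alpha/beta
shrunken = [(alpha + yi) / (beta + ni) for yi, ni in zip(y, n)]
for r, s in zip(rates, shrunken):
    print(f"raw {r:.3f} -> shrunken {s:.3f}")
```

Small hospitals (small n_i) are shrunk hardest, which is exactly why shrunken estimates are less variable than raw rates in comparisons like the one above.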
Abstract:
Type 2 diabetes has grown to epidemic proportions in the U.S., and its prevalence has been steadily increasing in Texas. Physical activity levels in the population have remained low, despite physical activity being one of the primary preventive strategies for type 2 diabetes. The objectives of this study were to estimate the direct medical costs of type 2 diabetes attributable to not meeting physical activity guidelines and to physical inactivity in the U.S. and Texas in 2007. This was a cross-sectional study that used physical activity prevalence data from the 2007 Behavioral Risk Factor Surveillance System (BRFSS) to estimate the population attributable risk percentage (PAR%) for type 2 diabetes. These data were combined with prevalence and cost data for type 2 diabetes to estimate the cost of type 2 diabetes attributable to not meeting guidelines and to inactivity in the U.S. and Texas in 2007. The cost of type 2 diabetes in the U.S. in 2007 attributable to not meeting physical activity guidelines was estimated at $13.29 billion, and that attributable to physical inactivity (no leisure-time physical activity) at $3.32 billion. Depending on various assumptions, these estimates ranged from $7.61 billion to $41.48 billion for not meeting guidelines, and from $1.90 billion to $13.20 billion for physical inactivity. The cost of type 2 diabetes in Texas in 2007 attributable to not meeting physical activity guidelines was estimated at $1.15 billion, and that attributable to physical inactivity at $325 million; depending on various assumptions, these estimates ranged from $800 million to $3.47 billion for not meeting guidelines, and from $186 million to $1.28 billion for physical inactivity. These results illustrate how much money could be saved annually, in terms of type 2 diabetes costs alone in the U.S. and Texas, if the entire adult population were active enough to meet physical activity guidelines. Physical activity promotion, particularly at the environmental and policy levels, should be a priority.
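The attributable-cost calculation described above follows the standard population attributable risk formula, PAR% = p(RR − 1) / (1 + p(RR − 1)), with the attributable cost obtained by multiplying PAR% by the total disease cost. The prevalence, relative risk, and cost figures below are placeholders for illustration, not the study's BRFSS inputs:

```python
def par_percent(p_exposed, rr):
    """Population attributable risk fraction, given the prevalence of the
    exposure (e.g., not meeting guidelines) and the relative risk of
    disease among the exposed."""
    excess = p_exposed * (rr - 1.0)
    return excess / (1.0 + excess)

def attributable_cost(total_cost, p_exposed, rr):
    """Share of the total disease cost attributable to the exposure."""
    return total_cost * par_percent(p_exposed, rr)

# Hypothetical inputs: 50% of adults not meeting guidelines, RR = 1.4,
# $100 billion total annual cost of type 2 diabetes.
cost = attributable_cost(100e9, 0.50, 1.4)
print(f"PAR% = {par_percent(0.50, 1.4):.1%}, attributable cost = ${cost/1e9:.1f}B")
```

The wide ranges reported in the abstract follow directly from this structure: varying the assumed relative risk or exposure prevalence moves PAR%, and the cost estimate scales with it.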
Abstract:
West Nile virus (WNV) causes an arboviral disease that has affected hundreds of residents of Harris County, Texas since its introduction in 2002. Persistent infection, lingering sequelae, and other long-term symptoms in patients reaffirm the need for prevention of this important vector-borne disease. This study aimed to determine whether living within 400 m of a water body increases one's odds of infection with WNV. Additionally, we wanted to determine whether proximity to a particular water type or water body source increased those odds. Addresses for 145 cases were abstracted from the initial interview and consent records of a cohort of patients (Epidemiology of Arboviral Encephalitis in Houston study, HSC-SPH-03-039). After applying inclusion criteria, 140 cases were identified for analysis, and 140 controls were selected using a population-proportionate-to-size model and US Census Bureau data. MapMarker USA v14 was used to geocode the cases' addresses. Both cases' and controls' coordinates were uploaded onto a Harris County water shapefile in MapInfo Professional v9.5.1, and the distance in meters to the closest water source, the closest water source type, and the closest water source name were recorded. Analysis of variance (p=0.329, R²=0.0034) indicated no association between water body distance and risk of WNV disease. Living near a creek (χ²=11.79, p<0.001), or near the combined group of creeks and gullies (χ²=14.02, p<0.001), was strongly associated with WNV infection, as was living near Cypress Creek and its feeders (χ²=15.2, p<0.001). We found that creek and gully habitats, particularly Cypress Creek, were preferential for the local disease-transmitting vector Culex quinquefasciatus and the avian reservoir population.
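Chi-square statistics like those above compare case and control counts near versus not near a given water feature in a 2×2 table. A minimal implementation of the Pearson statistic, using hypothetical counts rather than the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction) for
         a = cases near feature,    b = cases not near
         c = controls near feature, d = controls not near
    """
    n = a + b + c + d
    # Shortcut form of sum((observed - expected)^2 / expected) for a 2x2 table
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts for illustration only
stat = chi_square_2x2(45, 95, 20, 120)
print(f"chi-square = {stat:.2f}")  # compare to 3.84, the 1-df cutoff for p < 0.05
```

A statistic above roughly 10.8 corresponds to p < 0.001 at 1 degree of freedom, the threshold the creek and gully results clear.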
Abstract:
Background. An enlarged tracheoesophageal puncture (TEP) results in aspiration around the voice prosthesis (VP) and may lead to pneumonia. The aims of this research were: (1) to conduct a systematic review and meta-analysis on enlarged TEP; (2) to analyze preoperative, perioperative, and postoperative risk factors for enlarged TEP; and (3) to evaluate control of leakage around the VP using conservative treatments, and adverse events, in patients with enlarged TEP. Methods. A systematic review was conducted (1978-2008), and a summary risk estimate was calculated using a random-effects meta-analysis model. A retrospective cohort study was also completed, including patients who underwent total laryngectomy and TEP at The University of Texas M. D. Anderson Cancer Center (MDACC). Multiple logistic regression methods were used to assess risk factors for enlargement, and descriptive and bivariate statistics were calculated to evaluate outcomes and adverse events. Results. Twenty-seven manuscripts were included in the systematic review. The summary risk estimate of enlarged TEP/leakage around the VP was 7.2% (95% CI: 4.8%-9.6%). Temporary VP removal and TEP-site injections were the most commonly reported treatments. Neither prosthetic diameter (p=0.076) nor timing of TEP (p=0.297) significantly increased the risk of enlargement in stratified analyses of published outcomes. The cumulative incidence of enlarged TEP was 18.6% (36/194, 95% CI: 13.0%-24.1%) in the MDACC cohort, and enlarged TEP occurred exclusively in irradiated patients. Adjusting for length of follow-up and timing of TEP, advanced nodal disease (adjusted OR: 4.3, 95% CI: 1.0-19.1), stricture (adjusted OR: 3.2, 95% CI: 1.2-8.6), and locoregional recurrence/distant metastasis after laryngectomy (adjusted OR: 6.2, 95% CI: 2.3-16.4) increased the risk of enlarged TEP. At last follow-up, conservative methods controlled leakage around the VP in 81% (29/36) of patients.
Unresolved leakage was associated with recurrent cancer (p=0.081) and TEP-site irregularity (p=0.003). Relative to patients without enlargement, patients with enlarged TEP had a significantly higher risk of pneumonia (RR: 3.4, 95% CI: 1.9-6.2). Conclusions. These data establish that enlarged TEP poses serious health risks and provide insight into the medical and oncologic factors that may contribute to the development of this complication. In addition, this research supports the use of conservative treatments to address leakage after enlarged TEP in lieu of complete TEP closure.
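A relative risk such as the pneumonia RR above is the ratio of two incidence proportions, with a confidence interval usually taken on the log scale. The counts below are hypothetical, not the cohort's actual pneumonia tallies:

```python
import math

def relative_risk_ci(a, n1, c, n0, z=1.96):
    """Relative risk and log-scale 95% CI.
       a of n1 exposed (e.g., enlarged TEP) developed the outcome;
       c of n0 unexposed did."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt((1/a - 1/n1) + (1/c - 1/n0))  # standard error of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts for illustration only
rr, lo, hi = relative_risk_ci(17, 36, 22, 158)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the lower confidence bound stays above 1, an interval like this supports the conclusion that the exposed group's risk is genuinely elevated.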
Abstract:
Unlike infections occurring during periods of chemotherapy-induced neutropenia, postoperative infections in patients with solid malignancy remain largely understudied. The purpose of this population-based study was to evaluate the clinical and economic burden of serious postoperative infection (SPI), i.e., bacteremia/sepsis, pneumonia, and wound infection, following resection of common solid tumors, as well as the relationship between hospital surgical volume and these outcomes. From the Texas Discharge Data Research File, we identified all Texas residents who underwent resection of cancer of the lung, esophagus, stomach, pancreas, colon, or rectum between 2002 and 2006. From their billing records, we identified ICD-9 codes indicating SPI, as well as subsequent SPI-related readmissions occurring within 30 days of surgery. Random-effects logistic regression was used to estimate the impact of SPI on mortality and the association between surgical volume and SPI, adjusting for case mix, hospital characteristics, and clustering of multiple surgical admissions within the same patient and of patients within the same hospital. Excess bed days and costs were calculated by subtracting the values for patients without infection from those for patients with infection, computed using a multilevel mixed-effects generalized linear model with a gamma distribution and log link. Serious postoperative infection occurred following 9.4% of the 37,582 eligible tumor resections and was independently associated with an 11-fold increase in the odds of in-hospital mortality (95% Confidence Interval [CI], 6.7-18.5, P < 0.001). Patients with SPI required 6.3 additional hospital days (95% CI, 6.1-6.5) at an incremental cost of $16,396 (95% CI, $15,927-$16,875). There was a significant trend toward lower overall rates of SPI with higher surgical volume (P=0.037). Given the substantial morbidity, mortality, and excess costs associated with SPI following solid tumor resection, and given that, under current reimbursement practices, most of this heavy burden is borne by acute care providers, it is imperative for hospitals to identify more effective prophylactic measures so that these potentially preventable infections and their associated expenditures can be averted. Additional volume-outcomes research is also needed to identify infection prevention processes that can be transferred from higher- to lower-volume providers.
Abstract:
Respiratory syncytial virus (RSV) is a major cause of respiratory tract infections in immunocompromised patients, such as children younger than 2 years, premature infants with congenital heart disease and chronic lung disease, elderly patients, and patients who have undergone hematopoietic stem cell transplant (HSCT). HSCT patients are at high risk of RSV infection and at increased risk of developing pneumonia and RSV-related mortality; immunodeficiency can be a major risk factor for severe infection and death. Therapy of RSV infection with ribavirin, palivizumab, and immunoglobulin has been shown to reduce the risk of progression to lower respiratory tract infection (LRI) and mortality, especially if initiated early in the disease. Data on RSV infection in HSCT patients are limited, especially across levels of immunodeficiency. A total of 323 RSV infections in HSCT patients were identified between January 1995 and August 2009 at The University of Texas M. D. Anderson Cancer Center (UTMDACC). In this study, we analyzed a de-identified database of these cases to describe the epidemiologic characteristics of RSV infection in HSCT patients, the course of the infection, and the rates of pneumonia development and RSV-related mortality in HSCT patients at UTMDACC. Key words: RSV infections, HSCT patients
Abstract:
Purpose. To determine the risk of late breast cancer recurrence (5 or more years after treatment) in a population of women diagnosed with early-stage breast cancer at The University of Texas M. D. Anderson Cancer Center (MDACC) between 1985 and 2000, and to examine the effect of this population's BMI, smoking history, reproductive history, hormone use, and alcohol intake at the time of diagnosis on the risk of late recurrence. Methods. Patients included 1,913 members of the Early Stage Breast Cancer Repository recruited at MDACC who had survived without a recurrence for at least five years after their initial diagnosis of early-stage breast cancer. Clinical and epidemiological information was ascertained twice on participants during the study: first by medical record abstraction, and then by patient interview at least five years after receipt of adjuvant treatment. A total of 223 late breast cancer recurrences were captured, with an average follow-up of 10.6 years. Cox proportional hazards models were used to calculate hazard ratios (HR) and 95% confidence intervals (CI).
Abstract:
The objective of this dissertation was to determine the initiation and completion rates of adjuvant chemotherapy, its toxicity, and the compliance rates of post-treatment surveillance for elderly patients with colon cancer, using the linked Surveillance, Epidemiology, and End Results (SEER)-Medicare database. The first study assessed the initiation and completion rates of 5-fluorouracil-based adjuvant chemotherapy and their relationship with patient characteristics. Of the 12,265 patients diagnosed with stage III colon adenocarcinoma in 1991-2005, 64.4% received adjuvant chemotherapy within 3 months after tumor resection, and 40% of them completed the treatment. Age, marital status, and comorbidity score were significant predictors of chemotherapy initiation and completion. The second study estimated the incidence rates of toxicity-related endpoints among stage III colon adenocarcinoma patients treated with chemotherapy in 1991-2005. Of the 12,099 patients, 63.9% underwent chemotherapy; among them, the 3-month cumulative incidence rate (CIR) was 9.1% for volume depletion disorder, 3.4% for agranulocytosis, 2.4% for diarrhea, and 2.3% for nausea and vomiting. Cox regression analysis confirmed the association between chemotherapy and these toxicities (HR=2.76; 95% CI=2.42-3.15). The risk of ischemic heart disease was only slightly associated with chemotherapy overall (HR=1.08), but significantly so among patients aged <75 with no comorbidity (HR=1.70). The third study determined the adherence rate to follow-up care among patients diagnosed with stage I-III colon adenocarcinoma from 2000 to June 2002. We identified 7,348 patients with a median follow-up of 59 months. The adherence rate was 83.9% for office visits, 29.4% for CEA tests, and 74.3% for colonoscopy; overall, 25.2% met the recommended post-treatment care. Younger age at diagnosis, white race, being married, advanced stage, fewer comorbidities, and chemotherapy use were significantly associated with guideline adherence. In conclusion, not all colon cancer patients received chemotherapy. Receiving chemotherapy was associated with increased risk of developing gastrointestinal, hematological, and cardiac toxicities. Patients were more likely to comply with the schedule for office visits and colonoscopy, but fell short on CEA tests.
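Cumulative incidence rates and adherence rates like those above are proportions of patients reaching an endpoint by a cutoff, and each can carry a simple confidence interval. A minimal sketch with a Wald interval, using hypothetical counts rather than the SEER-Medicare tallies:

```python
import math

def proportion_ci(events, n, z=1.96):
    """Proportion with a Wald 95% confidence interval, clamped to [0, 1]."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical: 700 of 7,700 chemotherapy patients develop volume depletion
# within 3 months of starting treatment.
p, lo, hi = proportion_ci(700, 7700)
print(f"3-month CIR = {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

For rare endpoints or small denominators, a Wilson or exact interval would be preferable to the Wald form shown here; the Wald version is used only for brevity.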
Abstract:
Identifying accurate numbers of soldiers determined to be medically not ready after completing soldier readiness processing (SRP) may help inform Army leadership about ongoing pressures on a military engaged in a long conflict with regular deployments. Among Army soldiers screened for deployment using the SRP checklist, what is the prevalence of soldiers determined to be medically not ready? Study group. 15,289 soldiers were screened at all 25 Army deployment platform sites with the eSRP checklist over a 4-month period (June 20, 2009 to October 20, 2009). The data included for analysis were age, rank, component, gender, and final deployment medical readiness status from the MEDPROS database. Methods. This information was compiled, and univariate analysis using chi-square tests was conducted for each of the key variables by medical readiness status. Results. Descriptive epidemiology. Of the total sample, 1,548 (9.7%) were female and 14,319 (90.2%) were male. Enlisted soldiers made up 13,543 (88.6%) of the sample and officers 1,746 (11.4%). In the sample, 1,533 (10.0%) of the soldiers were over the age of 40 and 13,756 (90.0%) were aged 18-40. Reserve, National Guard, and Active Duty made up 1,931 (12.6%), 2,942 (19.2%), and 10,416 (68.1%), respectively. Univariate analysis. Overall, 1,226 (8.0%) of the soldiers screened were determined to be medically not ready for deployment. The biggest predictive factor was female gender (OR 2.8; 95% CI 2.57-3.28; p<0.001), followed by enlisted rank (OR 2.01; 1.60-2.53; p<0.001), Reserve component (OR 1.33; 1.16-1.53; p<0.001), and Guard component (OR 0.37; 0.30-0.46; p<0.001); age over 40 showed OR 1.2 (1.09-1.50; p<0.003). Overall, the results underscore that there may be key demographic groups related to medical readiness that can be targeted with programs and funding to improve overall military medical readiness.
Abstract:
Problem. Recent statistics show that over a fifth of children aged 2-5 years in 2006-2008 were overweight, with 7% above the 97th percentile of the BMI-for-age growth charts (extreme obesity). Because poor diet is an important environmental determinant of obesity and the preschool years are crucial developmentally, examination of factors related to diet in the preschool years is important for obesity prevention efforts. Objective. The goals of this study were to determine the association between parental BMI and the number of servings of fruits, vegetables, and whole grains (FVWG) packed; the nutrient content of preschool children's lunches; and norms and expectations about FVWG intake. Methods. This study was a cross-sectional analysis of parents enrolled in the Lunch is in the Bag program at baseline. The independent measure was the weight status of the parents/caregivers, determined using body mass index (BMI) calculated from self-reported height and weight and classified as healthy weight (BMI <25) or overweight/obese (BMI ≥25). Outcomes included the number of servings of FVWG in sack lunches, the nutrient content of the lunches, and psychosocial constructs related to FVWG consumption. Linear regression analysis, adjusted for confounders, was conducted to examine the associations of these outcomes with parental weight status, the main predictor. Results. A total of 132 parent/child dyads were enrolled in the study; 59.09% (n=78) of the parents/caregivers were healthy weight and 40.91% (n=54) were overweight/obese. Parents/caregivers in the study were predominantly white (68%, n=87) and had at least some college education (98%, n=128). No significant associations were found between parental weight status and the servings of fruits, vegetables, and whole grains packed in preschool children's lunchboxes. The results were similar for the association of parental weight status with the nutrient content of the packed lunches. Both healthy-weight and overweight/obese parents packed less than the recommended amounts of vegetables (mean servings = 0.49 and 0.534, respectively) and whole grains (mean servings = 0.58 and 0.511, respectively). However, the intentions of the overweight/obese parents regarding vegetables and whole grains were higher than those of the healthy-weight parents. Conclusion. Results from this study indicate that there are few differences in the servings of fruits, vegetables, and whole grains packed by healthy-weight parents/caregivers compared to overweight/obese parents/caregivers in a high-income, well-educated population, although neither group met the recommended number of servings of vegetables or whole grains. The results indicate the need for behaviorally based health promotion programs for parents regardless of their weight status; however, this study should be replicated with larger and more diverse populations to determine whether these results hold in less homogeneous populations.
Abstract:
Generalized linear Poisson and logistic regression models were used to examine the relationship between temperature, precipitation, and cases of Saint Louis encephalitis virus in the Houston metropolitan area. The models were investigated with and without repeated measures, with a first-order autoregressive (AR1) correlation structure used for the repeated-measures model. The two types of Poisson regression models, with and without the correlation structure, showed that a one-degree (Fahrenheit) increase in temperature multiplies the occurrence of the virus by 1.7, and a one-inch increase in precipitation multiplies it by 1.5. Logistic regression did not show these covariates to be significant predictors of encephalitis activity in Houston under either correlation structure; this discrepancy for the logistic model could be attributed to the small data set. Keywords: Saint Louis encephalitis; generalized linear model; Poisson; logistic; first-order autoregressive; temperature; precipitation.
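In a log-link Poisson model, a fitted coefficient β translates into a multiplicative rate ratio exp(β) per unit increase of the covariate. A small sketch of that interpretation, with hypothetical coefficients back-derived from the 1.7 and 1.5 multipliers rather than taken from the fitted model:

```python
import math

# Hypothetical log-link Poisson model: log(E[cases]) = b0 + b1*temp + b2*precip
b0 = -8.0            # hypothetical intercept
b1 = math.log(1.7)   # temperature coefficient -> rate ratio 1.7 per degree F
b2 = math.log(1.5)   # precipitation coefficient -> rate ratio 1.5 per inch

def expected_cases(temp_f, precip_in):
    """Expected case count under the hypothetical model."""
    return math.exp(b0 + b1 * temp_f + b2 * precip_in)

# One extra degree multiplies the expected count by exp(b1) = 1.7,
# regardless of the baseline values of the other covariates.
ratio = expected_cases(86.0, 2.0) / expected_cases(85.0, 2.0)
print(f"rate ratio per degree = {ratio:.2f}")
```

This multiplicative structure is why the abstract can state per-unit effects ("1.7 times", "1.5 times") without referring to a particular baseline temperature or rainfall level.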
Abstract:
Background. MRSA (methicillin-resistant Staphylococcus aureus) is a multi-drug-resistant bacterium that is quite prevalent in social environments where close person-to-person contact and crowding are an issue. In dental settings, the likelihood of MRSA transmission may be higher than in other healthcare settings because of the close proximity of the dental professional to the patient's nose (where MRSA colonizes) and the field of procedure (the mouth). Objective. To estimate the prevalence of MRSA nasal colonization among dental professionals (dentists and dental hygienists) in the Greater Houston Metropolitan Area, Texas, and to analyze its associations with demographic, professional, and personal protective equipment-related variables. Methods. 800 dental professionals (400 dentists and 400 dental hygienists) were randomly selected in the Greater Houston Metropolitan Area. Multiple waves of nasal swab kits and a self-administered questionnaire were mailed to increase the response rate of the study population. The swabs were cultured on chromogenic agar growth medium, and bacterial growth was evaluated after 18 hours. Presumptive colonies were confirmed as MRSA by further culturing the isolated bacteria on blood agar plates. Associations between positive nasal swabs and self-reported professional practice patterns, personal protective equipment use, and demographics were analyzed using multiple logistic regression. Main Results. Completed questionnaires and nasal swabs were received from 496 study participants (68%). Fourteen cultures were positive for MRSA (4.2% among dentists and 1.6% among dental hygienists, p=0.07). After adjusting for gender, dental hygienists had a significantly lower prevalence of MRSA nasal colonization than dentists (OR: 0.20, 95% CI: 0.05-0.75). No other significant associations or interactions were found. Conclusion. The prevalence of nasal colonization with MRSA among dentists is similar to that reported for health care workers in general, whereas the prevalence among dental hygienists is only slightly above that of the general population (1%). Differences in practice patterns and use of personal protective equipment did not explain this difference in this study, which was possibly due either to residual confounding or to unexplored risk factors. The increased prevalence of MRSA among dentists warrants further investigation into its causes, so that measures can be implemented to avoid transmission and progression to disease.
Abstract:
Gastroesophageal reflux disease is a common condition, affecting 25 to 40% of the population, and causes significant morbidity in the U.S., accounting for at least 9 million office visits to physicians, with estimated annual costs of $10 billion. Previous research has not clearly established whether infection with Helicobacter pylori, a known cause of peptic ulcer, atrophic gastritis, and non-cardia adenocarcinoma of the stomach, is associated with gastroesophageal reflux disease. This study is a secondary analysis of data collected in a cross-sectional study of a random sample of adult residents of Ciudad Juarez, Mexico, conducted in 2004 (Prevalence and Determinants of Chronic Atrophic Gastritis Study, or CAG study; Dr. Victor M. Cardenas, Principal Investigator). In this study, the presence of gastroesophageal reflux disease was based on responses to the previously validated Spanish Language Dyspepsia Questionnaire. Responses indicating the presence of gastroesophageal reflux symptoms and disease were compared with the presence of H. pylori infection, as measured by culture, histology, and rapid urease test, and with findings of upper endoscopy (i.e., hiatus hernia and erosive and atrophic esophagitis). Prevalence ratios were calculated using bivariate, stratified, and multivariate negative binomial regression analyses to assess the relation between active H. pylori infection and the prevalence of gastroesophageal reflux typical syndrome and disease, while controlling for known risk factors of gastroesophageal reflux disease such as obesity. In a random sample of 174 adults, 48 (27.6%) of the study participants had typical reflux syndrome and only 5% (9/174) had gastroesophageal reflux disease per se according to the Montreal consensus, which defines reflux syndromes and disease based on whether the symptoms are perceived as troublesome by the subject. There was no association between H. pylori infection and typical reflux syndrome or gastroesophageal reflux disease. However, we found that in this Northern Mexican population there was a moderate association (prevalence ratio=2.5; 95% CI=1.3, 4.7) between obesity (BMI ≥30 kg/m²) and typical reflux syndrome. Management and prevention of obesity could significantly curb the growing number of persons affected by gastroesophageal reflux symptoms and disease in Northern Mexico.
Abstract:
Purpose. To evaluate the use of the Legionella urine antigen test as a cost-effective method for diagnosing Legionnaires' disease in five San Antonio hospitals from January 2007 to December 2009. Methods. The data reported by five San Antonio hospitals to the San Antonio Metropolitan Health District during a 3-year retrospective study period (January 2007 to December 2009) were evaluated for the frequency of non-specific pneumonia infections, the number of Legionella urine antigen tests performed, and the percentage of positive cases of Legionnaires' disease diagnosed by the test. Results. A total of 7,087 cases of non-specific pneumonia were reported across the five San Antonio hospitals studied from 2007 to 2009. A total of 5,371 Legionella urine antigen tests were performed over the same period, and 38 positive cases of Legionnaires' disease were identified by the test. Conclusions. Despite the limitations of this study in obtaining sufficient data to evaluate the cost-effectiveness of the Legionella urine antigen test in diagnosing Legionnaires' disease, the test is simple, accurate, and fast, as results can be obtained within minutes to hours, and convenient, because it can be performed in the emergency department on any patient who presents with clinical signs or symptoms of pneumonia. Over the long run, it remains to be shown whether this test can decrease mortality, lower total medical costs by reducing the number of broad-spectrum antibiotics prescribed, shorten patient wait times and hospital stays, decrease the need for unnecessary ancillary testing, and improve overall patient outcomes.