5 results for Supply and demand

in DigitalCommons@The Texas Medical Center


Relevance:

100.00%

Abstract:

Objective. The goal of this study is to characterize the current workforce of Certified Industrial Hygienists (CIHs) and the lengths of the professional practice careers of past and current CIHs.

Methods. This is a secondary analysis of data compiled from all of the nearly 50 annual roster listings of the American Board of Industrial Hygiene (ABIH) covering CIHs active in each year since 1960. Survival analysis was used to measure the primary outcome of interest, with the Kaplan-Meier method used to estimate the survival function.

Study subjects. The population studied is all Certified Industrial Hygienists. A CIH is defined by the ABIH as an individual who has met the minimum requirements for education and working experience and who, through examination, has demonstrated a minimum level of knowledge and competency in the prevention of occupational illnesses.

Results. A Cox proportional hazards model was fit across start-time cohorts of CIHs, with cohort 1 as the reference cohort. The estimated relative risks of the event (defined as retirement, or absence from 5 consecutive years of listings) for cohorts 2, 3, 4, and 5 relative to cohort 1 were 0.385, 0.214, 0.234, and 0.299, respectively. Cohort 2 (CIHs certified from 1970 to 1980) had the lowest hazard ratio, indicating the lowest retirement rate.

Conclusion. The number of CIHs still actively practicing at the end of 2009 grew rapidly starting in 1980 and has plateaued in recent decades, suggesting that supply and demand in the profession may have reached equilibrium. More demographic information and additional variables are needed to predict the future number of CIHs required.
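As a rough illustration of the analysis pipeline described above (Kaplan-Meier estimation of career "survival" followed by a Cox proportional hazards model with start-time cohorts and cohort 1 as the reference), the sketch below uses Python's lifelines library. The file name and column names (years_listed, retired, cohort) are assumptions for illustration, not the ABIH roster data used in the study.

    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Hypothetical roster extract: one row per CIH, with years of appearance in
    # the ABIH listings, an event flag (1 = retired / absent from 5 consecutive
    # years of listings, 0 = still listed at the end of 2009), and a start-time
    # cohort label (1 = earliest certification cohort, ..., 5 = latest).
    df = pd.read_csv("cih_roster.csv")  # columns: years_listed, retired, cohort

    # Kaplan-Meier estimate of the career "survival" function.
    kmf = KaplanMeierFitter()
    kmf.fit(df["years_listed"], event_observed=df["retired"])
    print(kmf.survival_function_.head())

    # Cox proportional hazards model by cohort; dropping the first dummy makes
    # cohort 1 the reference level, so the exponentiated coefficients correspond
    # to hazard ratios of the kind reported above (e.g. about 0.385 for cohort 2).
    cohort_dummies = pd.get_dummies(df["cohort"], prefix="cohort", drop_first=True, dtype=int)
    model_df = pd.concat([df[["years_listed", "retired"]], cohort_dummies], axis=1)
    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="years_listed", event_col="retired")
    cph.print_summary()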

Relevance:

100.00%

Abstract:

Maine implemented a hospital rate-setting program in 1984, at approximately the same time that Medicare started the Prospective Payment System (PPS). This study examines the effectiveness of the program in controlling costs over the period 1984-1989. Hospital costs in Maine are compared to costs in 36 non-rate-setting states and 11 other rate-setting states. Changes in cost per equivalent admission, cost per adjusted patient day, cost per capita, admissions, and length of stay are described and analyzed using multivariate techniques. A number of supply and demand variables expected to influence costs independently of rate-setting were controlled for in the study. Results indicate the program was effective in containing costs measured as cost per adjusted patient day, but not for the other two cost variables. The average length of stay in Maine hospitals increased during the period, indicating an association with rate-setting. Several supply variables, especially the number of beds per 1,000 population, were strongly associated with the cost and use of hospitals.
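The multivariate approach described above (comparing cost measures across rate-setting and non-rate-setting states while controlling for supply and demand variables such as beds per 1,000 population) could be sketched as a regression on a state-year panel. The abstract does not specify the model form, so the file name, variable names, and ordinary-least-squares specification below are illustrative assumptions only.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical state-year panel for 1984-1989: one cost measure per model,
    # a rate-setting indicator, and supply/demand controls.
    panel = pd.read_csv("hospital_cost_panel.csv")

    # Cost per adjusted patient day regressed on the rate-setting indicator,
    # controlling for assumed supply/demand variables and year effects.
    model = smf.ols(
        "cost_per_adjusted_patient_day ~ rate_setting"
        " + beds_per_1000 + per_capita_income + pct_population_over_65 + C(year)",
        data=panel,
    ).fit()
    print(model.summary())

Analogous models would be fit for the other outcome measures, such as cost per equivalent admission and cost per capita.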

Relevance:

100.00%

Abstract:

Background. End-stage liver disease (ESLD) is an irreversible condition that leads to imminent complete failure of the liver. Orthotopic liver transplantation (OLT) is well accepted as the best curative option for patients with ESLD. Despite progress in liver transplantation, the major limitation today is the discrepancy between donor supply and organ demand. In an effort to alleviate this shortage, livers from donors mismatched to recipients by gender or race are being used. However, the simultaneous impact of donor-recipient gender and race mismatching on patient survival after OLT remains unclear and relatively challenging for surgeons.

Objective. To examine the impact of donor-recipient gender and race mismatching on patient survival after OLT using the United Network for Organ Sharing (UNOS) database.

Methods. A total of 40,644 recipients who underwent OLT between 2002 and 2011 were included. Kaplan-Meier survival curves and log-rank tests were used to compare survival rates among the donor-recipient gender and race combinations. Univariate Cox regression was used to assess the association of donor-recipient gender and race mismatching with patient survival after OLT. Multivariable Cox regression was used to model the simultaneous impact of donor-recipient gender and race mismatching on patient survival after OLT, adjusting for a list of other risk factors. Multivariable Cox regression stratified on recipient hepatitis C virus (HCV) status was also conducted to identify the variables that were differentially associated with patient survival in HCV+ and HCV− recipients.

Results. In the univariate analysis, compared with male donors to male recipients, female donors to male recipients carried a higher risk of patient mortality (HR, 1.122; 95% CI, 1.065–1.183), while in the multivariable analysis, male donors to female recipients showed an increased mortality rate (adjusted HR, 1.114; 95% CI, 1.048–1.184). Compared with white donors to white recipients, Hispanic donors to black recipients carried a higher risk of patient mortality in the univariate analysis (HR, 1.527; 95% CI, 1.293–1.804), and a similar result was noted in the multivariable analysis (adjusted HR, 1.553; 95% CI, 1.314–1.836). After stratification on recipient HCV status in the multivariable analysis, HCV+ mismatched recipients appeared to be at greater risk of mortality than HCV− mismatched recipients. Female donors to female HCV− recipients (adjusted HR, 0.843; 95% CI, 0.769–0.923) and black donors to Hispanic HCV+ recipients (adjusted HR, 0.758; 95% CI, 0.598–0.960) showed a protective effect on patient survival after OLT.

Conclusion. Donor-recipient gender and race mismatching adversely affect patient survival after OLT, both independently and after adjustment for other risk factors. Recipient HCV status is an important effect modifier of the association between donor-recipient gender combination and patient survival.
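A minimal sketch of the modeling strategy described above (log-rank comparison of Kaplan-Meier curves across donor-recipient combinations, a multivariable Cox model, and separate models by recipient HCV status) is shown below using Python's lifelines library. The file name, column names, and covariate list are assumptions for illustration, not the UNOS dataset or the study's actual model specification.

    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import multivariate_logrank_test

    # Hypothetical UNOS-style extract: follow-up time in months, a death
    # indicator, combined donor-recipient gender and race groups (e.g. "F>M"
    # for female donor to male recipient), recipient HCV status, and other
    # recipient covariates.
    df = pd.read_csv("unos_olt.csv")

    # Log-rank comparison of survival across donor-recipient gender combinations.
    multivariate_logrank_test(df["months"], df["gender_combo"], df["death"]).print_summary()

    # Multivariable Cox model; one level of each combination is dropped and
    # serves as the reference group.
    combo_dummies = pd.get_dummies(df[["gender_combo", "race_combo"]], drop_first=True, dtype=int)
    model_df = pd.concat(
        [df[["months", "death", "hcv_positive", "recipient_age"]], combo_dummies], axis=1
    )
    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="months", event_col="death")
    cph.print_summary()

    # Analysis stratified on recipient HCV status: separate models for HCV+ and
    # HCV- recipients to see which covariates are differentially associated
    # with survival.
    for hcv_value, subset in model_df.groupby("hcv_positive"):
        cph_hcv = CoxPHFitter()
        cph_hcv.fit(subset.drop(columns="hcv_positive"), duration_col="months", event_col="death")
        print("Recipient HCV positive:", hcv_value)
        cph_hcv.print_summary()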

Relevance:

90.00%

Abstract:

The Centers for Disease Control estimates that foodborne diseases cause approximately 76 million illnesses, 325,000 hospitalizations, and 5,000 deaths in the United States each year. The American public is becoming more health conscious, and dietary intake of fresh fruits and vegetables has increased. Affluence and the demand for convenience have led consumers to opt for pre-processed, packaged fresh fruits and vegetables. These pre-processed foods are considered ready-to-eat: they offer many of the advantages of fresh produce without the inconvenience of processing at home. Food-related illnesses declined between 1996 and 2004, owing to improvements in meat and poultry safety, but tainted produce has since tilted the numbers back, and as a result none of the Healthy People 2010 targets for reducing food-related illness has been reached. Irradiation has been shown to be effective in eliminating many foodborne pathogens, and its application as a food safety treatment has been widely endorsed by many of the major associations involved with food safety and public health. Despite these endorsements, the technology has seen very little use to date for reducing the disease burden associated with the consumption of these products. A review of the literature published since the passage of the 1996 Food Quality Protection Act was conducted on the barriers to implementing irradiation as a food safety process for fresh fruits and vegetables. The impediments to widespread adoption of food irradiation as a food safety measure involve a complex array of legislative, regulatory, industry, and consumer issues. The FDA's approval process limits the expansion of the list of foods approved for irradiation as a food safety process, and there is a lack of processing capacity to meet the needs of a geographically dispersed industry.

Relevance:

90.00%

Abstract:

Diarrheal disease associated with enterotoxigenic Escherichia coli (ETEC) infection is a major public health problem in many developing countries, especially among infants and young children. Because tests suitable for field laboratories have been developed only relatively recently, the literature on the environmental risk factors associated with ETEC is not as complete as for many other pathogens or for diarrhea of unspecified etiology.

Data from a diarrheal disease surveillance project in rural Egypt, in which stool samples were tested for a variety of pathogens and an environmental questionnaire was completed for the same study households (HHs), provided an opportunity to test for associations between ETEC and various risk factors present in those households. ETEC laboratory-positive specimens were compared with ETEC laboratory-negative specimens for both symptomatic and asymptomatic children less than three years of age, at the individual and household level, using a case-comparison design.

Individual children more likely to have heat-labile toxin (LT) ETEC infection were those who lived in HHs where cooked food was stored for subsequent consumption at the time of the visit, where caretakers used water but not soap to clean an infant after a diarrheal stool, and where there was an indoor, private water source. LT was more common in HHs where the caretaker did not clean an infant with soap after a diarrheal stool and where a sleeping infant was not covered with a net. At both the individual and HH level, LT was significantly associated with a good water supply in terms of quantity and storage.

Heat-stable toxin (ST) ETEC was isolated more frequently at the individual level where a sleeping infant was covered with a net, where large animals were kept in or around the house, where water was always available but was not potable, and where the water container was not covered. At the HH level, the absence of a toilet or latrine and the indiscriminate disposal of animal waste decreased risk. Using animal feces for fertilizer, the presence of large animals, and poor water quality were associated with ST at both the individual and HH level.

These findings are mostly consistent with those of other studies and/or are biologically plausible, with the obvious exception of those from this study in which poorer water supplies are associated with less infection, at least in the case of LT. More direct observation of how animal ownership and feces disposal relate to different types of water supply and usage might clarify mechanisms through which some ETEC infection could be prevented in similar settings.
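The case-comparison design described above (ETEC laboratory-positive versus laboratory-negative children, with individual and household exposures from the environmental questionnaire as predictors) lends itself to logistic regression, although the abstract does not name the specific technique used. The sketch below is a minimal illustration with assumed file and variable names, not the surveillance project's actual data or model.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical child-level records: LT-positive (1) versus LT-negative (0)
    # stool result plus individual- and household-level exposure indicators.
    children = pd.read_csv("etec_children.csv")

    lt_model = smf.logit(
        "lt_positive ~ stored_cooked_food + soap_after_stool"
        " + indoor_private_water + infant_net_used + large_animals_kept",
        data=children,
    ).fit()

    # Odds ratios with 95% confidence intervals for each exposure.
    odds = np.exp(pd.concat([lt_model.params, lt_model.conf_int()], axis=1))
    odds.columns = ["odds_ratio", "ci_lower", "ci_upper"]
    print(odds)

An analogous model with ST positivity as the outcome, and parallel models aggregated to the household level, would follow the same pattern.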