984 results for Accumulation rate, n-alkanes C29-C33 per year
Abstract:
In the strongly seasonal, but annually very wet, parts of the tropics, low water availability in the short dry season leads to a semi-deciduous forest, one which is also highly susceptible to nutrient loss from leaching in the long wet season. Patterns in litterfall were compared between forest with low (LEM) and high (HEM) abundances of ectomycorrhizal trees in Korup National Park, Cameroon, over 26 months in 1990–92. Leaf litter was sorted into 26 abundant species, which included six ectomycorrhizal species; of these, three were the large grove-forming trees Microberlinia bisulcata, Tetraberlinia bifoliolata and Tetraberlinia moreliana. Larger-tree species shed their leaves with pronounced peaks in the dry season, whereas other species had either weaker dependence, showed several peaks per year, or were wet-season shedders. Although total annual litterfall differed little between forest types, in the HEM forest (dominated by M. bisulcata) the dry-season peak was more pronounced and earlier than that in the LEM forest. Species differed greatly in their mean leaf litterfall nutrient concentrations, with an approx. twofold range for nitrogen and phosphorus, and a 2.5–3.5-fold range for potassium, magnesium and calcium. In the dry season, LEM and HEM litter showed similar declines in P and N concentration, and increases in K and Mg; some species, especially M. bisulcata, showed strong dry-wet season differences. The concentration of P (but not N) was higher in the leaf litter of ectomycorrhizal than nonectomycorrhizal species. Retranslocation of N and P was lower among the ectomycorrhizal than nonectomycorrhizal species by approx. twofold. It is suggested that, within ectomycorrhizal groves on this soil low in P, a fast decomposition rate with minimal loss of mineralized P is possible because the relatively high litter P does not limit the cycle at this stage, combined with an efficient recapture of released P by the surface organic layer of ectomycorrhizas and fine roots.
This points to a feedback between two essential controlling steps (retranslocation and mineralization) in a tropical rain forest ecosystem dominated by ectomycorrhizal trees.
Abstract:
Diminishing crude oil and natural gas supplies, along with concern about greenhouse gases, are major driving forces in the search for efficient renewable energy sources. The conversion of lignocellulosic biomass to energy and useful chemicals is a component of the solution. Ethanol is most commonly produced by enzymatic hydrolysis of complex carbohydrates to simple sugars followed by fermentation using yeast: C6H10O5 + H2O -(enzymes)-> C6H12O6 -(yeast)-> 2 CH3CH2OH + 2 CO2. In the U.S., corn is the primary starting raw material for commercial ethanol production. However, there is insufficient corn available to meet the future demand for ethanol as a gasoline additive. Consequently, a variety of processes are being developed for producing ethanol from biomass, among which is the NREL process for the production of ethanol from white hardwood. The objective of the thesis reported here was to perform a technical-economic analysis of the hardwood-to-ethanol process. In this analysis a Greenfield plant was compared to co-locating the ethanol plant adjacent to a Kraft pulp mill. The advantage of the latter case is that facilities can be shared jointly for ethanol production and for the production of pulp. Preliminary process designs were performed for three cases: a base-case size of 2205 dry tons/day of hardwood (52 million gallons of ethanol per year), as well as the two cases of half and double this size. The thermal efficiency of the NREL process was estimated to be approximately 36%; that is, about 36% of the thermal energy in the wood is retained in the product ethanol and by-product electrical energy. The discounted cash flow rate of return on investment and the net present value methods of evaluating process alternatives were used to evaluate the economic feasibility of the NREL process. The minimum acceptable discounted cash flow rate of return after taxes was assumed to be 10%.
In all of the process alternatives investigated, the dominant cost factors are the capital recovery charges and the cost of wood. The Greenfield NREL process is not economically viable, with the cost of producing ethanol varying from $2.58 to $2.08/gallon for the half-capacity and double-capacity cases, respectively. The co-location cases appear more promising due to reductions in capital costs. For the most profitable co-location configuration, the discounted cash flow rate of return improved from 8.5% for the half-capacity case to 20.3% for the double-capacity case. Due to economies of scale, the investments become more profitable as the size of the plant increases. This concept is limited by the amount of wood that can be delivered to the plant on a sustainable basis, as well as by the demand for ethanol within a reasonable distance of the plant.
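The two evaluation methods named above (net present value and discounted cash flow rate of return) can be sketched as follows. This is a minimal illustration; the investment and cash flow figures are hypothetical placeholders, not the thesis's estimates.

```python
# Minimal sketch of NPV and DCF rate-of-return evaluation.
# All cash flows below are hypothetical, not the thesis's actual estimates.

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-9):
    """Discounted cash flow rate of return via bisection.

    Assumes npv decreases in the rate over [lo, hi] (one sign change).
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical plant: $100M investment, then $18M/yr net cash flow for 15 years
flows = [-100.0] + [18.0] * 15
print(round(npv(0.10, flows), 2))  # NPV at the 10% minimum acceptable return
print(round(irr(flows), 4))        # discounted cash flow rate of return
```

A project is accepted under these criteria when the NPV at the hurdle rate is positive, equivalently when the computed rate of return exceeds 10%.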
Abstract:
Objective. The study reviewed one year of Texas hospital discharge data and Trauma Registry data for the 22 trauma service regions in Texas to identify regional variations in capacity, process of care, and clinical outcomes for trauma patients, and to analyze the statistical associations among capacity, process of care, and outcomes.

Methods. Cross-sectional study design covering one year of statewide Texas data. Indicators of trauma capacity, trauma care processes, and clinical outcomes were defined, and data were collected on each indicator. Descriptive analyses of regional variations in trauma capacity, process of care, and clinical outcomes were conducted at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers. Multilevel regression models were used to test the relations among trauma capacity, process of care, and outcome measures at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers, while controlling for confounders such as age, gender, race/ethnicity, injury severity, level of trauma center, and urbanization.

Results. Significant regional variation was found among the 22 trauma service regions across Texas in trauma capacity, process of care, and clinical outcomes. The regional trauma bed rate (the average number of staffed beds per 100,000 population) varied significantly by trauma service region. Pre-hospital trauma care processes (EMS time, transfer time, and triage) also varied significantly by region. Clinical outcomes, including mortality, hospital and intensive care unit length of stay, and hospital charges, varied significantly by region as well. In multilevel regression analysis, the average trauma bed rate was significantly related to trauma care processes, including ambulance delivery time, transfer time, and triage, after controlling for age, gender, race/ethnicity, injury severity, level of trauma center, and urbanization at all trauma centers.
Among the process-of-care measures, only transfer time was significantly associated with the average trauma bed rate by region at Level III and IV trauma centers. Among the outcome measures, only trauma mortality was significantly associated with the average trauma bed rate by region at all trauma centers, and only hospital charges were statistically related to the trauma bed rate at Level I and II trauma centers. The effect of confounders such as age, gender, race/ethnicity, injury severity, and urbanization on processes and outcomes varied significantly by level of trauma center.

Conclusions. Regional variation in trauma capacity, process, and outcomes in Texas was extensive. Trauma capacity, age, gender, race/ethnicity, injury severity, level of trauma center, and urbanization were significantly associated with trauma process and clinical outcomes, depending on the level of trauma center.

Key words: regionalized trauma systems, trauma capacity, pre-hospital trauma care, process, trauma outcomes, trauma performance, evaluation measures, regional variations
Abstract:
Hepatocellular carcinoma (HCC) has been ranked as the top cause of death due to malignant neoplasm in Taiwan for years. The high incidence of HCC in Taiwan is primarily attributed to the high prevalence of hepatitis viral infection. Screening subjects with liver cirrhosis for HCC was widely recommended by many previous studies. The latest practice guideline for the management of HCC, released by the American Association for the Study of Liver Diseases (AASLD) in 2005, recommended that high-risk groups, including cirrhotic patients, chronic HBV/HCV carriers, and subjects with a family history of HCC, should undergo surveillance.

This study aims to investigate (1) whether the HCC screening program can prolong the survival of the high-risk group, (2) the incremental cost-effectiveness ratio (ICER) of the HCC screening program in Taiwan, as compared with a non-screening strategy from the payer perspective, (3) which high-risk group has the lowest ICER for the HCC screening program from the insurer's perspective, in comparison with a no-screening strategy for each group, and (4) the estimated total cost of providing the HCC screening program to all high-risk groups.

The high-risk subjects in the study were identified from communities with a high prevalence of hepatitis viral infection and classified into three groups (cirrhosis group, early cirrhosis group, and no-cirrhosis group) at different levels of risk of HCC, by status of liver disease at the time of enrollment. Repeated ultrasound screenings at intervals of 3, 6, and 12 months were applied to the cirrhosis group, early cirrhosis group, and no-cirrhosis group, respectively. A Markov-based decision model was constructed to simulate the progression of HCC and to estimate the ICER for each group of subjects.

The screening group had longer survival in both the statistical results and the model outcomes.
Owing to the low HCC incidence rate in the community-based screening program, screening services had only a limited effect on the survival of the screening group. The incremental cost-effectiveness ratio of the HCC screening program was $3834 per year of life saved, in comparison with the non-screening strategy. The estimated total cost of each group from the screening model over 13.5 years consumes approximately 0.13%, 1.06%, and 0.71% of the total adjusted National Health Expenditure from Jan 1992 to Jun 2005.

Subjects at high risk of developing HCC who underwent repeated ultrasound screenings had longer survival than those without screening, but screening was not the only factor contributing to the longer survival of the screening group. The incremental cost-effectiveness ratio of the two-stage community-based HCC screening program in Taiwan was small, and the program was worthy of investment. In comparison with the early cirrhosis group and the no-cirrhosis group, the cirrhosis group has the lowest ICER when the screening period is less than 19 years. The estimated total cost of providing the HCC screening program to all high-risk groups consumes approximately 1.90% of the total adjusted 13.5-year NHE in Taiwan.
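The ICER concept used throughout this abstract is a simple ratio of incremental cost to incremental effect. The sketch below illustrates it; the cost and life-year figures are hypothetical, chosen only to land near the reported order of magnitude, not taken from the study's model.

```python
# Sketch of an incremental cost-effectiveness ratio (ICER) calculation.
# All numbers below are hypothetical, not the study's model outputs.

def icer(cost_new, effect_new, cost_old, effect_old):
    """Extra cost per additional unit of effect (e.g., per life-year saved)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: screening costs $1200 more per person and adds 0.313 life-years
print(round(icer(1500.0, 10.313, 300.0, 10.0)))  # dollars per life-year saved
```

A strategy is compared against the next-best alternative; a small ICER relative to a willingness-to-pay threshold is what supports the abstract's conclusion that screening is worth the investment.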
Abstract:
Background. Vascular dementia (VaD) is the second most common form of dementia. Multiple risk factors are associated with VaD, but the individual contribution of each to disease onset and progression is unclear. We examined the relationship between type 2 diabetes mellitus (DM) and the clinical variables of VaD.

Methods. Data from 593 patients evaluated between June 2003 and June 2008 for cognitive impairment were prospectively entered into a database. We retrospectively reviewed the charts of 63 patients who fit the NINDS-AIREN criteria for VaD. The patients were divided into those with DM (VaD-DM, n=29) and those without DM (VaD, n=34). The groups were compared with regard to multiple variables.

Results. Patients with DM had a significantly earlier onset of VaD (71.9±6.54 vs. 77.2±6.03 years, p<0.001), a faster rate of decline per year on the mini-mental state examination (MMSE; 3.60±1.82 vs. 2.54±1.60 points, p=0.02), and a greater prevalence of neuropsychiatric symptoms (62% vs. 21%, p=0.02) at the time of diagnosis.

Conclusions. This study shows that a history of pre-morbid DM is associated with an earlier onset and faster cognitive deterioration in VaD. Moreover, the presence of DM predicts the presence of neuropsychiatric symptoms in patients with VaD. A larger study is needed to verify these associations. It will be important to investigate whether better glycemic control will mitigate the potential effects of DM on VaD.
Abstract:
Characteristics of Medicare-certified home health agencies in Texas and the contributions of selected agency characteristics to home health care costs were examined. Cost models were developed and estimated for both nursing and total visit costs using multiple regression procedures. The models included home health agency size, profit status, control, hospital-based affiliation, contract-cost ratio, service provision, competition, urban-rural input-price differences, and selected measures of patient case-mix. The study population comprised 314 home health agencies in Texas that had been certified for at least one year as of July 1, 1986. Data for the analysis were obtained from Medicare Cost Reports for fiscal years ending between July 1, 1985 and June 30, 1986.

Home health agency size, as measured by the logs of nursing and total visits, has a statistically significant negative linear relationship with nursing visit and total visit costs. Nursing and total visit costs decrease at a declining rate as size increases. The size-cost relationship is not altered when controlling for any other agency characteristic. The number of visits per patient per year, a measure of patient case-mix, is also negatively related to costs, suggesting that costs decline with the care of chronic patients. Hospital-based affiliation and urban location are positively associated with costs. Together, the four characteristics explain 19 percent of the variance in nursing visit costs and 24 percent of the variance in total visit costs.

Profit status and control, although correlated with other agency characteristics, exhibit no observable effect on costs. Although no relationship was found between costs and competition, contract-cost ratio, or the provision of non-reimbursable services, no conclusions can be drawn due to problems with the measurement of these variables.
Abstract:
The purpose of this study was to determine the incidence of cancer in Titus County, Texas, through the identification of all cases of cancer that occurred in residents of the county during the period from 1977 to 1984. Data gathered from the Texas Cancer Registry, hospital records, and death certificates were analyzed with regard to anatomic site, race, sex, age, city of residence, and place of birth. Adjustment of incidence rates by sex and race allowed comparisons with U.S. rates provided by the Surveillance, Epidemiology, and End Results (SEER) Program.

Seven hundred sixty-six (766) cancer cases were identified for the eight-year period during 171,536 person-years of observation. In whites, statistically significant standardized incidence ratios (SIR) were found for leukemia (males SIR = 2.70 and females SIR = 2.26), melanoma (males SIR = 1.90 and females SIR = 2.25), lung cancer (males SIR = 1.45), and multiple myeloma (both sexes combined, SIR = 1.86). In blacks, significant excesses of cases were found for Hodgkin's disease (males SIR = 8.33 and females SIR = 13.3) and for esophagus and bone considering both sexes together (SIR = 2.68 and 12.54, respectively). Rates for blacks were based on a small population and are therefore unstable. A statistically significant excess of cases for all sites combined was found in Mount Pleasant residents (age-adjusted incidence rate = 563.6 per 100,000 per year).

Possible environmental risk factors in the area (a hazardous waste disposal site, lignite deposits, and the petrochemical and poultry industries) are reviewed. Further epidemiological and environmental studies to identify etiological factors that could be responsible for the excess number of leukemia cases are recommended. For melanoma, a public health education program to teach the population methods of protection from sun exposure is also suggested.
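A standardized incidence ratio of the kind reported above is observed cases divided by the cases expected from reference (here, SEER) rates, with an excess tested against a Poisson distribution. A minimal sketch, using hypothetical counts rather than the Titus County data:

```python
import math

# Sketch of a standardized incidence ratio (SIR) with a one-sided exact
# Poisson test for excess cases. Counts below are hypothetical.

def sir(observed, expected):
    """Observed cases divided by cases expected from reference rates."""
    return observed / expected

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu); small values indicate a significant excess."""
    return 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))

obs, expected_cases = 27, 10.0            # hypothetical observed and expected counts
print(round(sir(obs, expected_cases), 2)) # SIR = 2.70
print(poisson_sf(obs, expected_cases) < 0.05)  # significant excess?
```

The instability noted for rates in small populations shows up here directly: with a small expected count, even a large SIR can fail the Poisson test.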
A descriptive and exploratory analysis of occupational injuries at a chemical manufacturing facility
Abstract:
A retrospective study of 1353 occupational injuries occurring at a chemical manufacturing facility in Houston, Texas from January 1982 through May 1988 was performed to investigate the etiology of the occupational injury process. Injury incidence rates were calculated for various sub-populations of workers to determine differences in the risk of injury among groups. Linear modeling techniques were used to determine the association between the collected independent variables and the severity of an injury event. Finally, two sub-groups of the worker population, shiftworkers and injury recidivists, were examined; an injury recidivist was defined as any worker experiencing one or more injuries per year. Overall, female shiftworkers had the highest average injury incidence rate of all worker groups analyzed. Although the female shiftworkers were younger and less experienced, the etiology of their increased risk of injury remains unclear, although the rigors of performing shiftwork itself or ergonomic factors are suspect. In general, females were injured more frequently than males, but they did not incur more severe injuries. For all workers, many injuries were caused by erroneous or omitted training and by risk-taking behaviors; injuries of these types are avoidable. The distribution of injuries by severity level was bimodal: injuries were of either minor or major severity, with only a small number of cases falling in between. Of the variables collected, only the type of injury incurred and the worker's job title code were statistically significantly associated with injury severity. Shiftworkers did not sustain more severe injuries than other worker groups. Injury to shiftworkers followed a 24-hour pattern; the greatest number occurred between 1200 and 1230 hours (p = 0.002) by cosinor analysis. Recidivists made up 3.3% of the population (23 males and 10 females), yet suffered 17.8% of the injuries. Although past research suggests that injury recidivism is a random statistical event, analysis of the data by logistic regression implicates gender, area worked, age, and job title code as statistically significantly related to injury recidivism at this facility.
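The cosinor analysis mentioned above fits a 24-hour cosine to event counts by ordinary least squares, estimating a mean level, an amplitude, and a peak time. A minimal single-cosinor sketch, run on synthetic counts that peak near 12:00 rather than the facility's data:

```python
import math

# Single-cosinor fit: y(t) = M + b*cos(w*t) + c*sin(w*t), solved by
# ordinary least squares via the 3x3 normal equations. Data are synthetic.

def cosinor_fit(times, counts, period=24.0):
    w = 2 * math.pi / period
    X = [[1.0, math.cos(w * t), math.sin(w * t)] for t in times]
    # Normal equations: (X^T X) beta = X^T y
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * y for r, y in zip(X, counts)) for i in range(3)]
    # Gaussian elimination, then back substitution
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * ai for a, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, 3))) / A[i][i]
    mesor, bc, bs = beta
    amplitude = math.hypot(bc, bs)
    peak_hour = (math.atan2(bs, bc) / w) % period  # acrophase as clock time
    return mesor, amplitude, peak_hour

# Synthetic hourly injury counts peaking at 12:00
times = list(range(24))
counts = [10 + 4 * math.cos(2 * math.pi * (t - 12) / 24) for t in times]
mesor, amp, peak = cosinor_fit(times, counts)
print(round(mesor, 2), round(amp, 2), round(peak, 1))
```

A significance test for the rhythm (like the reported p = 0.002) would compare the fitted cosine against a flat model with an F test; that step is omitted here.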
Abstract:
Invasive pneumococcal disease (IPD) imposes a significant health burden in the US: it is responsible for the majority of bacterial meningitis and causes more deaths than any other vaccine-preventable bacterial disease in the US. The estimated national IPD rate is 14.3 cases per 100,000 population, with a case-fatality rate of 1.5 deaths per 100,000 population. Although cases of IPD are routinely reported to the local health department in Harris County, Texas, the incidence rate (IR) and case-fatality rate (CFR) have not been reported. Additionally, it is important to know which serotypes of S. pneumoniae are circulating in Harris County, Texas, and to determine whether 'replacement disease' is occurring.

This study reported incidence and case-fatality rates from 2003 to 2009 and described the trends in IPD, including the IPD serotypes circulating in Harris County, Texas during the study period, particularly in 2008 and 2010. Annual incidence rates were calculated and reported for 2003 to 2009, using complete surveillance-year data.

Geographic information system (GIS) software was used to create a series of maps of the data reported during the study period. Cluster and outlier analysis and hot spot analysis were conducted using both case counts by census tract and disease rates by census tract.

IPD age- and race-adjusted IRs for Harris County, Texas and their 95% confidence intervals (CIs) were 1.40 (95% CI 1.0, 1.8), 1.71 (95% CI 1.24, 2.17), 3.13 (95% CI 2.48, 3.78), 3.08 (95% CI 2.43, 3.74), 5.61 (95% CI 4.79, 6.43), 8.11 (95% CI 7.11, 9.1), and 7.65 (95% CI 6.69, 8.61) for the years 2003 to 2009, respectively (rates were age- and race-adjusted to each year's midyear US population estimates). A Poisson regression model demonstrated a statistically significant increasing trend of about 32 percent per year in the IPD rates over the course of the study period. IPD age- and race-adjusted case-fatality rates (CFRs) for Harris County, Texas were also calculated and reported.
A Poisson regression model demonstrated a statistically significant increasing trend of about 26 percent per year in the IPD case-fatality rates from 2003 through 2009. A logistic regression model associated the risk of dying from IPD with alcohol abuse (OR 4.69, 95% CI 2.57, 8.56) and with meningitis (OR 2.42, 95% CI 1.46, 4.03).

The prevalence of non-vaccine serotypes (NVT) among IPD cases with serotyped isolates was 98.2 percent. In 2008, the year with the sample most geographically representative of all areas of Harris County, Texas, the prevalence was 96 percent. Given these findings, it is reasonable to conclude that 'replacement disease' is occurring in Harris County, Texas; that is, the majority of IPD is caused by serotypes not included in the PCV7 vaccine. IPD rates also increased during the study period in Harris County, Texas.
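The age- and race-adjusted rates reported in this abstract come from direct standardization: stratum-specific rates are weighted by a standard population's stratum shares. A minimal sketch with two hypothetical age strata (the counts and populations are illustrative, not Harris County data):

```python
# Sketch of direct rate standardization, as used for age- and race-adjusted
# IPD rates. Strata, counts, and populations below are hypothetical.

def adjusted_rate(cases, population, std_population, per=100_000):
    """Directly standardized rate per `per` persons."""
    total_std = sum(std_population)
    return per * sum(
        (c / p) * (s / total_std)
        for c, p, s in zip(cases, population, std_population)
    )

cases   = [10, 40]            # observed cases by stratum
pop     = [500_000, 250_000]  # local population by stratum
std_pop = [600_000, 150_000]  # standard (e.g., US midyear) population by stratum
print(round(adjusted_rate(cases, pop, std_pop), 2))  # cases per 100,000
```

Because the weights come from the standard population, adjusted rates for different years (or counties) are comparable even when their age and race compositions differ.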
Abstract:
Venous thromboembolism (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is the third most preventable cardiovascular disease and a growing public health problem in the United States. The incidence of VTE remains high, with an annual estimate of more than 600,000 symptomatic events. DVT affects an estimated 2 million Americans each year, with a death toll of 300,000 persons per year from DVT-related PE. Leukemia patients are at high risk for both hemorrhage and thrombosis; however, little is known about thrombosis among acute leukemia patients. The ultimate goal of this dissertation was to obtain a deeper understanding of thrombosis among acute leukemia patients. The dissertation is presented as three papers. The first paper examined the distribution of, and risk factors associated with, the development of VTE among patients with acute leukemia prior to leukemia treatment. The second paper examined the incidence, risk factors, and impact of VTE on the survival of patients with acute lymphoblastic leukemia during treatment. The third paper examined recurrence and risk factors for VTE recurrence among acute leukemia patients with an initial episode of VTE. Descriptive statistics, chi-squared or Fisher's exact tests, median tests, Mann-Whitney tests, logistic regression analysis, and nonparametric Kaplan-Meier estimation with a log-rank test or Cox model were used as appropriate. Results indicated that acute leukemia patients had high prevalence, incidence, and recurrence rates of VTE. A prior history of VTE, obesity, older age, low platelet count, presence of Philadelphia-positive ALL, use of oral contraceptives or hormone replacement therapy, presence of malignancies, and co-morbidities may place leukemia patients at increased risk for VTE development or recurrence. Interestingly, development of VTE was not associated with a higher risk of death among hospitalized acute leukemia patients.
Abstract:
Precipitation for 2011 was less than the long-term climate average. Early in the year, precipitation lagged behind normal, but then tracked close to the normal accumulation rate from mid-April through mid-August. After that time, precipitation amounts greatly lagged behind normal, and the year ended almost 7 in. behind the long-term average (Figure 1). Overall, 2011 will be remembered for good moisture early, but for a season ending with almost no rainfall.
Abstract:
Mass accumulation rates (MAR) of different components of North Pacific deep-sea sediment provide detailed information about the timing of the onset of major Northern Hemisphere glaciation at 2.65 Ma. An increase in explosive volcanism in the Kamchatka-Kurile and Aleutian arcs occurred at this same time, suggesting a link between volcanism and glaciation. Sediments recovered by piston-coring techniques during ODP Leg 145 provide a unique opportunity to undertake a detailed test of this possibility. Here we use volcanic glass as a proxy for explosive volcanism and ice-rafted debris (IRD) as a proxy for glaciation. The MARs of both glass and IRD increase markedly at 2.65 Ma. Further, the flux of volcanic glass increased just prior to the flux of ice-rafted material, suggesting that the cooling resulting from explosive volcanic eruptions may have been the ultimate trigger for the mid-Pliocene glacial intensification.
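A mass accumulation rate of the kind used above is conventionally the linear sedimentation rate times the dry bulk density, and a component MAR (glass or IRD) scales the bulk MAR by that component's weight fraction. A minimal sketch; the numbers are illustrative, not Leg 145 measurements:

```python
# Sketch of mass accumulation rate (MAR) arithmetic for deep-sea sediment.
# Values below are illustrative, not ODP Leg 145 data.

def mass_accumulation_rate(sed_rate_cm_per_kyr, dry_bulk_density_g_cm3):
    """Bulk MAR in g/cm^2/kyr."""
    return sed_rate_cm_per_kyr * dry_bulk_density_g_cm3

def component_mar(bulk_mar, weight_fraction):
    """MAR of a single component (e.g., volcanic glass or IRD)."""
    return bulk_mar * weight_fraction

bulk = mass_accumulation_rate(5.0, 0.8)  # 5 cm/kyr at 0.8 g/cm^3
print(bulk)                              # bulk MAR, g/cm^2/kyr
print(component_mar(bulk, 0.05))         # MAR of a 5 wt% glass component
```

Working in MARs rather than raw abundances is what allows glass and IRD fluxes to be compared on a common footing despite changes in total sedimentation rate.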
Abstract:
The mid-Pliocene was an episode of prolonged global warmth and strong North Atlantic thermohaline circulation, interrupted briefly at circa 3.30 Ma by a global cooling event corresponding to marine isotope stage (MIS) M2. Paleoceanographic changes in the eastern North Atlantic have been reconstructed between circa 3.35 and 3.24 Ma at Deep Sea Drilling Project Site 610 and Integrated Ocean Drilling Program Site 1308. Mg/Ca ratios and δ18O from Globigerina bulloides are used to reconstruct the temperature and relative salinity of surface waters, and dinoflagellate cyst assemblages are used to assess variability in the North Atlantic Current (NAC). Our sea surface temperature data indicate warm waters at both sites before and after MIS M2, but a cooling of ~2-3°C during MIS M2. A dinoflagellate cyst assemblage overturn marked by a decline in Operculodinium centrocarpum reflects a southward shift or slowdown of the NAC between circa 3.330 and 3.283 Ma, reducing northward heat transport 23-35 ka before the global ice volume maximum of MIS M2. This would have established conditions that ultimately allowed the Greenland ice sheet to expand, leading to the global cooling event at MIS M2. Comparison with an ice-rafted debris record excludes freshwater input via icebergs in the northeast Atlantic as a cause of the NAC decline. The mechanism causing the temporary disruption of the NAC may be related to a brief reopening of the Panamanian Gateway at about this time.
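Mg/Ca paleothermometry of the kind used above inverts an exponential calibration of the form Mg/Ca = B·exp(A·T). The sketch below uses constants of the kind published for Globigerina bulloides (A = 0.107, B = 0.474), but treat both the constants and the Mg/Ca values as illustrative assumptions, not the study's calibration or data:

```python
import math

# Sketch of foraminiferal Mg/Ca paleothermometry: invert the exponential
# calibration Mg/Ca = B * exp(A * T). Constants and ratios are illustrative.

def sst_from_mg_ca(mg_ca_mmol_mol, a=0.107, b=0.474):
    """Sea surface temperature (deg C) from a shell Mg/Ca ratio (mmol/mol)."""
    return math.log(mg_ca_mmol_mol / b) / a

warm = sst_from_mg_ca(2.6)  # hypothetical pre-M2 ratio
cool = sst_from_mg_ca(2.0)  # hypothetical MIS M2 ratio
print(round(warm - cool, 1))  # cooling of ~2.5 degC, the order reported above
```

Because the calibration is exponential, a temperature difference depends only on the ratio of the two Mg/Ca values: ΔT = ln(r1/r2)/A.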
Abstract:
In general, a moderate drying trend is observed in mid-latitude arid Central Asia since the Mid-Holocene, attributed to the progressively weakening influence of the mid-latitude Westerlies on regional climate. However, as the spatio-temporal pattern of this development and the underlying climatic mechanisms are not yet fully understood, new high-resolution paleoclimate records from this region are needed. In this study, a sediment core from Lake Son Kol (Central Kyrgyzstan) was investigated using sedimentological, (bio)geochemical, isotopic, and palynological analyses, aiming at reconstructing regional climate development during the last 6000 years. Biogeochemical data, mainly reflecting summer moisture conditions, indicate predominantly wet conditions until 4950 cal. yr BP, succeeded by a pronounced dry interval between 4950 and 3900 cal. yr BP. Thereafter, a return to wet conditions and a subsequent moderate drying trend until present times are observed. This is consistent with other regional paleoclimate records and likely reflects the gradual Late Holocene diminishment of the summer moisture provided by the mid-latitude Westerlies. However, the climate impact of the Westerlies was apparently not restricted to the summer season but was also significant during winter, as indicated by recurrent episodes of enhanced allochthonous input through snowmelt occurring before 6000 cal. yr BP and at 5100-4350, 3450-2850, and 1900-1500 cal. yr BP. The distinct ~1500-year periodicity of these episodes of increased winter precipitation in Central Kyrgyzstan resembles similar cyclicities observed in paleoclimate records around the North Atlantic, likely indicating a hemispheric-scale climatic teleconnection and an impact of North Atlantic Oscillation (NAO) variability in Central Asia.