Abstract:
Located in the northeastern region of Italy, the Venetian Plain (VP) is a sedimentary basin containing an extensively exploited groundwater system. The northern part is characterised by a large undifferentiated phreatic aquifer composed of coarse-grained alluvial deposits and recharged by local rainfall and discharge from the rivers Brenta and Piave. The southern plain is characterised by a series of aquitards and sandy aquifers forming a well-defined artesian multi-aquifer system. In order to determine the origins, transit times and mixing proportions of different components in groundwater (GW), a multi-tracer study (³H, ³He/⁴He, ¹⁴C, CFCs, SF₆, ⁸⁵Kr, ³⁹Ar, ⁸⁷Sr/⁸⁶Sr, δ¹⁸O, δ²H, cations, and anions) was carried out in the VP between the rivers Brenta and Piave. The geochemical pattern of the GW allows a distinction of the different water origins in the system, based in particular on HCO₃⁻, SO₄²⁻, Ca/Mg, NO₃⁻, δ¹⁸O and δ²H. A radiogenic ⁸⁷Sr/⁸⁶Sr signature clearly marks GW originating from the Brenta and Tertiary catchments. End-member analysis and geochemical modelling highlight the existence of a mixing process involving waters recharged from the Brenta and Piave rivers, from the phreatic aquifer and from another GW reservoir characterised by very low mineralization. Noble gas excesses with respect to atmospheric equilibrium occur in all samples, particularly in the deeper aquifers of the Piave river, but also in phreatic water of the undifferentiated aquifers. ³H–³He ages in the phreatic aquifer and in the shallower level of the multi-aquifer system indicate recharge times in the years 1970–2008. The progression of ³H–³He ages with distance from the recharge areas, together with the initial tritium concentration (³H + ³He_trit), implies an infiltration rate of about 1 km/y and the absence of older components in these GW. SF₆ and ⁸⁵Kr data corroborate these conclusions. ³H–³He ages in the deeper artesian aquifers suggest a dilution process with older, tritium-free waters.
¹⁴C Fontes–Garnier model ages of the old GW components range from 1 to 12 ka, yielding an apparent GW velocity of about 1–10 m/y. The increase of radiogenic ⁴He follows the progression of ¹⁴C ages. The ³⁹Ar, radiogenic ⁴He and ¹⁴C tracers yield model-dependent age ranges in overall good agreement once diffusion of ¹⁴C from aquitards, GW dispersion, lithogenic ³⁹Ar production, and ⁴He production-rate heterogeneities are taken into account. The rate of radiogenic ⁴He increase with time, deduced by comparison with ¹⁴C model ages, is however very low compared to other studies. Comparison with ¹³C and ¹⁴C data obtained 40 years ago on the same aquifer system shows that exploitation of the GW has caused a significant loss of the old groundwater reservoir during this time.
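The ³H–³He apparent ages mentioned above are conventionally computed from the piston-flow relation t = (t½ / ln 2) · ln(1 + [³He_trit]/[³H]). A minimal sketch of that arithmetic, assuming both concentrations are already expressed in tritium units (TU) and the tritiogenic ³He component has been separated from atmospheric and radiogenic helium:

```python
import math

TRITIUM_HALF_LIFE_YEARS = 12.32  # commonly used value for the 3H half-life

def tritium_helium_age(tritium_tu, tritiogenic_he3_tu):
    """Piston-flow 3H-3He apparent age in years.

    Both inputs are in tritium units (TU); the tritiogenic 3He
    component must already be separated from atmospheric and
    radiogenic helium excesses.
    """
    decay_const = math.log(2) / TRITIUM_HALF_LIFE_YEARS
    return math.log(1.0 + tritiogenic_he3_tu / tritium_tu) / decay_const

# Equal amounts of 3H and tritiogenic 3He correspond to one half-life:
age = tritium_helium_age(5.0, 5.0)  # -> 12.32 years
```

Because the tritiogenic ³He grows as the ³H decays, their sum reconstructs the initial tritium concentration (the "³H + ³He_trit" quantity the abstract uses to rule out admixture of older water).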
Abstract:
During the last two years of World War I, the food supply in Switzerland declined and caused shortages in consumption, leading to social distress and conflict. Two main factors caused these problems: first, Switzerland was highly dependent on food imports, and during the war traditional supply lines faded. Second, weather extremes in the years 1916–1917 caused crop failures all over Europe and North America, which intensified the decline of food trade between nations. In 1918 a conflict between classic urban consumers, such as workers, and farmers erupted due to the food shortages and led to a lasting discord between urban and agrarian regions in Switzerland. But there was not only disharmony and conflict between the urban and agrarian regions. In fact, from 1917 onwards several agents (urban and agrarian) interested in presenting adequate coping strategies to overcome the food shortages developed ideas for alternative ways of food production and supply. The aim of the paper is to outline the strategies that were undertaken to create a new era of food production that was not solely dependent on the agrarian sector or on import trade. The actual growing of vegetables in estate areas is an important, but just one, factor in establishing a new system of food production, distribution and consumption. The market-leading grocery chains in Switzerland today (Coop and Migros) started their business during that time as co-operatives establishing new forms of distribution and food production. So the paper is not only interested in actual «urban farming»; it also wants to shed some light on how Swiss urban and agrarian spheres overlapped their functions in order to create a modern system of agro-food chains at the beginning of the interwar period.
Abstract:
The function of the esophagus is to transport nutrients from the oropharyngeal cavity to the stomach. This is achieved by coordinated contraction and relaxation of the tubular esophagus and the upper and lower esophageal sphincters. Multichannel intraluminal impedance monitoring offers quantification of esophageal bolus transit and/or retention without the use of ionizing radiation. Combined with conventional or high-resolution manometry, impedance measurements complement the quantification of esophageal body contraction and sphincter relaxation, offering a more comprehensive evaluation of esophageal function. Further studies evaluating the utility of quantifying bolus transit will help clarify the role and position of impedance measurements.
Abstract:
BACKGROUND This study evaluated whether risk factors for sternal wound infections vary with the type of surgical procedure in cardiac operations. METHODS This was a university hospital surveillance study of 3,249 consecutive patients (28% women) from 2006 to 2010 (median age, 69 years [interquartile range, 60 to 76]; median additive European System for Cardiac Operative Risk Evaluation score, 5 [interquartile range, 3 to 8]) after (1) isolated coronary artery bypass grafting (CABG), (2) isolated valve repair or replacement, or (3) combined valve procedures and CABG. All other operations were excluded. Univariate and multivariate binary logistic regression were conducted to identify independent predictors for development of sternal wound infections. RESULTS We detected 122 sternal wound infections (3.8%) in 3,249 patients: 74 of 1,857 patients (4.0%) after CABG, 19 of 799 (2.4%) after valve operations, and 29 of 593 (4.9%) after combined procedures. In CABG patients, bilateral internal thoracic artery harvest, procedural duration exceeding 300 minutes, diabetes, obesity, chronic obstructive pulmonary disease, and female sex (model 1) were independent predictors for sternal wound infection. A second model (model 2), using the European System for Cardiac Operative Risk Evaluation, revealed bilateral internal thoracic artery harvest, diabetes, obesity, and the second and third quartiles of the European System for Cardiac Operative Risk Evaluation were independent predictors. In valve patients, model 1 showed only revision for bleeding as an independent predictor for sternal infection, and model 2 yielded both revision for bleeding and diabetes. For combined valve and CABG operations, both regression models demonstrated revision for bleeding and duration of operation exceeding 300 minutes were independent predictors for sternal infection. CONCLUSIONS Risk factors for sternal wound infections after cardiac operations vary with the type of surgical procedure. 
In patients undergoing valve operations or combined operations, procedure-related risk factors (revision for bleeding, duration of operation) independently predict infection. In patients undergoing CABG, not only procedure-related risk factors but also bilateral internal thoracic artery harvest and patient characteristics (diabetes, chronic obstructive pulmonary disease, obesity, female sex) are predictive of sternal wound infection. Preventive interventions may be justified according to the type of operation.
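The independent predictors above come from univariate and multivariable binary logistic regression. As a minimal illustration of the underlying effect measure, the sketch below computes a univariate odds ratio with a 95% Wald confidence interval from a 2×2 table; the counts are hypothetical, not the study's data:

```python
import math

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Univariate odds ratio from a 2x2 table, with a 95% Wald CI.

    The standard error of log(OR) is sqrt of the summed reciprocals
    of the four cell counts (Woolf's method).
    """
    or_ = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
    se_log = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                       + 1 / unexposed_cases + 1 / unexposed_noncases)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: infections among diabetic vs. non-diabetic CABG patients
or_value, ci = odds_ratio(30, 470, 44, 2705)
```

A multivariable model then adjusts each such crude association for the other covariates, which is why the two models in the abstract can retain different predictor sets.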
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate=28.5%), of which 40.9% fattened only their own calves (group O), 56.9% their own calves and additional purchased calves (group O&P), and 2.3% only purchased calves for fattening (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% upon arrival and later): in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production.
The Swiss situation with large commercial but also smaller herds with little or no purchase of calves made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system where small farms raise the calves from their own herds has a substantial potential to improve animal health and reduce antimicrobial drug use.
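Odds ratios such as the reported OR=1.2 per 10 calves are multiplicative on the odds scale, so rescaling to a larger herd-size increment simply raises the OR to the corresponding power. A small sketch of that arithmetic (the 50-calf increment is an illustrative choice, not a figure from the study):

```python
def scaled_odds_ratio(or_per_increment, n_increments):
    """Rescale a logistic-regression odds ratio to a larger increment.

    ORs act multiplicatively on the odds, so an OR reported per one
    covariate increment is raised to the number of increments.
    """
    return or_per_increment ** n_increments

# OR = 1.2 per 10 calves (from the abstract): effect of 50 additional calves
or_50_calves = scaled_odds_ratio(1.2, 50 / 10)  # -> 1.2**5 = 2.48832
```

This is why a seemingly modest per-10-calf OR still implies substantially higher odds of metaphylaxis in the large commercial herds the abstract contrasts with small own-calf farms.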
Abstract:
The clinical arm of the UConn Health Center consists of the UConn Medical Group (UMG), our physician faculty practice; the John Dempsey Hospital (JDH); and the Correctional Managed Health Care Program (CMHC). This 2005 Clinical Annual Report was issued Sept. 1, 2005, by Steven L. Strongwater, MD, Director of Clinical Operations.
Abstract:
A face-to-face survey addressing environmental risk perception was conducted from January through March 2010. The 35-question survey was administered to a random sample of 73 households in El Paso, Texas. The instrument, administered in two adjacent residential communities neighboring an inactive copper smelter, solicited responses about man-made and naturally occurring health risks and the sources of health information that might be utilized by respondents. The objective of the study was to determine whether an intervention which occurred in one of the communities increased residents' perception of risk to themselves and their families. The study was undertaken subsequent to increased attention from news media and public debate surrounding the request to reopen the smelter's operations. Results of the study indicated that perceptions of environment-related health concerns were not significantly correlated with residence in the community receiving outreach and intervention. Both communities identified sun exposure as their greatest perceived environmental risk, followed by cigarette smoking. Though industrial by-products and chemical pollution were high-ranking concerns, respondents indicated that they felt the decision not to reopen the smelter reduced risk in these areas. Residents expressed confidence in information received from the local health district, though most indicated they received very little information from that source, suggesting an opportunity for public health education in this community as a strategy to address future health concerns.
Abstract:
This dissertation develops and tests a comparative effectiveness methodology utilizing a novel approach to the application of Data Envelopment Analysis (DEA) in health studies. The concept of performance tiers (PerT) is introduced as terminology to express a relative risk class for individuals within a peer group, and the PerT calculation is implemented with operations research (DEA) and spatial algorithms. The DEA–PerT methodology discriminates the individual data observations into a relative risk classification. The performance of two distance measures, kNN (k-nearest neighbor) and Mahalanobis, was subsequently tested for classifying new entrants into the appropriate tier. The methods were applied to subject data for the 14-year-old cohort in the Project HeartBeat! study. The concepts presented herein represent a paradigm shift in the potential for public health applications to identify and respond to individual health status. The resultant classification scheme provides descriptive, and potentially prescriptive, guidance for assessing and implementing treatments and strategies to improve the delivery and performance of health systems.
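A Mahalanobis-distance classifier of the kind tested for tier assignment can be sketched as follows; the two synthetic tiers and the function names are illustrative assumptions, not the dissertation's implementation:

```python
import numpy as np

def mahalanobis_distance(x, group):
    """Mahalanobis distance from observation x to a group of observations.

    Unlike Euclidean distance, this accounts for the group's covariance,
    so correlated or high-variance directions are down-weighted.
    """
    mean = group.mean(axis=0)
    cov = np.cov(group, rowvar=False)  # rows are observations
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def assign_tier(x, tiers):
    """Assign a new entrant to the nearest tier by Mahalanobis distance."""
    distances = {name: mahalanobis_distance(x, obs) for name, obs in tiers.items()}
    return min(distances, key=distances.get)

# Two well-separated synthetic risk tiers in a 2-variable space
rng = np.random.default_rng(0)
tiers = {
    "low_risk": rng.normal([0.0, 0.0], 1.0, size=(50, 2)),
    "high_risk": rng.normal([5.0, 5.0], 1.0, size=(50, 2)),
}
tier = assign_tier(np.array([4.5, 5.2]), tiers)  # lands in "high_risk"
```

A kNN rule would instead vote among the labels of the k nearest training observations; the Mahalanobis rule above needs only each tier's mean and covariance.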
Abstract:
Refugee populations suffer poor health status, and yet the activities of refugee relief agencies in the public health sector have not previously been subjected to comprehensive evaluation. The purpose of this study was to examine the effectiveness and cost of the major public health service inputs of the international relief operation for Indochinese refugees in Thailand coordinated by the United Nations High Commissioner for Refugees (UNHCR). The investigator collected data from surveillance reports and agency records pertaining to 11 old refugee camps administered by the Government of Thailand Ministry of Interior (MOI) since an earlier refugee influx, and five new Khmer holding centers administered directly by UNHCR, from November 1979 to March 1982. Generous international funding permitted UNHCR to maintain a higher level of public health service inputs than refugees usually enjoyed in their countries of origin or than the Thais around them enjoyed. Annual per capita expenditure for public health inputs averaged approximately US$151. Indochinese refugees in Thailand, for the most part, had access to adequate general food rations, to supplementary feeding programs, and to preventive health measures, and enjoyed high-quality medical services. Old refugee camps administered by MOI consistently received public health inputs of lower quantity and quality compared with new UNHCR-administered holding centers, despite comparable per capita expenditure after both types of camps had stabilized (static phase). Mortality and morbidity rates among new Khmer refugees were catastrophic during the emergency and transition phases of camp development. Health status in the refugee population during the static phase, however, was similar to, or better than, health status in the refugees' countries of origin or in the Thai communities surrounding the camps.
During the static phase, mortality and morbidity generally remained stable at roughly the same low levels in both types of camps. Furthermore, the results of multiple regression analyses demonstrated that combined public health inputs accounted for between one and 23 per cent of the variation in refugee mortality and morbidity. The direction of the associations between some public health inputs and specific health outcome variables demonstrated no clear pattern.
Abstract:
Personnel involved in natural or man-made disaster response and recovery efforts may be exposed to a wide variety of physical and mental stressors that can produce long-lasting and detrimental psychopathological outcomes. In a disaster situation, huge numbers of "secondary" responders can be involved in contaminant clean-up and debris removal and can be at risk of developing stress-related mental health outcomes. The Occupational Safety and Health Administration (OSHA) worker training hierarchy typically required for response workers, known as "Hazardous Waste Operations and Emergency Response" (HAZWOPER), does not address the mental health and safety concerns of workers. This study focused on the prevalence of traumatic stress experienced by secondary responders who had received or expressed interest in receiving HAZWOPER training through the National Institute of Environmental Health Sciences Worker Education and Training Program (NIEHS WETP). The study involved the modification of two preexisting and validated survey tools to assess secondary responder awareness of physical, mental, and traumatic stressors on mental health, and sought to determine whether a need existed to include traumatic stress-related mental health education in the current HAZWOPER training regimen. The study evaluated post-traumatic stress disorder (PTSD), resiliency, mental distress, and negative effects within a secondary responder population of 176 respondents. Elevated PTSD levels were seen in the study population as compared to a general responder population (32.9% positive vs. 8%–22.5% positive). Results indicated that HAZWOPER-trained disaster responders were likely to test positive for PTSD, whereas untrained responders with no disaster experience, and responders who possessed either training or disaster experience only, were likely to test PTSD negative. A majority (68.75%) of the population scored below the mean resiliency-to-cope score (80.4) of the average worker population.
Results indicated that those who were trained only, or who possessed both training and disaster work experience, were more likely to have lower resiliency scores than those with no training or experience. There were direct correlations between testing PTSD positive, having worked at a disaster site, and experiencing mental distress and negative effects. However, HAZWOPER training status did not significantly correlate with mental distress or negative effects. The survey indicated clear support (91% of respondents) for mental health education. The development of a pre- and post-deployment training module is recommended. Such training could provide responders with the necessary knowledge and skills to recognize the symptomology of PTSD, mental stressors, and physical and traumatic stressors, thus empowering them to employ protective strategies or seek professional help if needed. It is further recommended that pre-deployment mental health education be included in the current HAZWOPER 24- and 40-hour course curricula, and that consideration be given to integrating a stand-alone post-deployment mental health education training course into the current HAZWOPER hierarchy.
Abstract:
This paper concerns the measurement of the impact of tax differentials across countries on inflows of Foreign Direct Investment (FDI), using comprehensive data on the foreign operations of U.S. multinational corporations collected by the Bureau of Economic Analysis (BEA) of the U.S. Department of Commerce. In particular, this research focuses on examining: (1) how responsive FDI locations are to tax differentials across countries, (2) how the tax effect on FDI inflows differs between developed and developing countries, and (3) whether investment location decisions have become more or less sensitive to tax differences between countries over time, from the late 1990s to the early 2000s. Estimation results suggest that high rates of corporate income taxation are associated with reduced foreign assets of U.S. multinational firms in all industries, because such rates decrease the return to foreign asset investment. Further, foreign assets of U.S. multinationals in all industries became more responsive to non-income tax differentials across countries than to income tax differences between 1999 and 2004. Empirical estimates also indicate that foreign investment by American firms exhibits higher tax sensitivity in developed countries than in developing ones.