972 results for Damage control (Warships)
Abstract:
BACKGROUND Gambling is a form of nonsubstance addiction classified as an impulse control disorder. Pathologic gamblers are considered healthy with respect to their cognitive status. Lesions of the frontolimbic systems, mostly of the right hemisphere, are associated with addictive behavior. Because gamblers are not regarded as "brain-lesioned" and gambling is nontoxic, gambling is a model to test whether addicted "healthy" people are relatively impaired in frontolimbic neuropsychological functions. METHODS Twenty-one non-substance-dependent gamblers and nineteen healthy subjects underwent a behavioral neurologic interview centered on incidence, origin, and symptoms of possible brain damage, a neuropsychological examination, and an electroencephalogram. RESULTS Seventeen gamblers (81%) had a positive medical history for brain damage (mainly traumatic head injury and pre- or perinatal complications). The gamblers, compared with the controls, were significantly more impaired in concentration, memory, and executive functions, and evidenced a higher prevalence of non-right-handedness (43%) and non-left-hemisphere language dominance (52%). Electroencephalography (EEG) revealed dysfunctional activity in 65% of the gamblers, compared with 26% of controls. CONCLUSIONS This study shows that the "healthy" gamblers are indeed brain-damaged. Compared with a matched control population, pathologic gamblers evidenced more brain injuries, more fronto-temporo-limbic neuropsychological dysfunctions, and more EEG abnormalities. The authors thus conjecture that addictive gambling may be a consequence of brain damage, especially of the frontolimbic systems, a finding that may well have medicolegal consequences.
Abstract:
BACKGROUND It has been suggested that sleep apnea syndrome may play a role in normal-tension glaucoma by contributing to optic nerve damage. The purpose of this study was to evaluate whether optic nerve and visual field parameters in individuals with sleep apnea syndrome differ from those in controls. PATIENTS AND METHODS From the records of the sleep laboratory at the University Hospital in Bern, Switzerland, we recruited consecutive patients with severe sleep apnea syndrome proven by polysomnography (apnea-hypopnea index >20), as well as controls without sleep apnea (apnea-hypopnea index <10). Participants had to be unknown to the ophthalmology department and to have had no recent eye examination in their medical history. All participants underwent a comprehensive eye examination, scanning laser polarimetry (GDx VCC, Carl Zeiss Meditec, Dublin, California), scanning laser ophthalmoscopy (Heidelberg Retina Tomograph II, HRT II), and automated perimetry (Octopus 101, Program G2, Haag-Streit Diagnostics, Koeniz, Switzerland). Mean values of the parameters of the two groups were compared by t-test. RESULTS The sleep apnea group consisted of 69 eyes of 35 patients (age 52.7 ± 9.7 years, apnea-hypopnea index 46.1 ± 24.8); the control group consisted of 38 eyes of 19 patients (age 45.8 ± 11.2 years, apnea-hypopnea index 4.8 ± 1.9). A difference was found in mean intraocular pressure, although within a fully overlapping range (sleep apnea group: 15.2 ± 3.1 mmHg, range 8-22 mmHg; controls: 13.6 ± 2.3 mmHg, range 9-18 mmHg; p < 0.01). None of the extended visual field, optic nerve head (HRT) or retinal nerve fiber layer (GDx VCC) parameters showed a significant difference between the groups. CONCLUSION Visual field, optic nerve head, and retinal nerve fiber layer parameters in patients with sleep apnea did not differ from those in the control group. Our results do not support a pathogenic relationship between sleep apnea syndrome and glaucoma.
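The between-group comparison described above (mean parameter values compared by t-test) can be reproduced with a short script; this is only a sketch using hypothetical intraocular pressure values, not the study's data, and the variable names are ours.

from scipy import stats

# Hypothetical per-eye intraocular pressures (mmHg); placeholders only.
iop_sleep_apnea = [15.2, 14.8, 16.4, 13.9, 15.5, 16.0, 14.6]
iop_controls = [13.6, 13.1, 14.0, 12.9, 13.8, 14.2]

# Two-sample t-test on the group means, as described in the abstract.
t_stat, p_value = stats.ttest_ind(iop_sleep_apnea, iop_controls)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")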
Abstract:
BACKGROUND Assessment of the proportion of patients with well-controlled cardiovascular risk factors underestimates the proportion of patients receiving high-quality care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician response to poor control of cardiovascular risk factors, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care, or financial incentives to improve quality. METHODS We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia and diabetes mellitus, we first measured the proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or a return to control without therapy modification, within 12 months among patients with baseline poor control. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS 20% of patients with hypertension, 41% with dyslipidemia and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into measuring quality of care, 52 to 55% had appropriate quality of care. Over 12 months, therapy was modified for 61% of patients with baseline poor control of hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in the number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to receive appropriate clinical action. We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus and 3-6% for dyslipidemia. CONCLUSIONS In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing the proportion of patients in control, provides a broader view of quality of care than relying solely on the proportion in control. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
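As a plain restatement of the "appropriate clinical action" definition above, a minimal decision-rule sketch; the function name and arguments are illustrative and not part of the study protocol.

def appropriate_clinical_action(therapy_modified: bool, returned_to_control: bool) -> bool:
    # For a patient with poor control at baseline, the quality criterion is met
    # if therapy was modified OR the value returned to control without
    # modification within 12 months.
    return therapy_modified or returned_to_control

# Example: a value that returned to control without a therapy change still qualifies.
print(appropriate_clinical_action(therapy_modified=False, returned_to_control=True))  # True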
Abstract:
Hips with a cam deformity are at risk for early cartilage degeneration, mainly in the anterolateral region of the joint. T1ρ MRI is a technique described for assessing proteoglycan content in hyaline cartilage and, consequently, early cartilage damage. In this study, 1.5 Tesla T1ρ MRI was performed on 20 asymptomatic hips with a cam deformity and compared to 16 healthy control hips. Cam deformity was defined as an alpha angle over 60° at the 1:30 o'clock position and/or over 50.5° at the 3:00 o'clock position. Hip cartilage was segmented and divided into four regions of interest (ROIs): the anterolateral, anteromedial, posterolateral and posteromedial quadrants. The mean T1ρ value of the entire weight-bearing cartilage in hips with a cam deformity (34.0 ± 4.6 ms) was significantly higher than in control hips (31.3 ± 3.2 ms, p = 0.050). This difference reached significance in the anterolateral (p = 0.042) and posteromedial quadrants (p = 0.041). No significant correlation between the alpha angle and T1ρ values was detected. The results indicate that cartilage damage occurs in hips with a cam deformity before symptoms arise. A significant difference in T1ρ values was found in the anterolateral quadrant, the area of direct engagement of the deformity, and in the posteromedial quadrant. In conclusion, T1ρ MRI can detect early chondral damage in asymptomatic hips with a cam deformity.
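The alpha-angle criterion quoted above can be written as a one-line decision rule; the sketch below simply restates that definition (the function name and inputs are ours, not the paper's).

def has_cam_deformity(alpha_0130: float, alpha_0300: float) -> bool:
    # Study definition: alpha angle > 60 degrees at the 1:30 o'clock position
    # and/or > 50.5 degrees at the 3:00 o'clock position.
    return alpha_0130 > 60.0 or alpha_0300 > 50.5

# Example: a hip exceeding only the 1:30 o'clock threshold still qualifies.
print(has_cam_deformity(62.0, 48.0))  # True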
Abstract:
Non-cage housing systems for laying hens such as aviaries provide greater freedom to perform species-specific behavior and thus are thought to improve welfare of the birds; however, aviaries are associated with a high prevalence of keel bone damage (fractures and deviations), which is a major welfare problem in commercial laying hens. Potential causes of keel bone damage are falls and collisions with internal housing structures that occur as birds move between tiers or perches in the aviary. The aim of this study was to investigate the scope for reducing keel bone damage by reducing falls and collisions through modifications of aviary design. Birds were kept in 20 pens in a laying hen house (225 hens per pen) that were assigned to four different treatments (n = 5 pens per treatment group): (1) control pens and pens modified by the addition of (2) perches, (3) platforms or (4) ramps. Video recordings at 19, 22, 29, 36 and 43 weeks of age were used to analyze controlled movements and falls (including details on occurrence of collision, cause of fall, height of fall and behavior after fall) during the transitional dusk and subsequent dark phase. Palpation assessments (focusing on fractures and deviations) of 20 focal hens per pen were conducted at 18, 20, 23, 30, 37, 44, 52 and 60 weeks of age. In comparison with the control group, we found 44% more controlled movements in the ramp treatment (P = 0.003) and 47% more in the platform treatment (P = 0.014), as well as 45% fewer falls (P = 0.006) and 59% fewer collisions (P < 0.001) in the ramp treatment. There were no significant differences between the control and perch treatments. At 60 weeks of age, 23% fewer fractured keel bones were found in the ramp treatment compared with the control treatment (P = 0.0053). After slaughter at 66 weeks of age, no difference in keel bone damage was found between treatment groups, and the prevalence of fractures increased to an average of 86%. As a potential mechanism to explain the differences in locomotion, we suggest that ramps facilitated movement in the vertical plane by providing a continuous path between the tiers and thus supported more natural behavior (i.e. walking and running) of the birds. By reducing events that potentially damage keel bones, the installation of ramps may have reduced the prevalence of keel fractures for a major portion of the flock cycle. We conclude that aviary design and the installation of specific internal housing structures (i.e. ramps and platforms) have considerable potential to reduce keel bone damage of laying hens in aviary systems.
Abstract:
Lung damage is a common side effect of chemotherapeutic drugs such as bleomycin. This study used a bleomycin mouse model which simulates the lung damage observed in humans. Noninvasive, in vivo cone-beam computed tomography (CBCT) was used to visualize and quantify fibrotic and inflammatory damage over the entire lung volume of mice. Bleomycin was used to induce pulmonary damage in vivo, and the results from two CBCT systems, a micro-CT and a flat-panel CT (fpCT), were compared to histologic measurements, the standard method of murine lung damage quantification. Twenty C57BL/6 mice were given either 3 U/kg of bleomycin or saline intratracheally. The mice were scanned at baseline, before the administration of bleomycin, and then 10, 14, and 21 days afterward. At each time point, a subset of mice was sacrificed for histologic analysis. The resulting CT images were used to assess lung volume. Percent lung damage (PLD) was calculated for each mouse on both the fpCT (PLDfpCT) and the micro-CT (PLDμCT). Histologic PLD (PLDH) was calculated for each histologic section at each time point (day 10, n = 4; day 14, n = 4; day 21, n = 5; control group, n = 5). A linear regression was applied to the PLDfpCT vs. PLDH, PLDμCT vs. PLDH and PLDfpCT vs. PLDμCT distributions. This study did not demonstrate strong correlations between PLDCT and PLDH. The coefficient of determination, R², was 0.68 for PLDμCT vs. PLDH and 0.75 for PLDfpCT vs. PLDH. The experimental issues identified from this study were: (1) inconsistent inflation of the lungs from scan to scan, (2) variable distribution of damage (one histologic section not representative of overall lung damage), (3) control mice not scanned with each group of bleomycin mice, (4) the use of two CT systems caused long anesthesia times for the mice, and (5) respiratory gating did not hold the lung volume constant throughout the scan. Addressing these issues might allow further improvement of the correlation between PLDCT and PLDH.
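The regression analysis described above (CT-derived versus histologic percent lung damage, summarized by a coefficient of determination) can be sketched as follows; the paired values are placeholders, not the study's measurements.

import numpy as np
from scipy import stats

# Hypothetical paired percent-lung-damage (PLD) values for a set of mice.
pld_hist = np.array([5.0, 12.0, 20.0, 35.0, 50.0])   # histologic PLD
pld_ct = np.array([7.0, 10.0, 24.0, 30.0, 55.0])     # CT-derived PLD

# Ordinary least-squares fit; rvalue**2 is the coefficient of determination.
fit = stats.linregress(pld_ct, pld_hist)
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, R^2 = {fit.rvalue**2:.2f}")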
Abstract:
Lymphocyte development requires the assembly of diversified antigen receptor complexes generated by the genetically programmed V(D)J recombination event. Because germline DNA is cut, introducing potentially dangerous double-stranded breaks (DSBs), and rearranged prior to repair, this activity is limited to the non-cycling stages of the cell cycle, G0/G1. The potential involvement of a key mediator of the DNA damage response (DDR) and cell cycle checkpoints, Ataxia Telangiectasia Mutated (ATM), has been implicated in recombination, but its role is not fully understood. Thymic lymphomas from ATM-deficient mice contain clonal chromosomal translocations involving the T-cell antigen receptor (TCR). A previous report found ATM and its downstream target p53 associated with V(D)J intermediates, suggesting the DDR senses recombination. In this study, we sought to understand the role of ATM in V(D)J recombination. Developing thymocytes from ATM-deficient mice were analyzed according to the cell cycle to detect V(D)J intermediates. Examination of all TCR loci in the non-cycling (G0/G1) and cycling (S/G2/M) fractions revealed the persistence of intermediates in ATM-deficient thymocytes, contrary to the wild type, in which intermediates are found only during G0/G1. Further analysis found no defect in end-joining of intermediates, nor were they detected in developed T-cells. Based upon the presence of persisting intermediates, the recombination-initiating nuclease Rag-2, whose strict regulation limits it to G0/G1, was examined. Rag-2 regulation was not affected by ATM deficiency, as Rag-2 expression remained contained within G0/G1, indicating recombination is not continuous. To determine whether ATM deficiency affects recognition of V(D)J breaks, sites of recombination identified by a TCR locus or Rag expression were analyzed for co-localization with a DDR factor phosphorylated immediately after DNA damage, phosphorylated H2AX (γH2AX). No differences in co-localization were found between the wild type and the ATM deficiency, demonstrating that ATM-deficient lymphocytes retain the ability to recognize DSBs. Together, these results suggest ATM is necessary for the cell cycle regulation of recombination but not essential for the identification of V(D)J breaks. ATM ensures the containment of intermediates within G0/G1 and maintains genomic stability of developing lymphocytes, emphasizing its fundamental role in preventing tumorigenesis.
Abstract:
This case-control study was conducted to assess the association between lung cancer risk, mutagen sensitivity (a marker of cancer susceptibility), and a putative lung carcinogen, wood dust exposure. There were 165 cases (98 African-Americans, 67 Mexican-Americans) with newly diagnosed, previously untreated lung cancer, and 239 controls, frequency-matched on age, sex, and ethnicity. Mutagen sensitivity (≥1 break/cell) was associated with a statistically significant elevated risk for lung cancer (odds ratio (OR) = 4.1, 95% confidence limits (CL) = 2.3, 7.2). Wood dust exposure was also a significant predictor of risk (OR = 2.8, 95% CL = 1.2, 6.6) after controlling for smoking and mutagen sensitivity. When stratified by ethnicity, wood dust exposure was a significant risk factor for African-Americans (OR = 4.0, 95% CL = 1.4, 11.5), but not for Mexican-Americans (OR = 1.5, 95% CL = 0.3, 7.1). Stratified analysis suggested a greater-than-multiplicative interaction between wood dust exposure and both mutagen sensitivity and smoking. The cases had significantly more breaks on chromosomes 4 and 5 than the controls did, with ORs of 4.9 (95% CL = 2.0, 11.7) and 3.9 (95% CL = 1.6, 9.3), respectively. Breaks at 4p14, 4q27, 4q31, 5q21-22, 5q31, and 5q33 were significantly more common in lung cancer patients than in controls. Lung cancer risk had a dose-response relationship with breaks on chromosomes 4 and 5. Cigarette smoking had a strong interaction with breaks on chromosomes 2, 4, and 5. In a molecular cytogenetic study using chromosome painting and G-banding, we showed that: (1) the proportion of chromosome 5 abnormalities surviving as chromosome-type aberrations remained significantly higher in cells of lung cancer cases (14%) than in controls (5%) (P < 0.001), whereas no significant differences were detected in chromosome 4 abnormalities between cases and controls; (2) the proportion of chromosome 5q13-22 abnormalities was 5.3% in the cases and 0.7% in the controls (P < 0.001); the 5q13-22 region represented 40% of all abnormalities on chromosome 5 in the cases and only 14% in the controls. This study suggests that mutagen sensitivity, wood dust exposure, and cigarette smoking were independent risk factors for lung cancer, and that the susceptibility of particular chromosome loci to mutagenic damage may be a genetic marker for specific types of lung cancer.
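The odds ratios and 95% confidence limits reported above follow from standard case-control 2×2-table arithmetic; the sketch below uses invented counts, not the study's data.

import math

# Hypothetical 2x2 table for one exposure (rows: cases/controls, columns: exposed/unexposed).
a, b = 40, 125   # cases: exposed, unexposed
c, d = 25, 214   # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)        # Woolf's standard error
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CL = {lower:.2f}, {upper:.2f}")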
Abstract:
The inability to maintain genomic stability and control proliferation are hallmarks of many cancers, and they become exacerbated in the presence of unrepaired DNA damage. Such genotoxic stresses trigger the p53 tumor suppressor network to activate transient cell cycle arrest allowing for DNA repair; if the damage is excessive or irreparable, apoptosis or cellular senescence is triggered. One of the major DNA repair pathways that mend DNA double-strand breaks is non-homologous end joining (NHEJ). Abrogating the NHEJ pathway leads to an accumulation of DNA damage in the lymphoid system that triggers p53-mediated apoptosis; complete deletion of p53 in this system leads to aggressive lymphomagenesis. Therefore, to study the effect of p53-dependent cell cycle arrest, we utilized a hypomorphic, separation-of-function mutant, p53p/p, which completely abrogates apoptosis yet retains partial cell cycle arrest ability. We crossed DNA ligase IV deficiency, which removes a downstream ligase crucial for mending breaks during NHEJ, into the p53p/p background (Lig4-/-p53p/p). The accumulation of DNA damage activated the p53/p21 axis to trigger cellular senescence in developing lymphoid cells, which absolutely suppressed tumorigenesis. Interestingly, these mice progressively succumb to severe diabetes. Mechanistic analysis revealed that spontaneous DNA damage accumulated in the pancreatic β-cells, a unique subset of endocrine cells solely responsible for insulin production to regulate glucose homeostasis. The genesis of adult β-cells predominantly occurs through self-replication; therefore modulating cellular proliferation is an essential component of renewal. The progressive accumulation of DNA damage, caused by Lig4-/-, activated p53/p21-dependent cellular senescence in mutant pancreatic β-cells, leading to islet involution. Insulin levels subsequently decreased, deregulating glucose homeostasis and driving overt diabetes. Our Lig4-/-p53p/p model aptly depicts the dichotomous role of cellular senescence: in the lymphoid system it prevents tumorigenesis, yet in the endocrine system it leads to the loss of insulin-producing cells, causing diabetes. To further delineate the function of NHEJ in pancreatic β-cells, we analyzed mice deficient in another component of the NHEJ pathway, Ku70. Although most notable for its role in DNA damage recognition and repair within the NHEJ pathway, Ku70 has NHEJ-independent functions in telomere maintenance, apoptosis, and transcriptional regulation/repression. To our surprise, Ku70-/-p53p/p mutant mice displayed a stark increase in β-cell proliferation, resulting in islet expansion, heightened insulin levels and hypoglycemia. Augmented β-cell proliferation was accompanied by stabilization of the canonical Wnt pathway, which was responsible for this phenotype. Interestingly, the progressive onset of cellular senescence prevented islet tumorigenesis. This study highlights Ku70 as an important modulator not only of genomic stability through NHEJ-dependent functions, but also, through a novel NHEJ-independent function, of pancreatic β-cell proliferation. Taken in aggregate, these studies underscore the importance of NHEJ for maintaining genomic stability in β-cells and introduce a novel regulator of pancreatic β-cell proliferation.
Abstract:
Extreme winter warming events in the sub-Arctic have caused considerable vegetation damage due to rapid changes in temperature and loss of snow cover. The frequency of extreme weather is expected to increase due to climate change, thereby increasing the potential for recurring vegetation damage in Arctic regions. Here we present data on vegetation recovery from one such natural event and multiple experimental simulations in the sub-Arctic, using remote sensing, handheld passive proximal sensors and ground surveys. The normalized difference vegetation index (NDVI) recovered quickly (within 2 years) from the 26% decline following one natural extreme winter warming event. Recovery was associated with declines in dead Empetrum nigrum (the dominant dwarf shrub) recorded in ground surveys. However, NDVI of healthy E. nigrum leaves was also reduced (by 16%) following this winter warming event in experimental plots (both control and treatments), suggesting that non-obvious plant damage (i.e., physiological stress) had occurred in addition to the dead E. nigrum shoots that were considered responsible for the regional 26% NDVI decline. Plot- and leaf-level NDVI provided useful additional information that could not be obtained from vegetation surveys and regional remote sensing (MODIS) alone. The major damage of an extreme winter warming event appears to be relatively transitory. However, potential knock-on effects on higher trophic levels (e.g., rodents, reindeer, and bear) could be unpredictable and large. Repeated warming events year after year, which can be expected under winter climate warming, could result in damage that may take much longer to recover.
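NDVI, the index whose 26% decline and recovery are reported above, is computed from red and near-infrared reflectance as (NIR - Red) / (NIR + Red); a minimal illustration with placeholder reflectance values follows.

import numpy as np

def ndvi(nir, red):
    # Normalized difference vegetation index: (NIR - Red) / (NIR + Red).
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Placeholder reflectances for healthy versus damaged shrub canopy.
print(ndvi(0.45, 0.08))  # ~0.70: dense, healthy vegetation
print(ndvi(0.30, 0.12))  # ~0.43: damaged or stressed vegetation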
Abstract:
Las "orugas defoliadoras" afectan la producción del cultivo de soja, sobre todo en años secos y con altas temperaturas que favorecen su desarrollo. El objetivo del presente trabajo fue evaluar la eficiencia de control de insecticidas neurotóxicos e IGRs sobre "orugas defoliadoras" en soja. Se realizaron ensayos en lotes comerciales en tres localidades de la provincia de Córdoba en las campañas agrícolas 2008/09 y 2009/10, bajo un diseño de bloques al azar, con seis tratamientos y tres repeticiones. Los tratamientos fueron: T1: Clorpirifos (384 g p.a.ha-1), T2: Cipermetrina (37,5 g p.a.ha-1), T3: Lufenuron+Profenofos (15 + 150 g p.a.ha-1), T4: Metoxifenocide (28,8 g p.a.ha-1), T5: Novaluron (10 g p.a.ha-1) y T6: Testigo. El tamaño de las parcelas fue de 12 surcos de 10 m de largo distanciados a 0,52 m. La aplicación se realizó con una mochila provista de boquillas de cono hueco (40 gotas.cm-2), cuando la plaga alcanzó el umbral de daño económico. En cada parcela se tomaron cinco muestras a los 0, 2, 7 y 14 días después de la aplicación (DDA) utilizando el paño vertical, identificando y cuantificando las orugas vivas mayores a 1,5 cm. A los 14 DDA se extrajeron 30 folíolos por parcela (estrato medio y superior de la planta) y se determinó el porcentaje de defoliación utilizando el software WinFolia Reg. 2004. Se estimó el rendimiento sobre 5 muestras de 1 m2 en cada parcela y se realizó ANOVA y test de comparación de medias LSD de Fisher. El Clorpirifos mostró el mayor poder de volteo y el Metoxifenocide la mayor eficiencia a los 7 DDA. En general los IGRs mostraron mayor poder residual.
Abstract:
Future oceans are predicted to contain less oxygen than at present, because oxygen is less soluble in warmer water and the predicted stratification will reduce mixing. Hypoxia is thus likely to become more widespread in marine environments, and understanding species' responses is important for predicting future impacts on biodiversity. This study used a tractable model, the Antarctic clam Laternula elliptica, which can live for 36 years and has a well-characterized ecology and physiology, to understand responses to hypoxia and how the effect varies with age. Younger animals had a higher condition index and higher adenylate energy charge, and transcriptional profiling indicated that they were physically active in their response to hypoxia, whereas older animals were more sedentary, with higher levels of oxidative damage and apoptosis in the gills. These effects could be attributed, in part, to age-related tissue scaling; older animals had proportionally less contractile muscle mass and smaller gills and foot compared with younger animals, with consequential effects on the whole-animal physiological response. The data here emphasize the importance of including age effects when predicting responses to environmental perturbation, as large mature individuals appear to be less able to resist hypoxic conditions, and this is the size range that is the major contributor to future generations. Thus, the increased prevalence of hypoxia in future oceans may have marked effects on benthic organisms' abilities to persist, and this is especially so for long-lived species.
Abstract:
In the early stages of the development of Japan’s environmental policy, sulfur oxide (SOx) emissions, which seriously damage health, were the most important air pollution problem. In the second half of the 1960s and the first half of the 1970s, measures against SOx emissions progressed quickly, and these emissions were reduced drastically. The most important factor in this reduction was the conversion to low-sulfur fuel by large-scale fuel users, such as the electric power industry. However, industries began converting to low-sulfur fuel not out of environmental concern but simply to reduce costs. Furthermore, the interaction among the various interests of the electric power industry, oil refineries, the central government, local governments, and citizens over energy and environmental policies led to the measures against SOx emissions through fuel conversion.
Abstract:
Olive fruit fly, Bactrocera oleae (Rossi), is a key pest in olive orchards, causing serious economic damage. To date, the pest has already developed resistance to the insecticides commonly applied to control it. Thus, in searching for new products for an accurate resistance management programme, targeting the ecdysone receptor (EcR) might provide alternative compounds for use in such programmes. RESULTS: Residual contact and oral exposure in the laboratory of B. oleae adults to the dibenzoylhydrazine-based compounds methoxyfenozide, tebufenozide and RH-5849 showed different results. Methoxyfenozide and tebufenozide did not provoke any negative effects on the adults, but RH-5849 killed 98-100% of the treated insects 15 days after treatment. The ligand-binding domain (LBD) of the EcR of B. oleae (BoEcR-LBD) was sequenced, and a homology protein model was constructed. Owing to a restricted extent of the ligand-binding cavity of the BoEcR-LBD, docking experiments with the three tested insecticides showed a severe steric clash in the case of methoxyfenozide and tebufenozide, while this was not the case with RH-5849. CONCLUSION: IGR molecules similar to the RH-5849 molecule, and different from methoxyfenozide and tebufenozide, might have potential in controlling this pest.
Abstract:
This paper presents the design and implementation of an intelligent control system based on local neurofuzzy models of the milling process, relayed through an Ethernet-based application. Its purpose is to control the spindle torque of a milling process by using an internal model control paradigm to modify the feed rate in real time. The stabilization of cutting torque is especially necessary in milling processes such as high-speed roughing of steel moulds and dies that present minor geometric uncertainties. Maintaining the cutting torque increases the material removal rate and reduces the risk of damage to the spindle, a very sensitive and expensive component in all high-speed milling machines, caused by excessive vibration. Torque control is therefore an interesting challenge from an industrial point of view.
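As a rough illustration of the feed-rate correction loop described above, a simplified proportional sketch follows; this is not the paper's neurofuzzy internal-model controller, and all names, gains and bounds are assumptions.

def feed_override_step(torque_ref, torque_meas, override, gain=0.05, lo=0.2, hi=1.5):
    # Simplified proportional correction of the feed-rate override: if the
    # measured spindle torque exceeds the reference, slow the feed; if it is
    # below, speed it up. Bounds keep the override within a safe range.
    error = torque_ref - torque_meas          # positive -> torque below target
    override *= 1.0 + gain * error / torque_ref
    return max(lo, min(hi, override))

# Example: measured torque 10% above the reference lowers the override below 1.0.
print(feed_override_step(torque_ref=20.0, torque_meas=22.0, override=1.0))  # 0.995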