934 results for barrier integrity


Relevance:

20.00%

Publisher:

Abstract:

Geochemical barrier zones play an important role in determining various physical systems and characteristics of the oceans, e.g. hydrodynamics, salinity, temperature and light. In the book, each of the more than 30 barrier zones is illustrated and defined by physical, chemical and biological parameters. Among the topics discussed are processes of inflow, transformation and precipitation of the sedimentary layer in the open oceans and in more restricted areas such as the Baltic, Black and Mediterranean Seas.

Relevance:

20.00%

Publisher:

Abstract:

Surface finish is one of the most relevant aspects of machining operations, since it is one of the principal criteria used to assess quality. Surface finish also influences mechanical properties such as fatigue behavior, wear and corrosion. The feed, the cutting speed, the cutting tool material, the workpiece material and the cutting tool wear are some of the most important factors that affect the surface roughness of the machined surface. Due to the importance of martensitic 416 stainless steel in the petroleum industry, especially in valve parts and pump shafts, this material was selected to study the influence of the feed per tooth and cutting speed on tool wear and surface integrity. The influence of tool wear on surface roughness is also analyzed. Results showed that high roughness values are obtained when using low cutting speed and low feed per tooth, and that under these conditions tool wear decreases, prolonging tool life. Copyright © 2009 by ASME.
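For context, the purely kinematic (ideal) roughness left by a round-nosed tool depends only on the feed and the nose radius, so the higher roughness reported here at low feed and low cutting speed points to other mechanisms (e.g. ploughing or built-up edge formation) dominating in that regime. The standard textbook estimate is sketched below; the symbols (feed or feed per tooth f, tool nose radius r_ε) are generic and not taken from the abstract.

```latex
% Kinematic (ideal) roughness for a round-nosed tool:
% peak-to-valley height R_t and arithmetic mean roughness R_a
% as functions of feed f and tool nose radius r_\varepsilon.
R_t \approx \frac{f^{2}}{8\, r_{\varepsilon}},
\qquad
R_a \approx \frac{f^{2}}{32\, r_{\varepsilon}}
```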

Relevance:

20.00%

Publisher:

Abstract:

The complete and faithful duplication of the genome is essential to ensure normal cell division and organismal development. Eukaryotic DNA replication is initiated at multiple sites, termed origins of replication, that are activated at different times throughout S phase. The replication timing program is regulated by the S-phase checkpoint, which signals and responds to replicative stress. Eukaryotic DNA is packaged with histones into chromatin; DNA-templated processes, including replication, are therefore modulated by the local chromatin environment, including post-translational modifications (PTMs) of histones.

One such epigenetic mark, methylation of lysine 20 on histone H4 (H4K20), has been linked to chromatin compaction, transcription, DNA repair and DNA replication. H4K20 can be mono-, di- and tri-methylated. Monomethylation of H4K20 (H4K20me1) is mediated by the cell cycle-regulated histone methyltransferase PR-Set7, and subsequent di-/tri-methylation is catalyzed by Suv4-20. Prior studies have shown that PR-Set7 depletion in mammalian cells results in defective S-phase progression and the accumulation of DNA damage, which may be partially attributed to defects in origin selection and activation. Conversely, overexpression of mammalian PR-Set7 recruits components of the pre-Replication Complex (pre-RC) onto chromatin and licenses replication origins for re-replication. However, these studies were limited to only a handful of mammalian origins, and it remains unclear how PR-Set7 impacts the replication program on a genomic scale. Finally, the methylation substrates of PR-Set7 include both histone (H4K20) and non-histone targets; it is therefore necessary to directly test the role of H4K20 methylation in PR-Set7-regulated phenotypes.

I employed genetic, cytological, and genomic approaches to better understand the role of H4K20 methylation in regulating DNA replication and genome stability in Drosophila melanogaster cells. Depletion of Drosophila PR-Set7 by RNAi in cultured Kc167 cells led to an ATR-dependent cell cycle arrest with near-4N DNA content and the accumulation of DNA damage, indicating a defect in completing S phase. The cells arrested in the second S phase following PR-Set7 downregulation, suggesting an epigenetic effect coupled to the dilution of the histone modification over multiple cell cycles. To directly test the role of H4K20 methylation in regulating genome integrity, I collaborated with the Duronio Lab and observed spontaneous DNA damage in the imaginal wing discs of third instar mutant larvae carrying an alanine substitution at H4K20 (H4K20A), which cannot be methylated, confirming that H4K20 is a bona fide target of PR-Set7 in maintaining genome integrity.

One possible source of DNA damage due to loss of PR-Set7 is reduced origin activity. I used BrdU-seq to profile the genome-wide origin activation pattern. However, I found that deregulation of H4K20 methylation states, by manipulating the H4K20 methyltransferases PR-Set7 and Suv4-20, had no impact on origin activation throughout the genome. I then mapped the genomic distribution of DNA damage upon PR-Set7 depletion. Surprisingly, ChIP-seq of the DNA damage marker γ-H2A.v located the DNA damage to late-replicating euchromatic regions of the Drosophila genome, and the γ-H2A.v signal was uniformly distributed across entire late-replicating domains, implying stochastic replication fork collapse within late-replicating regions. Together these data suggest that PR-Set7-mediated monomethylation of H4K20 is critical for maintaining the genomic integrity of late-replicating domains, presumably via stabilization of late-replicating forks.
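As a rough illustration of the kind of comparison described above, the sketch below computes a per-bin log2 ChIP/input enrichment for a damage mark and summarises it by replication-timing class. It is a minimal sketch, assuming pre-computed bin counts in a hypothetical tab-separated file; the file path and column names are illustrative and not taken from the thesis.

```python
# Minimal sketch: per-bin log2 ChIP/input enrichment for a DNA-damage mark,
# summarised by replication-timing class. The input file, its columns, and
# the bin-level counts are hypothetical stand-ins for a real gamma-H2A.v
# ChIP-seq processing pipeline.
import numpy as np
import pandas as pd

def enrichment_by_timing(counts_tsv: str) -> pd.DataFrame:
    """Expect columns: chrom, start, end, chip, input, timing ('early'/'late')."""
    bins = pd.read_csv(counts_tsv, sep="\t")
    # Library-size normalisation to counts per million, then log2 ratio per bin.
    chip_cpm = bins["chip"] / bins["chip"].sum() * 1e6
    input_cpm = bins["input"] / bins["input"].sum() * 1e6
    bins["log2_enrichment"] = np.log2((chip_cpm + 1) / (input_cpm + 1))
    # Compare the damage signal between early- and late-replicating bins.
    return bins.groupby("timing")["log2_enrichment"].describe()

if __name__ == "__main__":
    print(enrichment_by_timing("gH2Av_10kb_bins.tsv"))  # hypothetical path
```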

In addition to investigating the function of H4K20me, I used immunofluorescence to characterize the cell cycle-regulated chromatin loading of the Mcm2-7 complex, the DNA helicase that licenses replication origins, using H4K20me1 levels as a proxy for cell cycle stage. Consistent with the chromatin spin-down data of Powell et al. (2015), we showed continuous loading of Mcm2-7 onto chromatin during G1 and its progressive removal through S phase.

Relevance:

20.00%

Publisher:

Abstract:

Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity in Linux kernel 2.4.32 and the Windows Research Kernel (WRK), with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, Invariant Monitor detects ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through the violations it causes during execution.
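The KQ-guarding approach described above is, at its core, learn-then-validate: legitimate kernel queue requests observed during training form a whitelist, and unknown requests are rejected at runtime. The sketch below illustrates only that idea in user-space Python; the request fields, names, and example values are hypothetical and do not reflect the actual KQguard implementation, which operates inside the kernel.

```python
# Sketch of the learn-then-validate idea behind guarding Kernel Queue (KQ)
# requests. Illustrative only: the thesis implements this inside the kernel,
# not in Python, and the request fields and example values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class KQRequest:
    queue: str      # which kernel queue the request targets (e.g. a timer queue)
    callback: str   # symbol of the callback function to be invoked
    module: str     # module that issued the request

def learn_whitelist(training_requests):
    """Collect the set of (queue, callback, module) tuples seen in benign runs."""
    return {(r.queue, r.callback, r.module) for r in training_requests}

def validate(request, whitelist):
    """Accept a request only if it matches a tuple learned during training."""
    return (request.queue, request.callback, request.module) in whitelist

# Whitelist built from (hypothetical) benign workloads; an unknown request
# issued by a rootkit fails validation and would be rejected.
benign = [
    KQRequest("timer", "tcp_keepalive_timer", "net"),
    KQRequest("dpc", "disk_flush", "ata"),
]
whitelist = learn_whitelist(benign)
print(validate(KQRequest("timer", "hide_process", "rootkit"), whitelist))  # False
```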

Relevance:

20.00%

Publisher:

Abstract:

The distribution, abundance, behaviour, and morphology of marine species are affected by spatial variability in the wave environment. Maps of wave metrics (e.g. significant wave height Hs, peak energy wave period Tp, and benthic wave orbital velocity URMS) are therefore useful for predictive ecological models of marine species and ecosystems. A number of techniques are available to generate maps of wave metrics, with varying levels of complexity in terms of input data requirements, operator knowledge, and computation time. Relatively simple "fetch-based" models are generated using geographic information system (GIS) layers of bathymetry and dominant wind speed and direction. More complex, but computationally expensive, "process-based" models are generated using numerical models such as the Simulating Waves Nearshore (SWAN) model. We generated maps of wave metrics based on both fetch-based and process-based models and asked whether the predictive performance of models of benthic marine habitats differed between them. Predictive models of seagrass distribution for Moreton Bay, Southeast Queensland, and Lizard Island, Great Barrier Reef, Australia, were generated using maps based on each type of wave model. For Lizard Island, the performance of the process-based wave maps was significantly better for describing the presence of seagrass, based on Hs, Tp, and URMS. Conversely, for the predictive model of seagrass in Moreton Bay, based on benthic light availability and Hs, there was no difference in performance between maps of the different wave metrics. For predictive models in which wave metrics are the dominant factor determining ecological processes, it is recommended that process-based models be used. Our results suggest that for models in which wave metrics provide secondarily useful information, either fetch- or process-based models may be equally useful.
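As a rough illustration of what a simple fetch-based estimate involves, the sketch below combines a classical SMB-type deep-water fetch-limited parameterisation of Hs and wave period with linear wave theory for the near-bed orbital velocity. The coefficients, function names, and inputs are illustrative assumptions; this is not the GIS workflow or the SWAN model used in the study.

```python
# Sketch of a simple "fetch-based" estimate of wave metrics and the near-bed
# orbital velocity it implies, assuming an SMB-type deep-water fetch-limited
# parameterisation plus linear wave theory. Inputs below are hypothetical.
import math

G = 9.81  # gravitational acceleration, m s^-2

def fetch_limited_waves(wind_speed: float, fetch: float) -> tuple[float, float]:
    """Return (Hs [m], T [s]) for a wind speed [m s^-1] blowing over a fetch [m]."""
    x = G * fetch / wind_speed**2  # dimensionless fetch
    hs = 0.283 * math.tanh(0.0125 * x**0.42) * wind_speed**2 / G
    t = 1.2 * math.tanh(0.077 * x**0.25) * 2.0 * math.pi * wind_speed / G
    return hs, t

def wavenumber(period: float, depth: float) -> float:
    """Solve the linear dispersion relation omega^2 = g k tanh(k h) iteratively."""
    omega = 2.0 * math.pi / period
    k = omega**2 / G  # deep-water first guess
    for _ in range(100):
        k = 0.5 * (k + omega**2 / (G * math.tanh(k * depth)))  # damped fixed point
    return k

def bottom_orbital_velocity_rms(hs: float, period: float, depth: float) -> float:
    """RMS near-bed orbital velocity of a monochromatic wave of height Hs."""
    k = wavenumber(period, depth)
    u_amp = math.pi * hs / (period * math.sinh(k * depth))  # velocity amplitude
    return u_amp / math.sqrt(2.0)

hs, t = fetch_limited_waves(wind_speed=10.0, fetch=20_000.0)
print(hs, t, bottom_orbital_velocity_rms(hs, t, depth=5.0))
```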

Relevance:

20.00%

Publisher:

Abstract:

The uptake of anthropogenic carbon dioxide emissions is lowering the carbonate saturation state of seawater and causing a drop in ocean pH. Understanding how marine calcifying organisms such as coralline algae may acclimatize to ocean acidification is important for assessing their survival over the coming century. We present the first long-term perturbation experiment on cold-water coralline algae, which are important marine calcifiers in benthic ecosystems, particularly at higher latitudes. After three months of incubation, Lithothamnion glaciale continued to calcify even under undersaturated conditions, with a significant trend towards lower growth rates with increasing pCO2. However, the major changes in ultrastructure occurred by 589 µatm (i.e. in saturated waters). Finite element models of algae grown at these elevated levels show an increase in total strain energy of nearly an order of magnitude and an uneven distribution of stress inside the skeleton when subjected to loads similar to those applied to algae grown at ambient levels. This weakening of the structure is likely to reduce the ability of the alga to resist boring predators and wave energy, with severe consequences for benthic community structure in the immediate future (50 years).
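For reference, the total strain energy compared between these finite element models is, in the linear-elastic case, the volume integral of stress times strain over the skeleton; the expression below is the generic textbook definition and its symbols are not taken from the study.

```latex
% Total strain energy of a linear-elastic body of volume V,
% with stress tensor \sigma and strain tensor \varepsilon:
U = \tfrac{1}{2}\int_{V} \boldsymbol{\sigma} : \boldsymbol{\varepsilon}\,\mathrm{d}V
```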

Relevance:

20.00%

Publisher:

Abstract:

Community metabolism was investigated using a Lagrangian flow respirometry technique on two reef flats: at Moorea (French Polynesia) during austral winter and at Yonge Reef (Great Barrier Reef) during austral summer. The data were used to estimate the related air-sea CO2 disequilibrium. A sine function did not satisfactorily model the diel light curves and overestimated the metabolic parameters. The ranges of community gross primary production and respiration (Pg and R; 9 to 15 g C m-2 d-1) were within the range previously reported for reef flats, and community net calcification (G; 19 to 25 g CaCO3 m-2 d-1) was higher than the 'standard' range. The molar ratio of organic to inorganic carbon uptake was 6:1 at both sites. The reef flat at Moorea displayed a higher rate of organic production and a lower rate of calcification compared with previous measurements carried out during austral summer. The approximate uncertainty of the daily metabolic parameters was estimated using a procedure based on a Monte Carlo simulation. The standard errors of Pg, R and Pg/R, expressed as a percentage of the mean, are lower than 3%, but are comparatively larger for E, the excess production (6 to 78%). The daily air-sea CO2 flux (FCO2) was positive throughout the field experiments, indicating that the reef flats at Moorea and Yonge Reef released CO2 to the atmosphere at the time of measurement. FCO2 decreased as a function of increasing daily irradiance.
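To illustrate why E can carry a much larger relative error than Pg, R, or Pg/R, the sketch below runs a toy Monte Carlo propagation: because the excess production E = Pg - R is a small difference between two large, similar quantities, even a small relative error on each inflates the relative standard error of E. The means and the error level used here are hypothetical, not those of the study.

```python
# Toy Monte Carlo error propagation illustrating why E = Pg - R carries a much
# larger relative standard error than Pg, R, or Pg/R. The means and the 2%
# error level below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 10_000
pg_mean, r_mean, rel_err = 12.0, 11.5, 0.02  # g C m-2 d-1, relative error

pg = rng.normal(pg_mean, rel_err * pg_mean, n_draws)
r = rng.normal(r_mean, rel_err * r_mean, n_draws)

quantities = {"Pg": pg, "R": r, "Pg/R": pg / r, "E = Pg - R": pg - r}
for name, samples in quantities.items():
    se_pct = 100.0 * samples.std() / abs(samples.mean())
    print(f"{name:11s} mean = {samples.mean():6.2f}   SE% = {se_pct:5.1f}")
```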

Relevance:

20.00%

Publisher:

Abstract:

Background: Academic integrity (AI) has been defined as the commitment to the values of honesty, trust, fairness, respect, and responsibility, with courage, in all academic endeavours. The senior years of nursing studies are a point of transition at which students move into professional roles through clinical practice. It is essential to understand what predicts senior nursing students' intention to behave with AI so that efforts can be directed to initiatives that strengthen their commitment to behaving with AI. Research Questions: To what extent do students differ on Theory of Planned Behaviour (TPB) variables? What predicts intention to behave with academic integrity among senior nursing students in clinical practice across three different Canadian Schools of Nursing? Method: The TPB framework, an elicitation study (n=30) and two pilot studies (n=59, n=29) informed the development of a 38-question (41-item) self-report survey (Miron Academic Integrity Nursing Survey, MAINS; α>0.70) that was administered to Year 3 and 4 students (N=339). Three predictor variables (attitude, subjective norm, perceived behavioural control) were measured, along with students' intention to behave with AI in clinical practice. Age, sex, year of study, program stream, students' understanding of AI policies, and locations where students accessed AI information were also measured. Results: Hierarchical multiple regression analyses revealed that background, site, and TPB variables explained 32.6% of the variance in intention to behave with academic integrity. The TPB variables explained 26.8% of the variance in intention after controlling for background and site variables. In the final model, only the TPB predictor variables were statistically significant, with attitude having the highest beta value (beta=0.35, p<0.001), followed by subjective norm (beta=0.21, p<0.001) and perceived behavioural control (beta=0.12, p<0.02). Conclusion: Student attitude is the strongest predictor of intention to behave with AI in clinical practice, and efforts to positively influence students' attitudes need to be a focus for schools, curricula, and clinical educators. Future research should include replicating the current study with students enrolled in other professional programs, as well as intervention studies that examine the effectiveness of specific initiatives to promote AI in practice.
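A minimal sketch of the blockwise (hierarchical) regression structure described in the Results is given below, run on simulated data: background and site variables enter in the first block, the TPB predictors in the second, and the change in R² between blocks is reported. Variable names echo the abstract, but the data, coefficients, and coding are synthetic, not the MAINS data.

```python
# Sketch of a hierarchical (blockwise) multiple regression on simulated data.
# Block 1: background and site variables; block 2 adds the TPB predictors.
# Site is coded numerically here purely for brevity.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 339
df = pd.DataFrame({
    "age": rng.normal(24, 4, n),
    "year_of_study": rng.integers(3, 5, n),     # Year 3 or 4
    "site": rng.integers(0, 3, n),              # three schools
    "attitude": rng.normal(0, 1, n),
    "subjective_norm": rng.normal(0, 1, n),
    "pbc": rng.normal(0, 1, n),                 # perceived behavioural control
})
df["intention"] = (0.35 * df["attitude"] + 0.21 * df["subjective_norm"]
                   + 0.12 * df["pbc"] + rng.normal(0, 1, n))

block1 = ["age", "year_of_study", "site"]                  # background + site
block2 = block1 + ["attitude", "subjective_norm", "pbc"]   # + TPB predictors

m1 = sm.OLS(df["intention"], sm.add_constant(df[block1])).fit()
m2 = sm.OLS(df["intention"], sm.add_constant(df[block2])).fit()
print(f"R2 block1 = {m1.rsquared:.3f}   R2 block2 = {m2.rsquared:.3f}   "
      f"delta R2 = {m2.rsquared - m1.rsquared:.3f}")
```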

Relevance:

20.00%

Publisher:

Abstract:

Annexin A1 is a potent anti-inflammatory molecule that has been extensively studied in the peripheral immune system, but has not yet been exploited as a therapeutic target/agent. In the last decade, we have undertaken the study of this molecule in the central nervous system (CNS), focusing particularly on the primary interface between the periphery and the CNS: the blood–brain barrier. In this review, we provide an overview of the role of this molecule in the brain, with a particular emphasis on its functions in the endothelium of the blood–brain barrier, and the protective actions the molecule may exert in neuroinflammatory, neurovascular and metabolic disease. We focus on the possible new therapeutic avenues opened up by an increased understanding of the role of annexin A1 in the CNS vasculature, and its potential for repairing blood–brain barrier damage in disease and aging.

Relevance:

20.00%

Publisher:

Abstract:

The World Bank proposes good governance as the strategy for correcting the ills of bad governance and facilitating development in developing countries (Carayannis, Pirzadeh & Popescu, 2012; Hilyard & Wilks, 1998; Leftwich, 1993; World Bank, 1989). From this perspective, institutional reform and a more inclusive public policy arena are two critical strategies for establishing good governance, according to the Bank and the other Bretton Woods institutions. The problem is that many of these developing countries lack the institutional architecture that such new measures presuppose. This thesis examines and explains how one developing state, the Commonwealth of Dominica, embarked on legislation aimed at integrity in public office. This law, the Integrity in Public Office Act (IPO), was passed in 2003 and implemented in 2008. The thesis analyzes the power relations among the dominant actors surrounding the evolution of the law and therefore employs a combination of social network analysis techniques and qualitative research to answer the main question: why did the state develop and implement the current design of the IPO (2003)? This question is all the more significant when we consider that, contrary to what the existing research on the subject would suggest, Dominica's IPO diverges considerably in structure from the ideal-type IPO. We argue that "rational" actors, aware of their structural position within a network of actors, used their power resources to shape the institution so that it would serve their interests and those of their allies. We further hypothesize, first, that the choice of a specialized anti-corruption agency and the subsequent design of that institution reflect the preferences of the dominant actors who took part in its creation and, second (our rival hypothesis), that the features of alternative models of public integrity institutions reflect those of the non-dominant actors. Our results are mixed. The power game was confined to a small group of dominant actors who sought to use the creation of the law to secure their legitimacy and political survival. Unsurprisingly, no actor put forward an alternative model. We therefore conclude that the law is the outcome of a partisan power game. This research responds to the scarcity of research on the design of public integrity institutions, which has largely exhibited an organizational and structural bias. Moreover, by studying the subject from the standpoint of power relations (power itself being viewed from both agency and structural angles), the thesis brings conceptual, methodological, and analytical rigour to the discourse on the creation of these institutions, examining their genesis from both agency and structural perspectives. In addition, the results strengthen our capacity to predict when, and with what intensity, an actor will deploy its power resources.