942 results for strict liability
Abstract:
A small proportion of harmful algae produce toxins that are harmful to human health. Strict monitoring programmes are in place within Ireland and the EU to manage the risk to human consumers of shellfish species that have accumulated marine biotoxins in their tissues. However, little is known about the impacts of HABs on shellfish health. This study used Solid Phase Adsorption Toxin Tracking (SPATT) for the passive sampling of algal biotoxins at Lough Hyne Marine Nature Reserve in West Cork, Ireland. The spatial and temporal incidence of a wide range of lipophilic toxins was monitored over a 4-month period. Active sampling accumulated sufficient quantities of toxin for use in subsequent experimentation. In addition to commonly occurring Diarrhetic Shellfish Poisoning (DSP) toxins, Dinophysistoxin-1 and Pinnatoxin-G were both detected in the samples; this is the first identification of these two toxins in Irish waters. The effects of the DSP toxin okadaic acid (OA) were investigated in three shellfish species: Mytilus edulis, Ruditapes philippinarum and Crassostrea gigas. Histological examination of the gill, mantle and hepatopancreas tissues revealed varying intensity of damage depending on both the tissue type and the species involved. At the cellular level, the differential cell population distribution was assessed by flow cytometry. No change in cell population distribution was observed in Mytilus edulis or Ruditapes philippinarum; however, significant changes were observed in Crassostrea gigas granulocytes at the lower levels of toxin exposure, indicating a chemically induced response to OA. DNA fragmentation was measured in haemolymph and hepatopancreas cells of Mytilus edulis and Crassostrea gigas following OA exposure. A significant increase in DNA fragmentation was observed in both species over time, even at the lowest OA concentrations. This fragmentation could be due to genotoxicity of OA and/or to the induction of cell apoptosis.
Abstract:
This work considers the static calculation of a program's average-case time. The number of systems that currently tackle this research problem is quite small, owing to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research. That system is MOQA, which consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labelling distribution. This research develops and evaluates the MOQA language implementation, and adds to the functions already available in the language. Furthermore, the theory behind MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the applications and extensions of MOQA suggested in other works are also examined here; for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic when evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
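To make the notion of a statically determined average-case time concrete, the following minimal sketch (plain Python, not the MOQA language or tool) computes an exact average-case comparison count for a simple sorting routine by enumerating every labelling of the input under a uniform labelling distribution — the quantity a system such as MOQA aims to derive without running the program on all inputs.

```python
# Illustrative sketch (not MOQA itself): exact average-case comparison count
# for insertion sort, obtained by enumerating every labelling (permutation)
# of n distinct labels, assuming a uniform labelling distribution.
from itertools import permutations

def insertion_sort_comparisons(seq):
    """Return the number of key comparisons insertion sort performs on seq."""
    data = list(seq)
    comparisons = 0
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against data[j]
            if data[j] > key:
                data[j + 1] = data[j]
                j -= 1
            else:
                break
        data[j + 1] = key
    return comparisons

def exact_average_case(n):
    """Average comparison count over all n! labellings of n distinct labels."""
    perms = list(permutations(range(n)))
    return sum(insertion_sort_comparisons(p) for p in perms) / len(perms)

if __name__ == "__main__":
    for n in range(2, 7):
        print(n, exact_average_case(n))
```

The enumeration grows factorially with input size, which is precisely why a compositional, static approach to average-case analysis is attractive.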
Abstract:
Aim: To develop and evaluate the psychometric properties of an instrument for the measurement of self-neglect (SN). Conceptual Framework: An elder self-neglect (ESN) conceptual framework guided the literature review and scale development. The framework has two key dimensions (physical/psycho-social and environmental) and seven sub-dimensions, which represent the factors that can contribute to intentional and unintentional SN. Methods: A descriptive cross-sectional design was adopted to achieve the research aim. The study was conducted in two phases. Phase 1 involved the development of the questionnaire content and structure. Phase 2 focused on establishing the psychometric properties of the instrument. Content validity was established by a panel of 8 experts, and the instrument was piloted with 9 health and social care professionals. It was subsequently posted, with a stamped addressed envelope, to 566 health and social care professionals who met specific eligibility criteria across the four HSE areas. A total of 341 questionnaires were returned, a response rate of 60%, and 305 (50%) completed responses were included in the exploratory factor analysis (EFA). Item and factor analyses were performed to elicit the instrument's underlying factor structure and establish preliminary construct validity. Findings: Item and factor analyses resulted in a logically coherent, 37-item, five-factor solution explaining 55.6% of the cumulative variance. The factors were labelled ‘Environment’, ‘Social Networks’, ‘Emotional and Behavioural Liability’, ‘Health Avoidance’ and ‘Self-Determinism’. The factor loadings were >0.40 for all items on each of the five subscales. The findings supported preliminary construct validity. Conclusion: The main outcome of this research is a 37-item Self-Neglect (SN-37) measurement instrument, developed through EFA and underpinned by an ESN conceptual framework. Preliminary psychometric evaluation of the instrument is promising. Future work should be directed at establishing the construct and criterion-related validity of the instrument.
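The item-retention criterion described above (rotated loadings above 0.40 on a five-factor solution) can be illustrated with a minimal exploratory factor analysis sketch. The scikit-learn call, the synthetic responses, and the variable names below are assumptions made purely for illustration, not the study's data or code.

```python
# A minimal EFA sketch on synthetic questionnaire data. The item count,
# factor count, and 0.40 loading threshold mirror the abstract; everything
# else (data, seeds, names) is hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 305, 37, 5

# Synthetic responses: latent factors plus item-level noise (placeholder data).
latent = rng.normal(size=(n_respondents, n_factors))
loading_matrix = rng.normal(scale=0.8, size=(n_factors, n_items))
responses = latent @ loading_matrix + rng.normal(scale=0.5, size=(n_respondents, n_items))

efa = FactorAnalysis(n_components=n_factors, rotation="varimax")
efa.fit(responses)

# An item is retained on a factor when its rotated loading exceeds 0.40,
# matching the abstract's criterion.
loadings = efa.components_.T              # shape: (n_items, n_factors)
for item, row in enumerate(loadings, start=1):
    kept = [f + 1 for f, value in enumerate(row) if abs(value) > 0.40]
    print(f"item {item:2d} loads on factor(s): {kept}")
```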
Abstract:
There is an increasing appreciation of the polymicrobial nature of bacterial infections associated with Cystic Fibrosis (CF) and of the important role that interactions play in influencing bacterial virulence and response to therapy. Patients with CF are commonly co-infected with Pseudomonas aeruginosa, Burkholderia cenocepacia and Stenotrophomonas maltophilia. The latter two bacteria produce signal molecules of the diffusible signal factor (DSF) family, which are cis-2-unsaturated fatty acids. Previous studies showed that DSF from S. maltophilia leads to altered biofilm formation and increased tolerance to antibiotics in P. aeruginosa, and that these responses require the P. aeruginosa sensor kinase PA1396. The work in this thesis aims to further elucidate the influence and mechanism of DSF signalling on P. aeruginosa and to examine the role that such interspecies signalling plays in infection of the CF airway. Next-generation sequencing technologies targeting the 16S ribosomal RNA gene were applied to DNA and RNA isolated from sputum taken from cohorts of CF and non-CF subjects to characterise the bacterial community. In parallel, metabolomic analysis of sputum provided insight into the environment of the CF airway. This analysis yielded a number of observations, including that metabolite profiles differ between sputum taken from clinically stable CF patients and sputum taken from patients with exacerbation, and that DNA- and RNA-based methods suggested a strong relationship between the abundance of specific strict anaerobes and fluctuations in metabolite levels during exacerbation. DSF family signals were also detected in the sputum, and a correlation with the presence of DSF-producing organisms was observed. To examine the signal transduction mechanisms used by P. aeruginosa, bioinformatics and site-directed mutagenesis were employed to identify signalling partners for PA1396. A pathway was identified that suggests a role for a number of proteins in the regulation of several factors following DSF recognition by PA1396.
Abstract:
Through the recognition of potentially harmful stimuli, Toll-like receptors (TLRs) initiate the innate immune response and induce the expression of hundreds of immune and pro-inflammatory genes. TLRs are critical in mounting a defence against invading pathogens; however, strict control of TLR signalling is vital to prevent host damage from excessive or prolonged immune activation. In this thesis the role of the IκB protein Bcl-3 (B-cell lymphoma 3) in the regulation of TLR signalling is investigated. Bcl3-/- mice and cells are hyper-responsive to TLR stimulation and are defective in LPS tolerance. Bcl-3 interacts with and blocks the ubiquitination of homodimers of the NF-κB subunit p50. Through stabilisation of inhibitory p50 homodimers, Bcl-3 negatively regulates NF-κB-dependent inflammatory gene transcription following TLR activation. Firstly, we investigated the nature of the interaction between Bcl-3 and p50 using peptide array technology, and identified key amino acids required for the formation of the p50:Bcl-3 immunosuppressor complex. Furthermore, we demonstrate for the first time that the interaction between Bcl-3 and p50 is necessary and sufficient for the anti-inflammatory properties of Bcl-3. Using the data generated from the peptide array analysis, we then generated cell-permeable peptides designed to mimic Bcl-3 function and stabilise p50 homodimers. These Bcl-3-derived peptides are potent inhibitors of NF-κB-dependent transcriptional activity in vitro and provide a solid basis for the development of novel gene-specific approaches to the treatment of inflammatory diseases. Secondly, we demonstrate that Bcl-3-mediated regulation of TLR signalling is not limited to NF-κB, and identify the MAP3K Tumour Progression Locus (Tpl)-2 as a new binding partner of Bcl-3. Our data establish a role for Bcl-3 as a negative regulator of the MAPK-ERK pathway.
Abstract:
OBJECTIVE: Strict lifelong compliance with a gluten-free diet (GFD) minimizes the long-term risk of mortality, especially from lymphoma, in adult celiac disease (CD). Although serum IgA antitransglutaminase antibodies (IgA-tTG-ab), like antiendomysium antibodies (IgA-EMA), are sensitive and specific screening tests for untreated CD, their reliability as predictors of strict compliance with, and dietary transgressions from, a GFD is not precisely known. We aimed to address this question in consecutively treated adult celiacs. METHODS: In a cross-sectional study, 95 non-IgA-deficient adult (median age: 41 yr) celiacs on a GFD for at least 1 yr (median: 6 yr) were subjected to 1) a dietician-administered inquiry to pinpoint and quantify the number and levels of transgressions (classified as moderate or large, using as a cutoff value the median gluten amount ingested by the overall noncompliant patients of the series) over the previous 2 months, 2) a search for IgA-tTG-ab and -EMA, and 3) perendoscopic duodenal biopsies. The ability of both antibodies to discriminate celiacs with and without detected transgressions was described using receiver operating characteristic curves and quantified in terms of sensitivity and specificity, according to the level of transgressions. RESULTS: Forty (42%) patients strictly adhered to the GFD; 55 (58%) had committed transgressions, classified as moderate (≤18 g of gluten/2 months; median number 6) in 27 and large (>18 g; median number 69) in 28. IgA-tTG-ab and -EMA specificity (proportion of correct recognition of strictly compliant celiacs) was 0.97 and 0.98, respectively, and sensitivity (proportion of correct recognition of overall, moderate, and large levels of transgressions) was 0.52, 0.31, and 0.77, and 0.62, 0.37, and 0.86, respectively. IgA-tTG-ab and -EMA titers were correlated (p < 0.001) with transgression levels (r = 0.560 and r = 0.631, respectively) and with one another (p < 0.001) in the whole patient population (r = 0.834, N = 84) as well as in the noncompliant group (r = 0.915, N = 48). The specificity and sensitivity of IgA-tTG-ab and IgA-EMA for recognition of total villous atrophy in patients on a GFD were 0.90 and 0.91, and 0.60 and 0.73, respectively. CONCLUSIONS: In adult CD patients on a GFD, IgA-tTG-ab are poor predictors of dietary transgressions. Their negativity is a falsely reassuring marker of strict dietary compliance.
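As a point of reference for how the specificity and sensitivity figures above are defined, here is a minimal sketch using hypothetical counts (the per-patient data are not reproduced in this abstract): specificity is the proportion of strictly compliant patients who test antibody-negative, and sensitivity is the proportion of patients with detected transgressions who test antibody-positive.

```python
# Minimal sketch of the specificity/sensitivity definitions used above.
# The counts below are hypothetical, chosen only to land near the abstract's
# reported values (specificity ~0.97, overall sensitivity ~0.52).
def specificity(true_negative, false_positive):
    """Fraction of strictly compliant patients correctly recognised (antibody-negative)."""
    return true_negative / (true_negative + false_positive)

def sensitivity(true_positive, false_negative):
    """Fraction of patients with detected transgressions correctly recognised (antibody-positive)."""
    return true_positive / (true_positive + false_negative)

# Hypothetical example: 40 strictly compliant patients, 1 of whom tests positive,
# and 55 noncompliant patients, 29 of whom test positive.
print(round(specificity(true_negative=39, false_positive=1), 2))   # ~0.97
print(round(sensitivity(true_positive=29, false_negative=26), 2))  # ~0.53
```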
Abstract:
Currently, the sole strategy for managing food hypersensitivity is strict avoidance of the trigger. Several alternative strategies for the treatment of food allergies are currently under study, as is the elimination of allergenic proteins from crop plants. Legumes are a rich source of protein and are an essential component of the human diet. Unfortunately, legumes, including soybean and peanut, are also common sources of food allergens. Four protein families and superfamilies account for the majority of legume allergens: the seed storage proteins (cupins and prolamins), the profilins, and the larger group of pathogenesis-related proteins. Two strategies have been used to produce hypoallergenic legume crops: (1) screening germplasm lines for the absence or reduced content of specific allergenic proteins and (2) genetic transformation to silence native genes encoding allergenic proteins. Both approaches have been successful in producing cultivars of soybean and peanut with reduced levels of allergenic proteins. However, it is unknown whether these cultivars are actually hypoallergenic for sensitized individuals. This review describes efforts to produce hypoallergenic cultivars of soybean and peanut and discusses the challenges that must be overcome before such products could be available in the marketplace.
Abstract:
Chemoprevention agents represent an emerging scientific area that holds out the promise of delaying or avoiding a number of common cancers. These new agents face significant scientific, regulatory, and economic barriers, however, which have limited investment in their research and development (R&D). These barriers include above-average clinical trial scales, lengthy time frames between discovery and Food and Drug Administration approval, liability risks (because the agents are given to healthy individuals), and a growing funding gap for early-stage candidates. The longer time frames and risks associated with chemoprevention also cause exclusivity time on core patents to be limited or subject to significant uncertainty. We conclude that chemoprevention uniquely challenges the structure of incentives embodied in the economic, regulatory, and patent policies for the biopharmaceutical industry. Many of these policy issues are illustrated by the preventive agents Gardasil and raloxifene, both recently approved by the Food and Drug Administration. Our recommendations to increase R&D investment in chemoprevention agents include (a) increased data exclusivity times on new biological and chemical drugs to compensate for longer gestation periods and increasing R&D costs, with chemoprevention at the far end of the distribution in this regard; (b) policies such as early-stage research grants and clinical development tax credits targeted specifically to chemoprevention agents, policies that have been very successful in increasing R&D investment for orphan drugs; and (c) a no-fault liability insurance program like that currently in place for children's vaccines.
Abstract:
Metals support surface plasmons at optical wavelengths and have the ability to localize light to subwavelength regions. The field enhancements that occur in these regions set the ultimate limitations on a wide range of nonlinear and quantum optical phenomena. We found that the dominant limiting factor is not the resistive loss of the metal, but rather the intrinsic nonlocality of its dielectric response. A semiclassical model of the electronic response of a metal places strict bounds on the ultimate field enhancement. To demonstrate the accuracy of this model, we studied optical scattering from gold nanoparticles spaced a few angstroms from a gold film. The bounds derived from the models and experiments impose limitations on all nanophotonic systems.
Abstract:
To maintain a strict balance between demand and supply in the US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines the startup and shutdown times, the amount of power produced, and the provision of spinning and non-spinning generation reserves. Such a deterministic optimization model takes as input the characteristics of all generating units, such as installed generation capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the level of forecast errors in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than taking fixed reserve targets as an input, stochastic market clearing models consider different wind power scenarios and determine the reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, together with wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability and sustainability of the electricity system, measured in terms of total operational costs, load shedding and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and the difficulty of setting the models' parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL) and wind spillage costs have on the comparison of the performance of stochastic versus deterministic market clearing models.
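The contrast between a fixed-reserve deterministic schedule and a scenario-based stochastic schedule can be illustrated with a deliberately small, single-period sketch. All of the numbers below (demand, costs, VOLL, wind scenarios, reserve requirement) are hypothetical assumptions for the sketch; this is not the PJM/BPA model used in the paper.

```python
# Toy single-period comparison of deterministic vs scenario-based clearing.
DEMAND = 100.0          # MW
GEN_COST = 30.0         # $/MWh, dispatchable unit
VOLL = 1000.0           # $/MWh, value of lost load
SPILL_COST = 5.0        # $/MWh, wind spillage penalty

WIND_SCENARIOS = [(0.3, 10.0), (0.4, 30.0), (0.3, 50.0)]  # (probability, MW)

def realized_cost(gen, wind):
    """Cost once wind is known: generation cost plus shortfall and spillage penalties."""
    shortfall = max(0.0, DEMAND - gen - wind)
    spill = max(0.0, gen + wind - DEMAND)
    return GEN_COST * gen + VOLL * shortfall + SPILL_COST * spill

def expected_cost(gen):
    return sum(p * realized_cost(gen, w) for p, w in WIND_SCENARIOS)

# Deterministic clearing: schedule against the expected wind plus a fixed reserve.
wind_forecast = sum(p * w for p, w in WIND_SCENARIOS)
reserve_requirement = 15.0
gen_deterministic = DEMAND - wind_forecast + reserve_requirement

# Stochastic clearing: pick the schedule minimising expected cost. With these
# numbers the optimum of the piecewise-linear cost lies at a scenario breakpoint.
candidates = [DEMAND - w for _, w in WIND_SCENARIOS]
gen_stochastic = min(candidates, key=expected_cost)

print(f"deterministic schedule {gen_deterministic:5.1f} MW, "
      f"expected cost {expected_cost(gen_deterministic):8.1f} $")
print(f"stochastic schedule   {gen_stochastic:5.1f} MW, "
      f"expected cost {expected_cost(gen_stochastic):8.1f} $")
```

Even in this toy setting, the deterministic fixed-reserve rule can schedule the unit away from the cost-minimising point, which mirrors the paper's observation that the comparison hinges on how reserve requirements, VOLL and spillage costs are set.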
Abstract:
The word 'impromptu' began to appear in music literature in the early 19th century, specifically as the title of a relatively short composition written for solo piano. The first impromptus appear to have been so named by their publishers. However, composers themselves soon embraced the title to indicate, for the most part, fairly short character pieces. Impromptus do not follow any specific structural pattern, although many are cast in ternary form. The formal design ranges from strict compound ternary in the early impromptus to through-composed and variation forms. The peak of the impromptu's popularity undoubtedly came during the middle and late 19th century. However, impromptus are still being composed today, albeit much less frequently. Although there have been many variants of the impromptu in formal design and harmonic language over the years, the essence of the genre remains the same: it is still a short character piece with a general feeling of spontaneity. Overall, impromptus may be categorized into several different groups: some appear as part of a larger cycle, such as Dvořák's G minor Impromptu from his Piano Pieces, B. 110; many others use an element of an additional genre that enhances the character of the impromptu, such as Liszt's Valse-Impromptu and Antonio Bibalo's Tango Impromptu; yet another group consists of works based on opera themes, such as Liszt's Impromptu Brillant sur des themes de Rossini et Spontini and Czerny's Impromptus et variations sur Oberon, Op. 134. My recording project includes well-known impromptus, such as Schubert's Op. 142 and the four by Chopin, as well as lesser-known works that have not been performed or recorded often. Four impromptus have been recorded here for the first time, including those written by Leopold Godowsky, Antonio Bibalo, Altin Volaj, and Nikolay Mazhara. I personally asked the last two of these composers to contribute impromptus to this project. My selection represents works by twenty composers and reflects the different types of impromptus encountered through almost two hundred years of the genre's existence, from approximately 1817 (Voříšek) to 2008 (Volaj and Mazhara).
Abstract:
Variation, or the re-working of existing musical material, has consistently attracted the attention of composers and performers throughout the history of Western music. In three recorded recitals at the University of Maryland School of Music, this dissertation project explores a diverse range of expressive possibilities for the violin in seven types of variation form, in Austro-German works for violin from the 17th through the 20th centuries. The first program, consisting of Baroque period works performed on a period instrument, includes the divisions on “John come kiss me now” from The Division Violin by Thomas Baltzar (1631 – 1663), constant-bass variation in Sonate Unarum Fidium by Johann Heinrich von Schmelzer (1623 – 1680), arbitrary variation in the Sonata for Violin and Continuo in E Major, Op. 1, No. 12 “Roger” by Georg Friedrich Händel (1685 – 1759), and French Double style, melodic-outline variation in the Partita for Unaccompanied Violin in B Minor by Johann Sebastian Bach (1685 – 1750). Theme and variations, a popular Classical period format, is represented by the Sonata for Piano and Violin in G Major, K. 379, by Wolfgang Amadeus Mozart (1756 – 1791) and the Sonata for Violin and Piano No. 9 in A Major, Op. 47, the “Kreutzer”, by Ludwig van Beethoven (1770 – 1827). The Fantasy for Piano and Violin in C Major, D. 934, by Franz Schubert (1797 – 1828) represents the 19th-century fantasia variation; in this piece, the piano and violin parts are densely interwoven and of equal importance. Many 20th-century composers incorporated diverse types of variation in their works; the third recital program comprises serial variation in the Phantasy for Violin and Piano, Op. 47, of Arnold Schoenberg (1874 – 1951); a strict form of melodic-outline variation in the Sonate für Violine allein, Op. 31, No. 2, of Paul Hindemith (1895 – 1963); and ostinato variation in Johan Halvorsen's (1864 – 1935) Passacaglia for Violin and Viola, after G. F. Handel's Passacaglia from the Harpsichord Suite No. 7 in G Minor. Pianist Audrey Andrist, harpsichordist Sooyoung Jung, and violist Dong-Wook Kim assisted in these performances.
Abstract:
The Dietary Approaches to Stop Hypertension (DASH) trial showed that a diet rich in fruits, vegetables, and low-fat dairy products, with reduced total and saturated fat, cholesterol, and sugar-sweetened products, effectively lowers blood pressure in individuals with prehypertension and stage I hypertension. Limited evidence is available on the safety and efficacy of the DASH eating pattern in special patient populations that were excluded from the trial. Caution should be exercised before initiating the DASH diet in patients with chronic kidney disease or chronic liver disease and in those prescribed renin-angiotensin-aldosterone system antagonists, but these conditions are not strict contraindications to DASH. Modifications to the DASH diet may be necessary to facilitate its use in patients with chronic heart failure, uncontrolled type 2 diabetes mellitus, lactose intolerance, and celiac disease. In general, the DASH diet can be adopted by most patient populations and initiated simultaneously with medication therapy and other lifestyle interventions.
Abstract:
In Belgium, gender-parity has been compulsory for all party lists (in local, regional, federal and European elections) for several years. As a result, the proportion of women has risen from a fourth up to a third of the deputies. Yet, strict parity is still far from realised. This article seeks to establish what causes this glass ceiling, namely the parties' reluctance to place female candidates in the top positions or even as the front-runner. In a proportional representation system with half-open lists, and especially when the constituencies are small, this automatically leads to a smaller proportion of women among the elected deputies. One important reason for the parties' reluctance to rank female candidates higher is their assumption that women are less effective as "election locomotives" than men. However, the analysis of the Belgian election results makes clear that this is not the case. Female candidates in top positions are as successful as their male counterparts. © (2008) Swiss Political Science Review.
Abstract:
Thin-layer and high-performance thin-layer chromatography (TLC/HPTLC) methods for assaying compound(s) in a sample must be validated to ensure that they are fit for their intended purpose and, where applicable, meet the strict regulatory requirements for controlled products. Two validation approaches are identified in the literature: the classic approach and an alternative approach based on accuracy profiles. Detailed procedures for the two approaches are discussed, based on the validation of methods for pharmaceutical analysis, an area considered to have particularly strict requirements. Estimation of the measurement uncertainty from the validation approach using accuracy profiles is also described. Examples of HPTLC methods, developed and validated to assay sulfamethoxazole and trimethoprim on the one hand, and lamivudine, stavudine, and nevirapine on the other, in their fixed-dose combination tablets, are further elaborated.
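As a simplified illustration of the accuracy-profile approach mentioned above, the sketch below computes a single point of an accuracy profile from replicate recoveries at one concentration level. It assumes approximately normal recoveries and a single series (the full procedure pools within- and between-series variance via ANOVA), and all numbers and acceptance limits are hypothetical.

```python
# Simplified sketch of one accuracy-profile point: mean recovery and a
# beta-expectation tolerance interval at a single concentration level,
# computed from hypothetical replicate measurements of a single series.
import math
from statistics import mean, stdev
from scipy.stats import t

def accuracy_profile_point(recoveries_percent, beta=0.95):
    """Return (mean recovery, lower limit, upper limit) for one concentration level."""
    n = len(recoveries_percent)
    m = mean(recoveries_percent)
    s = stdev(recoveries_percent)
    # Beta-expectation tolerance interval (prediction-type interval) for one
    # future measurement: m +/- t * s * sqrt(1 + 1/n).
    k = t.ppf((1 + beta) / 2, df=n - 1) * math.sqrt(1 + 1 / n)
    return m, m - k * s, m + k * s

# Hypothetical replicate recoveries (%) at one concentration level.
replicates = [98.2, 101.5, 99.7, 100.8, 97.9, 100.3]
mean_rec, low, high = accuracy_profile_point(replicates)
print(f"mean recovery {mean_rec:.1f}%, tolerance interval [{low:.1f}%, {high:.1f}%]")
# The method would be judged acceptable at this level if the interval stays
# within preset acceptance limits, e.g. 100% +/- 5%.
```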