852 results for Hepatitis A
Abstract:
Introduction: Paramedics and other emergency health workers are exposed to infectious disease, particularly when undertaking exposure-prone procedures as a component of their everyday practice. This study examined paramedic knowledge of infectious disease aetiology and transmission in the pre-hospital care environment. Methods: A mail survey of paramedics from an Australian ambulance service (n=2274) was conducted. Results: With a response rate of 55.3% (1258/2274), the study demonstrated that paramedic knowledge of infectious disease aetiology and modes of transmission was poor. Of the 25 infectious diseases included in the survey, only three aetiological agents were correctly identified by at least 80% of respondents. The most accurate responses for aetiology of individual infectious diseases were for HIV/AIDS (91.4%), influenza (87.4%) and hepatitis B (85.7%). The poorest results were observed for pertussis, infectious mononucleosis, leprosy, dengue fever, Japanese B encephalitis and vancomycin-resistant enterococcus (VRE), all with less than half the sample providing a correct response. Modes of transmission of significant infectious diseases were also assessed. The most accurate responses were found for HIV/AIDS (85.8%), salmonella (81.9%) and influenza (80.1%). The poorest results were observed for infectious mononucleosis, diphtheria, shigella, Japanese B encephalitis, vancomycin-resistant enterococcus, meningococcal meningitis and rubella, with less than a third of the sample providing a correct response. Conclusions: These results suggest that knowledge of aetiology and transmission of infectious disease is generally poor amongst paramedics. A comprehensive in-service infection control education program for paramedics, with emphasis on infectious disease aetiology and transmission, is recommended.
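To make the reported proportions concrete, here is a minimal Python sketch reproducing the response-rate arithmetic; the per-disease counts are back-calculated from the quoted percentages and are therefore approximate illustrations, not the study's raw data.

```python
# Minimal sketch reproducing the survey proportions quoted in the abstract.
# Only the response-rate figures (1258/2274) come from the abstract; the
# per-disease correct counts below are back-calculated approximations.

respondents, surveyed = 1258, 2274
print(f"Response rate: {respondents / surveyed:.1%}")  # -> 55.3%

# Approximate correct-answer counts for the aetiology items (n = respondents).
correct_counts = {"HIV/AIDS": 1150, "influenza": 1100, "hepatitis B": 1078}
for disease, correct in correct_counts.items():
    print(f"{disease}: {correct / respondents:.1%} correct")
```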
Abstract:
Floods are the most common type of disaster globally, responsible for almost 53,000 deaths in the last decade alone (a 23:1 ratio of deaths in low- versus high-income countries). This review assessed recent epidemiological evidence on the impacts of floods on human health. Published articles (2004–2011) on the quantitative relationship between floods and health were systematically reviewed, and 35 relevant epidemiological studies were identified. Health outcomes were categorized into short- and long-term and were found to depend on the flood characteristics and people's vulnerability. Long-term health effects are currently not well understood. Mortality rates were found to increase by up to 50% in the first year post-flood. After floods, there is an increased risk of disease outbreaks such as hepatitis E, gastrointestinal disease and leptospirosis, particularly in areas with poor hygiene and displaced populations. Psychological distress in survivors (prevalence 8.6% to 53% two years post-flood) can also exacerbate their physical illness. There is a need for effective policies to reduce and prevent flood-related morbidity and mortality. Such steps are contingent upon an improved understanding of the potential health impacts of floods. Global trends in urbanization, burden of disease, malnutrition and maternal and child health must be better reflected in flood preparedness and mitigation programs.
Abstract:
Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now considered the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state-of-the-art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. Generally, the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi-chamber over single-chamber septic tanks are an issue that needs to be resolved in view of the conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for treatment of wastewater and the disinfection of effluent prior to disposal is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards and quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. The other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems. These include intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters, and particularly the use of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies. This is an issue that needs further investigation, and as such biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common option, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that the soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferable soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent. It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. However, despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern lies. Greywater requires similar consideration. Surface irrigation of greywater is currently being permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is an issue of concern, as greywater can be considered a weak-to-medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subjected to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration. The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions. As such their applicability is location specific. Also, the design of systems based solely on evapotranspiration is questionable. In order to ensure more reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone (see the formula sketched below). The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process taking place and occurs due to bacterial growth or its by-products reducing the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process. This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short-duration rest periods are contradictory.
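To make the hydraulic point concrete: once a mat forms, steady infiltration is commonly approximated with Darcy's law applied across the mat. The formulation below is a standard textbook simplification (it neglects suction in the unsaturated soil beneath the mat), not an equation taken from this report.

```latex
% Steady infiltration rate q through a clogging mat (Darcy's law), where
% K_m = saturated hydraulic conductivity of the mat,
% L_m = mat thickness, H = depth of effluent ponded above the mat.
% Suction in the unsaturated soil beneath the mat is neglected here.
\[
  q = K_m \,\frac{H + L_m}{L_m}
\]
```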
It has been claimed that the intermittent rest periods would result in the aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short-duration rest periods are insufficient to completely decompose the clogging mat, and the intermediate by-products that form as a result of aerobic processes would in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer, in the range of about six months. This entails the provision of a second and alternating seepage bed. The other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that would eventuate after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be of only short-term benefit. Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. The nature of the suspended solids has also been found to be an important factor: the finer particles from extended aeration systems, compared with those from septic tanks, will penetrate deeper into the soil and hence ultimately cause a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils, but this may not be the case with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by the surface ponding of effluent or the seepage of contaminants into the groundwater, can be very serious, as they can lead to environmental and public health impacts. Significant microbial contamination of surface and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic tank-related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A, as no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore, based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area.
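The dilution argument above can be expressed as a simple mass balance. The sketch below uses hypothetical flows and concentrations (none taken from the report) to show how a guideline value translates into a minimum lot area, and hence a maximum allowable septic-tank density.

```python
# Hedged sketch: mass-balance dilution of septic-tank nitrate by groundwater
# recharge, as implied by the report's density argument.
# All numbers are hypothetical illustrations, not values from the report.

effluent_flow = 1.0      # m3/day of septic effluent per household
effluent_no3 = 40.0      # mg/L nitrate-N in septic effluent
recharge_rate = 0.0005   # m/day of rainfall recharge reaching groundwater
lot_area = 2000.0        # m2 per household lot
background_no3 = 1.0     # mg/L nitrate-N in ambient recharge
limit_no3 = 10.0         # mg/L drinking-water guideline for nitrate-N

recharge_flow = recharge_rate * lot_area  # m3/day of diluting recharge
mixed = (effluent_flow * effluent_no3 + recharge_flow * background_no3) / (
    effluent_flow + recharge_flow
)
print(f"Mixed nitrate-N: {mixed:.1f} mg/L (limit {limit_no3} mg/L)")
# If mixed > limit, the lot area (i.e. the allowable septic-tank density)
# must change until the mass balance meets the guideline.
```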
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed, or due to surface runoff of effluent as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is treated as a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site, which would in turn dictate the type of effluent disposal method to be adopted.
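As a toy illustration of the 'most limiting factor' rule just described, the following sketch assigns hypothetical limitation ratings to site factors and takes the worst one as the overall class; the factor names and rating scale are illustrative assumptions, not the report's actual scheme.

```python
# Hedged sketch of the 'most limiting factor' land-capability rule.
# Factor names and the 1 (slight) to 5 (severe) scale are hypothetical.
site_limitations = {
    "slope": 2,
    "depth_to_water_table": 4,
    "soil_permeability": 3,
}
# The worst-rated factor governs the site's overall capability class.
overall_class = max(site_limitations.values())
worst_factor = max(site_limitations, key=site_limitations.get)
print(f"Overall capability class: {overall_class} (governed by {worst_factor})")
```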
Abstract:
Background: About one third of refugee and humanitarian entrants to Australia are women aged 12–44 years. Pregnant women from refugee backgrounds may have been exposed to a range of medical and psychosocial issues that can impact maternal, fetal and neonatal health. Research question: What are the key elements that characterise a best practice model of maternity care for women from refugee backgrounds? This paper outlines the findings of a project that aimed to develop such a model at a major maternity hospital in Brisbane, Australia. Participants and methods: This multifaceted project included a literature review, consultations with key stakeholders, a chart audit of hospital use by African-born women in 2006 that included their obstetric outcomes, a survey of 23 African-born women who gave birth at the hospital in 2007–08, and a survey of 168 hospital staff members. Results: The maternity chart audit identified complex medical and social histories among the women, including anaemia, female circumcision, hepatitis B, thrombocytopenia, and barriers to accessing antenatal care. The rates of caesarean sections and obstetric complications increased over time. The surveys of women and hospital staff indicated the need for adequate interpreting services, education programs for women regarding antenatal and postnatal care, and professional development for health care staff to enhance cultural responsiveness. Discussion and conclusions: The findings point towards the need for a model of refugee maternity care that comprises continuity of carer, quality interpreter services, educational strategies for both women and healthcare professionals, and the provision of psychosocial support to women from refugee backgrounds.
Abstract:
As new diseases and medical conditions emerge, new community groups appear in the public health arena as consumer advocates or lobby groups seeking to affect policy or to represent ‘communities’ formed around these new diseases and conditions. The role of these groups in public health, their political status, and the extent to which they are actually representative are highly problematic for public health. These new constellations of social groups and activities challenge traditional ideas about public health decision-making and demand a rethinking of the constituency and limits of public health. Using discourse theory, symbolic interactionism, and ethological theory, the authors examine one case, exploring the perspectives of various communities on hepatitis C, and discuss some of the issues this raises for public health.
Abstract:
Immunogenicity and reactogenicity of DTPa and reduced-antigen dTpa booster vaccines were compared with a hepatitis A control vaccine in DTPa-primed toddlers aged 18–20 months. Post-booster, all DTPa and dTpa recipients were seroprotected against diphtheria and tetanus, and ≥93.3% had a booster response to pertussis. Reactogenicity rates were similar in the DTPa and dTpa vaccine recipients, and few Grade 3 symptoms were reported. Just over one in four children in the control group had diphtheria antibody at or potentially below the correlate-of-protection benchmark (0.016 IU/ml). Larger studies should evaluate the potential benefits of reduced-antigen vaccines and seroprotection in children who do not receive a booster dose of DTPa at this age, including protection against diphtheria until subsequent booster doses are given. © 2009 Elsevier Ltd. All rights reserved.
Abstract:
Importance of the field: Reactive oxygen species (ROS) occur as natural by-products of oxygen metabolism and have important cellular functions. Normally, the cell is able to maintain an adequate balance between the formation and removal of ROS, either via anti-oxidants or through the use of specific enzymatic pathways. However, if this balance is disturbed, oxidative stress may occur in the cell, a situation linked to the pathogenesis of many diseases, including cancer. Areas covered in this review: Histone deacetylases (HDACs) are important regulators of many oxidative stress pathways, including those involved with both sensing and coordinating the cellular response to oxidative stress. In particular, aberrant regulation of these pathways by HDACs may play critical roles in cancer progression. What the reader will gain: In this review we discuss the notion that targeting HDACs may be a useful therapeutic avenue in the treatment of oxidative stress in cancer, using chronic obstructive pulmonary disease (COPD), non-small cell lung cancer (NSCLC) and hepatocellular carcinoma (HCC) as examples to illustrate this possibility. Take home message: Epigenetic mechanisms may be an important new therapeutic avenue for targeting oxidative stress in cancer. © 2010 Informa UK, Ltd.
Abstract:
Recent studies have demonstrated that angiogenesis and suppressed cell-mediated immunity (CMI) play a central role in the pathogenesis of malignant disease, facilitating tumour growth, invasion and metastasis. In the majority of tumours, the malignant process is preceded by a pathological condition or exposure to an irritant which itself is associated with the induction of angiogenesis and/or suppressed CMI. These include: cigarette smoking, chronic bronchitis and lung cancer; chronic oesophagitis and oesophageal cancer; chronic viral infections such as human papilloma virus and ano-genital cancers, chronic hepatitis B and C and hepatocellular carcinoma, and Epstein-Barr virus (EBV) and lymphomas; chronic inflammatory conditions such as Crohn's disease and ulcerative colitis and colorectal cancer; asbestos exposure and mesothelioma; and excessive sunlight exposure/sunburn and malignant melanoma. Chronic exposure to growth factors (insulin-like growth factor-I in acromegaly), mutations in tumour suppressor genes (TP53 in Li Fraumeni syndrome) and long-term exposure to immunosuppressive agents (cyclosporin A) may also give rise to similar environments and are associated with the development of a range of solid tumours. The increased blood supply would facilitate the development and proliferation of an abnormal clone or clones of cells arising as the result of: (a) an inherited genetic abnormality; and/or (b) acquired somatic mutations, the latter due to local production and/or enhanced delivery of carcinogens and mutagenic growth factors. With progressive detrimental mutations and growth-induced tumour hypoxia, the transformed cell, to a lesser or greater extent, may amplify the angiogenic process and CMI suppression, thereby facilitating further tumour growth and metastasis. There is accumulating evidence that long-term treatment with cyclo-oxygenase inhibitors (aspirin and indomethacin), cytokines such as interferon-α, anti-oestrogens (tamoxifen and raloxifene) and captopril significantly reduces the incidence of solid tumours such as breast and colorectal cancer. These agents are anti-angiogenic and, in the case of aspirin, indomethacin and interferon-α, have proven immunomodulatory effects. Collectively these observations indicate that angiogenesis and suppressed CMI play a central role in the development and progression of malignant disease. © 2000 Elsevier Science Ltd.
Abstract:
Background: Historically, the paper hand-held record (PHR) has been used for sharing information between hospital clinicians, general practitioners and pregnant women in a maternity shared-care environment. Recently, in alignment with a national e-health agenda, an electronic health record (EHR) was introduced at an Australian tertiary maternity service to replace the PHR for collection and transfer of data. The aim of this study was to examine and compare the completeness of clinical data collected in a PHR and an EHR. Methods: We undertook a comparative cohort design study to determine differences in completeness between data collected from maternity records in two phases. Phase 1 data were collected from the PHR and Phase 2 data from the EHR. Records were compared for completeness of the best practice variables collected. The primary outcome was the presence of best practice variables and the secondary outcomes were the differences in individual variables between the records. Results: Ninety-four percent of paper medical charts were available in Phase 1 and 100% of records from an obstetric database in Phase 2. No PHR or EHR had a complete dataset of best practice variables. The variables with significant improvement in completeness of data documented in the EHR, compared with the PHR, were urine culture, glucose tolerance test, nuchal screening, morphology scans, folic acid advice, tobacco smoking, illicit drug assessment and domestic violence assessment (p = 0.001). Additionally, the documentation of immunisations (pertussis, hepatitis B, varicella, fluvax) was markedly improved in the EHR (p = 0.001). The variables of blood pressure, proteinuria, blood group, antibody, rubella and syphilis status showed no significant differences in completeness of recording. Conclusion: This is the first paper to report on the comparison of clinical data collected on a PHR and an EHR in a maternity shared-care setting. The use of an EHR demonstrated significant improvements in the collection of best practice variables. Additionally, the data in an EHR were more available to relevant clinical staff with the appropriate log-in and more easily retrieved than from the PHR. This study contributes to an under-researched area: determining the quality of data collected in patient records.
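The completeness comparisons reported above (e.g., p = 0.001) are the kind of result a chi-square test on documented-versus-missing counts would produce. The sketch below is a hypothetical illustration only, since the paper's actual counts and choice of test are not given here.

```python
# Hedged sketch: comparing documentation completeness of one best-practice
# variable between the PHR and EHR cohorts with a chi-square test.
# The counts below are hypothetical; the study's raw counts are not quoted.
from scipy.stats import chi2_contingency

#            documented, missing
phr = [52, 48]   # paper hand-held record cohort
ehr = [85, 15]   # electronic health record cohort

chi2, p, dof, _ = chi2_contingency([phr, ehr])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```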
Abstract:
Aims: The aim of the study was to evaluate the significance of total bilirubin, aspartate transaminase (AST), alanine transaminase and gamma-glutamyltransferase (GGT) for predicting outcome in sepsis-associated cholestasis. Methods: A retrospective cohort review of the hospital records was performed for 181 neonates admitted to the Neonatal Care Unit. A comparison was performed between subjects with low and high liver values based on cut-off values from ROC analysis. We defined poor prognosis as a subject having prolonged cholestasis of more than 3.5 months, developing severe sepsis or septic shock, or having a fatal outcome. Results: The majority of the subjects were male (56%), preterm (56%) and had early onset sepsis (73%). The poor prognosis group had lower initial values of GGT compared with the good prognosis group (P = 0.003). Serum GGT (cut-off value of 85.5 U/L) and AST (cut-off value of 51 U/L) showed significant correlation with the outcome following multivariate analysis. The odds ratios of low GGT and high AST for poor prognosis were 4.3 (95% CI: 1.6 to 11.8) and 2.9 (95% CI: 1.1 to 8), respectively. In subjects with normal AST values, those with a low GGT value had a relative risk of 2.52 (95% CI: 1.4 to 3.5) for poorer prognosis compared with those with normal or high GGT. Conclusion: Serum GGT and AST values can be used to predict the prognosis of patients with sepsis-associated cholestasis.
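For readers unfamiliar with how such odds ratios and confidence intervals are derived, here is a minimal sketch using the standard 2×2-table (Woolf) formula; the cell counts are hypothetical, since the study's contingency tables are not reproduced here.

```python
# Hedged sketch: odds ratio and 95% CI from a 2x2 table (Woolf method).
# Cell counts are hypothetical illustrations, not the study's data.
import math

a, b = 30, 20   # poor prognosis: low GGT / normal-or-high GGT
c, d = 25, 70   # good prognosis: low GGT / normal-or-high GGT

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.1f} (95% CI: {lo:.1f} to {hi:.1f})")
```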
Abstract:
Though difficult, the study of gene-environment interactions in multifactorial diseases is crucial for interpreting the relevance of non-heritable factors, and prevents genetic associations with small but measurable effects from being overlooked. We propose a "candidate interactome" (i.e. a group of genes whose products are known to physically interact with environmental factors that may be relevant for disease pathogenesis) analysis of genome-wide association data in multiple sclerosis. We looked for statistical enrichment of associations among interactomes that, at the current state of knowledge, may be representative of gene-environment interactions of potential, uncertain or unlikely relevance for multiple sclerosis pathogenesis: Epstein-Barr virus, human immunodeficiency virus, hepatitis B virus, hepatitis C virus, cytomegalovirus, HHV8-Kaposi sarcoma, H1N1-influenza, JC virus, human innate immunity interactome for type I interferon, autoimmune regulator, vitamin D receptor, aryl hydrocarbon receptor, and a panel of proteins targeted by 70 innate immune-modulating viral open reading frames from 30 viral species. Interactomes were either obtained from the literature or were manually curated. The P values of all single nucleotide polymorphisms mapping to a given interactome were obtained from the latest genome-wide association study of the International Multiple Sclerosis Genetics Consortium and the Wellcome Trust Case Control Consortium 2. The interaction between genotype and Epstein-Barr virus emerges as relevant for multiple sclerosis etiology. However, in line with recent data on the coexistence of common and unique strategies used by viruses to perturb the human molecular system, other viruses have a similar potential, though probably less relevant in epidemiological terms. © 2013 Mechelli et al.
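The "statistical enrichment" step can be pictured as testing whether the GWAS p-values of SNPs mapping to an interactome are collectively smaller than chance. The sketch below uses a simple permutation test, which is one common approach, though not necessarily the consortium's exact method; all data here are simulated placeholders.

```python
# Hedged sketch: permutation test for enrichment of small GWAS p-values
# within a candidate interactome's SNPs. Illustrative approach only; the
# study's actual enrichment statistic may differ.
import numpy as np

rng = np.random.default_rng(0)
gwas_p = rng.uniform(size=100_000)  # stand-in for genome-wide SNP p-values
interactome_idx = rng.choice(100_000, 500, replace=False)  # interactome SNPs

# Observed enrichment score: mean -log10(p) of the interactome's SNPs.
stat = (-np.log10(gwas_p[interactome_idx])).mean()

# Null distribution: the same score for random SNP sets of equal size.
null = np.array([
    (-np.log10(gwas_p[rng.choice(100_000, 500, replace=False)])).mean()
    for _ in range(10_000)
])
p_enrich = (null >= stat).mean()
print(f"enrichment p = {p_enrich:.4f}")
```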
Abstract:
Purpose: To evaluate the efficacy and safety of adalimumab in patients with non-radiographic axial spondyloarthritis (nr-axSpA). Methods: Patients fulfilled Assessment of Spondyloarthritis international Society (ASAS) criteria for axial spondyloarthritis, had a Bath Ankylosing Spondylitis Disease Activity Index (BASDAI) score of ≥ 4, total back pain score of ≥ 4 (10 cm visual analogue scale) and inadequate response, intolerance or contraindication to non-steroidal anti-inflammatory drugs (NSAIDs); patients fulfilling modified New York criteria for ankylosing spondylitis were excluded. Patients were randomised to adalimumab (N=91) or placebo (N=94). The primary endpoint was the percentage of patients achieving ASAS40 at week 12. Efficacy assessments included BASDAI and Ankylosing Spondylitis Disease Activity Score (ASDAS). MRI was performed at baseline and week 12 and scored using the Spondyloarthritis Research Consortium of Canada (SPARCC) index. Results: Significantly more patients in the adalimumab group achieved ASAS40 at week 12 compared with patients in the placebo group (36% vs 15%, p<0.001). Significant clinical improvements based on other ASAS responses, ASDAS and BASDAI were also detected at week 12 with adalimumab treatment, as were improvements in quality of life measures. Inflammation in the spine and sacroiliac joints on MRI significantly decreased after 12 weeks of adalimumab treatment. Shorter disease duration, younger age, elevated baseline C-reactive protein or higher SPARCC MRI sacroiliac joint scores were associated with better week 12 responses to adalimumab. The safety profile was consistent with what is known for adalimumab in ankylosing spondylitis and other diseases. Conclusions: In patients with nr-axSpA, adalimumab treatment resulted in effective control of disease activity, decreased inflammation and improved quality of life compared with placebo. Results from ABILITY-1 suggest that adalimumab has a positive benefit-risk profile in active nr-axSpA patients with inadequate response to NSAIDs.
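The primary-endpoint comparison (ASAS40 in 36% vs 15%, p<0.001) can be checked with a standard two-proportion z-test. In the sketch below, responder counts are back-calculated from the reported percentages (36% of 91, 15% of 94) and are therefore approximate.

```python
# Hedged sketch: two-proportion z-test for the primary endpoint (ASAS40 at
# week 12). Responder counts are back-calculated approximations.
from math import sqrt
from scipy.stats import norm

x1, n1 = 33, 91   # adalimumab responders (~36%)
x2, n2 = 14, 94   # placebo responders (~15%)

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)                      # pooled proportion
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))                       # two-sided p-value
print(f"z = {z:.2f}, two-sided p = {p_value:.5f}")  # consistent with p<0.001
```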
Abstract:
There have been recent improvements in the clinical understanding and definition of the major types of autoimmune liver disease. However, still lacking is knowledge of their prevalence and pathogenesis. Three areas of study are in progress in our laboratory. First, in type 1 autoimmune hepatitis, the search continues to identify a liver/disease-specific autoantigenic reactant. Using hepatocyte membrane preparations, immunoblotting has underlined the problem of distinguishing, among multiple reactants, those that may be causally rather than consequentially related to hepatocellular damage. Second, in primary biliary cirrhosis (PBC), the need for population screening to ascertain prevalence and detect preclinical cases can be met by a rapid automated procedure for detection, by specific enzyme inhibition in microtitre wells, of antibody (anti-M2) to the pyruvate dehydrogenase complex E2 subunit (PDC-E2). Third, the structure of the conformational epitope within the inner lipoyl domain of PDC-E2 is being investigated by screening random phage-displayed peptide libraries using PBC sera. This has yielded phage clones in which the sequence of the peptide insert portrays the structure of this epitope, as judged by clustering of PBC-derived sequences to particular branches of a guide-tree that shows relatedness of peptides, and by reactivity of selected phage clones with anti-PDC-E2. Thus phage display identifies a peptide 'mimotope' of the antibody epitope in the inner lipoyl domain of PDC-E2.
Abstract:
Objective: People with chronic liver disease, particularly those with decompensated cirrhosis, experience several potentially debilitating complications that can have a significant impact on activities of daily living and quality of life. These impairments, combined with the associated complex treatment, mean that they are faced with specific and high levels of supportive care needs. We aimed to review reported perspectives, experiences and concerns of people with chronic liver disease worldwide. This information is necessary to guide development of policies around supportive needs screening tools and to enable prioritisation of support services for these patients. Design: Systematic searches of PubMed, MEDLINE, CINAHL and PsycINFO from the earliest records until 19 September 2014. Data were extracted using standardised forms. A qualitative, descriptive approach was utilised to analyse and synthesise data. Results: The initial search yielded 2598 reports: 26 studies reporting supportive care needs among patients with chronic liver disease were included, but few of them captured patient-reported needs, none used a validated liver disease-specific supportive care needs assessment instrument, and only three included patients with cirrhosis. Five key domains of supportive care needs were identified: informational or educational (eg, educational material, educational sessions), practical (eg, daily living), physical (eg, controlling pruritus and fatigue), patient care and support (eg, support groups), and psychological (eg, anxiety, sadness). Conclusions: While several key domains of supportive care needs were identified, most studies included hepatitis patients. There is a paucity of literature describing the supportive care needs of the chronic liver disease population likely to have the most needs, namely those with cirrhosis. Assessing the supportive care needs of people with chronic liver disease has potential utility in clinical practice for facilitating timely referrals to support services.