39 results for Accumulation rate per year
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The breeding program for beef cattle in Japan has changed dramatically over 4 decades. Visual judging was done initially, but progeny testing in test stations began in 1968. In the 1980s, the genetic evaluation program using field records, so-called on-farm progeny testing, was first adopted in Oita, Hyogo, and Kumamoto prefectures. In this study, genetic trends for carcass traits in these 3 Wagyu populations were estimated, and genetic gains per year were compared among the 3 different beef cattle breeding programs. The field carcass records used were collected between 1988 and 2003. The traits analyzed were carcass weight, LM area, rib thickness, s.c. fat thickness, and beef marbling standard number. The average breeding values of reproducing dams born the same year were used to estimate the genetic trends for the carcass traits. For comparison of the 3 breeding programs, birth years of the dams were divided into 3 periods reflecting each program. Positive genetic trends for beef marbling standard number were clearly shown in all populations. The genetic gains per year for all carcass traits were significantly enhanced by adopting the on-farm progeny testing program. These results indicate that the on-farm progeny testing program with BLUP is a very powerful approach for genetic improvement of carcass traits in Japanese Wagyu beef cattle.
Abstract:
During the school-to-work transition, adolescents develop values and prioritize what is important in their life. Values are concepts or beliefs about desirable states or behaviors that guide the selection or evaluation of behavior and events, and are ordered by their relative importance (Schwartz & Bilsky, 1987). Stressing the important role of values, career research has intensively studied the effect of values on educational decisions and early career development (e.g. Eccles, 2005; Hirschi, 2010; Rimann, Udris, & Weiss, 2000). Few researchers, however, have so far investigated how values develop in the early career phase and how value trajectories are influenced by individual characteristics. Values can be oriented towards specific life domains, such as work or family. Work values include intrinsic and extrinsic aspects of work (e.g., self-development, cooperation with others, income) (George & Jones, 1997). Family values include the importance of partnership, the creation of one's own family and having children (Mayer, Kuramschew, & Trommsdorff, 2009). Research indicates that work values change considerably during early career development (Johnson, 2001; Lindsay & Knox, 1984). Individual differences in work values and value trajectories are found, e.g., in relation to gender (Duffy & Sedlacek, 2007), parental background (Loughlin & Barling, 2001), personality (Lowry et al., 2012), education (Battle, 2003), and the anticipated timing of the school-to-work transition (Porfeli, 2007). In contrast to work values, research on family value trajectories is rare, and knowledge about their development during the school-to-work transition and early career development is lacking. This paper aims at filling this research gap.
Focusing on family values and intrinsic work values, we expect a) family and work values to change between ages 16 and 25, and b) initial levels of family and work values, as well as value change, to be predicted by gender, reading literacy, ambition, and expected duration of education. Method. Using data from 2620 young adults (59.5% females) who participated in the Swiss longitudinal study TREE, latent growth modeling was employed to estimate the initial level and growth rate per year for work and family values. Analyses are based on TREE waves 1 (year 2001, first year after compulsory school) to 8 (year 2010). Variables in the models included family values and intrinsic work values, gender, reading literacy, ambition and expected duration of education. Language region was included as a control variable. Results. Family values did not change significantly over the first four years after leaving compulsory school (mean slope = -.03, p = .36). They increased, however, significantly five years after compulsory school (mean slope = .13, p < .001). Intercept (.23, p < .001), first slope (.02, p < .001), and second slope (.01, p < .001) showed significant variance. Initial levels were higher for men and those with higher ambitions. Increases were found to be steeper for males as well as for participants with lower educational duration expectations and reading skills. Intrinsic work values increased over the first four years (mean slope = .03, p < .05) and showed a tendency to decrease in years five to ten (mean slope = -.01, p < .10). Intercept (.21, p < .001), first slope (.01, p < .001), and second slope (.01, p < .001) showed significant variance, meaning that there are individual differences in initial levels and growth rates. Initial levels were higher for females, those with higher ambitions, those expecting longer educational pathways, and those with lower reading skills.
Growth rates were lower in the first phase and steeper in the second phase for males compared to females. Discussion. In general, results showed different patterns of work and family value trajectories, and different individual factors related to initial levels and development after compulsory school. These developments seem to fit major life and career roles: in the first years after compulsory school, young adults may be engaged in becoming established in a job; later on, raising a family becomes more important. That we found significant gender differences in work and family trajectories may reflect attempts to overcome traditional roles, as overall, women increase in work values and men increase in family values, resulting in an overall trend to converge.
Abstract:
PURPOSE: To test the hypothesis that the extension of areas with increased fundus autofluorescence (FAF) outside atrophic patches correlates with the rate of spread of geographic atrophy (GA) over time in eyes with age-related macular degeneration (AMD). METHODS: The database of the multicenter longitudinal natural history Fundus Autofluorescence in AMD (FAM) Study was reviewed for patients with GA recruited through the end of August 2003, with follow-up examinations within at least 1 year. Only eyes with sufficient image quality and with diffuse patterns of increased FAF surrounding atrophy were chosen. In standardized digital FAF images (excitation, 488 nm; emission, >500 nm), total size and spread of GA were measured. The convex hull (CH) of increased FAF, defined as the minimum polygon encompassing the entire area of increased FAF surrounding the central atrophic patches, was quantified at baseline. Statistical analysis was performed with Spearman's rank correlation coefficient (rho). RESULTS: Thirty-nine eyes of 32 patients were included (median age, 75.0 years; interquartile range [IQR], 67.8-78.9; median follow-up, 1.87 years; IQR, 1.43-3.37). At baseline, the median total size of atrophy was 7.04 mm2 (IQR, 4.20-9.88). The median size of the CH was 21.47 mm2 (IQR, 15.19-28.26). The median rate of GA progression was 1.72 mm2 per year (IQR, 1.10-2.83). The area of increased FAF around the atrophy (difference between the CH and the total GA size at baseline) showed a positive correlation with GA enlargement over time (rho=0.60; P=0.0002). CONCLUSIONS: FAF characteristics that are not identified by fundus photography or fluorescein angiography may serve as a prognostic determinant in advanced atrophic AMD. As the FAF signal originates from lipofuscin (LF) in postmitotic RPE cells, and since increased FAF indicates excessive LF accumulation, these findings would underscore the pathophysiological role of RPE-LF in AMD pathogenesis.
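The statistic used above, Spearman's rank correlation, is simply the Pearson correlation computed on rank vectors. A minimal pure-Python sketch with invented values (not the FAM Study data; all variable names are illustrative):

```python
def ranks(xs):
    # Assign 1-based ranks; tied values share the average of their rank positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    # Spearman's rho = Pearson correlation of the two rank vectors.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical values: FAF halo area (mm^2) vs. GA spread (mm^2 per year).
halo = [5.1, 12.3, 8.7, 20.4, 15.0]
spread = [0.9, 1.8, 1.5, 2.9, 2.1]
print(round(spearman_rho(halo, spread), 2))  # 1.0 for this perfectly monotone toy data
```

In practice one would use a library routine that also returns a p-value, but the rank-then-correlate logic is the whole of the statistic.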
Abstract:
BACKGROUND: Bullous pemphigoid (BP), pemphigus vulgaris (PV) and pemphigus foliaceus (PF) are autoimmune bullous diseases characterized by the presence of tissue-bound and circulating autoantibodies directed against disease-specific target antigens of the skin. Although rare, these diseases run a chronic course and are associated with significant morbidity and mortality. There are few prospective data on gender- and age-specific incidence of these disorders. OBJECTIVES: Our aims were: (i) to evaluate the incidence of BP and PV/PF in Swiss patients, as the primary endpoint; and (ii) to assess the profile of the patients, particularly for comorbidities and medications, as the secondary endpoint. METHODS: The protocol of the study was distributed to all dermatology clinics, immunopathology laboratories and practising dermatologists in Switzerland. All newly diagnosed cases of BP and pemphigus occurring between 1 January 2001 and 31 December 2002 were collected. In total, 168 patients (73 men and 95 women) with these autoimmune bullous diseases, with a diagnosis based on clinical, histological and immunopathological criteria, were finally included. RESULTS: BP showed a mean incidence of 12.1 new cases per million people per year. Its incidence increased significantly after the age of 70 years, with a maximal value after the age of 90 years. The female/male ratio was 1.3. The age-standardized incidence of BP using the European population as reference was, however, lower, with 6.8 new cases per million people per year, reflecting the ageing of the Swiss population. In contrast, both PV and PF were less frequent. Their combined mean incidence was 0.6 new cases per million people per year. CONCLUSIONS: This is the first comprehensive prospective study analysing the incidence of autoimmune bullous diseases in an entire country. Our patient cohort is large enough to establish BP as the most frequent autoimmune bullous disease.
Its incidence rate appears higher than in previous studies, most likely because of the demographic characteristics of the Swiss population. Nevertheless, given its potentially misleading presentations, it is possible that the real incidence rate of BP is still underestimated. Given its significant incidence in the elderly population, BP deserves more public health attention.
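The age-standardized figure quoted above (6.8 vs. a crude 12.1 cases per million per year) follows from direct standardization: age-specific rates are weighted by the age structure of a reference population. A sketch with invented rates and weights (not the actual Swiss or European figures):

```python
def direct_standardized_rate(age_rates, ref_weights):
    # Direct standardization: weighted sum of age-specific rates, where the
    # weights are the reference population's age-group shares (summing to 1).
    assert abs(sum(ref_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(age_rates, ref_weights))

# Hypothetical BP incidence per million per year by age band (<50, 50-69, 70-89, 90+)
rates = [0.5, 5.0, 60.0, 200.0]
# Hypothetical reference-population shares for the same bands
weights = [0.65, 0.22, 0.12, 0.01]
print(direct_standardized_rate(rates, weights))
```

With a reference population younger than the study population, the standardized rate comes out below the crude rate, which is the direction reported in the abstract.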
Abstract:
OBJECTIVES Tenofovir is associated with reduced renal function. It is not clear whether patients can be expected to fully recover their renal function if tenofovir is discontinued. METHODS We calculated the estimated glomerular filtration rate (eGFR) for patients in the Swiss HIV Cohort Study remaining on tenofovir for at least 1 year after starting a first antiretroviral therapy regimen with tenofovir and either efavirenz or a ritonavir-boosted protease inhibitor (lopinavir, atazanavir or darunavir). We estimated the difference in eGFR slope between those who discontinued tenofovir after 1 year and those who remained on tenofovir. RESULTS A total of 1049 patients on tenofovir for at least 1 year were then followed for a median of 26 months, during which time 259 patients (25%) discontinued tenofovir. After 1 year on tenofovir, the difference in eGFR between those starting with efavirenz and those starting with lopinavir, atazanavir and darunavir was -0.7 [95% confidence interval (CI) -2.3 to 0.8], -1.4 (95% CI -3.2 to 0.3) and 0.0 (95% CI -1.7 to 1.7) mL/min/1.73 m², respectively. The estimated linear rate of decline in eGFR on tenofovir was -1.1 (95% CI -1.5 to -0.8) mL/min/1.73 m² per year, and its recovery after discontinuing tenofovir was 2.1 (95% CI 1.3 to 2.9) mL/min/1.73 m² per year. Patients starting tenofovir with either lopinavir or atazanavir appeared to have the same rates of decline and recovery as those starting tenofovir with efavirenz. CONCLUSIONS If patients discontinue tenofovir, clinicians can expect renal function to recover more rapidly than it declined.
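Given the reported slopes (-1.1 on tenofovir, +2.1 after discontinuation, both in mL/min/1.73 m² per year), the time needed to regain the accrued loss can be sketched as simple linear arithmetic; the 3-year exposure below is an invented example, not a cohort value:

```python
DECLINE = -1.1   # reported eGFR slope on tenofovir, mL/min/1.73 m^2 per year
RECOVERY = 2.1   # reported eGFR slope after discontinuation

def years_to_recover(years_on_tenofovir):
    # Loss accrues at the decline rate and is regained at the recovery rate,
    # assuming (as the abstract's linear model does) constant slopes.
    loss = -DECLINE * years_on_tenofovir
    return loss / RECOVERY

# Hypothetical example: 3 years of exposure before discontinuation.
print(round(years_to_recover(3.0), 2))
```

Because the recovery slope is roughly twice the magnitude of the decline slope, recovery under this linear sketch takes about half as long as the exposure, which matches the abstract's conclusion that function recovers faster than it declined.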
Abstract:
Limited data exist on the efficacy of long-term therapies for osteoporosis. In osteoporotic postmenopausal women receiving denosumab for 7 years, nonvertebral fracture rates significantly decreased in years 4-7 versus years 1-3. This is the first demonstration of a further benefit on fracture outcomes with long-term therapy for osteoporosis. INTRODUCTION This study aimed to evaluate whether denosumab treatment continued beyond 3 years is associated with a further reduction in nonvertebral fracture rates. METHODS Participants who completed the 3-year placebo-controlled Fracture REduction Evaluation of Denosumab in Osteoporosis every 6 Months (FREEDOM) study were invited to participate in an open-label extension. The present analysis includes 4,074 postmenopausal women with osteoporosis (n = 2,343 long-term; n = 1,731 cross-over) who enrolled in the extension, missed ≤1 dose during their first 3 years of denosumab treatment, and continued into the fourth year of treatment. Comparison of nonvertebral fracture rates during years 1-3 of denosumab with that of the fourth year and with the rate during years 4-7 was evaluated. RESULTS For the combined group, the nonvertebral fracture rate per 100 participant-years was 2.15 for the first 3 years of denosumab treatment (referent) and 1.36 in the fourth year (rate ratio [RR] = 0.64; 95 % confidence interval (CI) = 0.48 to 0.85, p = 0.003). Comparable findings were observed in the groups separately and when nonvertebral fracture rates during years 1-3 were compared to years 4-7 in the long-term group (RR = 0.79; 95 % CI = 0.62 to 1.00, p = 0.046). Fracture rate reductions in year 4 were most prominent in subjects with persisting low hip bone mineral density (BMD). CONCLUSIONS Denosumab treatment beyond 3 years was associated with a further reduction in nonvertebral fracture rate that persisted through 7 years of continuous denosumab administration. 
The degree to which denosumab further reduces nonvertebral fracture risk appears influenced by the hip bone density achieved with initial therapy.
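The rate-ratio logic above can be approximated from the rounded rates; note that recomputing from 1.36 and 2.15 gives about 0.63, slightly below the published 0.64, which was derived from unrounded event counts. A sketch:

```python
def rate_ratio(rate_exposed, rate_referent):
    # Ratio of two event rates expressed per 100 participant-years.
    return rate_exposed / rate_referent

year4 = 1.36      # nonvertebral fractures per 100 participant-years, year 4
years_1_3 = 2.15  # referent rate, years 1-3 of denosumab

print(round(rate_ratio(year4, years_1_3), 2))
```

A rate ratio below 1 with a confidence interval excluding 1 (here 0.48 to 0.85) is what supports the abstract's claim of a further fracture-rate reduction with continued treatment.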
Abstract:
Removal of miniplates is a controversial topic in oral and maxillofacial surgery. Originally, miniplates were designed to be removed on completion of bone healing. The introduction of low profile titanium miniplates has led to the routine removal of miniplates becoming comparatively rare in many parts of the world. Few studies have investigated the reasons for non-routine removal of miniplates and the factors that affect osteosynthesis after osteotomy in large numbers of patients. The aim of the present study was to investigate complications related to osteosynthesis after bilateral sagittal split osteotomy (BSSO) in a large number (n=153) of patients. In addition to the rates of removal, emphasis was placed on investigating the reasons and risk factors associated with symptomatic miniplate removal. The rate of plate removal per patient was 18.6%, the corresponding rate per plate being 18.2%. Reasons for plate removal included plate-related complications in 16 patients and subjective discomfort in 13 patients. Half of the plates were removed during the first postoperative year. Smoking was the only significant predictor for plate removal. Patients undergoing orthognathic surgery should be screened with regard to smoking and encouraged and assisted to cease smoking, at least perioperatively.
Abstract:
PURPOSE: The mandibular implant overdenture is a popular treatment modality and is well documented in the literature. Follow-up studies with a long observation period are difficult to perform due to the increasing age of patients. The present data summarize a long-term clinical observation of patients with implant overdentures. MATERIALS AND METHODS: Between 1984 and 1997, edentulous patients were consecutively admitted for treatment with an implant overdenture. The dentures were connected to the implants by means of bars or ball anchors. Regular maintenance was provided with at least one or two scheduled visits per year. Recall attendance and reasons for dropout were analyzed based on the specific history of the patient. Denture maintenance service, relining, repair, and fabrication of new dentures were identified, and complications with the retention devices specified separately. RESULTS: In the time period from 1984 to 2008, 147 patients with a total of 314 implants had completed a follow-up period of >10 years. One hundred one patients were still available in 2008, while 46 patients were not reexamined for various reasons. Compliance was high, with a regular recall attendance of >90%. More than 80% of dentures remained in continuous service. Although major prosthetic maintenance was rather low in relation to the long observation period, visits to a dental hygienist and dentist resulted in an annual visit rate of 1.5 and 2.4, respectively. If new dentures became necessary, these were made in student courses, which increased the treatment time and number of appointments needed. Complications with the retention devices consisted mostly of the mounting of new female retainers, the repair of bars, and the changing of ball anchors. The average number of events and the rate of prosthetic service with ball anchors were significantly higher than those with bars. 
Twenty-two patients changed from ball anchors to bars; 9 patients switched from a clip bar to a rigid U-shaped bar. CONCLUSIONS: This long-term follow-up study demonstrates that implant overdentures are a favorable solution for edentulous patients with regular maintenance. In spite of specific circumstances in an aging population, it is possible to provide long-term care, resulting in a good prognosis and low risk for this treatment modality. For various reasons the dropout rate can be considerable in elderly patients and prosthetic service must be provided regularly.
Abstract:
Refinement in microvascular reconstructive techniques over the last 30 years has enabled an increasing number of patients to be rehabilitated for both functional and aesthetic reasons. The purpose of this study was to evaluate different microsurgical practices, including perioperative management, in Germany, Austria, and Switzerland. The DÖSAK collaborative group for Microsurgical Reconstruction developed a detailed questionnaire which was circulated to units in the three countries, and the current practice of the departments was evaluated. Thirty-eight questionnaires were completed, a response rate of 47.5%. A considerable variation in the number of microsurgical reconstructions per year was noted. In relation to the timing of bony reconstruction, 10 hospitals did reconstructions primarily (26.3%), 19 secondarily (50%) and 9 (23.7%) used both concepts. In the postoperative course, 15.8% of hospitals use inhibitors of platelet aggregation; most use low molecular weight heparin (52.6%) or other heparin products (44.7%). This survey shows variation in the performance, management, and care of microsurgical reconstruction patients. This is due in part to the availability of microvascular surgeons in each unit, but also to the different types of hospitals in which these patients, who need special perioperative care, are treated.
Abstract:
Rangelands store about 30% of the world’s carbon and support over 120 million pastoralists globally. Adjusting the management of remote alpine pastures bears a substantial climate change mitigation potential that could provide livelihood support for marginalized pastoralists through carbon payments. Landless pastoralists in Northern Pakistan seek higher income by cropping potatoes and peas on alpine pastures. However, tilling steep slopes without terracing exposes the soil to erosion. Moreover, yields decline rapidly, requiring increasing fertilizer inputs. Under these conditions, carbon payments could be a feasible option to compensate pastoralists for renouncing hazardous cropping while favoring pastoral activities. This study quantifies and compares carbon on cropped and grazed land. The hypothesis was that cropping on alpine pastures reduces former carbon storage. The study area, located in the Naran valley of the Pakistani Himalayas, receives an annual average of 819 mm of rain and 764 mm of snow. Average temperatures remain below 0°C from November to March, while frost may occur all year round. A total of 72 soil core samples were collected, discriminating land use (cropping, pasture), aspect (North, South), elevation (low 3000, middle 3100, and high 3200 m a.s.l.), and soil depth (shallow 0-10, deep 10-30 cm). Thirty-six biomass samples were collected over the same independent variables (except for soil depth) using a 10x10x20 cm steel box inserted in the ground for each sample. Aboveground biomass and coarse roots were separated from the soil aggregate and oven-dried. Soil organic carbon (SOC) and biomass carbon (BC) were estimated through a potassium dichromate oxidation treatment. The samples were collected during the second week of October 2010, at the end of the grazing and cropping season and before the first snowfall. The data were statistically analyzed by means of a one-way analysis of variance.
Results show that all variables taken separately have a significant effect on mean SOC [%]: crop/pasture 1.33/1.6, North/South 1.61/1.32, low/middle/high 1.09/1.62/1.68, shallow/deep 1.4/1.53. However, for BC, only land use has a significant effect with more than twice the amount of carbon in pastures [g m-2]: crop/pasture 127/318. These preliminary findings suggest that preventing the conversion of pastures into cropping fields in the Naran valley avoids an average loss of 12.2 t C ha-1 or 44.8 t CO2eq ha-1 representing a foreseeable compensation of 672 € ha-1 for the Naran landless pastoralists who would renounce cropping. The ongoing study shall provide a complete picture for carbon payment integrating key aspects such as the rate of cropping encroachment over pastures per year, the methane leakage from the system due to livestock enteric fermentation, the expected cropping income vs. livestock income and the transaction costs of implementing the mitigation project, certifying it, and verifying carbon credits. A net present value over an infinite time horizon for the mitigation scenario shall be estimated on an iterative simulation to consider weather and price uncertainties. The study will also provide an estimate of the minimum price of carbon at which pastoralists would consider engaging in the mitigation activity.
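The conversion behind these figures is plain arithmetic: tonnes of carbon scale to CO2-equivalent by the molar-mass ratio 44/12, and dividing the stated 672 € ha-1 by 44.8 t CO2eq ha-1 implies a carbon price of about 15 € per t CO2eq (an inference from the abstract's numbers, not a price stated by the study). A sketch:

```python
C_TO_CO2 = 44.0 / 12.0  # molar-mass ratio of CO2 to C

avoided_c = 12.2                      # t C per ha retained by keeping pasture (as reported)
avoided_co2eq = avoided_c * C_TO_CO2  # t CO2eq per ha; close to the reported 44.8
print(round(avoided_co2eq, 1))

compensation = 672.0                  # EUR per ha, as reported
implied_price = compensation / 44.8   # EUR per t CO2eq, inferred from the two figures
print(round(implied_price))
```

The small gap between the recomputed 44.7 and the reported 44.8 t CO2eq ha-1 presumably reflects rounding of the underlying SOC difference before publication.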
Abstract:
Background Low back pain (LBP) is one of the major concerns in health care. In Switzerland, musculoskeletal problems represent the third largest illness group with 9.4 million consultations per year. The return to work rate is increased by an active treatment program and saves societal costs. However, results after rehabilitation are generally poorer in patients with a Southeast European cultural background than in other patients. This qualitative research about the rehabilitation of patients with LBP and a Southeast European cultural background, therefore, explores possible barriers to successful rehabilitation. Methods We used a triangulation of methods combining three qualitative methods of data collection: 13 semi-structured in-depth interviews with patients who have a Southeast European cultural background and live in Switzerland, five semi-structured in-depth interviews and two focus groups with health professionals, and a literature review. Between June and December 2008, we recruited participants at a Rehabilitation Centre in the German-speaking part of Switzerland. Results To cope with pain, patients prefer passive strategies, which are not in line with recommended coping strategies. Moreover, the families of patients tend to support passive behaviour and reduce the autonomy of patients. Health professionals and researchers propagate active strategies including activity in the presence of pain, yet patients do not consider psychological factors contributing to LBP. The views of physicians and health professionals are in line with research evidence demonstrating the importance of psychosocial factors for LBP. Treatment goals focusing on increasing daily activities and return to work are not well understood by patients partly due to communication problems, which is something that patients and health professionals are aware of. Additional barriers to returning to work are caused by poor job satisfaction and other work-related factors. 
Conclusions LBP rehabilitation can be improved by addressing the following points. Early management of LBP should be activity-centred instead of pain-centred. It is mandatory to implement return to work management early, including return to adapted work, to improve rehabilitation for patients. Rehabilitation has to start when patients have been off work for three months. Using interpreters more frequently would improve communication between health professionals and patients, and reduce misunderstandings about treatment procedures. Special emphasis must be put on the process of goal-formulation by spending more time with patients in order to identify barriers to goal attainment. Information on the return to work process should also include the financial aspects of unemployment and disability.
Abstract:
BACKGROUND: Stent thrombosis is a safety concern associated with use of drug-eluting stents. Little is known about occurrence of stent thrombosis more than 1 year after implantation of such stents. METHODS: Between April 2002 and December 2005, 8146 patients underwent percutaneous coronary intervention with sirolimus-eluting stents (SES; n=3823) or paclitaxel-eluting stents (PES; n=4323) at two academic hospitals. We assessed data from this group to ascertain the incidence, time course, and correlates of stent thrombosis, and the differences between early (0-30 days) and late (>30 days) stent thrombosis and between SES and PES. FINDINGS: Angiographically documented stent thrombosis occurred in 152 patients (incidence density 1.3 per 100 person-years; cumulative incidence at 3 years 2.9%). Early stent thrombosis was noted in 91 (60%) patients, and late stent thrombosis in 61 (40%) patients. Late stent thrombosis occurred steadily at a constant rate of 0.6% per year up to 3 years after stent implantation. Incidence of early stent thrombosis was similar for SES (1.1%) and PES (1.3%), but late stent thrombosis was more frequent with PES (1.8%) than with SES (1.4%; p=0.031). At the time of stent thrombosis, dual antiplatelet therapy was being taken by 87% (early) and 23% (late) of patients (p<0.0001). Independent predictors of overall stent thrombosis were acute coronary syndrome at presentation (hazard ratio 2.28, 95% CI 1.29-4.03) and diabetes (2.03, 1.07-3.83). INTERPRETATION: Late stent thrombosis was encountered steadily with no evidence of diminution up to 3 years of follow-up. Early and late stent thrombosis were observed with SES and with PES. Acute coronary syndrome at presentation and diabetes were independent predictors of stent thrombosis.
Abstract:
AIMS: To determine whether the current practice of sweat testing in Swiss hospitals is consistent with current international guidelines. METHODS: A questionnaire was mailed to all children's hospitals (n = 8), regional paediatric sections of general hospitals (n = 28), and all adult pulmonology centres (n = 8) in Switzerland which care for patients with cystic fibrosis (CF). The results were compared with the published "guidelines 2000" of the American National Committee for Clinical Laboratory Standards (NCCLS) and the UK guidelines of 2003. RESULTS: The response rate was 89%. All 8 children's hospitals and 18 of the 23 answering paediatric sections performed sweat tests, but none of the adult pulmonology centres did. In total, 1560 sweat tests per year were done (range: 5-200 tests/centre/year, median 40). 88% (23/26) used Wescor systems, 73% (19/26) the Macroduct system for collecting sweat, and 31% (8/26) the Nanoduct system. Sweat chloride was determined by only 62% (16/26) of all centres; of these, only 63% (10/16) indicated that they used the recommended diagnostic chloride CF reference value of >60 mmol/l. Osmolality was measured in 35%, sodium in 42% and conductivity in 62% of the hospitals. Sweat was collected for 30-120 (median 55) minutes; only three centres used the maximum 30-minute sampling time recommended by the international guidelines. CONCLUSIONS: Sweat testing practice in Swiss hospitals was inconsistent and seldom followed the current international guidelines for sweat collection, analysis methods and reference values. Only 62% used the chloride concentration as a diagnostic reference, the only diagnostic measurement accepted by the NCCLS or UK guidelines.
Abstract:
BACKGROUND: CD4+ T-cell recovery in patients with continuous suppression of plasma HIV-1 viral load (VL) is highly variable. This study aimed to identify predictive factors for long-term CD4+ T-cell increase in treatment-naive patients starting combination antiretroviral therapy (cART). METHODS: Treatment-naive patients in the Swiss HIV Cohort Study reaching two VL measurements <50 copies/ml >3 months apart during the 1st year of cART were included (n=1816 patients). We studied CD4+ T-cell dynamics until the end of suppression or up to 5 years, subdivided into three periods: 1st year, years 2-3 and years 4-5 of suppression. Multiple median regression adjusted for repeated CD4+ T-cell measurements was used to study the dependence of CD4+ T-cell slopes on clinical covariates and drug classes. RESULTS: Median CD4+ T-cell increases following VL suppression were 87, 52 and 19 cells/microl per year in the three periods. In the multiple regression model, median CD4+ T-cell increases over all three periods were significantly higher for female gender, lower age, higher VL at cART start, CD4+ T-cell <650 cells/microl at start of the period and low CD4+ T-cell increase in the previous period. Patients on tenofovir showed significantly lower CD4+ T-cell increases compared with stavudine. CONCLUSIONS: In our observational study, long-term CD4+ T-cell increase in drug-naive patients with suppressed VL was higher in regimens without tenofovir. The clinical relevance of these findings must be confirmed in, ideally, clinical trials or large, collaborative cohort projects but could influence treatment of older patients and those starting cART at low CD4+ T-cell levels.