840 results for first year teachers
Abstract:
Background The effectiveness of durable polymer drug-eluting stents comes at the expense of delayed arterial healing and subsequent late adverse events such as stent thrombosis (ST). We report the 4 year follow-up of an assessment of biodegradable polymer-based drug-eluting stents, which aim to improve safety by avoiding the persistent inflammatory stimulus of durable polymers. Methods We did a multicentre, assessor-masked, non-inferiority trial. Between Nov 27, 2006, and May 18, 2007, patients aged 18 years or older with coronary artery disease were randomly allocated with a computer-generated sequence to receive either biodegradable polymer biolimus-eluting stents (BES) or durable polymer sirolimus-eluting stents (SES; 1:1 ratio). The primary endpoint was a composite of cardiac death, myocardial infarction, or clinically-indicated target vessel revascularisation (TVR); patients were followed-up for 4 years. Analysis was by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00389220. Findings 1707 patients with 2472 lesions were randomly allocated to receive either biodegradable polymer BES (857 patients, 1257 lesions) or durable polymer SES (850 patients, 1215 lesions). At 4 years, biodegradable polymer BES were non-inferior to durable polymer SES for the primary endpoint: 160 (18·7%) patients versus 192 (22·6%) patients (rate ratios [RR] 0·81, 95% CI 0·66–1·00, p for non-inferiority <0·0001, p for superiority=0·050). The RR of definite ST was 0·62 (0·35–1·08, p=0·09), which was largely attributable to a lower risk of very late definite ST between years 1 and 4 in the BES group than in the SES group (RR 0·20, 95% CI 0·06–0·67, p=0·004). Conversely, the RR of definite ST during the first year was 0·99 (0·51–1·95; p=0·98) and the test for interaction between RR of definite ST and time was positive (pinteraction=0·017). We recorded an interaction with time for events associated with ST but not for other events. For primary endpoint events associated with ST, the RR was 0·86 (0·41–1·80) during the first year and 0·17 (0·04–0·78) during subsequent years (pinteraction=0·049). Interpretation Biodegradable polymer BES are non-inferior to durable polymer SES and, by reducing the risk of cardiac events associated with very late ST, might improve long-term clinical outcomes for up to 4 years compared with durable polymer SES. Funding Biosensors Europe SA, Switzerland.
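For readers unfamiliar with the headline statistic, the sketch below shows how an incidence rate ratio and a Wald-type 95% CI are commonly computed. This is only a minimal illustration: the event counts are those quoted above, but the person-year denominators are hypothetical, and the trial's actual analysis used time-to-event methods as described in the abstract.

```python
import math

def rate_ratio_ci(events_a, person_years_a, events_b, person_years_b, z=1.96):
    """Incidence rate ratio (group A vs group B) with a Wald-type 95% CI on the log scale."""
    rr = (events_a / person_years_a) / (events_b / person_years_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Event counts as reported above (160 vs 192); the person-year totals are hypothetical.
print(rate_ratio_ci(160, 3200.0, 192, 3150.0))
```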
Abstract:
This study sought to investigate quantitative and homogeneity differential echogenicity changes of the ABSORB scaffold (1.1) during the first year after implantation.
Abstract:
PURPOSE: In the present cohort study, overdentures with a combined root and implant support were evaluated and compared with either exclusively root- or implant-supported overdentures. Results of a 2-year follow-up period are reported, namely survival of implants, root copings, and prostheses, plus prosthetic complications, maintenance service, and patient satisfaction. MATERIALS AND METHODS: Fourteen patients were selected for the combined overdenture therapy and were compared with 2 patient groups in which either roots or implants provided overdenture support. Altogether, 14, 17, and 15 patients (in groups 1, 2, and 3, respectively) were matched with regard to age, sex, treatment time, and observation period. The mean age was around 67 years. Periodontal parameters were recorded, radiographs were taken, and all complications and failures were registered during the entire observation time. The patients answered a 9-item questionnaire by means of a visual analogue scale (VAS). RESULTS: One implant failed and 1 tooth root was removed following longitudinal root fracture. Periodontal/peri-implant parameters gave evidence of good oral hygiene for roots and implants, and slight crestal bone resorption was measured for both. Technical complications and service performed were significantly higher in the first year (P < .04) in all 3 groups and significantly higher in the tooth root group (P < .03). The results of the VAS indicated significantly lower scores for satisfaction, speaking ability, wearing comfort, and denture stability with combined or exclusive root support (P < .05 and .02, respectively). Initial costs of overdentures with combined or root support were 10% lower than for implant overdentures. CONCLUSION: The concept of combined root and implant support can be integrated into treatment planning and overdenture design for patients with a highly reduced dentition.
Abstract:
BACKGROUND: Highly active antiretroviral therapy (HAART) for the treatment of HIV infection was introduced a decade ago. We aimed to examine trends in the characteristics of patients starting HAART in Europe and North America, and their treatment response and short-term prognosis. METHODS: We analysed data from 22,217 treatment-naive HIV-1-infected adults who had started HAART and were followed up in one of 12 cohort studies. The probability of reaching 500 or less HIV-1 RNA copies per mL by 6 months, and the change in CD4 cell counts, were analysed for patients starting HAART in 1995-96, 1997, 1998, 1999, 2000, 2001, and 2002-03. The primary endpoints were the hazard ratios for AIDS and for death from all causes in the first year of HAART, which were estimated using Cox regression. RESULTS: The proportion of heterosexually infected patients increased from 20% in 1995-96 to 47% in 2002-03, and the proportion of women from 16% to 32%. The median CD4 cell count when starting HAART increased from 170 cells per µL in 1995-96 to 269 cells per µL in 1998 but then decreased to around 200 cells per µL. In 1995-96, 58% achieved HIV-1 RNA of 500 copies per mL or less by 6 months compared with 83% in 2002-03. Compared with 1998, adjusted hazard ratios for AIDS were 1.07 (95% CI 0.84-1.36) in 1995-96 and 1.35 (1.06-1.71) in 2002-03. Corresponding figures for death were 0.87 (0.56-1.36) and 0.96 (0.61-1.51). INTERPRETATION: Virological response after starting HAART improved over calendar years, but such improvement has not translated into a decrease in mortality.
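As a hedged illustration of the kind of Cox regression used for the hazard ratio estimates above, the following sketch fits a proportional hazards model with the third-party lifelines package; the toy data frame and its column names are invented for the example and are not the cohort data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical toy data: follow-up time in years, an AIDS event indicator,
# and a dummy for starting HAART in 2002-03 versus the 1998 reference period.
df = pd.DataFrame({
    "time":        [0.9, 1.0, 0.4, 1.0, 0.7, 1.0, 0.2, 1.0],
    "aids_event":  [1,   0,   1,   0,   1,   0,   1,   0],
    "era_2002_03": [1,   1,   1,   1,   0,   0,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="aids_event")
cph.print_summary()  # the exp(coef) column is the hazard ratio for era_2002_03
```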
Abstract:
OBJECTIVE: To assess the memory of various subdimensions of the birth experience in the second year postpartum, and to identify women in the first weeks postpartum at risk of developing a long-term negative memory. DESIGN, METHOD, OUTCOME MEASURES: New mothers' birth experience (BE) was assessed 48-96 hours postpartum (T1) by means of the SIL-Ger and the BBCI (perception of intranatal relationships); early postnatal adjustment (week 3 pp: T1(bis)) was also assessed. Then, four subgroups of women were defined by means of a cluster-analysis, integrating the T1/T1(bis) variables. To evaluate the memory of the BE, the SIL-Ger was again applied in the second year after childbirth (T2). First, the ratings of the SIL-Ger dimensions of T1 were compared to those at T2 in the whole sample. Then, the four subgroups were compared with respect to their ratings of the birth experience at T2 (correlations, ANOVAs and t-tests). RESULTS: In general, fulfillment, emotional adaptation, physical discomfort, and anxiety improve spontaneously over the first year postpartum, whereas in negative emotional experience, control, and time-going-slowly no shift over time is observed. However, women with a negative overall birth experience and a low level of perceived intranatal relationship at T1 run a high risk of retaining a negative memory in all of the seven subdimensions of the birth experience. CONCLUSIONS: Women at risk of developing a negative long-term memory of the BE can be identified at the time of early postpartum, when the overall birth experience and the perceived intranatal relationship are taken into account.
Abstract:
The new Swiss implant system SPI became available three years ago and is used in combination with fixed and removable prosthetic reconstructions. In a pilot study, the clinical procedures were evaluated and data on prosthetic complications and maintenance service were collected. 25 patients participated in the study, with a total of 79 SPI implants placed during the period 2003-2004; 37 implants were located in the maxilla and 42 in the mandible. Two implants failed during the healing period, but no loaded implant was lost, giving a survival rate of 97.5% (77/79). 44 implants supported a fixed prosthesis, including nine single crowns, and 33 implants were used in combination with a removable partial denture; four of these were used with ball anchor retention and 29 with bar support. The ELEMENT implant with its low implant shoulder allows very good esthetics. Prosthetic complications and maintenance service during the first year of function were comparable with those of other implant systems. Since the design of the abutment screws, healing caps, and screwdriver was changed, the system has become easier to use.
Abstract:
Stem cells of various tissues are typically defined as multipotent cells with 'self-renewal' properties. Despite the increasing interest in stem cells, surprisingly little is known about the number of times stem cells can or do divide over a lifetime. Based on telomere-length measurements of hematopoietic cells, we previously proposed that the self-renewal capacity of hematopoietic stem cells is limited by progressive telomere attrition and that such cells divide very rapidly during the first year of life. Recent studies of patients with aplastic anemia resulting from inherited mutations in telomerase genes support the notion that the replicative potential of hematopoietic stem cells is directly related to telomere length, which is indirectly related to telomerase levels. To revisit conclusions about stem cell turnover based on cross-sectional studies of telomere length, we performed a longitudinal study of telomere length in leukocytes from newborn baboons. All four individual animals studied showed a rapid decline in telomere length (approximately 2-3 kb) in granulocytes and lymphocytes in the first year after birth. After 50-70 weeks the telomere length appeared to stabilize in all cell types. These observations suggest that hematopoietic stem cells, after an initial phase of rapid expansion, switch at around 1 year of age to a different functional mode characterized by a markedly decreased turnover rate.
Abstract:
OBJECT: In this study, the authors prospectively evaluated long-term psychosocial and neurocognitive performance in patients suffering from nonaneurysmal, nontraumatic subarachnoid hemorrhage (SAH) and investigated the association between the APOE-epsilon4 genotype and outcome in these patients. METHODS: All patients admitted to the authors' institution between January 2001 and January 2003 with spontaneous nonaneurysmal SAH were prospectively examined (mean follow-up 59.8 months). The APOE genotype was determined in all patients by polymerase chain reaction from a blood sample. Of the 30 patients included in this study, 11 were carriers of the epsilon4 allele. RESULTS: All patients showed a good recovery and regained full independence with no persisting neurological deficits. The patients with the epsilon4 allele, however, scored significantly higher on the Beck Depression Inventory (22.1 +/- 6.3 vs 14.1 +/- 5.1). At follow-up, depression was more persistent in the group with the epsilon4 allele compared with the group that lacked the allele. This finding reached statistical significance (p < 0.05). Selective attention was impaired in all patients during the first year of follow-up, with an earlier recovery noted in the patients without the epsilon4 allele. Moreover, there was a tendency toward a linear relationship between the Beck Depression Inventory and the d2 Test of Attention. Two patients who carried the epsilon4 allele did not return to their employment even after 5 years. CONCLUSIONS: The findings in this study suggest that the APOE genotypes may be associated with the psychosocial and neurocognitive performance after spontaneous nonaneurysmal SAH, even in the absence of neurological impairment. Physicians should consider patient genotype in assessing the long-term consequences of nonaneurysmal SAH.
Abstract:
The endomyocardial biopsy (EMB) in heart transplant recipients has been considered the "gold standard" for the diagnosis of graft rejection (REJ). The purpose of this retrospective study was to develop long-term strategies (frequency and postoperative duration of EMB) for REJ monitoring. Between 1985 and 1992, 346 patients (mean age 44.5 years, female patients = 14%) received 382 heart grafts. For graft surveillance, EMBs were performed according to a fixed schedule depending on the postoperative day and the results of previous biopsies. In the first year, the average number (no.) of EMBs/patient was 20, with 19% positive for REJ in the first quarter, dropping to 7% REJ/EMB by the end of the first year. The percentage of REJ/EMB declined annually from 4.7% to 4.5%, 2.2%, and less than 1% after the fifth year. Individual biopsy results in the first 3 postoperative months had little predictive value. Patients with fewer than two REJ (group 1), versus patients with two or more REJ in the first 6 postoperative months (group 2), were significantly less likely to reject in the second half of the first year (group 1: 0.29 +/- 0.6 REJ/patient; group 2: 0.83 +/- 1.3 REJ/patient; P < 0.001) and in the third postoperative year (group 1: 0.12 +/- 0.33 REJ/patient; group 2: 0.46 +/- 0.93 REJ/patient; P < 0.05). In conclusion, routine EMBs in the first 3 postoperative months have only limited predictive value; however, the number of routine EMBs can be drastically reduced later, depending on the intermediate postoperative REJ pattern.
Abstract:
In autumn 2007 the Swiss Medical School of Berne (Switzerland) implemented mandatory short-term clerkships in primary health care for all undergraduate medical students. Students studying for a Bachelor degree complete 8 half-days per year in the office of a general practitioner, while students studying for a Master's degree complete a three-week clerkship. Every student completes these clerkships in the same GP office throughout the four years of study. The purpose of this paper is to show how the goals and learning objectives were developed and evaluated. Method: A working group of general practitioners and faculty had the task of defining goals and learning objectives for a specific training program within the complex context of primary health care. The group based its work on various national and international publications. An evaluation of the program, a list of minimum requirements for the clerkships, an oral exam in the first year, and an OSCE assignment in the third year assessed achievement of the learning objectives. Results: The findings present the goals and principal learning objectives for these clerkships, the results of the evaluation, and the achievement of minimum requirements. Most of the defined learning objectives were taught and duly learned by students. Some learning objectives proved to be incompatible with the context of ambulatory primary care and had to be adjusted accordingly. Discussion: The learning objectives were evaluated and adapted to address students’ and teachers’ needs and the requirements of the medical school. The achievement of minimum requirements (and hence of the learning objectives) for clerkships has been mandatory since 2008. Further evaluations will show whether additional learning objectives need to be adopted.
Abstract:
PURPOSE Extended grafting procedures in atrophic ridges are invasive and time-consuming and increase cost and patient morbidity. Therefore, ridge-splitting techniques have been suggested to enlarge alveolar crests. The aim of this cohort study was to report techniques and radiographic outcomes of implants placed simultaneously with a piezoelectric alveolar ridge-splitting technique (RST). Peri-implant bone-level changes (ΔIBL) of implants placed with (study group, SG) or without RST (control group, CG) were compared. MATERIALS AND METHODS Two cohorts (seven patients in each) were matched regarding implant type, position, and number; superstructure type; age; and gender and received 17 implants each. Crestal implant bone level (IBL) was measured at surgery (T0), loading (T1), and 1 year (T2) and 2 years after loading (T3). For all implants, ΔIBL values were determined from radiographs. Differences in ΔIBL between SG and CG were analyzed statistically (Mann-Whitney U test). Bone width was assessed intraoperatively, and vertical bone mapping was performed at T0, T1, and T3. RESULTS After a mean observation period of 27.4 months after surgery, the implant survival rate was 100%. Mean ΔIBL was -1.68 ± 0.90 mm for SG and -1.04 ± 0.78 mm for CG (P = .022). Increased ΔIBL in SG versus CG occurred mainly until T2. Between T2 and T3, ΔIBL was limited (-0.11 ± 1.20 mm for SG and -0.05 ± 0.16 mm for CG; P = .546). Median bone width increased intraoperatively by 4.7 mm. CONCLUSIONS Within the limitations of this study, it can be suggested that RST is a well-functioning one-stage alternative to extended grafting procedures if the ridge shows adequate height. ΔIBL values indicated that implants with RST may fulfill accepted implant success criteria. However, during healing and the first year of loading, increased IBL alterations must be anticipated.
Abstract:
BACKGROUND From January 2011 onward, the Swiss newborn screening (NBS) program has included a test for cystic fibrosis (CF). In this study, we evaluate the first year of implementation of the CF-NBS program. METHODS The CF-NBS program consists of testing in two steps: a heel prick sample is drawn (= Guthrie test) for measurement of immunoreactive trypsinogen (IRT) and for DNA screening. All children with a positive screening test are referred to a CF center for further diagnostic testing (sweat test and genetic analysis). After assessment in the CF center, the parents are given a questionnaire. All the results of the screening process and the parent questionnaires were centrally collected and evaluated. RESULTS In 2011, 83 198 neonates were screened, 84 of whom (0.1%) had a positive screening result and were referred to a CF center. 30 of these 84 infants were finally diagnosed with CF (positive predictive value: 35.7%). There was an additional infant with CF and meconium ileus whose IRT value was normal. The 31 diagnosed children with CF correspond to an incidence of 1 : 2683. The average time from birth to genetically confirmed diagnosis was 34 days (range: 13-135). 91% of the parents were satisfied that their child had undergone screening. All infants receiving a diagnosis of CF went on to receive further professional care in a CF center. CONCLUSION The suggested procedure for CF-NBS has been found effective in practice; there were no major problems with its implementation. It reached high acceptance among physicians and parents.
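The two headline figures above follow from simple arithmetic on the reported counts; the short sketch below merely reproduces that calculation from the numbers given in the abstract.

```python
# Counts as reported in the abstract
referred_positive = 84   # infants with a positive screening result
confirmed_cf = 30        # of those, confirmed CF after referral
screened = 83198         # neonates screened in 2011
total_cf = 31            # includes one case with meconium ileus and normal IRT

ppv = confirmed_cf / referred_positive
print(f"Positive predictive value: {ppv:.1%}")              # 35.7%
print(f"Incidence: about 1 in {screened / total_cf:.0f}")   # ~2684 (reported as 1 : 2683)
```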
Abstract:
During the school-to-work transition, adolescents develop values and prioritize what is important in their life. Values are concepts or beliefs about desirable states or behaviors that guide the selection or evaluation of behavior and events, and are ordered by their relative importance (Schwartz & Bilsky, 1987). Stressing the important role of values, career research has intensively studied the effect of values on educational decisions and early career development (e.g., Eccles, 2005; Hirschi, 2010; Rimann, Udris, & Weiss, 2000). Few researchers, however, have so far investigated how values develop in the early career phase and how value trajectories are influenced by individual characteristics. Values can be oriented towards specific life domains, such as work or family. Work values include intrinsic and extrinsic aspects of work (e.g., self-development, cooperation with others, income) (George & Jones, 1997). Family values include the importance of partnership, the creation of one's own family, and having children (Mayer, Kuramschew, & Trommsdroff, 2009). Research indicates that work values change considerably during early career development (Johnson, 2001; Lindsay & Knox, 1984). Individual differences in work values and value trajectories have been found, for example, in relation to gender (Duffy & Sedlacek, 2007), parental background (Loughlin & Barling, 2001), personality (Lowry et al., 2012), education (Battle, 2003), and the anticipated timing of the school-to-work transition (Porfeli, 2007). In contrast to work values, research on family value trajectories is rare, and knowledge about their development during the school-to-work transition and early career is lacking. This paper aims to fill this research gap. Focusing on family values and intrinsic work values, we expect a) family and work values to change between ages 16 and 25, and b) initial levels of family and work values, as well as value change, to be predicted by gender, reading literacy, ambition, and expected duration of education. Method. Using data from 2620 young adults (59.5% females) who participated in the Swiss longitudinal study TREE, latent growth modeling was employed to estimate the initial level and growth rate per year for work and family values. Analyses are based on TREE waves 1 (year 2001, first year after compulsory school) to 8 (year 2010). Variables in the models included family values and intrinsic work values, gender, reading literacy, ambition, and expected duration of education. Language region was included as a control variable. Results. Family values did not change significantly over the first four years after leaving compulsory school (mean slope = -.03, p = .36). They did, however, increase significantly five years after compulsory school (mean slope = .13, p > .001). Intercept (.23, p < .001), first slope (.02, p < .001), and second slope (.01, p < .001) showed significant variance. Initial levels were higher for men and for those with higher ambitions. Increases were found to be steeper for males as well as for participants with lower educational duration expectations and lower reading skills. Intrinsic work values increased over the first four years (mean slope = .03, p < .05) and showed a tendency to decrease in years five to ten (mean slope = -.01, p < .10). Intercept (.21, p < .001), first slope (.01, p < .001), and second slope (.01, p < .001) showed significant variance, meaning that there are individual differences in initial levels and growth rates. Initial levels were higher for females, for those with higher ambitions, for those expecting longer educational pathways, and for those with lower reading skills. Growth rates were lower in the first phase and steeper in the second phase for males compared with females. Discussion. In general, results showed different patterns of work and family value trajectories, and different individual factors related to initial levels and development after compulsory school. These developments seem to fit major life and career roles: in the first years after compulsory school, young adults may be engaged in becoming established in their jobs; later on, raising a family becomes more important. The significant gender differences in work and family value trajectories may reflect attempts to overcome traditional roles: overall, women increase in work values and men increase in family values, resulting in an overall trend toward convergence.
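As a rough, generic sketch of the latent growth modeling approach described above (not the study's exact specification, whose time coding and covariate structure are assumed rather than reported in full), a piecewise model with an intercept and two slope factors can be written as:

```latex
% Piecewise linear latent growth model with an intercept and two slopes (generic form)
\[
\begin{aligned}
y_{it} &= \eta_{0i} + \eta_{1i}\,\lambda_{1t} + \eta_{2i}\,\lambda_{2t} + \varepsilon_{it},\\
\eta_{ki} &= \alpha_k + \gamma_k^{\top} x_i + \zeta_{ki}, \qquad k = 0, 1, 2,
\end{aligned}
\]
```

where y_{it} is the value reported by person i at wave t, λ_{1t} and λ_{2t} are time codes for the first (waves 1-4) and second (waves 5-8) growth phases, and x_i collects the predictors (gender, reading literacy, ambition, expected duration of education, and language region as a control).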
Abstract:
Diarrheal disease is a leading cause of morbidity and mortality, especially in children in developing countries. Global mortality caused by diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and the first case of human cryptosporidiosis was not reported until 1976. This study was conducted to ascertain the risk factors for first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses demonstrated that infants greater than six months of age had a two-fold risk of infection compared with infants less than six months of age (RR = 2.17; 95% C.I. = 1.01-4.82). When stratified by sex, male infants greater than six months of age were four times more likely to become infected than male infants less than six months of age. Among female infants, there was no difference in risk between those greater than six months of age and those less than six months of age. Female infants less than six months of age were twice as likely to become infected as male infants less than six months of age. The reverse occurred for infants greater than six months of age, i.e., male infants greater than six months of age had twice the risk of infection compared with females of the same age group. Further analysis revealed an increased risk of cryptosporidiosis in infants who were attended in childbirth by traditional childbirth attendants compared with infants who were attended by modern childbirth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% C.I. = 1.05-36.06). The final risk factor of significance was the number of people residing in the household: infants in households with more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors.
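For context on the RR figures quoted above, the sketch below computes a relative risk and its 95% CI from a 2 x 2 table using the standard log (Katz) method; the cell counts are hypothetical, since the cohort's raw counts are not given in the abstract.

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk with a 95% CI (log method) for a 2x2 table:
    a = exposed, infected; b = exposed, not infected;
    c = unexposed, infected; d = unexposed, not infected."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Hypothetical counts, e.g. infants > 6 months (exposed) vs <= 6 months (unexposed)
print(relative_risk(a=20, b=80, c=10, d=110))
```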
Abstract:
PURPOSE To evaluate and compare crestal bone level changes and peri-implant status of implant-supported reconstructions in edentulous and partially dentate patients after a minimum of 5 years of loading. MATERIALS AND METHODS All patients who received a self-tapping implant with a microstructured surface during the years 2003 and 2004 at the Department of Prosthodontics, University of Bern, were included in this study. The implant restorations comprised fixed and removable prostheses for partially and completely edentulous patients. Radiographs were taken immediately after surgery, at impression making, and 1 and 5 years after loading. Crestal bone level (BIC) was measured from the implant shoulder to the first bone contact, and changes were calculated over time (ΔBIC). The associations between pocket depth, bleeding on probing (BOP), and ΔBIC were assessed. RESULTS Sixty-one implants were placed in 20 patients (mean age, 62 ± 7 years). At the 5-year follow-up, 19 patients with 58 implants were available. Implant survival was 98.4% (one early failure; one patient died). The average ΔBIC between surgery and 5-year follow-up was 1.5 ± 0.9 mm and 1.1 ± 0.6 mm for edentulous and partially dentate patients, respectively. Most bone resorption (50%, 0.7 mm) occurred during the first 3 months (osseointegration) and within the first year of loading (21%, 0.3 mm). Mean annual bone loss during the 5 years of loading was < 0.12 mm. Mean pocket depth was 2.6 ± 0.7 mm. Seventeen percent of the implant sites displayed BOP; the frequency was significantly higher in women. None of the variables were significantly associated with crestal bone loss. CONCLUSION Crestal bone loss after 5 years was within the normal range, without a significant difference between edentulous and partially dentate patients. In the short term, this implant system can be used successfully for various prosthetic indications.