809 results for First-year students


Relevance:

90.00%

Publisher:

Abstract:

The new Swiss implant system SPI became available three years ago and is used in combination with fixed and removable prosthetic reconstructions. In a pilot study, the clinical procedures were evaluated and data on prosthetic complications and maintenance service were collected. Twenty-five patients with a total of 79 SPI implants participated in the study during the period 2003-2004; 37 implants were located in the maxilla and 42 in the mandible. Two implants failed during the healing period, but no loaded implant was lost, giving a survival rate of 97.5% (77/79). Forty-four implants supported fixed prostheses, including nine single crowns, and 33 implants were used in combination with removable partial dentures; of these, four were used with ball-anchor retention and 29 with bar support. The ELEMENT implant with its low implant shoulder allows very good esthetics. Prosthetic complications and maintenance service during the first year of function were comparable with other implant systems. Since the design of the abutment screws, healing caps, and screwdriver was changed, the system has become easier to use.
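The survival rate quoted above follows directly from the reported counts; a minimal sketch (counts from the abstract, helper name our own):

```python
# Sanity check of the survival rate reported in the SPI pilot study.
# The counts come from the abstract; the helper function is illustrative.

def survival_rate_pct(surviving: int, total: int) -> float:
    """Survival rate as a percentage, rounded to one decimal place."""
    return round(100 * surviving / total, 1)

implants_placed = 79
lost_during_healing = 2  # no loaded implant was lost
surviving = implants_placed - lost_during_healing  # 77

print(f"{surviving}/{implants_placed} = "
      f"{survival_rate_pct(surviving, implants_placed)}%")
# → 77/79 = 97.5%
```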

Relevance:

90.00%

Publisher:

Abstract:

Stem cells of various tissues are typically defined as multipotent cells with 'self-renewal' properties. Despite the increasing interest in stem cells, surprisingly little is known about the number of times stem cells can or do divide over a lifetime. Based on telomere-length measurements of hematopoietic cells, we previously proposed that the self-renewal capacity of hematopoietic stem cells is limited by progressive telomere attrition and that such cells divide very rapidly during the first year of life. Recent studies of patients with aplastic anemia resulting from inherited mutations in telomerase genes support the notion that the replicative potential of hematopoietic stem cells is directly related to telomere length, which is indirectly related to telomerase levels. To revisit conclusions about stem cell turnover based on cross-sectional studies of telomere length, we performed a longitudinal study of telomere length in leukocytes from newborn baboons. All four individual animals studied showed a rapid decline in telomere length (approximately 2-3 kb) in granulocytes and lymphocytes in the first year after birth. After 50-70 weeks the telomere length appeared to stabilize in all cell types. These observations suggest that hematopoietic stem cells, after an initial phase of rapid expansion, switch at around 1 year of age to a different functional mode characterized by a markedly decreased turnover rate.

Relevance:

90.00%

Publisher:

Abstract:

OBJECT: In this study, the authors prospectively evaluated long-term psychosocial and neurocognitive performance in patients suffering from nonaneurysmal, nontraumatic subarachnoid hemorrhage (SAH) and investigated the association between the APOE-epsilon4 genotype and outcome in these patients. METHODS: All patients admitted to the authors' institution between January 2001 and January 2003 with spontaneous nonaneurysmal SAH were prospectively examined (mean follow-up 59.8 months). The APOE genotype was determined in all patients by polymerase chain reaction from a blood sample. Of the 30 patients included in this study, 11 were carriers of the epsilon4 allele. RESULTS: All patients showed a good recovery and regained full independence with no persisting neurological deficits. The patients with the epsilon4 allele, however, scored significantly higher on the Beck Depression Inventory (22.1 ± 6.3 vs 14.1 ± 5.1). At follow-up, depression was significantly more persistent in the group with the epsilon4 allele than in the group lacking the allele (p < 0.05). Selective attention was impaired in all patients during the first year of follow-up, with an earlier recovery noted in the patients without the epsilon4 allele. Moreover, there was a tendency toward a linear relationship between the Beck Depression Inventory and the d2 Test of Attention. Two patients who carried the epsilon4 allele had not returned to their employment even after 5 years. CONCLUSIONS: The findings in this study suggest that APOE genotype may be associated with psychosocial and neurocognitive performance after spontaneous nonaneurysmal SAH, even in the absence of neurological impairment. Physicians should consider patient genotype in assessing the long-term consequences of nonaneurysmal SAH.

Relevance:

90.00%

Publisher:

Abstract:

In this issue...Butte, Anaconda, Silver Lake, Montana, Wilson Chemical Company, Seattle, Washington, Co-Ed Club, Professor Koenig, First-Year student course

Relevance:

90.00%

Publisher:

Abstract:

The endomyocardial biopsy (EMB) in heart transplant recipients has been considered the "gold standard" for the diagnosis of graft rejection (REJ). The purpose of this retrospective study was to develop long-term strategies (frequency and postoperative duration of EMB) for REJ monitoring. Between 1985 and 1992, 346 patients (mean age 44.5 years; 14% female) received 382 heart grafts. For graft surveillance, EMBs were performed according to a fixed schedule depending on the postoperative day and the results of previous biopsies. In the first year, the average number of EMBs per patient was 20, with 19% positive for REJ in the first quarter, dropping to 7% REJ/EMB by the end of the first year. The percentage of REJ/EMB declined annually from 4.7% to 4.5%, 2.2%, and less than 1% after the fifth year. Individual biopsy results in the first 3 postoperative months had little predictive value. Patients with fewer than two REJ episodes (group 1), versus patients with two or more REJ episodes in the first 6 postoperative months (group 2), were significantly less likely to reject in the second half of the first year (group 1: 0.29 ± 0.6 REJ/patient; group 2: 0.83 ± 1.3 REJ/patient; P < 0.001) and in the third postoperative year (group 1: 0.12 ± 0.33 REJ/patient; group 2: 0.46 ± 0.93 REJ/patient; P < 0.05). In conclusion, routine EMBs in the first 3 postoperative months have only limited predictive value; however, the number of routine EMBs can be drastically reduced later, depending on the intermediate postoperative REJ pattern.

Relevance:

90.00%

Publisher:

Abstract:

In autumn 2007 the Swiss Medical School of Berne (Switzerland) implemented mandatory short-term clerkships in primary health care for all undergraduate medical students. Students studying for a Bachelor degree complete 8 half-days per year in the office of a general practitioner, while students studying for a Master degree complete a three-week clerkship. Every student completes these clerkships in the same GP office over the four years of study. The purpose of this paper is to show how the goals and learning objectives were developed and evaluated. Method: A working group of general practitioners and faculty was tasked with defining goals and learning objectives for a specific training program within the complex context of primary health care. The group based its work on various national and international publications. An evaluation of the program, a list of minimum requirements for the clerkships, an oral exam in the first year, and an OSCE assignment in the third year assessed achievement of the learning objectives. Results: The findings present the goals and principal learning objectives for these clerkships, the results of the evaluation, and the achievement of minimum requirements. Most of the defined learning objectives were taught and duly learned by students. Some learning objectives proved to be incompatible with the context of ambulatory primary care and had to be adjusted accordingly. Discussion: The learning objectives were evaluated and adapted to address students' and teachers' needs and the requirements of the medical school. The achievement of minimum requirements (and hence of the learning objectives) for clerkships has been mandatory since 2008. Further evaluations will show whether additional learning objectives need to be adopted.

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE Extended grafting procedures in atrophic ridges are invasive and time-consuming and increase cost and patient morbidity. Therefore, ridge-splitting techniques have been suggested to enlarge alveolar crests. The aim of this cohort study was to report techniques and radiographic outcomes of implants placed simultaneously with a piezoelectric alveolar ridge-splitting technique (RST). Peri-implant bone-level changes (ΔIBL) of implants placed with (study group, SG) or without RST (control group, CG) were compared. MATERIALS AND METHODS Two cohorts (seven patients in each) were matched regarding implant type, position, and number; superstructure type; age; and gender and received 17 implants each. Crestal implant bone level (IBL) was measured at surgery (T0), loading (T1), and 1 year (T2) and 2 years after loading (T3). For all implants, ΔIBL values were determined from radiographs. Differences in ΔIBL between SG and CG were analyzed statistically (Mann-Whitney U test). Bone width was assessed intraoperatively, and vertical bone mapping was performed at T0, T1, and T3. RESULTS After a mean observation period of 27.4 months after surgery, the implant survival rate was 100%. Mean ΔIBL was -1.68 ± 0.90 mm for SG and -1.04 ± 0.78 mm for CG (P = .022). Increased ΔIBL in SG versus CG occurred mainly until T2. Between T2 and T3, ΔIBL was limited (-0.11 ± 1.20 mm for SG and -0.05 ± 0.16 mm for CG; P = .546). Median bone width increased intraoperatively by 4.7 mm. CONCLUSIONS Within the limitations of this study, it can be suggested that RST is a well-functioning one-stage alternative to extended grafting procedures if the ridge shows adequate height. ΔIBL values indicated that implants with RST may fulfill accepted implant success criteria. However, during healing and the first year of loading, increased IBL alterations must be anticipated.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND Since January 2011, the Swiss newborn screening (NBS) program has included a test for cystic fibrosis (CF). In this study, we evaluate the first year of implementation of the CF-NBS program. METHODS The CF-NBS program consists of two testing steps: a heel prick sample (Guthrie test) is drawn for measurement of immunoreactive trypsinogen (IRT) and for DNA screening. All children with a positive screening test are referred to a CF center for further diagnostic testing (sweat test and genetic analysis). After assessment in the CF center, the parents are given a questionnaire. All results of the screening process and the parent questionnaires were centrally collected and evaluated. RESULTS In 2011, 83 198 neonates were screened, 84 of whom (0.1%) had a positive screening result and were referred to a CF center. Thirty of these 84 infants were finally diagnosed with CF (positive predictive value: 35.7%). There was one additional infant with CF and meconium ileus whose IRT value was normal. The 31 children diagnosed with CF correspond to an incidence of 1:2683. The average time from birth to genetically confirmed diagnosis was 34 days (range: 13-135). 91% of the parents were satisfied that their child had undergone screening. All infants diagnosed with CF went on to receive further professional care in a CF center. CONCLUSION The suggested procedure for CF-NBS has proven effective in practice; there were no major problems with its implementation, and it has reached high acceptance among physicians and parents.
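The screening metrics above follow directly from the reported counts; a minimal sketch recomputing the positive predictive value and the incidence (all counts from the abstract, variable names our own):

```python
# Recomputing the CF newborn-screening metrics from the counts in the abstract.
# Variable names are our own; all numbers come from the text.

screened = 83_198        # neonates screened in 2011
screen_positive = 84     # referred to a CF center
confirmed_cf = 30        # diagnosed with CF after referral
missed_cases = 1         # CF with meconium ileus but a normal IRT value

ppv = 100 * confirmed_cf / screen_positive
total_cf = confirmed_cf + missed_cases
incidence_denominator = screened / total_cf

print(f"PPV: {ppv:.1f}%")                             # → PPV: 35.7%
print(f"Incidence: 1 : {incidence_denominator:.0f}")  # ≈ 1 : 2684 (the abstract truncates to 1 : 2683)
```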

Relevance:

90.00%

Publisher:

Abstract:

During the school-to-work transition, adolescents develop values and prioritize what is important in their life. Values are concepts or beliefs about desirable states or behaviors that guide the selection or evaluation of behavior and events, and are ordered by their relative importance (Schwartz & Bilsky, 1987). Stressing the important role of values, career research has intensively studied the effect of values on educational decisions and early career development (e.g., Eccles, 2005; Hirschi, 2010; Rimann, Udris, & Weiss, 2000). Few researchers, however, have so far investigated how values develop in the early career phase and how value trajectories are influenced by individual characteristics. Values can be oriented towards specific life domains, such as work or family. Work values include intrinsic and extrinsic aspects of work (e.g., self-development, cooperation with others, income) (George & Jones, 1997). Family values include the importance of partnership, the creation of one's own family, and having children (Mayer, Kuramschew, & Trommsdroff, 2009). Research indicates that work values change considerably during early career development (Johnson, 2001; Lindsay & Knox, 1984). Individual differences in work values and value trajectories have been found, e.g., in relation to gender (Duffy & Sedlacek, 2007), parental background (Loughlin & Barling, 2001), personality (Lowry et al., 2012), education (Battle, 2003), and the anticipated timing of the school-to-work transition (Porfeli, 2007). In contrast to work values, research on family value trajectories is rare, and knowledge about their development during the school-to-work transition and early career development is lacking. This paper aims at filling this research gap.
Focusing on family values and intrinsic work values, we expect a) family and work values to change between ages 16 and 25, and b) initial levels of family and work values as well as value change to be predicted by gender, reading literacy, ambition, and expected duration of education. Method. Using data from 2620 young adults (59.5% female) who participated in the Swiss longitudinal study TREE, latent growth modeling was employed to estimate the initial level and growth rate per year for work and family values. Analyses are based on TREE waves 1 (year 2001, the first year after compulsory school) to 8 (year 2010). Variables in the models included family values and intrinsic work values, gender, reading literacy, ambition, and expected duration of education. Language region was included as a control variable. Results. Family values did not change significantly over the first four years after leaving compulsory school (mean slope = -.03, p = .36). They did, however, increase significantly from five years after compulsory school onward (mean slope = .13, p < .001). Intercept (.23, p < .001), first slope (.02, p < .001), and second slope (.01, p < .001) showed significant variance. Initial levels were higher for men and for those with higher ambitions. Increases were steeper for males as well as for participants with lower educational duration expectations and lower reading skills. Intrinsic work values increased over the first four years (mean slope = .03, p < .05) and showed a tendency to decrease in years five to ten (mean slope = -.01, p < .10). Intercept (.21, p < .001), first slope (.01, p < .001), and second slope (.01, p < .001) showed significant variance, meaning that there are individual differences in initial levels and growth rates. Initial levels were higher for females and for those with higher ambitions, expecting longer educational pathways, and having lower reading skills.
Growth rates were lower in the first phase and steeper in the second phase for males compared to females. Discussion. In general, results showed different patterns of work and family value trajectories, and different individual factors related to initial levels and development after compulsory school. The developments seem to fit major life and career roles: in the first years after compulsory school, young adults may be engaged in becoming established in their jobs; later on, raising a family becomes more important. That we found significant gender differences in work and family trajectories may reflect attempts to overcome traditional roles: overall, women increase in work values and men increase in family values, resulting in an overall trend to converge.

Relevance:

90.00%

Publisher:

Abstract:

Diarrheal disease is a leading cause of morbidity and mortality, especially among children in developing countries; global mortality from diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and not until 1976 that the first case of human cryptosporidiosis was reported. This study was conducted to ascertain the risk factors for first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses demonstrated that infants older than six months had a two-fold risk of infection compared with infants younger than six months (RR = 2.17; 95% C.I. = 1.01-4.82). When stratified, male infants older than six months were four times more likely to become infected than male infants younger than six months, whereas among female infants there was no difference in risk between the two age groups. Female infants younger than six months were twice as likely to become infected as male infants younger than six months; the reverse occurred for infants older than six months, i.e., males in that age group had twice the risk of infection compared with females. Further analysis revealed an increased risk of cryptosporidiosis in infants who were attended at childbirth by traditional birth attendants compared with infants attended by modern childbirth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% C.I. = 1.05-36.06). The final risk factor of significance was the number of people residing in the household.
Infants in households with more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors.
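The relative risks above are reported with 95% confidence intervals. For readers unfamiliar with the computation, here is a generic sketch using the Katz log method; the 2×2 counts are hypothetical (chosen only to reproduce RR ≈ 2.17), since the abstract does not publish the underlying table:

```python
# Generic relative-risk calculation with a 95% CI via the Katz log method.
import math

def relative_risk(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """RR and 95% CI limits for a 2x2 table:
       exposed:   a events, b non-events
       unexposed: c events, d non-events
    """
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    half_width = 1.96 * se
    return (rr,
            math.exp(math.log(rr) - half_width),
            math.exp(math.log(rr) + half_width))

# Hypothetical counts for illustration only.
rr, lo, hi = relative_risk(a=26, b=74, c=12, d=88)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```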

Relevance:

90.00%

Publisher:

Abstract:

Palestinians living in the West Bank, a territory occupied by the State of Israel according to international law, face deprived access to land and limited freedom of movement owing to the presence of Israeli settlements and other infrastructure (closures, restricted or forbidden roads, etc.). This confinement has significant impacts on their economic and social livelihoods, and it is worsening further with the ongoing construction of a 709-km-long Barrier that mainly runs inside the West Bank. In view of this situation, there is a clear need to strengthen the capacity of civil society and its representatives to apply sound research processes as a basis for improved advocacy for Palestinian human rights. Monitoring processes and tools are needed to assess the impacts of the Palestinians' confinement, particularly in relation to the Barrier's construction. Reliable data also have to be collected, managed, and, above all, shared. These challenges have been addressed within the Academic Cooperation Palestine Project (ACPP), which brings together academic partners from the occupied Palestinian territory (oPt) West Bank (WB) and Switzerland, as well as other international academic institutions and Palestinian governmental and non-governmental agencies. ACPP started in early 2011 and is designed as a large cooperation networking platform involving researchers, students, public servants, and experts from the oPt WB. A large set of actions has already been developed during the first year of the project, including courses, training, and research actions. The first relevant results and impacts of the different actions are presented in this paper. Taken as a whole, the project produces valuable results for all partners: useful advocacy material for the Palestinian partners, and a unique "real-scale laboratory" where investigations are jointly conducted to develop novel confinement and change indicators.

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE To evaluate and compare crestal bone level changes and peri-implant status of implant-supported reconstructions in edentulous and partially dentate patients after a minimum of 5 years of loading. MATERIALS AND METHODS All patients who received a self-tapping implant with a microstructured surface during the years 2003 and 2004 at the Department of Prosthodontics, University of Bern, were included in this study. The implant restorations comprised fixed and removable prostheses for partially and completely edentulous patients. Radiographs were taken immediately after surgery, at impression making, and 1 and 5 years after loading. Crestal bone level (BIC) was measured from the implant shoulder to the first bone contact, and changes were calculated over time (ΔBIC). The associations between pocket depth, bleeding on probing (BOP), and ΔBIC were assessed. RESULTS Sixty-one implants were placed in 20 patients (mean age, 62 ± 7 years). At the 5-year follow-up, 19 patients with 58 implants were available. Implant survival was 98.4% (one early failure; one patient died). The average ΔBIC between surgery and 5-year follow-up was 1.5 ± 0.9 mm and 1.1 ± 0.6 mm for edentulous and partially dentate patients, respectively. Most bone resorption (50%, 0.7 mm) occurred during the first 3 months (osseointegration) and within the first year of loading (21%, 0.3 mm). Mean annual bone loss during the 5 years of loading was < 0.12 mm. Mean pocket depth was 2.6 ± 0.7 mm. Seventeen percent of the implant sites displayed BOP; the frequency was significantly higher in women. None of the variables were significantly associated with crestal bone loss. CONCLUSION Crestal bone loss after 5 years was within the normal range, without a significant difference between edentulous and partially dentate patients. In the short term, this implant system can be used successfully for various prosthetic indications.

Relevance:

90.00%

Publisher:

Abstract:

AIMS HIV infection may be associated with an increased recurrence rate of myocardial infarction. Our aim was to determine whether HIV infection is a risk factor for worse outcomes in patients with coronary artery disease. METHODS We compared data aggregated from two ongoing cohorts: (i) the Acute Myocardial Infarction in Switzerland (AMIS) registry, which includes patients with acute myocardial infarction (AMI), and (ii) the Swiss HIV Cohort Study (SHCS), a prospective registry of HIV-positive (HIV+) patients. We included all patients who survived an incident AMI occurring on or after 1 January 2005. Our primary outcome measure was all-cause mortality at one year; secondary outcomes included AMI recurrence and cardiovascular-related hospitalisations. Comparisons used Cox and logistic regression analyses, respectively. RESULTS There were 133 HIV+ (SHCS) and 5,328 HIV-negative (HIV-; AMIS) individuals with incident AMI. In the SHCS and AMIS registries, patients were predominantly male (72% and 85%, respectively), with median ages of 51 years (interquartile range [IQR] 46-57) and 64 years (IQR 55-74), respectively. Nearly all (90%) HIV+ individuals were on successful antiretroviral therapy. During the first year of follow-up, 5 (3.6%) HIV+ and 135 (2.5%) HIV- individuals died. At one year, HIV+ status, after adjustment for age, sex, calendar year of AMI, smoking status, hypertension, and diabetes, was associated with a higher risk of death (HR 4.42, 95% CI 1.73-11.27). There were no significant differences in recurrent AMIs (4 [3.0%] HIV+ and 146 [3.0%] HIV- individuals; OR 1.16, 95% CI 0.41-3.27) or in hospitalization rates (OR 0.68, 95% CI 0.42-1.11). CONCLUSIONS HIV infection was associated with a significantly increased risk of all-cause mortality one year after incident AMI.

Relevance:

90.00%

Publisher:

Abstract:

The Interstellar Boundary Explorer (IBEX) observes the IBEX ribbon, which stretches across much of the sky observed in energetic neutral atoms (ENAs). The ribbon covers a narrow (~20°-50°) region that is believed to be roughly perpendicular to the interstellar magnetic field. Superimposed on the IBEX ribbon is the globally distributed flux that is controlled by the processes and properties of the heliosheath. This is a second study that utilizes a previously developed technique to separate ENA emissions in the ribbon from the globally distributed flux. A transparency mask is applied over the ribbon and regions of high emissions. We then solve for the globally distributed flux using an interpolation scheme. Previously, ribbon separation techniques were applied to the first year of IBEX-Hi data at and above 0.71 keV. Here we extend the separation analysis down to 0.2 keV and to five years of IBEX data, enabling the first maps of the ribbon and the globally distributed flux across the full sky of ENA emissions. Our analysis shows the broadening of the ribbon peak at energies below 0.71 keV and demonstrates the apparent deformation of the ribbon in the nose and heliotail. We show global asymmetries of the heliosheath, including both deflection of the heliotail and differing widths of the lobes, in the context of the direction, draping, and compression of the heliospheric magnetic field. We discuss implications of the ribbon maps for the wide array of concepts that attempt to explain the ribbon's origin. Thus, we present the five-year separation of the IBEX ribbon from the globally distributed flux in preparation for a formal IBEX data release of ribbon and globally distributed flux maps to the heliophysics community.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. METHODS A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence of and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was also analyzed. RESULTS Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) compared with patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was longer in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) compared with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection, and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD, and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CONCLUSIONS CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared with patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and of appropriate therapeutic strategies for preventing asymptomatic CMV infection.