126 results for cross-sectional area
Abstract:
OBJECTIVE The aim of this cross-sectional study was to estimate bone loss around implants with a platform-switching design and to analyze possible risk indicators after 5 years of loading in a multi-centered private practice network. METHOD AND MATERIALS Peri-implant bone loss was measured radiographically as the distance from the implant shoulder to the mesial and distal alveolar crest, respectively. Risk factor analysis for marginal bone loss included the type of implant prosthetic treatment concept and the dental status of the opposite arch. RESULTS A total of 316 implants in 98 study patients were examined after 5 years of loading. The overall mean radiographic bone loss was 1.02 mm (SD ± 1.25 mm, 95% CI 0.90-1.14). Correlation analyses indicated a strong association between peri-implant bone loss > 2 mm and removable implant-retained prostheses, with an odds ratio of 53.8. CONCLUSION The 5-year results of the study show clinically acceptable mean bone loss after 5 years of loading. Implant-supported removable prostheses seem to be a strong co-factor for extensive bone level changes compared with fixed reconstructions. However, these results must be interpreted in the context of the specific cohort included, treated under private dental office conditions.
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data about 25 firearm state laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates, and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001). Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
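The core estimation step described in this abstract (Poisson regression of state-level death counts with robust variances, reported as incidence rate ratios) can be illustrated with a minimal sketch. All variable names and the synthetic data below are assumptions for illustration, not the study's dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical state-level data: death counts, population in units of
    # 100 000, a binary law indicator, and one covariate (all synthetic).
    rng = np.random.default_rng(0)
    n = 50
    df = pd.DataFrame({
        "deaths": rng.poisson(600, n),
        "pop_100k": rng.uniform(5.0, 400.0, n),
        "law": rng.integers(0, 2, n),
        "ownership": rng.uniform(0.2, 0.6, n),
    })

    # Poisson regression of counts with a log-population offset; a robust
    # (sandwich) covariance estimate gives the robust variances used in
    # the study.
    fit = smf.glm("deaths ~ law + ownership", data=df,
                  family=sm.families.Poisson(),
                  offset=np.log(df["pop_100k"])).fit(cov_type="HC0")

    print(np.exp(fit.params))      # incidence rate ratios (IRRs)
    print(np.exp(fit.conf_int()))  # 95% CIs on the IRR scale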
Abstract:
BACKGROUND: Cardiovascular diseases are the leading cause of death worldwide and in Switzerland. When applied, treatment guidelines for patients with acute ST-segment elevation myocardial infarction (STEMI) improve the clinical outcome and should eliminate treatment differences by sex and age for patients whose clinical situations are identical. In Switzerland, the rate at which STEMI patients receive revascularization may vary by patient and hospital characteristics. AIMS: To examine all hospitalizations in Switzerland from 2010-2011 to determine whether patient or hospital characteristics affected the rate of revascularization (receiving either a percutaneous coronary intervention or coronary artery bypass grafting) in acute STEMI patients. DATA AND METHODS: We used national data sets on hospital stays and on hospital infrastructure and operating characteristics for the years 2010 and 2011 to identify all emergency patients admitted with the main diagnosis of acute STEMI. We then calculated the proportion of patients who were treated with revascularization. We used multivariable multilevel Poisson regression to determine whether receipt of revascularization varied by patient and hospital characteristics. RESULTS: Of the 9,696 cases we identified, 71.6% received revascularization. Patients were less likely to receive revascularization if they were female or 80 years or older. In the multivariable multilevel Poisson regression analysis, small-volume hospitals showed a trend toward performing fewer revascularizations, but this was not statistically significant, while being female (relative proportion = 0.91, 95% CI: 0.86 to 0.97) and being older than 80 years remained associated with less frequent revascularization. CONCLUSION: Female and older patients were less likely to receive revascularization. Further research needs to clarify whether this reflects differential application of treatment guidelines or limitations of this kind of routine data.
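The multilevel Poisson model on a binary outcome described above (yielding "relative proportions") could be approximated, for illustration, with a generalized estimating equation (GEE) Poisson fit clustered by hospital; this is a pragmatic stand-in for a full random-effects model, and all names and data below are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical patient-level data: binary revascularization outcome,
    # sex, an age-80+ indicator, and a hospital identifier for clustering.
    rng = np.random.default_rng(1)
    n = 2000
    df = pd.DataFrame({
        "revasc": rng.integers(0, 2, n),
        "female": rng.integers(0, 2, n),
        "age80plus": rng.integers(0, 2, n),
        "hospital": rng.integers(0, 60, n),
    })

    # Poisson regression of a binary outcome estimates relative
    # proportions; GEE accounts for within-hospital correlation.
    fit = smf.gee("revasc ~ female + age80plus", groups="hospital",
                  data=df, family=sm.families.Poisson()).fit()
    print(np.exp(fit.params))  # relative proportions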
Abstract:
BACKGROUND Double-checking is widely recommended as an essential method to prevent medication errors. However, prior research has shown that the concept of double-checking is not clearly defined and that little is known about actual practice in oncology, for example, what kinds of checking procedures are applied. OBJECTIVE To study the practice of different double-checking procedures in chemotherapy administration and to explore nurses' experiences, for example, how often they actually find errors using a certain procedure. General evaluations regarding double-checking, for example the frequency of interruptions during and caused by a check, or what is regarded as its essential feature, were also assessed. METHODS In a cross-sectional survey, qualified nurses working in oncology departments of 3 hospitals were asked to rate 5 different scenarios of double-checking procedures on dimensions such as frequency of use in practice and appropriateness for preventing medication errors; they were also asked general questions about double-checking. RESULTS Overall, 274 nurses (70% response rate) participated in the survey. The procedure of jointly double-checking (read-read back) was most commonly used (69% of respondents) and was rated as very appropriate for preventing medication errors. Jointly checking medication was seen as the essential characteristic of double-checking more frequently than 'carrying out checks independently' (54% vs 24%). Most nurses (78%) found the frequency of double-checking in their department appropriate. Being interrupted in one's own current activity to support a double-check was reported to occur frequently. Regression analysis revealed a strong preference for checks that are currently implemented at the respondents' workplace. CONCLUSIONS Double-checking is well regarded by oncology nurses as a procedure to help prevent errors, with joint checking being used most frequently. Our results show that the notion of independent checking needs to be transferred more actively into clinical practice. The high frequency of reported interruptions during and caused by double-checks is of concern.
Abstract:
Healthy replacement heifers are one of the foundations of a healthy dairy herd. Farm management and rearing systems in Switzerland provide a wide variety of factors that could potentially be associated with intramammary infections (IMI) in early lactating dairy heifers. In this study, IMI with minor mastitis pathogens such as coagulase-negative staphylococci (CNS), contagious pathogens, and environmental major pathogens were identified. Fifty-four dairy farms were enrolled in the study. A questionnaire was used to collect herd-level data on housing, management and welfare of young stock during farm visits and interviews with the farmers. Cow-level data such as breed, age at first calving, udder condition and swelling, and calving ease were also recorded. Data were also collected about young stock that spent a period of at least 3 months on an external rearing farm or on a seasonal alpine farm. At the quarter level, teat conditions such as teat lesions, teat dysfunction, presence of a papilloma and teat length were recorded. Within 24 h after parturition, samples of colostral milk from 1564 quarters (391 heifers) were collected aseptically for bacterial culture. Positive bacteriological culture results were found in 49% of quarter samples. Potential risk factors for IMI were identified at the quarter, animal and herd level using multivariable and multilevel logistic regression analysis. At the herd level, tie-stalls, and at the cow level, the breed category "Brown cattle", were risk factors for IMI caused by contagious major pathogens such as Staphylococcus aureus (S. aureus). At the quarter level, teat swelling and teat lesions were highly associated with IMI caused by environmental major pathogens. At the herd level, heifer rearing at external farms was associated with fewer IMI caused by major environmental pathogens. Keeping pregnant heifers in a separate group was negatively associated with IMI caused by CNS. The odds of IMI with coagulase-negative staphylococci increased if weaning age was less than 4 months and if concentrates were fed to calves younger than 2 weeks. This study identified herd-, cow- and quarter-level risk factors that may be important for IMI prevention in the future.
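The risk-factor analysis described above (multivariable logistic regression with herd-, cow- and quarter-level data, reported as odds ratios) might look like the sketch below, which approximates the multilevel structure with herd-clustered standard errors; all variable names and data are invented for illustration.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical quarter-level data with a herd identifier.
    rng = np.random.default_rng(2)
    n = 1564
    df = pd.DataFrame({
        "imi": rng.integers(0, 2, n),
        "tie_stall": rng.integers(0, 2, n),
        "teat_lesion": rng.integers(0, 2, n),
        "weaning_lt4m": rng.integers(0, 2, n),
        "herd": rng.integers(0, 54, n),
    })

    # Logistic regression; cluster-robust errors by herd stand in for the
    # multilevel structure used in the study.
    fit = smf.logit("imi ~ tie_stall + teat_lesion + weaning_lt4m",
                    data=df).fit(cov_type="cluster",
                                 cov_kwds={"groups": df["herd"]})
    print(np.exp(fit.params))  # odds ratios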
Abstract:
BACKGROUND: Decreased bone mineral density has been reported in children with inflammatory bowel disease (IBD). We used peripheral quantitative computed tomography (pQCT) to assess bone mineralization, bone geometry, and muscle cross-sectional area (CSA) in pediatric IBD. METHODS: In a cross-sectional study, pQCT of the forearm was performed in 143 IBD patients (mean age 13.9 +/- 3.5 years); 29% were newly diagnosed, 98 had Crohn's disease, and 45 had ulcerative colitis. Auxological data, cumulative glucocorticoid dose, disease activity indices, and laboratory markers of inflammation and bone metabolism were related to the results of pQCT. RESULTS: Patients were compromised in height (-0.82 +/- 1.1 SD), weight (-0.77 +/- 1.0 SD), muscle mass (-1.12 +/- 1.0 SD), and total bone cross-sectional area (-0.79 +/- 1.0 SD) compared with age- and sex-matched healthy controls (z-scores). In newly diagnosed patients, the ratio of bone mineral mass to muscle CSA was higher than in those with longer disease duration (1.00 versus 0.30, P = 0.007). Serum albumin level and disease activity correlated with muscle mass, accounting for 41.0% of the variability in muscle mass (P < 0.01). The trabecular bone mineral density z-score was, on average, at the lower end of the normal range (-0.40 +/- 1.3 SD, P < 0.05). CONCLUSIONS: Reduced bone geometry was explained only in part by reduced height. Bone disease in children with IBD seems to be secondary to muscle wasting, which is already present at diagnosis. With longer disease duration, bone adapts to the lower muscle CSA. Serum albumin concentration is a good marker for muscle wasting and abnormal bone development.
Abstract:
The effect of acetyl-L-carnitine (ALCAR) supplementation in 3-month-old rats under normal-loading and unloading conditions was investigated here by a combined morphological, biochemical and transcriptional approach to test whether ALCAR might cause a remodeling of the metabolic/contractile phenotype of soleus muscle. Morphological assessment demonstrated an increase in type I oxidative fiber content and cross-sectional area in ALCAR-treated animals under both normal-loading and unloading conditions. ALCAR prevented loss of mitochondrial mass in unloaded animals, whereas no ALCAR-dependent increase in mitochondrial mass occurred in normally loaded muscle. Validated microarray analysis delineated an ALCAR-induced maintenance of a slow-oxidative expression program only in unloaded soleus muscle. Indeed, the adjustment of the expression profile of factors underlying mitochondrial oxidative metabolism, protein turnover and fiber type differentiation, and the adaptation of voltage-gated ion channel expression, differed with loading status. This selectivity may suggest a key role of muscle loading status in the manifestation of ALCAR effects. The results extend the previous notion of ALCAR's positive effect in rat soleus muscle during unloading to a broader level of biological information and point to a role for ALCAR in maintaining the muscle's slow-oxidative fiber character.
Abstract:
Purpose To compare changes in the largest cross-sectional area (CSA) of the median nerve in wrists undergoing surgical decompression with changes in wrists undergoing nonsurgical treatment of carpal tunnel syndrome (CTS). Methods This was a prospective cohort study of 55 consecutive patients with 78 wrists with established CTS, including 60 wrists treated with surgical decompression and 18 wrists treated nonsurgically. A sonographic examination was scheduled before and 4 months after initiation of treatment. We compared changes in CSA of the median nerve between wrists with surgical treatment and wrists with nonsurgical treatment using linear regression models. Results Decreases in CSA of the median nerve were more pronounced in wrists with CTS release than in wrists undergoing nonsurgical treatment (difference in means, 1.0 mm2; 95% confidence interval, 0.3–1.8 mm2). Results were robust to adjustment for age, gender, and neurological severity at baseline. Among wrists with CTS release, those with a postoperative CSA of 10 mm2 or less tended to have better clinical outcomes than those with a postoperative CSA greater than 10 mm2 (p = .055). Postoperative sonographic workup in the 3 patients with an unfavorable outcome or recurrence identified likely causes of treatment failure in 2 patients. Conclusions In this observational study, surgical decompression was associated with a greater decrease in median nerve CSA than was nonsurgical treatment. Smaller postoperative CSAs may be associated with better clinical outcomes. Additional randomized trials are necessary to determine the optimal treatment strategy in different subgroups of patients with CTS. Type of study/level of evidence Therapeutic III.
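The comparison reported here (change in median-nerve CSA regressed on treatment group, adjusted for baseline covariates) reduces to an ordinary linear model in which the coefficient on the group indicator is the adjusted difference in means; the sketch below uses invented variable names and synthetic data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical wrist-level data: change in median-nerve CSA (mm^2),
    # a surgery indicator, and baseline covariates.
    rng = np.random.default_rng(3)
    n = 78
    df = pd.DataFrame({
        "csa_change": rng.normal(-2.0, 1.5, n),
        "surgery": rng.integers(0, 2, n),
        "age": rng.uniform(30.0, 75.0, n),
        "female": rng.integers(0, 2, n),
        "severity": rng.integers(1, 4, n),
    })

    fit = smf.ols("csa_change ~ surgery + age + female + severity",
                  data=df).fit()
    # Adjusted difference in mean CSA change, surgical vs nonsurgical:
    print(fit.params["surgery"], fit.conf_int().loc["surgery"])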
Abstract:
High-resolution ultrasound is becoming increasingly important in the diagnosis of carpal tunnel syndrome (CTS). Most studies define cut-off values for the cross-sectional area (CSA) of the median nerve at different locations. The individual range of nerve swelling, the size of the nerve, and its CSA are not addressed. The aim of this study was to determine the intra- and interobserver reliability of diagnostic ultrasound using two different cross-sectional areas of the median nerve at the carpal tunnel in predefined locations.
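Intra- and interobserver reliability of repeated CSA measurements is commonly quantified with an intraclass correlation coefficient; the abstract does not give its formula, so here is a minimal sketch of a two-way random-effects ICC(2,1) (Shrout-Fleiss) computed from scratch on synthetic measurements.

    import numpy as np

    def icc_2_1(y):
        """ICC(2,1): two-way random effects, absolute agreement, single
        measurement (Shrout & Fleiss). y has shape (subjects, raters)."""
        n, k = y.shape
        grand = y.mean()
        msr = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        msc = n * ((y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        sse = ((y - grand) ** 2).sum() - msr * (n - 1) - msc * (k - 1)
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Synthetic example: 10 median-nerve CSAs (mm^2) read by 2 observers.
    rng = np.random.default_rng(4)
    true_csa = rng.uniform(8.0, 14.0, 10)
    y = np.column_stack([true_csa + rng.normal(0, 0.5, 10),
                         true_csa + rng.normal(0, 0.5, 10)])
    print(icc_2_1(y))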
Abstract:
OBJECTIVES: Metacarpal juxta-articular bone is altered in rheumatoid arthritis (RA). However, a detailed analysis of disease-related geometrical adaptations of the metacarpal shaft is missing. The aim of the present study was to assess the role of RA disease, forearm muscle cross-sectional area (CSA), age and sex on bone geometry at the metacarpal shaft. METHODS: In 64 RA patients and 128 control subjects, geometric properties of the third metacarpal bone mid-shaft and forearm muscle CSA were measured by peripheral quantitative computed tomography (pQCT). Linear models were fitted for cortical CSA, total bone CSA, polar stress-strain index (polar SSI, a surrogate for bone's resistance to bending and torsion), cortical thickness and Metacarpal Index (MI = cortical CSA/total CSA), with muscle CSA, age, RA status and sex as explanatory variables. RESULTS: Forearm muscle CSA was associated with cortical and total metacarpal CSA, and with polar SSI. RA group status was associated with all bone parameters except cortical CSA. There was a significant interaction between RA status and age, indicating that the RA group had a greater age-related decrease in cortical CSA, cortical thickness and MI. CONCLUSIONS: Bone geometry of the metacarpal shaft is altered in RA patients compared with healthy controls. While bone mass of the metacarpal shaft is adapted to forearm muscle mass, cortical thickness and MI are reduced, but outer bone shaft circumference and polar SSI are increased, in RA patients. These adaptations correspond to an enhanced aging pattern in RA patients.
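The key modelling detail in this abstract is the RA-by-age interaction, which tests whether the age-related decline in a bone parameter is steeper in the RA group; a minimal sketch with invented names and data:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical subject-level data for one outcome (cortical thickness).
    rng = np.random.default_rng(5)
    n = 192  # 64 RA patients + 128 controls
    df = pd.DataFrame({
        "cortical_thickness": rng.normal(2.0, 0.3, n),
        "muscle_csa": rng.normal(3200.0, 400.0, n),
        "age": rng.uniform(20.0, 80.0, n),
        "ra": rng.integers(0, 2, n),
        "female": rng.integers(0, 2, n),
    })

    # `ra * age` expands to ra + age + ra:age; a negative ra:age coefficient
    # would indicate a greater age-related decrease in the RA group.
    fit = smf.ols("cortical_thickness ~ muscle_csa + ra * age + female",
                  data=df).fit()
    print(fit.params[["ra", "age", "ra:age"]])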
Abstract:
Reprogramming of gene expression contributes to the structural and functional adaptation of muscle tissue in response to altered use. The aim of this study was to investigate, by means of structural and molecular analyses, the mechanisms underlying observed improvements in leg extension strength, gain in relative thigh muscle mass, and loss of body and thigh fat content in response to eccentric and conventional strength training in elderly men (n = 14) and women (n = 14; average age 80.1 ± 3.7 years). Biopsies were collected from m. vastus lateralis in the resting state before and after 12 weeks of training with two weekly resistance exercise sessions (RET) or eccentric ergometer sessions (EET). Gene expression was analyzed using custom-designed low-density PCR arrays. Muscle ultrastructure was evaluated using electron microscopic morphometry. Gain in thigh muscle mass was paralleled by an increase in muscle fiber cross-sectional area (hypertrophy) with RET but not with EET, where muscle growth likely occurs through the addition of sarcomeres in series or through hyperplasia. The expression of transcripts encoding factors involved in muscle growth, repair and remodeling (e.g., IGF-1, HGF, MYOG, MYH3) increased to a larger extent after EET than after RET. MicroRNA 1 expression decreased independently of the training modality and was paralleled by an increased expression of IGF-1, a potential target. IGF-1 is a potent promoter of muscle growth, and its regulation by microRNA 1 may have contributed to the gain in muscle mass observed in our subjects. EET depressed genes encoding mitochondrial and metabolic transcripts. The changes in several metabolic and mitochondrial transcripts correlated significantly with changes in mitochondrial volume density. Intramyocellular lipid content decreased after EET concomitantly with total body fat. Changes in intramyocellular lipid content correlated with changes in body fat content with both RET and EET. In the elderly, RET and EET lead to distinct molecular and structural adaptations which might contribute to the observed small quantitative differences in functional tests and body composition parameters. EET seems particularly suitable for the elderly with regard to improvements in body composition and strength, but at the expense of reduced muscular oxidative capacity.
Abstract:
Human skeletal muscle exhibits an outstanding phenotypic plasticity. Endurance training leads to massive increases in mitochondria and improves capillarization. Strength training increases muscle cross-sectional area mainly by increasing myofibrillar proteins. Over the last 15 years, many molecular techniques have become available that have allowed the basic adaptive mechanisms behind muscle plasticity to be understood. Multiple parallel pathways, mainly increasing transcriptional activity for selected muscle proteins, are responsible for endurance-training-related muscle changes. Muscle changes associated with strength training are predominantly achieved by modifying translational mechanisms. This review delineates, in a functional context, the relevant molecular mechanisms responsible for the phenotypic plasticity of adult skeletal muscle tissue.
Abstract:
For the development of meniscal substitutes and related finite element models, it is necessary to know the mechanical properties of the meniscus and its attachments. Measurement errors can distort the determination of material properties. Therefore, the impact of metrological and geometrical measurement errors on the determination of the linear modulus of human meniscal attachments was investigated. After total differentiation, the errors of the force (+0.10%), attachment deformation (−0.16%), and fibre length (+0.11%) measurements almost cancelled each other out. The error of the cross-sectional area determination ranged from 0.00%, obtained from histological slides, up to 14.22%, obtained from digital calliper measurements. Hence, the total measurement error ranged from +0.05% to −14.17%, predominantly driven by the error in cross-sectional area determination. Further investigations revealed that the entire cross-section was significantly larger than the load-carrying collagen fibre area. This overestimation of the cross-sectional area led to an underestimation of the linear modulus of up to −36.7%. Additionally, the cross-sections of the collagen fibre area of the attachments varied significantly, by up to +90%, along their longitudinal axis. The resultant ratio between the collagen fibre area and the histologically determined cross-sectional area ranged between 0.61 for the posterolateral and 0.69 for the posteromedial ligament. The linear modulus of human meniscal attachments can thus be significantly underestimated depending on the method and location of cross-sectional area determination. Hence, it is suggested to assess the load-carrying collagen fibre area histologically or, alternatively, to use the correction factors proposed in this study.
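As a hedged reconstruction of the quoted error budget (assuming the linear modulus is computed as E = F L0 / (A ΔL) from force F, fibre length L0, cross-sectional area A and deformation ΔL), total differentiation gives the sum of the signed relative error contributions, which reproduces the reported range:

    \[
    \frac{\delta E}{E} \;\approx\;
    \underbrace{+0.10\%}_{\text{force}}
    \underbrace{-\,0.16\%}_{\text{deformation}}
    \underbrace{+\,0.11\%}_{\text{fibre length}}
    \;-\; \frac{\delta A}{A},
    \qquad \frac{\delta A}{A} \in [0\%,\, 14.22\%]
    \]

so that \(\delta E / E\) ranges from \(+0.05\%\) (when the area error vanishes) down to \(+0.05\% - 14.22\% = -14.17\%\), matching the values quoted above.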
Abstract:
Neurodegenerative diseases affect the cerebellum of numerous dog breeds. Magnetic resonance (MR) imaging has been used, albeit subjectively, to detect cerebellar atrophy in these diseases, but few data are available on the normal size range of the cerebellum relative to other brain regions. The purpose of this study was to determine whether the size of the cerebellum maintains a consistent ratio to other brain regions across ages and breeds of normal dogs, and to define a measurement that can be used to identify cerebellar atrophy on MR images. Images from 52 normal dogs and 13 dogs with cerebellar degenerative diseases were obtained. The volume and mid-sagittal cross-sectional area of the forebrain, brainstem, and cerebellum were calculated for each normal dog and compared between breeds and age groups as absolute and relative values. The ratios of the cerebellum to total brain and of the brainstem to cerebellum mid-sagittal cross-sectional area were compared between normal and affected dogs, and the sensitivity and specificity of these ratios in distinguishing normal from affected dogs were calculated. The percentage of the brain occupied by the cerebellum in diverse dog breeds between 1 and 5 years of age was not significantly different, and cerebellar size did not change with increasing age. Using a cut-off of 89%, the ratio between the brainstem and cerebellum mid-sagittal cross-sectional area successfully differentiated affected from unaffected dogs with a sensitivity and specificity of 100%, making this ratio an effective tool for identifying cerebellar atrophy on MR images.
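The diagnostic rule described (flagging cerebellar atrophy when the brainstem-to-cerebellum mid-sagittal CSA ratio exceeds the 89% cut-off) and the resulting sensitivity and specificity can be sketched as follows; the direction of the threshold and all ratio values are illustrative assumptions.

    import numpy as np

    # Hypothetical brainstem/cerebellum mid-sagittal CSA ratios; atrophy
    # shrinks the cerebellar denominator, so affected dogs are assumed to
    # exceed the cut-off.
    normal = np.array([0.70, 0.75, 0.80, 0.82, 0.85, 0.86, 0.88])
    affected = np.array([0.91, 0.93, 0.95, 1.02, 1.10])
    cutoff = 0.89

    tp = (affected > cutoff).sum()   # affected dogs correctly flagged
    fn = (affected <= cutoff).sum()  # affected dogs missed
    tn = (normal <= cutoff).sum()    # normal dogs correctly cleared
    fp = (normal > cutoff).sum()     # normal dogs falsely flagged

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(sensitivity, specificity)  # 1.0 and 1.0 for these synthetic values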