99 results for nonlinear panel estimation under cross-sectional dependence


Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to determine the prevalence and possible etiological factors of erosive tooth wear and wedge-shaped defects in Swiss Army recruits and compare the findings with those of an analogous study conducted in 1996. In 2006, 621 recruits between 18 and 25 years of age (1996: 417 recruits; ages 19 to 25) were examined for erosive tooth wear and wedge-shaped defects. Additional data were acquired using a questionnaire about personal details, education, the subjective condition of the dentition, oral hygiene, eating and drinking habits, medications used, and general medical problems. In 2006, 60.1% of those examined exhibited occlusal erosive tooth wear not involving the dentin (1996: 82.0%) and 23.0% involving the dentin (1996: 30.7%). Vestibular erosive tooth wear without dentin involvement was seen in 7.7% in 2006 vs. 14.4% in 1996. Vestibular erosive tooth wear with dentin involvement was rare in both years (0.5%). Oral erosive tooth wear without exposed dentin was also rare in both years, although more teeth were affected in 2006 (2.1%) than in 1996 (0.7%). The examinations in 2006 found one or more initial wedge-shaped lesions in 8.5% of the recruits, while 20.4% of the study participants exhibited such lesions in 1996. In 1996, 53% consumed acidic foods and beverages more than 5 times/day; in 2006, 83.9% did so. In neither study did multivariate regression analyses show any significant correlations between the occurrence and location of erosive tooth wear and wedge-shaped defects and various other parameters, e.g., eating and hygiene habits or dentin hypersensitivity. Despite a significant increase in the consumption of acidic products between 1996 and 2006, the latter study found both less erosive tooth wear and fewer wedge-shaped defects (i.e., fewer non-carious lesions).
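As a rough illustration of the kind of multivariable regression analysis mentioned above, the sketch below fits a logistic regression of lesion occurrence on self-reported habits. The file name, column names and covariate set are hypothetical, not the authors' specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical recruit-level data; file and column names are illustrative only.
recruits = pd.read_csv("recruits_2006.csv")

# Logistic regression of lesion occurrence on eating and hygiene habits.
fit = smf.logit(
    "occlusal_erosion ~ acidic_intake_per_day + brushing_per_day + dentin_hypersensitivity",
    data=recruits,
).fit()
print(np.exp(fit.params))  # odds ratios
print(fit.pvalues)         # the study above found no significant associations
```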

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Patients with electrolyte imbalances or disorders have a high risk of mortality. It is unknown whether this finding, established for sodium and potassium disorders, extends to alterations of magnesium levels. METHODS AND PATIENTS In this cross-sectional analysis, all emergency room patients between 2010 and 2011 at the Inselspital Bern, Switzerland, were included. A multivariable logistic regression model was performed to assess the association between magnesium levels and in-hospital mortality up to 28 days. RESULTS A total of 22,239 subjects were screened for the study. A total of 5339 patients had plasma magnesium concentrations measured at hospital admission and were included in the analysis. 6.3% of the 352 patients with hypomagnesemia and 36.9% of the 151 patients with hypermagnesemia died. In a multivariate Cox regression model, hypermagnesemia (HR 11.6, p<0.001) was a strong independent risk factor for mortality. In these patients, diuretic therapy proved protective (HR 0.5, p=0.007). Hypomagnesemia was not associated with mortality (p>0.05). Age was an independent risk factor for mortality (p<0.001). CONCLUSION The study demonstrates a possible association between hypermagnesemia measured on admission to the emergency department and early in-hospital mortality.
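A minimal sketch of a multivariable proportional-hazards analysis of this kind is shown below, using the lifelines library. The magnesium cut-offs, file name and column names are assumptions for illustration, not the study's actual definitions.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical admission-level data; file, columns and cut-offs are assumptions.
df = pd.read_csv("ed_admissions.csv")
df["hypo_mg"] = (df["magnesium"] < 0.65).astype(int)   # mmol/L, assumed cut-off
df["hyper_mg"] = (df["magnesium"] > 1.05).astype(int)  # mmol/L, assumed cut-off

# Cox model for in-hospital mortality, follow-up censored at 28 days.
cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "died", "hypo_mg", "hyper_mg", "age", "diuretic_therapy"]],
    duration_col="followup_days",
    event_col="died",
)
cph.print_summary()  # the exp(coef) column corresponds to hazard ratios such as the HR 11.6 above
```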

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Phosphate imbalances or disorders are associated with a high risk of morbidity and mortality in patients with chronic kidney disease. It is unknown whether this finding extends to mortality in patients presenting at an emergency room with or without normal kidney function. METHODS AND PATIENTS This cross-sectional analysis included all emergency room patients between 2010 and 2011 at the Inselspital Bern, Switzerland. A multivariable Cox regression model was applied to assess the association between phosphate levels and in-hospital mortality up to 28 days. RESULTS 22,239 subjects were screened for the study. Plasma phosphate concentrations were measured in 2,390 patients on hospital admission, and these patients were included in the analysis. 3.5% of the 480 patients with hypophosphatemia and 10.7% of the 215 patients with hyperphosphatemia died. In univariate analysis, phosphate levels were associated with mortality, age, diuretic therapy and kidney function (all p<0.001). In a multivariate Cox regression model, hyperphosphatemia (OR 3.29, p<0.001) was a strong independent risk factor for mortality. Hypophosphatemia was not associated with mortality (p>0.05). CONCLUSION Hyperphosphatemia is associated with 28-day in-hospital mortality in an unselected cohort of patients presenting to an emergency room.
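The sketch below shows one way such a multivariable Cox-type analysis could be set up with statsmodels' proportional-hazards regression. The phosphate thresholds, file name and covariates are illustrative assumptions only.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data; file, columns and phosphate cut-offs are assumptions.
df = pd.read_csv("ed_phosphate.csv")
df["hypo_p"] = (df["phosphate"] < 0.80).astype(int)   # mmol/L, assumed cut-off
df["hyper_p"] = (df["phosphate"] > 1.45).astype(int)  # mmol/L, assumed cut-off

# Proportional-hazards model for 28-day in-hospital mortality.
ph = smf.phreg(
    "followup_days ~ hypo_p + hyper_p + age + diuretic_therapy + egfr",
    data=df,
    status=df["died"].values,  # 1 = died within 28 days, 0 = censored
    ties="breslow",
).fit()
print(ph.summary())
```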

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data about 25 state firearm laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirements for firearms (0·16 [0·09-0·29]; p<0·0001). Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally, could substantially reduce firearm mortality in the USA. FUNDING None.
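A hedged sketch of a state-level Poisson regression with robust (sandwich) standard errors, the approach named in the methods above, is given below. The file and variable names are hypothetical and the covariate set is only indicative of the adjustments described.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical state-level data (one row per state); names are illustrative only.
states = pd.read_csv("state_firearm_2010.csv")

# Poisson regression of firearm death counts with population as exposure and
# heteroskedasticity-robust standard errors; exp(coef) gives incidence rate ratios.
fit = smf.glm(
    "firearm_deaths ~ universal_bg_check + ammunition_bg_check + firearm_id_req"
    " + ownership_rate + unemployment_rate + nonfirearm_homicide_rate",
    data=states,
    family=sm.families.Poisson(),
    exposure=states["population"],
).fit(cov_type="HC1")

print(pd.concat([np.exp(fit.params), fit.pvalues], axis=1, keys=["IRR", "p"]))
```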

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Cardiovascular diseases are the leading cause of death worldwide and in Switzerland. When applied, treatment guidelines for patients with acute ST-segment elevation myocardial infarction (STEMI) improve the clinical outcome and should eliminate treatment differences by sex and age for patients whose clinical situations are identical. In Switzerland, the rate at which STEMI patients receive revascularization may vary by patient and hospital characteristics. AIMS: To examine all hospitalizations in Switzerland from 2010-2011 to determine whether patient or hospital characteristics affected the rate of revascularization (receiving either a percutaneous coronary intervention or coronary artery bypass grafting) in acute STEMI patients. DATA AND METHODS: We used national data sets on hospital stays and on hospital infrastructure and operating characteristics for the years 2010 and 2011 to identify all emergency patients admitted with the main diagnosis of acute STEMI. We then calculated the proportion of patients who were treated with revascularization. We used multivariable multilevel Poisson regression to determine whether receipt of revascularization varied by patient and hospital characteristics. RESULTS: Of the 9,696 cases we identified, 71.6% received revascularization. Patients were less likely to receive revascularization if they were female and if they were 80 years or older. In the multivariable multilevel Poisson regression analysis, there was a trend toward small-volume hospitals performing fewer revascularizations, but this was not statistically significant, whereas being female (relative proportion = 0.91, 95% CI: 0.86 to 0.97) and being older than 80 years remained associated with less frequent revascularization. CONCLUSION: Female and older patients were less likely to receive revascularization. Further research needs to clarify whether this reflects differential application of treatment guidelines or limitations of this kind of routine data.
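One way to approximate the multilevel Poisson model described above is a Poisson GEE with clustering by hospital, sketched below. This is a simplification, not the authors' exact specification, and all file and column names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical patient-level data, one row per STEMI admission; names are assumed.
stemi = pd.read_csv("stemi_2010_2011.csv")

# Poisson regression on the binary outcome yields relative proportions (risk ratios);
# GEE clustering by hospital stands in for the paper's multilevel model.
fit = smf.gee(
    "revascularised ~ female + age_80_plus + small_volume_hospital",
    groups="hospital_id",
    data=stemi,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(fit.params))  # e.g. a value near 0.91 for female would match the study above
```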

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Double-checking is widely recommended as an essential method to prevent medication errors. However, prior research has shown that the concept of double-checking is not clearly defined and that little is known about actual practice in oncology, for example, what kinds of checking procedures are applied. OBJECTIVE To study the practice of different double-checking procedures in chemotherapy administration and to explore nurses' experiences, for example, how often they actually find errors using a certain procedure. General evaluations regarding double-checking were also assessed, for example, the frequency of interruptions during and caused by a check, and what is regarded as its essential feature. METHODS In a cross-sectional survey, qualified nurses working in the oncology departments of 3 hospitals were asked to rate 5 different scenarios of double-checking procedures on dimensions such as frequency of use in practice and appropriateness for preventing medication errors; they were also asked general questions about double-checking. RESULTS Overall, 274 nurses (70% response rate) participated in the survey. The procedure of jointly double-checking (read-read back) was most commonly used (69% of respondents) and was rated as very appropriate for preventing medication errors. Jointly checking medication was seen as the essential characteristic of double-checking more frequently than 'carrying out checks independently' (54% vs 24%). Most nurses (78%) found the frequency of double-checking in their department appropriate. Being interrupted in one's own current activity to support a double-check was reported to occur frequently. Regression analysis revealed a strong preference towards checks that are currently implemented at the respondents' workplace. CONCLUSIONS Double-checking is well regarded by oncology nurses as a procedure to help prevent errors, with joint checking being used most frequently. Our results show that the notion of independent checking needs to be transferred more actively into clinical practice. The high frequency of reported interruptions during and caused by double-checks is of concern.
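The regression mentioned in the results could look roughly like the sketch below: a linear mixed model of appropriateness ratings with a random intercept per nurse, since every respondent rated all five scenarios. The data layout and variable names are hypothetical, not the authors' analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format ratings: one row per nurse x scenario; names are illustrative.
ratings = pd.read_csv("double_check_ratings.csv")

# Random intercept per nurse; a positive coefficient on implemented_at_workplace
# would reflect the preference for currently implemented checks reported above.
fit = smf.mixedlm(
    "appropriateness ~ implemented_at_workplace",
    data=ratings,
    groups="nurse_id",
).fit()
print(fit.summary())
```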

Relevance:

100.00%

Publisher:

Abstract:

Healthy replacement heifers are one of the foundations of a healthy dairy herd. Farm management and rearing systems in Switzerland provide a wide variety of factors that could potentially be associated with intramammary infections (IMI) in early lactating dairy heifers. In this study, IMI with minor mastitis pathogens such as coagulase-negative staphylococci (CNS), contagious pathogens, and environmental major pathogens were identified. Fifty-four dairy farms were enrolled in the study. A questionnaire was used to collect herd-level data on housing, management and welfare of young stock during farm visits and interviews with the farmers. Cow-level data such as breed, age at first calving, udder condition and swelling, and calving ease were also recorded. Data were also collected about young stock that spent a period of at least 3 months on an external rearing farm or on a seasonal alpine farm. At the quarter level, teat conditions such as teat lesions, teat dysfunction, presence of a papilloma and teat length were recorded. Within 24 h after parturition, samples of colostral milk from 1564 quarters (391 heifers) were collected aseptically for bacterial culture. Positive bacteriological culture results were found in 49% of quarter samples. Potential risk factors for IMI were identified at the quarter, animal and herd level using multivariable and multilevel logistic regression analysis. At the herd level, tie-stalls, and at the cow level, the breed category "Brown cattle" were risk factors for IMI caused by contagious major pathogens such as Staphylococcus aureus (S. aureus). At the quarter level, teat swelling and teat lesions were highly associated with IMI caused by environmental major pathogens. At the herd level, heifer rearing at external farms was associated with less IMI caused by major environmental pathogens. Keeping pregnant heifers in a separate group was negatively associated with IMI caused by CNS. The odds of IMI with coagulase-negative staphylococci increased if weaning age was less than 4 months and if concentrates were fed to calves younger than 2 weeks. This study identified herd-, cow- and quarter-level risk factors that may be important for IMI prevention in the future.
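A hedged sketch of this type of risk-factor analysis is shown below, using a logistic GEE clustered at the herd level as a simplification of the multilevel (quarter-within-cow-within-herd) model described above. File and column names are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical quarter-level records (one row per quarter); column names are assumed.
quarters = pd.read_csv("heifer_quarters.csv")

# Logistic GEE clustered by herd as a simplified stand-in for the
# multilevel logistic regression used in the study above.
fit = smf.gee(
    "imi_major_pathogen ~ tie_stall + breed_brown + teat_lesion + teat_swelling"
    " + external_rearing",
    groups="herd_id",
    data=quarters,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(fit.params))  # odds ratios for the risk factors discussed above
```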

Relevance:

100.00%

Publisher:

Abstract:

Disc degeneration, usually associated with low back pain and changes of intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for disc stiffness. Thus, the aim of this work is to propose a nonlinear normalization based on finite element (FE) simulations and to evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of the disc's geometry on its compliance and to establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and the pre-established relation with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness before and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of the disc geometry on its stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CV after normalization. Since both geometry and material properties affect the mechanical response, they can also compensate for one another. Therefore, the larger CV after normalization can be interpreted as a strong variability of the material properties, previously hidden by the geometry's own influence. In conclusion, a new normalization protocol for intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending was proposed, with the possible use of MRI and FE to acquire the discs' anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful to relate the disc's mechanical properties to its degree of degeneration.
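A minimal sketch of what such a nonlinear normalization could look like is given below: a power-law stiffness/morphology relation is fitted to the FE results and then divided out of the measured stiffness, with the coefficient of variation compared before and after. The functional form, file names and column names are assumptions, not the authors' implementation.

```python
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

# Hypothetical FE parameter-study results: compressive stiffness k for 20 model
# discs of known cross-sectional area (CSA) and height (H); names are illustrative.
fe = pd.read_csv("fe_parameter_study.csv")

# Assumed nonlinear stiffness/morphology relation (power law), fitted to the FE data.
def k_model(X, a, p, q):
    csa, h = X
    return a * csa**p / h**q

(a, p, q), _ = curve_fit(
    k_model, (fe["CSA"].values, fe["H"].values), fe["k_compression"].values
)

# Normalize the measured specimen stiffness by the geometric contribution (MRI morphology).
spec = pd.read_csv("specimen_mri_morphology.csv")
k_norm = spec["k_compression"] / k_model((spec["CSA"].values, spec["H"].values), a, p, q)

cv = lambda x: np.std(x, ddof=1) / np.mean(x)
print(cv(spec["k_compression"]), cv(k_norm))  # CV before vs. after normalization
```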