940 results for linear model


Relevance: 60.00%

Abstract:

Evidence of the relationship between altered cognitive function and depleted Fe status is accumulating in women of reproductive age, but the degree of Fe deficiency associated with negative neuropsychological outcomes needs to be delineated. Data are limited regarding this relationship in university women, in whom optimal cognitive function is critical to academic success. The aim of the present study was to examine the relationship between body Fe, in the absence of Fe-deficiency anaemia, and neuropsychological function in young college women. Healthy, non-anaemic undergraduate women (n 42) provided a blood sample and completed a standardised cognitive test battery consisting of one manual task (Tower of London (TOL), a measure of central executive function) and five computerised tasks (Bakan vigilance task, mental rotation, simple reaction time, immediate word recall and two-finger tapping). Women's body Fe ranged from −4·2 to 8·1 mg/kg. General linear model ANOVA revealed a significant effect of body Fe on TOL planning time (P = 0.002). Spearman's correlation coefficients showed a significant inverse relationship between body Fe and TOL planning time for move categories 4 (r = −0.39, P = 0.01) and 5 (r = −0.47, P = 0.002). Performance on the computerised cognitive tasks was not affected by body Fe level. These findings suggest that Fe status in the absence of anaemia is positively associated with central executive function in otherwise healthy college women. Copyright © The Authors 2012.
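For readers who want to reproduce this kind of analysis, the following is a minimal Python sketch of a general linear model ANOVA and a Spearman correlation of the sort reported above; the data frame and variable names are hypothetical and are not taken from the study.

```python
import pandas as pd
import scipy.stats as stats
import statsmodels.formula.api as smf

# Hypothetical data frame: one row per participant, with body iron (mg/kg)
# and Tower of London planning time (s) for move categories 4 and 5.
df = pd.DataFrame({
    "body_fe":        [1.2, -0.5, 3.4, 0.8, 2.1, -1.3, 4.0, 0.2],
    "tol_planning_4": [12.1, 15.3, 9.8, 13.0, 11.2, 16.5, 8.9, 14.1],
    "tol_planning_5": [18.4, 21.0, 15.2, 19.1, 17.3, 22.8, 14.6, 20.2],
})

# General linear model: does body iron predict TOL planning time?
model = smf.ols("tol_planning_4 ~ body_fe", data=df).fit()
print(model.summary())

# Spearman rank correlation between body iron and planning time
rho, p = stats.spearmanr(df["body_fe"], df["tol_planning_4"])
print(f"Spearman rho = {rho:.2f}, P = {p:.3f}")
```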

Relevance: 60.00%

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
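As an illustration of the core idea, the sketch below propagates normally distributed measurement and fixture errors through a linearized transmission function; the Jacobian and error magnitudes are assumed values for illustration, not the paper's actual model.

```python
import numpy as np

# Illustrative linear error propagation (not the paper's actual model):
# reference-point coordinates = theoretical value + normally distributed error,
# and the component pose is a linear function p = J @ x of those coordinates.
J = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.3]])          # hypothetical linearized transmission matrix

sigma_meas = 0.02    # measurement error std dev (mm), assumed
sigma_fix  = 0.05    # fixture error std dev (mm), assumed
cov_in = np.diag([sigma_meas**2, sigma_meas**2, sigma_fix**2])

# Propagate the input covariance through the linear model
cov_out = J @ cov_in @ J.T
print("pose covariance:\n", cov_out)
print("pose std devs:", np.sqrt(np.diag(cov_out)))
```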

Relevance: 60.00%

Abstract:

Generalizing from his experience in solving practical problems, Koopmans set about devising the linear activity analysis model. Surprisingly, he found that the economics of his day possessed no uniform, sufficiently exact theory of production or system of concepts for it. In a pioneering study he therefore also laid down, as a theoretical framework for the linear activity analysis model, the foundations of an axiomatic production theory resting on the concept of technological sets. He is credited with the exact definition of the concepts of production efficiency and efficiency prices, and with the proof of their mutually conditioning relationship within the linear activity analysis model. Koopmans treated the present, purely technical definition of efficiency only as a special case; his aim was to introduce and analyse the concept of economic efficiency. This study uses the duality theorems of linear programming to reconstruct his results on the latter. It is shown, first, that his proofs are equivalent to proving the duality theorems of linear programming and, second, that the economic efficiency prices are in fact shadow prices in today's sense. It is also pointed out that his model for interpreting economic efficiency can be regarded as a direct predecessor of the Arrow–Debreu–McKenzie models of general equilibrium theory, containing almost every essential element and concept of them: the equilibrium prices are nothing other than Koopmans' efficiency prices. Finally, Koopmans' model is reinterpreted as a possible tool for the microeconomic description of the firm's technology. Journal of Economic Literature (JEL) codes: B23, B41, C61, D20, D50.
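A small numerical illustration of the link between efficiency prices and linear programming duality: in the toy activity-analysis problem below, the dual variables (shadow prices) of the resource constraints play the role of Koopmans' efficiency prices. The problem data are invented, and reading the duals from SciPy's HiGHS result (`res.ineqlin.marginals`) is a detail of that solver, not of Koopmans' model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy activity-analysis problem (illustrative only): choose activity levels x >= 0
# to maximize the value of net outputs c @ x subject to resource limits A @ x <= b.
c = np.array([3.0, 5.0])          # unit values of two activities (assumed)
A = np.array([[1.0, 2.0],         # resource requirements per unit of activity
              [3.0, 1.0]])
b = np.array([14.0, 18.0])        # available resource quantities

# linprog minimizes, so negate c to maximize
res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)], method="highs")

print("efficient activity levels:", res.x)
# The dual variables (shadow prices) of the resource constraints are the
# "efficiency prices": the marginal value of one extra unit of each resource.
print("shadow / efficiency prices:", -res.ineqlin.marginals)
```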

Relevance: 60.00%

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
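A compact sketch of the principal components step on simulated yield curve shifts (the dissertation works from actual Treasury data); the maturities, covariance structure, and sample size below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily yield-curve shifts: rows = days, columns = maturities.
# Simulated data stand in for the Treasury observations used in the dissertation.
n_days, n_maturities = 500, 10
dist = np.abs(np.subtract.outer(np.arange(n_maturities), np.arange(n_maturities)))
cov_true = 0.01 * np.exp(-dist / 3)               # assumed smooth cross-maturity covariance
shifts = rng.multivariate_normal(np.zeros(n_maturities), cov_true, size=n_days)

# Principal components analysis: diagonalize the covariance matrix of the shifts.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Share of total interest-rate variation captured by the leading components
explained = np.cumsum(eigvals) / eigvals.sum()
print("variance explained by first 6 components:", explained[5])
```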

Relevance: 60.00%

Abstract:

This dissertation studied the determinants and consequences of corporate reputation. It explored how firm-, industry-, and country-level factors influence the general public's assessment of a firm's reputation and how this reputation assessment impacted the firm's strategic actions and organizational outcomes. The three empirical essays are grounded in separate theoretical paradigms in strategy, organizational theory, and corporate governance. The first essay used signaling theory to investigate firm-, industry-, and country-level determinants of individual-level corporate reputation assessments. Using a hierarchical linear model, it tested the theory based on individual evaluations of the largest companies across countries. Results indicated that variables at multiple levels of analysis simultaneously impact individual-level reputation assessments. Interactions were also found between industry- and country-level factors. Results confirmed the multi-level nature of signaling influences on reputation assessments. Building on a stakeholder-power approach to corporate governance, the second essay studied how differences in the power and preferences of three stakeholder groups—shareholders, creditors, and workers—across countries influence the general public's reputation assessments of corporations. Examining the largest companies across countries, the study found that while the influence of stock market return is stronger in societies where shareholders have more power, social performance has a more significant role in shaping reputation evaluations in societies with stronger labor rights. Unexpectedly, when creditors have greater power, the influence of financial stability on reputation assessment becomes weaker. Exploring the consequences of reputation, the third essay investigated the specific effects of intangible assets on strategic actions and organizational outcomes. In particular, it studied the individual impacts of acquirer acquisition experience, corporate reputation, and approach toward social responsibilities, as well as their combined effect, on market reactions to acquisition announcements. Using an event study of acquisition announcements, it confirmed the significant impacts of both action-specific (acquisition experience) and general (reputation and social performance) intangible assets on market expectations of acquisition outcomes. Moreover, the analysis demonstrated that reputation magnifies the impact of acquisition experience on market response to acquisition announcements. In conclusion, this dissertation sought to advance and extend the application of management and organizational theories by explaining the mechanisms underlying the antecedents and consequences of corporate reputation.
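For the hierarchical (multi-level) modeling step in the first essay, a minimal sketch of a random-intercept model is shown below; the variables and the simulated data are hypothetical stand-ins for the actual reputation survey.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Hypothetical multi-level data: individual reputation ratings nested in countries,
# with a firm-level predictor (financial performance). Purely illustrative.
n = 300
df = pd.DataFrame({
    "country": rng.choice(["A", "B", "C", "D"], size=n),
    "firm_performance": rng.normal(size=n),
})
country_effect = df["country"].map({"A": 0.5, "B": -0.2, "C": 0.1, "D": -0.4})
df["reputation"] = 3.0 + 0.6 * df["firm_performance"] + country_effect + rng.normal(scale=0.5, size=n)

# Hierarchical (mixed-effects) linear model: random intercept per country
model = smf.mixedlm("reputation ~ firm_performance", data=df, groups=df["country"]).fit()
print(model.summary())
```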

Relevance: 60.00%

Abstract:

Chronic disease affects 80% of adults over the age of 65 and is expected to increase in prevalence. To address the burden of chronic disease, self-management programs have been developed to increase self-efficacy and improve quality of life by reducing or halting disease symptoms. Two programs that have been developed to address chronic disease are the Chronic Disease Self-Management Program (CDSMP) and Tomando Control de su Salud (TCDS). CDSMP and TCDS both focus on improving participant self-efficacy, but use different curricula, as TCDS is culturally tailored for the Hispanic population. Few studies have evaluated the effectiveness of CDSMP and TCDS when translated to community settings. In addition, little is known about the correlation between demographic, baseline health status, and psychosocial factors and completion of either CDSMP or TCDS. This study used secondary data collected by agencies of the Healthy Aging Regional Collaborative from 10/01/2008 to 12/31/2010. The aims of this study were to examine six-week differences in self-efficacy, time spent performing physical activity, and social/role activity limitations, and to identify correlates of program completion using baseline demographic and psychosocial factors. To examine whether differences existed, a general linear model was used. Additionally, logistic regression was used to examine correlates of program completion. Study findings show that all measures improved at week six. For CDSMP, self-efficacy to manage disease (p = .001), self-efficacy to manage emotions (p = .026), social/role activity limitations (p = .001), and time spent walking (p = .008) were statistically significant. For TCDS, self-efficacy to manage disease (p = .006), social/role activity limitations (p = .001), and time spent walking (p = .016) and performing other aerobic activity (p = .005) were significant. For CDSMP, no correlates predicting program completion were found to be significant. For TCDS, participants who were male (OR = 2.3, 95% CI: 1.15–4.66), from Broward County (OR = 2.3, 95% CI: 1.27–4.25), or living alone (OR = 2.0, 95% CI: 1.29–3.08) were more likely to complete the program. CDSMP and TCDS, when implemented through a collaborative effort, can result in improvements for participants. Effective chronic disease management can improve health and quality of life and reduce health care expenditures among older adults.
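A minimal sketch of the logistic-regression step used to identify correlates of program completion, with odds ratios obtained by exponentiating the coefficients; the simulated participant data below are illustrative and do not reproduce the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Hypothetical participant-level data (not the collaborative's dataset): did the
# participant complete the program, by sex, county, and living arrangement?
n = 400
df = pd.DataFrame({
    "male":        rng.integers(0, 2, n),
    "broward":     rng.integers(0, 2, n),
    "lives_alone": rng.integers(0, 2, n),
})
logit_p = -0.3 + 0.8 * df["male"] + 0.8 * df["broward"] + 0.7 * df["lives_alone"]
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression; exponentiated coefficients are odds ratios
fit = smf.logit("completed ~ male + broward + lives_alone", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```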

Relevance: 60.00%

Abstract:

The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years. This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
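A rough sketch of the kind of reliability calculation involved: fit a life distribution to de-accelerated failure times and read off the age by which only 20% of specimens have failed (the 80%-reliability life). The Weibull family, the acceleration factor, and the data below are assumptions for illustration, not the study's fitted model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical accelerated-life test results: times to failure (in equivalent years
# of normal use) for an encapsulant, after converting stressed time with an assumed
# acceleration factor. Purely illustrative numbers.
acceleration_factor = 12.0
failure_times_stress = rng.weibull(2.5, size=30) * 2.0      # years under stress
failure_times_normal = failure_times_stress * acceleration_factor

# Fit a two-parameter Weibull life distribution (location fixed at 0)
shape, loc, scale = stats.weibull_min.fit(failure_times_normal, floc=0)

# 80% reliability time: the age by which only 20% of specimens have failed
t_80 = stats.weibull_min.ppf(0.20, shape, loc=loc, scale=scale)
print(f"estimated 80%-reliability life: {t_80:.1f} years")
```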

Relevance: 60.00%

Abstract:

Prior to 2000, there were fewer than 1.6 million students enrolled in at least one online course. By fall 2010, student enrollment in online distance education had shown a phenomenal 283% increase to 6.1 million. Two years later, this number had grown to 7.1 million. In light of this significant growth and skepticism about quality, there have been calls for greater oversight of this format of educational delivery. Accrediting bodies tasked with this oversight have developed guidelines and standards for online education. There is a lack of empirical studies that examine the relationship between accrediting standards and student success. The purpose of this study was to examine the relationship between student success and the presence in online courses of two Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) standards for online education: (a) student support services and (b) curriculum and instruction. An original 24-item survey with an overall reliability coefficient of .94 was administered to students (N = 464) at Florida International University, enrolled in 24 university-wide undergraduate online courses during fall 2014, who rated the presence of these standards in their online courses. The general linear model was utilized to analyze the data. The results of the study indicated that the two standards, student support services and curriculum and instruction, were both significantly and positively correlated with student success, but with small R² values and strengths of association less than .35 and .20, respectively. Mixed results were produced from chi-square tests for differences in student success between higher- and lower-rated online courses when controlling for various covariates such as discipline, gender, race/ethnicity, GPA, age, and number of online courses previously taken. A multiple linear regression analysis revealed that the curriculum and instruction standard was the only variable that accounted for a significant amount of unique variance in student success. Another regression test revealed that no significant interaction effect exists between the two SACSCOC standards and GPA in predicting student success. The results of this study are useful for administrators, faculty, and researchers who are interested in accreditation standards for online education and how these standards relate to student success.
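A minimal sketch of the multiple linear regression step relating the two standards to student success; the variables and simulated ratings below are hypothetical, not the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Hypothetical survey data: students' ratings of the two accreditation standards in
# their online course and a success measure (e.g., final grade). Illustrative only.
n = 200
df = pd.DataFrame({
    "support_services":       rng.uniform(1, 5, n),
    "curriculum_instruction": rng.uniform(1, 5, n),
})
df["success"] = (60 + 2.0 * df["curriculum_instruction"]
                 + 0.5 * df["support_services"] + rng.normal(scale=8, size=n))

# Multiple linear regression: which standard accounts for unique variance in success?
fit = smf.ols("success ~ support_services + curriculum_instruction", data=df).fit()
print(fit.summary())
```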

Relevance: 60.00%

Abstract:

Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as the interference among competing workloads, makes it difficult to understand the VMs' resource demands for meeting their Quality of Service (QoS) targets. Second, the dynamics in the applications and the system make it difficult to maintain the desired QoS target while the environment changes. Third, the transparency of virtualization presents a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation proposes to address the above challenges through fuzzy-modeling and control-theory based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources automatically, online, based on the observed workload and resource usages. Second, to enable fast adaptation of resource management, the fuzzy modeling approach is integrated with a predictive-control-based controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track the applications' QoS targets and optimize the resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve the application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets for an oversubscribed system. It is able to manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when the resources are contended by dynamic workloads.
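To make the control-loop idea concrete, the sketch below adjusts a single VM's CPU cap toward a response-time target with a simple proportional rule; this is only a toy stand-in for the dissertation's fuzzy-model predictive controller, and the workload model, target, and gain are invented.

```python
# A deliberately simplified, illustrative feedback controller for one VM's CPU cap.
# This is not the dissertation's FMPC approach: the fuzzy resource model is replaced
# here by a crude synthetic workload model purely to show the control-loop structure.

def measure_response_time(cpu_cap: float) -> float:
    """Stand-in for observing application QoS (response time in ms) at a given cap."""
    return 200.0 / max(cpu_cap, 0.05)          # hypothetical workload behaviour

def control_step(cpu_cap: float, target_ms: float, gain: float = 0.002) -> float:
    """Move the CPU cap proportionally to the QoS error, clamped to [0.05, 1.0]."""
    error = measure_response_time(cpu_cap) - target_ms
    new_cap = cpu_cap + gain * error
    return min(max(new_cap, 0.05), 1.0)

cap = 0.2
for step in range(10):
    cap = control_step(cap, target_ms=250.0)
    print(f"step {step}: cap={cap:.3f}, response={measure_response_time(cap):.1f} ms")
```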

Relevance: 60.00%

Abstract:

In this thesis, four different methods were used to diagnose precipitation extremes over Northeastern Brazil (NEB): generalized linear models via logistic and Poisson regression, extreme value theory via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and vectorial generalized linear models via GEV (MVLG GEV). The logistic and Poisson regression models were used to identify interactions between the precipitation extremes and other variables based on odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern and semi-arid NEB, while relative humidity played that role over southern NEB. The GEV and GPD distributions (based on the 95th percentile) showed that the location and scale parameters reached their maxima along the eastern and northern coast of NEB; the GEV also showed a maximum core over western Pernambuco, influenced by weather systems and topography. For the GEV and GPD shape parameter, the data in most regions were fitted by the negative Weibull and Beta distributions (ξ < 0), respectively. The GEV (GPD) return levels and periods indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) within the next 30 years. The MVLG GEV model found that the zonal and meridional wind components, evaporation, and Atlantic and Pacific sea surface temperatures boost the precipitation extremes. Its GEV parameters show the following results: (a) location (μ), with the highest value of 88.26 ± 6.42 mm over northern Maranhão; (b) scale (σ), positive in most regions except southern Maranhão; and (c) shape (ξ), with most of the selected regions fitted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia show greater accuracy. For the return level, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day within the next 30 years.
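A short sketch of the GEV fitting and return-level calculation used in such analyses, with simulated annual maxima in place of the NEB station data; note that SciPy's genextreme parameterizes the shape as c = -ξ relative to the climate convention used above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical series of annual maximum daily precipitation (mm/day) for one station;
# simulated values stand in for the observed NEB series used in the thesis.
annual_maxima = rng.gumbel(loc=80, scale=25, size=40)

# Fit a GEV distribution (scipy's genextreme uses c = -xi in the usual climate convention)
c, loc, scale = stats.genextreme.fit(annual_maxima)
print(f"GEV fit: xi = {-c:.2f}, location = {loc:.1f}, scale = {scale:.1f}")

# 30-year return level: the daily total exceeded on average once every 30 years
return_period = 30
return_level = stats.genextreme.ppf(1 - 1 / return_period, c, loc=loc, scale=scale)
print(f"{return_period}-year return level: {return_level:.1f} mm/day")
```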

Relevance: 60.00%

Abstract:

CHAPTER 1 - This study histologically evaluated two implant designs, a classic thread design versus another specifically designed for healing chamber formation, placed with two drilling protocols. Forty dental implants (4.1 mm diameter) with two different macrogeometries were inserted in the tibia of 10 Beagle dogs, and maximum insertion torque was recorded. The drilling techniques were: up to 3.75 mm diameter (regular group) and up to 4.0 mm diameter (overdrilling group), for both implant designs. At 2 and 4 weeks, samples were retrieved and processed for histomorphometric analysis. For torque, BIC (bone-to-implant contact) and BAFO (bone area fraction occupied), a general linear model was employed with instrumentation technique and time in vivo as independent variables. The insertion torque recorded for each implant design and drilling group significantly decreased as a function of increasing drilling diameter for both implant designs (p<0.001). No significant differences were detected between implant designs for each drilling technique (p>0.18). A significant increase in BIC from 2 to 4 weeks was observed only for implants placed with the overdrilling technique (p<0.03), but not for those placed in the 3.75 mm drilling sites (p>0.32). Despite the differences between implant designs and drilling techniques, an intramembranous-like healing mode with newly formed woven bone prevailed. CHAPTER 2 - The objective of this preliminary histologic study was to determine whether different drilling protocols (oversized, intermediate, undersized drilling) produce different biologic responses at the early healing period of 2 weeks in vivo in a beagle dog model. Ten beagle dogs were acquired and subjected to surgeries in the tibia 2 weeks before euthanasia. During surgery, 3 implants, 4 mm in diameter by 10 mm in length, were placed in bone sites drilled to 3.5 mm, 3.75 mm, and 4.0 mm in final diameter. The insertion and removal torque was recorded for all samples. Statistical significance was set at the 95% level of confidence, and the number of dogs was considered the statistical unit for all comparisons. For torque, BIC and BAFO, a general linear model was employed with instrumentation technique and time in vivo as independent variables. Overall, the insertion torque increased as a function of decreasing drilling diameter from 4.0 mm, to 3.75 mm, to 3.5 mm, with a significant difference in torque levels between all groups (p<0.001). Statistical assessment of BIC and BAFO showed significantly higher values for the 3.75 mm (recommended) drilling group relative to the other two groups (p<0.001). Different drilling dimensions resulted in variations in insertion torque values (primary stability), and a different pattern of healing and interfacial remodeling was observed for the different groups. CHAPTER 3 - The present study evaluated the effect of different drilling dimensions (undersized, regular, and oversized) on the insertion and removal torques of dental implants in a beagle dog model. Six beagle dogs were acquired and subjected to bilateral surgeries in the radii 1 and 3 weeks before euthanasia. During surgery, 3 implants, 4 mm in diameter by 10 mm in length, were placed in bone sites drilled to 3.2 mm, 3.5 mm, and 3.8 mm in final diameter. The insertion and removal torque was recorded for all samples. Statistical analysis was performed by paired t-tests for repeated measures and by t-tests assuming unequal variances (all at the 95% level of significance). Overall, the insertion and removal torque levels obtained were inversely proportional to the drilling dimension, with a significant difference detected between the 3.2 mm and 3.5 mm groups relative to the 3.8 mm group (P < 0.03). Although paired insertion torque–removal torque observations were statistically maintained for the 3.5 mm and 3.8 mm groups, a significant decrease in removal torque values relative to insertion torque levels was observed for the 3.2 mm group. A different pattern of healing and interfacial remodeling was observed for the different groups. Different drilling dimensions resulted in variations in insertion torque values (primary stability) and in stability maintenance over the first weeks of bone healing.
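A minimal sketch of the paired t-test comparison of insertion and removal torque described in Chapter 3; the torque values below are simulated for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical paired insertion/removal torque values (N·cm) for one drilling group;
# illustrative numbers, not the study's measurements.
insertion_torque = rng.normal(loc=45, scale=8, size=12)
removal_torque = insertion_torque - rng.normal(loc=10, scale=4, size=12)

# Paired t-test for repeated measures: did torque change from insertion to removal?
t_stat, p_value = stats.ttest_rel(insertion_torque, removal_torque)
print(f"paired t = {t_stat:.2f}, P = {p_value:.4f}")
```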

Relevance: 60.00%

Abstract:

Pupil light reflex can be used as a non-invasive ocular predictor of cephalic autonomic nervous system integrity. The spectral sensitivity of the pupil's response to light has, for some time, been an interesting issue. It has generally, however, only been investigated with the use of white light, and studies with monochromatic wavelengths are scarce. This study investigates the effects of wavelength and age on three parameters of the pupil light reflex (amplitude of response, latency, and velocity of constriction) in a large sample of younger and older adults (N = 97), in mesopic conditions. Subjects were exposed to a single light stimulus at four different wavelengths: white (5600 K), blue (450 nm), green (510 nm), and red (600 nm). Data were analyzed using, as applicable, the general linear model (GLM), a randomized complete block design (RCBD), Student's t-test, and/or ANCOVA. Across all subjects, the pupillary response to light had the greatest amplitude and shortest latency in the white and green light conditions. With regard to age, older subjects (46-78 years) showed an increased latency in white light and a decreased velocity of constriction in green light compared to younger subjects (18-45 years). This study provides data on parameters of wavelength-dependent pupil reflexes to light in adults, and it contributes to the large body of pupillometric research. It is hoped that this study will add to the overall evaluation of cephalic autonomic nervous system integrity.
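A minimal sketch of a general linear model with age group and wavelength as factors, of the kind described above; the pupillometry values below are simulated and the effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(7)

# Hypothetical pupillometry data: constriction latency (ms) by age group and
# stimulus wavelength. Illustrative only.
n = 97
df = pd.DataFrame({
    "age_group":  rng.choice(["younger", "older"], size=n),
    "wavelength": rng.choice(["white", "blue", "green", "red"], size=n),
})
df["latency"] = (250 + 15 * (df["age_group"] == "older")
                 + 10 * (df["wavelength"] == "red") + rng.normal(scale=20, size=n))

# General linear model with age group and wavelength as factors, followed by ANOVA
fit = smf.ols("latency ~ C(age_group) * C(wavelength)", data=df).fit()
print(anova_lm(fit, typ=2))
```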

Relevance: 60.00%

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69–104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite-sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
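A small numerical illustration of why the matrix-exponential transformation guarantees positive definiteness: the exponential of any real symmetric matrix is symmetric positive definite, so the modeled covariance matrix is always valid. The matrix below is arbitrary and unrelated to the paper's estimated model.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(8)

# The exponential of any real symmetric matrix is symmetric positive definite,
# which is why the matrix-exponential transform always yields a valid covariance.
A = rng.normal(size=(3, 3))
S = (A + A.T) / 2                  # an arbitrary symmetric "log-covariance" matrix
Sigma = expm(S)                    # candidate covariance matrix

eigenvalues = np.linalg.eigvalsh(Sigma)
print("eigenvalues of expm(S):", eigenvalues)      # all strictly positive
print("symmetric:", np.allclose(Sigma, Sigma.T))
```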

Relevance: 60.00%

Abstract:

Background: Several theories, such as biological width formation, inflammatory reactions due to contamination of the implant-abutment microgap, and periimplant stress/strain concentration causing bone microdamage accumulation, have been suggested to explain early periimplant bone loss. However, it is not yet well understood to what extent the implant-abutment connection type may influence the remodeling process around dental implants. Aim: to evaluate clinical, bacteriological, and biomechanical parameters related to periimplant bone loss at the crestal region, comparing external hexagon (EH) and Morse-taper (MT) connections. Materials and methods: Twelve patients with totally edentulous mandibles received four custom-made Ø 3.8 x 13 mm implants in the interforaminal region of the mandible, with the same design but different prosthetic connections (two of them EH or MT, randomly placed based on a split-mouth design), and an immediate implant-supported prosthesis. Clinical parameters (periimplant probing pocket depth, modified gingival index and mucosal thickness) were evaluated at 6 sites around the implants, at a 12-month follow-up. The distance from the top of the implant to the first bone-to-implant contact (IT-FBIC) was evaluated on standardized digital peri-apical radiographs acquired at 1, 3, 6 and 12 months of follow-up. Samples of the subgingival microbiota were collected 1, 3 and 6 months after implant loading. DNA was extracted and used for the quantification of Tannerella forsythia, Porphyromonas gingivalis, Aggregatibacter actinomycetemcomitans, Prevotella intermedia and Fusobacterium nucleatum. Comparisons among multiple observation periods were performed using repeated-measures analysis of variance (ANOVA), followed by a Tukey post-hoc test, while comparisons between two periods were made using paired t-tests. Further, 36 computed-tomography-based finite element (FE) models were built, simulating each patient in 3 loading conditions. The results for the peak EQV strain in periimplant bone were interpreted by means of a general linear model (ANOVA). Results: The variation in periimplant bone loss assessed by means of radiographs was significantly different between the connection types (P<0.001). Mean IT-FBIC was 1.17±0.44 mm for EH and 0.17±0.54 mm for MT, considering all evaluated time periods. All clinical parameters showed no significant differences. No significant microbiological differences could be observed between the connection types. Most of the collected samples had very few pathogens, meaning that these regions were healthy from a microbiological point of view. In the FE analysis, a significantly higher peak EQV strain (P=0.005) was found for the EH (mean 3438.65 µε) compared to the MT (mean 840.98 µε) connection. Conclusions: Varying the implant-abutment connection type will result in different periimplant bone remodeling, regardless of clinical and microbiological conditions. This is most likely attributable to the distinct load transmission through different implant-abutment connections to the periimplant bone. The present findings suggest that a Morse-taper connection is more efficient in preventing periimplant bone loss than an external hexagon connection.
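A minimal sketch of the repeated-measures ANOVA across follow-up visits described in the methods; the implant-level IT-FBIC values below are simulated for illustration only.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(9)

# Hypothetical radiographic bone-level data (IT-FBIC, mm) per implant at the four
# follow-up visits; simulated values for illustration only.
implants = [f"imp{i}" for i in range(24)]
months = [1, 3, 6, 12]
rows = []
for imp in implants:
    base = rng.normal(loc=0.4, scale=0.3)
    for m in months:
        rows.append({"implant": imp, "month": m,
                     "it_fbic": base + 0.05 * m + rng.normal(scale=0.1)})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA across the observation periods (the study followed this
# with Tukey post-hoc comparisons)
res = AnovaRM(df, depvar="it_fbic", subject="implant", within=["month"]).fit()
print(res)
```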

Relevance: 60.00%

Abstract:

Background: Identifying biological markers to aid the diagnosis of bipolar disorder (BD) is critically important. To be considered a possible biological marker, neural patterns in BD should be discriminable from those in healthy individuals (HI). We examined patterns of neuromagnetic responses revealed by magnetoencephalography (MEG) during implicit emotion processing using emotional (happy, fearful, sad) and neutral facial expressions, in sixteen BD and sixteen age- and gender-matched healthy individuals. Methods: Neuromagnetic data were recorded using a 306-channel whole-head MEG ELEKTA Neuromag system and preprocessed using Signal Space Separation as implemented in MaxFilter (ELEKTA). Custom Matlab programs removed EOG and ECG signals from the filtered MEG data and computed means of the epoched data (0-250 ms, 250-500 ms, 500-750 ms). A generalized linear model with three factors (individual, emotion intensity and time) compared BD and HI. A principal component analysis of normalized mean channel data in selected brain regions identified principal components that explained 95% of the data variation. These components were used in a quadratic support vector machine (SVM) pattern classifier. SVM classifier performance was assessed using the leave-one-out approach. Results: BD and HI showed significantly different patterns of activation at 0-250 ms within both left occipital and temporal regions, specifically for neutral facial expressions. The PCA revealed significant differences between BD and HI for mildly fearful, happy, and sad facial expressions within 250-500 ms. The quadratic SVM classifier showed the greatest accuracy (84%) and sensitivity (92%) for neutral faces, in left occipital regions within 500-750 ms. Conclusions: MEG responses may be used in the search for disease-specific neural markers.
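A minimal sketch of the PCA-plus-quadratic-SVM classification with leave-one-out evaluation described in the methods, using simulated feature data in place of the MEG channel means; group sizes match the study, but everything else is invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(10)

# Hypothetical feature matrix: mean MEG channel amplitudes per participant in a
# region/time window of interest, with group labels (1 = BD, 0 = healthy).
# Simulated data only; the study used principal components of normalized channel means.
X = np.vstack([rng.normal(0.0, 1.0, size=(16, 30)),
               rng.normal(0.5, 1.0, size=(16, 30))])
y = np.array([0] * 16 + [1] * 16)

# PCA retaining 95% of variance, followed by a quadratic (degree-2 polynomial kernel)
# SVM, assessed with leave-one-out cross-validation
clf = make_pipeline(PCA(n_components=0.95), SVC(kernel="poly", degree=2))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```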