925 results for hierarchical linear model


Relevance:

80.00%

Publisher:

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty into assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
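
The linearised propagation step described in this abstract can be sketched numerically. The example below is minimal and hypothetical (the Jacobian and error magnitudes are invented): for a linear error-transmission function q = J·p, the covariance of the output coordinates follows Σ_out = J Σ_in Jᵀ.

```python
import numpy as np

# Hypothetical linearised error-transmission model: q = J @ p, where p
# stacks the measured reference-point coordinates and q is the component's
# general coordinate vector (all values illustrative).
J = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])

# Assume independent, normally distributed measurement errors (std = 0.02 mm).
sigma_in = np.diag([0.02**2] * 3)

# Linear propagation of covariance: Sigma_out = J Sigma_in J^T.
sigma_out = J @ sigma_in @ J.T

# Standard deviation of each output coordinate.
std_out = np.sqrt(np.diag(sigma_out))
print(std_out)
```

With equal weights of 0.5 on two reference points, each output standard deviation is 0.02·√0.5 ≈ 0.014 mm, smaller than the raw measurement error, which is the averaging effect a good sampling strategy exploits.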

Relevance:

80.00%

Publisher:

Abstract:

Generalizing from his experience in solving practical problems, Koopmans set about devising the linear activity-analysis model. Surprisingly, he found that the economics of his day possessed no uniform, sufficiently exact theory of production or system of concepts for it. In a pioneering study he therefore also laid down, as a theoretical framework for the linear activity-analysis model, the foundations of an axiomatic production theory resting on the concept of technology sets. He gave exact definitions of the concepts of production efficiency and efficiency prices, and proved their mutually conditioning relationship within the linear activity-analysis model. Koopmans treated the present-day, purely technical definition of efficiency only as a special case; his aim was to introduce and analyse the concept of economic efficiency. This study uses the duality theorems of linear programming to reconstruct his results on the latter. It shows, first, that his proofs are equivalent to proving the duality theorems of linear programming and, second, that the economic efficiency prices are in fact shadow prices in today's sense. It also points out that the model he formulated to interpret economic efficiency can be regarded as a direct predecessor of the Arrow–Debreu–McKenzie models of general equilibrium theory, containing almost every essential element and concept of them: the equilibrium prices are nothing other than Koopmans' efficiency prices. Finally, Koopmans' model is reinterpreted as a possible tool for the microeconomic description of a firm's technology. Journal of Economic Literature (JEL) codes: B23, B41, C61, D20, D50.
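
The equivalence between efficiency prices and linear-programming duality can be illustrated with a toy activity-analysis problem. The numbers below are invented and `scipy` is assumed to be available; the dual solution gives the efficiency (shadow) prices of the resources, and strong duality makes the two optimal values coincide.

```python
import numpy as np
from scipy.optimize import linprog

# Toy activity-analysis problem (illustrative numbers, not Koopmans' own):
# choose activity levels x >= 0 to maximise value c.x subject to A x <= b,
# where A maps activities to resource use and b is the resource endowment.
c = np.array([3.0, 5.0])          # value of each activity
A = np.array([[1.0, 2.0],         # resource requirements per unit activity
              [3.0, 1.0]])
b = np.array([14.0, 18.0])        # available resources

# Primal: max c.x  <=>  min -c.x (linprog minimises).
primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")

# Dual: min b.y  s.t.  A^T y >= c, y >= 0 -- the y are the efficiency
# (shadow) prices of the resources.
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * 2, method="highs")

# Strong duality: optimal primal value equals optimal dual value.
print(-primal.fun, dual.fun)
```

Here the optimal activity plan is worth 37.2, and the dual optimum is the same 37.2: valuing the endowment at the shadow prices exactly exhausts the achievable value, which is the efficiency-price property the study reconstructs.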

Relevance:

80.00%

Publisher:

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact the complete process from bond data to multiresolution analysis is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
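
The principal-components step described above amounts to diagonalising the covariance matrix of discretised yield-curve shifts. A minimal numpy sketch on synthetic level-plus-slope data (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate daily shifts of a yield curve sampled at 8 maturities:
# a common "level" factor, a maturity-dependent "slope" factor, and noise.
maturities = np.linspace(0.25, 30.0, 8)
level = rng.normal(0, 1.0, (1000, 1)) * np.ones((1, 8))
slope = rng.normal(0, 0.5, (1000, 1)) * (maturities / 30.0)
noise = rng.normal(0, 0.05, (1000, 8))
shifts = level + slope + noise

# Principal components: diagonalise the covariance matrix of the shifts.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
eigvals = eigvals[::-1]                  # descending: largest component first

# Fraction of total variance explained by the leading components.
explained = np.cumsum(eigvals) / eigvals.sum()
print(explained[:3])
```

With this construction the first component (level) dominates and the first two or three components capture essentially all variation, mirroring the dimension-reduction finding quoted above.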

Relevance:

80.00%

Publisher:

Abstract:

This study explored the effects of class size on faculty and students. Specifically, it examined the relationship of class size to students' participation in class, faculty interactive styles, and academic environment, and how these behaviors affected student achievement (percentage of students passing). The sample was composed of 629 students in 30 sections of Algebra I at a large, urban community college. A survey was administered to the students to solicit their perceptions of their participation in class, their faculty's interaction style, and the academic environment in their classes. Selected classes were observed to triangulate the findings. The relationship of class size to student participation, faculty interactive styles, and academic environment was determined using hierarchical linear modeling (HLM). A significant difference related to class size was found in the participation of students: students in smaller classes participated more and were more engaged than students in larger classes. Regression analysis using the same variables in small and large classes showed that faculty interactive styles significantly predicted student achievement. Stepwise regression analyses of student and faculty background variables showed that (a) students' estimates of their GPA were significantly related to their achievement (r = .63); (b) older students reported more participation than did younger ones; (c) students in classes taught by female, Hispanic faculty earned higher passing grades; and (d) students' participation was greater with adjunct professors. Class observations corroborated these findings. The analysis and observational data provided sufficient evidence to warrant the conclusion that small classes were not always most effective in promoting achievement; small classes may be an artifact of ineffectual teaching, actual or by reputation. While students in small classes participate more and are more engaged than students in larger classes, the class-size effect is essentially due to what happens in instruction to promote learning. The interaction of the faculty with students significantly predicted students' achievement regardless of class size. Since college students select their own classes, they do not register for classes taught by faculty with poor teaching reputations, thereby leading to small classes. Further studies are suggested to determine the reasons why classes differ in size.
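
HLM treats students as nested within class sections and partitions outcome variance into within- and between-section components. A minimal numpy sketch of that partition on synthetic data (all numbers invented; real HLM software estimates these components by maximum likelihood):

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-level data: students (level 1) nested in class sections (level 2),
# the structure hierarchical linear modeling is built for.
n_sections, n_students = 30, 20
section_effect = rng.normal(0, 2.0, n_sections)          # between-section
scores = np.array([
    70 + u + rng.normal(0, 5.0, n_students)              # within-section
    for u in section_effect
])

# One-way ANOVA (method-of-moments) estimators of the variance components.
grand_mean = scores.mean()
between = n_students * ((scores.mean(axis=1) - grand_mean) ** 2).sum() / (n_sections - 1)
within = scores.var(axis=1, ddof=1).mean()
var_between = max((between - within) / n_students, 0.0)

# Intraclass correlation: share of variance lying between sections.
icc = var_between / (var_between + within)
print(round(icc, 3))
```

A non-trivial intraclass correlation is exactly what makes section-level predictors such as class size worth modeling at their own level rather than pooling all students.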

Relevance:

80.00%

Publisher:

Abstract:

With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in distinct areas, such as government services, education, medical care, business, entertainment, etc. To satisfy the growing demand of multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications to offer a full scope of multimedia related tools and provide appealing experiences for the users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called Hierarchical Markov Model Mediator (HMMM) is proposed to model high dimensional media data including video objects, low-level visual/audio features, as well as historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only the general queries, but also the complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated such that user interests are mined efficiently to improve the retrieval performance. Third, video clustering techniques are proposed to continuously increase the searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to solve the perception subjectivity issue by considering individual user's profile. Moreover, to deal with security and privacy issues and concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control. 
SMARXO efficiently combines role-based access control (RBAC), XML and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring and presentation in a single framework.
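
The details of HMMM are specific to this dissertation, but the general idea of ranking items with a Markov model over access affinities can be sketched generically. Everything below is an invented illustration, not the HMMM algorithm itself: co-access counts are row-normalised into a transition matrix, and items are ranked by its stationary distribution.

```python
import numpy as np

# Hypothetical affinity counts between four video objects, e.g. derived
# from historical co-access frequencies (illustrative numbers only).
counts = np.array([[0., 8., 2., 1.],
                   [8., 0., 4., 2.],
                   [2., 4., 0., 6.],
                   [1., 2., 6., 0.]])

# Row-normalise into a Markov transition matrix.
P = counts / counts.sum(axis=1, keepdims=True)

# Stationary distribution by power iteration: pi = pi @ P.
pi = np.full(4, 0.25)
for _ in range(200):
    pi = pi @ P

# Rank objects by stationary probability (higher = more frequently reached).
ranking = np.argsort(pi)[::-1]
print(pi.round(3), ranking)
```

Updating the affinity counts from user feedback and re-deriving the ranking is one simple way such a model can "learn" access patterns over time, in the spirit of the training step described above.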

Relevance:

80.00%

Publisher:

Abstract:

The extant literature has studied the determinants of firms' location decisions using host-country characteristics and the distances between home and host countries. Firms' resources and internationalization strategies have received limited attention in this literature. To address this gap, the research question in this dissertation was whether and how firms' resources and internationalization strategies impacted the international location decisions of emerging-market firms. To explore the research question, data were hand-collected from Indian software firms on their location decisions taken between April 2000 and March 2009. To analyze the multi-level longitudinal dataset, hierarchical linear modeling was used. The results showed that the internationalization strategies, namely market-seeking and labor-seeking, had a direct impact on firms' location decisions. This direct relationship was moderated by a firm resource which, in the case of Indian software firms, was appraisal at CMMI level 5. Indian software firms located in developed countries with a market-seeking strategy and in emerging markets with a labor-seeking strategy. However, software firms with a resource such as CMMI level-5 appraisal, whether in a labor-seeking or a market-seeking mode, were more likely to locate in a developed country over an emerging market than firms without the appraisal. It was concluded that the internationalization strategies and resources of companies predicted their location choices, over and above the variables studied in the theoretical field of location determinants.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this study was to better understand the study behaviors and habits of university undergraduate students. It was designed to determine whether undergraduate students could be grouped based on their self-reported study behaviors and, if so, whether group membership was related to students' academic achievement. A total of 152 undergraduate students voluntarily participated by completing the Study Behavior Inventory instrument. All participants were enrolled in the fall semester of 2010 at Florida International University. The Q factor analysis technique, using principal components extraction and a varimax rotation, was used to examine the participants in relation to each other and to detect a pattern of intercorrelations among participants based on their self-reported study behaviors. The Q factor analysis yielded a two-factor structure representing two distinct student types regarding study behaviors. The first type (i.e., Factor 1) describes proactive learners who organize both their study materials and study time well; Type 1 students are labeled "Proactive Learners with Well-Organized Study Behaviors". The second type (i.e., Factor 2) represents students who are poorly organized and very likely to procrastinate; Type 2 students are labeled "Disorganized Procrastinators". Hierarchical linear regression was employed to examine the relationship between student type and academic achievement as measured by current grade point average (GPA). The results showed significant differences in GPA between Type 1 and Type 2 students at the .05 significance level. Furthermore, student type was found to be a significant predictor of academic achievement over and above students' attribute variables, including sex, age, major, and enrollment status. The study has several implications for educational researchers, practitioners, and policy makers in terms of improving college students' learning behaviors and outcomes.

Relevance:

80.00%

Publisher:

Abstract:

Chronic disease affects 80% of adults over the age of 65 and is expected to increase in prevalence. To address the burden of chronic disease, self-management programs have been developed to increase self-efficacy and improve quality of life by reducing or halting disease symptoms. Two programs that have been developed to address chronic disease are the Chronic Disease Self-Management Program (CDSMP) and Tomando Control de su Salud (TCDS). CDSMP and TCDS both focus on improving participant self-efficacy but use different curricula, as TCDS is culturally tailored for the Hispanic population. Few studies have evaluated the effectiveness of CDSMP and TCDS when translated to community settings. In addition, little is known about the correlation between demographic, baseline health status, and psychosocial factors and completion of either CDSMP or TCDS. This study used secondary data collected by agencies of the Healthy Aging Regional Collaborative from 10/01/2008 to 12/31/2010. The aims of this study were to examine six-week differences in self-efficacy, time spent performing physical activity, and social/role activity limitations, and to identify correlates of program completion using baseline demographic and psychosocial factors. A general linear model was used to examine whether differences existed, and logistic regression was used to examine correlates of program completion. Study findings show that all measures improved at week six. For CDSMP, self-efficacy to manage disease (p = .001), self-efficacy to manage emotions (p = .026), social/role activity limitations (p = .001), and time spent walking (p = .008) were statistically significant. For TCDS, self-efficacy to manage disease (p = .006), social/role activity limitations (p = .001), and time spent walking (p = .016) and performing other aerobic activity (p = .005) were significant. For CDSMP, no significant correlates of program completion were found. For TCDS, participants who were male (OR = 2.3, 95% CI: 1.15–4.66), from Broward County (OR = 2.3, 95% CI: 1.27–4.25), or living alone (OR = 2.0, 95% CI: 1.29–3.08) were more likely to complete the program. CDSMP and TCDS, when implemented through a collaborative effort, can result in improvements for participants. Effective chronic disease management can improve health and quality of life, and reduce health care expenditures among older adults.
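
An odds ratio like those reported for TCDS comes from a 2×2 table of completion by group. The counts below are invented (chosen only so the odds ratio lands near the reported 2.3), with a Woolf-type 95% confidence interval on the log odds ratio:

```python
import math

# Hypothetical 2x2 completion table (counts are illustrative, not the
# TCDS data):          completed   dropped out
completed_male, dropped_male = 46, 20
completed_female, dropped_female = 150, 150

# Odds ratio: (odds of completing for males) / (odds for females).
odds_ratio = (completed_male * dropped_female) / (dropped_male * completed_female)

# Woolf's method: standard error of log(OR) from the four cell counts.
se_log_or = math.sqrt(1/completed_male + 1/dropped_male +
                      1/completed_female + 1/dropped_female)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(round(odds_ratio, 2), (round(lo, 2), round(hi, 2)))
```

Because the interval excludes 1, the association would be judged significant, which is how the reported ORs with their confidence intervals should be read.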

Relevance:

80.00%

Publisher:

Abstract:

The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years. This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
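
The extrapolation step of an accelerated lifetime test can be sketched with a simple log-linear life-stress model fitted by least squares. The data and stress scale below are invented, and this is far simpler than the Shapiro-Meeker and Cox analyses used in the study:

```python
import numpy as np

# Hypothetical accelerated-life data: mean log time-to-failure shrinks
# as stress rises (a log-linear life-stress relationship).
stress = np.array([3.0, 3.0, 2.5, 2.5, 2.0, 2.0])      # applied stress level
ttf_years = np.array([1.1, 1.4, 2.6, 3.1, 6.2, 7.0])   # times to failure

# Fit log(ttf) = a + b * stress by least squares (polyfit: slope first).
b, a = np.polyfit(stress, np.log(ttf_years), 1)

# Extrapolate to a normal-use stress level (say 1.0) to estimate service life.
use_life = np.exp(a + b * 1.0)
print(round(use_life, 1))
```

The point of overstressing specimens is visible here: failures observed within a few years at high stress let the model project a service life of decades at normal use, which can then be compared against the 20-year requirement.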

Relevance:

80.00%

Publisher:

Abstract:

The role of the principal in school settings and the principal's perceived effect on student achievement have frequently been considered vital factors in school reform. The relationships between emotional intelligence, leadership style and school culture have been widely studied. The literature reveals agreement among scholars regarding the principal's vital role in developing and fostering a positive school culture. The purpose of this study was to explore the relationships between elementary school principals' emotional intelligence, leadership style and school culture. The researcher implemented a non-experimental ex post facto research design to investigate four specific research hypotheses. Utilizing the Qualtrics Survey Software, 57 elementary school principals within a large urban school district in southeast Florida completed the Emotional Quotient Inventory (EQ-i), and 850 of their faculty members completed the Multifactor Leadership Questionnaire (MLQ Form 5X). Faculty responses to the school district's School Climate Survey, retrieved from the district's web site, were used as the measure of school culture. Linear regression analyses revealed significant positive associations between emotional intelligence and the following leadership measures: Idealized Influence-Attributes (β = .23, p < .05), Idealized Influence-Behaviors (β = .34, p < .01), Inspirational Motivation (β = .39, p < .01) and Contingent Reward (β = .33, p < .01). Hierarchical regression analyses revealed positive associations between school culture and both transformational and transactional leadership measures, and negative associations between school culture and passive-avoidant leadership measures. Significant positive associations were found between school culture and the principals' emotional intelligence over and above leadership style. Hierarchical linear regressions to test the statistical hypothesis developed to account for alternative explanations revealed significant associations between leadership style and school culture over and above school grade. These results suggest that emotional intelligence merits consideration in the development of leadership theory. Practical implications include suggestions that principals employ both transformational and transactional leadership strategies and focus on developing their level of emotional intelligence. The associations between emotional intelligence, transformational leadership, Contingent Reward and school culture found in this study validate the role of the principal as the leader of school reform.

Relevance:

80.00%

Publisher:

Abstract:

Prior to 2000, fewer than 1.6 million students were enrolled in at least one online course. By fall 2010, student enrollment in online distance education showed a phenomenal 283% increase to 6.1 million. Two years later, this number had grown to 7.1 million. In light of this significant growth and skepticism about quality, there have been calls for greater oversight of this format of educational delivery. Accrediting bodies tasked with this oversight have developed guidelines and standards for online education. There is a lack of empirical studies that examine the relationship between accrediting standards and student success. The purpose of this study was to examine the relationship between student success and the presence in online courses of two Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) standards for online education: (a) student support services and (b) curriculum and instruction. An original 24-item survey with an overall reliability coefficient of .94 was administered to students (N = 464) enrolled in 24 university-wide undergraduate online courses at Florida International University during fall 2014, who rated the presence of these standards in their online courses. The general linear model was utilized to analyze the data. The results indicated that the two standards, student support services and curriculum and instruction, were both significantly and positively correlated with student success, but with small R² values and strengths of association below .35 and .20, respectively. Chi-square tests produced mixed results for differences in student success between higher- and lower-rated online courses when controlling for covariates such as discipline, gender, race/ethnicity, GPA, age, and number of online courses previously taken. A multiple linear regression analysis revealed that the curriculum and instruction standard was the only variable that accounted for a significant amount of unique variance in student success. Another regression test revealed no significant interaction effect between the two SACSCOC standards and GPA in predicting student success. The results of this study are useful for administrators, faculty, and researchers who are interested in accreditation standards for online education and how these standards relate to student success.
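
A scale reliability coefficient such as the .94 reported here is commonly Cronbach's alpha. A numpy sketch of its computation on synthetic 24-item survey responses (data invented; that the study used alpha specifically is an assumption):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic survey responses: 200 respondents x 24 items driven by one
# latent factor, mimicking an internally consistent scale.
latent = rng.normal(0, 1, (200, 1))
items = latent + rng.normal(0, 0.6, (200, 24))

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total).
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_vars / total_var)
print(round(alpha, 2))
```

When items all track the same latent quantity, the variance of the total score swamps the sum of item variances and alpha approaches 1; values above roughly .9 are typically read as high internal consistency, as with the instrument described above.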

Relevance:

80.00%

Publisher:

Abstract:

Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as the interference among competing workloads, makes it difficult to understand the VMs' resource demands for meeting their Quality of Service (QoS) targets. Second, the dynamics of the applications and the system make it difficult to maintain the desired QoS target while the environment changes. Third, the transparency of virtualization presents a hurdle for guest-layer applications and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation proposes to address the above challenges through fuzzy modeling and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources, automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation of resource management, the fuzzy modeling approach is integrated with a predictive-control-based controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track the applications' QoS targets and optimize the resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets for an oversubscribed system and is able to manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when the resources are contended by dynamic workloads.
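
The core fuzzy-modeling idea, mapping an observed quantity to a resource decision through overlapping membership functions and a weighted combination of rules, can be sketched in a few lines. This toy controller is an invented illustration, far simpler than the dissertation's FMPC approach:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def allocate(load):
    """Map observed CPU load (0..1) to a VM's CPU share via three fuzzy
    rules and weighted-average defuzzification (illustrative only)."""
    low = tri(load, -0.4, 0.0, 0.5)    # rule: low load  -> small share (0.2)
    med = tri(load, 0.2, 0.5, 0.8)     # rule: medium    -> half share  (0.5)
    high = tri(load, 0.5, 1.0, 1.4)    # rule: high load -> large share (0.9)
    total = low + med + high
    return (low * 0.2 + med * 0.5 + high * 0.9) / total

print(round(allocate(0.3), 3), round(allocate(0.9), 3))
```

Because the memberships overlap, the output varies smoothly with load rather than jumping between discrete rules, which is what makes a fuzzy model a convenient nonlinear approximator for online demand estimation.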

Relevance:

80.00%

Publisher:

Abstract:

This thesis used four different methods to diagnose precipitation extremes over Northeastern Brazil (NEB): generalized linear models via logistic and Poisson regression, extreme value theory via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and vector generalized linear models via GEV (MVLG GEV). The logistic and Poisson regression models were used to identify interactions between precipitation extremes and other variables based on odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern, and semi-arid NEB, and relative humidity over southern NEB. The GEV and GPD distributions (based on the 95th percentile) showed that the location and scale parameters reached their maxima on the eastern and northern coasts of NEB; the GEV also showed a maximum core over western Pernambuco, influenced by weather systems and topography. For the GEV and GPD shape parameter, the data in most regions were fitted by the negative Weibull and Beta distributions (ξ < 0), respectively. The GEV (GPD) return levels and periods indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) within the next 30 years. The MVLG GEV model found that the zonal and meridional wind components, evaporation, and Atlantic and Pacific sea surface temperatures boost the precipitation extremes. The GEV parameters showed the following results: (a) location (μ), with the highest value, 88.26 ± 6.42 mm, over northern Maranhão; (b) scale (σ), positive in most regions, except southern Maranhão; and (c) shape (ξ), with most of the selected regions fitted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia showed greater accuracy.
From the return levels, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day within the next 30 years.
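The return-level computation described above can be sketched as follows: fit a GEV to block maxima and take the (1 − 1/T) quantile as the T-year return level. The data below are synthetic stand-ins, not the thesis's station records, so the printed level is illustrative only. Note that scipy's shape parameter `c` equals −ξ in the climatological convention, so ξ < 0 (bounded, negative-Weibull tail) corresponds to `c` > 0 in scipy.

```python
from scipy.stats import genextreme

# Synthetic "annual maxima" drawn from a GEV with a bounded upper tail
# (scipy c = 0.1, i.e. xi = -0.1 in the climatological sign convention).
annual_maxima = genextreme.rvs(0.1, loc=80.0, scale=20.0, size=200,
                               random_state=42)

# Maximum-likelihood fit of the three GEV parameters.
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)

# T-year return level: the level exceeded on average once every T years,
# i.e. the (1 - 1/T) quantile of the fitted distribution.
T = 30
return_level = genextreme.isf(1.0 / T, c_hat, loc_hat, scale_hat)
print(f"estimated {T}-year return level: {return_level:.1f} mm/day")
```

The same quantile logic, applied to the thesis's fitted parameters per region, yields statements such as "an event exceeding 160.9 mm/day within the next 30 years."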

Relevância:

80.00% 80.00%

Publicador:

Resumo:

CHAPTER 1 - This study histologically evaluated two implant designs: a classic thread design versus another specifically designed for healing chamber formation, placed with two drilling protocols. Forty dental implants (4.1 mm diameter) with two different macrogeometries were inserted in the tibiae of 10 Beagle dogs, and maximum insertion torque was recorded. The drilling techniques were: up to 3.75 mm diameter (regular group) and up to 4.0 mm diameter (overdrilling group), for both implant designs. At 2 and 4 weeks, samples were retrieved and processed for histomorphometric analysis. For torque, BIC (bone-to-implant contact), and BAFO (bone area fraction occupied), a general linear model was employed, including instrumentation technique and time in vivo as independent variables. The insertion torque significantly decreased as a function of increasing drilling diameter for both implant designs (p<0.001). No significant differences were detected between implant designs for each drilling technique (p>0.18). A significant increase in BIC was observed from 2 to 4 weeks for both implants placed with the overdrilling technique (p<0.03) only, but not for those placed in the 3.75 mm drilling sites (p>0.32). Despite the differences between implant designs and drilling techniques, an intramembranous-like healing mode with newly formed woven bone prevailed. CHAPTER 2 - The objective of this preliminary histologic study was to determine whether different drilling protocols (oversized, intermediate, undersized drilling) produce different biologic responses at an early healing period of 2 weeks in vivo in a beagle dog model. Ten beagle dogs were acquired and subjected to tibial surgeries 2 weeks before euthanasia. During surgery, 3 implants, 4 mm in diameter by 10 mm in length, were placed in bone sites drilled to 3.5 mm, 3.75 mm, and 4.0 mm in final diameter. The insertion and removal torques were recorded for all samples.
Statistical significance was set at the 95% level of confidence, and the number of dogs was considered the statistical unit for all comparisons. For torque, BIC, and BAFO, a general linear model was employed, including instrumentation technique and time in vivo as independent variables. Overall, the insertion torque increased as a function of decreasing drilling diameter, from 4.0 mm to 3.75 mm to 3.5 mm, with a significant difference in torque levels between all groups (p<0.001). Statistical assessment of BIC and BAFO showed significantly higher values for the 3.75 mm (recommended) drilling group relative to the other two groups (p<0.001). Different drilling dimensions resulted in variations in insertion torque values (primary stability), and a different pattern of healing and interfacial remodeling was observed for the different groups. CHAPTER 3 - The present study evaluated the effect of different drilling dimensions (undersized, regular, and oversized) on the insertion and removal torques of dental implants in a beagle dog model. Six beagle dogs were acquired and subjected to bilateral surgeries in the radii 1 and 3 weeks before euthanasia. During surgery, 3 implants, 4 mm in diameter by 10 mm in length, were placed in bone sites drilled to 3.2 mm, 3.5 mm, and 3.8 mm in final diameter. The insertion and removal torques were recorded for all samples. Statistical analysis was performed by paired t tests for repeated measures and by t tests assuming unequal variances (all at the 95% level of significance). Overall, the insertion and removal torque levels obtained were inversely proportional to the drilling dimension, with a significant difference detected between the 3.2 mm and 3.5 mm groups relative to the 3.8 mm group (P < 0.03).
Although paired insertion torque–removal torque values were statistically maintained for the 3.5 mm and 3.8 mm groups, a significant decrease in removal torque relative to insertion torque was observed for the 3.2 mm group. A different pattern of healing and interfacial remodeling was observed for the different groups. Different drilling dimensions resulted in variations in insertion torque values (primary stability) and in stability maintenance over the first weeks of bone healing.
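The paired comparison described above, insertion versus removal torque measured on the same implants, can be sketched with a paired t test at the 95% confidence level. The torque values below are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
n_implants = 18

# Hypothetical paired measurements (N·cm): each implant has an insertion
# torque and a removal torque, with a systematic drop between the two.
insertion_torque = rng.normal(45.0, 8.0, size=n_implants)
removal_torque = insertion_torque - rng.normal(10.0, 3.0, size=n_implants)

# Paired t test: the same implant contributes both observations, so the
# test operates on the per-implant differences.
t_stat, p_value = ttest_rel(insertion_torque, removal_torque)
significant = p_value < 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.4g}, significant at 95%: {significant}")
```

Pairing matters here because torque varies strongly between implants; testing the within-implant differences removes that between-implant variance, which is the design choice the abstract's "paired t tests for repeated measures" reflects.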