996 results for APPLIED PROBABILITY
Abstract:
Objective. To evaluate the biaxial and short-beam uniaxial strength tests applied to resin composites based upon their Weibull parameters, fractographic features and stress distribution. Methods. Disk- (15 mm x 1 mm) and beam-shaped specimens (10 mm x 2 mm x 1 mm) of three commercial composites (Concept/Vigodent, CA; Heliomolar/Ivoclar-Vivadent, HE; Z250/3M ESPE, FZ) were prepared. After 48 h dry storage at 37 °C, disks and beams were submitted to piston-on-three-balls (BI) and three-point bending (UNI) tests, respectively. Data were analyzed by Weibull statistics. Fractured surfaces were observed under a stereomicroscope and a scanning electron microscope. Maximum principal stress (σ1) distribution was determined by finite element analysis (FEA). Maximum σ1-BI and σ1-UNI were compared to FZ strengths calculated by applying the average failure loads to the analytical equations (σa-BI and σa-UNI). Results. For BI, characteristic strengths were: 169.9a (FZ), 122.4b (CA) and 104.8c (HE), and for UNI were: 160.3a (FZ), 98.2b (CA) and 91.6b (HE). Weibull moduli (m) were similar within the same test. CA and HE presented statistically higher m for BI. Surface pores (BI) and edge flaws (UNI) were the most frequent fracture origins. σ1-BI was 14% lower than σa-BI; σ1-UNI was 43% higher than σa-UNI. Significance. Compared to the short-beam uniaxial test, the biaxial test detected more differences among composites and displayed less data scattering for two of the tested materials. Also, biaxial strength was closer to the material's strength estimated by FEA. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
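As a hedged illustration of the Weibull statistics mentioned in this abstract, the sketch below estimates a Weibull modulus and characteristic strength from a small set of strength values via the usual median-rank linearization. The strength values are invented for the example and are not data from the study.

```python
# Minimal sketch of two-parameter Weibull analysis for strength data,
# of the kind summarized by a characteristic strength and a Weibull modulus.
# The strengths below are hypothetical, NOT measurements from the study.
import numpy as np

strengths = np.sort(np.array([148.0, 155.2, 161.7, 166.4, 170.9,
                              174.8, 179.3, 185.1, 192.6, 201.4]))  # MPa, hypothetical
n = strengths.size

# Median-rank estimate of the failure probability for each ranked specimen.
ranks = np.arange(1, n + 1)
prob_failure = (ranks - 0.3) / (n + 0.4)

# Linearized Weibull CDF: ln(-ln(1 - F)) = m * ln(sigma) - m * ln(sigma_0)
y = np.log(-np.log(1.0 - prob_failure))
x = np.log(strengths)
m, intercept = np.polyfit(x, y, 1)      # slope m is the Weibull modulus
sigma_0 = np.exp(-intercept / m)        # characteristic strength (63.2% failure probability)

print(f"Weibull modulus m ≈ {m:.1f}")
print(f"Characteristic strength ≈ {sigma_0:.1f} MPa")
```

The slope of the linearized fit plays the role of the Weibull modulus m reported in the abstract, and the characteristic strength is the stress at which 63.2% of specimens are expected to have failed.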
Abstract:
Objective: Verify the influence of radiant exposure (H) on composite degree of conversion (DC) and mechanical properties. Methods: Composite was photoactivated with 3, 6, 12, 24, or 48 J/cm². Properties were measured after 48-h dry storage at room temperature. DC was determined on the flat surfaces of 6 mm x 2 mm disk-shaped specimens using FTIR. Flexural strength (FS) and modulus (FM) were assessed by three-point bending. Knoop microhardness number (KHN) was measured on fragments of FS specimens. Data were analyzed by one-way ANOVA/Tukey test, Student's t-test, and regression analysis. Results: DC/top between 6 and 12 J/cm² and between 24 and 48 J/cm² were not statistically different. No differences between DC/top and bottom were detected. DC/bottom, FM, and KHN/top showed significant differences among all H levels. FS did not vary between 12 and 24 J/cm² and between 24 and 48 J/cm². KHN/bottom at 3 and 6 J/cm² was similar. KHN between top and bottom was different up to 12 J/cm². Regression analyses having H as the independent variable showed a plateau region above 24 J/cm². KHN increased exponentially (top) or linearly (bottom) with DC. FS and FM increased almost linearly with DC/bottom up to 55% conversion. Conclusions: DC and mechanical properties increased with radiant exposure. Variables leveled off at high H levels. (C) 2007 Wiley Periodicals, Inc.
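A minimal sketch of the kind of regression just described, assuming a saturating exponential model for DC versus radiant exposure so that the plateau above 24 J/cm² appears as the fitted asymptote. The model form and the data points are assumptions for illustration, not the study's measurements or analysis.

```python
# Hedged sketch: fit a saturating curve to degree of conversion (DC) vs
# radiant exposure (H). All data points below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

H  = np.array([3.0, 6.0, 12.0, 24.0, 48.0])     # J/cm^2
DC = np.array([42.0, 48.0, 53.0, 56.0, 57.0])   # % conversion, hypothetical

def saturating(h, dc_max, k):
    """Exponential-rise-to-maximum model: DC = DC_max * (1 - exp(-k * H))."""
    return dc_max * (1.0 - np.exp(-k * h))

params, _ = curve_fit(saturating, H, DC, p0=[60.0, 0.1])
print(f"estimated plateau ≈ {params[0]:.1f}% at rate k ≈ {params[1]:.2f}")
```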
Abstract:
Objectives: This study tested the following null hypotheses: (1) there is no difference in resin-dentine bond strength when an experimental glutaraldehyde primer solution is added prior to bonding procedures and (2) there is no difference in resin-dentine bond strength when the experimental glutaraldehyde/adhesive system is applied under dry or wet demineralized dentine conditions. Methods: Extracted human maxillary third molars were selected. Flat, mid-coronal dentine was exposed for bonding and four groups were formed. Two groups were designated for the dry and two for the wet dentine technique: DRY: (1) Group GD: acid etching + glutaraldehyde primer (primer A) + HEMA/ethanol primer (primer B) applied under dried dentine + unfilled resin; (2) Group D: the same as GD, except for primer A application; WET: (3) Group GW: the same as GD, but primer B was applied under wet dentine conditions; (4) Group W: the same as GW, except for primer A application. The bonding resin was light-cured and a resin core was built up on the adhesive layer. Teeth were then prepared for microtensile bond testing to evaluate bond strength. The data obtained were submitted to ANOVA and Tukey's test (alpha = 0.05). Results: Glutaraldehyde primer application significantly improved resin-dentine bond strength. No significant difference was observed when the same experimental adhesive system was applied under either dry or wet dentine conditions. These results allow the first null hypothesis to be rejected and the second to be accepted. Conclusion: Glutaraldehyde may affect demineralized dentine properties, leading to improved resin bonding to wet and dry substrates. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
A 4-wheel is a simple graph on 5 vertices with 8 edges, formed by taking a 4-cycle and joining a fifth vertex (the centre of the 4-wheel) to each of the other four vertices. A λ-fold 4-wheel system of order n is an edge-disjoint decomposition of the complete multigraph λK_n into 4-wheels. Here, with five isolated possible exceptions when λ = 2, we give necessary and sufficient conditions for a λ-fold 4-wheel system of order n to be transformed into a λ-fold 4-cycle system of order n by removing the centre vertex from each 4-wheel, and its four adjacent edges (retaining the 4-cycle wheel rim), and reassembling these edges adjacent to wheel centres into 4-cycles.
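To make the objects concrete, here is a small sketch, with arbitrary vertex labels, of a single 4-wheel and the edge bookkeeping described in this abstract (delete the centre and its four spokes, keep the rim). It only illustrates the transformation step, not the paper's necessary and sufficient conditions.

```python
# Minimal sketch of a 4-wheel on five vertices and the removal of its centre.
# Vertex labels are arbitrary placeholders.
def four_wheel(centre, rim):
    """Edge set of a 4-wheel: a 4-cycle on `rim` plus four spokes to `centre`."""
    a, b, c, d = rim
    rim_edges = {frozenset(e) for e in [(a, b), (b, c), (c, d), (d, a)]}
    spokes = {frozenset((centre, v)) for v in rim}
    return rim_edges | spokes

wheel = four_wheel(0, (1, 2, 3, 4))
assert len(wheel) == 8          # 4 rim edges + 4 spokes

# Removing the centre vertex and its incident edges leaves the 4-cycle rim,
# i.e. the 4-cycle retained in the 4-cycle system described above.
rim_only = {e for e in wheel if 0 not in e}
assert len(rim_only) == 4

# The four removed spokes around each wheel centre are the edges that the
# paper reassembles into further 4-cycles (under the stated conditions).
spokes = wheel - rim_only
print(sorted(tuple(sorted(e)) for e in spokes))
```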
Abstract:
Interval-valued versions of the max-flow min-cut theorem and Karp-Edmonds algorithm are developed and provide robustness estimates for flows in networks in an imprecise or uncertain environment. These results are extended to networks with fuzzy capacities and flows. (C) 2001 Elsevier Science B.V. All rights reserved.
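A hedged sketch of one way to read the robustness estimates: since the max-flow value is monotone in the arc capacities, running an ordinary augmenting-path (Edmonds-Karp style) computation at the lower and upper endpoints of interval capacities brackets the interval-valued flow value. The tiny network and its numbers are invented for illustration; the paper's fuzzy extension is not reproduced here.

```python
# Sketch: bracket an interval-valued max flow by solving two crisp problems.
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max flow on a dict-of-dicts capacity map."""
    residual = {u: dict(adj) for u, adj in capacity.items()}
    for u, adj in capacity.items():
        for v in adj:
            residual.setdefault(v, {}).setdefault(u, 0)   # reverse residual arcs
    flow = 0
    while True:
        # BFS for a shortest augmenting path.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Find the bottleneck along the path and augment.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Interval capacities [low, high] on each arc (hypothetical numbers).
interval_caps = {('s', 'a'): (3, 5), ('s', 'b'): (2, 4),
                 ('a', 't'): (2, 6), ('b', 't'): (3, 3), ('a', 'b'): (1, 2)}

low, high = {}, {}
for (u, v), (lo, hi) in interval_caps.items():
    low.setdefault(u, {})[v] = lo
    high.setdefault(u, {})[v] = hi

print("max-flow interval:", [max_flow(low, 's', 't'), max_flow(high, 's', 't')])
```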
Abstract:
Purpose: To test the strength to failure and fracture mode of three indirect composite materials directly applied onto Ti-6Al-4V implant abutments vs cemented standard porcelain-fused-to-metal (PFM) crowns. Materials and Methods: Sixty-four locking taper abutments were randomly allocated to four groups and were cleaned in ethanol in an ultrasonic bath for 5 min. After drying under ambient conditions, the abutments were grit blasted and a custom 4-cusp molar crown mold was utilized to produce identical crowns (n = 16 per group) of Tescera (Bisco), Ceramage (Shofu), and Diamond Crown (DRM) according to the manufacturer's instructions. The porcelain-fused-to-metal crowns were fabricated by conventional means, involving the construction of a wax pattern and casting of a metallic coping followed by sintering of successive layers of porcelain. All crowns were loaded to failure by an indenter placed at one of the cusp tips at a 1 mm/min rate. Subsequently, fracture analysis was performed by means of stereomicroscopy and scanning electron microscopy. One-way ANOVA at the 95% level of significance was utilized for statistical analysis. Results: The single load to failure (+/- SD) results were: Tescera (1130 +/- 239 N), Ceramage (1099 +/- 257 N), Diamond Crown (1155 +/- 284 N), and PFM (1081 +/- 243 N). Stereomicroscopy analysis showed two distinct failure modes, where the loaded cusp failed either with or without abutment/metallic coping exposure. SEM analysis of the fractures showed multiple cracks propagating towards the cervical region of the crown below a region of plastic deformation at the indenter contact region. Conclusion: The three indirect composites and the PFM system fractured at loads higher than those typically associated with normal occlusal function. Although each material had a different composition and handling technique, no significant differences in single load to fracture resistance were found among the composite systems and PFM.
Abstract:
A mixture model incorporating long-term survivors has been adopted in the field of biostatistics, where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceivable that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
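A minimal sketch of the long-term survivor (cure) mixture likelihood that this kind of model builds on: the population survival is S_pop(t) = π + (1 - π) S_u(t), with π the cured fraction and S_u the survival of the uncured. The sketch assumes an exponential S_u and simulated data, purely for illustration; the paper's GLMM extension with clinic-level random effects is not reproduced.

```python
# Hedged sketch: maximum likelihood for a simple cure-fraction mixture model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, true_pi, true_rate = 300, 0.3, 0.5
cured = rng.random(n) < true_pi
event_time = rng.exponential(1.0 / true_rate, n)
censor_time = rng.uniform(0, 8, n)
time = np.where(cured, censor_time, np.minimum(event_time, censor_time))
status = (~cured) & (event_time <= censor_time)      # True = failure observed

def neg_log_lik(params):
    logit_pi, log_rate = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    rate = np.exp(log_rate)
    # Density contribution for observed failures (only the uncured can fail).
    f_u = rate * np.exp(-rate * time)
    # Population survival for censored observations (cured, or uncured and still at risk).
    s_pop = pi + (1.0 - pi) * np.exp(-rate * time)
    ll = np.where(status, np.log((1.0 - pi) * f_u), np.log(s_pop))
    return -ll.sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-fit.x[0]))
print(f"estimated cured fraction ≈ {pi_hat:.2f}, rate ≈ {np.exp(fit.x[1]):.2f}")
```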
Abstract:
Proportionally balanced designs were introduced by Gray and Matters in response to a need for the allocation of markers of the Queensland Core Skills Test to have a certain property. Put simply, markers were allocated to pairs of units in proportions that reflected the relative numbers of markers allocated in total to each unit. In this paper, the first author extends the theoretical results relating to such designs and provides further instances, and two general constructions, in the case that the design comprises blocks of precisely two sizes.
Abstract:
In this note we show by counter-example that the direct product of two weak uniquely completable partial latin squares is not necessarily a uniquely completable partial latin square. This counter-example disproves a conjecture of Gower (see [3]) on the direct product of two uniquely completable partial latin squares.
Abstract:
The number of 1-factors (near-1-factors) that μ 1-factorizations (near-1-factorizations) of the complete graph K_v, v even (v odd), can have in common is studied. The problem is completely settled for μ = 2 and μ = 3.
Abstract:
This paper develops a general framework for valuing a wide range of derivative securities. Rather than focusing on the stochastic process of the underlying security and developing an instantaneously-riskless hedge portfolio, we focus on the terminal distribution of the underlying security. This enables the derivative security to be valued as the weighted sum of a number of component pieces. The component pieces are simply the different payoffs that the security generates in different states of the world, and they are weighted by the probability of the particular state of the world occurring. A full set of derivations is provided. To illustrate its use, the valuation framework is applied to plain-vanilla call and put options, as well as a range of derivatives including caps, floors, collars, supershares, and digital options.
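As a hedged numerical illustration of the framework described in this abstract, the sketch below values a plain-vanilla call, a put and a digital option as discounted, probability-weighted sums of payoffs over a discrete terminal distribution. The states, probabilities, strike and discount factor are invented for the example.

```python
# Sketch: value derivatives as probability-weighted terminal payoffs, discounted.
terminal_states = [80.0, 90.0, 100.0, 110.0, 120.0]   # terminal prices (hypothetical)
probabilities  = [0.10, 0.20, 0.40, 0.20, 0.10]       # state probabilities (hypothetical)
discount = 0.95                                        # one-period discount factor
strike = 100.0

def value(payoff):
    """Discounted expectation of a terminal payoff over the state distribution."""
    return discount * sum(p * payoff(s) for s, p in zip(terminal_states, probabilities))

call    = lambda s: max(s - strike, 0.0)               # plain-vanilla call
put     = lambda s: max(strike - s, 0.0)               # plain-vanilla put
digital = lambda s: 1.0 if s > strike else 0.0         # cash-or-nothing digital

print("call    ≈", value(call))     # 0.95 * (0.2*10 + 0.1*20) = 3.8
print("put     ≈", value(put))      # 0.95 * (0.2*10 + 0.1*20) = 3.8
print("digital ≈", value(digital))  # 0.95 * 0.3 = 0.285
```

Each payoff function is one "component piece"; its value is the probability-weighted payoff across states, and the derivative's value is the discounted sum of those pieces.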
Abstract:
This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, which is underpinned by an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of ‘knowing’.
Abstract:
When examining a rock mass, joint sets and their orientations can play a significant role with regard to how the rock mass will behave. To identify joint sets present in the rock mass, the orientation of individual fracture planes can be measured on exposed rock faces and the resulting data can be examined for heterogeneity. In this article, the expectation-maximization algorithm is used to fit mixtures of Kent component distributions to the fracture data to aid in the identification of joint sets. An additional uniform component is also included in the model to accommodate the noise present in the data.
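A hedged sketch of the EM idea for orientation data follows. This abstract fits mixtures of Kent distributions with an extra uniform noise component; as a simplified stand-in, the sketch uses von Mises-Fisher components on the unit sphere (for which the E- and M-steps are compact), omits the uniform component, and uses random unit vectors in place of measured fracture-plane normals.

```python
# Hedged sketch: EM for a mixture of von Mises-Fisher distributions on the sphere,
# as a simplified stand-in for the Kent-mixture fitting described above.
import numpy as np

def vmf_logpdf(x, mu, kappa):
    """Log-density of the von Mises-Fisher distribution on the 2-sphere."""
    log_c = np.log(kappa) - np.log(4 * np.pi * np.sinh(kappa))
    return log_c + kappa * (x @ mu)

def em_vmf_mixture(x, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Random initialisation of mean directions, concentrations and weights.
    mus = x[rng.choice(n, k, replace=False)]
    kappas = np.full(k, 5.0)
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each observation.
        log_r = np.stack([np.log(weights[j]) + vmf_logpdf(x, mus[j], kappas[j])
                          for j in range(k)], axis=1)
        log_r -= log_r.max(axis=1, keepdims=True)
        resp = np.exp(log_r)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update weights, mean directions and concentrations.
        nk = resp.sum(axis=0)
        weights = nk / n
        for j in range(k):
            r_vec = resp[:, j] @ x
            r_norm = np.linalg.norm(r_vec)
            mus[j] = r_vec / r_norm
            r_bar = r_norm / nk[j]
            kappas[j] = r_bar * (3 - r_bar**2) / (1 - r_bar**2)  # Banerjee et al. approximation
    return weights, mus, kappas

# Toy data: random unit vectors (stand-in for measured fracture-plane normals).
rng = np.random.default_rng(1)
x = rng.normal(size=(200, 3))
x /= np.linalg.norm(x, axis=1, keepdims=True)
weights, mus, kappas = em_vmf_mixture(x, k=2)
print(weights, kappas)
```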