980 results for Computer Oriented Statistics
Abstract:
Purpose: The aim of this research was to assess the dimensional accuracy of orbital prostheses based on reversed images generated by computer-aided design/computer-assisted manufacturing (CAD/CAM) using computed tomography (CT) scans. Materials and Methods: CT scans of the faces of 15 adults, men and women older than 25 years of age not bearing any congenital or acquired craniofacial defects, were processed using CAD software to produce 30 reversed three-dimensional models of the orbital region. These models were then processed using the CAM system by means of selective laser sintering to generate surface prototypes of the volunteers' orbital regions. Two moulage impressions of the faces of each volunteer were taken to manufacture 15 pairs of casts. Orbital defects were created on the right or left side of each cast. The surface prototypes were adapted to the casts and then flasked to fabricate silicone prostheses. The establishment of anthropometric landmarks on the orbital region and facial midline allowed for the collection of 31 linear measurements, used to assess the dimensional accuracy of the orbital prostheses and their location on the face. Results: The comparative analyses of the linear measurements taken from the orbital prostheses and the opposite sides that originated the surface prototypes demonstrated that the orbital prostheses presented similar vertical, transversal, and oblique dimensions, as well as similar depth. There was no transverse or oblique displacement of the prostheses. Conclusion: From a clinical perspective, the small differences observed after analyzing all 31 linear measurements did not indicate facial asymmetry. The dimensional accuracy of the orbital prostheses suggested that the CAD/CAM system assessed herein may be applicable for clinical purposes. Int J Prosthodont 2010;23:271-276.
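The dimensional comparison described in this abstract reduces to computing linear distances between pairs of anthropometric landmarks and comparing the two sides of the face. The sketch below is a minimal illustration of that idea in Python, not the authors' measurement protocol; the landmark names and coordinates are invented placeholders.

```python
import numpy as np

# Hypothetical 3D landmark coordinates (mm); names and values are
# placeholders, not the study's actual anthropometric points.
prosthesis_side = {
    "exocanthion": np.array([42.0, 61.5, 30.2]),
    "endocanthion": np.array([18.3, 60.9, 27.8]),
    "supraorbitale": np.array([30.1, 75.4, 33.0]),
}
contralateral_side = {
    "exocanthion": np.array([-41.6, 61.7, 30.5]),
    "endocanthion": np.array([-18.1, 61.0, 27.9]),
    "supraorbitale": np.array([-29.8, 75.1, 32.7]),
}

def linear_measurements(landmarks):
    """Euclidean distances between every pair of landmarks."""
    names = sorted(landmarks)
    return {
        (a, b): float(np.linalg.norm(landmarks[a] - landmarks[b]))
        for i, a in enumerate(names) for b in names[i + 1:]
    }

lhs = linear_measurements(prosthesis_side)
rhs = linear_measurements(contralateral_side)
for pair in lhs:
    diff = lhs[pair] - rhs[pair]
    print(f"{pair[0]}-{pair[1]}: difference {diff:+.2f} mm")
```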
Abstract:
Statement of the Problem: Adhesive systems can spread differently onto a substrate and, consequently, influence bonding. Purpose: The purpose of this study was to evaluate the effect of differently oriented dentin surfaces and the regional variation of specimens on adhesive layer thickness and microtensile bond strength (MTBS). Materials and Methods: Twenty-four molars were sectioned mesiodistally to expose flat buccal and lingual halves. Standardized drop volumes of adhesive systems (Single Bond [SB] and Prime & Bond 2.1 [PB2.1]) were applied to dentin according to the manufacturer's instructions. Tooth halves were randomly divided into groups: 1A-SB/parallel to gravity; 1B-SB/perpendicular to gravity; 2A-PB2.1/parallel to gravity; and 2B-PB2.1/perpendicular to gravity. The bonded assemblies were stored in 37°C distilled water for 24 hours and then sectioned to obtain dentin sticks (0.8 mm²). The adhesive layer thickness was determined in a light microscope (×200), and after 48 hours the specimens were subjected to the MTBS test. Data were analyzed by one-way and two-way analysis of variance and Student-Newman-Keuls tests. Results: Mean MTBS values (MPa ± SD) were: 39.1 ± 12.9 (1A); 32.9 ± 12.4 (1B); 52.9 ± 15.2 (2A); and 52.3 ± 16.5 (2B). The adhesive layer thicknesses (μm ± SD) were: 11.2 ± 2.9 (1A); 18.1 ± 7.3 (1B); 4.2 ± 1.8 (2A); and 3.9 ± 1.3 (2B). No correlation between bond strength and adhesive layer thickness was observed for either SB or PB2.1 (r = -0.224, p = 0.112 and r = 0.099, p = 0.491, respectively). Conclusions: The effects of differently oriented dentin surfaces and of the regional variation of specimens on adhesive layer thickness are material-dependent. These variables do not influence the adhesive systems' bond strength to dentin. CLINICAL SIGNIFICANCE: Adhesive systems have different viscosities and spread differently onto a substrate, influencing both the bond strength and the adhesive layer thickness. Adhesive thickness does not influence dentin bond strength, but it may impair adequate solvent evaporation and polymer conversion, and may also determine water sorption and adhesive degradation over time. Many studies in the literature have shown that the adhesive layer is a permeable membrane and can fail over time because of its continuous plasticizing and degradation when in contact with water. Therefore, avoiding thick adhesive layers may minimize these problems and provide long-term success for adhesive restorations.
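The statistics reported here (one-way ANOVA across groups and a Pearson correlation between bond strength and layer thickness) are straightforward to reproduce on raw specimen data. A minimal sketch with SciPy, using simulated values rather than the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical per-stick MTBS values (MPa) for the four groups;
# these are illustrative numbers, not the study's raw data.
rng = np.random.default_rng(0)
groups = {
    "1A": rng.normal(39.1, 12.9, 20),
    "1B": rng.normal(32.9, 12.4, 20),
    "2A": rng.normal(52.9, 15.2, 20),
    "2B": rng.normal(52.3, 16.5, 20),
}

# One-way ANOVA across the four groups.
f_stat, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

# Pearson correlation between adhesive layer thickness and MTBS
# (simulated here as uncorrelated, mirroring the paper's finding).
thickness = rng.normal(10.0, 4.0, 80)
mtbs = np.concatenate(list(groups.values()))
r, p_corr = stats.pearsonr(thickness, mtbs)
print(f"Pearson: r = {r:.3f}, p = {p_corr:.3f}")
```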
Abstract:
Interval-valued versions of the max-flow min-cut theorem and the Edmonds-Karp algorithm are developed, providing robustness estimates for flows in networks in an imprecise or uncertain environment. These results are extended to networks with fuzzy capacities and flows. © 2001 Elsevier Science B.V. All rights reserved.
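Because max flow is monotone in edge capacities, a crude robustness estimate for a network whose capacities are intervals [l, u] is the interval [maxflow(l), maxflow(u)]. The sketch below bounds the flow this way using a plain Edmonds-Karp implementation; it illustrates the setting, not the paper's interval-valued algorithm, and the example network is invented.

```python
from collections import deque

def edmonds_karp(n, capacity, source, sink):
    """Max flow via shortest augmenting paths (BFS)."""
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:
            return total
        # Bottleneck residual capacity along the path.
        bottleneck = float("inf")
        v = sink
        while v != source:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        # Augment along the path.
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Interval capacities [lower, upper] on a small invented network.
lower = [[0, 3, 2, 0], [0, 0, 1, 2], [0, 0, 0, 3], [0, 0, 0, 0]]
upper = [[0, 5, 4, 0], [0, 0, 2, 4], [0, 0, 0, 5], [0, 0, 0, 0]]
lo = edmonds_karp(4, lower, 0, 3)
hi = edmonds_karp(4, upper, 0, 3)
print(f"max flow lies in [{lo}, {hi}]")  # by monotonicity of max flow
```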
Abstract:
In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system to solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing a combined knowledge representation approach that uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment implemented with the knowledge-based system approach. For system verification, results from design examples are presented.
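A blackboard architecture of the kind LADOME combines with rules can be summarized as knowledge sources that watch a shared blackboard and fire when their condition holds. The sketch below is a generic illustration of that control loop, not LADOME itself; all names and the design heuristics are hypothetical.

```python
# Minimal blackboard control loop; names are hypothetical, not LADOME's.
blackboard = {"span_m": 30.0}

def propose_geometry(bb):
    """Rule: once a span is known, propose a rise from a rule of thumb."""
    if "span_m" in bb and "rise_m" not in bb:
        bb["rise_m"] = bb["span_m"] / 6.0  # illustrative heuristic only
        return True
    return False

def size_members(bb):
    """Rule: once geometry exists, pick a nominal member section."""
    if "rise_m" in bb and "member" not in bb:
        bb["member"] = "CHS 88.9x4.0"  # placeholder section
        return True
    return False

knowledge_sources = [propose_geometry, size_members]

# Fire any applicable knowledge source until the blackboard is stable.
changed = True
while changed:
    changed = any(ks(blackboard) for ks in knowledge_sources)
print(blackboard)
```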
Abstract:
Map algebra is a data model and a simple functional notation for studying the distribution and patterns of spatial phenomena. It uses a uniform representation of space as discrete grids, which are organized into layers. This paper discusses extensions to map algebra that handle neighborhood operations with a new data type called a template. Templates provide general windowing operations on grids to enable spatial models for cellular automata, mathematical morphology, and local spatial statistics. A programming language for map algebra that incorporates templates and special processing constructs, called MapScript, is described. Example program scripts are presented that perform diverse neighborhood analyses for descriptive, model-based, and process-based analysis.
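The template idea, a window of cells that a grid operation slides over each location, corresponds to what most GIS toolkits call a focal or neighborhood operation. A minimal numpy sketch of a focal mean under a cross-shaped template (plain Python, not MapScript; the grid values are invented):

```python
import numpy as np

def focal(grid, template, func=np.nanmean):
    """Slide a boolean template over the grid and reduce the selected
    neighborhood at every cell (edges padded with NaN, then ignored)."""
    tr, tc = template.shape
    pr, pc = tr // 2, tc // 2
    padded = np.pad(grid.astype(float), ((pr, pr), (pc, pc)),
                    constant_values=np.nan)
    out = np.empty(grid.shape)
    for r in range(grid.shape[0]):
        for c in range(grid.shape[1]):
            window = padded[r:r + tr, c:c + tc]
            out[r, c] = func(window[template])
    return out

# A 3x3 cross-shaped template computing a focal mean on an invented grid.
template = np.array([[0, 1, 0],
                     [1, 1, 1],
                     [0, 1, 0]], dtype=bool)
grid = np.arange(25).reshape(5, 5)
print(focal(grid, template))
```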
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright © 2001 John Wiley & Sons, Ltd.
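The key idea, isolating syntax errors inside the tree so the rest of the document representation stays usable, can be illustrated without the full incremental LR machinery. The toy recursive-descent parser below wraps any unparseable token in an error node and keeps going; it is a conceptual sketch of error tolerance, not the paper's algorithm.

```python
# Toy error-tolerant parser for nested parentheses; an unparseable
# token becomes an "error" node instead of aborting the parse.
def parse(tokens):
    pos = 0

    def parse_list():
        nonlocal pos
        nodes = []
        while pos < len(tokens) and tokens[pos] != ")":
            if tokens[pos] == "(":
                pos += 1
                child = parse_list()
                if pos < len(tokens) and tokens[pos] == ")":
                    pos += 1
                    nodes.append(("group", child))
                else:
                    nodes.append(("error", child))  # unclosed group
            elif tokens[pos].isalnum():
                nodes.append(("atom", tokens[pos]))
                pos += 1
            else:
                nodes.append(("error", tokens[pos]))  # isolate bad token
                pos += 1
        return nodes

    return parse_list()

# The '$' and the unclosed '(' are confined to error nodes; the rest
# of the tree is intact and usable by other editor services.
print(parse(list("(a$b)(c")))
```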
Abstract:
This paper presents a method of formally specifying, refining, and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling the complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
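The integration rests on reading a state-based class as a CSP process: something with state and events that can be composed in parallel, synchronizing on shared events. The sketch below mimics that reading with plain labeled transition systems in Python; it is an informal illustration, not Object-Z or CSP semantics, and all process and event names are invented.

```python
# Two toy processes as labeled transition systems: state -> {event: next}.
producer = {"idle": {"put": "idle"}}
consumer = {"empty": {"put": "full"}, "full": {"get": "empty"}}

def parallel(p, q, sync):
    """CSP-style parallel composition: events in `sync` must be taken
    jointly by both processes; all other events interleave."""
    def step(state, event):
        sp, sq = state
        if event in sync:
            if event in p.get(sp, {}) and event in q.get(sq, {}):
                return (p[sp][event], q[sq][event])
        elif event in p.get(sp, {}):
            return (p[sp][event], sq)
        elif event in q.get(sq, {}):
            return (sp, q[sq][event])
        return None  # event refused in this state
    return step

step = parallel(producer, consumer, sync={"put"})
state = ("idle", "empty")
for event in ["put", "get", "put"]:
    state = step(state, event)
    print(event, "->", state)
```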
Abstract:
When the data consist of certain attributes measured on the same set of items in different situations, they would be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays where some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.
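The core computation is an EM fit of a mixture model in which each item's likelihood is evaluated over its observed entries only, which is what "missing at random" licenses. Below is a compact numpy sketch with diagonal-covariance Gaussian components on a flattened genotypes × (attribute, environment) matrix; the data are simulated, not the soybean set, and this is an illustration of the general approach rather than the paper's exact model.

```python
import numpy as np

def em_mixture(X, k, iters=50):
    """EM for a diagonal-Gaussian mixture; NaNs are treated as
    missing at random and skipped in the likelihood."""
    n, d = X.shape
    obs = ~np.isnan(X)
    Xf = np.where(obs, X, 0.0)
    rng = np.random.default_rng(0)
    mu = Xf[rng.choice(n, k, replace=False)]
    var = np.ones((k, d))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: log-responsibilities over observed entries only.
        logr = np.empty((n, k))
        for j in range(k):
            ll = -0.5 * (np.log(2 * np.pi * var[j]) + (Xf - mu[j]) ** 2 / var[j])
            logr[:, j] = np.log(pi[j]) + np.where(obs, ll, 0.0).sum(axis=1)
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weight each observed entry by its responsibility.
        for j in range(k):
            w = r[:, j:j + 1] * obs               # n x d weights
            tot = w.sum(axis=0) + 1e-9
            mu[j] = (w * Xf).sum(axis=0) / tot
            var[j] = (w * (Xf - mu[j]) ** 2).sum(axis=0) / tot + 1e-6
        pi = r.mean(axis=0)
    return r.argmax(axis=1)

# Simulated genotypes x (attribute, environment) matrix, 10% missing.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 6)), rng.normal(3, 1, (20, 6))])
X[rng.random(X.shape) < 0.1] = np.nan
print(em_mixture(X, k=2))
```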
Abstract:
The QU-GENE Computing Cluster (QCC) is a hardware and software solution for automating and speeding up large QU-GENE (QUantitative GENEtics) simulation experiments that are designed to examine the properties of genetic models, particularly those that involve factorial combinations of treatment levels. QCC automates the management of the distribution of components of the simulation experiments among networked single-processor computers to achieve this speedup.
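The pattern QCC automates, farming the cells of a factorial simulation design out to many processors, looks like this in miniature with Python's multiprocessing. The treatment factors and the simulation stub are invented placeholders, not QU-GENE's API, and QCC distributes across networked machines rather than local cores.

```python
import itertools
from multiprocessing import Pool

# Hypothetical treatment factors for a factorial simulation design.
HERITABILITY = [0.2, 0.5, 0.8]
POPULATION_SIZE = [100, 500]
SELECTION = ["mass", "pedigree"]

def run_simulation(cell):
    """Placeholder for one QU-GENE-style simulation run."""
    h2, n, strategy = cell
    # ... run the actual model here; return a summary statistic ...
    return (cell, h2 * n)  # dummy response for illustration

if __name__ == "__main__":
    cells = list(itertools.product(HERITABILITY, POPULATION_SIZE, SELECTION))
    with Pool() as pool:                 # one worker per local core
        for cell, result in pool.map(run_simulation, cells):
            print(cell, "->", result)
```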
Abstract:
The earnings gap between men and women has remained comparatively stable at an aggregate level over the 1990s in Australia. From one perspective, this is a reminder of the considerable difficulty of addressing wage differentials once the most overt forms of wage discrimination have been removed, and of the limited impact of most policy initiatives. From another, it may be seen as evidence that dire predictions about the effects of decentralisation on the earnings gap have failed to materialise. In this paper, I use Australian Bureau of Statistics data to show that a number of different trends are evident underneath the relatively static picture shown by the aggregate statistics, particularly as wage dispersion has increased. The data suggest not only that the prospects for pay equity are far from benign, but also that in the current labour market the issue of gender pay inequality cannot be effectively addressed separately from wage inequality more generally.