954 results for Object-oriented methods


Relevance:

40.00%

Publisher:

Abstract:

This paper presents a framework for compositional verification of Object-Z specifications. Its key feature is a proof rule based on decomposition of hierarchical Object-Z models. For each component in the hierarchy, local properties are proven in a single proof step. However, we do not consider components in isolation. Instead, components are envisaged in the context of the referencing super-component, and proof steps involve assumptions on properties of the sub-components. The framework is defined for Linear Temporal Logic (LTL).

Relevance:

40.00%

Publisher:

Abstract:

This paper deals with problems of preliminary processing of color images, namely interference suppression (both local interference and noise) and extraction of the object from the background at the stage preceding contour extraction. It was long considered that smoothing is inadmissible when segmentation is performed through boundary extraction, but the methods described here and the results obtained demonstrate the expedience of applying noise-suppression methods.

Relevance:

40.00%

Publisher:

Abstract:

In this article we consider the application of the generalization of the symmetric version of the interior penalty discontinuous Galerkin finite element method to the numerical approximation of the compressible Navier-Stokes equations. In particular, we consider the a posteriori error analysis and adaptive mesh design for the underlying discretization method. Indeed, by employing a duality argument, (weighted) Type I a posteriori bounds are derived for the estimation of the error measured in terms of general target functionals of the solution; these error estimates involve the product of the finite element residuals with local weighting terms involving the solution of a certain dual problem that must be numerically approximated. This general approach leads to the design of economical finite element meshes specifically tailored to the computation of the target functional of interest, as well as providing efficient error estimation. Numerical experiments demonstrating the performance of the proposed approach will be presented.
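The dual-weighted-residual idea behind these Type I bounds can be illustrated with a minimal sketch (hypothetical element residuals and dual weights, not the paper's Navier-Stokes discretization): each element residual is multiplied by a weight derived from the approximated dual solution, the products sum to an estimate of the error in the target functional, and the largest indicators drive mesh refinement.

```python
# Minimal sketch of dual-weighted-residual (Type I) error estimation:
# the error in a target functional is estimated as the sum of element
# residuals weighted by a (numerically approximated) dual solution.
residuals = [0.12, 0.05, 0.30, 0.01]   # hypothetical element residuals
dual_weights = [0.5, 1.0, 0.2, 2.0]    # hypothetical dual-solution weights

# Element-wise error indicators and the global estimate of the functional error.
indicators = [abs(r * w) for r, w in zip(residuals, dual_weights)]
eta = sum(indicators)

# Mark elements for refinement where the indicator is largest
# (a simple fixed-fraction marking strategy).
threshold = 0.5 * max(indicators)
marked = [i for i, v in enumerate(indicators) if v >= threshold]
```

Note that a large residual alone (element 2 here) does not force refinement: a small dual weight can suppress its influence on the target functional, which is what makes the resulting meshes economical.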

Relevance:

30.00%

Publisher:

Abstract:

This paper presents case studies in power systems using Sensitivity Analysis (SA) oriented by Optimal Power Flow (OPF) problems in different operation scenarios. The case studies start from a known optimal solution obtained by OPF. This optimal solution is called the base case, and from it new operation points may be evaluated by SA when perturbations occur in the system. The SA is based on Fiacco's Theorem and has the advantage of not being an iterative process. In order to show the good performance of the proposed technique, tests were carried out on the IEEE 14-, 118- and 300-bus systems. (C) 2010 Elsevier Ltd. All rights reserved.
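The non-iterative character of the sensitivity step can be sketched on a one-variable quadratic stand-in (not the paper's OPF model): once the base-case optimum and its sensitivity to a parameter are known, a perturbed operating point is predicted in a single evaluation rather than by re-solving the optimization.

```python
# Minimal sketch of Fiacco-style sensitivity analysis on a toy problem:
# for min f(x; p) = 0.5*a*x**2 - p*x the optimum is x*(p) = p/a, so the
# sensitivity dx*/dp = 1/a lets us predict a perturbed optimum directly.
a = 2.0
p_base = 4.0
x_base = p_base / a            # known base-case optimum from the "OPF" solve

dxdp = 1.0 / a                 # sensitivity of the optimum w.r.t. the parameter
dp = 0.5                       # perturbation of the operating scenario
x_pred = x_base + dxdp * dp    # first-order, non-iterative estimate

x_exact = (p_base + dp) / a    # exact re-solved optimum, for comparison
```

For this linear-sensitivity toy case the prediction is exact; in a real power-system OPF the first-order estimate is only valid for small perturbations around the base case.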

Relevance:

30.00%

Publisher:

Abstract:

In this paper, processing methods of Fourier optics implemented in a digital holographic microscopy system are presented. The proposed methodology is based on the capability of digital holography to carry out the whole reconstruction of the recorded wave front and, consequently, to determine the phase and intensity distribution in any arbitrary plane located between the object and the recording plane. In this way, in digital holographic microscopy the field produced by the objective lens can be reconstructed along its propagation, allowing the reconstruction of the back focal plane of the lens, so that the complex amplitudes of the Fraunhofer diffraction, or equivalently the Fourier transform, of the light distribution across the object can be known. Manipulation of the Fourier-transform plane makes possible the design of digital methods of optical processing and image analysis. The proposed method has great practical utility and represents a powerful tool in image analysis and data processing. The theoretical aspects of the method are presented, and its validity has been demonstrated using computer-generated holograms and simulated images of microscopic objects. (c) 2007 Elsevier B.V. All rights reserved.
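The Fourier-plane step described above can be sketched numerically (a hypothetical 1-D slit object, not the paper's holographic data): up to scaling, the field in the back focal plane of the lens is the Fourier transform of the object-plane distribution, so a centered FFT of a slit-like field plays the role of its Fraunhofer diffraction pattern.

```python
import numpy as np

# Minimal sketch of the Fourier-optics relation: the centered FFT of the
# object-plane field approximates the Fraunhofer pattern in the back focal
# plane, which can then be manipulated for filtering or image analysis.
N = 256
field = np.zeros(N, dtype=complex)
field[N // 2 - 8 : N // 2 + 8] = 1.0       # hypothetical 1-D slit object

# Centered FFT = Fourier-plane (back-focal-plane) complex amplitude.
spectrum = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(field)))
intensity = np.abs(spectrum) ** 2          # sinc-like intensity for a slit
```

In a full 2-D implementation the same `fftshift`/`ifftshift` centering applies along both axes, and a filter mask applied to `spectrum` before an inverse FFT implements the digital optical processing mentioned in the abstract.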

Relevance:

30.00%

Publisher:

Abstract:

Results of two experiments are reported that examined how people respond to rectangular targets of different sizes in simple hitting tasks. If a target moves in a straight line and a person is constrained to move along a linear track oriented perpendicular to the target's motion, then the length of the target along its direction of motion constrains the temporal accuracy and precision required to make the interception. The dimensions of the target perpendicular to its direction of motion place no constraints on performance in such a task. In contrast, if the person is not constrained to move along a straight track, the target's dimensions may constrain the spatial as well as the temporal accuracy and precision. The experiments reported here examined how people responded to targets of different vertical extent (height): the task was to strike targets that moved along a straight, horizontal path. In experiment 1, participants were constrained to move along a horizontal linear track to strike targets, and so target height did not constrain performance. Target height, length and speed were co-varied. Movement time (MT) was unaffected by target height but was systematically affected by length (briefer movements to smaller targets) and speed (briefer movements to faster targets). Peak movement speed (Vmax) was influenced by all three independent variables: participants struck shorter, narrower and faster targets harder. In experiment 2, participants were constrained to move in a vertical plane normal to the target's direction of motion. In this task target height constrains the spatial accuracy required to contact the target. Three groups of eight participants struck targets of different height but of constant length and speed, hence constant temporal accuracy demand (different for each group; one group struck stationary targets, i.e. no temporal accuracy demand).
On average, participants showed little or no systematic response to changes in spatial accuracy demand on any dependent measure (MT, Vmax, spatial variable error). The results are interpreted in relation to previous results on movements aimed at stationary targets in the absence of visual feedback.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a means of structuring specifications in real-time Object-Z: an integration of Object-Z with the timed refinement calculus. Incremental modification of classes using inheritance and composition of classes to form multi-component systems are examined. Two approaches to the latter are considered: using Object-Z's notion of object instantiation and introducing a parallel composition operator similar to those found in process algebras. The parallel composition operator approach is both more concise and allows more general modelling of concurrency. Its incorporation into the existing semantics of real-time Object-Z is presented.

Relevance:

30.00%

Publisher:

Abstract:

The Edinburgh-Cape Blue Object Survey is a major survey to discover blue stellar objects brighter than B ~ 18 in the southern sky. It is planned to cover an area of sky of 10 000 deg^2 with |b| > 30 degrees and delta < 0 degrees. The blue stellar objects are selected by automatic techniques from U and B pairs of UK Schmidt Telescope plates scanned with the COSMOS measuring machine. Follow-up photometry and spectroscopy are being obtained with the SAAO telescopes to classify objects brighter than B = 16.5. This paper describes the survey, the techniques used to extract the blue stellar objects, the photometric methods and accuracy, the spectroscopic classification, and the limits and completeness of the survey.

Relevance:

30.00%

Publisher:

Abstract:

Statement of the Problem: Adhesive systems can spread differently onto a substrate and, consequently, influence bonding. Purpose: The purpose of this study was to evaluate the effect of differently oriented dentin surfaces and the regional variation of specimens on adhesive layer thickness and microtensile bond strength (MTBS). Materials and Methods: Twenty-four molars were sectioned mesiodistally to expose flat buccal and lingual halves. Standardized drop volumes of adhesive systems (Single Bond [SB] and Prime & Bond 2.1 [PB2.1]) were applied to dentin according to the manufacturer's instructions. Teeth halves were randomly divided into groups: 1A-SB/parallel to gravity; 1B-SB/perpendicular to gravity; 2A-PB2.1/parallel to gravity; and 2B-PB2.1/perpendicular to gravity. The bonded assemblies were stored in 37 degrees C distilled water for 24 hours and then sectioned to obtain dentin sticks (0.8 mm^2). The adhesive layer thickness was determined in a light microscope (x200), and after 48 hours the specimens were subjected to the MTBS test. Data were analyzed by one-way and two-way analysis of variance and Student-Newman-Keuls tests. Results: Mean values (MPa +/- SD) of MTBS were: 39.1 +/- 12.9 (1A); 32.9 +/- 12.4 (1B); 52.9 +/- 15.2 (2A); and 52.3 +/- 16.5 (2B). The adhesive systems' thicknesses (µm +/- SD) were: 11.2 +/- 2.9 (1A); 18.1 +/- 7.3 (1B); 4.2 +/- 1.8 (2A); and 3.9 +/- 1.3 (2B). No correlation between bond strength and adhesive layer thickness was observed for either SB or PB2.1 (r = -0.224, p = 0.112 and r = 0.099, p = 0.491, respectively). Conclusions: The effects of differently oriented dentin surfaces and of the regional variation of specimens on the adhesive layer thickness are material-dependent. These variables do not influence the adhesive systems' bond strength to dentin. CLINICAL SIGNIFICANCE Adhesive systems have different viscosities and spread differently onto a substrate, influencing the bond strength and also the adhesive layer thickness.
Adhesive thickness does not influence dentin bond strength, but it may impair adequate solvent evaporation and polymer conversion, and may also determine water sorption and adhesive degradation over time. In the literature, many studies have shown that the adhesive layer is a permeable membrane and can fail over time because of its continuous plasticizing and degradation when in contact with water. Therefore, avoiding thick adhesive layers may minimize these problems and provide long-term success for adhesive restorations.
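The correlation test underlying the reported r and p values can be illustrated with a small worked example (hypothetical thickness/strength pairs, not the study's measurements): Pearson's r is the covariance of the two measures normalized by the product of their standard deviations.

```python
import math

# Minimal sketch of the Pearson correlation computed between adhesive layer
# thickness and microtensile bond strength (hypothetical specimen data).
thickness = [10.5, 12.0, 15.3, 18.1, 9.8, 14.2]   # µm, hypothetical
strength = [41.0, 38.5, 36.2, 33.0, 40.1, 37.8]   # MPa, hypothetical

n = len(thickness)
mx = sum(thickness) / n
my = sum(strength) / n
cov = sum((x - mx) * (y - my) for x, y in zip(thickness, strength))
sx = math.sqrt(sum((x - mx) ** 2 for x in thickness))
sy = math.sqrt(sum((y - my) ** 2 for y in strength))
r = cov / (sx * sy)   # negative r: strength falls as thickness grows
```

In the study itself the observed r values (-0.224 and 0.099) were small with p > 0.05, i.e. no significant linear relationship between thickness and bond strength.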

Relevance:

30.00%

Publisher:

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
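The projection-composition idea can be sketched outside the Bird-Meertens calculus (a hypothetical Python toy program, not the paper's formalism): the slicing criterion is a projection composed with the original program, and calculation then simplifies the composite down to the code on which the selected output actually depends.

```python
# Minimal sketch of slicing by composing a projection with the program.
def program(xs):
    """Hypothetical program computing two independent results at once."""
    return (sum(xs), [x * x for x in xs])

fst = lambda pair: pair[0]            # slicing criterion: keep first output only

sliced = lambda xs: fst(program(xs))  # the slice, as "projection . program"

# After calculation/simplification, the slice reduces to just the code the
# first component depends on; the squaring computation has been sliced away.
simplified_slice = lambda xs: sum(xs)
```

Both `sliced` and `simplified_slice` agree on every input; the point of the calculational approach is to derive the second from the first by equational reasoning rather than by traversing a dependency graph.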

Relevance:

30.00%

Publisher:

Abstract:

Master's Dissertation in Informatics Engineering

Relevance:

30.00%

Publisher:

Abstract:

Prostate Specific Antigen (PSA) is the biomarker of choice for screening prostate cancer throughout the population, with PSA values above 10 ng/mL pointing out a high probability of associated cancer [1]. According to the most recent World Health Organization (WHO) data, prostate cancer is the commonest form of cancer in men in Europe [2]. Early detection of prostate cancer is thus very important and is currently made by screening PSA in men over 45 years old, combined with other alterations in serum and urine parameters. PSA is a glycoprotein with a molecular mass of approximately 32 kDa consisting of one polypeptide chain, which is produced by the secretory epithelium of the human prostate. Currently, the standard methods available for PSA screening are immunoassays like the Enzyme-Linked Immunosorbent Assay (ELISA). These methods are highly sensitive and specific for the detection of PSA, but they require expensive laboratory facilities and highly qualified personnel. Other highly sensitive and specific methods for the detection of PSA have also become available and are, in their majority, immunobiosensors [1,3-5], relying on antibodies. Less expensive methods producing quicker responses are thus needed, which may be achieved by synthesizing artificial antibodies by means of molecular imprinting techniques. These should also be coupled to simple and low-cost devices, such as those of the potentiometric kind, an approach that has been proven successful [6]. Potentiometric sensors offer the advantages of selectivity and portability for use at the point of care and have been widely recognized as potential analytical tools in this field. The inherent method is simple, precise, accurate and inexpensive regarding reagent consumption and equipment involved. Thus, this work proposes a new plastic antibody for PSA, designed over the surface of graphene layers extracted from graphite. Charged monomers were used to enable an oriented tailoring of the PSA rebinding sites.
Uncharged monomers were used as control. These materials were used as ionophores in conventional solid-contact graphite electrodes. The obtained results showed that the imprinted materials displayed a selective response to PSA. The electrodes with charged monomers showed a more stable and sensitive response, with an average slope of -44.2 mV/decade and a detection limit of 5.8x10^-11 mol/L (2 ng/mL). The corresponding non-imprinted sensors showed smaller sensitivity, with average slopes of -24.8 mV/decade. The best sensors were successfully applied to the analysis of serum samples, with percentage recoveries of 106.5% and relative errors of 6.5%.
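The reported slope and detection limit can be illustrated with a small worked example (hypothetical calibration points on an ideal -44.2 mV/decade line, not the paper's raw data): the slope is the linear-regression coefficient of measured potential against log10 of concentration, and the molar detection limit converts to a mass concentration via PSA's approximately 32 kDa molar mass.

```python
import math

# Minimal sketch of a potentiometric calibration: potential E (mV) is fitted
# linearly against log10 of PSA concentration; the regression coefficient is
# the sensor slope in mV/decade (hypothetical, ideal calibration data).
concs = [1e-10, 1e-9, 1e-8, 1e-7]          # mol/L, hypothetical standards
emfs = [250.0, 205.8, 161.6, 117.4]        # mV, on an ideal -44.2 mV/decade line

x = [math.log10(c) for c in concs]
n = len(x)
xbar = sum(x) / n
ybar = sum(emfs) / n
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, emfs)) / \
        sum((xi - xbar) ** 2 for xi in x)  # mV per decade of concentration

# Converting the reported molar detection limit to mass concentration for a
# ~32 kDa protein: 5.8e-11 mol/L * 32000 g/mol ≈ 1.9e-6 g/L ≈ 2 ng/mL.
dl_g_per_L = 5.8e-11 * 32000
```

The conversion shows why the abstract can quote the same detection limit both in mol/L and as roughly 2 ng/mL.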

Relevance:

30.00%

Publisher:

Abstract:

Dissertation for obtaining the degree of Doctor in Environment, from Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance:

30.00%

Publisher:

Abstract:

Dissertation to obtain the degree of Doctor of Philosophy in Electrical and Computer Engineering (Industrial Information Systems)