145 results for Applied Mathematics|Computer Engineering|Computer science


Relevance:

100.00%

Publisher:

Abstract:

We provide an abstract command language for real-time programs and outline how a partial correctness semantics can be used to compute execution times. The notions of a timed command, refinement of a timed command, the command traversal condition, and the worst-case and best-case execution time of a command are formally introduced and investigated with the help of an underlying weakest liberal precondition semantics. The central result is a theory for the computation of worst-case and best-case execution times from the underlying semantics based on supremum and infimum calculations. The framework is applied to the analysis of a message transmitter program and its implementation. (c) 2005 Elsevier B.V. All rights reserved.
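
The central result can be sketched informally as follows (the notation below is illustrative, not the paper's): writing tc(C, t) for the traversal condition of a timed command C over time t, derived from the weakest liberal precondition semantics, the two execution-time bounds are characterised as a supremum and an infimum over the times for which that condition is satisfiable.

    % Illustrative sketch only; tc, wcet and bcet are placeholder names.
    \[
      \mathit{wcet}(C) \;=\; \sup\{\, t \mid tc(C,t) \text{ is satisfiable} \,\}, \qquad
      \mathit{bcet}(C) \;=\; \inf\{\, t \mid tc(C,t) \text{ is satisfiable} \,\}.
    \]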

Relevance:

100.00%

Publisher:

Abstract:

All signals that appear to be periodic have some sort of variability from period to period, regardless of how stable they appear to be in a data plot. A true sinusoidal time series is a deterministic function of time that never changes and thus has zero bandwidth around the sinusoid's frequency. Zero bandwidth is impossible in nature since all signals have some intrinsic variability over time. Deterministic sinusoids are used to model cycles as a mathematical convenience. Hinich [IEEE J. Oceanic Eng. 25 (2) (2000) 256-261] introduced a parametric statistical model, called the randomly modulated periodicity (RMP), that allows one to capture the intrinsic variability of a cycle. As with a deterministic periodic signal, the RMP can have a number of harmonics. The likelihood ratio test for this model when the amplitudes and phases are known is given in [M.J. Hinich, Signal Processing 83 (2003) 1349-1352]. A method for detecting an RMP whose amplitudes and phases are unknown random processes, observed in additive stationary noise, is addressed in this paper. The only assumption on the additive noise is that it has finite dependence and finite moments. Using simulations based on a simple RMP model, we show a case where the new method can detect the signal when it is not detectable in a standard waterfall spectrogram display. (c) 2005 Elsevier B.V. All rights reserved.
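
A hedged sketch of the signal model (the symbols are ours, following the general form of an RMP rather than the paper's exact notation): a periodic signal with fundamental frequency f_0 and K harmonics whose Fourier amplitudes are perturbed by zero-mean random modulation processes, observed in additive stationary noise.

    % Illustrative form of a randomly modulated periodicity (RMP).
    \[
      x(t) \;=\; \sum_{k=1}^{K} \Big[ \big(a_k + u_k(t)\big)\cos(2\pi k f_0 t)
                 + \big(b_k + v_k(t)\big)\sin(2\pi k f_0 t) \Big] \;+\; e(t)
    \]
    % a_k, b_k: fixed amplitudes; u_k, v_k: zero-mean modulation processes that
    % carry the period-to-period variability; e(t): stationary noise with finite
    % dependence and finite moments.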

Relevance:

100.00%

Publisher:

Abstract:

Purpose - In many scientific and engineering fields, large-scale heat transfer problems with temperature-dependent pore-fluid densities are commonly encountered; heat transfer from the mantle into the upper crust of the Earth is a typical example. The main purpose of this paper is to develop and present a new combined methodology for solving large-scale heat transfer problems with temperature-dependent pore-fluid densities at the lithospheric and crustal scales. Design/methodology/approach - The theoretical approach is used to determine the thickness and the related thermal boundary conditions of the continental crust on the lithospheric scale, so that some important information can be provided accurately for establishing a numerical model of the crustal scale. The numerical approach is then used to simulate the detailed structures and complicated geometries of the continental crust on the crustal scale. The main advantage of the proposed combination of theoretical and numerical approaches is that, if the thermal distribution in the crust is of primary interest, the use of a reasonable numerical model on the crustal scale can result in a significant reduction in computational effort. Findings - From the ore body formation and mineralization points of view, the present analytical and numerical solutions demonstrate that the conductive-and-advective lithosphere with variable pore-fluid density is the most favourable lithosphere, because it may result in the thinnest lithosphere, so that the temperature near the surface of the crust can be high enough to generate shallow ore deposits there. The upward throughflow (i.e. mantle mass flux) can have a significant effect on the thermal structure within the lithosphere. In addition, the emplacement of hot materials from the mantle may further reduce the thickness of the lithosphere. Originality/value - The present analytical solutions can be used to: validate numerical methods for solving large-scale heat transfer problems; provide correct thermal boundary conditions for numerically solving ore body formation and mineralization problems on the crustal scale; and investigate the fundamental issues related to thermal distributions within the lithosphere. The proposed finite element analysis can be effectively used to account for the geometrical and material complexities of large-scale heat transfer problems with temperature-dependent fluid densities.
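
For orientation only (the governing equations are not given in the abstract, so the following is a generic sketch with our own symbols), steady conductive-and-advective heat transfer with a temperature-dependent pore-fluid density is commonly modelled by coupling heat transport to a Darcy throughflow with a linear, Boussinesq-type density law:

    % Generic sketch; not reproduced from the paper.
    \[
      \rho_f(T) \;=\; \rho_0\,[\,1 - \beta\,(T - T_0)\,], \qquad
      \nabla\cdot(\lambda\,\nabla T) \;-\; \rho_0\, c_p\, \mathbf{u}\cdot\nabla T \;=\; 0
    \]
    % beta: volumetric thermal expansion coefficient; lambda: thermal conductivity;
    % c_p: specific heat of the pore fluid; u: Darcy (upward throughflow) velocity.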

Relevance:

100.00%

Publisher:

Abstract:

Objective: Inpatient length of stay (LOS) is an important measure of hospital activity, health care resource consumption, and patient acuity. This research work aims at developing an incremental expectation maximization (EM) based learning approach on a mixture of experts (ME) system for on-line prediction of LOS. The use of a batch-mode learning process in most existing artificial neural networks to predict LOS is unrealistic, as the data become available over time and their patterns change dynamically. In contrast, an on-line process is capable of providing an output whenever a new datum becomes available. This on-the-spot information is therefore more useful and practical for making decisions, especially when one deals with a tremendous amount of data. Methods and material: The proposed approach is illustrated using a real example of gastroenteritis LOS data. The data set was extracted from a retrospective cohort study of all infants born in 1995-1997 and their subsequent admissions for gastroenteritis. The total number of admissions in this data set was n = 692. Linked hospitalization records of the cohort were retrieved retrospectively to derive the outcome measure, patient demographics, and associated co-morbidity information. A comparative study of the incremental learning and batch-mode learning algorithms is considered. The performances of the learning algorithms are compared based on the mean absolute difference (MAD) between the predictions and the actual LOS, and the proportion of predictions with MAD < 1 day (Prop(MAD < 1)). The significance of the comparison is assessed through a regression analysis. Results: The incremental learning algorithm provides better on-line prediction of LOS once the system has gained sufficient training from more examples (MAD = 1.77 days and Prop(MAD < 1) = 54.3%), compared to that using batch-mode learning. The regression analysis indicates a significant decrease of MAD (p-value = 0.063) and a significant (p-value = 0.044) increase of Prop(MAD < 1).
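
The two comparison metrics are straightforward to reproduce. A minimal sketch (function and variable names are ours, not the paper's) computing the mean absolute difference and the proportion of predictions within one day:

    import numpy as np

    def los_metrics(predicted, actual):
        """Return (MAD, Prop(MAD < 1)) for length-of-stay predictions in days."""
        predicted = np.asarray(predicted, dtype=float)
        actual = np.asarray(actual, dtype=float)
        abs_diff = np.abs(predicted - actual)     # per-admission absolute error
        mad = abs_diff.mean()                     # mean absolute difference (days)
        prop_within_1 = (abs_diff < 1.0).mean()   # proportion of errors under 1 day
        return mad, prop_within_1

    # Example usage (hypothetical arrays): compare incremental vs batch-mode predictions.
    # mad_inc, prop_inc = los_metrics(incremental_predictions, actual_los)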

Relevance:

100.00%

Publisher:

Abstract:

A multiagent diagnostic system implemented in a Protege-JADE-JESS environment interfaced with a dynamic simulator and database services is described in this paper. The proposed system architecture enables the use of a combination of diagnostic methods from heterogeneous knowledge sources. The process ontology and the process agents are designed based on the structure of the process system, while the diagnostic agents implement the applied diagnostic methods. A specific completeness coordinator agent is implemented to coordinate the diagnostic agents based on different methods. The system is demonstrated on a case study for diagnosis of faults in a granulation process based on HAZOP and FMEA analysis.

Relevance:

100.00%

Publisher:

Abstract:

High-level language program compilation strategies can be proven correct by modelling the process as a series of refinement steps from source code to a machine-level description. We show how this can be done for programs containing recursively-defined procedures in the well-established predicate transformer semantics for refinement. To do so the formalism is extended with an abstraction of the way stack frames are created at run time for procedure parameters and variables.
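
As a generic reminder of the refinement relation used in predicate transformer semantics (a sketch, not the paper's formal development), a compilation step from command S to command T is correct when T refines S, and a compiler is verified by chaining such steps from the source program down to the machine-level description, with the stack-frame abstraction introduced along the way:

    % Standard definition; wp may be read as wlp for partial correctness.
    \[
      S \sqsubseteq T \;\iff\; \forall Q.\; \mathit{wp}(S, Q) \Rightarrow \mathit{wp}(T, Q),
      \qquad
      \mathit{source} \;\sqsubseteq\; I_1 \;\sqsubseteq\; \cdots \;\sqsubseteq\; \mathit{machine}.
    \]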

Relevance:

100.00%

Publisher:

Abstract:

In many advanced applications, data are described by multiple high-dimensional features. Moreover, different queries may weight these features differently; some may not even specify all the features. In this paper, we propose our solution to support efficient query processing in these applications. We devise a novel representation that compactly captures f features into two components: The first component is a 2D vector that reflects a distance range ( minimum and maximum values) of the f features with respect to a reference point ( the center of the space) in a metric space and the second component is a bit signature, with two bits per dimension, obtained by analyzing each feature's descending energy histogram. This representation enables two levels of filtering: The first component prunes away points that do not share similar distance ranges, while the bit signature filters away points based on the dimensions of the relevant features. Moreover, the representation facilitates the use of a single index structure to further speed up processing. We employ the classical B+-tree for this purpose. We also propose a KNN search algorithm that exploits the access orders of critical dimensions of highly selective features and partial distances to prune the search space more effectively. Our extensive experiments on both real-life and synthetic data sets show that the proposed solution offers significant performance advantages over sequential scan and retrieval methods using single and multiple VA-files.
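
A minimal sketch of the two-component representation described above (the names are ours, and the 2-bit code below is a simplified quantile-based stand-in for the paper's descending-energy-histogram analysis):

    import numpy as np

    def encode_point(features, reference):
        """Two-component encoding of one data point described by f feature vectors.

        features  : list of f 1-D numpy arrays (one per feature)
        reference : matching list of reference vectors (e.g. the centre of the space)

        Returns (dist_range, signature): dist_range is the (min, max) of the f
        feature distances to the reference point; signature holds 2 bits per
        dimension of each feature.
        """
        dists = [float(np.linalg.norm(f - r)) for f, r in zip(features, reference)]
        dist_range = (min(dists), max(dists))        # first component: 2-D vector

        signature = []
        for f in features:
            quartiles = np.quantile(f, [0.25, 0.5, 0.75])
            for value in f:                          # two bits per dimension (codes 0..3)
                signature.append(int(np.searchsorted(quartiles, value)))
        return dist_range, signature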

Relevance:

100.00%

Publisher:

Abstract:

It is shown that in some cases it is possible to reconstruct a block design D uniquely from incomplete knowledge of a minimal defining set for D. This surprising result has implications for the use of minimal defining sets in secret sharing schemes.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes an ongoing collaboration between Boeing Australia Limited and the University of Queensland to develop and deliver an introductory course on software engineering for Boeing Australia. The aim of the course is to provide a common understanding for all Boeing Australia's engineering staff of the nature of software engineering and the practices used throughout Boeing Australia. It is meant as an introductory course that can be presented to people with varying backgrounds, such as recent software engineering graduates, systems engineers, quality assurance personnel, etc. The paper describes the structure and content of the course, and the evaluation techniques used to collect feedback from the participants and the corresponding results. The course has been well-received by the participants, but the feedback from the course has indicated a need for more advanced courses in specific areas.

Relevance:

100.00%

Publisher:

Abstract:

The demand for more pixels is beginning to be met as manufacturers increase the native resolution of projector chips. Tiling several projectors still offers a solution to augment the pixel capacity of a display. However, problems of color and illumination uniformity across projectors need to be addressed as well as the computer software required to drive such devices. We present the results obtained on a desktop-size tiled projector array of three D-ILA projectors sharing a common illumination source. A short throw lens (0.8:1) on each projector yields a 21-in. diagonal for each image tile; the composite image on a 3×1 array is 3840×1024 pixels with a resolution of about 80 dpi. The system preserves desktop resolution, is compact, and can fit in a normal room or laboratory. The projectors are mounted on precision six-axis positioners, which allow pixel level alignment. A fiber optic beamsplitting system and a single set of red, green, and blue dichroic filters are the key to color and illumination uniformity. The D-ILA chips inside each projector can be adjusted separately to set or change characteristics such as contrast, brightness, or gamma curves. The projectors were then matched carefully: photometric variations were corrected, leading to a seamless image. Photometric measurements were performed to characterize the display and are reported here. This system is driven by a small PC cluster fitted with graphics cards and running Linux. It can be scaled to accommodate an array of 2×3 or 3×3 projectors, thus increasing the number of pixels of the final image. Finally, we present current uses of the display in fields such as astrophysics and archaeology (remote sensing).
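
The quoted resolution is easy to check. A small sketch (tile geometry inferred from the figures in the abstract) relating tile size, diagonal, and dots per inch:

    import math

    # The 3x1 composite image is 3840x1024 pixels, so each projector tile is
    # 1280x1024 pixels on a 21-inch diagonal.
    tile_w_px, tile_h_px = 1280, 1024
    diag_in = 21.0

    diag_px = math.hypot(tile_w_px, tile_h_px)   # pixel count along the diagonal
    dpi = diag_px / diag_in                      # about 78 dpi, consistent with "about 80 dpi"
    print(round(dpi, 1))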

Relevance:

100.00%

Publisher:

Abstract:

The task of segmenting cell nuclei from cytoplasm in conventional Papanicolaou (Pap) stained cervical cell images is a classical image analysis problem which may prove to be crucial to the development of successful systems which automate the analysis of Pap smears for detection of cancer of the cervix. Although simple thresholding techniques will extract the nucleus in some cases, accurate unsupervised segmentation of very large image databases is elusive. Conventional active contour models as introduced by Kass, Witkin and Terzopoulos (1988) offer a number of advantages in this application, but suffer from the well-known drawbacks of initialisation and minimisation. Here we show that a Viterbi search-based dual active contour algorithm is able to overcome many of these problems and achieve over 99% accurate segmentation on a database of 20 130 Pap stained cell images. (C) 1998 Elsevier Science B.V. All rights reserved.
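
To illustrate the flavour of a Viterbi search over contour candidates (a generic dynamic-programming sketch in polar coordinates, not the dual active contour algorithm of the paper), one can choose, for each angular position around a nucleus, the radius that minimises an image cost plus a smoothness penalty:

    import numpy as np

    def viterbi_contour(cost, smooth=1.0):
        """cost[a, r]: cost of placing the contour at radius index r for angle a.
        Returns the radius index chosen at each of the A angular positions,
        penalising radius jumps between neighbouring angles (simplified to an
        open contour; the closed-contour constraint is ignored here)."""
        A, R = cost.shape
        total = cost[0].copy()                 # best cumulative cost per radius
        back = np.zeros((A, R), dtype=int)     # back-pointers for path recovery
        radii = np.arange(R)
        for a in range(1, A):
            # transition cost from radius r_prev (rows) to radius r (columns)
            trans = total[:, None] + smooth * np.abs(radii[:, None] - radii[None, :])
            back[a] = trans.argmin(axis=0)
            total = trans.min(axis=0) + cost[a]
        path = np.empty(A, dtype=int)          # trace back the minimum-cost path
        path[-1] = int(total.argmin())
        for a in range(A - 1, 0, -1):
            path[a - 1] = back[a, path[a]]
        return path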

Relevance:

100.00%

Publisher:

Abstract:

We use a spatially explicit population model to explore the population consequences of different habitat selection mechanisms on landscapes with fractal variation in habitat quality. We consider dispersal strategies ranging from random walks to perfect habitat selectors for two species of arboreal marsupial, the greater glider (Petauroides volans) and the mountain brushtail possum (Trichosurus caninus). In this model, increasing habitat selection means individuals obtain higher-quality territories but experience increased mortality during dispersal. The net effect is that population sizes are smaller when individuals actively select habitat. We find that positive relationships between habitat quality and population size can occur when individuals do not use information about the entire landscape, provided habitat quality is spatially autocorrelated. We also find that individual behaviour can mitigate the negative effects of spatial variation on population-average survival and fecundity. (C) 1998 Elsevier Science Ltd. All rights reserved.
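
A toy sketch of the trade-off described above (parameter names and the mortality model are ours): a disperser that samples more candidate territories obtains, on average, a higher-quality territory, but runs a higher risk of dying during dispersal.

    import numpy as np

    rng = np.random.default_rng(0)

    def disperse(quality, n_sampled, per_step_mortality=0.02):
        """One dispersal event on a landscape of territory qualities.

        n_sampled = 1 behaves like a random walk settling in the first territory
        encountered; larger values approximate a perfect habitat selector.
        Returns the quality of the territory obtained, or None if the animal
        dies while dispersing."""
        if rng.random() < 1 - (1 - per_step_mortality) ** n_sampled:
            return None                        # died during dispersal
        visited = rng.choice(quality, size=n_sampled, replace=False)
        return visited.max()                   # settle in the best territory seen

    # Example: compare a random walker with a strong habitat selector.
    landscape = rng.random(1000)               # fractal structure omitted for brevity
    outcomes = [disperse(landscape, n) for n in (1, 20)]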

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we describe a model of the human visual system (HVS) based on the wavelet transform. This model is largely based on a previously proposed model, but has a number of modifications that make it more amenable to potential integration into a wavelet-based image compression scheme. These modifications include the use of a separable wavelet transform instead of the cortex transform, the application of a wavelet contrast sensitivity function (CSF), and a simplified definition of subband contrast that allows us to predict noise visibility directly from wavelet coefficients. Initially, we outline the luminance, frequency, and masking sensitivities of the HVS and discuss how these can be incorporated into the wavelet transform. We then outline a number of limitations of the wavelet transform as a model of the HVS, namely the lack of translational invariance and poor orientation sensitivity. In order to investigate the efficacy of this wavelet-based model, a wavelet visible difference predictor (WVDP) is described. The WVDP is then used to predict visible differences between an original and a compressed (or noisy) image. Results are presented to emphasize the limitations of commonly used measures of image quality and to demonstrate the performance of the WVDP. The paper concludes with suggestions on how the WVDP can be used to determine a visually optimal quantization strategy for wavelet coefficients and produce a quantitative measure of image quality.
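
A crude stand-in for the idea of weighting wavelet subband differences by a contrast sensitivity function (this sketch assumes the PyWavelets package and uses placeholder CSF weights; the actual WVDP also models luminance and contrast masking, which are omitted here):

    import numpy as np
    import pywt

    def csf_weighted_error(original, distorted, wavelet="db4", levels=3, csf=None):
        """Sum of per-subband coefficient differences, weighted by a placeholder
        contrast sensitivity factor per decomposition level."""
        if csf is None:
            csf = {1: 1.0, 2: 0.8, 3: 0.5}     # placeholder CSF weights per level
        ref = pywt.wavedec2(original, wavelet, level=levels)
        tst = pywt.wavedec2(distorted, wavelet, level=levels)
        error = np.abs(ref[0] - tst[0]).sum()  # approximation band
        for level, (r_bands, t_bands) in enumerate(zip(ref[1:], tst[1:]), start=1):
            w = csf[level]                     # coarsest detail level first
            for r, t in zip(r_bands, t_bands): # horizontal, vertical, diagonal
                error += w * np.abs(r - t).sum()
        return error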

Relevance:

100.00%

Publisher:

Abstract:

CXTANNEAL is a program for analysing contaminant transport in soils. The code, written in Fortran 77, is a modified version of CXTFIT, a commonly used package for estimating solute transport parameters in soils. The improvement of the present code is that it includes simulated annealing as the optimization technique for curve fitting. Tests with hypothetical data show that CXTANNEAL performs better than the original code in searching for optimal parameter estimates. To reduce the computational time, a parallel version of CXTANNEAL (CXTANNEAL_P) was also developed. (C) 1999 Elsevier Science Ltd. All rights reserved.
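
A minimal sketch of simulated annealing for curve fitting in the same spirit (generic Python rather than the Fortran 77 of CXTANNEAL; all names are ours): minimise the sum of squared residuals between observed data and a model prediction.

    import numpy as np

    rng = np.random.default_rng(1)

    def anneal_fit(model, t, observed, x0, steps=5000, t0=1.0, cooling=0.999, step=0.05):
        """Fit model(t, params) to observed data by simulated annealing.
        model : callable returning predicted values for a parameter vector
        x0    : initial parameter guess (array-like)
        Returns the best parameter vector found."""
        def sse(p):
            return float(np.sum((model(t, p) - observed) ** 2))

        current, best = np.array(x0, float), np.array(x0, float)
        e_cur, e_best, temp = sse(current), sse(current), t0
        for _ in range(steps):
            candidate = current + step * rng.standard_normal(current.shape)
            e_cand = sse(candidate)
            # accept downhill moves always, uphill moves with Boltzmann probability
            if e_cand < e_cur or rng.random() < np.exp((e_cur - e_cand) / temp):
                current, e_cur = candidate, e_cand
                if e_cur < e_best:
                    best, e_best = current.copy(), e_cur
            temp *= cooling                    # geometric cooling schedule
        return best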

Relevance:

100.00%

Publisher: