39 results for System verification and analysis

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or with the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important that not only the application but also the underlying middleware and the operating system are analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder to understand. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
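
The abstract does not give implementation details, but the core idea of lifting heterogeneous log lines into a common, queryable store can be sketched with standard Semantic Web tooling. The sketch below is a minimal illustration, not Slogger itself: the namespace, log format, and property names are all assumptions.

```python
# Minimal sketch: lift plain-text log lines from different layers into RDF
# and query them with SPARQL, in the spirit of unifying heterogeneous logs
# in one data store. The EX namespace and log format are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/logs#")

def ingest(graph, source, lines):
    """Parse 'LEVEL timestamp message' lines from one layer into triples."""
    for i, line in enumerate(lines):
        level, ts, msg = line.split(" ", 2)
        entry = EX[f"{source}-{i}"]
        graph.add((entry, RDF.type, EX.LogEntry))
        graph.add((entry, EX.layer, Literal(source)))
        graph.add((entry, EX.level, Literal(level)))
        graph.add((entry, EX.timestamp, Literal(ts)))
        graph.add((entry, EX.message, Literal(msg)))

g = Graph()
ingest(g, "middleware", ["ERROR 2009-01-01T12:00:01 broker timeout"])
ingest(g, "os",         ["WARN 2009-01-01T12:00:02 swap usage high"])

# One query now spans the logs of every layer.
q = """
PREFIX ex: <http://example.org/logs#>
SELECT ?layer ?timestamp ?message
WHERE { ?e ex:level ?lvl ; ex:layer ?layer ;
            ex:timestamp ?timestamp ; ex:message ?message .
        FILTER (?lvl IN ("ERROR", "WARN")) }
ORDER BY ?timestamp
"""
for row in g.query(q):
    print(row.layer, row.timestamp, row.message)
```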

Relevance:

100.00%

Publisher:

Abstract:

An algorithm for solving nonlinear discrete-time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimization and Parameter Estimation (DISOPE), which achieves the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimization procedure. A version of the algorithm with a linear-quadratic model-based problem, implemented in the C++ programming language, is developed and applied to illustrative simulation examples. An analysis of the optimality and convergence properties of the algorithm is also presented.
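
The paper itself contains the algorithmic details; the skeleton below only sketches the general DISOPE iteration pattern of alternating between a model-based optimization and an estimation step that accounts for model-reality differences. The function names, relaxation scheme, and convergence test here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def disope_sketch(solve_model_based, estimate_corrections,
                  u0, relax=0.5, tol=1e-6, max_iter=200):
    """Illustrative DISOPE-style loop (not the paper's exact algorithm).

    Alternates between (1) solving the model-based optimal control
    problem, modified by correction terms that account for model-reality
    differences, and (2) re-estimating those corrections from the real
    system's response, until the control sequence converges.
    """
    u = np.asarray(u0, dtype=float)
    corrections = None                               # none on first pass
    for _ in range(max_iter):
        u_star = solve_model_based(u, corrections)   # step (1)
        u_next = u + relax * (u_star - u)            # relaxation for stability
        corrections = estimate_corrections(u_next)   # step (2)
        if np.linalg.norm(u_next - u) < tol:
            return u_next
        u = u_next
    return u
```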

Relevance:

100.00%

Publisher:

Abstract:

This paper makes a theoretical case for using two systems approaches together. The theoretical and methodological assumptions of system dynamics (SD) and soft systems methodology (SSM) are briefly described and a partial critique is presented. SSM generates and represents diverse perspectives on a problem situation and addresses the socio-political elements of an intervention. However, it is weak in ensuring 'dynamic coherence': consistency between the intuitive behaviour resulting from proposed changes and the behaviour deduced from ideas on causal structure. Conversely, SD examines causal structures and dynamic behaviours. However, whilst emphasising the need for a clear issue focus, it has little theory for generating and representing diverse issues, and no theory for facilitating sensitivity to socio-political elements. A synthesis of the two, called 'Holon Dynamics', is proposed. After an SSM intervention, a second stage continues the socio-political analysis and also operates within a new perspective which values dynamic coherence of the mental construct - the holon - that expresses the proposed changes. A model of this holon is constructed using SD, and the changes are thus rendered 'systemically desirable' in the additional sense that dynamic consistency has been confirmed. The paper closes with reflections on the proposal, and the need for theoretical consistency when mixing tools is emphasised.
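
As a concrete illustration of what checking 'dynamic coherence' involves, a small stock-and-flow model can be simulated and its trajectory compared with the behaviour stakeholders intuitively expect from the proposed changes. The workforce example below is purely illustrative and is not taken from the paper; all numbers are assumptions.

```python
# Minimal system-dynamics sketch: a single stock (workforce) with a
# goal-seeking hiring inflow and a proportional attrition outflow,
# integrated by Euler's method. Comparing the simulated trajectory
# against intuitive expectations is the kind of "dynamic coherence"
# check discussed above. All parameter values are illustrative.
def simulate_workforce(target=100.0, hire_delay=4.0, attrition=0.05,
                       stock=60.0, dt=0.25, t_end=40.0):
    trajectory = []
    t = 0.0
    while t <= t_end:
        trajectory.append((t, stock))
        hiring = (target - stock) / hire_delay   # goal-seeking inflow
        leaving = attrition * stock              # outflow proportional to stock
        stock += dt * (hiring - leaving)         # Euler integration step
        t += dt
    return trajectory

for t, s in simulate_workforce()[::20]:
    print(f"t={t:5.1f}  workforce={s:6.1f}")
```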

Relevance:

100.00%

Publisher:

Abstract:

Land cover plays a key role in global to regional monitoring and modeling because it affects and is affected by climate change, and it has thus become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities in Canada, the United States, and Mexico to map land cover change across North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The map is based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database; the complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples, using primary and alternative calls in the reference data to account for ambiguity, indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels, and 92.6% for pixels with a map confidence of more than 50%, was found in the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons to related land cover maps resulted in 56.6% agreement with the MODIS land cover product and a congruence of 49.5% with GlobCover.
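
The combination step can be illustrated with off-the-shelf tooling: several decision trees vote per pixel, the averaged votes give per-class membership proportions, and the winning class's share doubles as a confidence estimate. The sketch below uses scikit-learn on synthetic data and illustrates only the general scheme, not the NALCMS processing chain.

```python
# Sketch of per-pixel class membership from several decision trees
# (synthetic data; illustrative of the general scheme only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 6))        # e.g. MODIS band statistics
y_train = rng.integers(0, 4, size=500)     # 4 hypothetical land cover classes
X_pixels = rng.normal(size=(1000, 6))      # pixels to classify

# Train several trees on bootstrap samples and average their votes.
n_trees = 10
membership = np.zeros((len(X_pixels), 4))
for seed in range(n_trees):
    idx = rng.integers(0, len(X_train), len(X_train))  # bootstrap sample
    tree = DecisionTreeClassifier(random_state=seed)
    tree.fit(X_train[idx], y_train[idx])
    membership += tree.predict_proba(X_pixels)
membership /= n_trees                      # per-class proportions per pixel

label = membership.argmax(axis=1)          # discrete map
confidence = membership.max(axis=1)        # confidence estimate
print("pixels with >= 50% confidence:", (confidence >= 0.5).mean())
```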

Relevance:

100.00%

Publisher:

Abstract:

With the increase in e-commerce and the digitisation of design data and information, the construction sector has become reliant upon IT infrastructure and systems. The design and production process is more complex and more interconnected, and relies upon greater information mobility, with seamless exchange of data and information in real time. Construction small and medium-sized enterprises (CSMEs), in particular the speciality contractors, can effectively utilise cost-effective collaboration-enabling technologies, such as cloud computing, to help transfer information and data effectively and so improve productivity. The system dynamics (SD) approach offers a perspective and tools for better understanding the dynamics of complex systems. This research focuses upon system dynamics methodology as a modelling and analysis tool to understand and identify the key drivers in the absorption of cloud computing by CSMEs. The aim of this paper is to determine how the use of SD can improve the management of information flow through collaborative technologies, leading to improved productivity. The data supporting the use of system dynamics were obtained through a pilot study consisting of questionnaires and interviews with five CSMEs in the UK house-building sector.

Relevance:

100.00%

Publisher:

Abstract:

Haptic devices tend to be kept small, as it is easier to achieve a large change of stiffness with a low associated apparent mass. If large movements are required, there is usually a reduction in the quality of the haptic sensations which can be displayed. The typical measure of haptic device performance is impedance-width (z-width), but this does not account for actuator saturation, usable workspace or the ability to make rapid movements. This paper presents the analysis and evaluation of a haptic device design, utilizing a variant of redundant kinematics sometimes referred to as a macro-micro configuration, intended to allow large and fast movements without loss of impedance-width. A brief mathematical analysis of the design constraints is given, and a prototype system is described in which the effects of different elements of the control scheme can be examined to better understand the potential benefits and trade-offs in the design. Finally, the performance of the system is evaluated using a Fitts' Law test and found to compare favourably with similar evaluations of smaller-workspace devices.
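
A Fitts' Law evaluation is straightforward to reproduce: movement time MT is regressed against the index of difficulty ID = log2(2D/W), for target distance D and width W, and a throughput figure is reported from the fit. The sketch below shows this standard computation on made-up data; it is not the paper's evaluation, and the slope-inverse throughput used here is one of several common variants.

```python
# Standard Fitts' Law analysis: regress movement time MT on the index of
# difficulty ID = log2(2D / W). All data values below are made up.
import numpy as np

D  = np.array([0.10, 0.20, 0.40, 0.10, 0.20, 0.40])   # target distance (m)
W  = np.array([0.02, 0.02, 0.02, 0.01, 0.01, 0.01])   # target width (m)
MT = np.array([0.55, 0.71, 0.88, 0.70, 0.86, 1.04])   # movement time (s)

ID = np.log2(2 * D / W)          # index of difficulty (bits)
b, a = np.polyfit(ID, MT, 1)     # fit MT ~ a + b * ID
throughput = 1.0 / b             # bits per second (slope-inverse form)
print(f"MT = {a:.3f} + {b:.3f} * ID;  throughput ~ {throughput:.1f} bit/s")
```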

Relevance:

100.00%

Publisher:

Abstract:

The EP2025 EDS project develops a highly parallel information server that supports established high-value interfaces. We describe the motivation for the project, the architecture of the system, and the design and application of its database and language subsystems. The Elipsys logic programming language, its advanced applications, EDS Lisp, and the Metal machine translation system are examined.

Relevance:

100.00%

Publisher:

Abstract:

The CAFS search engine is a real machine in a virtual machine world; it is the hardware component of ICL's CAFS system. The paper is an introduction and prelude to the set of papers in this volume on CAFS applications. It defines the CAFS system and its context, together with the function of its hardware and software components. It examines CAFS' role in the broad context of application development and information systems, and highlights some techniques and applications which exploit the CAFS system. Finally, it concludes with some suggestions for possible further developments. 'Search out thy wit for secret policies / And we will make thee famous through the world' (Henry VI, 1:3).

Relevance:

100.00%

Publisher:

Abstract:

Pharmacogenetic trials investigate the effect of genotype on treatment response. When there are two or more treatment groups and two or more genetic groups, investigation of gene-treatment interactions is of key interest. However, calculation of the power to detect such interactions is complicated, because this depends not only on the treatment effect size within each genetic group, but also on the number of genetic groups, the size of each genetic group, and the type of genetic effect that is both present and tested for. The scale chosen to measure the magnitude of an interaction can also be problematic, especially in the binary case. Elston et al. proposed a test for detecting the presence of gene-treatment interactions for binary responses, and gave appropriate power calculations. This paper shows how the same approach can also be used for normally distributed responses. We also propose a method for analysing and performing sample size calculations based on a generalized linear model (GLM) approach. The power of the Elston et al. and GLM approaches is compared for the binary and normal cases using several illustrative examples. While more sensitive to errors in model specification than the Elston et al. approach, the GLM approach is much more flexible and in many cases more powerful. Copyright © 2005 John Wiley & Sons, Ltd.
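
For the normally distributed case, a GLM-based power calculation can be approximated by simulation: generate responses under an assumed gene-treatment interaction, fit a linear model with an interaction term, and count how often the interaction is significant. The sketch below (statsmodels, with made-up effect sizes and group structure) illustrates the idea; it does not reproduce the paper's calculations.

```python
# Simulation-based power estimate for a gene-by-treatment interaction in
# a linear model (normal response). Effect sizes and n are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

def interaction_power(n=200, beta_int=0.5, sigma=1.0, n_sims=500, alpha=0.05):
    hits = 0
    for _ in range(n_sims):
        treat = rng.integers(0, 2, n)        # two treatment groups
        gene = rng.integers(0, 2, n)         # two genetic groups
        y = (0.2 * treat + 0.1 * gene
             + beta_int * treat * gene       # the interaction of interest
             + rng.normal(0, sigma, n))
        df = pd.DataFrame({"y": y, "treat": treat, "gene": gene})
        fit = smf.ols("y ~ treat * gene", data=df).fit()
        hits += fit.pvalues["treat:gene"] < alpha
    return hits / n_sims

print("estimated power:", interaction_power())
```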

Relevance:

100.00%

Publisher:

Abstract:

Organic sweet maize is a new industrial crop product. A field experiment was conducted to determine the effects of cultural systems on the growth, photosynthesis and yield components of a sweet maize crop (Zea mays L. F-1 hybrid 'Midas'). A randomized complete block design was employed with four replicates per treatment (organic fertilization: cow manure (5, 10 and 20 t ha-1), poultry manure (5, 10 and 20 t ha-1) and barley mulch (5, 10 and 20 t ha-1); synthetic fertilizer (240 kg N ha-1): 21-0-0; and control). The lowest dry weight, height, leaf area index and soil organic matter were measured in the control treatment. Organic matter content was proportionate to the amount of manure applied. The control plots had the lowest yield (1593 kg ha-1) and the double-rate cow manure plots the greatest (6104 kg ha-1). A high correlation between sweet corn yield and organic matter was registered. Moreover, the lowest values of 1000-grain weight were obtained in the control plot. The fertilizer plot gave values similar to those of the full-rate cow manure treatment. The photosynthetic rate of the untreated control was significantly lower than that of the other treatments. The photosynthetic rate increased as poultry manure and barley mulch rates decreased and as the cow manure rate increased. Furthermore, the untreated control had the lowest stomatal conductance and chlorophyll content. Our results indicated that sweet corn growth and yield in the organic plots were significantly higher than those in the conventional plots.

Relevance:

100.00%

Publisher:

Abstract:

We have developed a new simple method for transport, storage, and analysis of genetic material from the corals Agaricia agaricites, Dendrogyra cylindrica, Eusmilia ancora, Meandrina meandrites, Montastrea annularis, Porites astreoides, Porites furcata, Porites porites, and Siderastrea siderea at room temperature. All species yielded sufficient DNA from a single FTA® card (19 µg-43 ng) for subsequent PCR amplification of both coral and zooxanthellar DNA. The D1 and D2 variable regions of the large subunit rRNA gene (LSU rDNA) were amplified from the DNA of P. furcata and S. siderea by PCR. Electrophoresis yielded two major DNA bands: an 800-base pair (bp) DNA, which represented the coral ribosomal RNA (rRNA) gene, and a 600-bp DNA, which represented the zooxanthellar rRNA gene. Extraction of DNA from the bands yielded between 290 µg total DNA (S. siderea coral DNA) and 9 µg total DNA (P. furcata zooxanthellar DNA). The ability to transport and store genetic material from scleractinian corals without resort to laboratory facilities in the field allows for the molecular study of a far wider range and variety of coral sites than have been studied to date. © 2003 Elsevier Science B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong collinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcing rather than the few highly replicated ensembles more commonly used in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analyzed. The changes in natural and oceanic forcing, which itself contains some forcing from anthropogenic and natural influences, have the most influence. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. It was also found that an interaction between these two anthropogenic effects exists in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, these show that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
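
The design-of-experiments idea here (run every combination of the anthropogenic forcings so that main effects and pairwise interactions remain estimable despite their collinearity in time) can be sketched with an ordinary least-squares fit to a full factorial design. The sketch below uses synthetic responses and a hypothetical set of factor names; it is not the paper's statistical model.

```python
# Sketch: full 2^k factorial design over k anthropogenic forcings, fitted
# by least squares with main effects and all pairwise interactions.
# Factor names and the synthetic response are illustrative only.
import itertools
import numpy as np

factors = ["GHG", "sulfate", "trop_O3", "strat_O3"]
runs = np.array(list(itertools.product([0, 1], repeat=len(factors))))

# Design matrix: intercept, main effects, and pairwise interactions.
cols, names = [np.ones(len(runs))], ["intercept"]
for i, f in enumerate(factors):
    cols.append(runs[:, i]); names.append(f)
for i, j in itertools.combinations(range(len(factors)), 2):
    cols.append(runs[:, i] * runs[:, j])
    names.append(f"{factors[i]}x{factors[j]}")
X = np.column_stack(cols)

# Synthetic "simulated temperature response" with one true interaction.
rng = np.random.default_rng(2)
y = (0.8 * runs[:, 0] - 0.4 * runs[:, 1]
     + 0.3 * runs[:, 0] * runs[:, 1]
     + rng.normal(0, 0.05, len(runs)))

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(names, beta):
    print(f"{name:18s} {b:+.3f}")
```

With 2^4 = 16 runs and 11 coefficients (intercept, 4 main effects, 6 interactions), every effect is estimable from a single pass over the design, which mirrors the efficiency argument made in the abstract.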