67 results for Temporal constraints analysis


Relevance:

30.00%

Publisher:

Abstract:

Composite Applications on top of SAP's implementation of SOA (Enterprise SOA) enable the extension of existing business logic. In this paper we show, based on a case study, how Model-Driven Engineering concepts are applied in the development of such Composite Applications. Our case study extends a back-end business process to meet the specific needs of a demo company selling wine. We use it to describe how the business-centric models specifying the modified business behaviour of our case study can be utilized for business performance analysis in a setting where most of the actions are performed by humans. In particular, we apply a refined version of the Model-Driven Performance Engineering approach that we proposed in our previous work, and we motivate which business domain specifics have to be taken into account for business performance analysis. We additionally motivate the need for performance-related decision support for domain experts, who generally lack performance-related skills. Such support should offer visual guidance about what to change in the design and resource mapping to obtain improved results with respect to modification constraints and performance objectives, such as timing objectives.

Relevance:

30.00%

Publisher:

Abstract:

A substantial amount of the 'critical mass' of digital data available to scholarship contains place-names, and it is now recognised that spatial and temporal data points, including place-names, are a vital part of the e-research infrastructure that supports the use, re-use and advanced analysis of data using ICT tools and methods. Place-names can also be linked semantically to contribute to the web of data, to enrich content by linking existing data, and to identify new collections for digitization that strategically enhance existing digital collections. However, existing e-projects rely on modern gazetteers, which limits them to the modern and near-contemporary periods. This workshop explored how to further integrate the wealth of historical place-name scholarship, and the resulting digital resources generated within UK academia, thereby enabling the integration of local knowledge over much longer periods.

Relevance:

30.00%

Publisher:

Abstract:

Concentrations of major ions, silicate and nutrients (total N and P) were measured in samples of surface water from 28 lakes in ice-free areas of northern Victoria Land (East Antarctica). Sixteen lakes were sampled during austral summers 2001/02, 2003/04, 2004/05 and 2005/06 to assess temporal variation in water chemistry. Although samples showed a wide range of ion concentrations, their composition mainly reflected that of seawater. In general, as the distance from the sea increased, the input of elements from the marine environment (through aerosols and seabirds) decreased and nitrate and sulfate concentrations increased. These Antarctic lakes lack outflows, and during the austral summer the melting and/or ablation of ice cover, water evaporation and leaching processes in dry soils drive a progressive increase in water ion concentrations. During the five-year monitoring survey, no statistically significant variations in water chemistry were detected, except for a slight (barely significant) increase in TN concentrations. However, Canonical Correspondence Analysis (CCA) indicated that factors other than distance from the sea, the presence of nesting seabirds, the sampling time and the percentage of ice cover also affect the composition of water in Antarctic cold desert environments.

Relevance:

30.00%

Publisher:

Abstract:

Processor architectures have taken a turn towards many-core processors, which integrate multiple processing cores on a single chip to increase overall performance, and there are no signs that this trend will stop in the near future. Many-core processors are harder to program than multi-core and single-core processors because they require writing parallel or concurrent programs with high degrees of parallelism. Moreover, many-cores have to operate in a strong-scaling regime because of memory bandwidth constraints: increasingly fine-grained parallelism must be extracted to keep all processing cores busy.

Task dataflow programming models have a high potential to simplify parallel programming because they relieve the programmer of identifying precisely all inter-task dependences when writing programs. Instead, the task dataflow runtime system detects and enforces inter-task dependences during execution, based on a description of the memory each task accesses. The runtime constructs a task dataflow graph that captures all tasks and their dependences, and tasks are scheduled to execute in parallel taking the dependences specified in the task graph into account.

Several papers report significant overheads for task dataflow systems, which severely limit their scalability and usability. In this paper we study efficient schemes to manage task graphs and analyze their scalability. We assume a programming model that supports input, output and in/out annotations on task arguments, as well as commutative in/out annotations and reductions. We analyze the structure of task graphs and identify versions and generations as key concepts for their efficient management. We then present three schemes to manage task graphs, building on graph representations, hypergraphs and lists, and consider a fourth, edge-less scheme that synchronizes tasks using integer counters. Analysis using micro-benchmarks shows that the graph representation is not always scalable and that the edge-less scheme introduces the least overhead in nearly all situations.
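
To make the dependence-detection mechanism concrete, here is a minimal Python sketch (not the paper's implementation, and omitting the commutative in/out and reduction annotations) of how a runtime can derive inter-task dependences from in, out and in/out argument annotations; all class and task names are hypothetical:

```python
# Hypothetical sketch: derive inter-task dependences from access annotations.
from collections import defaultdict

IN, OUT, INOUT = "in", "out", "inout"

class TaskGraph:
    def __init__(self):
        self.edges = defaultdict(set)      # task -> set of successor tasks
        self.last_writer = {}              # object -> last task that wrote it
        self.readers = defaultdict(list)   # object -> readers since last write

    def submit(self, task, accesses):
        """accesses: list of (obj, mode) pairs describing memory the task uses."""
        for obj, mode in accesses:
            if mode in (IN, INOUT):
                # read-after-write: depend on the last writer, if any
                if obj in self.last_writer:
                    self.edges[self.last_writer[obj]].add(task)
                self.readers[obj].append(task)
            if mode in (OUT, INOUT):
                # write-after-read / write-after-write: depend on prior readers
                for r in self.readers[obj]:
                    if r != task:
                        self.edges[r].add(task)
                self.last_writer[obj] = task
                self.readers[obj] = []

g = TaskGraph()
g.submit("t1", [("a", OUT)])
g.submit("t2", [("a", IN), ("b", OUT)])
g.submit("t3", [("a", IN)])
g.submit("t4", [("a", INOUT)])
print(dict(g.edges))   # t1 -> {t2, t3, t4}; t2 -> {t4}; t3 -> {t4}
```

The per-object bookkeeping of a last writer and the readers since that write is, loosely speaking, what the notions of versions and generations mentioned above make precise and efficient.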

Relevance:

30.00%

Publisher:

Abstract:

In finite difference time domain simulation of room acoustics, source functions are subject to various constraints. These depend on the way sources are injected into the grid and on the chosen parameters of the numerical scheme being used. This paper addresses the issue of selecting and designing sources for finite difference simulation, by first reviewing associated aims and constraints, and evaluating existing source models against these criteria. The process of exciting a model is generalized by introducing a system of three cascaded filters, respectively, characterizing the driving pulse, the source mechanics, and the injection of the resulting source function into the grid. It is shown that hard, soft, and transparent sources can be seen as special cases within this unified approach. Starting from the mechanics of a small pulsating sphere, a parametric source model is formulated by specifying suitable filters. This physically constrained source model is numerically consistent, does not scatter incoming waves, and is free from zero- and low-frequency artifacts. Simulation results are employed for comparison with existing source formulations in terms of meeting the spectral and temporal requirements on the outward propagating wave.
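
As a loose illustration of the cascaded-filter idea, reduced here to one dimension with a single assumed first-order filter rather than the paper's pulsating-sphere model, a driving pulse can be shaped and then injected additively as a soft source:

```python
# Illustrative 1D FDTD sketch (assumed parameters, not the paper's model).
import numpy as np

N, steps, c = 200, 300, 1.0            # grid points, time steps, Courant number
p = np.zeros(N)                        # pressure grid
v = np.zeros(N + 1)                    # staggered velocity grid
src = N // 2

n = np.arange(steps)
pulse = np.exp(-0.5 * ((n - 40) / 8.0) ** 2)   # Gaussian driving pulse
shaped = np.zeros(steps)
a = 0.9                                # assumed filter pole
for t in range(1, steps):
    # first-order high-pass standing in for the "source mechanics" filter;
    # it removes the DC component, one of the low-frequency artifacts at issue
    shaped[t] = pulse[t] - pulse[t - 1] + a * shaped[t - 1]

for t in range(steps):
    v[1:-1] -= c * (p[1:] - p[:-1])    # update velocity from pressure gradient
    p -= c * (v[1:] - v[:-1])          # update pressure from velocity divergence
    p[src] += shaped[t]                # soft source: added to the grid value,
                                       # so incident waves are not scattered
print(float(np.abs(p).max()))
```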

Relevance:

30.00%

Publisher:

Abstract:

Although pumped hydro storage is seen as a strategic key asset by grid operators, financing it is complicated in newly liberalised markets. It could be argued that the optimum generation portfolio is now determined by the economic viability of generators based on a short- to medium-term return on investment. This has made capital-intensive projects such as pumped hydro storage less attractive for wholesale electricity companies because the payback periods are too long. In tandem, a significant amount of wind power has entered the generation mix, which has resulted in operating and planning integration issues due to wind's inherently uncertain and varying spatial and temporal nature. These integration issues can be overcome using fast-acting gas peaking plant or energy storage. Most analysis of wind power integration using storage to date has used stochastic optimisation for power system balancing, or arbitrage modelling to examine techno-economic viability. In this research, a deterministic dynamic programming long-term generation expansion model is employed to optimise the generation mix, total system costs and total carbon dioxide emissions, and, unlike other studies, it calculates the reserve needed to firm wind power. The key finding of this study is that the incentive to build capital-intensive pumped hydro storage to firm wind power is limited unless exogenous market costs come very strongly into play. Furthermore, it was demonstrated that reserve increases with increasing wind power, showing the importance of ancillary services in future power systems. © 2014 Elsevier Ltd. All rights reserved.
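
As a toy illustration of deterministic dynamic programming for generation expansion (invented technologies, costs and demand figures; not the model used in the study), one can minimise build cost subject to a firm-capacity requirement that grows with installed wind; each year picks a single technology and builds just enough blocks:

```python
# Toy generation expansion DP with a wind-driven reserve requirement.
from functools import lru_cache

years = [(1000, 200), (1100, 400), (1250, 600)]   # (peak demand MW, wind MW)
techs = {"gas_peaker": (50, 400), "pumped_hydro": (120, 800)}  # (cost/MW, block MW)
reserve_frac = 0.25                                # reserve per MW of wind (assumed)

@lru_cache(maxsize=None)
def best(year, firm_cap):
    """Minimum cost and build plan from this year on, given firm capacity."""
    if year == len(years):
        return 0, ()
    demand, wind = years[year]
    need = demand + reserve_frac * wind            # firm capacity requirement
    options = []
    for name, (cost_per_mw, block) in techs.items():
        cap, cost, n_blocks = firm_cap, 0, 0
        while cap < need:                          # build blocks until adequate
            cap += block
            cost += cost_per_mw * block
            n_blocks += 1
        future_cost, plan = best(year + 1, cap)
        options.append((cost + future_cost, ((name, n_blocks),) + plan))
    return min(options)

total_cost, plan = best(0, 800)                    # start from 800 MW firm capacity
print(total_cost, plan)
```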

Relevance:

30.00%

Publisher:

Abstract:

As a comparatively newly invented PKM with over-constraints in its kinematic chains, the Exechon has attracted extensive attention from the research community. Unlike its well-understood kinematics, the stiffness characteristics of the Exechon remain a challenge to analyse due to its structural complexity. In order to achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, connected to each other sequentially through the corresponding joints. Each limb body is modeled as a spatial beam with a uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 x 6 block matrix from the inverse of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, as well as throughout the workspace, is computed quickly with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position-dependency of the PKM's stiffness, which is symmetric about a work plane due to structural features. Finally, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated, with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM. It is worth mentioning that the proposed stiffness modeling methodology can also be applied to other over-constrained PKMs and can evaluate global rigidity over the workspace efficiently with minor revisions.
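
The final extraction step can be illustrated with a small NumPy sketch; the assembled matrix below is a random symmetric positive-definite stand-in, not the Exechon model, and serves only to show where the 6 x 6 platform block comes from:

```python
# Sketch of the extraction step: platform stiffness as a 6x6 block of the
# inverted governing matrix (stand-in matrix, assumed 24 total DOFs).
import numpy as np

n = 24                                   # assumed DOFs: platform (6) + limb nodes
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
C = A @ A.T + n * np.eye(n)              # symmetric positive-definite "compliance"

K = np.linalg.inv(C)                     # invert the governing compliance matrix
K_platform = K[:6, :6]                   # 6x6 block: moving-platform stiffness

# a scalar point-stiffness index, e.g. along z, can then be read off:
print(K_platform[2, 2])
```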

Relevance:

30.00%

Publisher:

Abstract:

Numbers of Tufted Ducks Aythya fuligula wintering at Lough Neagh declined dramatically following the winter of 2000/01. The abundance and biomass of benthic macroinvertebrates, their main food source, declined significantly between the winters of 1997/98 and 2010. Information on recent diet was therefore required to determine whether the diet had changed significantly before and after the observed declines in both macroinvertebrates and birds. Here, we used oesophageal content analysis to characterise the contemporary diet of Tufted Ducks at Lough Neagh during 2010-12. Of 75 shot ducks, only three individuals had prey items in their oesophagi, while all four ducks that accidentally drowned in gill nets contained prey items. Oesophageal contents were then compared with data collected during a study conducted in the late 1990s. The contemporary diet of Tufted Ducks was dominated by Asellus aquaticus (48%), but molluscs (14%), grain (13%) and chironomid larvae (11%) were also consumed. Between 1998-99 and 2010-12, the contribution of Asellus aquaticus to the diet decreased significantly while the proportions of chironomid larvae, grain, Gammarus spp. and Mysis spp. increased. Alternative methods of dietary analysis, such as stable isotope analysis, are recommended for future studies of diving duck diet at Lough Neagh.

Relevance:

30.00%

Publisher:

Abstract:

Purpose
– The purpose of this paper is to explore and explain the change process in Northern Ireland policing through an analysis of temporally bracketed change phases and key change delivery themes from 1996 to 2012.

Design/methodology/approach
– The research approach adopted is process-based, longitudinal and multi-method, utilising "temporal bracketing" to determine phases of change and conjunctural reasoning to unravel the systematic factors interacting over time within the case.

Findings
– The paper identifies and temporally brackets four phases of change: "Tipping Point"; "Implementation, Symbolic Modification and Resistance"; "Power-Assisted Steering"; and "A Return to Turbulence". It identifies four themes that emerge from the RUC-PSNI experience: the role of adaptive leadership; the pace and sequencing of change implementation; sufficient resourcing; and the impact of external agents acting as boundary spanners. It then comments on the prominence of these themes through the phases, and reflects upon how these phases and themes inform our understanding of organisational change within policing organisations generally and within politically pressurised transition processes.

Originality/value
– The contribution of the paper lies in the documentation of an almost unique organisational case in an environmentally forced change process. In this, it contains lessons for other organisations facing similar, if less extreme, challenges and presents an example of intense change analysed longitudinally.

Relevance:

30.00%

Publisher:

Abstract:

In order to address road safety effectively, it is essential to understand all the factors that contribute to the occurrence of a road collision. This is achieved through road safety assessment measures, which are primarily based on historical crash data. Recent advances in uncertain reasoning technology have led to the development of robust machine learning techniques suitable for investigating road traffic collision data, including supervised learning (e.g. SVM) and unsupervised learning (e.g. cluster analysis). This study extends previous research work, carried out in Coll et al. [3], which proposed a non-linear aggregation framework for identifying temporal and spatial hotspots. The results from Coll et al. [3] identified the Lisburn area as the road safety hotspot in Northern Ireland. This study aims to use cluster analysis to investigate and highlight any hidden patterns associated with collisions that occurred in the Lisburn area, which in turn will provide more clarity on the causation factors so that appropriate countermeasures can be put in place.
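
As a hypothetical sketch of the kind of unsupervised analysis described above (synthetic collision fields, not the Lisburn data), k-means can group collision records and the mean profile of each cluster can then be inspected for hidden patterns:

```python
# Illustrative clustering of synthetic collision records with scikit-learn.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# assumed columns: hour of day, day of week, vehicles involved, severity (1-3)
collisions = np.column_stack([
    rng.integers(0, 24, 500),
    rng.integers(0, 7, 500),
    rng.integers(1, 5, 500),
    rng.integers(1, 4, 500),
])

X = StandardScaler().fit_transform(collisions)   # scale features comparably
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):                               # mean profile per cluster
    print(k, collisions[labels == k].mean(axis=0).round(1))
```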

Relevance:

30.00%

Publisher:

Abstract:

Electric vehicles (EVs) are proposed as a measure to reduce greenhouse gas emissions in transport and to support increased wind power penetration across modern power systems. Optimal benefits can only be achieved if EVs are deployed effectively, so that exhaust emissions are not substituted by additional emissions in the electricity sector; this can be implemented using Smart Grid controls. This research presents the results of an EV roll-out in the all-island grid (AIG) in Ireland, using the long-term generation expansion planning model Wien Automatic System Planning IV (WASP-IV) to measure carbon dioxide emissions and changes in total energy. The model incorporates all generators and operational requirements, and meets environmental emissions, fuel availability, and generator operation and maintenance constraints, to optimize economic dispatch and unit commitment. The study investigates three distinct scenarios (base case, peak charging and off-peak charging) to simulate the impacts of EVs on the AIG up to 2025.
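
A back-of-envelope calculation (with invented emission intensities and charging profiles, not WASP-IV output) illustrates why the peak versus off-peak distinction matters: the same EV energy is served by different marginal plant depending on when it is drawn, so system CO2 differs:

```python
# Toy comparison of peak vs off-peak EV charging emissions (assumed numbers).
import numpy as np

hours = np.arange(24)
# assumed marginal emission intensity (tCO2/MWh): gas peakers on-peak,
# a wind-heavy baseload mix off-peak
intensity = np.where((hours >= 17) & (hours <= 21), 0.55, 0.35)

ev_energy = 1200.0                                   # MWh/day of EV charging
peak = np.zeros(24); peak[17:22] = ev_energy / 5     # all charging 17:00-21:00
offpeak = np.zeros(24); offpeak[1:6] = ev_energy / 5 # all charging 01:00-05:00

for name, profile in [("peak", peak), ("off-peak", offpeak)]:
    print(name, round(float(profile @ intensity), 1), "tCO2/day")
```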

Relevance:

30.00%

Publisher:

Abstract:

During the 1950s and 1960s, excavations by the Sarawak Museum at Niah Cave in northwest Borneo produced an enormous archive of records and artefacts, including in excess of 750,000 macro- and micro-vertebrate remains. The excellent state of preservation of the animal bone, dating from the Late Pleistocene (c. 40 kya) to as recently as c. 500 years ago, had the potential to provide unparalleled zooarchaeological information about early hunter-gatherer resource procurement, temporal changes in subsistence patterning, and the impact of peoples on the local and regional environment in Island Southeast Asia. However, the coarse-grained excavation methods employed during the original investigations, and the sheer scale of the archaeological record and bone assemblages, dissuaded many researchers from attempting to tackle the Niah archives. This paper outlines how important information on the nature of the archaeological record at Niah has now finally been extracted from the archive using a combination of zooarchaeological analysis and reference to the extensive archaeological records from the site. Copyright (C) 2009 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

A novel methodology has been developed to quantify important saltwater intrusion parameters in a sandbox-style experiment using image analysis. Existing methods found in the literature are based mainly on visual observations, which are subjective and labour intensive and limit the temporal and spatial resolutions that can be analysed. A robust error analysis was undertaken to determine the optimum methodology for converting image light intensity to concentration. Results showed that defining the relationship on a pixel-wise basis provided the most accurate image-to-concentration conversion and allowed quantification of the width of the mixing zone between saltwater and freshwater. A high image sampling rate was used to investigate the transient dynamics of saltwater intrusion, which rendered analysis by visual observation unsuitable. This paper presents the methodologies developed to minimise human input and promote autonomy, provide high-resolution image-to-concentration conversion, and allow the quantification of intrusion parameters under transient conditions.
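
The pixel-wise conversion the authors found most accurate can be sketched as an independent linear fit per pixel over a stack of calibration images at known concentrations; the linear form and all values below are assumptions for illustration, not the paper's calibration:

```python
# Per-pixel linear intensity-to-concentration calibration (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
H, W = 64, 64
conc_known = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # calibration concentrations

gain = rng.uniform(0.5, 1.5, (H, W))                 # assumed per-pixel response
offset = rng.uniform(0.0, 0.1, (H, W))
calib = conc_known[:, None, None] * gain + offset    # calibration image stack
calib += rng.normal(0, 0.01, calib.shape)            # sensor noise

# least-squares slope a and intercept b per pixel: I = a*c + b  =>  c = (I-b)/a
c_mean, I_mean = conc_known.mean(), calib.mean(axis=0)
a = ((conc_known - c_mean)[:, None, None] * (calib - I_mean)).sum(axis=0) \
    / ((conc_known - c_mean) ** 2).sum()
b = I_mean - a * c_mean

test = 0.6 * gain + offset                           # image at unknown concentration
conc_map = (test - b) / a                            # pixel-wise conversion
print(round(float(conc_map.mean()), 3))              # recovers ~0.6
```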

Relevance:

30.00%

Publisher:

Abstract:

Over the last 15 years, the supernova community has endeavoured to directly identify progenitor stars for core-collapse supernovae discovered in nearby galaxies. These precursors are often visible as resolved stars in high-resolution images from space- and ground-based telescopes. The discovery rate of progenitor stars is limited by the local supernova rate and the availability and depth of archive images of galaxies, with 18 detections of precursor objects and 27 upper limits. This review compiles these results (from 1999 to 2013) in a distance-limited sample and discusses the implications of the findings. The vast majority of the detected progenitors are of type II-P, II-L or IIb, with one type Ib progenitor system detected and many more upper limits for progenitors of Ibc supernovae (14 in all). The data for these 45 supernova progenitors illustrate a remarkable deficit of high-luminosity stars above an apparent limit of log(L/L⊙) ≃ 5.1 dex. For a typical Salpeter initial mass function, one would expect to have found 13 high-luminosity and high-mass progenitors by now. There is possibly only one object in this time- and volume-limited sample that is unambiguously high-mass (the progenitor of SN2009ip), although the nature of that supernova is still debated. The possible biases due to the influence of circumstellar dust, the luminosity analysis and the sample selection methods are reviewed; it does not appear likely that these can explain the missing high-mass progenitor stars. This review concludes that the community's work to date shows that the observed populations of supernovae in the local Universe are not, on the whole, produced by high-mass (M ≳ 18 M⊙) stars. Theoretical explosions of model stars also predict that black hole formation and failed supernovae tend to occur above an initial mass of M ≃ 18 M⊙. The models further suggest that there is no simple single mass division for neutron star or black hole formation and that there are islands of explodability for stars in the 8-120 M⊙ range. The observational constraints are quite consistent with the bulk of stars above M ≃ 18 M⊙ collapsing to form black holes with no visible supernovae.
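
The quoted Salpeter-IMF expectation can be checked in a few lines, integrating dN/dM ∝ M^-2.35 over an assumed 8-120 M⊙ progenitor mass range (the exact bounds are an assumption here):

```python
# Number fraction of core-collapse progenitors above 18 Msun under a Salpeter IMF.
def salpeter_number(m_lo, m_hi, alpha=2.35):
    # unnormalised integral of M^-alpha dM between m_lo and m_hi
    return (m_lo ** (1 - alpha) - m_hi ** (1 - alpha)) / (alpha - 1)

frac = salpeter_number(18, 120) / salpeter_number(8, 120)
# ~0.32 of 45 progenitors, i.e. about 14 stars: consistent with the ~13 quoted above
print(round(frac, 2), round(45 * frac, 1))
```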