891 results for requirements management


Relevance:

30.00%

Publisher:

Abstract:

The electronics industry is developing rapidly, and with it the increasingly complex problem of cooling microelectronic equipment. It has now become necessary for thermal design engineers to consider the problem of equipment cooling at some level. The use of Computational Fluid Dynamics (CFD) for such investigations is fast becoming a powerful and almost essential tool for the design, development and optimisation of engineering applications. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt fluctuations experienced by the turbulent energy and other parameters in near-wall regions and shear layers, a particularly fine computational mesh is necessary, which inevitably increases the computer storage and run-time requirements. This paper discusses results from an investigation into the accuracy of currently used turbulence models. A newly formulated transitional hybrid turbulence model is also introduced, with comparisons against experimental data.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes work towards the deployment of self-managing capabilities into an advanced middleware for automotive systems. The middleware will support a range of futuristic use-cases requiring context-awareness and dynamic system configuration. Several use-cases are described and their specific context-awareness requirements identified. The discussion is accompanied by a justification for the selection of policy-based computing as the autonomics technique to drive the self-management. The specific policy technology to be deployed is described briefly, with a focus on the features that are of direct relevance to the middleware project. A selected use-case is explored in depth to illustrate the extent of dynamic behaviour achievable in the proposed middleware architecture, which is composed of several policy-configured services. An early demonstration application, which facilitates concept evaluation, is presented, and a sequence of typical device-discovery events is worked through.
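As an illustration only, the following Python sketch shows the general shape of policy-configured handling of device-discovery events: declarative rules evaluated against a shared context drive reconfiguration. The class names, rule format and event payload are hypothetical and are not the project's actual policy technology or middleware API.

```python
# Hypothetical sketch of policy-driven device-discovery handling;
# all names and the rule format are illustrative, not the middleware's real API.

class Policy:
    def __init__(self, name, condition, action):
        self.name = name
        self.condition = condition   # callable(context) -> bool
        self.action = action         # callable(context) -> None

class PolicyEngine:
    def __init__(self):
        self.policies = []
        self.context = {}            # shared context store

    def add_policy(self, policy):
        self.policies.append(policy)

    def on_event(self, event_type, payload):
        # Update the context with the new event, then evaluate all policies.
        self.context[event_type] = payload
        for p in self.policies:
            if p.condition(self.context):
                p.action(self.context)

# Example: bind a service when a new media-class device is discovered.
engine = PolicyEngine()
engine.add_policy(Policy(
    name="attach-media-device",
    condition=lambda ctx: ctx.get("device_discovered", {}).get("class") == "media",
    action=lambda ctx: print("binding media service to", ctx["device_discovered"]["id"]),
))
engine.on_event("device_discovered", {"id": "dev-42", "class": "media"})
```

The design point being illustrated is that the behaviour lives in the policies, not in the engine, so it can be changed without redeploying the middleware itself.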

Relevance:

30.00%

Publisher:

Abstract:

A cross-domain workflow application may be constructed using a standard reference model such as the one by the Workflow Management Coalition (WfMC) [7], but the requirements for this type of application are inherently different from one organization to another. The existing models, and the systems built around them, meet some but not all of the requirements of the organizations involved in a collaborative process. Furthermore, the requirements change over time. This makes the applications difficult to develop and distribute. Service Oriented Architecture (SOA) based approaches such as BPEL (Business Process Execution Language) intend to provide a solution but fail to address the problems sufficiently, especially in situations where the expectations and skill levels of the users (e.g. the participants in the processes) in different organisations are likely to differ. In this paper, we discuss a design pattern that provides a novel approach towards a solution. In the solution, business users design the applications at a high level of abstraction: the use cases and user interactions. The designs are documented and used, together with the data and events captured later that represent the user interactions with the systems, to feed an intermediate component local to the users, the IFM (InterFace Mapper), which bridges the gaps between the users and the systems. We discuss the main issues faced in design and prototyping. The approach alleviates the need to re-program against the APIs of any back-end service, thus easing the development and distribution of the applications.
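A minimal sketch of the bridging role attributed to the InterFace Mapper follows: captured user interactions are translated into back-end service calls through a declarative mapping rather than code written against the back-end API. The class, method and mapping names are hypothetical, not the paper's implementation.

```python
# Hypothetical sketch of the InterFace Mapper (IFM) idea: user-level
# interactions captured from the documented use cases are mapped to
# back-end workflow service calls without re-programming against the APIs.

class InterFaceMapper:
    def __init__(self, backend):
        self.backend = backend
        self.mappings = {}   # interaction name -> (backend operation, parameter mapping)

    def register(self, interaction, operation, param_map):
        self.mappings[interaction] = (operation, param_map)

    def handle(self, interaction, event_data):
        operation, param_map = self.mappings[interaction]
        args = {backend_key: event_data[ui_key] for ui_key, backend_key in param_map.items()}
        return getattr(self.backend, operation)(**args)

class WorkflowService:          # stand-in for any back-end workflow engine
    def start_process(self, process_id, initiator):
        return f"started {process_id} for {initiator}"

ifm = InterFaceMapper(WorkflowService())
ifm.register("submit_claim_form", "start_process",
             {"form_id": "process_id", "user": "initiator"})
print(ifm.handle("submit_claim_form", {"form_id": "claim-approval", "user": "alice"}))
```

Changing the back-end then only requires new mapping entries, which is the distribution benefit the abstract describes.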

Relevance:

30.00%

Publisher:

Abstract:

Görzig, H., Engel, F., Brocks, H., Vogel, T. & Hemmje, M. (2015, August). Towards Data Management Planning Support for Research Data. Paper presented at the ASE International Conference on Data Science, Stanford, United States of America.

Relevance:

30.00%

Publisher:

Abstract:

Marine legislation is becoming more complex, and marine ecosystem-based management is specified in national and regional legislative frameworks. Shelf-seas community and ecosystem models (hereafter termed ecosystem models) are central to the delivery of ecosystem-based management, but there is limited uptake and use of model products by decision makers in Europe and the UK in comparison with other countries. In this study, the challenges to the uptake and use of ecosystem models in support of marine environmental management are assessed using the UK capability as an example. The UK has a broad capability in marine ecosystem modelling, with at least 14 different models that support management, but few examples exist of ecosystem modelling underpinning policy or management decisions. To improve understanding of the policy and management issues that can be addressed using ecosystem models, a workshop was convened that brought together advisors, assessors, biologists, social scientists, economists, modellers, statisticians, policy makers, and funders. Some policy requirements were identified that can be addressed without further model development, including: attribution of environmental change to underlying drivers, integration of models and observations to develop more efficient monitoring programmes, assessment of indicator performance for different management goals, and the costs and benefits of legislation. Multi-model ensembles are being developed in cases where many models exist, but model structures are very diverse, making a standardised approach to combining outputs a significant challenge, and there is a need for new methodologies for describing, analysing, and visualising uncertainties. A stronger link to social and economic systems is needed to increase the range of policy-related questions that can be addressed. It is also important to improve communication between the policy and modelling communities so that there is a shared understanding of the strengths and limitations of ecosystem models.

Relevance:

30.00%

Publisher:

Abstract:

The development of wideband network services and the new network infrastructures to support them have placed many more requirements on current network management systems. Issues such as scalability, integrity and interoperability have become more important. Existing management systems are not flexible enough to support the provision of Quality of Service (QoS) in these dynamic environments. The concept of Programmable Networks has been proposed to address these requirements. Within this framework, CORBA is regarded as a middleware technology that can enable interoperation among the distributed entities found in Programmable Networks. By using the basic CORBA environment in a heterogeneous network, a network manager is able to control remote Network Elements (NEs) in the same way it controls its local resources. Using this approach, both the flexibility and the intelligence of overall network management can be improved. This paper proposes the use of two advanced features of CORBA to enhance QoS management in a Programmable Network environment. The Transaction Service can be used to manage a set of tasks whenever the management of elements in a network is correlated, and the Concurrency Service can be used to coordinate multiple accesses to the same network resources. It is also shown that proper use of CORBA can greatly reduce the development and administration of network management applications.
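The coordination pattern described here, atomic application of correlated configuration tasks plus serialised access to shared resources, can be sketched in plain Python as below. This is a conceptual illustration only; it does not use the CORBA Object Transaction Service or Concurrency Service APIs, and the NetworkElement class and QoS parameter are hypothetical.

```python
# Conceptual sketch of the coordination pattern in the paper: correlated
# configuration tasks on several Network Elements are applied all-or-nothing,
# and concurrent access to the shared resources is serialised.
# Plain Python, not the CORBA Transaction/Concurrency Service APIs.

import threading

class NetworkElement:
    def __init__(self, name):
        self.name = name
        self.config = {}

    def apply(self, key, value):
        self.config[key] = value

    def rollback(self, key):
        self.config.pop(key, None)

resource_lock = threading.Lock()   # plays the role the Concurrency Service would

def configure_qos(elements, key, value):
    """Apply the same QoS setting to all elements, or to none of them."""
    with resource_lock:            # serialise access to the shared resources
        applied = []
        try:
            for ne in elements:
                ne.apply(key, value)
                applied.append(ne)
        except Exception:
            for ne in applied:     # compensate, mimicking transaction rollback
                ne.rollback(key)
            raise

nes = [NetworkElement("edge-1"), NetworkElement("core-1")]
configure_qos(nes, "max_delay_ms", 20)
```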

Relevance:

30.00%

Publisher:

Abstract:

In common with other farmland species, hares (Lepus spp.) are in widespread decline in agricultural landscapes due to agricultural intensification and habitat loss. We examined the importance of habitat heterogeneity to the Irish hare (Lepus timidus hibernicus) in a pastoral landscape. We used radio-tracking during nocturnal active and diurnal inactive periods throughout one year. In autumn, winter and spring, hares occupied a heterogeneous combination of improved grassland, providing food, and Juncus-dominated rough pasture, providing refuge. In summer, hares significantly increased their use of improved grassland. This homogeneous habitat can fulfil the discrete and varied resource requirements of hares for feeding and shelter at certain times of year. However, improved grassland may be a risky habitat for hares as silage harvesting occurs during their peak birthing period of late spring and early summer. We therefore posit the existence of a putative ecological trap inherent to a homogeneous habitat of perceived high value that satisfies the hares' habitat requirements but which presents risks at a critical time of year. To test this hypothesis in relation to hare populations, work is required to provide data on differential leveret mortality between habitat types.

Relevance:

30.00%

Publisher:

Abstract:

To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the subsequent spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming import/export phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.
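To make the "spatial database at the core" idea concrete, here is a minimal sketch that keeps GPS fixes in a relational table and performs quality screening and spatial filtering with a query. It uses SQLite only for portability; a production deployment of the kind the paper envisages would use a true spatial database (for example PostGIS) with spatial indexes, and the table and column names here are hypothetical.

```python
# Minimal sketch: GPS fixes stored centrally and screened/filtered by query.
# SQLite is used for a self-contained example; a real system would use a
# spatial database such as PostGIS. Schema and sample data are illustrative.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gps_fixes (
        animal_id   TEXT,
        acquired_at TEXT,   -- ISO 8601 timestamp from the GPS collar
        lon         REAL,
        lat         REAL,
        dop         REAL    -- dilution of precision, for quality screening
    )
""")
conn.executemany(
    "INSERT INTO gps_fixes VALUES (?, ?, ?, ?, ?)",
    [("roe-01", "2015-06-01T02:00:00Z", 11.04, 46.02, 2.1),
     ("roe-01", "2015-06-01T03:00:00Z", 11.05, 46.03, 7.9)],
)

# Screen out low-quality fixes and restrict to a study-area bounding box;
# in PostGIS this would typically be an ST_Within() test against a polygon layer.
rows = conn.execute("""
    SELECT animal_id, acquired_at, lon, lat
    FROM gps_fixes
    WHERE dop < 5
      AND lon BETWEEN 11.0 AND 11.2
      AND lat BETWEEN 46.0 AND 46.1
""").fetchall()
print(rows)
```

Keeping acquisition, screening and analysis against one shared store is what removes the repeated import/export phases criticised in the abstract.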

Relevance:

30.00%

Publisher:

Abstract:

For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications of all existing artifacts produced during development. It is therefore necessary to provide effective and flexible requirements change management. In this paper, we present an approach to managing requirements changes based on Booth's negotiation-style framework for belief revision. Informally, we consider the current requirements specification as a belief set about the system-to-be. The requirements change request is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including the setting in which the change request is fully accepted, the setting in which the current requirements specification is fully preserved, and the setting in which the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
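A deliberately simplified sketch of the compromise setting is given below: priorities decide which side concedes whenever the current specification and the change request conflict. It is not Booth's negotiation-style framework itself; the conflict representation, priorities and requirement texts are all hypothetical.

```python
# Simplified illustration of prioritised negotiation between the current
# requirements specification and a change request. Not Booth's framework;
# only the idea that the lower-priority belief concedes in a conflict.

def negotiate(current_spec, change_request, conflicts):
    """current_spec / change_request: dicts requirement -> priority (higher wins).
    conflicts: set of frozensets of mutually inconsistent requirements."""
    agreed = dict(current_spec)
    agreed.update(change_request)           # tentatively accept the change
    changed = True
    while changed:
        changed = False
        for pair in conflicts:
            live = [r for r in pair if r in agreed]
            if len(live) == len(pair):      # this conflict is still present
                weakest = min(live, key=lambda r: agreed[r])
                del agreed[weakest]         # the lower-priority requirement concedes
                changed = True
    return agreed

spec   = {"store data locally": 2, "respond within 1s": 3}
change = {"store data in the cloud": 4}
conflict_sets = {frozenset({"store data locally", "store data in the cloud"})}
print(negotiate(spec, change, conflict_sets))
# -> {'respond within 1s': 3, 'store data in the cloud': 4}
```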

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to identify the various managerial issues encountered by UK/Irish contractors in the management of materials on confined urban construction sites. Through an extensive literature review, detailed interviews, case studies, cognitive mapping, causal loop diagrams, a questionnaire survey and documented severity indices, a comprehensive insight into materials management concerns within a confined construction site environment is developed and portrayed. The leading issues highlighted are that contractors' material spatial requirements exceed the available space; that it is difficult to coordinate the storage of materials in line with the programme; that the location of the site entrance makes delivery of materials particularly difficult; that it is difficult to store materials on-site due to lack of space; and that it is difficult to coordinate the storage requirements of the various sub-contractors. With the continued development of confined urban centres and the increasingly high cost of materials, any marginal savings made on-site would translate into significant monetary savings at project completion. Such savings would give developers a distinct competitive advantage in this challenging economic climate. As on-site management professionals identify, acknowledge and counteract the numerous issues illustrated, the successful management of materials on a confined urban construction site becomes attainable.
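For readers unfamiliar with severity indices, one common formulation ranks issues from Likert-scale survey responses as shown in the sketch below. The paper's exact formula and weights are not reproduced here; the rating scale, weights and response counts are assumptions for illustration only.

```python
# Illustrative severity-index calculation from Likert-scale survey responses.
# One common variant: SI = 100 * sum(w_i * f_i) / (max_weight * N), where f_i
# is the number of respondents choosing rating i and w_i is that rating's weight.
# Not necessarily the formula used in the paper.

def severity_index(frequencies, weights):
    total_responses = sum(frequencies)
    weighted = sum(w * f for w, f in zip(weights, frequencies))
    return 100.0 * weighted / (max(weights) * total_responses)

# Hypothetical responses (1 = not severe ... 5 = very severe) for the issue
# "material spatial requirements exceed available space":
print(round(severity_index([1, 2, 5, 12, 20], [1, 2, 3, 4, 5]), 1))  # -> 84.0
```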

Relevance:

30.00%

Publisher:

Abstract:

Many scientific applications are programmed using hybrid programming models that use both message passing and shared memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared memory or message passing, in isolation. The potential solution space, and thus the challenge, increases substantially when optimizing hybrid models, since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74 percent on average and up to 13.8 percent) with some performance gain (up to 7.5 percent) or negligible performance loss.
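The configuration-selection step can be pictured as below: predict time and power for each (thread count, frequency) pair and choose the pair with the lowest predicted energy. The prediction functions here are placeholder analytical forms (Amdahl-style scaling and a cubic frequency rule of thumb), not the statistical models developed in the paper.

```python
# Sketch of DCT + DVFS configuration selection: predict time and power for
# each (thread count, frequency) pair and pick the lowest predicted energy.
# The models below are illustrative placeholders, not the paper's predictors.

def predicted_time(threads, freq_ghz, serial_fraction=0.1, base_time=100.0):
    # Amdahl-style scaling with thread count, inversely proportional to frequency.
    parallel = (1 - serial_fraction) / threads
    return base_time * (serial_fraction + parallel) * (2.4 / freq_ghz)

def predicted_power(threads, freq_ghz, idle_w=40.0, per_core_w=12.0):
    # Dynamic power grows roughly with f^3 per active core (cubic rule of thumb).
    return idle_w + per_core_w * threads * (freq_ghz / 2.4) ** 3

configs = [(t, f) for t in (4, 8, 16) for f in (1.6, 2.0, 2.4)]
best = min(configs,
           key=lambda c: predicted_time(*c) * predicted_power(*c))  # energy = P * t
print("chosen (threads, GHz):", best)
```

The exponential blow-up mentioned in the abstract comes from taking this product space per program phase, which is why predictive models rather than exhaustive measurement are needed.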

Relevance:

30.00%

Publisher:

Abstract:

The requirement to provide multimedia services with QoS support in mobile networks has led to the standardization and deployment of high-speed data access technologies such as the High Speed Downlink Packet Access (HSDPA) system. HSDPA improves support for downlink packet data and multimedia services in WCDMA-based cellular networks. As is the trend in emerging wireless access technologies, HSDPA supports end-user multi-class sessions comprising parallel flows with diverse Quality of Service (QoS) requirements, such as real-time (RT) voice or video streaming concurrent with a non-real-time (NRT) data service being transmitted to the same user, with differentiated queuing at the radio link interface. Hence, in this paper we present and evaluate novel radio link buffer management schemes for QoS control of multimedia traffic comprising concurrent RT and NRT flows in the same HSDPA end-user session. The new buffer management schemes, Enhanced Time Space Priority (E-TSP) and Dynamic Time Space Priority (D-TSP), are designed to improve radio link and network resource utilization as well as optimize end-to-end QoS performance of both RT and NRT flows in the end-user session. Both schemes are based on a Time-Space Priority (TSP) queuing system, which provides joint delay and loss differentiation between the flows by queuing (partially) loss-tolerant RT flow packets at higher transmission priority but with restricted access to the buffer space, whilst allowing unlimited access to the buffer space for the delay-tolerant NRT flow but queuing it at lower transmission priority. Experiments by means of extensive system-level HSDPA simulations demonstrate that, with the proposed TSP-based radio link buffer management schemes, significant end-to-end QoS performance gains accrue to end-user traffic with simultaneous RT and NRT flows, in addition to improved resource utilization in the radio access network.
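The basic TSP queuing discipline the schemes build on can be sketched as follows: RT packets are always served first (time priority) but may occupy only a restricted share of the buffer, while NRT packets may use the remaining space at lower service priority. The buffer sizes, drop policy and class names are illustrative, not the E-TSP/D-TSP parameterisation used in the paper.

```python
# Minimal sketch of a Time-Space Priority (TSP) buffer: real-time (RT) packets
# get transmission (time) priority but only a restricted share of the buffer,
# while non-real-time (NRT) packets may use the remaining space but are served
# at lower priority. Capacities and drop behaviour here are illustrative.

from collections import deque

class TSPBuffer:
    def __init__(self, total_capacity=100, rt_limit=20):
        self.rt = deque()
        self.nrt = deque()
        self.total_capacity = total_capacity
        self.rt_limit = rt_limit

    def enqueue(self, packet, is_rt):
        if is_rt:
            if len(self.rt) >= self.rt_limit:
                return False                  # RT space restricted: drop (loss-tolerant flow)
            self.rt.append(packet)
        else:
            if len(self.rt) + len(self.nrt) >= self.total_capacity:
                return False                  # buffer full: NRT packet dropped
            self.nrt.append(packet)
        return True

    def dequeue(self):
        if self.rt:                           # time priority: RT always served first
            return self.rt.popleft()
        if self.nrt:
            return self.nrt.popleft()
        return None

buf = TSPBuffer()
buf.enqueue("video-frame-1", is_rt=True)
buf.enqueue("ftp-chunk-1", is_rt=False)
print(buf.dequeue())   # -> video-frame-1 (RT served before NRT)
```

The dynamic variant described in the abstract (D-TSP) would adjust parameters such as rt_limit at run time; that adaptation logic is not shown here.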

Relevance:

30.00%

Publisher:

Abstract:

The Construction Design and Management (CDM) Regulations (2007) are one of the most important sets of health and safety regulations in the construction industry today. The aim of this research is to examine critical success factors for CDM compliance among small to medium-sized contractors in the UK construction industry. The objectives of the research include the identification of the critical barriers to compliance, along with the identification of success factors where CDM is incorporated. A mixed-method approach is adopted in the identification and categorisation of the various factors, encompassing a literature review, interviews and a questionnaire survey. The key finding to emerge is a lack of knowledge and understanding with regard to the CDM Regulations, with the recommendation to encourage small and medium contractor compliance by illustrating the benefits attainable. The practicality of the research is evident from the significant uptake of CDM by larger contractors, yet the research indicates that further insight and guidance is required to educate and inform those working within small to medium-sized contractors in the UK. Where such acknowledgement and compliance is adopted, it is envisaged that this sector will benefit from reduced incidents and accidents and increased productivity, ultimately leading to a safer and more productive industry as a whole.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we use qualitative research techniques to examine the role of general practitioners in the management of long-term sickness absence. In order to uncover the perspectives of all the main agents affected by the actions of general practitioners, a case study approach focusing on one particular employment sector, the public health service, is adopted. The role of family physicians is viewed from the perspectives of health service managers, occupational health physicians, employees/patients, and general practitioners. Our argument is theoretically framed by Talcott Parsons's model of the medical contribution to the sick role, along with subsequent conceptualisations of the social role and position of physicians. Sixty-one semi-structured interviews and three focus group interviews were conducted in three Health and Social Care Trusts in Northern Ireland between 2010 and 2012. There was a consensus among respondents that general practitioners put far more weight on the preferences and needs of their patients than on the requirements of employing organisations. This was explained by respondents in terms of the propinquity and longevity of relationships between doctors and their patients, and by the ideology of holistic care and patient advocacy that general practitioners viewed as providing the foundations of their approach to patients. The approach of general practitioners was viewed negatively by managers and occupational health physicians, and more positively by general practitioners and patients. However, there is some evidence that general practitioners would be prepared to forfeit their role as validators of sick leave. Given the imperatives of both state and capital to reduce the financial burden of long-term sickness, this preparedness casts doubt on the continued role of general practitioners as gatekeepers to legitimate long-term sickness absence.