829 results for software management methodology


Relevance:

30.00%

Publisher:

Abstract:

The exploding demand for services like the World Wide Web reflects the potential that is presented by globally distributed information systems. The number of WWW servers world-wide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing --- the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. However, all of these can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way with performance guarantees. We attack this problem at four levels: (1) Resource Management Services: Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models organized by the Web Computing Framework, with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representations can be chosen to meet real-time and reliability constraints. (2) Middleware Services: Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, etc. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multi-server cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements. (3) Communication Infrastructure: To achieve any guarantee of quality of service or performance, one must reach the network layer, which can provide the basic guarantees of bandwidth, latency, and reliability. The third area is therefore a set of new techniques in network service and protocol design. (4) Object-Oriented Web Computing Framework: A useful resource management system must deal with job priority, fault tolerance, quality of service, complex resources such as ATM channels, probabilistic models, etc., and models must be tailored to represent the best tradeoff for a particular setting.
This requires a family of models, organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
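To make the Resource Management Services idea concrete, here is a minimal Python sketch of the registry-plus-selection pattern the abstract describes: resource availability is published to a registry, and the most precise member of an algorithm family whose estimated cost fits a real-time deadline is chosen. All class names, fields and numbers are illustrative assumptions, not the project's actual RMI.

```python
class ResourceRegistry:
    """Toy registry of available resources (names/API are assumed)."""
    def __init__(self):
        self._resources = {}

    def register(self, name, capacity):
        self._resources[name] = capacity

    def available(self, name):
        return self._resources.get(name, 0.0)


ALGORITHMS = [  # (name, relative precision, cost in abstract work units)
    ("exact", 1.00, 120.0),
    ("heuristic", 0.85, 30.0),
    ("coarse", 0.60, 8.0),
]

def choose_algorithm(registry, deadline_s):
    """Pick the most precise algorithm whose estimated run time fits the deadline."""
    cpu = registry.available("cpu_work_units_per_s")
    feasible = [(precision, name) for name, precision, cost in ALGORITHMS
                if cpu > 0 and cost / cpu <= deadline_s]
    return max(feasible)[1] if feasible else None

registry = ResourceRegistry()
registry.register("cpu_work_units_per_s", 10.0)
print(choose_algorithm(registry, deadline_s=5.0))  # -> "heuristic"
```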

Relevance:

30.00%

Publisher:

Abstract:

The proliferation of inexpensive workstations and networks has created a new era in distributed computing. At the same time, non-traditional applications such as computer-aided design (CAD), computer-aided software engineering (CASE), geographic-information systems (GIS), and office-information systems (OIS) have placed increased demands for high-performance transaction processing on database systems. The combination of these factors gives rise to significant challenges in the design of modern database systems. In this thesis, we propose novel techniques whose aim is to improve the performance and scalability of these new database systems. These techniques exploit client resources through client-based transaction management. Client-based transaction management is realized by providing logging facilities locally even when data is shared in a global environment. This thesis presents several recovery algorithms which utilize client disks for storing recovery related information (i.e., log records). Our algorithms work with both coarse and fine-granularity locking and they do not require the merging of client logs at any time. Moreover, our algorithms support fine-granularity locking with multiple clients permitted to concurrently update different portions of the same database page. The database state is recovered correctly when there is a complex crash as well as when the updates performed by different clients on a page are not present on the disk version of the page, even though some of the updating transactions have committed. This thesis also presents the implementation of the proposed algorithms in a memory-mapped storage manager as well as a detailed performance study of these algorithms using the OO1 database benchmark. The performance results show that client-based logging is superior to traditional server-based logging. This is because client-based logging is an effective way to reduce dependencies on server CPU and disk resources and, thus, prevents the server from becoming a performance bottleneck as quickly when the number of clients accessing the database increases.
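A hedged sketch of the central idea, client-local logging: each client appends recovery records for its own page updates to its own disk, so no merging of client logs is ever required. This is a drastic simplification for illustration; the record layout and API are invented, not the thesis's recovery algorithms.

```python
import json
import os

class ClientLog:
    """Append-only, client-local log: each client keeps its own log
    file on its own disk, so logs never need merging at the server."""

    def __init__(self, path):
        self.path = path
        self.lsn = 0  # client-local log sequence number

    def log_update(self, txn_id, page_id, offset, before, after):
        """Record a before/after image for part of a page, forcing the
        record to disk before the page itself may be written (the
        usual write-ahead rule)."""
        self.lsn += 1
        record = {"lsn": self.lsn, "txn": txn_id, "page": page_id,
                  "offset": offset, "before": before, "after": after}
        with open(self.path, "ab") as f:
            f.write(json.dumps(record).encode() + b"\n")
            f.flush()
            os.fsync(f.fileno())
        return self.lsn

log = ClientLog("client42.log")
log.log_update(txn_id=7, page_id=301, offset=128, before="old", after="new")
```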

Relevance:

30.00%

Publisher:

Abstract:

In this research we focus on energy-aware topology management for the Tyndall 25mm and 10mm nodes, in order to extend sensor network lifespan and optimise node power consumption. The two-tiered Tyndall Heterogeneous Automated Wireless Sensors (THAWS) tool is used to quickly create and configure application-specific sensor networks. To this end, we propose to implement a distributed route discovery algorithm and a practical energy-aware reaction model on the 25mm nodes. Triggered by energy-warning events, the miniaturised Tyndall 10mm data collector nodes adaptively and periodically change their association to 25mm base station nodes, while the 25mm nodes also change the interconnections between themselves, resulting in a reconfiguration of the 25mm node tier topology. The distributed routing protocol uses combined weight functions to balance the sensor network traffic. A system-level simulation is used to quantify the benefit of the route management framework, in terms of system power savings, when compared to other state-of-the-art approaches.
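The abstract does not give the actual combined weight function, but weighted next-hop selection of this kind can be illustrated with the Python sketch below; the coefficients and neighbour fields are assumptions.

```python
def link_weight(neighbour, alpha=0.5, beta=0.3, gamma=0.2):
    """Lower weight = better next hop; penalise drained batteries,
    long routes and poor links (coefficients are illustrative)."""
    energy_term = 1.0 / max(neighbour["residual_energy_j"], 1e-6)
    return (alpha * energy_term
            + beta * neighbour["hops_to_sink"]
            + gamma * neighbour["link_cost"])

neighbours = [
    {"id": "node-A", "residual_energy_j": 80.0, "hops_to_sink": 2, "link_cost": 0.4},
    {"id": "node-B", "residual_energy_j": 0.5,  "hops_to_sink": 1, "link_cost": 0.3},
]
print(min(neighbours, key=link_weight)["id"])  # node-A: B's battery is nearly drained
```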

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to develop a methodology, based on satellite remote sensing, to estimate the vegetation Start of Season (SOS) across the whole island of Ireland on an annual basis. This growing body of research is known as Land Surface Phenology (LSP) monitoring. The SOS was estimated for each year from a 7-year time series of 10-day composited, 1.2 km reduced-resolution MERIS Global Vegetation Index (MGVI) data from 2003 to 2009, using the time series analysis software TIMESAT. The selection of a 10-day composite period was guided by in-situ observations of leaf unfolding and cloud cover at representative point locations on the island. The MGVI time series was smoothed and the SOS metric extracted at a point corresponding to 20% of the seasonal MGVI amplitude. The SOS metric was extracted on a per-pixel basis and gridded for national-scale coverage. There were consistent spatial patterns in the SOS grids, which were replicated on an annual basis and were qualitatively linked to variation in land cover. Analysis revealed that three statistically separable groups of CORINE Land Cover (CLC) classes could be derived from differences in the SOS, namely agricultural and forest land cover types, peat bogs, and natural and semi-natural vegetation types. These groups demonstrated that managed vegetation (e.g. pastures) has a significantly earlier SOS than unmanaged vegetation (e.g. natural grasslands). There was also interannual spatio-temporal variability in the SOS. Such variability was highlighted in a series of anomaly grids showing variation from the 7-year mean SOS. An initial climate analysis indicated that an anomalously cold winter and spring in 2005/2006, linked to a negative North Atlantic Oscillation index value, delayed the 2006 SOS countrywide, while in other years the SOS anomalies showed more complex variation. A correlation study using air temperature as a climate variable revealed the spatial complexity of the air temperature-SOS relationship across the Republic of Ireland, as the timing of maximum correlation varied from November to April depending on location. The SOS was found to occur earlier with warmer winters in the southeast, while it occurred later with warmer winters in the northwest. The inverse pattern emerged in the spatial patterns of the spring correlates. This contrasting pattern would appear to be linked to vegetation management, as arable cropping is typically practiced in the southeast while there is mixed agriculture, and mostly pasture, to the west. Therefore, land use as well as air temperature appears to be an important determinant of national-scale patterns in the SOS. The TIMESAT tool formed a crucial component of the estimation of the SOS across the country in all seven years, as it minimised the negative impact of noise and data dropouts in the MGVI time series by applying a smoothing algorithm. The extracted SOS metric was sensitive to temporal and spatial variation in land surface vegetation seasonality, while the spatial patterns in the gridded SOS estimates aligned with those in land cover type. The methodology can be extended to a longer time series of FAPAR, as MERIS will be replaced by the ESA Sentinel mission in 2013, while the availability of full-resolution (300m) MERIS FAPAR and equivalent sensor products holds the possibility of monitoring finer-scale seasonality variation.
This study has shown the utility of the SOS metric as an indicator of spatiotemporal variability in vegetation phenology, as well as a correlate of other environmental variables such as air temperature. However, the satellite-based method is not seen as a replacement of ground-based observations, but rather as a complementary approach to studying vegetation phenology at the national scale. In future, the method can be extended to extract other metrics of the seasonal cycle in order to gain a more comprehensive view of seasonal vegetation development.
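As an illustration of the SOS extraction step, the following Python sketch applies the amplitude-threshold idea used with TIMESAT to one smoothed seasonal cycle: the SOS is the first composite at which the signal rises above the base value plus 20% of the seasonal amplitude. The synthetic curve and function names are illustrative only.

```python
import numpy as np

def start_of_season(values, threshold=0.2):
    """Estimate the SOS index from one smoothed seasonal cycle of a
    vegetation index: the first composite period at which the signal
    rises above base + threshold * (peak - base)."""
    v = np.asarray(values, dtype=float)
    base, peak = v.min(), v.max()
    level = base + threshold * (peak - base)
    rising = np.where(v >= level)[0]
    return int(rising[0]) if rising.size else None

# Example: 36 ten-day composites of a synthetic MGVI-like curve
t = np.arange(36)
mgvi = 0.2 + 0.6 * np.exp(-0.5 * ((t - 18) / 6.0) ** 2)
print(start_of_season(mgvi))  # composite index of the 20% crossing
```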

Relevance:

30.00%

Publisher:

Abstract:

The fundamental aim of this thesis is to examine the effect of New Public Management (NPM) on the traditional roles of elected representatives, management and community activists in Irish local government. This will be achieved through a case study analysis of one local authority, Cork County Council. NPM promises greater democracy in decision-making. Therefore, one can hypothesise that the roles of the three key groupings identified will become more influenced by principles of participatory decision-making. Thus, a number of related questions will be addressed by this work, such as: have the local elected representatives been empowered by NPM? Has a managerial revolution taken place? Has local democracy been enhanced by more effective community participation? It will be seen in chapter 2 that these questions have not been adequately addressed to date in the NPM literature. The three groups identified can be regarded as stakeholders, although the researcher is cautious in using this term because of its value-laden nature. Essentially, in terms of Cork County Council, stakeholders can be defined as decision-makers and people within the organization and its environment who are interested in, or could be affected directly or indirectly by, organizational performance. This is an all-embracing definition and includes all citizens, residents, community groups and client organizations. It is in this context that the term 'stakeholder' should be understood when it is occasionally used in this thesis. In this case, the perceptions of elected councillors, management and community representatives with regard to their changing roles are as significant as the changes themselves. The opening chapter begins with a brief account of the background to this research. This is followed by an explanation of the methodology used, and it concludes with short statements about the remaining chapters in the thesis.

Relevance:

30.00%

Publisher:

Abstract:

Comfort is, in essence, satisfaction with the environment, and with respect to the indoor environment it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building lifecycle. This is mainly due to the lack of an appropriate system to adequately manage comfort knowledge through the construction process into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps is an approach, proposed by this research, which states that for an innovation to be adopted into the industry it must be implementable through a number of small changes. This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology proposed by this research that manages comfort knowledge. It enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, thereby allowing it to be re-used in future stages of the building project and in future projects. It does this using the following: Comfort Performances – simplified numerical representations of the comfort of the indoor environment, which quantify the comfort at each stage of the building life-cycle using standard comfort metrics. Comfort Ratings – a means of classifying the comfort conditions of the indoor environment according to an appropriate standard. Comfort Ratings are generated by comparing different Comfort Performances, and they provide additional information relating to the comfort conditions of the indoor environment which is not readily determined from the individual Comfort Performances. Comfort History – a continuous descriptive record of the comfort throughout the project, with a focus on documenting the items and activities, proposed and implemented, which could potentially affect comfort. Each aspect of the Comfort History is linked to the relevant comfort entity it references. These three components create a comprehensive record of the comfort throughout the building lifecycle. They are then stored and made available in a common format in a central location, which allows them to be re-used ad infinitum. The LCMS system was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system that combines the following six components: Building Standards; Modelling & Simulation; Physical Measurement, through the specially developed Egg-Whisk (Wireless Sensor) Network; Data Manipulation; Information Recording; and Knowledge Storage and Access. Results from a test-case application of the LCMS system - an existing office room at a research facility - highlighted that while some aspects of comfort were being maintained, the building's environment was not in compliance with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet, through LCMS, demonstrates how comfort, typically only considered during early design, can be measured and managed appropriately through systematic application of the methodology as a means of ensuring a healthy internal environment in the building.
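A minimal sketch of how the three ComMet records named above might be represented in code; the field choices are assumptions, since the abstract does not define the schemas.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ComfortPerformance:
    stage: str      # life-cycle stage: "design", "handover", "operation", ...
    metric: str     # a standard comfort metric, e.g. operative temperature
    value: float

@dataclass
class ComfortRating:
    standard: str   # the standard used for classification (illustrative)
    category: str   # e.g. "B" when a performance falls in the class-B band

@dataclass
class ComfortHistoryEntry:
    when: datetime
    description: str                                # item/activity affecting comfort
    reference: Optional[ComfortPerformance] = None  # linked comfort entity
```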

Relevance:

30.00%

Publisher:

Abstract:

The retrofitting of existing buildings for decreased energy usage, through increased energy efficiency, and for minimum carbon dioxide emissions throughout their remaining lifetime is a major area of research. This research area requires development to provide building professionals with more efficient tools for determining building retrofit solutions. The overarching objective of this research is to develop such a tool through the implementation of a prescribed methodology. This has been achieved in three distinct steps. Firstly, the concept of using the degree-days modelling method as an adequate means of basing retrofit decisions upon was analysed, and the results illustrated that the concept had merit. Secondly, the combination of the degree-days modelling method with the Genetic Algorithms optimisation method was investigated as a method of determining optimal thermal energy retrofit solutions. Thirdly, the combined degree-days modelling method and Genetic Algorithms optimisation method were packaged into a building retrofit decision-support tool named BRaSS (Building Retrofit Support Software). The results demonstrate clearly that fundamental building information, simplified occupancy profiles and weather data, used in a static simulation modelling method, are a sufficient and adequate basis for retrofitting decisions. The results also show that basing retrofit decisions upon energy analysis results is the best means to guide a retrofit project and to achieve results which are optimal for a particular building. The results also indicate that the building retrofit decision-support tool, BRaSS, is an effective method of determining optimum thermal energy retrofit solutions.
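To make the combination concrete, here is a hedged Python sketch of a static degree-days energy model and the kind of fitness function a Genetic Algorithm could optimise over candidate retrofit measures. All U-values, costs and weights are illustrative, not taken from BRaSS.

```python
def annual_heating_kwh(u_value, area_m2=120.0, degree_days=2800, efficiency=0.85):
    """Static degree-days estimate of annual fabric heat loss:
    Q = U * A * DD * 24 / 1000, divided by system efficiency."""
    return u_value * area_m2 * degree_days * 24 / 1000 / efficiency

# Illustrative wall-retrofit options: name -> (post-retrofit U-value, cost in EUR)
OPTIONS = {
    "do nothing":          (2.10,    0),
    "cavity fill":         (0.55, 1800),
    "internal insulation": (0.35, 6200),
    "external insulation": (0.27, 9500),
}

def fitness(option, budget_eur=8000, eur_per_kwh=0.10, years=20):
    """GA-style fitness: lifetime energy savings minus cost, with
    over-budget candidates ruled infeasible."""
    u_new, cost = OPTIONS[option]
    if cost > budget_eur:
        return float("-inf")
    saving_kwh = annual_heating_kwh(2.10) - annual_heating_kwh(u_new)
    return saving_kwh * years * eur_per_kwh - cost

print(max(OPTIONS, key=fitness))  # the option a GA search would converge to here
```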

Relevance:

30.00%

Publisher:

Abstract:

A growing number of software development projects successfully exhibit a mix of agile and traditional software development methodologies. Many of these mixed methodologies are organization specific and tailored to a specific project. Our objective in this research-in-progress paper is to develop an artifact that can guide the development of such a mixed methodology. Using control theory, we design a process model that provides theoretical guidance to build a portfolio of controls that can support the development of a mixed methodology for software development. Controls, embedded in methods, provide a generalizable and adaptable framework for project managers to develop their mixed methodology specific to the demands of the project. A research methodology is proposed to test the model. Finally, future directions and contributions are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Objective: To describe the methodology and selection of quality indicators (QI) to be implemented in the EFFECT (EFFectiveness of Endometrial Cancer Treatment) project. EFFECT aims to monitor the variability in quality of care (QoC) of uterine cancer in Belgium, to compare the effectiveness of different treatment strategies in order to improve the QoC, and to check the internal validity of the QI in order to validate the impact of process indicators on outcome. Methods: A QI list was retrieved from the literature, recent guidelines and QI databases. The Belgian Healthcare Knowledge Center methodology was used for the selection process and involved an expert panel rating the QI on 4 criteria. The resulting scores and further discussion produced a final QI list. An online EFFECT module, including the list of variables required for measuring the QI, was developed by the Belgian Cancer Registry. Three test phases were performed to evaluate the relevance, feasibility and understanding of the variables and to test the compatibility of the dataset. Results: 138 QI were considered for further discussion and 82 QI were eligible for rating. Based on the rating scores and consensus among the expert panel, 41 QI were considered measurable and relevant. Testing of the data collection enabled optimization of the content and user-friendliness of the dataset and online module. Conclusions: This first Belgian initiative for monitoring the QoC of uterine cancer indicates that the previously used QI selection methodology is reproducible for uterine cancer. The QI list could be applied by other research groups for comparison. © 2013 Elsevier Inc.
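Purely as an illustration of the rating step, the sketch below retains an indicator only when the expert panel's median score meets a threshold on every criterion; the criterion names, scale and threshold are assumptions, not the Belgian Healthcare Knowledge Center's actual rules.

```python
from statistics import median

CRITERIA = ("relevance", "validity", "feasibility", "reliability")  # assumed names

def retain(scores_per_criterion, threshold=7):
    """scores_per_criterion: dict criterion -> list of panel scores (1-9 assumed)."""
    return all(median(scores_per_criterion[c]) >= threshold for c in CRITERIA)

panel = {"relevance": [8, 9, 7], "validity": [7, 8, 8],
         "feasibility": [9, 7, 8], "reliability": [7, 7, 9]}
print(retain(panel))  # True -> indicator kept for the final list
```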

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge in a form normally used by human operators. An architecture for process control is proposed which centres on an historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system are expressed by means of three types of rule: database updating rules, process control rules, and data deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state transition model for the application. The dividing instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
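The following Python sketch gives a flavour of duration reasoning over relative temporal knowledge, using only qualitative "before" relations of the kind an operator might supply; it is a toy illustration, not the paper's common-sense temporal theory.

```python
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    min_duration: float  # relative units, as supplied by an operator

# Qualitative knowledge in operator-friendly form: A happens before B.
BEFORE = [("fill tank", "heat tank"), ("heat tank", "open valve")]

processes = {p.name: p for p in (Process("fill tank", 2.0),
                                 Process("heat tank", 5.0),
                                 Process("open valve", 0.5))}

def earliest_start(target):
    """Naive duration reasoning: lower bound on when `target` can start,
    by summing minimum durations of all transitive predecessors."""
    preds = [a for a, b in BEFORE if b == target]
    return sum(processes[a].min_duration + earliest_start(a) for a in preds)

print(earliest_start("open valve"))  # 7.0 relative time units
```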

Relevance:

30.00%

Publisher:

Abstract:

The electronics industry is developing rapidly, and with it the increasingly complex problem of microelectronic equipment cooling. It has now become necessary for thermal design engineers to consider the problem of equipment cooling at some level. The use of Computational Fluid Dynamics (CFD) for such investigations is fast becoming a powerful and almost essential tool for the design, development and optimisation of engineering applications. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt fluctuations experienced by the turbulent energy and other parameters located at near-wall regions and shear layers, a particularly fine computational mesh is necessary, which inevitably increases the computer storage and run-time requirements. This paper discusses results from an investigation into the accuracy of currently used turbulence models. A newly formulated transitional hybrid turbulence model is also introduced, with comparisons against experimental data.

Relevance:

30.00%

Publisher:

Abstract:

This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers) we ask what use may be made of the EEICS for building models and tools which are of use in building decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS; statistical tabulation and database SQL, MDA and OLAP methods. The major problem of the non-comparability of definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used to ensure that the benefits of practical empirical modelling approaches are utilised in addition to scientifically well-founded and holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.

Relevance:

30.00%

Publisher:

Abstract:

The Symposium “Towards the sustainable use of Europe’s forests”, with the sub-title “Forest ecosystem and landscape research: scientific challenges and opportunities”, lists three fundamental substantive areas of research: forest management and practices, ecosystem processes and functional ecology, and environmental economics and sociology. This paper argues that essential catalytic elements are missing. Without these elements there is a great danger that the aimed-for world leadership in the forest sciences will not materialize. What are the missing elements? All the sciences, and in particular biology, environmental sciences, sociology, economics, and forestry, have evolved so that they include good scientific methodology. Good methodology is imperative in the design and analysis of research studies, the management of research data, and the interpretation of research findings. The methodological disciplines of Statistics, Modelling and Informatics (“SMI”) are crucial elements in a proposed Centre of European Forest Science, and the full involvement of professionals in these methodological disciplines is needed if the research of the Centre is to be world-class. The Distributed Virtual Institute (DVI) for Statistics, Modelling and Informatics in Forestry and the Environment (SMIFE) is a consortium with the aim of providing world-class methodological support and collaboration to European research in the areas of forestry and the environment. It is suggested that DVI: SMIFE should be a formal partner in the proposed Centre for European Forest Science.

Relevance:

30.00%

Publisher:

Abstract:

Software metrics are the key tool in software quality management. In this paper, we propose using support vector machines for regression, applied to software metrics, to predict software quality. In experiments we compare this method with other regression techniques such as Multivariate Linear Regression, Conjunctive Rule and Locally Weighted Regression. Results on the benchmark MIS dataset, using mean absolute error and correlation coefficient as regression performance measures, indicate that support vector machine regression is a promising technique for software quality prediction. In addition, our investigation of PCA-based metrics extraction shows that using only the first few Principal Components (PC) we can still obtain relatively good performance.
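The pipeline described is straightforward to reproduce with standard tools; the sketch below combines PCA-based extraction with support vector regression and reports the paper's two performance measures. Because the MIS data is not bundled with any library, synthetic data of a loosely MIS-like shape stands in for the real metrics, so the numbers printed are illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(390, 11))          # software metrics per module (assumed shape)
y = X @ rng.normal(size=11) + rng.normal(scale=0.5, size=390)  # stand-in for change counts

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(C=10.0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print(mean_absolute_error(y_te, pred),  # the paper's two performance measures
      np.corrcoef(y_te, pred)[0, 1])
```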

Relevance:

30.00%

Publisher:

Abstract:

Evaluating ship layout for human factors (HF) issues using simulation software such as maritimeEXODUS can be a long and complex process. The analysis requires the identification of relevant evaluation scenarios, encompassing evacuation and normal operations; the development of appropriate measures with which to gauge the performance of crew and vessel; and, finally, the interpretation of considerable simulation data. Currently, the only agreed guidelines for evaluating the HF performance of ship design relate to evacuation, and so conclusions drawn concerning the overall suitability of a ship design by one naval architect can be quite different from those of another. The complexity of the task grows as the size and complexity of the vessel increase and as the number and type of evaluation scenarios considered increases. Equally, it can be extremely difficult for fleet operators to set HF design objectives for new vessel concepts. The challenge for naval architects is to develop a procedure that allows both accurate and rapid assessment of the HF issues associated with vessel layout and crew operating procedures. In this paper we present a systematic and transparent methodology for assessing the HF performance of ship design which is both discriminating and diagnostic. The methodology is demonstrated using two variants of a hypothetical naval ship.
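As a flavour of what a discriminating, diagnostic HF assessment implies computationally, the sketch below normalises per-scenario performance measures against a baseline design and combines them with scenario weights into a single score. The scenarios, measures and weights are invented for illustration; they are not the paper's actual procedure.

```python
# Candidate design's simulation measures per scenario (illustrative values)
SCENARIOS = {
    "evacuation": {"time_to_muster_s": 420, "congestion_events": 12},
    "normal_ops": {"watch_change_s": 300, "distance_walked_m": 850},
}
BASELINE = {
    "evacuation": {"time_to_muster_s": 480, "congestion_events": 15},
    "normal_ops": {"watch_change_s": 290, "distance_walked_m": 900},
}
WEIGHTS = {"evacuation": 0.6, "normal_ops": 0.4}  # assumed importance weights

def hf_score(design, baseline, weights):
    """Lower is better: ratio of design to baseline for each measure,
    averaged within a scenario, then weighted across scenarios."""
    total = 0.0
    for scenario, w in weights.items():
        ratios = [design[scenario][m] / baseline[scenario][m]
                  for m in design[scenario]]
        total += w * sum(ratios) / len(ratios)
    return total

print(hf_score(SCENARIOS, BASELINE, WEIGHTS))  # < 1.0 means it beats the baseline
```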