18 results for Infrastructures linéaires
in Aston University Research Archive
Abstract:
The Indian petroleum industry is passing through a very dynamic business environment due to the liberalisation of many government policies, vertical integration among organisations and the presence of multinational companies. This has created a competitive environment among the public-sector organisations in the Indian petroleum industry. Effective project management for developing new infrastructures and maintaining the existing facilities has been considered one of the means of remaining competitive in this business environment. However, present project management practices suffer from many shortcomings, as time, cost and quality targets are missed in almost every project. This study focuses on identifying the issues in managing projects of an organisation in the Indian petroleum sector, with the involvement of its executives in a workshop environment. It also suggests remedial measures for resolving those issues by identifying critical success factors and enablers. The enablers not only resolve the present issues but also ensure superior performance. These are analysed in a quantitative framework to derive improvement measures for project management practices.
An integrated multiple criteria decision making approach for resource allocation in higher education
Abstract:
Resource allocation is one of the major decision problems arising in higher education. Resources must be allocated optimally so that the performance of universities can be improved. This paper applies an integrated multiple criteria decision making approach to the resource allocation problem. In the approach, the Analytic Hierarchy Process (AHP) is first used to determine the priority or relative importance of proposed projects with respect to the goals of the universities. Then, a Goal Programming (GP) model incorporating the AHP priority, system, and resource constraints is formulated for selecting the best set of projects without exceeding the limited available resources. The projects include 'hardware' (tangible university infrastructure) and 'software' (intangible effects that can benefit the university, its members, and its students). In this paper, two commercial packages are used: Expert Choice for determining the AHP priority ranking of the projects, and LINDO for solving the GP model. Copyright © 2007 Inderscience Enterprises Ltd.
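As a rough illustration of the two-stage approach, the sketch below derives AHP priorities from a pairwise comparison matrix and then selects projects under a budget cap. The comparison matrix, project costs and budget are hypothetical, and a brute-force binary selection stands in for the paper's full GP model (which the authors solve with LINDO).

```python
# Hypothetical illustration: AHP priorities followed by budget-constrained
# project selection. A brute-force search over subsets stands in for the
# paper's Goal Programming model.
import itertools
import numpy as np

# Pairwise comparison matrix for 4 proposed projects (Saaty scale):
# A[i, j] = importance of project i relative to project j. Values are invented.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
])

# AHP priority vector: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
priorities = w / w.sum()

costs = np.array([400, 250, 150, 100])  # hypothetical project costs
budget = 500                            # hypothetical resource limit

# Choose the subset of projects with maximum total AHP priority
# that stays within the budget.
best = max(
    (s for r in range(len(costs) + 1)
     for s in itertools.combinations(range(len(costs)), r)
     if costs[list(s)].sum() <= budget),
    key=lambda s: priorities[list(s)].sum(),
)
print("AHP priorities:", np.round(priorities, 3))
print("Selected projects:", best)
```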
Abstract:
This paper analyses the relationship between industrial total factor productivity and public capital across the 20 Italian administrative regions. It adds to the existing literature in a number of ways: it analyses a longer period (1970-98); it allows for the role of human capital accumulation; it tests for the existence of a long-run relationship between total factor productivity and public capital (through previously suggested panel techniques) and for weak exogeneity of public capital; and it assesses the significance of public capital within a non-parametric set-up based on the Free Disposal Hull. The results confirm that public capital has a significant impact on the evolution of total factor productivity, particularly in the Southern regions. This impact is mainly ascribed to the core infrastructures (roads and airports, harbours, railroads, water and electricity, telecommunications). Also, core infrastructures are weakly exogenous. © 2005 Regional Studies Association.
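For orientation, long-run relationships of this kind are typically written in the following stylized form, where total factor productivity in region i at year t depends on public capital G and human capital H (illustrative notation, not the paper's exact specification):

```latex
% Stylized long-run panel relationship (illustrative, not the paper's equation)
\ln \mathrm{TFP}_{it} = \alpha_i + \beta \, \ln G_{it} + \gamma \, \ln H_{it} + \varepsilon_{it}
```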
Abstract:
This thesis follows the argument that, to fully understand the current position of national research laboratories in Great Britain, one needs to study the historical development of the government research establishment as a specific social institution. A particular model is outlined in which it is argued that institutional characteristics evolve through the continual interplay between internal development and environmental factors within a changing political and economic context, and that the continuous development of an institution depends on its ability to adapt to changes in its operational environment. Within this framework, important historical precedents for formal government institutional support for applied research are identified, and the transition from private to public patronage documented. The emergence and consolidation of government research laboratories in Britain is described in detail. The subsequent relative decline of public laboratories is interpreted in terms of the undermining of a traditional role, resulting in a legitimation crisis. It is concluded that it is no longer feasible to consider the public research laboratory as a coherent institutional form, and that the future of each individual laboratory can only be considered in relation to the institutional needs of its own sphere of operation. Nevertheless, the laboratories have been forced into decline in an essentially unplanned way, which may have serious consequences for the maintenance of the scientific and technical infrastructures necessary for material progress in the national context.
Abstract:
The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. It is the purpose of this paper to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data that can be considered useful when defining ecosystems and their future persistence under different climatic or development scenarios. The paper will present the architecture and illustrate the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
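As a rough sketch of the kind of computation such a service performs, the example below scores how similar each grid cell's environmental variables are to a user-specified reference ecosystem. Mahalanobis distance is used here purely as an illustrative similarity measure; the variables and data are hypothetical rather than taken from eHabitat itself.

```python
# Hypothetical illustration: score each grid cell's similarity to a
# reference ecosystem. Mahalanobis distance serves as the similarity
# measure; variables and data are invented.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Raster stack of 1000 cells x 3 environmental variables
# (e.g. temperature, rainfall, elevation), standardized units.
cells = rng.normal(size=(1000, 3))

# Reference ecosystem: a sample of cells inside a protected area.
reference = rng.normal(loc=0.5, size=(50, 3))
mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

# Squared Mahalanobis distance of every cell to the reference ecosystem.
diff = cells - mu
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Map distances to a 0-1 similarity score via the chi-square CDF.
similarity = 1.0 - chi2.cdf(d2, df=cells.shape[1])
print("Indices of the most similar cells:", np.argsort(similarity)[-5:])
```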
Abstract:
Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this paper we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation. We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of Earth observation, we believe the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
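A minimal sketch of the metadata-inheritance mechanism described above: datasets that lack their own quality records inherit them from a parent collection, so one quality statement can cover a large number of granules. The class and field names are hypothetical, not the paper's model.

```python
# Hypothetical illustration of quality-metadata inheritance: granules
# without their own quality record fall back to their parent collection.
from dataclasses import dataclass


@dataclass
class QualityInfo:
    accuracy: str | None = None   # e.g. "RMSE 0.5 K"
    lineage: str | None = None    # provenance statement


@dataclass
class Dataset:
    name: str
    quality: QualityInfo | None = None
    parent: "Dataset | None" = None

    def effective_quality(self) -> QualityInfo | None:
        """Walk up the hierarchy until a quality record is found."""
        node = self
        while node is not None:
            if node.quality is not None:
                return node.quality
            node = node.parent
        return None


collection = Dataset("SST-collection",
                     QualityInfo(accuracy="RMSE 0.5 K", lineage="AVHRR L3"))
granule = Dataset("SST-2024-06-01", parent=collection)
print(granule.effective_quality())  # inherited from the collection
```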
Abstract:
Because metadata that underlies semantic web applications is gathered from distributed and heterogeneous data sources, it is important to ensure its quality (i.e., reducing duplicates, spelling errors, and ambiguities). However, current infrastructures that acquire and integrate semantic data have only marginally addressed the issue of metadata quality. In this paper we present our metadata acquisition infrastructure, ASDI, which pays special attention to ensuring that high-quality metadata is derived. Central to the architecture of ASDI is a verification engine that relies on several semantic web tools to check the quality of the derived data. We tested our prototype in the context of building a semantic web portal for our lab, KMi. An experimental evaluation comparing the automatically extracted data against manual annotations indicates that the verification engine enhances the quality of the extracted semantic metadata.
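The sketch below illustrates one kind of check a verification engine might run: flagging near-duplicate entity names in extracted metadata so that a single real-world entity is not stored under several spellings. This single string-similarity heuristic is only an illustration; ASDI's actual engine combines several semantic web tools.

```python
# Hypothetical illustration: flag near-duplicate entity names in
# extracted metadata using a simple string-similarity heuristic.
from difflib import SequenceMatcher
from itertools import combinations

extracted = ["Knowledge Media Institute", "Knowledge Media Inst.",
             "KMi", "Open University"]

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Heuristic test for likely duplicate names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

for a, b in combinations(extracted, 2):
    if similar(a, b):
        print(f"possible duplicate: {a!r} ~ {b!r}")
```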
Abstract:
The Indian petroleum industry is passing through a very dynamic business environment due to liberalisation. Effective project management for developing new infrastructures and maintaining the existing facilities has been considered one of the means of remaining competitive, but present practices suffer from many shortcomings, as time, cost and quality targets are missed in almost every project. This study focuses on identifying the specific causes of project failure by first demonstrating the characteristics of projects in the Indian petroleum industry, and suggests some remedial measures for resolving these issues. The suggested project management model is integrated through an information management system and demonstrated through a case study.
Abstract:
With their compact spectrum and high tolerance to residual chromatic dispersion, duobinary formats are attractive for the deployment of 40 Gb/s technology on 10 Gb/s WDM long-haul transmission infrastructures. Here, we compare the robustness of various duobinary formats when facing 40 Gb/s transmission impairments.
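For readers unfamiliar with the format, the sketch below shows the duobinary signalling principle these formats share: a differential precoder removes error propagation, and a 1 + D filter maps the binary stream onto three levels, which is what compacts the spectrum. The bit pattern is arbitrary; real transmitters implement the filtering optically or electrically rather than as discrete arithmetic.

```python
# Hypothetical illustration of duobinary signalling with an arbitrary
# bit pattern: precode, apply the 1 + D filter, then decode.
import numpy as np

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])

# Differential precoder: d[k] = b[k] XOR d[k-1], with d[-1] = 0.
d = np.zeros(len(bits), dtype=int)
prev = 0
for k, b in enumerate(bits):
    d[k] = b ^ prev
    prev = d[k]

# Duobinary (1 + D) encoding: a[k] = d[k] + d[k-1], three levels {0, 1, 2}.
a = d + np.concatenate(([0], d[:-1]))

# Thanks to the precoder, decoding is memoryless: b[k] = a[k] mod 2.
recovered = a % 2
print("three-level signal:", a)
print("recovered == bits :", np.array_equal(recovered, bits))
```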
Abstract:
Purpose: To understand the tensions that servitization activities create between actors within networks. Design/methodology/approach: Interviews were conducted with manufacturers, intermediaries and customers across a range of industrial sectors. Findings: Tensions relating to two key sets of capabilities are identified: in developing or acquiring (i) operant technical expertise and (ii) operand service infrastructure. The former tension concerns whom knowledge is co-created with and where expertise resides. The latter involves a territorial investment component, with firms developing strategies to acquire greater access to, or ownership of, infrastructures closer to customers. Developing and acquiring these capabilities is a strategic decision on the part of managers of servitizing firms, in order to gain recognized power and control in a particular territory. Originality/value: This paper explores how firms’ servitization activities involve value appropriation (from the rest of the network), contrasting with the narrative norm for servitization: that it creates additional value. There is a need to understand the tensions that servitization activities create within networks. Some firms may be able to improve servitization performance through co-operation rather than competition, generating co-opetitive relationships. Others may need to become much more aggressive, if they are to take a greater share of the ‘value’ from the value chain.
Abstract:
The emergence of innovative and revolutionary Integration Technologies (IntTech) has highly influenced local government authorities (LGAs) in their decision-making process. LGAs that plan to adopt such IntTech may consider this a serious investment. Advocates, however, claim that such IntTech have emerged to overcome integration problems at all levels (e.g. data, object and process). With the emergence of electronic government (e-Government), LGAs have turned to IntTech to fully automate and offer their services on-line and to integrate their IT infrastructures. While earlier research on the adoption of IntTech has considered several factors (e.g. pressure, technological, support, and financial), inadequate attention has been paid to systematically investigating the individual, decision and organisational context factors influencing top management's decisions for adopting IntTech in LGAs. It is widely recognised that the success of an organisation's operations relies heavily on understanding individuals' attitudes and behaviours, the surrounding context and the types of decisions taken. Based on empirical evidence gathered through two intensive case studies, this paper attempts to investigate the factors that influence decision makers while adopting IntTech. The findings illustrate two different doctrines: one inclined and receptive towards taking risky decisions, the other disinclined. Several underlying rationales can be attributed to such mind-sets in LGAs. The authors aim to contribute to the body of knowledge by exploring the factors influencing top management's decision-making process while adopting IntTech vital for facilitating LGAs' operational reforms.
Abstract:
Distributed fibre sensors provide unique capabilities for monitoring large infrastructures with high resolution. Practically all these sensors are based on some kind of backscattering interaction. A pulsed activating signal is launched on one side of the sensing fibre and the backscattered signal is read as a function of the time of flight of the pulse along the fibre. A key limitation in the measurement range of all these sensors is introduced by fibre attenuation. As the pulse travels along the fibre, the losses in the fibre cause a drop in signal contrast and consequently a growth in the measurement uncertainty. In typical single-mode fibres, attenuation imposes a range limit of less than 30 km, for resolutions on the order of 1-2 meters. An interesting improvement in this performance can be obtained by using distributed amplification along the fibre [1]. Distributed amplification allows a more homogeneous signal power along the sensing fibre, which also makes it possible to reduce the signal power at the input and therefore avoid nonlinearities. However, in long structures (≥ 50 km), plain distributed amplification does not perfectly compensate the losses, and significant power variations along the fibre are to be expected, leading to inevitable limitations in the measurements. From this perspective, it is intuitively clear that the best possible solution for distributed sensors would be offered by a virtually transparent fibre, i.e. a fibre exhibiting effectively zero attenuation in the spectral region of the pulse. In addition, it can be shown that lossless transmission is the working point that minimizes the build-up of amplified spontaneous emission (ASE) noise. © 2011 IEEE.
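A back-of-the-envelope calculation shows why attenuation bounds the range: both the probe pulse and the backscattered signal experience fibre loss, so the detected power falls at twice the per-kilometre attenuation. The sketch below uses a typical 0.2 dB/km figure for standard single-mode fibre at 1550 nm; the distances are chosen for illustration.

```python
# Back-of-the-envelope: round-trip loss seen by backscatter from
# distance z, assuming 0.2 dB/km attenuation (typical SMF at 1550 nm).
import numpy as np

alpha_db_per_km = 0.2
distance_km = np.array([10, 30, 50, 100])

round_trip_loss_db = 2 * alpha_db_per_km * distance_km
relative_power = 10 ** (-round_trip_loss_db / 10)

for z, loss, p in zip(distance_km, round_trip_loss_db, relative_power):
    print(f"{z:4d} km: {loss:5.1f} dB round trip "
          f"(signal reduced to {p:.0e} of the lossless case)")
```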
Abstract:
Mobile communication and networking infrastructures play an important role in the development of smart cities, supporting the real-time information exchange and management required by modern urbanization. Mobile WiFi devices that help offload data traffic from the macro-cell base station and serve end users within a closer range can significantly improve the connectivity of wireless communications between essential components of a city, including infrastructural and human devices. However, this offloading function, through interworking between LTE and WiFi systems, will change the pattern of resource distributions operated by the base station. In this paper, a resource allocation scheme is proposed to ensure stable service coverage and end-user quality of experience (QoE) when offloading takes place in a macro-cell environment. In this scheme, a rate redistribution algorithm is derived to form a parametric scheduler that meets the required levels of efficiency and fairness, guided by a no-reference quality assessment metric. We show that the performance of resource allocation can be regulated by this scheduler without affecting the service coverage offered by the WLAN access point. The performance of different interworking scenarios and macro-cell scheduling policies is also compared.
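As an illustration of a parametric efficiency-fairness trade-off of this general kind, the sketch below computes alpha-fair rate allocations: alpha = 0 maximizes raw throughput, alpha = 1 gives proportional fairness, and larger alpha is progressively fairer. The per-user peak rates, the cell capacity and the use of alpha-fairness itself are assumptions for illustration, not the paper's algorithm.

```python
# Hypothetical illustration: alpha-fair rate allocation as a parametric
# efficiency/fairness trade-off. Peak rates and capacity are invented.
import numpy as np
from scipy.optimize import minimize

peak = np.array([10.0, 6.0, 2.0])   # per-user peak rates (Mb/s)
capacity = 12.0                     # shared cell capacity (Mb/s)

def alpha_fair_allocation(alpha: float) -> np.ndarray:
    """Maximize the sum of alpha-fair utilities under the capacity cap."""
    def neg_utility(r):
        if abs(alpha - 1.0) < 1e-9:
            return -np.sum(np.log(r))           # proportional fairness
        return -np.sum(r ** (1 - alpha) / (1 - alpha))
    res = minimize(
        neg_utility,
        x0=np.minimum(peak, capacity / len(peak)),  # feasible start
        bounds=[(1e-6, p) for p in peak],
        constraints=[{"type": "ineq", "fun": lambda r: capacity - r.sum()}],
    )
    return res.x

for alpha in (0.0, 1.0, 2.0):
    rates = alpha_fair_allocation(alpha)
    print(f"alpha={alpha}: rates = {np.round(rates, 2)} Mb/s")
```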
Abstract:
UncertWeb is a European research project running from 2010 to 2013 that will realize the uncertainty-enabled model web. The assumption is that data services, in order to be useful, need to provide information about the accuracy or uncertainty of the data in a machine-readable form. Models taking these data as input should understand this and propagate errors through model computations, and quantify and communicate the errors or uncertainties generated by the model approximations. The project will develop technology to realize this and provide demonstration case studies.
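A minimal sketch of the uncertainty propagation idea, under the assumption that a data service reports a value with its standard deviation: a downstream model consumes samples rather than a single number, so the output uncertainty reflects the input uncertainty. The toy model and its parameters are hypothetical.

```python
# Hypothetical illustration: Monte Carlo propagation of an input
# uncertainty through a toy model.
import numpy as np

rng = np.random.default_rng(42)

# Input from a (hypothetical) data service: a value with its uncertainty.
temperature = {"mean": 15.0, "sd": 1.2}   # degrees C

def toy_model(t):
    """Some nonlinear model computation on temperature."""
    return 0.05 * t ** 2 + 0.3 * t

# Sample the input distribution and push the samples through the model.
samples = rng.normal(temperature["mean"], temperature["sd"], size=100_000)
outputs = toy_model(samples)

print(f"model output: {outputs.mean():.2f} +/- {outputs.std():.2f}")
```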