936 results for airport infrastructures


Relevance: 10.00%

Abstract:

Groundwater is an important resource in the UK, with 45% of public water supplies in the Thames Water region derived from subterranean sources. In urban areas, groundwater has been affected by anthropogenic activities over a long period of time and from a multitude of sources. At present, groundwater quality is assessed using a range of chemical species to determine the extent of contamination. However, analysing a complex mixture of chemicals is time-consuming and expensive, whereas an ecotoxicity test provides information on (a) the degree of pollution present in the groundwater and (b) the potential effect of that pollution. Microtox (TM), Eclox (TM) and Daphnia magna microtests were used in conjunction with standard chemical protocols to assess the contamination of groundwaters from sites throughout the London Borough of Hounslow and nearby Heathrow Airport. Because of their precision, range of responses and ease of use, the Daphnia magna and Microtox (TM) tests appear to be the most effective bioassays for assessing groundwater toxicity. However, neither test is ideal, because it is also essential to monitor water hardness. Eclox (TM) does not appear to be suitable for groundwater-quality assessment in this area, because it is adversely affected by high total dissolved solids and electrical conductivity.

Relevance: 10.00%

Abstract:

This research examines dynamics associated with new representational technologies in complex organizations through a study of the use of a Single Model Environment, prototyping and simulation tools in the mega-project to construct Terminal 5 at Heathrow Airport, London. The ambition of the client, BAA, was to change industrial practices, reducing project costs and time to delivery through new contractual arrangements and new digitally-enabled collaborative ways of working. The research highlights changes over time and addresses two areas of 'turbulence' in the use of: 1) technologies, where there is a dynamic tension between the desire to constantly improve, change and update digital technologies and the need to standardise practices, maintaining and defending the overall integrity of the system; and 2) representations, where dynamics result from the responsibilities and liabilities associated with sharing digital representations and from a lack of trust in the validity of data from other firms. These dynamics are tracked across three stages of this well-managed and innovative project and indicate a generic need to treat digital infrastructure as an ongoing strategic issue.

Relevance: 10.00%

Abstract:

The Danish Eulerian Model (DEM) is a powerful air pollution model, designed to calculate the concentrations of various dangerous species over a large geographical region (e.g. Europe). It takes into account the main physical and chemical processes between these species, the actual meteorological conditions, emissions, etc. This is a huge computational task that requires significant storage and CPU time, so parallel computing is essential for the efficient practical use of the model. Several efficient parallel versions of the model have been created over the past few years. A suitable parallel version of DEM using the Message Passing Interface (MPI) library was implemented on two powerful supercomputers at EPCC in Edinburgh, available via the HPC-Europa programme for transnational access to research infrastructures in the EC: a Sun Fire E15K and an IBM HPCx cluster. Although the implementation is in principle the same for both supercomputers, a few modifications were needed to port the code successfully to the IBM HPCx cluster. Performance analysis and parallel optimization were then carried out, and results from the benchmarking experiments are presented in this paper. Another set of experiments was carried out to investigate the sensitivity of the model to variation of some chemical rate constants in the chemical submodel, which required certain modifications of the code. The results obtained will be used for further sensitivity-analysis studies using Monte Carlo simulation.
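The domain-decomposition idea behind such parallel versions can be sketched in a few lines. This is a toy illustration only, assuming a first-order decay step in place of DEM's actual chemistry and transport, and Python worker threads in place of MPI ranks:

```python
from concurrent.futures import ThreadPoolExecutor

def step_strip(strip, k=0.1):
    # Toy stand-in for a chemistry/transport step:
    # first-order decay of every concentration cell in the strip.
    return [[c * (1.0 - k) for c in row] for row in strip]

def parallel_step(grid, nworkers=4):
    # Row-wise domain decomposition: split the horizontal grid into
    # strips and advance each strip concurrently, mirroring how a
    # distributed-memory code assigns subdomains to MPI ranks.
    chunk = max(1, len(grid) // nworkers)
    strips = [grid[i:i + chunk] for i in range(0, len(grid), chunk)]
    with ThreadPoolExecutor(max_workers=nworkers) as pool:
        results = list(pool.map(step_strip, strips))
    return [row for strip in results for row in strip]
```

In a real MPI code the strips would live permanently on separate ranks and exchange halo rows each step; here the gather is implicit in the flattening.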

Relevance: 10.00%

Abstract:

Large scientific applications are usually developed, tested and used by groups of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments, and various tools and software exist to create them. Some currently available software frameworks use these tools to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application under continuous development and experimentation by different institutes in Europe. This paper proposes the design of a collaborative distributed computing environment for UNI-DEM in particular, but the proposed framework may fit many other large scientific applications as well.

Relevance: 10.00%

Abstract:

A large and complex IT project may involve multiple organizations and be constrained within a temporal period. An organization is a system comprising people, activities, processes, information, resources and goals. Understanding and modelling such a project and its interrelationships with the relevant organizations are essential for organizational project planning. This paper introduces the problem articulation method (PAM) as a semiotic method for organizational infrastructure modelling. PAM offers a suite of techniques that enables the articulation of business, technical and organizational requirements, delivering an infrastructural framework to support the organization. It works by eliciting and formalizing organizational abstractions (e.g. processes, activities, relationships, responsibilities, communications, resources, agents, dependencies and constraints) and mapping these abstractions to represent the manifestation of the "actual" organization. Many analysts forgo organizational modelling methods and use localized, ad hoc point solutions, but these are not amenable to organizational infrastructure modelling. A case study of the Infrared Atmospheric Sounding Interferometer (IASI) is used to demonstrate the applicability of PAM and to examine its relevance and significance in dealing with innovation and change in organizations.

Relevance: 10.00%

Abstract:

This work presents a new method for activity extraction and reporting from video, based on the aggregation of fuzzy relations. Trajectory clustering is first employed, mainly to discover the points of entry and exit of mobile objects appearing in the scene. In a second step, proximity relations between the resulting clusters of detected mobile objects and contextual elements of the scene are modelled using fuzzy relations, which can then be aggregated with typical soft-computing algebra. A clustering algorithm based on computing the transitive closure of the fuzzy relations builds the structure of the scene and characterises the different ongoing activities. Discovered activity zones can be reported as activity maps at different granularities thanks to the analysis of the transitive closure matrix. Taking advantage of the soft relation properties, activity zones and related activities can be labelled in a more human-like language. We present results obtained on real videos of apron monitoring at Toulouse airport in France.
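The transitive closure step can be sketched as follows, assuming the standard max-min composition for fuzzy relations (the abstract does not specify the exact algebra used):

```python
def max_min_compose(r, s):
    # Max-min composition: (r ∘ s)[i][j] = max_k min(r[i][k], s[k][j]).
    n = len(r)
    return [[max(min(r[i][k], s[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(r):
    # Iterate r := r ∪ (r ∘ r) (fuzzy union = element-wise max)
    # until a fixpoint is reached; the result is max-min transitive.
    n = len(r)
    closure = [row[:] for row in r]
    while True:
        comp = max_min_compose(closure, closure)
        nxt = [[max(closure[i][j], comp[i][j]) for j in range(n)]
               for i in range(n)]
        if nxt == closure:
            return closure
        closure = nxt
```

Thresholding the closure matrix at different levels then yields the activity zones at different granularities mentioned in the abstract.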

Relevance: 10.00%

Abstract:

As integrated software solutions reshape project delivery, they alter the bases for collaboration and competition across firms in complex industries. This paper synthesises and extends literatures on strategy in project-based industries and digitally-integrated work to understand how project-based firms interact with digital infrastructures for project delivery. Four identified strategies are to: 1) develop and use capabilities to shape the integrated software solutions that are used in projects; 2) co-specialize, developing complementary assets to work repeatedly with a particular integrator firm; 3) retain flexibility by developing and maintaining capabilities in multiple digital technologies and processes; and 4) manage interfaces, translating work into project formats for coordination while hiding proprietary data and capabilities in internal systems. The paper articulates the strategic importance of digital infrastructures for delivery as well as product architectures. It concludes by discussing managerial implications of the identified strategies and areas for further research.

Relevance: 10.00%

Abstract:

Over the last few years, load growth, increases in intermittent generation, declining technology costs and increasing recognition of the importance of customer behaviour in energy markets have brought about a change in the focus of Demand Response (DR) in Europe. The long-standing programmes involving large industries, through interruptible tariffs and time-of-day pricing, have been increasingly complemented by programmes aimed at commercial and residential customer groups. Developments in DR vary substantially across Europe, reflecting national conditions and triggered by different sets of policies, programmes and implementation schemes. This paper examines experiences within European countries as well as at European Union (EU) level, with the aim of understanding which factors have facilitated or impeded advances in DR. It describes initiatives, studies and policies of various European countries, with in-depth case studies of the UK, Italy and Spain. It concludes that, while business programmes and technical and economic potentials vary across Europe, there are common reasons why coordinated DR policies have been slow to emerge: limited knowledge of DR energy-saving capacities; high cost estimates for DR technologies and infrastructures; and policies focused on creating the conditions for liberalising the EU energy markets.

Relevance: 10.00%

Abstract:

This study investigates the determinants of commercial and retail airport revenues as well as revenues from real estate operations. Cross-sectional OLS, 2SLS and robust regression models of European airports identify a number of significant drivers of airport revenues. Aviation revenues per passenger are mainly determined by the per-capita national income of the country in which the airport is located, the percentage of leisure travelers and the size of the airport, proxied by total aviation revenues. Main drivers of commercial revenues per passenger include the total number of passengers passing through the airport, the ratio of commercial to total revenues, the national income, the shares of domestic and leisure travelers and the total number of flights. These results are in line with previous findings of a negative influence of business travelers on commercial revenues per passenger. We also find that a large amount of retail space per passenger is generally associated with lower commercial revenues per square meter, confirming decreasing marginal revenue effects. Real estate revenues per passenger are positively associated with national income per capita at the airport location, the share of intra-EU passengers and the percentage of delayed flights. Overall, aviation and non-aviation revenues appear to be strongly interlinked, underlining the potential for a comprehensive airport management strategy above and beyond mere cost minimization of the aviation sector.
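The cross-sectional OLS idea can be illustrated with a one-regressor sketch; the airport figures below are invented for illustration and are not the study's data:

```python
def simple_ols(x, y):
    # One-regressor OLS: beta = cov(x, y) / var(x); alpha = ybar - beta * xbar.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    return alpha, beta

# Hypothetical illustration: commercial revenue per passenger (EUR)
# regressed on annual passengers (millions) for invented airports.
passengers = [5.0, 12.0, 20.0, 35.0, 60.0]
revenue = [3.1, 3.8, 4.6, 5.9, 8.4]
alpha, beta = simple_ols(passengers, revenue)
```

The multivariate, 2SLS and robust variants used in the study generalize this in the obvious way but require matrix algebra and instrument sets beyond a short sketch.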

Relevance: 10.00%

Abstract:

An alternative approach to understanding innovation is developed using two intersecting ideas. The first is that successful innovation requires consideration of the social and organizational contexts in which it is located; the complex context of construction work is characterized by inter-organizational collaboration, a project-based approach and power distributed amongst collaborating organizations. The second is that innovations can be divided into two modes: ‘bounded’, where the implications of innovation are restricted within a single, coherent sphere of influence, and ‘unbounded’, where the effects of implementation spill over beyond this. Bounded innovations are adequately explained within the construction literature. Less discussed are unbounded innovations, where the collaboration of many firms is required for successful implementation, even though many innovations can be considered unbounded within construction's inter-organizational context. It is argued that unbounded innovations require an approach that can understand and facilitate the interactions both among a range of actors and between those actors and technological artefacts. The insights of a sociology-of-technology approach can be applied to the multiplicity of negotiations and alignments that constitute the implementation of unbounded innovation. The utility of concepts from the sociology of technology, including ‘system building’ and ‘heterogeneous engineering’, is demonstrated by applying them to an empirical study of an unbounded innovation on a major construction project (the new terminal at Heathrow Airport, London, UK). This study suggests that ‘system building’ produces outcomes that are not only transformations of practices, processes and systems, but also potentially transformations of the technologies themselves.

Relevance: 10.00%

Abstract:

Recently, major processor manufacturers have announced a dramatic shift in their paradigm for increasing computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift in the development of algorithms for computationally expensive tasks, such as data mining applications. Work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, clusters of workstations and even large-scale distributed computing infrastructures provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism. The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. Next, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The following paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems, based on a hierarchical communication topology that addresses issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection, describing a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews of each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
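The thread-level parallelism mentioned for decision tree construction can be sketched by scoring candidate splits concurrently. This is an illustrative toy using Gini impurity, not the workshop paper's actual algorithm:

```python
from concurrent.futures import ThreadPoolExecutor

def gini(labels):
    # Gini impurity of a list of class labels.
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def split_score(args):
    # Weighted impurity of splitting on feature f at a threshold.
    rows, labels, f, threshold = args
    left = [y for x, y in zip(rows, labels) if x[f] <= threshold]
    right = [y for x, y in zip(rows, labels) if x[f] > threshold]
    n = len(labels)
    return (f, threshold,
            len(left) / n * gini(left) + len(right) / n * gini(right))

def best_split(rows, labels):
    # Thread-level parallelism: score every candidate (feature,
    # threshold) pair concurrently, keep the lowest-impurity split.
    tasks = [(rows, labels, f, row[f])
             for f in range(len(rows[0])) for row in rows]
    with ThreadPoolExecutor() as pool:
        scored = list(pool.map(split_score, tasks))
    return min(scored, key=lambda t: t[2])
```

On a shared-memory machine each worker reads the same arrays, so no data partitioning is needed, which is exactly what makes node-level split evaluation a natural target for threads.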

Relevance: 10.00%

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
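The proposed metadata inheritance can be sketched as a simple override merge, assuming quality statements are key-value metadata records (the field names below are hypothetical, not from the EO model itself):

```python
def effective_quality(collection_quality, dataset_quality):
    # A dataset inherits every quality statement recorded at the
    # collection level; statements restated at the dataset level
    # override the inherited ones. This lets one collection-level
    # record serve a large number of member datasets.
    merged = dict(collection_quality)
    merged.update(dataset_quality)
    return merged
```

The practical benefit is that a quality statement common to thousands of granules is stored once and resolved on demand, rather than duplicated into every dataset's metadata.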

Relevance: 10.00%

Abstract:

How can organizations use digital infrastructure to realise physical outcomes? The design and construction of London Heathrow Terminal 5 is analysed to build new theoretical understanding of visualisation and materialisation practices in the transition from digital design to physical realisation. In the project studied, an integrated software solution is introduced as an infrastructure for delivery. The analyses articulate the work done to maintain this digital infrastructure and also to move designs beyond the closed world of the computer to a physical reality. In changing medium, engineers use heterogeneous trials to interrogate and address the limitations of an integrated digital model. The paper explains why such trials, which involve the reconciliation of digital and physical data through parallel and iterative forms of work, provide a robust practice for realising goals that have physical outcomes. It argues that this practice is temporally different from, and at times in conflict with, building a comprehensive dataset within the digital medium. The paper concludes by discussing the implications for organizations that use digital infrastructures in seeking to accomplish goals in digital and physical media.

Relevance: 10.00%

Abstract:

We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing timeseries and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
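A GetMap request against a server such as ncWMS follows the standard WMS 1.3.0 parameter set, with TIME and ELEVATION as the dimension parameters relevant to multidimensional data. The endpoint and layer name below are hypothetical:

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, time=None, elevation=None,
               width=512, height=512, crs="CRS:84"):
    # Assemble a standard WMS 1.3.0 GetMap request. TIME and
    # ELEVATION are the WMS dimension parameters that expose the
    # extra axes of multidimensional gridded data.
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time is not None:
        params["TIME"] = time
    if elevation is not None:
        params["ELEVATION"] = elevation
    return base_url + "?" + urlencode(params)
```

A client tool issues one such request per map tile or animation frame, which is why efficient data reading and image generation on the server side matter so much.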