23 results for Infrastructures linéaires

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

A major infrastructure project is used to investigate the role of digital objects in the coordination of engineering design work. From a practice-based perspective, research emphasizes objects as important in enabling cooperative knowledge work and knowledge sharing. The term ‘boundary object’ has come to be used in the analysis of mutual and reciprocal knowledge sharing around physical and digital objects. The aim here is to extend this work by analysing the introduction of an extranet into the public–private partnership project used to construct a new motorway. Multiple categories of digital objects are mobilized in coordination across heterogeneous, cross-organizational groups. The main findings are that digital objects provide mechanisms for accountability and control, as well as for mutual and reciprocal knowledge sharing; and that different types of objects are nested, forming a digital infrastructure for project delivery. Reconceptualizing boundary objects as a digital infrastructure for delivery has practical implications for management practices on large projects and for the use of digital tools, such as building information models, in construction. It provides a starting point for future research into the changing nature of digitally enabled coordination in project-based work.

Relevance:

20.00%

Publisher:

Abstract:

Increased use of technology is necessary for industrial control systems to maintain and monitor industrial, infrastructural, or environmental processes, and the need to secure such systems and identify threats to them is equally critical. Securing Critical Infrastructures and Critical Control Systems: Approaches for Threat Protection provides a full and detailed understanding of the vulnerabilities and security threats that exist within an industrial control system. This collection of research defines and analyzes the technical, procedural, and managerial responses to securing these systems.

Relevance:

10.00%

Publisher:

Abstract:

Soil data and reliable soil maps are imperative for environmental management, conservation and policy. Data from historical point surveys, e.g. experimental site data and farmers' fields, can serve this purpose. However, legacy soil information was not necessarily collected with spatial analysis and mapping in mind, so the data may not have immediately useful geo-references. Methods are required to utilise these historical soil databases so that we can produce quantitative maps of soil properties, both to assess spatial and temporal trends and to assess where future sampling is required. This paper discusses two such databases: the Representative Soil Sampling Scheme, which has monitored the agricultural soil in England and Wales from 1969 to 2003 (between 400 and 900 bulked soil samples were taken annually from different agricultural fields); and the former State Chemistry Laboratory, Victoria, Australia, where between 1973 and 1994 approximately 80,000 soil samples were submitted for analysis by farmers. Previous statistical analyses have been performed using administrative regions (with sharp boundaries) for both databases, which are largely unrelated to natural features. For a more detailed spatial analysis that can be linked to climate and terrain attributes, gradual variation of these soil properties should be described. Geostatistical techniques such as ordinary kriging are suited to this. This paper describes the format of the databases and initial approaches as to how they can be used for digital soil mapping. For this paper we have selected soil pH to illustrate the analyses for both databases.
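
As a sketch of how such a legacy point database might feed a digital soil map, the snippet below interpolates soil pH with ordinary kriging. It is a minimal illustration: the file and column names are assumptions, and pykrige is one commonly used library, not necessarily the authors' tool. The kriging variance it returns also flags poorly sampled areas where future sampling would be most informative.

```python
# A minimal sketch of ordinary kriging for soil pH from a legacy point
# database; the file and column names are assumptions, and pykrige is one
# common choice of library, not necessarily the authors' tool.
import numpy as np
import pandas as pd
from pykrige.ok import OrdinaryKriging

samples = pd.read_csv("legacy_soil_samples.csv")   # assumed columns: x, y, ph

ok = OrdinaryKriging(
    samples["x"].values,
    samples["y"].values,
    samples["ph"].values,
    variogram_model="spherical",                   # a common model for soil data
)

# Predict pH on a 1 km grid; the kriging variance highlights poorly
# sampled areas where future sampling would be most informative.
gridx = np.arange(samples["x"].min(), samples["x"].max(), 1000.0)
gridy = np.arange(samples["y"].min(), samples["y"].max(), 1000.0)
ph_pred, ph_var = ok.execute("grid", gridx, gridy)
```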

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we ask why so much ecological scientific research does not have a greater policy impact in the UK. We argue that there are two potentially important and related reasons for this failing. First, much current ecological science is not being conducted at a scale that is readily meaningful to policy-makers. Second, to make much of this research policy-relevant requires collaborative interdisciplinary research between ecologists and social scientists. However, the challenge of undertaking useful interdisciplinary research only re-emphasises the problems of scale: ecologists and social scientists traditionally frame their research questions at different scales and consider different facets of natural resource management, setting different objectives and using different language. We argue that if applied ecological research is to have greater impact in informing environmental policy, much greater attention needs to be given to the scale of the research efforts as well as to the interaction with social scientists. Such an approach requires an adjustment in existing research and funding infrastructures.

Relevance:

10.00%

Publisher:

Abstract:

The Danish Eulerian Model (DEM) is a powerful air pollution model, designed to calculate the concentrations of various dangerous species over a large geographical region (e.g. Europe). It takes into account the main physical and chemical processes between these species, the actual meteorological conditions, emissions, and so on. This is a huge computational task that requires significant storage and CPU time, so parallel computing is essential for the efficient practical use of the model. Several efficient parallel versions of the model have been created over the past years. A suitable parallel version of DEM using the Message Passing Interface (MPI) library was implemented on two powerful supercomputers at EPCC in Edinburgh, available via the HPC-Europa programme for transnational access to research infrastructures in the EC: a Sun Fire E15K and an IBM HPCx cluster. Although the implementation is, in principle, the same for both supercomputers, a few modifications were needed to port the code successfully to the IBM HPCx cluster. Performance analysis and parallel optimization were then carried out, and results from benchmarking experiments are presented in this paper. Another set of experiments was carried out to investigate the sensitivity of the model to variation of some chemical rate constants in the chemical submodel; certain code modifications were necessary for this task. The results obtained will be used for further sensitivity analysis studies using Monte Carlo simulation.
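
As a flavour of how a grid-based model like this is typically parallelised with MPI, the sketch below splits a horizontal grid into strips of rows, one per rank, advances each strip locally, and gathers the field for output. It is a minimal illustration with mpi4py under assumed grid sizes and a placeholder operator, not the DEM code itself.

```python
# A minimal sketch of row-strip domain decomposition with MPI (via mpi4py);
# grid size, time loop and the operator below are illustrative assumptions,
# not the DEM code itself.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NY, NX = 480, 480                                    # assumed horizontal grid
my_rows = np.array_split(np.arange(NY), size)[rank]  # this rank's strip

conc = np.zeros((len(my_rows), NX))                  # local concentration field

def advect_and_react(local):
    # Placeholder for the transport and chemistry operators applied to the
    # local strip; the real model couples many chemical species.
    return local * 0.99

for step in range(10):                               # time-stepping loop
    conc = advect_and_react(conc)

# Gather the strips on rank 0 to assemble the full field for output.
strips = comm.gather(conc, root=0)
if rank == 0:
    full_field = np.vstack(strips)
```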

Relevance:

10.00%

Publisher:

Abstract:

Large scientific applications are usually developed, tested and used by a group of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments, and various tools and software exist to create them. Some currently available software frameworks use these tools to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application under continuous development and experimentation by different institutes in Europe. This paper sets out to design a collaborative distributed computing environment for UNI-DEM in particular, but the proposed framework may fit many other large scientific applications as well.

Relevance:

10.00%

Publisher:

Abstract:

A large and complex IT project may involve multiple organizations and be constrained within a temporal period. An organization is a system comprising people, activities, processes, information, resources and goals. Understanding and modelling such a project and its interrelationship with the relevant organizations are essential for organizational project planning. This paper introduces the problem articulation method (PAM) as a semiotic method for organizational infrastructure modelling. PAM offers a suite of techniques which enables the articulation of business, technical and organizational requirements, delivering an infrastructural framework to support the organization. It works by eliciting and formalizing organizational elements (e.g. processes, activities, relationships, responsibilities, communications, resources, agents, dependencies and constraints) and mapping these abstractions to represent the manifestation of the "actual" organization. Many analysts forgo organizational modelling methods and use localized, ad hoc point solutions, but these are not amenable to modelling organizational infrastructures. A case study of the infrared atmospheric sounding interferometer (IASI) is used to demonstrate the applicability of PAM, and to examine its relevance and significance in dealing with innovation and change in organizations.

Relevance:

10.00%

Publisher:

Abstract:

As integrated software solutions reshape project delivery, they alter the bases for collaboration and competition across firms in complex industries. This paper synthesises and extends literatures on strategy in project-based industries and digitally-integrated work to understand how project-based firms interact with digital infrastructures for project delivery. Four identified strategies are to: 1) develop and use capabilities to shape the integrated software solutions that are used in projects; 2) co-specialize, developing complementary assets to work repeatedly with a particular integrator firm; 3) retain flexibility by developing and maintaining capabilities in multiple digital technologies and processes; and 4) manage interfaces, translating work into project formats for coordination while hiding proprietary data and capabilities in internal systems. The paper articulates the strategic importance of digital infrastructures for delivery as well as product architectures. It concludes by discussing managerial implications of the identified strategies and areas for further research.

Relevance:

10.00%

Publisher:

Abstract:

Bucy-Kalman filtering applies to state-space models comprising noisy linear equations describing the evolution of the state, together with noisy linear observation equations. In the Gaussian case, this filtering consists of recursively computing the posterior probability distribution of the state given the current and past observations. Filtering with approximate densities makes it possible to handle state equations that are nonlinear or driven by non-Gaussian noise. For a random feedback coefficient, a typical case of model switching, the article introduces a family of parameterized bimodal probability distributions which, by adjustment of the parameters, serve to approximate the posterior distributions of the state at the various time instants. The parameters are recomputed recursively at each update and prediction step.
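
For reference, the standard linear-Gaussian recursions that this approximate-density approach generalizes can be stated as follows; this is the textbook form of the Bucy-Kalman filter, not notation taken from the article itself.

```latex
% Linear-Gaussian state-space model and Kalman filter recursions.
\begin{aligned}
&\text{Model:}   && x_k = A\,x_{k-1} + w_k, \; w_k \sim \mathcal{N}(0,Q); \qquad
                    y_k = H\,x_k + v_k, \; v_k \sim \mathcal{N}(0,R) \\
&\text{Predict:} && \hat{x}_{k|k-1} = A\,\hat{x}_{k-1|k-1}, \qquad
                    P_{k|k-1} = A\,P_{k-1|k-1}\,A^{\top} + Q \\
&\text{Update:}  && K_k = P_{k|k-1}\,H^{\top}\bigl(H\,P_{k|k-1}\,H^{\top} + R\bigr)^{-1}, \\
&                && \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\bigl(y_k - H\,\hat{x}_{k|k-1}\bigr), \qquad
                    P_{k|k} = (I - K_k H)\,P_{k|k-1}
\end{aligned}
```

In the model-switching setting of the article, the Gaussian posterior above is replaced by a parameterized bimodal family whose parameters are updated through the same predict/update cycle.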

Relevance:

10.00%

Publisher:

Abstract:

Over the last few years, load growth, increases in intermittent generation, declining technology costs and increasing recognition of the importance of customer behaviour in energy markets have brought about a change in the focus of Demand Response (DR) in Europe. The long-standing programmes involving large industries, through interruptible tariffs and time-of-day pricing, have been increasingly complemented by programmes aimed at commercial and residential customer groups. Developments in DR vary substantially across Europe, reflecting national conditions and triggered by different sets of policies, programmes and implementation schemes. This paper examines experiences within European countries as well as at European Union (EU) level, with the aim of understanding which factors have facilitated or impeded advances in DR. It describes initiatives, studies and policies of various European countries, with in-depth case studies of the UK, Italy and Spain. It is concluded that while business programmes and technical and economic potentials vary across Europe, there are common reasons why coordinated DR policies have been slow to emerge: limited knowledge of DR energy saving capacities, high cost estimates for DR technologies and infrastructures, and policies focused on creating the conditions for liberalising the EU energy markets.

Relevance:

10.00%

Publisher:

Abstract:

Recently, major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high performance computing systems, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive.

To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism. The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed memory systems; this approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection: it describes a parallel algorithm for feature selection from random subsets.

Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
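
As a flavour of the data-parallel pattern behind several of these contributions, the sketch below scores random feature subsets on a pool of worker processes. The function names and the toy scoring rule are illustrative assumptions, not code from any of the workshop papers.

```python
# A minimal sketch of the data-parallel pattern: scoring random feature
# subsets on worker processes. Function names and the toy scoring rule are
# illustrative assumptions, not code from any workshop paper.
import random
from multiprocessing import Pool

def score_subset(args):
    X, y, subset = args
    # Toy score: agreement between the mean of the selected features and
    # the labels; a real scorer would cross-validate a model instead.
    return subset, sum(
        (sum(row[j] for j in subset) / len(subset)) * label
        for row, label in zip(X, y)
    )

def parallel_feature_selection(X, y, n_features, n_subsets=32, k=5, workers=4):
    subsets = [tuple(random.sample(range(n_features), k))
               for _ in range(n_subsets)]
    with Pool(workers) as pool:                    # fan out over processes
        scored = pool.map(score_subset, [(X, y, s) for s in subsets])
    return max(scored, key=lambda t: t[1])         # best-scoring subset
```

Because each subset is scored independently, the pattern parallelises with essentially no communication beyond distributing the data and collecting the scores.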

Relevance:

10.00%

Publisher:

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
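
To make the idea of applying one quality model across many datasets concrete, the sketch below shows one way metadata inheritance could work: a dataset's quality record falls back to collection-level defaults unless explicitly overridden. The field names are illustrative assumptions, not the paper's actual information model.

```python
# A toy sketch of quality-metadata inheritance: a dataset's quality record
# falls back to its collection's defaults unless overridden. Field names
# are illustrative assumptions, not the paper's information model.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class QualityInfo:
    accuracy: Optional[str] = None
    lineage: Optional[str] = None
    validation: Optional[str] = None

@dataclass
class Collection:
    name: str
    quality: QualityInfo

@dataclass
class Dataset:
    name: str
    parent: Collection
    quality: QualityInfo = field(default_factory=QualityInfo)

    def effective_quality(self) -> QualityInfo:
        # Inherit every field the dataset does not set itself.
        p = self.parent.quality
        return QualityInfo(
            accuracy=self.quality.accuracy or p.accuracy,
            lineage=self.quality.lineage or p.lineage,
            validation=self.quality.validation or p.validation,
        )
```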

Relevance:

10.00%

Publisher:

Abstract:

How can organizations use digital infrastructure to realise physical outcomes? The design and construction of London Heathrow Terminal 5 is analysed to build new theoretical understanding of visualization and materialization practices in the transition from digital design to physical realisation. In the project studied, an integrated software solution is introduced as an infrastructure for delivery. The analyses articulate the work done to maintain this digital infrastructure and also to move designs beyond the closed world of the computer to a physical reality. In changing medium, engineers use heterogeneous trials to interrogate and address the limitations of an integrated digital model. The paper explains why such trials, which involve the reconciliation of digital and physical data through parallel and iterative forms of work, provide a robust practice for realizing goals that have physical outcomes. It argues that this practice is temporally different from, and at times in conflict with, building a comprehensive dataset within the digital medium. The paper concludes by discussing the implications for organizations that use digital infrastructures in seeking to accomplish goals in digital and physical media.

Relevance:

10.00%

Publisher:

Abstract:

We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing timeseries and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
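
As an illustration of the interface this provides, the snippet below constructs a WMS 1.3.0 GetMap request of the kind an ncWMS server answers. The host and layer identifier are hypothetical placeholders; the parameters themselves come from the WMS standard.

```python
# An illustrative WMS 1.3.0 GetMap request of the kind ncWMS answers;
# the host and layer identifier are hypothetical placeholders.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean/sea_surface_temperature",  # assumed dataset/variable id
    "STYLES": "",
    "CRS": "EPSG:4326",              # one of many supported reference systems
    "BBOX": "-90,-180,90,180",       # lat/lon axis order under WMS 1.3.0
    "WIDTH": 512,
    "HEIGHT": 256,
    "FORMAT": "image/png",
    "TIME": "2010-06-01T00:00:00Z",  # time axis exposed from the CF metadata
}
url = "https://example.org/ncWMS/wms?" + urlencode(params)
```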