934 results for Pombalino construction system


Relevance: 30.00%

Publisher:

Abstract:

National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less-demanding requirements do not require exceptional computing power and can be met by a modern desk-top system which monitors site-specific ground conditions (such as temperature, pressure, and wind speed and direction) augmented with above-ground information from satellite images to produce `nowcasts'. The emphasis in this thesis has been towards the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809-based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. The ground data produce a time series of measurements at a specific location, representing the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an `area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique so that a cloud feature over the `area of interest' can be classified automatically for nowcasting using the multi-dimensional signals.
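
A minimal sketch of the ARIMA-based nowcasting idea described in this abstract, assuming hourly ground-temperature readings and the Python statsmodels package; the thesis itself used bespoke software on a PC, so the library, data and model order below are illustrative only.

    # Minimal sketch: fit an ARIMA model to a site-specific temperature series
    # and produce a short-term "nowcast". statsmodels stands in for the bespoke
    # software described in the thesis; all values are illustrative.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Hypothetical hourly ground temperatures (degrees C) from the logging sub-system
    temps = np.array([11.2, 11.0, 10.8, 10.9, 11.4, 12.1, 13.0, 13.8,
                      14.5, 15.1, 15.4, 15.2, 14.8, 14.1, 13.3, 12.6])

    # ARIMA(p, d, q): autoregressive order 2, first differencing, moving-average order 1
    model = ARIMA(temps, order=(2, 1, 1))
    fitted = model.fit()

    # Nowcast the next three hours from the past-to-present series
    print(fitted.forecast(steps=3))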

Relevance: 30.00%

Publisher:

Abstract:

The thesis addresses the economic impacts of construction safety in Greece. The research involved the development of a methodology for determining the overall costs of safety, namely the sum of the costs of accidents and the costs of safety management failures (with or without accident), including image cost. Hitherto, very little work has been published on the cost of accidents in practical case studies. Moreover, to the best of the author's knowledge, no research has been published that seeks to determine the costs of prevention in real cases. The methodology developed is new, transparent, and capable of being replicated and adapted to other employment sectors and to other countries. It was applied to three construction projects in Greece to test the safety costing methodology and to offer some preliminary evidence on the business case for safety. The survey work took place between 1999 and 2001 and involved 27 months of costing work on site. The study focuses on the overall costs of safety that apply to the main (principal) contractor. The methodology is supported by 120 discrete cost categories and by systematic criteria for determining which costs are included (counted) in the overall cost of safety. A quality system (in compliance with the ISO 9000 series) was developed to support the work and ensure the accuracy of data gathering. The results of the study offer some support for the business case for safety, and good support for the economics of safety, as they demonstrate the need for cost-effectiveness. Subject to important caveats, those projects that appeared to manage safety more cost-effectively achieved the lowest overall safety cost. Nevertheless, the results are significantly lower than those of other published works for two main reasons: first, costs due to damage with no potential for injury were not included; and second, only costs to the main contractor were considered. The study's results are discussed and compared with other published works.
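
A toy illustration of the costing arithmetic described in this abstract: the overall cost of safety as the sum of accident costs and the costs of safety management failures (including image cost). The category labels and figures below are invented for illustration; the actual methodology defines 120 discrete cost categories and systematic inclusion criteria.

    # Toy sketch of the overall-cost-of-safety arithmetic. All records, labels
    # and figures are illustrative, not taken from the study.
    cost_records = [
        {"category": "accident", "item": "lost work time", "cost": 1200.0},
        {"category": "accident", "item": "plant damage", "cost": 450.0},
        {"category": "management_failure", "item": "reworked scaffolding", "cost": 300.0},
        {"category": "management_failure", "item": "image cost (client complaint)", "cost": 150.0},
    ]

    accident_cost = sum(r["cost"] for r in cost_records if r["category"] == "accident")
    failure_cost = sum(r["cost"] for r in cost_records if r["category"] == "management_failure")
    overall_cost_of_safety = accident_cost + failure_cost
    print(f"Overall cost of safety: {overall_cost_of_safety:.2f} (currency units)")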

Relevance: 30.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organisations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. C.P.M./PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of C.P.M. to eliminate its deficiencies. None of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, together with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of the thesis.
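
A highly simplified sketch of the kind of simulation-based evaluation this abstract describes, assuming activities with stochastic durations and simple cost terms; the activity names, distributions and rates are invented, and the original package defined dedicated VERT/CYCLONE-style nodes rather than plain Python functions.

    # Simplified sketch: Monte Carlo evaluation of a small activity network with
    # stochastic durations plus overhead and operating cost terms. Illustrative only.
    import random

    def simulate_once():
        # Two activities in sequence, one in parallel with the second.
        excavate = random.triangular(3, 6, 4)       # days: low, high, mode
        foundations = random.triangular(5, 10, 7)
        drainage = random.triangular(2, 5, 3)       # runs in parallel with foundations
        duration = excavate + max(foundations, drainage)
        operating_cost = 850.0 * duration           # operating-cost node: daily rate
        overhead_cost = 2000.0                      # overhead-cost node: fixed sum
        return duration, operating_cost + overhead_cost

    runs = [simulate_once() for _ in range(5000)]
    mean_time = sum(d for d, _ in runs) / len(runs)
    mean_cost = sum(c for _, c in runs) / len(runs)
    print(f"mean duration: {mean_time:.1f} days, mean cost: {mean_cost:.0f}")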

Relevance: 30.00%

Publisher:

Abstract:

The control needed in the management of a project was analysed with particular reference to the unique needs of the construction industry within the context of site management. This was explored further by analysing the various problems facing managers within the overall system and determining to what extent the organisation would benefit from an integrated management information system. Integration and management of information within the organisational units and the cycles of events that make up the main sub-system were suggested as the means of achieving this objective. A conceptual model of the flow of information was constructed within the whole process of project management by examining the type of information and documents generated during the production cycle of a project. This model was analysed with respect to site managers' needs and the minimum requirements for an overall integrated system. The most tedious and time-consuming tasks facing the site manager are the determination of weekly production costs, the calculation and preparation of interim certificates, the valuation of variations occurring during the production stage and, finally, the settlement and preparation of supplier and sub-contractor accounts. The areas where microcomputers could be of most help were identified, and a number of packages were designed and implemented for various contractors. The gradual integration of stand-alone packages across the construction industry is a logical route to achieving an integrated management system. The methods of doing this were analysed, together with the resulting advantages and disadvantages.
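
A toy sketch of one routine calculation mentioned in this abstract, the valuation of an interim certificate from measured work done to date; the bill items, retention rate and figures are invented for illustration and do not reproduce any of the packages described.

    # Toy sketch: interim certificate valuation from percentage completion of bill
    # items, less retention and the amount previously certified. Illustrative only.
    bill_items = [
        {"item": "substructure", "contract_value": 40000.0, "percent_complete": 100},
        {"item": "superstructure", "contract_value": 120000.0, "percent_complete": 55},
        {"item": "drainage", "contract_value": 15000.0, "percent_complete": 30},
    ]
    retention_rate = 0.05
    previously_certified = 52000.0

    gross_valuation = sum(b["contract_value"] * b["percent_complete"] / 100 for b in bill_items)
    net_valuation = gross_valuation * (1 - retention_rate)
    amount_due = net_valuation - previously_certified
    print(f"gross {gross_valuation:.2f}, net {net_valuation:.2f}, amount due {amount_due:.2f}")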

Relevance: 30.00%

Publisher:

Abstract:

In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study data base covering drainage construction. The Transport and Road Research Laboratory decided to extend that work, as part of their continuing research programme into the design and construction of buried pipelines, by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the data base. The studies showed good agreement with the existing data, with the exception of the excavation trench shoring and pipelaying data, which was amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation, and is used to determine the required resources and construction method for the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line, command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared with site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
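
A minimal sketch of the stochastic simulation idea described in this abstract: sampling a site-efficiency distribution and applying it to a synthesised standard time to see how varying performance levels stretch the estimate. The distribution parameters and standard time are invented, not taken from the TRRL data base.

    # Minimal sketch: sample site efficiency from an assumed normal distribution and
    # apply it to the standard time of a drainage operation. Illustrative values only.
    import random

    standard_time_hours = 6.5   # synthesised standard time for one drainage operation

    def sampled_duration():
        efficiency = random.gauss(mu=0.85, sigma=0.12)   # assumed performance level
        efficiency = max(0.4, min(efficiency, 1.2))       # clamp to a plausible range
        return standard_time_hours / efficiency

    durations = sorted(sampled_duration() for _ in range(10000))
    print(f"median: {durations[5000]:.1f} h, 90th percentile: {durations[9000]:.1f} h")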

Relevance: 30.00%

Publisher:

Abstract:

Epitopes mediated by T cells lie at the heart of the adaptive immune response and form the essential nucleus of anti-tumour peptide or epitope-based vaccines. Antigenic T cell epitopes are mediated by major histocompatibility complex (MHC) molecules, which present them to T cell receptors. Calculating the affinity between a given MHC molecule and an antigenic peptide using experimental approaches is both difficult and time-consuming, so various computational methods have been developed for this purpose. A server has been developed to allow a structural approach to the problem by generating specific MHC:peptide complex structures and providing configuration files to run molecular modelling simulations upon them. The system allows the automated construction of MHC:peptide structure files and the corresponding configuration files required to execute a molecular dynamics simulation using NAMD, and has been made available through a web-based front end and stand-alone scripts. Previous attempts at structural prediction of MHC:peptide affinity have been limited by the paucity of structures and the computational expense of running large-scale molecular dynamics simulations. The MHCsim server (http://igrid-ext.cryst.bbk.ac.uk/MHCsim) allows the user to rapidly generate any desired MHC:peptide complex and will facilitate molecular modelling simulation of MHC complexes on an unprecedented scale.
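
A minimal sketch of the configuration-file generation step mentioned in this abstract, writing a NAMD input file for a hypothetical MHC:peptide complex; the file names and numeric values are placeholders and are not the defaults used by MHCsim.

    # Minimal sketch: emit a NAMD configuration file for a generated MHC:peptide
    # complex. File names and parameter values are placeholders, not MHCsim output.
    namd_config = """
    structure          mhc_peptide.psf
    coordinates        mhc_peptide.pdb
    paraTypeCharmm     on
    parameters         par_all27_prot_lipid.prm
    exclude            scaled1-4
    1-4scaling         1.0
    temperature        310
    timestep           2.0
    cutoff             12.0
    outputName         mhc_peptide_md
    run                50000
    """

    with open("mhc_peptide_md.conf", "w") as handle:
        handle.write(namd_config.strip() + "\n")
    print("Wrote mhc_peptide_md.conf")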

Relevance: 30.00%

Publisher:

Abstract:

In the field of mental health risk assessment, there is no standardisation between the data used in different systems. As a first step towards the possible interchange of data between assessment tools, an ontology has been constructed for a particular one, GRiST (Galatean Risk Screening Tool). We briefly introduce GRiST and its data structures, then describe the ontology and the benefits that have already been realised from the construction process. For example, the ontology has been used to check the consistency of the various trees used in the model. We then consider potential uses in integration of data from other sources. © 2009 IEEE.
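
A toy sketch of the kind of consistency check mentioned in this abstract: verifying that every concept used in an assessment tree is declared in the ontology and that each node's parent matches the ontology hierarchy. The node names below are invented; GRiST's actual trees and ontology are far richer.

    # Toy sketch: check tree nodes against an ontology's child -> parent relation.
    # All concept names are illustrative, not taken from GRiST.
    ontology_parent = {
        "suicide risk": None,
        "feelings and emotions": "suicide risk",
        "hopelessness": "feelings and emotions",
    }
    tree_edges = [                      # (parent, child) pairs used by one risk tree
        ("suicide risk", "feelings and emotions"),
        ("feelings and emotions", "hopelessness"),
        ("feelings and emotions", "despair"),        # not declared in the ontology
    ]

    for parent, child in tree_edges:
        if child not in ontology_parent:
            print(f"undeclared concept: {child}")
        elif ontology_parent[child] != parent:
            print(f"hierarchy mismatch for '{child}': tree says '{parent}', "
                  f"ontology says '{ontology_parent[child]}'")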

Relevance: 30.00%

Publisher:

Abstract:

The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and of its chairman in particular, over a period of about three years, both following and prior to the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005) to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p.16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p.17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group: the non-professional investor.

Relevance: 30.00%

Publisher:

Abstract:

Isoguanosine-containing dendritic small molecules self-assemble into decameric nucleodendrimers, as observed by 1D NMR spectroscopy, 2D DOSY, and mass spectrometry. In particular, apolar building blocks readily form pentameric structures in acetonitrile, while the presence of alkali metals promotes the formation of stable decameric assemblies, with a preference for cesium ions. Remarkably, co-incubation of guanosine- and isoguanosine-containing nucleodendrons results in the formation of decameric structures in the absence of added salts. Further analysis of the mixture indicated that the guanosine derivatives facilitate the formation but are not involved in the decameric structures, a process reminiscent of molecular crowding. This molecular system provides a powerful canvas for the rapid and modular assembly of polyfunctional dendritic macromolecules. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

Relevance: 30.00%

Publisher:

Abstract:

Optimising the design, creation, operation and maintenance of expert systems is an important problem in artificial intelligence theory and decision-making methods. In this paper an approach to solving it is offered, using a technology based on the methodology of systems analysis, an ontology of the subject domain, and the principles and methods of self-organisation. The realisation of this approach is expounded, based on constructing a correspondence between the hierarchical structure of the ontology and the sequence of questions in automated examination systems.
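
A toy sketch of the correspondence described in this abstract between an ontology's hierarchical structure and a question sequence for an automated examination system: a depth-first walk of the hierarchy emits one question per concept. The domain content is invented for illustration.

    # Toy sketch: derive an examination question sequence from an ontology hierarchy
    # by depth-first traversal. Concepts and wording are illustrative only.
    ontology = {
        "expert system": ["knowledge base", "inference engine"],
        "knowledge base": ["rules", "facts"],
        "inference engine": [],
        "rules": [],
        "facts": [],
    }

    def question_sequence(concept, depth=0):
        yield f"{'  ' * depth}Q: Explain the concept '{concept}'."
        for child in ontology.get(concept, []):
            yield from question_sequence(child, depth + 1)

    for question in question_sequence("expert system"):
        print(question)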

Relevance: 30.00%

Publisher:

Abstract:

A methodology for the computer-aided investigation and provision of safety for complex constructions, and a prototype of the intelligent applied system that implements it, are considered. The methodology is determined by the model of the object under scrutiny, by the structure and functions of the safety investigation, and by a set of research methods. The methods are based on object-oriented database technologies, expert systems and mathematical modelling. The prototype of the intelligent system is component software that supports decision making during safety investigations and the investigation of the causes of failure. Decision making is supported by analogy, through a deterministic search for precedents (cases) with respect to the predicted (at the design stage) and observed (at the operation stage) parameters of damage, destruction and malfunction of a complex hazardous construction.
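
A minimal sketch of the precedent (case) search described in this abstract: retrieving the stored case whose damage parameters lie closest to the observed ones under a weighted distance. The parameter names, weights and case records are invented for illustration.

    # Minimal sketch: nearest-precedent retrieval over stored failure cases using a
    # weighted Euclidean distance on damage parameters. Illustrative data only.
    import math

    precedents = [
        {"case": "crane collapse 1998", "crack_width_mm": 4.0, "deflection_mm": 60.0, "load_ratio": 1.30},
        {"case": "roof failure 2003", "crack_width_mm": 1.5, "deflection_mm": 25.0, "load_ratio": 1.05},
        {"case": "girder fatigue 2007", "crack_width_mm": 2.8, "deflection_mm": 40.0, "load_ratio": 1.15},
    ]
    observed = {"crack_width_mm": 2.5, "deflection_mm": 38.0, "load_ratio": 1.10}
    weights = {"crack_width_mm": 1.0, "deflection_mm": 0.05, "load_ratio": 10.0}

    def distance(case):
        return math.sqrt(sum(weights[k] * (case[k] - observed[k]) ** 2 for k in observed))

    best = min(precedents, key=distance)
    print(f"closest precedent: {best['case']} (distance {distance(best):.2f})")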

Relevance: 30.00%

Publisher:

Abstract:

The Resource Space Model is a data model that can effectively and flexibly manage the digital resources in cyber-physical systems from multidimensional and hierarchical perspectives. This paper focuses on constructing a resource space automatically. We propose a framework that organizes a set of digital resources along different semantic dimensions by combining human background knowledge from WordNet and Wikipedia. The construction process comprises four steps: extracting candidate keywords, building semantic graphs, detecting semantic communities and generating the resource space. An unsupervised statistical topic model (Latent Dirichlet Allocation) is applied to extract candidate keywords for the facets. To better interpret the meanings of the facets found by LDA, we map the keywords to Wikipedia concepts, calculate word relatedness using WordNet's noun synsets and construct the corresponding semantic graphs. Semantic communities are then identified by the Girvan-Newman (GN) algorithm. After extracting candidate axes based on the Wikipedia concept hierarchy, the final axes of the resource space are sorted and selected through three different ranking strategies. The experimental results demonstrate that the proposed framework can organize resources automatically and effectively. © 2013 Published by Elsevier Ltd. All rights reserved.
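
A compressed sketch of two of the four steps listed in this abstract, candidate-keyword extraction with LDA and community detection with the Girvan-Newman algorithm, assuming the gensim and networkx libraries. The toy documents are invented, and the remaining steps (mapping to Wikipedia concepts, WordNet relatedness, axis ranking) are replaced here by simple co-occurrence.

    # Compressed sketch of two pipeline steps: LDA keyword extraction and
    # Girvan-Newman community detection. Documents and graph construction are
    # simplified stand-ins for the paper's Wikipedia/WordNet-based steps.
    import networkx as nx
    from networkx.algorithms.community import girvan_newman
    from gensim import corpora, models

    docs = [
        ["resource", "space", "model", "dimension", "hierarchy"],
        ["topic", "model", "keyword", "dimension", "semantics"],
        ["community", "graph", "semantics", "keyword", "cluster"],
    ]

    # Step 1: extract candidate keywords with LDA.
    dictionary = corpora.Dictionary(docs)
    corpus = [dictionary.doc2bow(doc) for doc in docs]
    lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, random_state=0)
    keywords = {word for topic in range(2) for word, _ in lda.show_topic(topic, topn=4)}

    # Steps 2-3: build a semantic graph over the keywords (co-occurrence stands in
    # for WordNet relatedness) and detect communities with Girvan-Newman.
    graph = nx.Graph()
    for doc in docs:
        present = [w for w in doc if w in keywords]
        for i, u in enumerate(present):
            for v in present[i + 1:]:
                graph.add_edge(u, v)

    communities = next(girvan_newman(graph))
    print([sorted(c) for c in communities])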

Relevance: 30.00%

Publisher:

Abstract:

The capability of automatic graphical user interface construction is described. It is based on building the user interface as a reflection of the logical definition of the data domain. The proposed approach to developing the information system's user interface enables dynamic adaptation of the system during operation. The approach is used for creating information systems based on the CASE system METAS.
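
A toy sketch of the reflection idea this abstract describes: deriving user-interface widgets directly from a logical definition of the data domain, so that the form mirrors the domain model. The entity definition and widget mapping below are invented; METAS itself is a far richer CASE system.

    # Toy sketch: generate a form description by reflecting over a logical data
    # definition. Entity, attributes and widget names are illustrative only.
    entity_definition = {
        "name": "Contract",
        "attributes": [
            {"name": "title", "type": "string", "required": True},
            {"name": "start_date", "type": "date", "required": True},
            {"name": "value", "type": "decimal", "required": False},
            {"name": "status", "type": "enum", "values": ["draft", "active", "closed"]},
        ],
    }

    widget_for_type = {"string": "text box", "date": "date picker",
                       "decimal": "numeric field", "enum": "drop-down list"}

    print(f"Form: {entity_definition['name']}")
    for attr in entity_definition["attributes"]:
        widget = widget_for_type.get(attr["type"], "text box")
        marker = " *" if attr.get("required") else ""
        print(f"  {attr['name']}: {widget}{marker}")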

Relevance: 30.00%

Publisher:

Abstract:

A new, original method and CASE tool for system analysis and modelling are presented. For the first time, they are consistent with the requirements of object-oriented technology for information systems design. They substantially facilitate the construction of models of organisational systems and increase the quality of organisational design and of the basic technological processes of object-oriented application development.

Relevance: 30.00%

Publisher:

Abstract:

This work is devoted to the development of a computer-aided system for the semantic text analysis of a technical specification. Its purpose is to increase the efficiency of software engineering by automating the semantic analysis of specification text. A model for analysing the text of a technical specification is proposed and investigated; an attribute grammar of the technical specification, intended to formalise a limited subset of Russian, is constructed for the purpose of analysing the sentences of the specification text; the stylistic features of the technical specification as a class of documents are considered; and recommendations on preparing the text of a technical specification for automated processing are formulated. The computer-aided system for semantic text analysis of a technical specification is then described. It consists of the following subsystems: preliminary text processing; syntactic and semantic analysis and construction of software models; document storage; and the interface.
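
A toy sketch of the pipeline structure this abstract describes, chaining a preliminary text-processing step with a crude extraction of requirement sentences. The regular expressions and the sample text are invented, and English stands in for the limited Russian handled by the actual attribute grammar.

    # Toy sketch of the pipeline: preliminary text processing followed by a crude
    # extraction of "shall"-style requirement sentences. Illustrative only; the
    # actual system parses limited Russian with an attribute grammar.
    import re

    def preprocess(text):
        text = re.sub(r"\s+", " ", text).strip()
        return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s]

    def extract_requirements(sentences):
        pattern = re.compile(r"\b(shall|must|should)\b", re.IGNORECASE)
        return [s for s in sentences if pattern.search(s)]

    specification = ("The system shall store all documents. Users should be able "
                     "to search by keyword. The interface is web based.")
    for requirement in extract_requirements(preprocess(specification)):
        print("requirement:", requirement)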