962 results for Construction techniques


Relevance: 30.00%

Publisher:

Abstract:

Earthwork planning has been considered in this article, and a generic block partitioning and modelling approach has been devised to provide strategic plans at various levels of detail. Conceptually this approach is more accurate and comprehensive than others, for instance those that are section based. In response to environmental concerns, fuel consumption and emissions were chosen as the metrics for decision making; haulage distance and gradient are also included, as they are important components of these metrics. Advantageously, the fuel consumption metric is generic: it captures the physical difficulty of travelling over inclines of different gradients and is consistent across all hauling vehicles. For validation, the proposed models and techniques have been applied to a real-world road project. The numerical investigations have demonstrated that the models can be solved with relatively little CPU time. The proposed block models also result in solutions of superior quality, i.e. with reduced fuel consumption and cost. Furthermore, the plans differ considerably from those based solely upon a distance-based metric, demonstrating a need for industry to reflect upon its current practices.
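
The abstract does not give the fuel model itself; the sketch below is a purely illustrative gradient-aware haul metric (the coefficients and functional form are assumptions, not the authors' formulation), showing why such a metric can rank hauls differently from a distance-only metric.

```python
# Illustrative gradient-aware haul cost -- not the authors' actual model.
# FLAT_RATE and GRADE_PENALTY are hypothetical placeholder coefficients.

FLAT_RATE = 0.35      # litres of fuel per tonne-km on level ground (assumed)
GRADE_PENALTY = 4.0   # extra litres per tonne-km per unit of gradient (assumed)

def haul_fuel(distance_km, rise_m, tonnes):
    """Fuel for hauling `tonnes` of material over `distance_km` with `rise_m` of climb."""
    gradient = rise_m / (distance_km * 1000.0)           # dimensionless grade
    rate = FLAT_RATE + GRADE_PENALTY * max(gradient, 0)  # downhill treated as level here
    return rate * tonnes * distance_km

# A distance-only metric would rank these two hauls equally;
# the fuel metric penalises the steeper route.
print(haul_fuel(1.2, 0.0, 500))   # flat haul
print(haul_fuel(1.2, 60.0, 500))  # same distance, 5% climb
```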

Relevance: 30.00%

Publisher:

Abstract:

This paper describes the development and testing of a robotic system for charging blast holes in underground mining. The automation system supports four main tactical functions: detection of blast holes; teleoperated arm pose control; automatic arm pose control; and human-in-the-loop visual servoing. We present the system architecture and analyse the major components. Hole detection is crucial for automating the process, and we discuss its theoretical and practical aspects in detail. The sensors used are laser range finders and cameras installed in the end effector. For automatic insertion, we consider image processing techniques to support visual servoing of the tool to the hole. We also discuss issues surrounding the control of heavy-duty mining manipulators, in particular friction, stiction and actuator saturation.
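
As a loose illustration of the visual-servoing idea mentioned above (not the controller actually used on the mining manipulator), a proportional image-based servoing step might look like the following; the gain and pixel-to-metre scaling are assumed values.

```python
# Minimal image-based visual servoing step (proportional control) -- a sketch only,
# not the controller used on the manipulator described above.
import numpy as np

LAMBDA = 0.5  # proportional gain (assumed)

def servo_step(hole_px, image_centre_px, metres_per_pixel):
    """Return an end-effector correction that moves the detected hole centre
    towards the image centre."""
    error_px = np.asarray(hole_px, float) - np.asarray(image_centre_px, float)
    # Simple proportional law: v = -lambda * error (converted to metres)
    return -LAMBDA * error_px * metres_per_pixel

v = servo_step(hole_px=(410, 225), image_centre_px=(320, 240), metres_per_pixel=0.002)
print(v)  # approximately [-0.09, 0.015]: correction in the image plane
```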

Relevance: 30.00%

Publisher:

Abstract:

Grading is basic to the work of Landscape Architects concerned with design on the land. Gradients conducive to easy use, rainwater drained away, and land slope contributing to functional and aesthetic use are all essential to the amenity and pleasure of external environments. This workbook has been prepared specifically to support the program of landscape construction for students in Landscape Architecture. It is concerned primarily with the technical design of grading rather than with its aesthetic design. It must be stressed that the two aspects are rarely separate; what is designed should be technically correct and aesthetically pleasing - it needs to look good as well as to function effectively. This revised edition contains amended and new content which has evolved out of student classes and discussion with colleagues. I am pleased to have on record that every delivery of this workbook material has resulted in my own better understanding of grading and the techniques for its calculation and communication.
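
For readers unfamiliar with the technical side of grading, a minimal example of the kind of gradient arithmetic such a workbook covers (the spot levels and distance are invented) is shown below.

```python
# Basic grading arithmetic of the kind covered in the workbook (illustrative only).

def gradient(level_a, level_b, horizontal_distance):
    """Gradient between two spot levels, returned as (percent, ratio '1 in n')."""
    fall = abs(level_a - level_b)
    grade = fall / horizontal_distance
    return grade * 100.0, horizontal_distance / fall

percent, one_in_n = gradient(level_a=46.50, level_b=45.30, horizontal_distance=24.0)
print(f"{percent:.1f}%  (1 in {one_in_n:.0f})")  # 5.0%  (1 in 20)
```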

Relevance: 30.00%

Publisher:

Abstract:

Detailed knowledge of the past history of an active volcano is crucial for the prediction of the timing, frequency and style of future eruptions, and for the identification of potentially at-risk areas. Subaerial volcanic stratigraphies are often incomplete, due to a lack of exposure, or burial and erosion from subsequent eruptions. However, many volcanic eruptions produce widely-dispersed explosive products that are frequently deposited as tephra layers in the sea. Cores of marine sediment therefore have the potential to provide more complete volcanic stratigraphies, at least for explosive eruptions. Nevertheless, problems such as bioturbation and dispersal by currents affect the preservation and subsequent detection of marine tephra deposits. Consequently, cryptotephras, in which tephra grains are not sufficiently concentrated to form layers that are visible to the naked eye, may be the only record of many explosive eruptions. Additionally, thin, reworked deposits of volcanic clasts transported by floods and landslides, or during pyroclastic density currents may be incorrectly interpreted as tephra fallout layers, leading to the construction of inaccurate records of volcanism. This work uses samples from the volcanic island of Montserrat as a case study to test different techniques for generating volcanic eruption records from marine sediment cores, with a particular relevance to cores sampled in relatively proximal settings (i.e. tens of kilometres from the volcanic source) where volcaniclastic material may form a pervasive component of the sedimentary sequence. Visible volcaniclastic deposits identified by sedimentological logging were used to test the effectiveness of potential alternative volcaniclastic-deposit detection techniques, including point counting of grain types (component analysis), glass or mineral chemistry, colour spectrophotometry, grain size measurements, XRF core scanning, magnetic susceptibility and X-radiography. This study demonstrates that a set of time-efficient, non-destructive and high-spatial-resolution analyses (e.g. XRF core-scanning and magnetic susceptibility) can be used effectively to detect potential cryptotephra horizons in marine sediment cores. Once these horizons have been sampled, microscope image analysis of volcaniclastic grains can be used successfully to discriminate between tephra fallout deposits and other volcaniclastic deposits, by using specific criteria related to clast morphology and sorting. Standard practice should be employed when analysing marine sediment cores to accurately identify both visible tephra and cryptotephra deposits, and to distinguish fallout deposits from other volcaniclastic deposits.
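
As a hedged illustration of how a high-resolution core-scanning signal such as magnetic susceptibility can flag candidate cryptotephra horizons (the detection rule, window and threshold below are assumptions for the sketch, not the study's method), consider:

```python
# Flag candidate cryptotephra horizons as anomalies in a down-core magnetic
# susceptibility log. Simplified sketch; thresholds and data are hypothetical.
import numpy as np

def candidate_horizons(depth_cm, mag_sus, window=21, n_sigma=3.0):
    """Return depths where susceptibility exceeds the local background by n_sigma."""
    ms = np.asarray(mag_sus, float)
    flagged = []
    for i in range(len(ms)):
        lo, hi = max(0, i - window // 2), min(len(ms), i + window // 2 + 1)
        background = np.delete(ms[lo:hi], i - lo)   # neighbours, excluding the point itself
        if ms[i] > background.mean() + n_sigma * background.std():
            flagged.append(depth_cm[i])
    return flagged

depths = list(range(0, 100))                 # 1 cm resolution, synthetic core
signal = np.random.default_rng(0).normal(10, 0.5, 100)
signal[42] += 8                              # buried tephra-rich horizon
print(candidate_horizons(depths, signal))    # flags the spiked depth at 42 cm
```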

Relevance: 30.00%

Publisher:

Abstract:

This book reports on an empirically based study of the manner in which the Magistrates' Courts in Victoria construct occupational health and safety (OHS) issues when hearing prosecutions for offences under the Victorian OHS legislation. Prosecution has always been a controversial element in the enforcement armoury of OHS regulators, but at the same time it has long been argued that the low level of fines imposed by courts has had an important chilling effect on the OHS inspectorate's enforcement approaches, and on the impact of OHS legislation. Using a range of empirical research methods, including three samples of OHS prosecutions carried out in the Victorian Magistrates' Courts, Professor Johnstone shows how courts, inspectors, prosecutors and defence counsel are involved in filtering or reshaping OHS issues during the prosecution process, both pre-trial and in court. He argues that OHS offences are constructed by focusing on "events", in most cases incidents resulting in injury or death. This event-focus ensures that the attention of the parties is drawn to the details of the incident and away from its broader context. During the court-based sentencing process, defence counsel is able to adopt a range of techniques which isolate the incident from its micro and macro contexts, thereby individualising and decontextualising it.

Relevance: 30.00%

Publisher:

Abstract:

We present a systematic, practical approach to developing risk prediction systems, suitable for use with large databases of medical information. An important part of this approach is a novel feature selection algorithm which uses the area under the receiver operating characteristic (ROC) curve to measure the expected discriminative power of different sets of predictor variables. We describe this algorithm and use it to select variables to predict risk of a specific adverse pregnancy outcome: failure to progress in labour. Neural network, logistic regression and hierarchical Bayesian risk prediction models are constructed, all of which achieve close to the limit of performance attainable on this prediction task. We show that better prediction performance requires more discriminative clinical information rather than improved modelling techniques. It is also shown that better diagnostic criteria in clinical records would greatly assist the development of systems to predict risk in pregnancy.
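
The abstract does not spell out the algorithm's details; the sketch below is a generic greedy forward selection scored by cross-validated ROC AUC, in the same spirit, using scikit-learn (an assumed dependency) with a logistic-regression scorer.

```python
# Greedy forward feature selection scored by ROC AUC -- a sketch in the spirit of
# the approach described, not the authors' algorithm. Assumes scikit-learn.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def forward_select(X, y, feature_names, k):
    """Pick k features, greedily adding whichever most improves mean CV AUC."""
    chosen, remaining = [], list(range(X.shape[1]))
    while len(chosen) < k:
        scores = {
            j: cross_val_score(LogisticRegression(max_iter=1000),
                               X[:, chosen + [j]], y, cv=5,
                               scoring="roc_auc").mean()
            for j in remaining
        }
        best = max(scores, key=scores.get)
        chosen.append(best)
        remaining.remove(best)
        print(f"added {feature_names[best]}  (AUC = {scores[best]:.3f})")
    return chosen
```

Usage would be along the lines of forward_select(X, y, names, k=5), with X an (n_samples, n_features) NumPy array and y a binary outcome such as failure to progress in labour.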

Relevance: 30.00%

Publisher:

Abstract:

Project management approaches, especially in construction, are continually affected by change and respond by adopting dynamic solution paths. This paper defines what is argued to be a better relational model for the project management triple constraints (time, cost and scope), intended to increase the success factors of any complex program or project. The research is qualitative and adopts a new avenue of investigation: project activities are treated as social phenomena, and the model is supported by field observations rather than mathematical methods, with solutions emerging from the successful practices of humans and of ant colonies. The results present the relationship between the triple constraints as a multi-agent system with specified communication channels based on agent locations. Information is transferred between agents, and action is taken according to the constraint agents' locations in the project structure, allowing immediate changes in order to overcome over-budget, behind-schedule and additional-scope impacts. The result is a complex adaptive system with self-organisation and cybernetic control, and the model can be used to improve existing project management methodologies.
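
As a toy illustration of treating time, cost and scope as communicating constraint agents (the class structure, tolerance threshold and values are hypothetical, not the paper's model), one might sketch:

```python
# Toy sketch of time/cost/scope as communicating constraint agents.
# Structure and thresholds are hypothetical, not the model proposed in the paper.

class ConstraintAgent:
    def __init__(self, name, baseline):
        self.name, self.baseline, self.current = name, baseline, baseline
        self.peers = []

    def update(self, new_value):
        self.current = new_value
        drift = (self.current - self.baseline) / self.baseline
        if abs(drift) > 0.10:                       # tolerance threshold (assumed)
            for peer in self.peers:
                peer.receive(self.name, drift)

    def receive(self, sender, drift):
        print(f"{self.name}: {sender} drifted {drift:+.0%}, re-planning my allocation")

time_a, cost_a, scope_a = (ConstraintAgent(n, b) for n, b in
                           [("time", 120), ("cost", 1_000_000), ("scope", 50)])
for agent in (time_a, cost_a, scope_a):
    agent.peers = [p for p in (time_a, cost_a, scope_a) if p is not agent]

cost_a.update(1_200_000)   # 20% over budget -> time and scope agents are notified
```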

Relevance: 30.00%

Publisher:

Abstract:

Peggy Shaw’s RUFF (USA 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of Chroma Key to reveal the tensions in their production and add layers to their performances. In doing so they offer invaluable insight where the filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of a non-composited chroma-key technique as a theatrical device and the work’s taciturn revelation of the production process during performance play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. Building on RUFF, and other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends a convergence of mobile technologies, models, and green screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and challenge to the language of theatre. It becomes, or rather acts as, a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik!, it is also a place where, as a mode of production and subsequent reveal, it adds weight to the performance. These works are informed by Auslander (1999) and Giesenkam (2007) and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combinatory use in a live performance environment.
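
For readers unfamiliar with the underlying technique, a minimal chroma-key composite (a rough sketch only; production keying involves spill suppression and edge treatment well beyond this) can be expressed as:

```python
# Minimal chroma-key composite: replace green-dominant pixels of the foreground
# frame with the background plate. Illustrative only.
import numpy as np

def chroma_key(foreground, background, threshold=40):
    """foreground, background: HxWx3 uint8 RGB arrays of the same shape."""
    fg = foreground.astype(int)
    greenness = fg[:, :, 1] - np.maximum(fg[:, :, 0], fg[:, :, 2])
    mask = greenness > threshold              # True where the green screen shows through
    out = foreground.copy()
    out[mask] = background[mask]
    return out
```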

Relevance: 30.00%

Publisher:

Abstract:

3D printing (3Dp) has long been used in the manufacturing sector as a way to automate and accelerate production and reduce waste materials. It is able to build a wide variety of objects if the necessary specifications are provided to the printer and no problems are presented by the limited range of materials available. With 3Dp becoming cheaper, more reliable and, as a result, more prevalent in the world at large, it may soon make inroads into the construction industry. Little is known, however, of 3Dp's current use in the construction industry and its potential for the future, and this paper seeks to rectify this situation by providing a review of the relevant literature. In doing this, the three main 3Dp methods of contour crafting, concrete printing and D-shape printing are described, which, as opposed to the traditional construction method of cutting materials down to size, deliver only what is needed for completion, vastly reducing waste. Also identified is 3Dp's potential to enable buildings to be constructed many times faster and with significantly reduced labour costs. In addition, it is clear that construction 3Dp can allow the further inclusion of Building Information Modelling into the construction process - streamlining and improving the scheduling requirements of a project. However, current 3Dp processes are known to be costly, unsuited to large-scale products and conventional design approaches, and have a very limited range of materials that can be used. Moreover, the only successful examples of construction in action to date have occurred in controlled laboratory environments and, as real-world trials have yet to be completed, it remains to be seen whether 3Dp can be equally proficient in practical situations. Key Words: 3D Printing; Contour Crafting; Concrete Printing; D-shape; Building Automation.

Relevance: 30.00%

Publisher:

Abstract:

With increasing competitiveness in global markets, many developing nations are striving to constantly improve their services in search of the next competitive edge. As a result, the demand and need for Business Process Management (BPM) in these regions is rising rapidly. Yet there is a lack of professional expertise and knowledge to cater to that need. Therefore, the development of well-structured BPM training/education programs has become an urgent requirement for these industries. Furthermore, the lack of textbooks or other self-education material that goes beyond the basics of BPM reinforces the need for case-based teaching and related cases that enable the next generation of professionals in these countries. Teaching cases create an authentic learning environment where the complexities and challenges of the 'real world' can be presented in a narrative, enabling students to develop crucial skills such as problem analysis, problem solving and creativity within constraints, as well as the application of appropriate tools (BPMN) and techniques (including best practices and benchmarking) within richer, real scenarios. The aim of this paper is to provide a comprehensive teaching case demonstrating the means to tackle any developing nation's legacy government process undermined by inefficiency and ineffectiveness. The paper also includes thorough teaching notes. The article is presented in three main parts: (i) an introduction that provides a brief background and sets the context of the paper, (ii) the teaching case, and (iii) the teaching notes.

Relevance: 30.00%

Publisher:

Abstract:

This thesis studies human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new, large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree-based method for automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors characteristic of each laboratory in this large, cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and another purpose-built sample ontology. A preface and motivation to the construction and analysis of the global map is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression at a global level.
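
A minimal sketch of the kind of exploratory analysis described, PCA followed by hierarchical clustering of samples, using scikit-learn and SciPy on synthetic data that stands in for the integrated GEO/ArrayExpress matrix, might look like this:

```python
# Sketch of the exploratory analysis described: PCA followed by hierarchical
# clustering of samples. Toy random data stands in for the integrated
# GEO/ArrayExpress expression matrix.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
expression = rng.normal(size=(60, 5000))      # 60 samples x 5000 genes (synthetic)

components = PCA(n_components=10).fit_transform(expression)
tree = linkage(components, method="ward")     # hierarchical clustering on PC scores
clusters = fcluster(tree, t=4, criterion="maxclust")
print(clusters[:10])                          # cluster label per sample
```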

Relevance: 30.00%

Publisher:

Abstract:

A central tenet underlying studies on management fashions is that the diffusion of novel forms, models and techniques is driven by an institutional norm of progress, which is the societal expectation that managers will continuously use 'new and improved' management practices. We add to the literature on management fashions by arguing that, if the display of progressiveness in the manner of managing and organizing is expected of organizations, firms that are visibly progressive would be evaluated more positively by organizational audiences following this institutional prescription. Using article counts of co-occurrences of firms and various fashionable management practices in Wall Street Journal, we hypothesize positive effects of such associations on security analysts' evaluations of these firms. Results support this hypothesis. Our study enriches the management fashion literature by highlighting the consequential relevance of organizational adherence to the norm of progress.
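
As a toy illustration of the co-occurrence measure described (the article texts and term lists below are invented, not the Wall Street Journal data), the counting itself is straightforward:

```python
# Toy illustration of the co-occurrence measure: count articles mentioning both
# a given firm and a given management practice. Data and term lists are made up.
articles = [
    "Acme Corp rolls out Six Sigma across plants",
    "Acme Corp quarterly earnings beat estimates",
    "Globex adopts business process reengineering",
]
firms = ["Acme Corp", "Globex"]
practices = ["Six Sigma", "business process reengineering", "TQM"]

counts = {
    (firm, practice): sum(firm in a and practice in a for a in articles)
    for firm in firms for practice in practices
}
print(counts[("Acme Corp", "Six Sigma")])   # 1
```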

Relevance: 30.00%

Publisher:

Abstract:

The delivery of products and services for construction-based businesses is increasingly becoming knowledge-driven and information-intensive. The proliferation of building information modelling (BIM) has increased business opportunities as well as introduced new challenges for the architectural, engineering and construction and facilities management (AEC/FM) industry. As such, the effective use, sharing and exchange of building life cycle information and knowledge management in building design, construction, maintenance and operation assumes a position of paramount importance. This paper identifies a subset of construction management (CM) relevant knowledge for different design conditions of building components through a critical, comprehensive review of synthesized literature and other information gathering and knowledge acquisition techniques. It then explores how such domain knowledge can be formalized as ontologies and, subsequently, a query vocabulary in order to equip BIM users with the capacity to query digital models of a building for the retrieval of useful and relevant domain-specific information. The formalized construction knowledge is validated through interviews with domain experts in relation to four case study projects. Additionally, retrospective analyses of several design conditions are used to demonstrate the soundness (realism), completeness, and appeal of the knowledge base and query-based reasoning approach in relation to the state-of-the-art tools, Solibri Model Checker and Navisworks. The knowledge engineering process and the methods applied in this research for information representation and retrieval could provide useful mechanisms to leverage BIM in support of a number of knowledge intensive CM/FM tasks and functions.
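
The ontology and query vocabulary themselves are not reproduced in the abstract; the following is a deliberately simplified sketch of the general idea of querying building-element records against a CM-relevant rule, with entirely hypothetical element data and rule (not the authors' formalism, nor Solibri Model Checker or Navisworks):

```python
# Simplified sketch of querying a building model for CM-relevant conditions.
# The element records and the rule are hypothetical stand-ins for the paper's
# formal ontology and query vocabulary.
elements = [
    {"id": "W-101", "type": "Wall", "material": "cast-in-place concrete", "height_m": 4.2},
    {"id": "W-102", "type": "Wall", "material": "precast concrete",       "height_m": 3.0},
    {"id": "S-201", "type": "Slab", "material": "cast-in-place concrete", "height_m": 0.3},
]

def query(model, **criteria):
    """Return elements whose attributes satisfy every (attribute, predicate) pair."""
    return [e for e in model
            if all(pred(e.get(attr)) for attr, pred in criteria.items())]

# e.g. "cast-in-place walls taller than 4 m may need special formwork"
tall_cip_walls = query(elements,
                       type=lambda t: t == "Wall",
                       material=lambda m: "cast-in-place" in m,
                       height_m=lambda h: h > 4.0)
print([e["id"] for e in tall_cip_walls])   # ['W-101']
```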

Relevance: 30.00%

Publisher:

Abstract:

Three strategically important uses of IT in the construction industry are the storage and management of project documents on webservers (EDM), the electronic handling of orders and invoices between companies (EDI) and the use of 3-D models including non-geometrical attributes for integrated design and construction (BIM). In a broad longitudinal survey study of IT use in the Swedish construction industry, the extent of use of these techniques was measured in 1998, 2000 and 2007. The results showed that EDM and EDI are already well-established techniques, whereas BIM, although it promises the biggest potential benefits to the industry, only seems to be at the beginning of adoption. In a follow-up to the quantitative studies, the factors affecting the decisions to implement EDM, EDI and BIM, as well as the actual adoption processes, were studied using semi-structured interviews with practitioners. The theoretical basis for the interview studies was informed by frameworks from IT-adoption theory, with the UTAUT model in particular providing the main basis for the analyses presented here. The results showed that the decisions to take the above technologies into use are made on three different levels: the individual level, the organizational level in the form of a company, and the organizational level in the form of a project. The different patterns in adoption can in part be explained by where the decisions are mainly taken. EDM is driven from the organisation/project level, EDI mainly from the organisation/company level, and BIM is driven by individuals pioneering the technique.

Relevance: 30.00%

Publisher:

Abstract:

Processor architects face the challenging task of evaluating a large design space consisting of several interacting parameters and optimizations. In order to assist architects in making crucial design decisions, we build linear regression models that relate processor performance to micro-architectural parameters, using simulation-based experiments. We obtain good approximate models using an iterative process in which Akaike's information criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iterative process is repeated until desired error bounds are achieved. We used this procedure to establish the relationship of the CPI performance response to 26 key micro-architectural parameters using a detailed cycle-by-cycle superscalar processor simulator. The resulting models provide a significance ordering on all micro-architectural parameters and their interactions, and explain the performance variations of micro-architectural techniques.
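
A minimal sketch of AIC-guided linear modelling of a performance response against design parameters (toy data; this is not the authors' simulator, parameter set or D-optimal design workflow) might be:

```python
# Sketch of AIC-guided linear modelling of a performance response (e.g. CPI)
# against design parameters. Toy data only.
import numpy as np

def fit_aic(X, y):
    """Least-squares fit plus Akaike's information criterion (Gaussian errors)."""
    X1 = np.column_stack([np.ones(len(y)), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = np.sum((y - X1 @ beta) ** 2)
    n, k = len(y), X1.shape[1]
    return beta, n * np.log(rss / n) + 2 * k            # AIC up to a constant

rng = np.random.default_rng(2)
params = rng.uniform(size=(40, 4))                       # e.g. cache size, issue width, ...
cpi = 1.5 - 0.8 * params[:, 0] + 0.3 * params[:, 2] + rng.normal(0, 0.05, 40)

# Compare a model using only the first two parameters with the full model.
_, aic_small = fit_aic(params[:, :2], cpi)
_, aic_full = fit_aic(params, cpi)
print(aic_small, aic_full)   # the full model, which captures parameter 2, scores lower (better)
```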