934 results for Pombalino construction system


Relevance:

30.00%

Publisher:

Abstract:

Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete in drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage, however, still needs to be considered in order to locate the boundary between the concrete and the soil and thereby obtain the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system: such a methodology can be performed with the same equipment, following the CSL tests used to check the integrity of the inside concrete, and thus allows the diameter of the drilled shaft to be determined without setting up another NDT device.

The proposed method is based on the installation of galvanized tubes outside the shaft, across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work, a model is developed using signal processing to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties. The experimental results show a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new NDT-based method of measuring the diameter of drilled shafts during construction overcomes the limitations of currently used methods.

In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects and is called Frequency Tomography Analysis (FTA). Time-domain data for the signals propagated between tubes are transformed into the frequency domain using the Fast Fourier Transform (FFT), and the FTA distribution is then evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and identifies the exact depth of any void or defect along the length of the drilled shaft, for voids inside the cage.

The last part of the study evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of the loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
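
The correlation the abstract reports is between concrete thickness and the maximum amplitude of the received signal in the frequency domain. As a hedged illustration only (not the study's code), the sketch below shows how such a peak spectral amplitude could be extracted from a received CSL pulse with an FFT; the synthetic pulse and sampling rate are assumptions.

# Illustrative sketch (not the study's code): peak frequency-domain amplitude
# of a received CSL signal, the quantity correlated with concrete thickness.
import numpy as np

def max_spectral_amplitude(signal, sampling_rate_hz):
    """Return (peak amplitude, frequency at the peak) of the one-sided spectrum."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))  # windowed FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sampling_rate_hz)
    peak = spectrum.argmax()
    return spectrum[peak], freqs[peak]

# Hypothetical received pulse: 1 ms record sampled at 1 MHz, 50 kHz tone burst
t = np.arange(0, 1e-3, 1e-6)
pulse = np.exp(-((t - 3e-4) ** 2) / 2e-9) * np.sin(2 * np.pi * 50e3 * t)
print(max_spectral_amplitude(pulse, 1e6))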

Relevance:

30.00%

Publisher:

Abstract:

Construction organizations typically deal with large volumes of project data containing valuable information, yet they do not use these data effectively for planning and decision-making, for two reasons. First, the information systems in construction organizations are designed to support day-to-day construction operations. The data stored in these systems are often non-validated and non-integrated, and are available in a format that makes it difficult for decision-makers to use them to make timely decisions. Second, the organizational structure and the IT infrastructure are often not compatible with the information systems, resulting in higher operational costs and lower productivity. These two issues are investigated in this research with the objective of developing systems that are structured for effective decision-making. A framework was developed to guide the storage and retrieval of validated and integrated data for timely decision-making, and to enable construction organizations to redesign their organizational structure and IT infrastructure to match their information system capabilities. The research focused on construction owner organizations that are continuously involved in multiple construction projects. Action research and data warehousing techniques were used to develop the framework. One hundred and sixty-three construction owner organizations were surveyed to assess their data needs, data management practices and extent of use of information systems in planning and decision-making. For in-depth analysis, Miami-Dade Transit (MDT), which is in charge of all transportation-related construction projects in Miami-Dade County, was selected. A functional model and a prototype system were developed to test the framework. The results revealed significant improvements in data management and decision-support operations, examined through various qualitative (ease of data access, data quality, response time, productivity improvement, etc.) and quantitative (time savings and operational cost savings) measures. The research results were first validated by MDT and then by a representative group of twenty construction owner organizations involved in various types of construction projects.

Relevance:

30.00%

Publisher:

Abstract:

A man-machine system known as a teleoperator system has been developed to work in hazardous environments such as nuclear reactor plants. Force reflection is a type of force feedback in which the forces experienced by the remote manipulator are fed back to the manual controller. In a force-reflecting teleoperation system, the operator uses the manual controller to direct the remote manipulator and receives visual information from a video image and/or graphical animation on the computer screen. This thesis presents the design of a portable Force-Reflecting Manual Controller (FRMC) for the teleoperation of tasks such as hazardous material handling, waste cleanup, and space-related operations. The work consists of the design and construction of a prototype 1-Degree-of-Freedom (DOF) FRMC, the development of the Graphical User Interface (GUI), and system integration. Two control strategies, PID and fuzzy logic, are developed and experimentally tested, and the system response of each is analyzed and evaluated. In addition, the concept of a telesensation system is introduced, and a variety of design alternatives for a 3-DOF FRMC are proposed for future development.
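
As a purely illustrative sketch (not the thesis implementation), the following shows a discrete PID loop of the kind that could drive a 1-DOF force-reflecting controller; the gains, sampling period and hardware interface are hypothetical.

# Illustrative sketch (not the thesis code): a discrete PID loop of the kind
# used to drive a 1-DOF force-reflecting manual controller. Gains and the
# hardware read/write calls are hypothetical placeholders.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical usage: reflect the force sensed at the remote manipulator
# back to the operator's handle at a 1 kHz control rate.
controller = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.001)
# command = controller.update(remote_force, handle_force)  # once per cycle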

Relevance:

30.00%

Publisher:

Abstract:

Implicit in current design practice for minimum uplift capacity is the assumption that a connection's capacity is proportional to the number of fasteners per connection joint. This assumption may overestimate the capacity of joints by a factor of two or more and may be the cause of connection failures in extreme wind events. The current research serves to modify this practice by proposing a realistic relationship between the number of fasteners and the capacity of the joint. The research is also aimed at the further development of a non-intrusive continuous load path (CLP) connection system using Glass Fiber Reinforced Polymer (GFRP) and epoxy. Suitable designs were developed for stud-to-top-plate and gable end connections, and tests were performed to evaluate the ultimate load, creep and fatigue behavior. The objective was to determine the performance of the connections under simulated sustained hurricane conditions. The performance of the new connections was satisfactory.

Relevance:

30.00%

Publisher:

Abstract:

We now live in an era of tight credit caused by the global financial crisis and, as in the past, it falls to the various sectors and segments of society to find ways to reinvent themselves. In this context, Lean Construction presents itself as a strong production-management alternative for companies in the construction segment. It derives from lean thinking, which originated in post-war Japan and spread around the world in a time of extreme scarcity brought on by the oil crisis. In practice, Lean Construction is a philosophy that seeks to improve production management, maximizing the value of the flow from the customer's perspective through the elimination of waste. It thrives in environments and cultures that regard the scarcity of resources as something natural, and it applies both in macroeconomic crises and in times of prosperity. Production Planning and Control (PCP) is a fundamental building block for companies seeking to protect themselves from economic fluctuations and to survive and succeed in a competitive market. Motivated by the lack of discussion of the topic in the local academy, and by the finding that 93.33% of construction companies in the state do not use methodological tools for PCP, this dissertation studies and proposes the implementation of lean construction in the project-planning methodology applied on construction sites. The production management system of a construction company was characterized, and the main causes of ineffectiveness, and the consequent low performance of one of its developments, were identified. The PCP was then implemented using tools that serve the principles of lean construction, and was monitored through indicators that gave managers a managerial view of the control of actions and of the production of protective mechanisms. All guidelines for implementing and applying this management model were presented in a simplified, practical and efficient way, in order to break the resistance of old paradigms in the industry to new practices.

Relevance:

30.00%

Publisher:

Abstract:

Scatter in medical imaging is typically cast off as image-related noise that detracts from meaningful diagnosis, and is therefore usually rejected or removed from medical images. However, it has been found that every material, including cancerous tissue, has a unique X-ray coherent scatter signature that can be used to identify the material or tissue. Such scatter-based tissue identification offers an advantage over conventional anatomical imaging by X-ray radiography: the ability to locate and identify particular materials. A coded aperture X-ray coherent scatter spectral imaging system has been developed in our group to classify different tissue types based on their unique scatter signatures. Previous experiments using our prototype have demonstrated that the depth-resolved coherent scatter spectral imaging system (CACSSI) can discriminate healthy and cancerous tissue present in the path of a non-destructive X-ray beam. A key to the successful optimization of CACSSI as a clinical imaging method is to obtain anatomically accurate phantoms of the human body. This thesis describes the development and fabrication of 3D-printed anatomical scatter phantoms of the breast and lung.

The purpose of this work is to accurately model different breast geometries using a tissue-equivalent phantom, and to classify these tissues in a coherent X-ray scatter imaging system. Tissue-equivalent anatomical phantoms were designed to assess the capability of the CACSSI system to classify different types of breast tissue (adipose, fibroglandular, malignant). These phantoms were 3D printed based on DICOM data obtained from CT scans of prone breasts, and were tested through comparison of measured scatter signatures with those of adipose and fibroglandular tissue from the literature. Tumors in the phantom were modeled using a variety of biological tissues, including actual surgically excised benign and malignant tissue specimens. Lung-based phantoms have also been printed for future testing. Our imaging system has been able to determine the location and composition of the various materials in the phantom, and the phantoms were used to characterize the CACSSI system in terms of beam width and imaging technique. The results of this work showed accurate modeling and characterization of the phantoms through comparison of the tissue-equivalent form factors with those from the literature. The physical construction of the phantoms, based on actual patient anatomy, was validated using mammography and computed tomography, visually comparing the phantom images with clinical images of actual patient anatomy.
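
As a hedged illustration of the classification idea (not the CACSSI analysis code), the sketch below scores a measured coherent-scatter form factor against reference adipose and fibroglandular curves; the reference data, momentum-transfer grid and similarity metric are placeholder assumptions.

# Illustrative sketch (not the CACSSI analysis code): classify a measured
# coherent-scatter form factor by its correlation with literature reference
# curves for breast tissues. Reference curves here are placeholders.
import numpy as np

def classify_form_factor(measured, references):
    """Return the reference tissue whose form factor best matches `measured`.

    `references` maps a tissue name to its form factor sampled on the same
    momentum-transfer grid as the measurement.
    """
    scores = {tissue: np.corrcoef(measured, reference)[0, 1]
              for tissue, reference in references.items()}  # Pearson correlation
    return max(scores, key=scores.get), scores

# Hypothetical usage with placeholder curves:
q = np.linspace(0.5, 3.0, 50)              # momentum transfer grid
refs = {"adipose": np.exp(-(q - 1.1) ** 2 / 0.05),
        "fibroglandular": np.exp(-(q - 1.6) ** 2 / 0.08)}
measured = refs["fibroglandular"] + 0.05 * np.random.rand(q.size)
print(classify_form_factor(measured, refs)[0])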

Relevance:

30.00%

Publisher:

Abstract:

Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was innovatively used to tackle the problem of lacking sufficient information through enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk impact on the various project objectives, and the Evidential Reasoning algorithm is suggested as a novel alternative for aggregating individual assessments. A spreadsheet-based decision support system (DSS) was devised to facilitate the proposed approach. Four case studies were conducted to examine the approach's viability. Senior managers in four British construction companies tried the DSS and gave very promising feedback. The paper concludes that the proposed methodology may contribute to bridging the gap between theory and practice of construction risk assessment.
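
The paper's Evidential Reasoning algorithm is not reproduced here, but the flavour of combining incomplete expert assessments under DST can be illustrated with Dempster's classical rule of combination; the impact levels and mass values below are hypothetical.

# Illustrative sketch: Dempster's rule of combination for two basic belief
# assignments over risk-impact levels. This is the classical DST rule, not
# the paper's Evidential Reasoning algorithm; the masses are hypothetical.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        intersection = a & b
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

low, high = frozenset({"low"}), frozenset({"high"})
either = low | high                            # ignorance: mass on the whole frame
analyst_1 = {high: 0.6, either: 0.4}           # incomplete assessment
analyst_2 = {high: 0.5, low: 0.2, either: 0.3}
print(combine(analyst_1, analyst_2))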

Relevance:

30.00%

Publisher:

Abstract:

In this paper we explore the relationship between market norms and practices and the development of the figure of the parent within British education policy. Since the 1970s parents in England have been called upon to perform certain duties and obligations in their relation to the state. These duties include internalizing responsibility for risks, liabilities, inequities and the spectre of crises formerly managed by the state. Rather than characterize this situation in terms of the ‘hollowing of the state’, we argue that the role of the state includes enabling the functioning of the parent as a neoliberal subject, so that they may successfully harness the power of the market to their own advantage and (hopefully) minimize the kinds of risk generated through a deregulated education system. In this paper we examine how parents are compelled to embody certain market norms and practices as they navigate the field of education. In particular we focus on how parents are 1) summoned as consumers or choosers of education services, and thus encouraged to embody through their behaviour a competitive orientation; 2) summoned as governors and custodians of schools, with a focus on assessing financial and educational performance; and 3) summoned as producers and founders of schools, with a focus on entrepreneurial and innovative activity.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we explore the various spaces and sites through which the figure of the parent is summoned and activated to inhabit and perform market norms and practices in the field of education in England. Since the late 1970s successive governments have called on parents to enact certain duties and obligations in relation to the state. These duties include adopting and internalizing responsibility for all kinds of risks, liabilities and inequities formerly managed by the Keynesian welfare state. Rather than characterize this situation in terms of the ‘hollowing of the state’, we argue that the role of the state includes enabling the functioning of the parent as a neoliberal subject so that they may successfully harness the power of the market to their own advantage and (hopefully) minimize the kinds of risk and inequity generated through a market-based, deregulated education system. In this paper we examine how parents in England are differently, yet similarly, compelled to embody certain market norms and practices as they navigate the field of education. Adopting genealogical enquiry and policy discourse analysis as our methodology, we explore how parents across three policy sites or spaces are constructed as objects and purveyors of utility and ancillaries to marketisation. This includes a focus on how parents are summoned as 1) consumers or choosers of education services; 2) governors and overseers of schools; and 3) producers and founders of schools.

Relevance:

30.00%

Publisher:

Abstract:

The use of teams of Autonomous Underwater Vehicles for visual inspection tasks is a promising field of robotics. The images captured by the different robots can also be used to aid the localization and navigation of the fleet. In a previous work, a distributed localization system based on an Augmented State Kalman Filter and the visual maps obtained by the fleet was presented. In this context, this paper details a system for the on-line construction of visual maps and its use to aid the localization and navigation of the robots. Different aspects related to the capture, processing and construction of mosaics by fleets of robots are presented. The developed system can be executed on-line on different robotic platforms. The paper concludes with a series of tests and analyses aimed at validating the system.
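
As an illustrative aside (not the authors' system), the sketch below shows the predict/update cycle of a linear Kalman filter, the building block that augmented-state formulations extend by appending new pose states as mosaic images are registered; all matrices and measurements are made up.

# Illustrative sketch (not the authors' system): the predict/update cycle of
# a linear Kalman filter. Augmented-state variants append new pose states to
# x as mosaic images are registered; values below are made up.
import numpy as np

def kf_predict(x, P, F, Q):
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    y = z - H @ x                                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    I = np.eye(len(x))
    return x + K @ y, (I - K @ H) @ P

# Hypothetical 2D vehicle position tracked from image-registration fixes:
x, P = np.zeros(2), np.eye(2)
F, Q = np.eye(2), 0.01 * np.eye(2)
H, R = np.eye(2), 0.10 * np.eye(2)
x, P = kf_predict(x, P, F, Q)
x, P = kf_update(x, P, np.array([0.3, -0.1]), H, R)
print(x)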

Relevance:

30.00%

Publisher:

Abstract:

Artificial Immune Systems have been used successfully to build recommender systems for film databases. In this research, an attempt is made to extend this idea to web site recommendation. A collection of more than 1000 individuals' web profiles (also called preferences, favourites or bookmarks files) will be used. URLs will be classified using the DMOZ (Directory Mozilla) database of the Open Directory Project as our ontology. This will then be used as the data for the Artificial Immune Systems rather than the actual addresses. The first attempt will involve using a simple classification code number coupled with the number of pages within that classification code. However, this implementation does not make use of the hierarchical tree-like structure of DMOZ. Consideration will then be given to the construction of a similarity measure for web profiles that makes use of this hierarchical information to build a better-informed Artificial Immune System.
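
As a hedged illustration of the kind of hierarchy-aware measure the last sentence anticipates (not the dissertation's own measure), the sketch below scores DMOZ-style category paths by their shared prefix depth; the category strings and the averaging scheme are assumptions.

# Illustrative sketch (not the dissertation's measure): a simple similarity
# between two DMOZ-style category paths based on their shared prefix depth.
# Category strings and the averaging over profiles are hypothetical.
def path_similarity(a, b):
    """Similarity in [0, 1] between two '/'-separated category paths."""
    pa, pb = a.split("/"), b.split("/")
    shared = 0
    for x, y in zip(pa, pb):
        if x != y:
            break
        shared += 1
    return 2 * shared / (len(pa) + len(pb))

def profile_similarity(profile_a, profile_b):
    """Average best-match similarity between two sets of category paths."""
    return sum(max(path_similarity(a, b) for b in profile_b)
               for a in profile_a) / len(profile_a)

p1 = ["Computers/Internet/Searching", "Arts/Music/Jazz"]
p2 = ["Computers/Internet/Web_Design", "Arts/Music/Classical"]
print(profile_similarity(p1, p2))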

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, systems that extract information from millions of Internet documents have become commonplace. Knowledge graphs -- structured knowledge bases that describe entities, their attributes and the relationships between them -- are a powerful tool for understanding and organizing this vast amount of information. However, a significant obstacle to knowledge graph construction is the unreliability of the extracted information, due to noise and ambiguity in the underlying data or errors made by the extraction system and the complexity of reasoning about the dependencies between these noisy extractions. My dissertation addresses these challenges by exploiting the interdependencies between facts to improve the quality of the knowledge graph in a scalable framework. I introduce a new approach called knowledge graph identification (KGI), which resolves the entities, attributes and relationships in the knowledge graph by incorporating uncertain extractions from multiple sources, entity co-references, and ontological constraints. I define a probability distribution over possible knowledge graphs and infer the most probable knowledge graph using a combination of probabilistic and logical reasoning. Such probabilistic models are frequently dismissed due to scalability concerns, but my implementation of KGI maintains tractable performance on large problems through the use of hinge-loss Markov random fields, which have a convex inference objective. This allows the inference of large knowledge graphs using 4M facts and 20M ground constraints in 2 hours. To further scale the solution, I develop a distributed approach to the KGI problem which runs in parallel across multiple machines, reducing inference time by 90%. Finally, I extend my model to the streaming setting, where a knowledge graph is continuously updated by incorporating newly extracted facts. I devise a general approach for approximately updating inference in convex probabilistic models, and quantify the approximation error by defining and bounding inference regret for online models. Together, my work retains the attractive features of probabilistic models while providing the scalability necessary for large-scale knowledge graph construction. These models have been applied on a number of real-world knowledge graph projects, including the NELL project at Carnegie Mellon and the Google Knowledge Graph.
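
Hinge-loss Markov random fields define convex potentials over soft truth values; as an illustration only (not the dissertation's PSL implementation), the sketch below evaluates the hinge-loss potential of one grounded co-reference rule, with made-up truth values.

# Illustrative sketch (not the dissertation's implementation): a hinge-loss
# potential for one grounded rule of the form
#   SameEntity(A, B) & Label(A, L) -> Label(B, L)
# over soft truth values in [0, 1], as used in hinge-loss MRFs.
def lukasiewicz_and(a, b):
    return max(0.0, a + b - 1.0)

def hinge_loss_potential(same_entity, label_a, label_b):
    """Distance to satisfaction of the grounded implication."""
    body = lukasiewicz_and(same_entity, label_a)
    return max(0.0, body - label_b)   # zero when the rule is satisfied

# Hypothetical soft truth values inferred for three atoms:
print(hinge_loss_potential(same_entity=0.9, label_a=0.8, label_b=0.4))  # ~0.3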

Relevance:

30.00%

Publisher:

Abstract:

Today the Ria de Aveiro of northern Portugal has a hydromorphological regime in which river influence is limited to periods of flood. For most of the annual cycle, tidal currents and wind waves are the major forcing agents in this complex coastal lagoon–estuarine system. The system has evolved over two centuries from one that was naturally fluvially dominant to one that is today tidally dominant. Human influence was a trigger for these changes, starting in 1808 when its natural evolution was halted by the construction of a new inlet/outlet channel through the mobile sand spit that isolates it from the Atlantic Ocean. In consequence, tidal ranges in the lagoon increased rapidly from ~0.1 m to >1 m and continued to increase, as a result of continued engineering works and dredging, today reaching ~3 m on spring tides. Hydromorphological adjustments that have taken place include the deepening of channels, an increase in the area of inter-tidal flats, regression of salt marsh, increased tidal propagation and increased saline intrusion. Loss of once abundant submerged aquatic vegetation (SAV), due to increased tidal flows, exacerbated by increased recreational activities, has been accompanied by a change from fine cohesive sediments to coarser, mobile sediments with reduced biological activity.

Relevance:

30.00%

Publisher:

Abstract:

The process of building Data Warehouses (DW) is well known and has well-defined stages, but at the same time it is mostly carried out manually by IT people in conjunction with business people. Web Warehouses (WW) are DW whose data sources are taken from the web. We define a flexible WW, which can be configured for different domains through the selection of the web sources and the definition of data processing characteristics. A Business Process Management (BPM) system allows modeling and executing Business Processes (BPs), providing support for the automation of processes. To support the process of building flexible WW, we propose two levels of BPs: a configuration process to support the selection of web sources and the definition of schemas and mappings, and a feeding process which takes the defined configuration and loads the data into the WW. In this paper we present a proof of concept of both processes, with a focus on the configuration process and the defined data.
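
As a hedged illustration of what the configuration level might produce (not the paper's system), the sketch below models a WW configuration as a small data structure listing web sources and their schema mappings, consumed by a stub feeding process; all field names and the example source are hypothetical.

# Illustrative sketch (not the paper's system): a minimal data structure a
# configuration process might produce and a feeding process might consume.
# All field names and the example source are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class WebSource:
    name: str
    url: str
    mappings: Dict[str, str]          # source field -> warehouse column

@dataclass
class WWConfiguration:
    domain: str
    sources: List[WebSource] = field(default_factory=list)

config = WWConfiguration(
    domain="real-estate listings",
    sources=[WebSource(name="listings_site",
                       url="https://example.org/listings",
                       mappings={"price": "fact_price",
                                 "city": "dim_location.city"})])

def feed(config: WWConfiguration):
    """Feeding process stub: iterate the configured sources and load them."""
    for source in config.sources:
        print(f"loading {source.url} into columns {list(source.mappings.values())}")

feed(config)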