Abstract:
The UK construction industry is notorious for the sheer number of disputes which are likely to arise on each building and engineering project. Despite numerous creative attempts at “dispute avoidance” and “dispute resolution”, the industry is still plagued with these costly disputes. Whilst both academic literature and professional practice have investigated the causes of disputes and the mechanisms for avoiding or resolving them, neither has studied in any detail the nature of construction disputes and why they develop as they do once a construction lawyer is engaged. Accordingly, this research explores what influences the outcome of a construction dispute and to what extent construction lawyers control or direct this outcome. The research approach was ethnographic. Fieldwork took place at a leading construction law firm in London over 18 months. The primary focus was participant observation in all of the firm’s activities. In addition, a database was compiled from the firm’s files and archives, thus providing information for quantitative analysis. The basis of the theoretical framework, and indeed of the research method, was Actor‐Network Theory (ANT). As such, this research viewed a dispute as a set of associations – an entity which takes form and acquires its attributes as a result of its relations with other entities. This viewpoint is aligned with relational contract theories, which in turn provide a unified platform for exploring the disputes. The research investigated the entities and events which appeared to influence the dispute’s identity, shape and outcome. With regard to a dispute’s trajectory, the research took as its starting point that a dispute follows the transformation of “naming, blaming, claiming…”, as identified by Felstiner, Abel and Sarat in 1980. The research found that construction disputes generally materialise and develop before either party approaches a lawyer. Once the lawyer is engaged, we see the reverse of the trajectory “naming, blaming, claiming…”, namely “claiming, blaming, naming…”. The lawyers’ role is to identify or name (or rename) the dispute in the best possible light for their client in order to achieve the desired outcome – a development akin to the design process. The transformation of a dispute and the reverse trajectory are by no means linear, but rather iterative and spatial, as they require alliances, dependencies and contingencies to assemble and take the shape they do. The research concludes that construction disputes are rarely, if ever, completely “resolved” as such. Whilst an independent third party may hand down a judgment, or the parties may reach a settlement agreement, this state is only temporary. Some construction disputes dissipate whilst others reach a state of hibernation for a period of time, only to pick up momentum and energy some years later. Accordingly, this research suggests that the concept of “dispute resolution” does not exist in the UK construction industry. The goal should instead be for the parties to reach a perpetual state of equilibrium as quickly and as cost-effectively as possible: “dispute dissolution”, the slowing down of the dispute’s momentum. Rather than focusing on the design and assemblage of the dispute, the lawyers’ role therein is, or should be, to assist with the “disassembling” of the dispute.
Abstract:
This paper aims to present a state-of-the-art review of the scope and practical implications of the Building Information Modelling (BIM) platform in UK construction practice. Theoretical developments suggest that BIM is an integration of both product and process innovation, not just a disparate set of software tools. BIM provides effective collaboration, visual representation and data management, which enable the smooth flow of information throughout the project’s lifecycle. The most frequently reported benefits relate to capital cost (capex) and operational cost (opex) savings and to time savings. Key challenges, however, centre on the interoperability of software, capital installation costs, in-house experience, client preference and cultural issues within design teams and within the organisation. The paper concludes with a critical commentary on the changing roles and the processes required to implement BIM in UK construction projects, and suggests areas for further research.
Abstract:
Genes, which encode the biological functions of living organisms, are the basic molecular unit of heredity. To explain the diversity of species observable today, it is essential to understand how genes evolve. To do so, we must reconstruct the past by inferring their phylogeny, that is, a gene tree representing the kinship relationships between the coding regions of living organisms. Classical phylogenetic inference methods were designed primarily to build species trees and rely only on DNA sequences. Genes, however, are rich in information, and reconstruction methods that exploit their specific properties are only beginning to appear. In particular, the history of a gene family in terms of duplications and losses, obtained by reconciling a gene tree with a species tree, can allow us to detect weaknesses within a tree and to improve it. In this thesis, reconciliation is applied to the construction and correction of gene trees from three different angles: 1) We address the problem of resolving a non-binary gene tree. In particular, we present a linear-time algorithm that resolves a polytomy based on reconciliation. 2) We propose a new approach to gene tree correction based on orthology and paralogy relations. Polynomial-time algorithms are presented for the following problems: correcting a gene tree so that it contains a given set of orthologs, and validating a set of partial orthology and paralogy relations. 3) We show how reconciliation can be used to "combine" several gene trees. More precisely, we study the problem of choosing a gene supertree according to its reconciliation cost.
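For readers unfamiliar with reconciliation costs, the following is a minimal Python sketch of classical LCA reconciliation (not the thesis's own algorithms): each gene-tree node is mapped to the lowest common ancestor of its children's species, duplications are flagged when a node maps to the same species as one of its children, and losses are counted from depth differences in the species tree. The tree representation, names and leaf-to-species mapping are illustrative assumptions.

class Node:
    def __init__(self, name=None, children=()):
        self.name = name
        self.children = list(children)
        self.parent = None
        for c in self.children:
            c.parent = self

def annotate_depths(root, depth=0):
    """Record each species-tree node's depth (edges from the root)."""
    root.depth = depth
    for c in root.children:
        annotate_depths(c, depth + 1)

def lca(a, b):
    """Lowest common ancestor of two species-tree nodes (walk up by depth)."""
    while a is not b:
        if a.depth >= b.depth:
            a = a.parent
        else:
            b = b.parent
    return a

def reconcile(gene_root, leaf_to_species):
    """Return (#duplications, #losses) of the LCA reconciliation."""
    dups = losses = 0

    def mapping(g):
        nonlocal dups, losses
        if not g.children:                      # gene-tree leaf
            return leaf_to_species[g.name]
        child_maps = [mapping(c) for c in g.children]
        m = child_maps[0]
        for cm in child_maps[1:]:
            m = lca(m, cm)
        is_dup = any(cm is m for cm in child_maps)
        if is_dup:
            dups += 1
        for cm in child_maps:
            dist = cm.depth - m.depth           # edges between mappings
            losses += dist if is_dup else max(dist - 1, 0)
        return m

    mapping(gene_root)
    return dups, losses

# Toy example: species tree ((A,B),C); gene tree ((a1,b1),(a2,c1)).
A, B, C = Node("A"), Node("B"), Node("C")
AB = Node(children=(A, B)); S = Node(children=(AB, C))
annotate_depths(S)
g = Node(children=(Node(children=(Node("a1"), Node("b1"))),
                   Node(children=(Node("a2"), Node("c1")))))
print(reconcile(g, {"a1": A, "b1": B, "a2": A, "c1": C}))   # -> (1, 2)

In this toy case the root must be a duplication (one copy spans A and C), and two losses are implied, which is the cost a reconciliation-based supertree or correction method would seek to minimise.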
Abstract:
This dissertation examined the formation of Japanese identity politics after World War II. Since World War II, Japan has had to deal with a contradictory image of its national self. On the one hand, as a nation responsible for colonizing fellow Asian countries in the 1930s and 1940s, Japan has struggled with an image/identity as a regional aggressor. On the other hand, having faced the harsh realities of defeat after the war, Japan has seen itself depicted as a victim. By employing the technique of discourse analysis as a way to study identity formation through official foreign policy documents and news media narratives, this study reconceptualized Japanese foreign policy as a set of discursive practices that attempt to produce renewed images of Japan’s national self. The dissertation employed case studies to analyze two key sites of Japanese postwar identity formation: (1) the case of Okinawa, an island/territory integral to postwar relations between Japan and the United States and marked by a series of US military rapes of native Okinawan girls; and (2) the case of comfort women in Japan and East Asia, which has led to Japan being blamed for its wartime sexual enslavement of Asian women. These case studies found that it was through coping with the haunting ghost of its wartime past that Japan sought to produce “postwar Japan” as an identity distinct from “wartime imperial Japan” or from “defeated, emasculated Japan” and, thus, hoped to emerge as a “reborn” moral and pacifist nation. The research showed that Japan struggled to invent a new self in a way that mobilized gendered dichotomies and, furthermore, created “others” who were not just spatially located (the United States, Asian neighboring nations) but also temporally marked (“old Japan”). The dissertation concluded that Japanese foreign policy is an ongoing struggle to define the Japanese national self vis-à-vis both spatial and historical “others,” and that, consequently, postwar Japan has always been haunted by its past self, no matter how much Japan’s foreign policy discourses were trying to make this past self into a distant or forgotten other.
Abstract:
Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry’s standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete for drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage, however, still needs to be considered in order to locate the boundary between the concrete and the soil and thus obtain the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system. Such a methodology can be performed with the same CSL equipment, following the CSL tests used to check the integrity of the inside concrete, thus allowing the drilled shaft diameter to be determined without having to set up another NDT device. This proposed new method is based on the installation of galvanized tubes outside the shaft, across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of concrete and the ultrasonic wave properties using signal processing. The experimental results show that there is a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new method of measuring the diameter of drilled shafts during construction using an NDT technique overcomes the limitations of currently used methods. In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects and is called Frequency Tomography Analysis (FTA). Time-domain data for the signals propagated between tubes are transformed to the frequency domain using the Fast Fourier Transform (FFT), and the FTA distribution is then evaluated. This method is employed after CSL has determined the high probability of an anomaly in a given area and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth location of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study also evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
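As an illustration of the FFT step described above (not the study's actual processing chain), the following Python sketch takes a time-domain signal such as that recorded by a receiver probe, moves it to the frequency domain, and extracts the maximum spectral amplitude, which is the quantity the abstract correlates with concrete thickness. The sampling rate and synthetic tone burst are assumptions.

import numpy as np

def max_spectral_amplitude(signal: np.ndarray, sample_rate_hz: float):
    """Return (peak amplitude, peak frequency in Hz) of the one-sided spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))            # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    k = int(np.argmax(spectrum[1:]) + 1)              # skip the DC bin
    return spectrum[k], freqs[k]

# Synthetic example: a 50 kHz ultrasonic tone burst sampled at 1 MHz with noise.
fs = 1.0e6
t = np.arange(0, 2e-3, 1.0 / fs)
burst = np.sin(2 * np.pi * 50e3 * t) * np.exp(-((t - 2e-4) / 1e-4) ** 2)
received = burst + 0.05 * np.random.default_rng(0).normal(size=t.size)

amp, f_peak = max_spectral_amplitude(received, fs)
print(f"peak amplitude {amp:.1f} at {f_peak/1e3:.1f} kHz")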
Abstract:
The growing need for infrastructure has led to expanding research on advances in road pavement materials. Finding solutions that are sustainable, environmentally friendly and cost-efficient is a priority. Focusing such efforts on low-traffic and rural roads can contribute significant progress to the vital circulatory system of transport for rural and agricultural areas. An important alternative material for pavement construction is recycled aggregate from solid waste, including waste from civil engineering activities, mainly construction and demolition. A literature review of previous studies is made, and a planned set of laboratory testing procedures is performed, aimed at fully characterizing and assessing the potential in-situ mechanical performance and chemical impact. Furthermore, the full-scale response of the selected materials is monitored at a real field construction site, including the production, laying and compaction operations. Moreover, a novel single-phase solution for the construction of semi-flexible paving layers, to be used as an alternative to common concrete and bituminous layers, is introduced and tested, aiming at the production and laying of a single-phase material instead of a traditional two-phase grouted macadam. Finally, in a parallel research effort on farming pavements, the possible use of common geotechnical anti-erosion products to improve the soil bearing capacity of paddock areas in the cattle husbandry of bio-farms is evaluated. This thesis has clearly demonstrated the feasibility of using sustainable recycled aggregates for low-traffic rural roads and for the pavements of farming and agricultural areas. The pavement layers constructed with recycled aggregates provided satisfying performance under heavy traffic conditions in experimental pavements. This, together with the fact that these aggregates can be available in most areas and in large quantities, provides great impetus towards shifting from traditional materials to more sustainable alternatives. The chemical and environmental stability of these materials confirms their soundness for use in farming environments.
Abstract:
In this thesis work, a cosmic-ray telescope was set up in the INFN laboratories in Bologna using smaller-size replicas of CMS Drift Tube chambers, called MiniDTs, to test and develop new electronics for the CMS Phase-2 upgrade. The MiniDTs were assembled at the INFN National Laboratory of Legnaro, Italy. Scintillator tiles complete the telescope, providing a signal independent of the MiniDTs for offline analysis. The telescope readout is a test system for the CMS Phase-2 upgrade data-acquisition design. The readout is based on an early prototype of a radiation-hard FPGA-based board developed for the High Luminosity LHC CMS upgrade, called On Board electronics for Drift Tubes. Once the set-up was operational, we developed an online monitor to display in real time the most important observables for checking the quality of the data acquisition. We performed an offline analysis of the collected data using a custom version of CMS software tools, which allowed us to estimate the time pedestal and drift velocity in each chamber, evaluate the efficiency of the different DT cells, and measure the space and time resolution of the telescope system.
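One common way to estimate a time pedestal and drift velocity from drift-tube data is the "time box" method: the rising edge of the drift-time distribution gives the pedestal, the falling edge gives the maximum drift time, and the known half-cell width divided by that window gives the mean drift velocity. The Python sketch below illustrates this idea only; it is an assumption for exposition, not the CMS software used in the thesis, and the 21 mm half-cell width and toy data are likewise assumed.

import numpy as np

HALF_CELL_WIDTH_MM = 21.0   # assumed half-width of a DT cell, in mm

def timebox_edges(drift_times_ns, threshold=0.5):
    """Estimate pedestal and maximum drift time from the time-box histogram.

    Edges are taken where the histogram first/last exceeds `threshold`
    times its plateau level (median of the populated bins).
    """
    counts, edges = np.histogram(drift_times_ns, bins=200)
    centers = 0.5 * (edges[:-1] + edges[1:])
    plateau = np.median(counts[counts > 0])
    above = centers[counts > threshold * plateau]
    return above.min(), above.max()

def drift_velocity(drift_times_ns):
    t_ped, t_max = timebox_edges(drift_times_ns)
    v = HALF_CELL_WIDTH_MM / (t_max - t_ped)      # mm per ns
    return t_ped, v

# Toy data: uniform illumination of the cell gives a roughly flat time box.
rng = np.random.default_rng(1)
times = rng.uniform(120.0, 500.0, size=50_000) + rng.normal(0, 3, size=50_000)
t_ped, v = drift_velocity(times)
print(f"pedestal ~ {t_ped:.0f} ns, drift velocity ~ {v*1e3:.0f} um/ns")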
Abstract:
Since 2010, the proton radius has become one of the values of greatest interest to determine. The first evidence that its internal structure is not completely understood came from the measurement of the Lamb shift in muonic hydrogen, which yielded a value 7σ lower than expected. A new road was thus opened and the Proton Radius Puzzle era began. The FAMU experiment is a project that tries to answer this puzzle by implementing a high-precision experimental apparatus. The work of this thesis is based on the study, construction and first characterization of a new detection system. Building on previous experiments and simulations, this apparatus is composed of 17 detectors positioned on a semicircular crown, together with the related electronic circuit. The detectors' characterization is based on a LabView program controlling a digital potentiometer and on two further analog potentiometers, all three used to set the amplitude of each detector to a predefined value of around 1.2 V, set on the oscilloscope on which the signal is observed. This is required so that, in the final measurement, a single high peak is obtained as the sum of all the signals coming from the detectors. Each signal was acquired for roughly half an hour, but the entire circuit was kept active for longer in order to verify its ability to operate over extended periods. The principal results of this thesis are the spectra of 12 detectors and the corresponding values of voltage, FWHM and resolution. The acquisitions also show another expected behaviour: the strong dependence of the detectors on temperature, demonstrating that a change in temperature causes fluctuations in the signal. In turn, these fluctuations affect the spectrum, resulting in a shift of the curve and a lower resolution. On the other hand, a measurement performed under stable conditions leads to agreement between the nominal and experimental values, as for detectors 10, 11 and 12 of our system.
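As a minimal sketch of how the quoted per-detector figures (peak voltage, FWHM and resolution) can be extracted from an acquired amplitude spectrum, the Python snippet below fits a Gaussian to the main peak and reports FWHM = 2√(2 ln 2)·σ and resolution = FWHM/peak. It is illustrative only, not the thesis's analysis code, and the synthetic spectrum centred near the 1.2 V working point is an assumption.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, mean, sigma):
    return amplitude * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def peak_fwhm_resolution(bin_centers, counts):
    """Fit a Gaussian to the spectrum and return (peak, FWHM, resolution)."""
    p0 = (counts.max(), bin_centers[np.argmax(counts)], 0.05)
    (amp, mean, sigma), _ = curve_fit(gaussian, bin_centers, counts, p0=p0)
    fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma)
    return mean, fwhm, fwhm / mean

# Toy spectrum: signal amplitudes centred near the 1.2 V working point.
rng = np.random.default_rng(2)
amplitudes = rng.normal(loc=1.2, scale=0.04, size=20_000)
counts, edges = np.histogram(amplitudes, bins=120)
centers = 0.5 * (edges[:-1] + edges[1:])

peak, fwhm, res = peak_fwhm_resolution(centers, counts)
print(f"peak = {peak:.3f} V, FWHM = {fwhm*1e3:.1f} mV, resolution = {res:.1%}")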
Abstract:
The scope of this paper is to reflect on the theoretical construction involved in the constitution of the sociology of health, still called medical sociology in some countries. Two main ideas constitute the basis for this: interdisciplinarity and the degree of articulation between the fields of medicine and sociology. We sought to establish a dialogue with some dimensions - macro/micro, structure/action - that constitute the basis for understanding medicine/health in relation to the social/sociological dimension. The main aspects of these dimensions are presented first. Straus' two medical sociologies and the theory/application impasses are then addressed, as well as the dilemmas of the sociology of medicine in the 1960s and 1970s. Against these analyses, the theoretical production before 1970 is placed as a counterpoint. Lastly, the sociology of health is viewed in the general context of sociology, which underwent a fragmentation process from 1970 onwards with effects in all subfields of the social sciences. This process involves a rethinking of theoretical issues across a broadened spectrum of possibilities. The 1980s are highlighted as the period when theoretical issues in the sociology of health were reinvigorated and the issue of interdisciplinarity was once again addressed.
Abstract:
The Centers for High Cost Medication (Centros de Medicação de Alto Custo, CEDMAC) of the São Paulo Health Department were instituted through a project in partnership with the Clinical Hospital of the Faculty of Medicine, USP, sponsored by the Foundation for Research Support of the State of São Paulo (Fundação de Amparo à Pesquisa do Estado de São Paulo, FAPESP), aimed at the formation of a statewide network for the comprehensive care of patients referred for the use of immunobiological agents in rheumatological diseases. The CEDMAC of Hospital de Clínicas, Universidade Estadual de Campinas (HC-Unicamp), implemented by the Division of Rheumatology, Faculty of Medical Sciences, identified the need to standardize the conduct of the multidisciplinary team, given the specificity of the care involved, and verified the importance of describing its operational and technical processes in manual format. The aim of this study is to present the methodology applied to the elaboration of the CEDMAC/HC-Unicamp Manual as an institutional tool, with the aim of offering the best assistance and administrative quality. In the methodology for preparing manuals at HC-Unicamp since 2008, the premise has been to obtain a document that is participatory and multidisciplinary, focused on work processes integrated with institutional rules, with objective and didactic descriptions, in a standardized format and with electronic dissemination. The CEDMAC/HC-Unicamp Manual was elaborated in 10 months, with the involvement of the entire multidisciplinary team, and comprises 19 chapters on work processes and techniques, in addition to those concerning the organizational structure and its annexes. Published on the electronic portal of HC Manuals in July 2012 as an e-Book (ISBN 978-85-63274-17-5), the manual has been a valuable instrument in guiding professionals in healthcare, teaching and research activities.
Abstract:
This is an analysis of the theoretical and practical construction of the methodology of Matrix Support, by means of studies on Paideia Support (Institutional and Matrix Support), an inter-professional arrangement for joint care discussed in recent literature and in official documents of the Unified Health System (SUS). An attempt was made to describe its methodological concepts and strategies. A comparative analysis of Institutional Support and Matrix Support was also conducted using the epistemological framework of Field and Core Knowledge and Practices.
Abstract:
Measurement instruments are an integral part of clinical practice, health evaluation and research. These instruments are only useful and able to present scientifically robust results when they are developed properly and have appropriate psychometric properties. Despite the significant increase in rating scales, the literature suggests that many of them have not been adequately developed and validated. The scope of this study was to conduct a narrative review of the process of developing new measurement instruments and to present some tools which can be used in certain stages of the development process. The steps described were: I - establishment of a conceptual framework and definition of the objectives of the instrument and the population involved; II - development of the items and of the response scales; III - selection and organization of the items and structuring of the instrument; IV - content validity; V - pre-test. This study also includes a brief discussion of the evaluation of psychometric properties, given their importance for instruments to be accepted and acknowledged in both scientific and clinical environments.
Abstract:
Evolving-interface methods were initially focused on the solution of scientific problems in fluid dynamics. With the advent of the more robust modeling provided by the Level Set method, their original boundaries of applicability were extended. In the Geometric Modeling area specifically, works published so far relating Level Set to three-dimensional surface reconstruction have centred on reconstruction from a point cloud dispersed in space; the approach based on parallel planar slices transversal to the object to be reconstructed is still incipient. Based on this, the present work analyses the feasibility of Level Set for three-dimensional reconstruction, offering a methodology that integrates ideas already published and proven efficient for this approach with proposals for handling limitations of the method not yet satisfactorily treated, in particular the excessive smoothing of fine features of contours evolving under Level Set. In this regard, the Particle Level Set variant is suggested as a solution, for its proven intrinsic capability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to evaluate the presented three-dimensional surface reconstruction methodology qualitatively.
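To make the smoothing issue concrete, the following is a minimal 2D Python sketch (an illustration, not the thesis methodology) of one explicit update step of a level-set function evolving under mean-curvature flow, the kind of motion responsible for rounding off fine contour features. The grid size, time step and initial square contour are assumptions.

import numpy as np

def curvature_flow_step(phi: np.ndarray, dt: float, h: float) -> np.ndarray:
    """Advance phi by one explicit step of  d(phi)/dt = kappa * |grad(phi)|."""
    px, py = np.gradient(phi, h)
    pxx = np.gradient(px, h, axis=0)
    pyy = np.gradient(py, h, axis=1)
    pxy = np.gradient(px, h, axis=1)
    grad2 = px**2 + py**2 + 1e-12
    # Mean curvature times |grad(phi)| in 2D.
    kappa_gradphi = (pxx * py**2 - 2.0 * px * py * pxy + pyy * px**2) / grad2
    return phi + dt * kappa_gradphi

# Initial interface: a square contour (zero level set), whose sharp corners are
# the first features that curvature-driven evolution smooths away.
n, h = 128, 1.0 / 128
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.maximum(np.abs(X - 0.5), np.abs(Y - 0.5)) - 0.25

for _ in range(200):
    phi = curvature_flow_step(phi, dt=0.1 * h * h, h=h)
print("interface area fraction:", float((phi < 0).mean()))

The Particle Level Set variant mentioned above addresses exactly this behaviour by seeding marker particles near the interface and using them to correct the level-set field where mass (area) is lost.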