913 results for Drilling process monitoring
Abstract:
This thesis presents and discusses the results of ambient seismic noise correlation for two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that retrieves the structural response between two receivers from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, mostly those associated with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event was mR 3.7, in Northeast Brazil. We also showed that 1-bit normalization and spectral whitening result in a loss of waveform detail, and that the phase auto-correlation, which is amplitude unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes essentially after 1 month. This could be explained by fluid pressure redistribution, initiated by hydromechanical changes, with pathways opened to shallower depth levels by later-occurring earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation for a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking. 
The healing process (the filling of the new cracks), which lasted 60 days, can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data. We were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, from the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 seismic event. This indicates a rapid earthquake-induced change of the subsurface in the fault region.
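The core operation behind noise correlation – recovering the inter-station delay from simultaneous noise records, here with the 1-bit normalization the abstract discusses – can be sketched in a few lines. This is a toy illustration, not the thesis's processing chain: the signals, lag range, and function names are invented for the example.

```python
import random

def one_bit(trace):
    """1-bit normalization: keep only the sign of each sample."""
    return [1 if x > 0 else (-1 if x < 0 else 0) for x in trace]

def cross_correlate(a, b, max_lag):
    """Naive cross-correlation of two equal-length traces for lags in
    [-max_lag, max_lag]; returns a list of (lag, value) pairs."""
    n = len(a)
    out = []
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out.append((lag, s))
    return out

# Toy example: "station B" records the same noise as "station A",
# delayed by 3 samples, so the correlation should peak at lag 3,
# as if station A were a virtual source.
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(500)]
a = noise
b = [0.0] * 3 + noise[:-3]
cc = cross_correlate(one_bit(a), one_bit(b), 10)
best_lag = max(cc, key=lambda p: p[1])[0]
print(best_lag)
```

With real data the correlation would be stacked over long periods and the delay converted to a travel time; the 1-bit step discards amplitude information, which is exactly the waveform detail the abstract reports losing.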
Abstract:
A sudden hydrocarbon influx from the formation into the wellbore poses a serious risk to the safety of the well. This sudden influx is termed a kick, which, if not controlled, may lead to a blowout. Early detection of the kick is therefore crucial to minimize the possibility of a blowout. A kick detection system based exclusively on surface monitoring carries a high probability of delayed detection, apart from other issues. Down-hole monitoring techniques have the potential to detect a kick at an early stage, and could be particularly beneficial when the influx occurs as a result of lost circulation. In a lost circulation scenario, when the down-hole pressure drops below the formation pore pressure, the formation fluid may start to enter the wellbore. The lost volume of drilling fluid is compensated by the formation fluid flowing into the wellbore, making it difficult to identify the kick from pit (mud tank) volume observations at the surface. This experimental study investigates the occurrence of a kick based on relative changes in the mass flow rate, pressure, density, and conductivity of the down-hole fluid. The parameters most sensitive to formation fluid are identified, and a methodology to detect a kick without false alarms is reported. A pressure transmitter, a Coriolis flow and density meter, and a conductivity sensor are employed to observe deteriorating well conditions down-hole. These observations are used to assess the occurrence of a kick and the associated blowout risk. Monitoring multiple down-hole parameters has the potential to improve the accuracy of interpretation related to kick occurrence, reduce the number of false alarms, and provide a broad picture of down-hole conditions. Down-hole monitoring techniques thus have the potential to shorten the kick detection period. 
A down-hole assembly of the laboratory-scale drilling rig model and a kick injection setup were designed, measuring instruments were acquired, a frame was fabricated, and the experimental set-up was assembled and tested. This set-up has the features necessary to evaluate kick events while implementing down-hole monitoring techniques. Various kick events are simulated on the drilling rig model. In the first set of experiments, compressed air (representing the formation fluid) is injected at a constant pressure margin. In the second set, the compressed air is injected at a different pressure margin. The experiments are also repeated at a different pump (flow) rate. This thesis consists of three main parts. The first part gives the general introduction, motivation, and outline of the thesis, together with a brief description of influx: its causes, various leading and lagging indicators, and the kick detection systems in use in the industry. The second part describes the design and construction of the laboratory-scale down-hole assembly of the drilling rig and the kick injection setup, which is used to implement the proposed methodology for early kick detection. The third part describes the methodology for early kick detection and presents and discusses experimental results showing how different influx events affect the mass flow rate, pressure, conductivity, and density of the down-hole fluid. The last chapter contains a summary of the study and directions for future research.
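The multi-parameter idea in this abstract – flag a kick only when several down-hole measurements deviate together, which suppresses single-sensor false alarms – can be sketched as a simple voting rule. The thresholds, sensor names, and readings below are invented for illustration; the actual thesis methodology is experimental and not reproduced here.

```python
def relative_change(baseline, value):
    """Signed relative change of a reading against its baseline."""
    return (value - baseline) / baseline

def kick_alarm(baseline, reading, thresholds, min_votes=3):
    """Count how many down-hole parameters exceed their relative-change
    threshold; raise an alarm only when min_votes parameters agree."""
    votes = 0
    for key, thr in thresholds.items():
        if abs(relative_change(baseline[key], reading[key])) >= thr:
            votes += 1
    return votes >= min_votes

baseline = {"mass_flow": 2.00, "pressure": 310.0,
            "density": 1.10, "conductivity": 4.0}
thresholds = {"mass_flow": 0.05, "pressure": 0.03,
              "density": 0.04, "conductivity": 0.10}

# Hypothetical gas influx: flow up; pressure, density, conductivity down.
reading = {"mass_flow": 2.30, "pressure": 295.0,
           "density": 1.02, "conductivity": 3.2}
print(kick_alarm(baseline, reading, thresholds))  # True
```

Requiring agreement among sensors is one plausible way to realize the abstract's claim that monitoring multiple parameters reduces false alarms while broadening the picture of down-hole conditions.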
Abstract:
In the Flux Cored Arc Welding (FCAW) process, the transfer of filler metal to the base material (the metal transfer mode) determines weld quality, so studies of this phenomenon are needed. In this work, metal transfer in the FCAW process is investigated by filming the phenomenon with near-infrared imaging. The literature survey found that this technique had not previously been used to analyze the FCAW process. It must be pointed out that the radiation emitted by the welding arc, fumes, and particles (spatter) in this process is a barrier to visualization-based studies. Metal transfer in the FCAW process was monitored within the operational envelope of voltage and wire feed speed, with an E71T-1 electrode (1.2 mm diameter) and Ar+25%CO2 as the shielding gas. A locally developed near-infrared imaging system with a frame rate of 300 Hz was employed to visualize metal transfer, in order to contribute to a better understanding of the process and to evaluate metal transfer characteristics, unlike previous studies, which used the shadowgraph technique. The footage clearly shows how the droplet is created and transferred, and the different metal transfer modes can be identified as voltage and wire feed speed are varied. The final result of this study is a set of metal transfer mode maps, which establish suitable conditions and provide a basis for developing arc control strategies for the FCAW process.
Abstract:
Inscription: Verso: Women at work: miscellaneous occupations. Cardiac Intensive Care Unit, Douglas County Hospital, Alexandria, Minnesota.
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings, and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, which results in repair or replacement when a fault is found. This routine leads to unnecessary inspections, which incur costs in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques: Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes, which determine the limits according to the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The degradation points identified for each case study by a PF, a GP, and a hybrid (combined PF and GP) DbM implementation are assessed against known degradation points. 
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis show that the extracted metrics and limits identify degradation occurrences accurately for components in continuous operation. For components with seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
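The DbM idea described above – track a degradation metric against a limit and invoke maintenance when the estimate crosses it – can be sketched with a deliberately simple stand-in for the PF/GP machinery. Here a plain exponential smoother replaces the statistical models, and the fouling series and limit are invented; this only illustrates the metric-vs-limit scheduling logic, not the thesis's implementations.

```python
def detect_degradation(metric_series, limit, alpha=0.3):
    """Exponentially smooth a degradation metric and return the first
    index at which the smoothed value exceeds the limit, or None."""
    est = metric_series[0]
    for i, m in enumerate(metric_series):
        est = alpha * m + (1 - alpha) * est  # simple smoother in place of PF/GP
        if est > limit:
            return i  # maintenance would be scheduled here, before failure
    return None

# Toy heat-exchanger fouling metric: stable at first, then drifting upward.
series = [0.10, 0.11, 0.09, 0.10, 0.12, 0.20, 0.35, 0.55, 0.80]
print(detect_degradation(series, limit=0.4))
```

Smoothing before thresholding is what prevents a single noisy reading from triggering maintenance, which is the same role the PF and GP estimates play in the thesis.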
Abstract:
Monitoring and enforcement are perhaps the biggest challenges in the design and implementation of environmental policies in developing countries, where the actions of many small informal actors cause significant impacts on ecosystem services and where the transaction costs for the state to regulate them could be enormous. This dissertation studies the potential of innovative institutions based on decentralized coordination and enforcement to induce better environmental outcomes. Such policies have in common that the state provides the incentives for organization, but compliance happens through decentralized agreements, trust building, signaling, and monitoring. I draw on the literatures on collective action, common-pool resources, game theory, and non-point source pollution to develop the instruments proposed here. To test the different conditions under which such policies could be implemented, I designed two field experiments, conducted with small-scale gold miners in the Colombian Pacific and with users and providers of ecosystem services in the states of Veracruz, Quintana Roo, and Yucatan in Mexico. This dissertation is organized in three essays.
The first essay, “Collective Incentives for Cleaner Small-Scale Gold Mining on the Frontier: Experimental Tests of Compliance with Group Incentives given Limited State Monitoring”, examines whether collective incentives, i.e. incentives provided to a group conditional on collective compliance, could “outsource” the required local monitoring, i.e. induce group interactions that extend the reach of a state that can observe only aggregate consequences, in the context of small-scale gold mining. I employed a framed field-lab experiment in which miners make decisions about mining intensity. The state sets a collective target for an environmental outcome, verifies compliance, and provides a group reward for compliance, which is split equally among members. Since the target set by the state transforms the situation into a coordination game, outcomes depend on expectations of what others will do. I conducted this experiment with 640 participants in a mining region of the Colombian Pacific, examining different levels of policy severity and their ordering. The findings suggest that such instruments can induce compliance, but this regulation involves tradeoffs: the most severe targets – with rewards just above costs – raise gains if successful but can collapse rapidly and completely. In terms of group interactions, better outcomes are found when severity is initially lower, suggesting learning.
The second essay, “Collective Compliance can be Efficient and Inequitable: Impacts of Leaders among Small-Scale Gold Miners in Colombia”, explores the channels through which communication helps groups coordinate in the presence of collective incentives, and whether the solutions reached are equitable. Also in the context of small-scale gold mining in the Colombian Pacific, I test the effect of communication on compliance with a collective environmental target. The results suggest that communication, as expected, helps to solve coordination challenges, but some groups still reach agreements involving unequal outcomes. By examining the agreements that took place in each group, I observe that the main coordination mechanism was the presence of leaders who helped other group members clarify the situation. Interestingly, leaders not only helped groups reach efficiency but also played a key role in equity by defining how the costs of compliance would be distributed among group members.
The third essay, “Creating Local PES Institutions and Increasing Impacts of PES in Mexico: A Real-Time Watershed-Level Framed Field Experiment on Coordination and Conditionality”, considers the creation of a local payments for ecosystem services (PES) mechanism as an assurance game that requires coordination between two groups of participants: upstream and downstream. Based on this assurance interaction, I explore the effect of allowing peer sanctions on upstream behavior on the functioning of the mechanism. This field-lab experiment was implemented in three real cases of the Mexican Fondos Concurrentes (matching funds) program in the states of Veracruz, Quintana Roo, and Yucatan, where 240 real users and 240 real providers of hydrological services were recruited and interacted with each other in real time. The experimental results suggest that initial trust-game behaviors align with participants’ perceptions and predict baseline giving in the assurance game. For upstream providers, i.e. those who get sanctioned, the threat and use of sanctions increase contributions. Downstream users contribute less when offered the option to sanction – as if that option signals an uncooperative upstream – though contributions then rise in line with the complementarity in payments of the assurance game.
Abstract:
Brain injury due to lack of oxygen or impaired blood flow around the time of birth may cause long-term neurological dysfunction or, in severe cases, death. Treatment needs to be initiated as soon as possible and tailored to the nature of the injury to achieve the best outcomes. The electroencephalogram (EEG) currently provides the best insight into neurological activity. However, its interpretation presents a formidable challenge for neurophysiologists, and such expertise is not widely available, particularly around the clock in a typical busy Neonatal Intensive Care Unit (NICU). Therefore, an automated computerized system for detecting and grading the severity of brain injuries could be of great help to medical staff in diagnosing and initiating timely treatment. In this study, automated systems for detecting neonatal seizures and grading the severity of Hypoxic-Ischemic Encephalopathy (HIE) using EEG and heart rate (HR) signals are presented. It is well known that the EEG and HR signals carry considerable contextual and temporal information when examined on longer time scales. Systems developed in the past exploited this information either at a very early stage, before any intelligent block, or at a very late stage, where the presence of such information is much reduced. This work has focused on developing a system that incorporates the contextual information at the middle (classifier) level. This is achieved by using dynamic classifiers that process sequences of feature vectors rather than one feature vector at a time.
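A dynamic classifier in the sense used above scores a whole sequence of feature vectors rather than each vector in isolation. One classic family of such classifiers is the hidden Markov model; the toy sketch below compares the likelihood of a discrete observation sequence under two single-state "models". Everything here – the symbol alphabet, the two models, and the sequence – is invented for illustration, not taken from the thesis.

```python
import math

def forward_loglik(obs, trans, emit, init):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm); obs is a list of emission-symbol indices."""
    n = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[p] * trans[p][s] for p in range(n))
                 for s in range(n)]
    return math.log(sum(alpha))

# Two toy models over symbols {0, 1}: "background" mostly emits 0,
# "seizure" mostly emits 1. Classify by the higher sequence likelihood.
bg = dict(trans=[[1.0]], emit=[[0.9, 0.1]], init=[1.0])
sz = dict(trans=[[1.0]], emit=[[0.2, 0.8]], init=[1.0])
obs = [1, 1, 0, 1, 1, 1]
label = "seizure" if forward_loglik(obs, **sz) > forward_loglik(obs, **bg) else "background"
print(label)
```

Because the score accumulates over the whole sequence, a single atypical feature vector (the lone 0 above) does not flip the decision, which is the point of exploiting temporal context at the classifier level.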
Abstract:
Laser micromachining is an important material processing technique used in industry and medicine to produce parts with high precision. Control of the material removal process is imperative to obtain the desired part with minimal thermal damage to the surrounding material. Longer-pulsed lasers, with pulse durations of milli- and microseconds, are used primarily for laser through-cutting and welding. In this work, a two-pulse sequence using microsecond pulse durations is demonstrated to achieve consistent material removal during percussion drilling when the delay between the pulses is properly defined. The light-matter interaction moves from a regime of surface morphology changes to melt and vapour ejection. Inline coherent imaging (ICI), a broadband, spatially coherent imaging technique, is used to monitor the ablation process. The pulse parameter space is explored and the key regimes are determined. Material removal is observed when the pulse delay is on the order of the pulse duration. ICI is also used to directly observe the ablation process. Melt dynamics are characterized by monitoring surface changes during and after laser processing at several positions in and around the interaction region. Ablation is enhanced when the melt has time to flow back into the hole before the interaction with the second pulse begins. A phenomenological model is developed to understand the relationship between material removal and pulse delay. Based on melt refilling the interaction region, described by logistic growth, and heat loss, described by exponential decay, the model is fit to several datasets. The fit parameters reflect the pulse energies and durations used in the ablation experiments. For pulse durations of 50 µs with pulse energies of 7.32 mJ ± 0.09 mJ, the logistic growth component of the model reaches half maximum after 8.3 µs ± 1.1 µs and the exponential decays with a time constant of 64 µs ± 15 µs. 
The phenomenological model offers an interpretation of the material removal process.
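The shape of such a model can be sketched directly: material removal as a function of inter-pulse delay, combining a logistic "melt refill" term with an exponential "heat loss" term. The functional form, the way the two terms are combined (here, a product), and the logistic steepness are assumptions for illustration; only the half-maximum (~8.3 µs) and decay constant (~64 µs) come from the abstract.

```python
import math

def removal(delay_us, t_half=8.3, k=1.0, tau=64.0):
    """Relative material removal vs. inter-pulse delay (arbitrary units)."""
    refill = 1.0 / (1.0 + math.exp(-k * (delay_us - t_half)))  # logistic melt refill
    cooling = math.exp(-delay_us / tau)                        # exponential heat loss
    return refill * cooling

# Removal is small at short delays (melt has not refilled the hole),
# peaks at intermediate delays, and falls off as the workpiece cools.
best = max(range(0, 200), key=lambda t: removal(float(t)))
print(best)
```

The competition between the two terms reproduces the qualitative finding that removal is enhanced when the delay is long enough for melt to flow back but not so long that the heat deposited by the first pulse is lost.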
Abstract:
In Model-Driven Engineering (MDE), a developer creates a model using a language such as the Unified Modeling Language (UML) or UML for Real-Time (UML-RT) and uses tools such as Papyrus or Papyrus-RT to generate code from that model. Tracing allows developers to gain insight into their application as it runs, such as which events occur and their timing. We add monitoring capabilities to models created in UML-RT with Papyrus-RT, using the Linux Trace Toolkit: next generation (LTTng). The implementation requires changing the code generator to add, to the generated code, tracing statements for the events the user wants to monitor. We also change the makefile to automate the build process, and we create an Extensible Markup Language (XML) file that allows developers to view their traces visually in Trace Compass, an Eclipse-based trace viewing tool. Finally, we validate our results using three models that we create and trace.
Abstract:
This paper presents a vision that combines model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification that describes how input is provided to the component, how run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open source tools to increase impact. A preliminary prototype implementation is described.
Abstract:
There have been over 3000 bridge weigh-in-motion (B-WIM) installations in 25 countries worldwide, which has led to vast improvements in the post-processing of B-WIM systems since their introduction in the 1970s. This paper introduces a new low-power B-WIM system using fibre optic sensors (FOS). The system consists of a series of FOS attached to the soffit of an existing integral bridge with a single span of 19 m. The site selection criteria and the full installation process are detailed in the paper. The system was calibrated using live traffic at the bridge site, and its accuracy was determined on the basis of this calibration.
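The calibration step described above can be illustrated with a minimal sketch: pre-weighed vehicles crossing the bridge yield strain records, and a single factor maps the area under each record to gross vehicle weight. The trapezoidal integration, the single-factor calibration, and all numbers below are illustrative assumptions, not the paper's actual algorithm.

```python
def strain_area(samples, dt):
    """Area under a strain-time record (trapezoidal rule)."""
    return sum((samples[i] + samples[i + 1]) * dt / 2.0
               for i in range(len(samples) - 1))

def calibrate(runs):
    """runs: (strain_area, known_gross_weight) pairs from pre-weighed
    calibration vehicles; returns the mean weight-per-area factor."""
    return sum(w / a for a, w in runs) / len(runs)

# Two hypothetical calibration runs, then a weight estimate for a new
# vehicle from its (toy) FOS strain record.
factor = calibrate([(10.0, 300.0), (12.0, 360.0)])
signal = [0.0, 1.0, 1.0, 0.0]
print(factor * strain_area(signal, dt=1.0))  # estimated gross weight
```

Calibrating against live traffic of known weight, as the paper does, folds the bridge's real influence line and sensor response into the factor without needing a structural model.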
Abstract:
Because the authors both did work on the Northern Ireland parades, they became integrally involved as fieldworking anthropologists in the monitoring of these events and in the creation of policy for their management. They detail how they worked with individuals and groups at every level, from protestors on the street up to the Secretary of State for the region. Later funded to examine legal and policing approaches to protests in other countries, especially South Africa, they show how they used this comparative knowledge to urge the implementation of measures that appear to have led to a diminution of violence in the parades. Finally, they assess their own contribution to the peace process in terms of contingency, timing, luck, flexibility, and industry.
Abstract:
This paper addresses the two opposing extremes of standardisation in franchising and the dynamics of sales in search of a juncture point, in order to reduce franchisees’ uncertainty in sales and improve sales performance. A conceptual framework is developed from both theory and practice to investigate the sales process of a specific franchise network. The research was conducted over a period of six weeks in the form of a customised sales report based on the sales funnel concept and performance indicators along the sales process. The quantitative data received are analysed through descriptive statistics and logistic regressions with respect to what variations in the sales process can be discovered and which practices yield higher performance. The results indicate an advantage of a prioritisation guideline for a salesperson’s activities and choices over strict standardisation. Defining the sales funnel and engaging in the monitoring of sales has itself proven to be a way of reducing uncertainty, as franchisor and franchisees alike gain a greater understanding of the process. The knowledge gained from this research has both practical and theoretical implications and expands the knowledge on the standardisation of sales and on the appropriateness of the sales funnel and its management for dealing with the dilemma between standardisation and flexibility of sales in franchising contexts.
Abstract:
BACKGROUND: The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. OBJECTIVE: The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. DESIGN: To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test-retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland-Altman plots). RESULTS: In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test-retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5-0.7), demonstrating that the instrument has an acceptable level of stability. CONCLUSIONS: This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. 
There is, however, a need to rephrase and add generic examples to clarify some items and to further review items with low ICC.
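Of the two item-level reliability measures named in the abstract, percent agreement is the simpler and can be sketched directly: the share of respondents who give the identical answer at test and retest. The Likert responses below are invented; a full ICC would additionally decompose between- and within-respondent variance, which is not shown here.

```python
def percent_agreement(test, retest):
    """Share (%) of respondents giving identical answers at test and retest."""
    same = sum(1 for a, b in zip(test, retest) if a == b)
    return 100.0 * same / len(test)

# Toy 5-point responses for one item from eight respondents at two time points.
t1 = [4, 5, 3, 4, 2, 4, 5, 3]
t2 = [4, 5, 3, 3, 2, 4, 5, 2]
print(percent_agreement(t1, t2))  # 75.0
```

Percent agreement is easy to interpret but ignores how far apart disagreeing answers are, which is one reason the study reports ICC alongside it.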
Abstract:
There is an increasing emphasis on the restoration of ecosystem services as well as of biodiversity, especially where restoration projects are planned at a landscape scale. This increase in the diversity of restoration aims has a number of conceptual and practical implications for the way that restoration projects are monitored and evaluated. Landscape-scale projects require monitoring of not only ecosystem services and biodiversity but also of ecosystem processes since these can underpin both. Using the experiences gained at a landscape-scale wetland restoration project in the UK, we discuss a number of issues that need to be considered, including the choice of metrics for monitoring ecosystem services and the difficulties of assessing the interactions between ecosystem processes, biodiversity, and ecosystem services. Particular challenges that we identify, using two pilot data sets, include the decoupling of monetary metrics used for monitoring ecosystem services from biophysical change on the ground and the wide range of factors external to a project that influence the monitoring results. We highlight the fact that the wide range of metrics necessary to evaluate the ecosystem service, ecosystem process, and biodiversity outcomes of landscape-scale projects presents a number of practical challenges, including the need for high levels of varied expertise, high costs, incommensurate monitoring outputs, and the need for careful management of monitoring results, especially where they may be used in making decisions about the relative importance of project aims.