939 results for process parameter monitoring


Relevance: 30.00%

Abstract:

Laser micromachining is an important material processing technique used in industry and medicine to produce parts with high precision. Control of the material removal process is imperative to obtain the desired part with minimal thermal damage to the surrounding material. Longer-pulsed lasers, with pulse durations of milliseconds and microseconds, are used primarily for laser through-cutting and welding. In this work, a two-pulse sequence using microsecond pulse durations is demonstrated to achieve consistent material removal during percussion drilling when the delay between the pulses is appropriately chosen. The light-matter interaction moves from a regime of surface morphology changes to melt and vapour ejection. Inline coherent imaging (ICI), a broadband, spatially coherent imaging technique, is used to monitor the ablation process. The pulse parameter space is explored and the key regimes are determined. Material removal is observed when the pulse delay is on the order of the pulse duration. ICI is also used to directly observe the ablation process. Melt dynamics are characterized by monitoring surface changes during and after laser processing at several positions in and around the interaction region. Ablation is enhanced when the melt has time to flow back into the hole before the interaction with the second pulse begins. A phenomenological model is developed to understand the relationship between material removal and pulse delay. Based on melt refilling the interaction region, described by logistic growth, and heat loss, described by exponential decay, the model is fit to several datasets. The fit parameters reflect the pulse energies and durations used in the ablation experiments. For pulse durations of 50 µs with pulse energies of 7.32 ± 0.09 mJ, the logistic growth component of the model reaches half maximum after 8.3 ± 1.1 µs and the exponential component decays with a time constant of 64 ± 15 µs. The phenomenological model offers an interpretation of the material removal process.
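
As a rough illustration of how such a model could be fit, the sketch below combines a logistic refill term with an exponential heat-loss term and fits both to removal-versus-delay data. The functional form (a product of the two terms), the parameter names, and the data points are assumptions for illustration, not the authors' exact model or measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def removal_model(delay, a, t_half, tau):
    """Illustrative two-factor model of material removal vs. inter-pulse delay:
    a logistic term for melt refilling the interaction region (half maximum at
    t_half) multiplied by an exponential heat-loss term (time constant tau)."""
    refill = 1.0 / (1.0 + np.exp(-(delay - t_half)))  # logistic growth
    heat = np.exp(-delay / tau)                       # exponential heat loss
    return a * refill * heat

# Hypothetical removal depths vs. pulse delay (microseconds)
delays = np.array([1, 2, 5, 8, 12, 20, 40, 80, 160], dtype=float)
depths = np.array([0.1, 0.3, 1.2, 2.0, 2.4, 2.1, 1.4, 0.6, 0.2])

popt, _ = curve_fit(removal_model, delays, depths, p0=[5.0, 8.0, 64.0])
a, t_half, tau = popt
print(f"half-maximum refill delay: {t_half:.1f} us, heat-loss time constant: {tau:.0f} us")
```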

Relevance: 30.00%

Abstract:

In Model-Driven Engineering (MDE), the developer creates a model in a language such as the Unified Modeling Language (UML) or UML for Real-Time (UML-RT), and tools such as Papyrus or Papyrus-RT generate code from that model. Tracing gives developers insight into their application as it runs, such as which events occur and their timing. We add monitoring capabilities, using the Linux Trace Toolkit: next generation (LTTng), to models created in UML-RT using Papyrus-RT. The implementation requires changing the code generator so that it inserts tracing statements into the generated code for the events that the user wants to monitor. We also change the makefile to automate the build process, and we create an Extensible Markup Language (XML) file that allows developers to view their traces visually in Trace Compass, an Eclipse-based trace viewing tool. Finally, we validate our results using three models that we create and trace.
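
The generated UML-RT code is instrumented with LTTng trace statements by the modified generator. As a loose analogue in Python, LTTng's Python agent (lttngust) forwards standard logging calls to an LTTng session, so an inserted trace statement might look like the sketch below; the event fields and names here are hypothetical, not the thesis's instrumentation.

```python
import logging
import lttngust  # LTTng-UST Python agent: forwards logging records to an LTTng session

logger = logging.getLogger('umlrt.capsule')
logger.setLevel(logging.DEBUG)

def trace_event(capsule, port, signal):
    # One statement like this per monitored UML-RT event, analogous to what
    # the modified code generator would emit into the generated sources.
    logger.debug('event capsule=%s port=%s signal=%s', capsule, port, signal)

trace_event('PingPong', 'pingPort', 'ping')
```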

Relevance: 30.00%

Abstract:

This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open source tools to increase impact. A preliminary prototype implementation is described.
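
A minimal sketch of what such a configuration specification might look like, and how a generator could consume it; the schema, keys, and component names are invented for illustration, since the paper does not fix a concrete format.

```python
# Hypothetical configuration specification: how input reaches the component,
# which run-time events are observable, and which tool consumes each observation.
config = {
    "component": "CruiseControl",
    "inputs": [{"port": "setSpeed", "source": "test-script"}],
    "observations": [
        {"event": "state-entry", "emit": "log"},
        {"event": "transition", "emit": "animation"},
    ],
    "animation": {"tool": "state-diagram-view", "update_rate_ms": 100},
}

def generate_hooks(config):
    """Sketch: emit one observation hook per configured event."""
    for obs in config["observations"]:
        print(f"// hook: {obs['event']} -> {obs['emit']}")

generate_hooks(config)
```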

Relevance: 30.00%

Abstract:

There have been over 3000 bridge weigh-in-motion (B-WIM) installations in 25 countries worldwide, which has led to vast improvements in the post-processing of B-WIM systems since their introduction in the 1970s. This paper introduces a new low-power B-WIM system using fibre optic sensors (FOS). The system consisted of a series of FOS attached to the soffit of an existing integral bridge with a single span of 19 m. The site selection criteria and the full installation process are detailed in the paper. A calibration method using live traffic at the bridge site was adopted, and based on this calibration the accuracy of the system was determined.
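
B-WIM systems commonly recover axle weights by least-squares fitting of the measured bridge response against an influence line (the Moses approach); the sketch below illustrates that idea with a triangular influence line for a 19 m span. The vehicle, speed, and noise level are hypothetical, and the paper's actual post-processing may differ.

```python
import numpy as np

def influence_line(x, span=19.0):
    # Simple triangular midspan influence line for a 19 m single span
    return np.where((x >= 0) & (x <= span), span / 2 - np.abs(x - span / 2), 0.0)

speed = 20.0                    # m/s, from axle-detection timing
axle_offsets = [0.0, 5.0]       # m, two-axle vehicle (hypothetical)
t = np.linspace(0.0, 1.5, 300)  # s
x_front = speed * t             # position of the leading axle on the bridge

# Design matrix: one column of influence ordinates per axle
A = np.column_stack([influence_line(x_front - s) for s in axle_offsets])
true_weights = np.array([60.0, 90.0])  # kN, hypothetical axle weights
strain = A @ true_weights + np.random.default_rng(1).normal(0.0, 2.0, t.size)

weights, *_ = np.linalg.lstsq(A, strain, rcond=None)
print(weights)  # recovers roughly [60, 90] kN
```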

Relevance: 30.00%

Abstract:

Because the authors both did work on the Northern Ireland parades, they became integrally involved as fieldworking anthropologists in the monitoring of these events and in the creation of policy for their management. They detail how they worked with individuals and groups at every level, from protestors on the street up to the Secretary of State for the region. Later funded to examine legal and policing approaches to protests in other countries, especially South Africa, they show how they used this comparative knowledge to urge the implementation of measures which appear to have led to a diminution of violence in the parades. Finally, they assess their own contribution to the peace process in terms of contingency, timing, luck, flexibility, and industry.

Relevance: 30.00%

Abstract:

Motivated by environmental protection concerns, monitoring of the flue gas of thermal power plants is now often mandatory to ensure that emission levels stay within safe limits. Optical gas sensing systems are increasingly employed for this purpose, with regression techniques used to relate gas optical absorption spectra to the concentrations of specific gas components of interest (NOx, SO2, etc.). Accurately predicting gas concentrations from absorption spectra remains a challenging problem due to nonlinearities in the relationships and the high-dimensional, correlated nature of the spectral data. This article proposes a generalized fuzzy linguistic model (GFLM) to address this challenge. The GFLM is made up of a series of "If-Then" fuzzy rules. The absorption spectra are the input variables in the rule antecedents, and each rule consequent is a general nonlinear polynomial function of the absorption spectra. Model parameters are estimated using least squares and gradient descent optimization algorithms. The performance of the GFLM is compared with traditional prediction models, such as partial least squares, support vector machines, multilayer perceptron neural networks, and radial basis function networks, on two real flue gas spectral datasets: one from a coal-fired power plant and one from a gas-fired power plant. The experimental results show that the generalized fuzzy linguistic model has good predictive ability and is competitive with the alternative approaches, while having the added advantage of providing an interpretable model.
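
A minimal sketch of a fuzzy model with polynomial rule consequents, in the spirit of the GFLM described above; the Gaussian memberships, two-rule base, and single spectral feature are illustrative simplifications rather than the authors' formulation.

```python
import numpy as np

def gaussian_mf(x, centre, width):
    # Membership of the absorbance feature in a fuzzy set
    return np.exp(-((x - centre) ** 2) / (2 * width ** 2))

def predict(absorbance, rules):
    # rules: list of (centre, width, poly_coeffs); each consequent is a
    # polynomial in the absorbance, weighted by the rule firing strength
    strengths = np.array([gaussian_mf(absorbance, c, w) for c, w, _ in rules])
    outputs = np.array([np.polyval(p, absorbance) for _, _, p in rules])
    return np.dot(strengths, outputs) / strengths.sum()

rules = [
    (0.2, 0.10, [4.0, 1.0, 0.5]),  # IF absorbance is LOW  THEN conc = polynomial(...)
    (0.6, 0.15, [2.0, 3.0, 1.0]),  # IF absorbance is HIGH THEN conc = polynomial(...)
]
print(predict(0.35, rules))  # predicted gas concentration (arbitrary units)
```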

Relevance: 30.00%

Abstract:

BACKGROUND: The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. OBJECTIVE: The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. DESIGN: To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test-retest survey among 77 respondents. Reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland-Altman plots). RESULTS: In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified, and the content of three dimensions was perceived to be of a sensitive nature. In the test-retest survey, two-thirds of the items and seven of the eight dimensions were found to have an agreement ICC ranging from moderate to substantial (0.5-0.7), demonstrating that the instrument has an acceptable level of stability. CONCLUSIONS: This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify some items, and to further review items with low ICC.
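
A sketch of how a test-retest agreement ICC could be computed per dimension, here with the pingouin library's two-way absolute-agreement ICC (ICC2); the toy data frame, column names, and three respondents are invented for illustration.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format scores for one COACH dimension on two occasions
long = pd.DataFrame({
    "respondent": [1, 1, 2, 2, 3, 3],
    "occasion":   ["test", "retest"] * 3,
    "score":      [4.0, 4.2, 3.1, 3.0, 4.8, 4.5],
})

icc = pg.intraclass_corr(data=long, targets="respondent",
                         raters="occasion", ratings="score")
# ICC2 = two-way random effects, absolute agreement, single measurement
print(icc.set_index("Type").loc["ICC2", ["ICC", "CI95%"]])
```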

Relevance: 30.00%

Abstract:

There is an increasing emphasis on the restoration of ecosystem services as well as of biodiversity, especially where restoration projects are planned at a landscape scale. This increase in the diversity of restoration aims has a number of conceptual and practical implications for the way that restoration projects are monitored and evaluated. Landscape-scale projects require monitoring not only of ecosystem services and biodiversity but also of ecosystem processes, since these can underpin both. Using the experience gained at a landscape-scale wetland restoration project in the UK, we discuss a number of issues that need to be considered, including the choice of metrics for monitoring ecosystem services and the difficulties of assessing the interactions between ecosystem processes, biodiversity, and ecosystem services. Particular challenges that we identify, using two pilot data sets, include the decoupling of the monetary metrics used for monitoring ecosystem services from biophysical change on the ground, and the wide range of factors external to a project that influence the monitoring results. We highlight the fact that the wide range of metrics necessary to evaluate the ecosystem service, ecosystem process, and biodiversity outcomes of landscape-scale projects presents a number of practical challenges, including the need for high levels of varied expertise, high costs, incommensurate monitoring outputs, and the need for careful management of monitoring results, especially where they may be used in making decisions about the relative importance of project aims.

Relevance: 30.00%

Abstract:

Pavements tend to deteriorate over time under repeated traffic and/or environmental loading. By detecting pavement distress and damage early enough, transportation agencies can develop more effective pavement maintenance and rehabilitation programs and thereby achieve significant cost and time savings. The structural health monitoring (SHM) concept can be considered a systematic method for assessing the structural state of pavement infrastructure systems and documenting their condition. To date, this process has typically been accomplished through wired sensors embedded in bridge and highway pavement. However, wired sensors have limitations for long-term SHM and present associated cost and safety concerns. Recently, micro-electromechanical sensors and systems (MEMS) and nano-electromechanical systems (NEMS) have emerged as advanced smart-sensing technologies with potential for cost-effective, long-term SHM. This two-pronged study evaluated the performance of commercial off-the-shelf (COTS) MEMS sensors embedded in concrete pavement (Final Report Volume I) and developed a wireless MEMS multifunctional sensor system for health monitoring of concrete pavement (Final Report Volume II).

Relevance: 30.00%

Abstract:

Mobile network coverage is traditionally provided by outdoor macro base stations, which have a long range and serve large numbers of customers. Due to modern passive houses and tightening construction legislation, mobile network service has deteriorated in many indoor locations. Typical solutions to the indoor coverage problem are expensive and demand action from the mobile operator, so better solutions are constantly being researched. The solution presented in this thesis is based on Small Cell technology. Small Cells are low-power access nodes designed to provide voice and data services. This thesis concentrates on a specific Small Cell solution called a Pico Cell. The problem with Pico Cells, and Small Cells in general, is that they are a new technological solution for the mobile operator, and the possible problem sources and incidents have not been properly mapped. The purpose of this thesis is to identify the possible problems in Pico Cell deployment and how they could be solved within the operator's incident management process. The research is carried out through a literature review and a case study, and the possible problems are investigated through lab testing. The automated Pico Cell deployment process was tested in the lab environment and its proper functionality was confirmed. The related network elements were also tested and examined, and the problems that emerged are resolvable. The operator's existing incident management process can be used for Pico Cell troubleshooting with minor updates, although certain prerequisites have to be met before Pico Cell deployment can be considered. The main contribution of this thesis is the Pico Cell integrated incident management process. The presented solution works in theory and solves the problems found during lab testing. The limitations at the customer service level were addressed by adding the necessary tools and by designing a working question pattern. Process structures for automated network discovery and Pico Cell-specific radio parameter planning were also added to the mobile network management layer.

Relevance: 30.00%

Abstract:

This thesis studies the field of asset price bubbles. It comprises three independent chapters, each of which directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubble assumed in each chapter is consistent with rational expectations; such price bubbles are known as rational bubbles in the literature. The three chapters are as follows. Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous-agent endowment economy asset pricing model with risky housing, endogenous collateral, and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted on in equilibrium. We analytically derive the leverage, or endogenous loan-to-value ratio; this variable comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit easing effect that encourages excess leverage and generates credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage, and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence. Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market, accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use semi-parametric (Whittle) and parametric (ARFIMA) procedures that are consistent under a variety of residual biases to estimate the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, which are robust to non-normality and heteroskedastic errors, identify far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels. In the third and final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their life cycle? We answer this question by employing a life-cycle consumption-portfolio choice model with housing, labour income, and time-varying predictable returns, in which agents face a borrowing constraint. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return-predictability case with that of IID returns. We successfully resolve the puzzles, finding equity holdings and participation rates close to the data, and we find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation.
High factor (dividend-price ratio) realizations and high persistence of the factor process, indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being most severe for investors in the later years of the life cycle. Furthermore, investors facing time-varying returns (return predictability) hedged background risks significantly better than those facing IID returns.
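
For concreteness, a semi-parametric (local Whittle) estimate of the long memory parameter d can be computed directly from the periodogram, as in the sketch below; the bandwidth rule and the simulated random-walk check are standard illustrative choices, not the thesis's exact specification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle_d(x, m=None):
    """Robinson-style local Whittle estimate of the long memory parameter d."""
    n = len(x)
    m = m or int(n ** 0.65)                 # common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the first m Fourier frequencies
    dft = np.fft.fft(x - np.mean(x))[1:m + 1]
    I = (np.abs(dft) ** 2) / (2 * np.pi * n)

    def objective(d):
        # R(d) = log G_hat(d) - 2d * mean(log lambda_j)
        return np.log(np.mean(freqs ** (2 * d) * I)) - 2 * d * np.mean(np.log(freqs))

    return minimize_scalar(objective, bounds=(-0.49, 1.49), method="bounded").x

x = np.cumsum(np.random.default_rng(0).normal(size=2000))  # random walk
print(local_whittle_d(x))  # expect an estimate near d = 1
```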

Relevance: 30.00%

Abstract:

Hydrometallurgical process modeling is the main objective of this Master's thesis. Three leaching processes, namely high-pressure pyrite oxidation, direct oxidation leaching of zinc concentrate (sphalerite), and gold chloride leaching using a rotating disc electrode (RDE), are modeled and simulated using the gPROMS process simulation program in order to evaluate its model-building capabilities. The leaching mechanism in each case is described by a shrinking core model. The mathematical modeling included process model development based on the available literature, estimation of the reaction kinetic parameters, and assessment of model reliability by checking the goodness of fit and the cross-correlation between the estimated parameters through correlation matrices. The estimated parameter values in each case were compared with those obtained using the Modest simulation program. Further, based on the estimated reaction kinetic parameters, reactor simulation and modeling of direct oxidation zinc concentrate (sphalerite) leaching is carried out in Aspen Plus V8.6. The zinc leaching autoclave is based on the Cominco reactor configuration and is modeled as a series of continuous stirred-tank reactors (CSTRs). The sphalerite conversion is calculated and a sensitivity analysis is carried out to determine the optimum reactor operating temperature and oxygen mass flow rate. In this way, the implementation of reaction kinetic models into a process flowsheet simulation environment is demonstrated.
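
As a minimal illustration of the shrinking core formulation and the kinetic parameter estimation step, the sketch below fits the surface-reaction-controlled form 1 - (1 - X)^(1/3) = k t to conversion-time data; the data points and the chosen rate-controlling step are assumptions, not values from the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

def shrinking_core_reaction(t, k):
    # Surface reaction control: 1 - (1 - X)^(1/3) = k*t  =>  X(t) = 1 - (1 - k*t)^3
    return 1.0 - (1.0 - np.clip(k * t, 0.0, 1.0)) ** 3

t = np.array([0, 10, 20, 40, 60, 90, 120], dtype=float)   # time, min
X = np.array([0.0, 0.12, 0.23, 0.41, 0.55, 0.71, 0.82])   # fractional conversion

(k_fit,), pcov = curve_fit(shrinking_core_reaction, t, X, p0=[0.005])
print(f"rate constant k = {k_fit:.4f} 1/min, std = {np.sqrt(pcov[0, 0]):.1e}")
```

With more than one kinetic parameter, the off-diagonal elements of `pcov`, normalized to a correlation matrix, give the cross-correlation check the thesis describes.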

Relevance: 30.00%

Abstract:

Many maritime countries in Europe have implemented marine environmental monitoring programmes which include the measurement of chemical contaminants and related biological effects. How best to integrate the data obtained from these two types of monitoring into meaningful assessments has been the subject of recent efforts by Expert Groups of the International Council for the Exploration of the Sea (ICES). Work within these groups has concentrated on defining a core set of chemical and biological endpoints that can be used across maritime areas, and on defining confounding factors, supporting parameters, and protocols for measurement. The framework comprises markers for concentrations of, exposure to, and effects from contaminants. Most importantly, assessment criteria for biological effect measurements have been set, and the framework suggests how these measurements can be used in an integrated manner alongside contaminant measurements in biota, sediments, and potentially water. Output from this process resulted in OSPAR Commission (www.ospar.org) guidelines that were adopted in 2012 on a trial basis for a period of 3 years. The developed assessment framework can furthermore provide a suitable approach for the assessment of Good Environmental Status (GES) for Descriptor 8 of the European Union (EU) Marine Strategy Framework Directive (MSFD).

Relevance: 30.00%

Abstract:

Recent developments in automation, robotics, and artificial intelligence have pushed these technologies into wider use in recent years, and driverless transport systems are already state-of-the-art on certain legs of transportation. This has encouraged the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast operational costs, and thus enable comparisons between manned and autonomous cargo vessels; the building process of the model is also described and discussed. Furthermore, the model aims to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data has been collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing (LCC), which enables both realistic cost estimation and forecasting and the identification of critical success factors, owing to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques. As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it could be argued that activity-based life cycle costing is the method with which to conduct cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs in this way, it was argued that the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still using the implementation and model-building steps presented in the existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
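
A toy sketch of the underlying mechanics: activity-based cost items sampled as distributions and aggregated over the vessel's life by Monte Carlo simulation. The activities, distributions, horizon, and discount rate are all invented for illustration and are not the model's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical annual cost distributions per operating activity (EUR)
activities = {
    "remote operation centre": rng.normal(1.2e6, 0.15e6, N),
    "maintenance":             rng.lognormal(13.0, 0.3, N),
    "insurance":               rng.normal(0.4e6, 0.05e6, N),
    "port and fairway fees":   rng.triangular(0.2e6, 0.3e6, 0.5e6, N),
}
annual = sum(activities.values())

# Discounted life cycle cost over an assumed 25-year life at a 5 % rate
years, rate = 25, 0.05
lcc = sum(annual / (1 + rate) ** y for y in range(1, years + 1))

print(f"median LCC: {np.median(lcc) / 1e6:.0f} MEUR, "
      f"P10-P90: {np.percentile(lcc, 10) / 1e6:.0f}-{np.percentile(lcc, 90) / 1e6:.0f} MEUR")
```

Per-activity percentiles computed the same way would point to the cost drivers, which is how a simulation like this can surface critical success factors.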

Relevance: 30.00%

Abstract:

Marine Recirculating Aquaculture Systems (RAS) produce large volumes of wastewater, which may be recirculated, reutilized after undergoing different treatment/remediation methods, or partly discharged into neighbouring water bodies (DWW). Phosphates, in particular, usually accumulate at high concentrations in DWW, both because their monitoring is not compulsory for fish production (phosphate is not a limiting parameter) and because no specific treatment has so far been developed to remove them, especially from saltwater effluents. As such, this work addresses two main scientific questions. The first regards understanding the (bio)remediation methods currently applied to effluents produced in marine RAS, by identifying their advantages, drawbacks, and gaps concerning their exploitation in saltwater effluents. The second is the development of a new, innovative, and efficient method for the treatment of saltwater effluents that potentially fills the gaps identified in the conventional treatments. Thereby, the aims of this thesis are: (i) to review the conventional treatments targeting major contaminants in marine RAS effluents, with a particular focus on the bioremediation approaches already conducted for phosphates; and (ii) to characterize and evaluate the potential of oyster-shell waste collected in Ria de Aveiro as a bioremediation agent for phosphates spiked into artificial saltwater, across different influencing factors (e.g., oyster-shell pre-treatment through calcination, particle size, adsorbent concentration). Although oyster shells have already been applied for phosphorus (P) removal in freshwater, their biosorptive potential for P in saltwater has, as far as I am aware, never been evaluated. The results generated herein showed that natural oyster shell (NOS) is mainly composed of carbonates, which are almost completely converted into lime (CaO) after calcination, yielding calcined oyster shell (COS). This pre-treatment produces a more reactive material for P removal, since higher removal percentages and adsorption capacities were observed for COS. Smaller particle size fractions of both NOS and COS also increased P removal. Kinetic modelling showed that NOS adsorption followed the Elovich and intraparticle diffusion kinetic models simultaneously, suggesting that P removal is both a diffusional and a chemically rate-controlled process. P removal by COS was not controlled by intraparticle diffusion, and the Elovich model was the kinetic model that best fitted phosphate removal. This work demonstrated that waste oyster shells, either NOS or COS, can be used as an effective biosorbent for P removal from seawater. Thereby, this biomaterial can sustain a cost-effective and eco-friendly bioremediation strategy with potential application in marine RAS.
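
A minimal sketch of fitting the two kinetic models named above to phosphate uptake data; the uptake values, units, and initial guesses are illustrative, not the thesis's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def elovich(t, alpha, beta):
    # Elovich: q_t = (1/beta) * ln(1 + alpha*beta*t)
    return np.log(1 + alpha * beta * t) / beta

def intraparticle(t, k_id, C):
    # Weber-Morris intraparticle diffusion: q_t = k_id * sqrt(t) + C
    return k_id * np.sqrt(t) + C

t = np.array([5, 15, 30, 60, 120, 240, 480], dtype=float)  # contact time, min
q = np.array([0.8, 1.4, 1.9, 2.4, 2.9, 3.3, 3.7])          # P uptake, mg/g (hypothetical)

for model, p0, label in [(elovich, [0.5, 1.0], "Elovich"),
                         (intraparticle, [0.2, 0.5], "intraparticle diffusion")]:
    popt, _ = curve_fit(model, t, q, p0=p0, maxfev=10000)
    sse = np.sum((q - model(t, *popt)) ** 2)
    print(f"{label}: params = {popt}, SSE = {sse:.3f}")
```

Comparing the residual sums of squares (or an information criterion) across the two fits is one simple way to decide which kinetic model best describes a given sample, mirroring the NOS-versus-COS comparison above.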