966 results for "Monitoring process"


Relevance:

100.00%

Publisher:

Abstract:

Remote monitoring of a power boiler allows the supplying company to verify that the equipment is used as intended and offers a good opportunity for process optimization. This improves co-operation between the supplier and the customer and builds trust that helps secure future contracts. Remote monitoring is already in use with recovery boilers, but the goal is to expand it, especially to biomass-fired BFB boilers. To make remote monitoring possible, data has to be measured reliably on site, and the link between the power plant and the supplying company's server has to work reliably. Data can be gathered either with the supplier's sensors or, if the plant in question was not originally built by the supplying company, with the measurement instruments originally installed in the power plant. The main goals of remote monitoring are process optimization and the avoidance of unnecessary accidents. This can be achieved, for instance, by following the efficiency curves and fouling in different parts of the process and comparing them to past values. The final number of calculations depends on the amount of data gathered. Sudden changes in efficiency or fouling require further attention, and in such cases it is important that communication with the power plant in question also works.
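The comparison of current efficiency against past values described above can be sketched as a simple rolling-baseline check. This is an illustrative sketch only, not the supplier's actual system; the function name, window length and drop threshold are assumptions.

```python
# Illustrative sketch: flag a sudden drop in boiler efficiency against a
# rolling baseline of recent readings. The window length and threshold are
# hypothetical tuning parameters, not values from the study.
def efficiency_alerts(readings, window=5, drop_threshold=0.03):
    """Return indices of readings that fall more than drop_threshold
    below the mean of the preceding `window` readings."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if baseline - readings[i] > drop_threshold:
            alerts.append(i)
    return alerts

# Six stable readings followed by a sudden efficiency drop:
print(efficiency_alerts([0.88, 0.88, 0.88, 0.88, 0.88, 0.88, 0.83]))
```

The same pattern could be applied to fouling indicators; a triggered alert would then prompt the dialogue with the power plant mentioned above.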

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, an X̄-chart is used to control the process mean and an R-chart to control the process variance. However, these charts are not sensitive to small changes in process parameters. A good alternative to these charts is the exponentially weighted moving average (EWMA) control chart for controlling the process mean and variability, which is very effective in detecting small process disturbances. In this paper, we propose a single chart based on the non-central chi-square statistic, which is more effective than the joint X̄ and R charts in detecting assignable cause(s) that change the process mean and/or increase its variability. It is also shown that the EWMA control chart based on the non-central chi-square statistic is more effective in detecting both increases and decreases in the mean and/or variability.
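As a rough illustration of why the EWMA chart is sensitive to small disturbances, the following sketch computes the standard EWMA statistic z_i = λx_i + (1−λ)z_{i−1} with its exact time-varying control limits. The data, the smoothing constant λ = 0.2 and the limit width L = 3 are common textbook choices, not values from the paper.

```python
# Minimal EWMA control-chart sketch. Parameters lam (smoothing constant)
# and L (limit width) are common textbook defaults, not the paper's values.
def ewma_chart(samples, mu0, sigma, lam=0.2, L=3.0):
    """Return (z, lcl, ucl, signal) for each observation."""
    z = mu0
    out = []
    for i, x in enumerate(samples, start=1):
        z = lam * x + (1 - lam) * z
        # Exact (time-varying) control limits for the EWMA statistic.
        half_width = L * sigma * (lam / (2 - lam) * (1 - (1 - lam) ** (2 * i))) ** 0.5
        lcl, ucl = mu0 - half_width, mu0 + half_width
        out.append((z, lcl, ucl, z < lcl or z > ucl))
    return out

# A small sustained upward shift that the EWMA accumulates until it signals:
data = [10.1, 10.4, 10.3, 10.6, 10.5, 10.7, 10.6, 11.0]
for z, lcl, ucl, signal in ewma_chart(data, mu0=10.0, sigma=0.5):
    print(f"z={z:.3f}  limits=({lcl:.3f}, {ucl:.3f})  signal={signal}")
```

Because each z carries a weighted memory of past observations, a shift too small for a Shewhart chart to flag on any single point eventually pushes z across the limit.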

Relevance:

80.00%

Publisher:

Abstract:

Purpose - The aim of this paper is to present a synthetic chart based on the non-central chi-square statistic that is operationally simpler and more effective than the joint X̄ and R charts in detecting assignable cause(s). The chart also assists in identifying which parameter (mean or variance) changed when an assignable cause occurred. Design/methodology/approach - The approach is based on the non-central chi-square statistic, and the steady-state average run length (ARL) of the developed chart is evaluated using a Markov chain model. Findings - The proposed chart always detects process disturbances faster than the joint X̄ and R charts. Originality/value - The most important advantage of the proposed chart is that practitioners can monitor the process by looking at only one chart instead of two separate charts. © Emerald Group Publishing Limited.
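The general idea behind a single statistic that reacts to both a mean shift and a variance increase can be sketched as follows: shifting each observation by a fixed offset before squaring makes the sum grow under either kind of disturbance. The offset ξ and the data below are illustrative assumptions; the actual chart design, control limit and Markov-chain ARL evaluation are in the paper itself.

```python
# Sketch of a non-central chi-square-type statistic: each observation is
# offset by xi*sigma before squaring, so both a shifted mean and an inflated
# variance enlarge T. The offset xi = 1.0 is an illustrative choice.
def ncs_statistic(sample, mu0, sigma, xi=1.0):
    return sum(((x - mu0 + xi * sigma) / sigma) ** 2 for x in sample)

in_control = [0.1, -0.2, 0.0, 0.15]   # near mu0 = 0, sigma = 1
mean_shift = [1.2, 0.9, 1.1, 1.4]     # mean shifted upward
var_shift = [2.0, -2.0, 1.5, -1.5]    # same mean, larger spread

print(ncs_statistic(in_control, 0.0, 1.0))
print(ncs_statistic(mean_shift, 0.0, 1.0))
print(ncs_statistic(var_shift, 0.0, 1.0))
```

In a real chart a signal would be raised when T exceeds a control limit derived from the non-central chi-square distribution; here, both disturbed samples yield a visibly larger T than the in-control one.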

Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND: Infantile haemangiomas (IHs) are very common vascular tumours. Propranolol is currently the first-line treatment for problematic and complicated haemangiomas. In accordance with a Swiss protocol, children are monitored for 2 days at the start of treatment to detect possible side effects of the drug. Our study advocates a simplification of this pretreatment monitoring process. METHODS: All children with a problematic and complicated haemangioma treated with propranolol between September 2009 and September 2012 were included in the study. All patients were hospitalised under constant nursing supervision for 48 hours at the start of treatment and underwent cardiac and blood measurements. The dosage of propranolol was 1 mg/kg/day on the first day and 2 mg/kg/day from the second day onwards. Demographic data, clinical features, treatment outcome and complications were analysed. RESULTS: Twenty-nine infants were included in our study. Of these, 86.2% responded immediately to the treatment. There were no severe adverse reactions. Six patients presented transient side effects such as bradycardia and hypotension after the first dose, and hypoglycaemia later. No side effects occurred after the second dose. Treatment was never interrupted. CONCLUSION: Propranolol (a β-blocker) is a safe treatment for problematic IH. Side effects may occur after the first dose. Strict 48-hour monitoring in hospital is expensive and may be unnecessary as long as the contraindications for the drug are respected.

Relevance:

70.00%

Publisher:

Abstract:

The Internet is becoming more and more popular among drug users. The use of websites and forums to obtain illicit drugs and relevant information about the means of consumption is a growing phenomenon mainly for new synthetic drugs. Gamma Butyrolactone (GBL), a chemical precursor of Gamma Hydroxy Butyric acid (GHB), is used as a "club drug" and also in drug facilitated sexual assaults. Its market takes place mainly on the Internet through online websites but the structure of the market remains unknown. This research aims to combine digital, physical and chemical information to help understand the distribution routes and the structure of the GBL market. Based on an Internet monitoring process, thirty-nine websites selling GBL, mainly in the Netherlands, were detected between January 2010 and December 2011. Seventeen websites were categorized into six groups based on digital traces (e.g. IP addresses and contact information). In parallel, twenty-five bulk GBL specimens were purchased from sixteen websites for packaging comparisons and carbon isotopic measurements. Packaging information showed a high correlation with digital data confirming the links previously established whereas chemical information revealed undetected links and provided complementary information. Indeed, while digital and packaging data give relevant information about the retailers, the supply routes and the distribution close to the consumer, the carbon isotopic data provides upstream information about the production level and in particular the synthesis pathways and the chemical precursors. A three-level structured market has been thereby identified with a production level mainly located in China and in Germany, an online distribution level mainly hosted in the Netherlands and the customers who order on the Internet.

Relevance:

70.00%

Publisher:

Abstract:

Press forming is nowadays one of the most common industrial methods for producing deep trays from paperboard. Demands for material properties such as recyclability and sustainability have increased in the packaging industry as well, but there are still limitations related to the formability of paperboard. The majority of recent studies have focused on material development, but the potential of the package manufacturing process can also be improved through the development of tooling and process control. In this study, advanced converting tools (die cutting tools and the press forming mould) are created for production-scale paperboard tray manufacturing. Monitoring methods that enable the production of paperboard trays with enhanced quality, and that can be utilized in process control, are also developed. The principles of tray blank preparation, including creasing pattern and die cutting tool design, are introduced. The mould heating arrangement and the determination of mould clearance are investigated to improve the quality of the press-formed trays. The effect of the spring-back of the tray walls on the tray dimensions can be managed by adjusting the heat-related process parameters and by estimating it at the mould design stage. This enables production speed optimization, as the process parameters can be adjusted more freely. Real-time monitoring of the pressing force, using multiple force sensors embedded in the mould structure, can be utilized to evaluate material characteristics on modified production machinery. Comprehensive process control can be achieved by combining measurement of the outer dimensions of the trays with pressing force monitoring. This control method enables the detection of defects and the tracking of changes in material properties. The optimized converting tools provide a basis for effective operation of the control system.
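One plausible form of the pressing-force monitoring described above is to compare each embedded sensor's reading against a reference profile recorded during a known-good cycle. This is a hypothetical sketch, not the study's implementation; the function name, sensor values and tolerance are assumptions.

```python
# Hypothetical sketch: flag force sensors whose peak reading deviates from a
# known-good reference cycle by more than a fractional tolerance. Values and
# tolerance are illustrative, not from the study.
def check_press_cycle(forces, reference, tol=0.10):
    """Return indices of sensors deviating more than tol from reference."""
    return [i for i, (f, r) in enumerate(zip(forces, reference))
            if abs(f - r) / r > tol]

reference = [5.0, 5.2, 5.1, 5.0]   # peak force per sensor, good cycle
forces = [5.1, 4.4, 5.0, 5.05]     # current cycle; sensor 1 reads low
print(check_press_cycle(forces, reference))
```

A flagged sensor could indicate a local defect in the blank or a drift in material properties, which is the kind of detection the combined control method above is aiming at.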

Relevance:

70.00%

Publisher:

Abstract:

years 8 months) and 24 older (M = 7 years 4 months) children. A Monitoring Process Model (MPM) was developed and tested in order to ascertain at which component process of the MPM age differences would emerge. The MPM had four components: (1) assessment; (2) evaluation; (3) planning; and (4) behavioural control. The MPM was assessed directly using a referential communication task in which the children were asked to make a series of five Lego buildings (a baseline condition and one building for each MPM component). Children listened to instructions from one experimenter while a second experimenter in the room (a confederate) interjected varying levels of verbal feedback in order to assist the children and control the component of the MPM. This design allowed us to determine at which "stage" of processing children would most likely have difficulty monitoring themselves in this social-cognitive task. Developmental differences were observed for the evaluation, planning and behavioural control components, suggesting that older children were more successful with the more explicit metacomponents. Interestingly, however, there was no age difference in terms of Lego task success in the baseline condition, suggesting that without the intervention of the confederate, younger children monitored the task about as well as older children. This pattern of results indicates that the younger children were disrupted by the feedback rather than helped. On the other hand, the older children were able to incorporate the feedback offered by the confederate into a plan of action. Another aim of this study was to assess similar processing components to those investigated by the MPM Lego task in a more naturalistic observation. Together, the use of the Lego task (a social-cognitive task) and the naturalistic social interaction allowed for the appraisal of cross-domain continuities and discontinuities in monitoring behaviours.
In this vein, analyses were undertaken to ascertain whether successful performance in the MPM Lego task would predict cross-domain competence in the more naturalistic social interchange. Indeed, success in the two latter components of the MPM (planning and behavioural control) was related to overall competence in the naturalistic task. However, this cross-domain prediction was not evident for all levels of the naturalistic interchange, suggesting that the nature of the feedback a child receives is an important determinant of response competency. Individual difference measures reflecting the children's general cognitive capacity (Working Memory and Digit Span) and verbal ability (vocabulary) were also taken in an effort to account for more variance in the prediction of task success. However, these individual difference measures did not enhance the prediction of task performance in either the Lego task or the naturalistic task. Similarly, parental responses to questionnaires pertaining to their child's temperament and social experience also failed to increase the prediction of task performance. On-line measures of the children's engagement, positive affect and anxiety also failed to predict competence ratings.

Relevance:

70.00%

Publisher:

Abstract:

In psycholinguistic research, it is widely assumed that the evaluation of information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g. Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, directly or indirectly challenge this two-step model of comprehension and validation. In particular, findings on Stroop-like stimulus-response compatibility effects that occur when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g. a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009) suggest that readers already perform a non-strategic check of the validity of information during comprehension. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routine, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al.
(2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, suggesting that epistemic monitoring is also sensitive to gradual differences in the fit between information and world knowledge. The results furthermore show that the epistemic Stroop effect is indeed driven by plausibility and not by differences in the predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study showed a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that signal the certainty of information (e.g. certainly or perhaps) was examined. Consistent with the assumption of a fast and non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process.
Overall, the findings argue against a conceptualization of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. Implications for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are discussed.