Oral cancer treatments and adherence: medication event monitoring system assessment for capecitabine
Abstract:
Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (recruitment still ongoing) divided into 4 groups. The aim is to evaluate adherence and to offer these patients interdisciplinary support provided jointly by doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years), monitored for an average duration of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. 5 patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. According to the questionnaire completed at the end of the monitoring period, patients were satisfied with the interviews conducted during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily).
Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify patients with toxicities and help them maintain adherence. Overall, patients were satisfied with the interdisciplinary follow-up. Longer follow-up will allow a better evaluation of our method and its impact. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.
Abstract:
BACKGROUND: Infantile haemangiomas (IHs) are very common vascular tumours. Propranolol is at present the first-line treatment for problematic and complicated haemangioma. In accordance with a Swiss protocol, children are monitored for 2 days at the start of the treatment to detect possible side effects of this drug. Our study advocates a simplification of the pretreatment monitoring process. METHODS: All children with a problematic and complicated haemangioma treated with propranolol between September 2009 and September 2012 were included in the study. All patients were hospitalised under constant nurse supervision for 48 hours at the start of the treatment and subjected to cardiac and blood measurements. The dosage of propranolol was 1 mg/kg/day on the first day and 2 mg/kg/day from the second day. Demographic data, clinical features, treatment outcome and complications were analysed. RESULTS: Twenty-nine infants were included in our study. Of these, 86.2% responded immediately to the treatment. There were no severe adverse reactions. Six patients presented transient side effects such as bradycardia and hypotension after the first dose, and hypoglycaemia later. No side effects occurred after the second dose. Treatment was never interrupted. CONCLUSION: Propranolol (a β-blocker) is a safe treatment for problematic IH. Side effects may occur after the first dose. Strict 48-hour monitoring in hospital is expensive and may be unnecessary as long as the contraindications for the drug are respected.
Abstract:
Proteins can switch between different conformations in response to stimuli, such as pH or temperature variations, or to the binding of ligands. Such plasticity and its kinetics can have a crucial functional role, and their characterization has taken center stage in protein research. For example, topoisomerases are particularly interesting enzymes capable of managing tangled and supercoiled double-stranded DNA, thus facilitating many physiological processes. In this work, we describe the use of a cantilever-based nanomotion sensor to characterize the dynamics of human topoisomerase II (Topo II) enzymes and their response to different kinds of ligands, such as ATP, which enhances the conformational dynamics. The sensitivity and time resolution of this sensor allow us to determine quantitatively the correlation between ATP concentration and the rate of Topo II conformational changes. Furthermore, we show how to rationalize the experimental results in a comprehensive model that takes into account both the physics of the cantilever and the dynamics of the ATPase cycle of the enzyme, shedding light on the kinetics of the process. Finally, we study the effect of aclarubicin, an anticancer drug, demonstrating that it directly affects the Topo II molecule, inhibiting its conformational changes. These results pave the way for a new approach to studying the intrinsic dynamics of proteins and protein complexes, allowing applications ranging from fundamental proteomics to drug discovery and development, and possibly to clinical practice.
Abstract:
Reduced capacity for executive cognitive function and for the autonomic control of cardiac responsivity are both concomitants of the aging process. These may be linked through their mutual dependence on medial prefrontal function, but the specifics of that linkage have not been well explored. Executive functions associated with the medial prefrontal cortex involve various aspects of performance monitoring, whereas centrally mediated autonomic functions can be observed as heart rate variability (HRV), i.e., variability in the length of intervals between heart beats. The focus of this thesis was to examine the degree to which the capacity for phasic autonomic adjustments to heart rate relates to performance monitoring in younger and older adults, using measures of electrocortical and autonomic activity. Behavioural performance and attention allocation during two age-sensitive tasks could be predicted by various aspects of autonomic control. For young adults, greater influence of the parasympathetic system on HRV was beneficial for learning unfamiliar maze paths; for older adults, greater sympathetic influence was detrimental to these functions. Further, these relationships were primarily evoked when the task required the construction and use of internalized representations of mazes rather than passive responses to feedback. When memory for source was required, older adults made three times as many source errors as young adults. However, greater parasympathetic influence on HRV in the older group was conducive to avoiding source errors and to reduced electrocortical responses to irrelevant information. Higher sympathetic predominance, in contrast, was associated with higher rates of source error and greater electrocortical responses to non-target information in both groups. These relations were not seen for errors associated with a speeded perceptual task, irrespective of its difficulty level.
Overall, autonomic modulation of cardiac activity was associated with higher levels of performance monitoring, but differentially across tasks and age groups. With respect to age, those older adults who had maintained higher levels of autonomic cardiac regulation appeared to have also maintained higher levels of executive control over task performance.
Abstract:
years 8 months) and 24 older (M = 7 years 4 months) children. A Monitoring Process Model (MPM) was developed and tested in order to ascertain at which component process of the MPM age differences would emerge. The MPM had four components: (1) assessment; (2) evaluation; (3) planning; and (4) behavioural control. The MPM was assessed directly using a referential communication task in which the children were asked to make a series of five Lego buildings (a baseline condition and one building for each MPM component). Children listened to instructions from one experimenter while a second experimenter in the room (a confederate) interjected varying levels of verbal feedback in order to assist the children and control the component of the MPM. This design allowed us to determine at which "stage" of processing children would most likely have difficulty monitoring themselves in this social-cognitive task. Developmental differences were observed for the evaluation, planning and behavioural control components, suggesting that older children were able to be more successful with the more explicit metacomponents. Interestingly, however, there was no age difference in terms of Lego task success in the baseline condition, suggesting that without the intervention of the confederate younger children monitored the task about as well as older children. This pattern of results indicates that the younger children were disrupted by the feedback rather than helped. On the other hand, the older children were able to incorporate the feedback offered by the confederate into a plan of action. Another aim of this study was to assess similar processing components to those investigated by the MPM Lego task in a more naturalistic observation. Together, the use of the Lego Task (a social cognitive task) and the naturalistic social interaction allowed for the appraisal of cross-domain continuities and discontinuities in monitoring behaviours.
In this vein, analyses were undertaken in order to ascertain whether or not successful performance in the MPM Lego Task would predict cross-domain competence in the more naturalistic social interchange. Indeed, success in the two latter components of the MPM (planning and behavioural control) was related to overall competence in the naturalistic task. However, this cross-domain prediction was not evident for all levels of the naturalistic interchange, suggesting that the nature of the feedback a child receives is an important determinant of response competency. Individual difference measures reflecting the children's general cognitive capacity (Working Memory and Digit Span) and verbal ability (vocabulary) were also taken in an effort to account for more variance in the prediction of task success. However, these individual difference measures did not serve to enhance the prediction of task performance in either the Lego Task or the naturalistic task. Similarly, parental responses to questionnaires pertaining to their child's temperament and social experience also failed to increase prediction of task performance. On-line measures of the children's engagement, positive affect and anxiety also failed to predict competence ratings.
Abstract:
Activity of the medial frontal cortex (MFC) has been implicated in attention regulation and performance monitoring. The MFC is thought to generate several event-related potential (ERP) components, known as medial frontal negativities (MFNs), that are elicited when a behavioural response becomes difficult to control (e.g., following an error or shifting from a frequently executed response). The functional significance of MFNs has traditionally been interpreted in the context of the paradigm used to elicit a specific response, such as errors. In a series of studies, we consider the functional similarity of multiple MFC brain responses by designing novel performance monitoring tasks and exploiting advanced methods for electroencephalography (EEG) signal processing and robust estimation statistics for hypothesis testing. In study 1, we designed a response cueing task and used Independent Component Analysis (ICA) to show that the latent factors describing an MFN to stimuli that cued the potential need to inhibit a response on upcoming trials also accounted for medial frontal brain responses that occurred when individuals made a mistake or inhibited an incorrect response. It was also found that increases in theta occurred for each of these task events, and that the effects were evident at the group level and in single cases. In study 2, we replicated our method of classifying MFC activity to cues in our response task and showed again, using additional tasks, that error commission, response inhibition, and, to a lesser extent, the processing of performance feedback all elicited similar changes across MFNs and theta power. In the final study, we converted our response cueing paradigm into a saccade cueing task in order to examine the oscillatory dynamics of response preparation.
We found that, compared to easy pro-saccades, successfully preparing a difficult anti-saccade response was characterized by an increase in MFC theta and the suppression of posterior alpha power prior to executing the eye movement. These findings align with a large body of literature on performance monitoring and ERPs, and indicate that MFNs, along with their signature in theta power, reflect the general process of controlling attention and adapting behaviour without the need to induce error commission, the inhibition of responses, or the presentation of negative feedback.
Abstract:
In this article, we discuss the first-year research plan for the INKE interface design team, which focuses on a prototype for chaining. Interpretable as a subclass of Unsworth's scholarly primitive of discovering, chaining is the process of beginning with an exemplary article, then finding the articles that it cites, the articles they cite, and so on until the reader begins to get a feel for the terrain. The chaining strategy is of particular utility for scholars working in new areas, either doing background work for interdisciplinary interests or pursuing a subtopic in a domain that generates a paper storm of publications every year. In our prototype project, we plan to produce a system that accepts a seed article, tunnels through a number of levels of citation, and generates a summary report listing the most frequent authors and articles. One of the innovative features of this prototype is its use of the experimental oil-and-water interface effect, which uses text animation to provide the user with a sense of the underlying process.
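The chaining procedure the abstract describes (start from a seed article, follow citations breadth-first for a few levels, tally the most frequently cited items) can be sketched in a few lines. The `get_citations` callback and the toy graph below are hypothetical stand-ins for whatever bibliographic API such a prototype would query; none of the names come from the article.

```python
from collections import Counter, deque

def chain_citations(seed, get_citations, max_depth=2):
    """Breadth-first walk of a citation graph from a seed article,
    counting how often each cited article is reached."""
    counts = Counter()
    seen = {seed}
    queue = deque([(seed, 0)])
    while queue:
        article, depth = queue.popleft()
        if depth == max_depth:
            continue  # stop tunnelling below the requested level
        for cited in get_citations(article):
            counts[cited] += 1
            if cited not in seen:
                seen.add(cited)
                queue.append((cited, depth + 1))
    return counts.most_common()  # summary report: most frequent first

# Toy citation graph standing in for a real bibliographic API.
graph = {
    "seed": ["a", "b"],
    "a": ["b", "c"],
    "b": ["c"],
    "c": [],
}
report = chain_citations("seed", lambda art: graph.get(art, []))
# "b" and "c" are each reached twice, "a" once
```

The same traversal could tally authors instead of articles by having the callback return author lists, which is how the summary report described above would cover both.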
Abstract:
An attempt is made to study the possible relationship between the process of upwelling and zooplankton biomass in the shelf waters along the south-west coast of India between Cape Comorin and Ratnagiri, based on oceanographic and zooplankton data collected by the erstwhile FAO/UNDP Pelagic Fishery Project, Cochin, between 1973 and 1978. Different factors, such as the depth from which the bottom waters are induced upwards during the process of upwelling, the depth to which the bottom waters are drawn, the vertical velocity of upwelling and the resultant zooplankton productivity, were considered while arriving at the deductions. Except for nutrients and phytoplankton productivity, for which simultaneous data are lacking, all the major factors were taken into consideration before concluding on a positive/negative correlation.
Abstract:
At present, a fraction of 0.1−0.2% of patients undergoing surgery become aware during the procedure. This situation is referred to as anesthesia awareness and is obviously very traumatic for the person experiencing it. The reason for its occurrence is mostly an insufficient dosage of the anesthetic Propofol, combined with the inability of the technology monitoring the depth of the patient's anesthetic state to notice the patient becoming aware. A solution can be a highly sensitive and selective real-time monitoring device for Propofol based on optical absorption spectroscopy. Its working principle was postulated by Prof. Dr. habil. H. Hillmer and formulated in DE10 2004 037 519 B4, filed on Aug 30th, 2004. It consists of the exploitation of intra-cavity absorption effects in a two-mode laser system. In this dissertation, a two-mode external cavity semiconductor laser developed prior to this work is enhanced and optimized into a functional sensor. Enhancements include the implementation of variable couplers into the system and of a collimator arrangement into which samples can be introduced. A sample holder and cells are developed and characterized with a focus on compatibility with the measurement approach. Further optimization concerns the overall performance of the system: scattering sources are reduced by re-splicing all fiber-to-fiber connections, parasitic cavities are eliminated by suppressing the Fresnel reflections of open fiber ends by means of optical isolators, and the wavelength stability of the system is improved by thermally insulating the Fiber Bragg Gratings (FBG). The final laser sensor is characterized in detail thermally and optically. Two separate modes are obtained at 1542.0 and 1542.5 nm, each tunable over a range of 1 nm. The mode Full Width at Half Maximum (FWHM) is 0.06 nm and the Signal to Noise Ratio (SNR) is as high as 55 dB.
Independent of tuning, the two modes of the system can always be equalized in intensity, which is important because the delicacy of the intensity equilibrium is one of the main sensitivity-enhancing effects formulated in DE10 2004 037 519 B4. For the proof-of-concept (POC) measurements, the target substance Propofol is diluted in the solvents acetone and dichloromethane (DCM), which were investigated beforehand for compatibility with Propofol. Eight measurement series (two solvents, two cell lengths and two different mode spacings) are taken, which paint a uniform picture: the mode intensity ratio responds linearly to an increase of Propofol in all cases. The slope of the linear response indicates the sensitivity of the system. The eight series are split into two groups: measurements taken in long cells and measurements taken in short cells.
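As a rough illustration of how such a linear calibration could be quantified, the sketch below fits a straight line to concentration/intensity-ratio pairs by ordinary least squares; all numbers are invented, since the abstract reports only that the response is linear and that the slope measures sensitivity.

```python
# Hypothetical calibration data: Propofol concentration (arbitrary
# units) vs. the measured two-mode intensity ratio. Values are
# illustrative only; the dissertation reports no specific numbers here.
conc  = [0.0, 1.0, 2.0, 3.0, 4.0]
ratio = [1.00, 1.08, 1.16, 1.24, 1.32]

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(ratio) / n
# Ordinary least-squares slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, ratio))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x
# slope is the sensor sensitivity: ratio change per unit concentration
```

With real measurement series, comparing the fitted slopes across the eight series (two solvents, two cell lengths, two mode spacings) is what would rank the configurations by sensitivity.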
Abstract:
Since its beginning in 1999, the Bologna Process has influenced various aspects of higher education in its member countries, e.g., degree structures, mobility, lifelong learning, the social dimension and quality assurance. The social dimension forms the focus of this research. It entered the Bologna Process agenda in 2001, yet despite a decade of reforms it has remained a somewhat vague element and received little scholarly attention. This research addresses this gap. Firstly, the different meanings of the social dimension according to the major European policy actors are analysed. Unfolding the understandings of the actors revealed that the social dimension is mostly understood as having a student body that reflects the diversity of the population in accessing, progressing in and completing higher education, with a special concern for underrepresented groups. However, no similar commonality can be observed concerning the actual policy measures to achieve this goal. Divergence occurs with respect to the underrepresented groups addressed, i.e., all underrepresented groups or people without formal qualifications and mature learners, and the values and institutional interests traditionally promoted by these actors. Secondly, the dissertation discusses the reflection of this understanding of the social dimension at the national level by looking at the cases of Finland, Germany and Turkey. The in-depth analyses show an awareness of the social dimension among most of the national Bologna Process actors and a common understanding of its goals. However, this understanding has not triggered action in any of the countries: the countries acted on areas which they had defined as problematic before the Bologna Process. Finally, based on these findings, the dissertation discusses the social dimension as a policy item that managed to get onto the Bologna Process agenda but neither grew into an implementable policy nor dropped out of it.
To this aim, it makes use of the multiple streams framework and explains the low agenda status of the social dimension with: (i) the lack of a pressing problem definition: there are no clearly defined indicators and no comprehensive monitoring system; (ii) the lack of a viable solution alternative: the proposal of developing national strategies and action plans closed the way to generic guidelines for the social dimension that could be translated into national policy processes; (iii) low political receptivity: recent trends favour discourses of efficiency, excellence and exclusiveness rather than equality and inclusiveness; (iv) high constraints: the social dimension by definition requires more public funding, which is less appreciated, and the actors face strategic constraints in allocating their resources; (v) the type of policy entrepreneur: the social dimension is promoted by an international stakeholder, the European Students' Union, instead of the ministers responsible for higher education. The social dimension remains a policy item in the Bologna Process which is noble enough to agree on but not urgent enough to act on.
Abstract:
In psycholinguistic research, it is a widespread assumption that the evaluation of information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). However, a growing number of studies challenges this two-step model of comprehension and validation, directly or indirectly. In particular, findings on Stroop-like stimulus-response compatibility effects, which occur when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009), suggest that readers carry out a non-strategic check of the validity of information already during comprehension. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routinized, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al.
(2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to gradual differences in how well information fits with world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by differences in the predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study found a compatibility effect of task-irrelevant truth value on response latencies in an entirely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, although it may depend on depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that signal the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast and non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process.
Taken together, the findings argue against a conceptualization of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. The implications of these findings for current models of language comprehension are discussed, and recommendations for further research on the relationship between comprehension and validation are given.
Abstract:
Introduction. During the last two decades, larval therapy has reemerged as a safe and reliable alternative for the healing of cutaneous ulcers that do not respond to conventional treatments. Objective. To evaluate the use of larvae of Lucilia sericata as a treatment for wounds infected with Pseudomonas aeruginosa in an animal model. Materials and methods. Twelve rabbits were randomly distributed into 3 groups: the first group was treated with larval therapy; the second was treated with antibiotic therapy; and the third received no treatment and was established as a control group. In each animal a wound was artificially induced, and a suspension of P. aeruginosa was then inoculated into the lesion. Finally, every rabbit was evaluated until infection developed, and treatment was set up for the first two groups according to the protocols mentioned above. Macroscopic evaluation of the wounds was based on the presence of edema, exudates, bad odor, inflammation around the wound and the presence of granulation tissue. The healing process was evaluated by monitoring histological changes in the dermal tissue. Results. Differences in the time required for wound healing were observed between the first group treated with larval therapy (10 days) and the second group treated with conventional antibiotic therapy (20 days). Conclusion. The L. sericata larva is an efficient tool as a therapy for wounds infected with P. aeruginosa.
Abstract:
This thesis focuses on the monitoring, fault detection and diagnosis of Wastewater Treatment Plants (WWTP), which are important fields of research for a wide range of engineering disciplines. The main objective is to evaluate and apply a novel artificial intelligence methodology based on situation assessment for the monitoring and diagnosis of Sequencing Batch Reactor (SBR) operation. To this end, Multivariate Statistical Process Control (MSPC) in combination with Case-Based Reasoning (CBR) methodology was developed, evaluated on three different SBR plants (pilot and lab scale) and validated on the BSM1 plant layout.
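For readers unfamiliar with the MSPC side of such a scheme, one of its standard tools is the Hotelling T² statistic, which flags samples whose multivariate fingerprint deviates from normal operation. The sketch below shows the idea for a two-variable process; the reference mean, covariance and batch values are invented for illustration and are not taken from the thesis.

```python
def hotelling_t2(x, mean, cov_inv):
    """T^2 = (x - mean)^T * Cov^-1 * (x - mean) for a 2-variable process."""
    d0, d1 = x[0] - mean[0], x[1] - mean[1]
    return (d0 * (cov_inv[0][0] * d0 + cov_inv[0][1] * d1)
            + d1 * (cov_inv[1][0] * d0 + cov_inv[1][1] * d1))

# Reference statistics, as if estimated from in-control batches (assumed).
mean = (5.0, 2.0)
cov = [[0.4, 0.1], [0.1, 0.2]]
# Invert the 2x2 covariance matrix by hand.
det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
cov_inv = [[cov[1][1] / det, -cov[0][1] / det],
           [-cov[1][0] / det, cov[0][0] / det]]

normal_batch = (5.1, 2.1)   # close to normal operation -> small T^2
faulty_batch = (7.0, 0.5)   # deviates strongly -> large T^2
t_normal = hotelling_t2(normal_batch, mean, cov_inv)
t_faulty = hotelling_t2(faulty_batch, mean, cov_inv)
```

In a full MSPC scheme the T² values are compared against a control limit; the CBR component would then retrieve past cases resembling the flagged batch to support diagnosis.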
Abstract:
The effectiveness of development assistance has come under renewed scrutiny in recent years. In an era of growing economic liberalisation, research organisations are increasingly being asked to account for the use of public funds by demonstrating achievements. However, in the natural resources (NR) research field, conventional economic assessment techniques have focused on quantifying the impact achieved rather than understanding the process that delivered it. As a result, they provide limited guidance for planners and researchers charged with selecting and implementing future research. In response, "pathways" or logic models have attracted increased interest in recent years as a remedy to this shortcoming. However, as commonly applied, these suffer from two key limitations: their ability to incorporate risk and to assess variance from plan. The paper reports the results of a case study that used a Bayesian belief network approach to address these limitations and outlines its potential value as a tool to assist the planning, monitoring and evaluation of development-orientated research.
Abstract:
The reduction of indigo (dispersed in water) to leuco-indigo (dissolved in water) is an important industrial process, investigated here for the case of glucose as an environmentally benign reducing agent. In order to quantitatively follow the formation of leuco-indigo, two approaches based on (i) rotating disk voltammetry and (ii) sonovoltammetry are developed. Leuco-indigo, once formed in alkaline solution, is readily monitored at a glassy carbon electrode in the mass transport limit employing hydrodynamic voltammetry. The presence of power ultrasound further improves the leuco-indigo determination due to additional agitation and homogenization effects. While inactive at room temperature, glucose readily reduces indigo in alkaline media at 65 °C. In the presence of excess glucose, a surface dissolution kinetics limited process is proposed, following the rate law dn(leuco-indigo)/dt = k × c(OH−) × S(indigo), where n(leuco-indigo) is the amount of leuco-indigo formed, k = 4.1 × 10⁻⁹ m s⁻¹ (at 65 °C, assuming spherical particles of 1 µm diameter) is the heterogeneous dissolution rate constant, c(OH−) is the concentration of hydroxide, and S(indigo) is the reactive surface area. The activation energy for this process in aqueous 0.2 M NaOH is E_A = 64 kJ mol⁻¹, consistent with a considerable temperature effect. The redox mediator 1,8-dihydroxyanthraquinone is shown to significantly enhance the reaction rate by catalysing the electron transfer between glucose and solid indigo particles.
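As a worked plausibility check on the quoted rate law dn(leuco-indigo)/dt = k × c(OH−) × S(indigo), the sketch below evaluates it for an assumed dispersion of 1 µm indigo particles. Only k and c(OH−) come from the abstract; the dispersed mass and the particle density are labeled assumptions.

```python
import math

# Constants quoted in the abstract (SI units).
k = 4.1e-9      # m s^-1, heterogeneous dissolution rate constant at 65 degC
c_oh = 0.2e3    # mol m^-3, hydroxide concentration (0.2 M NaOH)

# Assumed sample parameters (not from the abstract).
r = 0.5e-6      # m, particle radius (1 um diameter, as in the abstract)
rho = 1.2e3     # kg m^-3, assumed indigo particle density
m_sample = 1e-3 # kg of indigo dispersed (assumed)

# Total reactive surface area of the dispersed particles.
particle_mass = rho * (4.0 / 3.0) * math.pi * r**3
n_particles = m_sample / particle_mass
S_indigo = n_particles * 4.0 * math.pi * r**2   # m^2; algebraically 3*m/(rho*r)

# Rate of leuco-indigo formation: m/s * mol/m^3 * m^2 = mol/s.
rate = k * c_oh * S_indigo
```

For these assumed values S(indigo) works out to 3m/(ρr) = 5 m² and the initial rate to about 4 µmol s⁻¹; the point of the sketch is only that the units of the rate law close correctly, not the specific numbers.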