940 results for practical epistemology analysis
Beauty and personal care in mass market: A strategic analysis of perfumery and cosmetics at Sonae MC
Abstract:
Directed internship
Abstract:
Background: To achieve good outcomes in critically ill obstetric patients, it is necessary to identify organ dysfunction rapidly so that life-saving interventions can be appropriately commenced. However, timely access to clinical chemistry results is problematic, even in referral institutions, in the sub-Saharan African region. Reliable point-of-care tests licensed for clinical use are now available for lactate and creatinine. Aim: We aimed to assess whether implementation of point-of-care testing for lactate and creatinine is feasible in the obstetric unit at the Queen Elizabeth Central Hospital (QECH) in Blantyre, Malawi, by obtaining the opinions of clinical staff on the use of these tests in practice. Methods: During a two-month evaluation period, nurse-midwives, medical interns, clinical officers, registrars, and consultants were given the opportunity to use StatStrip® and StatSensor® (Nova Biomedical, Waltham, USA) devices for lactate and creatinine estimation as part of their routine clinical practice in the obstetric unit. They were subsequently asked to complete a short questionnaire. Results: Thirty-seven questionnaires were returned by participants: 22 from nurse-midwives and the remainder from clinicians. The mean satisfaction score for the devices was 7.6/10 amongst clinicians and 8.0/10 amongst nurse-midwives. The majority of participants stated that the obstetric high dependency unit (HDU) was the most suitable location for the devices. For lactate, 31 participants strongly agreed that testing should be continued and 24 strongly agreed that it would influence patient management. For creatinine, 29 strongly agreed that testing should be continued and 28 strongly agreed that it would influence their patient management. Twenty participants strongly agreed that they trust point-of-care devices. Conclusions: Point-of-care clinical chemistry testing was feasible, practical, and well received by staff, and was considered to have a useful role to play in the clinical care of sick obstetric patients at this referral centre.
Abstract:
Purpose – Curve fitting from unordered noisy point samples is needed for surface reconstruction in many applications. In the literature, several approaches have been proposed to solve this problem. However, previous works lack a formal characterization of the curve fitting problem and an assessment of the effect of several parameters (i.e. scalars that remain constant in the optimization problem), such as the number of control points (m), curve degree (b), knot vector composition (U), norm degree (k), and point sample size (r), on the optimized curve reconstruction as measured by a penalty function (f). The paper aims to discuss these issues. Design/methodology/approach – A numerical sensitivity analysis of the effect of m, b, k and r on f and a characterization of the fitting procedure from the mathematical viewpoint are performed. The spectral (frequency) analysis of the derivative of the angle of the fitted curve with respect to u, as a means to detect spurious curls and peaks, is also explored. Findings – It is more effective to find optimum values for m than for k or b in order to obtain good results, because the topological faithfulness of the resulting curve strongly depends on m. Furthermore, when an excessive number of control points is used, the resulting curve presents spurious curls and peaks. The authors were able to detect the presence of such spurious features with spectral analysis. They also found that the method for curve fitting is robust to significant decimation of the point sample. Research limitations/implications – The authors have addressed important voids of previous works in this field. They determined which of the curve fitting parameters m, b and k influenced the results the most, and how. They also characterized the curve fitting problem from the optimization perspective and, finally, devised a method to detect spurious features in the fitted curve. Practical implications – This paper provides a methodology to select the important tuning parameters in a formal manner. Originality/value – To the best of the authors' knowledge, no previous work has formally evaluated the sensitivity of the goodness of the curve fit with respect to the different possible tuning parameters (curve degree, number of control points, norm degree, etc.).
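A minimal sketch of the kind of least-squares fit this abstract parameterizes may make the roles of m, b, and U concrete. The use of scipy, the chord-length parameterization, and the synthetic sample are assumptions of this illustration, not the authors' implementation:

```python
# Illustrative sketch (not the paper's method): least-squares B-spline fitting
# of a noisy planar point sample, exposing the tuning parameters discussed in
# the abstract: number of control points m, curve degree b, and knot vector U.
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)

# Synthetic noisy sample standing in for the point cloud to be fitted.
u_true = np.linspace(0.0, 1.0, 200)
pts = np.c_[np.cos(2 * np.pi * u_true), np.sin(4 * np.pi * u_true)]
pts += rng.normal(scale=0.02, size=pts.shape)

# Chord-length parameterization (assumes the sample has already been ordered).
chord = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
u = chord / chord[-1]

b = 3          # curve degree
m = 12         # number of control points
# Knot vector U: clamped ends plus m - b - 1 uniform interior knots.
interior = np.linspace(0.0, 1.0, m - b + 1)[1:-1]
U = np.r_[np.zeros(b + 1), interior, np.ones(b + 1)]

# Fit x(u) and y(u) independently; each spline has exactly m control points.
spl_x = make_lsq_spline(u, pts[:, 0], U, k=b)
spl_y = make_lsq_spline(u, pts[:, 1], U, k=b)

# Penalty f: mean squared distance between the sample and the fitted curve.
f = np.mean((spl_x(u) - pts[:, 0]) ** 2 + (spl_y(u) - pts[:, 1]) ** 2)
print(f"fitting penalty f = {f:.3e}, control points m = {len(spl_x.c)}")
```

In this setting, raising m well beyond what the data supports is exactly the regime in which the fitted curve develops the spurious curls and peaks the abstract describes.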
Abstract:
In this work, we further extend the recently developed adaptive data analysis method, the Sparse Time-Frequency Representation (STFR) method. This method is based on the assumption that many physical signals inherently contain AM-FM representations. We propose a sparse optimization method to extract the AM-FM representations of such signals. We prove the convergence of the method for periodic signals under certain assumptions and provide practical algorithms specifically for the non-periodic STFR, which extends the method to tackle problems that former STFR methods could not handle, including stability to noise and non-periodic data analysis. This is a significant improvement, since many adaptive and non-adaptive signal processing methods are not fully capable of handling non-periodic signals. Moreover, we propose a new STFR algorithm to study intrawave signals with strong frequency modulation and analyze the convergence of this new algorithm for periodic signals. Such signals have previously remained a bottleneck for all signal processing methods. Furthermore, we propose a modified version of STFR that facilitates the extraction of intrawaves that have overlapping frequency content. We show that the STFR methods can be applied to the realm of dynamical systems and cardiovascular signals. In particular, we present a simplified and modified version of the STFR algorithm that is potentially useful for the diagnosis of some cardiovascular diseases. We further explain some preliminary work on the nature of Intrinsic Mode Functions (IMFs) and how they can have different representations in different phase coordinates. This analysis shows that the uncertainty principle is fundamental to all oscillating signals.
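For readers unfamiliar with the single-component AM-FM model a(t)·cos(θ(t)) that STFR builds on, the short sketch below recovers instantaneous amplitude and frequency for one synthetic component using a classical Hilbert-transform analysis. It only illustrates the signal model, not the sparse optimization described above, and all parameter values are invented:

```python
# Minimal illustration of the AM-FM model a(t) * cos(theta(t)) for one
# component, analyzed with a classical Hilbert transform (not STFR).
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                   # sampling rate [Hz]
t = np.arange(0.0, 2.0, 1.0 / fs)

a = 1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t)   # slow amplitude modulation
theta = 2 * np.pi * (10.0 * t + 0.5 * np.sin(2 * np.pi * 1.0 * t))  # FM phase
x = a * np.cos(theta)

analytic = hilbert(x)                         # analytic signal
amp_est = np.abs(analytic)                    # instantaneous amplitude a(t)
phase_est = np.unwrap(np.angle(analytic))
freq_est = np.diff(phase_est) / (2 * np.pi) * fs   # instantaneous frequency [Hz]

# The 10 Hz carrier should be recovered as the mean instantaneous frequency.
print(f"mean instantaneous frequency ~ {freq_est.mean():.2f} Hz")
```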
Abstract:
We present a study where the energy loss function of Ta2O5, initially derived in the optical limit for a limited region of excitation energies from reflection electron energy loss spectroscopy (REELS) measurements, was improved and extended to the whole momentum and energy excitation region through a suitable theoretical analysis using the Mermin dielectric function and requiring the fulfillment of physically motivated restrictions, such as the f- and KK-sum rules. The material stopping cross section (SCS) and energy-loss straggling measured for 300–2000 keV proton and 200–6000 keV helium ion beams by means of Rutherford backscattering spectrometry (RBS) were compared to the same quantities calculated in the dielectric framework, showing an excellent agreement, which is used to judge the reliability of the Ta2O5 energy loss function. Based on this assessment, we have also predicted the inelastic mean free path and the SCS of energetic electrons in Ta2O5.
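For reference, the two physically motivated constraints mentioned above are usually written as follows (standard Gaussian-unit forms common in dielectric energy-loss-function modelling; the exact normalization used by the authors is not stated in the abstract):

```latex
% f-sum rule: the energy loss function must account for all Z electrons per
% formula unit (Z = 2*73 + 5*8 = 186 for Ta2O5); \mathcal{N} is the molecular density.
\frac{2}{\pi\,\Omega_p^{2}} \int_0^{\infty} \omega\,
  \operatorname{Im}\!\left[\frac{-1}{\varepsilon(k,\omega)}\right] \mathrm{d}\omega = Z,
\qquad \Omega_p^{2} = \frac{4\pi \mathcal{N} e^{2}}{m}.

% KK (perfect-screening) sum rule, with n(0) the refractive index in the static limit:
\frac{2}{\pi} \int_0^{\infty} \frac{1}{\omega}\,
  \operatorname{Im}\!\left[\frac{-1}{\varepsilon(\omega)}\right] \mathrm{d}\omega
  + \frac{1}{n^{2}(0)} = 1.
```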
Abstract:
The goal of this project is to learn the necessary steps to create a finite element model that can accurately predict the dynamic response of a Kohler Engines Heavy Duty Air Cleaner (HDAC). This air cleaner is composed of three glass-reinforced plastic components and two air filters. Several uncertainties arose in the finite element (FE) model due to the HDAC's component material properties and assembly conditions. To help understand and mitigate these uncertainties, analytical and experimental modal models were created concurrently to perform a model correlation and calibration. Over the course of the project, simple and practical methods were found for future FE model creation. Similarly, an experimental method for the optimal acquisition of experimental modal data was arrived at. After the model correlation and calibration was performed, a validation experiment was used to confirm the FE model's predictive capabilities.
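The abstract does not name the correlation metric used between the analytical and experimental modal models; the Modal Assurance Criterion (MAC) is the usual choice, and the hypothetical sketch below shows how such a comparison is computed (the mode-shape matrices are made-up placeholders):

```python
# Hypothetical test-analysis correlation step using the Modal Assurance
# Criterion (MAC); shown only as the common choice, not the project's method.
import numpy as np

def mac_matrix(phi_fe: np.ndarray, phi_test: np.ndarray) -> np.ndarray:
    """MAC[i, j] between FE mode i and experimental mode j (values in [0, 1])."""
    num = np.abs(phi_fe.T @ phi_test) ** 2
    den = np.outer(np.sum(phi_fe**2, axis=0), np.sum(phi_test**2, axis=0))
    return num / den

rng = np.random.default_rng(1)
phi_fe = rng.normal(size=(24, 4))                         # 24 DOFs, 4 analytical modes
phi_test = phi_fe + 0.1 * rng.normal(size=phi_fe.shape)   # noisy "measured" shapes

mac = mac_matrix(phi_fe, phi_test)
print(np.round(mac, 2))   # values near 1 on the diagonal indicate good correlation
```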
Abstract:
This thesis presents a system for visually analyzing the electromagnetic fields of the electrical machines in the energy conversion laboratory. The system utilizes the finite element method to achieve a real-time effect in the analysis of electrical machines during hands-on experimentation. The system developed is a tool to support the student's understanding of the electromagnetic field by calculating performance measures and operational concepts pertaining to the practical study of electrical machines. Energy conversion courses are fundamental in electrical engineering. The laboratory is oriented toward facilitating the practical application of the theory presented in class, enabling students to use numerically obtained electromagnetic field solutions to calculate performance measures and operating characteristics. Laboratory experiments, supported by this visual and interactive analysis system, help students understand electromagnetic concepts; this understanding is accomplished while hands-on experimentation takes place in real time.
Abstract:
Purpose: The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach: This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings: This study shows substantial improvement in quality across the three settings. LFA proved effective in analysing quality issues and suggesting improvement measures objectively. Research limitations/implications: This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to analyse other settings within the same hospital, as well as in several hospitals. The study also adopts a bottom-up approach, which should ideally be triangulated with other sources of data. Practical implications: LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value: LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.
Abstract:
There has recently been a great deal of interest in the potential of computer games to function as innovative educational tools. However, there is very little evidence of games fulfilling that potential. Indeed, the process of merging the disparate goals of education and games design appears problematic, and there are currently no practical guidelines for how to do so in a coherent manner. In this paper, we describe the successful, empirically validated teaching methods developed by behavioural psychologists and point out how they are uniquely suited to take advantage of the benefits that games offer to education. We conclude by proposing some practical steps for designing educational games, based on the techniques of Applied Behaviour Analysis. It is intended that this paper can both focus educational games designers on the features of games that are genuinely useful for education, and also introduce a successful form of teaching that this audience may not yet be familiar with.
Abstract:
BACKGROUND: The well-being of relatives of patients with chronic heart disease (CHD) has been found to be negatively affected by the patient's condition. Studies examining relatives of patients with atrial fibrillation (AF) indicate that their well-being may be affected in a similar manner, but further research is needed. AIM: To explore and describe critical incidents in which relatives of patients experience how AF affects their well-being, and what actions they take to handle these situations. DESIGN AND METHOD: An explorative, descriptive design based on the critical incident technique (CIT) was used. Interviews were conducted with 19 relatives (14 women and five men) of patients hospitalised in southern Sweden due to acute symptoms of AF. RESULTS: The well-being of relatives was found to be affected by their worries (patient-related health), as well as by the sacrificing of their own needs (self-related health). In handling their own well-being, these relatives adjusted to and supported the patient (practical involvement), along with adjusting their own feelings and responding to the mood of the patients (emotional involvement). CONCLUSION: The well-being of relatives of patients with AF was affected by the patients' well-being. In their attempt to handle their own well-being, the relatives adjusted to and supported the patients. Further research is needed in order to evaluate the effects of support given to relatives and patients, respectively and together.
Abstract:
Ever since the birth of the Smart City paradigm, a wide variety of initiatives have sprung up around this phenomenon: best practices, projects, pilot projects, transformation plans, models, standards, indicators, measuring systems, etc. The question to ask, applicable to any government official, city planner or researcher, is whether this effect is being felt in how cities are transforming, or whether, in contrast, it is not very realistic to speak of cities imbued with this level of intelligence. Many cities are eager to define themselves as smart, but the variety, complexity and scope of the projects needed for this transformation indicate that the change process is longer than it seems. If our goal is to carry out a comparative analysis of this progress among cities by using the number of projects executed and their scope as a reference for the transformation, we may find such a task of little value owing to the huge differences in the characteristics that define each city. We believe that the subject needs simplification (simpler, more practical models) and a new approach. This paper presents a detailed analysis of the smart city transformation process in Spain and provides a support model that helps us understand the changes and the speed at which they are being implemented. To this end we define a set of elements of change called "transformation factors" that group a city's smartness into one of three levels (Low/Medium/Fully) and more homogeneously identify the level of advancement of this process. © 2016 IEEE.
Abstract:
Maintenance of transport infrastructure assets is widely advocated as key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect in the long-term perspective of transport infrastructure maintenance is to proactively estimate maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for obtaining an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and the possibility of predicting future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited for the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation and provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, identifying the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments that have a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
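As a rough illustration of the time-to-event modelling mentioned for Paper I, the sketch below fits a right-censored Weibull lifetime model to synthetic "pavement section" data; the data, the covariate-free parameterization, and the optimizer are assumptions of this example rather than the dissertation's actual model:

```python
# Illustrative right-censored Weibull lifetime fit (maximum likelihood),
# of the kind used in time-to-event analysis of pavement lifetimes.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Synthetic lifetimes in years, right-censored at the end of the survey period.
true_shape, true_scale = 2.5, 18.0
lifetimes = true_scale * rng.weibull(true_shape, size=500)
censor_time = 20.0
observed = lifetimes <= censor_time            # True = failure observed
durations = np.minimum(lifetimes, censor_time)

def neg_log_likelihood(params):
    log_k, log_lam = params                    # log-parameterization keeps k, lam > 0
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = (durations / lam) ** k
    # Density term for observed failures, survival term for censored sections.
    ll_event = np.log(k) - np.log(lam) + (k - 1) * np.log(durations / lam) - z
    ll_censored = -z
    return -np.sum(np.where(observed, ll_event, ll_censored))

x0 = np.array([0.0, np.log(durations.mean())])
result = minimize(neg_log_likelihood, x0=x0, method="Nelder-Mead")
k_hat, lam_hat = np.exp(result.x)
print(f"estimated shape = {k_hat:.2f}, scale = {lam_hat:.1f} years")
```

Handling censoring explicitly, as above, is what distinguishes such lifetime models from a naive regression on observed failure ages.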
Abstract:
This paper focuses on the study of the underdrawings of 16th century easel paintings attributed to the workshop of the Portuguese-Flemish Master Frei Carlos. This investigation encompasses multidisciplinary research that relates the results of surface exams (infrared reflectography, standard light photography and infrared photography) with analytical investigations. The surface analysis of Frei Carlos' underdrawings by infrared reflectography has shown heterogeneous work, revealing two different situations: (1) an abundant and expressive underdrawing, revealing a Flemish influence, and (2) a simple and outlined underdrawing. This preliminary research raised an important question related to this Portuguese-Flemish workshop and to the analytical approach: is the underdrawings' heterogeneity, as observed in the reflectograms, related to different artists, or is it rather an effect produced by the use of different materials in the underdrawings' execution? Consequently, if different materials were used, how can we gain access to the hidden underdrawings? In order to understand the reasons for this dissemblance, chemical analyses of micro-samples collected in underdrawing areas representing both situations were carried out by optical microscopy, micro Fourier transform infrared spectroscopy (μ-FTIR), scanning electron microscopy coupled with energy dispersive X-ray spectrometry (SEM-EDX) and micro-Raman spectroscopy (μ-Raman). Taking into account the different possibilities and the practical and theoretical limitations of surface and point examinations in the study of easel painting underdrawings, the research methodology was adjusted, sometimes resulting in a re-analysis of experimental results. This research shows the importance of combining multispectral surface exams and chemical analysis in understanding the artistic creative processes of 16th century easel paintings.
Abstract:
The purpose of this study was to implement research methodologies and assess the effectiveness and impact of management tools to promote best practices for the long-term conservation of the endangered African wild dog (Lycaon pictus). Different methods were included in the project framework to investigate and expand the applicability of these methodologies to free-ranging African wild dogs in the southern African region: ethology, behavioural endocrinology and ecology field methodologies were tested and implemented. Additionally, research was performed to test the effectiveness and implications of a contraceptive implant (Suprenolin) as a management tool for a subpopulation of the species hosted in fenced areas. Particular attention was given to the social structure and survival of treated packs. This research provides useful tools and advances the applicability of these methods for field studies, standardizing and improving research instruments in the fields of conservation biology and behavioural endocrinology. The results reported here provide effective methodologies to expand the applicability of non-invasive endocrine assessment to previously prohibited fields, along with validation of sampling methods for faecal hormone analysis. The final aim was to fill a knowledge gap on the behaviours of the species and provide common ground for future researchers to apply non-invasive methods to research on this species and to test the effectiveness of contraception on a managed metapopulation.
Abstract:
Sandy coasts represent vital areas whose preservation and maintenance also involve economic and tourist interests. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly realized by combining engineering solutions with management policies to evaluate their effects over time. Monitoring activities are the fundamental instrument for obtaining deep knowledge of the investigated phenomenon. Thanks to technological development, several possibilities are available both in terms of geomatic surveying techniques and processing tools, allowing high performance and accuracy to be reached. Nevertheless, when the littoral includes both emerged and submerged beaches, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated according to the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study evaluates the available geomatic techniques, processing approaches, and derived products, aiming to optimise the entire coastal monitoring workflow by adopting an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an additional value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for properly planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool. Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (the regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.