841 results for implementation of organizational values
Abstract:
Despite abundant research on work meaningfulness, its link to general ethical attitudes at work has not yet been examined. In this article, we propose a theoretical framework to explain how work meaningfulness contributes to enhanced ethical behavior. We argue that, by allowing individuals to relate their work to their personal core values and identity, work meaningfulness fosters affective commitment - the involvement of one's cognitive, emotional, and physical resources. This, in turn, leads to engagement, facilitates the integration of personal values into daily work routines, and so reduces the risk of unethical behavior. By contrast, anomie - the absence of meaning and, consequently, of personal involvement - leads to merely rational rather than affective commitment, and hence to disengagement and amorality. We conclude with implications for the management of ethical attitudes.
Abstract:
This paper presents the implementation details of a coded structured light system for rapid shape acquisition of unknown surfaces. Such techniques are based on projecting patterns onto a measuring surface and grabbing an image of every projection with a camera. By analyzing the pattern deformations that appear in the images, 3D information about the surface can be calculated. The implemented technique projects a single pattern, so it can be used to measure moving surfaces. The pattern is a grid in which the colors of the slits are selected using a De Bruijn sequence. Moreover, since both axes of the pattern are coded, the cross points of the grid carry two codewords (which permits reconstructing them very precisely), while pixels belonging to horizontal and vertical slits also carry a codeword. Different sets of colors are used for horizontal and vertical slits, so the resulting pattern is invariant to rotation. Therefore, the alignment constraint between camera and projector assumed by many authors is not necessary.
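The slit-coding step above lends itself to a short illustration. The abstract does not specify the alphabet size or window length used, so the values below (4 colors, a codeword window of 3) are illustrative assumptions; the function is the standard recursive De Bruijn generator, not the authors' implementation.

```python
def de_bruijn(k, n):
    """Return a De Bruijn sequence B(k, n): a cyclic sequence over the
    alphabet {0, ..., k-1} in which every length-n word occurs exactly once."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

# 4 slit colors, windows of 3: a cycle of 4**3 = 64 slits in which every
# run of 3 consecutive colors is unique (both values are assumed here)
colors = de_bruijn(4, 3)
```

Each slit's color together with its two neighbors then forms a unique codeword along that axis, which is what makes single-shot decoding of the projected grid possible.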
Formulation and Implementation of Air Quality Control Programmes: Patterns of Interest Consideration
Abstract:
This article investigates central aspects of the relationship between programme structure and the implementation of sulphur dioxide air quality control policies. Previous implementation research, primarily adopting American approaches, has neglected the connections between the processes of programme formulation and implementation. 'Programme', as the key variable in implementation studies, has been defined too narrowly. On the basis of theoretical and conceptual reflections and provisional empirical results from studies in France, Italy, England, and the Federal Republic of Germany, the authors demonstrate that an integral process analysis using a broader programme concept is necessary if patterns of interest recognition in policies are to be discovered. Otherwise, a still important question of critical social science - what is the impact of special interests upon implementation processes? - cannot be answered.
Abstract:
BACKGROUND: Enhanced recovery protocols may reduce postoperative complications and length of hospital stay. However, implementing these protocols requires time and financial investment. This study evaluated the cost-effectiveness of enhanced recovery implementation. METHODS: The first 50 consecutive patients treated during implementation of an enhanced recovery programme were compared with 50 consecutive patients treated in the year before its introduction. The enhanced recovery protocol principally implemented preoperative counselling, reduced preoperative fasting, preoperative carbohydrate loading, avoidance of premedication, optimized fluid balance, standardized postoperative analgesia, a no-drain policy, and early nutrition and mobilization. Length of stay, readmissions and complications within 30 days were compared. A cost-minimization analysis was performed. RESULTS: Hospital stay was significantly shorter in the enhanced recovery group: median 7 (interquartile range 5-12) versus 10 (7-18) days (P = 0·003); two patients were readmitted in each group. The rate of severe complications was lower in the enhanced recovery group (12 versus 20 per cent), but there was no difference in overall morbidity. The mean saving per patient in the enhanced recovery group was €1651. CONCLUSION: Enhanced recovery is cost-effective, with savings evident even in the initial implementation period.
Abstract:
It has been estimated that more than 70% of all medical activity is directly related to information provided by analytical data. Substantial technological advances have taken place recently, allowing a previously unimagined number of analytical samples to be processed while offering high-quality results. Concurrently, still more new diagnostic determinations have been introduced - all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. The present manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing - meaning that the CL's focus cannot be strictly metrological, as if it were purely a system producing results, but must instead concentrate on efficiency and efficacy. By applying re-engineering criteria, an emphasis has had to be placed on improved organisation and operating practice within the CL, focusing on the current criteria of the Integrated Management Areas where technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these concepts in mind, automation and virological treatment, along with serology in general, follow the same criteria as the rest of the operating methodology in the Clinical Laboratory.
Abstract:
Age-related seroprevalence studies conducted in Brazil have indicated a transition from high to medium endemicity of hepatitis A virus (HAV) infection in the population. However, most of these studies have focused on urban populations that experience lower incidence rates of HAV infection. In the current study, the prevalence of anti-HAV antibodies was investigated in children with a low socioeconomic status (SES) living on the periphery of three capital cities in Brazil. A total of 1,162 dried blood spot samples were collected from individuals aged one to 18 years and tested for anti-HAV antibodies. A large proportion of children under five years old (74.1-90%) were found to be susceptible to HAV infection. Anti-HAV antibody prevalence reached ≥ 50% among those 10-14 years of age or older. The prevalence rates observed are characteristic of regions with an intermediate level of hepatitis A endemicity. These data indicate that a large proportion of children with a low SES living on the periphery of urban cities may be at risk of contracting HAV infection. The hepatitis A vaccine currently offered in Brazil is only available to high-risk groups or at private clinics and is unaffordable for individuals with a lower SES. The results of this study suggest that the hepatitis A vaccine should be included in the Brazilian National Program for Immunisation.
Real-Time implementation of a blind authentication method using self-synchronous speech watermarking
Abstract:
A blind speech watermarking scheme that meets hard real-time deadlines is presented and implemented. A key issue in these block-oriented watermarking techniques is preserving synchronization, namely recovering the exact position of each block in the mark-extraction process. The presented scheme can be split into two distinct parts: the synchronization method and the information-mark method. The former is embedded in the time domain and is fast enough to run within real-time requirements. The latter contains the authentication information and is embedded in the wavelet domain. Both the synchronization and information-mark techniques are tunable, yielding a configurable method: capacity, transparency and robustness can be configured depending on the needs. This makes the scheme useful for professional applications, such as telephony authentication or even sending information through radio applications.
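As an illustration of a wavelet-domain information mark, the sketch below embeds one bit per frame by quantization index modulation (QIM) of the mean of the Haar detail band. The Haar basis, the QIM rule, and the step size `delta` are all assumptions made for this sketch; the abstract does not disclose the paper's actual embedding rule.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    """Inverse of haar_dwt."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def embed_bit(frame, bit, delta=0.05):
    """Embed one bit in a frame: move the detail-band mean to a multiple
    of delta whose parity encodes the bit (QIM)."""
    a, d = haar_dwt(frame)
    m = d.mean()
    q = np.round(m / delta)
    if int(q) % 2 != bit:
        q += 1
    return haar_idwt(a, d + (q * delta - m))

def extract_bit(frame, delta=0.05):
    """Blind extraction: recover the parity of the quantized detail mean."""
    _, d = haar_dwt(frame)
    return int(np.round(d.mean() / delta)) % 2
```

Tuning `delta` trades transparency against robustness, mirroring the configurability the abstract describes; a real implementation would embed many bits across subbands rather than one per frame.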
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions, obeying fixed or dynamic sets of rules to determine trading orders. It has grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets - stochastic processes, stochastic integration, and basic models for price and spread dynamics - necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time: we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is, on the other hand, fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of the technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
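The market-neutral (pairs trading) idea described above can be sketched in a few lines. The thesis works in MATLAB; the Python sketch below is an illustrative stand-in, with a rolling OLS hedge ratio and z-score entry/exit thresholds whose values are assumptions, not the thesis's calibrated parameters.

```python
import numpy as np

def pairs_signal(pa, pb, window=60, entry_z=2.0, exit_z=0.5):
    """Mean-reversion positions on the spread of two related price series.

    Returns positions in {-1, 0, +1}; +1 means long the spread
    (long A, short beta units of B). All parameter values are illustrative.
    """
    pa, pb = np.asarray(pa, float), np.asarray(pb, float)
    pos = np.zeros(len(pa))
    for t in range(window, len(pa)):
        a, b = pa[t - window:t], pb[t - window:t]
        beta = np.cov(a, b)[0, 1] / np.var(b, ddof=1)   # rolling OLS hedge ratio
        spread = a - beta * b
        z = (pa[t] - beta * pb[t] - spread.mean()) / spread.std()
        if pos[t - 1] == 0 and z > entry_z:
            pos[t] = -1            # spread rich: short A, long B
        elif pos[t - 1] == 0 and z < -entry_z:
            pos[t] = +1            # spread cheap: long A, short B
        elif pos[t - 1] != 0 and abs(z) < exit_z:
            pos[t] = 0             # spread has reverted: close
        else:
            pos[t] = pos[t - 1]    # otherwise hold
    return pos
```

A backtest would apply these positions to subsequent spread returns; the co-integration and Ornstein-Uhlenbeck variants mentioned above replace the rolling z-score with a fitted equilibrium model, but the entry/exit logic is the same.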
Abstract:
We performed a comprehensive study to assess the fitness for purpose of four chromatographic conditions for the determination of six groups of marine lipophilic toxins (okadaic acid and dinophysistoxins, pectenotoxins, azaspiracids, yessotoxins, gymnodimine and spirolides) by LC-MS/MS, in order to select the most suitable conditions as stated by the European Union Reference Laboratory for Marine Biotoxins (EURLMB). In every case, the elution gradient was optimized to achieve a total run-time cycle of 12 min. We performed a single-laboratory validation for the analysis of three matrices relevant to the seafood aquaculture industry (mussels, Pacific oysters and clams), and for sea urchins, for which no data on lipophilic toxins had been reported before. Moreover, we compared the method performance under alkaline conditions using two quantification strategies: external standard calibration (EXS) and matrix-matched standard calibration (MMS). Alkaline conditions were the only scenario that allowed detection windows with polarity switching in a 3200 QTrap mass spectrometer; thus, the analysis of all toxins can be accomplished in a single run, increasing sample throughput. The limits of quantification under alkaline conditions met the validation requirements established by the EURLMB for all toxins and matrices, while the remaining conditions failed in some cases. The accuracy of the method and the matrix effects were generally dependent on the mobile phases and the seafood species. MMS had a moderately positive impact on method accuracy for crude extracts, but it showed poor trueness for seafood species other than mussels when analyzing hydrolyzed extracts. Alkaline conditions with EXS and recovery correction for okadaic acid (OA) were selected as the most suitable conditions in the context of our laboratory. This comparative study can help other laboratories choose the best conditions for the implementation of LC-MS/MS according to their own needs.
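The difference between the two quantification strategies comes down to where the calibration standards sit. The sketch below uses invented response values (not data from this study) to show how a matrix that suppresses the analyte signal biases external standard calibration (EXS) low, while matrix-matched calibration (MMS) shares the suppression and compensates for it.

```python
import numpy as np

def quantify(signal, std_conc, std_signal):
    """Concentration of a sample from a linear calibration curve fitted
    to standards (concentration, instrument response) by least squares."""
    slope, intercept = np.polyfit(std_conc, std_signal, 1)
    return (signal - intercept) / slope

# Illustrative numbers: assume the matrix suppresses ionisation by 20%
conc = np.array([0.0, 5.0, 10.0, 20.0])   # standard concentrations, ug/kg
solvent_resp = 100.0 * conc               # EXS standards, in pure solvent
matrix_resp = 0.8 * solvent_resp          # MMS standards, in blank extract

sample_signal = 0.8 * 100.0 * 10.0        # sample: true conc 10, suppressed
exs = quantify(sample_signal, conc, solvent_resp)   # biased low (8.0)
mms = quantify(sample_signal, conc, matrix_resp)    # corrected (10.0)
```

This is why the study's choice of EXS had to be paired with a recovery correction: the external curve alone cannot see species-dependent matrix effects.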
Abstract:
The pace of development of new healthcare technologies and related knowledge is very fast. Implementation of high-quality evidence-based knowledge is thus mandatory to ensure an effective healthcare system and patient safety. However, even though only a small fraction of the approximately 2500 scientific publications indexed daily in Medline is actually useful to clinical practice, the amount of new information is much too large to allow busy healthcare professionals to stay aware of possibly important evidence-based information.
Abstract:
It is well known that multiple-input multiple-output (MIMO) techniques can bring numerous benefits, such as higher spectral efficiency, to point-to-point wireless links. More recently, there has been interest in extending MIMO concepts to multiuser wireless systems. Our focus in this paper is on network MIMO, a family of techniques whereby each end user in a wireless access network is served through several access points within its range of influence. By tightly coordinating the transmission and reception of signals at multiple access points, network MIMO can transcend the limits on spectral efficiency imposed by cochannel interference. Taking prior information-theoretic analyses of network MIMO to the next level, we quantify the spectral efficiency gains obtainable under realistic propagation and operational conditions in a typical indoor deployment. Our study relies on detailed simulations and, for specificity, is conducted largely within the physical-layer framework of the IEEE 802.16e Mobile WiMAX system. Furthermore, to facilitate the coordination between access points, we assume that a high-capacity local area network, such as Gigabit Ethernet, connects all the access points. Our results confirm that network MIMO stands to provide a multiple-fold increase in spectral efficiency under these conditions.
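The spectral-efficiency gains quantified in studies like this one ultimately rest on the MIMO log-det capacity formula. The sketch below evaluates it for an i.i.d. Rayleigh channel with equal power allocation - a deliberately idealized assumption, since the paper relies on detailed 802.16e physical-layer simulations rather than this model.

```python
import numpy as np

rng = np.random.default_rng(0)

def mimo_capacity(H, snr):
    """Spectral efficiency (bit/s/Hz) of one channel realization with equal
    power allocation across Nt antennas: log2 det(I + (snr/Nt) * H H^H)."""
    nr, nt = H.shape
    gram = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
    _, logdet = np.linalg.slogdet(gram)   # stable log|det| for the PSD matrix
    return logdet / np.log(2.0)

def avg_capacity(nr, nt, snr=10.0, trials=1000):
    """Average capacity over i.i.d. Rayleigh-fading channel draws."""
    total = 0.0
    for _ in range(trials):
        H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
        total += mimo_capacity(H, snr)
    return total / trials
```

Under this idealized model a 4x4 link averages several times the spectral efficiency of a 1x1 link at the same SNR, which is the kind of multiple-fold gain the paper verifies under far more realistic interference and propagation conditions.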
Abstract:
Winter maintenance, particularly snow removal and the stress of snow removal materials on public structures, is an enormous budgetary burden on municipalities and nongovernmental maintenance organizations in cold climates. Lately, geospatial technologies such as remote sensing, geographic information systems (GIS), and decision support tools have been providing valuable means of planning snow removal operations. A few researchers have recently used geospatial technologies to develop winter maintenance tools. However, most of these tools, while having the potential to address some of these information needs, are not typically placed in the hands of planners and other interested stakeholders. Most are not constructed with a nontechnical user in mind and lack an easy-to-use, easily understood interface. A major goal of this project was to implement a web-based Winter Maintenance Decision Support System (WMDSS) that enhances the capacity of stakeholders (city/county planners, resource managers, transportation personnel, citizens, and policy makers) to evaluate different procedures for managing snow removal assets optimally. This was accomplished by integrating geospatial analytical techniques (GIS and remote sensing), the existing snow removal asset management system, and web-based spatial decision support systems. The web-based system was implemented using the ESRI ArcIMS ActiveX Connector and related web technologies, such as Active Server Pages, JavaScript, HTML, and XML. Expert knowledge on snow removal procedures was gathered and integrated into the system in the form of encoded business rules using Visual Rule Studio. The system developed not only manages resources but also provides expert advice to assist complex decision making, such as routing, optimal resource allocation, and monitoring live weather information.
This system was developed in collaboration with Black Hawk County, IA, the city of Columbia, MO, and the Iowa Department of Transportation. The product was also demonstrated for these agencies to improve the usability and applicability of the system.
Qualitative analysis of pharmacists' experiences during the implementation of a medication adherence