825 results for activity-based management
Abstract:
The purpose of this study was to identify whether the activity modeling framework supports problem analysis and provides a traceable, tangible connection from problem identification through to solution modeling. Validation of the methodology relied on a real problem from a Portuguese teaching syndicate (ASPE) regarding course development and management. The study was carried out with the aim of producing a complete tutorial on how to apply the activity modeling framework to a real-world problem. For each step of activity modeling, we provided a summary of the relevant elements required to perform it, pointed out some improvements, and applied it to ASPE's real problem. It was found that activity modeling supports well-structured problem analysis and provides a guiding thread between problem and solution modeling. It was concluded that activity-based task modeling is key to shortening the gap between problem and solution. The results revealed that the solution obtained using the activity modeling framework addressed our customer's core concerns and allowed them to enhance the quality of their course development and management. The principal conclusion was that activity modeling is a well-defined methodology that supports software engineers in problem analysis while maintaining a traceable link between problem and solution.
Abstract:
Aim. The purpose of this study was to develop and evaluate a computer-based, dietary, and physical activity self-management program for people recently diagnosed with type 2 diabetes.
Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)), measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups. Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control; mean age 59 (SD) years). At completion there was a significant between-group difference in the "knowledge and beliefs" scale of the DOQ. Two-thirds of the intervention group rated the program as good or very good, 92% would recommend it to others, and 96% agreed that the information within the program was clear and easy to understand.
Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management.
Abstract:
Cost systems have developed considerably in recent years, and activity-based costing (ABC) has been shown to contribute to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported on the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils, as they are obliged to calculate the cost of the services subject to taxation (e.g. waste collection). City councils must have a cost system in place to calculate the cost of services, as they are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish public sector. Through semi-structured interviews with 28 subjects within one city council, it presents a case study of cost management. The paper contains extracts from the interviews, and a number of factors are identified which contributed to the successful development of the cost management system. Following the case study, a number of other city councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding and political incentives as factors which influenced system success or failure.
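The ABC mechanics underlying such municipal cost systems can be sketched in a few lines: trace resource costs to activities, compute a cost-driver rate per activity, then charge each cost object according to its driver consumption. A minimal sketch; all activities, cost figures, and driver volumes below are hypothetical.

```python
# Illustrative activity-based costing for a waste-collection service.
# All numbers are hypothetical.

# Step 1: cost assigned to each activity (activity cost pools).
activity_costs = {"collection_rounds": 120_000, "vehicle_maintenance": 30_000}

# Step 2: total volume of each activity's cost driver
# (number of rounds, maintenance hours).
driver_volumes = {"collection_rounds": 4_000, "vehicle_maintenance": 500}

# Step 3: cost-driver rate = activity cost / driver volume.
rates = {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}

# Step 4: charge a cost object (one district's service) by its
# consumption of each driver.
consumption = {"collection_rounds": 260, "vehicle_maintenance": 40}
district_cost = sum(rates[a] * consumption[a] for a in consumption)
print(district_cost)  # 30/round * 260 + 60/hour * 40
```

The taxation constraint described in the abstract then amounts to checking that the fee charged for the service does not exceed `district_cost`.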
Abstract:
It is essential for organizations to compress detailed sets of information into more comprehensible sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information-content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
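The additivity point in chapter 2 can be illustrated numerically: for a right-skewed distribution such as the lognormal, summing the task means yields the mean total cost, while summing the task modes understates it. A minimal sketch, with purely illustrative distribution parameters:

```python
import math

# Ten project tasks whose costs follow right-skewed lognormal
# distributions; the (mu, sigma) parameters are purely illustrative.
tasks = [(0.0, 0.75)] * 10

def lognormal_mean(mu, sigma):
    # E[X] = exp(mu + sigma^2 / 2); expectations are linear, so they add up.
    return math.exp(mu + sigma ** 2 / 2)

def lognormal_mode(mu, sigma):
    # argmax of the density = exp(mu - sigma^2); modes are NOT additive.
    return math.exp(mu - sigma ** 2)

# Correct total estimate: the sum of task means equals the mean total cost.
sum_of_means = sum(lognormal_mean(mu, s) for mu, s in tasks)
# The bias: summing the per-task modes, as the experiments' participants
# effectively did, understates the total.
sum_of_modes = sum(lognormal_mode(mu, s) for mu, s in tasks)

print(sum_of_modes < sum_of_means)  # True: modes underestimate the total
```

With these parameters the mode-based total is less than half the mean-based total, which is the flavor of distortion the chapter reports.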
Abstract:
BACKGROUND & AIMS: Standardized instruments are needed to assess the activity of eosinophilic esophagitis (EoE) and to provide end points for clinical trials and observational studies. We aimed to develop and validate a patient-reported outcome (PRO) instrument and score, based on items that could account for variations in patient assessments of disease severity. We also evaluated relationships between patient assessment of disease severity and EoE-associated endoscopic, histologic, and laboratory findings. METHODS: We collected information from 186 patients with EoE in Switzerland and the United States (69.4% male; median age, 43 y) via surveys (n = 135), focus groups (n = 27), and semistructured interviews (n = 24). Items were generated for the instruments to assess biologic activity based on physician input. Linear regression was used to quantify the extent to which variations in patient-reported disease characteristics could account for variations in patient assessment of EoE severity. The PRO instrument was used prospectively in 153 adult patients with EoE (72.5% male; median age, 38 y), and validated in an independent group of 120 patients with EoE (60.8% male; median age, 40.5 y). RESULTS: Seven PRO factors that are used to assess characteristics of dysphagia, behavioral adaptations to living with dysphagia, and pain while swallowing accounted for 67% of the variation in patient assessment of disease severity. Based on statistical consideration and patient input, a 7-day recall period was selected. Highly active EoE, based on endoscopic and histologic findings, was associated with an increase in patient-assessed disease severity. In the validation study, the mean difference between patient assessment of EoE severity (range, 0-10) and PRO score (range, 0-8.52) was 0.15. CONCLUSIONS: We developed and validated an EoE scoring system based on 7 PRO items that assess symptoms over a 7-day recall period. Clinicaltrials.gov number: NCT00939263.
Abstract:
This research examines the status of logistics and operations management in Finnish and Swedish companies. The empirical data are based on a web-based questionnaire completed in late 2007 and early 2008. Our examination consists of roughly 30 responses from large manufacturing (the best-represented group in our sample), trade, and logistics/distribution companies. Generally, these companies operate in a complex environment, where the number of products, raw materials/components, and suppliers is high. However, companies usually rely on a small number of suppliers per raw material/component (the most frequent answer was two); this was especially the case among Swedish companies and among companies that favoured overseas sourcing. The sample consisted of companies that mostly operate in an international environment and are quite often multinationals. Our survey findings reveal that companies in general have made logistics and information technology part of their strategy process; the use of performance measures as well as system implementations have followed the strategy decisions. On the transportation mode side, we find that road transport dominates all transport flow classes (inbound, internal, and outbound), followed by sea and air. A surprisingly small number of companies use railways, though Swedish companies prefer this mode more than their Finnish counterparts. With respect to operations outsourcing, we found that the more traditional areas of logistics outsourcing are the driving factors in companies' performance measurement priorities. Contrary to previous research, our results indicate that the scope of outsourcing in the logistics/operations management area is not that wide, and companies are not planning to outsource more in the near future. Some support is found for more international operations and increased outsourcing activity.
On the increased time pressure facing companies, we find evidence that local as well as overseas customers expect deliveries within days or weeks, whereas suppliers usually deliver within weeks or months; this mismatch leads to considerable inventory holding. Interestingly, a local versus overseas sourcing strategy does not greatly influence the lead-time performance of those sourcing areas, although a local strategy is considerably better at responding to market changes because of its shorter supply lead times. At the end of our research we conducted a correlation analysis of the Likert-scale items. The analysis shows that seeing logistics as a process rather than a function, applying time-based management, favouring partnerships, and measuring logistics along different performance dimensions lead to the preferred features and performance levels described in the logistics literature.
Abstract:
This paper asks whether school-based management may help reduce risky sexual behavior among teenagers. For this purpose we use student-level data from Bogotá to identify students at Concession Schools (CS), which belong to the public education system but have greater management autonomy at the school level, and to compare them with students in the traditional public education system. We use propensity score matching to obtain a comparable sample of pupils at CS and traditional schools. Our results show that, on average, students from CS do not have a sexual behavior that differs from that of students in traditional public schools, except that boys in CS have a lower probability of being sexually active. However, there are important differences when heterogeneity is considered. For example, we find that CS with higher girl-to-boy ratios have lower teenage pregnancy rates than public schools with similarly high girl-to-boy ratios. We also find that teachers' human capital, the teacher-pupil ratio, and whether the school offers sexual education are related to statistically significant differences between CS and traditional public schools.
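The matching step used here can be sketched in a few lines: each treated pupil is paired with the control pupil whose estimated propensity score is closest (nearest-neighbour matching with replacement), and outcomes are compared across pairs. All scores and outcomes below are hypothetical illustrations, not the paper's data.

```python
# Minimal sketch of propensity score matching; values are hypothetical.
# Each tuple is (estimated propensity score, outcome), where the
# outcome is 1 if the student reports being sexually active.
treated = [(0.30, 0), (0.55, 1), (0.70, 0)]              # Concession School pupils
controls = [(0.25, 1), (0.50, 1), (0.72, 1), (0.90, 0)]  # traditional-school pupils

def nearest_control(p):
    """Control unit whose propensity score is closest to p."""
    return min(controls, key=lambda c: abs(c[0] - p))

# Average treatment effect on the treated (ATT): mean outcome gap
# between each treated unit and its matched control.
att = sum(y - nearest_control(p)[1] for p, y in treated) / len(treated)
print(att)  # negative here: matched CS pupils report being active less often
```

In practice the propensity scores would first be estimated (e.g. by logistic regression of CS enrolment on observables), and matching quality would be checked via covariate balance.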
Abstract:
In the evolution of the strategic disciplines, much of the knowledge produced has been widely diffused by the management consulting industry. But can this sector be regarded as a knowledge-intensive activity based on a true structure of expert knowledge? One way to understand whether the sector can be considered a source of knowledge dissemination is to examine its relationship with the market in terms of knowledge, rather than to see it only as a set of static techniques to be applied, as has most often been the case. This article offers a reflection on the real reasons for the increasing use of management consulting services, while indicating that the field can be a true source of opportunities for academics if study focuses on the establishment and institutionalization of the micropractices (strategy-as-practice) used there and their implications for organizational results.
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, because security services have to cooperate and their configurations must be consistent with each other so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports the convenient definition of an abstract, high-level security policy and provides automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units that more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool's functions and exemplify their application and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
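The refinement idea, deriving each subsystem's configuration from one abstract policy via an object model, can be sketched roughly as follows. The class names, rule fields, and output syntaxes are hypothetical and far simpler than MoBaSeC's; this only illustrates the one-policy-to-many-syntaxes derivation.

```python
# Toy policy refinement in the spirit of model-based management:
# one abstract rule is rendered into each subsystem's own syntax.
# All names, fields, and output formats are hypothetical.
from dataclasses import dataclass

@dataclass
class AbstractRule:
    subject: str   # who is allowed access
    service: str   # which service
    action: str    # "permit" or "deny"

class Firewall:
    def render(self, rule: AbstractRule) -> str:
        # Hypothetical packet-filter directive.
        return f"{rule.action} tcp from {rule.subject} to port {rule.service}"

class WebServer:
    def render(self, rule: AbstractRule) -> str:
        # Hypothetical access-control directive.
        return f"<Location /{rule.service}> Require host {rule.subject} </Location>"

policy = AbstractRule(subject="staff.example.org", service="https", action="permit")
subsystems = [Firewall(), WebServer()]

# Automated derivation: each subsystem produces its own config content,
# keeping the two files consistent with the single high-level policy.
configs = [s.render(policy) for s in subsystems]
for line in configs:
    print(line)
```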
Abstract:
This study explores educational technology and management education by analyzing fidelity in game-based management education interventions. A sample of 31 MBA students was selected to help answer the research question: to what extent do MBA students tend to recognize specific game-based academic experiences, in terms of fidelity, as relevant to their managerial performance? Two distinct game-based interventions (BG1 and BG2) with key differences in fidelity levels were explored: BG1 presented higher physical and functional fidelity levels and lower psychological fidelity levels. Hypotheses were tested with data collected from the participants shortly after their experiences, relating to the overall perceived quality of the game-based interventions. The findings reveal a higher overall perception of quality for BG1: (a) better for testing strategies, (b) offering better business and market models, (c) based on a pace that better stimulates learning, and (d) presenting a fidelity level that better supports real-world performance. This study supports the conclusion that MBA students tend to recognize, to a large extent, that specific game-based academic experiences are relevant and meaningful to their managerial development, mostly when the adopted artifacts have heightened fidelity levels. Agents must be ready and motivated to explore the new, to learn by trial and error, and to learn collaboratively in order to perform.
Abstract:
BACKGROUND & AIMS: Standardized instruments are needed to assess the activity of eosinophilic esophagitis (EoE) and to provide endpoints for clinical trials and observational studies. We aimed to develop and validate a patient-reported outcome (PRO) instrument and score, based on items that could account for variations in patients' assessments of disease severity. We also evaluated relationships between patients' assessment of disease severity and EoE-associated endoscopic, histologic, and laboratory findings. METHODS: We collected information from 186 patients with EoE in Switzerland and the US (69.4% male; median age, 43 years) via surveys (n = 135), focus groups (n = 27), and semi-structured interviews (n = 24). Items were generated for the instruments to assess biologic activity based on physician input. Linear regression was used to quantify the extent to which variations in patient-reported disease characteristics could account for variations in patients' assessment of EoE severity. The PRO instrument was prospectively used in 153 adult patients with EoE (72.5% male; median age, 38 years), and validated in an independent group of 120 patients with EoE (60.8% male; median age, 40.5 years). RESULTS: Seven PRO factors that are used to assess characteristics of dysphagia, behavioral adaptations to living with dysphagia, and pain while swallowing accounted for 67% of the variation in patients' assessment of disease severity. Based on statistical considerations and patient input, a 7-day recall period was selected. Highly active EoE, based on endoscopic and histologic findings, was associated with an increase in patient-assessed disease severity. In the validation study, the mean difference between patient assessment of EoE severity and PRO score was 0.13 (on a scale from 0 to 10). CONCLUSIONS: We developed and validated an EoE scoring system based on 7 PRO items that assesses symptoms over a 7-day recall period. Clinicaltrials.gov number: NCT00939263.
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data in the form of time-ordered series of events. Such applications also need to handle large volumes of unexpected events, often modified on the fly, containing conflicting information, and dealing with rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event-synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and detection of composite events. This belief value is generated by a consensus among participating entities in a computer network.
The scheme taps into in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
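The consensus belief value attached to a composite event can be illustrated with Dempster's rule of combination from belief (Dempster-Shafer) theory, which the dissertation builds on. Here two network nodes each report a mass function over {event, no_event}, with some mass left on the whole frame to express uncertainty; all mass values are hypothetical.

```python
# Combining two nodes' belief masses about a composite event with
# Dempster's rule of combination; the mass values are hypothetical.

# Mass functions over the frame {event, no_event}; "either" is the
# mass left on the whole frame (explicit uncertainty).
m1 = {"event": 0.6, "no_event": 0.1, "either": 0.3}
m2 = {"event": 0.5, "no_event": 0.2, "either": 0.3}

# Conflict: one node commits to "event" while the other commits
# to "no_event"; this mass is discarded and renormalized away.
conflict = m1["event"] * m2["no_event"] + m1["no_event"] * m2["event"]

def combined(h):
    """Dempster-combined mass for the singleton hypothesis h."""
    mass = m1[h] * m2[h] + m1[h] * m2["either"] + m1["either"] * m2[h]
    return mass / (1 - conflict)

belief_event = combined("event")
print(round(belief_event, 3))  # consensus belief that the event occurred
```

The same rule extends to more than two participants by folding each new mass function into the running combination, which is how a network-wide consensus value can be accumulated in-network.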
Abstract:
Local food diversity and traditional crops are essential for cost-effective management of the global epidemic of type 2 diabetes and associated complications of hypertension. Water and 12% ethanol extracts of native Peruvian fruits such as Lucuma (Pouteria lucuma), Pacae (Inga feuille), Papayita arequipena (Carica pubescens), Capuli (Prunus capuli), Aguaymanto (Physalis peruviana), and Algarrobo (Prosopis pallida) were evaluated for total phenolics, antioxidant activity based on the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging assay, and functionality such as in vitro inhibition of alpha-amylase, alpha-glucosidase, and angiotensin I-converting enzyme (ACE), relevant for the potential management of hyperglycemia and hypertension linked to type 2 diabetes. The total phenolic content ranged from 3.2 (Aguaymanto) to 11.4 (Lucuma fruit) mg/g of sample dry weight. A significant positive correlation was found between total phenolic content and antioxidant activity for the ethanolic extracts. No phenolic compound was detected in Lucuma (fruit and powder) and Pacae. Aqueous extracts from Lucuma and Algarrobo had the highest alpha-glucosidase inhibitory activities. Papayita arequipena and Algarrobo had significant ACE inhibitory activities, reflecting antihypertensive potential. These in vitro results point to the excellent potential of Peruvian fruits for food-based strategies complementing effective antidiabetes and antihypertension solutions, pending further animal and clinical studies.
Abstract:
The progression of rheumatoid arthritis (RA) is quite variable, ranging from very mild or subclinical forms (approx. 10%) to rapidly progressive and debilitating forms (10-15%). The majority of patients present an intermediate course, with episodes of exacerbation separated by periods of relative inactivity, which evolves towards progressive functional loss. To optimise the therapeutic management of early RA it is necessary to perform periodic evaluations of the clinical and laboratory responses to the treatment instituted, as well as of the parameters indicating disease prognosis. Composite measures are frequently used to evaluate disease activity, including the response criteria of the American College of Rheumatology (ACR), the response criteria and the disease activity score (DAS) of the European League Against Rheumatism (EULAR), and the composite indices of disease activity: the DAS, the index of disease activity based on 28 joints (DAS28), the simplified disease activity index (SDAI) and the clinical disease activity index (CDAI). The evaluation of prognosis includes investigating the occurrence or absence of disease remission and of joint damage. Owing to the multifaceted nature of RA, no single clinical or laboratory parameter can satisfactorily describe the level of inflammatory activity or the disease prognosis at any given time.
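The DAS28 mentioned above is a published composite score; its DAS28-ESR variant combines four measures in a fixed formula. A minimal implementation is sketched below; the patient values in the example are hypothetical.

```python
import math

def das28_esr(tjc28, sjc28, esr, gh):
    """DAS28-ESR composite disease activity score.

    tjc28, sjc28: tender and swollen joint counts over 28 joints;
    esr: erythrocyte sedimentation rate (mm/h);
    gh: patient global health assessment on a 0-100 visual analogue scale.
    """
    return (0.56 * math.sqrt(tjc28) + 0.28 * math.sqrt(sjc28)
            + 0.70 * math.log(esr) + 0.014 * gh)

# Hypothetical patient: 4 tender joints, 2 swollen joints,
# ESR 30 mm/h, global health 40/100.
score = das28_esr(4, 2, 30, 40)
# Conventional cut-offs: < 2.6 remission, <= 3.2 low activity,
# > 5.1 high activity; this patient falls in the moderate band.
print(round(score, 2))
```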