923 results for Business impact analysis
Abstract:
Project submitted for the award of the Master's degree in Auditing, supervised by Professor Alcina Augusta Sena Dias.
Abstract:
Dissertation submitted to the Instituto Superior de Contabilidade e Administração do Porto (ISCAP) for the award of the Master's degree in Auditing. Supervising lecturer: Domingos da Silva Duarte, MSc.
Abstract:
The attached document is the pre-print version (the initial version submitted to the publisher).
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standard/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside the enterprise or collaborative network or outside. “Perhaps surprising then is that firms often do not know how to define value, or how to measure it” (Anderson and Narus, 1998, cited by [1]). Woodruff echoed that we need a “richer customer value theory” to provide an “important tool for locking onto the critical things that managers need to know”. In addition, he emphasized that “we need customer value theory that delves deeply into customer’s world of product use in their situations” [2]. In this sense, we proposed and validated a novel “Conceptual Model for Decomposing the Value for the Customer”. To this end, we were aware that time has a direct impact on customer perceived value, and that suppliers’ and customers’ perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break value down into all its components, as well as all the assets built and used (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates the assets used and built, and the tangible and intangible deliverables exchanged among the involved parties, to their actual value perceptions.
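The abstract does not give the model's formulation, but the weighting step can be illustrated. Below is a minimal fuzzy AHP sketch in Python using Buckley's geometric-mean method with triangular fuzzy numbers; the value components and pairwise judgements are purely illustrative assumptions, not taken from the paper.

```python
# Minimal fuzzy AHP sketch (Buckley's geometric-mean method) with triangular
# fuzzy numbers (l, m, u). Components and judgements below are illustrative.
import numpy as np

def fuzzy_ahp_weights(F):
    """F: n x n matrix of triangular fuzzy comparisons, F[i][j] = (l, m, u)."""
    F = np.asarray(F, dtype=float)             # shape (n, n, 3)
    n = F.shape[0]
    r = np.prod(F, axis=1) ** (1.0 / n)        # fuzzy geometric mean per row
    total = r.sum(axis=0)                      # (sum_l, sum_m, sum_u)
    # Fuzzy division (l, m, u) / (L, M, U) ~ (l/U, m/M, u/L).
    w_fuzzy = np.stack([r[:, 0] / total[2],
                        r[:, 1] / total[1],
                        r[:, 2] / total[0]], axis=1)
    w = w_fuzzy.mean(axis=1)                   # centroid defuzzification
    return w / w.sum()

# Hypothetical comparison of three value components: price, quality, relationship.
recip = lambda t: (1 / t[2], 1 / t[1], 1 / t[0])
a12, a13, a23 = (1, 2, 3), (2, 3, 4), (1, 1, 2)
F = [[(1, 1, 1),  a12,        a13],
     [recip(a12), (1, 1, 1),  a23],
     [recip(a13), recip(a23), (1, 1, 1)]]
print(fuzzy_ahp_weights(F))   # three weights summing to 1, largest for price
```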
Abstract:
This paper presents a Multi-Agent Market simulator designed for developing new agent market strategies based on a complete understanding of buyer and seller behaviors, preference models and pricing algorithms, considering user risk preferences and game theory for scenario analysis. The tool studies negotiations based on different market mechanisms and on time- and behavior-dependent strategies. The results of the negotiations between agents are analyzed by data mining algorithms in order to extract rules that give agents feedback for improving their strategies. The system also includes agents that are capable of improving their performance from their own experience, by adapting to market conditions, and of taking other agents' reactions into account.
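As an illustration of the kind of strategy such a simulator exercises, the sketch below implements a simple time-dependent bilateral negotiation between a buyer and a seller agent. The concession model, parameters, and the Agent class are illustrative assumptions, not the simulator's actual strategy models or API.

```python
# Minimal time-dependent bilateral negotiation sketch; the concession model
# and parameters are illustrative assumptions, not the simulator's strategies.
from dataclasses import dataclass

@dataclass
class Agent:
    reservation: float   # worst acceptable price
    target: float        # preferred opening price
    beta: float = 1.0    # concession exponent (>1 concedes late, <1 concedes early)

    def offer(self, t, deadline):
        # Concede from the target toward the reservation price over time.
        frac = (t / deadline) ** self.beta
        return self.target + frac * (self.reservation - self.target)

def negotiate(buyer, seller, deadline=10):
    for t in range(deadline + 1):
        bid, ask = buyer.offer(t, deadline), seller.offer(t, deadline)
        if bid >= ask:                 # buyer's bid meets the seller's ask
            return (bid + ask) / 2, t  # settle at the midpoint
    return None, deadline              # no agreement before the deadline

buyer = Agent(reservation=60.0, target=30.0, beta=2.0)
seller = Agent(reservation=40.0, target=70.0, beta=0.5)
print(negotiate(buyer, seller))        # (agreed price, round) or (None, 10)
```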
Abstract:
Introduction: The present paper deals with the increasing use of corporate merger and acquisition strategies within the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcome of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, which were sampled due to their successful approach to the tactics in question. Materials and Methods: In order to create an overview of the development steps taken through mergers and acquisitions, the historical data of the two corporations was consulted from their official websites. The most relevant events were then associated with adequate information from the financial reports and statements of the two corporations made available by web-based financial data providers. Results and Discussions: In the past few decades Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify product and service portfolios, and survive and surpass competitors. The consequences proved to be positive, although this approach requires a certain availability of capital. Conclusions: The results reveal that, as far as the two sampled companies are concerned, acquisitions and mergers are reactions to the pressure of a highly competitive environment. Moreover, the continuous diversification of the market's needs is also a consistent motive. However, the prevalence and prominence of merger and acquisition strategies are conditioned by the tender offer, the announcer's caliber, the research and development status, and other factors determined by the internal and external actors of the market.
Abstract:
OBJECTIVE: To carry out survey data collection from health care workers in Brazil, Croatia, Poland, Ukraine and the USA with two primary goals: (1) to provide information about which aspects of well-being are most likely to need attention when shiftwork management solutions are being developed, and (2) to explore whether nations are likely to differ with respect to the impacts of night work on the well-being of workers involved in health care work. METHODS: The respondents from each nation were sorted into night worker and non-night worker groups. Worker perceptions of being physically tired, mentally tired, and tense at the end of the workday were examined. Subjective reports of perceived felt age were also studied. For each of these four dependent variables, an ANCOVA analysis was carried out, with hours worked per week, stability of the weekly work schedule, and chronological age as covariates. RESULTS: The results clearly support the general proposal that nations differ significantly in worker perceptions of well-being. In addition, perceptions of physical and mental tiredness at the end of the workday were higher for night workers. For the perception of being physically tired at the end of a workday, the manner and degree to which the night shift impacts workers varies by nation. CONCLUSIONS: Additional research is needed to determine whether the nation and work schedule differences observed are related to differences in job tasks, work schedule structure, off-the-job variables, and/or other worker demographic variables.
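A minimal sketch of the ANCOVA described above, using statsmodels: one dependent variable (perceived physical tiredness), nation and night-work group as factors, and the three covariates. The column names, the input file, and the data layout are assumptions made for illustration.

```python
# Minimal ANCOVA sketch with statsmodels; column names and the input file
# ("shiftwork_survey.csv") are assumptions, not the study's actual data set.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("shiftwork_survey.csv")   # one row per respondent (hypothetical)

# Perceived physical tiredness ~ nation x night-work group, adjusting for the
# three covariates named in the abstract.
model = smf.ols(
    "phys_tired ~ C(nation) * C(night_worker)"
    " + hours_per_week + schedule_stability + age",
    data=df,
).fit()

# Type II sums of squares: each effect adjusted for the others and covariates.
print(anova_lm(model, typ=2))
```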
Abstract:
Lisbon is the largest urban area on the Western European coast. Owing to this geographical position, the Atlantic Ocean serves as an important source of particles and plays an important role in many atmospheric processes. The main objectives of this study were to (1) perform a chemical characterization of particulate matter (PM2.5) sampled in Lisbon, (2) identify the main sources of particles, (3) determine their mass contributions to PM in this urban area, and (4) assess the impact of maritime air mass trajectories on the concentration and composition of respirable PM sampled in Lisbon. During 2007, PM2.5 was collected on a daily basis in the center of Lisbon with a Partisol sampler. The exposed Teflon filters were measured by gravimetry and cut into two parts: one for analysis by instrumental neutron activation analysis (INAA) and the other by ion chromatography (IC). Principal component analysis (PCA) and multilinear regression analysis (MLRA) were used to identify possible sources of PM2.5 and determine their mass contributions. Five main groups of sources were identified: secondary aerosols, traffic, calcium, soil, and sea. Four-day back trajectories ending in Lisbon at the sampling start time were calculated using the HYSPLIT model. Results showed that maritime transport scenarios were frequent. These episodes were characterized by a significant decrease in anthropogenic aerosol concentrations and played a significant role in the air quality of this urban area.
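A minimal sketch of the PCA + MLRA step described above, showing only how factor scores can be regressed on total PM2.5 mass; the file and column names are assumptions, and the full APCS-style apportionment with absolute scores is omitted.

```python
# Minimal PCA + multilinear regression sketch for source apportionment;
# file and column names are assumptions. Only the regression of total PM2.5
# mass on five factor scores is shown (the APCS refinement is omitted).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

data = pd.read_csv("lisbon_pm25_species.csv")    # daily species concentrations
species = data.drop(columns=["pm25_mass"])       # elemental and ionic species
mass = data["pm25_mass"]                         # gravimetric PM2.5 mass

# Standardize the species matrix and keep five components (five source groups).
scores = PCA(n_components=5).fit_transform(
    StandardScaler().fit_transform(species))

# Regress total mass on the factor scores; the coefficients indicate how much
# mass each factor explains per unit score.
reg = LinearRegression().fit(scores, mass)
print(dict(zip([f"factor_{i + 1}" for i in range(5)], reg.coef_.round(2))))
print("R^2:", round(reg.score(scores, mass), 3))
```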
Abstract:
Master's dissertation submitted to the Instituto de Contabilidade e Administração do Porto for the award of the Master's degree in Auditing, under the supervision of Dr. José Campos Amorim.
Abstract:
Supervised by Prof. Cláudia Lopes.
Abstract:
In studies assessing the effect of a given exposure variable on a specific outcome of interest, confounding may create the mistaken impression that the exposure variable is producing the outcome of interest, when in fact the observed effect is due to an existing confounder. However, quantitative techniques are rarely used to determine the potential influence of unmeasured confounders. Sensitivity analysis is a statistical technique that makes it possible to quantitatively measure the impact of an unmeasured confounding variable on the association of interest being assessed. The purpose of this study was to make it feasible to apply, using an electronic spreadsheet, two sensitivity analysis methods available in the literature, developed by Rosenbaum and Greenland. This makes it easier for researchers to include this quantitative tool in the set of procedures commonly used at the result-validation stage.
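A minimal sketch of the kind of calculation such a spreadsheet performs, in the spirit of Greenland-style external adjustment for a single unmeasured binary confounder; the parameter values are illustrative and this is not the paper's exact worksheet.

```python
# Minimal external-adjustment sketch for an unmeasured binary confounder,
# in the spirit of the spreadsheet described above; parameter values are
# illustrative and do not come from the paper.
def adjusted_rr(rr_obs, prev_exposed, prev_unexposed, rr_confounder_outcome):
    """Divide the observed RR by the bias factor implied by the confounder.

    prev_exposed / prev_unexposed: assumed prevalence of the confounder among
    exposed and unexposed; rr_confounder_outcome: assumed confounder-outcome RR.
    """
    bias = ((rr_confounder_outcome * prev_exposed + (1 - prev_exposed)) /
            (rr_confounder_outcome * prev_unexposed + (1 - prev_unexposed)))
    return rr_obs / bias

# Observed RR of 2.0; confounder present in 60% of exposed and 20% of
# unexposed subjects, and itself doubling the risk of the outcome.
print(round(adjusted_rr(2.0, 0.6, 0.2, 2.0), 2))   # -> 1.5
```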
Abstract:
Master's dissertation in Business Management/MBA.
Abstract:
Master's dissertation in Environment, Health and Safety.
Abstract:
Master's degree in Accounting.
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, some endmembers are always incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort the ICA and IFA estimates in terms of the likelihood of being correctly unmixed is also proposed.
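A minimal sketch of the dependence problem discussed above: scikit-learn's FastICA applied to a simulated linear mixture whose abundances sum to one per pixel, so the sources are not independent. The synthetic endmembers, noise level, and problem sizes are arbitrary assumptions, not the paper's data.

```python
# Minimal sketch: FastICA applied to a simulated linear hyperspectral mixture
# whose abundances sum to one per pixel (hence dependent sources). Endmember
# spectra, noise level and sizes are arbitrary assumptions.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_pix, n_bands, n_end = 2000, 50, 3

endmembers = rng.random((n_end, n_bands))                 # synthetic signatures
abundances = rng.dirichlet(np.ones(n_end), size=n_pix)    # sum-to-one sources
X = abundances @ endmembers + 0.01 * rng.standard_normal((n_pix, n_bands))

ica = FastICA(n_components=n_end, random_state=0)
est_sources = ica.fit_transform(X)        # estimated abundance-like sources

# Correlate each estimate with the true abundances: values well below 1
# flag endmembers that the dependence prevented from being unmixed correctly.
corr = np.corrcoef(est_sources.T, abundances.T)[:n_end, n_end:]
print(np.round(np.abs(corr).max(axis=1), 2))
```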