923 results for Business -- Data processing -- Management
Abstract:
This paper examines the international diffusion of one business practice, project management, through the prism of prior literature and data on the diffusion of ISO 9000. The study took an inductive approach, building theory through the iterative collection and analysis of quantitative and qualitative data. The findings problematise the central position accorded to the S-curve model and neo-institutional theory in explaining technology diffusion. The research posits three distinct processes driving diffusion: utility, institutional isomorphism, and competitive isomorphism, the last of which operates through three primary mechanisms: competitive imitation, trendslators, and fashion retailers. Contrary to prior literature, national quasi-professional associations are found to be central to the diffusion process, playing a key role in both advocating and containing management technologies.
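For reference, the S-curve model whose centrality is problematised above is conventionally formalised as a logistic curve of cumulative adoption over time; a textbook parameterisation (not taken from this paper) is:

```latex
% Logistic (S-curve) diffusion model: N(t) is cumulative adoption at time t,
% K the saturation level, r the growth rate, t_0 the inflection point.
N(t) = \frac{K}{1 + e^{-r\,(t - t_0)}}
```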
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The process of building Data Warehouses (DW) is well known, with well-defined stages, but it is still mostly carried out manually by IT people in conjunction with business people. Web Warehouses (WW) are DW whose data sources are taken from the web. We define a flexible WW that can be configured for different domains through the selection of web sources and the definition of data-processing characteristics. A Business Process Management (BPM) System allows Business Processes (BPs) to be modeled and executed, providing support for process automation. To support the building of flexible WW we propose BPs at two levels: a configuration process that supports the selection of web sources and the definition of schemas and mappings, and a feeding process that takes the resulting configuration and loads the data into the WW. In this paper we present a proof of concept of both processes, with a focus on the configuration process and the data it defines.
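As an illustration of the two-level design, the sketch below separates a one-off configuration process from a recurring feeding process that consumes its output. All names (WWConfiguration, fetch, store) are hypothetical; the paper's actual BPM models are not reproduced here.

```python
# Hypothetical sketch of the two-level BP design: a one-off configuration
# process produces an artifact that the recurring feeding process consumes.
from dataclasses import dataclass, field


@dataclass
class WWConfiguration:
    """Output of the configuration process: sources plus schema mappings."""
    sources: list[str] = field(default_factory=list)        # selected web sources
    mappings: dict[str, str] = field(default_factory=dict)  # source field -> WW column


def configuration_process(candidate_sources: list[str],
                          proposed_mappings: dict[str, str]) -> WWConfiguration:
    # In a real BPM system these steps would be human and service tasks;
    # here they are plain function calls.
    selected = [s for s in candidate_sources if s.startswith("http")]
    return WWConfiguration(sources=selected, mappings=proposed_mappings)


def feeding_process(config: WWConfiguration, fetch, store) -> int:
    """Recurring process: extract from each configured source, apply the
    mappings, and load rows into the warehouse. Returns rows loaded."""
    loaded = 0
    for source in config.sources:
        for record in fetch(source):                  # extraction step
            row = {config.mappings[k]: v
                   for k, v in record.items()
                   if k in config.mappings}           # transformation step
            store(row)                                # load step
            loaded += 1
    return loaded
```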
Abstract:
Beef businesses in northern Australia face increasing pressure to remain productive and profitable amid challenges such as climate variability and poor financial performance over the past decade. Declining terms of trade, limited recent gains in on-farm productivity, and low profit margins under current management systems and climatic conditions leave little capacity for businesses to absorb climate change-induced losses. To generate a whole-of-business focus on management change, the Climate Clever Beef project in the Maranoa-Balonne region of Queensland trialled the use of business analysis with beef producers to improve financial literacy, provide a greater understanding of current business performance, and initiate changes to current management practices. Demonstration properties were engaged and a systematic approach was used to assess current business performance, evaluate the impacts of management changes on the business, trial practices, and promote successful outcomes to the wider industry. The focus was on improving financial literacy skills, understanding the business’s key performance indicators, and modifying practices to improve both productivity and profitability. To best achieve the desired outcomes, several extension models were employed: the ‘group facilitation/empowerment model’, the ‘individual consultant/mentor model’ and the ‘technology development model’. Providing producers with a whole-of-business approach, and using business analysis in conjunction with on-farm trials and various extension methods, proved a successful way to encourage producers in the region to adopt new practices in the areas of greatest impact. The areas targeted for development generally led to improvements in animal performance and grazing land management, further improving the prospects for climate resilience.
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions, and this multiplicity and diversity of resources and applications render administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques to substantially reduce data center management complexity. We specifically addressed two crucial data center operations: first, precisely estimating the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment; second, proposing a systematic process for efficiently allocating physical resources to hosted VMs in a data center. To realize these dual objectives, it is vital to accurately capture the effects of resource allocations on application performance. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need; service providers can offer a new charging model based on the VMs’ performance instead of their configured sizes, so that clients pay exactly for the performance they actually experience; and administrators can maximize their total revenue by exploiting application performance models and SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications, and we proposed and evaluated modeling optimizations needed to improve prediction accuracy with these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue of a data center.
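A minimal sketch of the modeling-plus-sizing idea, with scikit-learn's SVR standing in for the thesis's tuned models; the feature set, training numbers, cost proxy, and SLA target are all illustrative assumptions:

```python
# Sketch: learn application performance as a function of resource allocation,
# then size a VM as the cheapest allocation whose predicted performance
# meets the SLA. All numbers are illustrative.
import numpy as np
from sklearn.svm import SVR

# Training data: (cpu_shares, memory_gb) -> observed throughput (req/s).
X = np.array([[1, 1], [1, 2], [2, 2], [2, 4], [4, 4], [4, 8], [8, 8]], dtype=float)
y = np.array([120, 150, 260, 300, 520, 580, 900], dtype=float)

model = SVR(kernel="rbf", C=100.0, epsilon=5.0).fit(X, y)

def size_vm(sla_throughput: float) -> tuple[float, float] | None:
    """Return the cheapest (cpu, mem) grid point predicted to meet the SLA."""
    candidates = [(c, m) for c in (1, 2, 4, 8) for m in (1, 2, 4, 8)]
    # Order by a simple linear cost proxy; a provider would use real prices.
    for cpu, mem in sorted(candidates, key=lambda a: a[0] + 0.25 * a[1]):
        if model.predict([[cpu, mem]])[0] >= sla_throughput:
            return cpu, mem
    return None  # no feasible size on this grid

print(size_vm(500.0))
```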
Abstract:
This thesis builds a framework for evaluating downside risk from multivariate data via a special class of risk measures (RM). The distinctive feature of the analysis is that it dispenses with strong distributional assumptions about the data and orients itself towards the data most critical in risk management: those with asymmetries and heavy tails. At the same time, under typical assumptions such as ellipticity of the data's probability distribution, conformity with classical methods is shown. The constructed class of RM is a multivariate generalization of the coherent distortion RM, which possesses properties valuable to a risk manager. The framework has two parts. The first contains new computational geometry methods for high-dimensional data; the developed algorithms demonstrate the computability of the geometric concepts used to construct the RM, and these concepts lend themselves to visualization and simplify the interpretation of the RM. The second develops models for applying the framework to practical problems, with applications ranging from robust portfolio selection to broader areas such as stochastic conic optimization with risk constraints and supervised machine learning.
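For orientation, the classical univariate coherent distortion RM that this class generalizes can be written in quantile form (a standard textbook definition, not the thesis's multivariate construction). For a loss X with distribution function F_X and a concave distortion g : [0,1] -> [0,1] with g(0) = 0 and g(1) = 1:

```latex
% Distortion risk measure in quantile form; coherence holds when g is concave.
\rho_g(X) = \int_0^1 F_X^{-1}(1-u)\, \mathrm{d}g(u)
```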
Abstract:
It is well known that human resources play a valuable role in sustainable organizational development. This work therefore focuses on the development of a decision support system to assess workers’ satisfaction based on factors related to human resources management practices. The framework is built on top of a Logic Programming approach to Knowledge Representation and Reasoning, complemented with a Case Based approach to computing. The proposed solution is unusual in that it caters for the explicit treatment of incomplete, unknown, or even self-contradictory information, in both qualitative and quantitative settings. Furthermore, clustering methods based on similarity analysis among cases are used to distinguish and aggregate collections of historical data or knowledge in order to reduce the search space, thereby enhancing case retrieval and the overall computational process.
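A minimal sketch of the clustering-based retrieval step, with scikit-learn's KMeans standing in for the similarity analysis; the feature vectors are synthetic and the Logic Programming layer is not shown:

```python
# Sketch: cluster historical cases so that retrieval only scans the
# cluster nearest to the query, shrinking the search space.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cases = rng.random((200, 5))            # 200 historical cases, 5 HR features
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(cases)

def retrieve(query: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k cases most similar to the query,
    searching only inside the query's cluster."""
    cluster = kmeans.predict(query.reshape(1, -1))[0]
    members = np.flatnonzero(kmeans.labels_ == cluster)
    dists = np.linalg.norm(cases[members] - query, axis=1)
    return members[np.argsort(dists)[:k]]

print(retrieve(rng.random(5)))
```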
Abstract:
In this work, an algorithm to compute the envelope of non-destructive testing (NDT) signals is proposed. The method increases processing speed and reduces memory usage in extensive data processing, and it has the further advantage of preserving the signal information needed for physical modelling of time-dependent measurements. The algorithm is conceived for analysing data from non-destructive testing. A comparison between different envelope methods and the proposed method, applied to Magnetic Barkhausen Noise (MBN) signals, is presented. (C) 2010 Elsevier Ltd. All rights reserved.
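The paper's own algorithm is not reproduced in the abstract; as a point of reference, a standard Hilbert-transform envelope (one of the classical methods such a comparison would include), together with a cheap block-maximum variant in the spirit of the claimed speed and memory savings, might look like this sketch:

```python
# Sketch: classic analytic-signal envelope via the Hilbert transform,
# a common baseline for envelope extraction on NDT/MBN-like signals.
import numpy as np
from scipy.signal import hilbert

fs = 10_000                                # sampling rate (Hz), illustrative
t = np.arange(0, 0.1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) * np.sin(2 * np.pi * 1_000 * t)  # AM burst

envelope = np.abs(hilbert(signal))         # magnitude of the analytic signal

# A cheap alternative in the spirit of reduced memory/CPU cost:
# keep only the block-wise maxima of the rectified signal.
block = 100                                # samples per block, illustrative
decimated_env = np.abs(signal).reshape(-1, block).max(axis=1)
print(envelope.shape, decimated_env.shape)
```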
Abstract:
In this work, a system that uses active RFID tags to supervise truck bulk cargo is described. The tags are attached to the truck bodies, and readers are distributed across the cargo buildings and attached to the weighbridges and discharge platforms. PDAs with cameras and WiFi support are provided to the inspectors, and access points are installed throughout the discharge area to allow effective confirmation of unloading actions and the acquisition of pictures for future audit. Broadband radio equipment establishes efficient communication links between the weighbridges and the cargo buildings, which are usually located far from each other in the field. A web application was developed specifically to enable robust communication between the devices for efficient device management, data processing, and report generation for the operating personnel. The system was deployed at a cargo station of a Brazilian seaport, and the results obtained demonstrate its effectiveness.
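To make the data flow concrete, here is a hypothetical sketch of the unload-confirmation record that an inspector's PDA could send to the web application; every field name is an assumption, not the deployed system's schema:

```python
# Hypothetical unload-confirmation event, serialized as JSON for the
# web application; the deployed system's actual schema is not published.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class UnloadConfirmation:
    tag_id: str            # active RFID tag on the truck body
    weighbridge_kg: float  # reading taken at the weighbridge
    platform_id: str       # discharge platform where unloading occurred
    photo_ref: str         # key of the audit picture taken by the PDA
    inspector_id: str
    timestamp: str

event = UnloadConfirmation(
    tag_id="TAG-00042", weighbridge_kg=28_450.0, platform_id="P-3",
    photo_ref="img/2024/000042.jpg", inspector_id="INS-07",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(json.dumps(asdict(event)))  # payload the PDA would POST over WiFi
```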
Abstract:
In this paper, based on the results of the Global Leadership and Organizational Behavior Effectiveness (GLOBE) 61-nation study of culture and leadership, we present findings related to three ‘clusters’ of countries: (1) the ‘Anglo culture’ cluster (Australia, Canada, Ireland, New Zealand, white South Africa, the UK, and the USA); (2) the ‘Southern Asia’ cluster (Iran, India, Thailand, Malaysia, Indonesia, and the Philippines); and (3) the ‘Confucian Asia’ cluster (China, Hong Kong, Japan, Singapore, South Korea, and Taiwan). Data from the GLOBE study, reporting middle managers’ perceptions of societal practices and values and of the factors that facilitate and inhibit effective leadership, are compared across the three clusters. Results demonstrate that, despite differences in cultures, and especially in cultural values, perceptions of effective leadership vary substantially only in the extent to which participation is seen to facilitate leadership: in the Anglo cluster, participative leadership is seen as much more facilitative than in either of the Asian clusters. Results are discussed in terms of effective leadership styles for management in the twenty-first century, when Asian economies are likely to play a more dominant role than they have in recent history.
Abstract:
Master's degree in Informatics Engineering
Abstract:
Master's dissertation, International Tourism Management, 3 December 2015, Universidade dos Açores.
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed with the calcium score (CS) application available on a computed tomography (CT) scanner, which gives an accurate indication of the development of the disease; however, the ionising radiation dose applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semiautomated post-processing is used are shown, and the epidemiology, imaging, risk factors, and prognosis of the disease are described. The software steps and the values that allow the risk of developing CAD to be predicted are presented. A 64-row dual-source multidetector CT scanner and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced, and two measurements were obtained with each of the three experimental protocols (64, 128, and 256 mAs). CS values changed considerably as the protocol varied. The predefined standard protocol delivered the lowest radiation dose (0.43 mGy). The study found that the variation in radiation dose between protocols, taking into account the dose control systems built into the CT equipment and the image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
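The Agatston method evaluated above is itself a simple weighted sum over calcified lesions; a sketch using the standard published density weighting (lesion data purely illustrative):

```python
# Sketch of the standard Agatston calcium score: each calcified lesion
# contributes its area (mm^2) times a density weight derived from the
# lesion's peak attenuation in Hounsfield units (HU).
def density_weight(peak_hu: float) -> int:
    if peak_hu < 130:
        return 0      # below the calcium threshold, not scored
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions: list[tuple[float, float]]) -> float:
    """lesions: (area_mm2, peak_hu) per lesion, summed over all slices."""
    return sum(area * density_weight(hu) for area, hu in lesions)

# Illustrative lesions only; real input comes from segmented CT slices.
print(agatston_score([(4.2, 180.0), (10.5, 350.0), (2.0, 520.0)]))
```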
Abstract:
Master's degree in Socio-Organizational Intervention in Health - Specialization area: Health Services Administration and Management Policies