939 results for process model collection


Relevance: 40.00%

Abstract:

In the new age of information technology, big data has become a prominent phenomenon. As information technology evolves, organizations have begun to adopt big data and apply it as a tool throughout their decision-making processes. Research on big data has grown in recent years, but mainly from a technical stance, leaving a void in business-related cases. This thesis fills that gap by addressing big data challenges and failure cases. The Technology-Organization-Environment (TOE) framework was applied to carry out a literature review on trends in Business Intelligence and Knowledge Management information system failures. A review of extant literature was carried out using a collection of leading information systems journals. Academic papers and articles on big data, Business Intelligence, Decision Support Systems, and Knowledge Management systems were studied from both failure and success perspectives in order to build a model for big data failure. The contribution of the information system failure literature is delineated, as it provides the principal dynamics behind the TOE framework. The gathered literature was then categorized, and a failure model was developed from the identified critical failure points. The failure constructs were further categorized, defined, and tabulated into a contextual diagram. The developed model and table are designed to act as a comprehensive starting point and as general guidance for academics, CIOs and other system stakeholders, facilitating decision-making in the big data adoption process by measuring the effect of technological, organizational, and environmental variables on perceived benefits, dissatisfaction, and discontinued use.


Relevance: 40.00%

Abstract:

The Finnish legislation requires a safe and secure learning environment. However, comprehensive, risk-based safety and security management (SSM) and management commitment to its implementation and development are not mentioned in the legislation. Multiple institutions, operators and researchers have studied and developed safety and security in educational institutions over the past decade, but the approach has typically been fragmented, without emphasizing the importance of comprehensive SSM. The development needs of safety and security operations in universities have been studied; however, in universities of applied sciences (UASs) and elementary schools (ESs), the performance level, strengths and weaknesses of comprehensive SSM have not. The objective of this study was to develop the comprehensive, risk-based SSM of educational institutions by developing the new Asteri consultative auditing process and studying its effects on auditees. Furthermore, the performance level of comprehensive SSM in UASs and ESs was studied using Asteri and the TUTOR model developed by the Keski-Uusimaa Department for Rescue Services. In addition, strengths, development needs and differences were identified. In total, 76 educational institutions were audited between 2011 and 2014. The study is based on logical empiricism, and an observational applied research design was used. Auditing, observation and an electronic survey were used for data collection. Statistical analysis was used to analyze the collected information, and thematic analysis was used to analyze the development areas mentioned by the survey respondents. As one of its main contributions, this research presents the new Asteri consultative auditing process; organizations with low performance levels on the audited subject benefit the most from it.
Asteri may be usable in many different types of audits, not only SSM audits. As a further result, this study provides new knowledge on attitudes related to auditing. According to the findings, auditing may generate negative attitudes, which the auditor should take into account when planning and preparing for audits. Negative attitudes can be compensated for by bringing added value, objectivity and positivity to the audit, thus improving the positive effects of auditing on knowledge and skills. Moreover, as the results of this study show, auditing safety and security issues does not increase feelings of insecurity, but rather increases feelings of safety and security when the new Asteri consultative auditing process is used with the TUTOR model. The results showed that SSM in the audited UASs was statistically significantly more advanced than in the audited ESs. However, there is still room for improvement in both, as the approach to SSM was fragmented. It can be assumed that the majority of Finnish UASs and ESs do not meet the basic level of comprehensive, risk-based SSM.

Relevance: 40.00%

Abstract:

This thesis was conducted as a case study on assignment for a multinational chemical corporation. The purpose of this study is to find ways to improve the purchasing process for small purchases at the case company, primarily through cost and time savings. The purchasing process starts from the requisition of goods or services and ends when the invoice is paid. In this thesis, purchases with a value of less than 1,000 € are considered small. The theoretical framework consists of a general view of the costs and performance of the purchasing process, different types of purchasing processes, and a model for improving purchasing processes. The categorization into small and large purchases is the most important, followed by the division between direct and indirect purchases. Models that provide a more strategic perspective for categorization were also found useful. Auditing and managerial control are important parts of the purchasing process. Considering the transaction costs of purchasing from a costs-benefits perspective, large and small purchases should not share the same process. Purchasing cards, e-procurement and vendor-managed inventory are seen as alternatives to the traditional purchasing process. The empirical data were collected by interviewing company employees who take part in the purchasing process in their daily work. The interviews had open-ended questions, and the answers were coded and analyzed. The results consist of a process description and assessment as well as suggestions for potential improvements. At the case company, the basic purchasing process was similar to the traditional purchasing process, carried out entirely with computers and online. For some categories, more sophisticated e-procurement solutions were already in use.
To improve the current e-procurement-based solutions, eliminating the authorization workflow and improving information exchange can be seen as potential improvements for most of the case purchases. Purchasing cards and a lightweight form of vendor-managed inventory can be seen as potential improvements for some categories. Implementing the changes incurs at least some cost, and the benefits might be hard to measure. This thesis has shown that small purchases have potential for significant cost and time savings at the case company.

Relevance: 40.00%

Abstract:

The purpose of this work was to establish a taxonomy of handmade model construction as a platform for projecting an operative method in architecture. A broad model production in the work of ARX was therefore studied and catalogued in a systematic way. A wide range of families and sub-families of models was found, with different purposes according to each phase of development, from exploratory steps towards a new possible configuration to detailed, refined decisions. The most relevant characteristics this working method revealed were the grounds it offers for personal reflection and open discussion on project method, its flexibility in space modeling, its accuracy in representing real construction situations, and its constant, stimulating openness to new suggestions. This research supported a meta-reflection on the method, creating an awareness of processes that are intended to become an autonomous language, knowledge that may be useful to those who intend to implement a haptic modus operandi in the work of an architectural project.

Relevance: 40.00%

Abstract:

Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Quantifying model prediction uncertainty is desirable for informed decision-making in river-systems management. An uncertainty analysis of a process-based model, the integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary on the England-Wales border. Daily discharge and monthly phosphorus (total reactive and total) data, for a limited number of reaches, are used to assess the uncertainty and sensitivity of 44 model parameters identified as most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated via land-use-type fractional areas) can achieve better model fits than a previous expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash-Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets that simultaneously describe all observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there is not sufficient data to support their many parameters.
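The GLUE screening step described above (retain only "behavioural" parameter sets whose simulations clear the Nash-Sutcliffe threshold of 0.3) can be sketched in a few lines. This is an illustrative sketch, not INCA-P or GLUE toolbox code; the function names are assumptions.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def behavioural_sets(param_sets, simulate, obs, threshold=0.3):
    """GLUE-style screening: keep only parameter sets whose simulation clears the threshold."""
    return [p for p in param_sets if nash_sutcliffe(obs, simulate(p)) >= threshold]
```

For example, with a toy simulator that offsets the observations by a constant bias, `behavioural_sets` retains near-unbiased parameter values and discards strongly biased ones.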

Relevance: 40.00%

Abstract:

The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models, and aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration; its optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield; the correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for investigating the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
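The two skill scores quoted for the all-India aggregation (correlation coefficient of 0.76, RMSE of 8.4% of mean yield) are standard measures and can be sketched as follows; the function name is an assumption for illustration, not part of GLAM:

```python
import numpy as np

def yield_skill(obs, sim):
    """Correlation coefficient and RMSE expressed as a percentage of the mean observed yield."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    rmse_pct = 100.0 * np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean()
    return r, rmse_pct
```

Applied to a time series of annual observed and simulated yields, this returns the two numbers reported per grid cell or for the aggregated national series.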

Relevance: 40.00%

Abstract:

We compared output from 3 dynamic process-based models (DMs: ECOSSE, MILLENNIA and the Durham Carbon Model) and 9 bioclimatic envelope models (BCEMs; including BBOG ensemble and PEATSTASH) ranging from simple threshold to semi-process-based models. Model simulations were run at 4 British peatland sites using historical climate data and climate projections under a medium (A1B) emissions scenario from the 11-RCM (regional climate model) ensemble underpinning UKCP09. The models showed that blanket peatlands are vulnerable to projected climate change; however, predictions varied between models as well as between sites. All BCEMs predicted a shift from presence to absence of a climate associated with blanket peat, where the sites with the lowest total annual precipitation were closest to the presence/absence threshold. DMs showed a more variable response. ECOSSE predicted a decline in net C sink and shift to net C source by the end of this century. The Durham Carbon Model predicted a smaller decline in the net C sink strength, but no shift to net C source. MILLENNIA predicted a slight overall increase in the net C sink. In contrast to the BCEM projections, the DMs predicted that the sites with coolest temperatures and greatest total annual precipitation showed the largest change in carbon sinks. In this model inter-comparison, the greatest variation in model output in response to climate change projections was not between the BCEMs and DMs but between the DMs themselves, because of different approaches to modelling soil organic matter pools and decomposition amongst other processes. The difference in the sign of the response has major implications for future climate feedbacks, climate policy and peatland management. Enhanced data collection, in particular monitoring peatland response to current change, would significantly improve model development and projections of future change.

Relevance: 40.00%

Abstract:

In Part I of this study it was shown that moving from a moisture-convergent to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden-Julian Oscillation (MJO) in the ECMWF model. However, traditional MJO diagnostics were not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics is applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid troposphere. Due to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between the convective heating and the large-scale wave forcing associated with the MJO.

Relevance: 40.00%

Abstract:

A process-based fire regime model (SPITFIRE) has been developed, coupled with ecosystem dynamics in the LPJ Dynamic Global Vegetation Model, and used to explore fire regimes and the current impact of fire on the terrestrial carbon cycle and associated emissions of trace atmospheric constituents. The model estimates an average release of 2.24 Pg C yr⁻¹ as CO2 from biomass burning during the 1980s and 1990s. Comparison with observed active fire counts shows that the model reproduces where fire occurs and can mimic broad geographic patterns in the peak fire season, although the predicted peak is 1-2 months late in some regions. Modelled fire season length is generally overestimated by about one month, but shows a realistic pattern of differences among biomes. Comparisons with remotely sensed burnt-area products indicate that the model reproduces broad geographic patterns of annual fractional burnt area over most regions, including the boreal forest, although interannual variability in the boreal zone is underestimated.
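The peak-season timing and season-length comparisons described above can be sketched with monthly climatologies. The numbers below are made-up illustrative values, not SPITFIRE output or satellite fire counts:

```python
import numpy as np

# Hypothetical monthly fire-activity climatologies (Jan..Dec) for one region.
observed = np.array([1, 2, 4, 9, 14, 20, 18, 12, 6, 3, 2, 1], float)
modelled = np.array([1, 1, 3, 6, 10, 16, 20, 15, 8, 4, 2, 1], float)

# Months by which the modelled peak trails the observed peak (cyclic over 12 months).
peak_lag = (int(np.argmax(modelled)) - int(np.argmax(observed))) % 12

def season_length(x, frac=0.5):
    """Fire season length: number of months at or above a fraction of the peak rate."""
    return int(np.sum(x >= frac * x.max()))
```

With these toy climatologies the modelled peak lags the observed peak by one month, the kind of offset the comparison with active fire counts reports.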