18 results for business process reengineering
in Digital Commons at Florida International University
Abstract:
The total time a customer spends in the business process system, called the customer cycle-time, is a major contributor to overall customer satisfaction. Business process analysts and designers are frequently asked to design process solutions with optimal performance. Simulation models have been popular for quantitatively evaluating business processes; however, simulation is time-consuming and requires extensive modeling experience. Moreover, simulation models neither provide recommendations nor yield optimal solutions for business process design. A queueing network model is a good analytical approach to business process analysis and design, and can provide a useful abstraction of a business process. However, existing queueing network models were developed for telephone systems or applied to manufacturing processes in which machine servers dominate the system. In a business process, the servers are usually people, and the queueing model should account for the characteristics of human servers, namely specialization and coordination. The research described in this dissertation develops an open queueing network model for quick analysis of business processes. Additionally, optimization models are developed to provide optimal business process designs. The queueing network model extends and improves upon existing multi-class open queueing network (MOQN) models so that customer flow in human-server oriented processes can be modeled. The optimization models help business process designers find the optimal design of a business process with consideration of specialization and coordination. The main findings of the research are as follows. First, parallelization can reduce the cycle-time for those customer classes that require more than one parallel activity; however, under highly utilized servers the coordination time introduced by parallelization overwhelms these savings, because waiting time increases significantly and the cycle-time rises. Second, the level of industrial technology employed by a company and the coordination time needed to manage tasks have the strongest impact on business process design: when the level of industrial technology is high, more division is required to improve the cycle-time; when the required coordination time is high, consolidation is required to improve the cycle-time.
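As a rough illustration of the kind of closed-form cycle-time estimate an open queueing network enables, the sketch below sums M/M/c station delays (Erlang C) along a hypothetical three-step process. It assumes a single customer class with Poisson arrivals and exponential service, which is far simpler than the dissertation's multi-class MOQN extension; all station names and rates are invented.

```python
from math import factorial

def mmc_cycle_time(arrival_rate, service_rate, servers):
    """Expected time in system (wait + service) for an M/M/c station."""
    rho = arrival_rate / (servers * service_rate)            # utilization
    if rho >= 1.0:
        raise ValueError("Station is unstable: utilization >= 1")
    a = arrival_rate / service_rate                          # offered load (Erlangs)
    # Erlang C: probability an arriving customer must wait
    num = a ** servers / (factorial(servers) * (1 - rho))
    denom = sum(a ** k / factorial(k) for k in range(servers)) + num
    p_wait = num / denom
    wait = p_wait / (servers * service_rate - arrival_rate)  # expected queueing time
    return wait + 1.0 / service_rate                         # add service time

# Hypothetical three-step process: intake -> review -> approval (rates per hour)
stations = [
    ("intake",   {"arrival_rate": 8.0, "service_rate": 3.0, "servers": 4}),
    ("review",   {"arrival_rate": 8.0, "service_rate": 2.0, "servers": 5}),
    ("approval", {"arrival_rate": 8.0, "service_rate": 6.0, "servers": 2}),
]
total = sum(mmc_cycle_time(**params) for _, params in stations)
print(f"approximate customer cycle-time: {total:.2f} hours")
```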
Abstract:
The emergence of a technology-intensive economy requires the transformation of business models in the hospitality industry. Established companies can face technological, cultural, organizational, and relationship barriers in moving from a traditional business model to an e-business model. The authors suggest that market, learning, and business process orientations at the organizational level can help remove some of these barriers and facilitate the development of e-business within existing organizational infrastructures.
Abstract:
Distributed applications are exposed as reusable components that are dynamically discovered and integrated to create new applications. These new applications, in the form of aggregate services, are vulnerable to failure due to the autonomous and distributed nature of their integrated components. This vulnerability creates the need for adaptability in aggregate services. The need for adaptation is accentuated for complex long-running applications such as those found in scientific Grid computing, where distributed computing nodes may participate to solve computation- and data-intensive problems. Such applications integrate services for coordinated problem solving in areas such as Bioinformatics. For such applications, when a constituent service fails, the application fails, even though other nodes could substitute for the failed service. This concern is not addressed in the specification of high-level composition languages such as the Business Process Execution Language (BPEL). We propose an approach to transparently autonomizing existing BPEL processes in order to make them modifiable at runtime and more resilient to failures in their execution environment. By introducing adaptive behavior transparently, adaptation preserves the original business logic of the aggregate service and does not tangle the code for adaptive behavior with that of the aggregate service. The major contributions of this dissertation are: first, we assessed the effectiveness of BPEL language support in developing adaptive mechanisms; as a result, we identified the strengths and limitations of BPEL and devised strategies to address those limitations. Second, we developed a technique to enhance existing BPEL processes transparently in order to support dynamic adaptation, proposing a framework that uses transparent shaping and generative programming to make BPEL processes adaptive. Third, we developed a technique to dynamically discover and bind to substitute services; our evaluation showed that dynamic utilization of components improves the flexibility of adaptive BPEL processes. Fourth, we developed an extensible policy-based technique to specify how to handle exceptional behavior, along with a generic component that introduces adaptive behavior for multiple BPEL processes. Fifth, we identified ways to apply our work to facilitate adaptability in composite Grid services.
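The dissertation realizes adaptation around BPEL itself; the Python sketch below only illustrates the general idea of policy-driven substitution, in which a failed invocation is transparently retried against substitute endpoints discovered from a registry. ServiceRegistry, the port type name, and the endpoints are all hypothetical.

```python
from typing import Callable, Dict, List

class ServiceRegistry:
    """Hypothetical registry mapping an abstract port type to candidate endpoints."""
    def __init__(self, candidates: Dict[str, List[Callable[[dict], dict]]]):
        self.candidates = candidates

    def substitutes(self, port_type: str) -> List[Callable[[dict], dict]]:
        return self.candidates.get(port_type, [])

def invoke_with_substitution(port_type, request, registry, max_attempts=3):
    """Invoke the first working endpoint for a port type, re-binding on failure."""
    errors = []
    for endpoint in registry.substitutes(port_type)[:max_attempts]:
        try:
            return endpoint(request)          # normal (non-adaptive) invocation path
        except Exception as exc:              # failure triggers dynamic re-binding
            errors.append(exc)
    raise RuntimeError(f"All substitutes for {port_type} failed: {errors}")

# Hypothetical endpoints for a sequence-alignment step in a Grid workflow
def primary(req):  raise TimeoutError("node down")
def mirror(req):   return {"result": f"aligned {req['sequence']}"}

registry = ServiceRegistry({"SequenceAlignment": [primary, mirror]})
print(invoke_with_substitution("SequenceAlignment", {"sequence": "ACGT"}, registry))
```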
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counter-parties' rights; thus controls may not be needed. The challenge lies in cases where trust cannot be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, that combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory of the relationships between deontic processes and documentary procedures; and a working prototype that uses a model checking technique to identify fraud potentials in a deontic process and generate control requirements to limit them. Fourteen scenarios of two well-known international payment procedures, cash in advance and documentary credit, have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
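To give a flavor of how model checking a deontic process can surface fraud potentials, here is a heavily simplified Python sketch: it exhaustively explores the interleavings of a toy buyer-seller process and flags terminal states in which the obligation to pay after shipment is violated. The events, state variables, and obligation are hypothetical and stand in for the Deontic Petri Net formalism, which is not reproduced here.

```python
# Hypothetical deontic process: the buyer is obliged to pay once goods are shipped.
# States track whether goods were shipped, whether payment occurred, and closure.
EVENTS = {
    "ship":      lambda s: {**s, "shipped": True},
    "pay":       lambda s: {**s, "paid": True},
    "walk_away": lambda s: {**s, "closed": True},
}

def reachable_states(initial):
    """Exhaustively explore all event interleavings (a toy model-checking pass)."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        key = tuple(sorted(state.items()))
        if key in seen:
            continue
        seen.add(key)
        if state.get("closed"):
            continue                           # closed states are terminal
        for apply_event in EVENTS.values():
            frontier.append(apply_event(state))
    return [dict(k) for k in seen]

def fraud_potentials(states):
    """Obligation 'pay after shipment' is violated in a terminal (closed) state."""
    return [s for s in states
            if s.get("closed") and s.get("shipped") and not s.get("paid")]

initial = {"shipped": False, "paid": False, "closed": False}
for s in fraud_potentials(reachable_states(initial)):
    print("control requirement needed: seller exposed in state", s)
```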
Abstract:
A model was tested to examine relationships among leadership behaviors, team diversity, and team process measures with team performance and satisfaction at both the team and leader-member levels of analysis. Relationships between leadership behavior and team demographic and cognitive diversity were hypothesized to have both direct effects on organizational outcomes and indirect effects through team processes. Leader-member differences were investigated to determine the effects of leader-member diversity on leader-member exchange quality, individual effectiveness, and satisfaction. Leadership had little direct effect on team performance, but several strong positive indirect effects through team processes. Demographic diversity had no impact on team processes, directly impacted only one performance measure, and moderated the leadership to team process relationship. Cognitive diversity had a number of direct and indirect effects on team performance, the net effects uniformly positive, and did not moderate the leadership to team process relationship. In sum, the team model suggests a complex combination of leadership behaviors positively impacting team processes, demographic diversity having little impact on team process or performance, cognitive diversity having a positive net impact, and team processes having mixed effects on team outcomes. At the leader-member level, leadership behaviors were a strong predictor of Leader-Member Exchange (LMX) quality. Leader-member demographic and cognitive dissimilarity were each predictors of LMX quality, but failed to moderate the leader behavior to LMX quality relationship. LMX quality was strongly and positively related to self-reported effectiveness and satisfaction. The study makes several contributions to the literature. First, it explicitly links leadership and team diversity. Second, demographic and cognitive diversity are conceptualized as distinct and multi-faceted constructs. Third, a methodology for creating an index of categorical demographic and interval cognitive measures is provided so that diversity can be measured in a holistic, conjoint fashion. Fourth, the study simultaneously investigates the impact of diversity at the team and leader-member levels of analysis. Fifth, insights into the moderating impact of different forms of team diversity on the leadership to team process relationship are provided. Sixth, the study incorporates a wide range of objective and independent measures to provide a 360-degree assessment of team performance.
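The dissertation's conjoint diversity index is not reproduced here; purely as an illustration of combining categorical and interval measures, the sketch below pairs Blau's heterogeneity index for demographic attributes with a range-normalized standard deviation for a cognitive measure. The attributes, scales, and team data are hypothetical.

```python
import statistics
from collections import Counter

def blau_index(values):
    """Blau's heterogeneity index 1 - sum(p_i^2) for a categorical attribute."""
    n = len(values)
    return 1.0 - sum((count / n) ** 2 for count in Counter(values).values())

def interval_dispersion(values, scale_range):
    """Population standard deviation normalized by the scale range."""
    return statistics.pstdev(values) / scale_range

# Hypothetical five-person team
gender = ["F", "M", "M", "F", "M"]
function = ["ops", "finance", "ops", "it", "marketing"]
risk_tolerance = [2.0, 5.5, 4.0, 6.5, 3.0]   # cognitive measure on a 1-7 scale

demographic = (blau_index(gender) + blau_index(function)) / 2
cognitive = interval_dispersion(risk_tolerance, scale_range=6.0)
print(f"demographic diversity ~ {demographic:.2f}, cognitive diversity ~ {cognitive:.2f}")
```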
Abstract:
Research on the adoption of innovations by individuals has been criticized for focusing on various factors that lead to the adoption or rejection of an innovation while ignoring important aspects of the dynamic process that takes place. Theoretical process-based models hypothesize that individuals go through consecutive stages of information gathering and decision making, but do not clearly explain the mechanisms that cause an individual to leave one stage and enter the next. Research on the dynamics of the adoption process has lacked a structurally formal and quantitative description of the process. This dissertation addresses the adoption process of technological innovations from a Systems Theory perspective and assumes that individuals roam through different, not necessarily consecutive, states determined by the levels of quantifiable state variables. It is proposed that different levels of these state variables determine the state in which potential adopters are, and that events altering the levels of these variables can cause individuals to migrate into different states. It was believed that Systems Theory could provide the required infrastructure to model the innovation adoption process, particularly as applied to information technologies, in a formal, structured fashion. This dissertation assumed that an individual progressing through an adoption process could be considered a system, where the occurrence of different events affects the system's overall behavior and ultimately the adoption outcome. The research effort aimed at identifying the various states of such a system and the significant events that could lead the system from one state to another. By mapping these attributes onto an “innovation adoption state space,” the adoption process could be fully modeled and used to assess the status, history, and possible outcomes of a specific adoption process. A group of Executive MBA students was observed as they adopted Internet-based technological innovations. The data collected were used to identify clusters in the values of the state variables and consequently define significant system states. Additionally, events were identified across the student sample that systematically moved the system from one state to another. The compilation of identified states and change-related events enabled the definition of an innovation adoption state-space model.
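A minimal sketch of an event-driven state-space view of adoption, assuming a hypothetical set of states and events rather than those identified in the study: observed events are replayed through a transition map to recover an individual's adoption trajectory.

```python
# Hypothetical innovation-adoption state space: states are defined by levels of
# quantifiable state variables (summarized here as named states), and observed
# events move a potential adopter between states, not necessarily forward.
TRANSITIONS = {
    ("unaware",   "sees_demo"):        "aware",
    ("aware",     "peer_endorsement"): "persuaded",
    ("persuaded", "free_trial"):       "trialing",
    ("trialing",  "positive_outcome"): "adopted",
    ("trialing",  "negative_outcome"): "rejected",
    ("persuaded", "cost_increase"):    "aware",    # events can move the system backward
}

def run_adoption_process(initial_state, events):
    """Replay observed events through the state space and return the trajectory."""
    state, trajectory = initial_state, [initial_state]
    for event in events:
        state = TRANSITIONS.get((state, event), state)  # unknown events leave the state unchanged
        trajectory.append(state)
    return trajectory

observed = ["sees_demo", "peer_endorsement", "free_trial", "positive_outcome"]
print(" -> ".join(run_adoption_process("unaware", observed)))
```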
Abstract:
The purpose of this dissertation was to examine the form of the consumer satisfaction/dissatisfaction (CS/D) response to disconfirmation, along with the cognitive and affective processes underlying the response. Respondents were provided with information from a prior market research study about a new brand of printer that was being tested. This market research information helped set prior expectations regarding print quality. Subjects were randomly assigned to an experimental condition that manipulated prior expectations to be either positive or negative. Respondents were then provided with printouts whose performance quality was either worse (negative disconfirmation) or better (positive disconfirmation) than the prior expectations. In other words, for each level of expectation, respondents were assigned to either the positive or the negative disconfirmation condition. Subjects were also randomly assigned to either a high or a low level of outcome involvement. Analyses of variance indicated that positive disconfirmation led to a more intense CS/D response than negative disconfirmation, although the difference in intensity between the positive and negative disconfirmation conditions did not reach statistical significance. Intensity of CS/D was measured by the distance of the CS/D rating from the midpoint of the scale. The study also found that although outcome involvement did not influence the polarity of the CS/D response, more direct measures of processing involvement, such as the subjects' concentration, attention, and care in evaluating the printout, did have a significant positive effect on CS/D intensity. Analyses of covariance also indicated that the relationship between the intensity of the CS/D response and the intensity of the disconfirmation was mediated by the intensity of affective responses. Positive disconfirmation led to more intense affective responses than negative disconfirmation.
Abstract:
The purpose of this descriptive study was to evaluate the banking and insurance technology curriculum at ten junior colleges in Taiwan. The study focused on curriculum, curriculum materials, instruction, support services, student achievement, and job performance. Data were collected from a diverse sample of faculty, students, alumni, and employers. Questionnaires on the evaluation of curriculum at technical junior colleges were developed for this specific case. Data collected from the sample described above were analyzed using ANOVA, t-tests, and crosstabulations. The findings indicate that there is room for improvement in meeting individual students' needs. Using Stufflebeam's CIPP model for curriculum evaluation, it was determined that the curriculum was adequate in terms of the knowledge and skills imparted to students. However, students were dissatisfied with the rigidity of the curriculum and the lack of opportunity to satisfy their individual needs. Employers were satisfied with both the academic preparation of students and their on-the-job performance. In sum, the curriculum of the two-year banking and insurance technology programs at junior colleges in Taiwan was shown to have served adequately in preparing a workforce to enter business. It is now time to look toward the future and adapt the curriculum and instruction to the future needs of an ever-evolving high-tech society.
Abstract:
Coordination of business processes is the management of dependencies, where dependencies constrain how tasks are performed. Coordination has traditionally been done in an intuitive fashion, without paying much attention to the coordination load, defined as the ratio between the time spent on coordination activities and the total task time. Previous efforts to understand and analyze coordination have resulted in mostly qualitative approaches to categorizing and recommending coordination strategies. This research seeks to answer two questions: (1) How can we analyze process coordination problems to improve overall performance? (2) What guidance can we provide to reduce the coordination load of the process and consequently improve the organization's performance? To that end, this effort developed a quantitative measure of the coordination load of business processes and a methodology for applying it. The effort used a management simulation game as a controlled laboratory environment, enabling manipulation of the task factors variability, analyzability, and interdependence to measure their impact on coordination load. The hypothesis was that the more variable, non-analyzable, and interdependent a process, the higher the coordination load, and that a higher coordination load would have a negative impact on performance. Coordination load was measured via the surrogate coordination time, and performance via profit. A 2² × 3¹ full factorial design, with two replicates, was run to observe the impact on the variables coordination time and profit. Properly validated spreadsheets and questionnaires were used as data collection instruments for each scenario. The experimental results indicate that lower task analyzability (p = 0.036) and higher task interdependence (p = 0.000) lead to higher coordination load, and that higher levels of task variability (p = 0.049) lead to lower performance. However, contrary to the hypotheses postulated by this work, coordination load did not prove to be a strong predictor of performance (correlation of -0.086). These findings from the laboratory experiment, together with other lessons learned, were incorporated into a quantitative measure, a survey instrument to gather data for the variables in the measure, and a methodology to quantify the coordination load of production business processes. The practicality of the methodology is demonstrated with an example.
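Because coordination load is defined above as the ratio of coordination time to total task time, it can be computed directly from timed activity records; the sketch below does so for one hypothetical process run.

```python
# Coordination load = time spent on coordination activities / total task time,
# computed from hypothetical activity records for a single business process run.
activities = [
    {"name": "draft order",        "minutes": 35, "coordination": False},
    {"name": "hand-off meeting",   "minutes": 15, "coordination": True},
    {"name": "credit check",       "minutes": 25, "coordination": False},
    {"name": "status sync e-mail", "minutes": 10, "coordination": True},
    {"name": "fulfillment",        "minutes": 40, "coordination": False},
]

total_time = sum(a["minutes"] for a in activities)
coordination_time = sum(a["minutes"] for a in activities if a["coordination"])
coordination_load = coordination_time / total_time
print(f"coordination load = {coordination_time}/{total_time} = {coordination_load:.2f}")
```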
Abstract:
With advances in science and technology, computing and business intelligence (BI) systems are steadily becoming more complex, with an increasing variety of heterogeneous software and hardware components. They are thus becoming progressively more difficult to monitor, manage, and maintain. Traditional approaches to system management have largely relied on domain experts through a knowledge acquisition process that translates domain knowledge into operating rules and policies. This is widely acknowledged to be a cumbersome, labor-intensive, and error-prone process that also struggles to keep up with rapidly changing environments. In addition, many traditional business systems deliver primarily pre-defined historic metrics for long-term strategic or mid-term tactical analysis, and lack the flexibility to support evolving metrics or data collection for real-time operational analysis. There is thus a pressing need for automatic and efficient approaches to monitoring and managing complex computing and BI systems. To realize the goal of autonomic management and enable self-management capabilities, we propose to mine the historical log data generated by computing and BI systems and automatically extract actionable patterns from it. This dissertation focuses on the development of data mining techniques to extract actionable patterns from various types of log data in computing and BI systems. Four key problems are studied: log data categorization and event summarization, leading indicator identification, pattern prioritization by exploring link structures, and tensor models for three-way log data. Case studies and comprehensive experiments on real application scenarios and datasets are conducted to show the effectiveness of the proposed approaches.
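As a simplified illustration of one of the four problems, leading indicator identification, the sketch below scores how strongly counts of a candidate event type, shifted earlier in time, correlate with a later target event. The event series are simulated and the method (lagged Pearson correlation) is only a stand-in for the techniques developed in the dissertation.

```python
import numpy as np

def lead_score(candidate_counts, target_counts, max_lag=5):
    """Best correlation between candidate counts shifted earlier and target counts."""
    best_lag, best_corr = 0, 0.0
    for lag in range(1, max_lag + 1):
        x, y = candidate_counts[:-lag], target_counts[lag:]
        corr = np.corrcoef(x, y)[0, 1]
        if abs(corr) > abs(best_corr):
            best_lag, best_corr = lag, corr
    return best_lag, best_corr

# Hypothetical hourly event counts mined from system logs
rng = np.random.default_rng(0)
disk_warnings = rng.poisson(2.0, 200).astype(float)
job_failures = np.roll(disk_warnings, 3) + rng.normal(0, 0.3, 200)  # failures follow warnings by ~3h

lag, corr = lead_score(disk_warnings, job_failures)
print(f"'disk_warnings' leads 'job_failures' by {lag} steps (corr={corr:.2f})")
```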
Abstract:
Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, neither is able to account for both the level of stock prices and their volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower, more commonly used traditional dividends with broad dividends; using a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models; and using a stochastic discount rate in place of a constant discount rate. Empirical results show that the model described above predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests among these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices; such bubbles are found in the S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The ‘old economy’ as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and the Real Estate industry group.
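A minimal sketch of the sup augmented Dickey-Fuller (SADF) idea used to detect bubble-like behavior: compute the ADF statistic over forward-expanding windows and take the supremum, with large right-tail values suggesting explosive behavior. The series here is simulated and statsmodels' adfuller is used as a convenience; the dissertation's exact specification may differ.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def sadf_statistic(series, min_window=40):
    """Sup-ADF: supremum of ADF t-statistics over forward-expanding windows.
    Large (right-tail) values are taken as evidence of explosive behavior."""
    stats = []
    for end in range(min_window, len(series) + 1):
        stats.append(adfuller(series[:end], regression="c", autolag="AIC")[0])
    return max(stats)

# Simulated log-price series: a random-walk segment followed by a mildly explosive episode
rng = np.random.default_rng(1)
calm = 5.0 + np.cumsum(rng.normal(0, 0.02, 150))
level, bubble = calm[-1], []
for _ in range(50):
    level = 1.02 * level + rng.normal(0, 0.02)
    bubble.append(level)
prices = np.concatenate([calm, bubble])

print(f"SADF statistic: {sadf_statistic(prices):.2f} (compare with right-tail critical values)")
```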
Abstract:
The present study, employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders, examined the relationship between emotional intelligence (EI) and leadership effectiveness. Overall, the results supported a moderate linkage between leader EI and effectiveness (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with myriad leadership outcomes, including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate, path analysis and hierarchical regression suggest that EI contributes no more than 1% of explained variance in leadership effectiveness once personality and intelligence are accounted for.
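For readers unfamiliar with how such a corrected estimate is produced, here is a bare-bones Hunter-Schmidt style aggregation: a sample-size-weighted mean of observed correlations, corrected for unreliability in both measures. The study data and reliabilities below are hypothetical, not those of the meta-analysis.

```python
# Bare-bones psychometric meta-analysis (Hunter & Schmidt style):
# sample-size-weighted mean observed correlation, corrected for unreliability
# in the predictor (EI) and criterion (leadership effectiveness) measures.
studies = [  # (n, observed r) -- hypothetical values
    (45, 0.18), (120, 0.22), (80, 0.31), (200, 0.15), (60, 0.27),
]
rxx, ryy = 0.85, 0.80          # assumed mean reliabilities of the two measures

n_total = sum(n for n, _ in studies)
r_bar = sum(n * r for n, r in studies) / n_total        # weighted mean observed r
rho_hat = r_bar / (rxx ** 0.5 * ryy ** 0.5)             # correct for attenuation
print(f"weighted mean r = {r_bar:.3f}, corrected rho = {rho_hat:.3f}")
```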
Abstract:
This thesis examines two research questions: (1) Why do Multinational Enterprises (MNEs) try to influence trade negotiations in the Latin American context? and (2) How do MNEs influence the trade negotiation process in Latin America? The results show that the MNE's main reasons for participation are: (1) to gain market access and, specifically, to reduce tariff and non-tariff barriers; (2) to create a beneficial regulatory environment for the MNE; and (3) to set the rules of the game by influencing the business environment in which its industry or its specific company is required to operate. The main approaches reported by the interviewees as to how MNEs participate are: (1) the MNE directly lobbies domestic government officials, principally the United States Trade Representative office; (2) a business, trade or industry association lobbies domestic government officials on the MNE's behalf; and (3) the MNE lobbies Congress.
Abstract:
This dissertation explored the capacity of business group diversification to generate value for affiliates in an institutional environment characterized by the adoption of structural pro-market reforms. In particular, three empirical essays explored the impact of business group diversification on the internationalization process of affiliates. The first essay examined the direct effect of business group diversification on firm performance and its moderating effect on the multinationality-performance relationship. It further explored whether this moderating effect varies depending on whether the focal affiliate is a manufacturing or a service firm. The findings suggested that the benefits of business group diversification for firm performance have a threshold, that those benefits are significant at earlier stages of internationalization, and that they are stronger for service firms. The second essay studied the capacity of business group diversification to ameliorate the negative effects of the added complexity faced by affiliates when they internationalize, exploring this capacity across different dimensions of international complexity. The results indicated that business group diversification effectively ameliorated the effects of the added international complexity. This positive effect is stronger in the institutional-voids dimension than in the societal-complexity dimension; in the former, diversified business groups can use both their non-market resources and prior experience to ameliorate the effects of complexity on firm performance. The last essay explored whether the benefits of business group diversification for the scope-performance relationship vary depending on the level of development of the network of subsidiaries and the region of operation of the focal firm. The results suggested that the benefits of business group diversification are location-bound within the region but are not related to the level of development of the targeted countries. All three essays use longitudinal analyses on a sample of Latin American firms to test the hypotheses. While the first essay used multilevel models and fixed effects models, the last two essays used exclusively fixed effects models to assess the impact of business group diversification. In conclusion, this dissertation aimed to explain the capacity of business group diversification to generate value under conditions of institutional change.
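The essays rely on fixed effects panel models; as a rough sketch of the within (demeaning) estimator they imply, the code below regresses simulated firm performance on internationalization scope and its interaction with a group-affiliation dummy after removing firm fixed effects. All variable names and the data-generating process are hypothetical.

```python
import numpy as np

# Minimal within (fixed effects) estimator on a simulated firm-year panel.
rng = np.random.default_rng(2)
n_firms, n_years = 30, 8
firm = np.repeat(np.arange(n_firms), n_years)
scope = rng.uniform(0, 1, n_firms * n_years)              # internationalization scope
group = np.repeat(rng.integers(0, 2, n_firms), n_years)   # affiliated with a diversified group?
firm_effect = np.repeat(rng.normal(0, 1, n_firms), n_years)
perf = 0.8 * scope + 0.5 * group * scope + firm_effect + rng.normal(0, 0.2, n_firms * n_years)

def demean_by(values, groups):
    """Subtract each firm's mean (the 'within' transformation removes fixed effects)."""
    out = values.astype(float)
    for g in np.unique(groups):
        out[groups == g] -= values[groups == g].mean()
    return out

# The group dummy itself is time-invariant and absorbed; its interaction with scope is not.
X = np.column_stack([demean_by(scope, firm), demean_by(group * scope, firm)])
y = demean_by(perf, firm)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"within-estimates: scope={beta[0]:.2f}, group x scope={beta[1]:.2f}")
```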
Abstract:
In the article “Discipline and Due Process in the Workplace,” by Edwin B. Dean, Assistant Professor in the School of Hospitality Management at Florida International University, Dean prefaces his article with the statement: “Disciplining employees is often necessary for the maintenance of an effective operation. The author discusses situations which require discipline and methods of handling employees, including the need for rules and due process.” In defining what constitutes appropriate discipline and what does not, Dean says, “Fair play is the keystone to discipline in the workplace. Discrimination, caprice, favoritism, and erratic and inconsistent discipline can be costly and harmful to employee relations, and often are a violation of law.” Violation of law is a key phrase in this statement. The author offers a short primer on tact in disciplining an employee. “Discipline must be tailored to the individual,” Dean advises. “A frown for one can cause a tearful outbreak; another employee may need the proverbial two-by-four in order to get his attention.” This is a perceptive comment, and one with which most would agree but not all would follow. Dean presents a simple outline of the steps in the disciplinary process, in order from lenient to severe: “The steps in the disciplinary process begin perhaps with a friendly warning or word of advice. The key here is friendly,” Dean declares. “It could progress to an oral or written reprimand, followed by a disciplinary layoff, terminating in that equivalent of capital punishment, discharge.” Dean suggests these steps are necessary in order to maintain decorum in the workplace. He also references the Weingarten rule, a rule that, although significant, most employees, or at least non-union employees, do not know is in their quiver: “If an interview is likely to result in discipline, the employee is entitled to have a representative present, whether a union is involved or not.” “The employer is not obligated to inform the employee of the rule, but he is obligated to honor the employee's request, if made,” Dean explains. Dean makes an interesting point in observing that a termination often reflects as much on the institution as it does on the employee suffering the termination. The author goes on to list several infractions that could warrant disciplinary action, with possible approaches to each. Dean also cautions against capricious disciplinary action; if not handled properly, discipline can result in a lawsuit against the institution itself.