12 results for management strategy
in Digital Commons at Florida International University
Abstract:
In his discussion, "Near Term Computer Management Strategy for Hospitality Managers and Computer System Vendors," William O'Brien, Associate Professor in the School of Hospitality Management at Florida International University, opens: "The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools." Contrary to what some might think, O'Brien maintains, the computer revolution is not over; it is only beginning, and computer technology will continue to develop and expand. "A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry," he writes. "Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation." Property managers who embrace rather than eschew computer technology stand to benefit greatly from it, while "the manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology." On the vendor side, O'Brien observes: "Computer-wise hospitality managers want systems which are easier and more profitable to operate. Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…" He warns against gambling on a risky computer system on the strength of unsubstantiated claims and pie-in-the-sky promises, and recommends affiliating with turn-key vendors who provide hardware, software, and training, or enlisting large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has in effect become a software revolution, O'Brien notes, "recognizing that a computer is nothing but a box in which programs run." Some of the empirical detail in the article is now dated, but its core argument, that properties should continually tap current knowledge as technology advances, remains sound.
Abstract:
As long as governmental institutions have existed, efforts have been undertaken to reform them. This research examines a particular strategy, coercive controls, exercised through a particular instrument, executive orders, by a singular reformer, the president of the United States. The presidents studied (Johnson, Nixon, Ford, Carter, Reagan, Bush, and Clinton) are those whose campaigns for office were characterized, to varying degrees, as against the Washington bureaucracy and for executive reform. Executive order issuance is assessed through an examination of key factors for each president, including political party affiliation, level of political capital, and legislative experience. A classification typology is used to identify the topical dimensions and levels of coerciveness, and the portrayal of the federal government is analyzed through an examination of public, media, and presidential attention. The results show that executive orders are significant management tools for the president and an important component of the transition plans of incoming administrations. The findings indicate that while executive orders have not increased in the aggregate, they have become more intrusive and significant. First, examination of political party affiliation, political capital, and legislative experience reveals a strong relationship between executive orders and previous executive experience, specifically among presidents who served as state governors before winning national election. Presidents Carter, Reagan, and Clinton (all former governors) issued the highest percentage of executive orders focusing on the federal bureaucracy, and former governors also issued the highest percentage of forceful orders (41.0%, compared with 19.9% for presidents who had not served as governors). Second, political party affiliation is an important, but not significant, predictor of the use of executive orders. Third, the management strategy that provides the president with the greatest level of autonomy, executive orders, redefines the concept of presidential power and autonomous action. Interviews with elite government officials and political observers support the idea that executive orders can provide the president with a successful management strategy: one requiring less expenditure of political resources, posing less risk to political capital, and offering a way to achieve objectives without depending on an unresponsive Congress.
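The typology-based comparison above, classifying each order by its level of coerciveness and grouping presidents by prior gubernatorial experience, amounts to a simple cross-tabulation. A minimal sketch follows; the records and coding values are invented for illustration and are not the study's data:

```python
# Hypothetical coded records: (president, served_as_governor, coerciveness).
# The coding scheme and sample values are illustrative only.
orders = [
    ("Carter", True, "forceful"), ("Carter", True, "hortatory"),
    ("Reagan", True, "forceful"), ("Clinton", True, "forceful"),
    ("Nixon", False, "hortatory"), ("Ford", False, "hortatory"),
    ("Bush", False, "forceful"),
]

def pct_forceful(records, was_governor):
    """Share of orders coded 'forceful' within one group of presidents."""
    group = [c for _, g, c in records if g == was_governor]
    return 100.0 * sum(c == "forceful" for c in group) / len(group)

print(f"former governors: {pct_forceful(orders, True):.1f}% forceful")
print(f"non-governors:    {pct_forceful(orders, False):.1f}% forceful")
```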
Abstract:
The nation's freeway systems are becoming increasingly congested, and traffic incidents are a major contributor. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity; they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs and real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict and is affected by many contributing factors. Determining and understanding these factors can help identify and develop better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, with limited success.

This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion during an ongoing incident. Multiple data mining techniques were applied and evaluated: multiple linear regression and a decision-tree-based method for the offline models, and a rule-based method and the M5P tree algorithm for the online models.

The results show that the models can, in general, achieve high prediction accuracy within acceptable time intervals of the actual durations. The research also identifies new contributing factors that were not examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of FDOT District 4 for actual applications.
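To make the online rule-based approach concrete, here is a toy predictor in the same spirit. The rules, thresholds, and feature names are hypothetical stand-ins, not the rules actually derived from the FDOT District 4 data:

```python
# Toy rule-based predictor of incident duration (minutes).
# All rules and duration values below are hypothetical.
def predict_duration_min(incident: dict) -> float:
    """Walk an ordered list of if-then rules; the first match wins."""
    lanes = incident.get("lanes_blocked", 0)
    severity = incident.get("severity", "minor")   # "minor" | "injury" | "fatal"
    heavy_vehicle = incident.get("heavy_vehicle", False)

    if severity == "fatal":
        return 180.0        # fatalities close roadways the longest
    if heavy_vehicle and lanes >= 2:
        return 95.0         # heavy-vehicle incidents blocking multiple lanes
    if severity == "injury":
        return 60.0
    if lanes >= 1:
        return 35.0
    return 20.0             # shoulder incident, no lanes blocked

print(predict_duration_min({"severity": "injury", "lanes_blocked": 1}))  # 60.0
```

An online system would evaluate such rules as soon as an incident is logged, then refine the estimate as updated field reports arrive.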
Abstract:
Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. 
In addition, DWT was found to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed had the most significant impact on the detection rate, while volume contributed the least. The results from this research provide useful insights into the design of AID for arterial street applications.
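Two of the building blocks above can be sketched compactly: a one-level Haar DWT of the kind used to preprocess detector data, and the DR/FAR evaluation (here FAR is taken as false alarms over all alarms raised, one common definition). The speed series and incident flags are made up for illustration:

```python
import math

def haar_dwt_level1(x):
    """One level of the Haar DWT: pairwise averages (approximation)
    and pairwise differences (detail), each scaled by 1/sqrt(2)."""
    s = 1 / math.sqrt(2)
    approx = [(x[i] + x[i + 1]) * s for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) * s for i in range(0, len(x) - 1, 2)]
    return approx, detail

def detection_metrics(actual, predicted):
    """DR = detected incidents / actual incidents;
    FAR = false alarms / all alarms raised."""
    detected = sum(a and p for a, p in zip(actual, predicted))
    false_alarms = sum((not a) and p for a, p in zip(actual, predicted))
    dr = detected / max(sum(actual), 1)
    far = false_alarms / max(sum(predicted), 1)
    return dr, far

# Hypothetical speed readings (mph) from a mid-block detector; the drop
# in the middle mimics an incident's signature.
approx, detail = haar_dwt_level1([52, 50, 48, 20, 18, 19, 45, 51])

# Per-interval incident flags: ground truth vs. an imagined ANN output.
dr, far = detection_metrics([0, 1, 1, 0], [0, 1, 1, 1])
```

The detail coefficients spike where the speed series changes abruptly, which is why wavelet features can sharpen an ANN's view of incident onsets.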
Abstract:
Natural and man-made disasters have gained attention at all levels of policy-making in recent years. Emergency management tasks are inherently complex and unpredictable, and often require coordination among multiple organizations across different levels and locations. Effectively managing the various knowledge areas and organizations involved has become a critical emergency management success factor. However, there is a general lack of understanding about how to describe and assess the complex nature of emergency management tasks and how knowledge integration can help managers improve task performance.

The purpose of this exploratory research was, first, to understand how emergency management operations are affected by tasks that are complex and inter-organizational and, second, to investigate how knowledge integration, as a particular knowledge management strategy, can improve the efficiency and effectiveness of emergency tasks. Three types of specific knowledge were considered: context-specific, technology-specific, and context-and-technology-specific.

The research setting was the Miami-Dade Emergency Operations Center (EOC), and the study was based on survey responses from participants in past EOC activations concerning their emergency tasks and knowledge areas. The data included task attributes related to complexity, knowledge area, knowledge integration, specificity of knowledge, and task performance. The data were analyzed using multiple linear regressions and path analyses to (1) examine the relationships between task complexity, knowledge integration, and performance, (2) assess the moderating effects of each type of specific knowledge on the relationship between task complexity and performance, and (3) evaluate the mediating role of knowledge integration.

As predicted by theory-based propositions, the results indicated that component complexity and interactive complexity tend to have a negative effect on task performance. Surprisingly, however, procedural rigidity tended to have a positive effect on performance in emergency management tasks. Also as expected, knowledge integration had a positive relationship with task performance. Interestingly, the moderating effects of each type of specific knowledge varied, and the extent of mediation by knowledge integration depended on the dimension of task complexity.
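The moderation idea described above, specific knowledge weakening the negative complexity-performance relationship, can be illustrated by comparing regression slopes across groups. The data below are invented purely for illustration; a flatter (less negative) slope in the group holding specific knowledge is the signature of a buffering effect:

```python
def slope(x, y):
    """OLS slope of y on x: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Hypothetical task records: complexity score vs. performance score,
# split by whether the team held context-specific knowledge.
complexity = [1, 2, 3, 4, 5]
perf_without = [9, 7, 6, 4, 2]   # performance falls steeply with complexity
perf_with = [9, 8, 8, 7, 7]      # specific knowledge flattens the decline

b_without = slope(complexity, perf_without)
b_with = slope(complexity, perf_with)
print(b_without, b_with)  # the second slope is less negative
```

In the actual analyses this comparison is carried out with an interaction term in a multiple regression rather than separate group fits, but the interpretation is the same.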
Abstract:
A suite of seagrass indicator metrics is developed to evaluate four essential measures of seagrass community status for Florida Bay. The measures are based on several years of monitoring data using the Braun-Blanquet Cover Abundance (BBCA) scale to derive information about seagrass spatial extent, abundance, species diversity, and presence of target species. As ecosystem restoration proceeds in south Florida, additional freshwater will be discharged to Florida Bay as a means of restoring the bay's hydrology and salinity regime. Primary hypotheses about restoring the ecological function of the keystone seagrass community are based on the premise that hydrologic restoration will increase environmental variability and reduce hypersalinity. This will create greater niche space and permit multiple seagrass species to co-exist while maintaining good environmental conditions for Thalassia testudinum, the dominant climax seagrass species. Greater species diversity is considered beneficial to habitat for desired higher-trophic-level species such as forage fish and shrimp, and is also important for maintaining a viable seagrass community that avoids the die-off events observed in the past. Indicator metrics are assigned values at the basin spatial scale and are aggregated to five larger zones. Three index metrics are derived by combining the four indicators through logic gates at the zone spatial scale and aggregated to derive a single bay-wide system status score standardized on the System-wide Indicator protocol. The indicators will provide a way to assess progress toward restoration goals or reveal areas of concern. Reporting for each indicator, index, and the overall system status score is presented in a red-yellow-green format that summarizes information in a readily accessible form for managers, policy-makers, and stakeholders in planning and implementing an adaptive management strategy.
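The aggregation of indicators through logic gates into a red-yellow-green report can be sketched as follows. The indicator names, gate rule, and cut points are hypothetical stand-ins, not the System-wide Indicator protocol's actual definitions:

```python
def stoplight(score):
    """Map a 0-100 status score onto a red-yellow-green report.
    The cut points are illustrative, not the protocol's."""
    if score >= 70:
        return "green"
    if score >= 40:
        return "yellow"
    return "red"

def zone_status(indicators):
    """Combine four indicator scores for one zone through a simple
    AND-style gate: the zone is only as healthy as its weakest
    indicator, softened by the average of the remaining ones.
    (Assumes the four scores are distinct; illustrative only.)"""
    worst = min(indicators.values())
    rest = [v for v in indicators.values() if v != worst] or [worst]
    return 0.5 * worst + 0.5 * (sum(rest) / len(rest))

zone = {"spatial_extent": 80, "abundance": 65,
        "diversity": 55, "target_species": 90}
print(stoplight(zone_status(zone)))
```

Zone scores produced this way can then be averaged or gated again to yield the single bay-wide status score described above.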
Abstract:
Just as all types of business firms are now expected to go beyond their profit-oriented activities in boosting the well-being of the community, so, too, is corporate social responsibility (CSR) expected from foodservice firms. The significance of the obesity epidemic, combined with the foodservice industry's role in the development of this epidemic, suggests that the industry has an ethical responsibility to implement CSR activities that will help reduce obesity, particularly among children. CSR should be seen as an efficient management strategy through which a firm voluntarily integrates social and environmental concerns into its business operations and its interactions with stakeholders. Although costs are associated with CSR initiatives, benefits accrue to the firm. Decisions regarding alternative CSR activities should be based on a cost-benefit analysis and calculation of the present value of the revenue stream that can be identified as resulting from the specific CSR activities. CSR initiatives should be viewed as long-term investments that will enhance the firms’ value. Key areas for foodservice firms' CSR activities include marketing practices, particularly practices impacting advertising to children and marketing that will enhance the firms’ visibility; portion-size modification; new-product development; and consistent nutrition labeling on menus.
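The cost-benefit logic described above, discounting the revenue stream attributable to a CSR initiative back to present value, is standard net-present-value arithmetic. A minimal sketch with made-up cash flows and an assumed discount rate:

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows,
    where cash_flows[0] occurs today (t = 0)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical CSR initiative: an upfront cost (menu relabeling,
# new-product development), then incremental revenue attributed
# to the initiative in later years. All figures are invented.
flows = [-100_000, 30_000, 40_000, 45_000, 45_000]
value = npv(0.08, flows)
print(f"NPV at 8%: {value:,.0f}")  # positive -> the initiative adds value
```

A positive NPV supports treating the CSR activity as a long-term investment that enhances firm value, which is exactly the framing the passage recommends.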
Abstract:
The state of Florida has one of the most severe exotic species invasion problems in the United States, but little is known about the invaders' influence on soil biogeochemistry. My dissertation research includes a cross-continental field study in Australia and Florida, together with greenhouse and growth chamber experiments, focused on the soil-plant interactions of one of the most problematic weeds introduced in south Florida, Lygodium microphyllum (Old World climbing fern). Analysis of field samples from the fern's introduced and native ranges indicates that L. microphyllum is highly dependent on arbuscular mycorrhizal fungi (AMF) for phosphorus uptake and biomass accumulation. The relationship with AMF is stronger in relatively dry conditions, which are common at some Florida sites, than in the wetter sites where the fern is typically found in its native Australia. In the field, L. microphyllum thrives across a wide range of soil pH, texture, and nutrient conditions, from strongly acidic soils in Australia to slightly acidic soils in Florida. Soils with pH 5.5-6.5 provide the optimal growth conditions for L. microphyllum; growth declines significantly at soil pH 8.0, indicating that further reduction is likely in more alkaline soils. Comparison of invaded and uninvaded soils demonstrates that L. microphyllum can change the belowground environment to its own benefit by enhancing soil nutrient status, with a more conspicuous impact on nutrient-poor sandy soils. Additionally, nitrogen concentration in the leaves, which significantly influences relative growth rate and photosynthesis, was significantly higher in Florida plants than in Australian plants. Given that L. microphyllum allocates up to 40% of its total biomass to rhizomes, which aid rapid regeneration after burning, cutting, or chemical spraying, management techniques targeting the rhizomes look promising.

Overall, my results reveal for the first time that soil pH, texture, and AMF are major factors facilitating the invasive success of L. microphyllum. Herbicide treatments targeting rhizomes will most likely become the widely used technique to control L. microphyllum in the future; however, a complete understanding of the soil ecosystem is necessary before adding any chemicals to the soil if a successful long-term invasive species management strategy is to be achieved.