19 results for Key management

in Digital Commons at Florida International University


Relevance:

60.00%

Publisher:

Abstract:

Tropical coastal marine ecosystems, including mangroves, seagrass beds, and coral reef communities, are undergoing intense degradation in response to natural and human disturbances; understanding the causes and mechanisms of this degradation therefore presents challenges for scientists and managers. To protect marine resources, determining the effects of nutrient loads on these coastal systems has become a key management goal. Data from monitoring programs were used to detect trends in macroalgae abundance, develop correlations with nutrient availability, and forecast potential responses of the communities monitored. Using eight years of data (1996–2003) from complementary but independent monitoring programs in seagrass beds and water quality of the Florida Keys National Marine Sanctuary (FKNMS), we: (1) described the distribution and abundance of macroalgae groups; (2) analyzed the status and spatiotemporal trends of macroalgae groups; and (3) explored the connection between water quality and macroalgae distribution in the FKNMS. In the seagrass beds of the FKNMS, calcareous green algae were the dominant macroalgae group, followed by the red group; brown and calcareous red algae were present but in lower abundance. Spatiotemporal patterns of the macroalgae groups were analyzed with a non-linear regression model of the abundance data. For the period of record, all macroalgae groups increased in abundance (Abi) at most sites, with calcareous green algae increasing the most. Calcareous green algae and red algae exhibited seasonal patterns, with peak abundances (Φi) mainly in summer for calcareous green and mainly in winter for red. Macroalgae Abi and long-term trend (mi) were correlated in distinctive ways with water quality parameters. Both the Abi and mi of calcareous green algae had positive correlations with NO3−, NO2−, total nitrogen (TN) and total organic carbon (TOC).
Red algae Abi had a positive correlation with NO2−, TN, total phosphorus and TOC, and the mi in red algae was positively correlated with N:P. In contrast, brown and calcareous red algae Abi had negative correlations with N:P. These results suggest that calcareous green algae and red algae are responding mainly to increases in N availability, a process occurring at inshore sites. A combination of spatially variable factors, such as local current patterns, nutrient sources, and habitat characteristics, results in a complex structure of the macroalgae community in the seagrass beds of the FKNMS.
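A regression of abundance with a long-term trend (mi) and a seasonal peak (Φi), as described in this abstract, can be illustrated with a minimal sketch. The data, model form, and parameter values below are hypothetical; the abstract does not give the study's actual specification. A model with a linear trend plus one annual harmonic is linear in its parameters, so ordinary least squares suffices:

```python
import numpy as np

# Hypothetical monthly abundance series over 8 years (1996-2003):
# baseline + long-term trend + summer-peaking seasonal cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(96) / 12.0                     # time in years
true_m, true_amp, true_phi = 0.5, 2.0, np.pi / 2
y = (10.0 + true_m * t
     + true_amp * np.sin(2 * np.pi * t + true_phi)
     + rng.normal(0.0, 0.2, t.size))

# Ab(t) = c + m*t + a*sin(2*pi*t) + b*cos(2*pi*t), linear in (c, m, a, b)
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
c, m, a, b = np.linalg.lstsq(X, y, rcond=None)[0]

amp = np.hypot(a, b)        # seasonal amplitude
phi = np.arctan2(b, a)      # seasonal phase: timing of the annual peak
print(f"trend={m:.2f}  amplitude={amp:.2f}  phase={phi:.2f}")
```

The fitted trend and phase play the roles of the abstract's mi (long-term trend) and Φi (timing of peak abundance), respectively.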

Relevance:

60.00%

Publisher:

Abstract:

Recent advances in electronic and computer technologies have led to the widespread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environmental monitoring, and smart environments. Many WSNs have mission-critical tasks, such as military applications, so security issues in WSNs remain at the forefront of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly come from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network. Cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, can defeat these prevention efforts. Knowing the probability of node compromise helps systems detect and defend against it. Although there are some approaches to detecting and defending against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models of node compromise distribution, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise. Applying these models in system security designs can improve security and decrease overhead in nearly every security area.
Moreover, based on these models, we design a novel secure routing algorithm to defend against the routing security issues that come from nodes that have already been compromised but have not yet been detected by the node-compromise detection mechanism. The routing paths in our algorithm detour around nodes that have already been detected as compromised or that have larger probabilities of being compromised. Simulation results show that our algorithm effectively protects routing paths from node compromise, whether detected or not.
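The idea of detouring around nodes with high compromise probability can be sketched generically as a shortest-path search in which each node costs -log(1 - p); minimizing total cost then maximizes the probability that no node on the path is compromised. This is a hypothetical illustration under invented probabilities, not the dissertation's actual algorithm:

```python
import heapq
import math

def safest_path(adj, p_comp, src, dst):
    """Dijkstra over -log(1 - p) node costs: the returned path maximizes
    the probability that no traversed node is compromised.
    (Generic sketch; not the dissertation's actual algorithm.)"""
    cost = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if u in visited:
            continue
        visited.add(u)
        for v in adj.get(u, ()):
            if v in visited:
                continue
            # More likely compromised -> more expensive to traverse.
            w = -math.log(1.0 - p_comp.get(v, 0.0))
            nd = d + w
            if nd < cost.get(v, float("inf")):
                cost[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return path[::-1]

# Two routes from A to D; node C has a high estimated compromise
# probability, so the longer detour through B and E is preferred.
adj = {"A": ["B", "C"], "B": ["E"], "E": ["D"], "C": ["D"]}
p = {"C": 0.6, "B": 0.05, "E": 0.05, "D": 0.0}
print(safest_path(adj, p, "A", "D"))
```

Here the direct route A-C-D costs -ln(0.4) ≈ 0.92, while the detour A-B-E-D costs about 0.10, so the search avoids the likely-compromised node C.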

Relevance:

30.00%

Publisher:

Abstract:

As long as governmental institutions have existed, efforts have been undertaken to reform them. This research examines a particular strategy, coercive controls, exercised through a particular instrument, executive orders, by a singular reformer, the president of the United States. The presidents studied--Johnson, Nixon, Ford, Carter, Reagan, Bush, and Clinton--are those whose campaigns for office were characterized to varying degrees as against Washington bureaucracy and for executive reform. Executive order issuance is assessed through an examination of key factors for each president, including political party affiliation, levels of political capital, and legislative experience. A classification typology is used to identify the topical dimensions and levels of coerciveness. The portrayal of the federal government is analyzed through examination of public, media, and presidential attention. The results show that executive orders are significant management tools for the president and an important component of the transition plans of incoming administrations. The findings indicate that while executive orders have not increased in the aggregate, they have become more intrusive and significant. Examining political party affiliation, political capital, and legislative experience reveals a strong relationship between executive orders and previous executive experience, specifically for presidents who served as state governors prior to winning national election. Presidents Carter, Reagan, and Clinton (all former governors) have the highest percentage of executive orders focusing on the federal bureaucracy. Additionally, the highest percentage of forceful orders was issued by former governors (41.0%), compared to their presidential counterparts who had not served as governors (19.9%). Secondly, political party affiliation is an important, but not significant, predictor of the use of executive orders.
Thirdly, management strategies that provide the president with the greatest level of autonomy--executive orders--redefine the concept of presidential power and autonomous action. Interviews with elite government officials and political observers support the idea that executive orders can provide the president with a successful management strategy, requiring less expenditure of political resources, less risk to political capital, and a way of achieving objectives without depending on an unresponsive Congress.

Relevance:

30.00%

Publisher:

Abstract:

With the proliferation of multimedia data and ever-growing demand for multimedia applications, there is an increasing need for efficient and effective indexing, storage and retrieval of multimedia data, such as graphics, images, animation, video, audio and text. Due to the special characteristics of multimedia data, Multimedia Database Management Systems (MMDBMSs) have emerged and attracted great research attention in recent years. Though much research effort has been devoted to this area, it is still far from maturity and many open issues remain. This dissertation focuses on three of the essential challenges in developing an MMDBMS, namely the semantic gap, perception subjectivity and data organization, and proposes a systematic and integrated framework with a video database and an image database serving as the testbed. In particular, the framework addresses these challenges separately yet coherently from the three main aspects of an MMDBMS: multimedia data representation, indexing and retrieval. In terms of multimedia data representation, the key to addressing the semantic gap is to intelligently and automatically model mid-level representations and/or semi-semantic descriptors in addition to extracting low-level media features. The data organization challenge is mainly addressed through media indexing, where various levels of indexing are required to support diverse query requirements. In particular, the focus of this study is to facilitate high-level video indexing by proposing a multimodal event mining framework associated with temporal knowledge discovery approaches. With respect to the perception subjectivity issue, advanced techniques are proposed to support user interaction and to effectively model users' perception from feedback at both the image level and the object level.

Relevance:

30.00%

Publisher:

Abstract:

The integration of automation (specifically Global Positioning Systems (GPS)) and Information and Communications Technology (ICT) through the creation of a Total Jobsite Management Tool (TJMT) can revolutionize the way construction contractors do business. The key to this integration is the collection and processing of real-time GPS data produced on the jobsite for use in project management applications. This research study established the need for an effective planning and implementation framework to assist construction contractor companies in navigating the terrain of GPS and ICT use. An Implementation Framework was developed using the Action Research approach. The framework consists of three components: (i) an ICT Infrastructure Model, (ii) an Organizational Restructuring Model, and (iii) a Cost/Benefit Analysis. The conceptual ICT infrastructure model was developed to show decision makers within highway construction companies how to collect, process, and use GPS data for project management applications. The organizational restructuring model was developed to assist companies in analyzing and redesigning business processes, data flows, core job responsibilities, and their organizational structure in order to obtain the maximum benefit at the least cost when implementing GPS as a TJMT. A cost/benefit analysis that identifies and quantifies the costs and benefits (both direct and indirect) was performed in the study to clearly demonstrate the advantages of using GPS as a TJMT. Finally, the study revealed that in order to successfully implement a program to utilize GPS data as a TJMT, it is important for construction companies to understand the various implementation and transitioning issues that arise when adopting this new technology and business strategy.
In the study, Factors for Success were identified and ranked to allow a construction company to understand the factors that may contribute to or detract from the prospect for success during implementation. The Implementation Framework developed as a result of this study will serve to guide highway construction companies in the successful integration of GPS and ICT technologies for use as a TJMT.

Relevance:

30.00%

Publisher:

Abstract:

This research is motivated by the need for a systemic, efficient quality improvement methodology at universities; no methodology has been designed for a total quality management (TQM) program in a university. The main objective of this study is to develop a TQM methodology that enables a university to efficiently develop an integral TQM plan. Current research focuses on the need to improve the quality of universities, the study of the perceived best-quality universities, and the measurement of university quality through rankings. There is no evidence of research on how to plan an integral quality improvement initiative for the university as a whole, which is the main contribution of this study. This research builds on various reference TQM models and criteria provided by ISO 9000, Baldrige and Six Sigma, and on educational accreditation criteria found in ABET and SACS. The TQM methodology is developed by following a seven-step meta-methodology. The proposed methodology guides the user through developing a TQM plan in five sequential phases: initiation, assessment, analysis, preparation and acceptance. Each phase defines for the user its purpose, key activities, input requirements, controls, deliverables, and tools to use. The application of quality concepts in education, and in higher education in particular, is distinctive, since there are unique factors in education that ought to be considered. These factors shape the quality dimensions in a university and are the main inputs to the methodology. The proposed TQM methodology guides the user to collect and transform appropriate inputs into a holistic TQM plan, ready to be implemented by the university. Different input data will lead to a unique TQM plan for the specific university at the time.
It may not necessarily transform the university into a world-class institution, but it aims to drive stakeholder-oriented improvements, leading to better alignment with the university's mission and total quality advancement. The proposed TQM methodology is validated in three steps. First, it is verified through a test activity that is part of the meta-methodology. Secondly, the methodology is applied to a case university to develop a TQM plan. Lastly, both the methodology and the TQM plan are verified by an expert group consisting of TQM specialists and university administrators. The proposed TQM methodology is applicable to any university at any level of advancement, regardless of changes in its long-term vision and short-term needs. It helps to assure the quality of a TQM plan, while making the process more systemic, efficient, and cost-effective. This research establishes a framework with a solid foundation for extending the proposed TQM methodology to other industries.

Relevance:

30.00%

Publisher:

Abstract:

This paper assesses the status of pre-disaster risk management (DRM) in the case of Turkey. Focusing on the period following the catastrophic August 17, 1999 earthquake, the study draws on USAID's Disaster Risk Management Benchmarking Tool (DRMBT). In line with the benchmarking tool, the paper covers key developments in the four components of pre-disaster risk management, namely: risk identification, risk mitigation, risk transfer and disaster preparedness. It presents three major conclusions: (i) Although post-1999 Turkey has made some important progress in the pre-disaster phase of DRM, particularly with the enactment of obligatory earthquake insurance and tightened standards for building construction, the country is far from substantial levels of success in DRM. (ii) In recent years, local governments have been given more authority in the realm of DRM; however, Turkey's approach to DRM is still predominantly centralized, at the expense of successful DRM practices at the local level. (iii) While the devastating 1999 earthquake has resulted in advances in the pre-disaster components of DRM, progress has been mostly in the realm of earthquakes. Turkey's other major disasters (e.g., landslides, floods, wildfires) also require similar attention from local and central authorities.

Relevance:

30.00%

Publisher:

Abstract:

The Front Office Manager: Key to Hotel Communications is a written study by Denney G. Rutherford, Department of Hotel and Restaurant Administration, College of Business and Economics at Washington State University. In it he initially observes: “Since the front office manager is usually viewed as the key to the efficient and orderly operation of a hotel, the author has researched the job and activities of this individual in an attempt to provide data about an area which he says was "intuitively known" but never "empirically explored." “Current literature implies that the activities of the front office are so important to the daily operations of the hotel that it occupies a preeminent position among other departments,” Rutherford says. He also references Gray and Liguori, who describe the front office as “the nerve center of the hotel,” echoing an early work by Heldenbrand indicating that it “becomes a sort of listening post for management.” The quotes are cited. The primary stage of the article relies on a seven-page, two-part questionnaire, which was used to collect data regarding the front office manager (FOM) position. Even though the position is considered a crucial one, there is a significant lack of pragmatic data regarding it. Rutherford graphs the studies. Good communication skills are imperative. “Other recent research has suggested that the skills of effective communication are among the most vital a manager at any level can bring to his/her endeavors in the service industries,” Rutherford notes. He provides a detailed front office communications model to illustrate the functions. In Table 4, for example (Office Manager as Facilitator), Rutherford provides Likert rating scale values for a comprehensive list of front office tasks. Rutherford informs you that the communicative skills of a front office manager extend across the board, encompassing variables from guest relation exchanges to all the disparate components of employee relations.
Notwithstanding, and compared to technical knowledge such as computer and fiscal skills, Rutherford suggests: “The most powerful message derived from analysis of the data on the FOM's job is that communication in its various forms is clearly central to the successful mission of the front office.”

Relevance:

30.00%

Publisher:

Abstract:

In his dialogue - Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors - William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: “The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools.” At the time of this writing, Associate Professor O'Brien would have you know that, contrary to what some people might think, the computer revolution is not over; it is just beginning. Computer technology will only continue to develop and expand, says O'Brien with citation. “A complacent few of us who feel ‘we have survived the computer revolution’ will miss opportunities as a new wave of technology moves through the hospitality industry,” says Professor O'Brien. “Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation,” is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O'Brien says. “The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology,” he advises. On the vendor side of the equation, O'Brien observes, “Computer-wise hospitality managers want systems which are easier and more profitable to operate.
Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…” he says. O'Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has essentially morphed into a software revolution, O'Brien notes, “…recognizing that a computer is nothing but a box in which programs run.” Yes, some of the empirical data in this article is dated by now, but the core philosophy - that technology keeps advancing and that properties must continually tap current knowledge - is sound.

Relevance:

30.00%

Publisher:

Abstract:

In - Protecting Your Assets: A Well-Defined Credit Policy Is The Key - an essay by Steven V. Moll, Associate Professor, The School of Hospitality Management at Florida International University, Professor Moll observes at the outset: “Bad debts as a percentage of credit sales have climbed to record levels in the industry. The author offers suggestions on protecting assets and working with the law to better manage the business.” “Because of the nature of the hospitality industry and its traditional liberal credit policies, especially in hotels, bad debts as a percentage of credit sales have climbed to record levels,” our author says. “In 1977, hotels showing a net income maintained an average accounts receivable ratio to total sales of 3.4 percent. In 1983, the accounts receivable ratio to total sales increased to 4.1 percent in hotels showing a net income and 4.4 percent in hotels showing a net loss,” he further cites. As the professor implies, there are ways to mitigate the losses from bad credit or difficult-to-collect credit sales, and in this article he offers suggestions on how to do so. Moll suggests that hotels and food & beverage operations first tighten their credit extension policies and, on the other side, be more aggressive in pursuing collection of debts. There is a balance to consider here: bad credit, in and of itself, is not the only element the profit/loss mirror reflects. “Credit managers must know what terms to offer in order to compete and afford the highest profit margin allowable,” Moll says. “They must know the risk involved with each guest account and be extremely alert to the rights and wrongs of good credit management,” he advocates. A sound profit policy can be the result of accepting some marginal additional credit risk on the part of the operation's manager. “Reality has shown that high profits, not small credit losses, are the real indicator of good credit management,” the author reveals.
“A low bad debt history may indicate that an establishment has an overly conservative credit management policy and is sacrificing potential sales and profits by turning away marginal accounts,” Moll argues. Professor Moll does provide a fairly comprehensive list illustrating when a manager would want to adopt a conservative credit policy. In the final analysis, the aim is to implement a policy that weighs an acceptable amount of credit risk against a potential profit ratio. In closing, Professor Moll offers some collection strategies for delinquent credit accounts, with reference to computer and attorney participation, and brings cash and cash discounts into the discussion as well. Additionally, there is some very useful information about what debt collectors cannot do!

Relevance:

30.00%

Publisher:

Abstract:

The professional success of future hospitality graduates will require more than the acquisition of contemporary industry knowledge and training in current best practices. Increasingly relevant hospitality education will emphasize skill development; managerial thinking and renewal skills will be especially useful in an industry that is constantly changing.

Relevance:

30.00%

Publisher:

Abstract:

The study of the private management of public housing is an important topic to be critically analyzed as the government searches for ways to increase efficiency in providing housing for the poor. Public housing authorities must address the cost of repairing or replacing deteriorating housing stock, the increasing need for affordable housing, and the lack of supply. Growing pressure for efficient use of public funds has heightened the need for profound structural reform. An important strategy for carrying out such reform is privatization. Although privatization does not work in every case, the majority position in the traditional privatization literature is that reliance on private organizations normally, but not always, results in cost savings. The primary purpose of this dissertation is to determine whether a consensus exists among decision-makers on the efficiency of privatizing the management of public housing. A secondary purpose is to review the techniques (best practices) used by the private sector that result in cost-efficiencies in the management of public housing. The study employs a triangulated research design using a cross-sectional survey methodology, with a survey instrument used to solicit responses from private managers. The study also uses qualitative methods, including interviews with key informants from private-sector management firms and public housing agencies, case studies, focus groups, archival records and housing authority documents. Results indicated that important decision-makers perceive that private managers made a positive contribution to cost-efficiencies in the management of public housing. The performance of private contractors served as a yardstick for comparing the efficiency of services produced in-house.
The study concluded that private managers made the benefits of their management techniques well known, creating a sense of competition between public and private managers. Competition from private contractors spurred municipal worker and management productivity improvements, creating better management results for the public housing authorities. The study results are in concert with recent research and studies that also concluded that private managers have some distinct advantages in controlling costs in the management of public housing.

Relevance:

30.00%

Publisher:

Abstract:

Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly and containing conflicting information, while dealing with rapidly changing contexts and producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge by establishing an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because event detection is distributed, time delays are considered: events are no longer instantaneous but have an associated duration. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network.
The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
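Consensus belief values of the kind described can be illustrated with Dempster's rule of combination from belief (Dempster-Shafer) theory. This is a generic sketch of belief fusion, not the dissertation's specific correlation scheme, and the sensor mass values are invented:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the same frame of discernment
    using Dempster's rule. Keys are frozensets of hypotheses; masses on
    disjoint sets count as conflict and are normalized away."""
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two detectors report belief that a composite event E occurred.
E, NOT_E = frozenset({"E"}), frozenset({"not-E"})
THETA = E | NOT_E   # mass assigned to "don't know" (the whole frame)
s1 = {E: 0.7, THETA: 0.3}
s2 = {E: 0.6, NOT_E: 0.1, THETA: 0.3}
fused = dempster_combine(s1, s2)
print(round(fused[E], 3))  # -> 0.871
```

Two partially uncertain detectors that mostly agree yield a fused belief in E (about 0.87) that is stronger than either source alone, which is the consensus effect the abstract describes.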

Relevance:

30.00%

Publisher:

Abstract:

Non-native fishes present a management challenge to maintaining Everglades National Park (ENP) in a natural state. We summarized data from long-term fish monitoring studies in ENP and reviewed the timing of introductions relative to water-management changes. Beginning in the early 1950s, management actions have added canals, altered wetland habitats by flooding and drainage, and changed inflows into ENP, particularly in the Taylor Slough/C-111 basin and Rocky Glades. The first non-native fishes likely entered ENP by the late 1960s, but species numbers increased sharply in the early 1980s when new water-management actions were implemented. After 1999, eight non-native species and three native species, all previously recorded outside of Park boundaries, were found for the first time in ENP. Several of these incursions occurred following structural and operational changes that redirected water deliveries to wetlands open to the eastern boundary canals. Once established, non-native fishes in Everglades wetlands are difficult to control; therefore, preventing introductions is key to their management. Integrating actions that minimize the spread of non-native species into protected natural areas into the adaptive management process for planning, development, and operation of water-management features may help to achieve the full suite of objectives for Everglades restoration.

Relevance:

30.00%

Publisher:

Abstract:

The pine rocklands of South Florida are characterized by an herbaceous flora with many narrowly endemic taxa, a diverse shrub layer containing several palms and numerous tropical hardwoods, and an overstory of south Florida slash pine (Pinus elliottii var. densa). Fire has been considered an important environmental factor for these ecosystems, since in the absence of fire these pine forests are replaced by dense hardwood communities, resulting in loss of the characteristic pineland herb flora. Hence, in the Florida Keys pine forests, prescribed fire has been used since the creation of the National Key Deer Refuge. However, such prescribed burns were conducted in the Refuge mainly for fuel reduction, without much consideration of ecological factors. The USGS and Florida International University conducted a four-year research study, from 1998 to 2001, whose objective was to document the response of pine rockland vegetation to a range of fire management options and to provide the Fish and Wildlife Service and other land managers with information useful in deciding when and where to burn to perpetuate these unique pine forests. This study is described in detail in Snyder et al. (2005).