15 results for performance data

in Digital Commons at Florida International University


Relevance:

70.00%

Publisher:

Abstract:

The purpose of this research was to apply the concepts of power and influence tactics to the joint venture context by examining how they relate to venture performance. In addition, culture and the expectations of future cooperation were examined for their association with influence tactic use and joint venture performance. Data were collected from 58 parent firms of U.S.-based domestic and international joint ventures about their relationships with their partners. Under the theories of social exchange and power dependence, a parent's level of power is based on its partner's dependence on the relationship. The statistical results indicated that: (1) the greater the total power of both parents in an equal-power relationship, the greater the joint venture's performance; and (2) the greater the inequality between the parents' levels of power, the lower the joint venture's performance. It was also found that the way in which a parent firm tried to influence its partner was related to joint venture performance. Specifically, the use of references to a partner's legitimate authority was negatively related to performance, while the use of rational arguments and compromises was positively related. Contrary to expectations, the cultural backgrounds of the parents were not shown to be related to influence tactic use or joint venture performance. On the other hand, greater expectation of future cooperation had a positive association with performance and a significant relationship with influence tactic use: the greater the expectation, the less partners used confrontational tactics such as pressure or appeals to legitimate authority.
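
As an editorial illustration, the reported relationships map onto a simple regression. Below is a minimal, hypothetical sketch (Python with statsmodels; all variable names and data are invented, not the study's) of a model in which total parental power enters positively and the power gap negatively, mirroring findings (1) and (2):

```python
# Hypothetical sketch only: data and coefficients are fabricated to
# illustrate the direction of the reported effects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 58  # sample size matching the study's 58 parent firms
power_a = rng.uniform(1, 7, n)         # parent A's power (partner dependence)
power_b = rng.uniform(1, 7, n)         # parent B's power
total_power = power_a + power_b        # finding (1): positive effect expected
power_gap = np.abs(power_a - power_b)  # finding (2): negative effect expected
performance = 0.5 * total_power - 0.8 * power_gap + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([total_power, power_gap]))
print(sm.OLS(performance, X).fit().summary())
```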

Relevance:

70.00%

Publisher:

Abstract:

An Automatic Vehicle Location (AVL) system is a computer-based vehicle tracking system capable of determining a vehicle's location in real time. As a major technology of the Advanced Public Transportation System (APTS), AVL systems have been widely deployed by transit agencies for purposes such as real-time operation monitoring, computer-aided dispatching, and arrival time prediction. AVL systems make available a large amount of transit performance data valuable for transit performance management and planning. However, the difficulty of extracting useful information from this huge spatial-temporal database has hindered off-line applications of AVL data. In this study, a data mining process including data integration, cluster analysis, and multiple regression is proposed. The AVL-generated data are first integrated into a Geographic Information System (GIS) platform. A model-based cluster method is employed to investigate the spatial and temporal patterns of transit travel speeds, which may be easily translated into travel times. Transit speed variations along route segments are identified. Transit service periods such as the morning peak, mid-day, afternoon peak, and evening are determined from analyses of travel speed variations across times of day. Seasonal patterns of transit performance are investigated using analysis of variance (ANOVA). Travel speed models based on the clustered time-of-day intervals are developed using factors identified as having significant effects on speed during different periods. Transit performance was found to vary across seasons and time-of-day periods, and the geographic location of a route segment also contributes to this variation. The results of this research indicate that advanced data mining techniques have good potential to provide automated means of assisting transit agencies in service planning, scheduling, and operations control.
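
A minimal sketch of the model-based clustering step described above, assuming a Gaussian mixture model (scikit-learn) as the clustering tool; the hours, speeds, and BIC-based selection of service periods are illustrative, not the dissertation's exact method or data:

```python
# Sketch: cluster AVL-derived travel speeds by time of day; BIC picks the
# number of clusters, i.e., candidate service periods (AM peak, mid-day,
# PM peak, evening). All data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
hours = rng.uniform(5, 23, 500)
# Synthetic speeds with dips at the 8:00 and 17:00 peaks
speeds = (25 - 8 * np.exp(-((hours - 8) ** 2) / 2)
             - 8 * np.exp(-((hours - 17) ** 2) / 2)
             + rng.normal(0, 1.5, 500))
X = np.column_stack([hours, speeds])

best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(2, 7)),
    key=lambda m: m.bic(X),
)
print("service periods:", best.n_components)
print("cluster means (hour, speed):\n", best.means_)
```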

Relevance:

60.00%

Publisher:

Abstract:

Increasing use of the term Strategic Human Resource Management (SHRM) reflects recognition of the interdependencies between corporate strategy, organization, and human resource management in the functioning of the firm. Dyer and Holder (1988) proposed a comprehensive human resource strategic typology consisting of three strategic types: inducement, investment, and involvement. This research attempted to empirically validate their typology and to test the performance implications of the match between corporate strategy and HR strategy. Hypotheses were tested to determine the relationships between internal consistency in HRM sub-systems, the match between corporate strategy and HR strategy, and firm performance. Data were collected by a mail survey of 998 senior HR executives, of whom 263 returned completed questionnaires. Financial information on 909 firms was collected from secondary sources such as 10-K reports and CD-Disclosure, and profitability ratios were indexed to industry averages. Confirmatory factor analysis using LISREL supported the six-factor HR measurement model; the six factors were staffing, training, compensation, appraisal, job design, and corporate involvement. Support was also found for a second-order factor, labeled "HR Strategic Orientation," explaining the variation among the six factors. LISREL analysis also supported the congruence hypothesis that HR Strategic Orientation significantly affects firm performance. There was a significant associative relationship between HR strategy and corporate strategy; however, the contingency effects of the match between the two were not supported. Several tests showed that the survey results are affected neither by non-response bias nor by mono-method bias. Implications of these findings for both researchers and practitioners are discussed.
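
For reference, a second-order confirmatory factor model of this kind is conventionally written as follows. This is a sketch in standard LISREL-style notation, not the study's exact specification:

```latex
% First-order measurement model: observed HR practice items x load on the
% six first-order factors \eta (staffing, training, compensation,
% appraisal, job design, corporate involvement):
x = \Lambda_x \eta + \epsilon
% Second-order structure: a single second-order factor \xi
% ("HR Strategic Orientation") explains the covariation among the six
% first-order factors:
\eta = \Gamma \xi + \zeta
```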

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this descriptive study was to evaluate the banking and insurance technology curriculum at ten junior colleges in Taiwan. The study focused on curriculum, curriculum materials, instruction, support services, student achievement, and job performance. Data were collected from a diverse sample of faculty, students, alumni, and employers. Questionnaires on the evaluation of curriculum at technical junior colleges were developed for this specific case. The data were analyzed using ANOVA, t-tests, and cross-tabulations. The findings indicate that there is room for improvement in meeting individual students' needs. Using Stufflebeam's CIPP model for curriculum evaluation, it was determined that the curriculum was adequate in terms of the knowledge and skills imparted to students. However, students were dissatisfied with the rigidity of the curriculum and the lack of opportunity to satisfy their individual needs. Employers were satisfied with both the academic preparation of students and their on-the-job performance. In sum, the two-year banking and insurance technology programs of junior colleges in Taiwan were shown to have adequately prepared a workforce to enter business. It is now time to look toward the future and adapt curriculum and instruction to the future needs of an ever-evolving high-tech society.

Relevance:

60.00%

Publisher:

Abstract:

This study investigated differences in personality, consistent with the vocational theory of personality proposed by Holland (1997), for modern-day firefighters. It also investigated the relationships between personality characteristics and the job duties performed by firefighters and firefighter-paramedics. Archival data were analyzed from employees (N = 98) of a Southeastern Florida fire department who completed the Hogan Personality Inventory (HPI), Hogan Development Survey (HDS), and Motives, Values, Preferences Inventory (MVPI), as well as a self-report questionnaire on variety proneness (boredom), job satisfaction, and affective well-being. Scores on the HPI, HDS, and MVPI served as predictors; the criteria were self-reported job involvement, variety proneness (boredom), and affective well-being. In addition, performance criteria were obtained from the participants' employment histories and correlated with the personality scale scores to determine whether personality predicts aspects of performance. Participants varied with respect to the type of firefighter duties their jobs required and were categorized into three duty classifications: Group 1 (G1), firefighters hired before 1990 who are certified only as firefighters; Group 2 (G2), firefighters hired before 1990 who became paramedics at some point after employment and after fire college training; and Group 3 (G3), firefighters hired after 1990 who were trained as paramedics in the fire college and who either were aware of the paramedic requirement at the time of application or were already trained as paramedics when they applied. From the research reviewed and presented in this paper, hypotheses were generated about differences between the personality types of groups G1 and G2 versus G3, in accordance with Holland's theory. It was also hypothesized that personality would predict satisfaction and performance outcomes. Job satisfaction did not differ significantly among the groups. However, the groups differed significantly on five of the predictive instrument scales, and personality was found to predict only limited aspects of performance.
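
A minimal sketch of the kind of predictor-criterion correlation used in such analyses. The scale and criterion names, and the data, are invented for illustration; the study's actual variables are not reproduced here:

```python
# Hypothetical sketch: correlate one personality scale score with one
# criterion measure. Names and values are fabricated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 98  # matches the study's N
hpi_scale = rng.normal(50, 10, n)                      # predictor: an HPI scale
job_satisfaction = 0.3 * hpi_scale + rng.normal(0, 8, n)  # criterion

r, p = stats.pearsonr(hpi_scale, job_satisfaction)
print(f"r = {r:.2f}, p = {p:.4f}")
```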

Relevance:

60.00%

Publisher:

Abstract:

The idea of comparative performance assessment is crucial to municipal management, yet recent findings show that in South Florida most municipalities make virtually no use of external benchmarks for performance comparison. On one level, this study sought to identify the factors that shape resident perceptions of municipal service quality. On a more practical level, it sought to identify a core set of measures that could serve for multi-jurisdictional comparisons of performance. The study empirically tested three groups of hypotheses. Data were collected via custom-designed survey instruments from multiple jurisdictions representing diverse socioeconomic backgrounds across two counties. A second layer of analysis examined municipal budget documents for the presence of performance measures. A third layer was conducted via face-to-face interviews with residents at the point of service delivery. Research questions were analyzed using descriptive and inferential statistical methods. The survey data yielded inconsistent findings. In aggregate terms, using sociological determinants to guide the inquiry failed to yield conclusive answers about the factors shaping resident perceptions of municipal service quality. At the disaggregated community level, definite differences emerged, but these had weak predictive ability. More useful were the findings on performance-measure reporting in municipal budget documents and the analyses of interviews with residents at the point of service delivery. Regardless of socioeconomic profile, neighborhood characteristics, level of civic engagement, or type of community, the same aspects mattered to citizens when assessing service quality. For parks and recreation, respondents most frequently cited maintenance, facility amenities, and program offerings as important, while for garbage collection timely and consistent service delivery mattered most. Surprisingly, the participating municipalities track performance data on the items citizens identified as important, but they rarely solicit regular feedback from residents or report results back to them. These findings suggest that endeavors such as this study can help determine a core set of measures for cross-jurisdictional comparisons of municipal service quality, improve the municipal delivery of services, and support communication with the public.

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this study was to determine whether an experimental context-based delivery format for mathematics would be more effective than a traditional model at increasing the mathematics performance of at-risk students in a public high school of choice, as evidenced by significant gains on the standards-based Mathematics subtest of the FCAT and in final academic grades in Algebra I. The guiding rationale for this approach is captured in the Secretary's Commission on Achieving Necessary Skills (SCANS) report of 1992, which resulted in school-to-work initiatives (United States Department of Labor). The charge for educational reform has also been codified at the state level in the Educational Accountability Act of 1971 (Florida Statutes, 1995) and at the national level in the No Child Left Behind Act of 2001. A particular focus of educational reform is low-performing, at-risk students. This dissertation explored the effects of a context-based curricular reform designed to enhance Algebra I content, using a research design consisting of two delivery models: a traditional content-based course and a thematically structured, content-based course. The thematic element was business education, as many advocates in career education assert that this format engages students otherwise uninterested in mathematics in a relevant, SCANS-skills setting. The subjects in each supplementary course were ninth-grade students who were both low performers in eighth-grade mathematics and had not passed the eighth-grade administration of the standards-based FCAT Mathematics subtest. The sample was limited to two groups of 25 students and two teachers, and the site was a public charter school. Student-generated performance data were analyzed using descriptive statistics. Contrary to widely held beliefs, contextual presentation of content did not produce significant gains in either academic performance or test performance for the experimental treatment group, and there was no meaningful difference in performance between the two groups.
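
A minimal sketch of the corresponding two-group comparison. The abstract mentions only descriptive statistics, so the t-test below is shown purely as an illustrative way to compare two groups of 25; the gain scores are fabricated:

```python
# Hypothetical sketch: compare FCAT gain scores between the traditional
# and context-based delivery groups (n = 25 each, as in the study).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
traditional = rng.normal(12, 6, 25)
context_based = rng.normal(13, 6, 25)

t, p = stats.ttest_ind(traditional, context_based, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")  # a large p would mirror the null result
```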

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this research was to study the effect of the Florida A+ Plan accountability program on curriculum and instruction in four Title I public elementary schools in the Miami-Dade County Public Schools system. It focused on the experiences of the school principals and classroom teachers of the four schools as they related to curriculum and instruction, and it included an analysis of each school's improvement plans for curriculum and instruction during the 1998-2004 school years. The study was conducted through interviews with the school principals and with principal-selected classroom teachers who taught third, fourth, or fifth grade during the first six years of the Florida A+ Plan. The analysis of the school improvement plans focused on the implementation of curriculum and instruction at each school, examining the goals and measurable objectives each school selected to improve its instructional program in reading, mathematics, writing, and science. The findings indicated that, under pressure to improve their school grade under the Florida A+ Plan, each of the target schools restructured its instructional program each school year based on individual needs assessments, as documented in its school improvement plans. The schools altered their programs by analyzing student performance data to realign curriculum and instruction. The interviews with the principals and teachers showed that each school year the schools restructured their programs to align them with the FCAT content, a realignment that was a collaborative effort between the administration and the instructional staff.

Relevance:

60.00%

Publisher:

Abstract:

Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off-site or into on-site buildings. The decision to reduce PCC duration requires exploring a performance-based methodology for Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment, yet historically no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of the appropriate PCC period for Florida's landfills requires in-depth case studies analyzing performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as the case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The leachate and gas quality and quantity data were analyzed to project pollutant levels in leachate and groundwater relative to maximum contaminant levels (MCLs), and projected gas quantities were estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment; these were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were suggested. Based on the PCC performance results integrated with the risk assessment, future PCC monitoring needs were projected and sustainable waste management options identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above MCLs and surveying of cap integrity should continue. The parameters that cause longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
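
A minimal sketch of the MCL screening logic described above: flag groundwater analytes whose projected concentrations exceed their maximum contaminant levels. The analytes and concentrations are illustrative, not Davie Landfill data; the MCLs shown are the U.S. EPA drinking-water values for these three analytes:

```python
# Sketch: keep monitoring only the analytes projected above their MCL.
MCL_MG_PER_L = {"benzene": 0.005, "arsenic": 0.010, "cadmium": 0.005}
projected = {"benzene": 0.002, "arsenic": 0.015, "cadmium": 0.001}  # fabricated

needs_monitoring = {
    analyte: conc
    for analyte, conc in projected.items()
    if conc > MCL_MG_PER_L[analyte]
}
print("continue monitoring:", needs_monitoring)  # {'arsenic': 0.015}
```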

Relevance:

40.00%

Publisher:

Abstract:

Modern data centers host hundreds of thousands of servers to achieve economies of scale. Such a huge number of servers creates challenges for the data center network (DCN) to provide proportionally large bandwidth. In addition, the deployment of virtual machines (VMs) in data centers raises the requirements for efficient resource allocation and fine-grained resource sharing. Further, the large number of servers and switches in the data center consumes significant amounts of energy; even as servers become more energy efficient through various energy-saving techniques, the DCN still accounts for 20% to 50% of the energy consumed by the entire data center. The objective of this dissertation is to enhance DCN performance as well as energy efficiency by optimizing both the host and network sides. First, as the DCN demands huge bisection bandwidth to interconnect all the servers, we propose a parallel packet switch (PPS) architecture that directly processes variable-length packets without segmentation-and-reassembly (SAR). The proposed PPS achieves large bandwidth by combining the switching capacities of multiple fabrics, and it further improves switch throughput by avoiding the padding bits of SAR. Second, since certain VM resource demands are bursty and stochastic in nature, we propose the Max-Min Multidimensional Stochastic Bin Packing (M3SBP) algorithm to satisfy both deterministic and stochastic demands in VM placement. M3SBP calculates an equivalent deterministic value for the stochastic demands and maximizes the minimum resource utilization ratio of each server. Third, to provide the traffic isolation needed by VMs that share the same physical network adapter, we propose the Flow-level Bandwidth Provisioning (FBP) algorithm. By reducing the flow scheduling problem to multiple stages of packet queuing problems, FBP guarantees the provisioned bandwidth and delay performance for each flow. Finally, while DCNs are typically provisioned with full bisection bandwidth, DCN traffic demonstrates fluctuating patterns, so we propose a joint host-network optimization scheme to enhance the energy efficiency of DCNs during off-peak traffic hours. The proposed scheme uses a unified representation that converts the VM placement problem into a routing problem and employs depth-first and best-fit search to find efficient paths for flows.
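
A minimal sketch of the core M3SBP idea as summarized above: stochastic demands are replaced by an equivalent deterministic value (mean plus a z-scaled margin), and the VM goes to the feasible server that maximizes the minimum post-placement utilization ratio. The z-score, data structures, and tie-breaking below are assumptions, not the dissertation's exact algorithm:

```python
# Sketch of stochastic-to-deterministic demand conversion plus max-min
# multidimensional placement. All parameters are illustrative.
def place_vm(servers, vm, z=1.65):
    """Pick the feasible server that maximizes the minimum post-placement
    utilization ratio across resource dimensions."""
    # Equivalent deterministic demand: mean + z * std (~95th percentile)
    demand = {d: m + z * s for d, (m, s) in vm.items()}
    best, best_score = None, -1.0
    for i, srv in enumerate(servers):
        if all(srv["used"][d] + demand[d] <= srv["cap"][d] for d in demand):
            # Minimum utilization ratio across dimensions after placement
            score = min((srv["used"][d] + demand[d]) / srv["cap"][d]
                        for d in demand)
            if score > best_score:
                best, best_score = i, score
    return best

servers = [
    {"cap": {"cpu": 8.0, "mem": 16.0}, "used": {"cpu": 2.0, "mem": 4.0}},
    {"cap": {"cpu": 16.0, "mem": 32.0}, "used": {"cpu": 1.0, "mem": 2.0}},
]
vm = {"cpu": (2.0, 0.5), "mem": (4.0, 1.0)}  # per-dimension (mean, std)
print(place_vm(servers, vm))  # -> 0 (higher min utilization after placement)
```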

Relevance:

40.00%

Publisher:

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications render administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques to substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to the VMs hosted in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need; service providers can offer a new charging model based on the VMs' performance instead of their configured sizes, so that clients pay exactly for the performance they actually experience; and administrators can maximize total revenue by combining application performance models with SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention among virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, the Artificial Neural Network and the Support Vector Machine, for accurately modeling the performance of virtualized applications, and we suggested and evaluated modeling optimizations necessary to improve prediction accuracy with these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue of a data center.
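
A minimal sketch of the performance-modeling step, using scikit-learn's SVR as a stand-in for the thesis's exact tooling; the features, target, and data below are synthetic assumptions:

```python
# Sketch: learn application performance as a function of resource
# allocations with a Support Vector Machine regressor.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(4)
# Features: [CPU shares, memory MB, I/O bandwidth MB/s] per configuration
X = rng.uniform([0.5, 512, 10], [4.0, 8192, 200], size=(300, 3))
# Target: observed throughput (synthetic, monotone in resources, plus noise)
y = 50 * X[:, 0] + 0.01 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 5, 300)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:250], y[:250])
print("R^2 on held-out configs:", model.score(X[250:], y[250:]))
```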

Relevance:

40.00%

Publisher:

Abstract:

Conceptual database design is an unusually difficult and error-prone task for novice designers. This study examined how two training approaches, rule-based and pattern-based, might improve performance on database design tasks. A rule-based approach prescribes a sequence of rules for modeling conceptual constructs and the actions to be taken at various stages of developing a conceptual model. A pattern-based approach presents data modeling structures that occur frequently in practice and prescribes guidelines for recognizing and using these structures. This study describes the conceptual framework, experimental design, and results of a laboratory experiment with novice designers that compared the effectiveness of the two training approaches (between-subjects) at three levels of task complexity (within-subjects). Results indicate an interaction effect between treatment and task complexity: the rule-based approach was significantly better in the low-complexity and high-complexity cases, while there was no statistically significant difference in the medium-complexity case. Designer performance fell significantly as complexity increased. Overall, though the rule-based approach was not significantly superior to the pattern-based approach in all instances, it outperformed the pattern-based approach at two of the three complexity levels. The primary contributions of the study are (1) the operationalization of the complexity construct to a degree not addressed in previous studies; (2) the development of a pattern-based instructional approach to database design; and (3) the finding that the effectiveness of a particular training approach may depend on the complexity of the task.
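
A minimal sketch of the mixed-design analysis implied by the description, with training approach as the between-subjects factor and task complexity as the within-subjects factor. This assumes the pingouin package, and the data frame is fabricated:

```python
# Sketch: 2 (training, between) x 3 (complexity, within) mixed ANOVA
# on synthetic design-task scores, including a built-in interaction.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(5)
rows = []
for subj in range(40):
    group = "rule" if subj < 20 else "pattern"
    for cx in ["low", "medium", "high"]:
        base = {"low": 80, "medium": 70, "high": 55}[cx]
        bonus = 6 if (group == "rule" and cx != "medium") else 0  # interaction
        rows.append({"subject": subj, "training": group, "complexity": cx,
                     "score": base + bonus + rng.normal(0, 8)})
df = pd.DataFrame(rows)

print(pg.mixed_anova(data=df, dv="score", within="complexity",
                     subject="subject", between="training"))
```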
