954 results for Internal Process-Level Performance
Abstract:
Hazardous radioactive liquid waste is the legacy of more than 50 years of plutonium production associated with the United States' nuclear weapons program. It is estimated that more than 245,000 tons of nitrate wastes are stored at facilities such as the single-shell tanks (SST) at the Hanford Site in the state of Washington and the Melton Valley storage tanks at Oak Ridge National Laboratory (ORNL) in Tennessee. To develop an innovative new technology for the destruction and immobilization of nitrate-based radioactive liquid waste, the United States Department of Energy (DOE) initiated the research project that resulted in the technology known as the Nitrate to Ammonia and Ceramic (NAC) process. Because the nitrate anion is highly mobile and difficult to immobilize, especially in the relatively porous cement-based grout that has been used to date to immobilize liquid waste, it presents a major obstacle to environmental clean-up initiatives. Thus, in an effort to contribute to the existing body of knowledge and enhance the efficacy of the NAC process, this research involved the experimental measurement of the rheological and heat transfer behaviors of the NAC product slurry and the determination of the optimal operating parameters for the continuous NAC chemical reaction process. Test results indicate that the NAC product slurry exhibits typical non-Newtonian flow behavior. Correlation equations for the slurry's rheological properties and heat transfer rate in pipe flow have been developed; these should prove valuable in the design of a full-scale NAC processing plant. The 20-percent slurry exhibited typical dilatant (shear thickening) behavior and, owing to its lower viscosity, was in the turbulent flow regime. The 40-percent slurry exhibited typical pseudoplastic (shear thinning) behavior and remained in the laminar flow regime throughout the experimental range. The reactions were found to be more efficient in the lower temperature range investigated. With respect to leachability, the experimental final NAC ceramic waste form is comparable to the final product of vitrification, the technology chosen by DOE to treat these wastes. Because the NAC process has the potential to reduce the volume of nitrate-based radioactive liquid waste by as much as 70 percent, it not only promises to enhance environmental remediation efforts but also to effect substantial cost savings.
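The dilatant and pseudoplastic behaviors described above are conventionally captured by the Ostwald-de Waele (power-law) model. The sketch below fits that model to hypothetical shear-rate/shear-stress data; it illustrates the general form such a rheological correlation takes, not the specific equations developed in this work.

```python
import numpy as np

# Hypothetical shear-rate (1/s) and shear-stress (Pa) data for a slurry sample;
# the actual NAC measurements are not reproduced here.
shear_rate = np.array([10.0, 25.0, 50.0, 100.0, 200.0, 400.0])
shear_stress = np.array([1.8, 3.2, 4.9, 7.6, 11.8, 18.3])

# Ostwald-de Waele (power-law) model: tau = K * gamma_dot**n
# Fit log(tau) = log(K) + n*log(gamma_dot) by linear least squares.
n, log_K = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
K = np.exp(log_K)

behavior = "pseudoplastic (shear thinning)" if n < 1 else "dilatant (shear thickening)"
print(f"flow behavior index n = {n:.3f}, consistency K = {K:.3f} Pa.s^n -> {behavior}")

# Apparent viscosity at a given shear rate: mu_app = K * gamma_dot**(n - 1)
mu_app = K * 100.0 ** (n - 1.0)
print(f"apparent viscosity at 100 1/s: {mu_app:.4f} Pa.s")
```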
Abstract:
The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nationwide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide designed by the researcher was used, and respondents were encouraged to expand on their recollections as desired. Audiotapes were transcribed, and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of and feedback on the data analysis by readers experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality, and spatiality) were also examined to reveal the everydayness of making change.
Abstract:
The purpose of this dissertation was to examine the form of the consumer satisfaction/dissatisfaction (CS/D) response to disconfirmation. In addition, the cognitive and affective processes underlying the response were explored. Respondents were provided with information from a prior market research study about a new brand of printer that was being tested. This market research information helped set prior expectations regarding print quality. Subjects were randomly assigned to an experimental condition that manipulated prior expectations to be either positive or negative. Respondents were then provided with printouts whose performance quality was either worse (negative disconfirmation) or better (positive disconfirmation) than the prior expectations. In other words, for each level of expectation, respondents were assigned to either a positive or a negative disconfirmation condition. Subjects were also randomly assigned to either a high or a low level of outcome involvement. Analyses of variance indicated that positive disconfirmation led to a more intense CS/D response than negative disconfirmation, even though there was no significant difference in the intensity of the disconfirmation between the positive and negative conditions. Intensity of CS/D was measured by the distance of the CS/D rating from the midpoint of the scale. The study also found that although outcome involvement did not influence the polarity of the CS/D response, more direct measures of processing involvement, such as the subjects' concentration, attention, and care in evaluating the printout, did have a significant positive effect on CS/D intensity. Analyses of covariance also indicated that the relationship between the intensity of the CS/D response and the intensity of the disconfirmation was mediated by the intensity of affective responses. Positive disconfirmation led to more intense affective responses than negative disconfirmation.
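As the abstract notes, CS/D intensity was operationalized as the distance of the rating from the scale midpoint. A minimal sketch of that computation, using made-up 7-point ratings rather than the study's data:

```python
import numpy as np

# Hypothetical 7-point CS/D ratings (1 = very dissatisfied, 7 = very satisfied);
# the actual experimental data are not reproduced here.
positive_disconf = np.array([6, 7, 6, 5, 7, 6])   # performance better than expected
negative_disconf = np.array([2, 3, 2, 4, 3, 2])   # performance worse than expected

midpoint = 4.0
# Intensity of the CS/D response = distance of the rating from the scale midpoint.
intensity_pos = np.abs(positive_disconf - midpoint)
intensity_neg = np.abs(negative_disconf - midpoint)

print(f"mean intensity, positive disconfirmation: {intensity_pos.mean():.2f}")
print(f"mean intensity, negative disconfirmation: {intensity_neg.mean():.2f}")
```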
Abstract:
This dissertation analyzes the current status of emergency management professionalization in the United States and Florida using a qualitative case study. I investigate the efforts of various organizations at the national and state levels, in both the private and public sectors, to organize emergency management as a profession. I conceptualize emergency management professionalization as occurring in two phases: the indirect institutionalization of the occupation of emergency management and the formal advancement toward an emergency management profession. The legislative, organizational, and procedural developments that occurred between approximately 1900 and the late 1970s constituted the indirect institutionalization of the occupation of emergency management. Over time, as our society developed and became increasingly complex, more disasters came to affect the security of the population. In order to adapt to increasing risks and vulnerabilities, the emergency management system emerged and, with it, the elements upon which a future profession could be established, providing the basis for the formal advancement toward an emergency management profession. During approximately the last twenty years, the formal advancement toward an emergency management profession has encompassed two primary strategies, certification and accreditation, motivated by the objective of organizing a profession. Certification applies to individual emergency managers and includes all training and education. Accreditation of state and local emergency management agencies is achieved by complying with established standards of performance at a minimum level of proficiency. Certification and accreditation are the mechanisms used to create an emergency management profession and thus act as axes around which the field of emergency management is organizing. The purpose of this research is to provide a frame of reference for whether or not the field of emergency management is a profession. Based on the sociology of professions literature, emergency management can be considered to be professionalizing. The current professionalization efforts may or may not be sufficient to achieve the ultimate goal of becoming a legitimate profession, which rests on legal and public support for the exclusive right to perform emergency management tasks (monopoly) as well as self-regulation of those tasks (autonomy).
Abstract:
Increasing parental involvement was made an important goal for all Florida schools in the educational reform legislation of the 1990s. A forum for this input was established and became known as the School Advisory Council (SAC). To demonstrate the importance of process and inclusion, a south Florida school district and its local teachers' union agreed on the following five goals for SACs: (a) to foster an environment of professional collaboration among all stakeholders, (b) to assist in the preparation and evaluation of the school improvement plan, (c) to address all state and district goals, (d) to serve as the avenue for authentic and representative input from all stakeholders, and (e) to ensure the continued existence of the consensus-building process on all issues related to the school's instructional program. The purpose of this study was to determine to what extent and in what ways the parent members of one south Florida middle school's SAC achieved the five district goals during its first three years of implementation. The primary participants were 16 parents who served as members of the SAC, while 16 non-parent members provided perspective on parent involvement as "outside sources." Being qualitative by design, the study described factors such as school climate, leadership styles, and the quality of parental input from data collected from four sources: parent interviews, a questionnaire of non-parents, researcher observations, and relevant documents. A cross-case analysis of all data informed a process evaluation that described the similarities and differences of intended and observed outcomes of parent involvement from each source using Stake's descriptive matrix model. A formative evaluation of the process compared the observed outcomes with standards set for successful SACs, such as the district's five goals. The findings indicated that the parents elected to the SAC did not meet the intended goals set by the state and district. The school leadership did not foster an environment of professional collaboration and authentic decision-making for parents and other stakeholders. The overall process did not include consensus-building, and there was little if any input by parents on school improvement and other important issues relating to the instructional program. Only two parents gave the SAC a successful rating for involving parents in the decision-making process. Although compliance was met in many of the procedural transactions of the SAC, the reactions of parents to their perceived role and influence often reflected feelings of powerlessness and frustration with a process that many thought lacked meaningfulness and productivity. Two conclusions drawn from this study are as follows: (a) the role of the principal in the collaborative process is pivotal, and (b) the normative-re-educative approach to change would be most appropriate for SACs.
Abstract:
The contributions of this dissertation are the development of two new, interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation, and (2) a shift-invariant sub-decimation decomposition method to overcome the deficiency of the decimation process in estimating motion, which stems from the shift-variant nature of the decimated wavelet transform. The enormous data generated by digital video creates an intense need for efficient video compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies within and between video frames by applying motion estimation and motion compensation (MEMC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reasonably, hierarchical motion estimation with coarse-to-fine resolution refinements using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability. Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It exploits the possible intrablocks in the subbands for lower-entropy coding while keeping the low computational load of level-refined motion estimation, thus achieving both temporal compression quality and computational simplicity. Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals to allow more accurate motion estimation without discontinuous boundary distortions. Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain because of the shift-variance of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains motion consistency between the original frame and the decomposed subframes, consequently improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
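For readers unfamiliar with the underlying operation, the sketch below shows single-level, full-search block matching, the basic motion estimation step that the level-refined, wavelet-domain approach builds on. It is a generic illustration with toy frames, not the LRSC or sub-decimation method itself.

```python
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Full-search block matching: for each block in the current frame, find the
    displacement into the reference frame that minimizes the sum of absolute
    differences (SAD). A single-resolution sketch of the coarse-to-fine idea."""
    h, w = cur.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = cur[by:by + block, bx:bx + block]
            best, best_mv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    sad = np.abs(ref[y:y + block, x:x + block] - cur_blk).sum()
                    if sad < best:
                        best, best_mv = sad, (dy, dx)
            vectors[by // block, bx // block] = best_mv
    return vectors

# Toy frames: the "current" frame is the reference shifted by (2, 3) pixels.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))
print(block_match(ref, cur)[1, 1])   # expect a motion vector of about [-2, -3]
```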
Abstract:
Antenna design is an iterative process in which structures are analyzed and changed to comply with required performance parameters. The classic approach starts by analyzing a "known" structure, obtaining the value of its performance parameter, and changing the structure until the "target" value is achieved. This process relies on having an initial structure that follows patterns already known or "intuitive" to the designer. The purpose of this research was to develop a method of designing UWB antennas. What is new in this proposal is that the design process is reversed: the designer starts with the target performance parameter and obtains a structure as the result of the design process. The method provides a new way to replicate and optimize existing performance parameters. The basis of the method is a Genetic Algorithm (GA) adapted to the format of the chromosome that is evaluated by the electromagnetic (EM) solver. For the electromagnetic study we used the XFDTD™ program, based on the Finite-Difference Time-Domain technique. The programming portion of the method was created in the MatLab environment, which serves as the interface for converting chromosomes, handling file formats, and transferring data between XFDTD™ and the GA. A high level of customization had to be written into the code to work with the specific files generated by the XFDTD™ program. Two types of cost functions were evaluated: the first seeking broadband performance within the UWB band, and the second searching for curve replication of a reference geometry. The performance of the method was evaluated considering the speed provided by the computer resources used. A balance between accuracy, data file size, and speed of execution was achieved by defining parameters in the GA code as well as changing the internal parameters of the XFDTD™ projects. The results showed that the GA produced geometries that were analyzed by the XFDTD™ program and changed according to the search criteria until the target value of the cost function was reached. The results also showed how the parameters can change the search criteria and influence the running of the code to produce a variety of geometries.
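A minimal sketch of the GA loop described above, with a stand-in cost function in place of the XFDTD™ evaluation; the chromosome encoding, population size, and mutation rate shown are illustrative assumptions, not the dissertation's actual settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(chromosome):
    """Stand-in cost function. In the workflow described above this value would
    come from the XFDTD electromagnetic solver (e.g., return loss across the
    UWB band); here a simple surrogate is used so the loop is runnable."""
    target_fill = 0.35                       # hypothetical target metal fill ratio
    return abs(chromosome.mean() - target_fill)

def evolve(pop_size=40, genes=64, generations=100, p_mut=0.02):
    pop = rng.integers(0, 2, size=(pop_size, genes))   # 1 = metal cell, 0 = empty
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, genes)               # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mutate = rng.random(genes) < p_mut         # bit-flip mutation
            child[mutate] ^= 1
            children.append(child)
        pop = np.vstack([parents, children])
    best = min(pop, key=cost)
    return best, cost(best)

best, best_cost = evolve()
print(f"best cost: {best_cost:.4f}")
```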
Abstract:
The total time a customer spends in a business process system, called the customer cycle-time, is a major contributor to overall customer satisfaction. Business process analysts and designers are frequently asked to design process solutions with optimal performance. Simulation models have been very popular for quantitatively evaluating business processes; however, simulation is time-consuming and requires extensive modeling experience. Moreover, simulation models neither provide recommendations nor yield optimal solutions for business process design. A queueing network model is a good analytical approach to business process analysis and design and can provide a useful abstraction of a business process. However, existing queueing network models were developed for telephone systems or applied to manufacturing processes in which machine servers dominate the system. In a business process, the servers are usually people, and the characteristics of human servers, namely specialization and coordination, should be taken into account by the queueing model. The research described in this dissertation develops an open queueing network model for quick analysis of business processes. Additionally, optimization models are developed to provide optimal business process designs. The queueing network model extends and improves upon existing multi-class open queueing network (MOQN) models so that customer flow in human-server-oriented processes can be modeled. The optimization models help business process designers find the optimal design of a business process with consideration of specialization and coordination. The main findings of the research are as follows. First, parallelization can reduce the cycle-time for those customer classes that require more than one parallel activity; however, under highly utilized servers the coordination time introduced by parallelization overwhelms the savings, since the waiting time increases significantly and the cycle-time therefore increases. Second, the level of industrial technology employed by a company and the coordination time needed to manage tasks have the strongest impact on business process design: when the level of industrial technology is high, more division of work is required to improve the cycle-time; when the required coordination time is high, consolidation is required to improve the cycle-time.
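As a generic illustration of how a queueing abstraction yields customer cycle-time, the sketch below evaluates a two-activity process with the textbook M/M/c (Erlang C) formula; it is not the dissertation's multi-class open queueing network model, and all parameters are hypothetical.

```python
from math import factorial

def mmc_waiting_time(arrival_rate, service_rate, servers):
    """Expected waiting time in queue for an M/M/c station (Erlang C formula)."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1:
        raise ValueError("station is unstable (utilization >= 1)")
    a = arrival_rate / service_rate
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(servers))
                + a**servers / (factorial(servers) * (1 - rho)))
    erlang_c = (a**servers / (factorial(servers) * (1 - rho))) * p0
    return erlang_c / (servers * service_rate - arrival_rate)

# Hypothetical two-activity business process handled by human servers.
# Customer cycle-time = sum over activities of (waiting time + service time).
activities = [
    # (arrivals/hour, completions/hour per person, number of people)
    (20.0, 6.0, 4),   # e.g., application review
    (20.0, 10.0, 3),  # e.g., approval
]
cycle_time = sum(mmc_waiting_time(lam, mu, c) + 1.0 / mu for lam, mu, c in activities)
print(f"expected customer cycle-time: {cycle_time * 60:.1f} minutes")
```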
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for major roads, leaving most local roads without any AADT information. However, AADTs are needed on local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000; accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
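The sketch below illustrates the trip distribution step with a singly constrained gravity model, using hypothetical parcel productions, site attractions, and travel times; the ITE-based trip generation and the all-or-nothing assignment steps are not shown, and the exponential deterrence function is an assumption.

```python
import numpy as np

# Hypothetical inputs: trips generated by 4 parcels (from ITE-style rates) and
# travel-time impedances from each parcel to 3 traffic count sites.
parcel_trips = np.array([120.0, 80.0, 200.0, 50.0])          # productions per parcel
site_attraction = np.array([1.0, 2.0, 1.5])                  # relative attractiveness
travel_time = np.array([[5.0, 12.0, 20.0],                   # minutes, parcel x site
                        [8.0, 6.0, 15.0],
                        [15.0, 10.0, 4.0],
                        [20.0, 18.0, 6.0]])

beta = 0.1                                                     # impedance parameter
friction = np.exp(-beta * travel_time)                         # deterrence function
weights = site_attraction * friction                           # unnormalized shares
shares = weights / weights.sum(axis=1, keepdims=True)
trips_to_sites = (parcel_trips[:, None] * shares).sum(axis=0)  # trips arriving at each site
print(trips_to_sites)    # these flows would then be assigned to the road network
```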
Abstract:
Many restaurant organizations have committed substantial effort to studying the relationship between a firm's performance and its effort to develop an effective human resources management reward-and-retention system. These studies have produced various metrics for determining the efficacy of restaurant management and human resources management systems. This paper explores the best metrics to use when calculating the overall unit performance of casual restaurant managers. These metrics were identified through an exploratory qualitative case study method that included interviews with executives and a Delphi study. Experts proposed several diverse metrics for measuring management value and performance, and these factors seem to represent the interests of all stakeholders.
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need, and service providers can offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they actually experience, while administrators will be able to maximize their total revenue by exploiting application performance models and SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue for a data center.
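As an example of the kind of performance model described above, the sketch below trains a Support Vector Regression model (scikit-learn's SVR) on synthetic allocation-versus-throughput data; the features, data, and hyperparameters are illustrative assumptions, not those used in the thesis.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: resource allocations for a VM (CPU cap in %,
# memory in GB, I/O bandwidth in MB/s) and the measured application
# performance (e.g., requests/second). Real data would come from benchmarking.
rng = np.random.default_rng(42)
X = rng.uniform([10, 1, 10], [100, 8, 200], size=(200, 3))
y = 0.6 * X[:, 0] + 30 * np.log1p(X[:, 1]) + 0.2 * X[:, 2] + rng.normal(0, 5, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
model.fit(X, y)

# Predict performance for a candidate VM size; an optimizer could then search
# for the cheapest allocation that still meets the SLA target.
candidate = np.array([[60.0, 4.0, 120.0]])
print(f"predicted throughput: {model.predict(candidate)[0]:.1f} req/s")
```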
Abstract:
Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially as energy consumption and chip area become two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications so that system performance can be improved within constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvement for highly mathematical calculations or repeated functions. The performance of an SoC system can then be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core, and the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop counts. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications such as performance, energy consumption, and resource costs are measured and analyzed, and the trade-off among these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used to obtain higher performance at lower resource cost. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique. The system achieves a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design; the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
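A back-of-the-envelope view of why accelerating only the profiled hotspot pays off is given by Amdahl's law. The sketch below uses hypothetical numbers (a 70% hotspot and a 10x accelerator); the 2.8X and 7.9X figures reported above come from the study's own measurements, not from this calculation.

```python
def accelerated_speedup(hotspot_fraction, accel_speedup):
    """Amdahl's-law estimate of whole-application speedup when only the
    profiled hotspot is offloaded to a hardware accelerator."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_speedup)

# Hypothetical numbers, not the H.264 CODEC measurements from the study:
# profiling shows the hotspot takes 70% of runtime and the FPGA version of
# that function runs 10x faster than the software version.
speedup = accelerated_speedup(hotspot_fraction=0.70, accel_speedup=10.0)
print(f"overall speedup: {speedup:.2f}x")

# A first-order energy comparison: energy ~ power x time for each component.
sw_time, sw_power = 1.0, 2.0            # normalized baseline runtime and watts
hw_time = sw_time / speedup
hw_power = 2.2                          # CPU + accelerator draw slightly more power
energy_saving = 1.0 - (hw_time * hw_power) / (sw_time * sw_power)
print(f"estimated energy saving: {energy_saving:.1%}")
```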
Abstract:
In this dissertation, I present an integrated model of organizational performance. Most prior research has relied extensively on testing individual linkages, often with cross-sectional data. In this dissertation, longitudinal unit-level data from 559 restaurants, collected over a one-year period, were used to test the proposed model. The model was hypothesized to begin with employee satisfaction as a key antecedent that would ultimately lead to improved financial performance. Several variables, including turnover, efficiency, and guest satisfaction, are proposed as mediators of the satisfaction-performance relationship. The current findings replicate and extend past research based on individual-level data. The overall model adequately explained the data but was significantly improved by an additional link from employee satisfaction to efficiency, which was not originally hypothesized. Management turnover was a strong predictor of hourly-level team turnover, and both were significant predictors of efficiency. Full findings for each hypothesis are presented, and practical organizational implications are given. Limitations and recommendations for future research are provided.
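A minimal sketch of testing one mediated path of the kind described above (employee satisfaction -> efficiency -> financial performance), using synthetic data and simple OLS regressions rather than the dissertation's full model:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical unit-level data standing in for the 559-restaurant panel:
# employee satisfaction, efficiency (a proposed mediator), and financial performance.
rng = np.random.default_rng(7)
n = 559
satisfaction = rng.normal(0, 1, n)
efficiency = 0.5 * satisfaction + rng.normal(0, 1, n)          # path a: X -> mediator
performance = 0.4 * efficiency + 0.1 * satisfaction + rng.normal(0, 1, n)

# Simple Baron-and-Kenny style check of mediation with two regressions.
med_model = sm.OLS(efficiency, sm.add_constant(satisfaction)).fit()
out_model = sm.OLS(performance,
                   sm.add_constant(np.column_stack([satisfaction, efficiency]))).fit()
indirect = med_model.params[1] * out_model.params[2]            # a * b path estimate
print(f"indirect (mediated) effect of satisfaction on performance: {indirect:.3f}")
```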
Abstract:
This sequential explanatory, mixed-methods research design examines the role teachers should play in the development of the teacher evaluation system in Louisiana. These insights are intended to help teachers act as catalysts in the classroom for significantly increasing student achievement and to enable policymakers, practitioners, and instructional leaders to act as informed decision makers.