17 results for Data-based Safety Evaluation
in Digital Commons at Florida International University
Abstract:
This qualitative case study explored how employees learn from Team Primacy Concept (TPC)-based employee evaluation and how they apply that knowledge in their job performance. Kolb's experiential learning model (1974) served as the conceptual framework for revealing the process by which employees learn from TPC evaluation, namely, how they experience, reflect on, conceptualize, and act on performance feedback. TPC-based evaluation is a form of multirater evaluation that consists of three components: self-feedback, supervisor's feedback, and peer feedback. The distinctive characteristic of TPC-based evaluation is the team evaluation component, during which the employee's professional performance is discussed by one's peers in a face-to-face team setting, whereas other forms of multirater evaluation are usually conducted in a confidential and anonymous manner. Case study formed the methodological framework. The case was the Southeastern Virginia (SEVA) region of the Institute for Family Centered Services (IFCS), and the participants were eight employees of the SEVA region. Findings showed that the evaluation process was anxiety-producing for employees, especially the process of peer evaluation in a team setting. Preparation was found to be an important phase of TPC evaluation. Overall, the positive feedback delivered in a team setting made team members feel acknowledged. The study participants felt that honesty in providing feedback and openness to hearing challenges were significant prerequisites to the TPC evaluation process. Further, in the planning phase, employees strove to develop goals for themselves that were meaningful. Also, the catalyst for feedback implementation appeared to stem from one's accountability to self and to the client or community. Generally, the participants identified a number of performance improvement goals that they attained during their employment with IFCS, supported by their developmental plans. In conclusion, the study identified the process by which employees learned from TPC-based employee evaluation and the ways in which they used that knowledge to improve their job performance. Specifically, the study examined how participants felt and what they thought about TPC-based feedback, in what ways they reflected on and made meaning of the feedback, and how they used the feedback to improve their job performance.
Abstract:
Due to rapid advances in computing and sensing technologies, enormous amounts of data are generated every day in various applications. The integration of data mining and data visualization has been widely used to analyze these massive and complex data sets to discover hidden patterns. For both data mining and visualization to be effective, it is important to include visualization techniques in the mining process and to present the discovered patterns in a more comprehensive visual view. In this dissertation, four related problems are studied to explore the integration of data mining and data visualization: dimensionality reduction for visualizing high-dimensional datasets, visualization-based clustering evaluation, interactive document mining, and exploration of multiple clusterings. In particular, we 1) propose an efficient feature selection method (reliefF + mRMR) for preprocessing high-dimensional datasets; 2) present DClusterE, which integrates cluster validation with user interaction and provides rich visualization tools for users to examine document clustering results from multiple perspectives; 3) design two interactive document summarization systems that involve users' efforts and generate customized summaries from 2D sentence layouts; and 4) propose a new framework that organizes different input clusterings into a hierarchical tree structure and allows interactive exploration of multiple clustering solutions.
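The reliefF + mRMR pipeline is named but not detailed in the abstract. As a rough, hypothetical sketch of the mRMR stage alone — greedy selection that maximizes relevance to the class labels while penalizing redundancy with already-chosen features — assuming scikit-learn and discretized features (neither is stated in the dissertation):

```python
# Hypothetical sketch of greedy mRMR feature selection (not the authors' code).
# Assumes features are discretized; continuous data would need binning first.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, n_features):
    """Greedily pick features maximizing relevance minus mean redundancy."""
    relevance = mutual_info_classif(X, y)              # I(feature; class)
    selected = [int(np.argmax(relevance))]             # seed with most relevant
    remaining = [j for j in range(X.shape[1]) if j != selected[0]]
    while len(selected) < n_features and remaining:
        scores = []
        for j in remaining:
            # Mean mutual information with already-selected features (redundancy).
            red = np.mean([mutual_info_score(X[:, j], X[:, s]) for s in selected])
            scores.append(relevance[j] - red)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

The greedy trade-off (relevance minus mean redundancy) is the standard mRMR difference criterion; the dissertation's exact scoring variant is not specified in the abstract.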
Abstract:
Voice communication systems such as Voice-over-IP (VoIP), Public Switched Telephone Networks, and Mobile Telephone Networks are an integral means of human tele-interaction. These systems pose distinctive challenges due to their unique characteristics, such as low volume, burstiness, and stringent delay/loss requirements across heterogeneous underlying network technologies. Effective quality evaluation methodologies are important for system development and refinement, particularly those that adopt measurement based on user feedback. Presently, most evaluation models are system-centric (Quality of Service, or QoS-based), which motivated us to explore a user-centric (Quality of Experience, or QoE-based) approach as a step towards the human-centric paradigm of system design. We investigate an affect-based QoE evaluation framework that attempts to capture users' perception while they are engaged in voice communication. Our modular approach consists of feature extraction from multiple information sources, including various affective cues, and different classification procedures such as Support Vector Machines (SVM) and k-Nearest Neighbors (kNN). The experimental study is illustrated in depth with detailed analysis of results. The evidence collected demonstrates the potential feasibility of our approach for QoE evaluation and suggests considering human affective attributes in modeling user experience.
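The abstract names SVM and kNN as the classification procedures but gives no implementation detail. A minimal sketch of that kind of classification stage, with synthetic vectors standing in for the extracted affective features (the feature dimensionality, labels, and scikit-learn usage are all assumptions):

```python
# Hypothetical sketch of an SVM/kNN classification stage of the kind described
# above; synthetic features stand in for the extracted affective cues.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))        # placeholder affect feature vectors
y = rng.integers(0, 2, size=200)      # placeholder QoE labels (e.g., good/poor)

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```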
Abstract:
The safety of workers in nighttime roadway work zones has become a major concern for state transportation agencies due to the increase in the number of work zone fatalities. During the last decade, several studies have focused on improving safety in nighttime roadway work zones, but the element that is still missing is a set of tools for translating the research results into practice. This paper discusses: 1) the importance of translating research results related to worker safety and safety planning of nighttime work zones into practice, and 2) examples of tools that can be used for translating the results of such studies into practice. A tool that can propose safety recommendations for nighttime work zones and a web-based safety training tool for workers are presented. The tools were created as a component of a five-year research study on the assessment of the safety of nighttime roadway construction. The objectives of both tools are explained, as well as their functionalities (i.e., what the tools can do for users), their components (e.g., knowledge base, database, and interfaces), and their structures (i.e., how the components of the tools are organized to meet the objectives). Evaluations by the intended users of each tool are also presented.
Abstract:
Stable isotope analysis has become a standard ecological tool for elucidating feeding relationships of organisms and determining food web structure and connectivity. There remain important questions concerning rates at which stable isotope values are incorporated into tissues (turnover rates) and the change in isotope value between a tissue and a food source (discrimination values). These gaps in our understanding necessitate experimental studies to adequately interpret field data. Tissue turnover rates and discrimination values vary among species and have been investigated in a broad array of taxa. However, little attention has been paid to ectothermic top predators in this regard. We quantified the turnover rates and discrimination values for three tissues (scutes, red blood cells, and plasma) in American alligators (Alligator mississippiensis). Plasma turned over faster than scutes or red blood cells, but turnover rates of all three tissues were very slow in comparison to those in endothermic species. Alligator δ15N discrimination values were surprisingly low in comparison to those of other top predators and varied between experimental and control alligators. The variability of δ15N discrimination values highlights the difficulties in using δ15N to assign absolute and possibly even relative trophic levels in field studies. Our results suggest that interpreting stable isotope data based on parameter estimates from other species can be problematic and that large ectothermic tetrapod tissues may be characterized by unique stable isotope dynamics relative to species occupying lower trophic levels and endothermic tetrapods.
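For context, isotopic turnover of the kind quantified here is commonly fitted with a first-order exponential model (a standard formulation in the stable isotope literature; the abstract does not state the exact model used in this study):

δ(t) = δ_f + (δ_i − δ_f) e^(−λt),   t_{1/2} = ln(2)/λ

where δ_i and δ_f are the initial and equilibrium isotope values of a tissue, λ is the tissue turnover rate, t_{1/2} the corresponding half-life, and the discrimination value is the equilibrium offset between tissue and diet, Δ = δ_tissue − δ_diet.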
Abstract:
Current demands for accountability in education emphasize outcome-based program evaluation and tie program funding to individual student performance. As has been the case for elementary and secondary programs, demands for accountability have increased pressure on adult educators to show evidence of the benefits of their programs in order to justify their financial support. In Florida, recent legislation fundamentally changes the delivery of adult education in the state by establishing a performance-based funding system that is based on outcomes related to the retention, completion, and employment of program participants. A performance-based funding system requires an evaluation process that stresses outcome indicators over indicators that focus on program context or process. Although the state has adopted indicators of program quality to evaluate its adult education programs, these indicators focus mostly on program processes rather than student outcomes. In addition, the indicators are not specifically tied to workforce development outcomes, a priority to federal and local funding agents. Improving the accountability of adult education programs and defining the role of these programs in Florida's Workforce Development System has become a priority to policy makers across the state. Another priority has been to involve adult education practitioners in every step of this process. This study was conducted in order to determine what performance indicators, as judged by the directors and supervisors of adult education programs in the state of Florida, are important and feasible in measuring the quality and effectiveness of these programs. The results of the study indicated that, both statewide and by region, the respondents consistently gave the highest ratings on both importance and feasibility to the indicators of Program Context, which reflect the needs, composition, and structure of the programs, and to the indicators of Educational Gain, which reflect learner progress in the attainment of basic skills and competencies. In turn, the respondents gave the lowest ratings on both importance and feasibility to the indicators in the areas of Return on State's Investment, Efficiency, Retention, and Workforce Training. In general, the indicators that received high ratings for importance also received high ratings for feasibility.
Abstract:
The increase in the number of financial restatements in recent years has resulted in a significant decrease in the amount of market capitalization for restated companies. Prior literature did not differentiate between single and multiple restatement announcements. This research investigated the inter-relationships among multiple financial restatements, corporate governance, market microstructure, and the firm's rate of return in the form of three essays, differentiating between single and multiple restatement announcement companies. The first essay examined the stock performance of companies announcing financial restatements multiple times. The postulation is that prior research overestimates the abnormal return by not separating single restatement companies from multiple restatement companies. This study investigated how the market penalizes companies that announce restatements more than once. After differentiating the restatement announcement data by the number of announcements, the results supported the non-persistence hypothesis: the market has no memory, and the negative abnormal returns obtained after each restatement announcement are completely random. The second essay examined multiple restatement announcements and the perceived information asymmetry resulting around the announcement day, in terms of whether the bid-ask spread widens around the announcement day. The empirical analysis supported the hypotheses that the spread widens not only around the first restatement announcement day but around every subsequent announcement day as well. The third essay empirically examined the financial and corporate governance characteristics of single and multiple restatement announcement companies. The analysis showed that corporate governance variables influence the occurrence of multiple restatement announcements and can distinguish multiple restatement announcement companies from single restatement announcement companies.
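The abnormal returns examined in the first essay are conventionally estimated with a market-model event study. A minimal, hypothetical sketch of that textbook computation follows; the dissertation's actual estimation windows and return model are not given in the abstract:

```python
# Hypothetical market-model event-study sketch (textbook method; the
# dissertation's exact windows and model are not specified in the abstract).
import numpy as np

def abnormal_returns(stock_r, market_r, event_idx, est_win=120, ev_win=5):
    """Daily abnormal returns and CAR in [-ev_win, +ev_win] around event_idx."""
    # Estimate the market model r_i = a + b * r_m over a pre-event window.
    est = slice(event_idx - est_win - ev_win, event_idx - ev_win)
    b, a = np.polyfit(market_r[est], stock_r[est], 1)   # slope, intercept
    ev = slice(event_idx - ev_win, event_idx + ev_win + 1)
    ar = stock_r[ev] - (a + b * market_r[ev])           # abnormal returns
    return ar, ar.sum()                                 # daily ARs and CAR
```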
Abstract:
Advances in multiscale material modeling of structural concrete have created an upsurge of interest in the accurate evaluation of the mechanical properties and volume fractions of its nano constituents. The task is accomplished by analyzing the response of a material to indentation, obtained as the outcome of a nanoindentation experiment, using a procedure called the Oliver and Pharr (OP) method. Despite its widespread use, the accuracy of this method is often questioned when it is applied to data from heterogeneous materials or from materials that show pile-up and sink-in during indentation, which necessitates the development of an alternative method. In this study, a model is developed within the framework defined by contact mechanics to compute the nanomechanical properties of a material from its indentation response. Unlike the OP method, indentation energies are employed in the form of dimensionless constants to evaluate the model parameters. Analysis of load-displacement data for a wide range of materials revealed that the energy constants may be used to determine the indenter tip bluntness, hardness, and initial unloading stiffness of the material. The proposed model has two main advantages: (1) it does not require computation of the contact area, a source of error in the existing method; and (2) it explicitly incorporates the effects of peak indentation load, dwelling period, and indenter tip bluntness on the measured mechanical properties. Indentation tests were also carried out on cement paste samples to validate the energy-based model by determining the elastic modulus and hardness of different phases of the paste. The model was found to compute the mechanical properties in close agreement with those obtained by the OP method; a discrepancy, though insignificant, was observed more in the case of C-S-H than in the anhydrous phase. Nevertheless, the proposed method is computationally efficient, and thus it is highly suitable when the grid indentation technique must be performed. In addition, several empirical relations are developed that prove crucial in understanding the nanomechanical behavior of cementitious materials.
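For context, the OP method that the proposed energy-based model is benchmarked against extracts hardness and reduced modulus from the unloading branch of the load-displacement curve. A minimal sketch of those standard relations, assuming an ideal Berkovich area function and a geometry factor of 1 (this is the textbook benchmark, not the dissertation's energy model):

```python
# Sketch of the standard Oliver-Pharr relations used as the benchmark above
# (ideal Berkovich area function; geometry correction factor beta taken as 1).
import numpy as np

def oliver_pharr(P_max, h_max, S, eps=0.75):
    """Hardness and reduced modulus from peak load, depth, unloading stiffness."""
    h_c = h_max - eps * P_max / S                     # contact depth
    A_c = 24.5 * h_c**2                               # ideal Berkovich area
    H = P_max / A_c                                   # hardness
    E_r = (np.sqrt(np.pi) / 2.0) * S / np.sqrt(A_c)   # reduced modulus
    return H, E_r
```

Note that A_c is exactly the contact-area computation the abstract identifies as a source of error, which is what the proposed energy-constant formulation avoids.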
Abstract:
Buildings and other infrastructure located in the coastal regions of the US have a higher level of wind vulnerability. Reducing the increasing property losses and casualties associated with severe windstorms has been the central research focus of the wind engineering community. The present wind engineering toolbox consists of building codes and standards, laboratory experiments, and field measurements. The American Society of Civil Engineers (ASCE) 7 standard provides wind loads only for buildings with common shapes; for complex cases it refers to physical modeling. Although that option can be economically viable for large projects, it is not cost-effective for low-rise residential houses. To circumvent these limitations, a numerical approach based on the techniques of Computational Fluid Dynamics (CFD) has been developed. Recent advances in computing technology and significant developments in turbulence modeling are making numerical evaluation of wind effects a more affordable approach. The present study targeted cases that are not addressed by the standards, including wind loads on complex roofs of low-rise buildings, aerodynamics of tall buildings, and effects of complex surrounding buildings. Among all the turbulence models investigated, the large eddy simulation (LES) model performed best in predicting wind loads. The application of a spatially evolving, time-dependent wind velocity field with the relevant turbulence structures at the inlet boundaries was found to be essential. All results were compared with and validated against experimental data. The study also revealed CFD's unique flow visualization and aerodynamic data generation capabilities, along with a better understanding of the complex three-dimensional aerodynamics of wind-structure interactions. With proper modeling that realistically represents the actual turbulent atmospheric boundary layer flow, CFD can offer an economical alternative to the existing wind engineering tools. CFD's easy accessibility is expected to transform the practice of structural design for wind, resulting in more wind-resilient and sustainable systems by encouraging optimal aerodynamic and sustainable structural/building design. Thus, this method will help ensure public safety and reduce economic losses due to wind perils.
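As a small illustration of how wind loads are typically reduced from CFD output, the sketch below converts sampled surface pressures into pressure coefficients; the definition is standard, but the reference density, wind speed, and pressure values are placeholders, not values from the study:

```python
# Hypothetical post-processing sketch: surface pressures from a CFD run
# converted to pressure coefficients (standard definition; the reference
# values and sampled pressures below are placeholders, not study data).
import numpy as np

def pressure_coefficient(p_surface, p_ref, rho=1.225, U_ref=30.0):
    """Cp = (p - p_ref) / (0.5 * rho * U_ref**2)."""
    return (p_surface - p_ref) / (0.5 * rho * U_ref**2)

p = np.array([350.0, -410.0, -620.0])   # sampled surface pressures, Pa (made up)
print(pressure_coefficient(p, p_ref=0.0))
```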
Abstract:
Elemental analysis can provide an important piece of evidence to assist in the solution of a case. The work presented in this dissertation aims to evaluate the evidential value of the elemental composition of three particular matrices: ink, paper, and glass. In the first part of this study, the analytical performance of LIBS and LA-ICP-MS methods was evaluated for paper, writing inks, and printing inks. A total of 350 ink specimens were examined, including black and blue gel inks, ballpoint inks, inkjets, and toners originating from several manufacturing sources and/or batches. The paper collection set consisted of over 200 paper specimens originating from 20 different paper sources produced by 10 different plants. Micro-homogeneity studies show smaller variation of elemental composition within a single source (i.e., sheet, pen, or cartridge) than the observed variation between different sources (i.e., brands, types, batches). Significant and detectable differences in the elemental profiles of the inks and paper were observed between samples originating from different sources (discrimination of 87–100% of samples, depending on the sample set under investigation and the method applied). These results support the use of elemental analysis, using LA-ICP-MS and LIBS, for the examination of documents, and they provide additional discrimination beyond the techniques currently used in document examination. In the second part of this study, a direct comparison between four analytical methods (µ-XRF, solution-ICP-MS, LA-ICP-MS, and LIBS) was conducted for glass analysis using interlaboratory studies. The data provided by 21 participants were used to assess the performance of the analytical methods in associating glass samples from the same source and differentiating different sources, as well as the use of different match criteria: confidence intervals (±6s, ±5s, ±4s, ±3s, ±2s), a modified confidence interval, t-tests (sequential univariate, p = 0.05 and p = 0.01), a t-test with Bonferroni correction (for multivariate comparisons), range overlap, and Hotelling's T2 test. Error rates (Type 1 and Type 2) are reported for each of these match criteria and depend on the heterogeneity of the glass sources, the repeatability between analytical measurements, and the number of elements measured. The study provides recommendations for analytical performance-based parameters for µ-XRF and LA-ICP-MS, as well as the best-performing match criteria for both analytical techniques, which forensic glass examiners can apply now.
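As an illustration of the simplest of the match criteria listed above, the sketch below implements a ±k·s confidence-interval comparison between a known and a questioned fragment across the measured elements; the element means, standard deviations, and replicate counts are placeholders, not data from the interlaboratory study:

```python
# Hypothetical sketch of a +/- k*s interval match criterion of the kind
# compared above (all values are placeholders, not data from the study).
import numpy as np

def interval_match(known_reps, questioned_reps, k=4):
    """Match if the questioned mean falls within mean +/- k*s of the known,
    element by element; rows are replicate measurements, columns elements."""
    mean_k = np.mean(known_reps, axis=0)
    sd_k = np.std(known_reps, axis=0, ddof=1)
    mean_q = np.mean(questioned_reps, axis=0)
    lower, upper = mean_k - k * sd_k, mean_k + k * sd_k
    # A match requires every element to fall inside its interval.
    return bool(np.all((mean_q >= lower) & (mean_q <= upper)))

known = np.random.default_rng(1).normal([100, 50, 10], [2, 1, 0.5], size=(3, 3))
questioned = np.random.default_rng(2).normal([101, 50, 10], [2, 1, 0.5], size=(3, 3))
print(interval_match(known, questioned, k=4))
```

Widening k (±2s up to ±6s) trades Type 1 errors (false exclusions) against Type 2 errors (false inclusions), which is exactly the trade-off the interlaboratory study quantifies.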
Abstract:
This study evaluated the early development and pilot-testing of Project IMPACT, a case management intervention for victims of stalking. The Design and Development framework (Rothman & Thomas, 1994) was used as a guide for program development and evaluation. Nine research questions examined the processes and outcomes associated with program implementation. The sample included all 36 clients who participated in Project IMPACT between February of 2000 and June of 2001, as well as the victim advocates who provided them with services. Quantitative and qualitative data were drawn from client case files, participant observation field notes, and interview transcriptions. Quantitative data were entered into three databases where: (1) clients were the units of analysis (n = 36), (2) services were the units of analysis (n = 1146), and (3) goals were the units of analysis (n = 149). These data were analyzed using descriptive statistics, Pearson's chi-square, Spearman's rho, phi, Cramer's V, Wilcoxon's matched-pairs signed-ranks test, and McNemar's test statistic. Qualitative data were reduced via open, axial, and selective coding methods. Grounded theory and case study frameworks were utilized to analyze these data. Results showed that most clients noted an improved sense of well-being and safety, although residual symptoms of trauma remained for numerous individuals. Stalkers appeared to respond to criminal and civil justice-based interventions by reducing violent and threatening behaviors; however, covert behaviors continued. The study produced findings that provided preliminary support for the use of several intervention components, including support services, psycho-education, safety planning, and boundary spanning. The psycho-education and safety planning in particular seemed to help clients cognitively reframe their perceptions of the stalking experience and gain a sense of increased safety and well-being. A 65% level of satisfactory goal achievement was observed overall, although goals involving justice-based organizations were associated with lower achievement. High service usage was related to low-income clients and those lacking social support. Numerous inconsistencies in program implementation were found to be associated with the skills and experiences of the victim advocates. Thus, recommendations were made to further refine, develop, and evaluate the intervention.
Abstract:
Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems, ensuring the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. In the first phase of this study, a proper model, including all the details of the components, was required; therefore, advances in EMC modeling were studied, classifying analytical and numerical models. The selected approach was finite element (FE) modeling coupled with the distributed network method, used to model the converter's components and obtain the frequency behavioral model of the converter. The method can reveal the behavior of parasitic elements and higher resonances, which have critical impacts on EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovation in equivalent source modeling was introduced to decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based PSO method was used for the optimization process. Superposition and suppression of the fields in coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. Identification was implemented using an artificial neural network (ANN) for seventy different fault cases, and the simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
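The abstract mentions a GA-based PSO optimizer without further detail. As a rough sketch of the particle swarm component alone (the GA hybridization, the actual objective function, and all parameters below are placeholders, not the study's values):

```python
# Minimal particle swarm optimization sketch (the PSO component only; the
# study's GA hybridization, objective, and parameters are not reproduced).
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest = x.copy()                                 # personal bests
    pbest_val = np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()           # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(objective, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, objective(g)

best, best_val = pso(lambda p: np.sum(p**2), dim=4)   # toy sphere objective
print(best, best_val)
```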
Abstract:
The objective of this study was to develop a GIS-based multi-class index overlay model to determine areas susceptible to inland flooding during extreme precipitation events in Broward County, Florida. Data layers used in the method include Airborne Laser Terrain Mapper (ALTM) elevation data, excess precipitation depth determined by performing a Soil Conservation Service (SCS) Curve Number (CN) analysis, and the slope of the terrain. The method includes a calibration procedure that uses "weights and scores" criteria obtained from Hurricane Irene (1999) records, a reported 100-year precipitation event, Doppler radar data, and documented flooding locations. Results are displayed in maps of Eastern Broward County depicting types of flooding scenarios for a 100-year, 24-hour storm based on soil saturation conditions. As expected, the results of the multi-class index overlay analysis showed that greater potential for inland flooding can be expected under higher antecedent moisture conditions. The proposed method shows potential as a predictive tool for flooding susceptibility based on a relatively simple approach.
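A multi-class index overlay of this kind typically scores each input layer into classes and combines the scores with layer weights. A minimal sketch under that assumption follows; the class scores and weights below are placeholders, not the values calibrated against the Hurricane Irene records:

```python
# Hypothetical multi-class index overlay sketch: each raster layer is scored
# by class and combined with layer weights (weights and scores below are
# placeholders, not the study's calibrated values).
import numpy as np

def index_overlay(layers, class_scores, weights):
    """Weighted sum of per-layer class scores -> susceptibility index raster."""
    total = np.zeros_like(layers[0], dtype=float)
    for layer, scores, w in zip(layers, class_scores, weights):
        total += w * scores[layer]    # map each cell's class id to its score
    return total / sum(weights)

elevation_cls = np.array([[0, 1], [2, 2]])   # e.g., 0=low, 1=mid, 2=high ground
precip_cls    = np.array([[2, 2], [1, 0]])   # excess precipitation classes
slope_cls     = np.array([[2, 1], [0, 0]])   # flatter terrain scores higher

scores = [np.array([3, 2, 1]),   # low ground -> more susceptible
          np.array([1, 2, 3]),   # more excess precipitation -> more susceptible
          np.array([3, 2, 1])]   # flat slope -> more susceptible
print(index_overlay([elevation_cls, precip_cls, slope_cls],
                    scores, weights=[0.5, 0.3, 0.2]))
```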