991 results for Productivity tools competency
Abstract:
The most powerful analytical tools used in the social sciences are well suited for studying static situations. Static and mechanistic analysis, however, is not adequate to understand the changing world in which we live. In order to adequately address the most pressing social and environmental challenges looming ahead, we need to develop analytical tools for analyzing dynamic situations, particularly institutional change. In this paper, we develop an analytical tool to study institutional change, more specifically, the evolution of rules and norms. We believe that for such an analytical tool to be useful in developing a general theory of institutional change, it needs to enable the analyst to concisely record the processes of change in multiple specific settings so that lessons from such settings can eventually be integrated into a more general predictive theory of change. Copyright © The JOIE Foundation 2010.
Abstract:
Duke Medicine used interprofessional case conferences (ICCs) from 2008-2012 with the objective of modeling and facilitating the development of teamwork skills among diverse health profession students, including physical therapy, physician assistant, medical doctor, and nursing students. The purpose of this publication was to describe the operational process used to develop and implement the ICCs and to measure their success in order to shape future work. The ICCs were offered to develop skills and attitudes essential for participation in healthcare teams. Facilitated by faculty from different professions, students conducted a comprehensive history of a standardized patient (SP), determined pertinent physical and laboratory assessments to undertake, and developed and shared a comprehensive management plan. Cases included patient problems that were authentic and relevant to each professional student in attendance. The main barriers to implementation are outlined, and the focus on the process of working together is highlighted. Evaluation showed high satisfaction rates among participants, and the outcomes from these experiences are presented. The limitations of these results are discussed and recommendations for future assessment are emphasized. The ICCs demonstrated that students will come together voluntarily to learn in teams, even at a research-focused institution, and that they perceive benefit from the collaborative exercise.
Abstract:
An enterprise information system (EIS) is an integrated data-applications platform characterized by diverse, heterogeneous, and distributed data sources. For many enterprises, a number of business processes still depend heavily on static rule-based methods and extensive human expertise. Enterprises are faced with the need for optimizing operation scheduling, improving resource utilization, discovering useful knowledge, and making data-driven decisions.
This thesis research is focused on real-time optimization and knowledge discovery that addresses workflow optimization, resource allocation, as well as data-driven predictions of process-execution times, order fulfillment, and enterprise service-level performance. In contrast to prior work on data analytics techniques for enterprise performance optimization, the emphasis here is on realizing scalable and real-time enterprise intelligence based on a combination of heterogeneous system simulation, combinatorial optimization, machine-learning algorithms, and statistical methods.
On-demand digital-print service is a representative enterprise requiring a powerful EIS. We use real-life data from Reischling Press, Inc. (RPI), a digital print service provider (PSP), to evaluate our optimization algorithms.
In order to handle the increase in volume and diversity of demands, we first present a high-performance, scalable, and real-time production scheduling algorithm for production automation based on an incremental genetic algorithm (IGA). The objective of this algorithm is to optimize the order dispatching sequence and balance resource utilization. Compared to prior work, this solution is scalable for a high volume of orders and it provides fast scheduling solutions for orders that require complex fulfillment procedures. Experimental results highlight its potential benefit in reducing production inefficiencies and enhancing the productivity of an enterprise.
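The abstract does not give implementation details of the IGA; the sketch below is only a minimal illustration of the underlying idea, a genetic algorithm that searches over order-dispatching sequences, with all order and press data invented for illustration (none of the names or numbers come from RPI's production system):

```python
import random

# Illustrative data: processing time (hours) for each order; number of identical presses.
ORDER_TIMES = [3, 7, 2, 8, 4, 6, 5, 1, 9, 2]
NUM_PRESSES = 3

def makespan(sequence):
    """Dispatch orders in the given sequence to the earliest-free press; return completion time."""
    press_free = [0.0] * NUM_PRESSES
    for order in sequence:
        i = press_free.index(min(press_free))
        press_free[i] += ORDER_TIMES[order]
    return max(press_free)

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in parent b's order."""
    lo, hi = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[lo:hi] = a[lo:hi]
    fill = [g for g in b if g not in child]
    for i in range(len(a)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(seq, rate=0.2):
    """Swap two positions with some probability."""
    seq = seq[:]
    if random.random() < rate:
        i, j = random.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]
    return seq

def evolve(generations=100, pop_size=30):
    """Evolve a population of dispatch sequences toward a low makespan."""
    pop = [random.sample(range(len(ORDER_TIMES)), len(ORDER_TIMES)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        survivors = pop[: pop_size // 2]  # elitist selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print("best dispatch sequence:", best, "makespan:", makespan(best))
```

An incremental variant in the spirit of the thesis would seed the population with the previously found schedule whenever new orders arrive, rather than restarting from random permutations.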
We next discuss analysis and prediction of different attributes involved in hierarchical components of an enterprise. We start from a study of the fundamental processes related to real-time prediction. Our process-execution time and process-status prediction models integrate statistical methods with machine-learning algorithms. In addition to improved prediction accuracy compared to stand-alone machine-learning algorithms, they also perform a probabilistic estimation of the predicted status. An order generally consists of multiple serial and parallel processes. We next introduce an order-fulfillment prediction model that combines the advantages of multiple classification models by incorporating flexible decision-integration mechanisms. Experimental results show that adopting due dates recommended by the model can significantly reduce the enterprise's late-delivery ratio. Finally, we investigate service-level attributes that reflect the overall performance of an enterprise. We analyze and decompose time-series data into different components according to their hierarchical periodic nature, perform correlation analysis,
and develop univariate prediction models for each component as well as multivariate models for correlated components. Predictions for the original time series are aggregated from the predictions of its components. In addition to a significant increase in mid-term prediction accuracy, this distributed modeling strategy also improves short-term time-series prediction accuracy.
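As a rough illustration of the decompose-predict-aggregate strategy described above, the following sketch splits a daily series into trend and weekly seasonal components, forecasts each separately, and sums the component forecasts. The data are synthetic and the method deliberately simple; it is not the thesis's actual model or dataset:

```python
import numpy as np

# Illustrative daily order-volume series with a weekly cycle plus an upward trend.
rng = np.random.default_rng(0)
days = np.arange(120)
series = 100 + 0.5 * days + 15 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 3, 120)

PERIOD = 7  # weekly periodicity

# 1. Decompose: moving average as the trend, mean deviation per weekday as seasonality.
trend = np.convolve(series, np.ones(PERIOD) / PERIOD, mode="same")
seasonal = np.array([np.mean((series - trend)[i::PERIOD]) for i in range(PERIOD)])

# 2. Predict each component separately: linear extrapolation for the trend,
#    periodic repetition for the seasonal component.
horizon = 14
slope, intercept = np.polyfit(days, trend, 1)
future_days = np.arange(len(days), len(days) + horizon)
trend_forecast = intercept + slope * future_days
seasonal_forecast = seasonal[future_days % PERIOD]

# 3. Aggregate the component forecasts into the forecast of the original series.
forecast = trend_forecast + seasonal_forecast
print(forecast.round(1))
```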
In summary, this thesis research has led to a set of characterization, optimization, and prediction tools for an EIS to derive insightful knowledge from data and use it as guidance for production management. These tools are expected to help enterprises increase reconfigurability, automate more procedures, and obtain data-driven recommendations for effective decisions.
Abstract:
The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated which promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children in order to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure responses to general ASD risk assessment tasks and activities outlined by the AOSI which assess visual attention by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, which demonstrate that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments.
Abstract:
The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests behavioral markers can be observed late in the first year of life. Many of these studies involved extensive frame-by-frame video observation and analysis of a child's natural behavior. Although non-intrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are impractical for clinical and large population research purposes. Diagnostic measures for ASD are available for infants but are only accurate when used by specialists experienced in early diagnosis. This work is a first milestone in a long-term multidisciplinary project that aims at helping clinicians and general practitioners accomplish this early detection/measurement task automatically. We focus on providing computer vision tools to measure and identify ASD behavioral markers based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure three critical AOSI activities that assess visual attention. We augment these AOSI activities with an additional test that analyzes asymmetrical patterns in unsupported gait. The first set of algorithms involves assessing head motion by tracking facial features, while the gait analysis relies on joint foreground segmentation and 2D body pose estimation in video. We show results that provide insightful knowledge to augment the clinician's behavioral observations obtained from real in-clinic assessments.
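The abstract does not specify which tracking algorithms were used; purely as an illustration of measuring head motion by tracking facial features, the sketch below strings together standard OpenCV building blocks (Haar-cascade face detection plus Lucas-Kanade optical flow) on a hypothetical session video. It is a plausible stand-in, not the authors' pipeline:

```python
import cv2
import numpy as np

# Hypothetical input video of an assessment session; file name is illustrative.
cap = cv2.VideoCapture("session.mp4")
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

ok, frame = cap.read()
gray_prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect the face once (assumes a face is visible in the first frame)
# and seed trackable feature points inside it.
x, y, w, h = face_cascade.detectMultiScale(gray_prev, 1.3, 5)[0]
mask = np.zeros_like(gray_prev)
mask[y:y + h, x:x + w] = 255
points = cv2.goodFeaturesToTrack(gray_prev, maxCorners=50, qualityLevel=0.01,
                                 minDistance=5, mask=mask)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Lucas-Kanade optical flow tracks the facial feature points frame to frame.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(gray_prev, gray, points, None)
    good_new, good_old = new_points[status == 1], points[status == 1]
    if len(good_new) == 0:
        break
    dx = float(np.mean(good_new[:, 0] - good_old[:, 0]))  # mean horizontal shift
    # A sustained horizontal shift is a crude proxy for a head turn toward a stimulus.
    if abs(dx) > 2.0:
        print("head turn", "right" if dx > 0 else "left")
    gray_prev, points = gray, good_new.reshape(-1, 1, 2)

cap.release()
```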
Abstract:
To maintain a strict balance between demand and supply in US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines startup and shutdown times, the amount of power produced, and the provision of spinning and non-spinning power generation reserves. Such a deterministic optimization model takes as input the characteristics of all the generating units, such as their installed generation capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as the forecast of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using a fixed reserve target as an input, stochastic market clearing models take different scenarios of wind power into consideration and determine reserve schedules as output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain because of the multi-dimensional performance metrics considered here and the difficulty of setting the models' parameters in a way that does not advantage or disadvantage one modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison of the performance of stochastic versus deterministic market clearing models.
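The paper's models are full, PJM-scale unit commitment formulations; the toy single-period sketch below only illustrates the distinction it draws between deterministic clearing with a fixed reserve target and stochastic clearing over wind scenarios, with every number invented for illustration:

```python
# Toy single-period comparison (not the PJM-scale model of the paper):
# deterministic scheduling from a point wind forecast plus a fixed reserve target,
# versus a stochastic schedule chosen to minimize expected cost over wind scenarios.
DEMAND = 100.0                      # MW
THERMAL_CAP = 120.0                 # MW of dispatchable thermal capacity
THERMAL_COST = 30.0                 # $/MWh marginal cost
VOLL = 5000.0                       # $/MWh value of lost load
WIND_SCENARIOS = [(40.0, 0.3), (25.0, 0.5), (5.0, 0.2)]   # (MW, probability)

def expected_cost(thermal_schedule):
    """Expected operating cost of a committed thermal level across wind scenarios."""
    cost = thermal_schedule * THERMAL_COST
    for wind, prob in WIND_SCENARIOS:
        shortfall = max(0.0, DEMAND - wind - thermal_schedule)
        cost += prob * shortfall * VOLL       # unserved energy priced at VOLL
    return cost

# Deterministic clearing: point forecast (expected wind) plus a fixed reserve margin.
forecast_wind = sum(w * p for w, p in WIND_SCENARIOS)
RESERVE = 10.0
det_schedule = min(THERMAL_CAP, DEMAND - forecast_wind + RESERVE)

# Stochastic clearing: pick the thermal level that minimizes expected cost directly.
candidates = [x * 0.5 for x in range(int(THERMAL_CAP * 2) + 1)]
sto_schedule = min(candidates, key=expected_cost)

print(f"deterministic schedule {det_schedule:.1f} MW, expected cost {expected_cost(det_schedule):.0f} $")
print(f"stochastic schedule    {sto_schedule:.1f} MW, expected cost {expected_cost(sto_schedule):.0f} $")
```

In this toy, the stochastic schedule hedges against the low-wind scenario directly, while the deterministic schedule does so only through the fixed reserve margin; how to set that margin fairly is exactly the comparison difficulty the paper discusses.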
Abstract:
BACKGROUND: In academia, the scholarship of research may include, but is not limited to, peer-reviewed publications, presentations, and grant submissions. Programmatic research productivity is one of many measures of academic program reputation and ranking. Another measure of learning success among physical therapist education programs in the USA is a 100% three-year pass rate of graduates on the standardized National Physical Therapy Examination (NPTE). In this study, we endeavored to determine whether there was an association between research productivity, measured through scholarly artifacts, and 100% three-year pass rates on the NPTE. METHODS: This observational study used pre-approved database exploration representing all accredited programs in the USA that graduated physical therapists during 2009, 2010, and 2011. Descriptive variables captured included raw research productivity artifacts such as peer-reviewed publications and books, number of professional presentations, number of scholarly submissions, total grant dollars, and number of grants submitted. Descriptive statistics and comparisons (using chi-square and t-tests) among program characteristics and research artifacts were calculated. Univariate logistic regression analyses, with appropriate control variables, were used to determine associations between research artifacts and 100% pass rates. RESULTS: The number of scholarly artifacts submitted, faculty with grants, and grant proposals submitted were significantly higher in programs with 100% three-year pass rates. However, after controlling for program characteristics such as grade point average, diversity percentage of the cohort, public/private institution, and number of faculty, there were no significant associations between scholarly artifacts and 100% three-year pass rates. CONCLUSIONS: Factors other than research artifacts are likely better predictors of passing the NPTE.
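For readers unfamiliar with the analysis, a hedged sketch of a logistic regression of pass status on scholarly artifacts with program-level controls is shown below. The data are synthetic and the variable names are illustrative; this is not the study's dataset or exact model specification:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic program-level data for illustration only; no real CAPTE/NPTE records.
rng = np.random.default_rng(1)
n = 150
artifacts   = rng.poisson(20, n)          # scholarly artifacts per program
cohort_gpa  = rng.normal(3.5, 0.15, n)    # admitted-cohort GPA (control)
pct_diverse = rng.uniform(5, 40, n)       # cohort diversity percentage (control)
public      = rng.integers(0, 2, n)       # public vs private institution (control)
n_faculty   = rng.poisson(12, n)          # core faculty count (control)

# Outcome driven mostly by the controls, mirroring the paper's conclusion.
logit = -20 + 6 * cohort_gpa + 0.05 * n_faculty + 0.005 * artifacts
pass_100 = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Logistic regression of 100% pass status on artifacts plus the control variables.
X = sm.add_constant(np.column_stack([artifacts, cohort_gpa, pct_diverse, public, n_faculty]))
fit = sm.Logit(pass_100, X).fit(disp=False)
print(fit.params)   # order: const, artifacts, gpa, diversity, public, faculty
```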
Abstract:
The hepatitis delta virus (HDV) ribozyme is a self-cleaving RNA enzyme essential for processing viral transcripts during rolling circle viral replication. The first crystal structure of the cleaved ribozyme was solved in 1998, followed by structures of uncleaved, mutant-inhibited and ion-complexed forms. Recently, methods have been developed that make the task of modeling RNA structure and dynamics significantly easier and more reliable. We have used ERRASER and PHENIX to rebuild and re-refine the cleaved and cis-acting C75U-inhibited structures of the HDV ribozyme. The results correct local conformations and identify alternates for RNA residues, many in functionally important regions, leading to improved R values and model validation statistics for both structures. We compare the rebuilt structures to a higher resolution, trans-acting deoxy-inhibited structure of the ribozyme, and conclude that although both inhibited structures are consistent with the currently accepted hammerhead-like mechanism of cleavage, they do not add direct structural evidence to the biochemical and modeling data. However, the rebuilt structures (PDBs: 4PR6, 4PRF) provide a more robust starting point for research on the dynamics and catalytic mechanism of the HDV ribozyme and demonstrate the power of new techniques to make significant improvements in RNA structures that impact biologically relevant conclusions.
Abstract:
Team NAVIGATE aims to create a robust, portable navigational aid for the blind. Our prototype uses depth data from the Microsoft Kinect to perform real-time obstacle avoidance in unfamiliar indoor environments. The device augments the white cane by performing two significant functions: detecting overhanging objects and identifying stairs. Based on interviews with blind individuals, we found a combined audio and haptic feedback system best for communicating environmental information. Our prototype uses vibration motors to indicate the presence of an obstacle and an auditory command to alert the user to stairs ahead. Through multiple trials with sighted and blind participants, the device was successful in detecting overhanging objects and approaching stairs. The device increased user competency and adaptability across all trials.
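The abstract does not describe the detection algorithm itself; the following minimal sketch shows one plausible way to flag overhanging obstacles from a single Kinect-style depth frame. The data are synthetic and the thresholds illustrative; device I/O, calibration, and the stair-detection logic are omitted:

```python
import numpy as np

FRAME_H, FRAME_W = 480, 640
ALERT_DISTANCE_MM = 1200             # warn when something is closer than 1.2 m
HEAD_BAND = slice(0, FRAME_H // 3)   # top third of the frame ~ head/torso height

def overhang_alert(depth_mm: np.ndarray) -> bool:
    """Return True if enough pixels in the head-height band are closer than the threshold."""
    band = depth_mm[HEAD_BAND, :]
    valid = band[band > 0]                       # Kinect reports 0 for unknown depth
    near = np.count_nonzero(valid < ALERT_DISTANCE_MM)
    return near > 0.02 * valid.size              # >2% of valid pixels are near

# Synthetic frame: empty corridor at ~3 m with a hanging sign at ~0.9 m in the upper region.
frame = np.full((FRAME_H, FRAME_W), 3000, dtype=np.uint16)
frame[40:120, 260:380] = 900
print("overhang detected:", overhang_alert(frame))
```

A real device would run this check per frame and drive the vibration motors rather than printing, but the depth-band thresholding idea is the same.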
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partitioning, execution control mask generation, and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
Abstract:
User-supplied knowledge and interaction is a vital component of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the necessary components that such a parallelisation toolkit should possess to provide an effective environment to identify, extract, and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular, we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction, and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information so that the user is focused on only what is needed. User control over the level and extent of information revealed at any phase is supplied using a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated and rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described, and their effectiveness is discussed.
Abstract:
The consecutive, partly overlapping emergence of expert systems and then neural computation methods among intelligent technologies is reflected in the evolving scene of their application to nuclear engineering. This paper provides a bird's-eye view of the state of such applications in the domain, along with a review of a particular task, perhaps the most economically important one: refueling design in nuclear power reactors.
Abstract:
The Production Workstation developed at the University of Greenwich is evaluated as a tool for assisting all those concerned with production. It enables the producer, director, and cinematographer to explore the quality of the images obtainable when using a plethora of tools. Users are free to explore many possible choices, ranging from 35mm to DV, and to combine them with the many image manipulation tools of the cinematographer. The validation required for the system, concerning the accuracy of the resulting imagery, is explicitly examined. Copyright © 1999 by the Society of Motion Picture and Television Engineers, Inc.