940 results for practical epistemology analysis
Abstract:
The use of ex-transportation battery systems (i.e. second-life EV/HEV batteries) in grid applications is an emerging field of study. A hybrid battery scheme offers a more practical approach in second-life battery energy storage systems because battery modules may come from different sources/vehicle manufacturers, depending on the second-life supply chain, and have different characteristics, e.g. voltage levels, maximum capacity and levels of degradation. Recent research studies have suggested a dc-side modular multilevel converter topology to integrate these hybrid batteries with a grid-tie inverter. Depending on the battery module characteristics, the dc-side modular converter can adopt different modes, such as boost, buck or boost-buck, to suitably transfer power from the battery to the grid. These modes have different switching techniques, control ranges and efficiencies, which give a system designer a choice of operational mode. This paper presents a detailed analysis and comparative study of all the modes of the converter, along with their switching performances, to clarify the relative advantages and disadvantages of each mode and help select the most suitable converter mode. A detailed study of all the converter modes and thorough experimental results from a multi-modular converter prototype using hybrid batteries are presented to validate the analysis.
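The buck/boost/boost-buck distinction above can be illustrated with the ideal steady-state duty-cycle relations for each mode. This is a minimal sketch under idealized assumptions (lossless converter, continuous conduction); the function names and the mode-selection heuristic are illustrative, not taken from the paper's topology.

```python
def duty_cycle(v_batt, v_dc, mode):
    """Ideal steady-state duty cycle for each dc-side converter mode.

    Assumes a lossless converter in continuous conduction; the paper's
    modular multilevel topology involves additional detail.
    """
    if mode == "buck":        # step down: v_batt > v_dc
        return v_dc / v_batt
    if mode == "boost":       # step up: v_batt < v_dc
        return 1.0 - v_batt / v_dc
    if mode == "boost-buck":  # combined: D / (1 - D) = v_dc / v_batt
        return v_dc / (v_batt + v_dc)
    raise ValueError(f"unknown mode: {mode}")

def select_mode(v_batt, v_dc):
    """Pick a mode from the relative battery/bus voltage (simplified heuristic)."""
    if v_batt > v_dc:
        return "buck"
    if v_batt < v_dc:
        return "boost"
    return "boost-buck"
```

A designer comparing modes would weigh these conversion ratios against the differing switching losses and control ranges the abstract mentions.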
Abstract:
Purpose - It is important to advance operations management (OM) knowledge while being mindful of the theoretical developments of the discipline. The purpose of this paper is to explore which theoretical perspectives have dominated the OM field. This analysis allows the authors to identify theory trends and gaps in the literature and fruitful areas for future research. A reflection on theory is also practical, given that it guides research toward important questions and enlightens OM practitioners. Design/methodology/approach - The authors provide an analysis of OM theory developments in the last 30 years. The study encompasses three decades of OM publications across three OM journals and contains an analysis of over 3,000 articles so as to identify which theories, over time, have been adopted by authors in order to understand OM topics. Findings - The authors find that the majority of studies are atheoretical, empirical, and focussed upon theory testing rather than on theory development. Some theories, such as the resource-based view and contingency theory, have an enduring relevance within OM. The authors also identify theories from psychology, economics, sociology, and organizational behavior that may, in the future, have salience to explain burgeoning OM research areas such as servitization and sustainability. Research limitations/implications - The study makes a novel contribution by exploring which main theories have been adopted or developed in OM, doing so by systematically analyzing articles from the three main journals in the field (the Journal of Operations Management, Production and Operations Management, and the International Journal of Operations and Production Management), which encompass three decades of OM publications. By focusing on these three journals, the authors may have missed important OM articles published elsewhere.
Practical implications - A reflection on theories is important because theories inform how a researcher or practicing manager interprets and solves OM problems. This study allows the authors to reflect on the collective OM journey to date, to spot trends and gaps in the literature, and to identify fruitful areas for future research. Originality/value - As far as the authors are aware, there has not been an assessment of the main theoretical perspectives in OM. The research also identifies which topics are published in OM journals, and which theories are adopted to investigate them. The authors also reflect on whether the most cited papers and those winning best paper awards are theoretical. This gives the authors a richer understanding of the current state of OM research.
Abstract:
Purpose – The purpose of this paper is to develop an integrated patient-focused analytical framework to improve quality of care in the accident and emergency (A&E) unit of a Maltese hospital. Design/methodology/approach – The study adopts a case study approach. First, a thorough literature review has been undertaken to study the various methods of healthcare quality management. Second, a healthcare quality management framework is developed using a combined quality function deployment (QFD) and logical framework approach (LFA). Third, the proposed framework is applied to a Maltese hospital to demonstrate its effectiveness. The proposed framework has six steps, commencing with identifying patients’ requirements and concluding with implementing improvement projects. All the steps have been undertaken with the involvement of the concerned stakeholders in the A&E unit of the hospital. Findings – The major problems faced by the hospital under study were overcrowding at A&E and, relatedly, a shortage of beds. The combined framework ensures better A&E services and patient flow. QFD identifies and analyses the issues and challenges of A&E, and LFA helps develop project plans for healthcare quality improvement. The important outcomes of implementing the proposed quality improvement programme are fewer hospital admissions, faster patient flow, expert triage and shorter waiting times at the A&E unit. Increased emergency consultant cover and a faster first significant medical encounter were required to start addressing the problems effectively. Overall, the combined QFD and LFA method is effective in addressing quality of care in the A&E unit. Practical implications – The proposed framework can be easily integrated within any healthcare unit, as well as within entire healthcare systems, due to its flexible and user-friendly approach. It could be part of Six Sigma and other quality initiatives. 
Originality/value – Although QFD has been extensively deployed in healthcare settings to improve quality of care, very little research has examined combining QFD and LFA in order to identify issues, prioritise them, derive improvement measures and implement improvement projects. Additionally, there is no research on QFD application in A&E. This paper bridges these gaps. Moreover, very little has been written on the Maltese healthcare system. Therefore, this study contributes a demonstration of quality of emergency care in Malta.
Abstract:
Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, a general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index for measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
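The linearised error-propagation step described above can be sketched as follows: once the transmission function is linearised to y ≈ Jx, a normally distributed input error with covariance Σ yields output variance JΣJᵀ. The Jacobian and covariance values below are illustrative, not taken from the paper.

```python
def propagate_variance(jacobian, cov_in):
    """First-order error propagation for a scalar output y ≈ J x:
    var(y) = J @ cov_in @ J^T, expanded element-wise."""
    n = len(jacobian)
    return sum(jacobian[i] * cov_in[i][j] * jacobian[j]
               for i in range(n) for j in range(n))

# Two independent reference-point errors, each with variance 0.01 mm^2,
# entering the assembly coordinate with unit sensitivity:
jacobian = [1.0, 1.0]
cov_in = [[0.01, 0.0], [0.0, 0.01]]
var_out = propagate_variance(jacobian, cov_in)  # 0.02 mm^2
```

Because the propagated variance is a quadratic form in the Jacobian, the entries with the largest sensitivities (here, the planar-surface measurement terms) dominate the final coordination accuracy.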
Abstract:
This paper explores engineering students' perceptions of developing practical competencies as experienced in their industrial placements. In addition, it discusses criticisms in the literature of Problem Based Learning, Project Based Learning and Conceive-Design-Implement-Operate in relation to the evaluation of effective learning and teaching during placements. The paper goes on to discuss a study which examines how undergraduate engineering students develop practical competencies during their industrial placements. A phenomenological research approach is adopted, using in-depth interviews and document analysis. The findings from this PhD study will contribute to knowledge, theory and practice for students, industry and higher education institutions, as students' practical competencies are developed and graduate employability rises. In conclusion, this study explores students' experiences of developing practical competencies during industrial placements, and should therefore be able to contribute to a set of evidence-based guidelines for higher education institutions and industry.
Abstract:
This paper reports the potential benefits of dynamic thermal rating prediction for primary transformers within the Western Power Distribution (WPD) managed Project FALCON (Flexible Approaches to Low Carbon Optimised Networks). Details of the thermal modelling, parameter optimisation and results validation are presented, along with the asset and environmental data (measured and day/week-ahead forecast) used for determining dynamic ampacity. A detailed analysis of ratings and benefits, and of the confidence in the ability to accurately predict dynamic ratings, is presented. Investigating the effect of a sustained ONAN rating compared to a dynamic rating shows that there is scope to increase sustained ratings under ONAN operating conditions by up to 10% between December and March with a high degree of confidence. However, under high ambient temperature conditions the dynamic rating may also be reduced in the summer months.
Abstract:
Purpose – The purpose of this paper is to examine the challenges and potential of big data in heterogeneous business networks and relate these to an implemented logistics solution. Design/methodology/approach – The paper establishes an overview of challenges and opportunities of current significance in the area of big data, specifically in the context of transparency and processes in heterogeneous enterprise networks. Within this context, the paper presents how existing components and purpose-driven research were combined for a solution implemented in a nationwide network for less-than-truckload consignments. Findings – Aside from providing an extended overview of today’s big data situation, the findings show that technical means and methods available today can comprise a feasible process transparency solution in a large heterogeneous network where legacy practices, reporting lags and incomplete data exist, yet processes are sensitive to inadequate policy changes. Practical implications – The means introduced in the paper were found to be of utility value in improving process efficiency, transparency and planning in logistics networks. The particular system design choices in the presented solution allow an incremental introduction or evolution of resource handling practices, incorporating existing fragmentary, unstructured or tacit knowledge of experienced personnel into the theoretically founded overall concept. Originality/value – The paper extends the previous high-level view on the potential of big data, and presents new applied research and development results in a logistics application.
Abstract:
Queuing is a key efficiency criterion in any service industry, including healthcare. Almost all queue management studies are dedicated to improving an existing appointment system. In developing countries such as Pakistan, there are no appointment systems for outpatients, resulting in excessive wait times. Additionally, excessive overloading, limited resources and cumbersome procedures lead to overwhelming queues. Despite numerous healthcare applications, Data Envelopment Analysis (DEA) has not been applied to queue assessment. The current study aims to extend DEA modelling and demonstrate its usefulness by evaluating the queue system of a busy public hospital in a developing country, Pakistan, where all outpatients are walk-in, along with constructing a dynamic framework dedicated to the implementation of the model. The inadequate allocation of doctors/personnel was observed to be the most critical cause of long queues. Hence, the Queuing-DEA model has been developed so that it determines the 'required' number of doctors/personnel. The results indicated that extensive wait times, long queues, or both led to high target values for doctors/personnel. This crucial information allows administrators to ensure optimal staff utilization and to control the queue pre-emptively, minimizing wait times. The dynamic framework constructed specifically targets practical implementation of the Queuing-DEA model in resource-poor public hospitals of developing countries such as Pakistan, to continuously monitor the rapidly changing queue situation and display the latest required personnel. Consequently, the wait times of subsequent patients can be minimized, along with dynamic staff scheduling in the absence of appointments. The dynamic framework has been designed in Excel, requiring minimal training and work for users, with automatic update features and the complex technical aspects running in the background.
The proposed model and the dynamic framework have the potential to be applied in similar public hospitals, even in other developing countries, where appointment systems for outpatients are non-existent.
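The paper's Queuing-DEA model is not reproduced here, but the core idea of deriving a 'required' number of doctors from queue behaviour can be illustrated with a standard Erlang-C staffing calculation. This is a deliberate simplification (an M/M/c queue rather than the DEA model), and the arrival rate, service rate and wait-probability target below are assumed values.

```python
import math

def erlang_c(servers, offered_load):
    """Probability that an arriving patient must wait (M/M/c queue)."""
    a, c = offered_load, servers
    if a >= c:
        return 1.0  # unstable: the queue grows without bound
    top = (a ** c / math.factorial(c)) * (c / (c - a))
    bottom = sum(a ** k / math.factorial(k) for k in range(c)) + top
    return top / bottom

def required_doctors(arrival_rate, service_rate, max_wait_prob=0.2):
    """Smallest number of doctors keeping the probability of waiting
    at or below max_wait_prob (illustrative staffing target)."""
    a = arrival_rate / service_rate  # offered load in erlangs
    c = max(1, math.ceil(a))
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c

# e.g. 20 walk-in patients/hour, consultations averaging 10 minutes (6/hour):
required_doctors(20, 6)
```

Because the computation is a short loop over closed-form terms, it could run inside the kind of Excel-based framework the abstract describes, recomputing staff targets as the queue situation changes.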
Abstract:
Background and objective: Safe prescribing requires accurate and practical information about drugs. Our objective was to measure the utility of current sources of prescribing guidance when used to inform practical prescribing decisions, and to compare current sources of prescribing guidance in the UK with idealized prescribing guidance. Methods: We developed 25 clinical scenarios. Two independent assessors rated and ranked the performance of five common sources of prescribing guidance in the UK when used to answer the clinical scenarios. A third adjudicator facilitated review of any disparities. An idealized list of contents for prescribing guidance was developed and sent for comments to academics and users of prescribing guidance. Following consultation, an operational check was used to assess compliance with the idealized criteria. The main outcome measures were relative utility in answering the clinical scenarios and compliance with the idealized prescribing guidance. Results: Current sources of prescribing guidance used in the UK differ in their utility, when measured using clinical scenarios. The British National Formulary (BNF) and EMIS LV were the best performing sources in terms of both ranking (mean rank 1.24 and 2.20) and rating (rated excellent or adequate: 100% and 72%). Current sources differed in the extent to which they fulfilled the criteria for ideal prescribing guidance, but the BNF, and to a lesser extent EMIS LV, closely matched the criteria. Discussion: We have demonstrated how clinical scenarios can be used to assess prescribing guidance resources. Producers of prescribing guidance documents should consider our idealized template. Prescribers require high-quality information to support their practice. Conclusion: Our test was helpful in distinguishing between prescribing resources. Producers of prescribing guidance should consider the utility of their products to end-users, particularly in those more complex areas where prescribers may need most support.
Existing UK prescribing guidance resources differ in their ability to provide assistance to prescribers. © 2010 Blackwell Publishing Ltd.
Abstract:
In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10²-10³ atoms, others were unable to repeat their work. The consensus now is that film effects in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large clusters of zinc without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. Clusters resulted from very significant changes in two source parameters: crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle used in the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of the pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM) for roughness analysis, and by X-ray diffraction.
Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm² was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between the AFM and RBS data, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy. The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface. One method increasingly used to study deposited films, both during and after growth, is ellipsometry: a surface analytical technique used to determine the optical properties and thickness of thin films. In situ measurements can be made through the windows of a deposition chamber. A method was developed for determining the optical properties of a film that is sensitive only to the growing film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates. This method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
Abstract:
The purpose of this study was to identify the state and trait anxiety and the perceived causes of anxiety in licensed practical nurses (LPNs) returning to an associate degree nursing program in order to become registered nurses (RNs). The subjects for this study were 98 students enrolled in a transitional LPN/RN associate degree nursing program in two community colleges in the state of Florida. The State-Trait Anxiety Inventory (STAI) developed by Spielberger (1983) was used as the measuring instrument for this study. In addition, a Q-sort technique was used to obtain information from the subjects regarding perceived causes of anxiety. Anxiety causes for the Q-sort cards used in the study were developed from the themes identified by a sample of LPN/RN students in a pilot study. The state and trait anxiety levels were obtained using the STAI for college students scoring key and scales. Descriptive statistics were used to determine the state and trait anxiety of the students. Correlational statistics were used to determine if relationships existed between the state and trait anxiety levels and perceived causes of anxiety identified by LPN students returning to an associate degree nursing program. The analysis of the Q-sort was performed by computing the means, standard deviations, and frequencies of each cause. The mean trait anxiety level of the students was 57.56, SD = 29.69. The mean state anxiety level of the students was 68.21, SD = 25.78. Higher percentile scores of trait anxiety were associated with higher ranks of the Q-sort category "failing out of the program", r_s = .27, p = .008. Implications for future nursing research and application of the findings to nursing education are presented.
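The correlational statistic reported above, r_s, is Spearman's rank correlation: rank both variables (averaging ranks for ties) and take the Pearson correlation of the ranks. A minimal sketch with illustrative data, not the study's:

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it works on ranks, the statistic suits ordinal data such as Q-sort rankings paired with anxiety percentile scores.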
Abstract:
Small errors can prove catastrophic. Our purpose is to remark that a very small cause which escapes our notice can determine a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. Small differences in the initial conditions produce very great ones in the final phenomena; a small error in the former will produce an enormous error in the latter. When dealing with any kind of electrical device specification, it is important to note that a pair of test conditions defines a test: the forcing function and the limit. Forcing functions define the external operating constraints placed upon the device tested. The actual test defines how well the device responds to these constraints. Forcing inputs to threshold, for example, represents the most difficult testing because it puts those inputs as close as possible to the actual switching critical points and guarantees that the device will meet the input-output specifications. Prediction becomes impossible by classical analytical analysis bounded by Newton and Euclid. We have found that nonlinear dynamic behavior is the natural state of all circuits and devices, and that opportunities exist for effective error detection in a nonlinear dynamics and chaos environment. Nowadays there is a set of linear limits established around every aspect of a digital or analog circuit, outside of which devices are considered bad after failing the test. Deterministic chaos in circuits is a fact, not a possibility, as this Ph.D. research has shown. In practice, for standard linear informational methodologies this chaotic data product is usually undesirable, and we are educated to be interested in obtaining a more regular stream of output data. This Ph.D. research explored the possibility of taking the foundation of a very well known simulation and modeling methodology and introducing nonlinear dynamics and chaos precepts to produce a new error detector instrument able to put together streams of data scattered in space and time, thereby mastering deterministic chaos and changing the bad reputation of chaotic data as a potential risk for practical system status determination.
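The sensitivity to initial conditions described above can be demonstrated with the logistic map, a canonical example of deterministic chaos. This illustrates the general principle only, not the circuit models developed in the dissertation.

```python
def logistic_orbit(x0, r=4.0, steps=40):
    """Iterate the logistic map x -> r*x*(1-x); at r = 4 it is chaotic."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two initial conditions differing by one part in a billion:
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)
divergence = [abs(x - y) for x, y in zip(a, b)]
# The tiny initial error is amplified each step until the two
# orbits are completely unrelated, despite identical dynamics.
```

This is exactly the point of the opening remark: a deterministic rule plus an imperceptible difference in the input yields outputs that look like chance.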
Abstract:
The purpose of this study was to describe the experiences of five educators participating in a teacher-initiated learning community that valued practical teacher knowledge. Connelly and Clandinin (2000) argued that practical teacher knowledge grew out of experience through interaction in the professional knowledge landscape. Collaboration that promoted teacher learning was the foundation of effective school change (Wood, 1997). This teacher-initiated learning community consisted of members who had equal status and collaborated by participating in discourse on curriculum and instruction. The collegiality of the community fostered teacher professionalism that improved practice and benefited the school. This study focused on the following research questions: (1) What was the experience of these five educators in this learning community? (2) What did these five individuals understand about the nature of practical teacher knowledge? (3) According to the participants, what was the relationship between teacher empowerment and effective school change? The participants were chosen because each voluntarily attended this teacher-initiated learning community. Each participant answered questions regarding the experience during three semi-structured tape-recorded interviews. The interviews were transcribed, and significant statements of meaning were extracted. Triangulation of ideas common to at least three of the participants ensured the trustworthiness of the analysis. These statements were combined to describe what was experienced and how the participants described their experience. The emerging themes were the characteristics of, and the relationships, methods, conditions, and environment for, the teachers. The teachers described how a base of practical teacher knowledge was gained as a spirit of camaraderie developed.
The freedom that the teachers experienced to collaborate and learn fostered new classroom practice that affected school change as student interaction and productivity increased. The qualitative analysis of this study provided a description of a learning community that valued practical teacher knowledge and fostered professional development. This description is important to educational stakeholders because it demonstrated how practical teacher knowledge was gained during the teachers' daily work. By sharing everyday experiences, the teacher talk generated collaboration and accountability that the participants felt improved practice and fostered a safe, productive learning environment for students.
Abstract:
If we classify the variables in a program into various security levels, then a secure information flow analysis aims to verify statically that information in the program can flow only in ways consistent with the specified security levels. One well-studied approach is to formulate the rules of the secure information flow analysis as a type system. A major trend of recent research focuses on how to accommodate various sophisticated modern language features. However, this approach often leads to overly complicated and restrictive type systems, making them unfit for practical use. Also, problems essential to practical use, such as type inference and error reporting, have received little attention. This dissertation identified and overcame major theoretical and practical hurdles to the application of secure information flow analysis. We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started out with a small, simple imperative language and only added features that we deemed most important for practical use. One language feature we addressed is arrays. Due to the various leaking channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach for lenient array operations, which leads to simple and lenient typing of arrays. Type inference is necessary because a user is usually only concerned with the security types for the input/output variables of a program and would like to have the types of all auxiliary variables inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system, and therefore facilitates informative error reporting generated in a cascading fashion. Algorithm B and the error reporting have been implemented and tested.
Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally based on feedback from the type checking/inference. Core computations interact with code modules from involved parties only through well-defined interfaces, and all code modules are digitally signed to ensure their authenticity and integrity.
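The core rule such type systems enforce can be sketched with a two-point security lattice: an assignment x := e is accepted only if the level of e is at or below the level of x, so information flows only upward. This is a minimal illustration of the general approach, not the dissertation's type system (which covers arrays, inference and error reporting); all names are illustrative.

```python
# Two-point security lattice: "low" below "high".
LEVELS = {"low": 0, "high": 1}

def join(a, b):
    """Least upper bound of two security levels."""
    return a if LEVELS[a] >= LEVELS[b] else b

def expr_level(variables, env):
    """Level of an expression = join of the levels of its variables."""
    level = "low"
    for var in variables:
        level = join(level, env[var])
    return level

def check_assign(target, variables, env):
    """Assignment target := e is secure only if level(e) <= level(target):
    information may flow up the lattice, never down."""
    return LEVELS[expr_level(variables, env)] <= LEVELS[env[target]]

env = {"secret": "high", "pub": "low", "out": "high"}
check_assign("out", ["secret", "pub"], env)  # secure: high flows to high
check_assign("pub", ["secret"], env)         # insecure: high would leak to low
```

A full analysis must also track implicit flows through control flow (e.g. branching on a high-security condition), which is where the complications the abstract mentions arise.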
Abstract:
This study examined how the themes of environmental sustainability are evident in the national, state and local standards that guide K-12 science curriculum. The study applied the principles of content analysis within the framework of an ecological paradigm. In education, an ecological paradigm focuses on students' use of a holistic lens to view and understand material. The intent of this study was to analyze the seventh grade science content standards at the national, state, and local textbook levels to determine how, and the extent to which, each of the five themes of environmental sustainability is presented in the language of each text. The themes are: (a) Climate Change Indicators, (b) Biodiversity, (c) Human Population Density, (d) Impact and Presence of Environmental Pollution, and (e) Earth as a Closed System. The study offers practical insight on using a method of content analysis to locate keywords of environmental sustainability in the three texts and determine whether the context of each term relates to this ecological paradigm. Using a concordance program, the researcher identified the frequency and context of each vocabulary item associated with these themes. Nine chi-square tests were run to determine if there were differences in content between the national and state standards and the textbook. Within each level, chi-square tests were also run to determine if there were differences between the appearance of content knowledge and skill words. Results indicate that there is a lack of agreement between levels that is significant at p < .01. A discussion of these results in relation to curriculum development and standardized assessments follows. The study found that at the national and state levels, there is a lack of articulation of the goals of environmental sustainability or an ecological paradigm.
With respect to the science textbook, a greater number of keywords were present; however, the context of many of these keywords did not align with the discourse of an ecological paradigm. Further, the environmental sustainability themes present in the textbook were limited to the last four chapters of the text. Additional research is recommended to determine whether this situation also exists in other settings.
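The chi-square comparisons described above can be illustrated with Pearson's statistic on a keyword-frequency contingency table. The counts below are hypothetical, not the study's data.

```python
def chi_square(observed):
    """Pearson chi-square statistic for a contingency table of counts.

    Expected counts assume independence of rows (e.g. document levels)
    and columns (e.g. sustainability themes or knowledge/skill words).
    """
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, count in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (count - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = two document levels, cols = two keyword types
table = [[10, 30], [20, 20]]
chi_square(table)
```

The statistic would then be compared against the chi-square distribution with (rows-1)(cols-1) degrees of freedom to judge significance at the study's p < .01 level.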