942 results for practical application
Abstract:
Well-prepared, adaptive and sustainably developing specialists are an important competitive advantage, but also one of the main challenges for businesses. One way the education system can create and develop staff adequate to these needs is through projects with topics drawn from the real economy ("Practical Projects"). Objective assessment is an essential driver and motivator, and rests on a system of well-chosen, well-defined and specific criteria and indicators. One approach to a more objective evaluation of practical projects is to derive more objective weights for the criteria. A natural and reasonable approach is to accumulate the opinions of proven experts and then extract the weights from the accumulated data. The preparation and conduct of a survey among recognized experts in the field of project-based learning in mathematics, informatics and information technologies is described. Processing the accumulated data with the Analytic Hierarchy Process (AHP) allowed us to determine the weights of the evaluation criteria objectively and hence to achieve the desired objectivity. ACM Computing Classification System (1998): K.3.2.
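The core AHP step described above, turning expert pairwise comparisons into criterion weights, can be sketched as follows. This is a minimal illustration using the geometric-mean (logarithmic least squares) approximation of the principal eigenvector; the pairwise matrix is invented for illustration, not taken from the survey.

```python
import math

def ahp_weights(matrix):
    """Normalized criterion weights from a reciprocal pairwise comparison matrix.

    The geometric mean of each row approximates the principal eigenvector
    of the matrix, which AHP uses as the weight vector.
    """
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical expert judgment: criterion A is 3x as important as B
# and 5x as important as C; B is 2x as important as C.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
```

In a survey setting, one such matrix per expert would be aggregated (e.g. by element-wise geometric mean) before extracting the weights.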
Abstract:
Queuing is a key efficiency criterion in any service industry, including healthcare. Almost all queue-management studies are dedicated to improving an existing appointment system. In developing countries such as Pakistan, there are no appointment systems for outpatients, resulting in excessive wait times. In addition, heavy overloading, limited resources and cumbersome procedures lead to overwhelming queues. Despite numerous healthcare applications, Data Envelopment Analysis (DEA) has not been applied to queue assessment. The current study aims to extend DEA modelling and demonstrate its usefulness by evaluating the queue system of a busy public hospital in a developing country, Pakistan, where all outpatients are walk-in, and by constructing a dynamic framework dedicated to the implementation of the model. Inadequate allocation of doctors/personnel was observed to be the most critical cause of long queues. Hence, the Queuing-DEA model has been developed so that it determines the 'required' number of doctors/personnel. The results indicated that extensive wait times, long queues, or both led to high target values for doctors/personnel. This crucial information allows administrators to ensure optimal staff utilization and to control the queue pre-emptively, minimizing wait times. The dynamic framework specifically targets practical implementation of the Queuing-DEA model in resource-poor public hospitals of developing countries such as Pakistan, so as to continuously monitor a rapidly changing queue situation and display the latest required personnel. Consequently, the wait times of subsequent patients can be minimized, along with dynamic staff scheduling in the absence of appointments. The framework has been designed in Excel, requiring minimal training and work for users, with automatic update features and the complex technical aspects running in the background.
The proposed model and the dynamic framework have the potential to be applied in similar public hospitals, even in other developing countries, where appointment systems for outpatients are non-existent.
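The envelopment model at the core of any such DEA assessment can be written in its generic input-oriented CCR form (this is the standard textbook formulation, not necessarily the exact Queuing-DEA variant developed in the study):

```latex
\min_{\theta,\,\lambda}\ \theta
\quad \text{s.t.} \quad
\sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io} \quad \forall i,
\qquad
\sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro} \quad \forall r,
\qquad
\lambda_j \ge 0,
```

where $x_{ij}$ and $y_{rj}$ are the inputs and outputs of decision-making unit $j$, and $\theta \le 1$ is the efficiency score of the unit $o$ under evaluation. In a queue setting, inputs could plausibly be staff counts and outputs the patients served or wait-time targets.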
Abstract:
This study fits within the framework of a recently completed TÁMOP research project. The research aims to provide a comprehensive view of innovation, to clarify innovation-related concepts, to present innovation trends and, based on an empirical survey, to give an account of how these appear under the specific conditions of Hungary. Clarifying the relationship between learning and innovation is part of elucidating the concept of innovation. Although countless works on the topic appear in the management literature, the author, in accordance with his own disciplinary commitment, approaches the problem from an economic point of view. Finally, he attempts to place Hungary on the international map of learning.
Abstract:
Audit risk is the risk that the auditor expresses an inappropriate audit opinion when the financial statements are materially misstated. This risk also appears indirectly in the operations of credit institutions and financial enterprises when the material misstatement is in the financed entity's audited statements, which serve as the basis for lending decisions, or when the decision to continue financing is made on the basis of credit covenants calculated from the misstated information. The risks of the audit process reflect the business risks of the auditee, so assessing those risks, and planning and performing the audit on that basis, is of key importance. The current study, connecting to No. 4 (2011) of Hitelintézeti Szemle, likewise discusses the topic of risk and uncertainty, or more precisely a practical aspect of it: the application of belief functions in external audit, without aiming for completeness or a textbook-like build-up of the theory. The formalism is virtually unknown in Hungary, while on the international scene empirical studies have pointed out possible advantages of the method over quantitative risk assessments based on traditional probability theory. Accordingly, belief functions represent auditors' perception of risk better than probabilities, because, in contrast to the traditional model, they deal with three states rather than two: the existence of supportive evidence, the existence of negative evidence, and the absence of evidence.
Abstract:
Environmental impacts usually extend beyond the boundaries of a single company, which is why purchasing decisions also play an important role in enforcing environmental considerations in a supply-chain context. Many examples could be cited in which an alternative is environmentally advantageous by one criterion yet burdens the environment when the supply chain is viewed as a whole. Measuring environmental impacts at the supply-chain level, however, poses serious challenges, and this has inspired substantial research and development. One area with significant results is the incorporation of environmental criteria into supplier evaluation. Joining this stream of research, the authors investigate how to determine, in one of the most widely used supplier evaluation methods, the weighted scoring model, the weight at which a given criterion becomes a decision-influencing factor. To this end they apply the Composite Indicators (CI) method of Data Envelopment Analysis (DEA), and use linear programming theory to establish the common weight of the criteria.
Abstract:
The purpose of this study was to identify the state and trait anxiety and the perceived causes of anxiety in licensed practical nurses (LPNs) returning to an associate degree nursing program in order to become registered nurses (RNs). The subjects were 98 students enrolled in a transitional LPN/RN associate degree nursing program at two community colleges in the state of Florida. The State-Trait Anxiety Inventory (STAI) developed by Spielberger (1983) was used as the measuring instrument. In addition, a Q-sort technique was used to obtain information from the subjects regarding perceived causes of anxiety; the anxiety causes on the Q-sort cards were developed from themes identified by a sample of LPN/RN students in a pilot study. The state and trait anxiety levels were obtained using the STAI scoring key and scales for college students. Descriptive statistics were used to determine the state and trait anxiety of the students, and correlational statistics were used to determine whether relationships existed between the state and trait anxiety levels and the perceived causes of anxiety identified by the students. The Q-sort was analyzed by computing the mean, standard deviation, and frequency of each cause. The mean trait anxiety level of the students was 57.56 ($SD = 29.69$); the mean state anxiety level was 68.21 ($SD = 25.78$). Higher percentile scores of trait anxiety were associated with higher ranks of the Q-sort category "failing out of the program" ($r_s = .27$, $p = .008$). Implications for future nursing research and application of the findings to nursing education are presented.
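The reported $r_s$ is a Spearman rank correlation, which is a Pearson correlation computed on ranks. A minimal sketch of that computation, on invented illustrative scores rather than the study's data:

```python
def ranks(values):
    """Average 1-based ranks, with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's r_s: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

trait = [55, 72, 60, 81, 45, 66]   # hypothetical trait-anxiety percentiles
fail_rank = [3, 5, 2, 6, 1, 4]     # hypothetical Q-sort ranks for "failing out"
rho = spearman(trait, fail_rank)
```

A positive rho, as in the study, indicates that students with higher trait anxiety tended to rank "failing out of the program" as a more salient cause.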
Abstract:
If we classify the variables in a program into security levels, then a secure information flow analysis aims to verify statically that information can flow only in ways consistent with those levels. One well-studied approach is to formulate the rules of the analysis as a type system. A major trend in recent research focuses on accommodating sophisticated modern language features; however, this often leads to overly complicated and restrictive type systems, making them unfit for practical use. Problems essential to practical use, such as type inference and error reporting, have also received little attention. This dissertation identified and solved major theoretical and practical hurdles to the application of secure information flow.

We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started with a small imperative language and added only the features we deemed most important for practical use. One language feature we addressed is arrays. Because of the various leakage channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach to lenient array operations, which leads to simple and lenient typing of arrays.

Type inference is necessary because a user is usually concerned only with the security types of the input/output variables of a program and would like the types of all auxiliary variables to be inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system, and therefore facilitates informative error reporting generated in a cascading fashion. Algorithm B and the error reporting have been implemented and tested.

Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally, based on feedback from type checking/inference. Core computations interact with code modules from the involved parties only through well-defined interfaces, and all code modules are digitally signed to ensure their authenticity and integrity.
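The core idea behind such type systems can be sketched in a toy form (this is the standard two-level lattice treatment, not the dissertation's actual system): an expression's level is the join of its variables' levels, and an assignment x := e is accepted only if that join, combined with the level of the enclosing guards, flows into the level of x.

```python
LOW, HIGH = 0, 1  # two-point security lattice: LOW flows into HIGH

def expr_level(expr_vars, env):
    """Level of an expression: join (max) of its variables' levels."""
    return max((env[v] for v in expr_vars), default=LOW)

def check_assign(target, expr_vars, env, pc=LOW):
    """Accept x := e only if join(level(e), pc) <= level(x).

    pc is the join of the levels of enclosing guards; requiring it to
    flow into the target blocks implicit flows through control structure.
    """
    return max(expr_level(expr_vars, env), pc) <= env[target]

env = {"secret": HIGH, "out": LOW, "tmp": HIGH}
ok1 = check_assign("tmp", ["secret"], env)    # H -> H: accepted
ok2 = check_assign("out", ["secret"], env)    # H -> L: explicit leak, rejected
ok3 = check_assign("out", [], env, pc=HIGH)   # under a H guard: implicit leak, rejected
```

The third case is why the pc parameter exists: assigning even a constant to a low variable inside a high-guarded branch would reveal the guard.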
Abstract:
Effective interaction with personal computers is a basic requirement for many of the functions we perform in our daily lives. With the rapid emergence of the Internet and the World Wide Web, computers have become one of the premier means of communication in our society. Unfortunately, these advances have not become equally accessible to physically handicapped individuals. In reality, a significant number of individuals with severe motor disabilities, due to causes such as Spinal Cord Injury (SCI) or Amyotrophic Lateral Sclerosis (ALS), may not be able to use the computer mouse as a vital input device for computer interaction. The purpose of this research was to further develop and improve an existing alternative input device for computer cursor control for individuals with severe motor disabilities. This thesis describes the development and the underlying principle of a practical hands-off human-computer interface based on Electromyogram (EMG) signals and Eye Gaze Tracking (EGT) technology, compatible with the Microsoft Windows operating system (OS). Results of the software developed in this thesis show a significant improvement in the performance and usability of the EMG/EGT cursor-control HCI.
Abstract:
Background: An evaluation was completed of the one-day Meditech Magic training program for Registered Nurses (RNs) and Licensed Practical Nurses (LPNs) developed for the Long Term Care (LTC) Program. Methods: A literature review and consultation with stakeholders were completed to determine possible evaluation methods, expected outcomes, and ways to measure the effectiveness of the education program. A pretest/posttest design and a questionnaire were chosen as the evaluation tools. Results: No significant difference was found between the pretest and posttest total scores, indicating that learners retained information from the orientation session (Z = -1.820, p = 0.069). Additional Wilcoxon matched-pairs signed-rank tests performed on the individual sections of the tests revealed a significant decrease in the posttest scores for entering a Diagnostic Imaging requisition (Z = -1.975, p = 0.048); no other significant findings were present. Analysis of the questionnaires revealed that most participants were pleased with the Meditech documentation education they received and did not indicate barriers that would affect electronic documentation. Conclusions: Further testing is required to ensure the reliability and validity of the evaluation tools, and caution is needed due to the small sample size. However, problematic documentation tasks were identified during the evaluation, and as a result both the training session and the support materials will be improved.
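The Wilcoxon matched-pairs signed-rank test used above is built on a simple statistic: rank the absolute pre/post differences, then sum the ranks of the positive and negative differences separately. A minimal sketch on invented scores (the study's Z and p values would come from the normal approximation or exact tables of this statistic):

```python
def wilcoxon_w(pre, post):
    """W = min(positive rank sum, negative rank sum) over nonzero paired differences."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # Tied absolute differences share the average rank.
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_pos = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_neg = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_pos, w_neg)

pre = [12, 15, 9, 14, 10, 13]    # hypothetical pretest scores
post = [14, 15, 11, 13, 12, 16]  # hypothetical posttest scores
w = wilcoxon_w(pre, post)
```

A small W relative to the total rank sum indicates that the differences lean strongly in one direction, which is what drives a significant result.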
Abstract:
This dissertation presents a calibration procedure for a pressure-velocity probe. It is divided into four main chapters. The first chapter has six main sections. In the first two, the wave equation in fluids and the velocity of sound in gases are derived; the third section contains a general solution of the wave equation for plane acoustic waves. Sections four and five give the definitions of acoustic impedance and admittance, and the practical units in which sound level is measured, i.e. the decibel scale. The last section of the chapter covers the theory behind the frequency analysis of a sound wave, including the analysis of sound in bands and the discrete Fourier analysis, with the definition of some important functions. The second chapter describes the reference-field calibration procedures used to calibrate P-V probes, among them the progressive plane wave method, which is the one used in this work. The last section of that chapter describes the working principles of the two transducers used, with a focus on the velocity transducer. The third chapter is devoted to the calibration set-up and the instruments used for data acquisition and analysis. Since software routines were extremely important, this chapter includes a dedicated section in which the proprietary routines most used are thoroughly explained. Finally, the work carried out is described in three distinct phases, and the data acquired and results obtained are presented; all graphs and data reported were obtained through Matlab® routines. The last chapter briefly summarizes the work done, together with an excursus on a new probe and on how the procedure implemented in this dissertation could be applied to a general field.
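The discrete Fourier analysis mentioned above reduces, at its core, to projecting a sampled signal onto sinusoids and locating spectral peaks. A minimal illustration with a plain O(N^2) DFT on a synthetic 50 Hz tone (practical routines, like the Matlab ones in the thesis, would use an FFT):

```python
import math

def dft_magnitudes(x):
    """Magnitudes of the DFT of a real signal, half-spectrum only (bins 0..N/2)."""
    n = len(x)
    mags = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

fs = 1000                                  # sampling rate, Hz (illustrative)
n = 200                                    # number of samples
tone_hz = 50                               # synthetic test tone
signal = [math.sin(2 * math.pi * tone_hz * t / fs) for t in range(n)]

mags = dft_magnitudes(signal)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
peak_hz = peak_bin * fs / n                # bin spacing = fs / N
```

The tone completes an integer number of cycles in the window here, so its energy lands in a single bin; in calibration practice, windowing and band analysis handle the general case.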
Abstract:
The aim of this thesis is to merge two emerging paradigms in web programming: RESTful web development and Service-Oriented Programming. REST is the main architectural paradigm for web applications, characterised by a procedural structure that avoids handshaking mechanisms. Even though REST offers a standard structure for accessing the resources of web applications, the backend side is usually not very modular, if not outright complicated. Service-Oriented Programming, by contrast, has the modularisation of components as one of its fundamental principles: service-oriented applications are characterised by separate modules that simplify the development of web applications. There are very few examples of integration between these two technologies, so it seems reasonable to merge them. In this thesis the methodologies studied to reach this result are explored through an application, called MergeFly, that helps several users handle documents and notes. Once all its specifications had been set, the MergeFly case was used to develop and handle HTTP requests through SOAP. This document first defines 1) the characteristics of the application and 2) the SOAP technology, partially introduces 3) the Jolie language and 4) REST, and finally 5) offers a Jolie-REST implementation through the MergeFly case. A token mechanism is implemented for authentication: session- and cookie-based authentication was discarded first, since it does not fit pure REST theory, even though it is often used. The final part evaluates the functionality and effectiveness of the results, judging the Jolie-REST pairing.
Abstract:
Recent technological developments in the field of experimental quantum annealing have made prototypical annealing optimizers with hundreds of qubits commercially available. The experimental demonstration of a quantum speedup for optimization problems has since become a coveted, albeit elusive, goal. Recent studies have shown that the so far inconclusive results regarding a quantum enhancement may have been partly due to the benchmark problems used being unsuitable. In particular, these problems had an inherently too simple structure, allowing both traditional resources and quantum annealers to solve them with no special effort. The need has therefore arisen for harder benchmarks that would hopefully possess the discriminative power to separate classical scaling of performance with size from quantum scaling. We introduce here a practical technique for engineering extremely hard spin-glass Ising-type problem instances that does not require "cherry picking" from large ensembles of randomly generated instances. We accomplish this by treating the generation of hard optimization problems itself as an optimization problem, for which we offer a heuristic algorithm. We demonstrate the genuine thermal hardness of our generated instances by examining them thermodynamically and analyzing their energy landscapes, as well as by testing the performance of various state-of-the-art algorithms on them. We argue that a proper characterization of the generated instances offers a practical, efficient way to properly benchmark experimental quantum annealers, as well as any other optimization algorithm.
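The inner loop that any such hardness engineering relies on is the evaluation of an Ising energy and the behaviour of simple local search on it. A minimal sketch of that evaluator (only the evaluator and a greedy single-spin-flip descent; the paper's heuristic generator, which tunes couplings so that descents like this stall far above the ground state, is not reproduced here):

```python
import random

def ising_energy(spins, couplings):
    """E = -sum_{i<j} J_ij * s_i * s_j for couplings given as {(i, j): J_ij}."""
    return -sum(j * spins[a] * spins[b] for (a, b), j in couplings.items())

def greedy_descent(spins, couplings):
    """Flip single spins while any flip strictly lowers the energy."""
    spins = list(spins)
    e = ising_energy(spins, couplings)
    improved = True
    while improved:
        improved = False
        for i in range(len(spins)):
            spins[i] = -spins[i]               # trial flip
            e_new = ising_energy(spins, couplings)
            if e_new < e:
                e, improved = e_new, True
            else:
                spins[i] = -spins[i]           # revert
    return spins, e

random.seed(0)
n = 8
couplings = {(i, j): random.choice([-1, 1])
             for i in range(n) for j in range(i + 1, n)}
start = [random.choice([-1, 1]) for _ in range(n)]
final, e_final = greedy_descent(start, couplings)
```

Hard instances are precisely those on which descents of this kind terminate in poor local minima across many restarts.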
Abstract:
Purpose: This paper extends the use of Radio Frequency Identification (RFID) data to the accounting of warehouse costs and services. The Time Driven Activity Based Costing (TDABC) methodology is enhanced with real-time RFID data on the duration of warehouse activities, allowing warehouse managers to obtain accurate and instant cost calculations. The RFID-enhanced TDABC (RFID-TDABC) is proposed as a novel application of RFID technology. Research Approach: RFID-TDABC is implemented on the warehouse processes of a case study company, covering receiving, put-away, order picking, and despatching. Findings and Originality: RFID technology is commonly used for the identification and tracking of items. The use of RFID-generated information with TDABC can be successfully extended to the area of costing. This RFID-TDABC costing model will benefit warehouse managers with accurate and instant cost calculations. Research Impact: There are still unexplored benefits to RFID technology in its applications in warehousing and the wider supply chain. A multi-disciplinary research approach led to combining RFID technology with the TDABC accounting method in order to propose RFID-TDABC. Combining methods and theories from different fields with RFID may lead researchers to develop further new techniques such as the RFID-TDABC presented in this paper. Practical Impact: The RFID-TDABC concept will be of value to practitioners by showing how warehouse costs can be measured accurately using this approach. A better understanding of incurred costs may result in further optimisation of warehousing operations, lowering the costs of activities, and thus providing competitive pricing to customers. RFID-TDABC can be applied across the wider supply chain.
Abstract:
The Oscillating Water Column (OWC) is a promising type of wave energy device due to an obvious advantage over many other wave energy converters: it has no moving components in sea water. Two types of OWC (bottom-fixed and floating) have been widely investigated, and bottom-fixed OWCs have been very successful in several practical applications. Recently, proposals for massive wave energy production and the availability of wave energy have pushed OWC applications from near-shore to deeper-water regions, where floating OWCs are the better choice. For an OWC under sea waves, the air flow driving the air turbine that generates electricity is a random process, and in such a working condition a single design/operation point does not exist. To improve energy extraction and optimise the performance of the device, a system capable of controlling the air turbine rotation speed is desirable. To achieve that, this paper presents short-term prediction of the random process by an artificial neural network (ANN), which can provide near-future information to the control system. In this research, the ANN is explored and tuned for better prediction of the airflow (as well as of the device motions, for wider application). It is found that, by carefully constructing the ANN platform and optimizing the relevant parameters, the ANN is capable of predicting the random process a few steps ahead of real time with good accuracy. More importantly, the tuned ANN works for a large range of different types of random process.
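The prediction task itself, fit a model on a windowed time series and roll it forward a few steps, can be sketched with a linear AR(2) model standing in for the paper's tuned neural network (the data below is a synthetic oscillation, not measured airflow):

```python
import math

def fit_ar2(series):
    """Least-squares fit of x[t] ~ a*x[t-1] + b*x[t-2] via the 2x2 normal equations."""
    s11 = s12 = s22 = r1 = r2 = 0.0
    for t in range(2, len(series)):
        x1, x2, y = series[t - 1], series[t - 2], series[t]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        r1 += x1 * y;   r2 += x2 * y
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (r2 * s11 - r1 * s12) / det
    return a, b

def predict_ahead(series, a, b, steps):
    """Roll the fitted model forward, feeding predictions back in."""
    hist = list(series)
    for _ in range(steps):
        hist.append(a * hist[-1] + b * hist[-2])
    return hist[len(series):]

omega = 0.3
series = [math.sin(omega * t) for t in range(100)]  # synthetic airflow-like signal
a, b = fit_ar2(series)
preds = predict_ahead(series, a, b, steps=5)
```

A pure sinusoid satisfies the exact recurrence x[t] = 2 cos(omega) x[t-1] - x[t-2], so the fit recovers it; real airflow is broadband and noisy, which is where a tuned ANN earns its keep over a linear model.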
Abstract:
We present the following practical proposal, which consists of two sessions implemented with different classes of Secondary Education (ESO) at the Colegio Círculo Católico (Catholic School Group), located in the city of Burgos. Each session lasts 55 minutes. The sessions focus on the morphology of the Spanish language, and their design has been carried out with the theoretical basis of the communicative approach and cooperative learning in mind.