Abstract:
Purpose of this paper:
Recent literature indicates that around one third of perishable products end up as waste (Mena et al., 2014): 60% of this waste can be classified as avoidable (EC, 2010), suggesting logistics and operational inefficiencies along the supply chain. In developed countries perishable products are predominantly wasted in wholesale and retail (Gustavsson et al., 2011) due to customer demand uncertainty and to errors and delays in the supply chain (Fernie and Sparks, 2014). While research on the logistics of large retail supply chains is well documented, research on retail small and medium enterprises' (SMEs) capabilities to prevent and manage waste of perishable products is in its infancy (cf. Ellegaard, 2008) and needs further exploration. In our study, we investigate the retail logistics practice of small food retailers, the factors that contribute to perishable product waste, and the barriers and opportunities SMEs face in retail logistics in preserving product quality and participating in reverse logistics flows.
Design/methodology/approach:
As research on waste of perishable products in SMEs is scattered, we first focus on identifying the key variables that contribute to the creation of avoidable waste. Second, we identify patterns of waste creation at the retail level and the possibilities for value-added recovery. We use explorative case studies (Eisenhardt, 1989) and compare four SMEs and one large retailer that operate in a developed market. To gain insight into the specificities of SMEs that affect retail logistics practice, we select two types of food retailers: specialised (e.g. greengrocers and bakers) and general (e.g. convenience stores that sell perishable products as part of their assortment).
Findings:
Our preliminary findings indicate that large retailers and SME retailers differ both in the factors that contribute to waste creation and in the opportunities for value-added recovery of products. While many factors appear to affect waste creation and management at large retailers, only a small number of specific factors appears to affect SMEs. Similarly, large retailers utilise a range of practices to reduce the risks of product perishability and short shelf life, manage demand, and manage reverse logistics, whereas retail SMEs have limited options to address waste creation and value-added recovery. However, our findings show that specialist SMEs can successfully minimise waste and even create possibilities for value-added recovery of perishable products. The data indicate that the business orientation of the SME, the buyer-supplier relationship, and the extent of adoption of lean principles in retail, coupled with SME resources, product-specific regulations, and support from local authorities for waste management or partnerships with other organisations, determine the extent to which product quality is preserved and value-added recovery achieved.
Value:
Our contribution to the SCM academic literature is threefold: first, we identify the major factors that contribute to the generation of perishable product waste in the retail environment; second, we identify possibilities for value-added recovery of perishable products; and third, we present opportunities and challenges for SME retailers to manage or participate in activities of value-added recovery. Our findings contribute to theory by filling a gap in the literature on product quality preservation and value-added recovery in the context of retail logistics and SMEs.
Research limitations/implications:
Our findings are limited to insights from five case studies of retail companies that operate within a developed market. To improve generalisability, we intend to increase the number of cases and include data obtained from suppliers and from organisations involved in reverse logistics flows (e.g. local authorities, charities, etc.).
Practical implications:
With this paper, we contribute to the improvement of retail logistics and operations in SMEs, which constitute over 99% of business activity in the UK (Rhodes, 2015). Our findings will help retail managers and owners to better understand the possibilities for value-added recovery, investigate a range of logistics and retail strategies suited to the specificities of the SME environment and, ultimately, improve their profitability and sustainability.
Abstract:
The comments of Charles Kegan Paul, the Victorian publisher who was involved in publishing the novels of the nineteenth-century British-Indian author Philip Meadows Taylor as single-volume reprints in the 1880s, are illuminating. They are indicative of the publisher's position with regard to publishing: that there was often no correlation between commercial success and the artistic merit of a work. According to Kegan Paul, a substandard or mediocre text would be commercially successful as long as it met a perceived want on the part of the public. In effect, the ruminations of the publisher suggest that a firm desirous of acquiring commercial success for a work should be an astute judge of the pre-existing wants of consumers within the market. Yet Theodor Adorno, writing in the mid-twentieth century, offers a perspective entirely distinct from Kegan Paul's observations, arguing that there is nothing foreordained about consumer demand for certain cultural tropes or productions. These are in fact driven by an industry that preempts and conditions the possible reactions of the consumer. Both Kegan Paul's and Adorno's insights are illuminating when it comes to addressing the key issues explored in this essay. Kegan Paul's comments allude to the ways in which publishers' promotion of Philip Meadows Taylor's fictional depictions of India and its peoples was to a large extent driven in the mid- to late-nineteenth century by their expectations of what metropolitan readers desired at any given time, whereas Adorno's insights reveal the ways in which British-Indian narratives and the public identity of their authors were not assured in advance but were, to a large extent, engineered by the publishing industry and the literary marketplace.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
The traditional process of filling medicine trays and dispensing medicines to patients in hospitals is done manually by reading a printed paper medicine chart. This process can be very strenuous and error-prone, given the number of sub-tasks involved in the entire workflow and the dynamic nature of the work environment. Therefore, efforts are being made to digitalise the medication dispensation process by introducing a mobile application called the Smart Dosing application. The introduction of the Smart Dosing application into the hospital workflow raises security concerns and calls for security requirement analysis. This thesis is written as a part of the smart medication management project at the Embedded Systems Laboratory, Åbo Akademi University. The project aims at digitising the medicine dispensation process by integrating information from various health systems and making it available through the Smart Dosing application. The application is intended to be used on a tablet computer incorporated on the medicine tray. The smart medication management system includes the medicine tray, the tablet device, and the medicine cups with their cup holders. Introducing the Smart Dosing application should not interfere with the existing process carried out by the nurses, and it should result in minimal modifications to the tray design and the workflow. The re-design of the tray includes integrating the device running the application into the tray in a manner that users find convenient and that leads them to make fewer errors while using it. The main objective of this thesis is to enhance the security of the hospital medicine dispensation process by ensuring the security of the Smart Dosing application at various levels. The method used was to analyse how the tray design and the application user interface design can help prevent errors, and which secure technology choices have to be made before starting the development of the next prototype of the Smart Dosing application. The thesis first establishes the context of use of the application, the end-users and their needs, and the errors made in the everyday medication dispensation workflow, through continuous discussions with the nursing researchers. It then gains insight into the vulnerabilities, threats, and risks of using a mobile application in the hospital medication dispensation process. The resulting list of security requirements was produced by analysing the previously built prototype of the Smart Dosing application, through continuous interactive discussions with the nursing researchers, and via an exhaustive state-of-the-art study on the security risks of using mobile applications in a hospital context. The thesis also uses the OCTAVE Allegro method to help readers understand the likelihood and impact of threats, and what steps should be taken to prevent or fix them. The security requirements obtained are, as a result, a starting point for the developers of the next iteration of the Smart Dosing application prototype.
Abstract:
This paper reports the use of proof planning to diagnose errors in program code. In particular, it looks at the errors that arise in the base cases of recursive programs produced by undergraduates, and describes two classes of error that arise in this situation. The use of test cases would catch these errors but would fail to distinguish between them. The system adapts proof critics, commonly used to patch faulty proofs, to diagnose such errors and distinguish between the two classes. It has been implemented in Lambda-clam, a proof planning system, and applied successfully to a small set of examples.
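To make the distinction concrete, here is a minimal illustrative sketch, in Python, of two flavours of base-case error a student might write in a recursive function. The paper's actual error taxonomy and the Lambda-clam critics are not reproduced here, so the classification below is hypothetical.

```python
# Hypothetical illustration of two kinds of base-case error in a recursive
# sum-of-naturals function; not the paper's actual error classification.

def sum_to(n):
    """Reference implementation: 0 + 1 + ... + n."""
    if n == 0:
        return 0
    return n + sum_to(n - 1)

def sum_to_bad_value(n):
    """Error class A (hypothetical): the base case returns the wrong value."""
    if n == 0:
        return 1  # should return 0
    return n + sum_to_bad_value(n - 1)

def sum_to_bad_guard(n):
    """Error class B (hypothetical): the base case tests the wrong condition."""
    if n == 1:  # should test n == 0; never terminates for n == 0
        return 1
    return n + sum_to_bad_guard(n - 1)

# A test such as sum_to_bad_value(3) == 6 fails (it returns 7), and
# sum_to_bad_guard(0) raises RecursionError: tests detect that something is
# wrong, but by themselves do not say which class of base-case error it is.
```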
Abstract:
Many individuals who have had a stroke have motor impairments, such as timing deficits, that hinder their ability to complete daily activities like getting dressed. Robotic rehabilitation is an increasingly popular therapeutic avenue for improving motor recovery in this population. Yet most studies have focused on improving the spatial aspect of movement (e.g. reaching) rather than the temporal one (e.g. timing). Hence, the main aim of this study was to compare the immediate improvement in timing accuracy produced by two types of robotic rehabilitation: haptic guidance (HG), which consists of guiding the person to make the correct movement, thus decreasing his or her movement errors, and error amplification (EA), which consists of increasing the person's movement errors. The secondary objective was to explore whether the side of the stroke lesion had an effect on timing accuracy following HG and EA training. Thirty-four persons who had had a stroke (average age 67 ± 7 years) participated in a single training session of a timing-based task (a simulated pinball-like task) in which they had to activate a robot at the correct moment to successfully hit targets presented at random on a computer screen. Participants were randomly divided into two groups, receiving either HG or EA. During the same session, a baseline phase and a retention phase were administered before and after the training, and these phases were compared in order to evaluate the immediate impact of HG and EA on movement timing accuracy. The results showed that HG improved immediate timing accuracy (p = 0.03), but EA did not (p = 0.45). When the two trainings were compared, HG proved superior to EA at improving timing (p = 0.04). Furthermore, a significant correlation was found between the side of the stroke lesion and the change in timing accuracy following EA (r_pb = 0.7, p = 0.001), but not HG (r_pb = 0.18, p = 0.24). In other words, a deterioration in timing accuracy was found for participants with a lesion in the left hemisphere who trained with EA, whereas for participants with a right-sided stroke lesion an improvement in timing accuracy was noted following EA. In sum, HG appears to improve immediate timing accuracy for individuals who have had a stroke, but the side of the stroke lesion seems to play a part in participants' response to training. This remains to be explored further, along with the impact of providing more training sessions, in order to assess any long-term benefits of HG or EA.
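As a note on the statistic reported above: r_pb denotes the point-biserial correlation between a binary variable (lesion side) and a continuous one (change in timing accuracy). A minimal sketch of how such a value can be computed follows; the data below are synthetic placeholders, not the study's measurements, and the 0/1 coding of lesion side is hypothetical.

```python
# Minimal sketch of a point-biserial correlation, as reported in the study;
# the data below are synthetic stand-ins, not the study's measurements.
import numpy as np
from scipy.stats import pointbiserialr

rng = np.random.default_rng(0)
lesion_side = rng.integers(0, 2, size=17)        # hypothetical coding: 0 = right, 1 = left
timing_change = rng.normal(0.0, 1.0, size=17) - 0.8 * lesion_side  # toy effect

r_pb, p_value = pointbiserialr(lesion_side, timing_change)
print(f"r_pb = {r_pb:.2f}, p = {p_value:.3f}")
```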
Abstract:
This thesis examines three key moments in the intersecting histories of Scotland, Ireland and England, and their impact on literature. Chapter one, 'Robert Bruce and the Last King of Ireland: Writing the Irish Invasion, 1315-1826', is split into two parts. Part one, 'Barbour's (other) Bruce', focuses on John Barbour's The Bruce (1375) and its depiction of the Bruce's Irish campaign (1315-1318). It first examines the invasion material from the perspective of the existing Irish and Scottish relationship and their opposition to English authority. It highlights possible political and ideological motivations behind Barbour's negative portrait of Edward Bruce - whom Barbour presents as the catalyst for the invasion and the source of its carnage and ultimate failure - and his partisan comparison between Edward and his brother Robert I. It also probes the socio-political and ideological background to the Bruce and its depiction of the Irish campaign, in addition to Edward and Robert. It peers behind some of the Bruce's most lauded themes, such as chivalry, heroism, loyalty, and patriotism, and exposes its militaristic feudal ideology, its propaganda-rich rhetoric, and its illusions of 'freedom'. Part one concludes with an examination of two of the Irish section's most marginalised figures, the Irish and a laundry woman. Part two, 'Cultural Memories of the Bruce Invasion of Ireland, 1375-1826', examines the cultural memory of the Bruce invasion in three literary works from the Medieval, Early Modern and Romantic periods. The first, and by far the most significant, memorialisation of the invasion is Barbour's Bruce, which is positioned for the first time within the tradition of ars memoriae (the art of memory) and present-day cultural memory theories. The Bruce is evaluated as a site of memory and Barbour's methods are compared with Icelandic literature of the same period. The recall of the invasion in late sixteenth-century Anglo-Irish literature is then considered, specifically Edmund Spenser's A View of the State of Ireland, which is viewed in the context of contemporary Ulster politics. The final text to be considered is William Hamilton Drummond's Bruce's Invasion of Ireland (1826). It is argued that Drummond's poem offers an alternative Irish version of the invasion: a counter-memory that responds to nineteenth-century British politics, in addition to the controversy surrounding the publication of the Ossian fragments. Chapter two, 'The Scots in Ulster: Policies, Proposals and Projects, 1551-1575', examines the struggle between Irish and Scottish Gaels and the English for dominance in north Ulster, and its impact on England's wider colonial ideology, strategy, literature and life writing. Part one, entitled 'Noisy neighbours, 1551-1567', covers the deputyships of Sir James Croft, Sir Thomas Radcliffe, and Sir Henry Sidney, and examines English colonial writing during a crucial period when the Scots provoked an increase in militarisation in the region. Part two, 'Devices, Advices, and Descriptions, 1567-1575', deals with the relationship between the Scots and Turlough O'Neill, the influence of the 5th Earl of Argyll, and the rise of Sorley Boy MacDonnell. It proposes that a renewed Gaelic alliance hindered England's conquest of Ireland and generated numerous plantation proposals and projects for Ulster, many of which exhibit a blurring between the documentary and the literary, while all attest to the considerable impact of the Gaelic Scots in both motivating and frustrating various projects for that province; the most prominent of these were undertaken by Sir Thomas Smith in 1571 and by Walter Devereux, 1st Earl of Essex, in 1573.
Abstract:
The view that Gothic literature emerged as a reaction against the prominence of the Greek classics, and that, as a result, it bears no trace of their influence, is a commonplace in Gothic studies. This thesis re-examines this view, arguing that the Gothic and the Classical were not in opposition to one another, and that Greek tragic poetry and myth should be counted among the literary sources that inspired early Gothic writers. The discussion is organised in three parts. Part I focuses on evidence which suggests that the Gothic and the Hellenic were closely associated in the minds of several British literati both on a political and aesthetic level. As is shown, the coincidence of the Hellenic with the Gothic revival in the second half of the eighteenth century inspired them not only to trace common ground between the Greek and Gothic traditions, but also to look at Greek tragic poetry and myth through Gothic eyes, bringing to light an unruly, ‘Dionysian’ world that suited their taste. The particulars of this coincidence, which has not thus far been discussed in Gothic studies, as well as evidence which suggests that several early Gothic writers were influenced by Greek tragedy and myth, open up new avenues for research on the thematic and aesthetic heterogeneity of early Gothic literature. Parts II and III set out to explore this new ground and to support the main argument of this thesis by examining the influence of Greek tragic poetry and myth on the works of two early Gothic novelists and, in many ways, shapers of the genre, William Beckford and Matthew Gregory Lewis. Part II focuses on William Beckford’s Vathek and its indebtedness to Euripides’s Bacchae, and Part III on Matthew Gregory Lewis’s The Monk and its indebtedness to Sophocles’s Oedipus Tyrannus. As is discussed, Beckford and Lewis participated actively in both the Gothic and Hellenic revivals, producing highly imaginative works that blended material from the British and Greek literary traditions.
Abstract:
Since the beginning of the Haitian theatrical tradition there has been an ineluctable dedication to the representation of Haitian history on stage. Given the rich theatrical archive about Haiti throughout the world, this study considers operas and plays written solely by Haitian playwrights. By delving into the works of Juste Chanlatte, Massillon Coicou, and Vendenesse Ducasse, this study proposes a re-reading of Haitian theater that considers the stage as an innovative site for contesting negative and clichéd representations of the Haitian Revolution and its revolutionary leadership. For a genre long mired in accusations of mimicking European literary forms, this study proposes a reevaluation of Haitian theater and its literary origins.
Abstract:
A comparison of the Rietveld quantitative phase analyses (RQPA) obtained using Cu-Kα1, Mo-Kα1, and strictly monochromatic synchrotron radiations is presented. The main aim is to test a simple hypothesis: high-energy Mo-radiation, combined with high-resolution laboratory X-ray powder diffraction optics, could yield more accurate RQPA for challenging samples than the well-established Cu-radiation procedure(s). To this end, three sets of mixtures with increasing amounts of a given phase (the spiking method) were prepared and the corresponding RQPA results evaluated. First, a series of crystalline inorganic phase mixtures with increasing amounts of an analyte was studied in order to determine whether the Mo-Kα1 methodology is as robust as the well-established Cu-Kα1 one. Second, a series of crystalline organic phase mixtures with increasing amounts of an organic compound was analyzed; this type of mixture can cause transparency problems in reflection and inhomogeneous loading in narrow capillaries for transmission studies. Finally, a third series with variable amorphous content was studied. The limits of detection in the Cu-patterns, ~0.2 wt%, are slightly lower than those derived from the Mo-patterns, ~0.3 wt%, for similar recording times, and the limit of quantification for a well-crystallized inorganic phase using laboratory powder diffraction was established at ~0.10 wt%. However, accuracy was compromised at that level, as relative errors were ~100%; contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%. From the obtained results it is inferred that RQPA from Mo-Kα1 radiation have slightly better accuracy than those obtained from Cu-Kα1. This behavior has been established from the calibration graphs obtained through the spiking method and also from Kullback-Leibler distance statistic studies. We attribute this outcome, in spite of the lower diffraction power of Mo-radiation compared to Cu-radiation, to the larger volume probed with Mo and to the higher energy, which minimizes systematic errors in the patterns and the microabsorption effect.
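As a side note on the Kullback-Leibler distance statistic mentioned above, the following is a minimal sketch of one common way to compute such a distance between two diffraction patterns treated as normalized distributions; the paper's exact formulation is not given in the abstract, so this is illustrative only.

```python
# Hedged sketch: a symmetrised Kullback-Leibler distance between two
# non-negative intensity patterns, treated as normalized distributions.
# The paper's exact statistic may differ; this only illustrates the idea.
import numpy as np

def kl_distance(observed, calculated, eps=1e-12):
    """Symmetrised KL distance between two non-negative patterns."""
    p = np.asarray(observed, float) + eps
    q = np.asarray(calculated, float) + eps
    p /= p.sum()  # normalize to probability-like distributions
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Toy example: two similar intensity profiles
print(kl_distance([1.0, 2.0, 5.0, 2.0], [1.1, 1.9, 4.8, 2.2]))
```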
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers seeking to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that are discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
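To illustrate the kind of checker the abstract says FOLCSL synthesizes, here is a minimal hand-written sketch that scans an event trace and flags a violated invariant. The FOLCSL syntax and the simulator's trace format are not given in the abstract, so the event encoding and the invariant below are hypothetical.

```python
# Illustrative sketch of an invariant checker over an event trace; the
# trace format and the invariant are hypothetical, not FOLCSL output.
from typing import Iterable, Tuple

Event = Tuple[int, str, str]  # (timestamp, event_kind, resource_id)

def check_request_before_release(trace: Iterable[Event]) -> bool:
    """Invariant: every 'release' of a resource is preceded by a 'request'."""
    pending = set()
    for ts, kind, res in trace:
        if kind == "request":
            pending.add(res)
        elif kind == "release":
            if res not in pending:
                print(f"violation at t={ts}: release of {res} without request")
                return False
            pending.discard(res)
    return True

trace = [(1, "request", "bus"), (2, "release", "bus"), (3, "release", "cache")]
print(check_request_before_release(trace))  # False: 'cache' released unrequested
```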
Abstract:
Nowadays, technological advancements have pushed industry and research towards the automation of various processes. Automation brings a reduction in costs and an improvement in product quality, and for this reason companies are pushing research to investigate new technologies. The agriculture industry has always looked towards automating various processes, from product processing to storage. In recent years, the automation of the harvest and cultivation phases has also become attractive, driven by advances in autonomous driving. Nevertheless, ADAS (advanced driver-assistance systems) alone are not enough: merging different technologies will be the way to obtain total automation of agricultural processes. For example, sensors that estimate products' physical and chemical properties can be used to evaluate the maturation level of fruit. The fusion of these technologies therefore has a key role in industrial process automation. In this dissertation, both ADAS systems and sensors for precision agriculture are treated. Several measurement procedures for characterizing commercial 3D LiDARs are proposed and tested to cope with the growing need for comparison tools; axial errors and transversal errors have been investigated. Moreover, a measurement method and setup for evaluating the effect of fog on 3D LiDARs is proposed. Each presented measurement procedure has been tested, and the obtained results highlight the versatility and soundness of the proposed approaches. Regarding the precision agriculture sensors, a measurement approach for estimating the moisture content and density of crops directly in the field is presented. The approach employs a near-infrared (NIR) spectrometer together with Partial Least Squares (PLS) statistical analysis. The approach and the model are described together with a first laboratory prototype used to evaluate the NIRS approach. Finally, a prototype for in-field analysis is realized and tested. The test results are promising, showing that the proposed approach is suitable for moisture content and density estimation.
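As an illustration of the NIR-plus-PLS pipeline described above, here is a minimal sketch assuming scikit-learn; the spectra, wavelength count, and number of components are synthetic placeholders, not the dissertation's data or model.

```python
# Minimal sketch of NIR spectra + Partial Least Squares regression for
# moisture-content estimation; data below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_wavelengths = 120, 256
X = rng.normal(size=(n_samples, n_wavelengths))            # toy NIR spectra
true_coef = rng.normal(size=n_wavelengths)
y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)  # toy moisture values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=10)   # component count is an assumption
pls.fit(X_tr, y_tr)
print(f"R^2 on held-out spectra: {pls.score(X_te, y_te):.3f}")
```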
Abstract:
Vision systems are powerful tools playing an increasingly important role in modern industry, detecting errors and maintaining product standards. With the wider availability of affordable industrial cameras, computer vision algorithms have been increasingly applied to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect any anomalies, with execution times compatible with the production specifications. Other constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of the trials conducted to reach the final performance. Transfer learning, which alleviates the need for large amounts of training data, combined with data augmentation methods based on the generation of synthetic images, was used to increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed respectively for vial counting and discrepancy detection. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
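A minimal sketch of the transfer-learning-plus-augmentation recipe described above follows, assuming PyTorch/torchvision; the backbone, augmentations, and two-class head are illustrative choices, since the abstract does not specify the actual architecture or training setup.

```python
# Hedged sketch: transfer learning with data augmentation for an
# industrial inspection task; architecture and classes are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Data augmentation: synthetic variation of the training images
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
])

# Transfer learning: reuse ImageNet features, retrain only the final layer
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False
backbone.fc = nn.Linear(backbone.fc.in_features, 2)  # e.g. pack OK / anomaly

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```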
Abstract:
In the field of industrial automation there is an increasing need for optimal control systems with low tracking errors and low power and energy consumption. The motors dealt with here are mainly Permanent Magnet Synchronous Motors (PMSMs), governed by three controllers: a position controller, a speed controller, and a current controller. In this thesis we therefore act on the gains of the first two controllers, using the TwinCAT 3 software to find the best set of parameters. To do this, starting from the default parameters recommended by TwinCAT, two main methods were applied and then compared: the Ziegler-Nichols method, which is a tabular method, and advanced tuning, TwinCAT's software auto-tuning method. In order to analyse which set of parameters was best, several experiments were performed for each case using the Motion Control Function Blocks. Moreover, some machines, such as large robotic arms, have vibration problems. To analyse these in detail, the Bode Plot tool was used, which highlights the frequencies at which resonance and anti-resonance peaks occur. This tool also makes it easier to determine which filters to apply, and where, in order to improve control.
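For reference, the Ziegler-Nichols tabular method mentioned above maps an experimentally found ultimate gain Ku and oscillation period Tu to controller gains. A minimal sketch of the classic closed-loop rules follows; TwinCAT's advanced tuning is proprietary and not reproduced, and the example values are hypothetical.

```python
# Minimal sketch of the classic Ziegler-Nichols closed-loop PID rules;
# Ku and Tu must be measured on the real loop, the values below are toys.

def ziegler_nichols_pid(Ku: float, Tu: float) -> dict:
    """Classic ZN tuning: Kp = 0.6*Ku, Ti = Tu/2, Td = Tu/8."""
    Kp = 0.6 * Ku
    Ti = Tu / 2.0          # integral time
    Td = Tu / 8.0          # derivative time
    return {"Kp": Kp, "Ki": Kp / Ti, "Kd": Kp * Td}

# Example: hypothetical ultimate gain and period measured on a speed loop
print(ziegler_nichols_pid(Ku=4.0, Tu=0.05))
```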