90 results for Linear and multilinear programming
Abstract:
In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce this cost and limit the negative effects that the transport system can have on the raw sugar production system. There are several benefits to formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem, namely: to prevent two trains passing in one section at the same time; to keep the train activities (operations) in sequence during each run (trip) by applying precedence constraints; to pass the trains on one section in the correct order (priorities of passing trains) by applying disjunctive constraints; and to ease train passing by resolving rail conflicts through blocking constraints and parallel machine scheduling. Therefore, the sugarcane rail operations are formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem. The model is solved by the integration of constraint programming, mixed integer programming and search techniques. Optimality performance is tested using the Optimization Programming Language (OPL) and CPLEX software on small and large instances based on specific criteria. A real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex and NP-hard scheduling problem and produce a more efficient scheduling system. Innovative hybrid and hyper metaheuristic techniques are developed and coded in C# to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques fully integrate different metaheuristic techniques, heuristic techniques, or both.
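As a concrete illustration of the metaheuristic side of this work, the sketch below applies simulated annealing to a toy run-sequencing problem in Python. The run data, tardiness cost and cooling schedule are all hypothetical; the thesis's actual BPMJSS model (blocking, parallel machines, precedence and disjunctive constraints, C# implementation) is far richer than this permutation-based toy.

```python
import math
import random

# Hypothetical run data: (run_id, duration, due_time). Not from the thesis.
runs = [(0, 4, 10), (1, 3, 6), (2, 5, 14), (3, 2, 5)]

def total_tardiness(order):
    """Cost of serving runs one section at a time in the given order."""
    t, cost = 0, 0
    for idx in order:
        _, duration, due = runs[idx]
        t += duration
        cost += max(0, t - due)  # tardiness beyond the due time
    return cost

def anneal(order, t_start=10.0, t_end=0.01, alpha=0.95, moves_per_temp=50):
    best, best_cost = list(order), total_tardiness(order)
    cur, cur_cost = list(order), best_cost
    temp = t_start
    while temp > t_end:
        for _ in range(moves_per_temp):
            i, j = random.sample(range(len(cur)), 2)  # swap two runs
            cand = list(cur)
            cand[i], cand[j] = cand[j], cand[i]
            delta = total_tardiness(cand) - cur_cost
            # Accept improvements always; accept worsenings with
            # probability exp(-delta / temp) (Metropolis criterion).
            if delta < 0 or random.random() < math.exp(-delta / temp):
                cur, cur_cost = cand, cur_cost + delta
                if cur_cost < best_cost:
                    best, best_cost = list(cur), cur_cost
        temp *= alpha  # geometric cooling
    return best, best_cost

print(anneal(list(range(len(runs)))))
```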
Abstract:
"Whe' yu' from?" The question was put to me as I wandered, camera in hand, in the old square of Spanish Town, Jamaica's former capital. The local man, lounging in the shade of one of the colonial Georgian buildings that enclose the square, was mildly curious about what he took to be a typical white tourish photgraphing the sights of the decayed historic town. At that time, my home was in Kingston where i lived with my wife and baby son. I was then working in the Jamaican Government Town Planning Department in a job that took me all over the island. Turning to my questioner, I replied, "Kingston". There was a brief pause, and then the man spoke again: "No Man! Whe' yu' really from?" I still have difficulties when asked this question. Where am I from? What does this question mean? Does it refer to where I was born, where I spent my previous life or where I live now? Does it have a broader meaning, an enquiry about my origins in terms of background and previous experience? The following chapters are my attempt to answer these questions for my own satisfaction and, I hope, for the amusement of others who may be interested in the life of an ordinary English boy whose dream to travel and see the world was realized in ways he could not possibly have imagined. Finding an appropriate title for this book was difficult. Thursday's Child, North and South and War and Peace all came to mind but, unfortunately for me, those titles had been appropriated by other writers. Thursdays's Child is quite a popular book title, presumably because people who were born on that day and, in the words of the nursery rhyme, had 'far to go', are especially likely to have travellers' tales to tell or life stories of the rags-to-riches variety. Born on a Thursday, I have travelled a lot and I suppose that I have gone far in life. Coming from a working class family, I 'got on' by 'getting a good education' and a 'good job'. I decided against adding to the list of Thursday's Children. North and South would have reflected my life in Britain, spent in both the North and South of England, and my later years, divided between the Northern and Southern Hemispheres of the globe, as well as in countries commonly referred to as the 'advanced' North and the 'underdeveloped' South. North and South has already been appropriated by Mrs Gaskell, something that did not deter one popular American writer from using the title for a book of his. My memories of World War Two and the years afterwards made War and Peace a possible candidate, but readers expectnig an epic tale of Tolstoyan proportions may have been disappointed. To my knowledge, no other book has the title "Whe' Yu' From?". I am grateful to the Jamaican man whose question lingered in my memory and provided the title of this memoir, written decades later. This book is a word picture. It is, in a sense, a self-portrait, and like all portraits, it captures something of the character, it attempts to tell the truth, but it is not the whole truth. This is because it is not my intention to write my entire life story; rather I wish to tell about some of the things in my experience of life that have seemed important or interesting to me. Unlike a painted portrait, the picture I have created is intended to suggest the passage of time. 
While, for most of us in Western society, time is linear and unidirectional, like the flight of an arrow or the trajectory of a bullet, memory rearranges things, calling up images of the past in no particular order, making connections that may link events in various patterns, circular, web-like, superimposed. The stream of consciousness is very unlike the streams we encounter in the physical world. Connections are made in all directions; thoughts hop back and forth in time and space, from topic to topic. My book is a composition drawn from periods, events and thoughts as I remember them. Like life itself, it is made up of patches, some good, some bad, but in my experience, always fascinating. In recording my memories, I have been as accurate as possible. Little of what I have written is about spectacular sights and strange customs. Much of it focuses on my more modest explorations, including observations of everyday things that have attracted my attention. Reading through the chapters, I am struck by my childhood freedom to roam and engage in 'dangerous' activities like climbing trees and playing beside streams, things that many children today are no longer allowed to enjoy. Also noticeable is the survival of traditions and superstitions from the distant past. Obvious too is my preoccupation with place names, both official ones that appear on maps and sign boards and those used by locals and children, names rarely seen in print. If there is any uniting theme to be found in what I have written, it must be my education in the fields, woods and streets of my English homeland, in the various other countries in which I have lived and travelled, as well as more formally from books and in classrooms. Much of my book is concerned with people and places. Many of the people I mention are among those who have been, and often have remained, important and close to me. Others I remember from only the briefest of encounters, but they remain in my memory because of some specific incident or circumstance that fixed a lasting image in my mind. Some of my closest friends and relatives, however, appear nowhere in these pages or receive only the slightest mention. This is not because they played an unimportant role in my life. It is because this book is not the whole story. Among those who receive little or no mention are some who are especially close to me, with whom I have shared happy and sad times and who have shown me and my family much kindness, giving support when this was needed. Some I have known since childhood; they have popped up at various times in my life, often in different parts of the world. Although years may pass without me seeing them, in an important sense they are always with me. These people know who they are. I hope that they know how much I love and appreciate them. When writing my memoir, I consulted a few of the people mentioned in this book, but in the main, I have relied on my own memory, aided by diary and notebook entries and old correspondence. In the preparation of this manuscript, I benefited greatly from the expert advice and encouragement of Neil Marr of BeWrite Books. My wife Anne, inspiration for this book, also contributed in the valuable role of critic. She has my undying gratitude.
Abstract:
Background: Extreme temperatures are associated with cardiovascular disease (CVD) deaths. Previous studies have investigated the relative CVD mortality risk of temperature, but this risk is heavily influenced by deaths in frail elderly persons. To better estimate the burden of extreme temperatures, we estimated their effects on years of life lost due to CVD. Methods and Results: The data were daily observations on weather and CVD mortality for Brisbane, Australia, between 1996 and 2004. We estimated the association between daily mean temperature and years of life lost due to CVD, after adjusting for trend, season, day of the week, and humidity. To examine the non-linear and delayed effects of temperature, a distributed lag non-linear model was used. The model's residuals were examined to investigate whether there were any added effects due to cold spells and heat waves. The exposure-response curve between temperature and years of life lost was U-shaped, with the lowest years of life lost at 24 °C. The curve had a sharper rise at extremes of heat than of cold. The effect of cold peaked two days after exposure, whereas the greatest effect of heat occurred on the day of exposure. There were significant added effects of heat waves on years of life lost. Conclusions: Increased years of life lost due to CVD are associated with both cold and hot temperatures. Research on specific interventions is needed to reduce temperature-related years of life lost from CVD deaths.
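To make the modelling idea concrete, here is a minimal distributed-lag sketch on simulated data. It is not the paper's DLNM (which uses spline cross-bases and adjusts for trend, season, day of week and humidity); it simply shows how lagged temperature terms plus a quadratic term can capture a delayed, U-shaped exposure-response. All values below are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, max_lag = 500, 7
# Simulated daily mean temperature and a toy years-of-life-lost outcome
# whose risk is lowest at 24 degrees C (U-shaped response).
temp = 24 + 6 * np.sin(np.linspace(0, 20, n)) + rng.normal(0, 2, n)
yll = 50 + 0.4 * (temp - 24) ** 2 + rng.normal(0, 5, n)

# Lagged temperature matrix: column k holds the temperature k days earlier.
lags = np.column_stack([np.roll(temp, k) for k in range(max_lag + 1)])
lags = lags[max_lag:]        # drop rows containing wrapped-around values
centred = lags - 24          # centre at the toy minimum-risk temperature

# Linear + quadratic term for each lag allows a non-linear, delayed effect.
X = sm.add_constant(np.column_stack([centred, centred ** 2]))
model = sm.OLS(yll[max_lag:], X).fit()
print(model.params[:3])      # intercept and the first two lag coefficients
```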
Abstract:
This chapter reports on eleven interviews with Pro-Am archivists of Australian television, aimed at finding out how they decide which materials are important enough to archive. Interviewees mostly choose to collect materials in which they have a personal interest. But they are also aware of the relationship between their own favourites and wider accounts of Australian television history, and negotiate between these two positions. Most interviewees acknowledged Australian television's links with British and American programming, but also felt that Australian television is distinctive. They argued that Australian television history is ignored in a way that isn't true for the UK or the US. Several also argued that Australian television has had a 'naïve' nature that has allowed it to be more experimental.
Abstract:
Our paper approaches Twitter through the lens of "platform politics" (Gillespie, 2010), focusing in particular on controversies around user data access, ownership, and control. We characterise the different actors in the Twitter data ecosystem: private and institutional end users of Twitter, commercial data resellers such as Gnip and DataSift, data scientists, and finally Twitter, Inc. itself; and describe their conflicting interests. We furthermore study Twitter's Terms of Service and application programming interface (API) as material instantiations of regulatory instruments used by the platform provider, and argue for stronger promotion of data rights and literacy to strengthen the position of end users.
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a one-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility. The method is also designed to produce features more suited for quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
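A minimal sketch of the generic pipeline described above (features, then linear randomization, then threshold quantization, then a binary hash), using hypothetical block-mean features and a median threshold. The thesis's HOS/Radon method and trained quantizers are not reproduced here; this only illustrates why similar inputs yield nearby hashes.

```python
import numpy as np

rng = np.random.default_rng(42)

def extract_features(img, block=8):
    """Coarse block means: a crude, perturbation-tolerant feature vector."""
    h, w = img.shape
    img = img[: h - h % block, : w - w % block]
    blocks = img.reshape(h // block, block, -1, block).mean(axis=(1, 3))
    return blocks.ravel()

def robust_hash(img, n_bits=64, key=0):
    feats = extract_features(img)
    # Key-dependent random projection: the linear randomization stage.
    proj = np.random.default_rng(key).normal(size=(n_bits, feats.size))
    projected = proj @ feats
    # Single median threshold: the quantization/binarization stage.
    return (projected > np.median(projected)).astype(np.uint8)

img = rng.random((64, 64))
noisy = img + rng.normal(0, 0.01, img.shape)   # minor perturbation
h1, h2 = robust_hash(img), robust_hash(noisy)
print("Hamming distance:", int(np.sum(h1 != h2)))  # small for similar inputs
```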
Abstract:
Background Aphasia is an acquired language disorder that can present a significant barrier to patient involvement in healthcare decisions. Speech-language pathologists (SLPs) are viewed as experts in the field of communication. However, many SLP students do not receive practical training in techniques to communicate with people with aphasia (PWA) until they encounter PWA during clinical education placements. Methods This study investigated the confidence and knowledge of SLP students in communicating with PWA prior to clinical placements using a customised questionnaire. Confidence in communicating with people with aphasia was assessed using a 100-point visual analogue scale. Linear and logistic regressions were used to examine the associations between confidence and age, and between confidence and course type (graduate-entry masters or undergraduate), respectively. Knowledge of strategies to assist communication with PWA was examined by asking respondents to list specific strategies that could assist communication with PWA. Results SLP students were not confident with the prospect of communicating with PWA, reporting a median of 29 points (interquartile range 17–47) on the visual analogue confidence scale. Only four (8.2%) of respondents rated their confidence greater than 55 (out of 100). Regression analyses indicated no relationship existed between confidence and students' age (p = 0.31, r-squared = 0.02), or confidence and course type (p = 0.22, pseudo r-squared = 0.03). Students displayed limited knowledge about communication strategies. Thematic analysis of strategies revealed four overarching themes: Physical, Verbal Communication, Visual Information and Environmental Changes. While most students identified potential use of resources (such as images and written information), fewer students identified strategies to alter their verbal communication (such as reduced speech rate). Conclusions SLP students who had received aphasia-related theoretical coursework, but not commenced clinical placements with PWA, were not confident in their ability to communicate with PWA. Students may benefit from an educational intervention or curriculum modification to incorporate practical training in effective strategies to communicate with PWA, before they encounter PWA in clinical settings. Ensuring students have confidence and knowledge of potential communication strategies to assist communication with PWA may allow them to focus their learning experiences in more specific clinical domains, such as clinical reasoning, rather than building foundation interpersonal communication skills.
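For illustration only, the sketch below mirrors the two reported analyses on simulated data: an OLS regression of VAS confidence on age, and a logistic regression of a "confident" indicator (VAS > 55) on course type. All values are made up to roughly echo the reported null findings.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 49
age = rng.normal(23, 4, n)                           # hypothetical ages
grad_entry = rng.integers(0, 2, n)                   # 1 = graduate-entry
confidence = np.clip(rng.normal(32, 18, n), 0, 100)  # VAS score, 0-100

# Linear regression: confidence ~ age.
linear = sm.OLS(confidence, sm.add_constant(age)).fit()
# Logistic regression: P(confidence > 55) ~ course type.
logistic = sm.Logit((confidence > 55).astype(int),
                    sm.add_constant(grad_entry)).fit(disp=False)

print("age slope p-value:", round(linear.pvalues[1], 2))
print("course-type odds ratio:", round(float(np.exp(logistic.params[1])), 2))
```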
Abstract:
Automated crowd counting has become an active field of computer vision research in recent years. Existing approaches are scene-specific, as they are designed to operate in the single camera viewpoint that was used to train the system. Real world camera networks often span multiple viewpoints within a facility, including many regions of overlap. This paper proposes a novel scene invariant crowd counting algorithm that is designed to operate across multiple cameras. The approach uses camera calibration to normalise features between viewpoints and to compensate for regions of overlap. This compensation is performed by constructing an 'overlap map' which provides a measure of how much an object at one location is visible within other viewpoints. An investigation into the suitability of various feature types and regression models for scene invariant crowd counting is also conducted. The features investigated include object size, shape, edges and keypoints. The regression models evaluated include neural networks, K-nearest neighbours, linear regression, and Gaussian process regression. Our experiments demonstrate that accurate crowd counting was achieved across seven benchmark datasets, with optimal performance observed when all features were used and when Gaussian process regression was used. The combination of scene invariance and multi-camera crowd counting is evaluated by training the system on footage obtained from the QUT camera network and testing it on three cameras from the PETS 2009 database. Highly accurate crowd counting was observed with a mean relative error of less than 10%. Our approach enables a pre-trained system to be deployed in a new environment without any additional training, bringing the field one step closer toward a 'plug and play' system.
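The regression stage can be sketched as follows, assuming simple simulated per-frame features; the paper's full pipeline additionally normalises features via camera calibration and weights them with the inter-camera overlap map.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
n_frames = 200
counts = rng.integers(0, 40, n_frames)  # hypothetical ground-truth counts

# Toy per-frame features: foreground area, edge pixels, keypoint count,
# each growing roughly linearly with crowd size plus noise.
X = np.column_stack([
    counts * 55 + rng.normal(0, 60, n_frames),
    counts * 30 + rng.normal(0, 40, n_frames),
    counts * 12 + rng.normal(0, 20, n_frames),
])

# Gaussian process regression from features to count; the WhiteKernel
# term absorbs observation noise.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X[:150], counts[:150])
pred = gpr.predict(X[150:])
rel_err = np.abs(pred - counts[150:]) / np.maximum(counts[150:], 1)
print(f"mean relative error: {rel_err.mean():.2%}")
```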
Abstract:
Background Heatwaves can cause tens to thousands of excess deaths in a local area within a couple of weeks. Excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality figure under 'normal' conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changing mean levels (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not been applied in public health research. This paper aimed to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods Special R functions were developed to implement the HHT algorithm to decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
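A rough illustration of the trend/non-trend split, using the third-party PyEMD package (published on PyPI as EMD-signal) as a stand-in for the authors' R functions; empirical mode decomposition is the core of HHT. The daily series and the "heatwave" spike below are simulated, and the paper's mode-mixing handling is not reproduced.

```python
import numpy as np
from PyEMD import EMD  # assumes the PyEMD package: pip install EMD-signal

rng = np.random.default_rng(7)
days = np.arange(730)
mortality = (
    40 + 0.01 * days                        # slow long-term trend
    + 5 * np.sin(2 * np.pi * days / 365)    # seasonal cycle
    + rng.normal(0, 3, days.size)           # day-to-day noise
)
mortality[400:410] += 15                    # a toy "heatwave" mortality spike

# Decompose into intrinsic mode functions (IMFs) plus a residual trend.
emd = EMD()
emd(mortality)
imfs, residue = emd.get_imfs_and_residue()

non_trend = mortality - residue             # excess relative to the trend
excess = non_trend[400:410].sum()           # excess deaths over the event
print(f"estimated excess deaths during the spike: {excess:.0f}")
```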
Abstract:
Employee engagement is linked to higher productivity, lower attrition, and improved organizational reputation, resulting in increased focus and resourcing by managers to foster an engaged workforce. While drivers of employee engagement have been identified as perceived support, job characteristics, and value congruence, internal communication is theoretically suggested to be a key influence in both the process and maintenance of employee engagement efforts. However, understanding the mechanisms by which internal communication influences employee engagement has emerged as a key question in the literature. The purpose of this research is to investigate whether social factors, namely perceived support and identification, play a mediating role in the relationship between internal communication and engagement. To test the theoretical model, data are collected from 200 non-executive employees using an online self-administered survey. The study applies linear and mediated regression to the model; the findings suggest that organizations and supervisors should focus internal communication efforts on building greater perceptions of support and stronger identification among employees in order to foster optimal levels of engagement.
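As an illustration of mediated regression in this setting, the sketch below runs the classic Baron-and-Kenny-style steps on simulated data; the variable names and effect sizes are hypothetical, not the study's.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
comm = rng.normal(0, 1, n)                    # internal communication
support = 0.6 * comm + rng.normal(0, 1, n)    # mediator: perceived support
engage = 0.5 * support + 0.1 * comm + rng.normal(0, 1, n)  # engagement

# Step 1: total effect of communication on engagement.
total = sm.OLS(engage, sm.add_constant(comm)).fit()
# Step 2: communication predicts the mediator.
a_path = sm.OLS(support, sm.add_constant(comm)).fit()
# Step 3: the direct effect shrinks once the mediator is included,
# which is the signature of mediation.
direct = sm.OLS(engage,
                sm.add_constant(np.column_stack([comm, support]))).fit()

print("total effect:", round(total.params[1], 3))
print("direct effect:", round(direct.params[1], 3))  # smaller => mediation
```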
Abstract:
In this paper the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. Lett., 73 (1994), pp. 1311-1315; Phys. Rev. E, 54 (1996), pp. 376-394] is presented in a pedagogical way to increase its visibility in applied mathematics and to argue favorably for its incorporation into the corresponding graduate curriculum. The method is illustrated by some linear and nonlinear singular perturbation problems. © 2012 Society for Industrial and Applied Mathematics.
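For readers meeting the method for the first time, here is a standard textbook-style instance (not reproduced from the paper): the RG treatment of the weakly damped linear oscillator, where renormalizing the amplitude removes the secular term from the naive expansion.

```latex
% Worked example: the RG method for  y'' + \epsilon y' + y = 0,
% with 0 < \epsilon \ll 1.
%
% The naive expansion y = y_0 + \epsilon y_1 + O(\epsilon^2), with
% y_0 = A\cos(t+\theta), produces a secular term:
\[
  y = A\left(1 - \tfrac{\epsilon t}{2}\right)\cos(t+\theta) + O(\epsilon^2),
\]
% which breaks down for t \sim 1/\epsilon. Renormalizing the amplitude at
% an arbitrary time \tau and requiring \partial y/\partial\tau = 0 gives
% the RG (amplitude) equation
\[
  \frac{dA}{d\tau} = -\frac{\epsilon}{2}\,A
  \quad\Longrightarrow\quad
  A(t) = A(0)\,e^{-\epsilon t/2},
\]
% recovering the uniformly valid solution
% y \approx A(0)\,e^{-\epsilon t/2}\cos(t+\theta).
```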
Abstract:
This article elucidates and analyzes the fundamental underlying structure of the renormalization group (RG) approach as it applies to the solution of any differential equation involving multiple scales. The amplitude equation derived through the elimination of secular terms arising from a naive perturbation expansion of the solution to these equations by the RG approach is reduced to an algebraic equation which is expressed in terms of the Thiele semi-invariants or cumulants of the eliminant sequence $\{Z_i\}_{i=1}^{\infty}$. Its use is illustrated through the solution of both linear and nonlinear perturbation problems and certain results from the literature are recovered as special cases. The fundamental structure that emerges from the application of the RG approach is not the amplitude equation but the aforementioned algebraic equation. © 2008 The American Physical Society.
Abstract:
We report the Heck coupling of 2-vinyl-4,5-dicyanoimidazole (vinazene) with selected di- and trihalo aromatics in an effort to prepare linear and branched electron-accepting conjugated materials for application in organic electronics. By selecting a suitable halo-aromatic moiety, it is possible to tune the HOMO-LUMO energy levels, absorption, and emission properties for a specific application. In this regard, materials with strong photoluminescence from blue → green → red are reported that may have potential application in organic light-emitting diodes (OLEDs). Furthermore, derivatives with strong absorption in the visible spectrum, coupled with favorable HOMO-LUMO levels, have been used to prepare promising organic photovoltaic devices (OPVs) when combined with commercially available semiconducting donor polymers.
Impact of child labor on academic performance: evidence from the program "Edúcame Primero Colombia"
Abstract:
In this study, the effects of different variables of child labor on academic performance are investigated. To this end, 3302 children participating in the child labor eradication program "Edúcame Primero Colombia" were interviewed. The interview format used for the children's enrollment into the program was a template from which socioeconomic conditions, academic performance, and child labor variables were evaluated. The academic performance factor was determined using the Analytic Hierarchy Process (AHP). The data were analyzed through a logistic regression model restricted to children who engaged in some type of labor (n = 921). The results showed that labor conditions, the number of weekly hours dedicated to work, and the presence of work scheduled in the morning negatively affected the academic performance of child laborers. These results indicate that the relationship between child labor and academic performance stems from the conflict between the two activities, rather than being a simple linear relationship determined merely by the presence or absence of child labor. This study has implications for the formulation of policies, programs, and interventions for preventing and eradicating child labor and attenuating its negative effects on the social and educational development of children.
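For illustration, a minimal AHP computation of the kind used to build an academic performance factor: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The criteria and judgements below are hypothetical, not the study's.

```python
import numpy as np

# Hypothetical criteria for an "academic performance" factor.
criteria = ["grades", "attendance", "grade-for-age"]

# Saaty-scale pairwise judgements: entry (i, j) is the importance of
# criterion i relative to criterion j; the matrix is reciprocal.
A = np.array([
    [1.0, 3.0, 2.0],
    [1/3, 1.0, 1/2],
    [1/2, 2.0, 1.0],
])

# Priority weights: the normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Consistency ratio: CR < 0.1 indicates acceptably consistent judgements.
n = A.shape[0]
ci = (np.max(np.real(eigvals)) - n) / (n - 1)
cr = ci / 0.58  # Saaty's random index for n = 3
print(dict(zip(criteria, weights.round(3))), "CR =", round(cr, 3))
```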
Abstract:
In recent years, the beauty leaf plant (Calophyllum inophyllum) has been considered a potential second-generation biodiesel source due to its high seed oil content, high fruit production rate, simple cultivation, and ability to grow in a wide range of climatic conditions. However, due to the high free fatty acid (FFA) content of this oil, the potential of this biodiesel feedstock remains unrealized, and little research has been undertaken on it. In this study, transesterification of beauty leaf oil to produce biodiesel has been investigated. A two-step biodiesel conversion method consisting of acid-catalysed pre-esterification and alkali-catalysed transesterification has been utilized. The three main factors that drive the biodiesel (fatty acid methyl ester (FAME)) conversion from vegetable oil (triglycerides) were studied using response surface methodology (RSM) based on a Box-Behnken experimental design. The factors considered in this study were catalyst concentration, methanol-to-oil molar ratio and reaction temperature. Linear and full quadratic regression models were developed to predict FFA and FAME concentration and to optimize the reaction conditions. The significance of these factors and their interactions in both stages was determined using analysis of variance (ANOVA). The reaction conditions giving the largest reduction in FFA concentration in acid-catalysed pre-esterification were a 30:1 methanol-to-oil molar ratio, 10% (w/w) sulfuric acid catalyst loading and a 75 °C reaction temperature. In the alkali-catalysed transesterification process, a 7.5:1 methanol-to-oil molar ratio, 1% (w/w) sodium methoxide catalyst loading and a 55 °C reaction temperature were found to give the highest FAME conversion. The good agreement between model outputs and experimental results demonstrated that this methodology may be useful for industrial process optimization for biodiesel production from beauty leaf oil, and possibly for other industrial processes as well.
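A sketch of the full quadratic response-surface fit on a Box-Behnken-style coded design, using made-up data; the study fitted such models for both FFA and FAME concentration, and statsmodels here merely stands in for whatever software the authors used.

```python
import numpy as np
import statsmodels.api as sm

# Coded levels (-1, 0, +1) for three factors (catalyst loading,
# methanol:oil ratio, temperature) in a Box-Behnken-style layout;
# the centre point appears once here for brevity. Yields are invented.
design = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0],
])
yield_pct = np.array([72, 80, 75, 85, 70, 78, 74, 83, 69, 77, 73, 82, 88.0])

x1, x2, x3 = design.T
# Full quadratic model: linear, two-factor interaction and squared terms.
X = sm.add_constant(np.column_stack([
    x1, x2, x3,
    x1 * x2, x1 * x3, x2 * x3,
    x1**2, x2**2, x3**2,
]))
fit = sm.OLS(yield_pct, X).fit()
print(fit.params.round(2))  # coefficients of the fitted response surface
```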