71 results for "Gross operating margin"


Relevance: 20.00%

Abstract:

A classical condition for fast learning rates is the margin condition, first introduced by Mammen and Tsybakov. We tackle in this paper the problem of adaptivity to this condition in the context of model selection, in a general learning framework. Actually, we consider a weaker version of this condition that allows one to take into account that learning within a small model can be much easier than within a large one. Requiring this “strong margin adaptivity” makes the model selection problem more challenging. We first prove, in a general framework, that some penalization procedures (including local Rademacher complexities) exhibit this adaptivity when the models are nested. Contrary to previous results, this holds with penalties that only depend on the data. Our second main result is that strong margin adaptivity is not always possible when the models are not nested: for every model selection procedure (even a randomized one), there is a problem for which it does not demonstrate strong margin adaptivity.
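For readers unfamiliar with the condition the abstract builds on, one common formulation of the Mammen–Tsybakov margin (low-noise) condition in binary classification is the following (a standard textbook statement, not taken from this paper):

```latex
% eta(x) = P(Y = 1 | X = x) is the regression function.
% The margin condition with exponent alpha:
\exists\, C > 0,\ \alpha \ge 0 \ \text{ such that }\
\mathbb{P}\bigl(\,|2\eta(X) - 1| \le t\,\bigr) \le C\, t^{\alpha}
\quad \text{for all } t \in (0, 1].
```

Larger values of the exponent mean the regression function rarely lingers near the decision boundary, which is what enables the fast learning rates the abstract refers to.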

Relevance: 20.00%

Abstract:

Sunnybank represents a distinctly Australian take on the classic ‘Chinatown’ – or indeed other ethnic community enclaves such as ‘little Italy’, ‘little Bombay’, ‘little Athens’ and so on. In the Northern Hemisphere these tended to grow up in the dense working-class neighbourhoods of industrial cities, especially in port cities like Liverpool, London, New York and San Francisco. The existing Chinatowns of Sydney and Melbourne, and to some extent Brisbane’s Fortitude Valley, are of this variety. In the late 1970s, with the growth of suburbanisation and the de-industrialisation and consequent dereliction of the ‘inner city’, these ethnic communities were one of the few signs of life in the city. Apart from the daily commute into the CBD, business with the city council or a trip to the big shopping streets, these areas were one of the few reasons for visiting city centres stigmatised by urban decay and petty crime.

Relevance: 20.00%

Abstract:

A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT) and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect to the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning using robust scheduling and improving the overall responsiveness to emergency patients by solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives. This type of scheduling is referred to as ‘offline scheduling’. Disruptions to this schedule can occur for various reasons including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an ‘online’ process typically handled by OT staff. 
This means that decisions are made ‘on the spot’ in a ‘real-time’ environment. There are three key stages to this study: (1) Analyse the performance of the operating theatre department using simulation. Simulation is used as a decision support tool and involves changing system parameters and elective scheduling policies and observing the effect on the system’s performance measures; (2) Improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques. This will improve access to care and the responsiveness to emergency patients; (3) Address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques. The robust schedule will form the baseline schedule for the online robust reactive scheduling model.
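The abstract does not spell out its robust scheduling model, but the core idea of making an offline schedule robust to treatment-time variation can be sketched minimally: pad each case's planned duration with a buffer proportional to its historical variability. The function and its parameters below are hypothetical illustrations, not the study's actual method.

```python
# Minimal sketch (not the study's model): build a robust elective schedule
# by padding each case's mean duration with a safety buffer of z standard
# deviations, so modest overruns do not cascade into later cases.

def robust_schedule(cases, day_start=0, z=1.0):
    """cases: list of (name, mean_minutes, std_minutes) in planned order.
    Returns a list of (name, start, end) slots with variability buffers."""
    schedule, t = [], day_start
    for name, mean, std in cases:
        planned = mean + z * std          # buffered duration
        schedule.append((name, t, t + planned))
        t += planned
    return schedule

# Hypothetical day: two elective cases with historical duration statistics.
slots = robust_schedule([("hip", 90, 20), ("knee", 60, 10)], z=1.0)
```

Raising `z` trades theatre utilisation for protection against disruption, which is exactly the robustness/efficiency tension the study addresses.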

Relevance: 20.00%

Abstract:

With the increasing rate of shipping traffic, the risk of collisions in busy and congested port waters is likely to rise. However, due to low collision frequencies in port waters, it is difficult to analyze such risk in a sound statistical manner. A convenient approach to investigating navigational collision risk is the application of traffic conflict techniques, which have the potential to overcome the difficulty of obtaining statistical soundness. This study aims at examining port water conflicts in order to understand the characteristics of collision risk with regard to the vessels involved, conflict locations, and traffic and kinematic conditions. A hierarchical binomial logit model, which considers the potential correlations between observation units, i.e., vessels, involved in the same conflicts, is employed to evaluate the association of explanatory variables with conflict severity levels. Results show a higher likelihood of serious conflicts for vessels of small gross tonnage or small overall length. The probability of serious conflict also increases at locations where vessels have more varied headings, such as traffic intersections and anchorages, becoming more critical at night. Findings from this research should assist both navigators operating in port waters and port authorities overseeing navigational management.
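At the core of the (hierarchical) binomial logit model is the logistic link from covariates to the probability of a serious conflict. The non-hierarchical version of that link can be sketched in a few lines; the coefficient values below are hypothetical placeholders, not estimates from the study, and the hierarchical (vessel-level random effect) layer is omitted.

```python
import math

# Sketch of the binomial logit link underlying the model in the abstract:
# probability that a conflict is "serious" as a logistic function of
# vessel covariates. All coefficients are hypothetical.

def serious_conflict_prob(gross_tonnage_kt, night,
                          beta0=-1.0, b_tonnage=-0.5, b_night=0.8):
    # Linear predictor: negative tonnage coefficient mirrors the finding
    # that smaller vessels face higher serious-conflict likelihood.
    eta = beta0 + b_tonnage * gross_tonnage_kt + b_night * night
    return 1.0 / (1.0 + math.exp(-eta))

p_small_night = serious_conflict_prob(1.0, 1)   # small vessel, at night
p_large_day   = serious_conflict_prob(20.0, 0)  # large vessel, daytime
```

With these illustrative signs, a small vessel at night gets a higher predicted probability than a large vessel in daytime, matching the direction of the reported results.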

Relevance: 20.00%

Abstract:

Network RTK (Real-Time Kinematic) is a technology based on GPS (Global Positioning System), or more generally on GNSS (Global Navigation Satellite System) observations, to achieve centimeter-level accuracy positioning in real time. It is enabled by a network of Continuously Operating Reference Stations (CORS). CORS placement is an important problem in the design of network RTK as it directly affects not only the installation and running costs of the network RTK, but also the Quality of Service (QoS) provided by it. In our preliminary research on CORS placement, we proposed a polynomial heuristic algorithm for a so-called location-based CORS placement problem. From a computational point of view, location-based CORS placement is a large-scale combinatorial optimization problem. Thus, although the heuristic algorithm is efficient in computation time, it may not be able to find an optimal or near-optimal solution. Aiming at improving the quality of solutions, this paper proposes a repairing genetic algorithm (RGA) for the location-based CORS placement problem. The RGA has been implemented and compared to the heuristic algorithm by experiments. Experimental results have shown that the RGA produces better-quality solutions than the heuristic algorithm.
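The abstract names the technique (a genetic algorithm with a repair operator) without detailing it. A generic repairing GA for a placement-style problem can be sketched as follows; the coverage-maximisation formulation, data and parameters are hypothetical illustrations, not the paper's actual model of CORS placement.

```python
import random

# Illustrative repairing-GA sketch (not the paper's algorithm): choose
# exactly k station sites to maximise the number of covered users. The
# repair step restores feasibility (exactly k sites selected) after
# crossover and mutation, which is the defining feature of an RGA.

def repair(ind, k):
    ones = [i for i, g in enumerate(ind) if g]
    while len(ones) > k:                       # too many sites: drop some
        ind[ones.pop(random.randrange(len(ones)))] = 0
    zeros = [i for i, g in enumerate(ind) if not g]
    while sum(ind) < k:                        # too few sites: add some
        ind[zeros.pop(random.randrange(len(zeros)))] = 1
    return ind

def coverage(ind, cover_sets):
    covered = set()
    for i, g in enumerate(ind):
        if g:
            covered |= cover_sets[i]
    return len(covered)

def rga(cover_sets, k, pop_size=20, gens=40, seed=0):
    random.seed(seed)
    n = len(cover_sets)
    pop = [repair([random.randint(0, 1) for _ in range(n)], k)
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: -coverage(ind, cover_sets))
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n)
            child[i] = 1 - child[i]            # bit-flip mutation
            children.append(repair(child, k))  # repair before reinsertion
        pop = parents + children
    return max(pop, key=lambda ind: coverage(ind, cover_sets))

# Hypothetical instance: users covered by each of 5 candidate sites.
sites = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 5}]
best = rga(sites, k=2)
```

Repairing after variation keeps every individual feasible, which is usually cheaper than penalising infeasibility in the fitness function for hard cardinality constraints like "place exactly k stations".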

Relevance: 20.00%

Abstract:

Typical flow fields in a stormwater gross pollutant trap (GPT) with blocked retaining screens were experimentally captured and visualised. Particle image velocimetry (PIV) software was used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. A technique was developed to apply the Image Based Flow Visualization (IBFV) algorithm to the experimental raw dataset generated by the PIV software. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding gross pollutant capture and retention within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate specific areas and identify the flow features within the GPT.
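Texture-based methods such as IBFV advect pixels over a vector field defined on a regular grid, whereas PIV yields scattered point vectors; bridging the two requires a resampling step. The abstract does not describe the authors' resampling, so the sketch below shows one simple possibility (nearest-neighbour gridding) with hypothetical data.

```python
# Sketch of one possible pre-processing step (not necessarily the authors'
# method): resample scattered 2D PIV velocity vectors onto a regular grid
# by nearest-neighbour lookup, so a texture-based method like IBFV can
# advect pixels over it.

def grid_vectors(points, nx, ny, extent):
    """points: list of (x, y, u, v); extent: (xmin, xmax, ymin, ymax).
    Returns grid[j][i] = (u, v) taken from the nearest scattered sample."""
    xmin, xmax, ymin, ymax = extent
    grid = []
    for j in range(ny):
        gy = ymin + (ymax - ymin) * (j + 0.5) / ny   # cell-centre y
        row = []
        for i in range(nx):
            gx = xmin + (xmax - xmin) * (i + 0.5) / nx   # cell-centre x
            x, y, u, v = min(
                points,
                key=lambda p: (p[0] - gx) ** 2 + (p[1] - gy) ** 2)
            row.append((u, v))
        grid.append(row)
    return grid

# Two hypothetical PIV samples in a unit domain, resampled to a 2x2 grid.
pts = [(0.2, 0.2, 1.0, 0.0), (0.8, 0.8, 0.0, -1.0)]
g = grid_vectors(pts, 2, 2, (0.0, 1.0, 0.0, 1.0))
```

In practice a smoother interpolant (e.g. inverse-distance or radial basis) avoids the blocky discontinuities of nearest-neighbour gridding, but the principle is the same.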

Relevance: 20.00%

Abstract:

Image representations derived from simplified models of the primary visual cortex (V1), such as HOG and SIFT, elicit good performance in a myriad of visual classification tasks including object recognition/detection, pedestrian detection and facial expression classification. A central question in the vision, learning and neuroscience communities regards why these architectures perform so well. In this paper, we offer a unique perspective to this question by subsuming the role of V1-inspired features directly within a linear support vector machine (SVM). We demonstrate that a specific class of such features in conjunction with a linear SVM can be reinterpreted as inducing a weighted margin on the Kronecker basis expansion of an image. This new viewpoint on the role of V1-inspired features allows us to answer fundamental questions on the uniqueness and redundancies of these features, and offer substantial improvements in terms of computational and storage efficiency.
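The reinterpretation the abstract describes rests on a simple linear-algebra fact: if the feature extraction is linear (a matrix of V1-like filters), then a linear SVM on the features is equivalent to a plain linear classifier on the raw image with the filters folded into the weights. The sketch below demonstrates that identity with random placeholder values; it is a simplification, since real HOG/SIFT pipelines include nonlinear steps the paper handles more carefully.

```python
import numpy as np

# Identity behind the reinterpretation: for a linear feature map F,
#   w . (F x) + b  ==  (F^T w) . x + b,
# so the feature transform can be absorbed into the SVM weight vector.
# All values here are random placeholders.

rng = np.random.default_rng(0)
x = rng.standard_normal(16)        # raw image, flattened
F = rng.standard_normal((8, 16))   # linear V1-like filter bank
w = rng.standard_normal(8)         # linear-SVM weights in feature space
b = 0.5

score_features = w @ (F @ x) + b   # SVM applied to extracted features
score_folded = (F.T @ w) @ x + b   # features folded into the weights
```

This folding is also the source of the computational and storage savings mentioned: the filtering stage disappears at test time, leaving a single dot product per classifier.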

Relevance: 20.00%

Abstract:

Background: It has been proposed that the feral horse foot is a benchmark model for foot health in horses. However, the foot health of feral horses has not been formally investigated.
Objectives: To investigate the foot health of Australian feral horses and determine if foot health is affected by environmental factors, such as substrate properties and distance travelled.
Methods: Twenty adult feral horses from each of five populations (n = 100) were investigated. Populations were selected on the basis of substrate hardness and the amount of travel typical for the population. Feet were radiographed and photographed, and digital images were surveyed by two experienced assessors blinded to each other's assessment and to the population origin. Lamellar samples from 15 feet from three populations were investigated histologically for evidence of laminitis.
Results: A total of 377 gross foot abnormalities were identified in 100 left forefeet. There were no abnormalities detected in three of the feet surveyed. Each population had a comparable prevalence of foot abnormalities, although the type and severity of abnormality varied among populations. Of the three populations surveyed by histopathology, the prevalence of chronic laminitis ranged between 40% and 93%.
Conclusions: Foot health appeared to be affected by the environment inhabited by the horses. The observed chronic laminitis may be attributable to either nutritional or traumatic causes. Given the overwhelming evidence of suboptimal foot health, it may not be appropriate for the feral horse foot to be the benchmark model for equine foot health.

Relevance: 20.00%

Abstract:

It has been known since Rhodes Fairbridge’s first attempt to establish a global pattern of Holocene sea-level change, by combining evidence from Western Australia and from sites in the northern hemisphere, that the details of sea-level history since the Last Glacial Maximum vary considerably across the globe. The Australian region is relatively stable tectonically and is situated in the ‘far-field’ of former ice sheets. It therefore preserves important records of post-glacial sea levels that are less complicated by neotectonics or glacio-isostatic adjustments. Accordingly, the relative sea-level record of this region is dominantly one of glacio-eustatic (ice equivalent) sea-level changes. The broader Australasian region has provided critical information on the nature of post-glacial sea level, including the termination of the Last Glacial Maximum, when sea level was approximately 125 m lower than present around 21,000–19,000 years BP, and insights into meltwater pulse 1A between 14,600 and 14,300 cal. yr BP. Although most parts of the Australian continent reveal a high degree of tectonic stability, research conducted since the 1970s has shown that the timing and elevation of a Holocene highstand vary systematically around its margin. This is attributed primarily to variations in the timing of the response of the ocean basins and shallow continental shelves to the increased ocean volumes following ice-melt, including a process known as ocean siphoning (i.e. glacio-hydro-isostatic adjustment processes). Several seminal studies in the early 1980s produced important data sets from the Australasian region that have provided a solid foundation for more recent palaeo-sea-level research. This review revisits these key studies, emphasising their continuing influence on Quaternary research, and incorporates relatively recent investigations to interpret the nature of post-glacial sea-level change around Australia.
These include a synthesis of research from the Northern Territory, Queensland, New South Wales, South Australia and Western Australia. A focus of these more recent studies has been the re-examination of: (1) the accuracy and reliability of different proxy sea-level indicators; (2) the rate and nature of post-glacial sea-level rise; (3) the evidence for timing, elevation and duration of mid-Holocene highstands; and (4) the notion of mid- to late Holocene sea-level oscillations, and their basis. Based on this synthesis of previous research, it is clear that estimates of past sea-surface elevation are a function of eustatic factors as well as the morphodynamics of individual sites, the wide variety of proxy sea-level indicators used, their wide geographical range, and their indicative meaning. Some progress has been made in understanding the variability of the accuracy of proxy indicators in relation to their contemporary sea level, the inter-comparison of the variety of dating techniques used, and the nuances of calibration of radiocarbon ages to sidereal years. These issues need to be thoroughly understood before proxy sea-level indicators can be incorporated into credible reconstructions of relative sea-level change at individual locations. Many of the issues that challenged sea-level researchers in the latter part of the twentieth century remain contentious today. Divergent opinions remain about: (1) exactly when sea level attained present levels following the most recent post-glacial marine transgression (PMT); (2) the elevation that sea level reached during the Holocene sea-level highstand; (3) whether sea level fell smoothly from a metre or more above its present level following the PMT; (4) whether sea level remained at these highstand levels for a considerable period before falling to its present position; or (5) whether it underwent a series of moderate oscillations during the Holocene highstand.

Relevance: 20.00%

Abstract:

The control paradigms of distributed generation (DG) sources in the smart grid are realised either by utilising a virtual power plant (VPP) or by employing MicroGrid structures. Both the VPP and the MicroGrid face the problem of controlling the power flow between their constituent DG sources. This study examines this issue for the VPP and proposes novel and improved universal active and reactive power flow controllers for three-phase pulse width modulated voltage source inverters (PWM-VSI) operating in the VPP environment. The proposed controller takes into account all cases of the R-X relationship, allowing it to function in systems operating at high-, medium- (MV) and low-voltage (LV) levels. The proposed control scheme also, for the first time in inverter control, takes into account the capacitance of the transmission line, an important factor in accurately representing medium-length transmission lines. This allows the proposed control scheme to be applied in VPP structures where DG sources operate at MV and LV levels over short/medium-length transmission lines. The authors also conducted a small-signal stability analysis of the proposed controller and compared it against small-signal studies of the existing controllers.
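As background to what such a controller regulates, the textbook per-phase power transfer across a series line impedance Z = R + jX can be computed directly from complex phasors. This is a standard relation the controller must respect, not the paper's control law; the voltage magnitudes, angle and impedance values below are hypothetical.

```python
import cmath

# Textbook per-phase power flow over a series line impedance Z = R + jX:
# sending-end complex power S = V_s * conj(I), with I = (V_s - V_r) / Z.
# This is background physics, not the paper's proposed control scheme.

def sending_end_power(vs_mag, vs_ang_rad, vr_mag, vr_ang_rad, R, X):
    vs = cmath.rect(vs_mag, vs_ang_rad)      # sending-end voltage phasor
    vr = cmath.rect(vr_mag, vr_ang_rad)      # receiving-end voltage phasor
    i = (vs - vr) / complex(R, X)            # line current
    s = vs * i.conjugate()                   # complex power S = P + jQ
    return s.real, s.imag

# Hypothetical case: 1.0 pu sending voltage leading by 10 degrees over a
# mostly inductive line (X >> R), in per-unit quantities.
P, Q = sending_end_power(1.0, 0.1745, 1.0, 0.0, 0.05, 0.5)
```

For X >> R the active power is governed mainly by the angle difference and reactive power by the magnitude difference; when R is comparable to X, as in LV feeders, the two couple, which is why the abstract stresses handling all R-X cases.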

Relevance: 20.00%

Abstract:

An experimental dataset representing a typical flow field in a stormwater gross pollutant trap (GPT) was visualised. A technique was developed to apply the image-based flow visualisation (IBFV) algorithm to the raw dataset. Particle image velocimetry (PIV) software was previously used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding stormwater pollutant capture and retention behaviour within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate the possible flow paths of pollutants entering the GPT. The investigated flow paths were compared with the behaviour of pollutants monitored during experiments.