967 results for Eliminate lost time
Abstract:
The primary objective of this study is to develop a robust queue estimation algorithm for motorway on-ramps. Real-time queue information is the most vital input for dynamic queue management that can treat long queues on metered on-ramps more effectively. The proposed algorithm is developed within the Kalman filter framework. The fundamental conservation model is used to predict the system state (queue size) from the flow-in and flow-out measurements. These predictions are then updated through the measurement equation using the time occupancies from mid-link and link-entrance loop detectors. This study also proposes a novel single-point correction method, which resets the estimated system state to eliminate the counting errors that accumulate over time. In the performance evaluation, the proposed algorithm demonstrated accurate and reliable performance and consistently outperformed the benchmark Single Occupancy Kalman filter (SOKF) method. The improvements over SOKF average 62% in estimation accuracy (MAE) and 63% in reliability (RMSE). The benefit of the algorithm's innovative concepts is well justified by the improved estimation performance in congested ramp traffic conditions, where long queues may significantly compromise the benchmark algorithm's performance.
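To make the filter structure concrete, here is a minimal scalar sketch of the conservation-plus-correction loop the abstract describes. The occupancy-to-queue coefficient, the noise variances, and the omission of the paper's single-point correction step are illustrative assumptions, not the authors' implementation:

```python
def estimate_queue(flow_in, flow_out, occupancy, c=0.02, q=4.0, r=0.01):
    """Illustrative scalar Kalman filter for on-ramp queue size.

    flow_in/flow_out : vehicles entering/leaving the ramp per interval
    occupancy        : mid-link time occupancy (0..1) per interval
    c                : assumed occupancy contributed per queued vehicle
    q, r             : process/measurement noise variances (tuning knobs)
    The paper's single-point correction (resetting the state when the
    queue tail is detected) is omitted here for brevity.
    """
    n, p = 0.0, 1.0                      # queue estimate and its variance
    estimates = []
    for f_in, f_out, occ in zip(flow_in, flow_out, occupancy):
        # Prediction: conservation model, queue grows by net inflow
        n = max(n + f_in - f_out, 0.0)
        p = p + q
        # Update: occupancy measurement modelled as occ = c * n + noise
        k = p * c / (c * p * c + r)      # Kalman gain
        n = max(n + k * (occ - c * n), 0.0)
        p = (1.0 - k * c) * p
        estimates.append(n)
    return estimates
```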
Abstract:
Objective: To estimate the time spent by researchers preparing grant proposals, and to examine whether spending more time increases the chances of success. Design: Observational study. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant proposals in March 2012. Main outcome measures: Total researcher time spent preparing proposals; funding success as predicted by the time spent. Results: The NHMRC received 3727 proposals, of which 3570 were reviewed and 731 (21%) were funded. Among our 285 participants who submitted 632 proposals, 21% were successful. Preparing a new proposal took an average of 38 working days of researcher time and a resubmitted proposal took 28 working days, an overall average of 34 days per proposal. An estimated 550 working years of researchers' time (95% CI 513 to 589) was spent preparing the 3727 proposals, which translates into annual salary costs of AU$66 million. More time spent preparing a proposal did not increase the chances of success for the lead researcher (prevalence ratio (PR) of success for a 10-day increase = 0.91, 95% credible interval 0.78 to 1.04) or other researchers (PR = 0.89, 95% CI 0.67 to 1.17). Conclusions: Considerable time is spent preparing NHMRC Project Grant proposals. As success rates are historically 20–25%, much of this time has no immediate benefit to either the researcher or society, and there are large opportunity costs in lost research output. The application process could be shortened so that only information relevant for peer review, not administration, is collected. This would have little impact on the quality of peer review, and the time saved could be reinvested into research.
Abstract:
Health complaint commissions in Australia: Time for a national approach
• There is considerable variation between jurisdictions in the ways complaint data are defined, collected and recorded by the health complaint commissions.
• Complaints from the public are an important accountability mechanism and an indicator of service quality.
• The lack of a consistent approach leads to fragmentation of complaint data and a lost opportunity to use national data to assist policy development and identify the main areas causing consumers to complain.
• We need a national approach to complaints data collection by the health complaint commissions in order to better respond to patients' concerns.
Abstract:
Lean strategies have been developed to eliminate or reduce waste and thus improve operational efficiency in a manufacturing environment. In practice, however, manufacturers encounter difficulties in selecting appropriate lean strategies within their resource constraints and in quantitatively evaluating the perceived value of manufacturing waste reduction. This paper presents a methodology developed to quantitatively evaluate the contribution of lean strategies selected to reduce manufacturing wastes within the manufacturer's resource (time) constraints. A mathematical model has been developed for evaluating the perceived value of lean strategies for manufacturing waste reduction, and a step-by-step methodology is provided for selecting appropriate lean strategies to improve manufacturing performance within those resource constraints. A computer program has been developed in MATLAB for finding the optimum solution. The proposed methodology and the developed model have been validated with the help of a case study. A 'lean strategy-wastes' correlation matrix has been proposed to establish the relationship between manufacturing wastes and lean strategies. Using the correlation matrix and applying the proposed methodology and the developed mathematical model, the authors obtained the optimised perceived value of reducing a manufacturer's wastes by implementing appropriate lean strategies within the manufacturer's resource constraints. The results also demonstrate that the perceived value of manufacturing waste reduction can change significantly depending on the policies and product strategy adopted by a manufacturer. The proposed methodology can also be used in dynamic situations by changing the input to the MATLAB program. By identifying appropriate lean strategies for specific manufacturing wastes, a manufacturer can better prioritise implementation efforts and resources to maximise the success of implementing lean strategies in their organisation.
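As a rough illustration of the selection problem the methodology addresses, the sketch below exhaustively searches strategy subsets for the highest total perceived waste-reduction value within a time budget. The strategy names, 'lean strategy-wastes' matrix entries, and time costs are invented placeholders, and the exhaustive search stands in for the paper's MATLAB optimisation:

```python
from itertools import combinations

# Hypothetical 'lean strategy-wastes' correlation matrix: each strategy
# maps to (value per waste category, implementation time cost).
# All numbers are illustrative only.
strategies = {"5S":     ([3, 1, 0, 2], 10),
              "Kanban": ([1, 4, 2, 0], 15),
              "SMED":   ([0, 2, 4, 1], 20),
              "TPM":    ([2, 0, 1, 3], 25)}

def best_selection(budget):
    """Pick the strategy subset with the highest total perceived
    waste-reduction value whose total time cost fits the budget."""
    best, best_value = (), 0
    names = list(strategies)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(strategies[s][1] for s in subset)
            if cost > budget:
                continue
            value = sum(sum(strategies[s][0]) for s in subset)
            if value > best_value:
                best, best_value = subset, value
    return best, best_value

print(best_selection(budget=40))   # (('Kanban', 'SMED'), 14)
```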
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or data which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are pre-formed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
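As a concrete, if simplified, illustration of the first paper's content-analysis and user-profiling idea, the sketch below scores incoming tweets by keyword weight, author authority, and amplification, then filters the stream down to responder capacity. All keyword weights, handles, and thresholds are invented for illustration, not values from the papers:

```python
# Illustrative content-analysis filter for an incoming tweet stream.
KEYWORD_WEIGHTS = {"flood": 3, "trapped": 5, "evacuate": 4, "help": 2}
AUTHORITATIVE = {"bom_au", "nswses"}          # hypothetical handles

def score_tweet(tweet):
    """Score a tweet dict {'text': str, 'author': str, 'retweets': int}."""
    text = tweet["text"].lower()
    # Content analysis: weighted keyword hits approximate topic relevance
    score = sum(w for kw, w in KEYWORD_WEIGHTS.items() if kw in text)
    # User profiling: known authoritative sources get a fixed boost
    if tweet["author"] in AUTHORITATIVE:
        score += 5
    # Amplification signal, capped so virality cannot dominate
    score += min(tweet["retweets"], 50) / 10
    return score

def triage(stream, threshold=6, capacity=20):
    """Keep only the highest-scoring tweets, matching responder capacity."""
    flagged = [t for t in stream if score_tweet(t) >= threshold]
    return sorted(flagged, key=score_tweet, reverse=True)[:capacity]
```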
Abstract:
Purpose: Individuals who experience stroke have a higher likelihood of subsequent stroke events, making it imperative to plan for future medical care. In the event of a further serious health event, engaging in the process of advance care planning (ACP) can help family members and health care professionals (HCPs) make medical decisions for individuals who have lost the capacity to do so. Few studies have explored the views and experiences of patients with stroke about discussing their wishes and preferences for future medical events, or the extent to which stroke HCPs engage in conversations around planning for such events. In this study, we sought to understand how the process of ACP unfolded between HCPs and patients post-stroke. Patients and methods: Using grounded theory (GT) methodology, we engaged in direct observation of HCP and patient interactions on an acute stroke unit and two stroke rehabilitation units. Using semi-structured interviews, 14 patients and four HCPs were interviewed directly about the ACP process. Results: We found that open and continual ACP conversations were not taking place, that patients experienced an apparent lack of urgency to engage in ACP, and that HCPs were uncomfortable initiating ACP conversations due to the sensitive nature of the topic. Conclusion: In this study, we identified a lack of engagement in ACP post-stroke, attributable to both patient and HCP factors. This encourages us to look further into the process of ACP in order to develop open communication between the patient with stroke, their families, and stroke HCPs.
Abstract:
Considering ultrasound propagation through complex composite media as an array of parallel sonic rays, a comparison of computer-simulated predictions with experimental data has previously been reported for transmission mode (where one transducer serves as transmitter, the other as receiver) in a series of ten acrylic step-wedge samples, immersed in water, exhibiting varying degrees of transit time inhomogeneity. In this study, the same samples were used but in pulse-echo mode, where the same ultrasound transducer served as both transmitter and receiver, detecting both ‘primary’ (internal sample interface) and ‘secondary’ (external sample interface) echoes. A transit time spectrum (TTS) was derived, describing the proportion of sonic rays with a particular transit time. A computer simulation was performed to predict the transit time and amplitude of the various echoes created, and compared with experimental data. Applying an amplitude-tolerance analysis, 91.7±3.7% of the simulated data were within ±1 standard deviation (STD) of the experimentally measured amplitude-time data. Correlation of predicted and experimental transit time spectra provided coefficients of determination (R²) ranging from 96.8% to 100.0% for the various samples tested. The results acquired from this study provide good evidence for the concept of parallel sonic rays. Further, deconvolution of the experimental input and output signals has been shown to provide an effective method to identify echoes otherwise lost due to phase cancellation. Potential applications of pulse-echo ultrasound transit time spectroscopy (PE-UTTS) include improvement of ultrasound image fidelity by improving spatial resolution and reducing phase interference artefacts.
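The deconvolution step mentioned above can be illustrated with a simple frequency-domain (Wiener-style) deconvolution. This is a sketch under the assumptions of a known emitted pulse, a noiseless synthetic signal, and a hand-picked regularisation constant, not the study's actual processing chain:

```python
import numpy as np

def wiener_deconvolve(received, emitted, eps=1e-3):
    """Estimate the echo impulse response h where received = emitted * h.

    The spectral division is regularised by eps so that noise is not
    amplified at frequencies where the emitted pulse has little energy.
    """
    n = len(received)
    E = np.fft.rfft(emitted, n)
    R = np.fft.rfft(received, n)
    H = R * np.conj(E) / (np.abs(E) ** 2 + eps)
    return np.fft.irfft(H, n)

# Synthetic check: two overlapping echoes at samples 40 and 52
t = np.arange(256)
pulse = np.exp(-0.5 * ((t - 10) / 3.0) ** 2) * np.sin(0.8 * t)
h_true = np.zeros(256)
h_true[40], h_true[52] = 1.0, -0.6
received = np.convolve(pulse, h_true)[:256]
h_est = wiener_deconvolve(received, pulse)
print(np.argsort(np.abs(h_est))[-2:])  # strongest peaks, near 40 and 52
```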
Abstract:
Objective: To examine if streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.
Abstract:
In this note, the fallacy is pointed out in the method given by Sharma and Swarup, in their paper on the time-minimising transportation problem, for determining the set S_hk of all nonbasic cells which, when introduced into the basis, would either eliminate a given basic cell (h, k) from the basis or reduce the amount x_hk.
Abstract:
Arguments arising from quantum mechanics and gravitation theory, as well as from string theory, indicate that the description of space-time as a continuous manifold is not adequate at very short distances. An important candidate for the description of space-time at such scales is provided by noncommutative space-time, where the coordinates are promoted to noncommuting operators. Thus, the study of quantum field theory in noncommutative space-time provides an interesting interface where ordinary field-theoretic tools can be used to study the properties of quantum space-time. The three original publications in this thesis encompass various aspects of the still-developing area of noncommutative quantum field theory, ranging from fundamental concepts to model building. One of the key features of noncommutative space-time is the apparent loss of Lorentz invariance, which has been addressed in different ways in the literature. One recently developed approach is to eliminate the Lorentz-violating effects by integrating over the parameter of noncommutativity. Fundamental properties of such theories are investigated in this thesis. Another issue addressed is model building, which is difficult in the noncommutative setting due to severe restrictions on the possible gauge symmetries imposed by the noncommutativity of the space-time. Possible ways to relieve these restrictions are investigated and applied, and a noncommutative version of the Minimal Supersymmetric Standard Model is presented. While putting the results obtained in the three original publications into their proper context, the introductory part of this thesis aims to provide an overview of the present situation in the field.
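For reference, the standard way coordinates are 'promoted to noncommuting operators' in this setting is the canonical (Moyal-type) commutation relation, realised on ordinary fields through the star product; the thesis's precise conventions may differ:

```latex
% Canonical noncommutative space-time: coordinate operators satisfy
\[
  [\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
\]
% with constant antisymmetric theta, implemented on ordinary fields
% via the Moyal star product:
\[
  (f \star g)(x) = f(x)\,
    \exp\!\left(\frac{i}{2}\,\overleftarrow{\partial}_{\mu}\,
    \theta^{\mu\nu}\,\overrightarrow{\partial}_{\nu}\right) g(x).
\]
```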
Abstract:
This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
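A minimal sketch of the underlying construction: a quantile residual is the inverse standard normal CDF applied to the model's distribution function evaluated at the observation. The snippet below shows this for a static two-component Gaussian mixture with made-up fitted parameters, a deliberate simplification of the conditional time series models the thesis treats:

```python
import numpy as np
from scipy import stats

def quantile_residuals(y, weights, means, sigmas):
    """Quantile residuals r_t = Phi^{-1}( F(y_t; theta_hat) ) for an
    (already fitted) Gaussian mixture.

    Under a correctly specified model with consistently estimated
    parameters, the residuals are approximately iid N(0, 1), which is
    the property the thesis's misspecification tests build on.
    """
    u = sum(w * stats.norm.cdf(y, loc=m, scale=s)
            for w, m, s in zip(weights, means, sigmas))   # mixture CDF
    u = np.clip(u, 1e-12, 1 - 1e-12)                      # numerical safety
    return stats.norm.ppf(u)

# Illustrative use with hypothetical fitted parameters
y = np.random.default_rng(0).normal(size=500)
r = quantile_residuals(y, weights=[0.7, 0.3], means=[0.0, 0.5],
                       sigmas=[1.0, 2.0])
print(r.mean(), r.std())   # roughly 0 and 1 if the model fits the data
```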
Abstract:
The importance of lying behavior to dairy cows, and the feasibility of defining lying precisely, have attracted many studies on the subject. Cattle show both behavioral and physiological stress responses when subjected to thwarting of their lying behavior. If cows are unable to lie down, they later compensate for lost lying time when possible. Environmental factors such as housing and bedding systems have been noted to affect the time spent lying, but there is usually large variation in lying time between individuals. Internal factors such as the reproductive stage, age and health of cows affect their lying time and can cause variation. However, the effect of higher milk production on behavior has not previously been examined. The objective of this study was to provide data applicable to the improvement of the resting conditions of cows. The preference for stall surface materials, differences in normal behavior per unit time, and various health measures were observed. The aim was to evaluate lying behavior and cow comfort on different stall bedding materials. In addition, the effect of milk yield on behavior was examined in a tie-stall experiment. The preferences for surface materials were investigated in 5 experiments using 3 surface materials with bedding manipulations. According to the results, the cows preferred abundant straw bedding and soft rubber mats, but showed an aversion to sand bedding. Some individuals even refused to use stalls with sand when no organic bedding material was present. This study was unable to determine the reason for the avoidance, as neither the sand particle size nor its thermal properties appeared critical. Previous exposure to particular surface materials, however, increased the preference for them. The amount of straw bedding was found to be an important factor affecting the preferences for stalls, and the lying time in stalls increased when the flooring softness was improved by applying straw or by installing elastic mats. Despite sand being the least preferred flooring material in the preference tests, leg health improved during exposure to sand-floored stalls. Moreover, cows using sand were cleaner than those that used straw stalls. Thus, sand bedding entailed some health benefits despite the contradictory results of the preference tests, which more strongly reflected the perceptions of individual animals. Milk yield was observed to affect behavior by reducing the lying time, possibly due to factors other than a longer duration of eating. High-yielding cows seemed to intensify their lying bouts, as they were observed to lie with the neck muscles relaxed sooner after lying down than lower-yielding cows. In conclusion, cows were found to prefer softer stall surface materials and organic bedding material. In addition, lying time was reduced by a high milk yield, although lying time seemed to be important for resting. Cows might differ in their needs for the lying environment. The management of dairy cows should eliminate any unnecessary prevention of lying, as even in tie-stalls high-yielding cows seem to be affected by time constraints. Adding fresh bedding material to stalls increases the comfort of any stall flooring material.
Abstract:
One major concern raised in the 2006 report of the Ministry of Social Affairs and Health on developing care for the elderly and geriatric pharmacotherapy was the many shortcomings in the medication care of older people, such as deficiencies and problems in the medication competence of nurses caring for the elderly. One way to improve the medication competence of the various parties caring for the elderly is continuing education, to which all social and health care professional groups have both a right and an obligation. Continuing education can also be used to develop the operations of organisations and to produce new, better services. This study examined the training needs related to the medication care of the elderly in the home care units of LOST, the joint social and health care area formed by Lohja, Siuntio, Inkoo and Karjalohja. The study deepened the results of a questionnaire survey conducted with the same study group. The study material consisted of questionnaire data collected from the nurses (n=150) of the home care units in the LOST area as part of a pharmacist's thesis, together with separate group discussions held with employees (n=6) and supervisors (n=6). In addition, to broaden the perspective and emphasise multiprofessionalism, themed interviews with the physicians (n=4) treating home care clients were used as material. From the questionnaire data, the training needs of registered nurses, practical nurses and home aides were analysed separately, and the same issues emerged in the results of each professional group. The most important theoretical training topics related to medication competence that emerged from the questionnaire were the pharmacokinetics of the elderly and the special characteristics of medicine use in old age, drug effects, adverse drug effects, and drug interactions and compatibility. Among theoretical skills, accuracy and carefulness at work, related to nursing ethics, also emerged. Of the practical skills, the most important training topics that emerged from the questionnaire were monitoring clients' medication and condition, dispensing medicines, and, with regard to administration, ensuring the right medicine and strength, the right dose, the right time and the right routes of administration. The group discussions and the physicians' themed interviews were used to seek a deeper understanding of the questionnaire results. One of the most important findings of this qualitative study concerned the shortcomings in collaboration within home care. Based on the results, the implementation and monitoring of medication care could be improved through joint training of physicians and home care nurses. The most important diseases or symptoms for which the nurses would like shared care practices are diabetes, cardiovascular diseases, pain, memory disorders and mental illnesses. In addition, the special characteristics of medication care for the elderly, routes of administration and dosage forms emerged as training topics. Finally, a synthesis was made of the results of the questionnaire, the group discussions and the physicians' themed interviews, resulting in a joint needs-based continuing education plan compiled for the home care nursing staff and the home care physicians of the LOST area. The plan was drawn up on the basis of the training topics that emerged from the data, and no topics from outside the study were added to it.
Abstract:
The key requirements for enabling real-time remote healthcare services on a mobile platform, in the present-day heterogeneous wireless access network environment, are uninterrupted and continuous access to online patient vital medical data, the ability to monitor the physical condition of the patient through video streaming, and so on. For an application, this continuity has to be sufficiently transparent from both a performance perspective and a Quality of Experience (QoE) perspective. While mobility protocols (MIPv6, HIP, SCTP, DSMIP, PMIP, and SIP) strive to provide both, the limited availability or outright absence of these protocols on provider networks and server-side infrastructure has impeded the adoption of mobility on end-user platforms. Add to this the cumbersome OS configuration procedures required to enable mobility protocol support on end-user devices, and the user's enthusiasm to add this support is lost. Given the lack of mobility implementations that meet the remote healthcare requirements above, we propose SeaMo+, which comprises a lightweight application-layer framework, termed the Virtual Real-time Multimedia Service (VRMS), for mobile devices to provide uninterrupted real-time multimedia information access to the mobile user. VRMS is easy to configure, platform independent, and, unlike other existing schemes, does not require additional network infrastructure. We illustrate the working of SeaMo+ in two realistic remote patient monitoring application scenarios.
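A toy sketch of the kind of application-layer session continuity such a framework provides: the client keeps a session identifier and byte offset, and after any connection loss (for instance a WiFi-to-cellular handover) it reconnects and resumes the stream where it left off. The one-line resume message and its format are invented for illustration and are not the VRMS wire protocol:

```python
import socket
import time

def fetch_stream(host, port, session_id):
    """Toy application-layer mobility loop for a vital-signs stream.

    On any connection loss the client reconnects and asks the server to
    resume from the last byte received; session state survives the
    network change because it lives in the application, not the socket.
    """
    received = 0
    buffer = bytearray()
    while True:
        try:
            with socket.create_connection((host, port), timeout=5) as s:
                # Hypothetical resume handshake: "RESUME <session> <offset>"
                s.sendall(f"RESUME {session_id} {received}\n".encode())
                while True:
                    chunk = s.recv(4096)
                    if not chunk:
                        return bytes(buffer)   # server closed: stream done
                    buffer += chunk
                    received += len(chunk)
        except OSError:
            time.sleep(1)   # interface change or dropout: retry, state kept
```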
Abstract:
This paper deals with the convergence of a remote iterative learning control system subject to data dropouts. The system is composed of a set of discrete-time multiple-input multiple-output linear models, each with its corresponding actuator device and sensor. Each actuator applies the input signal vector to its corresponding model at the sampling instants, and the sensor measures the output signal vector. The iterative learning law is processed in a controller located far away from the models, so the control signal vector has to be transmitted from the controller to the actuators through transmission channels. The law uses the measurements of each model to generate the input vector to be applied to the subsequent model, so the measurements of the models have to be transmitted from the sensors to the controller. All transmissions are subject to failures, which are described as a binary sequence taking the value 1 or 0. A dropout compensation technique is used to replace the data lost in the transmission processes. Convergence to zero of the errors between the output signal vector and a reference is achieved as the number of models tends to infinity.
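A minimal numerical sketch of the setting: a P-type iterative learning law with Bernoulli dropouts on both channels, where lost control samples are replaced by the last value that reached the actuator and lost measurements are treated as zero error. A single toy scalar plant stands in for the paper's sequence of models, and all gains and rates are arbitrary choices, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(1)
T, trials, p_drop = 50, 30, 0.2   # horizon, ILC iterations, loss rate
a, b = 0.9, 1.0                   # toy scalar plant x+ = a*x + b*u
L = 0.5                           # learning gain
ref = np.sin(np.linspace(0, 2 * np.pi, T))

u = np.zeros(T)                   # input profile refined per iteration
u_held = np.zeros(T)              # last input that reached the actuator
for k in range(trials):
    # Controller -> actuator link: dropped samples reuse the held value
    got = rng.random(T) > p_drop
    u_held = np.where(got, u, u_held)
    # Run the plant for one trial with the (possibly compensated) input
    x, y = 0.0, np.zeros(T)
    for t in range(T):
        x = a * x + b * u_held[t]
        y[t] = x
    # Sensor -> controller link: lost measurements give zero apparent
    # error, then P-type ILC update u_{k+1} = u_k + L * e_k
    got_meas = rng.random(T) > p_drop
    e = np.where(got_meas, ref - y, 0.0)
    u = u + L * e
print(float(np.max(np.abs(ref - y))))  # tracking error shrinks over trials
```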