Abstract:
Background The prevalence of type 2 diabetes is rising internationally. Patients with diabetes have a higher risk of cardiovascular events, accounting for substantial premature morbidity and mortality, and healthcare expenditure. Given healthcare workforce limitations, there is a need to improve interventions that promote positive self-management behaviours that enable patients to manage their chronic conditions effectively, across different cultural contexts. Previous studies have evaluated the feasibility of including telephone and Short Message Service (SMS) follow-up in chronic disease self-management programs, but only for single diseases or in one specific population. Therefore, the aim of this study is to evaluate the feasibility and short-term efficacy of incorporating telephone and text messaging to support the care of patients with diabetes and cardiac disease, in Australia and in Taiwan. Methods/design A randomised controlled trial design will be used to evaluate a self-management program for people with diabetes and cardiac disease that incorporates the use of simple remote-access communication technologies. A sample size of 180 participants from Australia and Taiwan will be recruited and randomised in a one-to-one ratio to receive either the intervention in addition to usual care (intervention) or usual care alone (control). The intervention will consist of in-hospital education as well as follow-up utilising personal telephone calls and SMS reminders. Primary short-term outcomes of interest include self-care behaviours and self-efficacy assessed at baseline and four weeks. Discussion If the results of this investigation substantiate the feasibility and efficacy of the telephone and SMS intervention for promoting self-management among patients with diabetes and cardiac disease in Australia and Taiwan, it will support the external validity of the intervention. It is anticipated that empirical data from this investigation will provide valuable information to inform future international collaborations, while providing a platform for further enhancements of the program, which has potential to benefit patients internationally.
Abstract:
Following eco-driving instructions can reduce fuel consumption by 5 to 20% on urban roads with manual cars. The majority of Australian cars have an automatic transmission gearbox. It is therefore of interest to verify whether current eco-driving instructions are efficient for such vehicles. In this pilot study, participants (N=13) drove an instrumented vehicle (Toyota Camry 2007) with an automatic transmission. Fuel consumption of the participants was compared before and after they received simple eco-driving instructions. Participants drove the same vehicle on the same urban route under similar traffic conditions. We found that participants drove at similar speeds during their baseline and eco-friendly drives, and reduced the level of their accelerations and decelerations during eco-driving. Fuel consumption decreased by 7% over the complete drive, but not on the motorway and inclined sections of the route. Gas emissions were estimated with the VT-micro model, and emissions of the studied pollutants (CO2, CO, NOx and HC) were reduced, but no difference was observed for CO2 on the motorway and inclined sections. The reduction over the complete lap was 3% for CO2. We found evidence that simple eco-driving instructions are effective in the case of automatic transmissions in an urban environment, although the gains lie towards the lower end of the range of fuel consumption reductions reported by other eco-driving studies.
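For readers unfamiliar with the VT-micro model used above, the sketch below illustrates its general structure in Python: an emission rate expressed as the exponential of a polynomial in instantaneous speed and acceleration, summed over a drive cycle. The coefficient matrix and the speed/acceleration traces are placeholders for illustration, not the calibrated values or data from this study.

```python
import numpy as np

# Minimal sketch of the VT-micro emission model structure:
#   ln(MOE) = sum_{i=0..3} sum_{j=0..3} K[i, j] * s**i * a**j
# where s is speed and a is acceleration. The coefficients K are
# pollutant-specific and calibrated separately for a >= 0 and a < 0;
# the matrix below is a placeholder for illustration only.
K_PLACEHOLDER = np.full((4, 4), 1e-5)

def vt_micro_rate(speed_kmh, accel_ms2, coeffs=K_PLACEHOLDER):
    """Return an instantaneous emission rate for one speed/acceleration sample."""
    powers_s = np.array([speed_kmh ** i for i in range(4)])
    powers_a = np.array([accel_ms2 ** j for j in range(4)])
    log_rate = powers_s @ coeffs @ powers_a
    return np.exp(log_rate)

# Integrate over a drive cycle sampled at 1 Hz to compare baseline vs eco-driving.
speeds = np.array([30.0, 35.0, 40.0, 38.0])   # km/h, example trace only
accels = np.array([0.5, 0.4, -0.2, -0.3])     # m/s^2, example trace only
total_emission = sum(vt_micro_rate(s, a) for s, a in zip(speeds, accels))
print(total_emission)
```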
Abstract:
Distraction resulting from mobile phone use whilst driving has been shown to increase the reaction times of drivers, thereby increasing the likelihood of a crash. This study compares the effects of mobile phone conversations on reaction times of drivers responding to traffic events that occur at different points in a driver’s field of view. The CARRS-Q Advanced Driving Simulator was used to test a group of young drivers on various simulated driving tasks, including a traffic event that occurred within the driver’s central vision—a lead vehicle braking suddenly—and an event that occurred within the driver’s peripheral vision—a pedestrian entering a zebra crossing from a footpath. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), and while engaged in hands-free and handheld phone conversations. The drivers were aged between 21 and 26 years and split evenly by gender. Differences in reaction times for an event in a driver’s central vision were not statistically significant across phone conditions, probably due to a lower speed selection by the distracted drivers. In contrast, the reaction times to detect an event that originated in a distracted driver’s peripheral vision were more than 50% longer compared to the baseline condition. A further statistical analysis revealed that the deterioration of reaction times to an event in the peripheral vision was greatest for distracted drivers holding a provisional licence. Many critical events originate in a driver’s periphery, including vehicles, bicyclists, and pedestrians emerging from side streets. A reduction in the ability to detect these events while distracted presents a significant safety concern that must be addressed.
Abstract:
The use of mobile phones while driving is more prevalent among young drivers—a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision whilst engaged in a mobile phone conversation. The CARRS-Q Advanced Driving Simulator was used to test a sample of young drivers on various simulated driving tasks, including an event that originated within the driver’s peripheral vision, whereby a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were aged between 21 and 26 years and split evenly by gender. Drivers’ reaction times to a pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Two different model specifications were also tested to account for the structured heterogeneity arising from the repeated measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best fitting model and identified four significant variables influencing reaction times: phone condition, driver’s age, license type (provisional license holder or not), and self-reported frequency of usage of handheld phones while driving. The reaction times of drivers were more than 40% longer in the distracted condition compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders. A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated.
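As a minimal sketch of the modelling step described above, the snippet below fits a Weibull accelerated failure time model to synthetic reaction-time data using the lifelines package (assumed to be installed). The column names and data are illustrative only, and the gamma frailty term used in the paper to capture repeated-measures heterogeneity is not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter  # assumes the lifelines package is installed

rng = np.random.default_rng(0)
n = 96  # e.g. 32 drivers x 3 phone conditions, purely synthetic here

# Hypothetical covariates, not the study's dataset.
handheld = np.tile([0, 1, 0], n // 3)
hands_free = np.tile([0, 0, 1], n // 3)
provisional = rng.integers(0, 2, n)
age = rng.integers(21, 27, n)

# Synthetic reaction times that get longer under distraction.
scale = 1.2 * np.exp(0.4 * handheld + 0.3 * hands_free + 0.2 * provisional)
reaction_time = scale * rng.weibull(2.0, n)

df = pd.DataFrame({
    "reaction_time_s": reaction_time,
    "observed": 1,                 # 1 = event observed (no censoring)
    "handheld": handheld,
    "hands_free": hands_free,
    "provisional": provisional,
    "age": age,
})

# Weibull AFT model: covariates act multiplicatively on the time scale,
# so exp(coef) > 1 means longer (slower) reaction times.
aft = WeibullAFTFitter()
aft.fit(df, duration_col="reaction_time_s", event_col="observed")
aft.print_summary()
```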
Abstract:
This article examines the conditions of penal hope behind suggestions that the penal expansionism of the last three decades may be at a ‘turning point’. The article proceeds by outlining David Green’s (2013b) suggested catalysts of penal reform and considers how applicable they are in the Australian context. Green’s suggested catalysts are: the cycles and saturation thesis; shifts in the dominant conception of the offender; the global financial crisis (GFC) and budgetary constraints; the drop in crime; the emergence of the prisoner re‐entry movement; apparent shifts in public opinion; the influence of evangelical Christian ideas; and the Right on Crime initiative. The article then considers a number of other possible catalysts or forces: the role of trade unions; the role of courts; the emergence of recidivism as a political issue; the influence of ‘evidence based’/‘what works’ discourse; and the emergence of justice reinvestment (JR). The article concludes with some comments about the capacity of criminology and criminologists to contribute to penal reductionism, offering an optimistic assessment for the prospects of a reflexive criminology that engages in and engenders a wider politics around criminal justice issues.
Abstract:
The ability of the technique of large-amplitude Fourier transformed (FT) ac voltammetry to facilitate the quantitative evaluation of electrode processes involving electron transfer and catalytically coupled chemical reactions has been evaluated. Predictions derived on the basis of detailed simulations imply that the rate of electron transfer is crucial, as confirmed by studies on the ferrocenemethanol (FcMeOH)-mediated electrocatalytic oxidation of ascorbic acid. Thus, at glassy carbon, gold, and boron-doped diamond electrodes, the introduction of the coupled electrocatalytic reaction, while producing significantly enhanced dc currents, does not affect the ac harmonics. This outcome is as expected if the FcMeOH0/+ process remains fully reversible in the presence of ascorbic acid. In contrast, the ac harmonic components available from FT-ac voltammetry are predicted to be highly sensitive to the homogeneous kinetics when an electrocatalytic reaction is coupled to a quasi-reversible electron-transfer process. The required quasi-reversible scenario is available at an indium tin oxide electrode. Consequently, reversible potential, heterogeneous charge-transfer rate constant, and charge-transfer coefficient values of 0.19 V vs Ag/AgCl, 0.006 cm s⁻¹ and 0.55, respectively, along with a second-order homogeneous chemical rate constant of 2500 M⁻¹ s⁻¹ for the rate-determining step in the catalytic reaction were determined by comparison of simulated responses and experimental voltammograms derived from the dc and first to fourth ac harmonic components generated at an indium tin oxide electrode. The theoretical concepts derived for large-amplitude FT ac voltammetry are believed to be applicable to a wide range of important solution-based mediated electrocatalytic reactions.
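The ac harmonic components referred to above are typically recovered by Fourier transforming the measured current, selecting a band around each multiple of the perturbation frequency, and inverse transforming. The sketch below illustrates only that extraction step on a synthetic, nonlinearly distorted current; it does not simulate the coupled catalytic mechanism, and all signal parameters are assumed values.

```python
import numpy as np
from numpy.fft import rfft, irfft, rfftfreq
from scipy.signal import hilbert

# Synthetic stand-in for a measured FT ac voltammetry current: a slow dc ramp
# plus the distorted response to a large-amplitude sine perturbation; the
# nonlinearity generates higher harmonics of the perturbation frequency.
fs = 1000.0                      # sampling rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)     # 20 s record
f_ac = 9.0                       # ac perturbation frequency, Hz (assumed)
i_total = 0.05 * t + np.tanh(2 * np.sin(2 * np.pi * f_ac * t))

def extract_harmonic(signal, n, f0, fs, half_band=2.0):
    """Band-select the n-th harmonic in the frequency domain, inverse
    transform, and return its envelope via the analytic signal."""
    spectrum = rfft(signal)
    freqs = rfftfreq(len(signal), 1 / fs)
    band = np.abs(freqs - n * f0) <= half_band
    harmonic = irfft(np.where(band, spectrum, 0), n=len(signal))
    return np.abs(hilbert(harmonic))

# The odd nonlinearity above produces odd harmonics, so inspect the third.
third_harmonic_envelope = extract_harmonic(i_total, 3, f_ac, fs)
print(third_harmonic_envelope.max())
```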
Abstract:
In order to establish the influence of the drying air characteristics on the drying performance and fluidization quality of bovine intestine for pet food, several drying tests were carried out in a laboratory-scale heat pump assisted fluidized bed dryer. Bovine intestine samples were heat pump fluidized bed dried at atmospheric pressure, at temperatures below and above the material's freezing point, in a dryer equipped with a continuous monitoring system. The investigation of the drying characteristics was conducted over the temperature range −10 to 25 °C and air velocities in the range 1.5–2.5 m/s. Some experiments were conducted as single-temperature drying experiments and others as two-stage drying experiments employing two temperatures. An Arrhenius-type equation was used to interpret the influence of the drying air temperature on the effective diffusivity, calculated with the method of slopes, in terms of activation energy, which was found to be sensitive to temperature. The effective diffusion coefficient of moisture transfer was determined by the Fickian method, assuming one-dimensional moisture movement, for both moisture removal by evaporation alone and by combined sublimation and evaporation. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported. Bovine particles were characterized according to the Geldart classification, and the minimum fluidization velocity was calculated using the Ergun equation and a generalized equation for all drying conditions at the beginning and end of the trials. Wallis' model was used to assess the stability of the fluidization at the beginning and end of drying for each trial. The determined Wallis values were positive at the beginning and end of all trials, indicating stable fluidization for each drying condition.
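As a worked illustration of the method of slopes and the Arrhenius treatment mentioned above: for thin-slab geometry the one-term Fickian solution gives ln(MR) ≈ ln(8/π²) − (π²·D_eff/(4L²))·t, so D_eff follows from the slope of ln(MR) versus time, and a fit of ln(D_eff) against 1/T yields the activation energy. The slab half-thickness, drying curves and temperatures below are placeholders, not the experimental data from this study.

```python
import numpy as np

R = 8.314                # J/(mol K), universal gas constant
half_thickness = 0.004   # m, assumed slab half-thickness (placeholder)

def deff_from_slope(time_s, moisture_ratio, L=half_thickness):
    """Method of slopes for a thin slab: the slope of ln(MR) vs t equals
    -pi^2 * Deff / (4 L^2), from the one-term Fickian solution."""
    slope, _ = np.polyfit(time_s, np.log(moisture_ratio), 1)
    return -slope * 4 * L**2 / np.pi**2

def arrhenius_fit(temps_K, deff_values):
    """Fit ln(Deff) = ln(D0) - Ea/(R*T); return (D0, Ea in J/mol)."""
    slope, intercept = np.polyfit(1.0 / np.asarray(temps_K), np.log(deff_values), 1)
    return np.exp(intercept), -slope * R

# Placeholder drying curves at two temperatures (not experimental data).
t = np.array([0, 600, 1200, 1800, 2400], dtype=float)   # s
mr_15C = np.exp(-2.0e-4 * t)
mr_25C = np.exp(-3.5e-4 * t)

deffs = [deff_from_slope(t, mr_15C), deff_from_slope(t, mr_25C)]
D0, Ea = arrhenius_fit([288.15, 298.15], deffs)
print(f"Deff = {deffs}, Ea = {Ea / 1000:.1f} kJ/mol")
```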
Abstract:
Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and negative impact on ecosystems, with both local and global consequences.

The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualization with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich (2010) and his colleagues use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved.

The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance – the comfort of ECOS’ virtual denizens and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements have been ascertained, they are then balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. Calculating the relative amount of energy produced by wind and solar can be done by, in the case of solar for example, considering the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budgets allocated to green energy sources such as the solar panels and wind generator, and the air-conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, which are modeled on the real energy usage statistics drawn from the SEC data provided by the building managers.
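The energy balance sketched in the abstract reduces to a simple calculation: a heating or cooling load driven by the indoor-outdoor temperature difference, compared against generation from the green sources. The snippet below is a simplified illustration with assumed coefficients; it is not the ECOS codebase, and the live Google Weather API call is replaced by hard-coded outdoor conditions.

```python
# Simplified sketch of an ECOS-style energy balance (assumed coefficients,
# not the project's actual model or data).

UA = 15.0                 # kW/K, assumed overall heat-loss coefficient of the building
COP = 3.0                 # assumed coefficient of performance of the HVAC system
PANEL_AREA = 500.0        # m^2, user-adjustable solar budget expressed as panel area
PANEL_EFFICIENCY = 0.18   # typical panel efficiency (assumed)

def hvac_demand_kw(outdoor_c: float, thermostat_c: float) -> float:
    """Electrical power needed to hold the thermostat set-point."""
    thermal_load = UA * abs(thermostat_c - outdoor_c)   # kW of heat to move
    return thermal_load / COP

def solar_generation_kw(irradiance_w_m2: float) -> float:
    """Power produced by the panels at the current irradiance."""
    return PANEL_AREA * PANEL_EFFICIENCY * irradiance_w_m2 / 1000.0

# In ECOS the outdoor conditions come from a live weather feed; here they are fixed.
outdoor_temp = 32.0       # degrees C
thermostat = 23.0         # degrees C, as set by the user
irradiance = 650.0        # W/m^2

demand = hvac_demand_kw(outdoor_temp, thermostat)
generation = solar_generation_kw(irradiance)
print(f"demand {demand:.1f} kW, generation {generation:.1f} kW, "
      f"balance {generation - demand:+.1f} kW")
```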
Abstract:
Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. Resolving organelles that display similar behavior during a protocol designed to provide partial enrichment represents a possible shortcoming. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
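A minimal sketch of the data-combination step described above: per-protein abundance profiles from the two gradients are concatenated before classification with a non-linear (RBF) support vector machine from scikit-learn. The data here are synthetic placeholders; the LOPIT quantitations and class labels of the published experiments are not reproduced.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_proteins, n_fractions, n_classes = 300, 8, 6

# Synthetic stand-in for LOPIT data: per-protein abundance profiles across
# the fractions of two distinct gradients, with class-dependent means.
labels = rng.integers(0, n_classes, n_proteins)
centres_a = rng.normal(size=(n_classes, n_fractions))
centres_b = rng.normal(size=(n_classes, n_fractions))
gradient_a = centres_a[labels] + 0.5 * rng.normal(size=(n_proteins, n_fractions))
gradient_b = centres_b[labels] + 0.5 * rng.normal(size=(n_proteins, n_fractions))

# Straightforward combination: concatenate the two profiles for each protein.
combined = np.hstack([gradient_a, gradient_b])

svm = SVC(kernel="rbf", C=10.0, gamma="scale")
for name, X in [("gradient A", gradient_a), ("gradient B", gradient_b),
                ("combined", combined)]:
    acc = cross_val_score(svm, X, labels, cv=5).mean()
    print(f"{name}: {acc:.2f} cross-validated accuracy")
```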
Abstract:
The diagnostics of mechanical components operating in transient conditions is still an open issue, in both the research and industrial fields. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics under any operating condition and noise level, is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD), Minimum Entropy Deconvolution (MED) and the analytical approach of the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now represented the landmark for bearing diagnostics.
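The EEMED indicator combines EMD, MED and the Hilbert transform; the sketch below illustrates two of those building blocks on a synthetic impulsive signal: empirical mode decomposition via the third-party PyEMD package (assumed installed), followed by a Hilbert envelope spectrum of one IMF. The MED deconvolution step and the authors' full EEMED construction are not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # third-party package, assumed installed

fs = 10_000                        # Hz (assumed sampling rate)
t = np.arange(0, 0.5, 1 / fs)
fault_rate = 87.0                  # Hz, synthetic fault repetition rate

# Synthetic vibration: short repetitive bursts of a 2 kHz carrier plus noise.
bursts = (np.sin(2 * np.pi * fault_rate * t) > 0.99).astype(float)
carrier = np.sin(2 * np.pi * 2000 * t)
x = 1.5 * bursts * carrier + 0.1 * np.random.default_rng(0).normal(size=t.size)

# 1) Empirical Mode Decomposition: split the signal into IMFs.
imfs = EMD().emd(x)

# 2) Hilbert envelope spectrum of the first (highest-frequency) IMF.
envelope = np.abs(hilbert(imfs[0]))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
# Dominant peak expected near the 87 Hz repetition rate of the bursts.
print("dominant envelope frequency:", freqs[spectrum.argmax()])
```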
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis, and in particular the squared envelope spectrum, has gained a leading role in recent years among the different digital signal processing techniques. The original constraint of constant operating speed has been relaxed thanks to the combination of this technique with computed order tracking, which is able to resample signals at constant angular increments. In this way, the field of application of the squared envelope spectrum has been extended to cases in which small speed fluctuations occur, maintaining the effectiveness and efficiency that characterize this successful technique. However, to implement an algorithm valid for all industrial applications, the constraint on speed has to be removed completely, making envelope analysis suitable also for speed and load transients. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This paper is aimed at providing and testing a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
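A minimal sketch of the computed order tracking step described above, assuming the instantaneous shaft speed is available (e.g. from a tachometer): the shaft angle is obtained by integrating the speed, and the vibration is re-sampled at constant angular increments by interpolation.

```python
import numpy as np

def angular_resample(x, t, speed_hz, samples_per_rev=256):
    """Resample a time-domain signal x(t) at constant angular increments.

    speed_hz is the instantaneous shaft speed (rev/s) at the same time
    instants t. Returns the angle-domain signal and the cumulative
    revolution count at each new sample."""
    # Cumulative shaft angle in revolutions (trapezoidal integration of speed).
    revs = np.concatenate(([0.0],
                           np.cumsum(0.5 * (speed_hz[1:] + speed_hz[:-1]) * np.diff(t))))
    # Uniform grid in the angle domain.
    rev_grid = np.arange(0.0, revs[-1], 1.0 / samples_per_rev)
    # Map the angle grid back to time, then interpolate the signal there.
    t_grid = np.interp(rev_grid, revs, t)
    return np.interp(t_grid, t, x), rev_grid

# Example: a run-up from 10 to 40 rev/s carrying a shaft (1x) component.
fs = 10_000
t = np.arange(0, 4.0, 1 / fs)
speed = 10 + 7.5 * t                                  # rev/s, linear run-up
phase = 2 * np.pi * np.cumsum(speed) / fs             # shaft angle in radians
x = np.sin(phase) + 0.1 * np.random.default_rng(0).normal(size=t.size)

x_order, rev_grid = angular_resample(x, t, speed)
# In the angle domain the 1x component now sits at exactly order 1.
print(x_order.size, "angle-domain samples over", round(rev_grid[-1]), "revolutions")
```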
Abstract:
Diagnostics of rolling element bearings involves a combination of different techniques of signal enhancement and analysis. The most common procedure presents a first step of order tracking and synchronous averaging, able to remove from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been obtained in order to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional pre-whitening step is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new technique for pre-whitening has been proposed, based on cepstral analysis: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it seems a good candidate to perform the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique will be tested on data measured on a full-scale industrial bearing test-rig, able to reproduce harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques will be made, as a final step in the verification of the potential of the cepstrum pre-whitening.
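Cepstrum pre-whitening is attractive precisely because it is so compact: the real cepstrum is set to zero everywhere except at zero quefrency, which amounts to dividing the spectrum by its own magnitude while retaining the phase. A minimal sketch on a synthetic signal is shown below; the signal parameters are assumed values.

```python
import numpy as np

def cepstrum_prewhitening(x):
    """Cepstrum pre-whitening: keep only the phase of the spectrum
    (equivalently, zero the real cepstrum except at quefrency zero),
    which flattens all deterministic/harmonic components."""
    X = np.fft.fft(x)
    eps = np.finfo(float).eps
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))

# Example: strong harmonic (gear-like) components plus weak repetitive impulses.
fs = 10_000
t = np.arange(0, 1.0, 1 / fs)
gear = np.sin(2 * np.pi * 320 * t) + 0.5 * np.sin(2 * np.pi * 640 * t)
impulses = 0.2 * (np.sin(2 * np.pi * 93 * t) > 0.998).astype(float)
x = gear + impulses + 0.05 * np.random.default_rng(0).normal(size=t.size)

x_white = cepstrum_prewhitening(x)
# x_white has a flat spectrum magnitude; the impulsive (bearing-like) content
# survives in the phase and is easier to detect in a squared envelope spectrum.
```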
Abstract:
Diagnostics of rolling element bearings has traditionally been developed for constant operating conditions, and sophisticated techniques, like spectral kurtosis or envelope analysis, have proven their effectiveness by means of experimental tests, mainly conducted on small-scale laboratory test-rigs. Algorithms have been developed for the digital signal processing of data collected at constant speed and bearing load, with a few exceptions allowing only small fluctuations of these quantities. Owing to the spread of condition based maintenance in many industrial fields, in recent years a need has emerged for more flexible algorithms compatible with highly variable operating conditions, such as acceleration/deceleration transients. This paper analyzes the problems related to significant speed and load variability, discussing in detail the effect that they have on bearing damage symptoms, and proposes solutions to adapt existing algorithms to cope with this new challenge. In particular, the paper will i) discuss the implications of variable speed for the applicability of diagnostic techniques, ii) address quantitatively the effects of load on the characteristic frequencies of damaged bearings and iii) finally present a new approach for bearing diagnostics in variable conditions, based on envelope analysis. The research is based on experimental data obtained using artificially damaged bearings installed on a full-scale test-rig, equipped with an actual train traction system and reproducing the operation on a real track, including all the environmental noise owing to track irregularity and electrical disturbances of such a harsh application.
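The characteristic frequencies referred to in point ii) follow from the bearing geometry and the shaft speed, so under speed variation they scale with the instantaneous speed. A sketch of the standard kinematic formulas is given below; the geometry values are illustrative, not those of the test-rig bearing.

```python
import numpy as np

def bearing_fault_frequencies(shaft_hz, n_balls, ball_d, pitch_d,
                              contact_angle_deg=0.0):
    """Kinematic characteristic frequencies of a rolling element bearing
    with a stationary outer race (standard textbook formulas)."""
    ratio = (ball_d / pitch_d) * np.cos(np.radians(contact_angle_deg))
    ftf = 0.5 * shaft_hz * (1 - ratio)              # cage (fundamental train)
    bpfo = 0.5 * n_balls * shaft_hz * (1 - ratio)   # outer race defect
    bpfi = 0.5 * n_balls * shaft_hz * (1 + ratio)   # inner race defect
    bsf = 0.5 * (pitch_d / ball_d) * shaft_hz * (1 - ratio**2)  # ball spin
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf}

# Illustrative geometry (not the test-rig bearing): 9 balls, 7.9 mm ball diameter,
# 39 mm pitch diameter, zero contact angle, 25 Hz shaft speed.
print(bearing_fault_frequencies(25.0, 9, 7.9e-3, 39e-3))
```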
Abstract:
The transmission path from the excitation to the vibration measured on the surface of a mechanical system introduces a distortion both in amplitude and in phase. Moreover, in variable speed conditions, the amplification/attenuation and the phase shift due to the transfer function of the mechanical system vary in time. This phenomenon reduces the effectiveness of traditional tachometer-based order tracking, compromising the results of a discrete-random separation performed by synchronous averaging. In this paper, for the first time, the extent of the distortion is identified both in the time domain and in the order spectrum of the signal, highlighting the consequences for the diagnostics of rotating machinery. A particular focus is given to gears, providing some indications on how to take advantage of the quantification of the disturbance to better tune the techniques developed for the compensation of the distortion. The full theoretical analysis is presented and the results are applied to an experimental case.
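The extent of the distortion discussed above can be quantified directly from the frequency response of the transmission path: during a speed transient a given shaft order sweeps across the structural resonances, so its measured amplitude and phase change with speed even for constant excitation. The sketch below evaluates this for an assumed single-degree-of-freedom path; the natural frequency, damping and speed profile are placeholder values.

```python
import numpy as np

def sdof_frf(freq_hz, fn_hz=120.0, zeta=0.03):
    """Frequency response of an assumed single-degree-of-freedom
    transmission path (natural frequency fn_hz, damping ratio zeta)."""
    r = freq_hz / fn_hz
    return 1.0 / (1.0 - r**2 + 2j * zeta * r)

# A shaft order k during a run-up: its instantaneous frequency is k * speed.
order = 3
speed_hz = np.linspace(10.0, 60.0, 200)       # shaft speed during the transient
H = sdof_frf(order * speed_hz)

# Amplitude and phase of the measured order vary with speed even though the
# excitation amplitude is constant; this is the distortion that affects
# tachometer-based order tracking and synchronous averaging.
amplification = np.abs(H)
phase_shift_deg = np.degrees(np.angle(H))
print(amplification.max(), phase_shift_deg.min())
```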
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis has gained a leading role in recent years among the different digital signal processing techniques. The original constraint of constant operating speed has been relaxed thanks to the combination of this technique with computed order tracking, which is able to resample signals at constant angular increments. In this way, the field of application of this technique has been extended to cases in which small speed fluctuations occur, maintaining high effectiveness and efficiency. In order to make this algorithm suitable for all industrial applications, the constraint on speed has to be removed completely. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This chapter presents a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
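Once the signal has been resampled in the angle domain, the diagnostic indicator is the squared envelope spectrum, computed from the analytic signal and expressed against order rather than frequency, so that fault signatures stay at fixed positions during speed transients. A minimal sketch, assuming the angular resampling has already been performed, is shown below with synthetic data.

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x_order, samples_per_rev):
    """Squared envelope spectrum of an order-tracked (angle-domain) signal.
    Returns the order axis and the spectrum magnitude, so bearing fault
    orders appear at fixed positions even during speed transients."""
    env_sq = np.abs(hilbert(x_order)) ** 2
    env_sq -= env_sq.mean()                       # remove the dc component
    spectrum = np.abs(np.fft.rfft(env_sq)) / len(env_sq)
    orders = np.fft.rfftfreq(len(env_sq), d=1.0 / samples_per_rev)
    return orders, spectrum

# Example with a synthetic angle-domain signal: impulses roughly every 0.28 rev
# (a fault order of about 3.57) modulating a higher-order carrier, plus noise.
samples_per_rev = 256
revs = np.arange(0, 100, 1.0 / samples_per_rev)
impulses = (np.mod(revs * 3.57, 1.0) < 0.02).astype(float)
x_order = impulses * np.sin(2 * np.pi * 60 * revs) \
          + 0.2 * np.random.default_rng(0).normal(size=revs.size)

orders, ses = squared_envelope_spectrum(x_order, samples_per_rev)
# The dominant peak is expected near the simulated fault order of ~3.57.
print("dominant order:", orders[ses.argmax()])
```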