205 results for fitting trends prescribing habits
Abstract:
Vehicles built on century-old gasoline engine technology have become one of the major contributors of greenhouse gas emissions. Plug-in Electric Vehicles (PEVs) have been proposed to achieve environmentally friendly transportation. Even though PEV usage is currently increasing, a technology breakthrough would be required to overcome battery-related drawbacks. Although battery technology is evolving, drawbacks inherent to batteries, such as cost, size, weight, slow charging and low energy density, would still be the dominant constraints on the development of EVs. Furthermore, PEVs have not been accepted as the preferred choice by many consumers due to charging-related issues. To address battery-related limitations, the concept of dynamic Wireless Power Transfer (WPT) enabled EVs has been proposed, in which the EV is charged while in motion. WPT-enabled infrastructure has to be deployed to realise dynamic EV charging. The weight of the battery pack can be reduced, as the required energy storage is lower if the vehicle can be powered wirelessly while driving. Stationary WPT charging, where the EV is charged wirelessly while stopped, is simpler than dynamic WPT in terms of design complexity. However, stationary WPT does not increase vehicle range compared to wired PEVs. State-of-the-art WPT technology for future transportation is discussed in this chapter. Analysis of the WPT system and its performance indices is introduced. Modelling of the WPT system using different methods, such as equivalent circuit theory, two-port network theory and coupled mode theory, is described in Sect. 2.3, illustrating the merits of each. Both stationary and dynamic WPT for EV applications are illustrated in Sect. 2.4. Design challenges and optimization directions are analysed in Sect. 2.5. Adaptive tuning techniques such as adaptive impedance matching and frequency tuning are also discussed. A case study on optimizing resonator design is presented in Sect. 2.6. Achievements by the research community are reviewed, highlighting directions for future research.
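As a rough illustration of the equivalent-circuit analysis mentioned above (not a model taken from the chapter), the Python sketch below computes the standard maximum link efficiency of a two-coil resonant WPT system from the coupling coefficient k and the coil quality factors Q1 and Q2; the numerical values are assumptions chosen only to show the trend.

```python
import math

def wpt_max_efficiency(k: float, q1: float, q2: float) -> float:
    """Maximum link efficiency of a two-coil resonant WPT link,
    from the equivalent-circuit result with an optimally matched load."""
    u = k * math.sqrt(q1 * q2)                      # figure of merit U = k*sqrt(Q1*Q2)
    return u ** 2 / (1 + math.sqrt(1 + u ** 2)) ** 2

# Assumed, illustrative values only: coil Q of 300, a range of coupling coefficients
for k in (0.05, 0.10, 0.20):
    print(f"k = {k:.2f}  ->  eta_max = {wpt_max_efficiency(k, 300, 300):.3f}")
```

The sketch suggests why loosely coupled (low-k) dynamic charging is harder to keep efficient than stationary charging, and why adaptive impedance matching and frequency tuning, which try to stay matched as the coupling varies, matter for the dynamic case.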
Abstract:
Background The frequency of prescribing potentially inappropriate medications (PIMs) in older patients remains high despite the evidence of adverse outcomes from their use. This study aims to identify the prevalence and nature of PIMs at admission to acute care and at discharge to residential aged care facilities (RACFs) using the recently updated Beers’ Criteria. We also aim to identify whether polypharmacy, age, gender and the frailty status of patients are independent risk factors for receiving a PIM. Methods This was a retrospective study of 206 patients discharged to RACFs from acute care. All patients were aged at least 70 years and were admitted between July 2005 and May 2010; their admission and discharge medications were evaluated. Frailty status was measured using the Frailty Index (FI), calculated by summing each individual’s deficits and dividing by the total number of deficits considered, with an FI of 0.25 used as the cut-off between “fit” and “frail”. Results Mean patient age was 84.8 ± 6.7 years; the majority (57%) were older than 85 years and approximately 90% were frail. Patients were prescribed a mean of 7.2 regular medications at admission and 8.1 on discharge. At least one PIM was identified in 112 (54.4%) patients on admission and 102 (49.5%) patients on discharge. Of all 1728 medications prescribed at admission, 10.8% were PIMs, and of the 1759 medications prescribed at discharge, 9.6% were PIMs. Of the total 187 PIMs on admission, 56 (30%) were stopped and 131 were continued; 32 new PIMs were introduced. Commonly prescribed PIMs at both admission and discharge were central nervous system, cardiovascular and gastrointestinal drugs and analgesics. Of the potential risk factors, frailty status was the only significant predictor of PIMs at both admission and discharge (p = 0.016). Conclusion A high prevalence of potentially inappropriate medication use was observed in frail older patients on admission to acute care hospitals and on discharge to RACFs. The only factor associated with PIM use was the frailty status of patients. Further studies are needed to evaluate this association.
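For readers unfamiliar with the deficit-accumulation approach, a minimal sketch of the Frailty Index calculation described above is given below; the deficit counts are hypothetical and the 0.25 cut-off follows the abstract.

```python
def frailty_index(deficits_present: int, deficits_considered: int) -> float:
    """Frailty Index: deficits present divided by total deficits considered."""
    return deficits_present / deficits_considered

def frailty_status(fi: float, cutoff: float = 0.25) -> str:
    """Classify as 'frail' above the 0.25 cut-off, otherwise 'fit'."""
    return "frail" if fi > cutoff else "fit"

# Hypothetical patient: 12 deficits present out of 40 considered
fi = frailty_index(12, 40)
print(f"FI = {fi:.2f} -> {frailty_status(fi)}")   # FI = 0.30 -> frail
```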
Abstract:
There have been different approaches to studying penalty-kick performance in association football. In this paper, the authors synthesize key findings within an ecological dynamics theoretical framework. According to this theoretical perspective, information is the cornerstone for understanding the dynamics of action regulation in penalty-kick performance. Research suggests that investigators need to identify the information sources that are most relevant to penalty-kick performance. An important task is to understand how constraints can channel (i.e. change, emphasize or mask) information sources used to regulate upcoming actions and how the influence of these constraints is expressed in players' behavioural dynamics. Due to the broad range of constraints influencing penalty-kick performance, it is recommended that future research adopts an interdisciplinary focus on performance assessment to overcome the current lack of representativeness in penalty-kick experimental designs. Such an approach would serve to capture the information-based control of action of both players as components of this dyadic system in competitive sport.
Abstract:
The Mapping Futures of News research and seminar programme, sponsored by the Institute for Advanced Studies in 2009-10, addressed those questions, as well as the many more immediate issues facing the Scottish news industry, such as how to survive the current period of often traumatic transition. This document summarises that work, and identifies: • where the main Scottish print and broadcast news media stand in 2010, in terms of circulation and ratings figures; • the key trends currently impacting on Scottish news media; • the responses up to now of government and regulators to assist the Scottish media through the present problems; • the responses of the news media themselves.
Abstract:
Background Pharmacist prescribing has been introduced in several countries and is a possible future role for pharmacy in Australia. Objective To assess whether patient satisfaction with the pharmacist as a prescriber, and patient experiences in two settings of collaborative doctor-pharmacist prescribing, may be barriers to the implementation of pharmacist prescribing. Design Surveys containing closed questions with Likert scale responses were completed in both settings to investigate patient satisfaction after each consultation. A further survey investigating attitudes towards pharmacist prescribing, after multiple consultations, was completed in the sexual health clinic. Setting and Participants A surgical pre-admission clinic (PAC) in a tertiary hospital and an outpatient sexual health clinic at a university hospital. Two hundred patients scheduled for elective surgery, and 17 patients diagnosed with HIV infection, respectively, were recruited to the pharmacist prescribing arm of two collaborative doctor-pharmacist prescribing studies. Results Consultation satisfaction response rates in the PAC and the sexual health clinic were 182/200 (91%) and 29/34 (85%), respectively. In the sexual health clinic, the response rate for the attitudes towards pharmacist prescribing survey was 14/17 (82%). Consultation satisfaction was high in both studies; most patients (98% and 97%, respectively) agreed they were satisfied with the consultation. In the sexual health clinic, all patients (14/14) agreed that they trusted the pharmacist’s ability to prescribe, that care was as good as usual care, and that they would recommend seeing a pharmacist prescriber to friends. Discussion and Conclusion Most patients were highly satisfied with pharmacist prescriber consultations and, in the sexual health clinic, had a positive outlook on the collaborative model of care.
Abstract:
Quantifying the competing rates of intake and elimination of persistent organic pollutants (POPs) in the human body is necessary to understand the levels and trends of POPs at a population level. In this paper we reconstruct the historical intake and elimination of ten polychlorinated biphenyls (PCBs) and five organochlorine pesticides (OCPs) from Australian biomonitoring data by fitting a population-level pharmacokinetic (PK) model. Our analysis exploits two sets of cross-sectional biomonitoring data for PCBs and OCPs in pooled blood serum samples from the Australian population that were collected in 2003 and 2009. The modeled adult reference intakes in 1975 for PCB congeners ranged from 0.89 to 24.5 ng/kg bw/day, lower than the daily intakes of OCPs ranging from 73 to 970 ng/kg bw/day. Modeled intake rates are declining with half-times from 1.1 to 1.3 years for PCB congeners and 0.83 to 0.97 years for OCPs. The shortest modeled intrinsic human elimination half-life among the compounds studied here is 6.4 years for hexachlorobenzene, and the longest is 30 years for PCB-74. Our results indicate that it is feasible to reconstruct intakes and to estimate intrinsic human elimination half-lives using the population-level PK model and biomonitoring data only. Our modeled intrinsic human elimination half-lives are in good agreement with values from a similar study carried out for the population of the United Kingdom, and are generally longer than reported values from other industrialized countries in the Northern Hemisphere.
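A minimal sketch of the kind of population-level one-compartment PK model described above is shown below, assuming an exponentially declining intake and first-order intrinsic elimination; the parameter values are illustrative (loosely PCB-74-like), not the study's fitted estimates.

```python
import numpy as np

LN2 = np.log(2.0)

def body_burden(i0_daily, intake_halftime_y, elim_halflife_y,
                t0=1975.0, t_end=2009.0, dt=0.01):
    """One-compartment PK model: exponentially declining intake and
    first-order intrinsic elimination. Returns years and burden (ng/kg bw)."""
    k_in = LN2 / intake_halftime_y            # intake decline rate, 1/year
    k_el = LN2 / elim_halflife_y              # intrinsic elimination rate, 1/year
    t = np.arange(t0, t_end + dt, dt)
    b = np.zeros_like(t)
    for i in range(1, len(t)):
        intake = i0_daily * 365.25 * np.exp(-k_in * (t[i - 1] - t0))  # ng/kg bw/year
        b[i] = b[i - 1] + dt * (intake - k_el * b[i - 1])             # Euler step
    return t, b

# Illustrative parameters only: 1975 intake 10 ng/kg bw/day,
# intake half-time 1.2 y, intrinsic elimination half-life 30 y
t, b = body_burden(10.0, 1.2, 30.0)
print(f"Modelled burden in 2003: {b[np.searchsorted(t, 2003.0)]:.0f} ng/kg bw")
```

Fitting such a model to cross-sectional biomonitoring data (here, the 2003 and 2009 pools) amounts to adjusting the intake and elimination parameters until the modelled burdens reproduce the measured serum concentrations.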
Abstract:
Pilot and industrial scale dilute acid pretreatment data can be difficult to obtain due to the significant infrastructure investment required. Consequently, models of dilute acid pretreatment by necessity use laboratory scale data to determine kinetic parameters and make predictions about optimal pretreatment conditions at larger scales. In order for these recommendations to be meaningful, the ability of laboratory scale models to predict pilot and industrial scale yields must be investigated. A mathematical model of the dilute acid pretreatment of sugarcane bagasse has previously been developed by the authors. This model was able to successfully reproduce the experimental yields of xylose and short chain xylooligomers obtained at the laboratory scale. In this paper, the ability of the model to reproduce pilot scale yield and composition data is examined. It was found that, in general, the model over-predicted the pilot scale reactor yields by a significant margin. Models that appear very promising at the laboratory scale may therefore have limitations when predicting yields at pilot or industrial scale. It is difficult to comment on whether there are any consistent trends in optimal operating conditions between reactor scale and laboratory scale hydrolysis due to the limited reactor datasets available. Further investigation is needed to determine whether the model has some efficacy when the kinetic parameters are re-evaluated by parameter fitting to reactor scale data; however, this requires the compilation of larger datasets. Alternatively, laboratory scale mathematical models may have enhanced utility for predicting larger scale reactor performance if bulk mass transport and fluid flow considerations are incorporated into the fibre scale equations. This work reinforces the need for appropriate attention to be paid to pilot scale experimental development when moving from laboratory to pilot and industrial scales for new technologies.
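As a simplified stand-in for the authors' laboratory-scale model (which is not reproduced here), the sketch below uses classic series first-order (Saeman-type) kinetics, xylan to xylose to degradation products, with hypothetical rate constants, to show how an optimal pretreatment time emerges from the competition between sugar release and degradation.

```python
import numpy as np

def xylose_fraction(t, k1, k2, xylan0=1.0):
    """Series first-order (Saeman-type) kinetics: xylan -> xylose -> degradation.
    Returns xylose expressed as a fraction of the initial potential xylose."""
    t = np.asarray(t, dtype=float)
    return xylan0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

# Hypothetical rate constants (1/min) for a single temperature/acid combination
k1, k2 = 0.08, 0.01
t = np.linspace(0.0, 120.0, 1201)            # pretreatment time, minutes
y = xylose_fraction(t, k1, k2)
print(f"Peak xylose fraction {y.max():.2f} at about {t[np.argmax(y)]:.0f} min")
```

At larger scales, bulk mass transport and fluid flow would modify the effective rate constants, which is one way the over-prediction reported above can arise.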
Abstract:
The future of civic engagement is characterised by both technological innovation and new technological user practices, fuelled by trends towards mobile, personal devices; broadband connectivity; open data; urban interfaces; and cloud computing. These technology trends are progressing at a rapid pace, and have led global technology vendors to package and sell the ‘Smart City’ as a centralized service delivery platform predicted to optimize and enhance cities’ key performance indicators and generate a profitable market. The top-down deployment of these large, proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another ‘IT bubble’ emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live, and play across different environments, and how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term ‘slacktivism’ is sometimes used to denote a watered-down version of civic engagement and activism that is reduced to clicking a ‘Like’ button and signing online petitions, we believe that we are far from witnessing another Biedermeier period that saw people focus on the domestic and the non-political. There is plenty of evidence to the contrary, such as the post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, Taksim Gezi Park in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments and, in turn, call for new processes to be incorporated into governance structures. Participatory research into these new processes across the triad of people, place and technology is a significant and timely investment to foster productive, sustainable, and livable human habitats. With this chapter, we want to reframe the current debates in academia and priorities in industry and government to allow citizens and civic actors to take their rightful place at the centre of civic movements. This calls for new participatory approaches for co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. This chapter proposes participatory action research as a useful and fitting research paradigm to guide methodological considerations surrounding the study, design, development, and evaluation of civic technologies. We do not limit our definition of civic technologies to tools specifically designed simply to enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly the participatory design of such civic technologies that strive to involve citizens in political debate and action as well as question conventional approaches to political issues (DiSalvo, 2012; Dourish, 2010; Foth et al., 2013).
Following an outline of some underlying principles and assumptions behind participatory action research, especially as it applies to cities, we critically review case studies to illustrate the application of this approach, with a view to engendering robust, inclusive, and dynamic societies built on the principles of engaged liberal democracy. The rationale for this approach is to offer an alternative to smart cities set in a ‘perpetual tomorrow’ (cf. e.g. Dourish & Bell, 2011), based on the many weak and strong signals of civic action revolving around technology seen today. It seeks to emphasize and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we look at some fundamental issues arising from applying simplistic smart city visions to the kind of problem a city is (cf. Jacobs, 1961). We focus on the touch points between “the city” and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.
Abstract:
This paper discusses the emergence of assessment for learning (AfL) across the globe with particular attention given to Western educational jurisdictions. Authors from Australia, Canada, Ireland, Israel, New Zealand, Norway, and the USA explain the genesis of AfL, its evolution and impact on school systems, and discuss current trends in policy directions for AfL within their respective countries. The authors also discuss the implications of these various shifts and the ongoing tensions that exist between AfL and summative forms of assessment within national policy initiatives.
Abstract:
Book description: "Over 50,000 new cases of head and neck cancer are diagnosed each year in the United States. The majority of these are squamous cell carcinomas (HNSCC), associated with human papillomavirus infection and carcinogenic behaviors such as tobacco use and alcohol consumption. Although these are the more common causes, there are several others that this book addresses. This book examines the epidemiology of head and neck cancer. It discusses the management of head and neck cancer as well as treatment outcomes."--publisher website
Abstract:
Cold water immersion and ice baths are popular methods of recovery used by athletes. From the simple wheelie bin filled with water and ice, to inflatable baths with complex water-cooling units, to recovery sessions in the ocean, the practice of cold water immersion is wide and varied. Research into cold water immersion was conducted as early as 1963, when Clarke [1] examined the influence of cold water on performance recovery after a sustained handgrip exercise. Research has since been conducted to understand how cold water immersion might affect the body’s physiological systems and how factors such as water temperature and the duration of immersion might enhance recovery after training and/or competition. Despite this research activity, how are we to know if research is being put into practice? In more serious situations, where guidelines and policies need to be standardised for the safe use of a product, one would expect a straightforward follow-on from research into practice. Although cold water immersion may not need the rigor of testing required of drug treatments, for example, the decision on whether to use cold water immersion in specific situations (e.g. after training or competition) may rest with one or two of the staff associated with the athlete/team. It would therefore be expected that these staff are well informed on the current literature regarding cold water immersion.
Abstract:
Despite recent efforts to assess the release of nanoparticles to the workplace during different nanotechnology activities, a generalizable trend in particle release has yet to be identified. This study aimed to characterize the release of synthetic clay nanoparticles from a laboratory-based jet milling process by quantifying the variations arising from the primary particle size and surface treatment of the material used, as well as the feed rate of the machine. A broad range of materials was used in this study, and the emitted particle mass (PM2.5) and number concentrations (PNC) were measured at the release source. Analysis of variance, followed by linear mixed-effects modeling, was applied to quantify the variations in PM2.5 and PNC of the released particles caused by the abovementioned factors. The results confirmed that using materials of different primary size and surface treatment affects the release of particles from the same process, causing statistically significant variations in PM2.5 and PNC. The interaction of these two factors should also be taken into account, as it resulted in variations in the measured particle release properties. Furthermore, the feed rate of the milling machine was confirmed to be another influencing parameter. Although this research does not identify a specific pattern in the release of synthetic clay nanoparticles from the jet milling process that is generalizable to other similar settings, it emphasizes that each tested case should be handled individually in terms of exposure considerations.
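The snippet below sketches, with synthetic data, how a linear mixed-effects model of the kind described above might be specified in Python using statsmodels: fixed effects for primary particle size, surface treatment, their interaction and feed rate, with a random intercept per milling run. Column names and factor levels are hypothetical, not those of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; a real analysis would use the measured PM2.5/PNC series.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "size": rng.choice(["small", "large"], n),        # primary particle size (hypothetical levels)
    "treatment": rng.choice(["coated", "uncoated"], n),
    "feed_rate": rng.uniform(0.5, 2.0, n),            # g/min, hypothetical
    "run_id": rng.integers(0, 12, n),                 # repeated measurements within milling runs
})
df["pm25"] = (5.0 + 2.0 * (df["size"] == "large") + 1.5 * (df["treatment"] == "coated")
              + 0.8 * df["feed_rate"] + rng.normal(0.0, 1.0, n))

# Fixed effects: size, treatment, their interaction and feed rate;
# random intercept for each milling run.
model = smf.mixedlm("pm25 ~ C(size) * C(treatment) + feed_rate", df, groups=df["run_id"])
print(model.fit().summary())
```

Grouping repeated measurements by run accounts for within-run correlation, which is why a mixed-effects model is preferred over plain ANOVA once the factor screening is done.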
Abstract:
Background Stroke incidence has fallen since 1950. Recent trends suggest that stroke incidence may be stabilizing or increasing. We investigated time trends in stroke occurrence and in-hospital morbidity and mortality in the Calgary Health Region. Methods All patients admitted to hospitals in the Calgary Health Region between 1994 and 2002 with a primary discharge diagnosis code (ICD-9 or ICD-10) of stroke were included. In-hospital strokes were also included. Stroke type, date of admission, age, gender, discharge disposition (died, discharged) and in-hospital complications (pneumonia, pulmonary embolism, deep venous thrombosis) were recorded. Poisson and simple linear regression were used to model time trends of occurrence by stroke type and age group and to extrapolate future time trends. Results From 1994 to 2002, 11642 stroke events were observed. Of these, 9879 patients (84.8%) were discharged from hospital, 1763 (15.1%) died in hospital, and 591 (5.1%) developed in-hospital complications from pneumonia, pulmonary embolism or deep venous thrombosis. Both in-hospital mortality and complication rates were highest for hemorrhages. Over the period of study, the rate of stroke admission remained stable. However, the total number of stroke admissions to hospital increased significantly (p=0.012), due to the combination of increases in intracerebral hemorrhage (p=0.021) and ischemic stroke admissions (p=0.011). Sub-arachnoid hemorrhage rates declined. In-hospital stroke mortality showed an overall decline due to a decrease in deaths from ischemic stroke, intracerebral hemorrhage and sub-arachnoid hemorrhage. Conclusions Although age-adjusted stroke occurrence rates were stable from 1994 to 2002, this stability was accompanied by both a sharp increase in the absolute number of stroke admissions and a decline in proportional in-hospital mortality. Further research is needed into changes in stroke severity over time to understand the causes of declining in-hospital stroke mortality rates.
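To illustrate the Poisson time-trend modelling referred to above, the sketch below fits a Poisson regression of annual admission counts on calendar year using statsmodels; the counts are synthetic placeholders, not the Calgary data, and a rate (rather than count) analysis would add an offset of log(population at risk).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic annual admission counts standing in for the 1994-2002 data (hypothetical).
years = np.arange(1994, 2003)
counts = np.array([1180, 1205, 1230, 1260, 1285, 1315, 1345, 1390, 1432])

df = pd.DataFrame({"year": years - 1994, "count": counts})
X = sm.add_constant(df[["year"]])
fit = sm.GLM(df["count"], X, family=sm.families.Poisson()).fit()

# exp(coefficient on year) is the modelled annual rate ratio for admissions.
print(fit.summary())
print("Annual rate ratio:", float(np.exp(fit.params["year"])))
```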