290 results for Immediate provisionalization


Relevance:

10.00%

Publisher:

Abstract:

Background: Eosinophilic esophagitis (EE) is an emerging condition in which patients commonly present with symptoms of gastroesophageal reflux disease and fail to respond adequately to anti-reflux therapy. Food allergy is currently recognized as the main immunological cause of EE; recent evidence suggests an etiological role for inhalant allergens. The presence of EE appears to be associated with other atopic illnesses. Objectives: To report the sensitization profile of both food and inhalant allergens in our EE patient cohort in relation to age, and to profile the prevalence of other allergic conditions in patients with EE. Method: The study prospectively analyzed allergen sensitization profiles using skin prick tests to common food allergens and inhalant allergens in 45 children with EE. Patch testing to common food allergens was performed on 33 patients in the same cohort. Comorbidities of atopic eczema, asthma, allergic rhinitis and anaphylaxis were obtained from patient history. Results: Younger patients with EE showed more IgE and patch sensitization to foods, while older patients showed greater IgE sensitization to inhalant allergens. The prevalence of atopic eczema, allergic rhinitis and asthma was significantly increased in our EE cohort compared with the general Australian population. A total of 24% of our cohort of patients with EE had a history of anaphylaxis. Conclusion: In children with EE, sensitization to inhalant allergens increases with age, particularly after 4 years. Also, specific enquiry about severe food reactions in patients presenting with EE is strongly recommended, as it appears this patient group has a high incidence of anaphylaxis. © 2007 The Authors.


We aimed to identify novel genetic variants affecting asthma risk, since these might provide novel insights into molecular mechanisms underlying the disease. We did a genome-wide association study (GWAS) in 2669 physician-diagnosed asthmatics and 4528 controls from Australia. Seven loci were prioritised for replication after combining our results with those from the GABRIEL consortium (n=26 475), and these were tested in an additional 25 358 independent samples from four in-silico cohorts. Quantitative multi-marker scores of genetic load were constructed on the basis of results from the GABRIEL study and tested for association with asthma in our Australian GWAS dataset. Two loci were confirmed to associate with asthma risk in the replication cohorts and reached genome-wide significance in the combined analysis of all available studies (n=57 800): rs4129267 (OR 1·09, combined p=2·4×10⁻⁸) in the interleukin-6 receptor (IL6R) gene and rs7130588 (OR 1·09, p=1·8×10⁻⁸) on chromosome 11q13.5 near the leucine-rich repeat containing 32 gene (LRRC32, also known as GARP). The 11q13.5 locus was significantly associated with atopic status among asthmatics (OR 1·33, p=7×10⁻⁴), suggesting that it is a risk factor for allergic but not non-allergic asthma. Multi-marker association results are consistent with a highly polygenic contribution to asthma risk, including loci with weak effects that might be shared with other immune-related diseases, such as NDFIP1, HLA-B, LPP, and BACH2. The IL6R association further supports the hypothesis that dysregulated cytokine signalling affects asthma risk, and raises the possibility that an IL6R antagonist (tocilizumab) may be effective in treating the disease, perhaps in a genotype-dependent manner. Results for the 11q13.5 locus suggest that it directly increases the risk of allergic sensitisation which, in turn, increases the risk of subsequent development of asthma.
Larger or more functionally focused studies are needed to characterise the many loci with modest effects that remain to be identified for asthma. National Health and Medical Research Council of Australia. A full list of funding sources is provided in the webappendix. © 2011 Elsevier Ltd.
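The "quantitative multi-marker scores of genetic load" mentioned above are, in essence, weighted sums of risk-allele counts. A minimal sketch, assuming log-odds-ratio weights; the weights below are illustrative placeholders derived from the reported ORs, not the actual GABRIEL estimates:

```python
# Hypothetical sketch of a multi-marker genetic load (polygenic) score:
# for each variant, multiply the risk-allele count (0, 1 or 2) by a
# log-odds-ratio weight from a discovery GWAS, then sum over variants.
import math

# Illustrative weights (log odds ratios); not real GABRIEL estimates.
weights = {"rs4129267": math.log(1.09), "rs7130588": math.log(1.09)}

def genetic_load(genotype):
    """genotype maps SNP id -> risk-allele count in {0, 1, 2}."""
    return sum(weights[snp] * count for snp, count in genotype.items())

high = genetic_load({"rs4129267": 2, "rs7130588": 2})  # all risk alleles
low = genetic_load({"rs4129267": 0, "rs7130588": 1})   # mostly non-risk
```

Scores of this form can then be tested for association with case status in an independent cohort, as was done with the Australian GWAS dataset.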


Young people are over-represented in road crashes, and school-based education programs, including the RACQ Docudrama program, represent initiatives aimed at improving road safety among this high-risk group. The aim of the study was to apply an extended Theory of Planned Behaviour framework to understand more about the extent to which the program influenced individuals’ intentions to speak up to a driver engaging in risky behaviours (e.g., speeding). Senior high school students (N = 260) from 5 Queensland schools completed a survey in class. The study included a Control group (n = 86) who responded to the survey prior to completing the Docudrama program and an Intervention group comprising an Intervention-Immediate group (n = 100) and an Intervention-Delayed group (n = 74) who completed the survey after having participated in the program either on the day or up to a week later, respectively. Overall, the findings provided support for the beneficial effects of the program. The study’s key findings included: (i) Intervention group participants consistently reported significantly stronger intentions to speak up than participants in the Control group; (ii) among the significant predictors of intentions, a notable finding was that the more individuals anticipated feeling regretful for not having spoken up to a risky driver, the stronger their intentions were to speak up; and (iii) the level of fear reported by students significantly decreased and was lowest at the conclusion of the program, following facilitated group discussion. The implications of the results for future research, program development and practice are discussed.


Widening participation or outreach agendas have been a major part of higher education policy since the early 2000s. These policies and programs seek to increase marginalised groups’ access to further study through activities, tutoring programs, workshops, and other provisions. Some programs openly state their intention to assist people from low socioeconomic backgrounds to become more civically engaged and socially mobile by improving their education, which creates an immediate link between education and social capital (see Morley 2012; Hillmert and Jacob 2010). Social capital refers to the ‘connections among individuals’ and the consequent value of the things they do together (Putnam 2000; Gauntlett 2011). Media and creative arts widening participation programs, arguably, are better equipped to build social capital than any other form of outreach, due to their relationship-building capacity (Gauntlett 2011; Kinder and Harland 2004). This article analyses Queensland University of Technology’s Creative Industries Widening Participation Program. It investigates social capital and its relationship with higher education in outreach initiatives in order to identify how media and creative arts widening participation programs have the capacity to influence the attitudes of low socioeconomic background students towards higher education.


This paper redefines the focus for narrating histories of education in the USA through a ‘glancing history’. It highlights the important role played by ‘not-dead-yet students’ who occupied a liminal place on the scale of life in the late nineteenth and early twentieth centuries. Traditional histories of education have been more singularly focused on the advent and dynamics of public schooling, ignoring the functionality of such child subjects to public schooling’s existence. This paper argues that public schools as historical objects cannot be understood outside of a broader trinary system of prior institutions that were established for ‘delinquent’ and ‘special’ children. These prior institutions facilitated the formation of ‘the public’ in public schooling less in opposition to ‘the private’ and more in consonance with ‘the human’. The existence of prior institutions enabled the enforcement of compulsory attendance legislation. Compulsory attendance legislation, in place across all existing states by 1918, was concerned more with the conditions for exclusion and exemption than with compelling attendance. Thus, at the most immediate level, this paper historicizes some of the discursive and hence institutional events that linked an array of tutelary complexes by the early 1900s, and which enabled such legislation. This part of the argument extends the notion of institution to consider broader places of confinement and systematicity. It examines the prior practice of reservation and slavery systems, and the efficacy they lent to further institutionalized segregation in the USA. At a second level, the narrative reflects on how such a narration has become possible. It considers how histories of education can currently be rethought and rewritten around the notion of dis/ability, historicizing the formation of dis/ability as identity categories made noticeable in part (and circularly) through the crystallization of a segregated but linked common schooling system. The paper thus provides a counter-memory against dominant economic foundationalist and psychomedical accounts of schooling’s past. It documents both ‘external’ conditions of possibility for public schooling’s emergence and ‘internal’ effects that emerged through the experiences of confinement.


Killing an estimated 2.9 million children annually, diarrhoeal disease is the second leading cause of child mortality worldwide. It is also the second leading cause of child mortality in Papua New Guinea (PNG), killing an average of 193 inpatient children per year over the period 1984-90. However, despite the high level of diarrhoea-related mortality and the proven efficacy of oral rehydration therapy (ORT) in managing diarrhoea-related dehydration, standardized ORT has been underutilized in PNG. The current glucose-based oral rehydration solution (ORS) does not reduce the frequency or volume of a child's diarrhoea, the most immediate concern of caregivers during episodes of illness. Cereal-based ORS, made from cereals commonly available as food staples in most countries, better addresses the short-term concerns of caregivers while offering a superior nutritional profile. To determine the acceptability of a rice-based ORS, the guardians of children brought to the Port Moresby General Hospital's Children's Outpatient Department with a chief complaint of diarrhoeal disease were questioned regarding their preference for glucose-based versus rice-based ORS. Of the 93 guardians interviewed, more than 60% preferred the glucose-based solution for its mixability, appearance and taste, and 65% initially reported that their children preferred the taste of the glucose solution. However, after a 30-minute trial, only 58% of children still preferred the glucose solution. In a country where diarrhoeal disease is a leading cause of child death and guardians are the primary health care providers, the acceptability of an ORS is critical to reducing the morbidity and mortality of Papua New Guinea's children.


Many software applications extend their functionality by dynamically loading libraries into their allocated address space. However, shared libraries are also often of unknown provenance and quality and may contain accidental bugs or, in some cases, deliberately malicious code. Most sandboxing techniques which address these issues require recompilation of the libraries using custom tool chains, require significant modifications to the libraries, do not retain the benefits of single address-space programming, do not completely isolate guest code, or incur substantial performance overheads. In this paper we present LibVM, a sandboxing architecture for isolating libraries within a host application without requiring any modifications to the shared libraries themselves, while still retaining the benefits of a single address space and also introducing a system call inter-positioning layer that allows complete arbitration over a shared library’s functionality. We show how to utilize contemporary hardware virtualization support towards this end with reasonable performance overheads and, in the absence of such hardware support, our model can also be implemented using a software-based mechanism. We ensure that our implementation conforms as closely as possible to existing shared library manipulation functions, minimizing the amount of effort needed to apply such isolation to existing programs. Our experimental results show that it is easy to gain immediate benefits in scenarios where the goal is to guard the host application against unintentional programming errors when using shared libraries, as well as in more complex scenarios, where a shared library is suspected of being actively hostile. In both cases, no changes are required to the shared libraries themselves.
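LibVM itself relies on hardware virtualisation support and cannot be reproduced in a few lines, but the inter-positioning idea it describes (every call into guest code passes through an arbitration layer that enforces a policy) can be illustrated in miniature with a Python proxy. This is a conceptual analogy only, assuming an arbitrary module stands in for an untrusted shared library:

```python
# Minimal sketch of call inter-positioning: a proxy that arbitrates
# every lookup on a "guest" module against an explicit allow-list,
# denying anything outside the stated policy.
import math  # stands in for an untrusted shared library

class Sandbox:
    def __init__(self, guest, allowed):
        self._guest = guest
        self._allowed = set(allowed)

    def __getattr__(self, name):
        # Arbitration point: every access to guest functionality
        # is checked against the policy before being forwarded.
        if name not in self._allowed:
            raise PermissionError(f"call to {name!r} blocked by policy")
        return getattr(self._guest, name)

safe_math = Sandbox(math, allowed={"sqrt", "floor"})
safe_math.sqrt(16.0)  # permitted by the policy
```

In LibVM the equivalent arbitration happens at the system-call boundary and with genuine memory isolation; the proxy above only conveys the shape of the interface, not its security guarantees.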


The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
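The "tweeting styles" breakdown described above, i.e. the mixture of original messages, @replies, and retweets, reduces to a simple classification over raw tweet text. A minimal sketch with invented example tweets, not records from the 2012 collection:

```python
# Classify tweets into the three style categories used in the analysis
# and tally the mixture for an account's timeline.
from collections import Counter

def tweet_style(text):
    """Label a tweet as a retweet, an @reply, or an original message."""
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "reply"
    return "original"

tweets = [
    "Four more years.",
    "@BarackObama congratulations!",
    "RT @Obama2012: Get out and vote.",
]
mix = Counter(tweet_style(t) for t in tweets)
```

Comparing such mixtures across accounts and across time is one way to quantify whether an account mainly broadcasts prepared messages or engages with followers.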


For a multi-armed bandit problem with exponential discounting, the optimal allocation rule is defined by a dynamic allocation index defined for each arm on its state space. The index for an arm is equal to the expected immediate reward from the arm, with an upward adjustment reflecting any uncertainty about the prospects of obtaining rewards from the arm, and the possibilities of resolving those uncertainties by selecting that arm. Thus the learning component of the index is defined to be the difference between the index and the expected immediate reward. For two arms with the same expected immediate reward, the learning component should be larger for the arm for which the reward rate is more uncertain. This is shown to be true for arms based on independent samples from a fixed distribution with an unknown parameter in the cases of Bernoulli and normal distributions, and similar results are obtained in other cases.
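The paper's results are analytic, but for the Bernoulli case the dynamic allocation index can be approximated numerically by calibrating each Beta-posterior state against a standard arm with known reward rate. A sketch of that standard calibration approach; the discount factor and truncation horizon are illustrative choices, not values from the paper:

```python
# Numerical sketch of the dynamic allocation (Gittins) index for a
# Bernoulli arm with a Beta(a, b) posterior under geometric discounting.
# The index is the smallest standard-arm rate lam at which retiring
# immediately is as good as continuing to sample the arm.

def gittins_index(a, b, beta=0.9, horizon=80, tol=1e-4):
    def continuation(lam):
        memo = {}

        def V(x, y):
            if (x - a) + (y - b) >= horizon:  # truncate the state space
                return lam / (1 - beta)
            if (x, y) not in memo:
                p = x / (x + y)               # posterior mean reward
                cont = p * (1 + beta * V(x + 1, y)) + (1 - p) * beta * V(x, y + 1)
                memo[(x, y)] = max(lam / (1 - beta), cont)
            return memo[(x, y)]

        p = a / (a + b)
        return p * (1 + beta * V(a + 1, b)) + (1 - p) * beta * V(a, b + 1)

    lo, hi = a / (a + b), 1.0                 # index lies above the mean
    while hi - lo > tol:
        lam = (lo + hi) / 2
        if continuation(lam) > lam / (1 - beta):
            lo = lam                          # continuing still wins: raise lam
        else:
            hi = lam
    return (lo + hi) / 2

# Two arms with the same expected immediate reward (0.5): the learning
# component is larger for the less-sampled, more uncertain arm.
uncertain = gittins_index(1, 1)    # Beta(1, 1): broad posterior
confident = gittins_index(10, 10)  # Beta(10, 10): same mean, tighter posterior
```

Both indices exceed the common posterior mean of 0.5, and the Beta(1, 1) arm carries the larger learning component, matching the abstract's claim.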


Developments of surgical attachments for bone-anchored prostheses are slowly but surely winning over the initial disbelief in the orthopedic community. Clearly, this option is becoming accessible to a wide range of individuals with limb loss. Seminal studies have demonstrated that the pioneering procedure relying on screw-type fixation engenders major clinical benefits and acceptable safety. The surgical procedure for press-fit implants, such as the Integral-Leg-Prosthesis (ILP), has been described by Dr Aschoff and his team. Some clinical benefits of press-fit implants have also been established. Here, his team is once again taking a leading role by sharing the progression over 15 years of the rate of deep infections for 69 individuals with transfemoral amputation fitted with three successive refined versions of the ILP. By definition, a double-blind randomized clinical trial to test the effect of different fixation designs is difficult. Alternatively, Juhnke and colleagues report the outcomes of an action-research study for a cohort of participants. The first and foremost important outcome of this study is the confirmation that the current design of the ILP and the rehabilitation program are altogether leading to an acceptable rate of deep infection and other adverse events (e.g., structural failure of the implant, periprosthetic fractures). This study also provides a strong insight into the effect of major phases in the redesign of an implant on the risk of infection. This is an important reminder that the development of a successful osseointegrated implant is unlikely to be immediate but is, rather, the result of a learning curve made of empirical and sequential changes led by a reflective clinical practice. Clearly, this study provided a better understanding of the safety of the ILP surgical and rehabilitation procedure while establishing standards and benchmark data for future studies focusing on the design and infection of press-fit implants. Complementary observations of the relationship between infection and confounders such as loading of the prosthesis and the prosthetic components used would be beneficial. Further definitive evidence of the clinical benefits of the latest design would be valuable, although improvements in health-related quality of life and functional outcomes are likely to be confirmed. Altogether, the authors provide compelling evidence that bone-anchored attachments, particularly those relying on press-fit implants, are an established alternative to socket prostheses.


Acquiring detailed knowledge of surface treatment effectiveness is required to improve performance-based decisions for allocating resources to preserve and maintain pavements on any road network. Measurement of treatment effectiveness is a complex task that requires historical records of treatments with observations of before and after performance trends. Lack of data is often an obstacle that impedes the development and incorporation of surface maintenance treatments into pavement management. This paper analyzes the effect of surface treatments on asphalt-paved arterial roads for several control sections in New Brunswick. The method uses a Transition Probability Matrix to capture main effects by mapping mean trends of surface improvement and pavement structure decay. It was found that surface treatments have an immediate effect in reducing the rate of loss of structural capacity. Pavements with an international roughness index (IRI) smaller than 1.4 m/km did not seem to benefit from surface treatments. Those with IRI higher than 1.66 m/km gained from 6 to 8 years of additional life. The reset value for surface treatments falls between 1.18 and 1.29 m/km. This paper aims to serve practitioners seeking to capture and incorporate the effectiveness of surface treatments (e.g., crack-sealing) into pavement management.
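The Transition Probability Matrix approach described above propagates a distribution over condition states forward year by year, so treatment effects can be read off by comparing trajectories with and without an intervention. A toy sketch; the four IRI condition bands and every probability below are invented for illustration, not the New Brunswick estimates:

```python
# Hypothetical 4-state condition model (state 0 = best IRI band,
# state 3 = worst). Each row of P gives one-year transition
# probabilities under natural deterioration.
P = [
    [0.80, 0.20, 0.00, 0.00],
    [0.00, 0.75, 0.25, 0.00],
    [0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 1.00],
]

# A surface treatment modelled as a reset: condition improves one band
# (the two best bands map to the best band).
R = [
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
]

def step(state, M):
    """One application of a transition matrix to a state distribution."""
    n = len(state)
    return [sum(state[i] * M[i][j] for i in range(n)) for j in range(n)]

def evolve(state, M, years):
    for _ in range(years):
        state = step(state, M)
    return state

start = [1.0, 0.0, 0.0, 0.0]                     # newly built section
no_treatment = evolve(start, P, 10)              # 10 years of decay
treated = step(evolve(start, P, 5), R)           # crack-seal at year 5
with_treatment = evolve(treated, P, 5)           # then 5 more years
```

Comparing the probability mass in the worst band with and without the year-5 treatment gives a simple analogue of the "additional life" estimates reported in the paper.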


Changing the topology of a railway network can greatly affect its capacity. Railway networks, however, can be altered in a multitude of different ways. As each way has significant immediate and long-term financial ramifications, it is a difficult task to decide how and where to expand the network. In response, railway capacity expansion models (RCEM) have been developed to help capacity planning activities and to remove physical bottlenecks in the current railway system. The exact purpose of these models is to decide, given a fixed budget, where track duplications and track subdivisions should be made in order to increase theoretical capacity most. These models are high level and strategic, which is why increases to theoretical capacity are concentrated upon. The optimization models have been applied to a case study to demonstrate their application and their worth. The case study evidently shows how automated approaches of this nature could be a formidable alternative to current manual planning techniques and simulation. If the exact effect of track duplications and subdivisions can be sufficiently approximated, this approach will be very applicable.
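At its core, the decision described above (given a fixed budget, which duplications and subdivisions increase theoretical capacity most) is a knapsack-style selection. A toy sketch with invented candidate upgrades, costs, and capacity gains, not data from the case study:

```python
# Brute-force selection of the upgrade subset that maximises theoretical
# capacity gain within a fixed budget (fine for small candidate lists).
from itertools import combinations

# Candidate upgrades: (name, cost, capacity gain); all values invented.
upgrades = [
    ("duplicate A-B", 40, 11),
    ("duplicate B-C", 30, 7),
    ("subdivide C-D", 20, 5),
    ("subdivide D-E", 25, 6),
]
BUDGET = 60

def best_plan(upgrades, budget):
    """Return the feasible subset of upgrades with the largest total gain."""
    best = ((), 0)
    for r in range(len(upgrades) + 1):
        for combo in combinations(upgrades, r):
            cost = sum(c for _, c, _ in combo)
            gain = sum(g for _, _, g in combo)
            if cost <= budget and gain > best[1]:
                best = (combo, gain)
    return best

plan, gain = best_plan(upgrades, BUDGET)
```

Real RCEMs replace the additive gain assumption with a network capacity model, since the benefit of one upgrade can depend on which others are built; the selection structure, however, is the same.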


Anticipating the number and identity of bidders has significant influence on many theoretical results of the auction itself and on bidders' bidding behaviour. This is because when a bidder knows in advance which specific bidders are likely competitors, this knowledge gives a company a head start when setting the bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder's probability of participating in a future auction as a function of tender economic size, removing the bias caused by the distribution of contract size opportunities. This way, a bidder or auctioneer will be able to estimate the likelihood that a specific group of key, previously identified bidders will participate in a future tender.
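The estimate described above, a bidder's participation probability as a function of tender economic size corrected for how often contracts of each size are offered, can be sketched as a size-binned frequency normalised by the opportunity distribution. All tender data below are invented for illustration:

```python
# Size-binned participation rate for one bidder, normalised by the
# number of opportunities in each bin to remove the bias caused by
# some size ranges simply being tendered more often than others.
from collections import Counter

# Invented history: economic sizes (in $M) of tenders offered, and the
# sizes of those tenders in which this particular bidder participated.
offered = [1, 2, 2, 3, 5, 5, 5, 8, 10, 10]
participated = [2, 5, 5, 8, 10]

def size_bin(size):
    """Coarse economic-size bins: small (<3), medium (3-7), large (>=8)."""
    return "small" if size < 3 else "medium" if size < 8 else "large"

opportunities = Counter(size_bin(s) for s in offered)
bids = Counter(size_bin(s) for s in participated)

participation = {b: bids[b] / opportunities[b] for b in opportunities}
```

A raw count of past bids would overstate activity in frequently tendered size ranges; dividing by the opportunity count in each bin yields a per-opportunity participation probability instead.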


Acetaminophen (paracetamol) is available in a wide range of oral formulations designed to meet the needs of the population across the age spectrum, but for people with impaired swallowing, i.e. dysphagia, both solid and liquid medications can be difficult to swallow without modification. The effect of a commercial polysaccharide thickener, designed to be added to fluids to promote safe swallowing by dysphagic patients, on rheology and acetaminophen dissolution was tested using crushed immediate-release tablets in water, effervescent tablets in water, elixir and suspension. The inclusion of the thickener, comprised of xanthan gum and maltodextrin, had a considerable impact on dissolution; acetaminophen release from modified medications reached only 12-50% in 30 minutes, which did not meet the pharmacopoeial specification for immediate-release preparations. Flow curves reflect the high zero-shear viscosity and the apparent yield stress of the thickened products. The weak-gel nature, in combination with high G' values compared to G'' (viscoelasticity) and high apparent yield stress, impacts drug release. The restriction on drug release from these formulations is not influenced by the theoretical state of the drug (dissolved or dispersed), and the approach typically used in clinical practice (mixing crushed tablets into pre-prepared thickened fluid) cannot be improved by altering the order of incorporation or mixing method.


The internet erupted in outrage last week at reports that Twitter is poised to increase the limit for tweets from 140 to 10,000 characters. The first rumours of such a move emerged on the tech news website Re/code back in September, then again last week. The response on Twitter was immediate and, for the most part, somewhere between incensed and bemused, with many thousands of tweets posted with hashtags such as #10kTwitter, #Twitter10k, #10000gate, #140twitter, #beyond140 and #longtweets...