Abstract:
Precise identification of the time when a change in a hospital outcome has occurred enables clinical experts to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for the survival time of a clinical procedure in the presence of patient mix in a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change in the mean survival time of patients who underwent cardiac surgery. The data are right censored since monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to surgery using a Weibull accelerated failure time regression model. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters, including the location and magnitude of changes, as well as the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time CUSUM control charts across different magnitude scenarios. The proposed estimator performs better when a longer follow-up period (censoring time) is applied. In comparison with the alternative built-in CUSUM estimator, the Bayesian estimator yields more accurate and precise estimates. These advantages are even greater when the probability quantification, flexibility and generalizability of the Bayesian change point detection model are also considered.
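To make the setting concrete, the following is a minimal, illustrative sketch, not the authors' model: the shape parameter, sample size, change location and follow-up period are all invented assumptions. It simulates a step change in right-censored Weibull survival times and estimates the change point by profile likelihood. With the shape k treated as known, the transformed times y = t**k are exponential, so each segment's rate MLE is available in closed form even under censoring.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed, illustrative parameters (not from the paper)
K = 1.5              # Weibull shape, treated as known
TAU = 120            # true change point (patient index)
N, FOLLOW_UP = 200, 3.0

scale = np.where(np.arange(N) < TAU, 2.0, 1.0)   # mean survival time drops
t_true = scale * rng.weibull(K, size=N)
time = np.minimum(t_true, FOLLOW_UP)             # right censoring at follow-up
event = t_true <= FOLLOW_UP                      # True if death observed

y = time ** K        # exponential on the transformed timescale

def seg_loglik(idx):
    # Profile log-likelihood of one segment under a censored exponential model:
    # rate MLE is d / Y, giving d*log(d/Y) - d at the maximum
    d, Y = event[idx].sum(), y[idx].sum()
    return -np.inf if d == 0 else d * np.log(d / Y) - d

# Grid search over candidate change points, keeping both segments non-trivial
cands = range(10, N - 10)
tau_hat = max(cands, key=lambda j: seg_loglik(slice(0, j)) + seg_loglik(slice(j, N)))
print(tau_hat)
```

A Bayesian treatment as in the abstract would place priors on the change location and magnitude and sample them by MCMC; the grid search above is only the maximum-likelihood analogue of that idea.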
Abstract:
The publication of Os sertões in 1902 changed the course of thinking about the War of Canudos, which for many years had been known simply as 'the history of Euclides'. President Getúlio Vargas became interested in the backlands bloodbath after reading Euclides's avenging book. He liked the work so much that he visited the site of the war, promising to harness the waters of the Vaza-Barris river by building the Cocorobó dam. Euclides da Cunha lived and produced his work at a time of great transformation in thought, politics and technology. Although he worked in the press throughout his life, he was best known as an engineer, having practised that profession during the reconstruction of the bridge at São José do Rio Pardo. This article aims to illuminate the event of the war in light of the Euclidean oeuvre. We examine the trajectory of Euclides da Cunha in journalism: his learning process in the craft of news writer and war correspondent for the newspaper O Estado de S. Paulo, as well as his reports and his monument-work Os sertões.
Abstract:
Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport are a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of these models. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and literature findings, guidance is provided for the development of future airport terminal models.
Abstract:
Exploiting wind energy is one possible way to extend flight duration for Unmanned Aerial Vehicles (UAVs). Wind energy can also be used to minimise energy consumption along a planned path. In this paper, we consider uncertain time-varying wind fields and plan a path through them. A Gaussian distribution is used to model the uncertainty in the time-varying wind fields. We use a Markov decision process (MDP) to plan a path based upon this uncertainty. Simulation results are presented that compare the direct line of flight between the start and target points with our planned path, in terms of energy consumption and time of travel. The result is a robust path constructed from the most visited cells while sampling the Gaussian distribution of the wind field in each cell.
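As an illustration of the planning idea, here is a minimal value-iteration sketch on a toy grid. The grid size, gust probability and energy numbers are all invented for this sketch; the paper's wind model is Gaussian and time-varying, which the toy collapses into a single gust probability per move.

```python
import numpy as np

# Toy model (all names and numbers are illustrative assumptions): value
# iteration on a small grid where a stochastic wind gust may push the
# vehicle one extra cell east after any move.
H, W = 5, 7
GOAL = (2, 6)
ACTIONS = {"N": (-1, 0), "S": (1, 0), "E": (0, 1), "W": (0, -1)}
P_GUST = 0.3          # probability a gust displaces the UAV one cell east
STEP_COST = 1.0       # nominal energy per move
GUST_BONUS = 0.4      # an eastward gust saves energy when flying east

def clamp(r, c):
    # Keep the state inside the grid
    return min(max(r, 0), H - 1), min(max(c, 0), W - 1)

V = np.zeros((H, W))  # expected energy-to-go, goal fixed at zero
for _ in range(200):  # value iteration until convergence
    V_new = np.zeros_like(V)
    for r in range(H):
        for c in range(W):
            if (r, c) == GOAL:
                continue
            best = np.inf
            for (dr, dc) in ACTIONS.values():
                nominal = clamp(r + dr, c + dc)
                gusted = clamp(nominal[0], nominal[1] + 1)  # wind pushes east
                cost = STEP_COST - (GUST_BONUS if dc == 1 else 0) * P_GUST
                q = cost + (1 - P_GUST) * V[nominal] + P_GUST * V[gusted]
                best = min(best, q)
            V_new[r, c] = best
    V = V_new

print(V[2, 0])  # expected energy-to-go from the west edge
```

In a fuller treatment each cell would carry its own sampled Gaussian wind distribution, and the transition probabilities and energy costs would be derived from those samples rather than from fixed constants.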
Abstract:
Blogs and other online platforms for personal writing such as LiveJournal have been of interest to researchers across the social sciences and humanities for a decade now. Although growth in the uptake of blogging has stalled somewhat since the heyday of blogs in the early 2000s, blogging continues to be a major genre of Internet-based communication. Indeed, at the same time that mass participation has moved on to Facebook, Twitter, and other more recent communication phenomena, what has been left behind by the wave of mass adoption is a slightly smaller but all the more solidly established blogosphere of engaged and committed participants. Blogs are now an accepted part of institutional, group, and personal communications strategies (Bruns and Jacobs, 2006); in style and substance, they are situated between the more static information provided by conventional Websites and Webpages and the continuous newsfeeds provided through Facebook and Twitter updates. Blogs provide a vehicle for authors (and their commenters) to think through given topics in the space of a few hundred to a few thousand words – expanding, perhaps, on shorter tweets, and possibly leading to the publication of more fully formed texts elsewhere. Additionally, they are also a very flexible medium: they readily provide the functionality to include images, audio, video, and other additional materials – as well as the fundamental tool of blogging, the hyperlink itself. Indeed, the role of the link in blogs and blog posts should not be underestimated. 
Whatever the genre and topic that individual bloggers engage in, for the most part blogging is used to provide timely updates and commentary – and it is typical for such material to link both to relevant posts made by other bloggers, and to previous posts by the present author, both to background material which provides readers with further information about the blogger’s current topic, and to news stories and articles which the blogger found interesting or worthy of critique. Especially where bloggers are part of a larger community of authors sharing similar interests or views (and such communities are often indicated by the presence of yet another type of link – in blogrolls, often in a sidebar on the blog site, which list the blogger’s friends or favourites), the reciprocal writing and linking of posts often constitutes an asynchronous, distributed conversation that unfolds over the course of days, weeks, and months. Research into blogs is therefore interesting for a variety of reasons. For one, a qualitative analysis of one or several blogs can reveal the cognitive and communicative processes through which individual bloggers define their online identity, position themselves in relation to fellow bloggers, frame particular themes, topics and stories, and engage with one another’s points of view. It may also shed light on how such processes may differ across different communities of interest, perhaps in correlation with the different societal framing and valorisation of specific areas of interest, with the socioeconomic backgrounds of individual bloggers, or with other external or internal factors. Such qualitative research now looks back on a decade-long history (for key collections, see Gurak, et al., 2004; Bruns and Jacobs, 2006; also see Walker Rettberg, 2008) and has recently shifted also to specifically investigate how blogging practices differ across different cultures (Russell and Echchaibi, 2009).
Other studies have also investigated the practices and motivations of bloggers in specific countries from a sociological perspective, through large-scale surveys (e.g. Schmidt, 2009). Blogs have also been directly employed within both K-12 and higher education, across many disciplines, as tools for reflexive learning and discussion (Burgess, 2006).
Abstract:
The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010-2011 to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb’s initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
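For readers unfamiliar with the quantity involved, the standard definition can be stated concisely (the notation here is a common convention and may differ from the authors'). If c(x,t) evolves from an initial state to a steady state, the approach to steady state is tracked by a transition variable F that runs from 0 to 1:

```latex
% Transition variable tracking the approach to steady state
F(x,t) = 1 - \frac{c(x,t) - c_{\infty}(x)}{c_{0}(x) - c_{\infty}(x)},
\qquad F(x,0) = 0, \qquad \lim_{t \to \infty} F(x,t) = 1.

% Mean action time: the mean of t with density \partial F / \partial t,
% which integration by parts reduces to
T(x) = \int_{0}^{\infty} t \, \frac{\partial F}{\partial t} \, dt
     = \int_{0}^{\infty} \bigl[ 1 - F(x,t) \bigr] \, dt .
```

The second form is what makes the abstract's claim plausible: the time integral of 1 - F typically satisfies a boundary value problem of its own, so T(x) can be computed without solving the full time-dependent PDE.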
Abstract:
The driver response (reaction) time (t_r) of the second queuing vehicle is generally longer than that of other vehicles at signalized intersections. Though this phenomenon was revealed in 1972, the factor is still ignored in conventional departure models. This paper highlights the need for quantitative measurement and analysis of queuing-vehicle performance in the spontaneous discharge pattern, because it can improve microsimulation. Video recordings from major cities in Australia, plus twenty-two sets of vehicle trajectories extracted from the Next Generation Simulation (NGSIM) Peachtree Street Dataset, have been analyzed to better understand queuing-vehicle performance in the discharge process. Findings from this research will help account for driver response time and can also be used for the calibration of microscopic traffic simulation models.
Abstract:
This paper presents the benefits and issues related to travel time prediction on urban networks. Travel time information quantifies congestion and is perhaps the most important network performance measure. Travel time prediction has been an active area of research for the last five decades. Activities related to ITS have increased researchers' attention to better and more accurate real-time prediction of travel time. The majority of the literature on travel time prediction is applicable to freeways where, under non-incident conditions, traffic flow is not affected by external factors such as traffic control signals and opposing traffic flows. In urban environments the problem is more complicated, due to conflict areas (intersections), mid-link sources and sinks, etc., and needs to be addressed.
Abstract:
Raman spectroscopy, when used in spatially offset mode, has become a potential tool for the identification of explosives and other hazardous substances concealed in opaque containers. The molecular fingerprinting capability of Raman spectroscopy makes it an attractive tool for the unambiguous identification of hazardous substances in the field. Additionally, minimal sample preparation is required compared with other techniques. We report a field-portable, time-resolved Raman sensor for the detection of concealed chemical hazards in opaque containers. The new sensor uses a pulsed nanosecond laser source in conjunction with an intensified CCD detector, and employs a combination of time- and space-resolved Raman spectroscopy to enhance detection capability. The sensor can identify concealed hazards in a single measurement, without any chemometric data treatment.
Abstract:
The fashion ecosystem is at boiling point as consumers turn up the heat in all areas of the fashion value, trend and supply chain. While traditionally fashion has been a monologue from designer brand to consumer, new technology and the virtual world have given consumers a voice to engage brands in a conversation to express evolving needs, ideas and feedback. Product customisation is no longer innovative. Successful brands are including customers in the design process and holding conversations ‘with’ them to improve product, manufacturing, sales, distribution, marketing and sustainable business practices. Co-creation and crowdsourcing are integral to any successful business model, and designers and manufacturers are supplying the technology or tools for these creative, active, participatory ‘prosumers’. With this collaboration, however, there arises a worrying trend for fashion professionals. The ‘design it yourself’ ‘indiepreneur’, who, with the combination of technology, the internet, excess manufacturing capacity, crowdfunding and the idea of sharing the creative integrity of a product (‘copyleft’ not copyright), is challenging the notion that the fashion supply chain is complex. The passive ‘consumer’ no longer exists. Fashion designers now share the stage with ‘amateur’ creators who are disrupting every activity they touch, while being motivated by profit as well as a quest for originality and innovation. This paper examines the effects this ‘consumer’ engagement is having on traditional fashion models and the fashion supply chain. Crowdsourcing, crowdfunding, co-creating, design it yourself, global sourcing, the virtual supply chain, social media, online shopping, group buying, consumer-to-consumer marketing and retail, and branding the ‘individual’ are indicative of the new consumer-driven fashion models. Consumers now drive the fashion industry, from setting trends through to creating, producing, selling and marketing product.
They can turn up the heat at any time, and at any point, in the fashion supply chain. They are raising the temperature at each and every stage of the chain, decreasing or eliminating the processes involved: decreasing the risk of fashion obsolescence, quantities for manufacture, the complexity of distribution and the consumption of product; eliminating certain stages altogether; and limiting the brand as custodian of marketing. Some brands are discovering a new ‘enemy’: the very people they are trying to sell to.
Keywords: fashion supply chain, virtual world, consumer, ‘prosumers’, co-creation, fashion designers
Abstract:
A new spatial logic, encompassing redefined concepts of time and place, space and distance, requires a comprehensive shift in the approach to designing workplace environments for today’s adaptive, collaborative organizations operating in a dynamic business world. Together with substantial economic and cultural shifts and an increased emphasis on lifestyle considerations, advances in information technology have prompted a radical re-ordering of organizational relationships and the associated structures, processes, and places of doing business. Within the duality of space and an augmentation of the traditional notions of place, organizational and institutional structures pose new challenges for the design professions. The literature reveals that workplace design strategies have always had a mono-organizational focus; the burgeoning trend towards inter-organizational collaboration therefore enabled the identification of a gap in the knowledge relating to workplace design. The NetWorkPlace™ constitutes a multi-dimensional concept with the capacity to deal with the fluidity and ambiguity characteristic of the network context, both as a topic of research and as a way of going about it.
Abstract:
Using Gray and McNaughton’s (2000) revised Reinforcement Sensitivity Theory (r-RST), we examined the influence of personality on the processing of words presented in gain-framed and loss-framed anti-speeding messages, and how the processing biases associated with personality influenced message acceptance. The r-RST predicts that the nervous system regulates personality and that behaviour is dependent upon the activation of the Behavioural Activation System (BAS), activated by reward cues, and the Fight-Flight-Freeze System (FFFS), activated by punishment cues. According to r-RST, individuals differ in the sensitivities of their BAS and FFFS (i.e., weak to strong), which in turn leads to stable patterns of behaviour in the presence of rewards and punishments, respectively. It was hypothesised that individual differences in personality (i.e., strength of the BAS and the FFFS) would influence the degree of both message processing (as measured by reaction time to previously viewed message words) and message acceptance (measured in three ways: perceived message effectiveness, behavioural intentions, and attitudes). Specifically, it was anticipated that individuals with a stronger BAS would process the words presented in the gain-frame messages faster than those with a weaker BAS, and individuals with a stronger FFFS would process the words presented in the loss-frame messages faster than those with a weaker FFFS. Further, it was expected that greater processing (faster reaction times) would be associated with greater acceptance of that message. Driver licence-holding students (N = 108) were recruited to view one of four anti-speeding messages (i.e., social gain-frame, social loss-frame, physical gain-frame, and physical loss-frame). A computerised lexical decision task assessed participants’ subsequent reaction times to message words, as an indicator of the extent of processing of the previously viewed message.
Self-report measures assessed personality and the three message acceptance measures. As predicted, the degree of initial processing of the content of the social gain-framed message mediated the relationship between the reward sensitive trait and message effectiveness. Initial processing of the physical loss-framed message partially mediated the relationship between the punishment sensitive trait and both message effectiveness and behavioural intention ratings. These results show that reward sensitivity and punishment sensitivity traits influence cognitive processing of gain-framed and loss-framed message content, respectively, and subsequently, message effectiveness and behavioural intention ratings. Specifically, a range of road safety messages (i.e., gain-frame and loss-frame messages) could be designed which align with the processing biases associated with personality and which would target those individuals who are sensitive to rewards and those who are sensitive to punishments.
Abstract:
A wireless sensor network collected real-time water-quality measurements to investigate how current irrigation practices—in particular, underground water salination—affect the environment. New protocols provided high end-to-end packet delivery rates in the hostile deployment environment.
Abstract:
Serving as a powerful tool for extracting localized variations in non-stationary signals, applications of wavelet transforms (WTs) in traffic engineering have been introduced; however, they lack some important theoretical fundamentals. In particular, little guidance is provided on selecting an appropriate WT across potential transport applications. The research described in this paper contributes uniquely to the literature by first describing a numerical experiment to demonstrate the shortcomings of commonly used data-processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order differencing, oblique cumulative curves, and the short-time Fourier transform). It then mathematically describes the WT’s ability to detect singularities in traffic data. Next, selecting a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets’ performances in a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that selecting a suitable wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives a satisfactory performance in detecting singularities in traffic and vehicular data.
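As a minimal illustration of singularity detection with the Mexican hat wavelet (the synthetic speed trace, analysis widths and wavelet window length below are invented for this sketch, not taken from the paper), the continuous wavelet transform can be approximated with plain convolutions:

```python
import numpy as np

def ricker(points, a):
    # Mexican hat (Ricker) wavelet: second derivative of a Gaussian, width a
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt_ricker(x, widths):
    # Continuous wavelet transform approximated by discrete convolution,
    # one row of coefficients per analysis scale
    out = np.empty((len(widths), len(x)))
    for i, w in enumerate(widths):
        out[i] = np.convolve(x, ricker(min(10 * w, len(x)), w), mode="same")
    return out

# Synthetic loop-detector speed trace: free flow, then an abrupt
# breakdown (a singularity) at sample 300
rng = np.random.default_rng(0)
speed = np.where(np.arange(600) < 300, 100.0, 40.0) + rng.normal(0.0, 1.0, 600)

coeffs = cwt_ricker(speed - speed.mean(), widths=[4, 8, 16])  # demean to tame edge effects
breakdown = int(np.argmax(np.abs(coeffs).max(axis=0)))        # strongest response across scales
print(breakdown)  # lands within about one wavelet width of sample 300
```

Because the Ricker wavelet has zero mean, smooth trends produce small coefficients while the abrupt breakdown produces a strong localized response, which is the property the paper exploits for traffic data.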
Abstract:
U-Healthcare means providing healthcare services "at anytime and anywhere" using wired, wireless and ubiquitous sensor network technologies. As a main field of U-Healthcare, Telehealth has been developed as an enhancement of Telemedicine. The system includes two-way interactive web-video communications, sensor technology, and health informatics. With these components, it can assist patients in receiving an initial diagnosis. Furthermore, Telehealth will help doctors diagnose patients' diseases at early stages and recommend treatments. However, the system has a few limitations, such as privacy issues, interruption of real-time service, and incorrect orders arising from remote diagnosis. To deal with these flaws, security procedures such as authorised access should be applied as an indispensable component in the medical environment. As a consequence, a Telehealth system with these protection procedures in clinical services will cope with the anticipated vulnerabilities of U-Healthcare services and the security issues involved.