Abstract:
Irritable bowel syndrome (IBS) is a common chronic disorder with a prevalence of 5 to 10% of the world's population. The condition is characterised by abdominal discomfort or pain, altered bowel habits, and often bloating and abdominal distension. IBS reduces quality of life to the same degree as major chronic diseases such as congestive heart failure and diabetes, and the economic burden on the health care system and society is high. Abnormalities have been reported in the neuroendocrine peptides/amines of the stomach and the small and large intestine in patients with IBS. These abnormalities are thought to cause the disturbances in digestion, gastrointestinal motility and visceral hypersensitivity that have been reported in patients with IBS, and they appear to play a central role in symptom development and in the pathogenesis of IBS. Neuroendocrine peptides/amines are potential tools in the diagnosis and treatment of IBS. In particular, the density of duodenal chromogranin A-expressing cells appears to be a good histopathological marker for the diagnosis of IBS, with high sensitivity and specificity.
Abstract:
This volume brings together the work of a group of distinguished scholars and active researchers in the field of media and communication studies to reflect upon the past, present, and future of new media research. The chapters examine the implications of new media technologies for everyday life, existing social institutions, and society at large at various levels of analysis. Macro-level analyses of changing techno-social formations – such as discussions of the rise of the surveillance society and the "fifth estate" – are combined with studies of concrete and specific new media phenomena, such as the rise of Pro-Am collaboration and "fan labor" online. In the process, prominent concepts in the field of new media studies, such as social capital, displacement, and convergence, are critically examined, while new theoretical perspectives are proposed and explicated. Reflecting the interdisciplinary nature of new media studies and of communication research in general, the chapters interrogate the problematic through a range of theoretical and methodological approaches. The book offers students and researchers interested in the social impact of new media both critical reviews of the existing literature and inspiration for developing new research questions.
Abstract:
In this paper we present a new simulation methodology for obtaining exact or approximate Bayesian inference for models of low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with the observed data one point at a time, rather than attempting to match the full dataset simultaneously or a low-dimensional non-sufficient summary statistic, as is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. The algorithm produces an unbiased estimate of the likelihood, resulting in exact posterior inference when embedded in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as in ABC. A novel aspect of our approach is the introduction of auxiliary variables into the particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
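As a rough illustration of the matching step described above, the sketch below estimates a per-observation likelihood contribution by re-simulating until $N+1$ exact matches of the observed count are found; the estimator $N/(k-1)$, where $k$ is the number of simulations required, is unbiased for the match probability. The simulator interface (`simulate_next`) and the toy count model are hypothetical placeholders, not the model analysed in the paper.

```python
import numpy as np

def alive_match_estimate(simulate_next, states, y_obs, N, rng):
    """One step of an exact-matching particle filter.

    simulate_next(state, rng) -> (next_state, simulated_count)  # hypothetical interface
    states : current particle states (parents resampled uniformly here)
    y_obs  : the observed count to be matched exactly
    Returns N surviving states and an unbiased estimate of P(y_obs | past).
    """
    survivors, k = [], 0
    while len(survivors) < N + 1:          # simulate until N+1 exact matches
        parent = states[rng.integers(len(states))]
        next_state, y_sim = simulate_next(parent, rng)
        k += 1
        if y_sim == y_obs:                 # exact match with the observed count
            survivors.append(next_state)
    like_hat = N / (k - 1)                 # unbiased estimate of the match probability
    return survivors[:N], like_hat         # discard the (N+1)-th match

# Hypothetical low-valued count simulator: binomial thinning plus Poisson innovation
def simulate_next(state, rng):
    next_state = rng.binomial(state, 0.6) + rng.poisson(1.0)
    return next_state, next_state          # the observation is the count itself

rng = np.random.default_rng(1)
particles = [3] * 100
particles, like_hat = alive_match_estimate(simulate_next, particles, y_obs=3, N=100, rng=rng)
print(like_hat)
```

Repeating this step across the time series and multiplying the per-observation estimates gives the unbiased likelihood estimate that can be plugged into an MCMC sampler, in the spirit of the PMCMC framework the abstract describes.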
Abstract:
Health Informatics lies at the intersection of information technology and several disciplines of medicine and health care. It sits at the common frontiers of health care services, including patient-centric, process-driven and procedure-centric care. From the information technology perspective it can be viewed as the application of computing to medical and/or health processes to deliver better health care solutions. In spite of the exaggerated hype, this field is having a major impact on health care solutions, in particular health care delivery, decision making, medical devices and allied health care industries. It also affords enormous research opportunities for new methodological development. Despite the obvious connections between Medical Informatics, Nursing Informatics and Health Informatics, most of the methodologies and approaches used in Health Informatics have so far originated from health system management, aspects of care and medical diagnostics. This paper explores the reasoning for a domain knowledge analysis that would establish Health Informatics as a domain and have it recognised as an intellectual discipline in its own right.
Abstract:
Most national Health Information Systems (HIS) in resource-limited developing countries do not serve the purpose of management support, and the service is adversely affected as a result. While emphasising the importance of timely and accurate health information for decision making in healthcare planning, this paper explains that Health Management Information System failure is commonly seen in developing as well as developed countries. It is suggested that the possibility of applying the principles of Health Informatics and the technology of Decision Support Systems should be seriously considered to improve the situation. A brief scientific explanation of the evolution of these two disciplines is included.
Abstract:
Standardisation of validated communication protocols that aid in the adoption of policies, methods and tools in a secure eHealth setting requires a significant cultural shift among clinicians.
Abstract:
This paper describes the content and delivery of a software internationalisation subject (ITN677) that was developed for Master of Information Technology (MIT) students in the Faculty of Information Technology at Queensland University of Technology. This elective subject introduces students to the strategies, technologies, techniques and current developments associated with this growing 'software development for the world' specialty area. Students learn what is involved in planning and managing a software internationalisation project, as well as in designing, building and using a software internationalisation application. Students also learn how a software internationalisation project must fit into an overall product localisation and globalisation effort that may include culturalisation, tailored system architectures, and reliance upon industry standards. In addition, students are exposed to the different software development techniques used by organisations in this arena and to the perils and pitfalls of managing software internationalisation projects.
Abstract:
Genomics and genetic findings have been hailed with promises of unlocked codes and new frontiers of personalized medicine. Despite cautions about gene hype, the strong cultural pull of genes and genomics has allowed consideration of genomic personhood. Populated by the complicated records of the mass spectrometer, proteomics, which studies human proteins, has achieved neither the funding nor the popular cultural appeal proteomics scientists had hoped it would. While proteomics, being focused on the proteins that actually indicate and create disease states, has more direct potential for clinical application than genomic risk prediction, it has not, culturally, provided the material for identity creation. In our ethnographic research, we explore how proteomic scientists attempt to shape an appeal to personhood through which legitimacy may be defined.
Abstract:
Robotic systems are increasingly being utilised as fundamental data-gathering tools by scientists, allowing new perspectives and a greater understanding of the planet and its environmental processes. Today's robots are already exploring our deep oceans, tracking harmful algal blooms and pollution spread, monitoring climate variables, and even studying remote volcanoes. This article collates and discusses the significant advancements and applications of marine, terrestrial, and airborne robotic systems developed for environmental monitoring during the last two decades. Emerging research trends for achieving large-scale environmental monitoring are also reviewed, including cooperative robotic teams, robot and wireless sensor network (WSN) interaction, adaptive sampling and model-aided path planning. These trends offer efficient and precise measurement of environmental processes at unprecedented scales that will push the frontiers of robotic and natural sciences.
Abstract:
Literacy Theories for the Digital Age insightfully brings together six essential approaches to literacy research and educational practice. The book provides powerful and accessible theories for readers, including Socio-cultural, Critical, Multimodal, Socio-spatial, Socio-material and Sensory Literacies. The brand new Sensory Literacies approach is an original and visionary contribution to the field, coupled with a provocative foreword from leading sensory anthropologist David Howes. This dynamic collection explores a legacy of literacy research while showing the relationships between each paradigm, highlighting their complementarity and distinctions. This highly relevant compendium will inspire readers to explore new frontiers of thought and practice in times of diversity and technological change.
Abstract:
Pesticides used in agricultural systems must be applied in economically viable and environmentally sensitive ways, and this often requires expensive field trials on spray deposition and retention by plant foliage. Computational models that describe whether a spray droplet sticks (adheres), bounces or shatters on impact, and whether any rebounding parent or shatter daughter droplets are recaptured, would provide an estimate of spray retention and thereby act as a useful guide prior to any field trials. Parameter-driven interactive software has been implemented to enable the end-user to study and visualise droplet interception and impaction on a single, horizontal leaf. Living chenopodium, wheat and cotton leaves have been scanned to capture their surface topography, and realistic virtual leaf surface models have been generated. Individual leaf models have then been subjected to virtual spray droplets and predictions made of droplet interception with the virtual plant leaf. Thereafter, the impaction behaviour of the droplets and the subsequent behaviour of any daughter droplets, up until re-capture, are simulated to give the predicted total spray retention by the leaf. A series of critical thresholds for the stick, bounce and shatter elements of the impaction process has been developed for different combinations of formulation, droplet size and velocity, and leaf surface characteristics to provide this output. The results show that droplet properties, spray formulation and leaf surface characteristics all influence the predicted amount of spray retained on a horizontal leaf surface. Overall, the predicted spray retention increases as formulation surface tension, static contact angle, droplet size and velocity decrease. Predicted retention on cotton is much higher than on chenopodium. The average predicted retention on a single horizontal leaf across all droplet size, velocity and formulation scenarios tested is 18, 30 and 85% for chenopodium, wheat and cotton, respectively.
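The threshold-based decision described above can be pictured with a simple sketch: a dimensionless impact number (here the Weber number, We = ρ v² d / σ) is compared against bounce and shatter thresholds to classify the outcome of a single impact. The threshold values and the contact-angle rule below are illustrative assumptions, not the calibrated values used by the software described in the abstract.

```python
# Illustrative threshold classifier for droplet impaction outcome.
# Numerical thresholds are placeholders, not calibrated model values.

def weber_number(density, velocity, diameter, surface_tension):
    """Dimensionless impact energy: We = rho * v^2 * d / sigma."""
    return density * velocity**2 * diameter / surface_tension

def impaction_outcome(velocity_m_s, diameter_um, surface_tension_n_m,
                      contact_angle_deg, we_bounce=50.0, we_shatter=600.0):
    """Classify a single droplet impact as 'stick', 'bounce' or 'shatter'.

    Assumed behaviour: low impact energy sticks, intermediate energy bounces
    (more readily on hard-to-wet, high-contact-angle surfaces), high energy shatters.
    """
    we = weber_number(1000.0, velocity_m_s, diameter_um * 1e-6, surface_tension_n_m)
    if we >= we_shatter:
        return "shatter"
    # assume bouncing only occurs on poorly wetting surfaces (contact angle > 90 deg)
    if we >= we_bounce and contact_angle_deg > 90.0:
        return "bounce"
    return "stick"

# Hypothetical droplet: 300 um diameter, 4 m/s, water-like surface tension, waxy leaf
print(impaction_outcome(4.0, 300.0, 0.072, contact_angle_deg=150.0))
```

In the retrofit of such a classifier into a full retention model, each intercepted droplet (and any daughter droplets) would be passed through rules of this kind and the recaptured volume summed to give the predicted retention, which is the quantity the abstract reports per leaf type.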
Abstract:
Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. We show that another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), is fundamentally different from ABC II. We devise new theoretical results for pBIL to give extra insight into its behaviour and its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and on two substantive applications. The first of the substantive examples involves performing inference for complex quantile distributions based on simulated data, while the second involves estimating the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host, based on real data. We create a novel framework called Bayesian indirect likelihood (BIL) which encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
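As a rough sketch of the ABC II idea described above (using auxiliary-model parameter estimates as the summary statistic), the rejection sampler below fits a simple auxiliary model to both the observed and the simulated data and accepts a proposed parameter when the two sets of auxiliary estimates are close. The generative model, the normal auxiliary model and the tolerance are illustrative choices, not those used in the paper.

```python
import numpy as np

def aux_fit(data):
    """Auxiliary-model summary: MLE of a normal model (mean, log std)."""
    return np.array([data.mean(), np.log(data.std())])

def abc_ii_rejection(y_obs, simulate, prior_sample, n_props=20000, tol=0.1, seed=0):
    """ABC rejection with indirect-inference summaries.

    simulate(theta, rng) -> simulated dataset of the same size as y_obs
    prior_sample(rng)    -> a draw of theta from the prior
    Accept theta when the auxiliary estimates of simulated and observed data are close.
    """
    rng = np.random.default_rng(seed)
    s_obs = aux_fit(y_obs)
    accepted = []
    for _ in range(n_props):
        theta = prior_sample(rng)
        s_sim = aux_fit(simulate(theta, rng))
        if np.linalg.norm(s_sim - s_obs) < tol:
            accepted.append(theta)
    return np.array(accepted)

# Hypothetical stand-in for an intractable model: an exponential with unknown rate theta
simulate = lambda theta, rng: rng.exponential(1.0 / theta, size=200)
prior_sample = lambda rng: rng.uniform(0.1, 5.0)

y_obs = np.random.default_rng(42).exponential(1.0 / 2.0, size=200)  # "observed" data, rate 2
post = abc_ii_rejection(y_obs, simulate, prior_sample)
print(len(post), post.mean() if len(post) else None)
```

The pBIL approach contrasted in the abstract would instead evaluate the auxiliary likelihood itself at the simulated data, rather than comparing auxiliary parameter estimates as done here.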
Abstract:
The market for building retrofits is intensifying as existing buildings age. Building retrofit projects involve existing buildings, which impose constraints on stakeholders throughout the project process. They are also risky, complex, less predictable and difficult to plan well, with on-site waste becoming one of the critical issues. Small and Medium Enterprises (SMEs) carry out most of the work in retrofit projects as subcontractors, but they often do not have adequate resources to deal with the specific technical challenges and project risks related to waste. This paper first discusses the requirements of waste management in building retrofit projects in light of specific project characteristics and the nature of the work, and highlights the importance of involving SMEs in waste planning and management in an appropriate way. Using semi-structured interviews, this research develops a process model for SMEs to apply in waste management. A collaboration scenario is also developed for collaborative waste planning and management by SMEs as subcontractors and large companies as main contractors. Findings from the paper will promote coordination of project delivery and waste management in building retrofit projects, and improve the involvement and performance of SMEs in dealing with waste problems.
Abstract:
While scientists are still debating the extent to which climate change contributes to new weather patterns, there have been some devastating natural disasters worldwide in the last decade. From cyclones to earthquakes and from tsunamis to landslides, these disasters occur with formidable force and crushing effects. As one of the most important means of mitigating the negative impacts of natural disasters and supporting the recovery and redevelopment of affected areas, reconstruction is of utmost importance in light of sustainability objectives. However, current reconstruction practice attracts considerable criticism for focusing on providing short-term necessities. How to conduct post-disaster reconstruction from a long-term perspective and achieve sustainable development is therefore a key issue for industry practice and research. This paper introduces an on-going research project aimed at establishing an operational framework for improving the sustainability performance of post-disaster reconstruction by identifying critical sustainability factors and exploring their interrelationships. The research reported in this paper is part of that project. Following a comprehensive literature review, 17 potential critical sustainability factors for post-disaster reconstruction were identified, and a preliminary examination and discussion of these factors was conducted.
Abstract:
In 1963, the National Institutes of Health (NIH) first issued guidelines for animal housing and husbandry. The most recent 2010 revision emphasizes animal care “in ways judged to be scientifically, technically, and humanely appropriate” (National Institutes of Health, 2010, p. XIII). The goal of these guidelines is to ensure humane treatment of animals and to optimize the quality of research. Although these animal care guidelines cover a substantial amount of information regarding animal housing and husbandry, researchers generally do not report all of these variables (see Table 1). The importance of housing and husbandry conditions with respect to standardization across different research laboratories has been debated previously (Crabbe et al., 1999; Van Der Staay and Steckler, 2002; Wahlsten et al., 2003; Wolfer et al., 2004; Van Der Staay, 2006; Richter et al., 2010, 2011). This paper focuses on several animal husbandry and housing issues that are particularly relevant to stress responses in rats, including transportation, handling, cage changing, housing conditions, light levels and the light–dark cycle. We argue that these key animal housing and husbandry variables should be reported in greater detail in an effort to raise awareness of extraneous experimental variables, especially those that have the potential to interact with the stress response.