985 results for Temporal dimension
Abstract:
The quality of dried food is affected by a number of factors, including the quality of the raw material, the initial microstructure, and the drying conditions. The structure of food materials undergoes deformation due to the simultaneous heat and mass transfer during the drying process. Shrinkage and changes in porosity, microstructure and appearance are among the most remarkable features that directly influence overall product quality. Porosity and microstructure are important material properties in relation to the quality attributes of dried foods. Fractal dimension (FD) is a quantitative approach to measuring surface and pore characteristics and microstructural changes [1]. However, in the field of fractal analysis, there is a lack of research on developing relationships between the porosity, shrinkage and microstructure of different solid food materials under different drying processes and conditions [2-4]. Establishing a correlation between microstructure and porosity through fractal dimension during convective drying is the main objective of this work.
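Fractal dimension is typically estimated by box counting: count the occupied boxes N(s) at several box sizes s and take the slope of log N(s) against log(1/s). A minimal sketch; the diagonal-line test set and box sizes are invented for illustration (not data from the study), and a line should yield a dimension near 1:

```python
import math

# Box-counting estimate of fractal dimension: count occupied boxes N(s) at
# several box sizes s, then take the least-squares slope of log N(s)
# against log(1/s). The diagonal-line test set is invented (expected FD = 1).

def box_count_dimension(points, sizes):
    logs, logn = [], []
    for s in sizes:
        boxes = {(int(x / s), int(y / s)) for x, y in points}  # occupied boxes
        logs.append(math.log(1.0 / s))
        logn.append(math.log(len(boxes)))
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logn) / n
    num = sum((a - mx) * (b - my) for a, b in zip(logs, logn))
    return num / sum((a - mx) ** 2 for a in logs)              # LS slope

line = [(i / 10000.0, i / 10000.0) for i in range(10000)]
fd = box_count_dimension(line, [2.0 ** -k for k in range(2, 8)])
```

The same estimator applies to binarised micrographs of dried samples, with pixels as points.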
Abstract:
Lyngbya majuscula is a cyanobacterium (blue-green alga) occurring naturally in tropical and subtropical coastal areas worldwide. Deception Bay, in Northern Moreton Bay, Queensland, has a history of Lyngbya blooms and forms a case study for this investigation. The South East Queensland (SEQ) Healthy Waterways Partnership, a collaboration between government, industry, research and the community, was formed to address issues affecting the health of the river catchments and waterways of South East Queensland. The Partnership coordinated the Lyngbya Research and Management Program (2005-2007), which culminated in a Coastal Algal Blooms (CAB) Action Plan for harmful and nuisance algal blooms such as Lyngbya majuscula. This first phase of the project was predominantly scientific in nature and also facilitated the collection of additional data to better understand Lyngbya blooms. The second phase of the project, the SEQ Healthy Waterways Strategy 2007-2012, is now underway to implement the CAB Action Plan and as such is more management focussed. As part of the first phase of the project, a Science model for the initiation of a Lyngbya bloom was built using Bayesian Networks (BN). The structure of the Science Bayesian Network was built by the Lyngbya Science Working Group (LSWG), which was drawn from diverse disciplines. The BN was then quantified with annual data and expert knowledge. Scenario testing confirmed the expected temporal nature of bloom initiation, and it was recommended that the next version of the BN be extended to take this into account. Elicitation for this BN thus occurred at three levels: design, quantification and verification. The first level involved construction of the conceptual model itself, definition of the nodes within the model and identification of sources of information to quantify the nodes. The second level included elicitation of expert opinion and representation of this information in a form suitable for inclusion in the BN.
The third and final level concerned the specification of the scenarios used to verify the model. The second phase of the project provides the opportunity to update the network with the more detailed data collected during the previous phase. Specifically, the temporal nature of Lyngbya blooms is of interest: management efforts need to be directed to the periods when the Bay is most vulnerable to bloom initiation. To model the temporal aspects of Lyngbya we are using Object Oriented Bayesian Networks (OOBN) to create ‘time slices’ for each of the periods of interest during the summer. OOBNs provide a framework to simplify knowledge representation and facilitate reuse of nodes and network fragments. An OOBN is more hierarchical than a traditional BN, with any sub-network able to contain other sub-networks. Connectivity between OOBNs is an important feature and allows information to flow between the time slices. This study demonstrates a more sophisticated use of expert information within Bayesian networks, combining expert knowledge with data (categorized using expert-defined thresholds) within an expert-defined model structure. Based on the results of the verification process, the experts are able to target areas requiring greater precision and those exhibiting temporal behaviour. The time slices incorporate the data for that time period for each of the temporal nodes (instead of using the annual data from the previous static Science BN) and include lag effects to allow the effect from one time slice to flow to the next. We demonstrate a concurrent steady increase in the probability of initiation of a Lyngbya bloom and conclude that the inclusion of temporal aspects in the BN model is consistent with the perceptions of Lyngbya behaviour held by the stakeholders.
This extended model provides a more accurate representation of the increased risk of algal blooms in the summer months and shows that the opinions elicited to inform a static BN can be readily extended to a dynamic OOBN, providing more comprehensive information for decision makers.
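The lag links between time slices can be illustrated with a toy discrete network: each summer slice holds a bloom-initiation node whose probability depends on a slice-specific driver and on the previous slice's state. Every probability table and driver value below is invented for illustration only; the actual OOBN has many expert-quantified nodes per slice.

```python
# Toy discrete-time illustration of lag effects between OOBN time slices.
# All probability tables and driver values are invented, not from the study.

def propagate(p_prev, p_driver, cpt):
    """Marginal P(bloom) for one slice.
    cpt[(prev, drv)] = P(bloom | previous-slice bloom, driver state)."""
    total = 0.0
    for prev in (0, 1):
        for drv in (0, 1):
            w = ((p_prev if prev else 1 - p_prev)
                 * (p_driver if drv else 1 - p_driver))
            total += w * cpt[(prev, drv)]
    return total

cpt = {(0, 0): 0.05, (0, 1): 0.30, (1, 0): 0.40, (1, 1): 0.80}
drivers = [0.2, 0.5, 0.8]     # e.g. a rising seasonal driver by slice
p = 0.05                      # prior bloom probability before summer
history = []
for d in drivers:
    p = propagate(p, d, cpt)  # lag effect: p feeds the next slice
    history.append(p)
```

With a rising driver and a positive lag link, the marginal bloom probability increases slice by slice, mirroring the steady increase the study reports over the summer.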
Abstract:
Background Commercially available instrumented treadmill systems that provide continuous measures of temporospatial gait parameters have recently become available for clinical gait analysis. This study evaluated the level of agreement between temporospatial gait parameters derived from a new instrumented treadmill, which incorporated a capacitance-based pressure array, with those measured by a conventional instrumented walkway (criterion standard). Methods Temporospatial gait parameters were estimated from 39 healthy adults while walking over an instrumented walkway (GAITRite®) and instrumented treadmill system (Zebris) at matched speed. Differences in temporospatial parameters derived from the two systems were evaluated using repeated measures ANOVA models. Pearson product-moment correlations were used to investigate relationships between variables measured by each system. Agreement was assessed by calculating the bias and 95% limits of agreement. Results All temporospatial parameters measured via the instrumented walkway were significantly different from those obtained from the instrumented treadmill (P < .01). Temporospatial parameters derived from the two systems were highly correlated (r, 0.79–0.95). The 95% limits of agreement for temporal parameters were typically less than ±2% of gait cycle duration. However, 95% limits of agreement for spatial measures were as much as ±5 cm. Conclusions Differences in temporospatial parameters between systems were small but statistically significant and of similar magnitude to changes reported between shod and unshod gait in healthy young adults. Temporospatial parameters derived from an instrumented treadmill, therefore, are not representative of those obtained from an instrumented walkway and should not be interpreted with reference to literature on overground walking.
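The bias and 95% limits of agreement are computed in the standard Bland-Altman way: the bias is the mean paired difference, and the limits are the bias ±1.96 standard deviations of the differences. A minimal sketch with hypothetical paired step-length readings (cm), not the study's data:

```python
import math

# Bland-Altman agreement summary between two measurement systems.
# The paired step-length readings (cm) below are hypothetical.

def limits_of_agreement(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

walkway = [62.0, 64.5, 63.2, 65.1, 61.8, 63.9]
treadmill = [60.9, 63.0, 62.1, 63.5, 60.2, 62.8]
bias, lower, upper = limits_of_agreement(walkway, treadmill)
```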
Abstract:
This study investigates the role of environmental dynamics (i.e., market turbulence) as a factor influencing an organisation’s top management temporal orientation, and the impact of temporal orientation on innovative and financial performance. Results show that firms operating in highly turbulent markets exhibit higher degrees of future orientation, as opposed to present orientation. Future-oriented (rather than present-oriented) firms also experience higher levels of both incremental and radical innovations, which in turn generate financial performance. The study highlights the important role of a shared strategic mindset (which is contextually influenced) as a driving factor behind a firm’s innovative and financial performance.
Abstract:
INTRODUCTION Dengue fever (DF) in Vietnam remains a serious emerging arboviral disease, which generates significant concern among international health authorities. Incidence rates of DF have increased significantly during the last few years in many provinces and cities, especially Hanoi. The purpose of this study was to detect DF hot spots and identify the dynamics of DF dispersion over the period 2004 to 2009 in Hanoi, Vietnam. METHODS Daily data on DF cases and population data for each postcode area of Hanoi between January 1998 and December 2009 were obtained from the Hanoi Center for Preventive Health and the General Statistics Office of Vietnam. Moran's I statistic was used to assess the spatial autocorrelation of reported DF. Spatial scan statistics and logistic regression were used to identify space-time clusters and the dispersion of DF. RESULTS The study revealed a clear trend of geographic expansion of DF transmission in Hanoi through the study period (OR 1.17, 95% CI 1.02-1.34). The spatial scan statistics showed that 6/14 (42.9%) districts in Hanoi had significant cluster patterns, which lasted 29 days and were limited to a radius of 1,000 m. The study also demonstrated that most DF cases occurred between June and November, when rainfall and temperatures are highest. CONCLUSIONS There is evidence for the existence of statistically significant clusters of DF in Hanoi, and the geographical distribution of DF has expanded over recent years. This finding provides a foundation for further investigation into the social and environmental factors responsible for changing disease patterns, and provides data to inform program planning for DF control.
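Global Moran's I, used above to assess spatial autocorrelation, is straightforward to compute from a vector of area-level values and a spatial weights matrix. A minimal sketch with an invented four-area example (chain contiguity, high case counts clustered at one end, so a positive value is expected):

```python
# Global Moran's I from area-level values and a spatial weights matrix:
# I = (n / sum(W)) * sum_ij w_ij*(x_i - mean)*(x_j - mean) / sum_i (x_i - mean)^2.
# The four-area chain and case counts below are invented for illustration.

def morans_i(values, weights):
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# Chain contiguity along a transect; high counts at one end.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
cases = [10, 9, 2, 1]
moran = morans_i(cases, w)
```

Values near zero indicate spatial randomness; positive values, as here, indicate clustering of similar counts among neighbours.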
Abstract:
The numerical solution in one space dimension of advection--reaction--diffusion systems with nonlinear source terms may incur a high computational cost when the presently available methods are used. Numerous examples of finite volume schemes with high-order spatial discretisations, together with various techniques for the approximation of the advection term, can be found in the literature. Almost all such techniques result in a nonlinear system of equations as a consequence of the finite volume discretisation, especially when there are nonlinear source terms in the associated partial differential equation models. This work introduces a new technique that avoids having such nonlinear systems of equations generated by the spatial discretisation process when the nonlinear source terms in the model equations can be expanded in positive powers of the dependent function of interest. The basis of this method is a new linearisation technique for the temporal integration of the nonlinear source terms as a supplementation of a more typical finite volume method. The resulting linear system of equations is shown to be both accurate and significantly faster than methods that necessitate the use of solvers for nonlinear systems of equations.
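The flavour of such a source-term linearisation can be sketched for a 1-D reaction-diffusion problem. This is a generic semi-implicit first-order expansion, s(u_new) ≈ s(u_old) + s'(u_old)(u_new − u_old), not the paper's exact scheme; the grid, coefficients and logistic source are invented for illustration. Each time step then requires only a linear update per cell rather than a nonlinear solve:

```python
# Generic illustration (not the paper's exact scheme): expanding the
# nonlinear source s(u) = r*u*(1 - u) to first order about the old value
# makes each time step of this finite-volume reaction-diffusion solve a
# linear update per cell. Grid, coefficients and initial data are invented.

def step(u, dx, dt, D=0.01, r=1.0):
    n = len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[0]              # zero-flux boundaries
        right = u[i + 1] if i < n - 1 else u[n - 1]
        diff = D * (left - 2 * u[i] + right) / dx ** 2  # explicit diffusion
        s = r * u[i] * (1 - u[i])                       # source at old value
        ds = r * (1 - 2 * u[i])                         # its derivative
        # (1 - dt*ds) * u_new = u_old + dt*(diff + s - ds*u_old): linear in u_new
        new[i] = (u[i] + dt * (diff + s - ds * u[i])) / (1 - dt * ds)
    return new

u = [0.05 + 0.1 * i / 49 for i in range(50)]            # initial profile
for _ in range(1000):                                   # integrate to t = 10
    u = step(u, dx=1.0 / 50, dt=0.01)
```

The logistic source drives every cell toward the stable state u = 1, and no nonlinear solver is ever invoked.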
Abstract:
This paper presents a novel framework for the unsupervised alignment of an ensemble of temporal sequences. The approach draws inspiration from the axiom that an ensemble of temporal signals stemming from the same source/class should have lower rank when "aligned" rather than "misaligned". Our approach shares similarities with recent state-of-the-art methods for unsupervised image ensemble alignment (e.g. RASL), which break the problem into a set of image alignment problems with well-known solutions (i.e. the Lucas-Kanade algorithm). Similarly, we propose a strategy for decomposing the problem of temporal ensemble alignment into a set of independent sequence alignment problems, which we claim can be solved reliably through Dynamic Time Warping (DTW). We demonstrate the utility of our method on the Cohn-Kanade+ dataset by aligning expression onset across multiple sequences, which allows us to automate the rapid discovery of event annotations.
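The DTW subproblem at the core of the decomposition is the classic dynamic program. A minimal sketch with two invented ramp sequences, showing how warping absorbs a temporal offset that pointwise comparison penalises:

```python
# Classic DTW dynamic program: d[i][j] is the minimal accumulated cost of
# aligning the first i samples of a with the first j samples of b.
# The two ramp sequences are invented; one is a time-shifted copy.

def dtw(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

s1 = [0, 0, 0, 1, 2, 3, 4, 4]
s2 = [0, 1, 2, 3, 4, 4, 4, 4]
warped = dtw(s1, s2)                               # warping absorbs the shift
pointwise = sum(abs(x - y) for x, y in zip(s1, s2))
```

Here the warped distance is zero while the pointwise distance is not, which is exactly why DTW is suited to aligning expression onsets that occur at different frames in different sequences.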
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned with a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
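The Kalman smoothing step used to condition the state-space prior can be illustrated on a scalar linear-Gaussian model: a forward Kalman filter followed by the Rauch-Tung-Striebel backward pass. All coefficients and observations below are invented; this shows only the smoothing machinery, not the paper's full emulator with GP innovation terms.

```python
# Scalar illustration of Kalman smoothing: forward Kalman filter, then the
# Rauch-Tung-Striebel backward pass. Model x_t = a*x_{t-1} + w_t,
# y_t = x_t + v_t; all numbers are invented.

def kalman_smooth(y, a, q, r, m0, p0):
    n = len(y)
    m, p = [0.0] * n, [0.0] * n       # filtered means and variances
    preds = []                        # one-step predictions (mean, var)
    mp, pp = m0, p0
    for t in range(n):
        if t > 0:
            mp, pp = a * m[t - 1], a * a * p[t - 1] + q
        preds.append((mp, pp))
        k = pp / (pp + r)             # Kalman gain
        m[t] = mp + k * (y[t] - mp)
        p[t] = (1 - k) * pp
    ms, ps = m[:], p[:]               # smoothed means and variances
    for t in range(n - 2, -1, -1):
        mp, pp = preds[t + 1]
        g = a * p[t] / pp             # smoother gain
        ms[t] = m[t] + g * (ms[t + 1] - mp)
        ps[t] = p[t] + g * g * (ps[t + 1] - pp)
    return ms, ps

y = [1.0, 1.2, 0.9, 1.1, 1.0]         # noisy observations of a slow state
ms, ps = kalman_smooth(y, a=1.0, q=0.01, r=0.1, m0=0.0, p0=1.0)
```

Because smoothing uses all observations, each smoothed estimate is at least as precise as the corresponding filtered one, which is what makes the pass an efficient conditioning step for the emulator prior.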
Abstract:
An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are ‘doubly stochastic’, i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render the complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a discretized grid of large scale and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
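On a discretized grid, a log-Gaussian Cox process reduces to Poisson counts per cell with rate exp(g), where g is a Gaussian process over the cells. A toy 1-D sketch of the generative side (grid size, AR(1) latent field and rates all invented; the paper works with a large spatio-temporal grid and performs inference rather than simulation):

```python
import math
import random

# Toy 1-D discretisation of a log-Gaussian Cox process: counts per grid cell
# are Poisson with rate exp(mu + g), where g is a Gaussian process over the
# cells -- here an AR(1) field, a simple GP on a regular grid. All sizes
# and rates are invented for illustration.

def sample_lgcp(n_cells, rho=0.9, sigma=0.5, mu=1.0, seed=1):
    rng = random.Random(seed)
    g = [rng.gauss(0.0, sigma)]                    # stationary AR(1) start
    for _ in range(n_cells - 1):
        g.append(rho * g[-1] + rng.gauss(0.0, sigma * math.sqrt(1 - rho ** 2)))
    rates = [math.exp(mu + x) for x in g]
    counts = []
    for lam in rates:                              # Knuth's Poisson sampler
        limit, k, prod = math.exp(-lam), 0, 1.0
        while True:
            prod *= rng.random()
            if prod <= limit:
                break
            k += 1
        counts.append(k)
    return rates, counts

rates, counts = sample_lgcp(100)
```

The double stochasticity is visible in the two sampling stages: first the random log-intensity field g, then the Poisson counts given that field.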
Abstract:
In this paper, a Bayesian hierarchical model is used to analyze female breast cancer mortality rates for the State of Missouri from 1969 through 2001. The logit transformations of the mortality rates are assumed to be linear over time, with additive spatial and age effects as intercepts and slopes. Objective priors for the hierarchical model are explored. The Bayesian estimates are quite robust to changes in the hyperparameters. Spatial correlations appear in both the intercepts and the slopes.
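The model's core transformation can be illustrated without the hierarchical layers: logit-transform the rates and fit a line in time. The rates below are invented, and an ordinary least-squares fit stands in for the full Bayesian model with its additive spatial and age effects:

```python
import math

# Logit-transformed rates fitted linearly in time by ordinary least squares.
# The annual mortality rates are hypothetical; the real model uses Missouri
# data and puts priors on spatially and age-varying intercepts and slopes.

def logit(p):
    return math.log(p / (1 - p))

years = list(range(10))
rates = [0.030, 0.031, 0.029, 0.032, 0.033,
         0.034, 0.033, 0.035, 0.036, 0.036]
y = [logit(r) for r in rates]
n = len(years)
mx, my = sum(years) / n, sum(y) / n
slope = (sum((t - mx) * (v - my) for t, v in zip(years, y))
         / sum((t - mx) ** 2 for t in years))
intercept = my - slope * mx
```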
Abstract:
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
Abstract:
Heparan sulfate proteoglycans (HSPGs) are complex and labile macromolecular moieties on the surfaces of cells that control the activities of a range of extracellular proteins, particularly those driving growth and regeneration. Here, we examine the biosynthesis of heparan sulfate (HS) sugars produced by cultured MC3T3-E1 mouse calvarial pre-osteoblast cells in order to explore the idea that changes in HS activity in turn drive phenotypic development during osteogenesis. Cells grown for 5 days under proliferating conditions were compared to cells grown for 20 days under mineralizing conditions with respect to their phenotype, the forms of HS core protein produced, and their HS sulfotransferase biosynthetic enzyme levels. RQ-PCR data was supported by the results from the purification of day 5 and day 20 HS forms by anionic exchange chromatography. The data show that cells in active growth phases produce more complex forms of sugar than cells that have become relatively quiescent during active mineralization, and that these in turn can differentially influence rates of cell growth when added exogenously back to preosteoblasts.
Abstract:
Using Media-Access-Control (MAC) addresses for data collection and tracking is a capable and cost-effective approach, as traditional methods such as surveys and video surveillance have numerous drawbacks and limitations. Positioning cell-phones via the Global System for Mobile communication has been considered an attack on people's privacy, whereas MAC addresses merely keep a unique log of a WiFi- or Bluetooth-enabled device's connections to other devices, with no potential for privacy infringement. This paper presents the use of the MAC address data collection approach for analysing the spatio-temporal dynamics of humans in terms of shared space utilization. The paper firstly discusses the critical challenges and key benefits of MAC address data as a tracking technology for monitoring human movement. Proximity-based MAC address tracking is postulated as an effective methodology for analysing the complex spatio-temporal dynamics of human movements in shared zones such as lounge and office areas. A case study of a university staff lounge area is described in detail, and the results indicate a significant added value of the methodology for human movement tracking. From the MAC address data in the study area, clear statistics such as staff utilisation frequency, utilisation peak periods, and staff time spent are obtained. The analyses also reveal staff socialising profiles in terms of group and solo gathering. The paper concludes with a discussion of why MAC address tracking offers significant advantages for tracking human behaviour in terms of shared space utilisation with respect to other, more prominent technologies, and outlines some of its remaining deficiencies.
Abstract:
Digital learning has come a long way from the days of simple 'if-then' queries. It is now enabled by countless innovations that support knowledge sharing, openness, flexibility, and independent inquiry. Set against an evolutionary context, this study investigated innovations that directly support human inquiry. Specifically, it identified five activities that together are defined as the 'why dimension' – asking, learning, understanding, knowing, and explaining why. Findings highlight deficiencies in mainstream search-based approaches to inquiry, which tend to privilege the retrieval of information as distinct from explanation. Instrumental to sense-making, the 'why dimension' provides a conceptual framework for the development of 'sense-making technologies'.
Abstract:
Until recently, sustainable development was perceived as essentially an environmental issue, relating to the integration of environmental concerns into economic decision-making. As a result, environmental considerations have been the primary focus of sustainability decision making during the economic development process for major projects, and the assessment and preservation of social and cultural systems has been arguably too limited. The practice of social impact and sustainability assessment is an established and accepted part of project planning; however, these practices are not aimed at delivering sustainability outcomes for social systems; rather, they are designed to minimise ‘unsustainability’ and contribute to project approval. Currently, there exists no widely recognised standard approach for assessing social sustainability and accounting for the positive externalities of existing social systems in project decision making. As a result, very different approaches are applied around the world, and even by the same organisations from one project to another. This situation is an impediment not only to generating a shared understanding of the social implications of major projects, but more importantly, to identifying common approaches to help improve the social sustainability outcomes of proposed activities. This paper discusses the social dimension of sustainability decision making for mega-projects, and argues that to improve the accountability and transparency of project outcomes it is important to understand the characteristics that make some communities more vulnerable than others to mega-project development. The paper highlights issues with current operational-level approaches to social sustainability assessment at the project level, and asserts that the starting point for project planning and sustainability decision making for mega-projects needs to include the preservation, maintenance, and enhancement of existing social and cultural systems.
It draws attention to the need for a scoping mechanism to systematically assess community vulnerability (or sensitivity) to major infrastructure development during the feasibility and planning stages of a project.