638 results for Analytical models


Relevance: 20.00%

Abstract:

With the emergence of Unmanned Aircraft Systems (UAS), there is a growing need for safety standards and regulatory frameworks to manage the risks associated with their operations. The primary driver for airworthiness regulations (i.e., those governing the design, manufacture, maintenance and operation of UAS) is the risk presented to people in the regions overflown by the aircraft. Models characterising the nature of these risks are needed to inform the development of airworthiness regulations. The output from these models should include measures of collective, individual and societal risk. A brief review of these measures is provided. Based on the review, it was determined that a model of the operation of a UAS over inhabited areas must be capable of describing the distribution of possible impact locations, given a failure at a particular point in the flight plan. Existing models either do not take the impact distribution into consideration, or propose complex and computationally expensive methods for its calculation. A computationally efficient approach for estimating the boundary (and in turn the area) of the impact distribution for fixed-wing unmanned aircraft is proposed. A series of geometric templates that approximate the impact distributions is derived from an empirical analysis of the results obtained from a 6-Degree-of-Freedom (6DoF) simulation. The impact distributions can be aggregated to provide impact footprint distributions for a range of generic phases of flight and missions. The maximum impact footprint areas obtained from the geometric templates are shown to have a relative error of typically less than 1% compared with the areas calculated using the computationally more expensive 6DoF simulation. Computation times for the geometric models are on the order of one second or less on a standard desktop computer. Future work includes characterising the distribution of impact locations within the footprint boundaries.
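The abstract does not give the template geometry, but the core idea can be sketched: fit a cheap geometric shape to sampled impact points and compare its area with a more expensive estimate. The following Python sketch uses a Gaussian impact-point generator as a hypothetical stand-in for the 6DoF simulation output, and an elliptical template in place of the authors' derived templates.

```python
# Minimal sketch (not the authors' templates): approximate an impact footprint
# with an elliptical template fitted to sampled impact points and compare its
# area to the convex hull of the samples. The Gaussian impact-point generator
# below is a hypothetical stand-in for 6DoF simulation output.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# Hypothetical impact points (metres, ground frame) downrange of a failure point.
impacts = rng.multivariate_normal(mean=[300.0, 0.0],
                                  cov=[[2500.0, 400.0], [400.0, 900.0]],
                                  size=2000)

# Reference "expensive" estimate: area of the convex hull of the samples.
hull_area = ConvexHull(impacts).volume        # in 2D, .volume is the enclosed area

# Cheap geometric template: a k-sigma ellipse from the sample covariance.
k = 3.0                                       # boundary at ~3 standard deviations
cov = np.cov(impacts, rowvar=False)
ellipse_area = np.pi * k**2 * np.sqrt(np.linalg.det(cov))

print(f"hull area      : {hull_area:10.1f} m^2")
print(f"template area  : {ellipse_area:10.1f} m^2")
print(f"relative error : {abs(ellipse_area - hull_area) / hull_area:.2%}")
```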

Relevance: 20.00%

Abstract:

A building information model (BIM) is an electronic repository of structured, three-dimensional data that captures both the physical and dynamic functional characteristics of a facility. In addition to its more traditional function as a tool to aid design and construction, a BIM can be used throughout the life cycle of a facility, functioning as a living database that places resources contained within the building in their spatial and temporal context. Through its comprehension of spatial relationships, a BIM can meaningfully represent and integrate previously isolated control and management systems and processes, and thereby provide a more intuitive interface to users. By placing processes in a spatial context, decision-making can be improved, with positive flow-on effects for security and efficiency. In this article, we systematically analyse the authorization requirements involved in the use of BIMs. We introduce the concept of using a BIM as a graphical tool to support spatial access control configuration and management (including physical access control). We also consider authorization requirements for regulating access to the structured data that exists within a BIM as well as to external systems and data repositories that can be accessed via the BIM interface. With a view to addressing these requirements we present a survey of relevant spatiotemporal access control models, focusing on features applicable to BIMs and highlighting capability gaps. Finally, we present a conceptual authorization framework that utilizes BIMs.
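As a toy illustration of the kind of spatiotemporal authorization rule the article surveys, the sketch below scopes a permission to a BIM zone and a daily time window, so that the same check could govern both physical entry and access to data attached to that zone. All role, zone and resource names are hypothetical.

```python
# Toy illustration (not the article's framework) of a spatiotemporal access
# rule: a permission is scoped to a BIM zone and a time window, so one check
# can govern physical entry and access to data attached to that zone.
# All role, zone and action names are hypothetical.
from dataclasses import dataclass
from datetime import time

@dataclass(frozen=True)
class Rule:
    role: str       # subject role
    zone: str       # BIM space the rule applies to
    action: str     # e.g. "enter", "read_sensor_data"
    start: time     # daily window start
    end: time       # daily window end

RULES = [
    Rule("maintenance", "plant_room", "enter",            time(7, 0), time(19, 0)),
    Rule("maintenance", "plant_room", "read_sensor_data", time(0, 0), time(23, 59)),
    Rule("visitor",     "lobby",      "enter",            time(8, 0), time(18, 0)),
]

def allowed(role: str, zone: str, action: str, at: time) -> bool:
    """True if some rule grants `role` the `action` in `zone` at clock time `at`."""
    return any(r.role == role and r.zone == zone and r.action == action
               and r.start <= at <= r.end for r in RULES)

print(allowed("maintenance", "plant_room", "enter", time(8, 30)))   # True
print(allowed("visitor",     "plant_room", "enter", time(9, 0)))    # False
print(allowed("maintenance", "plant_room", "enter", time(22, 0)))   # False: outside window
```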

Relevance: 20.00%

Abstract:

Three-dimensional models and groundwater quality data are combined to better understand and conceptualise groundwater systems in complex geological settings in the Wairau Plain, Marlborough. Hydrochemical facies, which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters, are identified within geological formations to assess natural water-rock interactions, redox potential and the impact of human agricultural activity on groundwater quality in the Wairau Plain.

Relevance: 20.00%

Abstract:

Articular cartilage is a highly resilient tissue located at the ends of long bones. It has a zonal structure, which has functional significance in load-bearing. Cartilage does not spontaneously heal itself when damaged, and untreated cartilage lesions or age-related wear often lead to osteoarthritis (OA). OA is a degenerative condition that is highly prevalent, age-associated, and significantly affects patient mobility and quality of life. There is no cure for OA, and patients usually resort to replacing the biological joint with an artificial prosthesis. An alternative approach is to regenerate damaged or diseased cartilage through cartilage tissue engineering, where cells, materials, and stimuli are combined to form new cartilage. However, despite extensive research, major limitations remain that have prevented the widespread application of tissue-engineered cartilage. Critically, there is a dearth of information on whether autologous chondrocytes obtained from OA patients can be used to successfully generate cartilage tissues with the structural hierarchy typically found in normal articular cartilage. I aim to address these limitations in this thesis by showing that chondrocyte subpopulations isolated from macroscopically normal areas of the cartilage can be used to engineer stratified cartilage tissues, and that compressive loading plays an important role in the zone-dependent biosynthesis of these chondrocytes. I first demonstrate that chondrocyte subpopulations from the superficial (S) and middle/deep (MD) zones of OA cartilage are responsive to compressive stimulation in vitro, and that the effect of compression on construct quality is zone-dependent. I also show that compressive stimulation can influence pericellular matrix production, matrix metalloproteinase secretion, and cytokine expression in zonal chondrocytes in an alginate hydrogel model. Subsequently, I focus on recreating the zonal structure by forming layered constructs using the alginate-released chondrocyte (ARC) method, either with or without polymeric scaffolds. The resulting zonal ARC constructs had hyaline morphology and expressed cartilage matrix molecules such as proteoglycans and collagen type II in both scaffold-free and scaffold-based approaches. Overall, my findings demonstrate that chondrocyte subpopulations obtained from OA joints respond sensitively to compressive stimulation, and are able to form cartilaginous constructs with stratified organization similar to native cartilage using the scaffold-free and scaffold-based ARC techniques. The ultimate goal in tissue engineering is to help provide improved treatment options for patients suffering from debilitating conditions such as OA. Further investigations into developing functional cartilage replacement tissues using autologous chondrocytes will bring us a step closer to improving the quality of life for millions of OA patients worldwide.

Relevance: 20.00%

Abstract:

In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. Here, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by placing a Dirichlet Process (DP) prior on the unknown components of a mixture model, allowing uncertainty about the partitioning of the observed data into homogeneous subgroups to be expressed. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS). Keywords: clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
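For a concrete (if simplified) illustration, scikit-learn's BayesianGaussianMixture with a Dirichlet-process weight prior is a truncated variational approximation to a DPM and can cluster symptom profiles without fixing the number of subgroups in advance. It is not the exact model or inference scheme used in the paper, and the UPDRS-like scores below are simulated rather than real patient data.

```python
# Minimal sketch of DPM-style clustering of symptom profiles.
# BayesianGaussianMixture with a Dirichlet-process weight prior is a truncated
# variational approximation to a DPM, not the paper's exact model; the
# UPDRS-like scores below are simulated.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)

# Hypothetical symptom profiles: 300 patients x 4 UPDRS-style item scores,
# drawn from three latent subgroups with different mean profiles.
means = np.array([[1.0, 1.0, 2.0, 1.0],
                  [3.0, 2.5, 1.0, 2.0],
                  [2.0, 4.0, 3.5, 3.0]])
labels_true = rng.integers(0, 3, size=300)
X = means[labels_true] + rng.normal(scale=0.5, size=(300, 4))

dpm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,                    # DP concentration parameter
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

labels = dpm.predict(X)
used = np.unique(labels)
print("effective number of phenotype clusters:", len(used))
print("cluster weights:", np.round(dpm.weights_[used], 3))
```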

Relevance: 20.00%

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. After ensuring clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. In addition, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart’s signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates of the change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and of major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which the pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts; variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach observed in the general context of quality control may also extend to the industrial and business domains where quality monitoring was originally developed.
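The thesis relies on hierarchical models fitted by MCMC; for the simplest case of a single step change in a Poisson rate with conjugate Gamma priors, however, the posterior over the change time can be computed exactly by enumeration, which conveys the idea in a few lines. The adverse-event counts below are simulated, not hospital data.

```python
# Minimal sketch of Bayesian change-point estimation for a step change in a
# Poisson rate (not the thesis' hierarchical MCMC estimators). With conjugate
# Gamma(a, b) priors on the two rates, the posterior over the change time tau
# can be enumerated exactly.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(2)

true_tau = 60                                     # last period at the in-control rate
counts = np.concatenate([rng.poisson(2.0, true_tau),   # in-control rate
                         rng.poisson(3.5, 40)])        # shifted rate
n = len(counts)
a, b = 1.0, 1.0                                   # Gamma prior on each Poisson rate

def log_marginal(segment):
    """log p(segment) with the Poisson rate integrated out under Gamma(a, b).
    The constant sum(log y_i!) term is omitted: it cancels across candidate taus."""
    S, m = segment.sum(), len(segment)
    return a * np.log(b) - gammaln(a) + gammaln(a + S) - (a + S) * np.log(b + m)

taus = np.arange(1, n)                            # candidate change times
log_post = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                     for t in taus])
post = np.exp(log_post - log_post.max())
post /= post.sum()                                # uniform prior over tau

print("posterior mode of change time:", taus[post.argmax()])
print("size of 95% credible set:", int(np.sum(np.sort(post)[::-1].cumsum() < 0.95)) + 1)
```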

Relevance: 20.00%

Abstract:

Problems involving the solution of advection-diffusion-reaction equations on domains and subdomains whose growth affects and is affected by these equations, commonly arise in developmental biology. Here, a mathematical framework for these situations, together with methods for obtaining spatio-temporal solutions and steady states of models built from this framework, is presented. The framework and methods are applied to a recently published model of epidermal skin substitutes. Despite the use of Eulerian schemes, excellent agreement is obtained between the numerical spatio-temporal, numerical steady state, and analytical solutions of the model.
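A minimal one-dimensional analogue (not the paper's full framework or its epidermal skin-substitute model) conveys how domain growth enters the equations: mapping the growing domain to a fixed computational interval introduces a rescaled diffusivity and a dilution term, which can then be advanced with an ordinary Eulerian finite-difference scheme. Parameter values below are illustrative.

```python
# Minimal 1D sketch of a reaction-diffusion equation on a uniformly growing
# domain. With x = L(t) * xi and exponential growth L(t) = L0 * exp(r * t),
#     u_t = D u_xx + s u (1 - u)   on 0 < x < L(t)
# becomes, on the fixed computational interval 0 < xi < 1,
#     u_t = (D / L(t)^2) u_{xi xi} + s u (1 - u) - r u,
# where -r u is the dilution term due to growth. Advanced here with an explicit
# Eulerian finite-difference scheme and zero-flux boundaries.
import numpy as np

D, s, r, L0 = 0.01, 1.0, 0.05, 1.0        # diffusivity, reaction rate, growth rate, initial length
nx = 101
dxi = 1.0 / (nx - 1)
dt, t_end = 1.0e-4, 5.0

xi = np.linspace(0.0, 1.0, nx)
u = np.exp(-50.0 * xi**2)                 # initial pulse near the left boundary

t = 0.0
while t < t_end:
    L = L0 * np.exp(r * t)
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dxi**2
    lap[0] = 2.0 * (u[1] - u[0]) / dxi**2       # zero-flux boundary at xi = 0
    lap[-1] = 2.0 * (u[-2] - u[-1]) / dxi**2    # zero-flux boundary at xi = 1
    u = u + dt * ((D / L**2) * lap + s * u * (1.0 - u) - r * u)
    t += dt

L_final = L0 * np.exp(r * t_end)
mass = (0.5 * (u[0] + u[-1]) + u[1:-1].sum()) * dxi * L_final   # trapezoidal integral on the grown domain
print(f"final domain length: {L_final:.3f}")
print(f"integral of u over the grown domain: {mass:.3f}")
```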

Relevance: 20.00%

Abstract:

Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...

Relevance: 20.00%

Abstract:

Crop simulation models have the potential to assess the risk associated with the selection of a specific N fertilizer rate by integrating the effects of soil-crop interactions on crop growth under different pedo-climatic and management conditions. The objective of this study was to simulate the environmental and economic impact (nitrate leaching and N2O emissions) of a spatially variable N fertilizer application in an irrigated maize field in Italy. The validated SALUS model was run under five nitrogen rate scenarios (50, 100, 150, 200, and 250 kg N ha−1), the latter being the N fertilization rate adopted by the farmer. The long-term (25-year) simulations were performed on two previously identified spatially and temporally stable zones, a high-yielding and a low-yielding zone. The simulation results showed that the N fertilizer rate can be reduced without affecting yield or net return. The marginal net return was on average higher for the high-yield zone, with values ranging from 1550 to 2650 € ha−1 for the 200 N rate and from 1485 to 2875 € ha−1 for the 250 N rate. N leaching varied between 16.4 and 19.3 kg N ha−1 for the 200 N and 250 N rates in the high-yield zone; in the low-yield zone, the 250 N rate had significantly higher N leaching. N2O emissions varied from 0.28 kg N2O ha−1 for the 50 kg N ha−1 rate to a maximum of 1.41 kg N2O ha−1 for the 250 kg N ha−1 rate.

Relevance: 20.00%

Abstract:

The most common software analysis tools available for measuring fluorescence images are designed for two-dimensional (2D) data and rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition, to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed from the approximations and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (tree-like structures). The module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because it ideally builds cell surfaces without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for objects of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
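As a generic, user-modifiable illustration of automated 3D morphometrics (not the Imaris XT / MATLAB platform described above), the sketch below thresholds and labels a synthetic confocal-like volume with scipy.ndimage and reports per-object volumes and centroids; the threshold and voxel dimensions are hypothetical.

```python
# Generic sketch of automated 3D morphometrics on a fluorescence-like volume,
# using scipy.ndimage rather than the Imaris XT / MATLAB platform described
# above. The synthetic volume stands in for a confocal stack; the threshold
# and voxel size are illustrative.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Hypothetical 64x64x64 voxel stack: a few blurred "cells" plus noise.
volume = rng.normal(0.0, 0.05, size=(64, 64, 64))
zz, yy, xx = np.ogrid[:64, :64, :64]
for cz, cy, cx in [(20, 20, 20), (40, 45, 30), (30, 15, 50)]:
    dist2 = (zz - cz)**2 + (yy - cy)**2 + (xx - cx)**2
    volume += np.exp(-dist2 / (2.0 * 5.0**2))

smoothed = ndimage.gaussian_filter(volume, sigma=1.0)
mask = smoothed > 0.3                           # intensity threshold (illustrative)

labels, n_objects = ndimage.label(mask)         # 3D connected-component labelling
indices = list(range(1, n_objects + 1))
voxel_volume_um3 = 0.2 * 0.2 * 0.5              # hypothetical voxel size in micrometres
sizes = ndimage.sum(mask, labels, index=indices)            # voxel counts per object
centroids = ndimage.center_of_mass(mask, labels, index=indices)

print("objects detected:", n_objects)
for i, (size, c) in enumerate(zip(sizes, centroids), start=1):
    print(f"object {i}: volume = {size * voxel_volume_um3:.1f} um^3, "
          f"centroid (z, y, x) = ({c[0]:.1f}, {c[1]:.1f}, {c[2]:.1f})")
```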

Relevance: 20.00%

Abstract:

Physical access control systems play a central role in the protection of critical infrastructures, where both the provision of timely access and preserving the security of sensitive areas are paramount. In this paper we discuss the shortcomings of existing approaches to the administration of physical access control in complex environments. At the heart of the problem is the current dependency on human administrators to reason about the implications of the provision or the revocation of staff access to an area within these facilities. We demonstrate how utilising Building Information Models (BIMs) and the capabilities they provide, including 3D representation of a facility and path-finding, can reduce possible intentional or accidental errors made by security administrators.
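A toy sketch of the kind of reasoning a BIM can support: rooms and doors form an adjacency graph extracted from the model, so an administrator (or the system itself) can check whether revoking access to one space leaves a user authorized for a destination they can no longer physically reach. The floor plan and room names below are hypothetical.

```python
# Toy sketch of BIM-assisted reasoning about a physical access change (not the
# authors' system). Rooms and doors form an adjacency graph from a hypothetical
# building model; revoking access to one room can leave a user authorised for a
# destination that is no longer physically reachable.
from collections import deque

# Hypothetical floor plan: room -> directly connected rooms (doors).
doors = {
    "lobby":       ["corridor_a"],
    "corridor_a":  ["lobby", "server_room", "plant_room"],
    "server_room": ["corridor_a"],
    "plant_room":  ["corridor_a"],
}

def reachable(start, goal, authorised):
    """True if `goal` can be reached from `start` using only authorised rooms."""
    if start not in authorised or goal not in authorised:
        return False
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        if room == goal:
            return True
        for nxt in doors.get(room, []):
            if nxt in authorised and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

authorised = {"lobby", "corridor_a", "server_room"}
print(reachable("lobby", "server_room", authorised))   # True

# An administrator revokes corridor access without noticing the side effect:
authorised.discard("corridor_a")
print(reachable("lobby", "server_room", authorised))   # False: destination now unreachable
```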

Relevance: 20.00%

Abstract:

Recent efforts in mission planning for underwater vehicles have utilised predictive models to aid navigation and optimal path planning and to drive opportunistic sampling. Although these models provide information at unprecedented resolutions and have proven to increase accuracy and effectiveness in multiple campaigns, most are deterministic in nature. Thus, predictions cannot be incorporated into probabilistic planning frameworks, nor do they provide any metric on the variance or confidence of the output variables. In this paper, we provide an initial investigation into determining the confidence of ocean model predictions based on the results of multiple field deployments of two autonomous underwater vehicles. For multiple missions conducted over a two-month period in 2011, we compare actual vehicle executions to simulations of the same missions through the Regional Ocean Modeling System in an ocean region off the coast of southern California. This comparison provides a qualitative analysis of the current velocity predictions for areas within the selected deployment region. Ultimately, we present a spatial heat-map of the correlation between the ocean model predictions and the actual mission executions. Knowing where the model provides unreliable predictions can be incorporated into planners to increase the utility and application of the deterministic estimations.
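A compressed sketch of the comparison step, with simulated stand-ins for mission logs and model output: co-located predicted and observed current speeds are binned onto a coarse latitude/longitude grid and a per-cell Pearson correlation is computed, giving the kind of spatial heat-map described above.

```python
# Minimal sketch of building a spatial "confidence" map from co-located model
# predictions and vehicle observations (not the ROMS comparison itself).
# All positions and current speeds below are simulated stand-ins.
import numpy as np

rng = np.random.default_rng(4)

n = 5000
lon = rng.uniform(-118.6, -118.0, n)             # hypothetical deployment box
lat = rng.uniform(33.4, 33.8, n)
predicted = rng.normal(0.25, 0.08, n)            # model current speed (m/s)
# Observations agree with the model in the west and degrade towards the east.
degrade = (lon - lon.min()) / (lon.max() - lon.min())
observed = predicted + rng.normal(0.0, 0.02 + 0.15 * degrade, n)

nbins = 6
lon_edges = np.linspace(lon.min(), lon.max(), nbins + 1)
lat_edges = np.linspace(lat.min(), lat.max(), nbins + 1)
heat = np.full((nbins, nbins), np.nan)

for i in range(nbins):
    for j in range(nbins):
        in_cell = ((lon >= lon_edges[j]) & (lon < lon_edges[j + 1]) &
                   (lat >= lat_edges[i]) & (lat < lat_edges[i + 1]))
        if in_cell.sum() > 10:
            heat[i, j] = np.corrcoef(predicted[in_cell], observed[in_cell])[0, 1]

print(np.round(heat, 2))   # low-correlation cells flag unreliable predictions
```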

Relevance: 20.00%

Abstract:

In this work, a Langevin dynamics model of the diffusion of water in articular cartilage was developed. Numerical simulations of the translational dynamics of water molecules and their interaction with collagen fibers were used to study the quantitative relationship between the organization of the collagen fiber network and the diffusion tensor of water in model cartilage. Langevin dynamics was used to simulate water diffusion in both ordered and partially disordered cartilage models. In addition, an analytical approach was developed to estimate the diffusion tensor for a network comprising a given distribution of fiber orientations. The key findings are that (1) an approximately linear relationship was observed between collagen volume fraction and the fractional anisotropy of the diffusion tensor in fiber networks of a given degree of alignment, (2) for any given fiber volume fraction, fractional anisotropy follows a fiber alignment dependency similar to the square of the second Legendre polynomial of cos(θ), with the minimum anisotropy occurring at approximately the magic angle (θ_MA), and (3) a decrease in the principal eigenvalue and an increase in the transverse eigenvalues is observed as the fiber orientation angle θ progresses from 0° to 90°. The corresponding diffusion ellipsoids are prolate for θ < θ_MA, spherical for θ ≈ θ_MA, and oblate for θ > θ_MA. Expansion of the model to include discrimination between the combined effects of alignment disorder and collagen fiber volume fraction on the diffusion tensor is discussed.
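The azimuthally symmetric case of an analytical-style estimate can be sketched directly: average single-fibre diffusion tensors over fibres at a fixed polar angle θ with uniformly distributed azimuth, then compute the fractional anisotropy. With illustrative diffusivities (not the paper's parameters), this reproduces the qualitative behaviour above: near-isotropy at the magic angle, prolate ellipsoids below it and oblate ellipsoids above it.

```python
# Sketch of an analytical-style diffusion-tensor estimate for a fibre network
# in which all collagen fibres sit at polar angle theta from the z-axis with
# uniformly distributed azimuth (illustrative diffusivities, not the paper's
# parameters). Each fibre contributes d_par along its axis and d_perp across it.
import numpy as np

d_par, d_perp = 2.0e-9, 1.0e-9            # m^2/s along / across a fibre (illustrative)

def mean_tensor(theta, n_azimuth=360):
    """Azimuthal average of single-fibre diffusion tensors at polar angle theta."""
    phis = np.linspace(0.0, 2.0 * np.pi, n_azimuth, endpoint=False)
    D = np.zeros((3, 3))
    for phi in phis:
        n = np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])
        D += d_par * np.outer(n, n) + d_perp * (np.eye(3) - np.outer(n, n))
    return D / n_azimuth

def fractional_anisotropy(D):
    ev = np.linalg.eigvalsh(D)
    md = ev.mean()
    return np.sqrt(1.5 * np.sum((ev - md)**2) / np.sum(ev**2))

for deg in (0.0, 30.0, 54.7356, 70.0, 90.0):   # 54.7356 deg is the magic angle
    D = mean_tensor(np.radians(deg))
    ev = np.sort(np.linalg.eigvalsh(D))[::-1]
    shape = "prolate" if ev[0] - ev[1] > ev[1] - ev[2] else "oblate"
    if np.isclose(ev[0], ev[2], rtol=1e-3):
        shape = "~spherical"
    print(f"theta = {deg:7.2f} deg  FA = {fractional_anisotropy(D):.3f}  ({shape})")
```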

Relevance: 20.00%

Abstract:

Here we present a sequential Monte Carlo approach to Bayesian sequential design that incorporates model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally if one has different experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel) and provides a convenient estimate of the marginal likelihood of each model given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is also computationally efficient when compared with full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
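The flavour of the model-comparison step can be sketched with plain prior importance sampling in place of a full sequential Monte Carlo sampler: each rival model's marginal likelihood is estimated from prior particles, and the resulting posterior model probabilities are the quantities that would feed a discrimination utility such as mutual information. The two dose-response models and all data below are hypothetical.

```python
# Compressed sketch of particle-based model comparison (prior importance
# sampling rather than a full sequential Monte Carlo algorithm). Two
# hypothetical rival models for a dose-response mean are compared via estimates
# of their marginal likelihoods given the observed data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

x = np.linspace(0.0, 10.0, 15)                                 # design points (hypothetical doses)
y = 1.0 - np.exp(-0.4 * x) + rng.normal(0.0, 0.05, x.size)     # data generated from model 2
sigma = 0.05                                                   # known observation noise

def mean_m1(theta, x):                       # rival model 1: linear response
    return theta * x

def mean_m2(theta, x):                       # rival model 2: saturating response
    return 1.0 - np.exp(-theta * x)

def log_marginal_likelihood(mean_fn, n_particles=20000):
    theta = rng.uniform(0.0, 1.0, n_particles)           # prior particles (one parameter)
    means = mean_fn(theta[:, None], x[None, :])          # n_particles x n_obs mean matrix
    log_w = norm.logpdf(y, means, sigma).sum(axis=1)     # log-likelihood of each particle
    return np.logaddexp.reduce(log_w) - np.log(n_particles)

log_z = np.array([log_marginal_likelihood(mean_m1),
                  log_marginal_likelihood(mean_m2)])
post = np.exp(log_z - np.logaddexp.reduce(log_z))        # equal prior model probabilities
print("posterior model probabilities:", np.round(post, 4))
```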