949 results for MeSH term: Biological Models
Abstract:
Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. In contrast to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning effort. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real-time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a fresh-water plume using our algorithm. Additionally, we present experimental results from field trials that test the skill of the model as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
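As a rough illustration of how predicted currents can enter a path planner, the sketch below runs Dijkstra's algorithm on a small grid where each move's cost is the travel time through a hypothetical eastward current band. The grid size, vehicle speed, and current field are invented for demonstration; this is not the planner described in the abstract.

```python
import heapq

VEHICLE_SPEED = 1.0  # m/s, assumed

def current_u(x, y):
    """Hypothetical eastward current (m/s): a band of current in rows 2-4."""
    return 0.4 if 2 <= y <= 4 else 0.0

def travel_time(x, y, dx, dy):
    """Time to cross one unit cell; east-west moves are aided or opposed by the current."""
    ground_speed = VEHICLE_SPEED + dx * current_u(x, y)
    return 1.0 / max(ground_speed, 0.1)  # floor avoids division blow-up

def plan(start, goal, size=8):
    """Dijkstra search over the grid, minimising total travel time."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, (x, y) = heapq.heappop(queue)
        if (x, y) == goal:
            break
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (x + dx, y + dy)
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size:
                nd = d + travel_time(x, y, dx, dy)
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt] = nd
                    prev[nxt] = (x, y)
                    heapq.heappush(queue, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

path, t = plan((0, 3), (7, 3))
```

With the current band aligned with the transit, the planner stays inside it and arrives faster than still-water travel would allow; the same mechanism penalises routes that fight an opposing current.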
Abstract:
Harmful Algal Blooms (HABs) have become an important environmental concern along the western coast of the United States. Toxic and noxious blooms adversely impact the economies of coastal communities in the region, pose risks to human health, and cause mortality events that have resulted in the deaths of thousands of fish, marine mammals and seabirds. One goal of field-based research efforts on this topic is the development of predictive models of HABs that would enable rapid response, mitigation and ultimately prevention of these events. In turn, these objectives are predicated on understanding the environmental conditions that stimulate these transient phenomena. An embedded sensor network (Fig. 1), under development in the San Pedro Shelf region off the Southern California coast, is providing tools for acquiring chemical, physical and biological data at high temporal and spatial resolution to help document the emergence and persistence of HAB events, support the design and testing of predictive models, and provide contextual information for experimental studies designed to reveal the environmental conditions promoting HABs. The sensor platforms contained within this network include pier-based sensor arrays, ocean moorings and HF radar stations, along with mobile sensor nodes in the form of surface and subsurface autonomous vehicles. Freewave™ radio modems facilitate network communication and form a minimally intrusive, wireless communication infrastructure throughout the Southern California coastal region, allowing rapid and cost-effective data transfer. An emerging focus of this project is the incorporation of a predictive ocean model that assimilates near-real-time, in situ data from deployed Autonomous Underwater Vehicles (AUVs).
The model assimilates the data to increase the skill of both nowcasts and forecasts, thus providing insight into bloom initiation as well as the movement of blooms or other oceanic features of interest (e.g., thermoclines, fronts, river discharge). From these predictions, deployed mobile sensors can be tasked to track a designated feature. This focus has led to the creation of a technology chain in which algorithms are being implemented for innovative trajectory design for AUVs. Such intelligent mission planning is required to maneuver a vehicle to precise depths and locations that are the sites of active blooms, or of physical/chemical features that might be sources of bloom initiation or persistence. The embedded network yields high-resolution temporal and spatial measurements of pertinent environmental parameters and the resulting biology (see Fig. 1). Supplementing this with ocean current information, remotely sensed imagery and meteorological data, we obtain a comprehensive foundation for developing a fundamental understanding of HAB events. This understanding then directs labor-intensive and costly sampling efforts and analyses. Additionally, we provide coastal municipalities, managers and state agencies with detailed information to aid their efforts in providing responsible environmental stewardship of their coastal waters.
Abstract:
Chronic wounds fail to proceed through an orderly process to produce anatomic and functional integrity and are a significant socioeconomic problem. There is much debate about the best way to treat these wounds. In this thesis we review earlier mathematical models of angiogenesis and wound healing. Many of these models assume a chemotactic response of endothelial cells, the primary cell type involved in angiogenesis. Modelling this chemotactic response leads to a system of advection-dominated partial differential equations, and we review numerical methods to solve these equations and argue that the finite volume method with flux limiting is best suited to these problems. One treatment of chronic wounds that is shrouded in controversy is hyperbaric oxygen therapy (HBOT). There is currently no conclusive data showing that HBOT can assist chronic wound healing, but there has been some clinical success. In this thesis we use two mathematical models of wound healing to investigate the use of hyperbaric oxygen therapy to assist the healing process: a novel three-species model and a more complex six-species model. The second model accounts for more of the biological phenomena but does not lend itself to mathematical analysis. Both models are then used to make predictions about the efficacy of hyperbaric oxygen therapy and the optimal treatment protocol. Based on our modelling, we are able to make several predictions: that intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds; that treatment should continue until healing is complete; and that finding the right protocol for an individual patient is crucial if HBOT is to be effective.
Analysis of the models allows us to derive constraints on the range of HBOT protocols that will stimulate healing, which enables us to predict which patients are more likely to respond positively to HBOT and thus has the potential to improve both the success rate and the cost-effectiveness of this therapy.
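The qualitative prediction above (intermittent hyperbaric sessions help; continuous normobaric oxygen does not) can be illustrated with a deliberately toy model. The single-variable equation, rates, threshold, and 90-minute daily schedule below are all invented stand-ins, not the thesis's three- or six-species models; healing is gated on oxygen exceeding a threshold that only the HBOT sessions reach.

```python
def oxygen_supply(t, hbot_on):
    """Baseline oxygen plus a hypothetical 90-minute daily HBOT session (t in hours)."""
    in_session = hbot_on and (t % 24.0) < 1.5
    return 5.0 if in_session else 1.0

def simulate(hbot_on, days=50, dt=0.01):
    """Toy healing fraction h in [0, 1]; growth occurs only above an oxygen threshold."""
    h, t = 0.0, 0.0
    for _ in range(int(days * 24 / dt)):
        w = oxygen_supply(t, hbot_on)
        h += dt * 0.02 * max(w - 1.5, 0.0) * (1.0 - h)  # oxygen-gated logistic growth
        t += dt
    return h

healed_with_hbot = simulate(hbot_on=True)
healed_without_hbot = simulate(hbot_on=False)
```

Under these assumptions the intermittent protocol drives the healing fraction close to 1 over 50 days, while the normobaric baseline never clears the threshold and produces no healing at all.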
Abstract:
Process models are used by information professionals to convey semantics about the business operations in a real world domain intended to be supported by an information system. The understandability of these models is vital to them being used for information systems development. In this paper, we examine two factors that we predict will influence the understanding of a business process that novice developers obtain from a corresponding process model: the content presentation form chosen to articulate the business domain, and the user characteristics of the novice developers working with the model. Our experimental study provides evidence that novice developers obtain similar levels of understanding when confronted with an unfamiliar or a familiar process model. However, previous modeling experience, the use of English as a second language, and previous work experience in BPM are important influencing factors of model understanding. Our findings suggest that education and research in process modeling should increase the focus on human factors and how they relate to content and content presentation formats for different modeling tasks. We discuss implications for practice and research.
Abstract:
The term design is used to describe a wide range of activities. Like the term innovation, it is often used to describe both an activity and an outcome. Many products and services are described as being designed, as they reflect a conscious process of linking form and function. Alternatively, the many and varied processes of design are often used to describe a cost centre of an organisation, to demonstrate a particular competency. However, design is often not used to describe the ‘value’ it provides to an organisation and, more importantly, the ‘value’ it provides to both existing and future customers. Design Led Innovation bridges this gap. Design Led Innovation is a process of creating a sustainable competitive advantage by radically changing the customer value proposition. A conceptual model has been developed to help organisations apply and embed design in a company’s vision, strategy, culture, leadership and development processes.
Abstract:
In many product categories of durable goods such as TVs, PCs, and DVD players, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion-of-innovation researchers for the replacement component of sales have incorporated several different replacement distributions, such as the Rayleigh, Weibull, Truncated Normal and Gamma. Although these alternative replacement distributions have been tested using both time-series sales data and individual-level actuarial “life-tables” of replacement ages, there is no consensus on which distributions are more appropriate to model replacement behaviour. In the current study we are motivated to develop a new “modified gamma” distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers – those forced by failure, and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables – TVs, VCRs, DVD players, digital cameras, and personal and notebook computers. These data allow us to construct individual-level “life-tables” for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
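The two-driver idea can be sketched generically as a mixture of two replacement-age densities, one peaking early (discretionary upgrades) and one peaking later (failure-forced replacement). This is not the paper's "modified gamma" specification; the shapes, scales, and mixing weight below are invented purely to illustrate the structure.

```python
import math

def gamma_pdf(t, shape, scale):
    """Standard gamma density, used here as a building block for a mixture."""
    if t <= 0:
        return 0.0
    return (t ** (shape - 1) * math.exp(-t / scale)
            / (math.gamma(shape) * scale ** shape))

def replacement_pdf(t, p_discretionary=0.3):
    """Mixture of early discretionary and later failure-forced replacement ages."""
    discretionary = gamma_pdf(t, shape=2.0, scale=1.5)  # mean ~3 years
    failure = gamma_pdf(t, shape=6.0, scale=1.5)        # mean ~9 years
    return p_discretionary * discretionary + (1 - p_discretionary) * failure

# Riemann-sum check that the mixture density integrates to ~1 over 40 years.
total = sum(replacement_pdf(0.01 * i) * 0.01 for i in range(4000))
```

Fitting such a mixture to life-table data would estimate the mixing weight alongside the component parameters, separating the two replacement drivers that a single unimodal distribution conflates.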
Abstract:
Methicillin-resistant Staphylococcus aureus (MRSA) is a pathogen that continues to be of major concern in hospitals. We develop models and computational schemes based on observed weekly incidence data to estimate MRSA transmission parameters. We extend the deterministic model of McBryde, Pettitt, and McElwain (2007, Journal of Theoretical Biology 245, 470–481) involving an underlying population of MRSA-colonized patients and health-care workers that describes, among other processes, transmission between uncolonized patients and colonized health-care workers and vice versa. We develop new bivariate and trivariate Markov models to include incidence, so that estimated transmission rates can be based directly on new colonizations rather than indirectly on prevalence. Imperfect sensitivity of pathogen detection is modeled using a hidden Markov process. The advantages of our approach include: (i) a discrete-valued assumption for the number of colonized health-care workers, (ii) two transmission parameters can be incorporated into the likelihood, (iii) the likelihood depends on the number of new cases, improving the precision of inference, (iv) individual patient records are not required, and (v) the possibility of imperfect detection of colonization is incorporated. We compare our approach with that of McBryde et al. (2007), which is based on an approximation that eliminates the health-care workers from the model and uses Markov chain Monte Carlo and individual patient data. We apply these models to MRSA colonization data collected in a small intensive care unit at the Princess Alexandra Hospital, Brisbane, Australia.
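The generative side of such a model can be sketched as a discrete-time simulation of colonized-patient and colonized-HCW counts, with bidirectional transmission, clearance, and binomially thinned (imperfectly detected) weekly incidence. All rates, population sizes, and the weekly time step below are invented; the paper's actual likelihood and parameterisation are not reproduced here.

```python
import random

random.seed(1)

def simulate_weeks(weeks, n_patients=10, n_hcws=5,
                   beta_hp=0.15, beta_ph=0.10, clearance=0.2, sensitivity=0.8):
    """Simulate weekly observed MRSA incidence under imperfect detection."""
    colonized_p, colonized_h = 1, 0  # start with one colonized patient
    observed = []
    for _ in range(weeks):
        # HCW -> patient transmission: one trial per uncolonized patient
        new_p = sum(random.random() < beta_hp * colonized_h / n_hcws
                    for _ in range(n_patients - colonized_p))
        # patient -> HCW transmission: one trial per uncolonized HCW
        new_h = sum(random.random() < beta_ph * colonized_p / n_patients
                    for _ in range(n_hcws - colonized_h))
        # clearance (decolonization or discharge)
        colonized_p -= sum(random.random() < clearance for _ in range(colonized_p))
        colonized_h -= sum(random.random() < clearance for _ in range(colonized_h))
        colonized_p += new_p
        colonized_h += new_h
        # imperfect detection: each new colonization is seen with prob `sensitivity`
        observed.append(sum(random.random() < sensitivity for _ in range(new_p)))
    return observed

weekly_incidence = simulate_weeks(52)
```

A likelihood-based analysis would invert this: given the observed weekly counts, infer the two transmission parameters and the detection sensitivity, which is the direction the paper pursues.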
Abstract:
Three recent papers published in Chemical Engineering Journal studied the solution of a model of diffusion and nonlinear reaction using three different methods. Two of these studies obtained series solutions using specialized mathematical methods, known as the Adomian decomposition method and the homotopy analysis method. Subsequently it was shown that the solution of the same particular model could be written in terms of a transcendental function called Gauss’ hypergeometric function. These three previous approaches focused on one particular reactive transport model. This particular model ignored advective transport and considered one specific reaction term only. Here we generalize these previous approaches and develop an exact analytical solution for a general class of steady state reactive transport models that incorporate (i) combined advective and diffusive transport, and (ii) any sufficiently differentiable reaction term R(C). The new solution is a convergent Maclaurin series. The Maclaurin series solution can be derived without any specialized mathematical methods nor does it necessarily involve the computation of any transcendental function. Applying the Maclaurin series solution to certain case studies shows that the previously published solutions are particular cases of the more general solution outlined here. We also demonstrate the accuracy of the Maclaurin series solution by comparing with numerical solutions for particular cases.
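For the checkable special case of a linear reaction term R(C) = kC, the steady-state equation D C'' = v C' + k C admits the Maclaurin coefficients directly by matching powers of x; the recurrence below follows from that matching. The specific D, v, k values are only illustrative, and the general nonlinear R(C) treatment of the paper is not reproduced.

```python
import math

def maclaurin_coeffs(D, v, k, c0, c1, n_terms=30):
    """Coefficients a_n of C(x) = sum a_n x^n solving D C'' = v C' + k C,
    with C(0) = c0 and C'(0) = c1."""
    a = [c0, c1]
    for n in range(n_terms - 2):
        # Matching the coefficient of x^n on both sides gives:
        #   D (n+1)(n+2) a_{n+2} = v (n+1) a_{n+1} + k a_n
        a.append((v * (n + 1) * a[n + 1] + k * a[n]) / (D * (n + 1) * (n + 2)))
    return a

def evaluate(a, x):
    return sum(an * x ** n for n, an in enumerate(a))

# Sanity check: with v = 0 and k = D = 1 the equation is C'' = C, whose
# solution for C(0) = 1, C'(0) = 0 is cosh(x), so the truncated series
# should reproduce cosh at small x.
coeffs = maclaurin_coeffs(D=1.0, v=0.0, k=1.0, c0=1.0, c1=0.0)
```

Note that no transcendental function is needed to build the series itself; cosh appears only as an independent check of this particular case.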
Abstract:
As organizations reach to higher levels of business process management maturity, they often find themselves maintaining repositories of hundreds or even thousands of process models, representing valuable knowledge about their operations. Over time, process model repositories tend to accumulate duplicate fragments (also called clones) as new process models are created or extended by copying and merging fragments from other models. This calls for methods to detect clones in process models, so that these clones can be refactored as separate subprocesses in order to improve maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. The proposed index is based on a novel combination of a method for process model decomposition (specifically the Refined Process Structure Tree), with established graph canonization and string matching techniques. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
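The indexing idea can be illustrated in miniature: map each fragment to a canonical string that is invariant to the ordering of children under commutative connectors, then group fragments by that string. This is a simplified stand-in, not the paper's RPST-based decomposition or its graph canonization; the fragment structures and connector names below are invented.

```python
COMMUTATIVE = {"and", "xor"}  # connectors where child order is irrelevant

def canonical(fragment):
    """Canonical string for a fragment given as a task name or (op, children)."""
    if isinstance(fragment, str):
        return fragment
    op, children = fragment
    parts = [canonical(c) for c in children]
    if op in COMMUTATIVE:
        parts.sort()  # normalise order-insensitive branches
    return op + "(" + ",".join(parts) + ")"

def find_clones(fragments):
    """Group fragments sharing a canonical string (i.e. clones)."""
    index = {}
    for name, frag in fragments.items():
        index.setdefault(canonical(frag), []).append(name)
    return [names for names in index.values() if len(names) > 1]

fragments = {
    "modelA.f1": ("seq", ["check order", ("and", ["ship", "bill"])]),
    "modelB.f7": ("seq", ["check order", ("and", ["bill", "ship"])]),
    "modelC.f2": ("seq", ["check order", "archive"]),
}
clones = find_clones(fragments)
```

Because lookup is by string key, the index answers clone queries in time roughly linear in fragment size rather than via pairwise graph isomorphism checks, which is the scaling property the paper's experiments demonstrate on larger repositories.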
Abstract:
Process models in organizational collections are typically modeled by the same team and using the same conventions. As such, these models share many characteristic features, such as size range and the type and frequency of errors. In most cases merely small samples of these collections are available, due to, e.g., the sensitive information they contain. Because of their size, these samples may not provide an accurate representation of the characteristics of the originating collection. This paper deals with the problem of constructing collections of process models, in the form of Petri nets, from small samples of a collection, for accurate estimation of the characteristics of that collection. Given a small sample of process models drawn from a real-life collection, we mine a set of generation parameters that we use to generate arbitrarily large collections that feature the same characteristics as the original collection. In this way we can estimate the characteristics of the original collection from the generated collections. We extensively evaluate the quality of our technique on various sample datasets drawn from both research and industry.
Abstract:
As order dependencies between process tasks can become complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with the correction of the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Via this technique, a number of process model alternatives are identified that resolve one or more errors in the original model. The technique is implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
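The simulated-annealing skeleton underlying such a technique is generic: propose a local edit, accept it if it reduces the error count, and occasionally accept worse states with a temperature-controlled probability to escape local minima. The state, move, and error function below are toy stand-ins (bit flips, adjacent equal bits), not process-model edits or a soundness checker.

```python
import math
import random

random.seed(7)

def errors(state):
    """Toy error count: adjacent equal bits stand in for behavioral errors."""
    return sum(a == b for a, b in zip(state, state[1:]))

def neighbour(state):
    """Random local move: flip one bit (stand-in for a model edit)."""
    i = random.randrange(len(state))
    return state[:i] + [1 - state[i]] + state[i + 1:]

def anneal(state, t0=2.0, cooling=0.995, steps=5000):
    best, t = state, t0
    for _ in range(steps):
        cand = neighbour(state)
        delta = errors(cand) - errors(state)
        # Accept improvements always; accept worse moves with Boltzmann probability
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = cand
            if errors(state) < errors(best):
                best = state
        t *= cooling
    return best

solution = anneal([0] * 12)
```

In the process-model setting, each accepted state is a candidate corrected model, and keeping several low-error states rather than only the best yields the set of correction alternatives the paper describes.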
Abstract:
A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modelling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for configurable process modelling is restricted, thus hindering their applicability. Specifically, these notations focus on capturing tasks and control-flow dependencies, neglecting equally important ingredients of business processes such as data and resources. This research fills this gap by proposing a configurable process modelling notation incorporating features for capturing resources, data and physical objects involved in the performance of tasks. The proposal has been implemented in a toolset that assists analysts during the configuration phase and guarantees the correctness of the resulting process models. The approach has been validated by means of a case study from the film industry.
Abstract:
This review collects and summarises the biological applications of the element cobalt. Small amounts of the ferromagnetic metal can be found in rock, soil, plants and animals, but it is mainly obtained as a by-product of nickel and copper mining, and is separated from its ores (mainly cobaltite, erythrite, glaucodot and skutterudite) using a variety of methods. Compounds of cobalt include several oxides, among them green cobalt(II) oxide (CoO), blue cobalt(II,III) oxide (Co3O4) and black cobalt(III) oxide (Co2O3), and four halides: pink cobalt(II) fluoride (CoF2), blue cobalt(II) chloride (CoCl2), green cobalt(II) bromide (CoBr2) and blue-black cobalt(II) iodide (CoI2). The main application of cobalt is in its metal form in cobalt-based superalloys, though other uses include lithium cobalt oxide batteries, chemical reaction catalysts, pigments and colouring, and radioisotopes in medicine. When applied chemically as cobalt chloride (CoCl2), it is known to mimic hypoxia at the cellular level by stabilizing the α subunit of hypoxia-inducible factor (HIF). This is exploited in many biological research applications, where it has been shown to promote angiogenesis, erythropoiesis and anaerobic metabolism through the transcriptional activation of genes such as vascular endothelial growth factor (VEGF) and erythropoietin (EPO), contributing significantly to the pathophysiology of major categories of disease, such as myocardial, renal and cerebral ischaemia, high-altitude-related maladies and bone defects. As a necessary constituent for the formation of vitamin B12, cobalt is essential to all animals, including humans; however, excessive exposure can lead to tissue and cellular toxicity. Cobalt has shown promising potential in clinical applications, but further studies are necessary to clarify its role in hypoxia-responsive genes and the applications of cobalt-chloride-treated tissues.
Abstract:
In the exclusion-process literature, mean-field models are often derived by assuming that the occupancy status of lattice sites is independent. Although this assumption is questionable, it is the foundation of many mean-field models. In this work we develop methods to relax the independence assumption for a range of discrete exclusion process-based mechanisms motivated by applications from cell biology. Previous investigations that focussed on relaxing the independence assumption have been limited to studying initially-uniform populations and ignored any spatial variations. By ignoring spatial variations these previous studies were greatly simplified due to translational invariance of the lattice. These previous corrected mean-field models could not be applied to many important problems in cell biology such as invasion waves of cells that are characterised by moving fronts. Here we propose generalised methods that relax the independence assumption for spatially inhomogeneous problems, leading to corrected mean-field descriptions of a range of exclusion process-based models that incorporate (i) unbiased motility, (ii) biased motility, and (iii) unbiased motility with agent birth and death processes. The corrected mean-field models derived here are applicable to spatially variable processes including invasion wave type problems. We show that there can be large deviations between simulation data and traditional mean-field models based on invoking the independence assumption. Furthermore, we show that the corrected mean-field models give an improved match to the simulation data in all cases considered.
Abstract:
Complex surveillance problems are common in biosecurity, such as prioritizing detection among multiple invasive species, specifying risk over a heterogeneous landscape, combining multiple sources of surveillance data, designing for a specified power to detect, managing resources, and accounting for collateral effects on the environment. Moreover, when designing for multiple target species, inherent biological differences among species result in different ecological models underpinning the individual surveillance systems for each. Species are likely to have different habitat requirements, different introduction mechanisms and locations, require different methods of detection, have different levels of detectability, and vary in rates of movement and spread. Often there is the further challenge of a lack of knowledge, literature, or data for any number of the above problems. Even so, governments and industry need to proceed with surveillance programs that aim to detect incursions in order to meet environmental, social and political requirements. We present an approach taken to meet these challenges in one comprehensive and statistically powerful surveillance design for non-indigenous terrestrial vertebrates on Barrow Island, a high-conservation nature reserve off the Western Australian coast, where the possibility of incursions is increased by construction and expanding industry on the island. The design, which includes mammals, amphibians and reptiles, provides a complete surveillance program for most potential terrestrial vertebrate invaders. Individual surveillance systems were developed for various potential invaders, and then integrated into an overall surveillance system that meets the above challenges using a statistical model and expert elicitation.
We discuss the ecological basis for the design, the flexibility of the surveillance scheme, how it meets the above challenges, design limitations, and how it can be updated as data are collected as a basis for adaptive management.
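The standard power calculation behind integrating multiple surveillance components is worth making explicit: if component i detects an incursion with probability p_i and the components act independently, the system-level sensitivity is 1 minus the product of the individual miss probabilities. The component sensitivities below are invented for illustration, not Barrow Island estimates.

```python
def system_sensitivity(component_probs):
    """P(at least one component detects) = 1 - prod(1 - p_i), assuming independence."""
    miss = 1.0
    for p in component_probs:
        miss *= 1.0 - p
    return 1.0 - miss

def components_needed(p_single, target):
    """Smallest number of identical components reaching a target detection power."""
    n, power = 0, 0.0
    while power < target:
        n += 1
        power = system_sensitivity([p_single] * n)
    return n

combined = system_sensitivity([0.3, 0.5, 0.2])
```

Designing to a specified power then becomes a question of how many (and which) components to deploy; for example, components with individual sensitivity 0.3 must be stacked nine deep to exceed 95% system sensitivity under the independence assumption.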