279 results for geospatial modeling
Abstract:
Over the last 30 years, numerous research groups have attempted to provide mathematical descriptions of the skin wound healing process. The development of theoretical models of the interlinked processes that underlie the healing mechanism has yielded considerable insight into aspects of this critical phenomenon that remain difficult to investigate empirically. In particular, the mathematical modeling of angiogenesis, i.e., capillary sprout growth, has offered new paradigms for the understanding of this highly complex and crucial step in the healing pathway. With the recent advances in imaging and cell tracking, the time is now ripe for an appraisal of the utility and importance of mathematical modeling in wound healing angiogenesis research. The purpose of this review is to pedagogically elucidate the conceptual principles that have underpinned the development of mathematical descriptions of wound healing angiogenesis, specifically those that have utilized a continuum reaction-transport framework, and highlight the contribution that such models have made toward the advancement of research in this field. We aim to draw attention to the common assumptions made when developing models of this nature, thereby bringing into focus the advantages and limitations of this approach. A deeper integration of mathematical modeling techniques into the practice of wound healing angiogenesis research promises new perspectives for advancing our knowledge in this area. To this end we detail several open problems related to the understanding of wound healing angiogenesis, and outline how these issues could be addressed through closer cross-disciplinary collaboration.
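As a cartoon of the continuum reaction-transport framework the review discusses, consider a single one-dimensional density of capillary tips that diffuses and grows logistically. The equation, parameters (d, r) and grid below are hypothetical illustrations, not any specific published wound-healing model.

```python
def step_density(u, d, r, dt=0.1, dx=1.0):
    """One explicit Euler step of du/dt = d*d2u/dx2 + r*u*(1 - u)
    (diffusion plus logistic growth) with zero-flux boundaries."""
    n = len(u)
    new = list(u)
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]       # zero-flux at x = 0
        right = u[i + 1] if i < n - 1 else u[i]  # zero-flux at x = L
        lap = (left - 2 * u[i] + right) / dx ** 2
        new[i] = u[i] + dt * (d * lap + r * u[i] * (1 - u[i]))
    return new

# Seed tip density at the wound edge and let the front invade.
u = [1.0] + [0.0] * 19
for _ in range(200):
    u = step_density(u, d=0.5, r=1.0)
```

Chemotaxis toward a growth-factor gradient, the ingredient most continuum angiogenesis models add on top of this, would enter as an extra advective flux term.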
Abstract:
If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast 6 agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
Abstract:
Prostate cancer is the most commonly diagnosed malignancy in men and advanced disease is incurable. Model systems are a fundamental tool for research and many in vitro models of prostate cancer use cancer cell lines in monoculture. Although these have yielded significant insight they are inherently limited by virtue of their two-dimensional (2D) growth and inability to include the influence of tumour microenvironment. These major limitations can be overcome with the development of newer systems that more faithfully recreate and mimic the complex in vivo multi-cellular, three-dimensional (3D) microenvironment. This article presents the current state of in vitro models for prostate cancer, with particular emphasis on 3D systems and the challenges that remain before their potential to advance our understanding of prostate disease and aid in the development and testing of new therapeutic agents can be realised.
Abstract:
Large sized power transformers are important parts of the power supply chain. These very critical networks of engineering assets are an essential base of a nation’s energy resource infrastructure. This research identifies the key factors influencing transformer normal operating conditions and predicts the asset management lifespan. Engineering asset research has developed few lifespan forecasting methods combining real-time monitoring solutions for transformer maintenance and replacement. Utilizing the rich data source from a remote terminal unit (RTU) system for sensor-data driven analysis, this research develops an innovative real-time lifespan forecasting approach applying logistic regression based on the Weibull distribution. The methodology and the implementation prototype are verified using a data series from 161 kV transformers to evaluate the efficiency and accuracy for energy sector applications. The asset stakeholders and suppliers significantly benefit from the real-time power transformer lifespan evaluation for maintenance and replacement decision support.
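The Weibull ingredient of such a lifespan-forecasting approach can be sketched with median-rank regression on the linearised CDF. The failure-age data below are synthetic, and the paper's actual method (logistic regression driven by RTU sensor data) is richer than this.

```python
import math

def fit_weibull(ages):
    """Estimate Weibull shape (beta) and scale (eta) by least squares on
    the linearised CDF: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta),
    with F approximated by median ranks (i - 0.5)/n."""
    ts = sorted(ages)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1 - (i + 0.5) / n)) for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)  # intercept -beta*ln(eta) = my - beta*mx
    return beta, eta
```

The fitted shape parameter then distinguishes early-life failures (beta < 1) from wear-out failures (beta > 1), which is the distinction a maintenance/replacement decision hinges on.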
Abstract:
Statistical comparison of oil samples is an integral part of oil spill identification, which deals with the process of linking an oil spill with its source of origin. In current practice, a frequentist hypothesis test is often used to evaluate evidence in support of a match between a spill and a source sample. As frequentist tests are only able to evaluate evidence against a hypothesis but not in support of it, we argue that this leads to unsound statistical reasoning. Moreover, currently only verbal conclusions on a very coarse scale can be made about the match between two samples, whereas a finer quantitative assessment would often be preferred. To address these issues, we propose a Bayesian predictive approach for evaluating the similarity between the chemical compositions of two oil samples. We derive the underlying statistical model from some basic assumptions on modeling assays in analytical chemistry, and to further facilitate and improve numerical evaluations, we develop analytical expressions for the key elements of Bayesian inference for this model. The approach is illustrated with both simulated and real data and is shown to have appealing properties in comparison with both standard frequentist and Bayesian approaches.
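The flavour of a predictive approach can be shown with a textbook special case: one chemical compound, normal measurement error with known sigma, and a flat prior on the mean. The paper's model and its analytical expressions are more general; this is only an illustration, and the data are hypothetical.

```python
import math

def predictive_logpdf(x_new, sample, sigma):
    """Log posterior-predictive density of a new measurement given earlier
    ones, under x ~ N(mu, sigma^2) with sigma known and a flat prior on mu:
    x_new | sample ~ N(mean(sample), sigma^2 * (1 + 1/n))."""
    n = len(sample)
    m = sum(sample) / n
    var = sigma ** 2 * (1 + 1 / n)
    return -0.5 * math.log(2 * math.pi * var) - (x_new - m) ** 2 / (2 * var)

# Hypothetical biomarker ratios measured on a candidate source sample.
source = [9.8, 10.1, 10.2, 9.9]
match_score = predictive_logpdf(10.0, source, sigma=0.2)     # spill close to source
mismatch_score = predictive_logpdf(12.0, source, sigma=0.2)  # spill far from source
```

The difference of such log scores under competing source hypotheses plays the role of a quantitative, finely graded measure of support, which is exactly what a coarse frequentist match/no-match conclusion lacks.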
Abstract:
Structural equation modeling (SEM) is a powerful statistical approach for the testing of networks of direct and indirect theoretical causal relationships in complex data sets with intercorrelated dependent and independent variables. SEM is commonly applied in ecology, but the spatial information commonly found in ecological data remains difficult to model in a SEM framework. Here we propose a simple method for spatially explicit SEM (SE-SEM) based on the analysis of variance/covariance matrices calculated across a range of lag distances. This method provides readily interpretable plots of the change in path coefficients across scale and can be implemented using any standard SEM software package. We demonstrate the application of this method using three studies examining the relationships between environmental factors, plant community structure, nitrogen fixation, and plant competition. By design, these data sets had a spatial component, but were previously analyzed using standard SEM models. Using these data sets, we demonstrate the application of SE-SEM to regularly spaced, irregularly spaced, and ad hoc spatial sampling designs and discuss the increased inferential capability of this approach compared with standard SEM. We provide an R package, sesem, to easily implement spatial structural equation modeling.
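The ingredient that makes SE-SEM spatially explicit, a covariance computed separately within each lag-distance bin, can be sketched as follows. Coordinates, variables and bins below are hypothetical; the authors' R package sesem implements the full method.

```python
import math

def lag_covariances(coords, x, y, bins):
    """For each (lo, hi) lag bin, average cross-product of centred x at
    site i and centred y at site j over all ordered pairs (i, j) whose
    separation distance falls in the bin."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    out = []
    for lo, hi in bins:
        prods = [(x[i] - mx) * (y[j] - my)
                 for i in range(len(coords))
                 for j in range(len(coords))
                 if i != j and lo <= math.dist(coords[i], coords[j]) < hi]
        out.append(sum(prods) / len(prods) if prods else float("nan"))
    return out
```

Fitting the same SEM to the matrix assembled for each bin, then plotting each path coefficient against lag distance, yields the change-across-scale plots the abstract describes.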
Abstract:
A single-generation dataset consisting of 1,730 records from a selection program for high growth rate in giant freshwater prawn (GFP, Macrobrachium rosenbergii) was used to derive prediction equations for meat weight and meat yield. Models were based on body traits [body weight, total length and abdominal width (AW)] and carcass measurements (tail weight and exoskeleton-off weight). Lengths and width were adjusted for the systematic effects of selection line, male morphotypes and female reproductive status, and for the covariables of age at slaughter within sex and body weight. Body and meat weights adjusted for the same effects (except body weight) were used to calculate meat yield (expressed as percentage of tail weight/body weight and exoskeleton-off weight/body weight). The edible meat weight and yield in this GFP population ranged from 12 to 15 g and 37 to 45%, respectively. The simple (Pearson) correlation coefficients between body traits (body weight, total length and AW) and meat weight were moderate to very high and positive (0.75–0.94), but the correlations between body traits and meat yield were negative (−0.47 to −0.74). There were strong linear positive relationships between measurements of body traits and meat weight, whereas relationships of body traits with meat yield were moderate and negative. Stepwise multiple regression analysis showed that the best model to predict meat weight included all body traits, with a coefficient of determination (R²) of 0.99 and a correlation between observed and predicted values of meat weight of 0.99. The corresponding figures for meat yield were 0.91 and 0.95, respectively. Body weight or length was the best predictor of meat weight, explaining 91–94% of observed variance when it was fitted alone in the model. By contrast, tail width explained a lower proportion (69–82%) of total variance in the single trait models.
It is concluded that in practical breeding programs, improvement of meat weight can be easily made through indirect selection for body trait combinations. The improvement of meat yield, albeit being more difficult, is possible by genetic means, with 91% of the variation in the trait explained by the body and carcass traits examined in this study.
Abstract:
This paper demonstrates the procedures for probabilistic assessment of a pesticide fate and transport model, PCPF-1, to elucidate the modeling uncertainty using the Monte Carlo technique. Sensitivity analyses are performed to investigate the influence of herbicide characteristics and related soil properties on model outputs using four popular rice herbicides: mefenacet, pretilachlor, bensulfuron-methyl and imazosulfuron. Uncertainty quantification showed that the simulated concentrations in paddy water varied more than those of paddy soil. This tendency decreased as the simulation proceeded to a later period but remained important for herbicides having either high solubility or a high 1st-order dissolution rate. The sensitivity analysis indicated that the PCPF-1 parameters requiring careful determination are, primarily, those involved in herbicide adsorption (the organic carbon content, the bulk density and the volumetric saturated water content); secondarily, parameters related to herbicide mass distribution between paddy water and soil (1st-order desorption and dissolution rates); and lastly, those involving herbicide degradation. © Pesticide Science Society of Japan.
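The Monte Carlo idea, propagating parameter uncertainty through the fate model into a distribution of predicted concentrations, can be sketched with a toy dissipation model. The partitioning and rate expressions below are stand-ins, not the PCPF-1 equations, and all parameter ranges are hypothetical.

```python
import math
import random
import statistics

random.seed(0)

def water_conc(day, k_diss, kd, applied=100.0):
    """Toy fate model: the applied mass partitions between paddy water
    and soil according to kd, then dissipates first-order in water."""
    frac_water = 1.0 / (1.0 + kd)
    return applied * frac_water * math.exp(-k_diss * day)

# Sample uncertain inputs and collect the day-7 concentration.
samples = [water_conc(7,
                      k_diss=random.uniform(0.1, 0.3),     # dissipation rate, 1/day
                      kd=random.lognormvariate(0.0, 0.5))  # sorption coefficient
           for _ in range(5000)]
mean_c = statistics.mean(samples)
cv = statistics.stdev(samples) / mean_c  # relative uncertainty of the PEC
```

A sensitivity analysis in this setting amounts to comparing how much of the output spread each sampled input accounts for, e.g. by correlating each input draw with the output.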
Abstract:
Pesticide use in paddy rice production may contribute to adverse ecological effects in surface waters. Risk assessments conducted for regulatory purposes depend on the use of simulation models to determine predicted environment concentrations (PEC) of pesticides. Often tiered approaches are used, in which assessments at lower tiers are based on relatively simple models with conservative scenarios, while those at higher tiers have more realistic representations of physical and biochemical processes. This chapter reviews models commonly used for predicting the environmental fate of pesticides in rice paddies. Theoretical considerations, unique features, and applications are discussed. This review is expected to provide information to guide model selection for pesticide registration, regulation, and mitigation in rice production areas.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions not only affects the outcome of collaboration but also collaboration itself. We created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. We are currently in the process of conducting a study that aims at assessing the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants to become more active in collaboration.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders together with modeling experts create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the collaboration outcome and also the perception of the collaboration itself. In order to overcome this problem we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment thus improving modeling performance and collaboration.
Abstract:
Introduction: two symposia on "Cardiovascular Diseases and Vulnerable Plaques". Cardiovascular disease (CVD) is the leading cause of death worldwide. Huge effort has been made in many disciplines, including medical imaging, computational modeling, biomechanics, bioengineering, medical devices, animal and clinical studies, population studies, as well as genomic, molecular, cellular and organ-level studies, seeking improved methods for early detection, diagnosis, prevention and treatment of these diseases [1-14]. However, the mechanisms governing the initiation, progression and occurrence of final acute clinical CVD events are still poorly understood. A large number of victims of these diseases who are apparently healthy die suddenly without prior symptoms. Available screening and diagnostic methods are insufficient to identify the victims before the event occurs [8,9]. Most cardiovascular diseases are associated with vulnerable plaques. A grand challenge here is to develop new imaging techniques, predictive methods and patient screening tools to identify vulnerable plaques and patients who are more vulnerable to plaque rupture and associated clinical events such as stroke and heart attack, and to recommend proper treatment plans to prevent those clinical events from happening. Articles in this special issue came from two symposia held recently focusing on "Cardiovascular Diseases and Vulnerable Plaques: Data, Modeling, Predictions and Clinical Applications." One was held at Worcester Polytechnic Institute (WPI), Worcester, MA, USA, July 13-14, 2014, right after the 7th World Congress of Biomechanics. This symposium was endorsed by the World Council of Biomechanics and partially supported by a grant from the NIH National Institute of Biomedical Imaging and Bioengineering. The other was held at Southeast University (SEU), Nanjing, China, April 18-20, 2014.
Abstract:
A modeling paradigm is proposed for covariate, variance and working correlation structure selection for longitudinal data analysis. Appropriate selection of covariates is pertinent to correct variance modeling, and selecting the appropriate covariates and variance function is in turn vital to correlation structure selection. This leads to a stepwise model selection procedure that deploys a combination of different model selection criteria. Although these criteria share a common theoretical root, approximating the Kullback-Leibler distance, they are designed to address different aspects of model selection and have different merits and limitations. For example, the extended quasi-likelihood information criterion (EQIC) with a covariance penalty performs well for covariate selection even when the working variance function is misspecified, but EQIC contains little information on correlation structures. The proposed model selection strategies are outlined and a Monte Carlo assessment of their finite sample properties is reported. Two longitudinal studies are used for illustration.
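EQIC itself is beyond a short sketch, but the shape of one forward step of such a criterion-based stepwise procedure can be shown with a Gaussian AIC as a stand-in criterion; the data and candidate names below are hypothetical.

```python
import math

def gaussian_aic(rss, n, k):
    """Gaussian AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

def rss_one_predictor(x, y):
    """Residual sum of squares of a one-predictor least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return sum((b - my - slope * (a - mx)) ** 2 for a, b in zip(x, y))

def best_covariate(candidates, y):
    """One forward-selection step: keep the covariate whose
    one-predictor model minimises the criterion."""
    n = len(y)
    return min(candidates,
               key=lambda name: gaussian_aic(rss_one_predictor(candidates[name], y), n, 2))
```

The stepwise procedure the abstract describes alternates steps like this across covariate, variance-function and correlation-structure choices, swapping in the criterion best suited to each aspect.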
Abstract:
With the rapid development of various technologies and applications in smart grid implementation, demand response has attracted growing research interest because of its potential to enhance power grid reliability with reduced system operation costs. This paper presents a new demand response model with elastic economic dispatch in a locational marginal pricing market. It models system economic dispatch as a feedback control process, and introduces a flexible and adjustable load cost as a controlled signal to adjust demand response. Compared with the conventional "one-time-use" static load dispatch model, this dynamic feedback demand response model may adjust the load to a desired level in a finite number of time steps, and a proof of convergence is provided. In addition, Monte Carlo simulation and boundary calculation using interval mathematics are applied to describe the uncertainty of end-users' response to an independent system operator's expected dispatch. A numerical analysis based on the modified Pennsylvania-Jersey-Maryland power pool five-bus system is introduced for simulation, and the results verify the effectiveness of the proposed model. System operators may use the proposed model to obtain insights into demand response processes for their decision-making regarding system load levels and operation conditions.
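The feedback-control view of dispatch can be caricatured in a few lines: at each step, the adjustable load-cost signal moves the elastic load a fraction of the remaining gap, giving convergence in a finite number of steps. The gain and tolerance values are hypothetical, and this is not the paper's dispatch formulation.

```python
def dispatch_to_target(load, target, gain=0.5, tol=0.01, max_steps=100):
    """Proportional feedback: each step the price signal nudges the
    elastic load a fixed fraction of the way toward the desired level,
    so the gap shrinks geometrically."""
    steps = 0
    while abs(load - target) > tol and steps < max_steps:
        load += gain * (target - load)  # controlled load-cost adjustment
        steps += 1
    return load, steps

final_load, steps = dispatch_to_target(120.0, 100.0)  # MW, hypothetical
```

Uncertainty in how end-users actually respond, which the paper handles with Monte Carlo simulation and interval bounds, would amount here to making the effective gain random or interval-valued.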