Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, undesirable processing losses are unavoidable and have always been a main concern of the food processing sector. There are three methods of peeling fruits and vegetables, mechanical, chemical and thermal, with the choice depending on the class and type of produce. Mechanical methods are generally preferred because they do not create harmful effects on the tissue and they keep the edible portions of the produce fresh. The main disadvantages of mechanical peeling are material loss and deformation. Reducing material losses and improving process quality have a direct effect on the overall efficiency of the food processing industry, which calls for further study of the technological aspects of these operations. To enhance the effectiveness of industrial food practices, it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage in tough-skinned vegetables during mechanical peeling by developing a novel finite element (FE) model of the process using an explicit dynamic finite element analysis approach. A computer model of the mechanical peeling process will be developed in this study to simulate the energy consumption and the stress-strain interactions between cutter and tissue. Available finite element software and methods will be applied to establish the model. The main objective of the proposed study is to improve knowledge of the interactions and the variables involved in food operations, particularly in the peeling process. Understanding these interrelationships will help researchers and designers of food processing equipment to develop new and more efficient technologies. The presented work reviews the available literature and previous work in this area of research and identifies the current gaps in the modelling and simulation of food processes.
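To make the proposed simulation approach concrete, the following is a minimal sketch of the central-difference time stepping that underlies explicit dynamic FE analysis, reduced to a single degree of freedom standing in for the cutter-tissue contact. All parameter values are placeholders, and a real peeling model would involve many degrees of freedom, contact, and material failure.

```python
# One-DOF central-difference sketch of explicit dynamic time stepping
# (m*a + c*v + k*u = f), the integration scheme behind explicit FE codes.
# All values are placeholders, not calibrated tissue/cutter properties.
m, c, k = 1.0e-3, 0.05, 2.0e3   # mass (kg), damping (N·s/m), stiffness (N/m)
f = 5.0                          # constant cutting force (N), placeholder
dt = 1.0e-5                      # explicit schemes need dt below 2/sqrt(k/m)

u_prev, u = 0.0, 0.0             # displacements at t - dt and t
energy = 0.0                     # accumulated work done by the cutter force
for _ in range(2000):
    v = (u - u_prev) / dt                      # backward-difference velocity
    a = (f - c * v - k * u) / m                # acceleration from equilibrium
    u_next = 2.0 * u - u_prev + a * dt**2      # central-difference update
    energy += f * (u_next - u)                 # incremental cutting work
    u_prev, u = u, u_next

print(f"displacement {u:.4e} m, cutting energy {energy:.4e} J")
```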
Abstract:
The study presents a multi-layer genetic algorithm (GA) approach using correlation-based methods to facilitate damage determination for through-truss bridge structures. To begin, the structure's damage-suspicious elements are divided into several groups. In the first GA layer, the damage is initially optimised for all groups using a correlation-based objective function. In the second layer, the groups are combined into larger groups and the optimisation restarts from the normalised result of the first layer. The identification process then repeats until the final layer is reached, where a single group includes all structural elements and only minor optimisation is required to fine-tune the final result. Several damage scenarios on a complicated through-truss bridge example are used to demonstrate the proposed approach's effectiveness. Structural modal strain energy is employed as the variable vector in the correlation function for damage determination. Simulations and comparison with traditional single-layer optimisation show that the proposed approach is efficient and feasible for complicated truss bridge structures when measurement noise is taken into account.
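A minimal sketch of the multi-layer idea follows. The variable vector is a toy stand-in for the modal strain energy computed from an FE model, the GA is reduced to a simple mutation-only search, and the group sizes, regularisation term and all numbers are assumptions for illustration.

```python
# Toy sketch of the multi-layer scheme: optimise group-wise damage ratios
# against a correlation objective, then merge groups and refine. The GA is
# reduced to a mutation-only search and the "modal strain energy" is a toy
# surrogate; every number here is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(0)
N_ELEM = 16
true_damage = np.zeros(N_ELEM)
true_damage[[3, 11]] = 0.4                    # hypothetical damage scenario

def strain_energy(damage):
    """Toy stand-in for element modal strain energy from an FE model."""
    return np.linspace(1.0, 2.0, N_ELEM) * (1.0 - damage)

measured = strain_energy(true_damage)          # "measured" variable vector

def objective(damage):
    """Correlation of measured vs predicted vectors, lightly regularised."""
    r = np.corrcoef(measured, strain_energy(damage))[0, 1]
    return r - 0.01 * damage.sum()

def optimise_group(idx, start, iters=4000):
    """Mutation-only search over the damage ratios of elements in `idx`."""
    best, best_f = start.copy(), objective(start)
    for _ in range(iters):
        trial = best.copy()
        trial[idx] = np.clip(trial[idx] + rng.normal(0, 0.05, len(idx)), 0, 1)
        f = objective(trial)
        if f > best_f:
            best, best_f = trial, f
    return best

# Layer 1: four groups; layer 2: two merged groups; final layer: all elements.
layers = ([list(range(i, i + 4)) for i in (0, 4, 8, 12)],
          [list(range(0, 8)), list(range(8, 16))],
          [list(range(N_ELEM))])
estimate = np.zeros(N_ELEM)
for groups in layers:
    for g in groups:                           # each layer starts from the
        estimate = optimise_group(g, estimate) # previous layer's result

print(np.round(estimate, 2))
```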
Abstract:
Process bus networks are the next stage in the evolution of substation design, bringing digital technology to the high voltage switchyard. Benefits of process buses include facilitating the use of Non-Conventional Instrument Transformers, improved disturbance recording and phasor measurement, and the removal of costly, and potentially hazardous, copper cabling from substation switchyards and control rooms. This paper examines the role a process bus plays in an IEC 61850 based Substation Automation System. Measurements taken from a process bus substation are used to develop an understanding of the network characteristics of "whole of substation" process buses. The concept of "coherent transmission" is presented and its impact on Ethernet switches is examined. Experiments based on substation observations are used to investigate in detail the behavior of Ethernet switches under sampled value traffic. Test methods that can be used to assess the adequacy of a network are proposed, and examples of the application and interpretation of these tests are provided. Once sampled value frames are queued by an Ethernet switch, the additional delay incurred by subsequent switches is minimal; this allows switches to be used in switchyards to further reduce communications cabling without significantly impacting operation. The performance and reliability of a process bus network operating with close to the theoretical maximum number of digital sampling units (merging units or electronic instrument transformers) was investigated with networking equipment from several vendors, and has been demonstrated to be acceptable.
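The queuing observation can be illustrated with back-of-envelope arithmetic: synchronised merging units deliver their frames to the first switch simultaneously, so the burst must be serialised there, while downstream switches see already-spaced frames and each adds roughly one frame time plus its own latency. The frame size, link rate, switch latency and device counts below are assumptions, not measurements from the paper.

```python
# Back-of-envelope model of "coherent transmission": synchronised merging
# units deliver sampled value frames simultaneously, so the first switch must
# serialise the whole burst; later switches see spaced frames and each adds
# only about one frame time plus its own latency. All figures are assumed.
FRAME_BYTES = 130        # assumed sampled value frame size (IEC 61850-9-2)
LINK_RATE = 100e6        # 100 Mbit/s links
SWITCH_LATENCY = 5e-6    # assumed per-switch processing latency (s)
N_MU = 20                # merging units sampling synchronously
N_SWITCHES = 4           # cascaded switches between MU and relay

frame_time = FRAME_BYTES * 8 / LINK_RATE
first_hop = N_MU * frame_time + SWITCH_LATENCY        # burst serialisation
extra_per_hop = frame_time + SWITCH_LATENCY           # per additional switch
worst_case = first_hop + (N_SWITCHES - 1) * extra_per_hop

print(f"frame time         {frame_time * 1e6:6.2f} us")
print(f"first hop (burst)  {first_hop * 1e6:6.2f} us")
print(f"each extra switch  {extra_per_hop * 1e6:6.2f} us")
print(f"worst case, last frame through {N_SWITCHES} switches: "
      f"{worst_case * 1e6:.2f} us")
```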
Abstract:
The resection of DNA double-strand breaks (DSBs) to generate ssDNA tails is a pivotal event in the cellular response to these breaks. In the two-step model of resection, primarily elucidated in yeast, initial resection by Mre11-CtIP is followed by extensive resection by two distinct pathways involving Exo1 or BLM/WRN-Dna2. However, resection pathways and their exact contributions in humans in vivo are not as clearly worked out as in yeast. Here, we examined the contribution of Exo1 to DNA end resection in humans in vivo in response to ionizing radiation (IR) and its relationship with other resection pathways (Mre11-CtIP or BLM/WRN). We find that Exo1 plays a predominant role in resection in human cells along with an alternate pathway dependent on WRN. While Mre11 and CtIP stimulate resection in human cells, they are not absolutely required for this process and Exo1 can function in resection even in the absence of Mre11-CtIP. Interestingly, the recruitment of Exo1 to DNA breaks appears to be inhibited by the NHEJ protein Ku80, and the higher level of resection that occurs upon siRNA-mediated depletion of Ku80 is dependent on Exo1. In addition, Exo1 may be regulated by 53BP1 and Brca1, and the restoration of resection in BRCA1-deficient cells upon depletion of 53BP1 is dependent on Exo1. Finally, we find that Exo1-mediated resection facilitates a transition from ATM- to ATR-mediated cell cycle checkpoint signaling. Our results identify Exo1 as a key mediator of DNA end resection and DSB repair and damage signaling decisions in human cells.
Abstract:
This paper presents the idea of a compendium of process technologies, i.e., a concise but comprehensive collection of techniques for process model analysis that support research on the design, execution, and evaluation of processes. The idea originated from observations on the evolution of process-related research disciplines. Based on these observations, we derive design goals for a compendium. Then, we present the jBPT library, which addresses these goals by means of an implementation of common analysis techniques in an open source codebase.
Abstract:
Instances of parallel ecotypic divergence, where adaptation to similar conditions repeatedly causes similar phenotypic changes in closely related organisms, are useful for studying the role of ecological selection in speciation. Here we used a combination of traditional and next-generation genotyping techniques to test for the parallel divergence of plants from the Senecio lautus complex, a phenotypically variable groundsel that has adapted to disparate environments in the South Pacific. Phylogenetic analysis of a broad selection of Senecio species showed that members of the S. lautus complex form a distinct lineage that has diversified recently in Australasia. An inspection of thousands of polymorphisms in the genomes of 27 natural populations from the S. lautus complex in Australia revealed a signal of strong genetic structure independent of habitat and phenotype. Additionally, genetic differentiation between populations was correlated with the geographical distance separating them, and the genetic diversity of populations strongly depended on geographical location. Importantly, coastal forms appeared in several independent phylogenetic clades, a pattern consistent with the parallel evolution of these forms. Analyses of the patterns of genomic differentiation between populations further revealed that adjacent populations displayed greater genomic heterogeneity than allopatric populations and are differentiated according to variation in soil composition. These results are consistent with a process of parallel ecotypic divergence in the face of gene flow.
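The reported correlation between genetic and geographic distance is the classic isolation-by-distance signal, commonly tested with a Mantel test on the two pairwise distance matrices. The following sketch shows that procedure on randomly generated stand-in matrices; it is not the S. lautus data.

```python
# Mantel test sketch: correlate pairwise genetic distance with pairwise
# geographic distance and assess significance by permuting population labels.
# Both matrices are random stand-ins, not the S. lautus data.
import numpy as np

rng = np.random.default_rng(1)
n_pops = 27
coords = rng.uniform(0, 10, size=(n_pops, 2))            # fake locations
geo = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
gen = 0.05 * geo + rng.normal(0, 0.05, size=geo.shape)   # fake FST-like values
gen = (gen + gen.T) / 2
np.fill_diagonal(gen, 0)

def mantel(a, b, n_perm=999):
    iu = np.triu_indices_from(a, k=1)                    # upper-triangle pairs
    r_obs = np.corrcoef(a[iu], b[iu])[0, 1]
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(a.shape[0])                  # shuffle labels
        hits += np.corrcoef(a[p][:, p][iu], b[iu])[0, 1] >= r_obs
    return r_obs, (hits + 1) / (n_perm + 1)

r, pval = mantel(gen, geo)
print(f"Mantel r = {r:.3f}, p = {pval:.3f}")
```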
Abstract:
Phylogenetic relationships within the Tabanidae are largely unknown, despite their considerable medical and ecological importance. The first robust phylogenetic hypothesis for the horse fly tribe Scionini is provided, completing the systematic placement of all tribes in the subfamily Pangoniinae. The Scionini consists of seven mostly southern-hemisphere genera distributed in Australia, New Guinea, New Zealand and South America. A 5757 bp alignment of six genes, including mitochondrial (COI and COII), ribosomal (28S) and nuclear (AATS and CAD regions 1, 3 and 4) genes, was analysed for 176 taxa using both Bayesian and maximum likelihood approaches. Results indicate the Scionini are strongly monophyletic, with the exclusion of the only northern-hemisphere genus Goniops. The South American genera Fidena, Pityocera and Scione were strongly monophyletic, corresponding to current morphology-based classification schemes. The most widespread genus, Scaptia, was paraphyletic and formed nine strongly supported monophyletic clades, each corresponding either to the current subgenera or to several previously synonymised genera that should be formally resurrected. Molecular results also reveal a newly recognised genus endemic to New Zealand, formerly placed within Scaptia. Divergence time estimation was employed to assess the global biogeographical patterns in the Pangoniinae. These analyses demonstrated that the Scionini are a typical Gondwanan group whose diversification was influenced by the fragmentation of that ancient land mass. Furthermore, results indicate that the Scionini most likely originated in Australia and subsequently radiated to New Zealand and South America by both long-distance dispersal and vicariance. The phylogenetic framework of the Scionini provided herein will be valuable for taxonomic revisions of the Tabanidae.
Abstract:
The growth of graphene on SiC/Si substrates is an appealing alternative to growth on bulk SiC, both for cost reduction and for better integration of the material with Si-based electronic devices. In this paper, we present a complete in-situ study of the growth of epitaxial graphene on 3C-SiC(111)/Si(111) substrates via high-temperature annealing (ranging from 1125°C to 1375°C) in ultra-high vacuum (UHV). The quality and number of graphene layers have been thoroughly investigated using x-ray photoelectron spectroscopy (XPS), while the surface has been characterized by scanning tunnelling microscopy (STM). Ex-situ Raman spectroscopy measurements confirm our findings, which demonstrate the exponential dependence of the number of graphene layers on the annealing temperature.
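A sketch of how such an exponential (Arrhenius-type) temperature dependence can be quantified is given below; the layer counts and the functional form N(T) = A·exp(-Ea/(kB·T)) are assumptions for illustration, not the paper's measurements.

```python
# Least-squares fit of an Arrhenius-type form N(T) = A * exp(-Ea / (kB * T))
# to layer counts vs annealing temperature. The data points are invented
# placeholders, not the paper's XPS measurements.
import numpy as np

T = np.array([1125.0, 1200.0, 1275.0, 1375.0]) + 273.15  # anneal temps (K)
layers = np.array([1.0, 1.5, 2.5, 4.5])                  # assumed layer counts

kB = 8.617e-5                                            # Boltzmann, eV/K
slope, intercept = np.polyfit(1.0 / T, np.log(layers), 1)
Ea, A = -slope * kB, np.exp(intercept)
print(f"apparent activation energy ≈ {Ea:.2f} eV, prefactor A ≈ {A:.2e}")
```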
Abstract:
It was demonstrated recently that dramatic changes in the redox behaviour of gold/aqueous solution interfaces may be observed following either cathodic or thermal electrode pretreatment. Further work on the cathodic pretreatment of gold in acid solution revealed that as the activity of the gold surface was increased, its performance as a substrate for hydrogen gas evolution under constant potential conditions deteriorated. The change in activity of the gold atoms at the interface, which was attributed to a hydrogen embrittlement process (the occurrence of the latter was subsequently checked by surface microscopy), was confirmed, as in earlier work, by the appearance of a substantial anodic peak at ca. 0.5 V (RHE) in a post-activation positive sweep. Changes in the catalytic activity of a metal surface reflect the fact that the structure (or topography), thermodynamic activity and electronic properties of a surface are dependent not only on pretreatment but also, in the case of the hydrogen evolution reaction, vary with time during the course of reaction. As will be reported shortly, similar (and often more dramatic) time-dependent behaviour was observed for hydrogen gas evolution on other metal electrodes.
Abstract:
The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort, enabling the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method is rather strict in its input requirements, as it assumes the training data to be multivariate normal, which is not always available, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based upon the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setup for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via application to data from a benchmark structure in the field.
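A minimal sketch of the two ingredients, Mahalanobis squared distance novelty detection and Monte Carlo generation of additional training data controlled by a simple data condition index (here, the covariance condition number), is given below. The feature dimension, thresholds and data are illustrative assumptions, not the article's exact scheme.

```python
# Sketch: Mahalanobis squared distance (MSD) novelty detection with a
# Monte Carlo "top-up" of a small training set, controlled by a simple data
# condition index (the covariance condition number). Dimensions, thresholds
# and data are illustrative assumptions, not the article's exact scheme.
import numpy as np

rng = np.random.default_rng(2)
DIM = 8                                       # damage-sensitive feature size

# Small healthy-state training set: too few samples for a stable covariance.
baseline = rng.multivariate_normal(np.zeros(DIM), np.eye(DIM), size=15)

def msd(x, data):
    """Mahalanobis squared distance of rows of x from the training data."""
    mu = data.mean(axis=0)
    inv = np.linalg.inv(np.cov(data, rowvar=False))
    d = x - mu
    return np.einsum('...i,ij,...j->...', d, inv, d)

# Controlled data generation: draw extra samples from the fitted baseline
# model until the condition index stabilises below a target (or a cap hits).
train = baseline
while np.linalg.cond(np.cov(train, rowvar=False)) > 50 and len(train) < 5000:
    extra = rng.multivariate_normal(baseline.mean(axis=0),
                                    np.cov(baseline, rowvar=False), size=100)
    train = np.vstack([train, extra])

healthy = rng.multivariate_normal(np.zeros(DIM), np.eye(DIM), size=5)
damaged = rng.multivariate_normal(np.full(DIM, 1.5), np.eye(DIM), size=5)
print("samples used:", len(train))
print("MSD healthy :", np.round(msd(healthy, train), 1))
print("MSD damaged :", np.round(msd(damaged, train), 1))
```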
Abstract:
Hot spot identification (HSID) aims to identify potential sites (roadway segments, intersections, crosswalks, interchanges, ramps, etc.) with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high risk site as safe (false negative), and consequently lead to the misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor injury and property damage only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst the concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean like most methods in practice, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression. Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to society, simultaneously reduces the influence of under-reported PDO and minor injury crashes, and overcomes the limitations of the traditional NB model in dealing with a preponderance of zeros and right-skewed data.
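The following sketch illustrates the two steps described above on synthetic data: collapsing severity-specific crash counts into equivalent PDO (EPDO) values, then fitting a quantile regression so that sites exceeding a high conditional quantile are flagged. The severity weights, covariates and the use of statsmodels' QuantReg are illustrative choices, not the paper's exact specification.

```python
# Two-step sketch: (1) collapse severity-specific crash counts into
# equivalent-PDO (EPDO) scores, (2) fit a 90th-percentile quantile regression
# and flag sites whose observed EPDO exceeds their conditional quantile.
# All weights, covariates and data below are synthetic and illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_sites = 200
aadt = rng.uniform(1e3, 2e4, n_sites)        # traffic volume (veh/day), fake
length = rng.uniform(0.5, 5.0, n_sites)      # segment length (km), fake

# Synthetic, right-skewed crash counts by severity (fatal, injury, PDO).
lam = 1e-4 * aadt * length
fatal = rng.poisson(0.02 * lam)
injury = rng.poisson(0.3 * lam)
pdo = rng.poisson(lam)

# EPDO score; the severity weights are assumed for illustration only.
epdo = 300 * fatal + 20 * injury + pdo

X = sm.add_constant(np.column_stack([np.log(aadt), length]))
fit = sm.QuantReg(epdo, X).fit(q=0.90)       # 90th conditional quantile

hot = epdo > fit.predict(X)                  # flag high-risk sites
print(f"{hot.sum()} of {n_sites} sites flagged as hot spots")
```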
Abstract:
Multimedia communication capabilities are rapidly expanding, and visual information is easily shared electronically, yet funding bodies still rely on paper grant proposal submissions. Incorporating modern technologies will streamline the granting process by increasing the fidelity of grant communication, improving the efficiency of review, and reducing the cost of the process.
Abstract:
Urban agriculture plays an increasingly vital role in supplying food to urban populations. Changes in Information and Communications Technology (ICT) are already driving widespread change in diverse food-related industries such as retail, hospitality and marketing. It is reasonable to suspect that the fields of ubiquitous technology, urban informatics and social media equally have a lot to offer the evolution of core urban food systems. We use communicative ecology theory to describe emerging innovations in urban food systems according to their technical, discursive and social components. We conclude that social media in particular accentuate fundamental social interconnections normally effaced by conventional industrialised approaches to food production and consumption.
Abstract:
There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and assume different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. Based thereon, we identify specific research questions that guide the design of a framework for model alignment.
Abstract:
This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
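A minimal sketch of the indirect-inference ingredient, using the tractable death process that the paper employs for validation, is given below. The auxiliary estimator here targets the death rate directly, so the II map is effectively the identity; in general the map is estimated by regressing auxiliary fits on generative parameters. The prior, the utility and all settings are simplifications for illustration, not the authors' algorithm.

```python
# Sketch of design evaluation for a model with (in general) intractable
# likelihood, using a pure death process: survivors at time t follow
# Binomial(N0, exp(-b*t)), which we only ever *simulate*; a tractable
# auxiliary estimator maps observations back to the rate b.
# The prior, the utility and all settings are illustrative simplifications.
import numpy as np

rng = np.random.default_rng(4)
N0 = 50                                   # initial population size (assumed)

def simulate_deaths(b, t, size=1):
    """Draw survivor counts from the generative death process at time t."""
    return rng.binomial(N0, np.exp(-b * t), size=size)

def auxiliary_estimate(n_obs, t):
    """Auxiliary model: moment-based estimate of the death rate b.
    Here this targets b directly, so the II map is the identity; in general
    the map is estimated by regressing auxiliary fits on simulated b values."""
    p = np.clip(n_obs / N0, 1e-6, 1.0 - 1e-6)
    return -np.log(p) / t

def utility(t, prior_draws=200, reps=20):
    """Crude utility: negative mean squared error of the II estimate of b,
    averaged over a Uniform(0.1, 2) prior (larger is better)."""
    bs = rng.uniform(0.1, 2.0, prior_draws)
    mse = [np.mean((auxiliary_estimate(simulate_deaths(b, t, size=reps), t)
                    - b) ** 2) for b in bs]
    return -np.mean(mse)

designs = np.linspace(0.2, 5.0, 25)       # candidate single observation times
scores = [utility(t) for t in designs]
print(f"best observation time ≈ {designs[int(np.argmax(scores))]:.2f}")
```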