129 results for 2447: modelling and forecasting


Relevance: 100.00%

Abstract:

Background:
The physical periphery of a biological cell is governed mainly by signaling pathways triggered by transmembrane proteins and receptors, which act as sentinels controlling the whole gene regulatory network of a cell. However, our current knowledge of the gene regulatory mechanisms that are governed by extracellular signals is severely limited.

Results:
The purpose of this paper is threefold. First, we infer a gene regulatory network from a large-scale B-cell lymphoma expression data set using the C3NET algorithm. Second, we provide a functional and structural analysis of the largest connected component of this network, revealing that this network component corresponds to the peripheral region of a cell. Third, we analyze the hierarchical organization of network components of the whole inferred B-cell gene regulatory network by introducing a new approach that exploits the variability within the data as well as the inferential characteristics of C3NET. As a result, we find a functional bisection of the network corresponding to different cellular components.

Conclusions:
Overall, our study highlights the peripheral gene regulatory network of B-cells and shows that it is centered around hub transmembrane proteins located at the physical periphery of the cell. In addition, we identify a variety of novel pathological transmembrane proteins, such as ion channel complexes and signaling receptors, in B-cell lymphoma. © 2012 Simoes et al.; licensee BioMed Central Ltd.
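The network-analysis step described above (extracting the largest connected component of an inferred regulatory network) can be sketched in a few lines. The edge list and gene names below are a toy illustration, not actual C3NET output (C3NET itself is distributed as an R package):

```python
from collections import defaultdict, deque

def largest_connected_component(edges):
    """Return the node set of the largest connected component
    of an undirected graph given as (u, v) edge pairs."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), set()
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        comp, queue = {start}, deque([start])
        while queue:                     # breadth-first traversal
            node = queue.popleft()
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    comp.add(nbr)
                    queue.append(nbr)
        if len(comp) > len(best):
            best = comp
    return best

# Toy edge list standing in for inferred regulator-target pairs.
edges = [("CD79A", "CD79B"), ("CD79B", "SYK"), ("TP53", "MDM2")]
print(sorted(largest_connected_component(edges)))  # ['CD79A', 'CD79B', 'SYK']
```

The component returned here would then be the input to the functional and structural analysis the paper describes.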

Relevance: 100.00%

Abstract:

The emergence of programmable logic devices as processing platforms for digital signal processing applications poses challenges concerning rapid implementation and high-level optimisation of algorithms on these platforms. This paper describes Abhainn, a rapid implementation methodology and toolsuite for translating an algorithmic expression of the system into a working implementation on a heterogeneous multiprocessor/field programmable gate array platform, or a standalone system-on-programmable-chip solution. Two particular focuses for Abhainn are the automated but configurable realisation of inter-processor communication fabrics, and the establishment of novel dedicated hardware component design methodologies allowing algorithm-level transformation for system optimisation. This paper outlines the approaches employed in both these instances.

Relevance: 100.00%

Abstract:

It is now accepted that changes in the Earth’s climate are having a profound effect on the distributions of a wide variety of species. One aspect of these changes that has only recently received any attention, however, is their potential effect on levels of within-species genetic diversity. Theoretical, empirical and modelling studies suggest that the impact of trailing-edge population extirpation on range-wide intraspecific diversity will be most pronounced in species that harbour the majority of their genetic variation at low latitudes as a result of changes during the Quaternary glaciations. In the present review, I describe the historical factors that have determined current patterns of genetic variation across the ranges of Northern North Atlantic species, highlight the fact that the majority of these species do indeed harbour a disproportionate level of genetic diversity in rear-edge populations, and outline how combined species distribution modelling and genetic analyses can provide insights into the potential effects of climate change on their overall genetic diversity.

Relevance: 100.00%

Abstract:

The creation of idealised, dimensionally reduced meshes for preliminary design and optimisation remains a time-consuming, manual task. A dimensionally reduced model is ideal for assessing design changes through modification of element properties without the need to create a new geometry or mesh. In this paper, a novel approach for automating the creation of mixed dimensional meshes is presented. The input to the process is a solid model which has been decomposed into a non-manifold assembly of smaller volumes with different meshing significance. Associativity between the original solid model and the dimensionally reduced equivalent is maintained. The approach is validated by means of a free-free modal analysis on an output mesh of a gas turbine engine component of industrial complexity. Extensions and enhancements to this work are also discussed.

Relevance: 100.00%

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we defined SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature, we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
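The three questions the VULA method answers for a KPI (how deep the underperformance, how often, and how long) can be sketched as summary statistics over a simulated KPI trace. The function and metric names below are illustrative assumptions; the thesis's exact statistics are not reproduced here:

```python
def vulnerability_profile(kpi_series, threshold):
    """Summarise underperformance episodes in a KPI trace:
    how deep (mean shortfall), how often (episode count),
    and how long (mean episode duration)."""
    episodes, shortfalls, current = [], [], 0
    for value in kpi_series:
        if value < threshold:            # KPI below acceptable level
            current += 1
            shortfalls.append(threshold - value)
        elif current:                    # episode just ended
            episodes.append(current)
            current = 0
    if current:                          # trace ends mid-episode
        episodes.append(current)
    return {
        "episodes": len(episodes),
        "mean_duration": sum(episodes) / len(episodes) if episodes else 0.0,
        "mean_shortfall": sum(shortfalls) / len(shortfalls) if shortfalls else 0.0,
    }

# Daily service level (%) from a discrete-event simulation run,
# with 95% as the acceptable threshold: two disturbance episodes.
print(vulnerability_profile([97, 96, 92, 90, 96, 98, 93, 97], 95))
```

In a full VULA assessment these summaries would be compared across redesign scenarios to decide where process redesign pays off.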
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance-impact-reducing redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, followed by repetition of the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas - SC robustness, and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as a basis of a VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.

Relevance: 100.00%

Abstract:

Current conceptual models of reciprocal interactions linking soil structure, plants and arbuscular mycorrhizal fungi emphasise positive feedbacks among the components of the system. However, dynamical systems with high dimensionality and several positive feedbacks (i.e. mutualism) are prone to instability. Further, organisms such as arbuscular mycorrhizal fungi (AMF) are obligate biotrophs of plants and are considered major biological agents in soil aggregate stabilization. With these considerations in mind, we developed dynamical models of soil ecosystems that reflect the main features of current conceptual models and empirical data, especially positive feedbacks and linear interactions among plants, AMF and the component of soil structure dependent on aggregates. We found that systems become increasingly unstable the more positive effects with Type I functional response (i.e., the growth rate of a mutualist is modified by the density of its partner through linear proportionality) are added to the model, to the point that increasing the realism of models by adding linear effects produces the most unstable systems. The present theoretical analysis thus offers a framework for modelling and suggests new directions for experimental studies on the interrelationship between soil structure, plants and AMF. Non-linearity in functional responses, spatial and temporal heterogeneity, and indirect effects can be invoked on a theoretical basis and experimentally tested in laboratory and field experiments in order to account for and buffer the local instability of the simplest of current scenarios. This first model presented here may generate interest in more explicitly representing the role of biota in soil physical structure, a phenomenon that is typically viewed in a more process- and management-focused context. (C) 2011 Elsevier Ltd. All rights reserved.
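The destabilising effect of linear (Type I) positive feedback can be made concrete with the smallest possible case: a two-species Lotka-Volterra mutualism, a toy stand-in for the paper's plant-AMF-soil system. With linear benefit terms, the interior equilibrium exists and is locally stable only while the product of the mutualistic coefficients stays below one:

```python
def mutualism_stable(r, a, b):
    """Local stability of a two-species Lotka-Volterra mutualism with
    linear (Type I) benefit terms:
        dx/dt = x (r - x + a*y),   dy/dt = y (r - y + b*x).
    The interior equilibrium x* = r(1+a)/(1-ab), y* = r(1+b)/(1-ab)
    exists only for a*b < 1; its Jacobian is [[-x*, a*x*], [b*y*, -y*]],
    giving trace < 0 and det = x*y*(1 - a*b) > 0, hence stability."""
    if a * b >= 1:
        return False          # no finite positive equilibrium: runaway growth
    x = r * (1 + a) / (1 - a * b)
    y = r * (1 + b) / (1 - a * b)
    trace = -x - y
    det = x * y * (1 - a * b)
    return trace < 0 and det > 0

print(mutualism_stable(1.0, 0.5, 0.5))  # True: weak mutualism is stable
print(mutualism_stable(1.0, 1.2, 1.0))  # False: strong linear feedback destabilises
```

Adding more mutualistic links multiplies such products through the community matrix, which is why stacking linear positive effects makes the larger models in the paper increasingly unstable.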

Relevance: 100.00%

Abstract:

System Dynamics enables modelling and simulation of highly non-linear feedback systems to predict future system behaviour. Parameter estimation and equation formulation are techniques in System Dynamics used to retrieve the values of parameters, or the equations for flows and/or variables. These techniques are crucial for specifying the model and, thereafter, for simulation. This paper critically examines existing and well-established approaches in parameter estimation and equation formulation along with their limitations, identifying performance gaps as well as providing directions for potential future research.
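As a minimal illustration of what parameter estimation means here, the sketch below calibrates the decay rate of a one-stock model against observed behaviour by least squares over a candidate grid. The model and function names are assumptions for illustration, standing in for the far richer techniques the paper surveys:

```python
def simulate_stock(decay_rate, inflow, s0, steps, dt=1.0):
    """Euler simulation of a one-stock System Dynamics model:
       dS/dt = inflow - decay_rate * S."""
    s, path = s0, []
    for _ in range(steps):
        s += dt * (inflow - decay_rate * s)
        path.append(s)
    return path

def estimate_decay_rate(observed, inflow, s0, candidates):
    """Parameter estimation by least squares over a candidate grid:
    pick the rate whose simulated trajectory best fits the data."""
    def sse(rate):
        sim = simulate_stock(rate, inflow, s0, len(observed))
        return sum((a - b) ** 2 for a, b in zip(sim, observed))
    return min(candidates, key=sse)

# Synthetic "observed" data generated with decay_rate = 0.3,
# then recovered from the candidate grid.
observed = simulate_stock(0.3, 10.0, 100.0, 20)
print(estimate_decay_rate(observed, 10.0, 100.0, [0.1, 0.2, 0.3, 0.4, 0.5]))  # 0.3
```

Real System Dynamics tools replace the grid search with gradient-based or heuristic optimisers, but the fit-simulate-compare loop is the same.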

Relevance: 100.00%

Abstract:

AIMS: To investigate the potential dosimetric and clinical benefits predicted by using four-dimensional computed tomography (4DCT) compared with 3DCT in the planning of radical radiotherapy for non-small cell lung cancer.

MATERIALS AND METHODS:
Twenty patients were planned using free-breathing 4DCT, then retrospectively delineated on three-dimensional helical scan sets (3DCT). Beam arrangement and total dose (55 Gy in 20 fractions) were matched for 3D and 4D plans. Plans were compared for differences in planning target volume (PTV) geometry and normal tissue complication probability (NTCP) for organs at risk using dose-volume histograms. Tumour control probability and NTCP were modelled using the Lyman-Kutcher-Burman (LKB) model. This was compared with a predictive clinical algorithm (Maastro), which is based on patient characteristics, including age, performance status, smoking history, lung function, tumour staging and concomitant chemotherapy, to predict survival and toxicity outcomes. Potential therapeutic gains were investigated by applying isotoxic dose escalation to both plans using constraints for mean lung dose (18 Gy), oesophageal maximum (70 Gy) and spinal cord maximum (48 Gy).

RESULTS:
4DCT based plans had lower PTV volumes, a lower dose to organs at risk and lower predicted NTCP rates on LKB modelling (P < 0.006). The clinical algorithm showed no difference for predicted 2-year survival and dyspnoea rates between the groups, but did predict for lower oesophageal toxicity with 4DCT plans (P = 0.001). There was no correlation between LKB modelling and the clinical algorithm for lung toxicity or survival. Dose escalation was possible in 15/20 cases, with a mean increase in dose by a factor of 1.19 (10.45 Gy) using 4DCT compared with 3DCT plans.

CONCLUSIONS:
4DCT can theoretically improve therapeutic ratio and dose escalation based on dosimetric parameters and mathematical modelling. However, when individual characteristics are incorporated, this gain may be less evident in terms of survival and dyspnoea rates. 4DCT allows potential for isotoxic dose escalation, which may lead to improved local control and better overall survival.
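The LKB NTCP calculation used for the modelling above reduces a dose-volume histogram to a generalised equivalent uniform dose (gEUD) and maps it through the normal CDF. The sketch below shows that two-step computation; the DVH bins and the parameter values (n, m, TD50) are illustrative placeholders, not the organ-specific fits used in the study:

```python
import math

def lkb_ntcp(dvh, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential dose-volume histogram.
    dvh: list of (dose_Gy, fractional_volume) bins whose volumes sum to 1.
    Step 1: reduce the DVH to a generalised equivalent uniform dose,
            gEUD = (sum_i v_i * d_i**(1/n))**n.
    Step 2: map gEUD to a complication probability via the probit link,
            NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    eud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))   # normal CDF

# Toy lung DVH: 40% of volume at 5 Gy, 40% at 20 Gy, 20% at 40 Gy.
dvh = [(5.0, 0.4), (20.0, 0.4), (40.0, 0.2)]
print(round(lkb_ntcp(dvh, n=0.87, m=0.18, td50=24.5), 3))
```

Comparing this value between the 3DCT and 4DCT plans for the same organ is exactly the dosimetric comparison the abstract reports.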

Relevance: 100.00%

Abstract:

The disjunct distributions of the Lusitanian flora, which are found only in south-west Ireland and northern Iberia, and are generally absent from intervening regions, have been of great interest to biogeographers. There has been much debate as to whether Irish populations represent relicts that survived the Last Glacial Maximum (LGM; approximately 21 kya), or whether they recolonized from southern refugia subsequent to the retreat of the ice and, if so, whether this occurred directly (i.e. the result of long distance dispersal) or successively (i.e. in the manner of a ‘steeplechase’, with the English Channel and Irish Sea representing successive ‘water-jumps’ that have to be successfully crossed). In the present study, we used a combined palaeodistribution modelling and phylogeographical approach to determine the glacial history of the Irish spurge, Euphorbia hyberna, the sole member of the Lusitanian flora that is also considered to occur naturally in south-western England. Our findings suggest that the species persisted through the LGM in several southern refugia, and that northern populations are the result of successive recolonization of Britain and Ireland during the postglacial Littletonian warm stage, akin to the ‘steeplechase’ hypothesis.

Relevance: 100.00%

Abstract:

This study is the first to compare random regret minimisation (RRM) and random utility maximisation (RUM) in a freight transport application, considering a choice scenario that involves a negative shock in the reference alternative. Based on data from two stated choice experiments conducted among Swiss logistics managers, the study contributes to the related literature by exploring for the first time the use of mixed logit models in the most recent version of the RRM approach. We further investigate the two paradigms by computing elasticities and forecasting choice probabilities. We find that regret is important in describing the managers' choices. Regret increases in the shock scenario, supporting the idea that a shift in reference point can cause a shift towards regret minimisation. Differences in elasticities and forecast probabilities are identified and discussed.
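The two paradigms differ only in how an alternative is scored before the logit transform: RUM scores it by its own utility, while Chorus-style RRM scores it by regret accumulated against every competitor. The attribute levels and taste parameters below are made up for illustration, not the Swiss stated-choice estimates:

```python
import math

def rum_probs(utilities):
    """Random utility maximisation: standard multinomial logit."""
    e = [math.exp(v) for v in utilities]
    total = sum(e)
    return [x / total for x in e]

def rrm_probs(attrs, betas):
    """Random regret minimisation: an alternative's regret sums
    ln(1 + exp(beta_m * (x_jm - x_im))) over every competing
    alternative j and attribute m; choice probabilities are a
    logit over negative regret."""
    n = len(attrs)
    regrets = []
    for i in range(n):
        r = sum(
            math.log(1.0 + math.exp(b * (attrs[j][m] - attrs[i][m])))
            for j in range(n) if j != i
            for m, b in enumerate(betas)
        )
        regrets.append(r)
    e = [math.exp(-r) for r in regrets]
    total = sum(e)
    return [x / total for x in e]

# Two freight alternatives described by (cost, time);
# alternative 1 is cheaper but much slower.
attrs = [(3.0, 2.0), (2.0, 5.0)]
betas = [-1.0, -0.5]                 # negative tastes for cost and time
print(rrm_probs(attrs, betas))
print(rum_probs([-3.0 - 0.5 * 2.0, -2.0 - 0.5 * 5.0]))  # RUM, same tastes
```

Because regret is reference-dependent (it grows whenever any competitor improves), a negative shock to the reference alternative shifts RRM probabilities in a way the purely own-utility RUM score cannot capture, which is the behavioural effect the paper tests.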

Relevance: 100.00%

Abstract:

A reliable and valid instrument is needed to screen for depression in palliative patients. The interRAI Depression Rating Scale (DRS) is based on seven items in the interRAI Palliative Care instrument. This study is the first to explore the dimensionality, reliability and validity of the DRS in a palliative population. Palliative home care patients (n = 5,175) residing in Ontario (Canada) were assessed with the interRAI Palliative Care instrument. Exploratory factor analysis and Mokken scale analysis were used to identify candidate conceptual models and evaluate scale homogeneity/performance. Confirmatory factor analysis compared models using standard goodness-of-fit indices. Convergent and divergent validity were investigated by examining polychoric correlations between the DRS and other items. The “known groups” test determined if the DRS meaningfully distinguished among client subgroups. The non-hierarchical two factor model showed acceptable fit with the data, and ordinal alpha coefficients of 0.83 and 0.82 were observed for the two DRS subscales. Omega hierarchical (ωh) was 0.78 for the bifactor model, with the general factor explaining three quarters of the common variance. Despite the multidimensionality evident in the factor analyses, bifactor modelling and the Mokken homogeneity coefficient (0.34) suggest that the DRS is a coherent scale that captures important information on sub-constructs of depression (e.g., somatic symptoms). Higher correlations were seen between the DRS and mood and psychosocial well-being items, and lower correlations with functional status and demographic variables. The DRS distinguished in the expected manner for known risk factors (e.g., social support, pain). The results suggest that the DRS is primarily unidimensional and reliable for use in screening for depression in palliative care patients.
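For readers unfamiliar with the reliability coefficients reported above: the study's ordinal alpha is computed from polychoric correlations, but its classical counterpart, Cronbach's alpha, conveys the same idea and is easy to sketch. The toy item responses below are invented for illustration, not DRS data:

```python
def cronbach_alpha(items):
    """Classical Cronbach's alpha for a set of item-response columns.
    items: list of columns, one list of scores per item (same length).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    n = len(items[0])

    def var(xs):                       # sample variance (n-1 denominator)
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[r] for col in items) for r in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(col) for col in items) / var(totals))

# Toy responses on three 0-3 rated depression items (rows = patients).
items = [[0, 1, 2, 3, 3], [0, 1, 2, 2, 3], [1, 1, 2, 3, 3]]
print(round(cronbach_alpha(items), 2))
```

Coefficients of 0.83 and 0.82 like those reported for the DRS subscales indicate that the items co-vary strongly enough to be summed into a single screening score.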

Relevance: 100.00%

Abstract:

G-protein coupled receptors (GPCRs) are the targets of over half of all prescribed drugs today. The UniProt database has records for about 800 proteins classified as GPCRs, but drugs have only been developed against 50 of these. Thus, there is huge potential in terms of the number of targets for new therapies to be designed. Several breakthroughs in GPCR biased pharmacology, structural biology, modelling and scoring have resulted in a resurgence of interest in GPCRs as drug targets. Therefore, an international conference, sponsored by the Royal Society, with world-renowned researchers from industry and academia was recently held to discuss recent progress and highlight key areas of future research needed to accelerate GPCR drug discovery. Several key points emerged. Firstly, structures for all three major classes of GPCRs have now been solved and there is increasing coverage across the GPCR phylogenetic tree. This is likely to be substantially enhanced with data from X-ray free-electron laser sources as they move beyond proof of concept. Secondly, the concept of biased signalling or functional selectivity is likely to be prevalent in many GPCRs, and this presents exciting new opportunities for selectivity and the control of side effects, especially when combined with increasing data regarding allosteric modulation. Thirdly, there will almost certainly be some GPCRs that will remain difficult targets because they exhibit complex ligand dependencies and have many metastable states, rendering them difficult to resolve by crystallographic methods. Subtle effects within the packing of the transmembrane helices are likely to mask these states and contribute to this difficulty, and may play a role in species-dependent behaviour. This is particularly important because it has ramifications for how we interpret pre-clinical data.
In summary, collaborative efforts between industry and academia have delivered significant progress in terms of structure and understanding of GPCRs and will be essential for resolving problems associated with the more difficult targets in the future.

Relevance: 100.00%

Abstract:

This paper presents a multi-agent system approach to address the difficulties encountered in traditional SCADA systems deployed in critical environments such as electrical power generation, transmission and distribution. The approach models uncertainty and combines multiple sources of uncertain information to deliver robust plan selection. We examine the approach in the context of a simplified power supply/demand scenario using a residential grid-connected solar system, and consider the challenges of modelling and reasoning with uncertain sensor information in this environment. We discuss examples of plans and actions required for sensing, establish and discuss the effect of uncertainty on such systems, and investigate different uncertainty theories and how they can fuse uncertain information from multiple sources for effective decision making in such a complex system.
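One of the uncertainty theories commonly considered for such fusion is Dempster-Shafer evidence theory. The sketch below applies Dempster's rule of combination to two uncertain sensor reports over a two-hypothesis frame; it is a minimal textbook illustration, not the paper's agent implementation, and the mass values are invented:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments
    over the same frame of discernment. Masses are dicts mapping
    frozenset hypotheses to belief mass; mass falling on empty
    intersections is conflict, removed by renormalisation."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    norm = 1.0 - conflict
    return {h: w / norm for h, w in combined.items()}

FAULT, OK = frozenset({"fault"}), frozenset({"ok"})
EITHER = FAULT | OK   # explicit ignorance: "could be either"

# Two sensors both lean towards a supply fault, with residual ignorance.
s1 = {FAULT: 0.6, OK: 0.1, EITHER: 0.3}
s2 = {FAULT: 0.7, OK: 0.1, EITHER: 0.2}
fused = dempster_combine(s1, s2)
print(round(fused[FAULT], 3))  # belief in "fault" strengthens after fusion
```

An agent can then select among candidate plans using the fused belief rather than any single noisy sensor reading, which is the robustness benefit the paper argues for.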