964 results for Model combination


Relevance:

30.00%

Publisher:

Abstract:

A model was tested to examine relationships among leadership behaviors, team diversity, and team process measures with team performance and satisfaction at both the team and leader-member levels of analysis. Relationships between leadership behavior and team demographic and cognitive diversity were hypothesized to have both direct effects on organizational outcomes and indirect effects through team processes. Leader-member differences were investigated to determine the effects of leader-member diversity on leader-member exchange quality, individual effectiveness, and satisfaction. Leadership had little direct effect on team performance, but several strong positive indirect effects through team processes. Demographic diversity had no impact on team processes, directly impacted only one performance measure, and moderated the leadership-to-team-process relationship. Cognitive diversity had a number of direct and indirect effects on team performance, with uniformly positive net effects, and did not moderate the leadership-to-team-process relationship. In sum, the team model suggests a complex combination of leadership behaviors positively impacting team processes, demographic diversity having little impact on team process or performance, cognitive diversity having a positive net impact, and team processes having mixed effects on team outcomes. At the leader-member level, leadership behaviors were a strong predictor of Leader-Member Exchange (LMX) quality. Leader-member demographic and cognitive dissimilarity were each predictors of LMX quality, but failed to moderate the leader-behavior-to-LMX-quality relationship. LMX quality was strongly and positively related to self-reported effectiveness and satisfaction. The study makes several contributions to the literature. First, it explicitly links leadership and team diversity. Second, demographic and cognitive diversity are conceptualized as distinct and multi-faceted constructs. Third, a methodology for creating an index of categorical demographic and interval cognitive measures is provided, so that diversity can be measured in a holistic, conjoint fashion. Fourth, the study simultaneously investigates the impact of diversity at the team and leader-member levels of analysis. Fifth, insights are provided into the moderating impact of different forms of team diversity on the leadership-to-team-process relationship. Sixth, the study incorporates a wide range of objective and independent measures to provide a 360° assessment of team performance.
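The conjoint index described in the third contribution can be illustrated with a Gower-style dissimilarity, which combines categorical and interval attributes in one measure. This is a hypothetical sketch of the general idea, not the dissertation's actual formula:

```python
def gower_dissimilarity(a, b, kinds, ranges):
    """Mean per-attribute dissimilarity: 0/1 mismatch for categorical
    attributes, range-normalised absolute difference for interval ones."""
    total = 0.0
    for x, y, kind, rng in zip(a, b, kinds, ranges):
        if kind == "categorical":
            total += 0.0 if x == y else 1.0
        else:
            total += abs(x - y) / rng
    return total / len(kinds)

def team_diversity(members, kinds, ranges):
    """Conjoint diversity: average pairwise dissimilarity over all pairs."""
    n = len(members)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(gower_dissimilarity(members[i], members[j], kinds, ranges)
               for i, j in pairs) / len(pairs)

# Two members: same functional background (categorical), cognitive scores
# 3 and 5 on a scale whose observed range is 4.
d = team_diversity([("eng", 3.0), ("eng", 5.0)],
                   ["categorical", "interval"], [None, 4.0])
```

Averaging per-attribute dissimilarities keeps the index on a common 0–1 scale regardless of how many demographic and cognitive attributes are mixed.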

Relevance:

30.00%

Publisher:

Abstract:

The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam-element load sensor was used to obtain the weight of the deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation is commonly not considered in these models; instead, an initial value of layer thickness and porosity is regularly assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. Drop-wise early condensation on a cold flat plate under natural convection, exposed to warm (room-temperature) humid air, was modeled. A nucleation rate was found, and the relation of heat to mass transfer (the Lewis number) was obtained. The Lewis number was found to be much smaller than unity, the standard value usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top-temperature and weight-variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
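The Lewis number finding can be put in context: it is simply the ratio of thermal to mass diffusivity, and for bulk humid air at room temperature that ratio is already a little below the unity value commonly assumed. The property values below are illustrative textbook figures, not data from the study:

```python
def lewis_number(alpha, d_ab):
    """Le = thermal diffusivity / mass diffusivity (dimensionless)."""
    return alpha / d_ab

# Illustrative bulk-air property values at ~300 K (textbook figures,
# not measurements from this work):
alpha_air = 2.2e-5    # m^2/s, thermal diffusivity of air
d_water_air = 2.5e-5  # m^2/s, diffusivity of water vapour in air
le = lewis_number(alpha_air, d_water_air)
```

The study's point is that in the early nucleation regime the effective Lewis number falls well below even this bulk value.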

Relevance:

30.00%

Publisher:

Abstract:

This research is based on the premises that teams can be designed to optimize their performance, and that appropriate team coordination is a significant factor in team performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job's structural characteristics. The TCM can be used to determine the team design characteristics most likely to lead the team to optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. The team member agents use decision making, and explicit and implicit mechanisms, to coordinate the job. Model validation included comparing the TCM's results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The ANOVA results were used to recommend the combination of levels of the experimental factors that optimizes completion time for a team that runs sailboat races. This research's main contribution to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models. In a stochastic job structure, the tasks required to complete the job change during the team's execution of the job. This research proposed three new types of dependencies between tasks needed to model a job as a stochastic structure: conditional sequential, single-conditional sequential, and merge dependencies.
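A conditional-sequential dependency can be sketched in a few lines: the stochastic outcome of one task selects which task runs next, which is what makes the job structure stochastic. The names, semantics, and sailing example below are assumptions for illustration, not the TCM's actual implementation:

```python
import random

class Task:
    """A job task with a name and a nominal duration (arbitrary units)."""
    def __init__(self, name, duration):
        self.name, self.duration = name, duration

def conditional_sequential(outcome, branches):
    """Conditional-sequential dependency (assumed semantics): the successor
    task is selected by the stochastic outcome of the predecessor task."""
    return branches[outcome]

# Hypothetical sailboat-race fragment: after trimming, the observed wind
# outcome decides whether the crew tacks or holds course.
random.seed(1)
outcome = random.choice(["wind_shift", "steady_wind"])
nxt = conditional_sequential(outcome,
                             {"wind_shift": Task("tack", 5.0),
                              "steady_wind": Task("hold_course", 2.0)})
```

Because the successor is resolved only at execution time, two simulation runs of the same job can require different task sequences.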

Relevance:

30.00%

Publisher:

Abstract:

Melanoma is one of the most aggressive types of cancer. It originates from the transformation of melanocytes present at the epidermal/dermal junction of the human skin. It is commonly accepted that melanomagenesis is influenced by the interaction of environmental factors, genetic factors, and tumor-host interactions. DNA photoproducts induced by UV radiation are, in normal cells, repaired by the nucleotide excision repair (NER) pathway. The prominent role of NER in cancer resistance is well exemplified by patients with Xeroderma Pigmentosum (XP). This disease results from mutations in components of the NER pathway, such as the XPA and XPC proteins. In humans, NER pathway disruption leads to the development of skin cancers, including melanoma. Similar to humans afflicted with XP, Xpa- and Xpc-deficient mice show high sensitivity to UV light and develop skin cancers, with the exception of melanoma. The Endothelin 3 (Edn3) signaling pathway is essential for the proliferation, survival, and migration of melanocyte precursor cells. Excessive production of Edn3 leads to the accumulation of large numbers of melanocytes in regions of the mouse skin where they are not normally found. In humans, the Edn3 signaling pathway has also been implicated in melanoma progression and metastatic potential. The goal of this study was the development of the first UV-induced melanoma mouse model dependent on the over-expression of Edn3 in the skin. The model reported here is distinguishable from all previously published models by two features: melanocytes are not transformed a priori, and melanomagenesis arises only upon neonatal UV exposure. In this model, melanomagenesis depends on the presence of Edn3 in the skin. Disruption of the NER pathway due to the lack of Xpa or Xpc proteins was not essential for melanomagenesis; however, it enhanced melanoma penetrance and decreased melanoma latency after a single neonatal erythemal UV dose. Exposure to a second dose of UV at six weeks of age did not change the time of appearance or penetrance of melanomas in this mouse model. Thus, a combination of neonatal UV exposure with excessive Edn3 in the tumor microenvironment is sufficient for melanomagenesis in mice; furthermore, NER deficiency exacerbates this process.

Relevance:

30.00%

Publisher:

Abstract:

To predict the maneuvering performance of a propelled SPAR vessel, a mathematical model was established as a path simulator. A system-based mathematical model was chosen as it offers advantages in cost and time over full Computational Fluid Dynamics (CFD) simulations. The model is intended to provide a means of optimizing the maneuvering performance of this new vessel type. In this study, the hydrodynamic forces and control forces are investigated as individual components, combined in a vectorial setting, and transferred to a body-fixed basis. SPAR vessels are known to be very sensitive to large-amplitude motions during maneuvers due to their relatively small hydrostatic restoring forces. Previous model tests of SPAR vessels have shown significant roll and pitch amplitudes, especially during course-change maneuvers. Thus, a full 6-DOF equation of motion was employed in the current numerical model. The mathematical model employed in this study was a combination of the model introduced by the Maneuvering Modeling Group (MMG) and the Abkowitz (1964) model. The new model represents the forces applied to the ship hull, the propeller forces, and the rudder forces independently, as proposed by the MMG, but uses the 6-DOF equation of motion introduced by Abkowitz to describe the motion of a maneuvering ship. The mathematical model was used to simulate the trajectory and motions of the propelled SPAR vessel in 10°/10°, 20°/20°, and 30°/30° standard zig-zag maneuvers, as well as turning circle tests at rudder angles of 20° and 30°. The simulation results were used to determine the maneuverability parameters (e.g. advance, transfer, and tactical diameter) of the vessel. The final model provides the means of predicting and assessing the performance of this vessel type and can be easily adapted to specific vessel configurations based on the generic SPAR-type vessel used in this study.
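The zig-zag test logic itself is easy to illustrate. The sketch below deliberately replaces the full 6-DOF MMG/Abkowitz model with a first-order Nomoto yaw equation, T·ṙ + r = K·δ, using assumed gain and time constant, purely to show how the rudder-reversal criterion drives the maneuver:

```python
import math

def zigzag(delta_deg, K=0.2, T=10.0, dt=0.1, t_end=200.0):
    """Simulate a delta/delta zig-zag with first-order Nomoto yaw dynamics,
    T * r_dot + r = K * delta.  The rudder is reversed whenever the heading
    passes the check angle (equal to the rudder angle) on the current side."""
    delta_max = math.radians(delta_deg)
    psi, r = 0.0, 0.0      # heading (rad) and yaw rate (rad/s)
    delta = delta_max      # initial rudder execute
    headings = []
    t = 0.0
    while t < t_end:
        r += dt * (K * delta - r) / T   # Euler step of the yaw equation
        psi += dt * r
        if delta > 0 and psi > delta_max:      # heading passed +check angle
            delta = -delta_max
        elif delta < 0 and psi < -delta_max:   # heading passed -check angle
            delta = delta_max
        headings.append(psi)
        t += dt
    return headings

h = zigzag(20.0)  # 20/20 zig-zag; overshoot angles can be read from h
```

The K and T values here are placeholders; for the SPAR vessel they would come from the identified hydrodynamic derivatives, and the 6-DOF model would additionally resolve the large roll and pitch excursions the abstract highlights.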


Relevance:

30.00%

Publisher:

Abstract:

Sea ice models contain many different parameterizations, one of the most commonly used being a subgrid-scale ice thickness distribution (ITD). The effect of this model component and the associated ice strength formulation on the reproduction of observed Arctic sea ice is assessed. To this end, the model's performance in reproducing satellite observations of sea ice concentration, thickness, and drift is evaluated. For an unbiased comparison, different model configurations with and without an ITD are tuned with an automated parameter optimization. The original combination of ITD and ice strength parameterization does not lead to better results than a simple single-category model. However, changing to a simpler ice strength formulation, which depends linearly on the mean ice thickness across all thickness categories, clearly reduces the model-data misfit when an ITD is used. In the original formulation, the ice strength depends strongly on the number of thickness categories, so that introducing more categories can lead to thicker albeit weaker ice on average.
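A strength formulation that is linear in the mean thickness is consistent with the widely used Hibler (1979) form, P = P*·h_mean·exp(-C·(1-A)). The sketch below shows that formulation applied across ITD categories; the tuning constants P* and C are assumed illustrative values:

```python
import math

def ice_strength_linear(cat_thickness, cat_conc, p_star=27500.0, c_star=20.0):
    """Ice strength linear in the mean thickness across ITD categories,
    in the spirit of Hibler (1979): P = P* * h_mean * exp(-C * (1 - A)).
    p_star (N/m^2) and c_star are assumed tuning constants; A is the
    total ice concentration summed over the categories."""
    h_mean = sum(h * a for h, a in zip(cat_thickness, cat_conc))
    a_total = sum(cat_conc)
    return p_star * h_mean * math.exp(-c_star * (1.0 - a_total))

# Fully ice-covered cell, two categories: 70% of the area at 1 m, 30% at 4 m.
p = ice_strength_linear([1.0, 4.0], [0.7, 0.3])
```

Because only the mean thickness enters, this strength is insensitive to how the ice is split across categories, which is exactly the property that removes the category-number dependence noted in the abstract.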

Relevance:

30.00%

Publisher:

Abstract:

Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancer. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, characteristics of these pathways such as redundancy, feedback, and drug resistance reduce the efficacy of single-target therapy and necessitate the use of more than one drug to target multiple nodes in the system. Choosing multiple targets with a high therapeutic index poses further challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Building such models, however, requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may simply be unavailable. Fortunately, other modeling tools exist that, though not as powerful as ordinary differential equations, do not need rates and initial conditions to model signaling pathways. Petri nets and graph theory are among these tools. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph network centrality measures for identifying prospective targets for single- and multiple-drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets. An algorithm is also developed to check whether the candidate targets can disable the intended outputs in the graph model of the system. We implement structural and dynamical models of the ErbB1-Ras-MAPK pathway and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise in identifying and ranking drug targets. Since the methodology uses only the structural information of the signaling pathways and does not need initial conditions or dynamical rates, it can be applied to larger networks.
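The prioritisation step can be illustrated on a toy graph. The wiring below is a generic textbook sketch of an ErbB1→MAPK cascade, not the thesis model, and the path-counting score is a crude stand-in for the centrality measures actually used:

```python
from collections import defaultdict

# Toy wiring of an ErbB1->MAPK-style cascade (illustrative only; node
# names and edges are assumptions, not the thesis model).
edges = [("EGF", "ErbB1"), ("ErbB1", "Grb2"), ("Grb2", "SOS"),
         ("SOS", "Ras"), ("Ras", "Raf"), ("Raf", "MEK"),
         ("MEK", "ERK"), ("ErbB1", "PI3K"), ("PI3K", "AKT")]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)

def paths_through(node, source, sinks):
    """Number of source->sink simple paths passing through `node`: a crude
    stand-in for the centrality scores used to prioritise drug targets."""
    count = 0
    def dfs(u, seen):
        nonlocal count
        if u in sinks:
            count += 1 if (node in seen or u == node) else 0
            return
        for w in adj[u]:
            if w not in seen:
                dfs(w, seen | {w})
    dfs(source, {source})
    return count

nodes = {x for e in edges for x in e}
ranking = sorted(nodes, key=lambda n: paths_through(n, "EGF", {"ERK", "AKT"}),
                 reverse=True)
```

On this toy network, ErbB1 lies on every path to both outputs and tops the ranking, which mirrors why upstream hub nodes tend to emerge as strong single-target candidates.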

Relevance:

30.00%

Publisher:

Abstract:

The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets, including Worldview-2 bands, band difference ratios (BDR), and topographical derivatives. Principal components analysis is further used to test and reduce the dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested against sites identified through geological field survey. Testing shows the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
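At the core of the classification step is a linear discriminant. A minimal two-class Fisher LDA in two dimensions, solved in closed form in pure Python, shows the mechanics; the two features here merely stand in for, say, a Worldview-2 band and a topographic derivative:

```python
def fisher_lda_2d(class_a, class_b):
    """Two-class Fisher discriminant in 2-D, closed form:
    w = Sw^-1 (mu_a - mu_b), where Sw is the pooled within-class scatter.
    A pure-Python stand-in for the LDA classifier in the paper."""
    def mean(pts):
        n = len(pts)
        return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]
    ma, mb = mean(class_a), mean(class_b)
    # pooled within-class scatter matrix Sw (2x2)
    s = [[0.0, 0.0], [0.0, 0.0]]
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    # discriminant direction; project x onto w, threshold midway between
    # the projected class means
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

w = fisher_lda_2d([(1, 0), (2, 0), (1, 1), (2, 1)],
                  [(-1, 0), (-2, 0), (-1, 1), (-2, 1)])
```

In this symmetric example the classes differ only along the first feature, so the discriminant direction points purely along that axis; posterior probabilities for PPM would then come from the class-conditional densities along w.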

Relevance:

30.00%

Publisher:

Abstract:

Ground-source heat pump (GSHP) systems represent one of the most promising techniques for heating and cooling buildings. These systems use the ground as a heat source/sink, achieving better efficiency thanks to the small variation of the ground temperature across the seasons. The ground-source heat exchanger (GSHE) thus becomes a key component for optimizing the overall performance of the system. Moreover, the short-term response related to the dynamic behaviour of the GSHE is a crucial aspect, especially from a control perspective in on/off-controlled GSHP systems. In this context, a novel numerical GSHE model has been developed at the Instituto de Ingeniería Energética, Universitat Politècnica de València. By decoupling the short-term and long-term responses of the GSHE, the novel model allows the use of faster and more precise models on both sides. In particular, the short-term model considered is the B2G model, developed and validated in previous research conducted at the Instituto de Ingeniería Energética. For the long term, the g-function model was selected, since it is a previously validated and widely used model and presents some features that make it well suited to combination with the B2G model. The aim of the present paper is to describe the procedure for combining these two models into a single complete GSHE model for both short- and long-term simulation. The resulting model is then validated against experimental data from a real GSHP installation.
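The long-term side of such a hybrid model typically evaluates the borehole-wall temperature change by temporal superposition of load steps weighted by the g-function. The sketch below shows that superposition with a crude logarithmic placeholder response; it is not the validated g-function or the B2G model:

```python
import math

def borehole_wall_temp(times, loads, t_now, k_ground=2.0, g=None):
    """Long-term borehole-wall temperature change by temporal superposition
    of heat-load steps weighted by a response (g-) function:
        dT_b = sum_i (q_i - q_{i-1}) / (2*pi*k) * g(t_now - t_i)
    k_ground (W/m/K) and the log response below are assumed placeholders."""
    if g is None:
        g = lambda t: math.log(1.0 + t)  # placeholder response function
    dT, q_prev = 0.0, 0.0
    for t_i, q_i in zip(times, loads):
        if t_i < t_now:
            dT += (q_i - q_prev) / (2.0 * math.pi * k_ground) * g(t_now - t_i)
        q_prev = q_i
    return dT
```

In a short/long-term decoupled scheme, only the aggregated, older part of the load history is fed through this superposition, while the most recent hours are handled by the fast short-term model.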

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

The FIREDASS (FIRE Detection And Suppression Simulation) project is concerned with the development of fine water mist systems as a possible replacement for the halon fire suppression system currently used in aircraft cargo holds. The project is funded by the European Commission under the BRITE EURAM programme. The FIREDASS consortium is made up of a combination of industrial, academic, research, and regulatory partners. As part of this programme of work, a computational model has been developed to help engineers optimise the design of the water mist suppression system. This computational model is based on Computational Fluid Dynamics (CFD) and is composed of the following components: a fire model, a mist model, a two-phase radiation model, a suppression model, and a detector/activation model. The fire model, developed by the University of Greenwich, uses prescribed release rates for heat and gaseous combustion products to represent the fire load. Typical release rates have been determined through experimentation conducted by SINTEF. The mist model, developed by the University of Greenwich, is a Lagrangian particle tracking procedure that is fully coupled to both the gas phase and the radiation field. The radiation model, developed by the National Technical University of Athens, is described using a six-flux radiation model. The suppression model, developed by SINTEF and the University of Greenwich, is based on an extinguishment criterion that relies on oxygen concentration and temperature. The detector/activation model, developed by Cerberus, allows many different detector and mist configurations to be tested within the computational model. These sub-models have been integrated by the University of Greenwich into the FIREDASS software package. The model has been validated using data from the SINTEF/GEC test campaigns, and the computational model has been found to give good agreement with these experimental results. The best agreement is obtained at the ceiling, which is where the detectors and misting nozzles would be located in a real system. In this paper the model is briefly described and some results from the validation of the fire and mist models are presented.

Relevance:

30.00%

Publisher:

Abstract:

Targeted cancer therapy aims to disrupt aberrant cellular signalling pathways. Biomarkers are surrogates of pathway state, but there has been limited success in translating candidate biomarkers to clinical practice due to the intrinsic complexity of pathway networks. Systems biology approaches afford a better understanding of the complex, dynamical interactions in signalling pathways targeted by anticancer drugs. However, adoption of dynamical modelling by clinicians and biologists is impeded by model inaccessibility. Drawing on computer-games technology, we present a novel visualisation toolkit, SiViT, that converts systems biology models of cancer cell signalling into interactive simulations that can be used without specialist computational expertise. SiViT allows clinicians and biologists to directly introduce, for example, loss-of-function mutations and specific inhibitors. SiViT animates the effects of these interventions on pathway dynamics, suggesting further experiments and assessing candidate biomarker effectiveness. In a systems biology model of Her2 signalling we experimentally validated predictions made using SiViT, revealing the dynamics of biomarkers of drug resistance and highlighting the role of pathway crosstalk. No model is ever complete: the iteration of real data and simulation facilitates the continued evolution of more accurate, useful models. SiViT will make libraries of models accessible to support preclinical research, combinatorial strategy design, and biomarker discovery.

Relevance:

30.00%

Publisher:

Abstract:

Background: Expressed Sequence Tags (ESTs) are generally used to gain a first insight into the gene activities of a species of interest. Subsequently, and typically based on a combination of EST and genome sequences, microarray-based expression analyses are performed for a variety of conditions. In some cases, a multitude of EST and microarray experiments are conducted for one species, covering different tissues, cell states, and cell types. Under these circumstances, the challenge arises to combine results derived from the different expression profiling strategies, with the goal of uncovering novel information on the basis of the integrated datasets. Findings: Using our new analysis tool, MediPlEx (MEDIcago truncatula multiPLe EXpression analysis), expression data from EST experiments, oligonucleotide microarrays, and Affymetrix GeneChips® can be combined and analyzed, leading to a novel approach to integrated transcriptome analysis. We validated our tool via the identification of a set of well-characterized AM-specific and AM-induced marker genes, identified by MediPlEx on the basis of in silico and experimental gene expression profiles from roots colonized with AM fungi. Conclusions: MediPlEx offers an integrated analysis pipeline for different sets of expression data generated for the model legume Medicago truncatula. As expected, in silico and experimental gene expression data covering the same biological condition correlate well. The collection of differentially expressed genes identified via MediPlEx provides a starting point for functional studies in plant mutants.

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms involve many random (not very rational) decisions. The combination of evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of three of our algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective evolutionary algorithm; and (3) the regularity-model-based multiobjective estimation of distribution algorithm (RM-MEDA), which uses the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
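MOEA/D's decomposition idea can be stated in a few lines: each weight vector defines one scalar subproblem, commonly via the Tchebycheff scalarisation g(x | λ, z*) = max_i λ_i · |f_i(x) − z_i*|, and neighbouring subproblems cooperate by sharing good solutions. A minimal sketch of the scalarisation:

```python
def tchebycheff(f, lam, z_star):
    """MOEA/D Tchebycheff scalarisation:
    g(x | lambda, z*) = max_i lambda_i * |f_i(x) - z_i*|.
    Each weight vector `lam` turns the multiobjective problem into one
    scalar subproblem; minimising g pushes f towards the ideal point z*."""
    return max(l * abs(fi - zi) for l, fi, zi in zip(lam, f, z_star))

# Two objective values, equal weights, ideal point at the origin:
g = tchebycheff([1.0, 2.0], [0.5, 0.5], [0.0, 0.0])
```

Sweeping a set of well-spread weight vectors over this scalarisation yields one subproblem per desired point on the Pareto front, which is what the decomposition in MOEA/D exploits.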