857 results for "clustering and QoS-aware routing"


Relevance: 100.00%

Abstract:

AIM: To identify what medicines-related information children/young people or their parents/carers are able to recall following an out-patient clinic appointment. METHOD: A convenience sample of patients prescribed at least one new long-term (>6 weeks) medicine was recruited from a single UK paediatric hospital out-patient pharmacy. A face-to-face semi-structured questionnaire was administered to participants when they presented with their prescription. The questionnaire covered the following themes: names of the medicines, therapeutic indication, dose regimen, duration of treatment and adverse effects. The results were analysed using Microsoft Excel 2013. RESULTS: One hundred participants consented and were included in the study. One hundred and forty-five medicines were prescribed in total. Participants were able to recall the names of 96 (66%) medicines and were aware of the therapeutic indication for 142 (97.9%) medicines. The dose regimen was accurately described for 120 (82.8%) medicines, with the duration of treatment known for 132 (91%). Participants mentioned that they had been advised about side effects for 44 (30.3%) medicines. Specific counselling points recommended by the BNFc [1] were either omitted or not recalled by participants for the following systemic treatments: cetirizine (1), chlorphenamine (1), desmopressin (2), hydroxyzine (2), itraconazole (1), piroxicam (2), methotrexate (1), stiripentol (1) and topiramate (1). CONCLUSION: Following an out-patient consultation where a new medicine is prescribed, children and their parents/carers are usually able to recall the indication, dose regimen and duration of treatment. Few were able to recall, or were told about, possible adverse effects; these may include important drug-specific effects that require vigilance during treatment. Patients, along with families and carers, should be involved in the decision to prescribe a medicine [2]. This includes a discussion about the benefits of the medicine for the patient's condition and possible adverse effects [2]. Treatment side effects have been shown to be a factor in treatment non-adherence in paediatric long-term medical conditions [3]. Practitioners should explain to patients, and their family members or carers where appropriate, how to identify and report medicines-related patient safety incidents [4]. However, this study suggests that medical staff may not be comfortable discussing the adverse effects of medicines with patients or their parents/carers. Further research into the shared decision-making process in the paediatric out-patient clinic when a new long-term medicine is prescribed is required to further support medicines adherence and the patient safety agenda.

Relevance: 100.00%

Abstract:

Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially those associated with the skyrocketing power consumption of large data centers. Meanwhile, although efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy users' quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus on the development of scheduling methods for delay-sensitive cloud services on a single server, with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve service providers' profits. We next study a multi-tier service scheduling problem. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiency with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements. By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers, and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
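The multi-tier sub-deadline idea lends itself to a short illustration. Below is a minimal sketch, not taken from the dissertation: the function name and the proportional-split heuristic are assumptions, showing one plausible way to divide an end-to-end deadline across service tiers.

```python
def assign_sub_deadlines(total_deadline, mean_service_times):
    """Split an end-to-end deadline across service tiers.

    Hypothetical heuristic: each tier receives a share of the total
    deadline proportional to its mean service time, so slower tiers
    get proportionally more slack.
    """
    total = sum(mean_service_times)
    return [total_deadline * t / total for t in mean_service_times]

# Example: a 300 ms deadline over a 3-tier service (web, app, database)
print(assign_sub_deadlines(300.0, [20.0, 50.0, 30.0]))  # [60.0, 150.0, 90.0]
```

A real scheduler would refine such a split against measured queueing delays; the point is only that per-tier sub-deadlines turn one global QoS constraint into locally checkable ones.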

Relevance: 100.00%

Abstract:

Sarcoma metastatic to the brain is uncommon and rarely occurs as the initial manifestation of tumor. Alveolar soft part sarcoma (ASPS) is a rare but well-studied subtype of sarcoma. A 39-year-old man presented with seizures due to a left temporal meningeal-enhancing lesion with striking brain edema on MRI. The patient underwent neurosurgical resection for suspected meningioma. Histology showed large tumor cells clustering and forming small nests, in places with a pseudoalveolar pattern. Diastase-resistant periodic acid-Schiff staining revealed very rare granular and rod-like cytoplasmic inclusions. Immunohistochemistry showed convincing positivity only for vimentin and smooth muscle actin. The histological features were strongly suggestive of ASPS. At the molecular level, RT-PCR and sequencing analysis demonstrated an ASPSCR1-TFE3 fusion, confirming the histological diagnosis of ASPS. There was no evidence of a primary extracranial tumor on physical examination or on chest and abdominal CT scans 11 months after presentation. ASPS typically arises from the soft tissues of the extremities and develops multiple metastatic deposits, usually with a long clinical course. This case may represent primary meningeal ASPS, although a metastatic deposit from an undiscovered primary site cannot be entirely excluded.

Relevance: 100.00%

Abstract:

Background: Online grocery retail is growing and is in demand from more customers than ever before. The larger players are noticing this demand and realize that they need to expand their sales channels, while the smaller companies that helped create this demand must work to keep their customers loyal and avoid losing market share as the market situation is changed by the large players. The changed competitive situation means that companies need to place greater focus on the customer through strategy and, above all, the building of loyalty. Purpose: The purpose is to gain a better understanding of how a company that entered a market early keeps its customers loyal as the competitive situation changes. Method: To answer this, a qualitative case study of the company MatHem was conducted, with the aim of ultimately drawing a generalized conclusion; primary and secondary data were collected and analyzed. Conclusion: The studied company keeps its customers loyal by maintaining a high overall level of quality, meaning high-quality products, well-functioning customer service, exceeding customer expectations, and a broad assortment. When the competitive situation changes, the company has no specific strategies for keeping its customers loyal, since it does not regard the other players as competitors. The company is the most expensive player on the market, but differentiates itself with organic products in order to retain loyal customers.

Relevance: 100.00%

Abstract:

In the light of the twofold mission of Swedish schools, that is to say enabling pupils to develop both subject knowledge and a democratic attitude, the purpose of this thesis is to investigate to what extent adult higher education students from different language and social backgrounds, studying Swedish as a second language, are able to carry out joint writing assignments with the aid of deliberative discourse, and to what extent they thereby also develop a deliberative attitude. The twofold mission of education applies to them too. While a certain amount of research into deliberative discourse in schools already exists, the perspective of higher education didactics is still lacking in this research; the present study is to be viewed as a first contribution. Its theoretical starting point includes previous research into deliberative discourse, further developing an existing model of criteria for deliberative discourse: for example, that there is a striving towards agreement, although the consensus may be temporary; that diverging opinions can be set against each other; that tolerance and respect for views other than one's own are shown; and that traditional outlooks can be questioned. This model is supplemented by designations for a number of disruptive behaviours, such as ridiculing, ignoring, interrupting people and engaging in private conversations. The model thus extended then acts as a lens in the analysis of students' discussions when writing joint texts. Another theoretical starting point is the view of education as communication, and of the possibility of communication creating a third place, thereby developing democracy in the here-and-now situation. For this study, comprising 18 hours of observation of nine students (that is to say, the discussions of three groups writing texts on different occasions), various ethnographic data collection methods were employed, for example video recordings, participant observation, field notes and interviews in conjunction with the discussions. The analysis makes clear that the three groups developed their deliberation as the discussions about the joint assignment proceeded, and that most of the nine students furthermore expressed at least an openness towards a deliberative attitude in further discussions in the future. The disruptive behaviours from the analytical model that could be identified in the discussions, for example interruptions and private conversations, proved not to constitute real disturbances; on the contrary, they actually contributed to the development of the discussions, enabling them to continue. On the other hand, other, not previously identified disturbances occurred, for example a focus on grades, lack of time and lacking language ability, which all in different ways affected the students' attitudes towards their work. For any future didactical work on deliberative discourse in Swedish as a second language within higher education, these disturbances would need to be highlighted and brought to the awareness of both teachers and students. Keywords: higher education didactics, communication, deliberative discourse, deliberative attitude, John Dewey, Tomas Englund, heterogeneity, ethnographic data collection methods.

Relevance: 100.00%

Abstract:

This study aims to assess the level of mastery of the neuromarketing construct, and its influence, among professionals at advertising agencies in Brazil. Concepts related to this new approach are still little disseminated, and few analyses have been carried out in this area. The research is therefore qualitative and exploratory in nature, using books and articles on marketing, neuroscience and psychology as primary sources, complemented by secondary sources. In-depth interviews were conducted at the main advertising agencies in Brazil with the managers responsible for planning, and a content analysis was performed afterwards. Advances in brain science have enabled technological innovations aimed primarily at understanding consumers' unconscious experiences, which drive decision making and consumer behavior. These issues are related to neuromarketing, which in turn uses techniques such as fMRI, PET and fDOT. These scan the consumer's brain and produce images of neuronal structures and functioning while activities such as mental tasks, visualization of brands, images or products, and watching videos and commercials are performed. The agencies are constantly in search of new technologies and are aware of the limitations of current research instruments; on the other hand, they are not fully familiar with concepts related to neuromarketing. With regard to neuroimaging techniques, the research points to complete unawareness, although some agencies seem to envision positive impacts from using these techniques to evaluate films and to get to know the consumer better. Neuroimaging is also perceived as one technique among others, but its application is not yet real; there are barriers in the market and within the agencies themselves. These barriers, together with certain reservations and the scarce knowledge of neuromarketing, prevent it from being put into practice in the advertising market. It is also observed that, even with greater use of neuromarketing, there would be no meaningful changes in the functioning and structure of these agencies; the use of neuroimaging machines would take place in research institutes and the research centers of large companies. The results show that the level of mastery of the neuromarketing construct in Brazilian advertising agencies is only theoretical: little is known of the subject and of the neurological studies, and nothing at all of neuroimaging techniques.

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

This study concerns a paralic ecosystem, the Aveiro lagoon (Portugal). It aims to determine the organization of fish assemblages as a function of the characteristics and functioning of this ecosystem. The ichthyofauna was sampled monthly at 10 stations from August 1987 to July 1988 and from January 1999 to December 2000, with a traditional beach seine. The distribution of fish assemblages is studied by means of population descriptors (species and family richness, density, biomass and diversity index) and statistical analyses (clustering and ordination). The Aveiro lagoon shows strong spatial and temporal variations in its physico-chemical parameters, reflecting annual climatic variations. Given the mobility of fish and the geomorphology and hydrology of the system studied, a strong homogeneity in fish distribution might have been expected. Instead, a decrease in marine influence leads to a decrease in species and family richness, density and biomass. We also observed a change in the composition of the fish assemblage and the presence of dominant species characteristic of the different levels of confinement (the renewal rate of marine waters at a given point of the system). The fish assemblage shows an organization similar to the biological zonation described by the benthic macrofauna and induced by confinement, independently of physico-chemical parameters such as salinity. Comparison of the results with data obtained twelve years earlier shows that the general organization of the lagoon has remained unchanged, illustrating the stability of paralic ecosystems. In addition, changes in the level of confinement in the northern and southern margins, induced mainly by local changes in hydrodynamics, were observed. The deconfinement of the northern zone is a consequence of the maintenance of navigation channels by dredging. Conversely, the confinement of the southern zone is the natural evolution of paralic basins, which are often subject to high and rapid sedimentation. This study shows that the organization of the fish assemblage validates the confinement concept for the biological organization of paralic environments, and can be used to explain changes in these ecosystems.

Relevance: 100.00%

Abstract:

Scheduling problems are generally NP-hard combinatorial problems, and a lot of research has been done to solve these problems heuristically. However, most previous approaches are problem-specific, and research into the development of a general scheduling algorithm is still in its infancy. Mimicking the natural evolutionary process of the survival of the fittest, Genetic Algorithms (GAs) have attracted much attention in solving difficult scheduling problems in recent years. Some obstacles exist when using GAs: there is no canonical mechanism to deal with constraints, which are commonly met in most real-world scheduling problems, and small changes to a solution are difficult. To overcome both difficulties, indirect approaches have been presented (in [1] and [2]) for nurse scheduling and driver scheduling, where GAs are used by mapping the solution space, and separate decoding routines then build solutions to the original problem. In our previous indirect GAs, learning is implicit and is restricted to the efficient adjustment of weights for a set of rules that are used to construct schedules. The major limitation of those approaches is that they learn in a non-human way: like most existing construction algorithms, once the best weight combination is found, the rules used in the construction process are fixed at each iteration. However, a long sequence of moves is normally needed to construct a schedule, and using fixed rules at each move is thus unreasonable and not coherent with human learning processes. When a human scheduler is working, he normally builds a schedule step by step following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not completed yet, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this research we intend to design more human-like scheduling algorithms, by using ideas derived from Bayesian Optimization Algorithms (BOA) and Learning Classifier Systems (LCS) to implement explicit learning from past solutions. BOA can be applied to learn to identify good partial solutions and to complete them by building a Bayesian network of the joint distribution of solutions [3]. A Bayesian network is a directed acyclic graph with each node corresponding to one variable, and each variable corresponding to an individual rule by which a schedule will be constructed step by step. The conditional probabilities are computed according to an initial set of promising solutions. Subsequently, each new instance for each node is generated by using the corresponding conditional probabilities, until values for all nodes have been generated. Another set of rule strings is generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the Bayesian network is updated again using the current set of good rule strings. The algorithm thereby tries to explicitly identify and mix promising building blocks. It should be noted that for most scheduling problems the structure of the network model is known and all the variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data, so learning can amount to 'counting' in the case of multinomial distributions.
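As an illustration of the 'learning amounts to counting' point above, here is a minimal sketch. It is hypothetical code, not the authors': it treats the construction steps as independent (an empty network structure), whereas a full Bayesian network would also condition each step on its parent nodes.

```python
import random
from collections import Counter

def estimate_probabilities(promising, n_rules):
    """Per construction step, count how often each rule appears in the
    promising rule strings (multinomial maximum likelihood)."""
    n_steps = len(promising[0])
    probs = []
    for step in range(n_steps):
        counts = Counter(string[step] for string in promising)
        total = sum(counts.values())
        probs.append([counts.get(r, 0) / total for r in range(n_rules)])
    return probs

def sample_rule_string(probs):
    """Draw one new rule string, step by step, from those estimates."""
    return [random.choices(range(len(p)), weights=p)[0] for p in probs]

# Example: 4 promising strings over 5 construction steps, rules 0 and 1
promising = [[0, 0, 1, 1, 1], [0, 1, 1, 1, 0], [0, 0, 0, 1, 1], [1, 0, 1, 1, 1]]
print(sample_rule_string(estimate_probabilities(promising, n_rules=2)))
```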
In the LCS approach, each rule has a strength indicating its current usefulness in the system, and this strength is constantly assessed [4]. To implement sophisticated learning based on previous solutions, an improved LCS-based algorithm is designed, which consists of the following steps. The initialization step assigns each rule at each stage a constant initial strength. Rules are then selected using the roulette-wheel strategy. The next step reinforces the strengths of the rules used in the previous solution, keeping the strength of unused rules unchanged. The selection step selects fitter rules for the next generation (a minimal sketch of this strength-update loop follows the references below). It is envisaged that the LCS part of the algorithm will be used as a hill climber for the BOA. This is exciting and ambitious research, which might provide the stepping-stone for a new class of scheduling algorithms. Data sets from nurse scheduling and mall problems will be used as test-beds. It is envisaged that once the concept has been proven successful, it will be implemented into general scheduling algorithms. It is also hoped that this research will give some preliminary answers about how to include human-like learning in scheduling algorithms, and may therefore be of interest to researchers and practitioners in the areas of scheduling and evolutionary computation. References 1. Aickelin, U. and Dowsland, K. (2003) 'Indirect Genetic Algorithm for a Nurse Scheduling Problem', Computers & Operations Research (in print). 2. Li, J. and Kwan, R.S.K. (2003) 'Fuzzy Genetic Algorithm for Driver Scheduling', European Journal of Operational Research 147(2): 334-344. 3. Pelikan, M., Goldberg, D. and Cantu-Paz, E. (1999) 'BOA: The Bayesian Optimization Algorithm', IlliGAL Report No 99003, University of Illinois. 4. Wilson, S. (1994) 'ZCS: A Zeroth-level Classifier System', Evolutionary Computation 2(1): 1-18.
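A minimal sketch of the LCS-style strength update described above (hypothetical code: the names, the reward constant, and the toy fitness are assumptions):

```python
import random

def roulette_select(strengths):
    """Pick a rule index with probability proportional to its strength."""
    return random.choices(range(len(strengths)), weights=strengths)[0]

def lcs_step(strengths_per_stage, evaluate, reward=1.0):
    """One generation: build a solution stage by stage via roulette-wheel
    selection, then reinforce only the rules that were actually used."""
    used = [roulette_select(s) for s in strengths_per_stage]
    fitness = evaluate(used)
    for stage, rule in enumerate(used):
        strengths_per_stage[stage][rule] += reward * fitness
    return used, fitness

# Toy example: 3 stages, 2 rules each; the fitness favours rule 1 everywhere
strengths = [[1.0, 1.0] for _ in range(3)]
for _ in range(100):
    lcs_step(strengths, evaluate=lambda rules: sum(rules) / len(rules))
print(strengths)  # rule 1's strength should dominate at every stage
```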

Relevance: 100.00%

Abstract:

A Bayesian optimisation algorithm for a nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. When a human scheduler works, he normally builds a schedule systematically, following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not yet completed, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this paper, we design a more human-like scheduling algorithm by using a Bayesian optimisation algorithm to implement explicit learning from past solutions. A nurse scheduling problem from a UK hospital is used for testing. Unlike our previous work that used Genetic Algorithms to implement implicit learning [1], the learning in the proposed algorithm is explicit, i.e. we identify and mix building blocks directly. The Bayesian optimisation algorithm implements such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, new rule strings have been obtained. Sets of rule strings are generated in this way, some of which will replace previous strings based on fitness. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. For clarity, consider the following toy example of scheduling five nurses with two rules (1: random allocation; 2: allocate nurse to low-cost shifts). At the beginning of the search, the probability of choosing rule 1 or 2 for each nurse is equal, i.e. 50%. After a few iterations, due to selection pressure and reinforcement learning, two solution pathways emerge: because pure low-cost or pure random allocation produces low-quality solutions, either rule 1 is used for the first 2-3 nurses and rule 2 for the remainder, or vice versa. In essence, the Bayesian network learns 'use rule 2 after using rule 1 two or three times', or vice versa. It should be noted that for our problem, and for most other scheduling problems, the structure of the network model is known and all variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data, so learning can amount to 'counting' in the case of multinomial distributions. For our problem, we use four rules: Random, Cheapest Cost, Best Cover, and Balance of Cost and Cover. In more detail, the steps of our Bayesian optimisation algorithm for nurse scheduling are (a code sketch of this loop follows below):
1. Set t = 0, and generate an initial population P(0) at random;
2. Use roulette-wheel selection to choose a set of promising rule strings S(t) from P(t);
3. Compute the conditional probabilities of each node according to this set of promising solutions;
4. Assign each nurse using roulette-wheel selection based on the rules' conditional probabilities; a set of new rule strings O(t) is generated in this way;
5. Create a new population P(t+1) by replacing some rule strings in P(t) with O(t), and set t = t+1;
6. If the termination conditions are not met (we use 2000 generations), go to step 2.
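The generational loop above can be sketched compactly. This is a hypothetical illustration rather than the paper's implementation: the function and parameter names are assumptions, and it estimates independent per-nurse rule probabilities instead of a full network structure.

```python
import random
from collections import Counter

RULES = ["Random", "Cheapest Cost", "Best Cover", "Balance of Cost and Cover"]

def boa_schedule(n_nurses, evaluate, pop_size=100, elite=30, generations=2000):
    """Evolve rule strings (one rule index per nurse) by repeatedly
    re-estimating per-nurse rule probabilities from promising strings."""
    pop = [[random.randrange(len(RULES)) for _ in range(n_nurses)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate, reverse=True)
        promising = pop[:elite]                    # step 2 (truncation here, not roulette)
        probs = []
        for nurse in range(n_nurses):              # step 3: counting
            counts = Counter(s[nurse] for s in promising)
            probs.append([counts.get(r, 0) + 1 for r in range(len(RULES))])
        offspring = [[random.choices(range(len(RULES)), weights=p)[0]
                      for p in probs]              # step 4: roulette-wheel sampling
                     for _ in range(pop_size - elite)]
        pop = promising + offspring                # step 5: replacement
    return max(pop, key=evaluate)

# Toy fitness: pretend 'Best Cover' (index 2) is ideal for every nurse
best = boa_schedule(5, evaluate=lambda s: s.count(2), generations=50)
print([RULES[r] for r in best])
```

The +1 in the counts is additive smoothing, so that no rule's probability collapses to zero early in the search.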
Computational results from 52 real data instances demonstrate the success of this approach. They also suggest that the learning mechanism in the proposed approach might be suitable for other scheduling problems. Another direction for further research is to see if there is a good constructing sequence for individual data instances, given a fixed nurse scheduling order. If so, the good patterns could be recognized and then extracted as new domain knowledge. Thus, by using this extracted knowledge, we can assign specific rules to the corresponding nurses beforehand, and only schedule the remaining nurses with all available rules, making it possible to reduce the solution space. Acknowledgements: The work was funded by the UK Government's major funding agency, the Engineering and Physical Sciences Research Council (EPSRC), under grant GR/R92899/01. References [1] Aickelin, U., 'An Indirect Genetic Algorithm for Set Covering Problems', Journal of the Operational Research Society, 53(10): 1118-1126.

Relevance: 100.00%

Abstract:

The ransoming of captives was a laborious process that also required the stabilization, however temporary and fragile, of frontiers that would make contacts between the different communities possible. Ransoms arose as a consequence of the development of trade, with the exchange of prisoners or their ransom being approached as just another commercial activity, although it had to have the approval of the political and religious authorities, who very soon became involved. Naturally, people endowed with certain qualities were needed, among whom merchants stood out, since they crossed to one side of the frontier and the other with some frequency and were familiar with its problems. Later, some of them specialized in negotiating the freedom of prisoners.

Relevance: 100.00%

Abstract:

A fundamental step in understanding the effects of irradiation on metallic uranium and uranium dioxide ceramic fuels, or on any material, must start with the nature of radiation damage at the atomic level. The resulting atomic displacement damage produces a multitude of defects that influence fuel performance. Nuclear reactions are coupled, in that changing one variable will alter others through feedback. In the field of fuel performance modeling, these difficulties are addressed through the use of empirical models rather than models based on first principles. Empirical models can be used as predictive codes through careful manipulation of input variables, for the limited circumstances that are closely tied to the data used to create the model. While empirical models are efficient and give acceptable results, those results are only applicable within the range of the existing data. This narrow window prevents the modeling of changes in operating conditions, as new operating conditions would not be within the calibration data set. This work is part of a larger effort to correct this modeling deficiency. Uranium dioxide and metallic uranium fuels are analyzed with a kinetic Monte Carlo (kMC) code as part of an overall effort to create a stochastic and predictive fuel code. The kMC investigations include sensitivity analysis of point defect concentrations, thermal gradients implemented through a temperature-variation mesh grid, and migration energy values. In this work, fission damage is primarily represented through defects on the oxygen anion sublattice. Results were also compared between the various models. Past studies of kMC point defect migration have not adequately addressed non-standard migration events such as clustering and dissociation of vacancies. As such, the General Utility Lattice Program (GULP) code was utilized to generate new migration energies, so that additional non-standard events can be included in the kMC code in the future for more comprehensive studies. Defect energies were calculated to generate barrier heights for single-vacancy migration, clustering and dissociation of two vacancies, and vacancy migration under the influence of both an additional oxygen and a uranium vacancy.
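To make the kMC mechanics concrete, here is a minimal residence-time (BKL-style) sketch for a single vacancy hopping on a one-dimensional lattice. It is purely illustrative: the barrier, temperature, and attempt frequency are made-up values, not the migration energies computed in this work.

```python
import math
import random

K_B = 8.617e-5  # Boltzmann constant in eV/K

def kmc_vacancy_walk(n_steps, barrier_ev=0.5, temperature_k=1000.0, nu=1e13):
    """Residence-time kMC for one vacancy on a 1-D lattice.

    Each event is a left or right hop with an Arrhenius rate; the clock
    advances by an exponentially distributed waiting time per event.
    """
    rate = nu * math.exp(-barrier_ev / (K_B * temperature_k))  # per-hop rate
    position, time = 0, 0.0
    for _ in range(n_steps):
        total_rate = 2.0 * rate                  # left hop + right hop
        position += random.choice((-1, 1))       # both hops equally likely here
        time += -math.log(1.0 - random.random()) / total_rate
    return position, time

pos, t = kmc_vacancy_walk(10_000)
print(f"net displacement: {pos} sites in {t:.3e} s")
```

Clustering and dissociation events would enter this picture as additional event types in the rate catalogue, each with its own GULP-derived barrier.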

Relevance: 100.00%

Abstract:

As introduced by Bentley et al. (2005), artificial immune systems (AIS) lack tissue, which is present in one form or another in all living multi-cellular organisms. Some have argued that this concept, in the context of AIS, brings little novelty to the already saturated field of immune-inspired computational research. This article aims to show that such a component of an AIS has the potential to bring an advantage to a data processing algorithm in terms of data pre-processing, clustering and the extraction of features desired by the immune-inspired system. The proposed tissue algorithm is based on self-organizing networks, such as the self-organizing maps (SOM) developed by Kohonen (1996), and an analogy of the so-called Toll-Like Receptors (TLR) affecting the activation function of the clusters developed by the SOM.
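A rough sketch of the idea, as one might prototype it (entirely hypothetical code: the update rule is a standard SOM step and the 'TLR' gate is modelled as a simple per-unit activation threshold, not the authors' algorithm):

```python
import numpy as np

def som_tissue_step(weights, x, lr=0.1, tlr_threshold=0.5):
    """One SOM update with a TLR-like activation gate.

    weights: (n_units, dim) array of cluster prototypes; x: one sample.
    The best-matching unit is moved toward x; the gate 'activates' the
    unit for downstream immune processing only if the match is close.
    """
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = int(np.argmin(dists))              # best-matching unit
    weights[bmu] += lr * (x - weights[bmu])  # standard SOM update
    return bmu, bool(dists[bmu] < tlr_threshold)

rng = np.random.default_rng(0)
weights = rng.random((4, 2))
for x in rng.random((200, 2)):
    bmu, activated = som_tissue_step(weights, x)
print(weights)  # prototypes spread across the unit square
```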

Relevance: 100.00%

Abstract:

The recent advent of new technologies has led to huge amounts of genomic data. With these data come new opportunities to understand the biological cellular processes underlying hidden regulation mechanisms and to identify disease-related biomarkers for informative diagnostics. However, extracting biological insights from the immense amounts of genomic data is a challenging task. Therefore, effective and efficient computational techniques are needed to analyze and interpret genomic data. In this thesis, novel computational methods are proposed to address these challenges: a Bayesian mixture model, an extended Bayesian mixture model, and an Eigen-brain approach. The Bayesian mixture framework integrates the Bayesian network with the Gaussian mixture model. Based on the proposed framework, in conjunction with K-means clustering and principal component analysis (PCA), biological insights are derived, such as context-specific/dependent relationships and nested structures within microarray data where biological replicates are encapsulated. The Bayesian mixture framework is then extended to explore posterior distributions of the network space by incorporating a Markov chain Monte Carlo (MCMC) model. The extended Bayesian mixture model summarizes the sampled network structures by extracting biologically meaningful features. Finally, an Eigen-brain approach is proposed to analyze in situ hybridization data for the identification of cell-type-specific genes, which can be useful for informative blood diagnostics. Computational results with region-based clustering reveal critical evidence of consistency with brain anatomical structure.
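As a small illustration of the kind of pipeline the mixture framework builds on, here is a generic scikit-learn sketch (not the thesis code; the data and parameters are made up):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Made-up expression matrix: 100 samples x 50 genes, two latent groups
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 50)), rng.normal(3, 1, (50, 50))])

# PCA for a low-dimensional view, K-means for a hard partition, then a
# Gaussian mixture for soft, probabilistic cluster responsibilities
Z = PCA(n_components=5).fit_transform(X)
hard = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
gmm = GaussianMixture(n_components=2, random_state=0).fit(Z)
soft = gmm.predict_proba(Z)  # posterior cluster responsibilities

print(hard[:5], soft[:5].round(2))
```

The thesis couples such a mixture with a Bayesian network over the variables; the sketch shows only the generic GMM/K-means/PCA layer.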
