967 results for Complex problems


Relevance:

30.00%

Publisher:

Abstract:

Competitive learning is an important machine learning approach which is widely employed in artificial neural networks. In this paper, we present a rigorous definition of a new type of competitive learning scheme realized on large-scale networks. The model consists of several particles walking within the network and competing with each other to occupy as many nodes as possible, while attempting to reject intruder particles. Each particle's walking rule is a stochastic combination of random and preferential movements. The model has been applied to solve community detection and data clustering problems. Computer simulations reveal that the proposed technique achieves high precision in community and cluster detection, as well as low computational complexity. Moreover, we have developed an efficient method for estimating the most likely number of clusters, using an evaluator index that monitors the information generated by the competition process itself. We hope this paper will provide an alternative approach to the study of competitive learning.
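A minimal sketch of the walking rule described above, in Python with NetworkX. The mixing parameter `lam`, the per-node dominance table, and the reinforcement rule are illustrative assumptions, not the authors' exact specification; the rejection of intruder particles is reduced here to dominance counting.

```python
import random
import networkx as nx

def particle_competition(G, n_particles=2, lam=0.6, steps=10000, seed=0):
    """Toy particle-competition walk: each node keeps a dominance count per
    particle; each movement is a stochastic mix of preferential and random steps."""
    rng = random.Random(seed)
    nodes = list(G.nodes)
    # dominance[v][p] = how strongly particle p holds node v (uniform at start)
    dominance = {v: [1.0] * n_particles for v in nodes}
    position = [rng.choice(nodes) for _ in range(n_particles)]

    for _ in range(steps):
        for p in range(n_particles):
            nbrs = list(G.neighbors(position[p]))
            if not nbrs:
                continue
            if rng.random() < lam:
                # preferential move: favour neighbours this particle already dominates
                weights = [dominance[v][p] / sum(dominance[v]) for v in nbrs]
                nxt = rng.choices(nbrs, weights=weights)[0]
            else:
                # random (exploratory) move
                nxt = rng.choice(nbrs)
            dominance[nxt][p] += 1.0  # reinforce occupation of the visited node
            position[p] = nxt

    # label each node with the particle that dominates it most
    return {v: max(range(n_particles), key=lambda p: dominance[v][p]) for v in nodes}

labels = particle_competition(nx.karate_club_graph())
```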

Relevance:

30.00%

Publisher:

Abstract:

Cutting and packing problems arise in a variety of industries, including the garment, wood and shipbuilding industries. Irregular shape packing is a special case which admits irregular items and is much more complex due to the geometry of the items. To ensure that items do not overlap and that no item in the layout protrudes from the container, the collision-free region concept was adopted. It represents all possible translations for a new item to be inserted into a container with already placed items. To construct a feasible layout, the collision-free region for each item is determined through a sequence of Boolean operations over polygons. To improve the speed of the algorithm, a parallel version of the layout construction was proposed and applied within a simulated annealing algorithm used to solve bin packing problems. Tests were performed to determine the speed improvement of the parallel version over the serial algorithm.
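The paper's collision-free region is built for irregular polygons via Boolean operations; as a simplified illustration of the same idea, the sketch below computes the feasible region for the reference point of a circular item with Shapely, which avoids the no-fit-polygon machinery needed for irregular shapes. The container and item geometries are hypothetical.

```python
from shapely.geometry import Polygon, Point
from shapely.ops import unary_union

def collision_free_region(container, placed_items, radius):
    """Feasible positions for the centre of a new disc of given radius:
    shrink the container inwards, then subtract each placed item grown
    outwards by the disc radius (all via polygon Boolean operations)."""
    inner = container.buffer(-radius)               # keep the disc inside the container
    blocked = unary_union([p.buffer(radius) for p in placed_items])
    return inner.difference(blocked)                # region where no overlap can occur

container = Polygon([(0, 0), (10, 0), (10, 6), (0, 6)])
placed = [Point(3, 3).buffer(1.0), Point(7, 2).buffer(0.8)]
region = collision_free_region(container, placed, radius=0.5)
print(region.area)  # > 0 means a feasible insertion point exists
```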

Relevance:

30.00%

Publisher:

Abstract:

The thesis consists of three independent parts.

Part I: Polynomial amoebas. We study the amoeba of a polynomial, as defined by Gelfand, Kapranov and Zelevinsky. A central role in the treatment is played by a certain convex function which is linear in each complement component of the amoeba, which we call the Ronkin function. This function is used in two different ways. First, we use it to construct a polyhedral complex, which we call a spine, approximating the amoeba. Second, the Monge-Ampère measure of the Ronkin function has interesting properties which we explore. This measure can be used to derive an upper bound on the area of an amoeba in two dimensions. We also obtain results on the number of complement components of an amoeba, and consider possible extensions of the theory to varieties of codimension higher than 1.

Part II: Differential equations in the complex plane. We consider polynomials in one complex variable arising as eigenfunctions of certain differential operators, and obtain results on the distribution of their zeros. We show that in the limit when the degree of the polynomial approaches infinity, its zeros are distributed according to a certain probability measure. This measure has its support on the union of finitely many curve segments, and can be characterized by a simple condition on its Cauchy transform.

Part III: Radon transforms and tomography. This part is concerned with different weighted Radon transforms in two dimensions, in particular the problem of inverting such transforms. We obtain stability results for this inverse problem for rather general classes of weights, including weights of attenuation type with data acquisition limited to a 180-degree range of angles. We also derive an inversion formula for the exponential Radon transform, with the same restriction on the angle.
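For reference, the standard definitions underlying Part I (following Gelfand, Kapranov and Zelevinsky and the amoeba literature): the amoeba is the image of the zero set of f under the coordinatewise log-modulus map, and the Ronkin function averages log|f| over the torus above each point.

```latex
% Amoeba of a Laurent polynomial f in n complex variables:
\mathcal{A}_f = \operatorname{Log}\bigl(f^{-1}(0)\bigr), \qquad
\operatorname{Log}(z_1,\dots,z_n) = (\log|z_1|,\dots,\log|z_n|).
% Ronkin function: convex on R^n and affine-linear on each
% complement component of the amoeba:
N_f(x) = \frac{1}{(2\pi i)^n}\int_{\operatorname{Log}^{-1}(x)}
  \log\lvert f(z)\rvert\,\frac{dz_1}{z_1}\wedge\cdots\wedge\frac{dz_n}{z_n}.
```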

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the groups of individuals. Even though methods to analyse the data are now well developed and close to reaching a standard organization (through the effort of proposed international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method exists to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs.

In Chapter 1, starting from a necessary biological introduction, we review microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps that are used in the data analysis in the rest of the dissertation. In Chapter 2, a critical review of standard analysis methods is provided, stressing most of the open problems.

In Chapter 3, a method is introduced to address the issue of unbalanced design of microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed across the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets are acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data.

In Chapter 4, a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4] is described. Looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role: in some cases similarities can give useful and, sometimes, even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second one. In this work we show that the Relevance Vector Machine (RVM) [4] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class G2 as a test set, obtaining for each G2 sample the probability of being a member of class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to that of breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
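A schematic sketch of the MultiSAM resampling loop described above, with a hypothetical `run_sam` callback standing in for a single SAM analysis; the data layout and the score threshold noted in the comment are illustrative, not the dissertation's exact implementation.

```python
import numpy as np

def multi_sam(expr_lpc, expr_mpc, run_sam, n_iter=1000, seed=0):
    """expr_lpc: genes x samples matrix of the less populated class (LPC);
    expr_mpc: genes x samples matrix of the more populated class (MPC);
    run_sam(a, b) -> indices of probes called differentially expressed.
    Returns a per-probe score in [0, n_iter]: recurrence across resamplings."""
    rng = np.random.default_rng(seed)
    n_genes, n_lpc = expr_lpc.shape
    score = np.zeros(n_genes, dtype=int)
    for _ in range(n_iter):
        # draw a random MPC subsample of the same size as the LPC
        cols = rng.choice(expr_mpc.shape[1], size=n_lpc, replace=False)
        hits = run_sam(expr_lpc, expr_mpc[:, cols])
        score[hits] += 1  # count each recurrence as differentially expressed
    return score

# probes with score > 300 were retained in the study described above
```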

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed-parameter systems into a model hierarchy, and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
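For context, a finite-dimensional input-state-output port-Hamiltonian system takes the following standard form (the thesis' distributed-parameter extensions are not reproduced here):

```latex
% Input-state-output port-Hamiltonian system with Hamiltonian (energy) H(x):
\dot{x} = \bigl(J(x) - R(x)\bigr)\nabla H(x) + g(x)\,u, \qquad
y = g(x)^{\top}\nabla H(x),
% where J(x) = -J(x)^{\top} encodes the interconnection structure and
% R(x) = R(x)^{\top} \succeq 0 the dissipation, giving the power balance
% \dot{H} = -\nabla H^{\top} R\,\nabla H + y^{\top} u \le y^{\top} u.
```

Power-preserving interconnection of two such systems yields again a port-Hamiltonian system, which is what makes the framework suitable for composing model hierarchies.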

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we take the first steps towards the systematic application of a methodology for automatically building formal models of complex biological systems. Such a methodology could also be useful for designing artificial systems possessing desirable properties such as robustness and evolvability. The approach we follow in this thesis is to manipulate formal models by means of adaptive search methods called metaheuristics. In the first part of the thesis we develop state-of-the-art hybrid metaheuristic algorithms to tackle two important problems in genomics, namely Haplotype Inference by parsimony and the Founder Sequence Reconstruction Problem. We compare our algorithms with other effective techniques in the literature, we show the strengths and limitations of our approaches on various problem formulations and, finally, we propose further enhancements that could improve the performance of our algorithms and widen their applicability. In the second part, we concentrate on Boolean network (BN) models of gene regulatory networks (GRNs). We detail our automatic design methodology and apply it to four use cases which correspond to different design criteria and address some limitations of GRN modelling by BNs. Finally, we tackle the Density Classification Problem with the aim of showing the learning capabilities of BNs. Experimental evaluation of this methodology shows its efficacy in producing networks that meet our design criteria. Our results, consistently with what has been found in other works, also suggest that networks manipulated by a search process exhibit a mixture of characteristics typical of different dynamical regimes.
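A minimal synchronous Boolean network of the kind used to model GRNs; the random NK-style topology and truth tables below are illustrative stand-ins for the networks evolved in the thesis.

```python
import random

def random_bn(n, k, seed=0):
    """Random Boolean network: each of n nodes reads k inputs through a
    random truth table (a classical NK-style construction)."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node applies its truth table at once."""
    nxt = []
    for node, ins in enumerate(inputs):
        idx = 0
        for b in ins:
            idx = (idx << 1) | state[b]   # encode input bits as a table index
        nxt.append(tables[node][idx])
    return nxt

inputs, tables = random_bn(n=10, k=2)
state = [random.Random(1).randint(0, 1) for _ in range(10)]
for _ in range(5):
    state = step(state, inputs, tables)
print(state)
```

An automatic design loop of the kind described above would score trajectories of such a network against a design criterion and let a metaheuristic mutate `inputs` and `tables`.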

Relevance:

30.00%

Publisher:

Abstract:

The study aims at providing a framework conceptualizing patenting activities under the condition of intellectual property rights fragmentation. Such a framework has to deal with the interrelated problems of technological complexity in the modern patent landscape. In that respect, ex-post licensing agreements have been incorporated into the analysis. More precisely, by consolidating the rights to use the patents required for the commercialization of a product, private market solutions such as cross-licensing agreements and patent pools help firms to overcome the problems triggered by intellectual property rights fragmentation. At the same time, private bargaining between the parties cannot be isolated from the legal framework. A result of this analysis is that policies ignoring market solutions and focusing only on static gains can undermine the dynamic efficiency gains induced by the patent system. The evidence found in this thesis supports the view that legal reforms that aim to decrease the degree of patent protection, or to lift it altogether, can hamper the functioning of the current system.

Relevance:

30.00%

Publisher:

Abstract:

Most problems in modern structural design can be described with a set of equations; the solutions of these mathematical models can provide the engineer and designer with information during the design stage. The same holds true for physical chemistry, the branch of chemistry that uses mathematics and physics to explain real chemical phenomena. In this work two very different chemical processes are studied: the dynamics of an artificial molecular motor, and the generation and propagation of nervous signals between excitable cells and tissues such as neurons and axons. These two processes, in spite of their chemical and physical differences, can both be described successfully by partial differential equations: the Fokker-Planck equation and the Hodgkin-Huxley model, respectively. With the aid of advanced engineering software, these two processes have been modelled and simulated in order to extract physical information about them and to predict properties that may, in the future, be extremely useful during the design stage both of molecular motors and of devices whose action relies on nervous communication between active fibres.
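For reference, the two models named above in their standard forms; the coefficients and gating notation are the conventional ones, not specific to this work:

```latex
% 1-D Fokker-Planck equation for the probability density p(x,t)
% with drift A(x,t) and diffusion B(x,t):
\frac{\partial p}{\partial t}
  = -\frac{\partial}{\partial x}\bigl[A(x,t)\,p\bigr]
    + \frac{1}{2}\frac{\partial^{2}}{\partial x^{2}}\bigl[B(x,t)\,p\bigr].

% Hodgkin-Huxley membrane equation, with gating variables m, h, n:
C_m \frac{dV}{dt}
  = -\bar{g}_{\mathrm{Na}}\,m^{3}h\,(V - E_{\mathrm{Na}})
    - \bar{g}_{\mathrm{K}}\,n^{4}\,(V - E_{\mathrm{K}})
    - g_L\,(V - E_L) + I_{\mathrm{ext}},
\qquad
\frac{dm}{dt} = \alpha_m(V)(1 - m) - \beta_m(V)\,m
\quad (\text{similarly for } h, n).
```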

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to present various aspects of the numerical simulation of particle and radiation transport for industrial and environmental protection applications, enabling the analysis of complex physical processes in a fast, reliable, and efficient way. In the first part we deal with speeding up the numerical simulation of neutron transport for nuclear reactor core analysis. The convergence of the source iteration scheme of the Method of Characteristics, applied to heterogeneous structured geometries, has been enhanced by means of Boundary Projection Acceleration, enabling the study of 2D and 3D geometries with transport theory without spatial homogenization. The computational performance has been verified with the C5G7 2D and 3D benchmarks, showing a considerable reduction in iterations and CPU time. The second part is devoted to the study of temperature-dependent elastic scattering of neutrons for heavy isotopes near the thermal region. A numerical computation of the Doppler convolution of the elastic scattering kernel based on the gas model is presented, for a general energy-dependent cross section and scattering law in the center-of-mass system. The range of integration has been optimized by employing a numerical cutoff, allowing a faster numerical evaluation of the convolution integral. Legendre moments of the transfer kernel are subsequently obtained by direct quadrature, and a numerical analysis of the convergence is presented. In the third part we focus our attention on remote sensing applications of radiative transfer employed to investigate the Earth's cryosphere. The photon transport equation is applied to simulate the reflectivity of glaciers while varying the age of the layer of snow or ice, its thickness, the presence or absence of other underlying layers, and the amount of dust included in the snow, creating a framework able to decipher spectral signals collected by orbiting detectors.
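The source iteration scheme whose convergence is accelerated can be written, in its standard one-group form with isotropic scattering, as:

```latex
% One-group source iteration with isotropic scattering (k = iteration index):
\boldsymbol{\Omega}\cdot\nabla\psi^{(k+1)} + \Sigma_t\,\psi^{(k+1)}
  = \frac{\Sigma_s}{4\pi}\,\phi^{(k)} + q,
\qquad
\phi^{(k+1)}(\mathbf{r}) = \int_{4\pi}\psi^{(k+1)}(\mathbf{r},\boldsymbol{\Omega})\,d\Omega.
% The spectral radius of this iteration approaches the scattering ratio
% c = \Sigma_s / \Sigma_t, hence the need for acceleration schemes
% (here, Boundary Projection Acceleration) as c -> 1.
```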

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Despite advances in surgical and interventional techniques, the optimal surgical treatment of severe aortic (re)coarctation and hypoplastic aortic arch is still controversial. Anatomic repair may require extensive dissection, cardiopulmonary bypass, and deep hypothermic circulatory arrest, with their inherent risks. The aim of this study was to analyze the outcome of off-pump extraanatomic aortic bypass as a surgical alternative to local repair. METHODS: From February 2000 to December 2005, ten consecutive patients (median age 20 years; range, 11 to 38 years) with severe aortic (re)coarctation (n = 4) and/or hypoplastic aortic arch (n = 7) underwent off-pump extraanatomic aortic bypass through a median sternotomy. All but three patients had undergone previous surgery for coarctation and angioplasty or stenting. Three patients underwent concomitant replacement of the ascending aorta because of an aneurysm, using cardiopulmonary bypass. RESULTS: The postoperative hospital course was uneventful in all patients. There was no perioperative mortality or significant morbidity. During a mean follow-up of 48 +/- 22 months, no patient required additional procedures. All patients were free of symptoms; no patient showed signs of heart failure at follow-up. At the last follow-up, no patient presented with claudication, nor did any patient experience orthostatic problems due to a steal phenomenon. During follow-up, hypertension resolved in all but two patients, who had mild residual hypertension. CONCLUSIONS: Off-pump extraanatomic aortic bypass is an attractive treatment option for complex aortic (re)coarctation and hypoplastic aortic arch. Perioperative risks are minimized, hypertension is influenced favorably, and midterm survival is event-free.

Relevance:

30.00%

Publisher:

Abstract:

In autumn 2007 the Swiss Medical School of Berne (Switzerland) implemented mandatory short-term clerkships in primary health care for all undergraduate medical students. Students studying for a Bachelor's degree complete 8 half-days per year in the office of a general practitioner, while students studying for a Master's degree complete a three-week clerkship. Every student completes these clerkships in the same GP office over the four years of study. The purpose of this paper is to show how the goals and learning objectives were developed and evaluated. Method: A working group of general practitioners and faculty had the task of defining goals and learning objectives for a specific training program within the complex context of primary health care. The group based its work on various national and international publications. An evaluation of the program, a list of minimum requirements for the clerkships, an oral exam in the first year and an OSCE assignment in the third year assessed the achievement of the learning objectives. Results: The findings present the goals and principal learning objectives for these clerkships, the results of the evaluation and the achievement of the minimum requirements. Most of the defined learning objectives were taught and duly learned by students. Some learning objectives proved to be incompatible with the context of ambulatory primary care and had to be adjusted accordingly. Discussion: The learning objectives were evaluated and adapted to address students' and teachers' needs and the requirements of the medical school. The achievement of minimum requirements (and hence of the learning objectives) for clerkships has been mandatory since 2008. Further evaluations will show whether additional learning objectives need to be adopted.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Limited range of finger motion is a frequent complication after plate fixation of phalangeal fractures. The purpose of this study was to evaluate the results of plate fixation of extra-articular fractures of the proximal phalanx using current low-profile mini-fragment systems. METHODS From 2006 to 2012, 32 patients with 36 extra-articular fractures of the proximal phalanx of the triphalangeal fingers were treated with open reduction and plate fixation (ORPF) using 1.2 and 1.5 mm mini-fragment systems. Patients presenting with grade 2 or 3 open fractures or relevant laceration of adjacent structures were excluded from the study. We retrospectively evaluated the rate of malunion or nonunion after ORPF, and the need for revision surgery, plate removal, and tenolysis. Data were analyzed for further complications with regard to infections or complex regional pain syndrome (CRPS). RESULTS No infections were noted. Five patients developed transient symptoms of CRPS. Six weeks postoperatively, total active finger motion (TAM) averaged 183°, and all 32 patients underwent formal hand therapy. At the latest follow-up or at the time of plate removal, respectively, the mean TAM had improved to 213°. An extension lag of the proximal interphalangeal joint was found in 67% of all fractured fingers. Secondary surgery was necessary in 14 of 32 patients (2 corrective osteotomies and 12 plate removals, including 7 procedures performed explicitly because of reduced mobility). CONCLUSIONS Despite new implant designs, significant problems persist. Adhesions of the extensor tendons leading to a limited range of finger motion are still the most frequent complication after ORPF of proximal phalangeal fractures, even in the absence of significant soft-tissue damage. LEVEL OF EVIDENCE Therapeutic, Retrospective, Level IV.

Relevance:

30.00%

Publisher:

Abstract:

In recent decades, affine algebraic varieties and Stein manifolds with big (infinite-dimensional) automorphism groups have been intensively studied. Several notions expressing that the automorphism group is big have been proposed. All of them imply that the manifold in question is an Oka–Forstnerič manifold. This important notion has also recently emerged from the intensive studies around the homotopy principle in Complex Analysis. This homotopy principle, which goes back to the 1930s, has had an enormous impact on the development of the area of Several Complex Variables, and the number of its applications is constantly growing. In this overview chapter we present three classes of properties: (1) the density property, (2) flexibility, and (3) the Oka–Forstnerič property. For each class we give the relevant definitions and its most significant features, and explain the known implications between all these properties. Many difficult mathematical problems can be solved by applying the developed theory; we indicate some of the most spectacular ones.
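As a pointer to the first of the three classes, the density property (in Varolin's sense) can be stated as follows:

```latex
% Density property (Varolin): a complex manifold X has the density property if
% the Lie algebra generated by the complete holomorphic vector fields on X is
% dense, in the compact-open topology, in the Lie algebra of all holomorphic
% vector fields on X:
\overline{\operatorname{Lie}\bigl\{\,V \in \mathrm{VF}_{\mathrm{hol}}(X) : V \text{ complete}\,\bigr\}}
  \;=\; \mathrm{VF}_{\mathrm{hol}}(X).
```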

Relevance:

30.00%

Publisher:

Abstract:

The digestive tract is colonized from birth by a bacterial population, called the microbiota, which influences the development of the immune system. Modifications in its composition are associated with problems such as obesity or inflammatory bowel disease. Antibiotics are known to influence the intestinal microbiota, but other environmental factors, such as cigarette smoking, also seem to have an impact on its composition. This influence might partly explain the weight gain observed after smoking cessation. Indeed, there is a modification of the gut microbiota, which becomes similar to that of obese people, with a microbial profile that is more efficient at extracting calories from ingested food. These new findings open new fields of diagnostic and therapeutic approaches through the regulation of the microbiota.