894 results for "many-to-many assignment problem"


Relevance: 100.00%

Abstract:

Graduate Program in Mechanical Engineering - FEIS

Relevance: 100.00%

Abstract:

Urban space is occupied in an extremely disordered way. Political, economic, and social factors converge to create a problem that mostly affects poor people, who are compelled to occupy areas where social and environmental problems are likely to emerge. Floods, landslides, and various forms of pollution hit the most vulnerable groups hardest. In many cases, actions taken by the government legitimize this structure and reinforce its reproduction. This work seeks to confirm both that poor people are vulnerable to the social and environmental problems resulting from inappropriate urban solid waste disposal and that the local government contributes to this situation. It is assumed that the issue is related, first, to the government's disregard for poor people who live in unhealthy places and, second, to the inability of such people to demand better living conditions. In this study, a waste disposal area in the Jardim Graminha neighborhood of Leme (São Paulo) was selected for analysis by means of systematic observation. The study clearly shows that the poor people living near this area, where different types of garbage are dumped, are vulnerable and that the government does not control the situation. It also points out that governmental intervention and the use of political and technical tools are necessary for planning and managing the area, in order to mitigate these problems and decrease poor people's social and environmental vulnerability.

Relevance: 100.00%

Abstract:

Graduate Program in History - FCHS

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

The recent likely extinction of the baiji (Chinese river dolphin [Lipotes vexillifer]) (Turvey et al. 2007) makes the vaquita (Gulf of California porpoise [Phocoena sinus]) the most endangered cetacean. The vaquita has the smallest range of any porpoise, dolphin, or whale and, like the baiji, has long been threatened primarily by accidental deaths in fishing gear (bycatch) (Rojas-Bracho et al. 2006). Despite repeated recommendations from scientific bodies and conservation organizations, no effective actions have been taken to remove nets from the vaquita’s environment. Here, we address three questions that are important to vaquita conservation: (1) How many vaquitas remain? (2) How much time is left to find a solution to the bycatch problem? and (3) Are further abundance surveys or bycatch estimates needed to justify the immediate removal of all entangling nets from the range of the vaquita? Our answers are, in short: (1) there are about 150 vaquitas left, (2) there are at most 2 years within which to find a solution, and (3) further abundance surveys or bycatch estimates are not needed. The answers to the first two questions make clear that action is needed now, whereas the answer to the last question removes the excuse of uncertainty as a delay tactic. Herein we explain our reasoning.

Relevance: 100.00%

Abstract:

Software product line (SPL) engineering offers several advantages in the development of families of software products, such as reduced costs, high quality, and a short time to market. A software product line is a set of software-intensive systems, each of which shares a common core set of functionalities but also differs from the other products through customization tailored to fit the needs of individual groups of customers. The differences between products within the family are well understood and organized into a feature model that represents the variability of the SPL. Products can then be built by generating and composing features described in the feature model. Testing of software product lines has become a bottleneck in the SPL development lifecycle, since many of the techniques used in their testing have been borrowed from traditional software testing and do not directly take advantage of the similarities between products. This limits the overall gains that can be achieved in SPL engineering. Recent work proposed by both industry and the research community for improving SPL testing has begun to consider this problem, but there is still a need for better testing techniques that are tailored to SPL development. In this thesis, I make two primary contributions to software product line testing. First, I propose a new definition for testability of SPLs that is based on the ability to re-use test cases between products without a loss of fault detection effectiveness. I build on this idea to identify elements of the feature model that contribute positively and/or negatively towards SPL testability. Second, I provide a graph-based testing approach called the FIG Basis Path method that selects products and features for testing based on a feature dependency graph. This method should increase our ability to re-use test case results across successive products in the family and reduce testing effort. I report the results of a case study involving several non-trivial SPLs and show that for these objects, the FIG Basis Path method is as effective as testing all products, but requires us to test no more than 24% of the products in the SPL.
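The abstract does not spell out the FIG Basis Path algorithm itself, but the core idea of covering a feature dependency graph with a small set of paths, each standing for one product configuration to test, can be sketched roughly as follows. The graph, feature names, and the greedy edge-coverage heuristic are all illustrative assumptions, not the author's exact method:

```python
# Hedged sketch: basis-path-style product selection over a feature
# dependency graph. This is NOT the thesis's exact FIG Basis Path
# algorithm; graph, feature names and heuristic are illustrative.

# Hypothetical feature dependency graph: edge A -> B means feature B
# depends on feature A. "root" stands for the SPL's common core.
deps = {
    "root": ["gui", "net"],
    "gui": ["themes"],
    "net": ["ssl", "ipv6"],
    "ssl": [],
    "ipv6": [],
    "themes": [],
}

def basis_paths(graph, root):
    """Enumerate root-to-leaf paths until every dependency edge is
    covered; each kept path is read as one product configuration."""
    uncovered = {(u, v) for u, vs in graph.items() for v in vs}
    paths = []
    def dfs(node, path):
        children = graph.get(node, [])
        if not children:
            # Keep the path only if it covers at least one new edge.
            edges = set(zip(path, path[1:]))
            if edges & uncovered:
                uncovered.difference_update(edges)
                paths.append(path)
            return
        for child in children:
            dfs(child, path + [child])
    dfs(root, [root])
    return paths

for p in basis_paths(deps, "root"):
    print("test product with features:", p[1:])
```

On this toy graph, three paths cover all five dependency edges, which mirrors the abstract's point: a path-based selection can exercise every feature dependency while testing only a fraction of the possible products.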

Relevance: 100.00%

Abstract:

The Carr-Purcell-Meiboom-Gill (CPMG) pulse sequence has been used in many applications of magnetic resonance imaging (MRI) and low-resolution NMR (LRNMR) spectroscopy. Recently, CPMG was used in online LRNMR measurements that use long RF pulse trains, causing an increase in probe temperature and, therefore, tuning and matching maladjustments. To minimize this problem, the use of a low-power CPMG sequence based on low refocusing pulse flip angles (LRFA) was studied experimentally and theoretically. This approach has been used in several MRI protocols to reduce the incident RF power and meet the specific absorption rate. The results for CPMG with LRFA of 3π/4 (CPMG135), π/2 (CPMG90) and π/4 (CPMG45) were compared with conventional CPMG with refocusing π pulses. For a homogeneous field, with linewidth Δν = 15 Hz, the refocusing flip angles can be as low as π/4 while keeping the error in the measured transverse relaxation time (T2) below 5%. For a less homogeneous magnetic field, Δν = 100 Hz, the choice of the LRFA has to take into account the reduction in the intensity of the CPMG signal and the increase in the time constant of the CPMG decay, which also becomes dependent on the longitudinal relaxation time (T1). We compared the T2 values measured by conventional CPMG and CPMG90 for 30 oilseed species, and a good correlation coefficient, r = 0.98, was obtained. Therefore, for oilseeds, T2 measurements performed with π/2 refocusing pulses (CPMG90), with the same pulse width as conventional CPMG, use only 25% of the RF power. This reduces the heating problem in the probe and the power deposition in the samples.
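The 25% figure follows from standard pulse arithmetic: for a fixed pulse width the flip angle is proportional to the B1 amplitude, and RF power scales with the amplitude squared, so a π/2 refocusing pulse needs (90°/180°)² = 25% of the power of a π pulse. A quick sanity check of this relationship (the function name is ours, not from the paper):

```python
# Back-of-the-envelope check of the 25% figure, assuming that for a
# fixed pulse width the flip angle scales linearly with B1 amplitude
# and RF power scales with the amplitude squared.
import math

def relative_power(flip_angle, reference=math.pi):
    """Power of a refocusing pulse relative to a pi pulse of equal width."""
    return (flip_angle / reference) ** 2

for name, theta in [("CPMG135", 3 * math.pi / 4),
                    ("CPMG90", math.pi / 2),
                    ("CPMG45", math.pi / 4)]:
    print(f"{name}: {relative_power(theta):.0%} of conventional CPMG power")
# CPMG90 -> 25%, matching the abstract's figure.
```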

Relevance: 100.00%

Abstract:

Cold shock proteins (CSPs) are nucleic acid-binding chaperones, first described as being induced to solve the problem of mRNA stabilization after a temperature downshift. Caulobacter crescentus has four CSPs: CspA and CspB, which are cold induced, and CspC and CspD, which are induced only in stationary phase. In this work we determined that the synthesis of both CspA and CspB reaches its maximum levels early in the acclimation phase. The deletion of cspA causes a decrease in growth at low temperature, whereas the strain with a deletion of cspB has a very subtle and transient cold-related growth phenotype. The cspA cspB double mutant has a slightly more severe phenotype than the cspA mutant, suggesting that although CspA may be more important for cold adaptation than CspB, both proteins have a role in this process. Gene expression analyses were carried out using cspA and cspB regulatory fusions to the lacZ reporter gene and showed that both genes are regulated at the transcriptional and posttranscriptional levels. Deletion mapping of the long 5'-untranslated region (5'-UTR) of each gene identified a common region important for cold induction, probably via translation enhancement. In contrast to what has been reported for other bacteria, these cold shock genes have no regulatory regions downstream of the ATG that are important for cold induction. This work shows that the importance of CspA and CspB to C. crescentus cold adaptation, their mechanisms of regulation, and their pattern of expression during the acclimation phase apparently differ in many aspects from what has been described so far for other bacteria.

Relevance: 100.00%

Abstract:

[EN] The seminal work of Horn and Schunck [8] is the first variational method for optical flow estimation. It introduced a novel framework in which the optical flow is computed as the solution of a minimization problem. From the assumption that pixel intensities do not change over time, the optical flow constraint equation is derived; this equation relates the optical flow to the derivatives of the image. Infinitely many vector fields satisfy the optical flow constraint, so the problem is ill-posed. To overcome this, Horn and Schunck introduced an additional regularity condition that restricts the possible solutions. Their method minimizes both the optical flow constraint and the magnitude of the variations of the flow field, producing smooth vector fields. One limitation of this method is that, typically, it can only estimate small motions. In the presence of large displacements, the method fails when the gradient of the image is not smooth enough. In this work, we describe an implementation of the original Horn and Schunck method and also introduce a multi-scale strategy in order to deal with larger displacements. For this multi-scale strategy, we create a pyramidal structure of downsampled images and replace the optical flow constraint equation with a nonlinear formulation. To tackle this nonlinear formula, we linearize it and solve the method iteratively at each scale. Here there are two common approaches: one computes the motion increment in the iterations, while the other, which we follow, computes the full flow during the iterations. The solutions are incrementally refined over the scales. This pyramidal structure is a standard tool in many optical flow methods.
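Concretely, brightness constancy linearizes to the optical flow constraint I_x u + I_y v + I_t = 0, and Horn and Schunck minimize the energy ∫ (I_x u + I_y v + I_t)² + α² (|∇u|² + |∇v|²) dx dy. A minimal single-scale sketch of the resulting Jacobi-style iterations follows; it is not the authors' exact implementation, the multi-scale pyramid described above is omitted, and alpha, the iteration count, and the averaging kernel are conventional but illustrative choices:

```python
# Minimal single-scale Horn-Schunck sketch (illustrative, not the
# implementation described in the abstract).
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(I1, I2, alpha=15.0, n_iters=200):
    I1, I2 = I1.astype(float), I2.astype(float)
    # Spatial derivatives of the first frame, temporal difference.
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    # Kernel computing the local average of the flow field.
    avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]], float) / 12.0
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    denom = alpha**2 + Ix**2 + Iy**2
    for _ in range(n_iters):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        # Classic Horn-Schunck update derived from the Euler-Lagrange
        # equations of the energy functional.
        t = (Ix * u_bar + Iy * v_bar + It) / denom
        u = u_bar - Ix * t
        v = v_bar - Iy * t
    return u, v
```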

Relevance: 100.00%

Abstract:

The inherent stochastic character of most physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability. The computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. What is more, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The topic of the second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars in existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage that affects only a small portion of a bar, genetic algorithms have proved to be an effective tool to solve the problem.
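A translation field maps a standard Gaussian field g(x) through its marginal CDF into a target non-Gaussian marginal, f(x) = F⁻¹(Φ(g(x))). The sketch below is the textbook one-dimensional construction, not the thesis's own algorithm (which additionally matches the target spectrum); the spectrum shape and the lognormal marginal are illustrative assumptions:

```python
# Hedged sketch of a translation field: a homogeneous Gaussian field is
# generated spectrally, then pushed through its marginal CDF into a
# target lognormal marginal. Spectrum and parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, L = 256, 100.0                      # grid points, domain length
k = np.fft.rfftfreq(n, d=L / n)        # spatial frequencies
S = np.exp(-(2 * np.pi * k) ** 2)      # assumed smooth power spectrum

# Spectral representation: random phases, amplitudes from the spectrum.
phases = rng.uniform(0, 2 * np.pi, k.size)
g = np.fft.irfft(np.sqrt(S) * np.exp(1j * phases), n)
g = (g - g.mean()) / g.std()           # approx. standard Gaussian field

# Translation: map the Gaussian marginal onto the lognormal target.
f = stats.lognorm.ppf(stats.norm.cdf(g), s=0.5)
```

Because the transformation is a monotone pointwise map of a Gaussian field, properties such as crossing rates and extreme-value distributions follow from the underlying Gaussian theory, which is what makes translation fields analytically attractive.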

Relevance: 100.00%

Abstract:

Máster Universitario en Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)

Relevance: 100.00%

Abstract:

The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergence of the peculiar structures of the individual phenotype. Being able to reproduce the system's dynamics at the different levels of such a hierarchy might be very useful for studying this complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large, dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour that results in the formation of spatial patterns. Based on these premises, the thesis reviews the different approaches already developed for modelling developmental biology problems, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities for tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation. The task is defined as an optimisation problem over the parameter space in which the objective function to be minimised is the distance between the output of the simulator and a target one. The problem is tackled with a metaheuristic algorithm. As an example of the application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The model's goal is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data with spatial and temporal resolution, acquired from freely available online sources.
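Gillespie's direct method, which the simulation engine above builds on, repeatedly draws an exponential waiting time from the total propensity a0 and then picks the firing reaction with probability a_i / a0. A minimal single-compartment sketch follows; the species, reactions and rates are illustrative assumptions, and MS-BioNET's many-species/many-channels optimisations and compartment network are omitted:

```python
# Minimal sketch of Gillespie's direct method for one compartment.
import random

# State: molecule counts; reactions: (rate, reactants, products).
state = {"A": 100, "B": 0}
reactions = [
    (0.1, {"A": 1}, {"B": 1}),   # A -> B
    (0.05, {"B": 1}, {"A": 1}),  # B -> A
]

def propensity(rate, reactants, state):
    a = rate
    for species, n in reactants.items():
        a *= state[species] ** n  # adequate for this unimolecular toy model
    return a

t, t_end = 0.0, 50.0
while t < t_end:
    a = [propensity(rate, re, state) for rate, re, _ in reactions]
    a0 = sum(a)
    if a0 == 0:
        break
    t += random.expovariate(a0)          # time to the next event
    # Choose which reaction fires, proportionally to its propensity.
    pick = random.uniform(0, a0)
    for (rate, reactants, products), ai in zip(reactions, a):
        pick -= ai
        if pick <= 0:
            for s, n in reactants.items():
                state[s] -= n
            for s, n in products.items():
                state[s] = state.get(s, 0) + n
            break
print(t, state)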

Relevance: 100.00%

Abstract:

Synthetic biology has recently undergone great development: many papers have been published and many applications presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, however, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, such as the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to solve this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that allows complex functionalities, unattainable with a single cell, to be implemented. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to designate a single process as the organizer and coordinator of a series of tasks assigned to the whole population. The election of a leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful for evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and the experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after receiving the signal that a leader is present in the colony. The most important element, in this case, is the hybrid promoter; it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction into the cellular environment of particular molecules, the inducers, which can be considered the inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is one only in the presence of both inducers. The robustness and stability of this behaviour were tested by changing the concentrations of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even though many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so their binary representation cannot capture the complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal was tested.
The desired behaviour is still similar to a logic AND, since, even in this case, the output signal is determined by the hybrid promoter activity. The experimental results demonstrated that the systems behave correctly, even if there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter, the DNA sequences of the hybrid promoters are analysed in an attempt to identify the regulatory elements that matter most for determining gene expression; given the available data, it was not possible to draw definitive conclusions. Finally, a few considerations on promoter engineering and the realization of complex circuits are presented; this section briefly recalls some of the problems outlined in the introduction and proposes a few possible solutions.
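The AND-like dose-response described above is often captured phenomenologically as a product of two Hill functions, one per inducer. The sketch below is an illustrative model with hypothetical parameters (K1, K2 and the Hill coefficients), not a fit to the thesis's characterised constructs:

```python
# Illustrative two-input AND-gate dose-response model: normalised
# promoter output as a product of Hill functions. All parameters are
# hypothetical, not fitted to the constructs described above.
def hill(x, K, n):
    return x**n / (K**n + x**n)

def promoter_output(ind1, ind2, K1=10.0, K2=5.0, n1=2.0, n2=2.0):
    """Normalised fluorescence: high only when both inducers are high."""
    return hill(ind1, K1, n1) * hill(ind2, K2, n2)

# Coarse dose-response grid: rows vary inducer 1, columns inducer 2.
for i1 in (0.1, 1, 10, 100):
    print([round(promoter_output(i1, i2), 2) for i2 in (0.1, 1, 10, 100)])
```

Because the output is a continuous surface rather than a binary truth table, a model of this kind also makes the abstract's closing point concrete: a clean digital separation only appears once the inducer concentrations are constrained to lie well below or well above the Hill thresholds.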