871 results for Indivisible objects allocation


Relevance: 20.00%

Abstract:

Background: The in vitro production (IVP) of embryos by in vitro fertilization or cloning procedures is known to cause epigenetic changes in the conceptus that are in turn associated with abnormalities in pre- and postnatal development. Handmade cloning (HMC) procedures and the culture of zona-free embryos in individual microwells (the well-of-the-well, or WOW, system) provide excellent tools for studies in developmental biology, since embryo development and cell allocation patterns can be evaluated under a wide range of embryo reconstruction arrangements and in vitro embryo culture conditions. As disturbances in embryonic cell allocation after in vitro embryo manipulations and unusual in vivo conditions during the first third of pregnancy appear to be associated with large offspring, embryo aggregation procedures may allow compensation for epigenetic defects between aggregated embryos, or may even promote more favorable cell allocation to embryonic lineages, favoring subsequent development. Thus, the aim of this study was to evaluate in vitro embryo developmental potential and the pattern of cell allocation in blastocysts developed after the aggregation of handmade cloned embryos produced using syngeneic wild type and/or transgenic somatic cells.

Materials, Methods & Results: In vitro-matured bovine cumulus-oocyte complexes (COC) were manually bisected after cumulus and zona pellucida removal; then, two enucleated hemi-oocytes were paired and fused with either a wild type (WT) or a GFP-expressing (GFP) fetal skin cell at the 11th and 19th passages, respectively. Following chemical activation, reconstructed cloned embryos and zona-free parthenote embryos were cultured in vitro in microwells, for 7 days, either individually (1 x 100%) or after the aggregation of two structures (2 x 100%) per microwell, as follows: (G1) one WT cloned embryo; (G2) two aggregated WT embryos; (G3) one GFP cloned embryo; (G4) two aggregated GFP embryos; (G5) aggregation of a WT embryo and a GFP embryo; (G6) one parthenote embryo; or (G7) two aggregated parthenote embryos. Fusion (clones), cleavage (Day 2), and blastocyst (Day 7) rates, and embryonic cell allocation, were compared by the chi-square (χ2) or Fisher exact tests. Total cell number (TCN) in blastocysts was analyzed by Student's t-test (P < 0.05). Fusion and cleavage rates, and cell allocation, were similar between groups. On a per-WOW basis, development to the blastocyst stage was similar between groups, except for the lower rates of development seen in G3. However, when based on the number of embryos per group (one or two), blastocyst development was higher in G1 than in all other groups, which were similar to one another. Cloned GFP embryos had lower in vitro development to the blastocyst stage than WT embryos, which had a higher TCN than parthenote or aggregated chimeric WT/GFP embryos. Aggregated GFP embryos had fewer cells than the other embryo groups.

Discussion: The in vitro development of GFP cloned embryos was lower than that of WT embryos, with no effects on cell allocation in the resulting blastocysts. Differences in blastocyst rate between groups were likely due to lower GFP-expressing cell viability, as GFP donor cells were at high population cell doublings when used for cloning. On a per-embryo basis, embryo aggregation on Day 1 resulted in blastocyst development on Day 7 similar to that of non-aggregated embryos, with no differences in cell proportion between groups. The use of GFP-expressing cells proved to be a promising strategy for the study of cell allocation during embryo development, which may assist in elucidating the mechanisms of abnormalities after in vitro embryo manipulations, leading to the development of improved protocols for the in vitro production (IVP) of bovine embryos.
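As an aside for readers reproducing this kind of comparison: the rate contrasts described above are simple count comparisons, and a hedged sketch of how they might be run is given below, with invented counts and using scipy rather than whatever software the authors actually used.

```python
from scipy import stats

# Invented example counts: blastocysts vs. non-blastocysts per group.
g1 = [18, 32]   # e.g. G1: 18 of 50 structures reached the blastocyst stage
g3 = [7, 43]    # e.g. G3: 7 of 50

# Fisher's exact test for a 2x2 table (suitable for small expected counts) ...
odds, p_fisher = stats.fisher_exact([g1, g3])
# ... or the chi-square test when expected counts are large enough.
chi2, p_chi2, dof, _ = stats.chi2_contingency([g1, g3])
print(f"Fisher P = {p_fisher:.4f}, chi-square P = {p_chi2:.4f}")
```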

Relevance: 20.00%

Abstract:

Let D be a division ring with center k, and let D† be its multiplicative group. We investigate the existence of free groups in D†, and of free algebras and free group algebras in D. We also consider the case in which D has an involution *, and study the existence of free symmetric and unitary pairs in D†.

Relevance: 20.00%

Abstract:

This paper discusses the power allocation problem with a fixed rate constraint in multi-carrier code division multiple access (MC-CDMA) networks, which has been solved from a game-theoretic perspective through the use of an iterative water-filling algorithm (IWFA). The problem is analyzed under various interference density configurations, and its reliability is studied in terms of solution existence and uniqueness. Moreover, numerical results reveal the approach's shortcomings, so a new method combining swarm intelligence and IWFA is proposed to make game-theoretic approaches practicable in realistic MC-CDMA system scenarios. The contribution of this paper is twofold: (i) to provide a complete analysis of the existence and uniqueness of the game solution, from simple to more realistic and complex interference scenarios; (ii) to propose a hybrid power allocation optimization method combining swarm intelligence, game theory and IWFA. To corroborate the effectiveness of the proposed method, an outage probability analysis in realistic interference scenarios and a complexity comparison with the classical IWFA are presented. (C) 2011 Elsevier B.V. All rights reserved.
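To fix ideas, a minimal numpy sketch of the iterative water-filling idea follows. It is an illustration only, not the paper's formulation: the power budget p_total, the gain matrix G (with cross-gains folded into the direct gains), and the scalar noise floor are all invented, and the fixed-rate constraint and swarm-intelligence hybridization are omitted.

```python
import numpy as np

def waterfill(gains, noise, p_total):
    """Single-user water-filling via bisection on the water level mu."""
    lo, hi = 0.0, p_total + (noise / gains).max()
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - noise / gains)   # pour power above the noise floor
        if p.sum() > p_total:
            hi = mu
        else:
            lo = mu
    return p

def iwfa(G, noise, p_total, n_iter=50):
    """Iterative water-filling: each user treats the other users'
    transmissions as extra noise and re-optimizes its own powers in turn."""
    K, N = G.shape                      # K users, N subcarriers
    P = np.zeros((K, N))
    for _ in range(n_iter):
        for k in range(K):
            interference = (G * P).sum(axis=0) - G[k] * P[k]
            P[k] = waterfill(G[k], noise + interference, p_total)
    return P

rng = np.random.default_rng(0)
G = rng.uniform(0.1, 1.0, size=(3, 8))  # 3 users, 8 subcarriers (illustrative)
print(iwfa(G, noise=0.01, p_total=1.0).round(3))
```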

Relevance: 20.00%

Abstract:

Understanding the underlying mechanisms that account for the impact of potassium (K) fertilization, and of its replacement by sodium (Na), on tree growth is key to improving the management of forest plantations that are expanding over weathered tropical soils with low amounts of exchangeable bases. A randomized complete block design was planted with Eucalyptus grandis (W. Hill ex Maiden) to quantify growth, carbon uptake and carbon partitioning using a carbon budget approach. A combination of approaches, including the establishment of allometric relationships over the whole rotation and measurements of soil CO2 efflux and aboveground litterfall at the end of the rotation, was used to estimate aboveground net primary production (ANPP), total belowground carbon flux (TBCF) and gross primary production (GPP). The stable carbon isotope composition (δ13C) of the stem wood alpha-cellulose produced every year was used as a proxy for stomatal limitation of photosynthesis. Potassium fertilization increased GPP and decreased the fraction of carbon allocated belowground. Aboveground net production was strongly enhanced: because leaf lifespan increased, leaf biomass rose without any change in leaf production, and wood production (PW) increased dramatically. Sodium application decreased the fraction of carbon allocated belowground in a similar way and enhanced GPP, ANPP and PW, but to a lesser extent than K fertilization. Neither K nor Na affected the δ13C of stem wood alpha-cellulose, suggesting that water-use efficiency was the same among the treatments and that the inferred increase in leaf photosynthesis was not related only to a higher stomatal conductance. We conclude that the response of PW to K fertilization and Na addition resulted from drastic changes in carbon allocation.
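The abstract does not spell out the budget arithmetic; under the standard mass-balance approach, the measurements listed above would combine roughly as follows. This is a hedged reconstruction, not quoted from the paper, with R_above denoting aboveground autotrophic respiration.

```latex
% Hedged reconstruction of the carbon-budget relations (assumed, not quoted):
\begin{align*}
  \mathrm{TBCF} &\approx F_{\mathrm{soil\,CO_2}} - F_{\mathrm{litterfall}}, \\
  \mathrm{GPP}  &\approx \mathrm{ANPP} + R_{\mathrm{above}} + \mathrm{TBCF}.
\end{align*}
```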

Relevance: 20.00%

Abstract:

This paper presents the results of a simulation using physical objects. This concept integrates the physical dimensions of an entity, such as length, width, and weight, with the usual process-flow paradigm recurrent in discrete event simulation models. Based on a naval logistics system, we applied this technique to the access channel of the largest port in Latin America. The system consists of vessel movements constrained by the access channel dimensions: vessel length and width dictate whether it is safe to have one or two ships in the channel simultaneously. The proposed methodology delivered an accurate validation of the model, with a deviation of approximately 0.45% from real data. Additionally, the model supported the design of new terminal operations for Santos, delivering KPIs such as channel utilization, queue time, berth utilization, and throughput capability.
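The occupancy rule described above (channel capacity depending on vessel beam) can be sketched in a few lines of SimPy. Everything below is invented for illustration: the two-slot channel, the 40 m beam threshold, and the transit and inter-arrival times are assumptions, not the paper's model.

```python
import random
import simpy

CHANNEL_SLOTS = 2   # two narrow ships may transit at once (assumption)

def vessel(env, name, beam, channel):
    # A wide vessel takes exclusive possession (both slots);
    # a narrow one occupies a single slot, allowing two simultaneous transits.
    slots = 2 if beam > 40.0 else 1          # 40 m threshold is illustrative
    arrive = env.now
    yield channel.get(slots)                 # queue for channel access
    queue_time = env.now - arrive
    yield env.timeout(random.uniform(1.0, 2.0))  # transit time (h), invented
    yield channel.put(slots)                 # release the channel
    print(f"{name}: queued {queue_time:.2f} h")

def source(env, channel):
    for i in range(20):
        beam = random.choice([32.0, 48.0])   # narrow or wide vessel
        env.process(vessel(env, f"ship-{i}", beam, channel))
        yield env.timeout(random.expovariate(1.0))  # inter-arrival times

env = simpy.Environment()
channel = simpy.Container(env, init=CHANNEL_SLOTS, capacity=CHANNEL_SLOTS)
env.process(source(env, channel))
env.run()
```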

Relevance: 20.00%

Abstract:

A prevalent claim is that we are in a knowledge economy. When we talk about the knowledge economy, we generally mean the concept of a "knowledge-based economy", indicating the use of knowledge and technologies to produce economic benefits. Knowledge is thus both a tool and a raw material (people's skill) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes: for example, authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only few constraints on the set of coordination mechanisms. Hence what characterizes a knowledge economy is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the problem of the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a "supervisory problem" (Hodgson, 1999) emerges, and traditional hierarchical control may become increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because "the nominal supervisors will not know the best way of doing the job - or even the precise purpose of the specialist job itself - and the worker will know better" (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized.

The aim of this thesis is to bridge studies from computer science, and in particular from Peer-to-Peer (P2P) Networks, to organization theories. We think that the P2P paradigm fits well with organization problems arising in all those situations in which a central authority is not possible. We believe that P2P Networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P Networks can be applied to organization studies. P2P Networks have three main characteristics in common with firms involved in the knowledge economy:

- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the single peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them.
- Self-organization: the process in a system leading to the emergence of global order within the system, without the presence of another system dictating this order.

These characteristics are also present in the kind of firm that we try to address, and that is why we have shifted the techniques we adopted for studies in computer science (Marcozzi et al., 2005; Hales et al., 2007 [39]) to management science.

Relevance: 20.00%

Abstract:

This work presents exact, hybrid algorithms for mixed resource Allocation and Scheduling problems; in general terms, these consist in assigning finite-capacity resources over time to a set of precedence-connected activities. The proposed methods have broad applicability, but are mainly motivated by applications in the field of Embedded System Design. In particular, high-performance embedded computing has recently witnessed the shift from single-CPU platforms with application-specific accelerators to programmable Multi-Processor Systems-on-Chip (MPSoCs). These allow higher flexibility, real-time performance and low energy consumption, but the programmer must be able to effectively exploit the platform parallelism. This raises interest in the development of algorithmic techniques to be embedded in CAD tools; in particular, given a specific application and platform, the objective is to perform optimal allocation of hardware resources and to compute an execution schedule. In this regard, since embedded systems tend to run the same set of applications for their entire lifetime, off-line, exact optimization approaches are particularly appealing. Quite surprisingly, the use of exact algorithms has not been well investigated so far; this is in part explained by the complexity of integrated allocation and scheduling, which sets tough challenges for "pure" combinatorial methods. The use of hybrid CP/OR approaches presents the opportunity to exploit the mutual advantages of different methods, while compensating for their weaknesses. In this work, we first consider an Allocation and Scheduling problem over the Cell BE processor by Sony, IBM and Toshiba; we propose three different solution methods, leveraging decomposition, cut generation and heuristic-guided search. Next, we face the Allocation and Scheduling of so-called Conditional Task Graphs, explicitly accounting for branches whose outcome is not known at design time; we extend the CP scheduling framework to deal effectively with the introduced stochastic elements. Finally, we address Allocation and Scheduling with uncertain, bounded execution times via conflict-based tree search; we introduce a simple and flexible time model to take duration variability into account, and provide an efficient conflict detection method. The proposed approaches achieve good results on practical-size problems, demonstrating that the use of exact approaches for system design is feasible. Furthermore, the developed techniques bring significant contributions to combinatorial optimization methods.
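To make the integrated problem concrete, here is a minimal sketch of joint allocation and scheduling as a constraint program, written with Google OR-tools CP-SAT. It does not reproduce the thesis's hybrid CP/OR algorithms: the task durations, precedences, and two-processor platform are all invented for illustration.

```python
from ortools.sat.python import cp_model

durations = [3, 2, 4, 2]                   # invented task durations
precedences = [(0, 2), (1, 2), (2, 3)]     # (before, after) pairs
n_procs = 2
horizon = sum(durations)

model = cp_model.CpModel()
starts, ends = [], []
proc_intervals = [[] for _ in range(n_procs)]
for t, d in enumerate(durations):
    s = model.NewIntVar(0, horizon, f"s{t}")
    e = model.NewIntVar(0, horizon, f"e{t}")
    starts.append(s); ends.append(e)
    uses = []
    for p in range(n_procs):
        use = model.NewBoolVar(f"x{t}_{p}")
        iv = model.NewOptionalIntervalVar(s, d, e, use, f"iv{t}_{p}")
        proc_intervals[p].append(iv)
        uses.append(use)
    model.Add(sum(uses) == 1)              # allocation: one processor per task

for p in range(n_procs):
    model.AddNoOverlap(proc_intervals[p])  # unary resource capacity

for a, b in precedences:
    model.Add(starts[b] >= ends[a])        # precedence constraints

makespan = model.NewIntVar(0, horizon, "makespan")
model.AddMaxEquality(makespan, ends)
model.Minimize(makespan)                   # scheduling objective

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print("makespan =", solver.Value(makespan))
```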

Relevance: 20.00%

Abstract:

RTLS and RFID systems are becoming more and more important in several fields, and when these systems meet UWB technology they can take advantage of each other's strengths. Since strong importance is nowadays given to "green" technology, we chose to adopt a passive solution; in this case, backscattering modulation can be used to carry data. It is therefore necessary to analyze the behavior of the antennas used as tags when they are close to objects made of different materials. In particular, the antenna-mode component has been observed in depth, as it is the crucial part of the signal with regard to backscatter modulation.

Relevance: 20.00%

Abstract:

Carbon fluxes and allocation patterns, and their relationship with the main environmental and physiological parameters, were studied in an apple orchard for one year (2010). I combined three widely used methods: eddy covariance, soil respiration and biometric measurements, and I applied a measurement protocol allowing a cross-check between C fluxes estimated using the different methods. I attributed NPP components to standing biomass increment, the detritus cycle and lateral export. The influence of environmental and physiological parameters on NEE, GPP and Reco was analyzed with a multiple regression model approach. I found that both the NEP and the GPP of the apple orchard were of similar magnitude to those of forests growing in similar climate conditions, while large differences occurred in the allocation pattern and in the fate of the produced biomass. Apple production accounted for 49% of annual NPP, organic material contributing to the detritus cycle (leaves, fine root litter, pruned wood and early fruit drop) accounted for 46%, and only 5% went to standing biomass increment. The carbon use efficiency (CUE), with an annual average of 0.68 ± 0.10, was higher than the previously suggested constant values of 0.47-0.50. Light and leaf area index had the strongest influence on both NEE and GPP. On a diurnal basis, NEE and GPP reached their peaks at approximately noon, while they appeared to be limited by high values of VPD and air temperature in the afternoon. The proposed models can be used to explain and simulate the current relations between carbon fluxes and environmental parameters at daily and yearly time scales. On average, the annual NEP balanced the carbon exported annually with the harvested apples. These data support the hypothesis of a minimal or null impact of the apple orchard ecosystem on net C emission to the atmosphere.
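A compact way to see the bookkeeping behind these figures uses the standard flux definitions; the relations below are assumed rather than quoted from the thesis, while the percentages and the CUE value are those reported above.

```latex
% Standard definitions assumed; numbers are from the abstract.
\begin{align*}
  \mathrm{NEP} &= \mathrm{GPP} - R_{\mathrm{eco}}, \\
  \mathrm{NPP} &= \underbrace{0.49\,\mathrm{NPP}}_{\text{fruit}}
               + \underbrace{0.46\,\mathrm{NPP}}_{\text{detritus}}
               + \underbrace{0.05\,\mathrm{NPP}}_{\text{biomass increment}}, \\
  \mathrm{CUE} &= \mathrm{NPP}/\mathrm{GPP} \approx 0.68 .
\end{align*}
```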

Relevance: 20.00%

Abstract:

In this work, we consider a simple model problem for the electromagnetic exploration of small perfectly conducting objects buried within the lower halfspace of an unbounded two-layered background medium. In possible applications, such as, e.g., humanitarian demining, the two layers would correspond to air and soil. Moving a set of electric devices parallel to the surface of the ground to generate a time-harmonic field, the induced field is measured within the same devices. The goal is to retrieve information about buried scatterers from these data. In mathematical terms, we are concerned with the analysis and numerical solution of the inverse scattering problem of reconstructing the number and the positions of a collection of finitely many small perfectly conducting scatterers buried within the lower halfspace of an unbounded two-layered background medium from near-field measurements of time-harmonic electromagnetic waves. For this purpose, we first study the corresponding direct scattering problem in detail and derive an asymptotic expansion of the scattered field as the size of the scatterers tends to zero. Then, we use this expansion to justify a noniterative MUSIC-type reconstruction method for the solution of the inverse scattering problem. We propose a numerical implementation of this reconstruction method and provide a series of numerical experiments.
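The abstract stops short of stating the reconstruction criterion. In the standard formulation of MUSIC-type methods (assumed here, not quoted from the paper), the positions are recovered by a range test on the measured multistatic response matrix:

```latex
% Assumed notation: N = multistatic response matrix assembled from the
% near-field data, P = orthogonal projection onto its noise subspace
% (obtained from an SVD of N), g_z = test vector for a sampling point z.
z \ \text{is a scatterer position}
  \quad\Longleftrightarrow\quad P\, g_z = 0,
\qquad\text{visualized via}\quad
I(z) = \frac{1}{\lVert P\, g_z \rVert},
```

so that the indicator I(z) peaks at the sought positions.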

Relevance: 20.00%

Abstract:

In computer systems, and specifically in multithreaded, parallel and distributed systems, a deadlock is both a very subtle problem, because it is difficult to prevent while coding the system, and a very dangerous one: a deadlocked system easily becomes completely stuck, with consequences ranging from simple annoyances to life-threatening circumstances, with the far-from-negligible scenario of economic losses in between. How, then, can this problem be avoided? Many possible solutions have been studied, proposed and implemented. In this thesis we focus on the detection of deadlocks with a static program analysis technique, i.e. an analysis performed without actually executing the program. To begin, we briefly present the static Deadlock Analysis Model developed for coreABS-- in chapter 1, and then detail the class-based coreABS-- language in chapter 2. In chapter 3 we lay the foundation for further discussion by analyzing the differences between coreABS-- and ASP, an untyped object-based calculus, so as to show how the Deadlock Analysis can be extended to object-based languages in general. In this regard, we make some hypotheses explicit in chapter 4, first by presenting a possible, unproven type system for ASP, modeled after the Deadlock Analysis Model developed for coreABS--. We then conclude our discussion by presenting a simpler hypothesis, which may allow us to circumvent the difficulties that arise from the definition of the "ad-hoc" type system discussed in the preceding chapter.
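For readers unfamiliar with the pattern such an analysis must catch statically, the classic circular-wait deadlock can be reproduced in a few lines; this is a generic Python illustration, unrelated to coreABS-- itself.

```python
import threading

a, b = threading.Lock(), threading.Lock()

def worker_1():
    with a:        # holds a ...
        with b:    # ... then waits for b
            pass

def worker_2():
    with b:        # holds b ...
        with a:    # ... then waits for a: circular wait, possible deadlock
            pass

t1 = threading.Thread(target=worker_1)
t2 = threading.Thread(target=worker_2)
t1.start(); t2.start()
t1.join(); t2.join()   # may never return if each thread grabs its first lock
```

Whether the program hangs depends on the interleaving, which is exactly why dynamic testing is unreliable here and a static analysis is attractive.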

Relevance: 20.00%

Abstract:

We consider a simple (but fully three-dimensional) mathematical model for the electromagnetic exploration of buried, perfectly electrically conducting objects within the soil underground. Moving an electric device parallel to the ground at constant height in order to generate a magnetic field, we measure the induced magnetic field within the device, and factor the underlying mathematics into a product of three operations which correspond to the primary excitation, some kind of reflection on the surface of the buried object(s), and the corresponding secondary excitation, respectively. Using this factorization we are able to give a justification of the so-called sampling method from inverse scattering theory for this particular set-up.
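Schematically, and with notation assumed here rather than taken from the paper, such a factorization and the sampling test it justifies take the standard form:

```latex
% Assumed notation: M = measurement operator, E_p / E_s = primary / secondary
% excitation operators, R = the "reflection" on the buried objects,
% g_z = test function associated with a point z, D = the scatterers.
M = \mathcal{E}_s\, \mathcal{R}\, \mathcal{E}_p ,
\qquad
z \in D \quad\Longleftrightarrow\quad g_z \in \operatorname{Range}(\mathcal{E}_s).
```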

Relevance: 20.00%

Abstract:

Asset allocation choices are a recurring problem for every investor, who is continuously engaged in combining different asset classes to arrive at an investment consistent with his or her preferences. The need to support asset managers in carrying out their tasks has fed, over time, a vast literature proposing numerous portfolio construction strategies and models. This thesis attempts to provide a review of some innovative forecasting models and of some strategies in the field of tactical asset allocation, and then to assess their practical implications. First, we verify whether relationships exist between the dynamics of some macroeconomic variables and the financial markets, with the aim of identifying an econometric model capable of guiding managers' strategies in the construction of their investment portfolios. The analysis considers the American market during a period characterized by rapid economic transformations and high volatility of stock prices. Second, we examine the validity of momentum and contrarian trading strategies in futures markets, in particular those of the Eurozone, which lend themselves well to their implementation thanks to the absence of constraints on shorting and to low transaction costs. The investigation shows that both anomalies are stable over time. The anomalous returns persist even when traditional asset pricing models are used, such as the CAPM, the Fama-French model and the Carhart model. Finally, using the EGARCH-M approach, we formulate forecasts of the volatility of the returns of the stocks belonging to the Dow Jones; these are then used as inputs to determine the views to be inserted into the Black-Litterman model. The results obtained show, for different values of the scalar tau, average excess returns of the new combined vector that exceed the vector of market equilibrium excess returns, albeit with higher levels of risk.
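The "new combined vector" mentioned at the end is the Black-Litterman posterior mean. A minimal numpy sketch of that computation follows; the formula is the standard one, while every numeric input here is invented for illustration (in the thesis, the view uncertainties would come from the EGARCH-M volatility forecasts).

```python
import numpy as np

def black_litterman(pi, sigma, P, Q, omega, tau):
    """New combined (posterior) expected excess returns:
    E[r] = [(tau*S)^-1 + P' O^-1 P]^-1 [(tau*S)^-1 pi + P' O^-1 Q]."""
    ts_inv = np.linalg.inv(tau * sigma)
    om_inv = np.linalg.inv(omega)
    A = ts_inv + P.T @ om_inv @ P
    b = ts_inv @ pi + P.T @ om_inv @ Q
    return np.linalg.solve(A, b)

# Invented inputs: 3 assets, one absolute view on asset 0.
pi = np.array([0.03, 0.05, 0.04])        # equilibrium excess returns
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])   # return covariance matrix
P = np.array([[1.0, 0.0, 0.0]])          # view: asset 0 ...
Q = np.array([0.06])                     # ... will earn 6%
omega = np.array([[0.001]])              # view uncertainty (e.g. from EGARCH-M)
for tau in (0.025, 0.05, 0.1):           # the scalar tau discussed above
    print(tau, black_litterman(pi, sigma, P, Q, omega, tau).round(4))
```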

Relevance: 20.00%

Abstract:

This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic Scheduling Problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns the assignment of time and resources to a set of activities, to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks facing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDFGs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint, along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on practical-size problems.
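The modular precedence constraint is only named in the abstract; in the usual formulation of cyclic scheduling (notation assumed here, not taken from the thesis), a precedence between activities i and j across iterations reads:

```latex
% Assumed notation: s_i = start time within the schedule, d_i = duration,
% \lambda = period (a decision variable), \omega_{ij} = iteration distance.
s_j \;\geq\; s_i + d_i - \omega_{ij}\,\lambda .
```

Unfolded over iterations k, this states that occurrence k + ω_ij of activity j starts only after occurrence k of activity i has finished; treating the period λ as a decision variable is what makes the model non-linear, matching the abstract's remark that the period is inferred from the scheduling decisions rather than fixed in a generate-and-test loop.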