914 results for Network Graph and RAN Model


Relevance:

100.00%

Publisher:

Abstract:

Summary: Ecotones are sensitive to change because they contain high numbers of species living at the margin of their environmental tolerance. This is equally true of tree-lines, which are determined by altitudinal or latitudinal temperature gradients. In the current context of climate change, they are expected to undergo modifications in position, tree biomass and possibly species composition. Altitudinal and latitudinal tree-lines differ mainly in the steepness of the underlying temperature gradient: distances are larger at latitudinal tree-lines, which could have an impact on the ability of tree species to migrate in response to climate change. Aside from temperature, tree-lines are also affected on a more local level by pressure from human activities. These are also changing as a consequence of modifications in our societies and may interact with the effects of climate change. Forest dynamics models are often used for climate change simulations because of their mechanistic processes. The spatially explicit model TreeMig was used as a base to develop a model specifically tuned for the northern European and Alpine tree-line ecotones. For the latter, a module for land-use change processes was also added. The temperature response parameters for the species in the model were first calibrated by means of tree-ring data from various species and sites at both tree-lines. This improved the growth response function in the model, but also led to the conclusion that regeneration is probably more important than growth for controlling tree-line position and species' distributions. The second step was to implement the module for abandonment of agricultural land in the Alps, based on an existing spatial statistical model. The sensitivity of its most important variables was tested and the model's performance compared to other modelling approaches.
The probability that agricultural land would be abandoned was strongly influenced by the distance from the nearest forest and the slope, both of which are proxies for cultivation costs. When applied to a case study area, the resulting model, named TreeMig-LAb, gave the most realistic results. These were consistent with observed consequences of land abandonment such as the expansion of the existing forest and the closing up of gaps. This new model was then applied in two case study areas, one in the Swiss Alps and one in Finnish Lapland, under a variety of climate change scenarios. These were based on forecasts of temperature change over the next century by the IPCC and the HadCM3 climate model (ΔT: +1.3, +3.5 and +5.6 °C) and included a post-change stabilisation period of 300 years. The results showed radical disruptions at both tree-lines. With the most conservative climate change scenario, species' distributions simply shifted, but it took several centuries to reach a new equilibrium. With the more extreme scenarios, some species disappeared from our study areas (e.g. Pinus cembra in the Alps) or dwindled to very low numbers, as they ran out of land into which they could migrate. The most striking result was the lag in the response of most species, independently of the climate change scenario or tree-line type considered. Finally, a statistical model of the effect of reindeer (Rangifer tarandus) browsing on the growth of Pinus sylvestris was developed, as a first step towards implementing human impacts at the boreal tree-line. The expected effect was an indirect one, as reindeer deplete the ground lichen cover, thought to protect the trees against adverse climate conditions. The model showed a small but significant effect of browsing, but as the link with the underlying climate variables was unclear and the model was not spatial, it was not usable as such.
Developing the TreeMig-LAb model made it possible to: a) establish a method for deriving species' parameters for the growth equation from tree-rings, b) highlight the importance of regeneration in determining tree-line position and species' distributions and c) improve the integration of social sciences into landscape modelling. Applying the model at the Alpine and northern European tree-lines under different climate change scenarios showed that with most forecasted levels of temperature increase, tree-lines would suffer major disruptions, with shifts in distributions and potential extinction of some tree-line species. However, these responses showed strong lags, so these effects would not become apparent for decades and could take centuries to stabilise.
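The first calibration step described above, deriving a species' temperature response parameters from tree-ring data, can be sketched as a simple curve fit. The Gaussian response shape, the synthetic data and all parameter names here are illustrative assumptions, not TreeMig's actual growth function:

```python
import numpy as np
from scipy.optimize import curve_fit

def growth_response(temp, g_max, t_opt, width):
    """Hypothetical bell-shaped growth response to temperature."""
    return g_max * np.exp(-((temp - t_opt) ** 2) / (2 * width ** 2))

# Synthetic "tree-ring" data: ring-width index vs. growing-season temperature
rng = np.random.default_rng(0)
temps = rng.uniform(2.0, 14.0, 80)
rings = growth_response(temps, 1.0, 8.0, 3.0) + rng.normal(0, 0.05, 80)

# Fit the response parameters the same way one would from real series
params, _ = curve_fit(growth_response, temps, rings, p0=[1.0, 7.0, 2.0])
g_max, t_opt, width = params
print(f"fitted optimum temperature: {t_opt:.1f} °C")
```

With real tree-ring series the fitted optimum and width would then parameterize the growth function per species and site.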

Relevance:

100.00%

Publisher:

Abstract:

Brain fluctuations at rest are not random but are structured in spatial patterns of correlated activity across different brain areas. The question of how resting-state functional connectivity (FC) emerges from the brain's anatomical connections has motivated several experimental and computational studies to understand structure-function relationships. However, the mechanistic origin of resting state is obscured by large-scale models' complexity, and a close structure-function relation is still an open problem. Thus, a realistic but simple enough description of relevant brain dynamics is needed. Here, we derived a dynamic mean field model that consistently summarizes the realistic dynamics of a detailed spiking and conductance-based synaptic large-scale network, in which connectivity is constrained by diffusion imaging data from human subjects. The dynamic mean field approximates the ensemble dynamics, whose temporal evolution is dominated by the longest time scale of the system. With this reduction, we demonstrated that FC emerges as structured linear fluctuations around a stable low firing activity state close to destabilization. Moreover, the model can be further and crucially simplified into a set of motion equations for statistical moments, providing a direct analytical link between anatomical structure, neural network dynamics, and FC. Our study suggests that FC arises from noise propagation and dynamical slowing down of fluctuations in an anatomically constrained dynamical system. Altogether, the reduction from spiking models to statistical moments presented here provides a new framework to explicitly understand the building up of FC through neuronal dynamics underpinned by anatomical connections and to drive hypotheses in task-evoked studies and for clinical applications.
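The reduction described, FC emerging as structured linear fluctuations around a stable fixed point, can be sketched with a noise-driven linear system whose stationary covariance solves a Lyapunov equation. The toy connectivity and parameter values below are assumptions for illustration, not the diffusion-imaging data:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Toy anatomical connectivity (symmetric, zero diagonal) for 4 regions
rng = np.random.default_rng(1)
C = rng.random((4, 4)); C = (C + C.T) / 2; np.fill_diagonal(C, 0)

# Jacobian of the linearized dynamics around the stable low-activity state:
# local decay plus coupling through C (illustrative parameter values)
A = -1.0 * np.eye(4) + 0.3 * C

# Stationary covariance S of the noise-driven fluctuations solves the
# Lyapunov equation A S + S A^T + Q = 0, with Q the noise covariance
Q = 0.01 * np.eye(4)
S = solve_continuous_lyapunov(A, -Q)

# Model FC = correlation matrix of the fluctuations
d = np.sqrt(np.diag(S))
FC = S / np.outer(d, d)
print(np.round(FC, 2))
```

The off-diagonal structure of `FC` is inherited from `C`, which is the structure-function link the abstract describes.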

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Scientists have long been trying to understand the molecular mechanisms of diseases in order to design preventive and therapeutic strategies. For some diseases, it has become evident that it is not enough to obtain a catalogue of the disease-related genes: we must also uncover how disruptions of molecular networks in the cell give rise to disease phenotypes. Moreover, with the unprecedented wealth of information available, even obtaining such a catalogue is extremely difficult.

PRINCIPAL FINDINGS: We developed a comprehensive gene-disease association database by integrating associations from several sources that cover different biomedical aspects of diseases. In particular, we focus on the current knowledge of human genetic diseases, including Mendelian, complex and environmental diseases. To assess the concept of modularity of human diseases, we performed a systematic study of the emergent properties of human gene-disease networks by means of network topology and functional annotation analysis. The results indicate a highly shared genetic origin of human diseases and show that for most diseases, including Mendelian, complex and environmental diseases, functional modules exist. Moreover, a core set of biological pathways is found to be associated with most human diseases. We obtained similar results when studying clusters of diseases, suggesting that related diseases might arise from dysfunction of common biological processes in the cell.

CONCLUSIONS: For the first time, we include Mendelian, complex and environmental diseases in an integrated gene-disease association database and show that the concept of modularity applies to all of them. We furthermore provide a functional analysis of disease-related modules, yielding important new biological insights that might not be discovered when considering each of the gene-disease association repositories independently. Hence, we present a suitable framework for the study of how genetic and environmental factors, such as drugs, contribute to diseases.

AVAILABILITY: The gene-disease networks used in this study and part of the analysis are available at http://ibi.imim.es/DisGeNET/DisGeNETweb.html#Download
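The modularity analysis rests on projecting the bipartite gene-disease network onto diseases: two diseases are linked when they share an associated gene. A minimal sketch with hypothetical associations (not actual DisGeNET entries):

```python
from collections import defaultdict
from itertools import combinations

# Toy gene-disease associations (hypothetical, for illustration only)
associations = [("BRCA1", "breast cancer"), ("BRCA1", "ovarian cancer"),
                ("TP53", "breast cancer"), ("TP53", "sarcoma"),
                ("APOE", "alzheimer"), ("APOE", "hyperlipidemia")]

# Index: disease -> set of associated genes
genes_of = defaultdict(set)
for gene, disease in associations:
    genes_of[disease].add(gene)

# Disease-disease projection: link two diseases if they share >= 1 gene
shared = {}
for d1, d2 in combinations(sorted(genes_of), 2):
    common = genes_of[d1] & genes_of[d2]
    if common:
        shared[(d1, d2)] = common

for (d1, d2), genes in shared.items():
    print(f"{d1} -- {d2}: shared genes {sorted(genes)}")
```

Clusters in the resulting disease-disease graph are the candidate functional modules whose pathway annotations the study then examines.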

Relevance:

100.00%

Publisher:

Abstract:

The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is polynomial-time solvable for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that for any discrete-choice model the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.
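The per-segment MNL purchase probabilities that underlie CDLP can be sketched in a few lines; the product names and preference weights below are illustrative, not from the paper:

```python
def mnl_probs(offer_set, weights, w0=1.0):
    """MNL purchase probabilities for the products in `offer_set`.
    `weights` maps product -> preference weight; w0 is the no-purchase
    weight. The remaining probability mass is the no-purchase outcome."""
    denom = w0 + sum(weights[p] for p in offer_set)
    return {p: weights[p] / denom for p in offer_set}

weights = {"fare_low": 2.0, "fare_mid": 1.0, "fare_high": 0.5}

# Purchase probabilities when only the low and high fares are offered
probs = mnl_probs({"fare_low", "fare_high"}, weights)
print(probs)
```

CDLP then optimizes, over all such offer sets, how long each should be offered, which is why its column/separation problem grows with the number of assortments.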

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work was to compare the relative efficiency of initial selection and genetic parameter estimation using the augmented block design (ABD), the augmented block twice-replicated design (DABD) and groups of randomised block design experiments with common treatments (ERBCT), by simulation, considering a fixed-effect model and a mixed model with regular treatment effects as random. Eight different conditions (scenarios) were considered for the simulations. From the 600 simulations in each scenario, the mean percentage of selection coincidence, the Pearson correlation estimates between adjusted means for the fixed-effects model, and the heritability estimates for the mixed model were evaluated. DABD and ERBCT were very similar in their comparisons and slightly superior to ABD. Considering the initial stages of selection in a plant breeding program, ABD is a good alternative for selecting superior genotypes, although none of the designs was effective at estimating heritability in all the different scenarios evaluated.
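The key evaluation metric, mean percentage selection coincidence, can be sketched as follows, with invented genetic values and noise levels rather than the paper's simulation settings:

```python
import numpy as np

# Selection coincidence: fraction of genotypes selected from the
# design-adjusted means that would also be selected on true genetic values.
rng = np.random.default_rng(3)
true_values = rng.normal(0.0, 1.0, 100)              # 100 regular treatments
observed = true_values + rng.normal(0.0, 0.7, 100)   # adjusted means + error

k = 10  # select the top 10 genotypes
sel_true = set(np.argsort(true_values)[-k:])
sel_obs = set(np.argsort(observed)[-k:])
coincidence = 100 * len(sel_true & sel_obs) / k
print(f"selection coincidence: {coincidence:.0f}%")
```

Averaging this over 600 simulated trials per scenario, as the study does, yields the mean percentage selection coincidence used to compare the three designs.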

Relevance:

100.00%

Publisher:

Abstract:

In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they aren't oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level and vice versa. In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence?
Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective? In considering these issues and questions, the author concluded that it would be of value if one could identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) that roads are but one sub-system of a much larger 'Road Based Transportation System', 2) that the size and activity level of the overall system are determined by market forces, 3) that the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation, and 4) that the economic purpose of making road improvements is to minimize that total cost. To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used in full system, partial system, single segment, project, and general design guide levels of analysis.
The method appeared to be capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
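Principles 3) and 4) above reduce to a simple computation: sum every cost component of the whole system, then choose the alternative that minimizes that sum. A minimal sketch with invented figures, not the Iowa data:

```python
def total_cost(road_capital, road_maintenance, vehicle_operating,
               travel_time_value, crash_cost):
    """Total Cost of Transportation: everything expended to build the
    system and generate its activity (principle 3), in M$/yr."""
    return (road_capital + road_maintenance + vehicle_operating
            + travel_time_value + crash_cost)

# Status quo vs. a paving project that raises road costs but lowers
# vehicle operating, time and crash costs (principle 4: minimize the total)
base = total_cost(10.0, 4.0, 55.0, 30.0, 6.0)
paved = total_cost(13.5, 3.0, 52.0, 28.5, 5.5)
verdict = "build" if paved < base else "defer"
print(f"base {base:.1f} M$/yr, paved {paved:.1f} M$/yr -> {verdict}")
```

The point of the framing is that the road agency's own budget is only one term; an improvement is justified when the system-wide total falls, even if road spending rises.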

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: In this study, we investigated the structural plasticity of the contralesional motor network in ischemic stroke patients using diffusion magnetic resonance imaging (MRI) and explored a model that combines an MRI-based metric of contralesional network integrity and clinical data to predict functional outcome at 6 months after stroke. METHODS: MRI and clinical examinations were performed in 12 patients in the acute phase, at 1 and 6 months after stroke. Twelve age- and gender-matched controls underwent 2 MRIs 1 month apart. Structural remodeling after stroke was assessed using diffusion MRI with an automated measurement of generalized fractional anisotropy (GFA), which was calculated along connections between contralesional cortical motor areas. The predictive model of poststroke functional outcome was computed using a linear regression of acute GFA measures and the clinical assessment. RESULTS: GFA changes in the contralesional motor tracts were found in all patients and differed significantly from controls (0.001 ≤ p < 0.05). GFA changes in intrahemispheric and interhemispheric motor tracts correlated with age (p ≤ 0.01); those in intrahemispheric motor tracts correlated strongly with clinical scores and stroke sizes (p ≤ 0.001). GFA measured in the acute phase together with a routine motor score and age were a strong predictor of motor outcome at 6 months (r(2) = 0.96, p = 0.0002). CONCLUSION: These findings represent a proof of principle that contralesional diffusion MRI measures may provide reliable information for personalized rehabilitation planning after ischemic motor stroke. Neurology® 2012;79:39-46.
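The predictive model described, a linear regression of acute GFA measures, a routine motor score and age onto 6-month outcome, can be sketched as ordinary least squares. The numbers below are synthetic and illustrative, not the study's data:

```python
import numpy as np

# Hypothetical rows = patients; columns = acute-phase GFA change,
# routine motor score, age (the predictors named in the abstract)
X = np.array([[0.02, 40, 61], [0.05, 25, 72], [0.01, 55, 54],
              [0.04, 30, 68], [0.03, 45, 59], [0.06, 20, 75]])
y = np.array([80, 45, 95, 55, 75, 35])   # motor outcome at 6 months

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(f"in-sample r^2 = {r2:.2f}")
```

With only 12 patients, as in the study, such an in-sample r² is optimistic; out-of-sample validation would be needed before clinical use.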

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Over the last decade, several European HIV observational databases have accumulated a substantial number of resistance test results and developed large sample repositories. There is a need to link these efforts together. We here describe the development of a novel tool that binds these databases together in a distributed fashion, in which control over the data remains with the cohorts rather than passing to a classic data merger.

METHODS: As proof of concept we entered two basic queries into the tool: available resistance tests and available samples. We asked for patients still alive after 1998-01-01 and between 180 and 195 cm in height, and how many samples or resistance tests would be available for these patients. The queries were uploaded with the tool to a central web server, from which each participating cohort downloaded the queries with the tool and ran them against their database. The numbers gathered were then submitted back to the server, and we could accumulate the number of available samples and resistance tests.

RESULTS: We obtained the following results from the cohorts on available samples/resistance tests: EuResist: not available/11,194; EuroSIDA: 20,716/1,992; ICONA: 3,751/500; Rega: 302/302; SHCS: 53,783/1,485. In total, 78,552 samples and 15,473 resistance tests were available amongst these five cohorts.
Once these data items have been identified, it is trivial to generate lists of relevant samples that would be useful for ultra-deep sequencing in addition to the already available resistance tests. Soon the tool will include small analysis packages that allow each cohort to pull a report on their cohort profile and also survey emerging resistance trends in their own cohort.

CONCLUSIONS: We plan to provide this tool to all cohorts within the Collaborative HIV and Anti-HIV Drug Resistance Network (CHAIN) and will provide it free of charge to others for any non-commercial use. The potential of this tool is to ease collaborations, that is, in projects requiring data, to speed up the identification of novel resistance mutations by increasing the number of observations across multiple cohorts instead of awaiting single cohorts or studies to reach the critical number needed to address such issues.
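The distributed pattern described, where queries travel to the cohorts and only aggregate counts travel back, can be sketched as follows, with hypothetical cohort names and figures rather than the real CHAIN databases:

```python
# Each cohort runs the query locally; patient-level data never leaves home.
def run_local_query(db, min_height=180, max_height=195):
    """Run the example query against a cohort's own patient table and
    return only aggregate counts."""
    hits = [p for p in db if min_height <= p["height"] <= max_height
            and p["alive_after_1998"]]
    return {"samples": sum(p["samples"] for p in hits),
            "tests": sum(p["tests"] for p in hits)}

cohorts = {
    "cohort_A": [{"height": 184, "alive_after_1998": True, "samples": 3, "tests": 1},
                 {"height": 170, "alive_after_1998": True, "samples": 9, "tests": 2}],
    "cohort_B": [{"height": 190, "alive_after_1998": True, "samples": 5, "tests": 2},
                 {"height": 186, "alive_after_1998": False, "samples": 4, "tests": 1}],
}

# The central server only ever sees the per-cohort aggregates
replies = {name: run_local_query(db) for name, db in cohorts.items()}
totals = {k: sum(r[k] for r in replies.values()) for k in ("samples", "tests")}
print(totals)  # {'samples': 8, 'tests': 3}
```

Accumulating the replies on the server gives the pooled counts, exactly as the 78,552 samples and 15,473 tests were accumulated across the five cohorts.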

Relevance:

100.00%

Publisher:

Abstract:

The European electricity sector has been in the grip of major upheavals over the past decade. Since the opening of the electricity markets, electricity companies running monopoly businesses have been forced to improve their productivity. Outsourcing of maintenance and construction activities has been sought as a solution, although outsourcing is still a new practice in this sector. The aim of this thesis is to identify the reasons a Danish electricity network company had for outsourcing its maintenance and construction activities, and to determine the benefits obtained and the risks involved. The study draws on the literature, the available due diligence and other reports and analyses, and interviews with the parties involved in the case; consultations with electricity network experts are also used. The study shows that the fundamental drivers for outsourcing maintenance and construction activities came from the pressures imposed by legislative changes and the liberalised electricity market: a municipal organisation can improve its efficiency by outsourcing some functions to a privately owned service provider. Other expected benefits of outsourcing were lower costs, a more streamlined organisation, and shedding the inefficient parts of the network company before its sale.

Relevance:

100.00%

Publisher:

Abstract:

The regulation of the electricity transmission and distribution business is an essential issue for any electricity market; it is well established in the developed electricity markets of Great Britain, the Scandinavian countries, the United States of America and others. Markets which were liberalized recently also need a well-planned regulation model to be chosen and implemented. In open electricity markets, the electricity distribution and transmission sectors remain monopolies, so-called "natural monopolies", as introducing competition into these sectors in most cases appears to be inefficient. That is why regulation becomes very important, as its main tasks are: to set reasonable tariffs for customers, to ensure a non-discriminating process of electricity transmission and distribution, and at the same time to provide distribution companies with incentives to operate efficiently and the owners of the companies with reasonable profits; the problem of power quality should be solved at the same time. It should also be mentioned that there is no incentive scheme suitable for all conditions, which is why it is essential to study different regulation models in order to form the best one for a concrete situation. The aim of this Master's Thesis is to give an overview of the regulation of electricity transmission and distribution in Russia. First, general information about the theory of regulation of natural monopolies is described; the situation in the Russian network business and the importance of the regulation process for it are discussed next. Then there is a detailed description of the existing regulatory system and the process of tariff calculation, with an example.
Finally, the work briefly analyses the problems of the present regulation scheme, attempts to predict the further development of regulation in Russia, and discusses the perspectives and risks connected to regulation which could face companies trying to enter the Russian electricity market (such as FORTUM OY).
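The tariff calculation mentioned above typically follows a cost-plus (rate-of-return) scheme for regulated network monopolies: allowed revenue covers operating costs, depreciation and a return on the regulatory asset base, spread over delivered energy. A generic sketch with illustrative figures, not actual Russian tariff data:

```python
def network_tariff(opex, depreciation, rab, allowed_return,
                   delivered_energy_mwh):
    """Cost-plus tariff: allowed revenue = OPEX + depreciation + return
    on the regulatory asset base (RAB), divided by delivered energy."""
    allowed_revenue = opex + depreciation + rab * allowed_return
    return allowed_revenue / delivered_energy_mwh

t = network_tariff(opex=120e6, depreciation=40e6, rab=800e6,
                   allowed_return=0.08, delivered_energy_mwh=5e6)
print(f"tariff: {t:.2f} per MWh")
```

Incentive schemes such as price caps modify this baseline by decoupling allowed revenue from reported costs, which is the design tradeoff the thesis discusses.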

Relevance:

100.00%

Publisher:

Abstract:

Forensic intelligence is a distinct dimension of forensic science. Forensic intelligence processes have mostly been developed to address either a specific type of trace or a specific problem. Even though these empirical developments have led to successes, they are trace-specific in nature and contribute to the generation of silos which hamper the establishment of a more general and transversal model. Forensic intelligence has shown some important perspectives, but more general developments are required to address persistent challenges. This will ensure the progress of the discipline as well as its widespread implementation in the future. This paper demonstrates that the description of forensic intelligence processes, their architectures, and the methods for building them can, at a certain level, be abstracted from the type of traces considered. A comparative analysis is made between two forensic intelligence approaches developed independently in Australia and in Europe regarding the monitoring of apparently very different kinds of problems: illicit drugs and false identity documents. An inductive effort is pursued to identify similarities and to outline a general model. Besides breaking barriers between apparently separate fields of study in forensic science and intelligence, this transversal model would assist in defining forensic intelligence, its role and place in policing, and in identifying its contributions and limitations. The model will facilitate the paradigm shift from the current case-by-case reactive attitude towards a proactive approach by serving as a guideline for the use of forensic case data in an intelligence-led perspective. A follow-up article will specifically address issues related to comparison processes, decision points and organisational issues regarding forensic intelligence (part II).

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work is to implement and present a theoretical description of different Physical Layer Network Coding schemes. Using a basic scheme as a starting point, the project presents the construction and analysis of different communication schemes whose complexity increases as the project progresses. The work is structured in several parts: first, an introduction to Physical Layer Network Coding and Lattice Network Codes is presented. Next, the mathematical tools needed to understand the CF System are introduced. Then, the first basic scheme is analysed and implemented. Building on it, we implement a vector version of the CF System and a version coded with a q-ary Hamming code. Finally, different strategies to improve the coefficient matrix A are studied and implemented.
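The simplest instance of Physical Layer Network Coding is the two-way relay, where the relay decodes the GF(2) sum (XOR) of the two messages instead of each message separately; the CF System and lattice codes in the thesis generalize this idea to noisy channels and larger alphabets. A minimal noiseless sketch:

```python
import random

# Two-way relay: A and B transmit simultaneously; the relay decodes the
# XOR of the two messages and broadcasts it; each end recovers the other's
# bits by XOR-ing the broadcast with its own message.
random.seed(42)
msg_a = [random.randint(0, 1) for _ in range(8)]
msg_b = [random.randint(0, 1) for _ in range(8)]

relay = [a ^ b for a, b in zip(msg_a, msg_b)]          # network-coded packet

decoded_at_a = [r ^ a for r, a in zip(relay, msg_a)]   # A recovers B's bits
decoded_at_b = [r ^ b for r, b in zip(relay, msg_b)]   # B recovers A's bits
print("A decoded B correctly:", decoded_at_a == msg_b)
print("B decoded A correctly:", decoded_at_b == msg_a)
```

The exchange takes two channel uses instead of four, which is the throughput gain that motivates PLNC.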

Resumo:

The major objective of this thesis is to describe and analyse how a rail carrier is engaged in an intermodal freight transportation network through its role and position. Because the role as a conceptualisation has many parallels with the position, both phenomena are evaluated theoretically and empirically. VR Cargo (a strategic business unit of the Finnish railway company VR Ltd.) was chosen as the focal firm, surrounded by the actors of the focal net. Because networks are sets of relationships rather than sets of actors, it is essential to describe the dimensions of the relationships created through time, thus having a past, present and future. Roles are created during the long common history shared by the actors, especially where IM networks are concerned. The presence of roles is embedded in the tasks, and the future is anchored to the expectations. Furthermore, in this study role refers to network dynamics, and to incremental and radical changes in the network, in a similar way as position refers to stability and to the influences of bonded structures. The main purpose of the first part of the study was to examine how the two distinctive views that hold a dominant position in modern logistics, the network view (particularly the IMP-based network approach) and the managerial view (represented by Supply Chain Management), differ, especially when intermodalism is under consideration. In this study intermodalism was defined as a form of interorganisational behaviour characterised by the physical movement of unitised goods in Intermodal Transport Units, using more than one mode, as performed by the net of operators. At this stage the study relies mainly on theoretical evaluation, broadened by discussions with practitioners. This is essential, because the continuous dialogue between theory and practice is strongly emphasised.
Some managerial implications are discussed on the basis of the theoretical examination, and a tentative model for empirical analysis in subsequent research is suggested. The empirical investigation, which relies on interviews among the members of the focal net, shows that the major role of the focal company in the network is that of the common carrier. This role has behavioural and functional characteristics, such as an executive disclosure expressing strategic will, coupled with stable and predictable managerial and organisational behaviour. Most important is the notion that the focal company is neutral towards all the other operators and willing to enhance and strengthen collaboration with all the members of the IM network. This also means that all accounts are aimed to be treated equally in terms of customer satisfaction. Moreover, these adjustments intensify the adopted role. However, the focal company is also obliged to sustain its role, as it still holds a government-granted right to be the sole provider of railway operations on domestic tracks. In addition, the roles of dominator, principal, partner, subcontractor and integrator were present, appearing either in a dyadic relationship or in a net(work) context. In order to reveal the different roles, a dualistic interpretation of the concept of role/position was employed.

Resumo:

Neuronal dynamics are fundamentally constrained by the underlying structural network architecture, yet many details of this synaptic connectivity are still unknown, even in neuronal cultures in vitro. Here we extend a previous approach based on information theory, Generalized Transfer Entropy, to the reconstruction of connectivity in simulated neuronal networks of both excitatory and inhibitory neurons. We show that, owing to the model-free nature of the developed measure, both kinds of connections can be reliably inferred if the average firing rate between synchronous burst events exceeds a small minimum frequency. Furthermore, we suggest, based on systematic simulations, that networks with even lower spontaneous inter-burst rates could be brought to meet the requirements of our reconstruction algorithm by applying a weak, spatially homogeneous stimulation to the entire network. By combining multiple recordings of the same in silico network before and after pharmacologically blocking inhibitory synaptic transmission, we then show how it becomes possible to infer with high confidence the excitatory or inhibitory nature of each individual neuron.
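To make the underlying idea concrete, here is a hedged Python sketch of plain discrete transfer entropy with history length 1 (a deliberate simplification, not the paper's Generalized Transfer Entropy, which adds further conditioning tailored to bursting cultures): TE(X→Y) is large when X's past improves prediction of Y's next state beyond what Y's own past provides, which is the signature used to infer a putative X→Y connection.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits, with history length 1."""
    n = len(x) - 1
    # Joint counts of (y_{t+1}, y_t, x_t) over all transitions.
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y0x0 = sum(v for (_, b, d), v in triples.items() if b == y0 and d == x0) / n
        p_y1y0 = sum(v for (a, b, _), v in triples.items() if a == y1 and b == y0) / n
        p_y0 = sum(v for (_, b, _), v in triples.items() if b == y0) / n
        # TE = sum p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]
        te += p_joint * np.log2(p_joint * p_y0 / (p_y0x0 * p_y1y0))
    return te

# Toy directed pair: each y value copies the previous x value.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=5000)
y = np.roll(x, 1)
y[0] = 0

te_xy = transfer_entropy(x, y)  # close to 1 bit: x drives y
te_yx = transfer_entropy(y, x)  # close to 0: y adds nothing about x's future
```

The strong asymmetry between `te_xy` and `te_yx` is what a reconstruction algorithm thresholds to propose directed links; the model-free character of the measure is what lets the same machinery handle both excitatory and inhibitory connections.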