Abstract:
Background: Besides α1,3-galactosyltransferase (Gal) gene knockout, several transgene combinations to prevent pig-to-human xenograft rejection are being investigated. hCD46/HLA-E double-transgenic pigs were tested for prevention of xenograft rejection in an ex vivo pig-to-human xenoperfusion model. In addition, expression of human thrombomodulin (hTM) on a wild-type and/or multi-transgenic (GalTKO/hCD46) background was evaluated to overcome pig-to-human coagulation incompatibility. Methods: hCD46/HLA-E double-transgenic as well as wild-type pig forelimbs were perfused ex vivo with whole heparinized human blood and autologous blood, respectively. Blood samples were analyzed for production of porcine and/or human inflammatory cytokines. Biopsy samples were examined for deposition of complement proteins as well as E-selectin and VCAM-1 expression. Serial blood cell counts were performed to analyze changes in human blood cell populations. In vitro, pig aortic endothelial cells (PAEC) were analyzed for ASGR1-mediated human platelet phagocytosis. In addition, a biochemical assay was performed using hTM-only and multi-transgenic (GalTKO/hCD46/hTM) PAEC to evaluate the ability of hTM to generate activated protein C (APC). Subsequently, the anti-coagulant properties of hTM were tested in a microcarrier-based coagulation assay with PAEC and human whole blood. Results: No hyperacute rejection was seen in the ex vivo perfusion model. Extremity perfusions lasted for up to 12 h without an increase in vascular resistance and had to be terminated due to continuous small blood losses. Plasma levels of porcine IL-1β (P < 0.0001) and IL-8 (P = 0.019), as well as of human C3a, C5a and soluble C5b-9, were significantly lower (P < 0.05 to < 0.0001) in blood perfused through hCD46/HLA-E transgenic limbs as compared to wild-type limbs. C3b/c, C4b/c, and C6 deposition as well as E-selectin and VCAM-1 expression were significantly (P < 0.0001) higher in tissue of wild-type as compared to transgenic limbs. Preliminary immunofluorescence staining showed that expression of hCD46/HLA-E is associated with a reduction of NK-cell tissue infiltration (P < 0.05). A rapid decrease of platelets was observed in all xenoperfusions. In vitro findings showed that PAEC express ASGR1 and suggest that this molecule is involved in human platelet phagocytosis. In vitro, we found that the amount of APC in the supernatant of hTM-transgenic cells increased significantly (P < 0.0001) with protein C concentration in a dose-dependent manner, whereas in control PAEC lacking hTM the turnover of protein C remained at the basal level for all examined concentrations. In further experiments, hTM also prevented blood coagulation, with a three- to four-fold increase (P < 0.001) in clotting time as compared to wild-type PAEC. The formation of thrombin-antithrombin (TAT) complexes was significantly lower (P < 0.0001) when hTM-transgenic cells were used as compared to wild-type cells. Conclusions: Transgenic hCD46/HLA-E expression clearly reduced humoral xenoresponses, since the terminal pathway of complement, endothelial cell activation, inflammatory cytokine production and NK-cell tissue infiltration were all down-regulated. We also found ASGR1 expression on the vascular endothelium of pigs, and this molecule may thus be involved in binding and phagocytosis of human platelets during pig-to-human xenotransplantation. In addition, use of the hTM transgene has the potential to overcome coagulation incompatibilities in pig-to-human xenotransplantation.
Abstract:
In this paper, we propose a fully automatic, robust approach for segmenting the proximal femur in conventional X-ray images. Our method is based on hierarchical landmark detection by random forest regression: the detection results of 22 global landmarks are used for spatial normalization, and the detection results of 59 local landmarks serve as the image cue for instantiating a statistical shape model of the proximal femur. To detect landmarks at both levels, we use multi-resolution HoG (Histogram of Oriented Gradients) features, which achieve better accuracy and robustness. The efficacy of the present method is demonstrated by experiments conducted on 150 clinical X-ray images. It was found that the present method could achieve an average point-to-curve error of 2.0 mm and that it was robust to low image contrast, noise and occlusions caused by implants.
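The sketch below illustrates the general idea of landmark detection by random forest regression on multi-resolution HoG features; it is not the authors' implementation, and the patch sizes, scales, sampling offsets and tree counts are illustrative assumptions.

```python
# Hedged sketch (not the paper's code): regress the displacement from sampled
# positions to a landmark using multi-resolution HoG features, then detect the
# landmark by averaging the votes of candidate positions.
import numpy as np
from skimage.feature import hog
from skimage.transform import rescale
from sklearn.ensemble import RandomForestRegressor

def patch_features(image, center, half=24, scales=(1.0, 0.5)):
    """Multi-resolution HoG descriptor for a square patch around `center`."""
    y, x = center
    feats = []
    for s in scales:
        img = rescale(image, s, anti_aliasing=True)
        yy, xx = int(y * s), int(x * s)
        patch = img[yy - half:yy + half, xx - half:xx + half]
        feats.append(hog(patch, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)))
    return np.concatenate(feats)

def train_landmark_regressor(images, landmarks, samples_per_image=200, rng=None):
    """Each training sample pairs local HoG features with the displacement to the true landmark."""
    rng = rng or np.random.default_rng(0)
    X, y = [], []
    for img, lm in zip(images, landmarks):
        for _ in range(samples_per_image):
            offset = rng.integers(-30, 31, size=2)
            pos = (int(lm[0] + offset[0]), int(lm[1] + offset[1]))
            X.append(patch_features(img, pos))
            y.append(np.asarray(lm) - np.asarray(pos))   # regression target: displacement
    return RandomForestRegressor(n_estimators=100).fit(np.array(X), np.array(y))

def detect_landmark(model, image, candidates):
    """Let each candidate position vote for the landmark and average the votes."""
    votes = [np.asarray(c) + model.predict([patch_features(image, c)])[0]
             for c in candidates]
    return np.mean(votes, axis=0)
```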
Abstract:
Claystones are considered worldwide as barrier materials for nuclear waste repositories. In the Mont Terri underground research laboratory (URL), a nearly 4-year diffusion and retention (DR) experiment has been performed in Opalinus Clay. It aimed at (1) obtaining data at larger space and time scales than in laboratory experiments, (2) doing so under relevant in situ conditions with respect to pore water chemistry and mechanical stress, (3) quantifying the anisotropy of in situ diffusion, and (4) exploring possible effects of a borehole-disturbed zone. The experiment included two tracer injection intervals in a borehole perpendicular to bedding, through which traced artificial pore water (APW) was circulated, and a pressure monitoring interval. The APW was spiked with neutral tracers (HTO, HDO, H2O-18), anions (Br, I, SeO4), and cations (Na-22, Ba-133, Sr-85, Cs-137, Co-60, Eu-152, stable Cs, and stable Eu). Most tracers were added at the beginning; some were added at a later stage. The hydraulic pressure in the injection intervals was adjusted to the value measured in the pressure monitoring interval to ensure transport by diffusion only. Concentration time-series in the APW within the borehole intervals were obtained, as well as 2D concentration distributions in the rock at the end of the experiment after overcoring and subsampling, which resulted in ≈250 samples and ≈1300 analyses. As expected, HTO diffused the furthest into the rock, followed by the anions (Br, I, SeO4) and by the cationic sorbing tracers (Na-22, Ba-133, Cs, Cs-137, Co-60, Eu-152). The diffusion of SeO4 was slower than that of Br or I, approximately in proportion to the ratio of their diffusion coefficients in water. Ba-133 diffused only ≈0.1 m into the rock during the ≈4 a of the experiment. Stable Cs, added at a higher concentration than Cs-137, diffused further into the rock than Cs-137, consistent with non-linear sorption behavior. The rock properties (e.g., water contents) were rather homogeneous at the centimeter scale, with no evidence of a borehole-disturbed zone. In situ anisotropy ratios for diffusion, derived for the first time directly from field data, are larger for HTO and Na-22 (≈5) than for anions (≈3–4 for Br and I). The lower ionic strength of the pore water at this location (≈0.22 M) as compared to the locations of earlier experiments in the Mont Terri URL (≈0.39 M) had no notable effect on the anion-accessible pore fraction for Cl, Br, and I: the value of 0.55 is within the range of earlier data. Detailed transport simulations involving different codes will be presented in a companion paper.
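A minimal sketch of the kind of diffusion estimate behind the ordering of penetration depths reported above: 1D diffusion from a constant-concentration borehole interval into a semi-infinite matrix, with sorption folded into an apparent diffusivity. It is not the experiment's model, and all parameter values are illustrative assumptions.

```python
# Hedged sketch: C/C0 = erfc(x / (2*sqrt(Da*t))) with apparent diffusivity
# Da = De / (eps + rho_d*Kd).  Parameter values below are assumptions, not
# fitted values from the DR experiment.
import numpy as np
from scipy.special import erfc

def apparent_diffusivity(De, eps, rho_d=2.3e3, Kd=0.0):
    """Da [m^2/s] from effective diffusivity De, porosity eps and sorption Kd [m^3/kg]."""
    return De / (eps + rho_d * Kd)

def profile(x, t, De, eps, Kd=0.0):
    """Relative concentration C/C0 at distance x [m] from the source after time t [s]."""
    Da = apparent_diffusivity(De, eps, Kd)
    return erfc(x / (2.0 * np.sqrt(Da * t)))

four_years = 4 * 365.25 * 24 * 3600.0
x = np.linspace(0.0, 0.4, 81)
# Illustrative comparison: a non-sorbing tracer (HTO-like) vs. a sorbing cation.
c_neutral = profile(x, four_years, De=1.0e-11, eps=0.15)
c_sorbing = profile(x, four_years, De=1.0e-11, eps=0.15, Kd=0.01)
print("non-sorbing tracer reaches C/C0=0.01 at ~%.2f m" % x[np.argmax(c_neutral < 0.01)])
print("sorbing tracer reaches C/C0=0.01 at ~%.2f m" % x[np.argmax(c_sorbing < 0.01)])
```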
Abstract:
A reliable and robust routing service for Flying Ad-Hoc Networks (FANETs) must be able to adapt to topology changes. User experience when watching live video sequences must also remain satisfactory, even in scenarios with buffer overflow and high packet loss ratios. In this paper, we introduce a Cross-layer Link quality and Geographical-aware beaconless opportunistic routing protocol (XLinGO). It enhances the transmission of multiple simultaneous video flows over FANETs by creating and maintaining reliable, persistent multi-hop routes. XLinGO considers a set of cross-layer and human-related information for routing decisions, such as performance metrics and Quality of Experience (QoE). Performance evaluation shows that XLinGO achieves multimedia dissemination with QoE support and robustness in multi-hop, multi-flow, and mobile network environments.
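The following sketch illustrates the general shape of a cross-layer, geographic forwarding decision of the kind described above; the specific metrics (ETX, buffer occupancy) and weights are assumptions for illustration, not the published XLinGO formula.

```python
# Hedged sketch: score candidate next hops in a beaconless opportunistic
# geographic routing scheme by combining geographic progress toward the
# destination, a cross-layer link-quality estimate, and residual buffer space.
import math
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: str
    pos: tuple          # (x, y) coordinates
    etx: float          # expected transmission count (>= 1.0)
    buffer_free: float  # fraction of queue space available, 0..1

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def forwarding_score(sender_pos, dest_pos, nb, w_geo=0.5, w_link=0.3, w_buf=0.2):
    """Higher is better; the weights are illustrative assumptions."""
    progress = distance(sender_pos, dest_pos) - distance(nb.pos, dest_pos)
    geo = max(progress, 0.0) / max(distance(sender_pos, dest_pos), 1e-9)
    link = 1.0 / nb.etx
    return w_geo * geo + w_link * link + w_buf * nb.buffer_free

def pick_next_hop(sender_pos, dest_pos, neighbors):
    """Choose the neighbor with the best combined score as relay."""
    return max(neighbors, key=lambda nb: forwarding_score(sender_pos, dest_pos, nb))
```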
Abstract:
The Hamamatsu R11410 photomultiplier, a tube of 3" diameter and with a very low intrinsic radioactivity, is an interesting light sensor candidate for future experiments using liquid xenon (LXe) as a target for direct dark matter searches. We have performed several tests of the R11410 to evaluate its performance in environments similar to a dark matter detector setup. In particular, we examined its long-term behavior and stability in LXe and its response in various electric field configurations.
Abstract:
Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict performance requirements. It is often the case that such applications have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies with the CloudSim simulator, using data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities. We then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
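A minimal sketch of an SLA-aware scaling policy of the kind that can be compared in such a simulation; it shows the policy logic only (in Python, not CloudSim's Java API), and the thresholds, cooldown and response-time SLA are assumptions.

```python
# Hedged sketch: a threshold-based horizontal scaling policy that reacts to
# CPU utilisation and an SLA response-time target, with a cooldown to avoid
# oscillation.  All numeric values are illustrative.
from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    sla_response_ms: float = 200.0   # SLA target for mean response time
    cpu_high: float = 0.80           # scale out above this utilisation
    cpu_low: float = 0.30            # scale in below this utilisation
    cooldown_steps: int = 3
    _since_last_action: int = 999

    def decide(self, mean_cpu, mean_response_ms, n_vms):
        """Return the desired number of VMs for the next control interval."""
        self._since_last_action += 1
        if self._since_last_action < self.cooldown_steps:
            return n_vms
        if mean_response_ms > self.sla_response_ms or mean_cpu > self.cpu_high:
            self._since_last_action = 0
            return n_vms + 1                      # scale out to protect the SLA
        if mean_cpu < self.cpu_low and n_vms > 1:
            self._since_last_action = 0
            return n_vms - 1                      # scale in to save energy
        return n_vms

# Example control-loop step:
policy = ScalingPolicy()
print(policy.decide(mean_cpu=0.9, mean_response_ms=250.0, n_vms=4))  # -> 5
```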
Abstract:
Cloud Computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, the objectives of managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints lead to challenges for maintaining optimal resource control. Furthermore, conflicting objectives in the management of cloud infrastructure and distributed applications might lead to violations of SLAs and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as an input to the cloud management system (CMS), increasing the efficiency of resource allocation as well as of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimised resource allocation in infrastructure clouds. Third, we describe a method for discovering relations between the performance indicators of services belonging to distributed applications and then using these relations to build scaling rules that a CMS can use for automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms, which optimally scale systems composed of VMs based on given SLA performance constraints. All of the presented research was implemented and tested using enterprise distributed applications.
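The sketch below shows one plausible way to turn discovered performance relations into a scaling rule, as in the third contribution: fit a simple relation between per-VM load and response time from monitoring traces, then invert it against an SLA target. It is an assumed approach for illustration, not the dissertation's exact method, and the trace values and SLA threshold are made up.

```python
# Hedged sketch: derive a scaling rule from benchmark-generated monitoring
# traces by fitting a linear relation between load per VM and response time,
# then compute the smallest VM count that keeps the SLA.
import numpy as np

# Illustrative trace: (concurrent users, VMs, mean response time in ms).
trace = np.array([
    [100, 1, 120], [200, 1, 210], [300, 2, 160],
    [400, 2, 230], [600, 3, 220], [800, 4, 210],
], dtype=float)

users, vms, rt = trace[:, 0], trace[:, 1], trace[:, 2]
load_per_vm = users / vms
slope, intercept = np.polyfit(load_per_vm, rt, deg=1)   # rt ~ slope*load + intercept

def required_vms(expected_users, sla_ms=200.0):
    """Smallest VM count whose predicted response time meets the SLA."""
    max_load = (sla_ms - intercept) / slope
    return max(1, int(np.ceil(expected_users / max_load)))

print(required_vms(900))   # VMs needed for 900 users under a 200 ms SLA
```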
Abstract:
ATLAS measurements of the azimuthal anisotropy in lead-lead collisions at √s_NN = 2.76 TeV are shown using a dataset of approximately 7 μb^-1 collected at the LHC in 2010. The measurements are performed for charged particles with transverse momenta 0.5 < pT < 20 GeV and in the pseudorapidity range |η| < 2.5. The anisotropy is characterized by the Fourier coefficients, vn, of the charged-particle azimuthal angle distribution for n = 2–4. The Fourier coefficients are evaluated using multi-particle cumulants calculated with the generating function method. Results on the transverse momentum, pseudorapidity and centrality dependence of the vn coefficients are presented. The elliptic flow, v2, is obtained from the two-, four-, six- and eight-particle cumulants, while the higher-order coefficients, v3 and v4, are determined with two- and four-particle cumulants. Flow harmonics vn measured with four-particle cumulants are significantly reduced compared to the measurement involving two-particle cumulants. A comparison to vn measurements obtained using different analysis methods and previously reported by the LHC experiments is also shown. Results of measurements of flow fluctuations evaluated with multi-particle cumulants are shown as a function of transverse momentum and collision centrality. Models of the initial spatial geometry and its fluctuations fail to describe the flow fluctuation measurements.
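For orientation, the sketch below estimates v_n{2} and v_n{4} from two- and four-particle correlations using the direct Q-cumulant formulation; the ATLAS analysis quoted above uses the generating-function method, so this is an analogous illustration rather than the paper's algorithm, and it omits multiplicity weighting when averaging over events.

```python
# Hedged sketch: flow coefficients from two- and four-particle Q-cumulants.
# Input: per-event lists of azimuthal angles phi of charged particles.
import numpy as np

def event_correlations(phis, n=2):
    """Single-event <2> and <4> azimuthal correlations for harmonic n (requires M >= 4)."""
    phis = np.asarray(phis)
    M = len(phis)
    Qn = np.sum(np.exp(1j * n * phis))
    Q2n = np.sum(np.exp(2j * n * phis))
    two = (abs(Qn) ** 2 - M) / (M * (M - 1))
    four = (abs(Qn) ** 4 + abs(Q2n) ** 2
            - 2 * np.real(Q2n * np.conj(Qn) * np.conj(Qn))
            - 2 * (2 * (M - 2) * abs(Qn) ** 2 - M * (M - 3))
            ) / (M * (M - 1) * (M - 2) * (M - 3))
    return two, four

def vn_from_cumulants(events, n=2):
    """Event-averaged v_n{2} and v_n{4} (simple mean over events)."""
    twos, fours = zip(*(event_correlations(p, n) for p in events))
    c2 = np.mean(twos)
    c4 = np.mean(fours) - 2 * c2 ** 2
    vn2 = np.sqrt(c2) if c2 > 0 else np.nan
    vn4 = (-c4) ** 0.25 if c4 < 0 else np.nan
    return vn2, vn4
```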
Abstract:
The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussians (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to reduce the burden often imposed on users: only a single image that includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm in various locomotion environments and compare its performance with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans segmentation and ‘skeletonizing’ across a wide range of motility assays.
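A simplified analogue of the MEME idea is sketched below: learn Gaussian mixture models for background and worm pixel intensities from one annotated image, segment new frames by comparing per-pixel likelihoods, and thin the result to a skeleton. This is not the authors' code; model sizes and the minimum-object threshold are assumptions.

```python
# Hedged sketch: MOG-based foreground/background segmentation and skeleton
# extraction for a nematode, learned from a single labeled example image.
import numpy as np
from sklearn.mixture import GaussianMixture
from skimage.morphology import skeletonize, remove_small_objects

def learn_models(image, worm_mask, n_components=3):
    """Fit intensity GMMs for worm and background pixels from one labeled image."""
    bg = GaussianMixture(n_components).fit(image[~worm_mask].reshape(-1, 1))
    fg = GaussianMixture(n_components).fit(image[worm_mask].reshape(-1, 1))
    return bg, fg

def segment(image, bg, fg, min_size=200):
    """Label pixels as worm where the worm model is more likely than the background model."""
    px = image.reshape(-1, 1)
    mask = (fg.score_samples(px) > bg.score_samples(px)).reshape(image.shape)
    return remove_small_objects(mask, min_size=min_size)

def worm_skeleton(mask):
    """Thin the segmented worm to a one-pixel-wide skeleton for motility quantification."""
    return skeletonize(mask)
```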
Abstract:
Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Due to the catastrophic events of recent years, the public is more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards have led to an altered perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings) and vulnerability) between 1950 and 2000 are quantified. An additional focus is put on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. The focus of the study was on multi-temporal risk assessment; consequently, the models used could be replaced with other snow avalanche models providing the same functionalities. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of the buildings is understood as the degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings, depending on avalanche pressure, was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk. This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
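The structure of such a risk calculation is sketched below: risk per track and epoch as the sum over exposed buildings of scenario probability times building value times pressure-dependent vulnerability. The vulnerability ramps, scenario probabilities and building data are illustrative assumptions, not the study's calibrated functions.

```python
# Hedged sketch: multi-temporal avalanche risk as
#   risk = sum over buildings and scenarios of p(scenario) * value * vulnerability(pressure).
def vulnerability(pressure_kpa, construction="masonry"):
    """Degree of loss in [0, 1]; the linear ramps below are assumptions."""
    thresholds = {"masonry": (3.0, 30.0), "concrete": (10.0, 60.0)}
    low, high = thresholds[construction]
    if pressure_kpa <= low:
        return 0.0
    if pressure_kpa >= high:
        return 1.0
    return (pressure_kpa - low) / (high - low)

def annual_risk(buildings, scenarios):
    """
    buildings: list of dicts {value_eur, construction, pressures: {scenario: kPa}}
    scenarios: dict {scenario name: annual probability}
    """
    risk = 0.0
    for b in buildings:
        for name, prob in scenarios.items():
            p = b["pressures"].get(name, 0.0)
            risk += prob * b["value_eur"] * vulnerability(p, b["construction"])
    return risk

# Compare epochs by re-running with that epoch's building stock and scenarios:
scenarios_1950 = {"frequent": 1 / 10, "rare": 1 / 150}
buildings_1950 = [{"value_eur": 2.0e5, "construction": "masonry",
                   "pressures": {"frequent": 5.0, "rare": 25.0}}]
print(annual_risk(buildings_1950, scenarios_1950))
```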
Abstract:
Information-centric networking (ICN) is a promising approach for wireless communication because users can exploit the broadcast nature of the wireless medium to quickly find desired content at nearby nodes. However, wireless multi-hop communication is prone to collisions, and it is crucial to quickly detect and react to them to optimize transmission times and avoid spurious retransmissions. Several adaptive retransmission timers have been used in the related ICN literature, but they have not been compared and evaluated in wireless multi-hop environments. In this work, we evaluate these existing algorithms in wireless multi-hop communication. We find that existing algorithms are not optimized for wireless communication, but slight modifications can result in considerably better performance without increasing the number of transmitted Interests.
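As an example of the class of adaptive retransmission timers being compared, the sketch below adapts the classic TCP-style SRTT/RTTVAR estimator (RFC 6298 constants) to per-Interest timeouts; it is shown for illustration and is not the specific variant used by any particular ICN implementation.

```python
# Hedged sketch: adaptive retransmission timeout (RTO) for Interest packets
# using exponentially weighted RTT mean and deviation.
class AdaptiveRto:
    def __init__(self, alpha=0.125, beta=0.25, k=4.0,
                 min_rto=0.02, max_rto=4.0):
        self.alpha, self.beta, self.k = alpha, beta, k
        self.min_rto, self.max_rto = min_rto, max_rto
        self.srtt = None
        self.rttvar = None

    def on_data(self, rtt_sample):
        """Update the estimator with the RTT of a satisfied Interest (seconds)."""
        if self.srtt is None:
            self.srtt = rtt_sample
            self.rttvar = rtt_sample / 2.0
        else:
            self.rttvar = ((1 - self.beta) * self.rttvar
                           + self.beta * abs(self.srtt - rtt_sample))
            self.srtt = (1 - self.alpha) * self.srtt + self.alpha * rtt_sample

    def rto(self):
        """Current retransmission timeout in seconds, clamped to [min_rto, max_rto]."""
        if self.srtt is None:
            return 1.0                      # conservative initial timeout
        return min(self.max_rto, max(self.min_rto, self.srtt + self.k * self.rttvar))
```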
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all her/his applications are met. She/he is also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure provider is interested in optimally provisioning the virtual resources onto the available physical infrastructure so that her/his operational costs are minimized, while the performance of tenants’ applications is maximized. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of application services bound to virtual machines (VMs). We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
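A generic genetic algorithm for VM placement is sketched below to illustrate the multi-objective allocation idea; it is not the thesis' implementation. The fitness weights, the energy proxy (number of active hosts), the SLA proxy (overcommitment), and the operators are illustrative assumptions.

```python
# Hedged sketch: GA for VM-to-host allocation; each chromosome assigns every
# VM to a host, and fitness (lower is better) trades off an energy proxy
# against an overcommitment penalty standing in for SLA risk.
import random

def fitness(assignment, vm_cpu, host_cpu, w_energy=1.0, w_sla=10.0):
    load = [0.0] * len(host_cpu)
    for vm, host in enumerate(assignment):
        load[host] += vm_cpu[vm]
    active = sum(1 for l in load if l > 0)
    overcommit = sum(max(0.0, l - cap) for l, cap in zip(load, host_cpu))
    return w_energy * active + w_sla * overcommit

def evolve(vm_cpu, host_cpu, pop_size=60, generations=200,
           mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    n_vms, n_hosts = len(vm_cpu), len(host_cpu)
    pop = [[rng.randrange(n_hosts) for _ in range(n_vms)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: fitness(a, vm_cpu, host_cpu))
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_vms)
            child = p1[:cut] + p2[cut:]            # one-point crossover
            for i in range(n_vms):                 # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] = rng.randrange(n_hosts)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda a: fitness(a, vm_cpu, host_cpu))

best = evolve(vm_cpu=[0.4, 0.3, 0.6, 0.2, 0.5], host_cpu=[1.0, 1.0, 1.0])
print(best)   # host index chosen for each VM
```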
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power, which work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker states he or she is of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g., the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements, and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
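For readers unfamiliar with the Core mentioned in Chapter 3, the textbook definition is sketched below for a game in characteristic-function form (the thesis works with richer partition functions); the example game and its values are purely illustrative.

```python
# Hedged sketch: an allocation is in the Core if it is efficient and no
# coalition can obtain more on its own than the sum of its members' payoffs.
from itertools import combinations

def in_core(allocation, value):
    """
    allocation: dict player -> payoff
    value: function mapping a frozenset of players to that coalition's worth
    """
    players = list(allocation)
    grand = frozenset(players)
    if abs(sum(allocation.values()) - value(grand)) > 1e-9:
        return False                     # allocation must be efficient
    for r in range(1, len(players)):
        for coalition in combinations(players, r):
            c = frozenset(coalition)
            if sum(allocation[p] for p in c) < value(c) - 1e-9:
                return False             # this coalition would block
    return True

# Illustrative three-player game: the grand coalition is worth 1, any pair 0.75.
v = lambda c: 1.0 if len(c) == 3 else (0.75 if len(c) == 2 else 0.0)
print(in_core({"A": 1/3, "B": 1/3, "C": 1/3}, v))   # False: any pair can block
```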
Abstract:
The evolution of porosity due to dissolution/precipitation processes of minerals and the associated change of transport parameters are of major interest for natural geological environments and engineered underground structures. We designed a reproducible and fast-to-conduct 2D experiment, which is flexible enough to investigate several process couplings implemented in the numerical code OpenGeoSys-GEM (OGS-GEM). We investigated advective-diffusive transport of solutes, the effect of liquid-phase density on advective transport, and kinetically controlled dissolution/precipitation reactions causing porosity changes. In addition, the system allowed us to investigate the influence of microscopic (pore-scale) processes on macroscopic (continuum-scale) transport. A Plexiglas tank of dimension 10 × 10 cm was filled with a 1 cm thick reactive layer consisting of a bimodal grain size distribution of celestite (SrSO4) crystals, sandwiched between two layers of sand. A barium chloride solution was injected into the tank, causing an asymmetric flow field to develop. As the barium chloride reached the celestite region, dissolution of celestite was initiated and barite precipitated. Due to the higher molar volume of barite, its precipitation caused a porosity decrease and thus also a decrease in the permeability of the porous medium. The change of flow in space and time was observed via injection of conservative tracers and analysis of effluents. In addition, an extensive post-mortem analysis of the reacted medium was conducted. We could successfully model the flow (with and without fluid density effects) and the transport of conservative tracers with a (continuum-scale) reactive transport model. The prediction of the reactive experiments initially failed. Only the inclusion of information from the post-mortem analysis gave a satisfactory match for the case where the flow field changed due to dissolution/precipitation reactions. We therefore concentrated on the refinement of the post-mortem analysis and the investigation of the dissolution/precipitation mechanisms at the pore scale. Our analytical techniques combined scanning electron microscopy (SEM) with synchrotron X-ray micro-diffraction/micro-fluorescence performed at the XAS beamline (Swiss Light Source). The newly formed phases include epitaxial growth of barite micro-crystals on large celestite crystals and a nano-crystalline barite phase (resulting from the dissolution of small celestite crystals) with residues of celestite crystals in the pore interstices. Classical nucleation theory, using well-established and estimated parameters describing barite precipitation, was applied to explain the mineralogical changes occurring in our system. Our pore-scale investigation showed the limits of the continuum-scale reactive transport model. Although kinetic effects were implemented by fixing two distinct rates for the dissolution of large and small celestite crystals, instantaneous precipitation of barite was assumed as soon as supersaturation occurred. Precipitation kinetics, passivation of large celestite crystals and the metastability of supersaturated solutions, i.e. the conditions under which nucleation cannot occur despite high supersaturation, were neglected. These results will be used to develop a pore-scale model that describes precipitation and dissolution of crystals at the pore scale for various transport and chemical conditions.
Pore-scale modelling can be used to parameterize constitutive equations that introduce pore-scale corrections into macroscopic (continuum) reactive transport models. Microscopic understanding of the system is fundamental for modelling from the pore to the continuum scale.
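To illustrate the classical nucleation theory argument invoked above, the sketch below evaluates the textbook homogeneous nucleation rate as a function of the saturation ratio, showing why nucleation is effectively suppressed below a critical supersaturation. The interfacial energy, molecular volume and pre-exponential factor are illustrative assumptions, not the study's calibrated parameters.

```python
# Hedged sketch: classical nucleation theory,
#   dG* = 16*pi*gamma^3*v^2 / (3*(kB*T*ln S)^2),  J = J0 * exp(-dG*/(kB*T)).
import math

KB = 1.380649e-23          # Boltzmann constant, J/K

def nucleation_rate(S, T=298.15, gamma=0.13, v=8.6e-29, J0=1e30):
    """
    S:     saturation ratio (ion activity product / solubility product)
    gamma: mineral-water interfacial energy [J/m^2]   (assumed value)
    v:     volume of one formula unit [m^3]           (assumed value)
    J0:    kinetic pre-exponential factor [1/(m^3 s)] (assumed value)
    Returns nuclei per m^3 per second.
    """
    if S <= 1.0:
        return 0.0
    dG_star = 16.0 * math.pi * gamma**3 * v**2 / (3.0 * (KB * T * math.log(S))**2)
    return J0 * math.exp(-dG_star / (KB * T))

for S in (10, 100, 1000, 10000):
    print(S, "%.3e" % nucleation_rate(S))
```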