21 results for LV Network Constraints
at Universidade do Minho
Abstract:
Nowadays, many P2P applications proliferate on the Internet. The attractiveness of many of these systems relies on the collaborative approach used to exchange large resources without the dependence and associated constraints of centralized approaches, where a single server is responsible for handling all the requests from clients. As a consequence, some P2P systems are also interesting and cost-effective approaches to be adopted by content providers and other Internet players. However, there are several coexistence problems between P2P applications and Internet Service Providers (ISPs) due to the unforeseeable behavior of P2P traffic aggregates in ISP infrastructures. In this context, this work proposes a collaborative P2P/ISP system able to underpin the development of novel Traffic Engineering (TE) mechanisms, contributing to a better coexistence between P2P applications and ISPs. Using the devised system, two TE methods are described that are able to estimate and control the impact of P2P traffic aggregates on the ISP network links. One of the TE methods allows ISP administrators to foresee the expected impact that a given P2P swarm will have on the underlying network infrastructure. The other TE method enables the definition of ISP-friendly P2P topologies, in which specific network links are protected from P2P traffic. As a result, the proposed system and associated mechanisms will contribute to improved ISP resource management tasks and foster the deployment of innovative ISP-friendly systems.
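As a rough illustration of the first TE method, the expected load a swarm places on each ISP link can be sketched by summing per-pair traffic over routing paths. The topology, the uniform per-pair exchange rate, and all names below are hypothetical, not the thesis's actual model:

```python
def estimate_link_load(paths, rate_per_pair):
    """Sum expected P2P traffic over each ISP link.

    paths: {(peer_a, peer_b): [link, ...]} routing path per peer pair.
    rate_per_pair: assumed uniform exchange rate between any two peers.
    """
    load = {}
    for pair, links in paths.items():
        for link in links:
            load[link] = load.get(link, 0.0) + rate_per_pair
    return load

# Toy topology: three peers; two of the three paths share link "L1".
paths = {
    ("p1", "p2"): ["L1", "L2"],
    ("p1", "p3"): ["L1", "L3"],
    ("p2", "p3"): ["L4"],
}
loads = estimate_link_load(paths, rate_per_pair=2.0)
print(loads["L1"])  # 4.0 -> "L1" carries two peer pairs, a candidate for protection
```

A swarm-aware tracker could use such an estimate in reverse: bias neighbor selection away from peer pairs whose paths cross links the ISP wants protected, which is the spirit of the second TE method.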
Abstract:
We investigate the impact of cross-delisting on firms’ financial constraints and investment sensitivities. We find that firms that cross-delisted from a U.S. stock exchange face stronger post-delisting financial constraints than their cross-listed counterparts, as measured by investment-to-cash flow sensitivity. Following a delisting, the sensitivity of investment to cash flow increases significantly and firms also tend to save more cash out of cash flows. Moreover, this increase appears to be primarily driven by informational frictions that constrain access to external financing. We document that information asymmetry problems are stronger for firms from countries with weaker shareholder protection and for firms from less developed capital markets.
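The investment-to-cash-flow sensitivity used here is, in essence, the slope of a regression of investment on cash flow. A minimal sketch with made-up firm-year numbers (the data and the simple univariate OLS are illustrative only; the study's actual specification includes controls):

```python
def ols_slope(x, y):
    """Least-squares slope of y on x (univariate OLS, no intercept reported)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# Hypothetical firm-year data: cash flow and investment, both scaled by assets.
cf_pre,  inv_pre  = [0.10, 0.20, 0.30, 0.40], [0.050, 0.060, 0.070, 0.080]
cf_post, inv_post = [0.10, 0.20, 0.30, 0.40], [0.050, 0.080, 0.110, 0.140]

print(ols_slope(cf_pre, inv_pre))   # ~0.1: weak sensitivity while cross-listed
print(ols_slope(cf_post, inv_post)) # ~0.3: stronger sensitivity after delisting
```

A larger post-delisting slope is read as a tighter financing constraint: investment tracks internally generated cash more closely when external funds are harder to raise.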
Abstract:
We present a study on human mobility at small spatial scales. Unlike large-scale mobility, recently studied through dollar-bill tracking and mobile-phone data sets within one large country or continent, we report Brownian features of human mobility at smaller scales. In particular, the scaling exponent found at the smallest scales is typically close to one-half, in contrast to the larger values of the exponent characterizing mobility at larger scales. We carefully analyze 12 months of data from the Eduroam database within the Portuguese University of Minho. A full procedure is introduced with the aim of properly characterizing human mobility within the network of access points composing the wireless system of the university. In particular, measures of flux are introduced for estimating a distance between access points. This distance is typically non-Euclidean, since the spatial constraints at such small scales distort the continuum space on which human mobility occurs. Since two different exponents are found depending on the scale at which human motion takes place, we raise the question of at which scale the transition from Brownian to non-Brownian motion occurs. In this context, we discuss how the numerical approach can be extended to larger scales, using the full Eduroam network in Europe and in Asia, to uncover the transition between the two dynamical regimes.
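The scaling exponent mentioned above is conventionally estimated as the slope of a log-log fit of displacement against time, with a slope near one-half indicating Brownian behavior. A minimal sketch on synthetic data (the data and names are illustrative, not the paper's actual pipeline):

```python
import math

def scaling_exponent(times, displacements):
    """Least-squares slope of log(displacement) vs log(time)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(d) for d in displacements]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic Brownian signal: r.m.s. displacement grows as t**0.5.
times = list(range(1, 200))
disp = [t ** 0.5 for t in times]
print(round(scaling_exponent(times, disp), 2))  # 0.5 -> Brownian regime
```

Repeating the fit at increasing spatial scales, and watching the slope drift away from one-half, is one way to locate the Brownian/non-Brownian transition the abstract asks about.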
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing administrators to attain routing configurations that are robust to changes in the traffic demands and that keep the network stable even in the presence of link-failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
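The weight-setting problem behind such TE methods can be caricatured with a toy evolutionary loop: mutate one link weight at a time and keep the child if the maximum link utilization does not worsen. Everything below (the three-link topology, the capacities, and the (1+1) hill-climb standing in for a full MOEA) is a hypothetical sketch, not the framework's actual engine:

```python
import random

# Toy network: demand from s to t can use the direct link "s-t" or the
# two-hop path via m. Link weights decide the shortest path; the objective
# is the maximum link utilization, which the loop tries to minimize.
CAPACITY = {"s-t": 5.0, "s-m": 10.0, "m-t": 10.0}
DEMAND = 8.0

def max_utilization(weights):
    direct, via_m = weights["s-t"], weights["s-m"] + weights["m-t"]
    if direct < via_m:        # all traffic on the direct link
        return DEMAND / CAPACITY["s-t"]
    if via_m < direct:        # all traffic via m
        return max(DEMAND / CAPACITY["s-m"], DEMAND / CAPACITY["m-t"])
    # equal cost: split the demand evenly across both paths (ECMP)
    return max(DEMAND / 2 / CAPACITY[link] for link in CAPACITY)

def evolve(steps=300, seed=1):
    rng = random.Random(seed)
    best = {link: rng.randint(1, 20) for link in CAPACITY}
    for _ in range(steps):
        child = dict(best)
        child[rng.choice(list(child))] = rng.randint(1, 20)  # mutate one weight
        if max_utilization(child) <= max_utilization(best):  # accept ties too
            best = child
    return best

best = evolve()
print(max_utilization(best))  # routes via m (or ECMP), sparing the 5.0-capacity link
```

Accepting ties lets equal-cost weight settings drift, which is enough for this toy search to escape the congested direct-link configuration; the actual framework explores many objectives (demand robustness, failure resilience) simultaneously with a MOEA.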
Abstract:
PhD Thesis in Bioengineering
Abstract:
A search is performed for Higgs bosons produced in association with top quarks using the diphoton decay mode of the Higgs boson. Selection requirements are optimized separately for leptonic and fully hadronic final states from the top quark decays. The dataset used corresponds to an integrated luminosity of 4.5 fb−1 of proton-proton collisions at a center-of-mass energy of 7 TeV and 20.3 fb−1 at 8 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. No significant excess over the background prediction is observed and upper limits are set on the tt̄H production cross section. The observed exclusion upper limit at 95% confidence level is 6.7 times the predicted Standard Model cross section value. In addition, limits are set on the strength of the Yukawa coupling between the top quark and the Higgs boson, taking into account the dependence of the tt̄H and tH cross sections as well as the H→γγ branching fraction on the Yukawa coupling. Lower and upper limits at 95% confidence level are set at −1.3 and +8.0 times the Yukawa coupling strength in the Standard Model.
Abstract:
[Excerpt] The answer to the social and economic challenges that literacy (or its lack) is assumed to pose to developed countries deeply concerns the public policies of governments, namely those of the OECD area. In recent decades, these concerns have given rise to several and diverse monitoring devices, initiatives and programmes for (mainly) reading development, placing a strong stress on education. UNESCO (2006, p. 6), for instance, assumes that the literacy challenge can only be met by raising the quality of primary and secondary education and intensifying programmes explicitly oriented towards youth and adult literacy. (...)
Abstract:
Schizophrenia is a long-lasting state of mental uncertainty that may disrupt the relation among behavior, thought, and emotion; that is, it may lead to unreliable perception, inappropriate actions and feelings, and a sense of mental fragmentation. Indeed, its diagnosis is made over a long period of time; continuous signs of the disturbance must persist for at least six months. Once detected, the psychiatric diagnosis is made through a clinical interview and a series of psychological tests, aimed mainly at ruling out other mental states or diseases. Undeniably, the main problem in identifying schizophrenia is the difficulty of distinguishing its symptoms from those associated with other disorders or conditions. Therefore, this work will focus on the development of a diagnostic support system, in terms of its knowledge representation and reasoning procedures, based on a blend of Logic Programming and Artificial Neural Network approaches to computing, taking advantage of a novel approach to knowledge representation and reasoning that aims to solve the problems associated with handling (i.e., representing and reasoning about) defective information.
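The ANN side of such a hybrid system can be caricatured by the smallest possible classifier: a single logistic neuron mapping symptom scores to a diagnosis score. The features, labels, and training loop below are hypothetical illustrations, not the work's actual model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=2000, lr=0.5):
    """Stochastic gradient descent on a single logistic neuron."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - target  # gradient of the log-loss w.r.t. the pre-activation
            w = [w[0] - lr * err * x[0], w[1] - lr * err * x[1]]
            b -= lr * err
    return w, b

# Hypothetical (symptom-duration, test-score) pairs with 0/1 labels.
samples = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.8, 0.9), 1), ((0.9, 0.7), 1)]
w, b = train(samples)
score = sigmoid(w[0] * 0.85 + w[1] * 0.8 + b)
print(score > 0.5)  # True -> the toy neuron flags the high-symptom case
```

In the hybrid architecture the abstract describes, such a learned score would feed the Logic Programming layer rather than stand alone as a diagnosis.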
Abstract:
Thrombotic disorders have severe consequences for patients and for society in general, being one of the main causes of death. These facts show that it is extremely important to be preventive and to be aware of how probable it is to develop that kind of syndrome. Indeed, this work will focus on the development of a decision support system that will cater for an individual risk evaluation with respect to the onset of thrombotic complaints. The Knowledge Representation and Reasoning procedures used will be based on an extension to the Logic Programming language, allowing the handling of incomplete and/or default data. The computational framework in place will be centered on Artificial Neural Networks.
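The handling of incomplete data in extended Logic Programming can be illustrated with a three-valued (Kleene-style) evaluation, where an unknown factor propagates instead of being forced to true or false. The rule, the risk factors, and the numeric encoding below are hypothetical sketches, not the work's actual formalism:

```python
# Three-valued truth values: each risk factor is true, false, or unknown.
TRUE, UNKNOWN, FALSE = 1.0, 0.5, 0.0

def conj(values):
    """AND over three-valued factors (minimum, Kleene-style)."""
    return min(values)

def disj(values):
    """OR over three-valued factors (maximum, Kleene-style)."""
    return max(values)

# Hypothetical patient record: smoking status missing from the file.
factors = {"prior_thrombosis": FALSE, "immobility": TRUE, "smoking": UNKNOWN}

# Toy rule: high risk if prior thrombosis OR (immobility AND smoking).
risk = disj([factors["prior_thrombosis"],
             conj([factors["immobility"], factors["smoking"]])])
print(risk)  # 0.5 -> the record is too incomplete to decide either way
```

The point of the encoding is that missing data degrades the conclusion to "unknown" rather than silently defaulting it, which is exactly the behavior a clinical risk evaluation needs.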
Abstract:
Long-term exposure to transmeridian flights has been shown to impact cognitive functioning. Nevertheless, the immediate effects of jet lag on the activation of specific brain networks have not been investigated. We analyzed the impact of short-term jet lag on the activation of the default mode network (DMN). A group of individuals who were on a transmeridian flight and a control group went through a functional magnetic resonance imaging acquisition. Statistical analysis was performed to test for differences in DMN activation between groups. Participants from the jet lag group presented decreased activation in the anterior nodes of the DMN, specifically in the bilateral medial prefrontal and anterior cingulate cortex. No areas of increased activation were observed for the jet lag group. These results may be suggestive of a negative impact of jet lag on important cognitive functions, such as introspection, emotional regulation and decision making, in the days after individuals arrive at their destination.
Abstract:
Currently, the quality of the Indonesian national road network is inadequate due to several constraints, including overcapacity and overloaded trucks. The high deterioration rate of the road infrastructure in developing countries, along with major budgetary restrictions and high growth in traffic, has led to an emerging need to improve the performance of the highway maintenance system. However, the high number of intervening factors and their complex effects require advanced tools to successfully solve this problem. The high learning capabilities of Data Mining (DM) make it a powerful solution to this problem. In the past, these tools have been successfully applied to solve complex and multi-dimensional problems in various scientific fields. Therefore, it is expected that DM can be used to analyze the large amount of data regarding the pavement and traffic, identify the relationships between variables, and provide predictions from the data. In this paper, we present a new approach to predict the International Roughness Index (IRI) of pavement based on DM techniques. DM was used to analyze the initial IRI data, including age, Equivalent Single Axle Load (ESAL), cracks, potholes, rutting, and long cracks. This model was developed and verified using data from the Integrated Indonesia Road Management System (IIRMS) that was measured with the National Association of Australian State Road Authorities (NAASRA) roughness meter. The results of the proposed approach are compared with the IIRMS analytical model adapted to the IRI, and the advantages of the new approach are highlighted. We show that the novel data-driven model is able to learn (with high accuracy) the complex relationships between the IRI and the contributing factors of overloaded trucks.
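As a stand-in for the DM model, a k-nearest-neighbour predictor shows the data-driven idea in miniature: predict the IRI of a pavement section from the most similar historical records. The records, the features, and the choice of k-NN itself are illustrative assumptions, not the paper's actual technique:

```python
def knn_predict(train, query, k=2):
    """Predict IRI as the mean of the k nearest records (Euclidean distance).

    train: list of (features, iri); features e.g. (age, ESAL, cracking %).
    """
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda rec: dist(rec[0], query))[:k]
    return sum(iri for _, iri in nearest) / k

# Hypothetical pavement records: (age yr, ESAL millions, cracking %) -> IRI.
train = [
    ((2, 0.5, 1.0), 2.1),
    ((5, 1.2, 4.0), 3.0),
    ((9, 2.5, 9.0), 4.4),
    ((12, 3.1, 14.0), 5.2),
]
print(knn_predict(train, (6, 1.4, 5.0)))  # ~3.7, mean of the two closest records
```

A real model would also normalize the features (age in years and ESAL in millions live on very different scales), which is the kind of preprocessing a DM pipeline handles before learning the IRI relationship.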
Abstract:
The ATLAS experiment at the LHC has measured the Higgs boson couplings and mass, and searched for invisible Higgs boson decays, using multiple production and decay channels with up to 4.7 fb−1 of pp collision data at √s=7 TeV and 20.3 fb−1 at √s=8 TeV. In the current study, the measured production and decay rates of the observed Higgs boson in the γγ, ZZ, WW, Zγ, bb, ττ, and μμ decay channels, along with results from the associated production of a Higgs boson with a top-quark pair, are used to probe the scaling of the couplings with mass. Limits are set on parameters in extensions of the Standard Model including a composite Higgs boson, an additional electroweak singlet, and two-Higgs-doublet models. Together with the measured mass of the scalar Higgs boson in the γγ and ZZ decay modes, a lower limit is set on the pseudoscalar Higgs boson mass of mA > 370 GeV in the “hMSSM” simplified Minimal Supersymmetric Standard Model. Results from direct searches for heavy Higgs bosons are also interpreted in the hMSSM. Direct searches for invisible Higgs boson decays in the vector-boson fusion and associated production of a Higgs boson with W/Z (Z → ℓℓ, W/Z → jj) modes are statistically combined to set an upper limit on the Higgs boson invisible branching ratio of 0.25. The use of the measured visible decay rates in a more general coupling fit improves the upper limit to 0.23, constraining a Higgs portal model of dark matter.
Abstract:
PhD Thesis in Health Sciences.
Abstract:
PhD Thesis in Plant Biology.
Abstract:
PhD thesis in Biomedical Engineering