951 results for COMPLEX NETWORKS


Relevance:

20.00%

Publisher:

Abstract:

The benzoyl hydrazone-based dimeric dicopper(II) complex [Cu2(R)(CH3O)(NO3)]2(CH3O)2 (R-Cu2+), recently reported by us, catalyzes the aerobic oxidation of catechols (catechol (S1), 3,5-di-tert-butylcatechol (S2) and 3-nitrocatechol (S3)) to the corresponding quinones (catecholase-like activity), as shown by UV–Vis absorption spectroscopy in methanol/HEPES buffer (pH 8.2) medium at 25 °C. The highest activity is observed for the substituted catechol (S2) bearing the electron-donating tert-butyl groups, resulting in a turnover frequency (TOF) value of 1.13 × 10³ h⁻¹. The complex R-Cu2+ also exhibits good catalytic activity in the solvent-free oxidation of 1-phenylethanol to acetophenone by t-BuOOH under low-power (10 W) microwave (MW) irradiation. © 2014 Elsevier B.V. All rights reserved.
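To illustrate the turnover-frequency figure quoted above, a minimal sketch of the underlying arithmetic (moles of product per mole of catalyst per unit time); the quantities below are hypothetical and not taken from the study:

```python
# Turnover frequency (TOF): moles of product formed per mole of
# catalyst per unit time. All numbers below are hypothetical and
# serve only to illustrate the calculation.
def turnover_frequency(moles_product, moles_catalyst, hours):
    """Return the TOF in h^-1."""
    return moles_product / (moles_catalyst * hours)

# e.g. 1.13e-2 mol of quinone from 1.0e-5 mol of catalyst in 1 h
tof = turnover_frequency(1.13e-2, 1.0e-5, 1.0)
print(f"TOF = {tof:.2e} h^-1")  # TOF = 1.13e+03 h^-1
```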

Relevance:

20.00%

Publisher:

Abstract:

The catalytic peroxidative oxidation (with H2O2) of cyclohexane in an ionic liquid (IL) using the tetracopper(II) complex [(CuL)2(μ4-O,O′,O′′,O′′′-CDC)]2·2H2O [HL = 2-(2-pyridylmethyleneamino)benzenesulfonic acid, CDC = cyclohexane-1,4-dicarboxylate] as a catalyst is reported. Significant improvements in the catalytic performance, in terms of product yield (up to 36%), TON (up to 529), reaction time, selectivity towards cyclohexanone and easy recycling (negligible loss in activity after three consecutive runs), are observed using 1-butyl-3-methylimidazolium hexafluorophosphate as the chosen IL instead of molecular organic solvents, including the commonly used acetonitrile. The catalytic behaviors in the IL and in different molecular solvents are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Predicting the time and efficiency of the remediation of contaminated soils using soil vapor extraction remains a difficult challenge for the scientific community and consultants. This work reports the development of multiple linear regression and artificial neural network models to predict the remediation time and efficiency of soil vapor extractions performed on soils contaminated separately with benzene, toluene, ethylbenzene, xylene, trichloroethylene, and perchloroethylene. The results demonstrated that the artificial neural network approach performs better than the multiple linear regression models. The artificial neural network model allowed an accurate prediction of remediation time and efficiency based only on soil and pollutant characteristics, thus allowing a simple and quick preliminary evaluation of the process viability.
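The multiple-linear-regression baseline mentioned in this abstract reduces, in the single-predictor case, to an ordinary least-squares fit. A minimal sketch with synthetic, purely illustrative data (the variable names and values are assumptions, not the study's descriptors):

```python
# Ordinary least-squares fit of remediation time against one soil
# descriptor -- the single-feature special case of the multiple
# linear regression baseline. Data are synthetic and illustrative.
def ols_fit(xs, ys):
    """Closed-form least-squares slope and intercept for y ~ a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# synthetic example: remediation time (h) vs. soil water content (%)
water = [2.0, 4.0, 6.0, 8.0]
time_h = [10.0, 14.0, 18.0, 22.0]
a, b = ols_fit(water, time_h)
print(a, b)  # 6.0 2.0
```

The neural-network model in the study plays the same role but captures the nonlinear dependence on soil and pollutant characteristics that this linear baseline cannot.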

Relevance:

20.00%

Publisher:

Abstract:

MSc dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Internship report presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the Master's degree in Advertising and Marketing.

Relevance:

20.00%

Publisher:

Abstract:

We start by presenting the current status of a complex flavour-conserving two-Higgs-doublet model. We focus on some very interesting scenarios where, unexpectedly, the light Higgs couplings to leptons and to b-quarks can have a large pseudoscalar component with a vanishing scalar component. Predictions for the allowed parameter space at the end of the next run, with a total collected luminosity of 300 fb⁻¹ and 3000 fb⁻¹, are also discussed. These scenarios are not excluded by present data and will most probably survive the next LHC run. However, a measurement of the mixing angle φ_τ between the scalar and pseudoscalar components of the 125 GeV Higgs in the decay h → τ⁺τ⁻ will be able to probe many of these scenarios, even with low luminosity. Similarly, a measurement of φ_t in the t̄th vertex could help to constrain the low tan β region in the Type I model.

Relevance:

20.00%

Publisher:

Abstract:

In the last decade, both the scientific community and the automotive industry have enabled communications among vehicles in different kinds of scenarios by proposing different vehicular architectures. Vehicular delay-tolerant networks (VDTNs) were proposed as a solution to overcome some of the issues found in other vehicular architectures, namely in dispersed regions and emergency scenarios. Most of these issues arise from the unique characteristics of vehicular networks. Contrary to delay-tolerant networks (DTNs), VDTNs place the bundle layer under the network layer in order to simplify the layered architecture and enable communications in sparse regions characterized by long propagation delays, high error rates, and short contact durations. Such characteristics make it very important to exchange as much information as possible between nodes at every contact opportunity. One way to accomplish this goal is to enforce cooperation between network nodes. To promote cooperation among nodes, it is important that nodes share their own resources to deliver messages from others. This can be a very difficult task if selfish nodes affect the performance of cooperative nodes. This paper studies the performance of a cooperative reputation system that detects, identifies, and avoids communications with selfish nodes. Two scenarios were considered across all the experiments, employing three different routing protocols (First Contact, Spray and Wait, and GeoSpray). For both scenarios, it was shown that reputation mechanisms that aggressively punish selfish nodes help to increase the overall network performance.
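A reputation system of the kind described above can be sketched as a per-node score that rises with cooperative relaying and falls with refusals, with low-scoring nodes excluded from future contacts. The update rule, initial score, and threshold below are illustrative assumptions, not the mechanism evaluated in the paper:

```python
# Minimal sketch of a reputation table for detecting selfish nodes.
# Cooperative relays gain reputation, refusals lose it, and nodes
# below a threshold are flagged and avoided. Parameters are assumed
# values for illustration only.
class ReputationTable:
    def __init__(self, threshold=0.3, reward=0.1, penalty=0.2):
        self.scores = {}              # node id -> reputation in [0, 1]
        self.threshold = threshold
        self.reward = reward          # gain for relaying a message
        self.penalty = penalty        # loss for refusing to relay

    def record(self, node, relayed):
        """Update a node's reputation after a contact opportunity."""
        score = self.scores.get(node, 0.5)   # neutral initial reputation
        score += self.reward if relayed else -self.penalty
        self.scores[node] = min(1.0, max(0.0, score))

    def is_selfish(self, node):
        return self.scores.get(node, 0.5) < self.threshold

table = ReputationTable()
for _ in range(3):
    table.record("n1", relayed=False)   # n1 keeps refusing to relay
table.record("n2", relayed=True)
print(table.is_selfish("n1"), table.is_selfish("n2"))  # True False
```

An asymmetric reward/penalty (penalty larger than reward) corresponds to the "aggressive punishment" setting that the paper found most beneficial.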

Relevance:

20.00%

Publisher:

Abstract:

Wireless Body Area Networks (WBANs) have emerged as a promising technology for medical and non-medical applications. WBANs consist of a number of miniaturized, portable, and autonomous sensor nodes that are used for long-term health monitoring of patients. These sensor nodes continuously collect patient information, which is used for ubiquitous health monitoring. In addition, WBANs may be used for managing catastrophic events and increasing the effectiveness and performance of rescue forces. The huge amount of data collected by WBAN nodes demands a scalable, on-demand, powerful, and secure storage and processing infrastructure. Cloud computing is expected to play a significant role in achieving these objectives. The cloud computing environment links different devices, ranging from miniaturized sensor nodes to high-performance supercomputers, to deliver people-centric and context-centric services to individuals and industries. The possible integration of WBANs with cloud computing (WBAN-cloud) will introduce a viable hybrid platform that must be able to process the huge amount of data collected from multiple WBANs. This WBAN-cloud will enable users (including physicians and nurses) to globally access the processing and storage infrastructure at competitive costs. Because WBANs forward useful and life-critical information to the cloud, which may operate in distributed and hostile environments, novel security mechanisms are required to prevent malicious interactions with the storage infrastructure. Both the cloud providers and the users must take strong security measures to protect the storage infrastructure.

Relevance:

20.00%

Publisher:

Abstract:

Hand-off (or hand-over), the process whereby mobile nodes select the best access point available to transfer data, has been well studied in wireless networks. The performance of a hand-off process depends on the specific characteristics of the wireless links. In the case of low-power wireless networks, hand-off decisions must be taken carefully, by considering the unique properties of inexpensive low-power radios. This paper addresses the design, implementation and evaluation of smart-HOP, a hand-off mechanism tailored for low-power wireless networks. This work has three main contributions. First, it formulates the hard hand-off process for low-power networks (such as typical wireless sensor networks - WSNs) with a probabilistic model, to investigate the impact of the most relevant channel parameters through an analytical approach. Second, it confirms the probabilistic model through simulation and further elaborates on the impact of several hand-off parameters. Third, it fine-tunes the most relevant hand-off parameters via an extended set of experiments in a realistic experimental scenario. The evaluation shows that smart-HOP performs well in the transitional region, achieving a relative delivery ratio above 98 percent and hand-off delays on the order of a few tens of milliseconds.
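A hard hand-off rule with hysteresis, in the spirit of the mechanism described above, can be sketched as follows: switch access points only when the candidate's signal beats the serving access point's by a margin over several consecutive samples. The margin and window values below are illustrative assumptions, not the parameters tuned in the paper:

```python
# Sketch of a hard hand-off decision with hysteresis: the candidate
# access point must exceed the serving one by `margin_db` in each of
# the last `window` paired RSSI samples (dBm). Parameter values are
# assumptions for illustration.
def should_handoff(serving_rssi, candidate_rssi, margin_db=5.0, window=3):
    """Return True if the candidate consistently beats the serving AP."""
    recent = list(zip(serving_rssi, candidate_rssi))[-window:]
    if len(recent) < window:
        return False                 # not enough samples to decide
    return all(c - s >= margin_db for s, c in recent)

serving   = [-80.0, -82.0, -85.0, -86.0]   # serving AP is degrading
candidate = [-75.0, -74.0, -73.0, -72.0]   # candidate AP is improving
print(should_handoff(serving, candidate))  # True
```

Requiring the margin to hold over a window of samples is what suppresses ping-ponging between access points in the noisy transitional region.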

Relevance:

20.00%

Publisher:

Abstract:

ABSTRACT - Starting from the explanation of metanarrative as a sort of self-reflexive storytelling (as defended by Kenneth Weaver Hope in his unpublished PhD thesis), I propose to talk about enunciative practices that stress the telling more than the told. In line with some metafictional practices applied to cinema, such as the ‘mindfuck’ film (Jonathan Eig, 2003), the ‘psychological puzzle film’ (Elliot Panek, 2003) and the ‘mind-game film’ (Thomas Elsaesser, 2009), I will address the manipulations that a narrative film endures in order to produce a more fruitful and complex experience for the viewer. I will particularly concentrate on the misrepresentation of time as a way to produce a labyrinthine work of fiction where the linear description of events is replaced by a game of time disclosure. The viewer is thus called upon to reconstruct the order of the various situations portrayed, in a process that I call ‘temporal mapping’. However, as the viewer attempts to do this, the film, ironically, because of the intricate nature of the plot and the uncertain status of the characters, resists the attempt. There is a sort of teasing taking place between the film and its spectator: an invitation to decode that is half-denied until the end, where the puzzle is finally solved. I will use three of Alejandro Iñárritu’s films to better convey my point: Amores perros (2000), 21 Grams (2003) and Babel (2006). I will consider Iñárritu’s methods of producing non-linear storytelling as a way to stress the importance of time and its validity as one of the elements that make up a metanarrative experience in films. I will focus especially on 21 Grams, which I consider to be a paragon of the labyrinth.

Relevance:

20.00%

Publisher:

Abstract:

The incorporation of a small amount of highly anisotropic nanoparticles into a liquid crystalline hydroxypropylcellulose (LC-HPC) matrix improves its response when exposed to humidity gradients, due to an anisotropic increase of order in the structure. Dispersed nanoparticles give rise to faster order/disorder transitions when exposed to moisture, as qualitatively observed and quantified by stress-time measurements. The presence of carbon nanotubes results in an improvement of the mechanical properties of LC-HPC thin films.

Relevance:

20.00%

Publisher:

Abstract:

Coevolution between two antagonistic species has been widely studied theoretically for both ecologically- and genetically-driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of adaptations by one species and counter-adaptations by the other. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model, focusing mainly on the impact of the species' rates of evolution (mutation rates) on the dynamics. Firstly, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. Using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos by computing the topological entropy of existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. Then, we study the predictability of the Red Queen chaos, found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space while varying other model parameters simultaneously. Such analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although restricted to small regions of the analyzed parameter space, might be highly unpredictable.
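The predictability analysis above rests on quantifying chaos in one-dimensional iterated maps. As a simpler stand-in for the topological-entropy computation via Markov partitions, the sketch below estimates the largest Lyapunov exponent of the logistic map (a standard 1D chaotic map, used here purely for illustration): a positive value signals chaotic, hard-to-predict dynamics.

```python
import math

# Lyapunov exponent of the logistic map x -> r*x*(1-x), estimated as
# the average of log|f'(x)| along the orbit after discarding a
# transient. Positive exponent => chaos; negative => predictable
# periodic dynamics.
def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=10000):
    x = x0
    for _ in range(n_transient):           # discard transient behaviour
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # log|f'(x)|
    return total / n_iter

print(lyapunov_logistic(4.0))   # ≈ 0.693 (ln 2): chaotic regime
print(lyapunov_logistic(3.2))   # negative: stable 2-cycle, predictable
```

For a chaotic 1D map the topological entropy bounds this exponent from above, which is why both serve as complexity measures for the kind of Red Queen dynamics studied in the paper.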

Relevance:

20.00%

Publisher:

Abstract:

A chromatographic separation of the active ingredients of Combivir, Epivir, Kaletra, Norvir, Prezista, Retrovir, Trivizir, Valcyte, and Viramune is performed by thin-layer chromatography. The spectra of these nine drugs were recorded using Fourier transform infrared spectroscopy. This information is then analyzed by means of the cosine correlation. The comparison of the infrared spectra under the adopted similarity measure can be visualized with present-day computer tools, and the emerging clusters provide additional information about the similarities of the investigated set of complex drugs.
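The cosine correlation used above compares two spectra as intensity vectors sampled on a common grid. A minimal sketch with synthetic stand-in spectra (not real IR data):

```python
import math

# Cosine similarity between two spectra represented as intensity
# vectors on a common wavenumber grid: 1.0 for identical profiles,
# near 0.0 for unrelated ones. The vectors below are synthetic.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = (math.sqrt(sum(x * x for x in a))
            * math.sqrt(sum(y * y for y in b)))
    return dot / norm

spectrum_1 = [0.1, 0.8, 0.3, 0.0]
spectrum_2 = [0.1, 0.9, 0.2, 0.0]   # similar profile -> close to 1
spectrum_3 = [0.9, 0.1, 0.0, 0.7]   # different profile -> closer to 0
print(cosine_similarity(spectrum_1, spectrum_2))
print(cosine_similarity(spectrum_1, spectrum_3))
```

Clustering drugs by this measure then amounts to grouping spectra whose pairwise similarities are close to 1.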

Relevance:

20.00%

Publisher:

Abstract:

Final Master's project presented to obtain the Master's degree in Communication Networks and Multimedia Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Motivated by the dark matter and baryon asymmetry problems, we analyze a complex singlet extension of the Standard Model with a Z₂ symmetry (which provides a dark matter candidate). After a detailed two-loop calculation of the renormalization group equations for the new scalar sector, we study the radiative stability of the model up to a high energy scale (with the constraint that the 126 GeV Higgs boson found at the LHC is in the spectrum) and find that it requires the existence of a new scalar state mixing with the Higgs, with a mass larger than 140 GeV. This bound is not very sensitive to the cutoff scale as long as the latter is larger than 10¹⁰ GeV. We then include all experimental and observational constraints/measurements from collider data, from dark matter direct detection experiments, and from the Planck satellite, and in addition require stability at least up to the grand unified theory scale, to find that the lower bound is raised to about 170 GeV, while the dark matter particle must be heavier than about 50 GeV.