995 results for Security constraints


Relevance: 30.00%

Abstract:

Wireless sensor networks are emerging as effective tools for gathering and disseminating data. They can be applied in many fields, including health, environmental monitoring, home automation, and the military. As with all other computing systems, it is necessary to include security features so that security-sensitive data traversing the network is protected. However, traditional security techniques cannot be applied to wireless sensor networks because of the constraints on battery power, memory, and computational capacity of the miniature wireless sensor nodes. To address this need, it becomes necessary to develop new lightweight security protocols. This dissertation focuses on designing a suite of lightweight trust-based security mechanisms and a cooperation enforcement protocol for wireless sensor networks. It presents a trust-based cluster head election mechanism that prevents a major security breach against the routing protocol, namely the election of malicious or compromised cluster heads. It also describes a location-aware, trust-based mechanism for detecting and isolating compromised nodes. Both mechanisms rely on the ability of a node to monitor its neighbors: using neighbor monitoring techniques, nodes determine their neighbors' reputation and trust levels through probabilistic modeling. The mechanisms were designed to mitigate internal attacks within wireless sensor networks, and the feasibility of the approach is demonstrated through extensive simulations. The dissertation also addresses non-cooperation problems in multi-user wireless sensor networks, designing a scalable lightweight cooperation enforcement algorithm based on evolutionary game theory whose effectiveness is validated through mathematical analysis and simulation. This research advances the knowledge of wireless sensor network security and cooperation by developing new techniques based on mathematical models, enabling others to build on this work toward highly trusted wireless sensor networks and facilitating their full utilization in fields ranging from civilian to military applications.
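The abstract does not spell out the probabilistic model, but neighbor-monitoring reputation systems of this kind are commonly built on beta-distribution updates. The Python sketch below is purely illustrative: the class, the election rule, and all names are assumptions, not the dissertation's design.

    # Illustrative sketch of a beta-reputation trust update for neighbor
    # monitoring in a WSN (assumed model; the dissertation's exact
    # scheme is not specified in the abstract).

    class NeighborTrust:
        def __init__(self):
            # Beta(alpha, beta) posterior over a neighbor behaving correctly.
            self.alpha = 1.0  # cooperative observations + 1
            self.beta = 1.0   # misbehaving observations + 1

        def observe(self, cooperated: bool) -> None:
            # Each overheard forwarding event updates the posterior.
            if cooperated:
                self.alpha += 1.0
            else:
                self.beta += 1.0

        @property
        def trust(self) -> float:
            # Expected probability of correct behavior.
            return self.alpha / (self.alpha + self.beta)

    def elect_cluster_head(trust_table: dict) -> str:
        # Assumed election rule: pick the neighbor with the highest trust
        # estimate, mitigating election of compromised cluster heads.
        return max(trust_table, key=lambda n: trust_table[n].trust)

In this sketch, a node that overhears its neighbors forwarding packets would call observe() per event and periodically run elect_cluster_head over its trust table.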

Relevance: 30.00%

Abstract:

We propose three research problems to explore the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted as a two-layer structure, with the top layer running a trust-based consensus algorithm and the bottom layer executing a global trust update scheme as a subroutine. We utilize a set of pre-trusted nodes, called headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it is easily extensible to more complicated decision rules and global trust schemes; a simplified sketch of the consensus layer follows below. The first problem assumes that normal nodes are homogeneous, i.e., a normal node is guaranteed to behave as it is programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers to questions. In the second part of the thesis, we treat a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge learning task, however, a scalar-valued trust is not sufficient to reflect a worker's trustworthiness in each of the domains. To address this issue, we propose a probabilistic model that jointly infers the multi-dimensional trust of workers, the multi-domain properties of questions, and the true labels of questions. Our model is flexible and extensible to incorporate metadata associated with questions; to show this, we further propose two extended models, one handling input tasks with real-valued features and the other handling tasks with text features by incorporating topic models. Our models can effectively recover the trust vectors of workers, which can be very useful for future task assignment adaptive to workers' trust. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine learning results, or a hybrid of them. In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that in real-life applications questions are often not independent; instead, there are logical relations between them. Similarly, the workers that provide answers are not independent of each other: answers given by workers with similar attributes tend to be correlated. We therefore propose a novel unified graphical model consisting of two layers, in which the top layer encodes domain knowledge, allowing users to express logical relations using first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers.
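As a concrete (and much simplified) illustration of the first problem only, the Python sketch below runs a trust-weighted consensus iteration. The thresholding rule and the fixed trust vector are assumptions; the thesis's header-based local/global trust update is not modeled here.

    import numpy as np

    def trust_weighted_consensus(x, W, trust, tau=0.5, iters=50):
        """Sketch of a trust-based consensus layer (assumed simplification).

        x     : initial values of the n nodes
        W     : n x n adjacency matrix (W[i][j] nonzero if j neighbors i)
        trust : global trust scores in [0, 1], e.g. produced by a
                header-propagated trust update (not modeled here)
        tau   : trust threshold below which a neighbor is ignored
        """
        x = np.asarray(x, dtype=float)
        n = len(x)
        for _ in range(iters):
            x_new = x.copy()
            for i in range(n):
                # Keep self plus neighbors whose trust clears the threshold,
                # so arbitrarily misbehaving nodes are discounted.
                nbrs = [j for j in range(n) if W[i][j] and trust[j] >= tau]
                vals = [x[i]] + [x[j] for j in nbrs]
                wts = [1.0] + [trust[j] for j in nbrs]
                x_new[i] = np.average(vals, weights=wts)
            x = x_new
        return x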
The third part of the thesis considers the problem of optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost that depends both on the trustworthiness of the worker and on the difficulty of the task: access to expert-level (more trustworthy) workers is typically more expensive than access to the average crowd, and completing a challenging task costs more than a click-away question. We address the optimal assignment of heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as inputs the estimated trust of workers and a preset budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that naturally relates the budget, the trustworthiness of the crowd, and the cost of obtaining labels: a higher budget, a more trustworthy crowd, and less costly jobs yield a lower theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component and can therefore be combined with generic trust evaluation algorithms.
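To make the budgeted-allocation setting concrete, here is a hypothetical greedy heuristic in Python. It is not the thesis's optimal algorithm and carries no error bound; the data layout, the trust-per-cost rule, and the hardest-task-first ordering are all assumptions for illustration.

    # Illustrative greedy sketch of trust-aware task allocation under a
    # budget (assumed heuristic, not the optimal scheme from the thesis).

    def allocate(tasks, workers, budget):
        """tasks   : list of (task_id, difficulty)  # cost multiplier
           workers : list of (worker_id, trust, base_cost)
           Returns {task_id: worker_id}, choosing per task the worker
           with the best trust-per-cost ratio that still fits the budget."""
        assignment = {}
        # Harder tasks first, so the budget is spent where errors cost most.
        for task_id, difficulty in sorted(tasks, key=lambda t: -t[1]):
            best = None
            for worker_id, trust, base_cost in workers:
                cost = base_cost * difficulty  # cost grows with difficulty
                if cost <= budget and (best is None or
                                       trust / cost > best[1] / best[2]):
                    best = (worker_id, trust, cost)
            if best is not None:
                assignment[task_id] = best[0]
                budget -= best[2]
        return assignment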

Relevance: 30.00%

Abstract:

The article analyses the viability of promoting crop-specific programs as a means of improving smallholder net farm income and food security. The case study explores the relevance of European Union Stabilisation of Export Earnings (STABEX) funds in supporting Sierra Leone's agricultural development agenda. By analysing the drivers of food security for a number of targeted smallholders in the two most important agricultural zones of Sierra Leone, it is possible to compare the suitability of crop-specific support (in rice, cocoa, and coffee) versus general aid programs (public infrastructure, on- and off-farm diversification opportunities, sustainable practices, access to productive assets, etc.). The results indicate that crop diversification strategies are widespread and closely related to risk minimisation and enhanced food security among smallholders. Similarly, crop-specific programs that focus mainly on commercialisation tend to overlook important constraints associated with self-consumption and productivity.

Relevance: 20.00%

Abstract:

To describe the clinical history of a child with aggressive behavior and recurrent death-themed speech, and to report the experience of the team of authors, who proposed an alternative to medication through the establishment of a protection network and the inter-sector implementation of the circle of security concept. A 5-year-old child exhibited violent and aggressive behavior at daycare. The child was diagnosed by the healthcare center with depressive disorder and behavioral disorder and was medicated with sertraline and risperidone; side effects were observed, and the medications were discontinued. Despite several actions, such as talks, teamwork, and psychological and psychiatric follow-up, the child's behavior remained unchanged. A unique therapeutic project was developed by Universidade Estadual de Campinas medical school students in order to establish a connection between the entities responsible for the child's care (daycare center, healthcare center, and family). Thus, the team was able to develop a basic care protection network. The implementation of the inter-sector circle of security, together with communication and cooperation among the teams, produced very favorable results in this case. This initiative was shown to be a feasible and effective alternative to the use of medication for this child.

Relevance: 20.00%

Abstract:

We present a re-analysis of the Geneva-Copenhagen survey, which benefits from the infrared flux method to improve the accuracy of the derived stellar effective temperatures and uses the latter to build a consistent and improved metallicity scale. Metallicities are calibrated on high-resolution spectroscopy and checked against four open clusters and a moving group, showing excellent consistency. The new temperature and metallicity scales provide a better match to theoretical isochrones, which are used for a Bayesian analysis of stellar ages. With respect to previous analyses, our stars are on average 100 K hotter and 0.1 dex more metal-rich, shifting the peak of the metallicity distribution function to around the solar value. From Strömgren photometry we are able to derive, for the first time, a proxy for [α/Fe] abundances, which enables us to perform a tentative dissection of the chemical thin and thick disc. We find evidence that the latter is composed of an old, mildly but systematically alpha-enhanced population that extends to super-solar metallicities, in agreement with spectroscopic studies. Our revision offers the largest existing kinematically unbiased sample of the solar neighbourhood containing full information on kinematics, metallicities, and ages; it thus provides better constraints on the physical processes relevant to the build-up of the Milky Way disc, enabling a better understanding of the Sun in a Galactic context.

Relevance: 20.00%

Abstract:

We discuss the dynamics of the Universe within the framework of the massive graviton cold dark matter scenario (MGCDM), in which gravitons are geometrically treated as massive particles. In this modified gravity theory, the main effect of the gravitons is to alter the density evolution of the cold dark matter component in such a way that the Universe evolves to an accelerating expanding regime, as presently observed. Tight constraints on the main cosmological parameters of the MGCDM model are derived by performing a joint likelihood analysis involving recent type Ia supernovae data, the cosmic microwave background shift parameter, and the baryonic acoustic oscillations as traced by the Sloan Digital Sky Survey red luminous galaxies. The linear evolution of small density fluctuations is also analyzed in detail. It is found that the growth factor of the MGCDM model differs only slightly (~1-4%) from the one provided by the conventional flat ΛCDM cosmology. The growth rates of clustering predicted by the MGCDM and ΛCDM models are confronted with observations, and the corresponding best-fit values of the growth index (γ) are determined. Using the expectations of realistic future X-ray and Sunyaev-Zel'dovich cluster surveys, we derive the dark matter halo mass function and the corresponding redshift distribution of cluster-size halos for the MGCDM model. Finally, we also show that the Hubble flow differences between the MGCDM and ΛCDM models yield a halo redshift distribution departing significantly from those predicted by other dark energy models. These results suggest that the MGCDM model can be observationally distinguished from ΛCDM and from a large number of dark energy models recently proposed in the literature.
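For reference, the growth index γ quoted above enters through the standard parametrization of the linear growth rate (a conventional definition, not a result specific to MGCDM):

    f(z) \equiv \frac{d\ln\delta}{d\ln a} \simeq \left[\Omega_m(z)\right]^{\gamma},
    \qquad \gamma_{\Lambda\mathrm{CDM}} \simeq 6/11 \approx 0.55

so the quoted ~1-4% deviations in the growth factor translate into measurable shifts in the best-fit γ.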

Relevance: 20.00%

Abstract:

We discuss the properties of homogeneous and isotropic flat cosmologies in which the present accelerating stage is powered only by the gravitationally induced creation of cold dark matter (CCDM) particles (Ω_m = 1). For some matter creation rates proposed in the literature, we show that the main cosmological functions, such as the scale factor of the universe, the Hubble expansion rate, the growth factor, and the cluster formation rate, are analytically defined. The best CCDM scenario has only one free parameter, and our joint analysis involving baryonic acoustic oscillations + cosmic microwave background (CMB) + SNe Ia data yields Ω̃_m = 0.28 ± 0.01 (1σ), where Ω̃_m is the observed matter density parameter. In particular, this implies that the model has no dark energy, but the part of the matter that is effectively clustering is in good agreement with the latest determinations from large-scale structure. The growth of perturbations and the formation of galaxy clusters in such scenarios are also investigated. Despite the fact that both scenarios may share the same Hubble expansion, we find that matter creation cosmologies predict stronger small-scale dynamics, implying a faster growth rate of perturbations with respect to the usual ΛCDM cosmology. Such results point to the possibility of a crucial observational test confronting CCDM with ΛCDM scenarios through a more detailed analysis involving CMB, weak lensing, and large-scale structure.

Relevance: 20.00%

Abstract:

Aims. We calculate the theoretical event rate of gamma-ray bursts (GRBs) from the collapse of massive first-generation (Population III; Pop III) stars. Pop III GRBs could be super-energetic, with isotropic energies up to E_iso ≳ 10^55-57 erg, providing a unique probe of the high-redshift Universe. Methods. We consider both so-called Pop III.1 stars (primordial) and Pop III.2 stars (primordial but affected by radiation from other stars). We employ a semi-analytical approach that accounts for inhomogeneous hydrogen reionization and the chemical evolution of the intergalactic medium. Results. We show that Pop III.2 GRBs occur more than 100 times more frequently than Pop III.1 GRBs, and thus should be suitable targets for future GRB missions. Interestingly, our optimistic model predicts an event rate that is already constrained by current radio transient searches. We expect ~10-10^4 radio afterglows above ~0.3 mJy on the sky with ~1 year variability, mostly without GRBs (orphans), which are detectable by ALMA, EVLA, LOFAR, and SKA. We expect to observe a maximum of N < 20 GRBs per year integrated over z > 6 for Pop III.2 and N < 0.08 per year integrated over z > 10 for Pop III.1 with EXIST, and N < 0.2 Pop III.2 GRBs per year integrated over z > 6 with Swift.

Relevance: 20.00%

Abstract:

The kinematic approach to cosmological tests provides direct evidence for the present accelerating stage of the Universe that depends neither on the validity of general relativity nor on the matter-energy content of the Universe. In this context, we consider a linear two-parameter expansion for the deceleration parameter, q(z) = q_0 + q_1 z, where q_0 and q_1 are arbitrary constants to be constrained by the Union supernovae data. Assuming a flat Universe, we find that the best fit to the pair of free parameters is (q_0, q_1) = (-0.73, 1.5), whereas the transition redshift is z_t = 0.49 +0.14/-0.07 (1σ), +0.54/-0.12 (2σ). This kinematic result is in agreement with some independent analyses and more easily accommodates many dynamical flat models (like ΛCDM).
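The quoted transition redshift follows directly from the linear parametrization: the switch from deceleration to acceleration occurs where q(z_t) = 0, so

    q(z_t) = q_0 + q_1 z_t = 0
    \;\Rightarrow\;
    z_t = -\frac{q_0}{q_1} = \frac{0.73}{1.5} \approx 0.49,

consistent with the best-fit central value reported above.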

Relevance: 20.00%

Abstract:

This paper reports results from a search for ν_μ → ν_e transitions by the MINOS experiment, based on an exposure of 7 × 10^20 protons on target. Our observation of 54 candidate ν_e events in the far detector, with a background of 49.1 ± 7.0 (stat) ± 2.7 (syst) events predicted by the measurements in the near detector, requires 2 sin²(2θ_13) sin²θ_23 < 0.12 (0.20) at the 90% C.L. for the normal (inverted) mass hierarchy at δ_CP = 0. The experiment sets the tightest limits to date on the value of θ_13 for nearly all values of δ_CP, for the normal neutrino mass hierarchy and maximal sin²(2θ_23).
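The constrained combination reflects the leading-order vacuum appearance probability (the standard three-flavor expression, with L in km, E in GeV, and Δm² in eV²):

    P(\nu_\mu \to \nu_e) \simeq
    \sin^2\theta_{23}\,\sin^2(2\theta_{13})\,
    \sin^2\!\left(\frac{1.27\,\Delta m^2_{31}\,L}{E}\right)

An appearance search therefore constrains the product sin²(2θ_13) sin²θ_23 rather than θ_13 alone; the conventional factor of 2 makes the quoted combination equal sin²(2θ_13) at maximal θ_23.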

Relevance: 20.00%

Abstract:

For Au + Au collisions at 200 GeV, we measure neutral pion production with good statistics for transverse momentum, p_T, up to 20 GeV/c. A fivefold suppression is found, which is essentially constant for 5 < p_T < 20 GeV/c. The experimental uncertainties are small enough to constrain any model-dependent parametrization of the transport coefficient of the medium, e.g., q̂ in the parton quenching model. The spectral shape is similar for all collision classes, and the suppression does not saturate in Au + Au collisions.
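For context, such suppression is conventionally quantified by the nuclear modification factor (standard definition, not restated in the abstract):

    R_{AA}(p_T) =
    \frac{dN_{AA}/dp_T}{\langle N_{\mathrm{coll}} \rangle \, dN_{pp}/dp_T}

so a fivefold suppression corresponds to R_AA ≈ 0.2, roughly constant here over 5 < p_T < 20 GeV/c.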

Relevance: 20.00%

Abstract:

The PHENIX experiment has measured the suppression of semi-inclusive single high-transverse-momentum π^0's in Au+Au collisions at √s_NN = 200 GeV. The present understanding of this suppression is in terms of energy loss of the parent (fragmenting) parton in a dense color-charge medium. We have performed a quantitative comparison between various parton energy-loss models and our experimental data. Statistical uncertainties, as well as point-to-point uncorrelated and correlated systematic uncertainties, are taken into account in the comparison. We detail this methodology and the resulting constraints on the model parameters, such as the initial color-charge density dN_g/dy, the medium transport coefficient q̂, and the initial energy-loss parameter ε_0. We find that high-transverse-momentum π^0 suppression in Au+Au collisions has sufficient precision to constrain these model-dependent parameters at the ±20-25% (one standard deviation) level. These constraints include only the experimental uncertainties, and further studies are needed to compute the corresponding theoretical uncertainties.

Relevance: 20.00%

Abstract:

We present rigorous upper and lower bounds for the momentum-space ghost propagator G(p) of Yang-Mills theories in terms of the smallest nonzero eigenvalue (and of the corresponding eigenvector) of the Faddeev-Popov matrix. We apply our analysis to data from simulations of SU(2) lattice gauge theory in Landau gauge, using the largest lattice sizes to date. Our results suggest that, in three and in four space-time dimensions, the Landau-gauge ghost propagator is not enhanced compared to its tree-level behavior. This is also seen in plots and fits of the ghost dressing function. In the two-dimensional case, on the other hand, we find that G(p) diverges as p^(-2-2κ) with κ ≈ 0.15, in agreement with A. Maas, Phys. Rev. D 75, 116004 (2007). We note that our discussion is general, although we make an application only to pure gauge theory in Landau gauge. Our simulations have been performed on the IBM supercomputer at the University of São Paulo.

Relevance: 20.00%

Abstract:

We present rigorous upper and lower bounds for the zero-momentum gluon propagator D(0) of Yang-Mills theories in terms of the average value of the gluon field. This allows us to perform a controlled extrapolation of lattice data to infinite volume, showing that the infrared limit of the Landau-gauge gluon propagator in SU(2) gauge theory is finite and nonzero in three and in four space-time dimensions. In the two-dimensional case, we find D(0) = 0, in agreement with Maas. We suggest an explanation for these results. We note that our discussion is general, although we apply our analysis only to pure gauge theory in the Landau gauge. Simulations have been performed on the IBM supercomputer at the University of São Paulo.

Relevance: 20.00%

Abstract:

In this article, we discuss school schedules and their implications in the light of contemporary chronobiological knowledge, arguing for the need to reconsider time planning in the school setting. We present anecdotal observations regarding the chronobiological challenges imposed by the school system at different ages and discuss the effects of these schedules in terms of sleepiness and its deleterious consequences for learning, memory, and attention. Different settings (including urban vs. rural habitats) influence timing, which also depends on self-selected sleep schedules. Finally, we criticize the traditional view that strict stability of sleep-wake habits is necessary.