865 results for false problem
Abstract:
Spectrum is an essential resource for the provision of mobile services. To control and delimit its use, governmental agencies set up regulatory policies. Unfortunately, such policies have led to a shortage of spectrum: only a few frequency bands are left unlicensed, and these must carry the majority of new emerging wireless applications. One promising way to alleviate the spectrum shortage is to adopt a spectrum sharing paradigm in which frequency bands are used opportunistically. Cognitive radio is the key technology for enabling this shift of paradigm. Cognitive radio networks are self-organized systems in which devices cooperate to use the spectrum ranges that are not occupied by licensed users. They carry out spectrum sensing in order to detect vacant channels that can be used for communication. Even though spectrum sensing is an active area of research, an important issue remains unsolved: the secure authentication of sensing reports. Without such authentication, false data can be injected into the system, leading to false sensing results. This paper presents a distributed protocol based on wireless physical layer security, symmetric cryptography and one-way functions that allows a final sensing decision to be determined from multiple sources quickly and securely, while preserving users' privacy.
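The abstract does not detail the protocol, but the ingredients it names (symmetric cryptography and one-way functions) can be illustrated with a minimal sketch: each sensor shares a symmetric key with the fusion node and tags its binary sensing report with an HMAC, so forged or altered reports are discarded before the vote. All names and data here are hypothetical, not taken from the paper.

```python
import hmac, hashlib

def sign_report(key: bytes, sensor_id: str, channel: int, vacant: bool) -> bytes:
    """Authenticate a sensing report with a keyed one-way function (HMAC-SHA256)."""
    msg = f"{sensor_id}|{channel}|{int(vacant)}".encode()
    return hmac.new(key, msg, hashlib.sha256).digest()

def fuse(reports, keys):
    """Verify each report's tag, then take a majority vote over valid reports."""
    votes = []
    for sensor_id, channel, vacant, tag in reports:
        expected = sign_report(keys[sensor_id], sensor_id, channel, vacant)
        if hmac.compare_digest(expected, tag):   # reject forged/altered reports
            votes.append(vacant)
    return sum(votes) > len(votes) / 2           # True -> channel deemed vacant

keys = {"s1": b"k1", "s2": b"k2", "s3": b"k3"}
reports = [
    ("s1", 7, True,  sign_report(keys["s1"], "s1", 7, True)),
    ("s2", 7, True,  sign_report(keys["s2"], "s2", 7, True)),
    ("s3", 7, False, b"forged-tag"),  # invalid tag: this report is dropped
]
print(fuse(reports, keys))  # True: two valid 'vacant' votes remain
```

A real scheme would also need freshness (nonces or counters) to stop replayed reports; this sketch omits that.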
Abstract:
The continuous wavelet transform is obtained as a maximum entropy solution of the corresponding inverse problem. It is well known that although a signal can be reconstructed from its wavelet transform, the expansion is not unique due to the redundancy of continuous wavelets. Hence, the inverse problem has no unique solution. If we want to recognize one solution as "optimal", then an appropriate decision criterion has to be adopted. We show here that the continuous wavelet transform is an "optimal" solution in a maximum entropy sense.
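For context, the continuous wavelet transform in question is, in standard notation (not quoted from the paper):

```latex
W_f(a,b) \;=\; \frac{1}{\sqrt{|a|}} \int_{-\infty}^{\infty} f(t)\,\psi^{*}\!\left(\frac{t-b}{a}\right)\,dt
```

Because this maps a one-dimensional signal into a two-dimensional, highly redundant representation, many coefficient fields are consistent with the same signal; that non-uniqueness is what the abstract resolves with a maximum entropy criterion.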
Abstract:
The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned next to another at the same workstation, a setup time must be added to compute the global workstation time, thereby providing the task sequence inside each workstation. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which are an improvement upon heuristic procedures published to date.
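To illustrate what a priority-rule-based procedure for GALBPS looks like, here is a minimal greedy sketch (hypothetical data and a single "longest task time" rule; the paper's 50+ rules are not reproduced): a task may join the open workstation only if its processing time, plus the setup after the station's previous task, still fits within the cycle time.

```python
# Minimal priority-rule sketch for line balancing with sequence-dependent setups.
def balance(times, prec, setup, cycle):
    """Greedily open stations; a task fits if its time plus the setup after
    the station's previous task keeps the station within the cycle time."""
    unassigned, stations = set(times), []
    while unassigned:
        load, seq = 0.0, []
        while True:
            ready = [t for t in unassigned
                     if all(p not in unassigned for p in prec.get(t, []))]
            fits = [t for t in ready
                    if load + times[t] + (setup[(seq[-1], t)] if seq else 0) <= cycle]
            if not fits:
                break
            t = max(fits, key=lambda t: times[t])   # priority rule: longest task time
            load += times[t] + (setup[(seq[-1], t)] if seq else 0)
            seq.append(t); unassigned.remove(t)
        if not seq:
            raise ValueError("a task exceeds the cycle time")
        stations.append(seq)
    return stations

times = {"a": 4, "b": 3, "c": 2, "d": 3}      # task processing times
prec = {"c": ["a"], "d": ["b"]}               # precedence constraints
setup = {(x, y): 1 for x in times for y in times if x != y}
print(balance(times, prec, setup, cycle=8))   # [['a', 'b'], ['d', 'c']]
```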
Abstract:
A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
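The minimum-norm coefficients that frame theory singles out can be computed with the Moore-Penrose pseudoinverse; a small numerical illustration (the frame here is an example of mine, not taken from the paper):

```python
import numpy as np

# Rows of F are the frame vectors; any x = F.T @ c has infinitely many c,
# and the pseudoinverse picks the minimum-norm coefficient vector.
F = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # 3 frame vectors in R^2 (overcomplete)
x = np.array([2.0, 3.0])

c_min = np.linalg.pinv(F.T) @ x     # canonical (minimum-norm) coefficients
assert np.allclose(F.T @ c_min, x)  # c_min still reconstructs x

# Any other valid coefficient vector differs by an element of the null
# space of F.T and therefore has a strictly larger Euclidean norm:
null = np.array([1.0, 1.0, -1.0])   # F.T @ null == 0
c_other = c_min + 0.5 * null
assert np.allclose(F.T @ c_other, x)
print(np.linalg.norm(c_min) < np.linalg.norm(c_other))  # True
```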
Abstract:
PURPOSE AND METHOD: This questionnaire survey of 190 university music students assessed negative feelings of music performance anxiety (MPA) before performing, the experience of stage fright as a problem, and how closely they are associated with each other. The study further investigated whether the experience of stage fright as a problem and negative feelings of MPA predict the coping behavior of the music students. Rarely addressed coping issues were assessed, i.e., self-perceived effectiveness of different coping strategies, knowledge of possible risks and acceptance of substance-based coping strategies, and need for more support. RESULTS: The results show that one-third of the students experienced stage fright as a problem and that this was only moderately correlated with negative feelings of MPA. The experience of stage fright as a problem significantly predicted the frequency of use and the acceptance of medication as a coping strategy. Breathing exercises and self-control techniques were rated as effective as medication. Finally, students expressed a strong need to receive more support (65%) and more information (84%) concerning stage fright. CONCLUSION: Stage fright was experienced as a problem and perceived as having negative career consequences by a considerable percentage of the surveyed students. In addition to a desire for more help and support, the students expressed an openness and willingness to seriously discuss and address the topic of stage fright. This provides a necessary and promising basis for optimal career preparation and, hence, an opportunity to prevent occupational problems in professional musicians. [Authors]
Abstract:
A wide variety of whole cell bioreporter and biosensor assays for arsenic detection have been developed over the past decade. The assays permit flexible detection instrumentation while maintaining excellent method detection limits in the environmentally relevant range of 10-50 μg arsenite per L and below. New emerging trends focus on genetic rewiring of reporter cells and/or integration into microdevices for improved detection. A number of case studies have shown realistic field applicability of bioreporter assays.
Abstract:
The aim of this thesis is to study the peso problem and devaluation expectations in the following Latin American countries: Argentina, Brazil, Costa Rica, Uruguay and Venezuela. It is further investigated whether the peso problem can explain the irregular behaviour of interest rates before an actual devaluation occurs. To make this possible, the market's expected probability of devaluation is computed for the countries studied. The expected probability of devaluation is computed for the period from January 1996 to December 2006 using two different models. According to the interest rate differential model, market devaluation expectations can be derived from the interest rate differential between countries. Second, the Probit model uses several macroeconomic factors as explanatory variables in computing the expected probability of devaluation. In addition, it is examined how the development of individual macroeconomic variables affects the expected probability of devaluation. The empirical results show that the Latin American countries studied exhibited a peso problem between January 1996 and December 2006. According to the interest rate differential model, a peso problem was found in all the countries studied except Argentina. Correspondingly, according to the Probit model, a peso problem was found in all the countries studied. The results also show that the irregular behaviour of interest rates before an actual devaluation can be explained by the peso problem. The Probit model results further indicate that there is no particular pattern in how the development of macroeconomic variables affects market devaluation expectations in Latin America; rather, the effects appear to be country-specific.
Abstract:
The hydrological and biogeochemical processes that operate in catchments influence the ecological quality of freshwater systems through delivery of fine sediment, nutrients and organic matter. Most models that seek to characterise the delivery of diffuse pollutants from land to water are reductionist. The multitude of processes that are parameterised in such models to ensure generic applicability makes them complex and difficult to test on available data. Here, we outline an alternative, data-driven, inverse approach. We apply SCIMAP, a parsimonious risk-based model with an explicit treatment of hydrological connectivity. We take a Bayesian approach to the inverse problem of determining the risk that must be assigned to different land uses in a catchment in order to explain the spatial patterns of measured in-stream nutrient concentrations. We apply the model to identify the key sources of nitrogen (N) and phosphorus (P) diffuse pollution risk in eleven UK catchments covering a range of landscapes. The model results show that: 1) some land uses generate a consistently high or low risk of diffuse nutrient pollution; 2) the risks associated with different land uses vary both between catchments and between nutrients; and 3) the dominant sources of P and N risk in a catchment are often a function of the spatial configuration of land uses. Taken on a case-by-case basis, this type of inverse approach may be used to help prioritise the focus of interventions to reduce diffuse pollution risk for freshwater ecosystems. (C) 2012 Elsevier B.V. All rights reserved.
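As a toy illustration of this kind of Bayesian inversion (not SCIMAP itself; all numbers are made up): sample land-use risk weights from a uniform prior and keep those whose predicted mixed concentration matches the observed in-stream value, in the style of approximate Bayesian computation.

```python
import random

# Hypothetical catchment: land-use area fractions and one observed value.
random.seed(0)
area = {"arable": 0.5, "pasture": 0.3, "urban": 0.2}   # catchment fractions
observed = 0.41                                        # measured concentration

accepted = []
for _ in range(20000):
    w = {lu: random.random() for lu in area}           # prior: U(0,1) risk weights
    predicted = sum(area[lu] * w[lu] for lu in area)   # simple mixing model
    if abs(predicted - observed) < 0.01:               # ABC-style acceptance
        accepted.append(w)

# Posterior mean risk weight for arable land, given the observation.
posterior_arable = sum(w["arable"] for w in accepted) / len(accepted)
print(round(posterior_arable, 2))
```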
Abstract:
The patent system was created for the purpose of promoting innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how it is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits.
Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investment as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change the patent system is facing, and of how these challenges are reflected in standard setting.
Abstract:
In this paper we consider a sequential allocation problem with n individuals. The first individual can consume any amount of some endowment, leaving the remainder for the second individual, and so on. Motivated by the limitations associated with the cooperative and non-cooperative solutions, we propose a new approach. We establish some axioms that should be satisfied: representativeness, impartiality, etc. The result is a unique asymptotic allocation rule. It is shown for n = 2, 3, 4, and a claim is made for general n. We show that it satisfies a set of desirable properties. Key words: Sequential allocation rule, River sharing problem, Cooperative and non-cooperative games, Dictator and ultimatum games. JEL classification: C79, D63, D74.
Abstract:
Although global environmental governance has traditionally couched global warming in terms of annual CO2 emissions (a flow), global mean temperature is actually determined by cumulative CO2 emissions in the atmosphere (a stock). Thanks to advances by the scientific community, it is nowadays possible to quantify the "global carbon budget", that is, the amount of cumulative CO2 emissions still available before crossing the 2°C threshold (Meinshausen et al., 2009). The present approach proposes to analyze the allocation of this global carbon budget among countries as a classical conflicting claims problem (O'Neill, 1982). Based on some appealing principles, an efficient and sustainable allocation of the available carbon budget from 2000 to 2050 is proposed, taking into account different environmental risk scenarios. Keywords: Carbon budget, Conflicting claims problem, Distribution, Climate change. JEL classification: C79, D71, D74, H41, H87, Q50, Q54, Q58.
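A conflicting claims problem of this kind can be illustrated with the simplest classical rule, the proportional rule, which scales every claim by the same factor so that the budget is exactly exhausted (illustrative numbers only; the paper's own allocation principles are richer):

```python
# Claims-problem sketch: countries claim more emissions than the remaining
# carbon budget allows; the proportional rule scales claims to the budget.
def proportional(budget, claims):
    total = sum(claims.values())
    assert total >= budget, "claims rules apply when claims exceed the budget"
    return {k: budget * v / total for k, v in claims.items()}

claims = {"A": 300.0, "B": 200.0, "C": 100.0}   # GtCO2 claimed (hypothetical)
alloc = proportional(420.0, claims)             # only 420 GtCO2 available
print(alloc)  # {'A': 210.0, 'B': 140.0, 'C': 70.0}
```

Other classical rules (constrained equal awards, constrained equal losses, the Talmud rule) would divide the same budget differently while still exhausting it.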
Abstract:
We prove that there are one-parameter families of planar differential equations for which the center problem has a trivial solution and on the other hand the cyclicity of the weak focus is arbitrarily high. We illustrate this phenomenon in several examples for which this cyclicity is computed.
Abstract:
The production, distribution and use of false identity documents constitute a threat to both public and private security. Fraudulent documents are a catalyst for a multitude of crimes, from the most trivial to the most serious and organised forms. The dimension, complexity, low visibility, as well as the repetitive and evolving character of the production and use of false identity documents call for new solutions that go beyond the traditional case-by-case approach, or the technology-focused strategy whose failure is revealed by the historical perspective. These new solutions require strengthening the ability to understand the crime phenomena and crime problems posed by false identity documents. Such an understanding is pivotal in order to be able to imagine, evaluate and decide on the most appropriate measures and responses. Therefore, analysis capacities and crime intelligence functions, which underpin the most recent policing models such as intelligence-led policing or problem-oriented policing, have to be developed. In this context, the doctoral research work adopts an original position by postulating that false identity documents can be usefully perceived as the material remnant resulting from the criminal activity undertaken by forgers, namely the manufacture or modification of identity documents.
Based on this fundamental postulate, it is proposed that a scientific, methodical and systematic processing of these traces through a forensic intelligence approach can generate phenomenological knowledge on the forms of crime that produce, distribute and use false identity documents. Such knowledge integrates into, and advantageously serves, crime intelligence efforts. In support of this original thesis and of a more general study of forensic intelligence, the doctoral work proposes definitions and models. It describes new profiling methods and initiates the construction of a catalogue of analysis forms. It also draws on experiments and case studies. The results demonstrate that the systematic processing of forensic data makes a useful and relevant contribution to strategic, tactical and operational crime intelligence, as well as to criminology. Combined with other available information, the forensic intelligence produced can support policing in its repressive, proactive, preventive and control activities. In particular, the proposed profiling methods make it possible to reveal trends across extended datasets, to analyse modus operandi, or to infer that false identity documents have a common or different source. These methods support the detection and follow-up of crime series, crime problems and phenomena, and therefore contribute to crime monitoring efforts. They make it possible to link and regroup, by problem, cases that were previously viewed as isolated, to highlight organised forms of crime that deserve the greatest attention, and to elicit robust and novel knowledge offering a deeper perception of crime. The doctoral research work also discusses the difficulties associated with managing data and information at different levels of generality, as well as those associated with implementing the forensic intelligence process in practice. The doctoral work focuses primarily on false identity documents and their treatment by policing stakeholders.
However, through an inductive process, it makes a generalisation which underlines that these observations apply not only to false identity documents but to any kind of trace from which a profile is extracted. A more transversal definition and understanding of the concept and function of forensic intelligence thus derives from the doctoral work.