994 results for problem complexity


Relevance: 20.00%

Abstract:

This paper addresses the estimation of the code-phase (pseudorange) and the carrier-phase of the direct signal received from a direct-sequence spread-spectrum satellite transmitter. The signal is received by an antenna array in a scenario with interference and multipath propagation. These two effects are generally the limiting error sources in most high-precision positioning applications. A new estimator of the code- and carrier-phases is derived by using a simplified signal model and the maximum likelihood (ML) principle. The simplified model consists essentially of gathering all signals, except for the direct one, in a component with unknown spatial correlation. The estimator exploits the knowledge of the direction-of-arrival of the direct signal and is much simpler than other estimators derived under more detailed signal models. Moreover, we present an iterative algorithm that is adequate for a practical implementation and explores an interesting link between the ML estimator and a hybrid beamformer. The mean squared error and bias of the new estimator are computed for a number of scenarios and compared with those of other methods. The presented estimator and the hybrid beamforming outperform the existing techniques of comparable complexity and attain, in many situations, the Cramér–Rao lower bound of the problem at hand.
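For orientation, a simplified model of the kind the abstract describes can be written as follows (the notation here is assumed for illustration, not taken from the paper):

\[
\mathbf{y}(t) = \alpha\, \mathbf{a}\, c(t - \tau)\, e^{j\phi} + \mathbf{n}(t),
\]

where \(\mathbf{a}\) is the known steering vector of the direct signal (fixed by its direction-of-arrival), \(c(t)\) the spreading code, \(\tau\) and \(\phi\) the code- and carrier-phases to be estimated, \(\alpha\) a complex amplitude, and \(\mathbf{n}(t)\) a single term gathering multipath, interference and noise with unknown spatial correlation. ML estimation of \((\tau, \phi)\) then amounts to maximizing the likelihood concentrated with respect to the nuisance parameters \(\alpha\) and the unknown correlation matrix.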

Relevance: 20.00%

Abstract:

The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned immediately after another at the same workstation, a setup time must be added when computing the global workstation time; the assignment therefore also determines the task sequence inside each workstation. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which improve upon the heuristic procedures published to date.
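For illustration, a priority-rule construction heuristic of the general kind studied here might look like the following sketch (the data layout, the specific rule and the toy instance are assumptions of this example, not one of the paper's procedures):

```python
# Illustrative priority-rule heuristic for assembly line balancing with
# sequence-dependent setups (a sketch, not one of the paper's procedures).
# Assumes every task fits on its own in an empty workstation.

def balance(tasks, time, prec, setup, cycle):
    """tasks: task ids; time[t]: processing time; prec[t]: set of
    predecessors of t; setup[a][b]: setup added when b follows a in the
    same station; cycle: the cycle-time limit of each workstation."""
    done, stations = set(), []
    while len(done) < len(tasks):
        load, seq, last = 0.0, [], None
        while True:
            # tasks whose predecessors have all been assigned
            ready = [t for t in tasks if t not in done and prec[t] <= done]
            # keep those that still fit, counting the setup after `last`
            fits = [t for t in ready
                    if load + time[t] + (setup[last][t] if last else 0) <= cycle]
            if not fits:
                break
            t = max(fits, key=lambda u: time[u])   # rule: longest task first
            load += time[t] + (setup[last][t] if last else 0)
            seq.append(t)
            done.add(t)
            last = t
        stations.append(seq)
    return stations

# toy instance: four tasks, precedence 1 -> 2, unit setup between any pair
tasks = [1, 2, 3, 4]
time = {1: 4, 2: 3, 3: 5, 4: 2}
prec = {1: set(), 2: {1}, 3: set(), 4: set()}
setup = {a: {b: 1 for b in tasks} for a in tasks}
print(balance(tasks, time, prec, setup, cycle=10))  # [[3, 1], [2, 4]]
```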

Relevance: 20.00%

Abstract:

A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
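For context, the minimum-norm coefficients the abstract refers to are, in standard frame theory (notation ours, not the paper's), the canonical dual-frame coefficients:

\[
x = \sum_k c_k f_k, \qquad c_k = \langle x,\, S^{-1} f_k \rangle, \qquad S x = \sum_k \langle x, f_k \rangle\, f_k,
\]

where \(S\) is the frame operator; among all coefficient sequences that reproduce \(x\), these are the ones of minimum \(\ell^2\) norm.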

Relevance: 20.00%

Abstract:

This paper presents a programming environment for supporting learning in STEM, particularly mobile robotic learning. It was designed to support progressive learning for people with and without previous knowledge of programming and/or robotics. The environment is multi-platform and built with open-source tools. Perception, mobility, communication, navigation and collaborative-behaviour functionalities can be programmed for different mobile robots. A learner is able to program robots using different programming languages and editor interfaces: a graphic programming interface (basic level), an XML-based meta-language (intermediate level) or ANSI C (advanced level). The environment translates programs into the different languages either transparently or explicitly on the learner's demand. Learners can access proposed challenges and learning-by-example interfaces. The environment was designed for extensibility, adaptive interfaces, persistence and low software/hardware coupling. Functionality tests were performed to verify the programming environment's specifications; UV BOT mobile robots were used in these tests.

Relevance: 20.00%

Abstract:

PURPOSE AND METHOD: This questionnaire survey of 190 university music students assessed negative feelings of music performance anxiety (MPA) before performing, the experience of stage fright as a problem, and how closely they are associated with each other. The study further investigated whether the experience of stage fright as a problem and negative feelings of MPA predict the coping behavior of the music students. Rarely addressed coping issues were assessed, i.e., self-perceived effectiveness of different coping strategies, knowledge of possible risks and acceptance of substance-based coping strategies, and need for more support.

RESULTS: The results show that one-third of the students experienced stage fright as a problem and that this was only moderately correlated with negative feelings of MPA. The experience of stage fright as a problem significantly predicted the frequency of use and the acceptance of medication as a coping strategy. Breathing exercises and self-control techniques were rated as being as effective as medication. Finally, students expressed a strong need to receive more support (65%) and more information (84%) concerning stage fright.

CONCLUSION: Stage fright was experienced as a problem and perceived as having negative career consequences by a considerable percentage of the surveyed students. In addition to a desire for more help and support, the students expressed an openness and willingness to seriously discuss and address the topic of stage fright. This provides a necessary and promising basis for optimal career preparation and, hence, an opportunity to prevent occupational problems in professional musicians. [Authors]

Relevance: 20.00%

Abstract:

A wide variety of whole-cell bioreporter and biosensor assays for arsenic detection has been developed over the past decade. The assays permit flexible detection instrumentation while maintaining excellent method detection limits in the environmentally relevant range of 10-50 μg arsenite per L and below. New emerging trends focus on genetic rewiring of reporter cells and/or integration into microdevices for improved detection. A number of case studies have shown realistic field applicability of bioreporter assays.

Relevance: 20.00%

Abstract:

The aim of this thesis is to examine the peso problem and devaluation expectations in the following Latin American countries: Argentina, Brazil, Costa Rica, Uruguay and Venezuela. It also examines whether the peso problem can explain the irregular behavior of interest rates before an actual devaluation takes place. To make this possible, the market's expected probability of devaluation is calculated for the countries under study. The expected probability of devaluation is calculated for the period from January 1996 to December 2006 using two different models. According to the interest-rate-differential model, the market's devaluation expectations can be calculated from the interest rate differential between countries. Second, the probit model uses several macroeconomic factors as explanatory variables when computing the expected probability of devaluation. In addition, the thesis examines how the development of individual macroeconomic variables affects the expected probability of devaluation. The empirical results show that the Latin American countries studied exhibited a peso problem between January 1996 and December 2006. According to the interest-rate-differential model, a peso problem was found in all the countries studied except Argentina; according to the probit model, a peso problem was found in all the countries studied. The results also show that the irregular development of interest rates before an actual devaluation can be explained by the peso problem. The probit-model results further show that there is no single pattern in how the development of macroeconomic variables affects market devaluation expectations in Latin America; rather, the effects appear to be country-specific.
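As a sketch of how the interest-rate-differential approach typically backs out a devaluation probability (a textbook uncovered-interest-parity argument; the notation is assumed here, not taken from the thesis):

\[
i_t - i_t^{*} \approx p_t\, \delta \quad \Longrightarrow \quad p_t \approx \frac{i_t - i_t^{*}}{\delta},
\]

where \(i_t\) and \(i_t^{*}\) are the domestic and foreign interest rates, \(\delta\) the expected size of the devaluation conditional on its occurrence, and \(p_t\) the market-perceived probability of a devaluation within the period.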

Relevance: 20.00%

Abstract:

The hydrological and biogeochemical processes that operate in catchments influence the ecological quality of freshwater systems through the delivery of fine sediment, nutrients and organic matter. Most models that seek to characterise the delivery of diffuse pollutants from land to water are reductionist. The multitude of processes that are parameterised in such models to ensure generic applicability makes them complex and difficult to test on available data. Here, we outline an alternative, data-driven, inverse approach. We apply SCIMAP, a parsimonious risk-based model with an explicit treatment of hydrological connectivity, and take a Bayesian approach to the inverse problem of determining the risk that must be assigned to different land uses in a catchment in order to explain the spatial patterns of measured in-stream nutrient concentrations. We apply the model to identify the key sources of nitrogen (N) and phosphorus (P) diffuse pollution risk in eleven UK catchments covering a range of landscapes. The model results show that: 1) some land uses generate a consistently high or low risk of diffuse nutrient pollution; 2) the risks associated with different land uses vary both between catchments and between nutrients; and 3) the dominant sources of P and N risk in a catchment are often a function of the spatial configuration of land uses. Taken on a case-by-case basis, this type of inverse approach may be used to help prioritise interventions to reduce diffuse pollution risk for freshwater ecosystems.
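A minimal sketch of the kind of Bayesian inversion described (illustrative only: the connectivity weighting, likelihood and priors below are assumptions made for this sketch, not SCIMAP's actual formulation):

```python
# Toy inversion: infer per-land-use diffuse-pollution risk weights that
# explain in-stream nutrient concentrations, via random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(0)

# synthetic data: connectivity-weighted land-use areas draining to each reach
n_reaches, n_uses = 30, 4
A = rng.random((n_reaches, n_uses))
w_true = np.array([0.9, 0.1, 0.5, 0.2])           # "true" risk per land use
y = A @ w_true + rng.normal(0, 0.05, n_reaches)   # measured concentrations

def log_post(w, sigma=0.05):
    if np.any(w < 0) or np.any(w > 1):
        return -np.inf                             # uniform prior on [0, 1]
    r = y - A @ w                                  # predicted vs. measured
    return -0.5 * np.dot(r, r) / sigma**2          # Gaussian likelihood

w = np.full(n_uses, 0.5)
lp = log_post(w)
draws = []
for _ in range(20000):
    prop = w + rng.normal(0, 0.02, n_uses)         # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:        # Metropolis accept step
        w, lp = prop, lp_prop
    draws.append(w)
print(np.mean(draws[5000:], axis=0))               # posterior mean per land use
```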

Relevance: 20.00%

Abstract:

This paper analyzes the possibilities of integrating cost information and engineering design. Special emphasis is put on finding the potential of using the activity-based costing (ABC) method. Today, the problem of cost estimation in engineering design is that there are two separate extremes of knowledge. At one extreme, the engineers model the technical parameters behind costs in great detail but do not get appropriate cost information into their elegant models. At the other extreme, the accounting professionals are stuck with traditional cost accounting methods driven by the procedures and cycles of financial accounting. Therefore, in many cases, the cost information needs of various decision-making groups, for example design engineers, are not served satisfactorily. This paper studies whether the activity-based costing (ABC) method could offer a compromise between the two extremes. Recognizing activities and activity chains, as well as activity and cost drivers, could be especially beneficial for design engineers. Also, knowing the accurate and reliable product costs of existing products helps when doing variant design. However, ABC is not at its best if the cost system becomes too complicated. This is why a comprehensive ABC cost information system with detailed cost information for the use of design engineers should be examined critically. ABC is at its best when considering such issues as which activities drive costs, the cost of product complexity, allocating indirect costs to products, the relationships between processes and costs, and the cost of excess capacity.
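As a toy illustration of the ABC mechanics the paper builds on, overhead is traced to activities and then assigned to products through activity drivers; all figures and activity names below are invented for the example:

```python
# Tiny worked ABC example (all figures hypothetical): indirect costs are
# traced to activities, then assigned to a product via activity drivers.
activity_cost = {"machine setup": 40000.0, "design changes": 24000.0}
driver_volume = {"machine setup": 200, "design changes": 80}  # total driver units
rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

# driver units consumed by one product variant
consumption = {"machine setup": 12, "design changes": 5}
overhead = sum(rate[a] * consumption[a] for a in consumption)
print(f"allocated overhead: {overhead:.2f}")  # 12*200 + 5*300 = 3900.00
```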

Relevance: 20.00%

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes is required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of the others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
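The well-known trusted-third-party solution the abstract recalls can be conveyed by a toy sketch (a minimal model of the core idea, not the dissertation's protocol; all names are invented for the example):

```python
# Toy trusted-third-party (TTP) exchange: the TTP releases items only once
# it holds both, so either both parties receive or neither learns anything.
class TrustedThirdParty:
    def __init__(self):
        self.deposits = {}            # party -> (item, expected description)

    def deposit(self, party, item, expects):
        self.deposits[party] = (item, expects)

    def settle(self, a, b):
        if a not in self.deposits or b not in self.deposits:
            return None               # abort: nothing is revealed
        item_a, a_wants = self.deposits[a]
        item_b, b_wants = self.deposits[b]
        # release only if each deposit matches what the other expects
        if item_a == b_wants and item_b == a_wants:
            return {a: item_b, b: item_a}
        return None

ttp = TrustedThirdParty()
ttp.deposit("alice", item="file.zip", expects="receipt#42")
ttp.deposit("bob", item="receipt#42", expects="file.zip")
print(ttp.settle("alice", "bob"))     # both obtain what they expected
```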

Relevance: 20.00%

Abstract:

Despite the rapid change in today's business environment, there are relatively few studies of corporate renewal. This study aims, for its part, at filling that research gap by studying the concepts of strategy, corporate renewal, innovation and corporate venturing. Its purpose is to enhance our understanding of how established companies operating in a dynamic and global environment can benefit from their corporate venturing activities. The theoretical part approaches the research problem at the corporate and venture levels. Firstly, it focuses on mapping the determinants of strategy and suggests using industry, location, resources, knowledge, structure and culture, market, technology and business model to assess the environment, and using these determinants to optimize the speed and magnitude of change. Secondly, it concludes that the choice of innovation strategy depends on the type and dimensions of the innovation, and suggests assessing the market, technology and business model, as well as the novelty and complexity related to each of them, in order to choose an optimal context for developing innovations further. Thirdly, it directs attention to the processes through which corporate renewal takes place. At the corporate level these processes are identified as strategy formulation, strategy formation and strategy implementation. At the venture level the renewal processes are identified as learning, leveraging and nesting. The theoretical contribution of this study, the framework of strategic corporate venturing, joins corporate- and venture-level management issues together and concludes that strategy processes and linking processes are the mechanism through which continuous corporate renewal takes place. The framework of strategic corporate venturing proposed by this study is a new way to illustrate the role of corporate venturing as a purposefully built, different view of a company's business environment. The empirical part extended the framework by enhancing our understanding of the link between corporate renewal and corporate venturing in its real-life environment in three Finnish companies: Metso, Nokia and TeliaSonera. Characterizing the companies' environment with the determinants of strategy identified in this study provided a structured way to analyze their competitive position and the renewal challenges that they are facing. More importantly, the case studies confirmed that a link between corporate renewal and corporate venturing exists, and found that the link is not as straightforward as the theory indicates. Furthermore, the case studies enhanced the framework by indicating a sequence according to which the processes work. Firstly, the induced strategy processes, strategy formulation and strategy implementation, set the scene for the corporate venturing context and management processes and leave strategy formation to the venture. Only after that can strategies formed by ventures come back to the corporate level and, if found viable at the corporate level, be formalized through formulation and implementation. With the help of the framework of strategic corporate venturing, the link between corporate renewal and corporate venturing can be found and managed. The suggested response to the continuous need for change is continuous renewal, i.e. institutionalizing corporate renewal in the strategy processes of the company. As far as benefiting from venturing is concerned, the answer lies in deliberately managing venturing in a context different from the mainstream businesses and establishing efficient linking processes to exploit the renewal potential of individual ventures.

Relevance: 20.00%

Abstract:

The patent system was created for the purpose of promoting innovation by granting the inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how it is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits. Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change the patent system is facing, and of how these challenges are reflected in standard setting.