926 results for Elementary Methods In Number Theory
Abstract:
Several methods are used to estimate the anaerobic threshold (AT) during exercise. The aim of the present study was to compare the AT obtained by a visual graphic method for estimating ventilatory and metabolic variables (gold standard) with a bi-segmental linear regression mathematical model based on Hinkley's algorithm applied to heart rate (HR) and carbon dioxide output (VCO2) data. Thirteen young (24 ± 2.63 years old) and 16 postmenopausal (57 ± 4.79 years old) healthy, sedentary women underwent a continuous incremental ergospirometric test on an electromagnetically braked cycle ergometer with 10 to 20 W/min increments until physical exhaustion. Ventilatory variables were recorded breath by breath, and HR was obtained beat by beat in real time. Data were analyzed by the nonparametric Friedman test and the Spearman correlation test, with the level of significance set at 5%. Power output (W), HR (bpm), oxygen uptake (VO2; mL kg⁻¹ min⁻¹), VO2 (mL/min), VCO2 (mL/min), and minute ventilation (VE; L/min) at the AT level were similar for both methods in both groups studied (P > 0.05). The VO2 (mL kg⁻¹ min⁻¹) data showed a significant correlation (P < 0.05) between the gold standard method and the mathematical model applied to the HR (r_s = 0.75) and VCO2 (r_s = 0.78) data for the subjects as a whole (N = 29). The proposed mathematical method for detecting changes in the response patterns of VCO2 and HR proved adequate and promising for AT detection in young and middle-aged women, providing a semi-automatic, non-invasive and objective AT measurement.
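At its core, the bi-segmental model is a breakpoint search: two regression lines are fitted on either side of each candidate change point, and the split that minimizes the total residual error marks the AT. The following is a minimal illustrative sketch of that idea (a brute-force search; the function name and approach are ours, not the authors' implementation of Hinkley's algorithm):

```python
import numpy as np

def bisegmental_breakpoint(x, y):
    """Index k that best splits (x, y) into two least-squares lines,
    i.e. a candidate anaerobic threshold in a VCO2-vs-power series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    best_k, best_sse = None, np.inf
    for k in range(3, len(x) - 3):               # keep >= 3 points per segment
        sse = 0.0
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            A = np.vstack([xs, np.ones_like(xs)]).T
            coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
            resid = ys - A @ coef
            sse += float(resid @ resid)          # residual sum of squares
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k
```

Applied to breath-by-breath VCO2 or beat-by-beat HR against power output, the returned index estimates the workload at which the response pattern changes.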
Abstract:
Fluid handling systems such as pump and fan systems have significant potential for energy efficiency improvements. To realize this energy saving potential, easily implementable methods are needed to monitor the system output, since this information is required both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve monitoring accuracy, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method combines already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
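One widely known operating point estimation method of the kind the paper builds on is the QP-curve method: the drive already knows its rotational speed and shaft power, so the pump's rated power curve can be scaled to the actual speed with the affinity laws and inverted to estimate flow. A minimal sketch with an invented example curve (the numbers and function name are illustrative only):

```python
import numpy as np

# Nominal pump curve at rated speed n0 (from a manufacturer datasheet):
# flow rate Q vs. shaft power P. Values are illustrative placeholders.
n0 = 1450.0                               # rated speed, rpm
Q0 = np.array([0, 20, 40, 60, 80])        # m^3/h
P0 = np.array([2.0, 3.1, 4.0, 4.6, 5.0])  # kW

def flow_from_power(P_meas, n_meas):
    """QP-curve method: scale the rated power curve to the actual speed
    with the affinity laws (Q ~ n, P ~ n^3), then invert it by
    interpolation to estimate flow from measured shaft power."""
    ratio = n_meas / n0
    Q_n = Q0 * ratio          # flow scales linearly with speed
    P_n = P0 * ratio**3       # power scales with the cube of speed
    return np.interp(P_meas, P_n, Q_n)

# Example: drive reports 2.4 kW shaft power at 1200 rpm
print(flow_from_power(2.4, 1200.0))       # estimated flow, m^3/h
```

The QH-curve method works analogously with head instead of power; combining such estimators over the operating regions where each is accurate is consistent with the combination strategy the abstract describes.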
Abstract:
Phenomena in the cyber domain, especially threats to security and privacy, have become an increasingly heated topic addressed by different writers and scholars at an increasing pace, both nationally and internationally. However, little public research has been done on the subject of cyber intelligence. The main research question of the thesis was: To what extent is the applicability of cyber intelligence acquisition methods circumstantial? The study was conducted in a sequential manner, starting with defining the concept of intelligence in the cyber domain and identifying its key attributes, followed by identifying the range of intelligence methods in the cyber domain, the criteria influencing their applicability, and the types of operatives utilizing cyber intelligence. The methods and criteria were refined into a hierarchical model. Existing conceptions of cyber intelligence were mapped through an extensive literature study covering a wide variety of sources. The established understanding was further developed through 15 semi-structured interviews with experts from different backgrounds, whose wide range of viewpoints proved to substantially enhance the perspective on the subject. Four of the interviewed experts participated in a relatively extensive survey based on the constructed hierarchical model of cyber intelligence, which was formulated as an AHP hierarchy and executed in the Expert Choice Comparion online application. It was concluded that intelligence in the cyber domain is an endorsing, cross-cutting intelligence discipline that adds value to all aspects of conventional intelligence, that it bears a substantial number of characteristic traits, both advantageous and disadvantageous, and that the applicability of cyber intelligence methods is partly circumstantially limited.
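For readers unfamiliar with AHP, the method converts expert pairwise comparisons into priority weights via the principal eigenvector of the comparison matrix, with a consistency ratio guarding against contradictory judgments. A self-contained sketch (the matrix values are invented for illustration; this is not the thesis's actual hierarchy or data):

```python
import numpy as np

# Illustrative pairwise-comparison matrix for three criteria on Saaty's
# 1-9 scale (entry [i, j] = how strongly criterion i is preferred over j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = normalized principal right eigenvector.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w /= w.sum()

# Saaty's consistency index and ratio (random index RI = 0.58 for 3x3).
CI = (vals.real[k] - len(A)) / (len(A) - 1)
CR = CI / 0.58
print(w, CR)   # weights and consistency ratio (CR < 0.1 is acceptable)
```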
Abstract:
Optimization of quantum measurement processes plays a pivotal role in carrying out better, i.e., more accurate or less disruptive, measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in quantum-compatibility-related questions. In particular, we see that a compatible device pair where one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question of whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with convex analysis of special restricted quantum device sets, covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are presented, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we examine the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, which measures how 'close' a quantum apparatus is to the algebraic boundary of the device set, and the robustness of incompatibility, which quantifies the level of incompatibility of a quantum device pair by measuring the highest amount of noise the pair tolerates without becoming compatible. Boundariness is further associated with minimum-error discrimination of quantum devices, and the robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of the robustness of incompatibility is given for a few special device pairs.
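As orientation for the last point, one common way the literature formalizes the robustness of incompatibility of a device pair is as the minimal relative amount of noise that renders the pair compatible (the notation here is illustrative and not necessarily the thesis's own):

```latex
R(\mathcal{D}_1,\mathcal{D}_2) \;=\; \inf\Big\{\, t \ge 0 \;:\; \exists\,
\text{noise devices } \mathcal{N}_1, \mathcal{N}_2 \text{ such that }
\tfrac{1}{1+t}\big(\mathcal{D}_i + t\,\mathcal{N}_i\big),\ i = 1,2,
\text{ are compatible} \,\Big\}
```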
Abstract:
Consumer neuroscience (neuromarketing) is an emerging field of marketing research that uses brain imaging techniques to study the neural conditions and processes underlying consumption. The purpose of this study was to map this fairly new and growing field in Finland by studying the opinions of both Finnish consumers and marketing professionals towards it, comparing those opinions to the current consumer neuroscience literature, and on that basis evaluating the usability of brain imaging techniques as a marketing research method. A mixed methods research design was chosen for this study. Quantitative data were collected from 232 consumers and 28 marketing professionals by means of online surveys. Both respondent groups had either neutral opinions or lacked knowledge about the four themes chosen for this study: benefits, limitations and challenges, ethical issues, and future prospects of consumer neuroscience. Qualitative interview data were collected from two individuals from Finnish neuromarketing companies to deepen the insights gained from the quantitative research. The four interview themes were the same as in the surveys, and the interviewees' answers were mostly in line with the current literature, although more optimistic about the future of the field. The interviews also exposed a gap between academic consumer neuroscience research and practical-level applications. The results of this study suggest that there are still many unresolved challenges and that the relevant populations either have neutral opinions or lack information about consumer neuroscience. Practical-level applications are, however, already being used successfully, and this new field of marketing research is growing both globally and in Finland.
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. Veracity assessment of the gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?
- Accuracy, i.e. the probability of detecting deception successfully
- Ease of Use, i.e. how easily the method can be applied correctly
- Time Required to apply the method reliably
- No Need for Special Equipment
- Unobtrusiveness of the method
In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof, and what kinds of uncertainty and other limitations do these methods involve? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that the most applicable methods are not entirely trouble-free either. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are built around a scenario where roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones still under development.
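The aggregation step of such a Multi Criteria Analysis is straightforward once AHP weights exist: each method's overall applicability is the weighted sum of its per-criterion scores. A minimal sketch with placeholder numbers (the weights and scores below are invented for illustration, not the study's results):

```python
# Illustrative multi-criteria aggregation: each veracity assessment method
# receives a weighted sum of its per-criterion scores (0-1 scale).
criteria = ["accuracy", "ease_of_use", "time", "no_equipment", "unobtrusive"]
weights  = {"accuracy": 0.40, "ease_of_use": 0.20, "time": 0.15,
            "no_equipment": 0.15, "unobtrusive": 0.10}   # sum to 1.0

scores = {  # placeholder scores, not the study's data
    "Features of Discourse": {"accuracy": 0.6, "ease_of_use": 0.9, "time": 0.9,
                              "no_equipment": 1.0, "unobtrusive": 0.9},
    "Polygraph":             {"accuracy": 0.7, "ease_of_use": 0.3, "time": 0.2,
                              "no_equipment": 0.0, "unobtrusive": 0.1},
}

ranking = sorted(((sum(weights[c] * s[c] for c in criteria), name)
                  for name, s in scores.items()), reverse=True)
for total, name in ranking:
    print(f"{name}: {total:.2f}")   # overall applicability score
```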
Abstract:
Interpretation has been used in many tourism sectors as a technique for building harmony between resources and human needs. The objectives of this study were to identify the types of interpretive methods used in marine parks and to evaluate their effectiveness. This study reviews the design principles of effective interpretation for marine wildlife tourism and adopts Orams' (1997) five design principles into a conceptual framework. Increase in enjoyment, knowledge gain, attitude and intention change, and behaviour modification were used as key indicators in assessing the interpretive effectiveness of the Vancouver Aquarium (VA) and Marineland Canada (MC). Since on-site research was unavailable, a virtual tour was created to represent the interpretive experiences at the two study sites. Self-administered questionnaires were used to measure responses. By comparing responses to the questionnaires (pre-virtual tour, post-virtual tour and follow-up), this study found that interpretation increased enjoyment and added to respondents' knowledge. Although the changes in attitudes and intentions were not significant, the findings indicate that attitude and intention changes did occur as a result of interpretation, but only to a limited extent. The overall results suggest that more techniques should be added to enhance the effectiveness of interpretation in marine parks and self-guiding tours, and that, with careful design, virtual tours are an innovative interpretation technique for marine parks and informal educational facilities.
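The pre/post questionnaire design implies a paired comparison per respondent. A minimal sketch of how knowledge gain could be tested (the scores below are invented placeholders, and the Wilcoxon signed-rank test is one reasonable choice for ordinal questionnaire data, not necessarily the test the study used):

```python
# Paired pre/post comparison of per-respondent knowledge scores.
from scipy.stats import wilcoxon

pre  = [3, 4, 2, 5, 3, 4, 2, 3, 4, 3]   # scores before the virtual tour
post = [4, 5, 3, 5, 4, 5, 3, 4, 5, 4]   # scores after the tour
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.3f}")        # small p -> significant knowledge gain
```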
Abstract:
We provide an algorithm that automatically derives many provable theorems in the equational theory of allegories. This was accomplished by noticing properties of an existing decision algorithm that could be extended to provide a derivation in addition to a decision certificate. We also suggest improvements and corrections to previous research in order to motivate further work on a complete derivation mechanism. The results presented here are significant for those interested in relational theories, since we essentially have a subtheory where automatic proof-generation is possible. This is also relevant to program verification since relations are well-suited to describe the behaviour of computer programs. It is likely that extensions of the theory of allegories are also decidable and possibly suitable for further expansions of the algorithm presented here.
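The general idea of extending a decision procedure into a derivation-producing one can be illustrated with a toy normalizer that logs every rewrite step: comparing normal forms decides the equation, and the logged steps assemble into a derivation. This sketch is purely illustrative (toy string rewriting, not the allegory axioms or the paper's algorithm):

```python
def normalize(term, rules):
    """Rewrite to a fixed point, logging each applied rule as a proof step."""
    steps = []
    changed = True
    while changed:
        changed = False
        for name, rule in rules:
            new = rule(term)
            if new != term:
                steps.append((name, term, new))   # derivation step
                term, changed = new, True
                break
    return term, steps

# Toy rule set on strings standing in for relational terms: eliminate
# double converse "~~" and composition with the identity ";1".
rules = [("conv-conv", lambda t: t.replace("~~", "")),
         ("comp-id",   lambda t: t.replace(";1", ""))]

lhs, lhs_steps = normalize("R~~;1", rules)
rhs, rhs_steps = normalize("R", rules)
print(lhs == rhs, lhs_steps)   # decision plus the recorded derivation
```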
Abstract:
In the context of multivariate linear regression (MLR) and seemingly unrelated regressions (SURE) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose finite- and large-sample likelihood-based test procedures for possibly non-linear hypotheses on the coefficients of MLR and SURE systems.
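For orientation, the Gaussian likelihood ratio criterion for testing coefficient restrictions in such systems takes the familiar determinant form (a standard result stated here for context, not a formula quoted from the paper):

```latex
LR \;=\; n \,\ln\!\frac{|\hat{\Sigma}_0|}{|\hat{\Sigma}_1|}
```

where $\hat{\Sigma}_0$ and $\hat{\Sigma}_1$ are the restricted and unrestricted residual covariance estimates. It is the asymptotic $\chi^2$ approximation to this statistic that overrejects in finite samples, which is the problem the proposed finite-sample procedures address.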
Abstract:
Philippe van Parijs (2003) has argued that an egalitarian ethos cannot be part of a post-Political Liberalism Rawlsian view of justice, because the demands of political justice are confined to principles for institutions of the basic structure alone. This paper argues, by contrast, that certain principles for individual conduct, including a principle requiring relatively advantaged individuals to sometimes make their economic choices with the aim of maximising the prospects of the least advantaged, are an integral part of a Rawlsian political conception of justice. It concludes that incentive payments will have a clearly limited role in a Rawlsian theory of justice.
Abstract:
The nature of acids in an aqueous environment is of prime importance in many aspects of chemistry and biology. The defining characteristic of an acid is its ability to transfer a proton to a water molecule or to any base, but this process is not as simple as it may appear. On the contrary, it can be extremely complex and depend crucially on the solvation of the various reaction intermediates involved. This thesis describes computational studies, based on ab initio molecular dynamics simulations, that aim to obtain a molecular-scale description of the various proton transfer processes between acids and bases in aqueous media. To this end, we studied a series of systems, including aqueous hydrofluoric acid, aqueous trifluoroacetic acid, and a model system consisting of a phenol and a carboxylate moiety linked by a water molecule in aqueous solution. Two intermediate states were identified for the transfer of a proton from an acid. These intermediates appear to be stabilized by a local solvation pattern of hydrogen bonds. Their spectroscopic signatures were characterized by means of infrared spectroscopy, using an ab initio molecular dynamics formalism that includes nuclear quantum effects explicitly. This study also identified three elementary reaction pathways responsible for the transfer of a proton from an acid to a base, along with their characteristic time scales. The conclusions drawn from these studies are discussed in detail at the molecular level, with an emphasis on comparisons between the theoretical results and experimental measurements obtained from the literature or through collaborators.
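A standard computational route from such simulations to the infrared signatures mentioned above is the Fourier transform of the total dipole moment autocorrelation function. A minimal classical sketch (illustrative only; it omits the nuclear quantum effects and quantum correction factors a production analysis would include):

```python
import numpy as np

def ir_spectrum(M, dt_fs):
    """Classical IR line shape from a dipole trajectory.
    M: (n_steps, 3) array of total dipole vectors, sampled every dt_fs fs."""
    M = np.asarray(M, float)
    M = M - M.mean(axis=0)                         # remove the static dipole
    n = len(M)
    # FFT-based autocorrelation of each Cartesian component, then summed
    f = np.fft.fft(M, n=2 * n, axis=0)
    acf = np.fft.ifft(f * f.conj(), axis=0).real[:n].sum(axis=1)
    acf /= acf[0]                                  # normalize at zero lag
    spectrum = np.abs(np.fft.rfft(acf))            # line shape, arbitrary units
    freq_hz = np.fft.rfftfreq(n, d=dt_fs * 1e-15)  # sampling interval in seconds
    wavenumber = freq_hz / 2.9979e10               # convert Hz to cm^-1
    return wavenumber, spectrum
```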