955 results for arguments by definition


Relevance:

80.00%

Publisher:

Abstract:

Self-adaptive systems have the capability to autonomously modify their behavior at run-time in response to changes in their environment. Self-adaptation is particularly necessary for applications that must run continuously, even under adverse conditions and changing requirements; sample domains include automotive systems, telecommunications, and environmental monitoring systems. While a few techniques have been developed to support the monitoring and analysis of requirements for adaptive systems, limited attention has been paid to the actual creation and specification of requirements of self-adaptive systems. As a result, self-adaptivity is often constructed in an ad-hoc manner. In order to support the rigorous specification of adaptive systems requirements, this paper introduces RELAX, a new requirements language for self-adaptive systems that explicitly addresses uncertainty inherent in adaptive systems. We present the formal semantics for RELAX in terms of fuzzy logic, thus enabling a rigorous treatment of requirements that include uncertainty. RELAX enables developers to identify uncertainty in the requirements, thereby facilitating the design of systems that are, by definition, more flexible and amenable to adaptation in a systematic fashion. We illustrate the use of RELAX on smart home applications, including an adaptive assisted living system.
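
As a rough illustration of how a fuzzy-logic semantics can soften a crisp requirement, the sketch below gives a RELAX-style "AS CLOSE AS POSSIBLE TO" constraint a triangular membership function. The operator name, the triangular shape, and the assisted-living temperature example are illustrative assumptions, not the formal semantics defined in the paper.

```python
# Hypothetical sketch of fuzzy-logic semantics for a RELAX-style requirement.
# The triangular membership function and the setpoint example are assumptions.

def as_close_as_possible_to(target: float, tolerance: float):
    """Return a fuzzy membership function for 'AS CLOSE AS POSSIBLE TO target'.

    Satisfaction is 1.0 at the target and decays linearly to 0.0 at
    +/- tolerance (a triangular membership function, assumed here).
    """
    def membership(observed: float) -> float:
        deviation = abs(observed - target)
        return max(0.0, 1.0 - deviation / tolerance)
    return membership

# Example: an assisted-living requirement relaxed around a 21 C setpoint.
room_temperature_ok = as_close_as_possible_to(target=21.0, tolerance=3.0)
print(room_temperature_ok(21.5))  # ~0.83: partially satisfied
print(room_temperature_ok(25.0))  # 0.0: outside the tolerated range
```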

Relevance:

80.00%

Publisher:

Abstract:

In this paper, an ontogenic artificial neural network (ANN) is proposed. The network uses orthogonal activation functions that allow a significant reduction in computational complexity. Another advantage is numerical stability, because the system of activation functions is linearly independent by definition. A learning procedure for the proposed ANN with guaranteed convergence to the global minimum of the error function in parameter space is developed. An algorithm for network structure adaptation is also proposed. The algorithm allows adding or deleting a node in real time without retraining the network. Simulation results confirm the efficiency of the proposed approach.
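
A minimal sketch of the idea, assuming Chebyshev polynomials as the orthogonal activation system (the abstract does not name a specific basis): the basis functions are linearly independent by construction, the output weights follow from a linear least-squares fit, and growing the network by one node only appends one column to the design matrix.

```python
# Illustrative single-layer network with an orthogonal activation basis.
# Chebyshev polynomials are used here only as an example of a linearly
# independent (orthogonal) system; the paper does not commit to this choice.
import numpy as np

def chebyshev_features(x: np.ndarray, order: int) -> np.ndarray:
    """Expand inputs in [-1, 1] into Chebyshev polynomials T_0 .. T_order."""
    feats = [np.ones_like(x), x]
    for n in range(2, order + 1):
        feats.append(2.0 * x * feats[-1] - feats[-2])  # T_n = 2x*T_{n-1} - T_{n-2}
    return np.stack(feats[: order + 1], axis=1)

# Least-squares fit of the output weights; adding or removing a basis function
# (a "node") only changes one column of the design matrix.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = np.sin(np.pi * x) + 0.05 * rng.normal(size=x.size)

Phi = chebyshev_features(x, order=7)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("training MSE:", np.mean((Phi @ weights - y) ** 2))
```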

Relevance:

80.00%

Publisher:

Abstract:

Владимир Тодоров, Петър Стоев - This note contains an elementary construction of a set with the properties stated in the title. In addition, we note that the set obtained in this way remains totally disconnected even after being extended by a finite number of elements.

Relevance:

80.00%

Publisher:

Abstract:

Assessing the sustainability of alternatives, scenarios, technologies, etc. is, by definition, a multidimensional problem: when selecting an alternative course of action, decision makers must simultaneously consider environmental, economic and social aspects. Such decisions can be aided by multi-criteria decision analysis (MCDA). This paper investigates the applicability of seven MCDA methodologies in participatory settings: MAU, the Analytic Hierarchy Process (AHP), the ELECTRE, PROMETHEE, REGIME and NAIADE methods, and the "ideal and reference point" approaches. It is based on a series of reports in which more than 30 real-world case studies focusing on participatory MCDA were reviewed. No single method dominates the others; some methods fit certain decision problems better than others. However, by combining these methodologies, some of the complementary benefits of the different techniques can be exploited.
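
As a concrete taste of one family on this list, here is a minimal ideal-point (TOPSIS-like) ranking in Python. The alternatives, criteria scores and weights are invented for illustration; the paper compares methodologies rather than prescribing this particular computation.

```python
# Minimal sketch of an ideal-point style ranking (TOPSIS-like).
# Alternatives, criteria and weights below are invented for illustration.
import numpy as np

# Rows: alternatives (e.g. scenarios); columns: environmental, economic,
# social scores, all oriented so that larger is better.
scores = np.array([
    [0.7, 0.4, 0.8],
    [0.5, 0.9, 0.6],
    [0.9, 0.3, 0.5],
])
weights = np.array([0.4, 0.35, 0.25])

norm = scores / np.linalg.norm(scores, axis=0)        # vector-normalize each criterion
weighted = norm * weights
ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)

d_plus = np.linalg.norm(weighted - ideal, axis=1)     # distance to the ideal point
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)              # higher = closer to ideal
print("ranking (best first):", np.argsort(-closeness))
```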

Relevance:

80.00%

Publisher:

Abstract:

Measuring and comparing the sustainability of certain actions, scenarios, technologies, etc. is, by definition, a multidimensional problem. Decision makers must consider environmental, economic and social aspects when choosing an alternative course of action, and such decisions can be aided by multi-criteria decision analysis (MCDA). This paper investigates the applicability of the seven most important MCDA methodologies in participatory settings: MAU, the Analytic Hierarchy Process (AHP), the ELECTRE, PROMETHEE, REGIME and NAIADE methods, and the "ideal and reference point" approaches. It is based on a series of reports in which over 30 real-world case studies focusing on participatory MCDA were reviewed. There is, however, no "best" choice among the MCDA techniques; some methods fit certain decision problems better than others. Nonetheless, some of the complementary benefits of the different techniques can be exploited by combining these methodologies.
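
To complement the ideal-point sketch above, the fragment below shows the core AHP step of deriving criterion weights from a pairwise comparison matrix via its principal eigenvector. The 3x3 judgments are made up for the example and do not come from the reviewed case studies.

```python
# Illustrative AHP fragment: criterion weights from a pairwise comparison
# matrix (environmental vs economic vs social); judgments are hypothetical.
import numpy as np

pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency check (Saaty): CI / RI, with RI = 0.58 for a 3x3 matrix.
ci = (eigvals.real[principal] - 3) / (3 - 1)
print("weights:", weights.round(3), "consistency ratio:", round(ci / 0.58, 3))
```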

Relevance:

80.00%

Publisher:

Abstract:

Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done for only the major roads, thus leaving most of the local roads without any AADT information. However, AADTs are needed for local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method involves a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses the tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing using ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County in Florida were used. The estimated AADTs were compared with those from two existing methods using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize the impact.
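
A schematic version of the trip-distribution step may make the pipeline clearer: parcel-level productions are spread over the existing count sites with a gravity model before the all-or-nothing assignment. The parcel productions, attraction proxies, travel times and the power friction function below are hypothetical, not values from the dissertation.

```python
# Schematic gravity-model distribution of parcel-generated trips to count
# sites. All numbers are hypothetical.
import numpy as np

productions = np.array([120.0, 80.0, 200.0])          # trips generated per parcel
attractions = np.array([150.0, 250.0])                 # attraction proxies at count sites
travel_time = np.array([[4.0, 9.0],                    # minutes from parcel i to site j
                        [6.0, 5.0],
                        [10.0, 3.0]])

friction = travel_time ** -2.0                          # assumed power friction function
weights = attractions * friction
trips = productions[:, None] * weights / weights.sum(axis=1, keepdims=True)
print(trips.round(1))                                   # parcel-to-site trip table
print("site totals:", trips.sum(axis=0).round(1))       # inputs to all-or-nothing assignment
```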

Relevance:

80.00%

Publisher:

Abstract:

One of several techniques applied to oil production processes is artificial lift, which uses equipment to reduce the bottom-hole pressure, providing a pressure differential that results in an increased flow rate. The choice of artificial lift method depends on a detailed analysis of several factors, such as the initial costs of installation and maintenance and the conditions existing in the producing field. The Electrical Submersible Pumping (ESP) method is quite efficient when the objective is to produce high liquid flow rates in both onshore and offshore environments, under adverse temperature conditions and in the presence of viscous fluids. By definition, ESP is an artificial lift method in which a subsurface electric motor transforms electrical into mechanical energy to drive a multi-stage centrifugal pump, each stage composed of a rotating impeller (rotor) and a stationary diffuser (stator). The pump converts the mechanical energy of the motor into kinetic energy in the form of velocity, which pushes the fluid to the surface. The objective of this work is to apply the flexible-polyhedron optimization method, known as the Modified Simplex Method (MSM), to study the influence of modifying the inlet and outlet parameters of the centrifugal pump impeller channel in an ESP system. By using the optimization method to vary the angular parameters of the pump, the simulation results yielded optimized values of the head (lift height), the lossless efficiency and the power.
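
The sketch below runs a flexible-polyhedron (Nelder-Mead) search over two impeller blade angles using SciPy's implementation of the same simplex idea. The quadratic head surrogate and the angle values are stand-ins for illustration only, not the simulation model used in the work.

```python
# Sketch of the flexible-polyhedron (Nelder-Mead) search over impeller blade
# angles. The objective is a hypothetical surrogate, not the pump model used
# in the work.
import numpy as np
from scipy.optimize import minimize

def negative_head(angles: np.ndarray) -> float:
    """Hypothetical surrogate: head (m) as a smooth function of the impeller
    inlet and outlet blade angles (degrees); negated for minimization."""
    inlet, outlet = angles
    head = 30.0 - 0.02 * (inlet - 22.0) ** 2 - 0.03 * (outlet - 35.0) ** 2
    return -head

result = minimize(negative_head, x0=np.array([18.0, 30.0]), method="Nelder-Mead")
print("optimal angles (deg):", result.x.round(2), "head (m):", round(-result.fun, 2))
```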

Relevance:

80.00%

Publisher:

Abstract:

I present here a sequence of short videos, Scenes of Provincial Life, forming a unified, ongoing online work. In my written commentary I discuss the work's context, genesis, facture and presentation, and thereby demonstrate its claim to originality as an art work. I go on to suggest one possible interpretive framework for it. I then discuss the nature of art works as candidates for the generation of new knowledge and conclude that art works in general fulfil this function, in a very carefully defined way, as a necessary condition of being art works. I further connect the success of any work as an art work with the richness of its knowledge-generating capacity, inseparably allied to its aesthetic force. I conclude that if Scenes of Provincial Life is seen to have value as an art work it will therefore, by definition, be a creator of new knowledge.

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION: Congenital erythrocytosis is by definition present from birth. Patients frequently present in childhood or as young adults, and a family history may be present. The erythrocytosis can be primary, where there is a defect in the erythroid compartment, or secondary, where increased erythropoietin production due to the defect leads to an erythrocytosis.

MATERIAL AND METHODS: Primary causes include erythropoietin receptor mutations. Congenital secondary causes include mutations in the genes involved in the oxygen-sensing pathway and haemoglobins with abnormal oxygen affinity. Investigations for the cause include an erythropoietin level, oxygen dissociation curve, haemoglobin electrophoresis and sequencing for known gene variants.

RESULTS: The finding of a known or new molecular variant confirms a diagnosis of congenital erythrocytosis. A congenital erythrocytosis may be an incidental finding but nonspecific symptoms are described. Major thromboembolic events have been noted in some cases. Low-dose aspirin and venesection are therapeutic manoeuvres which should be considered in managing these patients.

CONCLUSIONS: Rare individuals presenting often at a young age may have a congenital erythrocytosis. Molecular investigation may reveal a lesion. However, in the majority, currently no defect is identified.

Relevance:

80.00%

Publisher:

Abstract:

Using two-year longitudinal data from a large sample of US employees from a service-related organization, the present study investigates the relative effects of three forms of pay-for-performance (PFP) plans on employees’ job performance (incentive effects) and voluntary turnover (sorting effects). The study differentiates between three forms of pay: merit pay, individual-based bonuses, and long-term incentives. By definition, these PFP plans have different structural elements that distinguish them from each other (i.e., pay plan form) and different characteristics (functionality), such as the degree to which pay and performance are linked and the size of the rewards, which can vary both within and across plan types. Our results provide evidence that merit raises have larger incentive and sorting effects than bonuses and long-term incentives in multi-PFP plan environments where the three PFP plans are operating simultaneously. Only merit pay has both incentive and sorting effects among the three PFP plans. The implications for PFP-related theory, as well as for the design and implementation of PFP plans, are discussed.

Relevance:

80.00%

Publisher:

Abstract:

Cognitive deficits are present in patients with cancer. Cognitive tests such as the Montreal Cognitive Assessment have proved to be insufficiently specific, unable to detect mild deficits, and not linear. To overcome these limitations we developed a simple, brief cognitive questionnaire adapted to the cognitive dimensions affected in patients with cancer, the FaCE (« The Fast Cognitif Evaluation »), using Rasch modelling (RM). RM is a probabilistic mathematical method that determines the conditions under which a tool can be considered a measurement scale, and it is independent of the sample. If the results fit the model, the measurement scale is linear with equal intervals. Responses are based on the ability of the subjects and the difficulty of the items. The item map makes it possible to select the items best suited to the assessment of each cognitive aspect and to reduce their number to a minimum. The unidimensionality analysis assesses whether the tool measures a dimension other than the intended one. The results of analyses conducted on 165 patients show that the FaCE distinguishes patients' abilities with excellent reliability and at sufficiently distinct levels (person-reliability index = 0.86; person-separation index = 2.51). The population size and the number of items are sufficient for the items to be ranked reliably and precisely (item reliability = 0.99; item-separation index = 8.75). The item map shows a good spread of the items and a linear score with no ceiling effect. Finally, unidimensionality is respected and the mean completion time is about 6 minutes. By definition, RM ensures the linearity and continuity of the measurement scale. We succeeded in developing a brief, simple, quick questionnaire adapted to the cognitive deficits of patients with cancer. The FaCE could also serve as a reference measure for future research in this field.
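
For reference, the dichotomous form of the Rasch model underlying this kind of analysis is sketched below: the probability of success depends only on the gap between person ability and item difficulty. The ability and difficulty values are illustrative, and the FaCE itself may use a polytomous variant.

```python
# Dichotomous Rasch model: P(correct) depends only on ability minus difficulty.
# The theta and b values below are illustrative.
import math

def rasch_probability(theta: float, b: float) -> float:
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

for theta in (-1.0, 0.0, 1.5):
    print(f"ability {theta:+.1f}:",
          [round(rasch_probability(theta, b), 2) for b in (-1.0, 0.0, 2.0)])
```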

Relevance:

80.00%

Publisher:

Abstract:

In economics of information theory, credence products are those whose quality is difficult or impossible for consumers to assess, even after they have consumed the product (Darby & Karni, 1973). This dissertation is focused on the content, consumer perception, and power of online reviews for credence services. Economics of information theory has long assumed, without empirical confirmation, that consumers will discount the credibility of claims about credence quality attributes. The same theories predict that because credence services are by definition obscure to the consumer, reviews of credence services are incapable of signaling quality. Our research aims to question these assumptions. In the first essay we examine how the content and structure of online reviews of credence services systematically differ from the content and structure of reviews of experience services and how consumers judge these differences. We have found that online reviews of credence services have either less important or less credible content than reviews of experience services and that consumers do discount the credibility of credence claims. However, while consumers rationally discount the credibility of simple credence claims in a review, more complex argument structure and the inclusion of evidence attenuate this effect. In the second essay we ask, “Can online reviews predict the worst doctors?” We examine the power of online reviews to detect low quality, as measured by state medical board sanctions. We find that online reviews are somewhat predictive of a doctor’s suitability to practice medicine; however, not all the data are useful. Numerical or star ratings provide the strongest quality signal; user-submitted text provides some signal but is subsumed almost completely by ratings. Of the ratings variables in our dataset, we find that punctuality, rather than knowledge, is the strongest predictor of medical board sanctions. These results challenge the definition of credence products, which is a long-standing construct in economics of information theory. Our results also have implications for online review users, review platforms, and for the use of predictive modeling in the context of information systems research.
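
A hedged sketch of the kind of predictive model described in the second essay: a logistic regression relating review ratings to the odds of a medical board sanction. The data are synthetic and the coefficients are assumptions; only the variable names echo the abstract's finding that punctuality is the strongest predictor.

```python
# Synthetic illustration of a logistic regression linking review ratings to
# sanction risk. Data and effect sizes are invented; variable names echo the
# abstract (punctuality, knowledge, overall star rating).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500
punctuality = rng.uniform(1, 5, n)
knowledge = rng.uniform(1, 5, n)
stars = rng.uniform(1, 5, n)

# Assume low punctuality raises sanction risk, per the abstract's finding.
logit = -1.0 - 1.2 * (punctuality - 3) - 0.2 * (knowledge - 3) - 0.4 * (stars - 3)
sanctioned = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([punctuality, knowledge, stars])
model = LogisticRegression().fit(X, sanctioned)
print(dict(zip(["punctuality", "knowledge", "stars"], model.coef_[0].round(2))))
```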

Relevance:

80.00%

Publisher:

Abstract:

Museum libraries must, by definition, respond to the objectives of the institution to which they belong. The purpose of this research is to identify and document the work of the libraries of the historical museums of the Ministerio de Cultura y Juventud de Costa Rica, so as to provide an overview of this national reality as the starting point of the analysis. In addition, it aims to offer guidelines for their management, since professionals have found themselves having to run and set up museum libraries using guidelines intended for conventional libraries.

Relevance:

80.00%

Publisher:

Abstract:

Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
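
A minimal sketch of the transformed-stationary idea, assuming a simple running-window normalization: the series is standardized by its running mean and standard deviation, and a stationary GEV is then fitted to the annual maxima of the normalized signal. The synthetic data and window length are illustrative; the tsEva toolbox cited above implements the full method.

```python
# Minimal transformed-stationary sketch: running-window normalization followed
# by a stationary GEV fit on annual maxima. Data and window are illustrative;
# edge effects of the convolution are ignored in this sketch.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
years, per_year = 40, 365
trend = np.linspace(0.0, 1.5, years * per_year)          # slow non-stationarity
series = trend + rng.gumbel(loc=0.0, scale=0.5, size=years * per_year)

window = 5 * per_year                                      # ~5-year running window
kernel = np.ones(window) / window
run_mean = np.convolve(series, kernel, mode="same")
run_std = np.sqrt(np.convolve((series - run_mean) ** 2, kernel, mode="same"))

normalized = (series - run_mean) / run_std                 # transformed, ~stationary
annual_max = normalized.reshape(years, per_year).max(axis=1)
shape, loc, scale = genextreme.fit(annual_max)             # stationary GEV fit
print("GEV shape/loc/scale:", round(shape, 3), round(loc, 3), round(scale, 3))
```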

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Direito, Programa de Pós-Graduação Stricto Sensu em Direito, 2016.