799 results for Static average-case analysis
Abstract:
Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising two different types of processors; such a platform is referred to as a two-type platform. We present two low-degree polynomial time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then (i) SA is guaranteed to find such an assignment, with the same restriction on task migration, given a platform in which processors are 1+α/2 times faster, and (ii) SA-P succeeds in finding a task assignment in which tasks are not allowed to migrate between processors (non-migrative), given a platform in which processors are 1+α times faster. The parameter 0<α≤1 is a property of the task set; it is the maximum of all the task utilizations that are no greater than 1. We evaluate the average-case performance of both algorithms by generating task sets randomly and measuring how much processor speedup the algorithms need (upper bounded by 1+α/2 for SA and 1+α for SA-P) in order to output a feasible task assignment (intra-migrative for SA and non-migrative for SA-P). In our evaluations, for the vast majority of task sets, these algorithms require significantly smaller processor speedup than indicated by their theoretical bounds. Finally, we consider the special case in which no task utilization in the given task set exceeds one; for this case, we (re-)prove the performance guarantees of SA and SA-P. We show, for both algorithms, that changing the adversary from intra-migrative to a more powerful one, namely fully migrative, in which tasks can migrate between processors of any type, does not deteriorate the performance guarantees. For this special case, we compare the average-case performance of SA-P with that of a state-of-the-art algorithm on randomly generated task sets. In our evaluations, SA-P outperforms the state of the art, requiring much smaller processor speedup and running orders of magnitude faster.
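As a quick illustration of the quoted bounds (a minimal sketch, not part of the paper; the task set is assumed to be given simply as a list of utilization values):

```python
# Illustrative only: computes the task-set parameter alpha and the
# theoretical speedup bounds quoted above (1 + alpha/2 for SA,
# 1 + alpha for SA-P). Not the SA/SA-P algorithms themselves.

def speedup_bounds(utilizations):
    """utilizations: per-task utilization values for the task set."""
    # alpha is the maximum over all task utilizations that are <= 1
    capped = [u for u in utilizations if u <= 1.0]
    if not capped:
        raise ValueError("alpha is undefined: no task utilization <= 1")
    alpha = max(capped)
    return {"alpha": alpha,
            "SA bound": 1 + alpha / 2,   # intra-migrative guarantee
            "SA-P bound": 1 + alpha}     # non-migrative guarantee

print(speedup_bounds([0.3, 0.75, 1.4, 0.9]))
# {'alpha': 0.9, 'SA bound': 1.45, 'SA-P bound': 1.9}
```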
Abstract:
Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism are serious challenges to their integration into this domain. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, which is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM, and propose a three-stage method to solve it. An extended version of the existing analysis is used to ensure that the derived mappings (i) guarantee the fulfilment of timing constraints posed on the worst-case communication delays of individual applications, and (ii) provide an environment in which to perform load balancing, e.g. for energy/thermal management, fault tolerance and/or performance reasons.
Abstract:
There is an undeniable link between child support and education under Article 1880 of the Portuguese Civil Code. Naturally, being rooted in family relationships, such a link could not escape controversy. At a time when continuing one's studies is ever more pressing, this link is often the subject of disputes, especially disputes arising from the interpretation of the law, given the broad scope it allows; and it is often also insufficient for those most affected, the young people who wish to study. Despite the imprecision of Article 1880 of the Portuguese Civil Code, the provision is of great importance in enabling young adults who are students to continue their studies with financial help from their parents: the parents' duty of support would otherwise have ended when the child reached legal adulthood, but this article extends it once its criteria are met, notably the reasonableness of what is demanded of the parents and the temporal duration of the education chosen. That is, given that reaching adulthood does not extinguish the parents' duty of support, it is important to know how much parents can be required to provide, bearing in mind their income and the child's needs, the child's behaviour and intellectual capacity as a student, and the parent-child relationship; and until when such support is due, taking into account the various circumstances of life, the difficulties inherent in the chosen degree, and even the extension of studies to a master's or PhD degree where this justifies prolonging the parents' duty. In any event, the application of Article 1880 of the Portuguese Civil Code always rests on a case-by-case analysis of the young person's economic inability to provide for himself or herself together with the simultaneous desire to continue studying.
Abstract:
The current landscape of the country's municipal theatres took shape mainly from the late 1990s onwards, through initiatives of central and local government. The diversity in how these cultural facilities are managed invites reflection on the concept of a municipal theatre. This project proposes a definition of the concept of municipal theatre and an analysis of the case of the Teatro Municipal Joaquim Benite in Almada. An analysis of the programming for the years 2007 to 2012 is also presented.
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
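To make the mechanics concrete, here is a minimal sketch of how a Markov-type cohort model produces an incremental cost-effectiveness ratio in $/QALY; every transition probability, cost and utility in it is an invented placeholder, not a value from the CHD Policy Model:

```python
# Illustrative two-strategy Markov cohort model producing an
# incremental cost-effectiveness ratio (ICER) in $/QALY.
# All probabilities, costs, and utilities below are invented for
# illustration; they are not values from the CHD Policy Model.
import numpy as np

def run_cohort(p_event, annual_cost, years=30, utility_well=1.0,
               utility_sick=0.8, cost_event=25_000):
    """Three states: well -> post-CHD-event -> dead (absorbing)."""
    # Row-stochastic transition matrix: well, post-event, dead
    T = np.array([[1 - p_event - 0.01, p_event, 0.01],
                  [0.0,                0.93,    0.07],
                  [0.0,                0.0,     1.0]])
    state = np.array([1.0, 0.0, 0.0])   # whole cohort starts well
    cost = qaly = 0.0
    for _ in range(years):
        new_events = state[0] * p_event
        state = state @ T
        cost += annual_cost * (state[0] + state[1]) + cost_event * new_events
        qaly += utility_well * state[0] + utility_sick * state[1]
    return cost, qaly

# "Usual care" vs. a statin-like strategy that lowers event risk
# but adds a per-year drug cost.
c0, q0 = run_cohort(p_event=0.020, annual_cost=0)
c1, q1 = run_cohort(p_event=0.014, annual_cost=770)  # ~$2.11/pill/day
icer = (c1 - c0) / (q1 - q0)
print(f"ICER: ${icer:,.0f} per QALY gained")
```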
Abstract:
Monte Carlo simulations were carried out using a nearest-neighbour ferromagnetic XY model on both 2-D and 3-D quasi-periodic lattices. In the 2-D case, both the unfrustrated and frustrated XY model were studied. For the unfrustrated 2-D XY model, we have examined the magnetization, specific heat, linear susceptibility, helicity modulus and the derivative of the helicity modulus with respect to inverse temperature. The behaviour of all these quantities points to a Kosterlitz-Thouless transition occurring in the temperature range T_c ≈ 1.0–1.05 J/k_B and with critical exponents that are consistent with previous results (obtained for crystalline lattices). However, in the frustrated case, analysis of the spin-glass susceptibility and Edwards-Anderson order parameter, in addition to the magnetization, specific heat and linear susceptibility, supports a spin-glass transition. In the case where the 'thin' rhombus is fully frustrated, a freezing transition occurs at T_f ≈ 0.137 J/k_B, which contradicts previous work suggesting the critical dimension of spin glasses to be d_c > 2. In the 3-D systems, examination of the magnetization, specific heat and linear susceptibility reveals a conventional second-order phase transition. Through a cumulant analysis and finite-size scaling, a critical temperature of T_c = (2.292 ± 0.003) J/k_B and critical exponents of α = 0.03 ± 0.03, β = 0.30 ± 0.01 and γ = 1.31 ± 0.02 have been obtained.
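For readers unfamiliar with the method, here is a minimal Metropolis Monte Carlo sketch of a nearest-neighbour ferromagnetic XY model; it uses a plain square lattice with periodic boundaries rather than the quasi-periodic lattices studied in this work, and all parameter values are illustrative:

```python
# Minimal Metropolis Monte Carlo for a nearest-neighbour ferromagnetic
# XY model, sketched on a plain LxL square lattice (the thesis uses
# quasi-periodic lattices; this only illustrates the update rule).
import numpy as np

rng = np.random.default_rng(0)
L, J, T = 16, 1.0, 0.9          # lattice size, coupling, temperature (J/k_B)
theta = rng.uniform(0, 2 * np.pi, size=(L, L))  # spin angles

def local_energy(th, i, j):
    """Energy of site (i, j) with its four neighbours (periodic BCs)."""
    s = th[i, j]
    nbrs = [th[(i + 1) % L, j], th[(i - 1) % L, j],
            th[i, (j + 1) % L], th[i, (j - 1) % L]]
    return -J * sum(np.cos(s - n) for n in nbrs)

def sweep(th):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        old = th[i, j]
        e_old = local_energy(th, i, j)
        th[i, j] = rng.uniform(0, 2 * np.pi)   # propose a new angle
        dE = local_energy(th, i, j) - e_old
        if dE > 0 and rng.random() >= np.exp(-dE / T):
            th[i, j] = old                     # reject: restore old angle

for _ in range(200):                           # equilibration sweeps
    sweep(theta)
mx, my = np.cos(theta).mean(), np.sin(theta).mean()
print("magnetization per spin:", np.hypot(mx, my))
```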
Abstract:
The legitimacy of an organization is founded on its mission, that is, on its reason for being. Library managers and many researchers fear that the legitimacy of public libraries will be contested in the information society. Moreover, the official texts setting out the missions of public libraries are diverse, and the missions they contain are deliberately left undefined. In Québec, where the great majority of autonomous public libraries are placed directly under the authority of the municipalities, public libraries must define and legitimize their missions with municipal elected officials. The main objective of this research is to understand, through discourse, the point of view of Québec municipal elected officials on the missions of autonomous public libraries, in comparison with the practices and resources of libraries at the local level. Based on the theory of the social construction of reality, a conceptual framework is proposed so as to study discourses not only in their textual dimension, but also to contextualize them and to analyse the gap between these discourses and library practices. The research strategy adopted is a multiple case study. The objective is to develop an in-depth analysis of each case and a cross-case analysis. The twelve cases (municipalities) were selected according to two variation criteria (the size of the municipality and the annual budget allocated by the municipality to the library) and one discriminating criterion (distance from the Université de Montréal). Interviews were conducted with the elected officials chairing the commission or committee responsible for public libraries. These interviews and the cultural policies were subjected to discourse analysis. The interviews with library managers and the documentation were subjected to content analysis. These analyses allowed the triangulation of methods and data sources. Québec municipal elected officials, like professionals, do not offer a homogeneous discourse on the missions of public libraries. Nevertheless, a discourse model emerges. It shows a discourse that is 'limited' relative to the literature, in which a passive image of the library is presented and in which tradition persists despite the context of the information society. The analysis also reveals that elected officials construct their points of view on their own convictions as individuals, on their role in the management of the municipality as elected officials, and on the image they have of public library users. Finally, the analysis revealed an axis differentiating points of view according to whether the discourse rests on fundamental values or on the (real or supposed) uses of the library.
Abstract:
Missing data are frequent in surveys and can lead to substantial errors in parameter estimation. This methodological master's thesis in sociology examines the influence of missing data on the estimation of the effect of a prevention program. The first two sections set out the kinds of bias missing data can introduce and present the theoretical frameworks for describing them. The third section covers methods for handling missing data: the classical methods are described along with three more recent ones. The fourth section presents the Enquête longitudinale et expérimentale de Montréal (ELEM) and describes the data used. The fifth presents the analyses performed; it contains the method for analysing the effect of an intervention from longitudinal data, an in-depth description of the missing data in the ELEM, and a diagnosis of the missingness patterns and mechanism. The sixth section reports the estimates of the program effect under different assumptions about the missing-data mechanism and using four methods: complete-case analysis, maximum likelihood, weighting and multiple imputation. The results indicate (I) that the assumption of a MAR missingness mechanism seems to influence the estimate of the program effect, and (II) that the estimates obtained by the different estimation methods lead to similar conclusions about the effect of the intervention.
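A small simulation (constructed here, not taken from the thesis) illustrating two of the four methods mentioned above, complete-case analysis and weighting, under a MAR mechanism:

```python
# Illustrative sketch: simulate MAR missingness and compare a
# complete-case estimate with an inverse-probability weighted one.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)                  # always-observed covariate
y = 2.0 + 1.5 * x + rng.normal(size=n)  # outcome of interest

# MAR: the chance that y is observed depends only on the observed x
p_obs = 1 / (1 + np.exp(-x))            # logistic in x
observed = rng.random(n) < p_obs

print("true mean of y:             ", y.mean())
print("complete-case mean (biased):", y[observed].mean())

# Inverse-probability weighting restores (approximate) unbiasedness,
# here using the true observation probabilities for simplicity.
w = 1 / p_obs[observed]
print("IPW mean (approx. unbiased):", np.average(y[observed], weights=w))
```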
Abstract:
Canada devotes billions to international aid every year. According to the Department of Foreign Affairs, Trade and Development, aid deployed in 2013 amounted to more than 5.48 billion dollars. In every project implemented in developing countries, human resources give their time and strive to contribute to strengthening the capacities of local organizations. These projects are technical cooperation initiatives or contain technical cooperation components; the people assigned to them must carry out multiple tasks, including that of knowledge-sharing agent. This thesis explores this phenomenon by shedding light on the relational processes underlying exchanges between the people involved in these initiatives, namely expatriate volunteer advisers and the members of the local teams hosting such initiatives. It tends to support the marked influence of interpersonal relationships on knowledge-sharing outcomes, except that trust alone is not enough to achieve sustainable development objectives. The case analysis, based mainly on semi-structured interviews conducted in Haiti and Senegal, allows us to affirm the importance of attending to the dynamic absorptive capacity of the parties to the sharing, but also to the roles of the managers of local partner organizations in their commitment to carrying out mandates aimed at knowledge sharing.
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals; doing so can significantly improve software quality and remains a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of the compiler/assembler and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly with machine-code patterns, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features in developing embedded systems.
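As a rough, hypothetical sketch of the redundant-bank-switching idea (not the dissertation's tool, which works on all paths of the control flow graph rather than a single linear sequence):

```python
# Hypothetical sketch of redundant bank-switch detection on a
# straight-line instruction sequence: a bank-select instruction is
# redundant if the bank it selects is already the active bank.

def find_redundant_bank_selects(instructions):
    """instructions: list of (mnemonic, operand) tuples; a
    ('BANKSEL', n) entry models selecting memory bank n."""
    active_bank = None          # bank state unknown at entry
    redundant = []
    for idx, (mnemonic, operand) in enumerate(instructions):
        if mnemonic == "BANKSEL":
            if operand == active_bank:
                redundant.append(idx)   # re-selects the active bank
            active_bank = operand
        elif mnemonic == "CALL":
            active_bank = None  # callee may have switched banks
    return redundant

program = [("BANKSEL", 1), ("MOVWF", "PORTA"),
           ("BANKSEL", 1),                  # redundant: bank 1 active
           ("MOVWF", "TRISA"), ("CALL", "init"),
           ("BANKSEL", 1)]                  # not redundant after CALL
print(find_redundant_bank_selects(program))  # [2]
```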
Abstract:
Bank switching in embedded processors with a partitioned memory architecture results in code size as well as run-time overhead. An algorithm, and its application to assist the compiler in eliminating redundant bank-switching code and deciding the optimum data allocation to banked memory, is presented in this work. A relation matrix formed for the memory-bank state transition corresponding to each bank selection instruction is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data-mapping scheme is subjected to a static machine-code analysis, which identifies the one with the minimum number of bank-switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC16F87X microcontrollers is described. The method scales well to larger numbers of memory banks and other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.
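A toy sketch of the exhaustive allocation search described above, under the invented simplifications that every variable fits in any bank up to a fixed capacity and that switches are counted along a single access sequence:

```python
# Toy sketch of exhaustive data-to-bank allocation: try every way of
# assigning variables to banks and keep the assignment whose access
# sequence needs the fewest bank switches.
from itertools import product

def count_switches(access_seq, bank_of):
    switches, active = 0, None
    for var in access_seq:
        if bank_of[var] != active:
            switches += 1
            active = bank_of[var]
    return switches

def best_allocation(variables, n_banks, capacity, access_seq):
    """Exhaustive search; capacity = max variables per bank."""
    best = None
    for assignment in product(range(n_banks), repeat=len(variables)):
        if any(assignment.count(b) > capacity for b in range(n_banks)):
            continue                      # skip over-full banks
        bank_of = dict(zip(variables, assignment))
        cost = count_switches(access_seq, bank_of)
        if best is None or cost < best[0]:
            best = (cost, bank_of)
    return best

# a and b are accessed together, so the best allocation co-locates them.
cost, alloc = best_allocation(["a", "b", "c"], n_banks=2, capacity=2,
                              access_seq=["a", "b", "a", "b", "c"])
print(cost, alloc)   # 2 {'a': 0, 'b': 0, 'c': 1}
```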
Abstract:
This paper presents the impact of integrating interventions such as nutrition gardening, livestock rearing, product diversification and allied income-generating activities, along with nutrition education, in small and marginal coconut homesteads on the food and nutritional security and the income of family members. The activities were carried out through registered Community Based Organizations (CBOs) in three locations in Kerala, India during 2005-2008. Data were collected before and after the project period through interviews using a pre-tested questionnaire containing statements indicating the adequacy, quality and diversity of food materials. Fifty respondents each were randomly selected from the three communities, giving a total sample size of 150. The data were analysed using SPSS, with statistical tools such as frequency, average, percentage analysis, t-test and regression. Participatory planning and implementation of diverse interventions, notably intercropping and off-farm activities, along with nutrition education, brought about significant improvements in food and nutritional security, in terms of frequency and quantity of consumption as well as diet diversity. At the end of the project, 96% of the members had become completely food secure and 72% nutritionally secure. The overall consumption of fruits, vegetables and milk by both children and adults, and of eggs by children, increased over the project period. Consumption of fish was above the Recommended Dietary Intake (RDI) level during both the pre- and post-project periods. Project interventions such as nutrition gardening brought the consumption of vegetables (35%) and fruits (10%) above the RDI. In spite of the increased consumption of green leafy vegetables and milk and milk products over the project period, consumption levels remained below the RDI. CBO-wise analysis of the consumption patterns revealed the need for location-specific interventions matching the needs and preferences of the communities.
Abstract:
This thesis presents three important results in visual object recognition based on shape. (1) A new algorithm (RAST: Recognition by Adaptive Subdivisions of Transformation space) is presented that has lower average-case complexity than any known recognition algorithm. (2) It is shown, both theoretically and empirically, that representing 3D objects as collections of 2D views (the "View-Based Approximation") is feasible and affects the reliability of 3D recognition systems no more than other commonly made approximations. (3) The problem of recognition in cluttered scenes is considered from a Bayesian perspective; the commonly used "bounded-error" error measure is demonstrated to correspond to an independence assumption. It is shown that by modeling the statistical properties of real scenes better, objects can be recognized more reliably.
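As a hypothetical illustration of result (1), the following sketch applies adaptive subdivision of a translation-only transformation space with a branch-and-bound match-count bound; the thesis's RAST algorithm handles richer transformation classes, and all data here are invented:

```python
# Heavily simplified sketch of the RAST idea (adaptive subdivision of
# transformation space), restricted to 2D translations: find the
# translation bringing the most model points within eps of some image
# point. Not the thesis implementation.
import numpy as np

model = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
image = np.array([[5.0, 3.0], [6.0, 3.0], [5.0, 4.0], [9.0, 9.0]])
eps = 0.1

def upper_bound(center, radius):
    """Max matches possible for any translation inside a square box of
    half side-length `radius` (L2 shift from center <= sqrt(2)*radius)."""
    shifted = model + center
    d = np.linalg.norm(shifted[:, None] - image[None, :], axis=2)
    return int((d.min(axis=1) <= eps + np.sqrt(2) * radius).sum())

def rast(box, best=0):
    center, radius = box
    if upper_bound(center, radius) <= best:
        return best                            # prune: cannot beat best
    if radius < 0.01:                          # box small: evaluate center
        return max(best, upper_bound(center, 0.0))
    r = radius / 2                             # subdivide into 4 sub-boxes
    for dx in (-r, r):
        for dy in (-r, r):
            best = rast((center + np.array([dx, dy]), r), best)
    return best

print(rast((np.array([5.0, 5.0]), 8.0)))       # 3: all model points match
```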
Abstract:
This paper analyses the presence of community-oriented concepts in the definition of Continuing Education, in the way higher education institutions understand Continuing Education, and in the influence that the use of community strategies by Continuing Education may have on its relationship with its environment. To this end, a case-study research and analysis methodology was used, selecting a representative case in an explanatory within-case design. The propositions and the analysis protocol were established. Several sources of evidence were used, such as documents, financial and performance data, and other data from different social sectors, all of which were analysed within a logic of evidence. It was concluded that the definition of Continuing Education includes community-oriented concepts, that institutions use those concepts in the way they understand Continuing Education, and that the use of community strategies favourably affects the relationship of Continuing Education with its environment.