946 results for Legacy datasets
Abstract:
The article examines the debate between Sismondi and the Ricardian economists, in the first half of the nineteenth century, concerning market equilibrium, the role of competition, and the effects of machinery on industrial societies. The opening section reconstructs the main terms of Sismondi's critical discourse directed at classical orthodoxy. Next, the responses formulated by McCulloch and Torrens in defense of free competition, the unlimited character of demand, and the advance of mechanization in productive activity are examined in detail. The third section considers Sismondi's later argument, in which he restates his theory of overproduction crises on the basis of a historical approach to capitalism. Finally, a brief assessment is made of the Ricardian legacy to political economy in light of the controversy examined.
Abstract:
In this dissertation we discuss the movements surrounding the theme of appropriation, highlighting the displacement of the object of everyday use into the field of art in three paradigmatic movements: Dadaism, Surrealism, and New Realism, and their developments in contemporary art as a cultural legacy. We allow ourselves to move through the other strands of appropriation processes in New York and European art, following the proposals of art made with objects. We investigate the processes involved in the controversies of the ready-made, carried out by Marcel Duchamp from 1913 onward, as the starting point for the historical course of the manifestations in which the objet trouvé, collage, and assemblage occur. Our objective is to answer the question with which the philosopher Arthur C. Danto begins his philosophical investigations into the appropriations and conditions that turn an ordinary object into a work of art. To this end we draw on the theories of Abraham Moles and Jean Baudrillard on the industrialized object, and on Roland Barthes for the metalinguistic questions of its sign-function. To discuss these relations between art and objects, we turn to the texts of Peter Bürger, André Breton, Pierre Restany, Gregory Battcock, Walter Benjamin, Hal Foster, and others. We highlight selected artworks, focusing on the box object, treated as a utilitarian item with a defined form and purpose, reaffirming itself as an object of consumption, within Gaston Bachelard's discourse on the poetics of space. The research proposes to analyze the radical transformations of art's conservative structures and to speculate on the gesture of appropriating objects, with some reflections directed at the object and the transgressiveness of appropriation as an experience of tension and power.
Abstract:
For modern consumer cameras, approximate calibration data is often available, making applications such as 3D reconstruction or photo registration easier compared to the purely uncalibrated setting. In this paper we address the setting with calibrated-uncalibrated image pairs: for one image the intrinsic parameters are assumed to be known, whereas the second view has unknown distortion and calibration parameters. This situation arises e.g. when one would like to register archive imagery to recently taken photos. A commonly adopted strategy for determining epipolar geometry is based on feature matching and minimal solvers inside a RANSAC framework. However, only very few existing solutions apply to the calibrated-uncalibrated setting. We propose a simple and numerically stable two-step scheme to first estimate the radial distortion parameters and subsequently the focal length using novel solvers. We demonstrate the performance on synthetic and real datasets.
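The minimal-solvers-inside-RANSAC strategy mentioned in the abstract can be illustrated generically. The sketch below is a toy stand-in, not the paper's method: a two-point line solver plays the role of the actual distortion and focal-length minimal solvers, and all names are hypothetical. It shows the hypothesize-and-verify loop common to such pipelines.

```python
import random

def ransac(data, fit_minimal, residual, sample_size, threshold, iters=200, seed=0):
    """Generic RANSAC: fit models from minimal samples and keep the one
    with the largest inlier set."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        sample = rng.sample(data, sample_size)
        model = fit_minimal(sample)
        if model is None:          # degenerate minimal sample
            continue
        inliers = [d for d in data if residual(model, d) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

# Toy "minimal solver": a line y = a*x + b from exactly two points.
def fit_line(pts):
    (x1, y1), (x2, y2) = pts
    if x2 == x1:
        return None
    a = (y2 - y1) / (x2 - x1)
    return a, y1 - a * x1

def line_residual(model, pt):
    a, b = model
    x, y = pt
    return abs(y - (a * x + b))

# Ten points on y = 2x + 1 plus two gross outliers (mimicking bad matches).
points = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(3.0, 40.0), (7.0, -5.0)]
model, inliers = ransac(points, fit_line, line_residual, sample_size=2, threshold=0.5)
```

The separation into two sequential solvers in the paper follows the same pattern: each stage runs a small closed-form solver on a minimal sample and scores hypotheses by inlier count.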
Abstract:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Therefore, the correctness of a GUI's code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper describes our approach to reverse engineering an abstract model of a user interface directly from the GUI's legacy code. We also present results from a case study. These results are encouraging and give evidence that the goal of reverse engineering user interfaces can be met with further work on this technique.
Abstract:
This paper reports on the development of specific slicing techniques for functional programs and their use in identifying possible coherent components in monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
Abstract:
A large and growing number of software systems rely on non-trivial coordination logic to make use of third-party services or components. Therefore, it is of the utmost importance to understand and rigorously capture this continuously growing coordination layer, as this will ease not only the verification of such systems with respect to their original specifications, but also their maintenance, further development, testing, deployment, and integration. This paper introduces a method based on several program analysis techniques (namely, dependence graphs, program slicing, and graph pattern analysis) to extract coordination logic from the source code of legacy systems. The process is driven by a series of pre-defined coordination patterns and captured by a special-purpose graph structure from which coordination specifications can be generated in a number of different formalisms.
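The dependence-graph-plus-slicing step described above can be illustrated on a toy example (the statement ids, pseudo-statements, and dependences below are invented for illustration, not taken from the paper): a backward slice is simply transitive reachability over dependence edges from a slicing criterion, here a hypothetical coordination call.

```python
# Toy dependence graph: each statement id maps to the statements it
# depends on (data/control dependences), annotated with pseudo-code.
deps = {
    1: [],        # x = read()
    2: [],        # y = read()
    3: [1],       # a = x + 1
    4: [2],       # b = y * 2
    5: [3, 4],    # send(a, b)   <- coordination call of interest
    6: [1],       # print(x)
}

def backward_slice(graph, criterion):
    """Return every statement the criterion transitively depends on,
    i.e. the code needed to reproduce its behavior."""
    seen, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return seen
```

Slicing on the coordination call (statement 5) keeps statements 1-5 and discards the unrelated `print`; matching such slices against pre-defined coordination patterns is the next step of the method.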
Abstract:
Current software development relies increasingly on non-trivial coordination logic for combining autonomous services, often running on different platforms. As a rule, however, in typical non-trivial software systems, such a coordination layer is strongly weaved within the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem whose importance cannot be overestimated in any program understanding or refactoring process. Open access to source code, as granted in OSS certification, provides an opportunity for the development of methods and technologies to extract the relevant coordination information from source code. This paper is a step in this direction, combining a number of program analysis techniques to automatically recover coordination information from legacy code. Such information is then expressed as a model in Orc, a general-purpose orchestration language.
Abstract:
One of the current frontiers in the clinical management of Pectus Excavatum (PE) patients is the prediction of the surgical outcome prior to the intervention. This can be done through computerized simulation of the Nuss procedure, which requires an anatomically correct representation of the costal cartilage. To this end, we take advantage of the costal cartilage's tubular structure to detect it through multi-scale vesselness filtering. This information is then used in an interactive 2D initialization procedure which uses anatomical maximum intensity projections of 3D vesselness feature images to efficiently initialize the 3D segmentation process. We identify the cartilage tissue centerlines in these projected 2D images using a livewire approach. We finally refine the 3D cartilage surface through region-based sparse field level-sets. We have tested the proposed algorithm on 6 non-contrast CT datasets from PE patients. Good segmentation performance was found against reference manual contouring, with an average Dice coefficient of 0.75 ± 0.04 and an average mean surface distance of 1.69 ± 0.30 mm. The proposed method requires roughly 1 minute for the interactive initialization step, which can contribute positively to wider use of this tool in clinical practice, since current manual delineation of the costal cartilage can take up to an hour.
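The core of vesselness filtering can be sketched from Hessian eigenvalues. The following minimal single-scale, 2D Frangi-style response assumes the eigenvalues at a voxel are already available and uses illustrative parameter values (β, c), not the paper's settings; the actual method applies this idea in 3D and across multiple scales.

```python
import math

def frangi_vesselness_2d(l1, l2, beta=0.5, c=15.0):
    """Frangi-style 2D vesselness from Hessian eigenvalues with |l1| <= |l2|.
    Bright tubular structures have |l1| small and l2 strongly negative."""
    if l2 >= 0:              # not a bright ridge: no response
        return 0.0
    rb = l1 / l2             # blobness ratio: ~0 for tubes, ~1 for blobs
    s = math.hypot(l1, l2)   # second-order structureness (noise suppression)
    return math.exp(-rb**2 / (2 * beta**2)) * (1 - math.exp(-s**2 / (2 * c**2)))

v_tube = frangi_vesselness_2d(-0.5, -30.0)   # elongated bright structure
v_blob = frangi_vesselness_2d(-25.0, -30.0)  # isotropic blob
v_flat = frangi_vesselness_2d(0.1, 0.2)      # background
```

The tubular configuration scores markedly higher than blob-like or flat regions, which is why the filter highlights the cartilage's tube-like anatomy.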
Abstract:
In daily cardiology practice, assessment of left ventricular (LV) global function using non-invasive imaging remains central to the diagnosis and follow-up of patients with cardiovascular diseases. Despite the different methodologies currently accessible for LV segmentation in cardiac magnetic resonance (CMR) images, fast and complete LV delineation is still of limited availability for routine use. In this study, a localized anatomically constrained affine optical flow method is proposed for fast and automatic LV tracking throughout the full cardiac cycle in short-axis CMR images. Starting from an automatically delineated LV in the end-diastolic frame, the endocardial and epicardial boundaries are propagated by estimating the motion between adjacent cardiac phases using optical flow. In order to reduce the computational burden, the motion is only estimated in an anatomical region of interest around the tracked boundaries and subsequently integrated into a local affine motion model. Such localized estimation makes it possible to capture complex motion patterns while still being spatially consistent. The method was validated on 45 CMR datasets taken from the 2009 MICCAI LV segmentation challenge. The proposed approach proved to be robust and efficient, with an average distance error of 2.1 mm and a correlation with reference ejection fraction of 0.98 (1.9 ± 4.5%). Moreover, it proved fast, taking 5 seconds to track a full 4D dataset (30 ms per image). Overall, a novel fast, robust, and accurate LV tracking methodology is proposed, enabling accurate assessment of relevant global function cardiac indices, such as volumes and ejection fraction.
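Integrating pointwise motion estimates into an affine motion model amounts to a least-squares fit. The sketch below uses synthetic point correspondences in place of actual optical-flow estimates, and is a generic illustration of the idea, not the paper's implementation.

```python
import numpy as np

def fit_affine_motion(pts_src, pts_dst):
    """Least-squares 2D affine motion model mapping pts_src -> pts_dst.
    Stand-in for integrating local flow vectors into an affine model."""
    src = np.asarray(pts_src, dtype=float)
    dst = np.asarray(pts_dst, dtype=float)
    # Design matrix rows [x, y, 1]; solve dst ≈ X @ A.T for the 2x3 matrix A.
    X = np.hstack([src, np.ones((len(src), 1))])
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T

# Synthetic boundary points and a known affine motion (slight scaling
# plus translation, mimicking wall motion between adjacent phases).
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
true_A = np.array([[1.05, 0.02, 0.3],
                   [-0.01, 0.98, -0.2]])
dst = [true_A @ [x, y, 1.0] for x, y in src]
A_est = fit_affine_motion(src, dst)
```

Because the model has only six parameters, noisy per-point flow vectors are averaged into a spatially consistent boundary motion, which is the regularizing effect the abstract describes.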
Abstract:
In the past thirty years, a series of plans have been developed by successive Brazilian governments in a continuing effort to maximize the nation's resources for economic and social growth. This planning history has been quantitatively rich but qualitatively poor. The disjunction has stimulated Professor Mello e Souza to address himself to the problem of national planning and to offer some criticisms of Brazilian planning experience. Though political instability has obviously been a factor promoting discontinuity, his criticisms are aimed at the attitudes and strategic concepts which have sought to link planning to national goals and administration. He criticizes the fascination with techniques and plans to the exclusion of proper diagnosis of the socio-political reality, developing instruments to coordinate and carry out objectives, and creating an administrative structure centralized enough to make national decisions and decentralized enough to perform on the basis of those decisions. Thus, fixed, quantified objectives abound while the problem of functioning mechanisms for the coordinated, rational use of resources has been left unattended. Although his interest and criticism are focused on the process and experience of national planning, he recognized variation in the level and results of Brazilian planning. National plans have failed due to faulty conception of the function of planning. Sectorial plans, save in the sector of the petroleum industry under government responsibility, have not succeeded in overcoming the problems of formulation and execution, thereby repeating old technical errors. Planning for the private sector has a somewhat brighter history due to the use of Grupos Executivos, which has enabled the planning process to transcend the formalism and tradition-bound attitudes of the regular bureaucracy. Regional planning offers two relatively successful experiences, Sudene and the strategy of the regionally oriented autarchy.
Thus, planning history in Brazil is not entirely black but a certain shade of grey. The major part of the article, however, is devoted to a descriptive analysis of the national planning experience. The plans included in this analysis are: The Works and Equipment Plan (POE); The Health, Food, Transportation and Energy Plan (Salte); The Program of Goals; The Trienal Plan of Economic and Social Development; and the Plan of Governmental Economic Action (Paeg). Using these five plans for his historical experience, the author sets out a series of errors of formulation and execution by which he analyzes that experience. With respect to formulation, he speaks of a lack of elaboration of programs and projects, of coordination among diverse goals, and of provision of qualified staff and techniques. He mentions the absence of the definition of resources necessary to the financing of the plan and the inadequate quantification of sectorial and national goals due to the lack of reliable statistical information. Finally, he notes the failure to coordinate the annual budget with the multi-year plans. He sees the problems of execution as beginning in the absence of coordination between the various sectors of the public administration, the failure to develop an operative system of decentralization, the absence of any system of financial and fiscal control over execution, the difficulties imposed by the system of public accounting, and the absence of an adequate program of allocation for the liberation of resources. He ends by pointing to the failure to develop and use an integrated system of political economic tools in a mode compatible with the objective of the plans. The body of the article analyzes national planning experience in Brazil using these lists of errors as a rough model of criticism. Several conclusions emerge from this analysis with regard to planning in Brazil and in developing countries in general.
Plans have generally been of little avail in Brazil because of the lack of a continuous, bureaucratized (in the Weberian sense) planning organization set in an instrumentally suitable administrative structure and based on thorough diagnoses of socio-economic conditions and problems. Plans have become the justification for planning. Planning has come to be conceived as a rational method of orienting the process of decisions through the establishment of a precise and quantified relation between means and ends. But this conception has led to a planning history rimmed with frustration, and failure, because of its rigidity in the face of flexible and changing reality. Rather, he suggests a conception of planning which understands it "as a rational process of formulating decisions about the policy, economy, and society whose only demand is that of managing the instrumentarium in a harmonious and integrated form in order to reach explicit, but not quantified ends". He calls this "planning without plans": the establishment of broad-scale tendencies through diagnosis whose implementation is carried out through an adjustable, coherent instrumentarium of political-economic tools. Administration according to a plan of multiple, integrated goals is a sound procedure if the nation's administrative machinery contains the technical development needed to control the multiple variables linked to any situation of socio-economic change. Brazil does not possess this level of refinement and any strategy of planning relevant to its problems must recognize this. The reforms which have been attempted fail to make this recognition as is true of the conception of planning informing the Brazilian experience. Therefore, unworkable plans, ill-diagnosed with little or no supportive instrumentarium or flexibility have been Brazil's legacy. This legacy seems likely to continue until the conception of planning comes to live in the reality of Brazil.
Abstract:
With the constant development of new information technologies, organizations need to implement new management tools in order to generate competitive advantages. In this sense, this dissertation proposes the implementation of a Strategic Management Model based on the Balanced Scorecard in an interbank services company, with the aim of assisting in the creation of competitive capabilities through a more precise and structured performance assessment. This tool emerged as an alternative to traditional legacy systems aimed at controlling the activities performed by employees; the methodology put strategy, rather than control alone, in the spotlight, but not until 1992 was it recognized as a revolutionary process that changed the entire standard management process in companies. This dissertation discusses the concepts of this methodology, emphasizing the strategy map and the advantages of adopting it, together with a practical study of the application of the Strategic Management Model at EMIS. Finally, it is noteworthy that the great importance companies have been attaching to this methodology, and the investment they have been making in it, guarantees its relevance as a research subject in the near future.
Abstract:
In response to the predominance of heteronormativity in studies on household labour, this article explores how such labour is organized and distributed in same-sex couples. Qualitative research was carried out on the basis of 20 interviews with members of homosexual couples, and information was collected on the imbalances, negotiation, satisfaction, and gendered family legacy in the current organization of household labour. Results show that the absence of a sex difference between the members of the couple contributes to a more flexible and egalitarian negotiation of the organization of chores. As a consequence of gender socialization, women tend to specialize and men to delegate.
Abstract:
The smart grid concept is rapidly evolving toward implementations able to bring its advantages into practice. Evolving legacy equipment and infrastructures is not sufficient to accomplish smart grid goals, as it does not consider the needs of players operating in a complex environment that is dynamic and competitive in nature. Applications based on artificial intelligence can provide solutions to these problems, supporting decentralized intelligence and decision-making. A case study illustrates the importance of Virtual Power Players (VPP) and multi-player negotiation in the context of smart grids. This case study is based on real data and aims at optimizing energy resource management, considering generation, storage, and demand response.
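Energy resource management of the kind optimized in the case study can be illustrated by a toy greedy merit-order dispatch: meet demand from the cheapest resources first. The resource names, capacities, and costs below are invented for illustration; the actual case study uses real data and far richer multi-player negotiation.

```python
def dispatch(demand, resources):
    """Greedy merit-order dispatch: allocate demand to resources in
    ascending cost order. Returns the schedule and any unmet demand."""
    schedule, remaining = {}, demand
    for name, capacity, cost in sorted(resources, key=lambda r: r[2]):
        take = min(capacity, remaining)
        if take > 0:
            schedule[name] = take
            remaining -= take
    return schedule, remaining

# Hypothetical VPP portfolio: (name, capacity in MW, marginal cost).
resources = [("wind", 40, 0.0), ("storage", 20, 10.0), ("gas", 100, 50.0)]
schedule, unmet = dispatch(90, resources)
```

Here 90 MW of demand is covered by all available wind and storage plus 30 MW of gas; real VPP optimization additionally handles network constraints, demand response bids, and negotiation between players.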
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence in a protein is defined by the sequence of a gene or several genes encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms the genetic code can also include two other amino acids. After linking the amino acids during protein synthesis, each amino acid becomes a residue in a protein, which is then chemically modified, ultimately changing and defining the protein function. In this study, the authors analyze the amino acid sequence using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome without any prior assumptions. The paper starts by analyzing amino acid sequence data by means of histograms using fixed-length amino acid words (tuples). After creating the initial relative frequency histograms, they are transformed and processed in order to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and results reveal that the proposed method is able to generate relevant outputs in accordance with current scientific knowledge in domains like protein sequence/proteome analysis.
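The fixed-length word (tuple) histograms that start the analysis can be sketched directly. The example sequence below is invented for illustration; real inputs would be protein sequences from the reference datasets.

```python
from collections import Counter

def kmer_histogram(seq, k):
    """Relative frequencies of fixed-length amino-acid words (k-mers),
    computed over a sliding window: the starting point of an
    alignment-free sequence analysis."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

# Hypothetical toy sequence; 9 overlapping 2-mers, 5 of them distinct.
hist = kmer_histogram("MKVLAMKVLA", 2)
```

Such relative-frequency histograms are then transformed and compared across proteins or whole proteomes without any alignment step.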
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples, or samples subjected to different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments which involve molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of subsample mixtures. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which not only identifies genes with up and down regulation, but also genes with differential expression in different subclasses, which are usually missed if current statistical methods are used. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software. Results: This method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with the ones obtained using some of the standard methods for detecting differentially expressed genes, namely Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC).
On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by all the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions and allows different variances in the two samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot represents a new, flexible, and useful tool for the analysis of gene expression profiles from microarrays.
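The overlapping coefficient (OVL) between two densities, one of the two distance measures underlying the arrow plot, can be approximated numerically as the integral of the pointwise minimum of the densities. The sketch below uses two normal densities on a fixed grid; the grid bounds and resolution are illustrative choices, not the paper's implementation (which is in R).

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def ovl(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, n=4000):
    """Overlapping coefficient: integral of min(f, g) over a grid.
    OVL = 1 for identical densities, -> 0 as they separate."""
    step = (hi - lo) / n
    return sum(min(normal_pdf(lo + i * step, mu1, s1),
                   normal_pdf(lo + i * step, mu2, s2))
               for i in range(n)) * step

same = ovl(0, 1, 0, 1)    # identical densities: OVL near 1
apart = ovl(0, 1, 3, 1)   # well-separated densities: small OVL
```

A gene whose expression densities in the two classes barely overlap (small OVL) is a strong differential-expression candidate; combined with the AUC, this yields the two coordinates of the arrow plot.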