858 results for Non-functional requirement. Software architecture. NFR-framework. Architectural pattern


Relevance:

40.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

40.00%

Publisher:

Abstract:

Self-adaptive Software (SaS) presents specific characteristics compared to traditional software, as it allows adaptations to be incorporated at runtime. When performed manually, these adaptations usually become an onerous, error-prone activity. In this scenario, automated approaches have been proposed to support such adaptations; however, the development of SaS is not a trivial task. In parallel, reference architectures are reusable artifacts that aggregate knowledge about the architectures of software systems in specific domains. They have facilitated the development, standardization, and evolution of systems in those domains. In spite of their relevance, reference architectures that could support a more systematic development of SaS are not yet found in the SaS domain. Considering this context, the main contribution of this paper is a reference architecture based on reflection for SaS, named RA4SaS (Reference Architecture for SaS). Its main purpose is to support the development of SaS that adapts at runtime. To show the viability of this reference architecture, a case study is presented. The results suggest that RA4SaS shows good potential to contribute efficiently to the area of SaS.
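
The abstract gives no code, but the core mechanism it relies on, reflection, is easy to illustrate. The following minimal Java sketch swaps a component's behavior at runtime by resolving a class by name; all class and method names are hypothetical and are not taken from RA4SaS.

```java
import java.lang.reflect.Constructor;

// Hypothetical component contract; RA4SaS's real interfaces are not given
// in the abstract.
interface Behavior {
    void execute();
}

class DefaultBehavior implements Behavior {
    public void execute() { System.out.println("default behavior"); }
}

class AdaptedBehavior implements Behavior {
    public void execute() { System.out.println("adapted behavior"); }
}

public class SelfAdaptiveComponent {
    private Behavior current = new DefaultBehavior();

    // Reflection lets the running system replace its own behavior without
    // being stopped, recompiled, or redeployed.
    public void adapt(String behaviorClassName) throws Exception {
        Class<?> cls = Class.forName(behaviorClassName);
        Constructor<?> ctor = cls.getDeclaredConstructor();
        current = (Behavior) ctor.newInstance();
    }

    public void run() { current.execute(); }

    public static void main(String[] args) throws Exception {
        SelfAdaptiveComponent c = new SelfAdaptiveComponent();
        c.run();                    // prints: default behavior
        c.adapt("AdaptedBehavior"); // adaptation decided at runtime
        c.run();                    // prints: adapted behavior
    }
}
```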

Relevance:

40.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

40.00%

Publisher:

Abstract:

Objective: To evaluate anatomical and functional renal alterations and their association with post-traumatic arterial hypertension. Methods: The study population comprised patients who sustained high-grade renal injuries (grades III to V) successfully managed non-operatively after staging by computed tomography over a 16-year period. In addition to a review of the medical records, these patients were invited to the following protocol: clinical and laboratory evaluation, abdominal computed tomography, magnetic resonance angiography, DMSA renal scintigraphy, and ambulatory blood pressure monitoring. Hypertensive patients also underwent dynamic renal scintigraphy (Tc-99m EC) with captopril stimulation to check for a renovascular etiology. Results: Of the 31 patients, thirteen had grade III, sixteen grade IV (nine lacerations and seven vascular lesions), and two grade V injuries. All patients were asymptomatic, with an average post-injury follow-up of 6.4 years. None had abnormal BUN or serum creatinine. The percentage of renal volume reduction correlated with injury severity as defined by the OIS. There was no evidence of renal artery stenosis on magnetic resonance angiography (MRA). DMSA scanning demonstrated a decline in the percentage of total renal function corresponding to injury severity (42.2 +/- 5.5% for grade III, 35.3 +/- 12.8% for grade IV, 13.5 +/- 19.1% for grade V). Six patients (19.4%) had severely compromised function (< 30%). For grade IV injuries, there was a statistically significant difference in the decrease in renal function between parenchymal and vascular causes (p < 0.001). Twenty-four-hour ambulatory blood pressure monitoring detected nine patients (29%) with post-traumatic hypertension. All were male (mean age 35.6 years), 77.8% had a family history of arterial hypertension, 66.7% had grade III renal injuries, and the average post-injury time was 7.8 years. Seven patients had negative captopril renography. Conclusions: Late results of renal function after conservative treatment of high-grade renal injuries are favorable, except for patients with grade IV vascular injuries and grade V renal injuries. Moreover, arterial hypertension did not correlate with the grade of renal injury or the reduction of renal function.

Relevance:

40.00%

Publisher:

Abstract:

Background: The study of myofiber reorganization in the remote zone after myocardial infarction has been performed in 2D. Microstructural reorganization in remodeled hearts, however, can only be fully appreciated by considering myofibers as continuous 3D entities. The aim of this study was therefore to develop a technique for quantitative 3D diffusion CMR tractography of the heart, and to apply this method to quantify fiber architecture in the remote zone of remodeled hearts. Methods: Diffusion tensor CMR was performed ex vivo on normal human, sheep, and rat hearts, as well as on infarcted sheep hearts. Fiber tracts were generated with a fourth-order Runge-Kutta integration technique and classified statistically by the median, mean, maximum, or minimum helix angle (HA) along the tract. An index of tract coherence was derived from the relationship between these HA statistics. Histological validation was performed using phase-contrast microscopy. Results: In normal hearts, the subendocardial and subepicardial myofibers had a positive and negative HA, respectively, forming a symmetric distribution around the midmyocardium. However, in the remote zone of the infarcted hearts, a significant positive shift in HA was observed. The ratio between negative and positive HA variance was reduced from 0.96 +/- 0.16 in normal hearts to 0.22 +/- 0.08 in the remote zone of the remodeled hearts (p < 0.05). This was confirmed histologically by the reduction of HA in the subepicardium from -52.03 +/- 2.94 degrees in normal hearts to -37.48 +/- 4.05 degrees in the remote zone of the remodeled hearts (p < 0.05). Conclusions: A significant reorganization of the 3D fiber continuum is observed in the remote zone of remodeled hearts. The positive (rightward) shift in HA in the remote zone is greatest in the subepicardium, but involves all layers of the myocardium. Tractography-based quantification, performed here for the first time in remodeled hearts, may provide a framework for assessing regional changes in the left ventricle following infarction.
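
As an illustration of the tract-generation step, here is a minimal Java sketch of fourth-order Runge-Kutta streamline integration through a fiber-direction field. The toy vector field, step size, and fixed step count are stand-ins; the actual pipeline (eigenvector extraction from the diffusion tensor, anatomical masking, helix-angle statistics) is not reproduced.

```java
import java.util.ArrayList;
import java.util.List;

public class Rk4Tractography {
    // Local fiber direction: in the real method, the unit primary
    // eigenvector of the diffusion tensor at position p.
    interface VectorField {
        double[] direction(double[] p);
    }

    // One fourth-order Runge-Kutta step of arc length h along the field.
    static double[] rk4Step(VectorField f, double[] p, double h) {
        double[] k1 = f.direction(p);
        double[] k2 = f.direction(add(p, scale(k1, h / 2)));
        double[] k3 = f.direction(add(p, scale(k2, h / 2)));
        double[] k4 = f.direction(add(p, scale(k3, h)));
        double[] next = new double[3];
        for (int i = 0; i < 3; i++)
            next[i] = p[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]);
        return next;
    }

    // Integrate a tract from a seed point; a real tracker would also stop
    // at mask boundaries or on low anisotropy.
    static List<double[]> track(VectorField f, double[] seed, double h, int maxSteps) {
        List<double[]> tract = new ArrayList<>();
        double[] p = seed.clone();
        for (int i = 0; i < maxSteps; i++) {
            tract.add(p.clone());
            p = rk4Step(f, p, h);
        }
        return tract;
    }

    static double[] add(double[] a, double[] b) {
        return new double[]{a[0] + b[0], a[1] + b[1], a[2] + b[2]};
    }

    static double[] scale(double[] a, double s) {
        return new double[]{a[0] * s, a[1] * s, a[2] * s};
    }

    public static void main(String[] args) {
        // Toy field: fibers wind circumferentially around the z-axis.
        VectorField f = p -> {
            double n = Math.hypot(p[0], p[1]);
            return new double[]{-p[1] / n, p[0] / n, 0.0};
        };
        List<double[]> tract = track(f, new double[]{1, 0, 0}, 0.1, 50);
        double[] end = tract.get(tract.size() - 1);
        System.out.printf("tract end: (%.2f, %.2f, %.2f)%n", end[0], end[1], end[2]);
    }
}
```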

Relevance:

40.00%

Publisher:

Abstract:

Abstract Background Despite recent advances in the understanding of lignocellulolytic enzyme regulation, less is known about how different carbon sources are sensed and about the signaling cascades that result in the adaptation of cellular metabolism and hydrolase secretion. Therefore, the role played by non-essential protein kinases (NPK) and phosphatases (NPP) in the sensing of carbon and/or energetic status was investigated in the model filamentous fungus Aspergillus nidulans. Results Eleven NPKs and seven NPPs were identified as being involved in cellulase, and in some cases also hemicellulase, production in A. nidulans. The regulation of CreA-mediated carbon catabolite repression (CCR) in the parental strain was determined by fluorescence microscopy, utilising a CreA::GFP fusion protein. The sensing of phosphorylated glucose via the RAS signalling pathway induced CreA repression, while carbon starvation resulted in derepression. Growth on cellulose represented carbon starvation and derepressing conditions. The involvement of the identified NPKs in the regulation of cellulose-induced responses and CreA derepression was assessed by genome-wide transcriptomics (GEO accession 47810). CreA::GFP localisation, and the restoration of endocellulase activity via the introduction of the ∆creA mutation, were assessed in the NPK-deficient backgrounds. The absence of either the schA or the snfA kinase dramatically reduced cellulose-induced transcriptional responses, including the expression of hydrolytic enzymes and transporters. The mechanism by which these two NPKs controlled gene transcription was identified, as the NPK-deficient mutants were not able to unlock CreA-mediated carbon catabolite repression under derepressing conditions, such as carbon starvation or growth on cellulose. Conclusions Collectively, this study identified multiple kinases and phosphatases involved in the sensing of carbon and/or energetic status, while demonstrating the overlapping, synergistic roles of schA and snfA in the regulation of CreA derepression and hydrolytic enzyme production in A. nidulans. The importance of a carbon starvation-induced signal for CreA derepression, permitting transcriptional activator binding, appears paramount for hydrolase secretion.

Relevance:

40.00%

Publisher:

Abstract:

Background/objectives: Therapy using bone marrow (BM) cells has been tested experimentally and clinically due to its potential ability to restore cardiac function by regenerating lost myocytes or increasing the survival of tissues at risk after myocardial infarction (MI). In this study we aimed to evaluate whether BM-derived mononuclear cell (MNC) implantation can positively influence post-MI structural remodeling, contractility, and the Ca(2+)-handling proteins of the remote non-infarcted tissue in rats. Methods and results: Saline or BM-MNCs were injected 48 h after MI induction. Six weeks later, MI scars were slightly smaller and thicker, and cardiac dilatation was only partially prevented by cell therapy. However, cardiac performance under hemodynamic stress was fully preserved in the BM-MNC-treated group compared to the untreated group, in association with normal contractility of the remote myocardium as analyzed in vitro. The impaired post-rest potentiation of contractile force, associated with decreased protein expression of the sarcoplasmic reticulum Ca(2+)-ATPase and phosphorylated phospholamban and with overexpression of the Na(+)/Ca(2+) exchanger, was prevented by BM-MNC, indicating preservation of Ca(2+) handling. Finally, pathological changes in the remodeled remote tissue, such as myocyte hypertrophy, interstitial fibrosis and capillary rarefaction, were also mitigated by cell therapy. Conclusions: BM-MNC therapy prevented cardiac structural and molecular remodeling after MI, avoiding pathological changes in Ca(2+)-handling proteins and preserving the contractile behavior of the viable myocardium, which could be the major contributor to the improvement of global cardiac performance after cell transplantation even though the scar tissue persists.

Relevance:

40.00%

Publisher:

Abstract:

Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control and, as a side effect, even untrustworthiness: we want to keep some control over such autonomous agents. How can autonomous agents be controlled while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and augmenting system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms under some conditions and violate them under others. Under what conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, things may not happen at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge. A declarative representation of norms makes it easier to update their system representation, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may appear among mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.

Relevance:

40.00%

Publisher:

Abstract:

Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, in this thesis an approach is proposed for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves – under certain assumptions – the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically-specified behavior of the individual software units into thread templates, which will have to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies about a video animation repainting system and the implementation of a leader election algorithm, in order to summarize the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration in the architecture-centric verification tool TwoTowers.
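
The abstract discusses synthesizing monitors rather than threads for suitably constrained software units. The following hand-written Java sketch shows the monitor style such a translator might target; PADL2Java's actual generated code is not shown in the abstract, so the bounded sample buffer and all names are illustrative.

```java
// All names here are illustrative; PADL2Java's real output is not shown
// in the abstract.
public class SampleBufferMonitor {
    private final double[] buffer = new double[64];
    private int count = 0;

    // Monitor discipline: shared state is touched only inside synchronized
    // methods, so coordination properties are not undermined by data races.
    public synchronized void put(double sample) throws InterruptedException {
        while (count == buffer.length) wait();   // producer blocks when full
        buffer[count++] = sample;
        notifyAll();
    }

    public synchronized double take() throws InterruptedException {
        while (count == 0) wait();               // consumer blocks when empty
        double s = buffer[--count];
        notifyAll();
        return s;
    }

    public static void main(String[] args) throws InterruptedException {
        SampleBufferMonitor m = new SampleBufferMonitor();
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) m.put(i * 0.5);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        for (int i = 0; i < 5; i++) System.out.println(m.take());
        producer.join();
    }
}
```

Because callers can only reach the shared buffer through these synchronized methods, coordination proved at the architectural level is not undermined by low-level interleavings.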

Relevance:

40.00%

Publisher:

Abstract:

I. Max Bill is an intense giornata of a big fresco. An analysis of the main social, artistic and cultural events of the twentieth century is needed in order to trace his career through his masterpieces and architectures. Some of the faces of this hypothetical mural painting are, among others, Le Corbusier, Walter Gropius, Ernesto Nathan Rogers, Kandinskij, Klee, Mondrian, Vantongerloo and Ignazio Silone, while the backcloth is given by the artistic avant-gardes, the Bauhaus, International Exhibitions, CIAM, war events, reconstruction, the Milan Triennali, the Venice Biennali and the School of Ulm. An architect, even though better known as a painter, sculptor, designer and graphic artist, Max Bill attended the Bauhaus as a student in the years 1927-1929, and from this experience derived the main features of a rational, objective, constructive and non-figurative art. His research was devoted to giving his art a scientific methodology: each work proceeds from the analysis of a problem to the logical and always verifiable solution of that same problem. By means of composition elements (such as rhythm, seriality, theme and its variation, harmony and dissonance), he faced, with consistent results, themes apparently very distant from each other, such as the project for the H.f.G. or the design of a font. Mathematics is a constant reference frame, as a field of certainties, order and objectivity: 'for Bill mathematics are never confined to a simple function: they represent a climate of spiritual certainties, and also the theme of the non attempted in its purest state, objectivity of the sign and of the geometrical place, and at the same time restlessness of the infinity: Limited and Unlimited'. In almost sixty years of activity, spanning all artistic fields, Max Bill worked, designed, and held conferences and exhibitions in Europe, Asia and the Americas, confronting the most influential personalities of the twentieth century. In such a vast scenery, the need to limit the field of investigation led to addressing and analysing the unpublished and original aspect of Bill's relations with Italy. The original contribution of the present research lies in this particular 'geographic delimitation'; in particular, beyond the deep cultural exchanges between Bill and a series of Milanese architects, above all Rogers, two main projects have been addressed: the realtà nuova at the Milan Triennale in 1947, and the Contemporary Art Museum in Florence in 1980. It is important to note that these projects have not been previously investigated, and the former never appears in the sources either. These works, together with the best-known ones, such as the projects for the VI and IX Triennale and the Swiss pavilion for the Biennale, add important details to the frame of the relations between Zurich and Milan. Most of the exchanges took place between the Thirties and the Fifties, years during which Bill underwent a significant period of artistic growth. He met the Swiss progressive architects and the Paris artists of the Abstraction-Création movement, entered CIAM, collaborated with Le Corbusier on the third volume of his Complete Works, and in Milan worked on, and was confronted with, the events of post-war reconstruction. In these years Bill defined his own working methodology, attaining artistic maturity in his work. The present research investigates this time period, with some necessary exceptions.
II. The official Max Bill bibliography is naturally wide, including popularizing works along with others more devoted to analytical investigation, mainly written in German and often translated into French and English (Max Bill himself published his works in three languages). Few works have been published in Italian and, excluding the catalogue of the Parma exhibition of 1977, they cannot be considered comprehensive. Many publications are exhibition catalogues, some of which include essays written by Max Bill himself, while others carry Bill's comments in an educational-pedagogical vein, accompanying the observer towards a full understanding of the composition processes of his art works. Bill also left a great amount of theoretical speculation to encourage a critical reading of his works, in the form of books edited or written by him and of essays published in 'Werk', the magazine of the Swiss Werkbund, and in other international reviews, among which Domus and Casabella. These three reviews have been important tools of analysis, since they include traces of some of Max Bill's architectural works. The architectural aspect is less investigated than the plastic and pictorial ones in all the main reference manuals on the subject: Benevolo, Tafuri and Dal Co, Frampton, and Allenspach consider Max Bill an artist whose work proceeds from the Bauhaus to the Ulm experience. A first cataloguing of his works was published in 2004 in the monographic issue of the Spanish magazine 2G, together with critical essays by Karin Gimmi, Stanislaus von Moos, Arthur Rüegg and Hans Frei, and in 'Konkrete Architektur?', again by Hans Frei. Also relevant are the 1997 monographic essay on the Atelier Haus building by Arthur Rüegg, and the DPA 17 issue of the Catalonia Polytechnic with contributions by Carlos Martì, Bruno Reichlin and Ton Salvadò, the latter publication concentrating on a few of Bill's themes and architectures. A renewed urge to study Max Bill's works in depth was marked in 2008 by the centenary of his birth and by a recent rediscovery of Bill as an initiator of the 'minimalist' tradition in Swiss architecture. Bill's heirs are both very active in promoting exhibitions, researching and publishing. Jakob Bill, Max Bill's son and himself a painter, recently published a work on Bill's experience at the Bauhaus, and earlier had published an in-depth study of the 'Endless Ribbons' sculptures. Angela Thomas Schmid, Bill's wife and an art historian, published at the end of 2008 the first volume of a biography of Max Bill and, together with the film maker Eric Schmid, produced a documentary film which was also presented at the last Locarno Film Festival. Both biography and documentary concentrate on Max Bill's political involvement, from antifascism and the 1968 protest movements to Bill's experience as a Zurich Municipality councilman and a member of the Swiss Confederation Parliament. In the present research, the bibliography also includes direct sources, such as interviews and original materials in the form of correspondence and graphic works together with related essays, kept in the max+binia+jakob bill stiftung archive in Zurich.

III. The results of the present research are organized into four main chapters, each of them subdivided into four parts. The first chapter concentrates on the research field, the reasons for the study, and the tools and methodologies employed, whereas the second consists of a short biographical note organized by topics, introducing the subject of the research.
The third chapter, which includes previously unpublished events, traces the historical and cultural frame with particular reference to the relations between Max Bill and the Italian scene, especially Milan and the architects Rogers and Baldessari around the Fifties, seeking the themes and the keys for interpreting Bill's architectures and investigating the critical debate in the reviews and the plastic research through sculpture. The fourth and last chapter examines four main architectures, chosen on a geographical basis and all devoted to exhibition spaces, investigating Max Bill's composition process in relation to the pictorial field. Paintings have surely been easier and faster to investigate and verify than buildings. A doctoral thesis defended in Lausanne in 1977, investigating Max Bill's plastic and pictorial works, provided a series of devices which were corrected and adapted for the definition of the interpretation grid for the composition structures of Bill's main architectures. Four different tools are employed in the investigation of each work: a context analysis related to the results of chapter three; a specific theoretical essay by Max Bill briefly explaining his main theses, even though not directly linked to the very work of art considered; the interpretation grid for the composition themes derived from a related pictorial work; and the architectural drawing and digital three-dimensional model. The double analysis of the architectural and pictorial fields serves to underline the relations among the different elements of the composition process; the two fields, however, cannot be compared and they remain, in Max Bill's works as in the present research, interdependent though self-sufficient.

IV. An important aspect of Max Bill's production is self-referentiality: talking of Max Bill, also through Max Bill, as a need for coherence rather than a limitation of method. Ernesto Nathan Rogers described Bill as the last humanist, and his horizon is the known world; but, like the 'Concrete Art' of which he is one of the main representatives, his production justifies itself: Max Bill not only found a method, but autonomously re-wrote the 'rules of the game', derived timeless theoretical principles and verified them through a rich and interdisciplinary artistic production. The most recurrent words in the present research are synthesis, unity, space and logic. These terms are part of Max Bill's vocabulary and can be referred to his works. Similarly, the graphic settings and analytical schemes in this text that refer to or comment on Bill's architectural projects were drawn up keeping in mind the concise precision of his architectural design. As has been written of Mies van der Rohe, Max Bill took art to 'degree zero', reaching in this way a high complexity. His works are a synthesis of art: they conceptually encompass all previous pictures and, considering their developments, most contemporary ones. Contents and message are generally explicitly declared in the title or in Bill's essays on his artistic works and architectural projects: the viewer is invited to go through and re-build the process of synthesis that generates the shape. In the course of an interview, the Milanese artist Getulio Alviani told how he would not write more than a page for an essay on Josef Albers: everything was already evident 'on the surface' and any additional sentence would be redundant.
Two years after that interview, these pages attempt to decompose and single out the elements and processes connected with some of Max Bill's works which, by their very origin, already contain all possible explanations and interpretations. Formal reduction in favour of content maximization is, perhaps, Max Bill's main lesson.

Relevance:

40.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
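
To make the flavor of such declarative constraints concrete, the following minimal Java sketch checks one typical ConDec-style constraint, response(a, b) (every occurrence of activity a must eventually be followed by b), against a finite execution trace. The thesis performs this kind of reasoning via the logic-based CLIMB translation; the imperative check and the activity names below are merely illustrative.

```java
import java.util.List;

public class ResponseConstraint {
    // response(a, b): every occurrence of a is eventually followed by b.
    static boolean response(List<String> trace, String a, String b) {
        boolean pending = false;            // an 'a' still waiting for its 'b'
        for (String event : trace) {
            if (event.equals(a)) pending = true;
            if (event.equals(b)) pending = false;
        }
        return !pending;                    // an open instance is a violation
    }

    public static void main(String[] args) {
        List<String> ok  = List.of("order", "pay", "ship");
        List<String> bad = List.of("order", "pay");
        System.out.println(response(ok, "order", "ship"));   // true
        System.out.println(response(bad, "order", "ship"));  // false
    }
}
```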

Relevance:

40.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal, and to provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm in order to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach conclusions from premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). Applications of this kind are useful in the legal area, especially if they offer an implementation of an argumentation framework that provides a formal modelling of a game. Roughly speaking, let the theory be the set of laws and a key claim be the conclusion that one of the parties wants to prove (and the other wants to defeat); adding the dynamic assertion of rules, namely facts put forward by the parties, we can then play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators: the first has been explained above, the second is needed to run the game model, and the last is used to change game execution and tree-derivation strategies.
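
As a toy illustration of the non-monotonic behavior described above, this self-contained Java sketch evaluates defeasible rules under a superiority relation: a conclusion derivable from one set of facts stops being derivable when a new fact activates a stronger opposing rule. The rule representation and the classic bird/penguin example are illustrative and far simpler than the thesis' temporal modal framework.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

public class DefeasibleDemo {
    // A defeasible rule: if all body literals hold, the head tentatively holds.
    record Rule(String id, List<String> body, String head) {}

    static boolean fires(Rule r, Set<String> facts) {
        return facts.containsAll(r.body());
    }

    // A goal is defeasibly provable when some rule for it fires and every
    // firing rule for the opposite conclusion is beaten via the superiority
    // relation 'sup' (rule id -> ids of the rules it beats).
    static boolean provable(String goal, List<Rule> rules, Set<String> facts,
                            Map<String, Set<String>> sup) {
        String opposite = goal.startsWith("~") ? goal.substring(1) : "~" + goal;
        for (Rule r : rules) {
            if (!r.head().equals(goal) || !fires(r, facts)) continue;
            boolean defeated = false;
            for (Rule s : rules)
                if (s.head().equals(opposite) && fires(s, facts)
                        && !sup.getOrDefault(r.id(), Set.of()).contains(s.id()))
                    defeated = true;
            if (!defeated) return true;   // r survives all attackers
        }
        return false;
    }

    public static void main(String[] args) {
        // r1: bird => flies;  r2: penguin => ~flies;  r2 beats r1.
        List<Rule> rules = List.of(
                new Rule("r1", List.of("bird"), "flies"),
                new Rule("r2", List.of("penguin"), "~flies"));
        Map<String, Set<String>> sup = Map.of("r2", Set.of("r1"));

        System.out.println(provable("flies", rules, Set.of("bird"), sup));             // true
        // Learning 'penguin' invalidates the formerly derivable conclusion:
        System.out.println(provable("flies", rules, Set.of("bird", "penguin"), sup));  // false
        System.out.println(provable("~flies", rules, Set.of("bird", "penguin"), sup)); // true
    }
}
```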

Relevance:

40.00%

Publisher:

Abstract:

Information technologies and communication methods have changed the way documents intended to transmit knowledge are drafted, in a process that is still evolving today. Design activity in engineering and architecture has also undergone profound transformations as a result of new technologies, even in a sector as methodologically inert and resistant to innovation as the building industry. For some time, the information needed to construct a building, from the drawings that represent it to the documents specifying how it is to be built, has been manageable in a centralized way through a single project archive called the IPDB (Integrated Project DataBase), although a more operational variant called BIM (Building Information Modelling) has recently been introduced on the market. However, the industrialization of design that these tools bring about does not fully account for all the aspects that make an architectural work a collector of knowledge belonging to a design culture which, particularly in Italy, is rooted in time. The semantics of digital representation tends to level the constituent elements of a project, with the aim of cataloguing only their manufacturing characteristics. A review of the relevant scientific literature shows that the methods and software on the market cannot be regarded as all-encompassing containers of information: this holistic approach is instead the foundation of integrated modelling, understood as an original process of knowledge representation, ordered according to the "Chinese boxes" paradigm, an evolving model that unifies the languages of the different actors involved in the building services, structural and advanced visualization fields. Critically highlighting the merits and the operational limits deriving from integrated modelling, the experimental component of the research was organized around in-depth experiences conducted in academic and professional contexts. The result combines surveying techniques with the potential of "intelligent three-dimensional models", i.e. models endowed with discriminating criteria for evaluating the topological relationships of the components with the global whole.

Relevance:

40.00%

Publisher:

Abstract:

Modern embedded systems are equipped with hardware resources that allow the execution of very complex applications such as audio and video decoding. The design of such systems must satisfy two opposing requirements: on the one hand, it must provide high computational power; on the other, it must respect stringent constraints on energy consumption. One of the most widespread trends in answering these opposing needs is to integrate on a single chip a large number of processors characterized by a simplified design and low power consumption. However, to actually exploit the computational potential offered by an array of processors, application development methodologies must be heavily revisited. With the advent of multi-processor systems-on-chip (MPSoCs), parallel programming has spread widely in the embedded domain as well. Progress in parallel programming, however, has not kept pace with the ability to integrate parallel hardware on a single chip. Beyond the introduction of multiple processors, the need to reduce MPSoC power consumption leads to further architectural solutions that have the direct effect of complicating application development. The design of the memory subsystem, in particular, is a critical problem. Integrating memory banks on chip allows very short access times and very low power consumption. Unfortunately, the amount of on-chip memory that can be integrated in an MPSoC is very limited. For this reason it is necessary to add off-chip memory banks, which have a much larger capacity, but also higher power consumption and longer access times. Most MPSoCs currently on the market devote part of their area budget to the implementation of cache and/or scratchpad memories. Scratchpad memories (SPMs) are often preferred to caches in embedded MPSoCs for reasons of greater predictability, smaller area occupation and, above all, lower power consumption. On the other hand, while the use of caches is completely transparent to the programmer, SPMs must be explicitly managed by the application. Exposing the organization of the memory hierarchy to the application makes it possible to exploit its advantages (reduced access times and power consumption) efficiently. In return, to obtain these benefits, applications must be written so that data are partitioned and allocated appropriately across the various memories. The burden of this complex task obviously falls on the programmer. This scenario clearly shows the need for programming models and support tools that simplify the development of parallel applications. This thesis presents a framework for embedded MPSoC software development based on OpenMP. OpenMP is a de facto standard for programming shared-memory multiprocessors, characterized by a simple annotation-based approach to parallelization (compiler directives). Its programming interface makes it possible to express loop-level parallelism, widespread among embedded signal processing and multimedia applications, in a natural and very efficient way. OpenMP is an excellent starting point for the definition of a programming model for MPSoCs, above all for its ease of use.
On the other hand, to exploit the computational potential of an MPSoC efficiently, the implementation of the OpenMP support must be deeply revisited, both in the compiler and in the runtime environment. All the constructs for managing parallelism, work sharing and inter-processor synchronization entail an overhead cost that must be minimized so as not to compromise the benefits of parallelization. This can only be achieved through a careful analysis of the hardware characteristics and the identification of potential bottlenecks in the architecture. An implementation of task management, barrier synchronization and data sharing that exploits hardware resources efficiently yields high performance and scalability. Data sharing, in the OpenMP model, deserves particular attention. In a shared-memory model, the data structures (arrays, matrices) accessed by the program are physically allocated on a single memory resource reachable by all processors. As the number of processors in a system grows, concurrent access to a single memory resource becomes an evident bottleneck. To relieve the pressure on the memories and on the interconnect, we study and propose techniques for partitioning data structures. These techniques require that a single array entity be treated in the program as a collection of many sub-arrays, each of which can be physically allocated on a different memory resource. From the program's point of view, addressing a partitioned array requires that, at every access, instructions be executed to recompute the physical destination address. This is clearly a long, complex and error-prone task. For this reason, our partitioning techniques have been integrated into the OpenMP programming interface, which has been significantly extended. Specifically, new directives and clauses allow the programmer to annotate the array data to be partitioned and allocated in a distributed fashion across the memory hierarchy. Support tools have also been developed to collect profiling information on array access patterns. This information is exploited by our compiler to allocate the partitions on the various memory resources while respecting an affinity relation between tasks and data. More precisely, the allocation passes in our compiler assign a given partition to the scratchpad memory local to the processor hosting the task that performs the largest number of accesses to it.
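
The abstract does not give the syntax of the extended directives, so no attempt is made to reproduce it. As a language-neutral illustration of the address recomputation that partitioned arrays entail, the following small Java sketch wraps a logical array as a set of per-bank tiles and recomputes (bank, offset) on every access; the "banks" are plain Java arrays and all names are invented for illustration.

```java
// Sketch of the address recomputation the thesis automates: a logical array
// split into per-memory-bank tiles, where every access recomputes
// (bank, offset). In the real framework the compiler inserts this logic and
// places each tile in a specific scratchpad; here the banks are plain arrays.
public class PartitionedArray {
    private final double[][] banks;   // one tile per (simulated) memory bank
    private final int tileSize;

    PartitionedArray(int numBanks, int tileSize) {
        this.banks = new double[numBanks][tileSize];
        this.tileSize = tileSize;
    }

    // Each access pays a few extra instructions to locate the tile: the
    // overhead that profiling-driven, affinity-aware placement tries to offset.
    double get(int i)           { return banks[i / tileSize][i % tileSize]; }
    void   set(int i, double v) { banks[i / tileSize][i % tileSize] = v; }

    public static void main(String[] args) {
        PartitionedArray a = new PartitionedArray(4, 256); // 4 banks x 256 elems
        for (int i = 0; i < 1024; i++) a.set(i, i * 0.5);
        System.out.println(a.get(700)); // 350.0, transparently read from bank 2
    }
}
```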