973 results for software creation infrastructure


Relevance:

30.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are felt mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure. Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limits on computing power have severely constrained such investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and it will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance:

30.00%

Publisher:

Abstract:

Ubiquitous computing aims at providing services to users in everyday environments such as the home. One research theme in this area is building capture and access applications, which allow information to be recorded (captured) during a live experience toward automatically producing documents for review (accessed). The recording demands instrumented environments with devices such as microphones, cameras, sensors and electronic whiteboards. Since each experience is usually related to many others (e.g. several meetings of a project), there is a demand for mechanisms supporting automatic linking among documents relative to different experiences. In this paper we present original results relative to the integration of our previous efforts in the Infrastructure for Capturing, Accessing, Linking, Storing and Presenting information (CALiSP).
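The abstract does not detail CALiSP's actual linking mechanism. As a minimal sketch, assuming each captured document carries metadata tags (an illustrative data model, not CALiSP's), automatic linking between related experiences could look like this:

```python
# Minimal sketch of automatic linking among captured documents.
# Hypothetical data model: documents are linked when they share
# metadata tags (e.g. the same project across several meetings).
from dataclasses import dataclass, field
from itertools import combinations

@dataclass
class CapturedDocument:
    doc_id: str
    tags: set[str] = field(default_factory=set)  # e.g. {"project-x", "meeting"}

def link_documents(docs: list[CapturedDocument]) -> list[tuple[str, str, set[str]]]:
    """Return pairs of documents that share at least one tag."""
    links = []
    for a, b in combinations(docs, 2):
        shared = a.tags & b.tags
        if shared:
            links.append((a.doc_id, b.doc_id, shared))
    return links

docs = [
    CapturedDocument("meeting-2007-03-01", {"project-x", "whiteboard"}),
    CapturedDocument("meeting-2007-03-15", {"project-x", "audio"}),
    CapturedDocument("lecture-intro", {"course-101"}),
]
print(link_documents(docs))  # links the two project-x meetings
```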

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the application of systems modelling benchmarks to determine the viability of systems modelling software and its suitability for modelling critical infrastructure systems. It builds on earlier research that developed benchmarks which, when applied to systems modelling software, indicate the software's likely suitability for modelling critical infrastructure systems. In this context, the benchmarks are applied to assess the practicality of CPNTools for the task of modelling critical infrastructure systems.

Relevance:

30.00%

Publisher:

Abstract:

Distributed Denial of Service (DDoS) attacks are one of the most challenging areas in security. Security managers have to deal not only with flood and vulnerability attacks but also with determining whether traffic comes from legitimate users or malicious attackers. In our previous work we developed a framework called bodyguard, which helps security software developers move from the current serialized paradigm to a multi-core paradigm. In this paper, we update that work by moving the bodyguard paradigm into our new Ubiquitous Multi-Core Framework. With this shift, we show a marked improvement over our previous result, from a 20% to a 110% speedup, at an average cost of 1.5 ms. We also conducted a second series of experiments in which we trained a neural network and tested it against actual DDoS attack traffic; in these experiments, it identified an average of 93.36% of the attack traffic.
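The abstract gives neither the feature set nor the network topology used. The sketch below only illustrates the general shape of training a neural network to separate attack flows from legitimate ones; the synthetic per-flow features and the scikit-learn toolchain are assumptions, not the paper's setup:

```python
# Illustrative sketch: train a small neural network to classify flows
# as legitimate (0) or DDoS (1) from synthetic per-flow features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical per-flow features: [packets/s, mean packet size, SYN ratio]
legit = rng.normal([50, 800, 0.1], [20, 200, 0.05], size=(500, 3))
ddos = rng.normal([5000, 120, 0.9], [1500, 40, 0.05], size=(500, 3))
X = np.vstack([legit, ddos])
y = np.array([0] * 500 + [1] * 500)  # 0 = legitimate, 1 = attack

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2%}")
```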

Relevance:

30.00%

Publisher:

Abstract:

High Performance Computing (HPC) clouds have started to change the way research in science, in particular medicine and genomics (bioinformatics), is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. However, most HPC clouds are provided at the Infrastructure as a Service (IaaS) level: users are presented with a set of virtual servers which need to be put together to form HPC environments through time-consuming resource management and software configuration tasks, which makes them practically unusable by discipline specialists without computing expertise. In response, there is a new trend to expose cloud applications as services to simplify access and execution on clouds. This paper first examines commonly used cloud-based genomic analysis services (Tuxedo Suite, Galaxy and Cloud Bio Linux). As a follow-up, we propose two new solutions (HPCaaS and Uncinus) which aim to automate aspects of the service development and deployment process. By comparing and contrasting these five solutions, we identify key mechanisms of service creation, execution and access that are required to support genomic research on the SaaS cloud, in particular by discipline specialists. © 2014 IEEE.
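For a flavor of the SaaS idea the paper targets, here is a minimal sketch that wraps a command-line genomics tool behind an HTTP endpoint so the discipline specialist never manages the underlying IaaS servers. Flask, the endpoint name and the aligner invocation are all illustrative assumptions, not HPCaaS or Uncinus internals:

```python
# Minimal sketch: expose a genomics tool as a service. The endpoint and
# tool invocation are hypothetical; a real deployment would dispatch the
# job to pre-configured HPC worker nodes instead of running it locally.
import subprocess
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/align", methods=["POST"])
def align():
    reads = request.files["reads"]
    reads.save("/tmp/reads.fastq")
    # Hypothetical bowtie2 invocation against a pre-built index.
    result = subprocess.run(
        ["bowtie2", "-x", "/data/index", "-U", "/tmp/reads.fastq"],
        capture_output=True, text=True,
    )
    return jsonify(status=result.returncode, sam=result.stdout[:1000])

if __name__ == "__main__":
    app.run(port=8080)
```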

Relevance:

30.00%

Publisher:

Abstract:

The research aims, first, to develop a protocol that makes it possible to analyze, through a set of indicators, the software reuse process in the development of information systems that model business objects. The protocol consists of an analytical model and analysis grids to be used in classifying and tabulating empirically obtained data. For the initial validation of the analysis protocol, a case study is carried out. The investigation covers one of the first and, at present, largest projects supplying reusable business-oriented software elements, IBM SANFRANCISCO, as well as the first project developed in Brazil on top of what it provides, the Apontamento Universal de Horas (TIME SHEET System) system. Regarding its applicability in practice, the protocol proves comprehensive and adequate for understanding the process. Regarding the case study results, the data analysis reveals a situation in which the researchers' expectations of reuse of business-oriented software elements were higher than what was observed. There was, however, reuse of low-level elements, which provided the infrastructure needed to develop the project. Contextualized against the developers' reuse expectations, the results are positive, insofar as the partnership yielded methodological and technological benefits. On the other hand, some aspects proved restrictive for the application developer, owing to arbitrary choices made by the provider of the reusable elements.

Relevance:

30.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems, with specific attention to the need for collaborative interaction among designers. Particular emphasis was given to issues only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprehends a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the object-oriented framework foundations allowed a series of improvements not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, giving collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. The mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, relying on the Java Virtual Machine as the layer which grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the integration of multimedia metadata into the design data model, explored in the frame of an online educational and training platform.
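A minimal sketch of the inversion-of-control consistency mechanism described above, written in Python for brevity (the actual Cave2 framework is Java-based, and all names here are illustrative): views forward user input as events to the semantic model, which decides whether the change is legal and then refreshes every registered view.

```python
# Sketch of the described pattern: views never mutate state directly;
# the semantic model validates each event and keeps all views consistent.

class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def register(self, view):
        self.views.append(view)

    def handle_event(self, event):
        key, value = event
        if self._change_allowed(key, value):   # semantics decide, not the view
            self.state[key] = value
            for view in self.views:            # multi-view consistency
                view.refresh(key, value)

    def _change_allowed(self, key, value):
        return value is not None               # placeholder design rule

class View:
    def __init__(self, name, model):
        self.name = name
        self.model = model
        model.register(self)

    def user_input(self, key, value):
        # Inversion of control: propagate the interaction as an event.
        self.model.handle_event((key, value))

    def refresh(self, key, value):
        print(f"{self.name}: {key} -> {value}")

model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.user_input("net-width", 4)  # both views refresh
```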

Relevance:

30.00%

Publisher:

Abstract:

This research aimed to apply sociometric theory and its methodology to create an integrated multicultural work team. The study focused on applying the sociometry theory developed by Jacob L. Moreno in 1934 to analyze a small multicultural group. First, a literature review was carried out to gain a better understanding of sociometric theory as well as of the modern tools and software developed to analyze and map social networks. After this part of the study, a qualitative study was conducted in which 26 students from 12 countries, who studied together in a Corporate International Master (2014-2015) developed by Georgetown University's McDonough School of Business, the Corporate Master of Business Administration from ESADE Business School, and FGV/EBAPE, were surveyed and asked to choose the people in the group they were attracted to, rejected, or felt neutral towards in four different scenarios: work team, leadership, trip (leisure time) and personal problem. Additionally, two questions were asked about how they felt when answering the survey and which question(s) was/were difficult to answer and why; the aim of these two questions was to understand the respondents' emotional state when answering the survey and to relate that state to sociometric theory. A sociometric matrix was created in Microsoft Excel from the answers, and the totals of positive, negative and neutral choices were analyzed for each scenario, as well as the mutualities and incongruities of the choices. Furthermore, the software Kumu was used to analyze the connections between the people in the group using three metrics: size, degree centrality and indegree. Kumu was also used to draw the social maps, or sociometric maps. Using the relationship-level analyses of the sociometric matrix and maps, it was possible to create an integrated multicultural work team. The results suggest that the sociometric methodology can be applied to study relationships inside companies, project teams and work teams, and to identify the best work team based on the interrelationships between people, as well as to reveal lack of communication among team members, within a project team, or inside the company as a whole.
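A minimal sketch of the matrix analysis described, assuming an illustrative +1/-1/0 coding for attraction, rejection and neutrality (the study itself used an Excel matrix and the Kumu tool; names and values below are invented):

```python
# Sketch: totals of positive/negative choices received (indegree of
# attractions/rejections) and mutual attractions from a choice matrix.
import numpy as np

names = ["Ana", "Ben", "Chen", "Dara"]
# choices[i][j]: how person i rated person j in the "work team" scenario
choices = np.array([
    [ 0,  1,  1, -1],
    [ 1,  0,  0,  1],
    [ 1, -1,  0,  1],
    [-1,  1,  1,  0],
])

positive_received = (choices == 1).sum(axis=0)   # attraction indegree
negative_received = (choices == -1).sum(axis=0)  # rejection indegree
mutual = [(names[i], names[j])
          for i in range(len(names)) for j in range(i + 1, len(names))
          if choices[i, j] == 1 and choices[j, i] == 1]

for name, pos, neg in zip(names, positive_received, negative_received):
    print(f"{name}: +{pos} / -{neg}")
print("mutual attractions:", mutual)
```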

Relevance:

30.00%

Publisher:

Abstract:

Current policies on education for the visually impaired point to a growing trend of including students with special educational needs in regular schools. However, this inclusion is often not accompanied by appropriately trained professionals or adequate infrastructure, which has become a big problem for regular-school teachers who have students with visual impairments in their classrooms. Based on this situation, the Group of Extension in Tactile Cartography from UNESP - University of the State of São Paulo - Campus de Rio Claro - SP - Brazil has been developing educational material on geography and cartography for blind students at a special school. Among the materials developed, this study highlights graphics and board games provided with sound capabilities through MAPAVOX, software developed in partnership with UFRJ - Federal University of Rio de Janeiro - RJ - Brazil. Through this software, sound capabilities can be inserted into the built materials, giving them a multi-sensory character. In most cases, the conditions required to build materials specific to students with visual impairments are expensive and beyond the means of a regular school, so the survey sought to use easily accessible, low-cost materials such as cork, aluminum foil, fixing materials and others. The development of these materials was supported by preparation in the laboratory and subsequent testing in practice with blind students. The methodology is based on qualitative research and non-comparative analysis of the results; in other words, the material is built from the special students' perception and construction of reality, not as a mere adaptation of visual materials but as a construction focused on the reality of the visually impaired. The results were quite successful, as the materials prepared effectively mediated the learning process of students with disabilities. Geographical and cartographic concepts were grasped by the students through the technology used, associated with materials whose building process took the students' perception into account.

Relevance:

30.00%

Publisher:

Abstract:

A great challenge of Component-Based Development is the creation of mechanisms to facilitate finding reusable assets that fulfill the requirements of a particular system under development. In this sense, some component repositories have been proposed to answer such a need. However, repositories need to represent the asset characteristics that consumers can take into account when choosing the most adequate assets for their needs. In this context, the literature presents some models proposed to describe asset characteristics, such as identification, classification, non-functional requirements, usage and deployment information, and component interfaces. Nevertheless, the set of characteristics represented by those models is insufficient to describe information used before, during and after asset acquisition, such as negotiation, certification, change history, adopted development process, events and exceptions. To overcome this gap, this work proposes an XML-based model to represent several characteristics, of different asset types, that may be employed in component-based development. Besides representing metadata used by consumers, useful for asset discovery, acquisition and usage, this model, called X-ARM, also focuses on supporting asset developers' activities. Since the proposed model represents an expressive amount of information, this work also presents a tool called X-Packager, developed with the goal of helping asset description with X-ARM.
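The abstract does not reproduce the actual X-ARM schema, so the element names below are hypothetical; the sketch only illustrates the kind of asset metadata such a model captures and how a tool in the spirit of X-Packager might read it:

```python
# Illustrative only: a made-up XML asset descriptor covering the kinds of
# characteristics the abstract lists (classification, certification,
# change history, interfaces), parsed with the standard library.
import xml.etree.ElementTree as ET

descriptor = """
<asset id="example.calculator" version="1.2.0">
  <classification domain="finance" type="component"/>
  <certification level="2" authority="hypothetical-board"/>
  <history>
    <change version="1.2.0">Fixed rounding bug</change>
  </history>
  <interface name="ICalculator" signature="add(int,int):int"/>
</asset>
"""

root = ET.fromstring(descriptor)
print("asset:", root.get("id"), root.get("version"))
print("domain:", root.find("classification").get("domain"))
for change in root.iter("change"):
    print("change", change.get("version"), "-", change.text)
```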

Relevance:

30.00%

Publisher:

Abstract:

On the Internet, and especially on the Web, the contemporary individual finds conditions that facilitate building a basic infrastructure based on the concept of the commons, as well as favorable conditions for collaborating and sharing resources for the creation, use, reuse, access and dissemination of information. However, he also faces obstacles such as copyright (Law 9610/98 in Brazil). An alternative is Creative Commons, which not only allows the elaboration, use and dissemination of information under legal conditions but also functions as a facilitator for the development of informational commons. This paper deals with this scenario.

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Pós-graduação em Pesquisa e Desenvolvimento (Biotecnologia Médica) - FMB