943 results for Software analysis


Relevance:

30.00%

Publisher:

Abstract:

The electricity market and the climate are both undergoing change. These changes affect hydropower and create interest in hydropower capacity increases. In this thesis a new methodology was developed that uses short-term hydropower optimisation and planning software to improve the accuracy of profitability analyses for capacity increases. In the methodology, income increases are calculated over month-long periods while varying the average discharge and the electricity price volatility. The monthly incomes are used to construct year scenarios, and from different types of year scenarios a long-term profitability analysis can be made. Average price development is incorporated through a multiplier. The method was applied to the Oulujoki hydropower plants. The capacity additions analysed for Oulujoki were found not to be profitable; however, the methodology proved versatile and useful. The results showed that short periods of peaking prices play a major role in the profitability of capacity increases. Adding more discharge capacity to hydropower plants that initially bypassed water more often showed the greatest improvements in both income and generation-profile flexibility.
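
As a rough illustration of the scenario-based calculation described above, the sketch below sums month-long optimisation results into year scenarios, applies a price-development multiplier, and discounts the extra income against the investment. All figures, the discount rate and the function names are illustrative assumptions, not values or code from the thesis or its planning software.

```python
# Minimal sketch of the scenario-based profitability calculation described above.
# All names, figures and the discount rate are illustrative assumptions.

def year_income(monthly_incomes):
    """Sum twelve month-long optimisation results into one year-scenario income."""
    return sum(monthly_incomes)

def npv_of_capacity_increase(year_incomes, price_multipliers, investment, rate=0.05):
    """Discount the extra income of each scenario year and subtract the investment.

    year_incomes      : extra income per year scenario (EUR), one entry per year
    price_multipliers : factor per year capturing average price development
    investment        : up-front cost of the capacity increase (EUR)
    """
    npv = -investment
    for year, (income, mult) in enumerate(zip(year_incomes, price_multipliers), start=1):
        npv += income * mult / (1.0 + rate) ** year
    return npv

# Example: a hypothetical "wet, volatile-price" year built from monthly results.
wet_volatile_year = year_income([42e3, 38e3, 55e3, 61e3, 70e3, 48e3,
                                 35e3, 33e3, 40e3, 52e3, 58e3, 47e3])
incomes = [wet_volatile_year] * 20              # 20-year horizon, one scenario type
multipliers = [1.02 ** y for y in range(20)]    # assumed 2 %/a average price growth
print(npv_of_capacity_increase(incomes, multipliers, investment=12e6))
```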

Relevance:

30.00%

Publisher:

Abstract:

The organization, management and planning of an information unit comprise several stages and involve the processes and techniques of the librarian's field of practice. This study proposes a restructuring plan for the library of the Centro de Estudos Teológicos das Assembléias de Deus na Paraíba (CETAD/PB). Specifically, it aims to: define an organization system for the collection that leads to user autonomy in information search and retrieval; indicate library management software that meets the needs of the information unit; profile the target audience through a user-study instrument in order to select suitable technological tools; compile a guide to support the restructuring process; and propose measures to regulate the operation of the CETAD/PB library. The methodology follows a qualitative research approach with descriptive and exploratory characteristics. It adopts field research to examine the research setting, the Centro de Estudos Teológicos das Assembléias de Deus na Paraíba (CETAD/PB), as well as the research subjects, namely the institution's students. Data were collected by questionnaire, and quantitative statistical techniques and resources were used to represent the data. The data analysis reveals the profile of the library's users, confirms their dissatisfaction with the organization of the collection, and identifies which technological tools suit that profile for improving the processing and dissemination of information resources as well as user services. It highlights the information professional as a manager of information units, with a role that goes beyond the traditional procedures and techniques of the profession. Keywords: Special Library. Library - Theology. Library Organization.

Relevance:

30.00%

Publisher:

Abstract:

Internship report presented to the Escola Superior de Educação of the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Mestre em Educação Pré-escolar e Ensino do 1.º Ciclo do Ensino Básico (Master's in Pre-School Education and Teaching of the 1st Cycle of Basic Education).

Relevance:

30.00%

Publisher:

Abstract:

A two-step etching technique for fine-grained calcite mylonites using 0.37% hydrochloric and 0.1% acetic acid produces a topographic relief that reflects the grain boundary geometry. With this technique, calcite grain boundaries are dissolved more intensely than the grain interiors, whereas second-phase minerals such as dolomite, quartz, feldspars, apatite, hematite and pyrite are not affected by the acid and therefore form topographic peaks. Based on digital backscatter electron images and element distribution maps acquired on a scanning electron microscope, the geometry of calcite and the second-phase minerals can be quantified automatically using image analysis software. For research on fine-grained carbonate rocks (e.g. dolomite-calcite mixtures), this low-cost approach is an attractive alternative to generating manual grain boundary maps from photographs of ultra-thin sections or from orientation contrast images.
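
The kind of automated quantification mentioned above can be illustrated with a small segmentation sketch: thresholding a backscatter electron image to separate bright second-phase peaks from the calcite matrix and measuring grain areas. The file name, the thresholding strategy and the choice of scikit-image are assumptions for illustration; the study's own image analysis software is not specified here.

```python
# Illustrative sketch only; not the image analysis workflow used in the study.
import numpy as np
from skimage import io, filters, measure, morphology

bse = io.imread("bse_image.tif", as_gray=True)     # hypothetical input image

# Bright second-phase grains (e.g. pyrite, hematite) vs. darker calcite matrix.
thresh = filters.threshold_otsu(bse)
second_phase = bse > thresh

# Remove small speckles and label connected grains.
second_phase = morphology.remove_small_objects(second_phase, min_size=20)
labels = measure.label(second_phase)

# Grain-size statistics (pixel units; a scale factor would convert to microns).
areas = np.array([r.area for r in measure.regionprops(labels)])
print(f"{labels.max()} second-phase grains, median area {np.median(areas):.1f} px")
```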

Relevance:

30.00%

Publisher:

Abstract:

The central motif of this work is prediction and optimization in the presence of multiple interacting intelligent agents. We use the phrase "intelligent agents" to imply, in some sense, a "bounded rationality", the exact meaning of which varies depending on the setting. Our agents may not be "rational" in the classical game-theoretic sense, in that they do not always optimize a global objective; rather, they rely on heuristics, as is natural for human agents or even software agents operating in the real world. Within this broad framework we study the problem of influence maximization in social networks, where the behavior of agents is myopic but complications stem from the structure of the interaction networks. In this setting we generalize two well-known models and give new algorithms and hardness results for our models. We then move on to models where the agents reason strategically but face considerable uncertainty; for such games we give a new solution concept and analyze a real-world game using our techniques. Finally, the richest model we consider is Network Cournot Competition, which deals with strategic resource allocation in hypergraphs, where agents reason strategically and their interaction is specified indirectly via the players' utility functions. For this model we give the first equilibrium computability results. In all of the above problems we assume that the payoffs for the agents are known; however, for real-world games, obtaining the payoffs can be quite challenging. To this end, we also study the inverse problem of inferring payoffs given the game history. We propose and evaluate a data-analytic framework and show that it is fast and performant.
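
For readers unfamiliar with the classical setting that such work builds on, the sketch below shows greedy influence maximization under the standard independent cascade model (Monte Carlo spread estimation plus greedy hill-climbing). It is not the thesis's own generalized models or algorithms; the toy graph, the activation probability and the run counts are illustrative assumptions.

```python
# Classical greedy influence maximization sketch (independent cascade model).
# Not the thesis's algorithms; all parameters below are illustrative.
import random

def simulate_cascade(graph, seeds, p=0.1):
    """One Monte Carlo run of the independent cascade model; returns spread size."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and random.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seed_selection(graph, k, runs=200):
    """Pick k seeds by greedy hill-climbing on the estimated expected spread."""
    seeds = []
    for _ in range(k):
        best, best_gain = None, -1.0
        for candidate in graph:
            if candidate in seeds:
                continue
            spread = sum(simulate_cascade(graph, seeds + [candidate])
                         for _ in range(runs)) / runs
            if spread > best_gain:
                best, best_gain = candidate, spread
        seeds.append(best)
    return seeds

toy_graph = {0: [1, 2], 1: [2, 3], 2: [3], 3: [4], 4: []}   # hypothetical network
print(greedy_seed_selection(toy_graph, k=2))
```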

Relevance:

30.00%

Publisher:

Abstract:

One way to carry out a bibliometric study is to examine each record in a database and extract the key fields that may disclose relevant information about the use of the database and the documents in the collection. This article shows how a reference database can yield important data that lead to conclusions which are in some cases surprising. The study used the following fields of the documentary control database on the Indigenous Nationalities of Costa Rica, 1979-2003: author, place of publication, publisher, year, language and format. The database analysed contains two thousand records and was developed in Winisis. The documents were analysed after the data were processed, which consisted of exporting the Winisis records to Excel. The selected fields were then extracted and presented in separate tables or graphs. In addition, the application of different bibliometric indicators is demonstrated, such as the Price Index and the Collaboration Index. This contribution is intended, first, for students of the Metric Studies course of the Library and Information Science programme at the National University, so that they can demonstrate and practise what they have learned in this area. It may also benefit professionals from other areas, such as anthropologists, sociologists, linguists and librarians, among others.
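
The two indicators mentioned above can be computed directly from records exported to a spreadsheet or CSV file, as in the sketch below. The field names, the CSV file and the exact definitions used in the original study are assumptions; standard textbook definitions are applied here (Price Index as the share of documents at most five years old at the reference date, Collaboration Index as the mean number of authors per document).

```python
# Illustrative computation of the Price Index and the Collaboration Index from an
# exported record file. Field names and file are hypothetical, not from the study.
import csv

def price_index(years, reference_year, window=5):
    """Percentage of documents no older than `window` years at the reference date."""
    recent = sum(1 for y in years if reference_year - y <= window)
    return 100.0 * recent / len(years)

def collaboration_index(author_fields, sep=";"):
    """Mean number of authors per document, authors separated by `sep`."""
    counts = [len([a for a in field.split(sep) if a.strip()]) for field in author_fields]
    return sum(counts) / len(counts)

with open("registros.csv", newline="", encoding="utf-8") as fh:   # hypothetical export
    rows = list(csv.DictReader(fh))

years = [int(r["year"]) for r in rows if r["year"].isdigit()]
print("Price Index: %.1f %%" % price_index(years, reference_year=2003))
print("Collaboration Index: %.2f authors/doc"
      % collaboration_index(r["author"] for r in rows))
```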

Relevance:

30.00%

Publisher:

Abstract:

Internship report presented to the Escola Superior de Educação of the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Mestre em Educação Pré-Escolar e Ensino do 1º Ciclo Básico (Master's in Pre-School Education and Teaching of the 1st Cycle of Basic Education).

Relevance:

30.00%

Publisher:

Abstract:

The knowledge-intensive character of software production and its rising demand suggest the need to establish mechanisms for properly managing the knowledge involved in order to meet requirements of deadlines, costs and quality. Knowledge capitalization is a process that ranges from the identification to the evaluation of the knowledge produced and used. In software development specifically, capitalization enables easier access to knowledge, minimizes its loss, shortens the learning curve, and helps avoid repeated errors and rework. This thesis therefore presents Know-Cap, a method developed to organize and guide the capitalization of knowledge in software development. Know-Cap facilitates the location, preservation, value addition and updating of knowledge so that it can be used in the execution of new tasks. The method was derived from a set of methodological procedures: a literature review, a systematic review and an analysis of related work. The feasibility and appropriateness of Know-Cap were analysed through an application study conducted in a real case and an analytical study of software development companies. The results obtained indicate that Know-Cap supports the capitalization of knowledge in software development.

Relevance:

30.00%

Publisher:

Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to many organizations in many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements through an exploratory, descriptive and explanatory multiple case study. Eight software development companies from the Recife information technology cluster were visited; at each, a form was applied and an interview with one of the firm's main professionals was conducted. Although the study is predominantly qualitative, part of the data was analysed through charts and graphs, providing an overview of the companies and their environment that supported the analysis carried out through interpretation of the interviews. As a result, it was found that the companies are structured around hybrid business models drawn from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant, while Scrum is the methodology most used among organizations following the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms. In this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors a chance to act independently.

Relevance:

30.00%

Publisher:

Abstract:

In this work the split-field finite-difference time-domain method (SF-FDTD) is extended to the analysis of two-dimensionally periodic structures with third-order nonlinear media. The accuracy of the method is verified by comparison with the nonlinear Fourier Modal Method (FMM). Once the formalism has been validated, examples of one- and two-dimensional nonlinear gratings are analysed. For the 2D case, the resonance shift in resonant waveguides is corroborated; here not only the scalar Kerr effect is considered, but also the tensorial nature of the third-order nonlinear susceptibility. The use of nonlinear materials in this kind of device makes it possible to design tunable components such as variable band filters. However, the third-order nonlinear susceptibility is usually small, and high intensities are needed to trigger the nonlinear effect. A one-dimensional CBG is therefore analysed in both the linear and nonlinear regimes, and the shift of the resonance peaks is obtained numerically for both TE and TM polarizations. Because the method is based on the finite-difference time-domain scheme, the problem can be analysed in the time domain, and bistability curves are also computed numerically; these curves show how the nonlinear effect modifies the properties of the structure as a function of the input pump field. When the nonlinear behaviour is taken into account, the estimation of the electric field components becomes more challenging, so we also present a set of acceleration strategies based on parallel software and hardware solutions.
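
To give a feel for how a third-order nonlinearity enters a time-domain update, the sketch below is a deliberately simplified one-dimensional FDTD loop with an instantaneous scalar Kerr term added to the permittivity. It is not the split-field (SF-FDTD) formalism for periodic structures described above; the grid size, Kerr coefficient, source and normalized units are illustrative assumptions.

```python
# Simplified 1D FDTD with an instantaneous scalar Kerr nonlinearity (normalized units).
# Illustration only; not the SF-FDTD scheme of the paper.
import numpy as np

nz, nt = 400, 2000
ez = np.zeros(nz)          # electric field
hy = np.zeros(nz - 1)      # magnetic field on the staggered grid
eps_r, chi3 = 2.25, 1e-3   # linear permittivity and assumed Kerr coefficient
courant = 0.5              # normalized dt/dx

for n in range(nt):
    # H update (free-space, normalized)
    hy += courant * np.diff(ez)

    # Kerr medium: effective permittivity depends on the local intensity |E|^2
    eps_eff = eps_r + chi3 * ez[1:-1] ** 2
    ez[1:-1] += (courant / eps_eff) * np.diff(hy)

    # Soft source: a modulated Gaussian pulse injected near the left boundary
    ez[10] += np.exp(-((n - 300) / 80.0) ** 2) * np.cos(2 * np.pi * n / 25.0)

print("peak field after the run:", np.abs(ez).max())
```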

Relevance:

30.00%

Publisher:

Abstract:

The current study is a post-hoc analysis of data from the original randomized controlled trial of the Play and Language for Autistic Youngsters (PLAY) Home Consultation program, a parent-mediated, DIR/Floortime-based early intervention program for children with ASD (Solomon, Van Egeren, Mahone, Huber, & Zimmerman, 2014). We examined 22 children from the original RCT who received the PLAY program. Children were split into two groups (high and lower functioning) based on the ADOS module administered prior to intervention. Fifteen-minute parent-child video sessions were coded using CHILDES transcription software. Child and maternal language, communicative behaviors, and communicative functions were assessed in the natural language samples both pre- and post-intervention. Results demonstrated significant improvements in both child and maternal behaviors following intervention. There was a significant increase in child verbal and non-verbal initiations and verbal responses in the whole-group analysis. Total number of utterances, word production, and grammatical complexity all improved significantly across the whole group of participants; however, lexical growth did not reach significance. Changes in child communicative function were especially noteworthy, with a significant increase in social interaction and a significant decrease in non-interactive behaviors. Further, mothers demonstrated an increase in responsiveness to the child's conversational bids, an increased ability to follow the child's lead, and a decrease in directiveness. When the groups were analysed separately, trends emerged for child and maternal variables, suggesting greater gains in the use of communicative functions in both the high and lower functioning groups than in linguistic structure. Additional analysis also revealed a significant inverse relationship between maternal responsiveness and child non-interactive behaviors: as mothers became more responsive, children's non-engagement decreased. These changes further suggest that skills learned through PLAY parent training may result in improvements in child social interaction and language abilities.

Relevance:

30.00%

Publisher:

Abstract:

Background: Understanding transcriptional regulation through genome-wide microarray studies can help unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. Existing software systems for microarray data analysis implement these standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays and for the most common synthesized oligo arrays, such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can use a compute cluster for distributed services. Conclusion: Our model-driven approach to automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web services is advantageous in a distributed client-server environment, as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extension towards future transcriptomics methods based on high-throughput sequencing, which have much higher computational requirements than microarrays.
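
The analysis functions mentioned above are built on R and Bioconductor; the sketch below is not EMMA 2 code. It only illustrates, in Python for consistency with the other examples in this listing, one preprocessing step that such microarray pipelines typically provide: quantile normalization of a genes-by-arrays expression matrix. The toy data are made up.

```python
# Quantile normalization of a (genes x arrays) expression matrix -- generic
# illustration of a common microarray preprocessing step, not EMMA 2's implementation.
import numpy as np

def quantile_normalize(expr):
    """Force all arrays (columns) to share the same value distribution."""
    ranks = np.argsort(np.argsort(expr, axis=0), axis=0)   # rank of each value per array
    mean_of_sorted = np.sort(expr, axis=0).mean(axis=1)    # reference distribution
    return mean_of_sorted[ranks]

toy = np.array([[5.0, 4.0, 3.0],
                [2.0, 1.0, 4.0],
                [3.0, 4.0, 6.0],
                [4.0, 2.0, 8.0]])
print(quantile_normalize(toy))
```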

Relevance:

30.00%

Publisher:

Abstract:

C3S2E '16 Proceedings of the Ninth International C* Conference on Computer Science & Software Engineering