994 results for statistical software


Relevance: 20.00%

Abstract:

Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation emerges as an adequate solution for understanding and analysing the timing behaviour of actual systems. However, care must be taken with the obtained outputs, otherwise the results may lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be placed on their mean values. This is the basis for a discussion on the applicability of such approaches to deriving confidence on the tail of distributions, where the worst case is expected to lie.
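
As an illustration of deriving confidence for tail values rather than means, the sketch below computes a distribution-free upper confidence bound on a high quantile of simulated response times using order statistics; this is a minimal example under an i.i.d. assumption, not the specific technique evaluated in the paper.

```python
# Distribution-free upper confidence bound for a tail quantile of
# simulation output (e.g., response times), based on order statistics.
# Minimal sketch assuming i.i.d. replications; data are mock values.
import numpy as np
from scipy.stats import binom

def quantile_upper_bound(samples, p=0.999, conf=0.95):
    """Smallest order statistic that upper-bounds the p-quantile with
    probability >= conf, or None if there are too few samples."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    # X_(k) >= q_p iff at most k-1 samples fall below q_p, so
    # P(X_(k) >= q_p) = BinomCDF(k-1; n, p); pick the smallest such k.
    for k in range(1, n + 1):
        if binom.cdf(k - 1, n, p) >= conf:
            return x[k - 1]
    return None

rng = np.random.default_rng(0)
delays = rng.gamma(shape=2.0, scale=1.0, size=20000)  # mock output data
print(quantile_upper_bound(delays, p=0.999, conf=0.95))
```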

Relevance: 20.00%

Abstract:

A number of characteristics are boosting the eagerness to extend Ethernet to also cover factory-floor distributed real-time applications. Full-duplex links, non-blocking and priority-based switching, and bandwidth availability, just to mention a few, are characteristics upon which that eagerness is building. But will Ethernet technologies really manage to replace traditional fieldbus networks? Ethernet technology, by itself, does not include features above the lower layers of the OSI communication model. In the past few years, a considerable amount of work has been devoted to the timing analysis of Ethernet-based technologies. The majority of those works, however, are restricted to the analysis of subsets of the overall computing and communication system, and thus do not address timeliness at a holistic level. To this end, we are addressing a few interlinked research topics with the purpose of setting a framework for the development of tools suitable for extracting temporal properties of Commercial-Off-The-Shelf (COTS) Ethernet-based factory-floor distributed systems. This framework is being applied to a specific COTS technology, Ethernet/IP. In this paper, we reason about the modelling and simulation of Ethernet/IP-based systems, and about the use of statistical analysis techniques to provide usable results. Discrete-event simulation models of a distributed system can be a powerful tool for the timeliness evaluation of the overall system, but particular care must be taken with the results provided by traditional statistical analysis techniques.
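
A minimal discrete-event sketch of the kind of model discussed above, assuming the SimPy library: one switch output port serving two priority classes of frames. The traffic parameters are illustrative and unrelated to the Ethernet/IP case study.

```python
# One switch output port with two priority classes, modelled as a
# non-preemptive priority queue; observed maxima are not bounds.
import random
import simpy

LINK_RATE = 100e6          # 100 Mbit/s full-duplex link
delays = {0: [], 1: []}    # per-priority queuing + transmission delays

def source(env, port, prio, frame_bits, period):
    while True:
        yield env.timeout(random.expovariate(1.0 / period))
        env.process(transmit(env, port, prio, frame_bits))

def transmit(env, port, prio, frame_bits):
    arrival = env.now
    with port.request(priority=prio) as req:  # lower value = higher priority
        yield req
        yield env.timeout(frame_bits / LINK_RATE)  # frame transmission time
    delays[prio].append(env.now - arrival)

random.seed(1)
env = simpy.Environment()
port = simpy.PriorityResource(env, capacity=1)
env.process(source(env, port, 0, 1000 * 8, 1e-3))  # control traffic
env.process(source(env, port, 1, 1500 * 8, 2e-4))  # bulk traffic
env.run(until=10.0)
print(max(delays[0]), max(delays[1]))
```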

Relevance: 20.00%

Abstract:

From the nineteen-nineties onwards, calculation tools began to appear on the market with the aim of streamlining the engineering design of construction projects. Until the end of the seventies, the existing computers were enormous, and only organisations with great economic power could acquire them. In the eighties, the PC (Personal Computer) appeared on the market; these small machines began to be acquired by companies in general, and in Portugal, by the end of that decade, it was already possible to find individuals who owned their own PC. In the nineties, the flow of new graduates from higher-education institutions fostered the emergence of software companies dedicated to developing software according to the needs of the market itself, resulting in custom commercial software and commercial off-the-shelf (COTS) software. Commercial software, being used by a large number of people (in the case of COTS, easily reaching the thousands), is in a position to evolve according to the systematic demands of the market, reaching high levels of compliance with quality requirements, namely regarding functionality, reliability, usability, maintainability, efficiency, portability, and quality in use. The use of commercial software in construction engineering design is nowadays an absolutely widespread practice. Selecting the software can become a complex process, especially in areas where the offer is large. The use of well-defined evaluation criteria can streamline the process and give greater assurance at the moment of the final decision. This document presents a proposed methodology for evaluating and comparing software packages.
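
A minimal sketch of the kind of criteria-based comparison the document proposes, using a weighted sum over the quality attributes listed above; the weights and candidate scores are hypothetical.

```python
# Weighted-sum comparison of software candidates over quality attributes.
QUALITY_ATTRIBUTES = ["functionality", "reliability", "usability",
                      "maintainability", "efficiency", "portability",
                      "quality in use"]

def weighted_score(scores, weights):
    """Aggregate per-attribute scores (0-5) with normalized weights."""
    total_w = sum(weights[a] for a in QUALITY_ATTRIBUTES)
    return sum(scores[a] * weights[a] for a in QUALITY_ATTRIBUTES) / total_w

weights = {"functionality": 3, "reliability": 3, "usability": 2,
           "maintainability": 2, "efficiency": 1, "portability": 1,
           "quality in use": 2}
candidates = {  # hypothetical evaluation scores for two packages
    "package A": {"functionality": 4, "reliability": 5, "usability": 3,
                  "maintainability": 3, "efficiency": 4, "portability": 2,
                  "quality in use": 4},
    "package B": {"functionality": 5, "reliability": 3, "usability": 4,
                  "maintainability": 4, "efficiency": 3, "portability": 4,
                  "quality in use": 3},
}
for name, scores in sorted(candidates.items(),
                           key=lambda kv: -weighted_score(kv[1], weights)):
    print(f"{name}: {weighted_score(scores, weights):.2f}")
```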

Relevance: 20.00%

Abstract:

Final Master's Project submitted for the degree of Master in Electronics and Telecommunications Engineering.

Relevance: 20.00%

Abstract:

The principal topic of this work is the application of data mining techniques, in particular machine learning, to the discovery of knowledge in a protein database. In the first chapter a general background is presented. Namely, in section 1.1 we overview the methodology of a data mining project and its main algorithms. In section 1.2 an introduction to proteins and their supporting file formats is outlined. The chapter concludes with section 1.3, which defines the main problem we intend to address with this work: to determine, in a discrete way (i.e. not continuous), whether an amino acid is exposed or buried in a protein, for five exposure levels: 2%, 10%, 20%, 25% and 30%. In the second chapter, following closely the CRISP-DM methodology, the whole process of constructing the database that supported this work is presented. Namely, the process of loading data from the Protein Data Bank, DSSP and SCOP is described. Then an initial data exploration is performed and a simple prediction model (baseline) of the relative solvent accessibility of an amino acid is introduced. The Data Mining Table Creator, a program developed to produce the data mining tables required for this problem, is also introduced. In the third chapter the results obtained are analyzed with statistical significance tests. Initially the several classifiers used (Neural Networks, C5.0, CART and CHAID) are compared, and it is concluded that C5.0 is the most suitable for the problem at stake. The influence of parameters such as the amino acid information level, the amino acid window size and the SCOP class type on the accuracy of the predictive models is also compared. The fourth chapter starts with a brief review of the literature on amino acid relative solvent accessibility. Then we overview the main results achieved and finally discuss possible future work. The fifth and last chapter consists of appendices. Appendix A has the schema of the database that supported this thesis. Appendix B has a set of tables with additional information. Appendix C describes the software provided on the DVD accompanying this thesis, which allows the reconstruction of the present work.
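
The buried/exposed prediction task can be pictured with a small decision-tree experiment; the sketch below uses scikit-learn's CART-style classifier as a stand-in for C5.0 (which has no standard Python implementation), with mock features in place of the thesis's amino acid window descriptors.

```python
# Toy buried/exposed classification at the 20% exposure cut-off.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n, window = 5000, 7                   # residues, window of neighbours
X = rng.normal(size=(n, window * 4))  # mock per-residue descriptors
threshold = 0.20                      # one of the five exposure levels
rsa = rng.uniform(0.0, 1.0, size=n)   # mock relative solvent accessibility
y = (rsa > threshold).astype(int)     # 1 = exposed, 0 = buried

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X_tr, y_tr)
# On random features, accuracy stays near the majority-class rate.
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```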

Relevance: 20.00%

Abstract:

In the past years, Software Architecture has attracted increased attention from academia and industry as the unifying concept for structuring the design of complex systems. One particular research area deals with the possibility of reconfiguring architectures to adapt the systems they describe to new requirements. Reconfiguration amounts to adding and removing components and connections, and may have to occur without stopping the execution of the system being reconfigured. This work contributes to the formal description of such a process. Taking as a premise that a single formalism hardly ever satisfies all requirements in every situation, we present three approaches, each with its own assumptions about the systems it can be applied to and with different advantages and disadvantages. Each approach is based on the work of other researchers and has the aesthetic concern of changing the original formalism as little as possible, keeping its spirit. The first approach shows how a given reconfiguration can be specified in the same manner as the system it is applied to, and in a way that can be efficiently executed. The second approach explores the Chemical Abstract Machine, a formalism for rewriting multisets of terms, to describe architectures, computations, and reconfigurations in a uniform way. The last approach uses a UNITY-like parallel programming design language to describe computations, represents architectures by diagrams in the sense of Category Theory, and specifies reconfigurations by graph transformation rules.
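
To give a flavour of the second approach, the toy sketch below performs one multiset-rewriting step in the spirit of the Chemical Abstract Machine, reading a reconfiguration as the replacement of a failed component; the molecule names are illustrative, not the formalism's actual syntax.

```python
# One rewriting step on a multiset "solution": a rule fires when its
# left-hand side is contained in the solution.
from collections import Counter

def apply_rule(solution, lhs, rhs):
    """Replace one occurrence of multiset lhs by rhs, if lhs is present."""
    if all(solution[m] >= k for m, k in lhs.items()):
        return solution - Counter(lhs) + Counter(rhs)
    return None  # rule not enabled

# Reconfiguration as rewriting: swap out a failed server and relink.
solution = Counter({"client": 1, "link(client,server)": 1, "server!": 1})
rule = (Counter({"link(client,server)": 1, "server!": 1}),
        Counter({"link(client,server2)": 1, "server2": 1}))
print(apply_rule(solution, *rule))
```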

Relevance: 20.00%

Abstract:

The recent trends in chip architectures towards a higher number of heterogeneous cores, with non-uniform memory and non-coherent caches, bring renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, defending that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a given set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
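
A minimal sketch of the multi-version idea defended above, assuming a fixed bound on the number of retained versions (in the paper this bound is derived from the task set): readers take the latest committed snapshot without locking, while update transactions serialize among themselves.

```python
# Bounded multi-version shared object: wait-free reads, serialized writes.
import threading
from collections import deque

class VersionedObject:
    def __init__(self, initial, max_versions=4):   # bound assumed here
        self._versions = deque([initial], maxlen=max_versions)
        self._lock = threading.Lock()              # writers only

    def read(self):
        # Wait-free for readers: no lock, just the latest committed version.
        return self._versions[-1]

    def commit(self, new_value):
        with self._lock:                 # update transactions serialize
            self._versions.append(new_value)  # oldest version is recycled

obj = VersionedObject({"x": 0})
obj.commit({"x": 1})
print(obj.read())   # {'x': 1}
```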

Relevance: 20.00%

Abstract:

Introduction – Cutaneous malignant melanoma (CMM) is considered one of the most lethal neoplasms, and its follow-up relies, besides clinical examinations and the analysis of tumour markers, on several imaging methods, such as Positron Emission Tomography/Computed Tomography (PET/CT) with 18F-fluorodeoxyglucose (18F-FDG). The present study aims to evaluate the usefulness of PET/CT in assessing the extent of disease and suspected recurrence of CMM, comparing the imaging findings with those described in CT studies. Methodology – Retrospective study of 62 PET/CT studies performed on 50 patients diagnosed with CMM. One study with an equivocal result (pulmonary nodule) was excluded. Information on the results of the anatomopathological studies and the imaging examinations was obtained from the clinical history and the medical reports of the CT and PET/CT studies. A database of the collected data was created in Excel and a descriptive statistical analysis was performed. Results – Of the PET/CT studies analysed, 31 were considered true positives (TP), 28 true negatives (TN), one a false positive (FP) and one a false negative (FN). The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and accuracy of PET/CT for the staging and evaluation of suspected recurrence of CMM are, respectively, 96.9%, 96.6%, 96.9%, 96.6% and 96.7%. Of the CT results considered in the statistical analysis, 14 corresponded to TP, 12 to TN, three to FP and five to FN. The sensitivity, specificity, PPV, NPV and accuracy of CT for the staging and evaluation of suspected recurrence of CMM are, respectively, 73.7%, 80.0%, 82.4%, 70.6% and 76.5%. Compared with the CT results, PET/CT allowed a change in therapeutic management in 23% of the studies. Conclusion – PET/CT is a useful examination in the evaluation of CMM, characterized by greater diagnostic accuracy in the staging and evaluation of suspected recurrence of CMM compared with CT alone.
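
For reference, the reported PET/CT figures can be reproduced directly from the confusion counts given above (31 TP, 28 TN, 1 FP, 1 FN):

```python
# Diagnostic performance metrics from confusion counts.
def diagnostic_metrics(tp, tn, fp, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

for name, value in diagnostic_metrics(tp=31, tn=28, fp=1, fn=1).items():
    print(f"{name}: {value:.1%}")  # 96.9%, 96.6%, 96.9%, 96.6%, 96.7%
```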

Relevance: 20.00%

Abstract:

The foreseen evolution of chip architectures towards a higher number of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler to assure schedulability. For such a purpose, the contention management policy should be aware of on-line scheduling information.
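
A minimal sketch of a scheduling-aware contention management policy of the kind argued for here: on conflict, the transaction belonging to the lower-priority task is aborted, so retries never delay higher-priority work. The Transaction record and resolve() policy are illustrative assumptions, not the paper's algorithm.

```python
# Priority-based conflict resolution between two transactions.
from dataclasses import dataclass

@dataclass
class Transaction:
    task_id: int
    priority: int      # lower value = higher scheduling priority
    retries: int = 0

def resolve(attacker: Transaction, holder: Transaction) -> Transaction:
    """Return the transaction to abort when attacker conflicts with holder."""
    loser = attacker if attacker.priority > holder.priority else holder
    loser.retries += 1
    return loser

t_hi = Transaction(task_id=1, priority=1)
t_lo = Transaction(task_id=2, priority=5)
print(resolve(t_lo, t_hi).task_id)  # 2: the low-priority transaction aborts
```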

Relevance: 20.00%

Abstract:

The aim of the TeleRisk Project on labour relations and professional risks within the context of teleworking in Portugal – supported by IDICT, the Institute for Development and Inspection of Working Conditions (Ministry of Labour) – is to study the practices and forms of teleworking in the manufacturing sectors in Portugal. The project also chose the software industry as a reference sector, while not intending to exclude from the study any other sector of activity or the so-called "hybrid" forms of work, provided the latter have some of the characteristics of telework. The project thus takes into account the so-called "traditional" sectors of activity, namely textiles and machinery and metal engineering (machinery and equipment), not usually associated with this type of work. In these "traditional" sectors, however, telework could include variations that are not found in technologically based sectors. One of the methods for evaluating the dynamics associated with telework consisted in carrying out surveys, by means of questionnaires, aimed at employers in the sectors analysed. This paper presents some of the results of those surveys. It is important to mention that, being a preliminary analysis, it does not claim to have exhausted all the issues in the survey, but rather to show the main tendencies in the teleworking practices of Portuguese industry.

Relevance: 20.00%

Abstract:

Consider a multihop network comprising Ethernet switches. The traffic is described as a set of flows, and each flow is characterized by its source node, its destination node, its route, and parameters in the generalized multiframe model. Output queues on Ethernet switches are scheduled by static-priority scheduling, and tasks executing on the processor in an Ethernet switch are scheduled by stride scheduling. We present a schedulability analysis for this setting.
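
For context, the sketch below implements the classical static-priority response-time recurrence for a single resource; the paper's analysis extends this style of reasoning to multihop switched Ethernet and the generalized multiframe model, which this sketch does not capture.

```python
# Classical response-time analysis for static-priority scheduling:
# R_i = C_i + sum over higher-priority j of ceil(R_i / T_j) * C_j.
import math

def response_times(C, T, D):
    """Worst-case response times for tasks sorted by decreasing priority;
    each task i has cost C[i], period T[i], deadline D[i]."""
    R = []
    for i in range(len(C)):
        r, prev = C[i], 0.0
        while r != prev:               # fixed-point iteration
            prev = r
            r = C[i] + sum(math.ceil(r / T[j]) * C[j] for j in range(i))
            if r > D[i]:
                return None            # task i is unschedulable
        R.append(r)
    return R

# Three flows/tasks: (C, T, D) in the same time unit.
print(response_times(C=[1, 2, 3], T=[5, 10, 20], D=[5, 10, 20]))  # [1, 3, 7]
```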

Relevance: 20.00%

Abstract:

This thesis describes the application of automatic learning methods to a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs), taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass; Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set); Counter-Propagation Neural Networks (CPNNs) gave similar results. The use of supervised learning techniques improved the results: to 77% of correct assignments when an ensemble of ten FFNNs was used, and to 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program; in the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs, in order to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation; EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number levels, respectively. Experiments on the classification of reactions from the main reactants and products were performed with RFs: EC numbers were assigned at the class, subclass and sub-subclass levels with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway: a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES to be used in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
A remarkable ability of the NN models to interpolate between distant curves and to accurately reproduce potentials to be used in molecular simulations is shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems, the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform as well as or even better than analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
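
The difference-descriptor idea behind MOLMAP can be illustrated with a toy encoding in which each molecule is a vector of bond-type counts and a reaction is the difference products minus reactants; real MOLMAPs map bonds onto a trained Kohonen SOM rather than the fixed bond-type list assumed below.

```python
# Toy difference descriptor: broken bonds appear as negative entries,
# made bonds as positive ones.
import numpy as np

BOND_TYPES = ["C-C", "C=C", "C-H", "C-O", "C=O", "O-H"]

def molmap_like(bond_counts):
    return np.array([bond_counts.get(b, 0) for b in BOND_TYPES], float)

def reaction_descriptor(reactants, products):
    return sum(map(molmap_like, products)) - sum(map(molmap_like, reactants))

# Hydration of ethene: C2H4 + H2O -> ethanol (bond counts per molecule).
ethene = {"C=C": 1, "C-H": 4}
water = {"O-H": 2}
ethanol = {"C-C": 1, "C-H": 5, "C-O": 1, "O-H": 1}
print(reaction_descriptor([ethene, water], [ethanol]))
# [ 1. -1.  1.  1.  0. -1.] : C=C and one O-H broken; C-C, C-H, C-O made
```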

Relevance: 20.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Purpose: To compare image quality and effective dose when the 10 kVp rule is applied in manual and AEC modes in PA chest X-ray. Methods and Materials: A total of 68 images (with and without lesions) were acquired of an anthropomorphic chest phantom on a Wolverson Arcoma X-ray unit. The images were evaluated against a reference image by five radiographers, using image quality criteria and the two-alternative forced choice (2-AFC) method. The effective dose was calculated with the PCXMC software from the exposure parameters and the DAP. The exposure index (lgM) was recorded. Results: Exposure time decreases considerably more when the 10 kVp rule is applied in manual mode (50%-28%) than in AEC mode (36%-23%). Statistically significant differences in effective dose between the several AEC modes were found (p=0.002); the effective dose is lowest when only the right AEC ionisation chamber is used. Regarding image quality, there are no statistically significant differences (p=0.348) between the different AEC modes for images with no lesions. With a higher kVp value, the lgM values also increase, and the lgM values showed statistically significant differences (p=0.000). The image quality scores did not present statistically significant differences (p=0.043) between manual and AEC modes for the images with lesions. Conclusion: In general, the dose is lower in manual mode. Using the right AEC ionisation chamber yields the lowest effective dose in comparison with the other ionisation chambers. The use of the 10 kVp rule did not affect the detectability of the lesions.
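
For clarity, the 10 kVp rule applied in the study can be stated as a one-line transformation of the exposure technique (raise the tube voltage by 10 kVp and halve the mAs, keeping detector exposure roughly constant while lowering patient dose); the baseline values below are illustrative, not the study's exposure chart.

```python
# The 10 kVp rule as a technique transformation.
def apply_10kvp_rule(kvp, mas, steps=1):
    """Each step: +10 kVp, mAs halved."""
    for _ in range(steps):
        kvp, mas = kvp + 10, mas / 2.0
    return kvp, mas

baseline = (110, 4.0)               # hypothetical PA chest technique
print(apply_10kvp_rule(*baseline))  # (120, 2.0)
```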