933 results for Software Package Data Exchange (SPDX)


Relevance: 30.00%

Abstract:

Objectives. To determine the effect of ion exchange on slow crack growth (SCG) parameters (n, stress corrosion susceptibility coefficient, and σf0, scaling parameter) and Weibull parameters (m, Weibull modulus, and σ0, characteristic strength) of a dental porcelain. Methods. 160 porcelain discs were fabricated according to the manufacturer's instructions, polished through 1 µm and divided into two groups: GC (control) and GI (submitted to an ion exchange procedure using a KNO3 paste at 470 °C for 15 min). SCG parameters were determined by the biaxial flexural strength test in artificial saliva at 37 °C using five constant stress rates (n = 10). Twenty specimens of each group were tested at 1 MPa/s to determine the Weibull parameters. The SPT diagram was constructed using the least-squares fit of the strength data versus probability of failure. Results. Mean values of m and σ0 (95% confidence interval), n and σf0 (standard deviation) were, respectively: 13.8 (10.1-18.8) and 60.4 (58.5-62.2), 24.1 (2.5) and 58.1 (0.01) for GC, and 7.4 (5.3-10.0) and 136.8 (129.1-144.7), 36.7 (7.3) and 127.9 (0.01) for GI. Fracture stresses (MPa) calculated using the SPT diagram for lifetimes of 1 day, 1 year and 10 years (at a 5% failure probability) were, respectively, 31.8, 24.9 and 22.7 for GC, and 71.2, 60.6 and 56.9 for GI. Significance. For the porcelain tested, the ion exchange process improved strength and resistance to SCG; however, the material's reliability decreased. The predicted fracture stress at 5% failure probability for a lifetime of 10 years was also higher for the ion-treated group. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
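As a rough worked example of how the Weibull parameters above translate into a design stress, the sketch below inverts the two-parameter Weibull distribution, Pf = 1 − exp(−(σ/σ0)^m), at a 5% failure probability using the m and σ0 values quoted for GC and GI. It ignores the time/SCG transformation of the SPT diagram (which additionally involves n), so it is only an illustration, not a reproduction of the paper's lifetime predictions.

import math

def stress_at_failure_probability(m, sigma0, pf):
    """Invert the two-parameter Weibull CDF Pf = 1 - exp(-(s/sigma0)**m)."""
    return sigma0 * (-math.log(1.0 - pf)) ** (1.0 / m)

# Weibull parameters quoted in the abstract (biaxial flexure, 1 MPa/s)
groups = {"GC (control)": (13.8, 60.4), "GI (ion exchanged)": (7.4, 136.8)}

for name, (m, sigma0) in groups.items():
    s5 = stress_at_failure_probability(m, sigma0, 0.05)
    print(f"{name}: stress at 5% failure probability ~ {s5:.1f} MPa")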

Relevance: 30.00%

Abstract:

With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' abilities to detect data anomalies in a relational database. The results demonstrate that both unnormalised structures and higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First normal form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
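As a minimal illustration of the kind of anomaly-detection query the experiments are about: in a first-normal-form table that repeats customer attributes on every order row, an update anomaly shows up as one customer identifier carrying more than one name. The schema and column names below are hypothetical and not taken from the experiments.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (
    order_id      INTEGER PRIMARY KEY,
    customer_id   INTEGER,
    customer_name TEXT,          -- repeated on every row (1NF, not 3NF)
    amount        REAL
);
INSERT INTO orders VALUES (1, 10, 'Acme Ltd',  120.0);
INSERT INTO orders VALUES (2, 10, 'Acme Ltd.', 75.0);   -- inconsistent name
INSERT INTO orders VALUES (3, 11, 'Borden Co', 200.0);
""")

# Detect update anomalies: a customer_id associated with conflicting names.
rows = conn.execute("""
    SELECT customer_id, COUNT(DISTINCT customer_name) AS variants
    FROM orders
    GROUP BY customer_id
    HAVING COUNT(DISTINCT customer_name) > 1
""").fetchall()
print(rows)   # expected: [(10, 2)]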

Relevance: 30.00%

Abstract:

Qualitative data analysis (QDA) is often a time-consuming and laborious process usually involving the management of large quantities of textual data. Recently developed computer programs offer great advances in the efficiency of the processes of QDA. In this paper we report on an innovative use of a combination of extant computer software technologies to further enhance and simplify QDA. Used in appropriate circumstances, we believe that this innovation greatly enhances the speed with which theoretical and descriptive ideas can be abstracted from rich, complex, and chaotic qualitative data. © 2001 Human Sciences Press, Inc.

Relevance: 30.00%

Abstract:

Motivation: This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples, by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic, used in conjunction with a threshold on the size of a cluster, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so mixtures of factor analyzers are exploited to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues that are consistent either with the external classification of the tissues or with background biological knowledge of these sets.
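A rough sketch of the gene-screening step described above: for each gene, fit one- and two-component mixtures to its expression values over the tissues, compute the likelihood-ratio statistic -2 log lambda, and keep genes whose statistic exceeds a threshold while the smaller implied cluster retains a minimum size. For brevity the sketch fits Gaussian mixtures with scikit-learn rather than the t mixtures fitted by EMMIX-GENE, so it approximates the procedure and is not the software itself; the data, thresholds and cluster sizes are illustrative.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_tissues, n_genes = 60, 200
X = rng.normal(size=(n_tissues, n_genes))
X[:30, :20] += 2.5          # first 20 genes separate two tissue groups

def lr_statistic(y):
    """-2 log lambda for one vs. two mixture components (Gaussian stand-in)."""
    y = y.reshape(-1, 1)
    ll1 = GaussianMixture(1, random_state=0).fit(y).score(y) * len(y)
    g2 = GaussianMixture(2, n_init=3, random_state=0).fit(y)
    ll2 = g2.score(y) * len(y)
    smallest_cluster = np.bincount(g2.predict(y), minlength=2).min()
    return 2.0 * (ll2 - ll1), smallest_cluster

THRESH_LR, MIN_CLUSTER = 8.0, 8
kept = []
for j in range(n_genes):
    stat, smallest = lr_statistic(X[:, j])
    if stat > THRESH_LR and smallest >= MIN_CLUSTER:
        kept.append(j)
print(f"retained {len(kept)} of {n_genes} genes for clustering the tissues")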

Relevance: 30.00%

Abstract:

Direct comparisons between photosynthetic O₂ evolution rate and electron transport rate (ETR) were made in situ over 24 h using the benthic macroalga Ulva lactuca (Chlorophyta), growing and measured at a depth of 1.8 m, where the midday irradiance rose to 400-600 µmol photons m⁻² s⁻¹. O₂ exchange was measured with a 5-chamber data-logging apparatus and ETR with a submersible pulse amplitude modulated (PAM) fluorometer (Diving-PAM). Steady-state quantum yield ((Fm'-Ft)/Fm') decreased from 0.7 during the morning to 0.45 at midday, followed by some recovery in the late afternoon. At low to medium irradiances (0-300 µmol photons m⁻² s⁻¹), there was a significant correlation between O₂ evolution and ETR, but at higher irradiances, ETR continued to increase steadily, while O₂ evolution tended towards an asymptote. However, at high irradiance levels (600-1200 µmol photons m⁻² s⁻¹) ETR was significantly lowered. Two methods of measuring ETR, based on either diel ambient light levels and fluorescence yields or rapid light curves, gave similar results at low to moderate irradiance levels. Nutrient enrichment (increases in [NO₃⁻], [NH₄⁺] and [HPO₄²⁻] of 5- to 15-fold over ambient concentrations) resulted in an increase, within hours, in photosynthetic rates measured by both ETR and O₂ evolution techniques. At low irradiances, approximately 6.5 to 8.2 electrons passed through PS II during the evolution of one molecule of O₂, i.e., up to twice the theoretical minimum number of four. However, in nutrient-enriched treatments this ratio dropped to 5.1. The results indicate that PAM fluorescence can be used as a good indication of the photosynthetic rate only at low to medium irradiances.
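For readers unfamiliar with how the two rates are compared, the sketch below applies the standard PAM relation ETR = PPFD × ΔF/Fm' × 0.5 × A (with an assumed absorptance A) and divides by the gross O₂ evolution rate to obtain electrons per O₂, whose theoretical minimum is four. All numbers are illustrative, not measurements from the study.

def etr(ppfd, quantum_yield, absorptance=0.84, psii_fraction=0.5):
    """Relative electron transport rate (umol electrons m-2 s-1); absorptance is an assumption."""
    return ppfd * quantum_yield * psii_fraction * absorptance

# Illustrative values only (not data from the study)
ppfd = 250.0          # umol photons m-2 s-1
yield_ii = 0.55       # steady-state quantum yield (Fm' - Ft) / Fm'
o2_rate = 8.0         # gross O2 evolution, umol O2 m-2 s-1

e = etr(ppfd, yield_ii)
print(f"ETR ~ {e:.1f} umol electrons m-2 s-1")
print(f"electrons per O2 ~ {e / o2_rate:.1f} (theoretical minimum is 4)")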

Relevance: 30.00%

Abstract:

We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
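To make the ACE decomposition concrete, the sketch below simulates continuous MZ and DZ twin phenotypes under additive genetic (A), common environment (C) and unique environment (E) components and recovers a crude heritability estimate from the twin correlations (Falconer's h2 = 2(rMZ − rDZ)). It only illustrates the data-generating model being compared; it is neither the BUGS nor the Mx analysis used in the paper, and the paper's data were dichotomous and ordinal rather than continuous.

import numpy as np

rng = np.random.default_rng(1)

def simulate_pairs(n, a2, c2, mz=True):
    """Continuous twin phenotypes under an ACE model (variances a2 + c2 + e2 = 1)."""
    e2 = 1.0 - a2 - c2
    c = rng.normal(scale=np.sqrt(c2), size=n)                 # environment shared by the pair
    if mz:
        add1 = add2 = rng.normal(scale=np.sqrt(a2), size=n)   # identical genotypes
    else:
        shared = rng.normal(scale=np.sqrt(a2 / 2), size=n)    # DZ twins share half of A
        add1 = shared + rng.normal(scale=np.sqrt(a2 / 2), size=n)
        add2 = shared + rng.normal(scale=np.sqrt(a2 / 2), size=n)
    unique = rng.normal(scale=np.sqrt(e2), size=(2, n))       # unique environment per twin
    return add1 + c + unique[0], add2 + c + unique[1]

a2_true, c2_true, n_pairs = 0.5, 0.2, 2000
r_mz = np.corrcoef(*simulate_pairs(n_pairs, a2_true, c2_true, mz=True))[0, 1]
r_dz = np.corrcoef(*simulate_pairs(n_pairs, a2_true, c2_true, mz=False))[0, 1]
print(f"rMZ = {r_mz:.2f}, rDZ = {r_dz:.2f}, Falconer h2 = {2 * (r_mz - r_dz):.2f}")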

Relevance: 30.00%

Abstract:

Measurement of exchange of substances between blood and tissue has been a long-lasting challenge to physiologists, and considerable theoretical and experimental accomplishments were achieved before the development of the positron emission tomography (PET). Today, when modeling data from modern PET scanners, little use is made of earlier microvascular research in the compartmental models, which have become the standard model by which the vast majority of dynamic PET data are analysed. However, modern PET scanners provide data with a sufficient temporal resolution and good counting statistics to allow estimation of parameters in models with more physiological realism. We explore the standard compartmental model and find that incorporation of blood flow leads to paradoxes, such as kinetic rate constants being time-dependent, and tracers being cleared from a capillary faster than they can be supplied by blood flow. The inability of the standard model to incorporate blood flow consequently raises a need for models that include more physiology, and we develop microvascular models which remove the inconsistencies. The microvascular models can be regarded as a revision of the input function. Whereas the standard model uses the organ inlet concentration as the concentration throughout the vascular compartment, we consider models that make use of spatial averaging of the concentrations in the capillary volume, which is what the PET scanner actually registers. The microvascular models are developed for both single- and multi-capillary systems and include effects of non-exchanging vessels. They are suitable for analysing dynamic PET data from any capillary bed using either intravascular or diffusible tracers, in terms of physiological parameters which include regional blood flow. (C) 2003 Elsevier Ltd. All rights reserved.
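For orientation, the standard single-tissue compartmental model discussed above is dCt/dt = K1·Ca(t) − k2·Ct(t), with the PET signal modelled as a blood-volume-weighted sum of tissue and blood concentrations; it is exactly this use of the inlet concentration Ca as the vascular concentration that the paper questions. The sketch below integrates the model numerically with an arbitrary input function and illustrative parameter values.

import numpy as np
from scipy.integrate import odeint

def input_function(t):
    """Arbitrary arterial input curve Ca(t) (illustrative only)."""
    return 10.0 * t * np.exp(-t / 0.8)

def one_tissue_model(ct, t, K1, k2):
    # Standard compartment model: dCt/dt = K1*Ca(t) - k2*Ct(t)
    return K1 * input_function(t) - k2 * ct

K1, k2, vB = 0.3, 0.1, 0.05          # mL/min/mL, 1/min, blood volume fraction (illustrative)
t = np.linspace(0.0, 60.0, 601)      # minutes
ct = odeint(one_tissue_model, 0.0, t, args=(K1, k2)).ravel()
pet_signal = (1.0 - vB) * ct + vB * input_function(t)
print(f"tissue conc. at 60 min: {ct[-1]:.2f}; modelled PET signal: {pet_signal[-1]:.2f}")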

Relevance: 30.00%

Abstract:

In microarray studies, the application of clustering techniques is often used to derive meaningful insights into the data. In the past, hierarchical methods have been the primary clustering tool employed to perform this task. The hierarchical algorithms have been mainly applied heuristically to these cluster analysis problems. Further, a major limitation of these methods is their inability to determine the number of clusters. Thus there is a need for a model-based approach to these clustering problems. To this end, McLachlan et al. [7] developed a mixture model-based algorithm (EMMIX-GENE) for the clustering of tissue samples. To further investigate the EMMIX-GENE procedure as a model-based approach, we present a case study involving the application of EMMIX-GENE to the breast cancer data as studied recently in van 't Veer et al. [10]. Our analysis considers the problem of clustering the tissue samples on the basis of the genes, which is a non-standard problem because the number of genes greatly exceeds the number of tissue samples. We demonstrate how EMMIX-GENE can be useful in reducing the initial set of genes down to a more computationally manageable size. The results from this analysis also emphasise the difficulty associated with the task of separating two tissue groups on the basis of a particular subset of genes. These results also shed light on why supervised methods have such a high misallocation error rate for the breast cancer data.

Relevance: 30.00%

Abstract:

This work is a case study aimed at developing a Data Mart that enables the Escola Nacional de Administração Pública (ENAP) to understand the profile and the overall employment situation of the federal civil servants who have been trained at the School over the last 7 years. The application was developed by cross-referencing the database of ENAP's course-management system, which stores information about the trained students, the courses delivered, the results achieved, the profile of the instructors, and other information on the School's activities, with data generated by the Sistema Integrado de Administração de Recursos Humanos (SIAPE), from which the extraction targeted records on employment status, positions, careers, functions, agencies, and some personal data of the students, federal civil servants registered in SIAPE.
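A toy sketch of the cross-referencing described above: joining a course-attendance table (as might come from ENAP's course-management system) with an HR extract (as might come from SIAPE) on a common servant identifier. All table and column names here are hypothetical; the abstract does not describe the real schemas.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE enap_attendance (cpf TEXT, course TEXT, year INTEGER, result TEXT);
CREATE TABLE siape_extract   (cpf TEXT, position TEXT, career TEXT, agency TEXT);
INSERT INTO enap_attendance VALUES ('111', 'Project Management', 2008, 'approved');
INSERT INTO siape_extract   VALUES ('111', 'Analyst', 'Administrative', 'Ministry of Planning');
""")

# Profile of trained servants: course outcomes joined with functional (HR) data.
for row in conn.execute("""
    SELECT a.cpf, a.course, a.year, a.result, s.position, s.career, s.agency
    FROM enap_attendance AS a
    JOIN siape_extract   AS s ON s.cpf = a.cpf
"""):
    print(row)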

Relevance: 30.00%

Abstract:

The increasing availability of mobility data and the awareness of its importance and value have been motivating many researchers to develop models and tools for analyzing movement data. This paper presents a brief survey of significant research on the modeling, processing and visualization of data about moving objects. We identified some key research fields that will provide better features for the online analysis of movement data. As a result of the literature review, we suggest a generic multi-layer architecture for the development of an online analysis processing software tool, which will guide the definition of our team's future work.

Relevance: 30.00%

Abstract:

More and more current software systems rely on non-trivial coordination logic for combining autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and, therefore, difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the actually invoked services as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework. Therefore, the scope of application of COORDINSPECTOR is quite large: potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
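As a very loose illustration of what isolating coordination elements from CIL can mean, the toy sketch below scans ildasm-style CIL text for call/callvirt instructions whose targets live in a hypothetical service-proxy namespace. COORDINSPECTOR itself relies on slicing and program analysis rather than this kind of text matching, so the sketch is only meant to fix intuition; the namespace and method names are invented.

import re

# Hypothetical fragment of disassembled CIL (ildasm-style text).
cil = """
  IL_0001: call     instance string MyApp.ServiceProxies.PaymentService::Authorize(string)
  IL_0008: ldloc.0
  IL_0009: callvirt instance void MyApp.Domain.Order::SetStatus(int32)
  IL_0010: call     instance bool MyApp.ServiceProxies.ShippingService::Schedule(string)
"""

CALL_RE = re.compile(r"(?:call|callvirt)\s+.*?\s(\S+?)::(\w+)\(")

# Keep only calls whose declaring type sits in the (hypothetical) service-proxy namespace.
coordination_calls = [(cls, method)
                      for cls, method in CALL_RE.findall(cil)
                      if cls.startswith("MyApp.ServiceProxies.")]
for cls, method in coordination_calls:
    print(f"coordination call: {cls}::{method}")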

Relevance: 30.00%

Abstract:

In this work, we consider the numerical solution of a large eigenvalue problem resulting from a finite rank discretization of an integral operator. We are interested in computing a few eigenpairs, with an iterative method, so a matrix representation that allows for fast matrix-vector products is required. Hierarchical matrices are appropriate for this setting, and also provide cheap LU decompositions required in the spectral transformation technique. We illustrate the use of freely available software tools to address the problem, in particular SLEPc for the eigensolvers and HLib for the construction of H-matrices. The numerical tests are performed using an astrophysics application. Results show the benefits of the data-sparse representation compared to standard storage schemes, in terms of computational cost as well as memory requirements.
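The workflow above (a spectral transformation to obtain a few eigenpairs of a large operator with fast matrix-vector products) can be mimicked with SciPy's ARPACK wrapper, which applies shift-and-invert when a shift sigma is supplied. The stand-in below uses an ordinary sparse matrix rather than SLEPc and the H-matrix arithmetic of HLib, so it illustrates only the solver workflow, not the data-sparse representation.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Illustrative sparse operator (stand-in for the discretized integral operator).
n = 2000
A = sp.diags(np.linspace(0.1, 10.0, n)) + 0.01 * sp.random(n, n, density=1e-3, random_state=0)

# Shift-and-invert spectral transformation: the few eigenvalues closest to sigma.
vals, vecs = eigs(A.tocsc(), k=6, sigma=0.5, which="LM")
print(np.sort(vals.real))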

Relevance: 30.00%

Abstract:

Image segmentation is a ubiquitous task in medical image analysis, which is required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to efficiently interact with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, where the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and in terms of total analysis time. This contributes to a more efficient use of the existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire-based algorithm.
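A toy 2D version of the interaction idea: represent the boundary in a non-Cartesian (polar) parameterisation r(θ), pull the radius towards a user-supplied point with a localized angular weight, and smooth the result as a crude stand-in for the shape-regularization term. The actual method operates on real-time 3D segmentations with a proper geometric formulation; the sketch below only conveys the flavour of steering a boundary with user input.

import numpy as np

n_angles = 180
theta = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
r = np.full(n_angles, 40.0)                 # current boundary: a circle of radius 40

def apply_user_click(r, theta, click_xy, sigma=0.3):
    """Pull the radius towards a user click, with a localized angular weight."""
    x, y = click_xy
    r_u, th_u = np.hypot(x, y), np.arctan2(y, x)
    d = np.angle(np.exp(1j * (theta - th_u)))          # wrapped angular distance
    w = np.exp(-0.5 * (d / sigma) ** 2)
    return (1.0 - w) * r + w * r_u

def regularize(r, iters=10):
    """Crude shape regularization: circular moving-average smoothing."""
    for _ in range(iters):
        r = (np.roll(r, 1) + r + np.roll(r, -1)) / 3.0
    return r

r = regularize(apply_user_click(r, theta, click_xy=(55.0, 0.0)))
print(f"radius at the clicked angle: {r[0]:.1f} (was 40.0)")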

Relevance: 30.00%

Abstract:

In this paper, we present a method for estimating the local thickness distribution in finite element models, applied to injection molded and cast engineering parts. This method features considerably improved performance compared to two previously proposed approaches and has been validated against thickness measured by different human operators. We also demonstrate that using this method to assign a distribution of local thickness in FEM crash simulations results in a much more accurate prediction of the real part performance, thus increasing the benefits of computer simulations in engineering design by enabling zero-prototyping and reducing product development costs. The simulation results have been compared to experimental tests, evidencing the advantage of the proposed method. Thus, the proposed approach to considering local thickness distribution in FEM crash simulations has high potential in the product development process of complex and highly demanding injection molded and cast parts, and is currently being used by Ford Motor Company.
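A simplified sketch of local thickness estimation on a thin-walled part: sample points on the outer skin, find for each one the nearest point on the inner skin, and take that distance as the local thickness to be assigned to the corresponding shell elements. The paper's method is more elaborate (and validated against human measurements); the synthetic geometry and the nearest-neighbour shortcut below are purely illustrative.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Synthetic thin-walled plate: outer skin at z = 0, inner skin at z = t(x, y).
xy = rng.uniform(0.0, 100.0, size=(5000, 2))
thickness_field = 2.0 + 0.8 * np.sin(xy[:, 0] / 15.0)      # "true" local thickness, mm
outer = np.column_stack([xy, np.zeros(len(xy))])
inner = np.column_stack([xy, thickness_field])

# Nearest-neighbour distance from outer to inner skin as a thickness estimate per sample point.
tree = cKDTree(inner)
est, _ = tree.query(outer)
print(f"estimated thickness: mean={est.mean():.2f} mm, min={est.min():.2f}, max={est.max():.2f}")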

Relevance: 30.00%

Abstract:

Nowadays software has become a useful element in the lives of people and companies, and there is a growing need for quality applications through which companies can differentiate themselves in the market. Software-producing companies seek to increase the quality of their development processes in order to guarantee the quality of the final product. The size and complexity of software increase the probability of non-conformities appearing in these products, hence the interest in software testing throughout the entire process of conception, development and maintenance. Many software development projects are delivered late because, at the planned completion date, they do not perform satisfactorily, are not reliable, or are difficult to maintain. Good planning of software production activities usually means greater efficiency of the whole production process, since it can reduce the number of defects and the costs of correcting them, increasing confidence in the use of the software and the ease of its operation and maintenance. This underlines the importance of adopting good practices in software development: a systematic and organized approach should be used in order to produce quality software. This thesis describes the main software development models, the importance of requirements engineering, the testing processes and the main validations of software quality, and how some companies apply these principles in their day-to-day work in order to produce a more reliable final product. It also presents some examples that complement the context of the thesis.