961 results for conformance checking
Abstract:
Performance-contingent compensation by means of stock options may induce risk-taking in agents that is excessive from the point of view of the company or the shareholders. We test whether increasing shareholder control may be an effective checking mechanism to rein in such excessive risk-taking. We thus tell one group of experimental CEOs that they may have to justify their decision-making processes in front of their shareholders. This indeed reduces risk-taking and increases the performance of the companies they manage. Implications are discussed.
Abstract:
Aeromonads are inhabitants of aquatic ecosystems and have been described as being involved in intestinal disturbances and other infections. A total of 200 drinking water samples from domestic and public reservoirs and drinking fountains located in São Paulo (Brazil) were analyzed for the presence of Aeromonas. Samples were concentrated by membrane filtration and enriched in APW. ADA medium was used for Aeromonas isolation, and colonies were confirmed by biochemical characterization. The isolated strains were tested for hemolysin and toxin production. Aeromonas was detected in 12 samples (6.0%). Ninety-six Aeromonas strains were isolated and identified as A. caviae (41.7%), A. hydrophila (15.7%), A. allosaccharophila (10.4%), A. schubertii (1.0%) and Aeromonas spp. (31.2%). The results revealed that 70% of A. caviae, 66.7% of A. hydrophila, 80% of A. allosaccharophila and 46.6% of Aeromonas spp. strains were hemolytic. The toxin production assay showed that 17.5% of A. caviae, 73.3% of A. hydrophila, 60% of A. allosaccharophila, 100% of A. schubertii, and 33.3% of Aeromonas spp. strains were able to produce toxins. The results demonstrate the pathogenic potential of Aeromonas, indicating that the presence of this emerging pathogen in water systems is a public health concern.
Abstract:
In contrast to the many studies on the venoms of scorpions, spiders, snakes and cone snails, up to now there has been no report of a proteomic analysis of sea anemone venoms. In this work we report for the first time the peptide mass fingerprint and some novel peptides in the neurotoxic fraction (Fr III) of the venom of the sea anemone Bunodosoma cangicum. Fr III is neurotoxic to crabs and was purified by RP-HPLC on a C-18 column, yielding 41 fractions. By checking their molecular masses by ESI-Q-Tof and MALDI-Tof MS, we found 81 components ranging from about 250 amu to approximately 6000 amu. Some of the peptidic molecules were partially sequenced by the automated Edman technique. Three of them are peptides of about 4500 amu belonging to the class of the BcIV, BDS-I, BDS-II, APETx1, APETx2 and Am-II toxins. Another three peptides represent a novel group of toxins (~3200 amu). A further three molecules (~4900 amu) belong to the group of type 1 sodium channel neurotoxins. When assayed on crab leg nerve compound action potentials, one of the BcIV- and APETx-like peptides exhibited an action similar to that of the type 1 sodium channel toxins in this preparation, suggesting the same target in this assay. On the other hand, one of the novel peptides, of 3176 amu, displayed an action resembling potassium channel blockade in this experiment. In summary, proteomic analysis and mass fingerprinting of fractions from sea anemone venoms through MS are valuable tools, allowing us to rapidly predict the occurrence of different groups of toxins and facilitating the search for and characterization of novel molecules without the need for full characterization of individual components by broader assays and bioassay-guided purifications. It also shows that sea anemones employ dozens of components for prey capture and defense. (C) 2008 Elsevier Inc. All rights reserved.
Abstract:
The Br (0.0022 +/- 0.0006 g L(-1)), Ca (0.113 +/- 0.012 g L(-1)), Cl (3.07 +/- 0.36 g L(-1)), K (2.63 +/- 0.14 g L(-1)), Mg (0.045 +/- 0.002 g L(-1)) and Na (2.09 +/- 0.10 g L(-1)) concentrations were determined in whole blood of SJL/J mice using the Neutron Activation Analysis (NAA) technique. Eleven whole blood samples were analyzed in the IEA-R1 nuclear reactor at IPEN (São Paulo, Brazil). These data contribute to applications in veterinary medicine related to biochemical analyses of whole blood. Moreover, correlation with estimates for human blood allows checking the similarities relevant to studying muscular dystrophy in this animal model.
Abstract:
In the present study, we compared 2 methods for collecting ixodid ticks on the verges of animal trails in a primary Amazon forest area in northern Brazil. (i) Dragging: this method was based on passing a 1-m² white flannel over the vegetation and checking the flannel for caught ticks every 5-10 m. (ii) Visual search: this method consisted of looking for questing ticks on the tips of leaves of the vegetation bordering animal trails in the forest. A total of 103 adult ticks belonging to 4 Amblyomma species were collected by the visual search method on 5 collecting dates, while only 44 adult ticks belonging to 3 Amblyomma species were collected by dragging on 5 other collecting dates. These values were statistically different (Mann-Whitney test, P = 0.0472). On the other hand, dragging was more efficient for subadult ticks, since no larva or nymph was collected by visual search, whereas 18 nymphs and 7 larvae were collected by dragging. The visual search method proved to be suitable for collecting adult ticks in the Amazon forest; however, field studies should include a second method, such as dragging, in order to maximize the collection of subadult ticks. Indeed, these 2 methods can be performed by a single investigator at the same time, while he/she walks along an animal trail in the forest. (C) 2010 Elsevier GmbH. All rights reserved.
Abstract:
This paper proposes a method to locate and track people by combining evidence from multiple cameras using the homography constraint. The proposed method uses foreground pixels from simple background subtraction to compute evidence of the location of people on a reference ground plane. The algorithm computes the amount of support that basically corresponds to the "foreground mass" above each pixel; therefore, pixels that correspond to ground points have more support. The support is normalized to compensate for perspective effects and accumulated on the reference plane for all camera views. The detection of people on the reference plane then becomes a search for regions of local maxima in the accumulator. Many false positives are filtered by checking the visibility consistency of the detected candidates against all camera views. The remaining candidates are tracked using Kalman filters and appearance models. Experimental results using challenging data from PETS'06 show good performance of the method in the presence of severe occlusion. Ground truth data also confirms the robustness of the method. (C) 2010 Elsevier B.V. All rights reserved.
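The core accumulation step, projecting foreground pixels onto the ground plane with each camera's homography and summing votes, can be sketched compactly. The Python fragment below is a minimal illustration under assumed inputs (binary foreground masks and 3x3 image-to-ground homographies per camera); the perspective normalization and visibility-consistency filtering described in the abstract are omitted.

```python
# A minimal sketch, in NumPy, of ground-plane evidence accumulation.
# Names and input conventions are assumptions, not the authors' code.
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of (x, y) pixel coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

def accumulate_support(foreground_masks, homographies, plane_shape):
    """Project foreground pixels from every view onto a discrete reference
    ground plane and sum the evidence into an accumulator."""
    acc = np.zeros(plane_shape)
    for mask, H in zip(foreground_masks, homographies):
        ys, xs = np.nonzero(mask)                     # foreground pixel coords
        ground = warp_points(H, np.stack([xs, ys], axis=1).astype(float))
        gx, gy = np.round(ground).astype(int).T
        ok = (0 <= gx) & (gx < plane_shape[1]) & (0 <= gy) & (gy < plane_shape[0])
        np.add.at(acc, (gy[ok], gx[ok]), 1.0)         # one vote per pixel
    return acc
```

Detections would then be the local maxima of the returned accumulator, subsequently filtered and tracked as the abstract describes.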
Abstract:
Planning to reach a goal is an essential capability for rational agents. In general, a goal specifies a condition to be achieved at the end of plan execution. In this article, we introduce nondeterministic planning for extended reachability goals (i.e., goals that also specify a condition to be preserved during plan execution). We show that, when this kind of goal is considered, the temporal logic CTL turns out to be inadequate to formalize plan synthesis and plan validation algorithms. This is mainly due to the fact that CTL's semantics cannot discern among the various actions that produce state transitions. To overcome this limitation, we propose a new temporal logic called alpha-CTL. Then, based on this new logic, we implement a planner capable of synthesizing reliable plans for extended reachability goals, as a side effect of model checking.
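To make the goal class concrete, a naive rendering of an extended reachability goal "reach φ while preserving ψ" in standard CTL notation (the paper's alpha-CTL syntax may differ) would be:

```latex
% Naive CTL rendering of "reach \varphi while preserving \psi":
% along every execution path, \psi holds until \varphi is achieved.
\[
  A(\psi \; U \; \varphi)
\]
```

As the abstract notes, CTL evaluates such formulas over all transitions of the model, regardless of which actions a plan actually selects, which is precisely the limitation that motivates alpha-CTL.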
Abstract:
In this article, we compare three residuals based on the deviance component in generalised log-gamma regression models with censored observations. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed, and the empirical distribution of each residual is displayed and compared with the standard normal distribution. In all cases studied, the empirical distributions of the proposed residuals are in general symmetric around zero, but only the martingale-type residual presented negligible kurtosis in the majority of cases. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the martingale-type residual in generalised log-gamma regression models with censored data. A lifetime data set is analysed under log-gamma regression models, and model checking based on the martingale-type residual is performed.
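For reference, one standard definition of a martingale-type residual for censored data (the authors' exact construction may differ in detail) is shown below, where δ_i is the censoring indicator and Ŝ the fitted survival function.

```latex
% Martingale residual for observation i: \delta_i = 1 for an observed
% failure and 0 for a censored time; \hat S is the fitted survival function.
\[
  \hat{r}_{M_i} \;=\; \delta_i + \log \hat{S}(t_i; \hat{\boldsymbol{\theta}}),
  \qquad i = 1, \dots, n.
\]
```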
Abstract:
Birnbaum-Saunders (BS) models have largely been applied in material fatigue studies and reliability analyses to relate the total time until failure to some type of cumulative damage. In many problems in the medical field, such as chronic cardiac diseases and different types of cancer, cumulative damage caused by several risk factors may cause some degradation that leads to a fatigue process. In these cases, BS models can be suitable for describing the propagation lifetime. However, since the cumulative damage is assumed to be normally distributed in the BS distribution, parameter estimates from this model can be sensitive to outlying observations. In order to attenuate this influence, we present in this paper BS models in which a Student-t distribution is assumed to explain the cumulative damage. In particular, we show that the maximum likelihood estimates of the Student-t log-BS models attribute smaller weights to outlying observations, which produces robust parameter estimates. Some inferential results are also presented. In addition, a diagnostic analysis is derived based on local influence and on deviance-component and martingale-type residuals. Finally, a motivating example from the medical field is analyzed using log-BS regression models. Since the parameter estimates appear to be very sensitive to outlying and influential observations, the Student-t log-BS regression model should attenuate such influences. The model checking methodologies developed in this paper are used to compare the fitted models.
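As background, the usual log-BS regression setup takes the following form (a generic rendering from the BS literature; the details of the authors' Student-t extension may differ).

```latex
% If T_i ~ BS(\alpha, \eta_i), then y_i = \log T_i obeys a sinh-normal law:
\[
  y_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i,
  \qquad
  \xi_i = \frac{2}{\alpha}\,
    \sinh\!\left(\frac{y_i - \mathbf{x}_i^{\top}\boldsymbol{\beta}}{2}\right).
\]
% In the classical model \xi_i is standard normal; the robust variant
% instead assumes \xi_i follows a Student-t distribution, whose heavier
% tails downweight outlying observations in the likelihood.
```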
Abstract:
The main purpose of this thesis project is the prediction of symptom severity and cause from Parkinson's disease patient test-battery data, based on data mining. The data were collected from a test battery administered on a handheld computer. We use the chi-square method to check which variables are important and which are not. We then apply different data mining techniques to our normalized data and check which technique or method gives good results. The implementation of this thesis is in WEKA. We normalize our data and then apply different methods to it: Naïve Bayes, CART and KNN. We draw Bland-Altman plots and compute Spearman's correlation to check the final results and predictions. The Bland-Altman plot tells us what percentage of the data falls within our confidence limits, and Spearman's correlation tells us how strong the relationship is. On the basis of the results and analysis, all three methods give nearly the same results, but CART (J48 decision tree) gives good results for under- and over-predicted values, which lie between -2 and +2. The correlation between actual and predicted values is 0.794 in CART. Cause gives a better percentage classification result than disability because it uses two classes.
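A sketch of the described workflow, with placeholder data and using scikit-learn in place of WEKA, might look as follows: min-max normalization, chi-square feature screening, and a comparison of Naive Bayes, decision tree and KNN classifiers.

```python
# A minimal sketch with synthetic placeholder data; feature count, k, and
# labels are assumptions, not the thesis's actual dataset or settings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 10))      # placeholder for test-battery features
y = rng.integers(0, 2, 200)    # placeholder for the two-class "cause" label

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Decision tree", DecisionTreeClassifier()),
                  ("KNN", KNeighborsClassifier())]:
    pipe = make_pipeline(MinMaxScaler(),          # normalize to [0, 1]
                         SelectKBest(chi2, k=5),  # keep chi-square-best features
                         clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean CV accuracy {scores.mean():.3f}")
```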
Abstract:
Solutions to combinatorial optimization problems, such as facility location problems, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively, and a criterion is needed to decide when the procedure has (almost) attained it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds, as a tool for deciding when to stop and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators, using simulated annealing on p-median test problems taken from Beasley's OR-Library. We find the Weibull estimator and the 2nd-order jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking that quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
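The idea behind the Weibull-based estimate can be sketched as follows (a minimal illustration with synthetic data, not the paper's exact procedure): the best objective values of independent heuristic runs are treated as draws from a three-parameter Weibull distribution whose location parameter is the unknown minimum.

```python
# A minimal sketch of a Weibull point estimate of the optimum; the data and
# the true optimum of 100 below are synthetic assumptions.
import numpy as np
from scipy import stats

def weibull_location_estimate(run_values):
    """Fit a three-parameter Weibull to the best objective values of
    independent heuristic runs; the fitted location estimates the optimum."""
    c, loc, scale = stats.weibull_min.fit(run_values)
    return loc

# Synthetic "simulated annealing" results scattered above a true optimum of 100:
rng = np.random.default_rng(1)
runs = 100 + stats.weibull_min.rvs(c=2.0, scale=5.0, size=10, random_state=rng)
print(f"Point estimate of the minimum: {weibull_location_estimate(runs):.2f}")
```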
Abstract:
E-Science experiments typically involve many distributed services maintained by different organisations. After an experiment has been executed, it is useful for a scientist to verify that the execution was performed correctly or is compatible with some existing experimental criteria or standards, not necessarily anticipated prior to execution. Scientists may also want to review and verify experiments performed by their colleagues. There are no existing frameworks for validating such experiments in today's e-Science systems. Users therefore have to rely on error checking performed by the services, or adopt other ad hoc methods. This paper introduces a platform-independent framework for validating workflow executions. The validation relies on reasoning over the documented provenance of experiment results and semantic descriptions of services advertised in a registry. This validation process ensures experiments are performed correctly, and thus results generated are meaningful. The framework is tested in a bioinformatics application that performs protein compressibility analysis.
Abstract:
Postal activity is one of the oldest forms of service provision in human history and originates in people's need to communicate. The postal sector is organized according to each country's legislation, usually as a public service, as is the case in Brazil. The world's market reality has changed the relationship between organizations and their customers, mainly as a result of globalization. The Brazilian postal sector is experiencing these changes and has begun to devote more attention to its customers through the activities it carries out. One way to verify whether an activity produces the expected result is measurement. In analyzing the case of Correios da Bahia, an opportunity was found to implement a performance evaluation and management model capable of reorienting the organization's activities, considering variables beyond the operational ones, such as finances, the internal environment, customers, and the organization's social responsibility. The case was studied on the basis of three performance evaluation and management models, one of which was chosen. The choice of model is tied to the objectives of the study, which focus on a critical analysis of performance indicators and the consequent proposal of a model capable of correcting deviations and adjusting the management process. The models analyzed were Quantum (HRONEC), Intellectual Capital (STEWART) and the Balanced Scorecard (KAPLAN & NORTON), the last of which was the one proposed for monitoring management. Given the characteristics and particularities of Correios da Bahia, a change to the original architecture of the Balanced Scorecard was suggested, adding the perspectives of social responsibility and of legal compliance to the field of measurement. The model comprises five perspectives with their respective strategic objectives, linked to critical success factors, and was adapted from the book "The Strategy-Focused Organization" by KAPLAN & NORTON (2001). With the adoption of the proposed model, the distortions identified in the research can be corrected and the management of Correios da Bahia can be steered toward the future in a balanced way.
Abstract:
Objectives – To describe the clinical characteristics of children between 1 and 12 months of age hospitalized with a diagnosis of acute viral bronchiolitis (AVB) during the first days of admission, and to determine whether oxygen desaturation time (DT) has prognostic value in these patients. Methods – Cohort study conducted from May to October 2001 with 111 patients between 1 and 12 months of age admitted to Hospital da Criança Santo Antônio, in Porto Alegre (RS, Brazil), with a diagnosis of AVB on admission, transcutaneous oxygen saturation of hemoglobin (SatHb) below 95%, and on oxygen therapy through an extranasal catheter for less than 24 hours. Severity was assessed through length of hospital stay, duration of oxygen therapy, and time to reach 95% saturation in room air (the outcomes). Clinical assessments were performed twice a day (morning and afternoon) during the period in which the patient required supplemental oxygen (until reaching transcutaneous oxygen saturation of 95% in room air), with a limit of ten assessments. Supplemental oxygen was then withdrawn, and the time needed for saturation to fall to 90% (DT90) and to 85% (DT85) was measured, capped at a maximum of five minutes. A severity score was constructed from the recorded clinical signs. The chi-square test or Fisher's exact test was used to compare groups of categorical variables, and the t test or Mann-Whitney test for numerical variables. Spearman's correlation was used to assess associations between continuous variables with asymmetric distributions (severity score, length of stay, total duration of oxygen therapy, and time to reach saturation of 95% or above in room air). A critical alpha of 5% was adopted for all comparisons, except for the correlations, in which the Bonferroni correction for multiple comparisons was applied (30 correlations: p = 0.002; 10 correlations: p = 0.005). Weight-for-age and height-for-age data were entered and analyzed in the EpiInfo module that uses the NCHS standard (EpiNut). Results – There was a slight predominance of males (54%), a predominance of age under four months (61.3%), higher prevalence in June and July, and a high frequency of history of prematurity (23%) and of low birth weight (14%). Clinical manifestations prior to hospitalization (shortness of breath, wheezing, fever, and apneic episodes) occurred mostly in the three preceding days. Of the study population, 45% had a history of previous wheezing, most with one or two reported episodes (31.5%). These patients were analyzed separately and had results similar to the AVB group. Moderate or severe malnutrition, excluding patients with a history of prematurity, was present in 26 patients (23%). All patients used inhaled bronchodilators; 20% of the AVB group received systemic corticosteroids, and 47% of the whole population received antibiotics. The median duration of oxygen use among AVB patients was 4.4 days (IQR 70.2-165.2) and the duration of oxygen therapy until reaching 95% saturation in room air was 3.4 days (IQR 55-128). The median length of hospital stay was 7 days (IQR 5-10.5) among AVB patients; in this respect there was a difference (p = 0.041) compared with the previous-wheezing group, which had a longer stay (9 days, IQR 5-12). Little clinical variability was observed over the study period, as measured by the clinical score.
No statistically significant correlations were found between the clinical scores or the DTs and the outcomes. Conclusions – The DTs, as auxiliary elements in the evaluation of patients on oxygen therapy, were not clinically useful in this study. It is possible, however, that these measurements may prove important when evaluating patients with greater clinical differences among them.
Abstract:
This paper presents evidence on the key role of infrastructure in Andean Community trade patterns. Three distinct but related gravity models of bilateral trade are used. The first model aims to identify the importance of the Preferential Trade Agreement and of adjacency for intra-regional trade, while also checking the traditional roles of economic size and distance. The second and third models also assess the evolution of the Trade Agreement and the importance of sharing a common border, but their main goal is to analyze the relevance of including infrastructure in the augmented gravity equation, testing the theoretical assumption that infrastructure endowments, by reducing trade and transport costs, reduce the “distance” between bilateral partners. Indeed, if one accepts distance as a proxy for transportation costs, infrastructure development and improvement drastically modify it. Trade liberalization eliminates most of the distortions that a protectionist tariff system imposes on international business; hence transportation costs nowadays represent a considerably larger barrier to trade than in past decades. As new trade pacts are negotiated in the Americas, borders and old agreements will lose significance; trade among countries will be nearly without restrictions, and bilateral flows will be defined in terms of costs and competitiveness. Competitiveness, however, will only be achieved through improvement in infrastructure services at all points in the production-distribution chain.
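For reference, a generic log-linearized augmented gravity specification of the kind described (an illustrative rendering; the paper's exact equation and variable set may differ) is:

```latex
% Augmented gravity equation for bilateral trade T_{ij}:
% Y: economic size (GDP); D_{ij}: distance; Adj: common-border dummy;
% PTA: preferential-trade-agreement dummy; Infra: infrastructure endowment.
\[
  \ln T_{ij} \;=\; \beta_0
  + \beta_1 \ln Y_i + \beta_2 \ln Y_j
  + \beta_3 \ln D_{ij}
  + \beta_4\, \mathrm{Adj}_{ij}
  + \beta_5\, \mathrm{PTA}_{ij}
  + \beta_6 \ln \mathrm{Infra}_{ij}
  + \varepsilon_{ij}
\]
```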