73 results for "Testes de hipóteses estatísticas" (statistical hypothesis tests)


Relevância:

20.00%

Publicador:

Resumo:

Mimosa caesalpiniaefolia Benth. is a forest species of the Mimosaceae family recommended for the recovery of degraded areas. The evaluation of vigor by biochemical tests has become an important tool in seed quality control programs, with the electrical conductivity and potassium leaching tests being the most efficient for verifying physiological potential. The objective, therefore, was to adjust the methodology of the electrical conductivity test for seeds of M. caesalpiniaefolia and then compare the efficiency of this test with that of the potassium leaching test in evaluating the vigor of different seed lots of M. caesalpiniaefolia. To adjust the electrical conductivity test, different combinations were used: temperatures of 25 °C and 30 °C, 25 and 50 seeds, imbibition periods of 4, 8, 12, 16 and 24 hours, and deionized water volumes of 50 mL and 75 mL. The potassium leaching test was conducted using the methodology established for the electrical conductivity test, in order to compare the efficiency of both tests in classifying seeds at different vigor levels; a 4-hour period was also evaluated because the potassium leaching test can be more efficient over shorter times. The best combination obtained in the electrical conductivity experiment was 25 seeds soaked in 50 mL of deionized or distilled water for 8 hours at 30 °C. Data were subjected to analysis of variance, means were compared by the F test and Tukey's test at 5% probability, and polynomial regression analysis was performed when necessary. The electrical conductivity test performed over an eight-hour period proved more efficient than the potassium leaching test in separating M. caesalpiniaefolia seed lots into different vigor levels.
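
As a rough illustration of the statistical workflow described in this abstract (analysis of variance, F test and Tukey's test at 5% probability), the sketch below runs a one-way ANOVA followed by Tukey's HSD on hypothetical electrical conductivity readings for three seed lots; the lot names, replicate counts and values are invented for illustration and are not taken from the thesis.

```python
# One-way ANOVA + Tukey's HSD on hypothetical conductivity readings (uS cm-1 g-1).
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(42)
# Hypothetical conductivity readings for three seed lots (4 replicates of 25 seeds each).
lots = {
    "lot_A": rng.normal(55.0, 3.0, 4),
    "lot_B": rng.normal(62.0, 3.0, 4),
    "lot_C": rng.normal(70.0, 3.0, 4),
}

# F test (one-way ANOVA): does at least one lot differ in mean conductivity?
f_stat, p_value = stats.f_oneway(*lots.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey's HSD at alpha = 0.05: which pairs of lots differ?
values = np.concatenate(list(lots.values()))
groups = np.repeat(list(lots.keys()), [len(v) for v in lots.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```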

Relevância:

20.00%

Publicador:

Resumo:

Judicial intervention in limited liability companies raises several issues of legislative and hermeneutic origin, owing considerably to the little importance given to the freedom of economic initiative by those involved in forming and applying the law. In addition, Brazilian law, due to its incompleteness, inconsistency or lack of valid grounds, puts the judge in a delicate procedural situation. Being obliged to judge, the judiciary faces severely uncomfortable interpretive situations, from which derive solutions of dubious constitutionality that significantly affect the dynamics of business activity. In this context, and considering the limited liability company as an expression of free enterprise, corresponding to a lawful association of persons seeking to undertake economic activity in the exercise of their freedom of contract and professional action, this work seeks to offer safe parameters of constitutionality for judicial intervention in limited liability companies in the hypotheses of (i) transfer of corporate shares, (ii) attachment of corporate shares, (iii) dismissal of directors, (iv) appointment of judicial administrators, (v) exclusion of shareholders and (vi) trespass. The hypothetical-deductive approach was adopted, building hypotheses to overcome the gaps and unconstitutional aspects of the law and subjecting them to tests, reviews, and comparisons with hypothetical facts and case law in order to determine the constitutional validity of the proposed solutions. The procedure sought to reconcile the historical, comparative, dialectical and scientific methods. The roots of the institutes over time were researched, as well as current solutions provided by national and comparative law. From the problematizations raised, addressed through the constitutional interpretation of statute and case law, responses were reached that expose the unconstitutionality of certain conceptions.

Relevância:

20.00%

Publicador:

Resumo:

This research aims to identify the barriers to the adoption of information technologies by micro and small enterprises. The theoretical research was guided by studies on the adoption of information technologies and the determining factors for such adoption. The sample was limited to small suppliers in Rio Grande do Norte that had not adopted Petrobras's electronic procurement system. The research is exploratory, of the survey type, with a quantitative approach. Field research was carried out in November and December 2006 with 55 companies, through a structured questionnaire answered by their managers. For data analysis, statistical techniques such as descriptive and exploratory data analysis, hypothesis tests and multivariate data analysis were used. The results show the existence of a basic IT infrastructure, with a low level of use of these technologies for more advanced purposes within the companies, especially for management activities and market-access strategies, notably their use as tools for electronic commerce. The results also showed that technical and financial aspects are perceived as greater obstacles than sociocultural and human factors, particularly the variables related to IT and external consulting costs, the perceived dependence on IT suppliers, and the lack of prioritization of efforts toward IT.

Relevância:

20.00%

Publicador:

Resumo:

This master's thesis presents a reliability study conducted on onshore oil fields operated by Petrobras in the Potiguar Basin (RN/CE), Brazil. The main objective of the study was to build a regression model to predict the risk of failures that prevent production wells from functioning properly, using explanatory variables related to the wells, such as the artificial lift method, the amount of water produced in the well (BSW), the gas-oil ratio (RGO), the depth of the production pump, and the operational unit of the oil field, among others. The study was based on a retrospective sample of 603 oil columns drawn from all those in operation between 2000 and 2006. Statistical hypothesis tests under a Weibull regression model fitted to the failure data allowed the selection of some significant predictors, among those considered, to explain the time to first failure in the wells.
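
The abstract describes statistical hypothesis tests under a Weibull regression model fitted to well failure data. The sketch below shows one way such a model could be fitted in Python with the lifelines library (a Weibull accelerated failure time parametrization); all column names, covariates, data values and the censoring scheme are hypothetical, since the thesis's dataset is not reproduced here.

```python
# Weibull regression (AFT form) on hypothetical well failure data with right-censoring.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "bsw": rng.uniform(0, 0.9, n),             # water fraction produced (hypothetical)
    "gor": rng.uniform(20, 300, n),            # gas-oil ratio (hypothetical)
    "pump_depth_m": rng.uniform(200, 1500, n), # pump depth (hypothetical)
})
# Hypothetical failure times loosely dependent on the covariates, censored at 900 days.
scale = np.exp(6.0 - 1.5 * df["bsw"] - 0.001 * df["pump_depth_m"])
df["time_days"] = scale * rng.weibull(1.3, n)
df["failed"] = (df["time_days"] < 900).astype(int)    # 0 = still working at end of study
df.loc[df["failed"] == 0, "time_days"] = 900.0

aft = WeibullAFTFitter()
aft.fit(df, duration_col="time_days", event_col="failed")
aft.print_summary()   # Wald tests show which covariates significantly affect failure time
```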

Relevância:

20.00%

Publicador:

Resumo:

The advent of the Internet stimulated the appearance of several services, among them the communication services present in users' everyday lives. Services such as chat and e-mail reach an increasing number of users, turning the Net into a powerful communication medium. This work explores the use of conventional communication services on top of the Net infrastructure. We introduce the concept of social communication protocols applied to a shared virtual environment, arguing that communication tools have to be adapted to the potential of the Internet. To that end, we draw on theories from the field of Communication and their applicability in a virtual environment context. We define a multi-agent architecture to support the provision of these services, as well as a software and hardware platform to support experiments using Mixed Reality. Finally, we present the results obtained, the experiments and the products.

Relevância:

20.00%

Publicador:

Resumo:

Problems of combinatorial optimization have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they cannot be solved in polynomial time. Initially, these solutions focused on heuristics; currently, metaheuristics are more commonly used for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called "Operon", for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses appropriate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search in the solution space. The approach is applied to the Traveling Salesman Problem (TSP) through a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is carried out through Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is carried out through Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached. The third is carried out by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provided evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through PES, the mean PES obtained with ProtoG was below 1% in almost half of the instances, reaching its largest value, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find mean PES values above 10% reported in the literature for instances of this size.
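
A small sketch of the Percent Error of the Solution (PES) defined above, followed by a non-parametric comparison of algorithms. Kruskal-Wallis is used here as a common form of non-parametric analysis of variance; the thesis may have used a different test, and the tour lengths, run counts and algorithm labels are hypothetical.

```python
# PES computation and a Kruskal-Wallis comparison over hypothetical TSP runs.
import numpy as np
from scipy import stats

def pes(found_length: float, best_known: float) -> float:
    """Percentage by which the found solution exceeds the best known solution."""
    return 100.0 * (found_length - best_known) / best_known

best_known = 21282.0                     # best known tour length (illustrative value)
runs = {                                 # hypothetical tour lengths over repeated runs
    "ProtoG":    [21350, 21410, 21390, 21300, 21480],
    "Memetic_1": [21600, 21550, 21700, 21640, 21580],
    "SimAnneal": [21900, 22050, 21870, 22100, 21950],
}
pes_by_alg = {name: [pes(x, best_known) for x in lengths] for name, lengths in runs.items()}
for name, values in pes_by_alg.items():
    print(f"{name}: mean PES = {np.mean(values):.2f}%")

# Non-parametric comparison of the PES distributions of the three algorithms.
h_stat, p_value = stats.kruskal(*pes_by_alg.values())
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_value:.4f}")
```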

Relevância:

20.00%

Publicador:

Resumo:

Industries are becoming increasingly rigorous where safety is concerned, whether to avoid financial losses due to accidents and low productivity or to protect the environment. It was in the wake of major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical and so on) that investment began in systems able to detect faults and diagnose them (FDD). FDD systems can help prevent faults by assisting personnel in the maintenance and replacement of defective equipment. Nowadays, the issues of fault detection, isolation, diagnosis and fault-tolerant control are gaining strength in academic and industrial environments. Based on this, in this work we discuss the importance of techniques that can assist in the development of Fault Detection and Diagnosis (FDD) systems and propose a hybrid method for FDD in dynamic systems. We present a brief history to contextualize the techniques used in working environments. Fault detection in the proposed system is based on state observers in conjunction with other statistical techniques. The main idea is to use the observer itself, in addition to serving as analytical redundancy, to allow the creation of a residual. This residual is used in FDD. A signature database assists in the identification of system faults: based on signatures derived from trend analysis of the residual signal and its difference, the faults are classified purely by means of a decision tree. This FDD system is tested and validated in two plants: a simulated coupled-tank plant and a didactic plant with industrial instrumentation. All the results collected from those tests are discussed.
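
To make the residual-generation idea concrete, the sketch below simulates a hypothetical discrete-time plant and a Luenberger state observer: the observer provides the analytical redundancy, and the difference between the measured and estimated outputs is the residual monitored for faults. The plant matrices, observer gain, injected fault and threshold are illustrative assumptions, not the models used in the thesis.

```python
# Residual generation with a Luenberger observer on a hypothetical two-state plant.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical linearized plant (e.g. coupled-tank levels), discrete time.
A = np.array([[0.95, 0.03],
              [0.04, 0.96]])
B = np.array([[0.10],
              [0.00]])
C = np.array([[1.0, 0.0]])          # only the first level is measured
L = np.array([[0.40],
              [0.20]])              # observer gain (assumed; normally designed by pole placement)

x = np.zeros((2, 1))                # true plant state
x_hat = np.zeros((2, 1))            # observer (estimated) state
residuals = []

for k in range(200):
    u = np.array([[1.0]])                            # constant input
    fault = 0.5 if k >= 120 else 0.0                 # additive sensor fault injected at k = 120
    y = C @ x + fault + rng.normal(0.0, 0.01)        # measured output (with small noise)
    y_hat = C @ x_hat                                # output predicted by the observer
    residuals.append((y - y_hat).item())             # residual used for fault detection
    x = A @ x + B @ u                                # plant update
    x_hat = A @ x_hat + B @ u + L @ (y - y_hat)      # observer update (analytical redundancy)

# A simple threshold flags the fault; the thesis goes further, building fault signatures
# from the residual's trend and its difference and classifying them with a decision tree.
flagged = [k for k, r in enumerate(residuals) if abs(r) > 0.2]
print("first samples where a fault is flagged:", flagged[:5])
```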

Relevância:

20.00%

Publicador:

Resumo:

This work consists in the use of signal processing techniques and artificial neural networks to identify leaks in pipelines with multiphase flow. Traditional leak detection methods have great difficulty building a profile adjusted to the real conditions of oil transport. These difficult conditions range from uneven terrain, which causes liquid columns or vacuum sections along the pipeline, to the presence of multiple phases such as water, gas and oil, plus other components such as sand, which tend to produce discontinuous flow and diverse variations. To attenuate these difficulties, the wavelet transform was used to map the pressure signal onto different resolution planes, allowing the extraction of descriptors that identify leak patterns; these descriptors are then used to train a neural network to classify the patterns and report whenever they characterize a leak. During the tests, transient and steady-state signals were used, and leaks were simulated with punctures varying in size from ½" to 1" in diameter in pipelines between Upanema and Estreito B, in Petrobras's UN-RNCE, where it was possible to detect the leaks. The results show that the proposed descriptors, based on statistical methods applied in the transform domain, are sufficient to identify leak patterns and make it possible to train the neural classifier to indicate the occurrence of pipeline leaks.
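
The sketch below illustrates the kind of pipeline the abstract describes: a multilevel wavelet decomposition of a pressure signal, simple statistical descriptors computed per sub-band, and a small neural classifier. The synthetic signals, the choice of the db4 wavelet, the particular descriptors and the network size are assumptions for illustration only and do not reproduce the thesis's feature set.

```python
# Wavelet sub-band descriptors + MLP classifier on synthetic pressure signals.
import numpy as np
import pywt
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def synthetic_pressure(leak: bool, n: int = 1024) -> np.ndarray:
    """Hypothetical pipeline pressure trace; a leak adds a slow drop and extra noise."""
    t = np.linspace(0, 1, n)
    signal = 10.0 + 0.2 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 0.05, n)
    if leak:
        signal -= 0.8 * t                      # gradual pressure loss
        signal += rng.normal(0, 0.10, n)       # turbulence-like extra noise
    return signal

def descriptors(signal: np.ndarray) -> np.ndarray:
    """Mean, standard deviation and energy of each wavelet sub-band (db4, 4 levels)."""
    coeffs = pywt.wavedec(signal, "db4", level=4)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std, lambda x: np.sum(x**2))])

X = np.array([descriptors(synthetic_pressure(leak)) for leak in ([False] * 100 + [True] * 100)])
y = np.array([0] * 100 + [1] * 100)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_train, y_train)
print("leak-detection accuracy on held-out signals:", clf.score(X_test, y_test))
```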

Relevância:

20.00%

Publicador:

Resumo:

Spacecraft move at high speeds and undergo abrupt changes in acceleration, so an onboard GPS receiver can only calculate navigation solutions if the Doppler effect is taken into consideration during the acquisition and tracking of the satellite signals. Thus, for a receiver subject to such dynamics to cope with the resulting shifts in signal frequency, it is imperative to adjust its acquisition bandwidth and increase the order of its tracking loop. This work presents the changes made to the software of the GPS Orion, an open-architecture receiver produced by GEC Plessey Semiconductors (nowadays Zarlink), in order to make it able to generate navigation fixes for vehicles under high dynamics, especially Low Earth Orbit satellites. The GPS Architect development system, sold by the same company, supported the modifications. This work also presents the characteristics of GPS Monitor Aerospace, a computational tool developed for monitoring, through graphics, the navigation fixes calculated by the GPS receiver. Although it was not possible to simulate the software modifications implemented in the receiver under high dynamics, it was observed that the receiver worked in stationary tests, which was also verified in the new interface. This work also presents the results of the GPS Receiver for Aerospace Applications experiment, achieved through the receiver's participation in a suborbital mission, Operation Maracati 2, in December 2010, using a digital second-order carrier tracking loop. Although an incident moments before the launch hindered the effective navigation of the receiver, it was observed that the experiment worked properly, acquiring new satellites and tracking them during the VSB-30 rocket flight.
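
To give an idea of why the acquisition bandwidth and tracking-loop order must change for orbital dynamics, the sketch below computes the first-order Doppler shift on the GPS L1 carrier for typical line-of-sight range rates of a static user and of a LEO spacecraft; the velocity figures are rough illustrative values, not numbers taken from the thesis.

```python
# First-order Doppler shift on the GPS L1 carrier for illustrative range rates.
L1_HZ = 1575.42e6          # GPS L1 carrier frequency
C_M_S = 299_792_458.0      # speed of light

def doppler_shift_hz(range_rate_m_s: float, carrier_hz: float = L1_HZ) -> float:
    """Doppler shift for a given line-of-sight range rate (positive = receding)."""
    return -range_rate_m_s * carrier_hz / C_M_S

# A static terrestrial user sees roughly +/-0.8 km/s of range rate from GPS satellite
# motion alone; a LEO spacecraft adds its own ~7.5 km/s orbital velocity along the
# line of sight (rough worst-case figures).
for label, rate in [("static user, worst case", 800.0), ("LEO spacecraft, worst case", 8_300.0)]:
    print(f"{label}: about {doppler_shift_hz(rate) / 1e3:+.1f} kHz of Doppler")
```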

Relevância:

20.00%

Publicador:

Resumo:

The aim of this study is to investigate the development of written Interlanguage in English as an Additional Language (AL) by students in the 2nd grade of Ensino Fundamental I at a bilingual school in the city of Natal-RN. For this purpose, two research questions guided the study: (a) which hypotheses could be inferred from the writing development of the bilingual learners of English as an AL? and (b) what is the impact of the type of input (monomodal or multimodal) on the Interlanguage development in the AL of bilingual learners? The 38 learners were divided into a control group, with 21 learners exposed to monomodal input, and an experimental group, with 17 learners exposed to multimodal input, and pre- and post-tests were applied to both groups. A mixed-methods research design was adopted (DÖRNYEI, 2007), involving both qualitative and quantitative data collection and analysis. The qualitative aspect comprised descriptive characteristics that interpreted the central cognitive processes in the acquisition of writing in the AL by the learners. Through these interpretations, it was possible to understand the constitution of written Interlanguage (SELINKER, 1972) according to the data generated by the learners. The quantitative data were presented as the results generated by the experimental design. Thus, they narrowed the relations between the dependent variable (the writing development, that is, how close it is to the target form), which was modified throughout the process by the independent variable, the quality of input (VAN PATTEN, 2002; GASS, 1997; SCHMIDT, 1986; PARADIS, 2009, 2010; ELLIS, 1995), which, being monomodal or multimodal, possibly altered the route of acquisition. The quantitative results pointed towards significant gains by the experimental group, for which multimodality was present, suggesting that the learners in this group seem to have been more able to cognitively register (SCHMIDT, 1990) aspects of learning than the learners in the control group.

Relevância:

20.00%

Publicador:

Resumo:

Recent astronomical observations (involving type Ia supernovae, cosmic microwave background anisotropy and galaxy cluster probes) have provided strong evidence that the observed universe is described by an accelerating, flat model whose space-time properties can be represented by the Friedmann-Robertson-Walker (FRW) metric. However, the nature of the substance or mechanism behind the current cosmic acceleration remains unknown, and its determination constitutes a challenging problem for modern cosmology. In the general relativistic description, an accelerating regime is usually obtained by assuming the existence of an exotic energy component endowed with negative pressure, called dark energy, which is usually represented by a cosmological constant Λ associated with the vacuum energy density. All observational data available so far are in good agreement with the concordance ΛCDM model. Nevertheless, such models are plagued with several problems, thereby inspiring many authors to propose alternative candidates in the relativistic context. In this thesis, a new kind of accelerating flat model with no dark energy, fully dominated by cold dark matter (CDM), is proposed. The number of CDM particles is not conserved, and the present accelerating stage is a consequence of the negative pressure describing the irreversible process of gravitational particle creation. In order to have a transition from a decelerating to an accelerating regime at low redshifts, the matter creation rate proposed here depends on two parameters (γ and β): the first identifies a constant term of the order of H0 and the second describes a time variation proportional to the Hubble parameter H(t). In this scenario, H0 does not need to be small in order to solve the age problem, and the transition happens even if there is no matter creation during the radiation phase and part of the matter-dominated phase (when the β term is negligible). As in flat ΛCDM scenarios, the dimming of distant type Ia supernovae can be fitted with just one free parameter, and the coincidence problem plaguing the models driven by the cosmological constant (ΛCDM) is absent. The limits imposed by the existence of the quasar APM 08279+5255, located at z = 3.91 and with an estimated age between 2 and 3 Gyr, are also investigated. In the simplest case (β = 0), the model is compatible with the existence of the quasar for γ > 0.56 if the age of the quasar is 2.0 Gyr; for an age of 3 Gyr the derived limit is γ > 0.72. New limits for the formation redshift of the quasar are also established.
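
For reference, a hedged sketch of how a matter creation rate with the two pieces described above is usually written in the CCDM literature (assuming the standard parametrization; the thesis may adopt a different normalization): a constant term of order H0 controlled by γ plus a term proportional to the Hubble parameter controlled by β, with the associated creation pressure for pressureless dark matter.

```latex
\begin{equation}
  \Gamma(t) \;=\; 3\gamma H_{0} \;+\; 3\beta H(t),
  \qquad
  p_{c} \;=\; -\,\frac{\rho_{\mathrm{dm}}\,\Gamma}{3H}
        \;=\; -\,\rho_{\mathrm{dm}}\!\left(\gamma\,\frac{H_{0}}{H} \;+\; \beta\right).
\end{equation}
```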

Relevância:

20.00%

Publicador:

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevância:

20.00%

Publicador:

Resumo:

In this thesis, we address two issues of broad conceptual and practical relevance in the study of complex networks. The first is associated with the topological characterization of networks, while the second relates to dynamical processes that occur on top of them. Regarding the first line of study, we initially designed a model for network growth in which preferential attachment includes: (i) connectivity and (ii) homophily (links between sites with similar characteristics are more likely). From this, we observe that the competition between these two aspects leads to a heterogeneous pattern of connections, with the topological properties of the network showing quite interesting results. In particular, we emphasize that there is a region where the characteristics of the sites play an important role not only for the rate at which they acquire links, but also for the number of connections that occur between sites with similar and dissimilar characteristics. Finally, we investigate the spread of epidemics on the network topology developed, with the dissemination following the rules of the contact process. Using Monte Carlo simulations, we show that the competition between site states (infected/healthy) induces a transition between an active phase (presence of infected sites) and an inactive one (no infected sites). In this context, we estimate the critical point of the phase transition through the Binder cumulant and the ratio between moments of the order parameter. Then, using finite-size scaling analysis, we determine the critical exponents associated with this transition.
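
The sketch below shows one possible way to combine connectivity and homophily in a preferential attachment rule, as described above: a new node attaches to an existing node with probability proportional to that node's degree multiplied by the similarity of their characteristics. The specific functional form, trait distribution and parameters are assumptions for illustration and are not necessarily the rule adopted in the thesis.

```python
# Network growth combining degree (connectivity) and trait similarity (homophily).
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)

def grow_network(n_nodes: int = 500, m: int = 2) -> nx.Graph:
    g = nx.complete_graph(m + 1)
    traits = {v: rng.uniform(0, 1) for v in g.nodes}     # intrinsic characteristic of each site
    for new in range(m + 1, n_nodes):
        traits[new] = rng.uniform(0, 1)
        candidates = list(g.nodes)
        # Attachment weight: degree times closeness of characteristics.
        weights = np.array([g.degree(v) * (1.0 - abs(traits[new] - traits[v]))
                            for v in candidates])
        probs = weights / weights.sum()
        targets = rng.choice(candidates, size=m, replace=False, p=probs)
        g.add_node(new)
        g.add_edges_from((new, int(t)) for t in targets)
    nx.set_node_attributes(g, traits, "trait")
    return g

g = grow_network()
degrees = [d for _, d in g.degree()]
print("max degree:", max(degrees), "| mean degree:", np.mean(degrees))
```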

Relevância:

20.00%

Publicador:

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevância:

20.00%

Publicador:

Resumo:

The stimulation of motor learning is an important component of rehabilitation, and the type of practice used is of basic importance to physiotherapy. Motor skills are the most basic types of behavior that subjects must acquire throughout their lives, and observational learning is one of the forms of acquiring them. Objective: This study aimed to compare the performance of post-stroke patients on a test of recognition of activities of daily living using self-controlled and externally determined practice. Intervention: Forty subjects were evaluated: 20 stroke patients (mean age 57.9 ± 6.7 years, schooling 6.7 ± 3.09 years and time since injury 23.4 ± 17.2 months) and 20 healthy subjects (mean age 55.4 ± 5.9 years and schooling 8 ± 3.7 years). All were evaluated for functional independence (FIM) and cognitive state (MMSE), and the patients were also evaluated for neurological state (NIHSS). Later, all performed a test of recognition of activities of daily living (drinking water and speaking on the telephone) under self-controlled (PAUTO and CAUTO) and externally determined (P20 and C20) frequency. The stroke subjects were also examined with a three-dimensional kinematic analysis system while drinking water. The statistical analysis was performed with the chi-square and Student's t tests. Results: There was no difference in the number of correct answers between the self-controlled and externally determined practice groups (p > 0.005), nor between the patient and control groups (p > 0.005). The patients' mean velocity (PAUTO: 141.1 mm/s and P20: 141.6 mm/s) and peak velocity (PAUTO: 652.1 mm/s and P20: 598.6 mm/s) were reduced, as were the angles reached by the elbow (PAUTO: 66.6° and 124.4°; P20: 66.3° and 128.5°, extension and flexion respectively), relative to the literature. Conclusions: Performance on the test of recognition of activities of daily living was similar under self-controlled and externally determined frequency, showing that both techniques may be used to stimulate motor learning in chronic patients after stroke.
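
As a minimal illustration of the chi-square comparison mentioned in the abstract, the sketch below tests whether correct and incorrect recognitions are distributed differently between the self-controlled and externally determined practice groups; the counts are hypothetical and are not the study's data.

```python
# Chi-square test on hypothetical correct/incorrect recognition counts per practice group.
from scipy.stats import chi2_contingency

#         correct  incorrect
table = [[17, 3],   # self-controlled frequency group (hypothetical counts)
         [16, 4]]   # externally determined (20%) frequency group (hypothetical counts)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```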