909 results for "estimador Kernel"
Abstract:
The model that is considered the standard model of theory change was presented in [AGM85] and is known as the AGM model. In particular, that paper introduced the class of partial meet contractions. Subsequent works presented several alternative constructive models for that same class of functions, e.g. safe/kernel contractions ([AM85, Han94]), system of spheres-based contractions ([Gro88]) and epistemic entrenchment-based contractions ([Gär88, GM88]). Furthermore, several generalizations of that model were investigated. In that regard we emphasise the presentation of models that account for contractions by sets of sentences rather than only by a single sentence, i.e. multiple contractions. However, until now, only two of the above-mentioned models have been generalized in the sense of addressing the case of contractions by sets of sentences: the partial meet multiple contractions were presented in [Han89, FH94], while the kernel multiple contractions were introduced in [FSS03]. In this thesis we propose two new constructive models of multiple contraction functions, namely the system of spheres-based and the epistemic entrenchment-based multiple contractions, which generalize the models of system of spheres-based and of epistemic entrenchment-based contractions, respectively, to the case of contractions (of theories) by sets of sentences. Furthermore, analogously to what holds for the corresponding classes of contraction functions by a single sentence, those two classes are identical and constitute a subclass of the class of partial meet multiple contractions. Additionally, as the first step of the procedure followed here to obtain an adequate definition of the system of spheres-based multiple contractions, we present a possible worlds semantics for the partial meet multiple contractions analogous to the one proposed in [Gro88] for the partial meet contractions (by a single sentence).
Finally, we also present an axiomatic characterization of the new class(es) of multiple contraction functions introduced here.
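For a finite propositional belief base, the partial meet construction mentioned above can be sketched by brute force: enumerate the maximal subsets of the base that fail to entail the sentence being retracted (the remainder set), then intersect a selection of them. The helper names below are illustrative rather than taken from the thesis, and entailment is checked by exhaustive valuation, so this toy sketch only scales to very small bases.

```python
from itertools import product, combinations

def entails(base, goal, atoms):
    """Classical entailment by truth tables: every valuation of `atoms`
    satisfying all formulas in `base` must satisfy `goal`."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(f(v) for f in base) and not goal(v):
            return False
    return True

def remainder_set(base, goal, atoms):
    """Maximal subsets of `base` that do not entail `goal`."""
    base = list(base)
    found = []
    for k in range(len(base), -1, -1):       # largest subsets first
        for sub in combinations(base, k):
            s = frozenset(sub)
            if not entails(sub, goal, atoms) and not any(s < r for r in found):
                found.append(s)
    return found

def partial_meet_contraction(base, goal, atoms, select=None):
    """Intersect the remainders chosen by the selection function `select`
    (full meet when `select` is None, i.e. all remainders are chosen)."""
    rems = remainder_set(base, goal, atoms)
    chosen = select(rems) if select else rems
    if not chosen:                            # goal is a tautology
        return set(base)
    return set.intersection(*map(set, chosen))

# Toy base {p, q}: contracting by q leaves exactly {p}.
p = lambda v: v['p']
q = lambda v: v['q']
result = partial_meet_contraction([p, q], q, ['p', 'q'])
```

Different selection functions yield different contractions in the class; the full meet case shown is simply the most cautious choice.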
Abstract:
Survival Analysis studies the time from a well-defined starting point until the occurrence of a given event. For example, it may be the lifetime of an individual from the moment a disease is diagnosed until death or cure. With advances in medicine, individuals began to appear for whom the event of interest was never observed; such individuals are called cured, immune, or non-susceptible. Cure models thus arise as an extension of classical Survival Analysis. In this work, these concepts were applied to a database of 833 women diagnosed with breast cancer between 1998 and 2005. A higher risk of death was found among women aged 50 to 59. Staging was shown to play a decisive role in prognosis: the more advanced the stage, the worse the prognosis. Among the treatments administered, surgery was indicative of a better prognosis, as were hormone therapy and radiotherapy; the latter, however, was not statistically significant in the Cox regression model. Chemotherapy reflected a better prognosis only in the first two years, but not thereafter. This unexpected feature was due to the life expectancy that the treatment offers stage IV patients and, in the case of stage II, to the association between metastasized lymph nodes and a worse prognosis. The cure model was applied only to the group of women with stage IV disease, since only in that case was the follow-up time deemed sufficient, yielding a cure rate of 7.4%.
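In cure models of the kind used above, the cure fraction is often read off the plateau of the Kaplan-Meier survival curve: the curve levels off when only censored (never-failing) individuals remain. A minimal product-limit estimator in plain Python, with a made-up toy cohort rather than the thesis data:

```python
def kaplan_meier(times, events):
    """Product-limit estimator. `times` are follow-up times; `events`
    are 1 for an observed death, 0 for censoring.
    Returns (distinct event times, survival probabilities)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        while i < len(order) and times[order[i]] == t:  # group ties
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= leaving
    return out_t, out_s

# Toy cohort: deaths early on, then only censored survivors -> plateau.
times  = [1, 2, 3, 5, 5, 6, 7, 8]
events = [1, 1, 1, 1, 0, 0, 0, 0]
t, s = kaplan_meier(times, events)
cure_fraction = s[-1]   # height of the plateau
```

In a real cure-model analysis the plateau estimate is only trustworthy when follow-up is long enough, which is exactly why the abstract restricts the model to the stage IV group.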
Abstract:
Organizations are complex systems. A conceptual model of the enterprise is needed that is: coherent (the distinguished aspect models constitute a logical and truly integral whole); comprehensive (all relevant issues are covered); consistent (the aspect models are free from contradictions or irregularities); concise (no superfluous matters are contained in it); and essential (it shows only the essence of the enterprise, i.e., the model abstracts from all realization and implementation issues). The world is in great need of transparency about the operation of all the systems we work with daily, ranging from domestic appliances to the big societal institutions. In this context the field of enterprise ontology has emerged, with the aim of creating models that help to understand the essence of the construction and operation of complete systems; more specifically, of enterprises. Enterprise ontology looks through the distracting and confusing appearance of an enterprise right into its deep kernel. From the perspective of the system designer, this provides the tools needed to design a successful system in a way that reflects the desires and needs of the workers of the enterprise. This project's context is the use of DEMO (Design and Engineering Methodology for Organizations) for (re)designing or (re)engineering an enterprise, namely a process of the construction department of a city hall; the lack of a well-founded theory about the construction and operation of this process was the motivation behind this work. The purpose of studying and applying the DEMO theory and method was to optimize the process, automating it as much as possible, while reducing paper and time spent between tasks and providing a better service to the citizens.
Abstract:
At present, public organizations are employing more and more solutions that use information technology in order to offer more transparency and better services to all citizens. Integrated Systems are IT solutions which carry in their kernel features of integration and the use of a single database. These systems bring several benefits and face some obstacles that make their adoption difficult. The conversion to an integrated system may take years; thus, the study of the adoption of this IT in public sector organizations becomes very stimulating, due to some peculiarities of this sector and the features of this technology. First of all, information is provided about the particular integrated system under study and about its conversion process. Then, the researcher describes the configuration of the conversion process that is the aim of this study (the agents involved, and the moments and tools used to support the process) in order to elaborate the methodology of the conversion process, understood as the set of procedures and tools used throughout the conversion. After this, the researcher points out, together with all the members of the conversion team, the negative and positive factors observed during the project. Finally, these factors were analysed through the lens of Hospitality Theory which, in the researcher's opinion, was very useful to understand the elements, events and moments that interfered in the project. The results empirically consolidated the presumptions of Hospitality Theory, while also showing a limitation of this theory in the case under study.
Abstract:
Brazil is the third largest producer of cashew nuts in the world. Despite the social and economic importance of the cashew nut, its production is still carried out artisanally. One of the main problems encountered in the cashew production chain is the conditions under which the roasting of the nut occurs to obtain the kernel from the shell. In the present study, a biomonitoring of the genotoxic and cytotoxic effects associated with the elements released by cashew nut roasting was conducted in João Câmara - RN, in the semi-arid region of Brazil. To assess genotoxicity, the micronucleus (MN) bioassay in Tradescantia pallida was used. In addition, a comparison between Tradescantia pallida and KU-20 was performed, and other biomarkers of DNA damage, such as nucleoplasmic bridges (NPB) and nuclear fragments (NF), were quantified. The levels of particulate matter (PM1.0, PM2.5, PM10) and black carbon (BC) were also measured; the inorganic chemical composition of the collected PM2.5 was determined using X-ray fluorescence spectrometry; and cytotoxicity was assessed by the MTT assay and the trypan blue exclusion method. For this purpose, two sites were chosen: the Amarelão community, where the roasting occurs, and the Santa Luzia farm, an area without influence of this process. The mean values of PM2.5 (Jan 2124.2 μg/m3; May 1022.2 μg/m3; Sep 1291.9 μg/m3) and BC (Jan 363.6 μg/m3; May 70.0 μg/m3; Sep 69.4 μg/m3), as well as the concentrations of the elements Al, Si, P, S, Cl, K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Se, Br and Pb, obtained at Amarelão were significantly higher than at the Santa Luzia farm. The genotoxicity tests with T. pallida indicated a significant increase in the number of MN, NPB and NF, and a negative correlation was found between the frequency of these biomarkers and rainfall. The concentrations of 200 μg/mL and 400 μg/mL of PM2.5 were cytotoxic to MRC-5 cells.
Altogether, the results indicated genotoxicity and cytotoxicity for the community of Amarelão, with the high levels of PM2.5 considered a potential contributor to this effect, mainly because of the high presence of transition metals, especially Fe, Ni, Cu, Cr and Zn, elements with the potential to cause DNA damage. Other nuclear alterations, such as NPBs and NFs, may be used as effective biomarkers of DNA damage in tetrads of Tradescantia pallida. The results of this study enabled the identification of a serious occupational problem. Accordingly, preventative measures and better practices should be adopted to improve both the activity and the quality of life of the population. These measures are of fundamental importance for the sustainable development of this activity.
Abstract:
The objective of this project was to study the influence of surcharge pressure and moisture content on the compressive behavior and bulk density of soybeans. Three varieties with varying dimensions and shapes were selected. Moisture contents of 10.5, 15.0, and 20.0% were tested at nine surcharge pressures in the range from 0 to 82.8 kPa. Results indicated that the bulk densities of different soybean varieties behave similarly with respect to pressure level and moisture content, but that the magnitude of bulk density was influenced by variety. Bulk density was influenced by both pressure level and moisture content. The four-element Burger model was found to adequately describe the bulk density of soybeans as a function of pressure for all varieties and moisture levels.
Abstract:
This study aims to verify the impact of the Bolsa Família Program (BFP) on the income and school attendance of poor Brazilian families. It also intends to check for a possible negative effect of the program on the labor market, termed the sloth effect. To this end, microdata from the 2010 IBGE Census sample were used. Seeking to purge possible selection biases, the Quantile Treatment Effect (QTE) methodology was applied, in particular the estimator proposed by Firpo (2007), which assumes an exogenous and unconditional treatment. Moreover, the Foster-Greer-Thorbecke (FGT) index was calculated to check whether there are fewer households below the poverty line, as well as whether inequality among the poor decreases. The Human Opportunity Index (HOI) was also calculated to measure young people's and children's access to education. Results showed that the BFP has positively influenced family per capita income and education (number of children aged 5-17 attending school). As for the labor market (hours worked and labor income), the program showed a negative effect. Thus, when compared with non-benefiting families, the families who receive the BFP have: a) a higher family income (due to the shock of the transferred money in the budget); b) more children attending school (due to the conditionality imposed by the program); c) fewer hours worked (due to the sloth effect in certain family groups); and d) a lower income from work. All these effects were amplified when the sample was split into the five Brazilian regions, where it was observed that the BFP most strongly influenced the Northeast, showing a greater decrease in income inequality and poverty while, at the same time, having a greater negative impact on the labor market.
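The FGT index used above has a simple closed form: FGT_alpha = (1/n) * sum over all individuals with income y_i below the poverty line z of ((z - y_i)/z)^alpha. With alpha = 0 it reduces to the headcount ratio, alpha = 1 gives the average normalized poverty gap, and alpha = 2 weights inequality among the poor. A minimal sketch with made-up incomes (not the Census microdata):

```python
def fgt(incomes, z, alpha):
    """Foster-Greer-Thorbecke poverty index with poverty line z."""
    gaps = [((z - y) / z) ** alpha for y in incomes if y < z]
    return sum(gaps) / len(incomes)

incomes = [50, 80, 120, 200, 400]   # illustrative per capita incomes
z = 100                             # illustrative poverty line
headcount = fgt(incomes, z, 0)      # share of people below z
gap       = fgt(incomes, z, 1)      # average normalized poverty gap
severity  = fgt(incomes, z, 2)      # squared-gap (inequality-sensitive) index
```

Comparing the index between beneficiary and non-beneficiary samples, as the study does, shows whether the transfer moves households above z and narrows the gaps of those still below it.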
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The growth of maize (Zea mays L.) kernels depends on the availability of carbon (C) and nitrogen (N) assimilates supplied by the mother plant and the capacity of the kernel to use them. Our objectives were to study the effects of N and sucrose supply levels on growth and metabolism of maize kernels. Kernel explants of Pioneer 34RO6 were cultured in vitro with varying combinations of N (5 to 30 mM) and sucrose (117 to 467 mM). Maximum kernel growth was obtained with 10 mM N and 292 mM sucrose in the medium, and a deficiency of one assimilate could not be overcome by a sufficiency of the other. Increasing the N supply led to increases in the kernel sink capacity (number of cells and starch granules in the endosperm), the activity of certain enzymes (soluble and bound invertases, sucrose synthase, and aspartate aminotransferase), starch, and the levels of N compounds (total N, soluble protein, and free amino acids), and decreased the levels of C metabolites (sucrose and reducing sugars). Conversely, increasing the sucrose supply increased the level of endosperm C metabolites, free amino acids, and ADPG-PPase and alanine transaminase activities, but decreased the activity of soluble invertase and the concentrations of soluble protein and total N. Thus, while C and N are interdependent and essential for accumulation of maximum kernel weight, they appear to regulate growth by different means. Nitrogen supply aids the establishment of kernel sink capacity and promotes the activity of enzymes related to sucrose and nitrogen uptake, while sucrose regulates the activities of invertase and ADPG-PPase. (C) 1999 Annals of Botany Company.
Abstract:
This work describes the study and implementation of vector speed control for a three-phase bearingless induction machine with divided winding, 4 poles and 1.1 kW, using neural rotor flux estimation. The vector speed control operates together with the radial positioning controllers and with the controllers of the stator phase winding currents. For the radial positioning, the forces controlled by the internal magnetic fields of the machine are used. For the optimization of the radial forces, a special rotor winding with independent circuits, which allows a low influence on the rotational torque, was used. The neural flux estimation applied to the vector speed control aims at compensating the dependence of conventional estimators on the machine's parameter variations caused by temperature increases or by rotor magnetic saturation. The implemented control system allows a direct comparison between the responses of the speed and radial positioning controllers when the machine is oriented by the neural rotor flux estimator and when it is oriented by the conventional flux estimator. The entire control system is executed by a program developed in ANSI C. The DSP resources used by the system are: the analog-to-digital converter channels, the PWM outputs, and the parallel and RS-232 serial interfaces, which are responsible, respectively, for the DSP programming and for data capture through the supervisory system.
Abstract:
The use of maps obtained from remote sensing orbital images submitted to digital processing has become fundamental to optimize conservation and monitoring actions for coral reefs. However, the accuracy reached in the mapping of submerged areas is limited by variation of the water column, which degrades the signal received by the orbital sensor and introduces errors in the final result of the classification. The limited capacity of traditional methods based on conventional statistical techniques to solve problems related to inter-class confusion led to the search for alternative strategies in the area of Computational Intelligence. In this work an ensemble of classifiers was built, based on the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the progressive refinement of the classification process takes place. Patterns that received an ambiguous classification at a certain stage of the process were re-evaluated in the subsequent stage. An unambiguous prediction for all the data was achieved through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, inter-tidal corals, algal and sandy bottom. The highest overall accuracy (89%) was obtained from the SVM with a polynomial kernel. The accuracy of the classified image was compared, through the use of an error matrix, with the results obtained by applying other classification methods based on a single classifier (a neural network and the k-means algorithm). Finally, the comparison of the results achieved demonstrated the potential of ensemble classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and the water column.
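The Minimum Distance Classifier in such an ensemble assigns each pixel to the class whose mean feature vector (centroid) is nearest, and patterns whose two best distances are nearly tied can be flagged as ambiguous and deferred to the next stage, matching the staged refinement described above. A toy sketch; the class names, features and margin threshold are illustrative, not the study's data:

```python
import math

def centroids(samples):
    """Mean feature vector per class from {label: [feature vectors]}."""
    return {c: [sum(col) / len(vs) for col in zip(*vs)]
            for c, vs in samples.items()}

def classify(x, cents, margin=0.1):
    """Return the nearest-centroid label, or None when the two closest
    centroids are nearly tied (ambiguous -> defer to the next stage)."""
    dists = sorted((math.dist(x, m), c) for c, m in cents.items())
    if len(dists) > 1 and dists[1][0] - dists[0][0] < margin:
        return None
    return dists[0][1]

# Two illustrative bottom types described by two spectral features.
train = {'sand':  [[0.8, 0.7], [0.9, 0.8]],
         'coral': [[0.2, 0.3], [0.3, 0.2]]}
cents = centroids(train)
label = classify([0.85, 0.75], cents)
```

In the ensemble, the patterns returned as `None` here would be passed to the SVM stage rather than forced into a class, which is how the false positives get reduced.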
Abstract:
This study evaluated the agronomic characteristics, the chemical-bromatological composition and the digestibility of 11 maize (Zea mays) cultivars harvested at two cutting heights. Cultivars D 766, D 657, D 1000, P 3021, P 3041, C 805, C 333, AG 5011, FO 01, CO 9621 and BR 205 were evaluated when harvested 5 cm above the ground (low) and 5 cm below the insertion of the first ear (high). The experiment was laid out in randomized blocks, with three replicates, arranged in an 11 x 2 factorial scheme. The cultivars showed similar forage dry matter and grain yields. The percentages of the stem, leaf, husk, cob and grain fractions differed among cultivars, as did the whole-plant dry matter content at harvest. Considering the whole plant, only the gross energy content, the nitrogen of the neutral detergent fiber fraction, and the in vitro digestibility of neutral detergent fiber and acid detergent fiber did not differ among cultivars. Raising the cutting height improved forage quality, owing to the reduction of the stem and leaf fractions and of the cell wall constituent contents.
Abstract:
Most algorithms for state estimation based on the classical model are adequate only for use in transmission networks. Few algorithms have been developed specifically for distribution systems, probably because of the small amount of data available in real time. Most overhead feeders possess only current and voltage measurements at the medium voltage bus-bar of the substation. Thus, classical algorithms are difficult to implement, even considering off-line acquired data as pseudo-measurements. However, the need to automate the operation of distribution networks, mainly with regard to the selectivity of protection systems, as well as to enable load transfer maneuvers, is changing network planning policy. Accordingly, some equipment incorporating telemetry and command modules has been installed in order to improve operational features, thus increasing the amount of measurement data available in real time at the System Operation Center (SOC). This encourages the development of a state estimator model involving real-time information and load pseudo-measurements built from typical power factors and utilization factors (demand factors) of distribution transformers. This work reports the development of a new state estimation method, specific to radial distribution systems. The main algorithm of the method is based on the power summation load flow. The estimation is carried out piecewise, section by section of the feeder, going from the substation to the terminal nodes. For each section, a measurement model is built, resulting in a nonlinear overdetermined set of equations, whose solution is obtained through the Gaussian normal equation. The estimated variables of a section are used as pseudo-measurements for the next section.
In general, the measurement set for a generic section consists of pseudo-measurements of power flows and nodal voltages obtained from the previous section (or real-time measurements, if they exist), besides pseudo-measurements of injected powers for the power summations, whose functions are the load flow equations, assuming that the network can be represented by its single-phase equivalent. The great advantage of the algorithm is its simplicity and low computational effort. Moreover, the algorithm is very efficient with regard to the accuracy of the estimated values. Besides the power summation state estimator, this work shows how other algorithms can be adapted to provide state estimation for medium voltage substations and networks, namely Schweppe's method and an algorithm based on current proportionality that is usually adopted for network planning tasks. Both estimators were implemented not only as alternatives to the proposed method, but also to obtain results that support its validation. Since in most cases no power measurement is performed at the beginning of the feeder, and this is required by the power summation estimation method, a new algorithm for estimating the network variables at the medium voltage bus-bar was also developed.
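For a linearized measurement model z ≈ H x, the normal-equation step used per section above is the least-squares solution of (H^T H) x = H^T z (unit weights assumed in this sketch). A minimal illustration for one overdetermined section, with made-up numbers rather than actual feeder data:

```python
def normal_equation(H, z):
    """Solve the overdetermined system H x ~= z in the least-squares
    sense via the normal equation (H^T H) x = H^T z (unit weights)."""
    m, n = len(H), len(H[0])
    HtH = [[sum(H[k][i] * H[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    Htz = [sum(H[k][i] * z[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination on the small n x n normal system (no pivoting:
    # fine for this well-conditioned toy example only).
    for i in range(n):
        for j in range(i + 1, n):
            f = HtH[j][i] / HtH[i][i]
            HtH[j] = [a - f * b for a, b in zip(HtH[j], HtH[i])]
            Htz[j] -= f * Htz[i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (Htz[i] - sum(HtH[i][j] * x[j]
                             for j in range(i + 1, n))) / HtH[i][i]
    return x

# Three redundant "measurements" of two state variables (illustrative).
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
z = [1.02, 0.51, 1.49]
x_hat = normal_equation(H, z)
```

In the actual method H comes from linearizing the load flow equations of the section, and the solution x_hat feeds the next section as pseudo-measurements.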