916 results for Test, Black-box testing


Relevance:

100.00%

Publisher:

Abstract:

Because the crop is expanding into regions with higher temperatures, it is important to develop cultivars adapted to heat. The aim was to select tests for evaluating the seed physiological quality of wild radish and to estimate genetic parameters in order to select cultivars adapted to high-temperature conditions. One hundred half-sib progenies of wild radish were subjected to the germination test and to vigor tests (first germination count, seedling vigor classification, accelerated aging, and germination and first count at high temperature), as well as to seedling emergence in the field. The germination test, first count test, accelerated aging test, and germination test at high temperature (20-35°C) can be used for selecting wild radish populations adapted to germination and field emergence under high temperatures.

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Pós-graduação em Artes - IA

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Aim: To assess the influence of cervical preparation on the fracture susceptibility of roots. Material and methods: During root canal instrumentation, the cervical portions were prepared with instruments of different tapers: I: no cervical preparation; II: #30/.08; III: #30/.10; IV: #70/.12. The specimens were sealed with the following filling materials (n = 8): A: unfilled; B: Endofill/gutta-percha; C: AH Plus/gutta-percha; D: Epiphany SE/Resilon. For the fracture resistance test, a universal testing machine was used at a crosshead speed of 1 mm per minute. Results: ANOVA demonstrated a difference (P < 0.05) between taper instruments, with a higher value for group I (205.3 ± 77.5 N), followed by II (185.2 ± 70.8 N), III (164.8 ± 48.9 N), and IV (156.7 ± 41.4 N). There was no difference (P > 0.05) between filling materials A (189.1 ± 66.3 N), B (186.3 ± 61.0 N), C (159.7 ± 69.9 N), and D (176.9 ± 55.2 N). Conclusions: Greater cervical wear using a #70/.12 file increased root fracture susceptibility, and the tested filling materials were not able to restore resistance.

Relevance:

100.00%

Publisher:

Abstract:

Daily rhythmic processes are coordinated by circadian clocks, which are present in numerous central and peripheral tissues. In mammals, two circadian clocks, the food-entrainable oscillator (FEO) and methamphetamine-sensitive circadian oscillator (MASCO), are "black box" mysteries because their anatomical loci are unknown and their outputs are not expressed under normal physiological conditions. In the current study, the investigation of the timekeeping mechanisms of the FEO and MASCO in mice with disruption of all three paralogs of the canonical clock gene, Period, revealed unique and convergent findings. We found that both the MASCO and FEO in Per1(-/-)/Per2(-/-)/Per3(-/-) mice are circadian oscillators with unusually short (~21 h) periods. These data demonstrate that the canonical Period genes are involved in period determination in the FEO and MASCO, and computational modeling supports the hypothesis that the FEO and MASCO use the same timekeeping mechanism or are the same circadian oscillator. Finally, these studies identify Per1(-/-)/Per2(-/-)/Per3(-/-) mice as a unique tool critical to the search for the elusive anatomical location(s) of the FEO and MASCO.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates decomposition and reformulation for solving integer linear programming problems. The method is often very successful computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median, and generalized assignment. Until now, however, the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply the concept to a generic MIP problem: the approach automatically decomposes and reformulates the input problem, can be used as a black-box solution algorithm, and works as a complement and alternative to standard solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (the slaves) and to work on the partially convexified polyhedra obtained. For a given MIP, several decompositions can be defined, depending on which sets of constraints are convexified. In this thesis, MIPs are mainly reformulated using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (master) and checking (pricing) whether a variable of negative reduced cost exists; if so, it is added to the master, which is solved again (column generation), otherwise the procedure stops. The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a branch-and-bound scheme (branch-and-price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a substantial speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
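
To make the master/pricing loop concrete, below is a minimal Python sketch of column generation for the cutting-stock problem, one of the well-structured applications mentioned above. The roll width, piece widths, and demands are invented example values, and the sketch uses SciPy's LP solver for the restricted master and a simple knapsack DP for pricing; it illustrates the general DWD/column-generation idea, not the thesis' actual framework.

```python
# Hedged sketch: column generation for cutting stock (illustrative data only).
import numpy as np
from scipy.optimize import linprog

ROLL_WIDTH = 100
widths = np.array([45, 36, 31, 14])      # piece widths (example data)
demand = np.array([97, 610, 395, 211])   # required pieces of each width

# Start with trivial patterns: one piece type per roll.
patterns = [np.eye(len(widths), dtype=int)[i] * (ROLL_WIDTH // w)
            for i, w in enumerate(widths)]

def solve_master(patterns):
    """LP relaxation of the restricted master problem; returns solution and duals."""
    A = np.array(patterns).T               # rows: piece types, columns: patterns
    c = np.ones(A.shape[1])                # minimise the number of rolls used
    # sum_j A[i, j] x_j >= demand[i]  ->  -A x <= -demand
    res = linprog(c, A_ub=-A, b_ub=-demand, bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals         # duals of the covering constraints
    return res, duals

def price(duals):
    """Pricing subproblem: unbounded knapsack maximising dual value per roll."""
    best = np.zeros(ROLL_WIDTH + 1)
    choice = [None] * (ROLL_WIDTH + 1)
    for cap in range(1, ROLL_WIDTH + 1):
        for i, w in enumerate(widths):
            if w <= cap and best[cap - w] + duals[i] > best[cap]:
                best[cap] = best[cap - w] + duals[i]
                choice[cap] = i
    # Recover the pattern (new column) from the DP choices.
    pattern, cap = np.zeros(len(widths), dtype=int), ROLL_WIDTH
    while choice[cap] is not None:
        pattern[choice[cap]] += 1
        cap -= widths[choice[cap]]
    return best[ROLL_WIDTH], pattern

while True:
    master, duals = solve_master(patterns)
    value, pattern = price(duals)
    if value <= 1 + 1e-9:                  # no column with negative reduced cost
        break
    patterns.append(pattern)               # add the new column and resolve

print("LP bound on rolls needed:", master.fun)
```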

Relevance:

100.00%

Publisher:

Abstract:

In the framework of micro-CHP (Combined Heat and Power) energy systems and the Distributed Generation (DG) concept, an Integrated Energy System (IES), able to meet the energy and thermal requirements of specific users by using different types of fuel to feed several micro-CHP energy sources, with the integration of electric generators based on renewable energy sources (RES), electrical and thermal storage systems, and the control system, was conceived and built. A 5 kWel Polymer Electrolyte Membrane Fuel Cell (PEMFC) has been studied. Using experimental data obtained from several measurement campaigns, the electrical and CHP performance of the PEMFC system has been determined. The effect of the water management of the anodic exhaust at variable FC loads has been analyzed, and the programming logic of the purge process was optimized, also leading to the determination of the optimal flooding times as a function of the AC power delivered by the cell. Furthermore, the degradation mechanisms of the PEMFC system, in particular those due to flooding of the anodic side, have been assessed using an algorithm that treats the FC as a black box and is able to determine the amount of non-reacted H2 and, therefore, the causes that produce it. Using experimental data covering a two-year time span, the ageing of the FC system has been assessed and analyzed.
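
As an illustration of the black-box idea (not the algorithm actually developed in the work), a common way to estimate non-reacted H2 is to compare the measured hydrogen feed with the consumption implied by the stack current through Faraday's law; the sketch below uses invented stack parameters.

```python
# Hedged sketch: Faraday-law estimate of non-reacted H2 (illustrative values only).
F = 96485.0  # Faraday constant, C/mol

def unreacted_h2(h2_feed_mol_s: float, stack_current_a: float, n_cells: int) -> float:
    """Estimated non-reacted H2 flow in mol/s."""
    consumed = n_cells * stack_current_a / (2 * F)   # 2 electrons per H2 molecule
    return max(h2_feed_mol_s - consumed, 0.0)

# Example: 25-cell stack at 150 A fed with 0.022 mol/s of H2 (made-up numbers).
print(unreacted_h2(0.022, 150.0, 25))
```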

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this thesis work is the characterization of an optical sensor for hematocrit reading and the development of the device's calibration algorithm. In other words, using data obtained from an appropriately planned calibration session, the developed algorithm returns the data-interpolation curve that characterizes the transducer. The main steps of the thesis work are summarized in the following points: 1) Planning of the calibration session needed for data collection and subsequent construction of a black-box model, where the output is the signal from the optical sensor (a reading expressed in mV) and the input is the hematocrit value expressed in percentage points (this quantity represents the true blood-volume value and was obtained with a blood centrifugation device). 2) Development of the algorithm: the algorithm, developed and used offline, returns the regression curve of the data. At a high level, the code can be divided into two main parts: 1) acquisition of the data coming from the sensor and of the operating state of the biphasic pump; 2) normalization of the acquired data with respect to the sensor's reference value and implementation of the regression algorithm. The normalization step is a fundamental statistical tool for comparing quantities that are not uniform with each other. Existing studies also show a morphological change of the red blood cell in response to mechanical stress. A further aspect addressed in this work concerns the blood flow rate imposed by the pump and how this quantity can influence the hematocrit reading.
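
A minimal sketch of the offline calibration step described above: the sensor readings are normalized against a reference value and a regression curve mapping sensor output (mV) to hematocrit (%) is fitted. The data points, the reference value, and the polynomial degree are illustrative assumptions, not the values used in the thesis.

```python
# Hedged sketch of the normalisation + regression step (invented example data).
import numpy as np

sensor_mv   = np.array([612.0, 655.0, 701.0, 748.0, 790.0])  # optical readings (mV)
hct_percent = np.array([22.0, 28.0, 34.0, 40.0, 46.0])       # centrifuge reference (%)

reference_mv = 600.0                      # assumed sensor reference reading
x = sensor_mv / reference_mv              # normalisation against the reference

coeffs = np.polyfit(x, hct_percent, deg=2)    # regression (calibration) curve
calibration_curve = np.poly1d(coeffs)

# Estimated hematocrit for a new, normalised sensor reading:
print(calibration_curve(720.0 / reference_mv))
```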

Relevance:

100.00%

Publisher:

Abstract:

Globalization has increased the pressure on organizations and companies to operate in the most efficient and economical way. This tendency leads companies to concentrate more and more on their core businesses and to outsource less profitable departments and services in order to reduce costs. In contrast to earlier times, companies are highly specialized and have a low real net output ratio. To be able to provide consumers with the right products, these companies have to collaborate with other suppliers and form large supply chains. A drawback of large supply chains is high stocks and stockholding costs. This has led to the rapid spread of just-in-time logistics concepts aimed at minimizing stock while maintaining high product availability. These competing goals, minimizing stock while keeping product availability high, demand high availability of the production systems, so that an incoming order can be processed immediately. Besides design aspects and the quality of the production system, maintenance has a strong impact on production system availability. In recent decades there have been many attempts to create maintenance models for availability optimization. Most of them concentrated on the availability aspect only, without incorporating further aspects such as logistics and the profitability of the overall system. However, a production system operator's main intention is to optimize the profitability of the production system, not its availability. Thus, classic models, limited to representing and optimizing maintenance strategies in the light of availability, fall short. A novel approach, incorporating all processes with a financial impact on and around a production system, is needed. The proposed model is subdivided into three parts: a maintenance module, a production module, and a connection module. This subdivision provides easy maintainability and simple extensibility. Within these modules, all aspects of the production process are modeled. The main part of the work lies in the extended maintenance and failure module, which represents different maintenance strategies and also incorporates the effects of over-maintaining and of failed maintenance (maintenance-induced failures). Order release and seizing of the production system are modeled in the production part. Due to computational power limitations, it was not possible to run the simulation and the optimization with the fully developed production model; the production model was therefore reduced to a black box with a lower degree of detail.
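
Purely as an illustration of the three-module split described above (the class names, time constants, and cost figures are invented, not taken from the thesis), a sketch might look like this: the maintenance module models preventive actions and maintenance-induced failures, the production system is reduced to a black box that only exposes its output rate, and the connection module turns both into a profitability figure.

```python
# Hedged sketch of the modular structure (all parameters are illustrative).
from dataclasses import dataclass
import random

@dataclass
class MaintenanceModule:
    interval_h: float            # preventive maintenance interval (hours)
    induced_failure_p: float     # chance that a maintenance action itself causes a failure

    def maintain(self) -> bool:
        """Return True if the maintenance action induced a failure."""
        return random.random() < self.induced_failure_p

@dataclass
class ProductionBlackBox:
    rate_per_h: float            # nominal output while the system is up

    def produce(self, uptime_h: float) -> float:
        return self.rate_per_h * uptime_h

def profit(maint: MaintenanceModule, prod: ProductionBlackBox,
           horizon_h: float, margin: float, maint_cost: float) -> float:
    """Connection module: combine maintenance downtime and output into profit."""
    n_actions = int(horizon_h // maint.interval_h)
    # 2 h of downtime per action, plus 8 h for each maintenance-induced failure (assumed).
    downtime = n_actions * 2.0 + sum(8.0 for _ in range(n_actions) if maint.maintain())
    return margin * prod.produce(horizon_h - downtime) - maint_cost * n_actions

# Example: weekly maintenance over one year of operation (made-up figures).
print(profit(MaintenanceModule(168.0, 0.05), ProductionBlackBox(40.0),
             horizon_h=8760.0, margin=3.0, maint_cost=500.0))
```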

Relevance:

100.00%

Publisher:

Abstract:

Automatic design has become a common approach to evolving complex networks, such as artificial neural networks (ANNs) and random Boolean networks (RBNs), and many evolutionary setups have been discussed to increase the efficiency of this process. However, networks evolved in this way have a few limitations that should not be overlooked. One of these limitations is the black-box problem, which refers to the impossibility of analyzing the internal behaviour of complex networks in an efficient and meaningful way. The aim of this study is to develop a methodology that makes it possible to extract finite-state automaton (FSA) descriptions of robot behaviours from the dynamics of automatically designed complex controller networks. These FSAs, unlike the complex networks from which they are extracted, are both readable and editable, thus making the resulting designs much more valuable.
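
As a hedged example of one common extraction recipe (not necessarily the method developed in the study), the internal state of the controller can be quantised and the observed transitions between quantised states recorded to obtain an FSA; the state trace and input symbols below are invented.

```python
# Hedged sketch: FSA extraction by state quantisation (illustrative data only).
from collections import defaultdict
import numpy as np

def extract_fsa(states, inputs, n_bins=3):
    """states: array (T, d) of network states; inputs: length T-1 sequence of input symbols."""
    # Quantise each state vector into a discrete label (coarse binning in [0, 1]).
    cuts = np.linspace(0.0, 1.0, n_bins + 1)[1:-1]
    labels = [tuple(np.digitize(s, cuts)) for s in states]

    transitions = defaultdict(set)
    for t in range(len(labels) - 1):
        transitions[(labels[t], inputs[t])].add(labels[t + 1])
    return transitions   # (state, input symbol) -> set of successor states

# Example with a made-up two-unit controller trace:
trace = np.array([[0.1, 0.9], [0.2, 0.8], [0.7, 0.3], [0.8, 0.2]])
symbols = ["obstacle", "clear", "clear"]
print(extract_fsa(trace, symbols))
```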

Relevance:

100.00%

Publisher:

Abstract:

Artificial neural networks are based on computational units that resemble the basic information-processing properties of biological neurons in an abstract and simplified manner. Generally, these formal neurons model an input-output behaviour of the kind that is also often used to characterize biological neurons. The neuron is treated as a black box; the spatial extension and temporal dynamics present in biological neurons are most often neglected. Even though artificial neurons are simplified, they can show a variety of input-output relations, depending on the transfer functions they apply. This unit on transfer functions provides an overview of different transfer functions and offers a simulation that visualizes the input-output behaviour of an artificial neuron depending on the specific combination of transfer functions.
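
A minimal sketch of such a formal neuron: a weighted sum of the inputs passed through a chosen transfer function. The transfer functions shown are standard choices, and the weights and inputs are made-up example values, not taken from the simulation described above.

```python
# Hedged sketch of a single formal neuron with interchangeable transfer functions.
import numpy as np

transfer_functions = {
    "identity": lambda a: a,
    "step":     lambda a: np.where(a >= 0.0, 1.0, 0.0),
    "sigmoid":  lambda a: 1.0 / (1.0 + np.exp(-a)),
    "tanh":     np.tanh,
    "relu":     lambda a: np.maximum(a, 0.0),
}

def neuron(inputs, weights, bias, transfer="sigmoid"):
    """Input-output behaviour of one artificial neuron treated as a black box."""
    activation = np.dot(inputs, weights) + bias      # weighted sum of the inputs
    return transfer_functions[transfer](activation)  # apply the chosen transfer function

# Example inputs and weights (illustrative values):
x = np.array([0.5, -1.2, 0.3])
w = np.array([0.8, 0.4, -0.6])
for name in transfer_functions:
    print(name, neuron(x, w, bias=0.1, transfer=name))
```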