952 resultados para Performance Rating System


Relevância:

30.00%

Publicador:

Resumo:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern-day systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling, based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction.
An emulator is introduced to allow performance estimation of the solution at high abstraction levels that is as accurate as possible. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
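The structural checks that OCL constraints perform over a platform model can be illustrated with a small sketch. This is a hypothetical example, not the actual SegBus UML profile: the component names and the invariant (every bus segment has exactly one arbiter and at least one attached device) are assumptions chosen only to show the kind of correct-by-design rule involved.

```python
# Hypothetical sketch of the kind of structural invariant that OCL
# constraints enforce on a platform model: every bus segment must
# have exactly one arbiter and at least one attached device.
# Segment, arbiters, devices are illustrative names, not the
# actual SegBus profile.

class Segment:
    def __init__(self, name, arbiters, devices):
        self.name = name
        self.arbiters = arbiters    # arbiter module names on this segment
        self.devices = devices      # attached device names

def check_platform(segments):
    """Return a list of constraint violations (empty if well-formed)."""
    violations = []
    for seg in segments:
        if len(seg.arbiters) != 1:
            violations.append(f"{seg.name}: expected exactly 1 arbiter")
        if not seg.devices:
            violations.append(f"{seg.name}: no devices attached")
    return violations

platform = [Segment("seg0", ["arb0"], ["cpu", "mem"]),
            Segment("seg1", [], ["dsp"])]
print(check_platform(platform))   # seg1 violates the arbiter constraint
```

In an actual model-based flow the same rule would be attached to the UML profile as an OCL invariant and evaluated by the modeling tool rather than by hand-written code.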

Relevância:

30.00%

Publicador:

Resumo:

The objective of this study was to optimize and validate a solid-liquid extraction (ESL) technique for the determination of picloram residues in soil samples. At the optimization stage, the optimal conditions for extraction of the soil samples were determined using univariate analysis. The soil/extraction solution ratio, the type and duration of agitation, and the ionic strength and pH of the extraction solution were evaluated. Based on the optimized parameters, the following method of extraction and analysis of picloram was developed: weigh 2.00 g of soil, dried and sieved through a 2.0 mm mesh sieve; add 20.0 mL of 0.5 mol L-1 KCl; shake the bottle in a vortex for 10 seconds to form a suspension and adjust to pH 7.00 with 0.1 mol L-1 KOH; homogenize in a shaker for 60 minutes and then let the system stand for 10 minutes; centrifuge the bottles for 10 minutes at 3,500 rpm. After settling of the soil particles and clarification of the supernatant extract, an aliquot is withdrawn and analyzed by high performance liquid chromatography. The optimized method was validated by determining its selectivity, linearity, detection and quantification limits, precision and accuracy. The ESL methodology was efficient for the analysis of residues of the pesticide studied, with recovery percentages above 90%. The limits of detection and quantification were 20.0 and 66.0 mg kg-1 of soil for the PVA soil, and 40.0 and 132.0 mg kg-1 of soil for the VLA soil. The coefficients of variation (CV) were 2.32 and 2.69 for the PVA and TH soils, respectively. The methodology resulted in low organic solvent consumption and cleaner extracts, and no purification steps were required before chromatographic analysis. The parameters evaluated in the validation process indicated that the ESL methodology is efficient for the extraction of picloram residues from soils, with low limits of detection and quantification.
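Two of the validation figures reported above, percent recovery of a spiked sample and the coefficient of variation of replicate measurements, are simple calculations. The sketch below shows them with made-up replicate values; the numbers are illustrative, not the study's data.

```python
# Illustrative calculation of two validation figures used in the
# study: percent recovery of a fortified (spiked) sample and the
# coefficient of variation (CV) of replicate measurements.
# The replicate values below are hypothetical, not the study's data.

def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

def coefficient_of_variation(values):
    n = len(values)
    mean = sum(values) / n
    # sample standard deviation (n - 1 in the denominator)
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

replicates = [91.2, 93.5, 90.8]        # hypothetical recoveries (%)
print(recovery_percent(0.92, 1.00))    # measured 0.92 of 1.00 spiked
print(round(coefficient_of_variation(replicates), 2))
```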

Relevância:

30.00%

Publicador:

Resumo:

Russia inherited a large research and development (R&D) sector from Soviet times and has retained a substantial R&D sector compared with other emerging economies. However, Russia is falling behind most developed countries on all indicators of innovative output. Russia's innovation performance is disappointing despite its available stock of human capital and overall investment in R&D. The communist legacy still influences the main actors of the innovation system. The federal state remains the most important funding source for R&D. Private companies are not investing in innovative activities, preferring to "import" innovations embedded in foreign technologies. Universities are outsiders in the innovation system; only a few carry out research activities. Today, Russia is a resource-dependent country whose economy depends on energy and metals for growth. The Russian economy faces the challenge of diversification: it should embrace innovation and shift to a knowledge economy to remain competitive in the long run. Russia therefore has to tackle the challenge of developing an efficient innovation system from its huge potential in scientific expertise and engineering know-how.

Relevância:

30.00%

Publicador:

Resumo:

This study examines supply chain management problems in practice and the reduction of perceived demand information distortion (the bullwhip effect) with an interfirm information system delivered as a cloud service to a company operating in the telecommunications industry. The purpose is to shed light on whether the interfirm information system has an impact on the performance of the supply chain, and in particular on the reduction of the bullwhip effect. In addition, a holistic case study of the global telecommunications company's supply chain and the challenges it faces is presented, and some measures to improve the situation are proposed. The theoretical part covers the supply chain and its management, as well as ways of increasing its efficiency, and introduces the relevant theories and previous research. The study also presents performance metrics for detecting and tracking the bullwhip effect. The theoretical part ends by presenting the cloud-based business intelligence framework used as the background of this study. The research strategy is a qualitative case study, supported by quantitative data collected from the telecommunications company's databases. Qualitative data were gathered mainly through two open interviews and e-mail exchanges during the development project. In addition, other materials were collected from the company during the project, and the company's web site was also used as a source. The data were collected into a dedicated case study database in order to increase reliability. The results show that the bullwhip effect can be reduced with the interfirm information system together with the CPFR and S&OP models, in particular by combining them into integrated business planning.
According to this study, however, the interfirm information system does not solve all supply chain and effectiveness-related problems, because the company's processes and human activities also have a major impact.
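One common metric for detecting and tracking the bullwhip effect, of the kind the theoretical part refers to, is the ratio of the variance of orders placed upstream to the variance of demand observed downstream; a ratio above 1 indicates that demand information is being distorted and amplified along the chain. The series below are illustrative, not the case company's data.

```python
# Bullwhip ratio: Var(orders) / Var(demand). A value > 1 means the
# order stream fluctuates more than the underlying demand, i.e. the
# bullwhip effect is present. Series below are illustrative only.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def bullwhip_ratio(orders, demand):
    return variance(orders) / variance(demand)

demand = [100, 102, 98, 101, 99, 100]   # fairly stable end demand
orders = [100, 110, 85, 108, 92, 105]   # amplified swings upstream
print(round(bullwhip_ratio(orders, demand), 1))   # well above 1
```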

Relevância:

30.00%

Publicador:

Resumo:

Data management consists of collecting, storing, and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Therefore, just as advanced systems are designed for reporting purposes, operational systems also allow reporting and data analysis. The research method used in the theory part is qualitative research, and the research type in the empirical part is a case study. The objective of this paper is to examine database management system requirements from the reporting management and data management perspectives. In the theory part, these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study revealed that appropriate operational key performance indicators of production take into account time, quality, flexibility, and cost aspects; manufacturing efficiency in particular has been highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability, and data privacy aspects in order to fulfill reporting management's demands. A framework based on these requirements is created for the system development phase and is used in the empirical part of the thesis, where such a system is designed and built for reporting management purposes for a company operating in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
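One widely used composite production KPI that combines the time, quality, and efficiency aspects highlighted above is Overall Equipment Effectiveness (OEE). Note that OEE is my example, not an indicator named in the abstract; the shift figures below are likewise invented for illustration.

```python
# Overall Equipment Effectiveness (OEE), a common composite
# production KPI: availability x performance x quality.
# OEE is an assumed example here; the abstract does not name it,
# and the shift figures below are made up.

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time               # time aspect
    performance = (ideal_cycle_time * total_count) / run_time  # speed aspect
    quality = good_count / total_count                   # quality aspect
    return availability * performance * quality

# 8 h (480 min) planned, 7 h (420 min) actually running,
# 0.5 min ideal cycle time, 760 units produced, 722 of them good
value = oee(planned_time=480, run_time=420,
            ideal_cycle_time=0.5, total_count=760, good_count=722)
print(f"OEE = {value:.1%}")
```

In a reporting-management system of the kind described, such an indicator would typically be computed continuously from the operational database rather than from hand-entered shift totals.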

Relevância:

30.00%

Publicador:

Resumo:

Norms for a battery of instruments, including Denckla's and Garfield's tests of Motor Persistence, Benton's Right-Left Discrimination, two recall modalities (Immediate and Delayed) of the Bender Test, Wechsler's Digit Span, the Color Span Test and the Human Figure Drawing Test, were developed for the neuropsychological assessment of children in the greater Rio de Janeiro area. Additionally, the behavior of each child was assessed with the Composite Teacher Rating Scale (Brito GNO and Pinto RCA (1991) Journal of Clinical and Experimental Neuropsychology, 13: 417-418). A total of 398 children (199 boys and 199 girls, balanced for age) with a mean age of 9.3 years (SD = 2.8), who were attending a public school in Niterói, were the subjects of this study. Gender and age had significant effects on performance that depended on the instrument. Nonachievers performed worse than achievers on most neuropsychological tests. Comparison of our data with the available counterparts in the United States revealed that American children outperformed Brazilian children on the Right-Left Discrimination, Forward Digit Span, Color Span and Human Figure Drawing Tests. Further analysis showed that the neurobehavioral data comprise different factorial dimensions, including Human Body Representation, Motor Persistence of the Legs, Orbito-Orobuccal Motor Persistence, Attention-Memory, Visuospatial Memory, Neuropsychomotor Speed, Hyperactivity-Inattention, and Anxiety-Negative Socialization. We conclude that gender and age should be taken into account when using the normative data for most of the instruments studied in the present report. Furthermore, we stress the need for major changes in the Brazilian public school system in order to foster the development of secondary cognitive abilities in our children.

Relevância:

30.00%

Publicador:

Resumo:

The greatest threat that biodegradable waste poses to the environment is the methane produced in landfills by its decomposition. The Landfill Directive (1999/31/EC) aims to reduce the landfilling of biodegradable waste. In Finland, 31% of biodegradable municipal waste ended up in landfills in 2012, and the pressure to reduce landfill disposal is greatly increased by the forthcoming Finnish landfill ban on biodegradable waste. There is therefore a need to discuss increasing the utilization of biodegradable waste in regional renewable energy production, so that the waste is used in a way that offers the best possibilities to reduce GHG emissions. The objectives of the thesis are: (1) to find the important factors affecting renewable energy recovery possibilities from biodegradable waste, (2) to determine the main factors affecting the GHG balance of a biogas production system and how to improve it, and (3) to find ways to define the energy performance of biogas production systems and what affects it. According to the thesis, the most important factors affecting the regional renewable energy possibilities of biodegradable waste are: the amount of available feedstock, the properties of the feedstock, the selected utilization technologies, the demand for energy and material products, and the economics of utilizing the feedstocks. Biogas production by anaerobic digestion was seen as the main technology for utilizing biodegradable waste in agriculturally dense areas. The main reason is that manure was seen as the main feedstock, and it is best utilized through anaerobic digestion, which can produce renewable energy while allowing the nutrients to be spread on arable land. Biogas plants should be located close to a heat demand large enough to absorb the produced heat even in the summer months, and close to agricultural areas where the digestate can be utilized.
Another option for biogas use is to upgrade it to biomethane, which requires a location close to the natural gas grid. The most attractive feedstocks for biogas production are municipal and industrial biodegradable waste, because the gate fees the plant receives for them can provide over 80% of its income. On the other hand, directing gate-fee feedstocks to small-scale biogas plants could make dispersed biogas production more economical. In addition, the combustion of dry agricultural waste such as straw would provide a greater amount of energy than utilizing it through anaerobic digestion. A complete energy performance assessment of a biogas production system requires the use of more than one system boundary. These boundaries can be used to calculate output-input ratios for biogas production, the biogas plant, biogas utilization, and the biogas production system as a whole, which in turn can be used to analyze different parts of the biogas production chain. At the moment it is difficult to compare biogas plants, since definitions of the energy performance of biogas production vary widely. A more consistent way of analyzing energy performance would allow biogas plants to be compared with each other and with other recovery systems, and possible targets for further improvement to be found. From both the GHG emission balance and the energy performance points of view, energy consumption at the biogas plant was the most significant factor. Using renewable energy to fulfil the parasitic energy demand at the plant would be the most effective way to reduce its GHG emissions. The GHG emission reductions could be increased by upgrading biogas to biomethane to displace natural gas or petrol use in cars, rather than using the biogas for CHP production. The emission reductions from displacing mineral fertilizers with digestate were found to be less significant, and the greater N2O emissions from spreading digestate might surpass them.
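The output-input ratios over different system boundaries can be sketched as simple energy bookkeeping. The boundary definitions below follow the idea described (plant only versus the whole production system), but the energy figures are illustrative assumptions, not the thesis's case values.

```python
# Output-input energy ratios over two system boundaries of a biogas
# production chain. Boundary contents follow the idea in the text;
# all MWh/a figures are illustrative assumptions.

def output_input_ratio(outputs_mwh, inputs_mwh):
    return sum(outputs_mwh) / sum(inputs_mwh)

# Boundary 1: the biogas plant only
plant_ratio = output_input_ratio(
    outputs_mwh=[4200],          # energy content of produced biogas
    inputs_mwh=[350, 500])       # parasitic electricity, process heat

# Boundary 2: the whole production system (adds feedstock transport
# and digestate spreading to the inputs)
system_ratio = output_input_ratio(
    outputs_mwh=[4200],
    inputs_mwh=[350, 500, 120, 80])

print(round(plant_ratio, 2), round(system_ratio, 2))
```

Reporting several such ratios side by side is what makes plants comparable: a plant can look efficient inside its own fence yet perform worse once transport and spreading are included.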

Relevância:

30.00%

Publicador:

Resumo:

The main objective of the present study was to evaluate the diagnostic value (clinical application) of brain measures and cognitive function. Alzheimer and multi-infarct patients (N = 30) and normal subjects over the age of 50 (N = 40) underwent medical, neurological and cognitive investigation. The cognitive tests applied were the Mini-Mental, word span, digit span, logical memory, spatial recognition span, Boston naming test, praxis, and calculation tests. The brain ratios calculated were the ventricle-brain, bifrontal, bicaudate, third ventricle, and suprasellar cistern measures. These data were obtained from brain computed tomography scans, and the cutoff values from receiver operating characteristic curves. We analyzed the diagnostic parameters provided by these ratios and compared them to those obtained by cognitive evaluation. The sensitivity and specificity of the cognitive tests were higher than those of the brain measures, although dementia patients presented higher ratios and poorer cognitive performance than normal individuals. Normal controls over the age of 70 presented higher measures than the younger groups, but similar cognitive performance. We found diffuse losses of central nervous system tissue, reflected in the distribution of cerebrospinal fluid, in dementia patients. The likelihood of case identification was higher with functional impairment than with changes in the structure of the central nervous system. Cognitive evaluation still seems to be the best method to screen individuals in the community, especially in developing countries, where the cost of brain imaging precludes its use for screening and the initial assessment of dementia.
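The diagnostic parameters compared in the study, sensitivity and specificity at a given cutoff, are the quantities traced out by a receiver operating characteristic curve. A minimal sketch with invented scores (not the study's data):

```python
# Sensitivity and specificity of a diagnostic measure at a cutoff,
# the quantities an ROC curve traces as the cutoff varies.
# Scores and cutoff below are illustrative only.

def sensitivity_specificity(patient_scores, control_scores, cutoff):
    # a score >= cutoff counts as a positive (abnormal) test result
    true_pos = sum(s >= cutoff for s in patient_scores)
    true_neg = sum(s < cutoff for s in control_scores)
    sensitivity = true_pos / len(patient_scores)
    specificity = true_neg / len(control_scores)
    return sensitivity, specificity

dementia = [8, 9, 7, 6, 9, 5]      # hypothetical impairment scores
controls = [2, 3, 1, 4, 6, 2]
sens, spec = sensitivity_specificity(dementia, controls, cutoff=5)
print(sens, spec)
```

Sweeping the cutoff across all observed scores and plotting sensitivity against (1 - specificity) yields the ROC curve from which the study derived its optimal cutoff values.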

Relevância:

30.00%

Publicador:

Resumo:

The interactions between the median raphe nucleus (MRN) serotonergic system and the septohippocampal muscarinic cholinergic system in the modulation of immediate working memory storage performance were investigated. Rats with sham or ibotenic acid lesions of the MRN were bilaterally implanted with cannulae in the dentate gyrus of the hippocampus and tested in a light/dark step-through inhibitory avoidance task in which response latency to enter the dark compartment immediately after the shock served as a measure of immediate working memory storage. MRN lesion per se did not alter response latency. Post-training intrahippocampal scopolamine infusion (2 and 4 µg/side) produced a more marked reduction in response latencies in the lesioned animals compared to the sham-lesioned rats. Results suggest that the immediate working memory storage performance is modulated by synergistic interactions between serotonergic projections of the MRN and the muscarinic cholinergic system of the hippocampus.

Relevância:

30.00%

Publicador:

Resumo:

Thesis: A liquid-cooled, direct-drive, permanent-magnet, synchronous generator with helical, double-layer, non-overlapping windings formed from a copper conductor with a coaxial internal coolant conduit offers an excellent combination of attributes to reliably provide economic wind power for the coming generation of wind turbines with power ratings between 5 and 20 MW. A generator based on the liquid-cooled architecture proposed here will be reliable and cost effective, and its smaller size and mass will reduce build, transport, and installation costs. Summary: Converting wind energy into electricity and transmitting it to an electrical power grid to supply consumers is a relatively new and rapidly developing method of electricity generation. In the most recent decade, the increase in wind energy's share of overall energy production has been remarkable. Thousands of land-based and offshore wind turbines have been commissioned around the globe, and thousands more are being planned. The technologies are continuing to evolve rapidly, and wind turbine sizes and power ratings are continually increasing. Many of the newer wind turbine designs feature drivetrains based on Direct-Drive, Permanent-Magnet, Synchronous Generators (DD-PMSGs). Because they are low-speed, high-torque machines, air-cooled DD-PMSGs must have very large diameters to generate higher levels of power. The largest direct-drive wind turbine generator in operation today, rated just below 8 MW, is 12 m in diameter and weighs approximately 220 tonnes. To generate higher powers, traditional DD-PMSGs would need to become extraordinarily large. A 15 MW air-cooled direct-drive generator would be of colossal size and tremendous mass, and no longer economically viable. One alternative to increasing the diameter is to increase torque density instead. In a permanent magnet machine, this is best done by increasing the linear current density of the stator windings.
However, greater linear current density results in more Joule heating, and the additional heat cannot be removed practically using a traditional air-cooling approach. Direct liquid cooling is more effective: when applied directly to the stator windings, higher linear current densities can be sustained, leading to substantial increases in torque density. The higher torque density, in turn, makes possible significant reductions in DD-PMSG size. Over the past five years, a multidisciplinary team of researchers has applied a holistic approach to explore the application of liquid cooling to permanent-magnet wind turbine generator design. The approach has considered wind energy markets and the economics of wind power, system reliability, electromagnetic behaviors and design, thermal design and performance, mechanical architecture and behaviors, and the performance modeling of installed wind turbines. This dissertation is based on seven publications that chronicle the work. The primary outcomes are the proposal of a novel generator architecture, a multidisciplinary set of analyses to predict its behaviors, and experimentation to demonstrate some of the key principles and validate the analyses. The proposed generator concept is a direct-drive, surface-magnet, synchronous generator with fractional-slot, duplex-helical, double-layer, non-overlapping windings formed from a copper conductor with a coaxial internal coolant conduit to accommodate liquid coolant flow. The novel liquid-cooled architecture is referred to as the LC DD-PMSG. The first of the seven publications summarized in this dissertation discusses the technological and economic benefits and limitations of DD-PMSGs as applied to wind energy. The second publication addresses the long-term reliability of the proposed LC DD-PMSG design. Publication 3 examines the machine's electromagnetic design, and Publication 4 introduces an optimization tool developed to quickly define the basic machine parameters.
The static and harmonic behaviors of the stator and rotor wheel structures are the subject of Publication 5. Finally, Publications 6 and 7 examine steady-state and transient thermal behaviors. The work has also produced a number of concrete ancillary outcomes, including the following:
- Intellectual property (IP) for direct liquid cooling of stator windings via an embedded coaxial coolant conduit, IP for a lightweight wheel structure for low-speed, high-torque electrical machinery, and IP for numerous other details of the LC DD-PMSG design
- Analytical demonstrations of the equivalent reliability of the LC DD-PMSG; validated electromagnetic, thermal, structural, and dynamic prediction models; and an analytical demonstration of the superior partial-load efficiency and annual energy output of an LC DD-PMSG design
- A set of LC DD-PMSG design guidelines and an analytical tool to establish optimal geometries quickly and early on
- Proposed 8 MW LC DD-PMSG concepts for both inner- and outer-rotor configurations
Furthermore, three of the technologies introduced could be relevant across a broader spectrum of applications. 1) The cost optimization methodology developed as part of this work could be further improved to produce a simple tool for establishing base geometries for various electromagnetic machine types. 2) The layered sheet-steel element construction technology used for the LC DD-PMSG stator and rotor wheel structures has potential in a wide range of applications. 3) The direct liquid-cooling technology could be beneficial in higher-speed electromotive applications such as vehicular electric drives.
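The sizing logic behind the liquid-cooling argument can be made concrete with the standard radial-flux scaling relation T = (pi/2) * sigma * D^2 * L, where sigma is the air-gap tangential (shear) stress, which grows with the stator linear current density. Raising sigma via direct liquid cooling therefore shrinks the diameter D needed for a given torque. The stress and length values below are illustrative assumptions, not figures from the dissertation.

```python
# Radial-flux machine sizing: T = (pi/2) * sigma * D^2 * L.
# sigma (air-gap shear stress) rises with linear current density,
# so better cooling allows a smaller diameter for the same torque.
# All numerical values are illustrative assumptions.

import math

def torque(sigma_kpa, diameter_m, length_m):
    # sigma in kPa converted to N/m^2; returns torque in Nm
    return (math.pi / 2) * sigma_kpa * 1e3 * diameter_m**2 * length_m

def diameter_for_torque(torque_nm, sigma_kpa, length_m):
    return math.sqrt(torque_nm / ((math.pi / 2) * sigma_kpa * 1e3 * length_m))

# ~8 MW at ~10 rpm is roughly 7.6 MNm of shaft torque
T = 8e6 / (10 * 2 * math.pi / 60)

d_air = diameter_for_torque(T, sigma_kpa=50, length_m=1.5)      # air cooled
d_liquid = diameter_for_torque(T, sigma_kpa=100, length_m=1.5)  # liquid cooled
print(round(d_air, 1), round(d_liquid, 1))  # doubling sigma cuts D by sqrt(2)
```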

Relevância:

30.00%

Publicador:

Resumo:

In view of the importance of anticipating critical situations in medicine, we propose the use of a fuzzy expert system to predict the need for advanced neonatal resuscitation efforts in the delivery room. The system relates maternal medical, obstetric and neonatal characteristics to the clinical condition of the newborn, providing a risk measure of the need for advanced neonatal resuscitation. It is structured as a fuzzy composition developed on the basis of the subjective perception of danger of nine neonatologists facing 61 antenatal and intrapartum clinical situations, each providing a degree of association with the risk of occurrence of perinatal asphyxia. The resulting relational matrix describes the association between clinical factors and the risk of perinatal asphyxia. Taking the presence or absence of each of the 61 clinical factors as input, the system returns the degree of risk of perinatal asphyxia as output. A prospectively collected series of 304 cases of perinatal care was analyzed to ascertain system performance. The fuzzy expert system presented a sensitivity of 76.5% and a specificity of 94.8% in identifying the need for advanced neonatal resuscitation measures, considering a cutoff value of 5 on a scale ranging from 0 to 10. The area under the receiver operating characteristic curve was 0.93. The identification of risk situations plays an important role in the planning of health care. These preliminary results encourage us to develop further studies and to refine this model, which is intended to become an auxiliary system able to help health care staff make decisions in perinatal care.
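A fuzzy composition of the kind described combines the degree of presence of each clinical factor with the relational matrix linking factors to risk; a common choice is max-min composition. The sketch below uses three invented factors and invented membership degrees, not the study's 61 clinical situations or its actual matrix.

```python
# Max-min fuzzy relational composition, one common way to realize
# the fuzzy composition described: risk = max over factors of
# min(degree of presence, strength of association with asphyxia).
# Factors, degrees and matrix entries below are illustrative only.

def max_min_composition(presence, relation):
    """presence: factor -> degree of presence in [0, 1];
    relation: factor -> association with asphyxia risk in [0, 1].
    Returns the composed risk degree in [0, 1]."""
    return max(min(presence[f], relation[f]) for f in presence)

presence = {"meconium": 1.0, "prematurity": 0.6, "fever": 0.0}
relation = {"meconium": 0.7, "prematurity": 0.9, "fever": 0.4}

risk = max_min_composition(presence, relation)
print(risk * 10)   # scaled to the study's 0-10 risk scale
```

Here the composed risk of 7 on the 0-10 scale would exceed the study's cutoff of 5 and flag the case for advanced resuscitation readiness.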

Relevância:

30.00%

Publicador:

Resumo:

Caffeine is the most consumed psychoactive substance in the world. The effects of caffeine have been studied using cognitive and motor measures, quantitative electroencephalography (qEEG) and event-related potentials. However, these methods are not usually employed in combination, a fact that impairs the interpretation of the results. The objective of the present study was to analyze changes in electrophysiological, cognitive and motor variables with the ingestion of caffeine, and to relate central to peripheral responses. For this purpose we recorded event-related potentials and eyes-closed, resting EEG, applied the Stroop test, and measured reaction time. Fifteen volunteers took caffeine (400 mg) or placebo in a randomized, crossover, double-blind design. A significant reduction of alpha absolute power over the entire scalp and of P300 latency at the Fz electrode were observed after caffeine ingestion. These results are consistent with a stimulatory effect of caffeine, although there was no change in the attention (Stroop) test or in reaction time. The qEEG seems to be the most sensitive index of the changes produced by caffeine in the central nervous system since it proved to be capable of detecting changes that were not evident in the tests of cognitive or motor performance.

Relevância:

30.00%

Publicador:

Resumo:

Clinical decision support systems are useful tools for assisting physicians in diagnosing complex illnesses. Schizophrenia is a complex, heterogeneous and incapacitating mental disorder that should be detected as early as possible to avoid a more serious outcome, and such artificial intelligence systems might be useful in its early detection. The objective of the present study was to describe the development of a clinical decision support system for the diagnosis of schizophrenia spectrum disorders (SADDESQ). The development of the system is described in four stages: knowledge acquisition, knowledge organization, the development of a computer-assisted model, and the evaluation of the system's performance. The knowledge was extracted from an expert through open interviews, which aimed to explore the expert's decision-making process for the diagnosis of schizophrenia. A graph methodology was employed to identify the elements involved in the reasoning process. The knowledge was first organized and modeled by means of algorithms and then transferred to a computational model created using the covering approach. The performance assessment involved comparing the diagnoses of 38 clinical vignettes between the expert and SADDESQ. The results showed a relatively low rate of misclassification (18-34%) and a good performance by SADDESQ in the diagnosis of schizophrenia, with an accuracy of 66-82%. The accuracy was higher when schizophreniform disorder was considered as the presence of schizophrenia. Although these results are preliminary, SADDESQ has exhibited a satisfactory performance, which needs to be further evaluated within a clinical setting.

Relevância:

30.00%

Publicador:

Resumo:

Bearing performance significantly affects the dynamic behavior and estimated working life of a rotating system. A common bearing type is the ball bearing, which has been investigated in numerous published studies. The complexity of the ball bearing models described in the literature varies, and model complexity is naturally related to computational burden. In particular, the inclusion of centrifugal forces and gyroscopic moments significantly increases the system's degrees of freedom and lengthens solution time. On the other hand, at low or moderate rotating speeds these effects can be neglected without significant loss of accuracy. The objective of this paper is to present guidelines for selecting a suitable bearing model, illustrated through three case studies. To this end, two ball bearing models were implemented: one considers the high-speed forces, and the other neglects them. Both models were used to study three structures, and the simulation results were compared.
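The speed dependence of the high-speed effects can be illustrated with the centrifugal force on a single ball, F_c = m * omega_cage^2 * r_m, where omega_cage is the cage (orbital) angular speed and r_m the pitch radius. The bearing dimensions below are illustrative for a medium-size ball bearing, not values from the paper.

```python
# Centrifugal force on one ball: F_c = m * omega_cage^2 * r_m.
# The force grows with the square of shaft speed, which is why it
# can be neglected at low speeds but not at high ones.
# Dimensions and masses below are illustrative assumptions.

import math

def ball_centrifugal_force(mass_kg, pitch_radius_m, shaft_rpm,
                           ball_dia_m, pitch_dia_m):
    # cage speed for a non-slipping bearing with a stationary outer ring
    omega_shaft = shaft_rpm * 2 * math.pi / 60
    omega_cage = omega_shaft * 0.5 * (1 - ball_dia_m / pitch_dia_m)
    return mass_kg * omega_cage**2 * pitch_radius_m

# 10 g ball, 40 mm pitch radius, 12.7 mm ball, 80 mm pitch diameter
low = ball_centrifugal_force(0.010, 0.040, 3_000, 0.0127, 0.080)
high = ball_centrifugal_force(0.010, 0.040, 30_000, 0.0127, 0.080)
print(round(low, 1), round(high, 1))   # 10x speed -> 100x force
```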

Relevância:

30.00%

Publicador:

Resumo:

Software is a key component in many of the devices and products we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work: testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Furthermore, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is therefore reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software functions correctly; many other aspects, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality has been implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how to go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools integrated with other industry-leading tools, and complete tool chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
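The core idea of generating tests from a behavioral model can be sketched in a few lines: encode the model as a state machine and derive event sequences from it that the implementation must accept. The media-player model below is a hypothetical example, not a system from the thesis, and a real model-based testing tool would use systematic coverage criteria rather than this simple random walk.

```python
# Minimal illustration of model-based test generation: a behavioral
# model as a state machine, and a random walk over it producing a
# test sequence of events. The media-player model is hypothetical.

import random

# transitions: (state, event) -> next state
MODEL = {
    ("stopped", "play"): "playing",
    ("playing", "pause"): "paused",
    ("playing", "stop"): "stopped",
    ("paused", "play"): "playing",
    ("paused", "stop"): "stopped",
}

def generate_test(model, start="stopped", length=5, seed=0):
    """Random walk over the model; returns a sequence of events the
    implementation must accept when applied from the start state."""
    rng = random.Random(seed)
    state, events = start, []
    for _ in range(length):
        choices = [(e, s2) for (s1, e), s2 in model.items() if s1 == state]
        event, state = rng.choice(choices)
        events.append(event)
    return events

print(generate_test(MODEL))   # one generated test sequence of events
```

Because every generated sequence is valid by construction, any rejection by the system under test points to a fault in the implementation or a discrepancy in the model, which is exactly the traceability the testing process relies on.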