903 results for program implementation


Relevance:

30.00%

Publisher:

Abstract:

In most cases, the cost of a control system increases with its complexity. The proportional (P) controller is the simplest and most intuitive structure for the implementation of linear control systems. The difficulty of finding the stability range of feedback systems with P controllers using the Routh-Hurwitz criterion increases with the order of the plant. For high-order plants, the stability range cannot be easily obtained by inspecting the coefficient signs in the first column of the Routh array. A direct method for determining the stability range is presented. The method is easy to understand and to compute, and offers students a better comprehension of the subject. A MATLAB program based on the proposed method, together with design examples and class assessments, is provided to support the pedagogical goals. The method and the program enable the user to specify a decay rate and also extend to proportional-integral (PI), proportional-derivative (PD), and proportional-integral-derivative (PID) controllers.
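
A minimal sketch, not the paper's method, of how the stability range of a proportional gain can be checked numerically: for a unity-feedback loop with plant num(s)/den(s) and gain K, the closed-loop characteristic polynomial is den(s) + K·num(s), and the loop is stable exactly when all of its roots have negative real parts (the same question the Routh-Hurwitz criterion answers). The plant below is hypothetical.

```python
import numpy as np

def stable_gain_range(num, den, gains):
    """Return the candidate gains K for which the unity-feedback loop with a
    P controller K and plant num(s)/den(s) is stable, judged by the roots of
    the closed-loop characteristic polynomial den(s) + K*num(s)."""
    num = np.concatenate([np.zeros(len(den) - len(num)), num])  # align lengths
    stable = []
    for K in gains:
        char_poly = den + K * num
        if np.all(np.roots(char_poly).real < 0):
            stable.append(K)
    return stable

# hypothetical third-order plant G(s) = 1 / (s^3 + 6 s^2 + 11 s + 6)
num = np.array([1.0])
den = np.array([1.0, 6.0, 11.0, 6.0])
ks = stable_gain_range(num, den, np.linspace(0.1, 100.0, 1000))
print(f"stable for K roughly in [{ks[0]:.1f}, {ks[-1]:.1f}]")
```

For this example plant, the Routh-Hurwitz test gives stability for -6 < K < 60, so the printed upper bound should land near 60.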

Relevance:

30.00%

Publisher:

Abstract:

Genetic gains predicted for selection based on individual performance and on progeny testing were compared, to provide information to be used in the implementation of progeny testing for a Nelore cattle breeding program. The prediction of genetic gain based on progeny testing was obtained from a formula, derived from the methodology of Young and Weiler (J. Genetics 57: 329-338, 1960) for two-stage selection, which allows prediction of genetic gain per generation when the individuals under test have been pre-selected on the basis of their own performance. The application of this formula also allowed determination of the number of progeny per tested bull needed to maximize genetic gain when the total number of tested progeny is limited.
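
A rough sketch of the trade-off described above, using standard textbook expressions rather than the paper's own derivation: with a fixed total number of tested progeny, more progeny per bull raises the accuracy of each progeny test (here the classical half-sib accuracy sqrt(n / (n + (4 - h^2)/h^2))) but leaves fewer bulls to test, and therefore a lower selection intensity. All numbers are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def progeny_test_accuracy(n, h2):
    """Classical accuracy of a sire's progeny test with n half-sib progeny."""
    k = (4.0 - h2) / h2
    return np.sqrt(n / (n + k))

def selection_intensity(p):
    """Standardized selection differential when a proportion p is selected."""
    x = norm.ppf(1.0 - p)               # truncation point
    return norm.pdf(x) / p

def expected_gain(n, total_progeny, bulls_selected, h2, sigma_a=1.0):
    """Relative gain from progeny testing: intensity * accuracy * sigma_a."""
    bulls_tested = total_progeny // n
    if bulls_tested <= bulls_selected:
        return 0.0
    p = bulls_selected / bulls_tested
    return selection_intensity(p) * progeny_test_accuracy(n, h2) * sigma_a

# hypothetical scenario: 2000 progeny in total, 5 bulls to be selected, h^2 = 0.25
gains = {n: expected_gain(n, 2000, 5, 0.25) for n in range(10, 201, 10)}
best = max(gains, key=gains.get)
print(f"gain is maximized around {best} progeny per bull")
```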

Relevance:

30.00%

Publisher:

Abstract:

The new farm bill enacted by Congress in June 2008 includes a new revenue-based safety net, the Average Crop Revenue Election (ACRE) Program, which is available to producers beginning with the 2009 crop year. While the final details and implementation of the program are yet to be announced by the USDA Farm Service Agency (FSA), an analysis of the mechanics of ACRE and of the relevant yields and prices to include in ACRE can help producers assess whether ACRE will be a good choice for this crop year and beyond.
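
As a rough illustration only (the actual ACRE rules include payment caps, acreage fractions, and other details to be set by FSA), the program's basic double trigger can be sketched as: a payment requires both actual state revenue to fall below a state revenue guarantee (90% of the benchmark state yield times the guarantee price) and actual farm revenue to fall below the farm's benchmark revenue. All figures below are hypothetical.

```python
def acre_payment_triggered(state_benchmark_yield, guarantee_price,
                           actual_state_yield, actual_price,
                           farm_benchmark_revenue, actual_farm_revenue):
    """Simplified ACRE double trigger (illustrative; omits payment caps,
    acreage fractions, and other implementation details)."""
    state_guarantee = 0.90 * state_benchmark_yield * guarantee_price
    actual_state_revenue = actual_state_yield * actual_price
    state_trigger = actual_state_revenue < state_guarantee
    farm_trigger = actual_farm_revenue < farm_benchmark_revenue
    return state_trigger and farm_trigger

# hypothetical corn example: 160 bu/ac benchmark yield, $4.00 guarantee price
print(acre_payment_triggered(160, 4.00, 150, 3.60, 640, 560))
```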

Relevance:

30.00%

Publisher:

Abstract:

Background: Breast cancer is the most frequently diagnosed cancer and the leading cause of cancer deaths among women worldwide. The use of mobile mammography units to offer screening to women living in remote areas is a rational strategy to increase the number of women examined. This study aimed to evaluate results from the first 2 years of a government-organized mammography screening program implemented with a mobile unit (MU) and a fixed unit (FU) in a rural county in Brazil. The program offered breast cancer screening to women living in Barretos and the surrounding area. Methods: Based on epidemiologic data, 54 238 women aged 40 to 69 years were eligible for breast cancer screening. The study included women examined from April 1, 2003 to March 31, 2005. The chi-square test with Bonferroni correction was used to evaluate the frequencies of tumors and the importance of clinical parameters and tumor characteristics. Significance was set at p < 0.05. Results: Overall, 17 964 women underwent mammography, representing 33.1% of eligible women in the area. A mean of 18.6 and 26.3 women per day were examined in the FU and MU, respectively. Seventy-six patients were diagnosed with breast cancer, 41 of them (54%) in the MU, corresponding to 4.2 cases of breast cancer per 1000 examinations. The number of cancers detected was significantly higher in women aged 60 to 69 years than in those aged 50 to 59 years (p < 0.001) or 40 to 49 years (p < 0.001). No difference was observed between women aged 40 to 49 years and those aged 50 to 59 years (p = 0.164). The proportions of tumors at early (clinical stages 0 and I) and advanced (clinical stages III and IV) stages were 43.4% and 15.8%, respectively. Conclusions: Preliminary results indicate that this mammography screening program is feasible for implementation in a rural Brazilian territory and favor program continuation.
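
The age-group comparisons reported above are frequency comparisons of detected cancers against examinations; a minimal sketch of that kind of chi-square contingency test (with made-up counts, not the study's data):

```python
from scipy.stats import chi2_contingency

# hypothetical counts per age group: [cancers detected, examinations without cancer]
group_60_69 = [30, 4970]
group_50_59 = [15, 5985]

chi2, p_value, dof, expected = chi2_contingency([group_60_69, group_50_59])
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```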

Relevance:

30.00%

Publisher:

Abstract:

Background: A research coaching program focuses on the development of abilities and scientific reasoning; for health professionals, it may be useful for increasing both the number and the quality of projects and manuscripts. Objective: To evaluate the initial results and the implementation methodology of the Research and Innovation Coaching Program of the Research on Research group of Duke University in the Brazilian Society of Cardiology. Methods: The program rests on two bases: training and coaching. Training is done online and addresses research ideas, literature search, scientific writing, and statistics. After training, coaching fosters collaboration between researchers and centers by means of a network of contacts. The present study describes the implementation and the initial results for the years 2011-2012. Results: In 2011, 24 centers received training, which consisted of online meetings and the study and practice of the contents addressed. In January 2012, a new format was implemented with the objective of reaching more researchers. In six months, 52 researchers were allocated. In all, 20 manuscripts were published and 49 more were written and await submission and/or publication. Additionally, five research funding proposals have been prepared. Conclusion: The number of manuscripts and funding proposals achieved the objectives initially proposed. However, the main results of this type of initiative should be measured in the long term, because the consolidation of a national output of high-quality research is a virtuous cycle that feeds back on itself and expands over time. (Arq Bras Cardiol 2012;99(6):1075-1081)

Relevance:

30.00%

Publisher:

Abstract:

This qualitative, exploratory, descriptive study was performed with the objective of understanding the perceptions of nurses working in medical-surgical units of a university hospital regarding the strategies developed to pilot test the PROCEnf-USP electronic system, whose purpose is to computerize clinical nursing documentation. Eleven nurses who took part in a theoretical-practical training program were interviewed, and the data obtained were analyzed using the content analysis technique. The following categories were discussed in light of the frameworks of participative management and planned change: aspects favorable to the implementation; aspects unfavorable to the implementation; and expectations regarding the implementation. According to the nurses' perceptions, the preliminary use of the electronic system allowed them to show their potential and to propose improvements, encouraging them to become partners of the managing group in disseminating the system to other nurses of the institution.

Relevance:

30.00%

Publisher:

Abstract:

Field-Programmable Gate Arrays (FPGAs) are becoming increasingly important in embedded and high-performance computing systems. They allow performance levels close to those obtained with Application-Specific Integrated Circuits, while still keeping design and implementation flexibility. However, to program FPGAs efficiently, one needs the expertise of hardware developers in hardware description languages (HDLs) such as VHDL or Verilog. Attempts to provide a high-level compilation flow (e.g., from C programs) must still address open issues before broadly efficient results can be obtained. Bearing in mind the resources available on an FPGA, LALP (Language for Aggressive Loop Pipelining), a novel language for programming FPGA-based accelerators, has been developed, along with its compilation framework, including mapping capabilities. The main ideas behind LALP are to provide a higher abstraction level than HDLs, to exploit the intrinsic parallelism of hardware resources, and to allow the programmer to control execution stages whenever the compiler techniques are unable to generate efficient implementations. Those features are particularly useful for implementing loop pipelining, a well-regarded technique used to accelerate computations in several application domains. This paper describes LALP and shows how it can be used to achieve high-performance computing solutions.
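
Loop pipelining, the technique LALP targets, overlaps successive loop iterations so that different stages of different iterations execute in the same cycle. The toy schedule below (plain Python, not LALP; a 3-stage loop body and an initiation interval of one cycle are assumed for illustration) simply prints which iteration occupies each stage at each cycle:

```python
STAGES = ["load", "compute", "store"]   # assumed 3-stage loop body
ITERATIONS = 5
II = 1                                   # initiation interval: a new iteration starts every cycle

# stage s of iteration i executes at cycle i*II + s
total_cycles = (ITERATIONS - 1) * II + len(STAGES)
for cycle in range(total_cycles):
    row = []
    for s, name in enumerate(STAGES):
        it = (cycle - s) // II
        if (cycle - s) % II == 0 and 0 <= it < ITERATIONS:
            row.append(f"{name}(i{it})")
        else:
            row.append(" " * (len(name) + 4))
    print(f"cycle {cycle}: " + "  ".join(row))
```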

Relevance:

30.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law, and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal and provide results such as "the goal is derivable from the KB of the theory". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human reasoning. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal model of the game. Roughly speaking, if the theory is the set of laws, a key claim is the conclusion that one party wants to prove (and the other wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the strategies followed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators. The first has been explained above, the second is needed to run the game model, and the last is used to change game execution and tree-derivation strategies.
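
A minimal sketch of the kind of check the evaluator performs, deliberately simplified with respect to Nute's full logic (strict rules and defeaters are omitted, and the theory is assumed acyclic): a literal is defeasibly provable if it is a fact, or some applicable defeasible rule supports it and every applicable rule for the opposite literal is beaten by a superior supporting rule.

```python
class Rule:
    def __init__(self, name, body, head):
        self.name, self.body, self.head = name, body, head

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

class Theory:
    def __init__(self, facts, rules, superiority):
        self.facts = set(facts)          # indisputable literals
        self.rules = rules               # defeasible rules
        self.sup = set(superiority)      # pairs (stronger_rule, weaker_rule)

    def applicable(self, lit):
        """Rules for lit whose entire body is defeasibly provable."""
        return [r for r in self.rules
                if r.head == lit and all(self.provable(b) for b in r.body)]

    def provable(self, lit):
        if lit in self.facts:
            return True
        if neg(lit) in self.facts:
            return False
        supporters = self.applicable(lit)
        if not supporters:
            return False
        attackers = self.applicable(neg(lit))
        # every attacker must be beaten by some superior supporting rule
        return all(any((s.name, a.name) in self.sup for s in supporters)
                   for a in attackers)

# classic example: birds normally fly, penguins normally do not, r2 beats r1
theory = Theory(
    facts={"penguin", "bird"},
    rules=[Rule("r1", ["bird"], "fly"), Rule("r2", ["penguin"], "~fly")],
    superiority={("r2", "r1")},
)
print(theory.provable("fly"), theory.provable("~fly"))   # False True
```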

Relevance:

30.00%

Publisher:

Abstract:

Lint-like program checkers are popular tools that ensure code quality by verifying compliance with best practices for a particular programming language. The proliferation of internal domain-specific languages and models, however, poses new challenges for such tools. Traditional program checkers produce many false positives and fail to accurately check constraints, best practices, common errors, possible optimizations, and portability issues particular to domain-specific languages. We advocate the use of dedicated rules to check domain-specific practices. We demonstrate the implementation of domain-specific rules, the automatic fixing of violations, and their application to two case studies: (1) Seaside, which defines several internal DSLs through a creative use of the syntax of the host language; and (2) Magritte, which adds meta-descriptions to existing code by means of special methods. Our empirical validation demonstrates that domain-specific program checking significantly improves code quality compared with general-purpose program checking.
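
The paper's case studies are Smalltalk DSLs (Seaside, Magritte), but the idea of a dedicated domain-specific rule can be sketched in a few lines of Python: the hypothetical rule below flags calls to a DSL method html.text() whose argument is built with + concatenation, a pattern an internal DSL might prefer to express with its own combinators. The rule and the API names are invented for illustration.

```python
import ast

RULE = "prefer the DSL's own composition over string concatenation in html.text()"

class DomainSpecificChecker(ast.NodeVisitor):
    """Flags html.text(a + b) calls; a hypothetical domain-specific rule."""
    def __init__(self):
        self.violations = []

    def visit_Call(self, node):
        callee = node.func
        if (isinstance(callee, ast.Attribute) and callee.attr == "text"
                and isinstance(callee.value, ast.Name) and callee.value.id == "html"
                and node.args and isinstance(node.args[0], ast.BinOp)
                and isinstance(node.args[0].op, ast.Add)):
            self.violations.append((node.lineno, RULE))
        self.generic_visit(node)

source = "html.text('Hello, ' + name)\nhtml.text(greeting)\n"
checker = DomainSpecificChecker()
checker.visit(ast.parse(source))
print(checker.violations)   # only the first call is reported
```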

Relevance:

30.00%

Publisher:

Abstract:

Application of knowledge about psychological development should, ideally, be theory based. As such, these applications represent “natural ontogenetic experiments”; the results of the evaluation of such interventions feed back to the theory, helping to support, falsify, or refine the ideas from the theory which led to the particular application. Such applied developmental intervention research is central within a currently popular perspective of life-span human development. Thus, applied developmental intervention research provides critical tests of such key concepts within this life-span perspective as: plasticity; multidirectionality; the synthesis of continuous and discontinuous processes across ontogeny; contextual embeddedness; and the role of individuals as agents in their own development. This paper elucidates some of the major features of the dynamic linkage between applied developmental psychology and this view of life-span human development. Key elements of this life-span perspective and the facts of developmental intervention, as seen from this perspective, are specified. Finally, the doctoral training program at the authors' institution is presented as one example of how this link may be institutionalized in the form of graduate education.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To study the behavior and influence of a multileaf collimator (MLC) on dose calculation, verification, and portal energy spectra for intensity-modulated fields obtained with a step-and-shoot or a dynamic technique. METHODS: The 80-leaf MLC for the Varian Clinac 2300 C/D was implemented in a previously developed Monte Carlo (MC) based multiple source model (MSM) for a 6 MV photon beam. Using this model and the MC program GEANT, dose distributions, energy fluence maps, and energy spectra at different portal planes were calculated for three different MLC applications. RESULTS: The comparison of MC-calculated dose distributions in the phantom and portal plane with those measured with films showed agreement within 3% and 1.5 mm for all cases studied. The deviations occur mainly at the extremes of the intensity modulation. The MC method allows investigation of, among other aspects, dose components, energy fluence maps, tongue-and-groove effects, and energy spectra at portal planes. CONCLUSION: The MSM, together with the implementation of the MLC, is appropriate for a number of investigations in intensity-modulated radiation therapy (IMRT).
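
Dose-distribution comparisons with a combined dose-difference/distance criterion such as the 3% and 1.5 mm quoted above are commonly summarized with a gamma index; the sketch below is a generic 1D illustration of that idea (not the comparison code used in the paper), with hypothetical profiles and grid spacing.

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dist_tol_mm=1.5):
    """1D gamma analysis: a reference point passes if some evaluated point is
    close enough in both dose (relative to the maximum) and distance."""
    x = np.arange(len(dose_ref)) * spacing_mm
    d_max = dose_ref.max()
    gammas = []
    for xi, di in zip(x, dose_ref):
        dist_term = ((x - xi) / dist_tol_mm) ** 2
        dose_term = ((dose_eval - di) / (dose_tol * d_max)) ** 2
        gammas.append(np.sqrt(dist_term + dose_term).min())
    return (np.array(gammas) <= 1.0).mean()

# hypothetical measured vs. calculated profiles on a 0.5 mm grid
xs = np.linspace(-20, 20, 81)
measured = np.exp(-(xs / 10) ** 2)
calculated = np.exp(-((xs - 0.3) / 10) ** 2) * 1.01
print(f"gamma pass rate: {gamma_pass_rate(measured, calculated, 0.5):.1%}")
```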

Relevance:

30.00%

Publisher:

Abstract:

Though 3D computer graphics has seen tremendous advancement in the past two decades, most available mechanisms for computer interaction in 3D are high-cost and targeted at industry and virtual reality applications. Recent advances in Micro-Electro-Mechanical-System (MEMS) devices have brought forth a variety of new low-cost, low-power, miniature sensors with high accuracy, which are well suited for hand-held devices. In this work a novel design for a 3D computer game controller using inertial sensors is proposed, and a prototype device based on this design is implemented. The design incorporates MEMS accelerometers and gyroscopes from Analog Devices to measure the three components of acceleration and angular velocity. From these sensor readings, the position and orientation of the hand-held compartment can be calculated using numerical methods. The implemented prototype utilizes a USB 2.0-compliant interface for power and communication with the host system. A Microchip dsPIC microcontroller is used in the design; this microcontroller integrates the analog-to-digital converters, the flash program memory, and the core processor on a single integrated circuit. A PC running the Microsoft Windows operating system is used as the host machine. Prototype firmware for the microcontroller was developed and tested to establish communication between the device and the host and to perform data acquisition and initial filtering of the sensor data. A PC front-end application with a graphical interface was developed to communicate with the device and allow real-time visualization of the acquired data.
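
The position-and-orientation step described above can be sketched in a deliberately simplified 2D form (single gyroscope axis, two accelerometer axes, plain Euler integration, no drift correction; the sensor samples are hypothetical):

```python
import numpy as np

def dead_reckon(accel_body, gyro_z, dt):
    """Integrate body-frame 2D accelerations and a single yaw rate into
    heading, velocity, and position (simple Euler integration, no filtering)."""
    heading = 0.0
    velocity = np.zeros(2)
    position = np.zeros(2)
    trajectory = []
    for a_body, omega in zip(accel_body, gyro_z):
        heading += omega * dt                       # integrate angular rate
        c, s = np.cos(heading), np.sin(heading)
        rot = np.array([[c, -s], [s, c]])           # body -> world rotation
        a_world = rot @ np.asarray(a_body)
        velocity += a_world * dt                    # integrate acceleration
        position += velocity * dt                   # integrate velocity
        trajectory.append(position.copy())
    return np.array(trajectory)

# hypothetical samples at 100 Hz: accelerate forward while slowly turning left
accel = [(0.5, 0.0)] * 100
gyro = [0.2] * 100                                   # yaw rate in rad/s
print(dead_reckon(accel, gyro, dt=0.01)[-1])         # final position estimate
```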

Relevance:

30.00%

Publisher:

Abstract:

In the U.S., many electric utility companies are offering demand-side management (DSM) programs to their customers as ways to save money and energy. However, it is challenging to compare these programs across utility companies throughout the U.S. because of the variability of state energy policies. For example, some states have deregulated electricity markets and others do not. In addition, utility companies within a state differ in ownership and size. This study examines 12 utilities' experiences with DSM programs and compares the programs' annual energy savings results that the selected utilities reported to the Energy Information Administration (EIA). The 2009 EIA data suggest that DSM program effectiveness is not significantly affected by electricity market deregulation or utility ownership. However, DSM programs seem to be generally more effective when administered by utilities located in states with energy savings requirements and DSM program mandates.
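
A sketch of the kind of comparison described, with entirely hypothetical figures rather than the EIA data set used in the study: grouping utilities' reported annual energy savings by deregulation status and by the presence of a state savings mandate.

```python
import pandas as pd

# hypothetical utility records, not the 2009 EIA data used in the study
utilities = pd.DataFrame({
    "utility":            ["A", "B", "C", "D", "E", "F"],
    "deregulated":        [True, True, False, False, False, True],
    "savings_mandate":    [True, False, True, True, False, False],
    "annual_savings_mwh": [52000, 18000, 61000, 47000, 15000, 21000],
})

print(utilities.groupby("deregulated")["annual_savings_mwh"].mean())
print(utilities.groupby("savings_mandate")["annual_savings_mwh"].mean())
```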